Sample records for existing testing methods

  1. Four applications of permutation methods to testing a single-mediator model.

    PubMed

    Taylor, Aaron B; MacKinnon, David P

    2012-09-01

    Four applications of permutation tests to the single-mediator model are described and evaluated in this study. Permutation tests work by rearranging data in many possible ways in order to estimate the sampling distribution for the test statistic. The four applications to mediation evaluated here are the permutation test of ab, the permutation joint significance test, and the noniterative and iterative permutation confidence intervals for ab. A Monte Carlo simulation study was used to compare these four tests with the four best available tests for mediation found in previous research: the joint significance test, the distribution of the product test, and the percentile and bias-corrected bootstrap tests. We compared the different methods on Type I error, power, and confidence interval coverage. The noniterative permutation confidence interval for ab was the best performer among the new methods. It successfully controlled Type I error, had power nearly as good as the most powerful existing methods, and had better coverage than any existing method. The iterative permutation confidence interval for ab had lower power than do some existing methods, but it performed better than any other method in terms of coverage. The permutation confidence interval methods are recommended when estimating a confidence interval is a primary concern. SPSS and SAS macros that estimate these confidence intervals are provided.
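
    The core idea described above can be sketched in a few lines. This is an illustrative sketch only, not the authors' SPSS/SAS macros: one common permutation scheme shuffles the independent variable to break the mediation paths, recomputes the mediated effect ab for each rearrangement, and compares the observed ab to the resulting null distribution. The simulated data below are hypothetical.

```python
# Sketch of a permutation test for the mediated effect a*b in the
# single-mediator model (illustrative; not the authors' SPSS/SAS macros).
import numpy as np

def ab_estimate(x, m, y):
    """OLS estimate of a*b: a from M ~ 1 + X, b (partial) from Y ~ 1 + M + X."""
    Xa = np.column_stack([np.ones_like(x), x])
    a = np.linalg.lstsq(Xa, m, rcond=None)[0][1]
    Xb = np.column_stack([np.ones_like(x), m, x])
    b = np.linalg.lstsq(Xb, y, rcond=None)[0][1]
    return a * b

def permutation_p_value(x, m, y, n_perm=2000, seed=0):
    """Two-sided p-value for H0: a*b = 0, permuting X to break the a-path."""
    rng = np.random.default_rng(seed)
    observed = abs(ab_estimate(x, m, y))
    count = sum(
        abs(ab_estimate(rng.permutation(x), m, y)) >= observed
        for _ in range(n_perm)
    )
    return (count + 1) / (n_perm + 1)  # add-one correction

# Hypothetical simulated data with a true mediated effect of 0.5 * 0.5 = 0.25
rng = np.random.default_rng(42)
x = rng.normal(size=200)
m = 0.5 * x + rng.normal(size=200)   # true a = 0.5
y = 0.5 * m + rng.normal(size=200)   # true b = 0.5
p = permutation_p_value(x, m, y, n_perm=500)
```

With a genuine mediated effect this size, the permutation p-value comes out small; the confidence-interval variants the abstract recommends invert this same resampling machinery rather than returning a single p-value.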

  2. Generalized disequilibrium test for association in qualitative traits incorporating imprinting effects based on extended pedigrees.

    PubMed

    Li, Jian-Long; Wang, Peng; Fung, Wing Kam; Zhou, Ji-Yuan

    2017-10-16

    For dichotomous traits, the generalized disequilibrium test with the moment estimate of the variance (GDT-ME) is a powerful family-based association method. Genomic imprinting is an important epigenetic phenomenon, and there has been increasing interest in incorporating imprinting effects to improve the power of association analysis. However, GDT-ME does not take imprinting effects into account, and it has not been investigated whether it can be used for association analysis when such effects indeed exist. In this article, based on a novel decomposition of the genotype score according to the paternal or maternal source of the allele, we propose the generalized disequilibrium test with imprinting (GDTI) for complete pedigrees without any missing genotypes. Then, we extend GDTI and GDT-ME to accommodate incomplete pedigrees with some pedigrees having missing genotypes, by using a Monte Carlo (MC) sampling and estimation scheme to infer missing genotypes given the available genotypes in each pedigree; these extensions are denoted MCGDTI and MCGDT-ME, respectively. The proposed GDTI and MCGDTI methods evaluate the differences of the paternal as well as maternal allele scores for all discordant relative pairs in a pedigree, including pairs beyond first-degree relatives. Advantages of the proposed GDTI and MCGDTI test statistics over existing methods are demonstrated by simulation studies under various settings and by application to a rheumatoid arthritis dataset. Simulation results show that the proposed tests control the size well under the null hypothesis of no association and outperform the existing methods under various imprinting effect models. The existing GDT-ME and the proposed MCGDT-ME can be used to test for association even when imprinting effects exist. In the application to the rheumatoid arthritis data, MCGDTI identifies more loci statistically significantly associated with the disease than the existing methods do.
    Under complete and incomplete imprinting effect models, our proposed GDTI and MCGDTI methods, by considering the information on imprinting effects and all discordant relative pairs within each pedigree, outperform all the existing test statistics, and MCGDTI can recapture much of the missing information. Therefore, MCGDTI is recommended in practice.

  3. Comparison of AASHTO moisture sensitivity test (T-283) with Connecticut Department of Transportation modified test method

    DOT National Transportation Integrated Search

    1999-08-01

    Several different interpretations of the American Association of State Highway and Transportation Officials' (AASHTO's) Moisture Sensitivity Test exist. The official AASHTO interpretation of this test method does not account for water which has been ...

  4. Method for Smoke Spread Testing of Large Premises

    NASA Astrophysics Data System (ADS)

    Walmerdahl, P.; Werling, P.

    2001-11-01

    A method for performing non-destructive smoke spread tests has been developed, tested, and applied to several existing buildings. The heat source is generated by burning methanol in water-cooled steel trays; several tray sizes are available to cover fire sources up to nearly 1 MW. The smoke is supplied by a suitable number of smoke generators that produce a non-toxic aerosol. The advantage of the method is that it provides a means for performing non-destructive tests in existing buildings and other installations in order to evaluate the functionality and design of active fire protection measures such as smoke extraction systems. The report describes the method in detail, presents experimental data from its try-out, and discusses the applicability and flexibility of the method.

  5. Accelerated Dynamic Corrosion Test Method Development

    DTIC Science & Technology

    test method has poor correlation to outdoor exposures, particularly for non-chromate primers. As a result, more realistic cyclic environmental...exposures have been developed to more closely resemble actual atmospheric corrosion damage. Several existing tests correlate well with the outdoor performance

  6. CLT and AE methods of in-situ load testing : comparison and development of evaluation criteria : in-situ evaluation of post-tensioned parking garage, Kansas City, Missouri

    DOT National Transportation Integrated Search

    2008-02-01

    The objective of the proposed research project is to compare the results of two recently introduced nondestructive load test methods to the existing 24-hour load test method described in Chapter 20 of ACI 318-05. The two new methods of nondestructive...

  7. An Adaptive Association Test for Multiple Phenotypes with GWAS Summary Statistics.

    PubMed

    Kim, Junghi; Bai, Yun; Pan, Wei

    2015-12-01

    We study the problem of testing for single-marker, multiple-phenotype associations based on genome-wide association study (GWAS) summary statistics, without access to individual-level genotype and phenotype data. Because summary data are substantially easier to obtain than individual-level phenotype and genotype data for most published GWASs, and because multiple correlated traits are often collected, the problem studied here has become increasingly important. We propose a powerful adaptive test and compare its performance with some existing tests. We illustrate its applications to analyses of a meta-analyzed GWAS dataset with three blood lipid traits and another with sex-stratified anthropometric traits, and further demonstrate its potential power gain over some existing methods through realistic simulation studies. We start from the situation with only one set of (possibly meta-analyzed) genome-wide summary statistics, then extend the method to meta-analysis of multiple sets of genome-wide summary statistics, each from one GWAS. We expect the proposed test to be useful in practice as more powerful than, or complementary to, existing methods.

  8. 26 CFR 1.401(a)(4)-0 - Table of contents.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ...) Composition-of-work-force method. (3) Minimum-benefit method. (4) Grandfather rules for plans in existence on... allocation rates. (3) Safe harbor testing method for cash balance plans. (d) Safe-harbor testing method for...-crediting period. (e) Family aggregation rules. [Reserved] (f) Governmental plans. [Reserved] (g) Corrective...

  9. Bayesian Methods for Determining the Importance of Effects

    USDA-ARS?s Scientific Manuscript database

    Criticisms have plagued the frequentist null-hypothesis significance testing (NHST) procedure since the day it was created from Fisher's significance test and the hypothesis test of Jerzy Neyman and Egon Pearson. Alternatives to NHST exist in frequentist statistics, but competing methods are also avai...

  10. Novel Ultrasound Joint Selection Methods Using a Reduced Joint Number Demonstrate Inflammatory Improvement when Compared to Existing Methods and Disease Activity Score at 28 Joints.

    PubMed

    Tan, York Kiat; Allen, John C; Lye, Weng Kit; Conaghan, Philip G; D'Agostino, Maria Antonietta; Chew, Li-Ching; Thumboo, Julian

    2016-01-01

    A pilot study testing novel ultrasound (US) joint-selection methods in rheumatoid arthritis. The responsiveness of the novel methods [individualized US (IUS) and individualized composite US (ICUS)] was compared with that of existing US methods and the Disease Activity Score at 28 joints (DAS28) for 12 patients followed for 3 months. IUS selected up to the 7 and 12 most ultrasonographically inflamed joints, while ICUS additionally incorporated clinically symptomatic joints. The existing, IUS, and ICUS methods' standardized response means were -0.39, -1.08, and -1.11, respectively, for 7 joints; -0.49, -1.00, and -1.16, respectively, for 12 joints; and -0.94 for DAS28. The novel methods effectively demonstrate inflammatory improvement when compared with existing methods and DAS28.
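
    The responsiveness index reported above, the standardized response mean (SRM), is simply the mean change score divided by the standard deviation of the change scores. A minimal sketch with hypothetical values (not the study's data):

```python
# Standardized response mean (SRM): mean change score divided by the
# standard deviation of the change scores. Values below are hypothetical.
import statistics

def srm(change_scores):
    """Responsiveness index; a more negative SRM = larger measured improvement."""
    return statistics.mean(change_scores) / statistics.stdev(change_scores)

example = srm([-2.0, -1.0, -3.0, -2.0])  # hypothetical 3-month change scores
```

By this convention, the IUS and ICUS values around -1.0 to -1.2 reported above indicate substantially greater responsiveness than the -0.39 to -0.49 of the existing US methods.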

  11. Study on the system-level test method of digital metering in smart substation

    NASA Astrophysics Data System (ADS)

    Zhang, Xiang; Yang, Min; Hu, Juan; Li, Fuchao; Luo, Ruixi; Li, Jinsong; Ai, Bing

    2017-03-01

    The existing test methods for the digital metering system in a smart substation are used to test and evaluate the performance of a single device. These methods can effectively guarantee the accuracy and reliability of the measurement results of an individual digital metering device, but they do not fully reflect the performance of the devices operating together as a complete system. This paper introduces the shortcomings of the existing test methods, proposes a system-level test method for digital metering in smart substations, and demonstrates the feasibility of the method through an actual test.

  12. Methods for the Joint Meta-Analysis of Multiple Tests

    ERIC Educational Resources Information Center

    Trikalinos, Thomas A.; Hoaglin, David C.; Small, Kevin M.; Terrin, Norma; Schmid, Christopher H.

    2014-01-01

    Existing methods for meta-analysis of diagnostic test accuracy focus primarily on a single index test. We propose models for the joint meta-analysis of studies comparing multiple index tests on the same participants in paired designs. These models respect the grouping of data by studies, account for the within-study correlation between the tests'…

  13. Testing for association with multiple traits in generalized estimation equations, with application to neuroimaging data.

    PubMed

    Zhang, Yiwei; Xu, Zhiyuan; Shen, Xiaotong; Pan, Wei

    2014-08-01

    There is an increasing need to develop and apply powerful statistical tests to detect associations between multiple traits and a single locus, as arising from neuroimaging genetics and other studies. For example, in the Alzheimer's Disease Neuroimaging Initiative (ADNI), in addition to genome-wide single nucleotide polymorphisms (SNPs), thousands of neuroimaging and neuropsychological phenotypes have been collected as intermediate phenotypes for Alzheimer's disease. Although some classic methods like MANOVA and some newly proposed methods may be applied, they have their own limitations; for example, MANOVA cannot be applied to binary and other discrete traits. In addition, the relationships among these methods are not well understood. Importantly, since these tests are not data-adaptive, they may or may not be powerful depending on the unknown association patterns among multiple traits and between multiple traits and a locus. In this paper we propose a class of data-adaptive weights and the corresponding weighted tests in the general framework of generalized estimation equations (GEE). A highly adaptive test is proposed to select the most powerful test from this class so that it can maintain high power across a wide range of situations. Our proposed tests are applicable to various types of traits, with or without covariates. Importantly, we also analytically show relationships among some existing tests and our proposed tests, indicating that many existing tests are special cases of ours. Extensive simulation studies were conducted to compare and contrast the power properties of various existing and new methods. Finally, we applied the methods to an ADNI dataset to illustrate the performance of the methods. We conclude by recommending the GEE-based Score test and our proposed adaptive test for their high and complementary performance.

  14. Application of Bayesian Methods for Detecting Fraudulent Behavior on Tests

    ERIC Educational Resources Information Center

    Sinharay, Sandip

    2018-01-01

    Producers and consumers of test scores are increasingly concerned about fraudulent behavior before and during the test. There exist several statistical or psychometric methods for detecting fraudulent behavior on tests. This paper provides a review of the Bayesian approaches among them. Four hitherto-unpublished real data examples are provided to…

  15. Current limitations and recommendations to improve testing for the environmental assessment of endocrine active substances

    USGS Publications Warehouse

    Coady, Katherine K.; Biever, Ronald C.; Denslow, Nancy D.; Gross, Melanie; Guiney, Patrick D.; Holbech, Henrik; Karouna-Renier, Natalie K.; Katsiadaki, Ioanna; Krueger, Hank; Levine, Steven L.; Maack, Gerd; Williams, Mike; Wolf, Jeffrey C.; Ankley, Gerald T.

    2017-01-01

    In the present study, existing regulatory frameworks and test systems for assessing potential endocrine active chemicals are described, and associated challenges are discussed, along with proposed approaches to address these challenges. Regulatory frameworks vary somewhat across geographies, but all basically evaluate whether a chemical possesses endocrine activity and whether this activity can result in adverse outcomes either to humans or to the environment. Current test systems include in silico, in vitro, and in vivo techniques focused on detecting potential endocrine activity, and in vivo tests that collect apical data to detect possible adverse effects. These test systems are currently designed to robustly assess endocrine activity and/or adverse effects in the estrogen, androgen, and thyroid hormone signaling pathways; however, there are some limitations of current test systems for evaluating endocrine hazard and risk. These limitations include a lack of certainty regarding: 1) adequately sensitive species and life stages; 2) mechanistic endpoints that are diagnostic for endocrine pathways of concern; and 3) the linkage between mechanistic responses and apical, adverse outcomes. Furthermore, some existing test methods are resource intensive with regard to time, cost, and use of animals. However, based on recent experiences, there are opportunities to improve approaches to and guidance for existing test methods and to reduce uncertainty. For example, in vitro high-throughput screening could be used to prioritize chemicals for testing and provide insights as to the most appropriate assays for characterizing hazard and risk. 
    Other recommendations include adding endpoints for elucidating connections between mechanistic effects and adverse outcomes, identifying potentially sensitive taxa for which test methods currently do not exist, and addressing key endocrine pathways of possible concern in addition to those associated with estrogen, androgen, and thyroid signaling.

  16. An efficient genome-wide association test for multivariate phenotypes based on the Fisher combination function.

    PubMed

    Yang, James J; Li, Jia; Williams, L Keoki; Buu, Anne

    2016-01-05

    In genome-wide association studies (GWAS) for complex diseases, the association between a SNP and each phenotype is usually weak. Combining multiple related phenotypic traits can increase the power of gene search and thus is a practically important area that requires methodological work. This study provides a comprehensive review of existing methods for conducting GWAS on complex diseases with multiple phenotypes, including multivariate analysis of variance (MANOVA), principal component analysis (PCA), generalized estimating equations (GEE), the trait-based association test involving the extended Simes procedure (TATES), and the classical Fisher combination test. We propose a new method that relaxes the unrealistic independence assumption of the classical Fisher combination test and is computationally efficient. To demonstrate applications of the proposed method, we also present the results of statistical analysis on the Study of Addiction: Genetics and Environment (SAGE) data. Our simulation study shows that the proposed method has higher power than existing methods while controlling the type I error rate. The GEE and the classical Fisher combination test, on the other hand, do not control the type I error rate and thus are not recommended. In general, the power of the competing methods decreases as the correlation between phenotypes increases. All the methods tend to have lower power when the multivariate phenotypes come from long-tailed distributions. The real data analysis also demonstrates that the proposed method allows us to compare the marginal results with the multivariate results and to specify which SNPs are specific to a particular phenotype or contribute to the common construct. The proposed method outperforms existing methods in most settings and also has great applications in GWAS on complex diseases with multiple phenotypes, such as substance abuse disorders.
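
    For reference, the classical Fisher combination test that the abstract builds on is easy to state: with k independent p-values, T = -2 Σ ln(p_i) follows a chi-square distribution with 2k degrees of freedom under the global null. The sketch below implements only this classical version; the paper's contribution is relaxing the independence assumption, which this sketch does not do.

```python
# Classical Fisher combination test: T = -2 * sum(ln p_i) ~ chi2(2k)
# under independence of the k p-values. Sketch of the baseline method only.
import math

def chi2_sf_even_df(x, df):
    """Chi-square survival function for even df, via the closed-form series."""
    m = df // 2
    term, total = 1.0, 1.0
    for j in range(1, m):
        term *= (x / 2.0) / j
        total += term
    return math.exp(-x / 2.0) * total

def fisher_combine(p_values):
    """Combined p-value for k independent per-phenotype p-values."""
    t = -2.0 * sum(math.log(p) for p in p_values)
    return chi2_sf_even_df(t, 2 * len(p_values))
```

For example, three per-phenotype p-values of 0.04, 0.10, and 0.03 combine to roughly 0.006, smaller than any of them individually; with correlated phenotypes, however, this reference distribution is wrong, which is exactly the problem the proposed method addresses.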

  17. Accelerated Colorimetric Micro-assay for Screening Mold Inhibitors

    Treesearch

    Carol A. Clausen; Vina W. Yang

    2014-01-01

    Rapid quantitative laboratory test methods are needed to screen potential antifungal agents. Existing laboratory test methods are relatively time consuming, may require specialized test equipment and rely on subjective visual ratings. A quantitative, colorimetric micro-assay has been developed that uses XTT tetrazolium salt to metabolically assess mold spore...

  18. Development of wheelchair caster testing equipment and preliminary testing of caster models

    PubMed Central

    Mhatre, Anand; Ott, Joseph

    2017-01-01

    Background: Because of the adverse environmental conditions present in less-resourced environments (LREs), the World Health Organization (WHO) has recommended that specialised wheelchair test methods may need to be developed to support product quality standards in these environments. A group of experts identified caster test methods as a high priority because of their common failure in LREs and the insufficiency of existing test methods described in the International Organization for Standardization (ISO) wheelchair testing standards (ISO 7176). Objectives: To develop and demonstrate the feasibility of a caster system test method. Method: Background literature and expert opinions were collected to identify existing caster test methods, caster failures common in LREs, and environmental conditions present in LREs. Several conceptual designs for the caster testing method were developed; through an iterative process using expert feedback, a final concept and design were developed and a prototype was fabricated. Feasibility tests were conducted by testing a series of caster systems from wheelchairs used in LREs, and failure modes were recorded and compared to anecdotal reports about field failures. Results: The new caster testing system was developed, and it provides the flexibility to expose caster systems to typical conditions in LREs. Caster failures such as stem bolt fractures, fork fractures, bearing failures, and tire cracking occurred during testing trials and are consistent with field failures. Conclusion: The new caster test system has the capability to incorporate the necessary test factors that degrade caster quality in LREs. Future work includes developing and validating a testing protocol that reproduces failure modes common during wheelchair use in LREs. PMID:29062762

  19. Multiple Testing of Gene Sets from Gene Ontology: Possibilities and Pitfalls.

    PubMed

    Meijer, Rosa J; Goeman, Jelle J

    2016-09-01

    The use of multiple testing procedures in the context of gene-set testing is an important but relatively underexposed topic. If a multiple testing method is used, this is usually a standard familywise error rate (FWER) or false discovery rate (FDR) controlling procedure in which the logical relationships that exist between the different (self-contained) hypotheses are not taken into account. Taking those relationships into account, however, can lead to more powerful variants of existing multiple testing procedures and can make summarizing and interpreting the final results easier. We will show that, from the perspective of interpretation as well as from the perspective of power improvement, FWER controlling methods are more suitable than FDR controlling methods. As an example of a possible power improvement, we suggest a modified version of the popular method by Holm, which we also implemented in the R package cherry.
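
    As context for the power-improvement discussion, here is a minimal sketch of the unmodified Holm step-down procedure, the FWER-controlling baseline that the modified variant in the cherry package improves on by exploiting logical relationships between hypotheses (which this sketch ignores):

```python
# Holm's step-down FWER procedure (unmodified baseline; the variant in the
# R package cherry additionally exploits logical relations among hypotheses).
def holm_reject(p_values, alpha=0.05):
    """Return a boolean list: which hypotheses Holm's method rejects at level alpha."""
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])
    reject = [False] * m
    for k, i in enumerate(order):
        if p_values[i] <= alpha / (m - k):
            reject[i] = True
        else:
            break  # step-down: stop at the first non-rejection
    return reject
```

For instance, with p-values (0.001, 0.02, 0.04, 0.5) at alpha = 0.05, only the first hypothesis is rejected: 0.001 ≤ 0.05/4, but 0.02 > 0.05/3, so the procedure stops.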

  20. Effect of Blast-Induced Vibration from New Railway Tunnel on Existing Adjacent Railway Tunnel in Xinjiang, China

    NASA Astrophysics Data System (ADS)

    Liang, Qingguo; Li, Jie; Li, Dewu; Ou, Erfeng

    2013-01-01

    The vibrations of existing service tunnels induced by blast-excavation of adjacent tunnels have attracted much attention from both academics and engineers in China during recent decades. The blasting vibration velocity (BVV) is the most widely used controlling index for in situ monitoring and safety assessment of existing lining structures. Although numerous in situ tests and simulations have been carried out to investigate blast-induced vibrations of existing tunnels due to excavation of new tunnels (mostly by the bench excavation method), research on the overall dynamic response of existing service tunnels, in terms not only of BVV but also of stress/strain, has been limited for new tunnels excavated by the full-section blasting method. In this paper, the impacts of blast-induced vibrations from a new tunnel on an existing railway tunnel in Xinjiang, China were comprehensively investigated using laboratory tests, in situ monitoring, and numerical simulations. The measured data from laboratory tests and in situ monitoring were used to determine the parameters needed for the numerical simulations and were compared with the calculated results. Based on the results from in situ monitoring and numerical simulations, which were consistent with each other, the original blasting design and corresponding parameters were adjusted to reduce the maximum BVV, which proved to be effective and safe. The effect of both the static stress before blasting and the blast-induced dynamic stress on the total stresses in the existing tunnel lining is also discussed. The methods and related results presented could be applied in projects with similar ground conditions and a similar distance between old and new tunnels if the new tunnel is to be excavated by the full-section blasting method.

  1. Comparison of bulk sediment and sediment elutriate toxicity testing methods

    EPA Science Inventory

    Elutriate bioassays are among numerous methods that exist for assessing the potential toxicity of sediments in aquatic systems. In this study, interlaboratory results were compared from 96-hour Ceriodaphnia dubia and Pimephales promelas static-renewal acute toxicity tests conduct...

  2. A Probability Based Framework for Testing the Missing Data Mechanism

    ERIC Educational Resources Information Center

    Lin, Johnny Cheng-Han

    2013-01-01

    Many methods exist for imputing missing data but fewer methods have been proposed to test the missing data mechanism. Little (1988) introduced a multivariate chi-square test for the missing completely at random data mechanism (MCAR) that compares observed means for each pattern with expectation-maximization (EM) estimated means. As an alternative,…

  3. Niépce-Bell or Turing: how to test odour reproduction.

    PubMed

    Harel, David

    2016-12-01

    Decades before the existence of anything resembling an artificial intelligence system, Alan Turing raised the question of how to test whether machines can think, or, in modern terminology, whether a computer claimed to exhibit intelligence indeed does so. This paper raises the analogous issue for olfaction: how to test the validity of a system claimed to reproduce arbitrary odours artificially, in a way recognizable to humans. Although odour reproduction systems are still far from being viable, the question of how to test candidates thereof is claimed to be interesting and non-trivial, and a novel method is proposed. Despite the similarity between the two questions and their surfacing long before the tested systems exist, the present question cannot be answered adequately by a Turing-like method. Instead, our test is very different: it is conditional, requiring from the artificial no more than is required from the original, and it employs a novel method of immersion that takes advantage of the availability of easily recognizable reproduction methods for sight and sound, à la Nicéphore Niépce and Alexander Graham Bell.

  5. A simplified analytic form for generation of axisymmetric plasma boundaries

    DOE PAGES

    Luce, Timothy C.

    2017-02-23

    An improved method has been formulated for generating analytic boundary shapes as input for axisymmetric MHD equilibria. This method uses the family of superellipses as the basis function, as previously introduced. The improvements are a simplified notation, reduction of the number of simultaneous nonlinear equations to be solved, and the realization that not all combinations of input parameters admit a solution to the nonlinear constraint equations. The method tests for the existence of a self-consistent solution and, when no solution exists, it uses a deterministic method to find a nearby solution. As a result, examples of generation of boundaries, including tests with an equilibrium solver, are given.
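
    The superellipse family mentioned above is the curve |x/a|^n + |y/b|^n = 1; it reduces to an ellipse for n = 2 and approaches a rectangle as n grows. The sketch below samples points on such a curve using the standard parametrization; the parameters are illustrative only, not the paper's formulation or its constraint-solving machinery.

```python
# Illustrative sketch: sample points on a superellipse |x/a|^n + |y/b|^n = 1,
# the basis-function family mentioned in the abstract. Parameters are
# hypothetical; this is not the paper's boundary-generation algorithm.
import math

def superellipse_points(a, b, n, num=100):
    """Sample (x, y) on the superellipse via x = a*sgn(cos t)|cos t|^(2/n), etc."""
    pts = []
    for k in range(num):
        t = 2.0 * math.pi * k / num
        c, s = math.cos(t), math.sin(t)
        x = a * math.copysign(abs(c) ** (2.0 / n), c)
        y = b * math.copysign(abs(s) ** (2.0 / n), s)
        pts.append((x, y))
    return pts
```

Every sampled point satisfies the implicit equation by construction, since (|cos t|^(2/n))^n = cos² t and likewise for the sine term.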

  7. Construction of Expert Knowledge Monitoring and Assessment System Based on Integral Method of Knowledge Evaluation

    ERIC Educational Resources Information Center

    Golovachyova, Viktoriya N.; Menlibekova, Gulbakhyt Zh.; Abayeva, Nella F.; Ten, Tatyana L.; Kogaya, Galina D.

    2016-01-01

    Using computer-based monitoring systems that rely on tests could be the most effective way of knowledge evaluation. The problem of objective knowledge assessment by means of testing takes on a new dimension in the context of new paradigms in education. The analysis of the existing test methods enabled us to conclude that tests with selected…

  8. [Implication of inverse-probability weighting method in the evaluation of diagnostic test with verification bias].

    PubMed

    Kang, Leni; Zhang, Shaokai; Zhao, Fanghui; Qiao, Youlin

    2014-03-01

    To evaluate and adjust for the verification bias that exists in screening or diagnostic tests, the inverse-probability weighting method was used to adjust the sensitivity and specificity of the diagnostic tests, with a cervical cancer screening example used to introduce the Compare Tests package in R software, with which the method can be implemented. Sensitivity and specificity calculated by the traditional method and by maximum likelihood estimation were compared to the results from the inverse-probability weighting method in the random-sampling example. The true sensitivity and specificity of the HPV self-sampling test were 83.53% (95%CI: 74.23-89.93) and 85.86% (95%CI: 84.23-87.36). In the analysis of data with randomly missing gold-standard verification, the sensitivity and specificity calculated by the traditional method were 90.48% (95%CI: 80.74-95.56) and 71.96% (95%CI: 68.71-75.00), respectively. The adjusted sensitivity and specificity under the inverse-probability weighting method were 82.25% (95%CI: 63.11-92.62) and 85.80% (95%CI: 85.09-86.47), respectively, whereas they were 80.13% (95%CI: 66.81-93.46) and 85.80% (95%CI: 84.20-87.41) under the maximum likelihood estimation method. The inverse-probability weighting method can effectively adjust the sensitivity and specificity of a diagnostic test when verification bias exists, especially under complex sampling.
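
    The inverse-probability weighting idea above is simple to sketch: each verified subject is weighted by 1 / Pr(verified), so under-sampled groups count proportionally more. The cohort and verification probabilities below are made up for illustration; the R package named in the abstract implements the full method.

```python
# Sketch of inverse-probability-weighted sensitivity/specificity under
# verification bias: each verified subject gets weight 1 / Pr(verified).
# The cohort below is hypothetical.
def ipw_sens_spec(verified):
    """verified: (test_positive, diseased, p_verified) per verified subject."""
    tp = fn = fp = tn = 0.0
    for test_pos, diseased, p_verified in verified:
        w = 1.0 / p_verified  # inverse-probability weight
        if diseased and test_pos:
            tp += w
        elif diseased:
            fn += w
        elif test_pos:
            fp += w
        else:
            tn += w
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical design: every test-positive is verified (p = 1.0), but only
# half of test-negatives are (p = 0.5), which biases naive estimates.
verified = ([(True, True, 1.0)] * 80 + [(False, True, 0.5)] * 10 +
            [(True, False, 1.0)] * 30 + [(False, False, 0.5)] * 35)
sens, spec = ipw_sens_spec(verified)  # recovers sensitivity 0.80, specificity 0.70
```

The naive estimate among verified subjects alone would be 80/90 ≈ 0.89 sensitivity, overstating it exactly as in the abstract's example; the weights undo the under-sampling of test-negatives.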

  9. Approximations to the distribution of a test statistic in covariance structure analysis: A comprehensive study.

    PubMed

    Wu, Hao

    2018-05-01

    In structural equation modelling (SEM), a robust adjustment to the test statistic or to its reference distribution is needed when its null distribution deviates from a χ² distribution, which usually arises when data do not follow a multivariate normal distribution. Unfortunately, existing studies on this issue typically focus on only a few methods and neglect the majority of alternative methods in statistics. Existing simulation studies typically consider only non-normal distributions of data that either satisfy asymptotic robustness or lead to an asymptotic scaled χ² distribution. In this work we conduct a comprehensive study that involves both typical methods in SEM and less well-known methods from the statistics literature. We also propose the use of several novel non-normal data distributions that are qualitatively different from the non-normal distributions widely used in existing studies. We found that several under-studied methods give the best performance under specific conditions, but the Satorra-Bentler method remains the most viable method for most situations.

  10. Delay test generation for synchronous sequential circuits

    NASA Astrophysics Data System (ADS)

    Devadas, Srinivas

    1989-05-01

    We address the problem of generating tests for delay faults in non-scan synchronous sequential circuits. Delay test generation for sequential circuits is a considerably more difficult problem than delay testing of combinational circuits and has received much less attention. In this paper, we present a method for generating test sequences to detect delay faults in sequential circuits using the stuck-at fault sequential test generator STALLION. The method is complete in that it will generate a delay test sequence for a targeted fault given sufficient CPU time, if such a sequence exists. We term faults for which no delay test sequence exists, under our test methodology, sequentially delay redundant. We describe means of eliminating sequential delay redundancies in logic circuits. We present a partial-scan methodology for enhancing the testability of difficult-to-test or untestable sequential circuits, wherein a small number of flip-flops are selected and made controllable/observable. The selection process guarantees the elimination of all sequential delay redundancies. We show that an intimate relationship exists between state assignment and delay testability of a sequential machine. We describe a state assignment algorithm for the synthesis of sequential machines with maximal delay fault testability. Preliminary experimental results using the test generation, partial-scan, and synthesis algorithms are presented.

  11. An interlaboratory comparison of sediment elutriate preparation and toxicity test methods

    EPA Science Inventory

    Elutriate bioassays are among numerous methods that exist for assessing the potential toxicity of sediments in aquatic systems. In this study, interlaboratory results were compared from 96-hour Ceriodaphnia dubia and Pimephales promelas static-renewal acute toxicity tests conduct...

  12. The Development of Testing Methods for Characterizing Emissions and Sources of Exposures from Polyurethane Products

    EPA Science Inventory

    The relationship between onsite manufacture of spray polyurethane foam insulation (SPFI) and potential exposures is not well understood. Currently, no comprehensive standard test methods exist for characterizing and quantifying product emissions. Exposures to diisocyanate compoun...

  13. Aerodynamics and performance verifications of test methods for laboratory fume cupboards.

    PubMed

    Tseng, Li-Ching; Huang, Rong Fung; Chen, Chih-Chieh; Chang, Cheng-Ping

    2007-03-01

    The laser-light-sheet-assisted smoke flow visualization technique is performed on a full-size, transparent, commercial-grade chemical fume cupboard to diagnose the flow characteristics and to verify the validity of several current containment test methods. The visualized flow patterns identify the recirculation areas that inevitably exist in conventional fume cupboards because of their fundamental configurations and structures. Large-scale vortex structures exist around the side walls, the doorsill of the cupboard, and in the near-wake region of the manikin. The identified recirculation areas are taken as the 'dangerous' regions where the risk of turbulent dispersion of contaminants may be high. Several existing tracer gas containment test methods (BS 7258:1994, prEN 14175-3:2003 and ANSI/ASHRAE 110:1995) are conducted to verify the effectiveness of these methods in detecting contaminant leakage. By comparing the results of the flow visualization and the tracer gas tests, it is found that the local recirculation regions are more prone to contaminant leakage because of the complex interaction between the shear layers and the smoke movement through the mechanism of turbulent dispersion. From the point of view of aerodynamics, the present study verifies that the methodology of the prEN 14175-3:2003 protocol produces more reliable and consistent results because it is based on region-by-region measurement and encompasses most of the area of the cupboard's recirculation zone. A modified test method combining the region-by-region approach with the presence of the manikin shows substantially different containment results. A performance test method that better describes an operator's exposure, and the correlation between flow characteristics and contaminant leakage properties, is therefore suggested.

  14. Overview of a workshop on screening methods for detecting potential (anti-) estrogenic/androgenic chemicals in wildlife

    USGS Publications Warehouse

    Ankley, Gerald T.; Mihaich, Ellen; Stahl, Ralph G.; Tillitt, Donald E.; Colborn, Theo; McMaster, Suzzanne; Miller, Ron; Bantle, John; Campbell, Pamela; Denslow, Nancy; Dickerson, Richard L.; Folmar, Leroy C.; Fry, Michael; Giesy, John P.; Gray, L. Earl; Guiney, Patrick; Hutchinson, Thomas; Kennedy, Sean W.; Kramer, Vincent; LeBlanc, Gerald A.; Mayes, Monte; Nimrod, Alison; Patino, Reynaldo; Peterson, Richard; Purdy, Richard; Ringer, Robert; Thomas, Peter C.; Touart, Les; Van Der Kraak, Glen; Zacharewski, Tim

    1998-01-01

    The U.S. Congress has passed legislation requiring the U.S. Environmental Protection Agency (U.S. EPA) to develop, validate, and implement screening tests for identifying potential endocrine-disrupting chemicals within 3 years. To aid in the identification of methods suitable for this purpose, the U.S. EPA, the Chemical Manufacturers Association, and the World Wildlife Fund sponsored several workshops, including the present one, which dealt with wildlife species. This workshop was convened with 30 international scientists representing multiple disciplines in March 1997 in Kansas City, Missouri, USA. Participants at the meeting identified methods in terms of their ability to indicate (anti-) estrogenic/androgenic effects, particularly in the context of developmental and reproductive processes. Data derived from structure-activity relationship models and in vitro test systems, although useful in certain contexts, cannot at present replace in vivo tests as the sole basis for screening. A consensus was reached that existing mammalian test methods (e.g., with rats or mice) generally are suitable as screens for assessing potential (anti-) estrogenic/androgenic effects in mammalian wildlife. However, due to factors such as among-class variation in receptor structure and endocrine function, it is uncertain if these mammalian assays would be of broad utility as screens for other classes of vertebrate wildlife. Existing full and partial life-cycle tests with some avian and fish species could successfully identify chemicals causing endocrine disruption; however, these long-term tests are not suitable for routine screening. However, a number of short-term tests with species from these two classes exist that could serve as effective screening tools for chemicals inducing (anti-) estrogenic/androgenic effects.
Existing methods suitable for identifying chemicals with these mechanisms of action in reptiles and amphibians are limited, but in the future, tests with species from these classes may prove highly effective as screens. In the case of invertebrate species, too little is known at present about the biological role of estrogens and androgens in reproduction and development to recommend specific assays.

  15. Nondestructive equipment study

    NASA Technical Reports Server (NTRS)

    1985-01-01

    Identification of existing nondestructive evaluation (NDE) methods that could be used in a low Earth orbit environment; evaluation of each method with respect to the set of criteria called out in the statement of work; selection of the most promising NDE methods for further evaluation; use of selected NDE methods to test samples of pressure vessel materials in a vacuum; pressure testing of a complex monolithic pressure vessel with known flaws using acoustic emissions in a vacuum; and recommendations for further studies based on analysis and testing are covered.

  16. A basic guide to overlay design using nondestructive testing equipment data

    NASA Astrophysics Data System (ADS)

    Turner, Vernon R.

    1990-08-01

    The purpose of this paper is to provide a basic and concise guide to designing asphalt concrete (AC) overlays over existing AC pavements. The basis for these designs is deflection data obtained from nondestructive testing (NDT) equipment. These data are used in design procedures that produce the required overlay thickness or an estimate of remaining pavement life. This guide enables one to design overlays or to better monitor the designs being performed by others. This paper discusses three types of NDT equipment; the Asphalt Institute overlay design procedures, by deflection analysis and by the effective thickness method; a method of estimating remaining pavement life; correlations between NDT equipment; and recent correlations in Washington State. Asphalt overlays provide one of the most cost-effective methods of improving existing pavements. They can be used to strengthen existing pavements, to reduce maintenance costs, to increase pavement life, to provide a smoother ride, and to improve skid resistance.

  17. Optimizing the performance of the amphipod, Hyalella azteca, in chronic toxicity tests: Results of feeding studies with various foods and feeding regimes

    EPA Science Inventory

    The freshwater amphipod, Hyalella azteca, is a common organism used for sediment toxicity testing. Standard methods for 10-d and 42-d sediment toxicity tests with H. azteca were last revised and published by USEPA/ASTM in 2000. While Hyalella azteca methods exist for sediment tox...

  18. Dynamic Bayesian Networks as a Probabilistic Metamodel for Combat Simulations

    DTIC Science & Technology

    2014-09-18

    …test is commonly used for large data sets and is the method of comparison presented in Section 5.5. 4.3.3 Kullback-Leibler Divergence Goodness of Fit …methods exist that might improve the results. A goodness-of-fit test using the Kullback-Leibler divergence was proposed in the first paper, but still… Kullback-Leibler Divergence Goodness of Fit Test…
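    For reference, the Kullback-Leibler divergence named in these excerpts is, for discrete distributions P and Q, D_KL(P‖Q) = Σ p_i log(p_i/q_i). A minimal sketch of that computation (a generic illustration, not the specific goodness-of-fit test proposed in the report):

```python
import math

def kl_divergence(p, q):
    """D_KL(P || Q) = sum_i p_i * log(p_i / q_i) for discrete distributions.
    Terms with p_i == 0 contribute nothing; q_i must be > 0 wherever p_i > 0."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# The divergence is zero only when the distributions agree, so a
# goodness-of-fit test rejects when the empirical-vs-model divergence is large.
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
d = kl_divergence(p, q)
```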

  19. Advanced bridge safety initiative : recommended practices for live load testing of existing flat-slab concrete bridges - task 5.

    DOT National Transportation Integrated Search

    2012-12-01

    Current AASHTO provisions for load rating flat-slab concrete bridges use the equivalent strip : width method, which is regarded as overly conservative compared to more advanced analysis : methods and field live load testing. It has been shown that li...

  20. Current limitations and recommendations to improve testing ...

    EPA Pesticide Factsheets

    In this paper existing regulatory frameworks and test systems for assessing potential endocrine-active chemicals are described, and associated challenges discussed, along with proposed approaches to address these challenges. Regulatory frameworks vary somewhat across organizations, but all basically evaluate whether a chemical possesses endocrine activity and whether this activity can result in adverse outcomes either to humans or the environment. Current test systems include in silico, in vitro and in vivo techniques focused on detecting potential endocrine activity, and in vivo tests that collect apical data to detect possible adverse effects. These test systems are currently designed to robustly assess endocrine activity and/or adverse effects in the estrogen, androgen, and thyroid hormonal pathways; however, there are some limitations of current test systems for evaluating endocrine hazard and risk. These limitations include a lack of certainty regarding: 1) adequately sensitive species and life-stages, 2) mechanistic endpoints that are diagnostic for endocrine pathways of concern, and 3) the linkage between mechanistic responses and apical, adverse outcomes. Furthermore, some existing test methods are resource intensive in regard to time, cost, and use of animals. However, based on recent experiences, there are opportunities to improve approaches to, and guidance for, existing test methods, and to reduce uncertainty. For example, in vitro high throughput

  1. Corrosion performance tests for reinforcing steel in concrete : test procedures.

    DOT National Transportation Integrated Search

    2009-09-01

    The existing test method to assess the corrosion performance of reinforcing steel embedded in concrete, mainly : ASTM G109, is labor intensive, time consuming, slow to provide comparative results, and often expensive. : However, corrosion of reinforc...

  2. Estimating the Cost of Standardized Student Testing in the United States.

    ERIC Educational Resources Information Center

    Phelps, Richard P.

    2000-01-01

    Describes and contrasts different methods of estimating costs of standardized testing. Using a cost-accounting approach, compares gross and marginal costs and considers testing objects (test materials and services, personnel and student time, and administrative/building overhead). Social marginal costs of replacing existing tests with a national…

  3. Accuracy of p53 Codon 72 Polymorphism Status Determined by Multiple Laboratory Methods: A Latent Class Model Analysis

    PubMed Central

    Walter, Stephen D.; Riddell, Corinne A.; Rabachini, Tatiana; Villa, Luisa L.; Franco, Eduardo L.

    2013-01-01

    Introduction: Studies on the association of a polymorphism in codon 72 of the p53 tumour suppressor gene (rs1042522) with cervical neoplasia have inconsistent results. While several methods for genotyping p53 exist, they vary in accuracy and are often discrepant. Methods: We used latent class models (LCM) to examine the accuracy of six methods for p53 determination, all conducted by the same laboratory. We also examined the association of p53 with cytological cervical abnormalities, recognising potential test inaccuracy. Results: Pairwise disagreement between laboratory methods occurred approximately 10% of the time. Given the estimated true p53 status of each woman, we found that each laboratory method is most likely to classify a woman to her correct status. Arg/Arg women had the highest risk of squamous intraepithelial lesions (SIL). Test accuracy was independent of cytology. There was no strong evidence for correlations of test errors. Discussion: Empirical analyses ignore possible laboratory errors, and so are inherently biased, but test accuracy estimated by the LCM approach is unbiased when model assumptions are met. LCM analysis avoids ambiguities arising from empirical test discrepancies, obviating the need to regard any of the methods as a “gold” standard measurement. The methods we presented here to analyse the p53 data can be applied in many other situations where multiple tests exist, but where none of them is a gold standard. PMID:23441193

  4. Good Laboratory Practices of Materials Testing at NASA White Sands Test Facility

    NASA Technical Reports Server (NTRS)

    Hirsch, David; Williams, James H.

    2005-01-01

    An approach to good laboratory practices of materials testing at NASA White Sands Test Facility is presented. The contents include: 1) Current approach; 2) Data analysis; and 3) Improvements sought by WSTF to enhance the diagnostic capability of existing methods.

  5. Testing for genetic association taking into account phenotypic information of relatives.

    PubMed

    Uh, Hae-Won; Wijk, Henk Jan van der; Houwing-Duistermaat, Jeanine J

    2009-12-15

    We investigated efficient case-control association analysis using family data. The outcome of interest was coronary heart disease. We employed existing and new methods that take into account the correlations among related individuals to obtain the proper type I error rates. The methods considered for autosomal single-nucleotide polymorphisms were: 1) generalized estimating equations-based methods, 2) variance-modified Cochran-Armitage (MCA) trend test incorporating kinship coefficients, and 3) genotypic modified quasi-likelihood score test. Additionally, for X-linked single-nucleotide polymorphisms we proposed a two-degrees-of-freedom test. Performance of these methods was tested using Framingham Heart Study 500 k array data.
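    As a point of reference, the plain (unrelated-samples) Cochran-Armitage trend test that method 2) modifies can be sketched via the N·r² form of the score statistic; the kinship-coefficient variance modification from the abstract is not shown, and the genotype counts below are invented:

```python
import math

def trend_test(case_counts, control_counts, scores=(0, 1, 2)):
    """Cochran-Armitage trend chi-square for a 2x3 genotype table,
    computed as N * r^2 where r is the Pearson correlation between
    the genotype score and the case indicator (~ chi-square, 1 df)."""
    xs, ys = [], []
    for s, (a, b) in zip(scores, zip(case_counts, control_counts)):
        xs += [s] * (a + b)          # genotype score per subject
        ys += [1] * a + [0] * b      # 1 = case, 0 = control
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    r = cov / math.sqrt(vx * vy)
    return n * r * r

chi2 = trend_test([10, 20, 30], [30, 20, 10])  # cases enriched for allele copies
```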

  6. NEAT: an efficient network enrichment analysis test.

    PubMed

    Signorelli, Mirko; Vinciotti, Veronica; Wit, Ernst C

    2016-09-05

    Network enrichment analysis is a powerful method that integrates gene enrichment analysis with the information on relationships between genes provided by gene networks. Existing tests for network enrichment analysis deal only with undirected networks, can be computationally slow, and are based on normality assumptions. We propose NEAT, a test for network enrichment analysis. The test is based on the hypergeometric distribution, which naturally arises as the null distribution in this context. NEAT can be applied not only to undirected, but also to directed and partially directed networks. Our simulations indicate that NEAT is considerably faster than alternative resampling-based methods, and that its capacity to detect enrichments is at least as good as that of alternative tests. We discuss applications of NEAT to network analyses in yeast by testing for enrichment of the Environmental Stress Response target gene set with GO Slim and KEGG functional gene sets, and also by inspecting associations between functional sets themselves. NEAT is a flexible and efficient test for network enrichment analysis that aims to overcome some limitations of existing resampling-based tests. The method is implemented in the R package neat, which can be freely downloaded from CRAN ( https://cran.r-project.org/package=neat ).
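    The hypergeometric null that NEAT builds on is the same one behind classical overrepresentation tests. A self-contained sketch of a one-sided enrichment p-value using exact combinatorics (an illustration of the distribution, not the neat package itself):

```python
from math import comb

def enrichment_pvalue(k, n_a, n_b, n_universe):
    """P(X >= k) for X ~ Hypergeometric: the chance that a random size-n_b
    subset of n_universe genes shares at least k members with a fixed
    size-n_a gene set. Exact combinatorics, so keep the inputs small."""
    denom = comb(n_universe, n_b)
    hi = min(n_a, n_b)
    tail = sum(comb(n_a, i) * comb(n_universe - n_a, n_b - i)
               for i in range(k, hi + 1))
    return tail / denom

# Two size-10 sets in a 50-gene universe share 2 genes on average,
# so an overlap of 6 is strong evidence of enrichment.
p = enrichment_pvalue(6, 10, 10, 50)
```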

  7. Correlation between microdilution, Etest, and disk diffusion methods for antifungal susceptibility testing of fluconazole against Candida sp. blood isolates.

    PubMed

    Menezes, Everardo Albuquerque; Vasconcelos Júnior, Antônio Alexandre de; Ângelo, Maria Rozzelê Ferreira; Cunha, Maria da Conceição dos Santos Oliveira; Cunha, Francisco Afrânio

    2013-01-01

    Antifungal susceptibility testing assists in finding the appropriate treatment for fungal infections, which are increasingly common. However, such testing is not very widespread. There are several existing methods, and the correlation between such methods was evaluated in this study. The susceptibility to fluconazole of 35 strains of Candida sp. isolated from blood cultures was evaluated by the following methods: microdilution, Etest, and disk diffusion. The correlation between the methods was around 90%. The disk diffusion test exhibited a good correlation and can be used in laboratory routines to detect strains of Candida sp. that are resistant to fluconazole.

  8. The Testing Methods and Gender Differences in Multiple-Choice Assessment

    NASA Astrophysics Data System (ADS)

    Ng, Annie W. Y.; Chan, Alan H. S.

    2009-10-01

    This paper provides a comprehensive review of multiple-choice assessment over the past two decades to facilitate effective testing in various subject areas. It was revealed that a variety of multiple-choice test methods, viz. conventional multiple-choice, liberal multiple-choice, elimination testing, confidence marking, probability testing, and order-of-preference schemes, are available for use in assessing subjects' knowledge and decision ability. However, the best multiple-choice test method for use has not yet been identified. The review also indicated that the existence of gender differences in multiple-choice task performance might be due to the test area, instruction/scoring condition, and item difficulty.

  9. QQ-SNV: single nucleotide variant detection at low frequency by comparing the quality quantiles.

    PubMed

    Van der Borght, Koen; Thys, Kim; Wetzels, Yves; Clement, Lieven; Verbist, Bie; Reumers, Joke; van Vlijmen, Herman; Aerssens, Jeroen

    2015-11-10

    Next generation sequencing enables studying heterogeneous populations of viral infections. When the sequencing is done at high coverage depth ("deep sequencing"), low frequency variants can be detected. Here we present QQ-SNV (http://sourceforge.net/projects/qqsnv), a logistic regression classifier model developed for the Illumina sequencing platforms that uses the quantiles of the quality scores, to distinguish true single nucleotide variants from sequencing errors based on the estimated SNV probability. To train the model, we created a dataset of an in silico mixture of five HIV-1 plasmids. Testing of our method in comparison to the existing methods LoFreq, ShoRAH, and V-Phaser 2 was performed on two HIV and four HCV plasmid mixture datasets and one influenza H1N1 clinical dataset. For default application of QQ-SNV, variants were called using a SNV probability cutoff of 0.5 (QQ-SNV(D)). To improve the sensitivity we used a SNV probability cutoff of 0.0001 (QQ-SNV(HS)). To also increase specificity, SNVs called were overruled when their frequency was below the 80th percentile calculated on the distribution of error frequencies (QQ-SNV(HS-P80)). When comparing QQ-SNV versus the other methods on the plasmid mixture test sets, QQ-SNV(D) performed similarly to the existing approaches. QQ-SNV(HS) was more sensitive on all test sets but with more false positives. QQ-SNV(HS-P80) was found to be the most accurate method over all test sets by balancing sensitivity and specificity. When applied to a paired-end HCV sequencing study, with lowest spiked-in true frequency of 0.5%, QQ-SNV(HS-P80) revealed a sensitivity of 100% (vs. 40-60% for the existing methods) and a specificity of 100% (vs. 98.0-99.7% for the existing methods). In addition, QQ-SNV required the least overall computation time to process the test sets.
Finally, when testing on a clinical sample, four putative true variants with frequency below 0.5% were consistently detected by QQ-SNV(HS-P80) from different generations of Illumina sequencers. We developed and successfully evaluated a novel method, called QQ-SNV, for highly efficient single nucleotide variant calling on Illumina deep sequencing virology data.
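    The core idea of QQ-SNV, summarizing the base-quality scores at a candidate position by their quantiles and feeding those features to a logistic model, can be sketched as follows. The quartile features mirror the approach described above, but the coefficients and quality values here are hypothetical placeholders, not the published model:

```python
import math
from statistics import quantiles

def quality_quantile_features(phred_scores, n=4):
    """Summarize per-position base quality scores by their quartiles,
    the feature representation a QQ-SNV-style classifier builds on."""
    return quantiles(phred_scores, n=n)  # [Q1, median, Q3] for n=4

def snv_probability(features, weights, bias):
    """Logistic model: P(true SNV) = sigmoid(w . x + b). The weights are
    illustrative placeholders, not trained QQ-SNV coefficients."""
    z = bias + sum(w * x for w, x in zip(weights, features))
    return 1.0 / (1.0 + math.exp(-z))

high_q = quality_quantile_features([38, 39, 40, 37, 36, 39, 40, 38])
low_q = quality_quantile_features([12, 15, 9, 20, 11, 14, 10, 13])
w, b = (0.05, 0.10, 0.05), -6.0  # hypothetical coefficients
p_high = snv_probability(high_q, w, b)  # consistently high-quality bases
p_low = snv_probability(low_q, w, b)    # low-quality bases, likely errors
```

A position supported by high-quality bases gets a high SNV probability, while one supported only by low-quality bases falls below the calling cutoff.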

  10. An Empirical Comparison of Methods for Equating with Randomly Equivalent Groups of 50 to 400 Test Takers. Research Report. ETS RR-10-05

    ERIC Educational Resources Information Center

    Livingston, Samuel A.; Kim, Sooyeon

    2010-01-01

    A series of resampling studies investigated the accuracy of equating by four different methods in a random groups equating design with samples of 400, 200, 100, and 50 test takers taking each form. Six pairs of forms were constructed. Each pair was constructed by assigning items from an existing test taken by 9,000 or more test takers. The…

  11. Operational methods of HIV testing in emergency departments: a systematic review.

    PubMed

    Haukoos, Jason S; White, Douglas A E; Lyons, Michael S; Hopkins, Emily; Calderon, Yvette; Kalish, Brian; Rothman, Richard E

    2011-07-01

    Casual review of existing literature reveals a multitude of individualized approaches to emergency department (ED) HIV testing. Cataloging the operational options of each approach could assist translation by disseminating existing knowledge, endorsing variability as a means to address testing barriers, and laying a foundation for future work in the area of operational models and outcomes investigation. The objective of this study is to provide a detailed account of the various models and operational constructs that have been described for performing HIV testing in EDs. Systematic review of PUBMED, EMBASE, the Cumulative Index to Nursing and Allied Health Literature (CINAHL), and the Web of Science through February 6, 2009 was performed. Three investigators independently reviewed all potential abstracts and identified all studies that met the following criteria for inclusion: original research, performance of HIV testing in an ED in the United States, description of operational methods, and reporting of specific testing outcomes. Each study was independently assessed and data from each were abstracted with standardized instruments. Summary and pooled descriptive statistics were reported by using recently published nomenclature and definitions for ED HIV testing. The primary search yielded 947 potential studies, of which 25 (3%) were included in the final analysis. Of the 25 included studies, 13 (52%) reported results using nontargeted screening as the only patient selection method. Most programs reported using voluntary, opt-in consent and separate, signed consent forms. A variety of assays and communication methods were used, but relatively limited outcomes data were reported. Currently, limited evidence exists to inform HIV testing practices in EDs. There appears to be recent progression toward the use of rapid assays and nontargeted patient selection methods, with the rate at which reports are published in the peer-reviewed literature increasing. 
Additional research will be required, including controlled clinical trials, more structured program evaluation, and a focus on an expanded profile of outcome measures, to further improve our understanding of which HIV testing methods are most effective in the ED. Copyright © 2011. Published by Mosby, Inc.

  12. Valid statistical inference methods for a case-control study with missing data.

    PubMed

    Tian, Guo-Liang; Zhang, Chi; Jiang, Xuejun

    2018-04-01

    The main objective of this paper is to derive the valid sampling distribution of the observed counts in a case-control study with missing data under the assumption of missing at random by employing the conditional sampling method and the mechanism augmentation method. The proposed sampling distribution, called the case-control sampling distribution, can be used to calculate the standard errors of the maximum likelihood estimates of parameters via the Fisher information matrix and to generate independent samples for constructing small-sample bootstrap confidence intervals. Theoretical comparisons of the new case-control sampling distribution with two existing sampling distributions exhibit a large difference. Simulations are conducted to investigate the influence of the three different sampling distributions on statistical inferences. One finding is that the conclusion by the Wald test for testing independence under the two existing sampling distributions could be completely different (even contradictory) from the Wald test for testing the equality of the success probabilities in control/case groups under the proposed distribution. A real cervical cancer data set is used to illustrate the proposed statistical methods.
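    For concreteness, the Wald test for equality of the success probabilities in control/case groups mentioned above has a simple complete-data form; a standard two-proportion sketch with invented counts, not the paper's missing-data machinery:

```python
import math

def wald_two_proportions(x1, n1, x2, n2):
    """Wald Z statistic for H0: p1 == p2, using the unpooled variance
    estimate; compare |Z| with 1.96 for a 5% two-sided test."""
    p1, p2 = x1 / n1, x2 / n2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return (p1 - p2) / se

# Invented counts: 40/100 exposed among cases vs. 25/100 among controls.
z = wald_two_proportions(40, 100, 25, 100)
```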

  13. A Statistical Approach for Testing Cross-Phenotype Effects of Rare Variants

    PubMed Central

    Broadaway, K. Alaine; Cutler, David J.; Duncan, Richard; Moore, Jacob L.; Ware, Erin B.; Jhun, Min A.; Bielak, Lawrence F.; Zhao, Wei; Smith, Jennifer A.; Peyser, Patricia A.; Kardia, Sharon L.R.; Ghosh, Debashis; Epstein, Michael P.

    2016-01-01

    Increasing empirical evidence suggests that many genetic variants influence multiple distinct phenotypes. When cross-phenotype effects exist, multivariate association methods that consider pleiotropy are often more powerful than univariate methods that model each phenotype separately. Although several statistical approaches exist for testing cross-phenotype effects for common variants, there is a lack of similar tests for gene-based analysis of rare variants. In order to fill this important gap, we introduce a statistical method for cross-phenotype analysis of rare variants using a nonparametric distance-covariance approach that compares similarity in multivariate phenotypes to similarity in rare-variant genotypes across a gene. The approach can accommodate both binary and continuous phenotypes and further can adjust for covariates. Our approach yields a closed-form test whose significance can be evaluated analytically, thereby improving computational efficiency and permitting application on a genome-wide scale. We use simulated data to demonstrate that our method, which we refer to as the Gene Association with Multiple Traits (GAMuT) test, provides increased power over competing approaches. We also illustrate our approach using exome-chip data from the Genetic Epidemiology Network of Arteriopathy. PMID:26942286
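    GAMuT's comparison of phenotype similarity to genotype similarity rests on distance-covariance machinery; the univariate sample version of that underlying quantity can be sketched as below (an illustration of the statistic itself, not the GAMuT test):

```python
def _dist_matrix(v):
    """Pairwise absolute-difference distance matrix for a 1-D sample."""
    return [[abs(a - b) for b in v] for a in v]

def _double_center(d):
    """Subtract row and column means, add the grand mean."""
    n = len(d)
    row = [sum(r) / n for r in d]
    col = [sum(d[i][j] for i in range(n)) / n for j in range(n)]
    tot = sum(row) / n
    return [[d[i][j] - row[i] - col[j] + tot for j in range(n)] for i in range(n)]

def dcov2(x, y):
    """Squared sample distance covariance between 1-D samples x and y;
    in the population it is zero if and only if x and y are independent."""
    A = _double_center(_dist_matrix(x))
    B = _double_center(_dist_matrix(y))
    n = len(x)
    return sum(A[i][j] * B[i][j] for i in range(n) for j in range(n)) / (n * n)

# Nonlinear dependence (y = x^2) is still picked up, unlike plain correlation.
d_dep = dcov2(list(range(10)), [v * v for v in range(10)])
```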

  14. A STANDARDIZED ASSESSMENT METHOD (SAM) FOR RIVERINE MACROINVERTEBRATES

    EPA Science Inventory

    A macroinvertebrate sampling method for large rivers based on desirable characteristics of existing nonwadeable methods was developed and tested. Six sites each were sampled on the Great Miami and Kentucky Rivers, reflecting a human disturbance gradient. Samples were collected ...

  15. The water flea Daphnia magna (Crustacea, Cladocera) as a test species for screening and evaluation of chemicals with endocrine disrupting effects on crustaceans.

    PubMed

    Tatarazako, Norihisa; Oda, Shigeto

    2007-02-01

    The water flea Daphnia magna (Crustacea, Cladocera) is a cyclical parthenogen, which can reproduce both by parthenogenesis and by sexual reproduction. With its ease of handling in the laboratory, several testing methods using D. magna exist for regulatory toxicity testing. Recently, several studies revealed that one of the major hormone groups in insects and crustaceans, the juvenile hormones, is involved in the shift of reproductive mode from parthenogenesis to sexual reproduction (production of male neonates). Using offspring sex ratio as a new endpoint has made it possible to identify chemicals with juvenile hormone-like effects on crustaceans. The testing method using D. magna, in which offspring sex ratio is incorporated as a new endpoint, is now being proposed to the OECD as an enhanced version of the existing OECD Test Guideline 211: Daphnia magna reproduction test. No clear-cut endpoint for identifying juvenile-hormone disrupting effects in crustaceans has ever been found other than the induction of male neonate production in cladocerans. In this regard, testing methods using D. magna are expected to be suitable for screening and risk assessment of chemicals with juvenile-hormone disrupting effects.

  16. Tempest: Mesoscale test case suite results and the effect of order-of-accuracy on pressure gradient force errors

    NASA Astrophysics Data System (ADS)

    Guerra, J. E.; Ullrich, P. A.

    2014-12-01

    Tempest is a new non-hydrostatic atmospheric modeling framework that allows for investigation and intercomparison of high-order numerical methods. It is composed of a dynamical core based on a finite-element formulation of arbitrary order operating on cubed-sphere and Cartesian meshes with topography. The underlying technology is briefly discussed, including a novel Hybrid Finite Element Method (HFEM) vertical coordinate coupled with high-order Implicit/Explicit (IMEX) time integration to control vertically propagating sound waves. Here, we show results from a suite of mesoscale test cases from the literature that demonstrate the accuracy, performance, and properties of Tempest on regular Cartesian meshes. The test cases include wave propagation behavior, Kelvin-Helmholtz instabilities, and flow interaction with topography. Comparisons are made to existing results, highlighting improvements made in resolving atmospheric dynamics in the vertical direction, where many existing methods are deficient.

  17. Partial F-tests with multiply imputed data in the linear regression framework via coefficient of determination.

    PubMed

    Chaurasia, Ashok; Harel, Ofer

    2015-02-10

    Tests for regression coefficients such as global, local, and partial F-tests are common in applied research. In the framework of multiple imputation, there are several papers addressing tests for regression coefficients. However, for simultaneous hypothesis testing, the existing methods are computationally intensive because they involve calculation with vectors and (inversion of) matrices. In this paper, we propose a simple method based on the scalar entity, coefficient of determination, to perform (global, local, and partial) F-tests with multiply imputed data. The proposed method is evaluated using simulated data and applied to suicide prevention data. Copyright © 2014 John Wiley & Sons, Ltd.
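    With complete data, the coefficient-of-determination route to a partial F-test is standard; a sketch of that scalar computation (the multiple-imputation pooling that is the paper's contribution is not shown, and the R² values are invented):

```python
def partial_f_stat(r2_full, r2_reduced, n, k_full, k_dropped):
    """Partial F statistic for testing the k_dropped coefficients excluded
    from the reduced model:
        F = ((R2_full - R2_reduced) / k_dropped)
            / ((1 - R2_full) / (n - k_full - 1)),
    referred to F(k_dropped, n - k_full - 1) under H0."""
    num = (r2_full - r2_reduced) / k_dropped
    den = (1.0 - r2_full) / (n - k_full - 1)
    return num / den

# Does dropping 2 of the 5 predictors significantly hurt the fit?
f = partial_f_stat(r2_full=0.60, r2_reduced=0.55, n=100, k_full=5, k_dropped=2)
```

Because only the scalar R² of each fitted model is needed, no coefficient vectors or covariance matrices have to be stored or inverted, which is the computational appeal noted in the abstract.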

  18. Amplitude tests of direct channel resonances: The dibaryon

    NASA Astrophysics Data System (ADS)

Goldstein, G. R.; Moravcsik, M. J.; Arash, F.

    1985-02-01

    A recently formulated polarization amplitude test for the existence of one-particle-exchange mechanisms is modified to deal with direct-channel resonances. The results are applied to proton-proton elastic scattering at and around 800 MeV to test the suggested existence of a dibaryon resonance. This test is sensitive to somewhat different circumstances and parameters than the methods used in the past to find dibaryon resonances. The evidence, on the basis of the SAID data set, is negative for a resonance in any singlet partial wave, but is tantalizingly subliminal for a 3F3 resonance.

  19. Amplitude tests of direct channel resonances: the dibaryon

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldstein, G.R.; Moravcsik, M.J.; Arash, F.

A recently formulated polarization amplitude test for the existence of one-particle-exchange mechanisms is modified to deal with direct-channel resonances. The results are applied to proton-proton elastic scattering at and around 800 MeV to test the suggested existence of a dibaryon resonance. This test is sensitive to somewhat different circumstances and parameters than the methods used in the past to find dibaryon resonances. The evidence, on the basis of the SAID data set, is negative for a resonance in any singlet partial wave, but is tantalizingly subliminal for a 3F3 resonance. 7 refs., 4 figs.

  20. Determination of the optimal cutoff value for a serological assay: an example using the Johne's Absorbed EIA.

    PubMed Central

    Ridge, S E; Vizard, A L

    1993-01-01

Traditionally, in order to improve diagnostic accuracy, existing tests have been replaced with newly developed diagnostic tests with superior sensitivity and specificity. However, it is also possible to improve an existing test by altering the cutoff value chosen to distinguish infected from uninfected individuals. This paper uses data obtained from an investigation of the operating characteristics of the Johne's Absorbed EIA to demonstrate a method of determining a preferred cutoff value from several potentially useful cutoff settings. Also demonstrated are a method of determining the financial gain from using the preferred rather than the current cutoff value, and a decision-analysis method to assist in determining the optimal cutoff value when critical population parameters are not known with certainty. The results of this study indicate that the currently recommended cutoff value for the Johne's Absorbed EIA is close to optimal only when the disease prevalence is very low and false-positive test results are deemed to be very costly. In other situations, there were considerable financial advantages to using cutoff values calculated to maximize the benefit of testing. The current cutoff values for other diagnostic tests may likewise not be the most appropriate for every testing situation. This paper offers methods for identifying the cutoff value that maximizes the benefit of medical and veterinary diagnostic tests. PMID:8501227

  1. Nonlinear least squares regression for single image scanning electron microscope signal-to-noise ratio estimation.

    PubMed

    Sim, K S; Norhisham, S

    2016-11-01

A new method based on nonlinear least squares regression (NLLSR) is formulated to estimate the signal-to-noise ratio (SNR) of scanning electron microscope (SEM) images. The estimation of SNR values by the NLLSR method is compared with three existing methods: nearest neighbourhood, first-order interpolation, and the combination of nearest neighbourhood and first-order interpolation. Samples of SEM images with different textures, contrasts and edges were used to test the performance of the NLLSR method in estimating the SNR values of the SEM images. It is shown that the NLLSR method produces better estimation accuracy than the other three existing methods. According to the SNR results obtained from the experiment, the NLLSR method yields an SNR error difference of less than approximately 1% compared with the other three existing methods. © 2016 The Authors Journal of Microscopy © 2016 Royal Microscopical Society.

  2. Putting social impact assessment to the test as a method for implementing responsible tourism practice

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCombes, Lucy, E-mail: l.mccombes@leedsbeckett.ac.uk; Vanclay, Frank, E-mail: frank.vanclay@rug.nl; Evers, Yvette, E-mail: y.evers@tft-earth.org

The discourse on the social impacts of tourism needs to shift from the current descriptive critique of tourism to considering what can be done in actual practice to embed the management of tourism's social impacts into the existing planning, product development and operational processes of tourism businesses. A pragmatic approach for designing research methodologies, social management systems and initial actions, shaped by real-world operational constraints and the existing systems used in the tourism industry, is needed. Our pilot study with a small Bulgarian travel company put social impact assessment (SIA) to the test to see if it could provide this desired approach and assist in implementing responsible tourism development practice, especially in small tourism businesses. Our findings showed that our adapted SIA method has value as a practical method for embedding a responsible tourism approach. While there were some challenges, SIA proved to be effective in assisting the staff of our test-case tourism business to better understand their social impacts on their local communities and to identify actions to take. - Highlights: • A pragmatic approach is needed for the responsible management of the social impacts of tourism. • Our adapted social impact assessment (SIA) method has value as a practical method. • SIA can be embedded into tourism businesses' existing ‘ways of doing things’. • We identified challenges and ways to improve our method to better suit the small tourism business context.

  3. A conjugate gradient method with descent properties under strong Wolfe line search

    NASA Astrophysics Data System (ADS)

    Zull, N.; ‘Aini, N.; Shoid, S.; Ghani, N. H. A.; Mohamed, N. S.; Rivaie, M.; Mamat, M.

    2017-09-01

The conjugate gradient (CG) method is one of the optimization methods most often used in practical applications. Continuous and numerous studies of the CG method have led to vast improvements in its convergence properties and efficiency. In this paper, a new CG method possessing the sufficient descent and global convergence properties is proposed. The efficiency of the new CG algorithm relative to existing CG methods is evaluated by testing them all on a set of test functions using MATLAB. The tests are measured in terms of iteration counts and CPU time under the strong Wolfe line search. Overall, the new method performs efficiently and comparably to the other well-known methods.
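    As a sketch of the kind of experiment described, a classical Fletcher–Reeves CG method (one of the standard variants, not the paper's new method) can be run under a strong Wolfe line search. SciPy's `line_search` enforces the strong Wolfe conditions; `c2 < 0.5` is chosen because it keeps the Fletcher–Reeves directions descent directions. The function name and test problem are assumptions for illustration.

```python
import numpy as np
from scipy.optimize import line_search  # enforces the strong Wolfe conditions

def fletcher_reeves(f, grad, x0, tol=1e-8, max_iter=500):
    """Minimal Fletcher-Reeves CG under a strong Wolfe line search."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = line_search(f, grad, x, d, gfk=g, c2=0.4)[0]
        if alpha is None:          # line search failed: restart along -g
            d = -g
            alpha = line_search(f, grad, x, d, gfk=g, c2=0.4)[0] or 1e-8
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves update
        x, g, d = x_new, g_new, -g_new + beta * d
    return x

# Convex quadratic test problem: the minimiser is A^{-1} b = [0.2, 0.4].
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_min = fletcher_reeves(lambda x: 0.5 * x @ A @ x - b @ x,
                        lambda x: A @ x - b, [0.0, 0.0])
```

    Iteration counts and wall-clock time for such a loop are the same performance measures the abstract reports.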

  4. Robust volcano plot: identification of differential metabolites in the presence of outliers.

    PubMed

    Kumar, Nishith; Hoque, Md Aminul; Sugimoto, Masahiro

    2018-04-11

    The identification of differential metabolites in metabolomics is still a big challenge and plays a prominent role in metabolomics data analyses. Metabolomics datasets often contain outliers because of analytical, experimental, and biological ambiguity, but the currently available differential metabolite identification techniques are sensitive to outliers. We propose a kernel weight based outlier-robust volcano plot for identifying differential metabolites from noisy metabolomics datasets. Two numerical experiments are used to evaluate the performance of the proposed technique against nine existing techniques, including the t-test and the Kruskal-Wallis test. Artificially generated data with outliers reveal that the proposed method results in a lower misclassification error rate and a greater area under the receiver operating characteristic curve compared with existing methods. An experimentally measured breast cancer dataset to which outliers were artificially added reveals that our proposed method produces only two non-overlapping differential metabolites whereas the other nine methods produced between seven and 57 non-overlapping differential metabolites. Our data analyses show that the performance of the proposed differential metabolite identification technique is better than that of existing methods. Thus, the proposed method can contribute to analysis of metabolomics data with outliers. The R package and user manual of the proposed method are available at https://github.com/nishithkumarpaul/Rvolcano .
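    For contrast with the robust variant proposed here, the conventional volcano-plot statistics that such methods modify (per-metabolite log2 fold change and a t-test p-value) can be sketched as follows. The function name and data layout are assumptions; this is the outlier-sensitive baseline, not the kernel-weighted method of the paper.

```python
import numpy as np
from scipy import stats

def volcano_stats(group_a, group_b):
    """Per-metabolite log2 fold change and Welch t-test p-value.

    group_a, group_b: arrays of shape (n_samples, n_metabolites).
    """
    a = np.asarray(group_a, dtype=float)
    b = np.asarray(group_b, dtype=float)
    log2_fc = np.log2(b.mean(axis=0) / a.mean(axis=0))
    _, p_values = stats.ttest_ind(a, b, axis=0, equal_var=False)
    return log2_fc, p_values

# Tiny illustrative dataset: metabolite 0 is differential, metabolite 1 is not.
a = np.array([[1.0, 2.0], [1.1, 2.1], [0.9, 1.9], [1.0, 2.0]])
b = np.array([[5.0, 2.0], [5.1, 2.1], [4.9, 1.9], [5.0, 2.0]])
lfc, p = volcano_stats(a, b)
```

    A volcano plot then scatters `lfc` against `-log10(p)`; a single outlying sample can distort both axes, which is the sensitivity the proposed method addresses.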

  5. Creating and Evaluating a Hypertext System of Documenting Analytical Test Methods in a Chemical Plant Quality Assurance Laboratory.

    ERIC Educational Resources Information Center

    White, Charles E., Jr.

    The purpose of this study was to develop and implement a hypertext documentation system in an industrial laboratory and to evaluate its usefulness by participative observation and a questionnaire. Existing word-processing test method documentation was converted directly into a hypertext format or "hyperdocument." The hyperdocument was designed and…

  6. Utilization Bound of Non-preemptive Fixed Priority Schedulers

    NASA Astrophysics Data System (ADS)

    Park, Moonju; Chae, Jinseok

    It is known that the schedulability of a non-preemptive task set with fixed priority can be determined in pseudo-polynomial time. However, since Rate Monotonic scheduling is not optimal for non-preemptive scheduling, the applicability of existing polynomial time tests that provide sufficient schedulability conditions, such as Liu and Layland's bound, is limited. This letter proposes a new sufficient condition for non-preemptive fixed priority scheduling that can be used for any fixed priority assignment scheme. It is also shown that the proposed schedulability test has a tighter utilization bound than existing test methods.
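    The classical Liu and Layland bound referred to above (the sufficient utilization test for preemptive fixed-priority scheduling) can be sketched as follows; the letter's tighter non-preemptive bound is not reproduced here, and the function names are assumptions.

```python
def liu_layland_bound(n):
    """Classic preemptive rate-monotonic utilization bound: n(2^(1/n) - 1)."""
    return n * (2.0 ** (1.0 / n) - 1.0)

def schedulable_by_bound(tasks):
    """Sufficient (not necessary) schedulability test.

    tasks: list of (execution_time, period) pairs.
    """
    utilization = sum(c / t for c, t in tasks)
    return utilization <= liu_layland_bound(len(tasks))

# Three tasks with total utilization 0.55 pass the n=3 bound (about 0.7798).
ok = schedulable_by_bound([(1, 4), (1, 5), (1, 10)])
```

    A task set that fails this test is not necessarily unschedulable; it merely requires an exact (pseudo-polynomial) analysis, which is the trade-off the letter discusses.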

  7. Status and analysis of test standard for on-board charger

    NASA Astrophysics Data System (ADS)

    Hou, Shuai; Liu, Haiming; Jiang, Li; Chen, Xichen; Ma, Junjie; Zhao, Bing; Wu, Zaiyuan

    2018-05-01

This paper analyzes the test standards for on-board chargers (OBC). In the process of testing, we found several problems in test methods and functional status, such as failure to follow up the latest test standards, loose estimation, and uncertainty and inconsistency in rectification. Finally, we put forward our own viewpoints on these problems.

  8. Transport Test Problems for Hybrid Methods Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shaver, Mark W.; Miller, Erin A.; Wittman, Richard S.

    2011-12-28

    This report presents 9 test problems to guide testing and development of hybrid calculations for the ADVANTG code at ORNL. These test cases can be used for comparing different types of radiation transport calculations, as well as for guiding the development of variance reduction methods. Cases are drawn primarily from existing or previous calculations with a preference for cases which include experimental data, or otherwise have results with a high level of confidence, are non-sensitive, and represent problem sets of interest to NA-22.

  9. Effects of Alternate Test Formats in Online Courses

    ERIC Educational Resources Information Center

    Francis, Alan

    2010-01-01

    The purpose of this study was to compare differences in methods of testing for two undergraduate online courses to determine the effect of alternate test formats in relation to participant grades. Specific purposes of this study were to determine whether a difference existed in student test scores between the control and treatment groups and…

  10. Analysis of Weibull Grading Test for Solid Tantalum Capacitors

    NASA Technical Reports Server (NTRS)

    Teverovsky, Alexander

    2010-01-01

The Weibull grading test is a powerful technique that allows selection and reliability rating of solid tantalum capacitors for military and space applications. However, inaccuracies in the existing method and inadequate acceleration factors can result in significant errors, up to three orders of magnitude, in the calculated failure rate of capacitors. This paper analyzes deficiencies of the existing technique and recommends a more accurate method of calculation. A physical model presenting failures of tantalum capacitors as time-dependent dielectric breakdown is used to determine voltage and temperature acceleration factors and to select adequate Weibull grading test conditions. This model is verified by highly accelerated life testing (HALT) at different temperature and voltage conditions for three types of solid chip tantalum capacitors. It is shown that parameters of the model and acceleration factors can be calculated using a general log-linear relationship for the characteristic life with two stress levels.
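    The general log-linear life-stress relationship mentioned in the abstract can be sketched as follows. The form ln(eta) = a0 + a1·V + a2/T is one common choice for combined voltage and temperature stress; every coefficient value below is illustrative, not a fitted parameter from the paper.

```python
import math

def characteristic_life(voltage, temp_kelvin, a0, a1, a2):
    """General log-linear life-stress model: ln(eta) = a0 + a1*V + a2/T."""
    return math.exp(a0 + a1 * voltage + a2 / temp_kelvin)

def acceleration_factor(v_use, t_use, v_test, t_test, a0, a1, a2):
    """How much longer the characteristic life is at use vs. test conditions."""
    return (characteristic_life(v_use, t_use, a0, a1, a2)
            / characteristic_life(v_test, t_test, a0, a1, a2))

# Illustrative coefficients only: life shortens with voltage (a1 < 0)
# and with temperature (a2 > 0, Arrhenius-like 1/T term).
af = acceleration_factor(v_use=5.0, t_use=318.0, v_test=10.0, t_test=358.0,
                         a0=0.0, a1=-0.5, a2=5000.0)
```

    With two stress levels tested, as the abstract notes, the coefficients a1 and a2 can be solved for directly and the factor extrapolated to use conditions.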

  11. Method of preliminary localization of the iris in biometric access control systems

    NASA Astrophysics Data System (ADS)

    Minacova, N.; Petrov, I.

    2015-10-01

This paper presents a method of preliminary localization of the iris based on stable brightness features of the iris in images of the eye. In tests on eye images from publicly available databases, the method showed good accuracy and speed compared to existing preliminary localization methods.

  12. Methodical aspects of text testing in a driving simulator.

    PubMed

    Sundin, A; Patten, C J D; Bergmark, M; Hedberg, A; Iraeus, I-M; Pettersson, I

    2012-01-01

A test with 30 participants was conducted in a driving simulator. The test was a concept exploration and comparison of existing user-interaction technologies for text message handling, with a focus on traffic safety and experience (technology familiarity and learning effects). Emphasis was placed on methodical aspects of how to measure and how to analyze the data. Results show difficulties both with the eye-tracking system itself (calibration, etc.) and with the subsequent raw-data preparation. The physical setup in the car was found to be important for test completion.

  13. Polarization-based and specular-reflection-based noncontact latent fingerprint imaging and lifting

    NASA Astrophysics Data System (ADS)

    Lin, Shih-Schön; Yemelyanov, Konstantin M.; Pugh, Edward N., Jr.; Engheta, Nader

    2006-09-01

    In forensic science the finger marks left unintentionally by people at a crime scene are referred to as latent fingerprints. Most existing techniques to detect and lift latent fingerprints require application of a certain material directly onto the exhibit. The chemical and physical processing applied to the fingerprint potentially degrades or prevents further forensic testing on the same evidence sample. Many existing methods also have deleterious side effects. We introduce a method to detect and extract latent fingerprint images without applying any powder or chemicals on the object. Our method is based on the optical phenomena of polarization and specular reflection together with the physiology of fingerprint formation. The recovered image quality is comparable to existing methods. In some cases, such as the sticky side of tape, our method shows unique advantages.

  14. Novel scheme to compute chemical potentials of chain molecules on a lattice

    NASA Astrophysics Data System (ADS)

    Mooij, G. C. A. M.; Frenkel, D.

    We present a novel method that allows efficient computation of the total number of allowed conformations of a chain molecule in a dense phase. Using this method, it is possible to estimate the chemical potential of such a chain molecule. We have tested the present method in simulations of a two-dimensional monolayer of chain molecules on a lattice (Whittington-Chapman model) and compared it with existing schemes to compute the chemical potential. We find that the present approach is two to three orders of magnitude faster than the most efficient of the existing methods.

  15. Implementation of an anonymisation tool for clinical trials using a clinical trial processor integrated with an existing trial patient data information system.

    PubMed

    Aryanto, Kadek Y E; Broekema, André; Oudkerk, Matthijs; van Ooijen, Peter M A

    2012-01-01

    To present an adapted Clinical Trial Processor (CTP) test set-up for receiving, anonymising and saving Digital Imaging and Communications in Medicine (DICOM) data using external input from the original database of an existing clinical study information system to guide the anonymisation process. Two methods are presented for an adapted CTP test set-up. In the first method, images are pushed from the Picture Archiving and Communication System (PACS) using the DICOM protocol through a local network. In the second method, images are transferred through the internet using the HTTPS protocol. In total 25,000 images from 50 patients were moved from the PACS, anonymised and stored within roughly 2 h using the first method. In the second method, an average of 10 images per minute were transferred and processed over a residential connection. In both methods, no duplicated images were stored when previous images were retransferred. The anonymised images are stored in appropriate directories. The CTP can transfer and process DICOM images correctly in a very easy set-up providing a fast, secure and stable environment. The adapted CTP allows easy integration into an environment in which patient data are already included in an existing information system.

  16. Method of Testing Oxygen Regulators

    NASA Technical Reports Server (NTRS)

    Sontag, Harcourt; Borlik, E L

    1935-01-01

    Oxygen regulators are used in aircraft to regulate automatically the flow of oxygen to the pilot from a cylinder at pressures ranging up to 150 atmospheres. The instruments are adjusted to open at an altitude of about 15,000 ft. and thereafter to deliver oxygen at a rate which increases with the altitude. The instruments are tested to determine the rate of flow of oxygen delivered at various altitudes and to detect any mechanical defects which may exist. A method of testing oxygen regulators was desired in which the rate of flow could be determined more accurately than by the test method previously used (reference 1) and by which instruments defective mechanically could be detected. The new method of test fulfills these requirements.

  17. A robust method using propensity score stratification for correcting verification bias for binary tests

    PubMed Central

    He, Hua; McDermott, Michael P.

    2012-01-01

    Sensitivity and specificity are common measures of the accuracy of a diagnostic test. The usual estimators of these quantities are unbiased if data on the diagnostic test result and the true disease status are obtained from all subjects in an appropriately selected sample. In some studies, verification of the true disease status is performed only for a subset of subjects, possibly depending on the result of the diagnostic test and other characteristics of the subjects. Estimators of sensitivity and specificity based on this subset of subjects are typically biased; this is known as verification bias. Methods have been proposed to correct verification bias under the assumption that the missing data on disease status are missing at random (MAR), that is, the probability of missingness depends on the true (missing) disease status only through the test result and observed covariate information. When some of the covariates are continuous, or the number of covariates is relatively large, the existing methods require parametric models for the probability of disease or the probability of verification (given the test result and covariates), and hence are subject to model misspecification. We propose a new method for correcting verification bias based on the propensity score, defined as the predicted probability of verification given the test result and observed covariates. This is estimated separately for those with positive and negative test results. The new method classifies the verified sample into several subsamples that have homogeneous propensity scores and allows correction for verification bias. Simulation studies demonstrate that the new estimators are more robust to model misspecification than existing methods, but still perform well when the models for the probability of disease and probability of verification are correctly specified. PMID:21856650
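    A generic stratified correction in the spirit described (estimate disease probability within each propensity stratum among verified subjects, then scale up by the full sample's cell counts) might look like the following. This is a sketch of stratum-wise Begg–Greenes-type weighting under the MAR assumption, with an assumed function name; it is not the authors' exact estimator.

```python
import numpy as np

def corrected_sensitivity(test, stratum, verified, disease):
    """Verification-bias-corrected sensitivity by propensity-score strata.

    Within each (test result, stratum) cell, disease prevalence is estimated
    from verified subjects only, then scaled up by the cell's full count.
    `disease` is ignored wherever verified == 0.
    """
    test = np.asarray(test)
    stratum = np.asarray(stratum)
    verified = np.asarray(verified).astype(bool)
    disease = np.asarray(disease, dtype=float)

    diseased_pos = diseased_total = 0.0
    for s in np.unique(stratum):
        for t in (0, 1):
            cell = (stratum == s) & (test == t)
            ver = cell & verified
            if not ver.any():
                continue                       # cell contributes nothing
            p_disease = disease[ver].mean()    # P(D=1 | T=t, stratum=s)
            expected = cell.sum() * p_disease  # expected diseased in full cell
            diseased_total += expected
            if t == 1:
                diseased_pos += expected
    return diseased_pos / diseased_total
```

    Estimating sensitivity from the verified subjects alone would ignore the unverified test-negatives and typically overstate sensitivity; the stratified reweighting restores the full-sample denominator.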

  18. Corrosion performance tests for reinforcing steel in concrete : technical report.

    DOT National Transportation Integrated Search

    2009-10-01

    The existing test method used to assess the corrosion performance of reinforcing steel embedded in : concrete, mainly ASTM G 109, is labor intensive, time consuming, slow to provide comparative results, : and can be expensive. However, with corrosion...

  19. Evaluation of the base/subgrade soil under repeated loading : phase II, in-box and ALF cyclic plate load tests [tech summary].

    DOT National Transportation Integrated Search

    2012-03-01

    The inadequacy of many existing roads due to rapid growth in traffic volume provides a motivation for exploring alternatives to : existing methods of constructing and rehabilitating roads. The use of geosynthetics to stabilize and reinforce paved and...

  20. Diagnostic Utility of a Clonality Test for Lymphoproliferative Diseases in Koreans Using the BIOMED-2 PCR Assay

    PubMed Central

    Kim, Young; Choi, Yoo Duk; Choi, Chan

    2013-01-01

    Background A clonality test for immunoglobulin (IG) and T cell receptor (TCR) is a useful adjunctive method for the diagnosis of lymphoproliferative diseases (LPDs). Recently, the BIOMED-2 multiplex polymerase chain reaction (PCR) assay has been established as a standard method for assessing the clonality of LPDs. We tested clonality in LPDs in Koreans using the BIOMED-2 multiplex PCR and compared the results with those obtained in European, Taiwanese, and Thai participants. We also evaluated the usefulness of the test as an ancillary method for diagnosing LPDs. Methods Two hundred and nineteen specimens embedded in paraffin, including 78 B cell lymphomas, 80 T cell lymphomas and 61 cases of reactive lymphadenitis, were used for the clonality test. Results Mature B cell malignancies showed 95.7% clonality for IG, 2.9% co-existing clonality, and 4.3% polyclonality. Mature T cell malignancies exhibited 83.8% clonality for TCR, 8.1% co-existing clonality, and 16.2% polyclonality. Reactive lymphadenitis showed 93.4% polyclonality for IG and TCR. The majority of our results were similar to those obtained in Europeans. However, the clonality for IGK of B cell malignancies and TCRG of T cell malignancies was lower in Koreans than Europeans. Conclusions The BIOMED-2 multiplex PCR assay was a useful adjunctive method for diagnosing LPDs. PMID:24255634

  1. Section Preequating under the Equivalent Groups Design without IRT

    ERIC Educational Resources Information Center

    Guo, Hongwen; Puhan, Gautam

    2014-01-01

    In this article, we introduce a section preequating (SPE) method (linear and nonlinear) under the randomly equivalent groups design. In this equating design, sections of Test X (a future new form) and another existing Test Y (an old form already on scale) are administered. The sections of Test X are equated to Test Y, after adjusting for the…

  2. Quantitative PCR detection of Batrachochytrium dendrobatidis DNA from sediments and water

    USGS Publications Warehouse

    Kirshtein, Julie D.; Anderson, Chauncey W.; Wood, J.S.; Longcore, Joyce E.; Voytek, Mary A.

    2007-01-01

The fungal pathogen Batrachochytrium dendrobatidis (Bd) causes chytridiomycosis, a disease implicated in amphibian declines on 5 continents. Polymerase chain reaction (PCR) primer sets exist with which amphibians can be tested for this disease, and advances in sampling techniques allow non-invasive testing of animals. We developed filtering and PCR based quantitative methods by modifying existing PCR assays to detect Bd DNA in water and sediments, without the need for testing amphibians; we tested the methods at 4 field sites. The SYBR based assay using Boyle primers (SYBR/Boyle assay) and the Taqman based assay using Wood primers performed similarly with samples generated in the laboratory (Bd spiked filters), but the SYBR/Boyle assay detected Bd DNA in more field samples. We detected Bd DNA in water from 3 of 4 sites tested, including one pond historically negative for chytridiomycosis. Zoospore equivalents in sampled water ranged from 19 to 454 l⁻¹ (nominal detection limit is 10 DNA copies, or about 0.06 zoospore). We did not detect DNA of Bd from sediments collected at any sites. Our filtering and amplification methods provide a new tool to investigate critical aspects of Bd in the environment. © Inter-Research 2007.
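    Quantitative PCR assays of this kind are typically read off a standard curve relating cycle threshold (Ct) to log10 copy number fitted from a dilution series. The sketch below is generic: the function names and the idealised dilution values are assumptions, not data from this study.

```python
import numpy as np

def fit_standard_curve(log10_copies, ct):
    """Least-squares line Ct = slope * log10(copies) + intercept."""
    slope, intercept = np.polyfit(log10_copies, ct, 1)
    return slope, intercept

def copies_from_ct(ct, slope, intercept):
    """Invert the standard curve for an unknown sample."""
    return 10.0 ** ((ct - intercept) / slope)

# Idealised ten-fold dilution series; a slope near -3.32 corresponds to
# roughly 100% amplification efficiency.
standards = np.array([1.0, 2.0, 3.0, 4.0, 5.0])  # log10 copies
cts = -3.32 * standards + 38.0                   # synthetic Ct values
slope, intercept = fit_standard_curve(standards, cts)
```

    Estimated copies per reaction are then scaled by filtered volume to report zoospore equivalents per litre, as in the abstract.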

  3. Alternative methods of flexible base compaction acceptance.

    DOT National Transportation Integrated Search

    2012-05-01

    In the Texas Department of Transportation, flexible base construction is governed by a series of stockpile : and field tests. A series of concerns with these existing methods, along with some premature failures in the : field, led to this project inv...

4. Compare diagnostic tests using transformation-invariant smoothed ROC curves

    PubMed Central

    Tang, Liansheng; Du, Pang; Wu, Chengqing

    2012-01-01

The receiver operating characteristic (ROC) curve, which plots true positive rates against false positive rates as the threshold varies, is an important tool for evaluating biomarkers in diagnostic medicine studies. By definition, the ROC curve is monotone increasing from 0 to 1 and is invariant to any monotone transformation of test results. It is also often a curve with a certain level of smoothness when test results from the diseased and non-diseased subjects follow continuous distributions. Most existing ROC curve estimation methods do not guarantee all of these properties. One of the exceptions is Du and Tang (2009), which applies a monotone spline regression procedure to empirical ROC estimates. However, their method does not account for the inherent correlations between empirical ROC estimates, which makes the derivation of the asymptotic properties very difficult. In this paper we propose a penalized weighted least squares estimation method, which incorporates the covariance between empirical ROC estimates as a weight matrix. The resulting estimator satisfies all the aforementioned properties, and we show that it is also consistent. A resampling approach is then used to extend our method to comparisons of two or more diagnostic tests. Our simulations show a significantly improved performance over the existing method, especially for steep ROC curves. We then apply the proposed method to a cancer diagnostic study that compares several newly developed diagnostic biomarkers to a traditional one. PMID:22639484
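    The invariance property cited above (the empirical ROC is unchanged by any monotone transformation of the test results) is easy to verify with a small sketch. The function below is a generic empirical ROC with an assumed name, not the paper's penalized estimator.

```python
import numpy as np

def empirical_roc(neg, pos):
    """Empirical ROC: (FPR, TPR) pairs as the threshold sweeps all observed values."""
    neg, pos = np.asarray(neg, float), np.asarray(pos, float)
    thresholds = np.unique(np.concatenate([neg, pos]))
    fpr = np.array([(neg >= c).mean() for c in thresholds])
    tpr = np.array([(pos >= c).mean() for c in thresholds])
    return fpr, tpr

neg = np.array([1.0, 2.0, 3.0])   # non-diseased test results
pos = np.array([2.0, 3.0, 4.0])   # diseased test results
fpr1, tpr1 = empirical_roc(neg, pos)
fpr2, tpr2 = empirical_roc(np.exp(neg), np.exp(pos))  # monotone transform
# The two curves coincide point for point.
```

    Smoothing such step-function estimates while preserving monotonicity and this invariance is exactly what the proposed estimator targets.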

  5. The Psychology Experiment Building Language (PEBL) and PEBL Test Battery

    PubMed Central

    Mueller, Shane T.; Piper, Brian J.

    2014-01-01

    Background We briefly describe the Psychology Experiment Building Language (PEBL), an open source software system for designing and running psychological experiments. New Method We describe the PEBL test battery, a set of approximately 70 behavioral tests which can be freely used, shared, and modified. Included is a comprehensive set of past research upon which tests in the battery are based. Results We report the results of benchmark tests that establish the timing precision of PEBL. Comparison with Existing Method We consider alternatives to the PEBL system and battery tests. Conclusions We conclude with a discussion of the ethical factors involved in the open source testing movement. PMID:24269254

  6. An investigation of new methods for estimating parameter sensitivities

    NASA Technical Reports Server (NTRS)

    Beltracchi, Todd J.; Gabriele, Gary A.

    1989-01-01

The method proposed for estimating sensitivity derivatives is based on the Recursive Quadratic Programming (RQP) method, used in conjunction with a differencing formula to produce estimates of the sensitivities. This method is compared to existing methods and is shown to be very competitive in terms of the number of function evaluations required. In terms of accuracy, the method is shown to be equivalent to a modified version of the Kuhn-Tucker method, where the Hessian of the Lagrangian is estimated using the BFS method employed by the RQP algorithm. Initial testing on a test set with known sensitivities demonstrates that the method can accurately calculate the parameter sensitivities.

  7. [Application of Stata software to test heterogeneity in meta-analysis method].

    PubMed

    Wang, Dan; Mou, Zhen-yun; Zhai, Jun-xia; Zong, Hong-xia; Zhao, Xiao-dong

    2008-07-01

To introduce the application of Stata software to heterogeneity testing in meta-analysis. A data set was constructed according to the example in the study, and the corresponding commands in Stata 9 were applied to test the example. The methods used were the Q-test and I2 statistic attached to the fixed-effect-model forest plot, the H statistic, and the Galbraith plot. The existence of heterogeneity among studies can be detected by the Q-test and H statistic, and the degree of heterogeneity can be detected by the I2 statistic. The outliers that are the sources of the heterogeneity can be spotted from the Galbraith plot. Heterogeneity testing in meta-analysis can be completed simply and quickly with the four methods in Stata. The H and I2 statistics are more robust, and among the four methods the Galbraith plot shows the heterogeneity outliers most clearly.
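    The statistics behind the Stata commands described (Cochran's Q, the H statistic, and I²) can be computed directly under a fixed-effect model; a minimal sketch with an assumed function name:

```python
import numpy as np

def heterogeneity(effects, variances):
    """Cochran's Q, the H statistic, and I^2 under a fixed-effect model."""
    effects = np.asarray(effects, dtype=float)
    weights = 1.0 / np.asarray(variances, dtype=float)    # inverse-variance
    pooled = np.sum(weights * effects) / np.sum(weights)  # fixed-effect estimate
    q = np.sum(weights * (effects - pooled) ** 2)
    df = len(effects) - 1
    h = np.sqrt(q / df)
    i2 = max(0.0, (q - df) / q) if q > 0 else 0.0
    return q, h, i2
```

    Q is compared with a chi-squared distribution on k-1 degrees of freedom to test for heterogeneity, while I² expresses the proportion of total variation attributable to between-study heterogeneity.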

  8. Non-destructive inspection of polymer composite products

    NASA Astrophysics Data System (ADS)

    Anoshkin, A. N.; Sal'nikov, A. F.; Osokin, V. M.; Tretyakov, A. A.; Luzin, G. S.; Potrakhov, N. N.; Bessonov, V. B.

    2018-02-01

    The paper considers the main types of defects encountered in products made of polymer composite materials for aviation use. The analysis of existing methods of nondestructive testing is carried out, features of their application are considered taking into account design features, geometrical parameters and internal structure of objects of inspection. The advantages and disadvantages of the considered methods of nondestructive testing used in industrial production are shown.

  9. Sizing up arthropod genomes: an evaluation of the impact of environmental variation on genome size estimates by flow cytometry and the use of qPCR as a method of estimation.

    PubMed

    Gregory, T Ryan; Nathwani, Paula; Bonnett, Tiffany R; Huber, Dezene P W

    2013-09-01

A study was undertaken to evaluate both a pre-existing method and a newly proposed approach for the estimation of nuclear genome sizes in arthropods. First, concerns regarding the reliability of the well-established method of flow cytometry relating to impacts of rearing conditions on genome size estimates were examined. Contrary to previous reports, a more carefully controlled test found negligible environmental effects on genome size estimates in the fly Drosophila melanogaster. Second, a more recently touted method based on quantitative real-time PCR (qPCR) was examined in terms of ease of use, efficiency, and (most importantly) accuracy using four test species: the flies Drosophila melanogaster and Musca domestica and the beetles Tribolium castaneum and Dendroctonus ponderosae. The results of this analysis demonstrated that qPCR has the tendency to produce substantially different genome size estimates from other established techniques while also being far less efficient than existing methods.

  10. Prediction of protein structural classes by recurrence quantification analysis based on chaos game representation.

    PubMed

    Yang, Jian-Yi; Peng, Zhen-Ling; Yu, Zu-Guo; Zhang, Rui-Jie; Anh, Vo; Wang, Desheng

    2009-04-21

In this paper, we predict protein structural classes (alpha, beta, alpha+beta, or alpha/beta) for low-homology data sets. Two widely used data sets were employed: 1189 (containing 1092 proteins) and 25PDB (containing 1673 proteins), with sequence homology of 40% and 25%, respectively. We propose to decompose the chaos game representation of proteins into two kinds of time series. Then a novel and powerful nonlinear analysis technique, recurrence quantification analysis (RQA), is applied to these time series. For a given protein sequence, a total of 16 characteristic parameters can be calculated with RQA, and these are treated as the feature representation of the protein sequence. Based on this feature representation, the structural class of each protein is predicted with Fisher's linear discriminant algorithm. The jackknife test is used to evaluate and compare our method with other existing methods. The overall accuracies with the step-by-step procedure are 65.8% and 64.2% for the 1189 and 25PDB data sets, respectively. With the widely used one-against-others procedure, we compare our method with five other existing methods; notably, the overall accuracies of our method are 6.3% and 4.1% higher for the two data sets, respectively. Furthermore, our method uses only 16 parameters, fewer than the other methods. This suggests that the current method may play a complementary role to existing methods and is promising for the prediction of protein structural classes.
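
As a flavour of what RQA computes, the most basic of its characteristic parameters is the recurrence rate: the fraction of point pairs in a time series that fall within a threshold distance of each other. A minimal sketch (the function name and threshold are illustrative; the paper's full set of 16 parameters is not reproduced here):

```python
# Recurrence rate, the simplest RQA parameter: the fraction of point
# pairs (i, j) in a time series whose distance is at most eps.

def recurrence_rate(series, eps):
    n = len(series)
    recurrences = sum(
        1
        for i in range(n)
        for j in range(n)
        if abs(series[i] - series[j]) <= eps
    )
    return recurrences / (n * n)
```

A constant series recurs everywhere (rate 1.0), while a strictly increasing series with large steps recurs only on the diagonal.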

  11. Paradigms of Evaluation in Natural Language Processing: Field Linguistics for Glass Box Testing

    ERIC Educational Resources Information Center

    Cohen, Kevin Bretonnel

    2010-01-01

    Although software testing has been well-studied in computer science, it has received little attention in natural language processing. Nonetheless, a fully developed methodology for glass box evaluation and testing of language processing applications already exists in the field methods of descriptive linguistics. This work lays out a number of…

  12. A Method to Examine Content Domain Structures

    ERIC Educational Resources Information Center

    D'Agostino, Jerome; Karpinski, Aryn; Welsh, Megan

    2011-01-01

    After a test is developed, most content validation analyses shift from ascertaining domain definition to studying domain representation and relevance because the domain is assumed to be set once a test exists. We present an approach that allows for the examination of alternative domain structures based on extant test items. In our example based on…

  13. How Magnetic Disturbance Influences the Attitude and Heading in Magnetic and Inertial Sensor-Based Orientation Estimation.

    PubMed

    Fan, Bingfei; Li, Qingguo; Liu, Tao

    2017-12-28

With the advancements in micro-electromechanical systems (MEMS) technologies, magnetic and inertial sensors are becoming more accurate, lightweight, smaller in size, and lower in cost, which in turn boosts their application in human movement analysis. However, challenges still exist in sensor orientation estimation, where magnetic disturbance represents one of the obstacles limiting practical application. The objective of this paper is to systematically analyze exactly how magnetic disturbance affects the attitude and heading estimation of a magnetic and inertial sensor. First, we review the four major components involved in handling magnetic disturbance, namely decoupling attitude estimation from the magnetometer reading, gyro bias estimation, adaptive strategies for compensating magnetic disturbance, and sensor fusion algorithms, and we analyze the features of existing methods for each component. Second, to understand the role of each component in magnetic disturbance rejection, four representative sensor fusion methods were implemented: a gradient descent algorithm, an improved explicit complementary filter, a dual-linear Kalman filter, and an extended Kalman filter. Finally, a new standardized testing procedure was developed to objectively assess the performance of each method against magnetic disturbance. Based upon the testing results, the strengths and weaknesses of the existing sensor fusion methods were easily examined, and suggestions were presented for selecting a proper sensor fusion algorithm or developing a new one.
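
To give a sense of what the compared filters do, a one-axis complementary filter is the simplest member of the family: it blends the short-term-accurate integrated gyro rate with the drift-free but noisy accelerometer tilt angle. This sketch is a simplified stand-in, not one of the four implemented methods; the gain `ALPHA` and signal names are assumptions:

```python
# Minimal one-axis complementary filter: high-pass the integrated gyro
# signal (accurate over short horizons, drifts over long ones) and
# low-pass the accelerometer tilt angle (noisy but drift-free).

ALPHA = 0.98  # blending gain: weight given to the integrated gyro path

def complementary_filter(angle, gyro_rate, accel_angle, dt):
    """Return the fused tilt estimate (degrees) after one time step."""
    return ALPHA * (angle + gyro_rate * dt) + (1.0 - ALPHA) * accel_angle
```

With a stationary sensor (zero gyro rate), repeated updates converge to the accelerometer angle, illustrating how the accelerometer anchors the long-term estimate.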

  14. Verification of responses of Japanese medaka (Oryzias latipes) to anti-androgens, vinclozolin and flutamide, in short-term assays.

    PubMed

    Nakamura, Ataru; Takanobu, Hitomi; Tamura, Ikumi; Yamamuro, Masumi; Iguchi, Taisen; Tatarazako, Norihisa

    2014-05-01

Various testing methods for detecting the endocrine-disrupting activities of chemicals have been developed in freshwater fish species. However, few relatively simple methods specific to detecting anti-androgenic activity are available for fish. The aim of this study was to verify the papillary process in Japanese medaka (Oryzias latipes) as an indicator of the anti-androgenic activity of chemicals. Japanese medaka were exposed to two types of anti-androgenic compounds, vinclozolin and flutamide, in two short-term assays; one conformed to the existing short-term reproduction assay using adult fish (adult test), and the other was based on the same methods but used juvenile fish at the beginning of exposure (juvenile test). Significant decreases in male papillary processes were observed in the juvenile test at the highest concentration of both anti-androgens (640 µg l(-1) vinclozolin and 1000 µg l(-1) flutamide); however, no significant effects were observed in the adult test. Consequently, our results indicate that papillary processes in Japanese medaka can be used as the end-point for screening the anti-androgenic activity of chemicals, using juvenile fish for a specific period based on the existing short-term reproduction assay. Copyright © 2013 John Wiley & Sons, Ltd.

  15. Quantification of Shape, Angularity, and Surface texture of Base Course Materials

    DOT National Transportation Integrated Search

    1998-01-01

    A state-of-the-art review was conducted to determine existing test methods for characterizing the shape, angularity, and surface texture of coarse aggregates. The review found direct methods used by geologists to determine these characteristics. Thes...

  16. A uniformly valid approximation algorithm for nonlinear ordinary singular perturbation problems with boundary layer solutions.

    PubMed

    Cengizci, Süleyman; Atay, Mehmet Tarık; Eryılmaz, Aytekin

    2016-01-01

This paper is concerned with two-point boundary value problems for singularly perturbed nonlinear ordinary differential equations. The case in which the solution has only one boundary layer is examined. An efficient method, the so-called Successive Complementary Expansion Method (SCEM), is used to obtain uniformly valid approximations to solutions of this kind. Four test problems are considered to check the efficiency and accuracy of the method. The numerical results are in good agreement with exact and existing solutions in the literature, and they confirm that SCEM is superior to other existing methods in terms of ease of application and effectiveness.

  17. Review on pen-and-paper-based observational methods for assessing ergonomic risk factors of computer work.

    PubMed

    Rahman, Mohd Nasrull Abdol; Mohamad, Siti Shafika

    2017-01-01

Computer work is associated with musculoskeletal disorders (MSDs), and several methods have been developed to assess the risk factors of computer work related to MSDs. This review aims to give an overview of the techniques currently available for pen-and-paper-based observational assessment of the ergonomic risk factors of computer work. We searched an electronic database for materials from 1992 to 2015. The selected methods focused on computer work, pen-and-paper observational methods, office risk factors and musculoskeletal disorders. The review assessed the risk factors covered, and the reliability and validity, of pen-and-paper observational methods associated with computer work. Two evaluators independently carried out this review. Seven observational methods used to assess exposure to office risk factors for work-related musculoskeletal disorders were identified. The risk factors covered by current pen-and-paper-based observational tools were posture, office components, force and repetition. Of the seven methods, only five had been tested for reliability; these were proven reliable and were rated as moderate to good. For validity, only four of the seven methods had been tested, with moderate results. Many observational tools already exist, but no single tool appears to cover all of the risk factors, including working posture, office components, force, repetition and the office environment, at office workstations and computer work. Although proper validation of exposure assessment techniques is the most important factor in developing a tool, some existing observational methods have not been tested for reliability and validity. Furthermore, this review could show researchers how to improve pen-and-paper-based observational methods for assessing the ergonomic risk factors of computer work.

  18. An examination of the challenges influencing science instruction in Florida elementary classrooms

    NASA Astrophysics Data System (ADS)

    North, Stephanie Gwinn

It has been shown that the mechanical properties of thin films tend to differ from those of their bulk counterparts. Specifically, bulge and microtensile testing of thin films used in MEMS has revealed that these films exhibit an inverse relationship between thickness and strength. A film dimension is not a material property, but it evidently does affect the mechanical performance of materials at very small thicknesses. A hypothetical explanation for this phenomenon is that as the thickness of the film decreases, it becomes statistically less likely that imperfections exist in the material. A very small thickness (or volume) is required to limit imperfections in a material, which is why this phenomenon is seen in films with thicknesses on the order of 100 nm to a few microns. Another hypothesized explanation is that the surface tension that exists in bulk material also exists in thin films but has a greater impact at such a small scale. The goal of this research is to identify a theoretical prediction of the strength of thin films based on their microstructural properties, such as grain size and film thickness. This would minimize the need for expensive and complicated tests such as the bulge and microtensile tests. In this research, data were collected from the bulge and microtensile testing of copper, aluminum, gold, and polysilicon free-standing thin films. Statistical testing of these data revealed a definitive inverse relationship between thickness and strength, as well as between grain size and strength, as expected. However, due to the lack of a standardized method for either test, there were significant variations in the data. This research compares and analyzes the methods used by other researchers to develop a suggested set of instructions for a standardized bulge test and a standardized microtensile test. The most important parameters to be controlled in each test were found to be strain rate, temperature, film deposition method, film length, and strain measurement.

  19. Simple algorithms for remote determination of mineral abundances and particle sizes from reflectance spectra

    NASA Technical Reports Server (NTRS)

    Johnson, Paul E.; Smith, Milton O.; Adams, John B.

    1992-01-01

Algorithms were developed, based on Hapke's (1981) equations, for remote determination of mineral abundances and particle sizes from reflectance spectra. In this method, spectra are modeled as a function of end-member abundances and illumination/viewing geometry. The method was tested on a laboratory data set. It is emphasized that, although more sophisticated models exist, the present algorithms are particularly suited to remotely sensed data, where little opportunity exists to independently measure reflectance versus particle size and phase function.
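
In the simplest (linear) approximation, the abundance-estimation step reduces to least-squares unmixing of the measured spectrum against end-member spectra. The sketch below uses that linear reflectance model with hypothetical spectra; Hapke-style models instead mix in single-scattering-albedo space and account for the illumination/viewing geometry:

```python
import numpy as np

# Linear spectral unmixing sketch: model a measured spectrum as a
# weighted sum of end-member spectra and recover the weights by least
# squares. End-member spectra and abundances below are hypothetical.

endmembers = np.array([
    [0.20, 0.35, 0.50, 0.60],   # end member A, reflectance per band
    [0.55, 0.45, 0.30, 0.25],   # end member B, reflectance per band
]).T                            # shape: (bands, end members)

true_abundances = np.array([0.3, 0.7])
mixed = endmembers @ true_abundances  # synthetic "measured" spectrum

# Solve for the abundances that best reproduce the measured spectrum.
recovered, *_ = np.linalg.lstsq(endmembers, mixed, rcond=None)
print(recovered)  # ≈ [0.3, 0.7]
```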

  20. From empirical data to time-inhomogeneous continuous Markov processes.

    PubMed

    Lencastre, Pedro; Raischel, Frank; Rogers, Tim; Lind, Pedro G

    2016-03-01

We present an approach for testing for the existence of continuous generators of discrete stochastic transition matrices. Typically, existing methods to ascertain the existence of continuous Markov processes are based on the assumption that only time-homogeneous generators exist. Here a systematic extension to time inhomogeneity is presented, based on new mathematical propositions incorporating necessary and sufficient conditions, which are then implemented computationally and applied to numerical data. A discussion concerning the bridge between rigorous mathematical results on the existence of generators and their computational implementation is presented. Our detection algorithm proves effective for more than 60% of tested matrices, typically 80% to 90%, and for those an estimate of the (nonhomogeneous) generator matrix follows. We also solve the embedding problem analytically for the particular case of three-dimensional circulant matrices. Finally, possible applications of our framework to problems in different fields are briefly discussed.
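
For the time-homogeneous case the abstract mentions, the standard embedding check is: compute a real matrix logarithm of the transition matrix and test whether it satisfies the generator conditions (non-negative off-diagonal entries, zero row sums). A minimal sketch with an illustrative 2-state matrix:

```python
import numpy as np

# Time-homogeneous embedding check: a stochastic matrix P has a valid
# Markov generator Q (with P = expm(Q)) only if a real matrix logarithm
# exists whose off-diagonal entries are non-negative and whose rows sum
# to zero. The matrix below is illustrative.

P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

# Matrix logarithm via eigendecomposition (valid here: P is
# diagonalizable with real, positive eigenvalues 1 and 0.7).
w, V = np.linalg.eig(P)
Q = (V @ np.diag(np.log(w)) @ np.linalg.inv(V)).real

off_diag_ok = all(Q[i, j] >= 0 for i in range(2) for j in range(2) if i != j)
rows_ok = np.allclose(Q.sum(axis=1), 0.0)
print(off_diag_ok and rows_ok)  # True: a generator exists for this P
```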

  1. MAFsnp: A Multi-Sample Accurate and Flexible SNP Caller Using Next-Generation Sequencing Data

    PubMed Central

    Hu, Jiyuan; Li, Tengfei; Xiu, Zidi; Zhang, Hong

    2015-01-01

Most existing statistical methods developed for calling single nucleotide polymorphisms (SNPs) using next-generation sequencing (NGS) data are based on Bayesian frameworks, and no existing SNP caller produces p-values for calling SNPs in a frequentist framework. To fill this gap, we develop a new method, MAFsnp, a Multiple-sample based Accurate and Flexible algorithm for calling SNPs with NGS data. MAFsnp is based on an estimated likelihood ratio test (eLRT) statistic. In practical situations, the involved parameter is very close to the boundary of the parameter space, so standard large-sample theory is not suitable for evaluating the finite-sample distribution of the eLRT statistic. Observing that the distribution of the test statistic is a mixture of zero and a continuous part, we propose to model the test statistic with a novel two-parameter mixture distribution. Once the parameters of the mixture distribution are estimated, p-values can easily be calculated for detecting SNPs, and the multiple-testing corrected p-values can be used to control the false discovery rate (FDR) at any pre-specified level. With simulated data, MAFsnp is shown to have much better control of FDR than the existing SNP callers. Through application to two real datasets, MAFsnp is also shown to outperform the existing SNP callers in terms of calling accuracy. An R package “MAFsnp” implementing the new SNP caller is freely available at http://homepage.fudan.edu.cn/zhangh/softwares/. PMID:26309201
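
The boundary issue described above is the classic setting in which the LRT null distribution becomes a mixture of a point mass at zero and a continuous part. The sketch below uses the textbook 50:50 mixture of 0 and a chi-square(1) distribution for illustration; MAFsnp itself estimates a two-parameter mixture rather than fixing these values:

```python
import math

# P-value for a likelihood ratio statistic whose null distribution is a
# mixture of a point mass at zero and a chi-square(1) part. The 0.5
# mixing weight is the textbook boundary case, used here only as an
# illustration of the idea.

def chi2_1_sf(t):
    """Survival function of a chi-square variable with 1 df."""
    return math.erfc(math.sqrt(t / 2.0))

def boundary_lrt_pvalue(t, weight=0.5):
    """P-value of LRT statistic t under a (1-weight):weight 0/chi2(1) mix."""
    return 1.0 if t <= 0 else weight * chi2_1_sf(t)

print(round(boundary_lrt_pvalue(3.84), 3))  # 0.025
```

Note the mixture halves the p-value relative to a plain chi-square(1) reference, which is why ignoring the boundary makes the test conservative.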

  2. Computer tomography of flows external to test models

    NASA Technical Reports Server (NTRS)

    Prikryl, I.; Vest, C. M.

    1982-01-01

Computer tomographic techniques for reconstruction of three-dimensional aerodynamic density fields, from interferograms recorded from several different viewing directions, were studied. Emphasis is on the case in which an opaque object such as a test model in a wind tunnel obscures significant regions of the interferograms (projection data). A method called the Iterative Convolution Method (ICM), existing methods in which the field is represented by series expansions, and analysis of real experimental data in the form of aerodynamic interferograms are discussed.

  3. Validation of a new ELISA method for in vitro potency testing of hepatitis A vaccines.

    PubMed

    Morgeaux, S; Variot, P; Daas, A; Costanzo, A

    2013-01-01

    The goal of the project was to standardise a new in vitro method in replacement of the existing standard method for the determination of hepatitis A virus antigen content in hepatitis A vaccines (HAV) marketed in Europe. This became necessary due to issues with the method used previously, requiring the use of commercial test kits. The selected candidate method, not based on commercial kits, had already been used for many years by an Official Medicines Control Laboratory (OMCL) for routine testing and batch release of HAV. After a pre-qualification phase (Phase 1) that showed the suitability of the commercially available critical ELISA reagents for the determination of antigen content in marketed HAV present on the European market, an international collaborative study (Phase 2) was carried out in order to fully validate the method. Eleven laboratories took part in the collaborative study. They performed assays with the candidate standard method and, in parallel, for comparison purposes, with their own in-house validated methods where these were available. The study demonstrated that the new assay provides a more reliable and reproducible method when compared to the existing standard method. A good correlation of the candidate standard method with the in vivo immunogenicity assay in mice was shown previously for both potent and sub-potent (stressed) vaccines. Thus, the new standard method validated during the collaborative study may be implemented readily by manufacturers and OMCLs for routine batch release but also for in-process control or consistency testing. The new method was approved in October 2012 by Group of Experts 15 of the European Pharmacopoeia (Ph. Eur.) as the standard method for in vitro potency testing of HAV. The relevant texts will be revised accordingly. Critical reagents such as coating reagent and detection antibodies have been adopted by the Ph. Eur. Commission and are available from the EDQM as Ph. Eur. Biological Reference Reagents (BRRs).

  4. Advanced Computational Techniques for Hypersonic Propulsion

    NASA Technical Reports Server (NTRS)

    Povinelli, Louis A.

    1996-01-01

    CFD has played a major role in the resurgence of hypersonic flight, on the premise that numerical methods will allow us to perform simulations at conditions for which no ground test capability exists. Validation of CFD methods is being established using the experimental data base available, which is below Mach 8. It is important, however, to realize the limitations involved in the extrapolation process as well as the deficiencies that exist in numerical methods at the present time. Current features of CFD codes are examined for application to propulsion system components. The shortcomings in simulation and modeling are identified and discussed.

  5. Systematic evaluation of non-animal test methods for skin sensitisation safety assessment.

    PubMed

    Reisinger, Kerstin; Hoffmann, Sebastian; Alépée, Nathalie; Ashikaga, Takao; Barroso, Joao; Elcombe, Cliff; Gellatly, Nicola; Galbiati, Valentina; Gibbs, Susan; Groux, Hervé; Hibatallah, Jalila; Keller, Donald; Kern, Petra; Klaric, Martina; Kolle, Susanne; Kuehnl, Jochen; Lambrechts, Nathalie; Lindstedt, Malin; Millet, Marion; Martinozzi-Teissier, Silvia; Natsch, Andreas; Petersohn, Dirk; Pike, Ian; Sakaguchi, Hitoshi; Schepky, Andreas; Tailhardat, Magalie; Templier, Marie; van Vliet, Erwin; Maxwell, Gavin

    2015-02-01

The need for non-animal data to assess skin sensitisation properties of substances, especially cosmetics ingredients, has spawned the development of many in vitro methods. As it is widely believed that no single method can provide a solution, the Cosmetics Europe Skin Tolerance Task Force has defined a three-phase framework for the development of a non-animal testing strategy for skin sensitisation potency prediction. The results of the first phase – systematic evaluation of 16 test methods – are presented here. This evaluation involved generation of data on a common set of ten substances in all methods and systematic collation of information, including the level of standardisation, existing test data, potential for throughput, transferability and accessibility, in cooperation with the test method developers. A workshop was held with the test method developers to review the outcome of this evaluation and to discuss the results. The evaluation informed the prioritisation of test methods for the next phase of the non-animal testing strategy development framework. Ultimately, the testing strategy – combined with bioavailability and skin metabolism data and exposure consideration – is envisaged to allow establishment of a data integration approach for skin sensitisation safety assessment of cosmetic ingredients.

  6. First experiences with an accelerated CMV antigenemia test: CMV Brite Turbo assay.

    PubMed

    Visser, C E; van Zeijl, C J; de Klerk, E P; Schillizi, B M; Beersma, M F; Kroes, A C

    2000-06-01

Cytomegalovirus disease is still a major problem in immunocompromised patients, such as bone marrow or kidney transplantation patients. The detection of viral antigen in leukocytes (antigenemia) has proven to be a clinically relevant marker of CMV activity and has found widespread application. Because most existing assays are rather time-consuming and laborious, an accelerated version (Brite Turbo) of an existing method (Brite) has been developed. The major modification is the direct lysis of erythrocytes instead of separation by sedimentation. In this study the Brite Turbo method was compared with the conventional Brite method to detect CMV antigen pp65 in peripheral blood leukocytes of 107 consecutive immunocompromised patients. Both tests produced similar results. Discrepancies were limited to the lowest positive range, and sensitivity and specificity were comparable for both tests. Two major advantages of the Brite Turbo method were observed in comparison to the original method: assay time was reduced by more than 50%, and only 2 ml of blood was required. An additional advantage was the higher number of positive nuclei in the Brite Turbo method, attributable to the increased number of granulocytes in the assay. Early detection of CMV infection or reactivation has become faster and easier with this modified assay.

  7. TESTING OF INDOOR RADON REDUCTION TECHNIQUES IN 19 MARYLAND HOUSES

    EPA Science Inventory

    The report gives results of testing of indoor radon reduction techniques in 19 existing houses in Maryland. The focus was on passive measures: various passive soil depressurization methods, where natural wind and temperature effects are utilized to develop suction in the system; ...

  8. Conducting Slug Tests in Mini-Piezometers.

    PubMed

    Fritz, Bradley G; Mackley, Rob D; Arntzen, Evan V

    2016-03-01

Slug tests performed using mini-piezometers with internal diameters as small as 0.43 cm can provide a cost-effective tool for hydraulic characterization. We evaluated the hydraulic properties of the apparatus in a laboratory environment and compared those results with field tests of mini-piezometers installed in locations with varying hydraulic properties. Based on our evaluation, slug tests conducted in mini-piezometers using the fabrication and installation approach described here are effective within formations where the hydraulic conductivity is less than 1 × 10(-3) cm/s. While these constraints limit the potential application of this method, the benefits of this approach are that the installation, measurement, and analysis are cost-effective, and the installation can be completed in areas where other (larger diameter) methods might not be possible. Additionally, this methodology could be applied to existing mini-piezometers previously installed for other purposes. Such analysis of existing installations could be beneficial in interpreting previously collected data (e.g., water-quality data or hydraulic head data). © 2015, National Ground Water Association.
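
The paper does not state which slug-test analysis was applied, but the Hvorslev method is a common choice for piezometer data and illustrates how hydraulic conductivity is recovered from the head-recovery time. All dimensions below are illustrative:

```python
import math

# Hvorslev slug-test analysis:
#   K = r^2 * ln(Le / R) / (2 * Le * t37)
# where r is the casing radius, R the screen radius, Le the screen
# length, and t37 the time for the head to recover to 37% of the
# initial displacement. Geometry and timing below are hypothetical.

def hvorslev_k(r, R, Le, t37):
    """Hydraulic conductivity (cm/s) from a Hvorslev slug test."""
    return (r ** 2) * math.log(Le / R) / (2.0 * Le * t37)

# Example: 0.215 cm radius (0.43 cm ID), 10 cm screen, t37 = 60 s.
k = hvorslev_k(r=0.215, R=0.215, Le=10.0, t37=60.0)  # ≈ 1.5e-4 cm/s
```

The example value sits below the 1 × 10(-3) cm/s limit the authors report for this apparatus.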

  9. Comparison of three commercially available fit-test methods.

    PubMed

    Janssen, Larry L; Luinenburg, D Michael; Mullins, Haskell E; Nelson, Thomas J

    2002-01-01

    American National Standards Institute (ANSI) standard Z88.10, Respirator Fit Testing Methods, includes criteria to evaluate new fit-tests. The standard allows generated aerosol, particle counting, or controlled negative pressure quantitative fit-tests to be used as the reference method to determine acceptability of a new test. This study examined (1) comparability of three Occupational Safety and Health Administration-accepted fit-test methods, all of which were validated using generated aerosol as the reference method; and (2) the effect of the reference method on the apparent performance of a fit-test method under evaluation. Sequential fit-tests were performed using the controlled negative pressure and particle counting quantitative fit-tests and the bitter aerosol qualitative fit-test. Of 75 fit-tests conducted with each method, the controlled negative pressure method identified 24 failures; bitter aerosol identified 22 failures; and the particle counting method identified 15 failures. The sensitivity of each method, that is, agreement with the reference method in identifying unacceptable fits, was calculated using each of the other two methods as the reference. None of the test methods met the ANSI sensitivity criterion of 0.95 or greater when compared with either of the other two methods. These results demonstrate that (1) the apparent performance of any fit-test depends on the reference method used, and (2) the fit-tests evaluated use different criteria to identify inadequately fitting respirators. Although "acceptable fit" cannot be defined in absolute terms at this time, the ability of existing fit-test methods to reject poor fits can be inferred from workplace protection factor studies.
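
The sensitivity figure used in this study is simply the fraction of reference-method failures that the candidate method also flags. A minimal sketch with toy pass/fail data (not the study's results):

```python
# Fit-test sensitivity per ANSI Z88.10: of the fits the reference method
# rejects, the fraction the candidate method also rejects. The pass/fail
# lists below are illustrative toy data.

def sensitivity(reference_fail, candidate_fail):
    """Fraction of reference-method failures the candidate also finds."""
    agreed = sum(1 for ref, cand in zip(reference_fail, candidate_fail)
                 if ref and cand)
    return agreed / sum(reference_fail)

# Reference rejects 4 of 6 fits; the candidate catches 3 of those 4.
ref  = [True, True, True, True, False, False]
cand = [True, True, True, False, False, True]
print(sensitivity(ref, cand))  # 0.75
```

A candidate method would need sensitivity of 0.95 or greater against the reference to satisfy the ANSI criterion discussed above.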

10. Reliability of High-Voltage Tantalum Capacitors (Parts 3 and 4)

    NASA Technical Reports Server (NTRS)

    Teverovsky, Alexander

    2010-01-01

The Weibull grading test is a powerful technique that allows selection and reliability rating of solid tantalum capacitors for military and space applications. However, inaccuracies in the existing method and inadequate acceleration factors can result in significant errors, up to three orders of magnitude, in the calculated failure rate of capacitors. This paper analyzes deficiencies of the existing technique and recommends a more accurate method of calculation. A physical model presenting failures of tantalum capacitors as time-dependent dielectric breakdown is used to determine voltage and temperature acceleration factors and to select adequate Weibull grading test conditions. This model is verified by highly accelerated life testing (HALT) at different temperature and voltage conditions for three types of solid chip tantalum capacitors. It is shown that the parameters of the model and the acceleration factors can be calculated using a general log-linear relationship for the characteristic life with two stress levels.
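
The general log-linear relationship mentioned at the end implies a combined voltage and temperature acceleration factor between test and use conditions. The sketch below uses illustrative exponents (a voltage power-law exponent of 17 and an activation energy of 1.5 eV are assumptions, not the paper's fitted values):

```python
import math

# Combined acceleration factor from a general log-linear life model,
#   ln(life) = a + Ea/(kB*T) - n*ln(V),
# giving AF = (V_test/V_use)^n * exp((Ea/kB) * (1/T_use - 1/T_test)).
# The exponent n and activation energy Ea below are illustrative.

K_B = 8.617e-5  # Boltzmann constant, eV/K

def acceleration_factor(v_test, v_use, t_test_k, t_use_k, n=17.0, ea=1.5):
    """Life acceleration of (v_test, t_test_k) relative to use conditions."""
    voltage_af = (v_test / v_use) ** n
    thermal_af = math.exp((ea / K_B) * (1.0 / t_use_k - 1.0 / t_test_k))
    return voltage_af * thermal_af

# Example: 50 V at 85 °C test versus 35 V at 25 °C use.
af = acceleration_factor(v_test=50.0, v_use=35.0, t_test_k=358.0, t_use_k=298.0)
```

With these assumed exponents the combined factor is on the order of millions, which illustrates how easily a poorly chosen exponent shifts the calculated failure rate by orders of magnitude.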

  11. Review and specification for shrinkage cracks of bridge decks : final report.

    DOT National Transportation Integrated Search

    2016-12-01

    An existing standard method ASTM C157 is used to determine the length change or free shrinkage of an unrestrained concrete specimen. However, in bridge decks, the concrete is actually under restrained conditions, and thus free shrinkage test methods ...

  12. Turbulence excited frequency domain damping measurement and truncation effects

    NASA Technical Reports Server (NTRS)

    Soovere, J.

    1976-01-01

    Existing frequency domain modal frequency and damping analysis methods are discussed. The effects of truncation in the Laplace and Fourier transform data analysis methods are described. Methods for eliminating truncation errors from measured damping are presented. Implications of truncation effects in fast Fourier transform analysis are discussed. Limited comparison with test data is presented.

  13. 40 CFR 63.5798 - What if I want to use, or I manufacture, an application technology (new or existing) whose...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...: Reinforced Plastic Composites Production Calculating Organic Hap Emissions Factors for Open Molding and... description of the resin or gel coat application technology and supporting organic HAP emissions test data obtained using EPA test methods or their equivalent. The emission test data should be obtained using a...

  14. 40 CFR 63.5798 - What if I want to use, or I manufacture, an application technology (new or existing) whose...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...: Reinforced Plastic Composites Production Calculating Organic Hap Emissions Factors for Open Molding and... description of the resin or gel coat application technology and supporting organic HAP emissions test data obtained using EPA test methods or their equivalent. The emission test data should be obtained using a...

  15. 40 CFR 63.5798 - What if I want to use, or I manufacture, an application technology (new or existing) whose...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Pollutants: Reinforced Plastic Composites Production Calculating Organic Hap Emissions Factors for Open... organic HAP emissions test data obtained using EPA test methods or their equivalent. The emission test data should be obtained using a range of resin or gel coat HAP contents to demonstrate the...

  16. 40 CFR 63.5798 - What if I want to use, or I manufacture, an application technology (new or existing) whose...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Pollutants: Reinforced Plastic Composites Production Calculating Organic Hap Emissions Factors for Open... organic HAP emissions test data obtained using EPA test methods or their equivalent. The emission test data should be obtained using a range of resin or gel coat HAP contents to demonstrate the...

  17. 40 CFR 63.5798 - What if I want to use, or I manufacture, an application technology (new or existing) whose...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Pollutants: Reinforced Plastic Composites Production Calculating Organic Hap Emissions Factors for Open... organic HAP emissions test data obtained using EPA test methods or their equivalent. The emission test data should be obtained using a range of resin or gel coat HAP contents to demonstrate the...

  18. Optimal Sample Size Determinations for the Heteroscedastic Two One-Sided Tests of Mean Equivalence: Design Schemes and Software Implementations

    ERIC Educational Resources Information Center

    Jan, Show-Li; Shieh, Gwowen

    2017-01-01

    Equivalence assessment is becoming an increasingly important topic in many application areas including behavioral and social sciences research. Although there exist more powerful tests, the two one-sided tests (TOST) procedure is a technically transparent and widely accepted method for establishing statistical equivalence. Alternatively, a direct…

  19. Compression Testing of Textile Composite Materials

    NASA Technical Reports Server (NTRS)

    Masters, John E.

    1996-01-01

    The applicability of existing test methods, which were developed primarily for laminates made of unidirectional prepreg tape, to textile composites is an area of concern. The issue is whether the values measured for the 2-D and 3-D braided, woven, stitched, and knit materials are accurate representations of the true material response. This report provides a review of efforts to establish a compression test method for textile reinforced composite materials. Experimental data have been gathered from several sources and evaluated to assess the effectiveness of a variety of test methods. The effectiveness of the individual test methods to measure the material's modulus and strength is determined. Data are presented for 2-D triaxial braided, 3-D woven, and stitched graphite/epoxy material. However, the determination of a recommended test method and specimen dimensions is based, primarily, on experimental results obtained by the Boeing Defense and Space Group for 2-D triaxially braided materials. They evaluated seven test methods: NASA Short Block, Modified IITRI, Boeing Open Hole Compression, Zabora Compression, Boeing Compression after Impact, NASA ST-4, and a Sandwich Column Test.

  20. Using Set Covering with Item Sampling to Analyze the Infeasibility of Linear Programming Test Assembly Models

    ERIC Educational Resources Information Center

    Huitzing, Hiddo A.

    2004-01-01

    This article shows how set covering with item sampling (SCIS) methods can be used in the analysis and preanalysis of linear programming models for test assembly (LPTA). LPTA models can construct tests, fulfilling a set of constraints set by the test assembler. Sometimes, no solution to the LPTA model exists. The model is then said to be…

  1. Practical Applications of a Building Method to Construct Aerodynamic Database of Guided Missile Using Wind Tunnel Test Data

    NASA Astrophysics Data System (ADS)

    Kim, Duk-hyun; Lee, Hyoung-Jin

    2018-04-01

    A study of an efficient aerodynamic database modeling method was conducted. Creating the database by exploiting the periodicity and symmetry of missile aerodynamic coefficients was investigated as a way to minimize the number of wind tunnel test cases. In addition, the study examined how to generate the aerodynamic database when the periodicity changes due to the installation of a protuberance, and how to conduct a zero calibration. The required number of test cases changes with missile configuration, and some tests can be omitted. A database of aerodynamic coefficients over control-surface deflection angles can be constructed using a phase shift. The validity of the modeling method was demonstrated by confirming that aerodynamic coefficients calculated with it agreed with wind tunnel test results.


  2. Biodegradability standards for carrier bags and plastic films in aquatic environments: a critical review.

    PubMed

    Harrison, Jesse P; Boardman, Carl; O'Callaghan, Kenneth; Delort, Anne-Marie; Song, Jim

    2018-05-01

    Plastic litter is encountered in aquatic ecosystems across the globe, including polar environments and the deep sea. To mitigate the adverse societal and ecological impacts of this waste, there has been debate on whether 'biodegradable' materials should be granted exemptions from plastic bag bans and levies. However, great care must be exercised when attempting to define this term, due to the broad and complex range of physical and chemical conditions encountered within natural ecosystems. Here, we review existing international industry standards and regional test methods for evaluating the biodegradability of plastics within aquatic environments (wastewater, unmanaged freshwater and marine habitats). We argue that current standards and test methods are insufficient in their ability to realistically predict the biodegradability of carrier bags in these environments, due to several shortcomings in experimental procedures and a paucity of information in the scientific literature. Moreover, existing biodegradability standards and test methods for aquatic environments do not involve toxicity testing or account for the potentially adverse ecological impacts of carrier bags, plastic additives, polymer degradation products or small (microscopic) plastic particles that can arise via fragmentation. Successfully addressing these knowledge gaps is a key requirement for developing new biodegradability standard(s) for lightweight carrier bags.

  3. A retrospective evaluation method for in vitro mammalian genotoxicity tests using cytotoxicity index transformation formulae.

    PubMed

    Fujita, Yurika; Kasamatsu, Toshio; Ikeda, Naohiro; Nishiyama, Naohiro; Honda, Hiroshi

    2016-01-15

    Although in vitro chromosomal aberration tests and micronucleus tests have been widely used for genotoxicity evaluation, false-positive results have been reported under strong cytotoxic conditions. To reduce false-positive results, the new Organization for Economic Co-operation and Development (OECD) test guideline (TG) recommends the use of a new cytotoxicity index, relative increase in cell count or relative population doubling (RICC/RPD), instead of the traditionally used index, relative cell count (RCC). Although the use of the RICC/RPD may result in different outcomes and require re-evaluation of tested substances, it is impractical to re-evaluate all existing data. Therefore, we established a method to estimate test results from existing RCC data. First, we developed formulae to estimate RICC/RPD from RCC without cell counts by considering cell doubling time and experiment time. Next, the accuracy of the cytotoxicity index transformation formulae was verified by comparing estimated RICC/RPD and measured RICC/RPD for 3 major chemicals associated with false-positive genotoxicity test results: ethyl acrylate, eugenol and p-nitrophenol. Moreover, 25 compounds with false-positive in vitro chromosomal aberration (CA) test results were re-evaluated to establish a retrospective evaluation method based on derived estimated RICC/RPD values. The estimated RICC/RPD values were in good agreement with the measured RICC/RPD values for every concentration and chemical, and the estimated RICC suggested the possibility that 12 chemicals (48%) with previously judged false-positive results in fact had negative results. Our method enables transformation of RCC data into RICC/RPD values with a high degree of accuracy and will facilitate comprehensive retrospective evaluation of test results. Copyright © 2015 Elsevier B.V. All rights reserved.
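
    The cytotoxicity indices involved can be illustrated with a small sketch. Under the simplifying assumption that the control culture grows exponentially at a known doubling time, RICC and RPD can be back-calculated from RCC alone. These formulae are a plausible reconstruction under that assumption, not necessarily the paper's exact transformation formulae.

```python
import math

def estimate_ricc_rpd(rcc, doubling_time_h, exposure_h):
    """Estimate RICC and RPD (%) from RCC (%) without raw cell counts.

    Assumes the control culture grows exponentially, so its count
    multiplies by F = 2**(exposure / doubling time) over the
    experiment. Illustrative formulae under that assumption only.
    """
    g = exposure_h / doubling_time_h       # population doublings in control
    F = 2.0 ** g                           # fold-growth of the control
    treated_fold = (rcc / 100.0) * F       # implied fold-growth of treated culture
    ricc = 100.0 * (treated_fold - 1.0) / (F - 1.0)
    rpd = 100.0 * math.log2(treated_fold) / g
    return ricc, rpd

# A non-cytotoxic treatment (RCC = 100%) should give RICC = RPD = 100%.
ricc, rpd = estimate_ricc_rpd(rcc=100.0, doubling_time_h=14.0, exposure_h=24.0)
```

    Note that for cytotoxic treatments (RCC < 100%) the estimated RICC falls below the estimated RPD, which is one reason the OECD TG treats the two indices separately.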

  4. Overview of U.S. EPA Office of Research and Development’s planned research on analysis and monitoring in fresh and coastal/estuarine environments

    EPA Science Inventory

    This research plan has several objectives: 1) develop new or refine existing chemical, instrument and biological methods for the detection of cyanobacteria and their toxins; test such methods in field studies in both HAB and non HAB environments; 2) determine the method(s) that c...

  5. Hydrogeologic and hydraulic characterization of aquifer and nonaquifer layers in a lateritic terrain (West Bengal, India)

    NASA Astrophysics Data System (ADS)

    Biswal, Sabinaya; Jha, Madan K.; Sharma, Shashi P.

    2018-02-01

    The hydrogeologic and hydraulic characteristics of a lateritic terrain in West Bengal, India, were investigated. Test drilling was conducted at ten sites and grain-size distribution curves (GSDCs) were prepared for 275 geologic samples. Performance evaluation of eight grain-size-analysis (GSA) methods was carried out to estimate the hydraulic conductivity (K) of subsurface formations. Finally, the GSA results were validated against pumping-test data. The GSDCs indicated that shallow aquifer layers are coarser than the deeper aquifer layers (uniformity coefficient 0.19-11.4). Stratigraphy analysis revealed that both shallow and deep aquifers of varying thickness exist at depths of 9-40 and 40-79 m, respectively. The mean K estimates by the GSA methods are 3.62-292.86 m/day for shallow aquifer layers and 0.97-209.93 m/day for the deeper aquifer layers, suggesting significant aquifer heterogeneity. Pumping-test data indicated that the deeper aquifers are leaky confined, with transmissivity 122.69-693.79 m²/day, storage coefficient 1.01 × 10⁻⁷ to 2.13 × 10⁻⁴ and leakance 2.01 × 10⁻⁷ to 34.56 × 10⁻² day⁻¹. Although the K values yielded by the GSA methods are generally larger than those obtained from the pumping tests, the Slichter, Harleman and US Bureau of Reclamation (USBR) GSA methods yielded reasonable values at most of the sites (1-3 times higher than K estimates by the pumping-test method). In conclusion, more reliable aquifers exist at greater depths and can be tapped for dependable water supply. GSA methods such as Slichter, Harleman and USBR can be used for the preliminary assessment of K in lateritic terrains in the absence of reliable field methods.

  6. A NEW APPROACH TO THE STUDY OF MUCOADHESIVENESS OF POLYMERIC MEMBRANES USING SILICONE DISCS.

    PubMed

    Nowak, Karolina Maria; Szterk, Arkadiusz; Fiedor, Piotr; Bodek, Kazimiera Henryka

    2016-01-01

    The introduction of new test methods and the modification of existing ones are crucial for obtaining reliable results, which contributes to the development of innovative materials that may have clinical applications. Today, silicone is commonly used in medicine and the diversity of its applications is continually growing. The aim of this study is to evaluate the mucoadhesiveness of polymeric membranes by a method that modifies existing test methods through the introduction of silicone discs. The matrices were designed for clinical application in the management of diseases of the oral cavity. The use of silicone discs allows reliable and reproducible results to be obtained across a variety of tensometric measurements. In this study, different types of polymeric matrices were examined; their crosslinking and the presence of the active pharmaceutical ingredient were compared against the pure dosage form. Lidocaine hydrochloride (Lid(HCl)) was used as a model active substance, owing to its use in dentistry and its clinical safety. The results were characterized by a high repeatability (RSD < 10.6%). The advantages of the silicone material, namely its mechanical strength and its chemical and physical resistance, allowed a new test method using a texture analyzer to be proposed.

  7. Preliminary Tests For Development Of A Non-Pertechnetate Analysis Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Diprete, D.; McCabe, D.

    2016-09-28

    The objective of this task was to develop a non-pertechnetate analysis method that the 222-S lab could easily implement. The initial scope involved working with 222-S laboratory personnel to adapt the existing Tc analytical method to fractionate the non-pertechnetate and pertechnetate. SRNL then developed and tested a method using commercial sorbents containing Aliquat® 336 to extract the pertechnetate (thereby separating it from non-pertechnetate), followed by oxidation, extraction, and stripping steps, and finally analysis by beta counting and mass spectroscopy. Several additional items were partially investigated, including impacts of a 137Cs removal step. The method was initially tested on SRS tank waste samples to determine its viability. Although SRS tank waste does not contain non-pertechnetate, testing with it was useful to investigate the compatibility, separation efficiency, interference removal efficacy, and method sensitivity.

  8. LLNL small-scale static spark machine: static spark sensitivity test

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Foltz, M F; Simpson, L R

    1999-08-23

    Small-scale safety testing of explosives and other energetic materials is done in order to determine their sensitivity to various stimuli, such as friction, static spark, and impact. Typically this testing is done to discover potential handling problems that may exist for either newly synthesized materials of unknown behavior, or materials that have been stored for long periods of time. This report describes the existing ''Static Spark Test Apparatus'' at Lawrence Livermore National Laboratory (LLNL), as well as the method used to evaluate the relative static spark sensitivity of energetic materials. The basic design, originally developed by the Picatinny Arsenal in New Jersey, is discussed. The accumulated data for the materials tested to date are not included here, with the exception of specific examples that have yielded interesting or unusual results during the tests.

  9. A retrospective analysis of in vivo eye irritation, skin irritation and skin sensitisation studies with agrochemical formulations: Setting the scene for development of alternative strategies.

    PubMed

    Corvaro, M; Gehen, S; Andrews, K; Chatfield, R; Macleod, F; Mehta, J

    2017-10-01

    Analysis of the prevalence of health effects in large scale databases is key in defining testing strategies within the context of Integrated Approaches on Testing and Assessment (IATA), and is relevant to drive policy changes in existing regulatory toxicology frameworks towards non-animal approaches. A retrospective analysis of existing results from in vivo skin irritation, eye irritation, and skin sensitisation studies on a database of 223 agrochemical formulations is herein published. For skin or eye effects, the high prevalence of mild to non-irritant formulations (i.e. per GHS, CLP or EPA classification) would generally suggest a bottom-up approach. Severity of erythema or corneal opacity, for skin or eye effects respectively, was the key driver for classification, consistent with existing literature. The reciprocal predictivity of skin versus eye irritation and the good negative predictivity of the GHS additivity calculation approach (>85%) provided valuable non-testing evidence for irritation endpoints. For dermal sensitisation, concordance of data from three different methods confirmed the high false negative rate for the Buehler method in this product class. These results have been reviewed together with existing literature on the use of in vitro alternatives for agrochemical formulations, to propose improvements to current regulatory strategies and to identify further research needs. Copyright © 2017 Elsevier Inc. All rights reserved.

  10. Statistical Considerations Concerning Dissimilar Regulatory Requirements for Dissolution Similarity Assessment. The Example of Immediate-Release Dosage Forms.

    PubMed

    Jasińska-Stroschein, Magdalena; Kurczewska, Urszula; Orszulak-Michalak, Daria

    2017-05-01

    When performing in vitro dissolution testing, especially in the area of biowaivers, it is necessary to follow regulatory guidelines to minimize the risk of an unsafe or ineffective product being approved. The present study examines model-independent and model-dependent methods of comparing dissolution profiles in light of various international guidelines, which are compared and contrasted. Dissolution profiles for immediate-release solid oral dosage forms were generated. The test material comprised tablets containing several substances, with at least 85% of the labeled amount dissolved within 15 min, 20-30 min, or 45 min. Dissolution profile similarity can vary with regard to the following criteria: time point selection (including the last time point), coefficient of variation, and statistical method selection. Variation between regulatory guidance and statistical methods can raise methodological questions and potentially result in a different outcome when reporting dissolution profile testing. The harmonization of existing guidelines would address existing problems concerning the interpretation of regulatory recommendations and research findings. Copyright © 2017 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
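
    Among the model-independent comparison methods the study refers to, the most widely cited in regulatory guidance is the f2 similarity factor. A minimal sketch follows; the profile values are invented for illustration.

```python
import math

def f2_similarity(reference, test_profile):
    """Model-independent f2 similarity factor for two dissolution
    profiles (percent dissolved at matched time points).

    f2 >= 50 is the conventional criterion for declaring profiles
    similar; guidance documents impose further conditions (e.g. limits
    on the coefficient of variation) that are not modeled here.
    """
    if len(reference) != len(test_profile):
        raise ValueError("profiles must share the same time points")
    n = len(reference)
    # mean squared difference between the two profiles
    msd = sum((r - t) ** 2 for r, t in zip(reference, test_profile)) / n
    return 50.0 * math.log10(100.0 / math.sqrt(1.0 + msd))

ref_profile  = [35, 55, 75, 90, 95]   # % dissolved at successive time points
test_profile = [30, 50, 72, 88, 94]
f2 = f2_similarity(ref_profile, test_profile)
```

    Identical profiles give f2 = 100, and the statistic shrinks as the average squared difference between profiles grows, which is why time-point selection changes the outcome.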

  11. Conditions for the existence of Kelvin-Helmholtz instability in a CME

    NASA Astrophysics Data System (ADS)

    Jatenco-Pereira, Vera; Páez, Andrés; Falceta-Gonçalves, Diego; Opher, Merav

    2015-08-01

    The presence of the Kelvin-Helmholtz instability (KHI) in the sheaths of coronal mass ejections (CMEs) has motivated several analyses and simulations to test its existence. In the present work we assume the existence of the KHI and propose a method to identify the regions where it can develop for a CME propagating through fast and slow solar wind. We build functions for the velocities, densities and magnetic fields in two different zones of interaction between the solar wind and a CME. Based on the theory of magnetic KHI proposed by Chandrasekhar (1961), we found conditions for the existence of the KHI in CME sheaths. Using this method it is possible to determine the range of parameters, in particular the CME magnetic fields, for which the KHI could exist. We conclude that the KHI may exist on the two CME flanks, and that the zone bounded by the slow solar wind is more favorable to its formation.

  12. Understanding patient choices for attending sexually transmitted infection testing services: a qualitative study

    PubMed Central

    Pollard, Alex; Miners, Alec; Richardson, Daniel; Fisher, Martin; Cairns, John; Smith, Helen

    2012-01-01

    Objectives To establish which aspects of sexually transmitted infection (STI) testing services are important to STI testing service users. Methods 10 focus groups consisting of previous or existing users of STI testing services were conducted in community settings in the south east of England. Groups were quota sampled based on age, gender and sexual orientation. Data were analysed using Framework Analysis. Results 65 respondents (58% men) participated. Perceived expertise of staff was the key reason for attendance at genitourinary medicine services rather than general practice. Although some respondents voiced a willingness to test for STIs within general practice, the apparent limited range of tests available in general practice and the perceived lack of expertise around sexual health appeared to discourage attendance at general practice. The decision of where to test for STIs was also influenced by past experience of testing, existing relationships with general practice, method of receiving test results and whether the patient had other medical conditions such as HIV. Conclusions No one type of STI testing service is suitable for all patients. This is recognised by policymakers, and it now requires commissioners and providers to make services outside of genitourinary medicine clinics more acceptable and attractive to patients, in particular to address the perceived lack of expertise and limited range of STIs tests available at alternative testing sites. PMID:22628665

  13. How Magnetic Disturbance Influences the Attitude and Heading in Magnetic and Inertial Sensor-Based Orientation Estimation

    PubMed Central

    Li, Qingguo

    2017-01-01

    With the advancements in micro-electromechanical systems (MEMS) technologies, magnetic and inertial sensors are becoming more accurate, lightweight, smaller in size and lower in cost, which in turn boosts their applications in human movement analysis. However, challenges still exist in the field of sensor orientation estimation, where magnetic disturbance represents one of the obstacles limiting their practical application. The objective of this paper is to systematically analyze exactly how magnetic disturbance affects the attitude and heading estimation for a magnetic and inertial sensor. First, we reviewed four major components dealing with magnetic disturbance, namely decoupling attitude estimation from the magnetic reading, gyro bias estimation, adaptive strategies for compensating magnetic disturbance, and sensor fusion algorithms, and analyzed the features of the existing methods for each component. Second, to understand each component in magnetic disturbance rejection, four representative sensor fusion methods were implemented, including gradient descent algorithms, the improved explicit complementary filter, the dual-linear Kalman filter and the extended Kalman filter. Finally, a new standardized testing procedure was developed to objectively assess the performance of each method against magnetic disturbance. Based upon the testing results, the strengths and weaknesses of the existing sensor fusion methods were easily examined, and suggestions were presented for selecting a proper sensor fusion algorithm or developing a new sensor fusion method. PMID:29283432

  14. Development of a Complete Life Cycle Sediment Toxicity Test for the Sheepshead Minnow (Cyprinodon variegatus)

    EPA Science Inventory

    Existing sediment toxicity test methods are limited to acute and chronic exposure of invertebrates and acute exposure of vertebrates, with limited guidance on the chronic exposure of vertebrates, specifically fishes. A series of life stage-specific studies were conducted to dete...

  15. The Very Essentials of Fitness for Trial Assessment in Canada

    ERIC Educational Resources Information Center

    Newby, Diana; Faltin, Robert

    2008-01-01

    Fitness for trial constitutes the most frequent referral to forensic assessment services. Several approaches to this evaluation exist in Canada, including the Fitness Interview Test and Basic Fitness for Trial Test. The following article presents a review of the issues and a method for basic fitness for trial evaluation.

  16. Statistical Models for Incorporating Data from Routine HIV Testing of Pregnant Women at Antenatal Clinics into HIV/AIDS Epidemic Estimates

    PubMed Central

    Sheng, Ben; Marsh, Kimberly; Slavkovic, Aleksandra B.; Gregson, Simon; Eaton, Jeffrey W.; Bao, Le

    2017-01-01

    Objective HIV prevalence data collected from routine HIV testing of pregnant women at antenatal clinics (ANC-RT) are potentially available from all facilities that offer testing services to pregnant women, and can be used to improve estimates of national and sub-national HIV prevalence trends. We develop methods to incorporate this new data source into the UNAIDS Estimation and Projection Package (EPP) in Spectrum 2017. Methods We develop a new statistical model for incorporating ANC-RT HIV prevalence data, aggregated either to the health facility level (‘site-level’) or regionally (‘census-level’), to estimate HIV prevalence alongside existing sources of HIV prevalence data from ANC unlinked anonymous testing (ANC-UAT) and household-based national population surveys. Synthetic data are generated to understand how the availability of ANC-RT data affects the accuracy of various parameter estimates. Results We estimate HIV prevalence and additional parameters using both ANC-RT and other existing data. Fitting HIV prevalence using synthetic data generally gives precise estimates of the underlying trend and other parameters. More years of ANC-RT data should improve prevalence estimates. More ANC-RT sites and continuation with existing ANC-UAT sites may improve the estimate of calibration between ANC-UAT and ANC-RT sites. Conclusion We have proposed methods to incorporate ANC-RT data into Spectrum to obtain more precise estimates of prevalence and other measures of the epidemic. Many assumptions about the accuracy, consistency, and representativeness of ANC-RT prevalence underlie the use of these data for monitoring HIV epidemic trends, and should be tested as more data become available from national ANC-RT programs. PMID:28296804

  17. International development of methods of analysis for the presence of products of modern biotechnology.

    PubMed

    Cantrill, Richard C

    2008-01-01

    Methods of analysis for products of modern biotechnology are required for national and international trade in seeds, grain and food in order to meet the labeling or import/export requirements of different nations and trading blocs. Although many methods were developed by the originators of transgenic events, governments, universities, and testing laboratories, trade is less complicated if there exists a set of international consensus-derived analytical standards. In any analytical situation, multiple methods may exist for testing for the same analyte. These methods may be supported by regional preferences and regulatory requirements. However, tests need to be sensitive enough to determine low levels of these traits in commodity grain for regulatory purposes and also to indicate the purity of seeds containing these traits. The International Organization for Standardization (ISO) and its European counterpart have worked to produce a suite of standards through open, balanced and consensus-driven processes. Presently, these standards are approaching the time for their first review. In fact, ISO 21572, the "protein standard", has already been circulated for systematic review. In order to expedite the review and revision of the nucleic acid standards, an ISO Technical Specification (ISO/TS 21098) was drafted to set the criteria for the inclusion of precision data from collaborative studies into the annexes of these standards.

  18. Job Knowledge Test Design: A Cognitively-Oriented Approach. Institute Report No. 241.

    ERIC Educational Resources Information Center

    DuBois, David; And Others

    Selected cognitive science methods were used to modify existing test development procedures so that the modified procedures could in turn be used to improve the usefulness of job knowledge tests as a proxy for hands-on performance. A plan-goal graph representation was used to capture the knowledge content and goal structure of the task of using a…

  19. Reader Reaction On the generalized Kruskal-Wallis test for genetic association studies incorporating group uncertainty

    PubMed Central

    Wu, Baolin; Guan, Weihua

    2015-01-01

    Summary Acar and Sun (2013, Biometrics, 69, 427-435) presented a generalized Kruskal-Wallis (GKW) test for genetic association studies that incorporated the genotype uncertainty and showed its robust and competitive performance compared to existing methods. We present another interesting way to derive the GKW test via a rank linear model. PMID:25351417

  20. Reader reaction on the generalized Kruskal-Wallis test for genetic association studies incorporating group uncertainty.

    PubMed

    Wu, Baolin; Guan, Weihua

    2015-06-01

    Acar and Sun (2013, Biometrics 69, 427-435) presented a generalized Kruskal-Wallis (GKW) test for genetic association studies that incorporated the genotype uncertainty and showed its robust and competitive performance compared to existing methods. We present another interesting way to derive the GKW test via a rank linear model. © 2014, The International Biometric Society.
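
    For the classical case without genotype uncertainty, the rank-linear-model view mentioned in this reaction can be checked numerically: the Kruskal-Wallis statistic equals (N − 1) times the R² from an ANOVA on the pooled ranks, exactly when there are no ties. A sketch with simulated data (the data and group count are illustrative):

```python
import numpy as np
from scipy import stats

# Three continuous samples with shifted locations (so no ties occur).
rng = np.random.default_rng(0)
groups = [rng.normal(m, 1.0, 25) for m in (0.0, 0.3, 0.8)]

# Classical Kruskal-Wallis statistic from scipy.
H, _ = stats.kruskal(*groups)

# The same statistic via a rank "linear model": pool and rank all
# observations, then H = (N - 1) * R^2, where R^2 is the between-group
# share of the total sum of squares of the ranks.
y = np.concatenate(groups)
labels = np.repeat(np.arange(len(groups)), [len(g) for g in groups])
r = stats.rankdata(y)
N = len(r)
ss_total = np.sum((r - r.mean()) ** 2)
ss_between = sum(
    np.sum(labels == k) * (r[labels == k].mean() - r.mean()) ** 2
    for k in range(len(groups))
)
H_rank = (N - 1) * ss_between / ss_total
```

    The identity holds because the total sum of squares of the ranks 1..N is N(N² − 1)/12, which turns (N − 1)·R² into the familiar 12/(N(N + 1)) form of the statistic.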

  1. Improving Upon String Methods for Transition State Discovery.

    PubMed

    Chaffey-Millar, Hugh; Nikodem, Astrid; Matveev, Alexei V; Krüger, Sven; Rösch, Notker

    2012-02-14

    Transition state discovery via application of string methods has been researched on two fronts. The first front involves development of a new string method, named the Searching String method, while the second one aims at estimating transition states from a discretized reaction path. The Searching String method has been benchmarked against a number of previously existing string methods and the Nudged Elastic Band method. The developed methods have led to a reduction in the number of gradient calls required to optimize a transition state, as compared to existing methods. The Searching String method reported here places new beads on a reaction pathway at the midpoint between existing beads, such that the resolution of the path discretization in the region containing the transition state grows exponentially with the number of beads. This approach leads to favorable convergence behavior and generates more accurate estimates of transition states from which convergence to the final transition states occurs more readily. Several techniques for generating improved estimates of transition states from a converged string or nudged elastic band have been developed and benchmarked on 13 chemical test cases. Optimization approaches for string methods, and pitfalls therein, are discussed.

  2. Diagnosis of coccidioidomycosis by culture: safety considerations, traditional methods, and susceptibility testing.

    PubMed

    Sutton, Deanna A

    2007-09-01

    The recovery of Coccidioides spp. by culture and confirmation utilizing the AccuProbe nucleic acid hybridization method by GenProbe remain the definitive diagnostic method. Biosafety considerations from specimen collection through culture confirmation in the mycology laboratory are critical, as acquisition of coccidioidomycosis by laboratory workers is well documented. The designation of Coccidioides spp. as select agents of potential bioterrorism has mandated strict regulation of their transport and inventory. The genus appears generally susceptible, in vitro, although no defined breakpoints exist. Susceptibility testing may assist in documenting treatment failures.

  3. A new background subtraction method for energy dispersive X-ray fluorescence spectra using a cubic spline interpolation

    NASA Astrophysics Data System (ADS)

    Yi, Longtao; Liu, Zhiguo; Wang, Kai; Chen, Man; Peng, Shiqi; Zhao, Weigang; He, Jialin; Zhao, Guangcui

    2015-03-01

    A new method is presented to subtract the background from the energy dispersive X-ray fluorescence (EDXRF) spectrum using a cubic spline interpolation. To accurately obtain interpolation nodes, a smooth fitting and a set of discriminant formulations were adopted. From these interpolation nodes, the background is estimated by a calculated cubic spline function. The method has been tested on spectra measured from a coin and an oil painting using a confocal MXRF setup. In addition, the method has been tested on an existing sample spectrum. The result confirms that the method can properly subtract the background.
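
    A minimal sketch of the spline-based background subtraction on a synthetic spectrum, with the interpolation nodes picked by hand in peak-free regions (the paper selects them automatically via smooth fitting and discriminant formulae); all numbers are invented for illustration:

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Synthetic EDXRF-like spectrum: smooth background plus two peaks.
x = np.linspace(0.0, 20.0, 400)                        # energy axis (keV)
background = 50.0 * np.exp(-x / 8.0)
peaks = (200.0 * np.exp(-0.5 * ((x - 6.0) / 0.15) ** 2)
         + 120.0 * np.exp(-0.5 * ((x - 14.0) / 0.20) ** 2))
spectrum = background + peaks

# Interpolation nodes placed in peak-free regions; a stand-in for the
# paper's automatic node selection.
node_x = np.array([0.0, 2.0, 4.0, 9.0, 12.0, 16.0, 20.0])
node_y = np.interp(node_x, x, spectrum)

# Cubic spline through the nodes estimates the background, which is
# then subtracted to leave the net peak spectrum.
est_background = CubicSpline(node_x, node_y)(x)
net = spectrum - est_background
```

    With well-placed nodes the residual off-peak signal is close to zero while the peak heights are preserved, which is the property the paper verifies on measured spectra.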

  4. Fabrication methods for YF-12 wing panels for the Supersonic Cruise Aircraft Research Program

    NASA Technical Reports Server (NTRS)

    Hoffman, E. L.; Payne, L.; Carter, A. L.

    1975-01-01

    Advanced fabrication and joining processes for titanium and composite materials are being investigated by NASA to develop technology for the Supersonic Cruise Aircraft Research (SCAR) Program. With Lockheed-ADP as the prime contractor, full-scale structural panels are being designed and fabricated to replace an existing integrally stiffened shear panel on the upper wing surface of the NASA YF-12 aircraft. The program involves ground testing and Mach 3 flight testing of full-scale structural panels and laboratory testing of representative structural element specimens. Fabrication methods and test results for weldbrazed and Rohrbond titanium panels are discussed. The fabrication methods being developed for boron/aluminum, Borsic/aluminum, and graphite/polyimide panels are also presented.

  5. Development of monitoring and diagnostic methods for robots used in remediation of waste sites. 1997 annual progress report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tecza, J.

    1998-06-01

    'Safe and efficient clean up of hazardous and radioactive waste sites throughout the DOE complex will require extensive use of robots. This research effort focuses on developing Monitoring and Diagnostic (M and D) methods for robots that will provide early detection, isolation, and tracking of impending faults before they result in serious failure. The utility and effectiveness of applying M and D methods to hydraulic robots has never been proven. The present research program is utilizing seeded faults in a laboratory test rig that is representative of an existing hydraulically-powered remediation robot. This report summarizes activity conducted in the first 9 months of the project. The research team has analyzed the Rosie Mobile Worksystem as a representative hydraulic robot, developed a test rig for implanted fault testing, developed a test plan and agenda, and established methods for acquiring and analyzing the test data.'

  6. Rapid assessment of rice seed availability for wildlife in harvested fields

    USGS Publications Warehouse

    Halstead, B.J.; Miller, M.R.; Casazza, Michael L.; Coates, P.S.; Farinha, M.A.; Benjamin, Gustafson K.; Yee, J.L.; Fleskes, J.P.

    2011-01-01

    Rice seed remaining in commercial fields after harvest (waste rice) is a critical food resource for wintering waterfowl in rice-growing regions of North America. Accurate and precise estimates of the seed mass density of waste rice are essential for planning waterfowl wintering habitat extents and management. In the Sacramento Valley of California, USA, the existing method for obtaining estimates of availability of waste rice in harvested fields produces relatively precise estimates, but the labor-, time-, and machinery-intensive process is not practical for the routine assessments needed to examine long-term trends in waste rice availability. We tested several experimental methods designed to rapidly derive estimates that would not be burdened with the disadvantages of the existing method. We first conducted a simulation study of the efficiency of each method and then conducted field tests. For each approach, methods did not vary in root mean squared error, although some methods did exhibit bias in both simulations and field tests. Methods also varied substantially in the time to conduct each sample and in the number of samples required to detect a standard trend. Overall, modified line-intercept methods performed well for estimating the density of rice seeds. Waste rice in the straw, although not measured directly, can be accounted for by a positive relationship with the density of rice on the ground. Rapid assessment of food availability is a useful tool to help waterfowl managers establish and implement wetland restoration and agricultural habitat-enhancement goals for wintering waterfowl. © 2011 The Wildlife Society.

  7. A Review of Treatment Adherence Measurement Methods

    ERIC Educational Resources Information Center

    Schoenwald, Sonja K.; Garland, Ann F.

    2013-01-01

    Fidelity measurement is critical for testing the effectiveness and implementation in practice of psychosocial interventions. Adherence is a critical component of fidelity. The purposes of this review were to catalogue adherence measurement methods and assess existing evidence for the valid and reliable use of the scores that they generate and the…

  8. Detecting wood surface defects with fusion algorithm of visual saliency and local threshold segmentation

    NASA Astrophysics Data System (ADS)

    Wang, Xuejuan; Wu, Shuhang; Liu, Yunpeng

    2018-04-01

    This paper presents a new method for wood defect detection. It can solve the over-segmentation problem of local threshold segmentation methods by effectively taking advantage of both visual saliency and local threshold segmentation. Firstly, defect areas are coarsely located by using the spectral residual method to compute their global visual saliency. Then, threshold segmentation by the maximum inter-class variance (Otsu) method is applied around the coarsely located areas to position and segment the wood surface defects precisely. Lastly, we use mathematical morphology to process the binary images after segmentation, which reduces noise and removes small false objects. Experiments on test images of insect holes, dead knots and sound knots show that the proposed method obtains good segmentation results and is superior to existing segmentation methods based on edge detection, Otsu thresholding and local threshold segmentation.

  9. Characterization of background concentrations of contaminants using a mixture of normal distributions.

    PubMed

    Qian, Song S; Lyons, Regan E

    2006-10-01

    We present a Bayesian approach for characterizing background contaminant concentration distributions using data from sites that may have been contaminated. Our method, focused on estimation, resolves several technical problems of the existing hypothesis-testing-based methods sanctioned by the U.S. Environmental Protection Agency (USEPA), resulting in a simple and quick procedure for estimating background contaminant concentrations. The proposed Bayesian method is applied to two data sets from a federal facility regulated under the Resource Conservation and Recovery Act. The results are compared to background distributions identified using existing methods recommended by the USEPA. The two data sets represent low and moderate levels of censoring in the data. Although an unbiased estimator is elusive, we show that the proposed Bayesian estimation method has a smaller bias than the EPA-recommended method.
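
    The core idea of separating a background component from a contaminated one can be illustrated with a two-component normal mixture fitted by EM. This is a frequentist sketch of the mixture idea, not the authors' Bayesian procedure; the data, initialization, and iteration count are illustrative assumptions.

```python
import numpy as np

def em_two_normals(x, n_iter=200):
    """EM fit of a two-component normal mixture; the lower-mean
    component is interpreted as the background distribution."""
    x = np.asarray(x, float)
    mu = np.percentile(x, [25, 75])        # crude initialization
    sd = np.array([x.std(), x.std()])
    w = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: responsibility of each component for each point
        dens = w * np.exp(-0.5 * ((x[:, None] - mu) / sd) ** 2) \
               / (sd * np.sqrt(2 * np.pi))
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: weighted parameter updates
        nk = r.sum(axis=0)
        w = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        sd = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
    order = np.argsort(mu)
    return mu[order], sd[order], w[order]

# invented data: mostly background, some elevated concentrations
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(1.0, 0.2, 300),   # background
                    rng.normal(5.0, 0.5, 100)])  # contaminated
mu, sd, w = em_two_normals(x)  # mu[0], sd[0], w[0] describe the background
```

    A Bayesian version would place priors on the component parameters and sample the posterior, but the mixture decomposition it estimates is the same.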

  10. Pathway analysis with next-generation sequencing data.

    PubMed

    Zhao, Jinying; Zhu, Yun; Boerwinkle, Eric; Xiong, Momiao

    2015-04-01

    Although pathway analysis methods have been developed and successfully applied to association studies of common variants, the statistical methods for pathway-based association analysis of rare variants have not been well developed. Many investigators observed highly inflated false-positive rates and low power in pathway-based tests of association of rare variants. The inflated false-positive rates and low true-positive rates of the current methods are mainly due to their lack of ability to account for gametic phase disequilibrium. To overcome these serious limitations, we develop a novel statistic that is based on the smoothed functional principal component analysis (SFPCA) for pathway association tests with next-generation sequencing data. The developed statistic has the ability to capture position-level variant information and account for gametic phase disequilibrium. By intensive simulations, we demonstrate that the SFPCA-based statistic for testing pathway association with either rare or common or both rare and common variants has the correct Type I error rates. We also evaluate the power of the SFPCA-based statistic and of 22 existing statistics. We found that the SFPCA-based statistic has a much higher power than the other existing statistics in all the scenarios considered. To further evaluate its performance, the SFPCA-based statistic is applied to pathway analysis of exome sequencing data in the early-onset myocardial infarction (EOMI) project. We identify three pathways significantly associated with EOMI after the Bonferroni correction. In addition, our preliminary results show that the SFPCA-based statistic yields much smaller P-values for identifying pathway associations than other existing methods.

  11. A hidden two-locus disease association pattern in genome-wide association studies

    PubMed Central

    2011-01-01

    Background Recent association analyses in genome-wide association studies (GWAS) mainly focus on single-locus association tests (marginal tests) and two-locus interaction detection. These analysis methods have provided strong evidence of associations between genetic variants and complex diseases. However, there exists a type of association pattern which often occurs within local regions of the genome and is unlikely to be detected by either marginal tests or interaction tests. This association pattern involves a group of correlated single-nucleotide polymorphisms (SNPs). The correlation among SNPs can lead to weak marginal effects, and interaction plays no role in this association pattern. This phenomenon is due to unfaithfulness: the marginal effects of correlated SNPs do not express their significant joint effects faithfully, owing to correlation cancellation. Results In this paper, we develop a computational method to detect this association pattern masked by unfaithfulness. We have applied our method to analyze seven data sets from the Wellcome Trust Case Control Consortium (WTCCC). The analysis for each data set takes about one week to finish the examination of all pairs of SNPs. Based on the empirical results from these real data, we show that this type of association masked by unfaithfulness is widespread in GWAS. Conclusions These newly identified associations enrich the discoveries of GWAS and may provide new insights both in the analysis of tagSNPs and in the experimental design of GWAS. Since these associations may be easily missed by existing analysis tools, we can only connect some of them to publicly available findings from other association studies. As independent data sets are limited at this moment, we also had difficulty replicating these findings. The biological implications need further investigation. Availability The software is freely available at http://bioinformatics.ust.hk/hidden_pattern_finder.zip.
PMID:21569557
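
    The cancellation effect this record describes is easy to reproduce: two strongly correlated predictors whose trait effects have opposite signs show almost no marginal signal but a strong joint signal. The sketch below is a generic linear-model illustration of unfaithfulness, not the paper's detection method; the continuous variables stand in for genotype scores, and all parameter values are invented.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 5000
z = rng.normal(size=n)                     # shared component -> strong LD
x1 = z + 0.3 * rng.normal(size=n)          # "SNP" score 1
x2 = z + 0.3 * rng.normal(size=n)          # "SNP" score 2, highly correlated with x1
y = (x1 - x2) + 0.3 * rng.normal(size=n)   # trait depends on the *difference*

def r2(y, cols):
    """R-squared of an ordinary least squares fit with intercept."""
    X = np.column_stack([np.ones(len(y))] + list(cols))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

ld = np.corrcoef(x1, x2)[0, 1]   # high pairwise correlation
r2_x1 = r2(y, [x1])              # weak marginal effect
r2_x2 = r2(y, [x2])              # weak marginal effect
r2_joint = r2(y, [x1, x2])       # strong joint effect
```

    Because the shared component z cancels in x1 - x2, each marginal test sees only the small independent noise terms, while the two-SNP joint model recovers most of the trait variance.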

  12. A Generalized Pivotal Quantity Approach to Analytical Method Validation Based on Total Error.

    PubMed

    Yang, Harry; Zhang, Jianchun

    2015-01-01

    The primary purpose of method validation is to demonstrate that the method is fit for its intended use. Traditionally, an analytical method is deemed valid if its performance characteristics, such as accuracy and precision, are shown to meet prespecified acceptance criteria. However, these acceptance criteria are not directly related to the method's intended purpose, which is usually a guarantee that a high percentage of the test results of future samples will be close to their true values. Alternative "fit for purpose" acceptance criteria based on the concept of total error have been increasingly used. Such criteria allow for assessing method validity while taking into account the relationship between accuracy and precision. Although several statistical test methods have been proposed in the literature to test the "fit for purpose" hypothesis, the majority of these methods are not designed to protect against the risk of accepting unsuitable methods, and thus have the potential to cause uncontrolled consumer's risk. In this paper, we propose a test method based on generalized pivotal quantity inference. Through simulation studies, the performance of the method is compared to five existing approaches. The results show that both the new method and the method based on the β-content tolerance interval with a confidence level of 90%, hereafter referred to as the β-content (0.9) method, control Type I error and thus consumer's risk, while the other existing methods do not. It is further demonstrated that the generalized pivotal quantity method is less conservative than the β-content (0.9) method when the analytical methods are biased, whereas it is more conservative when the analytical methods are unbiased. Therefore, selection of either the generalized pivotal quantity or the β-content (0.9) method for analytical method validation depends on the accuracy of the analytical method. It is also shown that the generalized pivotal quantity method has better asymptotic properties than all of the current methods. Analytical methods are often used to ensure the safety, efficacy, and quality of medicinal products. According to government regulations and regulatory guidelines, these methods need to be validated through well-designed studies to minimize the risk of accepting unsuitable methods. This article describes a novel statistical test for analytical method validation, which provides better protection against the risk of accepting unsuitable analytical methods. © PDA, Inc. 2015.

  13. Fabrication and evaluation of advanced titanium structural panels for supersonic cruise aircraft

    NASA Technical Reports Server (NTRS)

    Payne, L.

    1977-01-01

    Flightworthy primary structural panels were designed, fabricated, and tested to investigate two advanced fabrication methods for titanium alloys. Skin-stringer panels fabricated using the weldbraze process, and honeycomb-core sandwich panels fabricated using a diffusion bonding process, were designed to replace an existing integrally stiffened shear panel on the upper wing surface of the NASA YF-12 research aircraft. The investigation included ground testing and Mach 3 flight testing of full-scale panels, and laboratory testing of representative structural element specimens. Test results obtained on full-scale panels and structural element specimens indicate that both of the fabrication methods investigated are suitable for primary structural applications on future civil and military supersonic cruise aircraft.

  14. Automated Simultaneous Assembly of Multistage Testlets for a High-Stakes Licensing Examination

    ERIC Educational Resources Information Center

    Breithaupt, Krista; Hare, Donovan R.

    2007-01-01

    Many challenges exist for high-stakes testing programs offering continuous computerized administration. The automated assembly of test questions to exactly meet content and other requirements, provide uniformity, and control item exposure can be modeled and solved by mixed-integer programming (MIP) methods. A case study of the computerized…
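
    The assembly problem this record models with mixed-integer programming — select items so that content counts exactly match a blueprint while a statistical target is optimized — can be illustrated on a toy item bank with an exhaustive search. A real program would hand the same constraints to a MIP solver; the bank, content areas, and difficulty target below are all invented for illustration.

```python
from itertools import combinations

# hypothetical item bank: (item_id, content_area, difficulty)
bank = [(1, 'algebra', 0.3), (2, 'algebra', 0.5), (3, 'geometry', 0.4),
        (4, 'geometry', 0.7), (5, 'algebra', 0.6), (6, 'geometry', 0.2)]

def assemble(bank, size, per_area, target_difficulty):
    """Brute-force stand-in for the MIP: choose `size` items that meet
    the content blueprint exactly while minimizing the gap between the
    form's mean difficulty and the target."""
    best, best_gap = None, float('inf')
    for form in combinations(bank, size):
        counts = {}
        for _, area, _ in form:
            counts[area] = counts.get(area, 0) + 1
        if counts != per_area:          # content constraints must hold exactly
            continue
        gap = abs(sum(d for *_, d in form) / size - target_difficulty)
        if gap < best_gap:
            best, best_gap = form, gap
    return best

form = assemble(bank, 4, {'algebra': 2, 'geometry': 2}, 0.45)
```

    In the MIP formulation, each item gets a binary decision variable, the blueprint becomes equality constraints on sums of those variables, and the difficulty gap becomes the objective; item-exposure control adds further constraints across parallel forms.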

  15. An Investigation of Sample Size Splitting on ATFIND and DIMTEST

    ERIC Educational Resources Information Center

    Socha, Alan; DeMars, Christine E.

    2013-01-01

    Modeling multidimensional test data with a unidimensional model can result in serious statistical errors, such as bias in item parameter estimates. Many methods exist for assessing the dimensionality of a test. The current study focused on DIMTEST. Using simulated data, the effects of sample size splitting for use with the ATFIND procedure for…

  16. Building Energy Simulation Test for Existing Homes (BESTEST-EX) (Presentation)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Judkoff, R.; Neymark, J.; Polly, B.

    2011-12-01

    This presentation discusses the goals of NREL Analysis Accuracy R&D; BESTEST-EX goals; what BESTEST-EX is; how it works; 'Building Physics' cases; 'Building Physics' reference results; 'utility bill calibration' cases; limitations and potential future work. Goals of NREL Analysis Accuracy R&D are: (1) Provide industry with the tools and technical information needed to improve the accuracy and consistency of analysis methods; (2) Reduce the risks associated with purchasing, financing, and selling energy efficiency upgrades; and (3) Enhance software and input collection methods considering impacts on accuracy, cost, and time of energy assessments. BESTEST-EX goals are: (1) Test software predictions of retrofit energy savings in existing homes; (2) Ensure building physics calculations and utility bill calibration procedures perform up to a minimum standard; and (3) Quantify the impact of uncertainties in input audit data and occupant behavior. BESTEST-EX is a repeatable procedure that tests how well audit software predictions compare to the current state of the art in building energy simulation. There is no direct truth standard; however, the reference software tools have been subjected to validation testing, including comparisons with empirical data.

  17. Conducting Slug Tests in Mini-Piezometers: B.G. Fritz Ground Water xx, no. x: x-xx

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fritz, Bradley G.; Mackley, Rob D.; Arntzen, Evan V.

    Slug tests performed using mini-piezometers with diameters as small as 0.43 cm can provide a cost-effective tool for hydraulic characterization. We evaluated the hydraulic properties of the apparatus in an infinite hydraulic conductivity environment and compared those results with field tests of mini-piezometers installed into locations with varying hydraulic properties. Based on our evaluation, slug tests conducted in mini-piezometers using the fabrication and installation approach described here are effective within formations where the hydraulic conductivity is less than 1 × 10⁻³ cm/s. While these constraints limit the potential application of this method, the benefits of this approach are that the installation, measurement, and analysis are extremely cost-effective, and the installation can be completed in areas where other (larger diameter) methods might not be possible. Additionally, this methodology could be applied to existing mini-piezometers previously installed for other purposes. Such analysis of existing installations could be beneficial in interpreting previously collected data (e.g. water quality data or hydraulic head data).

  18. Factors that influence utilisation of HIV/AIDS prevention methods among university students residing at a selected university campus.

    PubMed

    Ndabarora, Eléazar; Mchunu, Gugu

    2014-01-01

    Various studies have reported that university students, who are mostly young people, rarely use existing HIV/AIDS preventive methods. Although studies have shown that young university students have a high degree of knowledge about HIV/AIDS and HIV modes of transmission, they are still not utilising the existing HIV prevention methods and still engage in risky sexual practices favourable to HIV. Some variables, such as awareness of existing HIV/AIDS prevention methods, have been associated with utilisation of such methods. The study aimed to explore factors that influence use of existing HIV/AIDS prevention methods among university students residing in a selected campus, using the Health Belief Model (HBM) as a theoretical framework. A quantitative research approach and an exploratory-descriptive design were used to describe perceived factors that influence utilisation by university students of HIV/AIDS prevention methods. A total of 335 students completed online and manual questionnaires. Study findings showed that the factors which influenced utilisation of HIV/AIDS prevention methods were mainly determined by awareness of the existing university-based HIV/AIDS prevention strategies. Most utilised prevention methods were voluntary counselling and testing services and free condoms. Perceived susceptibility and perceived threat of HIV/AIDS score was also found to correlate with HIV risk index score. Perceived susceptibility and perceived threat of HIV/AIDS showed correlation with self-efficacy on condoms and their utilisation. Most HBM variables were not predictors of utilisation of HIV/AIDS prevention methods among students. Intervention aiming to improve the utilisation of HIV/AIDS prevention methods among students at the selected university should focus on removing identified barriers, promoting HIV/AIDS prevention services and providing appropriate resources to implement such programmes.

  19. Development of direct-inverse 3-D methods for applied transonic aerodynamic wing design and analysis

    NASA Technical Reports Server (NTRS)

    Carlson, Leland A.

    1989-01-01

    An inverse wing design method was developed around an existing transonic wing analysis code. The original analysis code, TAWFIVE, has as its core the numerical potential flow solver, FLO30, developed by Jameson and Caughey. Features of the analysis code include a finite-volume formulation; wing and fuselage fitted, curvilinear grid mesh; and a viscous boundary layer correction that also accounts for viscous wake thickness and curvature. The development of the inverse methods as an extension of previous methods existing for design in Cartesian coordinates is presented. Results are shown for inviscid wing design cases in super-critical flow regimes. The test cases selected also demonstrate the versatility of the design method in designing an entire wing or discontinuous sections of a wing.

  20. Shrinkage regression-based methods for microarray missing value imputation.

    PubMed

    Wang, Hsiuying; Chiu, Chia-Chun; Wu, Yi-Ching; Wu, Wei-Sheng

    2013-01-01

    Missing values commonly occur in microarray data, which usually contain more than 5% missing values, with up to 90% of genes affected. Inaccurate missing value estimation reduces the power of downstream microarray data analyses. Many types of methods have been developed to estimate missing values. Among them, the regression-based methods are very popular and have been shown to perform better than the other types of methods on many testing microarray datasets. To further improve the performance of the regression-based methods, we propose shrinkage regression-based methods. Our methods take advantage of the correlation structure in the microarray data and select similar genes for the target gene by Pearson correlation coefficients. In addition, our methods incorporate the least squares principle, utilize a shrinkage estimation approach to adjust the coefficients of the regression model, and then use the new coefficients to estimate missing values. Simulation results show that the proposed methods provide more accurate missing value estimation on six testing microarray datasets than the existing regression-based methods do. Imputation of missing values is a very important aspect of microarray data analysis because most of the downstream analyses require a complete dataset. Therefore, exploring accurate and efficient methods for estimating missing values has become an essential issue. Since our proposed shrinkage regression-based methods provide accurate missing value estimation, they are competitive alternatives to the existing regression-based methods.
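
    The steps the abstract names — pick similar genes by Pearson correlation, fit least squares, shrink the coefficients — can be sketched in a few lines. This uses a ridge penalty as a simple form of coefficient shrinkage and is a generic illustration, not the authors' estimator; `k` and `lam` are illustrative choices.

```python
import numpy as np

def impute_shrinkage(data, gene, missing_cols, k=3, lam=0.1):
    """Impute missing entries of row `gene` (a gene's expression across
    samples) from its k most correlated genes, using ridge-shrunken
    least squares."""
    obs = np.setdiff1d(np.arange(data.shape[1]), missing_cols)
    target = data[gene, obs]
    others = np.delete(np.arange(data.shape[0]), gene)
    # similar genes chosen by absolute Pearson correlation on observed samples
    corr = np.array([abs(np.corrcoef(target, data[g, obs])[0, 1])
                     for g in others])
    nbrs = others[np.argsort(corr)[-k:]]
    X = data[nbrs][:, obs].T
    X1 = np.column_stack([np.ones(len(X)), X])
    # ridge penalty shrinks the least-squares coefficients
    beta = np.linalg.solve(X1.T @ X1 + lam * np.eye(X1.shape[1]),
                           X1.T @ target)
    Xm = data[nbrs][:, missing_cols].T
    return np.column_stack([np.ones(len(Xm)), Xm]) @ beta
```

    With a strongly correlated neighbor gene in the bank, the shrunken regression recovers the missing entries closely while the penalty keeps coefficients of weakly correlated genes near zero.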

  1. Sociometric Indicators of Leadership: An Exploratory Analysis

    DTIC Science & Technology

    2018-01-01

    streamline existing observational protocols and assessment methods. This research provides an initial test of sociometric badges in the context of the U.S...understand, the requirements of the mission. Traditional research and assessment methods focusing on leader and follower interactions require direct...based methods of social network analysis. Novel Measures of Leadership Building on these findings and earlier research, it is apparent that

  2. Interest and limitations of projective techniques in the assessment of personality disorders.

    PubMed

    Petot, J M

    2000-06-01

    Assessing personality disorders (PD) remains a difficult task because of persistent problems linked to the concurrent validity of existing instruments, which are all structured interviews or self-report inventories. It has been advocated that indirect methods, projective techniques in particular, can strengthen PD assessment methods. The Thematic Apperception Test (TAT) may be a significant adjuvant method of PD assessment.

  3. Local tolerance testing under REACH: Accepted non-animal methods are not on equal footing with animal tests.

    PubMed

    Sauer, Ursula G; Hill, Erin H; Curren, Rodger D; Raabe, Hans A; Kolle, Susanne N; Teubner, Wera; Mehling, Annette; Landsiedel, Robert

    2016-07-01

    In general, no single non-animal method can cover the complexity of any given animal test. Therefore, fixed sets of in vitro (and in chemico) methods have been combined into testing strategies for skin and eye irritation and skin sensitisation testing, with pre-defined prediction models for substance classification. Many of these methods have been adopted as OECD test guidelines. Various testing strategies have been successfully validated in extensive in-house and inter-laboratory studies, but they have not yet received formal acceptance for substance classification. Therefore, under the European REACH Regulation, data from testing strategies can, in general, only be used in so-called weight-of-evidence approaches. While animal testing data generated under the specific REACH information requirements are per se sufficient, the sufficiency of weight-of-evidence approaches can be questioned under the REACH system, and further animal testing can be required. This constitutes an imbalance between the regulatory acceptance of data from approved non-animal methods and animal tests that is not justified on scientific grounds. To ensure that testing strategies for local tolerance testing truly serve to replace animal testing for the REACH registration 2018 deadline (when the majority of existing chemicals have to be registered), clarity on their regulatory acceptance as complete replacements is urgently required. © 2016 FRAME.

  4. Improving Building Energy Simulation Programs Through Diagnostic Testing (Fact Sheet)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    2012-02-01

    New test procedure evaluates quality and accuracy of energy analysis tools for the residential building retrofit market. Reducing the energy use of existing homes in the United States offers significant energy-saving opportunities, which can be identified through building simulation software tools that calculate optimal packages of efficiency measures. To improve the accuracy of energy analysis for residential buildings, the National Renewable Energy Laboratory's (NREL) Buildings Research team developed the Building Energy Simulation Test for Existing Homes (BESTEST-EX), a method for diagnosing and correcting errors in building energy audit software and calibration procedures. BESTEST-EX consists of building physics and utility bill calibration test cases, which software developers can use to compare their tools' simulation findings to reference results generated with state-of-the-art simulation tools. Overall, the BESTEST-EX methodology: (1) Tests software predictions of retrofit energy savings in existing homes; (2) Ensures building physics calculations and utility bill calibration procedures perform to a minimum standard; and (3) Quantifies impacts of uncertainties in input audit data and occupant behavior. BESTEST-EX is helping software developers identify and correct bugs in their software, as well as develop and test utility bill calibration procedures.

  5. Applicability of ambient toxicity testing to national or regional water-quality assessment

    USGS Publications Warehouse

    Elder, J.F.

    1989-01-01

    Comprehensive assessment of the quality of natural waters requires a multifaceted approach. Based on experimentation designed to monitor responses of organisms to environmental stresses, toxicity testing may serve diverse purposes in water quality assessments. These purposes may include identification of sites that warrant further study because of poor water quality or unusual ecological features, verification of other types of monitoring, or assessment of contaminant effects on aquatic communities. A wide variety of toxicity test methods have been developed to fulfill the needs of diverse applications. The methods differ primarily in the selections made relative to four characteristics: (1) test species, (2) endpoints (acute or chronic), (3) test enclosure type, and (4) test substance (toxicant) that functions as the environmental stress. Toxicity test approaches vary in their capacity to meet the needs of large-scale assessments of existing water quality. Ambient testing is more likely to meet these needs than are procedures that call for exposure of the test organisms to known concentrations of a single toxicant. However, meaningful interpretation of ambient test results depends on the existence of accompanying chemical analyses of the ambient media. The ambient test substance may be water or sediments. Sediment tests have had limited application, but they are useful because most toxicants tend to accumulate in sediments, and many test species either inhabit the sediments or are in frequent contact with them. Biochemical testing methods, which have been developing rapidly in recent years, are likely to be among the most useful procedures for large-scale water quality assessments. They are relatively rapid and simple, and more importantly, they focus on biochemical changes that are the initial responses of virtually all organisms to environmental stimuli. Most species are sensitive to relatively few toxicants, and their sensitivities vary as conditions change. One of the most informative approaches to toxicity testing is to combine biochemical tests with other test methods in a 'battery of tests' that is diversified enough to characterize different types of toxicants and different trophic levels. (Lantz-PTT)

  6. A Method of Retrospective Computerized System Validation for Drug Manufacturing Software Considering Modifications

    NASA Astrophysics Data System (ADS)

    Takahashi, Masakazu; Fukue, Yoshinori

    This paper proposes a Retrospective Computerized System Validation (RCSV) method for Drug Manufacturing Software (DMSW) that takes software modification into account. Because DMSW used for quality management and facility control has a major impact on drug quality, regulatory agencies require proof of the adequacy of DMSW functions and performance, based on development documents and test results. In particular, the work of demonstrating the adequacy of previously developed DMSW based on existing documents and operational records is called RCSV. When modifying DMSW that has already undergone RCSV, it was difficult to secure consistency between the development documents and test results for the modified DMSW parts and the existing documents and operational records for the non-modified parts, which made conducting RCSV difficult. In this paper, we propose (a) a definition of the document architecture, (b) a definition of the descriptive items and levels in the documents, (c) management of design information using a database, (d) exhaustive testing, and (e) an integrated RCSV procedure. As a result, we could conduct adequate RCSV while securing consistency.

  7. A novel statistical approach for identification of the master regulator transcription factor.

    PubMed

    Sikdar, Sinjini; Datta, Susmita

    2017-02-02

    Transcription factors are known to play key roles in carcinogenesis and, therefore, are gaining popularity as potential therapeutic targets in drug development. A 'master regulator' transcription factor often appears to control most of the regulatory activities of the other transcription factors and the associated genes. This 'master regulator' transcription factor is at the top of the hierarchy of transcriptomic regulation. Therefore, it is important to identify and target the master regulator transcription factor for proper understanding of the associated disease process and for identifying the best therapeutic option. We present a novel two-step computational approach for identification of the master regulator transcription factor in a genome. In the first step of our method, we test whether there exists any master regulator transcription factor in the system by evaluating the concordance of two ranked lists of transcription factors using a statistical measure. If the concordance measure is statistically significant, we conclude that there is a master regulator. In the second step, our method identifies the master regulator transcription factor, if one exists. In simulations, our method performs reasonably well in validating the existence of a master regulator when the number of subjects in each treatment group is reasonably large. In applications to two real datasets, our method confirms the existence of master regulators and identifies biologically meaningful ones. An R code for implementing our method on a sample test dataset can be found at http://www.somnathdatta.org/software. We have developed a screening method for identifying the 'master regulator' transcription factor using only gene expression data. Understanding the regulatory structure and finding the master regulator help narrow the search space for identifying biomarkers for complex diseases such as cancer. In addition to identifying the master regulator, our method provides an overview of the regulatory structure of the transcription factors that control the global gene expression profiles and, consequently, cell functioning.
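
    The first step — deciding whether a master regulator exists by testing the concordance of two ranked lists — can be illustrated with Kendall's tau and its normal-approximation significance test. The paper does not specify its concordance measure here, so tau is a generic stand-in; the ten-element rankings are invented.

```python
import math
from itertools import combinations

def kendall_concordance(rank_a, rank_b):
    """Kendall's tau between two rankings of the same transcription
    factors, with a two-sided normal-approximation p-value for the
    null hypothesis of no concordance."""
    n = len(rank_a)
    concordant = discordant = 0
    for i, j in combinations(range(n), 2):
        s = (rank_a[i] - rank_a[j]) * (rank_b[i] - rank_b[j])
        if s > 0:
            concordant += 1
        elif s < 0:
            discordant += 1
    tau = (concordant - discordant) / (n * (n - 1) / 2)
    # z-score from var(tau) = 2(2n+5) / (9n(n-1)) under the null
    z = 3 * tau * math.sqrt(n * (n - 1)) / math.sqrt(2 * (2 * n + 5))
    p = math.erfc(abs(z) / math.sqrt(2))
    return tau, p

# perfectly concordant rankings of ten hypothetical TFs
tau, p = kendall_concordance(list(range(10)), list(range(10)))
```

    A significant p-value supports the existence of a master regulator; step two would then look at which factor sits at the top of both lists.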

  8. Biodegradability standards for carrier bags and plastic films in aquatic environments: a critical review

    PubMed Central

    Boardman, Carl; O'Callaghan, Kenneth; Delort, Anne-Marie; Song, Jim

    2018-01-01

    Plastic litter is encountered in aquatic ecosystems across the globe, including polar environments and the deep sea. To mitigate the adverse societal and ecological impacts of this waste, there has been debate on whether ‘biodegradable' materials should be granted exemptions from plastic bag bans and levies. However, great care must be exercised when attempting to define this term, due to the broad and complex range of physical and chemical conditions encountered within natural ecosystems. Here, we review existing international industry standards and regional test methods for evaluating the biodegradability of plastics within aquatic environments (wastewater, unmanaged freshwater and marine habitats). We argue that current standards and test methods are insufficient in their ability to realistically predict the biodegradability of carrier bags in these environments, due to several shortcomings in experimental procedures and a paucity of information in the scientific literature. Moreover, existing biodegradability standards and test methods for aquatic environments do not involve toxicity testing or account for the potentially adverse ecological impacts of carrier bags, plastic additives, polymer degradation products or small (microscopic) plastic particles that can arise via fragmentation. Successfully addressing these knowledge gaps is a key requirement for developing new biodegradability standard(s) for lightweight carrier bags. PMID:29892374

  9. A comparison of stimulus presentation methods in temporal discrimination testing.

    PubMed

    Mc Govern, Eavan M; Butler, John S; Beiser, Ines; Williams, Laura; Quinlivan, Brendan; Narasiham, Shruti; Beck, Rebecca; O'Riordan, Sean; Reilly, Richard B; Hutchinson, Michael

    2017-02-01

    The temporal discrimination threshold (TDT) is the shortest time interval at which an individual detects two stimuli to be asynchronous (normal = 30-50 ms). It has been shown to be abnormal in patients with disorders affecting the basal ganglia, including adult onset idiopathic focal dystonia (AOIFD). Up to 97% of patients have an abnormal TDT, with age- and sex-related penetrance in unaffected relatives, demonstrating an autosomal dominant inheritance pattern. These findings support the use of the TDT as a pre-clinical biomarker for AOIFD. The usual stimulus presentation method involves the presentation of progressively asynchronous stimuli; the interval at which three sequential stimuli are reported as asynchronous is taken as a participant's TDT. To investigate the robustness of this 'staircase' method of presentation, we introduced a randomised presentation order to explore any potential 'learning effect' that may be associated with the existing method. The aim of this study was to investigate differences in temporal discrimination between the two methods of stimulus presentation. Thirty healthy volunteers were recruited to the study (mean age 33.73 ± 3.4 years). Visual and tactile TDT testing using staircase and randomised methods of presentation order was carried out in a single session. There was a strong relationship between the staircase and random methods for TDT values. This observed consistency between testing methods suggests that the existing experimental approach is a robust method of recording an individual's TDT. In addition, our newly devised randomised paradigm is a reproducible and more efficient method for data acquisition in the clinic setting. However, the two presentation methods yield different absolute TDT results, and either of the two methods should be used uniformly for all participants in any one particular study.
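
    The staircase procedure described here can be simulated: the asynchrony grows in fixed steps until a noisy observer reports 'asynchronous' on three consecutive trials, and the first interval of that run is taken as the TDT. The observer model, step size, and noise level below are invented for illustration, not taken from the study.

```python
import random

def staircase_tdt(observer, start=0, step=5, max_ms=200):
    """Staircase presentation: increase the stimulus asynchrony until
    the observer reports 'asynchronous' three trials in a row; the
    first interval of that run is returned as the TDT (in ms)."""
    streak_start, streak = None, 0
    t = start
    while t <= max_ms:
        if observer(t):
            if streak == 0:
                streak_start = t
            streak += 1
            if streak == 3:
                return streak_start
        else:
            streak = 0          # a synchronous report resets the run
        t += step
    return None

# hypothetical observer: true threshold 40 ms, trial-to-trial noise
rng = random.Random(1)
observer = lambda t: t + rng.gauss(0, 3) > 40
tdt = staircase_tdt(observer)   # lands near the 40 ms true threshold
```

    The randomised variant would present the same set of intervals in shuffled order and estimate the threshold from the full response pattern, removing any order-dependent learning effect.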

  10. Requirements-Driven Log Analysis Extended Abstract

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus

    2012-01-01

    Imagine that you are tasked to help a project improve their testing effort. In a realistic scenario it will quickly become clear that having an impact is difficult. First of all, it will likely be a challenge to suggest an alternative approach which is significantly more automated and/or more effective than current practice. The reality is that an average software system has a complex input/output behavior. An automated testing approach will have to auto-generate test cases, each being a pair (i, o) consisting of a test input i and an oracle o. The test input i has to be somewhat meaningful, and the oracle o can be very complicated to compute. Second, even in the case where some testing technology has been developed that might improve current practice, it is likely difficult to completely change the current behavior of the testing team unless the technique is obviously superior and does everything already done by existing technology. So is there an easier way to incorporate formal methods-based approaches than a full-fledged test revolution? Fortunately the answer is affirmative. A relatively simple approach is to benefit from possibly already existing logging infrastructure, which after all is part of most systems put in production. A log is a sequence of events, generated by special log recording statements, most often manually inserted in the code by the programmers. An event can be considered as a data record: a mapping from field names to values. We can analyze such a log using formal methods, for example checking it against a formal specification. This separates running the system from analyzing its behavior. It is not meant as an alternative to testing since it does not address the important input generation problem. However, it offers a solution which testing teams might accept since it has low impact on the existing process. A single person might be assigned to perform such log analysis, rather than the entire testing team having to change its behavior.
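    The idea of checking a log (a sequence of field-to-value records) against a specification can be sketched in a few lines. The event schema (`op`/`resource` fields) and the property checked (every opened resource must later be closed) are illustrative assumptions, not the paper's specification language.

    ```python
    def check_log(events):
        """Check a log (list of event dicts) against a simple safety property:
        no resource is closed without having been opened, and every opened
        resource is eventually closed. Returns a list of violations."""
        open_resources = set()
        violations = []
        for i, e in enumerate(events):
            if e["op"] == "open":
                open_resources.add(e["resource"])
            elif e["op"] == "close":
                if e["resource"] not in open_resources:
                    violations.append((i, "close without open", e["resource"]))
                open_resources.discard(e["resource"])
        for r in open_resources:
            violations.append((len(events), "never closed", r))
        return violations

    log = [{"op": "open", "resource": "f1"},
           {"op": "close", "resource": "f1"},
           {"op": "open", "resource": "f2"}]
    print(check_log(log))  # [(3, 'never closed', 'f2')]
    ```

    In practice the specification would come from a runtime-verification formalism (temporal logic, state machines) rather than hand-written loops, but the separation of running versus analyzing is the same.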

  11. Unified Least Squares Methods for the Evaluation of Diagnostic Tests With the Gold Standard

    PubMed Central

    Tang, Liansheng Larry; Yuan, Ao; Collins, John; Che, Xuan; Chan, Leighton

    2017-01-01

    The article proposes a unified least squares method to estimate the receiver operating characteristic (ROC) parameters for continuous and ordinal diagnostic tests, such as cancer biomarkers. The method is based on a linear model framework using the empirically estimated sensitivities and specificities as input “data.” It gives consistent estimates for regression and accuracy parameters when the underlying continuous test results are normally distributed after some monotonic transformation. The key difference between the proposed method and the method of Tang and Zhou lies in the response variable. The response variable in the latter is the transformed empirical ROC curve at different thresholds. It takes on many values for continuous test results, but few values for ordinal test results. The limited number of values for the response variable makes it impractical for ordinal data. However, the response variable in the proposed method takes on many more distinct values, so the method yields valid estimates for ordinal data. Extensive simulation studies are conducted to investigate and compare the finite sample performance of the proposed method with an existing method, and the method is then used to analyze two real cancer diagnostic examples as an illustration. PMID:28469385
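    To make the linear-model framing concrete, here is a minimal sketch of least-squares estimation under the standard binormal ROC model, where probit-transformed operating points fall on a line: Φ⁻¹(TPR) = a + b·Φ⁻¹(FPR). This is the generic binormal construction, not the authors' exact estimator.

    ```python
    from statistics import NormalDist

    def fit_binormal_roc(fpr, tpr):
        """Ordinary least-squares fit of probit(TPR) = a + b * probit(FPR)
        to empirical ROC operating points."""
        nd = NormalDist()
        x = [nd.inv_cdf(f) for f in fpr]   # probit-transformed 1 - specificity
        y = [nd.inv_cdf(t) for t in tpr]   # probit-transformed sensitivity
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
        a = my - b * mx
        return a, b

    # Operating points generated exactly from a binormal ROC with a = 1.0, b = 1.0
    nd = NormalDist()
    fpr = [0.05, 0.1, 0.2, 0.4, 0.6]
    tpr = [nd.cdf(1.0 + 1.0 * nd.inv_cdf(f)) for f in fpr]
    a, b = fit_binormal_roc(fpr, tpr)
    print(round(a, 3), round(b, 3))  # 1.0 1.0
    ```

    With empirical sensitivities and specificities the points would scatter around the line, and the slope and intercept become the ROC accuracy parameters to be estimated.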

  12. Testing high SPF sunscreens: a demonstration of the accuracy and reproducibility of the results of testing high SPF formulations by two methods and at different testing sites.

    PubMed

    Agin, Patricia Poh; Edmonds, Susan H

    2002-08-01

    The goals of this study were (i) to demonstrate that existing and widely used sun protection factor (SPF) test methodologies can produce accurate and reproducible results for high SPF formulations and (ii) to provide data on the number of test-subjects needed, the variability of the data, and the appropriate exposure increments needed for testing high SPF formulations. Three high SPF formulations were tested, according to the Food and Drug Administration's (FDA) 1993 tentative final monograph (TFM) 'very water resistant' test method and/or the 1978 proposed monograph 'waterproof' test method, within one laboratory. A fourth high SPF formulation was tested at four independent SPF testing laboratories, using the 1978 waterproof SPF test method. All laboratories utilized xenon arc solar simulators. The data illustrate that the testing conducted within one laboratory, following either the 1978 proposed or the 1993 TFM SPF test method, was able to reproducibly determine the SPFs of the formulations tested, using either the statistical analysis method in the proposed monograph or the statistical method described in the TFM. When one formulation was tested at four different laboratories, the anticipated variation in the data owing to the equipment and other operational differences was minimized through the use of the statistical method described in the 1993 monograph. The data illustrate that either the 1978 proposed monograph SPF test method or the 1993 TFM SPF test method can provide accurate and reproducible results for high SPF formulations. Further, these results can be achieved with panels of 20-25 subjects with an acceptable level of variability. Utilization of the statistical controls from the 1993 sunscreen monograph can help to minimize lab-to-lab variability for well-formulated products.

  13. 26 CFR 53.4942(b)-2 - Alternative tests.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... methods exist (such as historical objects or buildings, certain works of art, and botanical gardens). In... museums and schools for public display. These paintings constitute 80 percent of Z's assets. Under these...

  14. 26 CFR 53.4942(b)-2 - Alternative tests.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... methods exist (such as historical objects or buildings, certain works of art, and botanical gardens). In... museums and schools for public display. These paintings constitute 80 percent of Z's assets. Under these...

  15. Satellite estimation of incident photosynthetically active radiation using ultraviolet reflectance

    NASA Technical Reports Server (NTRS)

    Eck, Thomas F.; Dye, Dennis G.

    1991-01-01

    A new satellite remote sensing method for estimating the amount of photosynthetically active radiation (PAR, 400-700 nm) incident at the earth's surface is described and tested. Potential incident PAR for clear sky conditions is computed from an existing spectral model and is then reduced according to cloud attenuation inferred from satellite-measured UV reflectance. A major advantage of the UV approach over existing visible band approaches to estimating insolation is the improved ability to discriminate clouds from high-albedo background surfaces. UV spectral reflectance data from the Total Ozone Mapping Spectrometer (TOMS) were used to test the approach for three climatically distinct, midlatitude locations. Estimates of monthly total incident PAR from the satellite technique differed from values computed from ground-based pyranometer measurements by less than 6 percent. This UV remote sensing method can be applied to estimate PAR insolation over ocean and land surfaces which are free of ice and snow.
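    The clear-sky-model-plus-cloud-attenuation structure can be sketched as below. The linear mapping from UV reflectance to cloud transmittance and the reflectance endpoints `r_clear` and `r_overcast` are placeholder assumptions for illustration; the paper's actual attenuation relation and coefficients are not reproduced here.

    ```python
    def estimate_par(par_clear, uv_reflectance, r_clear=0.08, r_overcast=0.60):
        """Scale modelled clear-sky PAR by a cloud transmittance factor inferred
        from UV reflectance (illustrative linear relation; r_clear and
        r_overcast are hypothetical values, not those from the paper)."""
        # Fraction of the clear-to-overcast reflectance range occupied by the scene
        cloud_fraction = (uv_reflectance - r_clear) / (r_overcast - r_clear)
        cloud_fraction = min(max(cloud_fraction, 0.0), 1.0)
        transmittance = 1.0 - cloud_fraction  # fully overcast -> ~0 transmission (simplification)
        return par_clear * transmittance

    print(estimate_par(par_clear=100.0, uv_reflectance=0.08))  # clear scene: 100.0
    print(estimate_par(par_clear=100.0, uv_reflectance=0.34))  # partly cloudy: 50.0
    ```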

  16. Water Mapping Using Multispectral Airborne LIDAR Data

    NASA Astrophysics Data System (ADS)

    Yan, W. Y.; Shaker, A.; LaRocque, P. E.

    2018-04-01

    This study investigates the use of the world's first multispectral airborne LiDAR sensor, Optech Titan, manufactured by Teledyne Optech, for automatic land-water classification with a particular focus on near-shore regions and river environments. Although recent studies have utilized airborne LiDAR data for shoreline detection and water surface mapping, the majority of them only perform experimental testing on clipped data subsets or rely on data fusion with aerial/satellite imagery. In addition, most of the existing approaches require manual intervention or existing tidal/datum data for sample collection of training data. To tackle the drawbacks of previous approaches, we propose and develop an automatic data processing workflow for land-water classification using multispectral airborne LiDAR data. Depending on the nature of the study scene, two methods are proposed for automatic training data selection. The first method utilizes the elevation/intensity histogram fitted with a Gaussian mixture model (GMM) to preliminarily split the land and water bodies. The second method mainly relies on the use of a newly developed scan line intensity-elevation ratio (SLIER) to estimate the water surface data points. Regardless of the training method being used, feature spaces can be constructed using the multispectral LiDAR intensity, elevation, and other features derived from these parameters. The comprehensive workflow was tested with two datasets collected for different near-shore region and river environments, where the overall accuracy was better than 96%.
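    The GMM histogram split can be illustrated with a hand-rolled EM fit of a two-component 1-D Gaussian mixture to elevations, separating a low water-surface mode from a higher land mode. The synthetic data and component count are assumptions for demonstration; the paper fits GMMs to elevation/intensity histograms of real LiDAR returns.

    ```python
    import math
    import random

    def fit_gmm_1d(data, iters=100):
        """EM for a two-component 1-D Gaussian mixture (illustrative stand-in
        for the histogram-based GMM split described in the paper)."""
        xs = sorted(data)
        n = len(xs)
        mu = [xs[n // 4], xs[3 * n // 4]]      # initialise means at the quartiles
        var = [1.0, 1.0]
        w = [0.5, 0.5]
        for _ in range(iters):
            # E-step: posterior responsibility of each component for each point
            resp = []
            for x in xs:
                p = [w[k] / math.sqrt(2 * math.pi * var[k])
                     * math.exp(-(x - mu[k]) ** 2 / (2 * var[k])) for k in (0, 1)]
                s = p[0] + p[1]
                resp.append((p[0] / s, p[1] / s))
            # M-step: re-estimate weights, means, and variances
            for k in (0, 1):
                nk = sum(r[k] for r in resp)
                w[k] = nk / n
                mu[k] = sum(r[k] * x for r, x in zip(resp, xs)) / nk
                var[k] = sum(r[k] * (x - mu[k]) ** 2 for r, x in zip(resp, xs)) / nk + 1e-9
        return w, mu, var

    random.seed(0)
    # Synthetic bimodal elevations: water surface near 0 m, land near 10 m
    elev = ([random.gauss(0.0, 0.3) for _ in range(300)]
            + [random.gauss(10.0, 1.0) for _ in range(300)])
    w, mu, var = fit_gmm_1d(elev)
    print(sorted(round(m, 1) for m in mu))  # means near 0 and 10
    ```

    The crossing point of the two fitted Gaussians then serves as an automatic land-water threshold, which is what makes the training-data selection unsupervised.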

  17. A sequential test for assessing observed agreement between raters.

    PubMed

    Bersimis, Sotiris; Sachlas, Athanasios; Chakraborti, Subha

    2018-01-01

    Assessing the agreement between two or more raters is an important topic in medical practice. Existing techniques, which deal with categorical data, are based on contingency tables. This is often an obstacle in practice, as we have to wait a long time to collect the appropriate sample size of subjects to construct the contingency table. In this paper, we introduce a nonparametric sequential test for assessing agreement, which can be applied as data accrue and does not require a contingency table, facilitating rapid assessment of the agreement. The proposed test is based on the cumulative sum of the number of disagreements between the two raters and a suitable statistic representing the waiting time until the cumulative sum exceeds a predefined threshold. We treat the cases of testing two raters' agreement with respect to one or more characteristics and using two or more classification categories, the case where the two raters extremely disagree, and finally the case of testing more than two raters' agreement. The numerical investigation shows that the proposed test has excellent performance. Compared to the existing methods, the proposed method appears to require a significantly smaller sample size with equivalent power. Moreover, the proposed method is easily generalizable and brings the problems of assessing the agreement between two or more raters on one or more characteristics under a unified framework, thus providing an easy-to-use tool for medical practitioners. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
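    The cumulative-sum-of-disagreements idea can be sketched as below. The threshold `h`, hypothesised disagreement rate `p0`, and the simple "too-early" decision rule are placeholder choices for illustration, not the authors' calibrated waiting-time statistic.

    ```python
    def sequential_agreement(ratings, h=5, p0=0.1):
        """Illustrative sequential check: accumulate disagreements between two
        raters as subjects accrue and flag lack of agreement if the h-th
        disagreement arrives much sooner than expected under a hypothesised
        disagreement probability p0 (h, p0, and the 0.5 factor are
        hypothetical, not the paper's calibrated procedure)."""
        expected_wait = h / p0            # expected subjects until h disagreements under H0
        disagreements = 0
        for n, (r1, r2) in enumerate(ratings, start=1):
            if r1 != r2:
                disagreements += 1
                if disagreements >= h:
                    verdict = ("reject agreement" if n < 0.5 * expected_wait
                               else "no evidence against agreement")
                    return verdict, n
        return "no evidence against agreement", len(ratings)

    # Two raters disagreeing on every other subject (50% disagreement rate)
    pairs = [(1, 1), (1, 2)] * 20
    print(sequential_agreement(pairs))  # ('reject agreement', 10)
    ```

    The practical appeal is visible even in the sketch: the decision arrives at subject 10, long before a contingency-table analysis would have a usable sample.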

  18. [Establishment of Assessment Method for Air Bacteria and Fungi Contamination].

    PubMed

    Zhang, Hua-ling; Yao, Da-jun; Zhang, Yu; Fang, Zi-liang

    2016-03-15

    In this paper, in order to address existing problems in the assessment of air bacteria and fungi contamination, indoor and outdoor air bacteria and fungi field concentrations measured by the impact method and the settlement method in existing documents were collected and analyzed. The chi-square goodness-of-fit test was then used to assess whether these concentration data obeyed a normal distribution at a significance level of α = 0.05, and, combined with the 3σ principle of the normal distribution and the current assessment standards, suggested ranges of air microbial concentrations were determined. The research results could provide a reference for developing air bacteria and fungi contamination assessment standards in the future.
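    The two statistical steps named above (a chi-square goodness-of-fit check for normality, then a 3σ reference range) can be sketched as follows. The synthetic concentration data and the bin edges are assumptions for illustration; the degrees of freedom and critical value for a real test depend on the binning and estimated parameters.

    ```python
    import math
    import random
    from statistics import mean, stdev

    def normal_cdf(x, mu, sigma):
        return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

    def chi_square_gof_normal(data, bins):
        """Chi-square goodness-of-fit statistic for normality over given bin
        edges, plus the 3-sigma reference range (a minimal sketch of the
        procedure mentioned in the abstract)."""
        mu, sigma = mean(data), stdev(data)
        n = len(data)
        stat = 0.0
        for lo, hi in zip(bins[:-1], bins[1:]):
            observed = sum(lo <= x < hi for x in data)
            expected = n * (normal_cdf(hi, mu, sigma) - normal_cdf(lo, mu, sigma))
            if expected > 0:
                stat += (observed - expected) ** 2 / expected
        return stat, (mu - 3 * sigma, mu + 3 * sigma)

    random.seed(1)
    sample = [random.gauss(500, 100) for _ in range(500)]   # synthetic CFU/m^3 data
    stat, (lo, hi) = chi_square_gof_normal(sample, bins=[200, 350, 450, 550, 650, 800])
    print(round(stat, 1), round(lo), round(hi))
    ```

    If the statistic is below the chi-square critical value for the appropriate degrees of freedom, normality is not rejected and the 3σ range can serve as a suggested concentration limit.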

  19. Validation in Support of Internationally Harmonised OECD Test Guidelines for Assessing the Safety of Chemicals.

    PubMed

    Gourmelon, Anne; Delrue, Nathalie

    Ten years have elapsed since the OECD published the Guidance document on the validation and international regulatory acceptance of test methods for hazard assessment. Much experience has been gained since then in validation centres, in countries, and at the OECD on a variety of test methods that were subjected to validation studies. This chapter reviews validation principles and highlights common features across studies that appear to be important for regulatory acceptance. Existing OECD-agreed validation principles will most likely remain relevant and applicable to the challenges associated with the validation of future test methods. Some adaptations may be needed to take into account the level of technology introduced in test systems, but demonstration of relevance and reliability will continue to play a central role as a prerequisite for regulatory acceptance. Demonstration of relevance will become more challenging for test methods that form part of a set of predictive tools and methods and that do not stand alone. The OECD is keen to ensure that, while these concepts evolve, countries can continue to rely on valid methods and harmonised approaches for efficient testing and assessment of chemicals.

  20. 40 CFR 98.54 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... in paragraphs (b)(1) through (b)(3) of this section. (1) EPA Method 320, Measurement of Vapor Phase...) Direct measurement (such as using flow meters or weigh scales). (2) Existing plant procedures used for accounting purposes. (d) You must conduct all required performance tests according to the methods in § 98.54...

  1. An automatic and accurate method of full heart segmentation from CT image based on linear gradient model

    NASA Astrophysics Data System (ADS)

    Yang, Zili

    2017-07-01

    Heart segmentation is an important auxiliary method in the diagnosis of many heart diseases, such as coronary heart disease and atrial fibrillation, and in the planning of tumor radiotherapy. Most of the existing methods for full heart segmentation treat the heart as a whole and cannot accurately extract the bottom of the heart. In this paper, we propose a new method based on a linear gradient model to segment the whole heart from CT images automatically and accurately. Twelve cases were used to evaluate this method; accurate segmentation results were achieved and confirmed by clinical experts. The results can provide reliable clinical support.

  2. Formal methods for test case generation

    NASA Technical Reports Server (NTRS)

    Rushby, John (Inventor); De Moura, Leonardo Mendonga (Inventor); Hamon, Gregoire (Inventor)

    2011-01-01

    The invention relates to the use of model checkers to generate efficient test sets for hardware and software systems. The method provides for extending existing tests to reach new coverage targets; searching *to* some or all of the uncovered targets in parallel; searching in parallel *from* some or all of the states reached in previous tests; and slicing the model relative to the current set of coverage targets. The invention provides efficient test case generation and test set formation. Deep regions of the state space can be reached within allotted time and memory. The approach has been applied to use of the model checkers of SRI's SAL system and to model-based designs developed in Stateflow. Stateflow models achieving complete state and transition coverage in a single test case are reported.

  3. Principles and Applications of Ultrasonic-Based Nondestructive Methods for Self-Healing in Cementitious Materials

    PubMed Central

    Ahn, Eunjong; Kim, Hyunjun; Sim, Sung-Han; Shin, Sung Woo; Shin, Myoungsu

    2017-01-01

    Recently, self-healing technologies have emerged as a promising approach to extend the service life of social infrastructure in the field of concrete construction. However, current evaluations of the self-healing technologies developed for cementitious materials are mostly limited to lab-scale experiments to inspect changes in surface crack width (by optical microscopy) and permeability. Furthermore, there is a universal lack of unified test methods to assess the effectiveness of self-healing technologies. Particularly, with respect to the self-healing of concrete applied in actual construction, nondestructive test methods are required to avoid interrupting the use of the structures under evaluation. This paper presents a review of all existing research on the principles of ultrasonic test methods and case studies pertaining to self-healing concrete. The main objective of the study is to examine the applicability and limitation of various ultrasonic test methods in assessing the self-healing performance. Finally, future directions on the development of reliable assessment methods for self-healing cementitious materials are suggested. PMID:28772640

  4. Applicability of ambient toxicity testing to national or regional water-quality assessment

    USGS Publications Warehouse

    Elder, John F.

    1990-01-01

    Comprehensive assessment of the quality of natural waters requires a multifaceted approach. Descriptions of existing conditions may be achieved by various kinds of chemical and hydrologic analyses, whereas information about the effects of such conditions on living organisms depends on biological monitoring. Toxicity testing is one type of biological monitoring that can be used to identify possible effects of toxic contaminants. Based on experimentation designed to monitor responses of organisms to environmental stresses, toxicity testing may have diverse purposes in water-quality assessments. These purposes may include identification of areas that warrant further study because of poor water quality or unusual ecological features, verification of other types of monitoring, or assessment of contaminant effects on aquatic communities. Toxicity-test results are most effective when used as a complement to chemical analyses, hydrologic measurements, and other biological monitoring. However, all toxicity-testing procedures have certain limitations that must be considered in developing the methodology and applications of toxicity testing in any large-scale water-quality-assessment program. A wide variety of toxicity-test methods have been developed to fulfill the needs of diverse applications. The methods differ primarily in the selections made relative to four characteristics: (1) test species, (2) endpoint (acute or chronic), (3) test-enclosure type, and (4) test substance (toxicant) that functions as the environmental stress. Toxicity-test approaches vary in their capacity to meet the needs of large-scale assessments of existing water quality. Ambient testing, whereby the test organism is exposed to naturally occurring substances that contain toxicant mixtures in an organic or inorganic matrix, is more likely to meet these needs than are procedures that call for exposure of the test organisms to known concentrations of a single toxicant. 
However, meaningful interpretation of ambient test results depends on the existence of accompanying chemical analysis of the ambient media. The ambient test substance may be water or sediments. Sediment tests have had limited application, but they are useful because most toxicants tend to accumulate in sediments and many test species either inhabit the sediments or are in frequent contact with them. Biochemical testing methods, which have been developing rapidly in recent years, are likely to be among the most useful procedures for large-scale water-quality assessments. They are relatively rapid and simple, and, more importantly, they focus on biochemical changes that are the initial responses of virtually all organisms to environmental stimuli. Most species are sensitive to relatively few toxicants, and their sensitivities vary as conditions change. Therefore, each test method has particular uses and limitations, and no single test has universal applicability. One of the most informative approaches to toxicity testing is to combine biochemical tests with other test methods in a 'battery of tests' that is diversified enough to characterize different types of toxicants and different trophic levels. However, such an approach can be costly, and if not carefully designed, it may not yield enough additional information to warrant the additional cost. The application of toxicity tests to large-scale water-quality assessments is hampered by a number of difficulties. Toxicity tests often are not sensitive enough to enable detection of most contaminant problems in the natural environment. Furthermore, because sensitivities among different species and test conditions can be highly variable, conclusions about the toxicant problems of an ecosystem are strongly dependent on the test procedure used. 
In addition, the experimental systems used in toxicity tests cannot replicate the complexity or variability of natural conditions, and positive test results cannot identify the source or nature of

  5. Viability of Existing INL Facilities for Dry Storage Cask Handling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Randy Bohachek; Charles Park; Bruce Wallace

    2013-04-01

    This report evaluates existing capabilities at the INL to determine if a practical and cost effective method could be developed for opening and handling full-sized dry storage casks. The Idaho Nuclear Technology and Engineering Center (INTEC) CPP-603, Irradiated Spent Fuel Storage Facility, provides the infrastructure to support handling and examining casks and their contents. Based on a reasonable set of assumptions, it is possible to receive, open, inspect, remove samples, close, and reseal large bolted-lid dry storage casks at the INL. The capability can also be used to open and inspect casks that were last examined at the TAN Hot Shop over ten years ago. The Castor V/21 and REA-2023 casks can provide additional confirmatory information regarding the extended performance of low-burnup (<45 GWD/MTU) used nuclear fuel. Once a dry storage cask is opened inside CPP-603, used fuel retrieved from the cask can be packaged in a shipping cask and sent to a laboratory for testing. Testing at the INL’s Materials and Fuels Complex (MFC) can occur starting with shipment of samples from CPP-603 over an on-site road, avoiding the need to use public highways. This reduces cost and reduces the risk to the public. The full suite of characterization methods needed to establish the condition of the fuel exists at MFC. Many other testing capabilities also exist at MFC, but when those capabilities are not adequate, samples can be prepared and shipped to other laboratories for testing. This report discusses how the casks would be handled, what work needs to be done to ready the facilities/capabilities, and what the work will cost.

  6. Viability of Existing INL Facilities for Dry Storage Cask Handling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bohachek, Randy; Wallace, Bruce; Winston, Phil

    2013-04-30

    This report evaluates existing capabilities at the INL to determine if a practical and cost effective method could be developed for opening and handling full-sized dry storage casks. The Idaho Nuclear Technology and Engineering Center (INTEC) CPP-603, Irradiated Spent Fuel Storage Facility, provides the infrastructure to support handling and examining casks and their contents. Based on a reasonable set of assumptions, it is possible to receive, open, inspect, remove samples, close, and reseal large bolted-lid dry storage casks at the INL. The capability can also be used to open and inspect casks that were last examined at the TAN Hot Shop over ten years ago. The Castor V/21 and REA-2023 casks can provide additional confirmatory information regarding the extended performance of low-burnup (<45 GWD/MTU) used nuclear fuel. Once a dry storage cask is opened inside CPP-603, used fuel retrieved from the cask can be packaged in a shipping cask and sent to a laboratory for testing. Testing at the INL’s Materials and Fuels Complex (MFC) can occur starting with shipment of samples from CPP-603 over an on-site road, avoiding the need to use public highways. This reduces cost and reduces the risk to the public. The full suite of characterization methods needed to establish the condition of the fuel exists at MFC. Many other testing capabilities also exist at MFC, but when those capabilities are not adequate, samples can be prepared and shipped to other laboratories for testing. This report discusses how the casks would be handled, what work needs to be done to ready the facilities/capabilities, and what the work will cost.

  7. On the horizon: new options for contraception.

    PubMed

    Reifsnider, E

    1997-01-01

    Future contraceptives include refinements of existing contraceptives and totally new methods. New formulations of oral contraceptives, subdermal hormonal implants, injectable hormones, vaginal spermicides, and intrauterine devices (IUDs) are being tested around the world. New methods that are not yet available include the use of vaginal preparations containing sperm-immobilizing agents, gonadotrophin releasing hormone agonists and antagonists, vaccines against ova and sperm, and endogenous hormones. Male contraceptive methods use hormones to suppress testosterone and vaccines to immobilize sperm. The availability of all future contraceptives is dependent on ample funds for research, development, and testing, and such funds are in jeopardy.

  8. Supervised segmentation of microelectrode recording artifacts using power spectral density.

    PubMed

    Bakstein, Eduard; Schneider, Jakub; Sieger, Tomas; Novak, Daniel; Wild, Jiri; Jech, Robert

    2015-08-01

    Appropriate detection of clean signal segments in extracellular microelectrode recordings (MER) is vital for maintaining high signal-to-noise ratio in MER studies. Existing alternatives to manual signal inspection are based on unsupervised change-point detection. We present a method of supervised MER artifact classification, based on power spectral density (PSD) and evaluate its performance on a database of 95 labelled MER signals. The proposed method yielded test-set accuracy of 90%, which was close to the accuracy of annotation (94%). The unsupervised methods achieved accuracy of about 77% on both training and testing data.
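    A toy version of PSD-based segment classification is sketched below: a periodogram computed with a naive DFT and a single band-power feature separating a slow, high-amplitude movement artifact from broadband spiking-like activity. The one-feature threshold rule is an illustration only; the paper trains a supervised classifier on full PSD features.

    ```python
    import cmath
    import math
    import random

    def psd(signal):
        """Periodogram via a naive DFT (adequate for short illustrative segments)."""
        n = len(signal)
        return [abs(sum(x * cmath.exp(-2j * math.pi * k * t / n)
                        for t, x in enumerate(signal))) ** 2 / n
                for k in range(n // 2)]

        # (for real data, use an FFT and Welch averaging instead)

    def low_freq_power_fraction(signal, cutoff_bin=8):
        """Single PSD feature: fraction of power in the lowest frequency bins,
        skipping the DC bin. Artifacts dominated by slow drift score high."""
        p = psd(signal)
        total = sum(p[1:])
        return sum(p[1:cutoff_bin]) / total if total else 0.0

    random.seed(2)
    n = 128
    clean = [random.gauss(0, 1) for _ in range(n)]                    # broadband activity
    artifact = [5 * math.sin(2 * math.pi * 2 * t / n) + random.gauss(0, 0.2)
                for t in range(n)]                                     # slow movement artifact
    print(low_freq_power_fraction(artifact) > low_freq_power_fraction(clean))  # True
    ```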

  9. [A study on plasma non-species specific antibody in employees working in an automobile engine testing workshop].

    PubMed

    Chen, D; Wu, T; Yuan, Y

    1996-11-01

    To investigate the existence of non-species specific antibody in the plasma of employees working in an automobile engine testing workshop, and to use it as a screening marker for various hazards, the heat-stress protein antigen method and western blot technique were used. This study showed that employees working in the automobile engine testing workshop were exposed to various hazards, such as noise and toxic chemicals (carbon monoxide, lead fume, benzene, and so on), and that non-species specific antibodies against proteins 103,900 and 54,200 of rat liver existed in their plasma; these were postulated to be specific products of exposure to occupational hazards such as noise and carbon monoxide.

  10. Structural Integrity Testing Method for PRSEUS Rod-Wrap Stringer Design

    NASA Technical Reports Server (NTRS)

    Wang, John T.; Grenoble, Ray W.; Pickell, Robert D.

    2012-01-01

    NASA Langley Research Center and The Boeing Company are developing an innovative composite structural concept, called PRSEUS, for the flat center section of a future environmentally friendly hybrid wing body (HWB) aircraft. The PRSEUS (Pultruded Rod Stitched Efficient Unitized Structure) concept uses dry textile preforms for the skins, frames, and stiffener webs. The highly loaded stiffeners are made from precured unidirectional carbon/epoxy rods and dry fiber preforms. The rods are wrapped with the dry fiber preforms and a resin infusion process is used to form the rod-wrap stiffeners. The structural integrity of the rod-wrap interface is critical for maintaining the panel's high strength and bending rigidity. No standard testing method exists for testing the strength of the rod-wrap bondline. Recently, Boeing proposed a rod push-out testing method and conducted some preliminary tests using this method. This paper details an analytical study of the rod-wrap bondline. The rod-wrap interface is modeled as a cohesive zone for studying the initiation and growth of interfacial debonding during push-out testing. Based on correlations of analysis results and Boeing's test data, the adequacy of the rod-wrap testing method is evaluated, and potential approaches for improvement of the test method are proposed.
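    The cohesive-zone modeling of the bondline typically uses a traction-separation law; a common choice is the bilinear form sketched below. The parameter values are illustrative placeholders, not the PRSEUS bondline properties reported in the paper.

    ```python
    def bilinear_traction(delta, t_max=50.0, delta0=0.001, delta_f=0.01):
        """Bilinear cohesive traction-separation law of the kind commonly used
        in cohesive-zone models of interfacial debonding: linear elastic
        loading up to peak traction t_max at separation delta0, then linear
        softening to full debonding at delta_f (values are hypothetical)."""
        if delta <= 0:
            return 0.0
        if delta <= delta0:
            return t_max * delta / delta0                           # elastic loading
        if delta <= delta_f:
            return t_max * (delta_f - delta) / (delta_f - delta0)   # softening
        return 0.0                                                   # fully debonded

    # Fracture energy = area under the traction-separation curve
    g_c = 0.5 * 50.0 * 0.01
    print(bilinear_traction(0.001), round(g_c, 3))  # 50.0 0.25
    ```

    In a push-out simulation, this law governs each interface point: debond initiation corresponds to reaching the peak, and growth corresponds to points sliding down the softening branch until the fracture energy is dissipated.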

  11. Racking strength of walls : let-in corner bracing, sheet materials, and effect of loading rate

    Treesearch

    Roger L. Tuomi; David S. Gromala

    1977-01-01

    Determination of the racking strength of walls, which is a measure of a building system's ability to resist wind loads, has generally been limited to performance testing. Although a standard test method exists, deviations have often been made in speed of testing and panel configuration. The purpose of this study was to determine the relative effect of some of these...

  12. Assessment of Technologies for the Space Shuttle External Tank Thermal Protection System and Recommendations for Technology Improvement - Part III: Material Property Characterization, Analysis, and Test Methods

    NASA Technical Reports Server (NTRS)

    Gates, Thomas S.; Johnson, Theodore F.; Whitley, Karen S.

    2005-01-01

    The objective of this report is to contribute to the independent assessment of the Space Shuttle External Tank Foam Material. This report specifically addresses material modeling, characterization testing, data reduction methods, and data pedigree. A brief description of the External Tank foam materials, locations, and standard failure modes is provided to develop suitable background information. A review of mechanics based analysis methods from the open literature is used to provide an assessment of the state-of-the-art in material modeling of closed cell foams. Further, this report assesses the existing material property database and investigates sources of material property variability. The report presents identified deficiencies in testing methods and procedures, recommendations for additional testing as required, identification of near-term improvements that should be pursued, and long-term capabilities or enhancements that should be developed.

  13. Optical methods for non-contact measurements of membranes

    NASA Astrophysics Data System (ADS)

    Roose, S.; Stockman, Y.; Rochus, P.; Kuhn, T.; Lang, M.; Baier, H.; Langlois, S.; Casarosa, G.

    2009-11-01

    Structures for space applications very often suffer stringent mass constraints. Lightweight structures are developed for this purpose, through the use of deployable and/or inflatable beams, and thin-film membranes. Their inherent properties (low mass and small thickness) preclude the use of conventional measurement methods (accelerometers and displacement transducers for example) during on-ground testing. In this context, innovative non-contact measurement methods need to be investigated for these stretched membranes. The object of the present project is to review existing measurement systems capable of measuring characteristics of membrane space-structures such as: dot-projection videogrammetry (static measurements), stereo-correlation (dynamic and static measurements), fringe projection (wrinkles) and 3D laser scanning vibrometry (dynamic measurements). Therefore, minimum requirements were given for the study in order to have representative test articles covering a wide range of applications. We present test results obtained with the different methods on our test articles.

  14. Development of test methods for textile composites

    NASA Technical Reports Server (NTRS)

    Masters, John E.; Ifju, Peter G.; Fedro, Mark J.

    1993-01-01

    NASA's Advanced Composite Technology (ACT) Program was initiated in 1990 with the purpose of developing less costly composite aircraft structures. A number of innovative materials and processes were evaluated as a part of this effort. Chief among them are composite materials reinforced with textile preforms. These new forms of composite materials bring with them potential testing problems. Methods currently in practice were developed over the years for composite materials made from prepreg tape or simple 2-D woven fabrics. A wide variety of 2-D and 3-D braided, woven, stitched, and knit preforms were suggested for application in the ACT program. The applicability of existing test methods to the wide range of emerging materials bears investigation. The overriding concern is that the values measured are accurate representations of the true material response. The ultimate objective of this work is to establish a set of test methods to evaluate the textile composites developed for the ACT Program.

  15. Vestigial Biological Structures: A Classroom-Applicable Test of Creationist Hypotheses

    ERIC Educational Resources Information Center

    Senter, Phil; Ambrocio, Zenis; Andrade, Julia B.; Foust, Katanya K.; Gaston, Jasmine E.; Lewis, Ryshonda P.; Liniewski, Rachel M.; Ragin, Bobby A.; Robinson, Khanna L.; Stanley, Shane G.

    2015-01-01

    Lists of vestigial biological structures in biology textbooks are so short that some young-Earth creationist authors claim that scientists have lost confidence in the existence of vestigial structures and can no longer identify any verifiable ones. We tested these hypotheses with a method that is easily adapted to biology classes. We used online…

  16. Generation of openEHR Test Datasets for Benchmarking.

    PubMed

    El Helou, Samar; Karvonen, Tuukka; Yamamoto, Goshiro; Kume, Naoto; Kobayashi, Shinji; Kondo, Eiji; Hiragi, Shusuke; Okamoto, Kazuya; Tamura, Hiroshi; Kuroda, Tomohiro

    2017-01-01

    openEHR is a widely used EHR specification. Given its technology-independent nature, different approaches for implementing openEHR data repositories exist. Public openEHR datasets are needed to conduct benchmark analyses over different implementations. To address their current unavailability, we propose a method for generating openEHR test datasets that can be publicly shared and used.

  17. A General Audiovisual Temporal Processing Deficit in Adult Readers with Dyslexia

    ERIC Educational Resources Information Center

    Francisco, Ana A.; Jesse, Alexandra; Groen, Margriet A.; McQueen, James M.

    2017-01-01

    Purpose: Because reading is an audiovisual process, reading impairment may reflect an audiovisual processing deficit. The aim of the present study was to test the existence and scope of such a deficit in adult readers with dyslexia. Method: We tested 39 typical readers and 51 adult readers with dyslexia on their sensitivity to the simultaneity of…

  18. 76 FR 10600 - Medicare Program; Public Meeting in Calendar Year 2011 for New Clinical Laboratory Tests Payment...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-25

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES Centers for Medicare & Medicaid Services [CMS-1347-N... method called ``cross-walking'' is used when a new test is determined to be comparable to an existing... either cross-walk or gap-fill. II. Format This meeting to receive comments and recommendations (including...

  19. A Comparison of What Is Part of Usability Testing in Three Countries

    NASA Astrophysics Data System (ADS)

    Clemmensen, Torkil

    The cultural diversity of users of technology challenges our methods for usability evaluation. In this paper we report and compare three ethnographic interview studies of what is part of a standard (typical) usability test in a company in Mumbai, Beijing and Copenhagen. At each of these three locations, we use structural and contrast questions to do a taxonomic and paradigm analysis of how a company performs a usability test. We find similar parts across the three locations. We also find different results for each location. In Mumbai, most parts of the usability test are not related to the interactive application that is tested, but to differences in user characteristics, test preparation, method, and location. In Copenhagen, considerations about the client's needs are part of a usability test. In Beijing, the only varying factor is the communication pattern and relation to the user. These results are then contrasted in a cross-cultural matrix to identify cultural themes that can help interpret results from existing laboratory research in usability test methods.

  20. Eye Dominance Predicts fMRI Signals in Human Retinotopic Cortex

    PubMed Central

    Mendola, Janine D.; Conner, Ian P.

    2009-01-01

    There have been many attempts to define eye dominance in normal subjects, but limited consensus exists, and relevant physiological data is scarce. In this study, we consider two different behavioral methods for assignment of eye dominance, and how well they predict fMRI signals evoked by monocular stimulation. Sighting eye dominance was assessed with two standard tests, the Porta Test, and a ‘hole in hand’ variation of the Miles Test. Acuity dominance was tested with a standard eye chart and with a computerized test of grating acuity. We found limited agreement between the sighting and acuity methods for assigning dominance in our individual subjects. We then compared the fMRI response generated by dominant eye stimulation to that generated by non-dominant eye, according to both methods, in 7 normal subjects. The stimulus consisted of a high contrast hemifield stimulus alternating with no stimulus in a blocked paradigm. In separate scans, we used standard techniques to label the borders of visual areas V1, V2, V3, VP, V4, V3A, and MT. These regions of interest (ROIs) were used to analyze each visual area separately. We found that percent change in fMRI BOLD signal was stronger for the dominant eye as defined by the acuity method, and this effect was significant for areas located in the ventral occipital territory (V1v, V2v, VP, V4). In contrast, assigning dominance based on sighting produced no significant interocular BOLD differences. We conclude that interocular BOLD differences in normal subjects exist, and may be predicted by acuity measures. PMID:17194544
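The percent BOLD signal change used in this comparison is typically computed per ROI from the mean signal in stimulation blocks relative to the mean signal in rest blocks. A minimal sketch of that standard calculation (the time series and block indices here are illustrative, not the study's data):

```python
def percent_signal_change(ts, stim_idx, rest_idx):
    """Percent BOLD change: 100 * (mean(stim) - mean(rest)) / mean(rest).

    ts       -- ROI-averaged signal time series (list of floats)
    stim_idx -- indices of volumes acquired during stimulation blocks
    rest_idx -- indices of volumes acquired during rest blocks
    """
    stim = sum(ts[i] for i in stim_idx) / len(stim_idx)
    rest = sum(ts[i] for i in rest_idx) / len(rest_idx)
    return 100.0 * (stim - rest) / rest
```

Per-area dominant-eye vs. non-dominant-eye comparisons then amount to computing this value once per eye per ROI and testing the paired differences.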

  1. Tracing Technological Development Trajectories: A Genetic Knowledge Persistence-Based Main Path Approach.

    PubMed

    Park, Hyunseok; Magee, Christopher L

    2017-01-01

    The aim of this paper is to propose a new method to identify main paths in a technological domain using patent citations. Previous approaches for using main path analysis have greatly improved our understanding of actual technological trajectories but nonetheless have some limitations. They have high potential to miss some dominant patents from the identified main paths; moreover, the high network complexity of their main paths makes qualitative tracing of trajectories problematic. The proposed method searches backward and forward paths from the high-persistence patents which are identified based on a standard genetic knowledge persistence algorithm. We tested the new method by applying it to the desalination and the solar photovoltaic domains and compared the results to output from the same domains using a prior method. The empirical results show that the proposed method can dramatically reduce network complexity without missing any dominantly important patents. The main paths identified by our approach for two test cases are almost 10x less complex than the main paths identified by the existing approach. The proposed approach identifies all dominantly important patents on the main paths, but the main paths identified by the existing approach miss about 20% of dominantly important patents.
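The abstract does not spell out the persistence algorithm itself; a much-simplified proxy for knowledge persistence is to score each patent by how many source-to-sink citation paths in the (acyclic) citation network pass through it, which can be computed by dynamic programming. This sketch is an illustration of that idea only, not the authors' genetic knowledge persistence algorithm:

```python
from collections import defaultdict
from functools import lru_cache

def persistence_scores(edges):
    """Score each node by the number of source-to-sink knowledge
    paths passing through it (a simplified persistence proxy).

    edges -- iterable of (a, b) pairs meaning knowledge flows a -> b
             (i.e., patent b cites patent a); the graph must be acyclic.
    """
    succ, pred, nodes = defaultdict(list), defaultdict(list), set()
    for a, b in edges:
        succ[a].append(b)
        pred[b].append(a)
        nodes |= {a, b}

    @lru_cache(maxsize=None)
    def down(n):  # number of paths from n to any sink
        return 1 if not succ[n] else sum(down(m) for m in succ[n])

    @lru_cache(maxsize=None)
    def up(n):    # number of paths from any source to n
        return 1 if not pred[n] else sum(up(m) for m in pred[n])

    return {n: up(n) * down(n) for n in nodes}
```

High-scoring nodes would then serve as the seeds from which backward and forward paths are traced to form the main path.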

  2. Tracing Technological Development Trajectories: A Genetic Knowledge Persistence-Based Main Path Approach

    PubMed Central

    2017-01-01

    The aim of this paper is to propose a new method to identify main paths in a technological domain using patent citations. Previous approaches for using main path analysis have greatly improved our understanding of actual technological trajectories but nonetheless have some limitations. They have high potential to miss some dominant patents from the identified main paths; moreover, the high network complexity of their main paths makes qualitative tracing of trajectories problematic. The proposed method searches backward and forward paths from the high-persistence patents which are identified based on a standard genetic knowledge persistence algorithm. We tested the new method by applying it to the desalination and the solar photovoltaic domains and compared the results to output from the same domains using a prior method. The empirical results show that the proposed method can dramatically reduce network complexity without missing any dominantly important patents. The main paths identified by our approach for two test cases are almost 10x less complex than the main paths identified by the existing approach. The proposed approach identifies all dominantly important patents on the main paths, but the main paths identified by the existing approach miss about 20% of dominantly important patents. PMID:28135304

  3. Computational Testing for Automated Preprocessing 2: Practical Demonstration of a System for Scientific Data-Processing Workflow Management for High-Volume EEG

    PubMed Central

    Cowley, Benjamin U.; Korpela, Jussi

    2018-01-01

    Existing tools for the preprocessing of EEG data provide a large choice of methods to suitably prepare and analyse a given dataset. Yet it remains a challenge for the average user to integrate methods for batch processing of the increasingly large datasets of modern research, and compare methods to choose an optimal approach across the many possible parameter configurations. Additionally, many tools still require a high degree of manual decision making for, e.g., the classification of artifacts in channels, epochs or segments. This introduces extra subjectivity, is slow, and is not reproducible. Batching and well-designed automation can help to regularize EEG preprocessing, and thus reduce human effort, subjectivity, and consequent error. The Computational Testing for Automated Preprocessing (CTAP) toolbox facilitates: (i) batch processing that is easy for experts and novices alike; (ii) testing and comparison of preprocessing methods. Here we demonstrate the application of CTAP to high-resolution EEG data in three modes of use. First, a linear processing pipeline with mostly default parameters illustrates ease-of-use for naive users. Second, a branching pipeline illustrates CTAP's support for comparison of competing methods. Third, a pipeline with built-in parameter-sweeping illustrates CTAP's capability to support data-driven method parameterization. CTAP extends the existing functions and data structure from the well-known EEGLAB toolbox, based on Matlab, and produces extensive quality control outputs. CTAP is available under MIT open-source licence from https://github.com/bwrc/ctap. PMID:29692705
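CTAP itself is built on Matlab/EEGLAB, so the following is only a language-neutral sketch, in Python, of the branching-pipeline idea the abstract describes: a shared stem of preprocessing steps that forks into competing branches so their outputs can be compared. All step names and functions here are hypothetical:

```python
def run_pipeline(data, steps):
    """Apply preprocessing steps in sequence, logging each result
    so intermediate outputs can serve as quality-control artifacts."""
    log = []
    for name, fn in steps:
        data = fn(data)
        log.append((name, data))
    return data, log

# A shared stem that forks into two branches differing in one step --
# the shape of comparison a branching pipeline supports.
stem = [("demean", lambda xs: [x - sum(xs) / len(xs) for x in xs])]
branch_a = stem + [("clip", lambda xs: [max(-1.0, min(1.0, x)) for x in xs])]
branch_b = stem + [("halve", lambda xs: [x / 2 for x in xs])]
```

Running both branches on the same input and inspecting the logs is the minimal form of the method comparison CTAP automates at scale.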

  4. Computational Testing for Automated Preprocessing 2: Practical Demonstration of a System for Scientific Data-Processing Workflow Management for High-Volume EEG.

    PubMed

    Cowley, Benjamin U; Korpela, Jussi

    2018-01-01

    Existing tools for the preprocessing of EEG data provide a large choice of methods to suitably prepare and analyse a given dataset. Yet it remains a challenge for the average user to integrate methods for batch processing of the increasingly large datasets of modern research, and compare methods to choose an optimal approach across the many possible parameter configurations. Additionally, many tools still require a high degree of manual decision making for, e.g., the classification of artifacts in channels, epochs or segments. This introduces extra subjectivity, is slow, and is not reproducible. Batching and well-designed automation can help to regularize EEG preprocessing, and thus reduce human effort, subjectivity, and consequent error. The Computational Testing for Automated Preprocessing (CTAP) toolbox facilitates: (i) batch processing that is easy for experts and novices alike; (ii) testing and comparison of preprocessing methods. Here we demonstrate the application of CTAP to high-resolution EEG data in three modes of use. First, a linear processing pipeline with mostly default parameters illustrates ease-of-use for naive users. Second, a branching pipeline illustrates CTAP's support for comparison of competing methods. Third, a pipeline with built-in parameter-sweeping illustrates CTAP's capability to support data-driven method parameterization. CTAP extends the existing functions and data structure from the well-known EEGLAB toolbox, based on Matlab, and produces extensive quality control outputs. CTAP is available under MIT open-source licence from https://github.com/bwrc/ctap.

  5. Theoretical aspects for estimating anisotropic saturated hydraulic conductivity from in-well or direct-push probe injection tests in uniform media

    NASA Astrophysics Data System (ADS)

    Klammler, Harald; Layton, Leif; Nemer, Bassel; Hatfield, Kirk; Mohseni, Ana

    2017-06-01

    Hydraulic conductivity and its anisotropy are fundamental aquifer properties for groundwater flow and transport modeling. Current in-well or direct-push field measurement techniques allow for relatively quick determination of general conductivity profiles with depth. However, capabilities for identifying local scale conductivities in the horizontal and vertical directions are very limited. Here, we develop the theoretical basis for estimating horizontal and vertical conductivities from different types of steady-state single-well/probe injection tests under saturated conditions and in the absence of a well skin. We explore existing solutions and a recent semi-analytical solution approach to the flow problem under the assumption that the aquifer is locally homogeneous. The methods are based on the collection of an additional piece of information in the form of a second injection (or recirculation) test at a same location, or in the form of an additional head or flow observation along the well/probe. Results are represented in dimensionless charts for partial validation against approximate solutions and for practical application to test interpretation. The charts further allow for optimization of a test configuration to maximize sensitivity to anisotropy ratio. The two methods most sensitive to anisotropy are found to be (1) subsequent injection from a lateral screen and from the bottom of an otherwise cased borehole, and (2) single injection from a lateral screen with an additional head observation along the casing. Results may also be relevant for attributing consistent divergences in conductivity measurements from different testing methods applied at a same site or location to the potential effects of anisotropy. Some practical aspects are discussed and references are made to existing methods, which appear easily compatible with the proposed procedures.

  6. The Comparison Of In-Flight Pitot Static Calibration Method By Using Radio Altimeter As Reference with GPS and Tower Fly By Methods On CN235-100 MPA

    NASA Astrophysics Data System (ADS)

    Derajat; Hariowibowo, Hindawan

    2018-04-01

    The newly proposed in-flight pitot-static calibration method was carried out during the development and qualification of the CN235-100 MPA (Military Patrol Aircraft). This method is expected to reduce flight hours, require fewer personnel and no additional special equipment, and involve simple analysis calculations, thereby minimizing operational cost. At the Indonesian Aerospace (IAe) Flight Test Center Division, new flight test techniques and data analysis methods, especially for flight physics test subjects, continue to be developed as long as they are safe for flight and add value for the industry. For more than 30 years, flight test data engineers at the Flight Test Center Division have worked together with air crews (test pilots, co-pilots, and flight test engineers) to execute flight test activities with standard procedures for both existing and newly developed test techniques and test data analysis. This paper describes the mathematical model approximation, data reduction, and flight test technique of the in-flight pitot-static calibration using the radio altimeter as reference, and compares the test results with two other methods: the Global Positioning System (GPS) method and the traditional Tower Fly By method, which were used previously during this flight test program (Ref. [10]). The test case uses CN235-100 MPA flight test data from the development and qualification flight test program at Cazaux Airport, France, in June-November 2009 (Ref. [2]).

  7. Advanced Capabilities for Wind Tunnel Testing in the 21st Century

    NASA Technical Reports Server (NTRS)

    Kegelman, Jerome T.; Danehy, Paul M.; Schwartz, Richard J.

    2010-01-01

    Wind tunnel testing methods and test technologies for the 21st century using advanced capabilities are presented. These capabilities are necessary to capture more accurate and high quality test results by eliminating the uncertainties in testing and to facilitate verification of computational tools for design. This paper discusses near term developments underway in ground testing capabilities, which will enhance the quality of information of both the test article and airstream flow details. Also discussed is a selection of new capability investments that have been made to accommodate such developments. Examples include advanced experimental methods for measuring the test gas itself; using efficient experiment methodologies, including quality assurance strategies within the test; and increasing test result information density by using extensive optical visualization together with computed flow field results. These points apply both to major investments in existing tunnel capabilities and to entirely new capabilities.

  8. Scaling Methods for Simulating Aircraft In-Flight Icing Encounters

    NASA Technical Reports Server (NTRS)

    Anderson, David N.; Ruff, Gary A.

    1997-01-01

    This paper discusses scaling methods which permit the use of subscale models in icing wind tunnels to simulate natural flight in icing. Natural icing conditions exist when air temperatures are below freezing but cloud water droplets are super-cooled liquid. Aircraft flying through such clouds are susceptible to the accretion of ice on the leading edges of unprotected components such as wings, tailplane and engine inlets. To establish the aerodynamic penalties of such ice accretion and to determine what parts need to be protected from ice accretion (by heating, for example), extensive flight and wind-tunnel testing is necessary for new aircraft and components. Testing in icing tunnels is less expensive than flight testing, is safer, and permits better control of the test conditions. However, because of limitations on both model size and operating conditions in wind tunnels, it is often necessary to perform tests with either size or test conditions scaled. This paper describes the theoretical background to the development of icing scaling methods, discusses four methods, and presents results of tests to validate them.
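One dimensionless similarity parameter commonly held fixed in icing scaling is the accumulation parameter, which relates cloud liquid water content, airspeed, spray time, and model chord. The sketch below assumes the simple form Ac = (LWC * V * t) / (c * rho_ice); the specific form, constants, and the four methods actually evaluated in the paper are not given in the abstract, so this is illustrative only:

```python
RHO_ICE = 917.0  # kg/m^3, assumed ice density for illustration

def accumulation_parameter(lwc, v, t, chord):
    """Ac = (LWC * V * t) / (chord * rho_ice), dimensionless.

    lwc   -- cloud liquid water content, kg/m^3
    v     -- airspeed, m/s
    t     -- spray (exposure) time, s
    chord -- model chord length, m
    """
    return lwc * v * t / (chord * RHO_ICE)

def scaled_spray_time(t_full, chord_full, chord_model,
                      lwc_ratio=1.0, v_ratio=1.0):
    """Spray time that keeps Ac constant for a subscale model,
    given the ratios of model-to-full LWC and airspeed."""
    return t_full * (chord_model / chord_full) / (lwc_ratio * v_ratio)
```

For example, a half-scale model tested at the full-scale liquid water content and airspeed would need half the spray time to accrete a geometrically similar ice shape under this parameter alone.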

  9. Further investigations of the W-test for pairwise epistasis testing.

    PubMed

    Howey, Richard; Cordell, Heather J

    2017-01-01

    Background: In a recent paper, a novel W-test for pairwise epistasis testing was proposed that appeared, in computer simulations, to have higher power than competing alternatives. Application to genome-wide bipolar data detected significant epistasis between SNPs in genes of relevant biological function. Network analysis indicated that the implicated genes formed two separate interaction networks, each containing genes highly related to autism and neurodegenerative disorders. Methods: Here we investigate further the properties and performance of the W-test via theoretical evaluation, computer simulations and application to real data. Results: We demonstrate that, for common variants, the W-test is closely related to several existing tests of association allowing for interaction, including logistic regression on 8 degrees of freedom, although logistic regression can show inflated type I error for low minor allele frequencies, whereas the W-test shows good/conservative type I error control. Although in some situations the W-test can show higher power, logistic regression is not limited to tests on 8 degrees of freedom but can instead be tailored to impose greater structure on the assumed alternative hypothesis, offering a power advantage when the imposed structure matches the true structure. Conclusions: The W-test is a potentially useful method for testing for association - without necessarily implying interaction - between genetic variants and disease, particularly when one or more of the genetic variants are rare. For common variants, the advantages of the W-test are less clear, and, indeed, there are situations where existing methods perform better. In our investigations, we further uncover a number of problems with the practical implementation and application of the W-test (to bipolar disorder) previously described, apparently due to inadequate use of standard data quality-control procedures.
This observation leads us to urge caution in interpretation of the previously-presented results, most of which we consider are highly likely to be artefacts.

  10. Space Suit Joint Torque Testing

    NASA Technical Reports Server (NTRS)

    Valish, Dana J.

    2011-01-01

    In 2009 and early 2010, a test was performed to quantify the torque required to manipulate joints in several existing operational and prototype space suits in an effort to develop joint torque requirements appropriate for a new Constellation Program space suit system. The same test method was levied on the Constellation space suit contractors to verify that their suit design meets the requirements. However, because the original test was set up and conducted by a single test operator there was some question as to whether this method was repeatable enough to be considered a standard verification method for Constellation or other future space suits. In order to validate the method itself, a representative subset of the previous test was repeated, using the same information that would be available to space suit contractors, but set up and conducted by someone not familiar with the previous test. The resultant data was compared using graphical and statistical analysis and a variance in torque values for some of the tested joints was apparent. Potential variables that could have affected the data were identified and re-testing was conducted in an attempt to eliminate these variables. The results of the retest will be used to determine if further testing and modification is necessary before the method can be validated.

  11. Automatically generated acceptance test: A software reliability experiment

    NASA Technical Reports Server (NTRS)

    Protzel, Peter W.

    1988-01-01

    This study presents results of a software reliability experiment investigating the feasibility of a new error detection method. The method can be used as an acceptance test and is solely based on empirical data about the behavior of internal states of a program. The experimental design uses the existing environment of a multi-version experiment previously conducted at the NASA Langley Research Center, in which the launch interceptor problem is used as a model. This allows the controlled experimental investigation of versions with well-known single and multiple faults, and the availability of an oracle permits the determination of the error detection performance of the test. Fault interaction phenomena are observed that have an amplifying effect on the number of error occurrences. Preliminary results indicate that all faults examined so far are detected by the acceptance test. This shows promise for further investigations, and for the employment of this test method on other applications.
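The abstract says the acceptance test is based solely on empirical data about the behavior of internal program states, without detailing the detection rule. One simple realization of that idea is an envelope test: learn per-variable (min, max) ranges of internal states from reference executions, then flag any execution whose states leave the learned envelope. The sketch below is a hedged illustration of this concept, not the paper's actual method:

```python
def learn_envelope(runs):
    """Learn a per-variable (min, max) envelope from reference runs.

    runs -- iterable of dicts mapping internal variable name -> value
            observed in a known-good execution.
    """
    env = {}
    for run in runs:
        for var, val in run.items():
            lo, hi = env.get(var, (val, val))
            env[var] = (min(lo, val), max(hi, val))
    return env

def accept(run, env, tol=0.0):
    """Return the variables whose values leave the learned envelope
    (empty list means the run passes the acceptance test)."""
    return [v for v, x in run.items()
            if v in env and not (env[v][0] - tol <= x <= env[v][1] + tol)]
```

An oracle, as in the experiment described, would then be needed to measure how often such envelope violations coincide with true errors.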

  12. A complementation assay for in vivo protein structure/function analysis in Physcomitrella patens (Funariaceae)

    DOE PAGES

    Scavuzzo-Duggan, Tess R.; Chaves, Arielle M.; Roberts, Alison W.

    2015-07-14

    Here, a method for rapid in vivo functional analysis of engineered proteins was developed using Physcomitrella patens. A complementation assay was designed for testing structure/function relationships in cellulose synthase (CESA) proteins. The components of the assay include (1) construction of test vectors that drive expression of epitope-tagged PpCESA5 carrying engineered mutations, (2) transformation of a ppcesa5 knockout line that fails to produce gametophores with test and control vectors, (3) scoring the stable transformants for gametophore production, (4) statistical analysis comparing complementation rates for test vectors to positive and negative control vectors, and (5) analysis of transgenic protein expression by Western blotting. The assay distinguished mutations that generate fully functional, nonfunctional, and partially functional proteins. In conclusion, compared with existing methods for in vivo testing of protein function, this complementation assay provides a rapid method for investigating protein structure/function relationships in plants.

  13. Ground Characterization Studies in Canakkale Pilot Site of LIQUEFACT Project

    NASA Astrophysics Data System (ADS)

    Ozcep, F.; Oztoprak, S.; Aysal, N.; Bozbey, I.; Tezel, O.; Ozer, C.; Sargin, S.; Bekin, E.; Almasraf, M.; Cengiz Cinku, M.; Ozdemir, K.

    2017-12-01

    Our aim is to outline the ground characterization studies at the Canakkale test site. The study is based on the EU H2020 project "LIQUEFACT: Assessment and mitigation of liquefaction potential across Europe: a holistic approach to protect structures/infrastructures for improved resilience to earthquake-induced liquefaction disasters". The objectives and extent of ground characterization for the Canakkale test site include pre-existing soil investigation studies and complementary field studies. Several SPT and geophysical tests had already been carried out in the study area. Within the context of the complementary tests, six (6) study areas in the test site were chosen. In these areas, additional boreholes were opened and SPT tests were performed, and it was decided that additional CPT (CPTU and SCPT) and Marchetti Dilatometer (DMT) tests should be carried out within the scope of the complementary testing. Seismic refraction, MASW, and microtremor measurements had been carried out in pre-existing studies, and shear wave velocities obtained from MASW measurements were evaluated to the most rigorous level. The complementary geophysical tests were downhole seismic, PS-logging, seismic refraction, 2D-ReMi, MASW, microtremor (H/V Nakamura method), 2D resistivity, and resonance acoustic profiling (RAP); RAP is a new technique which is explained briefly in the relevant section. Dynamic soil properties had not been measured in the pre-existing studies and were therefore investigated within the scope of the complementary tests, using resonant column and cyclic direct shear tests. Selection of the specific experimental tests of the complementary campaign was based on cost-benefit considerations. Several sieve analyses and Atterberg limits tests documented in the pre-existing studies were evaluated, and additional sieve analyses and Atterberg limits tests were carried out in the complementary study. The aim was to establish correlations between geophysical measurements and other field measurements, such as SPT blow counts.

  14. Development of a general method for detection and quantification of the P35S promoter based on assessment of existing methods

    PubMed Central

    Wu, Yuhua; Wang, Yulei; Li, Jun; Li, Wei; Zhang, Li; Li, Yunjing; Li, Xiaofei; Li, Jun; Zhu, Li; Wu, Gang

    2014-01-01

    The Cauliflower mosaic virus (CaMV) 35S promoter (P35S) is a commonly used target for detection of genetically modified organisms (GMOs). There are currently 24 reported detection methods, targeting different regions of the P35S promoter. Initial assessment revealed that due to the absence of primer binding sites in the P35S sequence, 19 of the 24 reported methods failed to detect P35S in MON88913 cotton, and two further methods could only be applied to certain GMOs. The remaining three reported methods were not suitable for measurement of P35S in some testing events, because SNPs in binding sites of the primer/probe would result in abnormal amplification plots and poor linear regression parameters. In this study, we discovered a conserved region in the P35S sequence through sequencing of P35S promoters from multiple transgenic events, and developed new qualitative and quantitative detection systems targeting this conserved region. The qualitative PCR could detect the P35S promoter in 23 unique GMO events with high specificity and sensitivity. The quantitative method was suitable for measurement of the P35S promoter, exhibiting good agreement between the amount of template and Ct values for each testing event. This study provides a general P35S screening method, with greater coverage than existing methods. PMID:25483893
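Finding a conserved region across promoter sequences from multiple events reduces, in its simplest form, to locating runs of alignment columns that are identical in every sequence; primers and probes are then designed inside such runs. The sketch below assumes pre-aligned, gap-free sequences and is only an illustration of that first step, not the authors' primer-design pipeline:

```python
def conserved_regions(seqs, min_len=20):
    """Return (start, end) half-open spans of alignment columns that are
    identical across all sequences and at least min_len columns long.

    seqs -- list of aligned, gap-free sequences (strings).
    """
    n = min(len(s) for s in seqs)
    cons = [len({s[i] for s in seqs}) == 1 for i in range(n)]
    regions, start = [], None
    for i, ok in enumerate(cons):
        if ok and start is None:
            start = i
        elif not ok and start is not None:
            if i - start >= min_len:
                regions.append((start, i))
            start = None
    if start is not None and n - start >= min_len:
        regions.append((start, n))
    return regions
```

In practice a screen like the one described would apply this over P35S sequences from many transgenic events and pick the longest conserved span for primer/probe placement.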

  15. CNN-BLPred: a Convolutional neural network based predictor for β-Lactamases (BL) and their classes.

    PubMed

    White, Clarence; Ismail, Hamid D; Saigo, Hiroto; Kc, Dukka B

    2017-12-28

    The β-Lactamase (BL) enzyme family is an important class of enzymes that plays a key role in bacterial resistance to antibiotics. As the newly identified number of BL enzymes is increasing daily, it is imperative to develop a computational tool to classify the newly identified BL enzymes into one of its classes. There are two types of classification of BL enzymes: Molecular Classification and Functional Classification. Existing computational methods only address Molecular Classification and the performance of these existing methods is unsatisfactory. We addressed the unsatisfactory performance of the existing methods by implementing a Deep Learning approach called Convolutional Neural Network (CNN). We developed CNN-BLPred, an approach for the classification of BL proteins. The CNN-BLPred uses Gradient Boosted Feature Selection (GBFS) in order to select the ideal feature set for each BL classification. Based on the rigorous benchmarking of CCN-BLPred using both leave-one-out cross-validation and independent test sets, CCN-BLPred performed better than the other existing algorithms. Compared with other architectures of CNN, Recurrent Neural Network, and Random Forest, the simple CNN architecture with only one convolutional layer performs the best. After feature extraction, we were able to remove ~95% of the 10,912 features using Gradient Boosted Trees. During 10-fold cross validation, we increased the accuracy of the classic BL predictions by 7%. We also increased the accuracy of Class A, Class B, Class C, and Class D performance by an average of 25.64%. The independent test results followed a similar trend. We implemented a deep learning algorithm known as Convolutional Neural Network (CNN) to develop a classifier for BL classification. 
Combined with feature selection on an exhaustive feature set and balancing methods such as Random Oversampling (ROS), Random Undersampling (RUS), and the Synthetic Minority Oversampling Technique (SMOTE), CNN-BLPred performs significantly better than existing algorithms for BL classification.
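
    The class-balancing step mentioned above can be illustrated with a minimal numpy sketch. This is not the authors' implementation; the array shapes, labels, and the simplified SMOTE-style interpolation are assumptions for illustration only:

```python
import numpy as np

def random_oversample(X, y, rng):
    """Duplicate minority-class rows at random until all classes match
    the majority count (ROS)."""
    classes, counts = np.unique(y, return_counts=True)
    n_max = counts.max()
    X_out, y_out = [X], [y]
    for c, n in zip(classes, counts):
        if n < n_max:
            idx = np.flatnonzero(y == c)
            extra = rng.choice(idx, size=n_max - n, replace=True)
            X_out.append(X[extra])
            y_out.append(y[extra])
    return np.vstack(X_out), np.concatenate(y_out)

def smote_like(X_min, n_new, rng, k=3):
    """SMOTE-style synthesis: interpolate between a minority sample and
    one of its k nearest minority neighbours (simplified sketch)."""
    d = np.linalg.norm(X_min[:, None] - X_min[None, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    nn = np.argsort(d, axis=1)[:, :k]          # k nearest neighbours per row
    base = rng.integers(0, len(X_min), n_new)  # random base samples
    nbr = nn[base, rng.integers(0, k, n_new)]  # random neighbour of each base
    gap = rng.random((n_new, 1))               # interpolation factor in [0, 1)
    return X_min[base] + gap * (X_min[nbr] - X_min[base])

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 4))
y = np.array([0] * 25 + [1] * 5)               # imbalanced toy labels
X_bal, y_bal = random_oversample(X, y, rng)
X_syn = smote_like(X[y == 1], n_new=20, rng=rng)
```

    ROS merely repeats rows, so it balances counts without adding information; the SMOTE-style variant synthesises new minority points along segments between neighbours, which is why it tends to generalise better on small minority classes.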

  16. Laboratory tests for hot-mix asphalt characterization in Virginia.

    DOT National Transportation Integrated Search

    2005-01-01

    This project reviewed existing laboratory methods for accurately describing the constitutive behavior of the mixes used in the Commonwealth of Virginia. Indirect tensile (IDT) strength, resilient modulus, static creep in the IDT and uniaxial modes, f...

  17. Noncontact methods for optical testing of convex aspheric mirrors for future large telescopes

    NASA Astrophysics Data System (ADS)

    Goncharov, Alexander V.; Druzhin, Vladislav V.; Batshev, Vladislav I.

    2009-06-01

    Non-contact methods for testing of large rotationally symmetric convex aspheric mirrors are proposed. These methods are based on non-null testing with side illumination schemes, in which a narrow collimated beam is reflected from the meridional aspheric profile of a mirror. The figure error of the mirror is deduced from the intensity pattern from the reflected beam obtained on a screen, which is positioned in the tangential plane (containing the optical axis) and perpendicular to the incoming beam. Testing of the entire surface is carried out by rotating the mirror about its optical axis and registering the characteristics of the intensity pattern on the screen. The intensity pattern can be formed using three different techniques: modified Hartman test, interference and boundary curve test. All these techniques are well known but have not been used in the proposed side illumination scheme. Analytical expressions characterizing the shape and location of the intensity pattern on the screen or a CCD have been developed for all types of conic surfaces. The main advantage of these testing methods compared with existing methods (Hindle sphere, null lens, computer generated hologram) is that the reference system does not require large optical components.

  18. Experiments in fault tolerant software reliability

    NASA Technical Reports Server (NTRS)

    Mcallister, David F.; Tai, K. C.; Vouk, Mladen A.

    1987-01-01

    The reliability of voting was evaluated in a fault-tolerant software system for small output spaces. The effectiveness of the back-to-back testing process was investigated. Version 3.0 of the RSDIMU-ATS, a semi-automated test bed for certification testing of RSDIMU software, was prepared and distributed. Software reliability estimation methods based on non-random sampling are being studied. The investigation of existing fault-tolerance models was continued and formulation of new models was initiated.

  19. The FY 1980 Department of Defense Program for Research, Development, and Acquisition

    DTIC Science & Technology

    1979-02-01

    materiel. Up to a point, superior performance is an offset to this quantitative disadvantage. Lanchester’s theory of warfare derived simplified relations...intermediate ranges. Underground Test. The next scheduled underground test ( UGT ), MINERS IRON, in FY 1980, will provide engineering and design data on...methods of discriminating between UGTs and earthquakes, and address U.S. capabilities to monitor both the existing Threshold Test Ban Treaty and the

  20. An influence coefficient method for the application of the modal technique to wing flutter suppression of the DAST ARW-1 wing

    NASA Technical Reports Server (NTRS)

    Pines, S.

    1981-01-01

    This report presents the methods used to compute the mass, structural stiffness, and aerodynamic forces in the form of influence coefficient matrices, as applied to a flutter analysis of the Drones for Aerodynamic and Structural Testing (DAST) Aeroelastic Research Wing. The DAST wing was chosen because wind tunnel flutter test data and zero speed vibration data of the modes and frequencies exist and are available for comparison. A derivation of the equations of motion that can be used to apply the modal method for flutter suppression is included. A comparison of the open loop flutter predictions with both wind tunnel data and other analytical methods is presented.

  1. Monte Carlo based statistical power analysis for mediation models: methods and software.

    PubMed

    Zhang, Zhiyong

    2014-12-01

    The existing literature on statistical power analysis for mediation models often assumes data normality and is based on a less powerful Sobel test instead of the more powerful bootstrap test. This study proposes to estimate statistical power to detect mediation effects on the basis of the bootstrap method through Monte Carlo simulation. Nonnormal data with excessive skewness and kurtosis are allowed in the proposed method. A free R package called bmem is developed to conduct the power analysis discussed in this study. Four examples, including a simple mediation model, a multiple-mediator model with a latent mediator, a multiple-group mediation model, and a longitudinal mediation model, are provided to illustrate the proposed method.
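
    The bmem package described here is an R package; the Monte Carlo idea itself can be sketched in Python. The following is a hedged illustration, with path coefficients, sample size, and replication counts chosen arbitrarily: simulate the mediation model many times, run a percentile-bootstrap test of a*b in each replicate, and report the rejection rate as power.

```python
import numpy as np

def ab_paths(x, m, y):
    """OLS slopes for the two mediation paths: a (x -> m) and b (m -> y | x)."""
    a = np.cov(x, m, ddof=0)[0, 1] / np.var(x)
    Z = np.column_stack([np.ones_like(x), m, x])   # regress y on (1, m, x)
    coef, *_ = np.linalg.lstsq(Z, y, rcond=None)
    return a, coef[1]

def bootstrap_power(a=0.5, b=0.5, n=50, reps=100, boots=200, seed=1):
    """Monte Carlo estimate of the power of the percentile-bootstrap test
    that the 95% CI for a*b excludes zero."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(reps):
        x = rng.normal(size=n)
        m = a * x + rng.normal(size=n)             # mediator equation
        y = b * m + rng.normal(size=n)             # outcome equation
        ab_star = np.empty(boots)
        for j in range(boots):
            idx = rng.integers(0, n, n)            # resample cases
            aj, bj = ab_paths(x[idx], m[idx], y[idx])
            ab_star[j] = aj * bj
        lo, hi = np.percentile(ab_star, [2.5, 97.5])
        hits += (lo > 0) or (hi < 0)               # CI excludes zero
    return hits / reps

power = bootstrap_power()
```

    Nonnormal data are accommodated simply by changing the error-generating lines, which is the point of estimating power by simulation rather than by a normal-theory formula.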

  2. Application of Hydrophilic Silanol-Based Chemical Grout for Strengthening Damaged Reinforced Concrete Flexural Members

    PubMed Central

    Ju, Hyunjin; Lee, Deuck Hang; Cho, Hae-Chang; Kim, Kang Su; Yoon, Seyoon; Seo, Soo-Yeon

    2014-01-01

    In this study, hydrophilic chemical grout using silanol (HCGS) was adopted to overcome the performance limitations of epoxy materials used for strengthening existing buildings and civil engineering structures. The enhanced material performances of HCGS were introduced and applied to the section enlargement method, which is one of the typical structural strengthening methods used in practice. To evaluate the structural strengthening performance of HCGS, structural tests were conducted on reinforced concrete beams, and analyses of the flexural behaviors of the test specimens were performed by modified partial interaction theory (PIT). In particular, to improve the constructability of the section enlargement method, an advanced strengthening method was proposed, in which a precast panel was directly attached to the bottom of the damaged structural member by HCGS, and the degree of connection of the test specimens strengthened by the section enlargement method was quantitatively evaluated by PIT-based analysis. PMID:28788708

  3. Application of Hydrophilic Silanol-Based Chemical Grout for Strengthening Damaged Reinforced Concrete Flexural Members.

    PubMed

    Ju, Hyunjin; Lee, Deuck Hang; Cho, Hae-Chang; Kim, Kang Su; Yoon, Seyoon; Seo, Soo-Yeon

    2014-06-23

    In this study, hydrophilic chemical grout using silanol (HCGS) was adopted to overcome the performance limitations of epoxy materials used for strengthening existing buildings and civil engineering structures. The enhanced material performances of HCGS were introduced and applied to the section enlargement method, which is one of the typical structural strengthening methods used in practice. To evaluate the structural strengthening performance of HCGS, structural tests were conducted on reinforced concrete beams, and analyses of the flexural behaviors of the test specimens were performed by modified partial interaction theory (PIT). In particular, to improve the constructability of the section enlargement method, an advanced strengthening method was proposed, in which a precast panel was directly attached to the bottom of the damaged structural member by HCGS, and the degree of connection of the test specimens strengthened by the section enlargement method was quantitatively evaluated by PIT-based analysis.

  4. A novel knowledge-based potential for RNA 3D structure evaluation

    NASA Astrophysics Data System (ADS)

    Yang, Yi; Gu, Qi; Zhang, Ben-Gong; Shi, Ya-Zhou; Shao, Zhi-Gang

    2018-03-01

    Ribonucleic acids (RNAs) play a vital role in biology, and knowledge of their three-dimensional (3D) structure is required to understand their biological functions. Structural prediction methods have been developed to address this issue, but most existing methods generate a series of candidate 3D structures, so evaluation of the predicted structures is indispensable. Although several methods have been proposed to assess RNA 3D structures, the existing methods are not precise enough. In this work, a new all-atom knowledge-based potential is developed for more accurately evaluating RNA 3D structures. The potential not only includes local and nonlocal interactions but also fully considers the specificity of each RNA by introducing a retraining mechanism. Based on extensive test sets generated by independent methods, the proposed potential correctly distinguished the native state and ranked near-native conformations to effectively select the best. Furthermore, the proposed potential precisely captured RNA structural features such as base-stacking and base-pairing. Comparisons with existing potential methods show that the proposed potential is very reliable and accurate in RNA 3D structure evaluation. Project supported by the National Science Foundation of China (Grant Nos. 11605125, 11105054, 11274124, and 11401448).

  5. Study on the application of ambient vibration tests to evaluate the effectiveness of seismic retrofitting

    NASA Astrophysics Data System (ADS)

    Liang, Li; Takaaki, Ohkubo; Guang-hui, Li

    2018-03-01

    In recent years, earthquakes have occurred frequently, and the seismic performance of existing school buildings has become particularly important. The main method for improving the seismic resistance of existing buildings is reinforcement; however, there are few effective methods to evaluate the effect of such reinforcement. Ambient vibration measurement experiments were conducted before and after seismic retrofitting using a wireless measurement system, and the changes in vibration characteristics were compared. The changes in the acceleration response spectrum, natural periods, and vibration modes indicate that the wireless vibration measurement system can be effectively applied to evaluate the effect of seismic retrofitting. Although the method can evaluate the effect of seismic retrofitting qualitatively, it remains difficult to evaluate it quantitatively at this stage.

  6. Validity and extension of the SCS-CN method for computing infiltration and rainfall-excess rates

    NASA Astrophysics Data System (ADS)

    Mishra, Surendra Kumar; Singh, Vijay P.

    2004-12-01

    A criterion is developed for determining the validity of the Soil Conservation Service curve number (SCS-CN) method. According to this criterion, the existing SCS-CN method is found to be applicable when the potential maximum retention, S, is less than or equal to twice the total rainfall amount. The criterion is tested using published data of two watersheds. Separating the steady infiltration from capillary infiltration, the method is extended for predicting infiltration and rainfall-excess rates. The extended SCS-CN method is tested using 55 sets of laboratory infiltration data on soils varying from Plainfield sand to Yolo light clay, and the computed and observed infiltration and rainfall-excess rates are found to be in good agreement.
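
    For reference, the standard SCS-CN runoff relation (with the conventional initial abstraction Ia = 0.2S) and the validity criterion stated in the abstract can be sketched as follows. The CN and rainfall values are illustrative only:

```python
def scs_cn_runoff(P, CN, lam=0.2):
    """Direct runoff Q (mm) from rainfall P (mm) by the SCS-CN method.
    S is the potential maximum retention; Ia = lam * S is the initial
    abstraction (lam = 0.2 is the conventional value)."""
    S = 25400.0 / CN - 254.0          # S in mm for CN on the 0-100 scale
    Ia = lam * S
    if P <= Ia:
        return 0.0, S                 # all rainfall abstracted, no runoff
    Q = (P - Ia) ** 2 / (P - Ia + S)
    return Q, S

def scs_cn_valid(P, S):
    """Criterion from the abstract: the existing SCS-CN method applies
    when the potential maximum retention S <= 2P."""
    return S <= 2.0 * P

Q, S = scs_cn_runoff(P=100.0, CN=75.0)
```

    For CN = 75, S is about 84.7 mm, so a 100 mm storm satisfies S <= 2P and yields roughly 41 mm of direct runoff.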

  7. Development of a Short Form of the Boston Naming Test for Individuals with Aphasia

    ERIC Educational Resources Information Center

    del Toro, Christina M.; Bislick, Lauren P.; Comer, Matthew; Velozo, Craig; Romero, Sergio; Rothi, Leslie J. Gonzalez; Kendall, Diane L.

    2011-01-01

    Purpose: The purpose of this study was to develop a short form of the Boston Naming Test (BNT; Kaplan, Goodglass, & Weintraub, 2001) for individuals with aphasia and compare it with 2 existing short forms originally analyzed with responses from people with dementia and neurologically healthy adults. Method: Development of the new BNT-Aphasia Short…

  8. Full-Scale Experimental Verification of Soft-Story-Only Retrofits of Wood-Frame Buildings using Hybrid Testing

    Treesearch

    Elaina Jennings; John W. van de Lindt; Ershad Ziaei; Pouria Bahmani; Sangki Park; Xiaoyun Shao; Weichiang Pang; Douglas Rammer; Gary Mochizuki; Mikhail Gershfeld

    2015-01-01

    The FEMA P-807 Guidelines were developed for retrofitting soft-story wood-frame buildings based on existing data, and the method had not been verified through full-scale experimental testing. This article presents two different retrofit designs based directly on the FEMA P-807 Guidelines that were examined at several different seismic intensity levels. The...

  9. 40 CFR 63.344 - Performance test requirements and test methods.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... pressure as follows: (i) Locate a velocity traverse port in a section of straight duct that connects the hooding on the plating tank or tanks with the control device. The port shall be located as close to the..., appendix A). If 2.5 diameters of straight duct work does not exist, locate the port 0.8 of the duct...

  10. 40 CFR 63.344 - Performance test requirements and test methods.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... pressure as follows: (i) Locate a velocity traverse port in a section of straight duct that connects the hooding on the plating tank or tanks with the control device. The port shall be located as close to the..., appendix A). If 2.5 diameters of straight duct work does not exist, locate the port 0.8 of the duct...

  11. Estimating the Proportion of True Null Hypotheses Using the Pattern of Observed p-values

    PubMed Central

    Tong, Tiejun; Feng, Zeny; Hilton, Julia S.; Zhao, Hongyu

    2013-01-01

    Estimating the proportion of true null hypotheses, π0, has attracted much attention in the recent statistical literature. Besides its apparent relevance for a set of specific scientific hypotheses, an accurate estimate of this parameter is key for many multiple testing procedures. Most existing methods for estimating π0 in the literature are motivated from the independence assumption of test statistics, which is often not true in reality. Simulations indicate that most existing estimators in the presence of the dependence among test statistics can be poor, mainly due to the increase of variation in these estimators. In this paper, we propose several data-driven methods for estimating π0 by incorporating the distribution pattern of the observed p-values as a practical approach to address potential dependence among test statistics. Specifically, we use a linear fit to give a data-driven estimate for the proportion of true-null p-values in (λ, 1] over the whole range [0, 1] instead of using the expected proportion at 1 − λ. We find that the proposed estimators may substantially decrease the variance of the estimated true null proportion and thus improve the overall performance. PMID:24078762
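
    The idea of exploiting the tail pattern of p-values can be sketched as follows. This is a simplified illustration, not the authors' exact estimator: under the null, p-values are uniform, so the proportion falling in (λ, 1] is approximately π0(1 − λ); regressing that observed proportion on (1 − λ) through the origin gives a π0 estimate that pools information across the whole λ range instead of using a single λ.

```python
import numpy as np

def pi0_linear_fit(pvals, lambdas=None):
    """Estimate pi0 by a through-the-origin linear fit of the proportion
    of p-values in (lam, 1] against the expected tail width (1 - lam)."""
    if lambdas is None:
        lambdas = np.arange(0.4, 0.95, 0.05)
    x = 1.0 - lambdas                              # expected tail width
    ytail = np.array([(pvals > l).mean() for l in lambdas])
    slope = (x * ytail).sum() / (x * x).sum()      # least squares through origin
    return min(max(slope, 0.0), 1.0)               # clip to [0, 1]

rng = np.random.default_rng(0)
m, pi0_true = 2000, 0.7
null_p = rng.uniform(size=int(m * pi0_true))            # true nulls: uniform
alt_p = rng.uniform(size=m - int(m * pi0_true)) ** 8    # alternatives near zero
pvals = np.concatenate([null_p, alt_p])
pi0_hat = pi0_linear_fit(pvals)
```

    The estimate is slightly biased upward because a few alternative p-values also land in the tail; choosing larger λ values reduces this bias at the cost of more variance, which is the trade-off the pooled linear fit is meant to soften.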

  12. Hypothesis testing of a change point during cognitive decline among Alzheimer's disease patients.

    PubMed

    Ji, Ming; Xiong, Chengjie; Grundman, Michael

    2003-10-01

    In this paper, we present a statistical hypothesis test for detecting a change point over the course of cognitive decline among Alzheimer's disease patients. The model under the null hypothesis assumes a constant rate of cognitive decline over time, and the model under the alternative hypothesis is a general bilinear model with an unknown change point. When the change point is unknown, however, the null distribution of the test statistic is not analytically tractable and has to be simulated by parametric bootstrap. When the alternative hypothesis that a change point exists is accepted, we propose an estimate of its location based on the Akaike Information Criterion. We applied our method to a data set from the Neuropsychological Database Initiative by implementing our hypothesis testing method to analyze Mini Mental Status Exam scores based on a random-slope and random-intercept model with a bilinear fixed effect. Our result shows that, despite a large amount of missing data, accelerated decline did occur for MMSE scores among AD patients. Our finding supports the clinical belief in the existence of a change point during cognitive decline among AD patients and suggests the use of change point models for the longitudinal modeling of cognitive decline in AD research.
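
    The bilinear change-point fit can be illustrated with a hedged sketch using fixed effects only: a grid search over candidate change points with OLS at each candidate, and an AIC comparison against the constant-slope model. The random-effects structure and the parametric bootstrap of the null distribution used in the paper are omitted; the simulated slopes and noise level are assumptions for illustration:

```python
import numpy as np

def fit_ols(X, y):
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    sse = float(((y - X @ coef) ** 2).sum())
    return coef, sse

def aic(sse, n, k):
    """Gaussian AIC up to a constant: n*log(SSE/n) + 2*(k + 1)."""
    return n * np.log(sse / n) + 2 * (k + 1)

def fit_changepoint(t, y, grid):
    """Grid search over the change point tau in the bilinear model
    y = b0 + b1*t + b2*(t - tau)_+ ; returns the best tau and its AIC."""
    n = len(t)
    best = (None, np.inf)
    for tau in grid:
        X = np.column_stack([np.ones(n), t, np.clip(t - tau, 0, None)])
        _, sse = fit_ols(X, y)
        if sse < best[1]:
            best = (tau, sse)
    tau_hat, sse = best
    return tau_hat, aic(sse, n, 3)

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 101)                       # follow-up time
# decline of slope -1 accelerating to -3 after the true change point at t = 5
y = 30 - 1.0 * t - 2.0 * np.clip(t - 5.0, 0, None) + rng.normal(0, 0.3, t.size)
tau_hat, aic_bilinear = fit_changepoint(t, y, grid=np.linspace(1, 9, 81))
_, sse_lin = fit_ols(np.column_stack([np.ones(t.size), t]), y)
aic_linear = aic(sse_lin, t.size, 2)
```

    With a clear acceleration in the simulated decline, the bilinear model wins the AIC comparison and the grid search recovers the change point; in the real-data setting the paper's bootstrap replaces this informal comparison with a proper test.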

  13. Estimating the Proportion of True Null Hypotheses Using the Pattern of Observed p-values.

    PubMed

    Tong, Tiejun; Feng, Zeny; Hilton, Julia S; Zhao, Hongyu

    2013-01-01

    Estimating the proportion of true null hypotheses, π0, has attracted much attention in the recent statistical literature. Besides its apparent relevance for a set of specific scientific hypotheses, an accurate estimate of this parameter is key for many multiple testing procedures. Most existing methods for estimating π0 in the literature are motivated from the independence assumption of test statistics, which is often not true in reality. Simulations indicate that most existing estimators in the presence of the dependence among test statistics can be poor, mainly due to the increase of variation in these estimators. In this paper, we propose several data-driven methods for estimating π0 by incorporating the distribution pattern of the observed p-values as a practical approach to address potential dependence among test statistics. Specifically, we use a linear fit to give a data-driven estimate for the proportion of true-null p-values in (λ, 1] over the whole range [0, 1] instead of using the expected proportion at 1 − λ. We find that the proposed estimators may substantially decrease the variance of the estimated true null proportion and thus improve the overall performance.

  14. The optimization of aircraft seat cushion fire-blocking layers. Full Scale: Test description and results

    NASA Technical Reports Server (NTRS)

    Schutter, K. J.; Duskin, F. E.

    1982-01-01

    Full-scale burn tests were conducted on thirteen different seat cushion configurations in a cabin fire simulator. The fire source used was a quartz lamp radiant energy panel with a propane pilot flame. During each test, data were recorded for cushion temperatures, radiant heat flux, rate of weight loss of test specimens, and cabin temperatures. When compared to existing passenger aircraft seat cushions, the test specimens incorporating a fire barrier and those fabricated from advanced materials using improved construction methods exhibited significantly greater fire resistance.

  15. Summary of AH-1G flight vibration data for validation of coupled rotor-fuselage analyses

    NASA Technical Reports Server (NTRS)

    Dompka, R. V.; Cronkhite, J. D.

    1986-01-01

    Under a NASA research program designated DAMVIBS (Design Analysis Methods for VIBrationS), four U.S. helicopter industry participants (Bell Helicopter, Boeing Vertol, McDonnell Douglas Helicopter, and Sikorsky Aircraft) are to apply existing analytical methods for calculating coupled rotor-fuselage vibrations of the AH-1G helicopter for correlation with flight test data from an AH-1G Operational Load Survey (OLS) test program. Bell Helicopter, as the manufacturer of the AH-1G, was asked to provide pertinent rotor data and to collect the OLS flight vibration data needed to perform the correlations. The analytical representation of the fuselage structure is based on a NASTRAN finite element model (FEM) developed by Bell which has been extensively documented and correlated with ground vibration tests. The AH-1G FEM was provided to each of the participants for use in their coupled rotor-fuselage analyses. This report describes the AH-1G OLS flight test program and provides the flight conditions and measured vibration data to be used by each participant in their correlation effort. In addition, the mechanical, structural, inertial and aerodynamic data for the AH-1G two-bladed teetering main rotor system are presented. Furthermore, modifications to the NASTRAN FEM of the fuselage structure that are necessary to make it compatible with the OLS test article are described. The AH-1G OLS flight test data were found to be well documented and provide a sound basis for evaluating currently existing analysis methods used for the calculation of coupled rotor-fuselage vibrations.

  16. Process capability determination of new and existing equipment

    NASA Technical Reports Server (NTRS)

    Mcclelland, H. T.; Su, Penwen

    1994-01-01

    The objective of this paper is to illustrate a method of determining the process capability of new or existing equipment. The method may also be modified to apply to testing laboratories. Long term changes in the system may be determined by periodically making new test parts or submitting samples from the original set to the testing laboratory. The technique described has been developed through a series of projects in special topics manufacturing courses and graduate student projects. It will be implemented as a standard experiment in an advanced manufacturing course in a new Manufacturing Engineering program at the University of Wisconsin-Stout campus. Before starting a project of this nature, it is important to decide on the exact question to be answered. In this case, it is desired to know what variation can be reasonably expected in the next part, feature, or test result produced. Generally, this question is answered by providing the process capability or the average value of a measured characteristic of the part or process plus or minus three standard deviations. There are two general cases to be considered: the part or test is made in large quantities with little change, or the process is flexible and makes a large variety of parts. Both cases can be accommodated; however, the emphasis in this report is on short run situations.

  17. Construction of mathematical model for measuring material concentration by colorimetric method

    NASA Astrophysics Data System (ADS)

    Liu, Bing; Gao, Lingceng; Yu, Kairong; Tan, Xianghua

    2018-06-01

    This paper uses multiple linear regression to analyze the data of Problem C of the 2017 mathematical modeling contest. First, we established regression models for the concentrations of five substances, but only the regression model for the concentration of urea in milk passed the significance test. The regression model established from the second set of data passed the significance test but exhibited serious multicollinearity. We improved the model by principal component analysis. The improved model is used to control the system so that it is possible to measure the concentration of a material by the direct colorimetric method.
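
    Principal component regression as a remedy for multicollinearity can be sketched as follows. The data here are synthetic (two nearly collinear predictors), not the paper's colorimetric dataset:

```python
import numpy as np

def pcr_fit(X, y, n_components):
    """Principal component regression: centre the data, project onto the
    leading principal components, regress y on the scores, and map the
    coefficients back to the original predictors."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xc = X - x_mean
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)  # principal directions
    V = Vt[:n_components].T
    scores = Xc @ V
    gamma, *_ = np.linalg.lstsq(scores, y - y_mean, rcond=None)
    beta = V @ gamma                   # back-transform to original predictors
    return beta, x_mean, y_mean

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(0, 0.01, n)       # nearly collinear with x1
X = np.column_stack([x1, x2])
y = x1 + x2 + rng.normal(0, 0.1, n)
beta, x_mean, y_mean = pcr_fit(X, y, n_components=1)
y_hat = (X - x_mean) @ beta + y_mean
r2 = 1 - ((y - y_hat) ** 2).sum() / ((y - y.mean()) ** 2).sum()
```

    Ordinary least squares on this design is ill-conditioned (the two columns are almost identical), whereas keeping only the first principal component discards the unstable direction and still recovers the fit, since the signal lies along that component.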

  18. A fuzzy optimal threshold technique for medical images

    NASA Astrophysics Data System (ADS)

    Thirupathi Kannan, Balaji; Krishnasamy, Krishnaveni; Pradeep Kumar Kenny, S.

    2012-01-01

    A new fuzzy thresholding method for medical images, especially cervical cytology images having blob and mosaic structures, is proposed in this paper. Many existing thresholding algorithms can segment either blob or mosaic images, but no single algorithm can do both. In this paper, an input cervical cytology image is binarized and preprocessed, and the pixel value with the minimum Fuzzy Gaussian Index is identified as the optimal threshold value and used for segmentation. The proposed technique is tested on various cervical cytology images having blob or mosaic structures and compared with various existing algorithms, performing better than each of them.
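
    The abstract does not define the Fuzzy Gaussian Index, so the following is only a hedged illustration of the general recipe: score each candidate threshold by a fuzziness measure built from Gaussian memberships to the two class means, and keep the minimizer. The membership form and fuzziness measure below are assumptions, not the paper's definition:

```python
import numpy as np

def fuzzy_threshold(pixels, sigma=None):
    """Pick the grey level whose two-class split minimises a fuzziness
    index: the mean of min(mu, 1 - mu), where mu is a Gaussian membership
    of each pixel to its own class mean. Illustrative only."""
    pixels = np.asarray(pixels, dtype=float)
    if sigma is None:
        sigma = pixels.std()
    lo, hi = np.percentile(pixels, [5, 95])     # skip degenerate extremes
    best_t, best_score = None, np.inf
    for t in np.arange(lo, hi):
        low, high = pixels[pixels <= t], pixels[pixels > t]
        if low.size == 0 or high.size == 0:
            continue
        m = np.where(pixels <= t, low.mean(), high.mean())
        mu = np.exp(-((pixels - m) ** 2) / (2 * sigma ** 2))
        score = np.minimum(mu, 1 - mu).mean()   # fuzziness of this split
        if score < best_score:
            best_t, best_score = t, score
    return best_t

rng = np.random.default_rng(0)
pixels = np.concatenate([rng.normal(50, 10, 500),    # dark region
                         rng.normal(200, 10, 500)])  # bright region
t_opt = fuzzy_threshold(pixels)
```

    A good threshold leaves every pixel close to its class mean (membership near 1, fuzziness near 0); a bad one mixes the clusters and drives memberships toward the ambiguous 0.5 region, so the minimum lands between the two modes.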

  19. Space Suit Joint Torque Measurement Method Validation

    NASA Technical Reports Server (NTRS)

    Valish, Dana; Eversley, Karina

    2012-01-01

    In 2009 and early 2010, a test method was developed and performed to quantify the torque required to manipulate joints in several existing operational and prototype space suits. This was done in an effort to develop joint torque requirements appropriate for a new Constellation Program space suit system. The same test method was levied on the Constellation space suit contractors to verify that their suit design met the requirements. However, because the original test was set up and conducted by a single test operator there was some question as to whether this method was repeatable enough to be considered a standard verification method for Constellation or other future development programs. In order to validate the method itself, a representative subset of the previous test was repeated, using the same information that would be available to space suit contractors, but set up and conducted by someone not familiar with the previous test. The resultant data was compared using graphical and statistical analysis; the results indicated a significant variance in values reported for a subset of the re-tested joints. Potential variables that could have affected the data were identified and a third round of testing was conducted in an attempt to eliminate and/or quantify the effects of these variables. The results of the third test effort will be used to determine whether or not the proposed joint torque methodology can be applied to future space suit development contracts.

  20. Review of laboratory-based terrestrial bioaccumulation assessment approaches for organic chemicals: Current status and future possibilities.

    PubMed

    Hoke, Robert; Huggett, Duane; Brasfield, Sandra; Brown, Becky; Embry, Michelle; Fairbrother, Anne; Kivi, Michelle; Paumen, Miriam Leon; Prosser, Ryan; Salvito, Dan; Scroggins, Rick

    2016-01-01

    In the last decade, interest has been renewed in approaches for the assessment of the bioaccumulation potential of chemicals, principally driven by the need to evaluate large numbers of chemicals as part of new chemical legislation, while reducing vertebrate test organism use called for in animal welfare legislation. This renewed interest has inspired research activities and advances in bioaccumulation science for neutral organic chemicals in aquatic environments. In January 2013, ILSI Health and Environmental Sciences Institute convened experts to identify the state of the science and existing shortcomings in terrestrial bioaccumulation assessment of neutral organic chemicals. Potential modifications to existing laboratory methods were identified, including areas in which new laboratory approaches or test methods could be developed to address terrestrial bioaccumulation. The utility of "non-ecotoxicity" data (e.g., mammalian laboratory data) was also discussed. The highlights of the workshop discussions are presented along with potential modifications in laboratory approaches and new test guidelines that could be used for assessing the bioaccumulation of chemicals in terrestrial organisms. © 2015 SETAC.

  1. Analysis of 2-alkylcyclobutanones in cashew nut, nutmeg, apricot kernel, and pine nut samples: re-evaluating the uniqueness of 2-alkylcyclobutanones for irradiated food identification.

    PubMed

    Leung, Elvis M K; Tang, Phyllis N Y; Ye, Yuran; Chan, Wan

    2013-10-16

    2-Alkylcyclobutanones (2-ACBs) have long been considered as unique radiolytic products that can be used as indicators for irradiated food identification. A recent report on the natural existence of 2-ACB in non-irradiated nutmeg and cashew nut samples aroused worldwide concern because it contradicts the general belief that 2-ACBs are specific to irradiated food. The goal of this study is to test the natural existence of 2-ACBs in nut samples using our newly developed liquid chromatography-tandem mass spectrometry (LC-MS/MS) method with enhanced analytical sensitivity and selectivity (Ye, Y.; Liu, H.; Horvatovich, P.; Chan, W. Liquid chromatography-electrospray ionization tandem mass spectrometric analysis of 2-alkylcyclobutanones in irradiated chicken by precolumn derivatization with hydroxylamine. J. Agric. Food Chem. 2013, 61, 5758-5763). The validated method was applied to identify 2-dodecylcyclobutanone (2-DCB) and 2-tetradecylcyclobutanone (2-TCB) in nutmeg, cashew nut, pine nut, and apricot kernel samples (n = 22) of different origins. Our study reveals that 2-DCB and 2-TCB either do not exist naturally or exist at concentrations below the detection limit of the existing method. Thus, 2-DCB and 2-TCB are still valid to be used as biomarkers for identifying irradiated food.

  2. COSMOS: accurate detection of somatic structural variations through asymmetric comparison between tumor and normal samples

    PubMed Central

    Yamagata, Koichi; Yamanishi, Ayako; Kokubu, Chikara; Takeda, Junji; Sese, Jun

    2016-01-01

    An important challenge in cancer genomics is precise detection of structural variations (SVs) by high-throughput short-read sequencing, which is hampered by the high false discovery rates of existing analysis tools. Here, we propose an accurate SV detection method named COSMOS, which compares the statistics of the mapped read pairs in tumor samples with isogenic normal control samples in a distinct asymmetric manner. COSMOS also prioritizes the candidate SVs using strand-specific read-depth information. Performance tests on modeled tumor genomes revealed that COSMOS outperformed existing methods in terms of F-measure. We also applied COSMOS to an experimental mouse cell-based model, in which SVs were induced by genome engineering and gamma-ray irradiation, followed by polymerase chain reaction-based confirmation. The precision of COSMOS was 84.5%, while the next best existing method was 70.4%. Moreover, the sensitivity of COSMOS was the highest, indicating that COSMOS has great potential for cancer genome analysis. PMID:26833260

  3. Pseudorange Measurement Method Based on AIS Signals.

    PubMed

    Zhang, Jingbo; Zhang, Shufang; Wang, Jinpeng

    2017-05-22

    In order to use the existing automatic identification system (AIS) to provide additional navigation and positioning services, a complete pseudorange measurement solution is presented in this paper. Through the mathematical analysis of the AIS signal, the bit-0-phases in the digital sequences were determined as the timestamps. Monte Carlo simulation was carried out to compare the accuracy of the zero-crossing and differential peak, which are two timestamp detection methods in the additive white Gaussian noise (AWGN) channel. Considering the low-speed and low-dynamic motion characteristics of ships, an optimal estimation method based on the minimum mean square error is proposed to improve detection accuracy. Furthermore, the α difference filter algorithm was used to achieve the fusion of the optimal estimation results of the two detection methods. The results show that the algorithm can greatly improve the accuracy of pseudorange estimation under low signal-to-noise ratio (SNR) conditions. In order to verify the effectiveness of the scheme, prototypes containing the measurement scheme were developed and field tests in Xinghai Bay of Dalian (China) were performed. The test results show that the pseudorange measurement accuracy was better than 28 m (σ) without any modification of the existing AIS system.
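
    The zero-crossing timestamp idea can be illustrated on a generic sampled waveform. This is a hedged sketch: the paper detects bit-0-phases in a GMSK-modulated AIS signal, which is considerably more involved than the plain sinusoid and sampling rate assumed here:

```python
import numpy as np

def rising_zero_crossings(t, s):
    """Sub-sample timestamps of rising zero crossings, found by linear
    interpolation between the two samples bracketing each sign change."""
    i = np.flatnonzero((s[:-1] < 0) & (s[1:] >= 0))   # sign changes upward
    frac = -s[i] / (s[i + 1] - s[i])                  # interpolation factor
    return t[i] + frac * (t[i + 1] - t[i])

fs = 100.0                                   # sampling rate, Hz (assumed)
t = np.arange(0, 3, 1 / fs)
s = np.sin(2 * np.pi * 1.0 * t + 0.1)        # 1 Hz tone with a phase offset
stamps = rising_zero_crossings(t, s)
```

    Linear interpolation pushes the timing error well below the sample period, which is the basic reason zero-crossing timestamps can support ranging even at modest sampling rates; noise then shifts individual crossings, motivating the estimation and fusion steps described in the abstract.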

  4. Pseudorange Measurement Method Based on AIS Signals

    PubMed Central

    Zhang, Jingbo; Zhang, Shufang; Wang, Jinpeng

    2017-01-01

    In order to use the existing automatic identification system (AIS) to provide additional navigation and positioning services, a complete pseudorange measurement solution is presented in this paper. Through the mathematical analysis of the AIS signal, the bit-0-phases in the digital sequences were determined as the timestamps. Monte Carlo simulation was carried out to compare the accuracy of the zero-crossing and differential peak, which are two timestamp detection methods in the additive white Gaussian noise (AWGN) channel. Considering the low-speed and low-dynamic motion characteristics of ships, an optimal estimation method based on the minimum mean square error is proposed to improve detection accuracy. Furthermore, the α difference filter algorithm was used to achieve the fusion of the optimal estimation results of the two detection methods. The results show that the algorithm can greatly improve the accuracy of pseudorange estimation under low signal-to-noise ratio (SNR) conditions. In order to verify the effectiveness of the scheme, prototypes containing the measurement scheme were developed and field tests in Xinghai Bay of Dalian (China) were performed. The test results show that the pseudorange measurement accuracy was better than 28 m (σ) without any modification of the existing AIS system. PMID:28531153

  5. Micro Dot Patterning on the Light Guide Panel Using Powder Blasting

    PubMed Central

    Jang, Ho Su; Cho, Myeong Woo; Park, Dong Sam

    2008-01-01

    This study is to develop a micromachining technology for a light guide panel (LGP) mold, whereby micro dot patterns are formed on a LGP surface by a single injection process instead of existing screen printing processes. The micro powder blasting technique is applied to form micro dot patterns on the LGP mold surface. The optimal conditions for masking, laminating, exposure, and developing processes to form the micro dot patterns are first experimentally investigated. A LGP mold with masked micro patterns is then machined using the micro powder blasting method and the machinability of the micro dot patterns is verified. A prototype LGP is test-injected using the developed LGP mold and a shape analysis of the patterns and performance testing of the injected LGP are carried out. As an additional approach, matte finishing, a special surface treatment method, is applied to the mold surface to improve the light diffusion characteristics, uniformity and brightness of the LGP. The results of this study show that the applied powder blasting method can be successfully used to manufacture LGPs with micro patterns by just single injection using the developed mold and thereby replace existing screen printing methods. PMID:27879740

  6. Normalized Rotational Multiple Yield Surface Framework (NRMYSF) stress-strain curve prediction method based on small strain triaxial test data on undisturbed Auckland residual clay soils

    NASA Astrophysics Data System (ADS)

    Noor, M. J. Md; Ibrahim, A.; Rahman, A. S. A.

    2018-04-01

    Small strain triaxial test measurement is considered significantly more accurate than conventional external strain measurement, due to the systematic errors normally associated with the latter. Three submersible miniature linear variable differential transducers (LVDTs), mounted on yokes clamped directly onto the soil sample and spaced equally at 120° intervals, were used. The device setup, using a 0.4 N resolution load cell and a 16-bit AD converter, was capable of consistently resolving displacements of less than 1 µm and measuring axial strains ranging from less than 0.001% to 2.5%. Further analysis of the small strain local measurement data was performed using the new Normalized Rotational Multiple Yield Surface Framework (NRMYSF) method and compared with the existing Rotational Multiple Yield Surface Framework (RMYSF) prediction method. The prediction of shear strength based on the combined intrinsic curvilinear shear strength envelope using small strain triaxial test data confirmed the significant improvement and reliability of the measurement and analysis methods. Moreover, the NRMYSF method shows excellent data prediction and a significant improvement toward more reliable prediction of soil strength, which can reduce the cost and time of laboratory testing.

  7. Validation studies and proficiency testing.

    PubMed

    Ankilam, Elke; Heinze, Petra; Kay, Simon; Van den Eede, Guy; Popping, Bert

    2002-01-01

    Genetically modified organisms (GMOs) entered the European food market in 1996. Current legislation demands the labeling of food products if they contain >1% GMO, as assessed for each ingredient of the product. To create confidence in the testing methods and to complement enforcement requirements, there is an urgent need for internationally validated methods, which could serve as reference methods. To date, several methods have been submitted to validation trials at an international level; approaches now exist that can be used in different circumstances and for different food matrixes. Moreover, the requirement for the formal validation of methods is clearly accepted; several national and international bodies are active in organizing studies. Further validation studies, especially on the quantitative polymerase chain reaction methods, need to be performed to cover the rising demand for new extraction methods and other background matrixes, as well as for novel GMO constructs.

  8. A normalization method for combination of laboratory test results from different electronic healthcare databases in a distributed research network.

    PubMed

    Yoon, Dukyong; Schuemie, Martijn J; Kim, Ju Han; Kim, Dong Ki; Park, Man Young; Ahn, Eun Kyoung; Jung, Eun-Young; Park, Dong Kyun; Cho, Soo Yeon; Shin, Dahye; Hwang, Yeonsoo; Park, Rae Woong

    2016-03-01

    Distributed research networks (DRNs) afford statistical power by integrating observational data from multiple partners for retrospective studies. However, laboratory test results across care sites are derived using different assays from varying patient populations, making it difficult to simply combine data for analysis. Additionally, existing normalization methods are not suitable for retrospective studies. We normalized laboratory results from different data sources by adjusting for heterogeneous clinico-epidemiologic characteristics of the data and called this the subgroup-adjusted normalization (SAN) method. Subgroup-adjusted normalization renders the means and standard deviations of distributions identical under population structure-adjusted conditions. To evaluate its performance, we compared SAN with existing methods for simulated and real datasets consisting of blood urea nitrogen, serum creatinine, hematocrit, hemoglobin, serum potassium, and total bilirubin. Various clinico-epidemiologic characteristics can be applied together in SAN. For simplicity of comparison, age and gender were used to adjust population heterogeneity in this study. In simulations, SAN had the lowest standardized difference in means (SDM) and Kolmogorov-Smirnov values for all tests (p < 0.05). In a real dataset, SAN had the lowest SDM and Kolmogorov-Smirnov values for blood urea nitrogen, hematocrit, hemoglobin, and serum potassium, and the lowest SDM for serum creatinine (p < 0.05). Subgroup-adjusted normalization performed better than normalization using other methods. The SAN method is applicable in a DRN environment and should facilitate analysis of data integrated across DRN partners for retrospective observational studies. Copyright © 2015 John Wiley & Sons, Ltd.
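As a rough illustration of the idea behind subgroup-adjusted normalization (the paper's exact procedure is not reproduced here), the sketch below z-scores each laboratory value within its source subgroup and rescales it to the reference subgroup's mean and standard deviation, so the subgroup distributions match. The subgroup keys and toy values are invented.

```python
from collections import defaultdict
from statistics import mean, stdev

def subgroup_stats(records):
    # records: list of (subgroup_key, value); returns per-subgroup (mean, sd)
    groups = defaultdict(list)
    for key, val in records:
        groups[key].append(val)
    return {k: (mean(v), stdev(v)) for k, v in groups.items() if len(v) > 1}

def san_normalize(source, reference):
    # Hypothetical sketch of subgroup-adjusted normalization (SAN):
    # z-score each value within its source subgroup, then rescale to the
    # reference subgroup's mean and SD.
    src = subgroup_stats(source)
    ref = subgroup_stats(reference)
    out = []
    for key, val in source:
        m_s, s_s = src[key]
        m_r, s_r = ref[key]
        out.append((key, (val - m_s) / s_s * s_r + m_r))
    return out

# Toy creatinine-like values keyed by (age_band, gender) -- assumed data
site_a = [(("40s", "F"), v) for v in (0.9, 1.1, 1.0, 1.2, 0.8)]
site_b = [(("40s", "F"), v) for v in (0.7, 0.8, 0.75, 0.85, 0.9)]
normalized = san_normalize(site_b, site_a)
vals = [v for _, v in normalized]
print(round(mean(vals), 2), round(stdev(vals), 3))
```

After normalization, site B's subgroup mean and SD equal site A's, which is the property the SAN method is described as enforcing under population structure-adjusted conditions.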

  9. Grade 1 to 6 Thai students' existing ideas about light: Across-age study

    NASA Astrophysics Data System (ADS)

    Horasirt, Yupaporn; Yuenyong, Chokchai

    2018-01-01

    This paper aimed to investigate Grade 1 to 6 Thai (6-12 years old) students' existing ideas about light, sight, vision, and sources of light. The participants included 36 Grade 1 to 6 students (6 students in each grade) who were studying at a primary school in Khon Kaen. The method of this study is a descriptive qualitative research design. The tools included a two-tiered test about light and an open-ended question. Students' responses were categorized to identify their existing ideas about light. Findings indicated that young students held various existing ideas about light that could be categorized into 6 different groups relating to sight, vision, and sources of light. The paper discusses these students' existing ideas for developing constructivist learning about light in the Thai context.

  10. Flip-avoiding interpolating surface registration for skull reconstruction.

    PubMed

    Xie, Shudong; Leow, Wee Kheng; Lee, Hanjing; Lim, Thiam Chye

    2018-03-30

    Skull reconstruction is an important and challenging task in craniofacial surgery planning, forensic investigation and anthropological studies. Existing methods typically reconstruct approximating surfaces that regard corresponding points on the target skull as soft constraints, thus incurring non-zero error even for non-defective parts and high overall reconstruction error. This paper proposes a novel geometric reconstruction method that non-rigidly registers an interpolating reference surface that regards corresponding target points as hard constraints, thus achieving low reconstruction error. To overcome the shortcoming of interpolating a surface, a flip-avoiding method is used to detect and exclude conflicting hard constraints that would otherwise cause surface patches to flip and self-intersect. Comprehensive test results show that our method is more accurate and robust than existing skull reconstruction methods. By incorporating symmetry constraints, it can produce more symmetric and normal results than other methods in reconstructing defective skulls with a large number of defects. It is robust against severe outliers such as radiation artifacts in computed tomography due to dental implants. In addition, test results also show that our method outperforms thin-plate spline for model resampling, which enables the active shape model to yield more accurate reconstruction results. As the reconstruction accuracy of defective parts varies with the use of different reference models, we also study the implication of reference model selection for skull reconstruction. Copyright © 2018 John Wiley & Sons, Ltd.

  11. Micro Dot Patterning on the Light Guide Panel Using Powder Blasting.

    PubMed

    Jang, Ho Su; Cho, Myeong Woo; Park, Dong Sam

    2008-02-08

    This study is to develop a micromachining technology for a light guide panel (LGP) mold, whereby micro dot patterns are formed on a LGP surface by a single injection process instead of existing screen printing processes. The micro powder blasting technique is applied to form micro dot patterns on the LGP mold surface. The optimal conditions for masking, laminating, exposure, and developing processes to form the micro dot patterns are first experimentally investigated. A LGP mold with masked micro patterns is then machined using the micro powder blasting method and the machinability of the micro dot patterns is verified. A prototype LGP is test-injected using the developed LGP mold and a shape analysis of the patterns and performance testing of the injected LGP are carried out. As an additional approach, matte finishing, a special surface treatment method, is applied to the mold surface to improve the light diffusion characteristics, uniformity and brightness of the LGP. The results of this study show that the applied powder blasting method can be successfully used to manufacture LGPs with micro patterns by just single injection using the developed mold and thereby replace existing screen printing methods.

  12. Observed physical processes in mechanical tests of PBX9501 and recommendations for experiments to explore a possible plasticity/damage threshold

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buechler, Miles A.

    2012-05-02

    This memo discusses observations made regarding a series of monotonic and cyclic uniaxial experiments performed on PBX9501 by Darla Thompson under Enhanced Surveillance Campaign support. The observations discussed in the section on cyclic compression strongly suggest the presence of viscoelastic, plastic, and damage phenomena in the mechanical response of the material. In the section on uniaxial data analysis and observations, methods are discussed for separating out the viscoelastic effects. A crude application of those methods suggests the possibility of a critical stress below which plasticity and damage may be negligible. The threshold should be explored because, if it exists, it will be an important feature of any constitutive model. Additionally, if the threshold exists, then modifications of experimental methods may be feasible which could potentially simplify future experiments or provide higher quality data from those experiments. A set of experiments to explore the threshold stress is proposed in the section on an exploratory test program for identifying the threshold stress.

  13. Measurement of CO2 diffusivity for carbon sequestration: a microfluidic approach for reservoir-specific analysis.

    PubMed

    Sell, Andrew; Fadaei, Hossein; Kim, Myeongsub; Sinton, David

    2013-01-02

    Predicting carbon dioxide (CO2) security and capacity in sequestration requires knowledge of CO2 diffusion into reservoir fluids. In this paper we demonstrate a microfluidic based approach to measuring the mutual diffusion coefficient of carbon dioxide in water and brine. The approach enables formation of fresh CO2-liquid interfaces; the resulting diffusion is quantified by imaging fluorescence quenching of a pH-dependent dye, and subsequent analyses. This method was applied to study the effects of site-specific variables--CO2 pressure and salinity levels--on the diffusion coefficient. In contrast to established, macro-scale pressure-volume-temperature cell methods that require large sample volumes and testing periods of hours/days, this approach requires only microliters of sample, provides results within minutes, and isolates diffusive mass transport from convective effects. The measured diffusion coefficient of CO2 in water was constant (1.86 (±0.26) × 10^-9 m^2/s) over the range of pressures (5-50 bar) tested at 26 °C, in agreement with existing models. The effects of salinity were measured with solutions of 0-5 M NaCl, where the diffusion coefficient varied up to 3 times. These experimental data support existing theory and demonstrate the applicability of this method for reservoir-specific testing.
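The diffusion analysis in this record rests on the classic semi-infinite solution of Fick's second law, C(x,t) = C0·erfc(x / (2√(Dt))). The sketch below recovers D from a synthetic concentration profile by grid search; the authors' actual fitting procedure is not described in the abstract, so this is only an illustrative stand-in.

```python
import math

def concentration(x, t, D, c0=1.0):
    # 1D semi-infinite diffusion from a fresh CO2-water interface:
    # C(x,t) = c0 * erfc(x / (2*sqrt(D*t)))  (Fick's second law solution)
    return c0 * math.erfc(x / (2.0 * math.sqrt(D * t)))

# Synthetic "fluorescence" profile at t = 60 s using the reported
# D = 1.86e-9 m^2/s for CO2 in water (positions in meters)
D_true = 1.86e-9
t = 60.0
xs = [i * 20e-6 for i in range(1, 30)]        # 20 um spacing
profile = [concentration(x, t, D_true) for x in xs]

# Recover D by grid search over candidate diffusivities
def fit_error(D):
    return sum((concentration(x, t, D) - c) ** 2 for x, c in zip(xs, profile))

candidates = [d * 1e-10 for d in range(5, 60)]  # 0.5e-9 .. 5.9e-9 m^2/s
D_fit = min(candidates, key=fit_error)
print(D_fit)
```

With real fluorescence data the profile would be noisy and the grid search would typically be replaced by a continuous least-squares fit, but the erfc form of the model is standard.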

  14. Statistical models for incorporating data from routine HIV testing of pregnant women at antenatal clinics into HIV/AIDS epidemic estimates.

    PubMed

    Sheng, Ben; Marsh, Kimberly; Slavkovic, Aleksandra B; Gregson, Simon; Eaton, Jeffrey W; Bao, Le

    2017-04-01

    HIV prevalence data collected from routine HIV testing of pregnant women at antenatal clinics (ANC-RT) are potentially available from all facilities that offer testing services to pregnant women and can be used to improve estimates of national and subnational HIV prevalence trends. We develop methods to incorporate this new data source into the Joint United Nations Programme on AIDS Estimation and Projection Package in Spectrum 2017. We develop a new statistical model for incorporating ANC-RT HIV prevalence data, aggregated either to the health facility level (site-level) or regionally (census-level), to estimate HIV prevalence alongside existing sources of HIV prevalence data from ANC unlinked anonymous testing (ANC-UAT) and household-based national population surveys. Synthetic data are generated to understand how the availability of ANC-RT data affects the accuracy of various parameter estimates. We estimate HIV prevalence and additional parameters using both ANC-RT and other existing data. Fitting HIV prevalence using synthetic data generally gives precise estimates of the underlying trend and other parameters. More years of ANC-RT data should improve prevalence estimates. More ANC-RT sites and continuation with existing ANC-UAT sites may improve the estimate of calibration between ANC-UAT and ANC-RT sites. We have proposed methods to incorporate ANC-RT data into Spectrum to obtain more precise estimates of prevalence and other measures of the epidemic. Many assumptions about the accuracy, consistency, and representativeness of ANC-RT prevalence underlie the use of these data for monitoring HIV epidemic trends and should be tested as more data become available from national ANC-RT programs.

  15. Has the connection between polyploidy and diversification actually been tested?

    PubMed

    Kellogg, Elizabeth A

    2016-04-01

    Many major clades of angiosperms have several whole genome duplications (polyploidization events) in their distant past, suggesting that polyploidy drives or at least permits diversification. However, data on recently diverged groups are more equivocal, finding little evidence of elevated diversification following polyploidy. The discrepancy may be attributable at least in part to methodology. Many studies use indirect methods, such as chromosome numbers, genome size, and Ks plots, to test polyploidy, although these approaches can be misleading, and often lack sufficient resolution. A direct test of diversification following polyploidy requires a sequence-based approach that traces the history of nuclear genomes rather than species. These methods identify the point of coalescence of ancestral genomes, but may be misleading about the time and thus the extent of diversification. Limitations of existing methods mean that the connection between polyploidy and diversification has not been rigorously tested and remains unknown. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. Hard exudates segmentation based on learned initial seeds and iterative graph cut.

    PubMed

    Kusakunniran, Worapan; Wu, Qiang; Ritthipravat, Panrasee; Zhang, Jian

    2018-05-01

    (Background and Objective): The occurrence of hard exudates is one of the early signs of diabetic retinopathy, one of the leading causes of blindness. Many patients with diabetic retinopathy lose their vision because of late detection of the disease. Thus, this paper proposes a novel method for automatic segmentation of hard exudates in retinal images. (Methods): Existing methods are based on either supervised or unsupervised learning techniques. In addition, learned segmentation models may often cause missed detection and/or false detection of hard exudates, due to the lack of rich characteristics, the intra-variations, and the similarity with other components in the retinal image. Thus, in this paper, supervised learning based on the multilayer perceptron (MLP) is used only to identify initial seeds with high confidence of being hard exudates. Then, the segmentation is finalized by unsupervised learning based on iterative graph cut (GC) using clusters of initial seeds. Also, in order to reduce color intra-variations of hard exudates across different retinal images, color transfer (CT) is applied in the pre-processing step to normalize their color information. (Results): The experiments and comparisons with other existing methods are based on two well-known datasets, e_ophtha EX and DIARETDB1. The proposed method outperforms the other existing methods in the literature, with pixel-level sensitivity of 0.891 for the DIARETDB1 dataset and 0.564 for the e_ophtha EX dataset. Cross-dataset validation, where the training process is performed on one dataset and the testing process is performed on another, is also evaluated in this paper, in order to illustrate the robustness of the proposed method. (Conclusions): This newly proposed method integrates supervised and unsupervised learning techniques. It achieves improved performance compared with the existing methods in the literature. The robustness of the proposed method in the cross-dataset scenario could enhance its practical usage: the trained model could be more practical for unseen data in real-world situations, especially when the capturing environments of training and testing images are not the same. Copyright © 2018 Elsevier B.V. All rights reserved.

  17. Impact, Fire, and Fluid Spread Code Coupling for Complex Transportation Accident Environment Simulation.

    PubMed

    Brown, Alexander L; Wagner, Gregory J; Metzinger, Kurt E

    2012-06-01

    Transportation accidents frequently involve liquids dispersing in the atmosphere. An example is that of aircraft impacts, which often result in spreading fuel and a subsequent fire. Predicting the resulting environment is of interest for design, safety, and forensic applications. This environment is challenging for many reasons, one among them being the disparate time and length scales that are necessary to resolve for an accurate physical representation of the problem. A recent computational method appropriate for this class of problems has been described for modeling the impact and subsequent liquid spread. Because the environment is difficult to instrument and costly to test, the existing validation data are of limited scope and quality. A comparatively well instrumented test involving a rocket propelled cylindrical tank of water was performed, the results of which are helpful to understand the adequacy of the modeling methods. Existing data include estimates of drop sizes at several locations, final liquid surface deposition mass integrated over surface area regions, and video evidence of liquid cloud spread distances. Comparisons are drawn between the experimental observations and the predicted results of the modeling methods to provide evidence regarding the accuracy of the methods, and to provide guidance on the application and use of these methods.

  18. A comparative evaluation of Oratest with the microbiological method of assessing caries activity in children

    PubMed Central

    Sundaram, Meenakshi; Nayak, Ullal Anand; Ramalingam, Krishnakumar; Reddy, Venugopal; Rao, Arun Prasad; Mathian, Mahesh

    2013-01-01

    Aims: The aim of this study is to find out whether Oratest can be used as a diagnostic tool for assessing caries activity, by evaluating its relationship to existing caries status and the salivary Streptococcus mutans level. Materials and Methods: The study sample consisted of 90 students divided into two groups: Group I (test group) with 60 children and Group II (control group) with 30 children. Unstimulated saliva for the estimation of S. mutans was sampled as per the method suggested by Kohler and Bratthall, and the plates were incubated. Rough-surface colonies on a pre-determined area of the tip (approximately 1.5 cm2) were identified as S. mutans and counted for each side of the spatula pressed against mitis salivarius bacitracin agar, using a digital colony counter. The results were expressed in colony forming units (CFU). Oratest was carried out in the same patients after collection of the salivary sample for the microbiological method, to evaluate the relationship between the two tests. Statistical Analysis Used: The tests used were ANOVA, Pearson chi-square test, Pearson's correlation analysis, Mann-Whitney U test and Student's independent t-test. Results: In both the control and test groups, when the S. mutans count (CFU) and Oratest time (minutes) were correlated using Pearson's correlation analysis, the S. mutans count was found to be in a statistically significant negative linear relationship with Oratest time. When the caries status of the children in the test group was correlated with mutans count (CFU) and Oratest time, caries status was found to be in a statistically significant positive linear relationship with S. mutans count and in a significant negative linear relationship with Oratest time. Conclusions: The test proved to be a simple, inexpensive and rapid technique for assessing caries activity, since a significant relationship exists clinically with caries status and microbiologically with the S. mutans count of the individual. PMID:23946577

  19. THE RELATIONSHIP BETWEEN VARIOUS MODES OF SINGLE LEG POSTURAL CONTROL ASSESSMENT

    PubMed Central

    Schmitz, Randy

    2012-01-01

    Purpose/Background: While various techniques have been developed to assess the postural control system, little is known about the relationship between single leg static and functional balance. The purpose of the current study was to determine the relationship between the performance measures of several single leg postural stability tests. Methods: Forty six recreationally active college students (17 males, 29 females, 21±3 yrs, 173±10 cm) performed six single leg tests in a counterbalanced order: 1) Firm Surface-Eyes Open, 2) Firm Surface-Eyes Closed, 3) Multiaxial Surface-Eyes Open, 4) Multiaxial Surface-Eyes Closed, 5) Star Excursion Balance Test (posterior medial reach), 6) Single leg Hop-Stabilization Test. Bivariate correlations were conducted between the six outcome variables. Results: Mild to moderate correlations existed between the static tests. No significant correlations existed involving either of the functional tests. Conclusions: The results indicate that while performance of static balance tasks are mildly to moderately related, they appear to be unrelated to functional reaching or hopping movements, supporting the utilization of a battery of tests to determine overall postural control performance. Level of Evidence: 3b PMID:22666640

  20. The method of planning the energy consumption for electricity market

    NASA Astrophysics Data System (ADS)

    Russkov, O. V.; Saradgishvili, S. E.

    2017-10-01

    The limitations of existing forecast models are identified. The proposed method is based on game theory, probability theory, and forecasting of energy price relations. The new method forms the basis for planning the uneven energy consumption of an industrial enterprise. The ecological aspect of the proposed method is also discussed. The program module that implements the method's algorithm is described, and successful tests of the method at an industrial enterprise are presented. The proposed method allows optimizing the difference between planned and actual energy consumption for every hour of the day. A conclusion is drawn about the applicability of the method for addressing economic and ecological challenges.

  1. From conditioning shampoo to nanomechanics and haptics of human hair.

    PubMed

    Wood, Claudia; Sugiharto, Albert Budiman; Max, Eva; Fery, Andreas

    2011-01-01

    Shampoo treatment and hair conditioning have a direct impact on our wellbeing via properties like combability and haptic perception of hair. Therefore, systematic investigations leading to quality improvement of hair care products are of major interest. The aim of our work is a better understanding of complex testing and its correlation with quantitative parameters. The motivation for the development of physical testing methods for hair feel relates to the fact that an ingredient supplier like BASF can only find new, not yet toxicologically approved chemistries for hair cosmetics if an in-vitro method exists. In this work, the effects of different shampoo treatments with conditioning polymers are investigated. The employed physical test methods, dry friction measurements and AFM, observe friction phenomena on a macroscopic as well as a nanoscale directly on hair. They are an approach to complement sensoric evaluation with an objective in-vitro method.

  2. The design of a joined wing flight demonstrator aircraft

    NASA Technical Reports Server (NTRS)

    Smith, S. C.; Cliff, S. E.; Kroo, I. M.

    1987-01-01

    A joined-wing flight demonstrator aircraft has been developed at the NASA Ames Research Center in collaboration with ACA Industries. The aircraft is designed to utilize the fuselage, engines, and undercarriage of the existing NASA AD-1 flight demonstrator aircraft. The design objectives, methods, constraints, and the resulting aircraft design, called the JW-1, are presented. A wind-tunnel model of the JW-1 was tested in the NASA Ames 12-foot wind tunnel. The test results indicate that the JW-1 has satisfactory flying qualities for a flight demonstrator aircraft. Good agreement of test results with design predictions confirmed the validity of the design methods used for application to joined-wing configurations.

  3. A general statistical test for correlations in a finite-length time series.

    PubMed

    Hanson, Jeffery A; Yang, Haw

    2008-06-07

    The statistical properties of the autocorrelation function from a time series composed of independently and identically distributed stochastic variables have been studied. Analytical expressions for the autocorrelation function's variance have been derived. It has been found that two common ways of calculating the autocorrelation, moving-average and Fourier transform, exhibit different uncertainty characteristics. For periodic time series, the Fourier transform method is preferred because it gives smaller uncertainties that are uniform through all time lags. Based on these analytical results, a statistically robust method has been proposed to test for the existence of correlations in a time series. The statistical test is verified by computer simulations and an application to single-molecule fluorescence spectroscopy is discussed.
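The large-sample rule of thumb behind such tests, that for i.i.d. data the lag-k autocorrelation is approximately normal with variance 1/n, can be sketched as follows. This classic bound is a simpler stand-in for the finite-length variance expressions derived in the paper.

```python
import math
import random

def autocorr(series, lag):
    # Sample autocorrelation at a given lag (moving-average form)
    n = len(series)
    m = sum(series) / n
    var = sum((x - m) ** 2 for x in series)
    cov = sum((series[i] - m) * (series[i + lag] - m) for i in range(n - lag))
    return cov / var

def has_correlation(series, max_lag=10, z=1.96):
    # For i.i.d. data the lag-k autocorrelation is approximately
    # N(0, 1/n), so values outside +/- z/sqrt(n) suggest real correlation.
    bound = z / math.sqrt(len(series))
    return any(abs(autocorr(series, k)) > bound for k in range(1, max_lag + 1))

random.seed(7)
iid = [random.gauss(0, 1) for _ in range(2000)]
# AR(1) process with strong lag-1 correlation, for contrast
ar = [0.0]
for _ in range(1999):
    ar.append(0.8 * ar[-1] + random.gauss(0, 1))
print(has_correlation(ar))
```

Note that scanning many lags inflates the false-positive rate of this naive bound, which is one reason a more careful variance analysis, as in the paper, is needed in practice.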

  4. Addressing criticisms of existing predictive bias research: cognitive ability test scores still overpredict African Americans' job performance.

    PubMed

    Berry, Christopher M; Zhao, Peng

    2015-01-01

    Predictive bias studies have generally suggested that cognitive ability test scores overpredict job performance of African Americans, meaning these tests are not predictively biased against African Americans. However, at least 2 issues call into question existing over-/underprediction evidence: (a) a bias identified by Aguinis, Culpepper, and Pierce (2010) in the intercept test typically used to assess over-/underprediction and (b) a focus on the level of observed validity instead of operational validity. The present study developed and utilized a method of assessing over-/underprediction that draws on the math of subgroup regression intercept differences, does not rely on the biased intercept test, allows for analysis at the level of operational validity, and can use meta-analytic estimates as input values. Therefore, existing meta-analytic estimates of key parameters, corrected for relevant statistical artifacts, were used to determine whether African American job performance remains overpredicted at the level of operational validity. African American job performance was typically overpredicted by cognitive ability tests across levels of job complexity and across conditions wherein African American and White regression slopes did and did not differ. Because the present study does not rely on the biased intercept test and because appropriate statistical artifact corrections were carried out, the present study's results are not affected by the 2 issues mentioned above. The present study represents strong evidence that cognitive ability tests generally overpredict job performance of African Americans. (c) 2015 APA, all rights reserved.
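The intercept-difference logic can be illustrated with a toy calculation (numbers invented; this is not the authors' meta-analytic procedure): fit a pooled regression and a subgroup regression, then evaluate how much the common line overpredicts at the subgroup's mean predictor score.

```python
from statistics import mean

def ols(xs, ys):
    # Ordinary least-squares slope and intercept
    mx, my = mean(xs), mean(ys)
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return b, my - b * mx

# Toy noise-free data (assumed numbers): both groups share a slope of 0.5,
# but group B sits 2 points lower, mimicking a subgroup intercept difference.
xa, ya = [4, 5, 6, 7], [0.5 * x + 10 for x in [4, 5, 6, 7]]
xb, yb = [2, 3, 4, 5], [0.5 * x + 8 for x in [2, 3, 4, 5]]

b_common, a_common = ols(xa + xb, ya + yb)   # pooled "common" line
b_b, a_b = ols(xb, yb)                       # group B's own line

x0 = mean(xb)
# Overprediction for group B at its mean predictor score: common-line
# prediction minus the group's own-line prediction (positive = overpredicted)
over = (b_common * x0 + a_common) - (b_b * x0 + a_b)
print(round(over, 3))  # prints 0.556
```

A positive value here corresponds to the paper's finding: when the common regression line is used for everyone, the lower-intercept group's performance is overpredicted.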

  5. Improved accuracy of supervised CRM discovery with interpolated Markov models and cross-species comparison

    PubMed Central

    Kazemian, Majid; Zhu, Qiyun; Halfon, Marc S.; Sinha, Saurabh

    2011-01-01

    Despite recent advances in experimental approaches for identifying transcriptional cis-regulatory modules (CRMs, ‘enhancers’), direct empirical discovery of CRMs for all genes in all cell types and environmental conditions is likely to remain an elusive goal. Effective methods for computational CRM discovery are thus a critically needed complement to empirical approaches. However, existing computational methods that search for clusters of putative binding sites are ineffective if the relevant TFs and/or their binding specificities are unknown. Here, we provide a significantly improved method for ‘motif-blind’ CRM discovery that does not depend on knowledge or accurate prediction of TF-binding motifs and is effective when limited knowledge of functional CRMs is available to ‘supervise’ the search. We propose a new statistical method, based on ‘Interpolated Markov Models’, for motif-blind, genome-wide CRM discovery. It captures the statistical profile of variable length words in known CRMs of a regulatory network and finds candidate CRMs that match this profile. The method also uses orthologs of the known CRMs from closely related genomes. We perform in silico evaluation of predicted CRMs by assessing whether their neighboring genes are enriched for the expected expression patterns. This assessment uses a novel statistical test that extends the widely used Hypergeometric test of gene set enrichment to account for variability in intergenic lengths. We find that the new CRM prediction method is superior to existing methods. Finally, we experimentally validate 12 new CRM predictions by examining their regulatory activity in vivo in Drosophila; 10 of the tested CRMs were found to be functional, while 6 of the top 7 predictions showed the expected activity patterns. We make our program available as downloadable source code, and as a plugin for a genome browser installed on our servers. PMID:21821659
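An interpolated Markov model in the sense used here can be sketched as follows, assuming a simple count-based interpolation weight (the paper's exact weighting scheme is not reproduced): the probability of the next base blends maximum-likelihood estimates from progressively shorter contexts.

```python
import math
from collections import defaultdict

def train_counts(seqs, order):
    # Count substrings of every length up to order+1 (context + next char)
    counts = defaultdict(int)
    for s in seqs:
        for k in range(1, order + 2):
            for i in range(len(s) - k + 1):
                counts[s[i:i + k]] += 1
    return counts

def imm_prob(counts, ctx, ch, alphabet="ACGT", pseudo=1.0):
    # Interpolated Markov probability: blend the maximum-likelihood
    # estimate for the full context with the estimate for the shorter
    # context, weighted by how often the full context was observed.
    # The weight rule seen/(seen + 10) is an assumed simple choice.
    if ctx == "":
        total = sum(counts[c] for c in alphabet) + pseudo * len(alphabet)
        return (counts[ch] + pseudo) / total
    shorter = imm_prob(counts, ctx[1:], ch, alphabet, pseudo)
    seen = counts[ctx]
    denom = sum(counts[ctx + c] for c in alphabet)
    if seen == 0 or denom == 0:
        return shorter
    lam = seen / (seen + 10.0)
    return lam * (counts[ctx + ch] / denom) + (1 - lam) * shorter

def imm_score(counts, seq, order=2):
    # Log-probability of a sequence under the interpolated model
    return sum(math.log(imm_prob(counts, seq[max(0, i - order):i], seq[i]))
               for i in range(len(seq)))

counts = train_counts(["ATATATATAT" * 3], order=2)
print(imm_score(counts, "ATATAT") > imm_score(counts, "GCGCGC"))
```

In the CRM-discovery setting, candidate windows whose score under the CRM-trained model exceeds their score under a background model would be reported as putative CRMs.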

  6. Determine the Compressive Strength of Calcium Silicate Bricks by Combined Nondestructive Method

    PubMed Central

    2014-01-01

The paper deals with the application of a combined nondestructive method for assessing the compressive strength of calcium silicate bricks, in this case a combination of the rebound hammer method and the ultrasonic pulse method. Calibration relationships for determining compressive strength of calcium silicate bricks from nondestructive test parameters are presented for the combined method as well as for the L-type Schmidt rebound hammer and the ultrasonic pulse method alone. The calibration relationships show close correlation and are applicable in practice. The highest correlation between nondestructive measurement parameters and predicted compressive strength is obtained with the combined SonReb nondestructive method. The combined SonReb method proved applicable for determining the compressive strength of calcium silicate bricks in checking tests at a production plant and for evaluating bricks built into existing masonry structures. PMID:25276864

  7. Review of Artificial Abrasion Test Methods for PV Module Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, David C.; Muller, Matt T.; Simpson, Lin J.

This review is intended to identify the method or methods--and the basic details of those methods--that might be used to develop an artificial abrasion test. Methods used in the PV literature were compared with their closest implementation in existing standards. Also, meetings of the International PV Quality Assurance Task Force Task Group 12-3 (TG12-3, which is concerned with coated glass) were used to identify established test methods. Feedback from the group, which included many of the authors from the PV literature, included insights not explored within the literature itself. The combined experience and examples from the literature are intended to provide an assessment of the present industry practices and an informed path forward. Recommendations toward artificial abrasion test methods are then identified based on the experiences in the literature and feedback from the PV community. The review here is strictly focused on abrasion. Assessment methods, including optical performance (e.g., transmittance or reflectance), surface energy, and verification of chemical composition, were not examined. Methods of artificially soiling PV modules or other specimens were not examined. The weathering of artificially or naturally soiled specimens (which may ultimately include combined temperature and humidity, thermal cycling, and ultraviolet light) was also not examined. A sense of the purpose or application of an abrasion test method within the PV industry should, however, be evident from the literature.

  8. Training Feedback Handbook. Research Product 83-7.

    ERIC Educational Resources Information Center

    Burnside, Billy L.; And Others

    This handbook is designed to assist training developers and evaluators in structuring their collection of feedback data. Addressed first are various methods for collecting feedback data, including informal feedback, existing unit performance records, questionnaires, structured interviews, systematic observation, and testing. The next chapter, a…

  9. Improved design of electrophoretic equipment for rapid sickle-cell-anemia screening

    NASA Technical Reports Server (NTRS)

    Reddick, J. M.; Hirsch, I.

    1974-01-01

Effective mass screening may be accomplished by modifying existing electrophoretic equipment in conjunction with a multisample applicator used with cellulose-acetate-matrix test paper. Using this method, approximately 20 to 25 samples can undergo electrophoresis in 5 to 6 minutes.

  10. A comparison of optical gradation analysis devices to current test methods--phase 2.

    DOT National Transportation Integrated Search

    2012-04-01

Optical devices are being developed to deliver accurate size and shape of aggregate particles with less labor, less consistency error, and greater reliability. This study was initiated to review the existing technology and generate basic data to ...

  11. Development and testing of a weed wiper for roadside vegetative control.

    DOT National Transportation Integrated Search

    2002-10-01

The objective of the project was to investigate the potential of using the weed wiper applicator as an alternative method to mowing and broadcast spraying for controlling noxious weeds, brush, and plant growth along roadways. An existing weed wiper...

  12. Glossary of reference terms for alternative test methods and their validation.

    PubMed

    Ferrario, Daniele; Brustio, Roberta; Hartung, Thomas

    2014-01-01

This glossary was developed to provide technical references to support work in the field of alternatives to animal testing. It was compiled from various existing reference documents coming from different sources and is meant to be a point of reference on alternatives to animal testing. Given the ever-increasing number of alternative test methods and approaches being developed over the last decades, a combination, revision, and harmonization of earlier published collections of terms used in the validation of such methods is required. The need to update previous glossary efforts came from the acknowledgement that new words have emerged with the development of new approaches, while others have become obsolete, and the meaning of some terms has partially changed over time. With this glossary we intend to provide guidance on issues related to the validation of new or updated testing methods consistent with current approaches. Moreover, because of new developments and technologies, a glossary needs to be a living, constantly updated document. An Internet-based version of this compilation may be found at http://altweb.jhsph.edu/, allowing the addition of new material.

  13. Statistical Tests of System Linearity Based on the Method of Surrogate Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hunter, N.; Paez, T.; Red-Horse, J.

When dealing with measured data from dynamic systems we often make the tacit assumption that the data are generated by linear dynamics. While some systematic tests for linearity and determinism are available - for example the coherence function, the probability density function, and the bispectrum - further tests that quantify the existence and the degree of nonlinearity are clearly needed. In this paper we demonstrate a statistical test for the nonlinearity exhibited by a dynamic system excited by Gaussian random noise. We perform the usual division of the input and response time series data into blocks as required by the Welch method of spectrum estimation and search for significant relationships between a given input frequency and response at harmonics of the selected input frequency. We argue that systematic tests based on the recently developed statistical method of surrogate data readily detect significant nonlinear relationships. The paper elucidates the method of surrogate data. Typical results are illustrated for a linear single degree-of-freedom system and for a system with polynomial stiffness nonlinearity.
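The surrogate-data logic can be sketched in a few lines. This is a generic illustration, not the authors' harmonic-relationship statistic: surrogates share the original series' power spectrum but have randomized Fourier phases, and any chosen nonlinearity statistic is ranked against its surrogate distribution (the third-moment statistic below is just a placeholder choice).

```python
import numpy as np

def phase_randomized_surrogate(x, rng):
    """Surrogate series: same power spectrum as x, randomized Fourier phases."""
    n = len(x)
    spec = np.fft.rfft(x)
    phases = rng.uniform(0.0, 2.0 * np.pi, len(spec))
    phases[0] = 0.0                 # leave the DC bin untouched
    if n % 2 == 0:
        phases[-1] = 0.0            # Nyquist bin must remain real for even n
    return np.fft.irfft(spec * np.exp(1j * phases), n)

def surrogate_test(x, statistic, n_surrogates=99, seed=0):
    """Rank-based p-value for statistic(x) against linear-process surrogates."""
    rng = np.random.default_rng(seed)
    s0 = statistic(x)
    exceed = sum(statistic(phase_randomized_surrogate(x, rng)) >= s0
                 for _ in range(n_surrogates))
    return s0, (1 + exceed) / (n_surrogates + 1)

def third_moment_stat(x):
    """Placeholder statistic: |skewness|, near zero for linear Gaussian data."""
    x = x - x.mean()
    return abs(np.mean(x**3)) / np.mean(x**2) ** 1.5
```

A quadratically distorted Gaussian series is flagged with a small p-value, while a plain Gaussian series yields an approximately uniform one.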

  14. A new statistic for identifying batch effects in high-throughput genomic data that uses guided principal component analysis.

    PubMed

    Reese, Sarah E; Archer, Kellie J; Therneau, Terry M; Atkinson, Elizabeth J; Vachon, Celine M; de Andrade, Mariza; Kocher, Jean-Pierre A; Eckel-Passow, Jeanette E

    2013-11-15

    Batch effects are due to probe-specific systematic variation between groups of samples (batches) resulting from experimental features that are not of biological interest. Principal component analysis (PCA) is commonly used as a visual tool to determine whether batch effects exist after applying a global normalization method. However, PCA yields linear combinations of the variables that contribute maximum variance and thus will not necessarily detect batch effects if they are not the largest source of variability in the data. We present an extension of PCA to quantify the existence of batch effects, called guided PCA (gPCA). We describe a test statistic that uses gPCA to test whether a batch effect exists. We apply our proposed test statistic derived using gPCA to simulated data and to two copy number variation case studies: the first study consisted of 614 samples from a breast cancer family study using Illumina Human 660 bead-chip arrays, whereas the second case study consisted of 703 samples from a family blood pressure study that used Affymetrix SNP Array 6.0. We demonstrate that our statistic has good statistical properties and is able to identify significant batch effects in two copy number variation case studies. We developed a new statistic that uses gPCA to identify whether batch effects exist in high-throughput genomic data. Although our examples pertain to copy number data, gPCA is general and can be used on other data types as well. The gPCA R package (Available via CRAN) provides functionality and data to perform the methods in this article. reesese@vcu.edu
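The guided-PCA idea can be rendered compactly. The sketch below is a minimal reading of the abstract, not the published gPCA implementation (the exact form of the delta statistic and its null distribution may differ): a batch indicator matrix steers the principal-component search, and permuting batch labels supplies the reference distribution.

```python
import numpy as np

def gpca_delta(X, batch):
    """delta = batch-guided variance share / first-principal-component variance.

    X: (n samples x p features) data matrix; batch: length-n batch labels.
    Values of delta near 1 suggest the batch is the leading source of variation.
    """
    X = X - X.mean(axis=0)                       # column-centre the data
    batch = np.asarray(batch)
    Y = (batch[:, None] == np.unique(batch)[None, :]).astype(float)  # indicators
    _, _, Vt = np.linalg.svd(X, full_matrices=False)        # ordinary PCA
    var_pc1 = np.var(X @ Vt[0])
    _, _, Gt = np.linalg.svd(Y.T @ X, full_matrices=False)  # batch-guided PCA
    var_guided = np.var(X @ Gt[0])
    return var_guided / var_pc1

def gpca_test(X, batch, n_perm=199, seed=0):
    """Permutation p-value: is delta larger than under shuffled batch labels?"""
    rng = np.random.default_rng(seed)
    d0 = gpca_delta(X, batch)
    exceed = sum(gpca_delta(X, rng.permutation(np.asarray(batch))) >= d0
                 for _ in range(n_perm))
    return d0, (1 + exceed) / (n_perm + 1)
```

On data with a strong batch shift, delta approaches 1 and the permutation p-value becomes small; the released gPCA R package should be preferred for real analyses.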

  15. A method for evaluating the importance of system state observations to model predictions, with application to the Death Valley regional groundwater flow system

    USGS Publications Warehouse

    Tiedeman, Claire; Ely, D. Matthew; Hill, Mary C.; O'Brien, Grady M.

    2004-01-01

    We develop a new observation‐prediction (OPR) statistic for evaluating the importance of system state observations to model predictions. The OPR statistic measures the change in prediction uncertainty produced when an observation is added to or removed from an existing monitoring network, and it can be used to guide refinement and enhancement of the network. Prediction uncertainty is approximated using a first‐order second‐moment method. We apply the OPR statistic to a model of the Death Valley regional groundwater flow system (DVRFS) to evaluate the importance of existing and potential hydraulic head observations to predicted advective transport paths in the saturated zone underlying Yucca Mountain and underground testing areas on the Nevada Test Site. Important existing observations tend to be far from the predicted paths, and many unimportant observations are in areas of high observation density. These results can be used to select locations at which increased observation accuracy would be beneficial and locations that could be removed from the network. Important potential observations are mostly in areas of high hydraulic gradient far from the paths. Results for both existing and potential observations are related to the flow system dynamics and coarse parameter zonation in the DVRFS model. If system properties in different locations are as similar as the zonation assumes, then the OPR results illustrate a data collection opportunity whereby observations in distant, high‐gradient areas can provide information about properties in flatter‐gradient areas near the paths. If this similarity is suspect, then the analysis produces a different type of data collection opportunity involving testing of model assumptions critical to the OPR results.

  16. Antepartum evaluation of the fetus and fetal well being.

    PubMed

    O'Neill, Erica; Thorp, John

    2012-09-01

    Despite widespread use of many methods of antenatal testing, limited evidence exists to demonstrate effectiveness at improving perinatal outcomes. An exception is the use of Doppler ultrasound in monitoring high-risk pregnancies thought to be at risk of placental insufficiency. Otherwise, obstetricians should proceed with caution and approach the initiation of a testing protocol by obtaining an informed consent. When confronted with an abnormal test, clinicians should evaluate with a second antenatal test and consider administering betamethasone, performing amniocentesis to assess lung maturity, and/or repeating testing to minimize the chance of iatrogenic prematurity in case of a healthy fetus.

  17. Proposed Objective Odor Control Test Methodology for Waste Containment

    NASA Technical Reports Server (NTRS)

    Vos, Gordon

    2010-01-01

The Orion Cockpit Working Group has requested that an odor control testing methodology be proposed to evaluate the odor containment effectiveness of waste disposal bags to be flown on the Orion Crew Exploration Vehicle. As a standardized "odor containment" test does not appear to be a matter of record for the project, a new test method is being proposed. This method is based on existing test methods used in industrial hygiene for the evaluation of respirator fit in occupational settings, and takes into consideration peer-reviewed documentation of human odor thresholds for standardized contaminants, industry-standard atmospheric testing methodologies, and established criteria for laboratory analysis. The proposed methodology is quantitative, though it can readily be complemented with a qualitative subjective assessment. Isoamyl acetate (IAA, also known as isopentyl acetate) is commonly used in respirator fit testing, and there are documented methodologies for measuring its airborne concentrations quantitatively. IAA is a clear, colorless liquid with a banana-like odor, a documented detectable smell threshold for humans of 0.025 PPM, and a 15 PPB limit of quantitation.

  18. Transformation-invariant and nonparametric monotone smooth estimation of ROC curves.

    PubMed

    Du, Pang; Tang, Liansheng

    2009-01-30

    When a new diagnostic test is developed, it is of interest to evaluate its accuracy in distinguishing diseased subjects from non-diseased subjects. The accuracy of the test is often evaluated by receiver operating characteristic (ROC) curves. Smooth ROC estimates are often preferable for continuous test results when the underlying ROC curves are in fact continuous. Nonparametric and parametric methods have been proposed by various authors to obtain smooth ROC curve estimates. However, there are certain drawbacks with the existing methods. Parametric methods need specific model assumptions. Nonparametric methods do not always satisfy the inherent properties of the ROC curves, such as monotonicity and transformation invariance. In this paper we propose a monotone spline approach to obtain smooth monotone ROC curves. Our method ensures important inherent properties of the underlying ROC curves, which include monotonicity, transformation invariance, and boundary constraints. We compare the finite sample performance of the newly proposed ROC method with other ROC smoothing methods in large-scale simulation studies. We illustrate our method through a real life example. Copyright (c) 2008 John Wiley & Sons, Ltd.
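The monotonicity property the authors emphasize can be illustrated without their spline machinery. The sketch below is not the paper's estimator: it smooths the ROC with a Gaussian kernel and then enforces monotonicity and the boundary constraints with the pool-adjacent-violators algorithm (a swapped-in technique), purely to show the constraints in action.

```python
import math
import numpy as np

def pava(y):
    """Pool-adjacent-violators: least-squares non-decreasing fit to y."""
    level, weight = [], []
    for v in y:
        level.append(float(v)); weight.append(1)
        while len(level) > 1 and level[-2] > level[-1]:   # merge violating blocks
            w = weight[-2] + weight[-1]
            level[-2] = (level[-2] * weight[-2] + level[-1] * weight[-1]) / w
            weight[-2] = w
            level.pop(); weight.pop()
    out = []
    for v, w in zip(level, weight):
        out.extend([v] * w)
    return np.array(out)

def smooth_monotone_roc(nondiseased, diseased, bandwidth=0.5, n_grid=101):
    """Kernel-smoothed ROC estimate, isotonized so TPR is non-decreasing in FPR."""
    grid = np.linspace(0.0, 1.0, n_grid)
    def smooth_surv(sample, t):   # smoothed P(score > t) with a Gaussian kernel
        return float(np.mean([1 - 0.5 * (1 + math.erf((t - s) / (bandwidth * math.sqrt(2))))
                              for s in sample]))
    thresholds = np.quantile(np.asarray(nondiseased), 1 - grid)  # FPR ~= grid
    tpr = pava([smooth_surv(diseased, t) for t in thresholds])
    tpr = np.clip(tpr, 0.0, 1.0)
    tpr[0], tpr[-1] = 0.0, 1.0    # boundary constraints ROC(0)=0, ROC(1)=1
    return grid, tpr
```

The returned curve is smooth, monotone, and pinned at the corners, matching the inherent ROC properties the paper requires.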

  19. Study of solution procedures for nonlinear structural equations

    NASA Technical Reports Server (NTRS)

    Young, C. T., II; Jones, R. F., Jr.

    1980-01-01

A method for the reduction of the cost of solution of large nonlinear structural equations was developed. Verification was made using the MARC-STRUC structure finite element program with test cases involving single and multiple degrees of freedom for static geometric nonlinearities. The method developed was designed to exist within the envelope of accuracy and convergence characteristic of the particular finite element methodology used.

  20. Evaluation of methods for determining hardware projected life

    NASA Technical Reports Server (NTRS)

    1971-01-01

An investigation of existing methods of predicting hardware life is summarized by reviewing programs having long life requirements, current research efforts on long life problems, and technical papers reporting work on life prediction techniques. The results indicate that there are no accurate quantitative means to predict hardware life for system-level hardware. The effectiveness of test programs and the causes of hardware failures are also considered.

  1. Poisson Approximation-Based Score Test for Detecting Association of Rare Variants.

    PubMed

    Fang, Hongyan; Zhang, Hong; Yang, Yaning

    2016-07-01

Genome-wide association study (GWAS) has achieved great success in identifying genetic variants, but the nature of GWAS has determined its inherent limitations. Under the common disease rare variants (CDRV) hypothesis, the traditional association analysis methods commonly used in GWAS for common variants do not have enough power for detecting rare variants with a limited sample size. As a solution to this problem, pooling rare variants by their functions provides an efficient way for identifying susceptible genes. Rare variants typically have low minor allele frequencies, and the distribution of the total number of minor alleles of the rare variants can be approximated by a Poisson distribution. Based on this fact, we propose a new test method, the Poisson Approximation-based Score Test (PAST), for association analysis of rare variants. Two testing methods, namely, ePAST and mPAST, are proposed based on different strategies of pooling rare variants. Simulation results and application to the CRESCENDO cohort data show that our methods are more powerful than the existing methods. © 2016 John Wiley & Sons Ltd/University College London.
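The abstract does not give the PAST statistic itself, so the following is only a toy rendering of the underlying idea: pool the minor-allele counts of a gene's rare variants, treat the pooled counts in cases and controls as approximately Poisson, and form a score-type z statistic by conditioning on the total count. The ePAST/mPAST pooling strategies are not reproduced here.

```python
import math
import numpy as np

def pooled_rare_variant_test(case_geno, control_geno):
    """Toy Poisson-approximation test on pooled minor-allele counts.

    case_geno, control_geno: (individuals x variants) arrays of minor-allele
    counts (0/1/2) for the rare variants in one gene or region.
    """
    c1, c0 = int(case_geno.sum()), int(control_geno.sum())
    n1, n0 = len(case_geno), len(control_geno)
    total = c1 + c0
    if total == 0:
        return 0.0, 1.0
    # Under H0 both pooled counts are Poisson with rates proportional to the
    # sample sizes; conditioning on the total yields a binomial score test.
    expected = total * n1 / (n1 + n0)
    variance = expected * n0 / (n1 + n0)
    z = (c1 - expected) / math.sqrt(variance)
    p = math.erfc(abs(z) / math.sqrt(2))   # two-sided normal p-value
    return z, p
```

A gene whose rare alleles are enriched in cases yields a large positive z; equal pooled counts in equal-sized groups yield z = 0.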

  2. Alternative methods for toxicity assessments in fish: comparison of the fish embryo toxicity and the larval growth and survival tests in zebrafish and fathead minnows.

    PubMed

    Jeffries, Marlo K Sellin; Stultz, Amy E; Smith, Austin W; Rawlings, Jane M; Belanger, Scott E; Oris, James T

    2014-11-01

An increased demand for chemical toxicity evaluations has resulted in the need for alternative testing strategies that address animal welfare concerns. The fish embryo toxicity (FET) test developed for zebrafish (Danio rerio) is one such alternative, and the application of the FET test to other species such as the fathead minnow (Pimephales promelas) has been proposed. In the present study, the performances of the FET test and the larval growth and survival (LGS; a standard toxicity testing method) test in zebrafish and fathead minnows were evaluated. This required that testing methods for the fathead minnow FET and zebrafish LGS tests be harmonized with existing test methods and that the performance of these testing strategies be evaluated by comparing the median lethal concentrations of 2 reference toxicants, 3,4-dichloroaniline and ammonia, obtained via each of the test types. The results showed that procedures for the zebrafish FET test can be adapted and applied to the fathead minnow. Differences in test sensitivity were observed for 3,4-dichloroaniline but not ammonia; therefore, conclusions regarding which test types offer the least or most sensitivity could not be made. Overall, these results show that the fathead minnow FET test has potential as an alternative toxicity testing strategy and that further analysis with other toxicants is warranted in an effort to better characterize the sensitivity and feasibility of this testing strategy. © 2014 SETAC.

  3. Fault Tree Analysis Application for Safety and Reliability

    NASA Technical Reports Server (NTRS)

    Wallace, Dolores R.

    2003-01-01

Many commercial software tools exist for fault tree analysis (FTA), an accepted method for mitigating risk in systems. The method embedded in these tools identifies root causes in system components, but when software is identified as a root cause, it does not build trees into the software component. No commercial software tools have been built specifically for development and analysis of software fault trees. Research indicates that the methods of FTA could be applied to software, but the method is not practical without automated tool support. With appropriate automated tool support, software fault tree analysis (SFTA) may be a practical technique for identifying the underlying cause of software faults that may lead to critical system failures. We strive to demonstrate that existing commercial tools for FTA can be adapted for use with SFTA, and that, applied to a safety-critical system, SFTA can be used to identify serious potential problems long before integration and system testing.

  4. A study protocol for enriching the interprofessional collaborative competency attainment survey insights.

    PubMed

    Jackson, Jeffrey B

    2017-11-01

The following short report outlines a proposed study designed to evaluate the Interprofessional Collaborative Competency Attainment Survey and its recommended method of administration. This exploratory study seeks to determine whether there is a significant difference between two methods of administration: the recommended and validated retrospective pre-test and post-test, and a traditional pre-test and post-test. If a significant difference does exist, these data will provide a means to determine the effect size of that difference. The comparison will be done using repeated-measures ANOVA, and the subsequent effect size will be evaluated using Cohen's d. As the retrospective design is utilised to evaluate a change in perceived competency, comparison of data from a traditional pre-test with a retrospective pre-test may provide a means for evaluating the participants' change in understanding of the construct, and thus a more thorough picture of the forces driving changes to scores.

  5. Evaluation of techniques for increasing recall in a dictionary approach to gene and protein name identification.

    PubMed

    Schuemie, Martijn J; Mons, Barend; Weeber, Marc; Kors, Jan A

    2007-06-01

    Gene and protein name identification in text requires a dictionary approach to relate synonyms to the same gene or protein, and to link names to external databases. However, existing dictionaries are incomplete. We investigate two complementary methods for automatic generation of a comprehensive dictionary: combination of information from existing gene and protein databases and rule-based generation of spelling variations. Both methods have been reported in literature before, but have hitherto not been combined and evaluated systematically. We combined gene and protein names from several existing databases of four different organisms. The combined dictionaries showed a substantial increase in recall on three different test sets, as compared to any single database. Application of 23 spelling variation rules to the combined dictionaries further increased recall. However, many rules appeared to have no effect and some appear to have a detrimental effect on precision.

  6. Existence of the sugar-bisulfite adducts and its inhibiting effect on degradation of monosaccharide in acid system.

    PubMed

    Shi, Yan

    2014-02-01

Degradation of fermentable monosaccharides is one of the primary concerns for acid prehydrolysis of lignocellulosic biomass. Recently, in our research on degradation of pure monosaccharides in aqueous SO₂ solution by gas chromatography (GC) analysis, we found that the detected yield was not the actual yield of each monosaccharide due to the existence of sugar-bisulfite adducts, and we developed a new method that allows accurate determination of the recovery yield of each monosaccharide in aqueous SO₂ solution by GC analysis. Using this method, degradation of each monosaccharide in aqueous SO₂ was investigated, and the results showed that sugar-bisulfite adducts have different inhibiting effects on degradation of each monosaccharide in aqueous SO₂ because of their different stability. In addition, NMR testing also demonstrated the possible existence of a reaction between the conjugate base HSO₃(-) and the aldehyde group of sugars in the acid system.

  7. Ice Shape Scaling for Aircraft in SLD Conditions

    NASA Technical Reports Server (NTRS)

    Anderson, David N.; Tsao, Jen-Ching

    2008-01-01

    This paper has summarized recent NASA research into scaling of SLD conditions with data from both SLD and Appendix C tests. Scaling results obtained by applying existing scaling methods for size and test-condition scaling will be reviewed. Large feather growth issues, including scaling approaches, will be discussed briefly. The material included applies only to unprotected, unswept geometries. Within the limits of the conditions tested to date, the results show that the similarity parameters needed for Appendix C scaling also can be used for SLD scaling, and no additional parameters are required. These results were based on visual comparisons of reference and scale ice shapes. Nearly all of the experimental results presented have been obtained in sea-level tunnels. The currently recommended methods to scale model size, icing limit and test conditions are described.

  8. Testing for significance of phase synchronisation dynamics in the EEG.

    PubMed

    Daly, Ian; Sweeney-Reed, Catherine M; Nasuto, Slawomir J

    2013-06-01

    A number of tests exist to check for statistical significance of phase synchronisation within the Electroencephalogram (EEG); however, the majority suffer from a lack of generality and applicability. They may also fail to account for temporal dynamics in the phase synchronisation, regarding synchronisation as a constant state instead of a dynamical process. Therefore, a novel test is developed for identifying the statistical significance of phase synchronisation based upon a combination of work characterising temporal dynamics of multivariate time-series and Markov modelling. We show how this method is better able to assess the significance of phase synchronisation than a range of commonly used significance tests. We also show how the method may be applied to identify and classify significantly different phase synchronisation dynamics in both univariate and multivariate datasets.
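The paper's Markov-model test of synchronisation dynamics is beyond a short sketch, but the phase-synchronisation quantity whose dynamics such tests examine is easy to show: extract instantaneous phases with an FFT-based analytic signal (equivalent to a Hilbert transform) and compute the phase-locking value. This is a generic illustration, not the authors' significance test.

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via the FFT (same construction as scipy.signal.hilbert)."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1
    if n % 2 == 0:
        h[n // 2] = 1
        h[1:n // 2] = 2       # double positive frequencies, zero negative ones
    else:
        h[1:(n + 1) // 2] = 2
    return np.fft.ifft(X * h)

def phase_locking_value(x, y):
    """PLV in [0, 1]: 1 = constant phase difference, ~0 = no phase relation."""
    dphi = np.angle(analytic_signal(x)) - np.angle(analytic_signal(y))
    return abs(np.mean(np.exp(1j * dphi)))
```

Two equal-frequency signals with a fixed lag give a PLV near 1, while a signal paired with independent noise gives a PLV near 0; a significance test then asks how the observed PLV (or its temporal dynamics) compares to a suitable null model.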

  9. Using Pre-test/Post-test Data To Evaluate the Effectiveness of Computer Aided Instruction (A Study of CAI and Its Use with Developmental Reading Students).

    ERIC Educational Resources Information Center

    Lansford, Carl E.

    As computer aided instruction (CAI) and distance learning become more popular, a model for easily evaluating these teaching methods must be developed, one which will enable replication of the study each year. This paper discusses the results of a study using existing dependent and independent variables to evaluate CAI for developmental reading…

  10. Multiple, correlated covariates associated with differential item functioning (DIF): Accounting for language DIF when education levels differ across languages.

    PubMed

    Gibbons, Laura E; Crane, Paul K; Mehta, Kala M; Pedraza, Otto; Tang, Yuxiao; Manly, Jennifer J; Narasimhalu, Kaavya; Teresi, Jeanne; Jones, Richard N; Mungas, Dan

    2011-04-28

    Differential item functioning (DIF) occurs when a test item has different statistical properties in subgroups, controlling for the underlying ability measured by the test. DIF assessment is necessary when evaluating measurement bias in tests used across different language groups. However, other factors such as educational attainment can differ across language groups, and DIF due to these other factors may also exist. How to conduct DIF analyses in the presence of multiple, correlated factors remains largely unexplored. This study assessed DIF related to Spanish versus English language in a 44-item object naming test. Data come from a community-based sample of 1,755 Spanish- and English-speaking older adults. We compared simultaneous accounting, a new strategy for handling differences in educational attainment across language groups, with existing methods. Compared to other methods, simultaneously accounting for language- and education-related DIF yielded salient differences in some object naming scores, particularly for Spanish speakers with at least 9 years of education. Accounting for factors that vary across language groups can be important when assessing language DIF. The use of simultaneous accounting will be relevant to other cross-cultural studies in cognition and in other fields, including health-related quality of life.

  12. Attrition Rate of Oxygen Carriers in Chemical Looping Combustion Systems

    NASA Astrophysics Data System (ADS)

    Feilen, Harry Martin

This project developed an evaluation methodology for determining, accurately and rapidly, the attrition resistance of oxygen carrier materials used in chemical looping technologies. Existing test protocols for evaluating the attrition resistance of granular materials are conducted under non-reactive and ambient-temperature conditions. They do not accurately reflect the actual behavior under the unique process conditions of chemical looping, including high temperatures and cyclic operation between oxidizing and reducing atmospheres. This project developed a test method and equipment that represent a significant improvement over existing protocols. Experimental results obtained from this project have shown that hematite exhibits different modes of attrition, both due to mechanical stresses and due to structural changes in the particles caused by chemical reaction at high temperature. The test methodology has also proven effective in tracking reactivity changes of the material with continued use, a property which, in addition to attrition, determines material life. Consumption/replacement cost due to attrition or loss of reactivity is a critical factor in the economic application of chemical looping technology. This test method will allow rapid evaluation of a wide range of materials to find those best suited for the technology. The most important anticipated public benefit of this project is the acceleration of the development of chemical looping technology for lowering greenhouse gas emissions from fossil fuel combustion.

  13. Methods for meta-analysis of multiple traits using GWAS summary statistics.

    PubMed

    Ray, Debashree; Boehnke, Michael

    2018-03-01

    Genome-wide association studies (GWAS) for complex diseases have focused primarily on single-trait analyses for disease status and disease-related quantitative traits. For example, GWAS on risk factors for coronary artery disease analyze genetic associations of plasma lipids such as total cholesterol, LDL-cholesterol, HDL-cholesterol, and triglycerides (TGs) separately. However, traits are often correlated and a joint analysis may yield increased statistical power for association over multiple univariate analyses. Recently several multivariate methods have been proposed that require individual-level data. Here, we develop metaUSAT (where USAT is unified score-based association test), a novel unified association test of a single genetic variant with multiple traits that uses only summary statistics from existing GWAS. Although the existing methods either perform well when most correlated traits are affected by the genetic variant in the same direction or are powerful when only a few of the correlated traits are associated, metaUSAT is designed to be robust to the association structure of correlated traits. metaUSAT does not require individual-level data and can test genetic associations of categorical and/or continuous traits. One can also use metaUSAT to analyze a single trait over multiple studies, appropriately accounting for overlapping samples, if any. metaUSAT provides an approximate asymptotic P-value for association and is computationally efficient for implementation at a genome-wide level. Simulation experiments show that metaUSAT maintains proper type-I error at low error levels. It has similar and sometimes greater power to detect association across a wide array of scenarios compared to existing methods, which are usually powerful for some specific association scenarios only. 
When applied to plasma lipids summary data from the METSIM and the T2D-GENES studies, metaUSAT detected genome-wide significant loci beyond those identified by univariate analyses. Evidence from larger studies suggests that the variants additionally detected by our test are indeed associated with lipid levels in humans. In summary, metaUSAT can provide novel insights into the genetic architecture of common diseases and traits. © 2017 WILEY PERIODICALS, INC.
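
    A common summary-statistics combination that multi-trait methods of this kind build on is the omnibus score test T = z'R⁻¹z on the vector of per-trait Z-scores. The sketch below implements only that generic statistic, not metaUSAT itself; the Z-scores and trait-correlation matrix are hypothetical illustrations.

```python
import numpy as np

def multitrait_score_stat(z, R):
    """Omnibus multi-trait statistic from GWAS summary data for one variant:
    T = z^T R^{-1} z, chi-square with len(z) degrees of freedom under the
    null, where z holds the per-trait Z-scores and R their correlation
    (estimable from genome-wide summary statistics). This is the generic
    combination, not metaUSAT itself."""
    z = np.asarray(z, dtype=float)
    return float(z @ np.linalg.solve(R, z))

z = np.array([2.0, -1.5, 0.5])   # hypothetical Z-scores for three lipid traits
R = np.eye(3)                    # uncorrelated traits, for illustration
T = multitrait_score_stat(z, R)  # reduces to sum(z**2) when R = I
```

    With correlated traits, R is no longer the identity and the statistic down-weights redundant signals; the resulting P-value comes from the chi-square distribution with len(z) degrees of freedom.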

  14. Cavitation in liquid cryogens. 4: Combined correlations for venturi, hydrofoil, ogives, and pumps

    NASA Technical Reports Server (NTRS)

    Hord, J.

    1974-01-01

    The results of a series of experimental and analytical cavitation studies are presented. The developed-cavity data for a venturi, a hydrofoil, and three scaled ogives are cross-correlated. The new correlating parameter, MTWO, improves data correlation for these stationary bodies and for pumping equipment. Existing techniques for predicting the cavitating performance of pumping machinery were extended to include variations in flow coefficient, cavitation parameter, and equipment geometry. The new predictive formulations hold promise as a design tool and a universal method for correlating pumping machinery performance. Application of these predictive formulas requires prescribed cavitation test data or an independent method of estimating the cavitation parameter for each pump. The latter would permit prediction of performance without testing; potential methods for evaluating the cavitation parameter prior to testing are suggested.

  15. 32 CFR 22.105 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... applying existing technology to new products and processes in a general way. Advanced research is most... Category 6.3A) programs within Research, Development, Test and Evaluation (RDT&E). Applied research... technology such as new materials, devices, methods and processes. It typically is funded in Applied Research...

  16. Efficient design of CMOS TSC checkers

    NASA Technical Reports Server (NTRS)

    Biddappa, Anita; Shamanna, Manjunath K.; Maki, Gary; Whitaker, Sterling

    1990-01-01

    This paper considers the design of an efficient, robustly testable CMOS Totally Self-Checking (TSC) checker for k-out-of-2k codes. Most existing implementations use primitive gates and assume the single stuck-at fault model. The self-testing property has been found to fail for CMOS TSC checkers under the stuck-open fault model, due to timing skews and arbitrary delays in the circuit. A new four-level design using CMOS primitive gates (NAND, NOR, inverters) is presented. This design retains its properties under the stuck-open fault model. Additionally, this method offers an impressive reduction (greater than 70 percent) in gate count, gate inputs, and test set size when compared to the existing method. This implementation is easily realizable and is based on Anderson's technique. A thorough comparative study of the proposed implementation and Kundu's implementation indicates that the proposed one is better than Kundu's in all respects for k-out-of-2k codes.

  17. Quantitative optical scanning tests of complex microcircuits

    NASA Technical Reports Server (NTRS)

    Erickson, J. J.

    1980-01-01

    An approach for the development of the optical scanner as a screening inspection instrument for microcircuits involves comparing the quantitative differences in photoresponse images and then correlating them with electrical parameter differences in test devices. The existing optical scanner was modified so that the photoresponse data could be recorded and subsequently digitized. A method was devised for applying digital image processing techniques to the digitized photoresponse data in order to quantitatively compare the data. Electrical tests were performed and photoresponse images were recorded before and following life test intervals on two groups of test devices. Correlations were made between differences or changes in the electrical parameters of the test devices.

  18. Theory and practice of conventional adventitious virus testing.

    PubMed

    Gregersen, Jens-Peter

    2011-01-01

    CONFERENCE PROCEEDING: Proceedings of the PDA/FDA Adventitious Viruses in Biologics: Detection and Mitigation Strategies Workshop in Bethesda, MD, USA; December 1-3, 2010. Guest Editors: Arifa Khan (Bethesda, MD), Patricia Hughes (Bethesda, MD) and Michael Wiebe (San Francisco, CA). For decades, conventional tests in cell cultures and in laboratory animals have served as standard methods for broad-spectrum screening for adventitious viruses. New virus detection methods based on molecular biology have broadened and improved our knowledge about potential contaminating viruses and about the suitability of the conventional test methods. This paper summarizes and discusses practical aspects of conventional test schemes, such as detectability of various viruses, questionable or false-positive results, animal numbers needed, time and cost of testing, and applicability to rapidly changing starting materials. Strategies to improve the virus safety of biological medicinal products are proposed. These strategies should be based upon flexible application of existing and new methods, along with a scientifically based risk assessment. However, testing alone does not guarantee the absence of adventitious agents and must be accompanied by virus-removing or virus-inactivating process steps for critical starting materials, raw materials, and the drug product.

  19. State of the art on alternative methods to animal testing from an industrial point of view: ready for regulation?

    PubMed

    Ashton, Rachel; De Wever, Bart; Fuchs, Horst W; Gaca, Marianna; Hill, Erin; Krul, Cyrille; Poth, Albrecht; Roggen, Erwin L

    2014-01-01

    Despite changing attitudes towards animal testing and current legislation to protect experimental animals, the rate of animal experiments seems to have changed little in recent years. On May 15-16, 2013, the In Vitro Testing Industrial Platform (IVTIP) held an open meeting to discuss the state of the art in alternative methods, how companies have, can, and will need to adapt and what drives and hinders regulatory acceptance and use. Several key messages arose from the meeting. First, industry and regulatory bodies should not wait for complete suites of alternative tests to become available, but should begin working with methods available right now (e.g., mining of existing animal data to direct future studies, implementation of alternative tests wherever scientifically valid rather than continuing to rely on animal tests) in non-animal and animal integrated strategies to reduce the numbers of animals tested. Sharing of information (communication), harmonization and standardization (coordination), commitment and collaboration are all required to improve the quality and speed of validation, acceptance, and implementation of tests. Finally, we consider how alternative methods can be used in research and development before formal implementation in regulations. Here we present the conclusions on what can be done already and suggest some solutions and strategies for the future.

  20. A modified form of conjugate gradient method for unconstrained optimization problems

    NASA Astrophysics Data System (ADS)

    Ghani, Nur Hamizah Abdul; Rivaie, Mohd.; Mamat, Mustafa

    2016-06-01

    Conjugate gradient (CG) methods are widely recognized techniques for solving optimization problems, owing to their numerical efficiency, simplicity, and low memory requirements. In this paper, we propose a new CG method based on the study of Rivaie et al. [7] (Comparative study of conjugate gradient coefficient for unconstrained optimization, Aus. J. Bas. Appl. Sci. 5 (2011) 947-951). We then show that our method satisfies the sufficient descent condition and converges globally with exact line search. Numerical results on standard test problems show that our proposed method is efficient compared with other existing CG methods.
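
    To make the ingredients concrete (search direction, exact line search, and the CG coefficient beta, which is where variants such as that of Rivaie et al. differ), here is a minimal sketch of CG on a quadratic, where the exact line search has a closed form. The Fletcher-Reeves coefficient is used as a generic stand-in, not the proposed method.

```python
import numpy as np

def cg_quadratic(A, b, x0, tol=1e-10, max_iter=100):
    """Conjugate gradient for f(x) = 0.5*x^T A x - b^T x, with A symmetric
    positive definite. On a quadratic, the exact line search step alpha and
    the CG coefficient beta have closed forms; on general functions, the
    choice of beta is what distinguishes the many CG variants."""
    x = np.asarray(x0, dtype=float)
    g = A @ x - b                          # gradient of the quadratic
    d = -g                                 # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        Ad = A @ d
        alpha = (g @ g) / (d @ Ad)         # exact line search step
        x = x + alpha * d
        g_new = g + alpha * Ad             # gradient at the new point
        beta = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves coefficient
        d = -g_new + beta * d              # next conjugate direction
        g = g_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = cg_quadratic(A, b, np.zeros(2))        # minimizer solves A x = b
```

    The low memory footprint mentioned in the abstract is visible here: only the current point, gradient, and direction vectors are stored, with no matrix factorizations.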

  1. TEAM: efficient two-locus epistasis tests in human genome-wide association study.

    PubMed

    Zhang, Xiang; Huang, Shunping; Zou, Fei; Wang, Wei

    2010-06-15

    As a promising tool for identifying genetic markers underlying phenotypic differences, genome-wide association study (GWAS) has been extensively investigated in recent years. In GWAS, detecting epistasis (or gene-gene interaction) is preferable over single-locus study since many diseases are known to be complex traits. A brute-force search is infeasible for epistasis detection at the genome-wide scale because of the intensive computational burden. Existing epistasis detection algorithms are designed for datasets consisting of homozygous markers and small sample sizes. In human studies, however, genotypes may be heterozygous, and the number of individuals can be in the thousands. Thus, existing methods are not readily applicable to human datasets. In this article, we propose an efficient algorithm, TEAM, which significantly speeds up epistasis detection for human GWAS. Our algorithm is exhaustive, i.e. it does not ignore any epistatic interaction. Utilizing a minimum spanning tree structure, the algorithm incrementally updates the contingency tables for epistatic tests without scanning all individuals. Our algorithm has broader applicability and is more efficient than existing methods for large sample studies. It supports any statistical test that is based on contingency tables, and enables both family-wise error rate and false discovery rate control. Extensive experiments show that our algorithm only needs to examine a small portion of the individuals to update the contingency tables, and it achieves at least an order-of-magnitude speed-up over the brute-force approach.
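
    As a concrete illustration of the contingency-table statistics TEAM supports, the sketch below computes a brute-force Pearson chi-square for one locus pair over an 18-cell table (9 joint genotypes x case/control). TEAM's minimum-spanning-tree incremental updates are not reproduced here; the 0/1/2 genotype coding and the variable names are assumptions.

```python
import numpy as np

def two_locus_chi2(g1, g2, pheno):
    """Pearson chi-square of association between the joint genotypes at two
    loci and a binary phenotype, built from an 18-cell contingency table
    (9 joint genotypes x case/control). Genotypes are coded 0/1/2 and the
    phenotype 0/1 (assumed coding)."""
    table = np.zeros((9, 2))
    for a, b, y in zip(g1, g2, pheno):
        table[3 * a + b, y] += 1           # bin this individual
    row = table.sum(axis=1, keepdims=True)
    col = table.sum(axis=0, keepdims=True)
    expected = row * col / table.sum()     # counts expected under independence
    mask = expected > 0                    # skip empty genotype cells
    return float((((table - expected) ** 2)[mask] / expected[mask]).sum())

# Toy example: joint genotypes at the two loci track the phenotype perfectly.
chi2 = two_locus_chi2([0, 0, 2, 2], [0, 0, 2, 2], [0, 0, 1, 1])
```

    A brute-force scan recomputes this table from scratch for every locus pair; TEAM's contribution is updating such tables incrementally so that only a small portion of individuals is examined per pair.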

  2. An evaluation of computerized adaptive testing for general psychological distress: combining GHQ-12 and Affectometer-2 in an item bank for public mental health research.

    PubMed

    Stochl, Jan; Böhnke, Jan R; Pickett, Kate E; Croudace, Tim J

    2016-05-20

    Recent developments in psychometric modeling and technology allow pooling well-validated items from existing instruments into larger item banks and their deployment through methods of computerized adaptive testing (CAT). Use of item response theory-based bifactor methods and integrative data analysis overcomes barriers in cross-instrument comparison. This paper presents the joint calibration of an item bank for researchers keen to investigate population variations in general psychological distress (GPD). Multidimensional item response theory was used on existing health survey data from the Scottish Health Education Population Survey (n = 766) to calibrate an item bank consisting of pooled items from the short common mental disorder screen (GHQ-12) and the Affectometer-2 (a measure of "general happiness"). Computer simulation was used to evaluate usefulness and efficacy of its adaptive administration. A bifactor model capturing variation across a continuum of population distress (while controlling for artefacts due to item wording) was supported. The numbers of items for different required reliabilities in adaptive administration demonstrated promising efficacy of the proposed item bank. Psychometric modeling of the common dimension captured by more than one instrument offers the potential of adaptive testing for GPD using individually sequenced combinations of existing survey items. The potential for linking other item sets with alternative candidate measures of positive mental health is discussed since an optimal item bank may require even more items than these.

  3. The Space From Heart Disease Intervention for People With Cardiovascular Disease and Distress: A Mixed-Methods Study

    PubMed Central

    Clifton, Abigail; Lee, Geraldine; Norman, Ian J; O'Callaghan, David; Tierney, Karen; Richards, Derek

    2015-01-01

    Background Poor self-management of symptoms and psychological distress leads to worse outcomes and excess health service use in cardiovascular disease (CVD). Online-delivered therapy is effective, but generic interventions lack relevance for people with specific long-term conditions, such as cardiovascular disease. Objective To develop a comprehensive online CVD-specific intervention to improve both self-management and well-being, and to test acceptability and feasibility. Methods Informed by the Medical Research Council (MRC) guidance for the development of complex interventions, we adapted an existing evidence-based generic intervention for depression and anxiety for people with CVD. Content was informed by a literature review of existing resources and trial evidence, and the findings of a focus group study. Think-aloud usability testing was conducted to identify improvements to design and content. Acceptability and feasibility were tested in a cross-sectional study. Results Focus group participants (n=10) agreed that no existing resource met all their needs. Improvements such as "collapse and expand" features were added based on findings that participants’ information needs varied, and specific information, such as detecting heart attacks and when to seek help, was added. Think-aloud testing (n=2) led to changes in font size and design changes around navigation. All participants of the cross-sectional study (10/10, 100%) were able to access and use the intervention. Reported satisfaction was good, although the intervention was perceived to lack relevance for people without comorbid psychological distress. Conclusions We have developed an evidence-based, theory-informed, user-led online intervention for improving self-management and well-being in CVD. The use of multiple evaluation tests informed improvements to content and usability. Preliminary acceptability and feasibility has been demonstrated. 
The Space from Heart Disease intervention is now ready to be tested for effectiveness. This work has also identified that people with CVD symptoms and comorbid distress would be the most appropriate sample for a future randomized controlled trial to evaluate its effectiveness. PMID:26133739

  4. High Frequency Vibration Based Fatigue Testing of Developmental Alloys

    NASA Astrophysics Data System (ADS)

    Holycross, Casey M.; Srinivasan, Raghavan; George, Tommy J.; Tamirisakandala, Seshacharyulu; Russ, Stephan M.

    Many fatigue test methods have been developed to rapidly evaluate fatigue behavior. This increased test speed can come at some expense, since these methods may require non-standard specimen geometry or increased facility and equipment capability. One such method, developed by George et al., involves a base-excited plate specimen driven into a high-frequency bending resonant mode. This resonant mode is of sufficient frequency (typically 1200 to 1700 Hertz) to accumulate 10^7 cycles in a few hours. One of the main limitations of this test method is that fatigue cracking is virtually guaranteed to initiate at the surface in regions of high stress. This brings into question the validity of the fatigue test results compared with more traditional uniaxial, smooth-bar testing, since the high stresses subject only a small volume of material to fatigue damage. This limitation also brings into question the suitability of this method for screening developmental alloys, should their initiation life be governed by subsurface flaws. However, if applicable, the rapid generation of fatigue data using this method would facilitate faster design iterations, identifying material and manufacturing process deficiencies more quickly. The developmental alloy used in this study was a powder metallurgy boron-modified Ti-6Al-4V, a new alloy currently being considered for gas turbine engine fan blades. Plate specimens were subjected to fully reversed bending fatigue. Results are compared with existing data for commercially available Ti-6Al-4V obtained using both vibration-based and more traditional fatigue test methods.

  5. The regulatory acceptance of alternatives in the European Union.

    PubMed

    Warbrick, E Vicky; Evans, Peter F

    2004-06-01

    Recently, progress has been made toward the regulatory acceptance of replacements in the European Union (EU), particularly with the introduction of in vitro methods for the prediction of skin corrosivity, dermal penetration, phototoxicity and embryotoxicity. In vitro genotoxicity tests are well established, and testing for this endpoint can be completed without animals, provided that clear negative outcomes are obtained. Tiered approaches including in vitro tests can also be used to address skin and eye irritation endpoints. Reductions and/or refinements in animal use are being achieved following the replacement of the oral LD50 test with alternative methods and the adoption of reduced test packages for materials, such as closed-system intermediates and certain polymers. Furthermore, the use of a "read-across" approach has reduced animal testing. Substantial gains in refinement will also be made with the recent acceptance of the local lymph node assay for skin sensitisation and the development of an acute inhalation toxicity method that avoids lethality as the endpoint. For the future, under the proposed EU Registration, Evaluation and Authorisation of Chemicals (REACH) scheme, it is envisaged that, where suitable in vitro methods exist, these should be used to support registration of substances produced at up to ten tonnes per annum. This proposal can only accelerate the further development, validation and regulatory acceptance of such alternative methods.

  6. The score statistic of the LD-lod analysis: detecting linkage adaptive to linkage disequilibrium.

    PubMed

    Huang, J; Jiang, Y

    2001-01-01

    We study the properties of a modified lod score method for testing linkage that incorporates linkage disequilibrium (LD-lod). By examination of its score statistic, we show that the LD-lod score method adaptively combines two sources of information: (a) the IBD sharing score which is informative for linkage regardless of the existence of LD and (b) the contrast between allele-specific IBD sharing scores which is informative for linkage only in the presence of LD. We also consider the connection between the LD-lod score method and the transmission-disequilibrium test (TDT) for triad data and the mean test for affected sib pair (ASP) data. We show that, for triad data, the recessive LD-lod test is asymptotically equivalent to the TDT; and for ASP data, it is an adaptive combination of the TDT and the ASP mean test. We demonstrate that the LD-lod score method has relatively good statistical efficiency in comparison with the ASP mean test and the TDT for a broad range of LD and the genetic models considered in this report. Therefore, the LD-lod score method is an interesting approach for detecting linkage when the extent of LD is unknown, such as in a genome-wide screen with a dense set of genetic markers. Copyright 2001 S. Karger AG, Basel

  7. Testing actinide fission yield treatment in CINDER90 for use in MCNP6 burnup calculations

    DOE PAGES

    Fensin, Michael Lorne; Umbel, Marissa

    2015-09-18

    Most of the development of the MCNPX/6 burnup capability focused on features applied to the Boltzmann transport calculation or used to prepare coefficients for CINDER90, with little change to CINDER90 or the CINDER90 data. Though a scheme exists for best solving the coupled Boltzmann and Bateman equations, the most significant approximation is that the employed nuclear data are correct and complete. The CINDER90 library file contains 60 different actinide fission yields encompassing 36 fissionable actinides (thermal, fast, high-energy, and spontaneous fission). Fission reaction data exist for more than 60 actinides, and as a result fission yield data must be approximated for actinides that do not possess fission yield information. Several types of approximations are used to estimate fission yields for actinides without explicit fission yield data. The objective of this study is to test whether certain approximations in fission yield selection have any impact on the predictability of major actinides and fission products. Further, we assess which other fission products, available in MCNP6 Tier 3, show the largest differences in production. Because the CINDER90 library file is in ASCII format and therefore easily amendable, we assess the reasons for choosing, and compare actinide and major fission product predictions for the H. B. Robinson benchmark under, three separate fission yield selection methods: (1) the current CINDER90 library file method (Base); (2) the element method (Element); and (3) the isobar method (Isobar). Results show that the three methods give similar predictions of major actinides, Tc-99, and Cs-137; however, certain fission products showed significantly different production depending on the method chosen.

  8. CON4EI: Slug Mucosal Irritation (SMI) test method for hazard identification and labelling of serious eye damaging and eye irritating chemicals.

    PubMed

    Adriaens, E; Guest, R; Willoughby, J A; Fochtman, P; Kandarova, H; Verstraelen, S; Van Rompay, A R

    2018-06-01

    Assessment of ocular irritancy is an international regulatory requirement in the safety evaluation of industrial and consumer products. Although many in vitro ocular irritation assays exist, alone they are incapable of fully categorizing chemicals. The objective of the CEFIC-LRI-AIMT6-VITO CON4EI (CONsortium for in vitro Eye Irritation testing strategy) project was to develop tiered testing strategies for eye irritation assessment that can lead to complete replacement of the in vivo Draize rabbit eye test (OECD TG 405). A set of 80 reference chemicals was tested with seven test methods, one of which was the Slug Mucosal Irritation (SMI) test. The method measures the amount of mucus produced (MP) during a single 1-hour contact with a 1% and a 10% dilution of the chemical. Based on the total MP, a classification (Cat 1, Cat 2, or No Cat) is predicted. The SMI test method correctly identified 65.8% of the Cat 1 chemicals, with a specificity of 90.5% (a low over-prediction rate for in vivo Cat 2 and No Cat chemicals). Mispredictions were predominantly unidirectional towards lower classifications, with 26.7% of the liquids and 40% of the solids being underpredicted. In general, performance was better for liquids than for solids, with respectively 76.5% vs 57.1% (Cat 1), 61.5% vs 50% (Cat 2), and 87.5% vs 85.7% (No Cat) identified correctly. Copyright © 2017 Elsevier Ltd. All rights reserved.

  9. Passive Magnetic Bearing With Ferrofluid Stabilization

    NASA Technical Reports Server (NTRS)

    Jansen, Ralph; DiRusso, Eliseo

    1996-01-01

    A new class of magnetic bearings is shown analytically to exist and is demonstrated experimentally. This class of magnetic bearings utilizes a ferrofluid/solid-magnet interaction to stabilize the axial degree of freedom of a permanent magnet radial bearing. Twenty-six permanent magnet bearing designs and twenty-two ferrofluid stabilizer designs are evaluated. Two types of radial bearing designs are tested to determine their force and stiffness, using two methods. The first method uses frequency measurements together with an analytical model to determine stiffness. The second method consists of loading the system and measuring displacement to determine stiffness. Two ferrofluid stabilizers are tested and force-displacement curves are measured. Two experimental test fixtures were designed and constructed to conduct the stiffness testing. Polynomial models of the data are generated and used to design the bearing prototype. The prototype was constructed, tested, and shown to be stable. Further testing shows the possibility of using this technology for vibration isolation. The project successfully demonstrated the viability of the passive magnetic bearing with ferrofluid stabilization, both experimentally and analytically.

  10. Highly Efficient Design-of-Experiments Methods for Combining CFD Analysis and Experimental Data

    NASA Technical Reports Server (NTRS)

    Anderson, Bernhard H.; Haller, Harold S.

    2009-01-01

    It is the purpose of this study to examine the impact of highly efficient Design-of-Experiments (DOE) methods for combining sets of CFD-generated analysis data with smaller sets of experimental test data in order to accurately predict performance results where experimental test data were not obtained. The study examines the impact of micro-ramp flow control on the shock wave boundary layer (SWBL) interaction, for which a complete paired set of data exists from both CFD analysis and experimental measurements. By combining the complete set of CFD analysis data, composed of fifteen (15) cases, with a smaller subset of experimental test data containing four or five (4/5) cases, compound data sets (CFD/EXP) were generated that allow prediction of the complete set of experimental results. No statistical difference was found between the combined (CFD/EXP) data sets and the complete experimental data set of fifteen (15) cases. The same optimal micro-ramp configuration was obtained using the (CFD/EXP) data as with the complete set of experimental data, and the DOE response surfaces generated by the two data sets were also not statistically different.
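
    As a sketch of the underlying mechanics, the code below fits a full quadratic DOE response surface by least squares to a pooled set of dense "CFD" points and sparse "experimental" points, then predicts at held-out settings. The factor names and the synthetic response are hypothetical stand-ins, not the study's data.

```python
import numpy as np

def quad_design(X):
    """Full quadratic model matrix in two factors (e.g. micro-ramp height
    and spacing -- hypothetical names):
    y ~ c0 + c1*x1 + c2*x2 + c3*x1^2 + c4*x2^2 + c5*x1*x2."""
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])

def fit_response_surface(X, y):
    coef, *_ = np.linalg.lstsq(quad_design(X), y, rcond=None)
    return coef

def predict(coef, X):
    return quad_design(X) @ coef

# Dense set of "CFD" runs on a full-factorial grid of the two factors.
X_cfd = np.array([[h, s] for h in (0.2, 0.4, 0.6) for s in (1.0, 2.0, 3.0)])
y_cfd = 1.0 - 0.3 * X_cfd[:, 0] + 0.1 * X_cfd[:, 1] ** 2   # synthetic response
# Sparse "experimental" points pooled into the same fit.
X_exp = np.array([[0.3, 1.5], [0.5, 2.5]])
y_exp = 1.0 - 0.3 * X_exp[:, 0] + 0.1 * X_exp[:, 1] ** 2
coef = fit_response_surface(np.vstack([X_cfd, X_exp]),
                            np.concatenate([y_cfd, y_exp]))
```

    Once fitted, the surface can be evaluated anywhere in the factor space, which is how a compound (CFD/EXP) data set predicts performance at settings never tested experimentally.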

  11. Evaluation of the new respiratory gating system

    PubMed Central

    Shi, Chengyu; Tang, Xiaoli; Chan, Maria

    2018-01-01

    Objective The newly released Respiratory Gating for Scanners (RGSC; Varian Medical Systems, Palo Alto, CA, USA) system has limited existing quality assurance (QA) protocols and pertinent publications. Herein, we report our experience with RGSC system acceptance and QA. Methods The RGSC system was tested for integration with peripheral equipment, spatial reproducibility, and dynamic localization accuracy for regular and irregular breathing patterns. A QUASAR Respiratory Motion Phantom and a mathematical fitting method were used for data acquisition and analysis. Results The results showed that the RGSC system could accurately measure regular motion periods of 3–10 s. For irregular breathing patterns, differences from the existing Real-time Position Management (RPM; Varian Medical Systems, Palo Alto, CA) system were observed. For dynamic localization measurements, the RGSC system showed 76% agreement with the programmed test data within ±5% tolerance in terms of fitting period. As a comparison, the RPM system showed 66% agreement within ±5% tolerance, and agreement was 65% for the RGSC versus RPM measurements. Conclusions New functions and improved positioning accuracy enhance the RGSC system’s ability to achieve higher dynamic treatment precision. A 4D phantom is helpful for the QA tests. Further investigation is required for whole-system performance QA of the RGSC. PMID:29722356

  12. Multi-Label Learning via Random Label Selection for Protein Subcellular Multi-Locations Prediction.

    PubMed

    Wang, Xiao; Li, Guo-Zheng

    2013-03-12

    Prediction of protein subcellular localization is an important but challenging problem, particularly when proteins may simultaneously exist at, or move between, two or more different subcellular location sites. Most existing protein subcellular localization methods deal only with single-location proteins. In the past few years, only a few methods have been proposed to tackle proteins with multiple locations. However, they adopt a simple strategy of transforming each multi-location protein into multiple single-location proteins, which does not take correlations among different subcellular locations into account. In this paper, a novel method named RALS (multi-label learning via RAndom Label Selection) is proposed to learn from multi-location proteins in an effective and efficient way. Through a five-fold cross-validation test on a benchmark dataset, we demonstrate that our proposed method, which considers label correlations, clearly outperforms the baseline BR method, which does not, indicating that correlations among different subcellular locations really exist and contribute to improved prediction performance. Experimental results on two benchmark datasets also show that our proposed methods achieve significantly higher performance than some other state-of-the-art methods in predicting subcellular multi-locations of proteins. The prediction web server is available at http://levis.tongji.edu.cn:8080/bioinfo/MLPred-Euk/ for public usage.

  13. PCV: An Alignment Free Method for Finding Homologous Nucleotide Sequences and its Application in Phylogenetic Study.

    PubMed

    Kumar, Rajnish; Mishra, Bharat Kumar; Lahiri, Tapobrata; Kumar, Gautam; Kumar, Nilesh; Gupta, Rahul; Pal, Manoj Kumar

    2017-06-01

    Online retrieval of homologous nucleotide sequences from a given sequence database through existing alignment techniques is common practice. The salient point of these techniques is their dependence on local alignment and scoring matrices, whose reliability is limited by computational complexity and accuracy. In this direction, this work offers a novel numerical representation of genes that can help divide the data space into smaller partitions, enabling the formation of a search tree. In this context, this paper introduces a 36-dimensional Periodicity Count Value (PCV), which represents a particular nucleotide sequence and is created by adapting the stochastic model of Kolekar et al. (American Institute of Physics 1298:307-312, 2010. doi: 10.1063/1.3516320). The PCV construct uses information on the physicochemical properties of nucleotides and their positional distribution pattern within a gene. PCV representation of genes is observed to reduce the computational cost of calculating distances between pairs of genes while remaining consistent with existing methods. The validity of the PCV-based method was further tested through its use in molecular phylogeny construction, in comparison with existing sequence alignment methods.
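
    Once genes are mapped to fixed-length 36-dimensional PCV vectors, pairwise gene distances reduce to cheap vector arithmetic, which is what enables partitioning the data space into a search tree. The sketch below uses plain Euclidean distance as an assumed stand-in; the paper's exact distance measure is not reproduced.

```python
import numpy as np

def pcv_distance(pcv_a, pcv_b):
    """Distance between two genes, each represented by a 36-dimensional
    Periodicity Count Value (PCV) vector. Plain Euclidean distance is an
    assumed stand-in for the paper's distance measure."""
    a = np.asarray(pcv_a, dtype=float)
    b = np.asarray(pcv_b, dtype=float)
    assert a.shape == b.shape == (36,), "PCVs are fixed 36-dim vectors"
    return float(np.linalg.norm(a - b))

# With fixed-length vectors, all-pairs distances for tree building are cheap.
rng = np.random.default_rng(0)
genes = rng.random((5, 36))               # five hypothetical PCVs
d01 = pcv_distance(genes[0], genes[1])
```

    The contrast with alignment is that the cost per pair no longer depends on sequence length: it is a constant 36-element operation regardless of how long the original genes are.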

  14. Foot and mouth disease vaccine strain selection: Current approaches and future perspectives.

    PubMed

    Mahapatra, Mana; Parida, Satya

    2018-06-27

    Lack of cross-protection between foot and mouth disease (FMD) virus (FMDV) serotypes, as well as incomplete protection between some subtypes of FMDV, affects the application of vaccines in the field. Further, the periodic emergence of new variant FMD viruses renders existing vaccines ineffective. Consequently, periodic vaccine strain selection, by either in vivo or in vitro methods, becomes essential to enable use of appropriate and efficient vaccines. Areas covered: Here we describe the cross-reactivity of existing vaccines with the global pool of circulating viruses and the putative vaccine strains selected to target protection against the two major circulating serotypes, O and A, for East Africa, the Middle East, South Asia, and South East Asia. Expert Commentary: Although in vivo cross-protection studies are more appropriate for vaccine matching and selection than in vitro neutralisation tests or ELISA, in the face of an outbreak both in vivo and in vitro vaccine matching are difficult and time consuming. The FMDV capsid contains all the immunogenic epitopes, so vaccine strain prediction models using both capsid sequence and serology data will likely replace existing tools in the future.

  15. A systematic and efficient method to compute multi-loop master integrals

    NASA Astrophysics Data System (ADS)

    Liu, Xiao; Ma, Yan-Qing; Wang, Chen-Yu

    2018-04-01

    We propose a novel method to compute multi-loop master integrals by constructing and numerically solving a system of ordinary differential equations with almost trivial boundary conditions. It can thus be systematically applied to problems with arbitrary kinematic configurations. Numerical tests show that our method not only achieves results with high precision, but is also much faster than sector decomposition, the only other systematic method available. As a by-product, we find a new strategy to compute scalar one-loop integrals without reducing them to master integrals.
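
    A minimal illustration of the same strategy -- set up an ordinary differential equation in an auxiliary parameter, impose an almost trivial boundary condition at a large parameter value, and transport numerically -- applied to the toy integral I(s) = integral_0^1 dx/(x+s) rather than a genuine loop integral. Differentiating under the integral sign gives dI/ds = 1/(1+s) - 1/s, and for large s the boundary value is just the asymptotic expansion 1/s - 1/(2 s^2).

```python
import math

# I(s) = ∫_0^1 dx/(x+s) obeys dI/ds = 1/(1+s) - 1/s (differentiate under
# the integral sign).  Substituting t = ln s gives dI/dt = -1/(1 + e^t),
# which is smooth and bounded over the whole transport range.

def transport(s0=1e6, s1=1.0, steps=2000):
    t0, t1 = math.log(s0), math.log(s1)
    h = (t1 - t0) / steps                  # negative: we flow down in s
    f = lambda t: -1.0 / (1.0 + math.exp(t))
    I = 1.0 / s0 - 1.0 / (2.0 * s0 ** 2)   # trivial boundary value at large s0
    t = t0
    for _ in range(steps):                 # classic fixed-step RK4
        k1 = f(t)
        k2 = f(t + h / 2)
        k3 = f(t + h / 2)
        k4 = f(t + h)
        I += h * (k1 + 2 * k2 + 2 * k3 + k4) / 6
        t += h
    return I

val = transport()
print(val, math.log(2))   # closed form: I(1) = ln 2
```

    The toy is one equation; in the paper's setting the same transport is done for a coupled system of master integrals, but the workflow (asymptotic boundary, numerical ODE solve) is the same shape.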

  16. Space station contamination control study: Internal combustion, phase 1

    NASA Technical Reports Server (NTRS)

    Ruggeri, Robert T.

    1987-01-01

    Contamination inside Space Station modules was studied to determine the best methods of controlling it. The work was conducted in five tasks that identified existing contamination control requirements, analyzed contamination levels, developed an outgassing specification for materials, wrote a contamination control plan, and evaluated the materials offgassing tests currently used by NASA. It is concluded that current contamination control methods can be made to function on the Space Station for up to 1000 days, but that they are deficient for periods longer than about 1000 days.

  17. Acceptability of Using Electronic Vending Machines to Deliver Oral Rapid HIV Self-Testing Kits: A Qualitative Study

    PubMed Central

    Young, Sean D.; Daniels, Joseph; Chiu, ChingChe J.; Bolan, Robert K.; Flynn, Risa P.; Kwok, Justin; Klausner, Jeffrey D.

    2014-01-01

    Introduction Rates of unrecognized HIV infection are significantly higher among Latino and Black men who have sex with men (MSM). Policy makers have proposed that HIV self-testing kits and new methods for delivering self-testing could improve testing uptake among minority MSM. This study conducted qualitative assessments with MSM of color to determine the acceptability of using electronic vending machines to dispense HIV self-testing kits. Materials and Methods African American and Latino MSM were recruited using a participant pool from an existing HIV prevention trial on Facebook. Participants who expressed interest in using a vending machine to receive an HIV self-testing kit were emailed a 4-digit personal identification number (PIN) code to retrieve the test from the machine. We followed up with those who had tested to assess their willingness to participate in an interview about their experience. Results Twelve kits were dispensed and 8 interviews were conducted. In general, participants found the vending machine an acceptable HIV test delivery method due to its novelty and convenience. Discussion Acceptability of this delivery model for HIV testing kits was closely associated with three main factors: credibility, confidentiality, and convenience. Future research is needed to address issues such as user-induced errors and costs before scaling up the dispensing method. PMID:25076208

  18. Analytical concepts for health management systems of liquid rocket engines

    NASA Technical Reports Server (NTRS)

    Williams, Richard; Tulpule, Sharayu; Hawman, Michael

    1990-01-01

    Substantial improvement in health management systems performance can be realized by implementing advanced analytical methods of processing existing liquid rocket engine sensor data. In this paper, such techniques ranging from time series analysis to multisensor pattern recognition to expert systems to fault isolation models are examined and contrasted. The performance of several of these methods is evaluated using data from test firings of the Space Shuttle main engines.

  19. Defense Small Business Innovation Research Program (SBIR) FY 1984.

    DTIC Science & Technology

    1984-01-12

    nuclear submarine non-metallic, light weight, high strength piping. Includes the development of adequate fabrication procedures for attaching pipe ...waste heat economizer methods, require development. Improved conventional and hybrid heat pipes and/or two-phase transport devices are required...DESCRIPTION: A need exists to conceive, design, fabricate and test a method of adjusting the length of the individual legs of nylon or Kevlar rope sling

  20. Noncontaminating technique for making holes in existing process systems

    NASA Technical Reports Server (NTRS)

    Hecker, T. P.; Czapor, H. P.; Giordano, S. M.

    1972-01-01

    Technique is developed for making cleanly-contoured holes in assembled process systems without introducing chips or other contaminants into system. Technique uses portable equipment and does not require dismantling of system. Method was tested on Inconel, stainless steel, ASTMA-53, and Hastelloy X in all positions.

  1. In Vitro Toxicity Assessment Technique for Volatile Substances Using Flow-Through System

    EPA Science Inventory

    The U.S. EPA is responsible for evaluating the effects of approximately 80,000 chemicals registered for use. The challenge is that limited toxicity data exist for many of these chemicals; traditional toxicity testing methods are slow, costly, involve animal studies, and canno...

  2. The pointillism method for creating stimuli suitable for use in computer-based visual contrast sensitivity testing.

    PubMed

    Turner, Travis H

    2005-03-30

    An increasingly large corpus of clinical and experimental neuropsychological research has demonstrated the utility of measuring visual contrast sensitivity. Unfortunately, existing means of measuring contrast sensitivity can be prohibitively expensive, difficult to standardize, or lacking in reliability. Additionally, most existing tests do not allow full control over important characteristics, such as off-angle rotations, waveform, contrast, and spatial frequency. Ideally, researchers could manipulate these characteristics and display stimuli in a computerized task designed to meet experimental needs. Thus far, the 256-level (8-bit) color limitation of standard cathode ray tube (CRT) monitors has been preclusive. To this end, the pointillism method (PM) was developed. Using MATLAB software, stimuli are created from both mathematical and stochastic components, such that differences in regional luminance values of the gradient field closely approximate the desired contrast. This paper describes the method and examines its performance on sine- and square-wave image sets across a range of contrast values. Results suggest the utility of the method for most experimental applications. Weaknesses in the current version, the need for validation and reliability studies, and considerations regarding applications are discussed. Syntax for the program is provided in an appendix, and a version of the program independent of MATLAB is available from the author.
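
    One way to read the "pointillism" idea -- realizing a contrast finer than the display's quantization step by stochastically assigning each pixel one of two levels so that regional means match the target luminance -- can be sketched as follows. This is a hedged reconstruction of the general principle, not the published MATLAB implementation, and all parameter choices are illustrative.

```python
import numpy as np

def dithered_grating(size=256, cycles=8, contrast=0.05, seed=None):
    """Toy pointillist grating: the target luminance L(x) in [0, 1] is
    realized by setting each pixel to 0 or 1 with probability L(x), so that
    local means approximate contrasts finer than the display's quantization.
    Illustrative sketch only, not the published PM algorithm."""
    rng = np.random.default_rng(seed)
    x = np.arange(size) / size
    target = 0.5 + 0.5 * contrast * np.sin(2 * np.pi * cycles * x)  # 1-D profile
    field = np.tile(target, (size, 1))                # replicate rows -> grating
    img = (rng.random((size, size)) < field).astype(float)
    return img, field

img, field = dithered_grating(seed=0)
# column means of the binary image track the low-contrast target profile
err = np.abs(img.mean(axis=0) - field[0]).max()
print(err)
```

    Averaging over each 256-pixel column recovers the intended sub-quantization luminance profile to within binomial sampling noise, which is the mechanism that sidesteps the 256-level limitation.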

  3. Does diurnal variation in cough reflex testing exist in healthy young adults?

    PubMed

    Perry, Sarah; Huckabee, Maggie-Lee

    2017-05-01

    The aim of this study was to investigate whether diurnal variation in cough reflex sensitivity exists in healthy young adults when a tidal-breathing method is used. Fifty-three participants (19-37 years) underwent cough reflex testing on two occasions: once in the morning (between 9 am and midday) and once in the afternoon (between 2 and 5 pm). The order of testing was counterbalanced. Within each assessment, participants inhaled successively higher citric acid concentrations via a facemask, with saline solution randomly interspersed to control for a placebo response. The lowest concentration that elicited a reflexive cough response was recorded. Morning cough thresholds (mean = 0.6 mol/L) were not different from afternoon cough thresholds (mean = 0.6 mol/L), p = 0.16, T = 101, r = -0.14. We found no evidence of diurnal variability in cough reflex testing. There was, however, an order effect irrespective of time of day, confirming that healthy participants are able to volitionally modulate their cough response. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. a Method of Time-Series Change Detection Using Full Polsar Images from Different Sensors

    NASA Astrophysics Data System (ADS)

    Liu, W.; Yang, J.; Zhao, J.; Shi, H.; Yang, L.

    2018-04-01

    Most existing change detection methods using full polarimetric synthetic aperture radar (PolSAR) are limited to detecting change between two points in time. In this paper, a novel method is proposed to detect change in time-series data from different sensors. Firstly, the overall difference image of the time-series PolSAR data is calculated by an omnibus statistic test. Secondly, difference images between any two acquisition times are obtained by the Rj statistic test. Finally, a generalized Gaussian mixture model (GGMM) is used to obtain the time-series change detection maps. To verify the effectiveness of the proposed method, we carried out a change detection experiment using time-series PolSAR images acquired by Radarsat-2 and Gaofen-3 over the city of Wuhan, China. Results show that the proposed method can detect time-series change across different sensors.
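
    As a toy stand-in for the mixture-model step, a plain two-component Gaussian mixture fitted by EM can split difference-image values into "no change" and "change" classes by thresholding the posterior of the higher-mean component. The synthetic data and the plain-Gaussian (rather than generalized Gaussian) components are illustrative assumptions.

```python
import numpy as np

def gmm2_em(x, iters=200):
    """Two-component 1-D Gaussian mixture fitted by EM; returns the posterior
    probability of the higher-mean ('change') component for every sample.
    Simplified stand-in for the generalized GMM used in the paper."""
    mu = np.percentile(x, [1, 99]).astype(float)        # spread-out init
    var = np.array([x.var(), x.var()]) + 1e-6
    pi = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: responsibilities under each Gaussian
        d = (x[:, None] - mu) ** 2
        p = pi * np.exp(-d / (2 * var)) / np.sqrt(2 * np.pi * var)
        r = p / p.sum(axis=1, keepdims=True)
        # M-step: update weights, means, variances
        n = r.sum(axis=0)
        pi = n / len(x)
        mu = (r * x[:, None]).sum(axis=0) / n
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / n + 1e-9
    return r[:, np.argmax(mu)]

rng = np.random.default_rng(1)
diff = np.concatenate([rng.normal(0.1, 0.05, 900),    # unchanged pixels
                       rng.normal(0.8, 0.10, 100)])   # changed pixels
change_mask = gmm2_em(diff) > 0.5
print(change_mask.sum())
```

    In the paper's pipeline the same idea is applied per difference image, with generalized Gaussian components, to produce one change map per time pair.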

  5. A new method for monitoring the extracellular proteolytic activity of wine yeasts during alcoholic fermentation of grape must.

    PubMed

    Chasseriaud, Laura; Miot-Sertier, Cécile; Coulon, Joana; Iturmendi, Nerea; Moine, Virginie; Albertin, Warren; Bely, Marina

    2015-12-01

    The existing methods for testing proteolytic activity are time consuming, quite difficult to perform, and do not allow real-time monitoring. Proteases have attracted considerable interest in winemaking and some yeast species naturally present in grape must, such as Metschnikowia pulcherrima, are capable of expressing this activity. In this study, a new test is proposed for measuring proteolytic activity directly in fermenting grape must, using azocasein, a chromogenic substrate. Several yeast strains were tested and differences in proteolytic activity were observed. Moreover, analysis of grape must proteins in wines revealed that protease secreted by Metschnikowia strains may be active against wine proteins. Copyright © 2015. Published by Elsevier B.V.

  6. Thermal/structural design verification strategies for large space structures

    NASA Technical Reports Server (NTRS)

    Benton, David

    1988-01-01

    Requirements for space structures of increasing size, complexity, and precision have engendered a search for thermal design verification methods that do not impose unreasonable costs, that fit within the capabilities of existing facilities, and that still adequately reduce technical risk. This requires a combination of analytical and testing methods, pursued through two approaches. The first is to limit thermal testing to sub-elements of the total system, tested only in a compact configuration (i.e., not fully deployed). The second is to use a simplified environment to correlate analytical models with test results; these models can then be used to predict flight performance. In practice, a combination of these approaches is needed to verify the thermal/structural design of future very large space systems.

  7. COSMOS: accurate detection of somatic structural variations through asymmetric comparison between tumor and normal samples.

    PubMed

    Yamagata, Koichi; Yamanishi, Ayako; Kokubu, Chikara; Takeda, Junji; Sese, Jun

    2016-05-05

    An important challenge in cancer genomics is precise detection of structural variations (SVs) by high-throughput short-read sequencing, which is hampered by the high false discovery rates of existing analysis tools. Here, we propose an accurate SV detection method named COSMOS, which compares the statistics of the mapped read pairs in tumor samples with isogenic normal control samples in a distinct asymmetric manner. COSMOS also prioritizes the candidate SVs using strand-specific read-depth information. Performance tests on modeled tumor genomes revealed that COSMOS outperformed existing methods in terms of F-measure. We also applied COSMOS to an experimental mouse cell-based model, in which SVs were induced by genome engineering and gamma-ray irradiation, followed by polymerase chain reaction-based confirmation. The precision of COSMOS was 84.5%, while the next best existing method was 70.4%. Moreover, the sensitivity of COSMOS was the highest, indicating that COSMOS has great potential for cancer genome analysis. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  8. Need for new technologies for detection of adventitious agents in vaccines and other biological products.

    PubMed

    Mallet, Laurent; Gisonni-Lex, Lucy

    2014-01-01

    From an industrial perspective, the conventional in vitro and in vivo assays used for detection of viral contaminants have shown their limitations, as illustrated by the unfortunate detection of porcine circovirus contamination in a licensed rotavirus vaccine. This contamination event illustrates the gaps within the existing adventitious agent strategy and the potential use of new broader molecular detection methods. This paper serves to summarize current testing approaches and challenges, along with opportunities for the use of these new technologies. Testing of biological products is required to ensure the safety of patients. Recently, a licensed vaccine was found to be contaminated with a virus. This contamination did not cause a safety concern to the patients; however, it highlights the need for using new testing methods to control our biological products. This paper introduces the benefits of these new tests and outlines the challenges with the current tests. © PDA, Inc. 2014.

  9. Selecting Single Model in Combination Forecasting Based on Cointegration Test and Encompassing Test

    PubMed Central

    Jiang, Chuanjin; Zhang, Jing; Song, Fugen

    2014-01-01

    Combination forecasting takes the characteristics of each single forecasting method into consideration and combines them to form a composite, which increases forecasting accuracy. Existing research on combination forecasting selects single models arbitrarily, neglecting the internal characteristics of the forecasting object. After discussing the role of the cointegration test and the encompassing test in the selection of single models, supplemented by empirical analysis, the paper gives the following guidance for single model selection: no more than five suitable single models should be selected from the many alternatives for a given forecasting target, which increases accuracy and stability. PMID:24892061
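
    An encompassing test of the kind the abstract refers to is commonly run as a regression of the outcome on two competing forecasts: if the coefficient on the second forecast is statistically indistinguishable from zero, the first forecast "encompasses" it and it adds nothing to the combination. The sketch below is a generic formulation under that common setup, not necessarily the authors' exact specification; all data are synthetic.

```python
import numpy as np

def encompassing_test(y, f1, f2):
    """Forecast-encompassing regression  y = a + b1*f1 + b2*f2 + e.
    Returns (b2, t statistic of b2): a small |t| suggests f1 encompasses f2,
    so f2 brings no extra information to the combination.  Sketch only."""
    X = np.column_stack([np.ones_like(y), f1, f2])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / (len(y) - X.shape[1])     # residual variance
    cov = sigma2 * np.linalg.inv(X.T @ X)              # OLS covariance matrix
    return beta[2], beta[2] / np.sqrt(cov[2, 2])

rng = np.random.default_rng(0)
truth = rng.normal(size=500)
f_good = truth + rng.normal(scale=0.3, size=500)   # informative forecast
f_junk = rng.normal(size=500)                      # pure-noise forecast
f_alt = truth + rng.normal(scale=0.3, size=500)    # independent extra information
_, t_junk = encompassing_test(truth, f_good, f_junk)
_, t_alt = encompassing_test(truth, f_good, f_alt)
print(round(abs(t_junk), 2), round(abs(t_alt), 2))
```

    Candidates whose coefficients survive such a test are the "suitable" single models the guidance limits to five.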

  10. Selecting single model in combination forecasting based on cointegration test and encompassing test.

    PubMed

    Jiang, Chuanjin; Zhang, Jing; Song, Fugen

    2014-01-01

    Combination forecasting takes the characteristics of each single forecasting method into consideration and combines them to form a composite, which increases forecasting accuracy. Existing research on combination forecasting selects single models arbitrarily, neglecting the internal characteristics of the forecasting object. After discussing the role of the cointegration test and the encompassing test in the selection of single models, supplemented by empirical analysis, the paper gives the following guidance for single model selection: no more than five suitable single models should be selected from the many alternatives for a given forecasting target, which increases accuracy and stability.

  11. Design and Construction of Airport Pavements on Expansive Soils

    DTIC Science & Technology

    1976-06-01

    Selection of the type and amount of stabilizing agent (lime, cement, asphalt, only) (4) Test methods to determine the physical properties of sta... (8) Investigate the effect of sulfate on cement-stabilized soils and establish...terested because the properties of soil/cement mixtures and the relationships existing among these properties and various test values are discussed

  12. Manufacturing Methods for High Speed Machining of Aluminum

    DTIC Science & Technology

    1978-02-01

    Tests 53 4.4.3 Intergranular Corrosion Tests 53 4.4.4 Cost Analysis 60 4.5 Conclusions...Corporation and others to equip an existing five-axis mill with a 20,000 rpm, 20 horsepower spindle. Based on results obtained...an economic analysis for high-speed machining was also conducted, and the results are given in Section 11.0. In Section 12.0, conclusions and

  13. Gear Drive Testing

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Philadelphia Gear Corporation used two COSMIC computer programs, one dealing with shrink fit analysis and the other with rotor dynamics problems, in computerized design and test work. The programs were used to verify existing in-house programs and to ensure design accuracy by checking the company's own computer methods against procedures developed by other organizations. Its specialty is custom units for unique applications, such as Coast Guard icebreaking ships, steel mill drives, coal crushers, sewage treatment equipment, and electric utilities.

  14. Comprehensive validation scheme for in situ fiber optics dissolution method for pharmaceutical drug product testing.

    PubMed

    Mirza, Tahseen; Liu, Qian Julie; Vivilecchia, Richard; Joshi, Yatindra

    2009-03-01

    There has been a growing interest during the past decade in the use of fiber optics dissolution testing. Use of this novel technology is mainly confined to research and development laboratories. It has not yet emerged as a tool for end product release testing despite its ability to generate in situ results and efficiency improvement. One potential reason may be the lack of clear validation guidelines that can be applied for the assessment of suitability of fiber optics. This article describes a comprehensive validation scheme and development of a reliable, robust, reproducible and cost-effective dissolution test using fiber optics technology. The test was successfully applied for characterizing the dissolution behavior of a 40-mg immediate-release tablet dosage form that is under development at Novartis Pharmaceuticals, East Hanover, New Jersey. The method was validated for the following parameters: linearity, precision, accuracy, specificity, and robustness. In particular, robustness was evaluated in terms of probe sampling depth and probe orientation. The in situ fiber optic method was found to be comparable to the existing manual sampling dissolution method. Finally, the fiber optic dissolution test was successfully performed by different operators on different days, to further enhance the validity of the method. The results demonstrate that the fiber optics technology can be successfully validated for end product dissolution/release testing. (c) 2008 Wiley-Liss, Inc. and the American Pharmacists Association

  15. Ecotoxicity testing of chemicals with particular reference to pesticides.

    PubMed

    Walker, Colin H

    2006-07-01

    Ecotoxicity tests are performed on vertebrates and invertebrates for the environmental risk assessment of pesticides and other chemicals and for a variety of ecotoxicological studies in the laboratory and in the field. Existing practices and strategies in ecotoxicity testing are reviewed, including an account of current requirements of the European Commission for the testing of pesticides and the recent REACH (Registration, Evaluation, Authorisation and Restrictions of Chemicals) proposals for industrial chemicals. Criticisms of existing practices have been made on both scientific and ethical grounds, and these are considered before dealing with the question of possible alternative methods and strategies both for environmental risk assessment and for ecotoxicological studies more generally. New approaches from an ecological point of view are compared with recent developments in laboratory-based methods such as toxicity tests, biomarker assays and bioassays. With regard to the development of new strategies for risk assessment, it is suggested that full consideration should be given to the findings of earlier long-term studies of pollution, which identified mechanisms of action by which environmental chemicals can cause natural populations to decline. Neurotoxicity and endocrine disruption are two cases in point, and biomarker assays for them could have an important role in testing new chemicals suspected of having these properties. In a concluding discussion, possible ways of improving testing protocols are discussed, having regard for current issues in the field of environmental risk assessment as exemplified by the debate over the REACH proposals. The importance of flexibility and the roles of ecologists and ecotoxicologists are stressed in the context of environmental risk assessment.

  16. Robust rotational-velocity-Verlet integration methods.

    PubMed

    Rozmanov, Dmitri; Kusalik, Peter G

    2010-05-01

    Two rotational integration algorithms for rigid-body dynamics are proposed in velocity-Verlet formulation. The first method uses quaternion dynamics and was derived from the original rotational leap-frog method by Svanberg [Mol. Phys. 92, 1085 (1997)]; it produces time-consistent positions and momenta. The second method is also formulated in terms of quaternions but is not quaternion-specific and can easily be adapted to any other orientational representation. Both methods are tested extensively and compared to existing rotational integrators. The proposed integrators perform at least at the level of previously reported rotational algorithms. The choice of simulation parameters is also discussed.

  17. Robust rotational-velocity-Verlet integration methods

    NASA Astrophysics Data System (ADS)

    Rozmanov, Dmitri; Kusalik, Peter G.

    2010-05-01

    Two rotational integration algorithms for rigid-body dynamics are proposed in velocity-Verlet formulation. The first method uses quaternion dynamics and was derived from the original rotational leap-frog method by Svanberg [Mol. Phys. 92, 1085 (1997)]; it produces time-consistent positions and momenta. The second method is also formulated in terms of quaternions but is not quaternion-specific and can easily be adapted to any other orientational representation. Both methods are tested extensively and compared to existing rotational integrators. The proposed integrators perform at least at the level of previously reported rotational algorithms. The choice of simulation parameters is also discussed.

  18. Design of a Channel Error Simulator using Virtual Instrument Techniques for the Initial Testing of TCP/IP and SCPS Protocols

    NASA Technical Reports Server (NTRS)

    Horan, Stephen; Wang, Ru-Hai

    1999-01-01

    There exists a need for designers and developers to have a method to conveniently test a variety of communications parameters for an overall system design. This is no different when testing network protocols than when testing modulation formats. In this report, we discuss a means of providing a networking test device specifically designed for space communications. This test device is a PC-based Virtual Instrument (VI) programmed using the LabVIEW(TM) version 5 software suite developed by National Instruments(TM). This instrument was designed to be portable and usable by others without special, additional equipment. The programming was designed to replicate a VME-based hardware module developed earlier at New Mexico State University (NMSU) and to provide expanded capabilities exceeding the baseline configuration existing in that module. This report describes the design goals for the VI module in the next section and follows that with a description of the design of the VI instrument. This is followed by a description of the validation tests run on the VI. An application of the error-generating VI to networking protocols is then given.

  19. Evaluation of test equipment for the detection of contamination on electronic circuits

    NASA Astrophysics Data System (ADS)

    Bergendahl, C. G.; Dunn, B. D.

    1984-08-01

    The reproducibility, sensitivity and ease of operation of test equipment for the detection of ionizable contaminants on the surface of printed circuit assemblies were assessed. The characteristics of the test equipment are described. Soldering fluxes were chosen as contaminants and were applied in controlled amounts to printed-circuit board assemblies possessing two different component populations. Results show that the relationship between equipment readings varies with flux type. Each kind of test equipment gives a good measure of board cleanliness, although reservations exist concerning the interpretation of such results. A test method for the analysis of total (organic and inorganic) halides in solder fluxes is presented.

  20. Non-contact method of search and analysis of pulsating vessels

    NASA Astrophysics Data System (ADS)

    Avtomonov, Yuri N.; Tsoy, Maria O.; Postnov, Dmitry E.

    2018-04-01

    Despite the variety of existing methods for recording the human pulse and a solid history of their development, there is still considerable interest in this topic. The development of new non-contact methods based on advanced image processing has caused a new wave of interest in this issue. We present a simple but quite effective method for analyzing the mechanical pulsations of blood vessels lying close to the surface of the skin. Our technique is a modification of imaging (or remote) photoplethysmography (i-PPG). We supplemented this method with a laser light source, which made it possible to use other methods of searching for the proposed pulsation zone. During testing of the method, several series of experiments were carried out with both artificial oscillating objects and the target signal source (a human wrist). The obtained results show that our method allows correct interpretation of complex data. To summarize, we proposed and tested an alternative method for the search and analysis of pulsating vessels.

  1. Real time algorithms for sharp wave ripple detection.

    PubMed

    Sethi, Ankit; Kemere, Caleb

    2014-01-01

    Neural activity during sharp wave ripples (SWRs), short bursts of coordinated oscillatory activity in the CA1 region of the rodent hippocampus, is implicated in a variety of memory functions from consolidation to recall. Detection of these events in an algorithmic framework has thus far relied on simple thresholding techniques with heuristically derived parameters. This study investigates and improves current methods for detecting SWR events in neural recordings. We propose and profile methods to reduce latency in ripple detection. The proposed algorithms are tested on simulated ripple data. The findings show that simple real-time algorithms can improve upon existing power-thresholding methods and can detect ripple activity with latencies in the range of 10-20 ms.
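
    The power-thresholding baseline that such studies compare against can be sketched roughly as follows: band-pass the LFP to the ripple band, smooth the instantaneous power, and flag samples exceeding a mean-plus-k-SD threshold. The band edges, window length, threshold, and synthetic data below are illustrative choices, not the paper's parameters.

```python
import numpy as np

def detect_ripples(lfp, fs=1500.0, band=(150, 250), thresh_sd=3.0, win_ms=10):
    """Baseline ripple detector: band-pass via FFT masking, smooth the
    squared signal over a short window, and flag samples where power exceeds
    mean + thresh_sd * SD.  A simple stand-in for thresholding detectors."""
    spec = np.fft.rfft(lfp)
    freqs = np.fft.rfftfreq(len(lfp), d=1.0 / fs)
    spec[(freqs < band[0]) | (freqs > band[1])] = 0      # crude band-pass
    filtered = np.fft.irfft(spec, n=len(lfp))
    win = max(1, int(fs * win_ms / 1000))
    power = np.convolve(filtered ** 2, np.ones(win) / win, mode="same")
    return power > power.mean() + thresh_sd * power.std()

# synthetic LFP: broadband noise plus a 200 Hz ripple burst at 1.0-1.1 s
fs = 1500.0
t = np.arange(0, 2, 1 / fs)
rng = np.random.default_rng(0)
lfp = rng.normal(scale=0.2, size=t.size)
burst = (t > 1.0) & (t < 1.1)
lfp[burst] += np.sin(2 * np.pi * 200 * t[burst])
mask = detect_ripples(lfp, fs)
print(mask[burst].mean(), mask[~burst].mean())
```

    Note the offline FFT filter here is acausal; a real-time variant would use a causal IIR filter and incremental statistics, which is exactly where the latency trade-offs the paper profiles come from.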

  2. Can electronic medical images replace hard-copy film? Defining and testing the equivalence of diagnostic tests.

    PubMed

    Obuchowski, N A

    2001-10-15

    Electronic medical images are an efficient and convenient format in which to display, store and transmit radiographic information. Before electronic images can be used routinely to screen and diagnose patients, however, it must be shown that readers have the same diagnostic performance with this new format as traditional hard-copy film. Currently, there exist no suitable definitions of diagnostic equivalence. In this paper we propose two criteria for diagnostic equivalence. The first criterion ('population equivalence') considers the variability between and within readers, as well as the mean reader performance. This criterion is useful for most applications. The second criterion ('individual equivalence') involves a comparison of the test results for individual patients and is necessary when patients are followed radiographically over time. We present methods for testing both individual and population equivalence. The properties of the proposed methods are assessed in a Monte Carlo simulation study. Data from a mammography screening study is used to illustrate the proposed methods and compare them with results from more conventional methods of assessing equivalence and inter-procedure agreement. Copyright 2001 John Wiley & Sons, Ltd.
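
    A generic equivalence check in the two-one-sided-tests (TOST) style -- not the paper's specific population/individual criteria -- can illustrate how "equivalent within a margin" differs from a plain significance test: equivalence is declared only when the confidence interval for the mean difference falls entirely inside the margin.

```python
import math

def tost_paired(diff, margin, z=1.645):
    """Two one-sided tests (TOST) for equivalence of paired measurements:
    declare the two display formats equivalent if the 90% CI for the mean
    difference lies entirely within [-margin, +margin].  Uses a normal
    approximation (z = 1.645); a generic sketch, not the paper's criteria."""
    n = len(diff)
    mean = sum(diff) / n
    sd = math.sqrt(sum((d - mean) ** 2 for d in diff) / (n - 1))
    se = sd / math.sqrt(n)
    lo, hi = mean - z * se, mean + z * se
    return -margin < lo and hi < margin

# hypothetical per-reader differences in sensitivity (digital minus film)
diffs = [0.01, -0.02, 0.00, 0.03, -0.01, 0.02, -0.02, 0.01, 0.00, -0.01]
print(tost_paired(diffs, margin=0.05))
```

    With a generous margin the readers are declared equivalent; shrinking the margin below the interval width (e.g. 0.005 here) makes the same data fail, which is the intended asymmetry relative to merely failing to reject a difference.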

  3. A scan statistic for binary outcome based on hypergeometric probability model, with an application to detecting spatial clusters of Japanese encephalitis.

    PubMed

    Zhao, Xing; Zhou, Xiao-Hua; Feng, Zijian; Guo, Pengfei; He, Hongyan; Zhang, Tao; Duan, Lei; Li, Xiaosong

    2013-01-01

    As a useful tool for geographical cluster detection of events, the spatial scan statistic is widely applied in many fields and plays an increasingly important role. The classic version of the spatial scan statistic for binary outcomes was developed by Kulldorff, based on the Bernoulli or the Poisson probability model. In this paper, we apply the hypergeometric probability model to construct the likelihood function under the null hypothesis. Compared with existing methods, this likelihood function offers an alternative, indirect way to identify the potential cluster, and the test statistic is the extreme value of the likelihood function. As in Kulldorff's methods, we adopt a Monte Carlo test for significance. Both methods are applied to detect spatial clusters of Japanese encephalitis in Sichuan province, China, in 2009, and the detected clusters are identical. A simulation on independent benchmark data indicates that the test statistic based on the hypergeometric model outperforms Kulldorff's statistics for clusters of high population density or large size; otherwise Kulldorff's statistics are superior.
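
    The construction described -- a hypergeometric window probability scanned over candidate clusters, with Monte Carlo significance -- can be sketched for one-dimensional contiguous windows. The spatial windowing and the exact form of the statistic in the paper may differ; this is a simplified illustration.

```python
import math
import random

def hyper_pmf(k, K, n, N):
    """P(X = k) when n of N population units are in the window and K of the
    N units are cases (hypergeometric probability)."""
    return math.comb(K, k) * math.comb(N - K, n - k) / math.comb(N, n)

def scan_stat(cases, pops):
    """Scan all contiguous windows; the statistic is the smallest window
    probability (the most surprising window under the null)."""
    N, C = sum(pops), sum(cases)
    best = 1.0
    for i in range(len(pops)):
        c = n = 0
        for j in range(i, len(pops)):
            c += cases[j]
            n += pops[j]
            if n < N:                       # skip the trivial all-region window
                best = min(best, hyper_pmf(c, C, n, N))
    return best

def mc_pvalue(cases, pops, sims=500, seed=0):
    """Monte Carlo significance: redistribute the C cases uniformly over the
    N population units and rank the observed scan statistic."""
    rng = random.Random(seed)
    obs = scan_stat(cases, pops)
    C = sum(cases)
    units = [idx for idx, p in enumerate(pops) for _ in range(p)]
    hits = 0
    for _ in range(sims):
        sim = [0] * len(pops)
        for u in rng.sample(units, C):      # cases assigned without replacement
            sim[u] += 1
        if scan_stat(sim, pops) <= obs:
            hits += 1
    return (hits + 1) / (sims + 1)

pops = [100, 100, 100, 100, 100]
cases = [2, 3, 25, 2, 3]                    # strong cluster in region 3
print(mc_pvalue(cases, pops))
```

    The Monte Carlo step mirrors Kulldorff's approach: the p-value is the rank of the observed extreme probability among replicates generated under spatial randomness.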

  4. Ecotoxicity testing: science, politics and ethics.

    PubMed

    Walker, Colin H

    2008-02-01

    Animal welfare organisations have long been concerned about the use of animals for ecotoxicity testing. Ecotoxicity testing is a necessary part of the statutory risk assessment of chemicals that may be released into the environment. It is sometimes also carried out during the development of new chemicals and in the investigation of pollution in the field. This review considers the existing requirements for ecotoxicity testing, with particular reference to practices in the European Union, including the recent REACH system proposals, before discussing criticisms that have been made of existing practices for environmental risk assessment. These criticisms have been made on scientific and ethical grounds, as well as on questions of cost. A case is made for greater investment in the development of alternative testing methods, which could improve the science, as well as serving the cause of animal welfare. It has frequently been suggested that the statutory requirements for environmental risk assessment are too rigid and bureaucratic. A case is made for flexibility and the greater involvement of scientists in the risk assessment procedure, in the interests of both improved science and improved animal welfare.

  5. Validation of catchment models for predicting land-use and climate change impacts. 2. Case study for a Mediterranean catchment

    NASA Astrophysics Data System (ADS)

    Parkin, G.; O'Donnell, G.; Ewen, J.; Bathurst, J. C.; O'Connell, P. E.; Lavabre, J.

    1996-02-01

    Validation methods commonly used to test catchment models are not capable of demonstrating a model's fitness for making predictions for catchments where the catchment response is not known (including hypothetical catchments, and future conditions of existing catchments which are subject to land-use or climate change). This paper describes the first use of a new method of validation (Ewen and Parkin, 1996. J. Hydrol., 175: 583-594) designed to address these types of application; the method involves making 'blind' predictions of selected hydrological responses which are considered important for a particular application. SHETRAN (a physically based, distributed catchment modelling system) is tested on a small Mediterranean catchment. The test involves quantification of the uncertainty in four predicted features of the catchment response (continuous hydrograph, peak discharge rates, monthly runoff, and total runoff), and comparison of observations with the predicted ranges for these features. The results of this test are considered encouraging.

  6. Developmental Exposure to Valproate or Ethanol Alters Locomotor Activity and Retino-Tectal Projection Area in Zebrafish Embryos

    EPA Science Inventory

    Given the minimal developmental neurotoxicity data available for the large number of new and existing chemicals, there is a critical need for alternative methods to identify and prioritize chemicals for further testing. We outline a developmental neurotoxicity screening approach ...

  7. Non-Contacting Compliant Foil Seal for Gas Turbine Engine

    NASA Technical Reports Server (NTRS)

    Salehi, Mohsen; Heshmat, Hooshang

    2002-01-01

    The program is aimed at enhancing the existing analysis to include the turbulence effect. Several manufacturing methods are being investigated in order to apply our know-how in building the seal hardware. The contents include: 1) Test Facilities; 2) Analysis Enhancements; 3) Accomplishments/Status; and 4) Materials Study.

  8. Exergame Apps and Physical Activity: The Results of the ZOMBIE Trial

    ERIC Educational Resources Information Center

    Cowdery, Joan; Majeske, Paul; Frank, Rebecca; Brown, Devin

    2015-01-01

    Background: Although there are thousands of health and fitness smartphone apps currently available, little research exists regarding the effects of mobile app technology on physical activity behavior. Purpose: The purpose of this study was to test whether Exergame smartphone applications increase physical activity levels. Methods: This was a…

  9. Coaching in the AP Classroom

    ERIC Educational Resources Information Center

    Fornaciari, Jim

    2013-01-01

    Many parallels exist between quality coaches and quality classroom teachers--especially AP teachers, who often feel the pressure to produce positive test results. Having developed a series of techniques and strategies for building a team-oriented winning culture on the field, Jim Fornaciari writes about how he adapted those methods to work in the…

  10. Equal Employment Legislation: Alternative Means of Compliance.

    ERIC Educational Resources Information Center

    Daum, Jeffrey W.

    Alternative means of compliance available to organizations to bring their manpower uses into line with existing equal employment legislation are discussed in this paper. The first area addressed concerns the classical approach to selection and placement based on testing methods. The second area discussed reviews various nontesting techniques, such…

  11. Altering Methods to Fill the English Curriculum Gap in Japan

    ERIC Educational Resources Information Center

    Zinck, Gerald W.

    2017-01-01

    In the Japanese English education system, a distinct disconnect exists between the elementary and secondary education curricula. Elementary schools across Japan offer English classes, but adjusting to junior high English classes is often difficult for students. While the Japanese government reformed junior high school tests to aid student…

  12. 40 CFR 63.11466 - What are the performance test requirements for new and existing sources?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... (Appendix A-1) to select sampling port locations and the number of traverse points in each stack or duct... of the stack gas. (iii) Method 3, 3A, or 3B (Appendix A-2) to determine the dry molecular weight of...

  13. Beta Testing in Social Work

    ERIC Educational Resources Information Center

    Traube, Dorian E.; Begun, Stephanie; Petering, Robin; Flynn, Marilyn L.

    2017-01-01

    The field of social work does not currently have a widely adopted method for expediting innovations into micro- or macropractice. Although it is common in fields such as engineering and business to have formal processes for accelerating scientific advances into consumer markets, few comparable mechanisms exist in the social sciences or social…

  14. Quantification of soil surface roughness evolution under simulated rainfall

    USDA-ARS?s Scientific Manuscript database

    Soil surface roughness is commonly identified as one of the dominant factors governing runoff and interrill erosion. The objective of this study was to compare several existing soil surface roughness indices and to test the Revised Triangular Prism surface area Method (RTPM) as a new approach to cal...

  15. Use of nutrient self selection as a diet refining tool in Tenebrio molitor (Coleoptera: Tenebrionidae)

    USDA-ARS?s Scientific Manuscript database

    A new method to refine existing dietary supplements for improving production of the yellow mealworm, Tenebrio molitor L. (Coleoptera: Tenebrionidae), was tested. Self selected ratios of 6 dietary ingredients by T. molitor larvae were used to produce a dietary supplement. This supplement was compared...

  16. Variables Related to MDTA Trainee Employment Success in Minnesota.

    ERIC Educational Resources Information Center

    Pucel, David J.

    To predict a person's use of his Manpower Development and Training Act (MDTA) training, this study attempted to supplement existing methods of evaluation, using personal descriptive data about trainees and General Aptitude Test Battery Scores. The sample under study included all students enrolled in ten MDTA projects, representing a geographical…

  17. Multidimensional Scaling of High School Students' Perceptions of Academic Dishonesty

    ERIC Educational Resources Information Center

    Schmelkin, Liora Pedhazur; Gilbert, Kimberly A.; Silva, Rebecca

    2010-01-01

    Although cheating on tests and other forms of academic dishonesty are considered rampant, no standard definition of academic dishonesty exists. The current study was conducted to investigate the perceptions of academic dishonesty in high school students, utilizing an innovative methodology, multidimensional scaling (MDS). Two methods were used to…

  18. Least squares regression methods for clustered ROC data with discrete covariates.

    PubMed

    Tang, Liansheng Larry; Zhang, Wei; Li, Qizhai; Ye, Xuan; Chan, Leighton

    2016-07-01

    The receiver operating characteristic (ROC) curve is a popular tool to evaluate and compare the accuracy of diagnostic tests in distinguishing the diseased group from the nondiseased group when test results are continuous or ordinal. A complicated data setting occurs when multiple tests are measured on abnormal and normal locations from the same subject and the measurements are clustered within the subject. Although least squares regression methods can be used to estimate the ROC curve from correlated data, least squares methods for estimating the ROC curve from clustered data have not been developed, and the statistical properties of such methods under the clustering setting are unknown. In this article, we develop least squares ROC methods that allow the baseline and link functions to differ and, more importantly, accommodate clustered data with discrete covariates. The methods can generate smooth ROC curves that satisfy the inherent continuous property of the true underlying curve. The least squares methods are shown to be more efficient than the existing nonparametric ROC methods under appropriate model assumptions in simulation studies. We apply the methods to a real example in the detection of glaucomatous deterioration. We also derive the asymptotic properties of the proposed methods. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
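    As a point of reference, the nonparametric ROC baseline that such least squares methods are benchmarked against reduces, for the area under the curve, to the Mann-Whitney estimator. A minimal sketch, ignoring the within-subject clustering the article addresses:

```python
def empirical_auc(diseased, nondiseased):
    """Nonparametric AUC: the probability that a diseased score exceeds a
    nondiseased one, with ties counted 1/2 (the Mann-Whitney U estimator)."""
    total = 0.0
    for x in diseased:
        for y in nondiseased:
            total += 1.0 if x > y else (0.5 if x == y else 0.0)
    return total / (len(diseased) * len(nondiseased))
```

    An AUC of 1.0 indicates perfect separation of the two groups and 0.5 indicates no discrimination; the empirical ROC curve behind this estimator is a step function, whereas the article's least squares methods yield smooth curves.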

  19. A risk-based approach to flood management decisions in a nonstationary world

    NASA Astrophysics Data System (ADS)

    Rosner, Ana; Vogel, Richard M.; Kirshen, Paul H.

    2014-03-01

    Traditional approaches to flood management in a nonstationary world begin with a null hypothesis test of "no trend" and its likelihood, with little or no attention given to the likelihood of ignoring a trend that really exists. Concluding a trend exists when it does not, and rejecting a trend when it exists, are known as type I and type II errors, respectively. Decision-makers are poorly served by statistical and/or decision methods that do not carefully consider both errors, which correspond to over- and under-preparation. Similarly, little attention is given to how to integrate uncertainty in our ability to detect trends into a flood management decision context. We show how trend hypothesis test results can be combined with an adaptation's infrastructure costs and damages avoided to provide a rational decision approach in a nonstationary world. The criterion of expected regret is shown to be a useful metric that integrates the statistical, economic, and hydrological aspects of the flood management problem in a nonstationary world.
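    The expected-regret criterion can be illustrated with a toy calculation. The probability and cost figures below are hypothetical, and the paper's full formulation also folds in trend-test error rates and hydrologic uncertainty; this sketch only shows how the two regrets trade off:

```python
def expected_regret(p_trend, cost_adapt, damages_if_trend):
    """Expected regret of each decision, given the probability p_trend that a
    flood trend is real, the adaptation's infrastructure cost, and the
    damages avoided if the trend exists (illustrative units)."""
    # Regret of adapting: the cost is wasted if there is no trend
    # (the over-preparation / type I analogue).
    regret_adapt = (1 - p_trend) * cost_adapt
    # Regret of not adapting: damages are incurred if the trend is real
    # (the under-preparation / type II analogue).
    regret_no_adapt = p_trend * damages_if_trend
    return {"adapt": regret_adapt, "do_nothing": regret_no_adapt}

r = expected_regret(p_trend=0.3, cost_adapt=10.0, damages_if_trend=50.0)
best = min(r, key=r.get)  # choose the action with smaller expected regret
```

    With these illustrative numbers the expected regret of adapting (0.7 × 10 = 7) is below that of doing nothing (0.3 × 50 = 15), so adaptation is preferred even though the trend itself is less likely than not.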

  20. Development of phytotoxicity tests using wetland species

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, M.K.; Fairchild, J.F.

    1994-12-31

    Laboratory phytotoxicity tests used to assess contaminant effects may not effectively protect wetland communities. The authors are developing routine culture and testing methods for selected freshwater plants that can be used in risk assessments and monitoring of existing wetland systems. Utility of these tests includes evaluating the effects of point or non-point source contamination that may cause water or sediment quality degradation. Selected species include algae (blue-green, green), phytoflagellates (Chlamydomonas, Euglena), and floating or submerged vascular plants (milfoil, coontail, wild celery, elodea, duckweed). Algal toxicity tests run 2, 4, or 7 days, and macrophyte tests 10 to 14 days. Metribuzin and boron are the selected contaminants for developing the test methods. Metribuzin, a triazinone herbicide, is a photosystem II inhibitor and is commonly used for control of grass and broad-leaf plants. As a plant micronutrient, boron is required in very small amounts, but excessive levels can result in phytotoxicity or accumulation. The investigations focus on important factors, including light quality and quantity, and nutrient media. Reference toxicant exposures with potassium chloride are used to establish baseline data for sensitivity and vitality of the plants. These culture and test methods will be incorporated into recommendations for standard phytotoxicity test designs.

  1. Acceptability of using electronic vending machines to deliver oral rapid HIV self-testing kits: a qualitative study.

    PubMed

    Young, Sean D; Daniels, Joseph; Chiu, ChingChe J; Bolan, Robert K; Flynn, Risa P; Kwok, Justin; Klausner, Jeffrey D

    2014-01-01

    Rates of unrecognized HIV infection are significantly higher among Latino and Black men who have sex with men (MSM). Policy makers have proposed that HIV self-testing kits and new methods for delivering self-testing could improve testing uptake among minority MSM. This study sought to conduct qualitative assessments with MSM of color to determine the acceptability of using electronic vending machines to dispense HIV self-testing kits. African American and Latino MSM were recruited using a participant pool from an existing HIV prevention trial on Facebook. If participants expressed interest in using a vending machine to receive an HIV self-testing kit, they were emailed a 4-digit personal identification number (PIN) code to retrieve the test from the machine. We followed up with those who had tested to assess their willingness to participate in an interview about their experience. Twelve kits were dispensed and 8 interviews were conducted. In general, participants expressed that the vending machine was an acceptable HIV test delivery method due to its novelty and convenience. Acceptability of this delivery model for HIV testing kits was closely associated with three main factors: credibility, confidentiality, and convenience. Future research is needed to address issues, such as user-induced errors and costs, before scaling up the dispensing method.

  2. Approaching sub-50 nanoradian measurements by reducing the saw-tooth deviation of the autocollimator in the Nano-Optic-Measuring Machine

    NASA Astrophysics Data System (ADS)

    Qian, Shinan; Geckeler, Ralf D.; Just, Andreas; Idir, Mourad; Wu, Xuehui

    2015-06-01

    Since the development of the Nano-Optic-Measuring Machine (NOM), the accuracy of measuring the profile of an optical surface has been enhanced to the 100-nrad rms level or better. However, to update the accuracy of the NOM system to sub-50 nrad rms, the large saw-tooth deviation (269 nrad rms) of an existing electronic autocollimator, the Elcomat 3000/8, must be resolved. We carried out simulations to assess the saw-tooth-like deviation. We developed a method for setting readings to reduce the deviation to sub-50 nrad rms, suitable for testing plane mirrors. With this method, we found that all the tests conducted in a slowly rising section of the saw-tooth show a small deviation of 28.8 to <40 nrad rms. We also developed a dense-measurement method and an integer-period method to lower the saw-tooth deviation during tests of sphere mirrors. Further research is necessary for formulating a precise test for a spherical mirror. We present a series of test results from our experiments that verify the value of the improvements we made.

  3. More Than Just Accuracy: A Novel Method to Incorporate Multiple Test Attributes in Evaluating Diagnostic Tests Including Point of Care Tests.

    PubMed

    Thompson, Matthew; Weigl, Bernhard; Fitzpatrick, Annette; Ide, Nicole

    2016-01-01

    Current frameworks for evaluating diagnostic tests are constrained by a focus on diagnostic accuracy, and assume that all aspects of the testing process and test attributes are discrete and equally important. Determining the balance between the benefits and harms associated with new or existing tests has been overlooked. Yet, this is critically important information for stakeholders involved in developing, testing, and implementing tests. This is particularly important for point of care tests (POCTs) where tradeoffs exist between numerous aspects of the testing process and test attributes. We developed a new model that multiple stakeholders (e.g., clinicians, patients, researchers, test developers, industry, regulators, and health care funders) can use to visualize the multiple attributes of tests, the interactions that occur between these attributes, and their impacts on health outcomes. We use multiple examples to illustrate interactions between test attributes (test availability, test experience, and test results) and outcomes, including several POCTs. The model could be used to prioritize research and development efforts, and inform regulatory submissions for new diagnostics. It could potentially provide a way to incorporate the relative weights that various subgroups or clinical settings might place on different test attributes. Our model provides a novel way that multiple stakeholders can use to visualize test attributes, their interactions, and impacts on individual and population outcomes. We anticipate that this will facilitate more informed decision making around diagnostic tests.

  4. INVESTIGATING DIFFERENCES IN BRAIN FUNCTIONAL NETWORKS USING HIERARCHICAL COVARIATE-ADJUSTED INDEPENDENT COMPONENT ANALYSIS.

    PubMed

    Shi, Ran; Guo, Ying

    2016-12-01

    Human brains perform tasks via complex functional networks consisting of separated brain regions. A popular approach to characterize brain functional networks in fMRI studies is independent component analysis (ICA), which is a powerful method to reconstruct latent source signals from their linear mixtures. In many fMRI studies, an important goal is to investigate how brain functional networks change according to specific clinical and demographic variabilities. Existing ICA methods, however, cannot directly incorporate covariate effects in ICA decomposition. Heuristic post-ICA analysis to address this need can be inaccurate and inefficient. In this paper, we propose a hierarchical covariate-adjusted ICA (hc-ICA) model that provides a formal statistical framework for estimating covariate effects and testing differences between brain functional networks. Our method provides a more reliable and powerful statistical tool for evaluating group differences in brain functional networks while appropriately controlling for potential confounding factors. We present an analytically tractable EM algorithm to obtain maximum likelihood estimates of our model. We also develop a subspace-based approximate EM that runs significantly faster while retaining high accuracy. To test the differences in functional networks, we introduce a voxel-wise approximate inference procedure which eliminates the need of computationally expensive covariance matrix estimation and inversion. We demonstrate the advantages of our methods over the existing method via simulation studies. We apply our method to an fMRI study to investigate differences in brain functional networks associated with post-traumatic stress disorder (PTSD).

  5. Computational Pollutant Environment Assessment from Propulsion-System Testing

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See; McConnaughey, Paul; Chen, Yen-Sen; Warsi, Saif

    1996-01-01

    An asymptotic plume growth method based on a time-accurate three-dimensional computational fluid dynamics formulation has been developed to assess the exhaust-plume pollutant environment from a simulated RD-170 engine hot-fire test on the F1 Test Stand at Marshall Space Flight Center. Researchers have long known that rocket-engine hot firing has the potential for forming thermal nitric oxides, as well as producing carbon monoxide when hydrocarbon fuels are used. Because of the complex physics involved, most attempts to predict the pollutant emissions from ground-based engine testing have used simplified methods, which may grossly underpredict and/or overpredict the pollutant formations in a test environment. The objective of this work has been to develop a computational fluid dynamics-based methodology that replicates the underlying test-stand flow physics to accurately and efficiently assess pollutant emissions from ground-based rocket-engine testing. A nominal RD-170 engine hot-fire test was computed, and pertinent test-stand flow physics was captured. The predicted total emission rates compared reasonably well with those of the existing hydrocarbon engine hot-firing test data.

  6. Equivalence testing using existing reference data: An example with genetically modified and conventional crops in animal feeding studies.

    PubMed

    van der Voet, Hilko; Goedhart, Paul W; Schmidt, Kerstin

    2017-11-01

    An equivalence testing method is described to assess the safety of regulated products using relevant data obtained in historical studies with assumedly safe reference products. The method is illustrated using data from a series of animal feeding studies with genetically modified and reference maize varieties. Several criteria for quantifying equivalence are discussed, and study-corrected distribution-wise equivalence is selected as being appropriate for the example case study. An equivalence test is proposed based on a high probability of declaring equivalence in a simplified situation, where there is no between-group variation, where the historical and current studies have the same residual variance, and where the current study is assumed to have a sample size as set by a regulator. The method makes use of generalized fiducial inference methods to integrate uncertainties from both the historical and the current data. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  7. Comparison of methods for determining hydraulic conductivity (Methodenvergleich zur Bestimmung der hydraulischen Durchlässigkeit)

    NASA Astrophysics Data System (ADS)

    Storz, Katharina; Steger, Hagen; Wagner, Valentin; Bayer, Peter; Blum, Philipp

    2017-06-01

    Knowing the hydraulic conductivity (K) is a precondition for understanding groundwater flow processes in the subsurface. Numerous laboratory and field methods for the determination of hydraulic conductivity exist, which can lead to significantly different results. In order to quantify the variability among these methods, the hydraulic conductivity of an industrial silica sand (Dorsilit) was examined using four different methods: (1) grain-size analysis, (2) the Kozeny-Carman approach, (3) permeameter tests and (4) flow rate experiments in large-scale tank experiments. Due to the large volume of the artificially built aquifer, the tank experiment results are assumed to be the most representative. Hydraulic conductivity values derived from permeameter tests show only minor deviation, while results of the empirically evaluated grain-size analysis are about one order of magnitude higher and show great variance. The latter was confirmed by analyzing several methods for the determination of K-values found in the literature; thus we generally question the suitability of grain-size analyses and strongly recommend the use of permeameter tests.
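    Method (2), the Kozeny-Carman approach, estimates K directly from grain and pore properties. A sketch using the common form of the equation; the constant 180 and the 20 °C water viscosity are standard textbook assumptions, not values taken from this study:

```python
def kozeny_carman(d_mm, porosity):
    """Kozeny-Carman estimate of hydraulic conductivity K (m/s) from a
    characteristic grain diameter (mm) and porosity, for water at ~20 degC:
        K = (g / nu) * (n^3 / (1 - n)^2) * d^2 / 180
    (common form; the constant 180 applies to roughly spherical grains)."""
    g = 9.81            # gravitational acceleration, m/s^2
    nu = 1.0e-6         # kinematic viscosity of water at 20 degC, m^2/s
    d = d_mm / 1000.0   # grain diameter in metres
    n = porosity
    return (g / nu) * (n ** 3 / (1 - n) ** 2) * d ** 2 / 180.0
```

    For a medium sand with d = 0.5 mm and porosity 0.35 this gives K on the order of 10⁻³ m/s, consistent with typical literature ranges for clean sands; the abstract's point is that such empirical estimates can differ by an order of magnitude from permeameter results.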

  8. Boundary cooled rocket engines for space storable propellants

    NASA Technical Reports Server (NTRS)

    Kesselring, R. C.; Mcfarland, B. L.; Knight, R. M.; Gurnitz, R. N.

    1972-01-01

    An evaluation of an existing analytical heat transfer model was made to extend the technology of boundary film/conduction cooled rocket thrust chambers to the space-storable propellant combination oxygen difluoride/diborane. Critical design parameters were identified and their importance determined. Test reduction methods were developed to enable data obtained from short-duration hot firings with a thin-walled (calorimeter) chamber to be used to quantitatively evaluate the heat-absorbing capability of the vapor film. The modification of the existing like-doublet injector was based on the results obtained from the calorimeter firings.

  9. Testing for independence in J×K contingency tables with complex sample survey data.

    PubMed

    Lipsitz, Stuart R; Fitzmaurice, Garrett M; Sinha, Debajyoti; Hevelone, Nathanael; Giovannucci, Edward; Hu, Jim C

    2015-09-01

    The test of independence of row and column variables in a (J×K) contingency table is a widely used statistical test in many areas of application. For complex survey samples, use of the standard Pearson chi-squared test is inappropriate due to correlation among units within the same cluster. Rao and Scott (1981, Journal of the American Statistical Association 76, 221-230) proposed an approach in which the standard Pearson chi-squared statistic is multiplied by a design effect to adjust for the complex survey design. Unfortunately, this test fails to exist when one of the observed cell counts equals zero. Even with the large samples typical of many complex surveys, zero cell counts can occur for rare events, small domains, or contingency tables with a large number of cells. Here, we propose Wald and score test statistics for independence based on weighted least squares estimating equations. In contrast to the Rao-Scott test statistic, the proposed Wald and score test statistics always exist. In simulations, the score test is found to perform best with respect to type I error. The proposed method is motivated by, and applied to, post surgical complications data from the United States' Nationwide Inpatient Sample (NIS) complex survey of hospitals in 2008. © 2015, The International Biometric Society.
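    The Rao-Scott first-order correction that the article contrasts with can be sketched as follows. Here the design effect is taken as given; in practice it must be estimated from the clustered survey data, which is the harder part and where the abstract's zero-cell failure arises. The proposed Wald and score tests are not reproduced here:

```python
def rao_scott_chi2(observed, design_effect):
    """First-order Rao-Scott correction for a J x K table: the Pearson
    chi-squared statistic divided by an average design effect, to account
    for within-cluster correlation in a complex survey sample."""
    rows = [sum(r) for r in observed]
    cols = [sum(c) for c in zip(*observed)]
    total = sum(rows)
    chi2 = 0.0
    for i, r in enumerate(rows):
        for j, c in enumerate(cols):
            expected = r * c / total          # expected count under independence
            chi2 += (observed[i][j] - expected) ** 2 / expected
    return chi2 / design_effect
```

    The corrected statistic is referred to the usual chi-squared distribution with (J-1)(K-1) degrees of freedom; a design effect greater than 1 (the typical case with clustering) shrinks the statistic and makes the test less likely to reject than the naive Pearson test.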

  10. Internal Stress Monitoring of In-Service Structural Steel Members with Ultrasonic Method

    PubMed Central

    Li, Zuohua; He, Jingbo; Teng, Jun; Wang, Ying

    2016-01-01

    Internal stress in structural steel members is an important parameter for steel structures in their design, construction, and service stages. However, it is hard to measure via traditional approaches. Among the existing non-destructive testing (NDT) methods, the ultrasonic method has received the most research attention. Longitudinal critically refracted (Lcr) waves, which propagate parallel to the surface of the material within an effective depth, have shown great potential as an effective stress measurement approach. This paper presents a systematic non-destructive evaluation method to determine the internal stress in in-service structural steel members using Lcr waves. Based on theory of acoustoelasticity, a stress evaluation formula is derived. Factor of stress to acoustic time difference is used to describe the relationship between stress and measurable acoustic results. A testing facility is developed and used to demonstrate the performance of the proposed method. Two steel members are measured by using the proposed method and the traditional strain gauge method for verification. Parametric studies are performed on three steel members and the aluminum plate to investigate the factors that influence the testing results. The results show that the proposed method is effective and accurate for determining stress in in-service structural steel members. PMID:28773347
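    The linear relationship between stress and the Lcr travel-time difference described above can be sketched as a through-origin calibration. All numbers here are purely illustrative, not measurements from the paper:

```python
def calibrate_k(points):
    """Least-squares slope (through the origin) of stress vs. Lcr travel-time
    difference, from calibration pairs [(dt_ns, stress_MPa), ...] measured
    while loading a reference specimen to known stresses."""
    num = sum(dt * s for dt, s in points)
    den = sum(dt * dt for dt, _ in points)
    return num / den

def stress_from_time_shift(dt_ns, k):
    """Acoustoelastic stress estimate: stress change proportional to the
    Lcr-wave travel-time difference from the unstressed reference state."""
    return k * dt_ns

# Hypothetical calibration data and measurement (illustrative units):
k = calibrate_k([(1.0, 2.5), (2.0, 5.0)])       # MPa per ns
sigma = stress_from_time_shift(12.0, k)          # stress for a 12 ns shift
```

    This "factor of stress to acoustic time difference" is the quantity the paper calibrates experimentally; the parametric studies in the abstract concern how material and geometry influence that factor.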

  11. Internal Stress Monitoring of In-Service Structural Steel Members with Ultrasonic Method.

    PubMed

    Li, Zuohua; He, Jingbo; Teng, Jun; Wang, Ying

    2016-03-23

    Internal stress in structural steel members is an important parameter for steel structures in their design, construction, and service stages. However, it is hard to measure via traditional approaches. Among the existing non-destructive testing (NDT) methods, the ultrasonic method has received the most research attention. Longitudinal critically refracted (Lcr) waves, which propagate parallel to the surface of the material within an effective depth, have shown great potential as an effective stress measurement approach. This paper presents a systematic non-destructive evaluation method to determine the internal stress in in-service structural steel members using Lcr waves. Based on theory of acoustoelasticity, a stress evaluation formula is derived. Factor of stress to acoustic time difference is used to describe the relationship between stress and measurable acoustic results. A testing facility is developed and used to demonstrate the performance of the proposed method. Two steel members are measured by using the proposed method and the traditional strain gauge method for verification. Parametric studies are performed on three steel members and the aluminum plate to investigate the factors that influence the testing results. The results show that the proposed method is effective and accurate for determining stress in in-service structural steel members.

  12. Design and implementation of a general main axis controller for the ESO telescopes

    NASA Astrophysics Data System (ADS)

    Sandrock, Stefan; Di Lieto, Nicola; Pettazzi, Lorenzo; Erm, Toomas

    2012-09-01

    Most of the real-time control systems at the existing ESO telescopes were developed with "traditional" methods, using general purpose VMEbus electronics, and running applications that were coded by hand, mostly using the C programming language under VxWorks. As we are moving towards more modern design methods, we have explored a model-based design approach for real-time applications in the telescope area, and used the control algorithm of a standard telescope main axis as a first example. We wanted to have a clear work-flow that follows the "correct-by-construction" paradigm, where the implementation is testable in simulation on the development host, and where the testing time spent by debugging on target is minimized. It should respect the domains of control, electronics, and software engineers in the choice of tools. It should be a target-independent approach so that the result could be deployed on various platforms. We have selected the Mathworks tools Simulink, Stateflow, and Embedded Coder for design and implementation, and LabVIEW with NI hardware for hardware-in-the-loop testing, all of which are widely used in industry. We describe how these tools have been used in order to model, simulate, and test the application. We also evaluate the benefits of this approach compared to the traditional method with respect to testing effort and maintainability. For a specific axis controller application we have successfully integrated the result into the legacy platform of the existing VLT software, as well as demonstrated how to use the same design for a new development with a completely different environment.

  13. Acoustic Treatment Design Scaling Methods. Volume 1; Overview, Results, and Recommendations

    NASA Technical Reports Server (NTRS)

    Kraft, R. E.; Yu, J.

    1999-01-01

    Scale model fan rigs that simulate new generation ultra-high-bypass engines at about 1/5-scale are achieving increased importance as development vehicles for the design of low-noise aircraft engines. Testing at small scale allows the tests to be performed in existing anechoic wind tunnels, which provides an accurate simulation of the important effects of aircraft forward motion on the noise generation. The ability to design, build, and test miniaturized acoustic treatment panels on scale model fan rigs representative of the full-scale engine provides not only cost savings but also an opportunity to optimize the treatment by allowing tests of different designs. The primary objective of this study was to develop methods that will allow scale model fan rigs to be successfully used as acoustic treatment design tools. The study focuses on finding methods to extend the upper limit of the frequency range of impedance prediction models and acoustic impedance measurement methods for subscale treatment liner designs, and confirm the predictions by correlation with measured data. This phase of the program had as a goal doubling the upper limit of impedance measurement from 6 kHz to 12 kHz. The program utilizes combined analytical and experimental methods to achieve the objectives.

  14. Improved Test Planning and Analysis Through the Use of Advanced Statistical Methods

    NASA Technical Reports Server (NTRS)

    Green, Lawrence L.; Maxwell, Katherine A.; Glass, David E.; Vaughn, Wallace L.; Barger, Weston; Cook, Mylan

    2016-01-01

    The goal of this work is, through computational simulations, to provide statistically-based evidence to convince the testing community that a distributed testing approach is superior to a clustered testing approach for most situations. For clustered testing, numerous, repeated test points are acquired at a limited number of test conditions. For distributed testing, only one or a few test points are requested at many different conditions. The statistical techniques of Analysis of Variance (ANOVA), Design of Experiments (DOE) and Response Surface Methods (RSM) are applied to enable distributed test planning, data analysis and test augmentation. The D-Optimal class of DOE is used to plan an optimally efficient single- and multi-factor test. The resulting simulated test data are analyzed via ANOVA and a parametric model is constructed using RSM. Finally, ANOVA can be used to plan a second round of testing to augment the existing data set with new data points. The use of these techniques is demonstrated through several illustrative examples. To date, many thousands of comparisons have been performed and the results strongly support the conclusion that the distributed testing approach outperforms the clustered testing approach.
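    Why a distributed design can outperform a clustered one is easy to see with the D-optimality criterion for a one-factor quadratic model: points repeated at only two conditions cannot support the curvature term at all. A minimal sketch with hypothetical design points (not the study's simulations):

```python
def dopt_criterion(xs):
    """det(X'X) for the one-factor quadratic model y = b0 + b1*x + b2*x^2.
    A larger determinant means more information about the coefficients
    (the D-optimality criterion used to rank candidate designs)."""
    # Accumulate the 3x3 information matrix X'X over the design points.
    m = [[0.0] * 3 for _ in range(3)]
    for x in xs:
        row = [1.0, x, x * x]
        for i in range(3):
            for j in range(3):
                m[i][j] += row[i] * row[j]
    # Expand the 3x3 determinant directly.
    a, b, c = m[0]
    d, e, f = m[1]
    g, h, i = m[2]
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

clustered   = [-1.0] * 3 + [1.0] * 3              # repeats at two conditions only
distributed = [-1.0, -0.6, -0.2, 0.2, 0.6, 1.0]   # same budget, spread out
```

    With the same six test points, the clustered design yields a singular information matrix (determinant zero: the quadratic coefficient is not estimable from two distinct conditions), while the distributed design yields a positive determinant and hence a fittable response surface, illustrating the paper's conclusion in miniature.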

  15. Application of Non-destructive Methods of Stress-strain State at Hazardous Production Facilities

    NASA Astrophysics Data System (ADS)

    Shram, V.; Kravtsova, Ye; Selsky, A.; Bezborodov, Yu; Lysyannikova, N.; Lysyannikov, A.

    2016-06-01

    The paper deals with the sources of accidents in distillation columns, on the basis of which the most dangerous defects are identified. The currently existing methods of non-destructive testing of the stress-strain state are analyzed. It is proposed to apply strain and acoustic emission techniques to continuously monitor hazardous objects, which helps prevent accidents and reduces maintenance work.

  16. Oral aniracetam treatment in C57BL/6J mice without pre-existing cognitive dysfunction reveals no changes in learning, memory, anxiety or stereotypy

    PubMed Central

    Reynolds, Conner D.; Jefferson, Taylor S.; Volquardsen, Meagan; Pandian, Ashvini; Smith, Gregory D.; Holley, Andrew J.; Lugo, Joaquin N.

    2017-01-01

    Background: The piracetam analog, aniracetam, has recently received attention for its cognition enhancing potential, with minimal reported side effects.  Previous studies report the drug to be effective in both human and non-human models with pre-existing cognitive dysfunction, but few studies have evaluated its efficacy in healthy subjects. A previous study performed in our laboratory found no cognitive enhancing effects of oral aniracetam administration 1-hour prior to behavioral testing in naïve C57BL/6J mice. Methods: The current study aims to further evaluate this drug by administration of aniracetam 30 minutes prior to testing in order to optimize any cognitive enhancing effects. In this study, all naïve C57BL/6J mice were tested in tasks of delayed fear conditioning, novel object recognition, rotarod, open field, elevated plus maze, and marble burying. Results: Across all tasks, animals in the treatment group failed to show enhanced learning when compared to controls. Conclusions: These results provide further evidence suggesting that aniracetam conveys no therapeutic benefit to subjects without pre-existing cognitive dysfunction. PMID:29946420

  17. Effect of temperature on the fracture toughness in the nuclear reactor pressure vessel steel (SA508-3)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oh, S.W.; Lim, M.B.; Yoon, H.K.

    1994-12-31

    The elastic-plastic fracture toughness J_IC of the nuclear reactor pressure vessel steel (SA508-3), which has high toughness, was obtained at three temperatures (room temperature, -20 C, 200 C) using a 1/2 CT specimen. In particular, the two methods recommended by ASTM and JSME were compared. It was found that difficulty exists in obtaining J_IC by the ASTM R-curve method, while the JSME R-curve method yielded good results. The stretched zone width (SZW) method gave slightly larger J_IC values than the R-curve method for SA508-3 steel, and the blunting line was not affected by the test temperature. The relations between SZW and J, SZW and J/E, and SZW and J/σ_ys before initiation of stable crack growth in the fracture toughness tests at the three temperatures are described.

  18. Moisture Separator Reheater for NPP Turbines

    NASA Astrophysics Data System (ADS)

    Manabe, Jun; Kasahara, Jiro

    This paper introduces the development of the current model Moisture Separator Reheater (MSR) for nuclear power plant (NPP) turbines, commercially placed in service in the period 1984-1997, focusing on the mist separation performance of the MSR along with drainage from heat exchanger tubes. A method of predicting the mist separation performance was devised first based on the observation of mist separation behaviors under an air-water test. Then the method was developed for the application to predict under the steam conditions, followed by the verification in comparison with the actual results of a steam condition test. The instability of tube drainage associated with both sub-cooling and temperature oscillation might adversely affect the seal welding of tubes to tube sheet due to thermal fatigue. The instability was measured on an existing unit to clarify behaviors and the development of a method to suppress them. Both methods were applied to newly constructed units and the effectiveness of the methods was demonstrated.

  19. Adaptive testing for multiple traits in a proportional odds model with applications to detect SNP-brain network associations.

    PubMed

    Kim, Junghi; Pan, Wei

    2017-04-01

    There has been increasing interest in developing more powerful and flexible statistical tests to detect genetic associations with multiple traits, as arising from neuroimaging genetic studies. Most existing methods treat a single trait or multiple traits as response while treating an SNP as a predictor coded under an additive inheritance mode. In this paper, we follow an earlier approach in treating an SNP as an ordinal response while treating traits as predictors in a proportional odds model (POM). In this way, it is not only easier to handle mixed types of traits, e.g., some quantitative and some binary, but it is also potentially more robust to the commonly adopted additive inheritance mode. More importantly, we develop an adaptive test in a POM so that it can maintain high power across many possible situations. Compared to the existing methods treating multiple traits as responses, e.g., in a generalized estimating equation (GEE) approach, the proposed method can be applied to a high-dimensional setting where the number of phenotypes (p) can be larger than the sample size (n), in addition to the usual small-p setting. The promising performance of the proposed method was demonstrated with applications to the Alzheimer's Disease Neuroimaging Initiative (ADNI) data, in which either structural MRI driven phenotypes or resting-state functional MRI (rs-fMRI) derived brain functional connectivity measures were used as phenotypes. The applications led to the identification of several top SNPs of biological interest. Furthermore, simulation studies showed competitive performance of the new method, especially for p>n. © 2017 WILEY PERIODICALS, INC.

  20. A computer-controlled apparatus for Seebeck inhomogeneity testing of sheathed thermocouples

    NASA Technical Reports Server (NTRS)

    Burkett, Cecil G., Jr.; Bauserman, Willard A., Jr.

    1993-01-01

    Mineral-insulated metal-sheathed (MIMS) thermocouple assemblies are used throughout industry and research facilities as a method of temperature measurement where requirements for either harsh environmental conditions exist, or where rigidity of the measurement probe is required. Seebeck inhomogeneity is the abnormal variation of the Seebeck coefficient from point to point in a material. It is not disclosed in conventional calibration. A standardized method of measuring thermoelectric inhomogeneity along the thermocouple probe length is not available. Therefore, calibration for sheathed probes normally does not include testing of probe inhomogeneity. The measurement accuracy would be severely impacted if significant inhomogeneity and a temperature gradient were present in the same region of the probe. A computer-controlled system for determining inhomogeneities was designed, fabricated, and tested. This system provides an accurate method for the identification of the location of inhomogeneity along the length of a sheathed thermocouple and for the quantification of the inhomogeneity. This paper will discuss the apparatus and procedure used to perform these tests and will present data showing tests performed on sheathed thermocouple probes.

  1. A Meta-Analysis of Typhoid Diagnostic Accuracy Studies: A Recommendation to Adopt a Standardized Composite Reference

    PubMed Central

    Storey, Helen L.; Huang, Ying; Crudder, Chris; Golden, Allison; de los Santos, Tala; Hawkins, Kenneth

    2015-01-01

    Novel typhoid diagnostics currently under development have the potential to improve clinical care, surveillance, and the disease burden estimates that support vaccine introduction. Blood culture is most often used as the reference method to evaluate the accuracy of new typhoid tests; however, it is recognized to be an imperfect gold standard. If no single gold standard test exists, use of a composite reference standard (CRS) can improve estimation of diagnostic accuracy. Numerous studies have used a CRS to evaluate new typhoid diagnostics; however, there is no consensus on an appropriate CRS. In order to evaluate existing tests for use as a reference test or inclusion in a CRS, we performed a systematic review of the typhoid literature to include all index/reference test combinations observed. We described the landscape of comparisons performed, showed results of a meta-analysis on the accuracy of the more common combinations, and evaluated sources of variability based on study quality. This wide-ranging meta-analysis suggests that no single test has sufficiently good performance but some existing diagnostics may be useful as part of a CRS. Additionally, based on findings from the meta-analysis and a constructed numerical example demonstrating the use of CRS, we proposed necessary criteria and potential components of a typhoid CRS to guide future recommendations. Agreement and adoption by all investigators of a standardized CRS is requisite, and would improve comparison of new diagnostics across independent studies, leading to the identification of a better reference test and improved confidence in prevalence estimates. PMID:26566275
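
    As a toy illustration of the composite-reference idea, the sketch below applies a simple "any component positive" (OR) rule and evaluates an index test against it; the rule choice and the data are invented for illustration, not the CRS the authors propose.

```python
def composite_reference(component_results):
    """OR-rule composite reference standard (CRS): a subject is CRS-positive
    if any component test (e.g. blood culture, PCR) is positive."""
    return [any(tests) for tests in component_results]

def sensitivity_vs_crs(index_results, crs):
    """Sensitivity of an index test evaluated against the CRS."""
    among_positive = [idx for idx, ref in zip(index_results, crs) if ref]
    return sum(among_positive) / len(among_positive)

# hypothetical per-subject component results: (blood culture, PCR)
components = [(True, False), (False, True), (False, False), (True, True)]
crs = composite_reference(components)
index_test = [True, False, False, True]
sens = sensitivity_vs_crs(index_test, crs)
```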

  2. Four-way-leaning test shows larger limits of stability than a circular-leaning test.

    PubMed

    Thomsen, Mikkel Højgaard; Støttrup, Nicolai; Larsen, Frederik Greve; Pedersen, Ann-Marie Sydow Krogh; Poulsen, Anne Grove; Hirata, Rogerio Pessoto

    2017-01-01

    Limits of stability (LOS) have extensive clinical and rehabilitational value, yet no standard consensus on measuring LOS exists. LOS measured using a leaning or a circling protocol is commonly used in research and clinical settings; however, differences in protocols and reliability problems exist. This study measured LOS using a four-way-leaning test and a circular-leaning test to determine which showed larger LOS measurements. Furthermore, the number of adaptation trials needed for consistent results was assessed. Limits of stability were measured using a force plate (Metitur Good Balance System®) sampling at 50 Hz. Thirty healthy subjects completed 30 trials assessing LOS, alternating between the four-way-leaning test and the circular-leaning test. A main effect of method was found (ANOVA: F(1,28)=45.86, P<0.01), with the four-way-leaning test showing larger values than the circular-leaning test (NK, P<0.01). An interaction between method and direction was also found (ANOVA: F(3,84)=24.87, P<0.01): the four-way-leaning test showed larger LOS in the anterior (NK, P<0.05), right (NK, P<0.01) and left directions (NK, P<0.01). Analysis of LOS for the four-way-leaning test showed a difference between trials (ANOVA: F(14,392)=7.81, P<0.01); differences were found between trials 1 and 7 (NK, P<0.03), trials 6 and 8 (NK, P<0.02) and trials 7 and 15 (NK, P<0.02). The four-way-leaning test showed high correlation (ICC>0.87) between the first and second trials for all directions. The four-way-leaning test yields larger LOS in the anterior, right and left directions, making it more reliable when measuring LOS. A learning effect was found up to the 8th trial, which suggests using 8 adaptation trials before reliable LOS values are measured. Copyright © 2016 Elsevier B.V. All rights reserved.

  3. B-ALL minimal residual disease flow cytometry: an application of a novel method for optimization of a single-tube model.

    PubMed

    Shaver, Aaron C; Greig, Bruce W; Mosse, Claudio A; Seegmiller, Adam C

    2015-05-01

    Optimizing a clinical flow cytometry panel can be a subjective process dependent on experience. We develop a quantitative method to make this process more rigorous and apply it to B lymphoblastic leukemia/lymphoma (B-ALL) minimal residual disease (MRD) testing. We retrospectively analyzed our existing three-tube, seven-color B-ALL MRD panel and used our novel method to develop an optimized one-tube, eight-color panel, which was tested prospectively. The optimized one-tube, eight-color panel resulted in greater efficiency of time and resources with no loss in diagnostic power. Constructing a flow cytometry panel using a rigorous, objective, quantitative method permits optimization and avoids problems of interdependence and redundancy in a large, multiantigen panel. Copyright© by the American Society for Clinical Pathology.

  4. Signal-to-noise ratio estimation using adaptive tuning on the piecewise cubic Hermite interpolation model for images.

    PubMed

    Sim, K S; Yeap, Z X; Tso, C P

    2016-11-01

    An improvement to the existing technique of quantifying the signal-to-noise ratio (SNR) of scanning electron microscope (SEM) images using the piecewise cubic Hermite interpolation (PCHIP) technique is proposed. The new technique applies adaptive tuning to the PCHIP and is thus named ATPCHIP. To test its accuracy, 70 images are corrupted with noise and their autocorrelation functions are then plotted. The ATPCHIP technique is applied to estimate the uncorrupted, noise-free zero-offset point from a corrupted image. Three existing methods, nearest neighborhood, first-order interpolation and the original PCHIP, are compared with the proposed ATPCHIP method with respect to their calculated SNR values. Results show that ATPCHIP is an accurate and reliable method to estimate SNR values from SEM images. SCANNING 38:502-514, 2016. © 2015 Wiley Periodicals, Inc.
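
    The underlying idea, that uncorrelated noise inflates only the zero-lag value of the autocorrelation function, so extrapolating nearby lags back to lag 0 recovers the noise-free peak, can be sketched as below. The linear extrapolation is a crude stand-in for the PCHIP/ATPCHIP fit of the paper, and the 1-D test signal is invented.

```python
import numpy as np

def acf(x, max_lag):
    # normalized autocorrelation for lags 0..max_lag
    x = np.asarray(x, dtype=float) - np.mean(x)
    r = np.array([np.dot(x[:x.size - k], x[k:]) for k in range(max_lag + 1)])
    return r / r[0]

def estimate_snr(noisy, max_lag=5):
    r = acf(noisy, max_lag)
    # extrapolate r(1), r(2) linearly back to lag 0: a crude stand-in for
    # the PCHIP fit; uncorrelated noise contributes only at lag 0
    r0_hat = 2.0 * r[1] - r[2]
    # r(0) = 1 holds signal + noise power; r0_hat estimates the signal share
    return r0_hat / (1.0 - r0_hat)

t = np.arange(2000)
rng = np.random.default_rng(1)
noisy = np.sin(0.2 * t) + rng.normal(0.0, 0.5, t.size)  # true SNR = 0.5/0.25 = 2
snr = estimate_snr(noisy)
```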

  5. Research on Operation Assessment Method for Energy Meter

    NASA Astrophysics Data System (ADS)

    Chen, Xiangqun; Huang, Rui; Shen, Liman; chen, Hao; Xiong, Dezhi; Xiao, Xiangqi; Liu, Mouhai; Xu, Renheng

    2018-03-01

    The existing rotation maintenance strategy for electric energy meters checks meters at regular intervals and evaluates their state. Because it considers only the influence of time and neglects other factors, the evaluation is inaccurate and resources are wasted. In order to evaluate the running state of an electric energy meter in a timely manner, a method for evaluating meter operation is proposed. The method extracts, from the existing data acquisition system, marketing business system and metrology production scheduling platform, the data that affect meter state, and classifies the influencing factors into error stability, operational reliability, potential risk and others, scored through basic testing, inspection, monitoring and family-defect detection. An evaluation model then combines these scores to assess the operating state of the electric energy meter, and finally a corresponding rotation maintenance strategy is put forward.

  6. A Method of Evaluating Operation of Electric Energy Meter

    NASA Astrophysics Data System (ADS)

    Chen, Xiangqun; Li, Tianyang; Cao, Fei; Chu, Pengfei; Zhao, Xinwang; Huang, Rui; Liu, Liping; Zhang, Chenglin

    2018-05-01

    The existing rotation maintenance strategy for electric energy meters checks meters at regular intervals and evaluates their state. Because it considers only the influence of time and neglects other factors, the evaluation is inaccurate and resources are wasted. In order to evaluate the running state of an electric energy meter in a timely manner, a method for evaluating meter operation is proposed. The method extracts, from the existing data acquisition system, marketing business system and metrology production scheduling platform, the data that affect meter state, and classifies the influencing factors into error stability, operational reliability, potential risk and others, scored through basic testing, inspection, monitoring and family-defect detection. An evaluation model then combines these scores to assess the operating state of the electric energy meter, and finally a corresponding rotation maintenance strategy is put forward.

  7. Testing mapping algorithms of the cancer-specific EORTC QLQ-C30 onto EQ-5D in malignant mesothelioma.

    PubMed

    Arnold, David T; Rowen, Donna; Versteegh, Matthijs M; Morley, Anna; Hooper, Clare E; Maskell, Nicholas A

    2015-01-23

    In order to estimate utilities for cancer studies where the EQ-5D was not used, the EORTC QLQ-C30 can be used to estimate EQ-5D values via existing mapping algorithms. Several mapping algorithms exist for this transformation; however, they tend to lose accuracy for patients in poor health states. The aim of this study was to test all existing mapping algorithms of the QLQ-C30 onto the EQ-5D in a dataset of patients with malignant pleural mesothelioma, an invariably fatal malignancy for which no previous mapping estimation has been published. Health-related quality of life (HRQoL) data in which both the EQ-5D and QLQ-C30 were administered simultaneously were obtained from the UK-based prospective observational SWAMP (South West Area Mesothelioma and Pemetrexed) trial. In the original trial, 73 patients with pleural mesothelioma were offered palliative chemotherapy and their HRQoL was assessed across five time points. These data were used to test the nine available mapping algorithms found in the literature, comparing predicted against observed EQ-5D values. The ability of the algorithms to predict the mean, minimise error and detect clinically significant differences was assessed. The dataset comprised a total of 250 observations across five time points. The linear regression mapping algorithms tested generally performed poorly, over-estimating predicted compared to observed EQ-5D values, especially when the observed EQ-5D was below 0.5. The best-performing algorithm used a response mapping method and predicted the mean EQ-5D accurately, with an average root mean squared error of 0.17 (standard deviation 0.22). This algorithm reliably discriminated between clinically distinct subgroups seen in the primary dataset. This study tested mapping algorithms in a population with poor health states, where they have previously been shown to perform poorly. Further research into EQ-5D estimation should be directed at response mapping methods, given their superior performance in this study.
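
    The two error summaries used to compare mapping algorithms, root mean squared error and mean (signed) error, are simple to compute; the sketch below uses invented utility values, not trial data.

```python
import math

def rmse(predicted, observed):
    """Root mean squared error between mapped and observed EQ-5D values."""
    errs = [p - o for p, o in zip(predicted, observed)]
    return math.sqrt(sum(e * e for e in errs) / len(errs))

def mean_error(predicted, observed):
    """Positive values indicate the mapping over-estimates utility."""
    return sum(p - o for p, o in zip(predicted, observed)) / len(predicted)

pred = [0.8, 0.6, 0.4]   # hypothetical mapped utilities
obs = [0.7, 0.5, 0.5]    # hypothetical observed EQ-5D values
err = rmse(pred, obs)
bias = mean_error(pred, obs)
```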

  8. [Antagonism in vitro among phytopathogenic and saprobic fungi from horticultural soils].

    PubMed

    Alippi, H E; Monaco, C

    1990-01-01

    Two methods were tested in order to determine the existence of in vitro antagonism among saprobic and pathogenic fungi. These microorganisms were the most common isolates from horticultural soils of La Plata (Buenos Aires). Trichoderma harzianum, T. koningii and Penicillium sp. were antagonistic to all the pathogenic fungi tested: Fusarium solani, F. oxysporum, Alternaria solani, Colletotrichum sp. and Sclerotium rolfsii. Spicaria sp., Paecilomyces sp. and Chaetomium sp. were antagonistic only to Colletotrichum sp. and Fusarium solani.

  9. gsSKAT: Rapid gene set analysis and multiple testing correction for rare-variant association studies using weighted linear kernels.

    PubMed

    Larson, Nicholas B; McDonnell, Shannon; Cannon Albright, Lisa; Teerlink, Craig; Stanford, Janet; Ostrander, Elaine A; Isaacs, William B; Xu, Jianfeng; Cooney, Kathleen A; Lange, Ethan; Schleutker, Johanna; Carpten, John D; Powell, Isaac; Bailey-Wilson, Joan E; Cussenot, Olivier; Cancel-Tassin, Geraldine; Giles, Graham G; MacInnis, Robert J; Maier, Christiane; Whittemore, Alice S; Hsieh, Chih-Lin; Wiklund, Fredrik; Catalona, William J; Foulkes, William; Mandal, Diptasri; Eeles, Rosalind; Kote-Jarai, Zsofia; Ackerman, Michael J; Olson, Timothy M; Klein, Christopher J; Thibodeau, Stephen N; Schaid, Daniel J

    2017-05-01

    Next-generation sequencing technologies have afforded unprecedented characterization of low-frequency and rare genetic variation. Due to low power for single-variant testing, aggregative methods are commonly used to combine observed rare variation within a single gene. Causal variation may also aggregate across multiple genes within relevant biomolecular pathways. Kernel-machine regression and adaptive testing methods for aggregative rare-variant association testing have been demonstrated to be powerful approaches for pathway-level analysis, although these methods tend to be computationally intensive at high-variant dimensionality and require access to complete data. An additional analytical issue in scans of large pathway definition sets is multiple testing correction. Gene set definitions may exhibit substantial genic overlap, and the impact of the resultant correlation in test statistics on Type I error rate control for large agnostic gene set scans has not been fully explored. Herein, we first outline a statistical strategy for aggregative rare-variant analysis using component gene-level linear kernel score test summary statistics as well as derive simple estimators of the effective number of tests for family-wise error rate control. We then conduct extensive simulation studies to characterize the behavior of our approach relative to direct application of kernel and adaptive methods under a variety of conditions. We also apply our method to two case-control studies, respectively, evaluating rare variation in hereditary prostate cancer and schizophrenia. Finally, we provide open-source R code for public use to facilitate easy application of our methods to existing rare-variant analysis results. © 2017 WILEY PERIODICALS, INC.
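
    One common way to estimate an effective number of tests from correlated statistics is the eigenvalue-based estimator of Li and Ji, which can then feed a Šidák-style per-test threshold. The sketch below shows that estimator as an illustration of the idea; it is not necessarily the exact estimator the authors derive.

```python
import numpy as np

def effective_tests(corr):
    """Li & Ji (2005)-style effective number of tests from the eigenvalues
    of the correlation matrix of the test statistics."""
    lam = np.abs(np.linalg.eigvalsh(np.asarray(corr, dtype=float)))
    lam = np.round(lam, 10)  # guard against floating-point fuzz before flooring
    return float(np.sum(np.where(lam >= 1.0, 1.0, 0.0) + lam - np.floor(lam)))

def sidak_alpha(alpha, m_eff):
    # per-test threshold controlling the family-wise error rate at alpha
    return 1.0 - (1.0 - alpha) ** (1.0 / m_eff)

m_indep = effective_tests(np.eye(3))      # three independent tests
m_dup = effective_tests(np.ones((3, 3)))  # three perfectly correlated tests
```

    Independent tests keep the full count, while perfectly overlapping gene sets collapse to a single effective test, which is exactly the behavior needed to avoid over-correcting large agnostic gene set scans.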

  10. CON4EI: Short Time Exposure (STE) test method for hazard identification and labelling of eye irritating chemicals.

    PubMed

    Adriaens, E; Willoughby, J A; Meyer, B R; Blakeman, L C; Alépée, N; Fochtman, P; Guest, R; Kandarova, H; Verstraelen, S; Van Rompay, A R

    2018-06-01

    Assessment of ocular irritancy is an international regulatory requirement in the safety evaluation of industrial and consumer products. Although many in vitro ocular irritation assays exist, alone they are incapable of fully categorizing chemicals. Therefore, the CEFIC-LRI-AIMT6-VITO CON4EI consortium was formed to assess the reliability of eight in vitro test methods and establish an optimal tiered-testing strategy. One assay selected was the Short Time Exposure (STE) assay. This assay measures the viability of SIRC rabbit corneal cells after 5-min exposure to 5% and 0.05% solutions of the test material, and is capable of categorizing Category 1 and No Category chemicals. The accuracy of the STE test method in identifying Cat 1 chemicals was 61.3%, with 23.7% sensitivity and 95.2% specificity. If non-soluble chemicals and unqualified results were excluded, the performance in identifying Cat 1 chemicals remained similar (accuracy 62.2%, with 22.7% sensitivity and 100% specificity). The accuracy of the STE test method in identifying No Cat chemicals was 72.5%, with 66.2% sensitivity and 100% specificity. Excluding highly volatile chemicals, non-surfactant solids and non-qualified results led to an important improvement in the performance of the STE test method (accuracy 96.2%, with 81.8% sensitivity and 100% specificity). Furthermore, solids appear more difficult to test in the STE: 71.4% of the solids produced unqualified results (solubility issues and/or high variation between independent runs), whereas only 13.2% of results for liquids were unqualified, supporting the restriction of the test method regarding the testing of solids. Copyright © 2017. Published by Elsevier Ltd.
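
    The performance figures quoted above (accuracy, sensitivity, specificity) all come from a standard 2×2 confusion matrix; the helper below shows the arithmetic on invented labels, not the CON4EI data.

```python
def classification_metrics(truth, predicted):
    """truth/predicted: booleans, True = 'Category 1' under the reference
    classification and under the assay prediction, respectively."""
    tp = sum(t and p for t, p in zip(truth, predicted))
    tn = sum(not t and not p for t, p in zip(truth, predicted))
    fp = sum(not t and p for t, p in zip(truth, predicted))
    fn = sum(t and not p for t, p in zip(truth, predicted))
    return {
        "accuracy": (tp + tn) / len(truth),
        "sensitivity": tp / (tp + fn),  # fraction of true Cat 1 detected
        "specificity": tn / (tn + fp),  # fraction of non-Cat 1 cleared
    }

metrics = classification_metrics(
    [True, True, True, True, False, False, False, False],
    [True, True, False, False, False, False, False, True],
)
```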

  11. Physical and chemical test results of electrostatic safe flooring materials

    NASA Technical Reports Server (NTRS)

    Gompf, R. H.

    1988-01-01

    This test program was initiated because a need existed at the Kennedy Space Center (KSC) to have this information readily available to the engineer who must choose which electrostatically safe floor to use in a specific application. The information, however, should be of value throughout both government and private industry in the selection of a floor covering material. Included are the test results of 18 floor covering materials which, by test evaluation at KSC, are considered electrostatically safe. Tests were performed and/or the data compiled in the following areas: electrostatics, flammability, hypergolic compatibility, outgassing, floor type, material thickness, and available colors. Each section contains the test method used to gather the data and the test results.

  12. RTO Technical Publications: A Quarterly Listing

    NASA Technical Reports Server (NTRS)

    2005-01-01

    This is a listing of recent unclassified RTO technical publications processed by the NASA Center for AeroSpace Information covering the period from July 1, 2005 to September 30, 2005; and available in the NASA Aeronautics and Space Database. Contents include: Aeroelastic Deformation: Adaptation of Wind Tunnel Measurement Concepts to Full-Scale Vehicle Flight Testing; Actively Controlling Buffet-Induced Excitations; Modelling and Simulation to Address NATO's New and Existing Military Requirements; Latency in Visionic Systems: Test Methods and Requirements; Personal Hearing Protection including Active Noise Reduction; Virtual Laboratory Enabling Collaborative Research in Applied Vehicle Technologies; A Method to Analyze Tail Buffet Loads of Aircraft; Particle Image Velocimetry Measurements to Evaluate the Effectiveness of Deck-Edge Columnar Vortex Generators on Aircraft Carriers; Introduction to Flight Test Engineering, Volume 14; Pathological Aspects and Associated Biodynamics in Aircraft Accident Investigation;

  13. Smoke detection

    DOEpatents

    Warmack, Robert J. Bruce; Wolf, Dennis A.; Frank, Steven Shane

    2016-09-06

    Various apparatus and methods for smoke detection are disclosed. In one embodiment, a method of training a classifier for a smoke detector comprises inputting sensor data from a plurality of tests into a processor. The sensor data is processed to generate derived signal data corresponding to the test data for respective tests. The derived signal data is assigned into categories comprising at least one fire group and at least one non-fire group. Linear discriminant analysis (LDA) training is performed by the processor. The derived signal data and the assigned categories for the derived signal data are inputs to the LDA training. The output of the LDA training is stored in a computer readable medium, such as in a smoke detector that uses LDA to determine, based on the training, whether present conditions indicate the existence of a fire.
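
    The patent's classifier step can be illustrated with a plain two-class Fisher discriminant; the numpy sketch below and its synthetic "derived signal" features are illustrative stand-ins, not the patented training procedure.

```python
import numpy as np

def fit_lda(X, y):
    """Two-class Fisher LDA: w = Sw^{-1} (mu1 - mu0), with the decision
    threshold at the projected midpoint of the class means."""
    X, y = np.asarray(X, dtype=float), np.asarray(y)
    m0, m1 = X[y == 0].mean(axis=0), X[y == 1].mean(axis=0)
    Sw = np.cov(X[y == 0].T) + np.cov(X[y == 1].T)  # within-class scatter
    w = np.linalg.solve(Sw, m1 - m0)
    c = w @ (m0 + m1) / 2.0
    return w, c

def predict(w, c, X):
    # 1 = fire group, 0 = non-fire group
    return (np.asarray(X, dtype=float) @ w > c).astype(int)

# synthetic derived-signal features: non-fire near the origin, fire offset
X0 = np.array([[0.0, 0.0], [0.5, 0.2], [0.2, 0.6], [0.4, 0.4]])
X1 = X0 + 3.0
X = np.vstack([X0, X1])
y = np.array([0, 0, 0, 0, 1, 1, 1, 1])
w, c = fit_lda(X, y)
labels = predict(w, c, X)
```

    In a detector, `w` and `c` would be stored in the device's memory so that classifying a new reading is just a dot product and a comparison.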

  14. Smoke detection

    DOEpatents

    Warmack, Robert J. Bruce; Wolf, Dennis A.; Frank, Steven Shane

    2015-10-27

    Various apparatus and methods for smoke detection are disclosed. In one embodiment, a method of training a classifier for a smoke detector comprises inputting sensor data from a plurality of tests into a processor. The sensor data is processed to generate derived signal data corresponding to the test data for respective tests. The derived signal data is assigned into categories comprising at least one fire group and at least one non-fire group. Linear discriminant analysis (LDA) training is performed by the processor. The derived signal data and the assigned categories for the derived signal data are inputs to the LDA training. The output of the LDA training is stored in a computer readable medium, such as in a smoke detector that uses LDA to determine, based on the training, whether present conditions indicate the existence of a fire.

  15. Adjusting for partial verification or workup bias in meta-analyses of diagnostic accuracy studies.

    PubMed

    de Groot, Joris A H; Dendukuri, Nandini; Janssen, Kristel J M; Reitsma, Johannes B; Brophy, James; Joseph, Lawrence; Bossuyt, Patrick M M; Moons, Karel G M

    2012-04-15

    A key requirement in the design of diagnostic accuracy studies is that all study participants receive both the test under evaluation and the reference standard test. For a variety of practical and ethical reasons, sometimes only a proportion of patients receive the reference standard, which can bias the accuracy estimates. Numerous methods have been described for correcting this partial verification bias or workup bias in individual studies. In this article, the authors describe a Bayesian method for obtaining adjusted results from a diagnostic meta-analysis when partial verification or workup bias is present in a subset of the primary studies. The method corrects for verification bias without having to exclude primary studies with verification bias, thus preserving the main advantages of a meta-analysis: increased precision and better generalizability. The results of this method are compared with the existing methods for dealing with verification bias in diagnostic meta-analyses. For illustration, the authors use empirical data from a systematic review of studies of the accuracy of the immunohistochemistry test for diagnosis of human epidermal growth factor receptor 2 status in breast cancer patients.
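
    Shown below, as a much simpler cousin of the paper's Bayesian meta-analytic model, is the classic single-study inverse-probability-weighting correction (in the spirit of Begg and Greenes): verified counts are up-weighted by the inverse of their verification probability. The counts and probabilities are invented for illustration.

```python
def ipw_accuracy(tp, fp, fn, tn, p_verify_pos, p_verify_neg):
    """Verification-bias-corrected sensitivity and specificity.
    tp/fp/fn/tn are counts among *verified* patients; p_verify_pos and
    p_verify_neg are the probabilities that an index-test-positive or
    -negative patient received the reference standard."""
    TP, FP = tp / p_verify_pos, fp / p_verify_pos
    FN, TN = fn / p_verify_neg, tn / p_verify_neg
    return TP / (TP + FN), TN / (TN + FP)

# all index-test-positives verified, but only half of the test-negatives;
# the naive (uncorrected) estimates would be 45/50 = 0.90 for both
se, sp = ipw_accuracy(45, 5, 5, 45, 1.0, 0.5)
```

    Under-verifying test-negatives hides false negatives, so the corrected sensitivity (about 0.82 here) is lower than the naive 0.90, which is exactly the bias the meta-analytic method adjusts for across studies.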

  16. All-Versus-Nothing Proof of Einstein-Podolsky-Rosen Steering

    PubMed Central

    Chen, Jing-Ling; Ye, Xiang-Jun; Wu, Chunfeng; Su, Hong-Yi; Cabello, Adán; Kwek, L. C.; Oh, C. H.

    2013-01-01

    Einstein-Podolsky-Rosen steering is a form of quantum nonlocality intermediate between entanglement and Bell nonlocality. Although Schrödinger already mooted the idea in 1935, steering still defies a complete understanding. In analogy to “all-versus-nothing” proofs of Bell nonlocality, here we present a proof of steering without inequalities rendering the detection of correlations leading to a violation of steering inequalities unnecessary. We show that, given any two-qubit entangled state, the existence of certain projective measurement by Alice so that Bob's normalized conditional states can be regarded as two different pure states provides a criterion for Alice-to-Bob steerability. A steering inequality equivalent to the all-versus-nothing proof is also obtained. Our result clearly demonstrates that there exist many quantum states which do not violate any previously known steering inequality but are indeed steerable. Our method offers advantages over the existing methods for experimentally testing steerability, and sheds new light on the asymmetric steering problem. PMID:23828242

  17. Predicting future discoveries from current scientific literature.

    PubMed

    Petrič, Ingrid; Cestnik, Bojan

    2014-01-01

    Knowledge discovery in biomedicine is a time-consuming process starting from basic research, through preclinical testing, towards possible clinical applications. Crossing conceptual boundaries is often needed for groundbreaking biomedical research that generates highly inventive discoveries. We demonstrate the ability of a creative literature mining method to advance valuable new discoveries based on rare ideas from the existing literature. When emerging ideas from the scientific literature are put together as fragments of knowledge in a systematic way, they may lead to original, sometimes surprising, research findings. If enough scientific evidence has already been published for the association of such findings, they can be considered scientific hypotheses. In this chapter, we describe a method for the computer-aided generation of such hypotheses based on the existing scientific literature. Our literature-based discovery of NF-kappaB and its possible connections to autism was recently confirmed by the scientific community, which demonstrates the ability of our literature mining methodology to accelerate future discoveries based on rare ideas from the existing literature.
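
    The hypothesis-generation step follows the spirit of Swanson's ABC model of literature-based discovery: if concept A co-occurs with B, and B with C, but A and C are never linked directly, then A-C becomes a candidate hypothesis. The sketch and its toy concept names are illustrative, not the authors' actual system.

```python
def abc_candidates(a_to_b, b_to_c, known_ac):
    """Swanson-style ABC discovery: propose (A, C) pairs that are linked
    only through a shared intermediate concept B."""
    candidates = set()
    for a, bs in a_to_b.items():
        for b in bs:
            for c in b_to_c.get(b, set()):
                if (a, c) not in known_ac:
                    candidates.add((a, c))
    return candidates

# toy co-occurrence data (hypothetical concept names)
cands = abc_candidates(
    {"A1": {"B1", "B2"}},
    {"B1": {"C1"}, "B2": {"C1", "C2"}},
    {("A1", "C1")},  # already reported, so not proposed as new
)
```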

  18. On-line Monitoring Device for High-voltage Switch Cabinet Partial Discharge Based on Pulse Current Method

    NASA Astrophysics Data System (ADS)

    Tao, S. Y.; Zhang, X. Z.; Cai, H. W.; Li, P.; Feng, Y.; Zhang, T. C.; Li, J.; Wang, W. S.; Zhang, X. K.

    2017-12-01

    The pulse current method for partial discharge detection is generally applied in type testing and other off-line tests of electrical equipment at delivery. After analyzing the present situation and existing problems of partial discharge detection in switch cabinets, this paper designs the circuit principle and signal extraction method for on-line partial discharge detection based on a high-voltage presence indicating system (VPIS). It establishes a high-voltage switch cabinet partial discharge on-line detection circuit based on the pulse current method, develops background software integrating real-time monitoring, judging, and analyzing functions, carries out a discharge simulation test on a real-type partial discharge defect simulation platform for a 10 kV switch cabinet, and verifies the sensitivity and validity of the resulting on-line monitoring device. The study is of significance for switch cabinet maintenance and for theoretical work on pulse-current on-line detection, and provides a sound implementation approach for partial discharge on-line monitoring devices for 10 kV distribution network equipment.
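A generic sketch of the "judging" step in pulse-current monitoring software: flag samples whose magnitude exceeds a noise-derived threshold and merge nearby crossings into discrete pulses. The synthetic signal, the 5-sigma rule, and the dead-time window are all invented for illustration; none of this is taken from the device described above:

```python
import numpy as np

rng = np.random.default_rng(0)
signal = rng.normal(0.0, 1.0, 2000)     # background noise
signal[500] += 12.0                     # three injected PD-like pulses
signal[1200] -= 15.0
signal[1700] += 10.0

threshold = 5.0 * np.std(signal)        # simple k-sigma threshold
hits = np.flatnonzero(np.abs(signal) > threshold)

def count_pulses(hit_indices, dead_time=10):
    """Merge threshold crossings closer than dead_time samples into one pulse."""
    pulses, last = 0, -dead_time - 1
    for i in hit_indices:
        if i - last > dead_time:
            pulses += 1
        last = i
    return pulses

print(count_pulses(hits))
```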

  19. Measurement of plasma unbound unconjugated bilirubin.

    PubMed

    Ahlfors, C E

    2000-03-15

    A method is described for measuring the unconjugated fraction of the unbound bilirubin concentration in plasma by combining the peroxidase method for determining unbound bilirubin with a diazo method for measuring conjugated and unconjugated bilirubin. The accuracy of the unbound bilirubin determination is improved by decreasing sample dilution, eliminating interference by conjugated bilirubin, monitoring changes in bilirubin concentration using diazo derivatives, and correcting for rate-limiting dissociation of bilirubin from albumin. The unbound unconjugated bilirubin concentration by the combined method in plasma from 20 jaundiced newborns was significantly greater than and poorly correlated with the unbound bilirubin determined by the existing peroxidase method (r = 0.7), possibly due to differences in sample dilution between the methods. The unbound unconjugated bilirubin was an unpredictable fraction of the unbound bilirubin in plasma samples from patients with similar total bilirubin concentrations but varying levels of conjugated bilirubin. A bilirubin-binding competitor was readily detected at a sample dilution typically used for the combined test but not at the dilution used for the existing peroxidase method. The combined method is ideally suited to measuring unbound unconjugated bilirubin in jaundiced human newborns or animal models of kernicterus. Copyright 2000 Academic Press.

  20. Improved accuracy of supervised CRM discovery with interpolated Markov models and cross-species comparison.

    PubMed

    Kazemian, Majid; Zhu, Qiyun; Halfon, Marc S; Sinha, Saurabh

    2011-12-01

    Despite recent advances in experimental approaches for identifying transcriptional cis-regulatory modules (CRMs, 'enhancers'), direct empirical discovery of CRMs for all genes in all cell types and environmental conditions is likely to remain an elusive goal. Effective methods for computational CRM discovery are thus a critically needed complement to empirical approaches. However, existing computational methods that search for clusters of putative binding sites are ineffective if the relevant TFs and/or their binding specificities are unknown. Here, we provide a significantly improved method for 'motif-blind' CRM discovery that does not depend on knowledge or accurate prediction of TF-binding motifs and is effective when limited knowledge of functional CRMs is available to 'supervise' the search. We propose a new statistical method, based on 'Interpolated Markov Models', for motif-blind, genome-wide CRM discovery. It captures the statistical profile of variable length words in known CRMs of a regulatory network and finds candidate CRMs that match this profile. The method also uses orthologs of the known CRMs from closely related genomes. We perform in silico evaluation of predicted CRMs by assessing whether their neighboring genes are enriched for the expected expression patterns. This assessment uses a novel statistical test that extends the widely used Hypergeometric test of gene set enrichment to account for variability in intergenic lengths. We find that the new CRM prediction method is superior to existing methods. Finally, we experimentally validate 12 new CRM predictions by examining their regulatory activity in vivo in Drosophila; 10 of the tested CRMs were found to be functional, while 6 of the top 7 predictions showed the expected activity patterns. We make our program available as downloadable source code, and as a plugin for a genome browser installed on our servers. © The Author(s) 2011. Published by Oxford University Press.
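The in silico evaluation rests on a Hypergeometric test of gene-set enrichment; the paper's novelty is an extension that accounts for intergenic lengths, which is not modeled in the classical version sketched here. All counts below are invented:

```python
from math import comb

def hypergeom_enrichment_p(N, K, n, k):
    """P(X >= k): N genes total, K showing the expected expression pattern,
    n genes neighboring predicted CRMs, k of those n showing the pattern."""
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / comb(N, n)

# Toy numbers: 1000 genes, 50 with the pattern; 20 CRM-neighboring genes,
# 7 of which show the pattern (vs. ~1 expected by chance).
p = hypergeom_enrichment_p(1000, 50, 20, 7)
print(p < 1e-3)  # True: far more overlap than expected by chance
```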

  1. Determination of vitamin A (retinol) in infant formula and adult nutritionals by liquid chromatography: First Action 2011.15.

    PubMed

    DeVries, Jonathan W; Silvera, Karlene R; McSherry, Elliot; Dowell, Dawn

    2012-01-01

    During the "Standards Development and International Harmonization: AOAC INTERNATIONAL Mid-Year Meeting," held on June 29, 2011, an Expert Review Panel (ERP) reviewed the method for the "Determination of Vitamins A (Retinol) and E (alpha-Tocopherol) in Foods by Liquid Chromatography: Collaborative Study," published by Jonathan W. DeVries and Karlene R. Silvera in J. AOAC Int. in 2002. After evaluation of the original validation data, an ERP agreed in June 2011 that the method meets standard method performance requirements (SMPRs) for vitamin A, as articulated by the Stakeholder Panel on Infant Formula and Adult Nutritionals. The ERP granted the method First Action status, applicable to determining vitamin A in ready-to-eat infant and adult nutritional formula. In an effort to achieve Final Action status, it was recommended that additional information be generated for different types of infant and adult nutritional formula matrixes at varied concentration levels as indicated in the vitamin A (retinol) SMPR. Existing AOAC LC methods are suited for specific vitamin A analytical applications. The original method differs from existing methods in that it can be used to assay samples in all nine sectors of the food matrix. One sector of the food matrix was powdered infant formula and gave support for the First Action approval for vitamin A in infant and adult nutritional formula. In this method, standards and test samples are saponified in basic ethanol-water solution, neutralized, and diluted, converting fats to fatty acids and retinol esters to retinol. Retinol is quantitated by an LC method, using UV detection at 313 or 328 nm for retinol. Vitamin concentration is calculated by comparison of the peak heights or peak areas of retinol in test samples with those of standards.
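The final quantitation step described above, comparing peak heights or areas of retinol in test samples with those of standards, is external-standard arithmetic. A minimal sketch; the peak areas, standard concentration, and dilution factor are invented for illustration:

```python
def retinol_conc(area_sample, area_standard, standard_conc, dilution=1.0):
    """Retinol concentration in the units of standard_conc, by comparison of
    sample vs. standard peak areas (or heights) and correction for dilution."""
    return area_sample / area_standard * standard_conc * dilution

# Sample peak area 8200 vs. standard area 10000 at 2.0 ug/mL, 5-fold dilution.
print(retinol_conc(8200, 10000, 2.0, 5.0))  # ~8.2 ug/mL in the original sample
```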

  2. Evaluating Gene Set Enrichment Analysis Via a Hybrid Data Model

    PubMed Central

    Hua, Jianping; Bittner, Michael L.; Dougherty, Edward R.

    2014-01-01

    Gene set enrichment analysis (GSA) methods have been widely adopted by biological labs to analyze data and generate hypotheses for validation. Most of the existing comparison studies focus on whether the existing GSA methods can produce accurate P-values; however, practitioners are often more concerned with whether the methods generate the correct gene-set ranking. Ranking performance is closely related to two critical goals of GSA methods: the ability to reveal biological themes and to ensure reproducibility, especially for small-sample studies. We have conducted a comprehensive simulation study focusing on the ranking performance of seven representative GSA methods. We overcome the limited availability of real data sets by creating hybrid data models from existing large data sets. To build the data model, we pick a master gene from the data set to form the ground truth and artificially generate the phenotype labels. Multiple hybrid data models can be constructed from one data set, and multiple data sets of smaller sizes can be generated by resampling the original data set. This approach enables us to generate a large batch of data sets to check the ranking performance of GSA methods. Our simulation study reveals that, for the proposed data model, the Q2 type GSA methods generally perform better than other GSA methods, and the global test has the most robust results. The properties of a data set play a critical role in performance. For data sets with highly connected genes, all GSA methods suffer significantly in performance. PMID:24558298
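The hybrid data-model construction can be sketched under assumed details: a "master gene" defines the ground truth, phenotype labels are derived from its expression (a median split is one plausible labeling rule, not necessarily the authors'), and smaller studies are drawn by resampling samples. The expression matrix here is synthetic, not a real data set:

```python
import numpy as np

rng = np.random.default_rng(42)
n_genes, n_samples = 200, 60
expr = rng.normal(size=(n_genes, n_samples))    # genes x samples

master = 7                                      # index of the chosen master gene
# Artificial phenotype labels from a median split of the master gene.
labels = (expr[master] > np.median(expr[master])).astype(int)

def resample_small_study(expr, labels, n_sub, rng):
    """Draw a smaller study by sampling columns (samples) with replacement."""
    idx = rng.choice(expr.shape[1], size=n_sub, replace=True)
    return expr[:, idx], labels[idx]

# Many such small studies can be drawn to benchmark GSA ranking performance.
sub_expr, sub_labels = resample_small_study(expr, labels, 20, rng)
print(sub_expr.shape, sub_labels.shape)  # (200, 20) (20,)
```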

  3. Toward a Method for Exposing and Elucidating Ethical Issues with Human Cognitive Enhancement Technologies.

    PubMed

    Hofmann, Bjørn

    2017-04-01

    To develop a method for exposing and elucidating ethical issues with human cognitive enhancement (HCE). The intended use of the method is to support and facilitate open and transparent deliberation and decision making with respect to this emerging technology with great potential formative implications for individuals and society. Literature search to identify relevant approaches. Conventional content analysis of the identified papers and methods in order to assess their suitability for assessing HCE according to four selection criteria. Method development. Amendment after pilot testing on smart-glasses. Based on three existing approaches in health technology assessment a method for exposing and elucidating ethical issues in the assessment of HCE technologies was developed. Based on a pilot test for smart-glasses, the method was amended. The method consists of six steps and a guiding list of 43 questions. A method for exposing and elucidating ethical issues in the assessment of HCE was developed. The method provides the ground work for context specific ethical assessment and analysis. Widespread use, amendments, and further developments of the method are encouraged.

  4. Remediating Misconception on Climate Change among Secondary School Students in Malaysia

    ERIC Educational Resources Information Center

    Karpudewan, Mageswary; Roth, Wolff-Michael; Chandrakesan, Kasturi

    2015-01-01

    Existing studies report on secondary school students' misconceptions related to climate change; they also report on the methods of teaching as reinforcing misconceptions. This quasi-experimental study was designed to test the null hypothesis that a curriculum based on constructivist principles does not lead to greater understanding and fewer…

  5. Assessing the Psychometric Properties of Kember and Leung's Reflection Questionnaire

    ERIC Educational Resources Information Center

    Lethbridge, Kristen; Andrusyszyn, Mary-Anne; Iwasiw, Carroll; Laschinger, Heather K. S.; Fernando, Rajulton

    2013-01-01

    Reflective thinking is often stated as a learning outcome of baccalaureate nursing education, and as a characteristic of a competent professional; however, no consistent method exists to assess the extent to which students engage in reflective thinking. To address this need, Kember and Leung developed and tested a self-report questionnaire based…

  6. Hybrid Ultra-Low VOC and Non-HAP Rain Erosion Coatings

    DTIC Science & Technology

    2018-01-12

    cavitation test stand for running the modified ASTM G32 method... Objective: Numerous military aircraft and shipboard surfaces, such as radomes, antennas, gun shields, wing leading edges, and helicopter blade leading edges... The application market is extremely widespread. Luna will leverage existing internal contacts for...

  7. A Methodology for Zumbo's Third Generation DIF Analyses and the Ecology of Item Responding

    ERIC Educational Resources Information Center

    Zumbo, Bruno D.; Liu, Yan; Wu, Amery D.; Shear, Benjamin R.; Olvera Astivia, Oscar L.; Ark, Tavinder K.

    2015-01-01

    Methods for detecting differential item functioning (DIF) and item bias are typically used in the process of item analysis when developing new measures; adapting existing measures for different populations, languages, or cultures; or more generally validating test score inferences. In 2007 in "Language Assessment Quarterly," Zumbo…

  8. The Effect of Spoilers on the Enjoyment of Short Stories

    ERIC Educational Resources Information Center

    Levine, William H.; Betzner, Michelle; Autry, Kevin S.

    2016-01-01

    Recent research has provided evidence that the information provided before a story--a spoiler--may increase the enjoyment of that story, perhaps by increasing the processing fluency experienced during reading. In one experiment, we tested the reliability of these findings by closely replicating existing methods and the generality of these findings…

  9. REMOVAL OF AMMONIA TOXICITY IN MARINE SEDIMENT TESTS: A COMPARISON OF ULVA LACTUCA, ZEOLITE AND AERATION METHODS

    EPA Science Inventory

    Ammonia is suspected of causing some of the toxicity observed in marine sediment toxicity tests because it is sometimes found at elevated concentrations in marine interstitial waters. In marine waters, ammonia exists as un-ionized ammonia (NH3) and ammonium (NH4+) which combine ...

  10. Demulsification; industrial applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lissant, K.J.

    1983-01-01

    For scientists involved in the problems of selecting or designing demulsification programs. The author shows clearly why no pat formula exists to help out but does point out initial information required to start work. Theory. Testing. Demulsification of oil-in-water emulsions. Demulsification of water-in-oil emulsions. Demulsification of petroleum emulsions. Additional methods and areas in demulsification.

  11. 40 CFR 280.21 - Upgrading of existing UST systems.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... sound and free of corrosion holes prior to installing the cathodic protection system; or (ii) The tank... for corrosion holes by conducting two (2) tightness tests that meet the requirements of § 280.43(c... operation of the cathodic protection system; or (iv) The tank is assessed for corrosion holes by a method...

  12. 40 CFR 280.21 - Upgrading of existing UST systems.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... sound and free of corrosion holes prior to installing the cathodic protection system; or (ii) The tank... for corrosion holes by conducting two (2) tightness tests that meet the requirements of § 280.43(c... operation of the cathodic protection system; or (iv) The tank is assessed for corrosion holes by a method...

  13. 40 CFR 63.1365 - Test methods and initial compliance procedures.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... temperature of 760 °C, the design evaluation must document that these conditions exist. (ii) For a combustion... autoignition temperature of the organic HAP, must consider the vent stream flow rate, and must establish the design minimum and average temperature in the combustion zone and the combustion zone residence time. (B...

  14. 40 CFR 63.1365 - Test methods and initial compliance procedures.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... temperature of 760 °C, the design evaluation must document that these conditions exist. (ii) For a combustion... autoignition temperature of the organic HAP, must consider the vent stream flow rate, and must establish the design minimum and average temperature in the combustion zone and the combustion zone residence time. (B...

  15. 40 CFR 63.1365 - Test methods and initial compliance procedures.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... temperature of 760 °C, the design evaluation must document that these conditions exist. (ii) For a combustion... autoignition temperature of the organic HAP, must consider the vent stream flow rate, and must establish the design minimum and average temperature in the combustion zone and the combustion zone residence time. (B...

  16. 40 CFR 63.1365 - Test methods and initial compliance procedures.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... temperature of 760 °C, the design evaluation must document that these conditions exist. (ii) For a combustion... autoignition temperature of the organic HAP, must consider the vent stream flow rate, and must establish the design minimum and average temperature in the combustion zone and the combustion zone residence time. (B...

  17. Interactive visual analysis promotes exploration of long-term ecological data

    Treesearch

    T.N. Pham; J.A. Jones; R. Metoyer; F.J. Swanson; R.J. Pabst

    2013-01-01

    Long-term ecological data are crucial in helping ecologists understand ecosystem function and environmental change. Nevertheless, these kinds of data sets are difficult to analyze because they are usually large, multivariate, and spatiotemporal. Although existing analysis tools such as statistical methods and spreadsheet software permit rigorous tests of pre-conceived...

  18. Using Cardiac Biomarkers in Veterinary Practice.

    PubMed

    Oyama, Mark A

    2015-09-01

    Blood-based assays for various cardiac biomarkers can assist in the diagnosis of heart disease in dogs and cats. The two most common markers are cardiac troponin-I and N-terminal pro-B-type natriuretic peptide. Biomarker assays can assist in differentiating cardiac from noncardiac causes of respiratory signs and detection of preclinical cardiomyopathy. Increasingly, studies indicate that cardiac biomarker testing can help assess the risk of morbidity and mortality in animals with heart disease. Usage of cardiac biomarker testing in clinical practice relies on proper patient selection, correct interpretation of test results, and incorporation of biomarker testing into existing diagnostic methods. Copyright © 2015 Elsevier Inc. All rights reserved.

  19. Using cardiac biomarkers in veterinary practice.

    PubMed

    Oyama, Mark A

    2013-11-01

    Blood-based assays for various cardiac biomarkers can assist in the diagnosis of heart disease in dogs and cats. The two most common markers are cardiac troponin-I and N-terminal pro-B-type natriuretic peptide. Biomarker assays can assist in differentiating cardiac from noncardiac causes of respiratory signs and detection of preclinical cardiomyopathy. Increasingly, studies indicate that cardiac biomarker testing can help assess the risk of morbidity and mortality in animals with heart disease. Usage of cardiac biomarker testing in clinical practice relies on proper patient selection, correct interpretation of test results, and incorporation of biomarker testing into existing diagnostic methods. Copyright © 2013 Elsevier Inc. All rights reserved.

  20. Behavioral testing in rodent models of orofacial neuropathic and inflammatory pain

    PubMed Central

    Krzyzanowska, Agnieszka; Avendaño, Carlos

    2012-01-01

    Orofacial pain conditions are often very debilitating to the patient and difficult to treat. While clinical interest is high, the proportion of studies performed in the orofacial region in laboratory animals is relatively low, compared with other body regions. This is partly due to difficulties in testing freely moving animals and therefore lack of reliable testing methods. Here we present a comprehensive review of the currently used rodent models of inflammatory and neuropathic pain adapted to the orofacial areas, taking into account the difficulties and drawbacks of the existing approaches. We examine the available testing methods and procedures used for assessing the behavioral responses in the face in both mice and rats and provide a summary of some pharmacological agents used in these paradigms to date. The use of these agents in animal models is also compared with outcomes observed in the clinic. PMID:23139912

  1. Design of automata theory of cubical complexes with applications to diagnosis and algorithmic description

    NASA Technical Reports Server (NTRS)

    Roth, J. P.

    1972-01-01

    The following problems are considered: (1) methods for developing a logic design together with algorithms, so that a test can be computed for any failure in the logic design if such a test exists, and algorithms and heuristics for minimizing the computation of tests; and (2) a method of logic design for ultra LSI (large scale integration). It was discovered that the so-called quantum calculus can be extended to make it possible: (1) to describe the functional behavior of a mechanism component by component, and (2) to compute tests for failures in the mechanism using the diagnosis algorithm. The development of an algorithm for the multioutput two-level minimization problem is presented, and the program MIN 360 was written for this algorithm. The program has options of mode (exact minimum or various approximations), cost function, cost bound, etc., providing flexibility.
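The idea of computing a test for a failure, if one exists, can be illustrated by brute force on a tiny combinational circuit: a test pattern is any input that distinguishes the good circuit from the faulty one. The report's actual algorithms are far more sophisticated; the circuit and the stuck-at fault below are invented for illustration:

```python
from itertools import product

# Circuit under test: out = (a AND b) OR c.
def good(a, b, c):
    return (a & b) | c

def faulty(a, b, c):
    # Hypothetical fault: the AND gate's output is stuck at 0.
    return 0 | c

# A test exists iff some input vector makes good and faulty outputs differ.
tests = [v for v in product([0, 1], repeat=3) if good(*v) != faulty(*v)]
print(tests)  # [(1, 1, 0)]: a=b=1 exposes the fault, c=0 propagates it
```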

  2. Cross-Country Skiing Injuries and Training Methods.

    PubMed

    Nagle, Kyle B

    2015-01-01

    Cross-country skiing is a low injury-risk sport that has many health benefits and few long-term health risks. Some concern exists that cross-country skiing may be associated with a higher incidence of atrial fibrillation; however, mortality rates among skiers are lower than those among the general population. While continuing to emphasize aerobic and anaerobic training, training methods also should promote ski-specific strength training to increase maximum force and its rate of delivery and to build muscular endurance to maintain that power through a race. Multiple tests are available to monitor training progress. Which tests are most appropriate depends on the specific events targeted. In addition to laboratory-based tests, there also are many simpler, more cost-effective tests, such as short time trials, that can be used to monitor training progress and predict performance particularly at the junior skier level where access and cost may be more prohibitive.

  3. Assessing and Testing Hydrokinetic Turbine Performance and Effects on Open Channel Hydrodynamics: An Irrigation Canal Case Study.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gunawan, Budi; Neary, Vincent Sinclair; Mortensen, Josh

    Hydrokinetic energy from flowing water in open channels has the potential to support local electricity needs with lower regulatory or capital investment than impounding water with more conventional means. MOU agencies involved in federal hydropower development have identified the need to better understand the opportunities for hydrokinetic (HK) energy development within existing canal systems that may already have integrated hydropower plants. This document provides an overview of the main considerations, tools, and assessment methods for implementing field tests in an open-channel water system to characterize current energy converter (CEC) device performance and hydrodynamic effects. It describes open-channel processes relevant to an HK site and pertinent analyses to guide siting and CEC layout design, with the goal of streamlining the evaluation process and reducing the risk of interfering with existing uses of the site. This document outlines key site parameters of interest and effective tools and methods for measurement and analysis, with examples drawn from the Roza Main Canal in Yakima, WA to illustrate a site application.

  4. [The role of biotechnology in pharmaceutical drug design].

    PubMed

    Gaisser, Sibylle; Nusser, Michael

    2010-01-01

    Biotechnological methods have become an important tool in pharmaceutical drug research and development. Today approximately 15 % of drug revenues are derived from biopharmaceuticals. The most relevant indications are oncology, metabolic disorders and disorders of the musculoskeletal system. For the future it can be expected that the relevance of biopharmaceuticals will further increase. Currently, the share of substances in preclinical testing that rely on biotechnology is more than 25 % of all substances in preclinical testing. Products for the treatment of cancer, metabolic disorders and infectious diseases are most important. New therapeutic approaches such as RNA interference only play a minor role in current commercial drug research and development with 1.5 % of all biological preclinical substances. Investments in sustainable high technology such as biotechnology are of vital importance for a highly developed country like Germany because of its lack of raw materials. Biotechnology helps the pharmaceutical industry to develop new products, new processes, methods and services and to improve existing ones. Thus, international competitiveness can be strengthened, new jobs can be created and existing jobs preserved.

  5. Solution for testing large high-power laser lenses having long focal length (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Fappani, Denis; Ide, Monique

    2017-05-01

    Many high-power laser facilities are in operation around the world and include various critical optical components such as large focusing lenses. Such lenses generally have long focal lengths, which raises issues for optical testing during manufacturing and inspection. Their transmitted wavefronts must be very accurate, and interferometric testing is the baseline for verifying this; however, it is always a problem to manage simultaneously long testing distances and fine accuracy in such interferometry. Taking the example of the large focusing lenses produced for the Orion experiment at AWE (UK), the presentation describes the testing method developed to demonstrate good performance with sufficient repeatability and absolute accuracy. Special emphasis is placed on the optical manufacturing issues and interferometric testing solutions. Some ZEMAX results presenting the test set-up and the calibration method are shown as well. The presentation concludes with a brief overview of the existing "state of the art" at Thales SESO for these technologies.

  6. Absorbable magnesium-based stent: physiological factors to consider for in vitro degradation assessments

    PubMed Central

    Wang, Juan; Smith, Christopher E.; Sankar, Jagannathan; Yun, Yeoheung; Huang, Nan

    2015-01-01

    Absorbable metals have been widely tested in various in vitro settings using cells to evaluate their possible suitability as an implant material. However, a gap exists between in vivo and in vitro test results for absorbable materials. Many traditional in vitro assessments for permanent materials are no longer applicable to absorbable metallic implants. A key step is to identify and test the relevant microenvironment and parameters in test systems, which should be adapted according to the specific application. New test methods are necessary to reduce the difference between in vivo and in vitro test results and provide more accurate information to better understand absorbable metallic implants. In this investigative review, we strive to summarize the latest test methods for characterizing the bioabsorption/biodegradation behavior of absorbable magnesium-based stents in mimicking vascular environments. This article also comprehensively discusses the direction of test standardization for absorbable stents to paint a more accurate picture of the in vivo condition around implants and to determine the most important parameters and their dynamic interactions. PMID:26816631

  7. Hole filling with oriented sticks in ultrasound volume reconstruction

    PubMed Central

    Vaughan, Thomas; Lasso, Andras; Ungi, Tamas; Fichtinger, Gabor

    2015-01-01

    Volumes reconstructed from tracked planar ultrasound images often contain regions where no information was recorded. Existing interpolation methods introduce image artifacts and tend to be slow in filling large missing regions. Our goal was to develop a computationally efficient method that fills missing regions while adequately preserving image features. We use directional sticks to interpolate between pairs of known opposing voxels in nearby images. We tested our method on 30 volumetric ultrasound scans acquired from human subjects, and compared its performance to that of other published hole-filling methods. Reconstruction accuracy, fidelity, and time were improved compared with other methods. PMID:26839907
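The stick idea can be sketched in 2-D: for each missing pixel, search short "sticks" in a few orientations for a pair of known opposing neighbors, then linearly interpolate between them. This is a simplified illustration of the concept, not the authors' 3-D implementation; the orientation set, stick length, and test image are invented:

```python
import numpy as np

def fill_holes(img, max_len=3):
    """Fill NaN pixels by distance-weighted interpolation along short sticks."""
    out = img.copy()
    h, w = img.shape
    dirs = [(0, 1), (1, 0), (1, 1), (1, -1)]        # stick orientations
    for y, x in zip(*np.where(np.isnan(img))):
        filled = False
        for dy, dx in dirs:
            for r1 in range(1, max_len + 1):        # reach on one side
                for r2 in range(1, max_len + 1):    # reach on the other side
                    y1, x1 = y - dy * r1, x - dx * r1
                    y2, x2 = y + dy * r2, x + dx * r2
                    if (0 <= y1 < h and 0 <= x1 < w and
                            0 <= y2 < h and 0 <= x2 < w and
                            not np.isnan(img[y1, x1]) and
                            not np.isnan(img[y2, x2])):
                        # Linear interpolation weighted by distance along stick
                        out[y, x] = (img[y1, x1] * r2 + img[y2, x2] * r1) / (r1 + r2)
                        filled = True
                        break
                if filled:
                    break
            if filled:
                break
    return out

img = np.array([[0.0, 1.0, 2.0],
                [1.0, np.nan, 3.0],
                [2.0, 3.0, 4.0]])
print(fill_holes(img)[1, 1])  # 2.0, the midpoint of its horizontal neighbors
```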

  8. Noninvasive vacuum integrity tests on fast warm-up traveling-wave tubes

    NASA Astrophysics Data System (ADS)

    Dallos, A.; Carignan, R. G.

    1989-04-01

    A method of tube vacuum monitoring that uses the tube's existing internal electrodes as an ion gage is discussed. This method has been refined using present-day instrumentation and has proved to be a precise, simple, and fast method of tube vacuum measurement. The method is noninvasive due to operation of the cathode at low temperature, which minimizes pumping or outgassing. Because of the low current levels to be measured, anode insulator leakage must be low, and the leads must be properly shielded to minimize charging effects. A description of the method, instrumentation used, limitations, and data showing results over a period of 600 days are presented.

  9. A New Test Unit for Disintegration End-Point Determination of Orodispersible Films.

    PubMed

    Low, Ariana; Kok, Si Ling; Khong, Yuet Mei; Chan, Sui Yung; Gokhale, Rajeev

    2015-11-01

    No standard time or pharmacopoeia disintegration test method for orodispersible films (ODFs) exists. The USP disintegration test for tablets and capsules poses significant challenges for end-point determination when used for ODFs. We tested a newly developed disintegration test unit (DTU) against the USP disintegration test. The DTU is an accessory to the USP disintegration apparatus. It holds the ODF in a horizontal position, allowing top-view of the ODF during testing. A Gauge R&R study was conducted to assign relative contributions of the total variability from the operator, sample or the experimental set-up. Precision was compared using commercial ODF products in different media. Agreement between the two measurement methods was analysed. The DTU showed improved repeatability and reproducibility compared to the USP disintegration system with tighter standard deviations regardless of operator or medium. There is good agreement between the two methods, with the USP disintegration test giving generally longer disintegration times possibly due to difficulty in end-point determination. The DTU provided clear end-point determination and is suitable for quality control of ODFs during product developmental stage or manufacturing. This may facilitate the development of a standardized methodology for disintegration time determination of ODFs. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association.

  10. Exploiting MeSH indexing in MEDLINE to generate a data set for word sense disambiguation.

    PubMed

    Jimeno-Yepes, Antonio J; McInnes, Bridget T; Aronson, Alan R

    2011-06-02

    Evaluation of Word Sense Disambiguation (WSD) methods in the biomedical domain is difficult because the available resources are either too small or too focused on specific types of entities (e.g. diseases or genes). We present a method that automatically develops a WSD test collection using the Unified Medical Language System (UMLS) Metathesaurus and the manual MeSH indexing of MEDLINE, and we demonstrate it by building such a data set, called MSH WSD. In our method, the Metathesaurus is first screened to identify ambiguous terms whose possible senses consist of two or more MeSH headings. We then use each ambiguous term and its corresponding MeSH headings to extract MEDLINE citations where the term and only one of the MeSH headings co-occur. The term found in each MEDLINE citation is automatically assigned the UMLS Concept Unique Identifier (CUI) linked to that MeSH heading, so every instance carries a CUI sense label. We compare the characteristics of the MSH WSD data set to the previously existing NLM WSD data set. The resulting MSH WSD data set consists of 106 ambiguous abbreviations, 88 ambiguous terms and 9 that are a combination of both, for a total of 203 ambiguous entities. For each ambiguous term/abbreviation, the data set contains a maximum of 100 instances per sense obtained from MEDLINE. We evaluated the reliability of the MSH WSD data set using existing knowledge-based methods and compared their performance to the results these algorithms previously obtained on the pre-existing NLM WSD data set. We show that the knowledge-based methods achieve different results but keep their relative performance, except for the Journal Descriptor Indexing (JDI) method, whose performance falls below that of the other methods. The MSH WSD data set allows the evaluation of WSD algorithms in the biomedical domain. Compared to previously existing data sets, MSH WSD contains a larger number of biomedical terms/abbreviations and covers the largest set of UMLS Semantic Types. Furthermore, the MSH WSD data set has been generated automatically by reusing already existing annotations and can therefore be regenerated from subsequent UMLS versions.
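
    A minimal sketch of the co-occurrence filter described above, on hypothetical citations (the actual pipeline screens the full UMLS Metathesaurus and MEDLINE indexing): a citation becomes a sense-annotated instance only when the ambiguous term appears in its text and exactly one of the term's candidate MeSH headings appears in its indexing.

```python
# Sketch of the MSH WSD co-occurrence filter (hypothetical data, not the
# actual UMLS/MEDLINE pipeline): a citation is usable as a sense-annotated
# instance only when the ambiguous term appears in its text and exactly one
# of the term's candidate MeSH headings appears in its indexing.

def label_instances(term, candidate_headings, citations):
    """citations: list of (text, set_of_mesh_headings) tuples."""
    instances = []
    for text, headings in citations:
        if term.lower() not in text.lower():
            continue
        matched = candidate_headings & headings
        if len(matched) == 1:            # unambiguous sense assignment
            instances.append((text, matched.pop()))
    return instances

citations = [
    ("Cold exposure and shivering ...", {"Cold Temperature"}),
    ("Symptoms of the common cold ...", {"Common Cold"}),
    ("Cold weather and cold viruses ...", {"Cold Temperature", "Common Cold"}),
]
labeled = label_instances("cold", {"Cold Temperature", "Common Cold"}, citations)
# The third citation is discarded because both candidate headings co-occur.
```

    The discarded third citation is exactly the case the method avoids: when more than one candidate heading is indexed, the sense cannot be assigned automatically.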

  11. Research of ceramic matrix for a safe immobilization of radioactive sludge waste

    NASA Astrophysics Data System (ADS)

    Dorofeeva, Ludmila; Orekhov, Dmitry

    2018-03-01

    The existing method for hardening radioactive waste by fixation in a ceramic matrix was investigated and improved. For samples coated with sodium silicate and tested after storage in air, the radionuclide leaching rate was determined. The properties of the clay ceramic and the optimum sintering conditions were established. Experimental data were obtained on how the sintering temperature regime and the amounts of water, sludge, and additives in the samples influence their mechanical durability and water resistance. A comparative analysis of the conducted research, aimed at improving the existing method of hardening radioactive waste by inclusion in a ceramic matrix, reveals the advantages of the obtained results over analogs.

  12. Susceptibility screening of hyphae-forming fungi with a new, easy, and fast inoculum preparation method.

    PubMed

    Schmalreck, Arno; Willinger, Birgit; Czaika, Viktor; Fegeler, Wolfgang; Becker, Karsten; Blum, Gerhard; Lass-Flörl, Cornelia

    2012-12-01

    In vitro susceptibility testing of clinically important fungi is becoming increasingly essential due to the rising number of fungal infections in patients with impaired immune systems. Existing standardized microbroth dilution methods for in vitro testing of molds (CLSI, EUCAST) are not intended for routine testing; they are very time-consuming and depend on the sporulation of hyphomycetes. In this multicentre study, a new inoculum preparation method that is independent of sporulation (containing a mixture of vegetative cells, hyphae, and conidia) was evaluated. Minimal inhibitory concentrations (MIC) of amphotericin B, posaconazole, and voriconazole for 180 molds were determined with two different culture media (YST and RPMI 1640) according to the DIN (Deutsches Institut für Normung) microdilution assay. The 24 and 48 h MICs of quality control strains, tested in each test run and prepared with the new inoculum method, were within the DIN range. YST and RPMI 1640 media showed similar MIC distributions for all molds tested. MIC readings at 48 versus 24 h yielded 1 log(2) higher MIC values, and more than 90 % of the MICs read at 24 and 48 h were within ± 2 log(2) dilutions. MIC end point comparison (log(2) MIC-RPMI 1640 - log(2) MIC-YST) of both media demonstrated a tendency toward slightly lower MICs with RPMI 1640 medium. This study reports the results of a new, time-saving, and easy-to-perform method for inoculum preparation for routine susceptibility testing that can be applied to all types of spore-forming, non-spore-forming, and hyphae-forming fungi.

  13. The Effect of Pilates Exercise on Trunk and Postural Stability and Throwing Velocity in College Baseball Pitchers: Single Subject Design

    PubMed Central

    Howe, Katherine

    2007-01-01

    Background Baseball pitchers need trunk strength to maximize performance. The Pilates method of exercise is gaining popularity throughout the country as a fitness and rehabilitation method. However, very few studies exist that examine the effects of the Pilates method on trunk strength or performance. Objectives Using a single-subject, multiple-baseline-across-subjects design, this study examines the effects of the Pilates method of exercise on performance of double leg lowering, the star excursion balance test, and throwing velocity in college-aged baseball pitchers. Methods A convenience sample of three college baseball pitchers served as the subjects for this single-subject design study. For each subject, double leg lowering, the star excursion balance test, and throwing speed were measured prior to the introduction of the intervention. When baseline test values showed consistent performance, the intervention was introduced to one subject at a time. The intervention was introduced to the other subjects over a period of 4 weeks as they also demonstrated consistent performance on the baseline tests. The intervention was continued, with periodic tests, for the remainder of the 10-week trial. Results Each subject improved in performance on double leg lowering (increased 24.43-32.7%) and the star excursion balance test (increased 4.63-17.84%) after introduction of the intervention. Throwing speed improved in two of the three subjects (up to 5.61%). Discussion and Conclusions The Pilates method of exercise may contribute to improved performance in double leg lowering, star excursion balance tests, and throwing speed in college baseball pitchers. PMID:21522199

  14. Effectiveness comparison of partially executed t-way test suites generated by existing strategies

    NASA Astrophysics Data System (ADS)

    Othman, Rozmie R.; Ahmad, Mohd Zamri Zahir; Ali, Mohd Shaiful Aziz Rashid; Zakaria, Hasneeza Liza; Rahman, Md. Mostafijur

    2015-05-01

    Consuming 40 to 50 percent of software development cost, software testing is one of the most resource-consuming activities in the software development lifecycle. To ensure an acceptable level of quality and reliability of a typical software product, it is desirable to test every possible combination of input data under various configurations. Due to the combinatorial explosion problem, exhaustive testing is practically impossible; resource constraints, costing factors, and strict time-to-market deadlines are among the main factors that inhibit such consideration. Earlier work suggests that a sampling strategy based on t-way parameter interaction (called t-way testing) can effectively reduce the number of test cases without affecting the fault detection capability. However, for a very large system, even a t-way strategy will produce a large test suite that needs to be executed. In the end, only part of the planned test suite can be executed in order to meet the aforementioned constraints. Here, test engineers need to measure the effectiveness of the partially executed test suite in order to assess the risk they have to take. Motivated by this problem, this paper presents an effectiveness comparison of partially executed t-way test suites generated by existing strategies using the tuples coverage method. With it, test engineers can predict the effectiveness of the testing process if only part of the original test cases is executed.
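
    The tuples-coverage measure can be sketched for t = 2 on a hypothetical toy configuration (three binary parameters, not any of the compared strategies' benchmarks): the effectiveness of a partially executed suite is the fraction of all required parameter-value pairs it still covers.

```python
# Sketch of the tuples-coverage measure for a partially executed test suite
# (illustrative, t = 2): effectiveness is the fraction of all required
# parameter-value pairs that the executed subset covers.

from itertools import combinations, product

def pair_coverage(executed_tests, domains):
    """domains: list of value lists, one per parameter.
    executed_tests: list of tuples, one value per parameter."""
    required = set()
    for i, j in combinations(range(len(domains)), 2):
        for vi, vj in product(domains[i], domains[j]):
            required.add((i, j, vi, vj))
    covered = set()
    for test in executed_tests:
        for i, j in combinations(range(len(test)), 2):
            covered.add((i, j, test[i], test[j]))
    return len(covered & required) / len(required)

domains = [[0, 1], [0, 1], [0, 1]]        # three binary parameters
full_pairwise = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]
print(pair_coverage(full_pairwise, domains))      # covers all 12 pairs -> 1.0
print(pair_coverage(full_pairwise[:2], domains))  # partial execution -> 0.5
```

    Executing only the first two of the four pairwise tests leaves half of the required pairs uncovered, which is precisely the risk measure the paper compares across strategies.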

  15. Development of thermal control methods for specialized components and scientific instruments at very low temperatures (follow-on)

    NASA Technical Reports Server (NTRS)

    Wright, J. P.; Wilson, D. E.

    1976-01-01

    Many payloads currently proposed to be flown by the space shuttle system require long-duration cooling in the 3 to 200 K temperature range. Common requirements also exist for certain DOD payloads. Parametric design and optimization studies are reported for multistage and diode heat pipe radiator systems designed to operate in this temperature range. Also optimized are ground test systems for two long-life passive thermal control concepts operating under specified space environmental conditions. The ground test systems evaluated are ultimately intended to evolve into flight test qualification prototypes for early shuttle flights.

  16. The effects of time on the capacity of pipe piles in dense marine sand

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chow, F.C.; Jardine, R.J.; Brucy, F.

    Investigations into pile behavior in dense marine sand have been performed by IFP and IC at Dunkirk, northern France. In the most recent series of tests, strain-gauged, open-ended pipe piles, driven and statically load tested in 1989, were retested in 1994. Surprisingly large increases in shaft capacity were measured. The possible causes are evaluated in relation to previous case histories, laboratory soil tests, pile corrosion, and new effective stress analyses developed using smaller, more intensively instrumented piles. The shaft capacities predicted by existing design methods are also assessed. 51 refs., 12 figs., 4 tabs.

  17. A statistical investigation of z test and ROC curve on seismo-ionospheric anomalies in TEC associated earthquakes in Taiwan during 1999-2014

    NASA Astrophysics Data System (ADS)

    Shih, A. L.; Liu, J. Y. G.

    2015-12-01

    A median-based method and a z test are employed to find characteristics of the seismo-ionospheric precursor (SIP) of the total electron content (TEC) in the global ionosphere map (GIM) associated with 129 M≥5.5 earthquakes in Taiwan during 1999-2014. Results show that both negative and positive anomalies in the GIM TEC, statistically significant under the z test, appear a few days before the earthquakes. The receiver operating characteristic (ROC) curve is further applied to examine whether the SIPs exist in Taiwan.
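
    A median-based anomaly check of this general kind can be sketched as follows; the window length, quartile bounds, and factor k below are assumptions for illustration, not the paper's exact settings.

```python
# Illustrative median-based anomaly check (window and k are assumptions):
# a value is flagged when it falls outside median +/- k * (quartile spread)
# of the preceding window of TEC observations.

import statistics

def detect_anomaly(history, value, k=1.5):
    med = statistics.median(history)
    q = statistics.quantiles(history, n=4)      # [Q1, Q2, Q3]
    upper = med + k * (q[2] - med)
    lower = med - k * (med - q[0])
    if value > upper:
        return "positive"
    if value < lower:
        return "negative"
    return None

tec_history = [20, 21, 19, 22, 20, 21, 20, 19, 22, 21, 20, 21, 19, 20, 21]
print(detect_anomaly(tec_history, 35))   # positive anomaly
print(detect_anomaly(tec_history, 20))   # within bounds -> None
```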

  18. Investigation of Super*Zip separation joint

    NASA Technical Reports Server (NTRS)

    Bement, Laurence J.; Schimmel, Morry L.

    1988-01-01

    An investigation was conducted to determine the most likely cause of two failures in five tests of 79-inch-diameter Lockheed Super*Zip spacecraft separation joints being used in the development of a Shuttle/Centaur propulsion system. This joint utilizes an explosively expanded tube to fracture surrounding prenotched aluminum plates to achieve planar separation. A test method was developed, and more than 300 test firings were made to provide an understanding of severance mechanisms and the effects of system variables on functional performance. An approach for defining functional margin was developed, and specific recommendations were made for improving existing and future systems.

  19. The effect of intra-wellbore head losses in a vertical well

    NASA Astrophysics Data System (ADS)

    Wang, Quanrong; Zhan, Hongbin

    2017-05-01

    Flow to a partially penetrating vertical well is made more complex by intra-wellbore losses. These are caused not only by the frictional effect, but also by the kinematic effect, which consists of the accelerational and fluid inflow effects inside a wellbore. Existing models of flow to a partially penetrating vertical well assume either a uniform-flux boundary condition (UFBC) or a uniform-head boundary condition (UHBC) for treating the flow into the wellbore. Neither approach considers intra-wellbore losses. In this study a new general solution, named the mixed-type boundary condition (MTBC) solution, is introduced to include intra-wellbore losses. It is developed from the existing solutions using a hybrid analytical-numerical method. The MTBC solution is capable of modeling various types of aquifer tests (constant-head tests, constant-rate tests, and slug tests) for partially or fully penetrating vertical wells in confined aquifers. Results show that intra-wellbore losses (both frictional and kinematic) can be significant in the early pumping stage. At later pumping times the UHBC solution is adequate because the difference between the MTBC and UHBC solutions becomes negligible.

  20. The General Mission Analysis Tool (GMAT) System Test Plan

    NASA Technical Reports Server (NTRS)

    Conway, Darrel J.; Hughes, Steven P.

    2007-01-01

    This document serves as the System Test Approach for the GMAT Project. Preparation for system testing consists of three major stages: 1) The Test Approach sets the scope of system testing, the overall strategy to be adopted, the activities to be completed, the general resources required, and the methods and processes to be used to test the release. 2) Test Planning details the activities, dependencies, and effort required to conduct the System Test. 3) Test Cases documents the tests to be applied, the data to be processed, the automated testing coverage, and the expected results. This document covers the first two of these items and establishes the framework used for GMAT test case development. The test cases themselves exist as separate components and are managed outside of, and concurrently with, this System Test Plan.

  1. Software reliability perspectives

    NASA Technical Reports Server (NTRS)

    Wilson, Larry; Shen, Wenhui

    1987-01-01

    Software which is used in life-critical functions must be known to be highly reliable before installation. This requires a strong testing program to estimate the reliability, since neither formal methods, software engineering, nor fault tolerant methods can guarantee perfection. Prior to final testing, software goes through a debugging period, and many models have been developed to try to estimate reliability from the debugging data. However, the existing models are poorly validated and often give poor performance. This paper emphasizes that part of these models' failures can be attributed to the random nature of the debugging data given to them as input, and it poses the problem of correcting this defect as an area of future research.

  2. Multilabel learning via random label selection for protein subcellular multilocations prediction.

    PubMed

    Wang, Xiao; Li, Guo-Zheng

    2013-01-01

    Prediction of protein subcellular localization is an important but challenging problem, particularly when proteins may simultaneously exist at, or move between, two or more different subcellular location sites. Most existing protein subcellular localization methods can only deal with single-location proteins. In the past few years, only a few methods have been proposed to tackle proteins with multiple locations. However, they adopt a simple strategy, transforming the multilocation proteins into multiple proteins with a single location each, which does not take correlations among different subcellular locations into account. In this paper, a novel method named multilabel learning via random label selection (RALS), which extends the simple binary relevance (BR) method, is proposed to learn from multilocation proteins in an effective and efficient way. RALS does not explicitly find the correlations among labels, but rather implicitly attempts to learn the label correlations from data by augmenting the original feature space with randomly selected labels as additional input features. Through a fivefold cross-validation test on a benchmark data set, we demonstrate that our proposed method, which considers label correlations, clearly outperforms the baseline BR method, which does not, indicating that correlations among different subcellular locations really exist and contribute to improved prediction performance. Experimental results on two benchmark data sets also show that our proposed methods achieve significantly higher performance than some other state-of-the-art methods in predicting subcellular multilocations of proteins. The prediction web server is available at http://levis.tongji.edu.cn:8080/bioinfo/MLPred-Euk/ for public use.
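
    The feature-augmentation step at the heart of RALS can be sketched on hypothetical data (the paper's protein features and per-label classifiers are not reproduced here): for a given target label, the values of a randomly selected subset of the other labels are appended to the feature vector, so a per-label binary model can pick up label correlations implicitly.

```python
# Minimal sketch of the RALS feature-augmentation idea (hypothetical data):
# randomly selected non-target labels are appended to the feature vector.

import random

def augment(features, labels, selected):
    """Append the values of the randomly pre-selected non-target labels."""
    return features + [labels[j] for j in selected]

random.seed(0)
n_labels = 4
target = 2
others = [j for j in range(n_labels) if j != target]
selected = random.sample(others, k=2)    # the "random label selection" step

x = [0.7, 1.2, 0.3]          # original feature vector of a training protein
y = [1, 0, 1, 1]             # its full label vector (subcellular locations)
x_aug = augment(x, y, selected)
print(len(x_aug))            # 3 original features + 2 selected labels = 5
```

    At test time the selected labels are unknown, so in practice their values would come from first-stage BR predictions; that second stage is omitted from this sketch.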

  3. Detecting imipenem resistance in Acinetobacter baumannii by automated systems (BD Phoenix, Microscan WalkAway, Vitek 2); high error rates with Microscan WalkAway

    PubMed Central

    2009-01-01

    Background Increasing reports of carbapenem-resistant Acinetobacter baumannii infections are of serious concern. Reliable susceptibility testing results remain a critical issue for the clinical outcome. Automated systems are increasingly used for species identification and susceptibility testing. This study was organized to evaluate the accuracies of three widely used automated susceptibility testing methods for testing the imipenem susceptibilities of A. baumannii isolates, by comparison to validated test methods. Methods 112 selected clinical isolates of A. baumannii collected between January 2003 and May 2006 were tested to confirm imipenem susceptibility results. Strains were tested against imipenem by the reference broth microdilution (BMD), disk diffusion (DD), Etest, BD Phoenix, MicroScan WalkAway and Vitek 2 automated systems. Data were analysed by comparing the results from each test method to those produced by the reference BMD test. Results MicroScan correctly identified all A. baumannii strains, while Vitek 2 failed to identify one strain and Phoenix failed to identify two strains and misidentified two others. Eighty-seven of the strains (78%) were resistant to imipenem by BMD. Etest, Vitek 2 and BD Phoenix produced acceptable error rates when tested against imipenem. Etest showed the best performance, with only two minor errors (1.8%). Vitek 2 produced eight minor errors (7.2%). BD Phoenix produced three major errors (2.8%). DD produced two very major errors (1.8%) (0.3% above the acceptable limit) and three major errors (2.7%). MicroScan showed the worst performance in susceptibility testing, with unacceptable error rates: 28 very major errors (25%) and 50 minor errors (44.6%). Conclusion Reporting errors for A. baumannii against imipenem do exist in susceptibility testing systems. We suggest that clinical laboratories using the MicroScan system routinely should consider using a second, independent antimicrobial susceptibility testing method to validate imipenem susceptibility. Etest, wherever available, may be used as an easy method to confirm imipenem susceptibility. PMID:19291298

  4. Bias correction for selecting the minimal-error classifier from many machine learning models.

    PubMed

    Ding, Ying; Tang, Shaowu; Liao, Serena G; Jia, Jia; Oesterreich, Steffi; Lin, Yan; Tseng, George C

    2014-11-15

    Supervised machine learning is commonly applied in genomic research to construct a classifier from the training data that is generalizable to predict independent testing data. When test datasets are not available, cross-validation is commonly used to estimate the error rate. Many machine learning methods are available, and it is well known that no universally best method exists in general. It has been common practice to apply many machine learning methods and report the method that produces the smallest cross-validation error rate. Theoretically, such a procedure produces a selection bias. Consequently, many clinical studies with moderate sample sizes (e.g. n = 30-60) risk reporting a falsely small cross-validation error rate that could not be validated later in independent cohorts. In this article, we illustrate the probabilistic framework of the problem and explore the statistical and asymptotic properties. We propose a new bias correction method based on learning curve fitting by the inverse power law (IPL) and compare it with three existing methods: nested cross-validation, weighted mean correction and the Tibshirani-Tibshirani procedure. All methods were compared in simulation datasets, five moderate-size real datasets and two large breast cancer datasets. The results showed that IPL outperforms the other methods in bias correction with smaller variance, and it has the additional advantage of extrapolating error estimates for larger sample sizes, a practical feature for deciding whether more samples should be recruited to improve the classifier and its accuracy. An R package 'MLbias' and all source files are publicly available at tsenglab.biostat.pitt.edu/software.htm. Contact: ctseng@pitt.edu. Supplementary data are available at Bioinformatics online.
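
    The selection bias the paper addresses is easy to reproduce in a small simulation (illustrative numbers, not the paper's experiments): on pure-noise data every classifier's true error is 50%, yet reporting the minimum cross-validation error across many candidate methods gives an optimistically small estimate.

```python
# Small simulation of selection bias from reporting the minimum CV error:
# with labels independent of features, every "method" is equivalent to
# random guessing, so each observed CV error is Binomial(n, 0.5) / n.

import random

random.seed(1)
n_samples, n_methods = 40, 20

def cv_error():
    return sum(random.random() < 0.5 for _ in range(n_samples)) / n_samples

errors = [cv_error() for _ in range(n_methods)]
print(round(min(errors), 3))              # noticeably below the true 0.5
print(round(sum(errors) / n_methods, 3))  # the mean stays near 0.5
```

    The gap between the minimum and the mean is the bias that methods such as IPL, nested cross-validation, weighted mean correction, and the Tibshirani-Tibshirani procedure attempt to correct.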

  5. Application of Bayesian configural frequency analysis (BCFA) to determine characteristics user and non-user motor X

    NASA Astrophysics Data System (ADS)

    Mawardi, Muhamad Iqbal; Padmadisastra, Septiadi; Tantular, Bertho

    2018-03-01

    Configural Frequency Analysis (CFA) is a method for cell-wise testing in contingency tables, an exploratory search for types and antitypes that detects discrepancy from the model as a significant difference between observed and expected frequencies. The analysis focuses on the interaction among categories from different variables, not on the interaction among the variables themselves. One extension of CFA is Bayesian CFA, an alternative that pursues the same goal as the frequentist version with the advantage that adjustment of the experiment-wise significance level α is not necessary, and that it can test whether groups of types and antitypes form composite types or composite antitypes. This paper presents the concept of Bayesian CFA and shows how it works on real data. The data come from a company case study of declining Brand Awareness & Image for motor X on the Top Of Mind Unit indicator in Cirebon City (30.8% among users and 9.8% among non-users). The B-CFA results yield four deviating characteristics; one of them, configuration 2212, needs particular attention from the company in determining a promotion strategy to maintain and improve Top Of Mind Unit in Cirebon City.

  6. Astigmatism error modification for absolute shape reconstruction using Fourier transform method

    NASA Astrophysics Data System (ADS)

    He, Yuhang; Li, Qiang; Gao, Bo; Liu, Ang; Xu, Kaiyuan; Wei, Xiaohong; Chai, Liqun

    2014-12-01

    A method is proposed to modify astigmatism errors in absolute shape reconstruction of an optical plane using the Fourier transform method. If a transmission flat and a reflection flat are used in an absolute test, two translation measurements allow the absolute shapes to be obtained by making use of the characteristic relationship between the differential and original shapes in the spatial frequency domain. However, because the translation device cannot keep the test and reference flats rigidly parallel after the translations, a tilt error exists in the obtained differential data, which causes power and astigmatism errors in the reconstructed shapes. To modify the astigmatism errors, a rotation measurement is added. Based on the rotational invariance of the form of the Zernike polynomial in a circular domain, the astigmatism terms are calculated by solving polynomial-coefficient equations related to the rotation differential data, and the erroneous astigmatism terms are subsequently corrected. Computer simulation proves the validity of the proposed method.
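
    The rotation step can be illustrated with the astigmatism pair alone (a self-contained numeric sketch, not the paper's full Zernike solver): rotating the flat by α rotates the astigmatism coefficient pair (a, b) by 2α, so the rotation differential yields a 2x2 linear system whose solution recovers the absolute astigmatism.

```python
# Sketch: recover absolute astigmatism coefficients from a rotation
# differential. Rotation by alpha maps c = (a, b) to R(2*alpha) c, so the
# differential d = R(2*alpha) c - c is a 2x2 linear system in c.

import math

def recover_astigmatism(d_a, d_b, alpha):
    c2, s2 = math.cos(2 * alpha), math.sin(2 * alpha)
    m11, m12 = c2 - 1, -s2
    m21, m22 = s2, c2 - 1
    det = m11 * m22 - m12 * m21          # = 2 - 2*cos(2*alpha), nonzero
    a = (d_a * m22 - m12 * d_b) / det    # Cramer's rule
    b = (m11 * d_b - m21 * d_a) / det
    return a, b

# Forward-simulate a rotation differential for known astigmatism (0.3, -0.1)
alpha = math.radians(30)
a0, b0 = 0.3, -0.1
c2, s2 = math.cos(2 * alpha), math.sin(2 * alpha)
d_a = (c2 * a0 - s2 * b0) - a0
d_b = (s2 * a0 + c2 * b0) - b0
print(recover_astigmatism(d_a, d_b, alpha))  # recovers (0.3, -0.1)
```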

  7. A dynamic integrated fault diagnosis method for power transformers.

    PubMed

    Gao, Wensheng; Bai, Cuifen; Liu, Tong

    2015-01-01

    In order to diagnose transformer faults efficiently and accurately, a dynamic integrated fault diagnosis method based on a Bayesian network is proposed in this paper. First, an integrated fault diagnosis model is established based on the causal relationships among abnormal working conditions, failure modes, and failure symptoms of transformers, aimed at obtaining the most probable failure mode. Then, considering that the evidence input into the diagnosis model is acquired gradually and that fault diagnosis in practice is a multistep process, a dynamic fault diagnosis mechanism is proposed based on the integrated fault diagnosis model. Unlike the existing one-step diagnosis mechanism, it includes a multistep evidence-selection process, which identifies the most effective diagnostic test to perform in the next step. It can therefore reduce unnecessary diagnostic tests and improve the accuracy and efficiency of diagnosis. Finally, the dynamic integrated fault diagnosis method is applied to actual cases, and its validity is verified.
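
    The core update in such a Bayesian diagnosis loop can be sketched with toy numbers (the faults and likelihoods below are invented for illustration, not the paper's transformer model): as each symptom is observed, the posterior over failure modes is updated by Bayes' rule, and the most probable mode is read off.

```python
# Toy Bayesian-network-style update for a multistep diagnosis loop
# (hypothetical faults and likelihoods, not the paper's model).

def update(prior, likelihood):
    """prior: {fault: p}; likelihood: {fault: P(symptom | fault)}."""
    post = {f: prior[f] * likelihood[f] for f in prior}
    z = sum(post.values())
    return {f: p / z for f, p in post.items()}

prior = {"winding_fault": 0.2, "overheating": 0.5, "normal": 0.3}
dga_gas_high = {"winding_fault": 0.7, "overheating": 0.6, "normal": 0.05}

posterior = update(prior, dga_gas_high)   # evidence from one diagnostic test
best = max(posterior, key=posterior.get)
print(best)
```

    The evidence-selection step of the paper would then rank the remaining diagnostic tests by how much each is expected to sharpen this posterior; that ranking is omitted from the sketch.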

  8. A Dynamic Integrated Fault Diagnosis Method for Power Transformers

    PubMed Central

    Gao, Wensheng; Liu, Tong

    2015-01-01

    In order to diagnose transformer faults efficiently and accurately, a dynamic integrated fault diagnosis method based on a Bayesian network is proposed in this paper. First, an integrated fault diagnosis model is established based on the causal relationships among abnormal working conditions, failure modes, and failure symptoms of transformers, aimed at obtaining the most probable failure mode. Then, considering that the evidence input into the diagnosis model is acquired gradually and that fault diagnosis in practice is a multistep process, a dynamic fault diagnosis mechanism is proposed based on the integrated fault diagnosis model. Unlike the existing one-step diagnosis mechanism, it includes a multistep evidence-selection process, which identifies the most effective diagnostic test to perform in the next step. It can therefore reduce unnecessary diagnostic tests and improve the accuracy and efficiency of diagnosis. Finally, the dynamic integrated fault diagnosis method is applied to actual cases, and its validity is verified. PMID:25685841

  9. Autoregressive statistical pattern recognition algorithms for damage detection in civil structures

    NASA Astrophysics Data System (ADS)

    Yao, Ruigen; Pakzad, Shamim N.

    2012-08-01

    Statistical pattern recognition has recently emerged as a promising set of methods complementary to system identification for automatic structural damage assessment. Its essence is to use well-known concepts in statistics to define the boundaries of different pattern classes, such as those for damaged and undamaged structures. In this paper, several statistical pattern recognition algorithms using autoregressive models, including statistical control charts and hypothesis testing, are reviewed as potentially competitive damage detection techniques. To enhance the performance of the statistical methods, new feature extraction techniques using model spectra and residual autocorrelation, together with resampling-based threshold construction methods, are proposed. Subsequently, simulated acceleration data from a multi-degree-of-freedom system are generated to test and compare the efficiency of the existing and proposed algorithms. Data from laboratory experiments conducted on a truss and a large-scale bridge slab model are then used to further validate the damage detection methods and demonstrate the superior performance of the proposed algorithms.
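
    The residual-based logic behind such autoregressive damage features can be sketched with an AR(1) model on simulated signals (a deliberately minimal stand-in for the paper's higher-order models and control charts): the model is fit to baseline response data, and a rise in residual variance on new data suggests the structure's dynamics have changed.

```python
# Sketch of a residual-based damage indicator (simulated signals, AR(1)
# for brevity): fit an AR model to the healthy state, then compare residual
# spread on new data against the baseline.

import math
import random

def fit_ar1(x):
    num = sum(x[t] * x[t - 1] for t in range(1, len(x)))
    den = sum(x[t - 1] ** 2 for t in range(1, len(x)))
    return num / den

def residual_std(x, phi):
    res = [x[t] - phi * x[t - 1] for t in range(1, len(x))]
    mean = sum(res) / len(res)
    return math.sqrt(sum((r - mean) ** 2 for r in res) / len(res))

def simulate(phi, noise, n=2000, seed=0):
    rng = random.Random(seed)
    x = [0.0]
    for _ in range(n):
        x.append(phi * x[-1] + rng.gauss(0, noise))
    return x

baseline = simulate(0.8, 1.0)
damaged = simulate(0.6, 1.5, seed=1)      # altered dynamics + higher noise

phi = fit_ar1(baseline)                   # learned from the healthy state
ratio = residual_std(damaged, phi) / residual_std(baseline, phi)
print(round(ratio, 2))                    # clearly above 1 -> flag damage
```

    In practice the flagging threshold would come from a control chart or the resampling-based construction the paper proposes, not a fixed ratio.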

  10. Automated detection of age-related macular degeneration in OCT images using multiple instance learning

    NASA Astrophysics Data System (ADS)

    Sun, Weiwei; Liu, Xiaoming; Yang, Zhou

    2017-07-01

    Age-related Macular Degeneration (AMD) is a macular disease that mostly occurs in elderly people, and it may cause decreased vision or even lead to permanent blindness. Drusen are an important clinical indicator of AMD that can help doctors diagnose the disease and decide on a treatment strategy. Optical Coherence Tomography (OCT) is widely used in the diagnosis of ophthalmic diseases, including AMD. In this paper, we propose a classification method based on Multiple Instance Learning (MIL) to detect AMD. Because drusen may exist in only a few slices of the OCT images, MIL is well suited to this task. The method has two phases: a training phase and a testing phase. In training, we extract initial features and cluster them to create a codebook; in testing, we apply the trained classifier to the test set. Experimental results show that our method achieves high accuracy and effectiveness.
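
    The multiple-instance decision rule that makes MIL a natural fit here can be sketched in a few lines (illustrative scores, not a trained OCT model): an OCT volume is a bag of slice instances, and the bag is called positive if any single slice scores above a drusen threshold, since drusen may appear in only a few slices.

```python
# Minimal multiple-instance (max-pooling) bag rule with illustrative scores:
# a volume (bag) of B-scan slices (instances) is AMD-positive if any slice
# score exceeds the threshold.

def classify_bag(instance_scores, threshold=0.5):
    return int(max(instance_scores) > threshold)

healthy_volume = [0.1, 0.2, 0.15, 0.1]
amd_volume = [0.1, 0.1, 0.9, 0.2]      # drusen visible in one slice only
print(classify_bag(healthy_volume), classify_bag(amd_volume))
```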

  11. Large rotorcraft transmission technology development program

    NASA Technical Reports Server (NTRS)

    Mack, J. C.

    1983-01-01

    Testing of a U.S. Army XCH-62 HLH aft rotor transmission under NASA Contract NAS 3-22143 was successfully completed. This test establishes the feasibility of large, high-power rotorcraft transmissions and demonstrates the resolution of deficiencies identified during the HLH advanced technology programs and reported in USAAMRDL-TR-77-38. Over 100 hours of testing were conducted. At the 100% design power rating of 10,620 horsepower, the power transferred through a single spiral bevel gear mesh is more than twice that of current helicopter bevel gearing. In the original design of these gears, industry-wide design methods were employed, and failures were experienced that identified problem areas unique to gears of this size. To remedy this technology shortfall, a program was developed to predict gear stresses using finite element analysis for a complete and accurate representation of the gear tooth and supporting structure. To validate the finite element methodology, gear strain data from the existing U.S. Army HLH aft transmission were acquired, and existing data from smaller gears were made available.

  12. Testing of aircraft passenger seat cushion materials. Full scale, test description and results, volume 1

    NASA Technical Reports Server (NTRS)

    Schutter, K. J.; Gaume, J. G.; Duskin, F. E.

    1981-01-01

    Eight different seat cushion configurations were subjected to full-scale burn tests. Each cushion configuration was tested twice for a total of sixteen tests. Two different fire sources were used. They consisted of one liter of Jet A fuel for eight tests and a radiant energy source with propane flame for eight tests. Both fire sources were ignited by a propane flame. During each test, data were recorded for smoke density, cushion temperatures, radiant heat flux, animal response to combustion products, rate of weight loss of test specimens, cabin temperature, and for the type and content of gas within the cabin atmosphere. When compared to existing passenger aircraft seat cushions, the test specimens incorporating a fire barrier and those fabricated from advanced materials, using improved construction methods, exhibited significantly greater fire resistance.

  13. Physiologic Screening Test for Eating Disorders/Disordered Eating Among Female Collegiate Athletes.

    PubMed

    Black, David R.; Larkin, Laurie J.S.; Coster, Daniel C.; Leverenz, Larry J.; Abood, Doris A.

    2003-12-01

    OBJECTIVE: To develop and evaluate a physiologic screening test specifically designed for collegiate female athletes engaged in athletic competition or highly athletic performances in order to detect eating disorders/disordered eating. No such physiologically based test currently exists. METHODS: Subjects included 148 (84.5%) of 175 volunteer, National Collegiate Athletic Association Division I (n = 92), club (n = 15), and dance team (n = 41) athletes 18 to 25 years old who attended a large, Midwestern university. Participants completed 4 tests: 2 normed for the general population (Eating Disorders Inventory-2 and Bulimia Test-Revised); a new physiologic test, developed and pilot tested by the investigators, called the Physiologic Screening Test; and the Eating Disorder Exam 12.0D, a structured, validated, diagnostic interview used for criterion validity. RESULTS: The 18-item Physiologic Screening Test produced the highest sensitivity (87%) and specificity (78%) and was superior to the Eating Disorders Inventory-2 (sensitivity = 62%, specificity = 74%) and Bulimia Test-Revised (sensitivity = 27%, specificity = 99%). A substantial number (n = 51, 35%) of athletes were classified as eating disordered/disordered eating. CONCLUSIONS: The Physiologic Screening Test should be considered for screening athletes for eating disorders/disordered eating. The Physiologic Screening Test seems to be a viable alternative to existing tests because it is specifically designed for female athletes, it is brief (4 measurements and 14 items), and validity is enhanced and response bias is lessened because the purpose is less obvious, especially when included as part of a mandatory preparticipation examination.
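
    The reported sensitivity and specificity figures reduce to simple confusion-matrix ratios; a sketch with illustrative counts (not the study's raw tallies):

```python
# Sensitivity and specificity from confusion counts (illustrative numbers):
# sensitivity = TP / (TP + FN), specificity = TN / (TN + FP).

def sens_spec(tp, fn, tn, fp):
    return tp / (tp + fn), tn / (tn + fp)

sensitivity, specificity = sens_spec(tp=45, fn=6, tn=76, fp=21)
print(round(sensitivity, 2), round(specificity, 2))   # 0.88 0.78
```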

  14. A Self-Directed Method for Cell-Type Identification and Separation of Gene Expression Microarrays

    PubMed Central

    Zuckerman, Neta S.; Noam, Yair; Goldsmith, Andrea J.; Lee, Peter P.

    2013-01-01

    Gene expression analysis is generally performed on heterogeneous tissue samples consisting of multiple cell types. Current methods developed to separate heterogeneous gene expression rely on prior knowledge of the cell-type composition and/or signatures - these are not available in most public datasets. We present a novel method to identify the cell-type composition, signatures and proportions per sample without need for a priori information. The method was successfully tested on controlled and semi-controlled datasets and performed as accurately as current methods that do require additional information. As such, this method enables the analysis of cell-type specific gene expression using existing large pools of publicly available microarray datasets. PMID:23990767
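For context, when the cell-type signature matrix is already known, estimating per-sample proportions reduces to a least-squares fit; the method above removes exactly that requirement. A toy sketch with synthetic numbers (the signature matrix `S` and the proportions are invented for illustration):

```python
# Toy deconvolution with KNOWN signatures: observed expression x = S @ p,
# so the proportions p can be recovered by least squares. The paper's
# contribution is doing this blindly, without S.
import numpy as np

S = np.array([[10.0, 1.0],   # rows: genes, columns: cell-type signatures
              [1.0, 8.0],
              [5.0, 5.0]])
true_p = np.array([0.7, 0.3])        # true cell-type proportions
x = S @ true_p                       # observed mixed expression
p, *_ = np.linalg.lstsq(S, x, rcond=None)
print(np.round(p, 2))                # recovers [0.7 0.3]
```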

  15. A sampling and classification item selection approach with content balancing.

    PubMed

    Chen, Pei-Hua

    2015-03-01

    Existing automated test assembly methods typically employ constrained combinatorial optimization. Constructing forms sequentially based on an optimization approach usually results in unparallel forms and requires heuristic modifications. Methods based on a random search approach have the major advantage of producing parallel forms sequentially without further adjustment. This study incorporated a flexible content-balancing element into the statistical perspective item selection method of the cell-only method (Chen et al. in Educational and Psychological Measurement, 72(6), 933-953, 2012). The new method was compared with a sequential interitem distance weighted deviation model (IID WDM) (Swanson & Stocking in Applied Psychological Measurement, 17(2), 151-166, 1993), a simultaneous IID WDM, and a big-shadow-test mixed integer programming (BST MIP) method to construct multiple parallel forms based on matching a reference form item-by-item. The results showed that the cell-only method with content balancing and the sequential and simultaneous versions of IID WDM yielded results comparable to those obtained using the BST MIP method. The cell-only method with content balancing is computationally less intensive than the sequential and simultaneous versions of IID WDM.

  16. A review of consensus test methods for established medical imaging modalities and their implications for optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Pfefer, Joshua; Agrawal, Anant

    2012-03-01

    In recent years there has been increasing interest in development of consensus, tissue-phantom-based approaches for assessment of biophotonic imaging systems, with the primary goal of facilitating clinical translation of novel optical technologies. Well-characterized test methods based on tissue phantoms can provide useful tools for performance assessment, thus enabling standardization and device inter-comparison during preclinical development as well as quality assurance and re-calibration in the clinical setting. In this review, we study the role of phantom-based test methods as described in consensus documents such as international standards for established imaging modalities including X-ray CT, MRI and ultrasound. Specifically, we focus on three image quality characteristics - spatial resolution, spatial measurement accuracy and image uniformity - and summarize the terminology, metrics, phantom design/construction approaches and measurement/analysis procedures used to assess these characteristics. Phantom approaches described are those in routine clinical use and tend to have simplified morphology and biologically-relevant physical parameters. Finally, we discuss the potential for applying knowledge gained from existing consensus documents in the development of standardized, phantom-based test methods for optical coherence tomography.

  17. Searching for disease-susceptibility loci by testing for Hardy-Weinberg disequilibrium in a gene bank of affected individuals.

    PubMed

    Lee, Wen-Chung

    2003-09-01

    The future of genetic studies of complex human diseases will rely more and more on the epidemiologic association paradigm. The author proposes to scan the genome for disease-susceptibility gene(s) by testing for deviation from Hardy-Weinberg equilibrium in a gene bank of affected individuals. A power formula is presented, which is very accurate as revealed by Monte Carlo simulations. If the disease-susceptibility gene is recessive with an allele frequency of ≤0.5 or dominant with an allele frequency of ≥0.5, the number of subjects needed by the present method is smaller than that needed by using a case-parents design (using either the transmission/disequilibrium test or the 2-df likelihood ratio test). However, the method cannot detect genes with a multiplicative mode of inheritance, and the validity of the method relies on the assumption that the source population from which the cases arise is in Hardy-Weinberg equilibrium. Thus, it is prone to produce false positive and false negative results. Nevertheless, the method enables rapid gene hunting in an existing gene bank of affected individuals with no extra effort beyond simple calculations.
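A minimal sketch of the core computation (not the author's power formula): a 1-df chi-square test for deviation from Hardy-Weinberg equilibrium at a biallelic locus among affected individuals. The genotype counts are hypothetical:

```python
# 1-df chi-square statistic for deviation from Hardy-Weinberg equilibrium
# at one biallelic locus; genotype counts are hypothetical examples.
def hwe_chi2(n_AA, n_Aa, n_aa):
    n = n_AA + n_Aa + n_aa
    p = (2 * n_AA + n_Aa) / (2 * n)   # frequency of allele A
    q = 1 - p
    expected = (n * p * p, 2 * n * p * q, n * q * q)
    observed = (n_AA, n_Aa, n_aa)
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

stat = hwe_chi2(30, 40, 30)           # heterozygote deficit among cases
print(round(stat, 2), stat > 3.84)    # 3.84 = 5% critical value, 1 df
```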

  18. Development, Testing, and Validation of a Model-Based Tool to Predict Operator Responses in Unexpected Workload Transitions

    NASA Technical Reports Server (NTRS)

    Sebok, Angelia; Wickens, Christopher; Sargent, Robert

    2015-01-01

    One human factors challenge is predicting operator performance in novel situations. Approaches such as drawing on relevant previous experience, and developing computational models to predict operator performance in complex situations, offer potential methods to address this challenge. A few concerns with modeling operator performance are that models need to be realistic, and that they need to be tested empirically and validated. In addition, many existing human performance modeling tools are complex and require that an analyst gain significant experience to be able to develop models for meaningful data collection. This paper describes an effort to address these challenges by developing an easy-to-use model-based tool, using models that were developed from a review of existing human performance literature and targeted experimental studies, and performing an empirical validation of key model predictions.

  19. Distribution system model calibration with big data from AMI and PV inverters

    DOE PAGES

    Peppanen, Jouni; Reno, Matthew J.; Broderick, Robert J.; ...

    2016-03-03

    Efficient management and coordination of distributed energy resources with advanced automation schemes requires accurate distribution system modeling and monitoring. Big data from smart meters and photovoltaic (PV) micro-inverters can be leveraged to calibrate existing utility models. This paper presents computationally efficient distribution system parameter estimation algorithms to improve the accuracy of existing utility feeder radial secondary circuit model parameters. The method is demonstrated using a real utility feeder model with advanced metering infrastructure (AMI) and PV micro-inverters, along with alternative parameter estimation approaches that can be used to improve secondary circuit models when limited measurement data is available. Lastly, the parameter estimation accuracy is demonstrated for both a three-phase test circuit with typical secondary circuit topologies and single-phase secondary circuits in a real mixed-phase test system.

  1. Hybrid detection of lung nodules on CT scan images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, Lin; Tan, Yongqiang; Schwartz, Lawrence H.

    Purpose: The diversity of lung nodules poses difficulty for the current computer-aided diagnostic (CAD) schemes for lung nodule detection on computed tomography (CT) scan images, especially in large-scale CT screening studies. We proposed a novel CAD scheme based on a hybrid method to address the challenges of detection in diverse lung nodules. Methods: The hybrid method proposed in this paper integrates several existing and widely used algorithms in the field of nodule detection, including morphological operation, dot-enhancement based on Hessian matrix, fuzzy connectedness segmentation, local density maximum algorithm, geodesic distance map, and regression tree classification. All of the adopted algorithms were organized into tree structures with multi-nodes. Each node in the tree structure aimed to deal with one type of lung nodule. Results: The method has been evaluated on 294 CT scans from the Lung Image Database Consortium (LIDC) dataset. The CT scans were randomly divided into two independent subsets: a training set (196 scans) and a test set (98 scans). In total, the 294 CT scans contained 631 lung nodules, which were annotated by at least two radiologists participating in the LIDC project. The sensitivity and false positives per scan for the training set were 87% and 2.61. The sensitivity and false positives per scan for the testing set were 85.2% and 3.13. Conclusions: The proposed hybrid method yielded high performance on the evaluation dataset and exhibits advantages over existing CAD schemes. We believe that the present method would be useful for a wide variety of CT imaging protocols used in both routine diagnosis and screening studies.

  2. Evaluating attention in delirium: A comparison of bedside tests of attention.

    PubMed

    Adamis, Dimitrios; Meagher, David; Murray, Orla; O'Neill, Donagh; O'Mahony, Edmond; Mulligan, Owen; McCarthy, Geraldine

    2016-09-01

    Impaired attention is a core diagnostic feature of delirium. The present study examined the discriminating properties, for patients with delirium versus those with dementia and/or no neurocognitive disorder, of four objective tests of attention: digit span, vigilance "A" test, serial 7s subtraction and months of the year backwards, together with a global clinical subjective rating of attention. This was a prospective study of older patients admitted consecutively to a general hospital. Participants were assessed using the Confusion Assessment Method, Delirium Rating Scale-98 Revised and Montreal Cognitive Assessment scales, and months of the year backwards. Pre-existing dementia was diagnosed according to the Diagnostic and Statistical Manual of Mental Disorders fourth edition criteria. The sample consisted of 200 participants (mean age 81.1 ± 6.5 years; 50% women; pre-existing cognitive impairment in 126 [63%]). A total of 34 (17%) were identified with delirium (Confusion Assessment Method +). The five approaches to assessing attention had statistically significant correlations (P < 0.05). Discriminant analysis showed that clinical subjective rating of attention in conjunction with the months of the year backwards had the best discriminatory ability to identify Confusion Assessment Method-defined delirium, and to discriminate patients with delirium from those with dementia and/or normal cognition. Both of these approaches had high sensitivity, but modest specificity. Objective tests are useful for prediction of non-delirium, but lack specificity for a delirium diagnosis. Global attentional deficits were more indicative of delirium than deficits of specific domains of attention. Geriatr Gerontol Int 2016; 16: 1028-1035. © 2015 The Authors. Geriatrics & Gerontology International published by Wiley Publishing Asia Pty Ltd on behalf of Japanese Geriatrics Society.

  3. Order-restricted inference for means with missing values.

    PubMed

    Wang, Heng; Zhong, Ping-Shou

    2017-09-01

    Missing values appear very often in many applications, but the problem of missing values has not received much attention in testing order-restricted alternatives. Under the missing at random (MAR) assumption, we impute the missing values nonparametrically using kernel regression. For data with imputation, the classical likelihood ratio test designed for testing the order-restricted means is no longer applicable since the likelihood does not exist. This article proposes a novel method for constructing test statistics for assessing means with an increasing order or a decreasing order based on jackknife empirical likelihood (JEL) ratio. It is shown that the JEL ratio statistic evaluated under the null hypothesis converges to a chi-bar-square distribution, whose weights depend on missing probabilities and nonparametric imputation. Simulation study shows that the proposed test performs well under various missing scenarios and is robust for normally and nonnormally distributed data. The proposed method is applied to an Alzheimer's disease neuroimaging initiative data set for finding a biomarker for the diagnosis of the Alzheimer's disease. © 2017, The International Biometric Society.

  4. Plans for Aeroelastic Prediction Workshop

    NASA Technical Reports Server (NTRS)

    Heeg, Jennifer; Ballmann, Josef; Bhatia, Kumar; Blades, Eric; Boucke, Alexander; Chwalowski, Pawel; Dietz, Guido; Dowell, Earl; Florance, Jennifer P.; Hansen, Thorsten; hide

    2011-01-01

    This paper summarizes the plans for the first Aeroelastic Prediction Workshop. The workshop is designed to assess the state of the art of computational methods for predicting unsteady flow fields and aeroelastic response. The goals are to provide an impartial forum to evaluate the effectiveness of existing computer codes and modeling techniques, and to identify computational and experimental areas needing additional research and development. Three subject configurations have been chosen from existing wind tunnel data sets where there is pertinent experimental data available for comparison. For each case chosen, the wind tunnel testing was conducted using forced oscillation of the model at specified frequencies

  5. An ROC-type measure of diagnostic accuracy when the gold standard is continuous-scale.

    PubMed

    Obuchowski, Nancy A

    2006-02-15

    ROC curves and summary measures of accuracy derived from them, such as the area under the ROC curve, have become the standard for describing and comparing the accuracy of diagnostic tests. Methods for estimating ROC curves rely on the existence of a gold standard which dichotomizes patients into disease present or absent. There are, however, many examples of diagnostic tests whose gold standards are not binary-scale, but rather continuous-scale. Unnatural dichotomization of these gold standards leads to bias and inconsistency in estimates of diagnostic accuracy. In this paper, we propose a non-parametric estimator of diagnostic test accuracy which does not require dichotomization of the gold standard. This estimator has an interpretation analogous to the area under the ROC curve. We propose a confidence interval for test accuracy and a statistical test for comparing accuracies of tests from paired designs. We compare the performance (i.e. CI coverage, type I error rate, power) of the proposed methods with several alternatives. An example is presented where the accuracies of two quick blood tests for measuring serum iron concentrations are estimated and compared.
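For reference, the conventional nonparametric AUC estimator for a binary gold standard, which is the quantity the proposed measure generalizes to a continuous-scale gold standard, is the Mann-Whitney statistic: the probability that a randomly chosen diseased score exceeds a randomly chosen healthy score, with ties counted as one half. A sketch with made-up scores:

```python
# Conventional nonparametric (Mann-Whitney) AUC for a BINARY gold standard;
# the paper generalizes this idea to a continuous-scale gold standard.
# Scores below are invented.
def auc(diseased, healthy):
    """P(diseased score > healthy score), counting ties as 1/2."""
    pairs = [(d, h) for d in diseased for h in healthy]
    wins = sum(1.0 if d > h else 0.5 if d == h else 0.0 for d, h in pairs)
    return wins / len(pairs)

print(auc([3, 5, 7], [1, 2, 5]))  # 7.5 / 9 ≈ 0.833
```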

  6. Redshift drift constraints on holographic dark energy

    NASA Astrophysics Data System (ADS)

    He, Dong-Ze; Zhang, Jing-Fei; Zhang, Xin

    2017-03-01

    The Sandage-Loeb (SL) test is a promising method for probing dark energy because it measures the redshift drift in the spectra of the Lyman-α forest of distant quasars, covering the "redshift desert" of 2 ≲ z ≲ 5, which is not covered by existing cosmological observations. Therefore, it could provide an important supplement to current cosmological observations. In this paper, we explore the impact of the SL test on the precision of cosmological constraints for two typical holographic dark energy models, i.e., the original holographic dark energy (HDE) model and the Ricci holographic dark energy (RDE) model. To avoid data inconsistency, we use the best-fit models based on current combined observational data as the fiducial models to simulate 30 mock SL test data points. The results show that the SL test can effectively break the strong degeneracy between the present-day matter density Ωm0 and the Hubble constant H0 that exists in other cosmological observations. For the two dark energy models considered, a 30-year observation of the SL test not only improves the constraint precision of Ωm0 and h dramatically, but also enhances the constraint precision of the model parameters c and α significantly.

  7. Development of Tasks and Evaluation of a Prototype Forceps for NOTES

    PubMed Central

    Addis, Matthew; Aguirre, Milton; Haluck, Randy; Matthew, Abraham; Pauli, Eric; Gopal, Jegan

    2012-01-01

    Background and Objectives: Few standardized testing procedures exist for instruments intended for Natural Orifice Translumenal Endoscopic Surgery. These testing procedures are critical for evaluating surgical skills and surgical instruments to ensure sufficient quality. This need is widely recognized by endoscopic surgeons as a major hurdle for the advancement of Natural Orifice Translumenal Endoscopic Surgery. Methods: Beginning with tasks currently used to evaluate laparoscopic surgeons and instruments, new tasks were designed to evaluate endoscopic surgical forceps instruments. Results: Six tasks have been developed from existing tasks, adapted and modified for use with endoscopic instruments, or newly designed to test additional features of endoscopic forceps. The new tasks include the Fuzzy Ball Task, Cup Drop Task, Ring Around Task, Material Pull Task, Simulated Biopsy Task, and the Force Gauge Task. These tasks were then used to evaluate the performance of a new forceps instrument designed at Pennsylvania State University. Conclusions: The need for testing procedures for the advancement of Natural Orifice Translumenal Endoscopic Surgery has been addressed in this work. The developed tasks form a basis for not only testing new forceps instruments, but also for evaluating individual performance of surgical candidates with endoscopic forceps instruments. PMID:22906337

  8. Mobility of coated and uncoated TiO2 nanomaterials in soil columns--Applicability of the tests methods of OECD TG 312 and 106 for nanomaterials.

    PubMed

    Nickel, Carmen; Gabsch, Stephan; Hellack, Bryan; Nogowski, Andre; Babick, Frank; Stintz, Michael; Kuhlbusch, Thomas A J

    2015-07-01

    Nanomaterials are commonly used in everyday life products and during their life cycle they can be released into the environment. Soils and sediments are estimated to be significant sinks for those nanomaterials. To investigate and assess the behaviour of nanomaterials in soils and sediments, standardized test methods are needed. In this study the applicability of two existing international standardized test guidelines for the testing of nanomaterials, OECD TG 106 "Adsorption/Desorption using a Batch Equilibrium Method" and OECD TG 312 "Leaching in Soil Columns", was investigated. For the study, one coated and two uncoated TiO2 nanomaterials were used. The results indicate that OECD TG 106 is not applicable for nanomaterials. However, the test method according to OECD TG 312 was found to be applicable if nano-specific adaptations are applied. The mobility investigations of OECD TG 312 indicated a material-dependent mobility of the nanomaterials, which in some cases may lead to an accumulation in the upper soil layers. Whereas no significant transport was observed for the uncoated materials, significant transport was detected for the double-coated material (coated with dimethicone and aluminium oxide) and attributed to the coating. Copyright © 2015 Elsevier Ltd. All rights reserved.

  9. Revisiting the blind tests in crystal structure prediction: accurate energy ranking of molecular crystals.

    PubMed

    Asmadi, Aldi; Neumann, Marcus A; Kendrick, John; Girard, Pascale; Perrin, Marc-Antoine; Leusen, Frank J J

    2009-12-24

    In the 2007 blind test of crystal structure prediction hosted by the Cambridge Crystallographic Data Centre (CCDC), a hybrid DFT/MM method correctly ranked each of the four experimental structures as having the lowest lattice energy of all the crystal structures predicted for each molecule. The work presented here further validates this hybrid method by optimizing the crystal structures (experimental and submitted) of the first three CCDC blind tests held in 1999, 2001, and 2004. Except for the crystal structures of compound IX, all structures were reminimized and ranked according to their lattice energies. The hybrid method computes the lattice energy of a crystal structure as the sum of the DFT total energy and a van der Waals (dispersion) energy correction. Considering all four blind tests, the crystal structure with the lowest lattice energy corresponds to the experimentally observed structure for 12 out of 14 molecules. Moreover, good geometrical agreement is observed between the structures determined by the hybrid method and those measured experimentally. In comparison with the correct submissions made by the blind test participants, all hybrid optimized crystal structures (apart from compound II) have the smallest calculated root mean squared deviations from the experimentally observed structures. It is predicted that a new polymorph of compound V exists under pressure.

  10. A review of propeller noise prediction methodology: 1919-1994

    NASA Technical Reports Server (NTRS)

    Metzger, F. Bruce

    1995-01-01

    This report summarizes a review of the literature regarding propeller noise prediction methods. The review is divided into six sections: (1) early methods; (2) more recent methods based on earlier theory; (3) more recent methods based on the Acoustic Analogy; (4) more recent methods based on Computational Acoustics; (5) empirical methods; and (6) broadband methods. The report concludes that there are a large number of noise prediction procedures available which vary markedly in complexity. Deficiencies in accuracy of methods in many cases may be related, not to the methods themselves, but the accuracy and detail of the aerodynamic inputs used to calculate noise. The steps recommended in the report to provide accurate and easy to use prediction methods are: (1) identify reliable test data; (2) define and conduct test programs to fill gaps in the existing data base; (3) identify the most promising prediction methods; (4) evaluate promising prediction methods relative to the data base; (5) identify and correct the weaknesses in the prediction methods, including lack of user friendliness, and include features now available only in research codes; (6) confirm the accuracy of improved prediction methods to the data base; and (7) make the methods widely available and provide training in their use.

  11. Comparisons of NDT Methods to Inspect Cork and Cork filled Epoxy Bands

    NASA Technical Reports Server (NTRS)

    Lingbloom, Mike

    2007-01-01

    Sheet cork and cork filled epoxy provide external insulation for the Reusable Solid Rocket Motor (RSRM) on the Nation's Space Transportation System (STS). Interest in the reliability of the external insulation bonds has increased since the Columbia incident. A non-destructive test (NDT) method that will provide the best inspection for these bonds has been under evaluation. Electronic Shearography has been selected as the primary NDT method for inspection of these bond lines in the RSRM production flow. ATK Launch Systems Group has purchased an electronic shearography system that includes a vacuum chamber that is used for evaluation of test parts and custom vacuum windows for inspection of full-scale motors. Although the electronic shearography technology has been selected as the primary method for inspection of the external bonds, other technologies that exist continue to be investigated. The NASA/Marshall Space Flight Center (MSFC) NDT department has inspected several samples for comparison with electronic shearography with various inspections systems in their laboratory. The systems that were evaluated are X-ray backscatter, terahertz imaging, and microwave imaging. The samples tested have some programmed flaws as well as some flaws that occurred naturally during the sample making process. These samples provide sufficient flaw variation for the evaluation of the different inspection systems. This paper will describe and compare the basic functionality, test method and test results including dissection for each inspection technology.

  12. Methods for detecting long-term CNS dysfunction after prenatal exposure to neurotoxins.

    PubMed

    Vorhees, C V

    1997-11-01

    Current U.S. Environmental Protection Agency regulatory guidelines for developmental neurotoxicity emphasize functional categories such as motor activity, auditory startle, and learning and memory. A single test of some simple form of learning and memory is accepted to meet the latter category. The rationale for this emphasis has been that sensitive and reliable methods for assessing complex learning and memory are either not available or are too burdensome, and that insufficient data exist to endorse one approach over another. There has been little discussion of the fact that learning and memory is not a single identifiable functional category and no single test can assess all types of learning and memory. Three methods for assessing complex learning and memory are presented that assess two different types of learning and memory, are relatively efficient to conduct, and are sensitive to several known neurobehavioral teratogens. The tests are a 9-unit multiple-T swimming maze, and the Morris and Barnes mazes. The first of these assesses sequential learning, while the latter two assess spatial learning. A description of each test is provided, along with procedures for their use, and data exemplifying effects obtained using developmental exposure to phenytoin, methamphetamine, and MDMA. It is argued that multiple tests of learning and memory are required to ascertain cognitive deficits; something no single method can accomplish. Methods for acoustic startle are also presented.

  13. Test result communication in primary care: clinical and office staff perspectives.

    PubMed

    Litchfield, Ian J; Bentham, Louise M; Lilford, Richard J; Greenfield, Sheila M

    2014-10-01

    To understand how the results of laboratory tests are communicated to patients in primary care and perceptions on how the process may be improved. Qualitative study employing staff focus groups. Four UK primary care practices. Staff involved in the communication of test results. Five main themes emerged from the data: (i) the default method for communicating results differed between practices; (ii) clinical impact of results and patient characteristics such as anxiety level or health literacy influenced methods by which patients received their test result; (iii) which staff member had responsibility for the task was frequently unclear; (iv) barriers to communicating results existed, including there being no system or failsafe in place to determine whether results were returned to a practice or patient; (v) staff envisaged problems with a variety of test result communication methods discussed, including use of modern technologies, such as SMS messaging or online access. Communication of test results is a complex yet core primary care activity necessitating flexibility by both patients and staff. Dealing with the results from increasing numbers of tests is resource intensive and pressure on practice staff can be eased by greater utilization of electronic communication. Current systems appear vulnerable with no routine method of tracing delayed or missing results. Instead, practices only become aware of missing results following queries from patients. The creation of a test communication protocol for dissemination among patients and staff would help ensure both groups are aware of their roles and responsibilities. © The Author 2014. Published by Oxford University Press.

  14. Vadose Zone Transport Field Study: Detailed Test Plan for Simulated Leak Tests

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ward, Anderson L.; Gee, Glendon W.

    2000-06-23

    This report describes controlled transport experiments at well-instrumented field tests to be conducted during FY 2000 in support of DOE's Vadose Zone Transport Field Study (VZTFS). The VZTFS supports the Groundwater/Vadose Zone Integration Project Science and Technology Initiative. The field tests will improve understanding of field-scale transport and lead to the development or identification of efficient and cost-effective characterization methods. These methods will capture the extent of contaminant plumes using existing steel-cased boreholes. Specific objectives are to 1) identify mechanisms controlling transport processes in soils typical of the hydrogeologic conditions of Hanford's waste disposal sites; 2) reduce uncertainty in conceptual models; 3) develop a detailed and accurate data base of hydraulic and transport parameters for validation of three-dimensional numerical models; and 4) identify and evaluate advanced, cost-effective characterization methods with the potential to assess changing conditions in the vadose zone, particularly as surrogates of currently undetectable high-risk contaminants. Pacific Northwest National Laboratory (PNNL) manages the VZTFS for DOE.

  15. Multiple testing corrections in quantitative proteomics: A useful but blunt tool.

    PubMed

    Pascovici, Dana; Handler, David C L; Wu, Jemma X; Haynes, Paul A

    2016-09-01

    Multiple testing corrections are a useful tool for restricting the FDR, but can be blunt in the context of low power, as we demonstrate by a series of simple simulations. Unfortunately, low power can be common in proteomics experiments, driven by proteomics-specific issues such as small effects due to ratio compression, and few replicates due to high reagent cost, limited instrument time availability and other issues; in such situations, most multiple testing correction methods, if used with conventional thresholds, will fail to detect any true positives even when many exist. In this low-power, medium-scale situation, other methods such as effect size considerations or peptide-level calculations may be a more effective option, even if they do not offer the same theoretical guarantee of a low FDR. Thus, we aim to highlight in this article that proteomics presents some specific challenges to the standard multiple testing correction methods, which should be employed as a useful tool but not be regarded as a required rubber stamp. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
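To make the "blunt tool" point concrete, here is a sketch of the standard Benjamini-Hochberg step-up procedure; with conventional thresholds, borderline p-values from low-powered tests fall just outside the rejection set. The p-values are invented:

```python
# Benjamini-Hochberg step-up procedure (p-values below are invented).
# With alpha = 0.05, only the two strongest p-values are rejected; the
# borderline 0.039 and 0.041 narrowly miss their rank thresholds.
def bh_reject(pvals, alpha=0.05):
    """Return sorted indices rejected by the BH step-up procedure."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k = 0  # largest rank whose p-value clears its BH threshold
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= rank * alpha / m:
            k = rank
    return sorted(order[:k])

print(bh_reject([0.001, 0.008, 0.039, 0.041, 0.6]))  # [0, 1]
```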

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Warmack, Robert J. Bruce; Wolf, Dennis A.; Frank, Steven Shane

    Various apparatus and methods for smoke detection are disclosed. In one embodiment, a method of training a classifier for a smoke detector comprises inputting sensor data from a plurality of tests into a processor. The sensor data is processed to generate derived signal data corresponding to the test data for respective tests. The derived signal data is assigned into categories comprising at least one fire group and at least one non-fire group. Linear discriminant analysis (LDA) training is performed by the processor. The derived signal data and the assigned categories for the derived signal data are inputs to the LDA training. The output of the LDA training is stored in a computer readable medium, such as in a smoke detector that uses LDA to determine, based on the training, whether present conditions indicate the existence of a fire.
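
    The LDA training step above can be illustrated with a minimal two-class Fisher discriminant. This is a toy sketch, not the patent's implementation: the two-dimensional feature vectors (imagined as, say, smoke obscuration and CO level) and the midpoint threshold rule are invented for illustration.

```python
# Toy two-class Fisher LDA sketch (hypothetical features, not the patent's code).
def mean(vectors):
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def scatter(vectors, mu):
    # 2x2 within-class scatter matrix
    s = [[0.0, 0.0], [0.0, 0.0]]
    for v in vectors:
        d = [v[0] - mu[0], v[1] - mu[1]]
        for i in range(2):
            for j in range(2):
                s[i][j] += d[i] * d[j]
    return s

def train_lda(fire, non_fire):
    mu_f, mu_n = mean(fire), mean(non_fire)
    sf, sn = scatter(fire, mu_f), scatter(non_fire, mu_n)
    s = [[sf[i][j] + sn[i][j] for j in range(2)] for i in range(2)]
    det = s[0][0] * s[1][1] - s[0][1] * s[1][0]
    inv = [[s[1][1] / det, -s[0][1] / det], [-s[1][0] / det, s[0][0] / det]]
    diff = [mu_f[0] - mu_n[0], mu_f[1] - mu_n[1]]
    # discriminant direction w = Sw^-1 (mu_fire - mu_nonfire)
    w = [inv[0][0] * diff[0] + inv[0][1] * diff[1],
         inv[1][0] * diff[0] + inv[1][1] * diff[1]]
    proj = lambda v: w[0] * v[0] + w[1] * v[1]
    thresh = (proj(mu_f) + proj(mu_n)) / 2  # midpoint of projected class means
    return w, thresh

fire = [(0.9, 0.8), (1.0, 0.7), (0.8, 0.9)]        # e.g. obscuration, CO level
non_fire = [(0.1, 0.2), (0.2, 0.1), (0.15, 0.25)]  # nuisance sources
w, thresh = train_lda(fire, non_fire)
is_fire = lambda v: w[0] * v[0] + w[1] * v[1] > thresh
```

In a deployed detector, the stored weight vector and threshold would then be applied to live derived signals.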

  17. Testing of aircraft passenger seat cushion material, full scale. Data, volume 2

    NASA Technical Reports Server (NTRS)

    Schutter, K. J.; Gaume, J. G.; Duskin, F. E.

    1980-01-01

    Burn characteristics of presently used and proposed seat cushion materials and types of constructions were determined. Eight different seat cushion configurations were subjected to full scale burn tests. Each cushion configuration was tested twice for a total of 16 tests. Two different fire sources were used: Jet A-fuel for eight tests, and a radiant energy source with propane flame for eight tests. Data were recorded for smoke density, cushion temperatures, radiant heat flux, animal response to combustion products, rate of weight loss of test specimens, cabin temperature, and type and content of gas within the cabin. When compared to existing seat cushions, the test specimens incorporating a fire barrier and those fabricated from advanced materials, using improved construction methods, exhibited significantly greater fire resistance. Flammability comparison tests were conducted on one fire-blocking configuration and one polyimide configuration.

  18. Modeling Input Errors to Improve Uncertainty Estimates for Sediment Transport Model Predictions

    NASA Astrophysics Data System (ADS)

    Jung, J. Y.; Niemann, J. D.; Greimann, B. P.

    2016-12-01

    Bayesian methods using Markov chain Monte Carlo algorithms have recently been applied to sediment transport models to assess the uncertainty in the model predictions due to the parameter values. Unfortunately, the existing approaches can only attribute overall uncertainty to the parameters. This limitation is critical because no model can produce accurate forecasts if forced with inaccurate input data, even if the model is well founded in physical theory. In this research, an existing Bayesian method is modified to consider the potential errors in input data during the uncertainty evaluation process. The input error is modeled using Gaussian distributions, and the means and standard deviations are treated as uncertain parameters. The proposed approach is tested by coupling it to the Sedimentation and River Hydraulics - One Dimension (SRH-1D) model and simulating a 23-km reach of the Tachia River in Taiwan. The Wu equation in SRH-1D is used for computing the transport capacity for a bed material load of non-cohesive material. Three types of input data are considered uncertain: (1) the input flowrate at the upstream boundary, (2) the water surface elevation at the downstream boundary, and (3) the water surface elevation at a hydraulic structure in the middle of the reach. The benefits of modeling the input errors in the uncertainty analysis are evaluated by comparing the accuracy of the most likely forecast and the coverage of the observed data by the credible intervals to those of the existing method. The results indicate that the internal boundary condition has the largest uncertainty among those considered. Overall, the uncertainty estimates from the new method are notably different from those of the existing method for both the calibration and forecast periods.
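
    The idea of treating input errors as uncertain parameters can be sketched with a toy model far simpler than SRH-1D: a linear "transport" model whose input flowrates carry a constant gauge bias, with the bias mean and standard deviation sampled jointly with the model parameter by random-walk Metropolis. All data, priors, and step sizes here are invented for illustration.

```python
# Toy sketch of input-error-aware Bayesian calibration (not SRH-1D).
import math, random

random.seed(0)
true_q = [10.0, 12.0, 11.0, 13.0]          # "true" inflows, unknown in practice
measured_q = [q + 0.8 for q in true_q]     # gauge readings with a constant bias
obs = [0.5 * q for q in true_q]            # observed loads from a toy linear model

def log_post(theta, bias_mu, bias_sd):
    if bias_sd <= 0:
        return -math.inf
    lp = -0.5 * (bias_mu / 2.0) ** 2 - 0.5 * bias_sd ** 2   # weak priors
    for qm, y in zip(measured_q, obs):
        pred = theta * (qm - bias_mu)            # correct the input, run the "model"
        var = 0.1 ** 2 + (theta * bias_sd) ** 2  # input sd propagates linearly
        lp += -0.5 * ((y - pred) ** 2 / var + math.log(var))
    return lp

state = (0.4, 0.0, 0.5)                    # (theta, bias_mu, bias_sd)
cur = log_post(*state)
samples = []
for _ in range(20000):
    prop = tuple(s + random.gauss(0, 0.05) for s in state)
    lp = log_post(*prop)
    if math.log(random.random()) < lp - cur:   # Metropolis accept/reject
        state, cur = prop, lp
    samples.append(state)
post_bias = sum(s[1] for s in samples[5000:]) / 15000
```

The posterior for the bias mean concentrates near the value actually injected into the synthetic gauge readings, which is the mechanism by which input uncertainty stops being misattributed to the model parameter.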

  19. Findings from the 2012 West Virginia Online Writing Scoring Comparability Study

    ERIC Educational Resources Information Center

    Hixson, Nate; Rhudy, Vaughn

    2013-01-01

    Student responses to the West Virginia Educational Standards Test (WESTEST) 2 Online Writing Assessment are scored by a computer-scoring engine. The scoring method is not widely understood among educators, and there exists a misperception that it is not comparable to hand scoring. To address these issues, the West Virginia Department of Education…

  20. Exploring the Relationship between Academic Dishonesty and Moral Development in Law School Students

    ERIC Educational Resources Information Center

    Edmondson, Macey Lynd

    2013-01-01

    This mixed methods study explored whether a relationship existed between moral development and dishonest academic behaviors in law students. The quantitative portion of the study utilized a survey adapted from James Rest's Defining Issues Test and Donald McCabe's Academic Integrity Survey. Law students were solicited by email from two public…

  1. A Parent-Only Group Intervention for Children with Anxiety Disorders: Pilot Study

    ERIC Educational Resources Information Center

    Thienemann, Margo; Moore, Phoebe; Tompkins, Kim

    2006-01-01

    Objective: Working to optimize treatment outcome and use resources efficiently, investigators conducted the first test of an existing parent-only group cognitive-behavioral therapy protocol to treat 24 children 7 to 16 years old with primary anxiety disorder diagnoses. Method: Over the course of 7 months, the authors evaluated a manual-based…

  2. Spontaneously Combustible Solids -- A Literature Search

    DTIC Science & Technology

    1975-05-01

    Key words: pyrophoric materials; hazardous materials. Existing information on spontaneously combustible solids, including pyrophoric (air-hazardous) and water-reactive materials, is compiled. All available hazard classification systems and test methods relating to spontaneous combustion have…

  3. Complementary Relationships between Traditional Media and Health Apps among American College Students

    ERIC Educational Resources Information Center

    Cho, Jaehee; Lee, H. Erin; Quinlan, Margaret

    2015-01-01

    Objective: This study explored the potential relationships between existing media and health apps for health information among college students. Participants: This study collected and analyzed a total of 408 surveys from students of 7 universities across the United States. Methods: In order to explore the research questions and test the…

  4. Enhancing the Effectiveness of Juvenile Drug Courts by Integrating Evidence-Based Practices

    ERIC Educational Resources Information Center

    Henggeler, Scott W.; McCart, Michael R.; Cunningham, Phillippe B.; Chapman, Jason E.

    2012-01-01

    Objective: The primary purpose of this study was to test a relatively efficient strategy for enhancing the capacity of juvenile drug courts (JDC) to reduce youth substance use and criminal behavior by incorporating components of evidence-based treatments into their existing services. Method: Six JDCs were randomized to a condition in which…

  5. Testing for Granger Causality in the Frequency Domain: A Phase Resampling Method.

    PubMed

    Liu, Siwei; Molenaar, Peter

    2016-01-01

    This article introduces phase resampling, an existing but rarely used surrogate data method for making statistical inferences of Granger causality in frequency domain time series analysis. Granger causality testing is essential for establishing causal relations among variables in multivariate dynamic processes. However, testing for Granger causality in the frequency domain is challenging due to the nonlinear relation between frequency domain measures (e.g., partial directed coherence, generalized partial directed coherence) and time domain data. Through a simulation study, we demonstrate that phase resampling is a general and robust method for making statistical inferences even with short time series. With Gaussian data, phase resampling yields satisfactory type I and type II error rates in all but one condition we examine: when a small effect size is combined with an insufficient number of data points. Violations of normality lead to slightly higher error rates but are mostly within acceptable ranges. We illustrate the utility of phase resampling with two empirical examples involving multivariate electroencephalography (EEG) and skin conductance data.
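
    The core mechanic of phase resampling can be sketched in a few lines: keep each series' Fourier amplitude spectrum (and hence its autocovariance structure) while randomizing the phases, which destroys any lead-lag dependence a Granger test would detect. A naive O(n^2) DFT keeps the sketch self-contained; this illustrates the general surrogate technique, not the authors' code.

```python
# Phase-randomized surrogate: same amplitude spectrum, scrambled phases.
import cmath, math, random

def dft(x):
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
            for k in range(n)]

def idft(X):
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * math.pi * k * t / n) for k in range(n)).real / n
            for t in range(n)]

def phase_surrogate(x, rng):
    n = len(x)
    X = dft(x)
    Y = [X[0]]                                    # keep the DC component
    for k in range(1, (n + 1) // 2):
        phi = rng.uniform(0, 2 * math.pi)         # random phase, same amplitude
        Y.append(abs(X[k]) * cmath.exp(1j * phi))
    if n % 2 == 0:
        Y.append(X[n // 2])                       # Nyquist bin stays as-is
    Y += [Y[n - k].conjugate() for k in range(n // 2 + 1, n)]  # Hermitian mirror
    return idft(Y)

rng = random.Random(1)
x = [math.sin(0.3 * t) + 0.1 * t for t in range(64)]
surrogate = phase_surrogate(x, rng)
```

A null distribution for the causality statistic is then built by recomputing it on many such surrogates and comparing the observed value against that distribution.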

  6. Noise radiation directivity from a wind-tunnel inlet with inlet vanes and duct wall linings

    NASA Technical Reports Server (NTRS)

    Soderman, P. T.; Phillips, J. D.

    1986-01-01

    The acoustic radiation patterns from a 1/15th scale model of the Ames 80- by 120-Ft Wind Tunnel test section and inlet have been measured with a noise source installed in the test section. Data were acquired without airflow in the duct. Sound-absorbent inlet vanes oriented parallel to each other, or splayed with a variable incidence relative to the duct long axis, were evaluated along with duct wall linings. Results show that splayed vanes tend to spread the sound to greater angles than those measured with the open inlet. Parallel vanes narrowed the high-frequency radiation pattern. Duct wall linings had a strong effect on acoustic directivity by attenuating wall reflections. Vane insertion loss was measured. Directivity results are compared with existing data from square ducts. Two prediction methods for duct radiation directivity are described: one is an empirical method based on the test data, and the other is an analytical method based on ray acoustics.

  7. Current status and future perspectives on molecular and serological methods in diagnostic mycology.

    PubMed

    Lau, Anna; Chen, Sharon; Sleiman, Sue; Sorrell, Tania

    2009-11-01

    Invasive fungal infections are an important cause of infectious morbidity. Nonculture-based methods are increasingly used for rapid, accurate diagnosis to improve patient outcomes. New and existing DNA amplification platforms have high sensitivity and specificity for direct detection and identification of fungi in clinical specimens. Since laboratories are increasingly reliant on DNA sequencing for fungal identification, measures to improve sequence interpretation should support validation of reference isolates and quality control in public gene repositories. Novel technologies (e.g., isothermal and PNA FISH methods), platforms enabling high-throughput analyses (e.g., DNA microarrays and Luminex xMAP) and/or commercial PCR assays warrant further evaluation for routine diagnostic use. Notwithstanding the advantages of molecular tests, serological assays remain clinically useful for patient management. The serum Aspergillus galactomannan test has been incorporated into diagnostic algorithms of invasive aspergillosis. Both the galactomannan and the serum beta-D-glucan test have value for diagnosing infection and monitoring therapeutic response.

  8. Study of diffusion bond development in 6061 aluminum and its relationship to future high density fuels fabrication.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prokofiev, I.; Wiencek, T.; McGann, D.

    1997-10-07

    Powder metallurgy dispersions of uranium alloys and silicides in an aluminum matrix have been developed by the RERTR program as a new generation of proliferation-resistant fuels. Testing is done with miniplate-type fuel plates to simulate standard fuel with cladding and matrix in plate-type configurations. In order to seal the dispersion fuel plates, a diffusion bond must exist between the aluminum coverplates surrounding the fuel meat. Four different variations in the standard method for roll-bonding 6061 aluminum were studied. They included mechanical cleaning, addition of a getter material, modifications to the standard chemical etching, and welding methods. Aluminum test pieces were subjected to a bend test after each rolling pass. Results, based on 400 samples, indicate that at least a 70% reduction in thickness is required to produce a diffusion bond using the standard rollbonding method versus a 60% reduction using the Type II method in which the assembly was welded 100% and contained open 9mm holes at frame corners.

  9. Multivariate Welch t-test on distances

    PubMed Central

    2016-01-01

    Motivation: Permutational non-Euclidean analysis of variance, PERMANOVA, is routinely used in exploratory analysis of multivariate datasets to draw conclusions about the significance of patterns visualized through dimension reduction. This method recognizes that the pairwise distance matrix between observations is sufficient to compute within and between group sums of squares necessary to form the (pseudo) F statistic. Moreover, not only Euclidean, but arbitrary distances can be used. This method, however, suffers from loss of power and type I error inflation in the presence of heteroscedasticity and sample size imbalances. Results: We develop a solution in the form of a distance-based Welch t-test, TW2, for two sample potentially unbalanced and heteroscedastic data. We demonstrate empirically the desirable type I error and power characteristics of the new test. We compare the performance of PERMANOVA and TW2 in reanalysis of two existing microbiome datasets, where the methodology has originated. Availability and Implementation: The source code for methods and analysis of this article is available at https://github.com/alekseyenko/Tw2. Further guidance on application of these methods can be obtained from the author. Contact: alekseye@musc.edu PMID:27515741

  10. Multivariate Welch t-test on distances.

    PubMed

    Alekseyenko, Alexander V

    2016-12-01

    Permutational non-Euclidean analysis of variance, PERMANOVA, is routinely used in exploratory analysis of multivariate datasets to draw conclusions about the significance of patterns visualized through dimension reduction. This method recognizes that the pairwise distance matrix between observations is sufficient to compute within and between group sums of squares necessary to form the (pseudo) F statistic. Moreover, not only Euclidean, but arbitrary distances can be used. This method, however, suffers from loss of power and type I error inflation in the presence of heteroscedasticity and sample size imbalances. We develop a solution in the form of a distance-based Welch t-test, TW2, for two sample potentially unbalanced and heteroscedastic data. We demonstrate empirically the desirable type I error and power characteristics of the new test. We compare the performance of PERMANOVA and TW2 in reanalysis of two existing microbiome datasets, where the methodology has originated. The source code for methods and analysis of this article is available at https://github.com/alekseyenko/Tw2. Further guidance on application of these methods can be obtained from the author. Contact: alekseye@musc.edu. © The Author 2016. Published by Oxford University Press.
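
    The identity both records rely on (that within- and between-group sums of squares, and hence the pseudo F statistic, can be computed from the pairwise distance matrix alone) can be sketched directly. The data are invented, and the statistic shown is PERMANOVA's pseudo-F, not the proposed Welch-type correction.

```python
# Pseudo-F from a distance matrix only: SS to a group centroid equals
# (1/n) * sum of squared within-group pairwise distances.
def ss_from_dist(d, idx):
    n = len(idx)
    return sum(d[i][j] ** 2 for i in idx for j in idx if i < j) / n

def pseudo_f(d, groups):
    all_idx = [i for g in groups for i in g]
    n, g = len(all_idx), len(groups)
    ss_total = ss_from_dist(d, all_idx)
    ss_within = sum(ss_from_dist(d, grp) for grp in groups)
    return ((ss_total - ss_within) / (g - 1)) / (ss_within / (n - g))

pts = [0.0, 0.1, -0.1, 2.0, 2.1, 1.9]           # two well-separated groups on a line
d = [[abs(a - b) for b in pts] for a in pts]    # any distance matrix would do
f = pseudo_f(d, [[0, 1, 2], [3, 4, 5]])
```

The TW2 statistic modifies this construction so that, per the abstract, unequal group dispersions and sizes no longer inflate the type I error.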

  11. Bench Scale Saltcake Dissolution Test Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    BECHTOLD, D.B.; PACQUET, E.A.

    A potential scenario for retrieving saltcake from single shell tanks is the "Rainbird® sprinkler" method. Water is distributed evenly across the surface of the saltcake and allowed to percolate by gravity through the waste. The salt dissolves in the water, forming a saturated solution. The saturated liquid is removed by a saltwell pump situated near the bottom of the tank. By this method, there is never a large inventory of liquid in the tank that could pose a threat of leakage. There are many variables or factors that can influence the hydrodynamics of this retrieval process. They include saltcake porosity; saltwell pumping rate; salt dissolution chemistry; factors that could promote flow channeling (e.g. tank walls, dry wells, inclusions or discontinuities in the saltcake); method of water distribution; and plug formation due to crystal formations or accumulation of insoluble solids. A brief literature search indicates that very little experimental data exist on these aspects of saltcake dissolution (Wiersma 1996, 1997). The tests reported here were planned (Herting, 2000) to provide preliminary data and information for planning future, scaled-up tests of the sprinkler method.

  12. Towards a framework for testing general relativity with extreme-mass-ratio-inspiral observations

    NASA Astrophysics Data System (ADS)

    Chua, A. J. K.; Hee, S.; Handley, W. J.; Higson, E.; Moore, C. J.; Gair, J. R.; Hobson, M. P.; Lasenby, A. N.

    2018-07-01

    Extreme-mass-ratio-inspiral observations from future space-based gravitational-wave detectors such as LISA will enable strong-field tests of general relativity with unprecedented precision, but at prohibitive computational cost if existing statistical techniques are used. In one such test that is currently employed for LIGO black hole binary mergers, generic deviations from relativity are represented by N deformation parameters in a generalized waveform model; the Bayesian evidence for each of its 2N combinatorial submodels is then combined into a posterior odds ratio for modified gravity over relativity in a null-hypothesis test. We adapt and apply this test to a generalized model for extreme-mass-ratio inspirals constructed on deformed black hole spacetimes, and focus our investigation on how computational efficiency can be increased through an evidence-free method of model selection. This method is akin to the algorithm known as product-space Markov chain Monte Carlo, but uses nested sampling and improved error estimates from a rethreading technique. We perform benchmarking and robustness checks for the method, and find order-of-magnitude computational gains over regular nested sampling in the case of synthetic data generated from the null model.

  13. Towards a framework for testing general relativity with extreme-mass-ratio-inspiral observations

    NASA Astrophysics Data System (ADS)

    Chua, A. J. K.; Hee, S.; Handley, W. J.; Higson, E.; Moore, C. J.; Gair, J. R.; Hobson, M. P.; Lasenby, A. N.

    2018-04-01

    Extreme-mass-ratio-inspiral observations from future space-based gravitational-wave detectors such as LISA will enable strong-field tests of general relativity with unprecedented precision, but at prohibitive computational cost if existing statistical techniques are used. In one such test that is currently employed for LIGO black-hole binary mergers, generic deviations from relativity are represented by N deformation parameters in a generalised waveform model; the Bayesian evidence for each of its 2N combinatorial submodels is then combined into a posterior odds ratio for modified gravity over relativity in a null-hypothesis test. We adapt and apply this test to a generalised model for extreme-mass-ratio inspirals constructed on deformed black-hole spacetimes, and focus our investigation on how computational efficiency can be increased through an evidence-free method of model selection. This method is akin to the algorithm known as product-space Markov chain Monte Carlo, but uses nested sampling and improved error estimates from a rethreading technique. We perform benchmarking and robustness checks for the method, and find order-of-magnitude computational gains over regular nested sampling in the case of synthetic data generated from the null model.
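
    The evidence-combination step common to both versions of this record can be illustrated with a toy calculation. The log-evidence values are invented stand-ins for nested-sampling output, and equal prior weighting across the 2^N - 1 alternative submodels is an assumption of this sketch, not a detail taken from the paper.

```python
# Posterior log-odds for "some deformation" over GR from submodel evidences.
import itertools, math

def log_sum_exp(vals):
    m = max(vals)
    return m + math.log(sum(math.exp(v - m) for v in vals))

def posterior_log_odds(log_z, log_z_gr):
    # log_z: log-evidence for each non-empty subset of deformation parameters;
    # equal prior odds are assumed over the alternative submodels.
    alt = log_sum_exp(list(log_z.values())) - math.log(len(log_z))
    return alt - log_z_gr

params = ("beta1", "beta2")                      # hypothetical deformation parameters
subsets = [s for r in range(1, len(params) + 1)
           for s in itertools.combinations(params, r)]
log_z = {s: -100.0 for s in subsets}             # stand-ins for nested-sampling output
log_z[("beta1",)] = -95.0                        # one deformed submodel fits better
log_odds = posterior_log_odds(log_z, log_z_gr=-99.0)  # positive favors modified gravity
```

The paper's point is precisely that computing each of these evidences by nested sampling is expensive, motivating the evidence-free product-space alternative.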

  14. Enriching consumer health vocabulary through mining a social Q&A site: A similarity-based approach.

    PubMed

    He, Zhe; Chen, Zhiwei; Oh, Sanghee; Hou, Jinghui; Bian, Jiang

    2017-05-01

    The widely known vocabulary gap between health consumers and healthcare professionals hinders information seeking and health dialogue of consumers on end-user health applications. The Open Access and Collaborative Consumer Health Vocabulary (OAC CHV), which contains health-related terms used by lay consumers, has been created to bridge such a gap. Specifically, the OAC CHV facilitates consumers' health information retrieval by enabling consumer-facing health applications to translate between professional language and consumer-friendly language. To keep up with the constantly evolving medical knowledge and language use, new terms need to be identified and added to the OAC CHV. User-generated content on social media, including social question and answer (social Q&A) sites, affords us an enormous opportunity for mining consumer health terms. Existing methods of identifying new consumer terms from text typically use ad-hoc lexical syntactic patterns and human review. Our study extends an existing method by extracting n-grams from a social Q&A textual corpus and representing them with a rich set of contextual and syntactic features. Using K-means clustering, our method, simiTerm, was able to identify terms that are both contextually and syntactically similar to the existing OAC CHV terms. We tested our method on social Q&A corpora on two disease domains: diabetes and cancer. Our method outperformed three baseline ranking methods. A post-hoc qualitative evaluation by human experts further validated that our method can effectively identify meaningful new consumer terms on social Q&A. Copyright © 2017 Elsevier Inc. All rights reserved.
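
    A toy sketch of the similarity idea (not simiTerm itself): candidate n-grams are represented by feature vectors and ranked by their closest cosine similarity to known OAC CHV terms. The terms, the three-dimensional feature vectors, and the max-similarity ranking rule are all invented for illustration; the real method uses a rich contextual/syntactic feature set and K-means clustering.

```python
# Rank candidate n-grams by similarity to known vocabulary terms.
import math

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

# Invented feature vectors standing in for contextual/syntactic features.
known = {"blood sugar": [0.9, 0.1, 0.2], "insulin shot": [0.8, 0.2, 0.3]}
candidates = {"sugar level": [0.85, 0.15, 0.25], "ticket price": [0.1, 0.9, 0.8]}

def rank_candidates(candidates, known):
    scored = {c: max(cosine(v, kv) for kv in known.values())
              for c, v in candidates.items()}
    return sorted(scored, key=scored.get, reverse=True)

ranking = rank_candidates(candidates, known)
```

Top-ranked candidates would then go to human review before being added to the vocabulary.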

  15. TargetMiner: microRNA target prediction with systematic identification of tissue-specific negative examples.

    PubMed

    Bandyopadhyay, Sanghamitra; Mitra, Ramkrishna

    2009-10-15

    Prediction of microRNA (miRNA) target mRNAs using machine learning approaches is an important area of research. However, most of the methods suffer from either high false positive or false negative rates. One reason for this is the marked deficiency of negative examples or miRNA non-target pairs. Systematic identification of non-target mRNAs is still not addressed properly, and therefore, current machine learning approaches are compelled to rely on artificially generated negative examples for training. In this article, we have identified approximately 300 tissue-specific negative examples using a novel approach that involves expression profiling of both miRNAs and mRNAs, miRNA-mRNA structural interactions and seed-site conservation. The newly generated negative examples are validated with the pSILAC dataset, which confirms that the identified non-targets are indeed non-targets. These high-throughput tissue-specific negative examples and a set of experimentally verified positive examples are then used to build a system called TargetMiner, a support vector machine (SVM)-based classifier. In addition to assessing the prediction accuracy on cross-validation experiments, TargetMiner has been validated with a completely independent experimental test dataset. Our method outperforms 10 existing target prediction algorithms and provides a good balance between sensitivity and specificity that is not reflected in the existing methods. We achieve a significantly higher sensitivity and specificity of 69% and 67.8% based on a pool of 90 features, and 76.5% and 66.1% using a set of 30 selected features, on the completely independent test dataset. In order to establish the effectiveness of the systematically generated negative examples, the SVM is trained using a different set of negative data generated using the method in Yousef et al.
    A significantly higher false positive rate (70.6%) is observed when tested on the independent set, while all other factors are kept the same. Again, when an existing method (NBmiRTar) is executed with our proposed negative data, we observe an improvement in its performance. These results clearly establish the effectiveness of the proposed approach of selecting the negative examples systematically. TargetMiner is now available as an online tool at www.isical.ac.in/~bioinfo_miu

  16. Paroxysmal atrial fibrillation prediction method with shorter HRV sequences.

    PubMed

    Boon, K H; Khalil-Hani, M; Malarvili, M B; Sia, C W

    2016-10-01

    This paper proposes a method that predicts the onset of paroxysmal atrial fibrillation (PAF), using heart rate variability (HRV) segments that are shorter than those applied in existing methods, while maintaining good prediction accuracy. PAF is a common cardiac arrhythmia that increases the health risk of a patient, and the development of an accurate predictor of the onset of PAF is clinically important because it increases the possibility of electrically stabilizing the heart and preventing the onset of atrial arrhythmias with different pacing techniques. We investigate the effect of HRV features extracted from different lengths of HRV segments prior to PAF onset with the proposed PAF prediction method. The pre-processing stage of the predictor includes QRS detection, HRV quantification and ectopic beat correction. Time-domain, frequency-domain, non-linear and bispectrum features are then extracted from the quantified HRV. In the feature selection, the HRV feature set and classifier parameters are optimized simultaneously using an optimization procedure based on a genetic algorithm (GA). Both the full feature set and a statistically significant feature subset are optimized by the GA. For the statistically significant feature subset, the Mann-Whitney U test is used to filter out features that are not statistically significant at the 20% level. The final stage of our predictor is the classifier, which is based on a support vector machine (SVM). A 10-fold cross-validation is applied in performance evaluation, and the proposed method achieves 79.3% prediction accuracy using 15-minute HRV segments. This accuracy is comparable to that achieved by existing methods that use 30-minute HRV segments, most of which achieve accuracy of around 80%. More importantly, our method significantly outperforms those that applied segments shorter than 30 minutes. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
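
    The 20%-level statistical screen described in this abstract can be sketched with a Mann-Whitney U test using the normal approximation (no tie correction); the feature values for the PAF and non-PAF groups below are invented.

```python
# Mann-Whitney U screen for one candidate HRV feature (illustrative data).
import math

def mann_whitney_u(x, y):
    # U counts pairs where x beats y (ties count half)
    u = sum(1.0 if a > b else 0.5 if a == b else 0.0 for a in x for b in y)
    n1, n2 = len(x), len(y)
    mu = n1 * n2 / 2.0
    sd = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)   # normal approx, no tie correction
    z = (u - mu) / sd
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))  # two-sided
    return u, p

paf =     [0.9, 1.1, 1.3, 1.0, 1.2, 1.4, 1.1, 1.3]   # feature in pre-PAF segments
non_paf = [0.4, 0.5, 0.3, 0.6, 0.5, 0.4, 0.6, 0.5]   # feature in normal segments
u, p = mann_whitney_u(paf, non_paf)
keep_feature = p < 0.20          # the 20% significance gate from the abstract
```

Features passing the gate would then enter the GA/SVM optimization stage.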

  17. Assessing Mediational Models: Testing and Interval Estimation for Indirect Effects.

    PubMed

    Biesanz, Jeremy C; Falk, Carl F; Savalei, Victoria

    2010-08-06

    Theoretical models specifying indirect or mediated effects are common in the social sciences. An indirect effect exists when an independent variable's influence on the dependent variable is mediated through an intervening variable. Classic approaches to assessing such mediational hypotheses (Baron & Kenny, 1986; Sobel, 1982) have in recent years been supplemented by computationally intensive methods such as bootstrapping, the distribution of the product method, and hierarchical Bayesian Markov chain Monte Carlo (MCMC) methods. These different approaches for assessing mediation are illustrated using data from Dunn, Biesanz, Human, and Finn (2007). However, little is known about how these methods perform relative to each other, particularly in more challenging situations, such as with data that are incomplete and/or nonnormal. This article presents an extensive Monte Carlo simulation evaluating a host of approaches for assessing mediation. We examine Type I error rates, power, and coverage. We study normal and nonnormal data as well as complete and incomplete data. In addition, we adapt a method, recently proposed in the statistical literature, that does not rely on confidence intervals (CIs) to test the null hypothesis of no indirect effect. The results suggest that the new inferential method, the partial posterior p value, slightly outperforms existing ones in terms of maintaining Type I error rates while maximizing power, especially with incomplete data. Among confidence interval approaches, the bias-corrected accelerated (BCa) bootstrapping approach often has inflated Type I error rates and inconsistent coverage and is not recommended; in contrast, the bootstrapped percentile confidence interval and the hierarchical Bayesian MCMC method perform best overall, maintaining Type I error rates, exhibiting reasonable power, and producing stable and accurate coverage rates.
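
    Of the approaches compared in this abstract, the bootstrapped percentile confidence interval is simple enough to sketch for a single-mediator model. The data are synthetic (a = b = 0.5, complete and normal), and the pure-Python OLS helpers are illustrative rather than the authors' implementation.

```python
# Percentile-bootstrap CI for the indirect effect a*b (synthetic data).
import random

def slope(x, y):
    mx, my = sum(x) / len(x), sum(y) / len(y)
    return sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
           sum((xi - mx) ** 2 for xi in x)

def b_path(x, m, y):
    # coefficient of M in Y ~ M + X, via 2x2 normal equations (Cramer's rule)
    n = len(x)
    mx, mm, my = sum(x) / n, sum(m) / n, sum(y) / n
    sxx = sum((a - mx) ** 2 for a in x)
    smm = sum((a - mm) ** 2 for a in m)
    sxm = sum((a - mx) * (b - mm) for a, b in zip(x, m))
    smy = sum((a - mm) * (b - my) for a, b in zip(m, y))
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return (smy * sxx - sxy * sxm) / (smm * sxx - sxm ** 2)

def ab(x, m, y):
    return slope(x, m) * b_path(x, m, y)   # indirect effect a*b

rng = random.Random(42)
x = [rng.gauss(0, 1) for _ in range(100)]
m = [0.5 * xi + rng.gauss(0, 1) for xi in x]     # a = 0.5
y = [0.5 * mi + rng.gauss(0, 1) for mi in m]     # b = 0.5, no direct effect
data = list(zip(x, m, y))

boots = []
for _ in range(1000):                            # resample cases with replacement
    s = [rng.choice(data) for _ in data]
    bx, bm, by = zip(*s)
    boots.append(ab(bx, bm, by))
boots.sort()
lo, hi = boots[24], boots[974]                   # 95% percentile interval
```

Mediation is inferred when the interval excludes zero; the BCa variant criticized in the abstract adjusts these two percentile cutoffs rather than the resampling itself.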

  18. Locally refined block-centred finite-difference groundwater models: Evaluation of parameter sensitivity and the consequences for inverse modelling

    USGS Publications Warehouse

    Mehl, S.; Hill, M.C.

    2002-01-01

    Models with local grid refinement, as often required in groundwater models, pose special problems for model calibration. This work investigates the calculation of sensitivities and the performance of regression methods using two existing and one new method of grid refinement. The existing local grid refinement methods considered are: (a) a variably spaced grid in which the grid spacing becomes smaller near the area of interest and larger where such detail is not needed, and (b) telescopic mesh refinement (TMR), which uses the hydraulic heads or fluxes of a regional model to provide the boundary conditions for a locally refined model. The new method has a feedback between the regional and local grids using shared nodes, and thereby, unlike the TMR methods, balances heads and fluxes at the interfacing boundary. Results for sensitivities are compared for the three methods, and the effect of the accuracy of sensitivity calculations is evaluated by comparing inverse modelling results. For the cases tested, results indicate that the inaccuracies of the sensitivities calculated using the TMR approach can cause the inverse model to converge to an incorrect solution.

  19. Locally refined block-centered finite-difference groundwater models: Evaluation of parameter sensitivity and the consequences for inverse modelling and predictions

    USGS Publications Warehouse

    Mehl, S.; Hill, M.C.

    2002-01-01

    Models with local grid refinement, as often required in groundwater models, pose special problems for model calibration. This work investigates the calculation of sensitivities and the performance of regression methods using two existing and one new method of grid refinement. The existing local grid refinement methods considered are (1) a variably spaced grid in which the grid spacing becomes smaller near the area of interest and larger where such detail is not needed and (2) telescopic mesh refinement (TMR), which uses the hydraulic heads or fluxes of a regional model to provide the boundary conditions for a locally refined model. The new method has a feedback between the regional and local grids using shared nodes, and thereby, unlike the TMR methods, balances heads and fluxes at the interfacing boundary. Results for sensitivities are compared for the three methods, and the effect of the accuracy of sensitivity calculations is evaluated by comparing inverse modelling results. For the cases tested, results indicate that the inaccuracies of the sensitivities calculated using the TMR approach can cause the inverse model to converge to an incorrect solution.

  20. A new gradient shimming method based on undistorted field map of B0 inhomogeneity.

    PubMed

    Bao, Qingjia; Chen, Fang; Chen, Li; Song, Kan; Liu, Zao; Liu, Chaoyang

    2016-04-01

    Most existing gradient shimming methods for NMR spectrometers estimate field maps that resolve B0 inhomogeneity spatially from dual gradient-echo (GRE) images acquired at different echo times. However, the distortions induced by B0 inhomogeneity that always exists in the GRE images can result in estimated field maps that are distorted in both geometry and intensity, leading to inaccurate shimming. This work proposes a new gradient shimming method based on undistorted field map of B0 inhomogeneity obtained by a more accurate field map estimation technique. Compared to the traditional field map estimation method, this new method exploits both the positive and negative polarities of the frequency encoded gradients to eliminate the distortions caused by B0 inhomogeneity in the field map. Next, the corresponding automatic post-data procedure is introduced to obtain undistorted B0 field map based on knowledge of the invariant characteristics of the B0 inhomogeneity and the variant polarity of the encoded gradient. The experimental results on both simulated and real gradient shimming tests demonstrate the high performance of this new method. Copyright © 2015 Elsevier Inc. All rights reserved.
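
    The conventional dual-echo field-map estimate that gradient shimming methods start from can be sketched per voxel: the off-resonance frequency is the phase difference between the two echoes divided by 2*pi*dTE. The complex signals below are synthetic, and the paper's actual contribution (acquiring with both frequency-encoding polarities to remove geometric distortion) is not reproduced here.

```python
# Dual-echo B0 field map: off-resonance in Hz from the inter-echo phase shift.
import cmath, math

def field_map(echo1, echo2, te1, te2):
    # echo1/echo2: complex image values at echo times te1/te2 (seconds)
    out = []
    for s1, s2 in zip(echo1, echo2):
        dphi = cmath.phase(s2 * s1.conjugate())         # wrapped to (-pi, pi]
        out.append(dphi / (2 * math.pi * (te2 - te1)))  # Hz
    return out

te1, te2 = 0.002, 0.004
true_hz = [10.0, -25.0, 40.0]                           # synthetic off-resonance
echo1 = [cmath.exp(2j * math.pi * f * te1) for f in true_hz]
echo2 = [cmath.exp(2j * math.pi * f * te2) for f in true_hz]
fmap = field_map(echo1, echo2, te1, te2)
```

The echo spacing bounds the unambiguous range: phase wraps once |f| exceeds 1/(2*dTE), which is one reason field-map post-processing is needed in practice.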
