Sample records for testing methods based

  1. Do Examinees Understand Score Reports for Alternate Methods of Scoring Computer Based Tests?

    ERIC Educational Resources Information Center

    Whittaker, Tiffany A.; Williams, Natasha J.; Dodd, Barbara G.

    2011-01-01

    This study assessed the interpretability of scaled scores based on either number correct (NC) scoring for a paper-and-pencil test or one of two methods of scoring computer-based tests: an item pattern (IP) scoring method and a method based on equated NC scoring. The equated NC scoring method for computer-based tests was proposed as an alternative…

  2. Medical students’ attitudes and perspectives regarding novel computer-based practical spot tests compared to traditional practical spot tests

    PubMed Central

    Wijerathne, Buddhika; Rathnayake, Geetha

    2013-01-01

    Background Most universities currently practice traditional practical spot tests to evaluate students. However, traditional methods have several disadvantages. Computer-based examination techniques are becoming more popular among medical educators worldwide. Therefore incorporating the computer interface in practical spot testing is a novel concept that may minimize the shortcomings of traditional methods. Assessing students’ attitudes and perspectives is vital in understanding how students perceive the novel method. Methods One hundred and sixty medical students were randomly allocated to either a computer-based spot test (n=80) or a traditional spot test (n=80). The students rated their attitudes and perspectives regarding the spot test method soon after the test. The results were described comparatively. Results Students had higher positive attitudes towards the computer-based practical spot test compared to the traditional spot test. Their recommendations to introduce the novel practical spot test method for future exams and to other universities were statistically significantly higher. Conclusions The computer-based practical spot test is viewed as more acceptable to students than the traditional spot test. PMID:26451213

  3. SB certification handout material requirements, test methods, responsibilities, and minimum classification levels for mixture-based specification for flexible base.

    DOT National Transportation Integrated Search

    2012-10-01

    A handout with tables representing the material requirements, test methods, responsibilities, and minimum classification levels for the mixture-based specification for flexible base, with details on aggregate and test methods employed, along with agency and co...

  4. Effectiveness of Jigsaw learning compared to lecture-based learning in dental education.

    PubMed

    Sagsoz, O; Karatas, O; Turel, V; Yildiz, M; Kaya, E

    2017-02-01

    The objective of this study was to evaluate the success levels of students using the Jigsaw learning method in dental education. Fifty students with similar grade point average (GPA) scores were selected and randomly assigned into one of two groups (n = 25). A pretest concerning 'adhesion and bonding agents in dentistry' was administered to all students before classes. The Jigsaw learning method was applied to the experimental group for 3 weeks. At the same time, the control group was taking classes using the lecture-based learning method. At the end of the 3 weeks, all students were retested (post-test) on the subject. A retention test was administered 3 weeks after the post-test. Mean scores were calculated for each test for the experimental and control groups, and the data obtained were analysed using the independent samples t-test. No significant difference was determined between the Jigsaw and lecture-based methods at pretest or post-test. The highest mean test score was observed in the post-test with the Jigsaw method. In the retention test, success with the Jigsaw method was significantly higher than that with the lecture-based method. The Jigsaw method is as effective as the lecture-based method. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  5. Evaluation of Alternative Altitude Scaling Methods for Thermal Ice Protection System in NASA Icing Research Tunnel

    NASA Technical Reports Server (NTRS)

    Lee, Sam; Addy, Harold E. Jr.; Broeren, Andy P.; Orchard, David M.

    2017-01-01

    A test was conducted at the NASA Icing Research Tunnel (IRT) to evaluate altitude scaling methods for a thermal ice protection system. Two new scaling methods based on the Weber number were compared against a method based on the Reynolds number. The results generally agreed with the previous set of tests conducted in the NRCC Altitude Icing Wind Tunnel (AIWT), where the three scaling methods were also tested and compared along with reference (altitude) icing conditions. In those tests, the Weber number-based scaling methods yielded results much closer to those observed at the reference icing conditions than the Reynolds number-based method did. The test in the NASA IRT used a much larger, asymmetric airfoil with an ice protection system that more closely resembled designs used in commercial aircraft. Following the trends observed during the AIWT tests, the Weber number-based scaling methods resulted in less runback ice than the Reynolds number-based scaling, and the ice formed farther upstream. The results show that the new Weber number-based scaling methods, particularly the Weber number with water loading scaling, continue to show promise for ice protection system development and evaluation in atmospheric icing tunnels.
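
Both similarity parameters named in this record have standard textbook definitions; the sketch below assumes generic reference quantities, since the papers' exact scaling formulation (choice of length scale and fluid properties) is not given here.

```latex
% Standard forms; the particular length scales and properties used in
% the scaling methods above are assumptions.
\mathrm{Re} = \frac{\rho\,V\,L}{\mu}, \qquad
\mathrm{We} = \frac{\rho_w\,V^{2}\,L}{\sigma}
```

Here ρ and μ are the air density and viscosity, ρ_w the water density, V the airspeed, L a reference length, and σ the surface tension of water.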

  6. 9 CFR 149.6 - Slaughter facilities.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... result based on an ELISA method and is confirmed positive by further testing using the digestion method... decertified. (C) If a test sample yields a positive test result based on an ELISA method, but is not confirmed...

  7. 9 CFR 149.6 - Slaughter facilities.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... result based on an ELISA method and is confirmed positive by further testing using the digestion method... decertified. (C) If a test sample yields a positive test result based on an ELISA method, but is not confirmed...

  8. 9 CFR 149.6 - Slaughter facilities.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... result based on an ELISA method and is confirmed positive by further testing using the digestion method... decertified. (C) If a test sample yields a positive test result based on an ELISA method, but is not confirmed...

  9. 9 CFR 149.6 - Slaughter facilities.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... result based on an ELISA method and is confirmed positive by further testing using the digestion method... decertified. (C) If a test sample yields a positive test result based on an ELISA method, but is not confirmed...

  10. 9 CFR 149.6 - Slaughter facilities.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... result based on an ELISA method and is confirmed positive by further testing using the digestion method... decertified. (C) If a test sample yields a positive test result based on an ELISA method, but is not confirmed...

  11. A Novel Quantum Dots-Based Point of Care Test for Syphilis

    NASA Astrophysics Data System (ADS)

    Yang, Hao; Li, Ding; He, Rong; Guo, Qin; Wang, Kan; Zhang, Xueqing; Huang, Peng; Cui, Daxiang

    2010-05-01

    A one-step lateral flow test is recommended as the first-line screening for syphilis in primary healthcare settings in developing countries. However, it generally shows low sensitivity. We describe here the development of a novel fluorescent point-of-care (POC) test method for syphilis screening, designed to combine the rapidity of the lateral flow test with the sensitivity of a fluorescent method. Fifty syphilis-positive specimens and 50 healthy specimens, confirmed by Treponema pallidum particle agglutination (TPPA), were tested with quantum dot-labeled and colloidal gold-labeled lateral flow test strips, respectively. The results showed that both sensitivity and specificity of the quantum dots-based method reached 100% (95% confidence interval [CI], 91-100%), while those of the colloidal gold-based method were 82% (95% CI, 68-91%) and 100% (95% CI, 91-100%), respectively. In addition, the naked-eye detection limit of the quantum dot-based method reached 2 ng/ml of anti-TP47 polyclonal antibodies purified by affinity chromatography with TP47 antigen, a tenfold improvement over the colloidal gold-based method. In conclusion, quantum dots were found to be suitable labels for lateral flow test strips. Ease of use, sensitivity, and low cost make the method well suited for population-based, on-site syphilis screening.

  12. Comparing In-Class and Out-of-Class Computer-Based Tests to Traditional Paper-and-Pencil Tests in Introductory Psychology Courses

    ERIC Educational Resources Information Center

    Frein, Scott T.

    2011-01-01

    This article describes three experiments comparing paper-and-pencil tests (PPTs) to computer-based tests (CBTs) in terms of test method preferences and student performance. In Experiment 1, students took tests using three methods: PPT in class, CBT in class, and CBT at the time and place of their choosing. Results indicate that test method did not…

  13. Measurement of susceptibility artifacts with histogram-based reference value on magnetic resonance images according to standard ASTM F2119.

    PubMed

    Heinrich, Andreas; Teichgräber, Ulf K; Güttler, Felix V

    2015-12-01

    The standard ASTM F2119 describes a test method for measuring the size of a susceptibility artifact based on the example of a passive implant. A pixel in an image is considered to be a part of an image artifact if the intensity is changed by at least 30% in the presence of a test object, compared to a reference image in which the test object is absent (reference value). The aim of this paper is to simplify and accelerate the test method using a histogram-based reference value. Four test objects were scanned parallel and perpendicular to the main magnetic field, and the largest susceptibility artifacts were measured using two methods of reference value determination (reference image-based and histogram-based reference value). The results between both methods were compared using the Mann-Whitney U-test. The difference between both reference values was 42.35 ± 23.66. The difference of artifact size was 0.64 ± 0.69 mm. The artifact sizes of both methods did not show significant differences; the p-value of the Mann-Whitney U-test was between 0.710 and 0.521. A standard-conform method for a rapid, objective, and reproducible evaluation of susceptibility artifacts could be implemented. The result of the histogram-based method does not significantly differ from the ASTM-conform method.
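
For illustration, a minimal sketch of the ASTM 30% criterion and the two reference-value choices compared in this record, assuming grayscale images held as NumPy arrays; the exact histogram statistic the authors use is an assumption here.

```python
import numpy as np

def artifact_mask(img, ref_value, threshold=0.30):
    """ASTM F2119 criterion: a pixel belongs to the artifact if its
    intensity differs from the reference value by at least 30%."""
    img = np.asarray(img, dtype=float)
    return np.abs(img - ref_value) >= threshold * ref_value

def reference_from_image(ref_img, roi):
    """Reference-image-based value: mean intensity of a region of
    interest in a scan acquired without the test object."""
    return float(np.asarray(ref_img, dtype=float)[roi].mean())

def reference_from_histogram(img):
    """Histogram-based value (the paper's shortcut): here, the mode of
    the artifact image's own nonzero-intensity histogram; the exact
    statistic used by the authors is an assumption."""
    vals = np.asarray(img, dtype=float)
    hist, edges = np.histogram(vals[vals > 0], bins=256)
    i = hist.argmax()
    return 0.5 * (edges[i] + edges[i + 1])
```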

  14. Research on Generating Method of Embedded Software Test Document Based on Dynamic Model

    NASA Astrophysics Data System (ADS)

    Qu, MingCheng; Wu, XiangHu; Tao, YongChao; Liu, Ying

    2018-03-01

    This paper presents a dynamic model-based test document generation method for embedded software that automatically generates two documents: the test requirements specification and the configuration item test document. The method implements test requirements in dynamic models, so that traceability of dynamic test requirements is generated easily. It can automatically produce standardized test requirements and test documentation, mitigating problems such as inconsistency and incompleteness of document content, and improving efficiency.

  15. Simple Test Functions in Meshless Local Petrov-Galerkin Methods

    NASA Technical Reports Server (NTRS)

    Raju, Ivatury S.

    2016-01-01

    Two meshless local Petrov-Galerkin (MLPG) methods based on two different trial functions, but using a simple linear test function, were developed for beam and column problems. These methods used generalized moving least squares (GMLS) and radial basis (RB) interpolation functions as trial functions. The two methods were tried on various patch test problems, and both passed the patch tests successfully. The methods were then applied to various beam vibration problems and problems involving Euler and Beck's columns. Both methods yielded accurate solutions for all problems studied. The simple linear test function offers considerable savings in computing effort, as the domain integrals involved in the weak form are avoided. The two methods based on this simple linear test function produced accurate results for frequencies and buckling loads. Of the two methods studied, the method with radial basis trial functions is very attractive, as it is simple, accurate, and robust.

  16. Flight-Test Evaluation of Flutter-Prediction Methods

    NASA Technical Reports Server (NTRS)

    Lind, Rick; Brenner, Marty

    2003-01-01

    The flight-test community routinely spends considerable time and money to determine a range of flight conditions, called a flight envelope, within which an aircraft is safe to fly. The cost of determining a flight envelope could be greatly reduced if there were a method of safely and accurately predicting the speed associated with the onset of an instability called flutter. Several methods have been developed with the goal of predicting flutter speeds to improve the efficiency of flight testing. These methods include (1) data-based methods, in which one relies entirely on information obtained from the flight tests and (2) model-based approaches, in which one relies on a combination of flight data and theoretical models. The data-driven methods include one based on extrapolation of damping trends, one that involves an envelope function, one that involves the Zimmerman-Weissenburger flutter margin, and one that involves a discrete-time auto-regressive model. An example of a model-based approach is that of the flutterometer. These methods have all been shown to be theoretically valid and have been demonstrated on simple test cases; however, until now, they have not been thoroughly evaluated in flight tests. An experimental apparatus called the Aerostructures Test Wing (ATW) was developed to test these prediction methods.

  17. Experimental Evaluation of Suitability of Selected Multi-Criteria Decision-Making Methods for Large-Scale Agent-Based Simulations.

    PubMed

    Tučník, Petr; Bureš, Vladimír

    2016-01-01

    Multi-criteria decision-making (MCDM) can be formally implemented by various methods. This study compares the suitability of four selected MCDM methods, namely WPM, TOPSIS, VIKOR, and PROMETHEE, for future applications in agent-based computational economic (ACE) models of larger scale (i.e., over 10,000 agents in one geographical region). These four MCDM methods were selected according to their appropriateness for computational processing in ACE applications. Tests of the selected methods were conducted on four hardware configurations. For each method, 100 tests were performed, which represented one testing iteration. With four testing iterations conducted on each hardware setting, and with all configurations tested separately with the server parameter deactivated and activated, altogether 12,800 data points were collected and subsequently analyzed. An illustrative decision-making scenario that allows the mutual comparison of all of the selected decision-making methods was used. Our test results suggest that although all methods are convenient and can be used in practice, the VIKOR method completed the tests with the best results and thus can be recommended as the most suitable for simulations of large-scale agent-based models.
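
To illustrate the benchmark protocol (100 runs per testing iteration over a large alternatives-by-criteria matrix), here is a hedged sketch using the weighted product model (WPM), the simplest of the four methods; it illustrates the protocol only and is not the study's code.

```python
import numpy as np
import time

def wpm_scores(X, w):
    """Weighted product model: X is an (alternatives x criteria) matrix
    of positive values, w the criteria weights (summing to 1).
    Higher score = better, assuming benefit criteria."""
    return np.prod(X ** w, axis=1)

rng = np.random.default_rng(0)
X = rng.uniform(1.0, 10.0, size=(10_000, 4))   # 10,000 agents, 4 criteria
w = np.array([0.4, 0.3, 0.2, 0.1])

t0 = time.perf_counter()
for _ in range(100):                            # one testing iteration
    wpm_scores(X, w)
print(f"100 runs: {time.perf_counter() - t0:.3f} s")
```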

  18. Verification of a ground-based method for simulating high-altitude, supersonic flight conditions

    NASA Astrophysics Data System (ADS)

    Zhou, Xuewen; Xu, Jian; Lv, Shuiyan

    Ground-based methods for accurately representing high-altitude, high-speed flight conditions have been an important research topic in the aerospace field. Based on an analysis of the requirements for high-altitude supersonic flight tests, a ground-based test bed was designed that combines a Laval nozzle, of the kind often found in wind tunnels, with a rocket sled system. Sled tests were used to verify the performance of the test bed. The test results indicated that the test bed produced a uniform flow field with a static pressure and density equivalent to atmospheric conditions at an altitude of 13-15 km and a flow velocity of approximately M 2.4. This test method has the advantages of accuracy, fewer experimental limitations, and reusability.

  19. Analytical interferences in point-of-care testing glucometers by icodextrin and its metabolites: an overview.

    PubMed

    Floré, Katelijne M J; Delanghe, Joris R

    2009-01-01

    Current point-of-care testing (POCT) glucometers are based on various test principles. Two major method groups dominate the market: glucose oxidase-based systems and glucose dehydrogenase-based systems using pyrroloquinoline quinone (GDH-PQQ) as a cofactor. The GDH-PQQ-based glucometers are replacing the older glucose oxidase-based systems because of their lower sensitivity for oxygen. On the other hand, the GDH-PQQ test method results in falsely elevated blood glucose levels in peritoneal dialysis patients receiving solutions containing icodextrin (e.g., Extraneal; Baxter, Brussels, Belgium). Icodextrin is metabolized in the systemic circulation into different glucose polymers, but mainly maltose, which interferes with the GDH-PQQ-based method. Clinicians should be aware of this analytical interference. The POCT glucometers based on the GDH-PQQ method should preferably not be used in this high-risk population and POCT glucose results inconsistent with clinical suspicion of hypoglycemic coma should be retested with another testing system.

  20. GEE-based SNP set association test for continuous and discrete traits in family-based association studies.

    PubMed

    Wang, Xuefeng; Lee, Seunggeun; Zhu, Xiaofeng; Redline, Susan; Lin, Xihong

    2013-12-01

    Family-based genetic association studies of related individuals provide opportunities to detect genetic variants that complement studies of unrelated individuals. Most statistical methods for family association studies of common variants are single-marker based, testing one SNP at a time. In this paper, we consider testing the effect of an SNP set, e.g., SNPs in a gene, in family studies, for both continuous and discrete traits. Specifically, we propose a generalized estimating equation (GEE)-based kernel association test, a variance-component-based testing method, to test for the association between a phenotype and multiple variants in an SNP set jointly using family samples. The proposed approach allows for both continuous and discrete traits, where the correlation among family members is taken into account through the use of an empirical covariance estimator. We derive the theoretical distribution of the proposed statistic under the null and develop analytical methods to calculate the p-values. We also propose an efficient resampling method for correcting for small-sample-size bias in family studies. The proposed method allows for easily incorporating covariates and SNP-SNP interactions. Simulation studies show that the proposed method properly controls type I error rates under both random and ascertained sampling schemes in family studies. We demonstrate through simulation studies that our approach has superior performance for association mapping compared to the single-marker-based minimum p-value GEE test of an SNP-set effect over a range of scenarios. We illustrate the application of the proposed method using data from the Cleveland Family GWAS Study. © 2013 WILEY PERIODICALS, INC.

  1. 37: COMPARISON OF TWO METHODS: TBL-BASED AND LECTURE-BASED LEARNING IN NURSING CARE OF PATIENTS WITH DIABETES IN NURSING STUDENTS

    PubMed Central

    Khodaveisi, Masoud; Qaderian, Khosro; Oshvandi, Khodayar; Soltanian, Ali Reza; Vardanjani, Mehdi Molavi

    2017-01-01

    Background and aims: Learning plays an important role in developing nursing skills and proper care-taking. The present study aimed to compare two learning methods, team-based learning and lecture-based learning, for teaching the care of patients with diabetes to nursing students. Method: In this quasi-experimental study, 64 fourth-term students from the nursing colleges of Bukan and Miandoab were included. A researcher-developed knowledge and performance questionnaire on the care of patients with diabetes, comprising 15 knowledge questions and 5 performance questions, was used as the data collection tool; its reliability was confirmed by Cronbach's alpha (r=0.83). Paired t-tests were used to compare mean knowledge and performance scores within each group between the pre-test and post-test steps, and independent t-tests were used to compare mean scores between the control and intervention groups. Results: There was no statistically significant difference between the two groups in pre-test knowledge and performance scores (p=0.784). There was a significant difference in mean post-test scores of diabetes knowledge and performance between the team-based learning group and the lecture-based learning group (p=0.001). There was also a significant difference between mean pre-test and post-test scores of diabetes care knowledge in both learning groups (p=0.001). Conclusion: Both the team-based and lecture-based learning approaches improved student learning, but the gains were greater with team-based learning, and this method is recommended for use as a higher-education method in the education of students.

  2. Experimental Evaluation of Suitability of Selected Multi-Criteria Decision-Making Methods for Large-Scale Agent-Based Simulations

    PubMed Central

    2016-01-01

    Multi-criteria decision-making (MCDM) can be formally implemented by various methods. This study compares the suitability of four selected MCDM methods, namely WPM, TOPSIS, VIKOR, and PROMETHEE, for future applications in agent-based computational economic (ACE) models of larger scale (i.e., over 10,000 agents in one geographical region). These four MCDM methods were selected according to their appropriateness for computational processing in ACE applications. Tests of the selected methods were conducted on four hardware configurations. For each method, 100 tests were performed, which represented one testing iteration. With four testing iterations conducted on each hardware setting, and with all configurations tested separately with the server parameter deactivated and activated, altogether 12,800 data points were collected and subsequently analyzed. An illustrative decision-making scenario that allows the mutual comparison of all of the selected decision-making methods was used. Our test results suggest that although all methods are convenient and can be used in practice, the VIKOR method completed the tests with the best results and thus can be recommended as the most suitable for simulations of large-scale agent-based models. PMID:27806061

  3. Density-based empirical likelihood procedures for testing symmetry of data distributions and K-sample comparisons.

    PubMed

    Vexler, Albert; Tanajian, Hovig; Hutson, Alan D

    In practice, parametric likelihood-ratio techniques are powerful statistical tools. In this article, we propose and examine novel and simple distribution-free test statistics that efficiently approximate parametric likelihood ratios to analyze and compare distributions of K groups of observations. Using the density-based empirical likelihood methodology, we develop a Stata package that applies to a test for symmetry of data distributions and compares K-sample distributions. Recognizing that recent statistical software packages do not sufficiently address K-sample nonparametric comparisons of data distributions, we propose a new Stata command, vxdbel, to execute exact density-based empirical likelihood-ratio tests using K samples. To calculate p-values of the proposed tests, we use the following methods: 1) a classical technique based on Monte Carlo p-value evaluations; 2) an interpolation technique based on tabulated critical values; and 3) a new hybrid technique that combines methods 1 and 2. The third, cutting-edge method is shown to be very efficient in the context of exact-test p-value computations. This Bayesian-type method considers tabulated critical values as prior information and Monte Carlo generations of test statistic values as data used to depict the likelihood function. In this case, a nonparametric Bayesian method is proposed to compute critical values of exact tests.
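
The first of the three p-value techniques is simple enough to sketch; below is a minimal Monte Carlo p-value with the standard +1 correction, assuming larger statistic values indicate stronger evidence. The hybrid technique described above would additionally pull this estimate toward tabulated critical values treated as prior information.

```python
import numpy as np

def mc_pvalue(stat_obs, stat_perm):
    """Monte Carlo p-value: fraction of simulated/permuted statistics
    at least as extreme as the observed one, with +1 correction so the
    estimate is never exactly zero."""
    stat_perm = np.asarray(stat_perm)
    return (1 + np.sum(stat_perm >= stat_obs)) / (1 + len(stat_perm))
```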

  4. Risk-Based Object Oriented Testing

    NASA Technical Reports Server (NTRS)

    Rosenberg, Linda H.; Stapko, Ruth; Gallo, Albert

    2000-01-01

    Software testing is a well-defined phase of the software development life cycle. Functional ("black box") testing and structural ("white box") testing are two methods of test case design commonly used by software developers. A lesser known testing method is risk-based testing, which takes into account the probability of failure of a portion of code as determined by its complexity. For object oriented programs, a methodology is proposed for identification of risk-prone classes. Risk-based testing is a highly effective testing technique that can be used to find and fix the most important problems as quickly as possible.

  5. Evaluation of a Secure Laptop-Based Testing Program in an Undergraduate Nursing Program: Students' Perspective.

    PubMed

    Tao, Jinyuan; Gunter, Glenda; Tsai, Ming-Hsiu; Lim, Dan

    2016-01-01

    Recently, robust learning management systems and the availability of affordable laptops have made secure laptop-based testing a reality on many campuses. The undergraduate nursing program at the authors' university began to implement a secure laptop-based testing program in 2009, which allowed students to use their newly purchased laptops to take quizzes and tests securely in classrooms. After nearly 5 years of the program's implementation, a formative evaluation, using a mixed method with both descriptive and correlational data elements, was conducted to seek constructive feedback from students to improve the program. Evaluation data show that, overall, students (n = 166) believed the secure laptop-based testing program gave them hands-on experience of taking examinations on the computer and prepared them for the computerized NCLEX-RN. Students, however, had many concerns about the laptop glitches and campus wireless network glitches they experienced during testing. NCLEX-RN first-time passing-rate data were also analyzed using the χ2 test, and revealed no significant association between the two testing methods (paper-and-pencil testing and secure laptop-based testing) and students' first-time NCLEX-RN passing rate. Based on the odds ratio, however, the odds of students passing the NCLEX-RN on the first attempt were 1.37 times higher if they were taught with the secure laptop-based testing method than with the traditional paper-and-pencil testing method in nursing school. It was recommended to the institution that better-quality laptops be provided to future students, that measures be taken to further stabilize the campus wireless network, and that the Laptop Initiative Program be reevaluated.
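
For readers unfamiliar with the reported effect size, the odds ratio and χ2 test can be reproduced from a 2×2 pass/fail table; the counts below are hypothetical, chosen only so the odds ratio comes out near the reported 1.37.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical counts (the article reports only the chi-square
# conclusion and OR = 1.37, not the underlying table).
table = np.array([[110, 10],   # laptop-based testing: pass, fail
                  [ 80, 10]])  # paper-and-pencil:     pass, fail

chi2, p, dof, expected = chi2_contingency(table)
odds_ratio = (table[0, 0] * table[1, 1]) / (table[0, 1] * table[1, 0])
print(chi2, p, odds_ratio)    # odds_ratio = 1.375, close to 1.37
```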

  6. Comparison of Standardized Test Scores from Traditional Classrooms and Those Using Problem-Based Learning

    ERIC Educational Resources Information Center

    Needham, Martha Elaine

    2010-01-01

    This research compares differences between standardized test scores in problem-based learning (PBL) classrooms and a traditional classroom for 6th grade students using a mixed-method, quasi-experimental and qualitative design. The research shows that problem-based learning is as effective as traditional teaching methods on standardized tests. The…

  7. Develop a new testing and evaluation protocol to assess flexbase performance using strength of soil binder.

    DOT National Transportation Integrated Search

    2008-01-01

    This research involved a detailed laboratory study of a new test method for evaluating road base materials based on the strength of the soil binder. In this test method, small test specimens (5.0 in. length and 0.75 in. square cross section) of binde...

  8. Significance testing of rules in rule-based models of human problem solving

    NASA Technical Reports Server (NTRS)

    Lewis, C. M.; Hammer, J. M.

    1986-01-01

    Rule-based models of human problem solving have typically not been tested for statistical significance. Three methods of testing rules - analysis of variance, randomization, and contingency tables - are presented. Advantages and disadvantages of the methods are also described.

  9. Laboratory based instruction in Pakistan: Comparative evaluation of three laboratory instruction methods in biological science at higher secondary school level

    NASA Astrophysics Data System (ADS)

    Cheema, Tabinda Shahid

    This study of laboratory based instruction at higher secondary school level was an attempt to gain some insight into the effectiveness of three laboratory instruction methods: cooperative group instruction method, individualised instruction method and lecture demonstration method on biology achievement and retention. A Randomised subjects, Pre-test Post-test Comparative Methods Design was applied. Three groups of students from a year 11 class in Pakistan conducted experiments using the different laboratory instruction methods. Pre-tests, achievement tests after the experiments and retention tests one month later were administered. Results showed no significant difference between the groups on total achievement and retention, nor was there any significant difference on knowledge and comprehension test scores or skills performance. Future research investigating a similar problem is suggested.

  10. A critical issue in model-based inference for studying trait-based community assembly and a solution.

    PubMed

    Ter Braak, Cajo J F; Peres-Neto, Pedro; Dray, Stéphane

    2017-01-01

    Statistical testing of trait-environment association from data is a challenge, as there is no common unit of observation: the trait is observed on species, the environment on sites, and the mediating abundance on species-site combinations. A number of correlation-based methods, such as the community weighted trait means method (CWM), the fourth-corner correlation method, and the multivariate method RLQ, have been proposed to estimate such trait-environment associations. In these methods, valid statistical testing proceeds by performing two separate resampling tests, one site-based and the other species-based, and by assessing significance by the larger of the two p-values (the pmax test). Recently, regression-based methods using generalized linear models (GLM) have been proposed as a promising alternative, with statistical inference via site-based resampling. We investigated the performance of this new approach along with approaches that mimicked the pmax test using GLM instead of the fourth-corner. By simulation using models with additional random variation in the species response to the environment, the site-based resampling tests using GLM are shown to have severely inflated type I error, of up to 90%, when the nominal level is set at 5%. In addition, predictive modelling of such data using site-based cross-validation very often identified trait-environment interactions that had no predictive value. The problem that we identify is not an "omitted variable bias" problem, as it occurs even when the additional random variation is independent of the observed trait and environment data. Instead, it is a problem of ignoring a random effect. In the same simulations, the GLM-based pmax test controlled the type I error in all models proposed so far in this context, but still gave slightly inflated error in more complex models that included both missing (but important) traits and missing (but important) environmental variables. For screening the importance of single trait-environment combinations, the fourth-corner test is shown to give almost the same results as the GLM-based tests in far less computing time.
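
A hedged sketch of the pmax test for a single quantitative trait and environmental variable, using a simplified fourth-corner-style statistic; the statistic and the permutation counts are assumptions, not the paper's exact implementation.

```python
import numpy as np

def fourth_corner_r(E, T, A):
    """Fourth-corner-style statistic: abundance-weighted cross product
    of the standardized site environment E (n_sites,) and species
    trait T (n_species,), with abundance matrix A (n_sites, n_species)."""
    e = (E - E.mean()) / E.std()
    t = (T - T.mean()) / T.std()
    return abs(e @ A @ t) / A.sum()

def pmax_test(E, T, A, n_perm=999, seed=0):
    """pmax test: one site-based and one species-based permutation
    test; significance is judged by the larger of the two p-values."""
    rng = np.random.default_rng(seed)
    r_obs = fourth_corner_r(E, T, A)
    p_values = []
    for axis in ("sites", "species"):
        hits = 0
        for _ in range(n_perm):
            if axis == "sites":
                r = fourth_corner_r(rng.permutation(E), T, A)
            else:
                r = fourth_corner_r(E, rng.permutation(T), A)
            hits += r >= r_obs
        p_values.append((1 + hits) / (1 + n_perm))
    return max(p_values)
```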

  11. Zero-inflated Poisson model based likelihood ratio test for drug safety signal detection.

    PubMed

    Huang, Lan; Zheng, Dan; Zalkikar, Jyoti; Tiwari, Ram

    2017-02-01

    In recent decades, numerous methods have been developed for data mining of large drug safety databases, such as the Food and Drug Administration's (FDA's) Adverse Event Reporting System, in which data matrices are formed with drugs as columns and adverse events as rows. Often, a large number of cells in these data matrices have zero cell counts. Some of them are "true zeros", indicating that the drug-adverse event pairs cannot occur; these are distinguished from the other zero counts, which are modeled zero counts and simply indicate that the drug-adverse event pairs have not occurred yet or have not been reported yet. In this paper, a zero-inflated Poisson model-based likelihood ratio test method is proposed to identify drug-adverse event pairs that have disproportionately high reporting rates, which are also called signals. The maximum likelihood estimates of the model parameters are obtained using the expectation-maximization algorithm. The test is also modified to handle stratified analyses for binary and categorical covariates (e.g., gender and age) in the data. The proposed method is shown to asymptotically control the type I error and false discovery rate, and its finite-sample performance for signal detection is evaluated through a simulation study. The simulation results show that the zero-inflated Poisson model-based likelihood ratio test performs similarly to the Poisson model-based likelihood ratio test when the estimated percentage of true zeros in the database is small. Both methods are applied to six selected drugs, from the 2006 to 2011 Adverse Event Reporting System database, with varying percentages of observed zero-count cells.
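
The EM iteration for the zero-inflated Poisson mixture underlying the test is compact enough to sketch; this estimates only the mixture parameters (pi, lam), not the full signal-detection likelihood ratio test, and the initialization is an assumption.

```python
import numpy as np

def zip_em(x, tol=1e-8, max_iter=500):
    """EM estimates (pi, lam) for a zero-inflated Poisson:
    P(X=0) = pi + (1-pi)*exp(-lam);  P(X=k) = (1-pi)*Pois(k; lam)."""
    x = np.asarray(x, dtype=float)
    pi, lam = 0.5, max(x.mean(), 1e-6)          # assumed starting values
    for _ in range(max_iter):
        # E-step: posterior probability that an observed zero is a
        # structural ("true") zero rather than a Poisson zero.
        z = np.where(x == 0, pi / (pi + (1 - pi) * np.exp(-lam)), 0.0)
        # M-step: closed-form updates from the complete-data likelihood.
        pi_new = z.mean()
        lam_new = x.sum() / (len(x) - z.sum())
        if abs(pi_new - pi) + abs(lam_new - lam) < tol:
            break
        pi, lam = pi_new, lam_new
    return pi, lam
```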

  12. Evidence-Based Toxicology.

    PubMed

    Hoffmann, Sebastian; Hartung, Thomas; Stephens, Martin

    Evidence-based toxicology (EBT) was introduced independently by two groups in 2005, in the context of toxicological risk assessment and causation, as well as on the basis of parallels between the evaluation of test methods in toxicology and the evidence-based assessment of diagnostic tests in medicine. The role model of evidence-based medicine (EBM) motivated both proposals and guided the evolution of EBT, in which systematic reviews and evidence quality assessment in particular attract considerable attention in toxicology. Regarding test assessment, in the search for solutions to various problems related to validation, such as the imperfectness of the reference standard or the challenge of evaluating tests comprehensively, the field of Diagnostic Test Assessment (DTA) was identified as a potential resource. DTA being an EBM discipline, test method assessment/validation therefore became one of the main drivers spurring the development of EBT. In the context of pathway-based toxicology, EBT approaches, given their objectivity, transparency, and consistency, have been proposed for carrying out a (retrospective) mechanistic validation. In summary, implementation of more evidence-based approaches may provide the tools necessary to adapt the assessment/validation of toxicological test methods and testing strategies to face the challenges of toxicology in the twenty-first century.

  13. Model-based sensor-less wavefront aberration correction in optical coherence tomography.

    PubMed

    Verstraete, Hans R G W; Wahls, Sander; Kalkman, Jeroen; Verhaegen, Michel

    2015-12-15

    Several sensor-less wavefront aberration correction methods that correct nonlinear wavefront aberrations by maximizing the optical coherence tomography (OCT) signal are tested on an OCT setup. A conventional coordinate search method is compared to two model-based optimization methods. The first model-based method takes advantage of the well-known optimization algorithm (NEWUOA) and utilizes a quadratic model. The second model-based method (DONE) is new and utilizes a random multidimensional Fourier-basis expansion. The model-based algorithms achieve lower wavefront errors with up to ten times fewer measurements. Furthermore, the newly proposed DONE method outperforms the NEWUOA method significantly. The DONE algorithm is tested on OCT images and shows a significantly improved image quality.

  14. New decision criteria for selecting delta check methods based on the ratio of the delta difference to the width of the reference range can be generally applicable for each clinical chemistry test item.

    PubMed

    Park, Sang Hyuk; Kim, So-Young; Lee, Woochang; Chun, Sail; Min, Won-Ki

    2012-09-01

    Many laboratories use 4 delta check methods: delta difference, delta percent change, rate difference, and rate percent change. However, guidelines regarding decision criteria for selecting delta check methods have not yet been provided. We present new decision criteria for selecting delta check methods for each clinical chemistry test item. We collected 811,920 and 669,750 paired (present and previous) test results for 27 clinical chemistry test items from inpatients and outpatients, respectively. We devised new decision criteria for the selection of delta check methods based on the ratio of the delta difference to the width of the reference range (DD/RR). Delta check methods based on these criteria were compared with those based on the CV% of the absolute delta difference (ADD) as well as those reported in 2 previous studies. The delta check methods suggested by new decision criteria based on the DD/RR ratio corresponded well with those based on the CV% of the ADD except for only 2 items each in inpatients and outpatients. Delta check methods based on the DD/RR ratio also corresponded with those suggested in the 2 previous studies, except for 1 and 7 items in inpatients and outpatients, respectively. The DD/RR method appears to yield more feasible and intuitive selection criteria and can easily explain changes in the results by reflecting both the biological variation of the test item and the clinical characteristics of patients in each laboratory. We suggest this as a measure to determine delta check methods.
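
The four delta check quantities and the proposed DD/RR ratio are straightforward to compute; a minimal sketch follows, with the decision thresholds (which the authors derive from their data) deliberately left out.

```python
def delta_checks(prev, curr, hours, ref_low, ref_high):
    """The four common delta check quantities, plus the DD/RR ratio the
    authors use to choose among them (a sketch of the idea only).
    prev/curr: previous and present results; hours: time between them;
    ref_low/ref_high: limits of the reference range."""
    dd = curr - prev                            # delta difference
    dpc = 100.0 * dd / prev                     # delta percent change
    rd = dd / hours                             # rate difference
    rpc = dpc / hours                           # rate percent change
    dd_rr = abs(dd) / (ref_high - ref_low)      # DD / reference-range width
    return dd, dpc, rd, rpc, dd_rr
```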

  15. Nanoparticle filtration performance of NIOSH-certified particulate air-purifying filtering facepiece respirators: evaluation by light scattering photometric and particle number-based test methods.

    PubMed

    Rengasamy, Samy; Eimer, Benjamin C

    2012-01-01

    National Institute for Occupational Safety and Health (NIOSH) certification test methods employ charge neutralized NaCl or dioctyl phthalate (DOP) aerosols to measure filter penetration levels of air-purifying particulate respirators photometrically using a TSI 8130 automated filter tester at 85 L/min. A previous study in our laboratory found that widely different filter penetration levels were measured for nanoparticles depending on whether a particle number (count)-based detector or a photometric detector was used. The purpose of this study was to better understand the influence of key test parameters, including filter media type, challenge aerosol size range, and detector system. Initial penetration levels for 17 models of NIOSH-approved N-, R-, and P-series filtering facepiece respirators were measured using the TSI 8130 photometric method and compared with the particle number-based penetration (obtained using two ultrafine condensation particle counters) for the same challenge aerosols generated by the TSI 8130. In general, the penetration obtained by the photometric method was less than the penetration obtained with the number-based method. Filter penetration was also measured for ambient room aerosols. Penetration measured by the TSI 8130 photometric method was lower than the number-based ambient aerosol penetration values. Number-based monodisperse NaCl aerosol penetration measurements showed that the most penetrating particle size was in the 50 nm range for all respirator models tested, with the exception of one model at ~200 nm size. Respirator models containing electrostatic filter media also showed lower penetration values with the TSI 8130 photometric method than the number-based penetration obtained for the most penetrating monodisperse particles. Results suggest that to provide a more challenging respirator filter test method than what is currently used for respirators containing electrostatic media, the test method should utilize a sufficient number of particles <100 nm and a count (particle number)-based detector.
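
A toy numerical illustration of why a count-based detector reports higher penetration than a photometric one when the most penetrating size is near 50 nm; every curve below is an assumed shape, not measurement data.

```python
import numpy as np

# Illustrative shapes only: a lognormal challenge aerosol and a filter
# penetration curve peaking near 50 nm (both are assumptions).
d = np.logspace(0.7, 3, 400)                                  # 5-1000 nm
counts = np.exp(-0.5 * ((np.log(d) - np.log(75)) / 0.7) ** 2)
pen = 0.05 * np.exp(-0.5 * ((np.log(d) - np.log(50)) / 0.9) ** 2)

# Count-based detector: every particle counts equally.
count_pen = (pen * counts).sum() / counts.sum()

# Photometer: responds to scattered light, which grows steeply with
# particle size (~volume here), so sub-100 nm particles barely register.
resp = counts * d ** 3
photo_pen = (pen * resp).sum() / resp.sum()

print(f"count-based: {100*count_pen:.2f}%  photometric: {100*photo_pen:.2f}%")
```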

  16. Antimicrobial Testing Methods & Procedures: MB-09-06

    EPA Pesticide Factsheets

    Describes the methodology used to determine the efficacy of towelette-based disinfectants against microbes on hard surfaces. The test is based on AOAC Method 961.02 (Germicidal Spray Products as Disinfectants).

  17. A Comparison of the Approaches of Generalizability Theory and Item Response Theory in Estimating the Reliability of Test Scores for Testlet-Composed Tests

    ERIC Educational Resources Information Center

    Lee, Guemin; Park, In-Yong

    2012-01-01

    Previous assessments of the reliability of test scores for testlet-composed tests have indicated that item-based estimation methods overestimate reliability. This study was designed to address issues related to the extent to which item-based estimation methods overestimate the reliability of test scores composed of testlets and to compare several…

  18. Method modification of the Legipid® Legionella fast detection test kit.

    PubMed

    Albalat, Guillermo Rodríguez; Broch, Begoña Bedrina; Bono, Marisa Jiménez

    2014-01-01

    Legipid® Legionella Fast Detection is a test based on combined magnetic immunocapture and enzyme immunoassay (CEIA) for the detection of Legionella in water. The test is based on the use of anti-Legionella antibodies immobilized on magnetic microspheres. The target microorganism is preconcentrated by filtration, and immunomagnetic analysis is applied to these preconcentrated water samples in a final test portion of 9 mL. The test kit was certified by the AOAC Research Institute as Performance Tested Method℠ (PTM) No. 111101, in a PTM validation that certifies the performance claims of the test method in comparison to the ISO reference method 11731-1998 and its revision 11731-2004, "Water Quality: Detection and Enumeration of Legionella pneumophila", in potable water, industrial water, and waste water. The modification of this test kit has been approved. The modification includes broadening the target analyte from L. pneumophila to Legionella species and adding an optical reader to the test method. In this study, 71 strains of Legionella spp. other than L. pneumophila were tested to determine their reactivity with the CEIA-based kit. All strains of Legionella spp. tested by the CEIA test were confirmed positive by the reference standard method ISO 11731. This test (PTM 111101) has been modified to include a final optical reading. A methods comparison study was conducted to demonstrate the equivalence of this modification to the reference culture method. Two water matrixes were analyzed. Results show no statistically detectable difference between the test method and the reference culture method for the enumeration of Legionella spp. The relative level of detection was 93 CFU/volume examined (LOD50). For optical reading, the LOD was 40 CFU/volume examined and the LOQ was 60 CFU/volume examined. The results showed that Legipid Legionella Fast Detection is equivalent to the reference culture method for the enumeration of Legionella spp.

  19. 77 FR 50510 - Federal Agency Responses to Interagency Coordinating Committee on the Validation of Alternative...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-21

    ... methods for potential use in the EPA EDSP. The evaluation indicated that no in vitro ER- or AR-based test... considered a high priority based on the lack of adequately validated test methods and the regulatory and... Limitations of the LUMI-CELL[supreg] ER (BG1Luc ER TA) Test Method, An In Vitro Assay for Identifying Human...

  20. Efficient Testing Combining Design of Experiment and Learn-to-Fly Strategies

    NASA Technical Reports Server (NTRS)

    Murphy, Patrick C.; Brandon, Jay M.

    2017-01-01

    Rapid modeling and efficient testing methods are important in a number of aerospace applications. In this study efficient testing strategies were evaluated in a wind tunnel test environment and combined to suggest a promising approach for both ground-based and flight-based experiments. Benefits of using Design of Experiment techniques, well established in scientific, military, and manufacturing applications are evaluated in combination with newly developing methods for global nonlinear modeling. The nonlinear modeling methods, referred to as Learn-to-Fly methods, utilize fuzzy logic and multivariate orthogonal function techniques that have been successfully demonstrated in flight test. The blended approach presented has a focus on experiment design and identifies a sequential testing process with clearly defined completion metrics that produce increased testing efficiency.

  1. Adaptive Set-Based Methods for Association Testing

    PubMed Central

    Su, Yu-Chen; Gauderman, W. James; Berhane, Kiros; Lewinger, Juan Pablo

    2017-01-01

    With a typical sample size of a few thousand subjects, a single genomewide association study (GWAS) using traditional one-SNP-at-a-time methods can only detect genetic variants conferring a sizable effect on disease risk. Set-based methods, which analyze sets of SNPs jointly, can detect variants with smaller effects acting within a gene, a pathway, or other biologically relevant sets. While self-contained set-based methods (those that test sets of variants without regard to variants not in the set) are generally more powerful than competitive set-based approaches (those that rely on comparison of variants in the set of interest with variants not in the set), there is no consensus as to which self-contained methods are best. In particular, several self-contained set tests have been proposed to directly or indirectly 'adapt' to the a priori unknown proportion and distribution of effects of the truly associated SNPs in the set, which is a major determinant of their power. A popular adaptive set-based test is the adaptive rank truncated product (ARTP), which seeks the set of SNPs that yields the best combined evidence of association. We compared the standard ARTP, several ARTP variations we introduced, and other adaptive methods in a comprehensive simulation study to evaluate their performance. We used permutations to assess significance for all the methods and thus provide a level playing field for comparison. We found the standard ARTP test to have the highest power across our simulations, followed closely by the global model of random effects (GMRE) and a LASSO-based test. PMID:26707371
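
A hedged sketch of the standard ARTP procedure as described in this abstract: a rank truncated product statistic at several candidate truncation points, calibrated by a two-layer use of the same permutations. The candidate k values and the rank-based calibration shortcut are assumptions.

```python
import numpy as np

def artp(p_obs, p_perm, ks=(1, 5, 10)):
    """Adaptive rank truncated product (sketch).
    p_obs: (m,) per-SNP p-values; p_perm: (B, m) p-values recomputed
    under B permutations of the phenotype."""
    def log_rtp(p, k):               # log product of the k smallest p's
        return np.log(np.sort(p)[:k]).sum()

    ks = [k for k in ks if k <= len(p_obs)]
    s_obs = np.array([log_rtp(p_obs, k) for k in ks])
    s_perm = np.array([[log_rtp(row, k) for k in ks] for row in p_perm])
    B = len(p_perm)

    # Layer 1: per-truncation-point p-values (smaller stat = stronger).
    p_k = (1 + (s_perm <= s_obs).sum(axis=0)) / (1 + B)

    # Layer 2: adapt over k, calibrating min_k p_k against each
    # permutation's own minimum (rank-based approximation).
    ranks = s_perm.argsort(axis=0).argsort(axis=0)   # 0 = most extreme
    perm_min = ((1 + ranks) / (1 + B)).min(axis=1)
    return (1 + (perm_min <= p_k.min()).sum()) / (1 + B)
```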

  2. Restricted random search method based on taboo search in the multiple minima problem

    NASA Astrophysics Data System (ADS)

    Hong, Seung Do; Jhon, Mu Shik

    1997-03-01

    The restricted random search method is proposed as a simple Monte Carlo sampling method for quickly finding minima in the multiple-minima problem. The method is based on taboo search, recently applied to continuous test functions. The concept of a taboo region, instead of a taboo list, is used, so sampling of a region near an old configuration is restricted. The method is applied to 2-dimensional test functions and to argon clusters, and is found to be a practical and efficient way to find near-global configurations of both.
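
A minimal sketch of the idea, assuming Euclidean taboo balls of fixed radius around previously sampled configurations; the acceptance rule and radius are assumptions, not the paper's exact parameters.

```python
import numpy as np

def restricted_random_search(f, bounds, n_accept=2000, taboo_r=0.05, seed=0):
    """Random search that rejects candidate points falling inside a
    'taboo' ball of radius taboo_r around any previously accepted
    configuration. O(n^2) overall; taboo_r must be small enough that
    the taboo regions cannot cover the whole search box."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T
    visited = []
    best_x, best_f = None, np.inf
    while len(visited) < n_accept:
        x = rng.uniform(lo, hi)
        if any(np.linalg.norm(x - v) < taboo_r for v in visited):
            continue                  # inside a taboo region: resample
        visited.append(x)
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f

# Example: near-global minimum of the 2-D sphere function on [-5, 5]^2.
# best_x, best_f = restricted_random_search(lambda x: float((x**2).sum()),
#                                           [(-5, 5), (-5, 5)])
```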

  3. Methanogenic activity tests by Infrared Tunable Diode Laser Absorption Spectroscopy.

    PubMed

    Martinez-Cruz, Karla; Sepulveda-Jauregui, Armando; Escobar-Orozco, Nayeli; Thalasso, Frederic

    2012-10-01

    Methanogenic activity (MA) tests are commonly carried out to estimate the capability of anaerobic biomass to treat effluents, to evaluate anaerobic activity in bioreactors or natural ecosystems, or to quantify inhibitory effects on methanogenic activity. These activity tests are usually based on measuring the volume of biogas produced, by volumetric, pressure-increase, or gas chromatography (GC) methods. In this study, we present an alternative method for non-invasive measurement of the methane produced during activity tests in closed vials, based on Infrared Tunable Diode Laser Absorption Spectroscopy (MA-TDLAS). This new method was tested during model acetoclastic and hydrogenotrophic methanogenic activity tests and was compared to a more traditional method based on gas chromatography. From the results obtained, the CH₄ detection limit of the method was estimated at 60 ppm and the minimum measurable methane production rate at 1.09 × 10⁻³ mg l⁻¹ h⁻¹, which is below the CH₄ production rates usually reported in both anaerobic reactors and natural ecosystems. In addition to sensitivity, the method has several potential advantages over more traditional methods, among which are short measurement times allowing a large number of MA test vials to be measured, non-invasive measurements avoiding leakage or external interferences, and a cost similar to GC-based methods. It is concluded that MA-TDLAS is a promising method that could be of interest not only in the field of anaerobic digestion but also in the field of environmental ecology, where CH₄ production rates are usually very low. Copyright © 2012 Elsevier B.V. All rights reserved.

  4. New methods of testing nonlinear hypothesis using iterative NLLS estimator

    NASA Astrophysics Data System (ADS)

    Mahaboob, B.; Venkateswarlu, B.; Mokeshrayalu, G.; Balasiddamuni, P.

    2017-11-01

    This research paper discusses methods of testing nonlinear hypotheses using an iterative nonlinear least squares (NLLS) estimator, as explained by Takeshi Amemiya [1]. In the present paper, a modified Wald test statistic due to Engle [6] is proposed for testing a nonlinear hypothesis using the iterative NLLS estimator. An alternative method based on the iterative NLLS estimator and nonlinear studentized residuals is also proposed, and an innovative method of testing nonlinear hypotheses using an iterative restricted NLLS estimator is derived. Pesaran and Deaton [10] explained methods of testing nonlinear hypotheses. This paper uses the asymptotic properties of the nonlinear least squares estimator established by Jennrich [8]. The main purpose of this paper is to provide innovative methods of testing nonlinear hypotheses using the iterative NLLS estimator, the iterative NLLS estimator based on nonlinear studentized residuals, and the iterative restricted NLLS estimator. Eakambaram et al. [12] discussed least absolute deviation estimation versus nonlinear regression models with heteroscedastic errors and also studied the problem of heteroscedasticity with reference to nonlinear regression models, with suitable illustrations. William Greene [13] examined the interaction effect in nonlinear models discussed by Ai and Norton [14] and suggested ways to examine the effects that do not involve statistical testing. Peter [15] provided guidelines for identifying composite hypotheses and addressing the probability of false rejection for multiple hypotheses.
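
For reference, the textbook Wald statistic for a nonlinear restriction h(β) = 0 evaluated at an NLLS estimate is shown below; this is the standard form, not necessarily the modified variant proposed in the paper.

```latex
W = h(\hat\beta)^{\top}
\left[ H(\hat\beta)\,\widehat{V}(\hat\beta)\,H(\hat\beta)^{\top} \right]^{-1}
h(\hat\beta) \;\xrightarrow{d}\; \chi^{2}_{q},
\qquad
H(\beta) = \frac{\partial h(\beta)}{\partial \beta^{\top}},
```

where q is the number of restrictions and V̂(β̂) is the estimated asymptotic covariance matrix of the NLLS estimator.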

  5. Construction of Expert Knowledge Monitoring and Assessment System Based on Integral Method of Knowledge Evaluation

    ERIC Educational Resources Information Center

    Golovachyova, Viktoriya N.; Menlibekova, Gulbakhyt Zh.; Abayeva, Nella F.; Ten, Tatyana L.; Kogaya, Galina D.

    2016-01-01

    Using computer-based monitoring systems that rely on tests could be the most effective way of knowledge evaluation. The problem of objective knowledge assessment by means of testing takes on a new dimension in the context of new paradigms in education. The analysis of the existing test methods enabled us to conclude that tests with selected…

  6. A comparative laboratory diagnosis of malaria: microscopy versus rapid diagnostic test kits.

    PubMed

    Azikiwe, C C A; Ifezulike, C C; Siminialayi, I M; Amazu, L U; Enye, J C; Nwakwunite, O E

    2012-04-01

    To compare two methods, rapid diagnostic tests (RDTs) and microscopy, in the diagnosis of malaria. RDTs and microscopy were carried out to diagnose malaria. Percentage malaria parasitaemia was calculated on thin films, and all non-acute cases of plasmodiasis with less than 0.001% malaria parasitaemia were regarded as negative. Results were presented simply as the percentage positive of the total number of patients under study. The results of RDTs were compared to those of microscopy, and those of RDTs based on antigen were compared to those of RDTs based on antibody. Follow-up was conducted for all patients. All 200 patients in the present study tested positive with the antibody-based (serum) RDT method (100%). 128 of 200 tested positive with the antigen-based (whole blood) RDT method (64%), while 118 of 200 tested positive by visual microscopy of Leishman- and diluted Giemsa-stained films (59%). All patients who tested positive by microscopy also tested positive with the antigen-based RDTs. On the second day of follow-up, all patients were afebrile and had received antimalarial drugs. We conclude, based on the present study, that the antigen-based (whole blood) RDT method is as specific as traditional microscopy and appears even more sensitive than microscopy. The antibody-based (serum) RDT method is unspecific and should not be encouraged: Africa being an endemic region, it is likely that some level of malaria antibody formation is not uncommon. The present study also supports the opinion that a good number of febrile cases are not due to malaria. We support WHO's report on the cost-effectiveness of RDTs but recommend that only the antigen-based method be adopted in Africa and other malaria-endemic regions of the world.

  7. Evaluation of Alternative Altitude Scaling Methods for Thermal Ice Protection System in NASA Icing Research Tunnel

    NASA Technical Reports Server (NTRS)

    Lee, Sam; Addy, Harold; Broeren, Andy P.; Orchard, David M.

    2017-01-01

    A test was conducted at the NASA Icing Research Tunnel to evaluate altitude scaling methods for a thermal ice protection system. Two scaling methods based on the Weber number were compared against a method based on the Reynolds number. The results generally agreed with the previous set of tests conducted in the NRCC Altitude Icing Wind Tunnel. The Weber number-based scaling methods resulted in smaller runback ice mass than the Reynolds number-based scaling method, and the ice accretions from the Weber number-based scaling methods also formed farther upstream. However, there were large differences in accreted ice mass between the two Weber number-based scaling methods, and the difference became greater as speed increased. This indicates that there may be some Reynolds number effects that aren't fully accounted for, which warrants further study.

  8. Development of a nondestructive leak testing method utilizing the head space analyzer for ampoule products containing ethanol-based solutions.

    PubMed

    Sudo, Hirotaka; O'Driscoll, Michael; Nishiwaki, Kenji; Kawamoto, Yuji; Gammell, Philip; Schramm, Gerhard; Wertli, Toni; Prinz, Heino; Mori, Atsuhide; Sako, Kazuhiro

    2012-01-01

    The application of a head space analyzer for oxygen concentration was examined to develop a novel ampoule leak test method. Studies using ampoules filled with an ethanol-based solution and with nitrogen in the headspace demonstrated that the head space analysis (HSA) method showed sufficient sensitivity in detecting an ampoule crack. The proposed method uses HSA in conjunction with the pretreatment of an overpressurising process, known as bombing, to facilitate oxygen flow through the crack in the ampoule, for use in routine production. The method was examined in comparative studies against a conventional dye ingress method, and the results showed that the HSA method exhibits sensitivity superior to the dye method. The results indicate that the HSA method in combination with the bombing treatment has potential application as a leak test for the detection of container defects, not only for ampoule products with ethanol-based solutions but also for testing lyophilized products in vials with nitrogen in the head space.

  9. Evaluation of the filtration performance of NIOSH-approved N95 filtering facepiece respirators by photometric and number-based test methods.

    PubMed

    Rengasamy, Samy; Miller, Adam; Eimer, Benjamin C

    2011-01-01

    N95 particulate filtering facepiece respirators are certified by measuring penetration levels photometrically with a presumed severe-case test method using charge-neutralized NaCl aerosols at 85 L/min. However, penetration values obtained by photometric methods have not been compared with count-based methods using contemporary respirators composed of electrostatic filter media and challenged with both generated and ambient aerosols. To better understand the effects of key test parameters (e.g., particle charge, detection method), initial penetration levels for five N95 filtering facepiece respirator models were measured using NaCl aerosols with the aerosol challenge and test equipment employed in the NIOSH respirator certification method (photometric) and compared with an ultrafine condensation particle counter method (count based) for the same NaCl aerosols as well as for ambient room air particles. Penetrations using the NIOSH test method were several-fold less than those obtained with the ultrafine condensation particle counter for NaCl aerosols as well as for room particles, indicating that penetration measurement based on particle counting offers a more difficult challenge than the photometric method, which lacks sensitivity for particles < 100 nm. All five N95 models showed a most penetrating particle size (MPPS) around 50 nm for room air particles with or without charge neutralization, and at 200 nm for singly charged monodisperse NaCl particles. Room air, with fewer charged particles and an overwhelming number of neutral particles, produced the MPPS in the 50 nm range, indicating that the charge state of the majority of test particles determines the MPPS. The data suggest that the NIOSH respirator certification protocol employing the photometric method may not be the more challenging aerosol test method. Filter penetration can vary among workplaces with different particle size distributions, which suggests the need for the development of new or revised "more challenging" aerosol test methods for NIOSH certification of respirators.

  10. Equating Scores from Adaptive to Linear Tests

    ERIC Educational Resources Information Center

    van der Linden, Wim J.

    2006-01-01

    Two local methods for observed-score equating are applied to the problem of equating an adaptive test to a linear test. In an empirical study, the methods were evaluated against a method based on the test characteristic function (TCF) of the linear test and traditional equipercentile equating applied to the ability estimates on the adaptive test…
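
    As a minimal illustration of the traditional equipercentile idea the local methods are evaluated against, the sketch below (Python; the smoothing and discrete-score handling of operational equating are omitted) maps each score on one form to the score with the same percentile rank on the other form.

      import numpy as np

      def equipercentile_equate(scores_x, scores_y):
          # For each unique score on form X, find its percentile rank in the
          # X distribution, then take the form-Y score at that same rank.
          xs = np.sort(np.unique(scores_x))
          ranks = np.array([np.mean(scores_x <= x) for x in xs])
          ys = np.quantile(scores_y, ranks)
          return dict(zip(xs, ys))

      # Toy example: equate a 0-centered score scale to a 50-centered one.
      rng = np.random.default_rng(0)
      table = equipercentile_equate(rng.normal(0, 1, 500), rng.normal(50, 10, 500))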

  11. A comparison of interteaching and lecture in the college classroom.

    PubMed

    Saville, Bryan K; Zinn, Tracy E; Neef, Nancy A; Van Norman, Renee; Ferreri, Summer J

    2006-01-01

    Interteaching is a new method of classroom instruction that is based on behavioral principles but offers more flexibility than other behaviorally based methods. We examined the effectiveness of interteaching relative to a traditional form of classroom instruction: the lecture. In Study 1, participants in a graduate course in special education took short quizzes after alternating conditions of interteaching and lecture. Quiz scores following interteaching were higher than quiz scores following lecture, although both methods improved performance relative to pretest measures. In Study 2, we also alternated interteaching and lecture but counterbalanced the conditions across two sections of an undergraduate research methods class. After each unit of information, participants from both sections took the same test. Again, test scores following interteaching were higher than test scores following lecture. In addition, students correctly answered more interteaching-based questions than lecture-based questions on a cumulative final test. In both studies, the majority of students reported a preference for interteaching relative to traditional lecture. In sum, the results suggest that interteaching may be an effective alternative to traditional lecture-based methods of instruction.

  12. Some important considerations in the development of stress corrosion cracking test methods.

    NASA Technical Reports Server (NTRS)

    Wei, R. P.; Novak, S. R.; Williams, D. P.

    1972-01-01

    A discussion of some of the precautions involved in the development of fracture-mechanics-based test methods for studying stress corrosion cracking. Following a review of pertinent analytical fracture mechanics considerations and of basic test methods, the implications for stress corrosion cracking studies of the crack-growth kinetics that determine time to failure and life are examined. It is shown that the basic assumptions of the linear-elastic fracture mechanics analyses must be clearly recognized and satisfied in experimentation, and that the effects of incubation and nonsteady-state crack growth must also be properly taken into account in determining the crack growth kinetics, if valid data are to be obtained from fracture-mechanics-based test methods.

  13. Methods for Equating Mental Tests.

    DTIC Science & Technology

    1984-11-01

    1983) compared conventional and IRT methods for equating the Test of English as a Foreign Language (TOEFL) after chaining. Three conventional and... three IRT equating methods were examined in this study; two sections of TOEFL were each (separately) equated. The IRT methods included the following: (a... group. A separate base form was established for each of the six equating methods. Instead of equating the base-form TOEFL to itself, the last (eighth

  14. Adaptive Set-Based Methods for Association Testing.

    PubMed

    Su, Yu-Chen; Gauderman, William James; Berhane, Kiros; Lewinger, Juan Pablo

    2016-02-01

    With a typical sample size of a few thousand subjects, a single genome-wide association study (GWAS) using traditional one single nucleotide polymorphism (SNP)-at-a-time methods can only detect genetic variants conferring a sizable effect on disease risk. Set-based methods, which analyze sets of SNPs jointly, can detect variants with smaller effects acting within a gene, a pathway, or other biologically relevant sets. Although self-contained set-based methods (those that test sets of variants without regard to variants not in the set) are generally more powerful than competitive set-based approaches (those that rely on comparison of variants in the set of interest with variants not in the set), there is no consensus as to which self-contained methods are best. In particular, several self-contained set tests have been proposed to directly or indirectly "adapt" to the a priori unknown proportion and distribution of effects of the truly associated SNPs in the set, which is a major determinant of their power. A popular adaptive set-based test is the adaptive rank truncated product (ARTP), which seeks the set of SNPs that yields the best-combined evidence of association. We compared the standard ARTP, several ARTP variations we introduced, and other adaptive methods in a comprehensive simulation study to evaluate their performance. We used permutations to assess significance for all the methods and thus provide a level playing field for comparison. We found the standard ARTP test to have the highest power across our simulations followed closely by the global model of random effects (GMRE) and a least absolute shrinkage and selection operator (LASSO)-based test. © 2015 WILEY PERIODICALS, INC.
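
    The core ARTP idea is compact enough to sketch. The code below (Python) combines the k smallest SNP p-values over several candidate truncation points, takes the best-combined evidence, and calibrates it by permutation. It is a simplification: the published ARTP uses a nested permutation scheme and permutes phenotypes against genotypes, whereas this sketch assumes an idealized uniform null and ignores LD among SNPs.

      import numpy as np

      def artp_pvalue(p_snps, truncation_points=(1, 5, 10), n_perm=1000, seed=0):
          rng = np.random.default_rng(seed)
          m = len(p_snps)
          ks = [k for k in truncation_points if k <= m]

          def stat(p):
              s = np.sort(p)
              # best (largest) -log truncated product over candidate k
              return max(-np.log(s[:k]).sum() for k in ks)

          obs = stat(np.asarray(p_snps))
          null = np.array([stat(rng.uniform(size=m)) for _ in range(n_perm)])
          return (1 + np.sum(null >= obs)) / (1 + n_perm)

      print(artp_pvalue([0.001, 0.04, 0.2, 0.5, 0.8, 0.9]))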

  15. Gene-Based Testing of Interactions in Association Studies of Quantitative Traits

    PubMed Central

    Ma, Li; Clark, Andrew G.; Keinan, Alon

    2013-01-01

    Various methods have been developed for identifying gene–gene interactions in genome-wide association studies (GWAS). However, most methods focus on individual markers as the testing unit, and the large number of such tests drastically erodes statistical power. In this study, we propose novel interaction tests of quantitative traits that are gene-based and that confer advantage in both statistical power and biological interpretation. The framework of gene-based gene–gene interaction (GGG) tests combines marker-based interaction tests between all pairs of markers in two genes to produce a gene-level test for interaction between the two. The tests are based on an analytical formula we derive for the correlation between marker-based interaction tests due to linkage disequilibrium. We propose four GGG tests that extend the following P value combining methods: minimum P value, extended Simes procedure, truncated tail strength, and truncated P value product. Extensive simulations point to correct type I error rates of all tests and show that the two truncated tests are more powerful than the other tests in cases of markers involved in the underlying interaction not being directly genotyped and in cases of multiple underlying interactions. We applied our tests to pairs of genes that exhibit a protein–protein interaction to test for gene-level interactions underlying lipid levels using genotype data from the Atherosclerosis Risk in Communities study. We identified five novel interactions that are not evident from marker-based interaction testing and successfully replicated one of these interactions, between SMAD3 and NEDD9, in an independent sample from the Multi-Ethnic Study of Atherosclerosis. We conclude that our GGG tests show improved power to identify gene-level interactions in existing, as well as emerging, association studies. PMID:23468652
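
    Two of the four combining rules are easy to sketch. The snippet below (Python; toy p-values, and the significance of the truncated product would in practice come from the authors' analytical correlation formula or from permutation) shows a minimum-p combination and a truncated p-value product over the pairwise marker-interaction p-values between two genes.

      import numpy as np

      def combine_minp(p):
          # Minimum p with a Bonferroni-style adjustment for the number of pairs.
          p = np.asarray(p, dtype=float)
          return min(1.0, p.size * p.min())

      def truncated_product_stat(p, tau=0.05):
          # -log product of the p-values below the truncation threshold tau.
          p = np.asarray(p, dtype=float)
          kept = p[p < tau]
          return -np.log(kept).sum() if kept.size else 0.0

      pairwise_p = [0.20, 0.01, 0.50, 0.04, 0.80]   # toy marker-pair p-values
      print(combine_minp(pairwise_p), truncated_product_stat(pairwise_p))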

  16. Properties of a Formal Method to Model Emergence in Swarm-Based Systems

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Vanderbilt, Amy; Truszkowski, Walt; Rash, James; Hinchey, Mike

    2004-01-01

    Future space missions will require cooperation between multiple satellites and/or rovers. Developers are proposing intelligent autonomous swarms for these missions, but swarm-based systems are difficult or impossible to test with current techniques. This viewgraph presentation examines the use of formal methods in testing swarm-based systems. The potential usefulness of formal methods in modeling the ANTS asteroid encounter mission is also examined.

  17. Meeting Report: Validation of Toxicogenomics-Based Test Systems: ECVAM–ICCVAM/NICEATM Considerations for Regulatory Use

    PubMed Central

    Corvi, Raffaella; Ahr, Hans-Jürgen; Albertini, Silvio; Blakey, David H.; Clerici, Libero; Coecke, Sandra; Douglas, George R.; Gribaldo, Laura; Groten, John P.; Haase, Bernd; Hamernik, Karen; Hartung, Thomas; Inoue, Tohru; Indans, Ian; Maurici, Daniela; Orphanides, George; Rembges, Diana; Sansone, Susanna-Assunta; Snape, Jason R.; Toda, Eisaku; Tong, Weida; van Delft, Joost H.; Weis, Brenda; Schechtman, Leonard M.

    2006-01-01

    This is the report of the first workshop “Validation of Toxicogenomics-Based Test Systems” held 11–12 December 2003 in Ispra, Italy. The workshop was hosted by the European Centre for the Validation of Alternative Methods (ECVAM) and organized jointly by ECVAM, the U.S. Interagency Coordinating Committee on the Validation of Alternative Methods (ICCVAM), and the National Toxicology Program (NTP) Interagency Center for the Evaluation of Alternative Toxicological Methods (NICEATM). The primary aim of the workshop was for participants to discuss and define principles applicable to the validation of toxicogenomics platforms as well as validation of specific toxicologic test methods that incorporate toxicogenomics technologies. The workshop was viewed as an opportunity for initiating a dialogue between technologic experts, regulators, and the principal validation bodies and for identifying those factors to which the validation process would be applicable. It was felt that to do so now, as the technology is evolving and associated challenges are identified, would be a basis for the future validation of the technology when it reaches the appropriate stage. Because of the complexity of the issue, different aspects of the validation of toxicogenomics-based test methods were covered. The three focus areas include a) biologic validation of toxicogenomics-based test methods for regulatory decision making, b) technical and bioinformatics aspects related to validation, and c) validation issues as they relate to regulatory acceptance and use of toxicogenomics-based test methods. In this report we summarize the discussions and describe in detail the recommendations for future direction and priorities. PMID:16507466

  18. Meeting report: Validation of toxicogenomics-based test systems: ECVAM-ICCVAM/NICEATM considerations for regulatory use.

    PubMed

    Corvi, Raffaella; Ahr, Hans-Jürgen; Albertini, Silvio; Blakey, David H; Clerici, Libero; Coecke, Sandra; Douglas, George R; Gribaldo, Laura; Groten, John P; Haase, Bernd; Hamernik, Karen; Hartung, Thomas; Inoue, Tohru; Indans, Ian; Maurici, Daniela; Orphanides, George; Rembges, Diana; Sansone, Susanna-Assunta; Snape, Jason R; Toda, Eisaku; Tong, Weida; van Delft, Joost H; Weis, Brenda; Schechtman, Leonard M

    2006-03-01

    This is the report of the first workshop "Validation of Toxicogenomics-Based Test Systems" held 11-12 December 2003 in Ispra, Italy. The workshop was hosted by the European Centre for the Validation of Alternative Methods (ECVAM) and organized jointly by ECVAM, the U.S. Interagency Coordinating Committee on the Validation of Alternative Methods (ICCVAM), and the National Toxicology Program (NTP) Interagency Center for the Evaluation of Alternative Toxicological Methods (NICEATM). The primary aim of the workshop was for participants to discuss and define principles applicable to the validation of toxicogenomics platforms as well as validation of specific toxicologic test methods that incorporate toxicogenomics technologies. The workshop was viewed as an opportunity for initiating a dialogue between technologic experts, regulators, and the principal validation bodies and for identifying those factors to which the validation process would be applicable. It was felt that to do so now, as the technology is evolving and associated challenges are identified, would be a basis for the future validation of the technology when it reaches the appropriate stage. Because of the complexity of the issue, different aspects of the validation of toxicogenomics-based test methods were covered. The three focus areas include a) biologic validation of toxicogenomics-based test methods for regulatory decision making, b) technical and bioinformatics aspects related to validation, and c) validation issues as they relate to regulatory acceptance and use of toxicogenomics-based test methods. In this report we summarize the discussions and describe in detail the recommendations for future direction and priorities.

  19. Intelligent Evaluation Method of Tank Bottom Corrosion Status Based on Improved BP Artificial Neural Network

    NASA Astrophysics Data System (ADS)

    Qiu, Feng; Dai, Guang; Zhang, Ying

    According to the acoustic emission information and the appearance inspection information from tank bottom online testing, the external factors associated with tank bottom corrosion status are confirmed. Applying an artificial neural network intelligent evaluation method, three tank bottom corrosion status evaluation models, based on appearance inspection information, acoustic emission information, and online testing information respectively, are established. Evaluated on test samples against the results of acoustic emission online testing, the accuracy of the evaluation model based on online testing information is 94%. The evaluation model can evaluate tank bottom corrosion accurately and realizes intelligent evaluation of acoustic emission online testing of the tank bottom.

  20. The Robustness of IRT-Based Vertical Scaling Methods to Violation of Unidimensionality

    ERIC Educational Resources Information Center

    Yin, Liqun

    2013-01-01

    In recent years, many states have adopted Item Response Theory (IRT) based vertically scaled tests due to their compelling features in a growth-based accountability context. However, selection of a practical and effective calibration/scaling method and proper understanding of issues with possible multidimensionality in the test data is critical to…

  1. Correlation of Simulation Examination to Written Test Scores for Advanced Cardiac Life Support Testing: Prospective Cohort Study.

    PubMed

    Strom, Suzanne L; Anderson, Craig L; Yang, Luanna; Canales, Cecilia; Amin, Alpesh; Lotfipour, Shahram; McCoy, C Eric; Osborn, Megan Boysen; Langdorf, Mark I

    2015-11-01

    Traditional Advanced Cardiac Life Support (ACLS) courses are evaluated using written multiple-choice tests. High-fidelity simulation is a widely used adjunct to didactic content, and has been used in many specialties as a training resource as well as an evaluative tool. There are no data to our knowledge that compare simulation examination scores with written test scores for ACLS courses. To compare and correlate a novel high-fidelity simulation-based evaluation with traditional written testing for senior medical students in an ACLS course. We performed a prospective cohort study to determine the correlation between simulation-based evaluation and traditional written testing in a medical school simulation center. Students were tested on a standard acute coronary syndrome/ventricular fibrillation cardiac arrest scenario. Our primary outcome measure was correlation of exam results for 19 volunteer fourth-year medical students after a 32-hour ACLS-based Resuscitation Boot Camp course. Our secondary outcome was comparison of simulation-based vs. written outcome scores. The composite average score on the written evaluation was substantially higher (93.6%) than the simulation performance score (81.3%, absolute difference 12.3%, 95% CI [10.6-14.0%], p<0.00005). We found a statistically significant moderate correlation between simulation scenario test performance and traditional written testing (Pearson r=0.48, p=0.04), validating the new evaluation method. Simulation-based ACLS evaluation methods correlate with traditional written testing and demonstrate resuscitation knowledge and skills. Simulation may be a more discriminating and challenging testing method, as students scored higher on written evaluation methods compared to simulation.

  2. Comparison of Several Methods of Predicting the Pressure Loss at Altitude Across a Baffled Aircraft-Engine Cylinder

    NASA Technical Reports Server (NTRS)

    Neustein, Joseph; Schafer, Louis J., Jr.

    1946-01-01

    Several methods of predicting the compressible-flow pressure loss across a baffled aircraft-engine cylinder were analytically related and were experimentally investigated on a typical air-cooled aircraft-engine cylinder. Tests with and without heat transfer covered a wide range of cooling-air flows and simulated altitudes from sea level to 40,000 feet. Both the analysis and the test results showed that the method based on the density determined by the static pressure and the stagnation temperature at the baffle exit gave results comparable with those obtained from methods derived by one-dimensional-flow theory. The method based on a characteristic Mach number, although related analytically to one-dimensional-flow theory, was found impractical in the present tests because of the difficulty encountered in defining the proper characteristic state of the cooling air. Accurate predictions of altitude pressure loss can apparently be made by these methods, provided that they are based on the results of sea-level tests with heat transfer.

  3. The Objective Borderline Method: A Probabilistic Method for Standard Setting

    ERIC Educational Resources Information Center

    Shulruf, Boaz; Poole, Phillippa; Jones, Philip; Wilkinson, Tim

    2015-01-01

    A new probability-based standard setting technique, the Objective Borderline Method (OBM), was introduced recently. This was based on a mathematical model of how test scores relate to student ability. The present study refined the model and tested it using 2500 simulated data-sets. The OBM was feasible to use. On average, the OBM performed well…

  4. Attitude algorithm and initial alignment method for SINS applied in short-range aircraft

    NASA Astrophysics Data System (ADS)

    Zhang, Rong-Hui; He, Zhao-Cheng; You, Feng; Chen, Bo

    2017-07-01

    This paper presents an attitude solution algorithm based on a Micro-Electro-Mechanical System inertial sensor and the quaternion method. The numerical calculation and engineering implementation were completed by adopting the fourth-order Runge-Kutta algorithm on a digital signal processor. The state space mathematical model of initial alignment on a static base was established, and an initial alignment method based on the Kalman filter was proposed. Using a hardware-in-the-loop simulation platform, a short-range flight simulation test and an actual flight test were carried out. The results show that the errors in pitch, yaw and roll angle converge quickly, and the agreement between flight simulation and flight test is more than 85%.
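
    The attitude-update core of such an algorithm is the quaternion kinematic equation integrated with fourth-order Runge-Kutta. The sketch below (Python with NumPy rather than DSP code; the constant-rate-over-step assumption and the sample values are ours) shows one update step.

      import numpy as np

      def quat_deriv(q, w):
          # Quaternion kinematics: q_dot = 0.5 * q (x) [0, w], scalar-first.
          qw, qx, qy, qz = q
          wx, wy, wz = w
          return 0.5 * np.array([
              -qx*wx - qy*wy - qz*wz,
               qw*wx + qy*wz - qz*wy,
               qw*wy - qx*wz + qz*wx,
               qw*wz + qx*wy - qy*wx,
          ])

      def rk4_attitude_step(q, w, dt):
          # One fourth-order Runge-Kutta step; gyro rate w held constant.
          k1 = quat_deriv(q, w)
          k2 = quat_deriv(q + 0.5*dt*k1, w)
          k3 = quat_deriv(q + 0.5*dt*k2, w)
          k4 = quat_deriv(q + dt*k3, w)
          q = q + dt/6.0 * (k1 + 2*k2 + 2*k3 + k4)
          return q / np.linalg.norm(q)    # renormalize to a unit quaternion

      q = np.array([1.0, 0.0, 0.0, 0.0])                          # initial attitude
      q = rk4_attitude_step(q, np.array([0.0, 0.0, 0.1]), 0.01)   # 0.1 rad/s yaw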

  5. Hybrid Residual Flexibility/Mass-Additive Method for Structural Dynamic Testing

    NASA Technical Reports Server (NTRS)

    Tinker, M. L.

    2003-01-01

    A large fixture was designed and constructed for modal vibration testing of International Space Station elements. This fixed-base test fixture, which weighs thousands of pounds and is anchored to a massive concrete floor, initially utilized spherical bearings and pendulum mechanisms to simulate Shuttle orbiter boundary constraints for launch of the hardware. Many difficulties were encountered during a checkout test of the common module prototype structure, mainly due to undesirable friction and excessive clearances in the test-article-to-fixture interface bearings. Measured mode shapes and frequencies were not representative of orbiter-constrained modes due to the friction and clearance effects in the bearings. As a result, a major redesign effort for the interface mechanisms was undertaken. The total cost of the fixture design, construction and checkout, and redesign was over $2 million. Because of the problems experienced with fixed-base testing, alternative free-suspension methods were studied, including the residual flexibility and mass-additive approaches. Free-suspension structural dynamics test methods utilize soft elastic bungee cords and overhead frame suspension systems that are less complex and much less expensive than fixed-base systems. The cost of free-suspension fixturing is on the order of tens of thousands of dollars as opposed to millions, for large fixed-base fixturing. In addition, free-suspension test configurations are portable, allowing modal tests to be done at sites without modal test facilities. For example, a mass-additive modal test of the ASTRO-1 Shuttle payload was done at the Kennedy Space Center launch site. In this Technical Memorandum, the mass-additive and residual flexibility test methods are described in detail. A discussion of a hybrid approach that combines the best characteristics of each method follows and is the focus of the study.

  6. On sine dwell or broadband methods for modal testing

    NASA Technical Reports Server (NTRS)

    Chen, Jay-Chung; Wada, Ben K.

    1987-01-01

    For large, complex spacecraft structural systems, the objectives of the modal test are outlined. Based on these objectives, comparison criteria for the modal test methods, namely the broadband excitation and the sine dwell methods, are established. Using the Galileo spacecraft modal test and the Centaur G Prime upper stage vehicle modal test as examples, the relative advantages and disadvantages of each method are examined. The usefulness and shortcomings of the methods are given from a practicing engineer's viewpoint.

  7. Development of a Contact Permeation Test Fixture and Method

    DTIC Science & Technology

    2013-04-01

    direct contact with the skin, indicates the need for a quantitative contact test method. Comparison tests were conducted with VX on a standardized... Guide for the Care and Use of Laboratory Animals (8th ed.; National Research Council: Washington, DC, 2011). This test was also performed in...

  8. Comparison of 3 in vivo methods for assessment of alcohol-based hand rubs.

    PubMed

    Edmonds-Wilson, Sarah; Campbell, Esther; Fox, Kyle; Macinga, David

    2015-05-01

    Alcohol-based hand rubs (ABHRs) are the primary method of hand hygiene in health-care settings. ICPs increasingly are assessing ABHR product efficacy data as improved products and test methods are developed. As a result, ICPs need better tools and recommendations for how to assess and compare ABHRs. Two ABHRs (70% ethanol) were tested according to 3 in vivo methods approved by ASTM International: E1174, E2755, and E2784. Log10 reductions were measured after a single test product use and after 10 consecutive uses at an application volume of 2 mL. The test method used had a significant influence on ABHR efficacy; however, in this study the test product (gel or foam) did not significantly influence efficacy. In addition, for all test methods, log10 reductions obtained after a single application were not predictive of results after 10 applications. Choice of test method can significantly influence efficacy results. Therefore, when assessing antimicrobial efficacy data of hand hygiene products, ICPs should pay close attention to the test method used, and ensure that product comparisons are made head to head in the same study using the same test methodology. Copyright © 2015 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.

  9. Detection of Test Collusion via Kullback-Leibler Divergence

    ERIC Educational Resources Information Center

    Belov, Dmitry I.

    2013-01-01

    The development of statistical methods for detecting test collusion is a new research direction in the area of test security. Test collusion may be described as large-scale sharing of test materials, including answers to test items. Current methods of detecting test collusion are based on statistics also used in answer-copying detection.…

  10. Advantages and limitations of common testing methods for antioxidants.

    PubMed

    Amorati, R; Valgimigli, L

    2015-05-01

    Owing to the importance of antioxidants in the protection of both natural and man-made materials, a large variety of testing methods have been proposed and applied. These include methods based on inhibited autoxidation studies, which are better followed by monitoring the kinetics of oxygen consumption or of the formation of hydroperoxides, the primary oxidation products. Analytical determination of secondary oxidation products (e.g. carbonyl compounds) has also been used. The majority of testing methods, however, do not involve substrate autoxidation. They are based on the competitive bleaching of a probe (e.g. ORAC assay, β-carotene, crocin bleaching assays, and luminol assay), on reaction with a different probe (e.g. spin-trapping and TOSC assay), or they are indirect methods based on the reduction of persistent radicals (e.g. galvinoxyl, DPPH and TEAC assays), or of inorganic oxidizing species (e.g. FRAP, CUPRAC and Folin-Ciocalteu assays). Yet other methods are specific for preventive antioxidants. The relevance, advantages, and limitations of these methods are critically discussed, with respect to their chemistry and the mechanisms of antioxidant activity. A variety of cell-based assays have also been proposed, to investigate the biological activity of antioxidants. Their importance and critical aspects are discussed, along with arguments for the selection of the appropriate testing methods according to the different needs.

  11. A Model-Based Method for Content Validation of Automatically Generated Test Items

    ERIC Educational Resources Information Center

    Zhang, Xinxin; Gierl, Mark

    2016-01-01

    The purpose of this study is to describe a methodology to recover the item model used to generate multiple-choice test items with a novel graph theory approach. Beginning with the generated test items and working backward to recover the original item model provides a model-based method for validating the content used to automatically generate test…

  12. Large-Scale Multiobjective Static Test Generation for Web-Based Testing with Integer Programming

    ERIC Educational Resources Information Center

    Nguyen, M. L.; Hui, Siu Cheung; Fong, A. C. M.

    2013-01-01

    Web-based testing has become a ubiquitous self-assessment method for online learning. One useful feature that is missing from today's web-based testing systems is the reliable capability to fulfill different assessment requirements of students based on a large-scale question data set. A promising approach for supporting large-scale web-based…

  13. An Improved SoC Test Scheduling Method Based on Simulated Annealing Algorithm

    NASA Astrophysics Data System (ADS)

    Zheng, Jingjing; Shen, Zhihang; Gao, Huaien; Chen, Bianna; Zheng, Weida; Xiong, Xiaoming

    2017-02-01

    In this paper, we propose an improved SoC test scheduling method based on the simulated annealing algorithm (SA). The method first perturbs the IP core assignment of each test access mechanism (TAM) to produce a new candidate solution, then allocates the TAM width for each TAM using a greedy algorithm and calculates the corresponding testing time. Core assignments are accepted according to the simulated annealing acceptance criterion until the optimum solution is attained. We ran the test scheduling experiment on the international reference circuits provided by the International Test Conference 2002 (ITC'02); the results show that our algorithm is superior to the conventional integer linear programming algorithm (ILP), simulated annealing algorithm (SA) and genetic algorithm (GA). When the TAM width reaches 48, 56 and 64, the testing time based on our algorithm is less than that of the classic methods, with optimization rates of 30.74%, 3.32% and 16.13%, respectively. Moreover, the testing time based on our algorithm is very close to that of the improved genetic algorithm (IGA), which is the current state of the art.
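
    The acceptance loop described here follows the standard simulated annealing template. The sketch below (Python) minimizes the longest per-TAM testing time over random core-to-TAM reassignments; the paper's greedy TAM-width allocation is abstracted away by treating each core's test time as fixed, and all parameter values are illustrative.

      import math, random

      def sa_schedule(test_times, n_tams, n_iter=20000, t0=1000.0, alpha=0.999, seed=1):
          rng = random.Random(seed)
          assign = [rng.randrange(n_tams) for _ in test_times]

          def makespan(a):
              loads = [0.0] * n_tams
              for core, tam in enumerate(a):
                  loads[tam] += test_times[core]
              return max(loads)    # overall testing time = busiest TAM

          cur = makespan(assign)
          best, best_assign, temp = cur, assign[:], t0
          for _ in range(n_iter):
              core = rng.randrange(len(test_times))
              old = assign[core]
              assign[core] = rng.randrange(n_tams)     # perturb the assignment
              new = makespan(assign)
              # accept improvements always, worse moves with Boltzmann probability
              if new <= cur or rng.random() < math.exp((cur - new) / temp):
                  cur = new
                  if cur < best:
                      best, best_assign = cur, assign[:]
              else:
                  assign[core] = old                   # revert a rejected move
              temp *= alpha
          return best, best_assign

      best_time, _ = sa_schedule([12, 7, 30, 9, 18, 4], n_tams=3)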

  14. Poisson Approximation-Based Score Test for Detecting Association of Rare Variants.

    PubMed

    Fang, Hongyan; Zhang, Hong; Yang, Yaning

    2016-07-01

    Genome-wide association study (GWAS) has achieved great success in identifying genetic variants, but the nature of GWAS determines its inherent limitations. Under the common disease rare variants (CDRV) hypothesis, the traditional association analysis methods commonly used in GWAS for common variants do not have enough power for detecting rare variants with a limited sample size. As a solution to this problem, pooling rare variants by their functions provides an efficient way of identifying susceptible genes. Rare variants typically have low minor allele frequencies, and the distribution of the total number of minor alleles of the rare variants can be approximated by a Poisson distribution. Based on this fact, we propose a new test method, the Poisson Approximation-based Score Test (PAST), for association analysis of rare variants. Two testing methods, namely ePAST and mPAST, are proposed based on different strategies of pooling rare variants. Simulation results and application to the CRESCENDO cohort data show that our methods are more powerful than the existing methods. © 2016 John Wiley & Sons Ltd/University College London.
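
    To make the Poisson-approximation idea concrete, the sketch below (Python with SciPy) compares the pooled minor-allele count in cases against its null expectation, conditioning on the combined count. It is a generic construction in the spirit of the approach, not the published PAST/ePAST/mPAST statistics.

      from scipy.stats import norm

      def poisson_score_sketch(case_count, ctrl_count, n_case, n_ctrl):
          # Under H0, the total count splits between cases and controls in
          # proportion to sample size (Poisson counts split binomially).
          total = case_count + ctrl_count
          frac = n_case / (n_case + n_ctrl)
          expected = total * frac
          var = total * frac * (1 - frac)
          z = (case_count - expected) / var**0.5
          return z, 2 * norm.sf(abs(z))    # two-sided normal p-value

      z, p = poisson_score_sketch(case_count=18, ctrl_count=7, n_case=500, n_ctrl=500)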

  15. Design and Test of Pseudorandom Number Generator Using a Star Network of Lorenz Oscillators

    NASA Astrophysics Data System (ADS)

    Cho, Kenichiro; Miyano, Takaya

    We have recently developed a chaos-based stream cipher based on augmented Lorenz equations as a star network of Lorenz subsystems. In our method, the augmented Lorenz equations are used as a pseudorandom number generator. In this study, we propose a new method based on the augmented Lorenz equations for generating binary pseudorandom numbers and evaluate its security using the statistical tests of SP800-22 published by the National Institute for Standards and Technology in comparison with the performances of other chaotic dynamical models used as binary pseudorandom number generators. We further propose a faster version of the proposed method and evaluate its security using the statistical tests of TestU01 published by L’Ecuyer and Simard.
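
    A minimal chaos-based bit generator can be sketched with the classical Lorenz equations (the paper's augmented star-network system and its cryptographic post-processing are not reproduced here; the Euler step, sampling stride, and thresholding rule are all assumptions, and raw bits like these would need whitening before they could hope to pass SP800-22 or TestU01).

      import numpy as np

      def lorenz_bits(n_bits, state=(1.0, 1.0, 1.0), sigma=10.0, rho=28.0,
                      beta=8.0/3.0, dt=0.001, stride=10):
          x, y, z = state
          bits = np.empty(n_bits, dtype=np.uint8)
          for i in range(n_bits):
              for _ in range(stride):          # advance several steps per bit
                  dx = sigma * (y - x)
                  dy = x * (rho - z) - y
                  dz = x * y - beta * z
                  x, y, z = x + dt*dx, y + dt*dy, z + dt*dz
              bits[i] = 1 if x > 0.0 else 0    # threshold one state variable
          return bits

      stream = lorenz_bits(1024)   # candidate input for statistical test suites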

  16. Porosimetric, Thermal and Strength Tests of Aerated and Nonaerated Concretes

    NASA Astrophysics Data System (ADS)

    Strzałkowski, Jarosław; Garbalińska, Halina

    2017-10-01

    The paper presents the results of porosimetry tests of lightweight concretes obtained with three research methods, and evaluates the impact of different pore structures on basic thermal and strength properties. Tests were performed using the pressure gauge method on fresh concrete mixes, as well as mercury porosimetry and the optical RapidAir method on specimens prepared from mature composites. The study covered lightweight concretes based on expanded clay aggregate and fly ash aggregate, each in two variants: with a non-aerated and with an aerated cement matrix. In addition, two reference concretes based on normal aggregate were prepared, also in two variants of matrix aeration. Changes in thermal conductivity λ and volumetric specific heat cv throughout the first three months of curing were examined, and compressive strength was tested on cubic samples over the same period. It was found that the pressure gauge method, performed on a fresh mix, gave lower porosity values than the other methods. The mercury porosimetry tests showed high sensitivity in evaluating pores smaller than 30 μm, but the technique is not suitable for analysing pores greater than 300 μm; the optical method, on the other hand, proves good at evaluating large pores greater than 300 μm. The paper also presents correlations between the individual porosity testing methods, together with a consolidated graph of the pore structure derived from both the mercury and optical methods. For all six tested concretes, differential porosity graphs prepared with the two methods show very broad agreement. The thermal test results indicate the usefulness of aerating the cement matrix of composites based on lightweight aggregates for further reducing the thermal conductivity coefficient λ; the lowest λ values were obtained for the aerated concretes based on fly ash aggregate. A diminishing influence of aeration on the volumetric heat capacity cv is clearly seen. Simultaneous aeration of the matrix and use of lightweight aggregates also brought about a significant decrease in the average compressive strength fcm of the tested composites.

  17. Real-time cartesian force feedback control of a teleoperated robot

    NASA Technical Reports Server (NTRS)

    Campbell, Perry

    1989-01-01

    Active Cartesian force control of a teleoperated robot is investigated. An economical microcomputer-based control method was tested. Limitations are discussed and methods of performance improvement are suggested. To demonstrate the performance of this technique, a preliminary test was performed successfully. A general-purpose bilateral force-reflecting hand controller is currently being constructed based on this control method.

  18. Catch-up validation study of an in vitro skin irritation test method based on an open source reconstructed epidermis (phase II).

    PubMed

    Groeber, F; Schober, L; Schmid, F F; Traube, A; Kolbus-Hernandez, S; Daton, K; Hoffmann, S; Petersohn, D; Schäfer-Korting, M; Walles, H; Mewes, K R

    2016-10-01

    To replace the Draize skin irritation assay (OECD guideline 404), several test methods based on reconstructed human epidermis (RHE) have been developed and adopted in OECD test guideline 439. However, all validated test methods in the guideline are linked to RHE provided by only three companies, so the availability of these test models depends on the commercial interest of the producers. To overcome this limitation and thus increase the accessibility of in vitro skin irritation testing, an open source reconstructed epidermis (OS-REp) was introduced. To demonstrate the capacity of the OS-REp in regulatory risk assessment, a catch-up validation study was performed. The participating laboratories used in-house generated OS-REp to assess the set of 20 reference substances according to the performance standards amending OECD test guideline 439. Testing was performed under blinded conditions. The within-laboratory reproducibility of 87% and the inter-laboratory reproducibility of 85% prove a high reliability of irritancy testing using the OS-REp protocol. In addition, the prediction capacity was, at an accuracy of 80%, comparable to previously published RHE-based test protocols. Taken together, the results indicate that the OS-REp test method can be used as a standalone alternative skin irritation test replacing OECD test guideline 404. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  19. Using Patterns of Summed Scores in Paper-and-Pencil Tests and Computer-Adaptive Tests to Detect Misfitting Item Score Patterns

    ERIC Educational Resources Information Center

    Meijer, Rob R.

    2004-01-01

    Two new methods have been proposed to determine unexpected sum scores on sub-tests (testlets) both for paper-and-pencil tests and computer adaptive tests. A method based on a conservative bound using the hypergeometric distribution, denoted p, was compared with a method where the probability for each score combination was calculated using a…

  20. Specific Yields Estimated from Gravity Change during Pumping Test

    NASA Astrophysics Data System (ADS)

    Chen, K. H.; Hwang, C.; Chang, L. C.

    2017-12-01

    Specific yield (Sy) is the most important parameter for describing available groundwater capacity in an unconfined aquifer. When estimating Sy by a field pumping test, aquifer heterogeneity and well performance cause large uncertainty. In this study, we use a gravity-based method to estimate Sy. During a pumping test, a known mass of groundwater is withdrawn. If the drawdown cone is large and close enough to a high-precision gravimeter, the gravity change can be detected. The gravity-based method uses gravity observations that are independent of traditional flow computations; only the drawdown cone needs to be modeled with observed head and hydrogeology data. The gravity method can be used in most groundwater field tests, such as local pumping/injection tests initiated by man-made activity or annual variations due to natural sources. We apply our gravity method at a few sites in Taiwan situated over different unconfined aquifers, where pumping tests for Sy determination were also carried out. We will discuss why the gravity method produces results different from the traditional pumping test, field designs, and limitations of the gravity method.
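
    A first-order version of the estimate reduces to a one-line formula. If the drawdown cone is broad relative to the gravimeter's height above it, the dewatered layer can be approximated as an infinite (Bouguer) slab, giving delta_g = 2*pi*G*rho_w*Sy*delta_h, about 41.9 microGal per metre of free-standing water. The sketch below (Python; the slab assumption and the example numbers are ours, not the study's) inverts this relation for Sy.

      import math

      G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
      RHO_W = 1000.0     # density of water, kg/m^3

      def specific_yield(delta_g_ugal, delta_h_m):
          # Invert the Bouguer-slab relation delta_g = 2*pi*G*rho_w*Sy*delta_h.
          delta_g = delta_g_ugal * 1e-8          # 1 microGal = 1e-8 m/s^2
          return delta_g / (2 * math.pi * G * RHO_W * delta_h_m)

      # e.g. a 10 microGal gravity decrease accompanying 1.5 m of drawdown
      print(specific_yield(10.0, 1.5))           # ~0.16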

  1. Fatigue crack identification method based on strain amplitude changing

    NASA Astrophysics Data System (ADS)

    Guo, Tiancai; Gao, Jun; Wang, Yonghong; Xu, Youliang

    2017-09-01

    Aiming at the difficulty of identifying the location and time of crack initiation in helicopter transmission system castings during fatigue tests, and by introducing classification diagnostic criteria for similar failure modes to capture the similarity of fatigue crack initiation among castings, an engineering method and quantitative criterion for detecting fatigue cracks based on strain amplitude change are proposed. The method was applied in the fatigue test of a gearbox housing, with the following result: during the fatigue test, the system alarmed when the SC strain gauge reached the quantitative criterion, and subsequent inspection found a fatigue crack less than 5 mm long at the corresponding location. The test result proves that the method can provide accurate test data for strength and life analysis.
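
    In monitoring terms, such a criterion amounts to alarming when a gauge's strain amplitude departs from its healthy baseline by more than a preset fraction. The sketch below (Python; the 10% threshold and window length are illustrative choices, not the paper's values) shows this pattern.

      import numpy as np

      def crack_alarm(amplitudes, window=50, change_fraction=0.10):
          # Baseline from early, presumed-healthy cycles; alarm on the first
          # cycle whose amplitude deviates by more than change_fraction.
          a = np.asarray(amplitudes, dtype=float)
          baseline = np.median(a[:window])
          for i in range(window, a.size):
              if abs(a[i] - baseline) / baseline >= change_fraction:
                  return i         # cycle index at which the system alarms
          return None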

  2. Transmitted wavefront testing with large dynamic range based on computer-aided deflectometry

    NASA Astrophysics Data System (ADS)

    Wang, Daodang; Xu, Ping; Gong, Zhidong; Xie, Zhongmin; Liang, Rongguang; Xu, Xinke; Kong, Ming; Zhao, Jun

    2018-06-01

    Transmitted wavefront testing is needed for the performance evaluation of transmission optics and transparent glass, and the achievable dynamic range is a key issue. A computer-aided deflectometric testing method with fringe projection is proposed for the accurate testing of transmitted wavefronts over a large dynamic range. Ray tracing of the modeled testing system is carried out to achieve virtual ‘null’ testing of transmitted wavefront aberrations. The ray aberration is obtained from the ray-tracing result and the measured slope, from which the test wavefront aberration can be reconstructed. To eliminate testing-system modeling errors, a system geometry calibration based on computer-aided reverse optimization is applied to realize accurate testing. Both numerical simulation and experiments have been carried out to demonstrate the feasibility and high accuracy of the proposed testing method. Compared with the interferometric method, the proposed method achieves a large dynamic range, providing a simple, low-cost and accurate way to test transmitted wavefronts from various kinds of optics and a large number of industrial transmission elements.

  3. Electric vehicle chassis dynamometer test methods at JPL and their correlation to track tests

    NASA Technical Reports Server (NTRS)

    Marte, J.; Bryant, J.

    1983-01-01

    Early in its electric vehicle (EV) test program, JPL recognized that EV test procedures were too vague and too loosely defined to permit much meaningful data to be obtained from the testing. Therefore, JPL adopted more stringent test procedures and chose the chassis dynamometer rather than the track as its principal test technique. Through the years, test procedures continued to evolve towards a methodology based on chassis dynamometers which would exhibit good correlation with track testing. Based on comparative dynamometer and track test results on the ETV-1 vehicle, the test methods discussed in this report demonstrate a means by which excellent track-to-dynamometer correlation can be obtained.

  4. Improving lab compaction specifications for flexible bases within the Texas DOT.

    DOT National Transportation Integrated Search

    2009-04-01

    In Test Methods Tex-113-E and Tex-114-E, the Texas Department of Transportation (TxDOT) employs an impact hammer method of sample compaction for laboratory preparation of road base and subgrade materials for testing. In this third and final report do...

  5. Pixel-based absolute surface metrology by three flat test with shifted and rotated maps

    NASA Astrophysics Data System (ADS)

    Zhai, Dede; Chen, Shanyong; Xue, Shuai; Yin, Ziqiang

    2018-03-01

    The traditional three flat test only provides the absolute profile along one surface diameter. In this paper, an absolute testing algorithm based on shifted and rotated maps in the three flat test is proposed to reconstruct the two-dimensional surface exactly. Pitch and yaw errors during the shift procedure are analyzed and compensated in our method. Compared with the multi-rotation method proposed before, it only needs a 90° rotation and a shift, which is easy to carry out, especially for large surfaces. It achieves pixel-level spatial resolution without interpolation or assumptions about the test surface. In addition, numerical simulations and optical tests are implemented and show the high-accuracy recovery capability of the proposed method.

  6. NEAT: an efficient network enrichment analysis test.

    PubMed

    Signorelli, Mirko; Vinciotti, Veronica; Wit, Ernst C

    2016-09-05

    Network enrichment analysis is a powerful method that integrates gene enrichment analysis with the information on relationships between genes provided by gene networks. Existing tests for network enrichment analysis deal only with undirected networks, can be computationally slow, and are based on normality assumptions. We propose NEAT, a test for network enrichment analysis. The test is based on the hypergeometric distribution, which naturally arises as the null distribution in this context. NEAT can be applied not only to undirected, but also to directed and partially directed networks. Our simulations indicate that NEAT is considerably faster than alternative resampling-based methods, and that its capacity to detect enrichments is at least as good as that of alternative tests. We discuss applications of NEAT to network analyses in yeast by testing for enrichment of the Environmental Stress Response target gene set with GO Slim and KEGG functional gene sets, and also by inspecting associations between the functional sets themselves. NEAT is a flexible and efficient test for network enrichment analysis that aims to overcome some limitations of existing resampling-based tests. The method is implemented in the R package neat, which can be freely downloaded from CRAN ( https://cran.r-project.org/package=neat ).
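
    The heart of such a test is a hypergeometric tail probability. The sketch below (Python with SciPy) tests for an excess of links between two gene sets under a simplified stub-matching null in the spirit of NEAT; it is not the exact null model of the R package.

      from scipy.stats import hypergeom

      def link_enrichment_pvalue(obs_links_ab, deg_a, deg_b, total_stubs):
          # Treat set A's deg_a edge endpoints ("stubs") as draws from all
          # stubs in the network, with set B's stubs as successes, and test
          # for an excess of observed A-B links: P(X >= obs), upper tail.
          return hypergeom.sf(obs_links_ab - 1, total_stubs, deg_b, deg_a)

      # Toy numbers: 30 A-B links in a network with 10000 edges (20000 stubs).
      p = link_enrichment_pvalue(obs_links_ab=30, deg_a=120, deg_b=400,
                                 total_stubs=20000)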

  7. Selecting Measures to Evaluate Complex Sociotechnical Systems: An Empirical Comparison of a Task-based and Constraint-based Method

    DTIC Science & Technology

    2013-07-01

    experimental requirements of the research are described (see Appendix A for a full description of the development and testing)... A complex socio-technical system is required to compare the methods. An emulation of a radar warning

  8. Uptake and linkage into care over one year of providing HIV testing and counselling through community and health facility testing modalities in urban informal settlement of Kibera, Nairobi Kenya.

    PubMed

    Muhula, Samuel; Memiah, Peter; Mbau, Lilian; Oruko, Happiness; Baker, Bebora; Ikiara, Geoffrey; Mungai, Margaret; Ndirangu, Meshack; Achwoka, Dunstan; Ilako, Festus

    2016-05-04

    We examine the uptake of HIV Testing and Counselling (HTC) and linkage into care over one year of providing HTC through community and health facility testing modalities among people living in the Kibera informal urban settlement in Nairobi, Kenya. We analyzed program data on health facility-based and community-based HIV testing and counselling approaches for the period October 2013 to September 2014. Univariate and bivariate analysis methods were used to compare the two approaches with regard to uptake of HTC and subsequent linkage to care. Exact confidence intervals (CI) for the proportions were approximated using the simple normal approximation to the binomial distribution. The majority of the 18,591 clients were tested through health facility-based approaches (72.5%, n = 13,485), while those tested through community-based testing comprised 27.5% (n = 5,106). Most clients tested at health facilities were reached through Provider Initiated Testing and Counselling (PITC; 81.7%, n = 11,015), while 18.3% were reached through Voluntary Counselling and Testing (VCT)/Client Initiated Testing and Counselling (CITC) services. All clients who tested positive during health facility-based testing were successfully linked to care, either at the project sites or at sites of the client's choice, while not all who tested positive during community-based testing were linked to care. The HIV prevalence among all those tested for HIV in the program was 5.2% (n = 52, 95% CI: 3.9%-6.8%). Key study limitations included the use of aggregate data to report uptake of HTC through the two testing approaches and the inability to estimate the population in the catchment area likely to test for HIV. The health facility-based HTC approach reached more clients tested for HIV and identified greater numbers of HIV-positive people in the Kibera slum within the one-year testing period than the community-based HTC approach. Linking HIV-positive clients to care also proved much easier during health facility-based HTC than during community-based HTC.

  9. On Bayesian Testing of Additive Conjoint Measurement Axioms Using Synthetic Likelihood

    ERIC Educational Resources Information Center

    Karabatsos, George

    2017-01-01

    This article introduces a Bayesian method for testing the axioms of additive conjoint measurement. The method is based on an importance sampling algorithm that performs likelihood-free, approximate Bayesian inference using a synthetic likelihood to overcome the analytical intractability of this testing problem. This new method improves upon…

  10. Evaluation of medical students of teacher-based and student-based teaching methods in Infectious diseases course.

    PubMed

    Ghasemzadeh, I; Aghamolaei, T; Hosseini-Parandar, F

    2015-01-01

    Introduction: In recent years, medical education has changed dramatically and many medical schools around the world have been trying to expand modern training methods. The purpose of this research is to appraise medical students' views of teacher-based and student-based teaching methods in the Infectious diseases course at the Medical School of Hormozgan University of Medical Sciences. Methods: In this interventional study, a total of 52 medical students taking the Infectious diseases course were included. About 50% of the course was presented by a teacher-based teaching method (lecture) and 50% by a student-based teaching method (problem-based learning). The satisfaction of students regarding these methods was assessed by a questionnaire, and a test was used to measure their learning. Data were analyzed using SPSS 19 and the paired t-test. Results: Student satisfaction with the student-based teaching method (problem-based learning) was more positive than with the teacher-based teaching method (lecture). The mean score of students with the teacher-based teaching method was 12.03 (SD=4.08) and with the student-based teaching method 15.50 (SD=4.26), a considerable difference (p<0.001). Conclusion: Using the student-based teaching method (problem-based learning) rather than the teacher-based teaching method (lecture) to present the Infectious diseases course led to student satisfaction and provided additional learning opportunities.

  11. Performance of the AOAC use-dilution method with targeted modifications: collaborative study.

    PubMed

    Tomasino, Stephen F; Parker, Albert E; Hamilton, Martin A; Hamilton, Gordon C

    2012-01-01

    The U.S. Environmental Protection Agency (EPA), in collaboration with an industry work group, spearheaded a collaborative study designed to further enhance the AOAC use-dilution method (UDM). Based on feedback from laboratories that routinely conduct the UDM, improvements to the test culture preparation steps were prioritized. A set of modifications, largely based on culturing the test microbes on agar as specified in the AOAC hard surface carrier test method, was evaluated in a five-laboratory trial. The modifications targeted the preparation of the Pseudomonas aeruginosa test culture due to the difficulty of separating the pellicle from the broth in the current UDM. The proposed modifications (i.e., the modified UDM) were compared to the current UDM methodology for P. aeruginosa and Staphylococcus aureus; Salmonella choleraesuis was not included in the study. The goal was to determine whether the modifications reduced method variability. Three efficacy response variables were statistically analyzed: the number of positive carriers, the log reduction, and the pass/fail outcome. The scope of the collaborative study was limited to testing one liquid disinfectant (an EPA-registered quaternary ammonium product) at two levels of presumed product efficacy, high and low. Test conditions included the use of 400 ppm hard water as the product diluent and a 5% organic soil load (horse serum) added to the inoculum. Unfortunately, the study failed to support the adoption of the major modification (use of an agar-based approach to grow the test cultures) based on an analysis of the method's variability: the repeatability and reproducibility standard deviations for the modified method were equal to or greater than those for the current method across the various test variables. However, the authors propose retaining the frozen stock preparation step of the modified method and, based on the statistical equivalency of the control log densities, support its adoption as a procedural change to the current UDM. The current UDM displayed acceptable responsiveness to changes in product efficacy, acceptable repeatability across multiple tests in each laboratory for the control counts and log reductions, and acceptable reproducibility across multiple laboratories for the control log density values and log reductions. Although the data do not support the adoption of all modifications, the UDM collaborative study data are valuable for assessing sources of method variability and for reassessing the performance standard for the UDM.

  12. Testing non-inferiority of a new treatment in three-arm clinical trials with binary endpoints.

    PubMed

    Tang, Nian-Sheng; Yu, Bin; Tang, Man-Lai

    2014-12-18

    A two-arm non-inferiority trial without a placebo is usually adopted to demonstrate that an experimental treatment is not worse than a reference treatment by a small pre-specified non-inferiority margin due to ethical concerns. Selection of the non-inferiority margin and establishment of assay sensitivity are two major issues in the design, analysis and interpretation for two-arm non-inferiority trials. Alternatively, a three-arm non-inferiority clinical trial including a placebo is usually conducted to assess the assay sensitivity and internal validity of a trial. Recently, some large-sample approaches have been developed to assess the non-inferiority of a new treatment based on the three-arm trial design. However, these methods behave badly with small sample sizes in the three arms. This manuscript aims to develop some reliable small-sample methods to test three-arm non-inferiority. Saddlepoint approximation, exact and approximate unconditional, and bootstrap-resampling methods are developed to calculate p-values of the Wald-type, score and likelihood ratio tests. Simulation studies are conducted to evaluate their performance in terms of type I error rate and power. Our empirical results show that the saddlepoint approximation method generally behaves better than the asymptotic method based on the Wald-type test statistic. For small sample sizes, approximate unconditional and bootstrap-resampling methods based on the score test statistic perform better in the sense that their corresponding type I error rates are generally closer to the prespecified nominal level than those of other test procedures. Both approximate unconditional and bootstrap-resampling test procedures based on the score test statistic are generally recommended for three-arm non-inferiority trials with binary outcomes.

  13. Review and test of chilldown methods for space-based cryogenic tanks

    NASA Astrophysics Data System (ADS)

    Chato, David J.; Sanabria, Rafael

    The literature for tank chilldown methods applicable to cryogenic tankage in the zero gravity environment of earth orbit is reviewed. One method is selected for demonstration in a ground based test. The method selected for investigation was the charge-hold-vent method which uses repeated injection of liquid slugs, followed by a hold to allow complete vaporization of the liquid and a vent of the tank to space vacuum to cool tankage to the desired temperature. The test was conducted on a 175 cubic foot, 2219 aluminum walled tank weighing 329 pounds, which was previously outfitted with spray systems to test nonvented fill technologies. To minimize hardware changes, a simple control-by-pressure scheme was implemented to control injected liquid quantities. The tank cooled from 440 R sufficiently in six charge-hold-vent cycles to allow a complete nonvented fill of the test tank. Liquid hydrogen consumed in the process is estimated at 32 pounds.

  14. Review and test of chilldown methods for space-based cryogenic tanks

    NASA Technical Reports Server (NTRS)

    Chato, David J.; Sanabria, Rafael

    1991-01-01

    The literature for tank chilldown methods applicable to cryogenic tankage in the zero gravity environment of earth orbit is reviewed. One method is selected for demonstration in a ground based test. The method selected for investigation was the charge-hold-vent method which uses repeated injection of liquid slugs, followed by a hold to allow complete vaporization of the liquid and a vent of the tank to space vacuum to cool tankage to the desired temperature. The test was conducted on a 175 cubic foot, 2219 aluminum walled tank weighing 329 pounds, which was previously outfitted with spray systems to test nonvented fill technologies. To minimize hardware changes, a simple control-by-pressure scheme was implemented to control injected liquid quantities. The tank cooled from 440 R sufficiently in six charge-hold-vent cycles to allow a complete nonvented fill of the test tank. Liquid hydrogen consumed in the process is estimated at 32 pounds.
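    A rough energy balance reproduces the order of magnitude of the reported hydrogen consumption (a back-of-the-envelope sketch; the property values and final temperatures below are round-number assumptions, not figures from the report):

```python
# Rough charge-hold-vent energy balance (illustrative assumptions only).
m_tank_lbm = 329.0   # 2219 aluminum tank mass, from the report
cp_al = 0.12         # Btu/(lbm*R), rough average over the cool-down range (assumed)
dT = 440.0 - 40.0    # R, cool from 440 R to near-LH2 temperature (~40 R assumed)
h_fg = 192.0         # Btu/lbm, latent heat of vaporization of hydrogen
cp_h2 = 2.5          # Btu/(lbm*R), rough vapor heat capacity (assumed)
dT_vapor = 200.0     # R, assumed mean vapor temperature rise before venting

q_tank = m_tank_lbm * cp_al * dT        # heat removed from the tank wall
q_per_lbm = h_fg + cp_h2 * dT_vapor     # heat absorbed per lbm of injected LH2
# ~23 lbm: same order as the reported 32 lbm; the real process is less
# efficient because vented vapor does not warm fully to wall temperature.
print(q_tank / q_per_lbm)
```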

  15. A review of propeller noise prediction methodology: 1919-1994

    NASA Technical Reports Server (NTRS)

    Metzger, F. Bruce

    1995-01-01

    This report summarizes a review of the literature regarding propeller noise prediction methods. The review is divided into six sections: (1) early methods; (2) more recent methods based on earlier theory; (3) more recent methods based on the Acoustic Analogy; (4) more recent methods based on Computational Acoustics; (5) empirical methods; and (6) broadband methods. The report concludes that a large number of noise prediction procedures are available which vary markedly in complexity. Deficiencies in the accuracy of methods may in many cases be related not to the methods themselves but to the accuracy and detail of the aerodynamic inputs used to calculate noise. The steps recommended in the report to provide accurate and easy-to-use prediction methods are: (1) identify reliable test data; (2) define and conduct test programs to fill gaps in the existing data base; (3) identify the most promising prediction methods; (4) evaluate promising prediction methods relative to the data base; (5) identify and correct the weaknesses in the prediction methods, including lack of user friendliness, and include features now available only in research codes; (6) confirm the accuracy of improved prediction methods against the data base; and (7) make the methods widely available and provide training in their use.

  16. High-resolution image reconstruction technique applied to the optical testing of ground-based astronomical telescopes

    NASA Astrophysics Data System (ADS)

    Jin, Zhenyu; Lin, Jing; Liu, Zhong

    2008-07-01

    Building on a study of the classical techniques (such as the Shack-Hartmann wavefront sensor) used to test the aberrations of ground-based astronomical optical telescopes, we propose two testing methods founded on high-resolution image reconstruction: one based on the averaged short-exposure OTF and the other based on the speckle interferometric OTF of Antoine Labeyrie. Research by J. Ohtsubo, F. Roddier, Richard Barakat and J.-Y. Zhang indicated that the speckle interferometric transfer function (SITF) statistics are affected by the telescope's optical aberrations, i.e., the SITF statistics are a function of the optical system aberrations and the atmospheric Fried parameter (seeing). Diffraction-limited information about the telescope can be obtained through two statistical treatments of a large set of speckle images: with the first method, we extract low-frequency information such as the full width at half maximum (FWHM) of the telescope PSF to estimate the optical quality; with the second method, we obtain a more precise description of the telescope PSF including high-frequency information. We will apply the two testing methods to the 2.4 m optical telescope of the GMG Observatory in China to validate their repeatability and correctness, and compare the results with those obtained with the Shack-Hartmann wavefront sensor. This part is described in detail in the paper.
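    The core computation in Labeyrie-style speckle interferometry is the average power spectrum of many short-exposure frames, whose high-frequency content survives atmospheric blurring. A minimal sketch (the function name and the random toy frames are illustrative; a real analysis would also divide by a reference-star power spectrum to remove the atmospheric transfer function):

```python
import numpy as np

def mean_power_spectrum(frames):
    """Labeyrie-style average power spectrum of short-exposure frames.

    frames: array of shape (n_frames, ny, nx). Returns the mean squared
    modulus of the 2-D Fourier transform, whose high-frequency content
    carries diffraction-limited information about the telescope PSF.
    """
    ps = np.zeros(frames.shape[1:])
    for f in frames:
        F = np.fft.fft2(f - f.mean())   # remove DC so low frequencies do not dominate
        ps += np.abs(F) ** 2
    return np.fft.fftshift(ps / len(frames))

# Toy usage with random frames standing in for speckle images.
frames = np.random.rand(50, 64, 64)
print(mean_power_spectrum(frames).shape)
```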

  17. High Frequency Vibration Based Fatigue Testing of Developmental Alloys

    NASA Astrophysics Data System (ADS)

    Holycross, Casey M.; Srinivasan, Raghavan; George, Tommy J.; Tamirisakandala, Seshacharyulu; Russ, Stephan M.

    Many fatigue test methods have been developed to rapidly evaluate fatigue behavior. This increased test speed can come at some expense, since these methods may require non-standard specimen geometry or increased facility and equipment capability. One such method, developed by George et al., involves a base-excited plate specimen driven into a high frequency bending resonant mode. This resonant mode is of sufficient frequency (typically 1200 to 1700 Hertz) to accumulate 10^7 cycles in a few hours. One of the main limitations of this test method is that fatigue cracking is almost guaranteed to initiate at the surface in regions of high stress. This brings into question the validity of the fatigue test results, as compared to more traditional uniaxial, smooth-bar testing, since the high stresses subject only a small volume to fatigue damage. This limitation also brings into question the suitability of this method to screen developmental alloys, should their initiation life be governed by subsurface flaws. However, if applicable, the rapid generation of fatigue data using this method would facilitate faster design iterations, identifying material and manufacturing process deficiencies more quickly. The developmental alloy used in this study was a powder metallurgy boron-modified Ti-6Al-4V, a new alloy currently being considered for gas turbine engine fan blades. Plate specimens were subjected to fully reversed bending fatigue. Results are compared with existing data from commercially available Ti-6Al-4V using both vibration-based and more traditional fatigue test methods.
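    The quoted test speed is easy to check arithmetically (a back-of-the-envelope verification, not a figure from the paper): at the stated resonant frequencies, 10^7 cycles accumulate in roughly 1.6 to 2.3 hours.

```python
# Time to accumulate 1e7 fatigue cycles at the quoted resonant frequencies.
cycles = 1e7
for freq_hz in (1200.0, 1700.0):
    hours = cycles / freq_hz / 3600.0
    print(f"{freq_hz:.0f} Hz -> {hours:.2f} h to reach 1e7 cycles")
```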

  18. Jaccard distance based weighted sparse representation for coarse-to-fine plant species recognition.

    PubMed

    Zhang, Shanwen; Wu, Xiaowei; You, Zhuhong

    2017-01-01

    Leaf-based plant species recognition plays an important role in ecological protection; however, its application to large and modern leaf databases has been a long-standing obstacle due to computational cost and feasibility. Recognizing such limitations, we propose a Jaccard distance based sparse representation (JDSR) method which adopts a two-stage, coarse-to-fine strategy for plant species recognition. In the first stage, we use the Jaccard distance between the test sample and each training sample to coarsely determine the candidate classes of the test sample. The second stage applies a Jaccard distance based weighted sparse representation based classification (WSRC), which aims to approximately represent the test sample in the training space and classify it by the approximation residuals. Since the training model of our JDSR method involves far fewer but more informative representatives, the method is expected to overcome the high computational and memory costs of traditional sparse representation based classification. Comparative experimental results on a public leaf image database demonstrate that the proposed method outperforms other existing feature extraction and SRC based plant recognition methods in terms of both accuracy and computational speed.
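    A minimal sketch of the coarse stage only (the binary feature encoding, the per-class minimum-distance screening rule, and the value of k are assumptions; the fine WSRC stage is omitted):

```python
import numpy as np

def jaccard_distance(a, b):
    # a, b: binary feature vectors (e.g., thresholded leaf descriptors).
    inter = np.logical_and(a, b).sum()
    union = np.logical_or(a, b).sum()
    return 1.0 - inter / union if union else 1.0

def candidate_classes(test, train_X, train_y, k=3):
    """Coarse stage: keep the k classes whose closest training sample
    (by Jaccard distance) is nearest to the test sample."""
    best = {}
    for x, label in zip(train_X, train_y):
        d = jaccard_distance(test, x)
        best[label] = min(best.get(label, 1.0), d)
    return sorted(best, key=best.get)[:k]

# Toy usage with random binary features.
rng = np.random.default_rng(1)
X = rng.integers(0, 2, size=(20, 50))
y = rng.integers(0, 5, size=20)
print(candidate_classes(rng.integers(0, 2, size=50), X, y))
```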

  19. Implementation of the soil compactor analyzer into test method TEX-113-E : technical report.

    DOT National Transportation Integrated Search

    2012-04-01

    Test method Tex-113-E prepares laboratory aggregate base test specimens with an impact hammer : compactor. These specimens are used for compaction characteristics and design tests. Although the : historical Tex-113-E required a certain amount of comp...

  20. Versatile light-emitting-diode-based spectral response measurement system for photovoltaic device characterization.

    PubMed

    Hamadani, Behrang H; Roller, John; Dougherty, Brian; Yoon, Howard W

    2012-07-01

    An absolute differential spectral response measurement system for solar cells is presented. The system couples an array of light emitting diodes with an optical waveguide to provide large area illumination. Two unique yet complementary measurement methods were developed and tested with the same measurement apparatus. Good agreement was observed between the two methods based on testing of a variety of solar cells. The first method is a lock-in technique that can be performed over a broad pulse frequency range. The second method is based on synchronous multifrequency optical excitation and electrical detection. An innovative scheme for providing light bias during each measurement method is discussed.

  1. A Simulation Comparison of Parametric and Nonparametric Dimensionality Detection Procedures

    ERIC Educational Resources Information Center

    Mroch, Andrew A.; Bolt, Daniel M.

    2006-01-01

    Recently, nonparametric methods have been proposed that provide a dimensionally based description of test structure for tests with dichotomous items. Because such methods are based on different notions of dimensionality than are assumed when using a psychometric model, it remains unclear whether these procedures might lead to a different…

  2. Introduction to Permutation and Resampling-Based Hypothesis Tests

    ERIC Educational Resources Information Center

    LaFleur, Bonnie J.; Greevy, Robert A.

    2009-01-01

    A resampling-based method of inference--permutation tests--is often used when distributional assumptions are questionable or unmet. Not only are these methods useful for obvious departures from parametric assumptions (e.g., normality) and small sample sizes, but they are also more robust than their parametric counterparts in the presence of…

  3. An Information-Correction Method for Testlet-Based Test Analysis: From the Perspectives of Item Response Theory and Generalizability Theory. Research Report. ETS RR-17-27

    ERIC Educational Resources Information Center

    Li, Feifei

    2017-01-01

    An information-correction method for testlet-based tests is introduced. This method takes advantage of both generalizability theory (GT) and item response theory (IRT). The measurement error for the examinee proficiency parameter is often underestimated when a unidimensional conditional-independence IRT model is specified for a testlet dataset. By…

  4. Investigating Measurement Invariance in Computer-Based Personality Testing: The Impact of Using Anchor Items on Effect Size Indices

    ERIC Educational Resources Information Center

    Egberink, Iris J. L.; Meijer, Rob R.; Tendeiro, Jorge N.

    2015-01-01

    A popular method to assess measurement invariance of a particular item is based on likelihood ratio tests with all other items as anchor items. The results of this method are often only reported in terms of statistical significance, and researchers proposed different methods to empirically select anchor items. It is unclear, however, how many…

  5. Integrating conventional and inverse representation for face recognition.

    PubMed

    Xu, Yong; Li, Xuelong; Yang, Jian; Lai, Zhihui; Zhang, David

    2014-10-01

    Representation-based classification methods are all built on the conventional representation, which first expresses the test sample as a linear combination of the training samples and then exploits the deviation between the test sample and the reconstruction from each class to perform classification. However, this deviation does not always reflect the difference between the test sample and each class well. In this paper, we propose a novel representation-based classification method for face recognition that integrates the conventional and an inverse representation-based classification to better recognize the face. It first produces the conventional representation of the test sample, i.e., uses a linear combination of the training samples to represent the test sample. Then it obtains the inverse representation, i.e., provides an approximate representation of each training sample of a subject by exploiting the test sample and the training samples of the other subjects. Finally, the proposed method exploits the conventional and inverse representations to generate two kinds of scores of the test sample with respect to each class and combines them to recognize the face. The paper presents the theoretical foundation and rationale of the proposed method. Moreover, this paper shows for the first time that a basic property of the human face, i.e., its symmetry, can be exploited to generate new training and test samples. As these new samples reflect possible appearances of the face, their use enables higher accuracy. The experiments show that the proposed conventional and inverse representation-based linear regression classification (CIRLRC), an improvement to linear regression classification (LRC), can obtain very high accuracy and greatly outperforms naive LRC and other state-of-the-art conventional representation based face recognition methods. The accuracy of CIRLRC can be 10% greater than that of LRC.
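    A minimal sketch of the conventional-representation half of such a scheme (ridge-regularized least squares over all training samples, then per-class reconstruction residuals; the regularization parameter and toy data are assumptions, and the inverse representation and score fusion are omitted):

```python
import numpy as np

def class_residuals(y, X, labels, lam=1e-3):
    """Conventional representation: express test sample y as a linear
    combination of all training samples (ridge-regularized least squares),
    then measure the deviation of y from each class's partial reconstruction."""
    # X: (d, n) training matrix with one sample per column.
    a = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)
    res = {}
    for c in set(labels):
        idx = [i for i, l in enumerate(labels) if l == c]
        recon = X[:, idx] @ a[idx]          # reconstruction using class c only
        res[c] = np.linalg.norm(y - recon)
    return res

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 12))              # 12 training faces, 100-dim features
labels = [i // 3 for i in range(12)]        # 4 subjects, 3 samples each
y = X[:, 0] + 0.1 * rng.normal(size=100)    # noisy copy of a subject-0 sample
scores = class_residuals(y, X, labels)
print(min(scores, key=scores.get))          # should usually recover class 0
```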

  6. Indirect potentiometric titration of ascorbic acid in pharmaceutical preparations using copper based mercury film electrode.

    PubMed

    Abdul Kamal Nazer, Meeran Mohideen; Hameed, Abdul Rahman Shahul; Riyazuddin, Patel

    2004-01-01

    A simple and rapid potentiometric method for the estimation of ascorbic acid in pharmaceutical dosage forms has been developed. The method is based on treating ascorbic acid with iodine and titrating the iodide produced, equivalent to the ascorbic acid, with silver nitrate using a Copper Based Mercury Film Electrode (CBMFE) as the indicator electrode. An interference study was carried out to check for possible interference from common excipients and other vitamins. The precision and accuracy of the method were assessed by application of the lack-of-fit test and other statistical methods. The results of the proposed method and the British Pharmacopoeia method were compared using F- and t-tests of significance.

  7. Web-based versus traditional lecture: are they equally effective as a flexible bronchoscopy teaching method?

    PubMed

    Mata, Caio Augusto Sterse; Ota, Luiz Hirotoshi; Suzuki, Iunis; Telles, Adriana; Miotto, Andre; Leão, Luiz Eduardo Vilaça

    2012-01-01

    This study compares the traditional live lecture to a web-based approach in the teaching of bronchoscopy and evaluates the positive and negative aspects of both methods. We developed a web-based bronchoscopy curriculum, which integrates texts, images and animations. It was applied to first-year interns, who were later administered a multiple-choice test. Another group of eight first-year interns received the traditional teaching method and the same test. The two groups were compared using Student's t-test. The mean scores (± SD) of students who used the website were 14.63 ± 1.41 (range 13-17). The test scores of the other group had the same range, with a mean score of 14.75 ± 1. Student's t-test showed no difference between the test results. The positive point common to both groups was the presence of multimedia content; the web group also cited as positive the ability to review the pages, and the lecture group the role of the teacher. Web-based bronchoscopy education showed effectiveness similar to the traditional live lecture.

  8. A New Approach in Force-Limited Vibration Testing of Flight Hardware

    NASA Technical Reports Server (NTRS)

    Kolaini, Ali R.; Kern, Dennis L.

    2012-01-01

    The force-limited vibration test approaches discussed in NASA-HDBK-7004C were developed to reduce the overtesting associated with base-shake vibration tests of aerospace hardware in which the interface responses are excited coherently. This handbook outlines several different methods of specifying the force limits. The rationale for force limiting is based on the disparity between the impedances of typical aerospace mounting structures and the large impedances of vibration test shakers when the interfaces are coherently excited. Among these approaches, the semi-empirical method is presently the most widely used method to derive the force limits. The incoherent excitation of aerospace structures at their mounting interfaces has not been accounted for in the past and provides the basis for more realistic force limits for qualifying hardware by shaker testing. In this paper, current methods for defining force-limiting specifications discussed in the NASA handbook are reviewed using data from a series of acoustic and vibration tests. A new approach based on considering the incoherent excitation of the structural mounting interfaces using acoustic test data is also discussed. It is believed that the new approach provides much more realistic force limits and may further remove conservatism inherent in shaker vibration testing that is not accounted for by the methods discussed in the NASA handbook. A discussion of using FEM/BEM analysis to obtain realistic force limits for flight hardware is provided.
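    For orientation, a commonly quoted form of the semi-empirical force limit sets the interface force spectral density proportional to the acceleration specification below a break frequency and rolls it off above it; the constant C^2, the break frequency f0 and the roll-off exponent n are user choices, and the values below are placeholders rather than handbook numbers:

```python
def semi_empirical_force_limit(s_aa, f, c2=4.0, m0=50.0, f0=100.0, n=2.0):
    """Semi-empirical force limit S_FF(f) given an acceleration spec S_AA(f):
    S_FF = C^2 * M0^2 * S_AA below f0, rolled off as (f0/f)^n above f0.
    Units: S_AA in g^2/Hz, M0 the total mass (units consistent with force),
    C^2 dimensionless. All parameter values here are illustrative."""
    roll = 1.0 if f <= f0 else (f0 / f) ** n
    return c2 * m0**2 * s_aa * roll

# Example: force limit at a frequency above the assumed break frequency.
print(semi_empirical_force_limit(s_aa=0.04, f=200.0))
```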

  9. A Method of DTM Construction Based on Quadrangular Irregular Networks and Related Error Analysis

    PubMed Central

    Kang, Mengjun

    2015-01-01

    A new method of DTM construction based on quadrangular irregular networks (QINs) that considers all the original data points and has a topological matrix is presented. A numerical test and a real-world example are used to comparatively analyse the accuracy of QINs against classical interpolation methods and other DTM representation methods, including SPLINE, KRIGING and triangulated irregular networks (TINs). The numerical test finds that the QIN method is the second-most accurate of the four methods. In the real-world example, DTMs are constructed using QINs and the three classical interpolation methods. The results indicate that the QIN method is the most accurate method tested. The difference in accuracy rank seems to be caused by the locations of the data points sampled. Although the QIN method has drawbacks, it is an alternative method for DTM construction. PMID:25996691

  10. An analysis of the static load test on a single square pile of 40x40 cm², using the finite element method, in the Rusunawa project, Jatinegara, Jakarta

    NASA Astrophysics Data System (ADS)

    Harasid, Harun; Roesyanto; Iskandar, Rudi; Silalahi, Sofyan A.

    2018-03-01

    A pile foundation is a type of foundation used to transfer structural loads through weak soil layers. The load carried by a pile is obtained from the end bearing capacity at the pile tip and from the friction bearing capacity, i.e., the frictional and adhesive resistance between the pile shaft and the surrounding soil. The Standard Penetration Test (SPT) investigation is aimed at describing the soil layers, based on visual observation of soil type and color and on soil characteristics, and SPT data can be used to calculate bearing capacity. In addition to the SPT investigation, this study used laboratory testing of samples, a loading test on the pile, and Dutch Cone Penetrometer (DCP) data to confirm the bearing capacity. The study analyzed the bearing capacity and settlement of a 40x40 cm square pile, single and grouped, using an empirical method, the AllPile program and the Plaxis program, and compared the results with the interpretation of the loading test on the foundation of the Rusunawa project, Jatinegara, Jakarta. The analysis used the soil investigation and laboratory data with a Mohr-Coulomb soil model. The ultimate bearing capacity from the SPT data for the 15.4 m pile was 189.81 tons, and from the soil shear strength parameters 198.67 tons. From the cone penetration (sondir) data, the bearing capacity was 276.241 tons by the Aoki and De Alencar method and 305.49 tons by the Meyerhof method. From the loading test, the ultimate bearing capacity by three interpretation methods was 260 tons (Davisson), 270 tons (Mazurkiewicz) and 250 tons (Chin). The efficiency of the pile group was 0.73 by the Converse-Labarre equation, 0.59 by the Los Angeles Group Action equation, and 0.94 by the Seiler-Keeney method. The bearing capacity based on pile material strength was 221.76 tons, the bearing capacity based on driving (calendering) records was 201.71 tons, and the lateral bearing capacity of a single pile was 129.6 kN (12.96 tons). At the maximum load (280 tons), larger settlements were obtained from the Maintained Load Test (21.00 mm) and the Quick Load Test (20.67 mm) than from the field load test (18.74 mm); based on ASTM D1143/81, the permitted value is 25.40 mm, so the foundation piles were judged safe. Pore water pressure is strongly time dependent, so the Maintained Load Test and the Quick Load Test showed different pore water pressure levels; the calculations indicated that pore water pressure dissipated more quickly in the Quick Load Test.
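    The group-efficiency figure quoted above is consistent with the Converse-Labarre equation for a plausible layout; a minimal sketch (the 3x3 layout and 1.2 m spacing are assumptions chosen for illustration, not data from the study):

```python
import math

def converse_labarre_efficiency(m, n, d, s):
    """Pile-group efficiency by the Converse-Labarre equation.
    m, n: rows and columns of piles; d: pile width/diameter; s: spacing.
    theta is arctan(d/s) expressed in degrees."""
    theta = math.degrees(math.atan(d / s))
    return 1.0 - theta / 90.0 * ((n - 1) * m + (m - 1) * n) / (m * n)

# Assumed usage: 0.40 m square piles at 1.2 m spacing in a 3x3 group,
# which reproduces the 0.73 efficiency reported above.
print(round(converse_labarre_efficiency(3, 3, 0.40, 1.2), 2))
```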

  11. Optimal Test Design with Rule-Based Item Generation

    ERIC Educational Resources Information Center

    Geerlings, Hanneke; van der Linden, Wim J.; Glas, Cees A. W.

    2013-01-01

    Optimal test-design methods are applied to rule-based item generation. Three different cases of automated test design are presented: (a) test assembly from a pool of pregenerated, calibrated items; (b) test generation on the fly from a pool of calibrated item families; and (c) test generation on the fly directly from calibrated features defining…

  12. A degradation-based sorting method for lithium-ion battery reuse.

    PubMed

    Chen, Hao; Shen, Julia

    2017-01-01

    In a world where millions of people are dependent on batteries to provide them with convenient and portable energy, battery recycling is of the utmost importance. In this paper, we developed a new method to sort 18650 Lithium-ion batteries in large quantities and in real time for harvesting used cells with enough capacity for battery reuse. Internal resistance and capacity tests were conducted as a basis for comparison with a novel degradation-based method based on X-ray radiographic scanning and digital image contrast computation. The test results indicate that the sorting accuracy of the test cells is about 79% and the execution time of our algorithm is at a level of 200 milliseconds, making our method a potential real-time solution for reusing the remaining capacity in good used cells.

  13. A prevalence-based association test for case-control studies.

    PubMed

    Ryckman, Kelli K; Jiang, Lan; Li, Chun; Bartlett, Jacquelaine; Haines, Jonathan L; Williams, Scott M

    2008-11-01

    Genetic association is often determined in case-control studies by the differential distribution of alleles or genotypes. Recent work has demonstrated that association can also be assessed by deviations from the expected distributions of alleles or genotypes. Specifically, multiple methods motivated by the principles of Hardy-Weinberg equilibrium (HWE) have been developed. However, these methods do not take into account many of the assumptions of HWE. Therefore, we have developed a prevalence-based association test (PRAT) as an alternative method for detecting association in case-control studies. This method, also motivated by the principles of HWE, uses an estimated population allele frequency to generate expected genotype frequencies instead of using the case and control frequencies separately. Our method often has greater power, under a wide variety of genetic models, to detect association than genotypic, allelic or Cochran-Armitage trend association tests. Therefore, we propose PRAT as a powerful alternative method of testing for association.
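    The HWE-motivated reasoning behind such tests is easy to sketch: estimate an allele frequency, form expected genotype counts p^2, 2pq, q^2, and measure deviation with a chi-square statistic. The snippet below shows that generic computation (it is not the published PRAT, which additionally uses a population allele-frequency estimate shared across cases and controls):

```python
import numpy as np
from scipy.stats import chi2

def hwe_expected_counts(n_aa, n_ab, n_bb):
    """Expected genotype counts under Hardy-Weinberg equilibrium from an
    estimated allele frequency (generic HWE reasoning, not PRAT itself)."""
    n = n_aa + n_ab + n_bb
    p = (2 * n_aa + n_ab) / (2 * n)           # estimated frequency of allele A
    return n * np.array([p**2, 2 * p * (1 - p), (1 - p)**2])

def hwe_chi2_pvalue(n_aa, n_ab, n_bb):
    obs = np.array([n_aa, n_ab, n_bb], dtype=float)
    exp = hwe_expected_counts(n_aa, n_ab, n_bb)
    stat = ((obs - exp)**2 / exp).sum()
    return chi2.sf(stat, df=1)                # 1 df: one allele frequency estimated

# Toy genotype counts (AA, AB, BB).
print(hwe_chi2_pvalue(45, 40, 15))
```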

  14. Protostellar hydrodynamics: Constructing and testing a spatially and temporally second-order accurate method. 2: Cartesian coordinates

    NASA Technical Reports Server (NTRS)

    Myhill, Elizabeth A.; Boss, Alan P.

    1993-01-01

    In Boss & Myhill (1992) we described the derivation and testing of a spherical coordinate-based scheme for solving the hydrodynamic equations governing the gravitational collapse of nonisothermal, nonmagnetic, inviscid, radiative, three-dimensional protostellar clouds. Here we discuss a Cartesian coordinate-based scheme based on the same set of hydrodynamic equations. As with the spherical coordinate-based code, the Cartesian coordinate-based scheme employs explicit Eulerian methods which are both spatially and temporally second-order accurate. We begin by describing the hydrodynamic equations in Cartesian coordinates and the numerical methods used in this particular code. Following Finn & Hawley (1989), we pay special attention to the proper implementation of high-order accurate finite difference methods. We evaluate the ability of the Cartesian scheme to handle shock propagation problems, and through convergence testing we show that the code is indeed second-order accurate. To compare the Cartesian scheme discussed here with the spherical coordinate-based scheme discussed in Boss & Myhill (1992), the two codes are used to calculate the standard isothermal collapse test case described by Bodenheimer & Boss (1981). We find that with the improved codes, the intermediate bar-configuration found previously disappears, and the cloud fragments directly into a binary protostellar system. Finally, we present the results from both codes of a new test for nonisothermal protostellar collapse.
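    Convergence testing of this kind usually reduces to estimating the observed order of accuracy from errors at two grid resolutions; a minimal sketch of that standard computation (the error values are made up for illustration):

```python
import math

def observed_order(err_coarse, err_fine, refinement=2.0):
    """Richardson-style estimate of a scheme's convergence order from
    errors at two grid resolutions: p = log(e_c / e_f) / log(r)."""
    return math.log(err_coarse / err_fine) / math.log(refinement)

# Toy check: halving the grid spacing drops the error by ~4x,
# which indicates second-order accuracy.
print(observed_order(1.0e-2, 2.6e-3))   # ~1.94, close to 2
```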

  15. The Effects of Web-Based Interactive Virtual Tours on the Development of Prospective Mathematics Teachers' Spatial Skills

    ERIC Educational Resources Information Center

    Kurtulus, Aytac

    2013-01-01

    The aim of this study was to investigate the effects of web-based interactive virtual tours on the development of prospective mathematics teachers' spatial skills. The study was designed based on experimental method. The "one-group pre-test post-test design" of this method was taken as the research model. The study was conducted with 3rd year…

  16. Item Selection for the Development of Parallel Forms from an IRT-Based Seed Test Using a Sampling and Classification Approach

    ERIC Educational Resources Information Center

    Chen, Pei-Hua; Chang, Hua-Hua; Wu, Haiyan

    2012-01-01

    Two sampling-and-classification-based procedures were developed for automated test assembly: the Cell Only and the Cell and Cube methods. A simulation study based on a 540-item bank was conducted to compare the performance of the procedures with the performance of a mixed-integer programming (MIP) method for assembling multiple parallel test…

  17. Implementation of an Improved Adaptive Testing Theory

    ERIC Educational Resources Information Center

    Al-A'ali, Mansoor

    2007-01-01

    Computer adaptive testing is the study of scoring tests and questions based on assumptions concerning the mathematical relationship between examinees' ability and the examinees' responses. Adaptive student tests, which are based on item response theory (IRT), have many advantages over conventional tests. We use the least square method, a…

  18. A rule-based software test data generator

    NASA Technical Reports Server (NTRS)

    Deason, William H.; Brown, David B.; Chang, Kai-Hsiung; Cross, James H., II

    1991-01-01

    Rule-based software test data generation is proposed as an alternative to either path/predicate analysis or random data generation. A prototype rule-based test data generator for Ada programs is constructed and compared to a random test data generator. Four Ada procedures are used in the comparison. Approximately 2000 rule-based test cases and 100,000 randomly generated test cases are automatically generated and executed. The success of the two methods is compared using standard coverage metrics. Simple statistical tests show that even the primitive rule-based test data generation prototype is significantly better than random data generation. This result demonstrates that rule-based test data generation is feasible and shows great promise in assisting test engineers, especially when the rule base is developed further.

  19. Assessment of corrosion fatigue damage by acoustic emission and periodic proof tests

    NASA Astrophysics Data System (ADS)

    Mehdizadeh, P.

    1976-03-01

    The development of a better nondestructive inspection method for detecting corrosion fatigue damage, based on acoustic emission (AE) and periodic proof testing (PPT), is studied for corrosion fatigue tests in salt water solution under tension-tension loading. It is shown that PPT combined with AE monitoring can be as sensitive a method for assessing the progress of corrosion fatigue damage as continuous AE monitoring. The AE-PPT technique is shown to be dependent on the geometry and size of the crack relative to the test specimen. A qualitative method, based on the plateauing of acoustic emission counts during proof tests due to changes in the fracture mode, is used to predict the remaining fatigue life up to 70% of the actual values. PPT is shown to have no adverse effect on fatigue performance in salt water.

  20. Optimal Stratification of Item Pools in a-Stratified Computerized Adaptive Testing.

    ERIC Educational Resources Information Center

    Chang, Hua-Hua; van der Linden, Wim J.

    2003-01-01

    Developed a method based on 0-1 linear programming to stratify an item pool optimally for use in alpha-stratified adaptive testing. Applied the method to a previous item pool from the computerized adaptive test of the Graduate Record Examinations. Results show the new method performs well in practical situations. (SLD)

  1. A Comparison of Methods for Transforming Sentences into Test Questions for Instructional Materials. Technical Report #1.

    ERIC Educational Resources Information Center

    Roid, Gale; And Others

    Several measurement theorists have convincingly argued that methods of writing test questions, particularly for criterion-referenced tests, should be based on operationally defined rules. This study was designed to examine and further refine a method for objectively generating multiple-choice questions for prose instructional materials. Important…

  2. Distance-based microfluidic quantitative detection methods for point-of-care testing.

    PubMed

    Tian, Tian; Li, Jiuxing; Song, Yanling; Zhou, Leiji; Zhu, Zhi; Yang, Chaoyong James

    2016-04-07

    Equipment-free devices with quantitative readout are of great significance to point-of-care testing (POCT), which provides real-time readout to users and is especially important in low-resource settings. Among various equipment-free approaches, distance-based visual quantitative detection methods rely on reading the visual signal length for corresponding target concentrations, thus eliminating the need for sophisticated instruments. The distance-based methods are low-cost, user-friendly and can be integrated into portable analytical devices. Moreover, such methods enable quantitative detection of various targets by the naked eye. In this review, we first introduce the concept and history of distance-based visual quantitative detection methods. Then, we summarize the main methods for translation of molecular signals to distance-based readout and discuss different microfluidic platforms (glass, PDMS, paper and thread) in terms of applications in biomedical diagnostics, food safety monitoring, and environmental analysis. Finally, the potential and future perspectives are discussed.

  3. Pilot Test of a Novel Method for Assessing Community Response to Low-Amplitude Sonic Booms

    NASA Technical Reports Server (NTRS)

    Fidell, Sanford; Horonjeff, Richard D.; Harris, Michael

    2012-01-01

    A pilot test of a novel method for assessing residents' annoyance with sonic booms was performed. During a two-week period, residents of the base housing area at Edwards Air Force Base provided data on their reactions to sonic booms using Smartphone-based interviews. Noise measurements were conducted at the same time. The report presents information about the data collection methods and about test participants' reactions to low-amplitude sonic booms. The latter information should not be viewed as definitive for several reasons: it may not be reliably generalized to the wider U.S. residential population (because it was not derived from a representative random sample), and the sample itself was not large.

  4. Flight Test Results of a GPS-Based Pitot-Static Calibration Method Using Output-Error Optimization for a Light Twin-Engine Airplane

    NASA Technical Reports Server (NTRS)

    Martos, Borja; Kiszely, Paul; Foster, John V.

    2011-01-01

    As part of the NASA Aviation Safety Program (AvSP), a novel pitot-static calibration method was developed to allow rapid in-flight calibration for subscale aircraft while flying within confined test areas. This approach uses Global Positioning System (GPS) technology coupled with modern system identification methods that rapidly compute optimal pressure error models over a range of airspeed with defined confidence bounds. This method has been demonstrated in subscale flight tests and has shown small 2-σ error bounds with a significant reduction in test time compared to other methods. The current research was motivated by the desire to further evaluate and develop this method for full-scale aircraft. A goal of this research was to develop an accurate calibration method that enables reductions in test equipment and flight time, thus reducing costs. The approach involved analysis of data acquisition requirements, development of efficient flight patterns, and analysis of pressure error models based on system identification methods. Flight tests were conducted at The University of Tennessee Space Institute (UTSI) utilizing an instrumented Piper Navajo research aircraft. In addition, the UTSI engineering flight simulator was used to investigate test maneuver requirements and handling qualities issues associated with this technique. This paper provides a summary of piloted simulation and flight test results that illustrates the performance and capabilities of the NASA calibration method. Discussion of maneuver requirements and data analysis methods is included, as well as recommendations for piloting technique.
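    A toy version of the underlying regression step (illustrative only: the flight-test method estimates winds and pressure errors in an output-error framework, which a simple least-squares fit against GPS-derived true airspeed does not reproduce; the polynomial order and synthetic data are assumptions):

```python
import numpy as np

def fit_airspeed_error_model(v_indicated, v_gps_true, order=2):
    """Least-squares fit of an airspeed-error model dV = V_true - V_ind
    as a polynomial in indicated airspeed, with 2-sigma coefficient
    bounds taken from the fit covariance."""
    dv = v_gps_true - v_indicated
    A = np.vander(v_indicated, order + 1)       # [v^2, v, 1] columns for order=2
    coef, res, *_ = np.linalg.lstsq(A, dv, rcond=None)
    dof = len(dv) - (order + 1)
    sigma2 = res[0] / dof if res.size else float(np.var(dv - A @ coef))
    cov = sigma2 * np.linalg.inv(A.T @ A)
    return coef, 2.0 * np.sqrt(np.diag(cov))    # coefficients and 2-sigma bounds

# Synthetic flight data: a 2% scale error and a -1.5 kt offset, plus noise.
rng = np.random.default_rng(0)
v_ind = rng.uniform(80, 180, size=200)          # knots
v_true = v_ind + 0.02 * v_ind - 1.5 + rng.normal(0, 0.3, 200)
coef, bounds = fit_airspeed_error_model(v_ind, v_true)
print(coef, bounds)
```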

  5. Evaluation of culture- and PCR-based detection methods for Escherichia coli O157:H7 in inoculated ground beef.

    PubMed

    Arthur, Terrance M; Bosilevac, Joseph M; Nou, Xiangwu; Koohmaraie, Mohammad

    2005-08-01

    Currently, several beef processors employ test-and-hold systems for increased quality control of ground beef. In such programs, each lot of product must be tested and found negative for Escherichia coli O157:H7 prior to release of the product into commerce. Optimization of three testing attributes (detection time, specificity, and sensitivity) is critical to the success of such strategies. Because ground beef is a highly perishable product, the testing methodology used must be as rapid as possible. The test also must have a low false-positive result rate so product is not needlessly discarded. False-negative results cannot be tolerated because they would allow contaminated product to be released and potentially cause disease. In this study, two culture-based and three PCR-based methods for detecting E. coli O157:H7 in ground beef were compared for their abilities to meet the above criteria. Ground beef samples were individually spiked with five genetically distinct strains of E. coli O157:H7 at concentrations of 17 and 1.7 CFU/65 g and then subjected to the various testing methodologies. There was no difference (P > 0.05) in the abilities of the PCR-based methods to detect E. coli O157:H7 inoculated in ground beef at 1.7 CFU/65 g. The culture-based systems detected more positive samples than did the PCR-based systems, but the detection times (21 to 48 h) were at least 9 h longer than those for the PCR-based methods (7.5 to 12 h). Ground beef samples were also spiked with potentially cross-reactive strains. The PCR-based systems that employed an immunomagnetic separation step prior to detection produced fewer false-positive results.

  6. Properties of permutation-based gene tests and controlling type 1 error using a summary statistic based gene test

    PubMed Central

    2013-01-01

    Background: The advent of genome-wide association studies has led to many novel disease-SNP associations, opening the door to focused study on their biological underpinnings. Because of the importance of analyzing these associations, numerous statistical methods have been devoted to them. However, fewer methods have attempted to associate entire genes or genomic regions with outcomes, which is potentially more useful knowledge from a biological perspective and those methods currently implemented are often permutation-based. Results: One property of some permutation-based tests is that their power varies as a function of whether significant markers are in regions of linkage disequilibrium (LD) or not, which we show from a theoretical perspective. We therefore develop two methods for quantifying the degree of association between a genomic region and outcome, both of whose power does not vary as a function of LD structure. One method uses dimension reduction to “filter” redundant information when significant LD exists in the region, while the other, called the summary-statistic test, controls for LD by scaling marker Z-statistics using knowledge of the correlation matrix of markers. An advantage of this latter test is that it does not require the original data, but only their Z-statistics from univariate regressions and an estimate of the correlation structure of markers, and we show how to modify the test to protect the type 1 error rate when the correlation structure of markers is misspecified. We apply these methods to sequence data of oral cleft and compare our results to previously proposed gene tests, in particular permutation-based ones. We evaluate the versatility of the modification of the summary-statistic test since the specification of correlation structure between markers can be inaccurate. Conclusion: We find a significant association in the sequence data between the 8q24 region and oral cleft using our dimension reduction approach and a borderline significant association using the summary-statistic based approach. We also implement the summary-statistic test using Z-statistics from an already-published GWAS of Chronic Obstructive Pulmonary Disorder (COPD) and correlation structure obtained from HapMap. We experiment with the modification of this test because the correlation structure is assumed imperfectly known. PMID:24199751

  7. Properties of permutation-based gene tests and controlling type 1 error using a summary statistic based gene test.

    PubMed

    Swanson, David M; Blacker, Deborah; Alchawa, Taofik; Ludwig, Kerstin U; Mangold, Elisabeth; Lange, Christoph

    2013-11-07

    The advent of genome-wide association studies has led to many novel disease-SNP associations, opening the door to focused study on their biological underpinnings. Because of the importance of analyzing these associations, numerous statistical methods have been devoted to them. However, fewer methods have attempted to associate entire genes or genomic regions with outcomes, which is potentially more useful knowledge from a biological perspective and those methods currently implemented are often permutation-based. One property of some permutation-based tests is that their power varies as a function of whether significant markers are in regions of linkage disequilibrium (LD) or not, which we show from a theoretical perspective. We therefore develop two methods for quantifying the degree of association between a genomic region and outcome, both of whose power does not vary as a function of LD structure. One method uses dimension reduction to "filter" redundant information when significant LD exists in the region, while the other, called the summary-statistic test, controls for LD by scaling marker Z-statistics using knowledge of the correlation matrix of markers. An advantage of this latter test is that it does not require the original data, but only their Z-statistics from univariate regressions and an estimate of the correlation structure of markers, and we show how to modify the test to protect the type 1 error rate when the correlation structure of markers is misspecified. We apply these methods to sequence data of oral cleft and compare our results to previously proposed gene tests, in particular permutation-based ones. We evaluate the versatility of the modification of the summary-statistic test since the specification of correlation structure between markers can be inaccurate. We find a significant association in the sequence data between the 8q24 region and oral cleft using our dimension reduction approach and a borderline significant association using the summary-statistic based approach. We also implement the summary-statistic test using Z-statistics from an already-published GWAS of Chronic Obstructive Pulmonary Disorder (COPD) and correlation structure obtained from HapMap. We experiment with the modification of this test because the correlation structure is assumed imperfectly known.
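    One generic way to build such a summary-statistic test is to combine per-marker Z-statistics while inflating the null variance by the LD (correlation) matrix; the sketch below shows that construction (a simplified stand-in for the authors' test, and the example correlation matrix is made up):

```python
import numpy as np
from scipy.stats import norm

def summary_stat_gene_test(z, R):
    """Combine per-marker Z-statistics into one gene-level test while
    accounting for LD: under H0, sum(z) has variance 1' R 1, where R is
    the marker correlation (LD) matrix, e.g., estimated from HapMap."""
    z = np.asarray(z, dtype=float)
    ones = np.ones_like(z)
    t = z.sum() / np.sqrt(ones @ R @ ones)
    return 2.0 * norm.sf(abs(t))            # two-sided p-value

# Toy example: three markers in moderate LD.
R = np.array([[1.0, 0.6, 0.2],
              [0.6, 1.0, 0.4],
              [0.2, 0.4, 1.0]])
print(summary_stat_gene_test([1.8, 2.1, 0.9], R))
```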

  8. HIV-1 protease cleavage site prediction based on two-stage feature selection method.

    PubMed

    Niu, Bing; Yuan, Xiao-Cheng; Roeper, Preston; Su, Qiang; Peng, Chun-Rong; Yin, Jing-Yuan; Ding, Juan; Li, HaiPeng; Lu, Wen-Cong

    2013-03-01

    Knowledge of the mechanism of HIV protease cleavage specificity is critical to the design of specific and effective HIV inhibitors. An accurate, robust, and rapid method to correctly predict the cleavage sites in proteins is crucial when searching for possible HIV inhibitors. In this article, HIV-1 protease specificity was studied using the correlation-based feature subset (CfsSubset) selection method combined with a genetic algorithm. Thirty important biochemical features were found based on a jackknife test from the original data set containing 4,248 features. Using the AdaBoost method with the thirty selected features, the prediction model yields an accuracy of 96.7% for the jackknife test and 92.1% for an independent set test, an increase in accuracy over the full feature set of 6.7% and 77.4%, respectively. Our feature selection scheme could be a useful technique for finding effective competitive inhibitors of HIV protease.

  9. A Statistical Discrimination Experiment for Eurasian Events Using a Twenty-Seven-Station Network

    DTIC Science & Technology

    1980-07-08

    The experiment was conducted to test the effectiveness of a multivariate method of analysis for distinguishing earthquakes from explosions. The data base for the experiment...

  10. Test method research on weakening interface strength of steel - concrete under cyclic loading

    NASA Astrophysics Data System (ADS)

    Liu, Ming-wei; Zhang, Fang-hua; Su, Guang-quan

    2018-02-01

    The mechanical properties of the steel-concrete interface under cyclic loading are key factors affecting the law of horizontal load transfer, the calculation of bearing capacity, and cumulative horizontal deformation. The cyclic shear test is an effective method for studying the strength reduction of the steel-concrete interface. A test system composed of a large repeated direct shear apparatus, a hydraulic servo system, a data acquisition system and test control software was independently designed, and a test method covering specimen preparation, instrument preparation and the loading procedure is put forward. The validity of the test method is verified with a set of test results. The test system, and the test method based on it, provide a reference for experimental studies of the mechanical properties of the steel-concrete interface.

  11. Creating IRT-Based Parallel Test Forms Using the Genetic Algorithm Method

    ERIC Educational Resources Information Center

    Sun, Koun-Tem; Chen, Yu-Jen; Tsai, Shu-Yen; Cheng, Chien-Fen

    2008-01-01

    In educational measurement, the construction of parallel test forms is often a combinatorial optimization problem that involves the time-consuming selection of items to construct tests having approximately the same test information functions (TIFs) and constraints. This article proposes a novel method, genetic algorithm (GA), to construct parallel…

  12. Accounting for Uncertainty in Decision Analytic Models Using Rank Preserving Structural Failure Time Modeling: Application to Parametric Survival Models.

    PubMed

    Bennett, Iain; Paracha, Noman; Abrams, Keith; Ray, Joshua

    2018-01-01

    Rank Preserving Structural Failure Time models are one of the most commonly used statistical methods to adjust for treatment switching in oncology clinical trials. The method is often applied in a decision analytic model without appropriately accounting for additional uncertainty when determining the allocation of health care resources. The aim of the study is to describe novel approaches to adequately account for uncertainty when using a Rank Preserving Structural Failure Time model in a decision analytic model. Using two examples, we tested and compared the performance of the novel test-based method with the resampling bootstrap method and with the conventional approach of no adjustment. In the first example, we simulated life expectancy using a simple decision analytic model based on a hypothetical oncology trial with treatment switching. In the second example, we applied the adjustment method to published data when no individual patient data were available. Mean estimates of overall and incremental life expectancy were similar across methods. However, the bootstrapped and test-based estimates consistently produced greater estimates of uncertainty than the estimate without any adjustment applied. Similar results were observed when using the test-based approach on published data, showing that failing to adjust for uncertainty led to smaller confidence intervals. Both the bootstrapping and test-based approaches provide a solution to appropriately incorporate uncertainty, with the benefit that the latter can be implemented by researchers in the absence of individual patient data.

  13. Comparing Distance vs. Campus-Based Delivery of Research Methods Courses

    ERIC Educational Resources Information Center

    Girod, Mark; Wojcikiewicz, Steve

    2009-01-01

    A causal-comparative pre-test, post-test design was used to investigate differences in learning in a research methods course for face-to-face and web-based delivery models. Analyses of participant achievement (N = 205) revealed almost no differences but post-hoc analyses revealed important differences in pedagogy between delivery models despite…

  14. Evaluation of selected static methods used to estimate element mobility, acid-generating and acid-neutralizing potentials associated with geologically diverse mining wastes

    USGS Publications Warehouse

    Hageman, Philip L.; Seal, Robert R.; Diehl, Sharon F.; Piatak, Nadine M.; Lowers, Heather

    2015-01-01

    A comparison study of selected static leaching and acid–base accounting (ABA) methods using a mineralogically diverse set of 12 modern-style, metal mine waste samples was undertaken to understand the relative performance of the various tests. To complement this study, in-depth mineralogical studies were conducted in order to elucidate the relationships between sample mineralogy, weathering features, and leachate and ABA characteristics. In part one of the study, splits of the samples were leached using six commonly used leaching tests including paste pH, the U.S. Geological Survey (USGS) Field Leach Test (FLT) (both 5-min and 18-h agitation), the U.S. Environmental Protection Agency (USEPA) Method 1312 SPLP (both leachate pH 4.2 and leachate pH 5.0), and the USEPA Method 1311 TCLP (leachate pH 4.9). Leachate geochemical trends were compared in order to assess differences, if any, produced by the various leaching procedures. Results showed that the FLT (5-min agitation) was just as effective as the 18-h leaching tests in revealing the leachate geochemical characteristics of the samples. Leaching results also showed that the TCLP leaching test produces inconsistent results when compared to results produced from the other leaching tests. In part two of the study, the ABA was determined on splits of the samples using both well-established traditional static testing methods and a relatively quick, simplified net acid–base accounting (NABA) procedure. Results showed that the traditional methods, while time consuming, provide the most in-depth data on both the acid-generating and acid-neutralizing tendencies of the samples. However, the simplified NABA method provided a relatively fast, effective estimation of the net acid–base account of the samples. Overall, this study showed that while most of the well-established methods are useful and effective, a simplified leaching test and the NABA acid–base accounting method give investigators fast, quantitative tools for obtaining rapid, reliable information about the leachability of metals and other constituents of concern, and about the acid-generating potential of metal mining waste.

  15. General Framework for Meta-analysis of Rare Variants in Sequencing Association Studies

    PubMed Central

    Lee, Seunggeun; Teslovich, Tanya M.; Boehnke, Michael; Lin, Xihong

    2013-01-01

    We propose a general statistical framework for meta-analysis of gene- or region-based multimarker rare variant association tests in sequencing association studies. In genome-wide association studies, single-marker meta-analysis has been widely used to increase statistical power by combining results via regression coefficients and standard errors from different studies. In analysis of rare variants in sequencing studies, region-based multimarker tests are often used to increase power. We propose meta-analysis methods for commonly used gene- or region-based rare variants tests, such as burden tests and variance component tests. Because estimation of regression coefficients of individual rare variants is often unstable or not feasible, the proposed method avoids this difficulty by calculating score statistics instead that only require fitting the null model for each study and then aggregating these score statistics across studies. Our proposed meta-analysis rare variant association tests are conducted based on study-specific summary statistics, specifically score statistics for each variant and between-variant covariance-type (linkage disequilibrium) relationship statistics for each gene or region. The proposed methods are able to incorporate different levels of heterogeneity of genetic effects across studies and are applicable to meta-analysis of multiple ancestry groups. We show that the proposed methods are essentially as powerful as joint analysis by directly pooling individual level genotype data. We conduct extensive simulations to evaluate the performance of our methods by varying levels of heterogeneity across studies, and we apply the proposed methods to meta-analysis of rare variant effects in a multicohort study of the genetics of blood lipid levels. PMID:23768515
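    In score-based meta-analysis of burden-type tests, each study contributes a score statistic and its variance, and these are pooled without needing individual-level genotypes. A minimal sketch of that aggregation (illustrative; the published framework also covers variance-component tests, variant weights, and between-study heterogeneity models):

```python
import numpy as np
from scipy.stats import norm

def meta_burden_test(scores, variances):
    """Meta-analysis burden test from study-level summary statistics:
    per-study score statistics U_k and their variances V_k are pooled
    as Z = sum(U_k) / sqrt(sum(V_k)), fitting only the null model in
    each study."""
    u = float(np.sum(scores))
    v = float(np.sum(variances))
    z = u / np.sqrt(v)
    return z, 2.0 * norm.sf(abs(z))          # Z and two-sided p-value

# Toy usage: summary statistics from three studies.
print(meta_burden_test([4.2, -1.0, 3.3], [5.0, 4.1, 6.2]))
```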

  16. Testing variance components by two jackknife methods

    USDA-ARS?s Scientific Manuscript database

    The jackknife method, a resampling technique, has been widely used for statistical tests for years. The pseudo-value based jackknife method (defined as the pseudo jackknife method) is commonly used to reduce the bias of an estimate; however, sometimes it can result in large variation for an estimate a...
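    The pseudo-value construction itself is short: with n observations, the i-th pseudo value is theta_i = n*theta_hat - (n-1)*theta_hat_(-i), and the pseudo values feed an approximate t-test. A minimal sketch (the choice of estimator and the toy data are arbitrary):

```python
import numpy as np

def pseudo_values(x, estimator):
    """Pseudo-value jackknife: theta_i = n*theta_hat - (n-1)*theta_(-i).
    The mean of the pseudo values is the bias-reduced estimate, and their
    spread gives a variance estimate usable in an approximate t-test."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    full = estimator(x)
    loo = np.array([estimator(np.delete(x, i)) for i in range(n)])
    return n * full - (n - 1) * loo

# Toy usage: bias-reduced variance estimate and its standard error.
x = np.random.default_rng(0).exponential(size=30)
pv = pseudo_values(x, np.var)
print(pv.mean(), pv.std(ddof=1) / np.sqrt(len(pv)))
```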

  17. COMPARISON OF TWO METHODS FOR DETECTION OF GIARDIA CYSTS AND CRYPTOSPORIDIUM OOCYSTS IN WATER

    EPA Science Inventory

    The steps of two immunofluorescent-antibody-based detection methods were evaluated for their efficiencies in detecting Giardia cysts and Cryptosporidium oocysts. The two methods evaluated were the American Society for Testing and Materials proposed test method for Giardia cysts a...

  18. Combination of statistical and physically based methods to assess shallow slide susceptibility at the basin scale

    NASA Astrophysics Data System (ADS)

    Oliveira, Sérgio C.; Zêzere, José L.; Lajas, Sara; Melo, Raquel

    2017-07-01

    Approaches used to assess shallow slide susceptibility at the basin scale are conceptually different depending on the use of statistical or physically based methods. The former are based on the assumption that the same causes are more likely to produce the same effects, whereas the latter are based on the comparison between forces which tend to promote movement along the slope and the counteracting forces that are resistant to motion. Within this general framework, this work tests two hypotheses: (i) although conceptually and methodologically distinct, the statistical and deterministic methods generate similar shallow slide susceptibility results regarding the model's predictive capacity and spatial agreement; and (ii) the combination of shallow slide susceptibility maps obtained with statistical and physically based methods, for the same study area, generate a more reliable susceptibility model for shallow slide occurrence. These hypotheses were tested at a small test site (13.9 km2) located north of Lisbon (Portugal), using a statistical method (the information value method, IV) and a physically based method (the infinite slope method, IS). The landslide susceptibility maps produced with the statistical and deterministic methods were combined into a new landslide susceptibility map. The latter was based on a set of integration rules defined by the cross tabulation of the susceptibility classes of both maps and analysis of the corresponding contingency tables. The results demonstrate a higher predictive capacity of the new shallow slide susceptibility map, which combines the independent results obtained with statistical and physically based models. Moreover, the combination of the two models allowed the identification of areas where the results of the information value and the infinite slope methods are contradictory. Thus, these areas were classified as uncertain and deserve additional investigation at a more detailed scale.
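    The physically based side of such a comparison typically rests on the infinite slope factor of safety; a minimal sketch of the standard formula (the parameter values are illustrative assumptions, not data from the Lisbon test site):

```python
import math

def infinite_slope_fs(c, phi_deg, gamma, gamma_w, z, beta_deg, m):
    """Factor of safety for the infinite slope model:
    FS = [c' + (gamma - m*gamma_w) * z * cos^2(beta) * tan(phi')]
         / (gamma * z * sin(beta) * cos(beta))
    c: effective cohesion (kPa); phi_deg: friction angle (deg);
    gamma: soil unit weight (kN/m^3); gamma_w: water unit weight;
    z: failure depth (m); beta_deg: slope angle (deg);
    m: saturated fraction of the soil column (0..1)."""
    beta = math.radians(beta_deg)
    phi = math.radians(phi_deg)
    num = c + (gamma - m * gamma_w) * z * math.cos(beta)**2 * math.tan(phi)
    den = gamma * z * math.sin(beta) * math.cos(beta)
    return num / den

# Illustrative parameters: FS > 1 indicates stability in this model.
print(round(infinite_slope_fs(5.0, 30.0, 18.0, 9.81, 1.5, 25.0, 0.5), 2))
```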

  19. A GPS-Based Pitot-Static Calibration Method Using Global Output-Error Optimization

    NASA Technical Reports Server (NTRS)

    Foster, John V.; Cunningham, Kevin

    2010-01-01

    Pressure-based airspeed and altitude measurements for aircraft typically require calibration of the installed system to account for pressure sensing errors such as those due to local flow field effects. In some cases, calibration is used to meet requirements such as those specified in Federal Aviation Regulation Part 25. Several methods are used for in-flight pitot-static calibration, including tower fly-by, pacer aircraft, and trailing cone methods. In the 1990s, the introduction of satellite-based positioning systems to the civilian market enabled new in-flight calibration methods based on accurate ground speed measurements provided by Global Positioning Systems (GPS). Use of GPS for airspeed calibration has many advantages such as accuracy, ease of portability (e.g. hand-held) and the flexibility of operating in airspace without the limitations of test range boundaries or ground telemetry support. The current research was motivated by the need for a rapid and statistically accurate method for in-flight calibration of pitot-static systems for remotely piloted, dynamically-scaled research aircraft. Current calibration methods were deemed not practical for this application because of confined test range size and limited flight time available for each sortie. A method was developed that uses high data rate measurements of static and total pressure, and GPS-based ground speed measurements, to compute the pressure errors over a range of airspeed. The novel application of this approach is the use of system identification methods that rapidly compute optimal pressure error models with defined confidence intervals in near-real time. This method has been demonstrated in flight tests and has shown 2-σ bounds of approximately 0.2 kts with an order of magnitude reduction in test time over other methods. As part of this experiment, a unique database of wind measurements was acquired concurrently with the flight experiments, for the purpose of experimental validation of the optimization method. This paper describes the GPS-based pitot-static calibration method developed for the AirSTAR research test-bed operated as part of the Integrated Resilient Aircraft Controls (IRAC) project in the NASA Aviation Safety Program (AvSP). A description of the method will be provided and results from recent flight tests will be shown to illustrate the performance and advantages of this approach. Discussion of maneuver requirements and data reduction will be included as well as potential applications.

  20. Efficient Blockwise Permutation Tests Preserving Exchangeability

    PubMed Central

    Zhou, Chunxiao; Zwilling, Chris E.; Calhoun, Vince D.; Wang, Michelle Y.

    2014-01-01

    In this paper, we present a new blockwise permutation test approach based on the moments of the test statistic. The method is of importance to neuroimaging studies. In order to preserve the exchangeability condition required in permutation tests, we divide the entire set of data into certain exchangeability blocks. In addition, computationally efficient moments-based permutation tests are performed by approximating the permutation distribution of the test statistic with the Pearson distribution series. This involves the calculation of the first four moments of the permutation distribution within each block and then over the entire set of data. The accuracy and efficiency of the proposed method are demonstrated through a simulated experiment on magnetic resonance imaging (MRI) brain data, specifically a multi-site voxel-based morphometry analysis from structural MRI (sMRI). PMID:25289113
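
    A minimal sketch of the moment-matching idea follows, with two simplifications loudly flagged: the permutation-distribution moments are estimated from sampled within-block permutations rather than derived analytically, and only three moments are matched using scipy's Pearson type III family instead of the paper's four-moment Pearson series.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)

        def blockwise_moment_pvalue(values, labels, blocks, n_perm=2000):
            """Approximate permutation p-value for a two-group mean difference,
            permuting labels only within exchangeability blocks, then matching
            the permutation distribution's first three moments with a Pearson
            type III curve (a simplified stand-in for the full Pearson family)."""
            def stat(lab):
                return values[lab == 1].mean() - values[lab == 0].mean()
            obs = stat(labels)
            perm_stats = np.empty(n_perm)
            for i in range(n_perm):
                lab = labels.copy()
                for b in np.unique(blocks):
                    idx = np.flatnonzero(blocks == b)
                    lab[idx] = lab[rng.permutation(idx)]  # within-block shuffle
                perm_stats[i] = stat(lab)
            mu, sd = perm_stats.mean(), perm_stats.std(ddof=1)
            return stats.pearson3.sf(obs, stats.skew(perm_stats), loc=mu, scale=sd)

        values = np.r_[rng.normal(0.5, 1, 30), rng.normal(0.0, 1, 30)]
        labels = np.r_[np.ones(30, int), np.zeros(30, int)]
        blocks = np.tile(np.repeat([0, 1, 2], 10), 2)  # e.g., three scanner sites
        print(blockwise_moment_pvalue(values, labels, blocks))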

  1. On-orbit characterization of hyperspectral imagers

    NASA Astrophysics Data System (ADS)

    McCorkel, Joel

    The Remote Sensing Group (RSG) at the University of Arizona has a long history of using ground-based test sites for the calibration of airborne- and satellite-based sensors. Ground-truth measurements at these test sites are not always successful, however, due to weather and funding availability. Therefore, RSG has also employed automated ground instrument approaches and cross-calibration methods to verify the radiometric calibration of a sensor. The goal in the cross-calibration method is to transfer the calibration of a well-known sensor to that of a different sensor. This dissertation presents a method for determining the radiometric calibration of a hyperspectral imager using multispectral imagery. The work relies on a multispectral sensor, the Moderate-resolution Imaging Spectroradiometer (MODIS), as a reference for the hyperspectral sensor Hyperion. Test sites used for comparisons are Railroad Valley in Nevada and a portion of the Libyan Desert in North Africa. A method to predict hyperspectral surface reflectance using a combination of MODIS data and spectral shape information is developed and applied for the characterization of Hyperion. Spectral shape information is based on RSG's historical in situ data for the Railroad Valley test site and spectral library data for the Libyan test site. Average atmospheric parameters, also based on historical measurements, are used in reflectance prediction and transfer to space. Results of several cross-calibration scenarios that differ in image acquisition coincidence, test site, and reference sensor are found for the characterization of Hyperion. These are compared with results from the reflectance-based approach of vicarious calibration, a well-documented method developed by the RSG that serves as a baseline of calibration performance for the cross-calibration method developed here. Cross-calibration provides results that are within 2% of the reflectance-based results in most spectral regions. Larger disagreements exist for the shorter wavelengths studied in this work as well as in spectral regions that experience absorption by the atmosphere.
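
    The core band-averaging operation behind this kind of cross-calibration can be sketched briefly: a hyperspectral reflectance curve is weighted by a multispectral band's relative spectral response (RSR) to predict the band-equivalent value the reference sensor would measure. The spectrum and RSR below are synthetic placeholders, not MODIS or Hyperion data.

        import numpy as np

        # Convolve a hyperspectral reflectance curve with a multispectral band's
        # relative spectral response (RSR) to get the band-equivalent value.
        wl = np.arange(400, 901, 10.0)                 # nm, hyperspectral grid
        refl = 0.2 + 0.1 * np.sin(wl / 80.0)           # made-up surface reflectance
        rsr = np.exp(-0.5 * ((wl - 645.0) / 20.0)**2)  # notional red-band RSR

        band_refl = np.trapz(refl * rsr, wl) / np.trapz(rsr, wl)
        print(f"band-equivalent reflectance: {band_refl:.4f}")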

  2. Usability Methods for Ensuring Health Information Technology Safety: Evidence-Based Approaches. Contribution of the IMIA Working Group Health Informatics for Patient Safety.

    PubMed

    Borycki, E; Kushniruk, A; Nohr, C; Takeda, H; Kuwata, S; Carvalho, C; Bainbridge, M; Kannry, J

    2013-01-01

    Issues related to lack of system usability and potential safety hazards continue to be reported in the health information technology (HIT) literature. Usability engineering methods are increasingly used to ensure improved system usability and they are also beginning to be applied more widely for ensuring the safety of HIT applications. These methods are being used in the design and implementation of many HIT systems. In this paper we describe evidence-based approaches to applying usability engineering methods. A multi-phased approach to ensuring system usability and safety in healthcare is described. Usability inspection methods are first described including the development of evidence-based safety heuristics for HIT. Laboratory-based usability testing is then conducted under artificial conditions to test if a system has any base level usability problems that need to be corrected. Usability problems that are detected are corrected and then a new phase is initiated where the system is tested under more realistic conditions using clinical simulations. This phase may involve testing the system with simulated patients. Finally, an additional phase may be conducted, involving a naturalistic study of system use under real-world clinical conditions. The methods described have been employed in the analysis of the usability and safety of a wide range of HIT applications, including electronic health record systems, decision support systems and consumer health applications. It has been found that at least usability inspection and usability testing should be applied prior to the widespread release of HIT. However, wherever possible, additional layers of testing involving clinical simulations and a naturalistic evaluation will likely detect usability and safety issues that may not otherwise be detected prior to widespread system release. The framework presented in the paper can be applied in order to develop more usable and safer HIT, based on multiple layers of evidence.

  3. Computational Pollutant Environment Assessment from Propulsion-System Testing

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See; McConnaughey, Paul; Chen, Yen-Sen; Warsi, Saif

    1996-01-01

    An asymptotic plume growth method based on a time-accurate three-dimensional computational fluid dynamics formulation has been developed to assess the exhaust-plume pollutant environment from a simulated RD-170 engine hot-fire test on the F1 Test Stand at Marshall Space Flight Center. Researchers have long known that rocket-engine hot firing has the potential for forming thermal nitric oxides, as well as producing carbon monoxide when hydrocarbon fuels are used. Because of the complex physics involved, most attempts to predict the pollutant emissions from ground-based engine testing have used simplified methods, which may grossly under- or overpredict pollutant formation in a test environment. The objective of this work has been to develop a computational fluid dynamics-based methodology that replicates the underlying test-stand flow physics to accurately and efficiently assess pollutant emissions from ground-based rocket-engine testing. A nominal RD-170 engine hot-fire test was computed, and pertinent test-stand flow physics was captured. The predicted total emission rates compared reasonably well with those of the existing hydrocarbon engine hot-firing test data.

  4. Standard Error Estimation of 3PL IRT True Score Equating with an MCMC Method

    ERIC Educational Resources Information Center

    Liu, Yuming; Schulz, E. Matthew; Yu, Lei

    2008-01-01

    A Markov chain Monte Carlo (MCMC) method and a bootstrap method were compared in the estimation of standard errors of item response theory (IRT) true score equating. Three test form relationships were examined: parallel, tau-equivalent, and congeneric. Data were simulated based on Reading Comprehension and Vocabulary tests of the Iowa Tests of…

  5. Comparison between the Standardized Clinical and Laboratory Standards Institute M38-A2 Method and a 2,3-Bis(2-Methoxy-4-Nitro-5-[(Sulphenylamino)Carbonyl]-2H-Tetrazolium Hydroxide-Based Method for Testing Antifungal Susceptibility of Dermatophytes

    PubMed Central

    Shehata, Atef S.; Mukherjee, Pranab K.; Ghannoum, Mahmoud A.

    2008-01-01

    In this study, we determined the utility of a 2,3-bis(2-methoxy-4-nitro-5-[(sulfenylamino)carbonyl]-2H-tetrazolium hydroxide (XTT)-based assay for determining antifungal susceptibilities of dermatophytes to terbinafine, ciclopirox, and voriconazole in comparison to the Clinical and Laboratory Standards Institute (CLSI) M38-A2 method. Forty-eight dermatophyte isolates, including Trichophyton rubrum (n = 15), Trichophyton mentagrophytes (n = 7), Trichophyton tonsurans (n = 11), and Epidermophyton floccosum (n = 13), and two quality control strains, were tested. In the XTT-based method, MICs were determined spectrophotometrically at 490 nm after addition of XTT and menadione. For the CLSI method, the MICs were determined visually. With T. rubrum, the XTT assay revealed MIC ranges of 0.004 to >64 μg/ml, 0.125 to 0.25 μg/ml, and 0.008 to 0.025 μg/ml for terbinafine, ciclopirox, and voriconazole, respectively. Similar MIC ranges were obtained against T. rubrum by using the CLSI method. Additionally, when tested with T. mentagrophytes, T. tonsurans, and E. floccosum isolates, the XTT and CLSI methods resulted in comparable MIC ranges. Both methods revealed similar lowest drug concentrations that inhibited 90% of the isolates for the majority of tested drug-dermatophyte combinations. The levels of agreement within 1 dilution between both methods were as follows: 100% with terbinafine, 97.8% with ciclopirox, and 89.1% with voriconazole. However, the agreement within 2 dilutions between these two methods was 100% for all tested drugs. Our results revealed that the XTT assay can be a useful tool for antifungal susceptibility testing of dermatophytes. PMID:18832129
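
    The agreement statistics quoted above reduce to counting paired MICs that lie within k two-fold dilution steps of each other; a minimal sketch with hypothetical MIC pairs:

        import numpy as np

        # Hypothetical paired MICs (ug/ml) from the XTT and CLSI reference methods.
        # Agreement within k dilutions means |log2(xtt) - log2(clsi)| <= k.
        xtt  = np.array([0.008, 0.016, 0.25, 0.125, 0.06])
        clsi = np.array([0.008, 0.008, 0.125, 0.125, 0.25])

        steps = np.abs(np.log2(xtt) - np.log2(clsi))
        for k in (1, 2):
            pct = 100 * np.mean(steps <= k + 1e-9)
            print(f"agreement within {k} dilution(s): {pct:.1f}%")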

  6. Recommendations for Developing Alternative Test Methods for Developmental Neurotoxicity

    EPA Science Inventory

    There is great interest in developing alternative methods for developmental neurotoxicity testing (DNT) that are cost-efficient, use fewer animals and are based on current scientific knowledge of the developing nervous system. Alternative methods will require demonstration of the...

  7. Development of FEB Test Platform for ATLAS New Small Wheel Upgrade

    NASA Astrophysics Data System (ADS)

    Lu, Houbing; Hu, Kun; Wang, Xu; Li, Feng; Han, Liang; Jin, Ge

    2016-10-01

    The concept of this test platform is based on the test requirements of the front-end board (FEB) developed for the phase I upgrade of the small Thin Gap Chamber (sTGC) detector on the New Small Wheel (NSW) of ATLAS. The front-end electronics system of the sTGC consists of 1,536 FEBs with about 322,000 readout channels for strips, wires and pads in total. A test platform for FEBs with up to 256 channels has been designed to keep the testing efficiency at a controllable level. We present the circuit model architecture of the platform, and its functions and implementation as well. The firmware based on a Field Programmable Gate Array (FPGA) and the software based on a PC have been developed, and basic test methods have been established. FEB readout measurements have been performed with analog injection from the test platform, which will provide a fast and efficient test method for the production of FEBs.

  8. Threshold Assessment of Gear Diagnostic Tools on Flight and Test Rig Data

    NASA Technical Reports Server (NTRS)

    Dempsey, Paula J.; Mosher, Marianne; Huff, Edward M.

    2003-01-01

    A method for defining thresholds for vibration-based algorithms that provides the minimum number of false alarms while maintaining sensitivity to gear damage was developed. This analysis focused on two vibration-based gear damage detection algorithms, FM4 and MSA. The method was developed using vibration data collected during surface fatigue tests performed in a spur gearbox rig. The thresholds were defined based on damage progression during tests with damage. The thresholds' false alarm rates were then evaluated on spur gear tests without damage. Next, the same thresholds were applied to flight data from an OH-58 helicopter transmission. Results showed that thresholds defined in test rigs can be used to define thresholds in flight to correctly classify the transmission operation as normal.
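
    FM4 is a published gear-damage metric (the normalized kurtosis of the difference signal), so its computation can be sketched directly; the alarm threshold below is an invented illustration, not the value derived in the paper.

        import numpy as np

        def fm4(difference_signal):
            """FM4 gear-damage metric: normalized kurtosis of the difference
            signal (vibration after removing regular meshing components).
            FM4 is ~3 for an undamaged gear and rises with localized damage."""
            d = difference_signal - np.mean(difference_signal)
            n = len(d)
            return n * np.sum(d**4) / np.sum(d**2)**2

        rng = np.random.default_rng(1)
        healthy = rng.normal(size=4096)   # stand-in for a difference signal
        THRESHOLD = 4.5                   # illustrative value, not from the paper
        print(fm4(healthy))               # ~3.0 for Gaussian vibration
        print(fm4(healthy) > THRESHOLD)   # alarm decision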

  9. Review on pen-and-paper-based observational methods for assessing ergonomic risk factors of computer work.

    PubMed

    Rahman, Mohd Nasrull Abdol; Mohamad, Siti Shafika

    2017-01-01

    Computer work is associated with musculoskeletal disorders (MSDs). Several methods have been developed to assess computer work risk factors related to MSDs. This review aims to give an overview of current pen-and-paper-based observational techniques for assessing ergonomic risk factors of computer work. We searched an electronic database for materials from 1992 until 2015. The selected methods focused on computer work, pen-and-paper observational methods, office risk factors and musculoskeletal disorders. This review assessed the risk factors covered, and the reliability and validity, of pen-and-paper observational methods associated with computer work. Two evaluators independently carried out this review. Seven observational methods used to assess exposure to office risk factors for work-related musculoskeletal disorders were identified. The risk factors covered by current pen-and-paper-based observational tools were postures, office components, force and repetition. Of the seven methods, only five had been tested for reliability; they were proven to be reliable and were rated as moderate to good. For validity, only four of the seven methods were tested, and the results were moderate. Many observational tools already exist, but no single tool appears to cover all of the risk factors, including working posture, office components, force, repetition and office environment, at office workstations and computer work. Although the most important factor in developing a tool is proper validation of exposure assessment techniques, several existing observational methods have not been tested for reliability and validity. Furthermore, this review could provide researchers with ways to improve pen-and-paper-based observational methods for assessing ergonomic risk factors of computer work.

  10. Theory and practice of conventional adventitious virus testing.

    PubMed

    Gregersen, Jens-Peter

    2011-01-01

    For decades conventional tests in cell cultures and in laboratory animals have served as standard methods for broad-spectrum screening for adventitious viruses. New virus detection methods based on molecular biology have broadened and improved our knowledge about potential contaminating viruses and about the suitability of the conventional test methods. This paper summarizes and discusses practical aspects of conventional test schemes, such as detectability of various viruses, questionable or false-positive results, animal numbers needed, time and cost of testing, and applicability for rapidly changing starting materials. Strategies to improve the virus safety of biological medicinal products are proposed. The strategies should be based upon a flexible application of existing and new methods along with a scientifically based risk assessment. However, testing alone does not guarantee the absence of adventitious agents and must be accompanied by virus removing or virus inactivating process steps for critical starting materials, raw materials, and for the drug product.

  11. Identification of characteristic frequencies of damaged railway tracks using field hammer test measurements

    NASA Astrophysics Data System (ADS)

    Oregui, M.; Li, Z.; Dollevoet, R.

    2015-03-01

    In this paper, the feasibility of the Frequency Response Function (FRF)-based statistical method to identify the characteristic frequencies of railway track defects is studied. The method compares a damaged track state to a healthy state based on non-destructive field hammer test measurements. First, a study is carried out to investigate the repeatability of hammer tests in railway tracks. By changing the excitation and measurement locations it is shown that the variability introduced by the test process is negligible. Second, following the concepts of control charts employed in process monitoring, a method to define an approximate healthy state is introduced by using hammer test measurements at locations without visual damage. Then, the feasibility study includes an investigation into squats (i.e. a major type of rail surface defect) of varying severity. The identified frequency ranges related to squats agree with those found in an extensively validated vehicle-borne detection system. Therefore, the FRF-based statistical method in combination with the non-destructive hammer test measurements has the potential to be employed to identify the characteristic frequencies of damaged conditions in railway tracks in the frequency range of 300-3000 Hz.
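
    A minimal sketch of the FRF estimation underlying such hammer tests, using the standard H1 estimator (cross-spectrum of force and response over the force auto-spectrum); the signals below are synthetic stand-ins for hammer force and track response records, not the paper's measurements.

        import numpy as np
        from scipy import signal

        fs = 10_000                      # Hz, hypothetical sampling rate
        rng = np.random.default_rng(2)
        x = rng.normal(size=fs)          # stand-in for the hammer force record
        y = np.convolve(x, np.exp(-np.arange(200) / 30.0), mode="same")  # fake response

        # H1(f) = Sxy(f) / Sxx(f), averaged over segments via Welch's method.
        f, Sxy = signal.csd(x, y, fs=fs, nperseg=1024)
        _, Sxx = signal.welch(x, fs=fs, nperseg=1024)
        H1 = Sxy / Sxx

        band = (f >= 300) & (f <= 3000)  # frequency range studied in the paper
        print(np.abs(H1[band]).max())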

  12. On-line Monitoring Device for High-voltage Switch Cabinet Partial Discharge Based on Pulse Current Method

    NASA Astrophysics Data System (ADS)

    Y Tao, S.; Zhang, X. Z.; Cai, H. W.; Li, P.; Feng, Y.; Zhang, T. C.; Li, J.; Wang, W. S.; Zhang, X. K.

    2017-12-01

    The pulse current method for partial discharge detection is generally applied in type testing and other off-line tests of electrical equipment at delivery. After intensive analysis of the present situation and existing problems of partial discharge detection in switch cabinets, this paper designed the circuit principle and signal extraction method for partial discharge on-line detection based on a high-voltage presence indicating system (VPIS), established a high-voltage switch cabinet partial discharge on-line detection circuit based on the pulse current method, developed background software integrating real-time monitoring, judging and analyzing functions, carried out a real discharge simulation test on a real-type partial discharge defect simulation platform of a 10 kV switch cabinet, and verified the sensitivity and validity of the high-voltage switch cabinet partial discharge on-line monitoring device based on the pulse current method. The study presented in this paper is of great significance for switch cabinet maintenance and for theoretical study of pulse current on-line detection, and it provides a good implementation method for partial discharge on-line monitoring devices for 10 kV distribution network equipment.

  13. A new IRT-based standard setting method: application to eCat-listening.

    PubMed

    García, Pablo Eduardo; Abad, Francisco José; Olea, Julio; Aguado, David

    2013-01-01

    Criterion-referenced interpretations of tests are highly necessary, which usually involves the difficult task of establishing cut scores. Contrasting with other Item Response Theory (IRT)-based standard setting methods, a non-judgmental approach is proposed in this study, in which Item Characteristic Curve (ICC) transformations lead to the final cut scores. eCat-Listening, a computerized adaptive test for the evaluation of English Listening, was administered to 1,576 participants, and the proposed standard setting method was applied to classify them into the performance standards of the Common European Framework of Reference for Languages (CEFR). The results showed a classification closely related to relevant external measures of the English language domain, according to the CEFR. It is concluded that the proposed method is a practical and valid standard setting alternative for IRT-based test interpretations.

  14. A degradation-based sorting method for lithium-ion battery reuse

    PubMed Central

    Chen, Hao

    2017-01-01

    In a world where millions of people are dependent on batteries to provide them with convenient and portable energy, battery recycling is of the utmost importance. In this paper, we developed a new method to sort 18650 lithium-ion batteries in large quantities and in real time for harvesting used cells with enough capacity for battery reuse. Internal resistance and capacity tests were conducted as a basis for comparison with a novel degradation-based method using X-ray radiographic scanning and digital image contrast computation. The test results indicate that the sorting accuracy for the test cells is about 79% and the execution time of our algorithm is at the level of 200 milliseconds, making our method a potential real-time solution for reusing the remaining capacity in good used cells. PMID:29023485
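
    The paper's exact contrast metric is not reproduced here; the sketch below uses the RMS contrast of a radiograph region as one plausible "digital image contrast computation" and sorts cells against an illustrative threshold.

        import numpy as np

        def rms_contrast(roi):
            """RMS contrast of a radiograph region of interest (one plausible
            contrast measure; the paper's exact metric may differ)."""
            roi = roi.astype(float)
            return roi.std() / roi.mean()

        # Hypothetical sorting: keep cells whose radiographic contrast stays
        # below a threshold calibrated against cells of known capacity.
        rng = np.random.default_rng(3)
        cells = [rng.normal(120, s, size=(64, 64)) for s in (5.0, 9.0, 14.0)]
        THRESHOLD = 0.09  # illustrative, set from reference cells
        for i, img in enumerate(cells):
            c = rms_contrast(img)
            print(f"cell {i}: contrast={c:.3f} -> {'reuse' if c < THRESHOLD else 'recycle'}")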

  15. The special case of the 2 × 2 table: asymptotic unconditional McNemar test can be used to estimate sample size even for analysis based on GEE.

    PubMed

    Borkhoff, Cornelia M; Johnston, Patrick R; Stephens, Derek; Atenafu, Eshetu

    2015-07-01

    Aligning the method used to estimate sample size with the planned analytic method ensures the sample size needed to achieve the planned power. When using generalized estimating equations (GEE) to analyze a paired binary primary outcome with no covariates, many use an exact McNemar test to calculate sample size. We reviewed the approaches to sample size estimation for paired binary data and compared the sample size estimates on the same numerical examples. We used the hypothesized sample proportions for the 2 × 2 table to calculate the correlation between the marginal proportions to estimate sample size based on GEE. We solved the inside proportions based on the correlation and the marginal proportions to estimate sample size based on the exact McNemar, asymptotic unconditional McNemar, and asymptotic conditional McNemar tests. The asymptotic unconditional McNemar test is a good approximation of the GEE method by Pan. The exact McNemar test is too conservative and yields unnecessarily large sample size estimates compared with all other methods. In the special case of a 2 × 2 table, even when a GEE approach to binary logistic regression is the planned analytic method, the asymptotic unconditional McNemar test can be used to estimate sample size. We do not recommend using an exact McNemar test. Copyright © 2015 Elsevier Inc. All rights reserved.
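
    For reference, the asymptotic unconditional McNemar sample-size calculation recommended above can be sketched from the hypothesized discordant proportions of the 2 × 2 table; the proportions in the example are arbitrary, and this is the standard large-sample formula rather than the paper's worked example.

        from scipy.stats import norm

        def mcnemar_n(p10, p01, alpha=0.05, power=0.80):
            """Asymptotic unconditional McNemar sample size for paired binary
            data; p10 and p01 are the hypothesized discordant proportions."""
            za, zb = norm.ppf(1 - alpha / 2), norm.ppf(power)
            pd, diff = p10 + p01, p10 - p01
            n = (za * pd**0.5 + zb * (pd - diff**2)**0.5)**2 / diff**2
            return int(n) + 1

        print(mcnemar_n(0.25, 0.10))   # pairs needed for these discordant rates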

  16. Two Methods of Automatic Evaluation of Speech Signal Enhancement Recorded in the Open-Air MRI Environment

    NASA Astrophysics Data System (ADS)

    Přibil, Jiří; Přibilová, Anna; Frollo, Ivan

    2017-12-01

    The paper focuses on two methods for evaluating the success of enhancement of speech signals recorded in the open-air magnetic resonance imager during phonation for 3D human vocal tract modeling. The first approach enables a comparison based on statistical analysis by ANOVA and hypothesis tests. The second method is based on classification by Gaussian mixture models (GMM). The performed experiments confirmed that the proposed ANOVA and GMM classifiers for automatic evaluation of speech quality are functional and produce results fully comparable with the standard evaluation based on the listening test method.
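
    A minimal sketch of the GMM classification idea: fit one Gaussian mixture per condition on feature vectors and label a recording by the higher average log-likelihood. The features here are random stand-ins, not the spectral features used in the paper.

        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(4)
        feats_noisy = rng.normal(0.0, 1.0, size=(200, 12))   # stand-in features
        feats_clean = rng.normal(1.5, 1.0, size=(200, 12))

        # One GMM per class, e.g. "noisy" vs "enhanced" recordings.
        gmm_noisy = GaussianMixture(n_components=4, random_state=0).fit(feats_noisy)
        gmm_clean = GaussianMixture(n_components=4, random_state=0).fit(feats_clean)

        test = rng.normal(1.5, 1.0, size=(20, 12))           # one test utterance
        label = "enhanced" if gmm_clean.score(test) > gmm_noisy.score(test) else "noisy"
        print(label)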

  17. A Residual Mass Ballistic Testing Method to Compare Armor Materials or Components (Residual Mass Ballistic Testing Method)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Benjamin Langhorst; Thomas M Lillo; Henry S Chu

    2014-05-01

    A statistics-based ballistic test method is presented for use when comparing multiple groups of test articles of unknown relative ballistic perforation resistance. The method is intended to be more efficient than many traditional methods for research and development testing. To establish the validity of the method, it is employed in this study to compare test groups of known relative ballistic performance. Multiple groups of test articles were perforated using consistent projectiles and impact conditions. Test groups were made of rolled homogeneous armor (RHA) plates and differed in thickness. After perforation, each residual projectile was captured behind the target and its mass was measured. The residual masses measured for each test group were analyzed to provide ballistic performance rankings with associated confidence levels. When compared to traditional V50 methods, the residual mass (RM) method was found to require fewer test events and be more tolerant of variations in impact conditions.
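
    The ranking step can be illustrated with a two-group comparison of residual masses; a Welch t-test is one simple way to attach a confidence statement (the paper's statistical procedure may differ, and the masses below are invented).

        import numpy as np
        from scipy import stats

        # Residual projectile masses (g) captured behind two armor test groups;
        # a lower residual mass implies the target removed more projectile mass.
        group_a = np.array([11.2, 10.8, 11.5, 10.9, 11.1])
        group_b = np.array([9.9, 10.2, 9.7, 10.4, 10.0])

        t, p = stats.ttest_ind(group_a, group_b, equal_var=False)
        print(f"t={t:.2f}, p={p:.4f} -> rank B above A if p is small and B's mean is lower")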

  18. Alternative methods of flexible base compaction acceptance.

    DOT National Transportation Integrated Search

    2012-11-01

    "This report presents the results from the second year of research work investigating issues with flexible base acceptance testing within the Texas Department of Transportation. This second year of work focused on shadow testing non-density-based acc...

  19. Comparative evaluation of two quantitative test methods for determining the efficacy of liquid sporicides and sterilants on a hard surface: a precollaborative study.

    PubMed

    Tomasino, Stephen F; Hamilton, Martin A

    2007-01-01

    Two quantitative carrier-based test methods for determining the efficacy of liquid sporicides and sterilants on a hard surface, the Standard Quantitative Carrier Test Method-ASTM E 2111-00 and an adaptation of a quantitative micro-method as reported by Sagripanti and Bonifacino, were compared in this study. The methods were selected based on their desirable characteristics (e.g., well-developed protocol, previous use with spores, fully quantitative, and use of readily available equipment) for testing liquid sporicides and sterilants on a hard surface. In this paper, the Sagripanti-Bonifacino procedure is referred to as the Three Step Method (TSM). AOAC Official Method 966.04 was included in this study as a reference method. Three laboratories participated in the evaluation. Three chemical treatments were tested: (1) 3000 ppm sodium hypochlorite with pH adjusted to 7.0, (2) a hydrogen peroxide/peroxyacetic acid product, and (3) 3000 ppm sodium hypochlorite with pH unadjusted (pH of approximately 10.0). A fourth treatment, 6000 ppm sodium hypochlorite solution with pH adjusted to 7.0, was included only for Method 966.04 as a positive control (high level of efficacy). The contact time was 10 min for all chemical treatments except the 6000 ppm sodium hypochlorite treatment which was tested at 30 min. Each chemical treatment was tested 3 times using each of the methods. Only 2 of the laboratories performed the AOAC method. Method performance was assessed by the within-laboratory variance, between-laboratory variance, and total variance associated with the log reduction (LR) estimates generated by each quantitative method. The quantitative methods performed similarly, and the LR values generated by each method were not statistically different for the 3 treatments evaluated. Based on feedback from the participating laboratories, compared to the TSM, ASTM E 2111-00 was more resource demanding and required more set-up time. The logistical and resource concerns identified for ASTM E 2111-00 were largely associated with the filtration process and counting bacterial colonies on filters. Thus, the TSM was determined to be the most suitable method.

  20. Finding differentially expressed genes in high dimensional data: Rank based test statistic via a distance measure.

    PubMed

    Mathur, Sunil; Sadana, Ajit

    2015-12-01

    We present a rank-based test statistic for the identification of differentially expressed genes using a distance measure. The proposed test statistic is highly robust against extreme values and does not assume a particular parent population distribution. Simulation studies show that the proposed test is more powerful than some of the commonly used methods, such as the paired t-test, the Wilcoxon signed rank test, and significance analysis of microarrays (SAM), under certain non-normal distributions. The asymptotic distribution of the test statistic and the p-value function are discussed. The application of the proposed method is shown using a real-life data set. © The Author(s) 2011.

  1. Data entry errors and design for model-based tight glycemic control in critical care.

    PubMed

    Ward, Logan; Steel, James; Le Compte, Aaron; Evans, Alicia; Tan, Chia-Siong; Penning, Sophie; Shaw, Geoffrey M; Desaive, Thomas; Chase, J Geoffrey

    2012-01-01

    Tight glycemic control (TGC) has shown benefits but has been difficult to achieve consistently. Model-based methods and computerized protocols offer the opportunity to improve TGC quality but require human data entry, particularly of blood glucose (BG) values, which can be significantly prone to error. This study presents the design and optimization of data entry methods to minimize error for a computerized and model-based TGC method prior to pilot clinical trials. To minimize data entry error, two tests were carried out to optimize a method with errors less than the 5%-plus reported in other studies. Four initial methods were tested on 40 subjects in random order, and the best two were tested more rigorously on 34 subjects. The tests measured entry speed and accuracy. Errors were reported as corrected and uncorrected errors, with the sum comprising a total error rate. The first set of tests used randomly selected values, while the second set used the same values for all subjects to allow comparisons across users and direct assessment of the magnitude of errors. These research tests were approved by the University of Canterbury Ethics Committee. The final data entry method tested reduced errors to less than 1-2%, a 60-80% reduction from reported values. Errors of clinically significant magnitude, typically around 10.0 mmol/liter or an order of magnitude, occurred only for extreme values of BG < 2.0 mmol/liter or BG > 15.0-20.0 mmol/liter, both of which could be easily corrected with automated checking of extreme values for safety. The data entry method selected significantly reduced data entry errors in the limited design tests presented, and is in use on a clinical pilot TGC study. The overall approach and testing methods are easily performed and generalizable to other applications and protocols. © 2012 Diabetes Technology Society.
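
    The automated extreme-value check suggested by these results can be sketched in a few lines; the limits below simply reuse the extreme bounds quoted in the abstract as illustrative constants.

        # Flag blood glucose entries outside a plausible physiological band so
        # an order-of-magnitude keying error is confirmed before use.
        BG_MIN, BG_MAX = 2.0, 20.0  # mmol/liter, illustrative bounds

        def check_bg_entry(value_mmol_per_l: float) -> str:
            if value_mmol_per_l < BG_MIN or value_mmol_per_l > BG_MAX:
                return "confirm entry: value outside expected range"
            return "accepted"

        print(check_bg_entry(100.0))  # likely a misplaced decimal (10.0)
        print(check_bg_entry(5.6))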

  2. Three methods to construct predictive models using logistic regression and likelihood ratios to facilitate adjustment for pretest probability give similar results.

    PubMed

    Chan, Siew Foong; Deeks, Jonathan J; Macaskill, Petra; Irwig, Les

    2008-01-01

    To compare three predictive models based on logistic regression to estimate adjusted likelihood ratios allowing for interdependency between diagnostic variables (tests). This study was a review of the theoretical basis, assumptions, and limitations of published models, and a statistical extension of methods and application to a case study of the diagnosis of obstructive airways disease based on history and clinical examination. Albert's method includes an offset term to estimate an adjusted likelihood ratio for combinations of tests. The Spiegelhalter and Knill-Jones method uses the unadjusted likelihood ratio for each test as a predictor and computes shrinkage factors to allow for interdependence. Knottnerus' method differs from the other methods because it requires sequencing of tests, which limits its application to situations where there are few tests and substantial data. Although parameter estimates differed between the models, predicted "posttest" probabilities were generally similar. Construction of predictive models using logistic regression is preferred to the independence Bayes' approach when it is important to adjust for dependency of test errors. Methods to estimate adjusted likelihood ratios from predictive models should be considered in preference to a standard logistic regression model to facilitate ease of interpretation and application. Albert's method provides the most straightforward approach.
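
    A sketch in the spirit of Albert's offset approach follows: the pre-test log-odds enters a logistic regression as an offset, so the fitted coefficients act as adjusted log-likelihood-ratios for the tests. The data, effect sizes, and 30% pre-test probability are all hypothetical.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(5)
        n = 500
        tests = rng.integers(0, 2, size=(n, 2)).astype(float)   # two binary tests
        pretest_logit = np.full(n, np.log(0.3 / 0.7))           # 30% pre-test prob
        true_logit = pretest_logit + 1.2 * tests[:, 0] + 0.8 * tests[:, 1]
        y = (rng.random(n) < 1 / (1 + np.exp(-true_logit))).astype(float)

        # Offset carries the pre-test log-odds; coefficients are adjusted log-LRs.
        model = sm.GLM(y, tests, family=sm.families.Binomial(),
                       offset=pretest_logit).fit()
        print(model.params)   # approximate adjusted log-LRs for the two tests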

  3. 40 CFR 80.582 - What are the sampling and testing methods for the fuel marker?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... developed by a Voluntary Consensus-Based Standards Body, such as the American Society for Testing and... test method documentation, including a description of the technology and/or instrumentation that makes... this standard from the American Society for Testing and Materials, 100 Barr Harbor Dr., West...

  4. 40 CFR 80.582 - What are the sampling and testing methods for the fuel marker?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... developed by a Voluntary Consensus-Based Standards Body, such as the American Society for Testing and... test method documentation, including a description of the technology and/or instrumentation that makes... this standard from the American Society for Testing and Materials, 100 Barr Harbor Dr., West...

  5. The Development of MST Test Information for the Prediction of Test Performances

    ERIC Educational Resources Information Center

    Park, Ryoungsun; Kim, Jiseon; Chung, Hyewon; Dodd, Barbara G.

    2017-01-01

    The current study proposes novel methods to predict multistage testing (MST) performance without conducting simulations. This method, called MST test information, is based on analytic derivation of standard errors of ability estimates across theta levels. We compared standard errors derived analytically to the simulation results to demonstrate the…

  6. 10 CFR Appendix V to Subpart B of... - Uniform Test Method for Measuring the Energy Consumption of Ceiling Fan Light Kits

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... with ceiling fan light kits that have medium screw base sockets shall conform to the requirements... testing pin-based fluorescent lamps packaged with ceiling fan light kits that have pin-based sockets shall... base sockets, measure the efficacy, expressed in lumens per watt, in accordance with the test...

  7. A Classroom-Based Assessment Method to Test Speaking Skills in English for Specific Purposes

    ERIC Educational Resources Information Center

    Alberola Colomar, María Pilar

    2014-01-01

    This article presents and analyses a classroom-based assessment method to test students' speaking skills in a variety of professional settings in tourism. The assessment system has been implemented in the Communication in English for Tourism course, as part of the Tourism Management degree programme, at Florida Universitaria (affiliated to the…

  8. Reform-Based-Instructional Method and Learning Styles on Students' Achievement and Retention in Mathematics: Administrative Implications

    ERIC Educational Resources Information Center

    Modebelu, M. N.; Ogbonna, C. C.

    2014-01-01

    This study aimed at determining the effect of reform-based instructional method and learning styles on students' achievement and retention in mathematics. A sample size of 119 students was randomly selected. A quasi-experimental design comprising pre-test, post-test, and a randomized control group was employed. The Collin Rose learning styles…

  9. Functional Assessment-Based Interventions for Students with or At-Risk for High-Incidence Disabilities: Field Testing Single-Case Synthesis Methods

    ERIC Educational Resources Information Center

    Common, Eric Alan; Lane, Kathleen Lynne; Pustejovsky, James E.; Johnson, Austin H.; Johl, Liane Elizabeth

    2017-01-01

    This systematic review investigated one systematic approach to designing, implementing, and evaluating functional assessment-based interventions (FABI) for use in supporting school-age students with or at-risk for high-incidence disabilities. We field tested several recently developed methods for single-case design syntheses. First, we appraised…

  10. Assessment of statistical education in Indonesia: Preliminary results and initiation to simulation-based inference

    NASA Astrophysics Data System (ADS)

    Saputra, K. V. I.; Cahyadi, L.; Sembiring, U. A.

    2018-01-01

    In this paper, we assess our traditional elementary statistics education and introduce elementary statistics with simulation-based inference. To assess our statistics class, we adapt the well-known CAOS (Comprehensive Assessment of Outcomes in Statistics) test, which serves as an external measure of students' basic statistical literacy and is generally accepted as a measure of statistical literacy. We also introduce a new teaching method in the elementary statistics class. Different from the traditional elementary statistics course, we introduce a simulation-based inference method for conducting hypothesis testing. The literature has shown that this new teaching method works very well in increasing students' understanding of statistics.
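
    A minimal example of the simulation-based inference taught in such a course: instead of a normal approximation, the null distribution is simulated directly (here, for whether 60 heads in 100 flips is consistent with a fair coin; the numbers are illustrative).

        import numpy as np

        rng = np.random.default_rng(6)
        observed = 60
        # Simulate the null (fair coin) many times and count extreme outcomes.
        sims = rng.binomial(n=100, p=0.5, size=100_000)
        p_value = np.mean(sims >= observed)
        print(f"simulation-based one-sided p-value: {p_value:.4f}")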

  11. A Mixture Rasch Model-Based Computerized Adaptive Test for Latent Class Identification

    ERIC Educational Resources Information Center

    Jiao, Hong; Macready, George; Liu, Junhui; Cho, Youngmi

    2012-01-01

    This study explored a computerized adaptive test delivery algorithm for latent class identification based on the mixture Rasch model. Four item selection methods based on the Kullback-Leibler (KL) information were proposed and compared with the reversed and the adaptive KL information under simulated testing conditions. When item separation was…

  12. A Generalized Pivotal Quantity Approach to Analytical Method Validation Based on Total Error.

    PubMed

    Yang, Harry; Zhang, Jianchun

    2015-01-01

    The primary purpose of method validation is to demonstrate that the method is fit for its intended use. Traditionally, an analytical method is deemed valid if its performance characteristics such as accuracy and precision are shown to meet prespecified acceptance criteria. However, these acceptance criteria are not directly related to the method's intended purpose, which is usually a guarantee that a high percentage of the test results of future samples will be close to their true values. Alternate "fit for purpose" acceptance criteria based on the concept of total error have been increasingly used. Such criteria allow for assessing method validity, taking into account the relationship between accuracy and precision. Although several statistical test methods have been proposed in the literature to test the "fit for purpose" hypothesis, the majority of the methods are not designed to protect against the risk of accepting unsuitable methods, thus having the potential to cause uncontrolled consumer's risk. In this paper, we propose a test method based on generalized pivotal quantity inference. Through simulation studies, the performance of the method is compared to five existing approaches. The results show that both the new method and the method based on β-content tolerance interval with a confidence level of 90%, hereafter referred to as the β-content (0.9) method, control Type I error and thus consumer's risk, while the other existing methods do not. It is further demonstrated that the generalized pivotal quantity method is less conservative than the β-content (0.9) method when the analytical methods are biased, whereas it is more conservative when the analytical methods are unbiased. Therefore, selection of either the generalized pivotal quantity or β-content (0.9) method for an analytical method validation depends on the accuracy of the analytical method. It is also shown that the generalized pivotal quantity method has better asymptotic properties than all of the current methods. Analytical methods are often used to ensure safety, efficacy, and quality of medicinal products. According to government regulations and regulatory guidelines, these methods need to be validated through well-designed studies to minimize the risk of accepting unsuitable methods. This article describes a novel statistical test for analytical method validation, which provides better protection against the risk of accepting unsuitable analytical methods. © PDA, Inc. 2015.
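
    A hedged sketch of a generalized-pivotal-quantity total-error check for a normally distributed method follows: GPQs for the bias and standard deviation yield a Monte Carlo lower confidence bound on the probability that a result falls within ±δ of truth. The constants, acceptance target, and normality assumption are illustrative; this is not the paper's exact procedure.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(7)
        xbar, s, n = 0.5, 1.2, 30          # observed bias and SD from validation
        delta, target, n_gpq = 3.0, 0.90, 20_000  # +/-delta limit, acceptance target

        # Standard GPQs for a normal mean and SD.
        sigma_gpq = s * np.sqrt((n - 1) / rng.chisquare(n - 1, n_gpq))
        mu_gpq = xbar - rng.standard_normal(n_gpq) * sigma_gpq / np.sqrt(n)

        # Probability a future result lies within +/-delta, per GPQ draw.
        p_within = (stats.norm.cdf((delta - mu_gpq) / sigma_gpq)
                    - stats.norm.cdf((-delta - mu_gpq) / sigma_gpq))

        lower_bound = np.quantile(p_within, 0.05)   # 95% lower confidence bound
        print(lower_bound, "-> valid" if lower_bound >= target else "-> not shown valid")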

  13. [Research progress on mechanical performance evaluation of artificial intervertebral disc].

    PubMed

    Li, Rui; Wang, Song; Liao, Zhenhua; Liu, Weiqiang

    2018-03-01

    The mechanical properties of artificial intervertebral discs (AID) are related to the long-term reliability of the prosthesis. Three testing approaches, based on different tools, are involved in the mechanical performance evaluation of AID: testing with a mechanical simulator, in vitro specimen testing, and finite element analysis. In this study, the testing standards, testing equipment and materials for AID were first introduced. Then the present status of AID static mechanical property tests (static axial compression, static axial compression-shear), dynamic mechanical property tests (dynamic axial compression, dynamic axial compression-shear), creep and stress relaxation tests, device pushout tests, core pushout tests, subsidence tests, etc. was reviewed. The experimental techniques of the in vitro specimen testing method and the testing results for available artificial discs were summarized. The experimental methods and research status of finite element analysis were also summarized. Finally, research trends in AID mechanical performance evaluation were forecast.

  14. LABORATORY TOXICITY TESTS FOR EVALUATING POTENTIAL EFFECTS OF ENDOCRINE-DISRUPTING COMPOUNDS

    EPA Science Inventory

    The scope of the Laboratory Testing Work Group was to evaluate methods for testing aquatic and terrestrial invertebrates in the laboratory. Specifically, discussions focused on the following objectives: 1) assess the extent to which consensus-based standard methods and other pub...

  15. Language Testing: An Overview and Language Testing in Educational Institutions of Bangladesh

    ERIC Educational Resources Information Center

    Hossain, Md. Mahroof; Ahmed, Md. Kawser

    2015-01-01

    A test is a procedure for measuring ability, knowledge or performance. Testing can be defined as a method of assessment and improvement of students. Language testing at any level is an extremely multifarious task that ought to be based on method as well as practice. The results of assessments are used for one or more purposes. So they have an…

  16. Integrating the ACR Appropriateness Criteria Into the Radiology Clerkship: Comparison of Didactic Format and Group-Based Learning.

    PubMed

    Stein, Marjorie W; Frank, Susan J; Roberts, Jeffrey H; Finkelstein, Malka; Heo, Moonseong

    2016-05-01

    The aim of this study was to determine whether group-based or didactic teaching is more effective to teach ACR Appropriateness Criteria to medical students. An identical pretest, posttest, and delayed multiple-choice test was used to evaluate the efficacy of the two teaching methods. Descriptive statistics comparing test scores were obtained. On the posttest, the didactic group gained 12.5 points (P < .0001), and the group-based learning students gained 16.3 points (P < .0001). On the delayed test, the didactic group gained 14.4 points (P < .0001), and the group-based learning students gained 11.8 points (P < .001). The gains in scores on both tests were statistically significant for both groups. However, the differences in scores were not statistically significant comparing the two educational methods. Compared with didactic lectures, group-based learning is more enjoyable, time efficient, and equally efficacious. The choice of educational method can be individualized for each institution on the basis of group size, time constraints, and faculty availability. Copyright © 2016 American College of Radiology. Published by Elsevier Inc. All rights reserved.

  17. Investigating a Judgemental Rank-Ordering Method for Maintaining Standards in UK Examinations

    ERIC Educational Resources Information Center

    Black, Beth; Bramley, Tom

    2008-01-01

    A new judgemental method of equating raw scores on two tests, based on rank-ordering scripts from both tests, has been developed by Bramley. The rank-ordering method has potential application as a judgemental standard-maintaining mechanism, because given a mark on one test (e.g. the A grade boundary mark), the equivalent mark (i.e. at the same…

  18. What do results from coordinate-based meta-analyses tell us?

    PubMed

    Albajes-Eizagirre, Anton; Radua, Joaquim

    2018-08-01

    Coordinate-based meta-analyses (CBMA) methods, such as Activation Likelihood Estimation (ALE) and Seed-based d Mapping (SDM), have become an invaluable tool for summarizing the findings of voxel-based neuroimaging studies. However, the progressive sophistication of these methods may have concealed two particularities of their statistical tests. Common univariate voxelwise tests (such as the t/z-tests used in SPM and FSL) detect voxels that activate, or voxels that show differences between groups. Conversely, the tests conducted in CBMA test for "spatial convergence" of findings, i.e., they detect regions where studies report "more peaks than in most regions", regions that activate "more than most regions do", or regions that show "larger differences between groups than most regions do". The first particularity is that these tests rely on two spatial assumptions (voxels are independent and have the same probability to have a "false" peak), whose violation may make their results either conservative or liberal, though fortunately current versions of ALE, SDM and some other methods consider these assumptions. The second particularity is that the use of these tests involves an important paradox: the statistical power to detect a given effect is higher if there are no other effects in the brain, whereas lower in presence of multiple effects. Copyright © 2018 Elsevier Inc. All rights reserved.

  19. Shrinkage regression-based methods for microarray missing value imputation.

    PubMed

    Wang, Hsiuying; Chiu, Chia-Chun; Wu, Yi-Ching; Wu, Wei-Sheng

    2013-01-01

    Missing values commonly occur in the microarray data, which usually contain more than 5% missing values with up to 90% of genes affected. Inaccurate missing value estimation results in reducing the power of downstream microarray data analyses. Many types of methods have been developed to estimate missing values. Among them, the regression-based methods are very popular and have been shown to perform better than the other types of methods in many testing microarray datasets. To further improve the performances of the regression-based methods, we propose shrinkage regression-based methods. Our methods take the advantage of the correlation structure in the microarray data and select similar genes for the target gene by Pearson correlation coefficients. Besides, our methods incorporate the least squares principle, utilize a shrinkage estimation approach to adjust the coefficients of the regression model, and then use the new coefficients to estimate missing values. Simulation results show that the proposed methods provide more accurate missing value estimation in six testing microarray datasets than the existing regression-based methods do. Imputation of missing values is a very important aspect of microarray data analyses because most of the downstream analyses require a complete dataset. Therefore, exploring accurate and efficient methods for estimating missing values has become an essential issue. Since our proposed shrinkage regression-based methods can provide accurate missing value estimation, they are competitive alternatives to the existing regression-based methods.
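
    A simplified sketch of the shrinkage-regression idea: select the k genes most correlated with the target, fit least squares, shrink the coefficients, and predict the missing entry. The fixed shrinkage factor below is a placeholder; the published estimator derives its adjustment from the data.

        import numpy as np

        def impute_shrinkage(data, gene, sample, k=10, shrink=0.7):
            """Predict one missing entry for `gene` at `sample` from the k most
            correlated genes, with shrunken least-squares coefficients."""
            obs = np.ones(data.shape[1], dtype=bool)
            obs[sample] = False                     # treat one entry as missing
            target = data[gene, obs]
            others = np.delete(data, gene, axis=0)
            corr = np.array([abs(np.corrcoef(target, g[obs])[0, 1]) for g in others])
            top = np.argsort(corr)[-k:]             # k most similar genes
            X = others[top][:, obs].T               # samples x k predictors
            beta, *_ = np.linalg.lstsq(X, target, rcond=None)
            beta *= shrink                          # shrinkage adjustment
            return others[top][:, sample] @ beta

        rng = np.random.default_rng(8)
        data = rng.normal(size=(100, 20))           # genes x samples
        print(impute_shrinkage(data, gene=0, sample=5))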

  20. Validity evidence based on test content.

    PubMed

    Sireci, Stephen; Faulkner-Bond, Molly

    2014-01-01

    Validity evidence based on test content is one of the five forms of validity evidence stipulated in the Standards for Educational and Psychological Testing developed by the American Educational Research Association, American Psychological Association, and National Council on Measurement in Education. In this paper, we describe the logic and theory underlying such evidence and describe traditional and modern methods for gathering and analyzing content validity data. A comprehensive review of the literature and of the aforementioned Standards is presented. For educational tests and other assessments targeting knowledge and skill possessed by examinees, validity evidence based on test content is necessary for building a validity argument to support the use of a test for a particular purpose. By following the methods described in this article, practitioners have a wide arsenal of tools available for determining how well the content of an assessment is congruent with and appropriate for the specific testing purposes.

  1. A fast boosting-based screening method for large-scale association study in complex traits with genetic heterogeneity.

    PubMed

    Wang, Lu-Yong; Fasulo, D

    2006-01-01

    Genome-wide association studies for complex diseases will generate massive amounts of single nucleotide polymorphism (SNP) data. Univariate statistical tests (e.g., Fisher's exact test) are used to single out non-associated SNPs. However, disease-susceptible SNPs may have small marginal effects in the population and are unlikely to be retained after the univariate tests. Also, model-based methods are impractical for large-scale datasets. Moreover, genetic heterogeneity makes it harder for traditional methods to identify the genetic causes of diseases. A more recent random forest method provides a more robust approach for screening SNPs at the scale of thousands. However, for larger-scale data, i.e., Affymetrix Human Mapping 100K GeneChip data, a faster screening method is required for whole-genome association analysis with genetic heterogeneity. We propose a boosting-based method for rapid screening in large-scale analysis of complex traits in the presence of genetic heterogeneity. It provides a relatively fast and fairly good tool for screening and limiting the candidate SNPs for further, more complex computational modeling.
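
    A rough sketch of boosting-based screening using scikit-learn's gradient boosting (the paper's boosting variant may differ): SNPs are ranked by ensemble feature importance, which can retain SNPs with weak marginal effects under heterogeneity. The genotypes and the two-subgroup signal are synthetic.

        import numpy as np
        from sklearn.ensemble import GradientBoostingClassifier

        rng = np.random.default_rng(9)
        n, p = 400, 1000
        X = rng.integers(0, 3, size=(n, p)).astype(float)   # genotypes 0/1/2
        # Heterogeneous signal: SNP 0 drives half the cases, SNP 1 the other half.
        y = (((X[:, 0] > 1) & (rng.random(n) < 0.5))
             | ((X[:, 1] > 1) & (rng.random(n) < 0.5)))

        gb = GradientBoostingClassifier(n_estimators=100, max_depth=2).fit(X, y)
        top = np.argsort(gb.feature_importances_)[::-1][:20]  # candidate set
        print("candidate SNPs:", top[:5])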

  2. Multiplex Microsphere Immunoassays for the Detection of IgM and IgG to Arboviral Diseases

    PubMed Central

    Basile, Alison J.; Horiuchi, Kalanthe; Panella, Amanda J.; Laven, Janeen; Kosoy, Olga; Lanciotti, Robert S.; Venkateswaran, Neeraja; Biggerstaff, Brad J.

    2013-01-01

    Serodiagnosis of arthropod-borne viruses (arboviruses) at the Division of Vector-Borne Diseases, CDC, employs a combination of individual enzyme-linked immunosorbent assays and microsphere immunoassays (MIAs) to test for IgM and IgG, followed by confirmatory plaque-reduction neutralization tests. Based upon the geographic origin of a sample, it may be tested concurrently for multiple arboviruses, which can be a cumbersome task. The advent of multiplexing represents an opportunity to streamline these types of assays; however, because serologic cross-reactivity of the arboviral antigens often confounds results, it is of interest to employ data analysis methods that address this issue. Here, we constructed 13-virus multiplexed IgM and IgG MIAs that included internal and external controls, based upon the Luminex platform. Results from samples tested using these methods were analyzed using 8 different statistical schemes to identify the best way to classify the data. Geographic batteries were also devised to serve as a more practical diagnostic format, and further samples were tested using the abbreviated multiplexes. Comparative error rates for the classification schemes identified a specific boosting method based on logistic regression “Logitboost” as the classification method of choice. When the data from all samples tested were combined into one set, error rates from the multiplex IgM and IgG MIAs were <5% for all geographic batteries. This work represents both the most comprehensive, validated multiplexing method for arboviruses to date, and also the most systematic attempt to determine the most useful classification method for use with these types of serologic tests. PMID:24086608

  3. Does rational selection of training and test sets improve the outcome of QSAR modeling?

    PubMed

    Martin, Todd M; Harten, Paul; Young, Douglas M; Muratov, Eugene N; Golbraikh, Alexander; Zhu, Hao; Tropsha, Alexander

    2012-10-22

    Prior to using a quantitative structure activity relationship (QSAR) model for external predictions, its predictive power should be established and validated. In the absence of a true external data set, the best way to validate the predictive ability of a model is to perform its statistical external validation. In statistical external validation, the overall data set is divided into training and test sets. Commonly, this splitting is performed using random division. Rational splitting methods can divide data sets into training and test sets in an intelligent fashion. The purpose of this study was to determine whether rational division methods lead to more predictive models compared to random division. A special data splitting procedure was used to facilitate the comparison between random and rational division methods. For each toxicity end point, the overall data set was divided into a modeling set (80% of the overall set) and an external evaluation set (20% of the overall set) using random division. The modeling set was then subdivided into a training set (80% of the modeling set) and a test set (20% of the modeling set) using rational division methods and by using random division. The Kennard-Stone, minimal test set dissimilarity, and sphere exclusion algorithms were used as the rational division methods. The hierarchical clustering, random forest, and k-nearest neighbor (kNN) methods were used to develop QSAR models based on the training sets. For kNN QSAR, multiple training and test sets were generated, and multiple QSAR models were built. The results of this study indicate that models based on rational division methods generate better statistical results for the test sets than models based on random division, but the predictive power of both types of models is comparable.
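
    Of the rational division methods named above, Kennard-Stone is simple enough to sketch: start from the two most distant points in descriptor space, then repeatedly add the candidate farthest from its nearest selected neighbour. The descriptor matrix below is random placeholder data.

        import numpy as np

        def kennard_stone(X, n_train):
            """Kennard-Stone selection: seed with the two most distant points,
            then add the candidate whose nearest selected neighbour is farthest
            away, giving a training set that spans descriptor space."""
            d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
            selected = list(np.unravel_index(np.argmax(d), d.shape))
            remaining = [i for i in range(len(X)) if i not in selected]
            while len(selected) < n_train:
                min_d = d[np.ix_(remaining, selected)].min(axis=1)
                pick = remaining[int(np.argmax(min_d))]
                selected.append(pick)
                remaining.remove(pick)
            return selected

        rng = np.random.default_rng(10)
        X = rng.normal(size=(50, 5))                 # 50 compounds, 5 descriptors
        train_idx = kennard_stone(X, n_train=40)     # 80/20 rational split
        test_idx = [i for i in range(len(X)) if i not in train_idx]
        print(len(train_idx), len(test_idx))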

  4. Algorithms for the Construction of Parallel Tests by Zero-One Programming. Project Psychometric Aspects of Item Banking No. 7. Research Report 86-7.

    ERIC Educational Resources Information Center

    Boekkooi-Timminga, Ellen

    Nine methods for automated test construction are described. All are based on the concepts of information from item response theory. Two general kinds of methods for the construction of parallel tests are presented: (1) sequential test design; and (2) simultaneous test design. Sequential design implies that the tests are constructed one after the…

  5. Interesting Developments in Testing Methods Applied to Foundation Piles

    NASA Astrophysics Data System (ADS)

    Sobala, Dariusz; Tkaczyński, Grzegorz

    2017-10-01

    Both piling technologies and pile testing methods are subjects of ongoing development. New technologies, providing larger diameters or using in-situ materials, are very demanding in terms of the quality of execution of works. That concerns the material quality and continuity, which define the integral strength of the pile. On the other side we have the capacity of the ground around the pile and its ability to carry the loads transferred by the shaft and pile base. The inhomogeneous nature of soils and the relatively small number of tested piles demand a very good understanding of a small amount of results. In some special cases the capacity test itself forms an important cost in the piling contract. This work presents a brief description of selected testing methods and the authors' remarks based on cooperation with universities constantly developing new ideas. The paper presents some experience-based remarks on integrity testing by means of low energy impact (low strain) and introduces selected (Polish) developments in the field of closed-end pipe pile testing based on bi-directional loading, similar to the Osterberg idea but without a sacrificial hydraulic jack. Such a test is suitable especially when steel piles are used for temporary support in rivers, where constructing a conventional testing appliance with anchor piles or kentledge meets technical problems. According to the authors' experience, such tests have not yet been used on a building site, but they bring real potential, especially when the displacement control can be provided from the river bank using surveying techniques.

  6. 40 CFR 80.582 - What are the sampling and testing methods for the fuel marker?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... marker content of distillate fuels and how will EPA qualify or decline to qualify a test method?—(1... developed by a Voluntary Consensus-Based Standards Body, such as the American Society for Testing and... this standard from the American Society for Testing and Materials, 100 Barr Harbor Dr., West...

  7. 40 CFR 80.582 - What are the sampling and testing methods for the fuel marker?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... marker content of distillate fuels and how will EPA qualify or decline to qualify a test method?—(1... developed by a Voluntary Consensus-Based Standards Body, such as the American Society for Testing and... this standard from the American Society for Testing and Materials, 100 Barr Harbor Dr., West...

  8. 40 CFR 80.582 - What are the sampling and testing methods for the fuel marker?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... marker content of distillate fuels and how will EPA qualify or decline to qualify a test method?—(1... developed by a Voluntary Consensus-Based Standards Body, such as the American Society for Testing and... this standard from the American Society for Testing and Materials, 100 Barr Harbor Dr., West...

  9. Resampling and Distribution of the Product Methods for Testing Indirect Effects in Complex Models

    ERIC Educational Resources Information Center

    Williams, Jason; MacKinnon, David P.

    2008-01-01

    Recent advances in testing mediation have found that certain resampling methods and tests based on the mathematical distribution of 2 normal random variables substantially outperform the traditional "z" test. However, these studies have primarily focused only on models with a single mediator and 2 component paths. To address this limitation, a…

  10. A Primer-Test Centered Equating Method for Setting Cut-Off Scores

    ERIC Educational Resources Information Center

    Zhu, Weimo; Plowman, Sharon Ann; Park, Youngsik

    2010-01-01

    This study evaluated the use of a new primary field test method based on test equating to address inconsistent classification among field tests. We analyzed students' information on the Progressive Aerobic Cardiovascular Endurance Run (PACER), mile run (MR), and VO2max from three data sets (college: n = 94; middle school: n = 39;…

  11. A method for evaluating horizontal well pumping tests.

    PubMed

    Langseth, David E; Smyth, Andrew H; May, James

    2004-01-01

    Predicting the future performance of horizontal wells under varying pumping conditions requires estimates of basic aquifer parameters, notably transmissivity and storativity. For vertical wells, there are well-established methods for estimating these parameters, typically based on either the recovery from induced head changes in a well or from the head response in observation wells to pumping in a test well. Comparable aquifer parameter estimation methods for horizontal wells have not been presented in the ground water literature. Formation parameter estimation methods based on measurements of pressure in horizontal wells have been presented in the petroleum industry literature, but these methods have limited applicability for ground water evaluation and are based on pressure measurements in only the horizontal well borehole, rather than in observation wells. This paper presents a simple and versatile method by which pumping test procedures developed for vertical wells can be applied to horizontal well pumping tests. The method presented here uses the principle of superposition to represent the horizontal well as a series of partially penetrating vertical wells. This concept is used to estimate a distance from an observation well at which a vertical well that has the same total pumping rate as the horizontal well will produce the same drawdown as the horizontal well. This equivalent distance may then be associated with an observation well for use in pumping test algorithms and type curves developed for vertical wells. The method is shown to produce good results for confined aquifers and unconfined aquifers in the absence of delayed yield response. For unconfined aquifers, the presence of delayed yield response increases the method error.
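
    The superposition idea lends itself to a short numerical sketch. Assuming, for simplicity, the classical Theis solution for each vertical well standing in for a segment of the horizontal well (the paper works with partially penetrating wells, so this is a simplification), one can compute the equivalent distance as follows; all names and parameter values are illustrative.

    ```python
    import numpy as np
    from scipy.special import exp1
    from scipy.optimize import brentq

    def theis_drawdown(Q, T, S, r, t):
        """Theis drawdown s = Q/(4*pi*T) * W(u), with u = r^2*S/(4*T*t)."""
        u = r**2 * S / (4.0 * T * t)
        return Q / (4.0 * np.pi * T) * exp1(u)

    def equivalent_distance(well_xy, obs_xy, Q_total, T, S, t):
        """Superpose n vertical wells (each pumping Q_total/n) along the horizontal
        well trace, then find the distance at which a single vertical well pumping
        Q_total reproduces the same drawdown at the observation point."""
        n = len(well_xy)
        radii = np.linalg.norm(well_xy - obs_xy, axis=1)
        s_horiz = sum(theis_drawdown(Q_total / n, T, S, r, t) for r in radii)
        f = lambda r: theis_drawdown(Q_total, T, S, r, t) - s_horiz
        return brentq(f, 1e-3, 1e5)  # bracket assumed wide enough for the site

    # 100 m horizontal well discretised into 20 segments; observation well 30 m off axis
    segments = np.column_stack([np.linspace(0.0, 100.0, 20), np.zeros(20)])
    r_eq = equivalent_distance(segments, np.array([50.0, 30.0]),
                               Q_total=0.01, T=1e-3, S=1e-4, t=3600.0)
    ```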

  12. Screening methods for assessment of biodegradability of chemicals in seawater--results from a ring test

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nyholm, N.; Kristensen, P.

    1992-04-01

    An international ring test involving 14 laboratories was organized on behalf of the Commission of the European Economic Communities (EEC) with the purpose of evaluating two proposed screening methods for assessment of biodegradability in seawater: (a) a shake flask die-away test based primarily on analysis of dissolved organic carbon and (b) a closed bottle test based on determination of dissolved oxygen. Both tests are performed with nutrient-enriched natural seawater as the test medium and with no inoculum added other than the natural seawater microflora. The test methods are seawater versions of the modified OECD screening test and the closed bottle test, respectively, adopted by the Organization for Economic Cooperation and Development (OECD) and by the EEC as tests for 'ready biodegradability.' The following five chemicals were examined: sodium benzoate, aniline, diethylene glycol, pentaerythritol, and 4-nitrophenol. Sodium benzoate and aniline, which are known to be generally readily biodegradable, consistently degraded in practically all tests, thus demonstrating the technical feasibility of the methods. As in previous ring tests with freshwater screening methods, variable results were obtained with the other three compounds, which is believed to be due primarily to site-specific differences between the microflora of the different seawater samples used and, to some extent, to differences in the applied concentrations of test material. A positive result with the screening methods indicates that the test substance will most likely degrade relatively rapidly in seawater from the site of collection, while a negative test result does not preclude biodegradability under environmental conditions, where the concentrations of chemicals are much lower than the concentrations applied for analytical reasons in screening tests.

  13. An Efficient Functional Test Generation Method For Processors Using Genetic Algorithms

    NASA Astrophysics Data System (ADS)

    Hudec, Ján; Gramatová, Elena

    2015-07-01

    The paper presents a new functional test generation method for processor testing based on genetic algorithms and evolutionary strategies. The tests are generated over an instruction set architecture and a processor description. Such functional tests belong to software-oriented testing. The quality of the tests is evaluated by code coverage of the processor description using simulation. The presented test generation method uses VHDL models of processors and the professional simulator ModelSim. Rules, parameters and fitness functions were defined for the various genetic algorithms used in automatic test generation. Functionality and effectiveness were evaluated using the RISC-type processor DP32.
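
    A compact sketch of the evolutionary loop such a generator might use is given below. The instruction set, fitness stand-in and GA parameters are illustrative; in the paper the fitness comes from code coverage measured by simulating the VHDL model in ModelSim.

    ```python
    import random

    def genetic_test_generation(instr_set, fitness, seq_len=20, pop=30, gens=50,
                                p_mut=0.05, elite=2):
        """Evolve instruction sequences toward maximal coverage. `fitness` maps a
        sequence to a coverage score supplied by the user (here a toy stand-in)."""
        population = [[random.choice(instr_set) for _ in range(seq_len)]
                      for _ in range(pop)]
        for _ in range(gens):
            ranked = sorted(population, key=fitness, reverse=True)
            nxt = ranked[:elite]                              # elitism
            while len(nxt) < pop:
                a, b = random.sample(ranked[:pop // 2], 2)    # parents from top half
                cut = random.randrange(1, seq_len)            # one-point crossover
                child = [random.choice(instr_set) if random.random() < p_mut else g
                         for g in a[:cut] + b[cut:]]          # per-gene mutation
                nxt.append(child)
            population = nxt
        return max(population, key=fitness)

    instr_set = ["ADD", "SUB", "LD", "ST", "BR", "NOP"]
    toy_fitness = lambda seq: len(set(seq))   # stand-in for simulated code coverage
    best_sequence = genetic_test_generation(instr_set, toy_fitness)
    ```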

  14. The Sequential Probability Ratio Test and Binary Item Response Models

    ERIC Educational Resources Information Center

    Nydick, Steven W.

    2014-01-01

    The sequential probability ratio test (SPRT) is a common method for terminating item response theory (IRT)-based adaptive classification tests. To decide whether a classification test should stop, the SPRT compares a simple log-likelihood ratio, based on the classification bound separating two categories, to prespecified critical values. As has…

  15. Solution identification and quantitative analysis of fiber-capacitive drop analyzer based on multivariate statistical methods

    NASA Astrophysics Data System (ADS)

    Chen, Zhe; Qiu, Zurong; Huo, Xinming; Fan, Yuming; Li, Xinghua

    2017-03-01

    A fiber-capacitive drop analyzer is an instrument which monitors a growing droplet to produce a capacitive opto-tensiotrace (COT). Each COT is an integration of fiber light intensity signals and capacitance signals and can reflect the unique physicochemical property of a liquid. In this study, we propose a method for solution identification and concentration quantification based on multivariate statistical methods. Eight characteristic values are extracted from each COT. A series of COT characteristic values of training solutions at different concentrations composes a data library for that kind of solution. A two-stage linear discriminant analysis is applied to analyze different solution libraries and establish discriminant functions. Test solutions can be discriminated by these functions. After determining the variety of a test solution, the Spearman correlation test and principal component analysis are used to filter the eight characteristic values and reduce their dimensionality, producing a new representative parameter. A cubic spline interpolation function is built between the parameters and concentrations, based on which we can calculate the concentration of the test solution. Methanol, ethanol, n-propanol, and saline solutions are taken as experimental subjects in this paper. For each solution, nine or ten different concentrations are chosen to be the standard library, and the other two concentrations compose the test group. By using the methods mentioned above, all eight test solutions are correctly identified and the average relative error of quantitative analysis is 1.11%. The proposed method is feasible; it enlarges the applicable scope of liquid recognition based on the COT and improves the precision of concentration quantification.
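
    The pipeline can be sketched in simplified form, collapsing the two-stage LDA to a single stage and skipping the Spearman filtering; the synthetic COT feature data and all names below are illustrative, not the instrument's actual characteristic values.

    ```python
    import numpy as np
    from scipy.interpolate import CubicSpline
    from sklearn.decomposition import PCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(0)
    concentrations = np.tile(np.linspace(1.0, 10.0, 10), 2)
    y_train = np.repeat(np.array(["methanol", "ethanol"]), 10)
    X_train = rng.normal(scale=0.1, size=(20, 8)) + concentrations[:, None]
    X_train[10:, :4] += 5.0        # make the two solutions separable, as real COTs are
    x_test = X_train[3] + 0.05

    # stage 1: identify the solution from its eight COT characteristic values
    lda = LinearDiscriminantAnalysis().fit(X_train, y_train)
    kind = lda.predict(x_test.reshape(1, -1))[0]

    # stage 2: within that solution's library, reduce to one representative
    # parameter and interpolate concentration with a cubic spline
    lib, conc = X_train[y_train == kind], concentrations[y_train == kind]
    pca = PCA(n_components=1).fit(lib)
    param = pca.transform(lib).ravel()
    order = np.argsort(param)
    spline = CubicSpline(param[order], conc[order])
    estimated_conc = spline(pca.transform(x_test.reshape(1, -1))).item()
    ```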

  16. Using a fuzzy comprehensive evaluation method to determine product usability: A test case

    PubMed Central

    Zhou, Ronggang; Chan, Alan H. S.

    2016-01-01

    BACKGROUND: In order to take into account the inherent uncertainties during product usability evaluation, Zhou and Chan [1] proposed a comprehensive method of usability evaluation for products by combining the analytic hierarchy process (AHP) and fuzzy evaluation methods for synthesizing performance data and subjective response data. This method was designed to provide an integrated framework combining the inevitable vague judgments from the multiple stages of the product evaluation process. OBJECTIVE AND METHODS: In order to illustrate the effectiveness of the model, this study used a summative usability test case to assess the application and strength of the general fuzzy usability framework. To test the proposed fuzzy usability evaluation framework [1], a standard summative usability test was conducted to benchmark the overall usability of a specific network management software. Based on the test data, the fuzzy method was applied to incorporate both the usability scores and uncertainties involved in the multiple components of the evaluation. Then, with Monte Carlo simulation procedures, confidence intervals were used to compare the reliabilities of the fuzzy approach and two typical conventional methods combining metrics based on percentages. RESULTS AND CONCLUSIONS: This case study showed that the fuzzy evaluation technique can be applied successfully for combining summative usability testing data to achieve an overall usability quality for the network software evaluated. The greater confidence interval widths of the equally averaged percentage method and the weighted evaluation methods, including the weighted percentage averages method, relative to the fuzzy approach verified the strength of the fuzzy method. PMID:28035942
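
    The fuzzy synthesis step described here reduces, in its simplest form, to weighting a membership matrix by AHP-derived component weights. A minimal sketch, with purely illustrative numbers rather than the study's data:

    ```python
    import numpy as np

    # AHP-derived weights for four usability components (illustrative numbers)
    W = np.array([0.40, 0.25, 0.20, 0.15])

    # R[i, j]: degree to which component i belongs to rating grade j
    # (grades: poor, fair, good, excellent), estimated from test data
    R = np.array([[0.1, 0.3, 0.4, 0.2],
                  [0.0, 0.2, 0.5, 0.3],
                  [0.2, 0.4, 0.3, 0.1],
                  [0.1, 0.2, 0.4, 0.3]])

    B = W @ R                      # weighted-average fuzzy synthesis operator
    B /= B.sum()                   # normalise to a membership distribution
    grade = ["poor", "fair", "good", "excellent"][int(np.argmax(B))]
    ```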

  17. Biases and power for groups comparison on subjective health measurements.

    PubMed

    Hamel, Jean-François; Hardouin, Jean-Benoit; Le Neel, Tanguy; Kubis, Gildas; Roquelaure, Yves; Sébille, Véronique

    2012-01-01

    Subjective health measurements are increasingly used in clinical research, particularly for patient group comparisons. Two main types of analytical strategies can be used for such data: so-called classical test theory (CTT), relying on observed scores, and models from Item Response Theory (IRT), relying on a response model that relates item responses to a latent parameter, often called the latent trait. Whether IRT or CTT is the more appropriate method to compare two independent groups of patients on a patient-reported outcomes measurement remains unknown and was investigated using simulations. For CTT-based analyses, group comparison was performed using a t-test on the scores. For IRT-based analyses, several methods were compared, according to whether the Rasch model was considered with random effects or with fixed effects, and whether the group effect was included as a covariate or not. Individual latent trait values were estimated using either a deterministic method or stochastic approaches. Latent traits were then compared with a t-test. Finally, a two-step method was performed to compare the latent trait distributions, and a Wald test was performed to test the group effect in the Rasch model including group covariates. The only unbiased IRT-based method was the group-covariate Wald test, performed on the random-effects Rasch model. This model displayed the highest observed power, which was similar to the power using the score t-test. These results need to be extended to the case frequently encountered in practice where data are missing and possibly informative.

  18. 78 FR 68735 - Reduction or Suspension of Safe Harbor Contributions

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-15

    ... forth in section 401(k)(3), called the actual deferral percentage (ADP) test, or one of the design-based... design-based safe harbor method under which a CODA is treated as satisfying the ADP test if the... the design-based alternatives in section 401(m)(10), 401(m)(11), or 401(m)(12). The ACP test in...

  19. A Model Based Security Testing Method for Protocol Implementation

    PubMed Central

    Fu, Yu Long; Xin, Xiao Long

    2014-01-01

    The security of protocol implementation is important and hard to verify. Since penetration testing is usually based on the experience of the security tester and the specific protocol specifications, a formal and automatic verification method is always required. In this paper, we propose an extended model of IOLTS to describe the legal roles and intruders of security protocol implementations, and then combine them to generate suitable test cases to verify the security of the protocol implementation. PMID:25105163

  20. A model based security testing method for protocol implementation.

    PubMed

    Fu, Yu Long; Xin, Xiao Long

    2014-01-01

    The security of protocol implementation is important and hard to verify. Since penetration testing is usually based on the experience of the security tester and the specific protocol specifications, a formal and automatic verification method is always required. In this paper, we propose an extended model of IOLTS to describe the legal roles and intruders of security protocol implementations, and then combine them to generate suitable test cases to verify the security of the protocol implementation.

  1. A review of consensus test methods for established medical imaging modalities and their implications for optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Pfefer, Joshua; Agrawal, Anant

    2012-03-01

    In recent years there has been increasing interest in development of consensus, tissue-phantom-based approaches for assessment of biophotonic imaging systems, with the primary goal of facilitating clinical translation of novel optical technologies. Well-characterized test methods based on tissue phantoms can provide useful tools for performance assessment, thus enabling standardization and device inter-comparison during preclinical development as well as quality assurance and re-calibration in the clinical setting. In this review, we study the role of phantom-based test methods as described in consensus documents such as international standards for established imaging modalities including X-ray CT, MRI and ultrasound. Specifically, we focus on three image quality characteristics - spatial resolution, spatial measurement accuracy and image uniformity - and summarize the terminology, metrics, phantom design/construction approaches and measurement/analysis procedures used to assess these characteristics. Phantom approaches described are those in routine clinical use and tend to have simplified morphology and biologically-relevant physical parameters. Finally, we discuss the potential for applying knowledge gained from existing consensus documents in the development of standardized, phantom-based test methods for optical coherence tomography.

  2. SU-E-T-769: T-Test Based Prior Error Estimate and Stopping Criterion for Monte Carlo Dose Calculation in Proton Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hong, X; Gao, H; Schuemann, J

    2015-06-15

    Purpose: The Monte Carlo (MC) method is a gold standard for dose calculation in radiotherapy. However, it is not a priori clear how many particles need to be simulated to achieve a given dose accuracy. Prior error estimates and stopping criteria are not well established for MC. This work aims to fill this gap. Methods: Due to the statistical nature of MC, our approach is based on the one-sample t-test. We design the prior error estimate method based on the t-test, and then use this t-test based error estimate to develop a simulation stopping criterion. The three major components are as follows. First, the source particles are randomized in energy, space and angle, so that the dose deposition from a particle to the voxel is independent and identically distributed (i.i.d.). Second, a sample under consideration in the t-test is the mean value of dose deposition to the voxel by a sufficiently large number of source particles; then, according to the central limit theorem, the sample, as the mean value of i.i.d. variables, is normally distributed with expectation equal to the true deposited dose. Third, the t-test is performed with the null hypothesis that the difference between the sample expectation (the same as the true deposited dose) and the on-the-fly calculated mean sample dose from MC is larger than a given error threshold; in addition, users have the freedom to specify the confidence probability and region of interest in the t-test based stopping criterion. Results: The method is validated for proton dose calculation. The difference between the MC result based on the t-test prior error estimate and the statistical result obtained by repeating numerous MC simulations is within 1%. Conclusion: The t-test based prior error estimate and stopping criterion are developed for MC and validated for proton dose calculation. Xiang Hong and Hao Gao were partially supported by the NSFC (#11405105), the 973 Program (#2015CB856000) and the Shanghai Pujiang Talent Program (#14PJ1404500)
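
    One plausible reading of the resulting stopping rule is sketched below with illustrative names; the abstract's exact hypothesis formulation differs slightly (it tests whether the error exceeds the threshold), so this is a sketch under stated assumptions, not the authors' implementation.

    ```python
    import numpy as np
    from scipy import stats

    def should_stop(batch_means, rel_tol=0.01, confidence=0.95):
        """Stop once the t-based confidence half-width of the voxel's mean dose
        falls below rel_tol of the running mean. Each entry of batch_means is the
        mean dose deposited by one batch of source particles, treated as i.i.d.
        and approximately normal by the central limit theorem."""
        x = np.asarray(batch_means, dtype=float)
        n = len(x)
        if n < 2:
            return False
        sem = x.std(ddof=1) / np.sqrt(n)                        # standard error of the mean
        half_width = stats.t.ppf(0.5 + confidence / 2.0, df=n - 1) * sem
        return bool(half_width < rel_tol * abs(x.mean()))       # relative-error criterion
    ```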

  3. AFNOR validation of Premi Test, a microbiological-based screening tube-test for the detection of antimicrobial residues in animal muscle tissue.

    PubMed

    Gaudin, Valerie; Juhel-Gaugain, Murielle; Morétain, Jean-Pierre; Sanders, Pascal

    2008-12-01

    Premi Test contains viable spores of a strain of Bacillus stearothermophilus which is sensitive to antimicrobial residues, such as beta-lactams, tetracyclines, macrolides and sulphonamides. The growth of the strain is inhibited by the presence of antimicrobial residues in muscle tissue samples. Premi Test was validated according to AFNOR rules (French Association for Normalisation). The AFNOR validation was based on the comparison of reference methods (French Official method, i.e. four plate test (FPT) and the STAR protocol (five plate test)) with the alternative method (Premi Test). A preliminary study was conducted in an expert laboratory (Community Reference Laboratory, CRL) on both spiked and incurred samples (field samples). Several method performance criteria (sensitivity, specificity, relative accuracy) were estimated and are discussed, in addition to detection capabilities. Adequate agreement was found between the alternative method and the reference methods. However, Premi Test was more sensitive to beta-lactams and sulphonamides than the FPT. Subsequently, a collaborative study with 11 laboratories was organised by the CRL. Blank and spiked meat juice samples were sent to participants. The expert laboratory (CRL) statistically analysed the results. It was concluded that Premi Test could be used for the routine determination of antimicrobial residues in muscle of different animal origin with acceptable analytical performance. The detection capabilities of Premi Test for beta-lactams (amoxicillin, ceftiofur), one macrolide (tylosin) and tetracycline were at the level of the respective maximum residue limits (MRL) in muscle samples or even lower.

  4. Testing the feasibility of eliciting preferences for health states from adolescents using direct methods.

    PubMed

    Crump, R Trafford; Lau, Ryan; Cox, Elizabeth; Currie, Gillian; Panepinto, Julie

    2018-06-22

    Measuring adolescents' preferences for health states can play an important role in evaluating the delivery of pediatric healthcare. However, formal evaluation of the common direct preference elicitation methods for health states has not been done with adolescents. Therefore, the purpose of this study is to test how these methods perform in terms of their feasibility, reliability, and validity for measuring health state preferences in adolescents. This study used a web-based survey of adolescents, 18 years of age or younger, living in the United States. The survey included four health states, each comprising six attributes. Preferences for these health states were elicited using the visual analogue scale, time trade-off, and standard gamble. The feasibility, test-retest reliability, and construct validity of each of these preference elicitation methods were tested and compared. A total of 144 participants were included in this study. Using a web-based survey format to elicit preferences for health states from adolescents was feasible. A majority of participants completed all three elicitation methods and ranked those methods as easy, with very few requiring assistance from someone else. However, all three elicitation methods demonstrated weak test-retest reliability, with Kendall's tau-a values ranging from 0.204 to 0.402. Similarly, all three methods demonstrated poor construct validity, with 9-50% of all rankings aligning with our expectations. There were no significant differences across age groups. Using a web-based survey format to elicit preferences for health states from adolescents is feasible. However, the reliability and construct validity of the methods used to elicit these preferences when using this survey format are poor. Further research into the effects of a web-based survey approach to eliciting preferences for health states from adolescents is needed before health services researchers or pediatric clinicians widely employ these methods.

  5. Satellite Vibration Testing: Angle optimisation method to Reduce Overtesting

    NASA Astrophysics Data System (ADS)

    Knight, Charly; Remedia, Marcello; Aglietti, Guglielmo S.; Richardson, Guy

    2018-06-01

    Spacecraft overtesting is a long-running problem, and the main focus of most attempts to reduce it has been adjusting the base vibration input (i.e. notching). Instead, this paper examines testing alternatives for secondary structures (equipment) coupled to the main structure (satellite) when they are tested separately. Even if the vibration source is applied along one of the orthogonal axes at the base of the coupled system (satellite plus equipment), the dynamics of the system, and potentially the interface configuration, mean that the vibration at the interface may not occur along a single axis, much less the axis corresponding to the base excitation. This paper proposes an alternative testing methodology in which a piece of equipment is tested at an offset angle. This Angle Optimisation method may involve multiple tests, each with an altered input direction, allowing the best match between all specified equipment system responses and the coupled system tests. An optimisation process compares the calculated equipment RMS values for a range of inputs with the maximum coupled-system RMS values and is used to find the optimal testing configuration for the given parameters. A case study was performed to find the best testing angles to match the acceleration responses of the centre of mass and the sum of interface forces for all three axes, as well as the von Mises stress for an element at a fastening point. The Angle Optimisation method resulted in RMS values and PSD responses that were much closer to those of the coupled system than traditional testing. The optimum testing configuration resulted in an overall average error significantly smaller than the traditional method's. Crucially, this case study shows that the optimum test campaign could be a single equipment-level test as opposed to the traditional three orthogonal-direction tests.
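
    The optimisation over input angle can be sketched as a one-dimensional minimisation. The linear response model below is a toy stand-in for the test-prediction model, and everything here is illustrative rather than the paper's implementation.

    ```python
    import numpy as np
    from scipy.optimize import minimize_scalar

    def rms_mismatch(theta, predict_rms, target_rms):
        """Mismatch between the equipment RMS responses predicted for a base input
        applied at angle theta (in the plane of two orthogonal test axes) and the
        coupled-system targets."""
        direction = np.array([np.cos(theta), np.sin(theta)])
        return np.linalg.norm(predict_rms(direction) - target_rms)

    A = np.array([[1.0, 0.4], [0.2, 1.3], [0.6, 0.9]])   # toy linear response model
    target = np.array([0.9, 1.1, 1.0])                    # coupled-system RMS targets
    res = minimize_scalar(rms_mismatch, bounds=(0.0, np.pi / 2), method="bounded",
                          args=(lambda d: A @ d, target))
    best_angle_deg = np.degrees(res.x)                    # offset test angle
    ```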

  6. Research on environmental impact of water-based fire extinguishing agents

    NASA Astrophysics Data System (ADS)

    Wang, Shuai

    2018-02-01

    This paper reviews the current status of the application of water-based fire extinguishing agents and the environmental considerations that motivate the study of their toxicity. It also offers a systematic review of the test methods currently available for assessing the toxicity and environmental impact of water-based fire extinguishing agents, illustrates the main requirements and relevant test methods, and offers findings for future research. The paper also discusses the limitations of current studies.

  7. Tree Testing of Hierarchical Menu Structures for Health Applications

    PubMed Central

    Le, Thai; Chaudhuri, Shomir; Chung, Jane; Thompson, Hilaire J; Demiris, George

    2014-01-01

    To address the need for greater evidence-based evaluation of Health Information Technology (HIT) systems, we introduce a method of usability testing termed tree testing. In a tree test, participants are presented with an abstract hierarchical tree of the system taxonomy and asked to navigate through the tree in completing representative tasks. We apply tree testing to a commercially available health application, demonstrating a use case and providing a comparison with more traditional in-person usability testing methods. Online tree tests (N=54) and in-person usability tests (N=15) were conducted from August to September 2013. Tree testing provided a method to quantitatively evaluate the information structure of a system using various navigational metrics, including completion time, task accuracy, and path length. The results of the analyses compared favorably to the results seen from the traditional usability test. Tree testing provides a flexible, evidence-based approach for researchers to evaluate the information structure of HITs. In addition, remote tree testing provides a quick, flexible, and high-volume method of acquiring feedback in a structured format that allows for quantitative comparisons. With the diverse nature and often large quantities of health information available, addressing issues of terminology and concept classification during the early development process of a health information system will improve navigation through the system and save future resources. Tree testing is a usability method that can be used to quickly and easily assess the information hierarchy of health information systems. PMID:24582924

  8. Molecular method for detection of total coliforms in drinking water samples.

    PubMed

    Maheux, Andrée F; Boudreau, Dominique K; Bisson, Marc-Antoine; Dion-Dupont, Vanessa; Bouchard, Sébastien; Nkuranga, Martine; Bergeron, Michel G; Rodriguez, Manuel J

    2014-07-01

    This work demonstrates the ability of a bacterial concentration and recovery procedure combined with three different PCR assays targeting the lacZ, wecG, and 16S rRNA genes, respectively, to detect the presence of total coliforms in 100-ml samples of potable water (presence/absence test). PCR assays were first compared to the culture-based Colilert and MI agar methods to determine their ability to detect 147 coliform strains representing 76 species of Enterobacteriaceae encountered in fecal and environmental settings. Results showed that 86 (58.5%) and 109 (74.1%) strains yielded a positive signal with Colilert and MI agar methods, respectively, whereas the lacZ, wecG, and 16S rRNA PCR assays detected 133 (90.5%), 111 (75.5%), and 146 (99.3%) of the 147 total coliform strains tested. These assays were then assessed by testing 122 well water samples collected in the Québec City region of Canada. Results showed that 97 (79.5%) of the samples tested by culture-based methods and 95 (77.9%), 82 (67.2%), and 98 (80.3%) of samples tested using PCR-based methods contained total coliforms, respectively. Consequently, despite the high genetic variability of the total coliform group, this study demonstrated that it is possible to use molecular assays to detect total coliforms in potable water: the 16S rRNA molecular assay was shown to be as efficient as recommended culture-based methods. This assay might be used in combination with an Escherichia coli molecular assay to assess drinking water quality. Copyright © 2014, American Society for Microbiology. All Rights Reserved.

  9. Molecular Method for Detection of Total Coliforms in Drinking Water Samples

    PubMed Central

    Boudreau, Dominique K.; Bisson, Marc-Antoine; Dion-Dupont, Vanessa; Bouchard, Sébastien; Nkuranga, Martine; Bergeron, Michel G.; Rodriguez, Manuel J.

    2014-01-01

    This work demonstrates the ability of a bacterial concentration and recovery procedure combined with three different PCR assays targeting the lacZ, wecG, and 16S rRNA genes, respectively, to detect the presence of total coliforms in 100-ml samples of potable water (presence/absence test). PCR assays were first compared to the culture-based Colilert and MI agar methods to determine their ability to detect 147 coliform strains representing 76 species of Enterobacteriaceae encountered in fecal and environmental settings. Results showed that 86 (58.5%) and 109 (74.1%) strains yielded a positive signal with Colilert and MI agar methods, respectively, whereas the lacZ, wecG, and 16S rRNA PCR assays detected 133 (90.5%), 111 (75.5%), and 146 (99.3%) of the 147 total coliform strains tested. These assays were then assessed by testing 122 well water samples collected in the Québec City region of Canada. Results showed that 97 (79.5%) of the samples tested by culture-based methods and 95 (77.9%), 82 (67.2%), and 98 (80.3%) of samples tested using PCR-based methods contained total coliforms, respectively. Consequently, despite the high genetic variability of the total coliform group, this study demonstrated that it is possible to use molecular assays to detect total coliforms in potable water: the 16S rRNA molecular assay was shown to be as efficient as recommended culture-based methods. This assay might be used in combination with an Escherichia coli molecular assay to assess drinking water quality. PMID:24771030

  10. Pre-test probability of obstructive coronary stenosis in patients undergoing coronary CT angiography: Comparative performance of the modified Diamond-Forrester algorithm versus methods incorporating cardiovascular risk factors.

    PubMed

    Ferreira, António Miguel; Marques, Hugo; Tralhão, António; Santos, Miguel Borges; Santos, Ana Rita; Cardoso, Gonçalo; Dores, Hélder; Carvalho, Maria Salomé; Madeira, Sérgio; Machado, Francisco Pereira; Cardim, Nuno; de Araújo Gonçalves, Pedro

    2016-11-01

    Current guidelines recommend the use of the Modified Diamond-Forrester (MDF) method to assess the pre-test likelihood of obstructive coronary artery disease (CAD). We aimed to compare the performance of the MDF method with two contemporary algorithms derived from multicenter trials that additionally incorporate cardiovascular risk factors: the calculator-based 'CAD Consortium 2' method, and the integer-based CONFIRM score. We assessed 1069 consecutive patients without known CAD undergoing coronary CT angiography (CCTA) for stable chest pain. Obstructive CAD was defined as the presence of coronary stenosis ≥50% on 64-slice dual-source CT. The three methods were assessed for calibration, discrimination, net reclassification, and changes in proposed downstream testing based upon calculated pre-test likelihoods. The observed prevalence of obstructive CAD was 13.8% (n=147). Overestimations of the likelihood of obstructive CAD were 140.1%, 9.8%, and 18.8%, respectively, for the MDF, CAD Consortium 2 and CONFIRM methods. The CAD Consortium 2 showed greater discriminative power than the MDF method, with a C-statistic of 0.73 vs. 0.70 (p<0.001), while the CONFIRM score did not (C-statistic 0.71, p=0.492). Reclassification of pre-test likelihood using the 'CAD Consortium 2' or CONFIRM scores resulted in a net reclassification improvement of 0.19 and 0.18, respectively, which would change the diagnostic strategy in approximately half of the patients. Newer risk factor-encompassing models allow for a more precise estimation of pre-test probabilities of obstructive CAD than the guideline-recommended MDF method. Adoption of these scores may improve disease prediction and change the diagnostic pathway in a significant proportion of patients. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  11. Comparing K-mer based methods for improved classification of 16S sequences.

    PubMed

    Vinje, Hilde; Liland, Kristian Hovde; Almøy, Trygve; Snipen, Lars

    2015-07-01

    The need for precise and stable taxonomic classification is highly relevant in modern microbiology. Parallel to the explosion in the amount of sequence data accessible, there has also been a shift in focus for classification methods. Previously, alignment-based methods were the most applicable tools. Now, methods based on counting K-mers by sliding windows are the most interesting classification approach with respect to both speed and accuracy. Here, we present a systematic comparison of five different K-mer based classification methods for the 16S rRNA gene. The methods differ from each other in both data usage and modelling strategies. We based our study on the commonly known and well-used naïve Bayes classifier from the RDP project, and four other methods were implemented and tested on two different data sets, on full-length sequences as well as fragments of typical read-length. The differences in classification error between the methods seemed to be small, but they were stable across both data sets tested. The Preprocessed nearest-neighbour (PLSNN) method performed best for full-length 16S rRNA sequences, significantly better than the naïve Bayes RDP method. On fragmented sequences the naïve Bayes Multinomial method performed best, significantly better than all other methods. For both data sets explored, and on both full-length and fragmented sequences, all five methods reached an error plateau. We conclude that no K-mer based method is universally best for classifying both full-length sequences and fragments (reads). All methods approach an error plateau, indicating that improved training data are needed to improve classification from here. Classification errors occur most frequently for genera with few sequences present. For improving the taxonomy and testing new classification methods, a better, more universal and more robust training data set is crucial.
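
    For orientation, here is a minimal sketch of the sliding-window K-mer profiling that all five compared methods start from, paired with a simple nearest-neighbour classifier; it is illustrative only and not the PLSNN or RDP implementations.

    ```python
    import numpy as np
    from collections import Counter
    from itertools import product

    def kmer_profile(seq, k=4):
        """Sliding-window K-mer frequency profile over the ACGT alphabet."""
        counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
        total = sum(counts.values())
        return np.array([counts["".join(p)] / total
                         for p in product("ACGT", repeat=k)])

    def classify(seq, train_profiles, train_labels, k=4):
        """Assign the label of the nearest training profile (Euclidean distance)."""
        d = np.linalg.norm(np.array(train_profiles) - kmer_profile(seq, k), axis=1)
        return train_labels[int(np.argmin(d))]

    # toy training library of two "genera"
    train = [("ACGTACGTAC" * 10, "GenusA"), ("TTGGCCAATT" * 10, "GenusB")]
    profiles = [kmer_profile(s) for s, _ in train]
    labels = [l for _, l in train]
    predicted = classify("ACGTACGTGG" * 10, profiles, labels)
    ```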

  12. EVALUATION OF A TEST METHOD FOR MEASURING INDOOR AIR EMISSIONS FROM DRY-PROCESS PHOTOCOPIERS

    EPA Science Inventory

    A large chamber test method for measuring indoor air emissions from office equipment was developed, evaluated, and revised based on the initial testing of four dry-process photocopiers. Because all chambers may not necessarily produce similar results (e.g., due to differences in ...

  13. 77 FR 30540 - Proposed Collection; Comment Request; Cognitive Testing of Instrumentation and Materials for the...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-23

    ... methods of administration (e.g., computer assisted personal interviews [CAPI], audio computer assisted self-interviews [ACASI], web-based interviews). Cognitive testing of these materials and methods will...

  14. Detecting Gear Tooth Fatigue Cracks in Advance of Complete Fracture

    NASA Technical Reports Server (NTRS)

    Zakrajsek, James J.; Lewicki, David G.

    1996-01-01

    Results of using vibration-based methods to detect gear tooth fatigue cracks are presented. An experimental test rig was used to fail a number of spur gear specimens through bending fatigue. The gear tooth fatigue crack in each test was initiated through a small notch in the fillet area of a tooth on the gear. The primary purpose of these tests was to verify analytical predictions of fatigue crack propagation direction and rate as a function of gear rim thickness. The vibration signal from a total of three tests was monitored and recorded for gear fault detection research. The damage consisted of complete rim fracture on the two thin rim gears and single tooth fracture on the standard full rim test gear. Vibration-based fault detection methods were applied to the vibration signal both on-line and after the tests were completed. The objectives of this effort were to identify methods capable of detecting the fatigue crack and to determine how far in advance of total failure positive detection was given. Results show that the fault detection methods failed to respond to the fatigue crack prior to complete rim fracture in the thin rim gear tests. In the standard full rim gear test all of the methods responded to the fatigue crack in advance of tooth fracture; however, only three of the methods responded to the fatigue crack in the early stages of crack propagation.

  15. Examinations of electron temperature calculation methods in Thomson scattering diagnostics.

    PubMed

    Oh, Seungtae; Lee, Jong Ha; Wi, Hanmin

    2012-10-01

    Electron temperature in Thomson scattering diagnostics is derived through an indirect calculation based on a theoretical model. The chi-square test is commonly used in the calculation, and the reliability of the calculation method depends strongly on the noise level of the input signals. In simulations, the noise effects on the chi-square test are examined, and a scale factor test is proposed as an alternative method.
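
    The chi-square step amounts to minimising a weighted squared mismatch between measured channel signals and the theoretical model as a function of Te. A minimal sketch, with a toy two-channel model standing in for the polychromator response (all names and values are illustrative):

    ```python
    import numpy as np
    from scipy.optimize import minimize_scalar

    def fit_te(measured, sigma, model, te_bounds=(10.0, 10_000.0)):
        """Fit Te by minimising chi2(Te) = sum_i ((measured_i - model_i(Te)) / sigma_i)**2."""
        chi2 = lambda te: np.sum(((measured - model(te)) / sigma) ** 2)
        return minimize_scalar(chi2, bounds=te_bounds, method="bounded").x

    # toy two-channel spectral response standing in for the polychromator model
    model = lambda te: np.array([np.exp(-1000.0 / te), np.exp(-3000.0 / te)])
    measured = model(2500.0) + np.array([0.01, -0.01])   # "noisy" measurement
    te_fit = fit_te(measured, sigma=np.array([0.02, 0.02]), model=model)
    ```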

  16. Numerical Simulation of Selecting Model Scale of Cable in Wind Tunnel Test

    NASA Astrophysics Data System (ADS)

    Huang, Yifeng; Yang, Jixin

    The numerical simulation method based on Computational Fluid Dynamics (CFD) provides a possible alternative to physical wind tunnel tests. First, the correctness of the numerical simulation method is validated with a benchmark example. In order to select the minimum length of cable for a given diameter in numerical wind tunnel tests, CFD-based numerical wind tunnel tests are carried out on cables with several different length-to-diameter ratios (L/D). The results show that when L/D reaches 18, the drag coefficient is essentially stable.

  17. Conditional Monte Carlo randomization tests for regression models.

    PubMed

    Parhat, Parwen; Rosenberger, William F; Diao, Guoqing

    2014-08-15

    We discuss the computation of randomization tests for clinical trials of two treatments when the primary outcome is based on a regression model. We begin by revisiting the seminal paper of Gail, Tan, and Piantadosi (1988), and then describe a method based on Monte Carlo generation of randomization sequences. The tests based on this Monte Carlo procedure are design based, in that they incorporate the particular randomization procedure used. We discuss permuted block designs, complete randomization, and biased coin designs. We also use a new technique by Plamadeala and Rosenberger (2012) for simple computation of conditional randomization tests. Like Gail, Tan, and Piantadosi, we focus on residuals from generalized linear models and martingale residuals from survival models. Such techniques do not apply to longitudinal data analysis, and we introduce a method for computation of randomization tests based on the predicted rate of change from a generalized linear mixed model when outcomes are longitudinal. We show, by simulation, that these randomization tests preserve the size and power well under model misspecification. Copyright © 2014 John Wiley & Sons, Ltd.
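
    A Monte Carlo randomization test of the kind described reduces to re-drawing assignment sequences from the trial's actual randomization procedure and recomputing the test statistic. A minimal sketch on model residuals, with illustrative names (the paper's statistics are residual-based but model-specific):

    ```python
    import numpy as np

    def randomization_test(residuals, assignment, regenerate, n_mc=10_000, seed=0):
        """Design-based Monte Carlo randomization test: the observed between-arm
        difference in residuals is referred to its distribution under sequences
        re-drawn from the trial's randomization procedure. `regenerate` encodes
        that procedure (complete randomization, permuted blocks, biased coin, ...)."""
        rng = np.random.default_rng(seed)
        stat = lambda a: residuals[a == 1].mean() - residuals[a == 0].mean()
        observed = stat(assignment)
        null = np.array([stat(regenerate(rng)) for _ in range(n_mc)])
        return float((np.abs(null) >= abs(observed)).mean())  # two-sided MC p-value

    # example with complete randomization of 40 subjects
    rng0 = np.random.default_rng(1)
    resid = rng0.normal(size=40)
    assign = rng0.integers(0, 2, size=40)
    p_value = randomization_test(resid, assign, lambda r: r.integers(0, 2, size=40))
    ```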

  18. Non-destructive testing of full-length bonded rock bolts based on HHT signal analysis

    NASA Astrophysics Data System (ADS)

    Shi, Z. M.; Liu, L.; Peng, M.; Liu, C. C.; Tao, F. J.; Liu, C. S.

    2018-04-01

    Full-length bonded rock bolts are commonly used in mining, tunneling and slope engineering because of their simple design and resistance to corrosion. However, the length of a rock bolt and the grouting quality often do not meet the required design standards in practice because of the concealment and complexity of bolt construction. Non-destructive testing is preferred for assessing a rock bolt's quality because of its convenience, low cost and wide detection range. In this paper, a signal analysis method for the non-destructive sound wave testing of full-length bonded rock bolts is presented, which is based on the Hilbert-Huang transform (HHT). First, we introduce the HHT analysis method to calculate the bolt length and identify defect locations from sound wave reflection test signals, which includes decomposing the test signal via empirical mode decomposition (EMD), selecting the intrinsic mode functions (IMFs) using the Pearson Correlation Index (PCI) and calculating the instantaneous phase and frequency via the Hilbert transform (HT). Second, six model tests are conducted using different grouting defects and bolt protruding lengths to verify the effectiveness of the HHT analysis method. Lastly, the influence of the bolt protruding length on the test signal, the identification of multiple reflections from defects, the bolt end and the protruding end, and mode mixing from EMD are discussed. The HHT analysis method can identify the bolt length and grouting defect locations from signals that contain noise at multiple reflected interfaces. The reflection from a long protruding end creates an irregular test signal with many frequency peaks in the spectrum. The reflections from defects barely change the original signal because they are low in energy, and they cannot be adequately resolved using existing methods. The HHT analysis method can identify reflections from the long protruding end of the bolt and multiple reflections from grouting defects based on mutations in the instantaneous frequency, which makes weak reflections more noticeable. The mode mixing phenomenon is observed in several tests, but it does not markedly affect the identification results because of the simple medium in bolt tests. Mode mixing can be reduced by ensemble EMD (EEMD) or complete ensemble EMD with adaptive noise (CEEMDAN), which are powerful tools for analyzing the test signal in a complex medium and may play an important role in future studies. The HHT bolt signal analysis method is a self-adaptive and automatic process, which can be programmed as analysis software and will make bolt tests more convenient in practice.
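
    The HT step of the method, extracting instantaneous phase and frequency, can be sketched directly with the Hilbert transform; in the full method this is applied to each PCI-selected IMF rather than to the raw trace, and the synthetic trace below is illustrative only.

    ```python
    import numpy as np
    from scipy.signal import hilbert

    def instantaneous_frequency(trace, fs):
        """Instantaneous phase and frequency via the Hilbert transform."""
        analytic = hilbert(trace)
        phase = np.unwrap(np.angle(analytic))
        freq = np.diff(phase) / (2.0 * np.pi) * fs   # Hz, one sample shorter than trace
        return phase, freq

    # a late-arriving reflection shows up as a mutation in instantaneous frequency
    fs = 100_000.0
    t = np.arange(0.0, 0.01, 1.0 / fs)
    trace = np.sin(2 * np.pi * 5_000 * t) \
          + 0.3 * np.sin(2 * np.pi * 12_000 * t) * (t > 0.005)
    phase, freq = instantaneous_frequency(trace, fs)
    ```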

  19. Efficient data assimilation algorithm for bathymetry application

    NASA Astrophysics Data System (ADS)

    Ghorbanidehno, H.; Lee, J. H.; Farthing, M.; Hesser, T.; Kitanidis, P. K.; Darve, E. F.

    2017-12-01

    Information on the evolving state of the nearshore zone bathymetry is crucial to shoreline management, recreational safety, and naval operations. The high cost and complex logistics of using ship-based surveys for bathymetry estimation have encouraged the use of remote sensing techniques. Data assimilation methods combine the remote sensing data and nearshore hydrodynamic models to estimate the unknown bathymetry and the corresponding uncertainties. In particular, several recent efforts have combined Kalman Filter-based techniques such as ensemble-based Kalman filters with indirect video-based observations to address the bathymetry inversion problem. However, these methods often suffer from ensemble collapse and uncertainty underestimation. Here, the Compressed State Kalman Filter (CSKF) method is used to estimate the bathymetry based on observed wave celerity. In order to demonstrate the accuracy and robustness of the CSKF method, we consider twin tests with synthetic observations of wave celerity, while the bathymetry profiles are chosen based on surveys taken by the U.S. Army Corps of Engineers Field Research Facility (FRF) in Duck, NC. The first test case is a bathymetry estimation problem for a spatially smooth and temporally constant bathymetry profile. The second test case is a bathymetry estimation problem for a bathymetry evolving in time from a smooth to a non-smooth profile. For both problems, we compare the results of CSKF with those obtained by the local ensemble transform Kalman filter (LETKF), a popular ensemble-based Kalman filter method.
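
    For orientation, the measurement update that any such filter approximates is the standard Kalman update sketched below; the CSKF's contribution is keeping the covariance in a compressed low-rank form, which this dense, illustrative version deliberately ignores.

    ```python
    import numpy as np

    def kalman_update(x, P, H, R, y):
        """Standard Kalman measurement update. In the bathymetry setting, x is the
        depth estimate on the model grid, H linearises the wave-celerity operator,
        and y holds the remotely sensed celerities (all stand-ins here)."""
        S = H @ P @ H.T + R                     # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
        x_new = x + K @ (y - H @ x)             # state correction
        P_new = (np.eye(len(x)) - K @ H) @ P    # covariance shrinkage
        return x_new, P_new
    ```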

  20. Diagnostics of Polymer Composite Materials and Analysis of Their Production Technology by Using the Method of Acoustic Emission

    NASA Astrophysics Data System (ADS)

    Bashkov, O. V.; Protsenko, A. E.; Bryanskii, A. A.; Romashko, R. V.

    2017-09-01

    The strength properties of glass-fiber-reinforced plastics produced by vacuum and vacuum-autoclave molding techniques are studied. Based on acoustic emission data, a method for diagnostics and prediction of the bearing capacity of polymer composite materials using data from three-point bending tests is developed. The method is based on evaluating changes in the exponent of a power function relating the total acoustic emission to the test stress.

  1. Improved volumetric measurement of brain structure with a distortion correction procedure using an ADNI phantom.

    PubMed

    Maikusa, Norihide; Yamashita, Fumio; Tanaka, Kenichiro; Abe, Osamu; Kawaguchi, Atsushi; Kabasawa, Hiroyuki; Chiba, Shoma; Kasahara, Akihiro; Kobayashi, Nobuhisa; Yuasa, Tetsuya; Sato, Noriko; Matsuda, Hiroshi; Iwatsubo, Takeshi

    2013-06-01

    Serial magnetic resonance imaging (MRI) images acquired from multisite and multivendor MRI scanners are widely used in measuring longitudinal structural changes in the brain. Precise and accurate measurements are important in understanding the natural progression of neurodegenerative disorders such as Alzheimer's disease. However, geometric distortions in MRI images decrease the accuracy and precision of volumetric or morphometric measurements. To solve this problem, the authors suggest a distortion correction method, based on a commercially available phantom, that accommodates the variation in geometric distortion within MRI images obtained with multivendor MRI scanners. The authors' method is based on image warping using a polynomial function. The method detects fiducial points within a phantom image using phantom analysis software developed by the Mayo Clinic and calculates warping functions for distortion correction. To quantify the effectiveness of the authors' method, the authors corrected phantom images obtained from multivendor MRI scanners and calculated the root-mean-square (RMS) of fiducial errors and the circularity ratio as evaluation values. The authors also compared the performance of the authors' method with that of a distortion correction method based on a spherical harmonics description of the generic gradient design parameters. Moreover, the authors evaluated whether this correction improves the test-retest reproducibility of voxel-based morphometry in human studies. A Wilcoxon signed-rank test with uncorrected and corrected images was performed. The root-mean-square errors and circularity ratios for all slices significantly improved (p < 0.0001) after the authors' distortion correction. Additionally, the authors' method was significantly better at reducing the root-mean-square errors than a distortion correction method based on a description of spherical harmonics (p < 0.001 and p = 0.0337, respectively). Moreover, the authors' method reduced the RMS error arising from gradient nonlinearity more than gradwarp methods. In human studies, the coefficient of variation of voxel-based morphometry analysis of the whole brain improved significantly from 3.46% to 2.70% after distortion correction of the whole gray matter using the authors' method (Wilcoxon signed-rank test, p < 0.05). The authors proposed a phantom-based distortion correction method to improve reproducibility in longitudinal structural brain analysis using multivendor MRI. The authors evaluated their method for phantom images in terms of two geometrical values and for human images in terms of test-retest reproducibility. The results showed that distortion was corrected significantly using the authors' method. In human studies, the reproducibility of voxel-based morphometry analysis for the whole gray matter significantly improved after distortion correction using the authors' method.
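
    Polynomial image warping of the kind described can be sketched as a least-squares fit of monomial coefficients to fiducial pairs. The 2-D version below is illustrative (the phantom analysis is 3-D), and all names and the synthetic distortion are ours, not the authors' software.

    ```python
    import numpy as np

    def design_matrix(pts, order=3):
        """Monomial design matrix with terms x**i * y**j for i + j <= order."""
        x, y = pts[:, 0], pts[:, 1]
        return np.column_stack([x**i * y**j
                                for i in range(order + 1)
                                for j in range(order + 1 - i)])

    def fit_poly_warp(measured, reference, order=3):
        """Least-squares polynomial warp taking measured (distorted) fiducial
        coordinates onto their known reference positions, one column per axis."""
        coeffs, *_ = np.linalg.lstsq(design_matrix(measured, order), reference,
                                     rcond=None)
        return coeffs

    def apply_poly_warp(coeffs, pts, order=3):
        return design_matrix(pts, order) @ coeffs

    # fit on phantom fiducials (measured vs. known positions), then correct points
    measured = np.random.default_rng(0).uniform(0.0, 100.0, size=(40, 2))
    reference = measured + 1e-4 * measured**2        # synthetic smooth distortion
    coeffs = fit_poly_warp(measured, reference)
    corrected = apply_poly_warp(coeffs, measured)
    ```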

  2. Delta Clipper-Experimental In-Ground Effect on Base-Heating Environment

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See

    1998-01-01

    A quasitransient in-ground effect method is developed to study the effect of vertical landing on a launch vehicle base-heating environment. This computational methodology is based on a three-dimensional, pressure-based, viscous flow, chemically reacting, computational fluid dynamics formulation. Important in-ground base-flow physics such as the fountain-jet formation, plume growth, air entrainment, and plume afterburning are captured with the present methodology. Convective and radiative base-heat fluxes are computed for comparison with those of a flight test. The influence of the laminar Prandtl number on the convective heat flux is included in this study. A radiative direction-dependency test is conducted using both the discrete ordinate and finite volume methods. Treatment of the plume afterburning is found to be very important for accurate prediction of the base-heat fluxes. Convective and radiative base-heat fluxes predicted by the model using a finite rate chemistry option compared reasonably well with flight-test data.

  3. Physico-chemical properties of manufactured nanomaterials - Characterisation and relevant methods. An outlook based on the OECD Testing Programme.

    PubMed

    Rasmussen, Kirsten; Rauscher, Hubert; Mech, Agnieszka; Riego Sintes, Juan; Gilliland, Douglas; González, Mar; Kearns, Peter; Moss, Kenneth; Visser, Maaike; Groenewold, Monique; Bleeker, Eric A J

    2018-02-01

    Identifying and characterising nanomaterials require additional information on physico-chemical properties and test methods, compared to chemicals in general. Furthermore, regulatory decisions for chemicals are usually based upon certain toxicological properties, and these effects may not be equivalent to those for nanomaterials. However, regulatory agencies lack an authoritative decision framework for nanomaterials that links the relevance of certain physico-chemical endpoints to toxicological effects. This paper investigates various physico-chemical endpoints and available test methods that could be used to produce such a decision framework for nanomaterials. It presents an overview of regulatory relevance and methods used for testing fifteen proposed physico-chemical properties of eleven nanomaterials in the OECD Working Party on Manufactured Nanomaterials' Testing Programme, complemented with methods from literature, and assesses the methods' adequacy and applications limits. Most endpoints are of regulatory relevance, though the specific parameters depend on the nanomaterial and type of assessment. Size (distribution) is the common characteristic of all nanomaterials and is decisive information for classifying a material as a nanomaterial. Shape is an important particle descriptor. The octanol-water partitioning coefficient is undefined for particulate nanomaterials. Methods, including sample preparation, need to be further standardised, and some new methods are needed. The current work of OECD's Test Guidelines Programme regarding physico-chemical properties is highlighted. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  4. Using a fuzzy comprehensive evaluation method to determine product usability: A test case.

    PubMed

    Zhou, Ronggang; Chan, Alan H S

    2017-01-01

    In order to take into account the inherent uncertainties during product usability evaluation, Zhou and Chan [1] proposed a comprehensive method of usability evaluation for products by combining the analytic hierarchy process (AHP) and fuzzy evaluation methods for synthesizing performance data and subjective response data. This method was designed to provide an integrated framework combining the inevitable vague judgments from the multiple stages of the product evaluation process. In order to illustrate the effectiveness of the model, this study used a summative usability test case to assess the application and strength of the general fuzzy usability framework. To test the proposed fuzzy usability evaluation framework [1], a standard summative usability test was conducted to benchmark the overall usability of a specific network management software. Based on the test data, the fuzzy method was applied to incorporate both the usability scores and uncertainties involved in the multiple components of the evaluation. Then, with Monte Carlo simulation procedures, confidence intervals were used to compare the reliabilities of the fuzzy approach and two typical conventional methods combining metrics based on percentages. This case study showed that the fuzzy evaluation technique can be applied successfully for combining summative usability testing data to achieve an overall usability quality for the network software evaluated. The greater confidence interval widths of the equally averaged percentage method and the weighted evaluation methods, including the weighted percentage averages method, relative to the fuzzy approach verified the strength of the fuzzy method.

  5. On Bayesian Testing of Additive Conjoint Measurement Axioms Using Synthetic Likelihood.

    PubMed

    Karabatsos, George

    2018-06-01

    This article introduces a Bayesian method for testing the axioms of additive conjoint measurement. The method is based on an importance sampling algorithm that performs likelihood-free, approximate Bayesian inference using a synthetic likelihood to overcome the analytical intractability of this testing problem. This new method improves upon previous methods because it provides an omnibus test of the entire hierarchy of cancellation axioms, beyond double cancellation. It does so while accounting for the posterior uncertainty that is inherent in the empirical orderings that are implied by these axioms, together. The new method is illustrated through a test of the cancellation axioms on a classic survey data set, and through the analysis of simulated data.

  6. Development of dynamic Bayesian models for web application test management

    NASA Astrophysics Data System (ADS)

    Azarnova, T. V.; Polukhin, P. V.; Bondarenko, Yu V.; Kashirina, I. L.

    2018-03-01

    The mathematical apparatus of dynamic Bayesian networks is an effective and technically proven tool that can be used to model complex stochastic dynamic processes. According to the results of the research, the mathematical models and methods of dynamic Bayesian networks provide high coverage of the stochastic tasks associated with error testing in multiuser software products operated in a dynamically changing environment. Formalized representation of the discrete test process as a dynamic Bayesian model allows us to organize the logical connections between individual test assets across multiple time slices. This approach makes it possible to present testing as a discrete process with set structural components responsible for the generation of test assets. Dynamic Bayesian network-based models allow us to combine, within one management area, individual units and testing components that have different functionalities and directly influence each other during the comprehensive testing of various groups of software bugs. The application of the proposed models provides an opportunity to use a consistent approach to formalize test principles and procedures, methods used to treat situational error signs, and methods used to produce analytical conclusions based on test results.

  7. The Impact of Team-Based Learning on Nervous System Examination Knowledge of Nursing Students.

    PubMed

    Hemmati Maslakpak, Masomeh; Parizad, Naser; Zareie, Farzad

    2015-12-01

    Team-based learning is one of the active learning approaches in which independent learning is combined with small group discussion in the class. This study aimed to determine the impact of team-based learning on nervous system examination knowledge of nursing students. This quasi-experimental study was conducted on 3rd-year nursing students, comprising 5th-semester (intervention group) and 6th-semester (control group) students. The team-based learning method and the traditional lecture method were used to teach examination of the nervous system to the intervention and control groups, respectively. The data were collected by a 40-question test (multiple choice, matching, gap-filling and descriptive questions) before and after the intervention in both groups. An Individual Readiness Assurance Test (RAT) and a Group Readiness Assurance Test (GRAT) were used to collect data in the intervention group. The collected data were analyzed with SPSS ver. 13 using descriptive and inferential statistical tests. In the team-based learning group, the mean (SD) score was 13.39 (4.52) before the intervention and increased to 31.07 (3.20) after the intervention; this increase was statistically significant. There was also a statistically significant difference between the RAT and GRAT scores in the team-based learning group. The team-based learning approach resulted in much greater improvement and stability in the nervous system examination knowledge of nursing students compared with the traditional lecture method; therefore, this method could be efficiently used as an effective educational approach in nursing education.

  8. IMPROVED METHODS FOR HEPATITIS A VIRUS AND ROTAVIRUS CONCENTRATION AND DETECTION IN RECREATIONAL, RAW POTABLE, AND FINISHED WATERS

    EPA Science Inventory

    The report contains procedures for detecting rotaviruses based upon an immunofluorescence test using a monoclonal antibody and fluorescein-isothiocyanate-conjugated antibody staining method to visualize virus-infected cells. Also contained in the report are test methods for detec...

  9. Effects of an Inquiry-Based Short Intervention on State Test Anxiety in Comparison to Alternative Coping Strategies

    PubMed Central

    Krispenz, Ann; Dickhäuser, Oliver

    2018-01-01

    Background and Objectives: Test anxiety can have undesirable consequences for learning and academic achievement. The control-value theory of achievement emotions assumes that test anxiety is experienced if a student appraises an achievement situation as important (value appraisal), but feels that the situation and its outcome are not fully under his or her control (control appraisal). Accordingly, modification of cognitive appraisals is assumed to reduce test anxiety. One method aiming at the modification of appraisals is inquiry-based stress reduction. In the present study (N = 162), we assessed the effects of an inquiry-based short intervention on test anxiety. Design: Short-term longitudinal, randomized control trial. Methods: Focusing on an individual worry thought, 53 university students received an inquiry-based short intervention. Control participants reflected on their worry thought (n = 55) or were distracted (n = 52). Thought-related test anxiety was assessed before, immediately after, and 2 days after the experimental treatment. Results: After the intervention as well as 2 days later, individuals who had received the inquiry-based intervention demonstrated significantly lower test anxiety than participants from the pooled control groups. Further analyses showed that the inquiry-based short intervention was more effective than reflecting on a worry thought but had no advantage over distraction. Conclusions: Our findings provide first experimental evidence for the effectiveness of an inquiry-based short intervention in reducing students’ test anxiety. PMID:29515507

  10. A Combined Independent Source Separation and Quality Index Optimization Method for Fetal ECG Extraction from Abdominal Maternal Leads

    PubMed Central

    Billeci, Lucia; Varanini, Maurizio

    2017-01-01

    The non-invasive fetal electrocardiogram (fECG) technique has recently received considerable interest in monitoring fetal health. The aim of our paper is to propose a novel fECG algorithm based on the combination of the criteria of independent source separation and of a quality index optimization (ICAQIO-based). The algorithm was compared with two methods applying the two different criteria independently—the ICA-based and the QIO-based methods—which were previously developed by our group. All three methods were tested on the recently implemented Fetal ECG Synthetic Database (FECGSYNDB). Moreover, the performance of the algorithm was tested on real data from the PhysioNet fetal ECG Challenge 2013 Database. The proposed combined method outperformed the other two algorithms on the FECGSYNDB (ICAQIO-based: 98.78%, QIO-based: 97.77%, ICA-based: 97.61%). Significant differences were obtained in particular in the conditions when uterine contractions and maternal and fetal ectopic beats occurred. On the real data, all three methods obtained very high performances, with the QIO-based method proving slightly better than the other two (ICAQIO-based: 99.38%, QIO-based: 99.76%, ICA-based: 99.37%). The findings from this study suggest that the proposed method could potentially be applied as a novel algorithm for accurate extraction of fECG, especially in critical recording conditions. PMID:28509860
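
    A minimal sketch of the general idea (not the authors' ICAQIO algorithm): run FastICA over synthetic multi-lead mixtures and pick the component maximizing a hypothetical quality index based on autocorrelation in a plausible fetal RR range. The sampling rate, mixing matrix, and quality index are all assumptions.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(1)
fs = 500                                   # assumed sampling rate (Hz)
t = np.arange(10 * fs) / fs

# Synthetic stand-ins for maternal and fetal sources mixed into 4 leads;
# real use would load multi-lead abdominal recordings instead.
maternal = np.sin(2 * np.pi * 1.2 * t)              # ~72 bpm stand-in
fetal = 0.2 * np.sign(np.sin(2 * np.pi * 2.3 * t))  # ~138 bpm stand-in
S = np.column_stack([maternal, fetal])
X = S @ rng.normal(size=(2, 4)) + 0.05 * rng.normal(size=(len(t), 4))

ica = FastICA(n_components=4, random_state=0)
sources = ica.fit_transform(X)   # independent components, one per column

def quality_index(s, min_lag=int(0.25 * fs), max_lag=int(0.6 * fs)):
    """Hypothetical QI: peak autocorrelation at lags in a plausible fetal RR range."""
    s = (s - s.mean()) / s.std()
    ac = np.correlate(s, s, mode="full")[len(s) - 1:] / len(s)
    return ac[min_lag:max_lag].max()

best = max(range(sources.shape[1]), key=lambda i: quality_index(sources[:, i]))
print("component selected as fECG candidate:", best)
```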

  11. A scan statistic for binary outcome based on hypergeometric probability model, with an application to detecting spatial clusters of Japanese encephalitis.

    PubMed

    Zhao, Xing; Zhou, Xiao-Hua; Feng, Zijian; Guo, Pengfei; He, Hongyan; Zhang, Tao; Duan, Lei; Li, Xiaosong

    2013-01-01

    As a useful tool for geographical cluster detection of events, the spatial scan statistic is widely applied in many fields and plays an increasingly important role. The classic version of the spatial scan statistic for binary outcomes was developed by Kulldorff, based on the Bernoulli or the Poisson probability model. In this paper, we apply the hypergeometric probability model to construct the likelihood function under the null hypothesis. Compared with existing methods, this likelihood function offers an alternative, indirect way to identify the potential cluster, with the test statistic taken as the extreme value of the likelihood function. As in Kulldorff's methods, we adopt a Monte Carlo test to assess significance. Both methods are applied to detecting spatial clusters of Japanese encephalitis in Sichuan province, China, in 2009, and the detected clusters are identical. A simulation on independent benchmark data indicates that the test statistic based on the hypergeometric model outperforms Kulldorff's statistics for clusters of high population density or large size; otherwise, Kulldorff's statistics are superior.
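
    A schematic stand-in for the hypergeometric scan test, assuming contiguous one-dimensional windows instead of true spatial circles and invented case counts; each window is scored by the hypergeometric improbability of its case count, and significance is assessed by Monte Carlo redistribution of cases, as in the paper.

```python
import numpy as np
from scipy.stats import hypergeom

rng = np.random.default_rng(2)

# Hypothetical region data: cases and population at each location.
pop   = np.array([500, 800, 300, 900, 400, 700])
cases = np.array([  9,  12,  15,  10,   4,   8])
N, C = pop.sum(), cases.sum()

# Candidate windows = contiguous location groups (a schematic stand-in
# for the circular windows of a true spatial scan).
windows = [list(range(i, j)) for i in range(len(pop))
           for j in range(i + 1, len(pop) + 1)]

def scan_stat(case_vec):
    best = -np.inf
    for w in windows:
        k, n = case_vec[w].sum(), pop[w].sum()
        # Improbability of the in-window case count under the
        # hypergeometric null (cases spread at random over people).
        best = max(best, -hypergeom.logpmf(k, N, C, n))
    return best

obs = scan_stat(cases)
# Monte Carlo significance: redistribute the C cases at random.
null = [scan_stat(rng.multivariate_hypergeometric(pop, C))
        for _ in range(999)]
p = (1 + sum(s >= obs for s in null)) / (1 + len(null))
print(f"scan statistic = {obs:.2f}, Monte Carlo p = {p:.3f}")
```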

  12. Test pattern generation for ILA sequential circuits

    NASA Technical Reports Server (NTRS)

    Feng, YU; Frenzel, James F.; Maki, Gary K.

    1993-01-01

    An efficient method of generating test patterns for sequential machines implemented using one-dimensional, unilateral, iterative logic arrays (ILA's) of BTS pass transistor networks is presented. Based on a transistor level fault model, the method affords a unique opportunity for real-time fault detection with improved fault coverage. The resulting test sets are shown to be equivalent to those obtained using conventional gate level models, thus eliminating the need for additional test patterns. The proposed method advances the simplicity and ease of the test pattern generation for a special class of sequential circuitry.

  13. A comparison of respondent-driven and venue-based sampling of female sex workers in Liuzhou, China

    PubMed Central

    Weir, Sharon S; Merli, M Giovanna; Li, Jing; Gandhi, Anisha D; Neely, William W; Edwards, Jessie K; Suchindran, Chirayath M; Henderson, Gail E; Chen, Xiang-Sheng

    2012-01-01

    Objectives To compare two methods for sampling female sex workers (FSWs) for bio-behavioural surveillance. We compared the populations of sex workers recruited by the venue-based Priorities for Local AIDS Control Efforts (PLACE) method and a concurrently implemented network-based sampling method, respondent-driven sampling (RDS), in Liuzhou, China. Methods For the PLACE protocol, all female workers at a stratified random sample of venues identified as places where people meet new sexual partners were interviewed and tested for syphilis. Female workers who reported sex work in the past 4 weeks were categorised as FSWs. RDS used peer recruitment and chain referral to obtain a sample of FSWs. Data were collected between October 2009 and January 2010. We compared the socio-demographic characteristics and the percentage with a positive syphilis test of FSWs recruited by PLACE and RDS. Results The prevalence of a positive syphilis test was 24% among FSWs recruited by PLACE and 8.5% among those recruited by RDS and tested (prevalence ratio 3.3; 95% CI 1.5 to 7.2). Socio-demographic characteristics (age, residence and monthly income) also varied by sampling method. PLACE recruited fewer FSWs than RDS (161 vs 583), was more labour-intensive and had difficulty gaining access to some venues. RDS was more likely to recruit from areas near the RDS office and from large low prevalence entertainment venues. Conclusions Surveillance protocols using different sampling methods can obtain different estimates of prevalence and population characteristics. Venue-based and network-based methods each have strengths and limitations reflecting differences in design and assumptions. We recommend that more research be conducted on measuring bias in bio-behavioural surveillance. PMID:23172350

  14. Multiaxial Fatigue Life Prediction Based on Nonlinear Continuum Damage Mechanics and Critical Plane Method

    NASA Astrophysics Data System (ADS)

    Wu, Z. R.; Li, X.; Fang, L.; Song, Y. D.

    2018-04-01

    A new multiaxial fatigue life prediction model is proposed in this paper. The concepts of nonlinear continuum damage mechanics and critical plane criteria are incorporated in the proposed model. A shear strain-based damage control parameter was chosen to account for multiaxial fatigue damage under constant amplitude loading. Fatigue tests were conducted on nickel-based superalloy GH4169 tubular specimens at 400 °C under proportional and nonproportional loading. The proposed method was checked against the multiaxial fatigue test data of GH4169; most of the prediction results fall within a factor-of-two scatter band of the test results.

  15. Field demonstration of on-site analytical methods for TNT and RDX in ground water

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Craig, H.; Ferguson, G.; Markos, A.

    1996-12-31

    A field demonstration was conducted to assess the performance of eight commercially available and emerging colorimetric, immunoassay, and biosensor on-site analytical methods for the explosives 2,4,6-trinitrotoluene (TNT) and hexahydro-1,3,5-trinitro-1,3,5-triazine (RDX) in ground water and leachate at the Umatilla Army Depot Activity, Hermiston, Oregon and US Naval Submarine Base, Bangor, Washington, Superfund sites. Ground water samples were analyzed by each of the on-site methods and the results compared to laboratory analysis using high performance liquid chromatography (HPLC) with EPA SW-846 Method 8330. The commercial methods evaluated included the EnSys, Inc., TNT and RDX colorimetric test kits (EPA SW-846 Methods 8515 and 8510) with a solid phase extraction (SPE) step, the DTECH/EM Science TNT and RDX immunoassay test kits (EPA SW-846 Methods 4050 and 4051), and the Ohmicron TNT immunoassay test kit. The emerging methods tested included the antibody-based Naval Research Laboratory (NRL) Continuous Flow Immunosensor (CFI) for TNT and RDX, and the Fiber Optic Biosensor (FOB) for TNT. Accuracy of the on-site methods was evaluated using linear regression analysis and relative percent difference (RPD) comparison criteria. Over the range of conditions tested, the colorimetric methods showed the highest accuracy for TNT and RDX. The colorimetric method was selected for routine ground water monitoring at the Umatilla site, and further field testing of the NRL CFI and FOB biosensors will continue at both Superfund sites.
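
    The accuracy criteria named above are straightforward to compute; the sketch below evaluates a hypothetical on-site data set against an HPLC reference using relative percent difference and linear regression (all concentrations are invented).

```python
import numpy as np
from scipy.stats import linregress

# Hypothetical paired results (ug/L): on-site kit vs. EPA Method 8330 HPLC.
hplc   = np.array([2.1, 5.4, 12.0, 30.5, 75.0, 150.0])
onsite = np.array([2.4, 5.0, 13.1, 28.0, 80.2, 141.0])

# Relative percent difference for each paired measurement.
rpd = 100 * np.abs(onsite - hplc) / ((onsite + hplc) / 2)
print("RPD (%):", np.round(rpd, 1))

# Linear regression of on-site results against the HPLC reference;
# slope near 1 and intercept near 0 indicate good field accuracy.
fit = linregress(hplc, onsite)
print(f"slope={fit.slope:.3f}, intercept={fit.intercept:.2f}, r={fit.rvalue:.4f}")
```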

  16. Space vehicle engine and heat shield environment review. Volume 1: Engineering analysis

    NASA Technical Reports Server (NTRS)

    Mcanelly, W. B.; Young, C. T. K.

    1973-01-01

    Methods for predicting the base heating characteristics of a multiple rocket engine installation are discussed. The environmental data is applied to the design of adequate protection system for the engine components. The methods for predicting the base region thermal environment are categorized as: (1) scale model testing, (2) extrapolation of previous and related flight test results, and (3) semiempirical analytical techniques.

  17. [Interlaboratory Study on Evaporation Residue Test for Food Contact Products (Report 1)].

    PubMed

    Ohno, Hiroyuki; Mutsuga, Motoh; Abe, Tomoyuki; Abe, Yutaka; Amano, Homare; Ishihara, Kinuyo; Ohsaka, Ikue; Ohno, Haruka; Ohno, Yuichiro; Ozaki, Asako; Kakihara, Yoshiteru; Kobayashi, Hisashi; Sakuragi, Hiroshi; Shibata, Hiroshi; Shirono, Katsuhiro; Sekido, Haruko; Takasaka, Noriko; Takenaka, Yu; Tajima, Yoshiyasu; Tanaka, Aoi; Tanaka, Hideyuki; Tonooka, Hiroyuki; Nakanishi, Toru; Nomura, Chie; Haneishi, Nahoko; Hayakawa, Masato; Miura, Toshihiko; Yamaguchi, Miku; Watanabe, Kazunari; Sato, Kyoko

    2018-01-01

    An interlaboratory study was performed to evaluate the equivalence between an official method and a modified method of evaporation residue test using three food-simulating solvents (water, 4% acetic acid and 20% ethanol), based on the Japanese Food Sanitation Law for food contact products. Twenty-three laboratories participated, and tested the evaporation residues of nine test solutions as blind duplicates. For evaporation, a water bath was used in the official method, and a hot plate in the modified method. In most laboratories, the test solutions were heated until just prior to evaporation to dryness, and then allowed to dry under residual heat. Statistical analysis revealed that there was no significant difference between the two methods, regardless of the heating equipment used. Accordingly, the modified method provides performance equal to the official method, and is available as an alternative method.

  18. Exact test-based approach for equivalence test with parameter margin.

    PubMed

    Cassie Dong, Xiaoyu; Bian, Yuanyuan; Tsong, Yi; Wang, Tianhua

    2017-01-01

    The equivalence test has a wide range of applications in pharmaceutical statistics in which we need to test for the similarity between two groups. In recent years, the equivalence test has been used in assessing the analytical similarity between a proposed biosimilar product and a reference product. More specifically, the mean values of the two products for a given quality attribute are compared against an equivalence margin of the form ±f × σ_R, a function of the reference variability σ_R. In practice, this margin is unknown and is estimated from the sample as ±f × S_R. If we use this estimated margin with the classic t-test statistic for the equivalence test on the means, both the Type I and Type II error rates may be inflated. To resolve this issue, we develop an exact-based test method and compare it with other proposed methods, such as the Wald test, the constrained Wald test, and the Generalized Pivotal Quantity (GPQ), in terms of Type I error rate and power. Application of these methods to data analysis is also provided in this paper. This work focuses on the development and discussion of the general statistical methodology and is not limited to the application of analytical similarity.
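
    To make the problem concrete, here is a sketch of the naive plug-in test whose error-rate inflation motivates the exact-based method: a TOST-style t-test that treats the estimated margin ±f × S_R as if it were known. The data, the value of f, and the pooled degrees of freedom are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

def naive_equivalence_test(test, ref, f=1.5, alpha=0.05):
    """TOST-style equivalence test with a plug-in margin f * S_R.

    This is the naive approach that treats the estimated margin as known,
    which is exactly what inflates the Type I/II error rates.
    """
    margin = f * ref.std(ddof=1)
    diff = test.mean() - ref.mean()
    se = np.sqrt(test.var(ddof=1) / len(test) + ref.var(ddof=1) / len(ref))
    df = len(test) + len(ref) - 2          # rough df choice, an assumption
    t_lower = (diff + margin) / se         # H0: diff <= -margin
    t_upper = (diff - margin) / se         # H0: diff >= +margin
    p = max(1 - stats.t.cdf(t_lower, df), stats.t.cdf(t_upper, df))
    return p < alpha                       # True -> equivalence concluded

test_lots = rng.normal(100.0, 3.0, size=10)
ref_lots  = rng.normal(100.5, 3.0, size=10)
print("equivalence concluded:", naive_equivalence_test(test_lots, ref_lots))
```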

  19. Measuring and Specifying Combinatorial Coverage of Test Input Configurations

    PubMed Central

    Kuhn, D. Richard; Kacker, Raghu N.; Lei, Yu

    2015-01-01

    A key issue in testing is how many tests are needed for a required level of coverage or fault detection. Estimates are often based on error rates in initial testing, or on code coverage. For example, tests may be run until a desired level of statement or branch coverage is achieved. Combinatorial methods present an opportunity for a different approach to estimating required test set size, using characteristics of the test set. This paper describes methods for estimating the coverage of, and ability to detect, t-way interaction faults of a test set based on a covering array. We also develop a connection between (static) combinatorial coverage and (dynamic) code coverage, such that if a specific condition is satisfied, 100% branch coverage is assured. Using these results, we propose practical recommendations for using combinatorial coverage in specifying test requirements. PMID:28133442
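
    Measuring the (static) combinatorial coverage of a given test set reduces to counting which t-way value combinations appear in it. A self-contained sketch, with a hypothetical four-factor binary system:

```python
from itertools import combinations, product

# Hypothetical system with four binary configuration factors.
factors = [(0, 1)] * 4
tests = [            # a small test set (rows = input configurations)
    (0, 0, 0, 0),
    (1, 1, 0, 0),
    (0, 1, 1, 1),
    (1, 0, 1, 0),
]

def t_way_coverage(tests, factors, t):
    """Fraction of all t-way value combinations covered by the test set."""
    covered = total = 0
    for cols in combinations(range(len(factors)), t):
        seen = {tuple(row[c] for c in cols) for row in tests}
        for combo in product(*(factors[c] for c in cols)):
            total += 1
            covered += combo in seen
    return covered / total

for t in (2, 3):
    print(f"{t}-way coverage: {t_way_coverage(tests, factors, t):.2%}")
```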

  20. Testing for genetic association taking into account phenotypic information of relatives.

    PubMed

    Uh, Hae-Won; Wijk, Henk Jan van der; Houwing-Duistermaat, Jeanine J

    2009-12-15

    We investigated efficient case-control association analysis using family data. The outcome of interest was coronary heart disease. We employed existing and new methods that take into account the correlations among related individuals to obtain the proper type I error rates. The methods considered for autosomal single-nucleotide polymorphisms were: 1) generalized estimating equations-based methods, 2) variance-modified Cochran-Armitage (MCA) trend test incorporating kinship coefficients, and 3) genotypic modified quasi-likelihood score test. Additionally, for X-linked single-nucleotide polymorphisms we proposed a two-degrees-of-freedom test. Performance of these methods was tested using Framingham Heart Study 500 k array data.

  1. Implementation and performance test of cloud platform based on Hadoop

    NASA Astrophysics Data System (ADS)

    Xu, Jingxian; Guo, Jianhong; Ren, Chunlan

    2018-01-01

    Hadoop, an open source project of the Apache foundation, is a distributed computing framework for processing large amounts of data that has been widely used in the Internet industry, which makes the construction and evaluation of Hadoop platforms a meaningful subject of study. This paper presents a method for building a Hadoop platform together with a method for testing its performance. Experimental results show that the proposed performance test method is effective and can characterize the performance of a Hadoop platform.

  2. Measurement uncertainty of the EU methods for microbiological examination of red meat.

    PubMed

    Corry, Janet E L; Hedges, Alan J; Jarvis, Basil

    2007-09-01

    Three parallel trials of EU methods proposed for the microbiological examination of red meat were conducted, using two analysts in each of seven laboratories within the UK. The methods involved determination of the aerobic colony count (ACC) and Enterobacteriaceae colony count (ECC) using simulated methods and a freeze-dried standardised culture preparation. Trial A was based on a simulated swab test, Trial B on a simulated meat excision test, and Trial C was a reference test on the reconstituted inoculum. Statistical analysis (ANOVA) was carried out before and after rejection of outlying data. Expanded uncertainty values (relative standard deviation × 2) for repeatability and reproducibility, based on the log10 cfu/ml, ranged from +/-2.1% to +/-2.7% and from +/-5.5% to +/-10.5% for the ACC, respectively, depending upon the test procedure. Similarly for the ECC, expanded uncertainty estimates for repeatability and reproducibility ranged from +/-4.6% to +/-16.9% and from +/-21.6% to +/-23.5%, respectively. The results are discussed in relation to the potential application of the methods.
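
    A quick sketch of how expanded uncertainty on the log10 cfu/ml scale can be derived from such a trial layout, using a one-way random-effects decomposition into repeatability and reproducibility components; the counts are simulated stand-ins, not the trial data.

```python
import numpy as np

rng = np.random.default_rng(5)
# Hypothetical log10 cfu/ml results: 7 laboratories x 4 replicates
# (2 analysts x 2 replicates each); values are simulated stand-ins.
lab_effect = rng.normal(0.0, 0.12, size=7)
counts = 6.0 + lab_effect[:, None] + rng.normal(0, 0.06, size=(7, 4))

n = counts.shape[1]
grand = counts.mean()
s_r2 = counts.var(ddof=1, axis=1).mean()                     # within-lab variance
s_L2 = max(counts.mean(axis=1).var(ddof=1) - s_r2 / n, 0.0)  # between-lab variance
s_R2 = s_L2 + s_r2                                           # reproducibility variance

# Expanded uncertainty quoted as relative standard deviation x 2 (coverage k = 2).
print(f"repeatability:   +/-{200 * np.sqrt(s_r2) / grand:.1f}%")
print(f"reproducibility: +/-{200 * np.sqrt(s_R2) / grand:.1f}%")
```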

  3. Impact of SMS/GPRS Printers in Reducing Time to Early Infant Diagnosis Compared With Routine Result Reporting: A Systematic Review and Meta-Analysis

    PubMed Central

    Markby, Jessica; Boeke, Caroline; Penazzato, Martina; Urick, Brittany; Ghadrshenas, Anisa; Harris, Lindsay; Ford, Nathan; Peter, Trevor

    2017-01-01

    Background: Despite significant gains made toward improving access, early infant diagnosis (EID) testing programs suffer from long test turnaround times that result in substantial loss to follow-up and mortality associated with delays in antiretroviral therapy initiation. These delays in treatment initiation are particularly impactful because of significant HIV-related infant mortality observed by 2–3 months of age. Short message service (SMS) and general packet radio service (GPRS) printers allow test results to be transmitted immediately to health care facilities on completion of testing in the laboratory. Methods: We conducted a systematic review and meta-analysis to assess the benefit of using SMS/GPRS printers to increase the efficiency of EID test result delivery compared with traditional courier paper–based results delivery methods. Results: We identified 11 studies contributing data for over 16,000 patients from East and Southern Africa. The test turnaround time from specimen collection to result received at the health care facility with courier paper–based methods was 68.0 days (n = 6835), whereas the test turnaround time with SMS/GPRS printers was 51.1 days (n = 6711), resulting in a 2.5-week (25%) reduction in the turnaround time. Conclusions: Courier paper–based EID test result delivery methods are estimated to add 2.5 weeks to EID test turnaround times in low resource settings and increase the risk that infants receive test results during or after the early peak of infant mortality. SMS/GPRS result delivery to health care facility printers significantly reduced test turnaround time and may reduce this risk. SMS/GPRS printers should be considered for expedited delivery of EID and other centralized laboratory test results. PMID:28825941

  4. Improving the detection of pathways in genome-wide association studies by combined effects of SNPs from Linkage Disequilibrium blocks.

    PubMed

    Zhao, Huiying; Nyholt, Dale R; Yang, Yuanhao; Wang, Jihua; Yang, Yuedong

    2017-06-14

    Genome-wide association studies (GWAS) have successfully identified single variants associated with diseases. To increase the power of GWAS, gene-based and pathway-based tests are commonly employed to detect more risk factors. However, gene- and pathway-based association tests may be biased towards genes or pathways containing a large number of single-nucleotide polymorphisms (SNPs) with small P-values caused by high linkage disequilibrium (LD) correlations. To address such bias, numerous pathway-based methods have been developed. Here we propose a novel method, DGAT-path, which divides all SNPs assigned to genes in each pathway into LD blocks and sums the chi-square statistics of the LD blocks to assess the significance of the pathway by permutation tests. The method proved robust, with a type I error rate more than 1.6 times lower than that of other methods. Meanwhile, the method displays higher power and is not biased by pathway size. Applications to GWAS summary statistics for schizophrenia and breast cancer indicate that the detected top pathways contain more genes close to associated SNPs than other methods. As a result, the method identified 17 and 12 significant pathways containing 20 and 21 novel associated genes, respectively, for the two diseases. The method is available online at http://sparks-lab.org/server/DGAT-path .
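
    A schematic of the block-then-sum idea (not the published DGAT-path code): per-SNP chi-square statistics are collapsed to one statistic per LD block before summing, so that large LD blocks cannot dominate the pathway score. The block statistic (block maximum) and the permutation scheme used here are simplifying assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical per-SNP 1-df chi-square statistics for one pathway,
# with SNPs grouped into LD blocks (block labels are assumptions).
chisq  = rng.chisquare(df=1, size=40)
blocks = np.repeat(np.arange(8), 5)   # 8 LD blocks of 5 correlated SNPs

def pathway_stat(chisq, blocks):
    # One statistic per LD block (block maximum is an illustrative choice),
    # summed over blocks so big LD blocks cannot dominate the pathway score.
    return sum(chisq[blocks == b].max() for b in np.unique(blocks))

obs = pathway_stat(chisq, blocks)
# Schematic permutation null: reassign statistics to blocks at random
# (the real method permutes at the genotype/phenotype level).
null = np.array([pathway_stat(rng.permutation(chisq), blocks)
                 for _ in range(9999)])
p = (1 + np.sum(null >= obs)) / (1 + len(null))
print(f"pathway statistic = {obs:.1f}, permutation p = {p:.4f}")
```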

  5. A powerful microbiome-based association test and a microbial taxa discovery framework for comprehensive association mapping.

    PubMed

    Koh, Hyunwook; Blaser, Martin J; Li, Huilin

    2017-04-24

    The role of the microbiota in human health and disease has been increasingly studied, gathering momentum through the use of high-throughput technologies. Further identification of the roles of specific microbes is necessary to better understand the mechanisms involved in diseases related to microbiome perturbations. Here, we introduce a new microbiome-based group association testing method, the optimal microbiome-based association test (OMiAT). OMiAT is a data-driven method that selects an optimal test from among the sum of powered score tests (SPU) and the microbiome regression-based kernel association test (MiRKAT). We illustrate that OMiAT efficiently discovers significant association signals arising from varying microbial abundances and different relative contributions from microbial abundance and phylogenetic information. We also propose a way to apply it to fine-mapping of diverse upper-level taxa at different taxonomic ranks (e.g., phylum, class, order, family, and genus), as well as the entire microbial community, within a newly introduced microbial taxa discovery framework, microbiome comprehensive association mapping (MiCAM). Our extensive simulations demonstrate that OMiAT is highly robust and powerful compared with other existing methods, while correctly controlling type I error rates. Our real data analyses also confirm that MiCAM is especially efficient for the assessment of upper-level taxa by integrating OMiAT as a group analytic method. OMiAT is attractive in practice due to the high complexity of microbiome data and the unknown true nature of the underlying state. MiCAM also provides a hierarchical association map for numerous microbial taxa and can be used as a guideline for further investigation of the roles of discovered taxa in human health and disease.

  6. Use of HPLC/UPLC-spectrophotometry for detection of formazan in in vitro Reconstructed human Tissue (RhT)-based test methods employing the MTT-reduction assay to expand their applicability to strongly coloured test chemicals.

    PubMed

    Alépée, N; Barroso, J; De Smedt, A; De Wever, B; Hibatallah, J; Klaric, M; Mewes, K R; Millet, M; Pfannenbecker, U; Tailhardat, M; Templier, M; McNamee, P

    2015-06-01

    A number of in vitro test methods using Reconstructed human Tissues (RhT) are regulatory accepted for evaluation of skin corrosion/irritation. In such methods, test chemical corrosion/irritation potential is determined by measuring tissue viability using the photometric MTT-reduction assay. A known limitation of this assay is possible interference of strongly coloured test chemicals with the measurement of formazan by absorbance (OD). To address this, Cosmetics Europe evaluated the use of HPLC/UPLC-spectrophotometry as an alternative formazan measurement system. Using the approach recommended by the FDA guidance for validation of bio-analytical methods, three independent laboratories established and qualified their HPLC/UPLC-spectrophotometry systems to reproducibly measure formazan from tissue extracts. Up to 26 chemicals were then tested in RhT test systems for eye/skin irritation and skin corrosion. The results support that: (1) HPLC/UPLC-spectrophotometry formazan measurement is highly reproducible; (2) formazan measurement by HPLC/UPLC-spectrophotometry and OD gave almost identical tissue viabilities for test chemicals exhibiting neither colour interference nor direct MTT reduction; and (3) independent of the test system used, HPLC/UPLC-spectrophotometry can measure formazan for strongly coloured test chemicals when this is not possible by absorbance alone. It is therefore recommended that HPLC/UPLC-spectrophotometry to measure formazan be included in the procedures of in vitro RhT-based test methods, irrespective of the test system used and the toxicity endpoint evaluated, to extend the applicability of these test methods to strongly coloured chemicals. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. Can currently available non-animal methods detect pre and pro-haptens relevant for skin sensitization?

    PubMed

    Patlewicz, Grace; Casati, Silvia; Basketter, David A; Asturiol, David; Roberts, David W; Lepoittevin, Jean-Pierre; Worth, Andrew P; Aschberger, Karin

    2016-12-01

    Predictive testing to characterize substances for their skin sensitization potential has historically been based on animal tests such as the Local Lymph Node Assay (LLNA). In recent years, regulations in the cosmetics and chemicals sectors have provided strong impetus to develop non-animal alternatives. Three test methods have undergone OECD validation: the direct peptide reactivity assay (DPRA), the KeratinoSens™ and the human Cell Line Activation Test (h-CLAT). Whilst these methods perform relatively well in predicting LLNA results, a concern raised is their ability to predict chemicals that need activation to be sensitizing (pre- or pro-haptens). This current study reviewed an EURL ECVAM dataset of 127 substances for which information was available in the LLNA and three non-animal test methods. Twenty eight of the sensitizers needed to be activated, with the majority being pre-haptens. These were correctly identified by 1 or more of the test methods. Six substances were categorized exclusively as pro-haptens, but were correctly identified by at least one of the cell-based assays. The analysis here showed that skin metabolism was not likely to be a major consideration for assessing sensitization potential and that sensitizers requiring activation could be identified correctly using one or more of the current non-animal methods. Published by Elsevier Inc.

  8. Normalized Rotational Multiple Yield Surface Framework (NRMYSF) stress-strain curve prediction method based on small strain triaxial test data on undisturbed Auckland residual clay soils

    NASA Astrophysics Data System (ADS)

    Noor, M. J. Md; Ibrahim, A.; Rahman, A. S. A.

    2018-04-01

    Small strain triaxial test measurement is considered significantly more accurate than external strain measurement using the conventional method, owing to the systematic errors normally associated with the test. Three submersible miniature linear variable differential transducers (LVDTs) were mounted on yokes clamped directly onto the soil sample, spaced equally at 120° intervals. The device setup, using a 0.4 N resolution load cell and a 16-bit AD converter, was capable of consistently resolving displacements of less than 1 µm and measuring axial strains ranging from less than 0.001% to 2.5%. Further analysis of the small strain local measurement data was performed using the new Normalized Rotational Multiple Yield Surface Framework (NRMYSF) method and compared with the existing Rotational Multiple Yield Surface Framework (RMYSF) prediction method. The prediction of shear strength based on the combined intrinsic curvilinear shear strength envelope using small strain triaxial test data confirmed the significant improvement and reliability of the measurement and analysis methods. Moreover, the NRMYSF method shows excellent data prediction and a significant improvement toward more reliable prediction of soil strength, which can reduce the cost and time of experimental laboratory testing.

  9. An entropy-based nonparametric test for the validation of surrogate endpoints.

    PubMed

    Miao, Xiaopeng; Wang, Yong-Cheng; Gangopadhyay, Ashis

    2012-06-30

    We present a nonparametric test to validate surrogate endpoints based on a measure of divergence and random permutation. This test is a proposal to directly verify the Prentice statistical definition of surrogacy. The test does not impose distributional assumptions on the endpoints, and it is robust to model misspecification. Our simulation study shows that the proposed nonparametric test outperforms the practical test of the Prentice criterion in terms of both robustness of size and power. We also evaluate the performance of three leading methods that attempt to quantify the effect of surrogate endpoints. The proposed method is applied to validate magnetic resonance imaging lesions as the surrogate endpoint for clinical relapses in a multiple sclerosis trial. Copyright © 2012 John Wiley & Sons, Ltd.

  10. A new scenario-based approach to damage detection using operational modal parameter estimates

    NASA Astrophysics Data System (ADS)

    Hansen, J. B.; Brincker, R.; López-Aenlle, M.; Overgaard, C. F.; Kloborg, K.

    2017-09-01

    In this paper a vibration-based damage localization and quantification method, based on natural frequencies and mode shapes, is presented. The proposed technique is inspired by a damage assessment methodology based solely on the sensitivity of mass-normalized, experimentally determined mode shapes. The present method differs by being based on modal data extracted by means of Operational Modal Analysis (OMA), combined with a reasonable Finite Element (FE) representation of the test structure, and implemented in a scenario-based framework. Besides reviewing the basic methodology, this paper addresses fundamental theoretical as well as practical considerations that are crucial to the applicability of a given vibration-based damage assessment configuration. Lastly, the technique is demonstrated on an experimental test case using automated OMA. Both the numerical study and the experimental test case presented in this paper are restricted to perturbations concerning mass change.

  11. Performance-based quality assurance/quality control (QA/QC) acceptance procedures for in-place soil testing phase 3.

    DOT National Transportation Integrated Search

    2015-01-01

    One of the objectives of this study was to evaluate soil testing equipment based on its capability of measuring in-place stiffness or modulus values. : As design criteria transition from empirical to mechanistic-empirical, soil test methods and equip...

  12. Estimating Measures of Pass-Fail Reliability from Parallel Half-Tests.

    ERIC Educational Resources Information Center

    Woodruff, David J.; Sawyer, Richard L.

    Two methods for estimating measures of pass-fail reliability are derived, by which both theta and kappa may be estimated from a single test administration. The methods are computationally simple and are based on the Spearman-Brown formula for estimating stepped-up reliability. The non-distributional…
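
    The Spearman-Brown step-up itself is a one-liner; given the correlation r between parallel half-tests, the reliability of the full-length test follows directly:

```python
def spearman_brown(r_half: float, k: float = 2.0) -> float:
    """Stepped-up reliability of a test lengthened by factor k,
    from the correlation between parallel half-tests."""
    return k * r_half / (1 + (k - 1) * r_half)

# A correlation of 0.60 between parallel half-tests implies a
# full-test reliability of 0.75 (illustrative value).
print(spearman_brown(0.60))  # -> 0.75
```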

  13. A Probability Based Framework for Testing the Missing Data Mechanism

    ERIC Educational Resources Information Center

    Lin, Johnny Cheng-Han

    2013-01-01

    Many methods exist for imputing missing data but fewer methods have been proposed to test the missing data mechanism. Little (1988) introduced a multivariate chi-square test for the missing completely at random data mechanism (MCAR) that compares observed means for each pattern with expectation-maximization (EM) estimated means. As an alternative,…

  14. Reasoning Maps: A Generally Applicable Method for Characterizing Hypothesis-Testing Behaviour. Research Report

    ERIC Educational Resources Information Center

    White, Brian

    2004-01-01

    This paper presents a generally applicable method for characterizing subjects' hypothesis-testing behaviour based on a synthesis that extends on previous work. Beginning with a transcript of subjects' speech and videotape of their actions, a Reasoning Map is created that depicts the flow of their hypotheses, tests, predictions, results, and…

  15. Possible strategies for EDC testing in the future: exploring roles of pathway-based in silico, in vitro and in vivo methods

    EPA Science Inventory

    Current methods for screening, testing and monitoring endocrine-disrupting chemicals (EDCs) rely relatively substantially upon moderate- to long-term assays that can, in some instances, require significant numbers of animals. Recent developments in the areas of in vitro testing...

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Passarge, M; Fix, M K; Manser, P

    Purpose: To create and test an accurate EPID-frame-based VMAT QA metric to detect gross dose errors in real-time and to provide information about the source of error. Methods: A Swiss cheese model was created for an EPID-based real-time QA process. The system compares a treatment-plan-based reference set of EPID images with images acquired over each 2° gantry angle interval. The metric utilizes a sequence of independent, consecutively executed error detection methods: a masking technique that verifies in-field radiation delivery and ensures no out-of-field radiation; output normalization checks at two different stages; global image alignment to quantify rotation, scaling and translation; standard gamma evaluation (3%, 3 mm); and pixel intensity deviation checks including and excluding high dose gradient regions. Tolerances for each test were determined. For algorithm testing, twelve different types of errors were selected to modify the original plan. Corresponding predictions for each test case were generated, which included measurement-based noise. Each test case was run multiple times (with different noise per run) to assess the ability to detect introduced errors. Results: Averaged over five test runs, 99.1% of all plan variations that resulted in patient dose errors were detected within 2° and 100% within 4° (∼1% of patient dose delivery). Including cases that led to slightly modified but clinically equivalent plans, 91.5% were detected by the system within 2°. Based on the type of method that detected the error, the error source could be determined. Conclusion: An EPID-based during-treatment error detection system for VMAT deliveries was successfully designed and tested. The system utilizes a sequence of methods to identify and prevent gross treatment delivery errors. The system was inspected for robustness with realistic noise variations, demonstrating that it has the potential to detect a large majority of errors in real-time and indicate the error source. J. V. Siebers receives funding support from Varian Medical Systems.

  17. Evaluation of dysphagia in early stroke patients by bedside, endoscopic, and electrophysiological methods.

    PubMed

    Umay, Ebru Karaca; Unlu, Ece; Saylam, Guleser Kılıc; Cakci, Aytul; Korkmaz, Hakan

    2013-09-01

    We aimed in this study to evaluate dysphagia in early stroke patients using a bedside screening test and flexible fiberoptic endoscopic evaluation of swallowing (FFEES) and electrophysiological evaluation (EE) methods and to compare the effectiveness of these methods. Twenty-four patients who were hospitalized in our clinic within the first 3 months after stroke were included in this study. Patients were evaluated using a bedside screening test [including bedside dysphagia score (BDS), neurological examination dysphagia score (NEDS), and total dysphagia score (TDS)] and FFEES and EE methods. Patients were divided into normal-swallowing and dysphagia groups according to the results of the evaluation methods. Patients with dysphagia as determined by any of these methods were compared to the patients with normal swallowing based on the results of the other two methods. Based on the results of our study, a high BDS was positively correlated with dysphagia identified by FFEES and EE methods. Moreover, the FFEES and EE methods were positively correlated. There was no significant correlation between NEDS and TDS levels and either EE or FFEES method. Bedside screening tests should be used mainly as an initial screening test; then FFEES and EE methods should be combined in patients who show risks. This diagnostic algorithm may provide a practical and fast solution for selected stroke patients.

  18. Nondestructive equipment study

    NASA Technical Reports Server (NTRS)

    1985-01-01

    Identification of existing nondestructive evaluation (NDE) methods that could be used in a low Earth orbit environment; evaluation of each method with respect to the set of criteria called out in the statement of work; selection of the most promising NDE methods for further evaluation; use of selected NDE methods to test samples of pressure vessel materials in a vacuum; pressure testing of a complex monolithic pressure vessel with known flaws using acoustic emissions in a vacuum; and recommendations for further studies based on analysis and testing are covered.

  19. Valid methods: the quality assurance of test method development, validation, approval, and transfer for veterinary testing laboratories.

    PubMed

    Wiegers, Ann L

    2003-07-01

    Third-party accreditation is a valuable tool to demonstrate a laboratory's competence to conduct testing. Accreditation, internationally and in the United States, has been discussed previously. However, accreditation is only one part of establishing data credibility. A validated test method is the first component of a valid measurement system. Validation is defined as confirmation, by examination and the provision of objective evidence, that the particular requirements for a specific intended use are fulfilled. The international and national standard ISO/IEC 17025 recognizes the importance of validated methods and requires that laboratory-developed methods or methods adopted by the laboratory be appropriate for the intended use. Validated methods are therefore required, and their use must be agreed to by the client (i.e., end users of the test results such as veterinarians, animal health programs, and owners). ISO/IEC 17025 also requires that the introduction of methods developed by the laboratory for its own use be a planned activity conducted by qualified personnel with adequate resources. This article discusses considerations and recommendations for the conduct of veterinary diagnostic test method development, validation, evaluation, approval, and transfer to the user laboratory in the ISO/IEC 17025 environment. These recommendations are based on those of nationally and internationally accepted standards and guidelines, as well as those of reputable and experienced technical bodies. They are also based on the author's experience in the evaluation of method development and transfer projects, validation data, and the implementation of quality management systems in the area of method development.

  20. The impact of internet and simulation-based training on transoesophageal echocardiography learning in anaesthetic trainees: a prospective randomised study.

    PubMed

    Sharma, V; Chamos, C; Valencia, O; Meineri, M; Fletcher, S N

    2013-06-01

    With the increasing role of transoesophageal echocardiography in clinical fields other than cardiac surgery, we decided to assess the efficacy of multi-modular echocardiography learning in echo-naïve anaesthetic trainees. Twenty-eight trainees undertook a pre-test to ascertain basic echocardiography knowledge, following which the study subjects were randomly assigned to two groups: learning via traditional methods such as review of guidelines and other literature (non-internet group); and learning via an internet-based echocardiography resource (internet group). After this, subjects in both groups underwent simulation-based echocardiography training. Further tests were conducted after the review of the respective educational resources and after the simulation sessions. Mean (SD) scores of subjects in the non-internet group were 28 (10)%, 44 (10)% and 63 (5)% in the pre-test, post-intervention test and post-simulation test, respectively, whereas those in the internet group scored 29 (8)%, 59 (10)% (p = 0.001) and 72 (8)% (p = 0.005), respectively. The use of internet- and simulation-based learning methods led to a significant improvement in knowledge of transoesophageal echocardiography in anaesthetic trainees. The impact of simulation-based training was greater in the group that did not use the internet-based resource. We conclude that internet- and simulation-based learning methods both improve transoesophageal echocardiography knowledge in echo-naïve anaesthetic trainees. Anaesthesia © 2013 The Association of Anaesthetists of Great Britain and Ireland.

  1. A comparative review of methods for comparing means using partially paired data.

    PubMed

    Guo, Beibei; Yuan, Ying

    2017-06-01

    In medical experiments with the objective of testing the equality of two means, data are often partially paired by design or because of missing data. The partially paired data represent a combination of paired and unpaired observations. In this article, we review and compare nine methods for analyzing partially paired data, including the two-sample t-test, paired t-test, corrected z-test, weighted t-test, pooled t-test, optimal pooled t-test, multiple imputation method, mixed model approach, and the test based on a modified maximum likelihood estimate. We compare the performance of these methods through extensive simulation studies that cover a wide range of scenarios with different effect sizes, sample sizes, and correlations between the paired variables, as well as true underlying distributions. The simulation results suggest that when the sample size is moderate, the test based on the modified maximum likelihood estimator is generally superior to the other approaches when the data is normally distributed and the optimal pooled t-test performs the best when the data is not normally distributed, with well-controlled type I error rates and high statistical power; when the sample size is small, the optimal pooled t-test is to be recommended when both variables have missing data and the paired t-test is to be recommended when only one variable has missing data.
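
    As one concrete (and deliberately simple) instance of the weighted-combination idea reviewed here, the sketch below runs a paired t-test on the complete pairs, a two-sample t-test on the unpaired remainder, and pools the two via a weighted-Z combination; the weights and data are illustrative assumptions, not the optimal pooled t-test from the article.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)

# Hypothetical partially paired data: complete pairs plus extra unpaired
# observations in each arm (the missingness pattern is illustrative).
n_pairs = 15
x_pair = rng.normal(10.0, 2.0, n_pairs)
y_pair = x_pair + rng.normal(0.5, 1.0, n_pairs)   # correlated with x
x_only = rng.normal(10.0, 2.0, 8)                 # x observed, y missing
y_only = rng.normal(10.5, 2.0, 6)                 # y observed, x missing

# Paired t-test on the complete pairs, two-sample t-test on the remainder.
t1, p1 = stats.ttest_rel(x_pair, y_pair)
t2, p2 = stats.ttest_ind(x_only, y_only)

# Weighted-Z pooling of the two (approximately independent) tests; the
# square-root-of-n weights are one simple, arguable choice.
z1 = np.sign(t1) * stats.norm.isf(p1 / 2)
z2 = np.sign(t2) * stats.norm.isf(p2 / 2)
w1, w2 = np.sqrt(n_pairs), np.sqrt(min(len(x_only), len(y_only)))
z = (w1 * z1 + w2 * z2) / np.sqrt(w1**2 + w2**2)
p = 2 * stats.norm.sf(abs(z))
print(f"combined z = {z:.2f}, two-sided p = {p:.4f}")
```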

  2. Impact of an engineering design-based curriculum compared to an inquiry-based curriculum on fifth graders' content learning of simple machines

    NASA Astrophysics Data System (ADS)

    Marulcu, Ismail; Barnett, Michael

    2016-01-01

    Background: Elementary science education is struggling with multiple challenges. National and state test results confirm the need for deeper understanding in elementary science education. Moreover, national policy statements and researchers call for increased exposure to engineering and technology in elementary science education. The basic motivation of this study is to suggest a solution that both improves elementary science education and increases exposure to engineering and technology within it. Purpose/Hypothesis: This mixed-methods study examined the impact of an engineering design-based curriculum compared to an inquiry-based curriculum on fifth graders' content learning of simple machines. We hypothesized that the LEGO-engineering design unit would be as successful as the inquiry-based unit in terms of students' science content learning of simple machines. Design/Method: We used a mixed-methods approach to investigate our research questions; we compared the control and the experimental groups' scores from the tests and interviews by using analysis of covariance (ANCOVA) and compared each group's pre- and post-scores by using paired t-tests. Results: Our findings from the paired t-tests show that both the experimental and comparison groups significantly improved their scores from the pre-test to the post-test on the multiple-choice, open-ended, and interview items. Moreover, the ANCOVA results show that students in the experimental group, who learned simple machines with the design-based unit, performed significantly better on the interview questions. Conclusions: Our analyses revealed that the design-based Design a people mover: Simple machines unit was at least as successful as the inquiry-based FOSS Levers and pulleys unit in terms of students' science content learning.

  3. Biases and Power for Groups Comparison on Subjective Health Measurements

    PubMed Central

    Hamel, Jean-François; Hardouin, Jean-Benoit; Le Neel, Tanguy; Kubis, Gildas; Roquelaure, Yves; Sébille, Véronique

    2012-01-01

    Subjective health measurements are increasingly used in clinical research, particularly for patient groups comparisons. Two main types of analytical strategies can be used for such data: so-called classical test theory (CTT), relying on observed scores and models coming from Item Response Theory (IRT) relying on a response model relating the items responses to a latent parameter, often called latent trait. Whether IRT or CTT would be the most appropriate method to compare two independent groups of patients on a patient reported outcomes measurement remains unknown and was investigated using simulations. For CTT-based analyses, groups comparison was performed using t-test on the scores. For IRT-based analyses, several methods were compared, according to whether the Rasch model was considered with random effects or with fixed effects, and the group effect was included as a covariate or not. Individual latent traits values were estimated using either a deterministic method or by stochastic approaches. Latent traits were then compared with a t-test. Finally, a two-steps method was performed to compare the latent trait distributions, and a Wald test was performed to test the group effect in the Rasch model including group covariates. The only unbiased IRT-based method was the group covariate Wald’s test, performed on the random effects Rasch model. This model displayed the highest observed power, which was similar to the power using the score t-test. These results need to be extended to the case frequently encountered in practice where data are missing and possibly informative. PMID:23115620

  4. Performance of human fecal anaerobe-associated PCR-based assays in a multi-laboratory method evaluation study

    EPA Science Inventory

    A number of PCR-based methods for detecting human fecal material in environmental waters have been developed over the past decade, but these methods have rarely received independent comparative testing. Here, we evaluated ten of these methods (BacH, BacHum-UCD, B. thetaiotaomic...

  5. Intelligent Condition Diagnosis Method Based on Adaptive Statistic Test Filter and Diagnostic Bayesian Network

    PubMed Central

    Li, Ke; Zhang, Qiuju; Wang, Kun; Chen, Peng; Wang, Huaqing

    2016-01-01

    A new fault diagnosis method for rotating machinery based on an adaptive statistic test filter (ASTF) and a Diagnostic Bayesian Network (DBN) is presented in this paper. The ASTF is proposed to obtain weak fault features under background noise; it is based on statistical hypothesis testing in the frequency domain, evaluating the similarity between a reference (noise) signal and the original signal and removing the components of high similarity. The optimal level of significance α is obtained using particle swarm optimization (PSO). To evaluate the performance of the ASTF, an evaluation factor Ipq is also defined, and a simulation experiment is designed to verify the effectiveness and robustness of the ASTF. A sensitivity evaluation method using principal component analysis (PCA) is proposed to evaluate the sensitivity of symptom parameters (SPs) for condition diagnosis; in this way, SPs with high sensitivity for condition diagnosis can be selected. A three-layer DBN is developed to identify the condition of rotating machinery based on Bayesian Belief Network (BBN) theory. A condition diagnosis experiment on rolling element bearings demonstrates the effectiveness of the proposed method. PMID:26761006
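
    The frequency-domain filtering step can be caricatured as follows: compare the per-bin spectral power of the measured signal against a noise reference and retain only the bins that clearly exceed it. This is a simplified stand-in for the ASTF, which frames this comparison as a formal hypothesis test and tunes its significance level with PSO; the signal, threshold, and segmenting choices below are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)
fs = 1000
t = np.arange(2 * fs) / fs

# Synthetic vibration-like signal with a weak 120 Hz fault component,
# plus a separately recorded reference noise signal.
noise_ref = rng.normal(0.0, 1.0, t.size)
signal = 0.3 * np.sin(2 * np.pi * 120 * t) + rng.normal(0.0, 1.0, t.size)

def band_power(x, n_seg=8):
    """Per-bin spectral power averaged over segments (Welch-like)."""
    segs = np.array_split(x, n_seg)
    return np.mean([np.abs(np.fft.rfft(s)) ** 2 for s in segs], axis=0)

P_sig, P_ref = band_power(signal), band_power(noise_ref)

# Keep only bins whose power clearly exceeds the noise reference; the 95%
# quantile threshold is a stand-in for ASTF's PSO-tuned significance level.
ratio = P_sig / P_ref
keep = ratio > np.quantile(ratio, 0.95)
freqs = np.fft.rfftfreq(t.size // 8, d=1 / fs)
print("retained frequency bins (Hz):", np.round(freqs[keep], 1))
```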

  7. GPR-Based Water Leak Models in Water Distribution Systems

    PubMed Central

    Ayala-Cabrera, David; Herrera, Manuel; Izquierdo, Joaquín; Ocaña-Levario, Silvia J.; Pérez-García, Rafael

    2013-01-01

    This paper addresses the problem of leakage in water distribution systems through the use of ground penetrating radar (GPR) as a nondestructive method. Laboratory tests are performed to extract features of water leakage from the obtained GPR images. Moreover, a test in a real-world urban system under real conditions is performed. Feature extraction is performed by interpreting GPR images with the support of a pre-processing methodology based on an appropriate combination of statistical methods and multi-agent systems. The results of these tests are presented, interpreted, analyzed and discussed in this paper.

  8. Mechanical testing of hydrogels in cartilage tissue engineering: beyond the compressive modulus.

    PubMed

    Xiao, Yinghua; Friis, Elizabeth A; Gehrke, Stevin H; Detamore, Michael S

    2013-10-01

    Injuries to articular cartilage result in significant pain to patients and high medical costs. Unfortunately, cartilage repair strategies have been notoriously unreliable and/or complex. Biomaterial-based tissue-engineering strategies offer great promise, including the use of hydrogels to regenerate articular cartilage. Mechanical integrity is arguably the most important functional outcome of engineered cartilage, although mechanical testing of hydrogel-based constructs to date has focused primarily on deformation rather than failure properties. In addition to deformation testing, as the field of cartilage tissue engineering matures, this community will benefit from the addition of mechanical failure testing to outcome analyses, given the crucial clinical importance of the success of engineered constructs. However, there is a tremendous disparity in the methods used to evaluate mechanical failure of hydrogels and articular cartilage. In an effort to bridge the gap in mechanical testing methods of articular cartilage and hydrogels in cartilage regeneration, this review classifies the different toughness measurements for each. The urgency for identifying the common ground between these two disparate fields is high, as mechanical failure is ready to stand alongside stiffness as a functional design requirement. In comparing toughness measurement methods between hydrogels and cartilage, we recommend that the best option for evaluating mechanical failure of hydrogel-based constructs for cartilage tissue engineering may be tensile testing based on the single edge notch test, in part because specimen preparation is more straightforward and a related American Society for Testing and Materials (ASTM) standard can be adopted in a fracture mechanics context.

  9. A New Test Method of Circuit Breaker Spring Telescopic Characteristics Based Image Processing

    NASA Astrophysics Data System (ADS)

    Huang, Huimin; Wang, Feifeng; Lu, Yufeng; Xia, Xiaofei; Su, Yi

    2018-06-01

    This paper applies computer vision technology to the fatigue condition monitoring of springs and proposes a new telescopic characteristics test method for circuit breaker operating mechanism springs based on image processing technology. A high-speed camera captures spring movement image sequences as the high voltage circuit breaker operates. An image-matching method is then used to obtain the deformation-time and speed-time curves, from which the spring expansion and deformation parameters are extracted, laying a foundation for subsequent spring force analysis and matching state evaluation. Simulation tests performed at the experimental site show that this image analysis method avoids the complex installation of traditional mechanical sensors and supports online monitoring and status assessment of the circuit breaker spring.
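
    A sketch of the image-matching step, assuming OpenCV template matching of a spring marker across the high-speed frames; `frames`, the template ROI, and the pixel-to-millimetre calibration are assumptions outside this snippet.

```python
import cv2
import numpy as np

# Track a marker on the spring across high-speed frames to build the
# deformation-time curve. `frames` (a list of BGR images) and `template`
# (a grayscale marker crop) would come from the camera sequence, e.g.
# read with cv2.VideoCapture; both are assumed here.
def displacement_curve(frames, template, fps):
    positions = []
    for frame in frames:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        res = cv2.matchTemplate(gray, template, cv2.TM_CCOEFF_NORMED)
        _, _, _, max_loc = cv2.minMaxLoc(res)  # best-match top-left corner
        positions.append(max_loc[1])           # vertical pixel position
    y = np.asarray(positions, dtype=float)
    t = np.arange(len(y)) / fps
    deformation = y - y[0]                 # pixels; scale by calibration factor
    velocity = np.gradient(deformation, t) # speed-time curve
    return t, deformation, velocity
```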

  10. Correlation of full-scale drag predictions with flight measurements on the C-141A aircraft. Phase 2: Wind tunnel test, analysis, and prediction techniques. Volume 1: Drag predictions, wind tunnel data analysis and correlation

    NASA Technical Reports Server (NTRS)

    Macwilkinson, D. G.; Blackerby, W. T.; Paterson, J. H.

    1974-01-01

    The degree of cruise drag correlation on the C-141A aircraft is determined between predictions based on wind tunnel test data and flight test results. An analysis of wind tunnel tests on a 0.0275-scale model at Reynolds numbers up to 3.05 x 10(6) per mean aerodynamic chord (MAC) is reported. Model support interference corrections are evaluated through a series of tests, and fully corrected model data are analyzed to provide details on model component interference factors. It is shown that predicted minimum profile drag for the complete configuration agrees within 0.75% of flight test data, using a wind tunnel extrapolation method based on flat-plate skin friction and component shape factors. An alternative method of extrapolation, based on computed profile drag from a subsonic viscous theory, results in a prediction four percent lower than flight test data.

  11. Working with Sparse Data in Rated Language Tests: Generalizability Theory Applications

    ERIC Educational Resources Information Center

    Lin, Chih-Kai

    2017-01-01

    Sparse-rated data are common in operational performance-based language tests, as an inevitable result of assigning examinee responses to a fraction of available raters. The current study investigates the precision of two generalizability-theory methods (i.e., the rating method and the subdividing method) specifically designed to accommodate the…

  12. Leap-frog-based BPM (LF-BPM) method for solving nanophotonic structures

    NASA Astrophysics Data System (ADS)

    Ayoub, Ahmad B.; Swillam, Mohamed A.

    2018-02-01

    In this paper, we propose an efficient approach to solving the BPM equation. By splitting the complex field into its real and imaginary parts, the method is shown to be at least 30% faster than the conventional BPM. The method was tested on several optical components to assess its accuracy.
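
    As a rough illustration of the real/imaginary splitting, the sketch below leap-frogs the 1D paraxial equation dE/dz = (i/2k) d2E/dx2 in free space: with E = u + iv, du/dz = -(1/2k) d2v/dx2 and dv/dz = +(1/2k) d2u/dx2. It is a minimal stand-in under assumed grid parameters, not the authors' scheme or their benchmark.

        import numpy as np

        nx, nz = 512, 2000
        dx = 0.05e-6                   # transverse step (m)
        dz = 2e-9                      # below the leap-frog stability limit ~ k*dx^2/2
        k = 2 * np.pi / 1.55e-6        # wavenumber at 1550 nm

        x = (np.arange(nx) - nx // 2) * dx
        E0 = np.exp(-(x / 2e-6) ** 2)  # Gaussian launch field (assumed)
        u_prev, v_prev = E0.copy(), np.zeros(nx)

        def d2(f):                     # second derivative, central differences
            out = np.zeros_like(f)
            out[1:-1] = (f[2:] - 2.0 * f[1:-1] + f[:-2]) / dx**2
            return out

        u = u_prev - (dz / (2 * k)) * d2(v_prev)   # Euler bootstrap for step 1
        v = v_prev + (dz / (2 * k)) * d2(u_prev)
        for _ in range(nz):                        # leap-frog: centered 2*dz updates
            u_next = u_prev - (dz / k) * d2(v)
            v_next = v_prev + (dz / k) * d2(u)
            u_prev, v_prev, u, v = u, v, u_next, v_next

        intensity = u**2 + v**2
        print("power (should be ~conserved):", intensity.sum() * dx)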

  13. Fast and precise dense grid size measurement method based on coaxial dual optical imaging system

    NASA Astrophysics Data System (ADS)

    Guo, Jiping; Peng, Xiang; Yu, Jiping; Hao, Jian; Diao, Yan; Song, Tao; Li, Ameng; Lu, Xiaowei

    2015-10-01

    Test sieves with dense grid structures are widely used in many fields, and accurate grid size calibration is critical for the success of grading analysis and test sieving. Traditional calibration methods, however, suffer from low measurement efficiency and from sampling too few grids, which can lead to quality-judgment risk. Here, a fast and precise test sieve inspection method is presented. First, a coaxial imaging system with low- and high-magnification optical probes is designed to capture grid images of the test sieve. A scaling ratio between the low- and high-magnification probes is then obtained from corresponding grids in the captured images; with it, all grid dimensions in the low-magnification image can be obtained by accurately measuring a few corresponding grids in the high-magnification image. Finally, by scanning the stage of the measuring apparatus's tri-axis platform, the whole surface of the test sieve can be inspected quickly. Experimental results show that the proposed method measures test sieves more efficiently than traditional methods, measuring 0.15 million grids (grid size 0.1 mm) within only 60 seconds, and it can measure grid sizes from 20 μm to 5 mm precisely. In short, the presented method calibrates the grid size of test sieves automatically with high efficiency and accuracy, enabling statistically based surface evaluation and more reasonable quality judgment.
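
    The scaling-ratio step can be illustrated in a few lines: a handful of grids measured accurately under the high-magnification probe calibrate the pixel scale of the low-magnification image, whose pixel measurements then convert to micrometres in one step (all numbers below are invented).

        import numpy as np

        # The same grid openings measured in both images:
        high_mag_um = np.array([100.4, 99.7, 100.9])    # accurate sizes (um)
        low_mag_px  = np.array([25.1, 24.9, 25.2])      # same grids in pixels

        scale_um_per_px = np.mean(high_mag_um / low_mag_px)

        # Every grid in the low-magnification image, measured only in pixels:
        all_grids_px = np.array([25.0, 24.6, 25.3, 24.8, 25.5])
        all_grids_um = all_grids_px * scale_um_per_px   # calibrated sizes (um)
        print(all_grids_um.round(1))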

  14. The effect of instructional methodology on high school students' natural sciences standardized test scores

    NASA Astrophysics Data System (ADS)

    Powell, P. E.

    Educators have recently come to consider inquiry-based instruction a more effective method than didactic instruction. Experience-based learning theory suggests that student performance is linked to teaching method. However, research is limited on inquiry teaching and its effectiveness in preparing students to perform well on standardized tests. The purpose of the study was to investigate whether one of these two teaching methodologies was more effective in increasing student performance on standardized science tests. The quasi-experimental quantitative study comprised two stages. Stage 1 used a survey to identify the teaching methods of a convenience sample of 57 teacher participants and determined the level of inquiry used in instruction to place participants into instructional groups (the independent variable). Stage 2 used analysis of covariance (ANCOVA) to compare posttest scores on a standardized exam by teaching method. Additional analyses examined differences in science achievement by ethnicity, gender, and socioeconomic status for each teaching methodology. Results demonstrated a statistically significant gain in test scores when students were taught using inquiry-based instruction. Subpopulation analyses indicated that all groups showed improved mean standardized test scores except African American students. The findings benefit teachers and students by presenting data supporting a method of content delivery that increases teacher efficacy and produces students with a greater cognition of science content, meeting the school's mission and goals.

  15. LandScape: a simple method to aggregate p-values and other stochastic variables without a priori grouping.

    PubMed

    Wiuf, Carsten; Schaumburg-Müller Pallesen, Jonatan; Foldager, Leslie; Grove, Jakob

    2016-08-01

    In many areas of science it is customary to perform many, potentially millions of, tests simultaneously. To gain statistical power it is common to group tests based on a priori criteria such as predefined regions or sliding windows. However, it is not straightforward to choose grouping criteria, and the results might depend on the criteria chosen. Methods that summarize, or aggregate, test statistics or p-values without relying on a priori criteria are therefore desirable. We present a simple method to aggregate a sequence of stochastic variables, such as test statistics or p-values, into fewer variables without assuming a priori defined groups. We provide different ways to evaluate the significance of the aggregated variables based on theoretical considerations and resampling techniques, and show that under certain assumptions the FWER is controlled in the strong sense. Validity of the method was demonstrated using simulations and real data analyses. Our method may be a useful supplement to standard procedures relying on evaluation of test statistics individually. Moreover, by being agnostic and not relying on predefined regions, it might be a practical alternative to conventionally used methods of aggregating p-values over regions. The method is implemented in Python and freely available online (through GitHub; see the Supplementary information).
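
    As a loose illustration of group-free aggregation (explicitly not the LandScape algorithm itself), the sketch below scores the best contiguous run of small p-values, found data-adaptively via a Kadane-style scan, against a permutation null that breaks spatial clustering; the score offset is an arbitrary assumption.

        import numpy as np

        rng = np.random.default_rng(0)

        def max_segment_score(p, offset=1.0):
            """Best-scoring contiguous segment of s = -log10(p) - offset."""
            s = -np.log10(p) - offset
            best = run = 0.0
            for si in s:
                run = max(0.0, run + si)
                best = max(best, run)
            return best

        p = rng.uniform(size=1000)
        p[400:420] = rng.uniform(0, 0.01, size=20)   # planted cluster of signal

        obs = max_segment_score(p)
        # Null: same p-values in random order, i.e. a test of spatial clustering.
        null = np.array([max_segment_score(rng.permutation(p)) for _ in range(999)])
        p_perm = (1 + np.sum(null >= obs)) / (len(null) + 1)
        print(f"observed segment score {obs:.1f}, permutation p = {p_perm:.3f}")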

  16. Testability analysis on a hydraulic system in a certain equipment based on simulation model

    NASA Astrophysics Data System (ADS)

    Zhang, Rui; Cong, Hua; Liu, Yuanhong; Feng, Fuzhou

    2018-03-01

    To address the complicated structure of hydraulic systems and the shortage of fault statistics, a multi-valued testability analysis method based on a simulation model is proposed. Using an AMESim simulation model, the method injects simulated faults and records how test parameters such as pressure and flow rate at each test point deviate from their values under normal conditions, thereby establishing a multi-valued fault-test dependency matrix. The fault detection rate (FDR) and fault isolation rate (FIR) are then calculated from the dependency matrix. The system's testability and fault diagnosis capability are analyzed and evaluated, initially reaching only 54% (FDR) and 23% (FIR). To improve the testability of the system, the number and position of the test points are optimized. Results show that the proposed test-placement scheme can address the difficulty, inefficiency and high cost of system maintenance.
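
    The FDR/FIR computation from a dependency matrix can be sketched directly, using the common definitions (a fault is detected if at least one test responds, and isolable if its test signature is unique among detected faults). The paper's matrix is multi-valued; it is simplified to 0/1 here, and the example matrix is made up.

        import numpy as np

        # D[i, j] = 1 if fault i causes an abnormal reading at test point j
        D = np.array([
            [1, 0, 1, 0],
            [1, 0, 1, 0],   # same signature as fault 0 -> detected, not isolable
            [0, 1, 0, 0],
            [0, 0, 0, 0],   # silent fault -> not detected
        ])

        detected = D.any(axis=1)
        fdr = detected.mean()

        signatures = [tuple(row) for row in D]
        unique = [detected[i] and signatures.count(signatures[i]) == 1
                  for i in range(len(D))]
        fir = sum(unique) / detected.sum() if detected.any() else 0.0

        print(f"FDR = {fdr:.0%}, FIR = {fir:.0%}")   # FDR = 75%, FIR = 33%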

  17. Examining the Implementation of a Problem-Based Learning and Traditional Hybrid Model of Instruction in Remedial Mathematics Classes Designed for State Testing Preparation of Eleventh Grade Students

    ERIC Educational Resources Information Center

    Rodgers, Lindsay D.

    2011-01-01

    The following paper examined the effects of a new method of teaching for remedial mathematics, named the hybrid model of instruction. Due to increasing importance of high stakes testing, the study sought to determine if this method of instruction, that blends traditional teaching and problem-based learning, had different learning effects on…

  18. Testing survey-based methods for rapid monitoring of child mortality, with implications for summary birth history data.

    PubMed

    Brady, Eoghan; Hill, Kenneth

    2017-01-01

    Under-five mortality estimates are increasingly used in low- and middle-income countries to target interventions and measure performance against global development goals. Two new methods to rapidly estimate under-5 mortality based on Summary Birth Histories (SBH) were described in a previous paper and tested with the data available. This analysis tests the methods using data appropriate to each method from 5 countries that lack vital registration systems. SBH data are collected across many countries through censuses and surveys, and indirect methods often rely upon their quality to estimate mortality rates. The Birth History Imputation method imputes data from a recent Full Birth History (FBH) onto the birth, death and age distribution of the SBH to produce estimates based on the resulting distribution of child mortality. DHS FBHs and MICS SBHs are used for all five countries. In the implementation, 43 of 70 estimates are within 20% of validation estimates (61%). Mean Absolute Relative Error is 17.7%. 1 of 7 countries produces acceptable estimates. The Cohort Change method considers the differences in births and deaths between repeated Summary Birth Histories at 1- or 2-year intervals to estimate the mortality rate in that period. SBHs are taken from Brazil's PNAD Surveys 2004-2011 and validated against IGME estimates. 2 of 10 estimates are within 10% of validation estimates. Mean absolute relative error is greater than 100%. Appropriate testing of these new methods demonstrates that they do not produce sufficiently good estimates based on the data available. We conclude this is due to the poor quality of most SBH data included in the study. This has wider implications for the next round of censuses and future household surveys across many low- and middle-income countries.
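
    The two validation summaries used above (mean absolute relative error and the share of estimates within 20% of validation) reduce to a few lines; the numbers below are invented.

        import numpy as np

        est = np.array([48.0, 61.0, 90.0, 35.0])   # method's under-5 mortality estimates
        val = np.array([52.0, 70.0, 88.0, 30.0])   # validation (e.g., IGME) estimates

        rel_err = np.abs(est - val) / val
        mare = rel_err.mean()
        within_20 = (rel_err <= 0.20).mean()
        print(f"MARE = {mare:.1%}; {within_20:.0%} of estimates within 20%")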

  19. An Effective Electrical Resonance-Based Method to Detect Delamination in Thermal Barrier Coating

    NASA Astrophysics Data System (ADS)

    Kim, Jong Min; Park, Jae-Ha; Lee, Ho Girl; Kim, Hak-Joon; Song, Sung-Jin; Seok, Chang-Sung; Lee, Young-Ze

    2017-12-01

    This research proposes a simple yet highly sensitive method based on the electrical resonance of an eddy-current probe to detect delamination of a thermal barrier coating (TBC). Unlike conventional ultrasonic testing and infrared thermography, this method can directly measure the mechanical characteristics of the TBC. The method detects delamination of the TBC from the metallic bond coat through the shift in electrical impedance of the eddy current testing (ECT) probe coupled with the degraded TBC; owing to this shift, the resonant frequencies near the peak impedance of the ECT probe show high sensitivity to delamination. To verify the performance of the proposed method, a simple experiment is performed on TBC specimens degraded by thermal cyclic exposure. The delamination accompanying the growth of thermally grown oxide in the TBC system is experimentally identified, and the results are in good agreement with those obtained from ultrasonic C-scanning.

  20. An Effective Electrical Resonance-Based Method to Detect Delamination in Thermal Barrier Coating

    NASA Astrophysics Data System (ADS)

    Kim, Jong Min; Park, Jae-Ha; Lee, Ho Girl; Kim, Hak-Joon; Song, Sung-Jin; Seok, Chang-Sung; Lee, Young-Ze

    2018-02-01

    This research proposes a simple yet highly sensitive method based on the electrical resonance of an eddy-current probe to detect delamination of a thermal barrier coating (TBC). Unlike conventional ultrasonic testing and infrared thermography, this method can directly measure the mechanical characteristics of the TBC. The method detects delamination of the TBC from the metallic bond coat through the shift in electrical impedance of the eddy current testing (ECT) probe coupled with the degraded TBC; owing to this shift, the resonant frequencies near the peak impedance of the ECT probe show high sensitivity to delamination. To verify the performance of the proposed method, a simple experiment is performed on TBC specimens degraded by thermal cyclic exposure. The delamination accompanying the growth of thermally grown oxide in the TBC system is experimentally identified, and the results are in good agreement with those obtained from ultrasonic C-scanning.

  1. [Comprehensive weighted recognition method for hydrological abrupt change: With the runoff series of Jiajiu hydrological station in Lancang River as an example].

    PubMed

    Gu, Hai Ting; Xie, Ping; Sang, Yan Fang; Wu, Zi Yi

    2018-04-01

    Abrupt change is an important manifestation of hydrological processes undergoing dramatic variation in the context of global climate change, and its accurate recognition is of great significance for understanding changes in hydrological processes and for practical hydrological and water-resources work. Traditional methods are unreliable near both ends of a sample series, and different methods often give inconsistent results. To solve this problem, we proposed a comprehensive weighted recognition method for hydrological abrupt change that weights and combines 12 commonly used change-point tests. The reliability of the method was verified by Monte Carlo statistical tests. The results showed that the efficiency of the 12 methods was influenced by the coefficient of variation (Cv), the deviation coefficient (Cs) before the change point, the mean value difference coefficient, the Cv difference coefficient and the Cs difference coefficient, but had no significant relationship with the mean value of the sequence. Based on the performance of each method in the statistical tests, each test was assigned a weight: the sliding rank-sum test and the sliding run test received the highest weights, whereas the RS test received the lowest. In this way, the change point with the largest comprehensive weight can be selected as the final result when the different methods disagree. The method was used to analyze the maximum-runoff series (1-day, 3-day, 5-day, 7-day and 1-month) of Jiajiu station in the lower reaches of the Lancang River. The results showed that each series had an obvious jump in 2004, in agreement with the physical causes of hydrological change and with water-conservancy construction, verifying the rationality and reliability of the proposed method.
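
    A compressed illustration of the comprehensive-weighting idea: two stand-in tests (Pettitt and a sliding t-test) replace the paper's twelve, each scores every candidate point, and an assumed weight vector combines the normalized scores to pick the final change point.

        import numpy as np

        rng = np.random.default_rng(1)
        x = np.concatenate([rng.normal(100, 10, 40), rng.normal(130, 10, 30)])

        def pettitt_scores(x):
            """|U_t| with U_t = sum_{i<=t} sum_{j>t} sign(x_j - x_i)."""
            n = len(x)
            return np.abs(np.array([np.sign(x[t + 1:, None] - x[None, :t + 1]).sum()
                                    for t in range(n - 1)]))

        def sliding_t_scores(x, min_seg=5):
            """|t| statistic for every admissible split of the series."""
            n = len(x)
            t_abs = np.zeros(n - 1)
            for t in range(min_seg, n - min_seg):
                a, b = x[:t], x[t:]
                sp = np.sqrt(((len(a) - 1) * a.var(ddof=1)
                              + (len(b) - 1) * b.var(ddof=1)) / (n - 2))
                t_abs[t - 1] = abs(a.mean() - b.mean()) / (sp * np.sqrt(1/len(a) + 1/len(b)))
            return t_abs

        scores = [pettitt_scores(x), sliding_t_scores(x)]
        weights = [0.6, 0.4]                    # assumed; the paper derives these
        combined = sum(w * s / s.max() for w, s in zip(weights, scores))
        print("first index of the new regime:", combined.argmax() + 1)   # true: 40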

  2. Forecasting runout of rock and debris avalanches

    USGS Publications Warehouse

    Iverson, Richard M.; Evans, S.G.; Mugnozza, G.S.; Strom, A.; Hermanns, R.L.

    2006-01-01

    Physically based mathematical models and statistically based empirical equations each may provide useful means of forecasting runout of rock and debris avalanches. This paper compares the foundations, strengths, and limitations of a physically based model and a statistically based forecasting method, both of which were developed to predict runout across three-dimensional topography. The chief advantage of the physically based model results from its ties to physical conservation laws and well-tested axioms of soil and rock mechanics, such as the Coulomb friction rule and effective-stress principle. The output of this model provides detailed information about the dynamics of avalanche runout, at the expense of high demands for accurate input data, numerical computation, and experimental testing. In comparison, the statistical method requires relatively modest computation and no input data except identification of prospective avalanche source areas and a range of postulated avalanche volumes. Like the physically based model, the statistical method yields maps of predicted runout, but it provides no information on runout dynamics. Although the two methods differ significantly in their structure and objectives, insights gained from one method can aid refinement of the other.

  3. 16 CFR 1000.29 - Directorate for Engineering Sciences.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... standards, product safety tests and test methods, performance criteria, design specifications, and quality control standards for consumer products, based on engineering and scientific methods. It conducts... consumer interest groups. The Directorate conducts human factors studies and research of consumer product...

  4. 16 CFR 1000.29 - Directorate for Engineering Sciences.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... standards, product safety tests and test methods, performance criteria, design specifications, and quality control standards for consumer products, based on engineering and scientific methods. It conducts... consumer interest groups. The Directorate conducts human factors studies and research of consumer product...

  5. Portable apparatus with CRT display for nondestructive testing of concrete by the ultrasonic pulse method

    NASA Technical Reports Server (NTRS)

    Manta, G.; Gurau, Y.; Nica, P.; Facacaru, I.

    1974-01-01

    The development of methods for the nondestructive study of concrete structures is discussed. The nondestructive test procedure is based on the method of ultrasonic pulse transmission through the material. The measurements indicate that the elastic properties of concrete or other heterogeneous materials are a function of the rate of ultrasonic propagation. Diagrams of the test equipment are provided. Mathematical models are included to support the theoretical aspects.
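
    For reference, a standard elastic relation (not necessarily the exact formula implemented in the apparatus above) converts a measured pulse transit time into a dynamic modulus: for P-wave transmission through an extended medium, v^2 = E(1-nu) / (rho (1+nu)(1-2nu)). The material values below are assumptions.

        rho = 2400.0      # concrete density, kg/m^3 (assumed)
        nu = 0.2          # Poisson's ratio (assumed)
        path = 0.30       # transducer spacing, m
        transit = 68e-6   # measured pulse transit time, s

        v = path / transit                                   # ~4412 m/s
        E = rho * v**2 * (1 + nu) * (1 - 2*nu) / (1 - nu)    # dynamic modulus, Pa
        print(f"v = {v:.0f} m/s, E_dynamic = {E/1e9:.1f} GPa")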

  6. High-fidelity simulation versus case-based discussion for teaching medical students in Brazil about pediatric emergencies

    PubMed Central

    Couto, Thomaz Bittencourt; Farhat, Sylvia C.L.; Geis, Gary L; Olsen, Orjan; Schvartsman, Claudio

    2015-01-01

    OBJECTIVE: To compare high-fidelity simulation with case-based discussion for teaching medical students about pediatric emergencies, as assessed by a knowledge post-test, a knowledge retention test and a survey of satisfaction with the method. METHODS: This was a non-randomized controlled study using a crossover design for the methods, as well as multiple-choice questionnaire tests and a satisfaction survey. Final-year medical students were allocated into two groups: group 1 participated in an anaphylaxis simulation and a discussion of a supraventricular tachycardia case, and conversely, group 2 participated in a discussion of an anaphylaxis case and a supraventricular tachycardia simulation. Students were tested on each theme at the end of their rotation (post-test) and 4–6 months later (retention test). RESULTS: Most students (108, or 66.3%) completed all of the tests. The mean scores for simulation versus case-based discussion were respectively 43.6% versus 46.6% for the anaphylaxis pre-test (p=0.42), 63.5% versus 67.8% for the post-test (p=0.13) and 61.5% versus 65.5% for the retention test (p=0.19). Additionally, the mean scores were respectively 33.9% versus 31.6% for the supraventricular tachycardia pre-test (p=0.44), 42.5% versus 47.7% for the post-test (p=0.09) and 41.5% versus 39.5% for the retention test (p=0.47). For both themes, there was improvement between the pre-test and the post-test (p<0.05), and no significant difference was observed between the post-test and the retention test (p>0.05). Moreover, the satisfaction survey revealed a preference for simulation (p<0.001). CONCLUSION: As a single intervention, simulation is not significantly different from case-based discussion in terms of acquisition and retention of knowledge but is superior in terms of student satisfaction. PMID:26106956

  7. Evaluation of paper gradient concentration strips for antifungal combination testing of Candida spp.

    PubMed

    Siopi, Maria; Siafakas, Nikolaos; Zerva, Loukia; Meletiadis, Joseph

    2015-11-01

    In vitro combination testing with the broth microdilution chequerboard (CHEQ) method is widely used, although it is time-consuming, cumbersome and difficult to apply in the routine setting of a clinical microbiology laboratory. A new gradient concentration paper strip method, the Liofilchem(®) MIC test strips (MTS), provides an easy and fast alternative enabling the simultaneous diffusion of both drugs in combination. We therefore tested a polyene+azole and an azole+echinocandin combination against 18 Candida isolates with the CHEQ method based on EUCAST guidelines and with the MTS method in research and routine settings. Fractional inhibitory concentration (FIC) indices were calculated after 24 and 48 h of incubation based on complete and prominent (FIC-2) growth inhibition endpoints. Reproducibility and agreement within one twofold dilution were assessed. The FICs of the two methods were correlated quantitatively with t-tests and Pearson analysis and qualitatively with the Chi-squared test. The reproducibility of the CHEQ and MTS methods was 88-100% and their agreement was 80%, with 62-77% of MTS FICs being higher than the corresponding CHEQ FICs. A statistically significant Pearson correlation (r = 0.86, P = 0.0003) and association (χ(2) = 17.05, df = 4, P = 0.002) was found between MTS FIC and CHEQ FIC-2 after 24 h. Categorical agreement was 63% with no very major or major errors. All MTS synergistic interactions were also synergistic with the CHEQ method. © 2015 Blackwell Verlag GmbH.
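
    The FIC index underlying the chequerboard readout is simple to compute; the sketch below uses the commonly cited interpretation thresholds (synergy at FICI <= 0.5, antagonism at FICI > 4) and invented MIC values.

        def fic_index(mic_a_alone, mic_b_alone, mic_a_combo, mic_b_combo):
            """FIC index = FIC_A + FIC_B for one chequerboard well."""
            return mic_a_combo / mic_a_alone + mic_b_combo / mic_b_alone

        def interpret(fici):
            if fici <= 0.5:
                return "synergy"
            if fici > 4.0:
                return "antagonism"
            return "no interaction"

        # Example: drug A MIC 1 mg/L alone, 0.25 in combination;
        # drug B MIC 0.5 mg/L alone, 0.125 in combination.
        fici = fic_index(1.0, 0.5, 0.25, 0.125)
        print(f"FICI = {fici:.2f} -> {interpret(fici)}")   # FICI = 0.50 -> synergy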

  8. Laser-based standoff detection of explosives: a critical review.

    PubMed

    Wallin, Sara; Pettersson, Anna; Ostmark, Henric; Hobro, Alison

    2009-09-01

    A review of standoff detection technologies for explosives has been made. The review is focused on trace detection methods (methods aiming to detect traces from handling explosives or the vapours surrounding an explosive charge due to the vapour pressure of the explosive) rather than bulk detection methods (methods aiming to detect the bulk explosive charge). The requirements for standoff detection technologies are discussed. The technologies discussed are mostly laser-based trace detection technologies, such as laser-induced-breakdown spectroscopy, Raman spectroscopy, laser-induced-fluorescence spectroscopy and IR spectroscopy but the bulk detection technologies millimetre wave imaging and terahertz spectroscopy are also discussed as a complement to the laser-based methods. The review includes novel techniques, not yet tested in realistic environments, more mature technologies which have been tested outdoors in realistic environments as well as the most mature millimetre wave imaging technique.

  9. The presence-absence coliform test for monitoring drinking water quality.

    PubMed Central

    Rice, E W; Geldreich, E E; Read, E J

    1989-01-01

    The concern for improved monitoring of the sanitary quality of drinking water has prompted interest in alternative methods for the detection of total coliform bacteria. A simplified qualitative presence-absence test has been proposed as an alternative procedure for detecting coliform bacteria in potable water. In this paper, data from four comparative studies were analyzed to compare the recovery of total coliform bacteria from drinking water using the presence-absence test, the multiple fermentation tube procedure, and the membrane filter technique. The four studies were of water samples taken from four different geographic areas of the United States: Hawaii, New England (Vermont and New Hampshire), Oregon, and Pennsylvania. The results of these studies were compared based upon the number of positive samples detected by each method. Combined recoveries showed that the presence-absence test detected significantly more samples with coliforms than either the fermentation tube or membrane filter methods, P less than 0.01. The fermentation tube procedure detected significantly more positive samples than the membrane filter technique, P less than 0.01. Based upon the analysis of the combined database, it is clear that the presence-absence test is as sensitive as the current coliform methods for the examination of potable water. The presence-absence test offers a viable alternative to water utility companies that elect to use the frequency-of-occurrence approach for compliance monitoring. PMID:2493663

  10. Application of Hydrophilic Silanol-Based Chemical Grout for Strengthening Damaged Reinforced Concrete Flexural Members

    PubMed Central

    Ju, Hyunjin; Lee, Deuck Hang; Cho, Hae-Chang; Kim, Kang Su; Yoon, Seyoon; Seo, Soo-Yeon

    2014-01-01

    In this study, hydrophilic chemical grout using silanol (HCGS) was adopted to overcome the performance limitations of epoxy materials used for strengthening existing buildings and civil engineering structures. The enhanced material performances of HCGS were introduced, and applied to the section enlargement method, which is one of the typical structural strengthening methods used in practice. To evaluate the excellent structural strengthening performance of the HCGS, structural tests were conducted on reinforced concrete beams, and analyses on the flexural behaviors of test specimens were performed by modified partial interaction theory (PIT). In particular, to improve the constructability of the section enlargement method, an advanced strengthening method was proposed, in which the precast panel was directly attached to the bottom of the damaged structural member by HCGS, and the degree of connection of the test specimens, strengthened by the section enlargement method, were quantitatively evaluated by PIT-based analysis. PMID:28788708

  11. Application of Hydrophilic Silanol-Based Chemical Grout for Strengthening Damaged Reinforced Concrete Flexural Members.

    PubMed

    Ju, Hyunjin; Lee, Deuck Hang; Cho, Hae-Chang; Kim, Kang Su; Yoon, Seyoon; Seo, Soo-Yeon

    2014-06-23

    In this study, hydrophilic chemical grout using silanol (HCGS) was adopted to overcome the performance limitations of epoxy materials used for strengthening existing buildings and civil engineering structures. The enhanced material performances of HCGS were introduced, and applied to the section enlargement method, which is one of the typical structural strengthening methods used in practice. To evaluate the excellent structural strengthening performance of the HCGS, structural tests were conducted on reinforced concrete beams, and analyses on the flexural behaviors of test specimens were performed by modified partial interaction theory (PIT). In particular, to improve the constructability of the section enlargement method, an advanced strengthening method was proposed, in which the precast panel was directly attached to the bottom of the damaged structural member by HCGS, and the degree of connection of the test specimens, strengthened by the section enlargement method, were quantitatively evaluated by PIT-based analysis.

  12. Diagnosis of Dementia by Machine learning methods in Epidemiological studies: a pilot exploratory study from south India.

    PubMed

    Bhagyashree, Sheshadri Iyengar Raghavan; Nagaraj, Kiran; Prince, Martin; Fall, Caroline H D; Krishna, Murali

    2018-01-01

    There are limited data on the use of artificial intelligence methods for the diagnosis of dementia in epidemiological studies in low- and middle-income country (LMIC) settings. A culture- and education-fair battery of cognitive tests was developed and validated for population-based studies in low- and middle-income countries, including India, by the 10/66 Dementia Research Group. We explored machine learning methods based on the 10/66 battery of cognitive tests for the diagnosis of dementia in a birth cohort study in South India. The data sets for 466 men and women in this study were obtained from the ongoing Mysore Studies of Natal effect of Health and Ageing (MYNAH), in south India. The data sets included demographics, performance on the 10/66 cognitive function tests, the 10/66 diagnosis of mental disorders, and population-based normative data for the 10/66 battery of cognitive function tests. Diagnosis of dementia from the rule-based approach was compared against the 10/66 diagnosis of dementia. We applied machine learning techniques to identify the minimal number of the 10/66 cognitive function tests required for diagnosing dementia and derived an algorithm to improve the accuracy of dementia diagnosis. Of 466 subjects, 27 had a 10/66 diagnosis of dementia, 19 of whom were correctly identified as having dementia by Jrip classification with 100% accuracy. This pilot exploratory study indicates that machine learning methods can help identify community-dwelling older adults with a 10/66 criterion diagnosis of dementia with good accuracy in an LMIC setting such as India. This should reduce the duration of the diagnostic assessment and make the process easier and quicker for clinicians and patients, and will be useful for 'case' ascertainment in population-based epidemiological studies.

  13. Illustrating, Quantifying, and Correcting for Bias in Post-hoc Analysis of Gene-Based Rare Variant Tests of Association

    PubMed Central

    Grinde, Kelsey E.; Arbet, Jaron; Green, Alden; O'Connell, Michael; Valcarcel, Alessandra; Westra, Jason; Tintle, Nathan

    2017-01-01

    To date, gene-based rare variant testing approaches have focused on aggregating information across sets of variants to maximize statistical power in identifying genes showing significant association with diseases. Beyond identifying genes that are associated with diseases, the identification of causal variant(s) in those genes and estimation of their effect is crucial for planning replication studies and characterizing the genetic architecture of the locus. However, we illustrate that straightforward single-marker association statistics can suffer from substantial bias introduced by conditioning on gene-based test significance, due to the phenomenon often referred to as “winner's curse.” We illustrate the ramifications of this bias on variant effect size estimation and variant prioritization/ranking approaches, outline parameters of genetic architecture that affect this bias, and propose a bootstrap resampling method to correct for this bias. We find that our correction method significantly reduces the bias due to winner's curse (average two-fold decrease in bias, p < 2.2 × 10−6) and, consequently, substantially improves mean squared error and variant prioritization/ranking. The method is particularly helpful in adjustment for winner's curse effects when the initial gene-based test has low power and for relatively more common, non-causal variants. Adjustment for winner's curse is recommended for all post-hoc estimation and ranking of variants after a gene-based test. Further work is necessary to continue seeking ways to reduce bias and improve inference in post-hoc analysis of gene-based tests under a wide variety of genetic architectures. PMID:28959274
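
    A hedged sketch of bootstrap bias correction in the spirit described above (not the authors' exact procedure): a stand-in burden-style gene test supplies the selection event, and the same selection is replayed within each bootstrap resample so the average inflation among selected resamples estimates the winner's-curse bias.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)

        def burden_p(y, G):
            """Stand-in gene-based test: t-test of phenotype by any-variant carriage."""
            carrier = G.sum(axis=1) > 0
            return stats.ttest_ind(y[carrier], y[~carrier]).pvalue

        def variant_effect(y, g):
            """Naive single-marker estimate: phenotype mean difference by carriage."""
            return y[g == 1].mean() - y[g == 0].mean()

        def corrected_effect(y, G, j, alpha=0.05, n_boot=500):
            """Bootstrap winner's-curse correction for variant j's effect."""
            est = variant_effect(y, G[:, j])
            n, boots = len(y), []
            for _ in range(n_boot):
                idx = rng.integers(0, n, n)
                yb, Gb = y[idx], G[idx]
                if Gb[:, j].min() == Gb[:, j].max():
                    continue                          # no carriers drawn; skip
                if burden_p(yb, Gb) < alpha:          # replay the selection event
                    boots.append(variant_effect(yb, Gb[:, j]))
            bias = (np.mean(boots) - est) if boots else 0.0
            return est - bias

        # Simulated example: 2000 subjects, 5 rare variants, variant 0 causal.
        G = (rng.uniform(size=(2000, 5)) < 0.02).astype(int)
        y = rng.normal(size=2000) + 0.8 * G[:, 0]
        if burden_p(y, G) < 0.05:
            print("naive:", round(variant_effect(y, G[:, 0]), 3),
                  "corrected:", round(corrected_effect(y, G, 0), 3))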

  14. Analysis of Added Value of Subscores with Respect to Classification

    ERIC Educational Resources Information Center

    Sinharay, Sandip

    2014-01-01

    Brennan noted that users of test scores often want (indeed, demand) that subscores be reported, along with total test scores, for diagnostic purposes. Haberman suggested a method based on classical test theory (CTT) to determine if subscores have added value over the total score. One way to interpret the method is that a subscore has added value…

  15. A Method to Examine Content Domain Structures

    ERIC Educational Resources Information Center

    D'Agostino, Jerome; Karpinski, Aryn; Welsh, Megan

    2011-01-01

    After a test is developed, most content validation analyses shift from ascertaining domain definition to studying domain representation and relevance because the domain is assumed to be set once a test exists. We present an approach that allows for the examination of alternative domain structures based on extant test items. In our example based on…

  16. A Ricin Forensic Profiling Approach Based on a Complex Set of Biomarkers

    DOE PAGES

    Fredriksson, Sten-Ake; Wunschel, David S.; Lindstrom, Susanne Wiklund; ...

    2018-03-28

    A forensic method for the retrospective determination of preparation methods used for illicit ricin toxin production was developed. The method was based on a complex set of biomarkers, including carbohydrates, fatty acids and seed storage proteins, in combination with data on ricin and Ricinus communis agglutinin. The analyses were performed on samples prepared from four castor bean plant (R. communis) cultivars by four different sample preparation methods (PM1 – PM4), ranging from simple disintegration of the castor beans to multi-step preparation methods including different protein precipitation methods. Comprehensive analytical data were collected using a range of analytical methods, and robust orthogonal partial least squares discriminant analysis (OPLS-DA) models were constructed based on the calibration set. Using a decision tree and two OPLS-DA models, the sample preparation methods of test set samples were determined. The model statistics of the two models were good, and a 100% rate of correct predictions of the test set was achieved.
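
    scikit-learn has no OPLS-DA, so the sketch below substitutes plain PLS-DA (PLSRegression on one-hot class labels) to illustrate calibration/test classification of preparation methods; the feature matrix and the four PM labels are simulated assumptions, not the paper's data or models.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(3)
        n_per, n_feat = 12, 40
        X, y = [], []
        for cls in range(4):                       # PM1..PM4
            centre = rng.normal(scale=2.0, size=n_feat)
            X.append(centre + rng.normal(size=(n_per, n_feat)))
            y += [cls] * n_per
        X, y = np.vstack(X), np.array(y)

        Y = np.eye(4)[y]                           # one-hot class matrix
        train = rng.uniform(size=len(y)) < 0.75    # calibration / test split
        pls = PLSRegression(n_components=3).fit(X[train], Y[train])
        pred = pls.predict(X[~train]).argmax(axis=1)
        print("test-set accuracy:", (pred == y[~train]).mean())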

  17. A Ricin Forensic Profiling Approach Based on a Complex Set of Biomarkers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fredriksson, Sten-Ake; Wunschel, David S.; Lindstrom, Susanne Wiklund

    A forensic method for the retrospective determination of preparation methods used for illicit ricin toxin production was developed. The method was based on a complex set of biomarkers, including carbohydrates, fatty acids and seed storage proteins, in combination with data on ricin and Ricinus communis agglutinin. The analyses were performed on samples prepared from four castor bean plant (R. communis) cultivars by four different sample preparation methods (PM1 – PM4), ranging from simple disintegration of the castor beans to multi-step preparation methods including different protein precipitation methods. Comprehensive analytical data were collected using a range of analytical methods, and robust orthogonal partial least squares discriminant analysis (OPLS-DA) models were constructed based on the calibration set. Using a decision tree and two OPLS-DA models, the sample preparation methods of test set samples were determined. The model statistics of the two models were good, and a 100% rate of correct predictions of the test set was achieved.

  18. Testing for intracycle determinism in pseudoperiodic time series.

    PubMed

    Coelho, Mara C S; Mendes, Eduardo M A M; Aguirre, Luis A

    2008-06-01

    A determinism test is proposed based on the well-known method of surrogate data. Assuming predictability to be a signature of determinism, the proposed method checks for intracycle (e.g., short-term) determinism in pseudoperiodic time series for which standard methods of surrogate analysis do not apply. The approach is composed of two steps. First, the data are preprocessed to reduce the effects of seasonal and trend components. Second, standard tests of surrogate analysis can then be applied. The determinism test is applied to simulated and experimental pseudoperiodic time series, and the results show the applicability of the proposed test.
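
    A minimal version of a surrogate-data determinism test, with one-step nearest-neighbour prediction error as the statistic and random shuffles as surrogates; the paper's detrending/deseasonalising preprocessing is omitted, and the embedding settings are assumptions.

        import numpy as np

        rng = np.random.default_rng(4)

        def prediction_error(x, dim=3):
            """Mean one-step error of a nearest-neighbour predictor on delay vectors."""
            emb = np.column_stack([x[i:len(x) - dim + i] for i in range(dim)])
            targets = x[dim:]
            preds = []
            for i in range(len(emb) - 1):
                d = np.linalg.norm(emb - emb[i], axis=1)
                d[max(0, i - 1):i + 2] = np.inf     # exclude self and time neighbours
                preds.append(targets[d.argmin()])
            return np.mean((np.array(preds) - targets[:-1]) ** 2)

        # Deterministic (noisy sine) series versus 99 shuffled surrogates:
        t = np.arange(500)
        x = np.sin(0.3 * t) + 0.1 * rng.normal(size=500)
        obs = prediction_error(x)
        null = [prediction_error(rng.permutation(x)) for _ in range(99)]
        p = (1 + sum(e <= obs for e in null)) / 100
        print(f"error {obs:.3f}; p = {p:.2f} (small p suggests determinism)")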

  19. Comparing the Effects of Simulation-Based and Traditional Teaching Methods on the Critical Thinking Abilities and Self-Confidence of Nursing Students.

    PubMed

    Alamrani, Mashael Hasan; Alammar, Kamila Ahmad; Alqahtani, Sarah Saad; Salem, Olfat A

    2018-06-01

    Critical thinking and self-confidence are imperative to success in clinical practice. Educators should use teaching strategies that help students enhance their critical thinking and self-confidence in complex content such as electrocardiogram interpretation. Therefore, teaching electrocardiogram interpretation to students is important for nurse educators. This study compares the effect of simulation-based and traditional teaching methods on the critical thinking and self-confidence of students during electrocardiogram interpretation sessions. Thirty undergraduate nursing students volunteered to participate in this study. The participants were divided into intervention and control groups, which were taught using the simulation-based and traditional teaching programs, respectively. All of the participants were asked to complete the study instrument as a pretest and posttest to measure their critical thinking and self-confidence. Improvement was observed in the control and experimental groups with respect to critical thinking and self-confidence, as evidenced by the results of the paired-samples t test and the Wilcoxon signed-rank test (p < .05). However, the independent t test and Mann-Whitney U test indicate that the difference between the two groups was not significant (p > .05). This study evaluated an innovative simulation-based teaching method for nurses. No significant differences in outcomes were identified between the simulation-based and traditional teaching methods, indicating that well-implemented educational programs using either method effectively promote critical thinking and self-confidence in nursing students. Nurse educators are encouraged to design educational plans with clear objectives to improve the critical thinking and self-confidence of their students. Future research should compare the effects of several teaching sessions using each method in a larger sample.

  20. A novel dissolution media for testing drug release from a nanostructured polysaccharide-based colon specific drug delivery system: an approach to alternative colon media.

    PubMed

    Kotla, Niranjan G; Singh, Sima; Maddiboyina, Balaji; Sunnapu, Omprakash; Webster, Thomas J

    2016-01-01

    The aim of this study was to develop a novel microbially triggered and animal-sparing dissolution method for testing nanorough polysaccharide-based micron granules for colonic drug delivery. In this method, probiotic cultures of bacteria present in the colonic region were prepared and added to the dissolution media, and their performance was compared with that of conventional dissolution methodologies (such as media with rat cecal and human fecal contents). The predominant species (such as Bacteroides, Bifidobacterium, Lactobacillus species, Eubacterium and Streptococcus) were cultured in 12% w/v skimmed milk powder and 5% w/v grade "A" honey. Approximately 10(10)-10(11) colony-forming units/mL of probiotic culture were added to the dissolution media to test the drug release of polysaccharide-based formulations. A USP dissolution apparatus I/II with a gradient pH dissolution method was used to evaluate drug release from formulations intended for colonic drug delivery. Drug release of guar gum/Eudragit FS30D-coated 5-fluorouracil granules was assessed under gastric and small-intestine conditions and within a simulated colonic environment involving fermentation testing with the probiotic culture. The results with the probiotic system were comparable to those obtained from the rat cecal and human fecal-based fermentation model, suggesting that a probiotic dissolution method can be successfully applied to drug release testing of any polysaccharide-based oral formulation intended for colonic delivery. As such, this study adds significantly to the nanostructured biomaterials community by elucidating an easier assay for colonic drug delivery.

  1. Partial F-tests with multiply imputed data in the linear regression framework via coefficient of determination.

    PubMed

    Chaurasia, Ashok; Harel, Ofer

    2015-02-10

    Tests for regression coefficients such as global, local, and partial F-tests are common in applied research. In the framework of multiple imputation, there are several papers addressing tests for regression coefficients. However, for simultaneous hypothesis testing, the existing methods are computationally intensive because they involve calculation with vectors and (inversion of) matrices. In this paper, we propose a simple method based on the scalar entity, coefficient of determination, to perform (global, local, and partial) F-tests with multiply imputed data. The proposed method is evaluated using simulated data and applied to suicide prevention data. Copyright © 2014 John Wiley & Sons, Ltd.
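
    The complete-data version of the scalar idea is easy to state: for a full model with k predictors and a reduced model omitting q of them, fit on n cases, F = ((R2_full - R2_red)/q) / ((1 - R2_full)/(n - k - 1)). The multiple-imputation combining rules are the paper's contribution and are not reproduced here; this sketch covers only the complete-data computation.

        from scipy import stats

        def partial_f_test(r2_full, r2_red, n, k, q):
            """Partial F-test from coefficients of determination alone."""
            df1, df2 = q, n - k - 1
            f = ((r2_full - r2_red) / df1) / ((1 - r2_full) / df2)
            return f, stats.f.sf(f, df1, df2)   # upper-tail p-value

        f, p = partial_f_test(r2_full=0.62, r2_red=0.55, n=200, k=6, q=2)
        print(f"F = {f:.2f}, p = {p:.4f}")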

  2. TTCN-3 Based Conformance Testing of Mobile Broadcast Business Management System in 3G Networks

    NASA Astrophysics Data System (ADS)

    Wang, Zhiliang; Yin, Xia; Xiang, Yang; Zhu, Ruiping; Gao, Shirui; Wu, Xin; Liu, Shijian; Gao, Song; Zhou, Li; Li, Peng

    Mobile broadcast service is one of the most important emerging services in 3G networks. To better operate and manage mobile broadcast services, a mobile broadcast business management system (MBBMS) must be designed and developed. Such a system, with its distributed nature, complicated XML data and security mechanisms, poses many testing challenges. In this paper, we study the conformance testing methodology of MBBMS and design and implement an MBBMS protocol conformance testing tool based on TTCN-3, a standardized test description language that can be used in black-box testing of reactive and distributed systems. In this methodology and testing tool, we present a semi-automatic XML test data generation method for the TTCN-3 test suite and use the HMSC model to aid test suite design. In addition, we propose an integrated testing method for the hierarchical MBBMS security architecture. This testing tool has been used in industrial-level testing.

  3. An automatic and accurate method of full heart segmentation from CT image based on linear gradient model

    NASA Astrophysics Data System (ADS)

    Yang, Zili

    2017-07-01

    Heart segmentation is an important auxiliary method in the diagnosis of many heart diseases, such as coronary heart disease and atrial fibrillation, and in the planning of tumor radiotherapy. Most existing methods for full heart segmentation treat the heart as a whole and cannot accurately extract the bottom of the heart. In this paper, we propose a new method based on a linear gradient model to segment the whole heart from CT images automatically and accurately. Twelve cases were used to evaluate the method; accurate segmentation results were achieved and confirmed by clinical experts. The results can provide reliable clinical support.

  4. Scalable parallel elastic-plastic finite element analysis using a quasi-Newton method with a balancing domain decomposition preconditioner

    NASA Astrophysics Data System (ADS)

    Yusa, Yasunori; Okada, Hiroshi; Yamada, Tomonori; Yoshimura, Shinobu

    2018-04-01

    A domain decomposition method for large-scale elastic-plastic problems is proposed. The proposed method is based on a quasi-Newton method in conjunction with a balancing domain decomposition preconditioner. The use of a quasi-Newton method overcomes two problems associated with the conventional domain decomposition method based on the Newton-Raphson method: (1) avoidance of a double-loop iteration algorithm, which generally has large computational complexity, and (2) consideration of the local concentration of nonlinear deformation, which is observed in elastic-plastic problems with stress concentration. Moreover, the application of a balancing domain decomposition preconditioner ensures scalability. Using the conventional and proposed domain decomposition methods, several numerical tests, including weak scaling tests, were performed. The convergence performance of the proposed method is comparable to that of the conventional method. In particular, in elastic-plastic analysis, the proposed method exhibits better convergence performance than the conventional method.

  5. Rapid and Accurate Multiple Testing Correction and Power Estimation for Millions of Correlated Markers

    PubMed Central

    Han, Buhm; Kang, Hyun Min; Eskin, Eleazar

    2009-01-01

    With the development of high-throughput sequencing and genotyping technologies, the number of markers collected in genetic association studies is growing rapidly, increasing the importance of methods for correcting for multiple hypothesis testing. The permutation test is widely considered the gold standard for accurate multiple testing correction, but it is often computationally impractical for these large datasets. Recently, several studies proposed efficient alternative approaches to the permutation test based on the multivariate normal distribution (MVN). However, they cannot accurately correct for multiple testing in genome-wide association studies for two reasons. First, these methods require partitioning of the genome into many disjoint blocks and ignore all correlations between markers from different blocks. Second, the true null distribution of the test statistic often fails to follow the asymptotic distribution at the tails of the distribution. We propose an accurate and efficient method for multiple testing correction in genome-wide association studies—SLIDE. Our method accounts for all correlation within a sliding window and corrects for the departure of the true null distribution of the statistic from the asymptotic distribution. In simulations using the Wellcome Trust Case Control Consortium data, the error rate of SLIDE's corrected p-values is more than 20 times smaller than the error rate of the previous MVN-based methods' corrected p-values, while SLIDE is orders of magnitude faster than the permutation test and other competing methods. We also extend the MVN framework to the problem of estimating the statistical power of an association study with correlated markers and propose an efficient and accurate power estimation method SLIP. SLIP and SLIDE are available at http://slide.cs.ucla.edu. PMID:19381255
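
    A Monte Carlo illustration of the general MVN idea (not the SLIDE algorithm itself): the corrected p-value is the probability that the maximum |Z| over correlated test statistics exceeds the observed maximum, with Z drawn from N(0, R). The AR(1) correlation matrix standing in for local LD is a toy assumption, and the draw count limits the resolution of small p-values.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(5)
        m = 200
        # Toy AR(1) correlation matrix standing in for LD between markers.
        R = 0.8 ** np.abs(np.subtract.outer(np.arange(m), np.arange(m)))

        def corrected_p(z_obs_max, R, n_draws=10000):
            """P(max |Z| >= observed) for Z ~ N(0, R), by Monte Carlo."""
            L = np.linalg.cholesky(R)
            z = rng.standard_normal((n_draws, len(R))) @ L.T
            return (np.abs(z).max(axis=1) >= z_obs_max).mean()

        print("MVN-corrected p for max|z| = 4.0:", corrected_p(4.0, R))
        # Bonferroni ignores the correlation and is more conservative:
        print("Bonferroni:", min(1.0, m * 2 * stats.norm.sf(4.0)))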

  6. Towards standardized assessment of endoscope optical performance: geometric distortion

    NASA Astrophysics Data System (ADS)

    Wang, Quanzeng; Desai, Viraj N.; Ngo, Ying Z.; Cheng, Wei-Chung; Pfefer, Joshua

    2013-12-01

    Technological advances in endoscopes, such as capsule, ultrathin and disposable devices, promise significant improvements in safety, clinical effectiveness and patient acceptance. Unfortunately, the industry lacks test methods for preclinical evaluation of key optical performance characteristics (OPCs) of endoscopic devices that are quantitative, objective and well-validated. As a result, it is difficult for researchers and developers to compare image quality and evaluate equivalence to, or improvement upon, prior technologies. While endoscope OPCs include resolution, field of view, and depth of field, among others, our focus in this paper is geometric image distortion. We reviewed specific test methods for distortion and then developed an objective, quantitative test method based on well-defined experimental and data processing steps to evaluate radial distortion in the full field of view of an endoscopic imaging system. Our measurements and analyses showed that a second-degree polynomial equation could well describe the radial distortion curve of a traditional endoscope. The distortion evaluation method was effective for correcting the image and can be used to explain other widely accepted evaluation methods such as picture height distortion. Development of consensus standards based on promising test methods for image quality assessment, such as the method studied here, will facilitate clinical implementation of innovative endoscopic devices.
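
    The second-degree polynomial characterisation reduces to a standard curve fit. The sketch below fits measured (distorted) versus ideal radial positions of grid points (fabricated values), reports local percent distortion, and inverts the fit numerically for correction.

        import numpy as np

        r_ideal = np.array([0.0, 50, 100, 150, 200, 250])   # ideal radii (px)
        r_meas  = np.array([0.0, 49, 95, 136, 172, 203])    # measured radii (px)

        coef = np.polyfit(r_ideal, r_meas, deg=2)           # r_meas ~ f(r_ideal)
        fit = np.poly1d(coef)

        pct_distortion = 100 * (r_meas[1:] - r_ideal[1:]) / r_ideal[1:]
        print("local distortion (%):", pct_distortion.round(1))

        # Correction: invert the fit numerically to map measured radii to ideal.
        grid = np.linspace(0, 260, 2601)
        r_corrected = np.interp(r_meas, fit(grid), grid)
        print("corrected radii:", r_corrected.round(1))     # ~= r_ideal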

  7. Entropy-based goodness-of-fit test: Application to the Pareto distribution

    NASA Astrophysics Data System (ADS)

    Lequesne, Justine

    2013-08-01

    Goodness-of-fit tests based on entropy have been introduced in [13] for testing normality. The maximum entropy distribution in a class of probability distributions defined by linear constraints induces a Pythagorean equality between the Kullback-Leibler information and an entropy difference. This allows one to propose a goodness-of-fit test for maximum entropy parametric distributions which is based on the Kullback-Leibler information. We will focus on the application of the method to the Pareto distribution. The power of the proposed test is computed through Monte Carlo simulation.
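
    A generic entropy/KL construction in the spirit of the paper (not its exact statistic): with known scale x_m, y = log(x/x_m) is exponential under the Pareto null, an estimate of the Kullback-Leibler information against the fitted exponential is KL = -H_V(y) + log(mean(y)) + 1 with H_V the Vasicek spacing estimator (an edge-clamped variant here), and Monte Carlo simulation supplies the null distribution.

        import numpy as np

        rng = np.random.default_rng(6)

        def vasicek_entropy(y, m=None):
            """Vasicek spacing estimator of entropy (edge-clamped variant)."""
            y = np.sort(y)
            n = len(y)
            m = m or max(1, int(round(np.sqrt(n) / 2)))
            lo = np.clip(np.arange(n) - m, 0, n - 1)
            hi = np.clip(np.arange(n) + m, 0, n - 1)
            return np.mean(np.log(n / (2 * m) * (y[hi] - y[lo])))

        def kl_stat_exp(y):
            """KL-information estimate of y against a fitted exponential."""
            return -vasicek_entropy(y) + np.log(y.mean()) + 1.0

        # Sample drawn under H0 (Pareto, shape 2.5, known scale xm = 1),
        # so the test should usually not reject.
        n, xm = 100, 1.0
        x = xm * (1 - rng.uniform(size=n)) ** (-1 / 2.5)
        obs = kl_stat_exp(np.log(x / xm))

        null = np.array([kl_stat_exp(rng.exponential(size=n)) for _ in range(999)])
        p = (1 + (null >= obs).sum()) / 1000
        print(f"KL statistic {obs:.3f}, Monte Carlo p = {p:.3f}")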

  8. Clinic-based testing for rectal and pharyngeal Neisseria gonorrhoeae and Chlamydia trachomatis infections by community-based organizations--five cities, United States, 2007.

    PubMed

    2009-07-10

    CDC recommends screening of at-risk men who have sex with men (MSM) at least annually for urethral and rectal gonorrhea and chlamydia, and for pharyngeal gonorrhea. Although the standard method for diagnosis is culture, nucleic acid amplification (NAA) testing is generally more sensitive and favored by most experts. NAA tests have not been cleared by the Food and Drug Administration (FDA) for the diagnosis of extragenital chlamydia or gonorrhea and may not be marketed for that purpose. However, under U.S. law, laboratories may offer NAA testing for diagnosis of extragenital chlamydia or gonorrhea after internal validation of the method by a verification study. To determine sexually transmitted disease (STD) testing practices among community-based organizations serving MSM, CDC and the San Francisco Department of Public Health gathered data on rectal and pharyngeal gonorrhea and chlamydia testing at screening sites managed by six gay-focused community-based organizations in five U.S. cities during 2007. This report summarizes the results of the study, which found that three organizations collected samples for NAA testing and three for culture. In total, approximately 30,000 tests were performed; 5.4% of rectal gonorrhea, 8.9% of rectal chlamydia, 5.3% of pharyngeal gonorrhea, and 1.6% of pharyngeal chlamydia tests were positive. These results demonstrate that gay-focused community-based organizations can detect large numbers of gonorrhea and chlamydia cases and might reach MSM not being tested elsewhere. Public health officials could consider providing support to certain community-based organizations to facilitate testing and treatment of gonorrhea and chlamydia.

  9. An Integrated Analysis-Test Approach

    NASA Technical Reports Server (NTRS)

    Kaufman, Daniel

    2003-01-01

    This viewgraph presentation provides an overview of a project to develop a computer program that integrates data analysis and test procedures. The software application aims to bring a new perspective to traditional mechanical analysis and test procedures and to integrate pre-test and test analysis calculation methods. The program should also be usable on portable devices and allows for 'quasi-real-time' analysis of data sent by electronic means. Test methods reviewed in this presentation include: shaker swept-sine and random tests, shaker shock-mode tests, shaker base-driven modal survey tests, and acoustic tests.

  10. Antifungal Susceptibility Testing of Aspergillus spp. by Using a Composite Correlation Index (CCI)-Based Matrix-Assisted Laser Desorption Ionization–Time of Flight Mass Spectrometry Method Appears To Not Offer Benefit over Traditional Broth Microdilution Testing

    PubMed Central

    Gitman, Melissa R.; McTaggart, Lisa; Spinato, Joanna; Poopalarajah, Rahgavi; Lister, Erin; Husain, Shahid

    2017-01-01

    Aspergillus spp. cause serious invasive lung infections, and Aspergillus fumigatus is the most commonly encountered clinically significant species. Voriconazole is considered to be the drug of choice for treating A. fumigatus infections; however, rising resistance rates have been reported. We evaluated a matrix-assisted laser desorption ionization-time of flight mass spectrometry (MALDI-TOF MS)-based method for the differentiation between wild-type and non-wild-type isolates of 20 Aspergillus spp. (including 2 isolates of Aspergillus ustus and 1 of Aspergillus calidoustus that were used as controls due to their intrinsically low azole susceptibility with respect to the in vitro response to voriconazole). At 30 and 48 h of incubation, there was complete agreement between Cyp51A sequence analysis, broth microdilution, and MALDI-TOF MS classification of isolates as wild type or non-wild type. In this proof-of-concept study, we demonstrated that MALDI-TOF MS can be used to accurately detect A. fumigatus strains with reduced voriconazole susceptibility. However, rather than proving to be a rapid and simple method for antifungal susceptibility testing, this particular MS-based method showed no benefit over conventional testing methods. PMID:28404678

  11. Antifungal Susceptibility Testing of Aspergillus spp. by Using a Composite Correlation Index (CCI)-Based Matrix-Assisted Laser Desorption Ionization-Time of Flight Mass Spectrometry Method Appears To Not Offer Benefit over Traditional Broth Microdilution Testing.

    PubMed

    Gitman, Melissa R; McTaggart, Lisa; Spinato, Joanna; Poopalarajah, Rahgavi; Lister, Erin; Husain, Shahid; Kus, Julianne V

    2017-07-01

    Aspergillus spp. cause serious invasive lung infections, and Aspergillus fumigatus is the most commonly encountered clinically significant species. Voriconazole is considered to be the drug of choice for treating A. fumigatus infections; however, rising resistance rates have been reported. We evaluated a matrix-assisted laser desorption ionization-time of flight mass spectrometry (MALDI-TOF MS)-based method for the differentiation between wild-type and non-wild-type isolates of 20 Aspergillus spp. (including 2 isolates of Aspergillus ustus and 1 of Aspergillus calidoustus that were used as controls due to their intrinsically low azole susceptibility with respect to the in vitro response to voriconazole). At 30 and 48 h of incubation, there was complete agreement between Cyp51A sequence analysis, broth microdilution, and MALDI-TOF MS classification of isolates as wild type or non-wild type. In this proof-of-concept study, we demonstrated that MALDI-TOF MS can be used to accurately detect A. fumigatus strains with reduced voriconazole susceptibility. However, rather than proving to be a rapid and simple method for antifungal susceptibility testing, this particular MS-based method showed no benefit over conventional testing methods. © Crown copyright 2017.

  12. Implementation of centrifuge testing of expansive soils for pavement design.

    DOT National Transportation Integrated Search

    2017-03-01

    The novel centrifuge-based method for testing of expansive soils from project 5-6048-01 was implemented into use for the determination of the Potential Vertical Rise (PVR) of roadways that sit on expansive subgrades. The centrifuge method was mod...

  13. Analysis of visual quality improvements provided by known tools for HDR content

    NASA Astrophysics Data System (ADS)

    Kim, Jaehwan; Alshina, Elena; Lee, JongSeok; Park, Youngo; Choi, Kwang Pyo

    2016-09-01

    In this paper, the visual quality of different solutions for high dynamic range (HDR) compression is analyzed using MPEG test content. We also simulate a method for efficient HDR compression that is based on the statistical properties of the signal. The method is compliant with the HEVC specification and is also easily compatible with alternative methods that might require HEVC specification changes. It was subjectively tested on commercial TVs and compared with alternative solutions for HDR coding. Subjective visual quality tests were performed on a commercial SUHD TV (Samsung JS9500) with peak luminance up to 1000 nits. The solution based on the statistical properties of the signal shows improvement not only in objective performance but also in visual quality compared with other HDR solutions, while remaining compatible with the HEVC specification.

  14. How to detect carbapenemase producers? A literature review of phenotypic and molecular methods.

    PubMed

    Hammoudi, D; Moubareck, C Ayoub; Sarkis, D Karam

    2014-12-01

    This review describes the current state of the art in carbapenemase detection methods. Identification of carbapenemases is first based on conventional phenotypic tests, including antimicrobial susceptibility testing, the modified Hodge test, and carbapenemase-inhibitor culture tests. Second, molecular characterization of carbapenemase genes by PCR sequencing is essential. Third, innovative biochemical and spectrometric detection may be applied. Copyright © 2014 Elsevier B.V. All rights reserved.

  15. A Method for Implementing Force-Limited Vibration Control

    NASA Technical Reports Server (NTRS)

    Worth, Daniel B.

    1997-01-01

    NASA/GSFC has implemented force-limited vibration control on a controller which can only accept one profile. The method uses a personal computer based digital signal processing board to convert force and/or moment signals into what appears to be an acceleration signal to the controller. This technique allows test centers with older controllers to use the latest force-limited control techniques for random vibration testing. The paper describes the method, hardware, and test procedures used. An example from a test performed at NASA/GSFC is used as a guide.

  16. Task-based statistical image reconstruction for high-quality cone-beam CT

    NASA Astrophysics Data System (ADS)

    Dang, Hao; Webster Stayman, J.; Xu, Jennifer; Zbijewski, Wojciech; Sisniega, Alejandro; Mow, Michael; Wang, Xiaohui; Foos, David H.; Aygun, Nafi; Koliatsos, Vassilis E.; Siewerdsen, Jeffrey H.

    2017-11-01

    Task-based analysis of medical imaging performance underlies many ongoing efforts in the development of new imaging systems. In statistical image reconstruction, regularization is often formulated in terms that encourage smoothness and/or sharpness (e.g. a linear, quadratic, or Huber penalty) but without explicit formulation of the task. We propose an alternative regularization approach in which a spatially varying penalty is determined that maximizes task-based imaging performance at every location in a 3D image. We apply the method to model-based image reconstruction (MBIR; viz., penalized weighted least-squares, PWLS) in cone-beam CT (CBCT) of the head, focusing on the task of detecting a small, low-contrast intracranial hemorrhage (ICH), and we test the performance of the algorithm in the context of a recently developed CBCT prototype for point-of-care imaging of brain injury. Theoretical predictions of local spatial resolution and noise are computed via an optimization by which regularization (specifically, the quadratic penalty strength) is allowed to vary throughout the image to maximize the local task-based detectability index (d′). Simulation studies and test-bench experiments were performed using an anthropomorphic head phantom. Three PWLS implementations were tested: a conventional (constant) penalty; a certainty-based penalty derived to enforce a constant point-spread function (PSF); and the task-based penalty derived to maximize local detectability at each location. Conventional (constant) regularization exhibited a fairly strong degree of spatial variation in d′, and the certainty-based method achieved a uniform PSF, but each exhibited a reduction in detectability compared to the task-based method, which improved detectability by up to ~15%. The improvement was strongest in areas of high attenuation (skull base), where the conventional and certainty-based methods tended to over-smooth the data. The task-driven reconstruction method presents a promising regularization approach in MBIR by explicitly incorporating task-based imaging performance as the objective. The results demonstrate improved ICH conspicuity and support the development of high-quality CBCT systems.

  17. Test Scheduling for Core-Based SOCs Using Genetic Algorithm Based Heuristic Approach

    NASA Astrophysics Data System (ADS)

    Giri, Chandan; Sarkar, Soumojit; Chattopadhyay, Santanu

    This paper presents a Genetic Algorithm (GA) based solution to co-optimize test scheduling and wrapper design for core-based SOCs. Core testing solutions are generated as a set of wrapper configurations, represented as rectangles with width equal to the number of TAM (Test Access Mechanism) channels and height equal to the corresponding testing time. A locally optimal best-fit heuristic bin packing algorithm is used to determine the placement of rectangles, minimizing the overall test time, whereas the GA generates the sequence of rectangles to be considered for placement. Experimental results on the ITC'02 benchmark SOCs show that the proposed method provides better solutions than recent works reported in the literature.
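    To make the rectangle model concrete, the sketch below shows, under invented parameters (16 TAM channels, six hypothetical core rectangles), how a GA-evolved placement order can be combined with a best-fit heuristic that stacks each wrapper rectangle onto the earliest-finishing contiguous channel span. It is a minimal illustration of the idea, not the authors' implementation.

```python
import random

# Each core test is a rectangle: (TAM width, test time). A GA evolves the
# placement order; best-fit packing stacks rectangles on W TAM channels.
W = 16                                                 # total TAM channels (assumed)
CORES = [(4, 30), (8, 12), (2, 50), (6, 22), (4, 18), (8, 8)]

def makespan(order):
    """Best-fit placement: each rectangle goes on the contiguous span of
    channels that lets it start (and hence finish) earliest."""
    free = [0.0] * W                                   # when each channel frees up
    for idx in order:
        w, t = CORES[idx]
        start = min(range(W - w + 1), key=lambda s: max(free[s:s + w]))
        finish = max(free[start:start + w]) + t
        for c in range(start, start + w):
            free[c] = finish
    return max(free)

def evolve(pop_size=30, gens=100):
    pop = [random.sample(range(len(CORES)), len(CORES)) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=makespan)
        parents = pop[:pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(a))
            child = a[:cut] + [g for g in b if g not in a[:cut]]  # order crossover
            if random.random() < 0.2:                             # swap mutation
                i, j = random.sample(range(len(child)), 2)
                child[i], child[j] = child[j], child[i]
            children.append(child)
        pop = parents + children
    return min(pop, key=makespan)

best = evolve()
print("best order:", best, "-> total test time:", makespan(best))
```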

  18. Optics-Only Calibration of a Neural-Net Based Optical NDE Method for Structural Health Monitoring

    NASA Technical Reports Server (NTRS)

    Decker, Arthur J.

    2004-01-01

    A calibration process is presented that uses optical measurements alone to calibrate a neural-net based NDE method. The method itself detects small changes in the vibration mode shapes of structures. The optics-only calibration process confirms previous work showing that the sensitivity to vibration-amplitude changes can be as small as 10 nanometers. A more practical value in an NDE service laboratory is shown to be 50 nanometers. Both model-generated and experimental calibrations are demonstrated using two implementations of the calibration technique. The implementations are based on previously published demonstrations of the NDE method and an alternative calibration procedure that depends on comparing neural-net and point-sensor measurements. The optics-only calibration method, unlike the alternative method, does not require modifications of the structure being tested or the creation of calibration objects. The calibration process can be used to test improvements in the NDE process and to develop a vibration-mode independence of damage-detection sensitivity. The calibration effort was intended to support NASA's objective to promote safety in the operation of ground test facilities and aviation safety in general by allowing the detection of the gradual onset of structural changes and damage.

  19. Traditional use of indigenous mosquito-repellents to protect humans against mosquitoes and other insect bites in a rural community of Cameroon.

    PubMed

    Ntonifor, N N; Ngufor, C A; Kimbi, H K; Oben, B O

    2006-10-01

    To document and test the efficacy of indigenous traditional personal protection methods against mosquito bites and general nuisance. A prospective study based on a survey and a field evaluation of selected plant-based personal protection methods against mosquito bites, conducted in Bolifamba, a rural setting of the Mount Cameroon region. A structured questionnaire was administered to 179 respondents and two anti-mosquito measures were tested under field conditions. Of the 179 respondents, 88 (49.16%) used traditional anti-mosquito methods; of these, 57 (64.77%) used plant-based methods while 31 (35.2%) used various petroleum oils. The rest of the respondents, 91 (50.8%), used conventional personal protection methods. The reasons given for using traditional methods were availability, affordability, and the lack of known more effective alternatives. The demerits of these methods were that they are laborious to implement, stain clothing, and produce a lot of smoke and repulsive odours when used; those of conventional methods were a lack of adequate information about them, high cost, and non-availability. When the two most frequently used plants, Saccharum officinarum and Ocimum basilicum, were evaluated under field conditions, each gave better protection than the control. Most plants used against mosquitoes in the area are known potent mosquito repellents, but others identified in the study warrant further research. The two plants tested under field conditions were effective, though less so than the commonly used commercial repellent diethyltoluamide.

  20. Illegal performance enhancing drugs and doping in sport: a picture-based brief implicit association test for measuring athletes’ attitudes

    PubMed Central

    2014-01-01

    Background Doping attitude is a key variable in predicting athletes’ intention to use forbidden performance enhancing drugs. Indirect reaction-time based attitude tests, such as the implicit association test, conceal the ultimate goal of measurement from the participant better than questionnaires. Indirect tests are especially useful when socially sensitive constructs such as attitudes towards doping need to be described. The present study serves the development and validation of a novel picture-based brief implicit association test (BIAT) for testing athletes’ attitudes towards doping in sport. It shall provide the basis for a transnationally compatible research instrument able to harmonize anti-doping research efforts. Method Following a known-group differences validation strategy, the doping attitudes of 43 athletes from bodybuilding (representative for a highly doping prone sport) and handball (as a contrast group) were compared using the picture-based doping-BIAT. The Performance Enhancement Attitude Scale (PEAS) was employed as a corresponding direct measure in order to additionally validate the results. Results As expected, in the group of bodybuilders, indirectly measured doping attitudes as tested with the picture-based doping-BIAT were significantly less negative (η² = .11). The doping-BIAT and PEAS scores correlated significantly at r = .50 for bodybuilders, and not significantly at r = .36 for handball players. There was a low error rate (7%) and a satisfactory internal consistency (rtt = .66) for the picture-based doping-BIAT. Conclusions The picture-based doping-BIAT constitutes a psychometrically tested method, ready to be adopted by the international research community. The test can be administered via the internet. All test material is available “open source”. The test might be implemented, for example, as a new effect-measure in the evaluation of prevention programs. PMID:24479865

  1. Experimental Study on Welded Headed Studs Used In Steel Plate-Concrete Composite Structures Compared with Contactless Method of Measuring Displacement

    NASA Astrophysics Data System (ADS)

    Kisała, Dawid; Tekieli, Marcin

    2017-10-01

    Steel plate-concrete composite structures are a new innovative design concept in which a thin steel plate is attached to the reinforced concrete beam by means of welded headed studs. The comparison between experimental studies and theoretical analysis of this type of structures shows that their behaviour is dependent on the load-slip relationship of the shear connectors used to ensure sufficient bond between the concrete and steel parts of the structure. The aim of this paper is to describe an experimental study on headed studs used in steel plate-concrete composite structures. Push-out tests were carried out to investigate the behaviour of shear connectors. The test specimens were prepared according to standard push-out tests; however, instead of an I-beam, a 16 mm thick steel plate was used to better reflect the conditions in the real structure. The test specimens were produced in two batches using concrete with significantly different compressive strength. The experimental study was carried out on twelve specimens. Besides the traditional measurements based on LVDT sensors, optical measurements based on the digital image correlation (DIC) method and pattern tracking methods were used. DIC is a full-field contactless optical method for measuring displacements in experimental testing, based on the correlation of digital images taken during test execution. Compared with conventional methods, optical measurements offer a wider scope of results and can give more information about the material or construction behaviour during the test. The ultimate load capacity and load-slip curves obtained from the experiments were compared with the values calculated based on Eurocodes and on American and Chinese design specifications. It was observed that the use of the relationships developed for traditional steel-concrete composite structures is justified in the case of the ultimate load capacity of shear connectors in steel plate-concrete composite structures.

  2. Genetics-based methods for detection of Salmonella spp. in foods.

    PubMed

    Mozola, Mark A

    2006-01-01

    Genetic methods are now at the forefront of foodborne pathogen testing. The sensitivity, specificity, and inclusivity advantages offered by deoxyribonucleic acid (DNA) probe technology have driven an intense effort in methods development over the past 20 years. DNA probe-based methods for Salmonella spp. and other pathogens have progressed from time-consuming procedures involving the use of radioisotopes to simple, high throughput, automated assays. The analytical sensitivity of nucleic acid amplification technology has facilitated a reduction in analysis time by allowing enriched samples to be tested for previously undetectable quantities of analyte. This article will trace the evolution of the development of genetic methods for detection of Salmonella in foods, review the basic assay formats and their advantages and limitations, and discuss method performance characteristics and considerations for selection of methods.

  3. Model-based phase-shifting interferometer

    NASA Astrophysics Data System (ADS)

    Liu, Dong; Zhang, Lei; Shi, Tu; Yang, Yongying; Chong, Shiyao; Miao, Liang; Huang, Wei; Shen, Yibing; Bai, Jian

    2015-10-01

    A model-based phase-shifting interferometer (MPI) is developed in which a novel calculation technique replaces the traditionally complicated system structure, achieving versatile, high-precision, quantitative surface tests. In the MPI, a partial null lens (PNL) is employed to implement the non-null test. With a set of alternative PNLs, similar to the transmission spheres in ZYGO interferometers, the MPI provides a flexible test for general spherical and aspherical surfaces. Based on modern computer modeling techniques, a reverse iterative optimizing reconstruction (ROR) method is employed for the retrace error correction of the non-null test, as well as for figure error reconstruction. A self-compiled ray-tracing program is set up for accurate system modeling and reverse ray tracing. The surface figure error can then be easily extracted from the wavefront data in the form of Zernike polynomials by the ROR method. Experiments on spherical and aspherical tests are presented to validate the flexibility and accuracy. The test results are compared with those of a ZYGO interferometer (null tests), which demonstrates the high accuracy of the MPI. With such accuracy and flexibility, the MPI holds great potential in modern optical shop testing.

  4. Card Sorting in an Online Environment: Key to Involving Online-Only Student Population in Usability Testing of an Academic Library Web Site?

    ERIC Educational Resources Information Center

    Paladino, Emily B.; Klentzin, Jacqueline C.; Mills, Chloe P.

    2017-01-01

    Based on in-person, task-based usability testing and interviews, the authors' library Web site was recently overhauled in order to improve user experience. This led to the authors' interest in additional usability testing methods and test environments that would most closely fit their library's goals and situation. The appeal of card sorting…

  5. Reliability enhancement of Navier-Stokes codes through convergence enhancement

    NASA Technical Reports Server (NTRS)

    Choi, K.-Y.; Dulikravich, G. S.

    1993-01-01

    Reduction of total computing time required by an iterative algorithm for solving Navier-Stokes equations is an important aspect of making the existing and future analysis codes more cost effective. Several attempts have been made to accelerate the convergence of an explicit Runge-Kutta time-stepping algorithm. These acceleration methods are based on local time stepping, implicit residual smoothing, enthalpy damping, and multigrid techniques. Also, an extrapolation procedure based on the power method and the Minimal Residual Method (MRM) were applied to the Jameson's multigrid algorithm. The MRM uses the same values of optimal weights for the corrections to every equation in a system and has not been shown to accelerate the scheme without multigriding. Our Distributed Minimal Residual (DMR) method based on our General Nonlinear Minimal Residual (GNLMR) method allows each component of the solution vector in a system of equations to have its own convergence speed. The DMR method was found capable of reducing the computation time by 10-75 percent depending on the test case and grid used. Recently, we have developed and tested a new method termed Sensitivity Based DMR or SBMR method that is easier to implement in different codes and is even more robust and computationally efficient than our DMR method.

  6. Reliability enhancement of Navier-Stokes codes through convergence enhancement

    NASA Astrophysics Data System (ADS)

    Choi, K.-Y.; Dulikravich, G. S.

    1993-11-01

    Reduction of total computing time required by an iterative algorithm for solving Navier-Stokes equations is an important aspect of making the existing and future analysis codes more cost effective. Several attempts have been made to accelerate the convergence of an explicit Runge-Kutta time-stepping algorithm. These acceleration methods are based on local time stepping, implicit residual smoothing, enthalpy damping, and multigrid techniques. Also, an extrapolation procedure based on the power method and the Minimal Residual Method (MRM) were applied to the Jameson's multigrid algorithm. The MRM uses the same values of optimal weights for the corrections to every equation in a system and has not been shown to accelerate the scheme without multigriding. Our Distributed Minimal Residual (DMR) method based on our General Nonlinear Minimal Residual (GNLMR) method allows each component of the solution vector in a system of equations to have its own convergence speed. The DMR method was found capable of reducing the computation time by 10-75 percent depending on the test case and grid used. Recently, we have developed and tested a new method termed Sensitivity Based DMR or SBMR method that is easier to implement in different codes and is even more robust and computationally efficient than our DMR method.

  7. A Swiss cheese error detection method for real-time EPID-based quality assurance and error prevention.

    PubMed

    Passarge, Michelle; Fix, Michael K; Manser, Peter; Stampanoni, Marco F M; Siebers, Jeffrey V

    2017-04-01

    To develop a robust and efficient process that detects relevant dose errors (dose errors of ≥5%) in external beam radiation therapy and directly indicates the origin of the error. The process is illustrated in the context of electronic portal imaging device (EPID)-based angle-resolved volumetric-modulated arc therapy (VMAT) quality assurance (QA), particularly as would be implemented in a real-time monitoring program. A Swiss cheese error detection (SCED) method was created as a paradigm for a cine EPID-based during-treatment QA. For VMAT, the method compares a treatment plan-based reference set of EPID images with images acquired over each 2° gantry angle interval. The process utilizes a sequence of independent consecutively executed error detection tests: an aperture check that verifies in-field radiation delivery and ensures no out-of-field radiation; output normalization checks at two different stages; global image alignment check to examine if rotation, scaling, and translation are within tolerances; pixel intensity check containing the standard gamma evaluation (3%, 3 mm) and pixel intensity deviation checks including and excluding high dose gradient regions. Tolerances for each check were determined. To test the SCED method, 12 different types of errors were selected to modify the original plan. A series of angle-resolved predicted EPID images were artificially generated for each test case, resulting in a sequence of precalculated frames for each modified treatment plan. The SCED method was applied multiple times for each test case to assess the ability to detect introduced plan variations. To compare the performance of the SCED process with that of a standard gamma analysis, both error detection methods were applied to the generated test cases with realistic noise variations. Averaged over ten test runs, 95.1% of all plan variations that resulted in relevant patient dose errors were detected within 2° and 100% within 14° (<4% of patient dose delivery). Including cases that led to slightly modified but clinically equivalent plans, 89.1% were detected by the SCED method within 2°. Based on the type of check that detected the error, determination of error sources was achieved. With noise ranging from no random noise to four times the established noise value, the averaged relevant dose error detection rate of the SCED method was between 94.0% and 95.8% and that of gamma between 82.8% and 89.8%. An EPID-frame-based error detection process for VMAT deliveries was successfully designed and tested via simulations. The SCED method was inspected for robustness with realistic noise variations, demonstrating that it has the potential to detect a large majority of relevant dose errors. Compared to a typical (3%, 3 mm) gamma analysis, the SCED method produced a higher detection rate for all introduced dose errors, identified errors in an earlier stage, displayed a higher robustness to noise variations, and indicated the error source. © 2017 American Association of Physicists in Medicine.
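    As a point of reference for the gamma evaluation step used in the pixel intensity check, here is a simplified global 1D gamma (3%, 3 mm) computation in Python; it is a generic textbook formulation for illustration, not the SCED pipeline itself.

```python
import numpy as np

def gamma_1d(ref, ev, x, dose_tol=0.03, dist_tol=3.0):
    """Simplified global 1D gamma: for each reference point, take the minimum
    combined dose-difference/distance-to-agreement metric over all evaluated
    points; gamma <= 1 counts as a pass."""
    dmax = ref.max()
    gam = np.empty_like(ref)
    for i, (xi, di) in enumerate(zip(x, ref)):
        dose_term = (ev - di) / (dose_tol * dmax)   # global dose normalization
        dist_term = (x - xi) / dist_tol
        gam[i] = np.sqrt(dose_term ** 2 + dist_term ** 2).min()
    return gam

x = np.linspace(0.0, 100.0, 201)            # position in mm
ref = np.exp(-((x - 50.0) / 15.0) ** 2)     # reference dose profile
ev = np.exp(-((x - 51.0) / 15.0) ** 2)      # evaluated profile, 1 mm shift
g = gamma_1d(ref, ev, x)
print("gamma pass rate: %.1f%%" % (100.0 * (g <= 1.0).mean()))
```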

  8. Position Accuracy Analysis of a Robust Vision-Based Navigation

    NASA Astrophysics Data System (ADS)

    Gaglione, S.; Del Pizzo, S.; Troisi, S.; Angrisano, A.

    2018-05-01

    Using images to determine camera position and attitude is a consolidated method, very widespread for applications like UAV navigation. In harsh environments, where GNSS can be degraded or denied, image-based positioning is a possible candidate for an integrated or alternative system. In this paper, such a method is investigated using a system based on a single camera and 3D maps. A robust estimation method is proposed in order to limit the effect of blunders or noisy measurements on the position solution. The proposed approach is tested using images collected in an urban canyon, where GNSS positioning is very inaccurate. A photogrammetric survey was previously performed to build the 3D model of the tested area. The position accuracy analysis is performed and the effect of the proposed robust method is validated.
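    The abstract does not name the robust estimator; one common choice for limiting the influence of blunders in a linearized position fix is Huber-weighted iteratively reweighted least squares, sketched below on synthetic data.

```python
import numpy as np

def irls_fix(A, b, iters=10, k=1.345):
    """Huber-weighted iteratively reweighted least squares for a linearized
    observation model A x ~ b; gross outliers are downweighted rather than
    dragging the position solution."""
    x = np.linalg.lstsq(A, b, rcond=None)[0]
    for _ in range(iters):
        r = b - A @ x
        scale = 1.4826 * np.median(np.abs(r - np.median(r)))  # robust MAD scale
        scale = scale if scale > 0 else 1.0
        u = np.abs(r) / scale
        w = np.ones_like(u)
        w[u > k] = k / u[u > k]                               # Huber weights
        sw = np.sqrt(w)
        x = np.linalg.lstsq(A * sw[:, None], b * sw, rcond=None)[0]
    return x

rng = np.random.default_rng(0)
A = rng.normal(size=(20, 2))               # toy linearized observation geometry
x_true = np.array([3.0, -1.0])
b = A @ x_true + 0.01 * rng.normal(size=20)
b[5] += 5.0                                # one blunder, e.g. a bad feature match
print("plain LS :", np.linalg.lstsq(A, b, rcond=None)[0])
print("robust LS:", irls_fix(A, b))
```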

  9. A deep learning-based multi-model ensemble method for cancer prediction.

    PubMed

    Xiao, Yawen; Wu, Jun; Lin, Zongli; Zhao, Xiaodong

    2018-01-01

    Cancer is a complex worldwide health problem associated with high mortality. With the rapid development of high-throughput sequencing technology and the application of various machine learning methods that have emerged in recent years, progress in cancer prediction has been increasingly made based on gene expression, providing insight into effective and accurate treatment decision making. Thus, developing machine learning methods which can successfully distinguish cancer patients from healthy persons is of great current interest. However, among the classification methods applied to cancer prediction so far, no single method outperforms all the others. In this paper, we demonstrate a new strategy, which applies deep learning to an ensemble approach that incorporates multiple different machine learning models. We supply informative gene data selected by differential gene expression analysis to five different classification models. Then, a deep learning method is employed to ensemble the outputs of the five classifiers. The proposed deep learning-based multi-model ensemble method was tested on three public RNA-seq data sets of three kinds of cancers, Lung Adenocarcinoma, Stomach Adenocarcinoma and Breast Invasive Carcinoma. The test results indicate that it increases the prediction accuracy of cancer for all the tested RNA-seq data sets as compared to using a single classifier or the majority voting algorithm. By taking full advantage of different classifiers, the proposed deep learning-based multi-model ensemble method is shown to be accurate and effective for cancer prediction. Copyright © 2017 Elsevier B.V. All rights reserved.
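    A minimal sketch of the stacking idea using scikit-learn stand-ins: five base classifiers produce class probabilities and a small neural network ensembles them. The study uses RNA-seq gene expression data and a deeper network; everything here, including the synthetic data, is illustrative only.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from sklearn.neural_network import MLPClassifier

# Synthetic stand-in for informative gene expression features.
X, y = make_classification(n_samples=600, n_features=50, n_informative=10,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

base = [LogisticRegression(max_iter=1000), KNeighborsClassifier(),
        DecisionTreeClassifier(random_state=0), SVC(probability=True),
        RandomForestClassifier(random_state=0)]

# First level: each classifier emits a class-1 probability. (A production
# pipeline would use cross-validated predictions here to avoid leakage.)
meta_tr = np.column_stack([c.fit(X_tr, y_tr).predict_proba(X_tr)[:, 1] for c in base])
meta_te = np.column_stack([c.predict_proba(X_te)[:, 1] for c in base])

# Second level: a small neural network ensembles the five outputs.
ens = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
ens.fit(meta_tr, y_tr)
print("ensemble accuracy: %.3f" % ens.score(meta_te, y_te))
```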

  10. A suite of phantom-based test methods for assessing image quality of photoacoustic tomography systems

    NASA Astrophysics Data System (ADS)

    Vogt, William C.; Jia, Congxian; Wear, Keith A.; Garra, Brian S.; Pfefer, T. Joshua

    2017-03-01

    As Photoacoustic Tomography (PAT) matures and undergoes clinical translation, objective performance test methods are needed to facilitate device development, regulatory clearance and clinical quality assurance. For mature medical imaging modalities such as CT, MRI, and ultrasound, tissue-mimicking phantoms are frequently incorporated into consensus standards for performance testing. A well-validated set of phantom-based test methods is needed for evaluating performance characteristics of PAT systems. To this end, we have constructed phantoms using a custom tissue-mimicking material based on PVC plastisol with tunable, biologically-relevant optical and acoustic properties. Each phantom is designed to enable quantitative assessment of one or more image quality characteristics including 3D spatial resolution, spatial measurement accuracy, ultrasound/PAT co-registration, uniformity, penetration depth, geometric distortion, sensitivity, and linearity. Phantoms contained targets including high-intensity point source targets and dye-filled tubes. This suite of phantoms was used to measure the dependence of performance of a custom PAT system (equipped with four interchangeable linear array transducers of varying design) on design parameters (e.g., center frequency, bandwidth, element geometry). Phantoms also allowed comparison of image artifacts, including surface-generated clutter and bandlimited sensing artifacts. Results showed that transducer design parameters create strong variations in performance including a trade-off between resolution and penetration depth, which could be quantified with our method. This study demonstrates the utility of phantom-based image quality testing in device performance assessment, which may guide development of consensus standards for PAT systems.

  11. [Isolation and identification methods of enterobacteria group and its technological advancement].

    PubMed

    Furuta, Itaru

    2007-08-01

    In the last half-century, isolation and identification methods for enterobacteria groups have been markedly improved by technological advancement. Clinical microbiology tests have changed over time from tube methods to commercial identification kits and automated identification. Tube methods are the original method for the identification of enterobacteria groups, that is, an essential baseline method for recognizing bacterial fermentation and biochemical principles. In this paper, traditional tube tests are discussed, such as the utilization of carbohydrates and the indole, methyl red, citrate, and urease tests. Current methods, namely commercial identification kits and automated instruments with computer-based analysis, are also discussed; these provide rapidity and accuracy. Nonculture techniques, such as nucleic acid typing methods using PCR analysis and immunochemical methods using monoclonal antibodies, can be developed further.

  12. Cylinder surface test with Chebyshev polynomial fitting method

    NASA Astrophysics Data System (ADS)

    Yu, Kui-bang; Guo, Pei-ji; Chen, Xi

    2017-10-01

    The Zernike polynomial fitting method is often applied in the testing of optical components and systems, where it is used to represent the wavefront and surface error in a circular domain. Zernike polynomials are not orthogonal in a rectangular region, which makes them unsuitable for the test of optical elements with rectangular apertures, such as cylinder surfaces. Applying the Chebyshev polynomials, which are orthogonal over a rectangular area, as a substitute in the fitting method solves the problem. For a cylinder surface with a diameter of 50 mm and an F number of 1/7, a measuring system based on Fizeau interferometry has been designed in Zemax. The expressions of the two-dimensional Chebyshev polynomials are given and their relationship with the aberrations is presented. Furthermore, Chebyshev polynomials are used as basis terms to analyze the rectangular-aperture test data. The coefficients of the different terms are obtained from the test data through the method of least squares. Comparing the Chebyshev spectra under different misalignments shows that each misalignment is independent and has a definite relationship with certain Chebyshev terms. The simulation results show that this polynomial fitting method yields a great improvement in the efficiency of the detection and adjustment of the cylinder surface test.
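    A minimal illustration of least-squares fitting with two-dimensional Chebyshev polynomials over a rectangular aperture, using NumPy's chebvander2d to build the design matrix; the surface data here are synthetic, and this is not the authors' code.

```python
import numpy as np
from numpy.polynomial import chebyshev as C

rng = np.random.default_rng(0)
ny, nx = 64, 64
# Rectangular aperture mapped to [-1, 1] x [-1, 1], where Chebyshev
# polynomials are orthogonal; synthetic surface error plus noise.
y, x = np.mgrid[-1:1:ny * 1j, -1:1:nx * 1j]
surface = 0.5 * x**2 - 0.2 * x * y + 0.05 * y**3 + 0.01 * rng.normal(size=(ny, nx))

V = C.chebvander2d(x.ravel(), y.ravel(), [4, 4])   # design matrix, 25 basis terms
coef, *_ = np.linalg.lstsq(V, surface.ravel(), rcond=None)
fit = (V @ coef).reshape(ny, nx)
print("rms fit residual: %.4f" % np.sqrt(np.mean((surface - fit) ** 2)))
```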

  13. Development and application of hybrid structure based method for efficient screening of ligands binding to G-protein coupled receptors

    NASA Astrophysics Data System (ADS)

    Kortagere, Sandhya; Welsh, William J.

    2006-12-01

    G-protein coupled receptors (GPCRs) comprise a large superfamily of proteins that are targets for nearly 50% of drugs in clinical use today. In the past, the use of structure-based drug design strategies to develop better drug candidates has been severely hampered by the absence of the receptors' three-dimensional structures. However, with recent advances in molecular modeling techniques and better computing power, atomic-level details of these receptors can be derived from computationally built molecular models. Using information from these models coupled with experimental evidence, it has become feasible to build receptor pharmacophores. In this study, we demonstrate the use of the Hybrid Structure Based (HSB) method, which can be used effectively to screen and identify prospective ligands that bind to GPCRs. Essentially, this multi-step method combines ligand-based methods for building enriched libraries of small molecules with structure-based methods for screening molecules against the GPCR target. The HSB method was validated by identifying retinal and its analogues from a random dataset of ~300,000 molecules. The results from this study showed that the 9 top-ranking molecules are indeed analogues of retinal. The method was also tested by identifying analogues of dopamine binding to the dopamine D2 receptor. Six of the ten top-ranking molecules are known analogues of dopamine, including a prodrug, while the other thirty-four molecules are currently being tested for their activity against all dopamine receptors. The results from both these test cases show that the HSB method provides a realistic solution to bridge the gap between the ever-increasing demand for new drugs to treat psychiatric disorders and the lack of efficient screening methods for GPCRs.

  14. A method for diagnosing time dependent faults using model-based reasoning systems

    NASA Technical Reports Server (NTRS)

    Goodrich, Charles H.

    1995-01-01

    This paper explores techniques to apply model-based reasoning to equipment and systems which exhibit dynamic behavior (that which changes as a function of time). The model-based system of interest is KATE-C (Knowledge-based Autonomous Test Engineer), a C++ based system designed to perform monitoring and diagnosis of Space Shuttle electro-mechanical systems. Methods of model-based monitoring and diagnosis are well known and have been thoroughly explored by others. A short example is given which illustrates the principle of model-based reasoning and reveals some limitations of static, non-time-dependent simulation. This example is then extended to demonstrate the representation of time-dependent behavior and the testing of fault hypotheses in that environment.

  15. Protocol vulnerability detection based on network traffic analysis and binary reverse engineering.

    PubMed

    Wen, Shameng; Meng, Qingkun; Feng, Chao; Tang, Chaojing

    2017-01-01

    Network protocol vulnerability detection plays an important role in many domains, including protocol security analysis, application security, and network intrusion detection. In this study, by analyzing the general fuzzing method of network protocols, we propose a novel approach that combines network traffic analysis with the binary reverse engineering method. For network traffic analysis, the block-based protocol description language is introduced to construct test scripts, while the binary reverse engineering method employs the genetic algorithm with a fitness function designed to focus on code coverage. This combination leads to a substantial improvement in fuzz testing for network protocols. We build a prototype system and use it to test several real-world network protocol implementations. The experimental results show that the proposed approach detects vulnerabilities more efficiently and effectively than general fuzzing methods such as SPIKE.
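    A toy sketch of the coverage-driven genetic loop described above: run_target() is a hypothetical stand-in for executing the instrumented protocol implementation and reporting the covered basic blocks, and the fitness is simply the coverage count.

```python
import random

def run_target(data: bytes) -> set:
    """Placeholder: a real system would execute the instrumented binary on
    `data` and report code coverage; this toy just hashes bytes to blocks."""
    return {b % 64 for b in data}

def mutate(data: bytes) -> bytes:
    buf = bytearray(data)
    for _ in range(random.randint(1, 4)):
        buf[random.randrange(len(buf))] = random.randrange(256)
    return bytes(buf)

def evolve(seed: bytes, pop_size=20, gens=50):
    pop = [mutate(seed) for _ in range(pop_size)]
    for _ in range(gens):
        scored = sorted(pop, key=lambda d: len(run_target(d)), reverse=True)
        parents = scored[:pop_size // 2]          # fitness = blocks covered
        pop = parents + [mutate(random.choice(parents))
                         for _ in range(pop_size - len(parents))]
    return max(pop, key=lambda d: len(run_target(d)))

best = evolve(b"GET / HTTP/1.0\r\n\r\n")
print("best input covers", len(run_target(best)), "blocks")
```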

  16. Abstract: Inference and Interval Estimation for Indirect Effects With Latent Variable Models.

    PubMed

    Falk, Carl F; Biesanz, Jeremy C

    2011-11-30

    Models specifying indirect effects (or mediation) and structural equation modeling are both popular in the social sciences. Yet relatively little research has compared methods that test for indirect effects among latent variables and provided precise estimates of the effectiveness of different methods. This simulation study provides an extensive comparison of methods for constructing confidence intervals and for making inferences about indirect effects with latent variables. We compared the percentile (PC) bootstrap, bias-corrected (BC) bootstrap, bias-corrected accelerated (BCa) bootstrap, likelihood-based confidence intervals (Neale & Miller, 1997), partial posterior predictive (Biesanz, Falk, and Savalei, 2010), and joint significance tests based on Wald tests or likelihood ratio tests. All models included three reflective latent variables representing the independent, dependent, and mediating variables. The design included the following fully crossed conditions: (a) sample size: 100, 200, and 500; (b) number of indicators per latent variable: 3 versus 5; (c) reliability per set of indicators: .7 versus .9; (d) and 16 different path combinations for the indirect effect (α = 0, .14, .39, or .59; and β = 0, .14, .39, or .59). Simulations were performed using a WestGrid cluster of 1680 3.06 GHz Intel Xeon processors running R and OpenMx. Results based on 1,000 replications per cell and 2,000 resamples per bootstrap method indicated that the BC and BCa bootstrap methods have inflated Type I error rates. Likelihood-based confidence intervals and the PC bootstrap emerged as methods that adequately control Type I error and have good coverage rates.
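    For readers unfamiliar with the percentile bootstrap for an indirect effect, here is a compact observed-variable analogue; the study itself uses latent variable models, and the paths and data below are simulated purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n, alpha, beta = 200, 0.39, 0.39
X = rng.normal(size=n)
M = alpha * X + rng.normal(size=n)            # mediator
Y = beta * M + rng.normal(size=n)             # outcome

def indirect(X, M, Y):
    a = np.polyfit(X, M, 1)[0]                      # path alpha: M ~ X
    D = np.column_stack([M, X, np.ones_like(X)])
    b = np.linalg.lstsq(D, Y, rcond=None)[0][0]     # path beta: Y ~ M + X
    return a * b                                    # indirect effect a*b

boots = []
for _ in range(2000):                               # 2000 resamples, as in the study
    i = rng.integers(0, n, n)                       # resample cases with replacement
    boots.append(indirect(X[i], M[i], Y[i]))
lo, hi = np.percentile(boots, [2.5, 97.5])
print("indirect = %.3f, 95%% PC CI [%.3f, %.3f]" % (indirect(X, M, Y), lo, hi))
```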

  17. Laboratory-based versus non-laboratory-based method for assessment of cardiovascular disease risk: the NHANES I Follow-up Study cohort

    PubMed Central

    Gaziano, Thomas A; Young, Cynthia R; Fitzmaurice, Garrett; Atwood, Sidney; Gaziano, J Michael

    2008-01-01

    Summary Background Around 80% of all cardiovascular deaths occur in developing countries. Assessment of those patients at high risk is an important strategy for prevention. Since developing countries have limited resources for prevention strategies that require laboratory testing, we assessed whether a risk prediction method that did not require any laboratory tests could be as accurate as one requiring laboratory information. Methods The National Health and Nutrition Examination Survey (NHANES) was a prospective cohort study of 14,407 US participants aged 25–74 years at the time they were first examined (between 1971 and 1975). Our follow-up study population included participants with complete information on these surveys who did not report a history of cardiovascular disease (myocardial infarction, heart failure, stroke, angina) or cancer, yielding an analysis dataset of N=6186. We compared how well either method could predict first-time fatal and non-fatal cardiovascular disease events in this cohort. For the laboratory-based model, which required blood testing, we used standard risk factors to assess risk of cardiovascular disease: age, systolic blood pressure, smoking status, total cholesterol, reported diabetes status, and current treatment for hypertension. For the non-laboratory-based model, we substituted body-mass index for cholesterol. Findings In the cohort of 6186, there were 1529 first-time cardiovascular events and 578 (38%) deaths due to cardiovascular disease over 21 years. In women, the laboratory-based model was useful for predicting events, with a c statistic of 0.829. The c statistic of the non-laboratory-based model was 0.831. In men, the results were similar (0.784 for the laboratory-based model and 0.783 for the non-laboratory-based model). Results were similar between the laboratory-based and non-laboratory-based models in both men and women when restricted to fatal events only. Interpretation A method that uses non-laboratory-based risk factors predicted cardiovascular events as accurately as one that relied on laboratory-based values. This approach could simplify risk assessment in situations where laboratory testing is inconvenient or unavailable. PMID:18342687
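    The comparison hinges on the c statistic (area under the ROC curve) of the two models. The sketch below reproduces the idea on synthetic data, swapping BMI in for cholesterol; it is not the NHANES analysis, and all coefficients and distributions are invented.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 4000
age = rng.uniform(25, 74, n)
sbp = rng.normal(130, 20, n)
smoker = rng.integers(0, 2, n).astype(float)
bmi = rng.normal(27.0, 5.0, n)
chol = 150 + 2.5 * bmi + rng.normal(0, 15, n)   # cholesterol tracks BMI here

# Events generated from a toy logistic model driven by the lab risk factors.
lin = -10 + 0.08 * age + 0.02 * sbp + 0.6 * smoker + 0.004 * chol
event = rng.random(n) < 1 / (1 + np.exp(-lin))

lab = np.column_stack([age, sbp, smoker, chol])     # laboratory-based model
nonlab = np.column_stack([age, sbp, smoker, bmi])   # BMI substituted for chol

for name, Z in (("lab", lab), ("non-lab", nonlab)):
    p = LogisticRegression(max_iter=1000).fit(Z, event).predict_proba(Z)[:, 1]
    print("%s c statistic: %.3f" % (name, roc_auc_score(event, p)))
```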

  18. Glossary of reference terms for alternative test methods and their validation.

    PubMed

    Ferrario, Daniele; Brustio, Roberta; Hartung, Thomas

    2014-01-01

    This glossary was developed to provide technical references to support work in the field of alternatives to animal testing. It was compiled from various existing reference documents coming from different sources and is meant to be a point of reference on alternatives to animal testing. Given the ever-increasing number of alternative test methods and approaches developed over the last decades, a combination, revision, and harmonization of earlier published collections of terms used in the validation of such methods is required. The need to update previous glossary efforts came from the acknowledgement that new words have emerged with the development of new approaches, while others have become obsolete, and the meaning of some terms has partially changed over time. With this glossary we intend to provide guidance on issues related to the validation of new or updated testing methods consistent with current approaches. Moreover, because of new developments and technologies, a glossary needs to be a living, constantly updated document. An Internet-based version of this compilation may be found at http://altweb.jhsph.edu/, allowing the addition of new material.

  19. Statistical Tests of System Linearity Based on the Method of Surrogate Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hunter, N.; Paez, T.; Red-Horse, J.

    When dealing with measured data from dynamic systems we often make the tacit assumption that the data are generated by linear dynamics. While some systematic tests for linearity and determinism are available - for example the coherence function, the probability density function, and the bispectrum - further tests that quantify the existence and the degree of nonlinearity are clearly needed. In this paper we demonstrate a statistical test for the nonlinearity exhibited by a dynamic system excited by Gaussian random noise. We perform the usual division of the input and response time series data into blocks as required by the Welch method of spectrum estimation and search for significant relationships between a given input frequency and response at harmonics of the selected input frequency. We argue that systematic tests based on the recently developed statistical method of surrogate data readily detect significant nonlinear relationships. The paper elucidates the method of surrogate data. Typical results are illustrated for a linear single degree-of-freedom system and for a system with polynomial stiffness nonlinearity.
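    The core of the surrogate-data method is generating surrogates that preserve the linear (spectral) structure of the data while destroying any nonlinear structure. A standard phase-randomization sketch follows; it is a generic formulation, not the authors' code.

```python
import numpy as np

def phase_surrogate(x, rng):
    """Fourier surrogate: keep the amplitude spectrum (linear correlation
    structure) but randomize the phases, destroying nonlinear structure."""
    X = np.fft.rfft(x)
    phases = rng.uniform(0.0, 2.0 * np.pi, X.size)
    phases[0] = 0.0                          # keep the mean intact
    return np.fft.irfft(np.abs(X) * np.exp(1j * phases), n=x.size)

rng = np.random.default_rng(0)
t = np.linspace(0.0, 200.0, 4096)
x = np.sin(t) + 0.3 * np.sin(t) ** 2 + 0.1 * rng.normal(size=t.size)  # weakly nonlinear

skew = lambda s: np.mean((s - s.mean()) ** 3) / s.std() ** 3  # ~0 for linear Gaussian
surr = [skew(phase_surrogate(x, rng)) for _ in range(99)]
print("data: %.3f, surrogate range: [%.3f, %.3f]" % (skew(x), min(surr), max(surr)))
# A data statistic falling outside the surrogate range rejects linearity.
```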

  20. Guiding principles for the implementation of non-animal safety assessment approaches for cosmetics: skin sensitisation.

    PubMed

    Goebel, Carsten; Aeby, Pierre; Ade, Nadège; Alépée, Nathalie; Aptula, Aynur; Araki, Daisuke; Dufour, Eric; Gilmour, Nicola; Hibatallah, Jalila; Keller, Detlef; Kern, Petra; Kirst, Annette; Marrec-Fairley, Monique; Maxwell, Gavin; Rowland, Joanna; Safford, Bob; Schellauf, Florian; Schepky, Andreas; Seaman, Chris; Teichert, Thomas; Tessier, Nicolas; Teissier, Silvia; Weltzien, Hans Ulrich; Winkler, Petra; Scheel, Julia

    2012-06-01

    Characterisation of skin sensitisation potential is a key endpoint for the safety assessment of cosmetic ingredients, especially when significant dermal exposure to an ingredient is expected. At present, the mouse local lymph node assay (LLNA) remains the 'gold standard' test method for this purpose; however, non-animal test methods are under development that aim to replace the need for new animal test data. COLIPA (the European Cosmetics Association) funds an extensive programme of skin sensitisation research, method development and method evaluation and helped coordinate the early evaluation of the three test methods currently undergoing pre-validation. In May 2010, a COLIPA scientific meeting was held to analyse to what extent skin sensitisation safety assessments for cosmetic ingredients can be made in the absence of animal data. In order to propose guiding principles for the application and further development of non-animal safety assessment strategies, it was evaluated how and when non-animal test methods, predictions based on physico-chemical properties (including in silico tools), threshold concepts and weight-of-evidence based hazard characterisation could be used to enable safety decisions. Generation and assessment of potency information from alternative tools, which at present is predominantly derived from the LLNA, is considered the key future research area. Copyright © 2012 Elsevier Inc. All rights reserved.

  1. "RCL-Pooling Assay": A Simplified Method for the Detection of Replication-Competent Lentiviruses in Vector Batches Using Sequential Pooling.

    PubMed

    Corre, Guillaume; Dessainte, Michel; Marteau, Jean-Brice; Dalle, Bruno; Fenard, David; Galy, Anne

    2016-02-01

    Nonreplicative recombinant HIV-1-derived lentiviral vectors (LV) are increasingly used in gene therapy of various genetic diseases, infectious diseases, and cancer. Before they are used in humans, preparations of LV must undergo extensive quality control testing. In particular, testing of LV must demonstrate the absence of replication-competent lentiviruses (RCL) with suitable methods, on representative fractions of vector batches. Current methods based on cell culture are challenging because high titers of vector batches translate into high volumes of cell culture to be tested in RCL assays. As vector batch sizes and titers are continuously increasing because of improvements in production and purification methods, it became necessary for us to modify the current RCL assay based on the detection of p24 in cultures of indicator cells. Here, we propose a practical optimization of this method using a pairwise pooling strategy enabling easier testing of higher vector inoculum volumes. These modifications significantly decrease material handling and operator time, leading to a cost-effective method, while maintaining the optimal sensitivity of the RCL testing. This optimized "RCL-pooling assay" improves the feasibility of the quality control of large-scale batches of clinical-grade LV while maintaining the same sensitivity.
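    As a hedged illustration of how pooling shrinks the testing burden, the sketch below uses a row/column matrix pooling layout (a stand-in for the paper's sequential pairwise pooling, not the published protocol): each fraction feeds one row pool and one column pool, so a positive fraction is localized by the intersection of the positive pools and confirmed in a second round.

```python
import numpy as np

# 25 culture fractions arranged on a 5x5 grid -> 10 first-round p24 tests.
grid = np.zeros((5, 5), dtype=bool)
grid[2, 3] = True                      # one hypothetical RCL-positive fraction

row_pools = grid.any(axis=1)           # 5 row pools
col_pools = grid.any(axis=0)           # 5 column pools
candidates = np.argwhere(np.outer(row_pools, col_pools))
print("fractions to confirm individually:", candidates.tolist())
```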

  2. Image quality evaluation of full reference algorithm

    NASA Astrophysics Data System (ADS)

    He, Nannan; Xie, Kai; Li, Tong; Ye, Yushan

    2018-03-01

    Image quality evaluation is a classic research topic; the goal is to design algorithms whose evaluation values are consistent with subjective human perception. This paper mainly introduces several typical full-reference objective evaluation methods: Mean Squared Error (MSE), Peak Signal to Noise Ratio (PSNR), Structural Similarity Image Metric (SSIM) and Feature Similarity (FSIM). The different evaluation methods are tested in Matlab, and their advantages and disadvantages are obtained by analysis and comparison. MSE and PSNR are simple, but they do not incorporate human visual system (HVS) characteristics into image quality evaluation, so their evaluation results are not ideal. SSIM correlates well with perception and is simple to compute because it brings the human visual response into image quality evaluation; however, the SSIM method rests on a modeling hypothesis, so its evaluation results are limited. The FSIM method can be used to test both grayscale and color images, and gives better results. Experimental results show that the new image quality evaluation algorithm based on FSIM is more accurate.
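    The two simplest metrics discussed above reduce to a few lines; here is a minimal NumPy version of MSE and PSNR (the paper's experiments are in Matlab, so this is just an equivalent sketch).

```python
import numpy as np

def mse(ref, img):
    """Mean squared error between a reference and a test image."""
    return np.mean((ref.astype(float) - img.astype(float)) ** 2)

def psnr(ref, img, peak=255.0):
    """Peak signal-to-noise ratio in dB for images with the given peak value."""
    m = mse(ref, img)
    return float("inf") if m == 0 else 10.0 * np.log10(peak ** 2 / m)

rng = np.random.default_rng(0)
ref = rng.integers(0, 256, (64, 64)).astype(np.uint8)
noisy = np.clip(ref + rng.normal(0, 5, ref.shape), 0, 255).astype(np.uint8)
print("MSE: %.2f  PSNR: %.2f dB" % (mse(ref, noisy), psnr(ref, noisy)))
```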

  3. Identification of significant features by the Global Mean Rank test.

    PubMed

    Klammer, Martin; Dybowski, J Nikolaj; Hoffmann, Daniel; Schaab, Christoph

    2014-01-01

    With the introduction of omics-technologies such as transcriptomics and proteomics, numerous methods for the reliable identification of significantly regulated features (genes, proteins, etc.) have been developed. Experimental practice requires these tests to successfully deal with conditions such as small numbers of replicates, missing values, non-normally distributed expression levels, and non-identical distributions of features. With the MeanRank test we aimed at developing a test that performs robustly under these conditions, while favorably scaling with the number of replicates. The test proposed here is a global one-sample location test, which is based on the mean ranks across replicates, and internally estimates and controls the false discovery rate. Furthermore, missing data is accounted for without the need of imputation. In extensive simulations comparing MeanRank to other frequently used methods, we found that it performs well with small and large numbers of replicates, feature dependent variance between replicates, and variable regulation across features on simulation data and a recent two-color microarray spike-in dataset. The tests were then used to identify significant changes in the phosphoproteomes of cancer cells induced by the kinase inhibitors erlotinib and 3-MB-PP1 in two independently published mass spectrometry-based studies. MeanRank outperformed the other global rank-based methods applied in this study. Compared to the popular Significance Analysis of Microarrays and Linear Models for Microarray methods, MeanRank performed similar or better. Furthermore, MeanRank exhibits more consistent behavior regarding the degree of regulation and is robust against the choice of preprocessing methods. MeanRank does not require any imputation of missing values, is easy to understand, and yields results that are easy to interpret. The software implementing the algorithm is freely available for academic and commercial use.
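    A simplified sketch of a mean-rank location test in the spirit of the method (not the published implementation): features are ranked within each replicate, the ranks are averaged across replicates, and a normal approximation to the null mean-rank distribution yields p-values.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_features, n_reps = 1000, 4
logfc = rng.normal(0.0, 0.2, (n_features, n_reps))
logfc[:50] += 1.0                                  # 50 truly up-regulated features

# Rank features within each replicate (double argsort; fine for continuous data).
ranks = logfc.argsort(axis=0).argsort(axis=0) + 1
mean_rank = ranks.mean(axis=1)

# Null: the average of n_reps independent uniform ranks on 1..n_features.
mu = (n_features + 1) / 2
sigma = np.sqrt((n_features ** 2 - 1) / 12 / n_reps)
z = (mean_rank - mu) / sigma
p = 2 * stats.norm.sf(np.abs(z))                   # two-sided p-values
print("features called at p < 0.01:", int((p < 0.01).sum()))
```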

  4. Dental enamel defect diagnosis through different technology-based devices.

    PubMed

    Kobayashi, Tatiana Yuriko; Vitor, Luciana Lourenço Ribeiro; Carrara, Cleide Felício Carvalho; Silva, Thiago Cruvinel; Rios, Daniela; Machado, Maria Aparecida Andrade Moreira; Oliveira, Thais Marchini

    2018-06-01

    Dental enamel defects (DEDs) are faulty or deficient enamel formations of primary and permanent teeth. Changes during tooth development result in hypoplasia (a quantitative defect) and/or hypomineralisation (a qualitative defect). To compare technology-based diagnostic methods for detecting DEDs. Two hundred and nine dental surfaces of anterior permanent teeth were selected in patients, 6-11 years of age, with cleft lip with/without cleft palate. First, a conventional clinical examination was conducted according to the modified Developmental Defects of Enamel Index (DDE Index). Dental surfaces were evaluated using an operating microscope and a fluorescence-based device. Interexaminer reproducibility was determined using the kappa test. To compare groups, McNemar's test was used. Cramer's V test was used for comparing the distribution of index codes obtained after classification of all dental surfaces. Cramer's V test revealed statistically significant differences (P < .0001) in the distribution of index codes obtained using the different methods; the coefficients were 0.365 for conventional clinical examination versus fluorescence, 0.961 for conventional clinical examination versus operating microscope and 0.358 for operating microscope versus fluorescence. The sensitivity of the operating microscope and fluorescence method was statistically significant (P = .008 and P < .0001, respectively). Otherwise, the results did not show statistically significant differences in accuracy and specificity for either the operating microscope or the fluorescence methods. This study suggests that the operating microscope performed better than the fluorescence-based device and could be an auxiliary method for the detection of DEDs. © 2017 FDI World Dental Federation.

  5. A procedure for testing the quality of LANDSAT atmospheric correction algorithms

    NASA Technical Reports Server (NTRS)

    Dias, L. A. V. (Principal Investigator); Vijaykumar, N. L.; Neto, G. C.

    1982-01-01

    There are two basic methods for testing the quality of an algorithm to minimize atmospheric effects on LANDSAT imagery: (1) test the results a posteriori, using ground truth or control points; (2) use a method based on image data plus estimation of additional ground and/or atmospheric parameters. A procedure based on the second method is described. In order to select the parameters, the image contrast is first examined for a series of parameter combinations; the contrast improves for better corrections. In addition, the correlation coefficient between two subimages of the same scene, taken at different times, is used for parameter selection. The regions to be correlated should not have changed considerably over time. A few examples using this proposed procedure are presented.
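    A toy version of the contrast criterion under a simple atmospheric model with transmission and additive path radiance: among physically admissible haze values (no negative radiance), relative contrast grows as the additive term is removed, so the best admissible value is selected. The model and numbers are invented, and the paper additionally uses the inter-date correlation of unchanged subimages.

```python
import numpy as np

rng = np.random.default_rng(0)
scene = rng.gamma(2.0, 20.0, (128, 128))      # stand-in surface radiance
scene[0, 0] = 0.0                             # assume some dark object in scene
T, Lp = 0.8, 15.0                             # transmission, path radiance (truth)
image = T * scene + Lp                        # toy atmospheric model

# Relative contrast (std/mean) of the corrected image rises as more of the
# additive term is removed; admissible candidates keep all radiances >= 0.
candidates = np.linspace(0.0, image.min(), 50)
scores = [((image - h).std() / (image - h).mean(), h) for h in candidates]
best_contrast, h_best = max(scores)
print("estimated path radiance: %.2f (true %.1f)" % (h_best, Lp))
```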

  6. Vibration Testing of Electrical Cables to Quantify Loads at Tie-Down Locations

    NASA Technical Reports Server (NTRS)

    Dutson, Joseph D.

    2013-01-01

    The standard method for defining static equivalent structural load factors for components is based on Miles' equation. Unless test data are available, 5% critical damping is assumed for all components when calculating loads. Application of this method to electrical cable tie-down hardware often results in high loads that exceed the capability of typical tie-down options such as cable ties and P-clamps. Random vibration testing of electrical cables was used to better understand the factors that influence component loads: natural frequency, damping, and mass participation. An initial round of vibration testing successfully identified variables of interest, checked out the test fixture and instrumentation, and provided justification for removing some conservatism in the standard method. Additional testing is planned that will include a larger range of cable sizes, treating the most significant contributors to load as variables, to further refine loads at cable tie-down points. Completed testing has provided justification to reduce loads at cable tie-downs by 45%, with additional refinement based on measured cable natural frequencies.
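    For reference, the Miles' equation calculation behind the standard load factors is a one-liner; the natural frequency, amplification factor, and input ASD in the example are hypothetical.

```python
import math

def miles_grms(fn_hz, q, asd_g2_per_hz):
    """Miles' equation: RMS acceleration of a single-DOF system with natural
    frequency fn and amplification Q, driven by a flat ASD near fn."""
    return math.sqrt(math.pi / 2.0 * fn_hz * q * asd_g2_per_hz)

# Example: 80 Hz cable mode, Q = 10 (5% critical damping), 0.04 g^2/Hz input.
grms = miles_grms(80.0, 10.0, 0.04)
print("Grms = %.2f g, 3-sigma load factor = %.1f g" % (grms, 3.0 * grms))
```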

  7. A Strength-Based Approach to Teacher Professional Development

    ERIC Educational Resources Information Center

    Zwart, Rosanne C.; Korthagen, Fred A. J.; Attema-Noordewier, Saskia

    2015-01-01

    Based on positive psychology, self-determination theory and a perspective on teacher quality, this study proposes and examines a strength-based approach to teacher professional development. A mixed method pre-test/post-test design was adopted to study perceived outcomes of the approach for 93 teachers of six primary schools in the Netherlands and…

  8. PREDICTING THE EFFECTIVENESS OF CHEMICAL-PROTECTIVE CLOTHING MODEL AND TEST METHOD DEVELOPMENT

    EPA Science Inventory

    A predictive model and test method were developed for determining the chemical resistance of protective polymeric gloves exposed to liquid organic chemicals. The prediction of permeation through protective gloves by solvents was based on theories of the solution thermodynamics of...

  9. Development of ASTM Standard for SiC-SiC Joint Testing Final Scientific/Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jacobsen, George; Back, Christina

    2015-10-30

    As the nuclear industry moves to advanced ceramic based materials for cladding and core structural materials for a variety of advanced reactors, new standards and test methods are required for material development and licensing purposes. For example, General Atomics (GA) is actively developing silicon carbide (SiC) based composite cladding (SiC-SiC) for its Energy Multiplier Module (EM2), a high efficiency gas cooled fast reactor. Through DOE funding via the advanced reactor concept program, GA developed a new test method for the nominal joint strength of an endplug sealed to advanced ceramic tubes, Fig. 1-1, at ambient and elevated temperatures, called the endplug pushout (EPPO) test. This test utilizes widely available universal mechanical testers coupled with clam shell heaters, and the specimen size is relatively small, making it a viable post-irradiation test method. The culmination of this effort was a draft of an ASTM test standard that will be submitted for approval to the ASTM C28 ceramic committee. Once the standard has been vetted by the ceramics test community, an industry-wide standard methodology to test joined tubular ceramic components will be available for the entire nuclear materials community.

  10. Mechanical Testing of Hydrogels in Cartilage Tissue Engineering: Beyond the Compressive Modulus

    PubMed Central

    Xiao, Yinghua; Friis, Elizabeth A.; Gehrke, Stevin H.

    2013-01-01

    Injuries to articular cartilage result in significant pain to patients and high medical costs. Unfortunately, cartilage repair strategies have been notoriously unreliable and/or complex. Biomaterial-based tissue-engineering strategies offer great promise, including the use of hydrogels to regenerate articular cartilage. Mechanical integrity is arguably the most important functional outcome of engineered cartilage, although mechanical testing of hydrogel-based constructs to date has focused primarily on deformation rather than failure properties. In addition to deformation testing, as the field of cartilage tissue engineering matures, this community will benefit from the addition of mechanical failure testing to outcome analyses, given the crucial clinical importance of the success of engineered constructs. However, there is a tremendous disparity in the methods used to evaluate mechanical failure of hydrogels and articular cartilage. In an effort to bridge the gap in mechanical testing methods of articular cartilage and hydrogels in cartilage regeneration, this review classifies the different toughness measurements for each. The urgency for identifying the common ground between these two disparate fields is high, as mechanical failure is ready to stand alongside stiffness as a functional design requirement. In comparing toughness measurement methods between hydrogels and cartilage, we recommend that the best option for evaluating mechanical failure of hydrogel-based constructs for cartilage tissue engineering may be tensile testing based on the single edge notch test, in part because specimen preparation is more straightforward and a related American Society for Testing and Materials (ASTM) standard can be adopted in a fracture mechanics context. PMID:23448091

  11. Translating expert system rules into Ada code with validation and verification

    NASA Technical Reports Server (NTRS)

    Becker, Lee; Duckworth, R. James; Green, Peter; Michalson, Bill; Gosselin, Dave; Nainani, Krishan; Pease, Adam

    1991-01-01

    The purpose of this ongoing research and development program is to develop software tools which enable the rapid development, upgrading, and maintenance of embedded real-time artificial intelligence systems. The goals of this phase of the research were to investigate the feasibility of developing software tools which automatically translate expert system rules into Ada code and to develop methods for performing validation and verification testing of the resultant expert system. A prototype system was demonstrated which automatically translated rules from an Air Force expert system into Ada code, and prototype test tools detected errors in the execution of the resultant system. The method and prototype tools for converting AI representations into Ada code by converting the rules into Ada code modules and then linking them with an Activation Framework based run-time environment to form an executable load module are discussed. This method is based upon the use of Evidence Flow Graphs, which are a data flow representation for intelligent systems. The development of prototype test generation and evaluation software which was used to test the resultant code is discussed. This testing was performed automatically using Monte-Carlo techniques based upon a constraint-based description of the required performance for the system.

  12. Measurement correction method for force sensor used in dynamic pressure calibration based on artificial neural network optimized by genetic algorithm

    NASA Astrophysics Data System (ADS)

    Gu, Tingwei; Kong, Deren; Shang, Fei; Chen, Jing

    2017-12-01

    We present an optimization algorithm to obtain low-uncertainty dynamic pressure measurements from a force-transducer-based device. In this paper, the advantages and disadvantages of the methods that are commonly used to measure the propellant powder gas pressure, the applicable scope of dynamic pressure calibration devices, and the shortcomings of the traditional comparison calibration method based on the drop-weight device are first analysed in detail. Then, a dynamic calibration method for measuring pressure using a force sensor based on a drop-weight device is introduced. This method can effectively save time when many pressure sensors are calibrated simultaneously and extend the life of expensive reference sensors. However, the force sensor is installed between the drop-weight and the hammerhead by transition pieces through the connection mode of bolt fastening, which causes adverse effects such as additional pretightening and inertia forces. To address these effects, the influence mechanisms of the pretightening force, the inertia force and other factors on the force measurement are theoretically analysed. A measurement correction method for the force measurement is then proposed based on an artificial neural network optimized by a genetic algorithm. The training and testing data sets are obtained from calibration tests, and the selection criteria for the key parameters of the correction model are discussed. The evaluation results for the test data show that the correction model can effectively improve the force measurement accuracy of the force sensor. Compared with the traditional high-accuracy comparison calibration method, the percentage difference of the impact-force-based measurement is less than 0.6% and the relative uncertainty of the corrected force value is 1.95%, which meets the requirements of engineering applications.
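
    As a hedged illustration of the correction idea (not the authors' implementation), the sketch below evolves the weights of a one-hidden-layer network with a plain genetic algorithm so that corrected readings match a reference force; the synthetic data, network size, and GA settings are all assumptions.

```python
# Minimal GA-trained neural-network correction sketch; data and sizes assumed.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic calibration data: sensor readings distorted by an assumed mapping.
measured = rng.uniform(0.0, 10.0, size=(200, 1))                   # kN, raw
reference = measured[:, 0] * 0.95 + 0.3 * np.sin(measured[:, 0])   # kN, true

def predict(weights, x):
    """One-hidden-layer network: 1 input, 8 tanh units, 1 linear output."""
    w1, b1 = weights[:8].reshape(1, 8), weights[8:16]
    w2, b2 = weights[16:24].reshape(8, 1), weights[24]
    return (np.tanh(x @ w1 + b1) @ w2).ravel() + b2

def fitness(weights):
    return -np.mean((predict(weights, measured) - reference) ** 2)  # higher is better

# Plain generational GA: truncation selection, blend crossover, Gaussian mutation.
pop = rng.normal(0.0, 1.0, size=(60, 25))
for generation in range(150):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-30:]]
    pop = np.array([
        0.5 * (parents[i] + parents[j]) + rng.normal(0.0, 0.05, 25)
        for i, j in rng.integers(0, 30, size=(60, 2))
    ])

best = max(pop, key=fitness)
rmse = np.sqrt(np.mean((predict(best, measured) - reference) ** 2))
print(f"corrected-force RMSE: {rmse:.3f} kN")
```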

  13. Problems and Procedures in Planning a Situation Based Video Test on Teaching.

    ERIC Educational Resources Information Center

    Masonis, Edward J.

    This paper briefly outlines some problems one must solve when developing a video-based test to evaluate what a teacher knows about learning and instruction. Consideration is given to the effect the use of videotapes of actual classroom behavior has on test planning. Two methods of incorporating such situational material into the test…

  14. Validation of Alternative In Vitro Methods to Animal Testing: Concepts, Challenges, Processes and Tools.

    PubMed

    Griesinger, Claudius; Desprez, Bertrand; Coecke, Sandra; Casey, Warren; Zuang, Valérie

    This chapter explores the concepts, processes, tools and challenges relating to the validation of alternative methods for toxicity and safety testing. In general terms, validation is the process of assessing the appropriateness and usefulness of a tool for its intended purpose. Validation is routinely used in various contexts in science, technology, and the manufacturing and services sectors. It serves to assess the fitness-for-purpose of devices, systems and software, up to entire methodologies. In the area of toxicity testing, validation plays an indispensable role: "alternative approaches" are increasingly replacing animal models as predictive tools, and it needs to be demonstrated that these novel methods are fit for purpose. Alternative approaches include in vitro test methods and non-testing approaches such as predictive computer models, up to entire testing and assessment strategies composed of method suites, data sources and decision-aiding tools. Data generated with alternative approaches are ultimately used for decision-making on public health and the protection of the environment. It is therefore essential that the underlying methods and methodologies are thoroughly characterised, assessed and transparently documented through validation studies involving impartial actors. Importantly, validation serves as a filter to ensure that only test methods able to produce data that help to address legislative requirements (e.g. the EU's REACH legislation) are accepted as official testing tools and, owing to the globalisation of markets, recognised at the international level (e.g. through inclusion in OECD test guidelines). Since validation creates a credible and transparent evidence base on test methods, it provides a quality stamp, supporting companies developing and marketing alternative methods and creating considerable business opportunities. Validation of alternative methods is conducted through scientific studies assessing two key hypotheses: the reliability and the relevance of the test method for a given purpose. Relevance encapsulates the scientific basis of the test method, its capacity to predict adverse effects in the "target system" (i.e. human health or the environment) as well as its applicability for the intended purpose. In this chapter we focus on the validation of non-animal in vitro alternative testing methods and review the concepts, challenges, processes and tools fundamental to the validation of in vitro methods intended for hazard testing of chemicals. We explore major challenges and peculiarities of validation in this area. Based on the notion that validation per se is a scientific endeavour that needs to adhere to key scientific principles, namely objectivity and appropriate choice of methodology, we examine basic aspects of study design and management, and provide illustrations of statistical approaches to describe the predictive performance of validated test methods as well as their reliability.

  15. Evaluation of Gene-Based Family-Based Methods to Detect Novel Genes Associated With Familial Late Onset Alzheimer Disease

    PubMed Central

    Fernández, Maria V.; Budde, John; Del-Aguila, Jorge L.; Ibañez, Laura; Deming, Yuetiva; Harari, Oscar; Norton, Joanne; Morris, John C.; Goate, Alison M.; Cruchaga, Carlos

    2018-01-01

    Gene-based tests to study the combined effect of rare variants on a particular phenotype have been widely developed for case-control studies, but their evolution and adaptation for family-based studies, especially studies of complex incomplete families, has been slower. In this study, we have performed a practical examination of all the latest gene-based methods available for family-based study designs using both simulated and real datasets. We examined the performance of several collapsing, variance-component, and transmission disequilibrium tests across eight different software packages and 22 models utilizing a cohort of 285 families (N = 1,235) with late-onset Alzheimer disease (LOAD). After a thorough examination of each of these tests, we propose a methodological approach to identify, with high confidence, genes associated with the tested phenotype and we provide recommendations to select the best software and model for family-based gene-based analyses. Additionally, in our dataset, we identified PTK2B, a GWAS candidate gene for sporadic AD, along with six novel genes (CHRD, CLCN2, HDLBP, CPAMD8, NLRP9, and MAS1L) as candidate genes for familial LOAD. PMID:29670507

  17. Alternative methods of flexible base compaction acceptance.

    DOT National Transportation Integrated Search

    2012-05-01

    In the Texas Department of Transportation, flexible base construction is governed by a series of stockpile : and field tests. A series of concerns with these existing methods, along with some premature failures in the : field, led to this project inv...

  18. Second ventilatory threshold from heart-rate variability: valid when the upper body is involved?

    PubMed

    Mourot, Laurent; Fabre, Nicolas; Savoldelli, Aldo; Schena, Federico

    2014-07-01

    To determine the most accurate method based on spectral analysis of heart-rate variability (SA-HRV) during an incremental and continuous maximal test involving the upper body, the authors tested 4 different methods to obtain the heart rate (HR) at the second ventilatory threshold (VT2). Sixteen ski mountaineers (mean ± SD; age 25 ± 3 y, height 177 ± 8 cm, mass 69 ± 10 kg) performed a roller-ski test on a treadmill. Respiratory variables and HR were continuously recorded, and the 4 SA-HRV methods were compared with the gas-exchange method through Bland and Altman analyses. The best method was the one based on a time-varying spectral analysis with high frequency ranging from 0.15 Hz to a cutoff point relative to the individual's respiratory sinus arrhythmia. The HR values were significantly correlated (r² = .903), with a mean HR difference from the respiratory method of 0.1 ± 3.0 beats/min and narrow limits of agreement (around -6/+6 beats/min). The 3 other methods led to larger errors and poorer agreement (up to 5 beats/min and around -23/+20 beats/min). It is possible to accurately determine VT2 with an HR monitor during an incremental test involving the upper body if the appropriate HRV method is used.
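
    A minimal sketch of the SA-HRV ingredient involved (not the authors' algorithm): track high-frequency spectral power of an evenly resampled RR series in sliding windows. The synthetic RR series, window length and fixed HF band below are assumptions; the paper's best-performing method instead adapts the upper cutoff to the individual's respiratory sinus arrhythmia.

```python
# Sliding-window HF power of an RR series via Welch's method; inputs assumed.
import numpy as np
from scipy.signal import welch

fs = 4.0                                   # RR series resampled at 4 Hz (assumed)
t = np.arange(0, 600, 1 / fs)              # a 10-min incremental test
# Synthetic RR series: 0.3 Hz respiratory sinus arrhythmia whose amplitude
# fades as effort increases, on top of a falling RR trend.
rsa_amp = np.linspace(0.05, 0.01, t.size)
rr = 0.8 - 0.2 * t / t[-1] + rsa_amp * np.sin(2 * np.pi * 0.3 * t)

win = int(64 * fs)                         # 64-s window, slid in 5-s steps
hf_power = []
for start in range(0, t.size - win, int(5 * fs)):
    f, pxx = welch(rr[start:start + win], fs=fs, nperseg=win // 2)
    band = (f >= 0.15) & (f <= 0.40)       # fixed HF band (the paper adapts it)
    hf_power.append(np.sum(pxx[band]) * (f[1] - f[0]))

print(f"{len(hf_power)} windows; HF power falls from "
      f"{hf_power[0]:.2e} to {hf_power[-1]:.2e} s^2")
```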

  19. Ground Vibration Test Planning and Pre-Test Analysis for the X-33 Vehicle

    NASA Technical Reports Server (NTRS)

    Bedrossian, Herand; Tinker, Michael L.; Hidalgo, Homero

    2000-01-01

    This paper describes the results of the modal test planning and the pre-test analysis for the X-33 vehicle. The pre-test analysis included the selection of the target modes, selection of the sensor and shaker locations, and the development of an accurate Test Analysis Model (TAM). For target mode selection, four techniques were considered: one based on the Modal Cost technique, one based on the Balanced Singular Value technique, a technique known as the Root Sum Squared (RSS) method, and a Modal Kinetic Energy (MKE) approach. For selecting sensor locations, four techniques were also considered: one based on the Weighted Average Kinetic Energy (WAKE), one based on Guyan Reduction (GR), one emphasizing engineering judgment, and one based on an optimum sensor selection technique using a Genetic Algorithm (GA) search combined with a criterion based on Hankel Singular Values (HSVs). For selecting shaker locations, four techniques were also considered: one based on the Weighted Average Driving Point Residue (WADPR), one based on engineering judgment and accessibility considerations, a frequency response method, and an optimum shaker location selection based on a GA search combined with a criterion based on HSVs. To evaluate the effectiveness of the proposed sensor and shaker locations for exciting the target modes, extensive numerical simulations were performed. The Multivariate Mode Indicator Function (MMIF) was used to evaluate the effectiveness of each sensor and shaker set with respect to modal parameter identification. Several TAM reduction techniques were considered, including Guyan, IRS, Modal, and Hybrid. Based on pre-test cross-orthogonality checks using the various reduction techniques, a Hybrid TAM reduction technique was selected and used for all three vehicle fuel level configurations.

  20. Data pieces-based parameter identification for lithium-ion battery

    NASA Astrophysics Data System (ADS)

    Gao, Wei; Zou, Yuan; Sun, Fengchun; Hu, Xiaosong; Yu, Yang; Feng, Sen

    2016-10-01

    Battery characteristics vary with temperature and aging, so it is necessary to identify battery parameters periodically for electric vehicles to ensure reliable State-of-Charge (SoC) estimation, battery equalization and safe operation. Aiming at on-board applications, this paper proposes a data pieces-based parameter identification (DPPI) method to identify comprehensive battery parameters, including capacity, the OCV (open circuit voltage)-Ah relationship and the impedance-Ah relationship, simultaneously and based only on battery operation data. First a vehicle field test was conducted and battery operation data were recorded; the DPPI method is then elaborated on these vehicle test data, and the parameters of all 97 cells of the battery pack are identified and compared. To evaluate the adaptability of the proposed DPPI method, it is used to identify battery parameters at different aging levels and different temperatures based on battery aging experiment data. A concept of an "OCV-Ah aging database" is then proposed, based on which battery capacity can be identified even if the battery was never fully charged or discharged. Finally, to further examine the effectiveness of the identified battery parameters, they are used to perform SoC estimation for the test vehicle with an adaptive extended Kalman filter (AEKF). The result shows good accuracy and reliability.
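
    To make the identification idea concrete, here is a minimal sketch (our illustration, not the paper's DPPI algorithm) of estimating open-circuit voltage and internal resistance for one data piece by least squares on V = OCV - I*R; the synthetic operation data are assumptions.

```python
# Least-squares OCV/resistance identification for one data piece; data assumed.
import numpy as np

rng = np.random.default_rng(1)
current = rng.uniform(-20.0, 50.0, 500)        # A, discharge positive (assumed)
true_ocv, true_r = 3.7, 0.002                  # V and ohm, ground truth
voltage = true_ocv - current * true_r + rng.normal(0, 0.002, 500)  # measured V

# Linear least squares on V = [1, -I] @ [OCV, R]
A = np.column_stack([np.ones_like(current), -current])
(ocv_hat, r_hat), *_ = np.linalg.lstsq(A, voltage, rcond=None)
print(f"OCV ≈ {ocv_hat:.3f} V, R ≈ {r_hat * 1000:.2f} mΩ")
```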

  1. Method for Reduction of Silver Biocide Plating on Metal Surfaces

    NASA Technical Reports Server (NTRS)

    Steele, John; Nalette, Timothy; Beringer, Durwood

    2013-01-01

    Silver ions in aqueous solutions (0.05 to 1 ppm) are used for microbial control in water systems. The silver ions remain in solution when stored in plastic containers, but the concentration rapidly decreases to non-biocidal levels when stored in metal containers. The silver deposits onto the surface and is reduced to non-biocidal silver metal when it contacts less noble metal surfaces, including stainless steel, titanium, and nickel-based alloys. Five methods of treatment of contact metal surfaces to deter silver deposition and reduction are proposed: (1) High-temperature oxidation of the metal surface; (2) High-concentration silver solution pre-treatment; (3) Silver plating; (4) Teflon coat by vapor deposition (titanium only); and (5) A combination of methods (1) and (2), which proved to be the best method for the nickel-based alloy application. The mechanism associated with surface treatments (1), (2), and (5) is thought to be the development of a less active oxide layer that deters ionic silver deposition. Mechanism (3) is an attempt to develop an equilibrium ionic silver concentration via dissolution of metallic silver. Mechanism (4) provides a non-reactive barrier to deter ionic silver plating. Development testing has shown that ionic silver in aqueous solution was maintained at essentially the same level of addition (0.4 ppm) for up to 15 months with method (5) (a combination of methods (1) and (2)), before the test was discontinued for nickel-based alloys. Method (1) resulted in the maintenance of a biocidal level (approximately 0.05 ppm) for up to 10 months before that test was discontinued for nickel-based alloys. Methods (1) and (2) used separately were able to maintain ionic silver in aqueous solution at essentially the same level of addition (0.4 ppm) for up to 10 months before the test was discontinued for stainless steel alloys. Method (3) was only utilized for titanium alloys, and was successful at maintaining ionic silver in aqueous solution at essentially the same level of addition (0.4 ppm) for up to 10 months before the test was discontinued for simple flat geometries, but not for geometries that are difficult to Teflon coat.

  2. Resilient moduli of typical Missouri soils and unbound granular base materials

    DOT National Transportation Integrated Search

    2008-03-01

    The objective of this project is to accurately determine the resilient moduli for common Missouri subgrade soils and unbound granular base materials in accordance with the AASHTO T 307 test method. The test results included moduli data from 27 common...

  3. Pulse retrieval algorithm for interferometric frequency-resolved optical gating based on differential evolution.

    PubMed

    Hyyti, Janne; Escoto, Esmerando; Steinmeyer, Günter

    2017-10-01

    A novel algorithm for the ultrashort laser pulse characterization method of interferometric frequency-resolved optical gating (iFROG) is presented. Based on a genetic method, namely, differential evolution, the algorithm can exploit all available information of an iFROG measurement to retrieve the complex electric field of a pulse. The retrieval is subjected to a series of numerical tests to prove the robustness of the algorithm against experimental artifacts and noise. These tests show that the integrated error-correction mechanisms of the iFROG method can be successfully used to remove the effect from timing errors and spectrally varying efficiency in the detection. Moreover, the accuracy and noise resilience of the new algorithm are shown to outperform retrieval based on the generalized projections algorithm, which is widely used as the standard method in FROG retrieval. The differential evolution algorithm is further validated with experimental data, measured with unamplified three-cycle pulses from a mode-locked Ti:sapphire laser. Additionally introducing group delay dispersion in the beam path, the retrieval results show excellent agreement with independent measurements with a commercial pulse measurement device based on spectral phase interferometry for direct electric-field retrieval. Further experimental tests with strongly attenuated pulses indicate resilience of differential-evolution-based retrieval against massive measurement noise.
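
    The retrieval idea can be sketched with SciPy's differential_evolution: choose pulse parameters that minimize the mismatch between a measured trace and the trace computed from trial parameters. The Gaussian pulse model, the one-dimensional stand-in "trace", and the two-parameter search space below are our assumptions, not the authors' iFROG formulation.

```python
# Differential-evolution parameter retrieval sketch; pulse model assumed.
import numpy as np
from scipy.optimize import differential_evolution

t = np.linspace(-100, 100, 512)                 # time grid (fs)

def field(params):
    width, chirp = params                       # Gaussian pulse with linear chirp
    return np.exp(-(t / width) ** 2 + 1j * chirp * t ** 2)

def trace(params):
    # Stand-in 1-D "measurement": time-domain intensity plus spectral intensity
    # (a real iFROG trace is a 2-D delay-frequency map).
    e = field(params)
    return np.concatenate([np.abs(e) ** 2, np.abs(np.fft.fft(e)) ** 2])

measured = trace([20.0, 3e-4])                  # synthetic ground truth

def error(params):
    return np.sum((trace(params) - measured) ** 2)

result = differential_evolution(error, bounds=[(5, 50), (0, 1e-3)], seed=0)
print(result.x)    # should recover width ≈ 20 fs and chirp ≈ 3e-4
```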

  4. Spread Spectrum Receiver Electromagnetic Interference (EMI) Test Guide

    NASA Technical Reports Server (NTRS)

    Wheeler, M. L.

    1998-01-01

    The objective of this test guide is to document appropriate unit level test methods and techniques for the performance of EMI testing of Direct Sequence (DS) spread spectrum receivers. Consideration of EMI test methods tailored for spread spectrum receivers utilizing frequency spreading techniques other than direct sequence (such as frequency hopping, frequency chirping, and various hybrid methods) is beyond the scope of this test guide development program and is not addressed as part of this document. EMI test requirements for NASA programs are primarily developed based on the requirements contained in MIL-STD-461D (or earlier revisions of MIL-STD-461). The corresponding test method guidelines for the MIL-STD-461D tests are provided in MIL-STD-462D. These test methods are well documented with the exception of the receiver antenna port susceptibility tests (intermodulation, cross modulation, and rejection of undesired signals), which must be tailored to the specific type of receiver that is being tested. Thus, test methods addressed in this guide consist only of antenna port tests designed to evaluate receiver susceptibility characteristics. MIL-STD-462D should be referred to for guidance pertaining to test methods for EMI tests other than the antenna port tests. The scope of this test guide includes: (1) a discussion of generic DS receiver performance characteristics; (2) a summary of S-band TDRSS receiver operation; (3) a discussion of DS receiver EMI susceptibility mechanisms and characteristics; (4) a summary of military standard test guidelines; (5) recommended test approach and methods; and (6) general conclusions and recommendations for future studies in the area of spread spectrum receiver testing.

  5. Calculus domains modelled using an original bool algebra based on polygons

    NASA Astrophysics Data System (ADS)

    Oanta, E.; Panait, C.; Raicu, A.; Barhalescu, M.; Axinte, T.

    2016-08-01

    Analytical and numerical computer based models require analytical definitions of the calculus domains. The paper presents a method to model a calculus domain based on a Boolean algebra which uses solid and hollow polygons. The general calculus relations of the geometrical characteristics that are widely used in mechanical engineering are tested using several shapes of the calculus domain in order to draw conclusions regarding the most effective methods to discretize the domain. The paper also tests the results of several commercial CAD software applications which are able to compute the geometrical characteristics, and interesting conclusions are drawn. The tests also targeted the accuracy of the results vs. the number of nodes on the curved boundary of the cross section. The study required the development of an original software package consisting of more than 1700 lines of computer code. In comparison with other calculus methods, discretization using convex polygons is a simpler approach. Moreover, this method does not lead to the very large numbers produced by the spline approximation, which required special software packages offering multiple, arbitrary precision. The knowledge resulting from this study may be used to develop complex computer based models in engineering.

  6. A sampling and classification item selection approach with content balancing.

    PubMed

    Chen, Pei-Hua

    2015-03-01

    Existing automated test assembly methods typically employ constrained combinatorial optimization. Constructing forms sequentially based on an optimization approach usually results in unparallel forms and requires heuristic modifications. Methods based on a random search approach have the major advantage of producing parallel forms sequentially without further adjustment. This study incorporated a flexible content-balancing element into the statistical perspective item selection method of the cell-only method (Chen et al. in Educational and Psychological Measurement, 72(6), 933-953, 2012). The new method was compared with a sequential interitem distance weighted deviation model (IID WDM) (Swanson & Stocking in Applied Psychological Measurement, 17(2), 151-166, 1993), a simultaneous IID WDM, and a big-shadow-test mixed integer programming (BST MIP) method to construct multiple parallel forms based on matching a reference form item-by-item. The results showed that the cell-only method with content balancing and the sequential and simultaneous versions of IID WDM yielded results comparable to those obtained using the BST MIP method. The cell-only method with content balancing is computationally less intensive than the sequential and simultaneous versions of IID WDM.

  7. USE OF BACTEROIDES PCR-BASED METHODS TO EXAMINE FECAL CONTAMINATION SOURCES IN TROPICAL COASTAL WATERS

    EPA Science Inventory

    Several library independent Microbial Source Tracking methods have been developed to rapidly determine the source of fecal contamination. Thus far, none of these methods have been tested in tropical marine waters. In this study, we used a Bacteroides 16S rDNA PCR-based...

  8. Strengthening Theoretical Testing in Criminology Using Agent-based Modeling.

    PubMed

    Johnson, Shane D; Groff, Elizabeth R

    2014-07-01

    The Journal of Research in Crime and Delinquency (JRCD) has published important contributions to both criminological theory and associated empirical tests. In this article, we consider some of the challenges associated with traditional approaches to social science research, and discuss a complementary approach that is gaining popularity, agent-based computational modeling, which may offer new opportunities to strengthen theories of crime and develop insights into phenomena of interest. Two literature reviews are completed. The aim of the first is to identify those articles published in JRCD that have been the most influential and to classify the theoretical perspectives taken. The second is intended to identify those studies that have used an agent-based model (ABM) to examine criminological theories and to identify which theories have been explored. Ecological theories of crime pattern formation have received the most attention from researchers using ABMs, but many other criminological theories are amenable to testing using such methods. Traditional methods of theory development and testing suffer from a number of potential issues that a more systematic use of ABMs, not without its own issues, may help to overcome. ABMs should become another method in the criminologist's toolbox to aid theory testing and falsification.
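
    For readers unfamiliar with ABMs, a toy model in the spirit of routine activity theory (our illustration, not taken from the article) shows how little machinery is needed: a crime is counted whenever a motivated offender shares a grid cell with a target while no guardian is present.

```python
# Toy routine-activity ABM on a grid; all parameters are invented.
import numpy as np

rng = np.random.default_rng(7)
size, steps = 20, 500
offenders = rng.integers(0, size, (10, 2))     # mobile motivated offenders
targets = rng.integers(0, size, (50, 2))       # stationary suitable targets
guardians = rng.integers(0, size, (15, 2))     # mobile capable guardians

def walk(agents):
    """One random-walk step on a wrapped grid."""
    return (agents + rng.integers(-1, 2, agents.shape)) % size

crimes = 0
for _ in range(steps):
    offenders, guardians = walk(offenders), walk(guardians)
    for o in offenders:
        on_target = (targets == o).all(axis=1).any()
        guarded = (guardians == o).all(axis=1).any()
        crimes += bool(on_target and not guarded)
print("simulated crimes:", crimes)
```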

  9. Quantitative PCR detection of Batrachochytrium dendrobatidis DNA from sediments and water

    USGS Publications Warehouse

    Kirshtein, Julie D.; Anderson, Chauncey W.; Wood, J.S.; Longcore, Joyce E.; Voytek, Mary A.

    2007-01-01

    The fungal pathogen Batrachochytrium dendrobatidis (Bd) causes chytridiomycosis, a disease implicated in amphibian declines on 5 continents. Polymerase chain reaction (PCR) primer sets exist with which amphibians can be tested for this disease, and advances in sampling techniques allow non-invasive testing of animals. We developed filtering and PCR-based quantitative methods, by modifying existing PCR assays, to detect Bd DNA in water and sediments without the need for testing amphibians; we tested the methods at 4 field sites. The SYBR-based assay using Boyle primers (SYBR/Boyle assay) and the Taqman-based assay using Wood primers performed similarly with samples generated in the laboratory (Bd-spiked filters), but the SYBR/Boyle assay detected Bd DNA in more field samples. We detected Bd DNA in water from 3 of 4 sites tested, including one pond historically negative for chytridiomycosis. Zoospore equivalents in sampled water ranged from 19 to 454 l⁻¹ (nominal detection limit is 10 DNA copies, or about 0.06 zoospore). We did not detect DNA of Bd from sediments collected at any sites. Our filtering and amplification methods provide a new tool to investigate critical aspects of Bd in the environment. © Inter-Research 2007.
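
    The quantification step rests on standard qPCR standard-curve arithmetic; the sketch below (dilution-series values and sampling factors invented, not the study's data) fits a curve and converts a sample Ct to zoospore equivalents per litre.

```python
# qPCR standard-curve quantification sketch; all numbers are assumed.
import numpy as np

# Standard curve: Ct versus log10(zoospore equivalents per reaction)
log_qty = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
ct = np.array([36.1, 32.8, 29.4, 26.0, 22.7])

slope, intercept = np.polyfit(log_qty, ct, 1)   # Ct = slope*log10(q) + intercept
print(f"amplification efficiency ≈ {10 ** (-1 / slope) - 1:.1%}")

sample_ct = 30.2
per_reaction = 10 ** ((sample_ct - intercept) / slope)   # zoospore equivalents
litres_filtered, fraction_assayed = 0.5, 0.02            # assumed sampling factors
print(per_reaction / fraction_assayed / litres_filtered,
      "zoospore equivalents per litre")
```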

  10. Health condition identification of multi-stage planetary gearboxes using a mRVM-based method

    NASA Astrophysics Data System (ADS)

    Lei, Yaguo; Liu, Zongyao; Wu, Xionghui; Li, Naipeng; Chen, Wu; Lin, Jing

    2015-08-01

    Multi-stage planetary gearboxes are widely applied in the aerospace, automotive and heavy industries. Their key components, such as gears and bearings, can easily suffer damage due to the tough working environment. Health condition identification of planetary gearboxes aims to prevent accidents and save costs. This paper proposes a method based on a multiclass relevance vector machine (mRVM) to identify the health condition of multi-stage planetary gearboxes. In this method, an mRVM algorithm is adopted as a classifier, and two features, i.e. accumulative amplitudes of carrier orders (AACO) and energy ratio based on difference spectra (ERDS), are used as the input of the classifier to classify different health conditions of multi-stage planetary gearboxes. To test the proposed method, seven health conditions of a two-stage planetary gearbox are considered and vibration data were acquired from the planetary gearbox under different motor speeds and loading conditions. The results of three tests based on different data show that the proposed method obtains improved identification performance and robustness compared with the existing method.

  11. Use of focused ultrasonication in activity-based profiling of deubiquitinating enzymes in tissue.

    PubMed

    Nanduri, Bindu; Shack, Leslie A; Rai, Aswathy N; Epperson, William B; Baumgartner, Wes; Schmidt, Ty B; Edelmann, Mariola J

    2016-12-15

    To develop a reproducible tissue lysis method that retains enzyme function for activity-based protein profiling, we compared four different methods to obtain protein extracts from bovine lung tissue: focused ultrasonication, standard sonication, a mortar & pestle method, and homogenization combined with standard sonication. Focused ultrasonication and mortar & pestle methods were sufficiently effective for activity-based profiling of deubiquitinases in tissue, and focused ultrasonication also had the fastest processing time. We used the focused ultrasonicator for subsequent activity-based proteomic analysis of deubiquitinases to test the compatibility of this method in sample preparation for activity-based chemical proteomics. Copyright © 2016 Elsevier Inc. All rights reserved.

  12. Computer game-based and traditional learning method: a comparison regarding students’ knowledge retention

    PubMed Central

    2013-01-01

    Background Educational computer games are examples of computer-assisted learning objects, representing an educational strategy of growing interest. Given the changes in the digital world over the last decades, students of the current generation expect technology to be used in advancing their learning, requiring a change from traditional passive learning methodologies to an active multisensory experimental learning methodology. The objective of this study was to compare a computer game-based learning method with a traditional learning method, regarding learning gains and knowledge retention, as a means of teaching head and neck Anatomy and Physiology to Speech-Language and Hearing pathology undergraduate students. Methods Students were randomized to one of the learning methods and the data analyst was blinded to which method of learning the students had received. Students' prior knowledge (i.e. before undergoing the learning method), short-term knowledge retention and long-term knowledge retention (i.e. six months after undergoing the learning method) were assessed with a multiple choice questionnaire. Students' performance was compared across the three moments of assessment, both for the mean total score and for separate mean scores for Anatomy questions and Physiology questions. Results Students that received the game-based method performed better in the post-test assessment only when considering the Anatomy questions section. Students that received the traditional lecture performed better in both post-test and long-term post-test when considering the Anatomy and Physiology questions. Conclusions The game-based learning method is comparable to the traditional learning method in general and in short-term gains, while the traditional lecture still seems to be more effective for improving students' short- and long-term knowledge retention. PMID:23442203

  13. Remote control missile model test

    NASA Technical Reports Server (NTRS)

    Allen, Jerry M.; Shaw, David S.; Sawyer, Wallace C.

    1989-01-01

    An extremely large, systematic, axisymmetric body/tail fin data base was gathered through tests of an innovative missile model design which is described herein. These data were originally obtained for incorporation into a missile aerodynamics code based on engineering methods (Program MISSILE3), but can also be used as diagnostic test cases for developing computational methods because of the individual-fin data included in the data base. Detailed analysis of four sample cases from these data are presented to illustrate interesting individual-fin force and moment trends. These samples quantitatively show how bow shock, fin orientation, fin deflection, and body vortices can produce strong, unusual, and computationally challenging effects on individual fin loads. Comparisons between these data and calculations from the SWINT Euler code are also presented.

  14. Scientific method by argumentation design: learning process for maintaining student’s retention

    NASA Astrophysics Data System (ADS)

    Siswanto; Yusiran; Asriyadin; Gumilar, S.; Subali, B.

    2018-03-01

    The purpose of this research is to describe the effect of scientific methods designed with argumentation on maintaining the retention of pre-service physics teachers (students) on mechanics concepts. This learning consists of five stages, the first two of which are observing and questioning, while the next three stages of reasoning, trying, and communicating are built on the argumentation design. To determine the effectiveness of the treatment, students were given a pre-test and a post-test; in addition, students were given four further post-tests over four months to assess the durability of retention. The results show that there was a mean difference between pre-test and post-test based on the Wilcoxon test (z = -3.4, p = 0.001), and the effectiveness of the treatment is in the high category based on the normalized gain value (<g> = 0.86). Meanwhile, the mean differences across all post-tests are significant based on Analysis of Variance (F = 365.63, p = 0.00). However, by the fourth month, students' retention rates began to stabilize, based on Tukey's HSD (p = 0.074) for the comparison of the mean difference between the fourth and fifth post-tests. Overall, the designed learning can maintain students' retention for 4 months after the learning finishes.

  15. The Enterococcus QPCR Method for Recreational Water Quality Testing: Testing Background, Performance and Issues

    EPA Science Inventory

    Currently accepted culture-based monitoring methods for fecal indicator bacteria in surface waters take at least 24 hr to determine if unacceptable levels of fecal pollution have reached our recreational beaches. During this waiting period changing water conditions may result eit...

  16. Aspheric surface testing by irradiance transport equation

    NASA Astrophysics Data System (ADS)

    Shomali, Ramin; Darudi, Ahmad; Nasiri, Sadollah; Asgharsharghi Bonab, Armir

    2010-10-01

    In this paper a method for aspheric surface testing is presented. The method is based on solving the Irradiance Transport Equation (ITE). The accuracy of ITE normally depends on the peak-to-valley amplitude of the phase distribution. This subject is investigated by a simulation procedure.

  17. Large-aperture space optical system testing based on the scanning Hartmann.

    PubMed

    Wei, Haisong; Yan, Feng; Chen, Xindong; Zhang, Hao; Cheng, Qiang; Xue, Donglin; Zeng, Xuefeng; Zhang, Xuejun

    2017-03-10

    Based on the Hartmann testing principle, this paper proposes a novel image quality testing technology which applies to a large-aperture space optical system. Compared with the traditional testing method through a large-aperture collimator, the scanning Hartmann testing technology has great advantages due to its simple structure, low cost, and ability to perform wavefront measurement of an optical system. The basic testing principle of the scanning Hartmann testing technology, data processing method, and simulation process are presented in this paper. Certain simulation results are also given to verify the feasibility of this technology. Furthermore, a measuring system is developed to conduct a wavefront measurement experiment for a 200 mm aperture optical system. The small deviation (6.3%) of root mean square (RMS) between experimental results and interferometric results indicates that the testing system can measure low-order aberration correctly, which means that the scanning Hartmann testing technology has the ability to test the imaging quality of a large-aperture space optical system.

  18. MR-assisted PET motion correction in simultaneous PET/MRI studies of dementia subjects.

    PubMed

    Chen, Kevin T; Salcedo, Stephanie; Chonde, Daniel B; Izquierdo-Garcia, David; Levine, Michael A; Price, Julie C; Dickerson, Bradford C; Catana, Ciprian

    2018-03-08

    Subject motion in positron emission tomography (PET) studies leads to image blurring and artifacts; simultaneously acquired magnetic resonance imaging (MRI) data provides a means for motion correction (MC) in integrated PET/MRI scanners. To assess the effect of realistic head motion and MR-based MC on static [18F]-fluorodeoxyglucose (FDG) PET images in dementia patients. Observational study. Thirty dementia subjects were recruited. 3T hybrid PET/MR scanner where EPI-based and T1-weighted sequences were acquired simultaneously with the PET data. Head motion parameters estimated from high temporal resolution MR volumes were used for PET MC. The MR-based MC method was compared to PET frame-based MC methods in which motion parameters were estimated by coregistering 5-minute frames before and after accounting for the attenuation-emission mismatch. The relative changes in standardized uptake value ratios (SUVRs) between the PET volumes processed with the various MC methods, without MC, and the PET volumes with simulated motion were compared in relevant brain regions. The absolute value of the regional SUVR relative change was assessed with pairwise paired t-tests testing at the P = 0.05 level, comparing the values obtained through different MR-based MC processing methods as well as across different motion groups. The intraregion voxelwise variability of regional SUVRs obtained through different MR-based MC processing methods was also assessed with pairwise paired t-tests testing at the P = 0.05 level. MC had a greater impact on PET data quantification in subjects with larger amplitude motion (higher than 18% in the medial orbitofrontal cortex) and greater changes were generally observed for the MR-based MC method compared to the frame-based methods. Furthermore, a mean relative change of ~4% was observed after MC even at the group level, suggesting the importance of routinely applying this correction. The intraregion voxelwise variability of regional SUVRs was also decreased using MR-based MC. All comparisons were significant at the P = 0.05 level. Incorporating temporally correlated MR data to account for intraframe motion has a positive impact on FDG PET image quality and data quantification in dementia patients. Level of Evidence: 3. Technical Efficacy: Stage 1. J. Magn. Reson. Imaging 2018. © 2018 International Society for Magnetic Resonance in Medicine.

  19. Highly Effective DNA Extraction Method for Nuclear Short Tandem Repeat Testing of Skeletal Remains from Mass Graves

    PubMed Central

    Davoren, Jon; Vanek, Daniel; Konjhodzić, Rijad; Crews, John; Huffine, Edwin; Parsons, Thomas J.

    2007-01-01

    Aim To quantitatively compare a silica extraction method with a commonly used phenol/chloroform extraction method for DNA analysis of specimens exhumed from mass graves. Methods DNA was extracted from twenty randomly chosen femur samples using the International Commission on Missing Persons (ICMP) silica method, based on the Qiagen Blood Maxi Kit, and compared with the DNA extracted by the standard phenol/chloroform-based method. The efficacy of the extraction methods was compared by real time polymerase chain reaction (PCR), to measure DNA quantity and the presence of inhibitors, and by amplification with the PowerPlex 16 (PP16) multiplex nuclear short tandem repeat (STR) kit. Results DNA quantification showed that the silica-based method extracted on average 1.94 ng of DNA per gram of bone (range 0.25-9.58 ng/g), compared with only 0.68 ng/g extracted by the organic method (range 0.0016-4.4880 ng/g). Inhibition tests showed that there were on average significantly lower levels of PCR inhibitors in DNA isolated by the organic method. When amplified with PP16, all samples extracted by the silica-based method produced full 16-locus profiles, while only 75% of the DNA extracts obtained by the organic technique amplified 16-locus profiles. Conclusions The silica-based extraction method showed better results in nuclear STR typing from degraded bone samples than the commonly used phenol/chloroform method. PMID:17696302

  20. Supervised segmentation of microelectrode recording artifacts using power spectral density.

    PubMed

    Bakstein, Eduard; Schneider, Jakub; Sieger, Tomas; Novak, Daniel; Wild, Jiri; Jech, Robert

    2015-08-01

    Appropriate detection of clean signal segments in extracellular microelectrode recordings (MER) is vital for maintaining high signal-to-noise ratio in MER studies. Existing alternatives to manual signal inspection are based on unsupervised change-point detection. We present a method of supervised MER artifact classification, based on power spectral density (PSD) and evaluate its performance on a database of 95 labelled MER signals. The proposed method yielded test-set accuracy of 90%, which was close to the accuracy of annotation (94%). The unsupervised methods achieved accuracy of about 77% on both training and testing data.
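
    A sketch of the general recipe (Welch PSD features plus a supervised classifier), not the authors' exact pipeline or data: label segments as clean or artifact, extract log-PSD features, and train a classifier. The synthetic "MER" segments and the mains-like artifact below are assumptions.

```python
# PSD-feature artifact classification sketch; signals are synthetic stand-ins.
import numpy as np
from scipy.signal import welch
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)
fs = 24000                                     # assumed MER sampling rate

def segment(artifact):
    """1 s of broadband noise, optionally with a mains-like interference tone."""
    x = rng.normal(0.0, 1.0, fs)
    if artifact:
        f0 = 50 * rng.uniform(0.8, 1.2)
        x += 5 * np.sin(2 * np.pi * f0 * np.arange(fs) / fs)
    return x

X, y = [], []
for label in (0, 1):
    for _ in range(100):
        f, pxx = welch(segment(label), fs=fs, nperseg=2048)
        X.append(np.log10(pxx[:200]))          # low-frequency log-PSD features
        y.append(label)

Xtr, Xte, ytr, yte = train_test_split(np.array(X), np.array(y), random_state=0)
clf = SVC().fit(Xtr, ytr)
print("held-out accuracy:", clf.score(Xte, yte))
```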

  1. Image fusion based on Bandelet and sparse representation

    NASA Astrophysics Data System (ADS)

    Zhang, Jiuxing; Zhang, Wei; Li, Xuzhi

    2018-04-01

    The Bandelet transform can acquire geometrically regular directions and geometric flow, while sparse representation can represent signals with as few atoms as possible on an over-complete dictionary; both can be applied to image fusion. Therefore, a new fusion method based on Bandelet and sparse representation is proposed to fuse the Bandelet coefficients of multi-source images and obtain high quality fusion effects. The tests were performed on remote sensing images and simulated multi-focus images; experimental results show that the performance of the new method is better than that of the tested methods according to objective evaluation indexes and subjective visual effects.

  2. 10 CFR 429.71 - Maintenance of records.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... certification reports, of the underlying test data for all certification testing, and of any other testing... based on the alternative method. (b) Such records shall be organized and indexed in a fashion that makes...

  3. 10 CFR 429.71 - Maintenance of records.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... certification reports, of the underlying test data for all certification testing, and of any other testing... based on the alternative method. (b) Such records shall be organized and indexed in a fashion that makes...

  4. 10 CFR 429.71 - Maintenance of records.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... certification reports, of the underlying test data for all certification testing, and of any other testing... based on the alternative method. (b) Such records shall be organized and indexed in a fashion that makes...

  5. Testing prediction methods: Earthquake clustering versus the Poisson model

    USGS Publications Warehouse

    Michael, A.J.

    1997-01-01

    Testing earthquake prediction methods requires statistical techniques that compare observed success to random chance. One technique is to produce simulated earthquake catalogs and measure the relative success of predicting real and simulated earthquakes. The accuracy of these tests depends on the validity of the statistical model used to simulate the earthquakes. This study tests the effect of clustering in the statistical earthquake model on the results. Three simulation models were used to produce significance levels for a VLF earthquake prediction method. As the degree of simulated clustering increases, the statistical significance drops. Hence, the use of a seismicity model with insufficient clustering can lead to overly optimistic results. A successful method must pass the statistical tests with a model that fully replicates the observed clustering. However, a method can be rejected based on tests with a model that contains insufficient clustering. U.S. copyright. Published in 1997 by the American Geophysical Union.
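
    The machinery described can be sketched in a few lines: score an alarm set against the real catalog, then against many simulated catalogs, and report the fraction of simulations predicted at least as well. Only the unclustered Poisson case is shown, and the rate, alarm set, and stand-in "real" catalog are invented.

```python
# Monte-Carlo significance of a prediction method against a Poisson model.
import numpy as np

rng = np.random.default_rng(42)
days, rate = 3650, 0.05                   # 10-year catalog, 0.05 events/day
alarms = rng.choice(days, size=200, replace=False)   # days covered by alarms

def hits(event_days):
    """Number of events falling on alarm days."""
    return np.intersect1d(event_days, alarms).size

real_catalog = np.flatnonzero(rng.random(days) < rate)   # stand-in observations
observed_hits = hits(real_catalog)

sims = [hits(np.flatnonzero(rng.random(days) < rate)) for _ in range(2000)]
p_value = np.mean([s >= observed_hits for s in sims])
print(observed_hits, p_value)
# With a clustered simulation model, random alarm sets score well more often,
# so the same observed success becomes less significant.
```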

  6. Confidence intervals for single-case effect size measures based on randomization test inversion.

    PubMed

    Michiels, Bart; Heyvaert, Mieke; Meulders, Ann; Onghena, Patrick

    2017-02-01

    In the current paper, we present a method to construct nonparametric confidence intervals (CIs) for single-case effect size measures in the context of various single-case designs. We use the relationship between a two-sided statistical hypothesis test at significance level α and a 100(1 - α)% two-sided CI to construct CIs for any effect size measure θ that contain all point null hypothesis θ values that cannot be rejected by the hypothesis test at significance level α. This method of hypothesis test inversion (HTI) can be employed using a randomization test as the statistical hypothesis test in order to construct a nonparametric CI for θ. We will refer to this procedure as randomization test inversion (RTI). We illustrate RTI in a situation in which θ is the unstandardized and the standardized difference in means between two treatments in a completely randomized single-case design. Additionally, we demonstrate how RTI can be extended to other types of single-case designs. Finally, we discuss a few challenges for RTI as well as possibilities when using the method with other effect size measures, such as rank-based nonoverlap indices. Supplementary to this paper, we provide easy-to-use R code, which allows the user to construct nonparametric CIs according to the proposed method.
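
    The authors supply R code; purely as an illustration of the core idea, here is a compact Python sketch of randomization test inversion for a two-group mean difference in a completely randomized design: scan candidate effect sizes θ0 and keep every θ0 that an exhaustive permutation test does not reject at level α. The data values and grid are invented.

```python
# Randomization-test-inversion CI sketch; data values are illustrative.
import numpy as np
from itertools import combinations

a = np.array([7.1, 8.3, 9.0, 8.6])        # treatment A measurements (assumed)
b = np.array([5.2, 6.1, 5.9, 6.6])        # treatment B measurements (assumed)

def perm_p(x, y):
    """Two-sided exhaustive permutation p-value for a difference in means."""
    pooled = np.concatenate([x, y])
    obs = abs(x.mean() - y.mean())
    count = total = 0
    for idx in combinations(range(pooled.size), x.size):
        gx = pooled[list(idx)]
        gy = np.delete(pooled, list(idx))
        count += abs(gx.mean() - gy.mean()) >= obs - 1e-12
        total += 1
    return count / total

alpha = 0.10
grid = np.linspace(0.0, 5.0, 101)         # candidate effect sizes θ0
ci = [th for th in grid if perm_p(a - th, b) > alpha]   # shift A, test θ = 0
print(min(ci), max(ci))                   # nonparametric CI endpoints
```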

  7. Equipercentile Test Equating: The Effects of Presmoothing and Postsmoothing on the Magnitude of Sample-Dependent Errors.

    DTIC Science & Technology

    1985-04-01

    AFHRL-TR-84-64. ...a combined or compound presmoother and a presmoothing method based on a particular model of test scores. Of the seven methods of presmoothing the score... unsmoothed distributions, the smoothing of that sequence of differences by the same compound method, and, finally, adding the smoothed differences back...

  8. Detection of osmotic damages in GRP boat hulls

    NASA Astrophysics Data System (ADS)

    Krstulović-Opara, L.; Domazet, Ž.; Garafulić, E.

    2013-09-01

    Infrared thermography, as a tool of non-destructive testing, is a method enabling visualization and estimation of structural anomalies and differences in a structure's topography. In the presented paper the problem of osmotic damage in submerged glass reinforced polymer structures is addressed. Osmotic damage can be detected by simple humidity gauging, but applicable testing methods for its proper evaluation and estimation are restricted. In this paper it is demonstrated that infrared thermography, based on the estimation of heat wave propagation, can be used. Three methods are addressed: pulsed thermography, Fast Fourier Transform and the continuous Morlet wavelet. Additional image processing based on a gradient approach is applied to all addressed methods. It is shown that the continuous Morlet wavelet is the most appropriate method for the detection of osmotic damage.

  9. Complement dependent cytotoxicity (CDC) activity of a humanized anti Lewis-Y antibody: FACS-based assay versus the 'classical' radioactive method -- qualification, comparison and application of the FACS-based approach.

    PubMed

    Nechansky, A; Szolar, O H J; Siegl, P; Zinoecker, I; Halanek, N; Wiederkum, S; Kircheis, R

    2009-05-01

    The fully humanized Lewis-Y carbohydrate specific monoclonal antibody (mAb) IGN311 is currently being tested in a passive immunotherapy approach in a clinical phase I trial, and regulatory requirements therefore demand qualified assays for product analysis. To demonstrate the functionality of its Fc-region, the capacity of IGN311 to mediate complement dependent cytotoxicity (CDC) against human breast cancer cells was evaluated. The "classical" radioactive method using chromium-51 and a FACS-based assay were established and qualified according to ICH guidelines. Parameters evaluated were specificity, response function, bias, repeatability (intra-day precision), intermediate precision (operator-time different), and linearity (assay range). In the course of a fully nested design, a four-parameter logistic equation was identified as an appropriate calibration model for both methods. For the radioactive assay, the bias ranged from -6.1% to -3.6%. The intermediate precision for future means of duplicate measurements revealed values from 12.5% to 15.9%, and the total error (beta-expectation tolerance interval) of the method was found to be <40%. For the FACS-based assay, the bias ranged from -8.3% to 0.6% and the intermediate precision for future means of duplicate measurements revealed values from 4.2% to 8.0%. The total error of the method was found to be <25%. The presented data demonstrate that the FACS-based CDC assay is more accurate than the radioactive assay. The elimination of radioactivity and the 'real-time' counting of apoptotic cells further justify the implementation of this method, which was subsequently applied for testing the influence of storage at 4 degrees C and 25 degrees C ('stability testing') on the potency of the IGN311 drug product. The obtained results demonstrate that the qualified functional assay represents a stability indicating test method.
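
    For concreteness, the four-parameter logistic calibration model named above can be fitted by least squares as sketched below; the concentration-response values are invented, not assay data.

```python
# Four-parameter logistic (4PL) calibration fit; data points are assumed.
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, bottom, top, ec50, hill):
    return bottom + (top - bottom) / (1.0 + (x / ec50) ** hill)

conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0, 100.0])    # µg/ml (assumed)
lysis = np.array([4.0, 7.0, 15.0, 34.0, 58.0, 71.0, 76.0])  # % lysis (assumed)

p0 = [lysis.min(), lysis.max(), 5.0, -1.0]   # rough starting values
params, _ = curve_fit(four_pl, conc, lysis, p0=p0, maxfev=10000)
print(dict(zip(["bottom", "top", "EC50", "hill"], params.round(3))))
```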

  10. SMART empirical approaches for predicting field performance of PV modules from results of reliability tests

    NASA Astrophysics Data System (ADS)

    Hardikar, Kedar Y.; Liu, Bill J. J.; Bheemreddy, Venkata

    2016-09-01

    Gaining an understanding of degradation mechanisms and their characterization are critical in developing relevant accelerated tests to ensure PV module performance warranty over a typical lifetime of 25 years. As newer technologies are adapted for PV, including new PV cell technologies, new packaging materials, and newer product designs, the availability of field data over extended periods of time for product performance assessment cannot be expected within the typical timeframe for business decisions. In this work, to enable product design decisions and product performance assessment for PV modules utilizing newer technologies, Simulation and Mechanism based Accelerated Reliability Testing (SMART) methodology and empirical approaches to predict field performance from accelerated test results are presented. The method is demonstrated for field life assessment of flexible PV modules based on degradation mechanisms observed in two accelerated tests, namely, Damp Heat and Thermal Cycling. The method is based on design of accelerated testing scheme with the intent to develop relevant acceleration factor models. The acceleration factor model is validated by extensive reliability testing under different conditions going beyond the established certification standards. Once the acceleration factor model is validated for the test matrix a modeling scheme is developed to predict field performance from results of accelerated testing for particular failure modes of interest. Further refinement of the model can continue as more field data becomes available. While the demonstration of the method in this work is for thin film flexible PV modules, the framework and methodology can be adapted to other PV products.

  11. Words, concepts, or both: optimal indexing units for automated information retrieval.

    PubMed Central

    Hersh, W. R.; Hickam, D. H.; Leone, T. J.

    1992-01-01

    What is the best way to represent the content of documents in an information retrieval system? This study compares the retrieval effectiveness of five different methods for automated (machine-assigned) indexing using three test collections. The consistently best methods are those that use indexing based on the words that occur in the available text of each document. Methods used to map text into concepts from a controlled vocabulary showed no advantage over the word-based methods. This study also looked at an approach to relevance feedback which showed benefit for both word-based and concept-based methods. PMID:1482951

  12. A New Corrosion Sensor to Determine the Start and Development of Embedded Rebar Corrosion Process at Coastal Concrete

    PubMed Central

    Xu, Chen; Li, Zhiyuan; Jin, Weiliang

    2013-01-01

    The corrosion of reinforcements induced by chloride has proven to be one of the most frequent causes of their premature damage. Most corrosion sensors are designed to monitor the corrosion state in concrete, such as the Anode-Ladder-System and the Corrowatch System, which are widely used to monitor chloride ingress in marine concrete. However, the monitoring principle of these corrosion sensors is based on the macro-cell test method, so erroneous information may be obtained, especially from concrete under drying or saturated conditions, because concrete resistance takes control in macro-cell corrosion. In this paper, a fast weak polarization method to test the corrosion state of reinforcements based on electrochemical polarization dynamics is proposed. Furthermore, a new corrosion sensor for monitoring the corrosion state of the concrete cover was developed based on the proposed test method. The sensor was tested in cement mortar, with dry-wet cycle tests to accelerate the chloride ingress rate. The results show that the corrosion sensor can effectively monitor chloride penetration into concrete with little influence from the relative humidity in the concrete. With a reasonable sensor electrode arrangement, the Ohm-drop effect measured by EIS can be ignored, which makes the tested electrochemical parameters more accurate. PMID:24084117

  14. Nested-PCR and a new ELISA-based NovaLisa test kit for malaria diagnosis in an endemic area of Thailand.

    PubMed

    Thongdee, Pimwan; Chaijaroenkul, Wanna; Kuesap, Jiraporn; Na-Bangchang, Kesara

    2014-08-01

    Microscopy is considered the gold standard for malaria diagnosis, although its wide application is limited by the requirement for highly experienced microscopists. PCR and serological tests provide efficient diagnostic performance and have been applied to malaria diagnosis and research. The aim of this study was to investigate the diagnostic performance of nested PCR and a recently developed ELISA-based rapid diagnostic test (RDT), the NovaLisa test kit, for diagnosis of malaria infection, using the microscopic method as the gold standard. The performance of nested PCR as a malaria diagnostic tool is excellent with respect to its high accuracy, sensitivity, specificity, and ability to discriminate Plasmodium species. The sensitivity and specificity of nested PCR compared with the microscopic method for detection of Plasmodium falciparum, Plasmodium vivax, and P. falciparum/P. vivax mixed infection were 71.4 vs 100%, 100 vs 98.7%, and 100 vs 95.0%, respectively. The sensitivity and specificity of the ELISA-based NovaLisa test kit compared with the microscopic method for detection of the Plasmodium genus were 89.0 and 91.6%, respectively. The NovaLisa test kit provided comparable diagnostic performance; its relatively low cost, simplicity, and rapidity enable large-scale field application.

  15. Potential applicability of stress wave velocity method on pavement base materials as a non-destructive testing technique

    NASA Astrophysics Data System (ADS)

    Mahedi, Masrur

    Aggregates derived from natural sources have traditionally been used as pavement base materials. In recent times, however, the extraction of these natural aggregates has become more labor intensive and costly due to resource depletion and environmental concerns. Thus, the use of recycled aggregates as a supplement to natural aggregates is increasing considerably in pavement construction. Use of recycled aggregates such as recycled crushed concrete (RCA) and recycled asphalt pavement (RAP) reduces the rate of natural resource depletion, construction debris, and cost. Although recycled aggregates could be used as a viable alternative to conventional base materials, strength characteristics and product variability limit their utility to a great extent. Hence, their applicability needs to be evaluated extensively based on strength, stiffness, and cost factors. For extensive evaluation, however, traditionally practiced test methods have proven impractical in terms of time, cost, reliability, and applicability. Rapid non-destructive methods, on the other hand, have the potential to be less time consuming and inexpensive, with low variability of test results, thereby improving the reliability of the estimated pavement performance. In this research, the experimental program was designed to assess the potential application of the stress wave velocity method as a non-destructive test for evaluating recycled base materials. Different combinations of cement-treated recycled asphalt pavement (RAP) and recycled crushed concrete (RCA) were used to evaluate the applicability of the stress wave velocity method. It was found that the stress wave velocity method is excellent for characterizing the strength and stiffness properties of cement-treated base materials. Statistical models based on P-wave velocity were derived for predicting the modulus of elasticity and compressive strength of different combinations of cement-treated RAP, Grade-1, and Grade-2 materials. Two-, three-, and four-parameter models were also developed to characterize the resilient modulus response. It is anticipated that the derived correlations can be used to estimate the strength and stiffness response of cement-treated base materials with a satisfactory level of confidence, provided the P-wave velocity remains within the range of 500 ft/sec to 1500 ft/sec.
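
    As an illustration of the kind of statistical model the study derives, the sketch below fits a simple least-squares correlation between P-wave velocity and compressive strength. All numbers are invented; the study's actual model forms and coefficients are not reproduced here.

```python
import numpy as np

# Hypothetical P-wave velocity vs. compressive strength data for a
# cement-treated base material (invented for illustration only).
vp_fts = np.array([550, 700, 850, 1000, 1150, 1300, 1450])   # ft/s
qu_psi = np.array([120, 180, 260, 330, 410, 500, 590])        # psi

# Linear fit qu = a*Vp + b; a power-law fit on log-transformed data is
# another common choice for wave-velocity/strength correlations.
a, b = np.polyfit(vp_fts, qu_psi, 1)
pred = a * vp_fts + b
r2 = 1 - np.sum((qu_psi - pred) ** 2) / np.sum((qu_psi - qu_psi.mean()) ** 2)
print(f"qu = {a:.3f}*Vp {b:+.1f}  (R^2 = {r2:.3f})")

# As the abstract cautions, apply only within the calibrated range,
# here 500-1500 ft/s.
```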

  16. Comparative evaluation of tensile bond strength of a polyvinyl acetate-based resilient liner following various denture base surface pre-treatment methods and immersion in artificial salivary medium: An in vitro study.

    PubMed

    Philip, Jacob M; Ganapathy, Dhanraj M; Ariga, Padma

    2012-07-01

    This study was formulated to evaluate the influence of various denture base resin surface pre-treatments (chemical, mechanical, and combinations of the two) on the tensile bond strength between a polyvinyl acetate-based denture liner and a denture base resin. A universal testing machine was used to determine the bond strength of the liner to surface pre-treated acrylic resin blocks. The data were analyzed by one-way analysis of variance and the t-test (α =.05). This study infers that denture base surface pre-treatment can improve the adhesive tensile bond strength between the liner and denture base specimens. The results infer that chemical, mechanical, and mechano-chemical pre-treatments have different effects on the bond strength of the acrylic soft resilient liner to the denture base. Among the various pre-treatment methods for denture base resins, the mechano-chemical method, air-borne particle abrasion followed by monomer application, exhibited superior bond strength with the resilient liner compared with the other methods. Hence, this method could be effectively used to improve the bond strength between liner and denture base, and thus could minimize delamination of the liner from the denture base during function.

  17. Harmonisation of microbial sampling and testing methods for distillate fuels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hill, G.C.; Hill, E.C.

    1995-05-01

    Increased incidence of microbial infection in distillate fuels has led to a demand for organisations such as the Institute of Petroleum to propose standards for microbiological quality, based on numbers of viable microbial colony forming units. Variations in quality requirements and in the spoilage significance of contaminating microbes, plus a tendency for temporal and spatial changes in the distribution of microbes, make such standards difficult to implement. The problem is compounded by a diversity in the procedures employed for sampling and testing for microbial contamination and in the interpretation of the data obtained. The following paper reviews these problems and describes the efforts of the Institute of Petroleum Microbiology Fuels Group to address these issues, and in particular to bring about harmonisation of sampling and testing methods. The benefits and drawbacks of available test methods, both laboratory based and on-site, are discussed.

  18. Proof test methodology for composites

    NASA Technical Reports Server (NTRS)

    Wu, Edward M.; Bell, David K.

    1992-01-01

    The special requirements for proof test of composites are identified based on the underlying failure process of composites. Two proof test methods are developed to eliminate the inevitable weak fiber sites without also causing flaw clustering which weakens the post-proof-test composite. Significant reliability enhancement by these proof test methods has been experimentally demonstrated for composite strength and composite life in tension. This basic proof test methodology is relevant to the certification and acceptance of critical composite structures. It can also be applied to the manufacturing process development to achieve zero-reject for very large composite structures.

  19. Antenna reconfiguration verification and validation

    NASA Technical Reports Server (NTRS)

    Becker, Robert C. (Inventor); Meyers, David W. (Inventor); Muldoon, Kelly P. (Inventor); Carlson, Douglas R. (Inventor); Drexler, Jerome P. (Inventor)

    2009-01-01

    A method of testing the electrical functionality of an optically controlled switch in a reconfigurable antenna is provided. The method includes configuring one or more conductive paths between one or more feed points and one or more test points with switches in the reconfigurable antenna, applying one or more test signals to the one or more feed points, monitoring the one or more test points in response to the one or more test signals, and determining the functionality of the switch based upon the monitoring of the one or more test points.

  20. FY 2016 Status Report: CIRFT Testing Data Analyses and Updated Curvature Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Jy-An John; Wang, Hong

    This report provides a detailed description of FY15 test result corrections/analysis based on the FY16 Cyclic Integrated Reversible-Bending Fatigue Tester (CIRFT) test program methodology update used to evaluate the vibration integrity of spent nuclear fuel (SNF) under normal transportation conditions. The CIRFT consists of a U-frame testing setup and a real-time curvature measurement method. The three-component U-frame setup of the CIRFT has two rigid arms and linkages to a universal testing machine. The curvature of rod bending is obtained through a three-point deflection measurement method. Three linear variable differential transformers (LVDTs) are clamped to the side connecting plates of the U-frame to capture the deformation of the rod. The contact-based measurement, or three-LVDT-based curvature measurement system, on SNF rods has proven quite reliable in CIRFT testing. However, how the LVDT head contacts the SNF rod may have a significant effect on the curvature measurement, depending on the magnitude and direction of rod curvature. It has been demonstrated that the contact/curvature issues can be corrected by applying a correction to the sensor spacing. The sensor spacing defines the separation of the three LVDT probes and is a critical quantity in calculating the rod curvature once the deflections are obtained. The sensor spacing correction can be determined by using chisel-type probes. This method has been critically examined this year and has been shown to be difficult to implement in a hot cell environment, and thus cannot be implemented effectively. A correction based on the proposed equivalent gauge length has the required flexibility and accuracy and can be appropriately used as a correction factor. The correction method based on the equivalent gauge length has been successfully demonstrated in CIRFT data analysis for the dynamic tests conducted on Limerick (LMK) (17 tests), North Anna (NA) (6 tests), and Catawba mixed oxide (MOX) (10 tests) SNF samples. These CIRFT tests were completed in FY14 and FY15. Specifically, the data sets obtained from measurement and monitoring were processed and analyzed. The fatigue life of rods has been characterized in terms of moment, curvature, and equivalent stress and strain.
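
    The three-point deflection idea reduces to circle geometry: the middle LVDT's deflection relative to the line through the outer two probes is the sagitta of the bent rod over a chord set by the probe spacing. The sketch below shows that conversion with an optional equivalent-gauge-length substitution; the function name, units, and correction handling are assumptions, not the report's code.

```python
import numpy as np

# Geometric three-point deflection-to-curvature conversion (illustrative
# only). Probes sit at -h, 0, +h along the rod; the circle through the
# three measured points gives the bending radius.

def curvature_from_lvdt(y1, y2, y3, h, h_eff=None):
    """y1, y2, y3: LVDT deflections (mm); h: nominal half-spacing between
    outer probes (mm); h_eff: assumed equivalent gauge length replacing h,
    in the spirit of the report's correction-factor idea."""
    s = h_eff if h_eff is not None else h
    delta = y2 - 0.5 * (y1 + y3)        # sagitta relative to the chord
    if abs(delta) < 1e-12:
        return 0.0                       # straight rod
    R = s ** 2 / (2.0 * delta) + delta / 2.0   # circle through 3 points
    return 1.0 / R                              # curvature, 1/mm

# Hypothetical reading: 0.2 mm sagitta with probes 12 mm either side.
print(curvature_from_lvdt(0.00, 0.20, 0.00, h=12.0))
```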

  1. Research of test fault diagnosis method for micro-satellite PSS

    NASA Astrophysics Data System (ADS)

    Wu, Haichao; Wang, Jinqi; Yang, Zhi; Yan, Meizhi

    2017-11-01

    Along with the increase in the number of micro-satellites and the shortening of product lifecycles, the negative effects of satellite ground test failures are becoming more serious, and real-time, efficient fault diagnosis is increasingly necessary. The PSS plays an important role in the safety and reliability of the satellite ground test, as it is one of the most important subsystems guaranteeing the safety of micro-satellite energy. This study takes the test fault diagnosis method for the micro-satellite PSS as its research object. On the basis of the system features of the PSS and classic fault diagnosis methods, a fault diagnosis method based on a layered, loosely coupled architecture is proposed. This article can provide a reference for research on fault diagnosis methods for other micro-satellite subsystems.

  2. Validation of the tablet-administered Brief Assessment of Cognition (BAC App).

    PubMed

    Atkins, Alexandra S; Tseng, Tina; Vaughan, Adam; Twamley, Elizabeth W; Harvey, Philip; Patterson, Thomas; Narasimhan, Meera; Keefe, Richard S E

    2017-03-01

    Computerized tests benefit from automated scoring procedures and standardized administration instructions. These methods can reduce the potential for rater error. However, especially in patients with severe mental illnesses, the equivalency of traditional and tablet-based tests cannot be assumed. The Brief Assessment of Cognition in Schizophrenia (BACS) is a pen-and-paper cognitive assessment tool that has been used in hundreds of research studies and clinical trials, and has normative data available for generating age- and gender-corrected standardized scores. A tablet-based version of the BACS called the BAC App has been developed. This study compared performance on the BACS and the BAC App in patients with schizophrenia and healthy controls. Test equivalency was assessed, and the applicability of paper-based normative data was evaluated. Results demonstrated the distributions of standardized composite scores for the tablet-based BAC App and the pen-and-paper BACS were indistinguishable, and the between-methods mean differences were not statistically significant. The discrimination between patients and controls was similarly robust. The between-methods correlations for individual measures in patients were r>0.70 for most subtests. When data from the Token Motor Test was omitted, the between-methods correlation of composite scores was r=0.88 (df=48; p<0.001) in healthy controls and r=0.89 (df=46; p<0.001) in patients, consistent with the test-retest reliability of each measure. Taken together, results indicate that the tablet-based BAC App generates results consistent with the traditional pen-and-paper BACS, and support the notion that the BAC App is appropriate for use in clinical trials and clinical practice. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  3. Network Penetration Testing and Research

    NASA Technical Reports Server (NTRS)

    Murphy, Brandon F.

    2013-01-01

    This paper focuses on research and testing done on penetrating a network for security purposes. This research provides the IT security office with new methods of attack across and against a company's network, and introduces new platforms and software that can be used to better assist with protecting against such attacks. Throughout this paper, testing and research were done on two different Linux-based operating systems for attacking and compromising a Windows-based host computer. Backtrack 5 and BlackBuntu (Linux-based penetration testing operating systems) are two different "attacker" computers that attempt to plant viruses and exploits on a host Windows 7 operating system, as well as try to retrieve information from the host. On each Linux OS (Backtrack 5 and BlackBuntu) there is penetration testing software which provides the necessary tools to create exploits that can compromise a Windows system as well as other operating systems. This paper focuses on two main methods of deploying exploits onto a host computer in order to retrieve information from a compromised system. One method of deployment tested is known as a "social engineering" exploit. This type of method requires interaction from an unsuspecting user. With this user interaction, a deployed exploit may allow a malicious user to gain access to the unsuspecting user's computer as well as the network that computer is connected to. Due to more advanced security settings and antivirus protection and detection, this method is easily identified and defended against. The second method of exploit deployment is the method mainly focused upon in this paper. This method required extensive research on the best way to compromise a security-enabled protected network. Once a network has been compromised, any and all devices connected to that network have the potential to be compromised as well. With a compromised network, computers and devices can be penetrated through deployed exploits. This paper illustrates the research done to test the ability to penetrate a network without user interaction, in order to retrieve personal information from a targeted host.

  4. Sporicidal activity of chemical and physical tissue fixation methods.

    PubMed Central

    Vardaxis, N J; Hoogeveen, M M; Boon, M E; Hair, C G

    1997-01-01

    AIMS: The effects of alcohol-based fixation and microwave-stimulated alcohol fixation were investigated on spores of Bacillus stearothermophilus and Bacillus subtilis (var. niger). METHODS: Spores were exposed to 10% formalin or to different concentrations of various alcohol-containing fixatives (Kryofix/Spuitfix). Adequate controls were set up in conjunction with the test solutions. The spores were immersed in the various solutions tested, with and without adjunctive microwave stimulation. Possible surviving spores were recovered in revival broth; after incubation and Gram staining, viable counts were performed. RESULTS: Alcohol-based fixatives did not have a sporicidal effect on B stearothermophilus or B subtilis (var. niger) spores, and neither did microwave-stimulated alcohol fixation at 450 W and up to 75 degrees C. CONCLUSIONS: When alcohol-based fixatives are used for fixation, precautions should be taken with the material thus treated, as it may contain viable spores or other pathogens; these are destroyed after 24 hours of formalin treatment. Of the physicochemical methods tested involving microwaving, none was successful in eliminating viable spores from the test material. PMID:9215128

  5. Compression Testing of Textile Composite Materials

    NASA Technical Reports Server (NTRS)

    Masters, John E.

    1996-01-01

    The applicability of existing test methods, which were developed primarily for laminates made of unidirectional prepreg tape, to textile composites is an area of concern. The issue is whether the values measured for the 2-D and 3-D braided, woven, stitched, and knit materials are accurate representations of the true material response. This report provides a review of efforts to establish a compression test method for textile reinforced composite materials. Experimental data have been gathered from several sources and evaluated to assess the effectiveness of a variety of test methods. The effectiveness of the individual test methods to measure the material's modulus and strength is determined. Data are presented for 2-D triaxial braided, 3-D woven, and stitched graphite/epoxy material. However, the determination of a recommended test method and specimen dimensions is based, primarily, on experimental results obtained by the Boeing Defense and Space Group for 2-D triaxially braided materials. They evaluated seven test methods: NASA Short Block, Modified IITRI, Boeing Open Hole Compression, Zabora Compression, Boeing Compression after Impact, NASA ST-4, and a Sandwich Column Test.

  6. Potential testing of reprocessing procedures by real-time polymerase chain reaction: A multicenter study of colonoscopy devices.

    PubMed

    Valeriani, Federica; Agodi, Antonella; Casini, Beatrice; Cristina, Maria Luisa; D'Errico, Marcello Mario; Gianfranceschi, Gianluca; Liguori, Giorgio; Liguori, Renato; Mucci, Nicolina; Mura, Ida; Pasquarella, Cesira; Piana, Andrea; Sotgiu, Giovanni; Privitera, Gaetano; Protano, Carmela; Quattrocchi, Annalisa; Ripabelli, Giancarlo; Rossini, Angelo; Spagnolo, Anna Maria; Tamburro, Manuela; Tardivo, Stefano; Veronesi, Licia; Vitali, Matteo; Romano Spica, Vincenzo

    2018-02-01

    Reprocessing of endoscopes is key to preventing cross-infection after colonoscopy. Culture-based methods are recommended for monitoring, but alternative and rapid approaches are needed to improve surveillance and reduce turnover times. A molecular strategy based on detection of residual traces from the gut microbiota was developed and tested in a multicenter survey. A simplified sampling and DNA extraction protocol using nylon-tipped flocked swabs was optimized. A multiplex real-time polymerase chain reaction (PCR) test was developed that targeted 6 bacterial genes amplified in 3 mixes. The method was validated by interlaboratory tests involving 5 reference laboratories. Colonoscopy devices (n = 111) were sampled in 10 Italian hospitals. Culture-based microbiology and metagenomic tests were performed to verify the PCR data. The sampling method was easily applied in all 10 endoscopy units, and the optimized DNA extraction and amplification protocol was successfully performed by all of the involved laboratories. This PCR-based method allowed identification of both contaminated (n = 59) and fully reprocessed endoscopes (n = 52) with high sensitivity (98%) and specificity (98%) within 3-4 hours, in contrast to the 24-72 hours needed for a classic microbiology test. Results were confirmed by next-generation sequencing and classic microbiology. A novel approach for monitoring the reprocessing of colonoscopy devices was developed and successfully applied in a multicenter survey. The general principle of tracing biological fluids through microflora DNA amplification was successfully applied and may represent a promising approach for hospital hygiene. Copyright © 2018 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.

  7. Comparison of Standard Culture-Based Method to Culture-Independent Method for Evaluation of Hygiene Effects on the Hand Microbiome.

    PubMed

    Zapka, C; Leff, J; Henley, J; Tittl, J; De Nardo, E; Butler, M; Griggs, R; Fierer, N; Edmonds-Wilson, S

    2017-03-28

    Hands play a critical role in the transmission of microbiota on one's own body, between individuals, and on environmental surfaces. Effectively measuring the composition of the hand microbiome is important to hand hygiene science, which has implications for human health. Hand hygiene products are evaluated using standard culture-based methods, but standard test methods for culture-independent microbiome characterization are lacking. We sampled the hands of 50 participants using swab-based and glove-based methods prior to and following four hand hygiene treatments (using a nonantimicrobial hand wash, alcohol-based hand sanitizer [ABHS], a 70% ethanol solution, or tap water). We compared results among culture plate counts, 16S rRNA gene sequencing of DNA extracted directly from hands, and sequencing of DNA extracted from culture plates. Glove-based sampling yielded higher numbers of unique operational taxonomic units (OTUs) but had less diversity in bacterial community composition than swab-based sampling. We detected treatment-induced changes in diversity only by using swab-based samples ( P < 0.001); we were unable to detect changes with glove-based samples. Bacterial cell counts significantly decreased with use of the ABHS ( P < 0.05) and ethanol control ( P < 0.05). Skin hydration at baseline correlated with bacterial abundances, bacterial community composition, pH, and redness across subjects. The importance of the method choice was substantial. These findings are important to ensure improvement of hand hygiene industry methods and for future hand microbiome studies. On the basis of our results and previously published studies, we propose recommendations for best practices in hand microbiome research. IMPORTANCE The hand microbiome is a critical area of research for diverse fields, such as public health and forensics. The suitability of culture-independent methods for assessing effects of hygiene products on microbiota has not been demonstrated. This is the first controlled laboratory clinical hand study to have compared traditional hand hygiene test methods with newer culture-independent characterization methods typically used by skin microbiologists. This study resulted in recommendations for hand hygiene product testing, development of methods, and future hand skin microbiome research. It also demonstrated the importance of inclusion of skin physiological metadata in skin microbiome research, which is atypical for skin microbiome studies. Copyright © 2017 Zapka et al.

  8. Bayes factors based on robust TDT-type tests for family trio design.

    PubMed

    Yuan, Min; Pan, Xiaoqing; Yang, Yaning

    2015-06-01

    Adaptive transmission disequilibrium test (aTDT) and the MAX3 test are two robust, efficient association tests for case-parent family trio data. Both tests incorporate information from the common genetic models, recessive, additive, and dominant, and are efficient in power and robust to genetic model specification. The aTDT uses information on departure from Hardy-Weinberg disequilibrium to identify the potential genetic model underlying the data and then applies the corresponding TDT-type test; the MAX3 test is defined as the maximum of the absolute values of the three TDT-type tests under the three common genetic models. In this article, we propose three robust Bayes procedures for association analysis with the case-parent trio design: the aTDT-based Bayes factor, the MAX3-based Bayes factor, and Bayes model averaging (BMA). The asymptotic distributions of the aTDT under the null and alternative hypotheses are derived in order to calculate its Bayes factor. Extensive simulations show that the Bayes factors and the p-values of the corresponding tests are generally consistent, and that these Bayes factors are robust to genetic model specifications, especially when the priors on the genetic models are equal. When equal priors are used for the underlying genetic models, the Bayes factor method based on the aTDT is more powerful than those based on MAX3 and Bayes model averaging. When the prior places a small (large) probability on the true model, the Bayes factor based on the aTDT (BMA) is more powerful. Analysis of simulated rheumatoid arthritis (RA) data from Genetic Analysis Workshop 15 (GAW15) is presented to illustrate applications of the proposed methods.
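
    As a generic illustration of attaching a Bayes factor to an asymptotically normal test statistic (the paper's broad strategy; its aTDT/MAX3-specific derivations are not reproduced here), suppose a TDT-type statistic z is N(0,1) under H0 and N(delta,1) under H1, with an assumed normal prior delta ~ N(0, tau^2). The marginal under H1 is then N(0, 1 + tau^2), which gives the closed-form Bayes factor sketched below.

```python
from scipy.stats import norm

# Closed-form Bayes factor for a z-statistic under a normal prior on the
# noncentrality (illustrative; tau is an assumed prior scale).

def bayes_factor_z(z, tau=1.0):
    m1 = norm.pdf(z, loc=0.0, scale=(1.0 + tau ** 2) ** 0.5)  # H1 marginal
    m0 = norm.pdf(z, loc=0.0, scale=1.0)                      # H0 density
    return m1 / m0

for z in (1.0, 2.0, 3.0):
    print(f"z = {z}: BF10 = {bayes_factor_z(z):.2f}")
```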

  9. Students Perception on the Use of Computer Based Test

    NASA Astrophysics Data System (ADS)

    Nugroho, R. A.; Kusumawati, N. S.; Ambarwati, O. C.

    2018-02-01

    Teaching nowadays may use technology to disseminate science and knowledge, and the way study progress and results are evaluated has also benefited from this rapid progress in IT. The computer-based test (CBT) has been introduced to replace the more conventional paper-and-pencil test (PPT). CBT is considered more advantageous than PPT: it is more efficient, transparent, and able to minimise fraud in cognitive evaluation. Current studies have debated CBT vs PPT usage, but most compare the two methods without exploring students' perceptions of the tests. This study fills that gap in the literature by providing students' perceptions of the two test methods. A survey approach was used to obtain the data. The sample was collected in two identical classes taking a similar subject at a public university in Indonesia, and the Mann-Whitney U test was used to analyse the data. The results indicate a significant difference between the two groups of students regarding CBT usage: students preferred a test method other than the one they had just taken. Further discussion and research implications are presented in the paper.
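
    A minimal sketch of the analysis step named in the abstract, comparing two independent groups' ratings with the Mann-Whitney U test via SciPy; the Likert-style scores below are invented, not the study's data.

```python
import numpy as np
from scipy.stats import mannwhitneyu

# Hypothetical 5-point preference ratings from the two class groups.
cbt_group = np.array([4, 5, 3, 4, 2, 5, 4, 3, 4, 5])
ppt_group = np.array([3, 2, 4, 2, 3, 3, 2, 4, 3, 2])

# Nonparametric comparison of the two independent samples.
u_stat, p_value = mannwhitneyu(cbt_group, ppt_group, alternative="two-sided")
print(f"U = {u_stat}, p = {p_value:.4f}")  # p < 0.05 -> groups differ
```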

  10. Feasibility of supervised self-testing using an oral fluid-based HIV rapid testing method: a cross-sectional, mixed method study among pregnant women in rural India

    PubMed Central

    Sarkar, Archana; Mburu, Gitau; Shivkumar, Poonam Varma; Sharma, Pankhuri; Campbell, Fiona; Behera, Jagannath; Dargan, Ritu; Mishra, Surendra Kumar; Mehra, Sunil

    2016-01-01

    Introduction HIV self-testing can increase coverage of essential HIV services. This study aimed to establish the acceptability, concordance and feasibility of supervised HIV self-testing among pregnant women in rural India. Methods A cross-sectional, mixed methods study was conducted among 202 consenting pregnant women in a rural Indian hospital between August 2014 and January 2015. Participants were provided with instructions on how to self-test using OraQuick® HIV antibody test, and subsequently asked to self-test under supervision of a community health worker. Test results were confirmed at a government-run integrated counselling and testing centre. A questionnaire was used to obtain information on patient demographics and the ease, acceptability and difficulties of self-testing. In-depth interviews were conducted with a sub-sample of 35 participants to understand their experiences. Results In total, 202 participants performed the non-invasive, oral fluid-based, rapid test under supervision for HIV screening. Acceptance rate was 100%. Motivators for self-testing included: ease of testing (43.4%), quick results (27.3%) and non-invasive procedure (23.2%). Sensitivity and specificity were 100% for 201 tests, and one test was invalid. Concordance of test result interpretation between community health workers and participants was 98.5% with a Cohen's Kappa (k) value of k=0.566 with p<0.001 for inter-rater agreement. Although 92.6% participants reported that the instructions for the test were easy to understand, 18.7% required the assistance of a supervisor to self-test. Major themes that emerged from the qualitative interviews indicated the importance of the following factors in influencing acceptability of self-testing: clarity and accessibility of test instructions; time-efficiency and convenience of testing; non-invasiveness of the test; and fear of incorrect results. Overall, 96.5% of the participants recommended that the OraQuick® test kits should become publicly available. Conclusions Self-testing for HIV status using an oral fluid-based rapid test under the supervision of a community health worker was acceptable and feasible among pregnant women in rural India. Participants were supportive of making self-testing publicly available. Policy guidelines and implementation research are required to advance HIV self-testing for larger populations at scale. PMID:27630096

  11. Dynamic Pathfinders: Leveraging Your OPAC to Create Resource Guides

    ERIC Educational Resources Information Center

    Hunter, Ben

    2008-01-01

    Library pathfinders are a time-tested method of leading library users to important resources. However, paper-based pathfinders suffer from space limitations, and both paper-based and Web-based pathfinders require frequent updates to keep up with new library acquisitions. This article details a step-by-step method to create an online dynamic…

  12. An automated smartphone-based diagnostic assay for point-of-care semen analysis

    PubMed Central

    Kanakasabapathy, Manoj Kumar; Sadasivam, Magesh; Singh, Anupriya; Preston, Collin; Thirumalaraju, Prudhvi; Venkataraman, Maanasa; Bormann, Charles L.; Draz, Mohamed Shehata; Petrozza, John C.; Shafiee, Hadi

    2017-01-01

    Male infertility affects up to 12% of the world’s male population and is linked to various environmental and medical conditions. Manual microscope-based testing and computer-assisted semen analysis (CASA) are the current standard methods to diagnose male infertility; however, these methods are labor-intensive, expensive, and laboratory-based. Cultural and socially dominated stigma against male infertility testing hinders a large number of men from getting tested for infertility, especially in resource-limited African countries. We describe the development and clinical testing of an automated smartphone-based semen analyzer designed for quantitative measurement of sperm concentration and motility for point-of-care male infertility screening. Using a total of 350 clinical semen specimens at a fertility clinic, we have shown that our assay can analyze an unwashed, unprocessed liquefied semen sample with <5-s mean processing time and provide the user a semen quality evaluation based on the World Health Organization (WHO) guidelines with ~98% accuracy. The work suggests that the integration of microfluidics, optical sensing accessories, and advances in consumer electronics, particularly smartphone capabilities, can make remote semen quality testing accessible to people in both developed and developing countries who have access to smartphones. PMID:28330865

  13. Monte Carlo based statistical power analysis for mediation models: methods and software.

    PubMed

    Zhang, Zhiyong

    2014-12-01

    The existing literature on statistical power analysis for mediation models often assumes data normality and is based on a less powerful Sobel test instead of the more powerful bootstrap test. This study proposes to estimate statistical power to detect mediation effects on the basis of the bootstrap method through Monte Carlo simulation. Nonnormal data with excessive skewness and kurtosis are allowed in the proposed method. A free R package called bmem is developed to conduct the power analysis discussed in this study. Four examples, including a simple mediation model, a multiple-mediator model with a latent mediator, a multiple-group mediation model, and a longitudinal mediation model, are provided to illustrate the proposed method.
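
    The bmem package is written in R; the sketch below mirrors the same Monte Carlo logic in Python under simplifying assumptions (normal errors, a single mediator, percentile bootstrap CIs). Power is estimated as the proportion of simulated datasets whose bootstrap CI for the indirect effect a*b excludes zero; sample sizes, path values, and replication counts are illustrative and kept small for speed.

```python
import numpy as np

rng = np.random.default_rng(0)

def indirect_effect(x, m, y):
    a = np.polyfit(x, m, 1)[0]                        # M regressed on X
    X = np.column_stack([np.ones_like(x), m, x])      # Y on M and X
    b = np.linalg.lstsq(X, y, rcond=None)[0][1]
    return a * b

def mediation_power(n=100, a=0.39, b=0.39, n_rep=200, n_boot=500, alpha=0.05):
    hits = 0
    for _ in range(n_rep):
        x = rng.standard_normal(n)
        m = a * x + rng.standard_normal(n)   # nonnormal errors (skewed,
        y = b * m + rng.standard_normal(n)   # heavy-tailed) could go here
        boots = np.empty(n_boot)
        for k in range(n_boot):              # percentile bootstrap of a*b
            idx = rng.integers(0, n, n)
            boots[k] = indirect_effect(x[idx], m[idx], y[idx])
        lo, hi = np.percentile(boots, [100 * alpha / 2, 100 * (1 - alpha / 2)])
        hits += (lo > 0) or (hi < 0)         # CI excludes zero -> detected
    return hits / n_rep

print(mediation_power())   # roughly 0.8 for medium paths at n = 100
```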

  14. Android malware detection based on evolutionary super-network

    NASA Astrophysics Data System (ADS)

    Yan, Haisheng; Peng, Lingling

    2018-04-01

    In this paper, an Android malware detection method based on an evolutionary super-network is proposed in order to improve the precision of Android malware detection. The chi-square statistic is used to select features based on an analysis of Android permissions, and Boolean weighting is used to calculate feature weights. The processed feature vectors serve as the training and test sets; a hyperedge replacement strategy is used to train the super-network classification model, which then classifies the test-set feature vectors, and the result is compared with traditional classification algorithms. The results show that the proposed detection method is close to or better than traditional classification algorithms, making it an effective means of Android malware detection.
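
    The chi-square feature-selection step described above can be sketched with scikit-learn; the permission matrix and labels below are simulated, and the evolutionary super-network classifier itself is not reproduced.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, chi2

# Simulated app/permission matrix: rows = apps, columns = binary
# permission flags; a handful of permissions are enriched in malware.
rng = np.random.default_rng(1)
n_apps, n_perms = 200, 50
X = rng.integers(0, 2, size=(n_apps, n_perms))      # permission features
y = rng.integers(0, 2, size=n_apps)                 # 1 = malware (invented)
X[y == 1, :5] |= rng.integers(0, 2, size=(int(y.sum()), 5))  # enrich 5 perms

# Chi-square scoring against the labels, keeping the top-k features.
selector = SelectKBest(chi2, k=10).fit(X, y)
print("selected permission indices:",
      np.sort(selector.get_support(indices=True)))
```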

  15. Investigation of laboratory test procedures for assessing the structural capacity of geogrid-reinforced aggregate base materials.

    DOT National Transportation Integrated Search

    2015-04-01

    The objective of this research was to identify a laboratory test method that can be used to quantify improvements in structural capacity of aggregate base materials reinforced with geogrid. For this research, National Cooperative Highway Research Pro...

  16. Mass detection, localization and estimation for wind turbine blades based on statistical pattern recognition

    NASA Astrophysics Data System (ADS)

    Colone, L.; Hovgaard, M. K.; Glavind, L.; Brincker, R.

    2018-07-01

    A method for mass change detection on wind turbine blades using natural frequencies is presented. The approach is based on two statistical tests. The first test decides if there is a significant mass change and the second test is a statistical group classification based on Linear Discriminant Analysis. The frequencies are identified by means of Operational Modal Analysis using natural excitation. Based on the assumption of Gaussianity of the frequencies, a multi-class statistical model is developed by combining finite element model sensitivities in 10 classes of change location on the blade, the smallest area being 1/5 of the span. The method is experimentally validated for a full scale wind turbine blade in a test setup and loaded by natural wind. Mass change from natural causes was imitated with sand bags and the algorithm was observed to perform well with an experimental detection rate of 1, localization rate of 0.88 and mass estimation rate of 0.72.
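
    A two-stage sketch in the spirit of the approach, with all data simulated: a Mahalanobis-type global test flags a significant frequency shift, and LDA then assigns the shift to one of several location classes built from (here, randomly invented) sensitivity patterns. This illustrates the two-test structure, not the authors' finite-element-based model.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(2)
n_modes = 6

# Stage 1: detection via a Mahalanobis distance on frequency residuals.
baseline = rng.normal(0.0, 0.01, size=(200, n_modes))     # healthy state
mu, cov = baseline.mean(axis=0), np.cov(baseline.T)
patterns = rng.normal(0.0, 0.03, size=(10, n_modes))      # invented class means
new_obs = patterns[3] + rng.normal(0.0, 0.01, n_modes)    # mass added, region 3
d2 = (new_obs - mu) @ np.linalg.solve(cov, new_obs - mu)
print("Mahalanobis d^2 =", round(float(d2), 1),
      "(compare with a chi^2_6 quantile to declare a change)")

# Stage 2: localization via LDA over the 10 location classes.
classes = rng.integers(0, 10, size=500)
train = patterns[classes] + rng.normal(0.0, 0.01, size=(500, n_modes))
lda = LinearDiscriminantAnalysis().fit(train, classes)
print("predicted region:", lda.predict(new_obs[None, :])[0])   # likely 3
```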

  17. Impact of screening colonoscopy on outcomes in colorectal cancer.

    PubMed

    Matsuda, Takahisa; Ono, Akiko; Kakugawa, Yasuo; Matsumoto, Minori; Saito, Yutaka

    2015-10-01

    Colorectal cancer is one of the most common cancers in both men and women worldwide and a good candidate for screening programs. There are two modalities of colorectal cancer screening: (i) population-based screening and (ii) opportunistic screening. The first one is based on organized, well-coordinated, monitored and established programs with a systematic invitation covering the entire target population. In contrast, opportunistic screening tests are offered to people who are being examined for other reasons. Recently, a variety of colorectal cancer screening tests have become available; each country should make a choice, based on national demographics and resources, on the screening method to be used. Fecal occult blood test, especially the fecal immunochemical test, would be the best modality for decreasing colorectal cancer mortality through population-based screening. In contrast, if the aim includes the early detection of colorectal cancer and adenomas, endoscopic methods are more appropriate. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  18. A Statistical Analysis of Activity-Based and Traditional Introductory Algebra Physics Using the Force and Motion Conceptual Evaluation

    NASA Astrophysics Data System (ADS)

    Trecia Markes, Cecelia

    2006-03-01

    With a three-year FIPSE grant, it has been possible at the University of Nebraska at Kearney (UNK) to develop and implement activity-based introductory physics at the algebra level. It has generally been recognized that students enter physics classes with misconceptions about motion and force. Many of these misconceptions persist after instruction. Pretest and posttest responses on the "Force and Motion Conceptual Evaluation" (FMCE) are analyzed to determine the effectiveness of the activity-based method of instruction relative to the traditional (lecture/lab) method of instruction. Data were analyzed to determine the following: student understanding at the beginning of the course, student understanding at the end of the course, how student understanding is related to the type of class taken, and student understanding based on gender and type of class. Some of the tests used are the t-test, the chi-squared test, and analysis of variance. The results of these tests will be presented, and their implications will be discussed.

  19. The Effectiveness of the Creative Writing Instruction Program Based on Speaking Activities (CWIPSA)

    ERIC Educational Resources Information Center

    Bayat, Seher

    2016-01-01

    This study aims to develop a creative writing instruction program based on speaking activities and to investigate its effect on fourth-grade primary school students' creative writing achievements and writing attitudes. The experimental method based on the pre-test/post-test model was used in this research. The research was conducted with 42…

  20. Pilot-Testing CATCH Early Childhood: A Preschool-Based Healthy Nutrition and Physical Activity Program

    ERIC Educational Resources Information Center

    Sharma, Shreela; Chuang, Ru-Jye; Hedberg, Ann Marie

    2011-01-01

    Background: The literature on theoretically-based programs targeting healthy nutrition and physical activity in preschools is scarce. Purpose: To pilot test CATCH Early Childhood (CEC), a preschool-based nutrition and physical activity program among children ages three to five in Head Start. Methods: The study was conducted in two Head Start…

  1. Testing large aspheric surfaces with complementary annular subaperture interferometric method

    NASA Astrophysics Data System (ADS)

    Hou, Xi; Wu, Fan; Lei, Baiping; Fan, Bin; Chen, Qiang

    2008-07-01

    The annular subaperture interferometric method provides an alternative, low-cost, and flexible solution for testing rotationally symmetric aspheric surfaces. However, new challenges, particularly in the motion and algorithm components, appear when the method is applied to large aspheric surfaces with large departure in practical engineering. Based on our previously reported annular subaperture reconstruction algorithm, which uses Zernike annular polynomials and a matrix method, and on experimental results for an approximately 130-mm-diameter f/2 parabolic mirror, an experimental investigation testing an approximately 302-mm-diameter f/1.7 parabolic mirror with the complementary annular subaperture interferometric method is presented. We focus on full-aperture reconstruction accuracy and discuss some error effects and limitations of testing larger aspheric surfaces with the annular subaperture method. Some considerations on testing sector segments with complementary sector subapertures are also provided.

  2. Practicing Accounting Profession Criterial Skills in the Classroom: A Study of Collaborative Testing and the Impact on Final Exam Scores

    ERIC Educational Resources Information Center

    VanderLaan, Ski R.

    2010-01-01

    This mixed methods study (Creswell, 2008) was designed to test the influence of collaborative testing on learning using a quasi-experimental approach. This study used a modified embedded mixed method design in which the qualitative and quantitative data, associated with the secondary questions, provided a supportive role in a study based primarily…

  3. 10 CFR Appendix W to Subpart B of... - Uniform Test Method for Measuring the Energy Consumption of Medium Base Compact Fluorescent Lamps

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... of Medium Base Compact Fluorescent Lamps W Appendix W to Subpart B of Part 430 Energy DEPARTMENT OF... Consumption of Medium Base Compact Fluorescent Lamps 1. Scope: This appendix covers the test requirements used... rated life, rapid cycle stress, and lamp life of medium base compact fluorescent lamps. 2. Definitions...

  4. 10 CFR Appendix W to Subpart B of... - Uniform Test Method for Measuring the Energy Consumption of Medium Base Compact Fluorescent Lamps

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... of Medium Base Compact Fluorescent Lamps W Appendix W to Subpart B of Part 430 Energy DEPARTMENT OF... Consumption of Medium Base Compact Fluorescent Lamps 1. Scope: This appendix covers the test requirements used... rated life, rapid cycle stress, and lamp life of medium base compact fluorescent lamps. 2. Definitions...

  5. 10 CFR Appendix W to Subpart B of... - Uniform Test Method for Measuring the Energy Consumption of Medium Base Compact Fluorescent Lamps

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... of Medium Base Compact Fluorescent Lamps W Appendix W to Subpart B of Part 430 Energy DEPARTMENT OF... Consumption of Medium Base Compact Fluorescent Lamps 1. Scope: This appendix covers the test requirements used... rated life, rapid cycle stress, and lamp life of medium base compact fluorescent lamps. 2. Definitions...

  6. 10 CFR Appendix W to Subpart B of... - Uniform Test Method for Measuring the Energy Consumption of Medium Base Compact Fluorescent Lamps

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... of Medium Base Compact Fluorescent Lamps W Appendix W to Subpart B of Part 430 Energy DEPARTMENT OF... Consumption of Medium Base Compact Fluorescent Lamps 1. Scope: This appendix covers the test requirements used... rated life, rapid cycle stress, and lamp life of medium base compact fluorescent lamps. 2. Definitions...

  7. The t-CWT: a new ERP detection and quantification method based on the continuous wavelet transform and Student's t-statistics.

    PubMed

    Bostanov, Vladimir; Kotchoubey, Boris

    2006-12-01

    This study was aimed at developing a method for the extraction and assessment of event-related brain potentials (ERPs) from single trials. The method should be applicable to the assessment of single persons' ERPs and should be able to handle both single ERP components and whole waveforms. We adopted a recently developed ERP feature extraction method, the t-CWT, for the purposes of hypothesis testing in the statistical assessment of ERPs. The t-CWT is based on the continuous wavelet transform (CWT) and Student's t-statistics. The method was tested in two ERP paradigms, oddball and semantic priming, by assessing individual-participant data on a single-trial basis and testing the significance of selected ERP components, P300 and N400, as well as of whole ERP waveforms. The t-CWT was also compared to other univariate and multivariate ERP assessment methods: peak picking, area computation, discrete wavelet transform (DWT) and principal component analysis (PCA). The t-CWT produced better results than all of the other assessment methods it was compared with. The t-CWT can be used as a reliable and powerful method for ERP-component detection and for testing statistical hypotheses concerning both single ERP components and whole waveforms extracted from either single persons' or group data. The t-CWT is the first such method based explicitly on the criterion of maximal statistical difference between two average ERPs in the time-frequency domain, and it is particularly suitable for ERP assessment of individual data (e.g. in clinical settings), but also for the investigation of small and/or novel ERP effects in group data.
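
    A minimal sketch of the t-CWT idea, assuming a Ricker ("Mexican hat") wavelet and a plain two-sample t-map: transform each trial to the time-scale domain, compute Student's t between conditions at every point, and take the extremum as the feature. The wavelet choice, the convolution-based transform, and the synthetic "P300"-like data are assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.stats import ttest_ind

def ricker(points, a):
    """Ricker (Mexican hat) wavelet with width parameter a."""
    t = np.arange(points) - (points - 1) / 2.0
    amp = 2 / (np.sqrt(3 * a) * np.pi ** 0.25)
    return amp * (1 - (t / a) ** 2) * np.exp(-t ** 2 / (2 * a ** 2))

def cwt(signal, widths):
    """Continuous wavelet transform by direct convolution (one row per scale)."""
    return np.array([np.convolve(signal, ricker(min(10 * w, len(signal)), w),
                                 mode="same") for w in widths])

rng = np.random.default_rng(3)
n_trials, n_samp = 40, 256
widths = np.arange(2, 20, 2)
erp = np.exp(-((np.arange(n_samp) - 150) ** 2) / 200.0)   # "P300"-like bump
cond_a = rng.standard_normal((n_trials, n_samp)) + 2 * erp
cond_b = rng.standard_normal((n_trials, n_samp))          # no component

tf_a = np.array([cwt(x, widths) for x in cond_a])   # trials x scales x time
tf_b = np.array([cwt(x, widths) for x in cond_b])
t_map, _ = ttest_ind(tf_a, tf_b, axis=0)            # pointwise t-statistics
scale_ix, time_ix = np.unravel_index(np.abs(t_map).argmax(), t_map.shape)
print("max |t| at scale", widths[scale_ix], "sample", time_ix)
```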

  8. Molecular diffusion of stable water isotopes in polar firn as a proxy for past temperatures

    NASA Astrophysics Data System (ADS)

    Holme, Christian; Gkinis, Vasileios; Vinther, Bo M.

    2018-03-01

    Polar precipitation archived in ice caps contains information on past temperature conditions. Such information can be retrieved by measuring the water isotopic signals of δ18O and δD in ice cores. These signals have been attenuated during densification due to molecular diffusion in the firn column, where the magnitude of the diffusion is isotopologue specific and temperature dependent. By utilizing the differential diffusion signal, dual isotope measurements of δ18O and δD enable multiple temperature reconstruction techniques. This study assesses how well six different methods reconstruct past surface temperatures from the diffusion-based temperature proxies. Two of the methods are based on the single diffusion lengths of δ18O and δD, three employ the differential diffusion signal, while the last uses the ratio between the single diffusion lengths. All techniques are tested on synthetic data in order to evaluate their accuracy and precision. We perform a benchmark test on thirteen high-resolution Holocene data sets from Greenland and Antarctica, which represent a broad range of mean annual surface temperatures and accumulation rates. Based on the benchmark test, we comment on the accuracy and precision of the methods. Both the benchmark test and the synthetic data test demonstrate that the most precise reconstructions are obtained when using the single isotope diffusion lengths, with precisions of approximately 1.0 °C. In the benchmark test, the single isotope diffusion lengths are also found to reconstruct consistent temperatures, with a root mean square deviation of 0.7 °C. The techniques employing the differential diffusion signals are more uncertain; the most precise of them has a precision of 1.9 °C. The diffusion length ratio method is the least precise, with a precision of 13.7 °C. The absolute temperature estimates from this method are also shown to be highly sensitive to the choice of fractionation factor parameterization.
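
    The standard firn-diffusion relations assumed in this line of work (Johnsen-type models) can be summarized as follows; the paper's own notation and parameterizations may differ. The initial isotope profile is smoothed by convolution with a Gaussian whose standard deviation is the diffusion length σ, so in the spectral domain the power spectrum is attenuated exponentially, and the dual-isotope methods exploit the temperature dependence of the difference between the two diffusion lengths.

```latex
% Gaussian smoothing of the initial profile (diffusion length \sigma):
\delta_{\mathrm{meas}}(z) \;=\; \delta_{\mathrm{init}}(z) \ast
  \frac{1}{\sigma\sqrt{2\pi}}\, e^{-z^{2}/(2\sigma^{2})}
% Equivalent attenuation of the power spectrum:
P(k) \;=\; P_{0}(k)\, e^{-k^{2}\sigma^{2}}
% Differential diffusion signal used by the dual-isotope methods:
\Delta\sigma^{2} \;=\; \sigma_{\mathrm{D}}^{2} - \sigma_{18}^{2}
```

    Because σ for each isotopologue depends on firn temperature through the fractionation factors and diffusivities, Δσ² (and the single diffusion lengths themselves) can serve as temperature proxies.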

  9. Instance Analysis for the Error of Three-pivot Pressure Transducer Static Balancing Method for Hydraulic Turbine Runner

    NASA Astrophysics Data System (ADS)

    Weng, Hanli; Li, Youping

    2017-04-01

    The working principle, process device and test procedure of runner static balancing test method by weighting with three-pivot pressure transducers are introduced in this paper. Based on an actual instance of a V hydraulic turbine runner, the error and sensitivity of the three-pivot pressure transducer static balancing method are analysed. Suggestions about improving the accuracy and the application of the method are also proposed.
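
    The statics behind a three-pivot weighing test can be sketched directly: with the runner resting on three pivots, moment balance gives the centre-of-gravity offset from the three measured forces, and from it the unbalance to correct. The sketch below is illustrative only; the radius, forces, and correction layout are assumptions, not the paper's setup.

```python
import numpy as np

# Three pivots 120 degrees apart at radius r; vertical transducer forces
# F_i. Moment balance about the axes gives the CG offset:
#   W * x_cg = sum(F_i * x_i),  W * y_cg = sum(F_i * y_i)
r = 1.5                                    # pivot radius, m (assumed)
angles = np.deg2rad([0.0, 120.0, 240.0])   # pivot angular positions
F = np.array([10150.0, 9980.0, 9870.0])    # measured forces, N (hypothetical)

W = F.sum()                                # total weight on the pivots
x_cg = np.sum(F * r * np.cos(angles)) / W  # CG offset components
y_cg = np.sum(F * r * np.sin(angles)) / W
e = np.hypot(x_cg, y_cg)                   # eccentricity of the CG
phase = np.degrees(np.arctan2(y_cg, x_cg))
print(f"unbalance moment: {W * e:.1f} N*m at {phase:.1f} deg;")
print(f"correction mass at radius R_c: m = {W * e:.1f} / (9.81 * R_c) kg, "
      "placed opposite the phase angle")
```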

  10. Use of FTA gene guard filter paper for the storage and transportation of tumor cells for molecular testing.

    PubMed

    Dobbs, Larry J; Madigan, Merle N; Carter, Alexis B; Earls, Lori

    2002-01-01

    Efficient methods of storing tumor specimens for molecular testing are needed in the modern surgical pathology laboratory. The FTA Gene Guard system is a novel method for the collection and room-temperature storage of blood samples for DNA testing. The method uses index-card-sized filter papers that provide an ideal medium on which to store tumor specimens for DNA testing. Objective: To determine whether FTA filter paper can be used in the surgical pathology laboratory to store tumor cells for DNA testing. Methods: Cell suspensions were prepared from 60 surgical specimens, and DNA was extracted either immediately or after storage on FTA paper. The DNA extracted by each method was tested by polymerase chain reaction (PCR) for the beta-globin and interferon gamma genes, and the results were compared. Fifteen lymph node specimens stored on FTA paper were then tested for immunoglobulin heavy chain (IgH) gene rearrangement by PCR, and these results were compared with those obtained for immediately extracted DNA. Setting: University medical center. Results: The DNA extracted from cells stored on FTA paper performed as well in the PCR as the freshly extracted DNA in nearly all cases (>95%). The results of tests for IgH gene rearrangements showed 100% concordance between the 2 methods of DNA extraction. Conclusions: Cells from surgical specimens can be stored on FTA paper for extended lengths of time, and DNA can be extracted from these cells for PCR-based testing. FTA filter paper is a reliable medium for the storage and/or transport of tumor cells for PCR-based DNA analysis.

  11. The importance of aerodynamics for prosthetic limb design used by competitive cyclists with an amputation: An introduction.

    PubMed

    Dyer, Bryce

    2015-06-01

    This study introduces the importance of aerodynamics in prosthetic limb design for athletes with either a lower-limb or upper-limb amputation. The study comprises two elements: 1) an initial experiment investigating the stability of outdoor velodrome-based field tests, and 2) an experiment evaluating the application of outdoor velodrome aerodynamic field tests to detect small-scale changes in aerodynamic drag arising from prosthetic limb componentry changes. An outdoor field-testing method is used to detect small and repeatable changes in the aerodynamic drag of an able-bodied cyclist. These changes were made at levels typical of alterations in prosthetic componentry. The field-based test method is applied at a smaller level of resolution than previously reported. With a carefully applied protocol, the field test method proved statistically stable. The results of the field test experiments demonstrate a noticeable change in overall athlete performance. Aerodynamic refinement of artificial limbs is worthwhile for athletes looking to maximise their competitive performance. The field-testing method illustrates the importance of the aerodynamic optimisation of prosthetic limb components, and the protocol undertaken in this study gives prosthetists and sports engineers an accessible and affordable means of doing so. Using simple and accessible field-testing methods, this exploratory experiment demonstrates how small changes to a rider's equipment, commensurate with the scale of a small change in prosthetic componentry, can affect the performance of an athlete. Prosthetists should consider such opportunities for performance enhancement when possible. © The International Society for Prosthetics and Orthotics 2014.

  12. USING PARTIAL LEAST SQUARES REGRESSION TO OBTAIN COTTON FIBER LENGTH DISTRIBUTIONS FROM THE BEARD TESTING METHOD

    USDA-ARS?s Scientific Manuscript database

    The beard testing method for measuring cotton fiber length is based on the fibrogram theory. However, in the instrumental implementations, the engineering complexity alters the original fiber length distribution observed by the instrument. This causes challenges in obtaining the entire original le...
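
    A hedged sketch of the manuscript's idea: partial least squares regression mapping the instrument-observed beard-test curve to an underlying length distribution. Both X (observed curves) and Y (length distributions) below are simulated low-rank data standing in for real measurements.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# Simulated stand-ins: X = fibrogram-like curves observed by the
# instrument, Y = underlying fiber length distributions; both driven by a
# few shared latent factors, as PLS assumes.
rng = np.random.default_rng(4)
n_samples, n_x, n_y = 120, 40, 25
latent = rng.standard_normal((n_samples, 3))
X = latent @ rng.standard_normal((3, n_x)) + 0.05 * rng.standard_normal((n_samples, n_x))
Y = latent @ rng.standard_normal((3, n_y)) + 0.05 * rng.standard_normal((n_samples, n_y))

# Fit on 100 samples, evaluate multi-output R^2 on the held-out 20.
pls = PLSRegression(n_components=3).fit(X[:100], Y[:100])
print("held-out R^2:", round(pls.score(X[100:], Y[100:]), 3))
```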

  13. Constitutive Model Calibration via Autonomous Multiaxial Experimentation (Postprint)

    DTIC Science & Technology

    2016-09-17

    ... test machine. Experimental data are reduced and finite element simulations are conducted in parallel with the test based on experimental strain conditions. Optimization methods ... be used directly in finite element simulations of more complex geometries. Keywords: Axial/torsional experimentation • Plasticity • Constitutive model

  14. Person Fit Based on Statistical Process Control in an Adaptive Testing Environment. Research Report 98-13.

    ERIC Educational Resources Information Center

    van Krimpen-Stoop, Edith M. L. A.; Meijer, Rob R.

    Person-fit research in the context of paper-and-pencil tests is reviewed, and some specific problems regarding person fit in the context of computerized adaptive testing (CAT) are discussed. Some new methods are proposed to investigate person fit in a CAT environment. These statistics are based on Statistical Process Control (SPC) theory. A…
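
    A sketch of an SPC-style CUSUM person-fit check of the kind this line of work proposes (see the report for the actual statistics); the reference value k, threshold h, and item probabilities below are illustrative assumptions.

```python
import numpy as np

# Track one-sided CUSUMs of standardized item residuals (observed 0/1
# response minus the model-predicted probability of a correct answer);
# an excursion past +/-h flags potentially aberrant responding.

def cusum_person_fit(responses, p_correct, k=0.5, h=3.0):
    resid = (responses - p_correct) / np.sqrt(p_correct * (1 - p_correct))
    c_plus = c_minus = 0.0
    for t in resid:
        c_plus = max(0.0, c_plus + t - k)
        c_minus = min(0.0, c_minus + t + k)
        if c_plus > h or c_minus < -h:
            return True          # flag: response pattern misfits the model
    return False

# Hypothetical CAT sequence: probabilities adapt toward 0.5, but the
# examinee keeps answering incorrectly after the first two items.
p = np.array([0.80, 0.70, 0.60, 0.55, 0.50, 0.50, 0.45, 0.50])
u = np.array([1, 1, 0, 0, 0, 0, 0, 0])
print(cusum_person_fit(u, p))    # True -> flagged
```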

  15. Direct structural parameter identification by modal test results

    NASA Technical Reports Server (NTRS)

    Chen, J.-C.; Kuo, C.-P.; Garba, J. A.

    1983-01-01

    A direct identification procedure is proposed to obtain the mass and stiffness matrices based on test-measured eigenvalues and eigenvectors. The method is based on the theory of matrix perturbation, in which the correct mass and stiffness matrices are expanded in terms of the analytical values plus a modification matrix. The simplicity of the procedure enables real-time operation during structural testing.
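
    A minimal numerical sketch of the idea, with all details assumed rather than taken from the paper: parameterize the modification matrices as known substructure patterns with unknown coefficients, and solve the linear system given by first-order eigenvalue perturbation, dλ_i = φ_iᵀ dK φ_i − λ_i φ_iᵀ dM φ_i, using the test-measured eigenvalue shifts.

```python
import numpy as np
from scipy.linalg import eigh

# Analytical model (toy 4-DOF system) and its eigenpairs.
n = 4
K = np.diag([4.0, 3.0, 2.0, 1.0])          # analytical stiffness
M = np.eye(n)                               # analytical mass
lam, Phi = eigh(K, M)                       # generalized eigenproblem

# Assumed correction bases: dK = a*K1, dM = b*M1 with unknown a, b.
K1 = np.diag([1.0, 0.0, 0.0, 0.0])
M1 = np.diag([0.0, 0.0, 0.0, 1.0])
a_true, b_true = 0.20, 0.10
lam_test, _ = eigh(K + a_true * K1, M + b_true * M1)   # "measured" values

# First-order perturbation system in the unknowns (a, b).
g_k = np.diag(Phi.T @ K1 @ Phi)
g_m = np.diag(Phi.T @ M1 @ Phi)
A = np.column_stack([g_k, -lam * g_m])
coef, *_ = np.linalg.lstsq(A, lam_test - lam, rcond=None)
print("identified (a, b):", np.round(coef, 3),
      "vs true:", (a_true, b_true))   # first-order estimates
```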

  16. Effects of the Sense-Based Science Education Program on Scientific Process Skills of Children Aged 60-66 Months

    ERIC Educational Resources Information Center

    Tekerci, Hacer; Kandir, Adalet

    2017-01-01

    Purpose: This study aimed to examine the effects of the Sense-Based Science Education Program on the scientific process skills of children aged 60-66 months. Research Methods: In this experimental study, a pre-test/post-test/observation-test control-group experimental design and qualitative research methods were used.…

  17. Gene toxicity studies on titanium dioxide and zinc oxide nanomaterials used for UV-protection in cosmetic formulations.

    PubMed

    Landsiedel, Robert; Ma-Hock, Lan; Van Ravenzwaay, Ben; Schulz, Markus; Wiench, Karin; Champ, Samantha; Schulte, Stefan; Wohlleben, Wendel; Oesch, Franz

    2010-12-01

    Titanium dioxide and zinc oxide nanomaterials, used as UV-protecting agents in sunscreens, were investigated for their potential genotoxicity in in vitro and in vivo test systems. Since standard OECD test methods are designed for soluble materials and genotoxicity testing for nanomaterials is still under revision, a battery of standard tests covering different endpoints was used. Additionally, a procedure to disperse the nanomaterials in the test media, together with careful characterization of the dispersed test item, was added to the testing methods. No genotoxicity was observed in vitro (Ames Salmonella gene mutation test and V79 micronucleus chromosome mutation test) or in vivo (mouse bone marrow micronucleus test and Comet DNA damage assay in lung cells from rats exposed by inhalation). These results add to the still limited database of genotoxicity test results for nanomaterials and provide congruent results from a battery of standard OECD test methods applied to nanomaterials.

  18. Computer game-based and traditional learning method: a comparison regarding students' knowledge retention.

    PubMed

    Rondon, Silmara; Sassi, Fernanda Chiarion; Furquim de Andrade, Claudia Regina

    2013-02-25

    Educational computer games are examples of computer-assisted learning objects, representing an educational strategy of growing interest. Given the changes in the digital world over the last decades, students of the current generation expect technology to be used in advancing their learning, requiring a shift from traditional passive learning methodologies to an active, multisensory, experimental learning methodology. The objective of this study was to compare a computer game-based learning method with a traditional learning method, regarding learning gains and knowledge retention, as a means of teaching head and neck Anatomy and Physiology to Speech-Language and Hearing pathology undergraduate students. Students were randomized to one of the two learning methods, and the data analyst was blinded to which method each student had received. Students' prior knowledge (i.e., before undergoing the learning method), short-term knowledge retention, and long-term knowledge retention (i.e., six months after undergoing the learning method) were assessed with a multiple-choice questionnaire. Students' performance was compared across the three assessments, both for the mean total score and for separate mean scores for the Anatomy questions and the Physiology questions. Students who received the game-based method performed better in the post-test assessment only for the Anatomy questions. Students who received the traditional lecture performed better in both the post-test and the long-term post-test for the Anatomy and Physiology questions. The game-based learning method is comparable to the traditional learning method in general and in short-term gains, while the traditional lecture still seems to be more effective at improving students' short- and long-term knowledge retention.

  19. Coordinate-Based Clustering Method for Indoor Fingerprinting Localization in Dense Cluttered Environments.

    PubMed

    Liu, Wen; Fu, Xiao; Deng, Zhongliang

    2016-12-02

    Indoor positioning technologies have boomed recently because of the growing commercial interest in indoor location-based services (ILBS). Given the absence of satellite signals from Global Navigation Satellite Systems (GNSS) indoors, various technologies have been proposed for indoor applications. Among them, Wi-Fi fingerprinting has attracted much interest from researchers because of its pervasive deployment, flexibility and robustness to dense cluttered indoor environments. One challenge, however, is the deployment of Access Points (APs), which has a significant influence on positioning accuracy. This paper concentrates on WLAN-based fingerprinting indoor localization by analyzing the influence of AP deployment and studying the advantages of coordinate-based clustering compared to traditional RSS-based clustering. A coordinate-based clustering method for indoor fingerprinting localization, named Smallest-Enclosing-Circle-based (SEC), is then proposed, aimed at reducing the positioning error introduced by AP deployment and improving robustness to dense cluttered environments. All measurements were conducted in indoor public areas, the National Center for the Performing Arts (Test-bed 1) and the XiDan Joy City (Floors 1 and 2, Test-bed 2), and results show that the SEC clustering algorithm improves positioning accuracy by about 32.7% for Test-bed 1, 71.7% for Test-bed 2 Floor 1 and 73.7% for Test-bed 2 Floor 2 compared with traditional RSS-based clustering algorithms such as K-means.

  20. Coordinate-Based Clustering Method for Indoor Fingerprinting Localization in Dense Cluttered Environments

    PubMed Central

    Liu, Wen; Fu, Xiao; Deng, Zhongliang

    2016-01-01

    Indoor positioning technologies have boomed recently because of the growing commercial interest in indoor location-based services (ILBS). Given the absence of satellite signals from Global Navigation Satellite Systems (GNSS) indoors, various technologies have been proposed for indoor applications. Among them, Wi-Fi fingerprinting has attracted much interest from researchers because of its pervasive deployment, flexibility and robustness to dense cluttered indoor environments. One challenge, however, is the deployment of Access Points (APs), which has a significant influence on positioning accuracy. This paper concentrates on WLAN-based fingerprinting indoor localization by analyzing the influence of AP deployment and studying the advantages of coordinate-based clustering compared to traditional RSS-based clustering. A coordinate-based clustering method for indoor fingerprinting localization, named Smallest-Enclosing-Circle-based (SEC), is then proposed, aimed at reducing the positioning error introduced by AP deployment and improving robustness to dense cluttered environments. All measurements were conducted in indoor public areas, the National Center for the Performing Arts (Test-bed 1) and the XiDan Joy City (Floors 1 and 2, Test-bed 2), and results show that the SEC clustering algorithm improves positioning accuracy by about 32.7% for Test-bed 1, 71.7% for Test-bed 2 Floor 1 and 73.7% for Test-bed 2 Floor 2 compared with traditional RSS-based clustering algorithms such as K-means. PMID:27918454
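
    The record above describes the SEC idea only at a high level, so the sketch below illustrates the general coordinate-based clustering workflow it contrasts with RSS-based clustering: the offline radio map is clustered by the reference points' physical coordinates (k-means stands in for the paper's smallest-enclosing-circle construction, which is not reproduced here), and an online query is matched coarse-to-fine. All data, cluster counts and parameters are illustrative assumptions.

      # Coordinate-based clustering for Wi-Fi fingerprinting (illustrative sketch).
      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(0)
      n_ref, n_ap = 200, 8
      coords = rng.uniform(0, 50, size=(n_ref, 2))        # surveyed positions (m)
      ap_pos = rng.uniform(0, 50, size=(n_ap, 2))         # hypothetical AP layout
      dist = np.linalg.norm(coords[:, None, :] - ap_pos, axis=2)
      rss = -40 - 20 * np.log10(dist + 1) + rng.normal(0, 2, (n_ref, n_ap))

      # Offline phase: cluster the radio map by *coordinates*, not by RSS.
      km = KMeans(n_clusters=10, n_init=10, random_state=0).fit(coords)

      def locate(query_rss):
          # Coarse step: pick the cluster of the best-matching fingerprint.
          cluster = km.labels_[np.argmin(np.linalg.norm(rss - query_rss, axis=1))]
          members = np.where(km.labels_ == cluster)[0]
          # Fine step: weighted k-nearest neighbours inside that cluster.
          dm = np.linalg.norm(rss[members] - query_rss, axis=1)
          order = np.argsort(dm)[:3]
          w = 1.0 / (dm[order] + 1e-9)
          return (w[:, None] * coords[members[order]]).sum(0) / w.sum()

      print(locate(rss[42]))   # should land near coords[42]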

  1. Evaluation of contents-based image retrieval methods for a database of logos on drug tablets

    NASA Astrophysics Data System (ADS)

    Geradts, Zeno J.; Hardy, Huub; Poortman, Anneke; Bijhold, Jurrien

    2001-02-01

    In this research, an evaluation was made of different methods for content-based image retrieval of logos on drug tablets. On a database of 432 illicitly produced tablets (mostly containing MDMA), we compared different retrieval methods. Two of these methods were available from the commercial packages QBIC and Imatch, in which the exact implementation of the content-based image retrieval methods is not known. We compared the results for this database with the MPEG-7 shape comparison methods, namely the contour-shape, bounding box and region-based shape methods. In addition, we tested the log polar method available from our own research.

  2. A ricin forensic profiling approach based on a complex set of biomarkers.

    PubMed

    Fredriksson, Sten-Åke; Wunschel, David S; Lindström, Susanne Wiklund; Nilsson, Calle; Wahl, Karen; Åstot, Crister

    2018-08-15

    A forensic method for the retrospective determination of preparation methods used for illicit ricin toxin production was developed. The method was based on a complex set of biomarkers, including carbohydrates, fatty acids and seed storage proteins, in combination with data on ricin and Ricinus communis agglutinin. The analyses were performed on samples prepared from four castor bean plant (R. communis) cultivars by four different sample preparation methods (PM1-PM4), ranging from simple disintegration of the castor beans to multi-step preparation methods including different protein precipitation methods. Comprehensive analytical data were collected using a range of analytical methods, and robust orthogonal partial least squares-discriminant analysis (OPLS-DA) models were constructed from the calibration set. Using a decision tree and two OPLS-DA models, the sample preparation methods of test-set samples were determined. The statistics of the two models were good, and a 100% rate of correct predictions on the test set was achieved. Copyright © 2018 Elsevier B.V. All rights reserved.

  3. A systematic methodology for creep master curve construction using the stepped isostress method (SSM): a numerical assessment

    NASA Astrophysics Data System (ADS)

    Miranda Guedes, Rui

    2018-02-01

    Long-term creep of viscoelastic materials is experimentally inferred through accelerating techniques based on the time-temperature superposition principle (TTSP) or the time-stress superposition principle (TSSP). According to these principles, a given property measured for short times at a higher temperature or higher stress level remains the same as that obtained for longer times at a lower temperature or lower stress level, except that the curves are shifted parallel to the horizontal axis, matching a master curve. These procedures enable the construction of creep master curves from short-term experimental tests. The Stepped Isostress Method (SSM) is an evolution of the classical TSSP method. The SSM technique greatly reduces the number of test specimens required to obtain the master curve, since only one specimen is necessary; the classical approach, using creep tests, demands at least one specimen per stress level to produce the set of creep curves to which TSSP is applied to obtain the master curve. This work proposes an analytical method to process the SSM raw data. The method is validated using numerical simulations that reproduce the SSM tests based on two different viscoelastic models: one represents the viscoelastic behavior of a graphite/epoxy laminate and the other an epoxy-resin-based adhesive.

  4. IGBT Switching Characteristic Curve Embedded Half-Bridge MMC Modelling and Real Time Simulation Realization

    NASA Astrophysics Data System (ADS)

    Zhengang, Lu; Hongyang, Yu; Xi, Yang

    2017-05-01

    The Modular Multilevel Converter (MMC) is one of the most attractive topologies in recent years for medium- or high-voltage industrial applications, such as high-voltage dc transmission (HVDC) and medium-voltage variable-speed motor drives. The wide adoption of MMCs in industry is mainly due to their flexible expandability, transformer-less configuration, common dc bus, high reliability from redundancy, and so on. However, as the number of submodules in an MMC grows, testing the MMC controller costs more time and effort. Hardware-in-the-loop (HIL) testing based on a real-time simulator saves much of the time and money incurred by MMC testing, and owing to its flexibility HIL has become increasingly popular in industry. The MMC modelling method remains an important issue for MMC HIL testing. Specifically, the VSC model should realistically reflect the nonlinear device switching characteristics, switching and conduction losses, tailing current, and diode reverse recovery behaviour of a realistic converter. In this paper, an IGBT switching characteristic curve embedded half-bridge MMC modelling method is proposed. The method is based on switching-curve lookup and simple circuit calculation, and it is simple to implement. Based on the proposed method, an FPGA real-time simulation is carried out with a 200 ns sample time. The real-time simulation results show that the proposed method is correct.

  5. A simple microplate-based method for the determination of α-amylase activity using the glucose assay kit (GOD method).

    PubMed

    Visvanathan, Rizliya; Jayathilake, Chathuni; Liyanage, Ruvini

    2016-11-15

    For the first time, a reliable, simple, rapid and high-throughput analytical method for the detection and quantification of α-amylase inhibitory activity using the glucose assay kit was developed. The new method facilitates rapid screening of a large number of samples; reduces labor, time and reagents; and is also suitable for kinetic studies. The method is based on the reaction of maltose with glucose oxidase (GOD) and the development of a red quinone. The test is done in microtitre plates with a total volume of 260 μL and an assay time of 40 min including the pre-incubation steps. The new method was tested for linearity, sensitivity, precision, reproducibility and applicability, and was also compared with the most commonly used 3,5-dinitrosalicylic acid (DNSA) method for determining α-amylase activity. Copyright © 2016 Elsevier Ltd. All rights reserved.
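
    Since less maltose is released when the enzyme is inhibited, activity of a test sample is usually reported as percent inhibition relative to an uninhibited control. The abstract does not give the assay's exact data-reduction formula, so the blank handling and percent-inhibition definition below are common-practice assumptions, with invented absorbance values.

      # Percent inhibition from microplate absorbances (illustrative values).
      import numpy as np

      a_control = np.array([1.21, 1.18, 1.24])   # enzyme + substrate, no inhibitor
      a_blank   = np.array([0.08, 0.07, 0.08])   # reagent blank (no enzyme)
      a_sample  = np.array([0.66, 0.70, 0.64])   # enzyme + substrate + test extract

      ctrl = a_control.mean() - a_blank.mean()
      samp = a_sample.mean() - a_blank.mean()
      print(f"alpha-amylase inhibition: {100 * (ctrl - samp) / ctrl:.1f}%")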

  6. A risk-based classification scheme for genetically modified foods. II: Graded testing.

    PubMed

    Chao, Eunice; Krewski, Daniel

    2008-12-01

    This paper presents a graded approach to the testing of crop-derived genetically modified (GM) foods based on concern levels in a proposed risk-based classification scheme (RBCS) and currently available testing methods. A graded approach offers the potential for more efficient use of testing resources by focusing less on lower concern GM foods, and more on higher concern foods. In this proposed approach to graded testing, products that are classified as Level I would have met baseline testing requirements that are comparable to what is widely applied to premarket assessment of GM foods at present. In most cases, Level I products would require no further testing, or very limited confirmatory analyses. For products classified as Level II or higher, additional testing would be required, depending on the type of the substance, prior dietary history, estimated exposure level, prior knowledge of toxicity of the substance, and the nature of the concern related to unintended changes in the modified food. Level III testing applies only to the assessment of toxic and antinutritional effects from intended changes and is tailored to the nature of the substance in question. Since appropriate test methods are not currently available for all effects of concern, future research to strengthen the testing of GM foods is discussed.

  7. Panel-based Genetic Diagnostic Testing for Inherited Eye Diseases is Highly Accurate and Reproducible and More Sensitive for Variant Detection Than Exome Sequencing

    PubMed Central

    Bujakowska, Kinga M.; Sousa, Maria E.; Fonseca-Kelly, Zoë D.; Taub, Daniel G.; Janessian, Maria; Wang, Dan Yi; Au, Elizabeth D.; Sims, Katherine B.; Sweetser, David A.; Fulton, Anne B.; Liu, Qin; Wiggs, Janey L.; Gai, Xiaowu; Pierce, Eric A.

    2015-01-01

    Purpose Next-generation sequencing (NGS)-based methods are being adopted broadly for genetic diagnostic testing, but the performance characteristics of these techniques have not been fully defined with regard to test accuracy and reproducibility. Methods We developed a targeted enrichment and NGS approach for genetic diagnostic testing of patients with inherited eye disorders, including inherited retinal degenerations (IRDs), optic atrophy and glaucoma. In preparation for providing this Genetic Eye Disease (GEDi) test on a CLIA-certified basis, we performed experiments to measure the sensitivity, specificity and reproducibility, as well as the clinical sensitivity, of the test. Results The GEDi test is highly reproducible and accurate, with sensitivity and specificity for single nucleotide variant detection of 97.9% and 100%, respectively. The sensitivity for variant detection was notably better than the 88.3% achieved by whole exome sequencing (WES) using the same metrics, due to better coverage of targeted genes in the GEDi test compared to commercially available exome capture sets. Prospective testing of 192 patients with IRDs indicated that the clinical sensitivity of the GEDi test is high, with a diagnostic rate of 51%. Conclusion The data suggest that, based on quantified performance metrics, selective targeted enrichment is preferable to WES for genetic diagnostic testing. PMID:25412400

  8. Comparison of Feature Selection Techniques in Machine Learning for Anatomical Brain MRI in Dementia.

    PubMed

    Tohka, Jussi; Moradi, Elaheh; Huttunen, Heikki

    2016-07-01

    We present a comparative split-half resampling analysis of various data driven feature selection and classification methods for the whole brain voxel-based classification analysis of anatomical magnetic resonance images. We compared support vector machines (SVMs), with or without filter based feature selection, several embedded feature selection methods and stability selection. While comparisons of the accuracy of various classification methods have been reported previously, the variability of the out-of-training sample classification accuracy and the set of selected features due to independent training and test sets have not been previously addressed in a brain imaging context. We studied two classification problems: 1) Alzheimer's disease (AD) vs. normal control (NC) and 2) mild cognitive impairment (MCI) vs. NC classification. In AD vs. NC classification, the variability in the test accuracy due to the subject sample did not vary between different methods and exceeded the variability due to different classifiers. In MCI vs. NC classification, particularly with a large training set, embedded feature selection methods outperformed SVM-based ones with the difference in the test accuracy exceeding the test accuracy variability due to the subject sample. The filter and embedded methods produced divergent feature patterns for MCI vs. NC classification that suggests the utility of the embedded feature selection for this problem when linked with the good generalization performance. The stability of the feature sets was strongly correlated with the number of features selected, weakly correlated with the stability of classification accuracy, and uncorrelated with the average classification accuracy.

  9. Preferences for Home-Based HIV Testing Among Heterosexuals at Increased Risk for HIV/AIDS: New Orleans, Louisiana, 2013.

    PubMed

    Robinson, William T; Zarwell, Meagan; Gruber, DeAnn

    2017-07-01

    Participants in the New Orleans arm of the National HIV Behavioral Surveillance of Heterosexuals at Increased Risk for HIV were asked about potential utilization of self-administered home-based tests for HIV. The majority (86%) would use a free home-based test if provided by mail and 99% would seek treatment based on a positive result. In addition, more than half of respondents would return test results in some format to the test provider, whereas most of the remaining participants preferred to discuss results only with their doctor. These findings point toward a potential method for advancing the National HIV/AIDS Strategy.

  10. Thermomechanical and bithermal fatigue behavior of cast B1900 + Hf and wrought Haynes 188

    NASA Technical Reports Server (NTRS)

    Halford, G. R.; Verrilli, M. J.; Kalluri, S.; Ritzert, F. J.; Duckert, R. E.; Holland, F. A.

    1992-01-01

    A thermomechanical fatigue (TMF) high-temperature life prediction method has been evaluated using experimental data. Bithermal fatigue (BTF), bithermal creep-fatigue (BTC-F), and TMF experiments were performed using two aerospace structural alloys, cast B1900 + Hf and wrought Haynes 188. The method, which is based on the total strain version of strain range partitioning and unified cyclic constitutive modeling, requires as input information on the flow and failure behavior of the material of interest. Bithermal temperatures of 483 and 871 C were used for the cast B1900 + Hf nickel-base alloy and 316 and 760 C for the wrought Haynes 188 cobalt-base alloy; the same maximum and minimum temperatures were used in both TMF and BTF tests. Comparisons were made between the results of these tests and isothermal tensile and fatigue test data obtained previously. Qualitative correlations were observed between tensile and isothermal fatigue tests.

  11. Assessing Auditory Discrimination Skill of Malay Children Using Computer-based Method.

    PubMed

    Ting, H; Yunus, J; Mohd Nordin, M Z

    2005-01-01

    The purpose of this paper is to investigate the auditory discrimination skill of Malay children using a computer-based method. Currently, most auditory discrimination assessments are conducted manually by a Speech-Language Pathologist. These conventional tests are general tests of sound discrimination, which do not reflect the client's specific speech sound errors. We therefore propose a computer-based Malay auditory discrimination test to automate the whole assessment process as well as to customize the test according to the client's specific speech error sounds. The ability to discriminate voiced and unvoiced Malay speech sounds was studied in Malay children aged between 7 and 10 years old. The study showed no major difficulty for the children in discriminating the Malay speech sounds except for differentiating the /g/-/k/ sounds; on average, children of 7 years old failed to discriminate the /g/-/k/ sounds.

  12. The development and standardization of testing methods for genetically modified organisms and their derived products.

    PubMed

    Zhang, Dabing; Guo, Jinchao

    2011-07-01

    As the worldwide commercialization of genetically modified organisms (GMOs) increases and consumers are concerned about the safety of GMOs, many countries and regions are issuing labeling regulations for GMOs and their products. Analytical methods for GM ingredients in foods and feed, and their standardization, are essential for the implementation of labeling regulations. To date, GMO testing methods have mainly been based on the inserted DNA sequences and the newly produced proteins in GMOs. This paper presents an overview of GMO testing methods as well as their standardization. © 2011 Institute of Botany, Chinese Academy of Sciences.

  13. Pulsed single-blow regenerator testing

    NASA Technical Reports Server (NTRS)

    Oldson, J. C.; Knowles, T. R.; Rauch, J.

    1992-01-01

    A pulsed single-blow method has been developed for performance testing of Stirling regenerator materials. The method uses a tubular flow arrangement with a steady gas flow passing through a regenerator matrix sample that packs the flow channel for a short distance. A wire grid heater spanning the gas flow channel heats a plug of gas by approximately 2 K for approximately 350 ms. Foil thermocouples monitor the gas temperature entering and leaving the sample. Data analysis based on a 1D incompressible-flow thermal model allows extraction of the Stanton number. A figure of merit involving heat transfer and pressure drop is used to present results for steel screens and steel felt. The observations show a lower figure of merit for the materials tested than is expected based on correlations obtained by other methods.

  14. A 96-well-plate-based optical method for the quantitative and qualitative evaluation of Pseudomonas aeruginosa biofilm formation and its application to susceptibility testing.

    PubMed

    Müsken, Mathias; Di Fiore, Stefano; Römling, Ute; Häussler, Susanne

    2010-08-01

    A major reason for bacterial persistence during chronic infections is the survival of bacteria within biofilm structures, which protect cells from environmental stresses, host immune responses and antimicrobial therapy. Thus, there is concern that laboratory methods developed to measure the antibiotic susceptibility of planktonic bacteria may not be relevant to chronic biofilm infections, and it has been suggested that alternative methods should test antibiotic susceptibility within a biofilm. In this paper, we describe a fast and reliable protocol for using 96-well microtiter plates for the formation of Pseudomonas aeruginosa biofilms; the method is easily adaptable for antimicrobial susceptibility testing. This method is based on bacterial viability staining in combination with automated confocal laser scanning microscopy. The procedure simplifies qualitative and quantitative evaluation of biofilms and has proven to be effective for standardized determination of antibiotic efficiency on P. aeruginosa biofilms. The protocol can be performed within approximately 60 h.

  15. Rapid-Testing Technology and Systems Improvement for the Elimination of Congenital Syphilis in Haiti: Overcoming the “Technology to Systems Gap”

    PubMed Central

    Benoit, Daphne; Zhou, Xi K.; Pape, Jean W.; Peeling, Rosanna W.; Fitzgerald, Daniel W.; Mate, Kedar S.

    2013-01-01

    Background. Despite the availability of rapid diagnostic tests and inexpensive treatment for pregnant women, maternal-child syphilis transmission remains a leading cause of perinatal morbidity and mortality in developing countries. In Haiti, more than 3000 babies are born with congenital syphilis annually. Methods and Findings. From 2007 to 2011, we used a sequential time series, multi-intervention study design in fourteen clinics throughout Haiti to improve syphilis testing and treatment in pregnancy. The two primary interventions were the introduction of a rapid point-of-care syphilis test and systems strengthening based on quality improvement (QI) methods. Syphilis testing increased from 91.5% before the introduction of the rapid diagnostic test to 95.9% after (P < 0.001) and further increased to 96.8% (P < 0.001) after the QI intervention. Despite high rates of testing across all time periods, syphilis treatment lagged behind, increasing only from 70.3% to 74.7% after the introduction of rapid tests (P = 0.27) but improving significantly from 70.2% to 84.3% (P < 0.001) after the systems strengthening QI intervention. Conclusion. Both point-of-care diagnostic testing and health systems-based quality improvement interventions can improve the delivery of specific evidence-based healthcare interventions to prevent congenital syphilis at scale in Haiti. Improved treatment rates for syphilis were seen only after the use of systems-based quality improvement approaches. PMID:26316955

  16. Rapid-Testing Technology and Systems Improvement for the Elimination of Congenital Syphilis in Haiti: Overcoming the "Technology to Systems Gap".

    PubMed

    Severe, Linda; Benoit, Daphne; Zhou, Xi K; Pape, Jean W; Peeling, Rosanna W; Fitzgerald, Daniel W; Mate, Kedar S

    2013-01-01

    Background. Despite the availability of rapid diagnostic tests and inexpensive treatment for pregnant women, maternal-child syphilis transmission remains a leading cause of perinatal morbidity and mortality in developing countries. In Haiti, more than 3000 babies are born with congenital syphilis annually. Methods and Findings. From 2007 to 2011, we used a sequential time series, multi-intervention study design in fourteen clinics throughout Haiti to improve syphilis testing and treatment in pregnancy. The two primary interventions were the introduction of a rapid point-of-care syphilis test and systems strengthening based on quality improvement (QI) methods. Syphilis testing increased from 91.5% before the introduction of the rapid diagnostic test to 95.9% after (P < 0.001) and further increased to 96.8% (P < 0.001) after the QI intervention. Despite high rates of testing across all time periods, syphilis treatment lagged behind, increasing only from 70.3% to 74.7% after the introduction of rapid tests (P = 0.27) but improving significantly from 70.2% to 84.3% (P < 0.001) after the systems strengthening QI intervention. Conclusion. Both point-of-care diagnostic testing and health systems-based quality improvement interventions can improve the delivery of specific evidence-based healthcare interventions to prevent congenital syphilis at scale in Haiti. Improved treatment rates for syphilis were seen only after the use of systems-based quality improvement approaches.

  17. Moving Model Test of High-Speed Train Aerodynamic Drag Based on Stagnation Pressure Measurements

    PubMed Central

    Yang, Mingzhi; Du, Juntao; Huang, Sha; Zhou, Dan

    2017-01-01

    A moving model test method based on stagnation pressure measurements is proposed to measure the train aerodynamic drag coefficient. The front tip of a high-speed train carries a high-pressure region with a stagnation point at its center; the pressure at the stagnation point equals the dynamic pressure sensed by a tube mounted there, from which the train velocity is obtained. The first derivative of the train velocity gives the acceleration of the train model, which is ejected by the moving model system and then coasts without additional power. According to Newton's second law, the aerodynamic drag coefficient can be resolved through many tests at different train speeds selected within a relatively narrow range. Comparisons with wind tunnel tests and numerical simulations show good agreement, with differences of less than 6.1%. Therefore, the moving model test method proposed in this paper is feasible and reliable. PMID:28095441
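
    The chain of reasoning in this record is compact enough to sketch numerically: dynamic pressure gives speed via q = rho*v^2/2, the time derivative of speed gives the coasting deceleration, and Newton's second law yields the drag coefficient. The sketch below uses synthetic data; the mass, frontal area and velocity profile are illustrative assumptions, not values from the paper.

      # Drag coefficient from stagnation-pressure-derived velocity (sketch).
      import numpy as np

      rho, m, A = 1.225, 12.0, 0.05   # air density, model mass (kg), frontal area (m^2)
      t = np.linspace(0, 2.0, 400)
      v_true = 60.0 / (1 + 0.1 * t)                  # coasting profile (drag ~ v^2)
      q = 0.5 * rho * v_true**2                      # stagnation (dynamic) pressure

      v = np.sqrt(2.0 * q / rho)                     # recover speed from pressure
      a = np.gradient(v, t)                          # deceleration
      Cd = -m * a / (0.5 * rho * v**2 * A)           # Newton's second law
      print(f"mean drag coefficient: {Cd.mean():.3f}")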

  18. Can currently available non-animal methods detect pre and ...

    EPA Pesticide Factsheets

    Predictive testing to identify and characterise substances for their skin sensitisation potential has historically been based on animal tests such as the Local Lymph Node Assay (LLNA). In recent years, regulations in the cosmetics and chemicals sectors have provided a strong impetus to develop and evaluate non-animal alternative methods. The AOP for skin sensitisation provides a framework to anchor non-animal test methods to key events in the pathway, helping to identify which tests can be combined to generate the potency information required for risk assessment. The three test methods that have undergone extensive development and validation are the direct peptide reactivity assay (DPRA), the KeratinoSensTM and the human Cell Line Activation Test (h-CLAT). Whilst these methods have been shown to perform relatively well in predicting LLNA results (accuracy ~ 80%), a particular concern that has been raised is their ability to predict chemicals that need to be activated to act as sensitisers (either abiotically on the skin (pre-haptens) or metabolically in the skin (pro-haptens)). The DPRA is a cell-free system, whereas the other two methods make use of cells that do not fully represent the in vivo metabolic situation. Based on previously published datasets of LLNA data, it has been found that approximately 25% of sensitisers are pre- and/or pro-haptens. This study reviewed an EURL ECVAM dataset of 127 substances for which information was available in the LLNA and the

  19. On sample size of the Kruskal-Wallis test with application to a mouse peritoneal cavity study.

    PubMed

    Fan, Chunpeng; Zhang, Donghui; Zhang, Cun-Hui

    2011-03-01

    As the nonparametric generalization of the one-way analysis of variance model, the Kruskal-Wallis test applies when the goal is to test for differences between multiple samples and the underlying population distributions are nonnormal or unknown. Although the Kruskal-Wallis test has been widely used for data analysis, power and sample size methods for this test have been investigated to a much lesser extent. This article proposes new power and sample size calculation methods for the Kruskal-Wallis test based on a pilot study, under either a completely nonparametric model or a semiparametric location model; no assumption is made on the shape of the underlying population distributions. Simulation results show that, in terms of sample size calculation for the Kruskal-Wallis test, the proposed methods are more reliable and preferable to some more traditional methods. A mouse peritoneal cavity study demonstrates the application of the methods. © 2010, The International Biometric Society.
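
    The paper's own formulas are not reproduced in the record; as a baseline, power for the Kruskal-Wallis test can always be approximated by simulation from a pilot-informed model, as in this sketch (the shift sizes and group counts are illustrative assumptions).

      # Simulation-based power estimate for the Kruskal-Wallis test.
      import numpy as np
      from scipy.stats import kruskal

      def kw_power(n_per_group, shifts, n_sim=2000, alpha=0.05, seed=1):
          rng = np.random.default_rng(seed)
          hits = 0
          for _ in range(n_sim):
              groups = [rng.standard_normal(n_per_group) + s for s in shifts]
              hits += kruskal(*groups).pvalue < alpha
          return hits / n_sim

      for n in (10, 20, 40):
          print(n, kw_power(n, shifts=(0.0, 0.4, 0.8)))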

  20. SYNCHROTRON RADIATION, FREE ELECTRON LASER, APPLICATION OF NUCLEAR TECHNOLOGY, ETC.: Study on the characteristics of linac based THz light source

    NASA Astrophysics Data System (ADS)

    Zhu, Xiong-Wei; Wang, Shu-Hong; Chen, Sen-Yu

    2009-10-01

    There are many linac-based methods for producing THz radiation. As one of the options for the Beijing Advanced Light, an ERL test facility is proposed for THz radiation. In this test facility, there are four methods of producing THz radiation: coherent synchrotron radiation (CSR), synchrotron radiation (SR), a low-gain FEL oscillator, and a high-gain SASE FEL. In this paper, we study the characteristics of these four THz light sources.

  1. A new route for dental graduates.

    PubMed

    Pocock, Ian

    2007-01-01

    The two dental faculties believe that the new examination will provide a modern, fit-for-purpose, innovative assessment for today's young dentist. The introduction of a workplace-based portfolio removes the reliance on traditional tests of knowledge and, together with the OSCE elements, allows for triangulation of methods to test the areas set out in the GPT Curriculum. Furthermore, the faculties hope that the evaluation of workplace-based experience, and decreased reliance on traditional examination methods, will also have greater meaning for young dental graduates.

  2. Assessing the accuracy of TDR-based water leak detection system

    NASA Astrophysics Data System (ADS)

    Fatemi Aghda, S. M.; GanjaliPour, K.; Nabiollahi, K.

    2018-03-01

    The use of TDR systems to detect leakage locations in underground pipes has been developed in recent years. In such a system, a bi-wire is installed in parallel with the underground pipes and serves as a TDR sensor. This approach largely overcomes the limitations of the traditional acoustic leak-positioning method. TDR-based leak detection is relatively accurate when the TDR sensor is in contact with water at just one point, and researchers have been working to improve its accuracy in recent years. In this study, the ability of the TDR method was evaluated when multiple leakage points appear simultaneously. For this purpose, several laboratory tests were conducted in which, to simulate leakage points, the TDR sensor was put in contact with water at several points and the number and size of the simulated leakage points were gradually increased. The results showed that as the number and size of the leakage points increase, the error rate of the TDR-based water leak detection system increases. Based on the laboratory results, the authors developed a method to improve the accuracy of TDR-based leak detection systems: they defined a few reference points on the TDR sensor, created by increasing the distance between the two conductors of the TDR sensor, which are easily identifiable in the TDR waveform. The tests were repeated using the TDR sensor with reference points, and an equation based on the reference points was developed to calculate the exact distance of the leakage point. A comparison between the results of both tests (with and without reference points) showed that the developed method and equation can significantly improve the accuracy of positioning the leakage points.
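
    The authors' correction equation is not quoted in the record, but the underlying idea can be sketched: convert the reflection's round-trip time to an apparent cable distance, then map apparent onto true distance by interpolating between reference points whose installed positions are known. The velocity factor, times and positions below are illustrative assumptions.

      # Reference-point correction for TDR leak positioning (sketch).
      import numpy as np

      c, vf = 3.0e8, 0.67                       # assumed velocity factor of the wire
      apparent = lambda t_round_trip: vf * c * t_round_trip / 2.0

      ref_true = np.array([0.0, 50.0, 100.0, 150.0])                  # installed (m)
      ref_app = apparent(np.array([0.0, 5.2e-7, 10.6e-7, 16.1e-7]))   # from waveform

      leak_app = apparent(8.0e-7)               # leak reflection round-trip time
      leak_true = np.interp(leak_app, ref_app, ref_true)
      print(f"estimated leak position: {leak_true:.1f} m")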

  3. Lessons learned in induced fit docking and metadynamics in the Drug Design Data Resource Grand Challenge 2

    NASA Astrophysics Data System (ADS)

    Baumgartner, Matthew P.; Evans, David A.

    2018-01-01

    Two of the major ongoing challenges in computational drug discovery are predicting the binding pose and affinity of a compound to a protein. The Drug Design Data Resource Grand Challenge 2 was developed to address these problems and to drive development of new methods. The challenge provided the 2D structures of compounds for which the organizers held blinded data in the form of 35 X-ray crystal structures and 102 binding affinity measurements, and challenged participants to predict the binding pose and affinity of the compounds. We tested a number of pose prediction methods as part of the challenge; we found that docking methods that incorporate protein flexibility (Induced Fit Docking) outperformed methods that treated the protein as rigid. We also found that using binding pose metadynamics, a molecular dynamics based method, to score docked poses provided the best predictions of our methods, with an average RMSD of 2.01 Å. We tested both structure-based (e.g. docking) and ligand-based methods (e.g. QSAR) in the affinity prediction portion of the competition. We found that our structure-based methods based on docking with Smina (Spearman ρ = 0.614) performed slightly better than our ligand-based methods (ρ = 0.543) and had performance equivalent to the other top methods in the competition. Despite the overall good performance of our methods in comparison to other participants in the challenge, there exists significant room for improvement, especially in cases such as these where protein flexibility plays so large a role.

  4. A multiwave range test for obstacle reconstructions with unknown physical properties

    NASA Astrophysics Data System (ADS)

    Potthast, Roland; Schulz, Jochen

    2007-08-01

    We develop a new multiwave version of the range test for shape reconstruction in inverse scattering theory. The range test [R. Potthast, et al., A `range test' for determining scatterers with unknown physical properties, Inverse Problems 19(3) (2003) 533-547] was originally proposed to obtain knowledge about an unknown scatterer when the far field pattern is given for only one plane wave. Here, we extend the method to the case of multiple waves and show that the full shape of the unknown scatterer can be reconstructed. We further clarify the relation between the range test methods, the potential method [A. Kirsch, R. Kress, On an integral equation of the first kind in inverse acoustic scattering, in: Inverse Problems (Oberwolfach, 1986), Internationale Schriftenreihe zur Numerischen Mathematik, vol. 77, Birkhauser, Basel, 1986, pp. 93-102] and the singular sources method [R. Potthast, Point sources and multipoles in inverse scattering theory, Habilitation Thesis, Gottingen, 1999]. In particular, we propose a new version of the Kirsch-Kress method using the range test, and a new approach to the singular sources method based on the range test and the potential method. Numerical examples of reconstructions for all four methods are provided.

  5. Phase demodulation from a single fringe pattern based on a correlation technique.

    PubMed

    Robin, Eric; Valle, Valéry

    2004-08-01

    We present a method for determining the demodulated phase from a single fringe pattern. This method, based on a correlation technique, searches in a zone of interest for the degree of similarity between a real fringe pattern and a mathematical model. This method, named modulated phase correlation, is tested with different examples.

  6. Explosive materials equivalency, test methods and evaluation

    NASA Technical Reports Server (NTRS)

    Koger, D. M.; Mcintyre, F. L.

    1980-01-01

    Attention is given to concepts of explosive equivalency of energetic materials based on specific airblast parameters. A description is provided of a wide bandwidth high accuracy instrumentation system which has been used extensively in obtaining pressure time profiles of energetic materials. The object of the considered test method is to determine the maximum output from the detonation of explosive materials in terms of airblast overpressure and positive impulse. The measured pressure and impulse values are compared with known characteristics of hemispherical TNT data to determine the equivalency of the test material in relation to TNT. An investigation shows that meaningful comparisons between various explosives and a standard reference material such as TNT should be based upon the same parameters. The tests should be conducted under the same conditions.

  7. Development of an automated ultrasonic testing system

    NASA Astrophysics Data System (ADS)

    Shuxiang, Jiao; Wong, Brian Stephen

    2005-04-01

    Non-destructive testing is necessary in areas where defects in structures emerge over time due to wear and tear and structural integrity must be maintained. However, manual testing has many limitations: high training cost, a long training procedure and, worse, inconsistent test results. A prime objective of this project is to develop an automatic non-destructive testing system for the shaft of a wheel axle of a railway carriage. Various approaches, such as neural networks, pattern recognition methods and knowledge-based systems, are used for this kind of artificial intelligence problem; in this paper, a statistical pattern recognition approach, the classification tree, is applied. Before feature selection, a thorough study of the ultrasonic signals produced was carried out. Based on this analysis, three signal processing methods were developed to enhance the ultrasonic signals: cross-correlation, zero-phase filtering and averaging. The aim of this step is to reduce noise and make the signal characteristics more distinguishable. Four features are selected: (1) autoregressive model coefficients, (2) standard deviation, (3) Pearson correlation and (4) dispersion uniformity degree. A classification tree is then created and applied to recognize peak positions and amplitudes. A local-maximum search is carried out before feature computation; this reduces computation time considerably in real-time testing. Based on this algorithm, a software package called SOFRA was developed to recognize the peaks, calibrate automatically and test a simulated shaft automatically.
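
    Two of the three enhancement steps named above, averaging and cross-correlation, can be sketched on a synthetic A-scan (the zero-phase filter is omitted; the pulse shape, noise level and echo position are illustrative assumptions).

      # Coherent averaging plus matched filtering on a synthetic A-scan.
      import numpy as np

      rng = np.random.default_rng(2)
      n = 1024
      pulse = np.sin(2 * np.pi * 5e6 * np.arange(64) / 100e6) * np.hanning(64)

      def a_scan():
          x = rng.normal(0, 0.8, n)             # background noise
          x[400:464] += 1.5 * pulse             # echo from a simulated defect
          return x

      avg = np.mean([a_scan() for _ in range(32)], axis=0)   # averaging
      corr = np.correlate(avg, pulse, mode="same")           # cross-correlation
      print("echo located near sample", int(np.argmax(np.abs(corr))))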

  8. Test methods for environment-assisted cracking

    NASA Astrophysics Data System (ADS)

    Turnbull, A.

    1992-03-01

    The test methods for assessing environment-assisted cracking of metals in aqueous solution are described; their advantages and disadvantages are examined, and the interrelationship between results from different test methods is discussed. The differences in cracking susceptibility occasionally observed between the various mechanical test methods often arise from variation in environmental parameters between the test conditions and from the lack of adequate specification, monitoring and control of environmental variables. Time is also a significant factor when comparing results from short-term tests with long-exposure tests. In addition, intrinsic differences in important mechanical variables, such as strain rate, associated with the various mechanical test methods can change the apparent sensitivity of a material to stress corrosion cracking. The increasing economic pressure for more accelerated testing is in conflict with the characteristic time dependence of corrosion processes. Unreliable results may be inevitable in some cases, but improved understanding of mechanisms and the development of mechanistically based models of environment-assisted cracking that incorporate the key mechanical, material and environmental variables can provide the framework for a more realistic interpretation of short-term data.

  9. Diagnostic test accuracy and prevalence inferences based on joint and sequential testing with finite population sampling.

    PubMed

    Su, Chun-Lung; Gardner, Ian A; Johnson, Wesley O

    2004-07-30

    The two-test two-population model, originally formulated by Hui and Walter, for estimation of test accuracy and prevalence assumes conditionally independent tests, constant accuracy across populations and binomial sampling. The binomial assumption is incorrect if all individuals in a population (e.g. a child-care centre, a village in Africa, or a cattle herd) are sampled, or if the sample size is large relative to the population size. In this paper, we develop statistical methods for evaluating diagnostic test accuracy and estimating prevalence based on finite sample data in the absence of a gold standard. Moreover, two tests are often applied simultaneously to obtain a 'joint' testing strategy with either higher overall sensitivity or higher specificity than either test considered singly, and sequential versions of such strategies are often applied to reduce the cost of testing. We thus discuss joint (simultaneous and sequential) testing strategies and inference for them. Using the developed methods, we analyse two real data sets and one simulated data set, and we compare 'hypergeometric' and 'binomial-based' inferences. Our findings indicate that the posterior standard deviations for prevalence (but not sensitivity and specificity) based on finite population sampling tend to be smaller than their counterparts for infinite population sampling. Finally, we make recommendations about how small the sample size should be relative to the population size to warrant use of the binomial model for prevalence estimation. Copyright 2004 John Wiley & Sons, Ltd.
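
    For orientation, the textbook behaviour of the joint strategies discussed above, under the simple conditional-independence assumption that the paper then generalizes, is easy to verify; the accuracies below are illustrative values.

      # Joint testing strategies under conditional independence (illustrative).
      se1, sp1 = 0.90, 0.85
      se2, sp2 = 0.80, 0.95

      # Parallel rule ("either positive" is positive): sensitivity rises.
      se_par = 1 - (1 - se1) * (1 - se2)        # 0.98
      sp_par = sp1 * sp2                        # 0.8075

      # Serial rule ("both positive" required): specificity rises.
      se_ser = se1 * se2                        # 0.72
      sp_ser = 1 - (1 - sp1) * (1 - sp2)        # 0.9925

      print(se_par, sp_par, se_ser, sp_ser)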

  10. Assessment of WENO-extended two-fluid modelling in compressible multiphase flows

    NASA Astrophysics Data System (ADS)

    Kitamura, Keiichi; Nonomura, Taku

    2017-03-01

    The two-fluid modelling based on an advection-upstream-splitting-method (AUSM)-family numerical flux function, AUSM+-up, following the work by Chang and Liou [Journal of Computational Physics 2007;225: 840-873], has been successfully extended to fifth order by weighted essentially non-oscillatory (WENO) schemes, and its performance is surveyed in several numerical tests. The results show the desired performance in one-dimensional benchmark test problems: without relying upon an anti-diffusion device, the higher-order two-fluid method captures the phase interface within fewer grid points than the conventional second-order method, as well as a rarefaction wave and a very weak shock. At a high pressure ratio (e.g. 1,000), the choice of interpolated variables affects the performance: the conservative-variable-based characteristic-wise WENO interpolation shows less sharp but more robust representations of the shocks and expansions than the primitive-variable-based counterpart. In the two-dimensional shock/droplet test case, however, only the primitive-variable-based WENO with a huge void fraction realised a stable computation.

  11. Meta-Analysis of Inquiry-Based Instruction Research

    NASA Astrophysics Data System (ADS)

    Hasanah, N.; Prasetyo, A. P. B.; Rudyatmi, E.

    2017-04-01

    Inquiry-based instruction in biology has been the focus of educational research conducted by Unnes biology department students in collaboration with their university supervisors. This study aimed to critically describe the methodological aspects and inquiry teaching methods, and to analyse the result claims, of four selected student research reports grounded in inquiry, drawn from the Unnes biology department 2014 database. Four of 16 experimental quantitative studies were selected as research objects by purposive sampling. Data collected through documentation study were qualitatively analysed regarding the methods used, the quality of the inquiry syntax, and the claims made about findings. The findings showed that the student research still lacked relevant aspects of research methodology, namely appropriate sampling procedures, validity tests of all research instruments, and parametric statistics (t-tests) supported by prior normality tests. The consistent inquiry syntax supported the four mini-thesis claims that inquiry-based teaching significantly influenced the dependent variables. In other words, the positive claims about the research results were not fully supported by good research methods and well-defined implementation of inquiry procedures.

  12. Statistical method evaluation for differentially methylated CpGs in base resolution next-generation DNA sequencing data.

    PubMed

    Zhang, Yun; Baheti, Saurabh; Sun, Zhifu

    2018-05-01

    High-throughput bisulfite methylation sequencing, such as reduced representation bisulfite sequencing (RRBS), Agilent SureSelect Human Methyl-Seq (Methyl-seq) or whole-genome bisulfite sequencing, is commonly used for base-resolution methylome research. These data are represented either by the ratio of methylated cytosines to total coverage at a CpG site or by the numbers of methylated and unmethylated cytosines. Multiple statistical methods can be used to detect differentially methylated CpGs (DMCs) between conditions, and these methods are often the basis for the next step, differentially methylated region identification. The ratio data can be fitted flexibly by many linear models, whereas the raw count data take the coverage information into account. There is an array of options for DMC detection with each data type; however, it is not clear which statistical method is optimal. In this study, we systematically evaluated four statistical methods on methylation ratio data and four methods on count-based data and compared their performance with regard to type I error control, sensitivity and specificity of DMC detection, and computational resource demands, using real RRBS data along with simulation. Our results show that the ratio-based tests are generally more conservative (less sensitive) than the count-based tests. However, some count-based methods have high false-positive rates and should be avoided. The beta-binomial model gives a good balance between sensitivity and specificity and is the preferred method. Selection of methods in different settings, signal versus noise, and sample size estimation are also discussed.
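
    As a concrete example of the count-based family, the sketch below runs a per-CpG Fisher's exact test on methylated/unmethylated read counts followed by Benjamini-Hochberg adjustment. This simple test ignores replicate overdispersion, which is exactly why the study above prefers the beta-binomial model; the counts are invented for illustration.

      # Count-based per-CpG testing with FDR adjustment (illustrative sketch).
      from scipy.stats import fisher_exact
      from statsmodels.stats.multitest import multipletests

      # (methylated, unmethylated) counts per CpG in conditions A and B
      cpgs = [((30, 10), (12, 28)), ((22, 18), (20, 20)), ((5, 35), (25, 15))]
      pvals = [fisher_exact([[ma, ua], [mb, ub]])[1]
               for (ma, ua), (mb, ub) in cpgs]
      reject, qvals, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
      print(list(zip(pvals, qvals, reject)))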

  13. Model-assisted probability of detection of flaws in aluminum blocks using polynomial chaos expansions

    NASA Astrophysics Data System (ADS)

    Du, Xiaosong; Leifsson, Leifur; Grandin, Robert; Meeker, William; Roberts, Ronald; Song, Jiming

    2018-04-01

    Probability of detection (POD) is widely used for measuring the reliability of nondestructive testing (NDT) systems. Typically, POD is determined experimentally, but it can be enhanced by utilizing physics-based computational models in combination with model-assisted POD (MAPOD) methods. With the development of advanced physics-based methods, such as ultrasonic NDT, the empirical information needed for POD estimation can be reduced. However, performing accurate numerical simulations can be prohibitively time-consuming, especially as part of a stochastic analysis. In this work, stochastic surrogate models for computational physics-based measurement simulations are developed to reduce the cost of MAPOD methods while ensuring sufficient accuracy. The stochastic surrogate is used to propagate the random input variables through the physics-based simulation model to obtain the joint probability distribution of the output, and the POD curves are then generated from those results. Here, the stochastic surrogates are constructed using non-intrusive polynomial chaos (NIPC) expansions; in particular, the quadrature, ordinary least-squares (OLS) and least-angle regression sparse (LARS) techniques are used. The proposed approach is demonstrated on an ultrasonic testing simulation of a flat-bottom-hole flaw in an aluminum block. The results show that the stochastic surrogates converge on the statistics at least two orders of magnitude faster than direct Monte Carlo sampling (MCS). Moreover, evaluating the stochastic surrogate models is over three orders of magnitude faster than the underlying simulation model for this case, the UTSim2 model.
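
    Of the three NIPC flavours named above, the OLS one is the simplest to sketch: sample the standard-normal input, evaluate the (here, toy) model, regress onto probabilists' Hermite polynomials, and read output statistics off the coefficients. The toy model and truncation order are assumptions; this is not the paper's UTSim2 setup.

      # OLS-based non-intrusive polynomial chaos in one random variable.
      import numpy as np
      from math import factorial
      from numpy.polynomial.hermite_e import hermevander

      rng = np.random.default_rng(3)
      model = lambda x: np.exp(0.3 * x) + 0.1 * x**2    # stand-in simulator

      xi = rng.standard_normal(200)                     # training inputs
      A = hermevander(xi, 5)                            # columns He_0 .. He_5
      coef, *_ = np.linalg.lstsq(A, model(xi), rcond=None)

      # For standard normal input, E[He_j * He_k] = k! when j = k, else 0, so:
      mean = coef[0]
      var = sum(coef[k] ** 2 * factorial(k) for k in range(1, 6))
      print(mean, var)

      xs = rng.standard_normal(200_000)                 # Monte Carlo check
      print(model(xs).mean(), model(xs).var())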

  14. A robust method using propensity score stratification for correcting verification bias for binary tests

    PubMed Central

    He, Hua; McDermott, Michael P.

    2012-01-01

    Sensitivity and specificity are common measures of the accuracy of a diagnostic test. The usual estimators of these quantities are unbiased if data on the diagnostic test result and the true disease status are obtained from all subjects in an appropriately selected sample. In some studies, verification of the true disease status is performed only for a subset of subjects, possibly depending on the result of the diagnostic test and other characteristics of the subjects. Estimators of sensitivity and specificity based on this subset of subjects are typically biased; this is known as verification bias. Methods have been proposed to correct verification bias under the assumption that the missing data on disease status are missing at random (MAR), that is, the probability of missingness depends on the true (missing) disease status only through the test result and observed covariate information. When some of the covariates are continuous, or the number of covariates is relatively large, the existing methods require parametric models for the probability of disease or the probability of verification (given the test result and covariates), and hence are subject to model misspecification. We propose a new method for correcting verification bias based on the propensity score, defined as the predicted probability of verification given the test result and observed covariates. This is estimated separately for those with positive and negative test results. The new method classifies the verified sample into several subsamples that have homogeneous propensity scores and allows correction for verification bias. Simulation studies demonstrate that the new estimators are more robust to model misspecification than existing methods, but still perform well when the models for the probability of disease and probability of verification are correctly specified. PMID:21856650
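
    A minimal sketch of the stratification idea follows: model the probability of verification given the test result and covariates, split the sample into propensity-score quintiles, estimate the disease rate among verified subjects within each stratum, and weight by full stratum size. The data-generating model and quintile choice are illustrative assumptions, not the paper's estimator in full detail.

      # Propensity-score stratification for verification bias (sketch).
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(4)
      n = 5000
      x = rng.normal(size=n)                                     # covariate
      disease = rng.binomial(1, 1 / (1 + np.exp(-(0.5 * x - 1))))
      test = np.where(disease == 1, rng.binomial(1, 0.9, n), rng.binomial(1, 0.1, n))
      # Verification depends on test result and covariate only (MAR).
      verified = rng.binomial(1, 1 / (1 + np.exp(-(2 * test + 0.8 * x - 1)))) == 1

      def disease_rate_given_test(t):
          idx = test == t
          ps = (LogisticRegression()
                .fit(x[idx].reshape(-1, 1), verified[idx])
                .predict_proba(x[idx].reshape(-1, 1))[:, 1])
          labels = np.digitize(ps, np.quantile(ps, [0.2, 0.4, 0.6, 0.8]))
          rates, weights = [], []
          for s in range(5):
              in_s = labels == s
              ver = in_s & verified[idx]
              if ver.any():
                  rates.append(disease[idx][ver].mean())  # among verified only
                  weights.append(in_s.sum())              # weight: full stratum
          return np.average(rates, weights=weights)

      print(disease_rate_given_test(1), disease_rate_given_test(0))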

  15. Scale factor measure method without turntable for angular rate gyroscope

    NASA Astrophysics Data System (ADS)

    Qi, Fangyi; Han, Xuefei; Yao, Yanqing; Xiong, Yuting; Huang, Yuqiong; Wang, Hua

    2018-03-01

    In this paper, a scale factor test method that requires no turntable is designed for the angular rate gyroscope. A test system consisting of a test device, a data acquisition circuit and data processing software based on the Labview platform is designed. Taking advantage of the gyroscope's sensitivity to angular rate, a gyroscope with a known scale factor serves as the standard gyroscope and is installed on the test device together with the gyroscope being measured. By shaking the test device around the edge that is parallel to the input axes of the gyroscopes, the scale factor of the measured gyroscope is obtained in real time by the data processing software. The test method is fast and keeps the test system miniaturized and easy to carry or move. Measuring a quartz MEMS gyroscope's scale factor multiple times by this method, the difference is less than 0.2%; compared with turntable testing, the scale factor difference is less than 1%. The accuracy and repeatability of the test system appear good.
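
    The essence of the reference-gyro trick can be sketched in a few lines: both gyros sense the same hand-imposed rotation, the standard gyro's output is converted back to rate through its known scale factor, and the unknown scale factor is the least-squares slope of the test gyro's output against that rate. Signal shapes and values are illustrative assumptions.

      # Scale factor from a reference gyroscope, no turntable (sketch).
      import numpy as np

      rng = np.random.default_rng(5)
      t = np.linspace(0, 5, 2000)
      omega = 30 * np.sin(2 * np.pi * 1.2 * t)      # shared rate from shaking (deg/s)

      sf_std = 0.050                                # known scale factor (V per deg/s)
      v_std = sf_std * omega + rng.normal(0, 1e-3, t.size)
      v_meas = 0.0721 * omega + rng.normal(0, 1e-3, t.size)   # gyro under test

      rate_ref = v_std / sf_std                     # recovered reference rate
      sf_meas = np.linalg.lstsq(rate_ref[:, None], v_meas, rcond=None)[0][0]
      print(f"estimated scale factor: {sf_meas:.4f} V/(deg/s)")   # ~0.0721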

  16. [Seed quality test methods of Paeonia suffruticosa].

    PubMed

    Cao, Ya-Yue; Zhu, Zai-Biao; Guo, Qiao-Sheng; Liu, Li; Wang, Chang-Lin

    2014-11-01

    In order to optimize testing methods for Paeonia suffruticosa seed quality and provide a basis for establishing seed testing rules and a seed quality standard for P. suffruticosa, the seed quality of P. suffruticosa from different producing areas was measured based on the related seed testing regulations, and seed testing methods for the quality items of P. suffruticosa were established preliminarily. The sample weight of P. suffruticosa was at least 7000 g for purity analysis and at least 700 g for testing. Phenotypic observation and size measurement were used for authenticity testing. The 1000-seed weight was determined by the 100-seed method, and the water content was determined by the low-temperature drying method (10 hours). After soaking in distilled water for 24 h, the seeds were treated with a day/night temperature stratification (25 degrees C/20 degrees C, day/night) in the dark for 60 d. After soaking in a 300 mg x L(-1) GA3 solution for 24 h, the P. suffruticosa seeds were cultured in wet sand at 15 degrees C for 12-60 days for germination testing. Seed viability was tested by the TTC method.

  17. A rapid method for soil cement design : Louisiana slope value method.

    DOT National Transportation Integrated Search

    1964-03-01

    The current procedure used by the Louisiana Department of Highways for laboratory design of cement stabilized soil base and subbase courses is taken from standard AASHO test methods, patterned after Portland Cement Association criteria. These methods...

  18. Determination of Material Strengths by Hydraulic Bulge Test.

    PubMed

    Wang, Hankui; Xu, Tong; Shou, Binan

    2016-12-30

    The hydraulic bulge test (HBT) method is proposed to determine material tensile strengths. The basic idea of the HBT is similar to that of the small punch test (SPT) but is inspired by the manufacturing process of rupture discs: high-pressure hydraulic oil is used instead of a punch to deform the specimen. Compared with the SPT method, the HBT method avoids several influencing factors, such as punch dimensions, punch material, and the friction between punch and specimen. A calculation procedure based entirely on theoretical derivation is proposed for estimating the yield strength and ultimate tensile strength. Both conventional tensile tests and hydraulic bulge tests were carried out on several ferrous alloys, and the results showed that the hydraulic bulge test results are reliable and accurate.

  19. Fatigue analysis and testing of wind turbine blades

    NASA Astrophysics Data System (ADS)

    Greaves, Peter Robert

    This thesis focuses on fatigue analysis and testing of large, multi-MW wind turbine blades. The blades are one of the most expensive components of a wind turbine, and their mass has cost implications for the hub, nacelle, tower and foundations of the turbine, so it is important that they are not unnecessarily strong. Fatigue is often an important design driver, but fatigue of composites is poorly understood and so large safety factors are often applied to the loads, which has implications for the weight of the blade. Full-scale fatigue testing of blades is required by the design standards and provides manufacturers with confidence that the blade will survive its service life. This testing is usually performed by resonating the blade in the flapwise and edgewise directions separately, but in service these two loads occur at the same time. A fatigue testing method developed at Narec (the National Renewable Energy Centre) in the UK, in which the flapwise and edgewise directions are excited simultaneously, has been evaluated by comparing the Palmgren-Miner damage sum around the blade cross section after testing with the damage distribution caused by the service life. A method to obtain the resonant test configuration that results in the optimum mode shapes for the flapwise and edgewise directions was then developed, and simulation software was designed so that the blade test could be simulated and realistic comparisons made between the damage distributions after different test types. During the course of this work the shortcomings of conventional fatigue analysis methods became apparent, and a novel method of fatigue analysis based on multi-continuum theory and the kinetic theory of fracture was developed. This method was benchmarked using physical test data from the OPTIDAT database and was applied to the analysis of a complete blade. A full-scale fatigue test method based on this new analysis approach is also discussed.
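
    The Palmgren-Miner damage sum used in the comparison above is simple to state: with a power-law S-N curve N(S) = C * S^(-m), the damage from a counted load spectrum is D = sum_i n_i / N(S_i), with D >= 1 read as predicted failure. A minimal sketch with assumed S-N constants and an invented spectrum:

      # Palmgren-Miner damage accumulation (illustrative constants).
      import numpy as np

      m, C = 10.0, 1e25                            # assumed composite S-N curve
      ranges = np.array([120.0, 90.0, 60.0])       # stress ranges (MPa) from counting
      counts = np.array([1e4, 2e5, 5e6])           # cycles at each range in service

      damage = np.sum(counts / (C * ranges ** -m))
      print(f"Miner damage sum D = {damage:.2f} (D >= 1: predicted failure)")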

  20. OPATs: Omnibus P-value association tests.

    PubMed

    Chen, Chia-Wei; Yang, Hsin-Chou

    2017-07-10

    Combining statistical significances (P-values) from a set of single-locus association tests in genome-wide association studies is a proof-of-principle method for identifying disease-associated genomic segments, functional genes and biological pathways. We review P-value combinations for genome-wide association studies and introduce an integrated analysis tool, Omnibus P-value Association Tests (OPATs), which provides popular analysis methods of P-value combination. The software OPATs, programmed in R with an R graphical user interface, is user-friendly. In addition to analysis modules for data quality control and single-locus association tests, OPATs provides three types of set-based association test: window-, gene- and biopathway-based. P-value combination with or without threshold and rank truncation is provided. The significance of a set-based association test is evaluated by resampling procedures. The performance of the set-based association tests in OPATs has been evaluated in simulation studies and real data analyses. These set-based tests help boost statistical power, alleviate the multiple-testing problem, reduce the impact of genetic heterogeneity, increase the replication efficiency of association tests and facilitate the interpretation of association signals by streamlining the testing procedures and integrating the genetic effects of multiple variants in genomic regions of biological relevance. In summary, P-value combinations facilitate the identification of marker sets associated with disease susceptibility and uncover missing heritability in association studies, thereby establishing a foundation for the genetic dissection of complex diseases and traits. OPATs provides an easy-to-use and statistically powerful analysis tool for P-value combinations. OPATs, examples, and a user guide can be downloaded from http://www.stat.sinica.edu.tw/hsinchou/genetics/association/OPATs.htm. © The Author 2017. Published by Oxford University Press.
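
    For illustration, Fisher's method is one classic P-value combination of the kind such tools implement (this sketch uses SciPy, not OPATs itself; the P-values are made up):

        from scipy import stats

        # P-values from single-locus tests within one genomic window
        pvals = [0.04, 0.20, 0.003, 0.51]

        # Fisher's method: -2 * sum(log p) follows a chi-square distribution
        # with 2k degrees of freedom under the global null hypothesis
        stat, p_combined = stats.combine_pvalues(pvals, method="fisher")
        print(stat, p_combined)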

  1. Multiple Hypothesis Testing for Experimental Gingivitis Based on Wilcoxon Signed Rank Statistics

    PubMed Central

    Preisser, John S.; Sen, Pranab K.; Offenbacher, Steven

    2011-01-01

    Dental research often involves repeated multivariate outcomes on a small number of subjects for which there is interest in identifying outcomes that exhibit change in their levels over time as well as to characterize the nature of that change. In particular, periodontal research often involves the analysis of molecular mediators of inflammation for which multivariate parametric methods are highly sensitive to outliers and deviations from Gaussian assumptions. In such settings, nonparametric methods may be favored over parametric ones. Additionally, there is a need for statistical methods that control an overall error rate for multiple hypothesis testing. We review univariate and multivariate nonparametric hypothesis tests and apply them to longitudinal data to assess changes over time in 31 biomarkers measured from the gingival crevicular fluid in 22 subjects whereby gingivitis was induced by temporarily withholding tooth brushing. To identify biomarkers that can be induced to change, multivariate Wilcoxon signed rank tests for a set of four summary measures based upon area under the curve are applied for each biomarker and compared to their univariate counterparts. Multiple hypothesis testing methods with choice of control of the false discovery rate or strong control of the family-wise error rate are examined. PMID:21984957
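
    As a concrete illustration of the univariate building blocks (a sketch with synthetic data, not the authors' exact analysis), Wilcoxon signed rank tests per biomarker can be paired with a Benjamini-Hochberg false discovery rate correction:

        import numpy as np
        from scipy.stats import wilcoxon
        from statsmodels.stats.multitest import multipletests

        rng = np.random.default_rng(1)
        baseline = rng.normal(size=(22, 31))                 # 22 subjects x 31 biomarkers
        induced = baseline + rng.normal(0.3, 1.0, (22, 31))  # post-induction levels

        pvals = [wilcoxon(induced[:, j], baseline[:, j]).pvalue for j in range(31)]
        reject, p_adj, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
        print(f"{reject.sum()} of 31 biomarkers significant after BH-FDR control")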

  2. Face recognition based on symmetrical virtual image and original training image

    NASA Astrophysics Data System (ADS)

    Ke, Jingcheng; Peng, Yali; Liu, Shigang; Li, Jun; Pei, Zhao

    2018-02-01

    In face representation-based classification methods, a high recognition rate can be obtained if a face class has enough available training samples. However, in practical applications we only have limited training samples. To obtain enough training samples, many methods simultaneously use the original training samples and corresponding virtual samples to strengthen the ability to represent the test sample. One approach directly uses the original training samples and the corresponding mirror samples to recognize the test sample. However, when the test sample is nearly symmetrical while the original training samples are not, the integration of the original training and mirror samples might not represent the test sample well. To tackle the above-mentioned problem, in this paper we propose a novel method to obtain a kind of virtual sample generated by averaging the original training samples and the corresponding mirror samples. Then, the original training samples and the virtual samples are integrated to recognize the test sample. Experimental results on five face databases show that the proposed method is able to partly overcome the challenges posed by the various poses, facial expressions and illuminations of the original face images.
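
    A minimal sketch of the described virtual-sample generation, i.e. averaging each training image with its horizontal mirror (array shapes and names are illustrative):

        import numpy as np

        def virtual_samples(train_images):
            """train_images: (n, h, w) array; returns averaged original/mirror images."""
            mirrored = train_images[:, :, ::-1]        # flip each image left-right
            return 0.5 * (train_images + mirrored)     # symmetrical virtual samples

        X = np.random.rand(10, 32, 32)                     # toy training set
        X_aug = np.concatenate([X, virtual_samples(X)])    # originals + virtual samples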

  3. MS lesion segmentation using a multi-channel patch-based approach with spatial consistency

    NASA Astrophysics Data System (ADS)

    Mechrez, Roey; Goldberger, Jacob; Greenspan, Hayit

    2015-03-01

    This paper presents an automatic method for segmentation of Multiple Sclerosis (MS) in Magnetic Resonance Images (MRI) of the brain. The approach is based on similarities between multi-channel patches (T1, T2 and FLAIR). An MS lesion patch database is built using training images for which the label maps are known. For each patch in the testing image, k similar patches are retrieved from the database. The matching labels for these k patches are then combined to produce an initial segmentation map for the test case. Finally a novel iterative patch-based label refinement process based on the initial segmentation map is performed to ensure spatial consistency of the detected lesions. A leave-one-out evaluation is done for each testing image in the MS lesion segmentation challenge of MICCAI 2008. Results are shown to compete with the state-of-the-art methods on the MICCAI 2008 challenge.
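
    A schematic of the patch retrieval and label fusion idea (a toy nearest-neighbor search on flattened patches; not the paper's implementation):

        import numpy as np

        def fuse_labels(test_patch, db_patches, db_labels, k=5):
            """Retrieve the k most similar database patches and average their labels."""
            d = np.linalg.norm(db_patches - test_patch, axis=1)  # patch distances
            nearest = np.argsort(d)[:k]
            return db_labels[nearest].mean()                     # soft lesion estimate

        db_patches = np.random.rand(1000, 27)   # flattened multi-channel (T1/T2/FLAIR) patches
        db_labels = np.random.randint(0, 2, 1000).astype(float)
        prob = fuse_labels(np.random.rand(27), db_patches, db_labels)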

  4. Accurate evaluation of sensitivity for calibration between a LiDAR and a panoramic camera used for remote sensing

    NASA Astrophysics Data System (ADS)

    García-Moreno, Angel-Iván; González-Barbosa, José-Joel; Ramírez-Pedraza, Alfonso; Hurtado-Ramos, Juan B.; Ornelas-Rodriguez, Francisco-Javier

    2016-04-01

    Computer-based reconstruction models can be used to approximate urban environments. These models are usually based on several mathematical approximations and the usage of different sensors, which implies dependency on many variables. The sensitivity analysis presented in this paper is used to weigh the relative importance of each uncertainty contributor into the calibration of a panoramic camera-LiDAR system. Both sensors are used for three-dimensional urban reconstruction. Simulated and experimental tests were conducted. For the simulated tests we analyze and compare the calibration parameters using the Monte Carlo and Latin hypercube sampling techniques. Sensitivity analysis for each variable involved into the calibration was computed by the Sobol method, which is based on the analysis of the variance breakdown, and the Fourier amplitude sensitivity test method, which is based on Fourier's analysis. Sensitivity analysis is an essential tool in simulation modeling and for performing error propagation assessments.
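
    For readers unfamiliar with variance-based sensitivity analysis, a minimal Sobol example using the SALib package on a toy function (not the camera-LiDAR calibration model from the paper):

        import numpy as np
        from SALib.sample import saltelli
        from SALib.analyze import sobol

        problem = {"num_vars": 3,
                   "names": ["x1", "x2", "x3"],
                   "bounds": [[0.0, 1.0]] * 3}

        X = saltelli.sample(problem, 1024)                 # quasi-random Saltelli design
        Y = X[:, 0] + 2.0 * X[:, 1] ** 2 + 0.1 * X[:, 2]   # toy model output
        Si = sobol.analyze(problem, Y)
        print(Si["S1"], Si["ST"])                          # first-order and total indices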

  5. Current status of antifungal susceptibility testing methods.

    PubMed

    Arikan, Sevtap

    2007-11-01

    Antifungal susceptibility testing is a very dynamic field of medical mycology. Standardization of in vitro susceptibility tests by the Clinical and Laboratory Standards Institute (CLSI) and the European Committee on Antimicrobial Susceptibility Testing (EUCAST), together with the current availability of reference methods, constitutes the major milestone in the field. Based on the established minimum inhibitory concentration (MIC) breakpoints, it is now possible to determine the susceptibilities of Candida strains to fluconazole, itraconazole, voriconazole, and flucytosine. Moreover, the utility of fluconazole susceptibility tests as an adjunct in optimizing treatment of candidiasis has now been validated. While the MIC breakpoints and clinical significance of susceptibility testing for the remaining fungi and antifungal drugs remain unclear, modifications of the available methods as well as other methodologies are being intensively studied to overcome the present drawbacks and limitations. Among the other methods under investigation are Etest, colorimetric microdilution, agar dilution, determination of fungicidal activity, flow cytometry, and ergosterol quantitation. Etest offers the advantages of practical application and agreement rates with the reference methods that are frequently above acceptable limits. However, MIC breakpoints for Etest remain to be evaluated and established. Development of commercially available, standardized colorimetric panels based on CLSI method parameters has added to the antifungal susceptibility testing armamentarium. Flow cytometry, on the other hand, appears to offer rapid susceptibility testing but requires specialized equipment and further evaluation for reproducibility and standardization. Ergosterol quantitation is another novel approach, which appears potentially beneficial particularly for discriminating azole-resistant isolates from heavy trailers. The method is still investigational and requires further study. Developments in the methodology and applications of antifungal susceptibility testing will hopefully provide enhanced utility in the clinical guidance of antifungal therapy. However, particularly in the immunosuppressed host, in vitro susceptibility is and will remain only one of several factors that influence clinical outcome.

  6. On determining the most appropriate test cut-off value: the case of tests with continuous results

    PubMed Central

    Habibzadeh, Parham; Yadollahie, Mahboobeh

    2016-01-01

    There are several criteria for determining the most appropriate cut-off value in a diagnostic test with continuous results. Various methods, mostly based on receiver operating characteristic (ROC) analysis, are used to determine the test cut-off value. The most common criteria are the point on the ROC curve where the sensitivity and specificity of the test are equal; the point on the curve with minimum distance from the upper-left corner of the unit square; and the point where Youden's index is maximum. There are also methods mainly based on Bayesian decision analysis. Herein, we show that a proposed method that maximizes the weighted number needed to misdiagnose, an index of diagnostic test effectiveness we previously proposed, is the most appropriate technique compared with the aforementioned ones. For determination of the cut-off value, we need to know the pretest probability of the disease of interest as well as the costs incurred by misdiagnosis. This means that even for a given diagnostic test, the cut-off value is not universal and should be determined for each region and for each disease condition. PMID:27812299
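
    As a concrete instance of one common ROC criterion named above, the cut-off maximizing Youden's index can be computed as follows (scikit-learn on synthetic scores; this is not the authors' proposed weighted method):

        import numpy as np
        from sklearn.metrics import roc_curve

        rng = np.random.default_rng(0)
        y = np.repeat([0, 1], 100)                           # true disease status
        scores = np.concatenate([rng.normal(0, 1, 100),      # non-diseased
                                 rng.normal(1, 1, 100)])     # diseased

        fpr, tpr, thresholds = roc_curve(y, scores)
        youden = tpr - fpr                  # J = sensitivity + specificity - 1
        best = thresholds[np.argmax(youden)]
        print(f"Cut-off maximizing Youden's index: {best:.3f}")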

  7. Product Quality Research Institute evaluation of cascade impactor profiles of pharmaceutical aerosols: part 2--evaluation of a method for determining equivalence.

    PubMed

    Christopher, David; Adams, Wallace P; Lee, Douglas S; Morgan, Beth; Pan, Ziqing; Singh, Gur Jai Pal; Tsong, Yi; Lyapustina, Svetlana

    2007-01-19

    The purpose of this article is to present the thought process, methods, and interim results of a PQRI Working Group, which was charged with evaluating the chi-square ratio test as a potential method for determining in vitro equivalence of aerodynamic particle size distribution (APSD) profiles obtained from cascade impactor measurements. Because this test was designed with the intention of being used as a tool in regulatory review of drug applications, the capability of the test to detect differences in APSD profiles correctly and consistently was evaluated in a systematic way across a designed space of possible profiles. To establish a "base line," properties of the test in the simplest case of pairs of identical profiles were studied. Next, the test's performance was studied with pairs of profiles, where some difference was simulated in a systematic way on a single deposition site using realistic product profiles. The results obtained in these studies, which are presented in detail here, suggest that the chi-square ratio test in itself is not sufficient to determine equivalence of particle size distributions. This article, therefore, introduces the proposal to combine the chi-square ratio test with a test for impactor-sized mass based on Population Bioequivalence and describes methods for evaluating discrimination capabilities of the combined test. The approaches and results described in this article elucidate some of the capabilities and limitations of the original chi-square ratio test and provide rationale for development of additional tests capable of comparing APSD profiles of pharmaceutical aerosols.
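
    To make the profile comparison concrete, a chi-square-type distance between two normalized deposition profiles can be sketched as follows (illustrative only; the exact PQRI chi-square ratio statistic and its reference-variability normalization are defined in the article):

        import numpy as np

        def chi_square_distance(p, q):
            """Chi-square-type distance between two normalized deposition profiles."""
            p, q = p / p.sum(), q / q.sum()
            return np.sum((p - q) ** 2 / (p + q))

        ref = np.array([5.0, 10.0, 25.0, 30.0, 20.0, 10.0])   # hypothetical stage masses
        test = np.array([6.0, 11.0, 23.0, 31.0, 19.0, 10.0])
        print(chi_square_distance(test, ref))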

  8. Network-Based Method for Identifying Co-Regeneration Genes in Bone, Dentin, Nerve and Vessel Tissues

    PubMed Central

    Pan, Hongying; Zhang, Yu-Hang; Feng, Kaiyan; Kong, XiangYin; Cai, Yu-Dong

    2017-01-01

    Bone and dental diseases are serious public health problems. Most current clinical treatments for these diseases can produce side effects. Regeneration is a promising therapy for bone and dental diseases, yielding natural tissue recovery with few side effects. Because soft tissues inside the bone and dentin are densely populated with nerves and vessels, the study of bone and dentin regeneration should also consider the co-regeneration of nerves and vessels. In this study, a network-based method to identify co-regeneration genes for bone, dentin, nerve and vessel was constructed based on an extensive network of protein–protein interactions. Three procedures were applied in the network-based method. The first procedure, searching, sought the shortest paths connecting regeneration genes of one tissue type with regeneration genes of other tissues, thereby extracting possible co-regeneration genes. The second procedure, testing, employed a permutation test to evaluate whether possible genes were false discoveries; these genes were excluded by the testing procedure. The last procedure, screening, employed two rules, the betweenness ratio rule and interaction score rule, to select the most essential genes. A total of seventeen genes were inferred by the method, which were deemed to contribute to co-regeneration of at least two tissues. All these seventeen genes were extensively discussed to validate the utility of the method. PMID:28974058

  9. Network-Based Method for Identifying Co- Regeneration Genes in Bone, Dentin, Nerve and Vessel Tissues.

    PubMed

    Chen, Lei; Pan, Hongying; Zhang, Yu-Hang; Feng, Kaiyan; Kong, XiangYin; Huang, Tao; Cai, Yu-Dong

    2017-10-02

    Bone and dental diseases are serious public health problems. Most current clinical treatments for these diseases can produce side effects. Regeneration is a promising therapy for bone and dental diseases, yielding natural tissue recovery with few side effects. Because soft tissues inside the bone and dentin are densely populated with nerves and vessels, the study of bone and dentin regeneration should also consider the co-regeneration of nerves and vessels. In this study, a network-based method to identify co-regeneration genes for bone, dentin, nerve and vessel was constructed based on an extensive network of protein-protein interactions. Three procedures were applied in the network-based method. The first procedure, searching, sought the shortest paths connecting regeneration genes of one tissue type with regeneration genes of other tissues, thereby extracting possible co-regeneration genes. The second procedure, testing, employed a permutation test to evaluate whether possible genes were false discoveries; these genes were excluded by the testing procedure. The last procedure, screening, employed two rules, the betweenness ratio rule and interaction score rule, to select the most essential genes. A total of seventeen genes were inferred by the method, which were deemed to contribute to co-regeneration of at least two tissues. All these seventeen genes were extensively discussed to validate the utility of the method.
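
    A toy sketch of the 'searching' procedure on a protein-protein interaction network using networkx (gene names, edges and weights are invented for illustration):

        import networkx as nx

        G = nx.Graph()
        G.add_weighted_edges_from([("BMP2", "SMAD1", 1.0), ("SMAD1", "RUNX2", 1.0),
                                   ("RUNX2", "NGF", 2.0), ("NGF", "NTRK1", 1.0)])

        # Shortest path linking a bone-regeneration gene to a nerve-regeneration
        # gene; intermediate nodes are candidate co-regeneration genes, to be
        # screened later by permutation testing and interaction-score rules.
        path = nx.shortest_path(G, "BMP2", "NTRK1", weight="weight")
        print(path[1:-1])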

  10. Robot-operated quality control station based on the UTT method

    NASA Astrophysics Data System (ADS)

    Burghardt, Andrzej; Kurc, Krzysztof; Szybicki, Dariusz; Muszyńska, Magdalena; Nawrocki, Jacek

    2017-03-01

    This paper presents a robotic test stand for the ultrasonic transmission tomography (UTT) inspection of stator vane thickness. The article presents the design of the test stand in the Autodesk Robot Structural Analysis Professional 2013 software suite. The performance of the designed test stand was simulated in the RobotStudio software suite. The operating principle of the test stand measurement system is presented, with a specific focus on the measurement strategy. The results of actual wall thickness measurements performed on stator vanes are presented.

  11. Automatic item generation implemented for measuring artistic judgment aptitude.

    PubMed

    Bezruczko, Nikolaus

    2014-01-01

    Automatic item generation (AIG) is a broad class of methods that are being developed to address psychometric issues arising from internet and computer-based testing. In general, issues emphasize efficiency, validity, and diagnostic usefulness of large scale mental testing. Rapid prominence of AIG methods and their implicit perspective on mental testing is bringing painful scrutiny to many sacred psychometric assumptions. This report reviews basic AIG ideas, then presents conceptual foundations, image model development, and operational application to artistic judgment aptitude testing.

  12. [Near infrared analysis of blending homogeneity of Chinese medicine formula particles based on moving window F test method].

    PubMed

    Yang, Chan; Xu, Bing; Zhang, Zhi-Qiang; Wang, Xin; Shi, Xin-Yuan; Fu, Jing; Qiao, Yan-Jiang

    2016-10-01

    Blending uniformity is essential to ensure the homogeneity of Chinese medicine formula particles within each batch. This study was based on the blending process of ebony spray-dried powder and dextrin (the proportion of dextrin was 10%), in which near infrared (NIR) diffuse reflectance spectra collected from six different sampling points were analyzed in combination with the moving window F test method in order to assess the uniformity of the blending process. The method was validated against the changes of citric acid content determined by HPLC. The results of the moving window F test method showed that the mixture of ebony spray-dried powder and dextrin was homogeneous during 200-300 r and segregated during 300-400 r. An advantage of this method is that the threshold value is defined statistically, not empirically, and thus does not suffer from the threshold ambiguities common to the moving block standard deviation (MBSD). This method could be employed to monitor other blending processes of Chinese medicine powders online. Copyright© by the Chinese Pharmaceutical Association.
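
    A minimal univariate sketch of a moving-window F test (window size, reference window and threshold are illustrative; the article's implementation works on NIR spectra):

        import numpy as np
        from scipy.stats import f as f_dist

        def moving_window_f(signal, ref_window, alpha=0.05):
            """Flag windows whose variance differs significantly from a reference window."""
            win = len(ref_window)
            ref_var = np.var(ref_window, ddof=1)
            crit = f_dist.ppf(1 - alpha, win - 1, win - 1)
            flags = []
            for start in range(len(signal) - win + 1):
                F = np.var(signal[start:start + win], ddof=1) / ref_var
                flags.append(F > crit)          # True -> not yet homogeneous
            return np.array(flags)

        rng = np.random.default_rng(2)
        x = rng.normal(0.0, 1.0, 200)           # toy blending signal
        print(moving_window_f(x, x[:10]).sum(), "windows flagged")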

  13. Expert Advisor (EA) Evaluation System Using Web-based ELECTRE Method in Foreign Exchange (Forex) Market

    NASA Astrophysics Data System (ADS)

    Satibi; Widodo, Catur Edi; Farikhin

    2018-02-01

    This research aims to optimize forex trading profit automatically using EAs while keeping accuracy and drawdown levels under consideration. The evaluation system classifies EA performance by trading market session (Sydney, Tokyo, London and New York) to determine the right EA to be used in a given market session. This evaluation system is a web-based ELECTRE method that interacts in real time with the EAs through a web service and is able to present a real-time performance dashboard using web socket protocol communications. The web application is programmed in NodeJs. In the testing period, all EAs were simulated 24 hours a day in all market sessions for three months; the best EA is ranked by its profit, accuracy and drawdown criteria, calculated using the web-based ELECTRE method. The idea of this research is to compare the best EA of the testing period with the collaborative performance of the best EAs classified by market session. This research uses three months of EUR/USD historical data as the testing period and another three months as the validation period. As a result, the collaboration of the four best EAs classified by market session increased the profit percentage consistently in the testing and validation periods while keeping accuracy and drawdown levels secure.
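
    For orientation, the core of ELECTRE I is a pair of concordance and discordance matrices over alternatives; a bare-bones sketch (scores, weights and thresholds are hypothetical):

        import numpy as np

        # 3 EAs scored on profit, accuracy and drawdown (all rescaled so larger
        # is better); criterion weights are invented for illustration.
        A = np.array([[0.9, 0.7, 0.6],
                      [0.6, 0.9, 0.8],
                      [0.7, 0.6, 0.9]])
        w = np.array([0.5, 0.3, 0.2])
        spread = A.max(axis=0) - A.min(axis=0)

        n = len(A)
        C, D = np.zeros((n, n)), np.zeros((n, n))
        for i in range(n):
            for j in range(n):
                if i != j:
                    C[i, j] = w[A[i] >= A[j]].sum()                   # concordance
                    D[i, j] = np.max((A[j] - A[i]).clip(0) / spread)  # discordance

        outranks = (C >= 0.6) & (D <= 0.4)     # illustrative thresholds
        print(outranks)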

  14. Comparing Science Virtual and Paper-Based Test to Measure Students’ Critical Thinking based on VAK Learning Style Model

    NASA Astrophysics Data System (ADS)

    Rosyidah, T. H.; Firman, H.; Rusyati, L.

    2017-02-01

    This research compared virtual and paper-based tests for measuring students' critical thinking based on the VAK (Visual-Auditory-Kinesthetic) learning style model. A quasi-experimental method with a one-group post-test-only design was applied to analyze the data. Forty eighth-grade students at a public junior high school in Bandung made up the sample. The quantitative data were obtained through 26 questions about living things and environmental sustainability, constructed around the eight elements of critical thinking and provided in both virtual and paper-based form. The analysis showed no significant difference between the virtual and paper-based tests within the visual, auditory, or kinesthetic groups. In addition, the results were supported by a questionnaire on students' response to the virtual test, which scored 3.47 on a scale of 4, meaning that students responded positively on all aspects measured: interest, impression, and expectation.

  15. Towards a formal genealogical classification of the Lezgian languages (North Caucasus): testing various phylogenetic methods on lexical data.

    PubMed

    Kassian, Alexei

    2015-01-01

    A lexicostatistical classification is proposed for 20 languages and dialects of the Lezgian group of the North Caucasian family, based on meticulously compiled 110-item wordlists, published as part of the Global Lexicostatistical Database project. The lexical data have been subsequently analyzed with the aid of the principal phylogenetic methods, both distance-based and character-based: Starling neighbor joining (StarlingNJ), Neighbor joining (NJ), Unweighted pair group method with arithmetic mean (UPGMA), Bayesian Markov chain Monte Carlo (MCMC), Unweighted maximum parsimony (UMP). Cognation indexes within the input matrix were marked by two different algorithms: traditional etymological approach and phonetic similarity, i.e., the automatic method of consonant classes (Levenshtein distances). Due to certain reasons (first of all, high lexicographic quality of the wordlists and a consensus about the Lezgian phylogeny among Caucasologists), the Lezgian database is a perfect testing area for appraisal of phylogenetic methods. For the etymology-based input matrix, all the phylogenetic methods, with the possible exception of UMP, have yielded trees that are sufficiently compatible with each other to generate a consensus phylogenetic tree of the Lezgian lects. The obtained consensus tree agrees with the traditional expert classification as well as some of the previously proposed formal classifications of this linguistic group. Contrary to theoretical expectations, the UMP method has suggested the least plausible tree of all. In the case of the phonetic similarity-based input matrix, the distance-based methods (StarlingNJ, NJ, UPGMA) have produced the trees that are rather close to the consensus etymology-based tree and the traditional expert classification, whereas the character-based methods (Bayesian MCMC, UMP) have yielded less likely topologies.

  16. Towards a Formal Genealogical Classification of the Lezgian Languages (North Caucasus): Testing Various Phylogenetic Methods on Lexical Data

    PubMed Central

    Kassian, Alexei

    2015-01-01

    A lexicostatistical classification is proposed for 20 languages and dialects of the Lezgian group of the North Caucasian family, based on meticulously compiled 110-item wordlists, published as part of the Global Lexicostatistical Database project. The lexical data have been subsequently analyzed with the aid of the principal phylogenetic methods, both distance-based and character-based: Starling neighbor joining (StarlingNJ), Neighbor joining (NJ), Unweighted pair group method with arithmetic mean (UPGMA), Bayesian Markov chain Monte Carlo (MCMC), Unweighted maximum parsimony (UMP). Cognation indexes within the input matrix were marked by two different algorithms: traditional etymological approach and phonetic similarity, i.e., the automatic method of consonant classes (Levenshtein distances). Due to certain reasons (first of all, high lexicographic quality of the wordlists and a consensus about the Lezgian phylogeny among Caucasologists), the Lezgian database is a perfect testing area for appraisal of phylogenetic methods. For the etymology-based input matrix, all the phylogenetic methods, with the possible exception of UMP, have yielded trees that are sufficiently compatible with each other to generate a consensus phylogenetic tree of the Lezgian lects. The obtained consensus tree agrees with the traditional expert classification as well as some of the previously proposed formal classifications of this linguistic group. Contrary to theoretical expectations, the UMP method has suggested the least plausible tree of all. In the case of the phonetic similarity-based input matrix, the distance-based methods (StarlingNJ, NJ, UPGMA) have produced the trees that are rather close to the consensus etymology-based tree and the traditional expert classification, whereas the character-based methods (Bayesian MCMC, UMP) have yielded less likely topologies. PMID:25719456
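
    For intuition about the distance-based pipeline, a Levenshtein distance matrix over toy wordlists can be clustered with UPGMA (average linkage) via SciPy (the lects and forms below are invented, not Global Lexicostatistical Database data):

        import numpy as np
        from itertools import combinations
        from scipy.cluster.hierarchy import linkage

        def lev(a, b):
            """Plain Levenshtein edit distance via dynamic programming."""
            d = np.arange(len(b) + 1)
            for i, ca in enumerate(a, 1):
                prev, d[0] = d[0], i
                for j, cb in enumerate(b, 1):
                    prev, d[j] = d[j], min(d[j] + 1, d[j - 1] + 1, prev + (ca != cb))
            return int(d[-1])

        lects = {"A": "qhswe", "B": "qhswa", "C": "kasw", "D": "kaswi"}
        names = list(lects)
        condensed = [lev(lects[x], lects[y]) for x, y in combinations(names, 2)]
        tree = linkage(condensed, method="average")    # UPGMA on the distance matrix
        print(tree)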

  17. Pathway analysis with next-generation sequencing data.

    PubMed

    Zhao, Jinying; Zhu, Yun; Boerwinkle, Eric; Xiong, Momiao

    2015-04-01

    Although pathway analysis methods have been developed and successfully applied to association studies of common variants, the statistical methods for pathway-based association analysis of rare variants have not been well developed. Many investigators observed highly inflated false-positive rates and low power in pathway-based tests of association of rare variants. The inflated false-positive rates and low true-positive rates of the current methods are mainly due to their lack of ability to account for gametic phase disequilibrium. To overcome these serious limitations, we develop a novel statistic that is based on the smoothed functional principal component analysis (SFPCA) for pathway association tests with next-generation sequencing data. The developed statistic has the ability to capture position-level variant information and account for gametic phase disequilibrium. By intensive simulations, we demonstrate that the SFPCA-based statistic for testing pathway association with either rare or common or both rare and common variants has the correct type 1 error rates. Also the power of the SFPCA-based statistic and 22 additional existing statistics are evaluated. We found that the SFPCA-based statistic has a much higher power than other existing statistics in all the scenarios considered. To further evaluate its performance, the SFPCA-based statistic is applied to pathway analysis of exome sequencing data in the early-onset myocardial infarction (EOMI) project. We identify three pathways significantly associated with EOMI after the Bonferroni correction. In addition, our preliminary results show that the SFPCA-based statistic has much smaller P-values to identify pathway association than other existing methods.
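
    To convey the flavor of the approach (a bare-bones illustration, not the authors' SFPCA statistic): smooth each subject's genotype profile along variant position, then extract principal component scores as region-level features for association testing:

        import numpy as np
        from scipy.ndimage import gaussian_filter1d

        rng = np.random.default_rng(3)
        G = rng.integers(0, 3, size=(500, 80)).astype(float)  # subjects x variants

        S = gaussian_filter1d(G, sigma=2.0, axis=1)   # smooth along genomic position
        X = S - S.mean(axis=0)                        # center
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        scores = U[:, :3] * s[:3]                     # top functional PC scores
        # 'scores' would then enter an association test against the phenotype.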

  18. Reliability based design including future tests and multiagent approaches

    NASA Astrophysics Data System (ADS)

    Villanueva, Diane

    The initial stages of reliability-based design optimization involve the formulation of objective functions and constraints, and building a model to estimate the reliability of the design under quantified uncertainties. However, even experienced hands often overlook important objective functions and constraints that affect the design. In addition, uncertainty reduction measures, such as tests and redesign, are often not considered in reliability calculations during the initial stages. This research considers two areas that concern the design of engineering systems: 1) the trade-off of the effect of a test and post-test redesign on reliability and cost, and 2) the search for multiple candidate designs as insurance against unforeseen faults in some designs. In this research, a methodology was developed to estimate the effect of a single future test and post-test redesign on reliability and cost. The methodology uses assumed distributions of computational and experimental errors together with redesign rules to simulate alternative future test and redesign outcomes, forming a probabilistic estimate of the reliability and cost for a given design. Further, it was explored how modeling a future test and redesign gives a company an opportunity to balance development costs against performance by simultaneously choosing the design and the post-test redesign rules during the initial design stage. The second area of this research considers the use of dynamic local surrogates, or surrogate-based agents, to locate multiple candidate designs. Surrogate-based global optimization algorithms often require searching multiple candidate regions of the design space, expending most of the computation needed to define multiple alternative designs, so focusing solely on locating the best design may be wasteful. We extended adaptive sampling surrogate techniques to locate multiple optima by building local surrogates in sub-regions of the design space. The efficiency of this method was studied, and the method was compared to other surrogate-based optimization methods that aim to locate the global optimum, using two two-dimensional test functions, a six-dimensional test function, and a five-dimensional engineering example.
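
    A stripped-down Monte Carlo sketch of simulating a single future test and redesign (all error distributions, the redesign rule and the margin are invented for illustration):

        import numpy as np

        rng = np.random.default_rng(4)
        n = 100_000
        capacity = rng.normal(1.00, 0.08, n)             # true capacity, unknown to the designer
        computed = capacity + rng.normal(0.0, 0.05, n)   # prediction with computational error
        measured = capacity + rng.normal(0.0, 0.02, n)   # future test with experimental error

        # Hypothetical redesign rule: redesign whenever the test falls more than
        # 5% below the prediction; redesign is assumed to add a fixed margin.
        redesign = measured < 0.95 * computed
        final = np.where(redesign, capacity + 0.10, capacity)

        demand = 0.90
        print("P(redesign) =", redesign.mean())
        print("reliability =", (final > demand).mean())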

  19. A polarized digital shearing speckle pattern interferometry system based on temporal wavelet transformation.

    PubMed

    Feng, Ziang; Gao, Zhan; Zhang, Xiaoqiong; Wang, Shengjia; Yang, Dong; Yuan, Hao; Qin, Jie

    2015-09-01

    Digital shearing speckle pattern interferometry (DSSPI) has been recognized as a practical tool for strain testing. A DSSPI system based on temporal analysis is attractive because of its ability to measure strain dynamically. In this paper, such a DSSPI system with a Wollaston prism has been built. The principles and system arrangement are described, and preliminary experimental results of a displacement-derivative test on an aluminum plate are shown for both the wavelet transformation method and the Fourier transformation method. Simulations were conducted with the finite element method. The comparison of the results shows that quantitative measurement of the displacement derivative has been realized.
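
    As a generic illustration of temporal phase extraction from an intensity signal with a carrier (a Hilbert-transform route on synthetic data; the paper itself uses wavelet and Fourier transformation):

        import numpy as np
        from scipy.signal import hilbert

        t = np.linspace(0.0, 1.0, 2000)
        carrier = 40.0                                   # Hz, hypothetical temporal carrier
        phase_true = 6 * np.pi * t ** 2                  # slowly varying deformation phase
        I = 1 + 0.8 * np.cos(2 * np.pi * carrier * t + phase_true)  # pixel intensity

        analytic = hilbert(I - I.mean())                 # analytic signal of the pixel history
        phase = np.unwrap(np.angle(analytic)) - 2 * np.pi * carrier * t
        # 'phase' recovers phase_true up to an additive constant.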

  20. Experimental and simulation flow rate analysis of the 3/2 directional pneumatic valve

    NASA Astrophysics Data System (ADS)

    Blasiak, Slawomir; Takosoglu, Jakub E.; Laski, Pawel A.; Pietrala, Dawid S.; Zwierzchowski, Jaroslaw; Bracha, Gabriel; Nowakowski, Lukasz; Blasiak, Malgorzata

    The work includes a comparative analysis of two test methods. The first, a numerical method, consists in determining the flow characteristics with ANSYS CFX. The poppet-type 3/2 directional valve was modeled for this purpose in the 3D CAD software SolidWorks. Based on the solid model that was developed, simulations of the air flow through the valve were conducted in the computational fluid dynamics software ANSYS CFX. The second, experimental method entailed conducting tests on a specially constructed test stand. The comparison of the results obtained with both methods made it possible to determine the cross-correlation between them. The high compatibility of the results confirms the usefulness of the numerical procedures; thus, they might serve to determine the flow characteristics of directional valves as an alternative to costly and time-consuming test stand measurements.
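
    The agreement between simulated and measured flow characteristics can be quantified with a simple correlation coefficient (toy numbers, not the paper's data):

        import numpy as np

        q_cfd = np.array([210.0, 395.0, 560.0, 700.0, 820.0])   # L/min, simulated
        q_test = np.array([205.0, 402.0, 548.0, 712.0, 805.0])  # L/min, measured

        r = np.corrcoef(q_cfd, q_test)[0, 1]
        print(f"cross-correlation coefficient r = {r:.4f}")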
