Sample records for test approach based

  1. Requirements-Based Conformance Testing of ARINC 653 Real-Time Operating Systems

    NASA Astrophysics Data System (ADS)

    Maksimov, Andrey

    2010-08-01

    Requirements-based testing is emphasized in avionics certification documents because this strategy has been found to be the most effective at revealing errors. This paper describes a unified requirements-based approach to the creation of conformance test suites for mission-critical systems. The approach uses formal machine-readable specifications of requirements and a finite state machine model for on-the-fly test sequence generation. The paper also presents a test system for automated test generation for ARINC 653 services built on this approach. Possible applications of the presented approach to various areas of avionics embedded systems testing are discussed.
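
    For illustration, the on-the-fly generation scheme can be sketched in a few lines of Python. This is a minimal toy, not the paper's test system: the states, stimuli, and the implementation callback are hypothetical stand-ins for the ARINC 653 process-management services.

      import random

      # Specification FSM: (state, stimulus) -> (next_state, expected_output)
      SPEC = {
          ("DORMANT", "START"):   ("READY",   "NO_ERROR"),
          ("READY",   "SUSPEND"): ("WAITING", "NO_ERROR"),
          ("WAITING", "RESUME"):  ("READY",   "NO_ERROR"),
          ("READY",   "STOP"):    ("DORMANT", "NO_ERROR"),
      }

      def run_on_the_fly(implementation, steps=100, seed=0):
          """Walk the specification FSM, driving the implementation under
          test and checking each observed output as the sequence unfolds."""
          rng = random.Random(seed)
          state = "DORMANT"
          for _ in range(steps):
              enabled = [stim for (st, stim) in SPEC if st == state]
              stimulus = rng.choice(enabled)
              next_state, expected = SPEC[(state, stimulus)]
              observed = implementation(stimulus)  # call the system under test
              if observed != expected:
                  return f"FAIL at {state} --{stimulus}-->: got {observed}"
              state = next_state
          return "PASS"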

  2. Flight test evaluation of the E-systems Differential GPS category 3 automatic landing system

    NASA Technical Reports Server (NTRS)

    Kaufmann, David N.; McNally, B. David

    1995-01-01

    Test flights were conducted to evaluate the capability of Differential Global Positioning System (DGPS) to provide the accuracy and integrity required for International Civil Aviation Organization (ICAO) Category (CAT) III precision approach and landings. These test flights were part of a Federal Aviation Administration (FAA) program to evaluate the technical feasibility of using DGPS-based technology for CAT III precision approach and landing applications. An IAI Westwind 1124 aircraft (N24RH) was equipped with DGPS receiving equipment and additional computing capability provided by E-Systems. The test flights were conducted at NASA Ames Research Center's Crows Landing Flight Facility, Crows Landing, California. The flight test evaluation was based on completing 100 approaches and landings. The navigation sensor error accuracy requirements were based on ICAO requirements for the Microwave Landing System (MLS). All of the approaches and landings were evaluated against ground truth reference data provided by a laser tracker. Analysis of these approaches and landings shows that the E-Systems DGPS system met the navigation sensor error requirements for a successful approach and landing in 98 out of 100 approaches and landings, based on the requirements specified in the FAA CAT III Level 2 Flight Test Plan. In addition, the E-Systems DGPS system met the integrity requirements for a successful approach and landing or stationary trial in all 100 approaches and landings and all ten stationary trials, based on the requirements specified in the FAA CAT III Level 2 Flight Test Plan.

  3. Uptake and linkage into care over one year of providing HIV testing and counselling through community and health facility testing modalities in urban informal settlement of Kibera, Nairobi Kenya.

    PubMed

    Muhula, Samuel; Memiah, Peter; Mbau, Lilian; Oruko, Happiness; Baker, Bebora; Ikiara, Geoffrey; Mungai, Margaret; Ndirangu, Meshack; Achwoka, Dunstan; Ilako, Festus

    2016-05-04

    We examine the uptake of HIV Testing and Counselling (HTC) and linkage into care over one year of providing HTC through community and health facility testing modalities among people living in the Kibera informal urban settlement in Nairobi, Kenya. We analyzed program data on health facility-based and community-based HIV testing and counselling approaches for the period from October 2013 to September 2014. Univariate and bivariate analysis methods were used to compare the two approaches with regard to uptake of HTC and subsequent linkage to care. Exact confidence intervals (CI) for the proportions were approximated using the simple normal approximation to the binomial distribution. The majority of the 18,591 clients were tested through health facility-based approaches (72.5 %, n = 13,485), while those tested through community-based approaches comprised 27.5 % (n = 5,106). Most clients tested at health facilities were reached through Provider Initiated Testing and Counselling (PITC) (81.7 %, n = 11,015), while 18.3 % were reached through Voluntary Counselling and Testing (VCT)/Client Initiated Testing and Counselling (CITC) services. All clients who tested positive during health facility-based testing were successfully linked to care either at the project sites or at sites of the client's choice, while not all who tested positive during community-based testing were linked to care. The HIV prevalence among all those who were tested for HIV in the program was 5.2 % (n = 52, 95 % CI: 3.9 %-6.8 %). Key study limitations included the use of aggregate data to report uptake of HTC through the two testing approaches and the inability to estimate the population in the catchment area likely to test for HIV. The health facility-based HTC approach reached more clients and identified greater numbers of people who were HIV positive in the Kibera slum within a one-year period compared to the community-based HTC approach. Linking HIV-positive clients to care also proved much easier during health facility-based HTC than community-based HTC.
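
    The interval construction the authors mention is the standard normal (Wald) approximation. A minimal sketch, with hypothetical counts rather than the program data:

      import math

      def wald_ci(successes, n, z=1.96):
          """95 % CI for a proportion via the normal approximation to the binomial."""
          p = successes / n
          half = z * math.sqrt(p * (1 - p) / n)
          return p - half, p + half

      lo, hi = wald_ci(52, 1000)  # illustrative counts only
      print(f"{lo:.3f} - {hi:.3f}")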

  4. Evaluation of offspring and maternal genetic effects on disease risk using a family-based approach: the "pent" design.

    PubMed

    Mitchell, Laura E; Weinberg, Clarice R

    2005-10-01

    Diseases that develop during gestation may be influenced by the genotype of the mother and the inherited genotype of the embryo/fetus. However, given the correlation between maternal and offspring genotypes, differentiating between inherited and maternal genetic effects is not straightforward. The two-step transmission disequilibrium test was the first family-based test proposed for the purpose of differentiating between maternal and offspring genetic effects. However, this approach, which requires data from "pents" comprising an affected child, mother, father, and maternal grandparents, provides biased tests for maternal genetic effects when the offspring genotype is associated with disease. An alternative approach based on transmissions from grandparents provides unbiased tests for maternal and offspring genetic effects but requires genotype information for paternal grandparents in addition to pents. The authors have developed two additional pent-based approaches for the evaluation of maternal and offspring genetic effects. One approach requires the assumption of genetic mating type symmetry (pent-1), whereas the other does not (pent-2). Simulation studies demonstrate that both of these approaches provide valid estimation and testing for offspring and maternal genotypic effects. In addition, the power of the pent-1 approach is comparable with that of the approach based on data from all four grandparents.

  5. Testing Strategies for Model-Based Development

    NASA Technical Reports Server (NTRS)

    Heimdahl, Mats P. E.; Whalen, Mike; Rajan, Ajitha; Miller, Steven P.

    2006-01-01

    This report presents an approach for testing artifacts generated in a model-based development process. This approach divides the traditional testing process into two parts: requirements-based testing (validation testing) which determines whether the model implements the high-level requirements and model-based testing (conformance testing) which determines whether the code generated from a model is behaviorally equivalent to the model. The goals of the two processes differ significantly and this report explores suitable testing metrics and automation strategies for each. To support requirements-based testing, we define novel objective requirements coverage metrics similar to existing specification and code coverage metrics. For model-based testing, we briefly describe automation strategies and examine the fault-finding capability of different structural coverage metrics using tests automatically generated from the model.
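
    As a rough illustration of the requirements-coverage idea (a simplification of the metrics defined in the report), one can treat each requirement as a predicate over a test trace and count how many requirements a suite exercises; the requirements and traces below are invented:

      REQUIREMENTS = {
          "R1: if overspeed then warn": lambda trace: any(s["speed"] > 250 for s in trace),
          "R2: if door open then hold": lambda trace: any(s["door_open"] for s in trace),
      }

      suite = [
          [{"speed": 260, "door_open": False}, {"speed": 240, "door_open": False}],
          [{"speed": 100, "door_open": False}],
      ]

      covered = {name for name, triggered in REQUIREMENTS.items()
                 if any(triggered(trace) for trace in suite)}
      print(f"requirements coverage: {len(covered)}/{len(REQUIREMENTS)}")  # 1/2 here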

  6. A Strength-Based Approach to Teacher Professional Development

    ERIC Educational Resources Information Center

    Zwart, Rosanne C.; Korthagen, Fred A. J.; Attema-Noordewier, Saskia

    2015-01-01

    Based on positive psychology, self-determination theory and a perspective on teacher quality, this study proposes and examines a strength-based approach to teacher professional development. A mixed method pre-test/post-test design was adopted to study perceived outcomes of the approach for 93 teachers of six primary schools in the Netherlands and…

  7. Flight test evaluation of the Stanford University/United Airlines differential GPS Category 3 automatic landing system

    NASA Technical Reports Server (NTRS)

    Kaufmann, David N.; McNally, B. David

    1995-01-01

    Test flights were conducted to evaluate the capability of Differential Global Positioning System (DGPS) to provide the accuracy and integrity required for International Civil Aviation Organization (ICAO) Category (CAT) 3 precision approach and landings. These test flights were part of a Federal Aviation Administration (FAA) program to evaluate the technical feasibility of using DGPS-based technology for CAT 3 precision approach and landing applications. A United Airlines Boeing 737-300 (N304UA) was equipped with DGPS receiving equipment and additional computing capability provided by Stanford University. The test flights were conducted at NASA Ames Research Center's Crows Landing Flight Facility, Crows Landing, California. The flight test evaluation was based on completing 100 approaches and autolandings: 90 touch-and-go and 10 terminating with a full stop. Two types of accuracy requirements were evaluated: 1) total system error, based on the Required Navigation Performance (RNP), and 2) navigation sensor error, based on ICAO requirements for the Microwave Landing System (MLS). All of the approaches and autolandings were evaluated against ground truth reference data provided by a laser tracker. Analysis of these approaches and autolandings shows that the Stanford University/United Airlines system met the requirements for a successful approach and autolanding in 98 out of 100 approaches and autolandings, based on the total system error requirements as specified in the FAA CAT 3 Level 2 Flight Test Plan.
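
    The two error measures can be made concrete with a small sketch. Positions are reduced to cross-track values, and all numbers, including the limits, are placeholders rather than the FAA/ICAO requirements:

      import numpy as np

      dgps_pos    = np.array([1.2, -0.4, 0.8])  # DGPS-derived cross-track (m), per approach
      truth_pos   = np.array([1.0, -0.1, 0.5])  # laser-tracker ground truth (m)
      desired_pos = np.zeros(3)                 # desired path (runway centerline)

      nse = dgps_pos - truth_pos     # navigation sensor error
      fte = dgps_pos - desired_pos   # flight technical error (guidance deviation)
      tse = fte - nse                # total system error: true position vs. desired path

      NSE_LIMIT, TSE_LIMIT = 1.0, 2.0  # hypothetical limits
      passed = (np.abs(nse) <= NSE_LIMIT) & (np.abs(tse) <= TSE_LIMIT)
      print(f"{passed.sum()} of {passed.size} approaches met both limits")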

  8. Functional testing of space flight induced changes in tonic motor control by using limb-attached excitation and load devices

    NASA Astrophysics Data System (ADS)

    Gallasch, Eugen; Kozlovskaya, Inessa

    2007-02-01

    Long-term space flights induce atrophy and contractile changes in postural muscles, thus affecting tonic motor control. Functional testing of tonic motor control structures is a challenge because of the difficulty of delivering appropriate test forces to crew members. In this paper we propose two approaches for functional testing using limb-attached loading devices. The first approach is based on a frequency- and amplitude-controllable moving magnet exciter that delivers sinusoidal test forces during limb postures. The responding limb deflection is recorded by an embedded accelerometer to obtain limb impedance. The second approach is based on elastic limb loading to evoke self-excited oscillations during arm extensions. Here the contraction force at the oscillation onset provides information about limb stiffness. The rationale for both testing approaches is based on Feldman's λ-model. An arm expander based on the second approach was probed in a 6-month MIR space flight. The results obtained from the load oscillations confirmed that this device is well suited to capture space flight induced neuromuscular changes.
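
    The first approach reduces to estimating mechanical impedance Z = F/V at the drive frequency. A self-contained sketch with synthetic signals (amplitudes, phase, and frequency are illustrative, not flight data):

      import numpy as np

      fs, f0, T = 1000.0, 8.0, 4.0  # sample rate (Hz), drive frequency (Hz), duration (s)
      t = np.arange(0, T, 1 / fs)
      force = 2.0 * np.sin(2 * np.pi * f0 * t)        # applied sinusoidal test force (N)
      accel = 0.5 * np.sin(2 * np.pi * f0 * t + 0.6)  # measured limb acceleration (m/s^2)

      def complex_amp(x):
          """Complex amplitude of x at f0 (correlation with a complex exponential)."""
          return 2 * np.mean(x * np.exp(-2j * np.pi * f0 * t))

      F, A = complex_amp(force), complex_amp(accel)
      V = A / (2j * np.pi * f0)  # velocity amplitude, since a(t) = dv/dt
      Z = F / V                  # mechanical impedance (N·s/m)
      print(abs(Z), np.angle(Z))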

  9. Implementation and Operational Research: Cost and Efficiency of a Hybrid Mobile Multidisease Testing Approach With High HIV Testing Coverage in East Africa.

    PubMed

    Chang, Wei; Chamie, Gabriel; Mwai, Daniel; Clark, Tamara D; Thirumurthy, Harsha; Charlebois, Edwin D; Petersen, Maya; Kabami, Jane; Ssemmondo, Emmanuel; Kadede, Kevin; Kwarisiima, Dalsone; Sang, Norton; Bukusi, Elizabeth A; Cohen, Craig R; Kamya, Moses; Havlir, Diane V; Kahn, James G

    2016-11-01

    In 2013-2014, we achieved 89% adult HIV testing coverage using a hybrid testing approach in 32 communities in Uganda and Kenya (SEARCH: NCT01864603). To inform scalability, we sought to determine: (1) overall cost and efficiency of this approach; and (2) costs associated with point-of-care (POC) CD4 testing, multidisease services, and community mobilization. We applied microcosting methods to estimate costs of population-wide HIV testing in 12 SEARCH trial communities. Main intervention components of the hybrid approach are census, multidisease community health campaigns (CHC), and home-based testing for CHC nonattendees. POC CD4 tests were provided for all HIV-infected participants. Data were extracted from expenditure records, activity registers, staff interviews, and time and motion logs. The mean cost per adult tested for HIV was $20.5 (range: $17.1-$32.1) (2014 US$), including a POC CD4 test at $16 per HIV+ person identified. Cost per adult tested for HIV was $13.8 at CHC vs. $31.7 by home-based testing. The cost per HIV+ adult identified was $231 ($87-$1245), with variability due mainly to HIV prevalence among persons tested (ie, HIV positivity rate). The marginal costs of multidisease testing at CHCs were $1.16/person for hypertension and diabetes, and $0.90 for malaria. Community mobilization constituted 15.3% of total costs. The hybrid testing approach achieved very high HIV testing coverage, with POC CD4, at costs similar to previously reported mobile, home-based, or venue-based HIV testing approaches in sub-Saharan Africa. By leveraging HIV infrastructure, multidisease services were offered at low marginal costs.
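
    The headline efficiency figures are simple ratios of microcosted totals. A back-of-envelope sketch with hypothetical community-level inputs (not the SEARCH data):

      communities = [
          # (total testing cost in US$, adults tested, HIV+ identified)
          (61_500, 3_000, 250),
          (42_000, 2_050, 120),
      ]
      for cost, tested, positives in communities:
          print(f"cost per adult tested: ${cost / tested:.1f}  "
                f"cost per HIV+ identified: ${cost / positives:.0f}")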

  10. Design and Development of a Rapid Research, Design, and Development Platform for In-Situ Testing of Tools and Concepts for Trajectory-Based Operations

    NASA Technical Reports Server (NTRS)

    Underwood, Matthew C.

    2017-01-01

    To provide justification for equipping a fleet of aircraft with avionics capable of supporting trajectory-based operations, significant flight testing must be accomplished. However, equipping aircraft with these avionics and enabling technologies to communicate the clearances required for trajectory-based operations is cost-challenging using conventional avionics approaches. This paper describes an approach to minimize the costs and risks of flight testing these technologies in-situ, discusses the test-bed platform developed, and highlights results from a proof-of-concept flight test campaign that demonstrates the feasibility and efficiency of this approach.

  11. A microprocessor-based table lookup approach for magnetic bearing linearization

    NASA Technical Reports Server (NTRS)

    Groom, N. J.; Miller, J. B.

    1981-01-01

    An approach for producing a linear transfer characteristic between force command and force output of a magnetic bearing actuator without flux biasing is presented. The approach is microprocessor based and uses a table lookup to generate drive signals for the magnetic bearing power driver. An experimental test setup used to demonstrate the feasibility of the approach is described, and test results are presented. The test setup contains bearing elements similar to those used in a laboratory model annular momentum control device.
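
    The idea is that, without flux biasing, force varies roughly as the square of coil current, so a precomputed table can invert the nonlinearity. A conceptual sketch with made-up constants (a real table would come from the bearing's measured characteristic):

      import numpy as np

      K, GAP = 2.5e-6, 0.5e-3  # illustrative force constant and nominal air gap (m)

      def force_from_current(i):  # simplified single-magnet model: F = K i^2 / g^2
          return K * i**2 / GAP**2

      # Build the lookup table offline: force -> current
      i_grid = np.linspace(0.0, 2.0, 256)
      f_grid = force_from_current(i_grid)

      def current_command(f_cmd):
          """Table lookup with interpolation; the sign selects the opposing magnet."""
          magnitude = np.interp(abs(f_cmd), f_grid, i_grid)
          return np.copysign(magnitude, f_cmd)

      print(current_command(5.0), current_command(-5.0))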

  12. A Theoretical and Empirical Comparison of Three Approaches to Achievement Testing.

    ERIC Educational Resources Information Center

    Haladyna, Tom; Roid, Gale

    Three approaches to the construction of achievement tests are compared: construct, operational, and empirical. The construct approach is based upon classical test theory and measures an abstract representation of the instructional objectives. The operational approach specifies instructional intent through instructional objectives, facet design,…

  13. Automatic testing and assessment of neuroanatomy using a digital brain atlas: method and development of computer- and mobile-based applications.

    PubMed

    Nowinski, Wieslaw L; Thirunavuukarasuu, Arumugam; Ananthasubramaniam, Anand; Chua, Beng Choon; Qian, Guoyu; Nowinska, Natalia G; Marchenko, Yevgen; Volkau, Ihar

    2009-10-01

    Preparation of tests and assessment of students by the instructor are time-consuming. We address these two tasks in neuroanatomy education by employing a digital media application with a three-dimensional (3D), interactive, fully segmented, and labeled brain atlas. The anatomical and vascular models in the atlas are linked to Terminologia Anatomica. Because the cerebral models are fully segmented and labeled, our approach enables automatic and random atlas-derived generation of questions to test location and naming of cerebral structures. This is done in four steps: test individualization by the instructor, test taking by the students at their convenience, automatic student assessment by the application, and communication of the individual assessment to the instructor. A computer-based application with an interactive 3D atlas and a preliminary mobile-based application were developed to realize this approach. The application works in two test modes: instructor and student. In the instructor mode, the instructor customizes the test by setting the scope of testing and student performance criteria, which takes a few seconds. In the student mode, the student is tested and automatically assessed. Self-testing is also feasible at any time and pace. Our approach is automatic both with respect to test generation and student assessment. It is also objective, rapid, and customizable. We believe that this approach is novel from computer-based, mobile-based, and atlas-assisted standpoints.
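
    Because every structure carries a label, item generation can be as simple as sampling from the label set. A toy sketch (structure handles are invented, and the 3D rendering side is omitted):

      import random

      ATLAS = {  # structure label -> hypothetical mesh handle
          "caudate nucleus": "mesh_017",
          "putamen": "mesh_021",
          "anterior cerebral artery": "mesh_104",
      }

      def make_test(n_items, scope=None, seed=0):
          """Random atlas-derived items: name the highlighted structure,
          or locate a named structure."""
          rng = random.Random(seed)
          pool = list(scope or ATLAS)
          return [(rng.choice(["name", "locate"]), label, ATLAS[label])
                  for label in (rng.choice(pool) for _ in range(n_items))]

      def score(items, answers):
          """Automatic assessment: fraction of correct responses."""
          correct = sum(ans == label for (_, label, _), ans in zip(items, answers))
          return correct / len(items)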

  14. Cognitive Effects of Mindfulness Training: Results of a Pilot Study Based on a Theory Driven Approach

    PubMed Central

    Wimmer, Lena; Bellingrath, Silja; von Stockhausen, Lisa

    2016-01-01

    The present paper reports a pilot study which tested cognitive effects of mindfulness practice in a theory-driven approach. Thirty-four fifth graders received either a mindfulness training which was based on the mindfulness-based stress reduction approach (experimental group), a concentration training (active control group), or no treatment (passive control group). Based on the operational definition of mindfulness by Bishop et al. (2004), effects on sustained attention, cognitive flexibility, cognitive inhibition, and data-driven as opposed to schema-based information processing were predicted. These abilities were assessed in a pre-post design by means of a vigilance test, a reversible figures test, the Wisconsin Card Sorting Test, a Stroop test, a visual search task, and a recognition task of prototypical faces. Results suggest that the mindfulness training specifically improved cognitive inhibition and data-driven information processing. PMID:27462287

  15. Cognitive Effects of Mindfulness Training: Results of a Pilot Study Based on a Theory Driven Approach.

    PubMed

    Wimmer, Lena; Bellingrath, Silja; von Stockhausen, Lisa

    2016-01-01

    The present paper reports a pilot study which tested cognitive effects of mindfulness practice in a theory-driven approach. Thirty-four fifth graders received either a mindfulness training which was based on the mindfulness-based stress reduction approach (experimental group), a concentration training (active control group), or no treatment (passive control group). Based on the operational definition of mindfulness by Bishop et al. (2004), effects on sustained attention, cognitive flexibility, cognitive inhibition, and data-driven as opposed to schema-based information processing were predicted. These abilities were assessed in a pre-post design by means of a vigilance test, a reversible figures test, the Wisconsin Card Sorting Test, a Stroop test, a visual search task, and a recognition task of prototypical faces. Results suggest that the mindfulness training specifically improved cognitive inhibition and data-driven information processing.

  16. A Top-Down Approach to Designing the Computerized Adaptive Multistage Test

    ERIC Educational Resources Information Center

    Luo, Xiao; Kim, Doyoung

    2018-01-01

    The top-down approach to designing a multistage test is relatively understudied in the literature and underused in research and practice. This study introduced a route-based top-down design approach that directly sets design parameters at the test level and utilizes the advanced automated test assembly algorithm seeking global optimality. The…

  17. The preparedness level of final year medical students for an adequate medical approach to emergency cases: computer-based medical education in emergency medicine

    PubMed Central

    2014-01-01

    Background We aimed to observe the preparedness level of final year medical students in approaching emergencies by computer-based simulation training and evaluate the efficacy of the program. Methods A computer-based prototype simulation program (Lsim), designed by researchers from the medical education and computer science departments, was used to present virtual cases for medical learning. Fifty-four final year medical students from Ondokuz Mayis University School of Medicine attended an education program on June 20, 2012 and were trained with Lsim. Volunteer attendants completed a pre-test and post-test exam at the beginning and end of the course, respectively, on the same day. Results Twenty-nine of the 54 students who attended the course accepted to take the pre-test and post-test exams; 58.6% (n = 17) were female. In 10 emergency medical cases, an average of 3.9 correct medical approaches were performed in the pre-test and an average of 9.6 correct medical approaches were performed in the post-test (t = 17.18, P = 0.006). Conclusions This study’s results showed that the readiness level of students for an adequate medical approach to emergency cases was very low. Computer-based training could help in the adequate approach of students to various emergency cases. PMID:24386919
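
    The reported comparison is a paired pre/post test. A sketch of the same analysis on synthetic scores (the study's raw data are not reproduced here):

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      # Correct medical approaches out of 10 cases, per student (synthetic)
      pre  = np.clip(rng.normal(3.9, 1.5, 29).round(), 0, 10)
      post = np.clip(rng.normal(9.6, 0.5, 29).round(), 0, 10)

      t, p = stats.ttest_rel(post, pre)  # paired t-test
      print(f"t = {t:.2f}, p = {p:.4g}")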

  18. A clinical decision support system for diagnosis of Allergic Rhinitis based on intradermal skin tests.

    PubMed

    Jabez Christopher, J; Khanna Nehemiah, H; Kannan, A

    2015-10-01

    Allergic Rhinitis is a common disease worldwide, especially in populated cities and urban areas. Diagnosis and treatment of Allergic Rhinitis will improve the quality of life of allergic patients. Though skin tests remain the gold standard test for diagnosis of allergic disorders, clinical experts are required for accurate interpretation of test outcomes. This work presents a clinical decision support system (CDSS) to assist junior clinicians in the diagnosis of Allergic Rhinitis. Intradermal skin tests were performed on patients who had plausible allergic symptoms. Based on the patient's history, 40 clinically relevant allergens were tested. A total of 872 patients who had allergic symptoms were considered for this study. The rule-based classification approach and the clinical test results were used to develop and validate the CDSS. Clinical relevance of the CDSS was compared with the Score for Allergic Rhinitis (SFAR). Tests were conducted for junior clinicians to assess their diagnostic capability in the absence of an expert. The class-based association rule generation approach provides a concise set of rules that is further validated by clinical experts. The interpretations of the experts are considered the gold standard. The CDSS diagnoses the presence or absence of rhinitis with an accuracy of 88.31%. The allergy specialists and the junior clinicians prefer the rule-based approach for its comprehensible knowledge model. The CDSS with the rule-based classification approach assists junior doctors and clinicians in the diagnosis of Allergic Rhinitis, helping them make reliable decisions based on the reports of intradermal skin tests. Copyright © 2015 Elsevier Ltd. All rights reserved.
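
    A rule-based CDSS of this kind can be reduced to a small core: rules map sets of positive skin-test findings to a diagnosis, and accuracy is scored against expert interpretation. The allergens, rules, and patients below are invented for illustration:

      RULES = [  # antecedent set of positive allergens -> diagnosis
          ({"house dust mite", "cockroach"}, "allergic rhinitis"),
          ({"parthenium pollen"},            "allergic rhinitis"),
      ]
      DEFAULT = "no allergic rhinitis"

      def classify(positive_allergens):
          for antecedent, diagnosis in RULES:
              if antecedent <= positive_allergens:  # all rule allergens positive
                  return diagnosis
          return DEFAULT

      patients = [({"house dust mite", "cockroach"}, "allergic rhinitis"),
                  ({"cat dander"},                   "no allergic rhinitis")]
      accuracy = sum(classify(found) == label for found, label in patients) / len(patients)
      print(f"accuracy vs. expert labels: {accuracy:.2%}")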

  19. Cost and efficiency of a hybrid mobile multi-disease testing approach with high HIV testing coverage in East Africa

    PubMed Central

    Chang, Wei; Chamie, Gabriel; Mwai, Daniel; Clark, Tamara D.; Thirumurthy, Harsha; Charlebois, Edwin D.; Petersen, Maya; Kabami, Jane; Ssemmondo, Emmanuel; Kadede, Kevin; Kwarisiima, Dalsone; Sang, Norton; Bukusi, Elizabeth A.; Cohen, Craig R.; Kamya, Moses; Havlir, Diane V.; Kahn, James G.

    2016-01-01

    Background In 2013-14, we achieved 89% adult HIV testing coverage using a hybrid testing approach in 32 communities in Uganda and Kenya (SEARCH: NCT01864603). To inform scalability, we sought to determine: 1) overall cost and efficiency of this approach; and 2) costs associated with point-of-care (POC) CD4 testing, multi-disease services, and community mobilization. Methods We applied micro-costing methods to estimate costs of population-wide HIV testing in 12 SEARCH Trial communities. Main intervention components of the hybrid approach are census, multi-disease community health campaigns (CHC), and home-based testing (HBT) for CHC non-attendees. POC CD4 tests were provided for all HIV-infected participants. Data were extracted from expenditure records, activity registers, staff interviews, and time and motion logs. Results The mean cost per adult tested for HIV was $20.5 (range: $17.1 - $32.1) [2014 US$], including a POC CD4 test at $16 per HIV+ person identified. Cost per adult tested for HIV was $13.8 at CHC vs. $31.7 via HBT. The cost per HIV+ adult identified was $231 ($87 - $1,245), with variability due mainly to HIV prevalence among persons tested (i.e., HIV positivity rate). The marginal costs of multi-disease testing at CHCs were $1.16/person for hypertension and diabetes, and $0.90 for malaria. Community mobilization constituted 15.3% of total costs. Conclusions The hybrid testing approach achieved very high HIV testing coverage, with POC CD4, at costs similar to previously reported mobile, home-based, or venue-based HIV testing approaches in sub-Saharan Africa. By leveraging HIV infrastructure, multi-disease services were offered at low marginal costs. PMID:27741031

  20. Dimensionality Analysis of "CBAL"™ Writing Tests. Research Report. ETS RR-13-10

    ERIC Educational Resources Information Center

    Fu, Jianbin; Chung, Seunghee; Wise, Maxwell

    2013-01-01

    The Cognitively Based Assessment of, for, and as Learning ("CBAL"™) research initiative is aimed at developing an innovative approach to K-12 assessment based on cognitive competency models. Because the choice of scoring and equating approaches depends on test dimensionality, the dimensional structure of CBAL tests must be understood.…

  1. Multidimensional Test Assembly Based on Lagrangian Relaxation Techniques. Research Report 98-08.

    ERIC Educational Resources Information Center

    Veldkamp, Bernard P.

    In this paper, a mathematical programming approach is presented for the assembly of ability tests measuring multiple traits. The values of the variance functions of the estimators of the traits are minimized, while test specifications are met. The approach is based on Lagrangian relaxation techniques and provides good results for the two…

  2. The Effectiveness of Problem-Based Learning Approach Based on Multiple Intelligences in Terms of Student’s Achievement, Mathematical Connection Ability, and Self-Esteem

    NASA Astrophysics Data System (ADS)

    Kartikasari, A.; Widjajanti, D. B.

    2017-02-01

    The aim of this study is to explore the effectiveness of a problem-based learning approach based on multiple intelligences in developing students' achievement, mathematical connection ability, and self-esteem. This is an experimental study with a sample of 30 Grade X MIA III students of MAN Yogyakarta III. The learning materials implemented consisted of trigonometry and geometry. For the purpose of this study, the researchers designed an achievement test made up of 44 multiple-choice questions (24 on trigonometry and 20 on geometry), a mathematical connection test consisting of 7 essay questions, and a 30-item self-esteem questionnaire. The learning approach is said to be effective if the proportion of students who achieved the KKM on the achievement test and the proportions of students who achieved a minimum score in the high category on the mathematical connection test and the self-esteem questionnaire were each greater than or equal to 70%. Based on hypothesis testing at the 5% significance level, it can be concluded that the learning approach using problem-based learning based on multiple intelligences was effective in terms of students' achievement, mathematical connection ability, and self-esteem.
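
    The 70% effectiveness criterion amounts to a one-sided test of a proportion. A minimal sketch with hypothetical counts (not the study's data):

      import math

      def z_test_prop(successes, n, p0=0.70):
          """One-sided z-test of H0: p <= p0 against H1: p > p0."""
          p_hat = successes / n
          z = (p_hat - p0) / math.sqrt(p0 * (1 - p0) / n)
          p_value = 0.5 * math.erfc(z / math.sqrt(2))  # normal survival function
          return z, p_value

      z, p = z_test_prop(26, 30)  # e.g., 26 of 30 students reach the KKM
      print(f"z = {z:.2f}, p = {p:.4f}, reject H0 at 5%: {p < 0.05}")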

  3. Emerging Rapid Resistance Testing Methods for Clinical Microbiology Laboratories and Their Potential Impact on Patient Management

    PubMed Central

    Frickmann, Hagen; Zautner, Andreas E.

    2014-01-01

    Atypical and multidrug resistance, especially ESBL- and carbapenemase-expressing Enterobacteriaceae, is spreading globally. Therefore, it becomes increasingly difficult to achieve therapeutic success by calculated antibiotic therapy. Consequently, rapid antibiotic resistance testing is essential. Various molecular and mass spectrometry-based approaches have been introduced in diagnostic microbiology to speed up the provision of reliable resistance data. PCR- and sequencing-based approaches are the most expensive but the most frequently applied modes of testing, suitable for the detection of resistance genes even from primary material. Next generation sequencing, based either on assessment of allelic single nucleotide polymorphisms or on the detection of nonubiquitous resistance mechanisms, might allow for sequence-based bacterial resistance testing comparable to viral resistance testing in the long term. Fluorescence in situ hybridization (FISH), based on specific binding of fluorescence-labeled oligonucleotide probes, provides a less expensive molecular bridging technique. It is particularly useful for detection of resistance mechanisms based on mutations in ribosomal RNA. Approaches based on MALDI-TOF-MS, alone or in combination with molecular techniques, like PCR/electrospray ionization MS or minisequencing, provide the fastest resistance results from pure colonies or even primary samples, with a growing number of protocols. This review details the various approaches of rapid resistance testing, their pros and cons, and their potential use for the diagnostic laboratory. PMID:25343142

  4. Current and Emerging Technology Approaches in Genomics

    PubMed Central

    Conley, Yvette P.; Biesecker, Leslie G.; Gonsalves, Stephen; Merkle, Carrie J.; Kirk, Maggie; Aouizerat, Bradley E.

    2013-01-01

    Purpose To introduce current and emerging approaches that are being utilized in the field of genomics so the reader can conceptually evaluate the literature and appreciate how these approaches are advancing our understanding of health-related issues. Organizing Construct Each approach is described, including information on how it is advancing research, its potential clinical utility, exemplars of current uses, challenges related to the technologies used for these approaches, and, when appropriate, information related to understanding the evidence base for clinical utilization of each approach. Web-based resources are included for the reader who would like more in-depth information and the opportunity to stay up to date with these approaches and their utility. Conclusions The chosen approaches (genome sequencing, genome-wide association studies, epigenomics, and gene expression) are extremely valuable approaches for collecting research data to help us better understand the pathophysiology of a variety of health-related conditions, but they are also gaining in utility for clinical assessment and testing purposes. Clinical Relevance Our increased understanding of the molecular underpinnings of disease will assist with better development of screening tests, diagnostic tests, tests that allow us to prognosticate, tests that allow for individualized treatments, and tests to facilitate post-treatment surveillance. PMID:23294727

  5. A Psychometric Approach to Theory-Based Behavior Change Intervention Development: Example From the Colorado Meaning-Activity Project.

    PubMed

    Masters, Kevin S; Ross, Kaile M; Hooker, Stephanie A; Wooldridge, Jennalee L

    2018-05-18

    There has been a notable disconnect between theories of behavior change and behavior change interventions. Because few interventions are both explicitly and adequately theory-based, investigators cannot assess the impact of theory on intervention effectiveness. Theory-based interventions, designed to deliberately engage the theory's proposed mechanisms of change, are needed to adequately test theories. Thus, systematic approaches to theory-based intervention development are needed. This article will introduce and discuss the psychometric method of developing theory-based interventions. The psychometric approach to intervention development utilizes basic psychometric principles at each step of the intervention development process in order to build a theoretically driven intervention to, subsequently, be tested in process (mechanism) and outcome studies. Five stages of intervention development are presented as follows: (i) Choice of theory; (ii) Identification and characterization of key concepts and expected relations; (iii) Intervention construction; (iv) Initial testing and revision; and (v) Empirical testing of the intervention. Examples of this approach from the Colorado Meaning-Activity Project (COMAP) are presented. Based on self-determination theory integrated with meaning or purpose, and utilizing a motivational interviewing approach, the COMAP intervention is individually based with an initial interview followed by smart phone-delivered interventions for increasing daily activity. The psychometric approach to intervention development is one method to ensure careful consideration of theory in all steps of intervention development. This structured approach supports developing a research culture that endorses deliberate and systematic operationalization of theory into behavior change intervention from the outset of intervention development.

  6. Deep Learning-Based Noise Reduction Approach to Improve Speech Intelligibility for Cochlear Implant Recipients.

    PubMed

    Lai, Ying-Hui; Tsao, Yu; Lu, Xugang; Chen, Fei; Su, Yu-Ting; Chen, Kuang-Chao; Chen, Yu-Hsuan; Chen, Li-Ching; Po-Hung Li, Lieber; Lee, Chin-Hui

    2018-01-20

    We investigate the clinical effectiveness of a novel deep learning-based noise reduction (NR) approach under noisy conditions with challenging noise types at low signal to noise ratio (SNR) levels for Mandarin-speaking cochlear implant (CI) recipients. The deep learning-based NR approach used in this study consists of two modules: noise classifier (NC) and deep denoising autoencoder (DDAE), thus termed (NC + DDAE). In a series of comprehensive experiments, we conduct qualitative and quantitative analyses on the NC module and the overall NC + DDAE approach. Moreover, we evaluate the speech recognition performance of the NC + DDAE NR and classical single-microphone NR approaches for Mandarin-speaking CI recipients under different noisy conditions. The testing set contains Mandarin sentences corrupted by two types of maskers, two-talker babble noise and construction jackhammer noise, at 0 and 5 dB SNR levels. Two conventional NR techniques and the proposed deep learning-based approach are used to process the noisy utterances. We qualitatively compare the NR approaches by the amplitude envelope and spectrogram plots of the processed utterances. Quantitative objective measures include (1) normalized covariance measure to test the intelligibility of the utterances processed by each of the NR approaches; and (2) speech recognition tests conducted by nine Mandarin-speaking CI recipients. These nine CI recipients use their own clinical speech processors during testing. The experimental results of objective evaluation and listening tests indicate that under challenging listening conditions, the proposed NC + DDAE NR approach yields higher intelligibility scores than the two compared classical NR techniques, under both matched and mismatched training-testing conditions. When compared to the two well-known conventional NR techniques under challenging listening conditions, the proposed NC + DDAE NR approach has superior noise suppression capabilities and gives less distortion for the key speech envelope information, thus improving speech recognition more effectively for Mandarin CI recipients. The results suggest that the proposed deep learning-based NR approach can potentially be integrated into existing CI signal processors to overcome the degradation of speech perception caused by noise.
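
    Stripped to its essentials, the DDAE module is a regression network from noisy to clean spectral features. A highly simplified sketch (synthetic data, arbitrary dimensions; the noise classifier and CI vocoder stages are omitted):

      import torch
      from torch import nn

      torch.manual_seed(0)
      clean = torch.randn(512, 64)              # clean log-power spectra (frames x bins)
      noisy = clean + 0.3 * torch.randn_like(clean)

      ddae = nn.Sequential(nn.Linear(64, 128), nn.ReLU(),
                           nn.Linear(128, 128), nn.ReLU(),
                           nn.Linear(128, 64))
      optimizer = torch.optim.Adam(ddae.parameters(), lr=1e-3)
      for _ in range(200):                      # minimize reconstruction error
          optimizer.zero_grad()
          loss = nn.functional.mse_loss(ddae(noisy), clean)
          loss.backward()
          optimizer.step()
      print(float(loss))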

  7. Resolution of Forces and Strain Measurements from an Acoustic Ground Test

    NASA Technical Reports Server (NTRS)

    Smith, Andrew M.; LaVerde, Bruce T.; Hunt, Ronald; Waldon, James M.

    2013-01-01

    The conservatism in typical vibration tests was demonstrated: vibration testing at the component level produced force reactions approximately a factor of 4 (approx. 12 dB) more conservative than the integrated acoustic test in 2 out of 3 axes. Reaction forces estimated at the base of equipment using a finite-element-based method were validated: the FEM-based estimate of interface forces may be adequate to guide development of vibration test criteria with less conservatism. Element forces estimated in secondary structure struts were also validated: the finite element approach provided the best estimate of axial strut forces in the frequency range below 200 Hz, where a rigid lumped-mass assumption for the entire electronics box was valid; models with enough fidelity to represent the diminishing apparent mass of equipment are better suited for estimating force reactions across the full frequency range. Forward work: demonstrate the reduction in conservatism provided by the current force-limited approach and by an FEM-guided approach, and validate the proposed CMS approach for estimating coupled response from uncoupled system characteristics for vibroacoustics.

  8. Measuring Performance: Teacher-Made Tests.

    ERIC Educational Resources Information Center

    Haladyna, Tom

    Among the new testing developments are the use of objectives or goals in instruction, competency based approaches to instruction, criterion referenced testing, and performance oriented testing. These new approaches often emphasize individualized learning; each student's progress is individually monitored by comparison with clear statements of what…

  9. Optimal Assembly of Psychological and Educational Tests.

    ERIC Educational Resources Information Center

    van der Linden, Wim J.

    1998-01-01

    Reviews optimal test-assembly literature and introduces the contributions to this special issue. Discusses four approaches to computerized test assembly: (1) heuristic-based test assembly; (2) 0-1 linear programming; (3) network-flow programming; and (4) an optimal design approach. Contains a bibliography of 90 sources on test assembly.…
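
    To make the assembly problem concrete: formally it is a 0-1 program (select items x_i to maximize summed Fisher information at a target ability, subject to constraints). The sketch below substitutes a greedy heuristic for a real integer-programming solver and uses a synthetic 2PL item pool:

      import math, random

      random.seed(2)
      pool = [{"a": random.uniform(0.5, 2.0),   # discrimination
               "b": random.uniform(-2, 2),      # difficulty
               "area": random.choice("AB")}     # content area
              for _ in range(100)]

      def info(item, theta=0.0):
          """Fisher information of a 2PL item at ability theta."""
          p = 1 / (1 + math.exp(-item["a"] * (theta - item["b"])))
          return item["a"] ** 2 * p * (1 - p)

      def assemble(pool, length=20, max_per_area=12):
          chosen, per_area = [], {"A": 0, "B": 0}
          for item in sorted(pool, key=info, reverse=True):
              if len(chosen) == length:
                  break
              if per_area[item["area"]] < max_per_area:
                  chosen.append(item)
                  per_area[item["area"]] += 1
          return chosen

      test = assemble(pool)
      print(len(test), sum(info(item) for item in test))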

  10. Does Matching Quality Matter in Mode Comparison Studies?

    ERIC Educational Resources Information Center

    Zeng, Ji; Yin, Ping; Shedden, Kerby A.

    2015-01-01

    This article provides a brief overview and comparison of three matching approaches in forming comparable groups for a study comparing test administration modes (i.e., computer-based tests [CBT] and paper-and-pencil tests [PPT]): (a) a propensity score matching approach proposed in this article, (b) the propensity score matching approach used by…
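
    One common variant of the first approach can be sketched directly: fit a logistic model for mode membership, then greedily match each CBT taker to the nearest unused PPT taker on the propensity score. Data and model here are synthetic, and the article's own matching rules may differ:

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(0)
      X = rng.normal(size=(200, 3))                        # covariates (e.g., prior scores)
      mode = rng.binomial(1, 1 / (1 + np.exp(-X[:, 0])))   # 1 = CBT, 0 = PPT

      ps = LogisticRegression().fit(X, mode).predict_proba(X)[:, 1]
      treated = np.where(mode == 1)[0]
      control = np.where(mode == 0)[0]

      pairs, used = [], set()
      for i in treated:                                    # greedy 1:1 nearest neighbor
          candidates = [j for j in control if j not in used]
          if not candidates:
              break
          j = min(candidates, key=lambda j: abs(ps[i] - ps[j]))
          pairs.append((i, j))
          used.add(j)
      print(f"matched {len(pairs)} CBT takers to PPT takers")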

  11. Demystifying the GMAT: Computer-Based Testing Terms

    ERIC Educational Resources Information Center

    Rudner, Lawrence M.

    2012-01-01

    Computer-based testing can be a powerful means to make all aspects of test administration not only faster and more efficient, but also more accurate and more secure. While the Graduate Management Admission Test (GMAT) exam is a computer adaptive test, there are other approaches. This installment presents a primer of computer-based testing terms.

  12. Comparisons of node-based and element-based approaches of assigning bone material properties onto subject-specific finite element models.

    PubMed

    Chen, G; Wu, F Y; Liu, Z C; Yang, K; Cui, F

    2015-08-01

    Subject-specific finite element (FE) models can be generated from computed tomography (CT) datasets of a bone. A key step is assigning material properties automatically onto finite element models, which remains a great challenge. This paper proposes a node-based assignment approach and also compares it with the element-based approach in the literature. Both approaches were implemented using ABAQUS. The assignment procedure is divided into two steps: generating the data file of the image intensity of a bone in a MATLAB program and reading the data file into ABAQUS via user subroutines. The node-based approach assigns the material properties to each node of the finite element mesh, while the element-based approach assigns the material properties directly to each integration point of an element. Both approaches are independent from the type of elements. A number of FE meshes are tested and both give accurate solutions; comparatively the node-based approach involves less programming effort. The node-based approach is also independent from the type of analyses; it has been tested on the nonlinear analysis of a Sawbone femur. The node-based approach substantially improves the level of automation of the assignment procedure of bone material properties. It is the simplest and most powerful approach that is applicable to many types of analyses and elements. Copyright © 2015 IPEM. Published by Elsevier Ltd. All rights reserved.
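
    The node-based step itself is easy to picture: sample the CT intensity at each node location, then map intensity to density and density to modulus. The sketch below uses placeholder calibration constants (a Morgan-type power law) and a random volume; the ABAQUS user-subroutine side is omitted:

      import numpy as np

      ct = np.random.default_rng(3).uniform(200, 1400, size=(64, 64, 64))  # HU volume
      voxel = 0.5                                           # voxel size (mm)
      nodes = np.array([[10.2, 8.7, 30.1],                  # node coordinates (mm)
                        [12.9, 9.3, 28.4]])

      def modulus_at(p):
          i, j, k = (p / voxel).astype(int)                 # nearest-voxel lookup
          rho = 0.0008 * ct[i, j, k] + 0.1                  # HU -> density (g/cm^3), placeholder
          return 6850.0 * rho ** 1.49                       # density -> E (MPa), Morgan-type law

      E_nodes = [modulus_at(p) for p in nodes]
      print(E_nodes)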

  13. A New Approach in Force-Limited Vibration Testing of Flight Hardware

    NASA Technical Reports Server (NTRS)

    Kolaini, Ali R.; Kern, Dennis L.

    2012-01-01

    The force-limited vibration test approaches discussed in NASA-7004C were developed to reduce overtesting associated with base shake vibration tests of aerospace hardware where the interface responses are excited coherently. This handbook outlines several different methods of specifying the force limits. The rationale for force limiting is based on the disparity between the impedances of typical aerospace mounting structures and the large impedances of vibration test shakers when the interfaces in general are coherently excited. Among these approaches, the semi-empirical method is presently the most widely used method to derive the force limits. The inclusion of the incoherent excitation of the aerospace structures at mounting interfaces has not been accounted for in the past and provides the basis for more realistic force limits for qualifying the hardware using shaker testing. In this paper current methods for defining the force limiting specifications discussed in the NASA handbook are reviewed using data from a series of acoustic and vibration tests. A new approach based on considering the incoherent excitation of the structural mounting interfaces using acoustic test data is also discussed. It is believed that the new approach provides much more realistic force limits that may further remove conservatism inherent in shaker vibration testing not accounted for by methods discussed in the NASA handbook. A discussion on using FEM/BEM analysis to obtain realistic force limits for flight hardware is provided.
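
    For orientation, the semi-empirical method mentioned above sets the force spectral density from the acceleration specification, S_FF = C^2 M0^2 S_AA, below a break frequency and rolls it off above it. A sketch with placeholder values (C, the roll-off exponent, and the spec are illustrative, not from the paper):

      import numpy as np

      C, M0, f0, n = 1.4, 25.0, 80.0, 2.0   # semi-empirical constant, mass (kg),
                                            # break frequency (Hz), roll-off exponent
      freqs = np.array([20.0, 50.0, 80.0, 160.0, 320.0])
      S_aa = np.full_like(freqs, 0.04)      # acceleration spec (g^2/Hz)

      S_ff = C**2 * M0**2 * S_aa            # force spec, units consistent with M0 and S_aa
      S_ff = np.where(freqs > f0, S_ff * (f0 / freqs) ** n, S_ff)
      print(S_ff)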

  14. A Path to an Instructional Science: Data-Generated vs. Postulated Models

    ERIC Educational Resources Information Center

    Gropper, George L.

    2016-01-01

    Psychological testing can serve as a prototype on which to base a data-generated approach to instructional design. In "testing batteries" tests are used to predict achievement. In the proposed approach batteries of prescriptions would be used to produce achievement. In creating "test batteries" tests are selected for their…

  15. LESSONS FROM A RETROSPECTIVE ANALYSIS OF A 5-YR PERIOD OF PRESHIPMENT TESTING AT SAN DIEGO ZOO: A RISK-BASED APPROACH TO PRESHIPMENT TESTING MAY BENEFIT ANIMAL WELFARE.

    PubMed

    Marinkovich, Matt; Wallace, Chelsea; Morris, Pat J; Rideout, Bruce; Pye, Geoffrey W

    2016-03-01

    The preshipment examination, with associated transmissible disease testing, has become standard practice in the movement of animals between zoos. An alternative disease risk-based approach, based on a comprehensive surveillance program including necropsy and preventive medicine examination testing and data, has been in practice since 2006 between the San Diego Zoo and San Diego Zoo Safari Park. A retrospective analysis, evaluating comprehensive necropsy data and preshipment testing over a 5-yr study period, was performed to determine the viability of this model for use with sending animals to other institutions. Animals (607 birds, 704 reptiles and amphibians, and 341 mammals) were shipped to 116 Association of Zoos and Aquariums (AZA)-accredited and 29 non-AZA-accredited institutions. The evaluation showed no evidence of the specific transmissible diseases tested for during the preshipment exam being present within the San Diego Zoo collection. We suggest that a risk-based animal and institution-specific approach to transmissible disease preshipment testing is more cost effective and is in the better interest of animal welfare than the current industry standard of dogmatic preshipment testing.

  16. Achievement Goals, Study Strategies, and Achievement: A Test of the "Learning Agenda" Framework

    ERIC Educational Resources Information Center

    Senko, Corwin; Hama, Hidetoshi; Belmonte, Kimberly

    2013-01-01

    Two classroom studies tested whether mastery-approach goals and performance-approach goals nudge students to pursue different learning agendas. Each showed that mastery-approach goals promote an interest-based studying approach in which students allocate study time disproportionately to personally interesting material over duller material. Study 2…

  17. Black-Box System Testing of Real-Time Embedded Systems Using Random and Search-Based Testing

    NASA Astrophysics Data System (ADS)

    Arcuri, Andrea; Iqbal, Muhammad Zohaib; Briand, Lionel

    Testing real-time embedded systems (RTES) is in many ways challenging. Thousands of test cases can be potentially executed on an industrial RTES. Given the magnitude of testing at the system level, only a fully automated approach can really scale up to test industrial RTES. In this paper we take a black-box approach and model the RTES environment using the UML/MARTE international standard. Our main motivation is to provide a more practical approach to the model-based testing of RTES by allowing system testers, who are often not familiar with the system design but know the application domain well enough, to model the environment to enable test automation. Environment models can support the automation of three tasks: the code generation of an environment simulator, the selection of test cases, and the evaluation of their expected results (oracles). In this paper, we focus on the second task (test case selection) and investigate three test automation strategies using inputs from UML/MARTE environment models: Random Testing (baseline), Adaptive Random Testing, and Search-Based Testing (using Genetic Algorithms). Based on one industrial case study and three artificial systems, we show how, in general, no technique is better than the others. Which test selection technique to use is determined by the failure rate (testing stage) and the execution time of test cases. Finally, we propose a practical process to combine the use of all three test strategies.
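
    Of the three strategies, Adaptive Random Testing is the simplest to sketch: keep a fixed-size candidate set and pick the candidate farthest from everything already executed. The numeric input domain below stands in for the real RTES environment encoding:

      import random

      def art_select(executed, k=10, dim=2, rng=random):
          """Among k random candidates, return the one maximizing the
          minimum Euclidean distance to previously executed inputs."""
          candidates = [[rng.uniform(0, 1) for _ in range(dim)] for _ in range(k)]
          if not executed:
              return candidates[0]
          def min_dist(c):
              return min(sum((a - b) ** 2 for a, b in zip(c, e)) for e in executed)
          return max(candidates, key=min_dist)

      executed = []
      for _ in range(20):
          test_input = art_select(executed)
          executed.append(test_input)  # here one would run the test on the RTES
      print(executed[:3])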

  18. Large-Scale Multiobjective Static Test Generation for Web-Based Testing with Integer Programming

    ERIC Educational Resources Information Center

    Nguyen, M. L.; Hui, Siu Cheung; Fong, A. C. M.

    2013-01-01

    Web-based testing has become a ubiquitous self-assessment method for online learning. One useful feature that is missing from today's web-based testing systems is the reliable capability to fulfill different assessment requirements of students based on a large-scale question data set. A promising approach for supporting large-scale web-based…

  19. Dilatancy Criteria for Salt Cavern Design: A Comparison Between Stress- and Strain-Based Approaches

    NASA Astrophysics Data System (ADS)

    Labaune, P.; Rouabhi, A.; Tijani, M.; Blanco-Martín, L.; You, T.

    2018-02-01

    This paper presents a new approach for salt cavern design, based on the use of the onset of dilatancy as a design threshold. In the proposed approach, a rheological model that includes dilatancy at the constitutive level is developed, and a strain-based dilatancy criterion is defined. As compared to classical design methods, which consist of simulating cavern behavior through creep laws (fitted on long-term tests) and then using a criterion (derived from short-term tests or experience) to determine the stability of the excavation, the proposed approach is consistent with both short- and long-term conditions. The new strain-based dilatancy criterion is compared to a stress-based dilatancy criterion through numerical simulations of salt caverns under cyclic loading conditions. The dilatancy zones predicted by the strain-based criterion are larger than the ones predicted by the stress-based criterion, which is conservative yet constructive for design purposes.
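
    A stress-based criterion of the kind compared here typically has the form sqrt(J2) >= a * I1. The check below uses a = 0.27, one value published for rock salt, purely as a placeholder; the paper's strain-based alternative instead thresholds an accumulated dilatant strain from the constitutive model:

      import numpy as np

      def dilatant(sigma, a=0.27):
          """sigma: 3x3 stress tensor, compression positive."""
          I1 = np.trace(sigma)
          s = sigma - I1 / 3.0 * np.eye(3)    # deviatoric stress
          J2 = 0.5 * np.tensordot(s, s)       # second deviatoric invariant
          return np.sqrt(J2) >= a * I1

      sigma = np.diag([12.0, 10.0, 4.0])      # principal stresses (MPa), illustrative
      print(dilatant(sigma))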

  20. Targeting Fear of Spiders with Control-, Acceptance-, and Information-Based Approaches

    ERIC Educational Resources Information Center

    Wagener, Alexandra L.; Zettle, Robert D.

    2011-01-01

    The relative impact of control-, acceptance-, and information-based approaches in targeting a midlevel fear of spiders among college students was evaluated. Participants listened to a brief protocol presenting one of the three approaches before completing the Perceived-Threat Behavioral Approach Test (PT-BAT; Cochrane, Barnes-Holmes, &…

  1. Selecting measures to prevent deleterious alkali-silica reaction in concrete : rationale for the AASHTO PP65 prescriptive approach.

    DOT National Transportation Integrated Search

    2012-10-01

    PP65-11 provides two approaches for selecting preventive measures: (i) a performance approach based on laboratory testing, and (ii) a prescriptive approach based on a consideration of the reactivity of the aggregate, type and size of structure, expos...

  2. Flight tests of IFR landing approach systems for helicopters

    NASA Technical Reports Server (NTRS)

    Bull, J. S.; Hegarty, D. M.; Peach, L. L.; Phillips, J. D.; Anderson, D. J.; Dugan, D. C.; Ross, V. L.

    1981-01-01

    Joint NASA/FAA helicopter flight tests were conducted to investigate airborne radar approaches (ARA) and microwave landing system (MLS) approaches. Flight-test results were utilized to provide NASA with a database to be used as a performance measure for advanced guidance and navigation concepts, and to provide the FAA with data for establishment of TERPS criteria. The first flight-test investigation consisted of helicopter IFR approaches to offshore oil rigs in the Gulf of Mexico, using weather/mapping radar, operational pilots, and a Bell 212 helicopter. The second flight-test investigation consisted of IFR MLS approaches at Crows Landing (near Ames Research Center), with a Bell UH-1H helicopter, using NASA, FAA, and operational industry pilots. Tests are described and results discussed.

  3. Pathway-based predictive approaches for non-animal assessment of acute inhalation toxicity.

    PubMed

    Clippinger, Amy J; Allen, David; Behrsing, Holger; BéruBé, Kelly A; Bolger, Michael B; Casey, Warren; DeLorme, Michael; Gaça, Marianna; Gehen, Sean C; Glover, Kyle; Hayden, Patrick; Hinderliter, Paul; Hotchkiss, Jon A; Iskandar, Anita; Keyser, Brian; Luettich, Karsta; Ma-Hock, Lan; Maione, Anna G; Makena, Patrudu; Melbourne, Jodie; Milchak, Lawrence; Ng, Sheung P; Paini, Alicia; Page, Kathryn; Patlewicz, Grace; Prieto, Pilar; Raabe, Hans; Reinke, Emily N; Roper, Clive; Rose, Jane; Sharma, Monita; Spoo, Wayne; Thorne, Peter S; Wilson, Daniel M; Jarabek, Annie M

    2018-06-20

    New approaches are needed to assess the effects of inhaled substances on human health. These approaches will be based on mechanisms of toxicity, an understanding of dosimetry, and the use of in silico modeling and in vitro test methods. In order to accelerate wider implementation of such approaches, development of adverse outcome pathways (AOPs) can help identify and address gaps in our understanding of relevant parameters for model input and mechanisms, and optimize non-animal approaches that can be used to investigate key events of toxicity. This paper describes the AOPs and the toolbox of in vitro and in silico models that can be used to assess the key events leading to toxicity following inhalation exposure. Because the optimal testing strategy will vary depending on the substance of interest, here we present a decision tree approach to identify an appropriate non-animal integrated testing strategy that incorporates consideration of a substance's physicochemical properties, relevant mechanisms of toxicity, and available in silico models and in vitro test methods. This decision tree can facilitate standardization of the testing approaches. Case study examples are presented to provide a basis for proof-of-concept testing to illustrate the utility of non-animal approaches to inform hazard identification and risk assessment of humans exposed to inhaled substances. Copyright © 2018 The Author(s). Published by Elsevier Ltd. All rights reserved.

  4. Multilocus Association Mapping Using Variable-Length Markov Chains

    PubMed Central

    Browning, Sharon R.

    2006-01-01

    I propose a new method for association-based gene mapping that makes powerful use of multilocus data, is computationally efficient, and is straightforward to apply over large genomic regions. The approach is based on the fitting of variable-length Markov chain models, which automatically adapt to the degree of linkage disequilibrium (LD) between markers to create a parsimonious model for the LD structure. Edges of the fitted graph are tested for association with trait status. This approach can be thought of as haplotype testing with sophisticated windowing that accounts for extent of LD to reduce degrees of freedom and number of tests while maximizing information. I present analyses of two published data sets that show that this approach can have better power than single-marker tests or sliding-window haplotypic tests. PMID:16685642

  5. Multilocus association mapping using variable-length Markov chains.

    PubMed

    Browning, Sharon R

    2006-06-01

    I propose a new method for association-based gene mapping that makes powerful use of multilocus data, is computationally efficient, and is straightforward to apply over large genomic regions. The approach is based on the fitting of variable-length Markov chain models, which automatically adapt to the degree of linkage disequilibrium (LD) between markers to create a parsimonious model for the LD structure. Edges of the fitted graph are tested for association with trait status. This approach can be thought of as haplotype testing with sophisticated windowing that accounts for extent of LD to reduce degrees of freedom and number of tests while maximizing information. I present analyses of two published data sets that show that this approach can have better power than single-marker tests or sliding-window haplotypic tests.

  6. A shift from significance test to hypothesis test through power analysis in medical research.

    PubMed

    Singh, G

    2006-01-01

    Until recently, the medical research literature exhibited substantial dominance of Fisher's significance test approach to statistical inference, which concentrates on the probability of type I error, over the Neyman-Pearson hypothesis test, which considers the probabilities of both type I and type II errors. Fisher's approach dichotomises results into significant or nonsignificant on the basis of a P value. The Neyman-Pearson approach talks of acceptance or rejection of the null hypothesis. Based on the same theory, these two approaches address the same objective and conclude in their own ways. The advancement in computing techniques and the availability of statistical software have resulted in increasing application of power calculations in medical research, and thereby in reporting the results of significance tests in the light of the power of the test as well. The significance test approach, when it incorporates power analysis, contains the essence of the hypothesis test approach. It may be safely argued that the rising application of power analysis in medical research may have initiated a shift from Fisher's significance test to the Neyman-Pearson hypothesis test procedure.
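
    A worked example of the power calculation that bridges the two schools: for a two-sample comparison of means under the normal approximation, power = Phi(d * sqrt(n/2) - z_{1-alpha/2}), where d is the standardized effect size. The values below are illustrative:

      import math
      from statistics import NormalDist

      def power_two_sample(d, n_per_group, alpha=0.05):
          """Power of a two-sided two-sample z-test for standardized effect d."""
          z_crit = NormalDist().inv_cdf(1 - alpha / 2)
          return NormalDist().cdf(d * math.sqrt(n_per_group / 2) - z_crit)

      print(power_two_sample(d=0.5, n_per_group=64))  # about 0.80 for a medium effect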

  7. A four stage approach for ontology-based health information system design.

    PubMed

    Kuziemsky, Craig E; Lau, Francis

    2010-11-01

    To describe and illustrate a four stage methodological approach to capture user knowledge in a biomedical domain area, use that knowledge to design an ontology, and then implement and evaluate the ontology as a health information system (HIS). A hybrid participatory design-grounded theory (GT-PD) method was used to obtain data and code them for ontology development. Prototyping was used to implement the ontology as a computer-based tool. Usability testing evaluated the computer-based tool. An empirically derived domain ontology and set of three problem-solving approaches were developed as a formalized model of the concepts and categories from the GT coding. The ontology and problem-solving approaches were used to design and implement a HIS that tested favorably in usability testing. The four stage approach illustrated in this paper is useful for designing and implementing an ontology as the basis for a HIS. The approach extends existing ontology development methodologies by providing an empirical basis for theory incorporated into ontology design. Copyright © 2010 Elsevier B.V. All rights reserved.

  8. Resampling-based Methods in Single and Multiple Testing for Equality of Covariance/Correlation Matrices

    PubMed Central

    Yang, Yang; DeGruttola, Victor

    2016-01-01

    Traditional resampling-based tests for homogeneity in covariance matrices across multiple groups resample residuals, that is, data centered by group means. These residuals do not share the same second moments when the null hypothesis is false, which makes them difficult to use in the setting of multiple testing. An alternative approach is to resample standardized residuals, data centered by group sample means and standardized by group sample covariance matrices. This approach, however, has been observed to inflate type I error when sample size is small or data are generated from heavy-tailed distributions. We propose to improve this approach by using robust estimation for the first and second moments. We discuss two statistics: the Bartlett statistic and a statistic based on eigen-decomposition of sample covariance matrices. Both statistics can be expressed in terms of standardized errors under the null hypothesis. These methods are extended to test homogeneity in correlation matrices. Using simulation studies, we demonstrate that the robust resampling approach provides comparable or superior performance, relative to traditional approaches, for single testing and reasonable performance for multiple testing. The proposed methods are applied to data collected in an HIV vaccine trial to investigate possible determinants, including vaccine status, vaccine-induced immune response level and viral genotype, of unusual correlation pattern between HIV viral load and CD4 count in newly infected patients. PMID:22740584
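
    A minimal sketch of the standardized-residual resampling idea is given below, using a Box's M/Bartlett-type log-determinant statistic. For brevity it uses ordinary sample moments; the authors' refinement substitutes robust estimates of the first and second moments:

      # Sketch of resampling standardized residuals to test homogeneity of
      # covariance matrices with a Box's M / Bartlett-type statistic. Ordinary
      # sample moments are used here; the paper's refinement replaces them
      # with robust estimates of the first and second moments.
      import numpy as np

      def bartlett_stat(groups):
          # log-determinant form: (N - k) ln|S_pooled| - sum (n_i - 1) ln|S_i|
          k = len(groups)
          ns = np.array([len(g) for g in groups])
          covs = [np.cov(g, rowvar=False) for g in groups]
          pooled = sum((n - 1) * c for n, c in zip(ns, covs)) / (ns.sum() - k)
          return sum((n - 1) * (np.linalg.slogdet(pooled)[1] -
                                np.linalg.slogdet(c)[1]) for n, c in zip(ns, covs))

      def resampling_pvalue(groups, n_boot=999, seed=0):
          rng = np.random.default_rng(seed)
          # standardize each group: center by its mean, whiten by its covariance
          std_resid = []
          for g in groups:
              w = np.linalg.cholesky(np.linalg.inv(np.cov(g, rowvar=False)))
              std_resid.append((g - g.mean(axis=0)) @ w)
          pool = np.vstack(std_resid)
          ns = [len(g) for g in groups]
          observed = bartlett_stat(groups)
          exceed = 0
          for _ in range(n_boot):
              draw = pool[rng.integers(0, len(pool), size=len(pool))]
              exceed += bartlett_stat(np.split(draw, np.cumsum(ns)[:-1])) >= observed
          return (exceed + 1) / (n_boot + 1)

      rng = np.random.default_rng(1)
      g1, g2 = rng.normal(size=(40, 3)), rng.normal(size=(60, 3)) * 1.5
      print(resampling_pvalue([g1, g2]))   # small p: the covariances differ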

  10. Generating Test Templates via Automated Theorem Proving

    NASA Technical Reports Server (NTRS)

    Kancherla, Mani Prasad

    1997-01-01

    Testing can be used during the software development process to maintain fidelity between evolving specifications, program designs, and code implementations. We use a form of specification-based testing that employs an automated theorem prover to generate test templates. A similar approach was developed using a model checker on state-intensive systems. This method applies to systems with functional rather than state-based behaviors. This approach allows for the use of incomplete specifications to aid in generation of tests for potential failure cases. We illustrate the technique on the canonical triangle testing problem and discuss its use on analysis of a spacecraft scheduling system.
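
    The paper's prover-based generation can be loosely imitated with an off-the-shelf constraint solver. The sketch below uses the Z3 SMT solver (a stand-in for illustration, not the authors' tool) to produce witness test templates for each branch of the canonical triangle specification:

      # Illustration of specification-based test generation with a solver
      # standing in for the theorem prover: each clause of the triangle
      # specification is asserted and Z3 is asked for a witness, which
      # becomes a concrete test template.
      from z3 import Ints, Solver, And, Or, sat

      a, b, c = Ints("a b c")
      valid = And(a > 0, b > 0, c > 0,
                  a + b > c, b + c > a, a + c > b)

      cases = {
          "equilateral": And(valid, a == b, b == c),
          "isosceles":   And(valid, Or(a == b, b == c, a == c),
                             Or(a != b, b != c)),
          "scalene":     And(valid, a != b, b != c, a != c),
          "not_triangle": And(a > 0, b > 0, c > 0, a + b <= c),  # failure case
      }

      for name, spec in cases.items():
          s = Solver()
          s.add(spec)
          if s.check() == sat:
              m = s.model()
              print(name, "->", m[a], m[b], m[c])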

  11. Approach to evaluation and management of a patient with multiple food allergies.

    PubMed

    Bird, J Andrew

    2016-01-01

    Diagnosing food allergy is often challenging, and validated testing modalities are mostly limited to immunoglobulin E (IgE)-mediated reactions to foods. Use of food-specific IgE tests and skin prick tests in individuals without a history that supports an IgE-mediated reaction to the specific food being tested diminishes the predictive capabilities of the test. The objective is to review the literature regarding evaluation of patients with a concern for multiple food allergies and to demonstrate an evidence-based approach to diagnosis and management. A literature search was performed, and articles were identified as relevant based on the search terms "food allergy," "food allergy diagnosis," "skin prick test," "serum IgE test," "oral food challenge," and "food allergy management." Patients at risk of food allergy are often misdiagnosed, and appropriate evaluation of patients with concern for food allergy includes taking a thorough diet history and reaction history, performing specific tests intentionally and when indicated, and, when test results are inconclusive, conducting an oral food challenge in a safe environment with an experienced provider. An evidence-based approach to diagnosing and managing a patient at risk of having a life-threatening food allergy is reviewed.

  12. A Comparison of Seventh Grade Thai Students' Reading Comprehension and Motivation to Read English through Applied Instruction Based on the Genre-Based Approach and the Teacher's Manual

    ERIC Educational Resources Information Center

    Sawangsamutchai, Yutthasak; Rattanavich, Saowalak

    2016-01-01

    The objective of this research is to compare the English reading comprehension and motivation to read of seventh grade Thai students taught with applied instruction through the genre-based approach and teachers' manual. A randomized pre-test post-test control group design was used through the cluster random sampling technique. The data were…

  13. A Comparison of Computer-Based Classification Testing Approaches Using Mixed-Format Tests with the Generalized Partial Credit Model

    ERIC Educational Resources Information Center

    Kim, Jiseon

    2010-01-01

    Classification testing has been widely used to make categorical decisions by determining whether an examinee has a certain degree of ability required by established standards. As computer technologies have developed, classification testing has become more computerized. Several approaches have been proposed and investigated in the context of…

  14. Revisiting a Cognitive Framework for Test Design: Applications for a Computerized Perceptual Speed Test.

    ERIC Educational Resources Information Center

    Alderton, David L.

    This paper highlights the need for a systematic, content aware, and theoretically-based approach to test design. The cognitive components approach is endorsed, and is applied to the development of a computerized perceptual speed test. Psychometric literature is reviewed and shows that: every major multi-factor theory includes a clerical/perceptual…

  15. EXSPRT: An Expert Systems Approach to Computer-Based Adaptive Testing.

    ERIC Educational Resources Information Center

    Frick, Theodore W.; And Others

    Expert systems can be used to aid decision making. A computerized adaptive test (CAT) is one kind of expert system, although it is not commonly recognized as such. A new approach, termed EXSPRT, was devised that combines expert systems reasoning and sequential probability ratio test stopping rules. EXSPRT-R uses random selection of test items,…

  16. Review of Pearson Test of English Academic: Building an Assessment Use Argument

    ERIC Educational Resources Information Center

    Wang, Huan; Choi, Ikkyu; Schmidgall, Jonathan; Bachman, Lyle F.

    2012-01-01

    This review departs from current practice in reviewing tests in that it employs an "argument-based approach" to test validation to guide the review (e.g. Bachman, 2005; Kane, 2006; Mislevy, Steinberg, & Almond, 2002). Specifically, it follows an approach to test development and use that Bachman and Palmer (2010) call the process of "assessment…

  17. Evidence-based toxicology for the 21st century: opportunities and challenges.

    PubMed

    Stephens, Martin L; Andersen, Melvin; Becker, Richard A; Betts, Kellyn; Boekelheide, Kim; Carney, Ed; Chapin, Robert; Devlin, Dennis; Fitzpatrick, Suzanne; Fowle, John R; Harlow, Patricia; Hartung, Thomas; Hoffmann, Sebastian; Holsapple, Michael; Jacobs, Abigail; Judson, Richard; Naidenko, Olga; Pastoor, Tim; Patlewicz, Grace; Rowan, Andrew; Scherer, Roberta; Shaikh, Rashid; Simon, Ted; Wolf, Douglas; Zurlo, Joanne

    2013-01-01

    The Evidence-based Toxicology Collaboration (EBTC) was established recently to translate evidence-based approaches from medicine and health care to toxicology in an organized and sustained effort. The EBTC held a workshop on "Evidence-based Toxicology for the 21st Century: Opportunities and Challenges" in Research Triangle Park, North Carolina, USA on January 24-25, 2012. The presentations largely reflected two EBTC priorities: to apply evidence-based methods to assessing the performance of emerging pathway-based testing methods consistent with the 2007 National Research Council report on "Toxicity Testing in the 21st Century" as well as to adopt a governance structure and work processes to move that effort forward. The workshop served to clarify evidence-based approaches and to provide food for thought on substantive and administrative activities for the EBTC. Priority activities include conducting pilot studies to demonstrate the value of evidence-based approaches to toxicology, as well as conducting educational outreach on these approaches.

  18. Histogram equalization with Bayesian estimation for noise robust speech recognition.

    PubMed

    Suh, Youngjoo; Kim, Hoirin

    2018-02-01

    The histogram equalization approach is an efficient feature normalization technique for noise-robust automatic speech recognition. However, it suffers from performance degradation when some fundamental conditions are not satisfied in the test environment. To remedy these limitations of the original histogram equalization methods, a class-based histogram equalization approach has been proposed. Although this approach showed substantial performance improvement under noise environments, it still suffers from performance degradation due to overfitting when test data are insufficient. To address this issue, the proposed histogram equalization technique employs a Bayesian estimation method in estimating the test cumulative distribution function. A previous study on the Aurora-4 task reported that the proposed approach provided substantial performance gains in speech recognition systems based on Gaussian mixture model-hidden Markov model acoustic modeling. In this work, the proposed approach was examined in speech recognition systems based on the deep neural network-hidden Markov model (DNN-HMM), the current mainstream speech recognition approach, where it also showed meaningful performance improvement over the conventional maximum likelihood estimation-based method. The fusion of the proposed features with the mel-frequency cepstral coefficients provided additional performance gains in DNN-HMM systems, which otherwise suffer from performance degradation in the clean test condition.
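
    The core histogram-equalization step is straightforward to sketch: map each test feature through its empirical CDF and then through the inverse of a reference (training) CDF. The snippet below shows this baseline only; the paper's Bayesian smoothing of the test CDF, which is its actual contribution, is omitted:

      # Baseline histogram-equalization step for feature normalization: map
      # each test value through its empirical CDF, then through the inverse
      # of a reference (training) CDF. The paper's Bayesian estimate of the
      # test CDF, which combats overfitting on short utterances, is omitted.
      import numpy as np

      def histogram_equalize(test_feat, ref_feat):
          # test_feat, ref_feat: 1-D arrays for one feature dimension
          ranks = np.argsort(np.argsort(test_feat))
          cdf = (ranks + 0.5) / len(test_feat)          # empirical test CDF
          ref_sorted = np.sort(ref_feat)
          quantiles = (np.arange(len(ref_sorted)) + 0.5) / len(ref_sorted)
          return np.interp(cdf, quantiles, ref_sorted)  # inverse reference CDF

      rng = np.random.default_rng(0)
      clean = rng.normal(0, 1, 5000)        # training distribution
      noisy = rng.normal(0.8, 1.7, 300)     # shifted/scaled test features
      eq = histogram_equalize(noisy, clean)
      print(eq.mean(), eq.std())            # roughly 0 and 1 after mapping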

  19. Evidence-Based Toxicology.

    PubMed

    Hoffmann, Sebastian; Hartung, Thomas; Stephens, Martin

    Evidence-based toxicology (EBT) was introduced independently by two groups in 2005, in the context of toxicological risk assessment and causation, as well as on the basis of parallels between the evaluation of test methods in toxicology and the evidence-based assessment of diagnostic tests in medicine. The role model of evidence-based medicine (EBM) motivated both proposals and has guided the evolution of EBT, with systematic reviews and evidence quality assessment in particular attracting considerable attention in toxicology. Regarding test assessment, in the search for solutions to various problems related to validation, such as the imperfection of the reference standard or the challenge of evaluating tests comprehensively, the field of Diagnostic Test Assessment (DTA) was identified as a potential resource. DTA being an EBM discipline, test method assessment/validation therefore became one of the main drivers spurring the development of EBT. In the context of pathway-based toxicology, EBT approaches, given their objectivity, transparency and consistency, have been proposed for carrying out a (retrospective) mechanistic validation. In summary, implementation of more evidence-based approaches may provide the tools necessary to adapt the assessment/validation of toxicological test methods and testing strategies to face the challenges of toxicology in the twenty-first century.

  20. Whole-machine calibration approach for phased array radar with self-test

    NASA Astrophysics Data System (ADS)

    Shen, Kai; Yao, Zhi-Cheng; Zhang, Jin-Chang; Yang, Jian

    2017-06-01

    The performance of a missile-borne phased array radar is greatly influenced by inter-channel amplitude and phase inconsistencies, so to ensure its performance the amplitude and phase characteristics of the radar should be calibrated. Commonly used methods, such as FFT and REV, focus mainly on antenna calibration. However, the radar channel also contains the T/R components, channels, ADCs, and the interconnecting signal path. To achieve rapid whole-machine calibration and compensation of the amplitude and phase of the phased array radar, we adopt a high-precision planar scanning test platform for amplitude and phase measurement. A calibration approach for the whole channel system based on the radar frequency source test is proposed. Finally, the advantages and the application prospects of this approach are analysed.

  1. A risk-based classification scheme for genetically modified foods. II: Graded testing.

    PubMed

    Chao, Eunice; Krewski, Daniel

    2008-12-01

    This paper presents a graded approach to the testing of crop-derived genetically modified (GM) foods based on concern levels in a proposed risk-based classification scheme (RBCS) and currently available testing methods. A graded approach offers the potential for more efficient use of testing resources by focusing less on lower concern GM foods, and more on higher concern foods. In this proposed approach to graded testing, products that are classified as Level I would have met baseline testing requirements that are comparable to what is widely applied to premarket assessment of GM foods at present. In most cases, Level I products would require no further testing, or very limited confirmatory analyses. For products classified as Level II or higher, additional testing would be required, depending on the type of the substance, prior dietary history, estimated exposure level, prior knowledge of toxicity of the substance, and the nature of the concern related to unintended changes in the modified food. Level III testing applies only to the assessment of toxic and antinutritional effects from intended changes and is tailored to the nature of the substance in question. Since appropriate test methods are not currently available for all effects of concern, future research to strengthen the testing of GM foods is discussed.

  2. Automatic Generation of Tests from Domain and Multimedia Ontologies

    ERIC Educational Resources Information Center

    Papasalouros, Andreas; Kotis, Konstantinos; Kanaris, Konstantinos

    2011-01-01

    The aim of this article is to present an approach for generating tests in an automatic way. Although other methods have been already reported in the literature, the proposed approach is based on ontologies, representing both domain and multimedia knowledge. The article also reports on a prototype implementation of this approach, which…

  3. Gearbox Tooth Cut Fault Diagnostics Using Acoustic Emission and Vibration Sensors — A Comparative Study

    PubMed Central

    Qu, Yongzhi; He, David; Yoon, Jae; Van Hecke, Brandon; Bechhoefer, Eric; Zhu, Junda

    2014-01-01

    In recent years, acoustic emission (AE) sensors and AE-based techniques have been developed and tested for gearbox fault diagnosis. In general, AE-based techniques require much higher sampling rates than vibration analysis-based techniques for gearbox fault diagnosis. Therefore, it is questionable whether an AE-based technique would give a better or at least the same performance as the vibration analysis-based techniques using the same sampling rate. To answer the question, this paper presents a comparative study for gearbox tooth damage level diagnostics using AE and vibration measurements, the first known attempt to compare the gearbox fault diagnostic performance of AE- and vibration analysis-based approaches using the same sampling rate. Partial tooth cut faults are seeded in a gearbox test rig and experimentally tested in a laboratory. Results have shown that the AE-based approach has the potential to differentiate gear tooth damage levels in comparison with the vibration-based approach. While vibration signals are easily affected by mechanical resonance, the AE signals show more stable performance. PMID:24424467

  4. An Approach to Model Based Testing of Multiagent Systems

    PubMed Central

    Nadeem, Aamer

    2015-01-01

    Autonomous agents perform on behalf of the user to achieve defined goals or objectives. They are situated in a dynamic environment and are able to operate autonomously to achieve their goals. In a multiagent system, agents cooperate with each other to achieve a common goal. Testing of multiagent systems is a challenging task due to the autonomous and proactive behavior of agents. However, testing is required to build confidence in the working of a multiagent system. The Prometheus methodology is a commonly used approach to designing multiagent systems, and systematic and thorough testing of each interaction is necessary. This paper proposes a novel approach to testing of multiagent systems based on Prometheus design artifacts. In the proposed approach, different interactions between the agent and actors are considered to test the multiagent system. These interactions include percepts and actions, along with messages between the agents, which can be modeled in a protocol diagram. The protocol diagram is converted into a protocol graph, on which different coverage criteria are applied to generate test paths that cover interactions between the agents. A prototype tool has been developed to generate test paths from the protocol graph according to the specified coverage criterion. PMID:25874263
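
    A simplified version of test-path generation under an all-edges coverage criterion can be sketched as a greedy walk over the protocol graph; the graph below is an invented example, not one of the paper's Prometheus artifacts:

      # Sketch of test-path generation over a protocol graph under an
      # all-edges coverage criterion: greedily walk from the start node,
      # preferring uncovered edges, until every interaction edge is covered.
      def edge_coverage_paths(graph, start):
          uncovered = {(u, v) for u, nbrs in graph.items() for v in nbrs}
          paths = []
          while uncovered:
              node, path = start, [start]
              while True:
                  fresh = [v for v in graph.get(node, []) if (node, v) in uncovered]
                  if not fresh:
                      break
                  uncovered.discard((node, fresh[0]))
                  path.append(fresh[0])
                  node = fresh[0]
              if len(path) == 1:
                  # restart from a node that still has uncovered outgoing edges
                  start = next(iter(uncovered))[0]
                  continue
              paths.append(path)
          return paths

      protocol = {                      # percepts/messages/actions as edges
          "start": ["requestQuote"],
          "requestQuote": ["sendQuote"],
          "sendQuote": ["accept", "reject"],
          "accept": ["confirm"],
          "reject": [],
          "confirm": [],
      }
      for p in edge_coverage_paths(protocol, "start"):
          print(" -> ".join(p))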

  5. Population genetic testing for cancer susceptibility: founder mutations to genomes.

    PubMed

    Foulkes, William D; Knoppers, Bartha Maria; Turnbull, Clare

    2016-01-01

    The current standard model for identifying carriers of high-risk mutations in cancer-susceptibility genes (CSGs) generally involves a process that is not amenable to population-based testing: access to genetic tests is typically regulated by health-care providers on the basis of a labour-intensive assessment of an individual's personal and family history of cancer, with face-to-face genetic counselling performed before mutation testing. Several studies have shown that application of these selection criteria results in a substantial proportion of mutation carriers being missed. Population-based genetic testing has been proposed as an alternative approach to determining cancer susceptibility, and aims for a more-comprehensive detection of mutation carriers. Herein, we review the existing data on population-based genetic testing, and consider some of the barriers, pitfalls, and challenges related to the possible expansion of this approach. We consider mechanisms by which population-based genetic testing for cancer susceptibility could be delivered, and suggest how such genetic testing might be integrated into existing and emerging health-care structures. The existing models of genetic testing (including issues relating to informed consent) will very likely require considerable alteration if the potential benefits of population-based genetic testing are to be fully realized.

  6. VERIFYING THE VOC CONTROL PERFORMANCE OF BIOREACTORS

    EPA Science Inventory

    The paper describes the verification testing approach used to collect high-quality, peer-reviewed data on the performance of bioreaction-based technologies for the control of volatile organic compounds (VOCs). The verification protocol that describes the approach for these tests ...

  7. Towards Universal Voluntary HIV Testing and Counselling: A Systematic Review and Meta-Analysis of Community-Based Approaches

    PubMed Central

    Suthar, Amitabh B.; Ford, Nathan; Bachanas, Pamela J.; Wong, Vincent J.; Rajan, Jay S.; Saltzman, Alex K.; Ajose, Olawale; Fakoya, Ade O.; Granich, Reuben M.; Negussie, Eyerusalem K.; Baggaley, Rachel C.

    2013-01-01

    Background Effective national and global HIV responses require a significant expansion of HIV testing and counselling (HTC) to expand access to prevention and care. Facility-based HTC, while essential, is unlikely to meet national and global targets on its own. This article systematically reviews the evidence for community-based HTC. Methods and Findings PubMed was searched on 4 March 2013, clinical trial registries were searched on 3 September 2012, and Embase and the World Health Organization Global Index Medicus were searched on 10 April 2012 for studies including community-based HTC (i.e., HTC outside of health facilities). Randomised controlled trials and observational studies were eligible if they included a community-based testing approach and reported one or more of the following outcomes: uptake, proportion receiving their first HIV test, CD4 value at diagnosis, linkage to care, HIV positivity rate, HTC coverage, HIV incidence, or cost per person tested (outcomes are defined fully in the text). The following community-based HTC approaches were reviewed: (1) door-to-door testing (systematically offering HTC to homes in a catchment area), (2) mobile testing for the general population (offering HTC via a mobile HTC service), (3) index testing (offering HTC to household members of people with HIV and persons who may have been exposed to HIV), (4) mobile testing for men who have sex with men, (5) mobile testing for people who inject drugs, (6) mobile testing for female sex workers, (7) mobile testing for adolescents, (8) self-testing, (9) workplace HTC, (10) church-based HTC, and (11) school-based HTC. The Newcastle-Ottawa Quality Assessment Scale and the Cochrane Collaboration's “risk of bias” tool were used to assess the risk of bias in studies with a comparator arm included in pooled estimates. 117 studies, including 864,651 participants completing HTC, met the inclusion criteria. The percentage of people offered community-based HTC who accepted HTC was as follows: index testing, 88% of 12,052 participants; self-testing, 87% of 1,839 participants; mobile testing, 87% of 79,475 participants; door-to-door testing, 80% of 555,267 participants; workplace testing, 67% of 62,406 participants; and school-based testing, 62% of 2,593 participants. Mobile HTC uptake among key populations (men who have sex with men, people who inject drugs, female sex workers, and adolescents) ranged from 9% to 100% (among 41,110 participants across studies), with heterogeneity related to how testing was offered. Community-based approaches increased HTC uptake (relative risk [RR] 10.65, 95% confidence interval [CI] 6.27–18.08), the proportion of first-time testers (RR 1.23, 95% CI 1.06–1.42), and the proportion of participants with CD4 counts above 350 cells/µl (RR 1.42, 95% CI 1.16–1.74), and obtained a lower positivity rate (RR 0.59, 95% CI 0.37–0.96), relative to facility-based approaches. 80% (95% CI 75%–85%) of 5,832 community-based HTC participants obtained a CD4 measurement following HIV diagnosis, and 73% (95% CI 61%–85%) of 527 community-based HTC participants initiated antiretroviral therapy following a CD4 measurement indicating eligibility. The data on linking participants without HIV to prevention services were limited. In low- and middle-income countries, the cost per person tested ranged from US$2 to US$126. At the population level, community-based HTC increased HTC coverage (RR 7.07, 95% CI 3.52–14.22) and reduced HIV incidence (RR 0.86, 95% CI 0.73–1.02), although the incidence reduction lacked statistical significance. No studies reported any harm arising as a result of having been tested. Conclusions Community-based HTC achieved high rates of HTC uptake, reached people with high CD4 counts, and linked people to care. It also obtained a lower HIV positivity rate relative to facility-based approaches. Further research is needed to improve the acceptability of community-based HTC for key populations. HIV programmes should offer community-based HTC linked to prevention and care, in addition to facility-based HTC, to support increased access to HIV prevention, care, and treatment. Review Registration International Prospective Register of Systematic Reviews CRD42012002554 PMID:23966838
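
    As a reminder of how the pooled effects above are computed at the level of a single study, the following sketch derives a risk ratio and its 95% CI from 2x2 counts; the numbers are illustrative, not taken from the review:

      # Risk ratio and 95% CI from a 2x2 table (events/totals per arm), the
      # form in which effects such as HTC uptake are pooled above. The input
      # numbers below are illustrative, not data from the review.
      import math

      def risk_ratio(events_a, n_a, events_b, n_b):
          rr = (events_a / n_a) / (events_b / n_b)
          se = math.sqrt(1/events_a - 1/n_a + 1/events_b - 1/n_b)  # SE of log RR
          lo = math.exp(math.log(rr) - 1.96 * se)
          hi = math.exp(math.log(rr) + 1.96 * se)
          return rr, lo, hi

      print(risk_ratio(80, 100, 40, 100))   # e.g. 80% vs 40% uptake -> RR 2.0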

  8. Development of an antigen-based rapid diagnostic test for the identification of blowfly (Calliphoridae) species of forensic significance.

    PubMed

    McDonagh, Laura; Thornton, Chris; Wallman, James F; Stevens, Jamie R

    2009-06-01

    In this study we examine the limitations of currently used sequence-based approaches to blowfly (Calliphoridae) identification and evaluate the utility of an immunological approach to discriminate between blowfly species of forensic importance. By investigating antigenic similarity and dissimilarity between the first instar larval stages of four forensically important blowfly species, we have been able to identify immunoreactive proteins of potential use in the development of species-specific immuno-diagnostic tests. Here we outline our protein-based approach to species determination, and describe how it may be adapted to develop rapid diagnostic assays for the 'on-site' identification of blowfly species.

  9. The Effect of Learning Cycle Constructivist-Based Approach on Students' Academic Achievement and Attitude towards Chemistry in Secondary Schools in North-Eastern Part of Nigeria

    ERIC Educational Resources Information Center

    Jack, Gladys Uzezi

    2017-01-01

    This study investigated the effect of learning cycle constructivist-based approach on secondary schools students' academic achievement and their attitude towards chemistry. The design used was a pre-test, post-test non randomized control group quasi experimental research design. The design consisted of two instructional groups (learning cycle…

  10. Multilevel Models for Estimating the Effect of Implementing Argumentation-Based Elementary Science Instruction

    ERIC Educational Resources Information Center

    Shelley, Mack; Gonwa-Reeves, Christopher; Baenziger, Joan; Seefeld, Ashley; Hand, Brian; Therrien, William; Villanueva, Mary Grace; Taylor, Jonte

    2012-01-01

    The purpose of this paper is to examine the impact of implementation of the Science Writing Heuristic (SWH) approach at 5th grade level in the public school system in Iowa as measured by Cornell Critical Thinking student test scores. This is part of a project that overall tests the efficacy of the SWH inquiry-based approach to build students'…

  11. A Risk-Based Approach for Aerothermal/TPS Analysis and Testing

    DTIC Science & Technology

    2007-07-01

    Michael J. Wright and Jay H. Grinstead, NASA Ames Research Center. ... of the thermal protection system (TPS) is to protect the payload (crew, cargo, or science) from this entry heating environment. The performance of the TPS is determined by the efficiency and reliability of this system, typically measured ...

  12. The Effects of the Marriage Enrichment Program Based on the Cognitive-Behavioral Approach on the Marital Adjustment of Couples

    ERIC Educational Resources Information Center

    Kalkan, Melek; Ersanli, Ercumend

    2008-01-01

    The aim of this study is to investigate the effects of the marriage enrichment program based on the cognitive-behavioral approach on levels of marital adjustment of individuals. The experimental and control group of this research was totally composed of 30 individuals. A pre-test post-test research model with control group was used in this…

  13. A reduced order, test verified component mode synthesis approach for system modeling applications

    NASA Astrophysics Data System (ADS)

    Butland, Adam; Avitabile, Peter

    2010-05-01

    Component mode synthesis (CMS) is a very common approach used for the generation of large system models. In general, these modeling techniques can be separated into two categories: those utilizing a combination of constraint modes and fixed interface normal modes and those based on a combination of free interface normal modes and residual flexibility terms. The major limitation of the methods utilizing constraint modes and fixed interface normal modes is the inability to easily obtain the required information from testing; as a result, constraint mode-based techniques are used primarily with numerical models. An alternate approach is proposed which utilizes frequency and shape information acquired from modal testing to update reduced order finite element models using exact analytical model improvement techniques. The connection degrees of freedom are then rigidly constrained in the test verified, reduced order model to provide the boundary conditions necessary for constraint modes and fixed interface normal modes. The CMS approach is then used with this test verified, reduced order model to generate the system model for further analysis. A laboratory structure is used to show the application of the technique, with both numerical and simulated experimental components used to describe the system and validate the proposed approach. Actual test data are then used in the proposed approach. Because measurement contaminants are always present in any test, the measured data are further processed to remove them before use. The final case, using improved data with the reduced order, test verified components, is shown to produce very acceptable results from the Craig-Bampton component mode synthesis approach. The strengths and weaknesses of the technique are discussed.
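
    For orientation, the Craig-Bampton reduction underlying this kind of CMS can be sketched in a few lines of numpy: constraint modes from a static condensation plus a truncated set of fixed-interface normal modes. The toy 4-DOF system below is invented for illustration:

      # Minimal Craig-Bampton reduction: partition DOFs into boundary (b) and
      # interior (i), build constraint modes plus a truncated set of
      # fixed-interface normal modes, and project K and M onto that basis.
      import numpy as np
      from scipy.linalg import eigh

      def craig_bampton(K, M, b_dofs, n_modes):
          n = K.shape[0]
          i_dofs = [d for d in range(n) if d not in b_dofs]
          Kii = K[np.ix_(i_dofs, i_dofs)]
          Kib = K[np.ix_(i_dofs, b_dofs)]
          Mii = M[np.ix_(i_dofs, i_dofs)]
          # constraint modes: static interior response to unit boundary motion
          psi = -np.linalg.solve(Kii, Kib)
          # fixed-interface normal modes: boundary DOFs clamped
          _, phi = eigh(Kii, Mii)
          phi = phi[:, :n_modes]
          # transformation [x_b; x_i] = T [x_b; q]
          nb = len(b_dofs)
          T = np.zeros((n, nb + n_modes))
          T[b_dofs, :nb] = np.eye(nb)
          T[i_dofs, :nb] = psi
          T[i_dofs, nb:] = phi
          return T.T @ K @ T, T.T @ M @ T   # reduced stiffness and mass

      # toy 4-DOF spring-mass chain with DOF 0 as the boundary DOF
      K = np.array([[2., -1, 0, 0], [-1, 2, -1, 0],
                    [0, -1, 2, -1], [0, 0, -1, 1]])
      M = np.eye(4)
      Kr, Mr = craig_bampton(K, M, b_dofs=[0], n_modes=2)
      print(Kr.shape, Mr.shape)   # (3, 3): 1 boundary DOF + 2 modal DOFs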

  14. A study to compare traditional and constructivism-based instruction of a high school biology unit on biosystematics

    NASA Astrophysics Data System (ADS)

    Saigo, Barbara Woodworth

    The researcher collaborated with four high school biology teachers who had been involved for 2-1/2 years in a constructivism-based professional development experience that emphasized teaching for conceptual change and using classroom-based inquiry as a basis for making instructional decisions. The researcher and teachers designed a five-day instructional unit on biosystematics using two contrasting approaches, comprising the treatment variable. The "traditional" unit emphasized lecture, written materials, and some laboratory activities. The "constructivist" unit emphasized a specific, inquiry-based, conceptual change strategy and collaborative learning. The study used a quasi-experimental, factorial design to explore impact of instructional approach (the treatment variable) on student performance (the dependent variable) on repeated measures (three) of a biology concept test. Additional independent variables considered were gender, cumulative GPA, and the section in which students were enrolled. Scores on the biology concept test were compiled for the 3 constructivist sections (N = 44) and the 3 traditional sections (N = 42). Analysis of Covariance (ANCOVA) was applied. The main findings in regard to the primary research question were that instructional approach did not have a significant relationship to immediate post test scores or gain, but that one month after instruction students in the constructivist group demonstrated less loss of gain than those in the traditional group; i.e., their longer-term retention was greater. Also, GPA*instructional approach effects were detected for post-post-test gain. GPA and gender were significantly associated with pre-test, post-test, and post-post scores; however, in terms of change (gain) from pre-test to post-test and pre-test to post-post-test, GPA and gender were not significant effects. Section was a significant effect for all three tests, in terms of both score and gain. Gender*section effects were detected for post-test gain and post-post-test scores.

  15. A generalized least-squares framework for rare-variant analysis in family data.

    PubMed

    Li, Dalin; Rotter, Jerome I; Guo, Xiuqing

    2014-01-01

    Rare variants may, in part, explain some of the heritability missing in current genome-wide association studies. Many gene-based rare-variant analysis approaches proposed in recent years are aimed at population-based samples, although analysis strategies for family-based samples are clearly warranted, since the family-based design has the potential to enhance our ability to enrich for rare causal variants. We have recently developed the generalized least squares sequence kernel association test, or GLS-SKAT, approach for rare-variant analyses in family samples, in which a kinship matrix computed from the high-dimensional genetic data is used to decorrelate the family structure. We then applied the SKAT-O approach for gene-/region-based inference in the decorrelated data. In this study, we applied this GLS-SKAT method to the systolic blood pressure data in the simulated family sample distributed by the Genetic Analysis Workshop 18. We compared the GLS-SKAT approach to the rare-variant analysis approach implemented in family-based association test-v1 and demonstrated that the GLS-SKAT approach provides superior power and good control of the type I error rate.
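
    The decorrelation step can be sketched as follows: whiten the phenotype and genotypes by the inverse square root of the kinship-implied covariance, then form a SKAT-type quadratic score on the whitened data. Variance components are assumed known here (in practice they are estimated, e.g. by REML), and the null distribution of the statistic, a mixture of chi-squares, is omitted:

      # Sketch of the GLS decorrelation idea: whiten phenotype and genotypes
      # by the inverse square root of the covariance implied by kinship, then
      # apply a simple (unweighted) kernel score statistic. This is a toy
      # simplification, not the published GLS-SKAT implementation.
      import numpy as np

      def gls_kernel_stat(y, G, kinship, sig_g2, sig_e2):
          V = 2 * sig_g2 * kinship + sig_e2 * np.eye(len(y))
          w, U = np.linalg.eigh(V)                  # V is symmetric PD
          V_isqrt = U @ np.diag(w ** -0.5) @ U.T    # inverse square root of V
          y_t, G_t = V_isqrt @ y, V_isqrt @ G       # decorrelated data
          resid = y_t - y_t.mean()
          score = resid @ G_t                       # per-variant score vector
          return float(score @ score)               # SKAT-type quadratic form

      rng = np.random.default_rng(0)
      n, m = 100, 8
      kin = np.eye(n)                # unrelated individuals for the toy case
      G = rng.integers(0, 3, size=(n, m)).astype(float)
      y = G[:, 0] * 0.5 + rng.normal(size=n)
      print(gls_kernel_stat(y, G, kin, sig_g2=0.2, sig_e2=1.0))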

  16. Cluster Randomized Test-Negative Design (CR-TND) Trials: A Novel and Efficient Method to Assess the Efficacy of Community Level Dengue Interventions.

    PubMed

    Anders, Katherine L; Cutcher, Zoe; Kleinschmidt, Immo; Donnelly, Christl A; Ferguson, Neil M; Indriani, Citra; O'Neill, Scott L; Jewell, Nicholas P; Simmons, Cameron P

    2018-05-07

    Cluster randomized trials are the gold standard for assessing efficacy of community-level interventions, such as vector control strategies against dengue. We describe a novel cluster randomized trial methodology with a test-negative design, which offers advantages over traditional approaches. It utilizes outcome-based sampling of patients presenting with a syndrome consistent with the disease of interest, who are subsequently classified as test-positive cases or test-negative controls on the basis of diagnostic testing. We use simulations of a cluster trial to demonstrate validity of efficacy estimates under the test-negative approach. This demonstrates that, provided study arms are balanced for both test-negative and test-positive illness at baseline and that other test-negative design assumptions are met, the efficacy estimates closely match true efficacy. We also briefly discuss analytical considerations for an odds ratio-based effect estimate arising from clustered data, and outline potential approaches to analysis. We conclude that application of the test-negative design to certain cluster randomized trials could increase their efficiency and ease of implementation.
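
    A toy simulation conveys the estimator: test-negative patients act as controls, and efficacy is estimated as one minus the odds ratio of test-positives between arms. All parameters below are invented, and a real analysis would additionally account for clustering (e.g. via cluster-level summaries), as the authors discuss:

      # Toy simulation of the cluster randomized test-negative design: clinics
      # enrol febrile patients classified as test-positive (dengue) or
      # test-negative (other febrile illness); efficacy is estimated as
      # 1 - OR of test-positives between arms. Parameters are invented.
      import numpy as np

      rng = np.random.default_rng(42)
      n_clusters, true_efficacy = 24, 0.4
      counts = {"int": [0, 0], "ctl": [0, 0]}   # [test_pos, test_neg]
      for c in range(n_clusters):
          arm = "int" if c % 2 == 0 else "ctl"
          base_pos, base_neg = 50, 120          # expected presentations
          lam_pos = base_pos * (1 - true_efficacy if arm == "int" else 1.0)
          counts[arm][0] += rng.poisson(lam_pos)
          counts[arm][1] += rng.poisson(base_neg)  # unaffected by intervention

      odds_int = counts["int"][0] / counts["int"][1]
      odds_ctl = counts["ctl"][0] / counts["ctl"][1]
      print("estimated efficacy:", 1 - odds_int / odds_ctl)   # close to 0.4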

  17. Efficient Testing Combining Design of Experiment and Learn-to-Fly Strategies

    NASA Technical Reports Server (NTRS)

    Murphy, Patrick C.; Brandon, Jay M.

    2017-01-01

    Rapid modeling and efficient testing methods are important in a number of aerospace applications. In this study efficient testing strategies were evaluated in a wind tunnel test environment and combined to suggest a promising approach for both ground-based and flight-based experiments. Benefits of using Design of Experiment techniques, well established in scientific, military, and manufacturing applications are evaluated in combination with newly developing methods for global nonlinear modeling. The nonlinear modeling methods, referred to as Learn-to-Fly methods, utilize fuzzy logic and multivariate orthogonal function techniques that have been successfully demonstrated in flight test. The blended approach presented has a focus on experiment design and identifies a sequential testing process with clearly defined completion metrics that produce increased testing efficiency.

  18. Radiation and Electromagnetic Induction Data Fusion for Detection of Buried Radioactive Metal Waste - 12282

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Long, Zhiling; Wei, Wei; Turlapaty, Anish

    2012-07-01

    At the United States Army's test sites, fired penetrators made of Depleted Uranium (DU) have been buried underground and become hazardous waste. Previously, we developed techniques for detecting buried radioactive targets. We also developed approaches for locating buried paramagnetic metal objects by utilizing electromagnetic induction (EMI) sensor data. In this paper, we apply data fusion techniques to combine results from both the radiation detection and the EMI detection, so that we can further distinguish among DU penetrators, DU oxide, and non-DU metal debris. We develop a two-step fusion approach for the task and test it with survey data collected on simulation targets. In this work, we explored radiation and EMI data fusion for detecting DU, oxides, and non-DU metals. We developed a two-step fusion approach based on majority voting and a set of decision rules. With this approach, we fuse results from radiation detection based on the RX algorithm and EMI detection based on a 3-step analysis. Our fusion approach has been tested successfully with data collected on simulation targets. In the future, we will need to further verify the effectiveness of this fusion approach with field data.
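
    The two-step fusion can be caricatured as a vote followed by decision rules; the specific rules below are invented placeholders, not the authors' classifier:

      # Sketch of a two-step fusion of radiation and EMI detections: step 1
      # takes a vote across per-sensor calls, step 2 applies simple decision
      # rules to separate DU penetrators, DU oxide, and non-DU metal. The
      # rules and labels are invented for illustration only.
      def fuse(radiation_hit, emi_metal_hit, emi_shape_compact):
          votes = int(radiation_hit) + int(emi_metal_hit)
          if votes == 0:
              return "no target"
          if radiation_hit and emi_metal_hit:
              return "DU penetrator" if emi_shape_compact else "DU oxide"
          if radiation_hit:             # radioactive but no metal signature
              return "DU oxide"
          return "non-DU metal debris"  # metal signature only

      print(fuse(True, True, True))     # -> DU penetrator
      print(fuse(False, True, False))   # -> non-DU metal debris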

  19. Testing the TPF Interferometry Approach before Launch

    NASA Technical Reports Server (NTRS)

    Serabyn, Eugene; Mennesson, Bertrand

    2006-01-01

    One way to directly detect nearby extra-solar planets is via their thermal infrared emission, and with this goal in mind, both NASA and ESA are investigating cryogenic infrared interferometers. Common to both agencies' approaches to faint off-axis source detection near bright stars is the use of a rotating nulling interferometer, such as the Terrestrial Planet Finder interferometer (TPF-I), or Darwin. In this approach, the central star is nulled, while the emission from off-axis sources is transmitted and modulated by the rotation of the off-axis fringes. Because of the high contrasts involved, and the novelty of the measurement technique, it is essential to gain experience with this technique before launch. Here we describe a simple ground-based experiment that can test the essential aspects of the TPF signal measurement and image reconstruction approaches by generating a rotating interferometric baseline within the pupil of a large single-aperture telescope. This approach can mimic potential space-based interferometric configurations, and allow the extraction of signals from off-axis sources using the same algorithms proposed for the space-based missions. This approach should thus allow for testing of the applicability of proposed signal extraction algorithms for the detection of single and multiple near-neighbor companions...

  20. Development of Scientific Approach Based on Discovery Learning Module

    NASA Astrophysics Data System (ADS)

    Ellizar, E.; Hardeli, H.; Beltris, S.; Suharni, R.

    2018-04-01

    The scientific approach is a learning process designed to make students actively construct their own knowledge through the stages of the scientific method, and it can be implemented through learning modules. One suitable learning model is discovery-based learning, in which students discover what is to be learned for themselves through activities such as observation, experience, and reasoning. In practice, students' activity in constructing their own knowledge was not optimal, because the available learning modules were not aligned with the scientific approach. The purpose of this study was therefore to develop a scientific approach, discovery-based learning module on acid-base chemistry and on electrolyte and non-electrolyte solutions. The modules were developed using the Plomp model, with its three main stages of preliminary research, prototyping, and assessment. The subjects of this research were 10th and 11th grade senior high school students (SMAN 2 Padang). Validity was assessed by expert chemistry lecturers and teachers, practicality was tested through questionnaires, and effectiveness was tested through an experimental procedure comparing student achievement between experimental and control groups. Based on the findings, it can be concluded that the developed modules significantly improved students' learning of acid-base chemistry and electrolyte solutions. The data analysis indicated that the modules were valid in content, construct, and presentation, showed a good level of practicality consistent with the available time, and were effective in helping students understand the learning material, as shown by the students' learning outcomes. In conclusion, the discovery learning and scientific approach-based chemistry modules on electrolyte and non-electrolyte solutions and acid-base chemistry for 10th and 11th grade senior high school students were valid, practical, and effective.

  1. Urban tree cover change in Detroit and Atlanta, USA, 1951-2010

    Treesearch

    Krista Merry; Jacek Siry; Pete Bettinger; J.M. Bowker

    2014-01-01

    We assessed tree cover using random points and polygons distributed within the administrative boundaries of Detroit, MI and Atlanta, GA. Two approaches were tested: a point-based approach using 1000 randomly located sample points, and a polygon-based approach using 250 circular areas, 200 m in radius (12.56 ha each). In the case of Atlanta, both approaches arrived at similar...

  2. Fixed Base Modal Survey of the MPCV Orion European Service Module Structural Test Article

    NASA Technical Reports Server (NTRS)

    Winkel, James P.; Akers, J. C.; Suarez, Vicente J.; Staab, Lucas D.; Napolitano, Kevin L.

    2017-01-01

    Recently, the MPCV Orion European Service Module Structural Test Article (E-STA) underwent sine vibration testing using the multi-axis shaker system at NASA GRC Plum Brook Station Mechanical Vibration Facility (MVF). An innovative approach using measured constraint shapes at the interface of E-STA to the MVF allowed high-quality fixed base modal parameters of the E-STA to be extracted, which have been used to update the E-STA finite element model (FEM), without the need for a traditional fixed base modal survey. This innovative approach provided considerable program cost and test schedule savings. This paper documents this modal survey, which includes the modal pretest analysis sensor selection, the fixed base methodology using measured constraint shapes as virtual references and measured frequency response functions, and post-survey comparison between measured and analysis fixed base modal parameters.

  3. To t-Test or Not to t-Test? A p-Values-Based Point of View in the Receiver Operating Characteristic Curve Framework.

    PubMed

    Vexler, Albert; Yu, Jihnhee

    2018-04-13

    A common statistical doctrine supported by many introductory courses and textbooks is that t-test type procedures based on normally distributed data points provide a standard in decision-making. To motivate scholars to examine this convention, we introduce a simple approach based on the graphical tools of receiver operating characteristic (ROC) curve analysis, a well-established biostatistical methodology. In this context, we propose employing a p-values-based method that takes into account the stochastic nature of p-values. Following the modern statistical literature, we address the expected p-value (EPV) as a measure of the performance of decision-making rules. We extend the EPV concept to the ROC curve technique, which provides expressive evaluations and visualizations of a wide spectrum of properties of testing mechanisms. We show that the conventional power characterization of tests is a partial aspect of the presented EPV/ROC technique, and we hope this explanation convinces researchers of its usefulness for depicting different characteristics of decision-making procedures, in light of the growing interest in correct p-values-based applications.
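
    The EPV itself is easy to approximate by Monte Carlo, as the sketch below does for a two-sample t-test under a fixed alternative (under the null the EPV is 0.5; smaller values indicate better discrimination). The settings are arbitrary illustrations:

      # Monte Carlo sketch of the expected p-value (EPV) of a two-sample
      # t-test under a fixed mean shift delta. EPV = 0.5 under the null;
      # smaller EPV indicates better discrimination from the null.
      import numpy as np
      from scipy.stats import ttest_ind

      def expected_p_value(delta, n=30, reps=5000, seed=0):
          rng = np.random.default_rng(seed)
          ps = [ttest_ind(rng.normal(delta, 1, n), rng.normal(0, 1, n)).pvalue
                for _ in range(reps)]
          return float(np.mean(ps))

      for d in (0.0, 0.3, 0.6):
          print(f"delta={d}: EPV ~ {expected_p_value(d):.3f}")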

  4. BETA: Behavioral testability analyzer and its application to high-level test generation and synthesis for testability. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Chen, Chung-Hsing

    1992-01-01

    In this thesis, a behavioral-level testability analysis approach is presented. This approach is based on analyzing the circuit behavioral description (similar to a C program) to estimate its testability by identifying controllable and observable circuit nodes. This information can be used by a test generator to gain better access to internal circuit nodes and to reduce its search space. The results of the testability analyzer can also be used to select test points or partial scan flip-flops in the early design phase. Based on these selection criteria, a novel Synthesis for Testability approach called Test Statement Insertion (TSI) is proposed, which modifies the circuit behavioral description directly. Test Statement Insertion can also be used to modify the circuit structural description to improve its testability. As a result, the Synthesis for Testability methodology can be combined with an existing behavioral synthesis tool to produce more testable circuits.

  5. A rigorous approach to facilitate and guarantee the correctness of the genetic testing management in human genome information systems.

    PubMed

    Araújo, Luciano V; Malkowski, Simon; Braghetto, Kelly R; Passos-Bueno, Maria R; Zatz, Mayana; Pu, Calton; Ferreira, João E

    2011-12-22

    Recent medical and biological technology advances have stimulated the development of new testing systems that have been providing huge, varied amounts of molecular and clinical data. Growing data volumes pose significant challenges for information processing systems in research centers. Additionally, the routines of a genomics laboratory are typically characterized by high parallelism in testing and constant procedure changes. This paper describes a formal approach to address this challenge through the implementation of a genetic testing management system applied to a human genome laboratory. We introduced the Human Genome Research Center Information System (CEGH) in Brazil, a system that is able to support constant changes in human genome testing and can provide patients updated results based on the most recent and validated genetic knowledge. Our approach uses a common repository for process planning to ensure reusability, specification, instantiation, monitoring, and execution of processes, which are defined using a relational database and rigorous control flow specifications based on process algebra (ACP). The main difference between our approach and related works is that we were able to join two important aspects: 1) process scalability, achieved through the relational database implementation, and 2) correctness of processes, ensured using process algebra. Furthermore, the software allows end users to define genetic testing without requiring any knowledge about business process notation or process algebra. This paper presents the CEGH information system, a Laboratory Information Management System (LIMS) based on a formal framework to support genetic testing management for Mendelian disorder studies. We have proved the feasibility and shown the usability benefits of a rigorous approach that is able to specify, validate, and perform genetic testing using easy end-user interfaces.

  7. A method for evaluating models that use galaxy rotation curves to derive the density profiles

    NASA Astrophysics Data System (ADS)

    de Almeida, Álefe O. F.; Piattella, Oliver F.; Rodrigues, Davi C.

    2016-11-01

    There are several approaches, based either on General Relativity (GR) or on modified gravity, that use galaxy rotation curves to derive the matter density of the corresponding galaxy; this inverse procedure would imply either a partial or a complete elimination of dark matter in galaxies. Here we review these approaches, clarify the difficulties of this inverse procedure, present a method for evaluating such approaches, and use it to test two specific GR-based ones: the Cooperstock-Tieu (CT) and the Balasin-Grumiller (BG) approaches. Using this new method, we find that neither of the tested approaches can satisfactorily fit the observational data without dark matter. The CT approach results can be significantly improved if some dark matter is considered, while for the BG approach no usual dark matter halo can improve its results.

  8. Synthesis of the Insecticide Prothrin and Its Analogues from Biomass-Derived 5-(Cloromethyl)furfural

    DTIC Science & Technology

    2013-12-19

    ... the same synthetic approach. Preliminary testing of these furan-based pyrethroids against the yellow fever mosquito Aedes aegypti indicates promising insecticidal ... Aedes aegypti (established 1952) was reared in the insectary of the Mosquito and Fly Research Unit at the Center for Medical, Agricultural, and Veterinary ...

  9. "L2 Assessment and Testing" Teacher Education: An Exploration of Alternative Assessment Approaches Using New Technologies

    ERIC Educational Resources Information Center

    Papadima-Sophocleous, Salomi

    2017-01-01

    Most Second Language (L2) Teacher Training Assessment and Testing courses focus on testing. Through the development of a Master of Arts (MA) in a computer assisted language learning module (based on a constructivist and "practise what you preach" approach, entailing that the teachers experience firsthand the assessment types they were…

  10. State of the art in non-animal approaches for skin sensitization testing: from individual test methods towards testing strategies.

    PubMed

    Ezendam, Janine; Braakhuis, Hedwig M; Vandebriel, Rob J

    2016-12-01

    The hazard assessment of skin sensitizers relies mainly on animal testing, but much progress is made in the development, validation and regulatory acceptance and implementation of non-animal predictive approaches. In this review, we provide an update on the available computational tools and animal-free test methods for the prediction of skin sensitization hazard. These individual test methods address mostly one mechanistic step of the process of skin sensitization induction. The adverse outcome pathway (AOP) for skin sensitization describes the key events (KEs) that lead to skin sensitization. In our review, we have clustered the available test methods according to the KE they inform: the molecular initiating event (MIE/KE1)-protein binding, KE2-keratinocyte activation, KE3-dendritic cell activation and KE4-T cell activation and proliferation. In recent years, most progress has been made in the development and validation of in vitro assays that address KE2 and KE3. No standardized in vitro assays for T cell activation are available; thus, KE4 cannot be measured in vitro. Three non-animal test methods, addressing either the MIE, KE2 or KE3, are accepted as OECD test guidelines, and this has accelerated the development of integrated or defined approaches for testing and assessment (e.g. testing strategies). The majority of these approaches are mechanism-based, since they combine results from multiple test methods and/or computational tools that address different KEs of the AOP to estimate skin sensitization potential and sometimes potency. Other approaches are based on statistical tools. Until now, eleven different testing strategies have been published, the majority using the same individual information sources. Our review shows that some of the defined approaches to testing and assessment are able to accurately predict skin sensitization hazard, sometimes even more accurate than the currently used animal test. A few defined approaches are developed to provide an estimate of the potency sub-category of a skin sensitizer as well, but these approaches need further independent evaluation with a new dataset of chemicals. To conclude, this update shows that the field of non-animal approaches for skin sensitization has evolved greatly in recent years and that it is possible to predict skin sensitization hazard without animal testing.
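
    One family of published defined approaches combines the three information sources with a simple majority rule (a "2 out of 3"-style approach). The sketch below illustrates that logic only; the assay calls are invented placeholders, and real defined approaches specify validated protocols and data-interpretation procedures:

      # Sketch of a "2 out of 3"-style defined approach: a substance is
      # classified as a sensitizer if at least two of three key-event assays
      # (e.g. DPRA for the MIE/KE1, KeratinoSens for KE2, h-CLAT for KE3)
      # are positive. The assay results below are invented placeholders.
      def two_out_of_three(dpra_pos, keratinosens_pos, hclat_pos):
          votes = sum([bool(dpra_pos), bool(keratinosens_pos), bool(hclat_pos)])
          return "sensitizer" if votes >= 2 else "non-sensitizer"

      print(two_out_of_three(True, True, False))    # -> sensitizer
      print(two_out_of_three(False, True, False))   # -> non-sensitizer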

  11. An investigation of the efficacy of collaborative virtual reality systems for moderated remote usability testing.

    PubMed

    Chalil Madathil, Kapil; Greenstein, Joel S

    2017-11-01

    Collaborative virtual reality-based systems have integrated high-fidelity voice-based communication, immersive audio and screen-sharing tools into virtual environments. Such three-dimensional collaborative virtual environments can mirror the collaboration among usability test participants and facilitators when they are physically collocated, potentially enabling moderated usability tests to be conducted effectively when the facilitator and participant are located in different places. We developed a virtual collaborative three-dimensional remote moderated usability testing laboratory and employed it in a controlled study to compare the effectiveness of moderated usability testing in a collaborative virtual reality-based environment against two other moderated usability testing methods: the traditional lab approach and Cisco WebEx, a web-based conferencing and screen-sharing approach. Using a mixed-methods experimental design, 36 test participants and 12 test facilitators were asked to complete representative tasks on a simulated online shopping website. The dependent variables included the time taken to complete the tasks; the usability defects identified and their severity; and the subjective ratings on the workload index, presence and satisfaction questionnaires. The remote moderated usability testing methodology using a collaborative virtual reality system performed similarly to the other two methodologies in terms of the total number of defects identified, the number of high-severity defects identified and the time taken to complete the tasks. The overall workload experienced by the test participants and facilitators was the least with the traditional lab condition. No significant differences were identified for the workload experienced with the virtual reality and the WebEx conditions. However, test participants experienced greater involvement and a more immersive experience in the virtual environment than in the WebEx condition. The ratings for the virtual environment condition were not significantly different from those for the traditional lab condition. The results of this study suggest that participants were productive and enjoyed the virtual lab condition, indicating the potential of a virtual world-based approach as an alternative to conventional approaches for synchronous usability testing. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. A Cancer Gene Selection Algorithm Based on the K-S Test and CFS.

    PubMed

    Su, Qiang; Wang, Yina; Jiang, Xiaobing; Chen, Fuxue; Lu, Wen-Cong

    2017-01-01

    To address the challenging problem of selecting distinguished genes from cancer gene expression datasets, this paper presents a gene subset selection algorithm based on the Kolmogorov-Smirnov (K-S) test and correlation-based feature selection (CFS) principles. The algorithm first selects distinguished genes using the K-S test, and then uses CFS to select genes from those selected by the K-S test. We adopted support vector machines (SVM) as the classification tool and used accuracy as the criterion to evaluate the performance of the classifiers on the selected gene subsets. We compared the proposed gene subset selection algorithm with the K-S test, CFS, minimum-redundancy maximum-relevancy (mRMR), and ReliefF algorithms. The average experimental results of the aforementioned gene selection algorithms for 5 gene expression datasets demonstrate that, based on accuracy, the performance of the new K-S and CFS-based algorithm is better than that of the K-S test, CFS, mRMR, and ReliefF algorithms. The experimental results show that the K-S test-CFS gene selection algorithm is a very effective and promising approach compared to the K-S test, CFS, mRMR, and ReliefF algorithms.
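
    As a rough illustration of the two-stage filter described above, the sketch below applies a per-gene two-sample K-S test and then a greedy CFS-style merit search, with an SVM evaluated by cross-validation. The cutoffs (k_top, max_genes) and the exact merit formulation are illustrative assumptions, not the paper's published settings.

```python
# Hypothetical sketch of the two-stage K-S + CFS filter; X is a
# samples x genes expression matrix, y holds binary class labels.
import numpy as np
from scipy.stats import ks_2samp
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def ks_filter(X, y, k_top=200):
    """Stage 1: keep the k_top genes whose class-conditional expression
    distributions differ most by the two-sample K-S statistic."""
    stats = np.array([ks_2samp(X[y == 0, j], X[y == 1, j]).statistic
                      for j in range(X.shape[1])])
    return list(np.argsort(stats)[::-1][:k_top])

def cfs_greedy(X, y, candidates, max_genes=30):
    """Stage 2: greedy CFS-style search, growing the subset that maximizes
    merit = mean gene-class correlation penalized by gene-gene correlation."""
    def merit(subset):
        k = len(subset)
        rcf = np.mean([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in subset])
        rff = 1.0 if k == 1 else np.mean(
            [abs(np.corrcoef(X[:, a], X[:, b])[0, 1])
             for i, a in enumerate(subset) for b in subset[i + 1:]])
        return k * rcf / np.sqrt(k + k * (k - 1) * rff)
    selected = []
    while len(selected) < max_genes:
        best, best_m = None, merit(selected) if selected else -np.inf
        for j in candidates:
            if j not in selected and merit(selected + [j]) > best_m:
                best, best_m = j, merit(selected + [j])
        if best is None:
            break
        selected.append(best)
    return selected

# Accuracy criterion via an SVM, as in the paper:
# genes = cfs_greedy(X, y, ks_filter(X, y))
# print(cross_val_score(SVC(kernel="linear"), X[:, genes], y, cv=5).mean())
```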

  13. Research Problems Associated with Limiting the Applied Force in Vibration Tests and Conducting Base-Drive Modal Vibration Tests

    NASA Technical Reports Server (NTRS)

    Scharton, Terry D.

    1995-01-01

    The intent of this paper is to make a case for developing and conducting vibration tests which are both realistic and practical (a question of tailoring versus standards). Tests are essential for finding things overlooked in the analyses. The best test is often the most realistic test which can be conducted within the cost and budget constraints. Some standards are essential, but the author believes more in the individual's ingenuity to solve a specific problem than in the application of standards which reduce problems (and technology) to their lowest common denominator. Force limited vibration tests and base-drive modal tests are two examples of realistic, but practical testing approaches. Since both of these approaches are relatively new, a number of interesting research problems exist, and these are emphasized herein.

  14. FloPSy - Search-Based Floating Point Constraint Solving for Symbolic Execution

    NASA Astrophysics Data System (ADS)

    Lakhotia, Kiran; Tillmann, Nikolai; Harman, Mark; de Halleux, Jonathan

    Recently there has been an upsurge of interest in both Search-Based Software Testing (SBST) and Dynamic Symbolic Execution (DSE). Each of these two approaches has complementary strengths and weaknesses, making it a natural choice to explore the degree to which the strengths of one can be exploited to offset the weaknesses of the other. This paper introduces an augmented version of DSE that uses an SBST-based approach to handling floating point computations, which are known to be problematic for vanilla DSE. The approach has been implemented as a plug-in for the Microsoft Pex DSE testing tool. The paper presents results from both standard evaluation benchmarks and two open-source programs.
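
    The core SBST move for a floating-point constraint is to minimize a branch distance by local search. The sketch below is a generic pattern search on a hypothetical path condition, not the Pex/FloPSy implementation; the constraint sin(x) == 0.5 is chosen only because it defeats naive constraint solvers over floats.

```python
# Illustrative pattern search (an SBST-style local search), not the
# Pex/FloPSy implementation. The branch distance is zero exactly when
# the hypothetical floating-point path condition sin(x) == 0.5 holds.
import math

def branch_distance(x):
    return abs(math.sin(x) - 0.5)

def pattern_search(x=0.0, eps=1e-12, max_iter=100000):
    step = 1.0
    best = branch_distance(x)
    for _ in range(max_iter):
        if best <= eps:
            break
        moved = False
        for direction in (+1.0, -1.0):
            d = branch_distance(x + direction * step)
            if d < best:
                x, best, moved = x + direction * step, d, True
                step *= 2.0    # accelerate while improving
                break
        if not moved:
            step /= 2.0        # contract the step when stuck
    return x, best

x, d = pattern_search()
print(f"x = {x:.12f}, |sin(x) - 0.5| = {d:.2e}")  # converges near pi/6
```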

  15. Accounting for estimated IQ in neuropsychological test performance with regression-based techniques.

    PubMed

    Testa, S Marc; Winicki, Jessica M; Pearlson, Godfrey D; Gordon, Barry; Schretlen, David J

    2009-11-01

    Regression-based normative techniques account for variability in test performance associated with multiple predictor variables and generate expected scores based on algebraic equations. Using this approach, we show that estimated IQ, based on oral word reading, accounts for 1-9% of the variability beyond that explained by individual differences in age, sex, race, and years of education for most cognitive measures. These results confirm that adding estimated "premorbid" IQ to demographic predictors in multiple regression models can incrementally improve the accuracy with which regression-based norms (RBNs) benchmark expected neuropsychological test performance in healthy adults. It remains to be seen whether the incremental variance in test performance explained by estimated "premorbid" IQ translates to improved diagnostic accuracy in patient samples. We describe these methods, and illustrate the step-by-step application of RBNs with two cases. We also discuss the rationale, assumptions, and caveats of this approach. More broadly, we note that adjusting test scores for age and other characteristics might actually decrease the accuracy with which test performance predicts absolute criteria, such as the ability to drive or live independently.
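
    A minimal sketch of the RBN computation, assuming a normative dataset with the column names shown (the published regression equations differ). The adjusted z-score is the residual of the observed score from the model's expected score, scaled by the model's residual standard deviation.

```python
# Minimal RBN sketch with assumed column names; not the authors' equations.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical healthy-adult normative sample with demographic predictors
norms = pd.read_csv("normative_sample.csv")
model = smf.ols("memory_score ~ age + C(sex) + C(race) + education + estimated_iq",
                data=norms).fit()

def rbn_z(patient):
    """z = (observed - model-expected) / residual SD of the normative model."""
    expected = float(model.predict(pd.DataFrame([patient])).iloc[0])
    return (patient["memory_score"] - expected) / np.sqrt(model.mse_resid)

case = {"age": 67, "sex": "F", "race": "white", "education": 12,
        "estimated_iq": 104, "memory_score": 38}
print(f"demographically adjusted z = {rbn_z(case):+.2f}")
```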

  16. Creep-fatigue life prediction for engine hot section materials (isotropic)

    NASA Technical Reports Server (NTRS)

    Moreno, V.

    1982-01-01

    The objectives of this program are the investigation of fundamental approaches to high temperature crack initiation life prediction, identification of specific modeling strategies and the development of specific models for component relevant loading conditions. A survey of the hot section material/coating systems used throughout the gas turbine industry is included. Two material/coating systems will be identified for the program. The material/coating system designated as the base system shall be used throughout Tasks 1-12. The alternate material/coating system will be used only in Task 12 for further evaluation of the models developed on the base material. In Task 2, candidate life prediction approaches will be screened based on a set of criteria that includes experience of the approaches within the literature, correlation with isothermal data generated on the base material, and judgments relative to the applicability of the approach for the complex cycles to be considered in the option program. The two most promising approaches will be identified. Task 3 further evaluates the best approach using additional base material fatigue testing, including verification tests. Task 4 consists of technical, schedule, financial and all other reporting requirements in accordance with the Reports of Work clause.

  17. Can traits predict individual growth performance? A test in a hyperdiverse tropical forest.

    PubMed

    Poorter, Lourens; Castilho, Carolina V; Schietti, Juliana; Oliveira, Rafael S; Costa, Flávia R C

    2018-07-01

    The functional trait approach has, as a central tenet, that plant traits are functional and shape individual performance, but this has rarely been tested in the field. Here, we tested the individual-based trait approach in a hyperdiverse Amazonian tropical rainforest and evaluated intraspecific variation in trait values, plant strategies at the individual level, and whether traits are functional and predict individual performance. We evaluated > 1300 tree saplings belonging to > 383 species, measured 25 traits related to growth and defense, and evaluated the effects of environmental conditions, plant size, and traits on stem growth. A total of 44% of the trait variation was observed within species, indicating a strong potential for acclimation. Individuals showed two strategy spectra, related to tissue toughness and organ size vs leaf display. In this nutrient- and light-limited forest, traits measured at the individual level were surprisingly poor predictors of individual growth performance because of convergence of traits and growth rates. Functional trait approaches based on individuals or species are conceptually fundamentally different: the species-based approach focuses on the potential and the individual-based approach on the realized traits and growth rates. Counterintuitively, the individual approach leads to a poor prediction of individual performance, although it provides a more realistic view on community dynamics. © 2018 The Authors. New Phytologist © 2018 New Phytologist Trust.

  18. Proposal of a skin tests based approach for the prevention of recurrent hypersensitivity reactions to iodinated contrast media.

    PubMed

    Della-Torre, E; Berti, A; Yacoub, M R; Guglielmi, B; Tombetti, E; Sabbadini, M G; Voltolini, S; Colombo, G

    2015-05-01

    The purpose of the present work is to evaluate the efficacy of an approach that combines clinical history, skin test results, and premedication in preventing recurrent hypersensitivity reactions to iodinated contrast media (ICM). Skin prick tests, intradermal tests, and patch tests were performed in 36 patients with a previous reaction to ICM. All patients underwent a second contrast-enhanced radiological procedure with an alternative ICM selected on the basis of the proposed approach. After alternative ICM re-injection, only one patient presented a mild non-immediate reaction (NIR). The proposed algorithm, validated in clinical settings where repeated radiological exams are needed, offers a safe and practical approach for protecting patients from recurrent hypersensitivity reactions to ICM.

  19. A General Approach to Measuring Test-Taking Effort on Computer-Based Tests

    ERIC Educational Resources Information Center

    Wise, Steven L.; Gao, Lingyun

    2017-01-01

    There has been increased interest in the impact of unmotivated test taking on test performance and score validity. This has led to the development of new ways of measuring test-taking effort based on item response time. In particular, Response Time Effort (RTE) has been shown to provide an assessment of effort down to the level of individual…
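
    A minimal sketch of an RTE-style computation, following the general definition of RTE as the proportion of items answered with solution behavior. The 10%-of-median threshold rule here is an illustrative assumption, not the authors' operational threshold.

```python
# Sketch of Response Time Effort: the proportion of items on which a
# test taker's response time meets a solution-behavior threshold.
# The 10%-of-median-item-time threshold is an illustrative rule.
import numpy as np

def rte(rt_matrix):
    """rt_matrix: examinees x items array of response times (seconds)."""
    thresholds = 0.10 * np.median(rt_matrix, axis=0)   # one threshold per item
    solution_behavior = rt_matrix >= thresholds        # below = rapid guess
    return solution_behavior.mean(axis=1)              # RTE per examinee

rts = np.array([[42.0, 55.0, 37.0],    # effortful examinee
                [ 1.2,  2.0, 40.0]])   # rapid-guesses the first two items
print(rte(rts))                        # -> [1.0, 0.333...]
```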

  20. A diagnostic approach to hemochromatosis

    PubMed Central

    Tavill, Anthony S; Adams, Paul C

    2006-01-01

    In the present clinical review, a diagnostic approach to hemochromatosis is discussed from the perspective of two clinicians with extensive experience in this area. The introduction of genetic testing and large-scale population screening studies has broadened our understanding of the clinical expression of disease and the utility of biochemical iron tests for the detection of disease and for the assessment of disease severity. Liver biopsy has become more of a prognostic test than a diagnostic test. The authors offer a stepwise diagnostic algorithm, based on current evidence-based data, that they regard as most cost-effective. An early diagnosis can lead to phlebotomy therapy to prevent the development of cirrhosis. PMID:16955151

  1. Size-based molecular diagnostics using plasma DNA for noninvasive prenatal testing.

    PubMed

    Yu, Stephanie C Y; Chan, K C Allen; Zheng, Yama W L; Jiang, Peiyong; Liao, Gary J W; Sun, Hao; Akolekar, Ranjit; Leung, Tak Y; Go, Attie T J I; van Vugt, John M G; Minekawa, Ryoko; Oudejans, Cees B M; Nicolaides, Kypros H; Chiu, Rossa W K; Lo, Y M Dennis

    2014-06-10

    Noninvasive prenatal testing using fetal DNA in maternal plasma is an actively researched area. The current generation of tests using massively parallel sequencing is based on counting plasma DNA sequences originating from different genomic regions. In this study, we explored a different approach that is based on the use of DNA fragment size as a diagnostic parameter. This approach is dependent on the fact that circulating fetal DNA molecules are generally shorter than the corresponding maternal DNA molecules. First, we performed plasma DNA size analysis using paired-end massively parallel sequencing and microchip-based capillary electrophoresis. We demonstrated that the fetal DNA fraction in maternal plasma could be deduced from the overall size distribution of maternal plasma DNA. The fetal DNA fraction is a critical parameter affecting the accuracy of noninvasive prenatal testing using maternal plasma DNA. Second, we showed that fetal chromosomal aneuploidy could be detected by observing an aberrant proportion of short fragments from an aneuploid chromosome in the paired-end sequencing data. Using this approach, we detected fetal trisomy 21 and trisomy 18 with 100% sensitivity (T21: 36/36; T18: 27/27) and 100% specificity (non-T21: 88/88; non-T18: 97/97). For trisomy 13, the sensitivity and specificity were 95.2% (20/21) and 99% (102/103), respectively. For monosomy X, the sensitivity and specificity were both 100% (10/10 and 8/8). Thus, this study establishes the principle of size-based molecular diagnostics using plasma DNA. This approach has potential applications beyond noninvasive prenatal testing to areas such as oncology and transplantation monitoring.
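
    The aneuploidy detection step can be summarized as a z-score on the proportion of short fragments from the target chromosome, as in this hedged sketch; the 150 bp boundary and the z > 3 convention follow the general logic of size-based NIPT, while the exact pipeline parameters here are assumptions.

```python
# Hedged sketch of the size-based aneuploidy statistic: z-score on the
# proportion of short (< 150 bp) chr21 fragments against euploid controls.
# The 150 bp boundary and z > 3 convention are assumptions for illustration.
import numpy as np

def short_fraction(fragment_sizes, cutoff_bp=150):
    return np.mean(np.asarray(fragment_sizes) < cutoff_bp)

def size_zscore(case_sizes_chr21, euploid_fractions_chr21):
    """Positive z indicates an excess of short chr21 fragments, consistent
    with extra fetal (shorter) DNA from a trisomic chromosome 21."""
    f = short_fraction(case_sizes_chr21)
    mu = np.mean(euploid_fractions_chr21)
    sd = np.std(euploid_fractions_chr21, ddof=1)
    return (f - mu) / sd

# A call threshold such as z > 3, as used in counting-based NIPT,
# would be applied to the size-based statistic in the same way.
```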

  2. Size-based molecular diagnostics using plasma DNA for noninvasive prenatal testing

    PubMed Central

    Yu, Stephanie C. Y.; Chan, K. C. Allen; Zheng, Yama W. L.; Jiang, Peiyong; Liao, Gary J. W.; Sun, Hao; Akolekar, Ranjit; Leung, Tak Y.; Go, Attie T. J. I.; van Vugt, John M. G.; Minekawa, Ryoko; Oudejans, Cees B. M.; Nicolaides, Kypros H.; Chiu, Rossa W. K.; Lo, Y. M. Dennis

    2014-01-01

    Noninvasive prenatal testing using fetal DNA in maternal plasma is an actively researched area. The current generation of tests using massively parallel sequencing is based on counting plasma DNA sequences originating from different genomic regions. In this study, we explored a different approach that is based on the use of DNA fragment size as a diagnostic parameter. This approach is dependent on the fact that circulating fetal DNA molecules are generally shorter than the corresponding maternal DNA molecules. First, we performed plasma DNA size analysis using paired-end massively parallel sequencing and microchip-based capillary electrophoresis. We demonstrated that the fetal DNA fraction in maternal plasma could be deduced from the overall size distribution of maternal plasma DNA. The fetal DNA fraction is a critical parameter affecting the accuracy of noninvasive prenatal testing using maternal plasma DNA. Second, we showed that fetal chromosomal aneuploidy could be detected by observing an aberrant proportion of short fragments from an aneuploid chromosome in the paired-end sequencing data. Using this approach, we detected fetal trisomy 21 and trisomy 18 with 100% sensitivity (T21: 36/36; T18: 27/27) and 100% specificity (non-T21: 88/88; non-T18: 97/97). For trisomy 13, the sensitivity and specificity were 95.2% (20/21) and 99% (102/103), respectively. For monosomy X, the sensitivity and specificity were both 100% (10/10 and 8/8). Thus, this study establishes the principle of size-based molecular diagnostics using plasma DNA. This approach has potential applications beyond noninvasive prenatal testing to areas such as oncology and transplantation monitoring. PMID:24843150

  3. Accounting for Uncertainty in Decision Analytic Models Using Rank Preserving Structural Failure Time Modeling: Application to Parametric Survival Models.

    PubMed

    Bennett, Iain; Paracha, Noman; Abrams, Keith; Ray, Joshua

    2018-01-01

    Rank Preserving Structural Failure Time models are one of the most commonly used statistical methods to adjust for treatment switching in oncology clinical trials. The method is often applied in a decision analytic model without appropriately accounting for additional uncertainty when determining the allocation of health care resources. The aim of the study is to describe novel approaches to adequately account for uncertainty when using a Rank Preserving Structural Failure Time model in a decision analytic model. Using two examples, we tested and compared the performance of the novel Test-based method with the resampling bootstrap method and with the conventional approach of no adjustment. In the first example, we simulated life expectancy using a simple decision analytic model based on a hypothetical oncology trial with treatment switching. In the second example, we applied the adjustment method to published data when no individual patient data were available. Mean estimates of overall and incremental life expectancy were similar across methods. However, the bootstrapped and test-based estimates consistently produced greater estimates of uncertainty compared with the estimate without any adjustment applied. Similar results were observed when using the test-based approach on published data, showing that failing to adjust for uncertainty led to smaller confidence intervals. Both the bootstrapping and test-based approaches provide a solution to appropriately incorporate uncertainty, with the benefit that the latter can be implemented by researchers in the absence of individual patient data. Copyright © 2018 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
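
    A simplified sketch of the resampling (bootstrap) route to propagating adjustment uncertainty into a decision model; the RPSFT refit and life-expectancy computation are stubbed out behind a hypothetical function, since the paper's model details are not reproduced here.

```python
# Simplified bootstrap sketch for propagating RPSFT-adjustment uncertainty
# into a decision model. `fit_rpsft_and_life_expectancy` is a hypothetical
# stand-in for refitting the switching adjustment and decision model.
import numpy as np

rng = np.random.default_rng(42)

def bootstrap_life_expectancy(patient_data, fit_rpsft_and_life_expectancy,
                              n_boot=1000):
    """patient_data: NumPy structured array, one row per patient."""
    n = len(patient_data)
    estimates = []
    for _ in range(n_boot):
        resample = patient_data[rng.integers(0, n, size=n)]  # with replacement
        estimates.append(fit_rpsft_and_life_expectancy(resample))
    lo, hi = np.percentile(estimates, [2.5, 97.5])
    return float(np.mean(estimates)), (lo, hi)   # point estimate, 95% interval
```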

  4. Predicting future protection of respirator users: Statistical approaches and practical implications.

    PubMed

    Hu, Chengcheng; Harber, Philip; Su, Jing

    2016-01-01

    The purpose of this article is to describe a statistical approach for predicting a respirator user's fit factor in the future based upon results from initial tests. A statistical prediction model was developed based upon the joint distribution of multiple fit factor measurements over time, obtained from linear mixed effect models. The model accounts for within-subject correlation as well as short-term (within one day) and longer-term variability. As an example of applying this approach, model parameters were estimated from a research study in which volunteers were trained by three different modalities to use one of two types of respirators. They underwent two quantitative fit tests at the initial session and two on the same day approximately six months later. The fitted models demonstrated correlation and gave the estimated distribution of future fit test results conditional on past results for an individual worker. This approach can be applied to establishing a criterion value for passing an initial fit test to provide reasonable likelihood that a worker will be adequately protected in the future, and to optimizing the repeat fit factor test intervals individually for each user for cost-effective testing.
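
    The conditional-prediction idea can be illustrated with a random-intercept model on log fit factors: the predictive distribution of a future measurement, given a worker's initial tests, follows from the joint normal law. The variance components and population mean below are illustrative values, not the study's estimates.

```python
# Sketch of conditional prediction under a random-intercept mixed model
# for log10 fit factors; all numeric values are illustrative assumptions.
import numpy as np
from scipy.stats import norm

mu = 3.2            # population mean log10 fit factor (assumed)
var_subject = 0.20  # between-worker variance (assumed)
var_error = 0.10    # within-occasion residual variance (assumed)

def predict_future(past_logs):
    """Posterior mean and variance of a future log10 fit factor for a worker,
    given that worker's past measurements."""
    n = len(past_logs)
    shrink = var_subject / (var_subject + var_error / n)   # BLUP weight
    subj_mean = mu + shrink * (np.mean(past_logs) - mu)
    subj_var = var_subject * (1 - shrink) + var_error
    return subj_mean, subj_var

m, v = predict_future([3.0, 3.1])   # two initial fit tests
# probability the next measured fit factor exceeds 500 (log10 scale):
print(1 - norm.cdf(np.log10(500), loc=m, scale=np.sqrt(v)))
```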

  5. Dynamic ground-effect measurements on the F-15 STOL and Maneuver Technology Demonstrator (S/MTD) configuration

    NASA Technical Reports Server (NTRS)

    Kemmerly, Guy T.

    1990-01-01

    A moving-model ground-effect testing method was used to study the influence of rate-of-descent on the aerodynamic characteristics for the F-15 STOL and Maneuver Technology Demonstrator (S/MTD) configuration for both the approach and roll-out phases of landing. The approach phase was modeled for three rates of descent, and the results were compared to the predictions from the F-15 S/MTD simulation data base (prediction based on data obtained in a wind tunnel with zero rate of descent). This comparison showed significant differences due both to the rate of descent in the moving-model test and to the presence of the ground boundary layer in the wind tunnel test. Relative to the simulation data base predictions, the moving-model test showed substantially less lift increase in ground effect, less nose-down pitching moment, and less increase in drag. These differences became more prominent at the larger thrust vector angles. Over the small range of rates of descent tested using the moving-model technique, the effect of rate of descent on longitudinal aerodynamics was relatively constant. The results of this investigation indicate no safety-of-flight problems with the lower jets vectored up to 80 deg on approach. The results also indicate that this configuration could employ a nozzle concept using lower reverser vector angles up to 110 deg on approach if a no-flare approach procedure were adopted and if inlet reingestion does not pose a problem.

  6. Failing Tests: Commentary on "Adapting Educational Measurement to the Demands of Test-Based Accountability"

    ERIC Educational Resources Information Center

    Thissen, David

    2015-01-01

    In "Adapting Educational Measurement to the Demands of Test-Based Accountability" Koretz takes the time-honored engineering approach to educational measurement, identifying specific problems with current practice and proposing minimal modifications of the system to alleviate those problems. In response to that article, David Thissen…

  7. Knowledge-Based Runway Assignment for Arrival Aircraft in the Terminal Area

    DOT National Transportation Integrated Search

    1997-01-01

    A knowledge-based system for scheduling arrival traffic in the terminal area, referred to as the Final Approach Spacing Tool (FAST), has been implemented and operationally tested at the Dallas/Fort Worth Terminal Radar Approach Control (TRACON)...

  8. A long PCR–based approach for DNA enrichment prior to next-generation sequencing for systematic studies

    PubMed Central

    Uribe-Convers, Simon; Duke, Justin R.; Moore, Michael J.; Tank, David C.

    2014-01-01

    • Premise of the study: We present an alternative approach for molecular systematic studies that combines long PCR and next-generation sequencing. Our approach can be used to generate templates from any DNA source for next-generation sequencing. Here we test our approach by amplifying complete chloroplast genomes, and we present a set of 58 potentially universal primers for angiosperms to do so. Additionally, this approach is likely to be particularly useful for nuclear and mitochondrial regions. • Methods and Results: Chloroplast genomes of 30 species across angiosperms were amplified to test our approach. Amplification success varied depending on whether PCR conditions were optimized for a given taxon. To further test our approach, some amplicons were sequenced on an Illumina HiSeq 2000. • Conclusions: Although here we tested this approach by sequencing plastomes, long PCR amplicons could be generated using DNA from any genome, expanding the possibilities of this approach for molecular systematic studies. PMID:25202592

  9. Unscaled Bayes factors for multiple hypothesis testing in microarray experiments.

    PubMed

    Bertolino, Francesco; Cabras, Stefano; Castellanos, Maria Eugenia; Racugno, Walter

    2015-12-01

    Multiple hypothesis testing collects a series of techniques usually based on p-values as a summary of the available evidence from many statistical tests. In hypothesis testing, under a Bayesian perspective, the evidence for a specified hypothesis against an alternative, conditionally on data, is given by the Bayes factor. In this study, we approach multiple hypothesis testing based on both Bayes factors and p-values, regarding multiple hypothesis testing as a multiple model selection problem. To obtain the Bayes factors we assume default priors that are typically improper. In this case, the Bayes factor is usually undetermined due to the ratio of prior pseudo-constants. We show that ignoring prior pseudo-constants leads to unscaled Bayes factors, which do not invalidate the inferential procedure in multiple hypothesis testing, because they are used within a comparative scheme. In fact, using partial information from the p-values, we are able to approximate the sampling null distribution of the unscaled Bayes factor and use it within Efron's multiple testing procedure. The simulation study suggests that under a normal sampling model, and even with small sample sizes, our approach provides false positive and false negative proportions that are smaller than those of other common multiple hypothesis testing approaches based only on p-values. The proposed procedure is illustrated in two simulation studies, and the advantages of its use are shown in the analysis of two microarray experiments. © The Author(s) 2011.

  10. Object-based classification of earthquake damage from high-resolution optical imagery using machine learning

    NASA Astrophysics Data System (ADS)

    Bialas, James; Oommen, Thomas; Rebbapragada, Umaa; Levin, Eugene

    2016-07-01

    Object-based approaches in the segmentation and classification of remotely sensed images yield more promising results compared to pixel-based approaches. However, the development of an object-based approach presents challenges in terms of algorithm selection and parameter tuning. Subjective methods are often used, but yield less than optimal results. Objective methods are warranted, especially for rapid deployment in time-sensitive applications, such as earthquake damage assessment. Herein, we used a systematic approach in evaluating object-based image segmentation and machine learning algorithms for the classification of earthquake damage in remotely sensed imagery. We tested a variety of algorithms and parameters on post-event aerial imagery for the 2011 earthquake in Christchurch, New Zealand. Results were compared against manually selected test cases representing different classes. In doing so, we can evaluate the effectiveness of the segmentation and classification of different classes and compare different levels of multistep image segmentations. Our classifier is compared against recent pixel-based and object-based classification studies for post-event imagery of earthquake damage. Our results show an improvement against both pixel-based and object-based methods for classifying earthquake damage in high-resolution, post-event imagery.

  11. Real-Time Smart Grids Control for Preventing Cascading Failures and Blackout using Neural Networks: Experimental Approach for N-1-1 Contingency

    NASA Astrophysics Data System (ADS)

    Zarrabian, Sina; Belkacemi, Rabie; Babalola, Adeniyi A.

    2016-12-01

    In this paper, a novel intelligent control scheme based on Artificial Neural Networks (ANN) is proposed to mitigate cascading failure (CF) and prevent blackout in smart grid systems after an N-1-1 contingency condition in real-time. The fundamental contribution of this research is to deploy the machine learning concept for preventing blackout at early stages of its occurrence and to make smart grids more resilient, reliable, and robust. The proposed method provides the best action selection strategy for adaptive adjustment of generators' output power through frequency control. This method is able to relieve congestion of transmission lines and prevent consecutive transmission line outages after an N-1-1 contingency condition. The proposed ANN-based control approach is tested on an experimental 100 kW test system developed by the authors to test intelligent systems. Additionally, the proposed approach is validated on the large-scale IEEE 118-bus power system by simulation studies. Experimental results show that the ANN approach is very promising and provides accurate and robust control by preventing blackout. The technique is compared to a heuristic multi-agent system (MAS) approach based on communication interchanges. The ANN approach showed a more accurate and robust response than the MAS algorithm.

  12. A Comparison of the Approaches of Generalizability Theory and Item Response Theory in Estimating the Reliability of Test Scores for Testlet-Composed Tests

    ERIC Educational Resources Information Center

    Lee, Guemin; Park, In-Yong

    2012-01-01

    Previous assessments of the reliability of test scores for testlet-composed tests have indicated that item-based estimation methods overestimate reliability. This study was designed to address issues related to the extent to which item-based estimation methods overestimate the reliability of test scores composed of testlets and to compare several…

  13. Instantaneous and controllable integer ambiguity resolution: review and an alternative approach

    NASA Astrophysics Data System (ADS)

    Zhang, Jingyu; Wu, Meiping; Li, Tao; Zhang, Kaidong

    2015-11-01

    In the high-precision application of Global Navigation Satellite Systems (GNSS), integer ambiguity resolution is the key step in realizing precise positioning and attitude determination. As a necessary part of quality control, integer aperture (IA) ambiguity resolution provides the theoretical and practical foundation for ambiguity validation. It is mainly realized by acceptance testing. Due to the correlation between ambiguities, it is impossible to control the failure rate according to an analytical formula. Hence, the fixed failure rate approach is implemented by Monte Carlo sampling. However, due to the characteristics of Monte Carlo sampling and look-up tables, a large amount of computation time is required if sufficient GNSS scenarios are included in the creation of the look-up table. This restricts the fixed failure rate approach to being a post-processing approach if a look-up table is not available. Furthermore, if not enough GNSS scenarios are considered, the table may only be valid for a specific scenario or application. Besides this, the method of creating the look-up table or look-up function still needs to be designed for each specific acceptance test. To overcome these problems in the determination of critical values, this contribution proposes an instantaneous and CONtrollable (iCON) IA ambiguity resolution approach for the first time. The iCON approach has the following advantages: (a) the critical value of the acceptance test is determined independently, based on the required failure rate and GNSS model, without resorting to external information such as a look-up table; (b) it can be realized instantaneously for most IA estimators that have analytical probability formulas, and the stronger the GNSS model, the less time is consumed; (c) it provides a new viewpoint for improving research on IA estimation. To verify these conclusions, multi-frequency and multi-GNSS simulation experiments were implemented. The results show that IA estimators based on the iCON approach can realize controllable ambiguity resolution. Moreover, compared with ratio test IA based on a look-up table, difference test IA and IA least squares based on the iCON approach usually have higher success rates and better control of failure rates.

  14. Pathology consultation on urine compliance testing and drug abuse screening.

    PubMed

    Ward, Michael B; Hackenmueller, Sarah A; Strathmann, Frederick G

    2014-11-01

    Compliance testing in pain management requires a distinct approach compared with classic clinical toxicology testing. Differences in the patient populations and clinical expectations require modifications to established reporting cutoffs, assay performance expectations, and critical review of how best to apply the available testing methods. Although other approaches to testing are emerging, immunoassay screening followed by mass spectrometry confirmation remains the most common testing workflow for pain management compliance and drug abuse testing. A case-based approach was used to illustrate the complexities inherent to and uniqueness of pain management compliance testing for both clinicians and laboratories. A basic understanding of the inherent strengths and weaknesses of immunoassays and mass spectrometry provides the clinician a better understanding of how best to approach pain management compliance testing. Pain management compliance testing is a textbook example of an emerging field requiring open communication between physician and performing laboratory to fully optimize patient care. Copyright© by the American Society for Clinical Pathology.

  15. Empirical likelihood-based tests for stochastic ordering

    PubMed Central

    BARMI, HAMMOU EL; MCKEAGUE, IAN W.

    2013-01-01

    This paper develops an empirical likelihood approach to testing for the presence of stochastic ordering among univariate distributions based on independent random samples from each distribution. The proposed test statistic is formed by integrating a localized empirical likelihood statistic with respect to the empirical distribution of the pooled sample. The asymptotic null distribution of this test statistic is found to have a simple distribution-free representation in terms of standard Brownian bridge processes. The approach is used to compare the lengths of rule of Roman Emperors over various historical periods, including the “decline and fall” phase of the empire. In a simulation study, the power of the proposed test is found to improve substantially upon that of a competing test due to El Barmi and Mukerjee. PMID:23874142

  16. On the Relationship Between Classical Test Theory and Item Response Theory: From One to the Other and Back.

    PubMed

    Raykov, Tenko; Marcoulides, George A

    2016-04-01

    The frequently neglected and often misunderstood relationship between classical test theory and item response theory is discussed for the unidimensional case with binary measures and no guessing. It is pointed out that popular item response models can be directly obtained from classical test theory-based models by accounting for the discrete nature of the observed items. Two distinct observational equivalence approaches are outlined that render the item response models from corresponding classical test theory-based models, and can each be used to obtain the former from the latter models. Similarly, classical test theory models can be furnished using the reverse application of either of those approaches from corresponding item response models.
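
    One standard route from a CTT-style linear latent model to a normal-ogive item response model (in the spirit of the observational-equivalence arguments the authors describe, though their exact derivation may differ) is to dichotomize a continuous latent response at an item threshold:

```latex
X_i^{*} = \lambda_i \theta + \varepsilon_i,
  \qquad \varepsilon_i \sim N\!\left(0,\; 1-\lambda_i^{2}\right),
  \qquad X_i = 1 \iff X_i^{*} \ge \tau_i
\\[4pt]
P(X_i = 1 \mid \theta)
  = \Phi\!\left(\frac{\lambda_i \theta - \tau_i}{\sqrt{1-\lambda_i^{2}}}\right)
  = \Phi\bigl(a_i(\theta - b_i)\bigr),
  \qquad a_i = \frac{\lambda_i}{\sqrt{1-\lambda_i^{2}}},
  \qquad b_i = \frac{\tau_i}{\lambda_i}.
```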

  17. Minimal intervention dentistry for early childhood caries and child dental anxiety: a randomized controlled trial.

    PubMed

    Arrow, P; Klobas, E

    2017-06-01

    To compare changes in child dental anxiety after treatment for early childhood caries (ECC) using two treatment approaches. Children with ECC were randomized to test (atraumatic restorative treatment (ART)-based approach) or control (standard care approach) groups. Children aged 3 years or older completed a dental anxiety scale at baseline and follow-up. Changes in child dental anxiety from baseline to follow-up were tested using the chi-squared statistic, Wilcoxon rank sum test, McNemar's test and multinomial logistic regression. Two hundred and fifty-four children were randomized (N = 127 test, N = 127 control). At baseline, 193 children completed the dental anxiety scale, 211 at follow-up, and 170 completed the scale on both occasions. Children who were anxious at baseline (11%) were no longer anxious at follow-up, and 11% of non-anxious children became anxious. Multinomial logistic regression found each increment in the number of visits increased the odds of worsening dental anxiety (odds ratio (OR), 2.2; P < 0.05), whereas each increment in the number of treatments lowered the odds of worsening anxiety (OR, 0.50; P = 0.05). The ART-based approach to managing ECC resulted in similar levels of dental anxiety to the standard treatment approach and provides a valuable alternative approach to the management of ECC in a primary dental care setting. © 2016 Australian Dental Association.

  18. An Innovative Method of Teaching-Learning Strategy to Enhance the Learner's Educational Process: Paradigm Shift from Conventional Approach to Modern Approach by Neurocognitive Based Concept Mapping

    ERIC Educational Resources Information Center

    Ramachandran, Sridhar; Pandia Vadivu, P.

    2014-01-01

    This study examines the effectiveness of Neurocognitive Based Concept Mapping (NBCM) on students' learning in a science course. A total of 32 grade IX of high school Central Board of Secondary Education (CBSE) students were involved in this study by pre-test and post-test measurements. They were divided into two groups: NBCM group as an…

  19. A powerful and efficient set test for genetic markers that handles confounders

    PubMed Central

    Listgarten, Jennifer; Lippert, Christoph; Kang, Eun Yong; Xiang, Jing; Kadie, Carl M.; Heckerman, David

    2013-01-01

    Motivation: Approaches for testing sets of variants, such as a set of rare or common variants within a gene or pathway, for association with complex traits are important. In particular, set tests allow for aggregation of weak signal within a set, can capture interplay among variants and reduce the burden of multiple hypothesis testing. Until now, these approaches did not address confounding by family relatedness and population structure, a problem that is becoming more important as larger datasets are used to increase power. Results: We introduce a new approach for set tests that handles confounders. Our model is based on the linear mixed model and uses two random effects—one to capture the set association signal and one to capture confounders. We also introduce a computational speedup for two random-effects models that makes this approach feasible even for extremely large cohorts. Using this model with both the likelihood ratio test and score test, we find that the former yields more power while controlling type I error. Application of our approach to richly structured Genetic Analysis Workshop 14 data demonstrates that our method successfully corrects for population structure and family relatedness, whereas application of our method to a 15 000 individual Crohn’s disease case–control cohort demonstrates that it additionally recovers genes not recoverable by univariate analysis. Availability: A Python-based library implementing our approach is available at http://mscompbio.codeplex.com. Contact: jennl@microsoft.com or lippert@microsoft.com or heckerma@microsoft.com Supplementary information: Supplementary data are available at Bioinformatics online. PMID:23599503

  20. An effective self-assessment based on concept map extraction from test-sheet for personalized learning

    NASA Astrophysics Data System (ADS)

    Liew, Keng-Hou; Lin, Yu-Shih; Chang, Yi-Chun; Chu, Chih-Ping

    2013-12-01

    Examination is a traditional way to assess learners' learning status, progress and performance after a learning activity. Beyond the test grade, a test sheet carries implicit information such as test concepts, their relationships, importance, and prerequisites. This implicit information can be extracted to construct a concept map, on the basis that (1) test concepts covered by the same question are strongly related, and (2) test concepts appearing on the same test sheet are related. Concept maps have been successfully employed in many studies to help instructors and learners organize relationships among concepts. However, concept map construction depends on experts, who need to invest effort and time in organizing the domain knowledge. In addition, previous research on automatic concept map construction is limited to considering all learners of a class, which does not support personalized learning. To cope with this problem, this paper proposes a new approach to automatically extract and construct a concept map based on the implicit information in a test sheet, as sketched below. Furthermore, the proposed approach can also help learners with self-assessment and self-diagnosis. Finally, an example is given to depict the effectiveness of the proposed approach.
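
    A minimal sketch of the extraction rule, with hypothetical questions and concepts and assumed weights (strong within-question links, weak within-sheet links); the paper's actual weighting scheme is not reproduced here.

```python
# Build a weighted concept graph from one test sheet: concepts tested by
# the same question get strong links, concepts sharing the sheet get weak
# links. Questions, concepts, and weights are hypothetical.
from itertools import combinations
from collections import defaultdict

sheet = {
    "Q1": {"fractions", "ratios"},
    "Q2": {"ratios", "percentages"},
    "Q3": {"percentages"},
}

edges = defaultdict(float)
concepts = set().union(*sheet.values())

# strong relationship: co-occurrence inside one question
for covered in sheet.values():
    for a, b in combinations(sorted(covered), 2):
        edges[(a, b)] += 1.0

# weak relationship: co-presence on the same test sheet
for a, b in combinations(sorted(concepts), 2):
    edges[(a, b)] += 0.1

for (a, b), w in sorted(edges.items(), key=lambda kv: -kv[1]):
    print(f"{a} -- {b}: weight {w:.1f}")
```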

  1. A review of consensus test methods for established medical imaging modalities and their implications for optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Pfefer, Joshua; Agrawal, Anant

    2012-03-01

    In recent years there has been increasing interest in development of consensus, tissue-phantom-based approaches for assessment of biophotonic imaging systems, with the primary goal of facilitating clinical translation of novel optical technologies. Well-characterized test methods based on tissue phantoms can provide useful tools for performance assessment, thus enabling standardization and device inter-comparison during preclinical development as well as quality assurance and re-calibration in the clinical setting. In this review, we study the role of phantom-based test methods as described in consensus documents such as international standards for established imaging modalities including X-ray CT, MRI and ultrasound. Specifically, we focus on three image quality characteristics - spatial resolution, spatial measurement accuracy and image uniformity - and summarize the terminology, metrics, phantom design/construction approaches and measurement/analysis procedures used to assess these characteristics. Phantom approaches described are those in routine clinical use and tend to have simplified morphology and biologically-relevant physical parameters. Finally, we discuss the potential for applying knowledge gained from existing consensus documents in the development of standardized, phantom-based test methods for optical coherence tomography.

  2. Case-based debates: an innovative teaching tool in nephrology education.

    PubMed

    Jhaveri, Kenar D; Chawla, Arun; Shah, Hitesh H

    2012-01-01

    Medical educators have called for new teaching methods and materials that supplement the traditional lecture format, and education in a range of health professions, including medicine, nursing, and pharmacy, is using a game-based approach to teach learners. Here, we describe a novel teaching tool: a case-based debate using the game format. Two teams of first- and second-year nephrology fellows participated in a PowerPoint game-based debate about which tests to order to diagnose a transplant-related case. Our pilot study assessed participant acceptance of the case-based debate sessions and the rewards system, and participant perceptions of using this approach to teach fellows and residents the importance of each test ordered and its cost-effectiveness in medicine. Each test ordered requires an explanation and has a point value attached to it (based on relevance and cost of positive and negative test results). The team that arrives at the diagnosis with the most points wins the game. A faculty member leads a short concluding discussion. Subjective evaluations found these case-based debates to be highly entertaining and thought-provoking and to enhance self-directed learning.

  3. An Improved Approach for Analyzing the Oxygen Compatibility of Solvents and other Oxygen-Flammable Materials for Use in Oxygen Systems

    NASA Technical Reports Server (NTRS)

    Harper, Susan A.; Juarez, Alfredo; Peralta, Stephen F.; Stoltzfus, Joel; Arpin, Christina Pina; Beeson, Harold D.

    2016-01-01

    Solvents used to clean oxygen system components must be assessed for oxygen compatibility, as incompatible residue or fluid inadvertently left behind within an oxygen system can pose a flammability risk. The most recent approach focused on solvent ignition susceptibility to assess the flammability risk associated with these materials. Previous evaluations included Ambient Pressure Liquid Oxygen (LOX) Mechanical Impact Testing (ASTM G86) and Autogenous Ignition Temperature (AIT) Testing (ASTM G72). The goal in this approach was to identify a solvent material that was not flammable in oxygen. As environmental policies restrict the available options of acceptable solvents, it has proven difficult to identify one that is not flammable in oxygen. A more rigorous oxygen compatibility approach is needed in an effort to select a new solvent for NASA applications. NASA White Sands Test Facility proposed an approach that acknowledges oxygen flammability, yet selects solvent materials based on their relative oxygen compatibility ranking, similar to that described in ASTM G63-99. Solvents are selected based on their ranking with respect to minimal ignition susceptibility, damage and propagation potential, as well as their relative ranking when compared with other solvent materials that are successfully used in oxygen systems. Test methods used in this approach included ASTM G86 (Ambient Pressure LOX Mechanical Impact Testing and Pressurized Gaseous Oxygen (GOX) Mechanical Impact Testing), ASTM G72 (AIT Testing), and ASTM D240 (Heat of Combustion (HOC) Testing). Only four solvents were tested through the full battery of tests for evaluation of oxygen compatibility: AK-225G as a baseline comparison, Solstice PF, L-14780, and Vertrel MCA. Baseline solvent AK-225G exhibited the lowest HOC and highest AIT of solvents tested. Nonetheless, Solstice PF, L-14780, and Vertrel MCA HOCs all fell well within the range of properties that are associated with proven oxygen system materials. Tested AITs for these solvents fell only slightly lower than the AIT for the proven AK-225G solvent. Based on these comparisons in which solvents exhibited properties within those ranges seen with proven oxygen system materials, it is believed that Solstice PF, L-14780, and Vertrel MCA would perform well with respect to oxygen compatibility.

  4. Improving the Crossing-SIBTEST Statistic for Detecting Non-uniform DIF.

    PubMed

    Chalmers, R Philip

    2018-06-01

    This paper demonstrates that, after applying a simple modification to Li and Stout's (Psychometrika 61(4):647-677, 1996) CSIBTEST statistic, an improved variant of the statistic could be realized. It is shown that this modified version of CSIBTEST has a more direct association with the SIBTEST statistic presented by Shealy and Stout (Psychometrika 58(2):159-194, 1993). In particular, the asymptotic sampling distributions and general interpretation of the effect size estimates are the same for SIBTEST and the new CSIBTEST. Given the more natural connection to SIBTEST, it is shown that Li and Stout's hypothesis testing approach is insufficient for CSIBTEST; thus, an improved hypothesis testing procedure is required. Based on the presented arguments, a new chi-squared-based hypothesis testing approach is proposed for the modified CSIBTEST statistic. Positive results from a modest Monte Carlo simulation study strongly suggest the original CSIBTEST procedure and randomization hypothesis testing approach should be replaced by the modified statistic and hypothesis testing method.

  5. A Method to Examine Content Domain Structures

    ERIC Educational Resources Information Center

    D'Agostino, Jerome; Karpinski, Aryn; Welsh, Megan

    2011-01-01

    After a test is developed, most content validation analyses shift from ascertaining domain definition to studying domain representation and relevance because the domain is assumed to be set once a test exists. We present an approach that allows for the examination of alternative domain structures based on extant test items. In our example based on…

  6. The dynamical systems approach to numerical integration

    NASA Astrophysics Data System (ADS)

    Wisdom, Jack

    2018-03-01

    The dynamical systems approach to numerical integration is reviewed and extended. The new method is compared to some alternative methods based on the Lie series approach. The test problem is the motion of the outer planets. The algorithms developed using the dynamical systems approach perform well.

  7. A New Method for a Virtue-Based Responsible Conduct of Research Curriculum: Pilot Test Results.

    PubMed

    Berling, Eric; McLeskey, Chet; O'Rourke, Michael; Pennock, Robert T

    2018-02-03

    Drawing on Pennock's theory of scientific virtues, we are developing an alternative curriculum for training scientists in the responsible conduct of research (RCR) that emphasizes internal values rather than externally imposed rules. This approach focuses on the virtuous characteristics of scientists that lead to responsible and exemplary behavior. We have been pilot-testing one element of such a virtue-based approach to RCR training by conducting dialogue sessions, modeled upon the approach developed by Toolbox Dialogue Initiative, that focus on a specific virtue, e.g., curiosity and objectivity. During these structured discussions, small groups of scientists explore the roles they think the focus virtue plays and should play in the practice of science. Preliminary results have shown that participants strongly prefer this virtue-based model over traditional methods of RCR training. While we cannot yet definitively say that participation in these RCR sessions contributes to responsible conduct, these pilot results are encouraging and warrant continued development of this virtue-based approach to RCR training.

  8. Orion Flight Test Architecture Benefits of MBSE Approach

    NASA Technical Reports Server (NTRS)

    Reed, Don; Simpson, Kim

    2012-01-01

    Exploration Flight Test 1 (EFT-1) is the unmanned first orbital flight test of the Multi-Purpose Crew Vehicle (MPCV). The mission's purpose is to test Orion's ascent, on-orbit and entry capabilities, monitor critical activities, and provide ground control in support of contingency scenarios. This requires development of a large-scale end-to-end information system network architecture. To effectively communicate the scope of the end-to-end system, a model-based systems engineering approach was chosen.

  9. Equivalence Testing of Complex Particle Size Distribution Profiles Based on Earth Mover's Distance.

    PubMed

    Hu, Meng; Jiang, Xiaohui; Absar, Mohammad; Choi, Stephanie; Kozak, Darby; Shen, Meiyu; Weng, Yu-Ting; Zhao, Liang; Lionberger, Robert

    2018-04-12

    Particle size distribution (PSD) is an important property of particulates in drug products. In the evaluation of generic drug products formulated as suspensions, emulsions, and liposomes, PSD comparisons between a test product and the branded product can provide useful information regarding in vitro and in vivo performance. Historically, the FDA has recommended the population bioequivalence (PBE) statistical approach to compare the PSD descriptors D50 and SPAN from test and reference products to support product equivalence. In this study, the earth mover's distance (EMD) is proposed as a new metric for comparing PSDs, particularly when the PSD profile exhibits a complex distribution (e.g., multiple peaks) that is not accurately described by the D50 and SPAN descriptors. EMD is a statistical metric that measures the discrepancy (distance) between size distribution profiles without a prior assumption of the distribution. PBE is then adopted to perform a statistical test to establish equivalence based on the calculated EMD distances. Simulations show that the proposed EMD-based approach is effective in comparing test and reference profiles for equivalence testing and is superior to commonly used distance measures, e.g., Euclidean and Kolmogorov-Smirnov distances. The proposed approach was demonstrated by evaluating the equivalence of cyclosporine ophthalmic emulsion PSDs that were manufactured under different conditions. Our results show that the proposed approach can effectively pass an equivalent product (e.g., reference product against itself) and reject an inequivalent product (e.g., reference product against negative control), thus suggesting its usefulness in supporting bioequivalence determination of a test product against the reference product when both possess multimodal PSDs.
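
    For one-dimensional profiles, the earth mover's distance is the first Wasserstein distance, which SciPy computes directly. The sketch below shows the distance step only; the PBE acceptance test on the resulting distances is omitted, and the bin centers and weights are illustrative.

```python
# EMD between two PSD profiles: for 1-D distributions the earth mover's
# distance is the first Wasserstein distance.
import numpy as np
from scipy.stats import wasserstein_distance

bins = np.array([50, 100, 200, 400, 800], dtype=float)   # size bin centers (nm)
ref  = np.array([0.05, 0.30, 0.40, 0.20, 0.05])          # reference PSD
test = np.array([0.10, 0.35, 0.35, 0.15, 0.05])          # test PSD

d = wasserstein_distance(bins, bins, u_weights=test, v_weights=ref)
print(f"EMD(test, reference) = {d:.1f} nm")
# Distances computed across lots would feed the PBE test for equivalence.
```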

  10. Monitoring scale scores over time via quality control charts, model-based approaches, and time series techniques.

    PubMed

    Lee, Yi-Hsuan; von Davier, Alina A

    2013-07-01

    Maintaining a stable score scale over time is critical for all standardized educational assessments. Traditional quality control tools and approaches for assessing scale drift either require special equating designs, or may be too time-consuming to be considered on a regular basis with an operational test that has a short time window between an administration and its score reporting. Thus, the traditional methods are not sufficient to catch unusual testing outcomes in a timely manner. This paper presents a new approach for score monitoring and assessment of scale drift. It involves quality control charts, model-based approaches, and time series techniques to accommodate the following needs of monitoring scale scores: continuous monitoring, adjustment of customary variations, identification of abrupt shifts, and assessment of autocorrelation. Performance of the methodologies is evaluated using manipulated data based on real responses from 71 administrations of a large-scale high-stakes language assessment.
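
    As one concrete instance of the quality control tools mentioned above, the sketch below runs a two-sided tabular CUSUM on administration-level mean scale scores; the target, sigma, reference value k, and decision limit h are illustrative choices, not the paper's settings.

```python
# Two-sided tabular CUSUM on standardized administration means; flags the
# administrations at which an abrupt scale-score shift signals.
import numpy as np

def cusum_flags(means, target, sigma, k=0.5, h=4.0):
    """Returns indices of administrations that signal a shift."""
    z = (np.asarray(means, dtype=float) - target) / sigma
    hi = lo = 0.0
    flags = []
    for i, zi in enumerate(z):
        hi = max(0.0, hi + zi - k)
        lo = min(0.0, lo + zi + k)
        if hi > h or lo < -h:
            flags.append(i)
            hi = lo = 0.0        # restart monitoring after a signal
    return flags

# hypothetical mean scale scores for 7 administrations; shift after the 4th
means = [500.1, 499.8, 500.3, 500.0, 503.5, 503.9, 504.2]
print(cusum_flags(means, target=500.0, sigma=0.5))   # -> [4, 5, 6]
```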

  11. Initial Usability and Feasibility Evaluation of a Personal Health Record-Based Self-Management System for Older Adults.

    PubMed

    Sheehan, Barbara; Lucero, Robert J

    2015-01-01

    Electronic personal health record-based (ePHR-based) self-management systems can improve patient engagement and have an impact on health outcomes. In order to realize the benefits of these systems, there is a need to develop and evaluate health information technology from the same theoretical underpinnings. Using an innovative usability approach based in human-centered distributed information design (HCDID), we tested an ePHR-based falls-prevention self-management system, Self-Assessment via a Personal Health Record (SAPHeR), designed using HCDID principles, in a laboratory. We later evaluated SAPHeR's use by community-dwelling older adults at home. The innovative approach used in this study supported the analysis of four components: tasks, users, representations, and functions. Tasks were easily learned, and features such as text-associated images facilitated task completion. Task performance times were slow; however, user satisfaction was high. Nearly seven out of every ten features desired by design participants were evaluated in our usability testing of the SAPHeR system. The in vivo evaluation suggests that older adults could improve their confidence in performing indoor and outdoor activities after using the SAPHeR system. We have applied an innovative consumer-usability evaluation. Our approach addresses the limitations of other usability testing methods that do not utilize consistent, theoretically based methods for designing and testing technology. We have successfully demonstrated the utility of testing consumer technology use across multiple components (i.e., task, user, representational, functional) to evaluate the usefulness, usability, and satisfaction of an ePHR-based self-management system.

  12. A Comparison of Trajectory Optimization Methods for the Impulsive Minimum Fuel Rendezvous Problem

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P.; Mailhe, Laurie M.; Guzman, Jose J.

    2003-01-01

    In this paper we present a comparison of trajectory optimization approaches for the minimum fuel rendezvous problem. Both indirect and direct methods are compared for a variety of test cases. The indirect approach is based on primer vector theory. The direct approaches are implemented numerically and include Sequential Quadratic Programming (SQP), quasi-Newton, and Nelder-Mead simplex. Several cost function parameterizations are considered for the direct approach. We choose one direct approach that appears to be the most flexible. Both the direct and indirect methods are applied to a variety of test cases chosen to demonstrate the performance of each method in different flight regimes. The first test case is a simple circular-to-circular coplanar rendezvous. The second test case is an elliptic-to-elliptic line of apsides rotation. The final test case is an orbit phasing maneuver sequence in a highly elliptic orbit. For each test case we present a comparison of the performance of all methods considered in this paper.

  13. Automation Hooks Architecture Trade Study for Flexible Test Orchestration

    NASA Technical Reports Server (NTRS)

    Lansdowne, Chatwin A.; Maclean, John R.; Graffagnino, Frank J.; McCartney, Patrick A.

    2010-01-01

    We describe the conclusions of a technology and communities survey supported by concurrent and follow-on proof-of-concept prototyping to evaluate feasibility of defining a durable, versatile, reliable, visible software interface to support strategic modularization of test software development. The objective is that test sets and support software with diverse origins, ages, and abilities can be reliably integrated into test configurations that assemble and tear down and reassemble with scalable complexity in order to conduct both parametric tests and monitored trial runs. The resulting approach is based on integration of three recognized technologies that are currently gaining acceptance within the test industry and when combined provide a simple, open and scalable test orchestration architecture that addresses the objectives of the Automation Hooks task. The technologies are automated discovery using multicast DNS Zero Configuration Networking (zeroconf), commanding and data retrieval using resource-oriented RESTful web services, and XML data transfer formats based on Automatic Test Markup Language (ATML). This open-source standards-based approach provides direct integration with existing commercial off-the-shelf (COTS) analysis software tools.

  14. Development of Advanced Thermal and Environmental Barrier Coatings Using a High-Heat-Flux Testing Approach

    NASA Technical Reports Server (NTRS)

    Zhu, Dongming; Miller, Robert A.

    2003-01-01

    The development of low conductivity, robust thermal and environmental barrier coatings requires advanced testing techniques that can accurately and effectively evaluate coating thermal conductivity and cyclic resistance at very high surface temperatures (up to 1700 C) under large thermal gradients. In this study, a laser high-heat-flux test approach is established for evaluating advanced low conductivity, high temperature capability thermal and environmental barrier coatings under the NASA Ultra Efficient Engine Technology (UEET) program. The test approach emphasizes the real-time monitoring and assessment of the coating thermal conductivity, which initially rises under the steady-state high temperature thermal gradient test due to coating sintering, and later drops under the cyclic thermal gradient test due to coating cracking/delamination. The coating system is then evaluated based on damage accumulation and failure after the combined steady-state and cyclic thermal gradient tests. The lattice and radiation thermal conductivity of advanced ceramic coatings can also be evaluated using laser heat-flux techniques. The external radiation resistance of the coating is assessed based on the measured specimen temperature response under a laser-heated intense radiation-flux source. The coating internal radiation contribution is investigated based on the measured apparent coating conductivity increases with the coating surface test temperature under large thermal gradient test conditions. Since an increased radiation contribution is observed at these very high surface test temperatures, by varying the laser heat-flux and coating average test temperature, the complex relation between the lattice and radiation conductivity as a function of surface and interface test temperature may be derived.
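
    For reference, the steady-state conductivity estimate implied by such a heat-flux test follows Fourier's law, k = qL/dT, through the coating thickness; a minimal sketch with illustrative values, not measurements from the study.

      # Minimal sketch: apparent thermal conductivity from a steady-state
      # heat-flux test via Fourier's law, k = q * L / dT.
      q  = 1.0e6     # laser heat flux through the coating, W/m^2
      L  = 2.0e-4    # coating thickness, m (200 micrometers)
      dT = 150.0     # surface-to-interface temperature drop, K

      k = q * L / dT
      print(f"apparent thermal conductivity: {k:.2f} W/m-K")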

  15. Applying innovative approach “Nature of Science (NoS) within inquiry” for developing scientific literacy in the student worksheet

    NASA Astrophysics Data System (ADS)

    Widowati, A.; Anjarsari, P.; Zuhdan, K. P.; Dita, A.

    2018-03-01

    The challenges of the 21st century require innovative solutions. Education must be able to build an understanding of science that leads to the formation of scientifically literate learners. This research was conducted to produce a prototype student worksheet based on the Nature of Science (NoS) within inquiry approach and to determine the product's effectiveness for developing scientific literacy. The study followed a research and development design based on the Four-D and Borg & Gall models. There were four main phases (define, design, develop, disseminate) and additional phases (preliminary field testing, main product revision, main field testing, and operational product revision). Research subjects were students of a junior high school in Yogyakarta. The instruments used included a product validation questionnaire and a scientific literacy test. Validation data were analyzed descriptively; test results were analyzed with an N-gain score. The results showed that the worksheet applying the NoS within inquiry-based learning approach was judged appropriate, rated excellent by experts and teachers, and that students' scientific literacy improved, with a high-category N-gain score of 0.71.
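
    The N-gain reported above is conventionally computed as Hake's normalized gain, g = (post - pre)/(max - pre); a minimal sketch with illustrative scores chosen to reproduce a value near 0.71.

      # Minimal sketch: Hake's normalized gain from pre/post test scores.
      def n_gain(pre, post, max_score=100.0):
          """Normalized gain: (post - pre) / (max - pre)."""
          return (post - pre) / (max_score - pre)

      print(round(n_gain(pre=40.0, post=82.6), 2))   # 0.71, high category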

  16. Estimating Missing Unit Process Data in Life Cycle Assessment Using a Similarity-Based Approach.

    PubMed

    Hou, Ping; Cai, Jiarui; Qu, Shen; Xu, Ming

    2018-05-01

    In life cycle assessment (LCA), collecting unit process data from empirical sources (e.g., meter readings, operation logs/journals) is often costly and time-consuming. We propose a new computational approach that estimates missing unit process data relying solely on limited known data, using a similarity-based link prediction method. The intuition is that similar processes in a unit process network tend to have similar material/energy inputs and waste/emission outputs. We use the ecoinvent 3.1 unit process data sets to test our method in four steps: (1) dividing the data sets into a training set and a test set; (2) randomly removing a certain number of data points in the test set and marking them as missing; (3) using similarity-weighted means of various numbers of the most similar processes in the training set to estimate the missing data in the test set; and (4) comparing estimated data with the original values to determine the performance of the estimation. The results show that missing data can be accurately estimated when less than 5% of the data are missing in one process. The estimation performance decreases as the percentage of missing data increases. This study provides a new approach to compiling unit process data and demonstrates the promising potential of computational approaches for LCA data compilation.
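
    A minimal sketch of step (3) above: estimating a missing flow as the similarity-weighted mean of the k most similar training processes, with cosine similarity computed over the known flows. The tiny matrix is illustrative, not ecoinvent data.

      # Minimal sketch: similarity-weighted imputation of a missing
      # unit-process entry from the k most similar training processes.
      import numpy as np

      train = np.array([[1.0, 0.2, 3.0],      # rows: processes
                        [0.9, 0.3, 2.8],      # cols: material/energy flows
                        [5.0, 4.0, 0.1]])
      test = np.array([1.1, 0.25, np.nan])    # third flow is missing

      known = ~np.isnan(test)
      def cos(a, b):
          return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

      sims = np.array([cos(row[known], test[known]) for row in train])
      k = 2
      top = np.argsort(sims)[-k:]             # k most similar processes
      est = sims[top] @ train[top, 2] / sims[top].sum()
      print(f"estimated missing flow: {est:.3f}")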

  17. Examining Differential Item Functioning: IRT-Based Detection in the Framework of Confirmatory Factor Analysis

    ERIC Educational Resources Information Center

    Dimitrov, Dimiter M.

    2017-01-01

    This article offers an approach to examining differential item functioning (DIF) under its item response theory (IRT) treatment in the framework of confirmatory factor analysis (CFA). The approach is based on integrating IRT- and CFA-based testing of DIF and using bias-corrected bootstrap confidence intervals, with syntax code in Mplus.

  18. Student Perceptions of a Form-Based Approach to Reflective Journaling

    ERIC Educational Resources Information Center

    Mabrouk, Patricia Ann

    2015-01-01

    The author describes the principal findings of a survey study looking at student perceptions of a new form-based approach to reflective journaling. A form-based journal assignment was developed for use in introductory lecture courses and tested over a two-year period in an Honors General Chemistry course for engineers with a total of 157…

  19. A Computer-Based Approach for Deriving and Measuring Individual and Team Knowledge Structure from Essay Questions

    ERIC Educational Resources Information Center

    Clariana, Roy B.; Wallace, Patricia

    2007-01-01

    This proof-of-concept investigation describes a computer-based approach for deriving the knowledge structure of individuals and of groups from their written essays, and considers the convergent criterion-related validity of the computer-based scores relative to human rater essay scores and multiple-choice test scores. After completing a…

  20. Growth hormone in sports: detecting the doped or duped.

    PubMed

    Ho, Ken K Y; Nelson, Anne E

    2011-01-01

    Doping with growth hormone (GH) is banned; however, there is anecdotal evidence that it is widely abused. GH is reportedly often used in combination with anabolic steroids at high doses for several months. Development of a robust test for detecting GH has been challenging since recombinant human 22-kDa GH used in doping is indistinguishable analytically from endogenous GH and there are wide physiological fluctuations in circulating GH concentrations. One approach to GH testing is based on measurement of different circulating GH isoforms using immunoassays that differentiate between 22-kDa and other GH isoforms. Administration of 22-kDa GH results in a change in its abundance relative to other endogenous pituitary GH isoforms. The differential isoform method is, however, limited by its short time window of detection. A second approach that extends the time window of detection is based on detection of increased levels of circulating GH-responsive proteins, such as the insulin-like growth factor (IGF) axis and collagen peptides. As age and gender are the major determinants of variability for IGF-I and the collagen markers, a test based on these markers must take these factors into account. Extensive data now validate the GH-responsive marker approach, and implementation is largely dependent on establishing an assured supply of standardized assays. Robust tests are available to detect GH and enforce the ban on its abuse in sports. Novel approaches that include gene expression and proteomic profiling must continue to be pursued to expand the repertoire of testing approaches available and to maintain deterrence of GH doping.

  1. Linking ToxCast Signatures with Functional Consequences: Proof-of-Concept Study using Known Inhibitors of Vascular Development

    EPA Science Inventory

    The USEPA’s ToxCast program is developing a novel approach to chemical toxicity testing using high-throughput screening (HTS) assays to rapidly test thousands of chemicals against hundreds of in vitro molecular targets. This approach is based on the premise that in vitro HTS bioa...

  2. Divide and Conquer-Based 1D CNN Human Activity Recognition Using Test Data Sharpening †

    PubMed Central

    Yoon, Sang Min

    2018-01-01

    Human Activity Recognition (HAR) aims to identify the actions performed by humans using signals collected from various sensors embedded in mobile devices. In recent years, deep learning techniques have further improved HAR performance on several benchmark datasets. In this paper, we propose a one-dimensional Convolutional Neural Network (1D CNN) for HAR that employs divide-and-conquer-based classifier learning coupled with test data sharpening. Our approach leverages two-stage learning of multiple 1D CNN models; we first build a binary classifier for recognizing abstract activities, and then build two multi-class 1D CNN models for recognizing individual activities. We then introduce test data sharpening during the prediction phase to further improve activity recognition accuracy. While numerous studies have explored the benefits of activity signal denoising for HAR, few have examined the effect of test data sharpening. We evaluate the effectiveness of our approach on two popular HAR benchmark datasets, and show that our approach outperforms both the two-stage 1D CNN-only method and other state-of-the-art approaches. PMID:29614767

  3. Divide and Conquer-Based 1D CNN Human Activity Recognition Using Test Data Sharpening.

    PubMed

    Cho, Heeryon; Yoon, Sang Min

    2018-04-01

    Human Activity Recognition (HAR) aims to identify the actions performed by humans using signals collected from various sensors embedded in mobile devices. In recent years, deep learning techniques have further improved HAR performance on several benchmark datasets. In this paper, we propose a one-dimensional Convolutional Neural Network (1D CNN) for HAR that employs divide-and-conquer-based classifier learning coupled with test data sharpening. Our approach leverages two-stage learning of multiple 1D CNN models; we first build a binary classifier for recognizing abstract activities, and then build two multi-class 1D CNN models for recognizing individual activities. We then introduce test data sharpening during the prediction phase to further improve activity recognition accuracy. While numerous studies have explored the benefits of activity signal denoising for HAR, few have examined the effect of test data sharpening. We evaluate the effectiveness of our approach on two popular HAR benchmark datasets, and show that our approach outperforms both the two-stage 1D CNN-only method and other state-of-the-art approaches.
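
    A minimal sketch of the two ingredients named above, a 1D CNN activity classifier and one plausible form of test-data sharpening (unsharp masking along the time axis); the architecture and hyperparameters are illustrative and do not reproduce the authors' exact two-stage design.

      # Minimal sketch: 1D CNN classifier plus unsharp-mask sharpening of
      # test signals at prediction time.
      import numpy as np
      import tensorflow as tf

      n_steps, n_channels, n_classes = 128, 9, 6
      model = tf.keras.Sequential([
          tf.keras.Input(shape=(n_steps, n_channels)),
          tf.keras.layers.Conv1D(64, 5, activation="relu"),
          tf.keras.layers.MaxPooling1D(2),
          tf.keras.layers.Conv1D(64, 5, activation="relu"),
          tf.keras.layers.GlobalAveragePooling1D(),
          tf.keras.layers.Dense(n_classes, activation="softmax"),
      ])
      model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

      def sharpen(x, alpha=0.5, k=5):
          """Unsharp masking: x + alpha * (x - moving_average(x))."""
          kernel = np.ones(k) / k
          blurred = np.apply_along_axis(
              lambda s: np.convolve(s, kernel, mode="same"), 1, x)
          return x + alpha * (x - blurred)

      x_test = np.random.randn(2, n_steps, n_channels).astype("float32")
      probs = model.predict(sharpen(x_test))   # sharpened inputs at test time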

  4. A multiscale-based approach for composite materials with embedded PZT filaments for energy harvesting

    NASA Astrophysics Data System (ADS)

    El-Etriby, Ahmed E.; Abdel-Meguid, Mohamed E.; Hatem, Tarek M.; Bahei-El-Din, Yehia A.

    2014-03-01

    Ambient vibrations are a major source of wasted energy; properly exploited, such vibrations can be converted into valuable energy and harvested to power devices, e.g., electronic devices. Accordingly, energy harvesting using smart structures with active piezoelectric ceramics has gained wide interest over the past few years as a method for converting such wasted energy. This paper provides numerical and experimental analysis of piezoelectric fiber-based composites for energy harvesting applications, proposing a multi-scale modeling approach coupled with experimental verification. The suggested multi-scale approach for predicting the behavior of piezoelectric fiber-based composites uses a micromechanical model based on Transformation Field Analysis (TFA) to calculate the overall material properties of the electrically active composite structure. Capitalizing on the calculated properties, single-phase analysis of a homogeneous structure is conducted using the finite element method. The experimental work involves running dynamic tests on piezoelectric fiber-based composites to simulate the mechanical vibrations experienced by subway train floor tiles. Experimental results agree well with the numerical results for both static and dynamic tests.

  5. An effective utilization management strategy by dual approach of influencing physician ordering and gate keeping.

    PubMed

    Elnenaei, Manal O; Campbell, Samuel G; Thoni, Andrea J; Lou, Amy; Crocker, Bryan D; Nassar, Bassam A

    2016-02-01

    There is increasing recognition of the importance of appropriate laboratory test utilization. We investigate the effect of a multifaceted educational approach that includes physician feedback on individual test ordering, in conjunction with targeted restriction, on the utilization of selected laboratory tests. Scientific evidence was compiled on the usefulness and limitations of tests suspected of being overutilized in our laboratories. A variety of approaches were used to deliver education on each of the targeted tests, with greater focus on primary care physicians (PCPs). Feedback on requesting behavior for these tests, which included an educational component, was also communicated to the latter group. Laboratory-based restriction of testing was also exercised, including the unbundling of our electrolyte panel. PCP requesting patterns for the selected tests were found to be markedly skewed. The interventions implemented over the study period resulted in a substantial 51% reduction in overall ordering of five of the targeted tests, equating to an annual marginal cost saving of $60,124. Unbundling of the electrolyte panel resulted in annual marginal cost savings of $42,500 on chloride and $48,000 on total CO2. A multifaceted educational approach combined with feedback on utilization and laboratory-driven gate-keeping significantly reduced the number of laboratory tests suspected of being redundant or unjustifiably requested. Laboratory professionals are well positioned to manage demand for laboratory tests by utilizing the evidence base in developing specific test ordering directives and gate-keeping rules.

  6. Testing-Based Compiler Validation for Synchronous Languages

    NASA Technical Reports Server (NTRS)

    Garoche, Pierre-Loic; Howar, Falk; Kahsai, Temesghen; Thirioux, Xavier

    2014-01-01

    In this paper we present a novel lightweight approach to validating compilers for synchronous languages. Instead of verifying a compiler for all input programs or providing a fixed suite of regression tests, we extend the compiler to generate a test suite with high behavioral coverage, geared towards discovery of faults, for every compiled artifact. We have implemented and evaluated our approach using a compiler from Lustre to C.

  7. A Review of DIMPACK Version 1.0: Conditional Covariance-Based Test Dimensionality Analysis Package

    ERIC Educational Resources Information Center

    Deng, Nina; Han, Kyung T.; Hambleton, Ronald K.

    2013-01-01

    DIMPACK Version 1.0 for assessing test dimensionality based on a nonparametric conditional covariance approach is reviewed. This software was originally distributed by Assessment Systems Corporation and now can be freely accessed online. The software consists of Windows-based interfaces of three components: DIMTEST, DETECT, and CCPROX/HAC, which…

  8. Experiments with Test Case Generation and Runtime Analysis

    NASA Technical Reports Server (NTRS)

    Artho, Cyrille; Drusinsky, Doron; Goldberg, Allen; Havelund, Klaus; Lowry, Mike; Pasareanu, Corina; Rosu, Grigore; Visser, Willem; Koga, Dennis (Technical Monitor)

    2003-01-01

    Software testing is typically an ad hoc process where human testers manually write many test inputs and expected test results, perhaps automating their execution in a regression suite. This process is cumbersome and costly. This paper reports preliminary results on an approach to further automate this process. The approach consists of combining automated test case generation, based on systematically exploring the program's input domain, with runtime analysis, where execution traces are monitored and verified against temporal logic specifications or analyzed using advanced algorithms for detecting concurrency errors such as data races and deadlocks. The approach suggests generating specifications dynamically per input instance rather than statically, once and for all. The paper describes experiments with variants of this approach in the context of two examples, a planetary rover controller and a spacecraft fault protection system.

  9. Support vector machines-based modelling of seismic liquefaction potential

    NASA Astrophysics Data System (ADS)

    Pal, Mahesh

    2006-08-01

    This paper investigates the potential of a support vector machine (SVM)-based classification approach to assess liquefaction potential from actual standard penetration test (SPT) and cone penetration test (CPT) field data. SVMs are based on statistical learning theory and have been found to work well in comparison to neural networks in several other applications. Both CPT and SPT field data sets are used with SVMs for predicting the occurrence and non-occurrence of liquefaction based on different input parameter combinations. With the SPT and CPT test data sets, highest accuracies of 96% and 97%, respectively, were achieved. This suggests that SVMs can effectively be used to model the complex relationship between different soil parameters and liquefaction potential. Several other combinations of input variables were used to assess the influence of different input parameters on liquefaction potential. The proposed approach suggests that neither the normalized cone resistance value is required with CPT data nor the calculation of the standardized SPT value with SPT data. Further, SVMs require few user-defined parameters and provide better performance than the neural network approach.
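
    A minimal sketch of the classification set-up, an SVM predicting liquefaction occurrence from field test parameters; the feature columns and records are illustrative placeholders, not the study's SPT/CPT data sets.

      # Minimal sketch: SVM classification of liquefaction occurrence.
      import numpy as np
      from sklearn.svm import SVC
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler

      # columns: e.g. blow count, effective stress, cyclic stress ratio
      X = np.array([[12, 60.0, 0.22], [30, 95.0, 0.15],
                    [ 8, 50.0, 0.30], [25, 80.0, 0.18]])
      y = np.array([1, 0, 1, 0])     # 1 = liquefaction observed

      clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0))
      clf.fit(X, y)
      print(clf.predict([[10, 55.0, 0.28]]))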

  10. Improved Test Planning and Analysis Through the Use of Advanced Statistical Methods

    NASA Technical Reports Server (NTRS)

    Green, Lawrence L.; Maxwell, Katherine A.; Glass, David E.; Vaughn, Wallace L.; Barger, Weston; Cook, Mylan

    2016-01-01

    The goal of this work is, through computational simulations, to provide statistically-based evidence to convince the testing community that a distributed testing approach is superior to a clustered testing approach for most situations. For clustered testing, numerous, repeated test points are acquired at a limited number of test conditions. For distributed testing, only one or a few test points are requested at many different conditions. The statistical techniques of Analysis of Variance (ANOVA), Design of Experiments (DOE) and Response Surface Methods (RSM) are applied to enable distributed test planning, data analysis and test augmentation. The D-Optimal class of DOE is used to plan an optimally efficient single- and multi-factor test. The resulting simulated test data are analyzed via ANOVA and a parametric model is constructed using RSM. Finally, ANOVA can be used to plan a second round of testing to augment the existing data set with new data points. The use of these techniques is demonstrated through several illustrative examples. To date, many thousands of comparisons have been performed and the results strongly support the conclusion that the distributed testing approach outperforms the clustered testing approach.
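
    A minimal sketch of the RSM step of this chain: fitting a full quadratic response surface by ordinary least squares to single test points distributed over many conditions, using simulated data.

      # Minimal sketch: quadratic response surface fit to distributed
      # (one-point-per-condition) test data.
      import numpy as np

      rng = np.random.default_rng(1)
      x1 = rng.uniform(-1, 1, 30)             # factor 1, coded units
      x2 = rng.uniform(-1, 1, 30)             # factor 2, coded units
      y = 2 + 1.5*x1 - 0.8*x2 + 0.6*x1*x2 + 0.9*x1**2 + rng.normal(0, 0.1, 30)

      # design matrix for a full quadratic model in two factors
      X = np.column_stack([np.ones_like(x1), x1, x2, x1*x2, x1**2, x2**2])
      beta, *_ = np.linalg.lstsq(X, y, rcond=None)
      print("fitted RSM coefficients:", np.round(beta, 2))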

  11. Binding Task-Based Language Teaching and Task-Based Language Testing: A Survey into EFL Teachers and Learners' Views of Task-Based Approach

    ERIC Educational Resources Information Center

    Panahi, Ali

    2012-01-01

    In most settings, task-based language teaching and testing have been dissociated from each other. That is why this study set out to reconsider learners' views towards the awareness and implementation of task-based language teaching through IELTS listening tasks. To meet these objectives, after sketching the instrumentation, the learners were divided into…

  12. Computer-Based Testing: Practices and Considerations. Synthesis Report 78

    ERIC Educational Resources Information Center

    Thurlow, Martha; Lazarus, Sheryl S.; Albus, Debra; Hodgson, Jennifer

    2010-01-01

    Computer-based testing (CBT) has emerged as one of the recent "innovative" approaches to assessments most pursued by states. CBT is lauded as the answer to having cheaper and speedier test delivery for state and district-wide assessments. It is also seen by some as an avenue toward greater accessibility for students with disabilities. In…

  13. Examination of Test and Item Statistics from Visual and Verbal Mathematics Questions

    ERIC Educational Resources Information Center

    Alpayar, Cagla; Gulleroglu, H. Deniz

    2017-01-01

    The aim of this research is to determine whether students' test performance and approaches to test questions change based on the type of mathematics questions (visual or verbal) administered to them. This research is based on a mixed-design model. The quantitative data are gathered from 297 seventh grade students, attending seven different middle…

  14. A numerical evaluation of the dynamical systems approach to wall layer turbulence

    NASA Technical Reports Server (NTRS)

    Berkooz, Gal

    1990-01-01

    This work attempts to test predictions based on the dynamical systems approach to wall layer turbulence. We analyze the dynamical systems model for the nonlinear interaction mechanisms between the coherent structures and deduce the expected qualitative behavior. We then test for this behavior in data sets from direct numerical simulation (DNS). The agreement is good, given the suboptimal conditions for the test. We discuss implications of this test and work to be done to deepen the understanding of control of turbulent boundary layers.

  15. ENHANCING TEST SENSITIVITY IN TOXICITY TESTING BY USING A STATISTICAL PERFORMANCE STANDARD

    EPA Science Inventory

    Previous reports have shown that within-test sensitivity can vary markedly among laboratories. Experts have advocated an empirical approach to controlling test variability based on the MSD, control means, and other test acceptability criteria. (The MSD represents the smallest dif...

  16. The Impact of Team-Based Learning on Nervous System Examination Knowledge of Nursing Students.

    PubMed

    Hemmati Maslakpak, Masomeh; Parizad, Naser; Zareie, Farzad

    2015-12-01

    Team-based learning is one of the active learning approaches in which independent learning is combined with small-group discussion in class. This study aimed to determine the impact of team-based learning on nervous system examination knowledge of nursing students. This quasi-experimental study was conducted on 3rd-year nursing students, comprising 5th-semester (intervention group) and 6th-semester (control group) students. The team-based learning method and the traditional lecture method were used to teach nervous system examination in the intervention and control groups, respectively. The data were collected using a 40-question test (multiple choice, matching, gap-filling, and descriptive questions) before and after the intervention in both groups. An Individual Readiness Assurance Test (RAT) and a Group Readiness Assurance Test (GRAT) were used to collect data in the intervention group. The collected data were analyzed in SPSS ver. 13 using descriptive and inferential statistical tests. In the team-based learning group, the mean (SD) score was 13.39 (4.52) before the intervention and increased to 31.07 (3.20) after the intervention; this increase was statistically significant. There was also a statistically significant difference between the scores of the RAT and GRAT in the team-based learning group. Using the team-based learning approach resulted in much better improvement and stability in the nervous system examination knowledge of nursing students compared to the traditional lecture method; therefore, this method could be efficiently used as an effective educational approach in nursing education.

  17. A Concept of Thermographic Method for Non-Destructive Testing of Polymeric Composite Structures Using Self-Heating Effect

    PubMed Central

    2017-01-01

    Traditional techniques of active thermography require an external source of energy for excitation, usually in the form of high-power lamps or ultrasonic devices. In this paper, the author presents an alternative approach based on the self-heating effect observable in polymer-based structures during cyclic loading. The presented approach is based on first determining the bending resonance frequencies of a tested structure and then exciting the structure with a multi-harmonic signal constructed from harmonics at the determined resonance frequencies. The tested structure then heats up at locations of stress concentration and mechanical energy dissipation due to its viscoelastic response. Applying a multi-harmonic signal ensures coverage of the structure by such heated regions. The concept is verified experimentally on artificially damaged composite specimens. The results demonstrate the presented approach and indicate its potential, especially when traditional methods of excitation with an external source for thermographic inspection cannot be applied. PMID:29283430

  18. A Concept of Thermographic Method for Non-Destructive Testing of Polymeric Composite Structures Using Self-Heating Effect.

    PubMed

    Katunin, Andrzej

    2017-12-28

    Traditional techniques of active thermography require an external source of energy for excitation, usually in the form of high-power lamps or ultrasonic devices. In this paper, the author presents an alternative approach based on the self-heating effect observable in polymer-based structures during cyclic loading. The presented approach is based on first determining the bending resonance frequencies of a tested structure and then exciting the structure with a multi-harmonic signal constructed from harmonics at the determined resonance frequencies. The tested structure then heats up at locations of stress concentration and mechanical energy dissipation due to its viscoelastic response. Applying a multi-harmonic signal ensures coverage of the structure by such heated regions. The concept is verified experimentally on artificially damaged composite specimens. The results demonstrate the presented approach and indicate its potential, especially when traditional methods of excitation with an external source for thermographic inspection cannot be applied.
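
    A minimal sketch of constructing the multi-harmonic excitation signal described above, as a sum of sinusoids at measured bending resonance frequencies; the frequencies and sampling parameters are illustrative.

      # Minimal sketch: multi-harmonic excitation signal from a set of
      # measured resonance frequencies.
      import numpy as np

      resonances = [42.0, 118.0, 231.0]        # Hz, from a preliminary sweep
      fs, duration = 5000.0, 2.0               # sampling rate (Hz), seconds
      t = np.arange(0, duration, 1.0 / fs)

      signal = sum(np.sin(2 * np.pi * f * t) for f in resonances)
      signal /= np.max(np.abs(signal))         # normalize amplitude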

  19. Finite-sample and asymptotic sign-based tests for parameters of non-linear quantile regression with Markov noise

    NASA Astrophysics Data System (ADS)

    Sirenko, M. A.; Tarasenko, P. F.; Pushkarev, M. I.

    2017-01-01

    One of the most noticeable features of sign-based statistical procedures is the opportunity to build an exact test for simple hypothesis testing of parameters in a regression model. In this article, we extend a sign-based approach to the nonlinear case with dependent noise. The examined model is a multi-quantile regression, which makes it possible to test hypotheses not only about regression parameters but about noise parameters as well.
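
    A minimal sketch of the underlying sign-based idea for a hypothesized parameter of a non-linear quantile regression: under H0, the residual signs at quantile tau are Bernoulli(tau), so the count of negative residuals is binomial. The model and data are illustrative, and the paper's Markov-noise refinements are not reproduced.

      # Minimal sketch: exact sign-based test of H0: beta = beta0 for a
      # non-linear quantile regression at quantile tau.
      import numpy as np
      from scipy.stats import binomtest

      tau = 0.5
      def model(x, beta):                 # hypothesized regression function
          return beta[0] * np.exp(beta[1] * x)

      rng = np.random.default_rng(2)
      x = rng.uniform(0, 1, 80)
      y = model(x, (1.0, 0.7)) + rng.normal(0, 0.1, 80)

      beta0 = (1.0, 0.7)                  # H0: beta = beta0
      n_neg = int(np.sum(y - model(x, beta0) < 0))
      print(binomtest(n_neg, n=len(y), p=tau).pvalue)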

  20. The effect of web quest and team-based learning on students' self-regulation.

    PubMed

    Badiyepeymaie Jahromi, Zohreh; Mosalanejad, Leili; Rezaee, Rita

    2016-04-01

    In this study, the authors aimed to examine the effects of cooperative learning methods using WebQuest and team-based learning on students' self-direction, self-regulation, and academic achievement. This is a comparative study of students taking a course in mental health and psychiatric disorders. In two consecutive years, one group of students was trained using the WebQuest approach as a teaching strategy (n=38), while the other group was taught using team-based learning (n=39). Data gathering was based on Guglielmino's self-directed learning readiness scale (SDLRS) and Buford's self-regulation questionnaire. The data were analyzed using descriptive statistics (median (IQR)), the Wilcoxon signed-rank test, and the Mann-Whitney U-test in SPSS software, version 13; p<0.05 was considered the significance level. The results of the Mann-Whitney U test showed that the participants' self-directed (self-management) and self-regulated learning differed between the two groups (p=0.04 and p=0.01, respectively). The Wilcoxon test revealed that self-directed learning indices (self-control and self-management) differed between the two strategies before and after the intervention. However, the scores related to learning (students' final scores) were higher with the WebQuest approach than with team-based learning. By employing modern educational approaches, students are not only more successful in their studies but also acquire the necessary professional skills for future performance. Further research comparing the effects of new teaching methods is required.

  1. Web-based oral health promotion program for older adults: Development and preliminary evaluation.

    PubMed

    Mariño, Rodrigo J; Marwaha, Parul; Barrow, Su-Yan

    2016-07-01

    This study reports on the impact evaluation of a Web-based oral health promotion programme aimed at improving the oral health knowledge, attitudes, practices and self-efficacy of independent-living older adults from Melbourne, Australia. With ethics approval from the University of Melbourne, a convenience sample of volunteers 55 years or older was invited to participate in a study to test a Web-based oral health promotion program. Consenting volunteers were asked to undergo a structured interview as part of the pre-intervention data collection. The intervention was based on the ORHIS (Oral Health Information Seminars/Sheets) Model and involved computer interaction with six oral health presentations, with no direct oral health professional input. A one-group pre-test-post-test quasi-experimental design was chosen to evaluate the intervention, and a series of paired t-tests was used to compare pre-test with post-test results. Forty-seven active, independent-living older adults participated in this evaluation. After the intervention, participants scored higher than before participating in this Web-based oral health program, showing significant improvements in oral health attitudes (4.10 vs. 4.94; p<0.01), knowledge (18.37 vs. 23.83; p<0.0001), and self-efficacy (84.37 vs. 89.23; p<0.01), as well as self-reported oral hygiene practices (i.e., frequency of use of dental floss) (p<0.05). The e-ORHIS approach was successful in improving oral health knowledge, attitudes and self-efficacy. As such, it represents a helpful approach for the design of (oral) health interventions in older adults. Further evaluation with a larger sample is required to test the long-term impact, including an economic evaluation of the e-ORHIS approach.

  2. Towards Accurate Node-Based Detection of P2P Botnets

    PubMed Central

    2014-01-01

    Botnets are a serious security threat to the current Internet infrastructure. In this paper, we propose a novel direction for P2P botnet detection called node-based detection. This approach focuses on the network characteristics of individual nodes. Based on our model, we examine a node's flows and extract useful features over a given time period. We have tested our approach on real-life data sets and achieved detection rates of 99-100% and low false positive rates of 0-2%. Comparison with other similar approaches on the same data sets shows that our approach outperforms the existing approaches. PMID:25089287
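
    A minimal sketch of the node-based idea, aggregating per-node flow features over a time window and classifying nodes; the feature names, records, and classifier choice are illustrative, not the authors' exact model.

      # Minimal sketch: per-node flow features classified as bot vs benign.
      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      # per-node features: [num_flows, mean_bytes, mean_duration, peer_count]
      X = np.array([[120,  400.0,  2.1, 80],    # P2P-bot-like node
                    [ 15, 9000.0, 30.0,  3],    # benign client
                    [110,  380.0,  1.9, 75],
                    [ 20, 7000.0, 25.0,  4]])
      y = np.array([1, 0, 1, 0])                # 1 = botnet node

      clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
      print(clf.predict([[100, 420.0, 2.0, 70]]))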

  3. Toward a new methodological paradigm for testing theories of health behavior and health behavior change.

    PubMed

    Noar, Seth M; Mehrotra, Purnima

    2011-03-01

    Traditional theory testing commonly applies cross-sectional (and occasionally longitudinal) survey research to test health behavior theory. Since such correlational research cannot demonstrate causality, a number of researchers have called for the increased use of experimental methods for theory testing. We introduce the multi-methodological theory-testing (MMTT) framework for testing health behavior theory. The MMTT framework introduces a set of principles that broaden the perspective of how we view evidence for health behavior theory. It suggests that while correlational survey research designs represent one method of testing theory, the weaknesses of this approach demand that complementary approaches be applied. Such approaches include randomized lab and field experiments, mediation analysis of theory-based interventions, and meta-analysis. These alternative approaches to theory testing can demonstrate causality in a much more robust way than is possible with correlational survey research methods. Such approaches should thus be increasingly applied in order to more completely and rigorously test health behavior theory. Greater application of research derived from the MMTT may lead researchers to refine and modify theory and ultimately make theory more valuable to practitioners. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  4. A Novel Approach to Prediction of Mild Obstructive Sleep Disordered Breathing in a Population-Based Sample: The Sleep Heart Health Study

    PubMed Central

    Caffo, Brian; Diener-West, Marie; Punjabi, Naresh M.; Samet, Jonathan

    2010-01-01

    This manuscript considers a data-mining approach for the prediction of mild obstructive sleep disordered breathing, defined as an elevated respiratory disturbance index (RDI), in 5,530 participants in a community-based study, the Sleep Heart Health Study. The prediction algorithm was built using modern ensemble learning algorithms, specifically boosting, which allowed for assessing potential high-dimensional interactions between predictor variables or classifiers. To evaluate the performance of the algorithm, the data were split into training and validation sets for varying thresholds for predicting the probability of a high RDI (≥ 7 events per hour in the given results). Based on a moderate classification threshold from the boosting algorithm, the estimated post-test odds of a high RDI were 2.20 times higher than the pre-test odds given a positive test, while the corresponding post-test odds were decreased by 52% given a negative test (sensitivity and specificity of 0.66 and 0.70, respectively). In rank order, the following variables had the largest impact on prediction performance: neck circumference, body mass index, age, snoring frequency, waist circumference, and snoring loudness. PMID:21120126
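
    The reported post-test odds follow from the stated sensitivity and specificity via likelihood ratios; a minimal sketch reproducing the arithmetic.

      # Minimal sketch: post-test odds from sensitivity/specificity via
      # likelihood ratios (post-test odds = pre-test odds * LR).
      sens, spec = 0.66, 0.70
      lr_pos = sens / (1 - spec)          # positive likelihood ratio
      lr_neg = (1 - sens) / spec          # negative likelihood ratio

      print(f"odds multiplier, positive test: {lr_pos:.2f}")      # ~2.20
      print(f"odds reduction, negative test: {1 - lr_neg:.0%}")   # ~51-52%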

  5. Test and Verification Approach for the NASA Constellation Program

    NASA Technical Reports Server (NTRS)

    Strong, Edward

    2008-01-01

    This viewgraph presentation is a test and verification approach for the NASA Constellation Program. The contents include: 1) The Vision for Space Exploration: Foundations for Exploration; 2) Constellation Program Fleet of Vehicles; 3) Exploration Roadmap; 4) Constellation Vehicle Approximate Size Comparison; 5) Ares I Elements; 6) Orion Elements; 7) Ares V Elements; 8) Lunar Lander; 9) Map of Constellation content across NASA; 10) CxP T&V Implementation; 11) Challenges in CxP T&V Program; 12) T&V Strategic Emphasis and Key Tenets; 13) CxP T&V Mission & Vision; 14) Constellation Program Organization; 15) Test and Evaluation Organization; 16) CxP Requirements Flowdown; 17) CxP Model Based Systems Engineering Approach; 18) CxP Verification Planning Documents; 19) Environmental Testing; 20) Scope of CxP Verification; 21) CxP Verification - General Process Flow; 22) Avionics and Software Integrated Testing Approach; 23) A-3 Test Stand; 24) Space Power Facility; 25) MEIT and FEIT; 26) Flight Element Integrated Test (FEIT); 27) Multi-Element Integrated Testing (MEIT); 28) Flight Test Driving Principles; and 29) Constellation's Integrated Flight Test Strategy Low Earth Orbit Servicing Capability.

  6. A hybrid approach to urine drug testing using high-resolution mass spectrometry and select immunoassays.

    PubMed

    McMillin, Gwendolyn A; Marin, Stephanie J; Johnson-Davis, Kamisha L; Lawlor, Bryan G; Strathmann, Frederick G

    2015-02-01

    The major objective of this research was to propose a simplified approach for the evaluation of medication adherence in chronic pain management patients, using liquid chromatography time-of-flight (TOF) mass spectrometry performed in parallel with select homogeneous enzyme immunoassays (HEIAs); we call it a "hybrid" approach to urine drug testing. The hybrid approach was defined based on anticipated positivity rates, availability of commercial reagents for HEIAs, and assay performance, particularly analytical sensitivity and specificity for the drug(s) of interest. Subsequent to implementation of the hybrid approach, time to result was compared with that observed with other urine drug testing approaches. Opioids, benzodiazepines, zolpidem, amphetamine-like stimulants, and methylphenidate metabolite (37 drug analytes in total) were detected by TOF mass spectrometry to maximize specificity and sensitivity. Barbiturates, cannabinoid metabolite, carisoprodol, cocaine metabolite, ethyl glucuronide, methadone, phencyclidine, propoxyphene, and tramadol were detected by HEIAs that performed adequately and/or for which positivity rates were very low. Time to result was significantly reduced compared with the traditional approach. The hybrid approach to urine drug testing provides a simplified and analytically specific testing process that minimizes the need for secondary confirmation.

  7. Simple Approaches to Minimally-Instrumented, Microfluidic-Based Point-of-Care Nucleic Acid Amplification Tests

    PubMed Central

    Mauk, Michael G.; Song, Jinzhao; Liu, Changchun; Bau, Haim H.

    2018-01-01

    Designs and applications of microfluidics-based devices for molecular diagnostics (nucleic acid amplification tests, NAATs) in infectious disease testing are reviewed, with emphasis on minimally instrumented, point-of-care (POC) tests for resource-limited settings. Microfluidic cartridges (‘chips’) that combine solid-phase nucleic acid extraction; isothermal enzymatic nucleic acid amplification; pre-stored, paraffin-encapsulated lyophilized reagents; and real-time or endpoint optical detection are described. These chips can be used with a companion module for separating plasma from blood through a combined sedimentation-filtration effect. Three reporter types (fluorescence, colorimetric dyes, and bioluminescence) and a new paradigm for end-point detection based on a diffusion-reaction column are compared. Multiplexing (parallel amplification and detection of multiple targets) is demonstrated. Low-cost detection and added functionality (data analysis, control, communication) can be realized using a cellphone platform with the chip. Some related and similar-purposed approaches by others are surveyed. PMID:29495424

  8. A call for differentiated approaches to delivering HIV services to key populations.

    PubMed

    Macdonald, Virginia; Verster, Annette; Baggaley, Rachel

    2017-07-21

    Key populations (KPs) are disproportionally affected by HIV and have low rates of access to HIV testing and treatment services compared to the broader population. WHO promotes the use of differentiated approaches for reaching and recruiting KPs into the HIV services continuum. These approaches may help increase access for KPs, who are often criminalized or stigmatized. By catering to the specific needs of each KP individual, differentiated approaches may increase service acceptability, quality and coverage, reduce costs and support KP members in leading the HIV response among their communities. WHO recommends the implementation of community-based and lay-provider-administered HIV testing services. Together, these approaches reduce barriers and costs associated with other testing strategies, allow greater ownership of HIV programmes for KP members and reach more people than do facility-based services. Despite this evidence, availability of and support for such services remain limited. Peer-driven interventions have been shown to be effective in engaging, recruiting and supporting clients. Some programmes employ HIV-positive or non-PLHIV "peer navigators" and other staff to provide case management, enrolment and/or re-enrolment in care and treatment services. However, a better understanding of the impact, cost effectiveness and potential burden on peer volunteers is required. Task shifting and non-facility-based service locations for antiretroviral therapy (ART) initiation and maintenance and antiretroviral (ARV) distribution are recommended in both the consolidated HIV treatment and KP guidelines of WHO. These approaches are accepted in generalized epidemics and for the general population, where successful models exist; however, few organizations provide or initiate ART at KP community-based services. The application of a differentiated service approach for KPs could increase the number of people who know their status and receive effective and sustained prevention and treatment for HIV. However, while community-based and lay-provider testing are effective and affordable, they are not implemented at scale. Furthermore, regulatory barriers to legitimizing lay and peer providers as part of healthcare delivery systems need to be overcome in many settings. WHO recommendations on task shifting and decentralization of ART treatment and care are often not applied in KP settings.

  9. Are your covariates under control? How normalization can re-introduce covariate effects.

    PubMed

    Pain, Oliver; Dudbridge, Frank; Ronald, Angelica

    2018-04-30

    Many statistical tests rely on the assumption that the residuals of a model are normally distributed. Rank-based inverse normal transformation (INT) of the dependent variable is one of the most popular approaches to satisfy the normality assumption. When covariates are included in the analysis, a common approach is to first adjust for the covariates and then normalize the residuals. This study investigated the effect of regressing covariates against the dependent variable and then applying rank-based INT to the residuals. The correlation between the dependent variable and covariates at each stage of processing was assessed. An alternative approach was tested in which rank-based INT was applied to the dependent variable before regressing covariates. Analyses based on both simulated and real data examples demonstrated that applying rank-based INT to the dependent variable residuals after regressing out covariates re-introduces a linear correlation between the dependent variable and covariates, increasing type-I errors and reducing power. On the other hand, when rank-based INT was applied prior to controlling for covariate effects, residuals were normally distributed and linearly uncorrelated with covariates. This latter approach is therefore recommended in situations where normality of the dependent variable is required.
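
    A minimal sketch contrasting the two orderings on simulated data (a skewed outcome with one covariate); the residualize-then-INT correlation is typically clearly nonzero, while INT-then-residualize is uncorrelated with the covariate by construction.

      # Minimal sketch: order of rank-based INT vs covariate adjustment.
      import numpy as np
      from scipy.stats import rankdata, norm

      def int_transform(x):
          """Rank-based INT with a Blom-type offset."""
          r = rankdata(x)
          return norm.ppf((r - 0.375) / (len(x) + 0.25))

      rng = np.random.default_rng(3)
      cov = rng.normal(size=5000)                      # covariate
      y = np.exp(0.5 * cov + rng.normal(size=5000))    # skewed outcome

      # (a) regress the covariate out first, then INT the residuals
      resid_a = y - np.polyval(np.polyfit(cov, y, 1), cov)
      print("residualize-then-INT corr:",
            round(np.corrcoef(int_transform(resid_a), cov)[0, 1], 3))

      # (b) INT the outcome first, then regress the covariate out
      y_int = int_transform(y)
      resid_b = y_int - np.polyval(np.polyfit(cov, y_int, 1), cov)
      print("INT-then-residualize corr:",
            round(np.corrcoef(resid_b, cov)[0, 1], 3))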

  10. A clinical approach to diagnosis of autoimmune encephalitis

    PubMed Central

    Graus, Francesc; Titulaer, Maarten J; Balu, Ramani; Benseler, Susanne; Bien, Christian G; Cellucci, Tania; Cortese, Irene; Dale, Russell C; Gelfand, Jeffrey M; Geschwind, Michael; Glaser, Carol A; Honnorat, Jerome; Höftberger, Romana; Iizuka, Takahiro; Irani, Sarosh R; Lancaster, Eric; Leypoldt, Frank; Prüss, Harald; Rae-Grant, Alexander; Reindl, Markus; Rosenfeld, Myrna R; Rostásy, Kevin; Saiz, Albert; Venkatesan, Arun; Vincent, Angela; Wandinger, Klaus-Peter; Waters, Patrick; Dalmau, Josep

    2016-01-01

    Encephalitis is a severe inflammatory disorder of the brain with many possible causes and a complex differential diagnosis. Advances in autoimmune encephalitis research in the past 10 years have led to the identification of new syndromes and biomarkers that have transformed the diagnostic approach to these disorders. However, existing criteria for autoimmune encephalitis are too reliant on antibody testing and response to immunotherapy, which might delay the diagnosis. We reviewed the literature and gathered the experience of a team of experts with the aims of developing a practical, syndrome-based diagnostic approach to autoimmune encephalitis and providing guidelines to navigate through the differential diagnosis. Because autoantibody test results and response to therapy are not available at disease onset, we based the initial diagnostic approach on neurological assessment and conventional tests that are accessible to most clinicians. Through logical differential diagnosis, levels of evidence for autoimmune encephalitis (possible, probable, or definite) are achieved, which can lead to prompt immunotherapy. PMID:26906964

  11. Shifting from presumptive to test-based management of malaria - technical basis and implications for malaria control in Ghana.

    PubMed

    Baiden, F; Malm, K; Bart-Plange, C; Hodgson, A; Chandramohan, D; Webster, J; Owusu-Agyei, S

    2014-06-01

    For many years, the presumptive approach to malaria management was the one recommended by the World Health Organisation (WHO), and it was incorporated into syndromic guidelines such as the Integrated Management of Childhood Illnesses (IMCI). In early 2010, however, WHO issued revised treatment guidelines that call for a shift from the presumptive to the test-based approach. Practically, this implies that in all suspected cases, the diagnosis of uncomplicated malaria should be confirmed using a rapid diagnostic test before treatment is initiated. This revision effectively brings to an end an era of clinical practice that spanned several years. Its implementation has important implications for the health systems in malaria-endemic countries. On the basis of research in Ghana and other countries, and evidence from program work, the Ghana National Malaria Control Program has issued revised national treatment guidelines that call for implementation of test-based management of malaria in all cases and across all age groups. This article reviews the evidence and the technical basis for the shift to test-based management and examines the implications for malaria control in Ghana.

  12. A U-statistics based approach to sample size planning of two-arm trials with discrete outcome criterion aiming to establish either superiority or noninferiority.

    PubMed

    Wellek, Stefan

    2017-02-28

    In current practice, the most frequently applied approach to the handling of ties in the Mann-Whitney-Wilcoxon (MWW) test is based on the conditional distribution of the sum of mid-ranks, given the observed pattern of ties. Starting from this conditional version of the testing procedure, a sample size formula was derived and investigated by Zhao et al. (Stat Med 2008). In contrast, the approach we pursue here is a nonconditional one exploiting explicit representations for the variances of, and the covariance between, the two U-statistics estimators involved in the Mann-Whitney form of the test statistic. The accuracy of both ways of approximating the sample sizes required for attaining a prespecified level of power in the MWW test for superiority with arbitrarily tied data is comparatively evaluated by means of simulation. The key qualitative conclusions to be drawn from these numerical comparisons are as follows. With the sample sizes calculated by means of the respective formula, both versions of the test maintain the level and the prespecified power with about the same degree of accuracy. Despite the equivalence in terms of accuracy, the sample size estimates obtained by means of the new formula are in many cases markedly lower than those calculated for the conditional test. Perhaps a still more important advantage of the nonconditional approach based on U-statistics is that it can also be adopted for noninferiority trials.
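
    Independent of the formula-based planning itself, a planned sample size can be checked by simulation; a minimal sketch of an empirical power check for the MWW test with tied (discrete, ordinal) data, using illustrative outcome distributions.

      # Minimal sketch: simulation check of MWW power at a planned per-arm
      # sample size with heavily tied ordinal outcomes.
      import numpy as np
      from scipy.stats import mannwhitneyu

      rng = np.random.default_rng(4)
      categories = np.arange(5)                    # 5-point ordinal outcome
      p_ctrl = [0.30, 0.30, 0.20, 0.15, 0.05]
      p_trt  = [0.15, 0.20, 0.25, 0.25, 0.15]      # shifted upward

      n, reps, alpha = 60, 2000, 0.05
      rejections = 0
      for _ in range(reps):
          a = rng.choice(categories, size=n, p=p_ctrl)
          b = rng.choice(categories, size=n, p=p_trt)
          if mannwhitneyu(a, b, alternative="two-sided").pvalue < alpha:
              rejections += 1
      print(f"empirical power at n={n} per arm: {rejections / reps:.2f}")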

  13. Prospective Environmental Risk Assessment for Sediment-Bound Organic Chemicals: A Proposal for Tiered Effect Assessment.

    PubMed

    Diepens, Noël J; Koelmans, Albert A; Baveco, Hans; van den Brink, Paul J; van den Heuvel-Greve, Martine J; Brock, Theo C M

    A broadly accepted framework for prospective environmental risk assessment (ERA) of sediment-bound organic chemicals is currently lacking. Such a framework requires clear protection goals, evidence-based concepts that link exposure to effects, and a transparent tiered effect assessment. In this paper, we provide a tiered prospective sediment ERA procedure for organic chemicals in sediment, with a focus on the applicable European regulations and the underlying data requirements. Using the ecosystem services concept, we derived specific protection goals for ecosystem service providing units: microorganisms, benthic algae, sediment-rooted macrophytes, benthic invertebrates and benthic vertebrates. Triggers for sediment toxicity testing are discussed. We recommend a tiered approach (Tier 0 through Tier 3). Tier 0 is a cost-effective screening based on chronic water-exposure toxicity data for pelagic species and equilibrium partitioning. Tier 1 is based on spiked-sediment laboratory toxicity tests with standard benthic test species and standardised test methods. If comparable chronic toxicity data for both standard and additional benthic test species are available, the Species Sensitivity Distribution (SSD) approach is a more viable Tier-2 option than the geometric mean approach. This paper includes criteria for accepting results of sediment-spiked single-species toxicity tests in prospective ERA, and for the application of the SSD approach. We propose micro/mesocosm experiments with spiked sediment, to study colonisation success by benthic organisms, as a Tier-3 option. Ecological effect models can be used to supplement the experimental tiers. A strategy for unifying information from the various tiers through experimental work and exposure and effect modelling is provided.

  14. Multiobjective optimization approach: thermal food processing.

    PubMed

    Abakarov, A; Sushkov, Y; Almonacid, S; Simpson, R

    2009-01-01

    The objective of this study was to utilize a multiobjective optimization technique for the thermal sterilization of packaged foods. The multiobjective optimization approach used in this study is based on the optimization of well-known aggregating functions by an adaptive random search algorithm. The applicability of the proposed approach was illustrated by solving widely used multiobjective test problems taken from the literature. The numerical results obtained for the multiobjective test problems and for the thermal processing problem show that the proposed approach can be effectively used for solving multiobjective optimization problems arising in the food engineering field.
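
    A minimal sketch of the aggregating-function idea, weighted-sum scalarization minimized by plain random search on a classic two-objective test problem (Schaffer: f1 = x^2, f2 = (x - 2)^2); the adaptive step-size control of the study's algorithm is not reproduced.

      # Minimal sketch: weighted-sum aggregating function minimized by
      # random search on the Schaffer two-objective test problem.
      import numpy as np

      def f1(x): return x**2
      def f2(x): return (x - 2.0)**2

      rng = np.random.default_rng(5)
      w = 0.5                                   # weight between objectives
      best_x, best_val = None, np.inf
      for _ in range(10000):
          x = rng.uniform(-5, 5)
          val = w * f1(x) + (1 - w) * f2(x)     # aggregating function
          if val < best_val:
              best_x, best_val = x, val
      print(f"x* ~ {best_x:.3f} (analytic optimum for w=0.5 is x=1)")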

  15. The Relationship between CTT and IRT Approaches in Analyzing Item Characteristics

    ERIC Educational Resources Information Center

    Abedalaziz, Nabeel; Leng, Chin Hai

    2013-01-01

    Most of the tests and inventories used by counseling psychologists have been developed using CTT; IRT derives from what is called latent trait theory. A number of important differences exist between CTT- versus IRT-based approaches to both test development and evaluation, as well as the process of scoring the response profiles of individual…

  16. Analyzing the Efficacy of the Testing Effect Using Kahoot™ on Student Performance

    ERIC Educational Resources Information Center

    Iwamoto, Darren H.; Hargis, Jace; Taitano, Erik Jon; Vuong, Ky

    2017-01-01

    Lower than expected high-stakes examination scores were being observed in a first-year general psychology class. This research sought an alternate approach that would assist students in preparing for high-stakes examinations. The purpose of this study was to measure the effectiveness of an alternate teaching approach based on the testing effect to…

  17. West Virginia Physical Education Teacher Perceptions of State Mandated Fitnessgram® Testing and Application of Results

    ERIC Educational Resources Information Center

    Miller, William M.

    2013-01-01

    Background/Purpose: In response to concerns with increasing rates of childhood obesity, many states have enacted policies that affect physical education. A commonly used approach is state mandated fitness test administration in school-based settings. While this approach is widely debated throughout the literature, one area that lacks research is…

  18. Empirical Approaches to Measuring the Intelligibility of Different Varieties of English in Predicting Listener Comprehension

    ERIC Educational Resources Information Center

    Kang, Okim; Thomson, Ron I.; Moran, Meghan

    2018-01-01

    This study compared five research-based intelligibility measures as they were applied to six varieties of English. The objective was to determine which approach to measuring intelligibility would be most reliable for predicting listener comprehension, as measured through a listening comprehension test similar to the Test of English as a Foreign…

  19. Critical evaluation of the EU-technical guidance on shelf-life studies for L. monocytogenes on RTE-foods: a case study for smoked salmon.

    PubMed

    Vermeulen, A; Devlieghere, F; De Loy-Hendrickx, A; Uyttendaele, M

    2011-01-31

    In November 2008, a technical guidance document on the challenge test protocol was published by the EU CRL (Community Reference Laboratory) for L. monocytogenes. This document describes the practical aspects of executing a challenge test in order to comply with EU Commission Regulation No 2073/2005 on microbiological criteria for foodstuffs. The guideline specifies two approaches. In the first, challenge tests are based on actual measurements at the beginning and end of the shelf-life of products stored under a reasonably foreseeable temperature profile. In the second, the growth potential is calculated by predictive models using a validated maximum specific growth rate. The present study evaluates the two above-mentioned approaches on cold smoked salmon, a typical risk product for L. monocytogenes. The focus is on: (i) the relative importance of intrabatch versus interbatch variability, (ii) the concept of a simple challenge test based on actual data at the start and end of shelf-life versus a modelling approach, and (iii) the interpretation of challenge tests. In addition, available tertiary models were used to estimate the growth potential of these products based on their initial physicochemical characteristics. From the results it could be concluded that in some batches considerable intrabatch variability was obtained. In general, however, the interbatch variability was significantly higher than the intrabatch variability. Concerning the two methods for challenge tests, the first approach (simple challenge test) can be set up rather rapidly and is cost-effective for SMEs (small and medium enterprises), but it provides only a single isolated outcome. This implies that challenge tests should be redone if changes occur in the composition or production process. The second (modelling) approach, using extended challenge tests to establish growth parameters, needs larger set-ups and more complicated data analysis, which makes it more expensive. Using available tertiary models has the major advantage that the most important intrinsic and extrinsic factors can be included in the prediction of the growth parameter. It was clear that product-specific models, taking into account the interaction effects with background flora, performed best. Regarding the challenge tests, the best approach will depend on the particular context, as in the end both approaches lead to the same conclusion. Copyright © 2010 Elsevier B.V. All rights reserved.
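
    The guidance's second route reduces, in its simplest form, to an arithmetic sketch: project the growth in log10 units from a validated maximum specific growth rate over the shelf life, capped at the maximum population density. The model below assumes lag-free exponential growth and hypothetical parameter values; it is not the tertiary models evaluated in the study.

```python
import numpy as np

def growth_potential(mu_max_per_h, shelf_life_h, n0_log=2.0, n_max_log=8.0):
    """Growth potential delta (log10 cfu/g) over the shelf life, assuming
    lag-free exponential growth at the validated maximum specific growth
    rate mu_max (ln units per hour), capped at the maximum density."""
    delta = mu_max_per_h * shelf_life_h / np.log(10)   # convert ln to log10
    return min(delta, n_max_log - n0_log)

# Hypothetical values for cold smoked salmon stored three weeks chilled.
# A growth potential above 0.5 log10 is commonly taken to indicate that
# the food supports growth of L. monocytogenes.
print(growth_potential(mu_max_per_h=0.02, shelf_life_h=21 * 24))
```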

  20. New learning based super-resolution: use of DWT and IGMRF prior.

    PubMed

    Gajjar, Prakash P; Joshi, Manjunath V

    2010-05-01

    In this paper, we propose a new learning-based approach for super-resolving an image captured at low spatial resolution. Given the low spatial resolution test image and a database consisting of low and high spatial resolution images, we obtain super-resolution for the test image. We first obtain an initial high-resolution (HR) estimate by learning the high-frequency details from the available database. A new discrete wavelet transform (DWT) based approach is proposed for learning that uses a set of low-resolution (LR) images and their corresponding HR versions. Since the super-resolution is an ill-posed problem, we obtain the final solution using a regularization framework. The LR image is modeled as the aliased and noisy version of the corresponding HR image, and the aliasing matrix entries are estimated using the test image and the initial HR estimate. The prior model for the super-resolved image is chosen as an Inhomogeneous Gaussian Markov random field (IGMRF) and the model parameters are estimated using the same initial HR estimate. A maximum a posteriori (MAP) estimation is used to arrive at the cost function which is minimized using a simple gradient descent approach. We demonstrate the effectiveness of the proposed approach by conducting the experiments on gray scale as well as on color images. The method is compared with the standard interpolation technique and also with existing learning-based approaches. The proposed approach can be used in applications such as wildlife sensor networks, remote surveillance where the memory, the transmission bandwidth, and the camera cost are the main constraints.
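
    The MAP framework lends itself to a much-reduced sketch: a block-averaging operator stands in for the estimated aliasing matrix, a spatially weighted smoothness term stands in for the IGMRF prior (its weights assumed precomputed), and plain gradient descent minimises the cost; all sizes and parameters are hypothetical.

```python
import numpy as np

def downsample(z, q=2):
    """Forward model D z: average q-by-q blocks (a simplified stand-in for
    the aliasing matrix, which the paper estimates from the data)."""
    h, w = z.shape
    return z.reshape(h // q, q, w // q, q).mean(axis=(1, 3))

def upsample_adjoint(y, q=2):
    """Adjoint D^T of block averaging."""
    return np.kron(y, np.ones((q, q))) / (q * q)

def laplacian(z):
    """Discrete 4-neighbour Laplacian with replicated borders."""
    zp = np.pad(z, 1, mode="edge")
    return zp[:-2, 1:-1] + zp[2:, 1:-1] + zp[1:-1, :-2] + zp[1:-1, 2:] - 4 * z

def map_super_resolve(y, lam, q=2, n_iter=500, step=0.5):
    """Gradient descent on ||y - D z||^2 plus a weighted smoothness prior.
    lam is a per-pixel weight field standing in for the IGMRF parameters
    (small near edges, large in smooth regions; assumed precomputed here).
    The prior gradient treats lam as locally constant, an approximation."""
    z = np.kron(y, np.ones((q, q)))            # simple initial HR estimate
    for _ in range(n_iter):
        grad = 2 * upsample_adjoint(downsample(z, q) - y, q) - 2 * lam * laplacian(z)
        z = z - step * grad
    return z

# Tiny usage example with a synthetic smooth image.
hr = np.add.outer(np.linspace(0, 1, 16), np.linspace(0, 1, 16))
lr = downsample(hr)
est = map_super_resolve(lr, lam=0.1 * np.ones((16, 16)))
print(float(np.abs(est - hr).mean()))
```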

  1. Flight-Test Evaluation of Flutter-Prediction Methods

    NASA Technical Reports Server (NTRS)

    Lind, Rick; Brenner, Marty

    2003-01-01

    The flight-test community routinely spends considerable time and money to determine a range of flight conditions, called a flight envelope, within which an aircraft is safe to fly. The cost of determining a flight envelope could be greatly reduced if there were a method of safely and accurately predicting the speed associated with the onset of an instability called flutter. Several methods have been developed with the goal of predicting flutter speeds to improve the efficiency of flight testing. These methods include (1) data-based methods, in which one relies entirely on information obtained from the flight tests and (2) model-based approaches, in which one relies on a combination of flight data and theoretical models. The data-driven methods include one based on extrapolation of damping trends, one that involves an envelope function, one that involves the Zimmerman-Weissenburger flutter margin, and one that involves a discrete-time auto-regressive model. An example of a model-based approach is that of the flutterometer. These methods have all been shown to be theoretically valid and have been demonstrated on simple test cases; however, until now, they have not been thoroughly evaluated in flight tests. An experimental apparatus called the Aerostructures Test Wing (ATW) was developed to test these prediction methods.
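
    Of the listed methods, damping-trend extrapolation is simple enough to sketch: fit the identified modal damping against airspeed and extrapolate to the zero-damping crossing. The data points and quadratic trend below are illustrative assumptions, not ATW flight results.

```python
import numpy as np

# Hypothetical modal damping ratios identified at increasing test airspeeds.
speed = np.array([120.0, 140.0, 160.0, 180.0, 200.0])   # knots
zeta = np.array([0.045, 0.039, 0.031, 0.021, 0.010])    # damping ratio

# Fit a quadratic damping trend and extrapolate to zeta = 0 (flutter onset).
coeffs = np.polyfit(speed, zeta, deg=2)
real_roots = [r.real for r in np.roots(coeffs) if abs(r.imag) < 1e-9]
flutter_speed = min(r for r in real_roots if r > speed[-1])
print(f"predicted flutter onset near {flutter_speed:.0f} knots")
```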

  2. Scene-aware joint global and local homographic video coding

    NASA Astrophysics Data System (ADS)

    Peng, Xiulian; Xu, Jizheng; Sullivan, Gary J.

    2016-09-01

    Perspective motion is commonly represented in video content that is captured and compressed for various applications including cloud gaming, vehicle and aerial monitoring, etc. Existing approaches based on an eight-parameter homography motion model cannot deal with this efficiently, either due to low prediction accuracy or excessive bit rate overhead. In this paper, we consider the camera motion model and scene structure in such video content and propose a joint global and local homography motion coding approach for video with perspective motion. The camera motion is estimated by a computer vision approach, and camera intrinsic and extrinsic parameters are globally coded at the frame level. The scene is modeled as piece-wise planes, and three plane parameters are coded at the block level. Fast gradient-based approaches are employed to search for the plane parameters for each block region. In this way, improved prediction accuracy and low bit costs are achieved. Experimental results based on the HEVC test model show that up to 9.1% bit rate savings can be achieved (with equal PSNR quality) on test video content with perspective motion. Test sequences for the example applications showed a bit rate savings ranging from 3.7 to 9.1%.
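
    The frame-level/block-level split echoes the classical plane-induced homography, which the sketch below assembles from globally coded camera parameters and per-block plane parameters; the intrinsics, motion and plane values are hypothetical, and everything around the motion model (prediction, residual coding) is omitted.

```python
import numpy as np

def plane_homography(K, R, t, n, d):
    """Plane-induced homography H = K (R - t n^T / d) K^{-1}.
    K: 3x3 intrinsics; (R, t): frame-level (globally coded) camera motion;
    (n, d): unit normal and distance of the block-level scene plane."""
    return K @ (R - np.outer(t, n) / d) @ np.linalg.inv(K)

K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)                        # hypothetical pure-translation frame
t = np.array([0.1, 0.0, 0.0])
H = plane_homography(K, R, t, n=np.array([0.0, 0.0, 1.0]), d=5.0)

# Warp one pixel: homogeneous coordinates, then dehomogenise.
p = H @ np.array([100.0, 120.0, 1.0])
print(p[:2] / p[2])
```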

  3. Predicting protein interactions by Brownian dynamics simulations.

    PubMed

    Meng, Xuan-Yu; Xu, Yu; Zhang, Hong-Xing; Mezei, Mihaly; Cui, Meng

    2012-01-01

    We present a newly adapted Brownian-Dynamics (BD)-based protein docking method for predicting native protein complexes. The approach includes global BD conformational sampling, compact complex selection, and local energy minimization. In order to reduce the computational costs for energy evaluations, a shell-based grid force field was developed to represent the receptor protein and solvation effects. The performance of this BD protein docking approach has been evaluated on a test set of 24 crystal protein complexes. Reproduction of experimental structures in the test set indicates the adequate conformational sampling and accurate scoring of this BD protein docking approach. Furthermore, we have developed an approach to account for the flexibility of proteins, which has been successfully applied to reproduce the experimental complex structure from the structure of two unbounded proteins. These results indicate that this adapted BD protein docking approach can be useful for the prediction of protein-protein interactions.

  4. Automated glioblastoma segmentation based on a multiparametric structured unsupervised classification.

    PubMed

    Juan-Albarracín, Javier; Fuster-Garcia, Elies; Manjón, José V; Robles, Montserrat; Aparici, F; Martí-Bonmatí, L; García-Gómez, Juan M

    2015-01-01

    Automatic brain tumour segmentation has become a key component for the future of brain tumour treatment. Currently, most brain tumour segmentation approaches arise from the supervised learning standpoint, which requires a labelled training dataset from which to infer the models of the classes. The performance of these models is directly determined by the size and quality of the training corpus, whose retrieval becomes a tedious and time-consuming task. On the other hand, unsupervised approaches avoid these limitations but often do not reach results comparable to those of the supervised methods. In this sense, we propose an automated unsupervised method for brain tumour segmentation based on anatomical Magnetic Resonance (MR) images. Four unsupervised classification algorithms, grouped by their structured or non-structured condition, were evaluated within our pipeline. Considering the non-structured algorithms, we evaluated K-means, Fuzzy K-means and the Gaussian Mixture Model (GMM), whereas as a structured classification algorithm we evaluated the Gaussian Hidden Markov Random Field (GHMRF). An automated postprocess based on a statistical approach supported by tissue probability maps is proposed to automatically identify the tumour classes after the segmentations. We evaluated our brain tumour segmentation method with the public BRAin Tumor Segmentation (BRATS) 2013 Test and Leaderboard datasets. Our approach based on the GMM model improves on the results obtained by most of the supervised methods evaluated with the Leaderboard set and reaches the second position in the ranking. Our variant based on the GHMRF achieves the first position in the Test ranking of the unsupervised approaches and the seventh position in the general Test ranking, which confirms the method as a viable alternative for brain tumour segmentation.
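
    A minimal sketch of the unsupervised core, assuming scikit-learn and random stand-in data: fit a Gaussian Mixture Model to multiparametric voxel feature vectors and read off one class label per voxel. The GHMRF variant and the tissue-probability-map postprocess described above are not reproduced here.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Hypothetical multiparametric data: each voxel is a feature vector of
# co-registered MR intensities (e.g. T1, T1c, T2, FLAIR).
rng = np.random.default_rng(1)
voxels = rng.normal(size=(10000, 4))          # stand-in for real MR volumes

gmm = GaussianMixture(n_components=5, covariance_type="full", random_state=0)
labels = gmm.fit_predict(voxels)              # one tissue class per voxel

# In the paper, a postprocess driven by tissue probability maps then decides
# which of the unsupervised classes correspond to tumour compartments.
print(np.bincount(labels))
```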

  5. Differentiating between rights-based and relational ethical approaches.

    PubMed

    Trobec, Irena; Herbst, Majda; Zvanut, Bostjan

    2009-05-01

    When forced treatment in mental health care is under consideration, two approaches guide clinicians in their actions: the dominant rights-based approach and the relational ethical approach. We hypothesized that nurses with bachelor's degrees differentiate better between the two approaches than nurses without a degree. To test this hypothesis a survey was performed in major Slovenian health institutions. We found that nurses emphasize the importance of ethics and personal values, but 55.4% of all the nurse participants confused the two approaches. The results confirmed our hypothesis and indicate the importance of nurses' formal education, especially when caring for patients with mental illness.

  6. Nanomaterial Toxicity Testing in the 21st Century: Use of a Predictive Toxicological Approach and High Throughput Screening

    PubMed Central

    NEL, ANDRE; XIA, TIAN; MENG, HUAN; WANG, XIANG; LIN, SIJIE; JI, ZHAOXIA; ZHANG, HAIYUAN

    2014-01-01

    Conspectus: The production of engineered nanomaterials (ENMs) is a scientific breakthrough in material design and the development of new consumer products. While the successful implementation of nanotechnology is important for the growth of the global economy, we also need to consider the possible environmental health and safety (EHS) impact as a result of the novel physicochemical properties that could generate hazardous biological outcomes. In order to assess ENM hazard, reliable and reproducible screening approaches are needed to test the basic materials as well as nano-enabled products. A platform is required to investigate the potentially endless number of bio-physicochemical interactions at the nano/bio interface, in response to which we have developed a predictive toxicological approach. We define a predictive toxicological approach as the use of mechanisms-based high throughput screening in vitro to make predictions about the physicochemical properties of ENMs that may lead to the generation of pathology or disease outcomes in vivo. The in vivo results are used to validate and improve the in vitro high throughput screening (HTS) and to establish structure-activity relationships (SARs) that allow hazard ranking and modeling by an appropriate combination of in vitro and in vivo testing. This notion is in agreement with the landmark 2007 report from the US National Academy of Sciences, “Toxicity Testing in the 21st Century: A Vision and a Strategy” (http://www.nap.edu/catalog.php?record_id=11970), which advocates increased efficiency of toxicity testing by transitioning from qualitative, descriptive animal testing to quantitative, mechanistic and pathway-based toxicity testing in human cells or cell lines using high throughput approaches. Accordingly, we have implemented HTS approaches to screen compositional and combinatorial ENM libraries to develop hazard ranking and structure-activity relationships that can be used for predicting in vivo injury outcomes. This predictive approach allows the bulk of the screening analysis and high volume data generation to be carried out in vitro, following which limited, but critical, validation studies are carried out in animals or whole organisms. Risk reduction in the exposed human or environmental populations can then focus on limiting or avoiding exposures that trigger these toxicological responses as well as implementing safer design of potentially hazardous ENMs. In this communication, we review the tools required for establishing predictive toxicology paradigms to assess inhalation and environmental toxicological scenarios through the use of compositional and combinatorial ENM libraries, mechanism-based HTS assays, hazard ranking and development of nano-SARs. We will discuss the major injury paradigms that have emerged based on specific ENM properties, as well as describe the safer design of ZnO nanoparticles based on characterization of dissolution chemistry as a major predictor of toxicity. PMID:22676423

  7. Identity Recognition Algorithm Using Improved Gabor Feature Selection of Gait Energy Image

    NASA Astrophysics Data System (ADS)

    Chao, LIANG; Ling-yao, JIA; Dong-cheng, SHI

    2017-01-01

    This paper describes an effective gait recognition approach based on Gabor features of the gait energy image. Kernel Fisher analysis combined with a kernel matrix is proposed to select dominant features, and a nearest neighbor classifier based on whitened cosine distance is used to discriminate different gait patterns. The proposed approach is tested on the CASIA and USF gait databases. The results show that it outperforms other state-of-the-art gait recognition approaches in terms of recognition accuracy and robustness.
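
    A sketch of the classifier stage only, under the assumption that "whitened cosine distance" means cosine distance taken after whitening with the training covariance; the Gabor feature extraction and kernel Fisher selection steps are omitted and the feature vectors are random stand-ins.

```python
import numpy as np

def whitened_cosine_nn(train_x, train_y, test_x, eps=1e-6):
    """1-NN with whitened cosine distance: whiten with the training
    covariance, then compare feature directions."""
    mean = train_x.mean(axis=0)
    cov = np.cov(train_x - mean, rowvar=False) + eps * np.eye(train_x.shape[1])
    vals, vecs = np.linalg.eigh(cov)
    whiten = vecs / np.sqrt(vals)               # W = V diag(1/sqrt(lambda))
    tr = (train_x - mean) @ whiten
    te = (test_x - mean) @ whiten
    tr /= np.linalg.norm(tr, axis=1, keepdims=True)
    te /= np.linalg.norm(te, axis=1, keepdims=True)
    dist = 1.0 - te @ tr.T                      # cosine distance, whitened space
    return train_y[np.argmin(dist, axis=1)]

# Toy usage with random "Gabor feature" vectors.
rng = np.random.default_rng(0)
X, y = rng.normal(size=(20, 8)), np.arange(20) % 4
print(whitened_cosine_nn(X, y, X[:3]))
```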

  8. Huntington's disease predictive testing: the case for an assessment approach to requests from adolescents.

    PubMed Central

    Binedell, J; Soldan, J R; Scourfield, J; Harper, P S

    1996-01-01

    Adolescents who are actively requesting Huntington's predictive testing of their own accord pose a dilemma to those providing testing. In the absence of empirical evidence as regards the impact of genetic testing on minors, current policy and guidelines, based on the ethical principles of non-maleficence and respect for individual autonomy and confidentiality, generally exclude the testing of minors. It is argued that adherence to an age based exclusion criterion in Huntington's disease predictive testing protocols is out of step with trends in UK case law concerning minors' consent to medical treatment. Furthermore, contributions from developmental psychology and research into adolescents' decision making competence suggest that adolescents can make informed choices about their health and personal lives. Criteria for developing an assessment approach to such requests are put forward and the implications of a case by case evaluation of competence to consent in terms of clinicians' tolerance for uncertainty are discussed. PMID:8950670

  9. Test Scheduling for Core-Based SOCs Using Genetic Algorithm Based Heuristic Approach

    NASA Astrophysics Data System (ADS)

    Giri, Chandan; Sarkar, Soumojit; Chattopadhyay, Santanu

    This paper presents a Genetic Algorithm (GA) based solution to co-optimize test scheduling and wrapper design for core-based SOCs. Core testing solutions are generated as a set of wrapper configurations, represented as rectangles with width equal to the number of TAM (Test Access Mechanism) channels and height equal to the corresponding testing time. A locally optimal best-fit bin-packing heuristic is used to place the rectangles so as to minimize the overall test time, while the GA generates the sequence of rectangles to be considered for placement. Experimental results on the ITC'02 benchmark SOCs show that the proposed method provides better solutions than recent works reported in the literature.
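
    The inner heuristic admits a much-simplified sketch: rectangles (TAM width, test time) are placed shelf-wise by best fit, and the GA, omitted here, would evolve the input order to minimise the returned overall test time; the core list and the shelf model of the TAM strip are illustrative assumptions.

```python
def shelf_best_fit(rects, tam_width):
    """Place (width, time) rectangles on shelves of the TAM strip.
    Best fit: put each core on the shelf leaving the least spare width;
    open a new shelf if none fits. Returns the overall test time
    (total strip height). A simplified stand-in for the paper's
    best-fit bin-packing step; the GA supplies the order of `rects`."""
    shelves = []                         # each shelf: [spare_width, height]
    for w, t in rects:
        fits = [s for s in shelves if s[0] >= w]
        if fits:
            shelf = min(fits, key=lambda s: s[0] - w)   # best fit
            shelf[0] -= w
            shelf[1] = max(shelf[1], t)
        else:
            shelves.append([tam_width - w, t])
    return sum(s[1] for s in shelves)

# Hypothetical cores: (TAM channels needed, test time). A GA would evolve
# permutations of this list and keep the one minimising the returned time.
cores = [(16, 300), (8, 450), (8, 120), (24, 200), (16, 150)]
print(shelf_best_fit(cores, tam_width=32))
```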

  10. MATTS- A Step Towards Model Based Testing

    NASA Astrophysics Data System (ADS)

    Herpel, H.-J.; Willich, G.; Li, J.; Xie, J.; Johansen, B.; Kvinnesland, K.; Krueger, S.; Barrios, P.

    2016-08-01

    In this paper we describe a model-based approach to testing of on-board software and compare it with the traditional validation strategy currently applied to satellite software. Among the major problems that software engineering will face over at least the next two decades are increasing application complexity, driven by the need for autonomy, and serious demands on application robustness. In other words, how do we actually get to declare success when trying to build applications one or two orders of magnitude more complex than today's applications? To solve these problems the software engineering process has to be improved in at least two respects: 1) software design and 2) software testing. The software design process has to evolve towards model-based approaches with extensive use of code generators. Today, testing is an essential, but time- and resource-consuming activity in the software development process. Generating a short but effective test suite usually requires a lot of manual work and expert knowledge. In a model-based process, among other subtasks, test construction and test execution can be partially automated. The basic idea behind the presented study was to start from a formal model (e.g. state machines), generate abstract test cases, and then convert them to concrete executable test cases (input and expected output pairs). The generated concrete test cases were applied to on-board software. Results were collected and evaluated with respect to applicability, cost-efficiency, effectiveness at fault finding, and scalability.
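
    The generation step can be sketched on a toy scale: from a hypothetical state-machine model of an on-board mode manager, derive one abstract test case per transition as a sequence of input/expected-output pairs; a real model-based testing toolchain is of course far richer.

```python
from collections import deque

# Hypothetical Mealy-style model of a tiny on-board mode manager:
# (state, input) -> (next_state, expected_output)
fsm = {
    ("SAFE", "arm"):     ("READY", "ack_arm"),
    ("READY", "go"):     ("OPERATE", "ack_go"),
    ("READY", "disarm"): ("SAFE", "ack_disarm"),
    ("OPERATE", "stop"): ("SAFE", "ack_stop"),
}

def transition_tour(start="SAFE"):
    """Breadth-first search for input sequences covering every transition;
    each sequence is an abstract test case (inputs plus expected outputs)."""
    tests = []
    for (state, inp), (nxt, out) in fsm.items():
        # Find a path from `start` to `state`, then append the target transition.
        queue, seen = deque([(start, [])]), {start}
        while queue:
            s, path = queue.popleft()
            if s == state:
                tests.append(path + [(inp, out)])
                break
            for (s2, i2), (n2, o2) in fsm.items():
                if s2 == s and n2 not in seen:
                    seen.add(n2)
                    queue.append((n2, path + [(i2, o2)]))
    return tests

for case in transition_tour():
    print(case)
```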

  11. A Multi-Level Approach for Promoting HIV Testing Within African American Church Settings

    PubMed Central

    2015-01-01

    The African American church is a community-based organization that is integral to the lives, beliefs, and behaviors of the African American community. Engaging this vital institution as a primary setting for HIV testing and referral would significantly impact the epidemic. The disproportionately high HIV incidence rate among African Americans dictates the national priority for promotion of early and routine HIV testing, and suggests engaging community-based organizations in this endeavor. However, few multilevel HIV testing frameworks have been developed, tested, and evaluated within the African American church. This article proposes one such framework for promoting HIV testing and referral within African American churches. A qualitative study was employed to examine the perceptions, beliefs, knowledge, and behaviors related to involvement in church-based HIV testing. A total of four focus groups with church leaders and four in-depth interviews with pastors were conducted between November 2012 and June 2013 to identify the constructs most important to supporting Philadelphia churches' involvement in HIV testing, referral, and linkage to care. The data generated from this study were analyzed using a grounded theory approach and used to develop and refine a multilevel framework for identifying factors impacting church-based HIV testing and referral, and ultimately to support capacity building among African American churches to promote HIV testing and linkage to care. PMID:25682887

  12. Experimental issues related to frequency response function measurements for frequency-based substructuring

    NASA Astrophysics Data System (ADS)

    Nicgorski, Dana; Avitabile, Peter

    2010-07-01

    Frequency-based substructuring is a very popular approach for the generation of system models from component measured data. Analytically, the approach has been shown to produce accurate results. However, implementation with actual test data can be difficult and can degrade the system response prediction. In order to produce good results, extreme care is needed in the measurement of the drive point and transfer impedances of the structure, and all the conditions for a linear time-invariant system must be observed. Several studies have been conducted to show the sensitivity of the technique to the small variations that often occur during typical testing of structures. These variations have been observed in actual tested configurations and have been substantiated with analytical models built to replicate the problems typically encountered. The use of analytically simulated issues helps to clearly see the effects of the measurement difficulties often observed in test data. This paper presents some of these common problems and provides guidance and recommendations for data to be used in this modeling approach.

  13. A Model-Based Method for Content Validation of Automatically Generated Test Items

    ERIC Educational Resources Information Center

    Zhang, Xinxin; Gierl, Mark

    2016-01-01

    The purpose of this study is to describe a methodology to recover the item model used to generate multiple-choice test items with a novel graph theory approach. Beginning with the generated test items and working backward to recover the original item model provides a model-based method for validating the content used to automatically generate test…

  14. Optimizing point-of-care testing in clinical systems management.

    PubMed

    Kost, G J

    1998-01-01

    The goal of improving medical and economic outcomes calls for leadership based on fundamental principles. The manager of clinical systems works collaboratively within the acute care center to optimize point-of-care testing through systematic approaches such as integrative strategies, algorithms, and performance maps. These approaches are effective and efficacious for critically ill patients. Optimizing point-of-care testing throughout the entire health-care system is inherently more difficult. There is potential to achieve high-quality testing, integrated disease management, and equitable health-care delivery. Despite rapid change and economic uncertainty, a macro-strategic, information-integrated, feedback-systems, outcomes-oriented approach is timely, challenging, effective, and uplifting to the creative human spirit.

  15. MS lesion segmentation using a multi-channel patch-based approach with spatial consistency

    NASA Astrophysics Data System (ADS)

    Mechrez, Roey; Goldberger, Jacob; Greenspan, Hayit

    2015-03-01

    This paper presents an automatic method for segmentation of Multiple Sclerosis (MS) in Magnetic Resonance Images (MRI) of the brain. The approach is based on similarities between multi-channel patches (T1, T2 and FLAIR). An MS lesion patch database is built using training images for which the label maps are known. For each patch in the testing image, k similar patches are retrieved from the database. The matching labels for these k patches are then combined to produce an initial segmentation map for the test case. Finally a novel iterative patch-based label refinement process based on the initial segmentation map is performed to ensure spatial consistency of the detected lesions. A leave-one-out evaluation is done for each testing image in the MS lesion segmentation challenge of MICCAI 2008. Results are shown to compete with the state-of-the-art methods on the MICCAI 2008 challenge.
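
    A sketch of the retrieval step, assuming flattened multi-channel patches and plain L2 similarity: the k most similar database patches vote on each test patch's initial lesion probability. The iterative spatial-consistency refinement is not reproduced, and all data below are random stand-ins.

```python
import numpy as np

def patch_label_fusion(test_patches, db_patches, db_labels, k=5):
    """For each multi-channel test patch (flattened T1/T2/FLAIR
    neighbourhood), retrieve the k most similar database patches
    (L2 distance) and average their lesion labels into an initial
    probability map. Retrieval step only."""
    d2 = ((test_patches[:, None, :] - db_patches[None, :, :]) ** 2).sum(-1)
    knn = np.argsort(d2, axis=1)[:, :k]
    return db_labels[knn].mean(axis=1)         # fraction of lesion votes

rng = np.random.default_rng(0)
db = rng.normal(size=(500, 3 * 27))            # hypothetical 3-channel 3x3x3 patches
labels = (rng.uniform(size=500) < 0.2).astype(float)
test = rng.normal(size=(4, 3 * 27))
print(patch_label_fusion(test, db, labels))
```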

  16. An investigation into the effectiveness of problem-based learning in a physical chemistry laboratory course

    NASA Astrophysics Data System (ADS)

    Gürses, Ahmet; Açıkyıldız, Metin; Doğar, Çetin; Sözbilir, Mustafa

    2007-04-01

    The aim of this study was to investigate the effectiveness of a problem-based learning (PBL) approach in a physical chemistry laboratory course. The parameters investigated were students’ attitudes towards a chemistry laboratory course, students’ scientific process skills, and their academic achievement. The study used a one-group pre-test/post-test design. Four experiments, covering the topics adsorption, viscosity, surface tension and conductivity, were performed using a PBL approach in the fall semester of the 2003/04 academic year at Kazim Karabekir Education Faculty of Atatürk University. Each experiment was done over a three-week period. A total of 40 students, 18 male and 22 female, participated in the study. Students took the Physical Chemistry Laboratory Concept Test (PCLCT), Attitudes towards Chemistry Laboratory (ATCL) questionnaire and Science Process Skills Test (SPST) as pre- and post-tests. In addition, the effectiveness of the PBL approach was also determined through four different scales: Scales Specific to Students’ Views of PBL. A statistically significant difference between the students’ academic achievement and scientific process skills at p

  17. Glossary of reference terms for alternative test methods and their validation.

    PubMed

    Ferrario, Daniele; Brustio, Roberta; Hartung, Thomas

    2014-01-01

    This glossary was developed to provide technical references to support work in the field of alternatives to animal testing. It was compiled from various existing reference documents coming from different sources and is meant to be a point of reference on alternatives to animal testing. Given the ever-increasing number of alternative test methods and approaches being developed over the last decades, a combination, revision, and harmonization of earlier published collections of terms used in the validation of such methods is required. The need to update previous glossary efforts came from the acknowledgement that new words have emerged with the development of new approaches, while others have become obsolete, and the meaning of some terms has partially changed over time. With this glossary we intend to provide guidance on issues related to the validation of new or updated testing methods consistent with current approaches. Moreover, because of new developments and technologies, a glossary needs to be a living, constantly updated document. An Internet-based version of this compilation may be found at http://altweb.jhsph.edu/, allowing the addition of new material.

  18. Nonlinear viscoelastic characterization of polymer materials using a dynamic-mechanical methodology

    NASA Technical Reports Server (NTRS)

    Strganac, Thomas W.; Payne, Debbie Flowers; Biskup, Bruce A.; Letton, Alan

    1995-01-01

    Polymer materials retrieved from LDEF exhibit nonlinear constitutive behavior; thus the authors present a method to characterize nonlinear viscoelastic behavior using measurements from dynamic (oscillatory) mechanical tests. Frequency-derived measurements are transformed into time-domain properties providing the capability to predict long term material performance without a lengthy experimentation program. Results are presented for thin-film high-performance polymer materials used in the fabrication of high-altitude scientific balloons. Predictions based upon a linear test and analysis approach are shown to deteriorate for moderate to high stress levels expected for extended applications. Tests verify that nonlinear viscoelastic response is induced by large stresses. Hence, an approach is developed in which the stress-dependent behavior is examined in a manner analogous to modeling temperature-dependent behavior with time-temperature correspondence and superposition principles. The development leads to time-stress correspondence and superposition of measurements obtained through dynamic mechanical tests. Predictions of material behavior using measurements based upon linear and nonlinear approaches are compared with experimental results obtained from traditional creep tests. Excellent agreement is shown for the nonlinear model.

  19. A new impedance based approach to test the activity of recombinant protein--Semaphorins as a test case.

    PubMed

    Birger, Anastasya; Besser, Elazar; Reubinoff, Benjamin; Behar, Oded

    2015-10-01

    The biological activity of a recombinant protein is routinely measured using a bioassay such as an enzyme assay. However, many proteins have no enzymatic activity, and in many cases it is difficult to devise a simple and reliable approach to test their activity. Semaphorins, Ephrins, Slits, Netrins and myelin-associated proteins have numerous activities affecting many systems and cell types in the human body. Most of them are also able to induce rapid cytoskeleton changes in at least some cell types. We assumed, therefore, that such proteins might be tested based on their ability to modulate the cytoskeleton. Here we tested a number of semaphorins in an impedance-based label-free platform that allows dynamic monitoring of subtle morphological and adhesive changes. This system proved to be a very fast, sensitive and effective way to monitor and determine the activity of such proteins. Furthermore, we showed that it is possible to customize a cell-protein system by transfecting the cells with specific receptors and testing the cell response following addition of the recombinant ligand protein. Since other protein families such as Ephrins and Netrins can also influence the cytoskeleton of some cells, this approach may be applicable to a large number of proteins. Copyright © 2015 Elsevier GmbH. All rights reserved.

  20. A Model for Applying Lexical Approach in Teaching Russian Grammar.

    ERIC Educational Resources Information Center

    Gettys, Serafima

    The lexical approach to teaching Russian grammar is explained, an instructional sequence is outlined, and a classroom study testing the effectiveness of the approach is reported. The lexical approach draws on research on cognitive psychology, second language acquisition theory, and research on learner language. Its bases in research and its…

  1. Assessing the Humanities in the Primary School Using a Portfolio-Based Approach

    ERIC Educational Resources Information Center

    Eaude, Tony

    2017-01-01

    This article suggests that a portfolio-based approach to assessing the humanities in the primary school is appropriate and outlines what this might involve. It argues for a broad interpretation of "the humanities" and for adopting principles associated with formative assessment, where assessment is not equated with testing and a wide…

  2. Effects of Problem Based Economics on High School Economics Instruction

    ERIC Educational Resources Information Center

    Finkelstein, Neal; Hanson, Thomas

    2011-01-01

    The primary purpose of this study is to assess student-level impacts of a problem-based instructional approach to high school economics. The curriculum approach examined here was designed to increase class participation and content knowledge for high school students who are learning economics. This study tests the effectiveness of Problem Based…

  3. Responsiveness of performance-based outcome measures for mobility, balance, muscle strength and manual dexterity in adults with myotonic dystrophy type 1.

    PubMed

    Kierkegaard, Marie; Petitclerc, Émilie; Hébert, Luc J; Mathieu, Jean; Gagnon, Cynthia

    2018-02-28

    To assess changes and responsiveness in outcome measures of mobility, balance, muscle strength and manual dexterity in adults with myotonic dystrophy type 1. A 9-year longitudinal study conducted with 113 patients. The responsiveness of the Timed Up and Go test, Berg Balance Scale, quantitative muscle testing, grip and pinch-grip strength, and Purdue Pegboard Test was assessed using criterion and construct approaches. Patient-reported perceived changes (worse/stable) in balance, walking, lower-limb weakness, stair-climbing and hand weakness were used as criteria. Predefined hypotheses about expected area under the receiver operating characteristic curves (criterion approach) and correlations between relative changes (construct approach) were explored. The direction and magnitude of median changes in outcome measures corresponded with patient-reported changes. Median changes in the Timed Up and Go test, grip strength, pinch-grip strength and Purdue Pegboard Test did not, in general, exceed known measurement errors. Most criterion (72%) and construct (70%) approach hypotheses were supported. Promising responsiveness was found for outcome measures of mobility, balance and muscle strength. Grip strength and manual dexterity measures showed poorer responsiveness. The performance-based outcome measures captured changes over the 9-year period and responsiveness was promising. Knowledge of measurement errors is needed to interpret the meaning of these longitudinal changes.

  4. Choice-Based Conjoint Analysis: Classification vs. Discrete Choice Models

    NASA Astrophysics Data System (ADS)

    Giesen, Joachim; Mueller, Klaus; Taneva, Bilyana; Zolliker, Peter

    Conjoint analysis is a family of techniques that originated in psychology and later became popular in market research. The main objective of conjoint analysis is to measure an individual's or a population's preferences on a class of options that can be described by parameters and their levels. We consider preference data obtained in choice-based conjoint analysis studies, where one observes test persons' choices on small subsets of the options. There are many ways to analyze choice-based conjoint analysis data. Here we discuss the intuition behind a classification-based approach, and compare it to an approach based on statistical assumptions (discrete choice models) and to a regression approach. Our comparison on real and synthetic data indicates that the classification approach outperforms the discrete choice models.
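
    For readers less familiar with the discrete-choice side of the comparison, a minimal multinomial logit sketch: each option in a choice set receives a linear utility and a softmax choice probability. The attribute coding and part-worth values are hypothetical, and their estimation from observed choices is omitted.

```python
import numpy as np

def mnl_choice_probs(options, beta):
    """Multinomial logit: utility is linear in the part-worths beta;
    choice probabilities are the softmax of utilities over the choice set."""
    u = options @ beta
    e = np.exp(u - u.max())                    # numerically stable softmax
    return e / e.sum()

# Hypothetical choice set: rows are options, columns are dummy-coded
# attribute levels; beta holds the part-worth utilities, which would be
# estimated from observed choices in a real study.
options = np.array([[1.0, 0.0, 1.0],
                    [0.0, 1.0, 0.0],
                    [1.0, 1.0, 0.0]])
beta = np.array([0.8, -0.3, 1.2])
print(mnl_choice_probs(options, beta))
```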

  5. A Statistical Approach for Testing Cross-Phenotype Effects of Rare Variants

    PubMed Central

    Broadaway, K. Alaine; Cutler, David J.; Duncan, Richard; Moore, Jacob L.; Ware, Erin B.; Jhun, Min A.; Bielak, Lawrence F.; Zhao, Wei; Smith, Jennifer A.; Peyser, Patricia A.; Kardia, Sharon L.R.; Ghosh, Debashis; Epstein, Michael P.

    2016-01-01

    Increasing empirical evidence suggests that many genetic variants influence multiple distinct phenotypes. When cross-phenotype effects exist, multivariate association methods that consider pleiotropy are often more powerful than univariate methods that model each phenotype separately. Although several statistical approaches exist for testing cross-phenotype effects for common variants, there is a lack of similar tests for gene-based analysis of rare variants. In order to fill this important gap, we introduce a statistical method for cross-phenotype analysis of rare variants using a nonparametric distance-covariance approach that compares similarity in multivariate phenotypes to similarity in rare-variant genotypes across a gene. The approach can accommodate both binary and continuous phenotypes and further can adjust for covariates. Our approach yields a closed-form test whose significance can be evaluated analytically, thereby improving computational efficiency and permitting application on a genome-wide scale. We use simulated data to demonstrate that our method, which we refer to as the Gene Association with Multiple Traits (GAMuT) test, provides increased power over competing approaches. We also illustrate our approach using exome-chip data from the Genetic Epidemiology Network of Arteriopathy. PMID:26942286
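
    The similarity-comparison idea can be sketched with a plain squared sample distance covariance between phenotype and genotype distance matrices; GAMuT's actual kernels, covariate adjustment and closed-form p-value machinery are not reproduced here, and the data are simulated stand-ins.

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform

def dcenter(D):
    """Double-centre a distance matrix (U-centring omitted for brevity)."""
    return D - D.mean(0) - D.mean(1)[:, None] + D.mean()

def dcov2(P, G):
    """Squared sample distance covariance between multivariate phenotypes
    P (n x p) and rare-variant genotypes G (n x q): the flavour of
    similarity comparison underlying the test described above."""
    A = dcenter(squareform(pdist(P)))
    B = dcenter(squareform(pdist(G)))
    return (A * B).mean()

rng = np.random.default_rng(6)
G = (rng.uniform(size=(200, 10)) < 0.02).astype(float)  # rare-variant genotypes
P = G @ rng.normal(size=(10, 3)) * 0.5 + rng.normal(size=(200, 3))
print(dcov2(P, G))
```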

  6. Mapping the Human Toxome by Systems Toxicology

    PubMed Central

    Bouhifd, Mounir; Hogberg, Helena T.; Kleensang, Andre; Maertens, Alexandra; Zhao, Liang; Hartung, Thomas

    2014-01-01

    Toxicity testing typically involves studying adverse health outcomes in animals subjected to high doses of toxicants with subsequent extrapolation to expected human responses at lower doses. The low throughput of current toxicity testing approaches (which are largely the same for industrial chemicals, pesticides and drugs) has led to a backlog of more than 80,000 chemicals to which human beings are potentially exposed whose potential toxicity remains largely unknown. By employing new testing strategies that use predictive, high-throughput cell-based assays (of human origin) to evaluate perturbations in key pathways, referred to as pathways of toxicity, and to conduct targeted testing against those pathways, we can begin to greatly accelerate our ability to test the vast “storehouses” of chemical compounds using a rational, risk-based approach to chemical prioritization, and to provide test results that are more predictive of human toxicity than current methods. The NIH Transformative Research Grant project Mapping the Human Toxome by Systems Toxicology aims at developing the tools for pathway mapping, annotation and validation as well as the respective knowledge base to share this information. PMID:24443875

  7. Measurement of Residual Flexibility for Substructures Having Prominent Flexible Interfaces

    NASA Technical Reports Server (NTRS)

    Tinker, Michael L.; Bookout, Paul S.

    1994-01-01

    Verification of a dynamic model of a constrained structure requires a modal survey test of the physical structure and subsequent modification of the model to obtain the best agreement possible with test data. Constrained-boundary or fixed-base testing has historically been the most common approach for verifying constrained mathematical models, since the boundary conditions of the test article are designed to match the actual constraints in service. However, there are difficulties involved with fixed-base testing, in some cases making the approach impractical. It is not possible to conduct a truly fixed-base test due to coupling between the test article and the fixture. In addition, it is often difficult to accurately simulate the actual boundary constraints, and the cost of designing and constructing the fixture may be prohibitive. For use when fixed-base testing proves impractical or undesirable, alternate free-boundary test methods have been investigated, including the residual flexibility technique. The residual flexibility approach has been treated analytically in considerable detail, but there has been only limited experience with the frequency response measurements required by the method, reflecting concern about the difficulty of such measurements. This concern is well-justified for a number of reasons. First, residual flexibilities are very small numbers, typically on the order of 1.0E-6 in/lb for translational diagonal terms, and orders of magnitude smaller for off-diagonal values. This poses difficulty in obtaining accurate and noise-free measurements, especially for points removed from the excitation source. A second difficulty encountered in residual measurements lies in obtaining a clean residual function in the process of subtracting synthesized modal data from a measured response function. Inaccuracies occur since modes are not subtracted exactly, but only to the accuracy of the curve fits for each mode; these errors are compounded with increasing distance from the excitation point. In this paper, the residual flexibility method is applied to a simple structure in both test and analysis. Measured and predicted residual functions are compared, and regions of poor data in the measured curves are described. It is found that for accurate residual measurements, frequency response functions having prominent stiffness lines in the acceleration/force format are needed; the lack of such stiffness lines increases measurement errors. Interface drive-point frequency response functions for shuttle orbiter payloads exhibit dominant stiffness lines, making the residual test approach a good candidate for payload modal tests when constrained tests are inappropriate. Difficulties in extracting a residual flexibility value from noisy test data are discussed. It is shown that use of a weighted second-order least-squares curve fit of the measured residual function allows identification of residual flexibility values that compare very well with predictions for the simple structure. This approach also provides an estimate of second-order residual mass effects.
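
    The extraction step described at the end lends itself to a short sketch: a weighted second-order least-squares fit of a residual function in acceleration/force format, whose quadratic coefficient gives the residual flexibility and whose constant term reflects residual mass effects. The basis and weighting are assumptions consistent with the description, and the data are synthetic.

```python
import numpy as np

def fit_residual_flexibility(omega, h_res, weight=None):
    """Weighted least-squares fit of a measured residual function in
    acceleration/force format: h_res(w) ~ -w^2 * G_res + c, where G_res
    is the residual flexibility and c captures residual mass effects.
    The exact basis and weighting used in the paper may differ."""
    if weight is None:
        weight = np.ones_like(omega)
    A = np.column_stack([-omega**2, np.ones_like(omega)])
    W = np.sqrt(weight)[:, None]
    coef, *_ = np.linalg.lstsq(W * A, np.sqrt(weight) * h_res, rcond=None)
    g_res, c = coef
    return g_res, c

# Synthetic check: recover a 1.0E-6 in/lb residual flexibility from noisy data.
rng = np.random.default_rng(2)
w = np.linspace(50, 400, 200) * 2 * np.pi           # rad/s
h = -w**2 * 1.0e-6 + 0.02 + rng.normal(0, 0.5, w.size)
print(fit_residual_flexibility(w, h))
```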

  8. Instruction-Based Approach-Avoidance Effects: Changing Stimulus Evaluation via the Mere Instruction to Approach or Avoid Stimuli.

    PubMed

    Van Dessel, Pieter; De Houwer, Jan; Gast, Anne; Tucker Smith, Colin

    2015-01-01

    Prior research suggests that repeatedly approaching or avoiding a certain stimulus changes the liking of this stimulus. We investigated whether these effects of approach and avoidance training occur also when participants do not perform these actions but are merely instructed about the stimulus-action contingencies. Stimulus evaluations were registered using both implicit (Implicit Association Test and evaluative priming) and explicit measures (valence ratings). Instruction-based approach-avoidance effects were observed for relatively neutral fictitious social groups (i.e., Niffites and Luupites), but not for clearly valenced well-known social groups (i.e., Blacks and Whites). We conclude that instructions to approach or avoid stimuli can provide sufficient bases for establishing both implicit and explicit evaluations of novel stimuli and discuss several possible reasons for why similar instruction-based approach-avoidance effects were not found for valenced well-known stimuli.

  9. Methodological Issues in Achieving School Accountability

    ERIC Educational Resources Information Center

    Linn, Robert L.

    2008-01-01

    Test-based educational accountability is widely used in many countries, but is pervasive in the US. Key features of test-based accountability required by the US No Child Left Behind Act are discussed. Particular attention is given to methodological issues such as the distinction between status and growth approaches, the setting of performance…

  10. Accountability Policies and Teachers' Acceptance and Usage of School Performance Feedback--A Comparative Study

    ERIC Educational Resources Information Center

    Maier, Uwe

    2010-01-01

    This paper implemented a comparative approach to investigate the relationships between test-based school accountability policies in 2 German states and teachers' acceptance and usage of feedback information. Thuringia implemented mandatory tests for secondary schools based on competency modeling and performance data controlled for socioeconomic…

  11. Maxi CAI with a Micro.

    ERIC Educational Resources Information Center

    Gerhold, George; And Others

    This paper describes an effective microprocessor-based CAI system which has been repeatedly tested by a large number of students and edited accordingly. Tasks not suitable for microprocessor based systems (authoring, testing, and debugging) were handled on larger multi-terminal systems. This approach requires that the CAI language used on the…

  12. A Design-Based Approach to Fostering Understanding of Global Climate Change

    ERIC Educational Resources Information Center

    Svihla, Vanessa; Linn, Marcia C.

    2012-01-01

    To prepare students to make informed decisions and gain coherent understanding about global climate change, we tested and refined a middle school inquiry unit that featured interactive visualizations. Based on evidence from student pre-test responses, we increased emphasis on energy transfer and transformation. The first iteration improved…

  13. The Effect of DBAE Approach on Teaching Painting of Undergraduate Art Students

    ERIC Educational Resources Information Center

    Hedayat, Mina; Kahn, Sabzali Musa; Honarvar, Habibeh; Bakar, Syed Alwi Syed Abu; Samsuddin, Mohd Effindi

    2013-01-01

    The aim of this study is to implement a new method of teaching painting which uses the Discipline-Based Art Education (DBAE) approach for the undergraduate art students at Tehran University. In the current study, the quasi-experimental method was used to test the hypothesis three times (pre, mid and post-tests). Thirty students from two classes…

  14. A hardware-in-the-loop simulation program for ground-based radar

    NASA Astrophysics Data System (ADS)

    Lam, Eric P.; Black, Dennis W.; Ebisu, Jason S.; Magallon, Julianna

    2011-06-01

    A radar system built around an embedded computer needs testing, and the way to test an embedded computer system differs from the debugging approaches used on desktop computers. One way to test a radar system is to feed it artificial inputs and analyze the outputs of the radar. Often, not all of the building blocks of the radar system are available for test, which requires the engineer to test parts of the radar system using a "black box" approach. A common way to test software code in a desktop simulation is to use breakpoints so that it pauses after each cycle through its calculations. The outputs are compared against the values that are expected. This requires the engineer to use valid test scenarios. We present a hardware-in-the-loop simulator that allows the embedded system to think it is operating with real-world inputs and outputs. From the embedded system's point of view, it is operating in real-time. The hardware-in-the-loop simulation is based on our Desktop PC Simulation (PCS) testbed. In the past, PCS was used for ground-based radars. This embedded simulation, called Embedded PCS, allows a rapid simulated evaluation of ground-based radar performance in a laboratory environment.

  15. Vibrational multiconfiguration self-consistent field theory: implementation and test calculations.

    PubMed

    Heislbetz, Sandra; Rauhut, Guntram

    2010-03-28

    A state-specific vibrational multiconfiguration self-consistent field (VMCSCF) approach based on a multimode expansion of the potential energy surface is presented for the accurate calculation of anharmonic vibrational spectra. As a special case of this general approach, vibrational complete active space self-consistent field calculations will be discussed. The latter method shows better convergence than the general VMCSCF approach and must be considered the preferred choice within the multiconfigurational framework. Benchmark calculations are provided for a small set of test molecules.

  16. The effect of web quest and team-based learning on students’ self-regulation

    PubMed Central

    BADIYEPEYMAIE JAHROMI, ZOHREH; MOSALANEJAD, LEILI; REZAEE, RITA

    2016-01-01

    Introduction: In this study, the authors aimed to examine the effects of cooperative learning methods using WebQuest and team-based learning on students’ self-direction, self-regulation, and academic achievement. Method: This is a comparative study of students taking a course in mental health and psychiatric disorders. In two consecutive years, one group of students was trained using the WebQuest approach as a teaching strategy (n=38), while the other group was taught using team-based learning (n=39). Data gathering was based on Guglielmino’s self-directed learning readiness scale (SDLRS) and Buford’s self-regulation questionnaire. The data were analyzed by descriptive tests using M (IQR), the Wilcoxon signed-rank test, and the Mann–Whitney U test in SPSS software, version 13. p<0.05 was considered statistically significant. Results: The Mann–Whitney U test showed that the participants’ self-directed (self-management) and self-regulated learning scores differed between the two groups (p=0.04 and p=0.01, respectively). The Wilcoxon test revealed that the self-directed learning indices (self-control and self-management) differed between the two strategies before and after the intervention. However, the scores related to learning (students’ final scores) were higher with the WebQuest approach than with team-based learning. Conclusion: By employing modern educational approaches, students are not only more successful in their studies but also acquire the necessary professional skills for future performance. Further research to compare the effects of new methods of teaching is required. PMID:27104202

  17. Analysis of Slug Tests in Formations of High Hydraulic Conductivity

    USGS Publications Warehouse

    Butler, J.J.; Garnett, E.J.; Healey, J.M.

    2003-01-01

    A new procedure is presented for the analysis of slug tests performed in partially penetrating wells in formations of high hydraulic conductivity. This approach is a simple, spreadsheet-based implementation of existing models that can be used for analysis of tests from confined or unconfined aquifers. Field examples of tests exhibiting oscillatory and nonoscillatory behavior are used to illustrate the procedure and to compare results with estimates obtained using alternative approaches. The procedure is considerably simpler than recently proposed methods for this hydrogeologic setting. Although the simplifications required by the approach can introduce error into hydraulic-conductivity estimates, this additional error becomes negligible when appropriate measures are taken in the field. These measures are summarized in a set of practical field guidelines for slug tests in highly permeable aquifers.
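
    For the oscillatory case, one plausible first step can be sketched, assuming an underdamped linear response: fit a damped cosine to the normalised head record and recover the frequency and damping, which the existing high-conductivity models then relate to hydraulic conductivity; the model form and data below are illustrative.

```python
import numpy as np
from scipy.optimize import curve_fit

def damped_response(t, wd, zeta_wn, phi):
    """Normalised head for an underdamped (oscillatory) slug test:
    w(t)/w0 = exp(-zeta*wn*t) * cos(wd*t + phi)."""
    return np.exp(-zeta_wn * t) * np.cos(wd * t + phi)

# Hypothetical normalised water-level record from a high-K test.
t = np.linspace(0, 10, 200)
rng = np.random.default_rng(3)
data = damped_response(t, 2.0, 0.4, 0.0) + rng.normal(0, 0.02, t.size)

popt, _ = curve_fit(damped_response, t, data, p0=[1.5, 0.3, 0.0])
wd, zeta_wn, phi = popt
print(f"damped frequency {wd:.2f} rad/s, damping parameter {zeta_wn:.2f} 1/s")
# wd and zeta_wn would then be matched to a high-K slug-test model
# to back out the hydraulic conductivity.
```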

  18. Analysis of slug tests in formations of high hydraulic conductivity.

    PubMed

    Butler, James J; Garnett, Elizabeth J; Healey, John M

    2003-01-01

    A new procedure is presented for the analysis of slug tests performed in partially penetrating wells in formations of high hydraulic conductivity. This approach is a simple, spreadsheet-based implementation of existing models that can be used for analysis of tests from confined or unconfined aquifers. Field examples of tests exhibiting oscillatory and nonoscillatory behavior are used to illustrate the procedure and to compare results with estimates obtained using alternative approaches. The procedure is considerably simpler than recently proposed methods for this hydrogeologic setting. Although the simplifications required by the approach can introduce error into hydraulic-conductivity estimates, this additional error becomes negligible when appropriate measures are taken in the field. These measures are summarized in a set of practical field guidelines for slug tests in highly permeable aquifers.

  19. Current-State Constrained Filter Bank for Wald Testing of Spacecraft Conjunctions

    NASA Technical Reports Server (NTRS)

    Carpenter, J. Russell; Markley, F. Landis

    2012-01-01

    We propose a filter bank consisting of an ordinary current-state extended Kalman filter, and two similar but constrained filters: one is constrained by a null hypothesis that the miss distance between two conjuncting spacecraft is inside their combined hard body radius at the predicted time of closest approach, and one is constrained by an alternative complementary hypothesis. The unconstrained filter is the basis of an initial screening for close approaches of interest. Once the initial screening detects a possibly risky conjunction, the unconstrained filter also governs measurement editing for all three filters, and predicts the time of closest approach. The constrained filters operate only when conjunctions of interest occur. The computed likelihoods of the innovations of the two constrained filters form a ratio for a Wald sequential probability ratio test. The Wald test guides risk mitigation maneuver decisions based on explicit false alarm and missed detection criteria. Since only current-state Kalman filtering is required to compute the innovations for the likelihood ratio, the present approach does not require the mapping of probability density forward to the time of closest approach. Instead, the hard-body constraint manifold is mapped to the filter update time by applying a sigma-point transformation to a projection function. Although many projectors are available, we choose one based on Lambert-style differential correction of the current-state velocity. We have tested our method using a scenario based on the Magnetospheric Multi-Scale mission, scheduled for launch in late 2014. This mission involves formation flight in highly elliptical orbits of four spinning spacecraft equipped with antennas extending 120 meters tip-to-tip. Eccentricities range from 0.82 to 0.91, and close approaches generally occur in the vicinity of perigee, where rapid changes in geometry may occur. Testing the method using two 12,000-case Monte Carlo simulations, we found the method achieved a missed detection rate of 0.1%, and a false alarm rate of 2%.
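
    The decision layer is classical enough to sketch: Wald's thresholds follow directly from the chosen false-alarm and missed-detection rates (set here to the 2% and 0.1% figures quoted above), and the running sum of innovation log-likelihood ratios from the two constrained filters is compared against them; the innovation stream below is random stand-in data.

```python
import numpy as np

def wald_sprt(llr_stream, alpha=0.02, beta=0.001):
    """Wald sequential probability ratio test over per-update log-likelihood
    ratios log(L1/L0), where H0 is the 'inside the hard-body radius'
    hypothesis and H1 its complement. Thresholds follow Wald's bounds:
    A = log((1-beta)/alpha), B = log(beta/(1-alpha))."""
    upper = np.log((1.0 - beta) / alpha)
    lower = np.log(beta / (1.0 - alpha))
    s = 0.0
    for k, llr in enumerate(llr_stream):
        s += llr
        if s >= upper:
            return k, "accept H1: safe miss, no maneuver"
        if s <= lower:
            return k, "accept H0: conjunction, plan mitigation maneuver"
    return None, "continue sampling"

# Hypothetical innovation log-likelihood ratios mildly favouring H1.
rng = np.random.default_rng(4)
print(wald_sprt(rng.normal(0.5, 1.0, size=100)))
```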

  20. The vaccines consistency approach project: an EPAA initiative.

    PubMed

    De Mattia, F; Hendriksen, C; Buchheit, K H; Chapsal, J M; Halder, M; Lambrigts, D; Redhead, K; Rommel, E; Scharton-Kersten, T; Sesardic, T; Viviani, L; Ragan, I

    2015-01-01

    The consistency approach for release testing of established vaccines promotes the use of in vitro, analytical, non-animal based systems allowing the monitoring of quality parameters during the whole production process. By using highly sensitive non-animal methods, the consistency approach has the potential to improve the quality of testing and to foster the 3Rs (replacement, refinement and reduction of animal use) for quality control of established vaccines. This concept offers an alternative to the current quality control strategy which often requires large numbers of laboratory animals. In order to facilitate the introduction of the consistency approach for established human and veterinary vaccine quality control, the European Partnership for Alternatives to Animal Testing (EPAA) initiated a project, the "Vaccines Consistency Approach Project", aiming at developing and validating the consistency approach with stakeholders from academia, regulators, OMCLs, EDQM, European Commission and industry. This report summarises progress since the project's inception.

  1. Structure–activity relationships study of mTOR kinase inhibition using QSAR and structure-based drug design approaches

    PubMed Central

    Lakhlili, Wiame; Yasri, Abdelaziz; Ibrahimi, Azeddine

    2016-01-01

    The discovery of clinically relevant inhibitors of mammalian target of rapamycin (mTOR) for anticancer therapy has proved to be a challenging task. The quantitative structure–activity relationship (QSAR) approach is a very useful and widespread technique for ligand-based drug design, which can be used to identify novel and potent mTOR inhibitors. In this study, we performed two-dimensional QSAR tests and molecular docking validation tests of a series of mTOR ATP-competitive inhibitors to elucidate the structural properties associated with their activity. The QSAR tests were performed using the partial least squares method with a correlation coefficient of r2=0.799 and a cross-validation of q2=0.714. The chemical library screening was done by combining a ligand-based with a structure-based approach, using the three-dimensional structure of mTOR developed by homology modeling. We were able to select 22 compounds from two databases as inhibitors of the mTOR kinase active site. We believe that the method and applications highlighted in this study will help future efforts toward the design of selective ATP-competitive inhibitors. PMID:27980424
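
    A generic sketch of the 2D-QSAR step, assuming scikit-learn: partial least squares regression of activity on molecular descriptors, with r2 taken from the fit and q2 from leave-one-out cross-validation; the descriptor matrix and activities are random stand-ins, not the paper's dataset.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

# Hypothetical descriptor matrix X (molecules x 2D descriptors) and pIC50 y.
rng = np.random.default_rng(5)
X = rng.normal(size=(40, 12))
y = X @ rng.normal(size=12) + rng.normal(0, 0.3, 40)

pls = PLSRegression(n_components=3)
pls.fit(X, y)
r2 = pls.score(X, y)                                    # fitted r^2

y_loo = cross_val_predict(pls, X, y, cv=LeaveOneOut())  # leave-one-out q^2
q2 = 1 - ((y - y_loo.ravel()) ** 2).sum() / ((y - y.mean()) ** 2).sum()
print(f"r2 = {r2:.3f}, q2 = {q2:.3f}")
```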

  2. Rapid review of cognitive screening instruments in MCI: proposal for a process-based approach modification of overlapping tasks in select widely used instruments.

    PubMed

    Díaz-Orueta, Unai; Blanco-Campal, Alberto; Burke, Teresa

    2018-05-01

    Background: A detailed neuropsychological assessment plays an important role in the diagnostic process of Mild Cognitive Impairment (MCI). However, available brief cognitive screening tests for this clinical population are administered and interpreted based mainly, or exclusively, on total achievement scores. This score-based approach can lead to erroneous clinical interpretations unless we also pay attention to test-taking behavior or to the types of errors committed during test performance. The goal of the current study is to perform a rapid review of the literature on cognitive screening tools for dementia in primary and secondary care, including revisiting previously published systematic reviews of screening tools for dementia, an extensive database search, and analysis of individual references cited in selected studies. A subset of representative screening tools for dementia was identified that covers as many cognitive functions as possible. How these screening tools overlap with each other (in terms of the cognitive domains being measured and the method used to assess them) was examined, and a series of process-based approach (PBA) modifications for these overlapping features was proposed, so that the changes recommended in relation to one particular cognitive task could be extrapolated to other screening tools. It is expected that future versions of cognitive screening tests, modified using a PBA, will highlight the benefits of attending to qualitative features of test performance when trying to identify subtle features suggestive of MCI and/or dementia.

  3. Optical testing of progressive ophthalmic glasses based on galvo mirrors

    NASA Astrophysics Data System (ADS)

    Stuerwald, S.; Schmitt, R.

    2014-03-01

    In the production of ophthalmic freeform optics such as progressive eyeglasses, specimens are tested according to a standardized method based on measuring the vertex power at usually fewer than 10 points. For better quality management, and thus more reliable and valid tests, a more comprehensive measurement approach is required. For Shack-Hartmann sensors (SHS), the dynamic range is defined by the number of micro-lenses and the resolution of the imaging sensor. Here, we present an approach for measuring wavefronts with increased dynamic range and lateral resolution by the use of a scanning procedure. The proposed setup is based on galvo mirrors and is capable of measuring the vertex power with a lateral resolution below one millimeter, which is sufficient for a functional test of progressive eyeglasses. Expressed more abstractly, the concept is based on selecting, and thereby encoding, single sub-apertures of the wavefront under test. This allows the wavefront's slope to be measured consecutively in a scanning procedure. The use of high-precision galvo systems allows a lateral resolution below one millimeter as well as significantly faster scanning. The measurement concept and the performance of this method are demonstrated for different spherical and freeform specimens such as progressive eyeglasses. Furthermore, approaches for calibrating the measurement system are characterized and the optical design of the detector is discussed.

  4. FY16 Status Report on Development of Integrated EPP and SMT Design Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jetter, R. I.; Sham, T. -L.; Wang, Y.

    2016-08-01

    The goal of the Elastic-Perfectly Plastic (EPP) combined integrated creep-fatigue damage evaluation approach is to incorporate a Simplified Model Test (SMT) data based approach for creep-fatigue damage evaluation into the EPP methodology to avoid the separate evaluation of creep and fatigue damage and eliminate the requirement for stress classification in current methods, thus greatly simplifying the evaluation of elevated temperature cyclic service. The EPP methodology is based on the idea that creep damage and strain accumulation can be bounded by a properly chosen “pseudo” yield strength used in an elastic-perfectly plastic analysis, thus avoiding the need for stress classification. The original SMT approach is based on the use of elastic analysis. The experimental data (cycles to failure) are correlated using the elastically calculated strain range in the test specimen, and the corresponding component strain is also calculated elastically. The advantage of this approach is that it is no longer necessary to use the damage interaction, or D-diagram, because the damage due to the combined effects of creep and fatigue is accounted for in the test data by means of a specimen that is designed to replicate or bound the stress and strain redistribution that occurs in actual components when loaded in the creep regime. The reference approach to combining the two methodologies and the corresponding uncertainties and validation plans are presented. Results from recent key feature tests are discussed to illustrate the applicability of the EPP methodology and the behavior of materials at elevated temperature when undergoing stress and strain redistribution due to plasticity and creep.

  5. Individual interviews and focus groups in patients with rheumatoid arthritis: a comparison of two qualitative methods.

    PubMed

    Coenen, Michaela; Stamm, Tanja A; Stucki, Gerold; Cieza, Alarcos

    2012-03-01

    To compare two different approaches to performing focus groups and individual interviews, an open approach, and an approach based on the International Classification of Functioning, Disability and Health (ICF). Patients with rheumatoid arthritis attended focus groups (n = 49) and individual interviews (n = 21). Time, number of concepts, ICF categories identified, and sample size for reaching saturation of data were compared. Descriptive statistics, Chi-square tests, and independent t tests were performed. With an overall time of 183 h, focus groups were more time consuming than individual interviews (t = 9.782; P < 0.001). In the open approach, 188 categories in the focus groups and 102 categories in the interviews were identified compared to the 231 and 110 respective categories identified in the ICF-based approach. Saturation of data was reached after performing five focus groups and nine individual interviews in the open approach and five focus groups and 12 individual interviews in the ICF-based approach. The method chosen should depend on the objective of the study, issues related to the health condition, and the study's participants. We recommend performing focus groups if the objective of the study is to comprehensively explore the patient perspective.

  6. Model-Based GN and C Simulation and Flight Software Development for Orion Missions beyond LEO

    NASA Technical Reports Server (NTRS)

    Odegard, Ryan; Milenkovic, Zoran; Henry, Joel; Buttacoli, Michael

    2014-01-01

    For Orion missions beyond low Earth orbit (LEO), the Guidance, Navigation, and Control (GN&C) system is being developed using a model-based approach for simulation and flight software. Lessons learned from the development of GN&C algorithms and flight software for the Orion Exploration Flight Test One (EFT-1) vehicle have been applied to the development of further capabilities for Orion GN&C beyond EFT-1. By continuing the use of a Model-Based Development (MBD) approach with the Matlab®/Simulink® tool suite, the process for GN&C development and analysis has been greatly improved. Furthermore, a model-based simulation environment in Simulink, rather than an external C-based simulation, greatly eases the development of flight algorithms. The benefits of employing lessons learned from EFT-1 are described, as well as the approach for implementing additional MBD techniques. Also detailed are the key enablers for improvements to the MBD process, including enhanced configuration management techniques for model-based software systems, automated code and artifact generation, and automated testing and integration.

  7. Development of dynamic Bayesian models for web application test management

    NASA Astrophysics Data System (ADS)

    Azarnova, T. V.; Polukhin, P. V.; Bondarenko, Yu V.; Kashirina, I. L.

    2018-03-01

    The mathematical apparatus of dynamic Bayesian networks is an effective and technically proven tool for modeling complex stochastic dynamic processes. According to the results of the research, the mathematical models and methods of dynamic Bayesian networks provide high coverage of the stochastic tasks associated with error testing in multiuser software products operated in a dynamically changing environment. A formalized representation of the discrete test process as a dynamic Bayesian model allows us to organize the logical connection between individual test assets across multiple time slices. This approach makes it possible to present testing as a discrete process with defined structural components responsible for the generation of test assets. Dynamic Bayesian network-based models allow us to combine, within one management area, individual units and testing components that have different functionalities and directly influence each other during comprehensive testing of various groups of software defects. The application of the proposed models provides a consistent approach to formalizing test principles and procedures, methods used to treat situational error signs, and methods used to produce analytical conclusions based on test results.
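
    The two-slice filtering such models rely on is easy to sketch. The following fragment is a minimal, hypothetical illustration (the state, observation and probabilities are invented, not taken from the paper): it propagates a belief about a module's hidden "defect density" and conditions it on successive test outcomes.

        import numpy as np

        T = np.array([[0.9, 0.1],    # P(state_t | state_t-1); states: low, high
                      [0.3, 0.7]])
        E = np.array([[0.95, 0.05],  # P(test outcome | state); outcomes: pass, fail
                      [0.40, 0.60]])
        belief = np.array([0.8, 0.2])  # prior over the first time slice

        def forward_update(belief, obs):
            """One DBN time slice: predict with T, then condition on the outcome."""
            predicted = belief @ T
            posterior = predicted * E[:, obs]
            return posterior / posterior.sum()

        for obs in [1, 1, 0]:  # fail, fail, pass across three test runs
            belief = forward_update(belief, obs)
            print(belief)      # updated probability of low/high defect density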

  8. The enhancement of students' mathematical problem solving ability through teaching with metacognitive scaffolding approach

    NASA Astrophysics Data System (ADS)

    Prabawanto, Sufyani

    2017-05-01

    This research aims to investigate the enhancement of students' mathematical problem solving through teaching with a metacognitive scaffolding approach. This research used a quasi-experimental design with pretest-posttest control. The subjects were pre-service elementary school teachers at a state university in Bandung. In this study, there were two groups: experimental and control. The experimental group consists of 60 students who were taught mathematics under the metacognitive scaffolding approach, while the control group consists of 58 students who were taught mathematics under the direct approach. Students were classified into three categories based on prior mathematical ability, namely high, middle, and low. Data collection instruments consist of mathematical problem solving test instruments. Using a mean difference test, the research reached two conclusions: (1) there is a significant difference in the enhancement of mathematical problem solving between the students who attended the course under the metacognitive scaffolding approach and students who attended the course under the direct approach, and (2) there is no significant interaction effect of teaching approach and prior mathematical ability level on the enhancement of students' mathematical problem solving.

  9. Use of tablet-based kiosks in the emergency department to guide patient HIV self-testing with a point-of-care oral fluid test.

    PubMed

    Gaydos, Charlotte A; Solis, Melissa; Hsieh, Yu-Hsiang; Jett-Goheen, Mary; Nour, Samah; Rothman, Richard E

    2013-09-01

    Despite successes in efforts to integrate HIV testing into routine care in emergency departments, challenges remain. Kiosk-facilitated, directed HIV self-testing offers one novel approach to addressing logistical challenges. Emergency department patients, 18-64 years, were recruited to evaluate the use of tablet-based kiosks to guide patients in conducting their own point-of-care HIV tests, followed by standard-of-care HIV tests performed by healthcare workers. Both tests were OraQuick Advance tests. Of 955 patients approached, 473 (49.5%) consented; 467 completed the test, and 100% had results concordant with those of healthcare workers. Median age was 41 years, 59.6% were female, 74.8% were African-American, and 19.6% were White. In all, 99.8% of patients believed the self-test was "definitely" or "probably" correct; 91.7% of patients "trusted their results very much"; 99.8% reported "overall" self-testing was "easy or somewhat easy" to perform. Further, 96.9% indicated they would "probably" or "definitely" test themselves at home were the HIV test available for purchase; 25.9% preferred self-testing versus 34.4% who preferred healthcare professional testing (p>0.05). Tablet-based kiosk testing proved to be a highly feasible, acceptable, and accurate method of conducting rapid HIV self-testing in this study; however, rates of engagement were moderate. More research will be required to ascertain barriers to increased engagement in self-testing.

  10. CR-Calculus and adaptive array theory applied to MIMO random vibration control tests

    NASA Astrophysics Data System (ADS)

    Musella, U.; Manzato, S.; Peeters, B.; Guillaume, P.

    2016-09-01

    Performing Multiple-Input Multiple-Output (MIMO) tests to reproduce the vibration environment in a user-defined number of control points of a unit under test is necessary in applications where a realistic environment replication has to be achieved. MIMO tests require vibration control strategies to calculate the drive signal vector that gives an acceptable replication of the target. For MIMO Sine Control tests, this target is a (complex) vector with magnitude and phase information at the control points, while in MIMO Random Control tests, in the most general case, the target is a complete spectral density matrix. The idea behind this work is to tailor a MIMO random vibration control approach that can be generalized to other MIMO tests, e.g. MIMO Sine and MIMO Time Waveform Replication. In this work the approach is to use gradient-based procedures over the complex space, applying the so-called CR-Calculus and adaptive array theory. With this approach it is possible to better control the process performance, allowing step-by-step updates of the Jacobian matrix. The theoretical basis of the work is followed by an application of the developed method to a two-exciter, two-axis system and by performance comparisons with standard methods.
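
    A minimal sketch of such a gradient-based update over the complex space, at a single frequency line and with an invented FRF matrix, target vector and step size, could look as follows; the CR-calculus result used is that the gradient of the squared error e^H e with respect to the conjugate drive vector is H^H e.

        import numpy as np

        rng = np.random.default_rng(1)
        H = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))  # FRF at one line
        y_target = np.array([1.0 + 0.5j, 0.2 - 0.3j])  # desired control-point response

        d = np.zeros(2, dtype=complex)                 # drive signal vector
        mu = 0.5 / np.linalg.norm(H, 2) ** 2           # step size below stability limit
        for _ in range(200):
            e = H @ d - y_target                       # error at the control points
            d -= mu * H.conj().T @ e                   # CR-calculus gradient step
        print(np.abs(H @ d - y_target))                # residual magnitude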

  11. Artificial neuron-glia networks learning approach based on cooperative coevolution.

    PubMed

    Mesejo, Pablo; Ibáñez, Oscar; Fernández-Blanco, Enrique; Cedrón, Francisco; Pazos, Alejandro; Porto-Pazos, Ana B

    2015-06-01

    Artificial Neuron-Glia Networks (ANGNs) are a novel bio-inspired machine learning approach. They extend classical Artificial Neural Networks (ANNs) by incorporating recent findings and suppositions about the way information is processed by neural and astrocytic networks in the most evolved living organisms. Although ANGNs are not a consolidated method, their performance against the traditional approach, i.e. without artificial astrocytes, has already been demonstrated on classification problems. However, the corresponding learning algorithms developed so far strongly depend on a set of glial parameters which are manually tuned for each specific problem. As a consequence, preliminary experimental tests have to be carried out to determine an adequate set of values, making such manual parameter configuration time-consuming, error-prone, biased and problem-dependent. Thus, in this paper, we propose a novel learning approach for ANGNs that fully automates the learning process and allows any kind of reasonable parameter configuration to be tested for each specific problem. This new learning algorithm, based on coevolutionary genetic algorithms, is able to properly learn all the ANGN parameters. Its performance is tested on five classification problems, achieving significantly better results than previous ANGNs and results competitive with ANN approaches.

  12. The Impact of Problem-Based Learning Approach to Senior High School Students' Mathematics Critical Thinking Ability

    ERIC Educational Resources Information Center

    Widyatiningtyas, Reviandari; Kusumah, Yaya S.; Sumarmo, Utari; Sabandar, Jozua

    2015-01-01

    The study reported the findings of an only post-test control group research design and aims to analyze the influence of problem-based learning approach, school level, and students' prior mathematical ability to student's mathematics critical thinking ability. The research subjects were 140 grade ten senior high school students coming from…

  13. Developing an Educational Computer Game for Migratory Bird Identification Based on a Two-Tier Test Approach

    ERIC Educational Resources Information Center

    Chu, Hui-Chun; Chang, Shao-Chen

    2014-01-01

    Although educational computer games have been recognized as being a promising approach, previous studies have indicated that, without supportive models, students might only show temporary interest during the game-based learning process, and their learning performance is often not as good as expected. Therefore, in this paper, a two-tier test…

  14. Physiology and the Biomedical Engineering Curriculum: Utilizing Emerging Instructional Technologies to Promote Development of Adaptive Expertise in Undergraduate Students

    ERIC Educational Resources Information Center

    Nelson, Regina K.

    2013-01-01

    A mixed-methods research study was designed to test whether undergraduate engineering students were better prepared to learn advanced topics in biomedical engineering if they learned physiology via a quantitative, concept-based approach rather than a qualitative, system-based approach. Experiments were conducted with undergraduate engineering…

  15. Thematic versus Subject-Based Curriculum Delivery and Achievement Goals: Findings from a Single-School Study

    ERIC Educational Resources Information Center

    Putwain, Dave; Whiteley, Helen; Caddick, Lee

    2011-01-01

    Background: It has been claimed that thematic or integrated approaches to curriculum delivery offer a range of advantages over subject-based modes of delivery including improved pupil motivation. Purpose: This study put claims regarding pupil motivation to the test, using the achievement goals framework. This contemporary approach to understanding…

  16. Automated Glioblastoma Segmentation Based on a Multiparametric Structured Unsupervised Classification

    PubMed Central

    Juan-Albarracín, Javier; Fuster-Garcia, Elies; Manjón, José V.; Robles, Montserrat; Aparici, F.; Martí-Bonmatí, L.; García-Gómez, Juan M.

    2015-01-01

    Automatic brain tumour segmentation has become a key component for the future of brain tumour treatment. Currently, most brain tumour segmentation approaches arise from the supervised learning standpoint, which requires a labelled training dataset from which to infer the models of the classes. The performance of these models is directly determined by the size and quality of the training corpus, whose retrieval becomes a tedious and time-consuming task. On the other hand, unsupervised approaches avoid these limitations but often do not reach results comparable to those of supervised methods. In this sense, we propose an automated unsupervised method for brain tumour segmentation based on anatomical Magnetic Resonance (MR) images. Four unsupervised classification algorithms, grouped by their structured or non-structured condition, were evaluated within our pipeline. As non-structured algorithms, we evaluated K-means, Fuzzy K-means and the Gaussian Mixture Model (GMM), whereas as a structured classification algorithm we evaluated the Gaussian Hidden Markov Random Field (GHMRF). An automated postprocess based on a statistical approach supported by tissue probability maps is proposed to automatically identify the tumour classes after segmentation. We evaluated our brain tumour segmentation method with the public BRAin Tumor Segmentation (BRATS) 2013 Test and Leaderboard datasets. Our approach based on the GMM improves on the results obtained by most of the supervised methods evaluated with the Leaderboard set and reaches the second position in the ranking. Our variant based on the GHMRF achieves the first position in the Test ranking of the unsupervised approaches and the seventh position in the general Test ranking, which confirms the method as a viable alternative for brain tumour segmentation. PMID:25978453
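
    For readers unfamiliar with the GMM variant, the essential unsupervised step can be sketched with scikit-learn; the voxel features below are synthetic stand-ins for stacked multiparametric MR intensities, and mapping mixture components to tissue classes (the paper's postprocess) is only noted in a comment.

        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(0)
        voxels = np.vstack([                      # rows: voxels; cols: e.g. T1, T2, FLAIR
            rng.normal([0.2, 0.3, 0.1], 0.05, size=(500, 3)),  # healthy-like tissue
            rng.normal([0.7, 0.8, 0.9], 0.05, size=(100, 3)),  # tumour-like cluster
        ])

        gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0)
        labels = gmm.fit_predict(voxels)  # unsupervised: no labelled training corpus
        # A statistical postprocess (e.g. tissue probability maps) would then decide
        # which mixture component corresponds to the tumour class.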

  17. Advanced Stirling Convertor Dynamic Test Approach and Results

    NASA Technical Reports Server (NTRS)

    Meer, David W.; Hill, Dennis; Ursic, Joseph J.

    2010-01-01

    The U.S. Department of Energy (DOE), Lockheed Martin Corporation (LM), and NASA Glenn Research Center (GRC) have been developing the Advanced Stirling Radioisotope Generator (ASRG) for use as a power system for space science missions. As part of the extended operation testing of this power system, the Advanced Stirling Convertors (ASC) at NASA GRC undergo a vibration test sequence intended to simulate the vibration history that an ASC would experience when used in an ASRG for a space mission. This sequence includes testing at workmanship and flight acceptance levels interspersed with periods of extended operation to simulate the pre-fueling and post-fueling periods. The final step in the test sequence utilizes additional testing at flight acceptance levels to simulate launch. To better replicate the acceleration profile seen by an ASC incorporated into an ASRG, the input spectra used in testing the convertors were modified based on dynamic testing of the ASRG Engineering Unit (ASRG EU) at LM. This paper outlines the overall test approach, summarizes the test results from the ASRG EU, describes the incorporation of those results into the test approach, and presents the results of applying the test approach to the ASC-1 #3 and #4 convertors. The test results include data from several accelerometers mounted on the convertors as well as the piston position and output power variables.

  18. Recommendations for Evaluating Multiple Filters in Ballast Water Management Systems for US Type Approval

    DTIC Science & Technology

    2016-01-01

    is extremely unlikely to be practicable. A second approach is to conduct a full suite of TA testing on a BWMS with a "base filter" configuration...that of full TA testing. Here, three land-based tests would be conducted, and O&M and component testing would also occur. If time or practicality ...

  19. GEE-based SNP set association test for continuous and discrete traits in family-based association studies.

    PubMed

    Wang, Xuefeng; Lee, Seunggeun; Zhu, Xiaofeng; Redline, Susan; Lin, Xihong

    2013-12-01

    Family-based genetic association studies of related individuals provide opportunities to detect genetic variants that complement studies of unrelated individuals. Most statistical methods for family association studies of common variants are single-marker based, testing one SNP at a time. In this paper, we consider testing the effect of an SNP set, e.g., SNPs in a gene, in family studies, for both continuous and discrete traits. Specifically, we propose a generalized estimating equation (GEE) based kernel association test, a variance component based testing method, to test for the association between a phenotype and multiple variants in an SNP set jointly using family samples. The proposed approach allows for both continuous and discrete traits, and the correlation among family members is taken into account through the use of an empirical covariance estimator. We derive the theoretical distribution of the proposed statistic under the null and develop analytical methods to calculate the P-values. We also propose an efficient resampling method for correcting for small sample size bias in family studies. The proposed method allows for easily incorporating covariates and SNP-SNP interactions. Simulation studies show that the proposed method properly controls type I error rates under both random and ascertained sampling schemes in family studies. We demonstrate through simulation studies that our approach has superior performance for association mapping compared to the single-marker based minimum P-value GEE test for an SNP-set effect over a range of scenarios. We illustrate the application of the proposed method using data from the Cleveland Family GWAS Study. © 2013 WILEY PERIODICALS, INC.
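
    The variance-component score statistic at the heart of such kernel tests is compact; the sketch below uses a linear kernel over the SNP set and an independence working covariance, whereas the paper's GEE version substitutes an empirical covariance estimator for family correlation and derives P-values from the statistic's null distribution. All data here are simulated for illustration.

        import numpy as np

        def kernel_score_statistic(y, X, G):
            """Q = r' G G' r, with r the residuals from the covariates-only
            null model and G G' a linear kernel over the SNP set."""
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            r = y - X @ beta
            return r @ G @ G.T @ r

        rng = np.random.default_rng(0)
        n, p = 200, 10
        X = np.column_stack([np.ones(n), rng.normal(size=n)])  # intercept + covariate
        G = rng.binomial(2, 0.3, size=(n, p)).astype(float)    # genotypes coded 0/1/2
        y = 0.4 * G[:, 2] + rng.normal(size=n)                 # one causal SNP
        print(kernel_score_statistic(y, X, G))
        # P-values follow a mixture of chi-squares (e.g. Davies' method), omitted here.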

  20. Reliability Quantification of Advanced Stirling Convertor (ASC) Components

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin R.; Korovaichuk, Igor; Zampino, Edward

    2010-01-01

    The Advanced Stirling Convertor (ASC) is intended to provide power for an unmanned planetary spacecraft and has an operational life requirement of 17 years. Over this 17-year mission, the ASC must provide power with the desired performance and efficiency and require no corrective maintenance. Reliability demonstration testing for the ASC was found to be very limited due to schedule and resource constraints. Reliability demonstration must involve the application of analysis, system- and component-level testing, and simulation models, taken collectively. Therefore, computer simulation with limited test data verification is a viable approach to assess the reliability of ASC components. This approach is based on physics-of-failure mechanisms and involves the relationships among the design variables based on physics, mechanics, material behavior models, and the interaction of different components and their respective disciplines such as structures, materials, fluids, thermal, mechanical, electrical, etc. In addition, these models are based on the available test data, which can be updated and the analysis refined as more data and information become available. The failure mechanisms and causes of failure are included in the analysis, especially in light of new information, in order to develop guidelines to improve design reliability and better operating controls to reduce the probability of failure. Quantified reliability assessment based on the fundamental physical behavior of components and their relationships with other components has demonstrated itself to be a superior technique to conventional reliability approaches that utilize failure rates derived from similar equipment or simply expert judgment.

  1. Comparisons of Means Using Exploratory and Confirmatory Approaches

    ERIC Educational Resources Information Center

    Kuiper, Rebecca M.; Hoijtink, Herbert

    2010-01-01

    This article discusses comparisons of means using exploratory and confirmatory approaches. Three methods are discussed: hypothesis testing, model selection based on information criteria, and Bayesian model selection. Throughout the article, an example is used to illustrate and evaluate the two approaches and the three methods. We demonstrate that…

  2. Methodological Approaches to Online Scoring of Essays.

    ERIC Educational Resources Information Center

    Chung, Gregory K. W. K.; O'Neil, Harold F., Jr.

    This report examines the feasibility of scoring essays using computer-based techniques. Essays have been incorporated into many of the standardized testing programs. Issues of validity and reliability must be addressed to deploy automated approaches to scoring fully. Two approaches that have been used to classify documents, surface- and word-based…

  3. Effect of different runway size on pilot performance during simulated night landing approaches.

    DOT National Transportation Integrated Search

    1981-02-01

    In Experiment I, three pilots flew simulated approaches and landings in a fixed-base simulator with a computer-generated-image visual display. Practice approaches were flown with an 8,000-ft-long runway that was either 75, 150, or 300 ft wide; test a...

  4. Spacecraft attitude control using a smart control system

    NASA Technical Reports Server (NTRS)

    Buckley, Brian; Wheatcraft, Louis

    1992-01-01

    Traditionally, spacecraft attitude control has been implemented using control loops written in native code for a space-hardened processor. The Naval Research Lab has taken this approach during the development of the Attitude Control Electronics (ACE) package. After the system was developed and delivered, NRL decided to explore alternate technologies to accomplish this same task more efficiently. The approach taken by NRL was to implement the ACE control loops using expert systems technologies. The purpose of this effort was to: (1) research the capabilities required of an expert system in processing a classic closed-loop control algorithm; (2) research the development environment required to design and test an embedded expert systems environment; (3) research the complexity of design and development of expert systems versus a conventional approach; and (4) test the resulting systems against the flight acceptance test software for both response and accuracy. Two expert systems were selected to implement the control loops. Criteria used for the selection of the expert systems included that they had to run in both embedded and ground-based environments. Using two different expert systems allowed a comparison of the real-time capabilities, inferencing capabilities, and the ground-based development environment. The two expert systems chosen for the evaluation were the Spacecraft Command Language (SCL) and NEXPERT Object. SCL is a smart control system produced for the NRL by Interface and Control Systems (ICS). SCL was developed to be used for real-time command, control, and monitoring of a new generation of spacecraft. NEXPERT Object is a commercially available product developed by Neuron Data. Results of the effort were evaluated using the ACE test bed, which had been developed and used to test the original flight hardware and software using simulators and flight-like interfaces. The test bed was used for testing the expert systems in a 'near-flight' environment. The technical approach, the system architecture, the development environments, knowledge base development, and results of this effort are detailed.

  5. 37: COMPARISON OF TWO METHODS: TBL-BASED AND LECTURE-BASED LEARNING IN NURSING CARE OF PATIENTS WITH DIABETES IN NURSING STUDENTS

    PubMed Central

    Khodaveisi, Masoud; Qaderian, Khosro; Oshvandi, Khodayar; Soltanian, Ali Reza; Vardanjani, Mehdi molavi

    2017-01-01

    Background and aims: Learning plays an important role in developing nursing skills and proper care-taking. The present study aims to evaluate two learning methods, team-based learning and lecture-based learning, for teaching the care of patients with diabetes to nursing students. Method: In this quasi-experimental study, 64 fourth-term students in the nursing colleges of Bukan and Miandoab were included. A knowledge and performance questionnaire, comprising 15 knowledge questions and 5 performance questions on care-taking in patients with diabetes, was used as the data collection tool; its reliability was confirmed by the researcher using Cronbach's alpha (r=0.83). A paired t-test was used to compare the mean scores of knowledge and performance within each group between the pre-test and post-test steps, and an independent t-test was used to compare the mean scores of the control and intervention groups. Results: There was no statistically significant difference between the two groups in pre-test knowledge and performance scores (p=0.784). There was a significant difference in the mean post-test scores of knowledge and performance for diabetes care between the team-based learning group and the lecture-based learning group (p=0.001). There was a significant difference between the mean pre-test and post-test scores of knowledge of diabetes care in both learning groups (p=0.001). Conclusion: Both team-based and lecture-based learning approaches resulted in improved learning in students, but the rate of learning with the team-based approach was greater than with lecture-based learning, and it is recommended that this method be used as a higher-education teaching method in the education of students.

  6. Space-Based Identification of Archaeological Illegal Excavations and a New Automatic Method for Looting Feature Extraction in Desert Areas

    NASA Astrophysics Data System (ADS)

    Lasaponara, Rosa; Masini, Nicola

    2018-06-01

    The identification and quantification of disturbance at archaeological sites has generally been approached by visual inspection of optical aerial or satellite pictures. In this paper, we briefly summarize the state of the art of traditional satellite-based approaches to looting identification and propose a new automatic archaeological looting feature extraction approach (ALFEA). It is based on three steps: enhancement using spatial autocorrelation, unsupervised classification, and segmentation. ALFEA was applied to Google Earth images of two test areas in desert environs, in Syria (Dura Europos) and in Peru (Cahuachi-Nasca). The reliability of ALFEA was assessed through field surveys in Peru and visual inspection for the Syrian case study. Results from the evaluation showed satisfactory performance in both analysed test cases, with a success rate higher than 90%.
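
    The paper gives no implementation, but the first two ALFEA steps can be roughed out as follows; the synthetic scene, window size and cluster count are assumptions made for illustration, and the final segmentation step is omitted.

        import numpy as np
        from scipy.ndimage import uniform_filter
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(0)
        img = rng.normal(0.5, 0.05, size=(128, 128))  # stand-in satellite scene
        img[40:50, 60:70] -= 0.3                      # dark cluster of "looting pits"

        # Step 1: enhancement via local spatial autocorrelation (Moran-like index)
        z = (img - img.mean()) / img.std()
        lag = uniform_filter(z, size=5)   # neighbourhood mean as the spatial lag
        moran = z * lag                   # high where similar values cluster

        # Step 2: unsupervised classification of the enhanced image
        labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(
            moran.reshape(-1, 1))
        mask = labels.reshape(img.shape)  # candidate looting regions vs background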

  7. Simulator evaluation of manually flown curved instrument approaches. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Sager, D.

    1973-01-01

    Pilot performance in flying horizontally curved instrument approaches was analyzed by having nine test subjects fly curved approaches in a fixed-base simulator. Approaches were flown without an autopilot and without a flight director. Evaluations were based on deviation measurements made at a number of points along the curved approach path and on subject questionnaires. Results indicate that pilots can fly curved approaches, though less accurately than straight-in approaches; that a moderate wind does not affect curve-flying performance; and that there is no performance difference between 60 deg. and 90 deg. turns. A trade-off study of curved path parameters and a paper analysis of wind compensation were also made.

  8. Building a Better Workforce: A Case Study in Management Simulations and Experiential Learning in the Construction Industry

    ERIC Educational Resources Information Center

    Douglas-Lenders, Rachel Claire; Holland, Peter Jeffrey; Allen, Belinda

    2017-01-01

    Purpose: The purpose of this paper is to examine the impact of experiential simulation-based learning of employee self-efficacy. Design/Methodology/Approach: The research approach is an exploratory case study of a group of trainees from the same organisation. Using a quasi-experiment, one group, pre-test-post-test design (Tharenou et al., 2007), a…

  9. Improving Measurement in Health Education and Health Behavior Research Using Item Response Modeling: Comparison with the Classical Test Theory Approach

    ERIC Educational Resources Information Center

    Wilson, Mark; Allen, Diane D.; Li, Jun Corser

    2006-01-01

    This paper compares the approach and resultant outcomes of item response models (IRMs) and classical test theory (CTT). First, it reviews basic ideas of CTT, and compares them to the ideas about using IRMs introduced in an earlier paper. It then applies a comparison scheme based on the AERA/APA/NCME "Standards for Educational and…

  11. Helicopter precision approach capability using the Global Positioning System

    NASA Technical Reports Server (NTRS)

    Kaufmann, David N.

    1992-01-01

    The period between 1 July and 31 December, 1992, was spent developing a research plan as well as a navigation system document and flight test plan to investigate helicopter precision approach capability using the Global Positioning System (GPS). In addition, all hardware and software required for the research was acquired, developed, installed, and verified on both the test aircraft and the ground-based reference station.

  12. [Expectations and patient satisfaction in hospitals: construction and application of an expectation-based experience typology and its use in the management of quality and expectations].

    PubMed

    Gehrlach, Christoph; Güntert, Bernhard

    2015-01-01

    Patient satisfaction (PS) surveys are frequently used evaluation methods to show performance from the customer's view. This approach has some fundamental deficits, especially with respect to theory, methodology and usage. Because of the significant theoretical value of the expectation confirmation/disconfirmation concept in the development of PS, an expectation-based experience typology has been developed and tested to check whether this approach could be a theoretical and practical alternative to the surveying of PS. Because the comparison between expectations and expectation fulfilment is a mainly cognitive-rational process, it is easier to make changes at this stage than in the subsequent stage of the development of PS, which is based mainly on emotional-affective processes. The paper contains a literature review of the common concept of PS and its causal and influencing factors. Based on the theoretical part of this study, an expectation-based experience typology was developed. In the next step, the typology was subjected to exploratory testing based on two patient surveys. In some parts of the tested typology, exploratory differences between hospitals could be found. Despite the greater complexity and unfamiliarity of the expectation-based experience typology, this concept offers the chance to change conditions not only retrospectively (based on data), but also prospectively, in terms of a "management of expectations". Copyright © 2014. Published by Elsevier GmbH.

  13. Bayesian Estimation of Combined Accuracy for Tests with Verification Bias

    PubMed Central

    Broemeling, Lyle D.

    2011-01-01

    This presentation emphasizes the estimation of the combined accuracy of two or more diagnostic tests when verification bias is present. Verification bias occurs when some of the subjects are not verified by the gold standard. The approach is Bayesian: the estimation of test accuracy is based on the posterior distribution of the relevant parameter. The accuracy of two combined binary tests is estimated employing either the "believe the positive" or the "believe the negative" rule, and the true and false positive fractions for each rule are computed for the two tests. In order to perform the analysis, the missing-at-random assumption is imposed, and an interesting example is provided by estimating the combined accuracy of CT and MRI to diagnose lung cancer. The Bayesian approach is extended to two ordinal tests when verification bias is present, and the accuracy of the combined tests is based on the ROC area of the risk function. An example involving mammography with two readers with extreme verification bias illustrates the estimation of combined test accuracy for ordinal tests. PMID:26859487
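
    For the binary case, the combination rules reduce to simple arithmetic when the two tests are assumed conditionally independent given disease status (an illustrative assumption; the paper works with posterior distributions rather than the point values used here).

        # Hypothetical sensitivities (s) and specificities (c) of two binary tests
        s1, s2 = 0.85, 0.90
        c1, c2 = 0.80, 0.95

        # "Believe the positive": combination is positive if either test is positive
        tpf_bp = 1 - (1 - s1) * (1 - s2)  # true positive fraction
        fpf_bp = 1 - c1 * c2              # false positive fraction

        # "Believe the negative": combination is positive only if both are positive
        tpf_bn = s1 * s2
        fpf_bn = (1 - c1) * (1 - c2)

        print(tpf_bp, fpf_bp, tpf_bn, fpf_bn)  # 0.985 0.24 0.765 0.01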

  14. Item Selection for the Development of Parallel Forms from an IRT-Based Seed Test Using a Sampling and Classification Approach

    ERIC Educational Resources Information Center

    Chen, Pei-Hua; Chang, Hua-Hua; Wu, Haiyan

    2012-01-01

    Two sampling-and-classification-based procedures were developed for automated test assembly: the Cell Only and the Cell and Cube methods. A simulation study based on a 540-item bank was conducted to compare the performance of the procedures with the performance of a mixed-integer programming (MIP) method for assembling multiple parallel test…

  15. Electrospin-coating of nitrocellulose membrane enhances sensitivity in nucleic acid-based lateral flow assay.

    PubMed

    Yew, Chee-Hong Takahiro; Azari, Pedram; Choi, Jane Ru; Li, Fei; Pingguan-Murphy, Belinda

    2018-06-07

    Point-of-care biosensors are important tools developed to aid medical diagnosis and testing, food safety and environmental monitoring. Paper-based biosensors, especially nucleic acid-based lateral flow assays (LFA), are affordable, simple to produce and easy to use in remote settings. However, the sensitivity of such assays to infectious diseases has always been a restrictive challenge. Here, we have successfully electrospun polycaprolactone (PCL) on nitrocellulose (NC) membrane to form a hydrophobic coating that reduces the flow rate and increases the interaction time between the targets and the gold nanoparticle-detection probe conjugates, resulting in the binding of more complexes to the capture probes. With this approach, the sensitivity of the PCL electrospin-coated test strip was increased approximately ten-fold compared to the unmodified test strip. As a proof of concept, this approach holds great potential for sensitive detection of targets in point-of-care testing. Copyright © 2018 Elsevier B.V. All rights reserved.

  16. Central implementation strategies outperform local ones in improving HIV testing in Veterans Healthcare Administration facilities.

    PubMed

    Goetz, Matthew Bidwell; Hoang, Tuyen; Knapp, Herschel; Burgess, Jane; Fletcher, Michael D; Gifford, Allen L; Asch, Steven M

    2013-10-01

    Pilot data suggest that a multifaceted approach may increase HIV testing rates, but the scalability of this approach and the level of support needed for successful implementation remain unknown. Objective: To evaluate the effectiveness of a scaled-up multi-component intervention in increasing the rate of risk-based and routine HIV diagnostic testing in primary care clinics and the impact of differing levels of program support. Design: Three-arm, quasi-experimental implementation research study. Setting: Veterans Health Administration (VHA) facilities. Participants: Persons receiving primary care between June 2009 and September 2011. Intervention: A multimodal program, including a real-time electronic clinical reminder to facilitate HIV testing, provider feedback reports and provider education, was implemented in Central and Local Arm sites; sites in the Central Arm also received ongoing programmatic support; Control Arm sites had no intervention. Main measures: Frequency of performing HIV testing during the 6 months before and after implementation of a risk-based clinical reminder (phase I) or routine clinical reminder (phase II). The adjusted rate of risk-based testing increased by 0.4 %, 5.6 % and 10.1 % in the Control, Local and Central Arms, respectively (all comparisons, p < 0.01). During phase II, the adjusted rate of routine testing increased by 1.1 %, 6.3 % and 9.2 % in the Control, Local and Central Arms, respectively (all comparisons, p < 0.01). At study end, 70-80 % of patients had been offered an HIV test. Use of clinical reminders, provider feedback, education and social marketing significantly increased the frequency at which HIV testing is offered and performed in VHA facilities. These findings support a multimodal approach toward achieving the goal of having every American know their HIV status as a matter of routine clinical practice.

  17. Modifying Achievement Test Items: A Theory-Guided and Data-Based Approach for Better Measurement of What Students with Disabilities Know

    ERIC Educational Resources Information Center

    Kettler, Ryan J.; Elliott, Stephen N.; Beddow, Peter A.

    2009-01-01

    Federal regulations allow up to 2% of the student population of a state to achieve proficiency for adequate yearly progress by taking an alternate assessment based on modified academic achievement standards (AA-MAS). Such tests are likely to be easier, but as long as a test is considered a valid measure of grade level content, it is allowable as…

  18. Development and flight test of a helicopter compact, portable, precision landing system concept

    NASA Technical Reports Server (NTRS)

    Clary, G. R.; Bull, J. S.; Davis, T. J.; Chisholm, J. P.

    1984-01-01

    An airborne, radar-based precision approach concept is being developed and flight tested as part of NASA's Rotorcraft All-Weather Operations Research Program. A transponder-based beacon landing system (BLS), applying state-of-the-art X-band radar technology and digital processing techniques, was built and is being flight tested to demonstrate the concept's feasibility. The BLS airborne hardware consists of an add-on microprocessor, installed in conjunction with the aircraft weather/mapping radar, which analyzes the radar beacon receiver returns and determines range, localizer deviation, and glide-slope deviation. The ground station is an inexpensive, portable unit which can be quickly deployed at a landing site. Results from the flight test program show that the BLS concept has significant potential for providing rotorcraft with low-cost, precision instrument approach capability in remote areas.

  19. Approaches and considerations for the assessment of immunotoxicity for environmental chemicals: a workshop summary.

    PubMed

    Boverhof, Darrell R; Ladics, Greg; Luebke, Bob; Botham, Jane; Corsini, Emanuela; Evans, Ellen; Germolec, Dori; Holsapple, Michael; Loveless, Scott E; Lu, Haitian; van der Laan, Jan Willem; White, Kimber L; Yang, Yung

    2014-02-01

    As experience is gained with toxicology testing and as new assays and technologies are developed, it is critical for stakeholders to discuss opportunities to advance our overall testing strategies. To facilitate these discussions, a workshop on practices for assessing immunotoxicity for environmental chemicals was held with the goal of sharing perspectives on immunotoxicity testing strategies and experiences, developmental immunotoxicity (DIT), and integrated and alternative approaches to immunotoxicity testing. Experiences across the chemical and pharmaceutical industries suggested that standard toxicity studies, combined with trigger-based testing approaches, represent an effective and efficient approach to evaluating immunotoxic potential. Additionally, discussions on study design, critical windows, and new guideline approaches and experiences identified important factors to consider before initiating DIT evaluations, including assay choice and timing and the impact of existing adult data. Participants agreed that integrating endpoints into standard repeat-dose studies should be considered for fulfilling any immunotoxicity testing requirements, while also maximizing information and reducing animal use. Participants also acknowledged that in vitro evaluation of immunosuppression is complex and may require the use of multiple assays that are still being developed. These workshop discussions should contribute to developing an effective but more resource- and animal-efficient approach for evaluating chemical immunotoxicity. Copyright © 2013 Elsevier Inc. All rights reserved.

  20. Free-Suspension Residual Flexibility Testing of Space Station Pathfinder: Comparison to Fixed-Base Results

    NASA Technical Reports Server (NTRS)

    Tinker, Michael L.

    1998-01-01

    Application of the free-suspension residual flexibility modal test method to the International Space Station Pathfinder structure is described. The Pathfinder, a large structure of the general size and weight of Space Station module elements, was also tested in a large fixed-base fixture to simulate Shuttle Orbiter payload constraints. After correlation of the Pathfinder finite element model to residual flexibility test data, the model was coupled to a fixture model, and constrained modes and frequencies were compared to fixed-base test modes. The residual flexibility model compared very favorably to the results of the fixed-base test. This is the first known direct comparison of free-suspension residual flexibility and fixed-base test results for a large structure. The model correlation approach used by the author for residual flexibility data is presented. Frequency response functions (FRF) for the regions of the structure that interface with the environment (a test fixture or another structure) are shown to be the primary tools for model correlation that distinguish or characterize the residual flexibility approach. A number of critical issues related to the use of the structure interface FRF for correlating the model are then identified and discussed, including (1) the requirement of prominent stiffness lines, (2) overcoming problems with measurement noise, which makes the antiresonances or minima in the functions difficult to identify, and (3) the use of interface stiffness and lumped mass perturbations to bring the analytical responses into agreement with test data. It is shown that good comparison of analytical-to-experimental FRF is the key to obtaining good agreement of the residual flexibility values.
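
    The interface FRF central to this correlation approach can be illustrated with a toy model; the two-DOF mass, stiffness and damping values below are invented, and a real correlation would compare such curves against measured ones.

        import numpy as np

        M = np.diag([1.0, 0.5])                        # hypothetical interface model
        K = np.array([[200.0, -50.0], [-50.0, 100.0]])
        C = 0.01 * K                                   # light proportional damping

        omegas = 2 * np.pi * np.linspace(0.1, 5.0, 500)  # rad/s
        frf = np.array([np.linalg.inv(K - w**2 * M + 1j * w * C) for w in omegas])
        drive_point = np.abs(frf[:, 0, 0])             # interface drive-point FRF
        # The stiffness lines and antiresonance minima of this curve are the
        # features the residual flexibility correlation relies on.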

  1. New error calibration tests for gravity models using subset solutions and independent data - Applied to GEM-T3

    NASA Technical Reports Server (NTRS)

    Lerch, F. J.; Nerem, R. S.; Chinn, D. S.; Chan, J. C.; Patel, G. B.; Klosko, S. M.

    1993-01-01

    A new method has been developed to provide a direct test of the error calibrations of gravity models based on actual satellite observations. The basic approach projects the error estimates of the gravity model parameters onto satellite observations, and the results of these projections are then compared with data residuals computed from the orbital fits. To allow specific testing of the gravity error calibrations, subset solutions are computed based on the data set and data weighting of the gravity model. The approach is demonstrated using GEM-T3 to show that the gravity error estimates are well calibrated and that reliable predictions of orbit accuracies can be achieved for independent orbits.

  2. Usability Methods for Ensuring Health Information Technology Safety: Evidence-Based Approaches. Contribution of the IMIA Working Group Health Informatics for Patient Safety.

    PubMed

    Borycki, E; Kushniruk, A; Nohr, C; Takeda, H; Kuwata, S; Carvalho, C; Bainbridge, M; Kannry, J

    2013-01-01

    Issues related to lack of system usability and potential safety hazards continue to be reported in the health information technology (HIT) literature. Usability engineering methods are increasingly used to ensure improved system usability and they are also beginning to be applied more widely for ensuring the safety of HIT applications. These methods are being used in the design and implementation of many HIT systems. In this paper we describe evidence-based approaches to applying usability engineering methods. A multi-phased approach to ensuring system usability and safety in healthcare is described. Usability inspection methods are first described including the development of evidence-based safety heuristics for HIT. Laboratory-based usability testing is then conducted under artificial conditions to test if a system has any base level usability problems that need to be corrected. Usability problems that are detected are corrected and then a new phase is initiated where the system is tested under more realistic conditions using clinical simulations. This phase may involve testing the system with simulated patients. Finally, an additional phase may be conducted, involving a naturalistic study of system use under real-world clinical conditions. The methods described have been employed in the analysis of the usability and safety of a wide range of HIT applications, including electronic health record systems, decision support systems and consumer health applications. It has been found that at least usability inspection and usability testing should be applied prior to the widespread release of HIT. However, wherever possible, additional layers of testing involving clinical simulations and a naturalistic evaluation will likely detect usability and safety issues that may not otherwise be detected prior to widespread system release. The framework presented in the paper can be applied in order to develop more usable and safer HIT, based on multiple layers of evidence.

  3. Decision-Level Fusion of Spatially Scattered Multi-Modal Data for Nondestructive Inspection of Surface Defects

    PubMed Central

    Heideklang, René; Shokouhi, Parisa

    2016-01-01

    This article focuses on the fusion of flaw indications from multi-sensor nondestructive materials testing. Because each testing method makes use of a different physical principle, a multi-method approach has the potential of effectively differentiating actual defect indications from the many false alarms, thus enhancing detection reliability. In this study, we propose a new technique for aggregating scattered two- or three-dimensional sensory data. Using a density-based approach, the proposed method explicitly addresses localization uncertainties such as registration errors. This feature marks one of the major advantages of this approach over pixel-based image fusion techniques. We provide guidelines on how to set all the key parameters and demonstrate the technique's robustness. Finally, we apply our fusion approach to experimental data and demonstrate its capability to locate small defects by substantially reducing false alarms under conditions where no single-sensor method is adequate. PMID:26784200
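
    A density-based aggregation of scattered indications can be roughed out with a kernel density estimate; the coordinates, bandwidth and threshold below are invented for illustration, and the published method additionally encodes localization uncertainty in how the density is built.

        import numpy as np
        from scipy.stats import gaussian_kde

        # Hypothetical 2-D flaw indications from two NDT methods
        et = np.array([[1.0, 2.1], [1.1, 2.0], [5.0, 5.2]])   # e.g. eddy current
        mt = np.array([[1.05, 2.05], [7.0, 1.0]])             # e.g. magnetic testing

        kde = gaussian_kde(np.vstack([et, mt]).T, bw_method=0.3)
        gx, gy = np.mgrid[0:8:200j, 0:8:200j]
        density = kde(np.vstack([gx.ravel(), gy.ravel()])).reshape(gx.shape)
        fused = density > 0.5 * density.max()  # keep regions where methods agree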

  4. Helicopter Approach Capability Using the Differential Global Positioning System

    NASA Technical Reports Server (NTRS)

    Kaufmann, David N.

    1994-01-01

    The results of flight tests to determine the feasibility of using the Global Positioning System (GPS) in the Differential mode (DGPS) to provide high accuracy, precision navigation and guidance for helicopter approaches to landing are presented. The airborne DGPS receiver and associated equipment is installed in a NASA UH-60 Black Hawk helicopter. The ground-based DGPS reference receiver is located at a surveyed test site and is equipped with a real-time VHF data link to transmit correction information to the airborne DGPS receiver. The corrected airborne DGPS information, together with the preset approach geometry, is used to calculate guidance commands which are sent to the aircraft's approach guidance instruments. The use of DGPS derived guidance for helicopter approaches to landing is evaluated by comparing the DGPS data with the laser tracker truth data. The errors indicate that the helicopter position based on DGPS guidance satisfies the International Civil Aviation Organization (ICAO) Category 1 (CAT 1) lateral and vertical navigational accuracy requirements.

  5. The Simplified Aircraft-Based Paired Approach With the ALAS Alerting Algorithm

    NASA Technical Reports Server (NTRS)

    Perry, Raleigh B.; Madden, Michael M.; Torres-Pomales, Wilfredo; Butler, Ricky W.

    2013-01-01

    This paper presents the results of an investigation of a proposed concept for closely spaced parallel runways called the Simplified Aircraft-based Paired Approach (SAPA). This procedure depends upon a new alerting algorithm called the Adjacent Landing Alerting System (ALAS). This study used both low fidelity and high fidelity simulations to validate the SAPA procedure and test the performance of the new alerting algorithm. The low fidelity simulation enabled a determination of minimum approach distance for the worst case over millions of scenarios. The high fidelity simulation enabled an accurate determination of timings and minimum approach distance in the presence of realistic trajectories, communication latencies, and total system error for 108 test cases. The SAPA procedure and the ALAS alerting algorithm were applied to the 750-ft parallel spacing (e.g., SFO 28L/28R) approach problem. With the SAPA procedure as defined in this paper, this study concludes that a 750-ft application does not appear to be feasible, but preliminary results for 1000-ft parallel runways look promising.

  6. Performance enhancement of low-cost, high-accuracy, state estimation for vehicle collision prevention system using ANFIS

    NASA Astrophysics Data System (ADS)

    Saadeddin, Kamal; Abdel-Hafez, Mamoun F.; Jaradat, Mohammad A.; Jarrah, Mohammad Amin

    2013-12-01

    In this paper, a low-cost navigation system that fuses the measurements of the inertial navigation system (INS) and the global positioning system (GPS) receiver is developed. First, the system's dynamics are obtained based on a vehicle's kinematic model. Second, the INS and GPS measurements are fused using an extended Kalman filter (EKF) approach. Subsequently, an artificial intelligence based approach for the fusion of INS/GPS measurements is developed based on an Input-Delayed Adaptive Neuro-Fuzzy Inference System (IDANFIS). Experimental tests are conducted to demonstrate the performance of the two sensor fusion approaches. It is found that the use of the proposed IDANFIS approach achieves a reduction in the integration development time and an improvement in the estimation accuracy of the vehicle's position and velocity compared to the EKF based approach.
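
    The paper's EKF operates on a full vehicle kinematic model and is then compared against the IDANFIS fusion; neither is reproduced here. The sketch below is a deliberately reduced one-dimensional Kalman filter that shows only the predict/update pattern of INS/GPS fusion: the INS accelerometer drives the prediction at a high rate and the GPS position corrects it at a lower rate. All noise levels and signals are illustrative.

    ```python
    import numpy as np

    dt = 0.1
    F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition for [position, velocity]
    B = np.array([[0.5 * dt**2], [dt]])     # INS acceleration enters as a control input
    H = np.array([[1.0, 0.0]])              # GPS measures position only
    Q = 0.05 * np.eye(2)                    # process noise (covers INS drift)
    R = np.array([[4.0]])                   # GPS noise (~2 m standard deviation)

    x, P = np.zeros((2, 1)), np.eye(2)
    rng = np.random.default_rng(0)
    for k in range(100):
        a_ins = 0.2 + rng.normal(0, 0.05)   # noisy accelerometer reading (true a = 0.2)
        x = F @ x + B * a_ins               # high-rate prediction with INS
        P = F @ P @ F.T + Q
        if k % 10 == 0:                     # GPS fix arrives at 1 Hz
            z = np.array([[0.1 * (k * dt) ** 2]]) + rng.normal(0, 2.0)  # true pos = 0.5*a*t^2
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)  # Kalman gain
            x = x + K @ (z - H @ x)
            P = (np.eye(2) - K @ H) @ P
    print(f"fused position estimate after 10 s: {x[0, 0]:.2f} m")
    ```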

  7. Image analysis by integration of disparate information

    NASA Technical Reports Server (NTRS)

    Lemoigne, Jacqueline

    1993-01-01

    Image analysis often starts with some preliminary segmentation which provides a representation of the scene needed for further interpretation. Segmentation can be performed in several ways, which are categorized as pixel-based, edge-based, and region-based. Each of these approaches is affected differently by various factors, and the final result may be improved by integrating several or all of these methods, thus taking advantage of their complementary nature. In this paper, we propose an approach that integrates pixel-based and edge-based results by utilizing an iterative relaxation technique. This approach has been implemented on a massively parallel computer and tested on some remotely sensed imagery from the Landsat-Thematic Mapper (TM) sensor.

  8. Potential Mediators in Parenting and Family Intervention: Quality of Mediation Analyses

    PubMed Central

    Patel, Chandni C.; Fairchild, Amanda J.; Prinz, Ronald J.

    2017-01-01

    Parenting and family interventions have repeatedly shown effectiveness in preventing and treating a range of youth outcomes. Accordingly, investigators in this area have conducted a number of studies using statistical mediation to examine some of the potential mechanisms of action by which these interventions work. This review examined from a methodological perspective in what ways and how well the family-based intervention studies tested statistical mediation. A systematic search identified 73 published outcome studies that tested mediation for family-based interventions across a wide range of child and adolescent outcomes (i.e., externalizing, internalizing, and substance-abuse problems; high-risk sexual activity; and academic achievement), for putative mediators pertaining to positive and negative parenting, family functioning, youth beliefs and coping skills, and peer relationships. Taken as a whole, the studies used designs that adequately addressed temporal precedence. The majority of studies used the product of coefficients approach to mediation, which is preferred, and less limiting than the causal steps approach. Statistical significance testing did not always make use of the most recently developed approaches, which would better accommodate small sample sizes and more complex functions. Specific recommendations are offered for future mediation studies in this area with respect to full longitudinal design, mediation approach, significance testing method, documentation and reporting of statistics, testing of multiple mediators, and control for Type I error. PMID:28028654
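
    The product-of-coefficients approach mentioned above has a compact numerical form: the indirect effect is the product of the X-to-mediator slope (a) and the mediator-to-outcome slope controlling for X (b). As a minimal sketch (simulated data, hypothetical effect sizes), the code below estimates a*b and attaches a percentile bootstrap confidence interval, one of the resampling methods better suited to small samples than normal-theory tests.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 150
    X = rng.binomial(1, 0.5, n).astype(float)     # 0/1 intervention assignment
    M = 0.5 * X + rng.normal(0, 1, n)             # mediator model: true a = 0.5
    Y = 0.4 * M + 0.1 * X + rng.normal(0, 1, n)   # outcome model: true b = 0.4

    def ab(idx):
        x, m, y = X[idx], M[idx], Y[idx]
        a = np.polyfit(x, m, 1)[0]                          # slope of M ~ X
        Zb = np.column_stack([np.ones_like(x), x, m])
        b = np.linalg.lstsq(Zb, y, rcond=None)[0][2]        # slope of Y ~ M, adjusting for X
        return a * b

    boot = [ab(rng.integers(0, n, n)) for _ in range(2000)]
    lo, hi = np.percentile(boot, [2.5, 97.5])
    print(f"indirect effect a*b = {ab(np.arange(n)):.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
    ```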

  9. An omnibus test for family-based association studies with multiple SNPs and multiple phenotypes.

    PubMed

    Lasky-Su, Jessica; Murphy, Amy; McQueen, Matthew B; Weiss, Scott; Lange, Christoph

    2010-06-01

    We propose an omnibus family-based association test (MFBAT) that can be applied to multiple markers and multiple phenotypes and that has only one degree of freedom. The proposed test statistic extends current FBAT methodology to incorporate multiple markers as well as multiple phenotypes. Using simulation studies, power estimates for the proposed methodology are compared with the standard methodologies. On the basis of these simulations, we find that MFBAT substantially outperforms other methods, including haplotypic approaches and doing multiple tests with single single-nucleotide polymorphisms (SNPs) and single phenotypes. The practical relevance of the approach is illustrated by an application to asthma in which SNP/phenotype combinations are identified and reach overall significance that would not have been identified using other approaches. This methodology is directly applicable to cases in which there are multiple SNPs, such as candidate gene studies, cases in which there are multiple phenotypes, such as expression data, and cases in which there are multiple phenotypes and genotypes, such as genome-wide association studies that incorporate expression profiles as phenotypes. This program is available in the PBAT analysis package.

  10. 40 CFR 80.47 - Performance-based Analytical Test Method Approach.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... chemistry and statistics, or at least a bachelor's degree in chemical engineering, from an accredited... be compensated for any known chemical interferences using good laboratory practices. (3) The test... section, individual test results shall be compensated for any known chemical interferences using good...

  11. A Different Approach to the Generation of Patient Management Problems from a Knowledge-Based System

    PubMed Central

    Barriga, Rosa Maria

    1988-01-01

    Several strategies are proposed to approach the generation of Patient Management Problems from a Knowledge Base and to avoid inconsistencies in the results. These strategies are based on a different Knowledge Base structure and on the use of case introductions that describe patient attributes that are not disease-dependent. This methodology proved effective in a recent pilot test and is on its way to implementation as part of an educational program at CWRU, School of Medicine.

  12. Localizing Ground Penetrating RADAR: A Step Towards Robust Autonomous Ground Vehicle Localization

    DTIC Science & Technology

    2016-07-14

    localization designed to complement existing approaches with a low sensitivity to failure modes of LIDAR, camera, and GPS/INS sensors due to its low...the detailed design and results from highway testing, which uses a simple heuristic for fusing LGPR estimates with a GPS/INS system. Cross-track... designed to enable a priori map-based localization. LGPR offers complementary capabilities to traditional optics-based approaches to map-based

  13. Developing Physics Textbook Based on Cognitive Conflict for Deeper Conceptual Understanding and Better Characters

    NASA Astrophysics Data System (ADS)

    Linuwih, S.; Lurinda, N. W.; Fianti

    2017-04-01

    This study aims to develop a textbook based on a cognitive conflict approach, to assess the properness and legibility of the textbook, and to determine the effect of using the textbook on increasing conceptual understanding and improving the character of the students. The study was conducted with a research and development method, employing a non-equivalent control group design to test the product. The subjects were tenth-grade students of SMA N 1 Gubug in the second semester of 2015/2016. The properness test used a properness questionnaire, while the legibility test used a cloze test. The data on conceptual understanding were taken from pretest-posttest results, and the data on characters were taken from direct observation. From the analysis of the data, we concluded that the textbook based on the cognitive conflict approach was very proper to use, with high legibility. By applying this textbook, students are helped to reach a deeper conceptual understanding and better characters.

  14. Locality-preserving sparse representation-based classification in hyperspectral imagery

    NASA Astrophysics Data System (ADS)

    Gao, Lianru; Yu, Haoyang; Zhang, Bing; Li, Qingting

    2016-10-01

    This paper proposes to combine locality-preserving projections (LPP) and sparse representation (SR) for hyperspectral image classification. The LPP is first used to reduce the dimensionality of all the training and testing data by finding the optimal linear approximations to the eigenfunctions of the Laplace Beltrami operator on the manifold, where the high-dimensional data lies. Then, SR codes the projected testing pixels as sparse linear combinations of all the training samples to classify the testing pixels by evaluating which class leads to the minimum approximation error. The integration of LPP and SR represents an innovative contribution to the literature. The proposed approach, called locality-preserving SR-based classification, addresses the imbalance between high dimensionality of hyperspectral data and the limited number of training samples. Experimental results on three real hyperspectral data sets demonstrate that the proposed approach outperforms the original counterpart, i.e., SR-based classification.
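
    As a hedged sketch of the classification stage only: sparse-representation classification (SRC) codes each projected test pixel over the training dictionary and assigns the class whose samples best reconstruct it. The paper's locality-preserving projection is not available in common libraries, so PCA is used below as a plainly named stand-in projection; scikit-learn is assumed, and all names are illustrative.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import Lasso

    def src_predict(X_train, y_train, X_test, n_dims=10, alpha=0.01):
        """SRC after a linear projection. X_* are (n_pixels, n_bands) spectra,
        y_train is a numpy array of class labels. PCA stands in for LPP."""
        proj = PCA(n_components=n_dims).fit(X_train)
        A = proj.transform(X_train).T            # dictionary: columns = training pixels
        preds = []
        for x in proj.transform(X_test):
            coef = Lasso(alpha=alpha, max_iter=5000).fit(A, x).coef_  # sparse code
            # class whose training samples best reconstruct x wins
            resid = {c: np.linalg.norm(x - A[:, y_train == c] @ coef[y_train == c])
                     for c in np.unique(y_train)}
            preds.append(min(resid, key=resid.get))
        return np.array(preds)
    ```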

  15. Impact of Active Teaching Methods Implemented on Therapeutic Chemistry Module: Performance and Impressions of First-Year Pharmacy Students

    ERIC Educational Resources Information Center

    Derfoufi, Sanae; Benmoussa, Adnane; El Harti, Jaouad; Ramli, Youssef; Taoufik, Jamal; Chaouir, Souad

    2015-01-01

    This study investigates the positive impact of the Case Method implemented during a 4-hour tutorial in the "therapeutic chemistry module." We view the Case Method as one particular approach within the broader spectrum of problem-based or inquiry-based learning approaches. Sixty students were included in data analysis. A pre-test and…

  16. An information communication technology based approach for the acquisition of critical thinking skills.

    PubMed

    Pucer, Patrik; Trobec, Irena; Žvanut, Boštjan

    2014-06-01

    Both academics and practitioners agree that critical thinking skills are necessary to provide safe and comprehensive nursing care. In order to promote the development of critical thinking, nurse educators need to keep the teaching/learning process captivating and interesting using active learning environments. These can be implemented by using modern information and communication technologies that are simple, fun, and time and cost effective. The goal of our study was to design and test an approach, which allows individual and fast acquisition of critical thinking skills with the use of information and communication technology. A combination of qualitative and quantitative research design was implemented. The study consisted of a quasi-experiment (phases 1-3): (1) pre-test discussion board, (2) use of e-contents based on the presented approach, and (3) post-test discussion board. The participants' opinion about the presented approach was identified in phase 4. The study was performed in May 2012 during the course "Ethics and Philosophy in Nursing" at the Faculty of Health Sciences, University of Primorska, Slovenia. Forty first-year undergraduate nursing students. Qualitative analysis of the discussion boards (phases 1, 3) and an anonymous survey with open- and closed-ended questions (phase 4). Qualitative analysis of the discussion boards showed a significant (p<0.001) improvement in the percentage of posts (12.2%) for which the opinions and conclusions of the participants were justified with valid arguments. The survey results indicated that participants perceived the e-contents based on the presented approach as useful, and that they improved their critical thinking skills. Repeated confirmation of the validity of the presented approach through methodological triangulation represents a strong indication that the presented approach is a valuable tool to develop nursing students' critical thinking skills. Copyright © 2014 Elsevier Ltd. All rights reserved.

  17. Testing a Threshold-Based Bed Bug Management Approach in Apartment Buildings.

    PubMed

    Singh, Narinderpal; Wang, Changlu; Zha, Chen; Cooper, Richard; Robson, Mark

    2017-07-26

    We tested a threshold-based bed bug ( Cimex lectularius L.) management approach with the goal of achieving elimination with minimal or no insecticide application. Thirty-two bed bug infested apartments were identified. These apartments were divided into four treatment groups based on apartment size and initial bed bug count, obtained through a combination of visual inspection and bed bug monitors: I- Non-chemical only in apartments with 1-12 bed bug count, II- Chemical control only in apartments with 1-12 bed bug count, III- Non-chemical and chemical control in apartments with >12 bed bug count, and IV- Chemical control only in apartments with ≥11 bed bug count. All apartments were monitored or treated once every two weeks for a maximum of 28 wk. Treatment I eliminated bed bugs in a similar amount of time to treatment II. Time to eliminate bed bugs was similar between treatment III and IV but required significantly less insecticide spray in treatment III than that in treatment IV. A threshold-based management approach (non-chemical only or non-chemical and chemical) can eliminate bed bugs in a similar amount of time, using little to no pesticide compared to a chemical only approach.

  18. Evaluation of an online, case-based interactive approach to teaching pathophysiology.

    PubMed

    Van Dijken, Pieter Canham; Thévoz, Sara; Jucker-Kupper, Patrick; Feihl, François; Bonvin, Raphaël; Waeber, Bernard

    2008-06-01

    The aim of this study was to evaluate a new pedagogical approach in teaching fluid, electrolyte and acid-base pathophysiology in undergraduate students. This approach comprises traditional lectures, the study of clinical cases on the web and a final interactive discussion of these cases in the classroom. When on the web, the students are asked to select laboratory tests that seem most appropriate to understand the pathophysiological condition underlying the clinical case. The percentage of students having chosen a given test is made available to the teacher who uses it in an interactive session to stimulate discussion with the whole class of students. The same teacher used the same case studies during 2 consecutive years during the third year of the curriculum. The majority of students answered the questions on the web as requested and evaluated positively their experience with this form of teaching and learning. Complementing traditional lectures with online case-based studies and interactive group discussions represents, therefore, a simple means to promote the learning and the understanding of complex pathophysiological mechanisms. This simple problem-based approach to teaching and learning may be implemented to cover all fields of medicine.

  19. Testing a Threshold-Based Bed Bug Management Approach in Apartment Buildings

    PubMed Central

    Singh, Narinderpal; Zha, Chen; Cooper, Richard; Robson, Mark

    2017-01-01

    We tested a threshold-based bed bug (Cimex lectularius L.) management approach with the goal of achieving elimination with minimal or no insecticide application. Thirty-two bed bug infested apartments were identified. These apartments were divided into four treatment groups based on apartment size and initial bed bug count, obtained through a combination of visual inspection and bed bug monitors: I- Non-chemical only in apartments with 1–12 bed bug count, II- Chemical control only in apartments with 1–12 bed bug count, III- Non-chemical and chemical control in apartments with >12 bed bug count, and IV- Chemical control only in apartments with ≥11 bed bug count. All apartments were monitored or treated once every two weeks for a maximum of 28 wk. Treatment I eliminated bed bugs in a similar amount of time to treatment II. Time to eliminate bed bugs was similar between treatment III and IV but required significantly less insecticide spray in treatment III than that in treatment IV. A threshold-based management approach (non-chemical only or non-chemical and chemical) can eliminate bed bugs in a similar amount of time, using little to no pesticide compared to a chemical only approach. PMID:28933720

  20. Potential testing of reprocessing procedures by real-time polymerase chain reaction: A multicenter study of colonoscopy devices.

    PubMed

    Valeriani, Federica; Agodi, Antonella; Casini, Beatrice; Cristina, Maria Luisa; D'Errico, Marcello Mario; Gianfranceschi, Gianluca; Liguori, Giorgio; Liguori, Renato; Mucci, Nicolina; Mura, Ida; Pasquarella, Cesira; Piana, Andrea; Sotgiu, Giovanni; Privitera, Gaetano; Protano, Carmela; Quattrocchi, Annalisa; Ripabelli, Giancarlo; Rossini, Angelo; Spagnolo, Anna Maria; Tamburro, Manuela; Tardivo, Stefano; Veronesi, Licia; Vitali, Matteo; Romano Spica, Vincenzo

    2018-02-01

    Reprocessing of endoscopes is key to preventing cross-infection after colonoscopy. Culture-based methods are recommended for monitoring, but alternative and rapid approaches are needed to improve surveillance and reduce turnover times. A molecular strategy based on detection of residual traces from gut microbiota was developed and tested using a multicenter survey. A simplified sampling and DNA extraction protocol using nylon-tipped flocked swabs was optimized. A multiplex real-time polymerase chain reaction (PCR) test was developed that targeted 6 bacterial genes that were amplified in 3 mixes. The method was validated by interlaboratory tests involving 5 reference laboratories. Colonoscopy devices (n = 111) were sampled in 10 Italian hospitals. Culture-based microbiology and metagenomic tests were performed to verify PCR data. The sampling method was easily applied in all 10 endoscopy units and the optimized DNA extraction and amplification protocol was successfully performed by all of the involved laboratories. This PCR-based method allowed identification of both contaminated (n = 59) and fully reprocessed endoscopes (n = 52) with high sensitivity (98%) and specificity (98%), within 3-4 hours, in contrast to the 24-72 hours needed for a classic microbiology test. Results were confirmed by next-generation sequencing and classic microbiology. A novel approach for monitoring reprocessing of colonoscopy devices was developed and successfully applied in a multicenter survey. The general principle of tracing biological fluids through microflora DNA amplification was successfully applied and may represent a promising approach for hospital hygiene. Copyright © 2018 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.

  1. Classification of motor imagery tasks for BCI with multiresolution analysis and multiobjective feature selection.

    PubMed

    Ortega, Julio; Asensio-Cubero, Javier; Gan, John Q; Ortiz, Andrés

    2016-07-15

    Brain-computer interfacing (BCI) applications based on the classification of electroencephalographic (EEG) signals require solving high-dimensional pattern classification problems with such a relatively small number of training patterns that curse-of-dimensionality problems usually arise. Multiresolution analysis (MRA) has useful properties for signal analysis in both temporal and spectral analysis, and has been broadly used in the BCI field. However, MRA usually increases the dimensionality of the input data. Therefore, some approaches to feature selection or feature dimensionality reduction should be considered for improving the performance of the MRA based BCI. This paper investigates feature selection in the MRA-based frameworks for BCI. Several wrapper approaches to evolutionary multiobjective feature selection are proposed with different structures of classifiers. They are evaluated by comparing with baseline methods using sparse representation of features or without feature selection. The statistical analysis, by applying the Kolmogorov-Smirnov and Kruskal-Wallis tests to the means of the Kappa values evaluated by using the test patterns in each approach, has demonstrated some advantages of the proposed approaches. In comparison with the baseline MRA approach used in previous studies, the proposed evolutionary multiobjective feature selection approaches provide similar or even better classification performances, with significant reduction in the number of features that need to be computed.
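
    The statistical comparison described above maps directly onto standard SciPy routines. The snippet below runs a Kruskal-Wallis test across three approaches and a two-sample Kolmogorov-Smirnov follow-up against the baseline; the per-fold Kappa values are illustrative numbers, not the paper's data.

    ```python
    from scipy import stats

    # Hypothetical Kappa values per cross-validation fold for three variants.
    kappa_baseline = [0.52, 0.48, 0.55, 0.50, 0.47]
    kappa_moea_a   = [0.58, 0.61, 0.57, 0.60, 0.56]
    kappa_moea_b   = [0.59, 0.57, 0.62, 0.58, 0.60]

    # Kruskal-Wallis: do the three approaches share the same Kappa distribution?
    H, p_kw = stats.kruskal(kappa_baseline, kappa_moea_a, kappa_moea_b)
    # Two-sample Kolmogorov-Smirnov: pairwise follow-up against the baseline.
    D, p_ks = stats.ks_2samp(kappa_baseline, kappa_moea_a)
    print(f"Kruskal-Wallis p={p_kw:.4f}, KS (baseline vs variant A) p={p_ks:.4f}")
    ```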

  2. Automating Media Centers and Small Libraries: A Microcomputer-Based Approach.

    ERIC Educational Resources Information Center

    Meghabghab, Dania Bilal

    Although the general automation process can be applied to most libraries, small libraries and media centers require a customized approach. Using a systematic approach, this guide covers each step and aspect of automation in a small library setting, and combines the principles of automation with field- tested activities. After discussing needs…

  3. The Implementation of Contextual Approach in Solving Problems Understanding Syntax: "Sentence" Indonesian at Universities in Surakarta, Indonesia

    ERIC Educational Resources Information Center

    Wahyuni, Tutik; Suwandi, Sarwiji; Slamet, St. Y.; Andayani

    2015-01-01

    This study aims to: (1) assess the content of the textbook Syntax: "Sentence" in bahasa Indonesia based on a needs analysis; (2) analyze the breakdown of understanding Syntax: "Sentence" in Indonesian with a contextual approach; (3) test the effectiveness of understanding Syntax: "Sentence" in Indonesian with a contextual approach.…

  4. The Metamemory Approach to Confidence: A Test Using Semantic Memory

    ERIC Educational Resources Information Center

    Brewer, William F.; Sampaio, Cristina

    2012-01-01

    The metamemory approach to memory confidence was extended and elaborated to deal with semantic memory tasks. The metamemory approach assumes that memory confidence is based on the products and processes of a completed memory task, as well as metamemory beliefs that individuals have about how their memory products and processes relate to memory…

  5. An efficient sampling approach for variance-based sensitivity analysis based on the law of total variance in the successive intervals without overlapping

    NASA Astrophysics Data System (ADS)

    Yun, Wanying; Lu, Zhenzhou; Jiang, Xian

    2018-06-01

    To efficiently execute variance-based global sensitivity analysis, this paper first proves the law of total variance over successive non-overlapping intervals and then, building on that result, proposes an efficient space-partition sampling-based approach. Through partitioning the sample points of the output into different subsets according to different inputs, the proposed approach can efficiently evaluate all the main effects concurrently from one group of sample points. In addition, there is no need to optimize the partition scheme in the proposed approach. The maximum length of the subintervals is decreased by increasing the number of sample points of the model input variables, which guarantees the convergence condition of the space-partition approach. Furthermore, a new interpretation of the idea of partitioning is illuminated from the perspective of the variance ratio function. Finally, three test examples and one engineering application are employed to demonstrate the accuracy, efficiency and robustness of the proposed approach.
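
    In the spirit of the space-partition idea (this is an illustrative binned estimator, not the authors' exact algorithm), first-order Sobol indices can be estimated from a single sample set by partitioning each input's range into successive non-overlapping intervals and applying the law of total variance, S_i = Var(E[Y|X_i]) / Var(Y). The test function below is a hypothetical stand-in.

    ```python
    import numpy as np

    def first_order_indices(X, y, n_bins=20):
        """Estimate all first-order Sobol indices from ONE sample set by
        binning each input into successive non-overlapping quantile intervals."""
        var_y, S = y.var(), []
        for i in range(X.shape[1]):
            edges = np.quantile(X[:, i], np.linspace(0, 1, n_bins + 1))
            idx = np.clip(np.searchsorted(edges, X[:, i], side="right") - 1,
                          0, n_bins - 1)
            means = np.array([y[idx == b].mean() for b in range(n_bins)])
            probs = np.array([(idx == b).mean() for b in range(n_bins)])
            S.append(np.sum(probs * (means - y.mean()) ** 2) / var_y)  # Var(E[Y|Xi])/Var(Y)
        return np.array(S)

    # Check: y depends strongly on x0; x2 acts only through interaction with x0,
    # and x1 not at all, so both of their first-order indices should be near zero.
    rng = np.random.default_rng(2)
    X = rng.uniform(-np.pi, np.pi, (20000, 3))
    y = np.sin(X[:, 0]) + 0.1 * X[:, 2] ** 4 * np.sin(X[:, 0])
    print(first_order_indices(X, y))   # roughly [0.56, 0.0, 0.0]
    ```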

  6. Orion Flight Test 1 Architecture: Observed Benefits of a Model Based Engineering Approach

    NASA Technical Reports Server (NTRS)

    Simpson, Kimberly A.; Sindiy, Oleg V.; McVittie, Thomas I.

    2012-01-01

    This paper details how a NASA-led team is using a model-based systems engineering approach to capture, analyze and communicate the end-to-end information system architecture supporting the first unmanned orbital flight of the Orion Multi-Purpose Crew Exploration Vehicle. Along with a brief overview of the approach and its products, the paper focuses on the observed program-level benefits, challenges, and lessons learned, all of which may be applied to improve systems engineering tasks facing characteristically similar challenges.

  7. A new modeling and inference approach for the Systolic Blood Pressure Intervention Trial outcomes.

    PubMed

    Yang, Song; Ambrosius, Walter T; Fine, Lawrence J; Bress, Adam P; Cushman, William C; Raj, Dominic S; Rehman, Shakaib; Tamariz, Leonardo

    2018-06-01

    Background/aims In clinical trials with time-to-event outcomes, usually the significance tests and confidence intervals are based on a proportional hazards model. Thus, the temporal pattern of the treatment effect is not directly considered. This could be problematic if the proportional hazards assumption is violated, as such violation could impact both interim and final estimates of the treatment effect. Methods We describe the application of inference procedures developed recently in the literature for time-to-event outcomes when the treatment effect may or may not be time-dependent. The inference procedures are based on a new model which contains the proportional hazards model as a sub-model. The temporal pattern of the treatment effect can then be expressed and displayed. The average hazard ratio is used as the summary measure of the treatment effect. The test of the null hypothesis uses adaptive weights that often lead to improvement in power over the log-rank test. Results Without needing to assume proportional hazards, the new approach yields results consistent with previously published findings in the Systolic Blood Pressure Intervention Trial. It provides a visual display of the time course of the treatment effect. At four of the five scheduled interim looks, the new approach yields smaller p values than the log-rank test. The average hazard ratio and its confidence interval indicates a treatment effect nearly a year earlier than a restricted mean survival time-based approach. Conclusion When the hazards are proportional between the comparison groups, the new methods yield results very close to the traditional approaches. When the proportional hazards assumption is violated, the new methods continue to be applicable and can potentially be more sensitive to departure from the null hypothesis.
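
    The adaptively weighted test of the new approach is not, to my knowledge, available in standard libraries, but the conventional log-rank comparator it is benchmarked against can be run with the lifelines package (assumed installed). The data below are simulated two-arm survival times, not SPRINT data.

    ```python
    import numpy as np
    from lifelines.statistics import logrank_test

    rng = np.random.default_rng(3)
    t_ctrl = rng.exponential(10.0, 300)      # control arm event times (years)
    t_trt = rng.exponential(13.0, 300)       # treated arm: lower hazard
    obs_ctrl = t_ctrl < 8.0                  # administrative censoring at 8 years
    obs_trt = t_trt < 8.0
    t_ctrl, t_trt = np.minimum(t_ctrl, 8.0), np.minimum(t_trt, 8.0)

    res = logrank_test(t_ctrl, t_trt,
                       event_observed_A=obs_ctrl, event_observed_B=obs_trt)
    print(f"log-rank chi2 = {res.test_statistic:.2f}, p = {res.p_value:.4f}")
    ```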

  8. A robust test for growth hormone doping--present status and future prospects.

    PubMed

    Nelson, Anne E; Ho, Ken K

    2008-05-01

    Although doping with growth hormone (GH) is banned, there is anecdotal evidence that it is widely abused. GH is reportedly used often in combination with anabolic steroids at high doses for several months. Development of a robust test for GH has been challenging because recombinant human 22 kDa (22K) GH used in doping is indistinguishable analytically from endogenous GH and there are wide physiological fluctuations in circulating GH concentrations. One approach to GH testing is based on measurement of different circulating GH isoforms using immunoassays that differentiate between 22K and other GH isoforms. Administration of 22K GH results in a change in its abundance relative to other endogenous pituitary GH isoforms. The differential isoform method has been implemented; however, its utility is limited because of the short window of opportunity of detection. The second approach, which will extend the window of opportunity of detection, is based on the detection of increased levels of circulating GH-responsive proteins, such as insulin-like growth factor (IGF) axis and collagen peptides. Age and gender are the major determinants of variability for IGF-I and the collagen markers; therefore, a test based on these markers must take age into account for men and women. Extensive data is now available that validates the GH-responsive marker approach and implementation is now largely dependent on establishing an assured supply of standardized assays. Future directions will include more widespread implementation of both approaches by the World Anti-Doping Agency, possible use of other platforms for measurement and an athlete's passport to establish individual reference levels for biological parameters such as GH-responsive markers. Novel approaches include gene expression and proteomic profiling. 2008, Asian Journal of Andrology, SIMM and SJTU. All rights reserved.

  9. A kernel regression approach to gene-gene interaction detection for case-control studies.

    PubMed

    Larson, Nicholas B; Schaid, Daniel J

    2013-11-01

    Gene-gene interactions are increasingly being addressed as a potentially important contributor to the variability of complex traits. Consequently, attention has moved beyond single-locus analysis of association to more complex genetic models. Although several single-marker approaches toward interaction analysis have been developed, such methods suffer from very high testing dimensionality and do not take advantage of existing information, notably the definition of genes as functional units. Here, we propose a comprehensive family of gene-level score tests for identifying genetic elements of disease risk, in particular pairwise gene-gene interactions. Using kernel machine methods, we devise score-based variance component tests under a generalized linear mixed model framework. We conducted simulations based upon coalescent genetic models to evaluate the performance of our approach under a variety of disease models. These simulations indicate that our methods are generally higher powered than alternative gene-level approaches and at worst competitive with exhaustive SNP-level (where SNP is single-nucleotide polymorphism) analyses. Furthermore, we observe that simulated epistatic effects resulted in significant marginal testing results for the involved genes regardless of whether or not true main effects were present. We detail the benefits of our methods and discuss potential genome-wide analysis strategies for gene-gene interaction analysis in a case-control study design. © 2013 WILEY PERIODICALS, INC.
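
    As a heavily hedged sketch of the kernel-machine score idea (the paper fits a generalized linear mixed model with asymptotic variance-component tests; the version below swaps in a permutation p-value and omits covariates): the interaction between two genes can be encoded as the elementwise product of their linear gene kernels, and the score statistic is a quadratic form in the null residuals. All data and names are simulated and hypothetical.

    ```python
    import numpy as np

    def kernel_score_test(G1, G2, y, n_perm=1000, seed=4):
        """Permutation version of a kernel-machine score test for a gene-gene
        interaction. G1, G2 are (n_subjects, n_snps) genotype matrices (0/1/2),
        y is a binary phenotype; no covariates for simplicity."""
        K = (G1 @ G1.T) * (G2 @ G2.T)       # Hadamard product encodes G1 x G2
        r = y - y.mean()                    # residuals under the null model
        q_obs = r @ K @ r                   # score-type quadratic form
        rng = np.random.default_rng(seed)
        q_null = np.array([(p := rng.permutation(r)) @ K @ p for _ in range(n_perm)])
        return (q_null >= q_obs).mean()

    rng = np.random.default_rng(5)
    G1 = rng.binomial(2, 0.3, (200, 5))
    G2 = rng.binomial(2, 0.3, (200, 8))
    y = rng.binomial(1, 0.4, 200).astype(float)   # null data: expect p ~ Uniform(0, 1)
    print(f"permutation p-value: {kernel_score_test(G1, G2, y):.3f}")
    ```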

  10. Understanding evidence-based diagnosis.

    PubMed

    Kohn, Michael A

    2014-01-01

    The real meaning of the word "diagnosis" is naming the disease that is causing a patient's illness. The cognitive process of assigning this name is a mysterious combination of pattern recognition and the hypothetico-deductive approach that is only remotely related to the mathematical process of using test results to update the probability of a disease. What I refer to as "evidence-based diagnosis" is really evidence-based use of medical tests to guide treatment decisions. Understanding how to use test results to update the probability of disease can help us interpret test results more rationally. Also, evidence-based diagnosis reminds us to consider the costs and risks of testing and the dangers of over-diagnosis and over-treatment, in addition to the costs and risks of missing serious disease.
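
    The probability-updating step the abstract refers to is Bayes' theorem on the odds scale: convert the pretest probability to odds, multiply by the test's likelihood ratio, and convert back. A small worked computation with hypothetical test characteristics:

    ```python
    def post_test_probability(pretest_p, sensitivity, specificity, positive=True):
        """Update disease probability from a test result via likelihood ratios:
        LR+ = sens / (1 - spec), LR- = (1 - sens) / spec."""
        lr = sensitivity / (1 - specificity) if positive else (1 - sensitivity) / specificity
        odds = pretest_p / (1 - pretest_p) * lr
        return odds / (1 + odds)

    # Example: 10% pretest probability, test with 90% sensitivity / 80% specificity.
    print(f"after a positive test: {post_test_probability(0.10, 0.90, 0.80):.1%}")        # ~33%
    print(f"after a negative test: {post_test_probability(0.10, 0.90, 0.80, False):.1%}") # ~1.4%
    ```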

  11. Hybrid forecasting of chaotic processes: Using machine learning in conjunction with a knowledge-based model

    NASA Astrophysics Data System (ADS)

    Pathak, Jaideep; Wikner, Alexander; Fussell, Rebeckah; Chandra, Sarthak; Hunt, Brian R.; Girvan, Michelle; Ott, Edward

    2018-04-01

    A model-based approach to forecasting chaotic dynamical systems utilizes knowledge of the mechanistic processes governing the dynamics to build an approximate mathematical model of the system. In contrast, machine learning techniques have demonstrated promising results for forecasting chaotic systems purely from past time series measurements of system state variables (training data), without prior knowledge of the system dynamics. The motivation for this paper is the potential of machine learning for filling in the gaps in our underlying mechanistic knowledge that cause widely-used knowledge-based models to be inaccurate. Thus, we here propose a general method that leverages the advantages of these two approaches by combining a knowledge-based model and a machine learning technique to build a hybrid forecasting scheme. Potential applications for such an approach are numerous (e.g., improving weather forecasting). We demonstrate and test the utility of this approach using a particular illustrative version of a machine learning technique known as reservoir computing, and we apply the resulting hybrid forecaster to a low-dimensional chaotic system, as well as to a high-dimensional spatiotemporal chaotic system. These tests yield extremely promising results in that our hybrid technique is able to accurately predict for a much longer period of time than either its machine-learning component or its model-based component alone.
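
    A compact sketch of the hybrid idea, under stated assumptions: a deliberately mis-parameterized "knowledge-based" model of a logistic map (a stand-in chaotic system, not the paper's experiments) feeds its one-step prediction both into a small reservoir and into the readout features, so a ridge-regression readout can exploit imperfect mechanistic knowledge and data together. All sizes and parameters are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)
    n_res, leak = 300, 0.3
    Win = rng.uniform(-0.5, 0.5, (n_res, 2))       # input: [u_t, model prediction]
    W = rng.uniform(-0.5, 0.5, (n_res, n_res))
    W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # set spectral radius < 1

    def knowledge_model(u):
        return 3.6 * u * (1 - u)                   # imperfect model: true map uses r = 3.8

    # Training data from the true system (logistic map, r = 3.8, chaotic regime).
    u = np.empty(3000); u[0] = 0.4
    for t in range(2999):
        u[t + 1] = 3.8 * u[t] * (1 - u[t])

    # Drive the reservoir; each hybrid feature vector = [reservoir state, model guess].
    r, states = np.zeros(n_res), []
    for t in range(2999):
        inp = np.array([u[t], knowledge_model(u[t])])
        r = (1 - leak) * r + leak * np.tanh(W @ r + Win @ inp)
        states.append(np.concatenate([r, [knowledge_model(u[t])]]))
    S = np.array(states[500:])                     # drop the initial transient
    Y = u[501:3000]                                # one-step-ahead targets
    Wout = np.linalg.solve(S.T @ S + 1e-6 * np.eye(S.shape[1]), S.T @ Y)  # ridge readout
    print(f"in-sample one-step RMSE: {np.sqrt(np.mean((S @ Wout - Y) ** 2)):.4f}")
    ```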

  12. Community-based intermittent mass testing and treatment for malaria in an area of high transmission intensity, western Kenya: study design and methodology for a cluster randomized controlled trial.

    PubMed

    Samuels, Aaron M; Awino, Nobert; Odongo, Wycliffe; Abong'o, Benard; Gimnig, John; Otieno, Kephas; Shi, Ya Ping; Were, Vincent; Allen, Denise Roth; Were, Florence; Sang, Tony; Obor, David; Williamson, John; Hamel, Mary J; Patrick Kachur, S; Slutsker, Laurence; Lindblade, Kim A; Kariuki, Simon; Desai, Meghna

    2017-06-07

    Most human Plasmodium infections in western Kenya are asymptomatic and are believed to contribute importantly to malaria transmission. Elimination of asymptomatic infections requires active treatment approaches, such as mass testing and treatment (MTaT) or mass drug administration (MDA), as infected persons do not seek care for their infection. Evaluations of community-based approaches that are designed to reduce malaria transmission require careful attention to study design to ensure that important effects can be measured accurately. This manuscript describes the study design and methodology of a cluster-randomized controlled trial to evaluate a MTaT approach for malaria transmission reduction in an area of high malaria transmission. Ten health facilities in western Kenya were purposively selected for inclusion. The communities within 3 km of each health facility were divided into three clusters of approximately equal population size. Two clusters around each health facility were randomly assigned to the control arm, and one to the intervention arm. Three times per year for 2 years, after the long and short rains, and again before the long rains, teams of community health volunteers visited every household within the intervention arm, tested all consenting individuals with malaria rapid diagnostic tests, and treated all positive individuals with an effective anti-malarial. The effect of mass testing and treatment on malaria transmission was measured through population-based longitudinal cohorts, outpatient visits for clinical malaria, periodic population-based cross-sectional surveys, and entomological indices.

  13. Static and Dynamic Model Update of an Inflatable/Rigidizable Torus Structure

    NASA Technical Reports Server (NTRS)

    Horta, Lucas G.; Reaves, Mercedes C.

    2006-01-01

    The present work addresses the development of an experimental and computational procedure for validating finite element models. A torus structure, part of an inflatable/rigidizable Hexapod, is used to demonstrate the approach. Because of fabrication, materials, and geometric uncertainties, a statistical approach combined with optimization is used to modify key model parameters. Static test results are used to update stiffness parameters and dynamic test results are used to update the mass distribution. Updated parameters are computed using gradient and non-gradient based optimization algorithms. Results show significant improvements in model predictions after parameters are updated. Lessons learned in the areas of test procedures, modeling approaches, and uncertainties quantification are presented.
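
    A minimal illustration of the update logic on a one-degree-of-freedom surrogate (not the torus model): a static test constrains stiffness, a dynamic test constrains mass, and a gradient-based least-squares solver updates both so the model matches the measurements. The test data and starting values are hypothetical.

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    F_static, defl_meas = 500.0, 0.0021      # N, m   (hypothetical static test)
    freq_meas = 78.0                         # Hz     (hypothetical dynamic test)

    def residuals(theta):
        k, m = theta
        defl = F_static / k                  # static response x = F/k  -> constrains k
        freq = np.sqrt(k / m) / (2 * np.pi)  # natural frequency        -> constrains m
        return [(defl - defl_meas) / defl_meas,   # normalized residuals
                (freq - freq_meas) / freq_meas]

    sol = least_squares(residuals, x0=[2.0e5, 1.5], bounds=([1e4, 0.1], [1e7, 50]))
    k_upd, m_upd = sol.x
    print(f"updated k = {k_upd:.3e} N/m, m = {m_upd:.3f} kg")  # ~2.38e5 N/m, ~0.99 kg
    ```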

  14. Testing Causal Impacts of a School-Based SEL Intervention Using Instrumental Variable Techniques

    ERIC Educational Resources Information Center

    Torrente, Catalina; Nathanson, Lori; Rivers, Susan; Brackett, Marc

    2015-01-01

    Children's social-emotional skills, such as conflict resolution and emotion regulation, have been linked to a number of highly regarded academic and social outcomes. The current study presents preliminary results from a causal test of the theory of change of RULER, a universal school-based approach to social and emotional learning (SEL).…
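
    As a hedged sketch of the instrumental-variable logic (the study's actual models are not reproduced in this record): with random assignment as the instrument, two-stage least squares first regresses the endogenous exposure on the instrument, then regresses the outcome on the fitted exposure. The simulation below shows the pattern with a confounder that biases naive OLS but not the IV estimate; all names and effect sizes are hypothetical.

    ```python
    import numpy as np

    def two_stage_least_squares(y, x, z):
        """Manual 2SLS: z must affect y only through x (exclusion restriction)."""
        Z = np.column_stack([np.ones_like(z), z])
        x_hat = Z @ np.linalg.lstsq(Z, x, rcond=None)[0]     # stage 1: fit x on z
        Xh = np.column_stack([np.ones_like(x_hat), x_hat])
        return np.linalg.lstsq(Xh, y, rcond=None)[0][1]      # stage 2: slope on x-hat

    rng = np.random.default_rng(7)
    n = 2000
    z = rng.binomial(1, 0.5, n).astype(float)    # randomized program offer
    c = rng.normal(0, 1, n)                      # unobserved confounder
    x = 0.8 * z + 0.6 * c + rng.normal(0, 1, n)  # uptake depends on offer and confounder
    y = 0.5 * x + 0.7 * c + rng.normal(0, 1, n)  # true causal effect of x is 0.5
    print(f"IV estimate: {two_stage_least_squares(y, x, z):.3f}  (true 0.5)")
    ```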

  15. Functional Assessment-Based Interventions for Students with or At-Risk for High-Incidence Disabilities: Field Testing Single-Case Synthesis Methods

    ERIC Educational Resources Information Center

    Common, Eric Alan; Lane, Kathleen Lynne; Pustejovsky, James E.; Johnson, Austin H.; Johl, Liane Elizabeth

    2017-01-01

    This systematic review investigated one systematic approach to designing, implementing, and evaluating functional assessment-based interventions (FABI) for use in supporting school-age students with or at-risk for high-incidence disabilities. We field tested several recently developed methods for single-case design syntheses. First, we appraised…

  16. Meeting the Needs of All Students: A Universal Design Approach to Computer-Based Testing

    ERIC Educational Resources Information Center

    Russell, Michael; Hoffmann, Thomas; Higgins, Jennifer

    2009-01-01

    Michael Russell, Thomas Hoffmann, and Jennifer Higgins describe how the principles of universal design were applied to the development of an innovative computer-based test delivery system, NimbleTools, to meet the accessibility and accommodation needs of students with a wide range of disabilities and special needs. Noting the movement to…

  17. Developing Environmentally Responsible Behaviours Through the Implementation of Argumentation- and Problem-Based Learning Models

    NASA Astrophysics Data System (ADS)

    Fettahlıoğlu, Pınar; Aydoğdu, Mustafa

    2018-04-01

    The purpose of this research is to investigate the effect of using argumentation and problem-based learning approaches on the development of environmentally responsible behaviours among pre-service science teachers. Experimental activities were implemented for 14 weeks for 52 class hours in an environmental education class within a science teaching department. A mixed method was used as a research design; particularly, a special type of Concurrent Nested Strategy was applied. The quantitative portion was based on the one-group pre-test and post-test models, and the qualitative portion was based on the holistic multiple-case study method. The quantitative portion of the research was conducted with 34 third-year pre-service science teachers studying at a state university. The qualitative portion of the study was conducted with six pre-service science teachers selected among the 34 pre-service science teachers based on the pre-test results obtained from an environmentally responsible behaviour scale. t tests for dependent groups were used to analyse quantitative data. Both descriptive and content analyses of the qualitative data were performed. The results of the study showed that the use of the argumentation and problem-based learning approaches significantly contributed to the development of environmentally responsible behaviours among pre-service science teachers.

  18. Bibliography of In-House and Contract Reports. Supplement 15.

    DTIC Science & Technology

    1988-04-01

    that approaches the aesthetic quality obtainable from experienced manual placement. [The remainder of this record is garbled OCR; the recoverable fragments cite reports ETL-0428 (Knowledge-Based Vision) and ETL-1307 (Service Tests, Production Model Tests, Autofocusing Rectifier Development, Test, Preparation, Delivery, and Installation, 1982).]

  19. Comparative effectiveness of congregation- versus clinic-based approach to prevention of mother-to-child HIV transmission: study protocol for a cluster randomized controlled trial

    PubMed Central

    2013-01-01

    Background A total of 22 priority countries have been identified by the WHO that account for 90% of pregnant women living with HIV. Nigeria is one of only 4 countries among the 22 with an HIV testing rate for pregnant women of less than 20%. Currently, most pregnant women must access a healthcare facility (HF) to be screened and receive available prevention of mother-to-child HIV transmission (PMTCT) interventions. Finding new approaches to increase HIV testing among pregnant women is necessary to realize the WHO/President's Emergency Plan for AIDS Relief (PEPFAR) goal of eliminating new pediatric infections by 2015. Methods This cluster randomized trial tests the comparative effectiveness of a congregation-based Healthy Beginning Initiative (HBI) versus a clinic-based approach on the rates of HIV testing and PMTCT completion among a cohort of church-attending pregnant women. Recruitment occurs at the level of the churches and participants (in that order), while randomization occurs only at the church level. The trial is unblinded, and the churches are informed of their randomization group. Eligible participants, pregnant women attending study churches, are recruited during prayer sessions. HBI is delivered by trained community health nurses and church-based health advisors and provides free, integrated on-site laboratory tests (HIV plus hemoglobin, malaria, hepatitis B, sickle cell gene, syphilis) during a church-organized ‘baby shower.’ The baby shower includes refreshments, gifts exchange, and an educational game show testing participants’ knowledge of healthy pregnancy habits in addition to HIV acquisition modes, and effective PMTCT interventions. Baby receptions provide a contact point for follow-up after delivery. This approach was designed to reduce barriers to screening including knowledge, access, cost and stigma. The primary aim is to evaluate the effect of HBI on the HIV testing rate among pregnant women. The secondary aims are to evaluate the effect of HBI on the rate of HIV testing among male partners of pregnant women and the rate of PMTCT completion among HIV-infected pregnant women. Discussion Results of this study will provide further understanding of the most effective strategies for increasing HIV testing among pregnant women in hard-to-reach communities. Trial Registration Clinicaltrials.gov, NCT01795261 PMID:23758933

  20. Economic evaluation of HCV testing approaches in low and middle income countries.

    PubMed

    Morgan, Jake R; Servidone, Maria; Easterbrook, Philippa; Linas, Benjamin P

    2017-11-01

    Hepatitis C virus (HCV) infection represents a major public health burden with diverse epidemics worldwide, but at present, only a minority of infected persons have been tested and are aware of their diagnosis. The advent of highly effective direct acting antiviral (DAA) therapy, which is becoming available at increasingly lower costs in low and middle income countries (LMICs), represents a major opportunity to expand access to testing and treatment. However, there is uncertainty as to the optimal testing approaches and who to prioritize for testing. We undertook a narrative review of the cost-effectiveness literature on different testing approaches for chronic hepatitis C infection to inform decision-making and formulation of recommendations in the 2017 World Health Organization (WHO) viral hepatitis testing guidelines. We undertook a focused search and narrative review of the literature for cost effectiveness studies of testing approaches in three main groups: 1) focused testing of specific high-risk groups (defined as those who are part of a population with higher seroprevalence or who have a history of exposure or high-risk behaviours); 2) "birth cohort" testing among easily identified age groups (i.e. specific birth cohorts) known to have a high prevalence of HCV infection; and 3) routine testing in the general population. Articles included were those published in PubMed, written in English and published after 2000. We identified 26 eligible studies. Twenty-four of them were from Europe (n = 14) or the United States (n = 10). There was only one study from a LMIC (Egypt) and this evaluated general population testing. Thirteen studies evaluated focused testing among specific groups at high risk for HCV infection, including nine in persons who inject drugs (PWID); five among people in prison, and one among HIV-infected men who have sex with men (MSM). Eight studies evaluated birth cohort testing, and five evaluated testing in the general population. Most studies were based on a one-time testing intervention, but in one study testing was undertaken every 5 years and in another among HIV-infected MSM there was more frequent testing. Comparators were generally either: 1) no testing, 2) the status quo, or 3) multiple different strategies. Overall, we found broad agreement that focused testing of high risk groups such as persons who inject drugs and men who have sex with men was cost-effective, as was birth cohort testing. Key drivers of cost-effectiveness were the prevalence of HCV infection in these groups, efficacy and cost of treatment, stage of disease and linkage to care. The evidence for routine population testing was mixed, and the cost-effectiveness depends largely on the prevalence of HCV. The evidence base for different HCV testing approaches in LMICs is limited, minimizing the contribution of cost-effectiveness data alone to decision-making and recommendations on testing approaches in the 2017 WHO viral hepatitis testing guidelines. Overall, the guidelines recommended focused testing in high-risk groups, particularly PWID, prisoners, and men who have sex with men; with consideration of two other approaches: birth cohort testing in those countries with epidemiological evidence of a significant birth cohort effect; and routine access to testing across the general population in those countries with a high HCV seroprevalence above 2%-5% in the general population. Further implementation research on different testing approaches is needed in order to help guide national policy planning.

  1. Biowaiver Monographs for Immediate Release Solid Oral Dosage Forms: Levetiracetam.

    PubMed

    Petruševska, Marija; Berglez, Sandra; Krisch, Igor; Legen, Igor; Megušar, Klara; Peternel, Luka; Abrahamsson, Bertil; Cristofoletti, Rodrigo; Groot, D W; Kopp, Sabine; Langguth, Peter; Mehta, Mehul; Polli, James E; Shah, Vinod P; Dressman, Jennifer

    2015-09-01

    Literature and experimental data relevant for the decision to allow a waiver of in vivo bioequivalence (BE) testing for the approval of immediate release (IR) solid oral dosage forms containing levetiracetam are reviewed. Data on solubility and permeability suggest that levetiracetam belongs to class I of the biopharmaceutical classification system (BCS). Levetiracetam's therapeutic use, its wide therapeutic index, and its favorable pharmacokinetic properties make levetiracetam a valid candidate for the BCS-based biowaiver approach. Further, no BE studies with levetiracetam IR formulations in which the test formulation failed to show BE with the comparator have been reported in the open literature. On the basis of the overall evidence, it appears unlikely that a BCS-based biowaiver approach for levetiracetam IR solid oral dosage forms formulated with established excipients would expose patients to undue risks. Thus, the BCS-based biowaiver approach procedure is recommended for IR solid oral dosage form containing levetiracetam, provided the excipients in the formulation are also present in products that have been approved in countries belonging to or associated with the International Committee on Harmonization and are used in their usual quantities, and provided the dissolution profiles of the test and reference product comply with the current requirements for BCS-based biowaivers. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association.

  2. Built-In Data-Flow Integration Testing in Large-Scale Component-Based Systems

    NASA Astrophysics Data System (ADS)

    Piel, Éric; Gonzalez-Sanchez, Alberto; Gross, Hans-Gerhard

    Modern large-scale component-based applications and service ecosystems are built following a number of different component models and architectural styles, such as the data-flow architectural style. In this style, each building block receives data from a previous one in the flow and sends output data to other components. This organisation expresses information flows adequately, and also favours decoupling between the components, leading to easier maintenance and quicker evolution of the system. Integration testing is a major means to ensure the quality of large systems. Their size and complexity, together with the fact that they are developed and maintained by several stakeholders, make Built-In Testing (BIT) an attractive approach to manage their integration testing. However, so far no technique has been proposed that combines BIT and data-flow integration testing. We have introduced the notion of a virtual component in order to realize such a combination. It makes it possible to define the behaviour of several components assembled to process a flow of data, using BIT. Test cases are defined in a way that makes them simple to write and flexible to adapt. We present two implementations of our proposed virtual component integration testing technique, and we extend our previous proposal to detect and handle errors in the definition by the user. The evaluation of the virtual component testing approach suggests that more issues can be detected in systems with data-flows than through other integration testing approaches.
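
    The sketch below illustrates the virtual-component idea only; it is not the authors' framework API. A "virtual component" groups several real data-flow components so that one built-in test can exercise the assembled sub-flow end to end instead of testing each part in isolation. All names and the toy processing steps are hypothetical.

    ```python
    from dataclasses import dataclass
    from typing import Callable, List

    Component = Callable[[float], float]   # a data-flow stage: input sample -> output sample

    @dataclass
    class VirtualComponent:
        parts: List[Component]             # the assembled sub-flow, in order

        def process(self, sample: float) -> float:
            for part in self.parts:        # route the sample through the whole flow
                sample = part(sample)
            return sample

        def built_in_test(self) -> bool:
            # The test case is defined on the *virtual* component: feed a known
            # input and assert the behaviour of the assembly, not of each part.
            return abs(self.process(2.0) - 9.0) < 1e-9

    calibrate: Component = lambda x: x * 2.0   # e.g., sensor scaling stage
    enrich: Component = lambda x: x + 5.0      # e.g., reference-offset stage

    vc = VirtualComponent(parts=[calibrate, enrich])
    print("integration BIT passed:", vc.built_in_test())   # 2 * 2 + 5 = 9
    ```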

  3. Examining Testlet Effects in the TestDaF Listening Section: A Testlet Response Theory Modeling Approach

    ERIC Educational Resources Information Center

    Eckes, Thomas

    2014-01-01

    Testlets are subsets of test items that are based on the same stimulus and are administered together. Tests that contain testlets are in widespread use in language testing, but they also share a fundamental problem: Items within a testlet are locally dependent with possibly adverse consequences for test score interpretation and use. Building on…

  4. Using Testlet Response Theory to Examine Local Dependence in C-Tests

    ERIC Educational Resources Information Center

    Eckes, Thomas; Baghaei, Purya

    2015-01-01

    C-tests are gap-filling tests widely used to assess general language proficiency for purposes of placement, screening, or provision of feedback to language learners. C-tests consist of several short texts in which parts of words are missing. We addressed the issue of local dependence in C-tests using an explicit modeling approach based on testlet…

  5. Problem-based learning using patient-simulated videos showing daily life for a comprehensive clinical approach

    PubMed Central

    Ohira, Yoshiyuki; Uehara, Takanori; Noda, Kazutaka; Suzuki, Shingo; Shikino, Kiyoshi; Kajiwara, Hideki; Kondo, Takeshi; Hirota, Yusuke; Ikusaka, Masatomi

    2017-01-01

    Objectives We examined whether problem-based learning tutorials using patient-simulated videos showing daily life are more practical for clinical learning, compared with traditional paper-based problem-based learning, for the consideration rate of psychosocial issues and the recall rate for experienced learning. Methods Twenty-two groups with 120 fifth-year students were each assigned paper-based problem-based learning and video-based problem-based learning using patient-simulated videos. We compared target achievement rates in questionnaires using the Wilcoxon signed-rank test and discussion contents diversity using the Mann-Whitney U test. A follow-up survey used a chi-square test to measure students’ recall of cases in three categories: video, paper, and non-experienced. Results Video-based problem-based learning displayed significantly higher achievement rates for imagining authentic patients (p=0.001), incorporating a comprehensive approach including psychosocial aspects (p<0.001), and satisfaction with sessions (p=0.001). No significant differences existed in the discussion contents diversity regarding the International Classification of Primary Care Second Edition codes and chapter types or in the rate of psychological codes. In a follow-up survey comparing video and paper groups to non-experienced groups, the rates were higher for video (χ2=24.319, p<0.001) and paper (χ2=11.134, p=0.001). Although the video rate tended to be higher than the paper rate, no significant difference was found between the two. Conclusions Patient-simulated videos showing daily life facilitate imagining true patients and support a comprehensive approach that fosters better memory. The clinical patient-simulated video method is more practical and clinical problem-based tutorials can be implemented if we create patient-simulated videos for each symptom as teaching materials.  PMID:28245193

  6. Problem-based learning using patient-simulated videos showing daily life for a comprehensive clinical approach.

    PubMed

    Ikegami, Akiko; Ohira, Yoshiyuki; Uehara, Takanori; Noda, Kazutaka; Suzuki, Shingo; Shikino, Kiyoshi; Kajiwara, Hideki; Kondo, Takeshi; Hirota, Yusuke; Ikusaka, Masatomi

    2017-02-27

    We examined whether problem-based learning tutorials using patient-simulated videos showing daily life are more practical for clinical learning, compared with traditional paper-based problem-based learning, for the consideration rate of psychosocial issues and the recall rate for experienced learning. Twenty-two groups with 120 fifth-year students were each assigned paper-based problem-based learning and video-based problem-based learning using patient-simulated videos. We compared target achievement rates in questionnaires using the Wilcoxon signed-rank test and discussion contents diversity using the Mann-Whitney U test. A follow-up survey used a chi-square test to measure students' recall of cases in three categories: video, paper, and non-experienced. Video-based problem-based learning displayed significantly higher achievement rates for imagining authentic patients (p=0.001), incorporating a comprehensive approach including psychosocial aspects (p<0.001), and satisfaction with sessions (p=0.001). No significant differences existed in the discussion contents diversity regarding the International Classification of Primary Care Second Edition codes and chapter types or in the rate of psychological codes. In a follow-up survey comparing video and paper groups to non-experienced groups, the rates were higher for video (χ2=24.319, p<0.001) and paper (χ2=11.134, p=0.001). Although the video rate tended to be higher than the paper rate, no significant difference was found between the two. Patient-simulated videos showing daily life facilitate imagining true patients and support a comprehensive approach that fosters better memory. The clinical patient-simulated video method is more practical and clinical problem-based tutorials can be implemented if we create patient-simulated videos for each symptom as teaching materials.

  7. Changes in acid-base and ion balance during exercise in normoxia and normobaric hypoxia.

    PubMed

    Lühker, Olaf; Berger, Marc Moritz; Pohlmann, Alexander; Hotz, Lorenz; Gruhlke, Tilmann; Hochreiter, Marcel

    2017-11-01

    Both exercise and hypoxia cause complex changes in acid-base homeostasis. The aim of the present study was to investigate whether during intense physical exercise in normoxia and hypoxia, the modified physicochemical approach offers a better understanding of the changes in acid-base homeostasis than the traditional Henderson-Hasselbalch approach. In this prospective, randomized, crossover trial, 19 healthy males completed an exercise test until voluntary fatigue on a bicycle ergometer on two different study days, once during normoxia and once during normobaric hypoxia (12% oxygen, equivalent to an altitude of 4500 m). Arterial blood gases were sampled during and after the exercise test and analysed according to the modified physicochemical and Henderson-Hasselbalch approach, respectively. Peak power output decreased from 287 ± 9 Watts in normoxia to 213 ± 6 Watts in hypoxia (-26%, P < 0.001). Exercise decreased arterial pH to 7.21 ± 0.01 and 7.27 ± 0.02 (P < 0.001) during normoxia and hypoxia, respectively, and increased plasma lactate to 16.8 ± 0.8 and 17.5 ± 0.9 mmol/l (P < 0.001). While the Henderson-Hasselbalch approach identified lactate as main factor responsible for the non-respiratory acidosis, the modified physicochemical approach additionally identified strong ions (i.e. plasma electrolytes, organic acid ions) and non-volatile weak acids (i.e. albumin, phosphate ion species) as important contributors. The Henderson-Hasselbalch approach might serve as basis for screening acid-base disturbances, but the modified physicochemical approach offers more detailed insights into the complex changes in acid-base status during exercise in normoxia and hypoxia, respectively.
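
    The traditional approach the study compares against rests on the Henderson-Hasselbalch equation, pH = pKa + log10([HCO3-] / (0.03 x pCO2)). A small worked computation with illustrative rest and exhaustion values (not the study's measurements):

    ```python
    import math

    def henderson_hasselbalch_ph(hco3_mmol_l, pco2_mmhg, pka=6.1, alpha=0.03):
        """pH from bicarbonate and pCO2; alpha is the CO2 solubility
        coefficient in mmol/L per mmHg."""
        return pka + math.log10(hco3_mmol_l / (alpha * pco2_mmhg))

    # Illustrative arterial values at rest vs. at exhaustion:
    print(f"rest:       pH = {henderson_hasselbalch_ph(24.0, 40.0):.2f}")  # ~7.40
    print(f"exhaustion: pH = {henderson_hasselbalch_ph(13.0, 32.0):.2f}")  # ~7.23
    ```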

  8. 3-D CFD Simulation and Validation of Oxygen-Rich Hydrocarbon Combustion in a Gas-Centered Swirl Coaxial Injector using a Flamelet-Based Approach

    NASA Technical Reports Server (NTRS)

    Richardson, Brian; Kenny, Jeremy

    2015-01-01

    Injector design is a critical part of the development of a rocket Thrust Chamber Assembly (TCA). Proper detailed injector design can maximize propulsion efficiency while minimizing the potential for failures in the combustion chamber. Traditional design and analysis methods for hydrocarbon-fuel injector elements are based heavily on empirical data and models developed from heritage hardware tests. Using this limited set of data produces challenges when trying to design a new propulsion system where the operating conditions may greatly differ from heritage applications. Time-accurate, Three-Dimensional (3-D) Computational Fluid Dynamics (CFD) modeling of combusting flows inside of injectors has long been a goal of the fluid analysis group at Marshall Space Flight Center (MSFC) and the larger CFD modeling community. CFD simulation can provide insight into the design and function of an injector that cannot be obtained easily through testing or empirical comparisons to existing hardware. However, the traditional finite-rate chemistry modeling approach utilized to simulate combusting flows for complex fuels, such as Rocket Propellant-2 (RP-2), is prohibitively expensive and time consuming even with a large amount of computational resources. MSFC has been working, in partnership with Streamline Numerics, Inc., to develop a computationally efficient, flamelet-based approach for modeling complex combusting flow applications. In this work, a flamelet modeling approach is used to simulate time-accurate, 3-D, combusting flow inside a single Gas Centered Swirl Coaxial (GCSC) injector using the flow solver, Loci-STREAM. CFD simulations were performed for several different injector geometries. Results of the CFD analysis helped guide the design of the injector from an initial concept to a tested prototype. The results of the CFD analysis are compared to data gathered from several hot-fire, single element injector tests performed in the Air Force Research Lab EC-1 test facility located at Edwards Air Force Base.

  9. Metabolomics Approach for Toxicity Screening of Volatile Substances

    EPA Science Inventory

    In 2007 the National Research Council envisioned the need for inexpensive, high throughput, cell based toxicity testing methods relevant to human health. High Throughput Screening (HTS) in vitro screening approaches have addressed these problems by using robotics. However, the ch...

  10. The consistency approach for quality control of vaccines - a strategy to improve quality control and implement 3Rs.

    PubMed

    De Mattia, Fabrizio; Chapsal, Jean-Michel; Descamps, Johan; Halder, Marlies; Jarrett, Nicholas; Kross, Imke; Mortiaux, Frederic; Ponsar, Cecile; Redhead, Keith; McKelvie, Jo; Hendriksen, Coenraad

    2011-01-01

    Current batch release testing of established vaccines emphasizes quality control of the final product and is often characterized by extensive use of animals. This report summarises the discussions of a joint ECVAM/EPAA workshop on the applicability of the consistency approach for routine release of human and veterinary vaccines and its potential to reduce animal use. The consistency approach is based upon thorough characterization of the vaccine during development and the principle that the quality of subsequent batches is the consequence of the strict application of a quality system and of a consistent production of batches. The concept of consistency of production is state-of-the-art for new-generation vaccines, where batch release is mainly based on non-animal methods. There is now the opportunity to introduce the approach into established vaccine production, where it has the potential to replace in vivo tests with non-animal tests designed to demonstrate batch quality while maintaining the highest quality standards. The report indicates how this approach may be further developed for application to established human and veterinary vaccines and emphasizes the continuing need for co-ordination and harmonization. It also gives recommendations for work to be undertaken in order to encourage acceptance and implementation of the consistency approach. Copyright © 2011. Published by Elsevier Ltd. All rights reserved.

  11. A novel toxicogenomics-based approach to categorize (non-)genotoxic carcinogens.

    PubMed

    Schaap, Mirjam M; Wackers, Paul F K; Zwart, Edwin P; Huijskens, Ilse; Jonker, Martijs J; Hendriks, Giel; Breit, Timo M; van Steeg, Harry; van de Water, Bob; Luijten, Mirjam

    2015-12-01

    Alternative methods to detect non-genotoxic carcinogens are urgently needed, as this class of carcinogens goes undetected in the current testing strategy for carcinogenicity under REACH. A complicating factor is that non-genotoxic carcinogens act through several distinctive modes of action, which makes prediction of their carcinogenic property difficult. We have recently demonstrated that gene expression profiling in primary mouse hepatocytes is a useful approach to categorize non-genotoxic carcinogens according to their modes of action. In the current study, we improved the methods used for analysis and added mouse embryonic stem cells as a second in vitro test system, because of their features complementary to hepatocytes. Our approach involved an unsupervised analysis based on the 30 most significantly up- and down-regulated genes per chemical. Mouse embryonic stem cells and primary mouse hepatocytes were exposed to a selected set of chemicals and subsequently subjected to gene expression profiling. We focused on non-genotoxic carcinogens, but also included genotoxic carcinogens and non-carcinogens to test the robustness of this approach. Application of the optimized comparison approach resulted in improved categorization of non-genotoxic carcinogens. Mouse embryonic stem cells were a useful addition, especially for genotoxic substances, but also for detection of non-genotoxic carcinogens that went undetected by primary hepatocytes. The approach presented here is an important step forward to categorize chemicals, especially those that are carcinogenic.
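    A minimal sketch of the kind of unsupervised comparison described here, under invented data: each chemical is reduced to the 30 most significantly up- and down-regulated genes, and chemicals are then clustered by signature overlap.

      import numpy as np
      import pandas as pd
      from scipy.cluster.hierarchy import linkage, fcluster

      rng = np.random.default_rng(0)
      genes = [f"g{i}" for i in range(500)]
      # Hypothetical per-chemical test statistics (rows: genes, columns: chemicals)
      tstats = pd.DataFrame(rng.normal(size=(500, 6)), index=genes,
                            columns=[f"chem{j}" for j in range(6)])

      def signature(col, n=30):
          ranked = col.sort_values()
          return set(ranked.index[:n]) | set(ranked.index[-n:])  # down- and up-regulated

      sigs = {c: signature(tstats[c]) for c in tstats}
      chems = list(sigs)

      # Jaccard distances between signatures (condensed form), then clustering
      dist = [1 - len(sigs[a] & sigs[b]) / len(sigs[a] | sigs[b])
              for i, a in enumerate(chems) for b in chems[i + 1:]]
      clusters = fcluster(linkage(dist, method="average"), t=2, criterion="maxclust")
      print(dict(zip(chems, clusters)))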

  12. Two Different Approaches to Automated Mark Up of Emotions in Text

    NASA Astrophysics Data System (ADS)

    Francisco, Virginia; Hervás, Raquel; Gervás, Pablo

    This paper presents two different approaches to automated marking up of texts with emotional labels. For the first approach a corpus of example texts previously annotated by human evaluators is mined for an initial assignment of emotional features to words. This results in a List of Emotional Words (LEW) which becomes a useful resource for later automated mark up. The mark up algorithm in this first approach mirrors closely the steps taken during feature extraction, employing for the actual assignment of emotional features a combination of the LEW resource and WordNet for knowledge-based expansion of words not occurring in LEW. The algorithm for automated mark up is tested against new text samples to assess its coverage. The second approach marks up texts during their generation. We have a knowledge base which contains the necessary information for marking up the text. This information is related to actions and characters. The algorithm in this case employs the information in the knowledge base and decides the correct emotion for every sentence. The algorithm for automated mark up is tested against four different texts. The results of the two approaches are compared and discussed with respect to three main issues: relative adequacy of each one of the representations used, correctness and coverage of the proposed algorithms, and additional techniques and solutions that may be employed to improve the results.
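    A toy sketch of the first approach's lookup-plus-expansion step, assuming a tiny hand-built LEW and the NLTK WordNet corpus; the real LEW and corpus are far larger, and the exact expansion results depend on the corpus version.

      from nltk.corpus import wordnet as wn  # requires the 'wordnet' corpus download

      LEW = {"joy": "happiness", "happy": "happiness", "fear": "fear",
             "terror": "fear", "sad": "sadness", "weep": "sadness"}  # toy resource

      def emotion_of(word):
          if word in LEW:                      # direct LEW hit
              return LEW[word]
          for syn in wn.synsets(word):         # knowledge-based expansion via WordNet
              for lemma in syn.lemma_names():
                  if lemma in LEW:
                      return LEW[lemma]
          return None

      print(emotion_of("happy"))   # direct hit -> 'happiness'
      print(emotion_of("cry"))     # via the WordNet synonym 'weep' -> 'sadness'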

  13. New technologies and approaches in toxicity testing and risk assessment (ESOT)

    EPA Science Inventory

    The release of the National Research Council’s Report “Toxicity Testing in the 21st Century: A Vision and a Strategy” in 2007 initiated a broad-based movement in the toxicology community to re-think how toxicity testing and risk assessment are performed. Multiple efforts in the ...

  14. Kodak AMSD Cryogenic Test Plans

    NASA Technical Reports Server (NTRS)

    Matthews, Gary; Hammon, John; Barrett, David; Russell, Kevin (Technical Monitor)

    2002-01-01

    NGST will be an IR-based optical system that will operate at cryogenic temperatures. As part of the AMSD program, Kodak must demonstrate the ability of our system to perform at these very cold temperatures. Kodak will discuss the test approach that will be used for cryogenic testing at MSFC's XRCF.

  15. Using Alternative Approaches to Prioritize Testing for the Universe of Chemicals with Potential for Human Exposure (WC9)

    EPA Science Inventory

    One use of alternative methods is to target animal use at only those chemicals and tests that are absolutely necessary. We discuss prioritization of testing based on high-throughput screening assays (HTS), QSAR modeling, high-throughput toxicokinetics (HTTK), and exposure modelin...

  16. A critical issue in model-based inference for studying trait-based community assembly and a solution.

    PubMed

    Ter Braak, Cajo J F; Peres-Neto, Pedro; Dray, Stéphane

    2017-01-01

    Statistical testing of trait-environment association from data is a challenge as there is no common unit of observation: the trait is observed on species, the environment on sites and the mediating abundance on species-site combinations. A number of correlation-based methods, such as the community weighted trait means method (CWM), the fourth-corner correlation method and the multivariate method RLQ, have been proposed to estimate such trait-environment associations. In these methods, valid statistical testing proceeds by performing two separate resampling tests, one site-based and the other species-based, and by assessing significance by the largest of the two p-values (the p_max test). Recently, regression-based methods using generalized linear models (GLM) have been proposed as a promising alternative with statistical inference via site-based resampling. We investigated the performance of this new approach along with approaches that mimicked the p_max test using GLM instead of fourth-corner. By simulation using models with additional random variation in the species response to the environment, the site-based resampling tests using GLM are shown to have severely inflated type I error, of up to 90%, when the nominal level is set at 5%. In addition, predictive modelling of such data using site-based cross-validation very often identified trait-environment interactions that had no predictive value. The problem that we identify is not an "omitted variable bias" problem as it occurs even when the additional random variation is independent of the observed trait and environment data. Instead, it is a problem of ignoring a random effect. In the same simulations, the GLM-based p_max test controlled the type I error in all models proposed so far in this context, but still gave slightly inflated error in more complex models that included both missing (but important) traits and missing (but important) environmental variables. For screening the importance of single trait-environment combinations, the fourth-corner test is shown to give almost the same results as the GLM-based tests in far less computing time.
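    A compact sketch of the p_max idea with invented data: the same fourth-corner-style statistic (an abundance-weighted trait-environment covariance, a simplification of the published statistic) is tested once by permuting sites and once by permuting species, and the larger p-value decides.

      import numpy as np

      rng = np.random.default_rng(1)
      env = rng.normal(size=30)                       # one environmental variable per site
      trait = rng.normal(size=20)                     # one trait per species
      Y = rng.poisson(1.0, size=(30, 20))             # site-by-species abundances

      def fourth_corner(Y, env, trait):
          w = Y / Y.sum()                             # abundance weights
          return env @ w @ trait - (env @ w.sum(1)) * (w.sum(0) @ trait)

      obs = fourth_corner(Y, env, trait)

      def perm_p(permute, n_perm=999):
          hits = 1
          for _ in range(n_perm):
              hits += abs(permute()) >= abs(obs)
          return hits / (n_perm + 1)

      p_sites = perm_p(lambda: fourth_corner(Y, rng.permutation(env), trait))
      p_species = perm_p(lambda: fourth_corner(Y, env, rng.permutation(trait)))
      print("p_max =", max(p_sites, p_species))       # significant only if both are small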

  17. Blended learning approach improves teaching in a problem-based learning environment in orthopedics - a pilot study.

    PubMed

    Back, David A; Haberstroh, Nicole; Antolic, Andrea; Sostmann, Kai; Schmidmaier, Gerhard; Hoff, Eike

    2014-01-27

    While e-learning is enjoying increasing popularity as an adjunct in modern teaching, studies on this topic should shift from mere evaluation of students' satisfaction towards assessing its benefits for the enhancement of knowledge and skills. This pilot study aimed to detect the teaching effects of a blended learning program on students of orthopedics and traumatology in the context of a problem-based learning environment. The project NESTOR (network for students in traumatology and orthopedics) was offered to students in a problem-based learning course. Participants completed written tests before and directly after the course, followed by a final written test and an objective structured clinical examination (OSCE) as well as an evaluation questionnaire at the end of the semester. Results were compared within the group of NESTOR users and non-users and between these two groups. Participants (n = 53) rated their experiences very positively. An enhancement in knowledge was found directly after the course and at the final written test for both groups (p < 0.001). NESTOR users scored higher than non-users in the post-tests, while the OSCE revealed no differences between the groups. This pilot study showed a positive effect of the blended learning approach on knowledge enhancement and satisfaction of participating students. However, a future aim will be to further explore the potential of this approach and of internet-based technologies to also improve practical examination skills.

  18. Blended learning approach improves teaching in a problem-based learning environment in orthopedics - a pilot study

    PubMed Central

    2014-01-01

    Background While e-learning is enjoying increasing popularity as an adjunct in modern teaching, studies on this topic should shift from mere evaluation of students’ satisfaction towards assessing its benefits for the enhancement of knowledge and skills. This pilot study aimed to detect the teaching effects of a blended learning program on students of orthopedics and traumatology in the context of a problem-based learning environment. Methods The project NESTOR (network for students in traumatology and orthopedics) was offered to students in a problem-based learning course. Participants completed written tests before and directly after the course, followed by a final written test and an objective structured clinical examination (OSCE) as well as an evaluation questionnaire at the end of the semester. Results were compared within the group of NESTOR users and non-users and between these two groups. Results Participants (n = 53) rated their experiences very positively. An enhancement in knowledge was found directly after the course and at the final written test for both groups (p < 0.001). NESTOR users scored higher than non-users in the post-tests, while the OSCE revealed no differences between the groups. Conclusions This pilot study showed a positive effect of the blended learning approach on knowledge enhancement and satisfaction of participating students. However, a future aim will be to further explore the potential of this approach and of internet-based technologies to also improve practical examination skills. PMID:24690365

  19. Effects of Brain-Based Learning Approach on Students' Motivation and Attitudes Levels in Science Class

    ERIC Educational Resources Information Center

    Akyurek, Erkan; Afacan, Ozlem

    2013-01-01

    The purpose of the study was to examine the effect of the brain-based learning approach on attitudes and motivation levels in 8th grade students' science classes. The main reason for examining attitudes and motivation levels is that motivation reflects a short-term effect, whereas attitude shows the long-term effect. The pre/post-test control group research model…

  20. Model-Based Development of Automotive Electronic Climate Control Software

    NASA Astrophysics Data System (ADS)

    Kakade, Rupesh; Murugesan, Mohan; Perugu, Bhupal; Nair, Mohanan

    With the increasing complexity of software in today's products, writing and maintaining thousands of lines of code is a tedious task. Instead, an alternative methodology must be employed. Model-based development is one candidate that offers several benefits and allows engineers to focus on the domain of their expertise rather than on writing huge amounts of code. In this paper, we discuss the application of model-based development to the electronic climate control software of vehicles. The back-to-back testing approach is presented, which ensures a flawless and smooth transition from legacy designs to model-based development. The Simulink report generator used to create design documents from the models is presented, along with its use to run the simulation model and capture the results in the test report. Test automation using a model-based development tool that supports the use of a unique set of test cases for several testing levels, and a test procedure that is independent of the software and hardware platform, is also presented.
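    The back-to-back idea is language-independent; a minimal sketch (hypothetical functions, not the paper's toolchain) feeds identical test vectors to a legacy implementation and its model-generated replacement and diffs the outputs.

      import itertools

      def legacy_blower_speed(temp_error, sun_load):       # hand-written legacy code
          return max(0, min(10, round(0.8 * temp_error + 0.1 * sun_load)))

      def generated_blower_speed(temp_error, sun_load):    # code generated from the model
          return max(0, min(10, round(0.8 * temp_error + 0.1 * sun_load)))

      failures = [(te, sl)
                  for te, sl in itertools.product(range(-10, 11), range(0, 101, 10))
                  if legacy_blower_speed(te, sl) != generated_blower_speed(te, sl)]
      print("back-to-back: PASS" if not failures else f"FAIL at {failures[:5]}")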

  1. An Explanatory Item Response Theory Approach for a Computer-Based Case Simulation Test

    ERIC Educational Resources Information Center

    Kahraman, Nilüfer

    2014-01-01

    Problem: Practitioners working with multiple-choice tests have long utilized Item Response Theory (IRT) models to evaluate the performance of test items for quality assurance. The use of similar applications for performance tests, however, is often encumbered due to the challenges encountered in working with complicated data sets in which local…

  2. Analysis and Testing of a LIDAR-Based Approach to Terrain Relative Navigation for Precise Lunar Landing

    NASA Technical Reports Server (NTRS)

    Johnson, Andrew E.; Ivanov, Tonislav I.

    2011-01-01

    To increase safety and land near pre-deployed resources, future NASA missions to the moon will require precision landing. A LIDAR-based terrain relative navigation (TRN) approach can achieve precision landing under any lighting conditions. This paper presents results from processing flash lidar and laser altimeter field test data that show LIDAR TRN can obtain position estimates with errors of less than 90 m while automatically detecting and eliminating incorrect measurements using internal metrics on terrain relief and data correlation. Sensitivity studies show that the algorithm has no degradation in matching performance with initial position uncertainties of up to 1.6 km.

  3. Probabilistic approaches to accounting for data variability in the practical application of bioavailability in predicting aquatic risks from metals.

    PubMed

    Ciffroy, Philippe; Charlatchka, Rayna; Ferreira, Daniel; Marang, Laura

    2013-07-01

    The biotic ligand model (BLM) theoretically enables the derivation of environmental quality standards that are based on true bioavailable fractions of metals. Several physicochemical variables (especially pH, major cations, dissolved organic carbon, and dissolved metal concentrations) must, however, be assigned to run the BLM, but they are highly variable in time and space in natural systems. This article describes probabilistic approaches for integrating such variability during the derivation of risk indexes. To describe each variable using a probability density function (PDF), several methods were combined to 1) treat censored data (i.e., data below the limit of detection), 2) incorporate the uncertainty of the solid-to-liquid partitioning of metals, and 3) detect outliers. From a probabilistic perspective, 2 alternative approaches that are based on log-normal and Γ distributions were tested to estimate the probability of the predicted environmental concentration (PEC) exceeding the predicted non-effect concentration (PNEC), i.e., p(PEC/PNEC>1). The probabilistic approach was tested on 4 real-case studies based on Cu-related data collected from stations on the Loire and Moselle rivers. The approach described in this article is based on BLM tools that are freely available for end-users (i.e., the Bio-Met software) and on accessible statistical data treatments. This approach could be used by stakeholders who are involved in risk assessments of metals for improving site-specific studies. Copyright © 2013 SETAC.
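    The exceedance probability itself is straightforward to approximate by Monte Carlo once PDFs have been fitted; the sketch below assumes (hypothetically) log-normal PEC and PNEC distributions for dissolved Cu, with invented parameters.

      import numpy as np

      rng = np.random.default_rng(42)
      n = 100_000

      # Hypothetical log-normal parameters (log-scale mean and sigma, µg/l)
      pec = rng.lognormal(mean=0.5, sigma=0.6, size=n)    # predicted exposure
      pnec = rng.lognormal(mean=1.2, sigma=0.4, size=n)   # bioavailability-based PNEC

      p_exceed = np.mean(pec / pnec > 1.0)
      print(f"p(PEC/PNEC > 1) = {p_exceed:.3f}")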

  4. Hypothesis testing for differentially correlated features.

    PubMed

    Sheng, Elisa; Witten, Daniela; Zhou, Xiao-Hua

    2016-10-01

    In a multivariate setting, we consider the task of identifying features whose correlations with the other features differ across conditions. Such correlation shifts may occur independently of mean shifts, or differences in the means of the individual features across conditions. Previous approaches for detecting correlation shifts consider features simultaneously, by computing a correlation-based test statistic for each feature. However, since correlations involve two features, such approaches do not lend themselves to identifying which feature is the culprit. In this article, we instead consider a serial testing approach, by comparing columns of the sample correlation matrix across two conditions, and removing one feature at a time. Our method provides a novel perspective and favorable empirical results compared with competing approaches. © The Author 2016. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
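    In the spirit of the column-wise comparison described here (a simplified sketch, not the authors' exact statistic), one can Fisher-z transform each feature's correlations with all other features in the two conditions and summarize the column difference; data below are simulated.

      import numpy as np

      rng = np.random.default_rng(7)
      X1 = rng.normal(size=(100, 10))          # condition 1: 100 samples, 10 features
      X2 = rng.normal(size=(80, 10))           # condition 2
      X2[:, 0] += 0.8 * X2[:, 1]               # induce a correlation shift at feature 0

      R1, R2 = np.corrcoef(X1.T), np.corrcoef(X2.T)
      se = np.sqrt(1 / (len(X1) - 3) + 1 / (len(X2) - 3))

      def fisher_z(r):
          return np.arctanh(np.clip(r, -0.999, 0.999))

      for j in range(10):
          others = [k for k in range(10) if k != j]
          z = (fisher_z(R1[j, others]) - fisher_z(R2[j, others])) / se
          print(f"feature {j}: max |z| = {np.abs(z).max():.2f}")  # column summary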

  5. Validity evidence for the situational judgment test paradigm in emotional intelligence measurement.

    PubMed

    Libbrecht, Nele; Lievens, Filip

    2012-01-01

    To date, various measurement approaches have been proposed to assess emotional intelligence (EI). Recently, two new EI tests have been developed based on the situational judgment test (SJT) paradigm: the Situational Test of Emotional Understanding (STEU) and the Situational Test of Emotion Management (STEM). Initial attempts have been made to examine the construct-related validity of these new tests; we extend these findings by placing the tests in a broad nomological network. To this end, 850 undergraduate students completed a personality inventory, a cognitive ability test, a self-report EI test, a performance-based EI measure, the STEU, and the STEM. The SJT-based EI tests were not strongly correlated with personality and fluid cognitive ability. Regarding their relation with existing EI measures, the tests did not capture the same construct as self-report EI measures, but corresponded rather to performance-based EI measures. Overall, these results lend support for the SJT paradigm for measuring EI as an ability.

  6. Hybrid Residual Flexibility/Mass-Additive Method for Structural Dynamic Testing

    NASA Technical Reports Server (NTRS)

    Tinker, M. L.

    2003-01-01

    A large fixture was designed and constructed for modal vibration testing of International Space Station elements. This fixed-base test fixture, which weighs thousands of pounds and is anchored to a massive concrete floor, initially utilized spherical bearings and pendulum mechanisms to simulate Shuttle orbiter boundary constraints for launch of the hardware. Many difficulties were encountered during a checkout test of the common module prototype structure, mainly due to undesirable friction and excessive clearances in the test-article-to-fixture interface bearings. Measured mode shapes and frequencies were not representative of orbiter-constrained modes due to the friction and clearance effects in the bearings. As a result, a major redesign effort for the interface mechanisms was undertaken. The total cost of the fixture design, construction and checkout, and redesign was over $2 million. Because of the problems experienced with fixed-base testing, alternative free-suspension methods were studied, including the residual flexibility and mass-additive approaches. Free-suspension structural dynamics test methods utilize soft elastic bungee cords and overhead frame suspension systems that are less complex and much less expensive than fixed-base systems. The cost of free-suspension fixturing is on the order of tens of thousands of dollars as opposed to millions, for large fixed-base fixturing. In addition, free-suspension test configurations are portable, allowing modal tests to be done at sites without modal test facilities. For example, a mass-additive modal test of the ASTRO-1 Shuttle payload was done at the Kennedy Space Center launch site. In this Technical Memorandum, the mass-additive and residual flexibility test methods are described in detail. A discussion of a hybrid approach that combines the best characteristics of each method follows and is the focus of the study.

  7. Strainrange partitioning life predictions of the long time metal properties council creep-fatigue tests

    NASA Technical Reports Server (NTRS)

    Saltsman, J. F.; Halford, G. R.

    1979-01-01

    The method of strainrange partitioning is used to predict the cyclic lives of the Metal Properties Council's long time creep-fatigue interspersion tests of several steel alloys. Comparisons are made with predictions based upon the time- and cycle-fraction approach. The method of strainrange partitioning is shown to give consistently more accurate predictions of cyclic life than is given by the time- and cycle-fraction approach.

  8. An Empirical Comparison of DDF Detection Methods for Understanding the Causes of DIF in Multiple-Choice Items

    ERIC Educational Resources Information Center

    Suh, Youngsuk; Talley, Anna E.

    2015-01-01

    This study compared and illustrated four differential distractor functioning (DDF) detection methods for analyzing multiple-choice items. The log-linear approach, two item response theory-model-based approaches with likelihood ratio tests, and the odds ratio approach were compared to examine the congruence among the four DDF detection methods.…

  9. Aerostructural interaction in a collaborative MDO environment

    NASA Astrophysics Data System (ADS)

    Ciampa, Pier Davide; Nagel, Björn

    2014-10-01

    The work presents an approach for aircraft design and optimization, developed to account for fluid-structure interactions in MDO applications. The approach makes use of a collaborative distributed design environment, and focuses on the influence of multiple physics based aerostructural models, on the overall aircraft synthesis and optimization. The approach is tested for the design of large transportation aircraft.

  10. Using Machine-Learning and Visualisation to Facilitate Learner Interpretation of Source Material

    ERIC Educational Resources Information Center

    Wolff, Annika; Mulholland, Paul; Zdrahal, Zdenek

    2014-01-01

    This paper describes an approach for supporting inquiry learning from source materials, realised and tested through a tool-kit. The approach is optimised for tasks that require a student to make interpretations across sets of resources, where opinions and justifications may be hard to articulate. We adopt a dialogue-based approach to learning…

  11. A computer vision-based approach for structural displacement measurement

    NASA Astrophysics Data System (ADS)

    Ji, Yunfeng

    2010-04-01

    Along with the incessant advancement in optics, electronics and computer technologies during the last three decades, commercial digital video cameras have experienced a remarkable evolution, and can now be employed to measure complex motions of objects with sufficient accuracy, making them well suited to structural displacement measurement in civil engineering. This paper proposes a computer vision-based approach for dynamic measurement of structures. One digital camera is used to capture image sequences of planar targets mounted on vibrating structures. The mathematical relationship between the image plane and real space is established based on computer vision theory. Then, the structural dynamic displacement at the target locations can be quantified using point reconstruction rules. Compared with other traditional displacement measurement methods using sensors, such as accelerometers, linear variable differential transducers (LVDTs) and the global positioning system (GPS), the proposed approach offers the main advantages of great flexibility, a non-contact working mode and ease of increasing the number of measurement points. To validate the approach, four tests are performed: sinusoidal motion of a point, free vibration of a cantilever beam, a wind tunnel test of a cross-section bridge model, and a field test of bridge displacement measurement. Results show that the proposed approach can attain excellent accuracy compared with analytical results or measurements using conventional transducers, and proves to deliver an innovative and low-cost solution to structural displacement measurement.
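    For a planar target, the image-plane-to-real-space step reduces to a homography; a minimal sketch (hypothetical pixel coordinates, OpenCV assumed available) shows the point reconstruction used to turn tracked pixels into physical displacement.

      import numpy as np
      import cv2

      # Known physical corner coordinates of the planar target (mm)
      world = np.array([[0, 0], [100, 0], [100, 100], [0, 100]], dtype=np.float32)
      # Their detected pixel coordinates in a reference frame
      pixels_ref = np.array([[412, 300], [518, 302], [516, 408], [410, 405]],
                            dtype=np.float32)

      H, _ = cv2.findHomography(pixels_ref, world)

      def to_world(pt):                        # point reconstruction on the target plane
          q = H @ np.array([pt[0], pt[1], 1.0])
          return q[:2] / q[2]

      p0 = to_world((464, 354))                # target centre, reference frame
      p1 = to_world((466, 349))                # target centre, later frame
      print("displacement (mm):", p1 - p0)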

  12. Hierarchical screening for multiple mental disorders.

    PubMed

    Batterham, Philip J; Calear, Alison L; Sunderland, Matthew; Carragher, Natacha; Christensen, Helen; Mackinnon, Andrew J

    2013-10-01

    There is a need for brief, accurate screening when assessing multiple mental disorders. Two-stage hierarchical screening, consisting of brief pre-screening followed by a battery of disorder-specific scales for those who meet diagnostic criteria, may increase the efficiency of screening without sacrificing precision. This study tested whether more efficient screening could be gained using two-stage hierarchical screening than by administering multiple separate tests. Two Australian adult samples (N=1990) with high rates of psychopathology were recruited using Facebook advertising to examine four methods of hierarchical screening for four mental disorders: major depressive disorder, generalised anxiety disorder, panic disorder and social phobia. Using K6 scores to determine whether full screening was required did not increase screening efficiency. However, pre-screening based on two decision tree approaches or item gating led to considerable reductions in the mean number of items presented per disorder screened, with estimated item reductions of up to 54%. The sensitivity of these hierarchical methods approached 100% relative to the full screening battery. Further testing of the hierarchical screening approach based on clinical criteria and in other samples is warranted. The results demonstrate that a two-phase hierarchical approach to screening multiple mental disorders leads to considerable efficiency gains without reducing accuracy. Screening programs should take advantage of prescreeners based on gating items or decision trees to reduce the burden on respondents. © 2013 Elsevier B.V. All rights reserved.
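    The gating logic itself is simple; a toy sketch (hypothetical items and threshold, simulated responses) shows how a short pre-screener decides whether the full disorder-specific scale is administered at all.

      import random

      def gated_screen(prescreen_items, full_items, respond, gate_threshold):
          pre = [respond(q) for q in prescreen_items]
          if sum(pre) < gate_threshold:                 # gate closed: stop early
              return {"items_administered": len(pre), "flagged": False}
          full = [respond(q) for q in full_items]
          return {"items_administered": len(pre) + len(full),
                  "flagged": sum(full) >= len(full) // 2}

      random.seed(3)
      respond = lambda q: random.random() < 0.7         # simulated 0/1 endorsements
      print(gated_screen(["feeling worried?", "feeling down?"],
                         [f"specific item {i}" for i in range(10)],
                         respond, gate_threshold=1))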

  13. The comparative effectiveness of a team-based versus group-based physical activity intervention for cancer survivors.

    PubMed

    Carter, Cindy L; Onicescu, Georgiana; Cartmell, Kathleen B; Sterba, Katherine R; Tomsic, James; Alberg, Anthony J

    2012-08-01

    Physical activity benefits cancer survivors, but the comparative effectiveness of a team-based delivery approach remains unexplored. The hypothesis tested was that a team-based physical activity intervention delivery approach has added physical and psychological benefits compared to a group-based approach. A team-based sport accessible to survivors is dragon boating, which requires no previous experience and allows for diverse skill levels. In a non-randomized trial, cancer survivors chose between two similarly structured 8-week programs, a dragon boat paddling team (n = 68) or group-based walking program (n = 52). Three separate intervention rounds were carried out in 2007-2008. Pre-post testing measured physical and psychosocial outcomes. Compared to walkers, paddlers had significantly greater (all p < 0.01) team cohesion, program adherence/attendance, and increased upper-body strength. For quality-of-life outcomes, both interventions were associated with pre-post improvements, but with no clear-cut pattern of between-intervention differences. These hypothesis-generating findings suggest that a short-term, team-based physical activity program (dragon boat paddling) was associated with increased cohesion and adherence/attendance. Improvements in physical fitness and psychosocial benefits were comparable to a traditional, group-based walking program. Compared to a group-based intervention delivery format, the team-based intervention delivery format holds promise for promoting physical activity program adherence/attendance in cancer survivors.

  14. The Effect of ICT Assisted Project Based Learning Approach on Prospective ICT Integration Skills of Teacher Candidates

    ERIC Educational Resources Information Center

    Pilten, Pusat; Pilten, Gulhiz; Sahinkaya, Nihan

    2017-01-01

    The purpose of the present research is studying the effects of information and communication technologies (ICT) assisted project based learning practices on ICT integration skills of pre-service classroom teachers. The research adopted a mixed method. The quantitative dimension of the research was designed with pre-test-post-test control groups.…

  15. A Randomized Controlled Trial of Acceptance-Based Behavior Therapy and Cognitive Therapy for Test Anxiety: A Pilot Study

    ERIC Educational Resources Information Center

    Brown, Lily A.; Forman, Evan M.; Herbert, James D.; Hoffman, Kimberly L.; Yuen, Erica K.; Goetter, Elizabeth M.

    2011-01-01

    Many university students suffer from test anxiety that is severe enough to impair performance. Given mixed efficacy results of previous cognitive-behavior therapy (CBT) trials and a theoretically driven rationale, an acceptance-based behavior therapy (ABBT) approach was compared to traditional CBT (i.e., Beckian cognitive therapy; CT) for the…

  16. Proposed modifications of Environmental Protection Agency Method 1601 for detection of coliphages in drinking water, with same-day fluorescence-based detection and evaluation by the performance-based measurement system and alternative test protocol validation approaches.

    PubMed

    Salter, Robert S; Durbin, Gregory W; Conklin, Ernestine; Rosen, Jeff; Clancy, Jennifer

    2010-12-01

    Coliphages are microbial indicators specified in the Ground Water Rule that can be used to monitor for potential fecal contamination of drinking water. The Total Coliform Rule specifies coliform and Escherichia coli indicators for municipal water quality testing; thus, coliphage indicator use is less common and advances in detection methodology are less frequent. Coliphages are viral structures and, compared to bacterial indicators, are more resistant to disinfection and diffuse further distances from pollution sources. Therefore, coliphage presence may serve as a better predictor of groundwater quality. This study describes Fast Phage, a 16- to 24-h presence/absence modification of U.S. Environmental Protection Agency (EPA) Method 1601 for detection of coliphages in 100 ml water. The objective of the study is to demonstrate that the somatic and male-specific coliphage modifications provide results equivalent to those of Method 1601. Five laboratories compared the modifications, featuring same-day fluorescence-based prediction, to Method 1601 by using the performance-based measurement system (PBMS) criterion. This requires a minimum 50% positive response in 10 replicates of 100-ml water samples at coliphage contamination levels of 1.3 to 1.5 PFU/100 ml. The laboratories showed that Fast Phage meets PBMS criteria with 83.5 to 92.1% correlation of the same-day rapid fluorescence-based prediction with the next-day result. Somatic coliphage PBMS data are compared to manufacturer development data that followed the EPA alternative test protocol (ATP) validation approach. Statistical analysis of the data sets indicates that PBMS utilizes fewer samples than does the ATP approach but with similar conclusions. Results support testing the coliphage modifications by using an EPA-approved national PBMS approach with collaboratively shared samples.

  17. Combining computer adaptive testing technology with cognitively diagnostic assessment.

    PubMed

    McGlohen, Meghan; Chang, Hua-Hua

    2008-08-01

    A major advantage of computerized adaptive testing (CAT) is that it allows the test to home in on an examinee's ability level in an interactive manner. The aim of the new area of cognitive diagnosis is to provide information about specific content areas in which an examinee needs help. The goal of this study was to combine the benefit of specific feedback from cognitively diagnostic assessment with the advantages of CAT. In this study, three approaches to combining these were investigated: (1) item selection based on the traditional ability level estimate (theta), (2) item selection based on the attribute mastery feedback provided by cognitively diagnostic assessment (alpha), and (3) item selection based on both the traditional ability level estimate (theta) and the attribute mastery feedback provided by cognitively diagnostic assessment (alpha). The results from these three approaches were compared for theta estimation accuracy, attribute mastery estimation accuracy, and item exposure control. The theta- and alpha-based condition outperformed the alpha-based condition regarding theta estimation, attribute mastery pattern estimation, and item exposure control. Both the theta-based condition and the theta- and alpha-based condition performed similarly with regard to theta estimation, attribute mastery estimation, and item exposure control, but the theta- and alpha-based condition has an additional advantage in that it uses the shadow test method, which allows the administrator to incorporate additional constraints in the item selection process, such as content balancing, item type constraints, and so forth, and also to select items on the basis of both the current theta and alpha estimates, which can be built on top of existing 3PL testing programs.
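    A sketch of the theta-based selection rule described here, for 3PL items with invented parameters: at each step the un-administered item with maximum Fisher information at the current theta estimate is chosen.

      import numpy as np

      rng = np.random.default_rng(5)
      a = rng.uniform(0.8, 2.0, 200)        # discrimination
      b = rng.normal(0.0, 1.0, 200)         # difficulty
      c = rng.uniform(0.1, 0.25, 200)       # pseudo-guessing

      def fisher_info(theta):
          p = c + (1 - c) / (1 + np.exp(-a * (theta - b)))   # 3PL response probability
          return a**2 * ((1 - p) / p) * ((p - c) / (1 - c))**2

      administered = set()
      theta_hat = 0.0
      for step in range(5):
          info = fisher_info(theta_hat)
          info[list(administered)] = -np.inf               # no item reuse
          item = int(np.argmax(info))
          administered.add(item)
          print(f"step {step}: item {item}, info {info[item]:.3f}")
          # (theta_hat would be re-estimated from the observed response here)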

  18. Further Evaluation of Covariate Analysis using Empirical Bayes Estimates in Population Pharmacokinetics: the Perception of Shrinkage and Likelihood Ratio Test.

    PubMed

    Xu, Xu Steven; Yuan, Min; Yang, Haitao; Feng, Yan; Xu, Jinfeng; Pinheiro, Jose

    2017-01-01

    Covariate analysis based on population pharmacokinetics (PPK) is used to identify clinically relevant factors. The likelihood ratio test (LRT) based on nonlinear mixed effect model fits is currently recommended for covariate identification, whereas individual empirical Bayesian estimates (EBEs) are considered unreliable due to the presence of shrinkage. The objectives of this research were to investigate the type I error for LRT and EBE approaches, to confirm the similarity of power between the LRT and EBE approaches from a previous report and to explore the influence of shrinkage on LRT and EBE inferences. Using an oral one-compartment PK model with a single covariate impacting on clearance, we conducted a wide range of simulations according to a two-way factorial design. The results revealed that the EBE-based regression not only provided almost identical power for detecting a covariate effect, but also controlled the false positive rate better than the LRT approach. Shrinkage of EBEs is likely not the root cause for decrease in power or inflated false positive rate although the size of the covariate effect tends to be underestimated at high shrinkage. In summary, contrary to the current recommendations, EBEs may be a better choice for statistical tests in PPK covariate analysis compared to LRT. We proposed a three-step covariate modeling approach for population PK analysis to utilize the advantages of EBEs while overcoming their shortcomings, which allows not only markedly reducing the run time for population PK analysis, but also providing more accurate covariate tests.

  19. An Update on the Progress of the Extended One-Generation Reproductive Protocol

    EPA Science Inventory

    Consistent with a more science-based approach assessing potential adverse effects of pesticides, the ILSI-HESI Agricultural Chemical Safety Assessment (ACSA) Technical Committee, proposed a new tiered toxicity testing strategy. This approach utilizes fewer animals and provides an...

  20. Distinguishing between forensic science and forensic pseudoscience: testing of validity and reliability, and approaches to forensic voice comparison.

    PubMed

    Morrison, Geoffrey Stewart

    2014-05-01

    In this paper it is argued that one should not attempt to directly assess whether a forensic analysis technique is scientifically acceptable. Rather one should first specify what one considers to be appropriate principles governing acceptable practice, then consider any particular approach in light of those principles. This paper focuses on one principle: the validity and reliability of an approach should be empirically tested under conditions reflecting those of the case under investigation using test data drawn from the relevant population. Versions of this principle have been key elements in several reports on forensic science, including forensic voice comparison, published over the last four-and-a-half decades. The aural-spectrographic approach to forensic voice comparison (also known as "voiceprint" or "voicegram" examination) and the currently widely practiced auditory-acoustic-phonetic approach are considered in light of this principle (these two approaches do not appear to be mutually exclusive). Approaches based on data, quantitative measurements, and statistical models are also considered in light of this principle. © 2013.

  1. Get Tested Why Not? A novel approach to internet-based chlamydia and gonorrhea testing in Canada.

    PubMed

    Mann, Tara A; Uddin, Zhaida; Hendriks, Andrew M; Bouchard, Christiane J; Etches, Vera G

    2013-03-07

    The objective of the Get Tested Why Not campaign is to increase access to chlamydia and gonorrhea testing and sexual health information, with specific focus on youth. Individuals between the ages of 15-29 are most affected by chlamydia and gonorrhea infections in Ottawa and were identified as the target population. Youth from the target population were engaged in the development and launch of the campaign. Development of the campaign began in 2009 and led to a launch on March 21, 2011. Social media promotion as well as traditional advertising approaches developed awareness of the campaign within the target population. The campaign consists of a bilingual, youth-friendly website and texting service. After assessing appropriateness of testing, clients can download a requisition form for urine-based chlamydia and gonorrhea testing and submit a sample at one of 26 laboratories across Ottawa. During year 1 of the campaign, there were 13,385 website hits and 104 specimens submitted for chlamydia and gonorrhea testing. The majority (57.6%, n=60) of requisitions were submitted by members of the target population (age 15-29). Of the requisitions submitted, 95 (91.3%) were negative, 4 (3.9%) were positive and 5 (4.8%) were cancelled due to lab errors. The campaign is reaching the target population and has demonstrated a positive impact on knowledge and intended behaviours of users. The use of technology has expanded testing options, thereby potentially broadening Ottawa Public Health's reach to clients who may be less likely to test through traditional testing approaches.

  2. Illustrating, Quantifying, and Correcting for Bias in Post-hoc Analysis of Gene-Based Rare Variant Tests of Association

    PubMed Central

    Grinde, Kelsey E.; Arbet, Jaron; Green, Alden; O'Connell, Michael; Valcarcel, Alessandra; Westra, Jason; Tintle, Nathan

    2017-01-01

    To date, gene-based rare variant testing approaches have focused on aggregating information across sets of variants to maximize statistical power in identifying genes showing significant association with diseases. Beyond identifying genes that are associated with diseases, the identification of causal variant(s) in those genes and estimation of their effect is crucial for planning replication studies and characterizing the genetic architecture of the locus. However, we illustrate that straightforward single-marker association statistics can suffer from substantial bias introduced by conditioning on gene-based test significance, due to the phenomenon often referred to as “winner's curse.” We illustrate the ramifications of this bias on variant effect size estimation and variant prioritization/ranking approaches, outline parameters of genetic architecture that affect this bias, and propose a bootstrap resampling method to correct for this bias. We find that our correction method significantly reduces the bias due to winner's curse (average two-fold decrease in bias, p < 2.2 × 10^-6) and, consequently, substantially improves mean squared error and variant prioritization/ranking. The method is particularly helpful in adjustment for winner's curse effects when the initial gene-based test has low power and for relatively more common, non-causal variants. Adjustment for winner's curse is recommended for all post-hoc estimation and ranking of variants after a gene-based test. Further work is necessary to continue seeking ways to reduce bias and improve inference in post-hoc analysis of gene-based tests under a wide variety of genetic architectures. PMID:28959274
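    A simplified sketch of bootstrap bias correction conditional on gene-level significance (a stand-in burden test and invented data, not the authors' exact procedure): the conditional over-estimate observed in significant resamples is subtracted from the naive estimate.

      import numpy as np

      rng = np.random.default_rng(11)
      n, p = 2000, 8
      G = rng.binomial(2, 0.02, size=(n, p))            # rare-variant genotypes
      beta = np.zeros(p); beta[0] = 0.8                 # one causal variant
      y = G @ beta + rng.normal(size=n)

      def variant_betas(G, y):                          # marginal single-variant slopes
          Gc, yc = G - G.mean(0), y - y.mean()
          return (Gc * yc[:, None]).sum(0) / (Gc**2).sum(0)

      def gene_significant(G, y):                       # crude burden z-test
          r = np.corrcoef(G.sum(1), y)[0, 1]
          return abs(r) * np.sqrt(len(y)) > 1.96

      naive = variant_betas(G, y)
      boots = []
      while len(boots) < 200:                           # keep significant resamples only
          idx = rng.integers(0, n, n)
          if gene_significant(G[idx], y[idx]):
              boots.append(variant_betas(G[idx], y[idx]))
      corrected = naive - (np.mean(boots, axis=0) - naive)
      print("naive:", naive[:3].round(3), "corrected:", corrected[:3].round(3))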

  3. Prediction of Regulation Reserve Requirements in California ISO Control Area based on BAAL Standard

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Etingov, Pavel V.; Makarov, Yuri V.; Samaan, Nader A.

    This paper presents new methodologies developed at Pacific Northwest National Laboratory (PNNL) to estimate regulation capacity requirements in the California ISO control area. Two approaches have been developed: (1) an approach based on statistical analysis of actual historical area control error (ACE) and regulation data, and (2) an approach based on the balancing authority ACE limit control performance standard. The approaches predict regulation reserve requirements on a day-ahead basis, including upward and downward requirements, for each operating hour of a day. California ISO data has been used to test the performance of the proposed algorithms. Results show that the software tool allows saving up to 30% on regulation procurement costs.
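    For the statistical approach, a percentile envelope over historical deployment grouped by operating hour gives a minimal illustration (all numbers are hypothetical, not CAISO data, and the 2.5%/97.5% envelope is an assumed design choice).

      import numpy as np

      rng = np.random.default_rng(2024)
      # Hypothetical historical regulation deployment (MW) per operating hour
      hist = {h: rng.normal(0, 80 + 40 * np.sin(h / 24 * 2 * np.pi), 36_500)
              for h in range(24)}

      for h in (0, 8, 16):
          up = np.percentile(hist[h], 97.5)      # day-ahead upward requirement
          down = np.percentile(hist[h], 2.5)     # day-ahead downward requirement
          print(f"hour {h:2d}: up {up:7.1f} MW, down {down:7.1f} MW")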

  4. Economic evaluation of pharmacogenomics: a value-based approach to pragmatic decision making in the face of complexity.

    PubMed

    Snyder, Susan R; Mitropoulou, Christina; Patrinos, George P; Williams, Marc S

    2014-01-01

    Evidence of the value of pharmacogenomic testing is needed to inform policymakers and clinicians for decision making related to adoption and coverage, and to facilitate prioritization for research and development. Pharmacogenomics has an important role in creating a more efficient healthcare system, and this article addresses how economic evaluation can strategically target evidence gaps for public health priorities with examples from pharmacogenomic medicine. This article begins with a review of the need for and use of economic evaluations in value-based decision making for pharmacogenomic testing. Three important gaps are described with examples demonstrating how they can be addressed: (1) projected impact of hypothetical new technology, (2) pre-implementation assessment of a specific technology, and (3) post-implementation assessment from relevant analytical stakeholder perspectives. Additional needs, challenges and approaches specific to pharmacogenomic economic evaluation in the developing world are also identified. These pragmatic approaches can provide much needed evidence to support real-world value-based decision making for pharmacogenomic-based screening and treatment strategies. © 2014 S. Karger AG, Basel.

  5. Improving the psychometric properties of dot-probe attention measures using response-based computation.

    PubMed

    Evans, Travis C; Britton, Jennifer C

    2018-09-01

    Abnormal threat-related attention in anxiety disorders is most commonly assessed and modified using the dot-probe paradigm; however, poor psychometric properties of reaction-time measures may contribute to inconsistencies across studies. Typically, standard attention measures are derived using average reaction-times obtained in experimentally-defined conditions. However, current approaches based on experimentally-defined conditions are limited. In this study, the psychometric properties of a novel response-based computation approach to analyze dot-probe data are compared to standard measures of attention. 148 adults (19.19 ± 1.42 years, 84 women) completed a standardized dot-probe task including threatening and neutral faces. We generated both standard and response-based measures of attention bias, attentional orientation, and attentional disengagement. We compared overall internal consistency, number of trials necessary to reach internal consistency, test-retest reliability (n = 72), and criterion validity obtained using each approach. Compared to standard attention measures, response-based measures demonstrated uniformly high levels of internal consistency with relatively few trials and varying improvements in test-retest reliability. Additionally, response-based measures demonstrated specific evidence of anxiety-related associations above and beyond both standard attention measures and other confounds. Future studies are necessary to validate this approach in clinical samples. Response-based attention measures demonstrate superior psychometric properties compared to standard attention measures, which may improve the detection of anxiety-related associations and treatment-related changes in clinical samples. Copyright © 2018 Elsevier Ltd. All rights reserved.
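    For context, the sketch below computes the standard experimentally-defined bias score (incongruent minus congruent mean reaction time) and its odd/even split-half consistency, the psychometric property the response-based approach aims to improve; trial data are simulated noise.

      import numpy as np
      import pandas as pd

      rng = np.random.default_rng(9)
      trials = pd.DataFrame({
          "subject": np.repeat(np.arange(40), 80),
          "trial": np.tile(np.arange(80), 40),
          "congruent": np.tile([True, False] * 40, 40),
          "rt": rng.normal(500, 60, 3200),
      })

      def bias(df):      # incongruent minus congruent mean reaction time
          return df.loc[~df.congruent, "rt"].mean() - df.loc[df.congruent, "rt"].mean()

      odd = trials[trials.trial % 2 == 1].groupby("subject").apply(bias)
      even = trials[trials.trial % 2 == 0].groupby("subject").apply(bias)
      r = np.corrcoef(odd, even)[0, 1]
      print(f"split-half reliability (Spearman-Brown): {2 * r / (1 + r):.2f}")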

  6. Sentence Recognition Prediction for Hearing-impaired Listeners in Stationary and Fluctuation Noise With FADE

    PubMed Central

    Schädler, Marc René; Warzybok, Anna; Meyer, Bernd T.; Brand, Thomas

    2016-01-01

    To characterize the individual patient’s hearing impairment as obtained with the matrix sentence recognition test, a simulation Framework for Auditory Discrimination Experiments (FADE) is extended here using the Attenuation and Distortion (A+D) approach by Plomp as a blueprint for setting the individual processing parameters. FADE has been shown to predict the outcome of both speech recognition tests and psychoacoustic experiments based on simulations using an automatic speech recognition system requiring only few assumptions. It builds on the closed-set matrix sentence recognition test which is advantageous for testing individual speech recognition in a way comparable across languages. Individual predictions of speech recognition thresholds in stationary and in fluctuating noise were derived using the audiogram and an estimate of the internal level uncertainty for modeling the individual Plomp curves fitted to the data with the Attenuation (A-) and Distortion (D-) parameters of the Plomp approach. The “typical” audiogram shapes from Bisgaard et al with or without a “typical” level uncertainty and the individual data were used for individual predictions. As a result, the individualization of the level uncertainty was found to be more important than the exact shape of the individual audiogram to accurately model the outcome of the German Matrix test in stationary or fluctuating noise for listeners with hearing impairment. The prediction accuracy of the individualized approach also outperforms the (modified) Speech Intelligibility Index approach which is based on the individual threshold data only. PMID:27604782

  7. Test of cold asphalt storability based on alternative approaches

    NASA Astrophysics Data System (ADS)

    Abaffyová, Zora; Komačka, Jozef

    2017-09-01

    Cold asphalt products for pothole repairs should remain workable (soft enough) for a long time to ensure their applicability. Storability is assessed indirectly using various tests of workability. Therefore, simple test methods (self-compaction and disintegration tests) were developed and verified to investigate changes in the storability of this group of cold asphalts. Self-compaction of the tested mixture in the upturned Abrams cone used for the cement concrete slump test and in the mould for the California Bearing Ratio (CBR) test was assessed in the first stage. After that, a video record of the disintegration test was taken. During this test, the mould was lifted up and the mixture fell out of the mould (Abrams cone) or disintegrated (CBR mould). The drop of the surface after 10 min of self-compaction and the net time related to the falling out or disintegration of the mixture were used to evaluate the mixture from the storability point of view. It was found that the self-compaction test does not have the potential to reveal and prove changes in mixture properties. Based on the disintegration test results, it can be stated that this test at 5 °C using the upturned Abrams cone could be a suitable approach to determine qualitative changes of a cold mixture from the storability point of view.

  8. Design of the ANTARES LCM-DAQ board test bench using a FPGA-based system-on-chip approach

    NASA Astrophysics Data System (ADS)

    Anvar, S.; Kestener, P.; Le Provost, H.

    2006-11-01

    The System-on-Chip (SoC) approach consists in using state-of-the-art FPGA devices with embedded RISC processor cores, high-speed differential LVDS links and ready-to-use multi-gigabit transceivers, allowing the development of compact systems with a substantial number of IO channels. The required performance is obtained through a subtle separation of tasks between closely cooperating programmable hardware logic and a user-friendly software environment. We report on our experience in using the SoC approach for designing the production test bench of the off-shore readout system for the ANTARES neutrino experiment.

  9. Clinical examination and physical assessment of hip joint-related pain in athletes.

    PubMed

    Reiman, Michael P; Thorborg, Kristian

    2014-11-01

    Evidence-based clinical examination and assessment of the athlete with hip joint-related pain is complex. It requires a systematic approach to properly differentiate among competing potential causes of athletic pain generation. An approach with an initial broad focus (and hence the use of highly sensitive tests/measures), followed by more specific tests/measures to pare down this imprecise differential diagnosis list, is suggested. Physical assessment measures are then suggested to discern impairments and activity and participation restrictions for athletes with hip joint-related pain, hence guiding the proper treatment approach. Level of evidence 5.

  10. Relational machine learning for electronic health record-driven phenotyping.

    PubMed

    Peissig, Peggy L; Santos Costa, Vitor; Caldwell, Michael D; Rottscheit, Carla; Berg, Richard L; Mendonca, Eneida A; Page, David

    2014-12-01

    Electronic health records (EHR) offer medical and pharmacogenomics research unprecedented opportunities to identify and classify patients at risk. EHRs are collections of highly inter-dependent records that include biological, anatomical, physiological, and behavioral observations. They comprise a patient's clinical phenome, where each patient has thousands of date-stamped records distributed across many relational tables. Development of EHR computer-based phenotyping algorithms requires time and medical insight from clinical experts, who most often can only review a small patient subset representative of the total EHR records, to identify phenotype features. In this research we evaluate whether relational machine learning (ML) using inductive logic programming (ILP) can contribute to addressing these issues as a viable approach for EHR-based phenotyping. Two relational learning ILP approaches and three well-known WEKA (Waikato Environment for Knowledge Analysis) implementations of non-relational approaches (PART, J48, and JRIP) were used to develop models for nine phenotypes. International Classification of Diseases, Ninth Revision (ICD-9) coded EHR data were used to select training cohorts for the development of each phenotypic model. Accuracy, precision, recall, F-Measure, and Area Under the Receiver Operating Characteristic (AUROC) curve statistics were measured for each phenotypic model based on independent manually verified test cohorts. A two-sided binomial distribution test (sign test) compared the five ML approaches across phenotypes for statistical significance. We developed an approach to automatically label training examples using ICD-9 diagnosis codes for the ML approaches being evaluated. Nine phenotypic models for each ML approach were evaluated, resulting in better overall model performance in AUROC using ILP when compared to PART (p=0.039), J48 (p=0.003) and JRIP (p=0.003). ILP has the potential to improve phenotyping by independently delivering expert-interpretable rules for phenotype definitions, or intuitive phenotypes to assist experts. Relational learning using ILP offers a viable approach to EHR-driven phenotyping. Copyright © 2014 Elsevier Inc. All rights reserved.
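    The sign test used for the method comparison is easy to reproduce; the AUROC values below are invented placeholders standing in for the nine phenotype results.

      from scipy.stats import binomtest

      ilp_auroc = [0.91, 0.88, 0.93, 0.85, 0.90, 0.87, 0.92, 0.89, 0.86]
      j48_auroc = [0.84, 0.86, 0.88, 0.83, 0.85, 0.84, 0.87, 0.85, 0.84]

      wins = sum(a > b for a, b in zip(ilp_auroc, j48_auroc))
      result = binomtest(wins, n=len(ilp_auroc), p=0.5, alternative="two-sided")
      print(f"ILP wins {wins}/9 phenotypes, sign-test p = {result.pvalue:.4f}")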

  11. Artificial intelligence techniques for ground test monitoring of rocket engines

    NASA Technical Reports Server (NTRS)

    Ali, Moonis; Gupta, U. K.

    1990-01-01

    An expert system is being developed which can detect anomalies in Space Shuttle Main Engine (SSME) sensor data significantly earlier than the redline algorithm currently in use. The training of this expert system focuses on two approaches based on low-frequency and high-frequency analyses of sensor data. Both approaches are being tested on data from SSME tests and their results compared with the findings of NASA and Rocketdyne experts. Prototype implementations have detected the presence of anomalies earlier than the currently used redline algorithms. It therefore appears that these approaches have the potential of detecting anomalies early enough to shut down the engine or take other corrective action before severe damage occurs.

  12. A combination test for detection of gene-environment interaction in cohort studies.

    PubMed

    Coombes, Brandon; Basu, Saonli; McGue, Matt

    2017-07-01

    Identifying gene-environment (G-E) interactions can contribute to a better understanding of disease etiology, which may help researchers develop disease prevention strategies and interventions. A major criticism of G-E interaction studies is their lack of power at achievable sample sizes. Studies often restrict the interaction search to the top few hundred hits from a genome-wide association study or focus on potential candidate genes. In this paper, we test interaction between a candidate gene and an environmental factor, improving power by jointly analyzing multiple variants within the gene. We extend recently developed score statistic based genetic association testing approaches to the G-E interaction testing problem. We also propose tests for interaction using gene-based summary measures that pool variants together. Although it has recently been shown that these summary measures can be biased and may lead to inflated type I error, we show that under several realistic scenarios, we can still provide valid tests of interaction. These tests use significantly fewer degrees of freedom and thus can have much higher power to detect interaction. Additionally, we demonstrate that the iSeq-aSum-min test, which combines a gene-based summary measure test, iSeq-aSum-G, and an interaction-based summary measure test, iSeq-aSum-I, provides a powerful alternative for testing G-E interaction. We demonstrate the performance of these approaches using simulation studies and illustrate them by studying interaction between SNPs in several candidate genes and family climate environment on alcohol consumption using the Minnesota Center for Twin and Family Research dataset. © 2017 WILEY PERIODICALS, INC.
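
    The combination idea behind iSeq-aSum-min can be sketched as taking the smaller of the two component p-values and applying a conservative calibration. The function name and the Bonferroni-style factor-of-two calibration below are illustrative assumptions; the published statistic may calibrate the minimum differently (e.g., by permutation).

    ```python
    # Hedged sketch: combine a gene-based summary test and an interaction-based
    # summary test by taking the minimum p-value with a Bonferroni-style bound.
    def min_p_combination(p_gene_summary: float, p_interaction_summary: float) -> float:
        """Conservative combined p-value for two component tests (assumed calibration)."""
        return min(1.0, 2.0 * min(p_gene_summary, p_interaction_summary))

    # Hypothetical component p-values from the two summary tests
    print(min_p_combination(0.04, 0.20))  # -> 0.08
    ```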

  13. An Approach to a Comprehensive Test Framework for Analysis and Evaluation of Text Line Segmentation Algorithms

    PubMed Central

    Brodic, Darko; Milivojevic, Dragan R.; Milivojevic, Zoran N.

    2011-01-01

    The paper introduces a testing framework for the evaluation and validation of text line segmentation algorithms. Text line segmentation represents the key step for correct optical character recognition. Many tests for the evaluation of text line segmentation algorithms use text databases as reference templates; because such templates can mismatch real conditions, a reliable testing framework is required. Hence, a new approach to a comprehensive experimental framework for the evaluation of text line segmentation algorithms is proposed. It consists of synthetic multiline text samples as well as real handwritten text. Although the tests are mutually independent, the results are cross-linked. The proposed method can be used for different types of scripts and languages. Furthermore, two different procedures for evaluating algorithm efficiency based on the obtained error-type classification are proposed. The first is based on the segmentation-line error description, while the second incorporates well-known signal detection theory. Each has different capabilities and conveniences, but they can be used as supplements to make the evaluation process efficient. Overall, the procedure based on the segmentation-line error description has some advantages, as it is characterized by five measures that describe the measurement procedure. PMID:22164106

  14. Strengthening Theoretical Testing in Criminology Using Agent-based Modeling.

    PubMed

    Johnson, Shane D; Groff, Elizabeth R

    2014-07-01

    The Journal of Research in Crime and Delinquency (JRCD) has published important contributions to both criminological theory and associated empirical tests. In this article, we consider some of the challenges associated with traditional approaches to social science research, and discuss a complementary approach that is gaining popularity, agent-based computational modeling, which may offer new opportunities to strengthen theories of crime and develop insights into phenomena of interest. Two literature reviews are completed. The aim of the first is to identify those articles published in JRCD that have been the most influential and to classify the theoretical perspectives taken. The second is intended to identify those studies that have used an agent-based model (ABM) to examine criminological theories and to identify which theories have been explored. Ecological theories of crime pattern formation have received the most attention from researchers using ABMs, but many other criminological theories are amenable to testing using such methods. Traditional methods of theory development and testing suffer from a number of potential issues that a more systematic use of ABMs, though not without its own issues, may help to overcome. ABMs should become another method in the criminologist's toolbox to aid theory testing and falsification.

  15. An approach to a comprehensive test framework for analysis and evaluation of text line segmentation algorithms.

    PubMed

    Brodic, Darko; Milivojevic, Dragan R; Milivojevic, Zoran N

    2011-01-01

    The paper introduces a testing framework for the evaluation and validation of text line segmentation algorithms. Text line segmentation represents the key step for correct optical character recognition. Many tests for the evaluation of text line segmentation algorithms use text databases as reference templates; because such templates can mismatch real conditions, a reliable testing framework is required. Hence, a new approach to a comprehensive experimental framework for the evaluation of text line segmentation algorithms is proposed. It consists of synthetic multiline text samples as well as real handwritten text. Although the tests are mutually independent, the results are cross-linked. The proposed method can be used for different types of scripts and languages. Furthermore, two different procedures for evaluating algorithm efficiency based on the obtained error-type classification are proposed. The first is based on the segmentation-line error description, while the second incorporates well-known signal detection theory. Each has different capabilities and conveniences, but they can be used as supplements to make the evaluation process efficient. Overall, the procedure based on the segmentation-line error description has some advantages, as it is characterized by five measures that describe the measurement procedure.

  16. Proposing an Evidence-Based Strategy for Software Requirements Engineering.

    PubMed

    Lindoerfer, Doris; Mansmann, Ulrich

    2016-01-01

    This paper discusses an evidence-based approach to software requirements engineering. The approach is called evidence-based because it uses publications on the specific problem as a surrogate for stakeholder interests, from which risks and testing experiences are formulated. This complements agile software development models, in which requirements and solutions evolve through collaboration between self-organizing cross-functional teams. The strategy is exemplified and applied to the development of a software requirements list used to develop software systems for patient registries.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sale, K

    Several approaches to ASP performance can be contemplated. Perhaps the ideal would be a full cost/benefit analysis (which is probably utterly infeasible). Another would be a test-based figure-of-merit (FOM); this approach has the virtue of being quantitative, but the challenge is that each customer and application would be characterized by a different FOM. The alternative proposed here uses information about the limits of detection of real instruments to support informed judgments.

  18. Angular velocity estimation from measurement vectors of star tracker.

    PubMed

    Liu, Hai-bo; Yang, Jun-cai; Yi, Wen-jun; Wang, Jiong-qi; Yang, Jian-kun; Li, Xiu-jian; Tan, Ji-chun

    2012-06-01

    In most spacecraft, there is a need to know the craft's angular rate. Approaches based on least squares and an adaptive Kalman filter are proposed for estimating the angular rate directly from the star tracker measurements. In these approaches, only knowledge of the vector measurements and the sampling interval is required. The designed adaptive Kalman filter can filter out noise without information on the dynamic model or inertia dyadic. To verify the proposed estimation approaches, simulations based on orbit data from the challenging minisatellite payload (CHAMP) satellite and experimental tests with night-sky observation are performed. Both the simulations and the experimental results demonstrate that the proposed approaches perform well in terms of accuracy and robustness.
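
    To make the least-squares idea concrete: a star direction fixed in the inertial frame, expressed in the body frame, satisfies db/dt = b × ω, so stacking several measured vectors gives a linear system for the angular rate. The sketch below is a minimal finite-difference version under that assumption; it is not the paper's adaptive Kalman filter, and all values are hypothetical.

    ```python
    # Least-squares angular-rate estimate from star tracker unit vectors,
    # assuming inertially fixed stars so that db/dt = b x omega in the body frame.
    import numpy as np

    def skew(v):
        """Cross-product matrix: skew(v) @ w == np.cross(v, w)."""
        return np.array([[0.0, -v[2], v[1]],
                         [v[2], 0.0, -v[0]],
                         [-v[1], v[0], 0.0]])

    def estimate_omega(b_prev, b_curr, dt):
        """b_prev, b_curr: (n, 3) arrays of unit star vectors at two sample times."""
        A = np.vstack([skew(b) for b in b_prev])   # stacked 3x3 blocks
        y = ((b_curr - b_prev) / dt).reshape(-1)   # finite-difference derivatives
        omega, *_ = np.linalg.lstsq(A, y, rcond=None)
        return omega                               # body angular rate, rad/s

    # Hypothetical check: two stars, small rotation about z between samples
    b0 = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
    a = 0.001
    Rz = np.array([[np.cos(a), np.sin(a), 0], [-np.sin(a), np.cos(a), 0], [0, 0, 1]])
    print(estimate_omega(b0, b0 @ Rz.T, dt=0.1))   # ~ [0, 0, 0.01]
    ```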

  19. Power calculation for comparing diagnostic accuracies in a multi-reader, multi-test design.

    PubMed

    Kim, Eunhee; Zhang, Zheng; Wang, Youdan; Zeng, Donglin

    2014-12-01

    Receiver operating characteristic (ROC) analysis is widely used to evaluate the performance of diagnostic tests with continuous or ordinal responses. A popular study design for assessing the accuracy of diagnostic tests involves multiple readers interpreting multiple diagnostic test results, called the multi-reader, multi-test design. Although several different approaches to analyzing data from this design exist, few publications have addressed sample size and power issues. In this article, we develop a power formula to compare the correlated areas under the ROC curves (AUC) in a multi-reader, multi-test design. We present a nonparametric approach to estimate and compare the correlated AUCs by extending DeLong et al.'s (1988, Biometrics 44, 837-845) approach. A power formula is derived based on the asymptotic distribution of the nonparametric AUCs. Simulation studies are conducted to demonstrate the performance of the proposed power formula and an example is provided to illustrate the proposed procedure. © 2014, The International Biometric Society.
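
    The nonparametric AUC and a DeLong-style variance, the building blocks of the power formula, can be sketched as follows. The reader scores are hypothetical, and the multi-reader correlation structure that the paper handles is omitted.

    ```python
    # Nonparametric (Mann-Whitney) AUC with a structural-components variance,
    # in the spirit of DeLong et al. (1988). Scores are hypothetical.
    import numpy as np

    def delong_auc(diseased, nondiseased):
        x = np.asarray(diseased, float)[:, None]     # shape (m, 1)
        y = np.asarray(nondiseased, float)[None, :]  # shape (1, n)
        psi = (x > y) + 0.5 * (x == y)               # Mann-Whitney kernel
        auc = psi.mean()
        v10 = psi.mean(axis=1)                       # diseased-subject components
        v01 = psi.mean(axis=0)                       # nondiseased-subject components
        var = v10.var(ddof=1) / len(v10) + v01.var(ddof=1) / len(v01)
        return auc, var

    auc, var = delong_auc([2.1, 3.4, 2.8, 4.0], [1.2, 2.0, 1.7, 2.5, 1.1])
    print(f"AUC = {auc:.3f}, SE = {var ** 0.5:.3f}")
    ```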

  20. Predicting the future: opportunities and challenges for the chemical industry to apply 21st-century toxicity testing.

    PubMed

    Settivari, Raja S; Ball, Nicholas; Murphy, Lynea; Rasoulpour, Reza; Boverhof, Darrell R; Carney, Edward W

    2015-03-01

    Interest in applying 21st-century toxicity testing tools for safety assessment of industrial chemicals is growing. Whereas conventional toxicology uses mainly animal-based, descriptive methods, a paradigm shift is emerging in which computational approaches, systems biology, high-throughput in vitro toxicity assays, and high-throughput exposure assessments are beginning to be applied to mechanism-based risk assessments in a time- and resource-efficient fashion. Here we describe recent advances in predictive safety assessment, with a focus on their strategic application to meet the changing demands of the chemical industry and its stakeholders. The opportunities to apply these new approaches are extensive and include screening of new chemicals, informing the design of safer and more sustainable chemical alternatives, filling information gaps on data-poor chemicals already in commerce, strengthening read-across methodology for categories of chemicals sharing similar modes of action, and optimizing the design of reduced-risk product formulations. Finally, we discuss how these predictive approaches dovetail with in vivo integrated testing strategies within repeated-dose regulatory toxicity studies, which are in line with 3Rs principles to refine, reduce, and replace animal testing. Strategic application of these tools is the foundation for informed and efficient safety assessment testing strategies that can be applied at all stages of the product-development process.

  1. An evidence-based approach to the creation of normative data: base rates of impaired scores within a brief neuropsychological battery argue for age corrections, but against corrections for medical conditions.

    PubMed

    O'Connell, Megan E; Tuokko, Holly; Voll, Stacey; Simard, Martine; Griffith, Lauren E; Taler, Vanessa; Wolfson, Christina; Kirkland, Susan; Raina, Parminder

    We detail a new approach to the creation of normative data for neuropsychological tests. The traditional approach to normative data creation is to make demographic adjustments based on observations of correlations between single neuropsychological tests and selected demographic variables. We argue, however, that this does not describe the implications for clinical practice, such as an increased likelihood of misclassification of cognitive impairment, nor does it elucidate the impact on decision-making with a neuropsychological battery. We propose base rate analyses, specifically differential base rates of impaired scores between theoretical and actual base rates, as the basis for decisions to create demographic adjustments within normative data. Differential base rates empirically describe the potential clinical implications of failing to create an appropriate normative group. We demonstrate this approach with data from a short telephone-administered neuropsychological battery given to a large, neurologically healthy sample aged 45-85 years. We explored whether adjustments for age and medical conditions were warranted based on differential base rates of spuriously impaired scores. Theoretical base rates underestimated the frequency of impaired scores in older adults and overestimated the frequency of impaired scores in younger adults, providing an evidence base for the creation of age-corrected normative data. In contrast, the number of medical conditions (numerous cardiovascular, hormonal, and metabolic conditions) was not related to differential base rates of impaired scores. Despite a small correlation between the number of medical conditions and each neuropsychological variable, normative adjustments for the number of medical conditions do not appear warranted. Implications for the creation of normative data are discussed.
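
    Under an independence assumption, the theoretical base rate of at least one impaired score among k tests at a fixed percentile cutoff is 1 - (1 - p)^k; the evidence for an adjustment is a subgroup whose observed rate departs from this. A minimal sketch with hypothetical numbers (real batteries have correlated tests, which is exactly why observed rates matter):

    ```python
    # Theoretical vs. observed base rate of >=1 impaired score in a k-test battery.
    def theoretical_base_rate(n_tests: int, cutoff_pct: float = 0.05) -> float:
        """P(at least one impaired score) if tests were independent."""
        return 1.0 - (1.0 - cutoff_pct) ** n_tests

    observed_rate_older = 0.42           # hypothetical observed proportion, ages 75-85
    expected = theoretical_base_rate(8)  # e.g., an 8-test battery at the 5th percentile
    print(f"expected {expected:.2f} vs observed {observed_rate_older:.2f}")
    # A subgroup whose observed rate departs from the theoretical one argues
    # for demographically corrected norms for that subgroup.
    ```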

  2. Cost-effectiveness of Population Screening for BRCA Mutations in Ashkenazi Jewish Women Compared With Family History–Based Testing

    PubMed Central

    Manchanda, Ranjit; Legood, Rosa; Burnell, Matthew; McGuire, Alistair; Raikou, Maria; Loggenberg, Kelly; Wardle, Jane; Sanderson, Saskia; Gessler, Sue; Side, Lucy; Balogun, Nyala; Desai, Rakshit; Kumar, Ajith; Dorkins, Huw; Wallis, Yvonne; Chapman, Cyril; Taylor, Rohan; Jacobs, Chris; Tomlinson, Ian; Beller, Uziel; Menon, Usha

    2015-01-01

    Background: Population-based testing for BRCA1/2 mutations detects the high proportion of carriers not identified by cancer family history (FH)-based testing. We compared the cost-effectiveness of population-based BRCA testing with the standard FH-based approach in Ashkenazi Jewish (AJ) women. Methods: A decision-analytic model was developed to compare lifetime costs and effects of BRCA founder-mutation testing amongst AJ women in the UK under two strategies: 1) testing all women in the population age 30 years or older, and 2) testing only those with a strong FH (≥10% mutation risk). The model assumes that BRCA carriers are offered risk-reducing salpingo-oophorectomy and annual MRI/mammography screening or risk-reducing mastectomy. Model probabilities were drawn from the Genetic Cancer Prediction through Population Screening trial and published literature to estimate total costs, effects in terms of quality-adjusted life-years (QALYs), cancer incidence, the incremental cost-effectiveness ratio (ICER), and population impact. Costs are reported at 2010 prices. Costs and outcomes were discounted at 3.5%. We used deterministic and probabilistic sensitivity analysis (PSA) to evaluate model uncertainty. Results: Compared with FH-based testing, population screening saved 0.090 more life-years and 0.101 more QALYs, resulting in a 33-day gain in life expectancy. Population screening was found to be cost saving with a baseline-discounted ICER of -£2079/QALY. Population-based screening lowered ovarian and breast cancer incidence by 0.34% and 0.62%. Assuming 71% testing uptake, this leads to 276 fewer ovarian and 508 fewer breast cancer cases. Overall, the reduction in treatment costs led to a discounted cost savings of £3.7 million. Deterministic sensitivity analysis and 94% of simulations on PSA (threshold £20000) indicated that population screening is cost-effective compared with current NHS policy. Conclusion: Population-based screening for BRCA mutations is highly cost-effective compared with an FH-based approach in AJ women age 30 years and older. PMID:25435542
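
    The headline comparison reduces to an incremental cost-effectiveness ratio, ICER = ΔC/ΔQALY. The inputs in the sketch below are hypothetical, chosen only so the ratio reproduces the reported -£2079/QALY; they are not the model's actual totals.

    ```python
    # Incremental cost-effectiveness ratio for population vs. FH-based testing.
    def icer(cost_new, cost_old, qaly_new, qaly_old):
        """Incremental cost per QALY gained; negative with a QALY gain means cost saving."""
        return (cost_new - cost_old) / (qaly_new - qaly_old)

    # Hypothetical per-woman discounted totals (illustrative only)
    print(round(icer(cost_new=9790, cost_old=10000, qaly_new=14.501, qaly_old=14.400)))
    # -> about -2079, i.e., population screening dominates the comparator
    ```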

  3. A motor learning approach to training wheelchair propulsion biomechanics for new manual wheelchair users: A pilot study

    PubMed Central

    Morgan, Kerri A.; Tucker, Susan M.; Klaesner, Joseph W.; Engsberg, Jack R.

    2017-01-01

    Context/Objective Developing an evidence-based approach to teaching wheelchair skills and proper propulsion for everyday wheelchair users with a spinal cord injury (SCI) is important to their rehabilitation. The purpose of this project was to pilot test manual wheelchair training based on motor learning and repetition-based approaches for new manual wheelchair users with an SCI. Design A repeated measures within-subject design was used with participants acting as their own controls. Methods Six persons with an SCI requiring the use of a manual wheelchair participated in wheelchair training. The training included nine 90-minute sessions. The primary focus was on wheelchair propulsion biomechanics with a secondary focus on wheelchair skills. Outcome Measures During Pretest 1, Pretest 2, and Posttest, wheelchair propulsion biomechanics were measured using the Wheelchair Propulsion Test and a Video Motion Capture system. During Pretest 2 and Posttest, propulsion forces using the WheelMill System and wheelchair skills using the Wheelchair Skills Test were measured. Results Significant changes in area of the push loop, hand-to-axle relationship, and slope of push forces were found. Changes in propulsion patterns were identified post-training. No significant differences were found in peak and average push forces and wheelchair skills pre- and post-training. Conclusions This project identified trends in change related to a repetition-based motor learning approach for propelling a manual wheelchair. The changes found were related to the propulsion patterns used by participants. Despite some challenges associated with implementing interventions for new manual wheelchair users, such as recruitment, the results of this study show that repetition-based training can improve biomechanics and propulsion patterns for new manual wheelchair users. PMID:26674751

  4. A motor learning approach to training wheelchair propulsion biomechanics for new manual wheelchair users: A pilot study.

    PubMed

    Morgan, Kerri A; Tucker, Susan M; Klaesner, Joseph W; Engsberg, Jack R

    2017-05-01

    Developing an evidence-based approach to teaching wheelchair skills and proper propulsion for everyday wheelchair users with a spinal cord injury (SCI) is important to their rehabilitation. The purpose of this project was to pilot test manual wheelchair training based on motor learning and repetition-based approaches for new manual wheelchair users with an SCI. A repeated measures within-subject design was used with participants acting as their own controls. Six persons with an SCI requiring the use of a manual wheelchair participated in wheelchair training. The training included nine 90-minute sessions. The primary focus was on wheelchair propulsion biomechanics with a secondary focus on wheelchair skills. During Pretest 1, Pretest 2, and Posttest, wheelchair propulsion biomechanics were measured using the Wheelchair Propulsion Test and a Video Motion Capture system. During Pretest 2 and Posttest, propulsion forces using the WheelMill System and wheelchair skills using the Wheelchair Skills Test were measured. Significant changes in area of the push loop, hand-to-axle relationship, and slope of push forces were found. Changes in propulsion patterns were identified post-training. No significant differences were found in peak and average push forces and wheelchair skills pre- and post-training. This project identified trends in change related to a repetition-based motor learning approach for propelling a manual wheelchair. The changes found were related to the propulsion patterns used by participants. Despite some challenges associated with implementing interventions for new manual wheelchair users, such as recruitment, the results of this study show that repetition-based training can improve biomechanics and propulsion patterns for new manual wheelchair users.

  5. Concern-driven integrated approaches to nanomaterial testing and assessment – report of the NanoSafety Cluster Working Group 10

    PubMed Central

    Oomen, Agnes G.; Bos, Peter M. J.; Fernandes, Teresa F.; Hund-Rinke, Kerstin; Boraschi, Diana; Byrne, Hugh J.; Aschberger, Karin; Gottardo, Stefania; von der Kammer, Frank; Kühnel, Dana; Hristozov, Danail; Marcomini, Antonio; Migliore, Lucia; Scott-Fordsmand, Janeck; Wick, Peter

    2014-01-01

    Bringing together topic-related European Union (EU)-funded projects, the so-called “NanoSafety Cluster” aims at identifying key areas for further research on risk assessment procedures for nanomaterials (NM). The outcome of NanoSafety Cluster Working Group 10, this commentary presents a vision for concern-driven integrated approaches for the (eco-)toxicological testing and assessment (IATA) of NM. Such approaches should start out by determining concerns, i.e., specific information needs for a given NM based on realistic exposure scenarios. Recognised concerns can be addressed in a set of tiers using standardised protocols for NM preparation and testing. Tier 1 includes determining physico-chemical properties, non-testing (e.g., structure–activity relationships) and evaluating existing data. In tier 2, a limited set of in vitro and in vivo tests are performed that can either indicate that the risk of the specific concern is sufficiently known or indicate the need for further testing, including details for such testing. Ecotoxicological testing begins with representative test organisms followed by complex test systems. After each tier, it is evaluated whether the information gained permits assessing the safety of the NM so that further testing can be waived. By effectively exploiting all available information, IATA allow accelerating the risk assessment process and reducing testing costs and animal use (in line with the 3Rs principle implemented in EU Directive 2010/63/EU). Combining material properties, exposure, biokinetics and hazard data, information gained with IATA can be used to recognise groups of NM based upon similar modes of action. Grouping of substances in return should form integral part of the IATA themselves. PMID:23641967

  6. Gene Therapy for the Retinal Degeneration of Usher Syndrome Caused by Mutations in MYO7A.

    PubMed

    Lopes, Vanda S; Williams, David S

    2015-01-20

    Usher syndrome is a deaf-blindness disorder. One of the subtypes, Usher 1B, is caused by loss of function of the gene encoding the unconventional myosin, MYO7A. A variety of different viral-based delivery approaches have been tested for retinal gene therapy to prevent the blindness of Usher 1B, and a clinical trial based on one of these approaches has begun. This review evaluates the different approaches. Copyright © 2015 Cold Spring Harbor Laboratory Press; all rights reserved.

  7. Final Report: The Influence of Novel Behavioral Strategies in Promoting the Diffusion of Solar Energy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gillingham, Kenneth; Bollinger, Bryan

    This is the final report for a systematic, evidence-based project using an unprecedented series of large-scale field experiments to examine the effectiveness and cost-effectiveness of novel approaches to reduce the soft costs of residential solar photovoltaics. The approaches were based around grassroots marketing campaigns called ‘Solarize’ campaigns, designed to lower costs and increase adoption of solar technology. This study quantified the effectiveness and cost-effectiveness of the Solarize programs and tested new approaches to further improve the model.

  8. An empirical study on information spillover effects between the Chinese copper futures market and spot market

    NASA Astrophysics Data System (ADS)

    Liu, Xiangli; Cheng, Siwei; Wang, Shouyang; Hong, Yongmiao; Li, Yi

    2008-02-01

    This study employs a parametric approach based on TGARCH and GARCH models to estimate the VaR of the copper futures market and spot market in China. Considering the short-selling mechanism in the futures market, the paper introduces two new notions: upside VaR and extreme upside risk spillover. Both downside VaR and upside VaR are examined using this approach. We also use Kupiec's [P.H. Kupiec, Techniques for verifying the accuracy of risk measurement models, Journal of Derivatives 3 (1995) 73-84] backtest to test the power of our approaches. In addition, we investigate information spillover effects between the futures market and the spot market by employing a linear Granger causality test, and Granger causality tests in mean, volatility, and risk, respectively. Moreover, we also investigate the relationship between the futures market and the spot market using a test based on a kernel function. Empirical results indicate that there exist significant two-way spillovers between the futures market and the spot market, and the spillovers from the futures market to the spot market are much more striking.
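
    Kupiec's backtest cited above is a likelihood-ratio test on the proportion of VaR violations. A minimal sketch with hypothetical violation counts:

    ```python
    # Kupiec proportion-of-failures (POF) backtest: x violations of a coverage-p
    # VaR in T days; the LR statistic is asymptotically chi-square with 1 df.
    import math
    from scipy.stats import chi2

    def kupiec_pof(x: int, T: int, p: float) -> float:
        """p-value of the unconditional coverage test (requires 0 < x < T)."""
        phat = x / T
        log_h0 = (T - x) * math.log(1 - p) + x * math.log(p)
        log_h1 = (T - x) * math.log(1 - phat) + x * math.log(phat)
        lr = -2.0 * (log_h0 - log_h1)
        return chi2.sf(lr, df=1)

    print(kupiec_pof(x=9, T=500, p=0.01))  # hypothetical: 9 violations of a 1% VaR
    ```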

  9. Numerical study on the sequential Bayesian approach for radioactive materials detection

    NASA Astrophysics Data System (ADS)

    Qingpei, Xiang; Dongfeng, Tian; Jianyu, Zhu; Fanhua, Hao; Ge, Ding; Jun, Zeng

    2013-01-01

    A new detection method, based on the sequential Bayesian approach proposed by Candy et al., offers new horizons for research on radioactive detection. Compared with commonly adopted detection methods based on classical statistical theory, the sequential Bayesian approach offers the advantage of shorter verification times when analyzing spectra that contain low total counts, especially for complex radionuclide compositions. In this paper, a simulation experiment platform implementing the methodology of the sequential Bayesian approach was developed. Event sequences of γ-rays associated with the true parameters of a LaBr3(Ce) detector were obtained from an event-sequence generator based on Monte Carlo sampling theory to study the performance of the sequential Bayesian approach. The numerical experimental results are in accordance with those of Candy. Moreover, the relationship between the detection model and the event generator, represented respectively by the expected detection rate (Am) and the tested detection rate (Gm) parameters, is investigated. To achieve optimal performance for this processor, the interval of the tested detection rate as a function of the expected detection rate is also presented.
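
    The flavor of the sequential Bayesian processor can be conveyed with a simplified count-based version: after each counting interval, update the posterior probability that a source is present. The rates, counts, and per-interval Poisson simplification are illustrative assumptions; the full method operates event-by-event on arrival times and energies.

    ```python
    # Sequential Bayesian update of P(source present) from per-interval counts.
    from scipy.stats import poisson

    def sequential_posterior(counts, bkg_rate, src_rate, prior=0.5):
        p = prior
        for k in counts:
            l1 = poisson.pmf(k, bkg_rate + src_rate)  # likelihood: source present
            l0 = poisson.pmf(k, bkg_rate)             # likelihood: background only
            p = p * l1 / (p * l1 + (1 - p) * l0)      # Bayes update
            yield p

    # Hypothetical counts per one-second interval; background 3.0 cps, source 2.5 cps
    for p in sequential_posterior([4, 6, 5, 8, 7], bkg_rate=3.0, src_rate=2.5):
        print(f"P(source) = {p:.3f}")
    ```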

  10. A Computationally-Efficient Inverse Approach to Probabilistic Strain-Based Damage Diagnosis

    NASA Technical Reports Server (NTRS)

    Warner, James E.; Hochhalter, Jacob D.; Leser, William P.; Leser, Patrick E.; Newman, John A

    2016-01-01

    This work presents a computationally-efficient inverse approach to probabilistic damage diagnosis. Given strain data at a limited number of measurement locations, Bayesian inference and Markov Chain Monte Carlo (MCMC) sampling are used to estimate probability distributions of the unknown location, size, and orientation of damage. Substantial computational speedup is obtained by replacing a three-dimensional finite element (FE) model with an efficient surrogate model. The approach is experimentally validated on cracked test specimens where full field strains are determined using digital image correlation (DIC). Access to full field DIC data allows for testing of different hypothetical sensor arrangements, facilitating the study of strain-based diagnosis effectiveness as the distance between damage and measurement locations increases. The ability of the framework to effectively perform both probabilistic damage localization and characterization in cracked plates is demonstrated and the impact of measurement location on uncertainty in the predictions is shown. Furthermore, the analysis time to produce these predictions is orders of magnitude less than a baseline Bayesian approach with the FE method by utilizing surrogate modeling and effective numerical sampling approaches.
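
    A compact sketch of the surrogate-accelerated Bayesian workflow follows: random-walk Metropolis sampling of damage parameters, with a cheap stand-in function replacing the finite element strain model. The surrogate form, sensor layout, and noise level are all hypothetical placeholders, not the paper's models.

    ```python
    # Random-walk Metropolis over damage parameters theta = (x, y, size),
    # using a cheap surrogate in place of an expensive FE strain prediction.
    import numpy as np

    rng = np.random.default_rng(0)

    def surrogate_strains(theta):
        """Placeholder surrogate: maps damage parameters to strains at 4 sensors."""
        x, y, size = theta
        sensors = np.array([[0.2, 0.2], [0.8, 0.2], [0.2, 0.8], [0.8, 0.8]])
        d2 = ((sensors - [x, y]) ** 2).sum(axis=1)
        return size / (1.0 + 10.0 * d2)

    def log_posterior(theta, data, sigma=0.01):
        if not all(0.0 < t < 1.0 for t in theta):
            return -np.inf                          # uniform prior on the unit box
        r = data - surrogate_strains(theta)
        return -0.5 * np.sum(r ** 2) / sigma ** 2   # Gaussian likelihood

    data = surrogate_strains([0.6, 0.4, 0.5]) + rng.normal(0, 0.01, 4)
    theta, samples = np.array([0.5, 0.5, 0.5]), []
    for _ in range(5000):
        prop = theta + rng.normal(0, 0.05, 3)       # random-walk proposal
        if np.log(rng.uniform()) < log_posterior(prop, data) - log_posterior(theta, data):
            theta = prop
        samples.append(theta.copy())
    print(np.mean(samples[1000:], axis=0))          # posterior mean after burn-in
    ```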

  11. A multi-frequency receiver function inversion approach for crustal velocity structure

    NASA Astrophysics Data System (ADS)

    Li, Xuelei; Li, Zhiwei; Hao, Tianyao; Wang, Sheng; Xing, Jian

    2017-05-01

    To better constrain crustal velocity structures, we developed a new nonlinear inversion approach based on multi-frequency receiver function waveforms. With the global optimization algorithm of Differential Evolution (DE), low-frequency receiver function waveforms primarily constrain large-scale velocity structures, while high-frequency waveforms show advantages in recovering small-scale velocity structures. Based on synthetic tests with multi-frequency receiver function waveforms, the proposed approach can constrain both long- and short-wavelength characteristics of the crustal velocity structure simultaneously. Inversions with real data are also conducted for the seismic stations KMNB in southeast China and HYB in the Indian continent, where crustal structures have been well studied by previous researchers. Comparisons of the velocity models inverted in previous studies and ours suggest good consistency, while our approach achieves better waveform fit with fewer model parameters. Comprehensive tests with synthetic and real data suggest that the proposed multi-frequency receiver function inversion approach is effective and robust for inverting crustal velocity structures.
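
    The inversion loop can be sketched with SciPy's Differential Evolution minimizing a misfit summed over frequency bands, so that low and high frequencies jointly constrain the model. The toy forward model below is a stand-in for a real receiver-function synthetic, and the three-parameter model is hypothetical.

    ```python
    # Multi-frequency waveform inversion sketch with Differential Evolution.
    import numpy as np
    from scipy.optimize import differential_evolution

    def synthetic_rf(model, freq):
        """Toy stand-in for a receiver-function forward model: damped cosines
        whose shape depends on the layer velocities in `model`."""
        t = np.linspace(0, 10, 200)
        return sum(v * np.exp(-freq * t) * np.cos(2 * np.pi * freq * t / (1 + i))
                   for i, v in enumerate(model))

    true_model = [3.2, 3.6, 4.0]                  # hypothetical layer velocities, km/s
    freqs = [0.5, 1.0, 2.0]                       # multi-frequency data
    observed = {f: synthetic_rf(true_model, f) for f in freqs}

    def misfit(model):
        # Low-frequency terms constrain long wavelengths, high-frequency terms fine structure
        return sum(np.sum((observed[f] - synthetic_rf(model, f)) ** 2) for f in freqs)

    result = differential_evolution(misfit, bounds=[(2.5, 4.5)] * 3, seed=1)
    print(result.x)  # should land close to true_model
    ```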

  12. Issues, concerns, and initial implementation results for space based telerobotic control

    NASA Technical Reports Server (NTRS)

    Lawrence, D. A.; Chapel, J. D.; Depkovich, T. M.

    1987-01-01

    Telerobotic control for space based assembly and servicing tasks presents many problems in system design. Traditional force reflection teleoperation schemes are not well suited to this application, and the approaches to compliance control via computer algorithms have yet to see significant testing and comparison. These observations are discussed in detail, as well as the concerns they raise for imminent design and testing of space robotic systems. As an example of the detailed technical work yet to be done before such systems can be specified, a particular approach to providing manipulator compliance is examined experimentally and through modeling and analysis. This yields some initial insight into the limitations and design trade-offs for this class of manipulator control schemes. Implications of this investigation for space based telerobots are discussed in detail.

  13. Flamelet Model Application for Non-Premixed Turbulent Combustion

    NASA Technical Reports Server (NTRS)

    Secundov, A.; Bezgin, L.; Buriko, Yu.; Guskov, O.; Kopchenov, V.; Laskin, I.; Lomkov, K.; Tshepin, S.; Volkov, D.; Zaitsev, S.

    1996-01-01

    This final report contains results of a study performed at Scientific Research Center 'ECOLEN' (Moscow, Russia). The study concerns the development and verification of an inexpensive approach to modeling supersonic turbulent diffusion flames based on flamelet treatment of the chemistry/turbulence interaction (FL approach). The research work included: development of the approach and CFD tests of the flamelet model for supersonic jet flames; development of a simplified procedure for solving the flamelet equations based on a partial-equilibrium chemistry assumption; and study of the flame ignition/extinction predictions provided by the flamelet model. The investigation demonstrated that the FL approach satisfactorily describes the main features of supersonic H2/air jet flames. The model also demonstrated high capability for reducing computational expense in CFD modeling of supersonic flames while accounting for detailed oxidation chemistry. However, some disadvantages and restrictions of the existing version of the approach were found in this study: (1) inaccuracy in predictions of the passive scalar statistics by our turbulence model for one of the considered test cases; and (2) applicability of the available version of the flamelet model only to flames without a large ignition delay distance. Based on the results of this investigation, we formulated and submitted to the National Aeronautics and Space Administration a project proposal for the next research step, directed toward further improvement of the FL approach.

  14. A new statistical approach to climate change detection and attribution

    NASA Astrophysics Data System (ADS)

    Ribes, Aurélien; Zwiers, Francis W.; Azaïs, Jean-Marc; Naveau, Philippe

    2017-01-01

    We propose here a new statistical approach to climate change detection and attribution that is based on additive decomposition and simple hypothesis testing. Most current statistical methods for detection and attribution rely on linear regression models where the observations are regressed onto expected response patterns to different external forcings. These methods do not use physical information provided by climate models regarding the expected response magnitudes to constrain the estimated responses to the forcings. Climate modelling uncertainty is difficult to take into account with regression-based methods and is almost never treated explicitly. As an alternative to this approach, our statistical model is only based on the additivity assumption; the proposed method does not regress observations onto expected response patterns. We introduce estimation and testing procedures based on likelihood maximization, and show that climate modelling uncertainty can easily be accounted for. Some discussion is provided on how to practically estimate the climate modelling uncertainty based on an ensemble of opportunity. Our approach is based on the "models are statistically indistinguishable from the truth" paradigm, where the difference between any given model and the truth has the same distribution as the difference between any pair of models, but other choices might also be considered. The properties of this approach are illustrated and discussed based on synthetic data. Lastly, the method is applied to the linear trend in global mean temperature over the period 1951-2010. Consistent with the last IPCC assessment report, we find that most of the observed warming over this period (+0.65 K) is attributable to anthropogenic forcings (+0.67 ± 0.12 K, 90% confidence range), with a very limited contribution from natural forcings (-0.01 ± 0.02 K).

  15. Test on the Effectiveness of the Sum over Paths Approach in Favoring the Construction of an Integrated Knowledge of Quantum Physics in High School

    ERIC Educational Resources Information Center

    Malgieri, Massimiliano; Onorato, Pasquale; De Ambrosis, Anna

    2017-01-01

    In this paper we present the results of a research-based teaching-learning sequence on introductory quantum physics based on Feynman's sum over paths approach in the Italian high school. Our study focuses on students' understanding of two founding ideas of quantum physics, wave particle duality and the uncertainty principle. In view of recent…

  16. Computer Assisted Language Testing: On the Efficacy of Web-Based Approach in the Instruction of Elementary Learners of English

    ERIC Educational Resources Information Center

    Soleimani, Maryam; Gahhari, Shima

    2012-01-01

    In this study the effectiveness and efficacy of e-learning was evaluated through a web based approach. Two classes of elementary learners of English were selected for this study, one class received a six month instruction through the typical twice a week classes and the other one was instructed through internet, in other words, the first class did…

  17. Architecture-Based Unit Testing of the Flight Software Product Line

    NASA Technical Reports Server (NTRS)

    Ganesan, Dharmalingam; Lindvall, Mikael; McComas, David; Bartholomew, Maureen; Slegel, Steve; Medina, Barbara

    2010-01-01

    This paper presents an analysis of the unit testing approach developed and used by the Core Flight Software (CFS) product line team at NASA GSFC. The goal of the analysis is to understand, review, and recommend strategies for improving the existing unit testing infrastructure as well as to capture lessons learned and best practices that can be used by other product line teams for their unit testing. The CFS unit testing framework is designed and implemented as a set of variation points, and thus testing support is built into the product line architecture. The analysis found that the CFS unit testing approach has many practical and good solutions that are worth considering when deciding how to design the testing architecture for a product line, which are documented in this paper along with some suggested improvements.

  18. High Throughput Exposure Prioritization of Chemicals Using a Screening-Level Probabilistic SHEDS-Lite Exposure Model

    EPA Science Inventory

    These novel modeling approaches for screening, evaluating and classifying chemicals based on the potential for biologically-relevant human exposures will inform toxicity testing and prioritization for chemical risk assessment. The new modeling approach is derived from the Stocha...

  19. Qualitative Research Designs: Selection and Implementation

    ERIC Educational Resources Information Center

    Creswell, John W.; Hanson, William E.; Plano Clark, Vicki L.; Morales, Alejandro

    2007-01-01

    Counseling psychologists face many approaches from which to choose when they conduct a qualitative research study. This article focuses on the processes of selecting, contrasting, and implementing five different qualitative approaches. Based on an extended example related to test interpretation by counselors, clients, and communities, this article…

  20. Scoring Yes-No Vocabulary Tests: Reaction Time vs. Nonword Approaches

    ERIC Educational Resources Information Center

    Pellicer-Sanchez, Ana; Schmitt, Norbert

    2012-01-01

    Despite a number of research studies investigating the Yes-No vocabulary test format, one main question remains unanswered: What is the best scoring procedure to adjust for testee overestimation of vocabulary knowledge? Different scoring methodologies have been proposed based on the inclusion and selection of nonwords in the test. However, there…

  1. A Real-time Evaluation of Human-based Approaches to Safety Testing: What We Can Do Now (TDS)

    EPA Science Inventory

    Despite ever-increasing efforts in early safety assessment in all industries, there are still many chemicals that prove toxic in humans. While greater use of human in vitro test methods may serve to reduce this problem, the formal validation process applied to such tests represen...

  2. Testing hypotheses for differences between linear regression lines

    Treesearch

    Stanley J. Zarnoch

    2009-01-01

    Five hypotheses are identified for testing differences between simple linear regression lines. The distinctions between these hypotheses are based on a priori assumptions and illustrated with full and reduced models. The contrast approach is presented as an easy and complete method for testing for overall differences between the regressions and for making pairwise...
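
    The full-and-reduced-model idea can be illustrated for the overall test of coincident lines: fit separate intercepts and slopes (full model) versus a single common line (reduced model) and compare residual sums of squares with an F statistic. The data below are hypothetical.

    ```python
    # F test of H0: two regression lines share intercept and slope.
    import numpy as np
    from scipy.stats import f as f_dist

    def rss(X, y):
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        r = y - X @ beta
        return float(r @ r)

    x = np.array([1, 2, 3, 4, 5, 1, 2, 3, 4, 5], float)
    g = np.array([0, 0, 0, 0, 0, 1, 1, 1, 1, 1], float)   # group indicator
    y = np.array([1.1, 2.0, 2.9, 4.2, 4.8, 1.5, 2.9, 4.1, 5.6, 6.9])

    X_full = np.column_stack([np.ones_like(x), x, g, g * x])  # separate lines
    X_red = np.column_stack([np.ones_like(x), x])             # one common line

    rss_f, rss_r = rss(X_full, y), rss(X_red, y)
    q, df2 = 2, len(y) - X_full.shape[1]   # 2 constraints tested; residual df
    F = ((rss_r - rss_f) / q) / (rss_f / df2)
    print(f"F = {F:.2f}, p = {f_dist.sf(F, q, df2):.4f}")
    ```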

  3. The Role of the Family in Genetic Testing: Theoretical Perspectives, Current Knowledge, and Future Directions

    ERIC Educational Resources Information Center

    Peterson, Susan K.

    2005-01-01

    This article addresses conceptual challenges and theoretical approaches for examining the role of the family in responding and adapting to genetic testing for inherited conditions. Using a family systems perspective, family-based constructs that are relevant to genetic testing may be organized into three domains: family communication, organization…

  4. The Effectiveness of Mandatory-Random Student Drug Testing

    ERIC Educational Resources Information Center

    James-Burdumy, Susanne; Goesling, Brian; Deke, John; Einspruch, Eric

    2011-01-01

    One approach some U.S. schools now use to combat high rates of adolescent substance use is school-based mandatory-random student drug testing (MRSDT). Under MRSDT, students and their parents sign consent forms agreeing to the students' participation in random drug testing as a condition of participating in athletics and other school-sponsored…

  5. Heritability in Cognitive Performance: Evidence Using Computer-Based Testing

    ERIC Educational Resources Information Center

    Hervey, Aaron S.; Greenfield, Kathryn; Gualtieri, C. Thomas

    2012-01-01

    There is overwhelming evidence of genetic influence on cognition. The effect is seen in general cognitive ability, as well as in specific cognitive domains. A conventional assessment approach using face-to-face paper and pencil testing is difficult for large-scale studies. Computerized neurocognitive testing is a suitable alternative. A total of…

  6. Significance Testing in Confirmatory Factor Analytic Models.

    ERIC Educational Resources Information Center

    Khattab, Ali-Maher; Hocevar, Dennis

    Traditionally, confirmatory factor analytic models are tested against a null model of total independence. Using randomly generated factors in a matrix of 46 aptitude tests, this approach is shown to be unlikely to reject even random factors. An alternative null model, based on a single general factor, is suggested. In addition, an index of model…

  7. Application of molecular target homology-based approaches to predict species sensitivities to two pesticides, permethrin and propiconozole

    EPA Science Inventory

    In the U.S., registration of pesticide active ingredients requires a battery of intensive and costly in vivo toxicity tests which utilize large numbers of test animals. These tests use a limited array of model species from various aquatic and terrestrial taxa to represent all pla...

  8. Using Backward Design in Education Research: A Research Methods Essay †

    PubMed Central

    Jensen, Jamie L.; Bailey, Elizabeth G.; Kummer, Tyler A.; Weber, K. Scott

    2017-01-01

    Education research within the STEM disciplines applies a scholarly approach to teaching and learning, with the intent of better understanding how people learn and of improving pedagogy at the undergraduate level. Most of the professionals practicing in this field have ‘crossed over’ from other disciplinary fields and thus have faced challenges in becoming experts in a new discipline. In this article, we offer a novel framework for approaching education research design called Backward Design in Education Research. It is patterned on backward curricular design and provides a three-step, systematic approach to designing education projects: 1) Define a research question that leads to a testable causal hypothesis based on a theoretical rationale; 2) Choose or design the assessment instruments to test the research hypothesis; and 3) Develop an experimental protocol that will be effective in testing the research hypothesis. This approach provides a systematic method to develop and carry out evidence-based research design. PMID:29854045

  9. GENERIC VERIFICATION PROTOCOL FOR AQUEOUS CLEANER RECYCLING TECHNOLOGIES

    EPA Science Inventory

    This generic verification protocol has been structured based on a format developed for ETV-MF projects. This document describes the intended approach and explains plans for testing with respect to areas such as test methodology, procedures, parameters, and instrumentation. Also ...

  10. Rasch-family models are more valuable than score-based approaches for analysing longitudinal patient-reported outcomes with missing data.

    PubMed

    de Bock, Élodie; Hardouin, Jean-Benoit; Blanchin, Myriam; Le Neel, Tanguy; Kubis, Gildas; Bonnaud-Antignac, Angélique; Dantan, Étienne; Sébille, Véronique

    2016-10-01

    The objective was to compare classical test theory and Rasch-family models derived from item response theory for the analysis of longitudinal patient-reported outcomes data with possibly informative intermittent missing items. A simulation study was performed in order to assess and compare the performance of classical test theory and the Rasch model in terms of bias, control of the type I error, and power of the test of the time effect. The type I error was controlled for both classical test theory and the Rasch model whether data were complete or some items were missing. Both methods were unbiased and displayed similar power with complete data. When items were missing, the Rasch model remained unbiased and displayed higher power than classical test theory. The Rasch model performed better than the classical test theory approach for the analysis of longitudinal patient-reported outcomes with possibly informative intermittent missing items, mainly in terms of power. This study highlights the interest of Rasch-based models in clinical research and epidemiology for the analysis of incomplete patient-reported outcomes data. © The Author(s) 2013.

  11. To What Extent Does Children's Spelling Improve as a Result of Learning Words with the Look, Say, Cover, Write, Check, Fix Strategy Compared with Phonological Spelling Strategies?

    ERIC Educational Resources Information Center

    Dymock, Susan; Nicholson, Tom

    2017-01-01

    The ubiquitous weekly spelling test assumes that words are best learned by memorisation and testing but is this the best way? This study compared two well-known approaches to spelling instruction, the rule based and visual memory approaches. A group of 55 seven-year-olds in two Year 3 classrooms was taught spelling in small groups for three…

  12. A Novel Outreach to High School Students by Teaching Them the Engineering Skills in a Project-Based Approach

    ERIC Educational Resources Information Center

    Asiabanpour, Bahram

    2010-01-01

    In this paper a novel outreach approach to high school students to familiarize them with engineering functions and methods is explained. In this approach students participated in a seven days research camp and learned many engineering skills and tools such as CAD solid modeling, finite element analysis, rapid prototyping, mechanical tests, team…

  13. A simplified approach to quasi-linear viscoelastic modeling

    PubMed Central

    Nekouzadeh, Ali; Pryse, Kenneth M.; Elson, Elliot L.; Genin, Guy M.

    2007-01-01

    The fitting of quasi-linear viscoelastic (QLV) constitutive models to material data often involves somewhat cumbersome numerical convolution. A new approach to treating quasi-linearity in one dimension is described and applied to characterize the behavior of reconstituted collagen. This approach is based on a new principle for including nonlinearity and requires considerably less computation than other comparable models for both model calibration and response prediction, especially for smoothly applied stretching. Additionally, the approach allows relaxation to adapt with the strain history. The modeling approach is demonstrated through tests on pure reconstituted collagen. Sequences of “ramp-and-hold” stretching tests were applied to rectangular collagen specimens. The relaxation force data from the “hold” was used to calibrate a new “adaptive QLV model” and several models from literature, and the force data from the “ramp” was used to check the accuracy of model predictions. Additionally, the ability of the models to predict the force response on a reloading of the specimen was assessed. The “adaptive QLV model” based on this new approach predicts collagen behavior comparably to or better than existing models, with much less computation. PMID:17499254

  14. Tracing Technological Development Trajectories: A Genetic Knowledge Persistence-Based Main Path Approach.

    PubMed

    Park, Hyunseok; Magee, Christopher L

    2017-01-01

    The aim of this paper is to propose a new method to identify main paths in a technological domain using patent citations. Previous approaches to main path analysis have greatly improved our understanding of actual technological trajectories but have some limitations: they have a high potential to miss dominant patents from the identified main paths, and the high network complexity of their main paths makes qualitative tracing of trajectories problematic. The proposed method searches backward and forward paths from high-persistence patents, which are identified using a standard genetic knowledge persistence algorithm. We tested the new method by applying it to the desalination and solar photovoltaic domains and compared the results to output from the same domains using a prior method. The empirical results show that the proposed method can dramatically reduce network complexity without missing any dominantly important patents. The main paths identified by our approach for the two test cases are almost 10x less complex than those identified by the existing approach. The proposed approach identifies all dominantly important patents on the main paths, whereas the main paths identified by the existing approach miss about 20% of dominantly important patents.

  15. Tracing Technological Development Trajectories: A Genetic Knowledge Persistence-Based Main Path Approach

    PubMed Central

    2017-01-01

    The aim of this paper is to propose a new method to identify main paths in a technological domain using patent citations. Previous approaches to main path analysis have greatly improved our understanding of actual technological trajectories but have some limitations: they have a high potential to miss dominant patents from the identified main paths, and the high network complexity of their main paths makes qualitative tracing of trajectories problematic. The proposed method searches backward and forward paths from high-persistence patents, which are identified using a standard genetic knowledge persistence algorithm. We tested the new method by applying it to the desalination and solar photovoltaic domains and compared the results to output from the same domains using a prior method. The empirical results show that the proposed method can dramatically reduce network complexity without missing any dominantly important patents. The main paths identified by our approach for the two test cases are almost 10x less complex than those identified by the existing approach. The proposed approach identifies all dominantly important patents on the main paths, whereas the main paths identified by the existing approach miss about 20% of dominantly important patents. PMID:28135304
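
    For context, the classical main-path ingredient is the search path count (SPC): the number of source-to-sink paths through each citation edge, computed by two passes over a topological order. The sketch below shows plain SPC on a toy citation DAG; the paper's genetic knowledge persistence measure, which replaces SPC, is not implemented here.

    ```python
    # Search path count (SPC) edge weights on a citation DAG with networkx.
    import networkx as nx

    def spc_weights(G: nx.DiGraph) -> dict:
        order = list(nx.topological_sort(G))
        n_from_source = {v: 1 if G.in_degree(v) == 0 else 0 for v in G}
        for v in order:                       # paths reaching v from any source
            for u in G.predecessors(v):
                n_from_source[v] += n_from_source[u]
        n_to_sink = {v: 1 if G.out_degree(v) == 0 else 0 for v in G}
        for v in reversed(order):             # paths from v to any sink
            for w in G.successors(v):
                n_to_sink[v] += n_to_sink[w]
        return {(u, v): n_from_source[u] * n_to_sink[v] for u, v in G.edges}

    # Toy citation network: edge (a, b) means patent b builds on patent a
    G = nx.DiGraph([("p1", "p2"), ("p1", "p3"), ("p2", "p4"), ("p3", "p4"), ("p4", "p5")])
    print(spc_weights(G))  # greedily following the largest weights traces a main path
    ```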

  16. A Model-Based Anomaly Detection Approach for Analyzing Streaming Aircraft Engine Measurement Data

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.; Rinehart, Aidan W.

    2014-01-01

    This paper presents a model-based anomaly detection architecture designed for analyzing streaming transient aircraft engine measurement data. The technique calculates and monitors residuals between sensed engine outputs and model predicted outputs for anomaly detection purposes. Pivotal to the performance of this technique is the ability to construct a model that accurately reflects the nominal operating performance of the engine. The dynamic model applied in the architecture is a piecewise linear design comprising steady-state trim points and dynamic state space matrices. A simple curve-fitting technique for updating the model trim point information based on steady-state information extracted from available nominal engine measurement data is presented. Results from the application of the model-based approach for processing actual engine test data are shown. These include both nominal fault-free test case data and seeded fault test case data. The results indicate that the updates applied to improve the model trim point information also improve anomaly detection performance. Recommendations for follow-on enhancements to the technique are also presented and discussed.

  17. A Model-Based Anomaly Detection Approach for Analyzing Streaming Aircraft Engine Measurement Data

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.; Rinehart, Aidan Walker

    2015-01-01

    This paper presents a model-based anomaly detection architecture designed for analyzing streaming transient aircraft engine measurement data. The technique calculates and monitors residuals between sensed engine outputs and model predicted outputs for anomaly detection purposes. Pivotal to the performance of this technique is the ability to construct a model that accurately reflects the nominal operating performance of the engine. The dynamic model applied in the architecture is a piecewise linear design comprising steady-state trim points and dynamic state space matrices. A simple curve-fitting technique for updating the model trim point information based on steady-state information extracted from available nominal engine measurement data is presented. Results from the application of the model-based approach for processing actual engine test data are shown. These include both nominal fault-free test case data and seeded fault test case data. The results indicate that the updates applied to improve the model trim point information also improve anomaly detection performance. Recommendations for follow-on enhancements to the technique are also presented and discussed.
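
    The core residual-monitoring step can be sketched in a few lines: standardize the difference between sensed and model-predicted outputs and flag exceedances. The model output, noise level, and seeded step fault below are hypothetical stand-ins for the piecewise linear engine model described in the paper.

    ```python
    # Residual-based anomaly flagging for a streaming engine measurement.
    import numpy as np

    def detect_anomalies(sensed, predicted, sigma, threshold=3.0):
        """Indices where the standardized residual exceeds the threshold."""
        residuals = (sensed - predicted) / sigma
        return np.flatnonzero(np.abs(residuals) > threshold)

    t = np.arange(100)
    predicted = 500 + 0.1 * t                       # nominal model output (hypothetical)
    sensed = predicted + np.random.default_rng(1).normal(0, 2.0, 100)
    sensed[60:] += 12.0                             # seeded fault: a step-like shift
    print(detect_anomalies(sensed, predicted, sigma=2.0))  # flags samples from ~60 on
    ```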

  18. A systems approach to solder joint fatigue in spacecraft electronic packaging

    NASA Technical Reports Server (NTRS)

    Ross, R. G., Jr.

    1991-01-01

    Differential-expansion-induced fatigue resulting from temperature cycling is a leading cause of solder joint failures in spacecraft. Achieving high-reliability flight hardware requires that each element of the fatigue issue be addressed carefully. This includes defining the complete thermal-cycle environment to be experienced by the hardware, developing electronic packaging concepts that are consistent with the defined environments, and validating the completed designs with a thorough qualification and acceptance test program. This paper describes a useful systems approach to solder fatigue based principally on the fundamental log-strain versus log-cycles-to-failure behavior. This fundamental behavior has been used to integrate diverse ground-test and flight operational thermal-cycle environments into a unified electronics design approach. Each element of the approach reflects both the mechanism physics that control solder fatigue and the practical realities of the hardware build, test, delivery, and application cycle.
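
    The log-strain versus log-cycles behavior is a straight line in log-log space, N_f = A(Δε)^m, and multiple thermal-cycle environments can be combined with Miner's rule. The coefficient, exponent, and cycle counts below are hypothetical illustrations, not solder-alloy data.

    ```python
    # Coffin-Manson-style fatigue life and Miner's-rule damage accumulation.
    def cycles_to_failure(strain_range, coeff=0.3, exponent=-2.0):
        """N_f = coeff * strain_range ** exponent: a line in log-log space."""
        return coeff * strain_range ** exponent

    # (applied cycles, strain range per cycle) for each environment -- hypothetical
    environments = [
        (200, 0.010),   # qualification/acceptance thermal cycling
        (5000, 0.002),  # on-orbit operational cycling
    ]
    damage = sum(n / cycles_to_failure(eps) for n, eps in environments)
    print(f"cumulative damage fraction = {damage:.2f} (failure predicted at 1.0)")
    ```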

  19. PheProb: probabilistic phenotyping using diagnosis codes to improve power for genetic association studies.

    PubMed

    Sinnott, Jennifer A; Cai, Fiona; Yu, Sheng; Hejblum, Boris P; Hong, Chuan; Kohane, Isaac S; Liao, Katherine P

    2018-05-17

    Standard approaches for large scale phenotypic screens using electronic health record (EHR) data apply thresholds, such as ≥2 diagnosis codes, to define subjects as having a phenotype. However, the variation in the accuracy of diagnosis codes can impair the power of such screens. Our objective was to develop and evaluate an approach which converts diagnosis codes into a probability of a phenotype (PheProb). We hypothesized that this alternate approach for defining phenotypes would improve power for genetic association studies. The PheProb approach employs unsupervised clustering to separate patients into 2 groups based on diagnosis codes. Subjects are assigned a probability of having the phenotype based on the number of diagnosis codes. This approach was developed using simulated EHR data and tested in a real world EHR cohort. In the latter, we tested the association between low density lipoprotein cholesterol (LDL-C) genetic risk alleles known for association with hyperlipidemia and hyperlipidemia codes (ICD-9 272.x). PheProb and thresholding approaches were compared. Among n = 1462 subjects in the real world EHR cohort, the threshold-based p-values for association between the genetic risk score (GRS) and hyperlipidemia were 0.126 (≥1 code), 0.123 (≥2 codes), and 0.142 (≥3 codes). The PheProb approach produced the expected significant association between the GRS and hyperlipidemia: p = .001. PheProb improves statistical power for association studies relative to standard thresholding approaches by leveraging information about the phenotype in the billing code counts. The PheProb approach has direct applications where efficient approaches are required, such as in Phenome-Wide Association Studies.
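
    A minimal sketch of the core idea, assuming a two-component Gaussian mixture as a stand-in for PheProb's clustering model (the paper's actual model may differ): cluster subjects on billing-code counts, then read off each subject's probability of belonging to the higher-count (phenotype) component.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
# Hypothetical billing-code counts: controls carry few codes, cases many.
counts = np.concatenate([rng.poisson(1, 700), rng.poisson(8, 300)])

gm = GaussianMixture(n_components=2, random_state=0)
gm.fit(counts.reshape(-1, 1))

# Probability of membership in the higher-mean ("phenotype") component.
case_comp = int(np.argmax(gm.means_.ravel()))
pheprob = gm.predict_proba(counts.reshape(-1, 1))[:, case_comp]
print("P(phenotype) at 0, 2, 10 codes:",
      gm.predict_proba(np.array([[0], [2], [10]]))[:, case_comp].round(3))
```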

  20. Comparing Web, Group and Telehealth Formats of a Military Parenting Program

    DTIC Science & Technology

    2017-06-01

    directed approaches. Comparative effectiveness will be tested by specifying a non-equivalence hypothesis for group-based and web-facilitated relative... Comparative effectiveness will be tested by specifying a non-equivalence hypothesis for group-based and individualized facilitated relative to self-directed... documents for review and approval. 1a. Finalize human subjects protocol and consent documents for pilot group (N=5 families), and randomized controlled

  1. Efficient Blockwise Permutation Tests Preserving Exchangeability

    PubMed Central

    Zhou, Chunxiao; Zwilling, Chris E.; Calhoun, Vince D.; Wang, Michelle Y.

    2014-01-01

    In this paper, we present a new blockwise permutation test approach based on the moments of the test statistic. The method is of importance to neuroimaging studies. In order to preserve the exchangeability condition required in permutation tests, we divide the entire set of data into certain exchangeability blocks. In addition, computationally efficient moments-based permutation tests are performed by approximating the permutation distribution of the test statistic with the Pearson distribution series. This involves the calculation of the first four moments of the permutation distribution within each block and then over the entire set of data. The accuracy and efficiency of the proposed method are demonstrated through a simulated experiment on magnetic resonance imaging (MRI) brain data, specifically a multi-site voxel-based morphometry analysis from structural MRI (sMRI). PMID:25289113
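
    The exchangeability-preserving step can be illustrated by permuting labels only within blocks. For brevity, the sketch below estimates the p-value by Monte Carlo rather than by the paper's Pearson-series moment approximation; the data and block structure are made up.

```python
import numpy as np

def blockwise_permutation_pvalue(x, y, blocks, n_perm=10000, seed=0):
    """Two-sample mean-difference test, permuting labels within blocks only."""
    rng = np.random.default_rng(seed)
    blocks = np.asarray(blocks)
    obs = x[y == 1].mean() - x[y == 0].mean()
    count = 0
    for _ in range(n_perm):
        y_perm = y.copy()
        for b in np.unique(blocks):            # shuffle within each block
            idx = np.where(blocks == b)[0]
            y_perm[idx] = rng.permutation(y_perm[idx])
        stat = x[y_perm == 1].mean() - x[y_perm == 0].mean()
        count += abs(stat) >= abs(obs)
    return (count + 1) / (n_perm + 1)

# Illustrative data: two sites (blocks), group label y, measurement x.
rng = np.random.default_rng(2)
blocks = np.repeat([0, 1], 50)
y = np.tile([0, 1], 50)
x = rng.normal(0, 1, 100) + 0.6 * y
print("p =", blockwise_permutation_pvalue(x, y, blocks, n_perm=2000))
```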

  2. Precision medicine for cancer with next-generation functional diagnostics.

    PubMed

    Friedman, Adam A; Letai, Anthony; Fisher, David E; Flaherty, Keith T

    2015-12-01

    Precision medicine is about matching the right drugs to the right patients. Although this approach is technology agnostic, in cancer there is a tendency to make precision medicine synonymous with genomics. However, genome-based cancer therapeutic matching is limited by incomplete biological understanding of the relationship between phenotype and cancer genotype. This limitation can be addressed by functional testing of live patient tumour cells exposed to potential therapies. Recently, several 'next-generation' functional diagnostic technologies have been reported, including novel methods for tumour manipulation, molecularly precise assays of tumour responses and device-based in situ approaches; these address the limitations of the older generation of chemosensitivity tests. The promise of these new technologies suggests a future diagnostic strategy that integrates functional testing with next-generation sequencing and immunoprofiling to precisely match combination therapies to individual cancer patients.

  3. Protocol vulnerability detection based on network traffic analysis and binary reverse engineering.

    PubMed

    Wen, Shameng; Meng, Qingkun; Feng, Chao; Tang, Chaojing

    2017-01-01

    Network protocol vulnerability detection plays an important role in many domains, including protocol security analysis, application security, and network intrusion detection. In this study, by analyzing the general fuzzing method of network protocols, we propose a novel approach that combines network traffic analysis with the binary reverse engineering method. For network traffic analysis, the block-based protocol description language is introduced to construct test scripts, while the binary reverse engineering method employs the genetic algorithm with a fitness function designed to focus on code coverage. This combination leads to a substantial improvement in fuzz testing for network protocols. We build a prototype system and use it to test several real-world network protocol implementations. The experimental results show that the proposed approach detects vulnerabilities more efficiently and effectively than general fuzzing methods such as SPIKE.
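
    A toy sketch of the coverage-guided genetic step described above, assuming a stand-in coverage function in place of real binary instrumentation; it is meant only to show the select-and-mutate loop, not the authors' system.

```python
import random

def coverage(data: bytes) -> int:
    """Toy stand-in for instrumented branch coverage of a protocol parser."""
    hits = 0
    if data[0:1] == b"P":
        hits += 1
    if data[0:2] == b"PK":
        hits += 1
        if len(data) > 8 and data[4] == 0xFF:
            hits += 1
    return hits

def mutate(data: bytes) -> bytes:
    """Randomly overwrite one byte (a minimal fuzzing mutation operator)."""
    buf = bytearray(data)
    buf[random.randrange(len(buf))] = random.randrange(256)
    return bytes(buf)

random.seed(0)
pop = [bytes(random.randrange(256) for _ in range(12)) for _ in range(40)]
for _ in range(500):                        # GA loop: keep elite, mutate
    pop.sort(key=coverage, reverse=True)
    pop = pop[:20] + [mutate(random.choice(pop[:20])) for _ in range(20)]
pop.sort(key=coverage, reverse=True)
print("best coverage:", coverage(pop[0]), "of 3 branches")
```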

  4. Artificial Bee Colony Optimization for Short-Term Hydrothermal Scheduling

    NASA Astrophysics Data System (ADS)

    Basu, M.

    2014-12-01

    Artificial bee colony optimization is applied to determine the optimal hourly schedule of power generation in a hydrothermal system. Artificial bee colony optimization is a swarm-based algorithm inspired by the food foraging behavior of honey bees. The algorithm is tested on a multi-reservoir cascaded hydroelectric system having prohibited operating zones and thermal units with valve point loading. The ramp-rate limits of thermal generators are taken into consideration. The transmission losses are also accounted for through the use of loss coefficients. The algorithm is evaluated on two such multi-reservoir cascaded hydrothermal test systems. The results of the proposed approach are compared with those of differential evolution, evolutionary programming and particle swarm optimization. From the numerical results, it is found that the proposed artificial bee colony optimization based approach is able to provide better solutions.
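
    For readers unfamiliar with the algorithm, the sketch below implements the three canonical bee phases (employed, onlooker, scout) on a simple sphere function standing in for the hydrothermal scheduling cost; all parameters are illustrative, and the real problem's constraints (prohibited zones, ramp rates, losses) are omitted.

```python
import numpy as np

def abc_minimize(f, bounds, n_sources=20, limit=30, iters=200, seed=0):
    """Minimal artificial bee colony sketch (employed/onlooker/scout phases)."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    dim = lo.size
    x = rng.uniform(lo, hi, (n_sources, dim))
    fx = np.apply_along_axis(f, 1, x)
    trials = np.zeros(n_sources, dtype=int)

    def try_neighbor(i):
        k = rng.integers(n_sources - 1)
        k += k >= i                                   # partner != i
        j = rng.integers(dim)
        v = x[i].copy()
        v[j] += rng.uniform(-1, 1) * (x[i, j] - x[k, j])
        v = np.clip(v, lo, hi)
        fv = f(v)
        if fv < fx[i]:                                # greedy selection
            x[i], fx[i], trials[i] = v, fv, 0
        else:
            trials[i] += 1

    for _ in range(iters):
        for i in range(n_sources):                    # employed bees
            try_neighbor(i)
        fit = 1.0 / (1.0 + fx - fx.min())             # onlooker weights
        for i in rng.choice(n_sources, n_sources, p=fit / fit.sum()):
            try_neighbor(i)                           # onlooker bees
        for i in np.where(trials > limit)[0]:         # scout bees
            x[i] = rng.uniform(lo, hi)
            fx[i], trials[i] = f(x[i]), 0
    best = fx.argmin()
    return x[best], fx[best]

# Illustrative run on a sphere function standing in for the scheduling cost.
xb, fb = abc_minimize(lambda v: np.sum(v ** 2),
                      (np.full(4, -5.0), np.full(4, 5.0)))
print(xb.round(3), fb)
```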

  5. Suboptimal LQR-based spacecraft full motion control: Theory and experimentation

    NASA Astrophysics Data System (ADS)

    Guarnaccia, Leone; Bevilacqua, Riccardo; Pastorelli, Stefano P.

    2016-05-01

    This work introduces a real time suboptimal control algorithm for six-degree-of-freedom spacecraft maneuvering based on a State-Dependent-Algebraic-Riccati-Equation (SDARE) approach and real-time linearization of the equations of motion. The control strategy is sub-optimal since the gains of the linear quadratic regulator (LQR) are re-computed at each sample time. The cost function of the proposed controller has been compared with the one obtained via a general purpose optimal control software, showing, on average, an increase in control effort of approximately 15%, compensated by real-time implementability. Lastly, the paper presents experimental tests on a hardware-in-the-loop six-degree-of-freedom spacecraft simulator, designed for testing new guidance, navigation, and control algorithms for nano-satellites in a one-g laboratory environment. The tests show the real-time feasibility of the proposed approach.
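
    A minimal sketch of the gain-rescheduling idea, re-solving the continuous algebraic Riccati equation at every sample time for a state-dependent linearization; a pendulum in state-dependent coefficient form stands in for the spacecraft dynamics, and the weights are arbitrary.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Pendulum: theta'' = (g/l)*sin(theta) + u, in state-dependent
# coefficient form xdot = A(x) x + B u, with A depending on theta.
g_l = 9.81
B = np.array([[0.0], [1.0]])
Q, R = np.diag([10.0, 1.0]), np.array([[1.0]])

def A_of(x):
    return np.array([[0.0, 1.0],
                     [g_l * np.sinc(x[0] / np.pi), 0.0]])  # sin(t)/t term

x = np.array([2.5, 0.0])                 # initial state
dt = 0.01
for _ in range(600):                     # re-solve the ARE every step
    P = solve_continuous_are(A_of(x), B, Q, R)
    K = np.linalg.solve(R, B.T @ P)      # LQR gain for current linearization
    u = float(-K @ x)
    xdot = np.array([x[1], g_l * np.sin(x[0]) + u])
    x = x + dt * xdot                    # explicit Euler integration
print("final state:", x.round(4))        # should approach the origin
```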

  6. The effect of project-based learning on students' statistical literacy levels for data representation

    NASA Astrophysics Data System (ADS)

    Koparan, Timur; Güven, Bülent

    2015-07-01

    The aim of this study is to determine the effect of a project-based learning approach on 8th-grade secondary-school students' statistical literacy levels for data representation. To this end, a test consisting of 12 open-ended questions was developed in accordance with the views of experts. Seventy 8th-grade secondary-school students, 35 in the experimental group and 35 in the control group, took this test twice, once before and once after the intervention. All raw scores were converted into linear measures using the Winsteps 3.72 Rasch modelling program, and t-tests and an ANCOVA were carried out on the linear measures. Based on the findings, it was concluded that the project-based learning approach increases students' level of statistical literacy for data representation. Students' levels of statistical literacy before and after the intervention were shown through the obtained person-item maps.

  7. From community-based pilot testing to region-wide systems change: lessons from a local quality improvement collaborative.

    PubMed

    Keyser, Donna J; Pincus, Harold Alan

    2010-01-01

    A community-based collaborative conducted a 2-year pilot study to inform efforts for improving maternal and child health care practice and policy in Allegheny County, Pennsylvania. (1) To test whether three small-scale versions of an evidence-based, systems improvement approach would be workable in local community settings and (2) to identify specific policy/infrastructure reforms for sustaining improvements. A mixed methods approach was used, including quantitative performance measurement supplemented with qualitative data about factors related to outcomes of interest, as well as key stakeholder interviews and a literature review/Internet search. Quantitative performance results varied; qualitative data revealed critical factors for the success and failure of the practices tested. Policy/infrastructure recommendations were developed to address specific practice barriers. This information was important for designing a region-wide quality improvement initiative focused on maternal depression. The processes and outcomes provide valuable insights for other communities interested in conducting similar quality improvement initiatives.

  8. Higher-Order Asymptotics and Its Application to Testing the Equality of the Examinee Ability Over Two Sets of Items.

    PubMed

    Sinharay, Sandip; Jensen, Jens Ledet

    2018-06-27

    In educational and psychological measurement, researchers and/or practitioners are often interested in examining whether the ability of an examinee is the same over two sets of items. Such problems can arise in measurement of change, detection of cheating on unproctored tests, erasure analysis, detection of item preknowledge, etc. Traditional frequentist approaches that are used in such problems include the Wald test, the likelihood ratio test, and the score test (e.g., Fischer, Appl Psychol Meas 27:3-26, 2003; Finkelman, Weiss, & Kim-Kang, Appl Psychol Meas 34:238-254, 2010; Glas & Dagohoy, Psychometrika 72:159-180, 2007; Guo & Drasgow, Int J Sel Assess 18:351-364, 2010; Klauer & Rettig, Br J Math Stat Psychol 43:193-206, 1990; Sinharay, J Educ Behav Stat 42:46-68, 2017). This paper shows that approaches based on higher-order asymptotics (e.g., Barndorff-Nielsen & Cox, Inference and asymptotics. Springer, London, 1994; Ghosh, Higher order asymptotics. Institute of Mathematical Statistics, Hayward, 1994) can also be used to test for the equality of the examinee ability over two sets of items. The modified signed likelihood ratio test (e.g., Barndorff-Nielsen, Biometrika 73:307-322, 1986) and the Lugannani-Rice approximation (Lugannani & Rice, Adv Appl Prob 12:475-490, 1980), both of which are based on higher-order asymptotics, are shown to provide some improvement over the traditional frequentist approaches in three simulations. Two real data examples are also provided.
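
    For intuition, the Lugannani-Rice tail approximation has a compact closed form for a binomial sum. The sketch below is a generic illustration, not the paper's IRT application, and omits the continuity correction usually recommended for lattice variables.

```python
import numpy as np
from scipy.stats import norm, binom

def lugannani_rice_binom(x, n, p):
    """Saddlepoint (Lugannani-Rice) approximation to P(X >= x), X~Bin(n,p)."""
    q = 1.0 - p
    t = np.log(q * x / (p * (n - x)))            # saddlepoint: K'(t) = x
    K = n * np.log(q + p * np.exp(t))            # cumulant generating function
    K2 = n * p * q * np.exp(t) / (q + p * np.exp(t)) ** 2   # K''(t)
    r = np.sign(t) * np.sqrt(2.0 * (t * x - K))
    u = t * np.sqrt(K2)
    return norm.sf(r) + norm.pdf(r) * (1.0 / u - 1.0 / r)

n, p, x = 40, 0.3, 18
print("LR approx:", lugannani_rice_binom(x, n, p))
print("exact    :", binom.sf(x - 1, n, p))       # P(X >= x)
```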

  9. On testing for spatial correspondence between maps of human brain structure and function.

    PubMed

    Alexander-Bloch, Aaron F; Shou, Haochang; Liu, Siyuan; Satterthwaite, Theodore D; Glahn, David C; Shinohara, Russell T; Vandekar, Simon N; Raznahan, Armin

    2018-06-01

    A critical issue in many neuroimaging studies is the comparison between brain maps. Nonetheless, it remains unclear how one should test hypotheses focused on the overlap or spatial correspondence between two or more brain maps. This "correspondence problem" affects, for example, the interpretation of comparisons between task-based patterns of functional activation, resting-state networks or modules, and neuroanatomical landmarks. To date, this problem has been addressed with remarkable variability in terms of methodological approaches and statistical rigor. In this paper, we address the correspondence problem using a spatial permutation framework to generate null models of overlap by applying random rotations to spherical representations of the cortical surface, an approach for which we also provide a theoretical statistical foundation. We use this method to derive clusters of cognitive functions that are correlated in terms of their functional neuroanatomical substrates. In addition, using publicly available data, we formally demonstrate the correspondence between maps of task-based functional activity, resting-state fMRI networks and gyral-based anatomical landmarks. We provide open-access code to implement the methods presented for two commonly-used tools for surface-based cortical analysis (https://www.github.com/spin-test). This spatial permutation approach constitutes a useful advance over widely-used methods for the comparison of cortical maps, thereby opening new possibilities for the integration of diverse neuroimaging data. Copyright © 2018 Elsevier Inc. All rights reserved.
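
    A bare-bones version of the rotation-based null (the authors' released implementation is at the GitHub link above): randomly rotate the spherical coordinates, re-sample one map by nearest vertex, and recompute the correlation. The nearest-neighbor re-sampling and the synthetic maps are simplifying assumptions.

```python
import numpy as np
from scipy.spatial.transform import Rotation
from scipy.stats import pearsonr

def spin_pvalue(coords, map_a, map_b, n_spins=1000, seed=0):
    """Spin-test p-value for the spatial correlation of two surface maps.
    coords: (N, 3) unit vectors (spherical vertex locations);
    map_a, map_b: (N,) scalar maps sampled at those vertices."""
    obs = pearsonr(map_a, map_b)[0]
    null = np.empty(n_spins)
    for i in range(n_spins):
        R = Rotation.random(random_state=seed + i).as_matrix()
        rotated = coords @ R.T
        nearest = np.argmax(rotated @ coords.T, axis=1)  # nearest vertex
        null[i] = pearsonr(map_a[nearest], map_b)[0]
    return (np.sum(np.abs(null) >= abs(obs)) + 1) / (n_spins + 1)

# Illustrative maps on random sphere points, both tracking the z-axis.
rng = np.random.default_rng(3)
coords = rng.normal(size=(500, 3))
coords /= np.linalg.norm(coords, axis=1, keepdims=True)
map_a = coords[:, 2] + rng.normal(0, 0.3, 500)
map_b = coords[:, 2] + rng.normal(0, 0.3, 500)
print("p =", spin_pvalue(coords, map_a, map_b, n_spins=200))
```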

  10. Suspension parameter estimation in the frequency domain using a matrix inversion approach

    NASA Astrophysics Data System (ADS)

    Thite, A. N.; Banvidi, S.; Ibicek, T.; Bennett, L.

    2011-12-01

    The dynamic lumped parameter models used to optimise the ride and handling of a vehicle require base values of the suspension parameters. These parameters are generally identified experimentally. The accuracy of the identified parameters can depend on the measurement noise and the validity of the model used. The existing publications on suspension parameter identification are generally based on the time domain and consider only a limited number of degrees of freedom. Further, the data used are either from a simulated 'experiment' or from a laboratory test on an idealised quarter- or half-car model. In this paper, a method is developed in the frequency domain which effectively accounts for the measurement noise. Additional dynamic constraining equations are incorporated and the proposed formulation results in a matrix inversion approach. The nonlinearities in damping are, however, estimated using a time-domain approach. Full-scale 4-post rig test data of a vehicle are used. The variations in the results are discussed using the modal resonant behaviour. Further, a method is implemented to show how the results can be improved when the matrix inverted is ill-conditioned. The case study shows a good agreement between the estimates based on the proposed frequency-domain approach and measurable physical parameters.
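
    The abstract does not spell out the remedy for ill-conditioned inversion; one generic option for such frequency-domain estimation problems is a truncated-SVD pseudo-inverse, sketched below on a made-up, nearly rank-deficient system.

```python
import numpy as np

def solve_truncated_svd(A, b, rcond=1e-3):
    """Least-squares solve of A x = b, discarding small singular values
    that amplify measurement noise when A is ill-conditioned."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    keep = s > rcond * s[0]
    return Vt[keep].T @ ((U[:, keep].T @ b) / s[keep])

# Illustrative ill-conditioned system (e.g., a dynamic stiffness inversion
# at one frequency); the third column nearly duplicates the first.
rng = np.random.default_rng(4)
A = rng.normal(size=(6, 3))
A[:, 2] = A[:, 0] + 1e-8 * rng.normal(size=6)
x_true = np.array([1.0, -2.0, 0.5])
b = A @ x_true + 1e-6 * rng.normal(size=6)
print("naive lstsq :", np.linalg.lstsq(A, b, rcond=None)[0].round(2))
print("truncated   :", solve_truncated_svd(A, b).round(2))
```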

  11. A tutorial for developing a topical cream formulation based on the Quality by Design approach.

    PubMed

    Simões, Ana; Veiga, Francisco; Vitorino, Carla; Figueiras, Ana

    2018-06-20

    The pharmaceutical industry has entered in a new era, as there is a growing interest in increasing the quality standards of dosage forms, through the implementation of more structured development and manufacturing approaches. For many decades, the manufacturing of drug products was controlled by a regulatory framework to guarantee the quality of the final product through a fixed process and exhaustive testing. Limitations related to the Quality by Test (QbT) system have been widely acknowledged. The emergence of Quality by Design (QbD) as a systematic and risk-based approach introduced a new quality concept based on a good understanding of how raw materials and process parameters influence the final quality profile. Although the QbD system has been recognized as a revolutionary approach to product development and manufacturing, its full implementation in the pharmaceutical field is still limited. This is particularly evident in the case of semisolid complex formulation development. The present review aims at establishing a practical QbD framework to describe all stages comprised in the pharmaceutical development of a conventional cream in a comprehensible manner. Copyright © 2018. Published by Elsevier Inc.

  12. Quick multitemporal approach to get cloudless improved multispectral imagery for large geographical areas

    NASA Astrophysics Data System (ADS)

    Colaninno, Nicola; Marambio Castillo, Alejandro; Roca Cladera, Josep

    2017-10-01

    The demand for remotely sensed data continues to grow, due to the possibility of managing information about huge geographic areas, in digital format, at different time periods, and in a form suitable for analysis in GIS platforms. However, primary satellite data are not as immediately usable as one would like. Besides geometric and atmospheric limitations, clouds, cloud shadows, and haze generally contaminate optical images. In terms of land cover, such contamination amounts to missing information and should be replaced. Image reconstruction methods are generally classified into three main approaches: in-painting-based, multispectral-based, and multitemporal-based. This work relies on a multitemporal-based approach to retrieve uncontaminated pixels for an image scene. We explore an automatic method for quickly producing a daytime cloudless and shadow-free image at moderate spatial resolution for large geographical areas. The process involves two main steps: a multitemporal effect adjustment to avoid significant seasonal variations, and a data reconstruction phase based on automatic selection of uncontaminated pixels from an image stack. The result is a composite image based on the middle (median) values of the stack over a year. The assumption is that, for specific purposes, land cover changes at a coarse scale are not significant over relatively short time periods. Because satellite imagery over tropical areas is widely recognized to be strongly affected by clouds, the methodology is tested on the case study of the Dominican Republic for the year 2015, using Landsat 8 imagery.
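
    A minimal sketch of the compositing step, assuming per-date cloud/shadow masks are available: take the per-pixel median ("middle value") of the clear observations in the stack.

```python
import numpy as np

def cloudless_composite(stack, cloud_masks):
    """Per-pixel median over a one-year image stack, ignoring pixels
    flagged as contaminated.  stack: (T, H, W) reflectance;
    cloud_masks: (T, H, W) boolean, True where cloud/shadow/haze."""
    data = np.where(cloud_masks, np.nan, stack.astype(float))
    return np.nanmedian(data, axis=0)   # NaN where no clear observation

# Illustrative 5-date stack for a 2x2 tile with scattered cloud flags.
rng = np.random.default_rng(5)
stack = rng.uniform(0.1, 0.3, size=(5, 2, 2))
masks = rng.random((5, 2, 2)) > 0.7
print(cloudless_composite(stack, masks).round(3))
```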

  13. SMART empirical approaches for predicting field performance of PV modules from results of reliability tests

    NASA Astrophysics Data System (ADS)

    Hardikar, Kedar Y.; Liu, Bill J. J.; Bheemreddy, Venkata

    2016-09-01

    Gaining an understanding of degradation mechanisms and their characterization are critical in developing relevant accelerated tests to ensure PV module performance warranty over a typical lifetime of 25 years. As newer technologies are adapted for PV, including new PV cell technologies, new packaging materials, and newer product designs, the availability of field data over extended periods of time for product performance assessment cannot be expected within the typical timeframe for business decisions. In this work, to enable product design decisions and product performance assessment for PV modules utilizing newer technologies, Simulation and Mechanism based Accelerated Reliability Testing (SMART) methodology and empirical approaches to predict field performance from accelerated test results are presented. The method is demonstrated for field life assessment of flexible PV modules based on degradation mechanisms observed in two accelerated tests, namely, Damp Heat and Thermal Cycling. The method is based on design of accelerated testing scheme with the intent to develop relevant acceleration factor models. The acceleration factor model is validated by extensive reliability testing under different conditions going beyond the established certification standards. Once the acceleration factor model is validated for the test matrix a modeling scheme is developed to predict field performance from results of accelerated testing for particular failure modes of interest. Further refinement of the model can continue as more field data becomes available. While the demonstration of the method in this work is for thin film flexible PV modules, the framework and methodology can be adapted to other PV products.
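
    Acceleration-factor models of the kind described often take an Arrhenius form in temperature, extended with a humidity power law for damp-heat testing (Peck's model). The sketch below uses commonly quoted but illustrative parameters, not values fitted in the paper.

```python
import numpy as np

K_BOLTZ = 8.617e-5          # Boltzmann constant, eV/K

def peck_af(t_use_c, rh_use, t_test_c, rh_test, ea=0.7, n=2.7):
    """Peck-style acceleration factor for damp-heat testing:
    AF = (RH_test/RH_use)^n * exp(Ea/k * (1/T_use - 1/T_test))."""
    t_use, t_test = t_use_c + 273.15, t_test_c + 273.15
    return (rh_test / rh_use) ** n * np.exp(
        ea / K_BOLTZ * (1.0 / t_use - 1.0 / t_test))

# Example: 85C/85%RH chamber vs. a 35C/60%RH field environment.
af = peck_af(35, 60, 85, 85, ea=0.7, n=2.7)
print(f"AF ~ {af:.0f}: 1000 chamber hours ~ {af * 1000:.0f} field hours")
```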

  14. Treatment of Selective Mutism: A Best-Evidence Synthesis.

    ERIC Educational Resources Information Center

    Stone, Beth Pionek; Kratochwill, Thomas R.; Sladezcek, Ingrid; Serlin, Ronald C.

    2002-01-01

    Presents systematic analysis of the major treatment approaches used for selective mutism. Based on nonparametric statistical tests of effect sizes, major findings include the following: treatment of selective mutism is more effective than no treatment; behaviorally oriented treatment approaches are more effective than no treatment; and no…

  15. Analytical approach for collective diffusion: One-dimensional lattice with the nearest neighbor and the next nearest neighbor lateral interactions

    NASA Astrophysics Data System (ADS)

    Tarasenko, Alexander

    2018-01-01

    Diffusion of particles adsorbed on a homogeneous one-dimensional lattice is investigated using a theoretical approach and MC simulations. The analytical dependencies calculated in the framework of the approach are tested against the numerical data. The perfect coincidence of the data obtained by these different methods demonstrates the correctness of the approach based on the theory of the non-equilibrium statistical operator.

  16. Comprehensive summary--Predict-IV: A systems toxicology approach to improve pharmaceutical drug safety testing.

    PubMed

    Mueller, Stefan O; Dekant, Wolfgang; Jennings, Paul; Testai, Emanuela; Bois, Frederic

    2015-12-25

    This special issue of Toxicology in Vitro is dedicated to disseminating the results of the EU-funded collaborative project "Profiling the toxicity of new drugs: a non animal-based approach integrating toxicodynamics and biokinetics" (Predict-IV; Grant 202222). The project's overall aim was to develop strategies to improve the assessment of drug safety in the early stage of development and late discovery phase, by an intelligent combination of non animal-based test systems, cell biology, mechanistic toxicology and in silico modeling, in a rapid and cost effective manner. This overview introduces the scope and overall achievements of Predict-IV. Copyright © 2014 Elsevier Ltd. All rights reserved.

  17. Development of a remote digital augmentation system and application to a remotely piloted research vehicle

    NASA Technical Reports Server (NTRS)

    Edwards, J. W.; Deets, D. A.

    1975-01-01

    A cost-effective approach to flight testing advanced control concepts with remotely piloted vehicles is described. The approach utilizes a ground based digital computer coupled to the remotely piloted vehicle's motion sensors and control surface actuators through telemetry links to provide high bandwidth feedback control. The system was applied to the control of an unmanned 3/8-scale model of the F-15 airplane. The model was remotely augmented; that is, the F-15 mechanical and control augmentation flight control systems were simulated by the ground-based computer, rather than being in the vehicle itself. The results of flight tests of the model at high angles of attack are discussed.

  18. Development of a hazard-based method for evaluating the fire safety of passenger trains

    DOT National Transportation Integrated Search

    1999-01-01

    The fire safety of U.S. passenger rail trains currently is addressed through small-scale flammability and smoke emission tests and performance criteria promulgated by the Federal Railroad Administration (FRA). The FRA approach relies heavily on test ...

  19. 10 CFR 100.1 - Purpose.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... approval requirements for proposed sites for stationary power and testing reactors subject to part 50 or part 52 of this chapter. (b) There exists a substantial base of knowledge regarding power reactor... approach incorporates the appropriate standards and criteria for approval of stationary power and testing...

  20. A Functional Approach to the Assessment of Language Skills

    ERIC Educational Resources Information Center

    Jakobovits, Leon A.

    1969-01-01

    Argues for language tests based on a view of linguistic competence broad enough to recognize the importance of social-psychological factors in the use of language. Paper prepared for a conference on language testing at Idyllwild, California, November 7-8, 1968. (FWB)

  1. A vision for modernizing environmental risk assessment

    EPA Science Inventory

    In 2007, the US National Research Council (NRC) published a Vision and Strategy for [human health] Toxicity Testing in the 21st century. Central to the vision was increased reliance on high throughput in vitro testing and predictive approaches based on mechanistic understanding o...

  2. Launching a virtual decision lab: development and field-testing of a web-based patient decision support research platform.

    PubMed

    Hoffman, Aubri S; Llewellyn-Thomas, Hilary A; Tosteson, Anna N A; O'Connor, Annette M; Volk, Robert J; Tomek, Ivan M; Andrews, Steven B; Bartels, Stephen J

    2014-12-12

    Over 100 trials show that patient decision aids effectively improve patients' information comprehension and values-based decision making. However, gaps remain in our understanding of several fundamental and applied questions, particularly related to the design of interactive, personalized decision aids. This paper describes an interdisciplinary development process for, and early field testing of, a web-based patient decision support research platform, or virtual decision lab, to address these questions. An interdisciplinary stakeholder panel designed the web-based research platform with three components: a) an introduction to shared decision making, b) a web-based patient decision aid, and c) interactive data collection items. Iterative focus groups provided feedback on paper drafts and online prototypes. A field test assessed a) feasibility for using the research platform, in terms of recruitment, usage, and acceptability; and b) feasibility of using the web-based decision aid component, compared to performance of a videobooklet decision aid in clinical care. This interdisciplinary, theory-based, patient-centered design approach produced a prototype for field-testing in six months. Participants (n = 126) reported that: the decision aid component was easy to use (98%), information was clear (90%), the length was appropriate (100%), it was appropriately detailed (90%), and it held their interest (97%). They spent a mean of 36 minutes using the decision aid and 100% preferred using their home/library computer. Participants scored a mean of 75% correct on the Decision Quality, Knowledge Subscale, and 74 out of 100 on the Preparation for Decision Making Scale. Completing the web-based decision aid reduced mean Decisional Conflict scores from 31.1 to 19.5 (p < 0.01). Combining decision science and health informatics approaches facilitated rapid development of a web-based patient decision support research platform that was feasible for use in research studies in terms of recruitment, acceptability, and usage. Within this platform, the web-based decision aid component performed comparably with the videobooklet decision aid used in clinical practice. Future studies may use this interactive research platform to study patients' decision making processes in real-time, explore interdisciplinary approaches to designing web-based decision aids, and test strategies for tailoring decision support to meet patients' needs and preferences.

  3. Ecological risk assessment of agricultural soils for the definition of soil screening values: A comparison between substance-based and matrix-based approaches.

    PubMed

    Pivato, Alberto; Lavagnolo, Maria Cristina; Manachini, Barbara; Vanin, Stefano; Raga, Roberto; Beggio, Giovanni

    2017-04-01

    The Italian legislation on contaminated soils does not include the Ecological Risk Assessment (ERA), and this deficiency has important consequences for the sustainable management of agricultural soils. The present research compares the results of two ERA procedures applied to agricultural soils: (i) one based on the "substance-based" approach and (ii) a second based on the "matrix-based" approach. In the former, the soil screening values (SVs) for individual substances were derived according to institutional foreign guidelines. In the latter, the SVs characterizing the whole matrix were derived originally by the authors by means of experimental activity. The results indicate that the "matrix-based" approach can be efficiently implemented in the Italian legislation for the ERA of agricultural soils. This method, compared to the institutionalized "substance-based" approach, (i) is comparable in economic terms and in testing time, (ii) is site specific and assesses the real effect of the investigated soil on a battery of bioassays, (iii) accounts for phenomena that may radically modify the exposure of the organisms to the totality of contaminants and (iv) can be considered sufficiently conservative.

  4. GPU-Based Point Cloud Superpositioning for Structural Comparisons of Protein Binding Sites.

    PubMed

    Leinweber, Matthias; Fober, Thomas; Freisleben, Bernd

    2018-01-01

    In this paper, we present a novel approach to solve the labeled point cloud superpositioning problem for performing structural comparisons of protein binding sites. The solution is based on a parallel evolution strategy that operates on large populations and runs on GPU hardware. The proposed evolution strategy reduces the likelihood of getting stuck in a local optimum of the multimodal real-valued optimization problem represented by labeled point cloud superpositioning. The performance of the GPU-based parallel evolution strategy is compared to a previously proposed CPU-based sequential approach for labeled point cloud superpositioning, indicating that the GPU-based parallel evolution strategy leads to qualitatively better results and significantly shorter runtimes, with speed improvements of up to a factor of 1,500 for large populations. Binary classification tests based on the ATP, NADH, and FAD protein subsets of CavBase, a database containing putative binding sites, show average classification rate improvements from about 92 percent (CPU) to 96 percent (GPU). Further experiments indicate that the proposed GPU-based labeled point cloud superpositioning approach can be superior to traditional protein comparison approaches based on sequence alignments.
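
    A CPU-bound toy version of the idea (the paper's contribution is a large-population GPU implementation): an evolution strategy over a 6-parameter rigid transform that minimizes RMSD between two labeled point clouds. Population sizes, schedules, and data are all illustrative.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def es_superpose(P, Q, pop=200, elite=20, iters=150, seed=0):
    """(mu, lambda) evolution-strategy sketch for rigid superposition:
    genome = 3 Euler angles + 3 translation components."""
    rng = np.random.default_rng(seed)
    mean, step = np.zeros(6), 1.0
    for _ in range(iters):
        genomes = mean + step * rng.normal(size=(pop, 6))
        costs = np.empty(pop)
        for k, g in enumerate(genomes):
            R = Rotation.from_euler("xyz", g[:3]).as_matrix()
            d = P @ R.T + g[3:] - Q
            costs[k] = np.sqrt((d ** 2).sum(axis=1).mean())   # RMSD
        mean = genomes[np.argsort(costs)[:elite]].mean(axis=0)
        step *= 0.95                      # shrink the search radius
    return mean

# Illustrative clouds: Q is a rotated, shifted, noisy copy of P.
rng = np.random.default_rng(6)
P = rng.normal(size=(30, 3))
R_true = Rotation.from_euler("xyz", [0.4, -0.2, 0.7]).as_matrix()
Q = P @ R_true.T + np.array([1.0, -0.5, 2.0]) + 0.01 * rng.normal(size=(30, 3))
g = es_superpose(P, Q)
R = Rotation.from_euler("xyz", g[:3]).as_matrix()
rmsd = float(np.sqrt(((P @ R.T + g[3:] - Q) ** 2).sum(axis=1).mean()))
print("final RMSD:", round(rmsd, 3))      # should approach the noise floor
```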

  5. Cost-effectiveness analysis of a system-based approach for managing neonatal jaundice and preventing kernicterus in Ontario.

    PubMed

    Xie, Bin; da Silva, Orlando; Zaric, Greg

    2012-01-01

    To evaluate the incremental cost-effectiveness of a system-based approach for the management of neonatal jaundice and the prevention of kernicterus in term and late-preterm (≥35 weeks) infants, compared with the traditional practice based on visual inspection and selected bilirubin testing. Two hypothetical cohorts of 150,000 term and late-preterm neonates were used to compare the costs and outcomes associated with the use of a system-based or traditional practice approach. Data for the evaluation were obtained from the case costing centre at a large teaching hospital in Ontario, supplemented by data from the literature. The per child cost for the system-based approach cohort was $176, compared with $173 in the traditional practice cohort. The higher cost associated with the system-based cohort reflects increased costs for predischarge screening and treatment and increased postdischarge follow-up visits. These costs are partially offset by reduced costs from fewer emergency room visits, hospital readmissions and kernicterus cases. Compared with the traditional approach, the cost to prevent one kernicterus case using the system-based approach was $570,496, the cost per life year gained was $26,279, and the cost per quality-adjusted life year gained was $65,698. The cost to prevent one kernicterus case using the system-based approach is much lower than previously reported in the literature.

  6. Cost-effectiveness analysis of a system-based approach for managing neonatal jaundice and preventing kernicterus in Ontario

    PubMed Central

    Xie, Bin; da Silva, Orlando; Zaric, Greg

    2012-01-01

    OBJECTIVE: To evaluate the incremental cost-effectiveness of a system-based approach for the management of neonatal jaundice and the prevention of kernicterus in term and late-preterm (≥35 weeks) infants, compared with the traditional practice based on visual inspection and selected bilirubin testing. STUDY DESIGN: Two hypothetical cohorts of 150,000 term and late-preterm neonates were used to compare the costs and outcomes associated with the use of a system-based or traditional practice approach. Data for the evaluation were obtained from the case costing centre at a large teaching hospital in Ontario, supplemented by data from the literature. RESULTS: The per child cost for the system-based approach cohort was $176, compared with $173 in the traditional practice cohort. The higher cost associated with the system-based cohort reflects increased costs for predischarge screening and treatment and increased postdischarge follow-up visits. These costs are partially offset by reduced costs from fewer emergency room visits, hospital readmissions and kernicterus cases. Compared with the traditional approach, the cost to prevent one kernicterus case using the system-based approach was $570,496, the cost per life year gained was $26,279, and the cost per quality-adjusted life year gained was $65,698. CONCLUSION: The cost to prevent one kernicterus case using the system-based approach is much lower than previously reported in the literature. PMID:23277747

  7. Mutation Testing for Effective Verification of Digital Components of Physical Systems

    NASA Astrophysics Data System (ADS)

    Kushik, N. G.; Evtushenko, N. V.; Torgaev, S. N.

    2015-12-01

    Digital components of modern physical systems are often designed using circuitry solutions based on field programmable gate array (FPGA) technology. Such (embedded) digital components should be carefully tested. In this paper, an approach to the verification of digital physical system components based on mutation testing is proposed. The reference description of the behavior of a digital component in a hardware description language (HDL) is mutated by introducing the most probable errors into it; unlike for mutants in high-level programming languages, the corresponding test case can be derived efficiently, based on a comparison of special scalable representations of the specification and the constructed mutant using various logic synthesis and verification systems.
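
    A toy illustration of the mutation-testing idea in Python rather than HDL: model the specification as a Boolean function, inject a plausible operator error, and derive a distinguishing test vector by comparing the two functions (exhaustive comparison stands in for the scalable representations used by synthesis/verification tools).

```python
from itertools import product

# Reference behavior of a small combinational block (the "specification").
def spec(a, b, c):
    return (a and b) or c

# Mutant: a probable designer error here swaps AND for OR.
def mutant(a, b, c):
    return (a or b) or c

def distinguishing_tests(f, g, n_inputs=3):
    """Input vectors on which the specification and the mutant disagree;
    any one of them is a test case that kills this mutant."""
    return [v for v in product([0, 1], repeat=n_inputs)
            if f(*v) != g(*v)]

print(distinguishing_tests(spec, mutant))   # e.g., (1, 0, 0) kills the mutant
```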

  8. Testing the significance of a correlation with nonnormal data: comparison of Pearson, Spearman, transformation, and resampling approaches.

    PubMed

    Bishara, Anthony J; Hittner, James B

    2012-09-01

    It is well known that when data are nonnormally distributed, a test of the significance of Pearson's r may inflate Type I error rates and reduce power. Statistics textbooks and the simulation literature provide several alternatives to Pearson's correlation. However, the relative performance of these alternatives has been unclear. Two simulation studies were conducted to compare 12 methods, including Pearson, Spearman's rank-order, transformation, and resampling approaches. With most sample sizes (n ≥ 20), Type I and Type II error rates were minimized by transforming the data to a normal shape prior to assessing the Pearson correlation. Among transformation approaches, a general purpose rank-based inverse normal transformation (i.e., transformation to rankit scores) was most beneficial. However, when samples were both small (n ≤ 10) and extremely nonnormal, the permutation test often outperformed other alternatives, including various bootstrap tests.
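
    The transformation the study found most beneficial, the rank-based inverse normal (rankit) transformation, is short enough to sketch; the (rank - 0.5)/n convention below is the standard rankit definition, and the data are synthetic.

```python
import numpy as np
from scipy.stats import rankdata, norm, pearsonr

def rankit(x):
    """Rank-based inverse normal transformation (rankit scores):
    map ranks to normal quantiles via (rank - 0.5) / n."""
    r = rankdata(x)
    return norm.ppf((r - 0.5) / len(x))

# Illustrative skewed data: Pearson r on raw vs. rankit-transformed values.
rng = np.random.default_rng(7)
x = rng.lognormal(size=200)
y = x + rng.lognormal(size=200)
print("raw    r =", round(pearsonr(x, y)[0], 3))
print("rankit r =", round(pearsonr(rankit(x), rankit(y))[0], 3))
```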

  9. Brief communication: Using averaged soil moisture estimates to improve the performances of a regional-scale landslide early warning system

    NASA Astrophysics Data System (ADS)

    Segoni, Samuele; Rosi, Ascanio; Lagomarsino, Daniela; Fanti, Riccardo; Casagli, Nicola

    2018-03-01

    We communicate the results of a preliminary investigation aimed at improving a state-of-the-art RSLEWS (regional-scale landslide early warning system) based on rainfall thresholds by integrating mean soil moisture values averaged over the territorial units of the system. We tested two approaches. The simplest can be easily applied to improve other RSLEWS: it is based on a soil moisture threshold value under which rainfall thresholds are not used because landslides are not expected to occur. Another approach deeply modifies the original RSLEWS: thresholds based on antecedent rainfall accumulated over long periods are substituted with soil moisture thresholds. A back analysis demonstrated that both approaches consistently reduced false alarms, while the second approach reduced missed alarms as well.

  10. An expert knowledge-based approach to landslide susceptibility mapping using GIS and fuzzy logic

    NASA Astrophysics Data System (ADS)

    Zhu, A.-Xing; Wang, Rongxun; Qiao, Jianping; Qin, Cheng-Zhi; Chen, Yongbo; Liu, Jing; Du, Fei; Lin, Yang; Zhu, Tongxin

    2014-06-01

    This paper presents an expert knowledge-based approach to landslide susceptibility mapping in an effort to overcome the deficiencies of data-driven approaches. The proposed approach consists of three generic steps: (1) extraction of knowledge on the relationship between landslide susceptibility and predisposing factors from domain experts, (2) characterization of predisposing factors using GIS techniques, and (3) prediction of landslide susceptibility under fuzzy logic. The approach was tested in two study areas in China - the Kaixian study area (about 250 km2) and the Three Gorges study area (about 4600 km2). The Kaixian study area was used to develop the approach and to evaluate its validity. The Three Gorges study area was used to test both the portability and the applicability of the developed approach for mapping landslide susceptibility over large study areas. Performance was evaluated by examining if the mean of the computed susceptibility values at landslide sites was statistically different from that of the entire study area. A z-score test was used to examine the statistical significance of the difference. The computed z for the Kaixian area was 3.70 and the corresponding p-value was less than 0.001. This suggests that the computed landslide susceptibility values are good indicators of landslide occurrences. In the Three Gorges study area, the computed z was 10.75 and the corresponding p-value was less than 0.001. In addition, we divided the susceptibility value into four levels: low (0.0-0.25), moderate (0.25-0.5), high (0.5-0.75) and very high (0.75-1.0). No landslides were found for areas of low susceptibility. Landslide density was about three times higher in areas of very high susceptibility than that in the moderate susceptibility areas, and more than twice as high as that in the high susceptibility areas. The results from the Three Gorge study area suggest that the extracted expert knowledge can be extrapolated to another study area and the developed approach can be used in large-scale projects. Results from these case studies suggest that the expert knowledge-based approach is effective in mapping landslide susceptibility and that its performance is maintained when it is moved to a new area from the model development area without changes to the knowledge base.

  11. Comparing writing style feature-based classification methods for estimating user reputations in social media.

    PubMed

    Suh, Jong Hwan

    2016-01-01

    In recent years, the anonymous nature of the Internet has made it difficult to detect manipulated user reputations in social media, as well as to ensure the quality of users and their posts. To deal with this, this study designs and examines an automatic approach that adopts writing style features to estimate user reputations in social media. Under varying ways of defining Good and Bad classes of user reputations based on the collected data, it evaluates the classification performance of the state-of-the-art methods: four writing style features, i.e. lexical, syntactic, structural, and content-specific, and eight classification techniques, i.e. four base learners (C4.5, Neural Network (NN), Support Vector Machine (SVM), and Naïve Bayes (NB)) and four Random Subspace (RS) ensemble methods based on the four base learners. With South Korea's Web forum, Daum Agora, selected as a test bed, the experimental results show that the configuration of the full feature set containing content-specific features and RS-SVM, combining RS and SVM, gives the best classification accuracy when the test bed poster reputations are segmented strictly into Good and Bad classes by the portfolio approach. Pairwise t-tests on accuracy confirm two expectations from the literature review: first, feature sets that add content-specific features outperform the others; second, ensemble learning methods are more viable than base learners. Moreover, among the four ways of defining the classes of user reputations, i.e. like, dislike, sum, and portfolio, the results show that the portfolio approach gives the highest accuracy.
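
    A sketch of an RS-SVM-style classifier using scikit-learn's bagging machinery restricted to feature subsampling (the Random Subspace method); feature counts and data are synthetic stand-ins for the writing-style features, and note the `estimator` argument is named `base_estimator` in older scikit-learn releases.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Stand-in feature matrix for writing-style features (lexical, syntactic,
# structural, content-specific) and Good/Bad reputation labels.
X, y = make_classification(n_samples=400, n_features=60, n_informative=15,
                           random_state=0)

# Random Subspace = ensemble over random feature subsets, full sample set.
rs_svm = BaggingClassifier(
    estimator=make_pipeline(StandardScaler(), SVC()),
    n_estimators=30, max_features=0.5,
    bootstrap=False, bootstrap_features=False, random_state=0)

print("RS-SVM accuracy:", cross_val_score(rs_svm, X, y, cv=5).mean().round(3))
```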

  12. An alternative approach based on artificial neural networks to study controlled drug release.

    PubMed

    Reis, Marcus A A; Sinisterra, Rubén D; Belchior, Jadson C

    2004-02-01

    An alternative methodology based on artificial neural networks is proposed to be a complementary tool to other conventional methods to study controlled drug release. Two systems are used to test the approach; namely, hydrocortisone in a biodegradable matrix and rhodium (II) butyrate complexes in a bioceramic matrix. Two well-established mathematical models are used to simulate different release profiles as a function of fundamental properties; namely, diffusion coefficient (D), saturation solubility (C(s)), drug loading (A), and the height of the device (h). The models were tested, and the results show that these fundamental properties can be predicted after learning the experimental or model data for controlled drug release systems. The neural network results obtained after the learning stage can be considered to quantitatively predict ideal experimental conditions. Overall, the proposed methodology was shown to be efficient for ideal experiments, with a relative average error of <1% in both tests. This approach can be useful for the experimental analysis to simulate and design efficient controlled drug-release systems. Copyright 2004 Wiley-Liss, Inc. and the American Pharmacists Association

  13. Comparing Paper and Tablet Modes of Retrospective Activity Space Data Collection.

    PubMed

    Yabiku, Scott T; Glick, Jennifer E; Wentz, Elizabeth A; Ghimire, Dirgha; Zhao, Qunshan

    2017-01-01

    Individual actions are both constrained and facilitated by the social context in which individuals are embedded. But research to test specific hypotheses about the role of space on human behaviors and well-being is limited by the difficulty of collecting accurate and personally relevant social context data. We report on a project in Chitwan, Nepal, that directly addresses challenges to collect accurate activity space data. We test if a computer assisted interviewing (CAI) tablet-based approach to collecting activity space data was more accurate than a paper map-based approach; we also examine which subgroups of respondents provided more accurate data with the tablet mode compared to paper. Results show that the tablet approach yielded more accurate data when comparing respondent-indicated locations to the known locations as verified by on-the-ground staff. In addition, the accuracy of the data provided by older and less healthy respondents benefited more from the tablet mode.

  14. A controlled phantom study of a noise equalization algorithm for detecting microcalcifications in digital mammograms.

    PubMed

    Gürün, O O; Fatouros, P P; Kuhn, G M; de Paredes, E S

    2001-04-01

    We report on some extensions and further developments of a well-known microcalcification detection algorithm based on adaptive noise equalization. Tissue equivalent phantom images with and without labeled microcalcifications were subjected to this algorithm, and analyses of results revealed some shortcomings in the approach. Particularly, it was observed that the method of estimating the width of distributions in the feature space was based on assumptions which resulted in the loss of similarity preservation characteristics. A modification involving a change of estimator statistic was made, and the modified approach was tested on the same phantom images. Other modifications for improving detectability such as downsampling and use of alternate local contrast filters were also tested. The results indicate that these modifications yield improvements in detectability, while extending the generality of the approach. Extensions to real mammograms and further directions of research are discussed.

  15. Comparing Paper and Tablet Modes of Retrospective Activity Space Data Collection*

    PubMed Central

    Yabiku, Scott T.; Glick, Jennifer E.; Wentz, Elizabeth A.; Ghimire, Dirgha; Zhao, Qunshan

    2018-01-01

    Individual actions are both constrained and facilitated by the social context in which individuals are embedded. But research to test specific hypotheses about the role of space on human behaviors and well-being is limited by the difficulty of collecting accurate and personally relevant social context data. We report on a project in Chitwan, Nepal, that directly addresses challenges to collect accurate activity space data. We test if a computer assisted interviewing (CAI) tablet-based approach to collecting activity space data was more accurate than a paper map-based approach; we also examine which subgroups of respondents provided more accurate data with the tablet mode compared to paper. Results show that the tablet approach yielded more accurate data when comparing respondent-indicated locations to the known locations as verified by on-the-ground staff. In addition, the accuracy of the data provided by older and less healthy respondents benefited more from the tablet mode. PMID:29623133

  16. Sexually transmitted infection (STI) testing among young oil and gas workers: the need for innovative, place-based approaches to STI control.

    PubMed

    Goldenberg, Shira M; Shoveller, Jean A; Ostry, Aleck C; Koehoorn, Mieke

    2008-01-01

    Northeastern British Columbia is undergoing rapid in-migration of young, primarily male workers in response to the "boom" in the oil/gas industries. Accompanying the boom is a rise in Chlamydia rates among youth, which exceed the provincial average by 22%. STI testing reduces the disease burden, contributing to STI prevention. 1) To document youths' perceptions regarding the socio-cultural and structural forces that affect young oil/gas workers' access to STI testing; 2) to gather service providers' perspectives on sexual health service delivery for workers; and 3) to develop recommendations to improve the accessibility of STI testing. We conducted ethnographic fieldwork (8 weeks) in a remote oil/gas community, including in-depth interviews with 25 young people (ages 15-25) and 14 health and social service providers. Participants identified limited opportunities to access testing, geographic isolation, and 'rigger' culture as three key categories inhibiting STI testing among oil/gas Workers. These results suggest the need for place-based approaches to STI control. Innovative outreach strategies are suggested to address oil/gas workers' needs, including a locally tailored STI awareness campaign, condom distribution, expanded clinic hours, and onsite STI testing.

  17. A Physics-Based Temperature Stabilization Criterion for Thermal Testing

    NASA Technical Reports Server (NTRS)

    Rickman, Steven L.; Ungar, Eugene K.

    2009-01-01

    Spacecraft testing specifications differ greatly in the criteria they specify for stability in thermal balance tests. Some specify a required temperature stabilization rate (the change in temperature per unit time, dT/dt), some specify that the final steady-state temperature be approached to within a specified difference, ΔT, and some specify a combination of the two. The particular values for temperature stabilization rate and final temperature difference also vary greatly between specification documents. A one-size-fits-all temperature stabilization rate requirement does not yield consistent results for all test configurations because of differences in thermal mass and heat transfer to the environment. Applying a steady-state temperature difference requirement is problematic because the final test temperature is not accurately known a priori, especially for powered configurations. In the present work, a simplified, lumped-mass analysis has been used to explore the applicability of these criteria. A new, user-friendly, physics-based approach is developed that allows the thermal engineer to determine when an acceptable level of temperature stabilization has been achieved. The stabilization criterion can be predicted pre-test but must be refined during test to allow verification that the defined level of temperature stabilization has been achieved.
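
    The lumped-mass physics can be made concrete: a single thermal mass approaches steady state as T(t) = T_f + (T_0 - T_f) e^(-t/tau), so the remaining distance to steady state equals tau * |dT/dt|. The sketch below, an illustration rather than necessarily the paper's exact criterion, estimates tau from the telemetry itself and predicts the remaining ΔT without knowing T_f in advance.

```python
import numpy as np

# Simulated test telemetry: exponential approach to an unknown T_final.
tau, T0, Tf = 1800.0, 20.0, 55.0               # seconds, degC (illustrative)
t = np.arange(0, 4 * tau, 60.0)
T = Tf + (T0 - Tf) * np.exp(-t / tau)

# Estimate tau from telemetry: dT/dt = (Tf - T)/tau, so log|dT/dt|
# decays linearly in time with slope -1/tau.
dTdt = np.gradient(T, t)
tau_est = -1.0 / np.polyfit(t, np.log(np.abs(dTdt)), 1)[0]

# Remaining distance to steady state inferred from the current rate.
remaining = tau_est * abs(dTdt[-1])
print(f"tau ~ {tau_est:.0f} s, predicted remaining delta-T ~ {remaining:.2f} C")
print(f"true remaining delta-T: {abs(Tf - T[-1]):.2f} C")
```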

  18. Genetic testing for patients with renal disease: procedures, pitfalls, and ethical considerations.

    PubMed

    Korf, B R

    1999-07-01

    The Human Genome Project is rapidly producing insights into the molecular basis of human genetic disorders. The most immediate clinical benefit is the advent of new diagnostic methods. Molecular diagnostic tools are available for several genetic renal disorders and are in development for many more. Two general approaches to molecular diagnosis are linkage-based testing and direct mutation detection. The former is used when the gene has not been cloned but has been mapped in relation to polymorphic loci. Linkage-based testing is also helpful when a large diversity of mutations makes direct detection difficult. Limitations include the need to study multiple family members, the need for informative polymorphisms, and genetic heterogeneity. Direct mutation detection is limited by genetic heterogeneity and the need to distinguish nonpathogenic allelic variants from pathogenic mutations. Molecular testing raises a number of complex ethical issues, including those associated with prenatal or presymptomatic diagnosis. In addition, there are concerns about informed consent, privacy, genetic discrimination, and technology transfer for newly developed tests. Health professionals need to be aware of the technical and ethical implications of these new methods of testing, as well as the complexities in test interpretation, as molecular approaches are increasingly integrated into medical practice.

  19. A Proposal to Plan and Develop a Sample Set of Drill and Testing Materials, Based on Audio and Visual Environmental and Situational Stimuli, Aimed at Training and Testing in the Creation of Original Utterances by Foreign Language Students at the Secondary and College Levels.

    ERIC Educational Resources Information Center

    Obrecht, Dean H.

    This report contrasts the results of a rigidly specified, pattern-oriented approach to learning Spanish with an approach that emphasizes the origination of sentences by the learner in direct response to stimuli. Pretesting and posttesting statistics are presented and conclusions are discussed. The experimental method, which required the student to…

  20. Development of immune-diagnostic reagents to diagnose bovine tuberculosis in cattle.

    PubMed

    Vordermeier, H Martin; Jones, Gareth J; Buddle, Bryce M; Hewinson, R Glyn

    2016-11-15

    Bovine tuberculosis remains a major economic and animal welfare concern worldwide. As part of control strategies, cattle vaccination is being considered. This approach, used alongside conventional control policies, also requires the development of vaccine compatible diagnostic assays to distinguish infected from vaccinated animals (DIVA). In this review we discuss recent advances in DIVA development based on the detection of host cellular immune responses by blood testing or skin testing approaches. Crown Copyright © 2016. Published by Elsevier B.V. All rights reserved.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lundstrom, Blake; Chakraborty, Sudipta; Lauss, Georg

    This paper presents a concise description of state-of-the-art real-time simulation-based testing methods and demonstrates how they can be used independently and/or in combination as an integrated development and validation approach for smart grid DERs and systems. A three-part case study demonstrating the application of this integrated approach at the different stages of development and validation of a system-integrated smart photovoltaic (PV) inverter is also presented. Laboratory testing results and perspectives from two international research laboratories are included in the case study.

  2. Learning from project experiences using a legacy-based approach

    NASA Technical Reports Server (NTRS)

    Cooper, Lynne P.; Majchrzak, Ann; Faraj, Samer

    2005-01-01

    As project teams are used more widely, the question of how to capitalize on the knowledge learned in project teams remains open. Using previous research on shared cognition in groups, an approach to promoting post-project learning was developed. This Legacy Review concept was tested on four intact project teams. The results from those test sessions were used to develop a model of team learning via group cognitive processes. The model and supporting propositions are presented.

  3. The special case of the 2 × 2 table: asymptotic unconditional McNemar test can be used to estimate sample size even for analysis based on GEE.

    PubMed

    Borkhoff, Cornelia M; Johnston, Patrick R; Stephens, Derek; Atenafu, Eshetu

    2015-07-01

    Aligning the method used to estimate sample size with the planned analytic method ensures the sample size needed to achieve the planned power. When using generalized estimating equations (GEE) to analyze a paired binary primary outcome with no covariates, many use an exact McNemar test to calculate sample size. We reviewed the approaches to sample size estimation for paired binary data and compared the sample size estimates on the same numerical examples. We used the hypothesized sample proportions for the 2 × 2 table to calculate the correlation between the marginal proportions to estimate sample size based on GEE. We solved the inside proportions based on the correlation and the marginal proportions to estimate sample size based on the exact McNemar, asymptotic unconditional McNemar, and asymptotic conditional McNemar tests. The asymptotic unconditional McNemar test is a good approximation of the GEE method of Pan. The exact McNemar test is too conservative and yields unnecessarily larger sample size estimates than all the other methods. In the special case of a 2 × 2 table, even when a GEE approach to binary logistic regression is the planned analytic method, the asymptotic unconditional McNemar test can be used to estimate sample size. We do not recommend using an exact McNemar test. Copyright © 2015 Elsevier Inc. All rights reserved.
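
    The asymptotic unconditional McNemar sample size has a closed form in the hypothesized discordant proportions. The sketch below implements the textbook two-sided formula (not code from the paper): n = [z_(1-alpha/2) * sqrt(psi) + z_(1-beta) * sqrt(psi - delta^2)]^2 / delta^2, with psi = p12 + p21 and delta = p12 - p21.

```python
import math
from scipy.stats import norm

def mcnemar_sample_size(p12, p21, alpha=0.05, power=0.80):
    """Asymptotic unconditional McNemar sample size (number of pairs).
    p12, p21: hypothesized discordant cell proportions of the 2x2 table."""
    psi, delta = p12 + p21, p12 - p21
    za, zb = norm.ppf(1 - alpha / 2), norm.ppf(power)
    n = (za * math.sqrt(psi) + zb * math.sqrt(psi - delta ** 2)) ** 2 / delta ** 2
    return math.ceil(n)

# Example: 20% vs. 10% discordance, i.e. p12 = 0.20, p21 = 0.10.
print(mcnemar_sample_size(0.20, 0.10), "pairs")
```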

  4. Toxicity Testing in the 21st Century Beyond Environmental Chemicals

    PubMed Central

    Rovida, Costanza; Asakura, Shoji; Daneshian, Mardas; Hofman-Huether, Hana; Leist, Marcel; Meunier, Leo; Reif, David; Rossi, Anna; Schmutz, Markus; Valentin, Jean-Pierre; Zurlo, Joanne; Hartung, Thomas

    2018-01-01

    Summary: After the publication of the report titled Toxicity Testing in the 21st Century – A Vision and a Strategy, many initiatives started to foster a major paradigm shift for toxicity testing, from apical endpoints in animal-based tests to mechanistic endpoints through delineation of pathways of toxicity (PoT) in human cell-based systems. The US EPA has funded an important project to develop new high-throughput technologies based on human cell-based in vitro technologies. These methods are currently being incorporated into the chemical risk assessment process. In the pharmaceutical industry, the efficacy and toxicity of new drugs are evaluated during preclinical investigations that include drug metabolism, pharmacokinetics, pharmacodynamics and safety toxicology studies. The results of these studies are analyzed and extrapolated to predict efficacy and potential adverse effects in humans. However, given the high failure rate of drugs during the clinical phases, a new approach for more predictive assessment of drugs, both in terms of efficacy and adverse effects, is urgently needed. The food industry faces the challenge of assessing novel foods and food ingredients for the general population, for which animal safety testing is often of limited relevance for extrapolation purposes. The question is whether the latest paradigm shift proposed by the Tox21c report for chemicals may provide a useful tool to improve the risk assessment approach for drugs and food ingredients as well. PMID:26168280

  5. "The Hole in the Sky Causes Global Warming": A Case Study of Secondary School Students' Climate Change Alternative Conceptions

    ERIC Educational Resources Information Center

    Chang, Chew-Hung; Pascua, Liberty

    2015-01-01

    This study identified secondary school students' alternative conceptions (ACs) of climate change and their resistance to instruction. Using a case-based approach, a diagnostic test was administered to Secondary 3 male students in a pre-test and post-test. The ACs identified in the pre-test were on the causes of climate change, the natural…

  6. A Comparison of Three IRT Approaches to Examinee Ability Change Modeling in a Single-Group Anchor Test Design

    ERIC Educational Resources Information Center

    Paek, Insu; Park, Hyun-Jeong; Cai, Li; Chi, Eunlim

    2014-01-01

    Typically, longitudinal growth modeling based on item response theory (IRT) requires repeated measures data from a single group with the same test design. If operational or item exposure problems are present, the same test may not be employed to collect data for longitudinal analyses and tests at multiple time points are constructed with unique…

  7. Analyzing Real-World Light Duty Vehicle Efficiency Benefits

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gonder, Jeffrey; Wood, Eric; Chaney, Larry

    Off-cycle technologies represent an important pathway to achieve real-world fuel savings, through which OEMs can potentially receive credit toward CAFE compliance. DOE national labs such as NREL are well positioned to provide objective input on these technologies using large, national data sets in conjunction with OEM- and technology-specific testing. This project demonstrates an approach that combines vehicle testing (dynamometer and on-road) with powertrain modeling and simulation over large, representative datasets to quantify real-world fuel economy. The approach can be applied to specific off-cycle technologies (engine encapsulation, start/stop, connected vehicle, etc.) in A/B comparisons to support calculation of realistic real-world impacts. Future work will focus on testing-based A/B technology comparisons that demonstrate the significance of this approach.

  8. A Risk-Based Approach for Aerothermal/TPS Analysis and Testing

    NASA Technical Reports Server (NTRS)

    Wright, Michael J.; Grinstead, Jay H.; Bose, Deepak

    2007-01-01

    The current status of aerothermal and thermal protection system modeling for civilian entry missions is reviewed. For most such missions, the accuracy of our simulations is limited not by the tools and processes currently employed, but rather by reducible deficiencies in the underlying physical models. Improving the accuracy of and reducing the uncertainties in these models will enable a greater understanding of the system level impacts of a particular thermal protection system and of the system operation and risk over the operational life of the system. A strategic plan will be laid out by which key modeling deficiencies can be identified via mission-specific gap analysis. Once these gaps have been identified, the driving component uncertainties are determined via sensitivity analyses. A Monte-Carlo based methodology is presented for physics-based probabilistic uncertainty analysis of aerothermodynamics and thermal protection system material response modeling. These data are then used to advocate for and plan focused testing aimed at reducing key uncertainties. The results of these tests are used to validate or modify existing physical models. Concurrently, a testing methodology is outlined for thermal protection materials. The proposed approach is based on using the results of uncertainty/sensitivity analyses discussed above to tailor ground testing so as to best identify and quantify system performance and risk drivers. A key component of this testing is understanding the relationship between the test and flight environments. No existing ground test facility can simultaneously replicate all aspects of the flight environment, and therefore good models for traceability to flight are critical to ensure a low risk, high reliability thermal protection system design. Finally, the role of flight testing in the overall thermal protection system development strategy is discussed.
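    To make the Monte-Carlo idea concrete, here is a minimal sketch of probabilistic uncertainty propagation through a stagnation-point heating correlation; the Sutton-Graves-type form is a standard correlation, but the coefficient uncertainty, trajectory point, and unit convention below are illustrative assumptions rather than values from the paper.

        # Sketch of Monte Carlo uncertainty propagation for an aeroheating
        # estimate; the correlation form is Sutton-Graves-like, but all
        # numbers and uncertainty ranges are illustrative stand-ins.
        import numpy as np

        rng = np.random.default_rng(1)
        N = 100_000

        rho = rng.normal(3.0e-4, 3.0e-5, N)    # freestream density, kg/m^3
        v = rng.normal(7200.0, 100.0, N)       # velocity, m/s
        rn = 1.0                               # nose radius, m (held fixed)
        k = rng.normal(1.74e-4, 0.09e-4, N)    # uncertain correlation coefficient

        # Stagnation-point heating; divide by 1e4 to report W/cm^2
        # under the assumed unit convention.
        q = k * np.sqrt(rho / rn) * v**3 / 1e4

        # Tail percentiles, not the mean, drive TPS margin decisions.
        print(f"mean = {q.mean():.0f} W/cm^2, "
              f"97.7th pct = {np.percentile(q, 97.7):.0f} W/cm^2")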

  9. ACCF/ASNC appropriateness criteria for single-photon emission computed tomography myocardial perfusion imaging (SPECT MPI): a report of the American College of Cardiology Foundation Quality Strategic Directions Committee Appropriateness Criteria Working Group and the American Society of Nuclear Cardiology endorsed by the American Heart Association.

    PubMed

    Brindis, Ralph G; Douglas, Pamela S; Hendel, Robert C; Peterson, Eric D; Wolk, Michael J; Allen, Joseph M; Patel, Manesh R; Raskin, Ira E; Hendel, Robert C; Bateman, Timothy M; Cerqueira, Manuel D; Gibbons, Raymond J; Gillam, Linda D; Gillespie, John A; Hendel, Robert C; Iskandrian, Ami E; Jerome, Scott D; Krumholz, Harlan M; Messer, Joseph V; Spertus, John A; Stowers, Stephen A

    2005-10-18

    Under the auspices of the American College of Cardiology Foundation (ACCF) and the American Society of Nuclear Cardiology (ASNC), an appropriateness review was conducted for radionuclide cardiovascular imaging (RNI), specifically gated single-photon emission computed tomography myocardial perfusion imaging (SPECT MPI). The review assessed the risks and benefits of the imaging test for several indications or clinical scenarios and scored them based on a scale of 1 to 9, where the upper range (7 to 9) implies that the test is generally acceptable and is a reasonable approach, and the lower range (1 to 3) implies that the test is generally not acceptable and is not a reasonable approach. The mid range (4 to 6) implies that the test may be generally acceptable and may be a reasonable approach for the indication. The indications for this review were primarily drawn from existing clinical practice guidelines and modified based on discussion by the ACCF Appropriateness Criteria Working Group and the Technical Panel members who rated the indications. The method for this review was based on the RAND/UCLA approach for evaluating appropriateness, which blends scientific evidence and practice experience. A modified Delphi technique was used to obtain first- and second-round ratings of 52 clinical indications. The ratings were done by a Technical Panel with diverse membership, including nuclear cardiologists, referring physicians (including an echocardiographer), health services researchers, and a payer (chief medical officer). These results are expected to have a significant impact on physician decision making and performance, reimbursement policy, and future research directions. Periodic assessment and updating of criteria will be undertaken as needed.
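    As a quick illustration of the scoring scale described above, a minimal sketch mapping a panel score to its appropriateness category:

        # The 1-9 appropriateness scale from the ACCF/ASNC criteria,
        # as summarized in the abstract above.
        def appropriateness(score: int) -> str:
            if not 1 <= score <= 9:
                raise ValueError("score must be between 1 and 9")
            if score >= 7:
                return "appropriate: generally acceptable and reasonable"
            if score >= 4:
                return "uncertain: may be acceptable and may be reasonable"
            return "inappropriate: generally not acceptable, not reasonable"

        print(appropriateness(8))  # -> appropriate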

  10. Development and Testing of Harpoon-Based Approaches for Collecting Comet Samples (Video Supplement)

    NASA Technical Reports Server (NTRS)

    Purves, Lloyd (Compiler); Nuth, Joseph (Compiler); Amatucci, Edward (Compiler); Wegel, Donald; Smith, Walter; Leary, James; Kee, Lake; Hill, Stuart; Grebenstein, Markus; Voelk, Stefan

    2017-01-01

    This video supplement contains a set of videos created during the approximately 10-year-long course of developing and testing the Goddard Space Flight Center (GSFC) harpoon-based approach for collecting comet samples. The purpose of the videos is to illustrate various design concepts used in this method of acquiring samples of comet material, the testing used to verify the concepts, and the evolution of designs and testing. To play the videos, this PDF needs to be opened in the free Adobe Reader; they do not appear to play within a browser. While this supplement can be used as a stand-alone document, it is intended to augment its parent document of the same title, Development and Testing of Harpoon-Based Approaches for Collecting Comet Samples (NASA/CR-2017-219018; this document is accessible from the website: https://ssed.gsfc.nasa.gov/harpoon/SAS_Paper-V1.pdf). The parent document, which only contains text and figures, describes the overall development and testing effort and contains references to each of the videos in this supplement. Thus, the videos are primarily intended to augment the information provided by the text and figures in the parent document. This approach was followed to keep the file size of the parent document small enough to facilitate downloading and storage. Some of the videos were created by other organizations, the Johns Hopkins University Applied Physics Laboratory (JHU APL) and the German Aerospace Center (Deutsches Zentrum für Luft- und Raumfahrt, DLR), which are partnering with GSFC on developing this technology. Each video is accompanied by text that provides a summary description of its nature and purpose, as well as the identity of the authors. All videos have been edited to show only key parts of the testing. Although not all videos have sound, the sound has been retained in those that have it. Also, each video has been given one or more title screens to clarify what is going on in different phases of the video.

  11. Repeated Challenge Studies: A Comparison of Union-Intersection Testing with Linear Modeling.

    ERIC Educational Resources Information Center

    Levine, Richard A.; Ohman, Pamela A.

    1997-01-01

    Challenge studies can be used to see whether there is a causal relationship between an agent of interest and a response. An approach based on union-intersection testing is presented that allows researchers to examine observations on a single subject and test the hypothesis of interest. An application using psychological data is presented. (SLD)

  12. A Short Test for the Assessment of Basic Knowledge in Psychology

    ERIC Educational Resources Information Center

    Peter, Johannes; Leichner, Nikolas; Mayer, Anne-Kathrin; Krampen, Günter

    2015-01-01

    This paper reports the development of a fixed-choice test for the assessment of basic knowledge in psychology, for use with undergraduate as well as graduate students. Test content is selected based on a core concepts approach and includes a sample of concepts which are indexed most frequently in common introductory psychology textbooks. In a…

  13. Adverse Outcome Pathways and Systems Biology as Conceptual Approaches to Support Development of 21st Century Test Methods and Extrapolation Tools

    EPA Science Inventory

    The proposed paradigm for “Toxicity Testing in the 21st Century” supports the development of mechanistically-based, high-throughput in vitro assays as a potential cost effective and scientifically-sound alternative to some whole animal hazard testing. To accomplish this long-term...

  14. Quantitative 3-d diagnostic ultrasound imaging using a modified transducer array and an automated image tracking technique.

    PubMed

    Hossack, John A; Sumanaweera, Thilaka S; Napel, Sandy; Ha, Jun S

    2002-08-01

    An approach for acquiring dimensionally accurate three-dimensional (3-D) ultrasound data from multiple 2-D image planes is presented. This is based on the use of a modified linear-phased array comprising a central imaging array that acquires multiple, essentially parallel, 2-D slices as the transducer is translated over the tissue of interest. Small, perpendicularly oriented, tracking arrays are integrally mounted on each end of the imaging transducer. As the transducer is translated in an elevational direction with respect to the central imaging array, the images obtained by the tracking arrays remain largely coplanar. The motion between successive tracking images is determined using a minimum sum of absolute difference (MSAD) image matching technique with subpixel matching resolution. An initial phantom scanning-based test of a prototype 8 MHz array indicates that linear dimensional accuracy of 4.6% (2 sigma) is achievable. This result compares favorably with those obtained using an assumed average velocity [31.5% (2 sigma) accuracy] and using an approach based on measuring image-to-image decorrelation [8.4% (2 sigma) accuracy]. The prototype array and imaging system were also tested in a clinical environment, and early results suggest that the approach has the potential to enable a low cost, rapid, screening method for detecting carotid artery stenosis. The average time for performing a screening test for carotid stenosis was reduced from an average of 45 minutes using 2-D duplex Doppler to 12 minutes using the new 3-D scanning approach.
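    A minimal sketch of the MSAD image-matching step is given below; it recovers only integer-pixel motion (the paper's subpixel refinement is omitted), and the block and search sizes are illustrative, not the system's actual parameters.

        # Minimum-sum-of-absolute-difference (MSAD) block matching between
        # two tracking images, as described above; a sketch, not the
        # system's implementation.
        import numpy as np

        def msad_shift(ref, cur, block=32, search=8):
            """Integer-pixel displacement of the central block of `ref`
            within `cur`, found by exhaustive MSAD search."""
            cy, cx = ref.shape[0] // 2, ref.shape[1] // 2
            h = block // 2
            patch = ref[cy - h:cy + h, cx - h:cx + h].astype(float)
            best, best_dy, best_dx = np.inf, 0, 0
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    cand = cur[cy - h + dy:cy + h + dy,
                               cx - h + dx:cx + h + dx].astype(float)
                    sad = np.abs(patch - cand).sum()
                    if sad < best:
                        best, best_dy, best_dx = sad, dy, dx
            return best_dy, best_dx

        # Synthetic demo: shift an image by (2, 3) pixels, recover the motion.
        rng = np.random.default_rng(0)
        img = rng.random((128, 128))
        shifted = np.roll(img, (2, 3), axis=(0, 1))
        print(msad_shift(img, shifted))   # -> (2, 3)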

  15. Propellant Readiness Level: A Methodological Approach to Propellant Characterization

    NASA Technical Reports Server (NTRS)

    Bossard, John A.; Rhys, Noah O.

    2010-01-01

    A methodological approach to defining propellant characterization is presented. The method is based on the well-established Technology Readiness Level nomenclature. This approach establishes the Propellant Readiness Level as a metric for ascertaining the readiness of a propellant or a propellant combination by evaluating the following set of propellant characteristics: thermodynamic data, toxicity, applications, combustion data, heat transfer data, material compatibility, analytical prediction modeling, injector/chamber geometry, pressurization, ignition, combustion stability, system storability, qualification testing, and flight capability. The methodology is meant to be applicable to all propellants or propellant combinations; liquid, solid, and gaseous propellants as well as monopropellants and propellant combinations are equally served. The functionality of the proposed approach is tested through the evaluation and comparison of an example set of hydrocarbon fuels.
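    One hypothetical reading of the checklist is sketched below; the characteristic names mirror the list in the abstract, but the fraction-based scoring is an invented stand-in, since the paper assigns readiness levels rather than fractions.

        # Sketch of a Propellant Readiness Level checklist (hypothetical
        # scoring; the characteristics are from the abstract above).
        CHARACTERISTICS = [
            "thermodynamic data", "toxicity", "applications", "combustion data",
            "heat transfer data", "material compatibility",
            "analytical prediction modeling", "injector/chamber geometry",
            "pressurization", "ignition", "combustion stability",
            "system storability", "qualification testing", "flight capability",
        ]

        def propellant_readiness(assessed: set[str]) -> float:
            """Fraction of characteristics with adequate data, a crude proxy
            for readiness (the paper assigns levels, not fractions)."""
            return len(assessed & set(CHARACTERISTICS)) / len(CHARACTERISTICS)

        print(propellant_readiness({"thermodynamic data", "toxicity", "ignition"}))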

  16. Quantitative Assessment of Commutability for Clinical Viral Load Testing Using a Digital PCR-Based Reference Standard

    PubMed Central

    Tang, L.; Sun, Y.; Buelow, D.; Gu, Z.; Caliendo, A. M.; Pounds, S.

    2016-01-01

    Given recent advances in the development of quantitative standards, particularly WHO international standards, efforts to better understand the commutability of reference materials have been made. Existing approaches to evaluating commutability include prediction intervals and correspondence analysis; however, the results obtained from existing approaches may be ambiguous. We have developed a “deviation-from-ideal” (DFI) approach to evaluate the commutability of standards and applied it to the assessment of Epstein-Barr virus (EBV) load testing in four quantitative PCR assays, treating digital PCR as a reference assay. We then discuss advantages and limitations of the DFI approach as well as experimental design to best evaluate the commutability of an assay in practice. PMID:27076654
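    The abstract does not define the DFI statistic itself, so the sketch below shows only one plausible reading of the idea: quantifying each assay's deviation from perfect agreement (the identity line) with the digital-PCR reference on the log10 scale. All values are made up.

        # Heavily hedged sketch: deviation of an assay from the identity
        # line against a digital-PCR reference; not the paper's DFI metric.
        import numpy as np

        def deviation_from_ideal(reference_log10, assay_log10):
            """Mean signed and mean absolute deviation from the identity line."""
            diff = np.asarray(assay_log10) - np.asarray(reference_log10)
            return diff.mean(), np.abs(diff).mean()

        ref = [2.1, 3.0, 4.2, 5.1]      # digital PCR, log10 copies/mL (made up)
        assay = [2.4, 3.2, 4.6, 5.6]    # one qPCR assay (made up)
        print(deviation_from_ideal(ref, assay))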

  17. GLISTRboost: Combining Multimodal MRI Segmentation, Registration, and Biophysical Tumor Growth Modeling with Gradient Boosting Machines for Glioma Segmentation.

    PubMed

    Bakas, Spyridon; Zeng, Ke; Sotiras, Aristeidis; Rathore, Saima; Akbari, Hamed; Gaonkar, Bilwaj; Rozycki, Martin; Pati, Sarthak; Davatzikos, Christos

    2016-01-01

    We present an approach for segmenting low- and high-grade gliomas in multimodal magnetic resonance imaging volumes. The proposed approach is based on a hybrid generative-discriminative model. Firstly, a generative approach based on an Expectation-Maximization framework that incorporates a glioma growth model is used to segment the brain scans into tumor, as well as healthy tissue labels. Secondly, a gradient boosting multi-class classification scheme is used to refine tumor labels based on information from multiple patients. Lastly, a probabilistic Bayesian strategy is employed to further refine and finalize the tumor segmentation based on patient-specific intensity statistics from the multiple modalities. We evaluated our approach in 186 cases during the training phase of the BRAin Tumor Segmentation (BRATS) 2015 challenge and report promising results. During the testing phase, the algorithm was additionally evaluated in 53 unseen cases, achieving the best performance among the competing methods.
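    A hedged sketch of the second (discriminative) stage follows: a gradient boosting classifier refining per-voxel labels from hypothetical features. It is not the GLISTRboost pipeline; the feature set, labels, and data are invented for illustration.

        # Sketch of gradient-boosting label refinement: per-voxel features
        # (e.g., modality intensities plus an EM posterior) re-classified
        # into tissue labels. Features and labels are hypothetical.
        import numpy as np
        from sklearn.ensemble import GradientBoostingClassifier

        rng = np.random.default_rng(0)
        n_voxels = 2000
        X = rng.random((n_voxels, 5))   # 4 modality intensities + EM posterior
        # Hypothetical labels: 0 = healthy, 1 = edema, 2 = enhancing tumor.
        y = (X[:, 4] > 0.6).astype(int) + (X[:, 0] > 0.8).astype(int)

        clf = GradientBoostingClassifier(n_estimators=100, max_depth=3)
        clf.fit(X[:1500], y[:1500])
        print("held-out accuracy:", clf.score(X[1500:], y[1500:]))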

  18. SSME fault monitoring and diagnosis expert system

    NASA Technical Reports Server (NTRS)

    Ali, Moonis; Norman, Arnold M.; Gupta, U. K.

    1989-01-01

    An expert system, called LEADER, has been designed and implemented for automatic learning, detection, identification, verification, and correction of anomalous propulsion system operations in real time. LEADER employs a set of sensors to monitor engine component performance and to detect, identify, and validate abnormalities with respect to varying engine dynamics and behavior. Two diagnostic approaches are adopted in the architecture of LEADER. In the first approach, fault diagnosis is performed through learning and identifying engine behavior patterns. Utilizing this approach, LEADER generates a few hypotheses about the possible abnormalities. These hypotheses are then validated based on the SSME design and functional knowledge. The second approach directs the processing of engine sensory data and performs reasoning based on the SSME design, functional knowledge, and deep-level knowledge, i.e., the first principles (physics and mechanics) of SSME subsystems and components. This paper describes LEADER's architecture, which integrates a design-based reasoning approach with neural network-based fault pattern matching techniques. The fault diagnosis results obtained through the analyses of SSME ground test data are presented and discussed.

  19. A Digital Image-Based Discrete Fracture Network Model and Its Numerical Investigation of Direct Shear Tests

    NASA Astrophysics Data System (ADS)

    Wang, Peitao; Cai, Meifeng; Ren, Fenhua; Li, Changhong; Yang, Tianhong

    2017-07-01

    This paper develops a numerical approach to determine the mechanical behavior of discrete fracture network (DFN) models based on digital image processing techniques and the particle flow code (PFC2D). A series of direct shear tests of jointed rocks was numerically performed to study the effect of normal stress, friction coefficient and joint bond strength on the mechanical behavior of jointed rock, and to evaluate the influence of micro-parameters on the shear properties of jointed rocks using the proposed approach. The complete shear stress-displacement curve of the DFN model under direct shear tests was presented to evaluate the failure processes of jointed rock. The results show that the peak and residual strengths are sensitive to normal stress. A higher normal stress has a greater effect on the initiation and propagation of cracks. Additionally, an increase in the bond strength ratio results in an increase in the number of both shear and normal cracks. The friction coefficient was also found to have a significant influence on the shear strength and shear cracks; increasing the friction coefficient reduced the initiation of normal cracks. The unique contribution of this paper is the proposed modeling technique to simulate the mechanical behavior of jointed rock masses based on particle mechanics approaches.

  20. A hierarchical fuzzy rule-based approach to aphasia diagnosis.

    PubMed

    Akbarzadeh-T, Mohammad-R; Moshtagh-Khorasani, Majid

    2007-10-01

    Aphasia diagnosis is a particularly challenging medical diagnostic task due to linguistic uncertainty and vagueness, inconsistencies in the definition of aphasic syndromes, the large number of imprecise measurements, and the natural diversity and subjectivity in test subjects as well as in the opinions of the experts who diagnose the disease. To efficiently address this diagnostic process, a hierarchical fuzzy rule-based structure is proposed here that considers the effect of different features of aphasia by statistical analysis in its construction. This approach can be efficient for the diagnosis of aphasia, and possibly other medical diagnostic applications, due to its fuzzy and hierarchical reasoning construction. Initially, the symptoms of the disease, each of which consists of different features, are analyzed statistically. The statistical parameters measured from the training set are then used to define membership functions and fuzzy rules. The resulting two-layered fuzzy rule-based system is then compared with a back-propagation feed-forward neural network for the diagnosis of four aphasia types: Anomic, Broca, Global and Wernicke. In order to reduce the number of required inputs, the technique is applied and compared on both comprehensive and spontaneous speech tests. Statistical t-test analysis confirms that the proposed approach uses fewer aphasia features while also presenting a significant improvement in terms of accuracy.
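    The flavor of a two-layer fuzzy rule-based classifier can be sketched as follows; the membership functions, score scales, and rules are invented for illustration and are not taken from the study.

        # Sketch of a two-layer fuzzy rule evaluation: layer 1 fuzzifies raw
        # test scores, layer 2 combines memberships with fuzzy AND (min).
        # All parameters and rules below are hypothetical.
        def triangular(x, a, b, c):
            """Triangular membership function peaking at b."""
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

        def diagnose(fluency, comprehension):
            # Layer 1: fuzzify (0-100 score scales assumed).
            low_flu = triangular(fluency, 0, 0, 50)
            low_comp = triangular(comprehension, 0, 0, 50)
            high_comp = triangular(comprehension, 50, 100, 100)
            # Layer 2: rules combine memberships with min (fuzzy AND).
            return {
                "Broca": min(low_flu, high_comp),      # non-fluent, comprehension spared
                "Wernicke": min(1 - low_flu, low_comp),
                "Global": min(low_flu, low_comp),
            }

        print(diagnose(fluency=20, comprehension=80))  # Broca scores highest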

  1. Evaluation of field methods for vertical high resolution aquifer characterization

    NASA Astrophysics Data System (ADS)

    Vienken, T.; Tinter, M.; Rogiers, B.; Leven, C.; Dietrich, P.

    2012-12-01

    The delineation and characterization of subsurface (hydro-)stratigraphic structures is one of the challenging tasks of hydrogeological site investigations. Knowledge of the spatial distribution of soil-specific properties and hydraulic conductivity (K) is a prerequisite for understanding flow and fluid transport processes. This is especially true for heterogeneous unconsolidated sedimentary deposits with a complex sedimentary architecture. One commonly used approach to investigate and characterize sediment heterogeneity is soil sampling and lab analysis, e.g. of grain size distribution. Tests conducted on 108 samples show that calculation of K based on grain size distribution is not suitable for high resolution aquifer characterization of highly heterogeneous sediments, due to sampling effects and large differences in calculated K values between the applied formulas (Vienken & Dietrich 2011). Therefore, extensive tests were conducted at two test sites under different geological conditions to evaluate the performance of innovative Direct Push (DP) based approaches for the vertical high resolution determination of K. Different DP based sensor probes for in-situ subsurface characterization based on electrical, hydraulic, and textural soil properties were used to obtain high resolution vertical profiles. The applied DP based tools proved to be a suitable and efficient alternative to traditional approaches. Despite resolution differences, all of the applied methods captured the main aquifer structure. Correlation of the DP based K estimates and proxies with DP based slug tests shows that it is possible to describe the aquifer hydraulic structure on a sub-meter scale by combining DP slug test data and continuous DP measurements. Even though correlations are site specific and appropriate DP tools must be chosen, DP is a reliable and efficient alternative for characterizing even strongly heterogeneous sites with complex structured sedimentary aquifers (Vienken et al. 2012). References: Vienken, T., Leven, C., and Dietrich, P. 2012. Use of CPT and other direct push methods for (hydro-) stratigraphic aquifer characterization — a field study. Canadian Geotechnical Journal, 49(2): 197-206. Vienken, T., and Dietrich, P. 2011. Field evaluation of methods for determining hydraulic conductivity from grain size data. Journal of Hydrology, 400(1-2): 58-71.

  2. [Does co-operation research provide approaches to explain the changes in the German hospital market?].

    PubMed

    Raible, C; Leidl, R

    2004-11-01

    The German hospital market faces an extensive process of consolidation, and in this changing environment hospitals consider co-operation as one possibility to improve competitiveness. The aim of this paper is to investigate whether theoretical approaches from co-operation research can explain the changes in the German hospital market. The aims and mechanisms of the theories, their relevance in terms of content, and their potential for empirical testing were used as criteria to assess the approaches, with current and future trends in the German hospital market providing the framework. Based on a literature review, six theoretical approaches were investigated: industrial organization, transaction cost theory, game theory, resource dependency, institutional theory, and co-operative investment and finance theory. In addition, the data needed to empirically test the theories were specified. As a general problem, some of the theoretical approaches presuppose a perfect market, a precondition not met by the heavily regulated German hospital market. Given the current regulations and the assessment criteria, the industrial organization, resource-dependency and institutional theory approaches showed the highest potential to explain various aspects of the changes in the hospital market. So far, none of the approaches investigated provides a comprehensive and empirically tested explanation of the changes in the German hospital market. However, some of the approaches provide a theoretical background for part of the changes. As this dynamic market is economically of high significance, there is a need for further development and empirical testing of relevant theoretical approaches.

  3. Writing in Chemistry: An Effective Learning Tool.

    ERIC Educational Resources Information Center

    Sherwood, Donna W.; Kovac, Jeffrey

    1999-01-01

    Presents some general strategies for using writing in chemistry courses based on experiences in developing a systematic approach to using writing as an effective learning tool in chemistry courses, and testing this approach in high-enrollment general chemistry courses at the University of Tennessee-Knoxville. Contains 18 references. (WRM)

  4. Learning "Number Sense" through Digital Games with Intrinsic Feedback

    ERIC Educational Resources Information Center

    Laurillard, Diana

    2016-01-01

    The paper proposes a new interdisciplinary approach to helping low attaining learners in basic mathematics. It reports on the research-informed design and user testing of an adaptive digital game based on constructionist tasks with intrinsic feedback. The approach uses findings from the neuroscience of dyscalculia, cognitive science research on…

  5. A Comparison of Trajectory Optimization Methods for the Impulsive Minimum Fuel Rendezvous Problem

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P.; Mailhe, Laurie M.; Guzman, Jose J.

    2002-01-01

    In this paper we present a comparison of optimization approaches to the minimum fuel rendezvous problem. Both indirect and direct methods are compared for a variety of test cases. The indirect approach is based on primer vector theory. The direct approaches are implemented numerically and include Sequential Quadratic Programming (SQP), Quasi-Newton, Simplex, Genetic Algorithms, and Simulated Annealing. Each method is applied to a variety of test cases including circular-to-circular coplanar orbits, LEO to GEO, and orbit phasing in highly elliptic orbits. We also compare different constrained optimization routines on complex orbit rendezvous problems with complicated, highly nonlinear constraints.
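    As a toy illustration of this kind of comparison, the sketch below runs scipy implementations of an SQP-type method, a quasi-Newton method, and the simplex method on the same smooth test function; the Rosenbrock function stands in for the (much harder) rendezvous cost, which is not reproduced here.

        # Toy comparison of direct optimization methods on one test problem.
        import numpy as np
        from scipy.optimize import minimize, rosen

        x0 = np.array([-1.2, 1.0])
        for method in ("SLSQP", "BFGS", "Nelder-Mead"):
            res = minimize(rosen, x0, method=method)
            print(f"{method:12s} f* = {res.fun:.2e}  evals = {res.nfev}")

    scipy's differential_evolution and dual_annealing would play the role of the genetic-algorithm and simulated-annealing entries in such a comparison.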

  6. Experience With Direct-to-Patient Recruitment for Enrollment Into a Clinical Trial in a Rare Disease: A Web-Based Study

    PubMed Central

    2017-01-01

    Background: The target sample size for clinical trials often necessitates a multicenter (center of excellence, CoE) approach with associated added complexity, cost, and regulatory requirements. Alternative recruitment strategies need to be tested against this standard model. Objectives: The aim of our study was to test whether a Web-based direct recruitment approach (patient-centric, PC) using social marketing strategies provides a viable option to the CoE recruitment method. Methods: PC recruitment and Web-based informed consent were compared with CoE recruitment for a randomized controlled trial (RCT) of continuing versus stopping low-dose prednisone for maintenance of remission of patients with granulomatosis with polyangiitis (GPA). Results: The PC approach was not as successful as the CoE approach. Enrollment of those confirmed eligible by their physician was 10 of 13 (77%) and 49 of 51 (96%) in the PC and CoE arms, respectively (P=.05). The two approaches were not significantly different in terms of eligibility, with 34% of potential participants in the CoE arm found to be ineligible as compared with 22% in the PC arm (P=.11), nor in provider acceptance, 22% versus 26% (P=.78). There was no difference in the understanding of the trial as reflected in the knowledge surveys of individuals in the PC and CoE arms. Conclusions: PC recruitment was substantially less successful than that achieved by the CoE approach. However, the PC approach was good at confirming eligibility and was as acceptable to providers and as understandable to patients as the CoE approach. The PC approach should be evaluated in other clinical settings to get a better sense of its potential. PMID:28246067

  7. Evaluation of image quality

    NASA Technical Reports Server (NTRS)

    Pavel, M.

    1993-01-01

    This presentation outlines in viewgraph format a general approach to the evaluation of display system quality for aviation applications. This approach is based on the assumption that it is possible to develop a model of the display which captures most of the significant properties of the display. The display characteristics should include spatial and temporal resolution, intensity quantizing effects, spatial sampling, delays, etc. The model must be sufficiently well specified to permit generation of stimuli that simulate the output of the display system. The first step in the evaluation of display quality is an analysis of the tasks to be performed using the display. Thus, for example, if a display is used by a pilot during a final approach, the aesthetic aspects of the display may be less relevant than its dynamic characteristics. The opposite task requirements may apply to imaging systems used for displaying navigation charts. Thus, display quality is defined with regard to one or more tasks. Given a set of relevant tasks, there are many ways to approach display evaluation. The range of evaluation approaches includes visual inspection, rapid evaluation, part-task simulation, and full mission simulation. The work described is focused on two complementary approaches to rapid evaluation. The first approach is based on a model of the human visual system. A model of the human visual system is used to predict the performance of the selected tasks. The model-based evaluation approach permits very rapid and inexpensive evaluation of various design decisions. The second rapid evaluation approach employs specifically designed critical tests that embody many important characteristics of actual tasks. These are used in situations where a validated model is not available. These rapid evaluation tests are being implemented in a workstation environment.

  8. A novel feature extraction approach for microarray data based on multi-algorithm fusion

    PubMed Central

    Jiang, Zhu; Xu, Rong

    2015-01-01

    Feature extraction is one of the most important and effective methods for reducing dimensionality in data mining, particularly with the emergence of high-dimensional data such as microarray gene expression data. Feature extraction for gene selection mainly serves two purposes. One is to identify certain disease-related genes. The other is to find a compact set of discriminative genes to build a pattern classifier with reduced complexity and improved generalization capabilities. Depending on the purpose of gene selection, two types of feature extraction algorithms, ranking-based feature extraction and set-based feature extraction, are employed in microarray gene expression data analysis. In ranking-based feature extraction, features are evaluated on an individual basis, generally without considering inter-relationships between features, whereas set-based feature extraction evaluates features based on their role in a feature set, taking into account dependency between features. Like learning methods, feature extraction has a problem in its generalization ability, namely robustness; however, the issue of robustness is often overlooked in feature extraction. In order to improve the accuracy and robustness of feature extraction for microarray data, a novel approach based on multi-algorithm fusion is proposed. By fusing different types of feature extraction algorithms to select features from the sample set, the proposed approach is able to improve feature extraction performance. The new approach is tested on gene expression datasets including colon cancer, CNS, DLBCL, and leukemia data. The testing results show that the performance of this algorithm is better than existing solutions. PMID:25780277
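    The fusion idea can be sketched with scikit-learn's ranking-based selectors; averaging ranks is a simple stand-in for the paper's fusion scheme, and the synthetic data are illustrative.

        # Sketch of fusing two ranking-based feature selectors by mean rank.
        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.feature_selection import f_classif, mutual_info_classif

        X, y = make_classification(n_samples=100, n_features=50,
                                   n_informative=5, random_state=0)

        f_scores, _ = f_classif(X, y)                      # ANOVA F ranking
        mi_scores = mutual_info_classif(X, y, random_state=0)  # MI ranking

        def ranks(scores):
            """Rank features by score, 0 = best."""
            return np.argsort(np.argsort(-scores))

        fused = (ranks(f_scores) + ranks(mi_scores)) / 2.0
        print("top features by fused rank:", np.argsort(fused)[:10])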

  9. Robust Observation Detection for Single Object Tracking: Deterministic and Probabilistic Patch-Based Approaches

    PubMed Central

    Zulkifley, Mohd Asyraf; Rawlinson, David; Moran, Bill

    2012-01-01

    In video analytics, robust observation detection is very important as the content of videos varies a lot, especially for tracking implementations. In contrast to the image processing field, problems of blurring, moderate deformation, low-illumination surroundings, illumination change and homogeneous texture are commonly encountered in video analytics. Patch-Based Observation Detection (PBOD) is developed to improve detection robustness in complex scenes by fusing both feature- and template-based recognition methods. While feature-based detectors are more distinctive, matching between frames is best achieved by a collection of points, as in template-based detectors. Two methods of PBOD, the deterministic and probabilistic approaches, have been tested to find the best mode of detection. Both algorithms start by building comparison vectors at each detected point of interest. The vectors are matched to build candidate patches based on their respective coordinates. For the deterministic method, patch matching is done in a 2-level test where threshold-based position and size smoothing are applied to the patch with the highest correlation value. For the second approach, patch matching is done probabilistically by modelling the histograms of the patches with Poisson distributions for both RGB and HSV colour models. Then, maximum likelihood is applied for position smoothing while a Bayesian approach is applied for size smoothing. The results showed that probabilistic PBOD outperforms the deterministic approach, with an average distance error of 10.03% compared with 21.03%. Due to its heavy processing requirements, this algorithm is best implemented as a complement to other, simpler detection methods. PMID:23202226

  10. Implementation of Alternative Test Strategies for the Safety Assessment of Engineered Nanomaterials

    PubMed Central

    Nel, Andre

    2014-01-01

    Nanotechnology introduces a new field that requires novel approaches and methods for hazard and risk assessment. For an appropriate scientific platform for safety assessment, nanoscale properties and functions of engineered nanomaterials (ENMs), including how the physicochemical properties of the materials relate to mechanisms of injury at the nano-bio interface, must be considered. Moreover, this rapidly advancing new field requires novel test strategies that allow multiple toxicants to be screened in robust, mechanism-based assays in which the bulk of the investigation can be carried out at the cellular and biomolecular level, while animal use remains limited and is based on the contribution of toxicological pathways to the pathophysiology of disease. First, a predictive toxicological approach for the safety assessment of ENMs will be discussed against the background of a ‘21st-century vision’ for using alternative test strategies (ATSs) to perform toxicological assessment of large numbers of untested chemicals, thereby reducing a backlog that could otherwise become a problem for nanotechnology. An ATS is defined here as an alternative/reduction alternative to traditional animal testing. Secondly, the approach of selecting pathways of toxicity to screen for the pulmonary hazard potential of carbon nanotubes and metal oxides will be discussed, as well as how to use these pathways to perform high-content or high-throughput testing and how the data can be used for hazard ranking, risk assessment, regulatory decision-making and ‘safer-by-design’ strategies. Finally, the utility and disadvantages of this predictive toxicological approach to ENM safety assessment, and how it can assist the 21st-century vision, will be addressed. PMID:23879741

  11. Sentence Recognition Prediction for Hearing-impaired Listeners in Stationary and Fluctuation Noise With FADE: Empowering the Attenuation and Distortion Concept by Plomp With a Quantitative Processing Model.

    PubMed

    Kollmeier, Birger; Schädler, Marc René; Warzybok, Anna; Meyer, Bernd T; Brand, Thomas

    2016-09-07

    To characterize an individual patient's hearing impairment as obtained with the matrix sentence recognition test, a simulation Framework for Auditory Discrimination Experiments (FADE) is extended here using the Attenuation and Distortion (A+D) approach by Plomp as a blueprint for setting the individual processing parameters. FADE has been shown to predict the outcome of both speech recognition tests and psychoacoustic experiments based on simulations using an automatic speech recognition system, requiring only a few assumptions. It builds on the closed-set matrix sentence recognition test, which is advantageous for testing individual speech recognition in a way comparable across languages. Individual predictions of speech recognition thresholds in stationary and in fluctuating noise were derived using the audiogram and an estimate of the internal level uncertainty to model the individual Plomp curves fitted to the data with the Attenuation (A-) and Distortion (D-) parameters of the Plomp approach. The "typical" audiogram shapes from Bisgaard et al, with or without a "typical" level uncertainty, and the individual data were used for individual predictions. As a result, the individualization of the level uncertainty was found to be more important than the exact shape of the individual audiogram for accurately modeling the outcome of the German Matrix test in stationary or fluctuating noise for listeners with hearing impairment. The prediction accuracy of the individualized approach also outperforms the (modified) Speech Intelligibility Index approach, which is based on the individual threshold data only. © The Author(s) 2016.

  12. Nucleic acids-based tools for ballast water surveillance, monitoring, and research

    NASA Astrophysics Data System (ADS)

    Darling, John A.; Frederick, Raymond M.

    2018-03-01

    Understanding the risks of biological invasion posed by ballast water, whether in the context of compliance testing, routine monitoring, or basic research, is fundamentally an exercise in biodiversity assessment, and as such should take advantage of the best tools available for tackling that problem. The past several decades have seen growing application of genetic methods for the study of biodiversity, driven in large part by dramatic technological advances in nucleic acids analysis. Monitoring approaches based on such methods have the potential to increase dramatically the sampling throughput of biodiversity assessments, and to improve on the sensitivity, specificity, and taxonomic accuracy of traditional approaches. The application of targeted detection tools (largely focused on PCR but increasingly incorporating novel probe-based methodologies) has led to a paradigm shift in rare species monitoring, and such tools have already been applied for early detection in the context of ballast water surveillance. Rapid improvements in community profiling approaches based on high throughput sequencing (HTS) could similarly impact broader efforts to catalogue the biodiversity present in ballast tanks, and could provide novel opportunities to better understand the risks of biotic exchange posed by ballast water transport, and the effectiveness of attempts to mitigate those risks. These various approaches still face considerable challenges to effective implementation, depending on particular management or research needs. Compliance testing, for instance, remains dependent on accurate quantification of viable target organisms; while tools based on RNA detection show promise in this context, the demands of such testing require considerable additional investment in methods development. In general surveillance and research contexts, both targeted and community-based approaches are still limited by various factors: quantification remains a challenge (especially for taxa in larger size classes), gaps in nucleic acids reference databases are still considerable, uncertainties in taxonomic assignment methods persist, and many applications have not yet matured sufficiently to offer standardized methods capable of meeting rigorous quality assurance standards. Nevertheless, the potential value of these tools, their growing utilization in biodiversity monitoring, and the rapid methodological advances of the past decade all suggest that they should be seriously considered for inclusion in the ballast water surveillance toolkit.

  13. A comparative review of methods for comparing means using partially paired data.

    PubMed

    Guo, Beibei; Yuan, Ying

    2017-06-01

    In medical experiments with the objective of testing the equality of two means, data are often partially paired by design or because of missing data. The partially paired data represent a combination of paired and unpaired observations. In this article, we review and compare nine methods for analyzing partially paired data, including the two-sample t-test, paired t-test, corrected z-test, weighted t-test, pooled t-test, optimal pooled t-test, multiple imputation method, mixed model approach, and the test based on a modified maximum likelihood estimate. We compare the performance of these methods through extensive simulation studies that cover a wide range of scenarios with different effect sizes, sample sizes, and correlations between the paired variables, as well as true underlying distributions. The simulation results suggest that when the sample size is moderate, the test based on the modified maximum likelihood estimator is generally superior to the other approaches when the data is normally distributed and the optimal pooled t-test performs the best when the data is not normally distributed, with well-controlled type I error rates and high statistical power; when the sample size is small, the optimal pooled t-test is to be recommended when both variables have missing data and the paired t-test is to be recommended when only one variable has missing data.
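    Two of the simplest methods in the comparison can be demonstrated directly with scipy on simulated data; the more sophisticated tests reviewed by the authors (optimal pooled t-test, modified-MLE test) are not in scipy and are not implemented here.

        # Sketch contrasting two naive analyses of partially paired data.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(42)
        n_pairs, n_extra = 30, 20
        x_paired = rng.normal(0.0, 1.0, n_pairs)
        y_paired = 0.6 * x_paired + rng.normal(0.5, 0.8, n_pairs)  # correlated
        x_only = rng.normal(0.0, 1.0, n_extra)   # unpaired x observations
        y_only = rng.normal(0.5, 1.0, n_extra)   # unpaired y observations

        # Paired t-test: uses complete pairs only, discards unpaired data.
        print(stats.ttest_rel(x_paired, y_paired))
        # Two-sample t-test: uses everything but ignores the pairing.
        print(stats.ttest_ind(np.r_[x_paired, x_only], np.r_[y_paired, y_only]))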

  14. Extending Theory-Based Quantitative Predictions to New Health Behaviors.

    PubMed

    Brick, Leslie Ann D; Velicer, Wayne F; Redding, Colleen A; Rossi, Joseph S; Prochaska, James O

    2016-04-01

    Traditional null hypothesis significance testing suffers many limitations and is poorly adapted to theory testing. A proposed alternative approach, called Testing Theory-based Quantitative Predictions, uses effect size estimates and confidence intervals to directly test predictions based on theory. This paper replicates findings from previous smoking studies and extends the approach to diet and sun protection behaviors using baseline data from a Transtheoretical Model behavioral intervention (N = 5407). Effect size predictions were developed using two methods: (1) applying refined effect size estimates from previous smoking research or (2) using predictions developed by an expert panel. Thirteen of 15 predictions were confirmed for smoking. For diet, 7 of 14 predictions were confirmed using smoking predictions and 6 of 16 using expert panel predictions. For sun protection, 3 of 11 predictions were confirmed using smoking predictions and 5 of 19 using expert panel predictions. Expert panel predictions and smoking-based predictions poorly predicted effect sizes for diet and sun protection constructs. Future studies should aim to use previous empirical data to generate predictions whenever possible. The best results occur when there have been several iterations of predictions for a behavior, such as with smoking, demonstrating that expected values begin to converge on the population effect size. Overall, the study supports necessity in strengthening and revising theory with empirical data.

  15. E-Beam Capture Aid Drawing Based Modelling on Cell Biology

    NASA Astrophysics Data System (ADS)

    Hidayat, T.; Rahmat, A.; Redjeki, S.; Rahman, T.

    2017-09-01

    The objective of this research is to find out how far a drawing-based modeling approach assisted with E-Beam Capture can support students' scientific reasoning skills. The research design used is a pre-test and post-test design. Data on scientific reasoning skills were collected by giving multiple choice questions before and after the lesson and analyzed using a scientific reasoning assessment rubric. The results show an improvement in students' scientific reasoning on every indicator: 2 students achieved high scores in generativity, 3 in elaboration reasoning, 4 in justification, 3 in explanation, 3 in logic coherency, and 2 in synthesis. Explanation reasoning had the highest number of students with high scores (20 students in the pre-test and 23 in the post-test), and synthesis reasoning the lowest (1 student in the pre-test and 3 in the post-test). The research concludes that the drawing-based modeling approach assisted with E-Beam Capture could not yet support students' scientific reasoning skills comprehensively.

  16. Enzymatic testing sensitivity, variability and practical diagnostic algorithm for pyruvate dehydrogenase complex (PDC) deficiency.

    PubMed

    Shin, Ha Kyung; Grahame, George; McCandless, Shawn E; Kerr, Douglas S; Bedoyan, Jirair K

    2017-11-01

    Pyruvate dehydrogenase complex (PDC) deficiency is a major cause of primary lactic acidemia in children. Prompt and correct diagnosis of PDC deficiency, and differentiating between specific vs generalized, or secondary, deficiencies, has important implications for clinical management and therapeutic interventions. Both genetic and enzymatic testing approaches are being used in the diagnosis of PDC deficiency. However, the diagnostic efficacy of such testing approaches for individuals affected with PDC deficiency has not been systematically investigated in this disorder. We sought to evaluate the diagnostic sensitivity and variability of the various PDC enzyme assays in females and males at the Center for Inherited Disorders of Energy Metabolism (CIDEM). CIDEM data were filtered by lactic acidosis and functional PDC deficiency in at least one cell/tissue type (blood lymphocytes, cultured fibroblasts or skeletal muscle), identifying 186 subjects (51% male and 49% female); about half were genetically resolved, with 78% of those determined to have a pathogenic PDHA1 mutation. Assaying PDC in cultured fibroblasts in cases where the underlying genetic etiology is PDHA1 was highly sensitive irrespective of gender: 97% (95% confidence interval [CI]: 90%-100%) and 91% (95% CI: 82%-100%) in females and males, respectively. In contrast to the fibroblast-based testing, the lymphocyte- and muscle-based testing were not sensitive (36% [95% CI: 11%-61%, p=0.0003] and 58% [95% CI: 30%-86%, p=0.014], respectively) for identifying known PDC-deficient females with pathogenic PDHA1 mutations. In males with a known PDHA1 mutation, the sensitivities of the various cell/tissue assays (75% lymphocyte, 91% fibroblast and 88% muscle) were not statistically different, and the discordance frequency due to the specific cell/tissue used for assaying PDC was 0.15±0.11. Based on these data, a practical diagnostic algorithm is proposed, accounting for current molecular approaches, enzyme testing sensitivity, and variability due to gender, the cell/tissue type used for testing, and successive repeat testing. Copyright © 2017 Elsevier Inc. All rights reserved.

  17. Provider-Initiated HIV Testing for Migrants in Spain: A Qualitative Study with Health Care Workers and Foreign-Born Sexual Minorities

    PubMed Central

    Navaza, Barbara; Abarca, Bruno; Bisoffi, Federico; Pool, Robert; Roura, Maria

    2016-01-01

    Introduction: Provider-initiated HIV testing (PITC) is increasingly adopted in Europe. The success of the approach at identifying new HIV cases relies on its effectiveness at testing the individuals most at risk. However, its suitability for reaching populations facing overlapping vulnerabilities is under-researched. This qualitative study examined HIV testing experiences and perceptions amongst Latin-American migrant men who have sex with men and transgender females in Spain, as well as health professionals' experiences offering HIV tests to migrants in Barcelona and Madrid. Methods: We conducted 32 in-depth interviews and 8 discussion groups with 38 Latin-American migrants and 21 health professionals. We imported verbatim transcripts and detailed fieldwork notes into the qualitative software package Nvivo-10 and applied a coding framework to all data to examine systematically different HIV testing dimensions and modalities. The dimensions analysed were based on the World Health Organization "5 Cs" principles: Consent, Counselling, Connection to treatment, Correctness of results and Confidentiality. Results: Health professionals reported that PITC was conceptually acceptable to them, although their perceived inability to adequately communicate HIV+ results and the resulting bottlenecks in the flow of care were recurrent concerns. Endorsement of and adherence to the principles underpinning the rights-based response to HIV varied widely across health settings. The offer of an HIV test during routine consultations was generally appreciated by users as a way of avoiding the embarrassment of asking for it. Several participants deemed compulsory testing acceptable on public health grounds. In spite of, and sometimes because of, partial endorsement of rights-based approaches, PITC was acceptable in a population with high levels of internalised stigma. Conclusion: PITC is a promising approach to reach sexual minority migrants who hold high levels of internalised stigma, but explicit extra efforts are needed to safeguard the rights of the most vulnerable. PMID:26914023

  18. A Web-Based Learning System for Software Test Professionals

    ERIC Educational Resources Information Center

    Wang, Minhong; Jia, Haiyang; Sugumaran, V.; Ran, Weijia; Liao, Jian

    2011-01-01

    Fierce competition, globalization, and technology innovation have forced software companies to search for new ways to improve competitive advantage. Web-based learning is increasingly being used by software companies as an emergent approach for enhancing the skills of knowledge workers. However, the current practice of Web-based learning is…

  19. Nurses' maths: researching a practical approach.

    PubMed

    Wilson, Ann

    The aim of this study was to compare a new practical maths test with a written maths test. The tests were undertaken by qualified nurses training for intravenous drug administration, a skill dependent on maths accuracy. The literature showed that the higher education institutes (HEIs) that provide nurse training use traditional maths tests; a practical way of testing maths had not been described. Fifty-five nurses undertook two maths tests based on intravenous drug calculations. One was a traditional written test; the second was a new type of test using a simulated clinical environment. All participants were also interviewed one week later to ascertain their thoughts and feelings about the tests. There was a significant improvement in maths test scores for those nurses who took the practical maths test first. It is suggested that this is because it improved their conceptualisation skills and thus helped them to achieve accuracy in their calculations. Written maths tests are not the best way to help and support nurses in acquiring and improving their maths skills and should be replaced by a more practical approach.

  1. Kinetic approach to degradation mechanisms in polymer solar cells and their accurate lifetime predictions

    NASA Astrophysics Data System (ADS)

    Arshad, Muhammad Azeem; Maaroufi, AbdelKrim

    2018-07-01

    This study makes a beginning on accurate lifetime prediction for polymer solar cells. Certain reservations about the conventionally employed temperature-accelerated lifetime measurement test, and its unsuitability for predicting reliable lifetimes of polymer solar cells, are brought to light. Critical issues concerning accelerated lifetime testing include assuming a reaction mechanism instead of determining it, and relying solely on the temperature acceleration of a single material property. An advanced approach comprising a set of theoretical models for estimating accurate lifetimes of polymer solar cells is therefore suggested as a suitable alternative to accelerated lifetime testing. This approach takes into account systematic kinetic modeling of the various possible polymer degradation mechanisms under natural weathering conditions. The proposed kinetic approach is substantiated by applying it to experimental aging data-sets of polymer solar materials and solar cells including a P3HT polymer film, bulk heterojunction (MDMO-PPV:PCBM) cells and dye-sensitized solar cells. Based on the suggested approach, an efficacious lifetime determination formula for polymer solar cells is derived and tested on dye-sensitized solar cells. Some important merits of the proposed method are also pointed out and its prospective applications are discussed.
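    The kinetic-modeling idea can be sketched as a curve fit: below, a single first-order decay (one assumed mechanism among the several candidate mechanisms such a study would consider) is fitted to made-up normalized efficiency data, and a T80 lifetime is read off.

        # Sketch: fit one first-order degradation mechanism and estimate T80
        # (time to 80% of initial performance). Data and model are illustrative.
        import numpy as np
        from scipy.optimize import curve_fit

        t = np.array([0, 100, 250, 500, 800, 1200])             # hours (made up)
        eff = np.array([1.00, 0.96, 0.90, 0.82, 0.74, 0.65])    # normalized PCE

        def first_order(t, k):
            return np.exp(-k * t)

        (k_fit,), _ = curve_fit(first_order, t, eff, p0=[1e-3])
        t80 = np.log(1 / 0.8) / k_fit
        print(f"k = {k_fit:.2e} 1/h, T80 = {t80:.0f} h")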

  2. DNA-Based Methods in the Immunohematology Reference Laboratory

    PubMed Central

    Denomme, Gregory A

    2010-01-01

    Although hemagglutination serves the immunohematology reference laboratory well, when used alone, it has limited capability to resolve complex problems. This overview discusses how molecular approaches can be used in the immunohematology reference laboratory. In order to apply molecular approaches to immunohematology, knowledge of genes, DNA-based methods, and the molecular bases of blood groups are required. When applied correctly, DNA-based methods can predict blood groups to resolve ABO/Rh discrepancies, identify variant alleles, and screen donors for antigen-negative units. DNA-based testing in immunohematology is a valuable tool used to resolve blood group incompatibilities and to support patients in their transfusion needs. PMID:21257350

  3. BIM-Based E-Procurement: An Innovative Approach to Construction E-Procurement

    PubMed Central

    2015-01-01

    This paper presents an innovative approach to e-procurement in construction, which uses building information models (BIM) to support the construction procurement process. The result is an integrated and electronic instrument connected to a rich knowledge base capable of advanced operations and able to strengthen transaction relationships and collaboration throughout the supply chain. The BIM-based e-procurement prototype has been developed using distinct existing electronic solutions and an IFC server and was tested in a pilot case study, which supported further discussions of the results of the research. PMID:26090518

  4. Hypothesis testing in functional linear regression models with Neyman's truncation and wavelet thresholding for longitudinal data.

    PubMed

    Yang, Xiaowei; Nie, Kun

    2008-03-15

    Longitudinal data sets in biomedical research often consist of large numbers of repeated measures. In many cases, the trajectories do not look globally linear or polynomial, making it difficult to summarize the data or test hypotheses using standard longitudinal data analysis based on various linear models. An alternative is to apply the methods of functional data analysis, which directly target the continuous nonlinear curves underlying discretely sampled repeated measures. For the purposes of data exploration, many functional data analysis strategies have been developed based on various schemes of smoothing, but fewer options are available for making causal inferences regarding predictor-outcome relationships, a common task seen in hypothesis-driven medical studies. To compare groups of curves, two testing strategies with good power have been proposed for high-dimensional analysis of variance: the Fourier-based adaptive Neyman test and the wavelet-based thresholding test. Using a smoking cessation clinical trial data set, this paper demonstrates how to extend the strategies for hypothesis testing into the framework of functional linear regression models (FLRMs) with continuous functional responses and categorical or continuous scalar predictors. The analysis procedure consists of three steps: first, apply the Fourier or wavelet transform to the original repeated measures; then fit a multivariate linear model in the transformed domain; and finally, test the regression coefficients using either adaptive Neyman or thresholding statistics. Since an FLRM can be viewed as a natural extension of the traditional multiple linear regression model, the development of this model and computational tools should enhance the capacity of medical statistics for longitudinal data.
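
    As a rough illustration of the three-step procedure, the following sketch implements a simplified two-sample version: Fourier-transform synthetic curves, standardize the coefficient-wise group differences, and compute the adaptive Neyman statistic by maximizing the normalized partial sums. All data and the group structure are invented; the authors' full FLRM machinery and the wavelet variant are not reproduced:

    ```python
    # Simplified two-sample sketch of the Fourier-based adaptive Neyman test.
    import numpy as np

    rng = np.random.default_rng(0)
    m = 64                                            # time points per curve
    g1 = rng.normal(0.0, 1.0, size=(30, m))           # group 1 curves
    g2 = rng.normal(0.0, 1.0, size=(25, m)) + 0.3 * np.sin(np.linspace(0, np.pi, m))

    def to_fourier(x):
        """Stack real and imaginary rFFT parts as real-valued coefficients."""
        f = np.fft.rfft(x, axis=1) / x.shape[1]
        return np.hstack([f.real, f.imag])

    c1, c2 = to_fourier(g1), to_fourier(g2)
    se = np.sqrt(c1.var(axis=0, ddof=1) / len(c1) + c2.var(axis=0, ddof=1) / len(c2))
    z = (c1.mean(axis=0) - c2.mean(axis=0)) / np.where(se > 0, se, np.inf)

    # Adaptive Neyman statistic: max over k of (1/sqrt(2k)) * sum_{j<=k} (z_j^2 - 1)
    k = np.arange(1, len(z) + 1)
    t_an = np.max(np.cumsum(z**2 - 1) / np.sqrt(2 * k))
    print(f"adaptive Neyman statistic: {t_an:.2f}")
    ```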

  5. Testing Surrogacy Assumptions: Can Threatened and Endangered Plants Be Grouped by Biological Similarity and Abundances?

    PubMed Central

    Che-Castaldo, Judy P.; Neel, Maile C.

    2012-01-01

    There is renewed interest in implementing surrogate species approaches in conservation planning due to the large number of species in need of management but limited resources and data. One type of surrogate approach involves selection of one or a few species to represent a larger group of species requiring similar management actions, so that protection and persistence of the selected species would result in conservation of the group of species. However, among the criticisms of surrogate approaches is the need to test underlying assumptions, which remain rarely examined. In this study, we tested one of the fundamental assumptions underlying use of surrogate species in recovery planning: that there exist groups of threatened and endangered species that are sufficiently similar to warrant similar management or recovery criteria. Using a comprehensive database of all plant species listed under the U.S. Endangered Species Act and tree-based random forest analysis, we found no evidence of species groups based on a set of distributional and biological traits or by abundances and patterns of decline. Our results suggested that application of surrogate approaches for endangered species recovery would be unjustified. Thus, conservation planning focused on individual species and their patterns of decline will likely be required to recover listed species. PMID:23240051
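
    For readers unfamiliar with tree-based grouping, the sketch below outlines the generic unsupervised random-forest clustering recipe (classify real data against a column-permuted copy, derive a leaf-sharing proximity matrix, then cluster it). The trait matrix is synthetic and this is only a sketch of the general technique, not the authors' pipeline:

    ```python
    # Generic unsupervised random-forest clustering, with synthetic traits.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from scipy.cluster.hierarchy import linkage, fcluster

    rng = np.random.default_rng(1)
    traits = rng.normal(size=(100, 6))    # hypothetical species x trait matrix

    # Synthetic contrast: permute each column to destroy trait correlations.
    fake = np.column_stack([rng.permutation(col) for col in traits.T])
    X = np.vstack([traits, fake])
    y = np.r_[np.ones(len(traits)), np.zeros(len(fake))]

    rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X, y)

    # Proximity: fraction of trees in which two real observations share a leaf.
    leaves = rf.apply(traits)             # shape (n_species, n_trees)
    prox = (leaves[:, None, :] == leaves[None, :, :]).mean(axis=2)

    # Cluster on 1 - proximity; well-separated groups would support surrogacy.
    dist = 1 - prox[np.triu_indices(len(traits), 1)]
    clusters = fcluster(linkage(dist, method="average"), t=2, criterion="maxclust")
    print(np.bincount(clusters))
    ```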

  7. Team-based Learning Strategy in Biochemistry: Perceptions and Attitudes of Faculty and 1st-Year Medical Students.

    PubMed

    Chhabra, Namrata; Kukreja, Sahiba; Chhabra, Sarah; Chhabra, Sahil; Khodabux, Sameenah; Sabane, Harshal

    2017-12-01

    Team-based learning (TBL) strategy has been widely adopted by medical schools all over the world, but reports regarding the perceptions and attitudes of faculty and undergraduate medical students towards the TBL approach have been conflicting. The study aimed to introduce the TBL strategy into the Biochemistry curriculum after evaluating its effectiveness through the perceptions and attitudes of faculty and 1st-year medical students. One hundred and fifty students of first professional M.B.B.S. and five faculty members participated in the study. Their responses regarding perceptions and attitudes towards the TBL strategy were collected using structured questionnaires, focus group discussions, and in-depth interviews. Data were analyzed using the Wilcoxon signed-rank test, paired-sample t-test, and Mann-Whitney U-test. The majority of the students expressed satisfaction with the team approach and reported improvement in academic scores, learning styles, and the development of problem-solving, interpersonal, and professional skills. The faculty, however, recommended a modified TBL approach to benefit all sections of the students for the overall success of this intervention. TBL is an effective technique to enable students to master core concepts and develop professional and critical thinking skills; however, for 1st-year medical students, a modified TBL approach might be more appropriate for effective outcomes.
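
    For reference, the tests named above are all available in SciPy; the sketch below applies them to hypothetical paired score data (the study's actual data are not reproduced):

    ```python
    # The three tests named in the abstract, on invented score data.
    import numpy as np
    from scipy.stats import wilcoxon, ttest_rel, mannwhitneyu

    rng = np.random.default_rng(42)
    pre = rng.normal(60, 10, size=150)         # scores before TBL sessions
    post = pre + rng.normal(5, 8, size=150)    # paired scores after TBL

    print(wilcoxon(pre, post))                 # paired, non-parametric
    print(ttest_rel(pre, post))                # paired-sample t-test
    # Mann-Whitney U compares two independent groups, e.g. two sections:
    print(mannwhitneyu(post[:80], post[80:]))
    ```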

  8. Assessing contaminant sensitivity of endangered and threatened aquatic species: Part I. Acute toxicity of five chemicals

    USGS Publications Warehouse

    Dwyer, F.J.; Mayer, F.L.; Sappington, L.C.; Buckler, D.R.; Bridges, C.M.; Greer, I.E.; Hardesty, D.K.; Henke, C.E.; Ingersoll, C.G.; Kunz, J.L.; Whites, D.W.; Augspurger, T.; Mount, D.R.; Hattala, K.; Neuderfer, G.N.

    2005-01-01

    Assessment of contaminant impacts to federally identified endangered, threatened and candidate, and state-identified endangered species (collectively referred to as "listed" species) requires understanding of a species' sensitivities to particular chemicals. The most direct approach would be to determine the sensitivity of a listed species to a particular contaminant or perturbation. An indirect approach for aquatic species would be application of toxicity data obtained from standard test procedures and species commonly used in laboratory toxicity tests. Common test species (fathead minnow, Pimephales promelas; sheepshead minnow, Cyprinodon variegatus; and rainbow trout, Oncorhynchus mykiss) and 17 listed or closely related species were tested in acute 96-hour water exposures with five chemicals (carbaryl, copper, 4-nonylphenol, pentachlorophenol, and permethrin) representing a broad range of toxic modes of action. No single species was the most sensitive to all chemicals. For the three standard test species evaluated, the rainbow trout was more sensitive than either the fathead minnow or sheepshead minnow and was equal to or more sensitive than listed and related species 81% of the time. To estimate an LC50 for a listed species, a factor of 0.63 can be applied to the geometric mean LC50 of rainbow trout toxicity data, and more conservative factors can be determined using variance estimates (0.46 based on 1 SD of the mean and 0.33 based on 2 SD of the mean). Additionally, a low- or no-acute effect concentration can be estimated by multiplying the respective LC50 by a factor of approximately 0.56, which supports the United States Environmental Protection Agency approach of multiplying the final acute value by 0.5 (division by 2). When captive or locally abundant populations of listed fish are available, consideration should be given to direct testing. When direct toxicity testing cannot be performed, approaches for developing protective measures using common test species toxicity data are available. © 2005 Springer Science+Business Media, Inc.
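
    The adjustment factors quoted above translate directly into a small calculation. In the sketch below the rainbow trout LC50 values are invented, but the 0.63/0.46/0.33 variance-based factors and the 0.56 no-acute-effect factor are those reported in the abstract:

    ```python
    # The abstract's factors applied to hypothetical rainbow trout LC50 data.
    import numpy as np

    trout_lc50 = np.array([1.2, 0.9, 1.5])          # replicate 96-h LC50s, mg/L
    geo_mean = np.exp(np.mean(np.log(trout_lc50)))  # geometric mean LC50

    listed_central = 0.63 * geo_mean    # central estimate for a listed species
    listed_1sd = 0.46 * geo_mean        # more conservative, 1 SD of the mean
    listed_2sd = 0.33 * geo_mean        # most conservative, 2 SD of the mean
    no_acute_effect = 0.56 * listed_central  # low/no-acute-effect level

    print(f"geometric mean trout LC50:    {geo_mean:.2f} mg/L")
    print(f"listed-species LC50 estimate: {listed_central:.2f} mg/L")
    print(f"no-acute-effect estimate:     {no_acute_effect:.2f} mg/L")
    ```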

  9. LESSONS FROM A RETROSPECTIVE ANALYSIS OF A 5-YR PERIOD OF QUARANTINE AT SAN DIEGO ZOO: A RISK-BASED APPROACH TO QUARANTINE ISOLATION AND TESTING MAY BENEFIT ANIMAL WELFARE.

    PubMed

    Wallace, Chelsea; Marinkovich, Matt; Morris, Pat J; Rideout, Bruce; Pye, Geoffrey W

    2016-03-01

    Quarantine is designed primarily to prevent the introduction of transmissible diseases to zoological collections. Improvements in preventive medicine, disease eradication, and comprehensive pathology programs call into question current industry quarantine standards. Disease risk analysis was used at the San Diego Zoo (SDZ) and the SDZ Safari Park to eliminate quarantine isolation and transmissible disease testing for animals transferred between the two institutions. To determine if a risk-based approach might be valid between other institutions and SDZ, we reviewed quarantine data for animals arriving at SDZ from 81 Association of Zoos and Aquariums (AZA)-accredited and 124 other sources (e.g., non-AZA-accredited institutions, private breeders, private dealers, governmental bodies) over a 5-yr period (2009-2013). No mammal or herptile failed quarantine due to transmissible diseases of concern. Approximately 2.5% of incoming birds failed quarantine due to transmissible disease; however, all 14 failed individuals were obtained from three nonaccredited sources (private breeders, confiscation). The results of our study suggest that a risk-based approach could be used to minimize or eliminate quarantine for the transfer of animals from institutions with comprehensive disease surveillance programs and/or preshipment testing practices. Quarantine isolation with testing remains an essential defense against introducing transmissible diseases of concern when there is a lack of health knowledge about the animals being received.

  10. Psychodynamic psychotherapy for posttraumatic stress disorder related to childhood abuse--Principles for a treatment manual.

    PubMed

    Wöller, Wolfgang; Leichsenring, Falk; Leweke, Frank; Kruse, Johannes

    2012-01-01

    In this article, the authors present a psychodynamically oriented psychotherapy approach for posttraumatic stress disorder (PTSD) related to childhood abuse. This neurobiologically informed, phase-oriented treatment approach, which has been developed in Germany during the past 20 years, takes into account the broad comorbidity and the large degree of ego-function impairment typically found in these patients. Based on a psychodynamic relationship orientation, this treatment integrates a variety of trauma-specific imaginative and resource-oriented techniques. The approach places major emphasis on the prevention of vicarious traumatization. The authors are presently planning to test the approach in a randomized controlled trial aimed at strengthening the evidence base for psychodynamic psychotherapy in PTSD.

  11. [Peculiarities of research of flying thinking].

    PubMed

    Kovalenko, P A; Chulaevskiĭ, A O

    2011-01-01

    A new approach to the study of flying thinking is offered. The approach is based on the principles of a stage-by-stage method (studying the reflection of each flight parameter, then their aggregate, within figurative and conceptual frameworks), and on methods that register the internal and external characteristics of aircrew activity, with priority given to studying the content and mechanisms of flying thinking and their typology. The approach also assesses the effectiveness of this reflection by correlating the identified figurative and conceptual frameworks with the time and correctness of decisions on test flight tasks and with various psychophysiological characteristics.

  12. Automated Low-Cost Smartphone-Based Lateral Flow Saliva Test Reader for Drugs-of-Abuse Detection.

    PubMed

    Carrio, Adrian; Sampedro, Carlos; Sanchez-Lopez, Jose Luis; Pimienta, Miguel; Campoy, Pascual

    2015-11-24

    Lateral flow assay tests are becoming powerful, low-cost diagnostic tools. Obtaining a result is usually subject to visual interpretation of colored areas on the test by a human operator, which introduces subjectivity and the possibility of errors in extracting the results. While automated test readers providing consistent readings are widely available, they usually lack portability. In this paper, we present a smartphone-based automated reader for drug-of-abuse lateral flow assay tests, consisting of an inexpensive light box and a smartphone device. Test images captured with the smartphone camera are processed on the device using computer vision and machine learning techniques to perform automatic extraction of the results. A thorough validation has been carried out, showing the high accuracy of the system. The proposed approach, applicable to any line-based or color-based lateral flow test on the market, effectively reduces the manufacturing costs of the reader and makes it portable and widely available while providing accurate, reliable results.
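
    The core line-detection step can be sketched with standard computer-vision tools. In the sketch below the file name, region of interest, and peak thresholds are hypothetical, and the paper's machine-learning classification stage is omitted:

    ```python
    # Sketch of intensity-profile line detection on a lateral flow strip image.
    import cv2
    import numpy as np
    from scipy.signal import find_peaks

    img = cv2.imread("strip.jpg", cv2.IMREAD_GRAYSCALE)
    assert img is not None, "test image not found"
    roi = img[100:160, 50:350]          # hypothetical membrane window

    # Colored lines are darker than the membrane: invert, then average over
    # rows so each column gives one point of the profile along the strip.
    profile = 255.0 - roi.mean(axis=0)

    # Peaks with enough prominence correspond to the control/test lines.
    peaks, _ = find_peaks(profile, prominence=10, distance=20)
    if len(peaks) == 0:
        print("invalid test: no lines detected")
    else:
        print(f"{len(peaks)} line(s) detected at x = {peaks}")
    ```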

  13. Predicting the Future: Opportunities and Challenges for the Chemical Industry to Apply 21st-Century Toxicity Testing

    PubMed Central

    Settivari, Raja S; Ball, Nicholas; Murphy, Lynea; Rasoulpour, Reza; Boverhof, Darrell R; Carney, Edward W

    2015-01-01

    Interest in applying 21st-century toxicity testing tools for safety assessment of industrial chemicals is growing. Whereas conventional toxicology uses mainly animal-based, descriptive methods, a paradigm shift is emerging in which computational approaches, systems biology, high-throughput in vitro toxicity assays, and high-throughput exposure assessments are beginning to be applied to mechanism-based risk assessments in a time- and resource-efficient fashion. Here we describe recent advances in predictive safety assessment, with a focus on their strategic application to meet the changing demands of the chemical industry and its stakeholders. The opportunities to apply these new approaches are extensive and include screening of new chemicals, informing the design of safer and more sustainable chemical alternatives, filling information gaps on data-poor chemicals already in commerce, strengthening read-across methodology for categories of chemicals sharing similar modes of action, and optimizing the design of reduced-risk product formulations. Finally, we discuss how these predictive approaches dovetail with in vivo integrated testing strategies within repeated-dose regulatory toxicity studies, which are in line with 3Rs principles to refine, reduce, and replace animal testing. Strategic application of these tools is the foundation for informed and efficient safety assessment testing strategies that can be applied at all stages of the product-development process. PMID:25836969

  14. Extracellular matrix proteins as temporary coating for thin-film neural implants

    NASA Astrophysics Data System (ADS)

    Ceyssens, Frederik; Deprez, Marjolijn; Turner, Neill; Kil, Dries; van Kuyck, Kris; Welkenhuysen, Marleen; Nuttin, Bart; Badylak, Stephen; Puers, Robert

    2017-02-01

    Objective. This study investigates the suitability of a thin sheet of extracellular matrix (ECM) proteins as a resorbable coating for temporarily reinforcing fragile or ultra-low stiffness thin-film neural implants to be placed on the brain, i.e. microelectrocorticographic (µECOG) implants. Approach. Thin-film polyimide-based electrode arrays were fabricated using lithographic methods. ECM was harvested from porcine tissue by a decellularization method and coated around the arrays. Mechanical tests and an in vivo experiment on rats were conducted, followed by a histological tissue study combined with a statistical equivalence test (confidence interval approach, 0.05 significance level) to compare the test group with an uncoated control group. Main results. After 3 months, no significant damage was found based on GFAP and NeuN staining of the relevant brain areas. Significance. The study shows that ECM sheets are a suitable temporary coating for thin µECOG neural implants.
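
    The confidence-interval approach to equivalence mentioned above follows TOST logic: conclude equivalence at the 0.05 level if the 90% confidence interval for the group difference lies inside pre-set margins. The sketch below uses invented cell-count data and a hypothetical margin:

    ```python
    # Confidence-interval (TOST-style) equivalence test on invented data.
    import numpy as np
    from scipy import stats

    coated = np.array([12.1, 10.8, 13.0, 11.5, 12.7, 11.9])   # e.g. GFAP counts
    control = np.array([11.6, 12.3, 10.9, 12.0, 11.4, 12.8])
    margin = 2.0                                              # equivalence bound

    diff = coated.mean() - control.mean()
    se = np.sqrt(coated.var(ddof=1) / len(coated)
                 + control.var(ddof=1) / len(control))
    df = len(coated) + len(control) - 2
    lo, hi = diff + np.array([-1, 1]) * stats.t.ppf(0.95, df) * se  # 90% CI

    equivalent = (-margin < lo) and (hi < margin)
    print(f"90% CI for difference: ({lo:.2f}, {hi:.2f}); equivalent: {equivalent}")
    ```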

  15. Sequential Probability Ratio Test for Collision Avoidance Maneuver Decisions Based on a Bank of Norm-Inequality-Constrained Epoch-State Filters

    NASA Technical Reports Server (NTRS)

    Carpenter, J. R.; Markley, F. L.; Alfriend, K. T.; Wright, C.; Arcido, J.

    2011-01-01

    Sequential probability ratio tests explicitly allow decision makers to incorporate false alarm and missed detection risks, and are potentially less sensitive to modeling errors than a procedure that relies solely on a probability of collision threshold. Recent work on constrained Kalman filtering has suggested an approach to formulating such a test for collision avoidance maneuver decisions: a filter bank with two norm-inequality-constrained epoch-state extended Kalman filters. One filter models the null hypothesis that the miss distance is inside the combined hard body radius at the predicted time of closest approach, and one filter models the alternative hypothesis. The epoch-state filter developed for this method explicitly accounts for any process noise present in the system. The method appears to work well using a realistic example based on an upcoming highly-elliptical orbit formation flying mission.
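
    Wald's sequential probability ratio test itself is easy to sketch. Below, simple Gaussian likelihoods stand in for the marginal likelihoods that the two constrained epoch-state filters would supply, and all numbers are hypothetical:

    ```python
    # Wald SPRT sketch with stand-in Gaussian hypothesis likelihoods.
    import numpy as np
    from scipy.stats import norm

    alpha, beta = 0.01, 0.05               # false-alarm and missed-detection risks
    upper = np.log((1 - beta) / alpha)     # accept H1 (maneuver) above this
    lower = np.log(beta / (1 - alpha))     # accept H0 below this

    rng = np.random.default_rng(7)
    obs = rng.normal(0.4, 0.3, size=50)    # hypothetical per-epoch residuals, km

    llr = 0.0
    for k, y in enumerate(obs, start=1):
        # H1: miss inside hard-body radius (small miss); H0: safe miss.
        llr += norm.logpdf(y, loc=0.3, scale=0.3) - norm.logpdf(y, loc=1.0, scale=0.3)
        if llr >= upper:
            print(f"epoch {k}: decide H1 (inside hard-body radius), maneuver")
            break
        if llr <= lower:
            print(f"epoch {k}: decide H0, no maneuver")
            break
    else:
        print("no decision within the data window")
    ```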

  16. [Blended-learning in psychosomatics and psychotherapy - Increasing the satisfaction and knowledge of students with a web-based e-learning tool].

    PubMed

    Ferber, Julia; Schneider, Gudrun; Havlik, Linda; Heuft, Gereon; Friederichs, Hendrik; Schrewe, Franz-Bernhard; Schulz-Steinel, Andrea; Burgmer, Markus

    2014-01-01

    To improve the synergy of established methods of teaching, the Department of Psychosomatics and Psychotherapy, University Hospital Münster, developed a web-based e-learning tool using video clips of standardized patients. The effect of this blended-learning approach was evaluated. A multiple-choice test was performed by a naive (without the e-learning tool) and an experimental (with the tool) cohort of medical students to test the groups' expertise in psychosomatics. In addition, participants' satisfaction with the new tool was evaluated (numeric rating scale of 0-10). The experimental cohort was more satisfied with the curriculum and more interested in psychosomatics. Furthermore, the experimental cohort scored significantly better in the multiple-choice test. The new tool proved to be an important addition to the classical curriculum as a blended-learning approach that improves students' satisfaction and knowledge in psychosomatics.

  17. Assessing noninferiority in a three-arm trial using the Bayesian approach.

    PubMed

    Ghosh, Pulak; Nathoo, Farouk; Gönen, Mithat; Tiwari, Ram C

    2011-07-10

    Non-inferiority trials, which aim to demonstrate that a test product is not worse than a competitor by more than a pre-specified small amount, are of great importance to the pharmaceutical community. As a result, methodology for designing and analyzing such trials is required, and developing new methods for such analysis is an important area of statistical research. The three-arm trial consists of a placebo, a reference and an experimental treatment, and simultaneously tests the superiority of the reference over the placebo along with comparing this reference to an experimental treatment. In this paper, we consider the analysis of non-inferiority trials using Bayesian methods which incorporate both parametric as well as semi-parametric models. The resulting testing approach is both flexible and robust. The benefit of the proposed Bayesian methods is assessed via simulation, based on a study examining home-based blood pressure interventions. Copyright © 2011 John Wiley & Sons, Ltd.
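
    A minimal simulation-based sketch of the parametric (normal) case is shown below: approximate posterior draws for each arm mean under flat priors, then the posterior probabilities of assay sensitivity (reference beats placebo) and noninferiority (test within the margin of the reference). Data and the margin are invented, and the authors' semi-parametric extension is not attempted:

    ```python
    # Posterior-probability sketch for a three-arm noninferiority trial.
    import numpy as np

    rng = np.random.default_rng(3)
    arms = {"placebo": rng.normal(0.0, 1.0, 60),
            "reference": rng.normal(1.0, 1.0, 60),
            "test": rng.normal(0.9, 1.0, 60)}
    margin = 0.4
    n_draws = 100_000

    # With flat priors, mu | data is approximately normal(mean, s / sqrt(n)).
    post = {k: rng.normal(v.mean(), v.std(ddof=1) / np.sqrt(len(v)), n_draws)
            for k, v in arms.items()}

    p_superior = np.mean(post["reference"] > post["placebo"])
    p_noninferior = np.mean(post["test"] > post["reference"] - margin)
    print(f"P(reference > placebo)       = {p_superior:.3f}")
    print(f"P(test > reference - margin) = {p_noninferior:.3f}")
    ```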

  18. Advances in the indirect, descriptive, and experimental approaches to the functional analysis of problem behavior.

    PubMed

    Wightman, Jade; Julio, Flávia; Virués-Ortega, Javier

    2014-05-01

    Experimental functional analysis is an assessment methodology to identify the environmental factors that maintain problem behavior in individuals with developmental disabilities and in other populations. Functional analysis provides the basis for the development of reinforcement-based approaches to treatment. This article reviews the procedures, validity, and clinical implementation of the methodological variations of functional analysis and function-based interventions. We present six variations of functional analysis methodology in addition to the typical functional analysis: brief functional analysis, single-function tests, latency-based functional analysis, functional analysis of precursors, and trial-based functional analysis. We also present the three general categories of function-based interventions: extinction, antecedent manipulation, and differential reinforcement. Functional analysis methodology is a valid and efficient approach to the assessment of problem behavior and the selection of treatment strategies.

  19. Comparative shotgun proteomics using spectral count data and quasi-likelihood modeling.

    PubMed

    Li, Ming; Gray, William; Zhang, Haixia; Chung, Christine H; Billheimer, Dean; Yarbrough, Wendell G; Liebler, Daniel C; Shyr, Yu; Slebos, Robbert J C

    2010-08-06

    Shotgun proteomics provides the most powerful analytical platform for global inventory of complex proteomes using liquid chromatography-tandem mass spectrometry (LC-MS/MS) and allows a global analysis of protein changes. Nevertheless, sampling of complex proteomes by current shotgun proteomics platforms is incomplete, and this contributes to variability in assessment of peptide and protein inventories by spectral counting approaches. Thus, shotgun proteomics data pose challenges in comparing proteomes from different biological states. We developed an analysis strategy using quasi-likelihood Generalized Linear Modeling (GLM), included in a graphical interface software package (QuasiTel) that reads standard output from protein assemblies created by IDPicker, an HTML-based user interface to query shotgun proteomic data sets. This approach was compared to four other statistical analysis strategies: Student t test, Wilcoxon rank test, Fisher's Exact test, and Poisson-based GLM. We analyzed the performance of these tests to identify differences in protein levels based on spectral counts in a shotgun data set in which equimolar amounts of 48 human proteins were spiked at different levels into whole yeast lysates. Both GLM approaches and the Fisher Exact test performed adequately, each with their unique limitations. We subsequently compared the proteomes of normal tonsil epithelium and HNSCC using this approach and identified 86 proteins with differential spectral counts between normal tonsil epithelium and HNSCC. We selected 18 proteins from this comparison for verification of protein levels between the individual normal and tumor tissues using liquid chromatography-multiple reaction monitoring mass spectrometry (LC-MRM-MS). This analysis confirmed the magnitude and direction of the protein expression differences in all 6 proteins for which reliable data could be obtained. Our analysis demonstrates that shotgun proteomic data sets from different tissue phenotypes are sufficiently rich in quantitative information and that statistically significant differences in proteins spectral counts reflect the underlying biology of the samples.
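
    The quasi-likelihood model for a single protein can be sketched with statsmodels: a Poisson GLM whose dispersion is estimated from the Pearson chi-square, giving a quasi-Poisson fit of the kind QuasiTel wraps. The counts below are invented, and the run-level normalization offsets a real analysis would need are omitted:

    ```python
    # Quasi-Poisson GLM on invented spectral counts for one protein.
    import numpy as np
    import statsmodels.api as sm

    counts = np.array([12, 15, 9, 14, 30, 27, 35, 24])   # spectral counts
    group = np.array([0, 0, 0, 0, 1, 1, 1, 1])           # normal=0, tumor=1
    X = sm.add_constant(group)

    # scale='X2' estimates dispersion from the Pearson chi-square, turning
    # the Poisson fit into a quasi-Poisson (quasi-likelihood) one.
    fit = sm.GLM(counts, X, family=sm.families.Poisson()).fit(scale="X2")
    print(fit.summary().tables[1])
    print(f"estimated dispersion: {fit.scale:.2f}")
    ```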

  20. The use of imputed sibling genotypes in sibship-based association analysis: on modeling alternatives, power and model misspecification.

    PubMed

    Minică, Camelia C; Dolan, Conor V; Hottenga, Jouke-Jan; Willemsen, Gonneke; Vink, Jacqueline M; Boomsma, Dorret I

    2013-05-01

    When phenotypic, but no genotypic, data are available for relatives of participants in genetic association studies, previous research has shown that family-based imputed genotypes can boost statistical power when included in such studies. Here, using simulations, we compared the performance of two statistical approaches suitable for modeling imputed genotype data: the mixture approach, which involves the full distribution of the imputed genotypes, and the dosage approach, where the mean of the conditional distribution features as the imputed genotype. Simulations were run by varying sibship size, the size of the phenotypic correlations among siblings, imputation accuracy, and the minor allele frequency of the causal SNP. Furthermore, as imputing sibling data and extending the model to include sibships of size two or greater requires modeling the familial covariance matrix, we inquired whether model misspecification affects power. Finally, the results obtained via simulations were empirically verified in two datasets with continuous phenotype data (height) and with a dichotomous phenotype (smoking initiation). Across the settings considered, the mixture and the dosage approach are equally powerful and both produce unbiased parameter estimates. In addition, the likelihood-ratio test in the linear mixed model appears to be robust to the considered misspecification in the background covariance structure, given low to moderate phenotypic correlations among siblings. Empirical results show that including imputed sibling genotypes in association analysis does not always result in a larger test statistic. The actual test statistic may drop in value due to small effect sizes. That is, if the power benefit is small, i.e., if the shift in the distribution of the test statistic under the alternative is relatively small, there is a greater probability of obtaining a smaller test statistic. As genetic effects are typically hypothesized to be small, in practice the decision on whether family-based imputation could be used as a means to increase power should be informed by prior power calculations and by consideration of the background correlation.
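
    The contrast between the two approaches can be made concrete for a single imputed genotype: the dosage approach plugs the conditional mean genotype into the model, while the mixture approach averages the likelihood over the full genotype distribution. All probabilities and model parameters below are hypothetical:

    ```python
    # Dosage vs mixture handling of one imputed sibling genotype, as a sketch.
    import numpy as np
    from scipy.stats import norm

    p_g = np.array([0.1, 0.6, 0.3])        # hypothetical P(g | family data)
    genotypes = np.array([0.0, 1.0, 2.0])
    y = 1.8                                # the sibling's phenotype
    beta0, beta1, sigma = 0.0, 0.25, 1.0   # assumed SNP effect model

    # Dosage approach: plug the conditional mean genotype into the model.
    dosage = np.dot(p_g, genotypes)        # = 1.2 here
    ll_dosage = norm.logpdf(y, beta0 + beta1 * dosage, sigma)

    # Mixture approach: average the likelihood over the genotype distribution.
    ll_mixture = np.log(np.dot(p_g, norm.pdf(y, beta0 + beta1 * genotypes, sigma)))

    print(f"dosage log-likelihood : {ll_dosage:.4f}")
    print(f"mixture log-likelihood: {ll_mixture:.4f}")
    ```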
