Sample records for method validation experiments

  1. An intercomparison of a large ensemble of statistical downscaling methods for Europe: Overall results from the VALUE perfect predictor cross-validation experiment

    NASA Astrophysics Data System (ADS)

    Gutiérrez, Jose Manuel; Maraun, Douglas; Widmann, Martin; Huth, Radan; Hertig, Elke; Benestad, Rasmus; Roessler, Ole; Wibig, Joanna; Wilcke, Renate; Kotlarski, Sven

    2016-04-01

    VALUE is an open European network to validate and compare downscaling methods for climate change research (http://www.value-cost.eu). A key deliverable of VALUE is the development of a systematic validation framework to enable the assessment and comparison of both dynamical and statistical downscaling methods. This framework is based on a user-focused validation tree, guiding the selection of relevant validation indices and performance measures for different aspects of the validation (marginal, temporal, spatial, multi-variable). Moreover, several experiments have been designed to isolate specific points in the downscaling procedure where problems may occur (assessment of intrinsic performance, effect of errors inherited from the global models, effect of non-stationarity, etc.). The list of downscaling experiments includes 1) cross-validation with perfect predictors, 2) GCM predictors (aligned with the EURO-CORDEX experiment) and 3) pseudo reality predictors (see Maraun et al. 2015, Earth's Future, 3, doi:10.1002/2014EF000259, for more details). The results of these experiments are gathered, validated and publicly distributed through the VALUE validation portal, allowing for a comprehensive community-open downscaling intercomparison study. In this contribution we describe the overall results from Experiment 1), consisting of a European-wide 5-fold cross-validation (with consecutive 6-year periods from 1979 to 2008) using predictors from ERA-Interim to downscale precipitation and temperatures (minimum and maximum) over a set of 86 ECA&D stations representative of the main geographical and climatic regions in Europe. As a result of the open call for contributions to this experiment (closed in Dec. 2015), over 40 methods representative of the main approaches (MOS and Perfect Prognosis, PP) and techniques (linear scaling, quantile mapping, analogs, weather typing, linear and generalized regression, weather generators, etc.) were submitted, including both data (downscaled values) and metadata (characterizing different aspects of the downscaling methods). This constitutes the largest and most comprehensive intercomparison of statistical downscaling methods to date. Here, we present an overall validation, analyzing marginal and temporal aspects to assess the intrinsic performance and added value of statistical downscaling methods at both annual and seasonal levels. This validation takes into account the different properties/limitations of different approaches and techniques (as reported in the provided metadata) in order to perform a fair comparison. It is pointed out that this experiment alone is not sufficient to evaluate the limitations of (MOS) bias correction techniques. Moreover, it also does not fully validate PP, since we do not learn whether we have the right predictors and whether the PP assumption is valid. These problems will be analyzed in the subsequent community-open VALUE experiments 2) and 3), which will be open for participation throughout the present year.
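
    The cross-validation setup described above is straightforward to reproduce in outline. Below is a minimal sketch, assuming consecutive 6-year folds over 1979-2008 as in Experiment 1; the bias measure and all variable names are illustrative placeholders, not the VALUE portal's actual indices.

```python
import numpy as np

def consecutive_folds(years=range(1979, 2009), n_folds=5):
    """Split 1979-2008 into 5 consecutive 6-year blocks, yielding (train, test) year arrays."""
    blocks = np.array_split(np.array(list(years)), n_folds)
    for i, test in enumerate(blocks):
        train = np.concatenate([b for j, b in enumerate(blocks) if j != i])
        yield train, test

def bias(downscaled, observed):
    """One simple marginal-aspect measure: mean bias of downscaled vs observed values."""
    return float(np.mean(downscaled) - np.mean(observed))

for train_years, test_years in consecutive_folds():
    print(f"train on {train_years.size} years, validate on {test_years[0]}-{test_years[-1]}")
```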

  2. Validation of Skills, Knowledge and Experience in Lifelong Learning in Europe

    ERIC Educational Resources Information Center

    Ogunleye, James

    2012-01-01

    The paper examines systems of validation of skills and experience as well as the main methods/tools currently used for validating skills and knowledge in lifelong learning. The paper uses mixed methods--a case study research and content analysis of European Union policy documents and frameworks--as a basis for this research. The selection of the…

  3. A user-targeted synthesis of the VALUE perfect predictor experiment

    NASA Astrophysics Data System (ADS)

    Maraun, Douglas; Widmann, Martin; Gutierrez, Jose; Kotlarski, Sven; Hertig, Elke; Wibig, Joanna; Rössler, Ole; Huth, Radan

    2016-04-01

    VALUE is an open European network to validate and compare downscaling methods for climate change research. A key deliverable of VALUE is the development of a systematic validation framework to enable the assessment and comparison of both dynamical and statistical downscaling methods. VALUE's main approach to validation is user-focused: starting from a specific user problem, a validation tree guides the selection of relevant validation indices and performance measures. We consider different aspects: (1) marginal aspects such as mean, variance and extremes; (2) temporal aspects such as spell length characteristics; (3) spatial aspects such as the de-correlation length of precipitation extremes; and (4) multi-variate aspects such as the interplay of temperature and precipitation or scale-interactions. Several experiments have been designed to isolate specific points in the downscaling procedure where problems may occur. Experiment 1 (perfect predictors): what is the isolated downscaling skill? How do statistical and dynamical methods compare? How do methods perform at different spatial scales? Experiment 2 (global climate model predictors): how well is regional climate represented overall, including errors inherited from the global climate models? Experiment 3 (pseudo reality): do methods fail in representing regional climate change? Here, we present a user-targeted synthesis of the results of the first VALUE experiment. In this experiment, downscaling methods are driven with ERA-Interim reanalysis data to eliminate global climate model errors, over the period 1979-2008. As reference data we use, depending on the question addressed, (1) observations from 86 meteorological stations distributed across Europe; (2) gridded observations at the corresponding 86 locations; or (3) gridded, spatially extended observations for selected European regions. With more than 40 contributing methods, this study is the most comprehensive downscaling inter-comparison project so far. The results clearly indicate that for several aspects, the downscaling skill varies considerably between different methods. For specific purposes, some methods can therefore clearly be excluded.

  4. Validation of RNAi Silencing Efficiency Using Gene Array Data shows 18.5% Failure Rate across 429 Independent Experiments.

    PubMed

    Munkácsy, Gyöngyi; Sztupinszki, Zsófia; Herman, Péter; Bán, Bence; Pénzváltó, Zsófia; Szarvas, Nóra; Győrffy, Balázs

    2016-09-27

    No independent cross-validation of success rate for studies utilizing small interfering RNA (siRNA) for gene silencing has been completed before. Assessing the influence of experimental parameters such as cell line, transfection technique, validation method, and type of control requires evaluation across a large set of studies. We utilized gene chip data published for siRNA experiments to assess success rate and to compare the methods used in these experiments. We searched NCBI GEO for samples with whole transcriptome analysis before and after gene silencing and evaluated the efficiency for the target and off-target genes using the array-based expression data. The Wilcoxon signed-rank test was used to assess silencing efficacy, and Kruskal-Wallis tests and Spearman rank correlation were used to evaluate study parameters. Altogether, 1,643 samples representing 429 experiments published in 207 studies were evaluated. The fold change (FC) of down-regulation of the target gene was above 0.7 in 18.5% and above 0.5 in 38.7% of experiments. Silencing efficiency was lowest in MCF7 and highest in SW480 cells (FC = 0.59 and FC = 0.30, respectively, P = 9.3E-06). Studies utilizing Western blot for validation performed better than those with quantitative polymerase chain reaction (qPCR) or microarray (FC = 0.43, FC = 0.47, and FC = 0.55, respectively, P = 2.8E-04). There was no correlation between type of control, transfection method, publication year, and silencing efficiency. Although gene silencing is a robust feature successfully cross-validated in the majority of experiments, efficiency remained insufficient in a significant proportion of studies. Selection of the cell line model and validation method had the highest influence on silencing efficiency.
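
    As a rough illustration of the scoring logic in this abstract, the sketch below computes per-experiment fold changes of a target gene, applies the FC > 0.7 failure criterion, and runs a Wilcoxon signed-rank test on paired before/after values. The expression numbers are made up and the helper is not the authors' pipeline.

```python
import numpy as np
from scipy import stats

def fold_change(expr_after, expr_before):
    """FC of the target gene after silencing relative to the unsilenced control."""
    return expr_after / expr_before

# Hypothetical per-experiment target-gene expression values (arbitrary units)
before = np.array([10.2, 9.8, 11.0, 10.5])
after = np.array([3.1, 7.9, 4.0, 8.2])

fc = fold_change(after, before)
failed = fc > 0.7                      # failure criterion quoted in the abstract
print(f"failure rate: {failed.mean():.1%}")

# Silencing efficacy across paired samples (Wilcoxon signed-rank, as in the study)
stat, p = stats.wilcoxon(before, after)
print(f"Wilcoxon p = {p:.3g}")
```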

  5. An Overlooked Population in Community College: International Students' (In)Validation Experiences With Academic Advising

    ERIC Educational Resources Information Center

    Zhang, Yi

    2016-01-01

    Objective: Guided by validation theory, this study aims to better understand the role that academic advising plays in international community college students' adjustment. More specifically, this study investigated how academic advising validates or invalidates their academic and social experiences in a community college context. Method: This…

  6. Chemometric and biological validation of a capillary electrophoresis metabolomic experiment of Schistosoma mansoni infection in mice.

    PubMed

    Garcia-Perez, Isabel; Angulo, Santiago; Utzinger, Jürg; Holmes, Elaine; Legido-Quigley, Cristina; Barbas, Coral

    2010-07-01

    Metabonomic and metabolomic studies are increasingly utilized for biomarker identification in different fields, including biology of infection. The confluence of improved analytical platforms and the availability of powerful multivariate analysis software have rendered the multiparameter profiles generated by these omics platforms a user-friendly alternative to the established analysis methods where the quality and practice of a procedure is well defined. However, unlike traditional assays, validation methods for these new multivariate profiling tools have yet to be established. We propose a validation for models obtained by CE fingerprinting of urine from mice infected with the blood fluke Schistosoma mansoni. We have analysed urine samples from two sets of mice infected in an inter-laboratory experiment where different infection methods and animal husbandry procedures were employed in order to establish the core biological response to a S. mansoni infection. CE data were analysed using principal component analysis. Validation of the scores consisted of permutation scrambling (100 repetitions) and a manual validation method, using a third of the samples (not included in the model) as a test or prediction set. The validation yielded 100% specificity and 100% sensitivity, demonstrating the robustness of these models with respect to deciphering metabolic perturbations in the mouse due to a S. mansoni infection. A total of 20 metabolites across the two experiments were identified that significantly discriminated between S. mansoni-infected and noninfected control samples. Only one of these metabolites, allantoin, was identified as manifesting different behaviour in the two experiments. This study shows the reproducibility of CE-based metabolic profiling methods for disease characterization and screening and highlights the importance of much needed validation strategies in the emerging field of metabolomics.
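
    A hedged sketch of the validation scheme described above (PCA of fingerprints, a held-out third of samples as a prediction set, and 100 label-permutation repetitions) is given below. The data are synthetic placeholders, and the nearest-neighbour classifier on PCA scores is an illustrative stand-in for the authors' class prediction from score plots.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 200))   # placeholder CE fingerprints (samples x features)
y = np.repeat([0, 1], 30)        # 0 = control, 1 = infected (placeholder labels)

# Hold out roughly a third of samples as a prediction set, as in the abstract
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=1/3, stratify=y, random_state=0)

def score_model(X_tr, y_tr, X_te, y_te):
    """Fit PCA on the training set and classify the test set from its PCA scores."""
    pca = PCA(n_components=5).fit(X_tr)
    clf = KNeighborsClassifier(3).fit(pca.transform(X_tr), y_tr)
    return clf.score(pca.transform(X_te), y_te)

observed = score_model(X_tr, y_tr, X_te, y_te)

# Permutation scrambling (100 repetitions): refit with shuffled class labels
null = [score_model(X_tr, rng.permutation(y_tr), X_te, y_te) for _ in range(100)]
p_value = (np.sum(np.array(null) >= observed) + 1) / (len(null) + 1)
print(f"accuracy = {observed:.2f}, permutation p = {p_value:.3f}")
```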

  7. Experience with Aero- and Fluid-Dynamic Testing for Engineering and CFD Validation

    NASA Technical Reports Server (NTRS)

    Ross, James C.

    2016-01-01

    Ever since computations have been used to simulate aerodynamics, the need to ensure that the computations adequately represent real life has followed. Many experiments have been performed specifically for validation, and as computational methods have improved, so have the validation experiments. Validation is also a moving target because computational methods improve, requiring validation for the new aspects of flow physics that the computations aim to capture. Concurrently, new measurement techniques are being developed that can help capture more detailed flow features; pressure-sensitive paint (PSP) and particle image velocimetry (PIV) come to mind. This paper presents various wind-tunnel tests the author has been involved with and how they were used for validation of various kinds of CFD. A particular focus is the application of advanced measurement techniques to flow fields (and geometries) that had proven difficult to predict computationally. Many of these difficult flow problems arose from engineering and development problems that needed to be solved for a particular vehicle or research program. In some cases the experiments required to solve the engineering problems were refined to provide valuable CFD validation data in addition to the primary engineering data. All of these experiments have provided physical insight and validation data for a wide range of aerodynamic and acoustic phenomena for vehicles ranging from tractor-trailers to crewed spacecraft.

  8. Learning to recognize rat social behavior: Novel dataset and cross-dataset application.

    PubMed

    Lorbach, Malte; Kyriakou, Elisavet I; Poppe, Ronald; van Dam, Elsbeth A; Noldus, Lucas P J J; Veltkamp, Remco C

    2018-04-15

    Social behavior is an important aspect of rodent models. Automated measuring tools that make use of video analysis and machine learning are an increasingly attractive alternative to manual annotation. Because machine learning-based methods need to be trained, it is important that they are validated using data from different experiment settings. To develop and validate automated measuring tools, there is a need for annotated rodent interaction datasets. Currently, the availability of such datasets is limited to two mouse datasets. We introduce the first, publicly available rat social interaction dataset, RatSI. We demonstrate the practical value of the novel dataset by using it as the training set for a rat interaction recognition method. We show that behavior variations induced by the experiment setting can lead to reduced performance, which illustrates the importance of cross-dataset validation. Consequently, we add a simple adaptation step to our method and improve the recognition performance. Most existing methods are trained and evaluated in one experimental setting, which limits the predictive power of the evaluation to that particular setting. We demonstrate that cross-dataset experiments provide more insight in the performance of classifiers. With our novel, public dataset we encourage the development and validation of automated recognition methods. We are convinced that cross-dataset validation enhances our understanding of rodent interactions and facilitates the development of more sophisticated recognition methods. Combining them with adaptation techniques may enable us to apply automated recognition methods to a variety of animals and experiment settings. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. HPLC-MS/MS method for dexmedetomidine quantification with Design of Experiments approach: application to pediatric pharmacokinetic study.

    PubMed

    Szerkus, Oliwia; Struck-Lewicka, Wiktoria; Kordalewska, Marta; Bartosińska, Ewa; Bujak, Renata; Borsuk, Agnieszka; Bienert, Agnieszka; Bartkowska-Śniatkowska, Alicja; Warzybok, Justyna; Wiczling, Paweł; Nasal, Antoni; Kaliszan, Roman; Markuszewski, Michał Jan; Siluk, Danuta

    2017-02-01

    The purpose of this work was to develop and validate a rapid and robust LC-MS/MS method for the determination of dexmedetomidine (DEX) in plasma, suitable for analysis of a large number of samples. A systematic Design of Experiments approach was applied to optimize ESI source parameters and to evaluate method robustness; as a result, a rapid, stable and cost-effective assay was developed. The method was validated according to US FDA guidelines. The LLOQ was determined at 5 pg/ml, and the assay was linear over the examined concentration range (5-2500 pg/ml, R² > 0.98). The accuracies and intra- and interday precisions were less than 15%. The stability data confirmed reliable behavior of DEX under the tested conditions. Application of the Design of Experiments approach allowed for fast and efficient analytical method development and validation, as well as reduced usage of the chemicals necessary for regular method optimization. The proposed technique was applied to the determination of DEX pharmacokinetics in pediatric patients undergoing long-term sedation in the intensive care unit.
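
    To make the Design of Experiments idea concrete, the sketch below evaluates a two-level full factorial screen of three ESI source parameters and estimates their main effects; the factor names, levels and peak-area responses are hypothetical, not the published optimisation.

```python
import itertools
import numpy as np

factors = ["capillary_voltage", "desolvation_temp", "gas_flow"]
levels = [-1, +1]                                   # coded low/high levels

# Full factorial design: 2**3 = 8 runs in standard order
design = np.array(list(itertools.product(levels, repeat=len(factors))))
# Hypothetical peak-area responses for the 8 runs (arbitrary units)
response = np.array([105.0, 130.0, 98.0, 122.0, 140.0, 171.0, 133.0, 160.0])

# Main effect of each factor = mean response at high level - mean response at low level
for i, name in enumerate(factors):
    effect = response[design[:, i] == 1].mean() - response[design[:, i] == -1].mean()
    print(f"{name}: {effect:+.1f}")
```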

  10. Comparative assessment of three standardized robotic surgery training methods.

    PubMed

    Hung, Andrew J; Jayaratna, Isuru S; Teruya, Kara; Desai, Mihir M; Gill, Inderbir S; Goh, Alvin C

    2013-10-01

    To evaluate three standardized robotic surgery training methods, inanimate, virtual reality and in vivo, for their construct validity. To explore the concept of cross-method validity, where the relative performance of each method is compared. Robotic surgical skills were prospectively assessed in 49 participating surgeons who were classified as follows: 'novice/trainee': urology residents, previous experience <30 cases (n = 38) and 'experts': faculty surgeons, previous experience ≥30 cases (n = 11). Three standardized, validated training methods were used: (i) structured inanimate tasks; (ii) virtual reality exercises on the da Vinci Skills Simulator (Intuitive Surgical, Sunnyvale, CA, USA); and (iii) a standardized robotic surgical task in a live porcine model with performance graded by the Global Evaluative Assessment of Robotic Skills (GEARS) tool. A Kruskal-Wallis test was used to evaluate performance differences between novices and experts (construct validity). Spearman's correlation coefficient (ρ) was used to measure the association of performance across inanimate, simulation and in vivo methods (cross-method validity). Novice and expert surgeons had previously performed a median (range) of 0 (0-20) and 300 (30-2000) robotic cases, respectively (P < 0.001). Construct validity: experts consistently outperformed residents with all three methods (P < 0.001). Cross-method validity: overall performance of inanimate tasks significantly correlated with virtual reality robotic performance (ρ = -0.7, P < 0.001) and in vivo robotic performance based on GEARS (ρ = -0.8, P < 0.0001). Virtual reality performance and in vivo tissue performance were also found to be strongly correlated (ρ = 0.6, P < 0.001). We propose the novel concept of cross-method validity, which may provide a method of evaluating the relative value of various forms of skills education and assessment. We externally confirmed the construct validity of each featured training tool. © 2013 BJU International.
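
    The two statistical checks named in this abstract translate directly into code. The sketch below runs a Kruskal-Wallis test for novice vs expert differences (construct validity) and a Spearman correlation between scores obtained with two training methods (cross-method validity); all score arrays are simulated placeholders, not study data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
novice_scores = rng.normal(50, 10, 38)   # placeholder performance scores for 38 trainees
expert_scores = rng.normal(75, 8, 11)    # placeholder performance scores for 11 experts

# Construct validity: do experts outperform novices?
h, p_construct = stats.kruskal(novice_scores, expert_scores)

# Cross-method validity: association of the same surgeons' scores on two methods
inanimate = rng.normal(60, 12, 49)
virtual = inanimate * -0.8 + rng.normal(0, 5, 49)   # placeholder, built to correlate negatively
rho, p_cross = stats.spearmanr(inanimate, virtual)

print(f"Kruskal-Wallis p = {p_construct:.3g}, Spearman rho = {rho:.2f} (p = {p_cross:.3g})")
```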

  11. Finite element analysis of dental implants with validation: to what extent can we expect the model to predict biological phenomena? A literature review and proposal for classification of a validation process.

    PubMed

    Chang, Yuanhan; Tambe, Abhijit Anil; Maeda, Yoshinobu; Wada, Masahiro; Gonda, Tomoya

    2018-03-08

    A literature review of finite element analysis (FEA) studies of dental implants with their model validation process was performed to establish the criteria for evaluating validation methods with respect to their similarity to biological behavior. An electronic literature search of PubMed was conducted up to January 2017 using the Medical Subject Headings "dental implants" and "finite element analysis." After accessing the full texts, the context of each article was searched using the words "valid" and "validation" and articles in which these words appeared were read to determine whether they met the inclusion criteria for the review. Of 601 articles published from 1997 to 2016, 48 that met the eligibility criteria were selected. The articles were categorized according to their validation method as follows: in vivo experiments in humans (n = 1) and other animals (n = 3), model experiments (n = 32), others' clinical data and past literature (n = 9), and other software (n = 2). Validation techniques with a high level of sufficiency and efficiency are still rare in FEA studies of dental implants. High-level validation, especially using in vivo experiments tied to an accurate finite element method, needs to become an established part of FEA studies. The recognition of a validation process should be considered when judging the practicality of an FEA study.

  12. Applying the Mixed Methods Instrument Development and Construct Validation Process: the Transformative Experience Questionnaire

    ERIC Educational Resources Information Center

    Koskey, Kristin L. K.; Sondergeld, Toni A.; Stewart, Victoria C.; Pugh, Kevin J.

    2018-01-01

    Onwuegbuzie and colleagues proposed the Instrument Development and Construct Validation (IDCV) process as a mixed methods framework for creating and validating measures. Examples applying IDCV are lacking. We provide an illustrative case integrating the Rasch model and cognitive interviews applied to the development of the Transformative…

  13. External Standards or Standard Addition? Selecting and Validating a Method of Standardization

    NASA Astrophysics Data System (ADS)

    Harvey, David T.

    2002-05-01

    A common feature of many problem-based laboratories in analytical chemistry is a lengthy independent project involving the analysis of "real-world" samples. Students research the literature, adapting and developing a method suitable for their analyte, sample matrix, and problem scenario. Because these projects encompass the complete analytical process, students must consider issues such as obtaining a representative sample, selecting a method of analysis, developing a suitable standardization, validating results, and implementing appropriate quality assessment/quality control practices. Most textbooks and monographs suitable for an undergraduate course in analytical chemistry, however, provide only limited coverage of these important topics. The need for short laboratory experiments emphasizing important facets of method development, such as selecting a method of standardization, is evident. The experiment reported here, which is suitable for an introductory course in analytical chemistry, illustrates the importance of matrix effects when selecting a method of standardization. Students also learn how a spike recovery is used to validate an analytical method, and obtain a practical experience in the difference between performing an external standardization and a standard addition.
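
    A short worked example of the two calculations at the heart of this experiment, standard-addition extrapolation and percent spike recovery, follows; the concentrations and signals are invented for illustration.

```python
import numpy as np

# Standard addition: signal measured after spiking known amounts of analyte into the sample
added = np.array([0.0, 1.0, 2.0, 3.0])        # added concentration (ppm)
signal = np.array([0.20, 0.34, 0.48, 0.62])   # instrument response

slope, intercept = np.polyfit(added, signal, 1)
c_sample = intercept / slope                  # |x-intercept| gives analyte already in the sample
print(f"sample concentration ≈ {c_sample:.2f} ppm")

# Spike recovery used to validate the method: (found_spiked - found_unspiked) / spike
found_unspiked, found_spiked, spike = 1.43, 2.38, 1.00
recovery = (found_spiked - found_unspiked) / spike * 100
print(f"spike recovery = {recovery:.0f}%")    # values near 100% suggest no matrix bias
```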

  14. An Engineering Method of Civil Jet Requirements Validation Based on Requirements Project Principle

    NASA Astrophysics Data System (ADS)

    Wang, Yue; Gao, Dan; Mao, Xuming

    2018-03-01

    A method of requirements validation is developed and defined to meet the needs of civil jet requirements validation in product development. Based on the requirements project principle, this method does not affect the conventional design elements and can effectively connect the requirements with the design. It realizes the modern civil jet development concept that “requirement is the origin, design is the basis”. So far, the method has been successfully applied in civil jet aircraft development in China. Taking takeoff field length as an example, the validation process and the validation method for the requirements are introduced in detail in this study, with the hope of providing experience for other civil jet product designs.

  15. Inner experience in the scanner: can high fidelity apprehensions of inner experience be integrated with fMRI?

    PubMed Central

    Kühn, Simone; Fernyhough, Charles; Alderson-Day, Benjamin; Hurlburt, Russell T.

    2014-01-01

    To provide full accounts of human experience and behavior, research in cognitive neuroscience must be linked to inner experience, but introspective reports of inner experience have often been found to be unreliable. The present case study aimed at providing proof of principle that introspection using one method, descriptive experience sampling (DES), can be reliably integrated with fMRI. A participant was trained in the DES method, followed by nine sessions of sampling within an MRI scanner. During moments where the DES interview revealed ongoing inner speaking, fMRI data reliably showed activation in classic speech processing areas including left inferior frontal gyrus. Further, the fMRI data validated the participant’s DES observations of the experiential distinction between inner speaking and innerly hearing her own voice. These results highlight the precision and validity of the DES method as a technique of exploring inner experience and the utility of combining such methods with fMRI. PMID:25538649

  16. Experiences using IAEA Code of practice for radiation sterilization of tissue allografts: Validation and routine control

    NASA Astrophysics Data System (ADS)

    Hilmy, N.; Febrida, A.; Basril, A.

    2007-11-01

    The problem with using International Standard (ISO) 11137 for validation of the radiation sterilization dose (RSD) of tissue allografts is the limited and low number of uniform samples per production batch, i.e., products obtained from one donor. An allograft is a graft transplanted between two different individuals of the same species. The minimum number of uniform samples needed for the verification dose (VD) experiment at the selected sterility assurance level (SAL) per production batch according to the IAEA Code is 20, i.e., 10 for bioburden determination and the remaining 10 for the sterility test. Three methods of the IAEA Code have been used for validation of the RSD, i.e., method A1, which is a modification of method 1 of ISO 11137:1995, method B (ISO 13409:1996), and method C (AAMI TIR 27:2001). This paper describes VD experiments using uniform products obtained from one cadaver donor, i.e., cancellous bones and demineralized bone powders, and amnion grafts from one live donor. Results of the verification dose experiments show that the RSD is 15.4 kGy for cancellous and demineralized bone grafts and 19.2 kGy for amnion grafts according to method A1, and 25 kGy according to methods B and C.

  17. Inter-Disciplinary Collaboration in Support of the Post-Standby TREAT Mission

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DeHart, Mark; Baker, Benjamin; Ortensi, Javier

    Although analysis methods have advanced significantly in the last two decades, high fidelity multi-physics methods for reactor systems have been under development for only a few years and are not presently mature or deployed. Furthermore, very few methods provide the ability to simulate rapid transients in three dimensions. Data for validation of advanced time-dependent multi-physics are sparse; at TREAT, historical data were not collected for the purpose of validating three-dimensional methods, let alone multi-physics simulations. Existing data continue to be collected in an attempt to simulate the behavior of experiments and calibration transients, but they will be insufficient for the complete validation of analysis methods used for TREAT transient simulations. Hence, a 2018 restart will most likely occur without the direct application of advanced modeling and simulation methods. At present, the current INL modeling and simulation team plans to work with TREAT operations staff in performing reactor simulations with MAMMOTH, in parallel with the software packages currently being used in preparation for core restart (e.g., MCNP5, RELAP5, ABAQUS). The TREAT team has also requested specific measurements to be performed during startup testing, currently scheduled to run from February to August of 2018. These startup measurements will be crucial in validating the new analysis methods in preparation for ultimate application to TREAT operations and experiment design. This document describes the collaboration between modeling and simulation staff and the restart, operations, instrumentation, and experiment development teams to be able to interact effectively and achieve successful validation work during restart testing.

  18. The Role of Structural Models in the Solar Sail Flight Validation Process

    NASA Technical Reports Server (NTRS)

    Johnston, John D.

    2004-01-01

    NASA is currently soliciting proposals via the New Millennium Program ST-9 opportunity for a potential Solar Sail Flight Validation (SSFV) experiment to develop and operate in space a deployable solar sail that can be steered and provides measurable acceleration. The approach planned for this experiment is to test and validate models and processes for solar sail design, fabrication, deployment, and flight. These models and processes would then be used to design, fabricate, and operate scalable solar sails for future space science missions. There are six validation objectives planned for the ST9 SSFV experiment: 1) Validate solar sail design tools and fabrication methods; 2) Validate controlled deployment; 3) Validate in-space structural characteristics (focus of poster); 4) Validate solar sail attitude control; 5) Validate solar sail thrust performance; 6) Characterize the sail's electromagnetic interaction with the space environment. This poster presents a top-level assessment of the role of structural models in the validation process for in-space structural characteristics.

  19. Extension and Validation of a Hybrid Particle-Finite Element Method for Hypervelocity Impact Simulation. Chapter 2

    NASA Technical Reports Server (NTRS)

    Fahrenthold, Eric P.; Shivarama, Ravishankar

    2004-01-01

    The hybrid particle-finite element method of Fahrenthold and Horban, developed for the simulation of hypervelocity impact problems, has been extended to include new formulations of the particle-element kinematics, additional constitutive models, and an improved numerical implementation. The extended formulation has been validated in three dimensional simulations of published impact experiments. The test cases demonstrate good agreement with experiment, good parallel speedup, and numerical convergence of the simulation results.

  20. A Complex Systems Approach to Causal Discovery in Psychiatry.

    PubMed

    Saxe, Glenn N; Statnikov, Alexander; Fenyo, David; Ren, Jiwen; Li, Zhiguo; Prasad, Meera; Wall, Dennis; Bergman, Nora; Briggs, Ernestine C; Aliferis, Constantin

    2016-01-01

    Conventional research methodologies and data analytic approaches in psychiatric research are unable to reliably infer causal relations without experimental designs, or to make inferences about the functional properties of the complex systems in which psychiatric disorders are embedded. This article describes a series of studies to validate a novel hybrid computational approach--the Complex Systems-Causal Network (CS-CN) method--designed to integrate causal discovery within a complex systems framework for psychiatric research. The CS-CN method was first applied to an existing dataset on psychopathology in 163 children hospitalized with injuries (validation study). Next, it was applied to a much larger dataset of traumatized children (replication study). Finally, the CS-CN method was applied in a controlled experiment using a 'gold standard' dataset for causal discovery and compared with other methods for accurately detecting causal variables (resimulation controlled experiment). The CS-CN method successfully detected a causal network of 111 variables and 167 bivariate relations in the initial validation study. This causal network had well-defined adaptive properties, and a set of variables was found that disproportionately contributed to these properties. Modeling the removal of these variables resulted in significant loss of adaptive properties. The CS-CN method was successfully applied in the replication study and performed better than traditional statistical methods, and similarly to state-of-the-art causal discovery algorithms, in the causal detection experiment. The CS-CN method was validated, replicated, and yielded both novel and previously validated findings related to risk factors and potential treatments of psychiatric disorders. The novel approach yields both fine-grain (micro) and high-level (macro) insights and thus represents a promising approach for complex systems-oriented research in psychiatry.

  1. Considerations regarding the validation of chromatographic mass spectrometric methods for the quantification of endogenous substances in forensics.

    PubMed

    Hess, Cornelius; Sydow, Konrad; Kueting, Theresa; Kraemer, Michael; Maas, Alexandra

    2018-02-01

    The requirement for correct evaluation of forensic toxicological results in daily routine work and scientific studies is reliable analytical data based on validated methods. Validation of a method gives the analyst tools to estimate the efficacy and reliability of the analytical method. Without validation, data might be contested in court and lead to unjustified legal consequences for a defendant. Therefore, new analytical methods to be used in forensic toxicology require careful method development and validation of the final method. Until now, there have been no publications on the validation of chromatographic mass spectrometric methods for the detection of endogenous substances, although endogenous analytes can be important in forensic toxicology (alcohol consumption markers, congener alcohols, gamma-hydroxybutyric acid, human insulin and C-peptide, creatinine, postmortem clinical parameters). For these analytes, conventional validation instructions cannot be followed completely. In this paper, important practical considerations in analytical method validation for endogenous substances will be discussed, which may be used as guidance for scientists wishing to develop and validate analytical methods for analytes produced naturally in the human body. In particular, the validation parameters calibration model, analytical limits, accuracy (bias and precision), matrix effects, and recovery have to be approached differently. The highest attention should be paid to selectivity experiments. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. Validity of Adult Retrospective Reports of Adverse Childhood Experiences: Review of the Evidence

    ERIC Educational Resources Information Center

    Hardt, Jochen; Rutter, Michael

    2004-01-01

    Background: Influential studies have cast doubt on the validity of retrospective reports by adults of their own adverse experiences in childhood. Accordingly, many researchers view retrospective reports with scepticism. Method: A computer-based search, supplemented by hand searches, was used to identify studies reported between 1980 and 2001 in…

  3. Improving the quality of discrete-choice experiments in health: how can we assess validity and reliability?

    PubMed

    Janssen, Ellen M; Marshall, Deborah A; Hauber, A Brett; Bridges, John F P

    2017-12-01

    The recent endorsement of discrete-choice experiments (DCEs) and other stated-preference methods by regulatory and health technology assessment (HTA) agencies has placed a greater focus on demonstrating the validity and reliability of preference results. Areas covered: We present a practical overview of tests of validity and reliability that have been applied in the health DCE literature and explore other study qualities of DCEs. From the published literature, we identify a variety of methods to assess the validity and reliability of DCEs. We conceptualize these methods to create a conceptual model with four domains: measurement validity, measurement reliability, choice validity, and choice reliability. Each domain consists of three categories that can be assessed using one to four procedures (for a total of 24 tests). We present how these tests have been applied in the literature and direct readers to applications of these tests in the health DCE literature. Based on a stakeholder engagement exercise, we consider the importance of study characteristics beyond traditional concepts of validity and reliability. Expert commentary: We discuss study design considerations to assess the validity and reliability of a DCE, consider limitations to the current application of tests, and discuss future work to consider the quality of DCEs in healthcare.

  4. Directed Design of Experiments for Validating Probability of Detection Capability of a Testing System

    NASA Technical Reports Server (NTRS)

    Generazio, Edward R. (Inventor)

    2012-01-01

    A method of validating a probability of detection (POD) testing system using directed design of experiments (DOE) includes recording an input data set of observed hit and miss or analog data for sample components as a function of size of a flaw in the components. The method also includes processing the input data set to generate an output data set having an optimal class width, assigning a case number to the output data set, and generating validation instructions based on the assigned case number. An apparatus includes a host machine for receiving the input data set from the testing system and an algorithm for executing DOE to validate the test system. The algorithm applies DOE to the input data set to determine a data set having an optimal class width, assigns a case number to that data set, and generates validation instructions based on the case number.
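
    The patent describes a directed-DOE procedure over hit/miss data binned by flaw size; the sketch below shows only the conventional preliminary step of relating hit/miss outcomes to flaw size with a log-odds model and reading off a50 and a90. It is not the patented algorithm, and the flaw sizes and outcomes are illustrative.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hit/miss inspection outcomes as a function of flaw size (mm), illustrative data
size = np.array([0.2, 0.3, 0.4, 0.5, 0.6, 0.8, 1.0, 1.2, 1.5, 2.0])
hit = np.array([0, 0, 0, 1, 0, 1, 1, 1, 1, 1])

# Standard log-odds POD model: logit P(hit) = b0 + b1 * ln(size)
X = np.log(size).reshape(-1, 1)
model = LogisticRegression(C=1e6).fit(X, hit)   # large C ~ effectively unregularised
b0, b1 = model.intercept_[0], model.coef_[0, 0]

a50 = np.exp(-b0 / b1)                          # flaw size detected 50% of the time
a90 = np.exp((np.log(9) - b0) / b1)             # flaw size detected 90% of the time
print(f"a50 ≈ {a50:.2f} mm, a90 ≈ {a90:.2f} mm")
```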

  5. A statistical approach to selecting and confirming validation targets in -omics experiments

    PubMed Central

    2012-01-01

    Background: Genomic technologies are, by their very nature, designed for hypothesis generation. In some cases, the hypotheses that are generated require that genome scientists confirm findings about specific genes or proteins. But one major advantage of high-throughput technology is that global genetic, genomic, transcriptomic, and proteomic behaviors can be observed. Manual confirmation of every statistically significant genomic result is prohibitively expensive. This has led researchers in genomics to adopt the strategy of confirming only a handful of the most statistically significant results, a small subset chosen for biological interest, or a small random subset. But there is no standard approach for selecting and quantitatively evaluating validation targets. Results: Here we present a new statistical method and approach for statistically validating lists of significant results based on confirming only a small random sample. We apply our statistical method to show that the usual practice of confirming only the most statistically significant results does not statistically validate result lists. We analyze an extensively validated RNA-sequencing experiment to show that confirming a random subset can statistically validate entire lists of significant results. Finally, we analyze multiple publicly available microarray experiments to show that statistically validating random samples can both (i) provide evidence to confirm long gene lists and (ii) save thousands of dollars and hundreds of hours of labor over manual validation of each significant result. Conclusions: For high-throughput -omics studies, statistical validation is a cost-effective and statistically valid approach to confirming lists of significant results. PMID:22738145
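
    The core idea, confirming a small random sample of the significant list and using a binomial interval on the confirmed fraction to make a statement about the whole list, can be sketched as follows. The list size and follow-up outcomes are invented, and the exact interval choice (Clopper-Pearson here) is an assumption of the sketch rather than the paper's method.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
significant_genes = [f"gene_{i}" for i in range(500)]   # hypothetical list of significant results

# Confirm a small random sample instead of only the top-ranked hits
sample = rng.choice(significant_genes, size=20, replace=False)
confirmed = 18                                           # hypothetical follow-up outcome: 18 of 20 confirm

# Clopper-Pearson lower limit of a two-sided 95% CI on the list-wide confirmation rate
lower = stats.beta.ppf(0.025, confirmed, 20 - confirmed + 1)
print(f"{confirmed}/20 confirmed; list-wide confirmation rate >= {lower:.0%} (95% CI lower bound)")
```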

  6. A Validation Study of the Adolescent Dissociative Experiences Scale

    ERIC Educational Resources Information Center

    Keck Seeley, Susan M.; Perosa, Sandra L.; Perosa, Linda M.

    2004-01-01

    Objective: The purpose of this study was to further the validation process of the Adolescent Dissociative Experiences Scale (A-DES). In this study, a 6-point Likert response format with descriptors was used when responding to the A-DES rather than the 11-point response format used in the original A-DES. Method: The internal reliability and construct…

  7. Archeointensity estimates of a tenth-century kiln: first application of the Tsunakawa-Shaw paleointensity method to archeological relics

    NASA Astrophysics Data System (ADS)

    Kitahara, Yu; Yamamoto, Yuhji; Ohno, Masao; Kuwahara, Yoshihiro; Kameda, Shuichi; Hatakeyama, Tadahiro

    2018-05-01

    Paleomagnetic information reconstructed from archeological materials can be utilized to estimate the archeological age of excavated relics, in addition to revealing the geomagnetic secular variation and core dynamics. The direction and intensity of the Earth's magnetic field (archeodirection and archeointensity) can be ascertained using different methods, many of which have been proposed over the past decade. Among the new experimental techniques for archeointensity estimates is the Tsunakawa-Shaw method. This study demonstrates the validity of the Tsunakawa-Shaw method for reconstructing archeointensity from samples of baked clay from archeological relics. The validity of the approach was tested by comparison with the IZZI-Thellier method. The intensity values obtained coincided at the standard deviation (1 σ) level. A total of 8 specimens for the Tsunakawa-Shaw method and 16 specimens for the IZZI-Thellier method, prepared from 8 baked clay blocks collected from the surface of the kiln, were used in these experiments. Among them, 8 specimens (for the Tsunakawa-Shaw method) and 3 specimens (for the IZZI-Thellier method) passed a set of strict selection criteria used in the final evaluation of validity. Additionally, we performed rock magnetic experiments, mineral analysis, and paleodirection measurement to evaluate the suitability of the baked clay samples for paleointensity experiments and hence confirmed that the sample properties were ideal for performing paleointensity experiments. It is notable that the newly estimated archeomagnetic intensity values are lower than those in previous studies that used other paleointensity methods for the tenth century in Japan.

  8. Validation of the revised Mystical Experience Questionnaire in experimental sessions with psilocybin.

    PubMed

    Barrett, Frederick S; Johnson, Matthew W; Griffiths, Roland R

    2015-11-01

    The 30-item revised Mystical Experience Questionnaire (MEQ30) was previously developed within an online survey of mystical-type experiences occasioned by psilocybin-containing mushrooms. The rated experiences occurred on average eight years before completion of the questionnaire. The current paper validates the MEQ30 using data from experimental studies with controlled doses of psilocybin. Data were pooled and analyzed from five laboratory experiments in which participants (n=184) received a moderate to high oral dose of psilocybin (at least 20 mg/70 kg). Results of confirmatory factor analysis demonstrate the reliability and internal validity of the MEQ30. Structural equation models demonstrate the external and convergent validity of the MEQ30 by showing that latent variable scores on the MEQ30 positively predict persisting change in attitudes, behavior, and well-being attributed to experiences with psilocybin while controlling for the contribution of the participant-rated intensity of drug effects. These findings support the use of the MEQ30 as an efficient measure of individual mystical experiences. A method to score a "complete mystical experience" that was used in previous versions of the mystical experience questionnaire is validated in the MEQ30, and a stand-alone version of the MEQ30 is provided for use in future research. © The Author(s) 2015.

  9. Moving to Capture Children's Attention: Developing a Methodology for Measuring Visuomotor Attention.

    PubMed

    Hill, Liam J B; Coats, Rachel O; Mushtaq, Faisal; Williams, Justin H G; Aucott, Lorna S; Mon-Williams, Mark

    2016-01-01

    Attention underpins many activities integral to a child's development. However, methodological limitations currently make large-scale assessment of children's attentional skill impractical, costly and lacking in ecological validity. Consequently, we developed a measure of 'Visual Motor Attention' (VMA), a construct defined as the ability to sustain and adapt visuomotor behaviour in response to task-relevant visual information. In a series of experiments, we evaluated the capability of our method to measure attentional processes and their contributions in guiding visuomotor behaviour. Experiment 1 established the method's core features (ability to track stimuli moving on a tablet-computer screen with a hand-held stylus) and demonstrated its sensitivity to principled manipulations in adults' attentional load. Experiment 2 standardised a format suitable for use with children and showed construct validity by capturing developmental changes in executive attention processes. Experiment 3 tested the hypothesis that children with and without coordination difficulties would show qualitatively different response patterns, finding an interaction between the cognitive and motor factors underpinning responses. Experiment 4 identified associations between VMA performance and existing standardised attention assessments and thereby confirmed convergent validity. These results establish a novel approach to measuring childhood attention that can produce meaningful functional assessments that capture how attention operates in an ecologically valid context (i.e. attention's specific contribution to visuomanual action).

  10. Validation of space-based polarization measurements by use of a single-scattering approximation, with application to the global ozone monitoring experiment.

    PubMed

    Aben, Ilse; Tanzi, Cristina P; Hartmann, Wouter; Stam, Daphne M; Stammes, Piet

    2003-06-20

    A method is presented for in-flight validation of space-based polarization measurements based on approximation of the direction of polarization of scattered sunlight by the Rayleigh single-scattering value. This approximation is verified by simulations of radiative transfer calculations for various atmospheric conditions. The simulations show locations along an orbit where the scattering geometries are such that the intensities of the parallel and orthogonal polarization components of the light are equal, regardless of the observed atmosphere and surface. The method can be applied to any space-based instrument that measures the polarization of reflected solar light. We successfully applied the method to validate the Global Ozone Monitoring Experiment (GOME) polarization measurements. The error in the GOME's three broadband polarization measurements appears to be approximately 1%.
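
    The geometric core of the validation can be sketched with textbook Rayleigh formulas: compute the single-scattering angle from the solar and viewing geometry and the corresponding degree of linear polarization. The sign convention and the neglect of depolarization below are assumptions of this sketch, not GOME processing code.

```python
import numpy as np

def scattering_angle(sza, vza, raa):
    """Single-scattering angle (deg) for reflected sunlight, one common sign convention.
    sza/vza: solar/viewing zenith angles; raa: relative azimuth (all in degrees)."""
    mu0, mu = np.cos(np.radians(sza)), np.cos(np.radians(vza))
    cos_theta = -mu0 * mu + np.sqrt((1 - mu0**2) * (1 - mu**2)) * np.cos(np.radians(raa))
    return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

def rayleigh_dolp(theta_deg):
    """Degree of linear polarization for Rayleigh single scattering (no depolarization)."""
    c = np.cos(np.radians(theta_deg))
    return (1 - c**2) / (1 + c**2)

theta = scattering_angle(sza=60, vza=30, raa=90)
print(f"scattering angle = {theta:.1f} deg, Rayleigh DoLP = {rayleigh_dolp(theta):.2f}")
```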

  11. Remote Patron Validation: Posting a Proxy Server at the Digital Doorway.

    ERIC Educational Resources Information Center

    Webster, Peter

    2002-01-01

    Discussion of remote access to library services focuses on proxy servers as a method for remote access, based on experiences at Saint Mary's University (Halifax). Topics include Internet protocol user validation; browser-directed proxies; server software proxies; vendor alternatives for validating remote users; and Internet security issues. (LRW)

  12. Validation of design procedure and performance modeling of a heat and fluid transport field experiment in the unsaturated zone

    NASA Astrophysics Data System (ADS)

    Nir, A.; Doughty, C.; Tsang, C. F.

    Validation methods that were developed in the context of the deterministic concepts of past generations often cannot be directly applied to environmental problems, which may be characterized by limited reproducibility of results and highly complex models. Instead, validation is interpreted here as a series of activities, including both theoretical and experimental tests, designed to enhance our confidence in the capability of a proposed model to describe some aspect of reality. We examine the validation process applied to a project concerned with heat and fluid transport in porous media, in which mathematical modeling, simulation, and results of field experiments are evaluated in order to determine the feasibility of a system for seasonal thermal energy storage in shallow unsaturated soils. Technical details of the field experiments are not included, but appear in previous publications. Validation activities are divided into three stages. The first stage, carried out prior to the field experiments, is concerned with modeling the relevant physical processes, optimization of the heat-exchanger configuration and the shape of the storage volume, and multi-year simulation. Subjects requiring further theoretical and experimental study are identified at this stage. The second stage encompasses the planning and evaluation of the initial field experiment. Simulations are made to determine the experimental time scale and optimal sensor locations. Soil thermal parameters and temperature boundary conditions are estimated using an inverse method. Then results of the experiment are compared with model predictions using different parameter values and modeling approximations. In the third stage, results of an experiment performed under different boundary conditions are compared to predictions made by the models developed in the second stage. Various aspects of this theoretical and experimental field study are described as examples of the verification and validation procedure. There is no attempt to validate a specific model, but several models of increasing complexity are compared with experimental results. The outcome is interpreted as a demonstration of the paradigm proposed by van der Heijde [26] that different constituencies have different objectives for the validation process and therefore their acceptance criteria differ as well.
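
    As one concrete example of an inverse estimate of a soil thermal parameter, the sketch below recovers a thermal diffusivity from the damping of a periodic temperature wave with depth. This closed-form textbook inversion stands in for the full numerical inverse method used in the study, and all numbers are illustrative.

```python
import numpy as np

# Inverse estimate of soil thermal diffusivity from amplitude damping with depth:
# for a periodic surface forcing, A(z) ~ exp(-z/d) with damping depth d = sqrt(2*alpha/omega).
omega = 2 * np.pi / (365.0 * 86400.0)   # annual forcing frequency (rad/s)
z1, z2 = 0.5, 2.0                       # sensor depths (m), illustrative
A1, A2 = 6.0, 2.2                       # seasonal temperature amplitudes (K) at z1, z2

damping_depth = (z2 - z1) / np.log(A1 / A2)
alpha = omega * damping_depth**2 / 2.0
print(f"thermal diffusivity ≈ {alpha:.2e} m^2/s")
```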

  13. Validation of the revised Mystical Experience Questionnaire in experimental sessions with psilocybin

    PubMed Central

    Barrett, Frederick S; Johnson, Matthew W; Griffiths, Roland R

    2016-01-01

    The 30-item revised Mystical Experience Questionnaire (MEQ30) was previously developed within an online survey of mystical-type experiences occasioned by psilocybin-containing mushrooms. The rated experiences occurred on average eight years before completion of the questionnaire. The current paper validates the MEQ30 using data from experimental studies with controlled doses of psilocybin. Data were pooled and analyzed from five laboratory experiments in which participants (n=184) received a moderate to high oral dose of psilocybin (at least 20 mg/70 kg). Results of confirmatory factor analysis demonstrate the reliability and internal validity of the MEQ30. Structural equation models demonstrate the external and convergent validity of the MEQ30 by showing that latent variable scores on the MEQ30 positively predict persisting change in attitudes, behavior, and well-being attributed to experiences with psilocybin while controlling for the contribution of the participant-rated intensity of drug effects. These findings support the use of the MEQ30 as an efficient measure of individual mystical experiences. A method to score a “complete mystical experience” that was used in previous versions of the mystical experience questionnaire is validated in the MEQ30, and a stand-alone version of the MEQ30 is provided for use in future research. PMID:26442957

  14. Validation of spatial variability in downscaling results from the VALUE perfect predictor experiment

    NASA Astrophysics Data System (ADS)

    Widmann, Martin; Bedia, Joaquin; Gutiérrez, Jose Manuel; Maraun, Douglas; Huth, Radan; Fischer, Andreas; Keller, Denise; Hertig, Elke; Vrac, Mathieu; Wibig, Joanna; Pagé, Christian; Cardoso, Rita M.; Soares, Pedro MM; Bosshard, Thomas; Casado, Maria Jesus; Ramos, Petra

    2016-04-01

    VALUE is an open European network to validate and compare downscaling methods for climate change research. Within VALUE, a systematic validation framework has been developed to enable the assessment and comparison of both dynamical and statistical downscaling methods. In the first validation experiment the downscaling methods are validated in a setup with perfect predictors taken from the ERA-Interim reanalysis for the period 1979-2008. This allows the isolated skill of downscaling methods to be investigated without further error contributions from the large-scale predictors. One aspect of the validation is the representation of spatial variability. As part of the VALUE validation we have compared various properties of the spatial variability of downscaled daily temperature and precipitation with the corresponding properties in observations. We have used two validation datasets, one European-wide set of 86 stations, and one higher-density network of 50 stations in Germany. Here we present results based on three approaches, namely the analysis of (i) correlation matrices, (ii) pairwise joint threshold exceedances, and (iii) regions of similar variability. We summarise the information contained in correlation matrices by calculating the dependence of the correlations on distance and deriving decorrelation lengths, as well as by determining the independent degrees of freedom. Probabilities for joint threshold exceedances and (where appropriate) non-exceedances are calculated for various user-relevant thresholds related, for instance, to extreme precipitation or frost and heat days. The dependence of these probabilities on distance is again characterised by calculating typical length scales that separate dependent from independent exceedances. Regionalisation is based on rotated Principal Component Analysis. The results indicate which downscaling methods are preferable if the dependency of variability at different locations is relevant for the user.
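
    One of the listed diagnostics, the decorrelation length, can be sketched by fitting an exponential decay to pairwise correlations as a function of station separation and reading off the distance at which the correlation falls to 1/e. The distances and correlations below are synthetic, and the exponential form is an assumption of this sketch.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(3)
distance = rng.uniform(10, 800, 200)    # pairwise station separations (km), synthetic
true_L = 250.0
corr = np.exp(-distance / true_L) + rng.normal(0, 0.05, distance.size)

def decay(d, L):
    """Exponential decay of correlation with distance."""
    return np.exp(-d / L)

(L_fit,), _ = curve_fit(decay, distance, corr, p0=[100.0])
print(f"decorrelation length ≈ {L_fit:.0f} km")   # distance at which correlation drops to 1/e
```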

  15. Validation in Support of Internationally Harmonised OECD Test Guidelines for Assessing the Safety of Chemicals.

    PubMed

    Gourmelon, Anne; Delrue, Nathalie

    Ten years have elapsed since the OECD published the Guidance Document on the validation and international regulatory acceptance of test methods for hazard assessment. Much experience has been gained since then in validation centres, in countries, and at the OECD on a variety of test methods that were subjected to validation studies. This chapter reviews validation principles and highlights common features that appear to be important for further regulatory acceptance across studies. Existing OECD-agreed validation principles will most likely remain generally relevant and applicable to address the challenges associated with the validation of future test methods. Some adaptations may be needed to take into account the level of technique introduced in test systems, but demonstration of relevance and reliability will continue to play a central role as a prerequisite for regulatory acceptance. Demonstration of relevance will become more challenging for test methods that form part of a set of predictive tools and methods and that do not stand alone. The OECD is keen on ensuring that, while these concepts evolve, countries can continue to rely on valid methods and harmonised approaches for efficient testing and assessment of chemicals.

  16. PSI-Center Simulations of Validation Platform Experiments

    NASA Astrophysics Data System (ADS)

    Nelson, B. A.; Akcay, C.; Glasser, A. H.; Hansen, C. J.; Jarboe, T. R.; Marklin, G. J.; Milroy, R. D.; Morgan, K. D.; Norgaard, P. C.; Shumlak, U.; Victor, B. S.; Sovinec, C. R.; O'Bryan, J. B.; Held, E. D.; Ji, J.-Y.; Lukin, V. S.

    2013-10-01

    The Plasma Science and Innovation Center (PSI-Center - http://www.psicenter.org) supports collaborating validation platform experiments with extended MHD simulations. Collaborators include the Bellan Plasma Group (Caltech), CTH (Auburn U), FRX-L (Los Alamos National Laboratory), HIT-SI (U Wash - UW), LTX (PPPL), MAST (Culham), Pegasus (U Wisc-Madison), PHD/ELF (UW/MSNW), SSX (Swarthmore College), TCSU (UW), and ZaP/ZaP-HD (UW). Modifications have been made to the NIMROD, HiFi, and PSI-Tet codes to specifically model these experiments, including mesh generation/refinement, non-local closures, appropriate boundary conditions (external fields, insulating BCs, etc.), and kinetic and neutral particle interactions. The PSI-Center is exploring application of validation metrics between experimental data and simulations results. Biorthogonal decomposition is proving to be a powerful method to compare global temporal and spatial structures for validation. Results from these simulation and validation studies, as well as an overview of the PSI-Center status will be presented.
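
    Biorthogonal decomposition of space-time data amounts in practice to an SVD of the data matrix; the sketch below compares the leading spatial structure of an "experiment" and a "simulation" as a simple validation metric. The synthetic signals and the single-mode overlap metric are illustrative assumptions, not PSI-Center code.

```python
import numpy as np

rng = np.random.default_rng(4)
t = np.linspace(0, 1, 200)
x = np.linspace(0, 1, 32)
# Synthetic "experiment" and "simulation": one coherent space-time mode plus noise
mode = np.outer(np.sin(2 * np.pi * 3 * t), np.sin(np.pi * x))
experiment = mode + 0.1 * rng.normal(size=mode.shape)
simulation = 0.9 * mode + 0.1 * rng.normal(size=mode.shape)

def bod(data):
    """Biorthogonal decomposition = SVD of the (time x space) data matrix."""
    chronos, weights, topos = np.linalg.svd(data, full_matrices=False)
    return chronos, weights, topos

_, _, topos_exp = bod(experiment)
_, _, topos_sim = bod(simulation)

# Simple validation metric: overlap of the leading spatial structures (topos)
overlap = abs(np.dot(topos_exp[0], topos_sim[0]))
print(f"leading-mode spatial overlap = {overlap:.3f}")
```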

  17. Demonstrating Experimenter "Ineptitude" as a Means of Teaching Internal and External Validity

    ERIC Educational Resources Information Center

    Treadwell, Kimberli R.H.

    2008-01-01

    Internal and external validity are key concepts in understanding the scientific method and fostering critical thinking. This article describes a class demonstration of a "botched" experiment to teach validity to undergraduates. Psychology students (N = 75) completed assessments at the beginning of the semester, prior to and immediately following…

  18. Robotic suturing on the FLS model possesses construct validity, is less physically demanding, and is favored by more surgeons compared with laparoscopy.

    PubMed

    Stefanidis, Dimitrios; Hope, William W; Scott, Daniel J

    2011-07-01

    The value of robotic assistance for intracorporeal suturing is not well defined. We compared robotic suturing with laparoscopic suturing on the FLS model with a large cohort of surgeons. Attendees (n=117) at the SAGES 2006 Learning Center robotic station placed intracorporeal sutures on the FLS box-trainer model using conventional laparoscopic instruments and the da Vinci® robot. Participant performance was recorded using a validated objective scoring system, and a questionnaire regarding demographics, task workload, and suturing modality preference was completed. Construct validity for both tasks was assessed by comparing the performance scores of subjects with various levels of experience. A validated questionnaire was used for workload measurement. Of the participants, 84% had prior laparoscopic and 10% prior robotic suturing experience. Within the allotted time, 83% of participants completed the suturing task laparoscopically and 72% with the robot. Construct validity was demonstrated for both simulated tasks according to the participants' advanced laparoscopic experience, laparoscopic suturing experience, and self-reported laparoscopic suturing ability (p<0.001 for all) and according to prior robotic experience, robotic suturing experience, and self-reported robotic suturing ability (p<0.001 for all), respectively. While participants achieved higher suturing scores with standard laparoscopy compared with the robot (84±75 vs. 56±63, respectively; p<0.001), they found the laparoscopic task more physically demanding (NASA score 13±5 vs. 10±5, respectively; p<0.001) and favored the robot as their method of choice for intracorporeal suturing (62 vs. 38%, respectively; p<0.01). Construct validity was demonstrated for robotic suturing on the FLS model. Suturing scores were higher using standard laparoscopy likely as a result of the participants' greater experience with laparoscopic suturing versus robotic suturing. Robotic assistance decreases the physical demand of intracorporeal suturing compared with conventional laparoscopy and, in this study, was the preferred suturing method by most surgeons. Curricula for robotic suturing training need to be developed.

  19. Development of an ultra high performance liquid chromatography method for determining triamcinolone acetonide in hydrogels using the design of experiments/design space strategy in combination with process capability index.

    PubMed

    Oliva, Alexis; Monzón, Cecilia; Santoveña, Ana; Fariña, José B; Llabrés, Matías

    2016-07-01

    An ultra high performance liquid chromatography method was developed and validated for the quantitation of triamcinolone acetonide in an injectable ophthalmic hydrogel, in order to determine the contribution of analytical method error to the content uniformity measurement. During the development phase, the design of experiments/design space strategy was used. For this, the free R program was used as an alternative to commercial software, providing a fast and efficient tool for data analysis. The process capability index was used to find the permitted level of variation for each factor and to define the design space. All these aspects were analyzed and discussed under different experimental conditions using the Monte Carlo simulation method. Subsequently, a pre-study validation procedure was performed in accordance with the International Conference on Harmonization guidelines. The validated method was applied to the determination of uniformity of dosage units, and the reasons for variability (inhomogeneity and analytical method error) were analyzed based on the overall uncertainty. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
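
    The process capability index mentioned above relates the spread of a simulated response to its specification limits. A minimal Monte Carlo Cpk sketch is shown below; the specification limits, factors, and noise levels are invented for illustration and are not taken from the paper.

```python
# Illustrative sketch: a process capability index (Cpk) from Monte Carlo
# simulated assay responses. All numbers are invented for the example.
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical specification limits for recovery (%).
LSL, USL = 95.0, 105.0

# Monte Carlo: propagate assumed variation in two method factors into recovery.
flow_effect = rng.normal(0.0, 0.6, 100_000)     # % recovery shift from flow rate
temp_effect = rng.normal(0.0, 0.8, 100_000)     # % recovery shift from column temperature
recovery = 100.0 + flow_effect + temp_effect

mu, sigma = recovery.mean(), recovery.std(ddof=1)
cpk = min(USL - mu, mu - LSL) / (3 * sigma)
print(f"mean {mu:.2f}%, sd {sigma:.2f}%, Cpk {cpk:.2f}")
```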

  20. Helicopter rotor loads using a matched asymptotic expansion technique

    NASA Technical Reports Server (NTRS)

    Pierce, G. A.; Vaidyanathan, A. R.

    1981-01-01

    The theoretical basis and computational feasibility of the Van Holten method were examined, and its performance and range of validity were assessed by comparison with experiment and other approximate methods. It is found that, within the restrictions of incompressible potential flow and the assumption of small disturbances, the method does lead to a valid description of the flow. However, the method begins to break down under conditions favoring nonlinear effects such as wake distortion and blade/rotor interaction.

  1. A Possible Tool for Checking Errors in the INAA Results, Based on Neutron Data and Method Validation

    NASA Astrophysics Data System (ADS)

    Cincu, Em.; Grigore, Ioana Manea; Barbos, D.; Cazan, I. L.; Manu, V.

    2008-08-01

    This work presents preliminary results of a new type of possible application of INAA elemental analysis experiments, useful for checking errors that occur during the investigation of unknown samples; it relies on the INAA method validation experiments and on the accuracy of neutron data from the literature. The paper comprises two sections. The first briefly presents the steps of the experimental tests carried out for INAA method validation and for establishing the performance of the 'ACTIVA-N' laboratory, which also illustrates the laboratory's evolution towards achieving that performance. Section 2 presents our recent INAA results on CRMs, whose interpretation opens a discussion on the usefulness of a tool for checking possible errors that differs from the usual statistical procedures. The questionable aspects and the requirements for developing a practical checking tool are discussed.

  2. Evaluation of Fission Product Critical Experiments and Associated Biases for Burnup Credit Validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mueller, Don; Rearden, Bradley T; Reed, Davis Allan

    2010-01-01

    One of the challenges associated with implementation of burnup credit is the validation of the criticality calculations used in the safety evaluation, in particular the availability and use of applicable critical experiment data. The purpose of the validation is to quantify the relationship between reality and calculated results. Validation and determination of bias and bias uncertainty require the identification of sets of critical experiments that are similar to the criticality safety models. A principal challenge for crediting fission products (FP) in a burnup credit safety evaluation is the limited availability of relevant FP critical experiments for bias and bias uncertainty determination. This paper provides an evaluation of the available critical experiments that include FPs, along with bounding, burnup-dependent estimates of FP biases generated by combining energy-dependent sensitivity data for a typical burnup credit application with the nuclear data uncertainty information distributed with SCALE 6. A method for determining separate bias and bias uncertainty values for individual FPs, together with illustrative results, is presented. Finally, a FP bias calculation method based on data adjustment techniques and reactivity sensitivity coefficients calculated with the SCALE sensitivity/uncertainty tools, along with some typical results, is presented. Using the methods described in this paper, the cross-section bias for a representative high-capacity spent fuel cask associated with the ENDF/B-VII nuclear data for the 16 most important stable or near-stable FPs is predicted to be no greater than 2% of the total worth of the 16 FPs, or less than 0.13% Δk/k.
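
    The combination of energy-dependent sensitivities with nuclear data covariances described above is commonly done with first-order propagation (the "sandwich rule"). A minimal sketch follows; the three-group sensitivity vector and covariance matrix are invented numbers, not SCALE 6 data.

```python
# Illustrative sketch: first-order ("sandwich rule") propagation of nuclear data
# covariance through energy-dependent sensitivities to estimate a k-eff
# uncertainty contribution from one fission product. Numbers are invented.
import numpy as np

# Hypothetical 3-group relative sensitivity of k-eff to a FP capture cross section.
S = np.array([-0.002, -0.010, -0.004])

# Hypothetical relative covariance matrix of that cross section (3 groups).
C = np.array([[0.010, 0.004, 0.001],
              [0.004, 0.020, 0.006],
              [0.001, 0.006, 0.030]])

# Relative k-eff variance from this nuclide/reaction: S^T C S.
var_k = S @ C @ S
print(f"one-sigma k-eff uncertainty from this FP: {np.sqrt(var_k)*100:.3f} %")
```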

  3. VALUE - A Framework to Validate Downscaling Approaches for Climate Change Studies

    NASA Astrophysics Data System (ADS)

    Maraun, Douglas; Widmann, Martin; Gutiérrez, José M.; Kotlarski, Sven; Chandler, Richard E.; Hertig, Elke; Wibig, Joanna; Huth, Radan; Wilcke, Renate A. I.

    2015-04-01

    VALUE is an open European network to validate and compare downscaling methods for climate change research. VALUE aims to foster collaboration and knowledge exchange between climatologists, impact modellers, statisticians, and stakeholders to establish an interdisciplinary downscaling community. A key deliverable of VALUE is the development of a systematic validation framework to enable the assessment and comparison of both dynamical and statistical downscaling methods. Here, we present the key ingredients of this framework. VALUE's main approach to validation is user-focused: starting from a specific user problem, a validation tree guides the selection of relevant validation indices and performance measures. Several experiments have been designed to isolate specific points in the downscaling procedure where problems may occur: what is the isolated downscaling skill? How do statistical and dynamical methods compare? How do methods perform at different spatial scales? Do methods fail in representing regional climate change? How is the overall representation of regional climate, including errors inherited from global climate models? The framework will be the basis for a comprehensive community-open downscaling intercomparison study, but is intended also to provide general guidance for other validation studies.

  4. VALUE: A framework to validate downscaling approaches for climate change studies

    NASA Astrophysics Data System (ADS)

    Maraun, Douglas; Widmann, Martin; Gutiérrez, José M.; Kotlarski, Sven; Chandler, Richard E.; Hertig, Elke; Wibig, Joanna; Huth, Radan; Wilcke, Renate A. I.

    2015-01-01

    VALUE is an open European network to validate and compare downscaling methods for climate change research. VALUE aims to foster collaboration and knowledge exchange between climatologists, impact modellers, statisticians, and stakeholders to establish an interdisciplinary downscaling community. A key deliverable of VALUE is the development of a systematic validation framework to enable the assessment and comparison of both dynamical and statistical downscaling methods. In this paper, we present the key ingredients of this framework. VALUE's main approach to validation is user-focused: starting from a specific user problem, a validation tree guides the selection of relevant validation indices and performance measures. Several experiments have been designed to isolate specific points in the downscaling procedure where problems may occur: what is the isolated downscaling skill? How do statistical and dynamical methods compare? How do methods perform at different spatial scales? Do methods fail in representing regional climate change? How is the overall representation of regional climate, including errors inherited from global climate models? The framework will be the basis for a comprehensive community-open downscaling intercomparison study, but is intended also to provide general guidance for other validation studies.

  5. VDA, a Method of Choosing a Better Algorithm with Fewer Validations

    PubMed Central

    Kluger, Yuval

    2011-01-01

    The multitude of bioinformatics algorithms designed for performing a particular computational task presents end-users with the problem of selecting the most appropriate computational tool for analyzing their biological data. The choice of the best available method is often based on expensive experimental validation of the results. We propose an approach to design validation sets for method comparison and performance assessment that are effective in terms of cost and discrimination power. Validation Discriminant Analysis (VDA) is a method for designing a minimal validation dataset to allow reliable comparisons between the performances of different algorithms. Implementation of our VDA approach achieves this reduction by selecting predictions that maximize the minimum Hamming distance between algorithmic predictions in the validation set. We show that VDA can be used to correctly rank algorithms according to their performances. These results are further supported by simulations and by realistic algorithmic comparisons in silico. VDA is a novel, cost-efficient method for minimizing the number of validation experiments necessary for reliable performance estimation and fair comparison between algorithms. Our VDA software is available at http://sourceforge.net/projects/klugerlab/files/VDA/ PMID:22046256
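
    The core selection rule described above (choosing validation candidates that maximize the minimum Hamming distance between the algorithms' predictions) can be sketched with a simple greedy heuristic, shown below on synthetic binary calls. This is an illustrative approximation, not the published VDA software.

```python
# Illustrative sketch of the VDA idea: pick a small validation set that maximizes
# the minimum Hamming distance between competing algorithms' 0/1 predictions.
# Greedy heuristic on synthetic data; not the published implementation.
import numpy as np
from itertools import combinations

rng = np.random.default_rng(3)
n_algorithms, n_candidates, budget = 4, 500, 20
calls = rng.integers(0, 2, size=(n_algorithms, n_candidates))   # 0/1 predictions

def min_pairwise_hamming(cols):
    """Smallest Hamming distance between any two algorithms on the chosen columns."""
    sub = calls[:, cols]
    return min(np.sum(sub[i] != sub[j]) for i, j in combinations(range(n_algorithms), 2))

selected = []
for _ in range(budget):
    best = max(
        (c for c in range(n_candidates) if c not in selected),
        key=lambda c: min_pairwise_hamming(selected + [c]),
    )
    selected.append(best)

print("selected candidate indices:", selected)
print("min pairwise Hamming distance on the set:", min_pairwise_hamming(selected))
```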

  6. On the validity of the autobiographical emotional memory task for emotion induction.

    PubMed

    Mills, Caitlin; D'Mello, Sidney

    2014-01-01

    The Autobiographical Emotional Memory Task (AEMT), which involves recalling and writing about intense emotional experiences, is a widely used method to experimentally induce emotions. The validity of this method depends upon the extent to which it can induce specific desired emotions (intended emotions), while not inducing any other (incidental) emotions at different levels across one (or more) conditions. A review of recent studies that used this method indicated that most studies exclusively monitor post-writing ratings of the intended emotions, without assessing the possibility that the method may have differentially induced other incidental emotions as well. We investigated the extent of this issue by collecting both pre- and post-writing ratings of incidental emotions in addition to the intended emotions. Using methods largely adapted from previous studies, participants were assigned to write about a profound experience of anger or fear (Experiment 1) or happiness or sadness (Experiment 2). In line with previous research, results indicated that intended emotions (anger and fear) were successfully induced in the respective conditions in Experiment 1. However, disgust and sadness were also induced while writing about an angry experience compared to a fearful experience. Similarly, although happiness and sadness were induced in the appropriate conditions, Experiment 2 indicated that writing about a sad experience also induced disgust, fear, and anger, compared to writing about a happy experience. Possible resolutions to avoid the limitations of the AEMT to induce specific discrete emotions are discussed.

  7. On the Validity of the Autobiographical Emotional Memory Task for Emotion Induction

    PubMed Central

    Mills, Caitlin; D'Mello, Sidney

    2014-01-01

    The Autobiographical Emotional Memory Task (AEMT), which involves recalling and writing about intense emotional experiences, is a widely used method to experimentally induce emotions. The validity of this method depends upon the extent to which it can induce specific desired emotions (intended emotions), while not inducing any other (incidental) emotions at different levels across one (or more) conditions. A review of recent studies that used this method indicated that most studies exclusively monitor post-writing ratings of the intended emotions, without assessing the possibility that the method may have differentially induced other incidental emotions as well. We investigated the extent of this issue by collecting both pre- and post-writing ratings of incidental emotions in addition to the intended emotions. Using methods largely adapted from previous studies, participants were assigned to write about a profound experience of anger or fear (Experiment 1) or happiness or sadness (Experiment 2). In line with previous research, results indicated that intended emotions (anger and fear) were successfully induced in the respective conditions in Experiment 1. However, disgust and sadness were also induced while writing about an angry experience compared to a fearful experience. Similarly, although happiness and sadness were induced in the appropriate conditions, Experiment 2 indicated that writing about a sad experience also induced disgust, fear, and anger, compared to writing about a happy experience. Possible resolutions to avoid the limitations of the AEMT to induce specific discrete emotions are discussed. PMID:24776697

  8. JaCVAM-organized international validation study of the in vivo rodent alkaline comet assay for the detection of genotoxic carcinogens: I. Summary of pre-validation study results.

    PubMed

    Uno, Yoshifumi; Kojima, Hajime; Omori, Takashi; Corvi, Raffaella; Honma, Masamitsu; Schechtman, Leonard M; Tice, Raymond R; Burlinson, Brian; Escobar, Patricia A; Kraynak, Andrew R; Nakagawa, Yuzuki; Nakajima, Madoka; Pant, Kamala; Asano, Norihide; Lovell, David; Morita, Takeshi; Ohno, Yasuo; Hayashi, Makoto

    2015-07-01

    The in vivo rodent alkaline comet assay (comet assay) is used internationally to investigate the in vivo genotoxic potential of test chemicals. This assay, however, has not previously been formally validated. The Japanese Center for the Validation of Alternative Methods (JaCVAM), with the cooperation of the U.S. NTP Interagency Center for the Evaluation of Alternative Toxicological Methods (NICEATM)/the Interagency Coordinating Committee on the Validation of Alternative Methods (ICCVAM), the European Centre for the Validation of Alternative Methods (ECVAM), and the Japanese Environmental Mutagen Society/Mammalian Mutagenesis Study Group (JEMS/MMS), organized an international validation study to evaluate the reliability and relevance of the assay for identifying genotoxic carcinogens, using liver and stomach as target organs. The ultimate goal of this validation effort was to establish an Organisation for Economic Co-operation and Development (OECD) test guideline. The purpose of the pre-validation studies (i.e., Phase 1 through 3), conducted in four or five laboratories with extensive comet assay experience, was to optimize the protocol to be used during the definitive validation study. Copyright © 2015 Elsevier B.V. All rights reserved.

  9. Use of Bayesian Networks to Probabilistically Model and Improve the Likelihood of Validation of Microarray Findings by RT-PCR

    PubMed Central

    English, Sangeeta B.; Shih, Shou-Ching; Ramoni, Marco F.; Smith, Lois E.; Butte, Atul J.

    2014-01-01

    Though genome-wide technologies, such as microarrays, are widely used, data from these methods are considered noisy; there is still varied success in downstream biological validation. We report a method that increases the likelihood of successfully validating microarray findings using real time RT-PCR, including genes at low expression levels and with small differences. We use a Bayesian network to identify the most relevant sources of noise based on the successes and failures in validation for an initial set of selected genes, and then improve our subsequent selection of genes for validation based on eliminating these sources of noise. The network displays the significant sources of noise in an experiment, and scores the likelihood of validation for every gene. We show how the method can significantly increase validation success rates. In conclusion, in this study, we have successfully added a new automated step to determine the contributory sources of noise that determine successful or unsuccessful downstream biological validation. PMID:18790084

  10. Towards a full integration of optimization and validation phases: An analytical-quality-by-design approach.

    PubMed

    Hubert, C; Houari, S; Rozet, E; Lebrun, P; Hubert, Ph

    2015-05-22

    When using an analytical method, defining an analytical target profile (ATP) focused on quantitative performance represents a key input, and this drives the method development process. In this context, two case studies were selected in order to demonstrate the potential of a quality-by-design (QbD) strategy when applied to two specific phases of the method lifecycle: the pre-validation study and the validation step. The first case study focused on the improvement of a liquid chromatography (LC) coupled to mass spectrometry (MS) stability-indicating method by means of the QbD concept. The design of experiments (DoE) conducted during the optimization step (i.e. determination of the qualitative design space (DS)) was performed a posteriori. Additional experiments were performed in order to simultaneously conduct the pre-validation study and thereby assist in defining the DoE to be conducted during the formal validation step. This predicted protocol was compared to the one used during the formal validation. A second case study, based on the LC/MS-MS determination of glucosamine and galactosamine in human plasma, was considered in order to illustrate an innovative strategy allowing the QbD methodology to be incorporated during the validation phase. An operational space, defined by the qualitative DS, was considered during the validation process rather than the specific set of working conditions conventionally used. Results for all the validation parameters conventionally studied were compared to those obtained with this innovative approach for glucosamine and galactosamine. Using this strategy, both qualitative and quantitative information were obtained. Consequently, an analyst using this approach would be able to select, with great confidence, several working conditions within the operational space rather than a single condition for the routine use of the method. This innovative strategy combines a learning process with a thorough assessment of the risk involved. Copyright © 2015 Elsevier B.V. All rights reserved.

  11. Prognostics of Power Electronics, Methods and Validation Experiments

    NASA Technical Reports Server (NTRS)

    Kulkarni, Chetan S.; Celaya, Jose R.; Biswas, Gautam; Goebel, Kai

    2012-01-01

    Failure of electronic devices is a concern for future electric aircraft that will see an increase of electronics to drive and control safety-critical equipment throughout the aircraft. As a result, investigation of precursors to failure in electronics and prediction of the remaining life of electronic components is of key importance. DC-DC power converters are power electronics systems typically employed as sourcing elements for avionics equipment. Current research efforts in prognostics for these power systems focus on the identification of failure mechanisms and the development of accelerated aging methodologies and systems to accelerate the aging process of test devices, while continuously measuring key electrical and thermal parameters. Preliminary model-based prognostics algorithms have been developed making use of empirical degradation models and physics-inspired degradation models, with focus on key components like electrolytic capacitors and power MOSFETs (metal-oxide-semiconductor field-effect transistors). This paper presents current results on the development of validation methods for prognostics algorithms for power electrolytic capacitors, particularly the use of accelerated aging systems for algorithm validation. Validation of prognostics algorithms presents difficulties in practice due to the lack of run-to-failure experiments in deployed systems. By using accelerated experiments, we circumvent this problem in order to define initial validation activities.
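
    One common form of the empirical degradation models mentioned above is exponential growth of an electrolytic capacitor's equivalent series resistance (ESR) under accelerated aging. The sketch below fits such a model and reads off a remaining-useful-life estimate; the measurement values, growth rate, and failure threshold are all invented for illustration.

```python
# Illustrative sketch: fit an exponential ESR degradation model to accelerated
# aging data and estimate remaining useful life (RUL). All numbers are invented.
import numpy as np

rng = np.random.default_rng(4)

t_hours = np.arange(0, 200, 10.0)                  # aging time
esr_true = 0.08 * np.exp(0.004 * t_hours)          # "true" ESR growth (ohm)
esr_meas = esr_true * (1 + 0.02 * rng.standard_normal(t_hours.size))

# Linearize: log(ESR) = log(E0) + b * t, then fit by least squares.
b, log_e0 = np.polyfit(t_hours, np.log(esr_meas), 1)
e0 = np.exp(log_e0)

threshold = 2.8 * esr_meas[0]                      # assumed failure at ~280% of initial ESR
t_fail = np.log(threshold / e0) / b
print(f"fitted model: ESR(t) = {e0:.3f} * exp({b:.4f} t)")
print(f"predicted end of life at ~{t_fail:.0f} h; RUL from now: {t_fail - t_hours[-1]:.0f} h")
```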

  12. Methodology for Computational Fluid Dynamic Validation for Medical Use: Application to Intracranial Aneurysm.

    PubMed

    Paliwal, Nikhil; Damiano, Robert J; Varble, Nicole A; Tutino, Vincent M; Dou, Zhongwang; Siddiqui, Adnan H; Meng, Hui

    2017-12-01

    Computational fluid dynamics (CFD) is a promising tool to aid in clinical diagnoses of cardiovascular diseases. However, it uses assumptions that simplify the complexities of the real cardiovascular flow. Due to the high stakes in the clinical setting, it is critical to calculate the effect of these assumptions on the CFD simulation results. However, existing CFD validation approaches do not quantify the error in the simulation results due to the CFD solver's modeling assumptions. Instead, they directly compare CFD simulation results against validation data. Thus, to quantify the accuracy of a CFD solver, we developed a validation methodology that calculates the CFD model error (arising from modeling assumptions). Our methodology identifies independent error sources in CFD and validation experiments, and calculates the model error by parsing out other sources of error inherent in simulation and experiments. To demonstrate the method, we simulated the flow field of a patient-specific intracranial aneurysm (IA) in the commercial CFD software STAR-CCM+. Particle image velocimetry (PIV) provided validation datasets for the flow field on two orthogonal planes. The average model error in the STAR-CCM+ solver was 5.63 ± 5.49% along the intersecting validation line of the orthogonal planes. Furthermore, we demonstrated that our validation method is superior to existing validation approaches by applying three representative existing validation techniques to our CFD and experimental dataset, and comparing the validation results. Our validation methodology offers a streamlined workflow to extract the "true" accuracy of a CFD solver.
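
    The bookkeeping behind this kind of comparison, in the general spirit of ASME V&V 20 rather than the authors' exact procedure, separates the simulation-measurement difference from the numerical, input, and experimental uncertainties. A minimal sketch with invented velocity profiles and uncertainty levels:

```python
# Illustrative sketch (ASME V&V 20 flavor, not the authors' exact method):
# compare CFD and PIV velocities along a validation line and parse out
# numerical, input, and experimental uncertainty. Numbers are invented.
import numpy as np

cfd = np.array([0.52, 0.61, 0.66, 0.63, 0.55])   # simulated velocities (m/s)
piv = np.array([0.50, 0.58, 0.65, 0.60, 0.51])   # measured velocities (m/s)

u_num   = 0.01    # numerical (discretization) uncertainty, m/s
u_exp   = 0.02    # PIV measurement uncertainty, m/s
u_input = 0.015   # uncertainty from inflow/geometry inputs, m/s

comparison_error = cfd - piv                          # E = S - D
u_val = np.sqrt(u_num**2 + u_exp**2 + u_input**2)     # validation uncertainty

rel_error = np.abs(comparison_error) / np.abs(piv) * 100
print(f"mean |E| = {np.mean(np.abs(comparison_error)):.3f} m/s, u_val = {u_val:.3f} m/s")
print(f"average relative comparison error: {rel_error.mean():.1f} %")
```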

  13. Likelihood ratio data to report the validation of a forensic fingerprint evaluation method.

    PubMed

    Ramos, Daniel; Haraksim, Rudolf; Meuwly, Didier

    2017-02-01

    The data to which the authors refer throughout this article are likelihood ratios (LRs) computed from the comparison of 5-12 minutiae fingermarks with fingerprints. These LR data are used for the validation of a likelihood ratio (LR) method in forensic evidence evaluation. They constitute a necessary asset for conducting validation experiments when validating LR methods used in forensic evidence evaluation and for setting up validation reports. These data can also be used as a baseline for comparing fingermark evidence in the same minutiae configuration as presented in (D. Meuwly, D. Ramos, R. Haraksim) [1], although the reader should keep in mind that different feature extraction algorithms and different AFIS systems may produce different LR values. Moreover, these data may serve as a reproducibility exercise, in order to train the generation of validation reports of forensic methods, according to [1]. Alongside the data, a justification and motivation for the use of the methods is given. These methods calculate LRs from the fingerprint/mark data and are subject to a validation procedure. The choice of using real forensic fingerprints in the validation and simulated data in the development is described and justified. Validation criteria are set for the purpose of validation of the LR methods, which are used to calculate the LR values from the data and the validation report. For privacy and data protection reasons, the original fingerprint/mark images cannot be shared, but these images do not constitute the core data for the validation, in contrast to the LRs, which are shared.
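
    One accuracy metric commonly used when validating LR methods of this kind is the log-likelihood-ratio cost (Cllr). The sketch below computes it on synthetic same-source and different-source LR values; the distributions are invented and are not the published fingermark data.

```python
# Illustrative sketch: log-likelihood-ratio cost (Cllr), a metric often used
# in LR method validation. The LR values below are synthetic.
import numpy as np

rng = np.random.default_rng(5)

# Synthetic LRs: same-source comparisons should tend to LR >> 1,
# different-source comparisons to LR << 1.
lr_same = np.exp(rng.normal(2.0, 1.0, 1000))
lr_diff = np.exp(rng.normal(-2.0, 1.0, 1000))

cllr = 0.5 * (np.mean(np.log2(1 + 1 / lr_same)) + np.mean(np.log2(1 + lr_diff)))
print(f"Cllr = {cllr:.3f}  (0 is perfect; a system that always returns LR = 1 scores 1)")
```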

  14. Valid methods: the quality assurance of test method development, validation, approval, and transfer for veterinary testing laboratories.

    PubMed

    Wiegers, Ann L

    2003-07-01

    Third-party accreditation is a valuable tool to demonstrate a laboratory's competence to conduct testing. Accreditation, internationally and in the United States, has been discussed previously. However, accreditation is only one part of establishing data credibility. A validated test method is the first component of a valid measurement system. Validation is defined as confirmation, by examination and the provision of objective evidence, that the particular requirements for a specific intended use are fulfilled. The international and national standard ISO/IEC 17025 recognizes the importance of validated methods and requires that laboratory-developed methods or methods adopted by the laboratory be appropriate for the intended use. Validated methods are therefore required, and their use must be agreed to by the client (i.e., end users of the test results such as veterinarians, animal health programs, and owners). ISO/IEC 17025 also requires that the introduction of methods developed by the laboratory for its own use be a planned activity conducted by qualified personnel with adequate resources. This article discusses considerations and recommendations for the conduct of veterinary diagnostic test method development, validation, evaluation, approval, and transfer to the user laboratory in the ISO/IEC 17025 environment. These recommendations are based on those of nationally and internationally accepted standards and guidelines, as well as those of reputable and experienced technical bodies. They are also based on the author's experience in the evaluation of method development and transfer projects, validation data, and the implementation of quality management systems in the area of method development.

  15. Bayesian cross-entropy methodology for optimal design of validation experiments

    NASA Astrophysics Data System (ADS)

    Jiang, X.; Mahadevan, S.

    2006-07-01

    An important concern in the design of validation experiments is how to incorporate the mathematical model in the design in order to allow conclusive comparisons of model prediction with experimental output in model assessment. The classical experimental design methods are more suitable for phenomena discovery and may result in a subjective, expensive, time-consuming and ineffective design that may adversely impact these comparisons. In this paper, an integrated Bayesian cross-entropy methodology is proposed to perform the optimal design of validation experiments incorporating the computational model. The expected cross entropy, an information-theoretic distance between the distributions of model prediction and experimental observation, is defined as a utility function to measure the similarity of two distributions. A simulated annealing algorithm is used to find optimal values of input variables through minimizing or maximizing the expected cross entropy. The measured data after testing with the optimum input values are used to update the distribution of the experimental output using Bayes theorem. The procedure is repeated to adaptively design the required number of experiments for model assessment, each time ensuring that the experiment provides effective comparison for validation. The methodology is illustrated for the optimal design of validation experiments for a three-leg bolted joint structure and a composite helicopter rotor hub component.
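
    The two ingredients described above, an expected cross entropy between the prediction and observation distributions and a simulated annealing search over the experiment input, can be sketched as follows. The toy prediction and observation models, noise levels, and bounds are assumptions made for illustration; this is not the authors' implementation.

```python
# Illustrative sketch: choose an experiment input x that maximizes the cross
# entropy between the model-prediction distribution and an assumed distribution
# of experimental observations, via simulated annealing. All models are invented.
import numpy as np
from scipy.optimize import dual_annealing
from scipy.stats import norm

def prediction_dist(x):
    # Model prediction at input x: Normal(mean, sd) from parameter uncertainty.
    return norm(loc=2.0 * x, scale=0.2 + 0.05 * x)

def observation_dist(x):
    # Anticipated experimental observation: slightly biased, noisier.
    return norm(loc=2.1 * x + 0.1, scale=0.3)

def neg_cross_entropy(x_arr):
    x = x_arr[0]
    p, q = observation_dist(x), prediction_dist(x)
    grid = np.linspace(p.mean() - 5 * p.std(), p.mean() + 5 * p.std(), 2001)
    ce = -np.trapz(p.pdf(grid) * q.logpdf(grid), grid)   # H(p, q) = -integral p log q
    return -ce                                           # maximize ce -> minimize -ce

result = dual_annealing(neg_cross_entropy, bounds=[(0.5, 5.0)], maxiter=100, seed=0)
print(f"suggested experiment input: x = {result.x[0]:.2f}")
```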

  16. A Comparison of Assessment Methods and Raters in Product Creativity

    ERIC Educational Resources Information Center

    Lu, Chia-Chen; Luh, Ding-Bang

    2012-01-01

    Although previous studies have attempted to use raters with different levels of experience to rate product creativity by adopting the Consensus Assessment Method (CAT) approach, the validity of replacing CAT with another measurement tool has not been adequately tested. This study aimed to compare raters with different levels of experience (expert vs.…

  17. Software reliability: Additional investigations into modeling with replicated experiments

    NASA Technical Reports Server (NTRS)

    Nagel, P. M.; Schotz, F. M.; Skirvan, J. A.

    1984-01-01

    The effects of programmer experience level, different program usage distributions, and programming languages are explored. All these factors affect performance, and some tentative relational hypotheses are presented. An analytic framework for replicated and non-replicated (traditional) software experiments is presented. A method of obtaining an upper bound on the error rate of the next error is proposed. The method was validated empirically by comparing forecasts with actual data. In all 14 cases the bound exceeded the observed parameter, albeit somewhat conservatively. Two other forecasting methods are proposed and compared to observed results. Although it is demonstrated within this framework that stages are neither independent nor exponentially distributed, empirical estimates show that the exponential assumption is nearly valid for all but the extreme tails of the distribution. Except for the dependence in the stage probabilities, Cox's model approximates to a degree what is being observed.

  18. Methodology and issues of integral experiments selection for nuclear data validation

    NASA Astrophysics Data System (ADS)

    Ivanova, Tatiana; Ivanov, Evgeny; Hill, Ian

    2017-09-01

    Nuclear data validation involves a large suite of Integral Experiments (IEs) for criticality, reactor physics and dosimetry applications [1]. Often benchmarks are taken from international handbooks [2, 3]. Depending on the application, IEs have different degrees of usefulness in validation, and usually the use of a single benchmark is not advised; indeed, it may lead to erroneous interpretation and results [1]. This work aims at quantifying the importance of benchmarks used in application-dependent cross-section validation. The approach is based on the well-known Generalized Linear Least Squares Method (GLLSM), extended to establish biases and uncertainties for given cross sections (within a given energy interval). The statistical treatment results in a vector of weighting factors for the integral benchmarks. These factors characterize the value added by a benchmark to nuclear data validation for the given application. The methodology is illustrated by one example, selecting benchmarks for 239Pu cross-section validation. The studies were performed in the framework of Subgroup 39 (Methods and approaches to provide feedback from nuclear and covariance data adjustment for improvement of nuclear data files) established at the Working Party on International Nuclear Data Evaluation Cooperation (WPEC) of the Nuclear Science Committee under the Nuclear Energy Agency (NEA/OECD).

  19. [Perception scales of validated food insecurity: the experience of the countries in Latin America and the Caribbean].

    PubMed

    Sperandio, Naiara; Morais, Dayane de Castro; Priore, Silvia Eloiza

    2018-02-01

    The scope of this systematic review was to compare the food insecurity scales validated and used in the countries of Latin America and the Caribbean, and to analyze the methods used in the validation studies. A search was conducted in the Lilacs, SciELO and Medline electronic databases. The publications were pre-selected by titles and abstracts, and subsequently by a full reading. Of the 16,325 studies reviewed, 14 were selected. Twelve validated scales were identified for the following countries: Venezuela, Brazil, Colombia, Bolivia, Ecuador, Costa Rica, Mexico, Haiti, the Dominican Republic, Argentina and Guatemala. In addition, there is the Latin American and Caribbean scale, whose scope is regional. The scales differed in the reference standard used, the number of questions, and the diagnosis of insecurity. The methods used by the studies for internal validation were the calculation of Cronbach's alpha and the Rasch model; for external validation the authors calculated association and/or correlation with socioeconomic and food consumption variables. The successful experience of Latin America and the Caribbean in the development of national and regional scales can be an example for other countries that do not have this important indicator capable of measuring the phenomenon of food insecurity.

  20. Validation of the thermal transport model used for ITER startup scenario predictions with DIII-D experimental data

    DOE PAGES

    Casper, T. A.; Meyer, W. H.; Jackson, G. L.; ...

    2010-12-08

    We are exploring characteristics of ITER startup scenarios in similarity experiments conducted on the DIII-D Tokamak. In these experiments, we have validated scenarios for the ITER current ramp up to full current and developed methods to control the plasma parameters to achieve stability. Predictive simulations of ITER startup using 2D free-boundary equilibrium and 1D transport codes rely on accurate estimates of the electron and ion temperature profiles that determine the electrical conductivity and pressure profiles during the current rise. Here we present results of validation studies that apply the transport model used by the ITER team to DIII-D discharge evolution and comparisons with data from our similarity experiments.

  1. Establishing high resolution melting analysis: method validation and evaluation for c-RET proto-oncogene mutation screening.

    PubMed

    Benej, Martin; Bendlova, Bela; Vaclavikova, Eliska; Poturnajova, Martina

    2011-10-06

    Reliable and effective primary screening of mutation carriers is the key condition for common diagnostic use. The objective of this study is to validate high resolution melting (HRM) analysis for routine primary mutation screening and to accomplish its optimization, evaluation and validation. Due to their heterozygous nature, germline point mutations of the c-RET proto-oncogene, associated with multiple endocrine neoplasia type 2 (MEN2), are suitable for HRM analysis. Early identification of mutation carriers has a major impact on patients' survival due to the early onset of medullary thyroid carcinoma (MTC) and its resistance to conventional therapy. The authors performed a series of validation assays according to International Conference on Harmonization of Technical Requirements for Registration of Pharmaceuticals for Human Use (ICH) guidelines for validation of analytical procedures, along with appropriate design and optimization experiments. After this validation, HRM was used for primary screening of 28 pathogenic c-RET mutations distributed among nine exons of the c-RET gene. The validation experiments confirm the repeatability, robustness, accuracy and reproducibility of HRM. All c-RET gene pathogenic variants were detected with no occurrence of false-positive/false-negative results. The data provide basic information about the design, establishment and validation of HRM for primary screening of genetic variants in order to distinguish heterozygous point mutation carriers from wild-type sequence carriers. HRM analysis is a powerful and reliable tool for rapid and cost-effective primary screening, e.g., of c-RET gene germline and/or sporadic mutations, and can be used as a first-line potential diagnostic tool.

  2. Individual Differences Methods for Randomized Experiments

    ERIC Educational Resources Information Center

    Tucker-Drob, Elliot M.

    2011-01-01

    Experiments allow researchers to randomly vary the key manipulation, the instruments of measurement, and the sequences of the measurements and manipulations across participants. To date, however, the advantages of randomized experiments to manipulate both the aspects of interest and the aspects that threaten internal validity have been primarily…

  3. Ab initio analytical Raman intensities for periodic systems through a coupled perturbed Hartree-Fock/Kohn-Sham method in an atomic orbital basis. II. Validation and comparison with experiments

    NASA Astrophysics Data System (ADS)

    Maschio, Lorenzo; Kirtman, Bernard; Rérat, Michel; Orlando, Roberto; Dovesi, Roberto

    2013-10-01

    In this work, we validate a new, fully analytical method for calculating Raman intensities of periodic systems, developed and presented in Paper I [L. Maschio, B. Kirtman, M. Rérat, R. Orlando, and R. Dovesi, J. Chem. Phys. 139, 164101 (2013)]. Our validation of this method and its implementation in the CRYSTAL code is done through several internal checks as well as comparison with experiment. The internal checks include consistency of results when increasing the number of periodic directions (from 0D to 1D, 2D, 3D), comparison with numerical differentiation, and a test of the sum rule for derivatives of the polarizability tensor. The choice of basis set as well as the Hamiltonian is also studied. Simulated Raman spectra of α-quartz and of the UiO-66 Metal-Organic Framework are compared with the experimental data.

  4. National Institutes of Health Pathways to Prevention Workshop: Methods for Evaluating Natural Experiments in Obesity.

    PubMed

    Emmons, Karen M; Doubeni, Chyke A; Fernandez, Maria E; Miglioretti, Diana L; Samet, Jonathan M

    2018-06-05

    On 5 and 6 December 2017, the National Institutes of Health (NIH) convened the Pathways to Prevention Workshop: Methods for Evaluating Natural Experiments in Obesity to identify the status of methods for assessing natural experiments to reduce obesity, areas in which these methods could be improved, and research needs for advancing the field. This article considers findings from a systematic evidence review on methods for evaluating natural experiments in obesity, workshop presentations by experts and stakeholders, and public comment. Research gaps are identified, and recommendations related to 4 key issues are provided. Recommendations on population-based data sources and data integration include maximizing use and sharing of existing surveillance and research databases and ensuring significant effort to integrate and link databases. Recommendations on measurement include use of standardized and validated measures of obesity-related outcomes and exposures, systematic measurement of co-benefits and unintended consequences, and expanded use of validated technologies for measurement. Study design recommendations include improving guidance, documentation, and communication about methods used; increasing use of designs that minimize bias in natural experiments; and more carefully selecting control groups. Cross-cutting recommendations target activities that the NIH and other funders might undertake to improve the rigor of natural experiments in obesity, including training and collaboration on modeling and causal inference, promoting the importance of community engagement in the conduct of natural experiments, ensuring maintenance of relevant surveillance systems, and supporting extended follow-up assessments for exemplar natural experiments. To combat the significant public health threat posed by obesity, researchers should continue to take advantage of natural experiments. The recommendations in this report aim to strengthen evidence from such studies.

  5. Validation of a dye stain assay for vaginally inserted HEC-filled microbicide applicators

    PubMed Central

    Katzen, Lauren L.; Fernández-Romero, José A.; Sarna, Avina; Murugavel, Kailapuri G.; Gawarecki, Daniel; Zydowsky, Thomas M.; Mensch, Barbara S.

    2011-01-01

    Background: The reliability and validity of self-reports of vaginal microbicide use are questionable given the explicit understanding that participants are expected to comply with study protocols. Our objective was to optimize the Population Council's previously validated dye stain assay (DSA) and related procedures, and establish predictive values for the DSA's ability to identify vaginally inserted single-use, low-density polyethylene microbicide applicators filled with hydroxyethylcellulose gel. Methods: Applicators, inserted by 252 female sex workers enrolled in a microbicide feasibility study in Southern India, served as positive controls for optimization and validation experiments. Prior to validation, the optimal dye concentration and staining time were ascertained. Three validation experiments were conducted to determine sensitivity, specificity, negative predictive values and positive predictive values. Results: A dye concentration of 0.05% (w/v) FD&C Blue No. 1 Granular Food Dye and a staining time of five seconds were determined to be optimal and were used for the three validation experiments. There were a total of 1,848 possible applicator readings across validation experiments; 1,703 (92.2%) applicator readings were correct. On average, the DSA performed with 90.6% sensitivity, 93.9% specificity, and had a negative predictive value of 93.8% and a positive predictive value of 91.0%. No statistically significant differences between experiments were noted. Conclusions: The DSA was optimized and successfully validated for use with single-use, low-density polyethylene applicators filled with hydroxyethylcellulose (HEC) gel. We recommend including the DSA in future microbicide trials involving vaginal gels in order to identify participants who have low adherence to dosing regimens. In doing so, we can develop strategies to improve adherence as well as investigate the association between product use and efficacy. PMID:21992983
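
    The four reported figures follow directly from counts of correctly and incorrectly read applicators. A minimal sketch with invented counts (not the study's 1,848 readings) is shown below.

```python
# Illustrative sketch: sensitivity, specificity, PPV and NPV from counts of
# inserted ("positive") and never-inserted ("negative") applicators.
# Counts below are invented, so the figures differ from the study's results.
tp, fn = 420, 44      # inserted applicators read as stained / missed
tn, fp = 435, 28      # unused applicators read as clean / falsely stained

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
ppv = tp / (tp + fp)
npv = tn / (tn + fn)
print(f"sensitivity {sensitivity:.1%}, specificity {specificity:.1%}, "
      f"PPV {ppv:.1%}, NPV {npv:.1%}")
```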

  6. Design and Validation of an Augmented Reality System for Laparoscopic Surgery in a Real Environment

    PubMed Central

    López-Mir, F.; Naranjo, V.; Fuertes, J. J.; Alcañiz, M.; Bueno, J.; Pareja, E.

    2013-01-01

    Purpose. This work presents the protocol carried out in the development and validation of an augmented reality system which was installed in an operating theatre to help surgeons with trocar placement during laparoscopic surgery. The purpose of this validation is to demonstrate the improvements that this system can provide to the field of medicine, particularly surgery. Method. Two experiments that were noninvasive for both the patient and the surgeon were designed. In one of these experiments the augmented reality system was used; the other was the control experiment, in which the system was not used. The type of operation selected for all cases was a cholecystectomy, due to the low degree of complexity and complications before, during, and after the surgery. The technique used for the placement of trocars was the French technique, but the results can be extrapolated to any other technique and operation. Results and Conclusion. Four clinicians and ninety-six measurements obtained from twenty-four patients (randomly assigned to each experiment) were involved in these experiments. The final results show improvements in accuracy and variability of 33% and 63%, respectively, in comparison to traditional methods, demonstrating that the use of an augmented reality system offers advantages for trocar placement in laparoscopic surgery. PMID:24236293

  7. Measuring engagement in nurses: the psychometric properties of the Persian version of Utrecht Work Engagement Scale

    PubMed Central

    Torabinia, Mansour; Mahmoudi, Sara; Dolatshahi, Mojtaba; Abyaz, Mohamad Reza

    2017-01-01

    Background: Considering the overall tendency in psychology, researchers in the field of work and organizational psychology have become progressively interested in employees' affective and optimistic experiences at work, such as work engagement. This study was conducted with 2 main purposes: assessing the psychometric properties of the Utrecht Work Engagement Scale (UWES), and finding any association between work engagement and burnout in nurses. Methods: The present methodological study was conducted in 2015 and included 248 females and 34 males with 6 months to 30 years of job experience. After the translation process, face and content validity were assessed by qualitative and quantitative methods. Moreover, the content validation ratio, scale-level content validity index and item-level content validity index were measured for this scale. Construct validity was determined by factor analysis. Moreover, internal consistency and stability reliability were assessed. Factor analysis, test-retest, Cronbach's alpha, and association analysis were used as statistical methods. Results: Face and content validity were acceptable. Exploratory factor analysis suggested a new 3-factor model. In this new model, some items were relocated relative to the construct model of the original version, while the same 17 items were retained. The new model, as the Persian version of the UWES, showed divergent validity against the Copenhagen Burnout Inventory. Internal consistency reliability for the total scale and the subscales was 0.76 to 0.89. Results from the Pearson correlation test indicated a high degree of test-retest reliability (r = 0.89). The ICC was also 0.91. Engagement was negatively related to burnout and overtime per month, whereas it was positively related to age and job experience. Conclusion: The Persian 3-factor model of the Utrecht Work Engagement Scale is a valid and reliable instrument to measure work engagement in Iranian nurses as well as in other medical professionals. PMID:28955665
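
    Cronbach's alpha, the internal consistency statistic reported above, is simple to compute from item-level responses. A minimal sketch on simulated 17-item data follows; the response-generating model is invented, not the study data.

```python
# Illustrative sketch: Cronbach's alpha for a k-item scale on simulated responses.
import numpy as np

rng = np.random.default_rng(6)
n_respondents, n_items = 282, 17
latent = rng.normal(size=(n_respondents, 1))                       # common "engagement" factor
items = latent + 1.5 * rng.normal(size=(n_respondents, n_items))   # 17 noisy items

item_vars = items.var(axis=0, ddof=1)
total_var = items.sum(axis=1).var(ddof=1)
k = n_items
alpha = k / (k - 1) * (1 - item_vars.sum() / total_var)
print(f"Cronbach's alpha = {alpha:.2f}")
```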

  8. Determination of lipophilic toxins by LC/MS/MS: single-laboratory validation.

    PubMed

    Villar-González, Adriano; Rodríguez-Velasco, María Luisa; Gago-Martínez, Ana

    2011-01-01

    An LC/MS/MS method has been developed, assessed, and intralaboratory-validated for the analysis of the lipophilic toxins currently regulated by European Union legislation: okadaic acid (OA) and dinophysistoxins 1 and 2, including their ester forms; azaspiracids 1, 2, and 3; pectenotoxins 1 and 2; yessotoxin (YTX) and the analogs 45 OH-YTX, Homo YTX, and 45 OH-Homo YTX; as well as for the analysis of 13-desmethyl spirolide C. The method consists of duplicate sample extraction with methanol and direct analysis of the crude extract without further cleanup or concentration. Ester forms of OA and the dinophysistoxins are detected as the parent ions after alkaline hydrolysis of the extract. The validation of this method was performed using both fortified and naturally contaminated samples, and experiments were designed according to International Organization for Standardization, International Union of Pure and Applied Chemistry, and AOAC guidelines. With the exception of YTX in fortified samples, RSDr values were below 15% and RSDR values were below 25%. Recovery values were between 77 and 95%, and LOQs were below 60 microg/kg. These data, together with validation experiments for recovery, selectivity, robustness, traceability, and linearity, as well as uncertainty calculations, are presented in this paper.
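
    Recovery and repeatability figures of this kind reduce to simple arithmetic on replicate measurements of a fortified sample. A minimal sketch with invented concentrations (not the study data):

```python
# Illustrative sketch: recovery and repeatability (RSDr) from replicate
# measurements of a fortified sample. Concentrations are invented.
import numpy as np

spiked_level = 160.0                                   # µg/kg of toxin added
measured = np.array([148.2, 151.7, 143.9, 155.0, 149.6, 146.3])

recovery = measured.mean() / spiked_level * 100
rsd_r = measured.std(ddof=1) / measured.mean() * 100
print(f"recovery {recovery:.1f} %, RSDr {rsd_r:.1f} %")
```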

  9. DSMC Simulations of Hypersonic Flows and Comparison With Experiments

    NASA Technical Reports Server (NTRS)

    Moss, James N.; Bird, Graeme A.; Markelov, Gennady N.

    2004-01-01

    This paper presents computational results obtained with the direct simulation Monte Carlo (DSMC) method for several biconic test cases in which shock interactions and flow separation-reattachment are key features of the flow. Recent ground-based experiments have been performed for several biconic configurations, and surface heating rate and pressure measurements have been proposed for code validation studies. The present focus is to expand on the current validation activities for a relatively new DSMC code called DS2V that Bird (the second author) has developed. Comparisons with experiments and other computations help clarify the agreement currently being achieved between computations and experiments and identify the range of measurement variability of the proposed validation data when benchmarked against the current computations. For the test cases with significant vibrational nonequilibrium, the effect of vibrational energy surface accommodation on heating and other quantities is demonstrated.

  10. Proposal for risk-based scientific approach on full and partial validation for general changes in bioanalytical method.

    PubMed

    Mochizuki, Ayumi; Ieki, Katsunori; Kamimori, Hiroshi; Nagao, Akemi; Nakai, Keiko; Nakayama, Akira; Nanba, Eitaro

    2018-04-01

    The guidance and several guidelines on bioanalytical method validation, which were issued by the US FDA, EMA and the Ministry of Health, Labour and Welfare, list the 'full' validation parameters; however, none of these provide any details for 'partial' validation. The Japan Bioanalysis Forum approved a total of three annual discussion groups from 2012 to 2014. In the discussion groups, members from pharmaceutical companies and contract research organizations discussed the details of partial validation from a risk assessment viewpoint, based on surveys focusing on the bioanalysis of small molecules using LC-MS/MS in Japan. This manuscript presents the discussion group members' perspectives and recommendations on full and partial validation for the most conceivable method changes, based on their experiences and on discussions at the Japan Bioanalysis Forum Symposium.

  11. Culture Training: Validation Evidence for the Culture Assimilator.

    ERIC Educational Resources Information Center

    Mitchell, Terence R.; And Others

    The culture assimilator, a programed self-instructional approach to culture training, is described and a series of laboratory experiments and field studies validating the culture assimilator are reviewed. These studies show that the culture assimilator is an effective method of decreasing some of the stress experienced when one works with people…

  12. Reconceptualising the external validity of discrete choice experiments.

    PubMed

    Lancsar, Emily; Swait, Joffre

    2014-10-01

    External validity is a crucial but under-researched topic when considering using discrete choice experiment (DCE) results to inform decision making in clinical, commercial or policy contexts. We present the theory and tests traditionally used to explore external validity that focus on a comparison of final outcomes and review how this traditional definition has been empirically tested in health economics and other sectors (such as transport, environment and marketing) in which DCE methods are applied. While an important component, we argue that the investigation of external validity should be much broader than a comparison of final outcomes. In doing so, we introduce a new and more comprehensive conceptualisation of external validity, closely linked to process validity, that moves us from the simple characterisation of a model as being or not being externally valid on the basis of predictive performance, to the concept that external validity should be an objective pursued from the initial conceptualisation and design of any DCE. We discuss how such a broader definition of external validity can be fruitfully used and suggest innovative ways in which it can be explored in practice.

  13. A systematic review and appraisal of methods of developing and validating lifestyle cardiovascular disease risk factors questionnaires.

    PubMed

    Odunaiya, Nse; Louw, Quinette; Ogah, Okechukwu

    2015-09-01

    Well-developed and validated lifestyle cardiovascular disease (CVD) risk factor questionnaires are key to obtaining accurate information to enable the planning of CVD prevention programs, which is a necessity in developing countries. We conducted this review to assess the methods and processes used for the development and content validation of lifestyle CVD risk factor questionnaires, and possibly to develop an evidence-based guideline for the development and content validation of such questionnaires. Relevant databases at the Stellenbosch University library were searched for studies conducted between 2008 and 2012, in English and among humans, using the following databases: PubMed, CINAHL, PsycINFO and ProQuest. Search terms used were CVD risk factors, questionnaires, smoking, alcohol, physical activity and diet. The methods identified for the development of lifestyle CVD risk factor questionnaires were review of the literature (either systematic or traditional), involvement of experts and/or the target population using focus group discussions/interviews, the clinical experience of the authors, and the authors' deductive reasoning. For validation, the methods used were involvement of an expert panel, use of the target population, and factor analysis. Combining methods produces questionnaires with good content validity and other psychometric properties that we consider good.

  14. Ensuring the validity of calculated subcritical limits

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clark, H.K.

    1977-01-01

    The care taken at the Savannah River Laboratory and Plant to ensure the validity of calculated subcritical limits is described. Close attention is given to ANSI N16.1-1975, "Validation of Calculational Methods for Nuclear Criticality Safety." The computer codes used for criticality safety computations, which are listed and are briefly described, have been placed in the SRL JOSHUA system to facilitate calculation and to reduce input errors. A driver module, KOKO, simplifies and standardizes input and links the codes together in various ways. For any criticality safety evaluation, correlations of the calculational methods are made with experiment to establish bias. Occasionally subcritical experiments are performed expressly to provide benchmarks. Calculated subcritical limits contain an adequate but not excessive margin to allow for uncertainty in the bias. The final step in any criticality safety evaluation is the writing of a report describing the calculations and justifying the margin.
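
    A common way to fold a calculational bias and its uncertainty into a subcritical limit is sketched below with invented benchmark results. This is a simplified textbook formulation, not necessarily the Savannah River procedure.

```python
# Illustrative sketch: derive a bias and a simple upper subcritical limit (USL)
# from calculated k-eff values for benchmark critical experiments (true k = 1).
# Benchmark values and margins are invented for the example.
import numpy as np

k_calc = np.array([0.9962, 0.9981, 0.9953, 0.9978, 0.9990, 0.9948, 0.9970])

bias = k_calc.mean() - 1.0                      # negative: code underpredicts k-eff
bias_unc = k_calc.std(ddof=1) * 2.0             # ~95% coverage, simplified treatment
admin_margin = 0.05                             # administrative margin of subcriticality

usl = 1.0 + bias - bias_unc - admin_margin
print(f"bias {bias:+.4f}, bias uncertainty {bias_unc:.4f}, USL = {usl:.4f}")
```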

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marshall, William BJ J; Rearden, Bradley T

    The validation of neutron transport methods used in nuclear criticality safety analyses is required by consensus American National Standards Institute/American Nuclear Society (ANSI/ANS) standards. In the last decade, there has been an increased interest in correlations among critical experiments used in validation that have shared physical attributes and which impact the independence of each measurement. The statistical methods included in many of the frequently cited guidance documents on performing validation calculations incorporate the assumption that all individual measurements are independent, so little guidance is available to practitioners on the topic. Typical guidance includes recommendations to select experiments from multiple facilities and experiment series in an attempt to minimize the impact of correlations or common-cause errors in experiments. Recent efforts have been made both to determine the magnitude of such correlations between experiments and to develop and apply methods for adjusting the bias and bias uncertainty to account for the correlations. This paper describes recent work performed at Oak Ridge National Laboratory using the Sampler sequence from the SCALE code system to develop experimental correlations using a Monte Carlo sampling technique. Sampler will be available for the first time with the release of SCALE 6.2, and a brief introduction to the methods used to calculate experiment correlations within this new sequence is presented in this paper. Techniques to utilize these correlations in the establishment of upper subcritical limits are the subject of a companion paper and will not be discussed here. Example experimental uncertainties and correlation coefficients are presented for a variety of low-enriched uranium water-moderated lattice experiments selected for use in a benchmark exercise by the Working Party on Nuclear Criticality Safety Subgroup on Uncertainty Analysis in Criticality Safety Analyses. The results include studies on the effect of fuel rod pitch on the correlations, and some observations are also made regarding difficulties in determining experimental correlations using the Monte Carlo sampling technique.

  16. Detection of Mycoplasma hyopneumoniae by polymerase chain reaction in swine presenting respiratory problems

    PubMed Central

    Yamaguti, M.; Muller, E.E.; Piffer, A.I.; Kich, J.D.; Klein, C.S.; Kuchiishi, S.S.

    2008-01-01

    Since Mycoplasma hyopneumoniae isolation in appropriate media is a difficult task and impractical for daily routine diagnostics, Nested-PCR (N-PCR) techniques are currently used to improve the direct diagnostic sensitivity of Swine Enzootic Pneumonia. In a first experiment, this paper describes an N-PCR technique optimization based on three variables: different sampling sites, sample transport media, and DNA extraction methods, using eight pigs. Based on the optimization results, a second experiment was conducted to test validity using 40 animals. In conclusion, the results obtained from the N-PCR optimization and validation allow us to recommend this test as a routine monitoring diagnostic method for Mycoplasma hyopneumoniae infection in swine herds. PMID:24031248

  17. Improvement of Experiment Planning as an Important Precondition for the Quality of Educational Research

    ERIC Educational Resources Information Center

    Rutkiene, Ausra; Tereseviciene, Margarita

    2010-01-01

    The article presents the stages of the experiment planning that are necessary to ensure the validity and reliability of it. The research data reveal that doctoral students of Educational Research approach the planning of the experiment as the planning of the whole dissertation research; and the experiment as a research method is often confused…

  18. Reliability and Validity of a Spanish Version of the Posttraumatic Growth Inventory

    ERIC Educational Resources Information Center

    Weiss, Tzipi; Berger, Roni

    2006-01-01

    Objectives. This study was designed to adapt and validate a Spanish translation of the Posttraumatic Growth Inventory (PTGI) for the assessment of positive life changes following the stressful experiences of immigration. Method. A cross-cultural equivalence model was used to pursue semantic, content, conceptual, and technical equivalence.…

  19. Advanced Method of Boundary-Layer Control Based on Localized Plasma Generation

    DTIC Science & Technology

    2009-05-01

    measurements, validation of experiments, wind-tunnel testing of the microwave/plasma generation system, preliminary assessment of energy required...and design of a microwave generator, electrodynamic and multivibrator systems for experiments in the IHM-NAU wind tunnel: MW generator and its high...equipped with the microwave-generation and protection systems to study advanced methods of flow control (Kiev)

  20. The Synthetic Experiment: E. B. Titchener's Cornell Psychological Laboratory and the Test of Introspective Analysis.

    PubMed

    Evans, Rand B

    2017-01-01

    Beginning in 1 9a0, a major thread of research was added to E. B. Titchener's Cornell laboratory: the synthetic experiment. Titchener and his graduate students used introspective analysis to reduce a perception, a complex experience, into its simple sensory constituents. To test the validity of that analysis, stimulus patterns were selected to reproduce the patterns of sensations found in the introspective analyses. If the original perception could be reconstructed in this way, the analysis was considered validated. This article reviews the development of the synthetic method in E. B. Titchener's laboratory at Cornell University and examines its impact on psychological research.

  1. SDG and qualitative trend based model multiple scale validation

    NASA Astrophysics Data System (ADS)

    Gao, Dong; Xu, Xin; Yin, Jianjin; Zhang, Hongyu; Zhang, Beike

    2017-09-01

    Verification, Validation and Accreditation (VV&A) is a key technology in simulation and modelling. Traditional model validation methods suffer from weak completeness, are carried out at a single scale, and depend on human experience. A multiple-scale validation method based on SDG (Signed Directed Graph) models and qualitative trends is therefore proposed. First, the SDG model is built and qualitative trends are added to it. Complete testing scenarios are then produced by positive inference. The multiple-scale validation is carried out by comparing the testing scenarios with the outputs of the simulation model at different scales. Finally, the effectiveness of the approach is demonstrated by validating a reactor model.

  2. A Consensus Approach to Investigate Undergraduate Pharmacy Students’ Experience of Interprofessional Education

    PubMed Central

    Obara, Ilona; Paterson, Alastair; Nazar, Zachariah; Portlock, Jane; Husband, Andrew

    2017-01-01

    Objective. To assess the development of knowledge, attitudes, and behaviors for collaborative practice among first-year pharmacy students following completion of interprofessional education. Methods. A mixed-methods strategy was employed to detect student self-reported change in knowledge, attitudes, and behaviors. Validated survey tools were used to assess student perception and attitudes. The Nominal Group Technique (NGT) was used to capture student reflections and provide peer discussion on the individual IPE sessions. Results. The validated survey tools did not detect any change in students’ attitudes and perceptions. The NGT succeeded in providing a milieu for participating students to reflect on their IPE experiences. The peer review process allowed students to compare their initial perceptions and reactions and renew their reflections on the learning experience. Conclusion. The NGT process has provided the opportunity to assess the student experience through the reflective process that was enriched via peer discussion. Students have demonstrated more positive attitudes and behaviors toward interprofessional working through IPE. PMID:28381886

  3. How Generalizable Is Your Experiment? An Index for Comparing Samples and Populations

    ERIC Educational Resources Information Center

    Tipton, Elizabeth

    2013-01-01

    Recent research on the design of social experiments has highlighted the effects of different design choices on research findings. Since experiments rarely collect their samples using random selection, in order to address these external validity problems and design choices, recent research has focused on two areas. The first area is on methods for…

  4. Identification of Average Treatment Effects in Social Experiments under Alternative Forms of Attrition

    ERIC Educational Resources Information Center

    Huber, Martin

    2012-01-01

    As any empirical method used for causal analysis, social experiments are prone to attrition which may flaw the validity of the results. This article considers the problem of partially missing outcomes in experiments. First, it systematically reveals under which forms of attrition--in terms of its relation to observable and/or unobservable…

  5. Experience Documentation in Assessing Professional Practice or Work Experience: Lessons from Granting Advanced Certification to Health Education Specialists

    ERIC Educational Resources Information Center

    Gambescia, Stephen F.; Lysoby, Linda; Perko, Michael; Sheu, Jiunn-Jye

    2016-01-01

    The purpose of this article is to demonstrate how one profession used an "experience documentation process" to grant advanced certification to qualified certified health education specialists. The competency validation process approved by the certifying organization serves as an example of an additional method, aside from traditional…

  6. Optimization and Validation of a Sensitive Method for HPLC-PDA Simultaneous Determination of Torasemide and Spironolactone in Human Plasma using Central Composite Design.

    PubMed

    Subramanian, Venkatesan; Nagappan, Kannappan; Sandeep Mannemala, Sai

    2015-01-01

    A sensitive, accurate, precise and rapid HPLC-PDA method was developed and validated for the simultaneous determination of torasemide and spironolactone in human plasma using Design of Experiments. A central composite design was used to optimize the method, with the acetonitrile content, buffer concentration and pH of the mobile phase as independent variables, and the retention factor of spironolactone, the resolution between torasemide and phenobarbitone, and the retention time of phenobarbitone as dependent variables. The chromatographic separation was achieved on a Phenomenex C(18) column with a mobile phase comprising 20 mM potassium dihydrogen orthophosphate buffer (pH 3.2) and acetonitrile (82.5:17.5 v/v) pumped at a flow rate of 1.0 mL min(-1). The method was validated according to USFDA guidelines in terms of selectivity, linearity, accuracy, precision, recovery and stability. The limits of quantitation were 80 and 50 ng mL(-1) for torasemide and spironolactone, respectively. Furthermore, the sensitivity and simplicity of the method suggest its validity for routine clinical studies.
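    As an illustration of the kind of design used here, the sketch below builds a three-factor circumscribed central composite design (factorial, axial and centre points) in coded units with NumPy and converts it to hypothetical run settings; the factor names, mid-points and ranges are illustrative assumptions, not the values from the study.

```python
import itertools
import numpy as np

def central_composite_design(n_factors=3, alpha=1.682, n_center=6):
    """Circumscribed central composite design in coded units (-alpha..+alpha)."""
    # Two-level full factorial portion (+/-1 corners)
    factorial = np.array(list(itertools.product([-1.0, 1.0], repeat=n_factors)))
    # Axial (star) points at +/-alpha on each axis
    axial = np.zeros((2 * n_factors, n_factors))
    for i in range(n_factors):
        axial[2 * i, i] = -alpha
        axial[2 * i + 1, i] = alpha
    # Replicated centre points
    center = np.zeros((n_center, n_factors))
    return np.vstack([factorial, axial, center])

# Hypothetical factors: % acetonitrile, buffer concentration (mM), mobile-phase pH
midpoints = np.array([17.5, 20.0, 3.2])
half_ranges = np.array([2.5, 5.0, 0.4])
runs = midpoints + central_composite_design() * half_ranges
print(runs.round(2))  # 20 chromatographic runs, one row per experiment
```

    Each row is one chromatographic run; a quadratic response-surface model fitted to the measured responses (retention factor, resolution, retention time) then locates the optimum operating region.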

  7. Creating wavelet-based models for real-time synthesis of perceptually convincing environmental sounds

    NASA Astrophysics Data System (ADS)

    Miner, Nadine Elizabeth

    1998-09-01

    This dissertation presents a new wavelet-based method for synthesizing perceptually convincing, dynamic sounds using parameterized sound models. The sound synthesis method is applicable to a variety of applications including Virtual Reality (VR), multi-media, entertainment, and the World Wide Web (WWW). A unique contribution of this research is the modeling of the stochastic, or non-pitched, sound components. This stochastic-based modeling approach leads to perceptually compelling sound synthesis. Two preliminary studies conducted provide data on multi-sensory interaction and audio-visual synchronization timing. These results contributed to the design of the new sound synthesis method. The method uses a four-phase development process, including analysis, parameterization, synthesis and validation, to create the wavelet-based sound models. A patent is pending for this dynamic sound synthesis method, which provides perceptually-realistic, real-time sound generation. This dissertation also presents a battery of perceptual experiments developed to verify the sound synthesis results. These experiments are applicable for validation of any sound synthesis technique.

  8. Viscosity Measurement of Highly Viscous Liquids Using Drop Coalescence in Low Gravity

    NASA Technical Reports Server (NTRS)

    Antar, Basil N.; Ethridge, Edwin; Maxwell, Daniel

    1999-01-01

    The method of drop coalescence is being investigated as a technique for determining the viscosity of highly viscous undercooled liquids. A low-gravity environment is necessary in this case to minimize the undesirable effects of body forces and liquid motion in levitated drops. The low-gravity environment also allows large liquid volumes to be investigated, which can lead to much higher accuracy in the viscosity calculations than is possible under 1-g conditions. The drop coalescence method is preferred over the drop oscillation technique, since the latter can only be applied to liquids with vanishingly small viscosities. The technique developed relies both on a highly accurate solution of the Navier-Stokes equations and on data from experiments conducted in a near-zero-gravity environment. In the analytical part of the method, two liquid volumes are brought into contact and coalesce under the action of surface tension alone. The free-surface geometry and its velocity during coalescence, obtained from numerical computations, are compared with an analogous experimental model. The viscosity in the numerical computations is then adjusted to bring the calculations into agreement with the experimental results; the true liquid viscosity is the one that brings the calculations closest to the experiment. Results are presented for method validation experiments performed recently on board the NASA/KC-135 aircraft. The numerical solution for this validation case was produced using the Boundary Element Method. In these tests the viscosity of a highly viscous liquid, in this case glycerine at room temperature, was determined to a high degree of accuracy using the liquid coalescence method. These experiments gave very encouraging results, which will be discussed together with plans for implementing the method in a shuttle flight experiment.
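    The adjustment step described above is essentially a one-parameter fit: the viscosity used in the simulation is tuned until the computed free-surface velocity matches the measurements. A minimal sketch of that idea is given below, assuming a hypothetical `simulate_coalescence` function standing in for the boundary-element computation and invented velocity data; it is not the authors' code.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def simulate_coalescence(viscosity, times):
    """Hypothetical stand-in for the boundary-element coalescence solver.

    Returns the computed free-surface velocity at the given times for a
    liquid of the given viscosity (placeholder model: faster decay for
    more viscous liquids).
    """
    return np.exp(-viscosity * times)

def misfit(viscosity, times, measured_velocity):
    """Sum-of-squares mismatch between computed and measured velocities."""
    return np.sum((simulate_coalescence(viscosity, times) - measured_velocity) ** 2)

# Invented "measured" free-surface velocities from high-speed photography
rng = np.random.default_rng(0)
times = np.linspace(0.0, 1.0, 50)
measured = np.exp(-1.4 * times) + 0.01 * rng.normal(size=times.size)

# Adjust the viscosity until the simulation best reproduces the measurement
result = minimize_scalar(misfit, bounds=(0.1, 10.0), method="bounded",
                         args=(times, measured))
print(f"Best-fit viscosity (arbitrary units): {result.x:.3f}")
```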

  9. 40 CFR Appendix A to Part 63 - Test Methods

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... components by a different analyst). 3.3 Surrogate Reference Materials. The analyst may use surrogate compounds... the variance of the proposed method is significantly different from that of the validated method by... variables can be determined in eight experiments rather than 128 (W.J. Youden, Statistical Manual of the...

  10. Linguistic and content validation of a German-language PRO-CTCAE-based patient-reported outcomes instrument to evaluate the late effect symptom experience after allogeneic hematopoietic stem cell transplantation.

    PubMed

    Kirsch, Monika; Mitchell, Sandra A; Dobbels, Fabienne; Stussi, Georg; Basch, Ethan; Halter, Jorg P; De Geest, Sabina

    2015-02-01

    The aim of this sequential mixed methods study was to develop a PRO-CTCAE (Patient-Reported Outcomes version of the Common Terminology Criteria for Adverse Events)-based measure of the symptom experience of late effects in German-speaking long-term survivors of allogeneic stem cell transplantation (SCT), and to examine its content validity. The US National Cancer Institute's PRO-CTCAE item library was translated into German and linguistically validated. PRO-CTCAE symptoms prevalent in ≥50% of survivors (n = 15) and recognized as important by SCT experts (n = 9) were identified. Additional concepts relevant to the symptom experience and its consequences were elicited. Content validity of the PROVIVO (Patient-Reported Outcomes of long-term survivors after allogeneic SCT) instrument was assessed through an additional round of cognitive debriefing in 15 patients and through item and scale content validity indices rated by 9 experts. PROVIVO comprises a total of 49 items capturing the experience of physical, emotional and cognitive symptoms. To improve the instrument's utility for clinical decision-making, questions soliciting limitations in activities of daily living, frequent infections, and overall well-being were added. Cognitive debriefings demonstrated that items were well understood and relevant to the SCT survivor experience. The scale content validity index (CVI) (0.94) and item CVIs (median = 1; range 0.75-1) were very high. Qualitative and quantitative data provide preliminary evidence supporting the content validity of PROVIVO and identify a PRO-CTCAE item bundle for use in SCT survivors. A study to evaluate the measurement properties of PROVIVO and to examine its capacity to improve survivorship care planning is underway. Copyright © 2014 Elsevier Ltd. All rights reserved.

  11. Supersonic Coaxial Jet Experiment for CFD Code Validation

    NASA Technical Reports Server (NTRS)

    Cutler, A. D.; Carty, A. A.; Doerner, S. E.; Diskin, G. S.; Drummond, J. P.

    1999-01-01

    A supersonic coaxial jet facility has been designed to provide experimental data suitable for the validation of CFD codes used to analyze high-speed propulsion flows. The center jet is of a light gas and the coflow jet is of air, and the mixing layer between them is compressible. Various methods have been employed in characterizing the jet flow field, including schlieren visualization, pitot, total temperature and gas sampling probe surveying, and RELIEF velocimetry. A Navier-Stokes code has been used to calculate the nozzle flow field and the results compared to the experiment.

  12. Generator Dynamic Model Validation and Parameter Calibration Using Phasor Measurements at the Point of Connection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Zhenyu; Du, Pengwei; Kosterev, Dmitry

    2013-05-01

    Disturbance data recorded by phasor measurement units (PMUs) offer opportunities to improve the integrity of dynamic models. However, manually tuning parameters through play-back of events demands significant effort and engineering experience. In this paper, a calibration method using the extended Kalman filter (EKF) technique is proposed. The formulation of the EKF with parameter calibration is discussed. Case studies are presented to demonstrate its validity. The proposed calibration method is cost-effective and complementary to traditional equipment testing for improving dynamic model quality.
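    As a rough illustration of the idea (not the authors' implementation), parameter calibration with an EKF is commonly done by augmenting the state vector with the unknown parameters, which are modelled as constants driven by small process noise; the PMU measurements then update both states and parameters at every step. The sketch below shows that augmented-state predict/update loop for a generic discrete-time model, using numerical Jacobians; the model functions, noise levels and data are placeholders.

```python
import numpy as np

def numerical_jacobian(f, x, eps=1e-6):
    """Finite-difference Jacobian of f evaluated at x."""
    fx = f(x)
    J = np.zeros((fx.size, x.size))
    for i in range(x.size):
        dx = np.zeros_like(x)
        dx[i] = eps
        J[:, i] = (f(x + dx) - fx) / eps
    return J

def ekf_calibrate(z_seq, x0, P0, f, h, Q, R):
    """Joint state/parameter estimation with an extended Kalman filter.

    The augmented state x holds the dynamic states followed by the unknown
    parameters (modelled as constants); f propagates x one step and h maps
    x to the measurement.
    """
    x, P = x0.copy(), P0.copy()
    for z in z_seq:
        F = numerical_jacobian(f, x)          # predict
        x = f(x)
        P = F @ P @ F.T + Q
        H = numerical_jacobian(h, x)          # update with measurement z
        y = z - h(x)
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ y
        P = (np.eye(x.size) - K @ H) @ P
    return x, P

# Toy model: one measured state x[0] with unknown decay parameter x[1]
dt, true_param = 0.1, 0.8
f = lambda x: np.array([x[0] - dt * x[1] * x[0], x[1]])
h = lambda x: np.array([x[0]])

rng = np.random.default_rng(1)
truth, values = 1.0, []
for _ in range(100):
    truth = truth - dt * true_param * truth
    values.append(truth)
z_seq = [np.array([v + 0.01 * rng.normal()]) for v in values]

x_hat, _ = ekf_calibrate(z_seq, x0=np.array([1.0, 0.3]), P0=np.eye(2),
                         f=f, h=h, Q=np.diag([1e-4, 1e-5]), R=np.array([[1e-4]]))
print("Calibrated parameter estimate:", round(float(x_hat[1]), 3))
```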

  13. Scale for positive aspects of caregiving experience: development, reliability, and factor structure.

    PubMed

    Kate, N; Grover, S; Kulhara, P; Nehra, R

    2012-06-01

    OBJECTIVE. To develop an instrument (Scale for Positive Aspects of Caregiving Experience [SPACE]) that evaluates positive caregiving experience and assess its psychometric properties. METHODS. Available scales which assess some aspects of positive caregiving experience were reviewed and a 50-item questionnaire with a 5-point rating was constructed. In all, 203 primary caregivers of patients with severe mental disorders were asked to complete the questionnaire. Internal consistency, test-retest reliability, cross-language reliability, split-half reliability, and face validity were evaluated. Principal component factor analysis was run to assess the factorial validity of the scale. RESULTS. The scale developed as part of the study was found to have good internal consistency, test-retest reliability, cross-language reliability, split-half reliability, and face validity. Principal component factor analysis yielded a 4-factor structure, which also had good test-retest reliability and cross-language reliability. There was a strong correlation between the 4 factors obtained. CONCLUSION. The SPACE developed as part of this study has good psychometric properties.

  14. The Spiritual Dimensions of Psychopolitical Validity: The Case of the Clergy Sexual Abuse Crisis

    ERIC Educational Resources Information Center

    Jones, Diana L.; Dokecki, Paul R.

    2008-01-01

    In this article, the authors explore the spiritual dimensions of psychopolitical validity and use it as a lens to analyze clergy sexual abuse. The psychopolitical approach suggests a comprehensive human science methodology that invites exploration of phenomena such as spirituality and religious experience and the use of methods from a wide variety…

  15. Quantification of DNA cleavage specificity in Hi-C experiments.

    PubMed

    Meluzzi, Dario; Arya, Gaurav

    2016-01-08

    Hi-C experiments produce large numbers of DNA sequence read pairs that are typically analyzed to deduce genomewide interactions between arbitrary loci. A key step in these experiments is the cleavage of cross-linked chromatin with a restriction endonuclease. Although this cleavage should happen specifically at the enzyme's recognition sequence, an unknown proportion of cleavage events may involve other sequences, owing to the enzyme's star activity or to random DNA breakage. A quantitative estimation of these non-specific cleavages may enable simulating realistic Hi-C read pairs for validation of downstream analyses, monitoring the reproducibility of experimental conditions and investigating biophysical properties that correlate with DNA cleavage patterns. Here we describe a computational method for analyzing Hi-C read pairs to estimate the fractions of cleavages at different possible targets. The method relies on expressing an observed local target distribution downstream of aligned reads as a linear combination of known conditional local target distributions. We validated this method using Hi-C read pairs obtained by computer simulation. Application of the method to experimental Hi-C datasets from murine cells revealed interesting similarities and differences in patterns of cleavage across the various experiments considered. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
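    The core of the estimation, as described, is a linear unmixing problem: the observed distribution of nearest downstream targets is modelled as a mixture of the distributions expected under each candidate cleavage mechanism, and the mixture weights are the cleavage fractions. A minimal sketch of that step, with made-up distributions and non-negative least squares, is shown below; it is not the authors' pipeline.

```python
import numpy as np
from scipy.optimize import nnls

# Columns: conditional local-target distributions expected if cleavage occurred
# at (a) the enzyme recognition site, (b) a star-activity site, (c) random breaks.
# Rows: distance bins downstream of the aligned read (illustrative numbers).
A = np.array([
    [0.60, 0.20, 0.10],
    [0.25, 0.40, 0.15],
    [0.10, 0.25, 0.30],
    [0.05, 0.15, 0.45],
])

# Observed local-target distribution pooled over Hi-C read pairs (illustrative).
b = np.array([0.47, 0.27, 0.15, 0.11])

# Solve min ||A f - b|| subject to f >= 0, then normalise to fractions.
f, residual = nnls(A, b)
fractions = f / f.sum()
for name, frac in zip(["recognition site", "star activity", "random breakage"], fractions):
    print(f"{name}: {frac:.2%}")
```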

  16. Development and initial validation of the Parental PELICAN Questionnaire (PaPEQu)--an instrument to assess parental experiences and needs during their child's end-of-life care.

    PubMed

    Zimmermann, Karin; Cignacco, Eva; Eskola, Katri; Engberg, Sandra; Ramelet, Anne-Sylvie; Von der Weid, Nicolas; Bergstraesser, Eva

    2015-12-01

    To develop and test the Parental PELICAN Questionnaire, an instrument to retrospectively assess parental experiences and needs during their child's end-of-life care. To offer appropriate care for dying children, healthcare professionals need to understand the illness experience from the family perspective. A questionnaire specific to the end-of-life experiences and needs of parents losing a child is needed to evaluate the perceived quality of paediatric end-of-life care. This is an instrument development study applying mixed methods based on recommendations for questionnaire design and validation. The Parental PELICAN Questionnaire was developed in four phases between August 2012-March 2014: phase 1: item generation; phase 2: validity testing; phase 3: translation; phase 4: pilot testing. Psychometric properties were assessed after applying the Parental PELICAN Questionnaire in a sample of 224 bereaved parents in April 2014. Validity testing covered the evidence based on tests of content, internal structure and relations to other variables. The Parental PELICAN Questionnaire consists of approximately 90 items in four slightly different versions accounting for particularities of the four diagnostic groups. The questionnaire's items were structured according to six quality domains described in the literature. Evidence of initial validity and reliability could be demonstrated with the involvement of healthcare professionals and bereaved parents. The Parental PELICAN Questionnaire holds promise as a measure to assess parental experiences and needs and is applicable to a broad range of paediatric specialties and settings. Future validation is needed to evaluate its suitability in different cultures. © 2015 John Wiley & Sons Ltd.

  17. Reliability and Validity of Gaze-Dependent Functional Vision Space: A Novel Metric Quantifying Visual Function in Infantile Nystagmus Syndrome.

    PubMed

    Roberts, Tawna L; Kester, Kristi N; Hertle, Richard W

    2018-04-01

    This study presents the test-retest reliability of optotype visual acuity (OVA) across 60° of horizontal gaze position in patients with infantile nystagmus syndrome (INS), and demonstrates the validity of the metric gaze-dependent functional vision space (GDFVS) in these patients. In experiment 1, OVA was measured twice in seven horizontal gaze positions, from 30° left to 30° right in 10° steps, in 20 subjects with INS and 14 without INS. Test-retest reliability was assessed using the intraclass correlation coefficient (ICC) in each gaze position. The OVA area under the curve (AUC) was calculated with horizontal eye position on the x-axis and logMAR visual acuity on the y-axis, and then converted to GDFVS. In experiment 2, the validity of GDFVS was determined over 40° of horizontal gaze by applying the 95% limits of agreement from experiment 1 to pre- and post-treatment GDFVS values from 85 patients with INS. In experiment 1, test-retest reliability for OVA was high (ICC ≥ 0.88), as the test-retest difference was on average less than 0.1 logMAR in each gaze position. In experiment 2, as a group, INS subjects had a significant increase (P < 0.001) in the size of their GDFVS that exceeded the 95% limits of agreement found during test-retest. OVA is a reliable measure in INS patients across 60° of horizontal gaze position, and GDFVS is a valid clinical metric for quantifying OVA as a function of eye position in INS patients. This metric captures the dynamic nature of OVA in INS patients and may be a valuable measure of visual function in patients with INS, particularly in quantifying change as part of clinical studies.
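    For readers unfamiliar with the metric, the gaze-dependent curve is simply acuity plotted against gaze angle, and its area can be computed with the trapezoidal rule; the sketch below illustrates this with invented acuity values and does not reproduce the paper's conversion from AUC to GDFVS.

```python
import numpy as np

# Horizontal gaze positions in degrees (30 deg left to 30 deg right in 10 deg steps)
gaze_deg = np.arange(-30, 31, 10).astype(float)

# Hypothetical logMAR optotype visual acuity measured at each gaze position
logmar = np.array([0.52, 0.40, 0.28, 0.22, 0.30, 0.44, 0.58])

# Area under the gaze/acuity curve by the trapezoidal rule (logMAR * degrees)
auc = np.sum(np.diff(gaze_deg) * (logmar[:-1] + logmar[1:]) / 2.0)
print(f"Area under the curve across 60 deg of gaze: {auc:.1f} logMAR*deg")
```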

  18. The Use of Virtual Reality in the Study of People's Responses to Violent Incidents.

    PubMed

    Rovira, Aitor; Swapp, David; Spanlang, Bernhard; Slater, Mel

    2009-01-01

    This paper reviews experimental methods for the study of the responses of people to violence in digital media, and in particular considers the issues of internal validity and ecological validity or generalisability of results to events in the real world. Experimental methods typically involve a significant level of abstraction from reality, with participants required to carry out tasks that are far removed from violence in real life, and hence their ecological validity is questionable. On the other hand studies based on field data, while having ecological validity, cannot control multiple confounding variables that may have an impact on observed results, so that their internal validity is questionable. It is argued that immersive virtual reality may provide a unification of these two approaches. Since people tend to respond realistically to situations and events that occur in virtual reality, and since virtual reality simulations can be completely controlled for experimental purposes, studies of responses to violence within virtual reality are likely to have both ecological and internal validity. This depends on a property that we call 'plausibility' - including the fidelity of the depicted situation with prior knowledge and expectations. We illustrate this with data from a previously published experiment, a virtual reprise of Stanley Milgram's 1960s obedience experiment, and also with pilot data from a new study being developed that looks at bystander responses to violent incidents.

  19. The Use of Virtual Reality in the Study of People's Responses to Violent Incidents

    PubMed Central

    Rovira, Aitor; Swapp, David; Spanlang, Bernhard; Slater, Mel

    2009-01-01

    This paper reviews experimental methods for the study of the responses of people to violence in digital media, and in particular considers the issues of internal validity and ecological validity or generalisability of results to events in the real world. Experimental methods typically involve a significant level of abstraction from reality, with participants required to carry out tasks that are far removed from violence in real life, and hence their ecological validity is questionable. On the other hand studies based on field data, while having ecological validity, cannot control multiple confounding variables that may have an impact on observed results, so that their internal validity is questionable. It is argued that immersive virtual reality may provide a unification of these two approaches. Since people tend to respond realistically to situations and events that occur in virtual reality, and since virtual reality simulations can be completely controlled for experimental purposes, studies of responses to violence within virtual reality are likely to have both ecological and internal validity. This depends on a property that we call ‘plausibility’ – including the fidelity of the depicted situation with prior knowledge and expectations. We illustrate this with data from a previously published experiment, a virtual reprise of Stanley Milgram's 1960s obedience experiment, and also with pilot data from a new study being developed that looks at bystander responses to violent incidents. PMID:20076762

  20. A statistical method (cross-validation) for bone loss region detection after spaceflight

    PubMed Central

    Zhao, Qian; Li, Wenjun; Li, Caixia; Chu, Philip W.; Kornak, John; Lang, Thomas F.

    2010-01-01

    Astronauts experience bone loss after long spaceflight missions. Identifying the specific regions that undergo the greatest losses (e.g. in the proximal femur) could reveal information about the processes of bone loss in disuse and disease. The detection of such regions, however, remains an open problem, and this paper focuses on statistical methods for it. We perform statistical parametric mapping to obtain t-maps of changes in images and propose a new cross-validation method to select an optimum suprathreshold for forming clusters of pixels. Once these candidate clusters are formed, we use permutation testing of longitudinal labels to derive significant changes. PMID:20632144
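    The final step described, permutation testing of longitudinal labels on suprathreshold clusters, can be sketched as follows: compute a voxelwise paired t-map, threshold it, record the largest cluster, then repeat under random sign flips of the longitudinal differences to build a null distribution of maximum cluster sizes. This is a generic illustration on assumed 2-D data, not the authors' code.

```python
import numpy as np
from scipy import ndimage, stats

def max_cluster_size(diff_maps, threshold):
    """Largest connected cluster of pixels whose paired t-statistic exceeds threshold."""
    t_map, _ = stats.ttest_1samp(diff_maps, popmean=0.0, axis=0)
    mask = t_map > threshold
    labels, n_clusters = ndimage.label(mask)
    if n_clusters == 0:
        return 0.0
    return float(np.max(ndimage.sum(mask, labels, index=range(1, n_clusters + 1))))

rng = np.random.default_rng(0)
n_subjects, shape = 12, (32, 32)
# Hypothetical pre-minus-post BMD difference images (positive = loss),
# with a small patch of true loss added
diff = rng.normal(0.0, 1.0, size=(n_subjects,) + shape)
diff[:, 10:16, 10:16] += 1.0

threshold = 3.0
observed = max_cluster_size(diff, threshold)

# Null distribution: randomly flip the sign of each subject's difference image
null = [max_cluster_size(diff * rng.choice([-1.0, 1.0], size=(n_subjects, 1, 1)),
                         threshold) for _ in range(500)]

p_value = np.mean(np.array(null) >= observed)
print(f"Observed max cluster size: {observed:.0f}, permutation p = {p_value:.3f}")
```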

  1. Scale/TSUNAMI Sensitivity Data for ICSBEP Evaluations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T; Reed, Davis Allan; Lefebvre, Robert A

    2011-01-01

    The Tools for Sensitivity and Uncertainty Analysis Methodology Implementation (TSUNAMI) software developed at Oak Ridge National Laboratory (ORNL) as part of the Scale code system provides unique methods for code validation, gap analysis, and experiment design. For TSUNAMI analysis, sensitivity data are generated for each application and each existing or proposed experiment used in the assessment. The validation of diverse sets of applications requires potentially thousands of data files to be maintained and organized by the user, and a growing number of these files are available through the International Handbook of Evaluated Criticality Safety Benchmark Experiments (IHECSBE) distributed through the International Criticality Safety Benchmark Evaluation Program (ICSBEP). To facilitate the use of the IHECSBE benchmarks in rigorous TSUNAMI validation and gap analysis techniques, ORNL generated SCALE/TSUNAMI sensitivity data files (SDFs) for several hundred benchmarks for distribution with the IHECSBE. For the 2010 edition of IHECSBE, the sensitivity data were generated using 238-group cross-section data based on ENDF/B-VII.0 for 494 benchmark experiments. Additionally, ORNL has developed a quality assurance procedure to guide the generation of Scale inputs and sensitivity data, as well as a graphical user interface to facilitate the use of sensitivity data in identifying experiments and applying them in validation studies.

  2. Getting the most out of RNA-seq data analysis.

    PubMed

    Khang, Tsung Fei; Lau, Ching Yee

    2015-01-01

    Background. A common research goal in transcriptome projects is to find genes that are differentially expressed in different phenotype classes. Biologists might wish to validate such gene candidates experimentally, or use them for downstream systems biology analysis. Producing a coherent differential gene expression analysis from RNA-seq count data requires an understanding of how numerous sources of variation, such as the replicate size, the hypothesized biological effect size, and the specific method for making differential expression calls, interact. We believe an explicit demonstration of such interactions in real RNA-seq data sets is of practical interest to biologists. Results. Using two large public RNA-seq data sets, one representing a strong and the other a mild biological effect size, we simulated different replicate size scenarios and tested the performance of several commonly used methods for calling differentially expressed genes in each of them. We found that, when the biological effect size was mild, RNA-seq experiments should focus on experimental validation of differentially expressed gene candidates. Importantly, at least triplicates must be used, and the differentially expressed genes should be called using methods with high positive predictive value (PPV), such as NOISeq or GFOLD. In contrast, when the biological effect size was strong, differentially expressed genes mined from unreplicated experiments using NOISeq, ASC and GFOLD had between 30 and 50% mean PPV, an increase of more than 30-fold compared to the case of mild biological effect size. Among methods with good PPV performance, having triplicates or more substantially improved mean PPV to over 90% for GFOLD, 60% for DESeq2, 50% for NOISeq, and 30% for edgeR. At a replicate size of six, we found DESeq2 and edgeR to be reasonable methods for calling differentially expressed genes for systems-level analysis, as their PPV and sensitivity trade-off was superior to that of the other methods. Conclusion. When the biological effect size is weak, systems-level investigation is not possible using RNA-seq data, and no meaningful result can be obtained in unreplicated experiments. Nonetheless, NOISeq or GFOLD may yield limited numbers of gene candidates with good validation potential when triplicates or more are available. When the biological effect size is strong, NOISeq and GFOLD are effective tools for detecting differentially expressed genes in unreplicated RNA-seq experiments for qPCR validation. When triplicates or more are available, GFOLD is a sharp tool for identifying high-confidence differentially expressed genes for targeted qPCR validation; for downstream systems-level analysis, combined results from DESeq2 and edgeR are useful.

  3. Using Discrete Choice Experiments to Inform the Benefit-Risk Assessment of Medicines: Are We Ready Yet?

    PubMed

    Vass, Caroline M; Payne, Katherine

    2017-09-01

    There is emerging interest in the use of discrete choice experiments as a means of quantifying the perceived balance between benefits and risks (quantitative benefit-risk assessment) of new healthcare interventions, such as medicines, under assessment by regulatory agencies. For stated preference data on benefit-risk assessment to be used in regulatory decision making, the methods to generate these data must be valid, reliable and capable of producing meaningful estimates understood by decision makers. Some reporting guidelines exist for discrete choice experiments, and for related methods such as conjoint analysis. However, existing guidelines focus on reporting standards, are general in focus and do not consider the requirements for using discrete choice experiments specifically for quantifying benefit-risk assessments in the context of regulatory decision making. This opinion piece outlines the current state of play in using discrete choice experiments for benefit-risk assessment and proposes key areas needing to be addressed to demonstrate that discrete choice experiments are an appropriate and valid stated preference elicitation method in this context. Methodological research is required to establish: how robust the results of discrete choice experiments are to formats and methods of risk communication; how information in the discrete choice experiment can be presented effectually to respondents; whose preferences should be elicited; the correct underlying utility function and analytical model; the impact of heterogeneity in preferences; and the generalisability of the results. We believe these methodological issues should be addressed, alongside developing a 'reference case', before agencies can safely and confidently use discrete choice experiments for quantitative benefit-risk assessment in the context of regulatory decision making for new medicines and healthcare products.

  4. Relations between inductive reasoning and deductive reasoning.

    PubMed

    Heit, Evan; Rotello, Caren M

    2010-05-01

    One of the most important open questions in reasoning research is how inductive reasoning and deductive reasoning are related. In an effort to address this question, we applied methods and concepts from memory research. We used 2 experiments to examine the effects of logical validity and premise-conclusion similarity on evaluation of arguments. Experiment 1 showed 2 dissociations: For a common set of arguments, deduction judgments were more affected by validity, and induction judgments were more affected by similarity. Moreover, Experiment 2 showed that fast deduction judgments were like induction judgments, in terms of being more influenced by similarity and less influenced by validity, compared with slow deduction judgments. These novel results pose challenges for a 1-process account of reasoning and are interpreted in terms of a 2-process account of reasoning, which was implemented as a multidimensional signal detection model and applied to receiver operating characteristic data. PsycINFO Database Record (c) 2010 APA, all rights reserved.

  5. Viscosity Measurement Using Drop Coalescence in Microgravity

    NASA Technical Reports Server (NTRS)

    Antar, Basil N.; Ethridge, Edwin C.; Maxwell, Daniel; Curreri, Peter A. (Technical Monitor)

    2002-01-01

    We present here validation studies of a new method, intended for use in a microgravity environment, that measures the viscosity of highly viscous undercooled liquids using drop coalescence. The method has the advantage of avoiding heterogeneous nucleation at container walls caused by crystallization of undercooled liquids during processing. Homogeneous nucleation can also be avoided owing to the rapidity of the measurement. The technique relies on measurements from experiments conducted in a near-zero-gravity environment as well as a highly accurate analytical formulation of the coalescence process. The viscosity of the liquid is determined by adjusting the computed free-surface shape relaxation time in response to the measured free-surface velocity of two coalescing drops. Results are presented from two sets of validation experiments for the method, conducted on board aircraft flying parabolic trajectories. In these tests the viscosity of a highly viscous liquid, namely glycerin, was determined at different temperatures using the drop coalescence method described here. The experiments measured the free-surface velocity of two glycerin drops coalescing under the action of surface tension alone in a low-gravity environment using high-speed photography. The liquid viscosity was determined by adjusting the computed free-surface velocity values to the measured experimental data. The results of these experiments were found to agree reasonably well with the known viscosity of the test liquid used.

  6. Development and Validation of High Performance Liquid Chromatography Method for Determination Atorvastatin in Tablet

    NASA Astrophysics Data System (ADS)

    Yugatama, A.; Rohmani, S.; Dewangga, A.

    2018-03-01

    Atorvastatin is the primary choice for dyslipidemia treatment. Because the atorvastatin patent has expired, the pharmaceutical industry produces generic copies of the drug; methods for tablet quality testing, including determination of the atorvastatin content of tablets, therefore need to be developed. The purpose of this research was to develop and validate a simple analytical method for atorvastatin tablets by HPLC. The HPLC system used in this experiment consisted of a Cosmosil C18 column (150 x 4.6 mm, 5 µm) as the reversed-phase stationary phase, a mixture of methanol-water at pH 3 (80:20 v/v) as the mobile phase, a flow rate of 1 mL/min, and UV detection at a wavelength of 245 nm. The validation parameters included selectivity, linearity, accuracy, precision, limit of detection (LOD), and limit of quantitation (LOQ). The results of this study indicate that the developed method showed good selectivity, linearity, accuracy, precision, LOD, and LOQ for the analysis of atorvastatin tablet content. The LOD and LOQ were 0.2 and 0.7 ng/mL, respectively, and the linearity range was 20-120 ng/mL.
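    LOD and LOQ figures like these are commonly derived from the calibration curve using the ICH-style formulas LOD = 3.3·σ/S and LOQ = 10·σ/S, where S is the slope and σ the residual standard deviation of the regression. The sketch below shows that calculation on an invented calibration data set; the numbers are not the study's data.

```python
import numpy as np

# Hypothetical calibration data: nominal concentrations (ng/mL) vs peak areas
conc = np.array([20, 40, 60, 80, 100, 120], dtype=float)
area = np.array([41.2, 80.9, 122.5, 160.8, 203.1, 241.7])

# Least-squares calibration line: area = slope * conc + intercept
slope, intercept = np.polyfit(conc, area, deg=1)
residuals = area - (slope * conc + intercept)
sigma = np.std(residuals, ddof=2)  # residual SD with 2 fitted parameters

lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope
print(f"slope = {slope:.3f}, sigma = {sigma:.3f}")
print(f"LOD = {lod:.2f} ng/mL, LOQ = {loq:.2f} ng/mL")
```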

  7. Design and validation of instruments to measure knowledge.

    PubMed

    Elliott, T E; Regal, R R; Elliott, B A; Renier, C M

    2001-01-01

    Measuring health care providers' learning after they have participated in educational interventions that use experimental designs requires valid, reliable, and practical instruments. A literature review was conducted. In addition, experience gained from designing and validating instruments for measuring the effect of an educational intervention informed this process. The eight main steps for designing, validating, and testing the reliability of instruments for measuring learning outcomes are presented. The key considerations and rationale for this process are discussed. Methods for critiquing and adapting existent instruments and creating new ones are offered. This study may help other investigators in developing valid, reliable, and practical instruments for measuring the outcomes of educational activities.

  8. Fiducial marker application method for position alignment of in situ multimodal X-ray experiments and reconstructions

    DOE PAGES

    Shade, Paul A.; Menasche, David B.; Bernier, Joel V.; ...

    2016-03-01

    An evolving suite of X-ray characterization methods is presently available to the materials community, providing a great opportunity to gain new insight into material behavior and provide critical validation data for materials models. Two critical and related issues are sample repositioning during an in situ experiment and registration of multiple data sets after the experiment. To address these issues, a method is described which utilizes a focused ion-beam scanning electron microscope equipped with a micromanipulator to apply gold fiducial markers to samples for X-ray measurements. The method is demonstrated with a synchrotron X-ray experiment involving in situ loading of a titanium alloy tensile specimen.

  9. On Selecting Commercial Information Systems

    PubMed Central

    Möhr, J.R.; Sawinski, R.; Kluge, A.; Alle, W.

    1984-01-01

    As more commercial information systems become available, the methodology for their selection gains importance. An instance is described in which the method employed for the selection of laboratory information systems was multilevel assessment. The method is described, and the experience gained in the project is summarized and discussed. Evidence is provided that the employed method is comprehensive, reproducible, valid and economical.

  10. Design of experiments in medical physics: Application to the AAA beam model validation.

    PubMed

    Dufreneix, S; Legrand, C; Di Bartolo, C; Bremaud, M; Mesgouez, J; Tiplica, T; Autret, D

    2017-09-01

    The purpose of this study is to evaluate the usefulness of the design of experiments in the analysis of multiparametric problems related to quality assurance in radiotherapy. The main motivation is to use this statistical method to optimize the quality assurance processes in the validation of beam models. Considering the Varian Eclipse system, eight parameters with several levels were selected: energy, MLC, depth, X, Y1 and Y2 jaw dimensions, wedge and wedge jaw. A Taguchi table was used to define 72 validation tests. Measurements were conducted in water using a CC04 on a TrueBeam STx, a TrueBeam Tx, a Trilogy and a 2300IX accelerator matched by the vendor. Dose was computed using the AAA algorithm. The same raw data were used for all accelerators during the beam modelling. The mean difference between computed and measured doses was 0.1±0.5% for all beams and all accelerators, with a maximum difference of 2.4% (under the 3% tolerance level). For all beams, the measured doses were within 0.6% for all accelerators. The energy was found to be an influencing parameter, but the deviations observed were smaller than 1% and not considered clinically significant. Designs of experiments can help define the optimal measurement set needed to validate a beam model. The proposed method can be used to identify the prognostic factors of dose accuracy. The beam models were validated for the 4 accelerators, which were found dosimetrically equivalent even though the accelerator characteristics differ. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
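    One common way to screen which factors in such a design actually influence the dose deviation is a per-factor analysis of variance on the measured-versus-computed differences. The sketch below illustrates this on a synthetic results table with pandas and SciPy; the column names, levels and values are assumptions, not the study's data.

```python
import numpy as np
import pandas as pd
from scipy import stats

rng = np.random.default_rng(1)

# Synthetic stand-in for the 72-test validation table: one row per measurement,
# factor settings plus the dose deviation (measured minus computed, in %).
n = 72
df = pd.DataFrame({
    "energy": rng.choice(["6X", "10X", "15X"], size=n),
    "wedge": rng.choice(["open", "W30", "W60"], size=n),
    "depth_cm": rng.choice([5, 10, 20], size=n),
})
df["dose_dev_pct"] = rng.normal(0.1, 0.5, size=n)
# Inject a small energy effect so the screening has something to find.
df.loc[df["energy"] == "15X", "dose_dev_pct"] += 0.4

# One-way ANOVA per factor: does the deviation differ between factor levels?
for factor in ["energy", "wedge", "depth_cm"]:
    groups = [g["dose_dev_pct"].to_numpy() for _, g in df.groupby(factor)]
    f_stat, p_val = stats.f_oneway(*groups)
    print(f"{factor:10s} F = {f_stat:5.2f}  p = {p_val:.3f}")
```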

  11. Method of Analysis by the U.S. Geological Survey California District Sacramento Laboratory?Determination of Trihalomethane Formation Potential, Method Validation, and Quality-Control Practices

    USGS Publications Warehouse

    Crepeau, Kathryn L.; Fram, Miranda S.; Bush, Noel

    2004-01-01

    An analytical method for the determination of the trihalomethane formation potential of water samples has been developed. The trihalomethane formation potential is measured by dosing samples with chlorine under specified conditions of pH, temperature, incubation time, darkness, and residual-free chlorine, and then analyzing the resulting trihalomethanes by purge and trap/gas chromatography equipped with an electron capture detector. Detailed explanations of the method and quality-control practices are provided. Method validation experiments showed that the trihalomethane formation potential varies as a function of time between sample collection and analysis, residual-free chlorine concentration, method of sample dilution, and the concentration of bromide in the sample.

  12. A Hardware Model Validation Tool for Use in Complex Space Systems

    NASA Technical Reports Server (NTRS)

    Davies, Misty Dawn; Gundy-Burlet, Karen L.; Limes, Gregory L.

    2010-01-01

    One of the many technological hurdles that must be overcome in future missions is the challenge of validating as-built systems against the models used for design. We propose a technique composed of intelligent parameter exploration in concert with automated failure analysis as a scalable method for the validation of complex space systems. The technique is impervious to discontinuities and linear dependencies in the data, and can handle dimensionalities consisting of hundreds of variables over tens of thousands of experiments.

  13. Large-scale experimental technology with remote sensing in land surface hydrology and meteorology

    NASA Technical Reports Server (NTRS)

    Brutsaert, Wilfried; Schmugge, Thomas J.; Sellers, Piers J.; Hall, Forrest G.

    1988-01-01

    Two field experiments to study atmospheric and land surface processes and their interactions are summarized. The Hydrologic-Atmospheric Pilot Experiment, which tested techniques for measuring evaporation, soil moisture storage, and runoff at scales of about 100 km, was conducted over a 100 X 100 km area in France from mid-1985 to early 1987. The first International Satellite Land Surface Climatology Program field experiment was conducted in 1987 to develop and use relationships between current satellite measurements and hydrologic, climatic, and biophysical variables at the earth's surface and to validate these relationships with ground truth. This experiment also validated surface parameterization methods for simulation models that describe surface processes from the scale of vegetation leaves up to scales appropriate to satellite remote sensing.

  14. Emotional Arousal and Regulation: Further Evidence of the Validity of the "How I Feel" Questionnaire for Use with School-Age Children

    ERIC Educational Resources Information Center

    Ciucci, Enrica; Baroncelli, Andrea; Grazzani, Ilaria; Ornaghi, Veronica; Caprin, Claudia

    2016-01-01

    Background: The ability to understand and manage emotional experience is critical to children's health. This study confirmed the validity of the How I Feel (HIF) Questionnaire, a measure of children's emotional arousal and regulation, exploring its associations with measures of emotional and social functioning. Methods: The sample was comprised of…

  15. Comparative study between EDXRF and ASTM E572 methods using two-way ANOVA

    NASA Astrophysics Data System (ADS)

    Krummenauer, A.; Veit, H. M.; Zoppas-Ferreira, J.

    2018-03-01

    Comparison with a reference method is one of the necessary requirements for the validation of non-standard methods. This comparison was made using the experiment planning technique with two-way ANOVA. In the ANOVA, the results obtained using the EDXRF method to be validated were compared with the results obtained using the ASTM E572-13 standard test method. Fisher's tests (F-tests) were used for the comparative study of the elements molybdenum, niobium, copper, nickel, manganese, chromium and vanadium. All F-tests for these elements indicate that the null hypothesis (H0) was not rejected. As a result, there is no significant difference between the compared methods. Therefore, according to this study, it is concluded that the EDXRF method meets this method-comparison requirement.
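    A two-way ANOVA of this kind, with analysis method and element as factors, can be set up as in the sketch below using statsmodels; the measurement values are invented and serve only to show the layout of the comparison.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

rng = np.random.default_rng(0)
elements = ["Mo", "Nb", "Cu", "Ni", "Mn", "Cr", "V"]
rows = []
for element in elements:
    true_level = rng.uniform(0.05, 1.5)  # assumed mass fraction, wt%
    for method in ["EDXRF", "ASTM_E572"]:
        for _ in range(3):  # triplicate determinations
            rows.append({"element": element, "method": method,
                         "wt_pct": true_level + rng.normal(0, 0.02)})
df = pd.DataFrame(rows)

# Two-way ANOVA: is there a significant method effect or method-element interaction?
model = ols("wt_pct ~ C(method) + C(element) + C(method):C(element)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```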

  16. Fractal Clustering and Knowledge-driven Validation Assessment for Gene Expression Profiling.

    PubMed

    Wang, Lu-Yong; Balasubramanian, Ammaiappan; Chakraborty, Amit; Comaniciu, Dorin

    2005-01-01

    DNA microarray experiments generate a substantial amount of information about global gene expression. Gene expression profiles can be represented as points in multi-dimensional space, and it is essential to identify relevant groups of genes in biomedical research. Clustering is helpful for pattern recognition in gene expression profiles, and a number of clustering techniques have been introduced. However, these traditional methods mainly rely on shape-based assumptions or a distance metric to cluster the points in a linear, multi-dimensional Euclidean space, and their results show poor consistency with the functional annotation of genes in previous validation studies. From a different perspective, we propose a fractal clustering method that clusters genes using the intrinsic (fractal) dimension from modern geometry. This method clusters points in such a way that points in the same cluster are more self-affine among themselves than with points in other clusters. We assess the method using an annotation-based validation assessment for gene clusters, which shows that it is superior to traditional methods in identifying functionally related gene groups.
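    To give a concrete sense of "intrinsic (fractal) dimension", the sketch below estimates the correlation dimension of a point cloud with the Grassberger-Procaccia approach (slope of log pair-count versus log radius); this is only one standard estimator on synthetic data and is not the clustering algorithm of the paper.

```python
import numpy as np
from scipy.spatial.distance import pdist

def correlation_dimension(points, radii):
    """Estimate intrinsic dimension from the scaling of the pair-correlation sum."""
    dists = pdist(points)                                    # all pairwise distances
    counts = np.array([np.mean(dists < r) for r in radii])   # correlation sum C(r)
    mask = counts > 0
    # Fit log C(r) = D * log r + const; the slope D is the dimension estimate
    slope, _ = np.polyfit(np.log(radii[mask]), np.log(counts[mask]), deg=1)
    return slope

rng = np.random.default_rng(0)
# Points lying near a 2-D plane embedded in a 10-D "expression space"
basis = rng.normal(size=(2, 10)) / np.sqrt(10)
points = rng.normal(size=(1000, 2)) @ basis + 0.01 * rng.normal(size=(1000, 10))

radii = np.logspace(-0.5, 0.5, 15)
print(f"Estimated intrinsic dimension: {correlation_dimension(points, radii):.2f}")
```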

  17. Validation of an improved abnormality insertion method for medical image perception investigations

    NASA Astrophysics Data System (ADS)

    Madsen, Mark T.; Durst, Gregory R.; Caldwell, Robert T.; Schartz, Kevin M.; Thompson, Brad H.; Berbaum, Kevin S.

    2009-02-01

    The ability to insert abnormalities in clinical tomographic images makes image perception studies with medical images practical. We describe a new insertion technique and its experimental validation that uses complementary image masks to select an abnormality from a library and place it at a desired location. The method was validated using a 4-alternative forced-choice experiment. For each case, four quadrants were simultaneously displayed consisting of 5 consecutive frames of a chest CT with a pulmonary nodule. One quadrant was unaltered, while the other 3 had the nodule from the unaltered quadrant artificially inserted. 26 different sets were generated and repeated with order scrambling for a total of 52 cases. The cases were viewed by radiology staff and residents who ranked each quadrant by realistic appearance. On average, the observers were able to correctly identify the unaltered quadrant in 42% of cases, and identify the unaltered quadrant both times it appeared in 25% of cases. Consensus, defined by a majority of readers, correctly identified the unaltered quadrant in only 29% of 52 cases. For repeats, the consensus observer successfully identified the unaltered quadrant only once. We conclude that the insertion method can be used to reliably place abnormalities in perception experiments.
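    The complementary-mask idea can be illustrated as a simple alpha blend: one mask weights the library abnormality and its complement weights the underlying background at the chosen location. The sketch below shows that compositing step on synthetic arrays; it is a schematic of the general technique, not the validated insertion tool.

```python
import numpy as np

def insert_abnormality(background, lesion, mask, row, col):
    """Blend a lesion patch into a background image using complementary masks.

    mask weights the lesion and (1 - mask) weights the original background,
    so pixels where mask == 0 are left untouched.
    """
    out = background.astype(float).copy()
    h, w = lesion.shape
    region = out[row:row + h, col:col + w]
    out[row:row + h, col:col + w] = mask * lesion + (1.0 - mask) * region
    return out

rng = np.random.default_rng(0)
ct_slice = rng.normal(-700, 30, size=(128, 128))   # synthetic lung-like background (HU)

# Synthetic nodule patch and a smooth mask that fades to zero at the edges
yy, xx = np.mgrid[-8:9, -8:9]
radius = np.hypot(yy, xx)
mask = np.clip(1.0 - radius / 8.0, 0.0, 1.0)       # 1 at the centre, 0 at the border
nodule = np.full(mask.shape, 40.0)                  # soft-tissue-like intensity (HU)

composite = insert_abnormality(ct_slice, nodule, mask, row=60, col=45)
print("Inserted patch mean HU:", round(float(composite[60:77, 45:62].mean()), 1))
```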

  18. Determination of vitamin C in foods: current state of method validation.

    PubMed

    Spínola, Vítor; Llorent-Martínez, Eulogio J; Castilho, Paula C

    2014-11-21

    Vitamin C is one of the most important vitamins, so reliable information about its content in foodstuffs is a concern to both consumers and quality control agencies. However, the heterogeneity of food matrixes and the potential degradation of this vitamin during its analysis create enormous challenges. This review addresses the development and validation of high-performance liquid chromatography methods for vitamin C analysis in food commodities, during the period 2000-2014. The main characteristics of vitamin C are mentioned, along with the strategies adopted by most authors during sample preparation (freezing and acidification) to avoid vitamin oxidation. After that, the advantages and handicaps of different analytical methods are discussed. Finally, the main aspects concerning method validation for vitamin C analysis are critically discussed. Parameters such as selectivity, linearity, limit of quantification, and accuracy were studied by most authors. Recovery experiments during accuracy evaluation were in general satisfactory, with usual values between 81 and 109%. However, few methods considered vitamin C stability during the analytical process, and the study of the precision was not always clear or complete. Potential future improvements regarding proper method validation are indicated to conclude this review. Copyright © 2014. Published by Elsevier B.V.

  19. Convergent validity between a discrete choice experiment and a direct, open-ended method: comparison of preferred attribute levels and willingness to pay estimates.

    PubMed

    Marjon van der Pol; Shiell, Alan; Au, Flora; Johnston, David; Tough, Suzanne

    2008-12-01

    The Discrete Choice Experiment (DCE) has become increasingly popular as a method for eliciting patient or population preferences. If DCE estimates are to inform health policy, it is crucial that the answers they provide are valid. Convergent validity is tested in this paper by comparing the results of a DCE exercise with the answers obtained from direct, open-ended questions. The two methods are compared in terms of preferred attribute levels and willingness to pay (WTP) values. Face-to-face interviews were held with 292 women in Calgary, Canada. Similar values were found between the two methods with respect to preferred levels for two out of three of the attributes examined. The DCE predicted less well for levels outside the range than for levels inside the range reaffirming the importance of extensive piloting to ensure appropriate level range in DCEs. The mean WTP derived from the open-ended question was substantially lower than the mean derived from the DCE. However, the two sets of willingness to pay estimates were consistent with each other in that individuals who were willing to pay more in the open-ended question were also willing to pay more in the DCE. The difference in mean WTP values between the two approaches (direct versus DCE) demonstrates the importance of continuing research into the different biases present across elicitation methods.

  20. Validating Remotely Sensed Land Surface Evapotranspiration Based on Multi-scale Field Measurements

    NASA Astrophysics Data System (ADS)

    Jia, Z.; Liu, S.; Ziwei, X.; Liang, S.

    2012-12-01

    Land surface evapotranspiration plays an important role in the surface energy balance and the water cycle. There have been significant technical and theoretical advances in our knowledge of evapotranspiration over the past two decades. Acquisition of the temporally and spatially continuous distribution of evapotranspiration using remote sensing technology has attracted the widespread attention of researchers and managers. However, remote sensing technology still has many uncertainties arising from the model mechanism, model inputs, parameterization schemes, and scaling issues in regional estimation. Achieving remotely sensed evapotranspiration (RS_ET) estimates with high confidence is necessary but difficult. As a result, it is indispensable to develop validation methods to quantitatively assess the accuracy and error sources of regional RS_ET estimations. This study proposes an innovative validation method based on multi-scale evapotranspiration acquired from field measurements, with the validation results including accuracy assessment, error source analysis, and uncertainty analysis of the validation process. It is a potentially useful approach to evaluate the accuracy and analyze the spatio-temporal properties of RS_ET at both the basin and local scales, and is appropriate for validating RS_ET at diverse resolutions and different time scales. An independent RS_ET validation using this method, over the Hai River Basin, China for 2002-2009, is presented as a case study. Validation at the basin scale showed good agreement between the 1 km annual RS_ET and the validation data, such as the water-balance evapotranspiration, MODIS evapotranspiration products, precipitation, and land-use types. Validation at the local scale also gave good results for monthly and daily RS_ET at 30 m and 1 km resolutions, compared to the multi-scale evapotranspiration measurements from the EC and LAS, respectively, with the footprint model over three typical landscapes. Although some validation experiments demonstrated that the models yield accurate estimates at flux measurement sites, the question remains whether they perform well over the broader landscape. Moreover, a large number of RS_ET products have been released in recent years. Thus, we also pay attention to the cross-validation of RS_ET derived from multi-source models. "The Multi-scale Observation Experiment on Evapotranspiration over Heterogeneous Land Surfaces: Flux Observation Matrix" campaign was carried out in the middle reaches of the Heihe River Basin, China in 2012. Flux measurements from an observation matrix composed of 22 EC and 4 LAS were acquired to investigate the cross-validation of multi-source models over different landscapes. In this case, six remote sensing models, including the empirical statistical model, the one-source and two-source models, the Penman-Monteith equation based model, the Priestley-Taylor equation based model, and the complementary relationship based model, were used to perform an intercomparison. All the results from the two RS_ET validation cases showed that the proposed validation methods are reasonable and feasible.

  1. Rotational Dynamics with Tracker

    ERIC Educational Resources Information Center

    Eadkhong, T.; Rajsadorn, R.; Jannual, P.; Danworaphong, S.

    2012-01-01

    We propose the use of Tracker, freeware for video analysis, to analyse the moment of inertia ("I") of a cylindrical plate. Three experiments are performed to validate the proposed method. The first experiment is dedicated to find the linear coefficient of rotational friction ("b") for our system. By omitting the effect of such friction, we derive…
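
    A minimal analysis sketch in this spirit, under two assumptions that are not stated in the abstract: the angle-time data come from a Tracker export (replaced here by synthetic data so the snippet runs), and the friction torque is viscous, so that I*domega/dt = -b*omega and ln(omega) decays linearly with slope -b/I during free spin-down.

      import numpy as np

      t = np.linspace(0.0, 10.0, 200)                 # s; in practice, Tracker's exported time column
      omega_true = 20.0 * np.exp(-0.15 * t)           # rad/s, stand-in for the measured spin-down
      theta = np.cumsum(omega_true) * (t[1] - t[0])   # integrated angle, as Tracker would report

      omega = np.gradient(theta, t)                   # recover angular velocity from the angle data
      slope, _ = np.polyfit(t, np.log(omega), 1)      # ln(omega) = ln(omega0) - (b/I) t
      print(f"estimated b/I = {-slope:.3f} 1/s")      # ~0.15 for this synthetic data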

  2. Application of multi-factorial design of experiments to successfully optimize immunoassays for robust measurements of therapeutic proteins.

    PubMed

    Ray, Chad A; Patel, Vimal; Shih, Judy; Macaraeg, Chris; Wu, Yuling; Thway, Theingi; Ma, Mark; Lee, Jean W; Desilva, Binodh

    2009-02-20

    Developing a process that generates robust immunoassays that can be used to support studies with tight timelines is a common challenge for bioanalytical laboratories. Design of experiments (DOE) is a tool that has been used by many industries for the purpose of optimizing processes. The approach is capable of identifying critical factors and their interactions with a minimal number of experiments. The challenge for implementing this tool in the bioanalytical laboratory is to develop a user-friendly approach that scientists can understand and apply. We have successfully addressed these challenges by eliminating the screening design, introducing automation, and applying a simple mathematical approach for the output parameter. A modified central composite design (CCD) was applied to three ligand binding assays. The intra-plate factors selected were the coating, detection antibody, and streptavidin-HRP concentrations. The inter-plate factors included incubation times for each step. The objective was to maximize the log signal-to-blank ratio (logS/B) of the low standard relative to the blank. The maximum-desirability conditions were determined using JMP 7.0. To verify the validity of the predictions, the logS/B prediction was compared against the observed logS/B during pre-study validation experiments. The three assays were optimized using the multi-factorial DOE. The total error for all three methods was less than 20%, which indicated method robustness. DOE identified interactions in one of the methods. The model predictions for logS/B were within 25% of the observed pre-study validation values for all methods tested. The comparison between the CCD and hybrid screening design yielded comparable parameter estimates. The user-friendly design enables effective application of multi-factorial DOE to optimize ligand binding assays for therapeutic proteins. The approach allows for identification of interactions between factors, consistency in optimal parameter determination, and reduced method development time.
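
    To illustrate the kind of response-surface step the abstract describes (without reproducing the authors' JMP workflow), the sketch below fits a full quadratic model of logS/B over two coded DOE factors and solves for the stationary point; the factor settings and responses are made up for illustration.

      import numpy as np

      # coded factor settings (-1, 0, +1) on a small two-factor grid
      x1, x2 = np.meshgrid([-1.0, 0.0, 1.0], [-1.0, 0.0, 1.0])
      x1, x2 = x1.ravel(), x2.ravel()
      log_sb = np.array([1.1, 1.4, 1.3, 1.5, 1.9, 1.7, 1.2, 1.6, 1.4])  # illustrative responses

      # design matrix for a full quadratic model: 1, x1, x2, x1*x2, x1^2, x2^2
      X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
      beta, *_ = np.linalg.lstsq(X, log_sb, rcond=None)

      # stationary point of the fitted surface: solve grad = 0
      H = np.array([[2 * beta[4], beta[3]], [beta[3], 2 * beta[5]]])
      optimum = np.linalg.solve(H, -beta[1:3])
      print("predicted optimum (coded units):", optimum)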

  3. Experiences Using Formal Methods for Requirements Modeling

    NASA Technical Reports Server (NTRS)

    Easterbrook, Steve; Lutz, Robyn; Covington, Rick; Kelly, John; Ampo, Yoko; Hamilton, David

    1996-01-01

    This paper describes three case studies in the lightweight application of formal methods to requirements modeling for spacecraft fault protection systems. The case studies differ from previously reported applications of formal methods in that formal methods were applied very early in the requirements engineering process, to validate the evolving requirements. The results were fed back into the projects, to improve the informal specifications. For each case study, we describe what methods were applied, how they were applied, how much effort was involved, and what the findings were. In all three cases, the formal modeling provided a cost effective enhancement of the existing verification and validation processes. We conclude that the benefits gained from early modeling of unstable requirements more than outweigh the effort needed to maintain multiple representations.

  4. Harmonics analysis of the ITER poloidal field converter based on a piecewise method

    NASA Astrophysics Data System (ADS)

    Wang, Xudong; Xu, Liuwei; Fu, Peng; Li, Ji; Wu, Yanan

    2017-12-01

    Poloidal field (PF) converters provide controlled DC voltage and current to PF coils. The many harmonics generated by the PF converter flow into the power grid and seriously affect power systems and electric equipment. Due to the complexity of the system, the traditional integral operation in Fourier analysis is complicated and inaccurate. This paper presents a piecewise method to calculate the harmonics of the ITER PF converter. The relationship between the grid input current and the DC output current of the ITER PF converter is deduced. The grid current is decomposed into a sum of simple functions. By calculating the harmonics of each simple function with the piecewise method, the harmonics of the PF converter under different operation modes are obtained. In order to examine the validity of the method, a simulation model is established in Matlab/Simulink and a corresponding experiment is carried out on the ITER PF integration test platform. Comparative results are given: the calculated results are consistent with both simulation and experiment, showing that the piecewise method is correct and valid for calculating the system harmonics.
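
    The general idea of reading harmonic content off a piecewise-defined converter current can be illustrated numerically; the sketch below builds an idealized six-pulse line current (120-degree conduction blocks, an assumption, not the paper's detailed waveform) and recovers the characteristic 6k±1 harmonics with an FFT.

      import numpy as np

      N = 4096
      theta = np.linspace(0.0, 2 * np.pi, N, endpoint=False)
      i_line = np.zeros(N)
      i_line[(theta > np.pi / 6) & (theta < 5 * np.pi / 6)] = 1.0        # positive 120-degree block
      i_line[(theta > 7 * np.pi / 6) & (theta < 11 * np.pi / 6)] = -1.0  # negative 120-degree block

      spectrum = np.abs(np.fft.rfft(i_line)) * 2 / N                     # harmonic amplitudes
      fundamental = spectrum[1]
      for n in (5, 7, 11, 13):
          print(f"harmonic {n}: {spectrum[n] / fundamental:.3f} of fundamental")  # ~1/n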

  5. Assessing the empirical validity of the "take-the-best" heuristic as a model of human probabilistic inference.

    PubMed

    Bröder, A

    2000-09-01

    The boundedly rational "Take-The-Best" heuristic (TTB) was proposed by G. Gigerenzer, U. Hoffrage, and H. Kleinbölting (1991) as a model of fast and frugal probabilistic inferences. Although the simple lexicographic rule proved to be successful in computer simulations, direct empirical demonstrations of its adequacy as a psychological model are lacking because of several methodological problems. In 4 experiments with a total of 210 participants, this question was addressed. Whereas Experiment 1 showed that TTB is not valid as a universal hypothesis about probabilistic inferences, up to 28% of participants in Experiment 2 and 53% of participants in Experiment 3 were classified as TTB users. Experiment 4 revealed that investment costs for information seem to be a relevant factor leading participants to switch to a noncompensatory TTB strategy. The observed individual differences in strategy use imply the recommendation of an idiographic approach to decision-making research.

  6. Tracking wakefulness as it fades: Micro-measures of alertness.

    PubMed

    Jagannathan, Sridhar R; Ezquerro-Nassar, Alejandro; Jachs, Barbara; Pustovaya, Olga V; Bareham, Corinne A; Bekinschtein, Tristan A

    2018-08-01

    A major problem in psychology and physiology experiments is drowsiness: around a third of participants show decreased wakefulness despite being instructed to stay alert. In some non-visual experiments participants keep their eyes closed throughout the task, thus promoting the occurrence of such periods of varying alertness. These wakefulness changes contribute to systematic noise in data and measures of interest. To account for this omnipresent problem in data acquisition, we defined criteria and code to allow researchers to detect and control for varying alertness in electroencephalography (EEG) experiments under eyes-closed settings. We first revise a visual-scoring method developed for the detection and characterization of the sleep-onset process, and adapt it for the detection of alertness levels. Furthermore, we show the major issues preventing the practical use of this method, and overcome these issues by developing an automated method (micro-measures algorithm) based on frequency and sleep graphoelements, which is capable of detecting micro-variations in alertness. The validity of the micro-measures algorithm was verified by training and testing using a dataset where participants were known to fall asleep. In addition, we tested generalisability by independent validation on another dataset. The methods developed constitute a unique tool to assess micro-variations in levels of alertness and to control for them trial-by-trial, retrospectively or prospectively, in every experiment performed with EEG in cognitive neuroscience under eyes-closed settings. Copyright © 2018. Published by Elsevier Inc.
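
    A much-simplified illustration of frequency-based alertness flagging (not the authors' micro-measures algorithm) is to compare theta (4-7 Hz) and alpha (8-12 Hz) power in each eyes-closed epoch; a rising theta/alpha ratio is a common drowsiness marker. The sampling rate and threshold below are assumptions.

      import numpy as np
      from scipy.signal import welch

      fs = 250.0                                   # Hz, assumed sampling rate
      epoch = np.random.randn(int(4 * fs))         # stand-in for one 4-s EEG epoch
      f, psd = welch(epoch, fs=fs, nperseg=int(2 * fs))

      def band_power(lo, hi):
          sel = (f >= lo) & (f <= hi)
          return np.trapz(psd[sel], f[sel])

      ratio = band_power(4, 7) / band_power(8, 12)
      print("theta/alpha =", round(ratio, 2), "-> drowsy" if ratio > 1.0 else "-> alert")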

  7. Nuclear Energy Knowledge and Validation Center (NEKVaC) Needs Workshop Summary Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gougar, Hans

    2015-02-01

    The Department of Energy (DOE) has made significant progress developing simulation tools to predict the behavior of nuclear systems with greater accuracy and in increasing our capability to predict the behavior of these systems outside of the standard range of applications. These analytical tools require a more complex array of validation tests to accurately simulate the physics and multiple length and time scales. Results from modern simulations will allow experiment designers to narrow the range of conditions needed to bound system behavior and to optimize the deployment of instrumentation to limit the breadth and cost of the campaign. Modern validation, verification and uncertainty quantification (VVUQ) techniques enable analysts to extract information from experiments in a systematic manner and provide the users with a quantified uncertainty estimate. Unfortunately, the capability to perform experiments that would enable taking full advantage of the formalisms of these modern codes has progressed relatively little (with some notable exceptions in fuels and thermal-hydraulics); the majority of the experimental data available today is the "historic" data accumulated over the last decades of nuclear systems R&D. A validated code-model is a tool for users. An unvalidated code-model is useful for code developers to gain understanding, publish research results, attract funding, etc. As nuclear analysis codes have become more sophisticated, so have the measurement and validation methods and the challenges that confront them. A successful yet cost-effective validation effort requires expertise possessed only by a few, resources possessed only by the well-capitalized (or a willing collective), and a clear, well-defined objective (validating a code that is developed to satisfy the need(s) of an actual user). To that end, the Idaho National Laboratory established the Nuclear Energy Knowledge and Validation Center to address the challenges of modern code validation and to manage the knowledge from past, current, and future experimental campaigns. By pulling together the best minds involved in code development, experiment design, and validation to establish and disseminate best practices and new techniques, the Nuclear Energy Knowledge and Validation Center (NEKVaC or the ‘Center’) will be a resource for industry, DOE Programs, and academia validation efforts.

  8. Moving to Capture Children’s Attention: Developing a Methodology for Measuring Visuomotor Attention

    PubMed Central

    Coats, Rachel O.; Mushtaq, Faisal; Williams, Justin H. G.; Aucott, Lorna S.; Mon-Williams, Mark

    2016-01-01

    Attention underpins many activities integral to a child’s development. However, methodological limitations currently make large-scale assessment of children’s attentional skill impractical, costly and lacking in ecological validity. Consequently we developed a measure of ‘Visual Motor Attention’ (VMA)—a construct defined as the ability to sustain and adapt visuomotor behaviour in response to task-relevant visual information. In a series of experiments, we evaluated the capability of our method to measure attentional processes and their contributions in guiding visuomotor behaviour. Experiment 1 established the method’s core features (ability to track stimuli moving on a tablet-computer screen with a hand-held stylus) and demonstrated its sensitivity to principled manipulations in adults’ attentional load. Experiment 2 standardised a format suitable for use with children and showed construct validity by capturing developmental changes in executive attention processes. Experiment 3 tested the hypothesis that children with and without coordination difficulties would show qualitatively different response patterns, finding an interaction between the cognitive and motor factors underpinning responses. Experiment 4 identified associations between VMA performance and existing standardised attention assessments and thereby confirmed convergent validity. These results establish a novel approach to measuring childhood attention that can produce meaningful functional assessments that capture how attention operates in an ecologically valid context (i.e. attention's specific contribution to visuomanual action). PMID:27434198

  9. Bridging the Gap Between Validation and Implementation of Non-Animal Veterinary Vaccine Potency Testing Methods.

    PubMed

    Dozier, Samantha; Brown, Jeffrey; Currie, Alistair

    2011-11-29

    In recent years, technologically advanced high-throughput techniques have been developed that replace, reduce or refine animal use in vaccine quality control tests. Following validation, these tests are slowly being accepted for use by international regulatory authorities. Because regulatory acceptance itself has not guaranteed that approved humane methods are adopted by manufacturers, various organizations have sought to foster the preferential use of validated non-animal methods by interfacing with industry and regulatory authorities. After noticing this gap between regulation and uptake by industry, we began developing a paradigm that seeks to narrow the gap and quicken implementation of new replacement, refinement or reduction guidance. A systematic analysis of our experience in promoting the transparent implementation of validated non-animal vaccine potency assays has led to the refinement of our paradigmatic process, presented here, by which interested parties can assess the local regulatory acceptance of methods that reduce animal use and integrate them into quality control testing protocols, or ensure the elimination of peripheral barriers to their use, particularly for potency and other tests carried out on production batches.

  10. Using Online Methods to Develop and Examine the Hong Kong Chinese Translation of the Dissociative Experiences Scale.

    PubMed

    Chan, Chitat; Fung, Hong Wang; Choi, Tat Ming; Ross, Colin A

    2017-01-01

    Identifying dissociation is important for mental health services because it could fundamentally affect one's diagnosis and treatment plan. The Dissociative Experiences Scale (DES) is a widely-used self-report scale for measuring dissociative experiences. It has been translated into many languages and used in many countries. However, there is no validated Hong Kong Chinese version of the DES available in the field, and there is no other validated Hong Kong Chinese instrument for assessing dissociative disorders. This pilot study used online methods to translate the DES to Hong Kong Chinese (HKC-DES). The results indicated that the HKC-DES has excellent internal consistency (α = .953) and very good test-retest reliability (r = .797). Bilingual participants' responses to the DES and HKC-DES indicated high similarity, and were significantly correlated (r = .960). These results initially verified the reliability and cross-language equivalence of the scale. Implications for healthcare practice and research are discussed.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rhinefrank, Kenneth E.; Lenee-Bluhm, Pukha; Prudell, Joseph H.

    The most prudent path to full-scale design, build and deployment of a wave energy conversion (WEC) system involves the establishment of validated numerical models using physical experiments in a methodical scaling program. This project provides additional rounds of wave tank testing at 1:33 scale and ocean/bay testing at 1:7 scale, which are necessary to validate the numerical modeling that is essential to a utility-scale WEC design and associated certification.

  12. Pediatric Quality of Life Enjoyment and Satisfaction Questionnaire (PQ-LES-Q): Reliability and Validity

    ERIC Educational Resources Information Center

    Endicott, Jean; Nee, John; Yang, Ruoyong; Wohlberg, Christopher

    2006-01-01

    Objective: The pediatric version of the Short Form of the Quality of Life Enjoyment and Satisfaction Questionnaire (PQ-LES-Q) was developed to aid in the assessment of an important aspect of life experience in children and adolescents. Method: The reliability and validity of the PQ-LES-Q was tested using data from a sample of 376 outpatient…

  13. Numerical Investigation of the Performance of a Supersonic Combustion Chamber and Comparison with Experiments

    NASA Astrophysics Data System (ADS)

    Banica, M. C.; Chun, J.; Scheuermann, T.; Weigand, B.; Wolfersdorf, J. v.

    2009-01-01

    Scramjet-powered vehicles can decrease costs for access to space, but substantial obstacles still exist in their realization. For example, experiments in the relevant Mach number regime are difficult to perform and flight testing is expensive. Therefore, numerical methods are often employed for system layout, but they require validation against experimental data. Here, we validate the commercial code CFD++ against experimental results for hydrogen combustion in the supersonic combustion facility of the Institute of Aerospace Thermodynamics (ITLR) at the Universität Stuttgart. Fuel is injected through a lobed strut injector, which provides rapid mixing. Our numerical data show reasonable agreement with the experiments. We further investigate the effects of varying equivalence ratios on several important performance parameters.

  14. Numerical Predictions of Wind Turbine Power and Aerodynamic Loads for the NREL Phase II and IV Combined Experiment Rotor

    NASA Technical Reports Server (NTRS)

    Duque, Earl P. N.; Johnson, Wayne; vanDam, C. P.; Chao, David D.; Cortes, Regina; Yee, Karen

    1999-01-01

    Accurate, reliable and robust numerical predictions of wind turbine rotor power remain a challenge to the wind energy industry. The literature reports various methods that compare predictions to experiments. The methods vary from Blade Element Momentum Theory (BEM) and Vortex Lattice (VL) to variants of Reynolds-averaged Navier-Stokes (RaNS). The BEM and VL methods consistently show discrepancies in predicting rotor power at higher wind speeds, mainly due to inadequacies with inboard stall and stall delay models. The RaNS methodologies show promise in predicting blade stall. However, inaccurate rotor vortex wake convection, boundary layer turbulence modeling and grid resolution have limited their accuracy. In addition, the inherently unsteady stalled flow conditions become computationally expensive for even the best-endowed research labs. Although numerical power predictions have been compared to experiment, the availability of good wind turbine data sufficient for code validation remains limited. This paper presents experimental data that has been extracted from the IEA Annex XIV download site for the NREL Combined Experiment phase II and phase IV rotor. In addition, the comparisons will show data that has been further reduced into steady wind and zero yaw conditions suitable for comparisons to "steady wind" rotor power predictions. In summary, the paper will present and discuss the capabilities and limitations of the three numerical methods and make available a database of experimental data suitable to help other numerical methods practitioners validate their own work.

  15. Father for the first time - development and validation of a questionnaire to assess fathers’ experiences of first childbirth (FTFQ)

    PubMed Central

    2012-01-01

    Background: A father’s experience of the birth of his first child is important not only for his birth-giving partner but also for the father himself, his relationship with the mother and the newborn. No validated questionnaire assessing first-time fathers' experiences during childbirth is currently available. Hence, the aim of this study was to develop and validate an instrument to assess first-time fathers’ experiences of childbirth. Method: Domains and items were initially derived from interviews with first-time fathers, and supplemented by a literature search and a focus group interview with midwives. The comprehensibility, comprehension and relevance of the items were evaluated by four paternity research experts and a preliminary questionnaire was pilot tested in eight first-time fathers. A revised questionnaire was completed by 200 first-time fathers (response rate = 81%). Exploratory factor analysis using principal component analysis with varimax rotation was performed and multitrait scaling analysis was used to test scaling assumptions. External validity was assessed by means of known-groups analysis. Results: Factor analysis yielded four factors comprising 22 items and accounting for 48% of the variance. The domains found were Worry, Information, Emotional support and Acceptance. Multitrait analysis confirmed the convergent and discriminant validity of the domains; however, Cronbach’s alpha did not meet conventional reliability standards in two domains. The questionnaire was sensitive to differences between groups of fathers hypothesized to differ on important sociodemographic or clinical variables. Conclusions: The questionnaire adequately measures important dimensions of first-time fathers’ childbirth experience and may be used to assess aspects of fathers’ experiences during childbirth. To obtain the FTFQ and permission for its use, please contact the corresponding author. PMID:22594834

  16. Bioanalytical method development and validation for the determination of glycine in human cerebrospinal fluid by ion-pair reversed-phase liquid chromatography-tandem mass spectrometry.

    PubMed

    Jiang, Jian; James, Christopher A; Wong, Philip

    2016-09-05

    An LC-MS/MS method has been developed and validated for the determination of glycine in human cerebrospinal fluid (CSF). The validated method used artificial cerebrospinal fluid as a surrogate matrix for calibration standards. The calibration curve range for the assay was 100-10,000 ng/mL and (13)C2, (15)N-glycine was used as an internal standard (IS). Pre-validation experiments were performed to demonstrate parallelism with surrogate matrix and standard addition methods. The mean endogenous glycine concentration in a pooled human CSF determined on three days by using artificial CSF as a surrogate matrix and the method of standard addition was found to be 748 ± 30.6 and 768 ± 18.1 ng/mL, respectively. A percentage difference of -2.6% indicated that artificial CSF could be used as a surrogate calibration matrix for the determination of glycine in human CSF. Quality control (QC) samples, except the lower limit of quantitation (LLOQ) QC and low QC samples, were prepared by spiking glycine into aliquots of pooled human CSF sample. The low QC sample was prepared from a separate pooled human CSF sample containing low endogenous glycine concentrations, while the LLOQ QC sample was prepared in artificial CSF. Standard addition was used extensively to evaluate matrix effects during validation. The validated method was used to determine the endogenous glycine concentrations in human CSF samples. Incurred sample reanalysis demonstrated reproducibility of the method. Copyright © 2016 Elsevier B.V. All rights reserved.
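
    The reported agreement between the two calibration approaches follows directly from the quoted means; a one-line check (values taken from the abstract):

      surrogate = 748.0       # ng/mL, artificial-CSF surrogate matrix
      std_addition = 768.0    # ng/mL, method of standard addition
      print(f"{(surrogate - std_addition) / std_addition * 100:.1f}%")   # -2.6%, as reported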

  17. TH-CD-202-06: A Method for Characterizing and Validating Dynamic Lung Density Change During Quiet Respiration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dou, T; Ruan, D; Heinrich, M

    2016-06-15

    Purpose: To obtain a functional relationship that calibrates the lung tissue density change under free breathing conditions by correlating Jacobian values with Hounsfield units. Methods: Free-breathing lung computed tomography images were acquired using a fast helical CT protocol, where 25 scans were acquired per patient. Using a state-of-the-art deformable registration algorithm, a set of deformation vector fields (DVFs) was generated to provide spatial mapping from the reference image geometry to the other free-breathing scans. These DVFs were used to generate Jacobian maps, which estimate voxelwise volume change. Subsequently, the sets of 25 corresponding Jacobian values and voxel intensities in Hounsfield units (HU) were collected and linear regression was performed based on the mass conservation relationship to correlate the volume change to density change. Based on the resulting fitting coefficients, the tissues were classified into parenchymal (Type I), vascular (Type II), and soft tissue (Type III) types. These coefficients modeled the voxelwise density variation during quiet breathing. The accuracy of the proposed method was assessed using the mean absolute difference in HU between the CT scan intensities and the model predicted values. In addition, validation experiments employing a leave-five-out method were performed to evaluate the model accuracy. Results: The computed mean model errors were 23.30±9.54 HU, 29.31±10.67 HU, and 35.56±20.56 HU for regions I, II, and III, respectively. The cross validation experiments averaged over 100 trials had mean errors of 30.02 ± 1.67 HU over the entire lung. These mean values were comparable with the estimated CT image background noise. Conclusion: The reported validation experiment statistics confirmed the validity of the lung density model during free breathing. The proposed technique is general and could be applied to a wide range of problem scenarios where accurate dynamic lung density information is needed. This work was supported in part by NIH R01 CA0096679.
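
    A minimal per-voxel sketch of the regression step under the mass-conservation assumption (density, and hence HU, scales with the reciprocal of the Jacobian volume change): fit HU ~ a*(1/J) + b over the 25 free-breathing scans. The data below are synthetic placeholders, not the study's measurements.

      import numpy as np

      rng = np.random.default_rng(0)
      jacobian = rng.uniform(0.85, 1.15, size=25)                  # stand-in Jacobian values, one per scan
      hu = 250.0 / jacobian - 1050.0 + rng.normal(0.0, 10.0, 25)   # synthetic HU following the 1/J relation

      A = np.column_stack([1.0 / jacobian, np.ones_like(jacobian)])
      (a, b), *_ = np.linalg.lstsq(A, hu, rcond=None)              # voxelwise fitting coefficients
      mae = np.mean(np.abs(A @ np.array([a, b]) - hu))
      print(f"a = {a:.1f}, b = {b:.1f}, mean absolute model error = {mae:.1f} HU")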

  18. Experiences Using Lightweight Formal Methods for Requirements Modeling

    NASA Technical Reports Server (NTRS)

    Easterbrook, Steve; Lutz, Robyn; Covington, Rick; Kelly, John; Ampo, Yoko; Hamilton, David

    1997-01-01

    This paper describes three case studies in the lightweight application of formal methods to requirements modeling for spacecraft fault protection systems. The case studies differ from previously reported applications of formal methods in that formal methods were applied very early in the requirements engineering process, to validate the evolving requirements. The results were fed back into the projects, to improve the informal specifications. For each case study, we describe what methods were applied, how they were applied, how much effort was involved, and what the findings were. In all three cases, formal methods enhanced the existing verification and validation processes, by testing key properties of the evolving requirements, and helping to identify weaknesses. We conclude that the benefits gained from early modeling of unstable requirements more than outweigh the effort needed to maintain multiple representations.

  19. The Development and Validation of the Game User Experience Satisfaction Scale (GUESS).

    PubMed

    Phan, Mikki H; Keebler, Joseph R; Chaparro, Barbara S

    2016-12-01

    The aim of this study was to develop and psychometrically validate a new instrument that comprehensively measures video game satisfaction based on key factors. Playtesting is often conducted in the video game industry to help game developers build better games by providing insight into the players' attitudes and preferences. However, quality feedback is difficult to obtain from playtesting sessions without a quality gaming assessment tool. There is a need for a psychometrically validated and comprehensive gaming scale that is appropriate for playtesting and game evaluation purposes. The process of developing and validating this new scale followed current best practices of scale development and validation. As a result, a mixed-method design that consisted of item pool generation, expert review, questionnaire pilot study, exploratory factor analysis (N = 629), and confirmatory factor analysis (N = 729) was implemented. A new instrument measuring video game satisfaction, called the Game User Experience Satisfaction Scale (GUESS), with nine subscales emerged. The GUESS was demonstrated to have content validity, internal consistency, and convergent and discriminant validity. The GUESS was developed and validated based on the assessments of over 450 unique video game titles across many popular genres. Thus, it can be applied across many types of video games in the industry both as a way to assess what aspects of a game contribute to user satisfaction and as a tool to aid in debriefing users on their gaming experience. The GUESS can be administered to evaluate user satisfaction of different types of video games by a variety of users. © 2016, Human Factors and Ergonomics Society.

  20. Assessing birth experience in fathers as an important aspect of clinical obstetrics: how applicable is Salmon's Item List for men?

    PubMed

    Gawlik, Stephanie; Müller, Mitho; Hoffmann, Lutz; Dienes, Aimée; Reck, Corinna

    2015-01-01

    A validated questionnaire assessment of fathers' experiences during childbirth is lacking in routine clinical practice. Salmon's Item List is a short, validated method used for the assessment of birth experience in mothers in both English- and German-speaking communities. With little to no validated data available for fathers, this pilot study aimed to assess the applicability of the German version of Salmon's Item List, including a multidimensional birth experience concept, in fathers. In this longitudinal study, data were collected by questionnaires at a university hospital in Germany. The birth experiences of 102 fathers were assessed four to six weeks post partum using the German version of Salmon's Item List. Construct validity testing with exploratory factor analysis using principal component analysis with varimax rotation was performed to identify the dimensions of childbirth experiences. Internal consistency was also analysed. Factor analysis yielded a four-factor solution comprising 17 items that accounted for 54.5% of the variance. The main domain was 'fulfilment', and the secondary domains were 'emotional distress', 'physical discomfort' and 'emotional adaption'. For fulfilment, Cronbach's α met conventional reliability standards (0.87). Salmon's Item List is an appropriate instrument to assess birth experience in fathers in terms of fulfilment. Larger samples need to be examined in order to prove the stability of the factor structure before this can be extended to routine clinical assessment. A reduced version of Salmon's Item List may be useful as a screening tool for general assessment. Copyright © 2014 Elsevier Ltd. All rights reserved.

  1. Bridging the Gap Between Validation and Implementation of Non-Animal Veterinary Vaccine Potency Testing Methods

    PubMed Central

    Dozier, Samantha; Brown, Jeffrey; Currie, Alistair

    2011-01-01

    Simple Summary: Many vaccines are tested for quality in experiments that require the use of large numbers of animals in procedures that often cause significant pain and distress. Newer technologies have fostered the development of vaccine quality control tests that reduce or eliminate the use of animals, but the availability of these newer methods has not guaranteed their acceptance by regulators or use by manufacturers. We discuss a strategic approach that has been used to assess and ultimately increase the use of non-animal vaccine quality tests in the U.S. and U.K. Abstract: In recent years, technologically advanced high-throughput techniques have been developed that replace, reduce or refine animal use in vaccine quality control tests. Following validation, these tests are slowly being accepted for use by international regulatory authorities. Because regulatory acceptance itself has not guaranteed that approved humane methods are adopted by manufacturers, various organizations have sought to foster the preferential use of validated non-animal methods by interfacing with industry and regulatory authorities. After noticing this gap between regulation and uptake by industry, we began developing a paradigm that seeks to narrow the gap and quicken implementation of new replacement, refinement or reduction guidance. A systematic analysis of our experience in promoting the transparent implementation of validated non-animal vaccine potency assays has led to the refinement of our paradigmatic process, presented here, by which interested parties can assess the local regulatory acceptance of methods that reduce animal use and integrate them into quality control testing protocols, or ensure the elimination of peripheral barriers to their use, particularly for potency and other tests carried out on production batches. PMID:26486625

  2. Cone beam x-ray luminescence computed tomography: a feasibility study.

    PubMed

    Chen, Dongmei; Zhu, Shouping; Yi, Huangjian; Zhang, Xianghan; Chen, Duofang; Liang, Jimin; Tian, Jie

    2013-03-01

    The appearance of x-ray luminescence computed tomography (XLCT) opens new possibilities for performing molecular imaging with x rays. In the previous XLCT system, the sample was irradiated by a sequence of narrow x-ray beams and the x-ray luminescence was measured by a highly sensitive charge coupled device (CCD) camera. This resulted in a relatively long sampling time and relatively low utilization of the x-ray beam. In this paper, a novel cone beam x-ray luminescence computed tomography strategy is proposed, which can fully utilize the x-ray dose and shorten the scanning time. The imaging model and reconstruction method are described. The validity of the imaging strategy has been studied in this paper. In the cone beam XLCT system, the cone beam x ray was adopted to illuminate the sample and a highly sensitive CCD camera was utilized to acquire luminescent photons emitted from the sample. Photon scattering in biological tissues makes it an ill-posed problem to reconstruct the 3D distribution of the x-ray luminescent sample in the cone beam XLCT. In order to overcome this issue, the authors used the diffusion approximation model to describe the photon propagation in tissues, and employed the sparse regularization method for reconstruction. An incomplete variables truncated conjugate gradient method and permissible region strategy were used for reconstruction. Meanwhile, traditional x-ray CT imaging could also be performed in this system. The x-ray attenuation effect has been considered in their imaging model, which is helpful in improving the reconstruction accuracy. First, simulation experiments with cylinder phantoms were carried out to illustrate the validity of the proposed compensated method. The experimental results showed that the location error of the compensated algorithm was smaller than that of the uncompensated method. The permissible region strategy was applied and reduced the reconstruction error to less than 2 mm. The robustness and stability were then evaluated for different view numbers, different regularization parameters, different measurement noise levels, and optical parameter mismatch. The reconstruction results showed that the settings had a small effect on the reconstruction. A nonhomogeneous phantom simulation was also carried out to simulate a more complex experimental situation and to evaluate the proposed method. Second, the physical cylinder phantom experiments further showed similar results in their prototype XLCT system. These experiments showed that the proposed method is feasible for the general case and for actual experiments. Utilizing numerical simulation and physical experiments, the authors demonstrated the validity of the new cone beam XLCT method. Furthermore, compared with the previous narrow beam XLCT, the cone beam XLCT could more fully utilize the x-ray dose and the scanning time would be shortened greatly. The study of both simulation experiments and physical phantom experiments indicated that the proposed method was feasible for the general case and for actual experiments.
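
    The abstract's sparse-regularization step can be illustrated with a generic solver; the sketch below uses plain iterative soft thresholding (ISTA) on a toy problem, not the authors' incomplete-variables truncated conjugate gradient method or their diffusion-approximation forward model.

      import numpy as np

      def ista(A, b, lam=0.05, n_iter=1000):
          """Minimize 0.5*||A x - b||^2 + lam*||x||_1 by iterative soft thresholding."""
          step = 1.0 / np.linalg.norm(A, 2) ** 2        # 1 / Lipschitz constant of the gradient
          x = np.zeros(A.shape[1])
          for _ in range(n_iter):
              x = x - step * (A.T @ (A @ x - b))                        # gradient step
              x = np.sign(x) * np.maximum(np.abs(x) - step * lam, 0.0)  # soft threshold
          return x

      # toy forward model: 50 measurements, 100 unknown voxel intensities, 3 of them active
      rng = np.random.default_rng(1)
      A = rng.normal(size=(50, 100))
      x_true = np.zeros(100)
      x_true[rng.choice(100, 3, replace=False)] = 1.0
      x_rec = ista(A, A @ x_true)
      print("recovered support:", np.flatnonzero(x_rec > 0.5), "true:", np.flatnonzero(x_true))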

  3. A wall interference assessment/correction system

    NASA Technical Reports Server (NTRS)

    Lo, Ching F.; Ulbrich, N.; Sickles, W. L.; Qian, Cathy X.

    1992-01-01

    A Wall Signature method, the Hackett method, has been selected to be adapted for the 12-ft Wind Tunnel wall interference assessment/correction (WIAC) system in the present phase. This method uses limited measurements of the static pressure at the wall, in conjunction with the solid wall boundary condition, to determine the strength and distribution of singularities representing the test article. The singularities are used in turn for estimating wall interferences at the model location. The Wall Signature method will be formulated for application to the unique geometry of the 12-ft Tunnel. The development and implementation of a working prototype will be completed, delivered and documented with a software manual. The WIAC code will be validated by conducting numerically simulated experiments rather than actual wind tunnel experiments. The simulations will be used to generate both free-air and confined wind-tunnel flow fields for each of the test articles over a range of test configurations. Specifically, the pressure signature at the test section wall will be computed for the tunnel case to provide the simulated 'measured' data. These data will serve as the input for the WIAC method (the Wall Signature method). The performance of the WIAC method then may be evaluated by comparing the corrected parameters with those for the free-air simulation. Each set of wind tunnel/test article numerical simulations provides data to validate the WIAC method. A numerical wind tunnel test simulation is initiated to validate the WIAC methods developed in the project. In the present reporting period, the blockage correction has been developed and implemented for a rectangular tunnel as well as the 12-ft Pressure Tunnel. An improved wall interference assessment and correction method for three-dimensional wind tunnel testing is presented in the appendix.

  4. USING CFD TO ANALYZE NUCLEAR SYSTEMS BEHAVIOR: DEFINING THE VALIDATION REQUIREMENTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Richard Schultz

    2012-09-01

    A recommended protocol to formulate numeric tool specifications and validation needs in concert with practices accepted by regulatory agencies for advanced reactors is described. The protocol is based on the plant type and perceived transient and accident envelopes that translate to boundary conditions for a process that gives: (a) the key phenomena and figures-of-merit which must be analyzed to ensure that the advanced plant can be licensed, (b) the specification of the numeric tool capabilities necessary to perform the required analyses, including bounding calculational uncertainties, and (c) the specification of the validation matrices and experiments, including the desired validation data. The result of applying the process enables a complete program to be defined, including costs, for creating and benchmarking transient and accident analysis methods for advanced reactors. By following a process that is in concert with regulatory agency licensing requirements from start to finish, based on historical acceptance of past licensing submittals, the methods derived and validated have a high probability of regulatory agency acceptance.

  5. The Sensed Presence Questionnaire (SenPQ): initial psychometric validation of a measure of the “Sensed Presence” experience

    PubMed Central

    Bell, Vaughan

    2017-01-01

    Background The experience of ‘sensed presence’—a feeling or sense that another entity, individual or being is present despite no clear sensory or perceptual evidence—is known to occur in the general population, appears more frequently in religious or spiritual contexts, and seems to be prominent in certain psychiatric or neurological conditions and may reflect specific functions of social cognition or body-image representation systems in the brain. Previous research has relied on ad-hoc measures of the experience and no specific psychometric scale to measure the experience exists to date. Methods Based on phenomenological description in the literature, we created the 16-item Sensed Presence Questionnaire (SenPQ). We recruited participants from (i) a general population sample, and; (ii) a sample including specific selection for religious affiliation, to complete the SenPQ and additional measures of well-being, schizotypy, social anxiety, social imagery, and spiritual experience. We completed an analysis to test internal reliability, the ability of the SenPQ to distinguish between religious and non-religious participants, and whether the SenPQ was specifically related to positive schizotypical experiences and social imagery. A factor analysis was also conducted to examine underlying latent variables. Results The SenPQ was found to be reliable and valid, with religious participants significantly endorsing more items than non-religious participants, and the scale showing a selective relationship with construct relevant measures. Principal components analysis indicates two potential underlying factors interpreted as reflecting ‘benign’ and ‘malign’ sensed presence experiences. Discussion The SenPQ appears to be a reliable and valid measure of sensed presence experience although further validation in neurological and psychiatric conditions is warranted. PMID:28367379

  6. Evaluating the Social Validity of the Early Start Denver Model: A Convergent Mixed Methods Study.

    PubMed

    Ogilvie, Emily; McCrudden, Matthew T

    2017-09-01

    An intervention has social validity to the extent that it is socially acceptable to participants and stakeholders. This pilot convergent mixed methods study evaluated parents' perceptions of the social validity of the Early Start Denver Model (ESDM), a naturalistic behavioral intervention for children with autism. It focused on whether the parents viewed (a) the ESDM goals as appropriate for their children, (b) the intervention procedures as acceptable and appropriate, and (c) the changes in their children's behavior as practically significant. Parents of four children who participated in the ESDM completed the TARF-R questionnaire and participated in a semi-structured interview. Both data sets indicated that parents rated their experiences with the ESDM positively and rated it as socially valid. The findings indicated that what was implemented in the intervention is complemented by how it was implemented and by whom.

  7. Practical issues relating to soil column chromatography for sorption parameter determination.

    PubMed

    Bi, Erping; Schmidt, Torsten C; Haderlein, Stefan B

    2010-08-01

    A dynamic soil column chromatography (SCC) method for the determination of sorption distribution coefficients (K(d)) of organic compounds was developed and validated. Eurosoil 4, quartz, and alumina were chosen as exemplary packing materials. Heterocyclic aromatic compounds were selected as probe compounds for the validation of SCC. The prerequisites of SCC with regard to column dimension, packing procedure, and sample injection volume are discussed. Reproducible soil column packing was achieved by addition of a pre-column and an HPLC pump for subsequent compression of the packed material. Various methods to determine retention times from breakthrough curves are discussed and the use of the half-mass method is recommended. Diluting the soil with inert material can prevent column clogging and helps to complete experiments in a reasonable period of time. For the chosen probe compounds, quartz rather than alumina proved a suitable dilution material. Non-equilibrium issues can be overcome by conducting the experiments at different flow rates and/or performing numerical simulation. Copyright 2010 Elsevier Ltd. All rights reserved.
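
    A small worked sketch of how a half-mass retention time translates into a distribution coefficient, using the standard retardation relation R = 1 + (rho_b/theta)*K(d); the breakthrough curve and column properties below are placeholders, not the study's data.

      import numpy as np

      t = np.linspace(0.0, 600.0, 601)                    # min
      c = np.exp(-0.5 * ((t - 240.0) / 40.0) ** 2)        # stand-in breakthrough curve (pulse input)
      cum = np.cumsum(c)
      cum /= cum[-1]
      t_half = np.interp(0.5, cum, t)                     # half-mass retention time of the solute

      t0 = 60.0      # retention time of a conservative (non-sorbing) tracer, min
      theta = 0.40   # volumetric water content
      rho_b = 1.5    # bulk density, g/cm^3
      R = t_half / t0                                     # retardation factor
      K_d = (R - 1.0) * theta / rho_b                     # mL/g
      print(f"R = {R:.2f}, K_d = {K_d:.2f} mL/g")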

  8. Using meta-differential evolution to enhance a calculation of a continuous blood glucose level.

    PubMed

    Koutny, Tomas

    2016-09-01

    We developed a new model of glucose dynamics. The model calculates blood glucose level as a function of transcapillary glucose transport. In previous studies, we validated the model with animal experiments. We used an analytical method to determine model parameters. In this study, we validate the model with subjects with type 1 diabetes. In addition, we combine the analytic method with meta-differential evolution. To validate the model with human patients, we obtained a data set from a type 1 diabetes study coordinated by the Jaeb Center for Health Research. We calculated a continuous blood glucose level from the continuously measured interstitial fluid glucose level. We used 6 different scenarios to ensure robust validation of the calculation. Over 96% of calculated blood glucose levels fit the A+B zones of the Clarke Error Grid. No data set required any correction of model parameters during the time course of the measurements. We successfully verified the possibility of calculating a continuous blood glucose level of subjects with type 1 diabetes. This study signals a successful transition of our research from an animal experiment to a human patient. Researchers can test our model with their data on-line at https://diabetes.zcu.cz. Copyright © 2016 The Author. Published by Elsevier Ireland Ltd. All rights reserved.
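
    The parameter-fitting idea can be sketched with the standard differential evolution optimizer in SciPy (not the paper's meta-differential evolution, and with a deliberately toy glucose model): search the parameter bounds for the values that minimize the disagreement between the calculated and reference blood glucose curves.

      import numpy as np
      from scipy.optimize import differential_evolution

      t = np.linspace(0.0, 180.0, 37)                     # minutes, hypothetical sampling grid
      bg_reference = 5.5 + 2.0 * np.exp(-t / 60.0)        # stand-in reference blood glucose (mmol/L)

      def model_bg(params, t):
          """Toy two-parameter model: baseline plus a decaying excursion."""
          baseline, tau = params
          return baseline + 2.0 * np.exp(-t / tau)

      def cost(params):
          return np.mean((model_bg(params, t) - bg_reference) ** 2)

      result = differential_evolution(cost, bounds=[(3.0, 8.0), (10.0, 120.0)], seed=0)
      print("fitted parameters (baseline, tau):", result.x)   # should approach (5.5, 60)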

  9. Quantum-state anomaly detection for arbitrary errors using a machine-learning technique

    NASA Astrophysics Data System (ADS)

    Hara, Satoshi; Ono, Takafumi; Okamoto, Ryo; Washio, Takashi; Takeuchi, Shigeki

    2016-10-01

    The accurate detection of small deviations in given density matrices is important for quantum information processing, but it is a difficult task because of the intrinsic fluctuation in density matrices reconstructed using a limited number of experiments. We previously proposed a method for decoherence error detection using a machine-learning technique [S. Hara, T. Ono, R. Okamoto, T. Washio, and S. Takeuchi, Phys. Rev. A 89, 022104 (2014), 10.1103/PhysRevA.89.022104]. However, the previous method is not valid when the errors are just changes in phase. Here, we propose a method that is valid for arbitrary errors in density matrices. The performance of the proposed method is verified using both numerical simulation data and real experimental data.

  10. The Outpatient Experience Questionnaire of comprehensive public hospital in China: development, validity and reliability.

    PubMed

    Hu, Yinhuan; Zhang, Zixia; Xie, Jinzhu; Wang, Guanping

    2017-02-01

    The objective of this study is to describe the development of the Outpatient Experience Questionnaire (OPEQ) and to assess the validity and reliability of the scale. The study design comprised a literature review, patient interviews, the Delphi method and a cross-sectional validation survey, carried out in six comprehensive public hospitals in China. The survey was carried out on a sample of 600 outpatients. Acceptability of the questionnaire was assessed according to the overall response rate, item non-response rate and the average completion time. Correlation coefficients and confirmatory factor analysis were used to test construct validity. The Delphi method was used to assess the content validity of the questionnaire. Cronbach's coefficient alpha and the split-half reliability coefficient were used to estimate the internal reliability of the questionnaire. The overall response rate was 97.2% and the item non-response rate ranged from 0% to 0.3%. The mean completion time was 6 min. The Spearman correlations of item-total score ranged from 0.466 to 0.765. The results of confirmatory factor analysis showed that all items had factor loadings above 0.40 and the dimension intercorrelations ranged from 0.449 to 0.773; the goodness of fit of the questionnaire was reasonable. The overall authority grade of expert consultation was 0.80 and Kendall's coefficient of concordance W was 0.186. The Cronbach's alpha coefficients of the six dimensions ranged from 0.708 to 0.895, and the split-half reliability coefficient (Spearman-Brown coefficient) was 0.969. The OPEQ is a promising instrument covering the most important aspects which influence outpatient experiences in comprehensive public hospitals in China. It has good evidence for acceptability, validity and reliability. © The Author 2016. Published by Oxford University Press in association with the International Society for Quality in Health Care. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com

  11. A New Method to Cross Calibrate and Validate TOMS, SBUV/2, and SCIAMACHY Measurements

    NASA Technical Reports Server (NTRS)

    Ahmad, Ziauddin; Hilsenrath, Ernest; Einaudi, Franco (Technical Monitor)

    2001-01-01

    A unique method to validate back scattered ultraviolet (buv) type satellite data that complements the measurements from existing ground networks is proposed. The method involves comparing the zenith sky radiance measurements from the ground to the nadir radiance measurements taken from space. Since the measurements are compared directly, the proposed method is superior to any other method that involves comparing derived products (for example, ozone), because comparison of derived products involves inversion algorithms which are susceptible to several types of errors. Forward radiative transfer (RT) calculations show that for an aerosol-free atmosphere, the ground-based zenith sky radiance measurement and the satellite nadir radiance measurements can be predicted with an accuracy of better than 1 percent. The RT computations also show that for certain values of the solar zenith angles, the radiance comparisons could be better than half a percent. This accuracy is practically independent of ozone amount and aerosols in the atmosphere. Experiences with the Shuttle Solar Backscatter Ultraviolet (SSBUV) program show that the accuracy of the ground-based zenith sky radiance measuring instrument can be maintained at a level of a few tenths of a percent. This implies that the zenith sky radiance measurements can be used to validate Total Ozone Mapping Spectrometer (TOMS), Solar Backscatter Ultraviolet (SBUV/2), and the SCanning Imaging Absorption SpectroMeter for Atmospheric CHartographY (SCIAMACHY) radiance data. Also, this method will help improve the long term precision of the measurements for better trend detection and the accuracy of other BUV products such as tropospheric ozone and aerosols. Finally, in the long term, this method is a good candidate to inter-calibrate and validate long term observations of upcoming operational instruments such as the Global Ozone Monitoring Experiment (GOME-2), Ozone Mapping Instrument (OMI), Ozone Dynamics Ultraviolet Spectrometer (ODUS), and Ozone Mapping and Profiler Suite (OMPS).

  12. MRPrimer: a MapReduce-based method for the thorough design of valid and ranked primers for PCR

    PubMed Central

    Kim, Hyerin; Kang, NaNa; Chon, Kang-Wook; Kim, Seonho; Lee, NaHye; Koo, JaeHyung; Kim, Min-Soo

    2015-01-01

    Primer design is a fundamental technique that is widely used for polymerase chain reaction (PCR). Although many methods have been proposed for primer design, they require a great deal of manual effort to generate feasible and valid primers, including homology tests on off-target sequences using BLAST-like tools. That approach is inconvenient for the many target sequences of quantitative PCR (qPCR), which must all satisfy the same stringent and allele-invariant constraints. To address this issue, we propose an entirely new method called MRPrimer that can design all feasible and valid primer pairs existing in a DNA database at once, while simultaneously checking a multitude of filtering constraints and validating primer specificity. Furthermore, MRPrimer suggests the best primer pair for each target sequence, based on a ranking method. Through qPCR analysis using 343 primer pairs and the corresponding sequencing and comparative analyses, we showed that the primer pairs designed by MRPrimer are very stable and effective for qPCR. In addition, MRPrimer is computationally efficient and scalable and therefore useful for quickly constructing an entire collection of feasible and valid primers for frequently updated databases like RefSeq. Furthermore, we suggest that MRPrimer can be utilized conveniently for experiments requiring primer design, especially real-time qPCR. PMID:26109350
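
    The flavor of the per-candidate filtering constraints can be shown with a toy single-primer filter (length, GC content, Wallace-rule melting temperature); the thresholds and helper names here are illustrative only and are not MRPrimer's actual parameters or code.

      def gc_content(seq):
          return (seq.count("G") + seq.count("C")) / len(seq)

      def melting_temp(seq):
          """Wallace rule for short oligos: 2*(A+T) + 4*(G+C)."""
          at = seq.count("A") + seq.count("T")
          gc = seq.count("G") + seq.count("C")
          return 2 * at + 4 * gc

      def passes_filters(seq, len_range=(19, 23), gc_range=(0.4, 0.6), tm_range=(55, 65)):
          return (len_range[0] <= len(seq) <= len_range[1]
                  and gc_range[0] <= gc_content(seq) <= gc_range[1]
                  and tm_range[0] <= melting_temp(seq) <= tm_range[1])

      candidates = ["ATGCGTACGTTAGCCTAGCGA", "ATATATATATATATATATATA"]
      print([seq for seq in candidates if passes_filters(seq)])   # only the first passes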

  13. Development, Testing, and Validation of a Model-Based Tool to Predict Operator Responses in Unexpected Workload Transitions

    NASA Technical Reports Server (NTRS)

    Sebok, Angelia; Wickens, Christopher; Sargent, Robert

    2015-01-01

    One human factors challenge is predicting operator performance in novel situations. Approaches such as drawing on relevant previous experience, and developing computational models to predict operator performance in complex situations, offer potential methods to address this challenge. A few concerns with modeling operator performance are that models need to be realistic, and they need to be tested empirically and validated. In addition, many existing human performance modeling tools are complex and require that an analyst gain significant experience to be able to develop models for meaningful data collection. This paper describes an effort to address these challenges by developing an easy-to-use model-based tool, using models that were developed from a review of existing human performance literature and targeted experimental studies, and performing an empirical validation of key model predictions.

  14. Dynamic Assessment: One Approach and Some Initial Data. Technical Report No. 361.

    ERIC Educational Resources Information Center

    Campione, Joseph C.; Brown, Ann L.

    In an effort to validate dynamic assessment methods influenced by Vygotsky's (1978) definition of zones of proximal development (an indicator of readiness), three sets of experiments addressed two goals: the development of diagnostic assessment methods and the use of diagnostic results to guide the design of instructional programs. The first two…

  15. ICCS/ESCCA consensus guidelines to detect GPI-deficient cells in paroxysmal nocturnal hemoglobinuria (PNH) and related disorders part 4 - assay validation and quality assurance.

    PubMed

    Oldaker, Teri; Whitby, Liam; Saber, Maryam; Holden, Jeannine; Wallace, Paul K; Litwin, Virginia

    2018-01-01

    Over the past six years, a diverse group of stakeholders have put forth recommendations regarding the analytical validation of flow cytometric methods and described in detail the differences between cell-based and traditional soluble analyte assay validations. This manuscript is based on these general recommendations as well as the published experience of experts in the area of PNH testing. The goal is to provide practical assay-specific guidelines for the validation of high-sensitivity flow cytometric PNH assays. Examples of the reports and validation data described herein are provided in Supporting Information. © 2017 International Clinical Cytometry Society.

  16. Experimental validation of boundary element methods for noise prediction

    NASA Technical Reports Server (NTRS)

    Seybert, A. F.; Oswald, Fred B.

    1992-01-01

    Experimental validation of methods to predict radiated noise is presented. A combined finite element and boundary element model was used to predict the vibration and noise of a rectangular box excited by a mechanical shaker. The predicted noise was compared to sound power measured by the acoustic intensity method. Inaccuracies in the finite element model shifted the resonance frequencies by about 5 percent. The predicted and measured sound power levels agree within about 2.5 dB. In a second experiment, measured vibration data was used with a boundary element model to predict noise radiation from the top of an operating gearbox. The predicted and measured sound power for the gearbox agree within about 3 dB.

  17. Hybrid Particle-Element Simulation of Impact on Composite Orbital Debris Shields

    NASA Technical Reports Server (NTRS)

    Fahrenthold, Eric P.

    2004-01-01

    This report describes the development of new numerical methods and new constitutive models for the simulation of hypervelocity impact effects on spacecraft. The research has included parallel implementation of the numerical methods and material models developed under the project. Validation work has included both one dimensional simulations, for comparison with exact solutions, and three dimensional simulations of published hypervelocity impact experiments. The validated formulations have been applied to simulate impact effects in a velocity and kinetic energy regime outside the capabilities of current experimental methods. The research results presented here allow for the expanded use of numerical simulation, as a complement to experimental work, in future design of spacecraft for hypervelocity impact effects.

  18. Guidelines for the design and statistical analysis of experiments in papers submitted to ATLA.

    PubMed

    Festing, M F

    2001-01-01

    In vitro experiments need to be well designed and correctly analysed if they are to achieve their full potential to replace the use of animals in research. An "experiment" is a procedure for collecting scientific data in order to answer a hypothesis, or to provide material for generating new hypotheses, and differs from a survey because the scientist has control over the treatments that can be applied. Most experiments can be classified into one of a few formal designs, the most common being completely randomised, and randomised block designs. These are quite common with in vitro experiments, which are often replicated in time. Some experiments involve a single independent (treatment) variable, while other "factorial" designs simultaneously vary two or more independent variables, such as drug treatment and cell line. Factorial designs often provide additional information at little extra cost. Experiments need to be carefully planned to avoid bias, be powerful yet simple, provide for a valid statistical analysis and, in some cases, have a wide range of applicability. Virtually all experiments need some sort of statistical analysis in order to take account of biological variation among the experimental subjects. Parametric methods using the t test or analysis of variance are usually more powerful than non-parametric methods, provided the underlying assumptions of normality of the residuals and equal variances are approximately valid. The statistical analyses of data from a completely randomised design, and from a randomised-block design are demonstrated in Appendices 1 and 2, and methods of determining sample size are discussed in Appendix 3. Appendix 4 gives a checklist for authors submitting papers to ATLA.
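
    As a minimal illustration of the parametric analyses recommended above (not code from the paper), the Python sketch below runs a one-way ANOVA for a completely randomised design with three treatment groups, and a paired t-test as the simplest analysis of a two-treatment randomised block design; all response values are invented.

    ```python
    # Minimal sketch with illustrative data: parametric analysis of a completely
    # randomised in vitro experiment (three treatment groups), followed by a paired
    # t-test for a two-treatment randomised block design (blocks = independent runs).
    from scipy import stats

    control   = [0.92, 1.01, 0.88, 0.95, 1.05]
    low_dose  = [0.81, 0.85, 0.79, 0.88, 0.83]
    high_dose = [0.60, 0.66, 0.58, 0.71, 0.63]

    # One-way ANOVA assumes approximately normal residuals and equal variances.
    f_stat, p_anova = stats.f_oneway(control, low_dose, high_dose)
    print(f"ANOVA: F = {f_stat:.2f}, p = {p_anova:.4f}")

    # Randomised block with two treatments: each run (block) contributes one pair,
    # so a paired t-test removes between-run variation.
    treated_by_run = [0.78, 0.83, 0.75, 0.86]
    control_by_run = [0.95, 1.02, 0.90, 1.04]
    t_stat, p_block = stats.ttest_rel(treated_by_run, control_by_run)
    print(f"Paired t-test: t = {t_stat:.2f}, p = {p_block:.4f}")
    ```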

  19. Validation of quantitative method for azoxystrobin residues in green beans and peas.

    PubMed

    Abdelraheem, Ehab M H; Hassan, Sayed M; Arief, Mohamed M H; Mohammad, Somaia G

    2015-09-01

    This study presents a method validation for the extraction and quantitative analysis of azoxystrobin residues in green beans and peas using HPLC-UV, with the results confirmed by GC-MS. The employed method involved initial extraction with acetonitrile after the addition of salts (magnesium sulfate and sodium chloride), followed by a cleanup step with activated neutral carbon. The validation parameters linearity, matrix effect, LOQ, specificity, trueness and repeatability precision were attained. The spiking levels for the trueness and precision experiments were 0.1, 0.5 and 3 mg/kg. For HPLC-UV analysis, mean recoveries ranged from 83.69% to 91.58% and from 81.99% to 107.85% for green beans and peas, respectively. For GC-MS analysis, mean recoveries ranged from 76.29% to 94.56% and from 80.77% to 100.91% for green beans and peas, respectively. According to these results, the method has been proven to be efficient for the extraction and determination of azoxystrobin residues in green beans and peas. Copyright © 2015 Elsevier Ltd. All rights reserved.
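
    The following Python sketch illustrates the spike-recovery arithmetic underlying trueness and repeatability figures like those reported above; the spiking level and replicate values are invented and are not the azoxystrobin data.

    ```python
    # Illustrative sketch of spike-recovery arithmetic: mean recovery (%) and
    # repeatability (RSD, %) for one spiking level. Numbers are made up.
    import statistics

    spike_level_mg_per_kg = 0.5
    measured_mg_per_kg = [0.44, 0.46, 0.43, 0.47, 0.45, 0.44]   # replicate determinations

    recoveries = [100.0 * m / spike_level_mg_per_kg for m in measured_mg_per_kg]
    mean_recovery = statistics.mean(recoveries)
    rsd_percent = 100.0 * statistics.stdev(recoveries) / mean_recovery

    print(f"Mean recovery: {mean_recovery:.1f}%  (commonly accepted range: 70-120%)")
    print(f"Repeatability RSD: {rsd_percent:.1f}%  (commonly required to be <= 20%)")
    ```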

  20. Development and examination of the psychometric properties of the Learning Experience Scale in nursing.

    PubMed

    Takase, Miyuki; Imai, Takiko; Uemura, Chizuru

    2016-06-01

    This paper examines the psychometric properties of the Learning Experience Scale. A survey method was used to collect data from a total of 502 nurses. Data were analyzed by factor analysis and the known-groups technique to examine the construct validity of the scale. In addition, internal consistency was evaluated by Cronbach's alpha, and stability was examined by test-retest correlation. Factor analysis showed that the Learning Experience Scale consisted of five factors: learning from practice, others, training, feedback, and reflection. The scale also had the power to discriminate between nurses with high and low levels of nursing competence. The internal consistency and the stability of the scale were also acceptable. The Learning Experience Scale is a valid and reliable instrument, and helps organizations to effectively design learning interventions for nurses. © 2015 Wiley Publishing Asia Pty Ltd.

  1. Utilization of Low Gravity Environment for Measuring Liquid Viscosity

    NASA Technical Reports Server (NTRS)

    Antar, Basil N.; Ethridge, Edwin

    1998-01-01

    The method of drop coalescence is used for determining the viscosity of highly viscous undercooled liquids. A low gravity environment is necessary to allow examination of large volumes, affording much higher accuracy for the viscosity calculations than is possible with the smaller volumes available under 1-g conditions. The drop coalescence method is preferred over the drop oscillation technique, since the latter can only be applied to liquids with vanishingly small viscosities. The technique developed relies both on a highly accurate solution of the Navier-Stokes equations, produced using the boundary element method, and on data from experiments conducted in a near zero gravity environment. Results are presented for method validation experiments recently performed on board the NASA/KC-135 aircraft. In these tests the viscosity of a highly viscous liquid, glycerine at room temperature, was determined using the liquid coalescence method. The results from these experiments will be discussed.

  2. Non-Invasive Measurement of Adrenocortical Activity in Blue-Fronted Parrots (Amazona aestiva, Linnaeus, 1758)

    PubMed Central

    Ferreira, João C. P.; Fujihara, Caroline J.; Fruhvald, Erika; Trevisol, Eduardo; Destro, Flavia C.; Teixeira, Carlos R.; Pantoja, José C. F.; Schmidt, Elizabeth M. S.; Palme, Rupert

    2015-01-01

    Parrots kept in zoos and private households often develop psychological and behavioural disorders. Despite knowing that such disorders have a multifactorial aetiology and that chronic stress is involved, little is known about their development mainly due to a poor understanding of the parrots’ physiology and the lack of validated methods to measure stress in these species. In birds, blood corticosterone concentrations provide information about adrenocortical activity. However, blood sampling techniques are difficult, highly invasive and inappropriate to investigate stressful situations and welfare conditions. Thus, a non-invasive method to measure steroid hormones is critically needed. Aiming to perform a physiological validation of a cortisone enzyme immunoassay (EIA) to measure glucocorticoid metabolites (GCM) in droppings of 24 Blue-fronted parrots (Amazona aestiva), two experiments were designed. During the experiments all droppings were collected at 3-h intervals. Initially, birds were sampled for 24 h (experiment 1) and one week later assigned to four different treatments (experiment 2): Control (undisturbed), Saline (0.2 mL of 0.9% NaCl IM), Dexamethasone (1 mg/kg IM) and Adrenocorticotropic hormone (ACTH; 25 IU IM). Treatments (always one week apart) were applied to all animals in a cross-over study design. A daily rhythm pattern in GCM excretion was detected but there were no sex differences (first experiment). Saline and dexamethasone treatments had no effect on GCM (not different from control concentrations). Following ACTH injection, GCM concentration increased about 13.1-fold (median) at the peak (after 3–9 h), and then dropped to pre-treatment concentrations. By a successful physiological validation, we demonstrated the suitability of the cortisone EIA to non-invasively monitor increased adrenocortical activity, and thus, stress in the Blue-fronted parrot. This method opens up new perspectives for investigating the connection between behavioural disorders and stress in this bird species, and could also help in their captive management. PMID:26717147

  3. Non-Invasive Measurement of Adrenocortical Activity in Blue-Fronted Parrots (Amazona aestiva, Linnaeus, 1758).

    PubMed

    Ferreira, João C P; Fujihara, Caroline J; Fruhvald, Erika; Trevisol, Eduardo; Destro, Flavia C; Teixeira, Carlos R; Pantoja, José C F; Schmidt, Elizabeth M S; Palme, Rupert

    2015-01-01

    Parrots kept in zoos and private households often develop psychological and behavioural disorders. Despite knowing that such disorders have a multifactorial aetiology and that chronic stress is involved, little is known about their development mainly due to a poor understanding of the parrots' physiology and the lack of validated methods to measure stress in these species. In birds, blood corticosterone concentrations provide information about adrenocortical activity. However, blood sampling techniques are difficult, highly invasive and inappropriate to investigate stressful situations and welfare conditions. Thus, a non-invasive method to measure steroid hormones is critically needed. Aiming to perform a physiological validation of a cortisone enzyme immunoassay (EIA) to measure glucocorticoid metabolites (GCM) in droppings of 24 Blue-fronted parrots (Amazona aestiva), two experiments were designed. During the experiments all droppings were collected at 3-h intervals. Initially, birds were sampled for 24 h (experiment 1) and one week later assigned to four different treatments (experiment 2): Control (undisturbed), Saline (0.2 mL of 0.9% NaCl IM), Dexamethasone (1 mg/kg IM) and Adrenocorticotropic hormone (ACTH; 25 IU IM). Treatments (always one week apart) were applied to all animals in a cross-over study design. A daily rhythm pattern in GCM excretion was detected but there were no sex differences (first experiment). Saline and dexamethasone treatments had no effect on GCM (not different from control concentrations). Following ACTH injection, GCM concentration increased about 13.1-fold (median) at the peak (after 3-9 h), and then dropped to pre-treatment concentrations. By a successful physiological validation, we demonstrated the suitability of the cortisone EIA to non-invasively monitor increased adrenocortical activity, and thus, stress in the Blue-fronted parrot. This method opens up new perspectives for investigating the connection between behavioural disorders and stress in this bird species, and could also help in their captive management.

  4. An Approach for Validating Actinide and Fission Product Burnup Credit Criticality Safety Analyses--Criticality (keff) Predictions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scaglione, John M; Mueller, Don; Wagner, John C

    2011-01-01

    One of the most significant remaining challenges associated with expanded implementation of burnup credit in the United States is the validation of depletion and criticality calculations used in the safety evaluation - in particular, the availability and use of applicable measured data to support validation, especially for fission products. Applicants and regulatory reviewers have been constrained by both a scarcity of data and a lack of clear technical basis or approach for use of the data. U.S. Nuclear Regulatory Commission (NRC) staff have noted that the rationale for restricting their Interim Staff Guidance on burnup credit (ISG-8) to actinide-only is based largely on the lack of clear, definitive experiments that can be used to estimate the bias and uncertainty for computational analyses associated with using burnup credit. To address the issue of validation, the NRC initiated a project with the Oak Ridge National Laboratory to (1) develop and establish a technically sound validation approach (both depletion and criticality) for commercial spent nuclear fuel (SNF) criticality safety evaluations based on best-available data and methods and (2) apply the approach for representative SNF storage and transport configurations/conditions to demonstrate its usage and applicability, as well as to provide reference bias results. The purpose of this paper is to describe the criticality (keff) validation approach, and resulting observations and recommendations. Validation of the isotopic composition (depletion) calculations is addressed in a companion paper at this conference. For criticality validation, the approach is to utilize (1) available laboratory critical experiment (LCE) data from the International Handbook of Evaluated Criticality Safety Benchmark Experiments and the French Haut Taux de Combustion (HTC) program to support validation of the principal actinides and (2) calculated sensitivities, nuclear data uncertainties, and the limited available fission product LCE data to predict and verify individual biases for relevant minor actinides and fission products. This paper (1) provides a detailed description of the approach and its technical bases, (2) describes the application of the approach for representative pressurized water reactor and boiling water reactor safety analysis models to demonstrate its usage and applicability, (3) provides reference bias results based on the prerelease SCALE 6.1 code package and ENDF/B-VII nuclear cross-section data, and (4) provides recommendations for application of the results and methods to other code and data packages.

  5. The Question-Driven Laboratory Exercise: A New Pedagogy Applied to a Green Modification of Grignard Reagent Formation and Reaction

    ERIC Educational Resources Information Center

    Teixeira, Jennifer M.; Byers, Jessie Nedrow; Perez, Marilu G.; Holman, R. W.

    2010-01-01

    Experimental exercises within second-year-level organic laboratory manuals typically involve a statement of a principle that is then validated by student generation of data in a single experiment. These experiments are structured in the exact opposite order of the scientific method, in which data interpretation, typically from multiple related…

  6. fMRI capture of auditory hallucinations: Validation of the two-steps method.

    PubMed

    Leroy, Arnaud; Foucher, Jack R; Pins, Delphine; Delmaire, Christine; Thomas, Pierre; Roser, Mathilde M; Lefebvre, Stéphanie; Amad, Ali; Fovet, Thomas; Jaafari, Nemat; Jardri, Renaud

    2017-10-01

    Our purpose was to validate a reliable method to capture brain activity concomitant with hallucinatory events, which constitute frequent and disabling experiences in schizophrenia. Capturing hallucinations using functional magnetic resonance imaging (fMRI) remains very challenging. We previously developed a method based on a two-steps strategy including (1) multivariate data-driven analysis of per-hallucinatory fMRI recording and (2) selection of the components of interest based on a post-fMRI interview. However, two tests still need to be conducted to rule out critical pitfalls of conventional fMRI capture methods before this two-steps strategy can be adopted in hallucination research: replication of these findings on an independent sample and assessment of the reliability of the hallucination-related patterns at the subject level. To do so, we recruited a sample of 45 schizophrenia patients suffering from frequent hallucinations, 20 schizophrenia patients without hallucinations and 20 matched healthy volunteers; all participants underwent four different experiments. The main findings are (1) high accuracy in reporting unexpected sensory stimuli in an MRI setting; (2) good detection concordance between hypothesis-driven and data-driven analysis methods (as used in the two-steps strategy) when controlled unexpected sensory stimuli are presented; (3) good agreement of the two-steps method with the online button-press approach to capture hallucinatory events; (4) high spatial consistency of hallucinatory-related networks detected using the two-steps method on two independent samples. By validating the two-steps method, we advance toward the possible transfer of such technology to new image-based therapies for hallucinations. Hum Brain Mapp 38:4966-4979, 2017. © 2017 Wiley Periodicals, Inc.

  7. Experiences in integrating auto-translated state-chart designs for model checking

    NASA Technical Reports Server (NTRS)

    Pingree, P. J.; Benowitz, E. G.

    2003-01-01

    In the complex environment of JPL's flight missions with increasing dependency on advanced software designs, traditional software validation methods of simulation and testing are being stretched to adequately cover the needs of software development.

  8. Elaboration and validation of the method for the quantification of the emetic toxin of Bacillus cereus as described in EN-ISO 18465 - Microbiology of the food chain - Quantitative determination of emetic toxin (cereulide) using LC-MS/MS.

    PubMed

    In 't Veld, P H; van der Laak, L F J; van Zon, M; Biesta-Peters, E G

    2018-04-12

    A method for the quantification of the Bacillus cereus emetic toxin (cereulide) was developed and validated. The method principle is based on LC-MS, as this is the most sensitive and specific method for cereulide; the study design therefore differs from that of the microbiological methods validated under this mandate. As the method had to be developed, a two-stage validation study approach was used. The first stage (pre-study) focussed on the applicability of the method and the experience of the laboratories with it. Based on the outcome of the pre-study and comments received during voting at CEN and ISO level, a final method was agreed upon for use in the second stage, the (final) validation of the method. In the final (validation) study, samples of cooked rice (either artificially contaminated with cereulide or contaminated with B. cereus for production of cereulide in the rice) and six other food matrices (fried rice dish, cream pastry with chocolate, hotdog sausage, mini pancakes, vanilla custard and infant formula) were used. All these samples were spiked by the participating laboratories using standard solutions of cereulide supplied by the organising laboratory. The results of the study indicate that the method is fit for purpose. Repeatability values of 0.6 μg/kg were obtained at the low spike level (ca. 5 μg/kg) and 7 to 9.6 μg/kg at the high spike level (ca. 75 μg/kg). Reproducibility ranged from 0.6 to 0.9 μg/kg at the low spike level and from 8.7 to 14.5 μg/kg at the high spike level. Recovery from the spiked samples ranged from 96.5% for mini pancakes to 99.3% for the fried rice dish. Copyright © 2018. Published by Elsevier B.V.

  9. Validation of Helicopter Gear Condition Indicators Using Seeded Fault Tests

    NASA Technical Reports Server (NTRS)

    Dempsey, Paula; Brandon, E. Bruce

    2013-01-01

    A "seeded fault test", in support of a rotorcraft condition-based maintenance (CBM) program, is an experiment in which a component is tested with a known fault while health monitoring data are collected. These tests are performed at operating conditions comparable to those the component would be exposed to while installed on the aircraft. Performing seeded fault tests is one method used to provide evidence that a Health and Usage Monitoring System (HUMS) can replace current maintenance practices required for aircraft airworthiness. Actual in-service experience of the HUMS detecting a component fault is another validation method. This paper will discuss a hybrid validation approach that combines in-service data with seeded fault tests. In this approach, existing in-service HUMS flight data from a naturally occurring component fault will be used to define a component seeded fault test. An example, using spiral bevel gears as the targeted component, will be presented. Since the U.S. Army has begun to develop standards for using seeded fault tests for HUMS validation, the hybrid approach will be mapped to the steps defined within its Aeronautical Design Standard Handbook for CBM. This paper will step through the defined processes and identify additional steps that may be required when using component test rig fault tests to demonstrate helicopter condition indicator (CI) performance. The discussion within this paper will provide the reader with a better appreciation for the challenges faced when defining a seeded fault test for HUMS validation.

  10. Mitigating Communication Delays in Remotely Connected Hardware-in-the-loop Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cale, James; Johnson, Brian; Dall'Anese, Emiliano

    Here, this paper introduces a potential approach for mitigating the effects of communication delays between multiple, closed-loop hardware-in-the-loop experiments which are virtually connected, yet physically separated. The method consists of an analytical method for the compensation of communication delays, along with the supporting computational and communication infrastructure. The control design leverages tools for the design of observers for the compensation of measurement errors in systems with time-varying delays. The proposed methodology is validated through computer simulation and hardware experimentation connecting hardware-in-the-loop experiments conducted between laboratories separated by a distance of over 100 km.

  11. Mitigating Communication Delays in Remotely Connected Hardware-in-the-loop Experiments

    DOE PAGES

    Cale, James; Johnson, Brian; Dall'Anese, Emiliano; ...

    2018-03-30

    Here, this paper introduces a potential approach for mitigating the effects of communication delays between multiple, closed-loop hardware-in-the-loop experiments which are virtually connected, yet physically separated. The method consists of an analytical method for the compensation of communication delays, along with the supporting computational and communication infrastructure. The control design leverages tools for the design of observers for the compensation of measurement errors in systems with time-varying delays. The proposed methodology is validated through computer simulation and hardware experimentation connecting hardware-in-the-loop experiments conducted between laboratories separated by a distance of over 100 km.

  12. Computational simulations of supersonic magnetohydrodynamic flow control, power and propulsion systems

    NASA Astrophysics Data System (ADS)

    Wan, Tian

    This work is motivated by the lack of a fully coupled computational tool that successfully solves the turbulent, chemically reacting Navier-Stokes equations, the electron energy conservation equation and the electric current Poisson equation. In the present work, the abovementioned equations are solved in a fully coupled manner using fully implicit parallel GMRES methods. The system of Navier-Stokes equations is solved using a GMRES method with combined Schwarz and ILU(0) preconditioners. The electron energy equation and the electric current Poisson equation are solved using a GMRES method with combined SOR and Jacobi preconditioners. The fully coupled method has also been implemented successfully in an unstructured solver, US3D, and convergence test results are presented. This new method is shown to be two to five times faster than the original DPLR method. The Poisson solver is validated with analytic test problems. Four problems are then selected; two of them are computed to explore the possibility of onboard MHD control and power generation, and the other two are simulations of experiments. First, the possibility of onboard reentry shock control by a magnetic field is explored. As part of a previous project, MHD power generation onboard a re-entry vehicle is also simulated. Then, the MHD acceleration experiments conducted at NASA Ames Research Center are simulated. Lastly, the MHD power generation experiments known as the HVEPS project are simulated. For code validation, the scramjet experiments at the University of Queensland are simulated first; the generator section of the HVEPS test facility is then computed. The main conclusion is that the computational tool is accurate for different types of problems and flow conditions, and that its accuracy and efficiency are necessary when the flow complexity increases.

  13. Validation of published Stirling engine design methods using engine characteristics from the literature

    NASA Technical Reports Server (NTRS)

    Martini, W. R.

    1980-01-01

    Four fully disclosed reference engines and five design methods are discussed. So far, the agreement between theory and experiment is about as good for the simpler calculation methods as it is for the more complicated methods, that is, within 20%. For the simpler methods, a one number adjustable constant can be used to reduce the error in predicting power output and efficiency over the entire operating map to less than 10%.

  14. A Computational Framework for High-Throughput Isotopic Natural Abundance Correction of Omics-Level Ultra-High Resolution FT-MS Datasets

    PubMed Central

    Carreer, William J.; Flight, Robert M.; Moseley, Hunter N. B.

    2013-01-01

    New metabolomics applications of ultra-high resolution and accuracy mass spectrometry can provide thousands of detectable isotopologues, with the number of potentially detectable isotopologues increasing exponentially with the number of stable isotopes used in newer isotope tracing methods like stable isotope-resolved metabolomics (SIRM) experiments. This huge increase in usable data requires software capable of correcting the large number of isotopologue peaks resulting from SIRM experiments in a timely manner. We describe the design of a new algorithm and software system capable of handling these high volumes of data, while including quality control methods for maintaining data quality. We validate this new algorithm against a previous single isotope correction algorithm in a two-step cross-validation. Next, we demonstrate the algorithm and correct for the effects of natural abundance for both 13C and 15N isotopes on a set of raw isotopologue intensities of UDP-N-acetyl-D-glucosamine derived from a 13C/15N-tracing experiment. Finally, we demonstrate the algorithm on a full omics-level dataset. PMID:24404440
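
    As a simplified illustration of natural abundance correction (not the authors' algorithm, which handles multiple isotopes and ultra-high-resolution data), the Python sketch below applies the classic single-isotope 13C correction by building a binomial correction matrix and solving it against a made-up isotopologue intensity vector.

    ```python
    # Simplified single-isotope sketch: 13C natural-abundance correction via a
    # binomial correction matrix, solved by least squares. Intensities are invented.
    import numpy as np
    from scipy.special import comb

    P_13C = 0.0107          # natural abundance of 13C
    n_carbons = 6           # carbons in the (hypothetical) fragment
    measured = np.array([100.0, 45.0, 12.0, 2.0, 0.3, 0.0, 0.0])  # M+0 .. M+6

    # correction[i, j] = probability that a species with j labelled carbons is
    # observed at isotopologue M+i because of natural 13C in the remaining carbons.
    correction = np.zeros((n_carbons + 1, n_carbons + 1))
    for j in range(n_carbons + 1):
        for i in range(j, n_carbons + 1):
            k = i - j            # extra mass units contributed by natural abundance
            correction[i, j] = comb(n_carbons - j, k) * P_13C**k * (1 - P_13C)**(n_carbons - j - k)

    corrected, *_ = np.linalg.lstsq(correction, measured, rcond=None)
    corrected = np.clip(corrected, 0.0, None)    # negative values are noise artefacts
    print(corrected / corrected.sum())           # corrected fractional isotopologues
    ```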

  15. NASA DOE POD NDE Capabilities Data Book

    NASA Technical Reports Server (NTRS)

    Generazio, Edward R.

    2015-01-01

    This data book contains the Directed Design of Experiments for Validating Probability of Detection (POD) Capability of NDE Systems (DOEPOD) analyses of the nondestructive inspection data presented in the NTIAC Nondestructive Evaluation (NDE) Capabilities Data Book, 3rd ed., NTIAC DB-97-02. DOEPOD is designed as a decision support system to validate that an inspection system, its personnel, and its protocol demonstrate 0.90 POD with 95% confidence at critical flaw sizes (a90/95). The test methodology used in DOEPOD is based on the field of statistical sequential analysis founded by Abraham Wald. Sequential analysis is a method of statistical inference whose characteristic feature is that the number of observations required by the procedure is not determined in advance of the experiment. The decision to terminate the experiment depends, at each stage, on the results of the observations previously made. A merit of the sequential method, as applied to testing statistical hypotheses, is that test procedures can be constructed which require, on average, a substantially smaller number of observations than equally reliable test procedures based on a predetermined number of observations.
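
    For orientation, the Python sketch below shows the fixed-sample-size binomial arithmetic that underlies a 90/95 POD demonstration, for example the familiar 29-of-29 result; it is not the DOEPOD sequential procedure itself, which adapts the number of observations as data accumulate.

    ```python
    # Fixed-sample-size sketch of a 90/95 POD demonstration (illustrative only; the
    # DOEPOD methodology is sequential and is not reproduced here).
    from scipy.stats import binom

    def confidence_pod_at_least(hits, trials, pod_target=0.90):
        """One-sided confidence that the true POD is at least pod_target,
        given `hits` detections in `trials` inspections of same-size flaws."""
        # P(seeing `hits` or more detections | true POD == pod_target) is the
        # significance level; the confidence is its complement.
        return binom.cdf(hits - 1, trials, pod_target)

    print(confidence_pod_at_least(29, 29))   # ~0.953: the familiar 29-of-29 rule
    print(confidence_pod_at_least(45, 46))   # ~0.952: with one miss, 46 trials are needed
    ```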

  16. Do Procedures for Verbal Reporting of Thinking Have to Be Reactive? A Meta-Analysis and Recommendations for Best Reporting Methods

    ERIC Educational Resources Information Center

    Fox, Mark C.; Ericsson, K. Anders; Best, Ryan

    2011-01-01

    Since its establishment, psychology has struggled to find valid methods for studying thoughts and subjective experiences. Thirty years ago, Ericsson and Simon (1980) proposed that participants can give concurrent verbal expression to their thoughts (think aloud) while completing tasks without changing objectively measurable performance (accuracy).…

  17. Validation of Gujarati Version of ABILOCO-Kids Questionnaire

    PubMed Central

    Diwan, Jasmin; Patel, Pankaj; Bansal, Ankita B.

    2015-01-01

    Background ABILOCO-Kids is a measure of locomotion ability for children with cerebral palsy (CP) aged 6 to 15 years & is available in English & French. Aim To validate the Gujarati version of the ABILOCO-Kids questionnaire to be used in clinical research on the Gujarati population. Materials and Methods The ABILOCO-Kids questionnaire was translated into Gujarati from English using the forward-backward-forward method. To ensure face & content validity of the Gujarati version using the group consensus method, each item was examined by a group of experts having a mean experience of 24.62 years in the field of paediatrics and paediatric physiotherapy. Each item was analysed for content, meaning, wording, format, ease of administration & scoring. Each item was scored by the expert group as either accepted, rejected or accepted with modification. The procedure was continued until 80% consensus was reached for all items. Concurrent validity was examined on 55 children with cerebral palsy (6-15 years) of all Gross Motor Function Classification System (GMFCS) levels & all clinical types by correlating the score of ABILOCO-Kids with the Gross Motor Function Measure (GMFM) & GMFCS. Result In phase 1 of validation, 16 items were accepted as is, 22 items were accepted with modification & 3 items went to phase 2 validation. For concurrent validity, a highly significant positive correlation was found between the score of ABILOCO-Kids & total GMFM (r=0.713, p<0.005), & a highly significant negative correlation with GMFCS (r= -0.778, p<0.005). Conclusion The Gujarati translated version of the ABILOCO-Kids questionnaire has good face & content validity as well as concurrent validity, and can be used to measure caregiver-reported locomotion ability in children with CP. PMID:26557603

  18. Ten Commandments Revisited: A Ten-Year Perspective on the Industrial Application of Formal Methods

    NASA Technical Reports Server (NTRS)

    Bowen, Jonathan P.; Hinchey, Michael G.

    2005-01-01

    Ten years ago, our 1995 paper Ten Commandments of Formal Methods suggested some guidelines to help ensure the success of a formal methods project. It proposed ten important requirements (or "commandments") for formal developers to consider and follow, based on our knowledge of several industrial application success stories, most of which have been reported in more detail in two books. The paper was surprisingly popular, is still widely referenced, and used as required reading in a number of formal methods courses. However, not all have agreed with some of our commandments, feeling that they may not be valid in the long-term. We re-examine the original commandments ten years on, and consider their validity in the light of a further decade of industrial best practice and experiences.

  19. Infrared Light Structured Sensor 3D Approach to Estimate Kidney Volume: A Validation Study.

    PubMed

    Garisto, Juan; Bertolo, Riccardo; Dagenais, Julien; Kaouk, Jihad

    2018-06-26

    To validate a new procedure for the three-dimensional (3D) estimation of total renal parenchyma volume (RPV) using a structured-light infrared laser sensor. To evaluate the accuracy of the sensor for assessing renal volume, we performed three experiments on twenty freshly excised porcine kidneys. In experiment A, the water displacement method was used to obtain a determination of the RPV after immersing every kidney in 0.9% saline. Thereafter, a structured-light sensor (Occipital, San Francisco, CA, USA) was used to scan the kidney. The kidney surface was initially represented as a mesh and then imported into MeshLab (Visual Computing Lab, Pisa, Italy) software to obtain the surface volume. In experiment B, a partial excision of the kidney was performed, with measurement of the excised volume and the remnant. In experiment C, a renorrhaphy of the remnant kidney was performed and then measured. Bias and limits of agreement (LOA) were determined using the Bland-Altman method. Reliability was assessed using the intraclass correlation coefficient (ICC). In experiment A, the sensor bias was -1.95 mL (LOA: -19.5 to 15.59, R2 = 0.410), slightly overestimating the volumes. In experiment B, the remnant kidney after partial excision and the excised kidney volume were measured, showing a sensor bias of -0.5 mL (LOA: -5.34 to 4.20, R2 = 0.490) and -0.6 mL (LOA: -1.97 to 0.77, R2 = 0.561), respectively. In experiment C, the sensor bias was -0.89 mL (LOA: -12.9 to 11.1, R2 = 0.888). The ICC was 0.9998. The sensor is a reliable method for assessing total renal volume with high levels of accuracy. Copyright © 2018. Published by Elsevier Inc.
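
    The agreement statistics quoted above can be reproduced in a few lines; the Python sketch below computes a Bland-Altman bias and 95% limits of agreement for two paired volume measurements. The volumes are invented values, not the study's data.

    ```python
    # Minimal sketch of Bland-Altman agreement statistics: bias and limits of
    # agreement between two volume measurements of the same kidneys (invented data).
    import numpy as np

    water_displacement_ml = np.array([118.0, 95.0, 102.0, 130.0, 88.0, 111.0])
    sensor_ml             = np.array([120.5, 93.0, 105.0, 128.0, 90.5, 113.0])

    diff = sensor_ml - water_displacement_ml
    bias = diff.mean()
    sd = diff.std(ddof=1)
    loa_low, loa_high = bias - 1.96 * sd, bias + 1.96 * sd

    print(f"Bias = {bias:.2f} mL, 95% limits of agreement: {loa_low:.2f} to {loa_high:.2f} mL")
    ```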

  20. Determination of Nitrogen, Phosphorus, and Potassium Release Rates of Slow- and Controlled-Release Fertilizers: Single-Laboratory Validation, First Action 2015.15.

    PubMed

    Thiex, Nancy

    2016-01-01

    A previously validated method for the determination of nitrogen release patterns of slow- and controlled-release fertilizers (SRFs and CRFs, respectively) was submitted to the Expert Review Panel (ERP) for Fertilizers for consideration of First Action Official Method(SM) status. The ERP evaluated the single-laboratory validation results, recommended the method for First Action Official Method status, and provided recommendations for achieving Final Action. The 180 day soil incubation-column leaching technique was demonstrated to be a robust and reliable method for characterizing N release patterns from SRFs and CRFs. The method was reproducible, and the results were only slightly affected by variations in environmental factors such as microbial activity, soil moisture, temperature, and texture. The release of P and K was also studied, but at fewer replications than for N. Optimization experiments on the accelerated 74 h extraction method indicated that temperature was the only factor found to substantially influence nutrient-release rates from the materials studied, and an optimized extraction profile was established as follows: 2 h at 25°C, 2 h at 50°C, 20 h at 55°C, and 50 h at 60°C.

  1. Longitudinal Measurement Equivalence of Subjective Language Brokering Experiences Scale in Mexican American Adolescents

    PubMed Central

    Kim, Su Yeong; Hou, Yang; Shen, Yishan; Zhang, Minyu

    2016-01-01

    Objectives Language brokering occurs frequently in immigrant families and can have significant implications for the well-being of family members involved. The present study aimed to develop and validate a measure that can be used to assess multiple dimensions of subjective language brokering experiences among Mexican American adolescents. Methods Participants were 557 adolescent language brokers (54.2% female; mean age at Wave 1 = 12.96 years, SD = 0.94) in Mexican American families. Results Using exploratory and confirmatory factor analyses, we were able to identify seven reliable subscales of language brokering: linguistic benefits, socio-emotional benefits, efficacy, positive parent-child relationships, parental dependence, negative feelings, and centrality. Tests of factorial invariance show that these subscales demonstrate, at minimum, partial strict invariance across time and across experiences of translating for mothers and fathers, and in most cases, also across adolescent gender, nativity, and translation frequency. Thus, in general, the means of the subscales and the relations among the subscales with other variables can be compared across these different occasions and groups. Tests of criterion-related validity demonstrated that these subscales correlated, concurrently and longitudinally, with parental warmth and hostility, parent-child alienation, adolescent family obligation, depressive symptoms, resilience, and life meaning. Conclusions This reliable and valid subjective language brokering experiences scale will be helpful for gaining a better understanding of adolescents’ language brokering experiences with their mothers and fathers, and how such experiences may influence their development. PMID:27362872

  2. Validation and Continued Development of Methods for Spheromak Simulation

    NASA Astrophysics Data System (ADS)

    Benedett, Thomas

    2017-10-01

    The HIT-SI experiment has demonstrated stable sustainment of spheromaks. Determining how the underlying physics extrapolate to larger, higher-temperature regimes is of prime importance in determining the viability of the inductively-driven spheromak. It is thus prudent to develop and validate a computational model that can be used to study current results and study the effect of possible design choices on plasma behavior. An extended MHD model has shown good agreement with experimental data at 14 kHz injector operation. Efforts to extend the existing validation to a range of higher frequencies (36, 53, 68 kHz) using the PSI-Tet 3D extended MHD code will be presented, along with simulations of potential combinations of flux conserver features and helicity injector configurations and their impact on current drive performance, density control, and temperature for future SIHI experiments. Work supported by USDoE.

  3. Outsourcing bioanalytical services at Janssen Research and Development: the sequel anno 2017.

    PubMed

    Dillen, Lieve; Verhaeghe, Tom

    2017-08-01

    The strategy of outsourcing bioanalytical services at Janssen has been evolving over the last years, and an update will be given on the recent changes in our processes. In 2016, all internal GLP-related activities were phased out, and this decision led to the re-orientation of the in-house bioanalytical activities. As a consequence, in-depth experience with the validated bioanalytical assays for new drug candidates is currently gained together with the external partner, since development and validation of the assay and execution of GLP preclinical studies are now transferred to the CRO. The evolution to externalize more bioanalytical support has created opportunities to build even stronger partnerships with the CROs and to refocus internal resources. Case studies are presented illustrating challenges encountered during method development and validation at preferred partners when limited internal experience is available or when new technology is introduced.

  4. Supersonic pressure measurements and comparison of theory to experiment for an arrow-wing configuration

    NASA Technical Reports Server (NTRS)

    Manro, M. E.

    1976-01-01

    A wind tunnel test of an arrow-wing-body configuration consisting of flat and twisted wings, as well as leading- and trailing-edge control surface deflections, was conducted at Mach numbers from 1.54 to 2.50 to provide an experimental pressure data base for comparison with theoretical methods. Theory-to-experiment comparisons of detailed pressure distributions were made using a state-of-the-art inviscid flow, constant-pressure-panel method. Emphasis was on conditions under which this theory is valid for both flat and twisted wings.

  5. Modeling hole transport in wet and dry DNA.

    PubMed

    Pavanello, Michele; Adamowicz, Ludwik; Volobuyev, Maksym; Mennucci, Benedetta

    2010-04-08

    We present a DFT/classical molecular dynamics model of DNA charge conductivity. The model involves a temperature-driven, hole-hopping charge transfer and includes the time-dependent nonequilibrium interaction of DNA with its molecular environment. We validate our method against a variety of hole transport experiments. The method predicts a significant hole-transfer slowdown of approximately 35% from dry to wet DNA with and without electric field bias. In addition, in agreement with experiments, it also predicts an insulating behavior of (GC)(N) oligomers for 40 < N < 1000, depending on the experimental setup.

  6. Validation of light water reactor calculation methods and JEF-1-based data libraries by TRX and BAPL critical experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paratte, J.M.; Pelloni, S.; Grimm, P.

    1991-04-01

    This paper analyzes the capability of various code systems and JEF-1-based nuclear data libraries to compute light water reactor lattices by comparing calculations with results from thermal reactor benchmark experiments TRX and BAPL and with previously published values. With the JEF-1 evaluation, eigenvalues are generally well predicted within 8 mk (1 mk = 0.001) or less by all code systems, and all methods give reasonable results for the measured reaction rate ratios within, or not too far from, the experimental uncertainty.

  7. Markov Jump-Linear Performance Models for Recoverable Flight Control Computers

    NASA Technical Reports Server (NTRS)

    Zhang, Hong; Gray, W. Steven; Gonzalez, Oscar R.

    2004-01-01

    Single event upsets in digital flight control hardware induced by atmospheric neutrons can reduce system performance and possibly introduce a safety hazard. One method currently under investigation to help mitigate the effects of these upsets is NASA Langley's Recoverable Computer System. In this paper, a Markov jump-linear model is developed for a recoverable flight control system, which will be validated using data from future experiments with simulated and real neutron environments. The method of tracking error analysis and the plan for the experiments are also described.

  8. Transonic pressure measurements and comparison of theory to experiment for three arrow-wing configurations

    NASA Technical Reports Server (NTRS)

    Manro, M. E.

    1982-01-01

    Wind tunnel tests of arrow-wing body configurations consisting of flat, twisted, and cambered twisted wings, as well as a variety of leading and trailing edge control surface deflections, were conducted at Mach numbers from 0.4 to 1.05 to provide an experimental pressure data base for comparison with theoretical methods. Theory to experiment comparisons of detailed pressure distributions were made using state of the art attached flow methods. Conditions under which these theories are valid for these wings are presented.

  9. A Systematic Method for Verification and Validation of Gyrokinetic Microstability Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bravenec, Ronald

    My original proposal for the period Feb. 15, 2014 through Feb. 14, 2017 called for an integrated validation and verification effort carried out by myself with collaborators. The validation component would require experimental profile and power-balance analysis. In addition, it would require running the gyrokinetic codes varying the input profiles within experimental uncertainties to seek agreement with experiment before discounting a code as invalidated. Therefore, validation would require a major increase of effort over my previous grant periods which covered only code verification (code benchmarking). Consequently, I had requested full-time funding. Instead, I am being funded at somewhat less than half time (5 calendar months per year). As a consequence, I decided to forego the validation component and to only continue the verification efforts.

  10. Probing eukaryotic cell mechanics via mesoscopic simulations

    PubMed Central

    Shang, Menglin; Lim, Chwee Teck

    2017-01-01

    Cell mechanics has proven to be important in many biological processes. Although there are a number of experimental techniques which allow us to study the mechanical properties of cells, there is still a lack of understanding of the role each sub-cellular component plays during cell deformations. We present a new mesoscopic particle-based eukaryotic cell model which explicitly describes the cell membrane, nucleus and cytoskeleton. We employ the Dissipative Particle Dynamics (DPD) method, which provides us with a unified framework for modeling a cell and its interactions in the flow. Data from micropipette aspiration experiments were used to define model parameters. The model was validated using data from microfluidic experiments. The validated model was then applied to study the impact of the sub-cellular components on the cell's viscoelastic response in micropipette aspiration and microfluidic experiments. PMID:28922399

  11. The Daily Heterosexist Experiences Questionnaire: Measuring Minority Stress Among Lesbian, Gay, Bisexual, and Transgender Adults

    PubMed Central

    Balsam, Kimberly F.; Beadnell, Blair; Molina, Yamile

    2013-01-01

    The authors conducted a three-phase, mixed-methods study to develop a self-report measure assessing the unique aspects of minority stress for lesbian, gay, bisexual, and transgender adults. The Daily Heterosexist Experiences Questionnaire has 50 items and nine subscales with acceptable internal reliability, and construct and concurrent validity. Mean sexual orientation and gender differences were found. PMID:24058262

  12. SAS Code for Calculating Intraclass Correlation Coefficients and Effect Size Benchmarks for Site-Randomized Education Experiments

    ERIC Educational Resources Information Center

    Brandon, Paul R.; Harrison, George M.; Lawton, Brian E.

    2013-01-01

    When evaluators plan site-randomized experiments, they must conduct the appropriate statistical power analyses. These analyses are most likely to be valid when they are based on data from the jurisdictions in which the studies are to be conducted. In this method note, we provide software code, in the form of a SAS macro, for producing statistical…

  13. Continued Development and Validation of Methods for Spheromak Simulation

    NASA Astrophysics Data System (ADS)

    Benedett, Thomas

    2015-11-01

    The HIT-SI experiment has demonstrated stable sustainment of spheromaks; determining how the underlying physics extrapolate to larger, higher-temperature regimes is of prime importance in determining the viability of the inductively-driven spheromak. It is thus prudent to develop and validate a computational model that can be used to study current results and provide an intermediate step between theory and future experiments. A zero-beta Hall-MHD model has shown good agreement with experimental data at 14.5 kHz injector operation. Experimental observations at higher frequency, where the best performance is achieved, indicate pressure effects are important and likely required to attain quantitative agreement with simulations. Efforts to extend the existing validation to high frequency (~ 36-68 kHz) using an extended MHD model implemented in the PSI-TET arbitrary-geometry 3D MHD code will be presented. Results from verification of the PSI-TET extended MHD model using the GEM magnetic reconnection challenge will also be presented along with investigation of injector configurations for future SIHI experiments using Taylor state equilibrium calculations. Work supported by DoE.

  14. Zig-zag tape influence in NREL Phase VI wind turbine

    NASA Astrophysics Data System (ADS)

    Gomez-Iradi, Sugoi; Munduate, Xabier

    2014-06-01

    A two-bladed 10 metre diameter wind turbine was tested in the 24.4m × 36.6m NASA-Ames wind tunnel (Phase VI). These experiments have been used extensively for validation of CFD and other engineering tools. The free-transition case (S) has been, and remains, the most widely employed one for validation purposes; it consists of a 3° pitch case at a rotational speed of 72 rpm in an upwind configuration, with and without yaw misalignment. However, there is another, less visited case (M) in which an identical configuration was tested but with the inclusion of a zig-zag tape; this was called the fixed-transition sequence. This paper shows the differences between the free- and fixed-transition cases, the latter being more appropriate for comparison with fully turbulent simulations. Steady k-ω SST fully turbulent computations performed with the WMB CFD method are compared with the experiments, showing better predictions in the attached-flow region when compared against the fixed-transition experiments. This work aims to demonstrate the utility of case M (fixed transition) and to show its differences with respect to case S (free transition) for validation purposes.

  15. Validation and Continued Development of Methods for Spheromak Simulation

    NASA Astrophysics Data System (ADS)

    Benedett, Thomas

    2016-10-01

    The HIT-SI experiment has demonstrated stable sustainment of spheromaks. Determining how the underlying physics extrapolate to larger, higher-temperature regimes is of prime importance in determining the viability of the inductively-driven spheromak. It is thus prudent to develop and validate a computational model that can be used to study current results and study the effect of possible design choices on plasma behavior. A zero-beta Hall-MHD model has shown good agreement with experimental data at 14.5 kHz injector operation. Experimental observations at higher frequency, where the best performance is achieved, indicate pressure effects are important and likely required to attain quantitative agreement with simulations. Efforts to extend the existing validation to high frequency (36-68 kHz) using an extended MHD model implemented in the PSI-TET arbitrary-geometry 3D MHD code will be presented. An implementation of anisotropic viscosity, a feature observed to improve agreement between NIMROD simulations and experiment, will also be presented, along with investigations of flux conserver features and their impact on density control for future SIHI experiments. Work supported by DoE.

  16. Validation of the activity expansion method with ultrahigh pressure shock equations of state

    NASA Astrophysics Data System (ADS)

    Rogers, Forrest J.; Young, David A.

    1997-11-01

    Laser shock experiments have recently been used to measure the equation of state (EOS) of matter in the ultrahigh pressure region between condensed matter and a weakly coupled plasma. Some ultrahigh pressure data from nuclear-generated shocks are also available. Matter at these conditions has proven very difficult to treat theoretically. The many-body activity expansion method (ACTEX) has been used for some time to calculate EOS and opacity data in this region, for use in modeling inertial confinement fusion and stellar interior plasmas. In the present work, we carry out a detailed comparison with the available experimental data in order to validate the method. The agreement is good, showing that ACTEX adequately describes strongly shocked matter.

  17. Successful validation of in vitro methods in toxicology by ZEBET, the National Centre for Alternatives in Germany at the BfR (Federal Institute for Risk Assessment).

    PubMed

    Spielmann, Horst; Grune, Barbara; Liebsch, Manfred; Seiler, Andrea; Vogel, Richard

    2008-06-01

    A short description of the history of the 3Rs concept is given, which was developed by Russell and Burch more than 40 years ago as the scientific concept to refine, reduce and replace animal experiments. In addition, the legal framework in Europe for developing alternatives to animal experiments is described, and the current status of in vitro systems in pharmacology and toxicology is presented, including an update on metabolising systems. The decrease in experimental animal numbers during the past decade in Europe is illustrated by the situation in Germany, and the contribution of international harmonisation of test guidelines to reducing animal numbers in regulatory testing is described. A review of the development of the principles of experimental validation is given, and the 3T3 NRU in vitro phototoxicity test is used as an example of a successful validation study, which led to the acceptance of the first in vitro toxicity test for regulatory purposes by the OECD. Finally, the currently accepted alternative methods for standardisation and safety testing of drugs, biologicals and medical devices are summarised.

  18. MRPrimer: a MapReduce-based method for the thorough design of valid and ranked primers for PCR.

    PubMed

    Kim, Hyerin; Kang, NaNa; Chon, Kang-Wook; Kim, Seonho; Lee, NaHye; Koo, JaeHyung; Kim, Min-Soo

    2015-11-16

    Primer design is a fundamental technique that is widely used for polymerase chain reaction (PCR). Although many methods have been proposed for primer design, they require a great deal of manual effort to generate feasible and valid primers, including homology tests on off-target sequences using BLAST-like tools. That approach becomes inconvenient when primers must be designed for many target sequences of quantitative PCR (qPCR), because the same stringent and allele-invariant constraints must be considered for every target. To address this issue, we propose an entirely new method called MRPrimer that can design all feasible and valid primer pairs existing in a DNA database at once, while simultaneously checking a multitude of filtering constraints and validating primer specificity. Furthermore, MRPrimer suggests the best primer pair for each target sequence, based on a ranking method. Through qPCR analysis using 343 primer pairs and the corresponding sequencing and comparative analyses, we showed that the primer pairs designed by MRPrimer are very stable and effective for qPCR. In addition, MRPrimer is computationally efficient and scalable and therefore useful for quickly constructing an entire collection of feasible and valid primers for frequently updated databases like RefSeq. Furthermore, we suggest that MRPrimer can be utilized conveniently for experiments requiring primer design, especially real-time qPCR. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.

  19. Analysis and optimization of hybrid excitation permanent magnet synchronous generator for stand-alone power system

    NASA Astrophysics Data System (ADS)

    Wang, Huijun; Qu, Zheng; Tang, Shaofei; Pang, Mingqi; Zhang, Mingju

    2017-08-01

    In this paper, electromagnetic design and permanent magnet shape optimization for permanent magnet synchronous generator with hybrid excitation are investigated. Based on generator structure and principle, design outline is presented for obtaining high efficiency and low voltage fluctuation. In order to realize rapid design, equivalent magnetic circuits for permanent magnet and iron poles are developed. At the same time, finite element analysis is employed. Furthermore, by means of design of experiment (DOE) method, permanent magnet is optimized to reduce voltage waveform distortion. Finally, the validity of proposed design methods is validated by the analytical and experimental results.

  20. Improvement of Computer Software Quality through Software Automated Tools.

    DTIC Science & Technology

    1986-08-31

    requirement for increased emphasis on software quality assurance has led to the creation of various methods of verification and validation. Experience...result was a vast array of methods, systems, languages and automated tools to assist in the process. Given that the primary role of quality assurance is...Unfortunately, there is no single method, tool or technique that can ensure accurate, reliable and cost-effective software. Therefore, government and industry

  1. Analysis of a boron-carbide-drum-controlled critical reactor experiment

    NASA Technical Reports Server (NTRS)

    Mayo, W. T.

    1972-01-01

    In order to validate methods and cross sections used in the neutronic design of compact fast-spectrum reactors for generating electric power in space, an analysis of a boron-carbide-drum-controlled critical reactor was made. For this reactor the transport analysis gave generally satisfactory results. The calculated multiplication factor for the most detailed calculation was only 0.7-percent Delta k too high. Calculated reactivity worth of the control drums was $11.61 compared to measurements of $11.58 by the inverse kinetics methods and $11.98 by the inverse counting method. Calculated radial and axial power distributions were in good agreement with experiment.

  2. Sample Preparation and Extraction in Small Sample Volumes Suitable for Pediatric Clinical Studies: Challenges, Advances, and Experiences of a Bioanalytical HPLC-MS/MS Method Validation Using Enalapril and Enalaprilat

    PubMed Central

    Burckhardt, Bjoern B.; Laeer, Stephanie

    2015-01-01

    In the USA and Europe, medicines agencies are pushing the development of child-appropriate medications and intend to increase the availability of information on pediatric use. This calls for bioanalytical methods that can deal with small sample volumes, as the trial-related blood loss in children is very restricted. Broadly used HPLC-MS/MS, while able to cope with small volumes, is susceptible to matrix effects. The latter hampers precise drug quantification, for example by causing signal suppression. Sophisticated sample preparation and purification utilizing solid-phase extraction was applied to reduce and control matrix effects. A scale-up from vacuum manifold to positive pressure manifold was conducted to meet the demands of high throughput within a clinical setting. The challenges faced, advances made, and experience gained in solid-phase extraction are presented using the bioanalytical method development and validation for low-volume samples (50 μL serum) as an example. Enalapril, enalaprilat, and benazepril served as sample drugs. The applied sample preparation and extraction successfully reduced the absolute and relative matrix effect to comply with international guidelines. Recoveries ranged from 77 to 104% for enalapril and from 93 to 118% for enalaprilat. The bioanalytical method comprising sample extraction by solid-phase extraction was fully validated according to FDA and EMA bioanalytical guidelines and was used in a Phase I study in 24 volunteers. PMID:25873972

  3. Development of a High-Performance Liquid Chromatography–Tandem Mass Spectrometry Method for the Identification and Quantification of CP-47,497, CP-47,497-C8 and JWH-250 in Mouse Brain

    PubMed Central

    Samano, Kimberly L.; Poklis, Justin L.; Lichtman, Aron H.; Poklis, Alphonse

    2014-01-01

    While marijuana continues to be the most widely used illicit drug, abuse of synthetic cannabinoid (SCB) compounds in ‘Spice’ or ‘K2’ herbal incense products has emerged as a significant public health concern in many European countries and in the USA. Several of these SCBs have been declared Schedule I controlled substances, but detection and quantification in biological samples remain a challenge. Therefore, we present a liquid chromatography–tandem mass spectrometry method after liquid–liquid extraction for the quantitation of CP-47,497, CP-47,497-C8 and JWH-250 in mouse brain. We report data for linearity, limit of quantification, accuracy/bias, precision, recovery, selectivity, carryover, matrix effects and stability experiments which were developed and fully validated based on Scientific Working Group for Forensic Toxicology guidelines for forensic toxicology method validation. Acceptable coefficients of variation for accuracy/bias, within- and between-run precision and selectivity were determined, with all values within ±15% of the target concentration. Validation experiments revealed degradation of CP-47,497 and CP-47,497-C8 at different temperatures, and significant ion suppression was produced in brain for all compounds tested. The method was successfully applied to detect and quantify CP-47,497 in brains from mice demonstrating significant cannabimimetic behavioral effects as assessed by the classical tetrad paradigm. PMID:24816398

  4. Development of a Research Participants’ Perception Survey to Improve Clinical Research

    PubMed Central

    Yessis, Jennifer L.; Kost, Rhonda G.; Lee, Laura M.; Coller, Barry S.; Henderson, David K.

    2012-01-01

    Abstract Introduction: Clinical research participants’ perceptions regarding their experiences during research protocols provide outcome‐based insights into the effectiveness of efforts to protect rights and safety, and opportunities to enhance participants’ clinical research experiences. Use of validated surveys measuring patient‐centered outcomes is standard in hospitals, yet no instruments exist to assess outcomes of clinical research processes. Methods: We derived survey questions from data obtained from focus groups comprised of research participants and professionals. We assessed the survey for face/content validity, and privacy/confidentiality protections and fielded it to research participants at 15 centers. We conducted analyses of response rates, sample characteristics, and psychometrics, including survey and item completion and analysis, internal consistency, item internal consistency, criterion‐related validity, and item usefulness. Responses were tested for fit into existing patient‐centered dimensions of care and new clinical research dimensions using Cronbach's alpha coefficient. Results: Surveys were mailed to 18,890 individuals; 4,961 were returned (29%). Survey completion was 89% overall; completion rates exceeded 90% for 88 of 93 evaluable items. Questions fit into three dimensions of patient‐centered care and two novel clinical research dimensions (Cronbach's alpha for dimensions: 0.69–0.85). Conclusions: The validated survey offers a new method for assessing and improving outcomes of clinical research processes. Clin Trans Sci 2012; Volume 5: 452–460 PMID:23253666

  5. Challenges in validating the sterilisation dose for processed human amniotic membranes

    NASA Astrophysics Data System (ADS)

    Yusof, Norimah; Hassan, Asnah; Firdaus Abd Rahman, M. N.; Hamid, Suzina A.

    2007-11-01

    Most of the tissue banks in the Asia Pacific region have been using ionising radiation at 25 kGy to sterilise human tissues for safe clinical use. Under the tissue banking quality system, any dose employed for sterilisation has to be validated, and the validation exercise has to be part of the quality documentation. Tissue grafts, unlike medical items, are not produced in large numbers in each processing batch, and tissues have relatively different microbial populations. A Code of Practice established by the International Atomic Energy Agency (IAEA) in 2004 offers several validation methods using smaller numbers of samples than ISO 11137 (1995), which is meant for medical products. The methods emphasise bioburden determination, followed by sterility testing of samples exposed to a verification dose corresponding to a sterility assurance level (SAL) of 10^-1. This paper describes our experience in using the IAEA Code of Practice to conduct the validation exercise substantiating 25 kGy as the sterilisation dose for both air-dried amnion and amnion preserved in 99% glycerol.

  6. Design process and preliminary psychometric study of a video game to detect cognitive impairment in senior adults

    PubMed Central

    Perez-Rodriguez, Roberto; Facal, David; Fernandez-Iglesias, Manuel J.; Anido-Rifon, Luis; Mouriño-Garcia, Marcos

    2017-01-01

    Introduction Assessment of episodic memory has been traditionally used to evaluate potential cognitive impairments in senior adults. Typically, episodic memory evaluation is based on personal interviews and pen-and-paper tests. This article presents the design, development and a preliminary validation of a novel digital game to assess episodic memory intended to overcome the limitations of traditional methods, such as the cost of its administration, its intrusive character, the lack of early detection capabilities, the lack of ecological validity, the learning effect and the existence of confounding factors. Materials and Methods Our proposal is based on the gamification of the California Verbal Learning Test (CVLT) and it has been designed to comply with the psychometric characteristics of reliability and validity. Two qualitative focus groups and a first pilot experiment were carried out to validate the proposal. Results A more ecological, non-intrusive and more easily administered tool to perform cognitive assessment was developed. Initial evidence from the focus groups and pilot experiment confirmed the developed game’s usability and offered promising results insofar as its psychometric validity is concerned. Moreover, the potential of this game for the cognitive classification of senior adults was confirmed, and administration time is dramatically reduced with respect to pen-and-paper tests. Limitations Additional research is needed to improve the resolution of the game for the identification of specific cognitive impairments, as well as to achieve a complete validation of the psychometric properties of the digital game. Conclusion Initial evidence shows that serious games can be used as an instrument to assess the cognitive status of senior adults, and even to predict the onset of mild cognitive impairments or Alzheimer’s disease. PMID:28674661

  7. Applying heuristic inquiry to nurse migration from the UK to Australia.

    PubMed

    Vafeas, Caroline; Hendricks, Joyce

    2017-01-23

    Background Heuristic inquiry is a research approach that improves understanding of the essence of an experience. This qualitative method relies on researchers' ability to discover and interpret their own experience while exploring those of others. Aim To present a discussion of heuristic inquiry's methodology and its application to the experience of nurse migration. Discussion The researcher's commitment to the research is central to heuristic inquiry. It is immersive, reflective, reiterative and a personally-affecting method of gathering knowledge. Researchers are acknowledged as the only people who can validate the findings of the research by exploring their own experiences while also examining those of others with the same experiences to truly understand the phenomena being researched. This paper presents the ways in which the heuristic process guides this discovery in relation to traditional research steps. Conclusion Heuristic inquiry is an appropriate method for exploring nurses' experiences of migration because nurse researchers can tell their own stories and it brings understanding of themselves and the phenomenon as experienced by others. Implications for practice Although not a popular method in nursing research, heuristic inquiry offers a depth of exploration and understanding that may not be revealed by other methods.

  8. Direct determination of one-dimensional interphase structures using normalized crystal truncation rod analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kawaguchi, Tomoya; Liu, Yihua; Reiter, Anthony

    Here, a one-dimensional non-iterative direct method was employed for normalized crystal truncation rod analysis. The non-iterative approach, utilizing the Kramers–Kronig relation, avoids the ambiguities due to an improper initial model or incomplete convergence in the conventional iterative methods. The validity and limitations of the present method are demonstrated through both numerical simulations and experiments with Pt(111) in a 0.1 M CsF aqueous solution. The present method is compared with conventional iterative phase-retrieval methods.

  9. Direct determination of one-dimensional interphase structures using normalized crystal truncation rod analysis

    DOE PAGES

    Kawaguchi, Tomoya; Liu, Yihua; Reiter, Anthony; ...

    2018-04-20

    Here, a one-dimensional non-iterative direct method was employed for normalized crystal truncation rod analysis. The non-iterative approach, utilizing the Kramers–Kronig relation, avoids the ambiguities due to an improper initial model or incomplete convergence in the conventional iterative methods. The validity and limitations of the present method are demonstrated through both numerical simulations and experiments with Pt(111) in a 0.1 M CsF aqueous solution. The present method is compared with conventional iterative phase-retrieval methods.
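
    As a rough illustration of the Kramers–Kronig idea invoked above (and not the paper's normalized-CTR implementation), the phase of a minimum-phase response can be recovered as the negative Hilbert transform of its log-amplitude; the sketch below assumes a synthetic amplitude profile and standard SciPy tools.

      # Generic Kramers-Kronig phase-retrieval sketch (not the CTR-specific method).
      import numpy as np
      from scipy.signal import hilbert

      def kk_phase_from_amplitude(amplitude):
          # The imaginary part of scipy's analytic signal is the Hilbert transform of the input.
          log_amp = np.log(np.clip(amplitude, 1e-12, None))
          return -np.imag(hilbert(log_amp))

      # Toy usage with a synthetic amplitude profile along a rod coordinate.
      q = np.linspace(-1.0, 1.0, 512)
      amplitude = 1.0 / (0.05 + q**2)
      phase = kk_phase_from_amplitude(amplitude)
      print(phase[:5])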

  10. Hovering Dual-Spin Vehicle Groundwork for Bias Momentum Sizing Validation Experiment

    NASA Technical Reports Server (NTRS)

    Rothhaar, Paul M.; Moerder, Daniel D.; Lim, Kyong B.

    2008-01-01

    Angular bias momentum offers significant stability augmentation for hovering flight vehicles. The reliance of the vehicle on thrust vectoring for agility and disturbance rejection is greatly reduced with significant levels of stored angular momentum in the system. A methodical procedure for bias momentum sizing has been developed in previous studies. This current study provides groundwork for experimental validation of that method using an experimental vehicle called the Dual-Spin Test Device, a thrust-levitated platform. Using measured data the vehicle's thrust vectoring units are modeled and a gust environment is designed and characterized. Control design is discussed. Preliminary experimental results of the vehicle constrained to three rotational degrees of freedom are compared to simulation for a case containing no bias momentum to validate the simulation. A simulation of a bias momentum dominant case is presented.

  11. Beyond Corroboration: Strengthening Model Validation by Looking for Unexpected Patterns

    PubMed Central

    Chérel, Guillaume; Cottineau, Clémentine; Reuillon, Romain

    2015-01-01

    Models of emergent phenomena are designed to provide an explanation to global-scale phenomena from local-scale processes. Model validation is commonly done by verifying that the model is able to reproduce the patterns to be explained. We argue that robust validation must not only be based on corroboration, but also on attempting to falsify the model, i.e. making sure that the model behaves soundly for any reasonable input and parameter values. We propose an open-ended evolutionary method based on Novelty Search to look for the diverse patterns a model can produce. The Pattern Space Exploration method was tested on a model of collective motion and compared to three common a priori sampling experiment designs. The method successfully discovered all known qualitatively different kinds of collective motion, and performed much better than the a priori sampling methods. The method was then applied to a case study of city system dynamics to explore the model’s predicted values of city hierarchisation and population growth. This case study showed that the method can provide insights on potential predictive scenarios as well as falsifiers of the model when the simulated dynamics are highly unrealistic. PMID:26368917
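
    To make the Novelty Search idea concrete, here is a bare-bones, hedged sketch (not the authors' Pattern Space Exploration code): candidate parameter settings are kept when the pattern they produce lies far from previously archived patterns, rather than when they score well on a fitness function. The toy model, distance threshold and sample sizes are invented for illustration.

      # Bare-bones novelty-driven exploration sketch (illustrative only).
      import numpy as np

      def novelty(pattern, archive, k=5):
          if len(archive) == 0:
              return np.inf
          dists = np.sort(np.linalg.norm(np.asarray(archive) - pattern, axis=1))
          return dists[:k].mean()              # mean distance to the k nearest archived patterns

      def explore(model, sample_params, n_iter=200, threshold=0.5):
          archive = []
          for _ in range(n_iter):
              pattern = model(sample_params())     # model output summarised as a pattern vector
              if novelty(pattern, archive) > threshold:
                  archive.append(pattern)          # keep only behaviourally new patterns
          return np.array(archive)

      rng = np.random.default_rng(0)
      toy_model = lambda p: np.array([np.sin(3 * p[0]), p[0] * p[1]])
      patterns = explore(toy_model, lambda: rng.uniform(-1.0, 1.0, size=2))
      print(len(patterns))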

  12. [Phenomenology and phenomenological method: their usefulness for nursing knowledge and practice].

    PubMed

    Vellone, E; Sinapi, N; Rastelli, D

    2000-01-01

    Phenomenology is a thought movement whose main aim is to study human phenomena as they are experienced and lived. Key concepts of phenomenology are: the study of lived experience and the subjectivity of human beings, the intentionality of consciousness, perception and interpretation. The phenomenological research method has nine steps: definition of the research topic; superficial literature searching; sample selection; gathering of lived experiences; analysis of lived experiences; written synthesis of lived experiences; validation of the written synthesis; deep literature searching; writing of the scientific document. Phenomenology and the phenomenological method are useful for nursing both to develop knowledge and to guide practice. Qualitative-phenomenological and quantitative-positivistic research are complementary: the first guides clinicians towards a person-centered approach, the second allows the manipulation of phenomena which can damage health, worsen illness or decrease the quality of life of people who rely on nursing care.

  13. Statistical Methodologies to Integrate Experimental and Computational Research

    NASA Technical Reports Server (NTRS)

    Parker, P. A.; Johnson, R. T.; Montgomery, D. C.

    2008-01-01

    Development of advanced algorithms for simulating engine flow paths requires the integration of fundamental experiments with the validation of enhanced mathematical models. In this paper, we provide an overview of statistical methods to strategically and efficiently conduct experiments and computational model refinement. Moreover, the integration of experimental and computational research efforts is emphasized. With a statistical engineering perspective, scientific and engineering expertise is combined with statistical sciences to gain deeper insights into experimental phenomena and code development performance, supporting the overall research objectives. The particular statistical methods discussed are design of experiments, response surface methodology, and uncertainty analysis and planning. Their application is illustrated with a coaxial free jet experiment and a turbulence model refinement investigation. Our goal is to provide an overview, focusing on concepts rather than practice, to demonstrate the benefits of using statistical methods in research and development, thereby encouraging their broader and more systematic application.

  14. A design of experiments approach to validation sampling for logistic regression modeling with error-prone medical records.

    PubMed

    Ouyang, Liwen; Apley, Daniel W; Mehrotra, Sanjay

    2016-04-01

    Electronic medical record (EMR) databases offer significant potential for developing clinical hypotheses and identifying disease risk associations by fitting statistical models that capture the relationship between a binary response variable and a set of predictor variables that represent clinical, phenotypical, and demographic data for the patient. However, EMR response data may be error prone for a variety of reasons. Performing a manual chart review to validate data accuracy is time consuming, which limits the number of chart reviews in a large database. The authors' objective is to develop a new design-of-experiments-based systematic chart validation and review (DSCVR) approach that is more powerful than the random validation sampling used in existing approaches. The DSCVR approach judiciously and efficiently selects the cases to validate (i.e., validate whether the response values are correct for those cases) for maximum information content, based only on their predictor variable values. The final predictive model will be fit using only the validation sample, ignoring the remainder of the unvalidated and unreliable error-prone data. A Fisher information based D-optimality criterion is used, and an algorithm for optimizing it is developed. The authors' method is tested in a simulation comparison that is based on a sudden cardiac arrest case study with 23 041 patients' records. This DSCVR approach, using the Fisher information based D-optimality criterion, results in a fitted model with much better predictive performance, as measured by the receiver operating characteristic curve and the accuracy in predicting whether a patient will experience the event, than a model fitted using a random validation sample. The simulation comparisons demonstrate that this DSCVR approach can produce predictive models that are significantly better than those produced from random validation sampling, especially when the event rate is low. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
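
    The D-optimal selection step described above can be sketched as follows. This is not the authors' DSCVR code; it is a hedged illustration of greedily choosing which records to validate so that the logistic-regression Fisher information (X'WX evaluated at a pilot coefficient estimate) has maximal log-determinant. The pilot coefficients, ridge term and data sizes are assumptions made for the example.

      # Greedy D-optimal validation-sample selection sketch for logistic regression.
      import numpy as np

      def greedy_d_optimal(X, beta_pilot, n_select, ridge=1e-6):
          p = 1.0 / (1.0 + np.exp(-X @ beta_pilot))
          w = p * (1.0 - p)                       # per-record Fisher information weight
          info = ridge * np.eye(X.shape[1])       # small ridge keeps early determinants finite
          chosen, remaining = [], list(range(X.shape[0]))
          for _ in range(n_select):
              best_i, best_logdet = None, -np.inf
              for i in remaining:
                  sign, logdet = np.linalg.slogdet(info + w[i] * np.outer(X[i], X[i]))
                  if sign > 0 and logdet > best_logdet:
                      best_i, best_logdet = i, logdet
              info += w[best_i] * np.outer(X[best_i], X[best_i])
              chosen.append(best_i)
              remaining.remove(best_i)
          return chosen

      rng = np.random.default_rng(0)
      X = np.column_stack([np.ones(200), rng.normal(size=(200, 3))])  # intercept + 3 predictors
      beta_pilot = np.array([-2.0, 0.5, -0.3, 0.8])                   # assumed pilot estimate
      validate_first = greedy_d_optimal(X, beta_pilot, n_select=20)
      print(validate_first[:5])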

  15. Diagnostic accuracy of different caries risk assessment methods. A systematic review.

    PubMed

    Senneby, Anna; Mejàre, Ingegerd; Sahlin, Nils-Eric; Svensäter, Gunnel; Rohlin, Madeleine

    2015-12-01

    To evaluate the accuracy of different methods used to identify individuals with increased risk of developing dental coronal caries. Studies on the following methods were included: previous caries experience, tests using microbiota, buffering capacity, salivary flow rate, oral hygiene, dietary habits and sociodemographic variables. QUADAS-2 was used to assess risk of bias. Sensitivity, specificity, predictive values, and likelihood ratios (LR) were calculated. Quality of evidence based on ≥3 studies of a method was rated according to GRADE. PubMed, Cochrane Library, Web of Science and reference lists of included publications were searched up to January 2015. From 5776 identified articles, 18 were included. Assessment of study quality identified methodological limitations concerning study design, test technology and reporting. No study presented low risk of bias in all domains. Three or more studies were found only for previous caries experience and salivary mutans streptococci, and quality of evidence for these methods was low. Evidence regarding other methods was lacking. For previous caries experience, sensitivity ranged between 0.21 and 0.94 and specificity between 0.20 and 1.0. Tests using salivary mutans streptococci resulted in low sensitivity and high specificity. For children with primary teeth at baseline, pooled LR for a positive test was 3 for previous caries experience and 4 for salivary mutans streptococci, given a threshold ≥10^5 CFU/ml. Evidence on the validity of the analysed methods used for caries risk assessment is limited. As methodological quality was low, there is a need to improve study design. Low validity of the analysed methods may lead to patients with increased risk not being identified, whereas some are falsely identified as being at risk. As caries risk assessment guides individualized decisions on interventions and intervals for patient recall, improved performance based on best evidence is greatly needed. Copyright © 2015 Elsevier Ltd. All rights reserved.

  16. Image analysis method for the measurement of water saturation in a two-dimensional experimental flow tank

    NASA Astrophysics Data System (ADS)

    Belfort, Benjamin; Weill, Sylvain; Lehmann, François

    2017-07-01

    A novel, non-invasive imaging technique is proposed that determines 2D maps of water content in unsaturated porous media. This method directly relates digitally measured intensities to the water content of the porous medium. It requires the classical image analysis steps, i.e., normalization, filtering, background subtraction, scaling and calibration. The main advantages of this approach are that no separate calibration experiment is needed, because the calibration curve relating water content to reflected light intensity is established during the main monitoring phase of each experiment, and that no tracer or dye is injected into the flow tank. The procedure enables effective processing of a large number of photographs and thus produces 2D water content maps at high temporal resolution. A drainage/imbibition experiment in a 2D flow tank with inner dimensions of 40 cm × 14 cm × 6 cm (L × W × D) is carried out to validate the methodology. The accuracy of the proposed approach is assessed using a statistical framework to perform an error analysis and numerical simulations with a state-of-the-art computational code that solves the Richards' equation. Comparison of the cumulative mass leaving and entering the flow tank and of the water content maps produced by the photographic measurement technique and the numerical simulations demonstrates the efficiency and high accuracy of the proposed method for investigating vadose zone flow processes. Finally, the photometric procedure has been developed expressly with its extension to heterogeneous media in mind. Other processes may be investigated through different laboratory experiments, which will serve as benchmarks for numerical code validation.
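
    A hedged sketch of the kind of processing chain listed above (normalization, filtering, background subtraction, calibration) is given below. It is not the authors' procedure: the array shapes, the median filter size and the linear calibration form are illustrative assumptions standing in for the calibration curve established during monitoring.

      # Illustrative intensity-to-saturation pipeline (assumed linear calibration).
      import numpy as np
      from scipy.ndimage import median_filter

      def intensity_to_saturation(image, reference_white, background, a, b):
          norm = image.astype(float) / np.clip(reference_white, 1e-6, None)  # flat-field normalization
          norm = median_filter(norm, size=3)                                 # suppress pixel noise
          signal = norm - background                                         # remove dry-state background
          return np.clip(a * signal + b, 0.0, 1.0)                           # calibrated, bounded map

      # Toy usage with synthetic arrays standing in for photographs of the tank.
      rng = np.random.default_rng(1)
      img = rng.uniform(50.0, 200.0, size=(140, 400))
      white = np.full_like(img, 220.0)
      dry = np.full_like(img, 0.2)
      saturation_map = intensity_to_saturation(img, white, dry, a=1.5, b=0.0)
      print(saturation_map.mean())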

  17. Computational discovery and in vivo validation of hnf4 as a regulatory gene in planarian regeneration.

    PubMed

    Lobo, Daniel; Morokuma, Junji; Levin, Michael

    2016-09-01

    Automated computational methods can infer dynamic regulatory network models directly from temporal and spatial experimental data, such as genetic perturbations and their resultant morphologies. Recently, a computational method was able to reverse-engineer the first mechanistic model of planarian regeneration that can recapitulate the main anterior-posterior patterning experiments published in the literature. Validating this comprehensive regulatory model via novel experiments that had not yet been performed would add to our understanding of the remarkable regeneration capacity of planarian worms and demonstrate the power of this automated methodology. Using the Michigan Molecular Interactions and STRING databases and the MoCha software tool, we characterized as hnf4 an unknown regulatory gene predicted to exist by the reverse-engineered dynamic model of planarian regeneration. Then, we used the dynamic model to predict the morphological outcomes under different single and multiple knock-downs (RNA interference) of hnf4 and its predicted gene pathway interactors β-catenin and hh. Interestingly, the model predicted that RNAi of hnf4 would rescue the abnormal regenerated phenotype (tailless) of RNAi of hh in amputated trunk fragments. Finally, we validated these predictions in vivo by performing the same surgical and genetic experiments with planarian worms, obtaining the same phenotypic outcomes predicted by the reverse-engineered model. These results suggest that hnf4 is a regulatory gene in planarian regeneration, validate the computational predictions of the reverse-engineered dynamic model, and demonstrate the automated methodology for the discovery of novel genes, pathways and experimental phenotypes. michael.levin@tufts.edu. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  18. The Arthroscopic Surgical Skill Evaluation Tool (ASSET).

    PubMed

    Koehler, Ryan J; Amsdell, Simon; Arendt, Elizabeth A; Bisson, Leslie J; Braman, Jonathan P; Bramen, Jonathan P; Butler, Aaron; Cosgarea, Andrew J; Harner, Christopher D; Garrett, William E; Olson, Tyson; Warme, Winston J; Nicandri, Gregg T

    2013-06-01

    Surgeries employing arthroscopic techniques are among the most commonly performed in orthopaedic clinical practice; however, valid and reliable methods of assessing the arthroscopic skill of orthopaedic surgeons are lacking. The Arthroscopic Surgery Skill Evaluation Tool (ASSET) will demonstrate content validity, concurrent criterion-oriented validity, and reliability when used to assess the technical ability of surgeons performing diagnostic knee arthroscopic surgery on cadaveric specimens. Cross-sectional study; Level of evidence, 3. Content validity was determined by a group of 7 experts using the Delphi method. Intra-articular performance of a right and left diagnostic knee arthroscopic procedure was recorded for 28 residents and 2 sports medicine fellowship-trained attending surgeons. Surgeon performance was assessed by 2 blinded raters using the ASSET. Concurrent criterion-oriented validity, interrater reliability, and test-retest reliability were evaluated. Content validity: The content development group identified 8 arthroscopic skill domains to evaluate using the ASSET. Concurrent criterion-oriented validity: Significant differences in the total ASSET score (P < .05) between novice, intermediate, and advanced experience groups were identified. Interrater reliability: The ASSET scores assigned by each rater were strongly correlated (r = 0.91, P < .01), and the intraclass correlation coefficient between raters for the total ASSET score was 0.90. Test-retest reliability: There was a significant correlation between ASSET scores for both procedures attempted by each surgeon (r = 0.79, P < .01). The ASSET appears to be a useful, valid, and reliable method for assessing surgeon performance of diagnostic knee arthroscopic surgery in cadaveric specimens. Studies are ongoing to determine its generalizability to other procedures as well as to the live operating room and other simulated environments.

  19. Ego-Dissolution and Psychedelics: Validation of the Ego-Dissolution Inventory (EDI)

    PubMed Central

    Nour, Matthew M.; Evans, Lisa; Nutt, David; Carhart-Harris, Robin L.

    2016-01-01

    Aims: The experience of a compromised sense of “self”, termed ego-dissolution, is a key feature of the psychedelic experience. This study aimed to validate the Ego-Dissolution Inventory (EDI), a new 8-item self-report scale designed to measure ego-dissolution. Additionally, we aimed to investigate the specificity of the relationship between psychedelics and ego-dissolution. Method: Sixteen items relating to altered ego-consciousness were included in an internet questionnaire; eight relating to the experience of ego-dissolution (comprising the EDI), and eight relating to the antithetical experience of increased self-assuredness, termed ego-inflation. Items were rated using a visual analog scale. Participants answered the questionnaire for experiences with classical psychedelic drugs, cocaine and/or alcohol. They also answered the seven questions from the Mystical Experiences Questionnaire (MEQ) relating to the experience of unity with one’s surroundings. Results: Six hundred and ninety-one participants completed the questionnaire, providing data for 1828 drug experiences (1043 psychedelics, 377 cocaine, 408 alcohol). Exploratory factor analysis demonstrated that the eight EDI items loaded exclusively onto a single common factor, which was orthogonal to a second factor comprised of the items relating to ego-inflation (rho = −0.110), demonstrating discriminant validity. The EDI correlated strongly with the MEQ-derived measure of unitive experience (rho = 0.735), demonstrating convergent validity. EDI internal consistency was excellent (Cronbach’s alpha 0.93). Three analyses confirmed the specificity of ego-dissolution for experiences occasioned by psychedelic drugs. Firstly, EDI score correlated with drug-dose for psychedelic drugs (rho = 0.371), but not for cocaine (rho = 0.115) or alcohol (rho = −0.055). Secondly, the linear regression line relating the subjective intensity of the experience to ego-dissolution was significantly steeper for psychedelics (unstandardized regression coefficient = 0.701) compared with cocaine (0.135) or alcohol (0.144). Ego-inflation, by contrast, was specifically associated with cocaine experiences. Finally, a binary Support Vector Machine classifier identified experiences occasioned by psychedelic drugs vs. cocaine or alcohol with over 85% accuracy using ratings of ego-dissolution and ego-inflation alone. Conclusion: Our results demonstrate the psychometric structure, internal consistency and construct validity of the EDI. Moreover, we demonstrate the close relationship between ego-dissolution and the psychedelic experience. The EDI will facilitate the study of the neuronal correlates of ego-dissolution, which is relevant for psychedelic-assisted psychotherapy and our understanding of psychosis. PMID:27378878

  20. Double regions growing algorithm for automated satellite image mosaicking

    NASA Astrophysics Data System (ADS)

    Tan, Yihua; Chen, Chen; Tian, Jinwen

    2011-12-01

    Feathering is the most widely used method in seamless satellite image mosaicking. A simple but effective algorithm, the double regions growing (DRG) algorithm, which utilizes the shape content of images' valid regions, is proposed for generating a robust feathering line before feathering. It works without any human intervention, and experiments on real satellite images show the advantages of the proposed method.

  1. An Insider Perspective on Implementing the Harvard Case Study Method in Business Teaching

    ERIC Educational Resources Information Center

    Rebeiz, Karim S.

    2011-01-01

    This paper provides practical guidance on the implementation of the CSM (case study method) using the HBS (Harvard Business School) model. The analysis is based on the first-hand experience of the author as a user and implementer of this mode of instruction. The results are further validated with surveys given to MBA (Master of Business…

  2. The concurrent validity and reliability of a low-cost, high-speed camera-based method for measuring the flight time of vertical jumps.

    PubMed

    Balsalobre-Fernández, Carlos; Tejero-González, Carlos M; del Campo-Vecino, Juan; Bavaresco, Nicolás

    2014-02-01

    Flight time is the most accurate and frequently used variable when assessing the height of vertical jumps. The purpose of this study was to analyze the validity and reliability of an alternative method (i.e., the HSC-Kinovea method) for measuring the flight time and height of vertical jumping using a low-cost high-speed Casio Exilim FH-25 camera (HSC). To this end, 25 subjects performed a total of 125 vertical jumps on an infrared (IR) platform while simultaneously being recorded with a HSC at 240 fps. Subsequently, 2 observers with no experience in video analysis analyzed the 125 videos independently using the open-license Kinovea 0.8.15 software. The flight times obtained were then converted into vertical jump heights, and the intraclass correlation coefficient (ICC), Bland-Altman plot, and Pearson correlation coefficient were calculated for those variables. The results showed a perfect correlation agreement (ICC = 1, p < 0.0001) between both observers' measurements of flight time and jump height and a highly reliable agreement (ICC = 0.997, p < 0.0001) between the observers' measurements of flight time and jump height using the HSC-Kinovea method and those obtained using the IR system, thus explaining 99.5% (p < 0.0001) of the differences (shared variance) obtained using the IR platform. As a result, besides requiring no previous experience in the use of this technology, the HSC-Kinovea method can be considered to provide similarly valid and reliable measurements of flight time and vertical jump height as more expensive equipment (i.e., IR). As such, coaches from many sports could use the HSC-Kinovea method to measure the flight time and height of their athlete's vertical jumps.
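
    For readers unfamiliar with the flight-time-to-height conversion mentioned above, the relation follows from constant gravitational acceleration (the body rises for half the flight time), so h = g·t²/8. The short sketch below simply encodes that standard relation; it is not code from the study.

      # Standard flight-time-to-height conversion for a vertical jump: h = g * t**2 / 8.
      def jump_height_from_flight_time(flight_time_s, g=9.81):
          return g * flight_time_s ** 2 / 8.0

      # Example: a 0.50 s flight time corresponds to roughly 0.31 m.
      print(round(jump_height_from_flight_time(0.50), 3))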

  3. IEEE/NASA Workshop on Leveraging Applications of Formal Methods, Verification, and Validation

    NASA Technical Reports Server (NTRS)

    Margaria, Tiziana (Editor); Steffen, Bernhard (Editor); Hinchey, Michael G.

    2005-01-01

    This volume contains the Preliminary Proceedings of the 2005 IEEE ISoLA Workshop on Leveraging Applications of Formal Methods, Verification, and Validation, with a special track on the theme of Formal Methods in Human and Robotic Space Exploration. The workshop was held on 23-24 September 2005 at the Loyola College Graduate Center, Columbia, MD, USA. The idea behind the Workshop arose from the experience and feedback of ISoLA 2004, the 1st International Symposium on Leveraging Applications of Formal Methods held in Paphos (Cyprus) last October-November. ISoLA 2004 served the need of providing a forum for developers, users, and researchers to discuss issues related to the adoption and use of rigorous tools and methods for the specification, analysis, verification, certification, construction, test, and maintenance of systems from the point of view of their different application domains.

  4. Method validation for simultaneous determination of chromium, molybdenum and selenium in infant formulas by ICP-OES and ICP-MS.

    PubMed

    Khan, Naeem; Jeong, In Seon; Hwang, In Min; Kim, Jae Sung; Choi, Sung Hwa; Nho, Eun Yeong; Choi, Ji Yeon; Kwak, Byung-Man; Ahn, Jang-Hyuk; Yoon, Taehyung; Kim, Kyong Su

    2013-12-15

    This study aimed to validate the analytical method for simultaneous determination of chromium (Cr), molybdenum (Mo), and selenium (Se) in infant formulas available in South Korea. Various digestion methods (dry-ashing, wet-digestion and microwave) were evaluated for sample preparation, and both inductively coupled plasma optical emission spectrometry (ICP-OES) and inductively coupled plasma mass spectrometry (ICP-MS) were compared for analysis. The analytical techniques were validated by detection limits, precision, accuracy and recovery experiments. Results showed that the wet-digestion and microwave methods gave satisfactory results for sample preparation, while ICP-MS was found to be a more sensitive and effective technique than ICP-OES. The recoveries (%) of Se, Mo and Cr were 40.9, 109.4 and 0 by ICP-OES, compared to 99.1, 98.7 and 98.4, respectively, by ICP-MS. The contents of Cr, Mo and Se in infant formulas determined by ICP-MS were at good nutritional levels, in accordance with the CODEX nutrient standards for infant formulas. Copyright © 2013 Elsevier Ltd. All rights reserved.

  5. An Approach for Validating Actinide and Fission Product Burnup Credit Criticality Safety Analyses: Criticality (keff) Predictions

    DOE PAGES

    Scaglione, John M.; Mueller, Don E.; Wagner, John C.

    2014-12-01

    One of the most important remaining challenges associated with expanded implementation of burnup credit in the United States is the validation of depletion and criticality calculations used in the safety evaluation—in particular, the availability and use of applicable measured data to support validation, especially for fission products (FPs). Applicants and regulatory reviewers have been constrained by both a scarcity of data and a lack of clear technical basis or approach for use of the data. This paper describes a validation approach for commercial spent nuclear fuel (SNF) criticality safety (keff) evaluations based on best-available data and methods and applies the approach for representative SNF storage and transport configurations/conditions to demonstrate its usage and applicability, as well as to provide reference bias results. The criticality validation approach utilizes not only available laboratory critical experiment (LCE) data from the International Handbook of Evaluated Criticality Safety Benchmark Experiments and the French Haut Taux de Combustion program to support validation of the principal actinides but also calculated sensitivities, nuclear data uncertainties, and limited available FP LCE data to predict and verify individual biases for relevant minor actinides and FPs. The results demonstrate that (a) sufficient critical experiment data exist to adequately validate keff calculations via conventional validation approaches for the primary actinides, (b) sensitivity-based critical experiment selection is more appropriate for generating accurate application model bias and uncertainty, and (c) calculated sensitivities and nuclear data uncertainties can be used for generating conservative estimates of bias for minor actinides and FPs. Results based on the SCALE 6.1 and the ENDF/B-VII.0 cross-section libraries indicate that a conservative estimate of the bias for the minor actinides and FPs is 1.5% of their worth within the application model. Finally, this paper provides a detailed description of the approach and its technical bases, describes the application of the approach for representative pressurized water reactor and boiling water reactor safety analysis models, and provides reference bias results based on the prerelease SCALE 6.1 code package and ENDF/B-VII nuclear cross-section data.

  6. Overall uncertainty measurement for near infrared analysis of cryptotanshinone in tanshinone extract

    NASA Astrophysics Data System (ADS)

    Xue, Zhong; Xu, Bing; Shi, Xinyuan; Yang, Chan; Cui, Xianglong; Luo, Gan; Qiao, Yanjiang

    2017-01-01

    This study presented a new strategy of overall uncertainty measurement for near infrared (NIR) quantitative analysis of cryptotanshinone in tanshinone extract powders. The overall uncertainty of NIR analysis from validation data of precision, trueness and robustness studies was fully investigated and discussed. Quality by design (QbD) elements, such as risk assessment and design of experiments (DOE), were utilized to organize the validation data. An "I × J × K" (series I, number of repetitions J and level of concentrations K) full factorial design was used to calculate uncertainty from the precision and trueness data. A 2^(7-4) Plackett-Burman matrix with four different influence factors resulting from the failure mode and effect analysis (FMEA) was used for the robustness study. The overall uncertainty profile was introduced as a graphical decision-making tool to evaluate the validity of the NIR method over the predefined concentration range. In comparison with T. Saffaj's method (Analyst, 2013, 138, 4677) for overall uncertainty assessment, the proposed approach gave almost the same results, demonstrating that the proposed method was reasonable and valid. Moreover, the proposed method can help identify critical factors that influence the NIR prediction performance, which could be used for further optimization of the NIR analytical procedures in routine use.
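
    One simple way to picture how precision and trueness data from such an I × J × K layout can be pooled into an overall uncertainty is sketched below. This is a hedged simplification, not the paper's procedure or Saffaj's tolerance-interval approach: it combines a one-way ANOVA within-series component, a between-series component and a bias term at a single concentration level, with invented example numbers.

      # Simplified combined-uncertainty sketch at one concentration level (illustrative only).
      import numpy as np

      def overall_uncertainty(results, reference):
          """results: shape (I, J), one validation series per row, at a single level."""
          J = results.shape[1]
          ms_within = results.var(axis=1, ddof=1).mean()            # repeatability variance
          ms_between = J * results.mean(axis=1).var(ddof=1)         # between-series mean square
          var_between = max((ms_between - ms_within) / J, 0.0)      # one-way ANOVA component
          bias = results.mean() - reference                         # trueness contribution
          return np.sqrt(ms_within + var_between + bias**2)

      level = np.array([[98.7, 99.4, 100.1],
                        [101.2, 100.3, 99.8],
                        [97.9, 98.8, 99.5]])
      print(overall_uncertainty(level, reference=100.0))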

  7. Using L-M BP Algorithm to Forecast the 305 Days Production of First-Breed Dairy

    NASA Astrophysics Data System (ADS)

    Wei, Xiaoli; Qi, Guoqiang; Shen, Weizheng; Jian, Sun

    To address the shortcomings of the conventional BP algorithm, a BP neural network improved by the L-M (Levenberg-Marquardt) algorithm is put forward. On the basis of this network, a prediction model for 305-day milk production was set up. Traditional methods need at least 305 days to obtain these data, but this model can forecast a first-breed dairy cow's 305-day milk production 215 days in advance. The validity of the improved BP neural network predictive model was validated through experiments.
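
    As a hedged illustration of the Levenberg-Marquardt idea behind an L-M BP network (not the authors' model), the sketch below fits a small one-hidden-layer network with SciPy's Levenberg-Marquardt least-squares solver; the synthetic features, targets and layer sizes are invented for the example.

      # Toy one-hidden-layer network trained with Levenberg-Marquardt (illustrative only).
      import numpy as np
      from scipy.optimize import least_squares

      rng = np.random.default_rng(0)
      X = rng.normal(size=(60, 3))          # e.g. early-lactation test-day features (assumed)
      y = 3000 + 400 * X[:, 0] - 150 * X[:, 1] + rng.normal(scale=50, size=60)

      n_in, n_hidden = 3, 4

      def unpack(theta):
          w1 = theta[:n_in * n_hidden].reshape(n_in, n_hidden)
          b1 = theta[n_in * n_hidden:n_in * n_hidden + n_hidden]
          w2 = theta[-(n_hidden + 1):-1]
          b2 = theta[-1]
          return w1, b1, w2, b2

      def residuals(theta):
          w1, b1, w2, b2 = unpack(theta)
          hidden = np.tanh(X @ w1 + b1)
          return hidden @ w2 + b2 - y

      theta0 = rng.normal(scale=0.1, size=n_in * n_hidden + 2 * n_hidden + 1)
      fit = least_squares(residuals, theta0, method="lm")   # Levenberg-Marquardt solver
      print(fit.cost)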

  8. Topological charge number multiplexing for JTC multiple-image encryption

    NASA Astrophysics Data System (ADS)

    Chen, Qi; Shen, Xueju; Dou, Shuaifeng; Lin, Chao; Wang, Long

    2018-04-01

    We propose a method of topological charge number multiplexing based on the JTC encryption system to achieve multiple-image encryption. Using this method, multiple images can be encrypted into a single ciphertext, and the original images can be recovered according to authority level. The number of images that can be encrypted is increased; moreover, the quality of the decrypted images is improved. Results of computer simulations and an initial experiment confirm the validity of the proposed method.

  9. [Reduction of animal experiments in experimental drug testing].

    PubMed

    Behrensdorf-Nicol, H; Krämer, B

    2014-10-01

    In order to ensure the quality of biomedical products, an experimental test for every single manufactured batch is required for many products. Especially in vaccine testing, animal experiments are traditionally used for this purpose. For example, efficacy is often determined via challenge experiments in laboratory animals. Safety tests of vaccine batches are also mostly performed using laboratory animals. However, many animal experiments have clear inherent disadvantages (low accuracy, questionable transferability to humans, unclear significance). Furthermore, for ethical reasons and animal welfare aspects animal experiments are also seen very critical by the public. Therefore, there is a strong trend towards replacing animal experiments with methods in which no animals are used ("replacement"). If a replacement is not possible, the required animal experiments should be improved in order to minimize the number of animals necessary ("reduction") and to reduce pain and suffering caused by the experiment to a minimum ("refinement"). This "3R concept" is meanwhile firmly established in legislature. In recent years many mandatory animal experiments have been replaced by alternative in vitro methods or improved according to the 3R principles; numerous alternative methods are currently under development. Nevertheless, the process from the development of a new method to its legal implementation takes a long time. Therefore, supplementary regulatory measures to facilitate validation and acceptance of new alternative methods could contribute to a faster and more consequent implementation of the 3R concept in the testing of biomedical products.

  10. A Differential Scanning Calorimetry Method for Construction of Continuous Cooling Transformation Diagram of Blast Furnace Slag

    NASA Astrophysics Data System (ADS)

    Gan, Lei; Zhang, Chunxia; Shangguan, Fangqin; Li, Xiuping

    2012-06-01

    The continuous cooling crystallization of a blast furnace slag was studied by the application of the differential scanning calorimetry (DSC) method. A kinetic model describing the evolution of the degree of crystallization with time was obtained. Bulk cooling experiments of the molten slag, coupled with numerical simulation of heat transfer, were conducted to validate the results of the DSC method. The degrees of crystallization of the samples from the bulk cooling experiments were estimated by means of X-ray diffraction (XRD) and the DSC method. It was found that the results from the DSC cooling and bulk cooling experiments are in good agreement. The continuous cooling transformation (CCT) diagram of the blast furnace slag was constructed according to the crystallization kinetic model and experimental data. The obtained CCT diagram is characterized by two crystallization noses in different temperature ranges.

  11. Development and validation of the BRIGHTLIGHT Survey, a patient-reported experience measure for young people with cancer.

    PubMed

    Taylor, Rachel M; Fern, Lorna A; Solanki, Anita; Hooker, Louise; Carluccio, Anna; Pye, Julia; Jeans, David; Frere-Smith, Tom; Gibson, Faith; Barber, Julie; Raine, Rosalind; Stark, Dan; Feltbower, Richard; Pearce, Susie; Whelan, Jeremy S

    2015-07-28

    Patient experience is increasingly used as an indicator of high-quality care in addition to more traditional clinical end-points. Surveys are generally accepted as appropriate methodology to capture patient experience. No validated patient experience surveys exist specifically for adolescents and young adults (AYA) aged 13-24 years at diagnosis with cancer. This paper describes early work undertaken to develop and validate a descriptive patient experience survey for AYA with cancer that encompasses both their cancer experience and age-related issues. We aimed to develop, with young people, an experience survey meaningful and relevant to AYA to be used in a longitudinal cohort study (BRIGHTLIGHT), ensuring high levels of acceptability to maximise study retention. A three-stage approach was employed: Stage 1 involved developing a conceptual framework, conducting literature/Internet searches and establishing content validity of the survey; Stage 2 confirmed the acceptability of methods of administration and consisted of four focus groups involving 11 young people (14-25 years), three parents and two siblings; and Stage 3 established survey comprehension through telephone-administered cognitive interviews with a convenience sample of 23 young people aged 14-24 years. Stage 1: Two hundred and thirty-eight questions were developed from qualitative reports of young people's cancer and treatment-related experience. Stage 2: The focus groups identified three core themes: (i) issues directly affecting young people, e.g. impact of treatment-related fatigue on ability to complete survey; (ii) issues relevant to the actual survey, e.g. ability to answer questions anonymously; (iii) administration issues, e.g. confusing format in some supporting documents. Stage 3: Cognitive interviews indicated high levels of comprehension requiring minor survey amendments. Collaborating with young people with cancer has enabled a survey to be developed that is both meaningful to young people and examines patient experience and outcomes associated with specialist cancer care. Engagement of young people throughout the survey development has ensured the content appropriately reflects their experience and is easily understood. The BRIGHTLIGHT survey was developed for a specific research project but has the potential to be used as a TYA cancer survey to assess patient experience and the care they receive.

  12. Libet's experiment: Questioning the validity of measuring the urge to move.

    PubMed

    Dominik, Tomáš; Dostál, Daniel; Zielina, Martin; Šmahaj, Jan; Sedláčková, Zuzana; Procházka, Roman

    2017-03-01

    The time of subjectively registered urge to move (W) constituted the central point of most Libet-style experiments. It is therefore crucial to verify the W validity. Our experiment was based on the assumption that the W time is inferred, rather than introspectively perceived. We used the rotating spot method to gather the W reports together with the reports of the subjective timing of actual movement (M). The subjects were assigned the tasks in two different orders. When measured as first in the respective session, no significant difference between W and M values was found, which suggests that uninformed subjects tend to confuse W for M reports. Moreover, we found that W values measured after the M task were significantly earlier than W values measured before M. This phenomenon suggests that the apparent difference between W and M values is in fact caused by the subjects' previous experience with M measurements. Copyright © 2017 Elsevier Inc. All rights reserved.

  13. Experiences of mothers of infants with congenital heart disease before, during, and after complex cardiac surgery.

    PubMed

    Harvey, Kayla A; Kovalesky, Andrea; Woods, Ronald K; Loan, Lori A

    2013-01-01

    Experiences of mothers of infants undergoing complex heart surgery were explored to build evidence-based family-centered interventions. Congenital heart disease is the most frequent birth defect in the United States and is common worldwide. Eight mothers recalled through journal entries their experiences of the days before, during, and after their infant's surgery and shared advice for other mothers. Colaizzi's phenomenological method was utilized for data analysis. A validation survey of seven additional mothers from a support group occurred via email. Six themes were identified and validated: Feeling Intense Fluctuating Emotion; Navigating the Medical World; Dealing with the Unknown; Facing the Possibility of My Baby Dying; Finding Meaning and Spiritual Connection; and the umbrella theme of Mothering Through It All. Through a clearer understanding of experiences as described by mothers, health-care providers may gain insight as to how to better support mothers of infants undergoing heart surgery. Copyright © 2013 Elsevier Inc. All rights reserved.

  14. The Impact of the Mode of Thought in Complex Decisions: Intuitive Decisions are Better

    PubMed Central

    Usher, Marius; Russo, Zohar; Weyers, Mark; Brauner, Ran; Zakay, Dan

    2011-01-01

    A number of recent studies have reported that decision quality is enhanced under conditions of inattention or distraction (unconscious thought; Dijksterhuis, 2004; Dijksterhuis and Nordgren, 2006; Dijksterhuis et al., 2006). These reports have generated considerable controversy, for both experimental (problems of replication) and theoretical reasons (interpretation). Here we report the results of four experiments. The first experiment replicates the unconscious thought effect, under conditions that validate and control the subjective criterion of decision quality. The second and third experiments examine the impact of a mode of thought manipulation (without distraction) on decision quality in immediate decisions. Here we find that intuitive or affective manipulations improve decision quality compared to analytic/deliberation manipulations. The fourth experiment combines the two methods (distraction and mode of thought manipulations) and demonstrates enhanced decision quality, in a situation that attempts to preserve ecological validity. The results are interpreted within a framework that is based on two interacting subsystems of decision-making: an affective/intuition based system and an analytic/deliberation system. PMID:21716605

  15. The Significance of Meaning: Why Do Over 90% of Behavioral Neuroscience Results Fail to Translate to Humans, and What Can We Do to Fix It?

    PubMed Central

    Garner, Joseph P.

    2014-01-01

    The vast majority of drugs entering human trials fail. This problem (called “attrition”) is widely recognized as a public health crisis, and has been discussed openly for the last two decades. Multiple recent reviews argue that animals may be just too different physiologically, anatomically, and psychologically from humans to be able to predict human outcomes, essentially questioning the justification of basic biomedical research in animals. This review argues instead that the philosophy and practice of experimental design and analysis is so different in basic animal work and human clinical trials that an animal experiment (as currently conducted) cannot reasonably predict the outcome of a human trial. Thus, attrition does reflect a lack of predictive validity of animal experiments, but it would be a tragic mistake to conclude that animal models cannot show predictive validity. A variety of contributing factors to poor validity are reviewed. The need to adopt methods and models that are highly specific (i.e., which can identify true negative results) in order to complement the current preponderance of highly sensitive methods (which are prone to false positive results) is emphasized. Concepts in biomarker-based medicine are offered as a potential solution, and changes in the use of animal models required to embrace a translational biomarker-based approach are outlined. In essence, this review advocates a fundamental shift, where we treat every aspect of an animal experiment that we can as if it was a clinical trial in a human population. However, it is unrealistic to expect researchers to adopt a new methodology that cannot be empirically justified until a successful human trial. “Validation with known failures” is proposed as a solution. Thus new methods or models can be compared against existing ones using a drug that has translated (a known positive) and one that has failed (a known negative). Current methods should incorrectly identify both as effective, but a more specific method should identify the negative compound correctly. By using a library of known failures we can thereby empirically test the impact of suggested solutions such as enrichment, controlled heterogenization, biomarker-based models, or reverse-translated measures. PMID:25541546

  16. Measurement of patient safety: a systematic review of the reliability and validity of adverse event detection with record review

    PubMed Central

    Hanskamp-Sebregts, Mirelle; Zegers, Marieke; Vincent, Charles; van Gurp, Petra J; de Vet, Henrica C W; Wollersheim, Hub

    2016-01-01

    Objectives Record review is the most widely used method to quantify patient safety. We systematically reviewed the reliability and validity of adverse event detection with record review. Design A systematic review of the literature. Methods We searched PubMed, EMBASE, CINAHL, PsycINFO and the Cochrane Library from their inception through February 2015. We included all studies that aimed to describe the reliability and/or validity of record review. Two reviewers conducted data extraction. We pooled kappa (κ) values and analysed the differences in subgroups according to the number of reviewers, reviewer experience and training level, adjusted for the prevalence of adverse events. Results In 25 studies, the psychometric data of the Global Trigger Tool (GTT) and the Harvard Medical Practice Study (HMPS) were reported and 24 studies were included for statistical pooling. The inter-rater reliability of the GTT and HMPS showed a pooled κ of 0.65 and 0.55, respectively. The inter-rater agreement was statistically significantly higher when the group of reviewers within a study consisted of at most five reviewers. We found no studies reporting on the validity of the GTT and HMPS. Conclusions The reliability of record review is moderate to substantial and improved when a small group of reviewers carried out the record review. The validity of the record review method has never been evaluated, while clinical data registries, autopsy or direct observations of patient care are potential reference methods that can be used to test concurrent validity. PMID:27550650
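
    For readers unfamiliar with the pooled statistic above, the following is a minimal sketch (not the authors' code) of Cohen's κ for two record reviewers and a crude weight-averaged pooling across studies; the agreement table, κ values and weights are hypothetical.

```python
import numpy as np

def cohens_kappa(table):
    """Cohen's kappa for a square agreement table (reviewer A rows, reviewer B columns)."""
    table = np.asarray(table, dtype=float)
    n = table.sum()
    p_observed = np.trace(table) / n
    p_expected = (table.sum(axis=1) @ table.sum(axis=0)) / n**2
    return (p_observed - p_expected) / (1.0 - p_expected)

def pooled_kappa(kappas, weights):
    """Simple weight-averaged kappa across studies (a crude stand-in for meta-analytic pooling)."""
    kappas, weights = np.asarray(kappas, float), np.asarray(weights, float)
    return float((kappas * weights).sum() / weights.sum())

# Hypothetical 2x2 table: rows = reviewer A (adverse event / none), columns = reviewer B.
example = [[30, 10],
           [8, 152]]
print(round(cohens_kappa(example), 2))
print(round(pooled_kappa([0.65, 0.55], [14, 10]), 2))  # hypothetical study-level values
```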

  17. Parametric study of different contributors to tumor thermal profile

    NASA Astrophysics Data System (ADS)

    Tepper, Michal; Gannot, Israel

    2014-03-01

    Treating cancer is one of the major challenges of modern medicine. There is great interest in assessing tumor development in in vivo animal and human models, as well as in in vitro experiments. Existing methods are either limited by cost and availability or by their low accuracy and reproducibility. Thermography holds the potential of being a noninvasive, low-cost, non-irradiating and easy-to-use method for tumor monitoring. Tumors can be detected in thermal images due to their relatively higher or lower temperature compared to the temperature of the healthy skin surrounding them. Extensive research has been performed to show the validity of thermography as an efficient method for tumor detection and the possibility of extracting tumor properties from thermal images, showing promising results. However, extrapolating from one type of experiment to others is difficult due to the differences in tumor properties, especially between different types of tumors or different species. There is a need for research linking different types of tumor experiments. In this research, a parametric analysis of possible contributors to tumor thermal profiles was performed. The effect of tumor geometric, physical and thermal properties was studied, both independently and together, in phantom model experiments and computer simulations. Theoretical and experimental results were cross-correlated to validate the models used and increase the accuracy of simulated complex tumor models. The contribution of different parameters in various tumor scenarios was estimated and the implication of these differences on the observed thermal profiles was studied. The correlation between animal and human models is discussed.

  18. Isotope Inversion Experiment evaluating the suitability of calibration in surrogate matrix for quantification via LC-MS/MS-Exemplary application for a steroid multi-method.

    PubMed

    Suhr, Anna Catharina; Vogeser, Michael; Grimm, Stefanie H

    2016-05-30

    For reliable quantitative analysis of endogenous analytes in complex biological samples by isotope dilution LC-MS/MS, the creation of appropriate calibrators is a challenge, since analyte-free authentic material is in general not available. Thus, surrogate matrices are often used to prepare calibrators and controls. However, currently employed validation protocols do not include specific experiments to verify the suitability of a surrogate matrix calibration for quantification of authentic matrix samples. The aim of the study was the development of a novel validation experiment to test whether surrogate matrix based calibrators enable correct quantification of authentic matrix samples. The key element of the novel validation experiment is the inversion of nonlabelled analytes and their stable isotope labelled (SIL) counterparts with respect to their functions, i.e. the SIL compound is the analyte and the nonlabelled substance is employed as internal standard. As a consequence, both surrogate and authentic matrix are analyte-free regarding the SIL analytes, which allows a comparison of both matrices. We called this approach the Isotope Inversion Experiment. As a figure of merit we defined the accuracy of inverse quality controls in authentic matrix quantified by means of a surrogate matrix calibration curve. As a proof-of-concept application, a LC-MS/MS assay addressing six corticosteroids (cortisol, cortisone, corticosterone, 11-deoxycortisol, 11-deoxycorticosterone, and 17-OH-progesterone) was chosen. The integration of the Isotope Inversion Experiment in the validation protocol for the steroid assay was successfully realized. The accuracy results of the inverse quality controls were, on the whole, very satisfactory. As a consequence, the suitability of a surrogate matrix calibration for quantification of the targeted steroids in human serum as authentic matrix could be successfully demonstrated. The Isotope Inversion Experiment fills a gap in the validation process for LC-MS/MS assays quantifying endogenous analytes. We consider it a valuable and convenient tool to evaluate the correct quantification of authentic matrix samples based on a calibration curve in surrogate matrix. Copyright © 2016 Elsevier B.V. All rights reserved.
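
    The core accuracy check described above can be illustrated with a small sketch: fit a calibration line from surrogate-matrix calibrators, back-calculate "inverse" quality controls prepared in authentic matrix, and report recovery. All concentrations and peak-area ratios below are invented, and this is not the authors' validation software.

```python
import numpy as np

# Hypothetical calibrator responses in surrogate matrix:
# analyte/IS peak-area ratios versus nominal concentrations (ng/mL).
cal_conc = np.array([1, 5, 10, 50, 100, 250], dtype=float)
cal_ratio = np.array([0.021, 0.102, 0.198, 1.01, 2.02, 5.05])

# Ordinary least-squares calibration line: ratio = slope * conc + intercept.
slope, intercept = np.polyfit(cal_conc, cal_ratio, 1)

def quantify(ratio):
    """Back-calculate concentration from a measured analyte/IS ratio."""
    return (ratio - intercept) / slope

# "Inverse" quality controls prepared in authentic matrix (nominal conc., measured ratio).
qc_nominal = np.array([7.5, 75.0, 200.0])
qc_ratio = np.array([0.155, 1.52, 4.00])

accuracy_pct = 100.0 * quantify(qc_ratio) / qc_nominal
for nom, acc in zip(qc_nominal, accuracy_pct):
    print(f"QC {nom:6.1f} ng/mL: accuracy {acc:5.1f}%")
```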

  19. Verification and Validation of Requirements on the CEV Parachute Assembly System Using Design of Experiments

    NASA Technical Reports Server (NTRS)

    Schulte, Peter Z.; Moore, James W.

    2011-01-01

    The Crew Exploration Vehicle Parachute Assembly System (CPAS) project conducts computer simulations to verify that flight performance requirements on parachute loads and terminal rate of descent are met. Design of Experiments (DoE) provides a systematic method for variation of simulation input parameters. When implemented and interpreted correctly, a DoE study of parachute simulation tools indicates values and combinations of parameters that may cause requirement limits to be violated. This paper describes one implementation of DoE that is currently being developed by CPAS, explains how DoE results can be interpreted, and presents the results of several preliminary studies. The potential uses of DoE to validate parachute simulation models and verify requirements are also explored.
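
    As a toy illustration of how a DoE sweep over simulation inputs can flag requirement violations, the sketch below enumerates a two-level full factorial design. The factor names, levels, requirement limit and the placeholder simulation are all invented; they are not CPAS parameters or models.

```python
from itertools import product

# Hypothetical low/high levels for three simulation inputs (not CPAS values).
factors = {
    "drag_coefficient": (0.7, 0.9),
    "deploy_altitude_ft": (8000.0, 12000.0),
    "vehicle_mass_lb": (18000.0, 22000.0),
}

def run_case(settings):
    """Placeholder for a parachute simulation; returns a fake peak load (lbf)."""
    return 1.0e4 * settings["drag_coefficient"] * settings["vehicle_mass_lb"] / 20000.0

LOAD_LIMIT_LBF = 9000.0  # hypothetical requirement limit

# Two-level full factorial: every combination of low/high levels.
names = list(factors)
for levels in product(*(factors[n] for n in names)):
    settings = dict(zip(names, levels))
    load = run_case(settings)
    flag = "VIOLATION" if load > LOAD_LIMIT_LBF else "ok"
    print(settings, f"peak load {load:8.1f} lbf", flag)
```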

  20. Merlin: Computer-Aided Oligonucleotide Design for Large Scale Genome Engineering with MAGE.

    PubMed

    Quintin, Michael; Ma, Natalie J; Ahmed, Samir; Bhatia, Swapnil; Lewis, Aaron; Isaacs, Farren J; Densmore, Douglas

    2016-06-17

    Genome engineering technologies now enable precise manipulation of organism genotype, but can be limited in scalability by their design requirements. Here we describe Merlin ( http://merlincad.org ), an open-source web-based tool to assist biologists in designing experiments using multiplex automated genome engineering (MAGE). Merlin provides methods to generate pools of single-stranded DNA oligonucleotides (oligos) for MAGE experiments by performing free energy calculation and BLAST scoring on a sliding window spanning the targeted site. These oligos are designed not only to improve recombination efficiency, but also to minimize off-target interactions. The application further assists experiment planning by reporting predicted allelic replacement rates after multiple MAGE cycles, and enables rapid result validation by generating primer sequences for multiplexed allele-specific colony PCR. Here we describe the Merlin oligo and primer design procedures and validate their functionality compared to OptMAGE by eliminating seven AvrII restriction sites from the Escherichia coli genome.

  1. A Machine Learning and Cross-Validation Approach for the Discrimination of Vegetation Physiognomic Types Using Satellite Based Multispectral and Multitemporal Data.

    PubMed

    Sharma, Ram C; Hara, Keitarou; Hirayama, Hidetake

    2017-01-01

    This paper presents the performance and evaluation of a number of machine learning classifiers for the discrimination between vegetation physiognomic classes using satellite based time-series of surface reflectance data. The discrimination of six vegetation physiognomic classes, Evergreen Coniferous Forest, Evergreen Broadleaf Forest, Deciduous Coniferous Forest, Deciduous Broadleaf Forest, Shrubs, and Herbs, was dealt with in the research. Rich-feature data were prepared from time-series of the satellite data for the discrimination and cross-validation of the vegetation physiognomic types using a machine learning approach. A set of machine learning experiments comprising a number of supervised classifiers with different model parameters was conducted to assess how the discrimination of vegetation physiognomic classes varies with classifiers, input features, and ground truth data size. The performance of each experiment was evaluated by using the 10-fold cross-validation method. The experiment using the Random Forests classifier provided the highest overall accuracy (0.81) and kappa coefficient (0.78). However, accuracy metrics did not vary much between experiments. Accuracy metrics were found to be very sensitive to the input features and the size of the ground truth data. The results obtained in the research are expected to be useful for improving vegetation physiognomic mapping in Japan.
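
    A minimal sketch of the evaluation loop described above (10-fold cross-validation of a Random Forests classifier), assuming scikit-learn is available; the synthetic data stand in for the multitemporal reflectance features and are not the paper's dataset.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for multitemporal reflectance features: 6 classes, 60 features.
X, y = make_classification(n_samples=600, n_features=60, n_informative=20,
                           n_classes=6, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0)

# 10-fold cross-validation, as in the experiment design described above.
scores = cross_val_score(clf, X, y, cv=10)
print(f"mean overall accuracy: {scores.mean():.2f} (+/- {scores.std():.2f})")
```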

  2. Quantifying uncertainty in geoacoustic inversion. II. Application to broadband, shallow-water data.

    PubMed

    Dosso, Stan E; Nielsen, Peter L

    2002-01-01

    This paper applies the new method of fast Gibbs sampling (FGS) to estimate the uncertainties of seabed geoacoustic parameters in a broadband, shallow-water acoustic survey, with the goal of interpreting the survey results and validating the method for experimental data. FGS applies a Bayesian approach to geoacoustic inversion based on sampling the posterior probability density to estimate marginal probability distributions and parameter covariances. This requires knowledge of the statistical distribution of the data errors, including both measurement and theory errors, which is generally not available. Invoking the simplifying assumption of independent, identically distributed Gaussian errors allows a maximum-likelihood estimate of the data variance and leads to a practical inversion algorithm. However, it is necessary to validate these assumptions, i.e., to verify that the parameter uncertainties obtained represent meaningful estimates. To this end, FGS is applied to a geoacoustic experiment carried out at a site off the west coast of Italy where previous acoustic and geophysical studies have been performed. The parameter uncertainties estimated via FGS are validated by comparison with: (i) the variability in the results of inverting multiple independent data sets collected during the experiment; (ii) the results of FGS inversion of synthetic test cases designed to simulate the experiment and data errors; and (iii) the available geophysical ground truth. Comparisons are carried out for a number of different source bandwidths, ranges, and levels of prior information, and indicate that FGS provides reliable and stable uncertainty estimates for the geoacoustic inverse problem.
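
    FGS itself is a specialized sampler; as a generic illustration of the underlying idea (sampling a posterior built from an independent, identically distributed Gaussian error model to obtain marginal parameter uncertainties), here is a random-walk Metropolis sketch on a toy two-parameter forward model. The model, noise level and priors are placeholders, not the paper's geoacoustic setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(theta):
    """Toy stand-in for the acoustic forward model (two 'geoacoustic' parameters)."""
    x = np.linspace(0.0, 1.0, 50)
    return theta[0] * np.sin(2 * np.pi * x) + theta[1] * x

true_theta = np.array([1.5, 0.8])
data = forward(true_theta) + rng.normal(0.0, 0.1, 50)   # synthetic observations
sigma = 0.1                                             # assumed i.i.d. Gaussian error

def log_posterior(theta):
    # Flat prior over a bounded box; Gaussian likelihood.
    if np.any(theta < -5) or np.any(theta > 5):
        return -np.inf
    resid = data - forward(theta)
    return -0.5 * np.sum(resid**2) / sigma**2

# Random-walk Metropolis sampling of the posterior.
theta = np.zeros(2)
samples = []
for _ in range(20000):
    proposal = theta + rng.normal(0.0, 0.05, 2)
    if np.log(rng.uniform()) < log_posterior(proposal) - log_posterior(theta):
        theta = proposal
    samples.append(theta)
samples = np.array(samples[5000:])  # drop burn-in

print("posterior means:", samples.mean(axis=0))
print("posterior std devs:", samples.std(axis=0))
```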

  3. Learning from samples of one or fewer*

    PubMed Central

    March, J; Sproull, L; Tamuz, M

    2003-01-01

    Organizations learn from experience. Sometimes, however, history is not generous with experience. We explore how organizations convert infrequent events into interpretations of history, and how they balance the need to achieve agreement on interpretations with the need to interpret history correctly. We ask what methods are used, what problems are involved, and what improvements might be made. Although the methods we observe are not guaranteed to lead to consistent agreement on interpretations, valid knowledge, improved organizational performance, or organizational survival, they provide possible insights into the possibilities for and problems of learning from fragments of history. PMID:14645764

  4. Creation of a novel simulator for minimally invasive neurosurgery: fusion of 3D printing and special effects.

    PubMed

    Weinstock, Peter; Rehder, Roberta; Prabhu, Sanjay P; Forbes, Peter W; Roussin, Christopher J; Cohen, Alan R

    2017-07-01

    OBJECTIVE Recent advances in optics and miniaturization have enabled the development of a growing number of minimally invasive procedures, yet innovative training methods for the use of these techniques remain lacking. Conventional teaching models, including cadavers and physical trainers as well as virtual reality platforms, are often expensive and ineffective. Newly developed 3D printing technologies can recreate patient-specific anatomy, but the stiffness of the materials limits fidelity to real-life surgical situations. Hollywood special effects techniques can create ultrarealistic features, including lifelike tactile properties, to enhance accuracy and effectiveness of the surgical models. The authors created a highly realistic model of a pediatric patient with hydrocephalus via a unique combination of 3D printing and special effects techniques and validated the use of this model in training neurosurgery fellows and residents to perform endoscopic third ventriculostomy (ETV), an effective minimally invasive method increasingly used in treating hydrocephalus. METHODS A full-scale reproduction of the head of a 14-year-old adolescent patient with hydrocephalus, including external physical details and internal neuroanatomy, was developed via a unique collaboration of neurosurgeons, simulation engineers, and a group of special effects experts. The model contains "plug-and-play" replaceable components for repetitive practice. The appearance of the training model (face validity) and the reproducibility of the ETV training procedure (content validity) were assessed by neurosurgery fellows and residents of different experience levels based on a 14-item Likert-like questionnaire. The usefulness of the training model for evaluating the performance of the trainees at different levels of experience (construct validity) was measured by blinded observers using the Objective Structured Assessment of Technical Skills (OSATS) scale for the performance of ETV. RESULTS A combination of 3D printing technology and casting processes led to the creation of realistic surgical models that include high-fidelity reproductions of the anatomical features of hydrocephalus and allow for the performance of ETV for training purposes. The models reproduced the pulsations of the basilar artery, ventricles, and cerebrospinal fluid (CSF), thus simulating the experience of performing ETV on an actual patient. The results of the 14-item questionnaire showed limited variability among participants' scores, and the neurosurgery fellows and residents gave the models consistently high ratings for face and content validity. The mean score for the content validity questions (4.88) was higher than the mean score for face validity (4.69) (p = 0.03). On construct validity scores, the blinded observers rated performance of fellows significantly higher than that of residents, indicating that the model provided a means to distinguish between novice and expert surgical skills. CONCLUSIONS A plug-and-play lifelike ETV training model was developed through a combination of 3D printing and special effects techniques, providing both anatomical and haptic accuracy. Such simulators offer opportunities to accelerate the development of expertise with respect to new and novel procedures as well as iterate new surgical approaches and innovations, thus allowing novice neurosurgeons to gain valuable experience in surgical techniques without exposing patients to risk of harm.

  5. Validation and scaling of soil moisture in a semi-arid environment: SMAP Validation Experiment 2015 (SMAPVEX15)

    USDA-ARS?s Scientific Manuscript database

    The NASA SMAP (Soil Moisture Active Passive) mission conducted the SMAP Validation Experiment 2015 (SMAPVEX15) in order to support the calibration and validation activities of SMAP soil moisture data product.The main goals of the experiment were to address issues regarding the spatial disaggregation...

  6. Elastic Cherenkov effects in transversely isotropic soft materials-II: Ex vivo and in vivo experiments

    NASA Astrophysics Data System (ADS)

    Li, Guo-Yang; He, Qiong; Qian, Lin-Xue; Geng, Huiying; Liu, Yanlin; Yang, Xue-Yi; Luo, Jianwen; Cao, Yanping

    2016-09-01

    In part I of this study, we investigated the elastic Cherenkov effect (ECE) in an incompressible transversely isotropic (TI) soft solid using a combined theoretical and computational approach, based on which an inverse method has been proposed to measure both the anisotropic and hyperelastic parameters of TI soft tissues. In this part, experiments were carried out to validate the inverse method and demonstrate its usefulness in practical measurements. We first performed ex vivo experiments on bovine skeletal muscles. Not only the shear moduli along and perpendicular to the direction of muscle fibers but also the elastic modulus EL and hyperelastic parameter c2 were determined. We next carried out tensile tests to determine EL, which was compared with the value obtained using the shear wave elastography method. Furthermore, we conducted in vivo experiments on the biceps brachii and gastrocnemius muscles of ten healthy volunteers. To the best of our knowledge, this study represents the first attempt to determine EL of human muscles using the dynamic elastography method and inverse analysis. The significance of our method and its potential for clinical use are discussed.

  7. Efficient generalized cross-validation with applications to parametric image restoration and resolution enhancement.

    PubMed

    Nguyen, N; Milanfar, P; Golub, G

    2001-01-01

    In many image restoration/resolution enhancement applications, the blurring process, i.e., point spread function (PSF) of the imaging system, is not known or is known only to within a set of parameters. We estimate these PSF parameters for this ill-posed class of inverse problem from raw data, along with the regularization parameters required to stabilize the solution, using the generalized cross-validation method (GCV). We propose efficient approximation techniques based on the Lanczos algorithm and Gauss quadrature theory, reducing the computational complexity of the GCV. Data-driven PSF and regularization parameter estimation experiments with synthetic and real image sequences are presented to demonstrate the effectiveness and robustness of our method.
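
    The paper's contribution is the efficient Lanczos/Gauss-quadrature approximation of GCV for large problems; the criterion itself can be shown on a small dense problem. Below is a sketch that evaluates the GCV function for Tikhonov (ridge) regularization of a toy ill-posed system and picks the minimizing parameter; sizes and data are made up.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy ill-posed linear problem y = X w + noise (stand-in for a blurred image).
n, p = 80, 60
X = rng.normal(size=(n, p)) @ np.diag(1.0 / np.arange(1, p + 1))  # decaying singular values
w_true = rng.normal(size=p)
y = X @ w_true + rng.normal(scale=0.5, size=n)

def gcv(lam):
    """Generalized cross-validation score for Tikhonov regularization parameter lam."""
    hat = X @ np.linalg.solve(X.T @ X + lam * np.eye(p), X.T)   # influence matrix A(lam)
    resid = (np.eye(n) - hat) @ y
    return n * (resid @ resid) / np.trace(np.eye(n) - hat) ** 2

lams = np.logspace(-6, 2, 50)
scores = [gcv(l) for l in lams]
best = lams[int(np.argmin(scores))]
print(f"GCV-selected regularization parameter: {best:.3g}")
```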

  8. Study of the Conservation of Mechanical Energy in the Motion of a Pendulum Using a Smartphone

    ERIC Educational Resources Information Center

    Pierratos, Theodoros; Polatoglou, Hariton M.

    2018-01-01

    A common method that scientists use to validate a theory is to utilize known principles and laws to produce results on specific settings, which can be assessed using the appropriate experimental methods and apparatuses. Smartphones have various sensors built-in and could be used for measuring and logging data in physics experiments. In this work,…

  9. Attitudes about Advances in Sweat Patch Testing in Drug Courts: Insights from a Case Study in Southern California

    ERIC Educational Resources Information Center

    Polzer, Katherine

    2010-01-01

    Drug courts are reinventing the drug testing framework by experimenting with new methods, including use of the sweat patch. The sweat patch is a band-aid like strip used to monitor drug court participants. The validity and reliability of the sweat patch as an effective testing method was examined, as well as the effectiveness, meaning how likely…

  10. Validation of the Neonatal Satisfaction Survey (NSS-8) in six Norwegian neonatal intensive care units: a quantitative cross-sectional study.

    PubMed

    Hagen, Inger Hilde; Svindseth, Marit Følsvik; Nesset, Erik; Orner, Roderick; Iversen, Valentina Cabral

    2018-03-27

    Parents' experience of having their new-borns admitted to a neonatal intensive care unit (NICU) can be extremely distressing. The subsequent risk of post-incident adjustment difficulties is increased for parents, siblings, and affected families. Patient and next-of-kin satisfaction surveys provide key indicators of quality in health care. Methodically constructed and validated survey tools are in short supply and parents' experiences of care in neonatal intensive care units are under-researched. This paper reports a validation of the Neonatal Satisfaction Survey (NSS-8) in six Norwegian NICUs. Parents' survey returns were collected using the Neonatal Satisfaction Survey (NSS-13). Data quality and psychometric properties were systematically assessed using exploratory factor analysis and tests of internal consistency, reliability, construct, convergent and discriminant validity. Each set of hospital returns was subjected to an attrition analysis before an overall satisfaction rate was calculated. The survey sample of 568 parents represents 45% of the total eligible population for the period of the study. Missing data accounted for 1.1% of all returns. Attrition analysis shows congruence between the sample and the total population. Exploratory factor analysis identified eight factors of concern to parents: "Care and Treatment", "Doctors", "Visits", "Information", "Facilities", "Parents' Anxiety", "Discharge" and "Sibling Visits". All factors showed satisfactory internal consistency and good reliability (Cronbach's alpha ranged from 0.70 to 0.94; for the whole 51-item scale, α = 0.95). Convergent validity, assessed using Spearman's rank correlation between the eight factors and a question measuring overall satisfaction, was significant for all factors. Discriminant validity was established for all factors. Overall satisfaction rates ranged from 86% to 90%, while satisfaction on each of the eight factors varied between 64% and 86%. The NSS-8 questionnaire is a valid and reliable scale for measuring parents' assessment of quality of care in the NICU. Statistical analysis confirms the instrument's capacity to gauge parents' experiences of the NICU. Further research is indicated to validate the survey questionnaire in other Nordic countries and beyond.
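
    As a side note on the internal-consistency figures reported above, a minimal sketch of Cronbach's α computed from an item-response matrix is given below; the simulated Likert responses are placeholders, not NSS-8 data.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha; `items` is an (n_respondents, n_items) array of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Hypothetical 5-point Likert responses from 20 parents on a 6-item factor.
rng = np.random.default_rng(0)
base = rng.integers(2, 6, size=(20, 1))
responses = np.clip(base + rng.integers(-1, 2, size=(20, 6)), 1, 5)
print(round(cronbach_alpha(responses), 2))
```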

  11. Computer-aided Assessment of Regional Abdominal Fat with Food Residue Removal in CT

    PubMed Central

    Makrogiannis, Sokratis; Caturegli, Giorgio; Davatzikos, Christos; Ferrucci, Luigi

    2014-01-01

    Rationale and Objectives Separate quantification of abdominal subcutaneous and visceral fat regions is essential to understand the role of regional adiposity as a risk factor in epidemiological studies. Fat quantification is often based on computed tomography (CT) because fat density is distinct from other tissue densities in the abdomen. However, the presence of intestinal food residues with densities similar to fat may reduce fat quantification accuracy. We introduce an abdominal fat quantification method in CT with particular attention to food residue removal. Materials and Methods Total fat was identified in the feature space of Hounsfield units and divided into subcutaneous and visceral components using model-based segmentation. Regions of food residues were identified and removed from visceral fat using a machine learning method integrating intensity, texture, and spatial information. Cost-weighting and bagging techniques were investigated to address class imbalance. Results We validated our automated food residue removal technique against semimanual quantifications. Our feature selection experiments indicated that joint intensity and texture features produce the highest classification accuracy at 95%. We explored generalization capability using k-fold cross-validation and receiver operating characteristic (ROC) analysis with variable k. Losses in accuracy and area under the ROC curve between maximum and minimum k were limited to 0.1% and 0.3%. We validated tissue segmentation against reference semimanual delineations. The Dice similarity scores were as high as 93.1 for subcutaneous fat and 85.6 for visceral fat. Conclusions Computer-aided regional abdominal fat quantification is a reliable computational tool for large-scale epidemiological studies. Our proposed intestinal food residue reduction scheme is an original contribution of this work. Validation experiments indicate very good accuracy and generalization capability. PMID:24119354
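
    The Dice similarity score used above for segmentation agreement is straightforward to compute; a minimal sketch on tiny synthetic masks follows (purely illustrative, not the study's pipeline).

```python
import numpy as np

def dice(seg, ref):
    """Dice similarity coefficient between two binary masks."""
    seg, ref = seg.astype(bool), ref.astype(bool)
    intersection = np.logical_and(seg, ref).sum()
    return 2.0 * intersection / (seg.sum() + ref.sum())

# Tiny synthetic masks standing in for automated vs. semimanual fat segmentations.
auto = np.zeros((10, 10), dtype=int); auto[2:8, 2:8] = 1
manual = np.zeros((10, 10), dtype=int); manual[3:9, 2:8] = 1
print(f"Dice = {dice(auto, manual):.3f}")
```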

  12. Full-wave Nonlinear Inverse Scattering for Acoustic and Electromagnetic Breast Imaging

    NASA Astrophysics Data System (ADS)

    Haynes, Mark Spencer

    Acoustic and electromagnetic full-wave nonlinear inverse scattering techniques are explored in both theory and experiment with the ultimate aim of noninvasively mapping the material properties of the breast. There is evidence that benign and malignant breast tissue have different acoustic and electrical properties and imaging these properties directly could provide higher quality images with better diagnostic certainty. In this dissertation, acoustic and electromagnetic inverse scattering algorithms are first developed and validated in simulation. The forward solvers and optimization cost functions are modified from traditional forms in order to handle the large or lossy imaging scenes present in ultrasonic and microwave breast imaging. An antenna model is then presented, modified, and experimentally validated for microwave S-parameter measurements. Using the antenna model, a new electromagnetic volume integral equation is derived in order to link the material properties of the inverse scattering algorithms to microwave S-parameters measurements allowing direct comparison of model predictions and measurements in the imaging algorithms. This volume integral equation is validated with several experiments and used as the basis of a free-space inverse scattering experiment, where images of the dielectric properties of plastic objects are formed without the use of calibration targets. These efforts are used as the foundation of a solution and formulation for the numerical characterization of a microwave near-field cavity-based breast imaging system. The system is constructed and imaging results of simple targets are given. Finally, the same techniques are used to explore a new self-characterization method for commercial ultrasound probes. The method is used to calibrate an ultrasound inverse scattering experiment and imaging results of simple targets are presented. This work has demonstrated the feasibility of quantitative microwave inverse scattering by way of a self-consistent characterization formalism, and has made headway in the same area for ultrasound.

  13. Validation of Gujarati Version of ABILOCO-Kids Questionnaire.

    PubMed

    Diwan, Shraddha; Diwan, Jasmin; Patel, Pankaj; Bansal, Ankita B

    2015-10-01

    ABILOCO-Kids is a measure of locomotion ability for children with cerebral palsy (CP) aged 6 to 15 years & is available in English & French. The aim was to validate the Gujarati version of the ABILOCO-Kids questionnaire for use in clinical research on a Gujarati population. The ABILOCO-Kids questionnaire was translated from English into Gujarati using the forward-backward-forward method. To ensure face & content validity of the Gujarati version using a group consensus method, each item was examined by a group of experts with a mean experience of 24.62 years in the field of paediatrics and paediatric physiotherapy. Each item was analysed for content, meaning, wording, format, ease of administration & scoring. Each item was scored by the expert group as either accepted, rejected or accepted with modification. The procedure was continued until 80% consensus was reached for all items. Concurrent validity was examined in 55 children with cerebral palsy (6-15 years) of all Gross Motor Function Classification System (GMFCS) levels & all clinical types by correlating the ABILOCO-Kids score with the Gross Motor Function Measure (GMFM) & GMFCS. In phase 1 of validation, 16 items were accepted as is; 22 items were accepted with modification & 3 items went to phase 2 validation. For concurrent validity, a highly significant positive correlation was found between the ABILOCO-Kids score & total GMFM (r=0.713, p<0.005) & a highly significant negative correlation with GMFCS (r= -0.778, p<0.005). The Gujarati translated version of the ABILOCO-Kids questionnaire has good face & content validity as well as concurrent validity and can be used to measure caregiver-reported locomotion ability in children with CP.

  14. Feasibility study for wax deposition imaging in oil pipelines by PGNAA technique.

    PubMed

    Cheng, Can; Jia, Wenbao; Hei, Daqian; Wei, Zhiyong; Wang, Hongtao

    2017-10-01

    Wax deposition in pipelines is a crucial problem in the oil industry. A method based on the prompt gamma-ray neutron activation analysis technique was applied to reconstruct the image of wax deposition in oil pipelines. The 2.223 MeV hydrogen capture gamma rays were used to reconstruct the wax deposition image. To validate the method, both MCNP simulation and experiments were performed for wax deposited with a maximum thickness of 20 cm. The performance of the method was simulated using the MCNP code. The experiment was conducted with a 252Cf neutron source and a LaBr3:Ce detector. A good correspondence between the simulations and the experiments was observed. The results obtained indicate that the present approach is efficient for wax deposition imaging in oil pipelines. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. Applied Chaos Level Test for Validation of Signal Conditions Underlying Optimal Performance of Voice Classification Methods.

    PubMed

    Liu, Boquan; Polce, Evan; Sprott, Julien C; Jiang, Jack J

    2018-05-17

    The purpose of this study is to introduce a chaos level test to evaluate linear and nonlinear voice type classification method performances under varying signal chaos conditions without subjective impression. Voice signals were constructed with differing degrees of noise to model signal chaos. Within each noise power, 100 Monte Carlo experiments were applied to analyze the output of jitter, shimmer, correlation dimension, and spectrum convergence ratio. The computational output of the 4 classifiers was then plotted against signal chaos level to investigate the performance of these acoustic analysis methods under varying degrees of signal chaos. A diffusive behavior detection-based chaos level test was used to investigate the performances of different voice classification methods. Voice signals were constructed by varying the signal-to-noise ratio to establish differing signal chaos conditions. Chaos level increased sigmoidally with increasing noise power. Jitter and shimmer performed optimally when the chaos level was less than or equal to 0.01, whereas correlation dimension was capable of analyzing signals with chaos levels of less than or equal to 0.0179. Spectrum convergence ratio demonstrated proficiency in analyzing voice signals with all chaos levels investigated in this study. The results of this study corroborate the performance relationships observed in previous studies and, therefore, demonstrate the validity of the validation test method. The presented chaos level validation test could be broadly utilized to evaluate acoustic analysis methods and establish the most appropriate methodology for objective voice analysis in clinical practice.

  16. Reduction of bias and variance for evaluation of computer-aided diagnostic schemes.

    PubMed

    Li, Qiang; Doi, Kunio

    2006-04-01

    Computer-aided diagnostic (CAD) schemes have been developed to assist radiologists in detecting various lesions in medical images. In addition to the development, an equally important problem is the reliable evaluation of the performance levels of various CAD schemes. It is good to see that more and more investigators are employing more reliable evaluation methods such as leave-one-out and cross validation, instead of less reliable methods such as resubstitution, for assessing their CAD schemes. However, the common applications of leave-one-out and cross-validation evaluation methods do not necessarily imply that the estimated performance levels are accurate and precise. Pitfalls often occur in the use of leave-one-out and cross-validation evaluation methods, and they lead to unreliable estimation of performance levels. In this study, we first identified a number of typical pitfalls for the evaluation of CAD schemes, and conducted a Monte Carlo simulation experiment for each of the pitfalls to demonstrate quantitatively the extent of bias and/or variance caused by the pitfall. Our experimental results indicate that considerable bias and variance may exist in the estimated performance levels of CAD schemes if one employs various flawed leave-one-out and cross-validation evaluation methods. In addition, for promoting and utilizing a high standard for reliable evaluation of CAD schemes, we attempt to make recommendations, whenever possible, for overcoming these pitfalls. We believe that, with the recommended evaluation methods, we can considerably reduce the bias and variance in the estimated performance levels of CAD schemes.
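
    One of the pitfalls alluded to above can be reproduced in a few lines: performing feature selection on the full dataset before cross-validation leaks information and inflates the estimated performance even on pure-noise data, whereas keeping selection inside each training fold does not. The sketch below uses scikit-learn and illustrative sample sizes; it is not taken from the paper's Monte Carlo experiments.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 500))          # pure-noise "features"
y = rng.integers(0, 2, size=60)         # random class labels: true accuracy ~0.5

# Pitfall: feature selection on ALL data, then cross-validate the classifier.
X_leaky = SelectKBest(f_classif, k=10).fit_transform(X, y)
biased = cross_val_score(LogisticRegression(max_iter=1000), X_leaky, y, cv=5).mean()

# Correct: keep selection inside each training fold via a pipeline.
pipe = make_pipeline(SelectKBest(f_classif, k=10), LogisticRegression(max_iter=1000))
unbiased = cross_val_score(pipe, X, y, cv=5).mean()

print(f"biased estimate:   {biased:.2f}")   # typically well above chance
print(f"unbiased estimate: {unbiased:.2f}") # typically near 0.5
```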

  17. MCDA swing weighting and discrete choice experiments for elicitation of patient benefit-risk preferences: a critical assessment.

    PubMed

    Tervonen, Tommi; Gelhorn, Heather; Sri Bhashyam, Sumitra; Poon, Jiat-Ling; Gries, Katharine S; Rentz, Anne; Marsh, Kevin

    2017-12-01

    Multiple criteria decision analysis swing weighting (SW) and discrete choice experiments (DCE) are appropriate methods for capturing patient preferences on treatment benefit-risk trade-offs. This paper presents a qualitative comparison of the 2 methods. We review and critically assess similarities and differences of SW and DCE based on 6 aspects: comprehension by study participants, cognitive biases, sample representativeness, ability to capture heterogeneity in preferences, reliability and validity, and robustness of the results. The SW choice task can be more difficult, but the workshop context in which SW is conducted may provide more support to patients who are unfamiliar with the end points being evaluated or who have cognitive impairments. Both methods are similarly prone to a number of biases associated with preference elicitation, and DCE is prone to simplifying heuristics, which limits its application with a large number of attributes. The low cost per patient of the DCE means that it can be better at achieving a representative sample, though SW does not require such large sample sizes due to the exact nature of the collected preference data. This also means that internal validity is automatically enforced with SW, while the internal validity of DCE results needs to be assessed manually. The choice between the 2 methods depends on characteristics of the benefit-risk assessment, especially on how difficult the trade-offs are for the patients to make and how many patients are available. Although there exist some empirical studies on many of the evaluation aspects, critical evidence gaps remain. Copyright © 2017 John Wiley & Sons, Ltd.

  18. A method to determine the mammographic regions that show early changes due to the development of breast cancer

    NASA Astrophysics Data System (ADS)

    Karemore, Gopal; Nielsen, Mads; Karssemeijer, Nico; Brandt, Sami S.

    2014-11-01

    It is well understood nowadays that changes in the mammographic parenchymal pattern are an indicator of breast cancer risk, and we have developed a statistical method that estimates the mammogram regions where the parenchymal changes due to breast cancer occur. This region of interest is computed from a score map by utilising the anatomical breast coordinate system developed in our previous work. The method also performs an automatic scale selection to avoid overfitting, while the region estimates are computed by a nested cross-validation scheme. In this way, it is possible to recover those mammogram regions that show a significant difference in classification scores between the cancer and the control group. Our experiments suggested that the most significant mammogram region is the region behind the nipple, which can be justified by previous findings from other research groups. This result was obtained on the basis of cross-validation experiments on independent training, validation and testing sets from a case-control study of 490 women, of whom 245 were diagnosed with breast cancer within a period of 2-4 years after the baseline mammograms. We additionally generalised the estimated region to another study (mini-MIAS) and showed that the transferred region estimate gives at least a similar classification result compared to the case where the whole breast region is used. In all, following our method most likely improves both preclinical and follow-up breast cancer screening, but a larger study population will be required to test this hypothesis.

  19. Correlation of X-ray computed tomography with quantitative nuclear magnetic resonance methods for pre-clinical measurement of adipose and lean tissues in living mice.

    PubMed

    Metzinger, Matthew N; Miramontes, Bernadette; Zhou, Peng; Liu, Yueying; Chapman, Sarah; Sun, Lucy; Sasser, Todd A; Duffield, Giles E; Stack, M Sharon; Leevy, W Matthew

    2014-10-08

    Numerous obesity studies have coupled murine models with non-invasive methods to quantify body composition in longitudinal experiments, including X-ray computed tomography (CT) or quantitative nuclear magnetic resonance (QMR). Both microCT and QMR have been separately validated with invasive techniques of adipose tissue quantification, like post-mortem fat extraction and measurement. Here we report a head-to-head study of both protocols using oil phantoms and mouse populations to determine the parameters that best align CT data with that from QMR. First, an in vitro analysis of oil/water mixtures was used to calibrate and assess the overall accuracy of microCT vs. QMR data. Next, experiments were conducted with two cohorts of living mice (either homogenous or heterogeneous by sex, age and genetic backgrounds) to assess the microCT imaging technique for adipose tissue segmentation and quantification relative to QMR. Adipose mass values were obtained from microCT data with three different resolutions, after which the data were analyzed with different filter and segmentation settings. Strong linearity was noted between the adipose mass values obtained with microCT and QMR, with optimal parameters and scan conditions reported herein. Lean tissue (muscle, internal organs) was also segmented and quantified using the microCT method relative to the analogous QMR values. Overall, the rigorous calibration and validation of the microCT method for murine body composition, relative to QMR, ensures its validity for segmentation, quantification and visualization of both adipose and lean tissues.

  20. Design and validation of a method for evaluation of interocular interaction.

    PubMed

    Lai, Xin Jie Angela; Alexander, Jack; Ho, Arthur; Yang, Zhikuan; He, Mingguang; Suttle, Catherine

    2012-02-01

    To design a simple viewing system allowing dichoptic masking, and to validate this system in adults and children with normal vision. A Trial Frame Apparatus (TFA) was designed to evaluate interocular interaction. This device consists of a trial frame, a 1 mm pinhole in front of the tested eye and a full or partial occluder in front of the non-tested eye. The difference in visual function in one eye between the full- and partial-occlusion conditions was termed the Interaction Index. In experiment 1, low-contrast acuity was measured in six adults using five types of partial occluder. Interaction Index was compared between these five, and the occluder showing the highest Index was used in experiment 2. In experiment 2, low-contrast acuity, contrast sensitivity, and alignment sensitivity were measured in the non-dominant eye of 45 subjects (15 older adults, 15 young adults, and 15 children), using the TFA and an existing well-validated device (shutter goggles) with full and partial occlusion of the dominant eye. These measurements were repeated on 11 subjects of each group using TFA in the partial-occlusion condition only. Repeatability of visual function measurements using TFA was assessed using the Bland-Altman method and agreement between TFA and goggles in terms of visual functions and interactions was assessed using the Bland-Altman method and t-test. In all three subject groups, the TFA showed a high level of repeatability in all visual function measurements. Contrast sensitivity was significantly poorer when measured using TFA than using goggles (p < 0.05). However, Interaction Index of all three visual functions showed acceptable agreement between TFA and goggles (p > 0.05). The TFA may provide an acceptable method for the study of some forms of dichoptic masking in populations where more complex devices (e.g., shutter goggles) cannot be used.
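
    The Bland-Altman assessment mentioned above reduces to the mean difference and 95% limits of agreement between paired measurements; a minimal sketch follows, with invented paired acuity values rather than the study's data.

```python
import numpy as np

def bland_altman(a, b):
    """Mean difference and 95% limits of agreement between two paired measurements."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical paired logMAR acuities measured with the TFA and with shutter goggles.
tfa     = [0.42, 0.36, 0.50, 0.44, 0.38, 0.48, 0.41, 0.39]
goggles = [0.40, 0.34, 0.47, 0.45, 0.35, 0.46, 0.43, 0.37]

bias, (lo, hi) = bland_altman(tfa, goggles)
print(f"bias = {bias:+.3f} logMAR, 95% limits of agreement: [{lo:+.3f}, {hi:+.3f}]")
```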

  1. Intoxication-Related AmED (Alcohol Mixed with Energy Drink) Expectancies Scale: Initial Development and Validation

    PubMed Central

    Miller, Kathleen E.; Dermen, Kurt H.; Lucke, Joseph F.

    2017-01-01

    BACKGROUND Young adult use of alcohol mixed with energy drinks (AmEDs) has been linked with elevated risks for a constellation of problem behaviors. These risks may be conditioned by expectancies regarding the effects of caffeine in conjunction with alcohol consumption. The aim of this study was to describe the construction and psychometric evaluation of the Intoxication-Related AmED Expectancies Scale (AmED_EXPI), 15 self-report items measuring beliefs about how the experience of AmED intoxication differs from the experience of noncaffeinated alcohol (NCA) intoxication. METHODS Scale development and testing were conducted using data from a U.S. national sample of 3,105 adolescents and emerging adults aged 13–25. Exploratory and confirmatory factor analyses were conducted to evaluate the factor structure and establish factor invariance across gender, age, and prior experience with AmED use. Cross-sectional and longitudinal analyses examining correlates of AmED use were used to assess construct and predictive validity. RESULTS In confirmatory factor analyses, fit indices for the hypothesized four-factor structure (i.e., Intoxication Management [IM], Alertness [AL], Sociability [SO], and Jitters [JT]) revealed a moderately good fit to the data. Together, these factors accounted for 75.3% of total variance. The factor structure was stable across male/female, teen/young adult, and AmED experience/no experience subgroups. The resultant unit-weighted subscales showed strong internal consistency and satisfactory convergent validity. Baseline scores on the IM, SO, and JT subscales predicted changes in AmED use over a subsequent three-month period. CONCLUSIONS The AmED_EXPI appears to be a reliable and valid tool for measuring expectancies about the effects of caffeine during alcohol intoxication. PMID:28421613

  2. Reactivity loss validation of high burn-up PWR fuels with pile-oscillation experiments in MINERVE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leconte, P.; Vaglio-Gaudard, C.; Eschbach, R.

    2012-07-01

    The ALIX experimental program relies on the experimental validation of the spent fuel inventory, by chemical analysis of samples irradiated in a PWR for between 5 and 7 cycles, and also on the experimental validation of the spent fuel reactivity loss with burn-up, obtained by pile-oscillation measurements in the MINERVE reactor. These latter experiments provide an overall validation of both the fuel inventory and of the nuclear data responsible for the reactivity loss. This program also offers unique experimental data for fuels with a burn-up reaching 85 GWd/t, as spent fuels in French PWRs have so far never exceeded 70 GWd/t. The analysis of these experiments is done in two steps with the APOLLO2/SHEM-MOC/CEA2005v4 package. In the first one, the fuel inventory of each sample is obtained by assembly calculations. The calculation route consists of the self-shielding of cross sections on the 281-energy-group SHEM mesh, followed by the flux calculation by the Method Of Characteristics in a 2D-exact heterogeneous geometry of the assembly, and finally a depletion calculation by an iterative resolution of the Bateman equations. In the second step, the fuel inventory is used in the analysis of pile-oscillation experiments in which the reactivity of the ALIX spent fuel samples is compared to the reactivity of fresh fuel samples. The comparison between experiment and calculation shows satisfactory results with the JEFF3.1.1 library, which predicts the reactivity loss within 2% for a burn-up of ~75 GWd/t and within 4% for a burn-up of ~85 GWd/t. (authors)

  3. Gamifying Self-Management of Chronic Illnesses: A Mixed-Methods Study

    PubMed Central

    Wills, Gary; Ranchhod, Ashok

    2016-01-01

    Background Self-management of chronic illnesses is an ongoing issue in health care research. Gamification is a concept that arose in the field of computer science and has been borrowed by many other disciplines. It is perceived by many that gamification can improve the self-management experience of people with chronic illnesses. This paper discusses the validation of a framework (called The Wheel of Sukr) that was introduced to achieve this goal. Objective This research aims to (1) discuss a gamification framework targeting the self-management of chronic illnesses and (2) validate the framework by diabetic patients, medical professionals, and game experts. Methods A mixed-method approach was used to validate the framework. Expert interviews (N=8) were conducted in order to validate the themes of the framework. Additionally, diabetic participants completed a questionnaire (N=42) in order to measure their attitudes toward the themes of the framework. Results The results provide a validation of the framework. This indicates that gamification might improve the self-management of chronic illnesses, such as diabetes. Namely, the eight themes in the Wheel of Sukr (fun, esteem, socializing, self-management, self-representation, motivation, growth, sustainability) were perceived positively by 71% (30/42) of the participants with P value <.001. Conclusions In this research, both the interviews and the questionnaire yielded positive results that validate the framework (The Wheel of Sukr). Generally, this study indicates an overall acceptance of the notion of gamification in the self-management of diabetes. PMID:27612632

  4. Microgravity research at the University of Mexico: Experiments in payload G-006

    NASA Technical Reports Server (NTRS)

    Peralta-Fabi, Ricardo; Mendieta-Jimenez, Javier

    1988-01-01

    The experiments contained in the G-006 payload, relating to thin film vapor deposition, vacuum variations in a chamber vented to space, solidification of a Zn-Al-Cu alloy, and multiple-location temperature monitoring for thermal model validation, are described in detail. A discussion of the expected results is presented, together with the methods selected to conduct the postflight analysis, and finally an overview of the future activities in this field.

  5. Methods Data Qualification Interim Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    R. Sam Alessi; Tami Grimmett; Leng Vang

    The overall goal of the Next Generation Nuclear Plant (NGNP) Data Management and Analysis System (NDMAS) is to maintain data provenance for all NGNP data, including the Methods component of NGNP data. Multiple means are available to access data stored in NDMAS. A web portal environment allows users to access data, view the results of qualification tests and view graphs and charts of various attributes of the data. NDMAS also has methods for the management of the data output from VHTR simulation models and data generated from experiments designed to verify and validate the simulation codes. These simulation models represent the outcome of mathematical representations of VHTR components and systems. The methods data management approaches described herein will handle data that arise from experiment, simulation, and external sources for the main purpose of facilitating parameter estimation and model verification and validation (V&V). A model integration environment entitled ModelCenter is used to automate the storing of data from simulation model runs to the NDMAS repository. This approach does not adversely change the way computational scientists conduct their work. The method is to be used mainly to store the results of model runs that need to be preserved for auditing purposes or for display on the NDMAS web portal. This interim report describes the current development of NDMAS for Methods data and discusses the data, and its qualification, that are currently part of NDMAS.

  6. The response dynamics of preferential choice.

    PubMed

    Koop, Gregory J; Johnson, Joseph G

    2013-12-01

    The ubiquity of psychological process models requires an increased degree of sophistication in the methods and metrics that we use to evaluate them. We contribute to this venture by capitalizing on recent work in cognitive science analyzing response dynamics, which shows that the bearing that information processing dynamics have on intended action is also revealed in the motor system. This decidedly "embodied" view suggests that researchers are missing out on potential dependent variables with which to evaluate their models: those associated with the motor response that produces a choice. The current work develops a method for collecting and analyzing such data in the domain of decision making. We first validate this method using widely normed stimuli from the International Affective Picture System (Experiment 1), and demonstrate that curvature in response trajectories provides a metric of the competition between choice options. We next extend the method to risky decision making (Experiment 2) and develop predictions for three popular classes of process model. The data provided by response dynamics demonstrate that choices contrary to the maxim of risk seeking in losses and risk aversion in gains may be the product of at least one "online" preference reversal, and can thus begin to discriminate amongst the candidate models. Finally, we incorporate attentional data collected via eye-tracking (Experiment 3) to develop a formal computational model of joint information sampling and preference accumulation. In sum, we validate response dynamics for use in preferential choice tasks and demonstrate the unique conclusions afforded by response dynamics over and above traditional methods. Copyright © 2013 Elsevier Inc. All rights reserved.
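
    Trajectory curvature can be summarized in several ways; one common metric is the maximum absolute deviation of the cursor path from the straight line joining its start and end points. The sketch below computes this for a toy trajectory and is only an illustration, not the authors' analysis code.

```python
import numpy as np

def max_deviation(x, y):
    """Maximum absolute deviation of a cursor path from the straight start-to-end line."""
    p = np.column_stack([x, y]).astype(float)
    start, end = p[0], p[-1]
    line = end - start
    line /= np.linalg.norm(line)
    rel = p - start
    # Perpendicular distance of each sample from the start-end line (2D cross product).
    perp = rel[:, 0] * line[1] - rel[:, 1] * line[0]
    return np.abs(perp).max()

# Toy trajectory drifting toward the unchosen option before settling on the choice.
x = np.array([0.0, 0.1, 0.3, 0.2, 0.5, 0.8, 1.0])
y = np.array([0.0, 0.3, 0.6, 0.9, 1.0, 1.0, 1.0])
print(f"maximum deviation: {max_deviation(x, y):.3f}")
```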

  7. The Arthroscopic Surgical Skill Evaluation Tool (ASSET)

    PubMed Central

    Koehler, Ryan J.; Amsdell, Simon; Arendt, Elizabeth A; Bisson, Leslie J; Braman, Jonathan P; Butler, Aaron; Cosgarea, Andrew J; Harner, Christopher D; Garrett, William E; Olson, Tyson; Warme, Winston J.; Nicandri, Gregg T.

    2014-01-01

    Background Surgeries employing arthroscopic techniques are among the most commonly performed in orthopaedic clinical practice however, valid and reliable methods of assessing the arthroscopic skill of orthopaedic surgeons are lacking. Hypothesis The Arthroscopic Surgery Skill Evaluation Tool (ASSET) will demonstrate content validity, concurrent criterion-oriented validity, and reliability, when used to assess the technical ability of surgeons performing diagnostic knee arthroscopy on cadaveric specimens. Study Design Cross-sectional study; Level of evidence, 3 Methods Content validity was determined by a group of seven experts using a Delphi process. Intra-articular performance of a right and left diagnostic knee arthroscopy was recorded for twenty-eight residents and two sports medicine fellowship trained attending surgeons. Subject performance was assessed by two blinded raters using the ASSET. Concurrent criterion-oriented validity, inter-rater reliability, and test-retest reliability were evaluated. Results Content validity: The content development group identified 8 arthroscopic skill domains to evaluate using the ASSET. Concurrent criterion-oriented validity: Significant differences in total ASSET score (p<0.05) between novice, intermediate, and advanced experience groups were identified. Inter-rater reliability: The ASSET scores assigned by each rater were strongly correlated (r=0.91, p <0.01) and the intra-class correlation coefficient between raters for the total ASSET score was 0.90. Test-retest reliability: there was a significant correlation between ASSET scores for both procedures attempted by each individual (r = 0.79, p<0.01). Conclusion The ASSET appears to be a useful, valid, and reliable method for assessing surgeon performance of diagnostic knee arthroscopy in cadaveric specimens. Studies are ongoing to determine its generalizability to other procedures as well as to the live OR and other simulated environments. PMID:23548808

  8. Base Flow Model Validation

    NASA Technical Reports Server (NTRS)

    Sinha, Neeraj; Brinckman, Kevin; Jansen, Bernard; Seiner, John

    2011-01-01

    A method was developed of obtaining propulsive base flow data in both hot and cold jet environments, at Mach numbers and altitude of relevance to NASA launcher designs. The base flow data was used to perform computational fluid dynamics (CFD) turbulence model assessments of base flow predictive capabilities in order to provide increased confidence in base thermal and pressure load predictions obtained from computational modeling efforts. Predictive CFD analyses were used in the design of the experiments, available propulsive models were used to reduce program costs and increase success, and a wind tunnel facility was used. The data obtained allowed assessment of CFD/turbulence models in a complex flow environment, working within a building-block procedure to validation, where cold, non-reacting test data was first used for validation, followed by more complex reacting base flow validation.

  9. CFD validation experiments for hypersonic flows

    NASA Technical Reports Server (NTRS)

    Marvin, Joseph G.

    1992-01-01

    A roadmap for CFD code validation is introduced. The elements of the roadmap are consistent with air-breathing vehicle design requirements and related to the important flow path components: forebody, inlet, combustor, and nozzle. Building block and benchmark validation experiments are identified along with their test conditions and measurements. Based on an evaluation criteria, recommendations for an initial CFD validation data base are given and gaps identified where future experiments could provide new validation data.

  10. WE-EF-210-06: Ultrasound 2D Strain Measurement of Radiation-Induced Toxicity: Phantom and Ex Vivo Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, T; Torres, M; Rossi, P

    Purpose: Radiation-induced fibrosis is a common long-term complication affecting many patients following cancer radiotherapy. Standard clinical assessment of subcutaneous fibrosis is subjective and often limited to visual inspection and palpation. Ultrasound strain imaging describes the compressibility (elasticity) of biological tissues. This study's purpose is to develop a quantitative ultrasound strain imaging method that can consistently and accurately characterize radiation-induced fibrosis. Methods: In this study, we propose a 2D strain imaging method based on deformable image registration. A combined affine and B-spline transformation model is used to calculate the displacement of tissue between pre-stress and post-stress B-mode image sequences. The 2D displacement is estimated through a hybrid image similarity metric, which is a combination of the normalized mutual information (NMI) and the normalized sum-of-squared-differences (NSSD). The 2D strain is then obtained from the gradient of the local displacement. We conducted phantom experiments under various compressions and compared the performance of our proposed method with the standard cross-correlation (CC)-based method using the signal-to-noise (SNR) and contrast-to-noise (CNR) ratios. In addition, we conducted an ex-vivo beef muscle experiment to further validate the proposed method. Results: For the phantom study, the SNR and CNR values of the proposed method were significantly higher than those calculated from the CC-based method under different strains. The SNR and CNR increased by factors of 1.9 and 2.7 compared with the CC-based method. For the ex-vivo experiment, the CC-based method failed to work due to large deformation (6.7%), while our proposed method could accurately detect the stiffness change. Conclusion: We have developed a 2D strain imaging technique based on deformable image registration and validated its accuracy and feasibility with phantom and ex-vivo data. This 2D ultrasound strain imaging technology may be valuable as physicians try to eliminate radiation-induced fibrosis and improve the therapeutic ratio of cancer radiotherapy. This research is supported in part by DOD PCRP Award W81XWH-13-1-0269, and National Cancer Institute (NCI) Grant CA114313.
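
    The step "strain is obtained from the gradient of the local displacement" can be illustrated with a few lines: given an axial displacement field on a regular grid, the axial strain is its spatial derivative along the axial direction. The displacement field, inclusion and pixel spacing below are synthetic placeholders, not data from the study.

```python
import numpy as np

# Synthetic axial displacement field (mm) on a regular image grid:
# a stiff inclusion in the centre displaces less than the soft background.
ny, nx = 128, 128
y, x = np.mgrid[0:ny, 0:nx]
displacement = 0.002 * y.astype(float)                      # uniform background compression
inclusion = (x - 64) ** 2 + (y - 64) ** 2 < 20 ** 2
displacement[inclusion] *= 0.3                              # stiff lesion moves less

pixel_spacing_mm = 0.1
# Axial strain = spatial derivative of axial displacement along the axial (y) direction.
strain = np.gradient(displacement, pixel_spacing_mm, axis=0)

print(f"background strain ~ {strain[10, 10]:.3f}, inclusion strain ~ {strain[64, 64]:.3f}")
```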

  11. Crystal growth furnace safety system validation

    NASA Technical Reports Server (NTRS)

    Mackowski, D. W.; Hartfield, R.; Bhavnani, S. H.; Belcher, V. M.

    1994-01-01

    Findings are reported regarding the safe operation of the NASA crystal growth furnace (CGF) and potential methods for detecting containment failures of the furnace. The main conclusions concern ampoule leak detection, cartridge leak detection, and the detection of hazardous species in the experiment apparatus container (EAC).

  12. An Experiment in Teaching Human Ethology

    ERIC Educational Resources Information Center

    Barnett, S. A.

    1977-01-01

    Students of ethology are often confused about the validity of arguments based on comparisons of animal and human behavior. The problem can be dealt with purely theoretically or through observational or experimental studies of human behavior. Some results of using these two methods are described and discussed. (Author/MA)

  13. A new practice environment measure based on the reality and experiences of nurses working lives.

    PubMed

    Webster, Joan; Flint, Anndrea; Courtney, Mary

    2009-01-01

    To explore the underlying organizational issues affecting nurses' decisions to leave and to develop a contemporary practice environment measure based on the experiences of nurses' working lives. Turnover had reached an unacceptable level in our organization, but the underlying reasons for leaving were unknown. In-depth interviews were conducted with 13 nurses who had resigned. Transcripts were analysed using the constant comparative method. Information from the interviews informed the development of a new practice environment tool, which has undergone initial testing using the Content Validity Index and Cronbach's alpha. Two domains ('work life' and 'personal life/professional development') and five themes ('feeling safe', 'feeling valued', 'getting things done', 'professional development' and 'being flexible') emerged from the interviews. The content validity score for the new instrument was 0.79 and Cronbach's alpha was 0.93. The new practice environment tool has shown useful initial reliability and validity but requires wider testing in other settings. The reality and experiences of nurses' working lives can be identified through exit interviews conducted by an independent person. Information from such interviews is useful in identifying an organization's strengths and weaknesses and in developing initiatives to support retention.

  14. Jointly determining significance levels of primary and replication studies by controlling the false discovery rate in two-stage genome-wide association studies.

    PubMed

    Jiang, Wei; Yu, Weichuan

    2017-01-01

    In genome-wide association studies, we normally discover associations between genetic variants and diseases/traits in primary studies, and validate the findings in replication studies. We consider the associations identified in both primary and replication studies to be true findings. An important question under this two-stage setting is how to determine significance levels in both studies. In traditional methods, the significance levels of the primary and replication studies are determined separately. We argue that this separate determination strategy reduces the power of the overall two-stage study. Therefore, we propose a novel method to determine significance levels jointly. Our method is a reanalysis method that needs summary statistics from both studies. We find the most powerful significance levels while controlling the false discovery rate in the two-stage study. To enjoy the power improvement from the joint determination method, we need to select single nucleotide polymorphisms for replication at a less stringent significance level. This is a common practice in studies designed for discovery purposes. We suggest that this practice is also suitable in studies with a validation purpose, in order to identify more true findings. Simulation experiments show that our method can provide more power than traditional methods and that the false discovery rate is well controlled. Empirical experiments on datasets of five diseases/traits demonstrate that our method can help identify more associations. The R package is available at http://bioinformatics.ust.hk/RFdr.html.
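
    The paper's joint threshold search is specific to its reanalysis framework, but the false discovery rate control it builds on can be sketched with a standard Benjamini-Hochberg step-up procedure applied to per-SNP p-values. The Fisher combination of the two stages shown here is an illustrative assumption, not the authors' procedure.

        import numpy as np
        from scipy import stats

        def fisher_combine(p_primary, p_replication):
            """Combine per-SNP p-values from the two stages (illustrative choice only)."""
            chi2 = -2.0 * (np.log(p_primary) + np.log(p_replication))
            return stats.chi2.sf(chi2, df=4)

        def benjamini_hochberg(pvals, alpha=0.05):
            """Boolean mask of rejections controlling the false discovery rate at alpha."""
            p = np.asarray(pvals)
            m = len(p)
            order = np.argsort(p)
            passed = p[order] <= alpha * np.arange(1, m + 1) / m
            k = np.nonzero(passed)[0].max() + 1 if passed.any() else 0
            reject = np.zeros(m, dtype=bool)
            reject[order[:k]] = True
            return reject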

  15. Validation of the enthalpy method by means of analytical solution

    NASA Astrophysics Data System (ADS)

    Kleiner, Thomas; Rückamp, Martin; Bondzio, Johannes; Humbert, Angelika

    2014-05-01

    Numerical simulations have moved in recent years from describing the cold-temperate transition surface (CTS) explicitly towards an enthalpy description, which avoids incorporating a singular surface inside the model (Aschwanden et al., 2012). In enthalpy methods the CTS is represented as a level set of the enthalpy state variable. This method has several numerical and practical advantages (e.g. representation of the full energy by one scalar field, no restriction on the topology and shape of the CTS). The method is rather new in glaciology and, to our knowledge, has not been verified and validated against analytical solutions. Unfortunately, we still lack analytical solutions for sufficiently complex thermo-mechanically coupled polythermal ice flow. However, we present two experiments to test the implementation of the enthalpy equation and the corresponding boundary conditions. The first experiment tests in particular the functionality of the boundary condition scheme and the corresponding basal melt rate calculation. Depending on the thermal situation at the base, the numerical code may have to switch to another boundary type (from Neumann to Dirichlet or vice versa). The main idea of this set-up is to test reversibility during transients: a formerly cold ice body that passes through a warmer period, with an associated build-up of a liquid water layer at the base, must be able to return to its initial steady state. Since we impose several assumptions on the experiment design, analytical solutions can be formulated for different quantities during distinct stages of the simulation. The second experiment tests the positioning of the internal CTS in a parallel-sided polythermal slab. We compare our simulation results to the analytical solution proposed by Greve and Blatter (2009). Results from three different ice flow models (COMIce, ISSM, TIMFD3) are presented.
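
    At the heart of the enthalpy formulation is the conversion between the enthalpy state variable and the physical fields (temperature in cold ice, water content in temperate ice), with the CTS appearing as the level set where the enthalpy reaches its pressure-melting value. The sketch below uses the standard relations with nominal constants; it is illustrative only and is not taken from the models compared in the abstract.

        C_ICE = 2009.0     # specific heat capacity of ice [J kg-1 K-1]
        L_FUSION = 3.34e5  # latent heat of fusion [J kg-1]
        T_REF = 223.15     # reference temperature of the enthalpy scale [K]
        BETA = 7.9e-8      # Clausius-Clapeyron slope [K Pa-1]
        T0 = 273.15        # melting point at zero pressure [K]

        def enthalpy_to_state(enthalpy, pressure):
            """Map enthalpy [J kg-1] and pressure [Pa] to (temperature [K], water content)."""
            t_pmp = T0 - BETA * pressure            # pressure-melting point
            e_pmp = C_ICE * (t_pmp - T_REF)         # enthalpy at the melting point (CTS level set)
            if enthalpy < e_pmp:                    # cold ice: all energy is sensible heat
                return T_REF + enthalpy / C_ICE, 0.0
            # temperate ice: temperature pinned at the melting point, excess enthalpy is moisture
            return t_pmp, (enthalpy - e_pmp) / L_FUSION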

  16. Quantitative validation of carbon-fiber laminate low velocity impact simulations

    DOE PAGES

    English, Shawn A.; Briggs, Timothy M.; Nelson, Stacy M.

    2015-09-26

    Simulations of low velocity impact with a flat cylindrical indenter upon a carbon fiber fabric reinforced polymer laminate are rigorously validated. Comparison of the impact energy absorption between the model and experiment is used as the validation metric. Additionally, non-destructive evaluation, including ultrasonic scans and three-dimensional computed tomography, provides qualitative validation of the models. The simulations include delamination, matrix cracks and fiber breaks. An orthotropic damage and failure constitutive model, capable of predicting progressive damage and failure, is developed and described. An ensemble of simulations incorporating model parameter uncertainties is used to predict a response distribution, which is then compared to experimental output using appropriate statistical methods. Lastly, the model form errors are exposed and corrected for use in an additional blind validation analysis. The result is a quantifiable confidence in material characterization and model physics when simulating low velocity impact in structures of interest.

  17. Method of analysis at the U.S. Geological Survey California Water Science Center, Sacramento Laboratory - determination of haloacetic acid formation potential, method validation, and quality-control practices

    USGS Publications Warehouse

    Zazzi, Barbara C.; Crepeau, Kathryn L.; Fram, Miranda S.; Bergamaschi, Brian A.

    2005-01-01

    An analytical method for the determination of haloacetic acid formation potential of water samples has been developed by the U.S. Geological Survey California Water Science Center Sacramento Laboratory. The haloacetic acid formation potential is measured by dosing water samples with chlorine under specified conditions of pH, temperature, incubation time, darkness, and residual-free chlorine. The haloacetic acids formed are bromochloroacetic acid, bromodichloroacetic acid, dibromochloroacetic acid, dibromoacetic acid, dichloroacetic acid, monobromoacetic acid, monochloroacetic acid, tribromoacetic acid, and trichloroacetic acid. They are extracted, methylated, and then analyzed using a gas chromatograph equipped with an electron capture detector. Method validation experiments were performed to determine the method accuracy, precision, and detection limit for each of the compounds. Method detection limits for these nine haloacetic acids ranged from 0.11 to 0.45 microgram per liter. Quality-control practices include the use of blanks, quality-control samples, calibration verification standards, surrogate recovery, internal standard, matrix spikes, and duplicates.

  18. The MCNP6 Analytic Criticality Benchmark Suite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Forrest B.

    2016-06-16

    Analytical benchmarks provide an invaluable tool for verifying computer codes used to simulate neutron transport. Several collections of analytical benchmark problems [1-4] are used routinely in the verification of production Monte Carlo codes such as MCNP® [5,6]. Verification of a computer code is a necessary prerequisite to the more complex validation process. The verification process confirms that a code performs its intended functions correctly. The validation process involves determining the absolute accuracy of code results vs. nature. In typical validations, results are computed for a set of benchmark experiments using a particular methodology (code, cross-section data with uncertainties, and modeling) and compared to the measured results from the set of benchmark experiments. The validation process determines bias, bias uncertainty, and possibly additional margins. Verification is generally performed by the code developers, while validation is generally performed by code users for a particular application space. The VERIFICATION_KEFF suite of criticality problems [1,2] was originally a set of 75 criticality problems found in the literature for which exact analytical solutions are available. Even though the spatial and energy detail is necessarily limited in analytical benchmarks, typically to a few regions or energy groups, the exact solutions obtained can be used to verify that the basic algorithms, mathematics, and methods used in complex production codes perform correctly. The present work has focused on revisiting this benchmark suite. A thorough review of the problems resulted in discarding some of them as not suitable for MCNP benchmarking. For the remaining problems, many of them were reformulated to permit execution in either multigroup mode or in the normal continuous-energy mode for MCNP. Execution of the benchmarks in continuous-energy mode provides a significant advance to MCNP verification methods.

  19. Integration of relational and textual biomedical sources. A pilot experiment using a semi-automated method for logical schema acquisition.

    PubMed

    García-Remesal, M; Maojo, V; Billhardt, H; Crespo, J

    2010-01-01

    Bringing together structured and text-based sources is an exciting challenge for biomedical informaticians, since most relevant biomedical sources belong to one of these categories. In this paper we evaluate the feasibility of integrating relational and text-based biomedical sources using: i) an original logical schema acquisition method for textual databases developed by the authors, and ii) OntoFusion, a system originally designed by the authors for the integration of relational sources. We conducted an integration experiment involving a test set of seven differently structured sources covering the domain of genetic diseases. We used our logical schema acquisition method to generate schemas for all textual sources. The sources were integrated using the methods and tools provided by OntoFusion. The integration was validated using a test set of 500 queries. A panel of experts answered a questionnaire to evaluate i) the quality of the extracted schemas, ii) the query processing performance of the integrated set of sources, and iii) the relevance of the retrieved results. The results of the survey show that our method extracts coherent and representative logical schemas. Experts' feedback on the performance of the integrated system and the relevance of the retrieved results was also positive. Regarding the validation of the integration, the system successfully provided correct results for all queries in the test set. The results of the experiment suggest that text-based sources including a logical schema can be regarded as equivalent to structured databases. Using our method, previous research and existing tools designed for the integration of structured databases can be reused - possibly subject to minor modifications - to integrate differently structured sources.

  20. Development of a Self-Rated Mixed Methods Skills Assessment: The National Institutes of Health Mixed Methods Research Training Program for the Health Sciences.

    PubMed

    Guetterman, Timothy C; Creswell, John W; Wittink, Marsha; Barg, Fran K; Castro, Felipe G; Dahlberg, Britt; Watkins, Daphne C; Deutsch, Charles; Gallo, Joseph J

    2017-01-01

    Demand for training in mixed methods is high, with little research on faculty development or assessment in mixed methods. We describe the development of a self-rated mixed methods skills assessment and provide validity evidence. The instrument taps six research domains: "Research question," "Design/approach," "Sampling," "Data collection," "Analysis," and "Dissemination." Respondents are asked to rate their ability to define or explain concepts of mixed methods under each domain, their ability to apply the concepts to problems, and the extent to which they need to improve. We administered the questionnaire to 145 faculty and students using an internet survey. We analyzed descriptive statistics and performance characteristics of the questionnaire using the Cronbach alpha to assess reliability and an analysis of variance that compared a mixed methods experience index with assessment scores to assess criterion relatedness. Internal consistency reliability was high for the total set of items (0.95) and adequate (≥0.71) for all but one subscale. Consistent with establishing criterion validity, respondents who had more professional experiences with mixed methods (eg, published a mixed methods article) rated themselves as more skilled, which was statistically significant across the research domains. This self-rated mixed methods assessment instrument may be a useful tool to assess skills in mixed methods for training programs. It can be applied widely at the graduate and faculty level. For the learner, assessment may lead to enhanced motivation to learn and training focused on self-identified needs. For faculty, the assessment may improve curriculum and course content planning.
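
    The reliability figures quoted above are Cronbach's alpha values computed from the item responses. A minimal sketch of that calculation for a respondents-by-items score matrix follows; it is illustrative only and is not the authors' analysis code.

        import numpy as np

        def cronbach_alpha(scores):
            """Cronbach's alpha for a (respondents x items) array of item scores."""
            scores = np.asarray(scores, dtype=float)
            k = scores.shape[1]                              # number of items
            item_variance_sum = scores.var(axis=0, ddof=1).sum()
            total_variance = scores.sum(axis=1).var(ddof=1)  # variance of the total scores
            return k / (k - 1) * (1.0 - item_variance_sum / total_variance)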

  1. Improving validation methods for molecular diagnostics: application of Bland-Altman, Deming and simple linear regression analyses in assay comparison and evaluation for next-generation sequencing

    PubMed Central

    Misyura, Maksym; Sukhai, Mahadeo A; Kulasignam, Vathany; Zhang, Tong; Kamel-Reid, Suzanne; Stockley, Tracy L

    2018-01-01

    Aims A standard approach in test evaluation is to compare results of the assay in validation to results from previously validated methods. For quantitative molecular diagnostic assays, comparison of test values is often performed using simple linear regression and the coefficient of determination (R2), using R2 as the primary metric of assay agreement. However, the use of R2 alone does not adequately quantify constant or proportional errors required for optimal test evaluation. More extensive statistical approaches, such as Bland-Altman and expanded interpretation of linear regression methods, can be used to more thoroughly compare data from quantitative molecular assays. Methods We present the application of Bland-Altman and linear regression statistical methods to evaluate quantitative outputs from next-generation sequencing assays (NGS). NGS-derived data sets from assay validation experiments were used to demonstrate the utility of the statistical methods. Results Both Bland-Altman and linear regression were able to detect the presence and magnitude of constant and proportional error in quantitative values of NGS data. Deming linear regression was used in the context of assay comparison studies, while simple linear regression was used to analyse serial dilution data. Bland-Altman statistical approach was also adapted to quantify assay accuracy, including constant and proportional errors, and precision where theoretical and empirical values were known. Conclusions The complementary application of the statistical methods described in this manuscript enables more extensive evaluation of performance characteristics of quantitative molecular assays, prior to implementation in the clinical molecular laboratory. PMID:28747393
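
    A minimal sketch of the two complementary analyses described above, applied to paired quantitative outputs (for example, variant allele fractions from a previously validated assay and from the NGS assay under validation): a Bland-Altman bias with limits of agreement, and a simple linear regression whose slope and intercept expose proportional and constant error. Deming regression, which also models error in the reference method, is not shown. Array names are illustrative.

        import numpy as np
        from scipy import stats

        def bland_altman(reference, test):
            """Bias and 95% limits of agreement between two paired measurement sets."""
            reference, test = np.asarray(reference, float), np.asarray(test, float)
            diff = test - reference
            bias = diff.mean()
            spread = 1.96 * diff.std(ddof=1)
            return bias, bias - spread, bias + spread

        def simple_regression(reference, test):
            """Slope/intercept quantify proportional and constant error; R**2 alone does not."""
            slope, intercept, r, _, _ = stats.linregress(reference, test)
            return slope, intercept, r ** 2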

  2. Enabling Self-Monitoring Data Exchange in Participatory Medicine.

    PubMed

    Lopez-Campos, Guillermo; Ofoghi, Bahadorreza; Martin-Sanchez, Fernando

    2015-01-01

    The development of new methods, devices and apps for self-monitoring has enabled the extension of these approaches to consumer health and research purposes. The increase in the number and variety of devices has generated a complex scenario where reporting guidelines and data exchange formats will be needed to ensure the quality of the information and the reproducibility of experimental results. Based on the Minimal Information for Self Monitoring Experiments (MISME) reporting guideline, we have developed an XML format (MISME-ML) to facilitate data exchange for self-monitoring experiments. We have also developed a sample instance to illustrate the concept and a Java MISME-ML validation tool. The implementation and adoption of these tools should contribute to the consolidation of a set of methods that ensure the reproducibility of self-monitoring experiments for research purposes.

  3. Experimental Validation of Normalized Uniform Load Surface Curvature Method for Damage Localization

    PubMed Central

    Jung, Ho-Yeon; Sung, Seung-Hoon; Jung, Hyung-Jo

    2015-01-01

    In this study, we experimentally validated the normalized uniform load surface (NULS) curvature method, which has been developed recently to assess damage localization in beam-type structures. The normalization technique allows for the accurate assessment of damage localization with greater sensitivity irrespective of the damage location. In this study, damage to a simply supported beam was numerically and experimentally investigated on the basis of the changes in the NULS curvatures, which were estimated from the modal flexibility matrices obtained from the acceleration responses under an ambient excitation. Two damage scenarios were considered for the single damage case as well as the multiple damages case by reducing the bending stiffness (EI) of the affected element(s). Numerical simulations were performed using MATLAB as a preliminary step. During the validation experiments, a series of tests were performed. It was found that the damage locations could be identified successfully without any false-positive or false-negative detections using the proposed method. For comparison, the damage detection performances were compared with those of two other well-known methods based on the modal flexibility matrix, namely, the uniform load surface (ULS) method and the ULS curvature method. It was confirmed that the proposed method is more effective for investigating the damage locations of simply supported beams than the two conventional methods in terms of sensitivity to damage under measurement noise. PMID:26501286
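
    A hedged sketch of the flexibility-based quantities described above: the modal flexibility matrix assembled from mass-normalized mode shapes and natural frequencies, the uniform load surface (ULS) as its row sums, and a curvature estimate from central differences. The final per-node scaling is only meant to convey the normalization idea and is not the exact NULS definition used by the authors.

        import numpy as np

        def uls_and_curvature(mode_shapes, omegas, dx):
            """mode_shapes: (nodes x modes) mass-normalized shapes; omegas: natural
            circular frequencies [rad/s]; dx: sensor spacing along the beam."""
            flexibility = (mode_shapes / omegas ** 2) @ mode_shapes.T   # F = sum phi_k phi_k^T / w_k^2
            uls = flexibility.sum(axis=1)                               # deflection under a uniform load
            curvature = np.zeros_like(uls)
            curvature[1:-1] = (uls[2:] - 2.0 * uls[1:-1] + uls[:-2]) / dx ** 2
            return uls, curvature

        def normalized_index(curvature):
            """Illustrative normalization so curvature changes are comparable along the span."""
            mag = np.abs(curvature)
            return mag / (mag.max() + 1e-12)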

  4. Evaluation of the user experience of "astronaut training device": an immersive, vr-based, motion-training system

    NASA Astrophysics Data System (ADS)

    Yue, Kang; Wang, Danli; Yang, Xinpan; Hu, Haichen; Liu, Yuqing; Zhu, Xiuqing

    2016-10-01

    To date, VR-based training systems have differed according to their fields of application. Therefore, we should take the characteristics of the application field into consideration and adopt different evaluation methods when evaluating the user experience of these training systems. In this paper, we propose a method to evaluate the user experience of a virtual astronaut training system, and we design an experiment based on the proposed method. The proposed method takes learning performance as one of the evaluation dimensions and combines it with other evaluation dimensions such as presence, immersion, pleasure, satisfaction and fatigue to evaluate the user experience of the system. We collect subjective and objective data: the subjective data come mainly from a questionnaire designed around the evaluation dimensions and from user interviews conducted before and after the experiment, while the objective data consist of electrocardiogram (ECG) signals, reaction times, numbers of reaction errors and the video data recorded during the experiment. For the data analysis, we calculate an integrated score for each evaluation dimension using factor analysis. In order to improve the credibility of the assessment, we use the ECG signal and the reaction test data collected before and after the experiment to validate the changes in fatigue during the experiment, and the typical behavioral features extracted from the experiment video to explain the results of the subjective questionnaire. Experimental results show that the system provides a good user experience and learning performance, but slight visual fatigue exists after the experiment.

  5. The Development of Accepted Performance Items to Demonstrate Braille Competence in the Nemeth Code for Mathematics and Science Notation

    ERIC Educational Resources Information Center

    Smith, Derrick; Rosenblum, L. Penny

    2013-01-01

    Introduction: The purpose of the study presented here was the initial validation of a comprehensive set of competencies focused solely on the Nemeth code. Methods: Using the Delphi method, 20 expert panelists were recruited to participate in the study on the basis of their past experience in teaching a university-level course in the Nemeth code.…

  6. Chapter 17: Residential Behavior Evaluation Protocol. The Uniform Methods Project: Methods for Determining Energy Efficiency Savings for Specific Measures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kurnik, Charles W.; Stewart, James; Todd, Annika

    Residential behavior-based (BB) programs use strategies grounded in the behavioral and social sciences to influence household energy use. These may include providing households with real-time or delayed feedback about their energy use; supplying energy efficiency education and tips; rewarding households for reducing their energy use; comparing households to their peers; and establishing games, tournaments, and competitions. BB programs often target multiple energy end uses and encourage energy savings, demand savings, or both. Savings from BB programs are usually a small percentage of energy use, typically less than 5 percent. Utilities will continue to implement residential BB programs as large-scale, randomized control trials (RCTs); however, some are now experimenting with alternative program designs that are smaller scale; involve new communication channels such as the web, social media, and text messaging; or that employ novel strategies for encouraging behavior change (for example, Facebook competitions). These programs will create new evaluation challenges and may require different evaluation methods than those currently employed to verify any savings they generate. Quasi-experimental methods, however, require stronger assumptions to yield valid savings estimates and may not measure savings with the same degree of validity and accuracy as randomized experiments.

  7. Research on simulated infrared image utility evaluation using deep representation

    NASA Astrophysics Data System (ADS)

    Zhang, Ruiheng; Mu, Chengpo; Yang, Yu; Xu, Lixin

    2018-01-01

    Infrared (IR) image simulation is an important data source for various target recognition systems. However, whether simulated IR images can be used as training data for classifiers depends on the fidelity and authenticity of the simulated images. For the evaluation of IR image features, a deep-representation-based algorithm is proposed. Unlike conventional methods, which usually adopt a priori knowledge or manually designed features, the proposed method can extract essential features and quantitatively evaluate the utility of simulated IR images. First, for data preparation, we employ our IR image simulation system to generate large numbers of IR images. Then, we present the evaluation model for simulated IR images, for which an end-to-end IR feature extraction and target detection model based on a deep convolutional neural network is designed. Finally, the experiments illustrate that our proposed method outperforms other verification algorithms in evaluating simulated IR images. Cross-validation, variable-proportion mixed-data validation, and simulation process contrast experiments were carried out to evaluate the utility and objectivity of the images generated by our simulation system. The optimum mixing ratio between simulated and real data is 0.2 ≤ γ ≤ 0.3, which makes this an effective data augmentation method for real IR images.

  8. Social Capital: Its Constructs and Survey Development

    ERIC Educational Resources Information Center

    Enfield, Richard P.; Nathaniel, Keith C.

    2013-01-01

    This article reports on experiences and methods of adapting a valid adult social capital assessment to youth audiences in order to measure social capital and sense of place. The authors outline the process of adapting, revising, prepiloting, piloting, and administering a youth survey exploring young people's sense of community, involvement in the…

  9. The Validity of a Personality Disorder Diagnosis for People with an Intellectual Disability

    ERIC Educational Resources Information Center

    Moreland, Jessica; Hendy, Steve; Brown, Freddy

    2008-01-01

    Background: It has long been appreciated that people with intellectual disabilities experience mental health problems. Studies into the prevalence of personality disorder in the population of people with an intellectual disability indicate significant variations, which have no clear explanation. Method: Work on personality disorder and personality…

  10. Optical simulations for experimental networks: lessons from MONET

    NASA Astrophysics Data System (ADS)

    Richards, Dwight H.; Jackel, Janet L.; Goodman, Matthew S.; Roudas, Ioannis; Wagner, Richard E.; Antoniades, Neophytos

    1999-08-01

    We have used optical simulations as a means of setting component requirements, assessing component compatibility, and designing experiments in the MONET (Multiwavelength Optical Networking) Project. This paper reviews the simulation method, gives some examples of the types of simulations that have been performed, and discusses the validation of the simulations.

  11. Empathy Training: Methods, Evaluation Practices, and Validity

    ERIC Educational Resources Information Center

    Lam, Tony Chiu Ming; Kolomitro, Klodiana; Alamparambil, Flanny C.

    2011-01-01

    Background: Empathy is an individual's capacity to understand the behavior of others, to experience their feelings, and to express that understanding to them. Empathic ability is an asset professionally for individuals, such as teachers, physicians and social workers, who work with people. Being empathetic is also critical to our being able to…

  12. Preliminary Evaluation of an Educational Outcomes Assessment Process for Dental Interpretive Radiography.

    ERIC Educational Resources Information Center

    Weems, Richard A.; And Others

    1992-01-01

    A procedure for testing the ability of dental students to detect presence and depth of dental caries was evaluated. Students (n=40) from four experience groups examined radiographs obtained from a model. Results indicated that this method of assessing student competence in radiographic interpretation is valid. (MSE)

  13. Physical Activity Measurement Methods for Young Children: A Comparative Study

    ERIC Educational Resources Information Center

    Hands, Beth; Parker, Helen; Larkin, Dawne

    2006-01-01

    Many behavior patterns that impact on physical activity experiences are established in early childhood, therefore it is important that valid, reliable, and feasible measures are constructed to identify children who are not developing appropriate and healthy activity habits. In this study, measures of physical activity derived by accelerometry and…

  14. Toward Evidence-Informed Policy and Practice in Child Welfare

    ERIC Educational Resources Information Center

    Littell, Julia H.; Shlonsky, Aron

    2010-01-01

    Drawing on the authors' experience in the international Campbell Collaboration, this essay presents a principled and pragmatic approach to evidence-informed decisions about child welfare. This approach takes into account the growing body of empirical evidence on the reliability and validity of various methods of research synthesis. It also…

  15. Techniques for obtaining subjective response to vertical vibration

    NASA Technical Reports Server (NTRS)

    Clarke, M. J.; Oborne, D. J.

    1975-01-01

    Laboratory experiments were performed to validate the techniques used for obtaining ratings in the field surveys carried out by the University College of Swansea. In addition, attempts were made to evaluate the basic form of the human response to vibration. Some of the results obtained by different methods are described.

  16. Development and validation of the Measure of Indigenous Racism Experiences (MIRE)

    PubMed Central

    Paradies, Yin C; Cunningham, Joan

    2008-01-01

    Background In recent decades there has been increasing evidence of a relationship between self-reported racism and health. Although a plethora of instruments to measure racism have been developed, very few have been described conceptually or psychometrically. Furthermore, this research field has been limited by a dearth of instruments that examine reactions/responses to racism and by a restricted focus on African American populations. Methods In response to these limitations, the 31-item Measure of Indigenous Racism Experiences (MIRE) was developed to assess self-reported racism for Indigenous Australians. This paper describes the development of the MIRE together with an opportunistic examination of its content, construct and convergent validity in a population health study involving 312 Indigenous Australians. Results Focus group research supported the content validity of the MIRE, and inter-item/scale correlations suggested good construct validity. A good fit with a priori conceptual dimensions was demonstrated in factor analysis, and convergence with a separate item on discrimination was satisfactory. Conclusion The MIRE has considerable utility as an instrument that can assess multiple facets of racism, together with responses/reactions to racism, among indigenous populations and, potentially, among other ethnic/racial groups. PMID:18426602

  17. Assessment of MARMOT Grain Growth Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fromm, B.; Zhang, Y.; Schwen, D.

    2015-12-01

    This report assesses the MARMOT grain growth model by comparing modeling predictions with experimental results from thermal annealing. The purpose here is threefold: (1) to demonstrate the validation approach of using thermal annealing experiments with non-destructive characterization, (2) to test the reconstruction capability and computation efficiency in MOOSE, and (3) to validate the grain growth model and the associated parameters that are implemented in MARMOT for UO2. To assure a rigorous comparison, the 2D and 3D initial experimental microstructures of UO2 samples were characterized using non-destructive synchrotron X-ray. The same samples were then annealed at 2273 K for grain growth, and their initial microstructures were used as initial conditions for simulated annealing at the same temperature using MARMOT. After annealing, the final experimental microstructures were characterized again to compare with the results from the simulations. So far, comparison between modeling and experiments has been done for 2D microstructures, and 3D comparison is underway. The preliminary results demonstrated the usefulness of the non-destructive characterization method for MARMOT grain growth model validation. A detailed analysis of the 3D microstructures is in progress to fully validate the current model in MARMOT.

  18. Numerical studies and metric development for validation of magnetohydrodynamic models on the HIT-SI experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hansen, C., E-mail: hansec@uw.edu; Columbia University, New York, New York 10027; Victor, B.

    We present the application of three scalar metrics derived from the Biorthogonal Decomposition (BD) technique to evaluate the level of agreement between macroscopic plasma dynamics in different data sets. BD decomposes large data sets, as produced by distributed diagnostic arrays, into principal mode structures without assumptions on spatial or temporal structure. These metrics have been applied to validation of the Hall-MHD model using experimental data from the Helicity Injected Torus with Steady Inductive helicity injection (HIT-SI) experiment. Each metric provides a measure of correlation between mode structures extracted from experimental data and simulations for an array of 192 surface-mounted magnetic probes. Numerical validation studies have been performed using the NIMROD code, where the injectors are modeled as boundary conditions on the flux conserver, and the PSI-TET code, where the entire plasma volume is treated. Initial results from a comprehensive validation study of high performance operation with different injector frequencies are presented, illustrating application of the BD method. Using a simplified (constant, uniform density and temperature) Hall-MHD model, simulation results agree with experimental observation for two of the three defined metrics when the injectors are driven with a frequency of 14.5 kHz.
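
    In practice, a biorthogonal decomposition of a (time samples x probes) signal matrix is a singular value decomposition, and a simple agreement measure between experiment and simulation can be formed from correlations of the leading spatial modes weighted by their energies. The sketch below illustrates that idea under stated assumptions; it is not the three specific metrics defined by the authors.

        import numpy as np

        def bd_spatial_modes(signals, n_modes=4):
            """Leading spatial (probe-space) modes and energy fractions of a
            (time x probes) signal matrix."""
            _, s, vt = np.linalg.svd(signals - signals.mean(axis=0), full_matrices=False)
            return vt[:n_modes], (s[:n_modes] ** 2) / np.sum(s ** 2)

        def mode_agreement(exp_signals, sim_signals, n_modes=4):
            """Energy-weighted |cosine similarity| between experimental and simulated modes."""
            exp_modes, exp_energy = bd_spatial_modes(exp_signals, n_modes)
            sim_modes, _ = bd_spatial_modes(sim_signals, n_modes)
            overlaps = [abs(np.dot(e, m)) for e, m in zip(exp_modes, sim_modes)]
            return float(np.average(overlaps, weights=exp_energy))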

  19. Complementary and Alternative Medicine: Italian Validation of a Questionnaire on Nurses' Personal and Professional Use, Knowledge, and Attitudes.

    PubMed

    Belletti, Giada; Shorofi, Seyed Afshin; Arbon, Paul; Dal Molin, Alberto

    2017-08-01

    Patients are showing an increasing interest in the use of complementary and alternative medicine (CAM). Most nurses are open to the adoption of CAM into clinical nursing practice, but they may experience a lack of knowledge about the safe and effective use of these therapies. Several studies concerning nurses' knowledge of and attitudes toward CAM have been published, but in only one did the authors (Shorofi and Arbon) use a validated questionnaire. In Italy, there are no validated questionnaires to investigate this aspect of nursing practice. The aim was to test the psychometric properties of the Italian version of the Shorofi and Arbon questionnaire for use with Italian nurses. A forward-backward translation method was used to translate the questionnaire from English to Italian. Content validity, face validity and reliability were established. This study examined the potential usefulness of the Shorofi and Arbon questionnaire for the evaluation of the CAM knowledge of Italian-speaking nurses, and it showed good content validity and good reliability.

  20. Edge enhancement of color images using a digital micromirror device.

    PubMed

    Di Martino, J Matías; Flores, Jorge L; Ayubi, Gastón A; Alonso, Julia R; Fernández, Ariel; Ferrari, José A

    2012-06-01

    A method for orientation-selective enhancement of edges in color images is proposed. The method utilizes the capacity of digital micromirror devices to generate a positive and a negative color replica of the image used as input. When both images are slightly displaced and imaged together, one obtains an image with enhanced edges. The proposed technique does not require a coherent light source or precise alignment. The proposed method could potentially be useful for processing large image sequences in real time. Validation experiments are presented.
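
    The optical operation described (superimposing a slightly displaced negative replica on the positive image) is, in digital terms, a directional finite difference. A minimal sketch of that analogue, assuming a simple integer-pixel displacement and a per-channel float image, is shown below; it illustrates the principle only and is not the optical implementation.

        import numpy as np

        def directional_edge_enhance(image, shift=(0, 1)):
            """Add a displaced negative replica to the image (directional edge enhancement).

            image : float array of shape (rows, cols) or (rows, cols, channels).
            shift : (row, col) displacement of the negative replica in pixels.
            """
            negative = -np.roll(image, shift, axis=(0, 1))   # displaced negative replica
            return image + negative                          # fringes appear at edges normal to the shift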

  1. Design of a Synthetic Aperture Array to Support Experiments in Active Control of Scattering

    DTIC Science & Technology

    1990-06-01

    becomes necessary to validate the theory and test the control system algorithms. While experiments in open water would be most like the anticipated...mathematical development of the beamforming algorithms used as well as an estimate of their applicability to the specifics of beamforming in a reverberant...Chebyshev array have been proposed. The method used in ARRAY, a nested product algorithm, proposed by Bresler [21] is recommended by Pozar [19] and

  2. Tutorial in Biostatistics: Instrumental Variable Methods for Causal Inference*

    PubMed Central

    Baiocchi, Michael; Cheng, Jing; Small, Dylan S.

    2014-01-01

    A goal of many health studies is to determine the causal effect of a treatment or intervention on health outcomes. Often, it is not ethically or practically possible to conduct a perfectly randomized experiment and instead an observational study must be used. A major challenge to the validity of observational studies is the possibility of unmeasured confounding (i.e., unmeasured ways in which the treatment and control groups differ before treatment administration which also affect the outcome). Instrumental variables analysis is a method for controlling for unmeasured confounding. This type of analysis requires the measurement of a valid instrumental variable, which is a variable that (i) is independent of the unmeasured confounding; (ii) affects the treatment; and (iii) affects the outcome only indirectly through its effect on the treatment. This tutorial discusses the types of causal effects that can be estimated by instrumental variables analysis; the assumptions needed for instrumental variables analysis to provide valid estimates of causal effects and sensitivity analysis for those assumptions; methods of estimation of causal effects using instrumental variables; and sources of instrumental variables in health studies. PMID:24599889
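
    The canonical estimator behind instrumental variables analysis is two-stage least squares: regress the treatment on the instrument, then regress the outcome on the fitted treatment. A minimal sketch with a single instrument and no covariates follows; variable names are illustrative, and a real analysis would add covariates and appropriate standard errors.

        import numpy as np

        def two_stage_least_squares(instrument, treatment, outcome):
            """IV (2SLS) point estimate of the effect of treatment on outcome."""
            z = np.column_stack([np.ones_like(instrument), instrument])
            # Stage 1: predict the treatment from the instrument.
            stage1_coef, *_ = np.linalg.lstsq(z, treatment, rcond=None)
            treatment_hat = z @ stage1_coef
            # Stage 2: regress the outcome on the fitted (exogenous) part of the treatment.
            x = np.column_stack([np.ones_like(treatment_hat), treatment_hat])
            stage2_coef, *_ = np.linalg.lstsq(x, outcome, rcond=None)
            return stage2_coef[1]   # estimated causal effect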

  3. Development plan for the External Hazards Experimental Group. Light Water Reactor Sustainability Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coleman, Justin Leigh; Smith, Curtis Lee; Burns, Douglas Edward

    This report describes the development plan for a new multi-partner External Hazards Experimental Group (EHEG) coordinated by Idaho National Laboratory (INL) within the Risk-Informed Safety Margin Characterization (RISMC) technical pathway of the Light Water Reactor Sustainability Program. Currently, there is limited data available for development and validation of the tools and methods being developed in the RISMC Toolkit. The EHEG is being developed to obtain high-quality, small- and large-scale experimental data for validation of RISMC tools and methods in a timely and cost-effective way. The group of universities and national laboratories that will eventually form the EHEG (which is ultimately expected to include both the initial participants and other universities and national laboratories that have been identified) have the expertise and experimental capabilities needed to both obtain and compile existing data archives and perform additional seismic and flooding experiments. The data developed by the EHEG will be stored in databases for use within RISMC. These databases will be used to validate the advanced external hazard tools and methods.

  4. Comparison of OpenFOAM and EllipSys3D actuator line methods with (NEW) MEXICO results

    NASA Astrophysics Data System (ADS)

    Nathan, J.; Meyer Forsting, A. R.; Troldborg, N.; Masson, C.

    2017-05-01

    The Actuator Line Method has existed for more than a decade and has become a well-established choice for simulating wind turbine rotors in computational fluid dynamics. Numerous implementations exist and are used in the wind energy research community. These codes have been verified against experimental data such as the MEXICO experiment. Often, however, the verification against other codes has been made only on a very broad scale. Therefore, this study first attempts a validation by comparing two different implementations, namely an adapted version of SOWFA/OpenFOAM and EllipSys3D, and also a verification by comparing against experimental results from the MEXICO and NEW MEXICO experiments.

  5. A rotation-translation invariant molecular descriptor of partial charges and its use in ligand-based virtual screening

    PubMed Central

    2014-01-01

    Background Measures of similarity for chemical molecules have been developed since the dawn of chemoinformatics. Molecular similarity has been measured by a variety of methods, including molecular-descriptor-based similarity, common molecular fragments, graph matching and 3D methods such as shape matching. Similarity measures are widespread in practice and have proven to be useful in drug discovery. Because of our interest in electrostatics and high-throughput ligand-based virtual screening, we sought to exploit the information contained in the atomic coordinates and partial charges of a molecule. Results A new molecular descriptor based on partial charges is proposed. It uses the autocorrelation function and linear binning to encode all atoms of a molecule into two rotation-translation invariant vectors. Combined with a scoring function, the descriptor allows a database of compounds to be rank-ordered against a query molecule. The proposed implementation is called ACPC (AutoCorrelation of Partial Charges) and is released as open source. Extensive retrospective ligand-based virtual screening experiments were performed and compared with other methods in order to validate the method and the associated protocol. Conclusions While it is a simple method, it performed remarkably well in experiments. At an average speed of 1649 molecules per second, it reached an average median area under the curve of 0.81 on 40 different targets, thereby validating the proposed protocol and implementation. PMID:24887178
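
    The descriptor idea can be sketched as an autocorrelation of partial charges over inter-atomic distance: for every atom pair, the product of the two partial charges is accumulated into a distance bin, giving a vector that is invariant to rotation and translation. The sketch below uses simple nearest-bin accumulation; the published method uses linear binning (splitting each contribution between neighbouring bins) and a dedicated scoring function, so this is only an illustration of the general idea.

        import numpy as np
        from itertools import combinations

        def charge_autocorrelation(coords, charges, bin_width=0.5, max_dist=12.0):
            """Rotation/translation-invariant descriptor from 3D coordinates and partial charges.

            coords  : (n_atoms, 3) array of Cartesian coordinates [angstrom].
            charges : (n_atoms,) array of partial charges.
            """
            coords = np.asarray(coords, dtype=float)
            descriptor = np.zeros(int(np.ceil(max_dist / bin_width)))
            for i, j in combinations(range(len(charges)), 2):
                d = np.linalg.norm(coords[i] - coords[j])
                if d < max_dist:
                    descriptor[int(d / bin_width)] += charges[i] * charges[j]
            return descriptor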

  6. Improved patch-based learning for image deblurring

    NASA Astrophysics Data System (ADS)

    Dong, Bo; Jiang, Zhiguo; Zhang, Haopeng

    2015-05-01

    Most recent image deblurring methods use only the valid information found in the input image as the clue for restoring the blurred region. These methods usually suffer from insufficient prior information and relatively poor adaptiveness. Patch-based methods not only use the valid information of the input image itself, but also utilize prior information from sample images to improve adaptiveness. However, the cost function of this approach is quite time-consuming to evaluate, and the method may also produce ringing artifacts. In this paper, we propose an improved non-blind deblurring algorithm based on learning patch likelihoods. On the one hand, we consider the effect of Gaussian mixture model components with different weights and normalize the weight values, which optimizes the cost function and reduces the running time. On the other hand, a post-processing method is proposed to remove the ringing artifacts produced by the traditional patch-based method. Extensive experiments are performed. The experimental results verify that our method can effectively reduce the execution time, suppress ringing artifacts, and preserve the quality of the deblurred image.

  7. Developing 3D microscopy with CLARITY on human brain tissue: Towards a tool for informing and validating MRI-based histology.

    PubMed

    Morawski, Markus; Kirilina, Evgeniya; Scherf, Nico; Jäger, Carsten; Reimann, Katja; Trampel, Robert; Gavriilidis, Filippos; Geyer, Stefan; Biedermann, Bernd; Arendt, Thomas; Weiskopf, Nikolaus

    2017-11-28

    Recent breakthroughs in magnetic resonance imaging (MRI) enabled quantitative relaxometry and diffusion-weighted imaging with sub-millimeter resolution. Combined with biophysical models of MR contrast, the emerging methods promise in vivo mapping of cyto- and myelo-architectonics, i.e., in vivo histology using MRI (hMRI) in humans. The hMRI methods require histological reference data for model building and validation. This is currently provided by MRI on post mortem human brain tissue in combination with classical histology on sections. However, this well-established approach is limited to qualitative 2D information, while a systematic validation of hMRI requires quantitative 3D information on macroscopic voxels. We present a promising histological method based on optical 3D imaging combined with a tissue clearing method, Clear Lipid-exchanged Acrylamide-hybridized Rigid Imaging compatible Tissue hYdrogel (CLARITY), adapted for hMRI validation. Adapting CLARITY to the needs of hMRI is challenging due to poor antibody penetration into large sample volumes and the high opacity of aged post mortem human brain tissue. In a pilot experiment we achieved transparency of up to 8 mm-thick tissue and immunohistochemical staining of up to 5 mm-thick post mortem brain tissue by a combination of active and passive clearing and prolonged clearing and staining times. We combined 3D optical imaging of the cleared samples with tailored image processing methods. We demonstrated the feasibility of quantifying neuron density, fiber orientation distribution and cell type classification within a volume similar in size to a typical MRI voxel. The presented combination of MRI, 3D optical microscopy and image processing is a promising tool for validation of MRI-based microstructure estimates. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  8. Multivariate optimization and validation of an analytical methodology by RP-HPLC for the determination of losartan potassium in capsules.

    PubMed

    Bonfilio, Rudy; Tarley, César Ricardo Teixeira; Pereira, Gislaine Ribeiro; Salgado, Hérida Regina Nunes; de Araújo, Magali Benjamim

    2009-11-15

    This paper describes the optimization and validation of an analytical methodology for the determination of losartan potassium in capsules by HPLC using 2^(5-1) fractional factorial and Doehlert designs. This multivariate approach allows a considerable improvement in chromatographic performance using fewer experiments, without additional cost for columns or other equipment. The HPLC method utilized potassium phosphate buffer (pH 6.2; 58 mmol/L)-acetonitrile (65:35, v/v) as the mobile phase, pumped at a flow rate of 1.0 mL/min. An octylsilane column (100 mm x 4.6 mm i.d., 5 µm) maintained at 35 °C was used as the stationary phase. UV detection was performed at 254 nm. The method was validated according to the ICH guidelines, showing accuracy, precision (intra-day and inter-day relative standard deviation (R.S.D.) values <2.0%), selectivity, robustness and linearity (r=0.9998) over a concentration range from 30 to 70 mg/L of losartan potassium. The limits of detection and quantification were 0.114 and 0.420 mg/L, respectively. The validated method may be used to quantify losartan potassium in capsules and to determine the stability of this drug.
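
    Limits of detection and quantification of the kind quoted above are commonly derived, per ICH Q2 guidance, from the residual standard deviation of a calibration line and its slope (LOD ≈ 3.3 σ/S, LOQ ≈ 10 σ/S). The sketch below shows that generic calculation; it illustrates the ICH approach rather than the exact procedure used in this validation.

        import numpy as np
        from scipy import stats

        def lod_loq_from_calibration(concentrations, responses):
            """ICH-style detection/quantification limits from a linear calibration curve."""
            concentrations = np.asarray(concentrations, dtype=float)
            responses = np.asarray(responses, dtype=float)
            slope, intercept, *_ = stats.linregress(concentrations, responses)
            residuals = responses - (slope * concentrations + intercept)
            sigma = residuals.std(ddof=2)        # residual standard deviation of the fit
            return 3.3 * sigma / slope, 10.0 * sigma / slope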

  9. Slip resistance of winter footwear on snow and ice measured using maximum achievable incline.

    PubMed

    Hsu, Jennifer; Shaw, Robert; Novak, Alison; Li, Yue; Ormerod, Marcus; Newton, Rita; Dutta, Tilak; Fernie, Geoff

    2016-05-01

    Protective footwear is necessary for preventing injurious slips and falls in winter conditions. Valid methods for assessing footwear slip resistance on winter surfaces are needed in order to evaluate footwear and outsole designs. The purpose of this study was to utilise a method of testing winter footwear that was ecologically valid in terms of involving actual human testers walking on realistic winter surfaces to produce objective measures of slip resistance. During the experiment, eight participants tested six styles of footwear on wet ice, on dry ice, and on dry ice after walking over soft snow. Slip resistance was measured by determining the maximum incline angles participants were able to walk up and down in each footwear-surface combination. The results indicated that testing on a variety of surfaces is necessary for establishing winter footwear performance and that standard mechanical bench tests for footwear slip resistance do not adequately reflect actual performance. Practitioner Summary: Existing standardised methods for measuring footwear slip resistance lack validation on winter surfaces. By determining the maximum inclines participants could walk up and down slopes of wet ice, dry ice, and ice with snow, in a range of footwear, an ecologically valid test for measuring winter footwear performance was established.

  10. Slip resistance of winter footwear on snow and ice measured using maximum achievable incline

    PubMed Central

    Hsu, Jennifer; Shaw, Robert; Novak, Alison; Li, Yue; Ormerod, Marcus; Newton, Rita; Dutta, Tilak; Fernie, Geoff

    2016-01-01

    Abstract Protective footwear is necessary for preventing injurious slips and falls in winter conditions. Valid methods for assessing footwear slip resistance on winter surfaces are needed in order to evaluate footwear and outsole designs. The purpose of this study was to utilise a method of testing winter footwear that was ecologically valid in terms of involving actual human testers walking on realistic winter surfaces to produce objective measures of slip resistance. During the experiment, eight participants tested six styles of footwear on wet ice, on dry ice, and on dry ice after walking over soft snow. Slip resistance was measured by determining the maximum incline angles participants were able to walk up and down in each footwear–surface combination. The results indicated that testing on a variety of surfaces is necessary for establishing winter footwear performance and that standard mechanical bench tests for footwear slip resistance do not adequately reflect actual performance. Practitioner Summary: Existing standardised methods for measuring footwear slip resistance lack validation on winter surfaces. By determining the maximum inclines participants could walk up and down slopes of wet ice, dry ice, and ice with snow, in a range of footwear, an ecologically valid test for measuring winter footwear performance was established. PMID:26555738

  11. A Numerical Theory for Impedance Eduction in Three-Dimensional Normal Incidence Tubes

    NASA Technical Reports Server (NTRS)

    Watson, Willie R.; Jones, Michael G.

    2016-01-01

    A method for educing the locally-reacting acoustic impedance of a test sample mounted in a 3-D normal incidence impedance tube is presented and validated. The unique feature of the method is that the excitation frequency (or duct geometry) may be such that high-order duct modes exist. The method educes the impedance, iteratively, by minimizing an objective function consisting of the difference between the measured and numerically computed acoustic pressure at preselected measurement points in the duct. The method is validated on planar and high-order mode sources with data synthesized from exact mode theory. These data are then subjected to random jitter to simulate the effects of measurement uncertainties on the educed impedance spectrum. The primary conclusions of the study are 1) without random jitter, the educed impedance is in excellent agreement with that of the known impedance samples, and 2) random jitter comparable to that found in a typical experiment has minimal impact on the accuracy of the educed impedance.

  12. Extended Finite Element Method with Simplified Spherical Harmonics Approximation for the Forward Model of Optical Molecular Imaging

    PubMed Central

    Li, Wei; Yi, Huangjian; Zhang, Qitan; Chen, Duofang; Liang, Jimin

    2012-01-01

    An extended finite element method (XFEM) for the forward model of 3D optical molecular imaging is developed with the simplified spherical harmonics approximation (SPN). In the XFEM scheme of the SPN equations, the signed distance function is employed to accurately represent the internal tissue boundary, and it is then used to construct the enriched basis functions of the finite element scheme. Therefore, the finite element calculation can be carried out without time-consuming internal boundary mesh generation. Moreover, the overly fine mesh that would otherwise be required to conform to the complex tissue boundary, with its excess time cost, can be avoided. XFEM thus eases application to tissues with complex internal structure and improves computational efficiency. Phantom and digital mouse experiments were carried out to validate the efficiency of the proposed method. Compared with the standard finite element method and the classical Monte Carlo (MC) method, the validation results show the merits and potential of XFEM for optical imaging. PMID:23227108

  13. Extended finite element method with simplified spherical harmonics approximation for the forward model of optical molecular imaging.

    PubMed

    Li, Wei; Yi, Huangjian; Zhang, Qitan; Chen, Duofang; Liang, Jimin

    2012-01-01

    An extended finite element method (XFEM) for the forward model of 3D optical molecular imaging is developed with the simplified spherical harmonics approximation (SP(N)). In the XFEM scheme of the SP(N) equations, the signed distance function is employed to accurately represent the internal tissue boundary, and it is then used to construct the enriched basis functions of the finite element scheme. Therefore, the finite element calculation can be carried out without time-consuming internal boundary mesh generation. Moreover, the overly fine mesh that would otherwise be required to conform to the complex tissue boundary, with its excess time cost, can be avoided. XFEM thus eases application to tissues with complex internal structure and improves computational efficiency. Phantom and digital mouse experiments were carried out to validate the efficiency of the proposed method. Compared with the standard finite element method and the classical Monte Carlo (MC) method, the validation results show the merits and potential of XFEM for optical imaging.

  14. Validation of model-based deformation correction in image-guided liver surgery via tracked intraoperative ultrasound: preliminary method and results

    NASA Astrophysics Data System (ADS)

    Clements, Logan W.; Collins, Jarrod A.; Wu, Yifei; Simpson, Amber L.; Jarnagin, William R.; Miga, Michael I.

    2015-03-01

    Soft tissue deformation represents a significant error source in current surgical navigation systems used for open hepatic procedures. While numerous algorithms have been proposed to rectify the tissue deformation that is encountered during open liver surgery, clinical validation of the proposed methods has been limited to surface based metrics and sub-surface validation has largely been performed via phantom experiments. Tracked intraoperative ultrasound (iUS) provides a means to digitize sub-surface anatomical landmarks during clinical procedures. The proposed method involves the validation of a deformation correction algorithm for open hepatic image-guided surgery systems via sub-surface targets digitized with tracked iUS. Intraoperative surface digitizations were acquired via a laser range scanner and an optically tracked stylus for the purposes of computing the physical-to-image space registration within the guidance system and for use in retrospective deformation correction. Upon completion of surface digitization, the organ was interrogated with a tracked iUS transducer where the iUS images and corresponding tracked locations were recorded. After the procedure, the clinician reviewed the iUS images to delineate contours of anatomical target features for use in the validation procedure. Mean closest point distances between the feature contours delineated in the iUS images and corresponding 3-D anatomical model generated from the preoperative tomograms were computed to quantify the extent to which the deformation correction algorithm improved registration accuracy. The preliminary results for two patients indicate that the deformation correction method resulted in a reduction in target error of approximately 50%.
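
    The validation metric described above, a mean closest-point distance from contour points delineated in the tracked iUS images to the preoperative anatomical surface model, can be sketched with a nearest-neighbour query on a KD-tree. Array names and units are illustrative; this is not the authors' implementation.

        import numpy as np
        from scipy.spatial import cKDTree

        def mean_closest_point_distance(contour_points, model_vertices):
            """Mean distance from each iUS-derived contour point to its nearest
            vertex of the preoperative surface model (same units as the inputs)."""
            tree = cKDTree(model_vertices)             # surface model vertices, shape (m, 3)
            distances, _ = tree.query(contour_points)  # contour points, shape (n, 3)
            return float(distances.mean())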

  15. Item generation in the development of an inpatient experience questionnaire: a qualitative study

    PubMed Central

    2013-01-01

    Background Patient experience is a key feature of quality improvement in modern health-care delivery. Measuring patient experience is one of several tools used to assess and monitor the quality of health services. This study aims to develop a tool for assessing patient experience with inpatient care in public hospitals in Hong Kong. Methods Based on the General Inpatient Questionnaire (GIQ) framework of the Care Quality Commission as a discussion guide, a qualitative study involving focus group discussions and in-depth individual interviews with patients was employed to develop a tool for measuring inpatient experience in Hong Kong. Results All participants agreed that a patient satisfaction survey is an important platform for collecting patients’ views on improving the quality of health-care services. Findings of the focus group discussions and in-depth individual interviews identified nine key themes as important hospital quality indicators: prompt access, information provision, care and involvement in decision making, physical and emotional needs, coordination of care, respect and privacy, environment and facilities, handling of patient feedback, and overall care from health-care professionals and quality of care. Privacy, complaint mechanisms, patient involvement, and information provision were further highlighted as particularly important areas for item revision by the in-depth individual interviews. Thus, the initial version of the Hong Kong Inpatient Experience Questionnaire (HKIEQ), comprising 58 core items under nine themes, was developed. Conclusions A set of dimensions and core items of the HKIEQ was developed and the instrument will undergo validity and reliability tests through a validation survey. A valid and reliable tool is important in accurately assessing patient experience with care delivery in hospitals to improve the quality of health-care services. PMID:23835186

  16. Sampling Participants’ Experience in Laboratory Experiments: Complementary Challenges for More Complete Data Collection

    PubMed Central

    McAuliffe, Alan; McGann, Marek

    2016-01-01

    Speelman and McGann’s (2013) examination of the uncritical way in which the mean is often used in psychological research raises questions both about the average’s reliability and its validity. In the present paper, we argue that interrogating the validity of the mean involves, amongst other things, a better understanding of the person’s experiences, the meaning of their actions, at the time that the behavior of interest is carried out. Recently emerging approaches within Psychology and Cognitive Science have argued strongly that experience should play a more central role in our examination of behavioral data, but the relationship between experience and behavior remains very poorly understood. We outline some of the history of the science on this fraught relationship, as well as arguing that contemporary methods for studying experience fall into one of two categories. “Wide” approaches tend to incorporate naturalistic behavior settings, but sacrifice accuracy and reliability in behavioral measurement. “Narrow” approaches maintain controlled measurement of behavior, but involve too specific a sampling of experience, which obscures crucial temporal characteristics. We therefore argue for a novel, mid-range sampling technique, that extends Hurlburt’s descriptive experience sampling, and adapts it for the controlled setting of the laboratory. This controlled descriptive experience sampling may be an appropriate tool to help calibrate both the mean and the meaning of an experimental situation with one another. PMID:27242588

  17. Enhanced Facilitation of Spatial Attention in Schizophrenia

    PubMed Central

    Spencer, Kevin M.; Nestor, Paul G.; Valdman, Olga; Niznikiewicz, Margaret A.; Shenton, Martha E.; McCarley, Robert W.

    2010-01-01

    Objective While attentional functions are usually found to be impaired in schizophrenia, a review of the literature on the orienting of spatial attention in schizophrenia suggested that voluntary attentional orienting in response to a valid cue might be paradoxically enhanced. We tested this hypothesis with orienting tasks involving the cued detection of a laterally-presented target stimulus. Method Subjects were chronic schizophrenia patients (SZ) and matched healthy control subjects (HC). In Experiment 1 (15 SZ, 16 HC), cues were endogenous (arrows) and could be valid (100% predictive) or neutral with respect to the subsequent target position. In Experiment 2 (16 SZ, 16 HC), subjects performed a standard orienting task with unpredictive exogenous cues (brightening of the target boxes). Results In Experiment 1, SZ showed a larger attentional facilitation effect on reaction time than HC. In Experiment 2, no clear sign of enhanced attentional facilitation was found in SZ. Conclusions The voluntary, facilitatory shifting of spatial attention may be relatively enhanced in individuals with schizophrenia in comparison to healthy individuals. This effect bears resemblance to other relative enhancements of information processing in schizophrenia such as saccade speed and semantic priming. PMID:20919764

  18. A Robust Inner and Outer Loop Control Method for Trajectory Tracking of a Quadrotor

    PubMed Central

    Xia, Dunzhu; Cheng, Limei; Yao, Yanhong

    2017-01-01

    In order to achieve complicated trajectory tracking of a quadrotor, a geometric inner and outer loop control scheme is presented. The outer loop generates the desired rotation matrix for the inner loop. To improve the response speed and robustness, a geometric SMC controller is designed for the inner loop. The outer loop is also designed via sliding mode control (SMC). By Lyapunov theory and cascade theory, the closed-loop system stability is guaranteed. Next, the tracking performance is validated by tracking three representative trajectories. Then, the robustness of the proposed control method is illustrated by trajectory tracking in the presence of model uncertainty and disturbances. Subsequently, experiments are carried out to verify the method. In the experiment, ultra wideband (UWB) is used for indoor positioning. An extended Kalman filter (EKF) is used for fusing inertial measurement unit (IMU) and UWB measurements. The experimental results show the feasibility of the designed controller in practice. The comparative experiments with PD and PD loop demonstrate the robustness of the proposed control method. PMID:28925984
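
    The fusion step mentioned above (an EKF combining IMU and UWB measurements) can be illustrated with a generic position/velocity filter. The process model, measurement model, and noise values below are assumptions for illustration and are not the authors' implementation.

      # Generic sketch: IMU acceleration drives the prediction of a position/velocity state,
      # and UWB position fixes correct it in the update step.
      import numpy as np

      def ekf_step(x, P, accel, z_uwb, dt, q=0.05, r=0.10):
          F = np.block([[np.eye(3), dt * np.eye(3)],
                        [np.zeros((3, 3)), np.eye(3)]])             # state transition
          B = np.vstack([0.5 * dt**2 * np.eye(3), dt * np.eye(3)])  # acceleration input
          Q = q * np.eye(6)
          H = np.hstack([np.eye(3), np.zeros((3, 3))])              # UWB observes position only
          R = r * np.eye(3)
          # predict with IMU acceleration
          x = F @ x + B @ accel
          P = F @ P @ F.T + Q
          # update with UWB position measurement
          S = H @ P @ H.T + R
          K = P @ H.T @ np.linalg.inv(S)
          x = x + K @ (z_uwb - H @ x)
          P = (np.eye(6) - K @ H) @ P
          return x, P

      x, P = np.zeros(6), np.eye(6)
      x, P = ekf_step(x, P, accel=np.array([0.1, 0.0, 0.0]),
                      z_uwb=np.array([0.01, 0.0, 0.0]), dt=0.01)
      print(x)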

  19. Diverse convergent evidence in the genetic analysis of complex disease: coordinating omic, informatic, and experimental evidence to better identify and validate risk factors

    PubMed Central

    2014-01-01

    In omic research, such as genome wide association studies, researchers seek to repeat their results in other datasets to reduce false positive findings and thus provide evidence for the existence of true associations. Unfortunately this standard validation approach cannot completely eliminate false positive conclusions, and it can also mask many true associations that might otherwise advance our understanding of pathology. These issues beg the question: How can we increase the amount of knowledge gained from high throughput genetic data? To address this challenge, we present an approach that complements standard statistical validation methods by drawing attention to both potential false negative and false positive conclusions, as well as providing broad information for directing future research. The Diverse Convergent Evidence approach (DiCE) we propose integrates information from multiple sources (omics, informatics, and laboratory experiments) to estimate the strength of the available corroborating evidence supporting a given association. This process is designed to yield an evidence metric that has utility when etiologic heterogeneity, variable risk factor frequencies, and a variety of observational data imperfections might lead to false conclusions. We provide proof of principle examples in which DiCE identified strong evidence for associations that have established biological importance, when standard validation methods alone did not provide support. If used as an adjunct to standard validation methods this approach can leverage multiple distinct data types to improve genetic risk factor discovery/validation, promote effective science communication, and guide future research directions. PMID:25071867

  20. Validating Bayesian truth serum in large-scale online human experiments.

    PubMed

    Frank, Morgan R; Cebrian, Manuel; Pickard, Galen; Rahwan, Iyad

    2017-01-01

    Bayesian truth serum (BTS) is an exciting new method for improving honesty and information quality in multiple-choice surveys, but, despite the method's mathematical reliance on large sample sizes, the existing literature on BTS focuses only on small experiments. Given the prevalence of online survey platforms, such as Amazon's Mechanical Turk, which facilitate surveys with hundreds or thousands of participants, BTS must be shown to be effective in large-scale experiments before it can become a readily accepted tool in real-world applications. We demonstrate that BTS quantifiably improves honesty in large-scale online surveys where the "honest" distribution of answers is known in expectation on aggregate. Furthermore, we explore a marketing application where "honest" answers cannot be known, but find that BTS treatment impacts the resulting distributions of answers.
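
    For orientation, a hedged sketch of the BTS score as usually formulated (Prelec, 2004): each respondent's score combines an information score (log of the empirical endorsement rate of their chosen answer over the geometric mean of the predicted rates) with an alpha-weighted prediction score. The toy inputs below are illustrative and not from the study.

      # Bayesian truth serum scores for a multiple-choice question (sketch).
      import numpy as np

      def bts_scores(choices, predictions, alpha=1.0, eps=1e-9):
          """choices: (n,) answer indices; predictions: (n, m) each respondent's predicted answer frequencies."""
          n, m = predictions.shape
          x_bar = np.bincount(choices, minlength=m) / n              # empirical endorsement rates
          log_y_bar = np.log(predictions + eps).mean(axis=0)          # geometric-mean predictions (log scale)
          info = np.log(x_bar[choices] + eps) - log_y_bar[choices]    # information score
          pred = alpha * (x_bar * (np.log(predictions + eps) - np.log(x_bar + eps))).sum(axis=1)
          return info + pred

      choices = np.array([0, 0, 1, 0, 1])
      predictions = np.array([[0.7, 0.3], [0.6, 0.4], [0.5, 0.5], [0.8, 0.2], [0.4, 0.6]])
      print(bts_scores(choices, predictions))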

  1. Public health tools for holding self-regulators accountable: lessons from the alcohol experience.

    PubMed

    Jernigan, David H

    2011-05-01

    Self-regulation is a common strategy used by industries to avoid or supplement statutory health and safety regulation of their products and practices. The public health experience with self-regulation in the alcohol industry provides methods and lessons relevant to health educators and advocates working in other public health fields. Methods for and examples and limitations of monitoring content and placement of marketing messages are described. The alcohol experience shows that, although self-regulation has many drawbacks in terms of protecting the health of the public, there are tools available for valid monitoring of self-regulated activities that, when combined with aggressive dissemination of results to media and policy makers, can make self-regulation more accountable and build an evidence base for effective measures to be taken.

  2. Journaling: identification of challenges and reflection on strategies.

    PubMed

    Hayman, Brenda; Wilkes, Lesley; Jackson, Debra

    2012-01-01

    To identify the challenges associated with using journaling as a method of data collection and to offer strategies for effectively managing those challenges. While journaling can be used for a variety of reasons, in the context of this paper, journaling refers to the process of participants sharing thoughts, ideas, feelings and experiences through writing and/or other media. Journaling is used in phenomenological research studies to record participant experiences in their natural contexts. The findings are based on the experiences of the researchers during a qualitative study that explored the experiences of lesbian mothers and used journaling as one method of data collection. This is a methodological paper. Three main challenges affect journaling as a method of data collection: poor participation, feeling exposed and staying on track. Six strategies to promote participation in journaling are: coaching participants, limiting the journaling period, providing follow-up contact, promoting comfort, ensuring safety and providing clear content expectations. Each strategy is discussed and methods of implementing the strategies are offered. Journaling as a method of data collection has long been accepted as a valid method of accessing rich qualitative data. By acknowledging the common challenges associated with the process of journaling that are experienced by the participants, researchers employing this data collection method can promote constructive and valuable participation. Further research examining participants' experiences of journaling as a method of qualitative data collection would be useful in determining challenges, barriers and benefits of the method.

  3. The project ownership survey: measuring differences in scientific inquiry experiences.

    PubMed

    Hanauer, David I; Dolan, Erin L

    2014-01-01

    A growing body of research documents the positive outcomes of research experiences for undergraduates, including increased persistence in science. Study of undergraduate lab learning experiences has demonstrated that the design of the experience influences the extent to which students report ownership of the project and that project ownership is one of the psychosocial factors involved in student retention in the sciences. To date, methods for measuring project ownership have not been suitable for the collection of larger data sets. The current study aims to rectify this by developing, presenting, and evaluating a new instrument for measuring project ownership. Eighteen scaled items were generated based on prior research and theory related to project ownership and combined with 30 items shown to measure respondents' emotions about an experience, resulting in the Project Ownership survey (POS). The POS was analyzed to determine its dimensionality, reliability, and validity. The POS had a coefficient alpha of 0.92 and thus has high internal consistency. Known-groups validity was analyzed through the ability of the instrument to differentiate between students who studied in traditional versus research-based laboratory courses. The POS scales differentiated between the groups, and the findings paralleled previous results regarding the characteristics of project ownership.

  4. Development of MPS Method for Analyzing Melt Spreading Behavior and MCCI in Severe Accidents

    NASA Astrophysics Data System (ADS)

    Yamaji, Akifumi; Li, Xin

    2016-08-01

    Spreading of molten core (corium) on the reactor containment vessel floor and molten corium-concrete interaction (MCCI) are important phenomena in the late phase of a severe accident for assessment of containment integrity and management of the severe accident. Severe accident research at Waseda University has been advancing to show that simulations with the moving particle semi-implicit (MPS) method (one of the particle methods) can greatly improve the analytical capability and mechanistic understanding of melt behavior in severe accidents. MPS models have been developed and verified regarding calculations of the radiation and thermal field, solid-liquid phase transition, buoyancy, and temperature dependency of viscosity to simulate phenomena such as spreading of corium, ablation of concrete by the corium, crust formation, and cooling of the corium by top flooding. Validations have been conducted against experiments such as FARO L26S, ECOKATS-V1, Theofanous, and SPREAD for spreading, and SURC-2, SURC-4, SWISS-1, and SWISS-2 for MCCI. These validations cover melt spreading behaviors and MCCI for mixtures of molten oxides (including prototypic UO2-ZrO2), metals, and water. Generally, the analytical results show good agreement with the experiments with respect to the leading edge of the spreading melt and the ablation front history of the concrete. The MPS results indicate that crust formation may play important roles in melt spreading and MCCI. There is a need to develop a code for two-dimensional MCCI experiment simulation with the MPS method as future study, which will be able to simulate anisotropic ablation of concrete.
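
    For orientation, the kernel below is the weight function and particle number density used in standard MPS formulations (Koshizuka and Oka); whether the Waseda models use exactly this kernel is an assumption on our part.

      w(r) = \begin{cases} \dfrac{r_e}{r} - 1, & 0 < r < r_e \\ 0, & r \ge r_e \end{cases},
      \qquad
      n_i = \sum_{j \neq i} w\big( \lvert \mathbf{r}_j - \mathbf{r}_i \rvert \big)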

  5. Development of a Self-Rated Mixed Methods Skills Assessment: The NIH Mixed Methods Research Training Program for the Health Sciences

    PubMed Central

    Guetterman, Timothy C.; Creswell, John W.; Wittink, Marsha; Barg, Fran K.; Castro, Felipe G.; Dahlberg, Britt; Watkins, Daphne C.; Deutsch, Charles; Gallo, Joseph J.

    2017-01-01

    Introduction Demand for training in mixed methods is high, with little research on faculty development or assessment in mixed methods. We describe the development of a Self-Rated Mixed Methods Skills Assessment and provide validity evidence. The instrument taps six research domains: “Research question,” “Design/approach,” “Sampling,” “Data collection,” “Analysis,” and “Dissemination.” Respondents are asked to rate their ability to define or explain concepts of mixed methods under each domain, their ability to apply the concepts to problems, and the extent to which they need to improve. Methods We administered the questionnaire to 145 faculty and students using an internet survey. We analyzed descriptive statistics and performance characteristics of the questionnaire using Cronbach’s alpha to assess reliability and an ANOVA that compared a mixed methods experience index with assessment scores to assess criterion-relatedness. Results Internal consistency reliability was high for the total set of items (0.95) and adequate (≥0.71) for all but one subscale. Consistent with establishing criterion validity, respondents who had more professional experiences with mixed methods (e.g., published a mixed methods paper) rated themselves as more skilled, which was statistically significant across the research domains. Discussion This Self-Rated Mixed Methods Assessment instrument may be a useful tool to assess skills in mixed methods for training programs. It can be applied widely at the graduate and faculty level. For the learner, assessment may lead to enhanced motivation to learn and training focused on self-identified needs. For faculty, the assessment may improve curriculum and course content planning. PMID:28562495
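
    As a reminder of the reliability statistic reported above, Cronbach's alpha can be computed from item-level scores as in the sketch below (synthetic data, not the study's dataset).

      # Cronbach's alpha: alpha = k/(k-1) * (1 - sum(item variances) / variance(total score)).
      import numpy as np

      def cronbach_alpha(items):
          """items: (n_respondents, k_items) matrix of scale scores."""
          k = items.shape[1]
          item_var = items.var(axis=0, ddof=1).sum()
          total_var = items.sum(axis=1).var(ddof=1)
          return k / (k - 1) * (1 - item_var / total_var)

      rng = np.random.default_rng(1)
      latent = rng.normal(size=(145, 1))                 # shared skill factor (synthetic)
      items = latent + 0.5 * rng.normal(size=(145, 6))   # six correlated items (synthetic)
      print(f"alpha = {cronbach_alpha(items):.2f}")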

  6. A review and update of the Health of the Nation Outcome Scales (HoNOS).

    PubMed

    James, Mick; Painter, Jon; Buckingham, Bill; Stewart, Malcolm W

    2018-04-01

    Aims and method The Health of the Nation Outcome Scales (HoNOS) and its older adults' version (HoNOS 65+) have been used widely for 20 years, but their glossaries have not been revised to reflect clinicians' experiences or changes in service delivery. The Royal College of Psychiatrists convened an international advisory board, with UK, Australian and New Zealand expertise, to identify desirable amendments. The aim was to improve rater experience by removing ambiguity and inconsistency in the glossary rather than more radical revision. Changes proposed to the HoNOS are reported. HoNOS 65+ changes will be reported separately. Based on the views and experience of the countries involved, a series of amendments were identified. Clinical implications While effective clinician training remains critically important, these revisions aim to improve intra- and interrater reliability and improve validity. Next steps will depend on feedback from HoNOS users. Reliability and validity testing will depend on funding. Declaration of interest None.

  7. Characterization of Aluminum Honeycomb and Experimentation for Model Development and Validation, Volume I: Discovery and Characterization Experiments for High-Density Aluminum Honeycomb

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, Wei-Yang; Korellis, John S.; Lee, Kenneth L.

    2006-08-01

    Honeycomb is a structure that consists of two-dimensional regular arrays of open cells. High-density aluminum honeycomb has been used in weapon assemblies to mitigate shock and protect payload because of its excellent crush properties. In order to use honeycomb efficiently and to certify the payload is protected by the honeycomb under various loading conditions, a validated honeycomb crush model is required and the mechanical properties of the honeycombs need to be fully characterized. Volume I of this report documents an experimental study of the crush behavior of high-density honeycombs. Two sets of honeycombs were included in this investigation: commercial grade for initial exploratory experiments, and weapon grade, which satisfied B61 specifications. This investigation also includes developing proper experimental methods for crush characterization, conducting discovery experiments to explore crush behaviors for model improvement, and identifying experimental and material uncertainties.

  8. World Ocean Circulation Experiment

    NASA Technical Reports Server (NTRS)

    Clarke, R. Allyn

    1992-01-01

    The oceans are an equal partner with the atmosphere in the global climate system. The World Ocean Circulation Experiment is presently being implemented to improve ocean models that are useful for climate prediction, both by encouraging more model development and, more importantly, by providing quality data sets that can be used to force or to validate such models. WOCE is the first oceanographic experiment that plans to generate and to use multiparameter global ocean data sets. In order for WOCE to succeed, oceanographers must establish and learn to use more effective methods of assembling, quality controlling, manipulating and distributing oceanographic data.

  9. A Huygens immersed-finite-element particle-in-cell method for modeling plasma-surface interactions with moving interface

    NASA Astrophysics Data System (ADS)

    Cao, Huijun; Cao, Yong; Chu, Yuchuan; He, Xiaoming; Lin, Tao

    2018-06-01

    Surface evolution is an unavoidable issue in engineering plasma applications. In this article an iterative method for modeling plasma-surface interactions with moving interface is proposed and validated. In this method, the plasma dynamics is simulated by an immersed finite element particle-in-cell (IFE-PIC) method, and the surface evolution is modeled by the Huygens wavelet method which is coupled with the iteration of the IFE-PIC method. Numerical experiments, including prototypical engineering applications, such as the erosion of Hall thruster channel wall, are presented to demonstrate features of this Huygens IFE-PIC method for simulating the dynamic plasma-surface interactions.

  10. When Less Is More in Cognitive Diagnosis: A Rapid Online Method for Diagnosing Learner Task-Specific Expertise

    ERIC Educational Resources Information Center

    Kalyuga, Slava

    2008-01-01

    Rapid cognitive diagnosis allows measuring current levels of learner domain-specific knowledge in online learning environments. Such measures are required for individualizing instructional support in real time, as students progress through a learning session. This article describes 2 experiments designed to validate a rapid online diagnostic…

  11. Ethnic-Racial Attitudes, Images, and Behavior by Verbal Associations. Technical Report.

    ERIC Educational Resources Information Center

    Szalay, Lorand B.; And Others

    The investigations focused on two main subject areas. The first series of experiments explored the validity of verbal association based inferences as an attitude measure and predictor of behavior. When compared with paper-and-pencil methods, the association based attitude index (EDI) showed high positive correlation as a group measure and medium…

  12. Methodology for Software Reliability Prediction. Volume 2.

    DTIC Science & Technology

    1987-11-01

    The overall acquisition program shall include the resources, schedule, management, structure, and controls necessary to ensure that specified AD... Independent Verification/Validation - Programming Team Structure - Educational Level of Team Members - Experience Level of Team Members * Methods Used... Prediction or Estimation Parameter Supported: Software - Characteristics 3. Objectives: Structured programming studies and Government Ur... procurement

  13. Student Teacher Perceptions of the Impact of Mentoring on Student Teaching

    ERIC Educational Resources Information Center

    Bird, Lori K.

    2012-01-01

    Mentoring is an essential component of the student teaching experience. The support provided by highly prepared and effective mentors contributes to the success of student teachers during this high stakes period of professional development. Findings from this mixed-methods study support five mentoring factors as valid and a useful framework for…

  14. Relations between Inductive Reasoning and Deductive Reasoning

    ERIC Educational Resources Information Center

    Heit, Evan; Rotello, Caren M.

    2010-01-01

    One of the most important open questions in reasoning research is how inductive reasoning and deductive reasoning are related. In an effort to address this question, we applied methods and concepts from memory research. We used 2 experiments to examine the effects of logical validity and premise-conclusion similarity on evaluation of arguments.…

  15. W17_geonuc “Application of the Spectral Element Method to improvement of Ground-based Nuclear Explosion Monitoring”

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Larmat, Carene; Rougier, Esteban; Lei, Zhou

    This project is in support of the Source Physics Experiment (SPE) (Snelson et al. 2013), which aims to develop new seismic source models of explosions. One priority of this program is first-principles numerical modeling to validate and extend current empirical models.

  16. Cognitive Anxiety: A Method of Content Analysis for Verbal Samples

    ERIC Educational Resources Information Center

    Viney, Linda L.; Westbrook, Mary

    1976-01-01

    Five groups--second year students, psychiatric inpatients, incoming students, mothers, and relocated women--were tested with verbal samples to examine the effects of cognitive anxiety as a construct implying a reaction to being unable to anticipate and integrate experiences meaningfully. The measure used was found to be valid. (Author/DEP)

  17. The Relation of Moral Judgment Development and Educational Experience to Recall of Moral Narratives and Expository Texts

    ERIC Educational Resources Information Center

    Narvaez, Darcia; Gleason, Tracy

    2007-01-01

    Moral text processing was used as an ecologically valid method for assessing implicit and explicit moral understanding and development. The authors tested undergraduates, seminarians, and graduate students in political science and philosophy for recall of moral narratives and moral expository texts. Multivariate analyses of covariance using…

  18. What Motivates Introductory Geology Students to Study for an Exam?

    ERIC Educational Resources Information Center

    Lukes, Laura A.; McConnell, David A.

    2014-01-01

    There is a need to understand why some students succeed and persist in STEM fields and others do not. While numerous studies have focused on the positive results of using empirically validated teaching methods in introductory science, technology, engineering, and math (STEM) courses, little data has been collected about the student experience in…

  19. Design and Validation of a Questionnaire to Measure Research Skills: Experience with Engineering Students

    ERIC Educational Resources Information Center

    Cobos Alvarado, Fabián; Peñaherrera León, Mónica; Ortiz Colon, Ana María

    2016-01-01

    Universities in Latin American countries are undergoing major changes in their institutional and academic settings. One strategy for continuous improvement of the teaching and learning process is the incorporation of methods and teaching aids that seek to develop scientific research skills in students from the start of their undergraduate studies. The aim of this…

  20. FDIR Strategy Validation with the B Method

    NASA Astrophysics Data System (ADS)

    Sabatier, D.; Dellandrea, B.; Chemouil, D.

    2008-08-01

    In a formation flying satellite system, the FDIR strategy (Failure Detection, Isolation and Recovery) is paramount. When a failure occurs, satellites should be able to take appropriate reconfiguration actions to obtain the best possible results given the failure, ranging from avoiding satellite-to-satellite collision to continuing the mission without disturbance if possible. To achieve this goal, each satellite in the formation has an implemented FDIR strategy that governs how it detects failures (from tests or by deduction) and how it reacts (reconfiguration using redundant equipment, avoidance manoeuvres, etc.). The goal is to protect the satellites first and the mission as much as possible. In a project initiated by CNES, ClearSy has been experimenting with the B Method to validate the FDIR strategies, developed by Thales Alenia Space, of the inter-satellite positioning and communication devices that will be used for the SIMBOL-X (2-satellite configuration) and PEGASE (3-satellite configuration) missions, and potentially for other missions afterwards. These radio frequency metrology sensor devices provide satellite positioning and inter-satellite communication in formation flying. This article presents the results of this experiment.

  1. Approaching the investigation of plasma turbulence through a rigorous verification and validation procedure: A practical example

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ricci, P., E-mail: paolo.ricci@epfl.ch; Riva, F.; Theiler, C.

    In the present work, a Verification and Validation procedure is presented and applied showing, through a practical example, how it can contribute to advancing our physics understanding of plasma turbulence. Bridging the gap between plasma physics and other scientific domains, in particular, the computational fluid dynamics community, a rigorous methodology for the verification of a plasma simulation code is presented, based on the method of manufactured solutions. This methodology assesses that the model equations are correctly solved, within the order of accuracy of the numerical scheme. The technique to carry out a solution verification is described to provide a rigorous estimate of the uncertainty affecting the numerical results. A methodology for plasma turbulence code validation is also discussed, focusing on quantitative assessment of the agreement between experiments and simulations. The Verification and Validation methodology is then applied to the study of plasma turbulence in the basic plasma physics experiment TORPEX [Fasoli et al., Phys. Plasmas 13, 055902 (2006)], considering both two-dimensional and three-dimensional simulations carried out with the GBS code [Ricci et al., Plasma Phys. Controlled Fusion 54, 124047 (2012)]. The validation procedure allows progress in the understanding of the turbulent dynamics in TORPEX, by pinpointing the presence of a turbulent regime transition, due to the competition between the resistive and ideal interchange instabilities.

  2. Joint multifractal analysis based on wavelet leaders

    NASA Astrophysics Data System (ADS)

    Jiang, Zhi-Qiang; Yang, Yan-Hong; Wang, Gang-Jin; Zhou, Wei-Xing

    2017-12-01

    Mutually interacting components form complex systems and these components usually have long-range cross-correlated outputs. Using wavelet leaders, we propose a method for characterizing the joint multifractal nature of these long-range cross correlations; we call this method joint multifractal analysis based on wavelet leaders (MF-X-WL). We test the validity of the MF-X-WL method by performing extensive numerical experiments on dual binomial measures with multifractal cross correlations and bivariate fractional Brownian motions (bFBMs) with monofractal cross correlations. Both experiments indicate that MF-X-WL is capable of detecting cross correlations in synthetic data with acceptable estimating errors. We also apply the MF-X-WL method to pairs of series from financial markets (returns and volatilities) and online worlds (online numbers of different genders and different societies) and determine intriguing joint multifractal behavior.
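
    A plausible form of the scaling relation behind MF-X-WL, assuming the usual wavelet-leader construction (the notation is ours, not necessarily the authors'): the joint moments of the leaders L_X and L_Y of the two series at scale 2^j follow a power law whose exponents define the joint multifractal spectrum.

      Z_{XY}(q_1, q_2, j) = \frac{1}{n_j} \sum_{k=1}^{n_j} \big[ L_X(j,k) \big]^{q_1} \big[ L_Y(j,k) \big]^{q_2} \sim 2^{\, j \, \zeta(q_1, q_2)}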

  3. Transonic pressure measurements and comparison of theory to experiment for an arrow-wing configuration. Volume 1: Experimental data report, base configuration and effects of wing twist and leading-edge configuration. [wind tunnel tests, aircraft models

    NASA Technical Reports Server (NTRS)

    Manro, M. E.; Manning, K. J. R.; Hallstaff, T. H.; Rogers, J. T.

    1975-01-01

    A wind tunnel test of an arrow-wing-body configuration consisting of flat and twisted wings, as well as a variety of leading- and trailing-edge control surface deflections, was conducted at Mach numbers from 0.4 to 1.1 to provide an experimental pressure data base for comparison with theoretical methods. Theory-to-experiment comparisons of detailed pressure distributions were made using current state-of-the-art attached and separated flow methods. The purpose of these comparisons was to delineate conditions under which these theories are valid for both flat and twisted wings and to explore the use of empirical methods to correct the theoretical methods where theory is deficient.

  4. Theoretical analysis and experimental study of constraint boundary conditions for acquiring the beacon in satellite-ground laser communications

    NASA Astrophysics Data System (ADS)

    Yu, Siyuan; Wu, Feng; Wang, Qiang; Tan, Liying; Ma, Jing

    2017-11-01

    Acquisition and recognition of the beacon are the core technologies for establishing a satellite optical link. In order to acquire the beacon correctly, the beacon image must first be recognized while excluding the influence of background light. In this processing, many factors influence the recognition precision of the beacon. This paper studies the constraint boundary conditions for acquiring the beacon from both theoretical and experimental perspectives, and an adaptive segmentation approach for satellite-ground laser communications is also proposed. Finally, a long-distance laser communication experiment (11.16 km) verifies the validity of this method; the tracking error with the method is the smallest compared with traditional approaches. The method helps to greatly improve tracking precision in satellite-ground laser communications.

  5. Prestressing force monitoring method for a box girder through distributed long-gauge FBG sensors

    NASA Astrophysics Data System (ADS)

    Chen, Shi-Zhi; Wu, Gang; Xing, Tuo; Feng, De-Cheng

    2018-01-01

    Monitoring prestressing forces is essential for prestressed concrete box girder bridges. However, current monitoring methods for prestressing force are not applicable to a box girder, either because the sensor setup is constrained or because the shear lag effect is not properly considered. Building on a previous analysis model of the shear lag effect in the box girder, this paper proposes an indirect monitoring method for on-site determination of the prestressing force in a concrete box girder utilizing distributed long-gauge fiber Bragg grating sensors. The performance of this method was initially verified using numerical simulation for three different distribution forms of prestressing tendons. Then, an experiment involving two concrete box girders was conducted to study, in a preliminary way, the feasibility of this method under different prestressing levels. The results of both the numerical simulation and the lab experiment validated the method's practicability in a box girder.

  6. Real-time Tracking of DNA Fragment Separation by Smartphone.

    PubMed

    Tao, Chunxian; Yang, Bo; Li, Zhenqing; Zhang, Dawei; Yamaguchi, Yoshinori

    2017-06-01

    Slab gel electrophoresis (SGE) is the most common method for the separation of DNA fragments; thus, it is broadly applied in biology and other fields. However, the traditional SGE protocol is quite tedious, and the experiment takes a long time. Moreover, the chemical consumption in SGE experiments is very high. This work proposes a simple method for the separation of DNA fragments based on an SGE chip. The chip is made by an engraving machine. Two plastic sheets are used for the excitation and emission wavelengths of the optical signal. The fluorescence signal of the DNA bands is collected by smartphone. To validate this method, 50, 100, and 1,000 bp DNA ladders were separated. The results demonstrate that a DNA ladder smaller than 5,000 bp can be resolved within 12 min and with high resolution when using this method, indicating that it is an ideal substitute for the traditional SGE method.

  7. Member Checking: A Tool to Enhance Trustworthiness or Merely a Nod to Validation?

    PubMed

    Birt, Linda; Scott, Suzanne; Cavers, Debbie; Campbell, Christine; Walter, Fiona

    2016-06-22

    The trustworthiness of results is the bedrock of high quality qualitative research. Member checking, also known as participant or respondent validation, is a technique for exploring the credibility of results. Data or results are returned to participants to check for accuracy and resonance with their experiences. Member checking is often mentioned as one in a list of validation techniques. This simplistic reporting might not acknowledge the value of using the method, nor its juxtaposition with the interpretative stance of qualitative research. In this commentary, we critique how member checking has been used in published research, before describing and evaluating an innovative in-depth member checking technique, Synthesized Member Checking. The method was used in a study with patients diagnosed with melanoma. Synthesized Member Checking addresses the co-constructed nature of knowledge by providing participants with the opportunity to engage with, and add to, interview and interpreted data, several months after their semi-structured interview. © The Author(s) 2016.

  8. Benchmark tests for a Formula SAE Student car prototyping

    NASA Astrophysics Data System (ADS)

    Mariasiu, Florin

    2011-12-01

    Aerodynamic characteristics of a vehicle are important elements in its design and construction. A low drag coefficient brings significant fuel savings and increased engine power efficiency. In designing and developing vehicles through computer simulation, dedicated CFD (Computational Fluid Dynamics) software packages are used to determine the vehicle's aerodynamic characteristics. However, the results obtained by this faster and cheaper method are validated by experiments in wind tunnel tests, which are expensive and require complex testing equipment at relatively high cost. Therefore, the emergence and development of new low-cost testing methods to validate CFD simulation results would bring great economic benefits for the vehicle prototyping process. This paper presents the initial development process of a Formula SAE Student race-car prototype using CFD simulation and also presents a measurement system based on low-cost sensors through which the CFD simulation results were experimentally validated. The CFD software package used for simulation was SolidWorks with the FloXpress add-on, and the experimental measurement system was built using four piezoresistive force sensors of the FlexiForce type.
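
    Under the usual assumptions, such a force-sensor-based validation reduces to the standard drag equation; the symbols below follow common convention and are not specific to the paper: F_d is the measured drag force, rho the air density, v the flow speed, and A the frontal area.

      F_d = \tfrac{1}{2} \rho v^{2} C_d A
      \quad\Longrightarrow\quad
      C_d = \frac{2 F_d}{\rho v^{2} A}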

  9. Workshop Report: Crystal City VI-Bioanalytical Method Validation for Biomarkers.

    PubMed

    Arnold, Mark E; Booth, Brian; King, Lindsay; Ray, Chad

    2016-11-01

    With the growing focus on translational research and the use of biomarkers to drive drug development and approvals, biomarkers have become a significant area of research within the pharmaceutical industry. However, until the US Food and Drug Administration's (FDA) 2013 draft guidance on bioanalytical method validation included consideration of biomarker assays using LC-MS and LBA, those assays were created, validated, and used without standards of performance. This lack of expectations resulted in the FDA receiving data from assays of varying quality in support of efficacy and safety claims. The AAPS Crystal City VI (CC VI) Workshop in 2015 was held as the first forum for industry-FDA discussion around the general issues of biomarker measurements (e.g., endogenous levels) and specific technology strengths and weaknesses. The 2-day workshop served to develop a common understanding among the industrial scientific community of the issues around biomarkers, informed the FDA of the current state of the science, and will serve as a basis for further dialogue as experience with biomarkers expands with both groups.

  10. Validation of a Novel Virtual Reality Simulator for Robotic Surgery

    PubMed Central

    Schreuder, Henk W. R.; Persson, Jan E. U.; Wolswijk, Richard G. H.; Ihse, Ingmar; Schijven, Marlies P.; Verheijen, René H. M.

    2014-01-01

    Objective. With the increase in robotic-assisted laparoscopic surgery there is a concomitant rising demand for training methods. The objective was to establish face and construct validity of a novel virtual reality simulator (dV-Trainer, Mimic Technologies, Seattle, WA) for the use in training of robot-assisted surgery. Methods. A comparative cohort study was performed. Participants (n = 42) were divided into three groups according to their robotic experience. To determine construct validity, participants performed three different exercises twice. Performance parameters were measured. To determine face validity, participants filled in a questionnaire after completion of the exercises. Results. Experts outperformed novices in most of the measured parameters. The most discriminative parameters were “time to complete” and “economy of motion” (P < 0.001). The training capacity of the simulator was rated 4.6 ± 0.5 SD on a 5-point Likert scale. The realism of the simulator in general, visual graphics, movements of instruments, interaction with objects, and the depth perception were all rated as being realistic. The simulator is considered to be a very useful training tool for residents and medical specialist starting with robotic surgery. Conclusions. Face and construct validity for the dV-Trainer could be established. The virtual reality simulator is a useful tool for training robotic surgery. PMID:24600328

  11. Numerical modeling of the acoustic wave propagation across a homogenized rigid microstructure in the time domain

    NASA Astrophysics Data System (ADS)

    Lombard, Bruno; Maurel, Agnès; Marigo, Jean-Jacques

    2017-04-01

    Homogenization of a thin micro-structure yields effective jump conditions that incorporate the geometrical features of the scatterers. These jump conditions apply across a thin but nonzero thickness interface whose interior is disregarded. This paper aims (i) to propose a numerical method able to handle the jump conditions in order to simulate the homogenized problem in the time domain, (ii) to inspect the validity of the homogenized problem when compared to the real one. For this purpose, we adapt the Explicit Simplified Interface Method originally developed for standard jump conditions across a zero-thickness interface. Doing so allows us to handle arbitrary-shaped interfaces on a Cartesian grid with the same efficiency and accuracy of the numerical scheme than those obtained in a homogeneous medium. Numerical experiments are performed to test the properties of the numerical method and to inspect the validity of the homogenization problem.

  12. Development and Validation of an Internet Use Attitude Scale

    ERIC Educational Resources Information Center

    Zhang, Yixin

    2007-01-01

    This paper describes the development and validation of a new 40-item Internet Attitude Scale (IAS), a one-dimensional inventory for measuring the Internet attitudes. The first experiment initiated a generic Internet attitude questionnaire, ensured construct validity, and examined factorial validity and reliability. The second experiment further…

  13. Teaching "Instant Experience" with Graphical Model Validation Techniques

    ERIC Educational Resources Information Center

    Ekstrøm, Claus Thorn

    2014-01-01

    Graphical model validation techniques for linear normal models are often used to check the assumptions underlying a statistical model. We describe an approach to provide "instant experience" in looking at a graphical model validation plot, so it becomes easier to validate if any of the underlying assumptions are violated.

  14. Calibration of a rotating accelerometer gravity gradiometer using centrifugal gradients

    NASA Astrophysics Data System (ADS)

    Yu, Mingbiao; Cai, Tijing

    2018-05-01

    The purpose of this study is to calibrate scale factors and equivalent zero biases of a rotating accelerometer gravity gradiometer (RAGG). We calibrate scale factors by determining the relationship between the centrifugal gradient excitation and RAGG response. Compared with calibration by changing the gravitational gradient excitation, this method does not need test masses and is easier to implement. The equivalent zero biases are superpositions of self-gradients and the intrinsic zero biases of the RAGG. A self-gradient is the gravitational gradient produced by surrounding masses, and it correlates well with the RAGG attitude angle. We propose a self-gradient model that includes self-gradients and the intrinsic zero biases of the RAGG. The self-gradient model is a function of the RAGG attitude, and it includes parameters related to surrounding masses. The calibration of equivalent zero biases determines the parameters of the self-gradient model. We provide detailed procedures and mathematical formulations for calibrating scale factors and parameters in the self-gradient model. A RAGG physical simulation system substitutes for the actual RAGG in the calibration and validation experiments. Four point masses simulate four types of surrounding masses producing self-gradients. Validation experiments show that the self-gradients predicted by the self-gradient model are consistent with those from the outputs of the RAGG physical simulation system, suggesting that the presented calibration method is valid.

  15. Testing the convergent validity of the contingent valuation and travel cost methods in valuing the benefits of health care.

    PubMed

    Clarke, Philip M

    2002-03-01

    In this study, the convergent validity of the contingent valuation method (CVM) and travel cost method (TCM) is tested by comparing estimates of the willingness to pay (WTP) for improving access to mammographic screening in rural areas of Australia. It is based on a telephone survey of 458 women in 19 towns, in which they were asked about their recent screening behaviour and their WTP to have a mobile screening unit visit their nearest town. After eliminating missing data and other non-usable responses the contingent valuation experiment and travel cost model were based on information from 372 and 319 women, respectively. Estimates of the maximum WTP for the use of mobile screening units were derived using both methods and compared. The highest mean WTP estimated using the TCM was $83.10 (95% C.I. $68.53-$99.06), which is significantly less than the estimate of $148.09 ($131.13-$166.60) using the CVM. This could be due to the CVM estimates also reflecting non-use values such as altruism, or a range of potential biases that are known to affect both methods. Further tests of validity are required in order to gain a greater understanding of the relationship between these two methods of estimating WTP. Copyright 2001 John Wiley & Sons, Ltd.

  16. Model-Based Least Squares Reconstruction of Coded Source Neutron Radiographs: Integrating the ORNL HFIR CG1D Source Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Santos-Villalobos, Hector J; Gregor, Jens; Bingham, Philip R

    2014-01-01

    At present, neutron sources cannot be fabricated small and powerful enough to achieve high-resolution radiography while maintaining an adequate flux. One solution is to employ computational imaging techniques such as a Magnified Coded Source Imaging (CSI) system. A coded mask is placed between the neutron source and the object. The system resolution is increased by reducing the size of the mask holes, and the flux is increased by increasing the size of the coded mask and/or the number of holes. One limitation of such a system is that the resolution of current state-of-the-art scintillator-based detectors caps at around 50 um. To overcome this challenge, the coded mask and object are magnified by making the distance from the coded mask to the object much smaller than the distance from the object to the detector. In previous work, we have shown via synthetic experiments that our least squares method outperforms other methods in image quality and reconstruction precision because of the modeling of the CSI system components. However, the validation experiments were limited to simplistic neutron sources. In this work, we aim to model the flux distribution of a real neutron source and incorporate such a model in our least squares computational system. We provide a full description of the methodology used to characterize the neutron source and validate the method with synthetic experiments.
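
    A generic sketch of the model-based least-squares formulation (min_x ||Ax - b||^2) referred to above; the random forward operator here is a stand-in for the coded-source system matrix built from the mask, source, and geometry models, and is an assumption rather than the ORNL implementation.

      # Damped least-squares reconstruction with LSQR (illustrative only).
      import numpy as np
      from scipy.sparse.linalg import lsqr

      rng = np.random.default_rng(2)
      n_pixels, n_detector = 256, 1024
      A = rng.random((n_detector, n_pixels))             # stand-in forward model (mask + source + geometry)
      x_true = np.zeros(n_pixels); x_true[100:120] = 1.0 # synthetic object
      b = A @ x_true + rng.normal(0, 0.01, n_detector)   # simulated detector counts

      x_hat = lsqr(A, b, damp=0.1)[0]                    # damped least-squares estimate
      print(np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))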

  17. Measuring Black Men’s Police-Based Discrimination Experiences: Development and Validation of the Police and Law Enforcement (PLE) Scale

    PubMed Central

    English, Devin; Bowleg, Lisa; del Río-González, Ana Maria; Tschann, Jeanne M.; Agans, Robert; Malebranche, David J

    2017-01-01

    Objectives Although social science research has examined police and law enforcement-perpetrated discrimination against Black men using policing statistics and implicit bias studies, there is little quantitative evidence detailing this phenomenon from the perspective of Black men. Consequently, there is a dearth of research detailing how Black men’s perspectives on police and law enforcement-related stress predict negative physiological and psychological health outcomes. This study addresses these gaps with the qualitative development and quantitative test of the Police and Law Enforcement (PLE) scale. Methods In Study 1, we employed thematic analysis on transcripts of individual qualitative interviews with 90 Black men to assess key themes and concepts and develop quantitative items. In Study 2, we used 2 focus groups comprised of 5 Black men each (n=10), intensive cognitive interviewing with a separate sample of Black men (n=15), and piloting with another sample of Black men (n=13) to assess the ecological validity of the quantitative items. For Study 3, we analyzed data from a sample of 633 Black men between the ages of 18 and 65 to test the factor structure of the PLE, as well as its concurrent validity and convergent/discriminant validity. Results Qualitative analyses and confirmatory factor analyses suggested that a 5-item, 1-factor measure appropriately represented respondents’ experiences of police/law enforcement discrimination. As hypothesized, the PLE was positively associated with measures of racial discrimination and depressive symptoms. Conclusions Preliminary evidence suggests that the PLE is a reliable and valid measure of Black men’s experiences of discrimination with police/law enforcement. PMID:28080104

  18. Do we know what foundation year doctors think about patient safety incident reporting? Development of a Web based tool to assess attitude and knowledge.

    PubMed

    Robson, Jean; de Wet, Carl; McKay, John; Bowie, Paul

    2011-11-01

    Making healthcare safer is an international priority. Patient safety modules are now taught in medical schools, and methods to assess related student knowledge and attitudes have been developed. However, little is known about the attitudes and knowledge which foundation doctors are developing towards patient safety and incident reporting in the healthcare workplace, since a specific assessment tool appears to be lacking. The aim was to develop, content validate and pilot test an online questionnaire survey to elicit foundation doctors' knowledge and experience of patient safety and incident reporting, and to assess related attitudes and behaviours. Questionnaire content validity was facilitated through: a steering group; literature review; feedback from foundation year doctors and consultant staff; a modified Delphi group; and completion of a content validity index by experts. In 2010 a cross-sectional online survey of 110 foundation year 1 and 2 doctors was then undertaken in three Scottish NHS board areas, utilising the developed 25 item questionnaire. The questionnaire was validated and piloted among the 69 foundation year doctors who responded. The pilot has provided valuable insights into trainee attitudes and experience. For example, 32 (48%) believed that most safety incidents were due to things that they could not do anything about; and 31 (43%) admitted to being involved in medication errors which were not formally reported. The pilot study was successful in taking the first steps to developing a validated survey questionnaire for a key staff group, foundation year doctors, in a priority area. However, the findings raise concerns about trainee experience of and attitudes to reporting, and the frequency with which incidents go unreported.

  19. Assessing and minimizing contamination in time of flight based validation data

    NASA Astrophysics Data System (ADS)

    Lennox, Kristin P.; Rosenfield, Paul; Blair, Brenton; Kaplan, Alan; Ruz, Jaime; Glenn, Andrew; Wurtz, Ronald

    2017-10-01

    Time of flight experiments are the gold standard method for generating labeled training and testing data for the neutron/gamma pulse shape discrimination problem. As the popularity of supervised classification methods increases in this field, there will also be increasing reliance on time of flight data for algorithm development and evaluation. However, time of flight experiments are subject to various sources of contamination that lead to neutron and gamma pulses being mislabeled. Such labeling errors have a detrimental effect on classification algorithm training and testing, and should therefore be minimized. This paper presents a method for identifying minimally contaminated data sets from time of flight experiments and estimating the residual contamination rate. This method leverages statistical models describing neutron and gamma travel time distributions and is easily implemented using existing statistical software. The method produces a set of optimal intervals that balance the trade-off between interval size and nuisance particle contamination, and its use is demonstrated on a time of flight data set for Cf-252. The particular properties of the optimal intervals for the demonstration data are explored in detail.
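
    A hedged sketch of the interval-selection idea: model the gamma and neutron travel-time distributions parametrically and choose the neutron acceptance window that maximizes coverage subject to a cap on expected contamination. The distribution shapes, parameters, and particle mix below are illustrative assumptions, not those of the Cf-252 data set.

      # Select a neutron time-of-flight window with bounded gamma contamination (sketch).
      import numpy as np
      from scipy import stats

      gamma_tt = stats.norm(loc=3.0, scale=0.3)     # gamma travel times (ns): fast, narrow (assumed)
      neutron_tt = stats.norm(loc=30.0, scale=6.0)  # neutron travel times (ns): slow, broad (assumed)

      def best_window(starts, widths, max_contamination=1e-3, gamma_fraction=0.5):
          best = None
          for t0 in starts:
              for w in widths:
                  neutron_cov = neutron_tt.cdf(t0 + w) - neutron_tt.cdf(t0)
                  gamma_leak = gamma_fraction * (gamma_tt.cdf(t0 + w) - gamma_tt.cdf(t0))
                  contamination = gamma_leak / max(gamma_leak + (1 - gamma_fraction) * neutron_cov, 1e-12)
                  if contamination <= max_contamination and (best is None or neutron_cov > best[0]):
                      best = (neutron_cov, t0, t0 + w, contamination)
          return best  # (coverage, window start, window end, residual contamination)

      print(best_window(starts=np.arange(5, 25, 0.5), widths=np.arange(5, 40, 0.5)))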

  20. Experimental comparison and validation of hot-ball method with guarded hot plate method on polyurethane foams

    NASA Astrophysics Data System (ADS)

    Hudec, Ján; Glorieux, Christ; Dieška, Peter; Kubičár, Ľudovít

    2016-07-01

    The Hot-ball method is an innovative transient method for measuring thermophysical properties. The principle is based on heating a small ball, incorporated in the measured medium, with constant heating power while simultaneously measuring the ball's temperature response from the moment heating is initiated. The shape of the temperature response depends on the thermophysical properties of the medium where the sensor is placed. This method is patented by the Institute of Physics, SAS, where the method and sensors based on it are being developed. At the beginning of the sensor development for this method, we focused on monitoring applications, where relative precision is much more important than accuracy. Meanwhile, the quality of the sensors has improved enough for a new application - absolute measurement of the thermophysical parameters of materials with low thermal conductivity. This paper describes the experimental verification and validation of measurements by the hot-ball method. Thanks to cooperation with the Laboratory of Soft Matter and Biophysics of the Catholic University of Leuven in Belgium, the established Guarded Hot Plate method was used as a reference. Details of the measuring setups, a description of the experiments and the results of the comparison are presented.
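
    For context, the working relation usually quoted for a hot-ball sensor follows from the ideal model of a sphere of radius r_b heated at constant power q in an infinite medium, whose stabilized temperature rise ΔT∞ yields the thermal conductivity; treating this as the paper's exact evaluation formula is an assumption.

      \Delta T_{\infty} = \frac{q}{4 \pi r_b \lambda}
      \quad\Longrightarrow\quad
      \lambda = \frac{q}{4 \pi r_b \, \Delta T_{\infty}}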

  1. Development and validation of LC-HRMS and GC-NICI-MS methods for stereoselective determination of MDMA and its phase I and II metabolites in human urine

    PubMed Central

    Schwaninger, Andrea E.; Meyer, Markus R.; Huestis, Marilyn A.; Maurer, Hans H.

    2013-01-01

    3,4-Methylenedioxymethamphetamine (MDMA) is a racemic drug of abuse, and its R- and S-enantiomers are known to differ in their dose-response curves. The S-enantiomer was shown to be eliminated at a higher rate than the R-enantiomer, most likely explained by the stereoselective metabolism observed in various in vitro experiments. The aim of this work was the development and validation of methods for evaluating the stereoselective elimination of phase I and particularly phase II metabolites of MDMA in human urine. Urine samples were analyzed by three different methods. Method A allowed stereoselective determination of the 4-hydroxy-3-methoxymethamphetamine (HMMA) glucuronides and only achiral determination of the intact sulfate conjugates of HMMA and 3,4-dihydroxymethamphetamine (DHMA) after C18 solid-phase extraction by liquid chromatography–high-resolution mass spectrometry with electrospray ionization. Method B allowed the determination of the enantiomer ratios of the DHMA and HMMA sulfate conjugates after selective enzymatic cleavage and chiral analysis of the corresponding deconjugated metabolites after chiral derivatization with S-heptafluorobutyrylprolyl chloride using gas chromatography–mass spectrometry with negative-ion chemical ionization. Method C allowed the chiral determination of MDMA and its unconjugated metabolites using method B without sulfate cleavage. The validation process, including specificity, recovery, matrix effects, process efficiency, accuracy and precision, stability, and limits of quantification and detection, showed that all methods were selective, sensitive, accurate and precise for all tested analytes. PMID:21656610

  2. Interpretation of energy deposition data from historical operation of the transient test facility (TREAT)

    DOE PAGES

    DeHart, Mark D.; Baker, Benjamin A.; Ortensi, Javier

    2017-07-27

    The Transient Test Reactor (TREAT) at Idaho National Laboratory will resume operations in late 2017 after a 23 year hiatus while maintained in a cold standby state. Over that time period, computational power and simulation capabilities have increased substantially and now allow for new multiphysics modeling possibilities that were not practical or feasible for most of TREAT's operational history. Hence the return of TREAT to operational service provides a unique opportunity to apply state-of-the-art software and associated methods in the modeling and simulation of general three-dimensional steady state and kinetic behavior for reactor operation, and for coupling of the core power transient model to experiment simulations. However, measurements taken in previous operations were intended to predict power deposition in experimental samples, with little consideration of three-dimensional core power distributions. Hence, interpretation of data for the purpose of validation of modern methods can be challenging. For the research discussed herein, efforts are described for the process of proper interpretation of data from the most recent calibration experiments performed in the core, the M8 calibration series (M8-CAL). These measurements were taken between 1990 and 1993 using a set of fission wires and test fuel pins to estimate the power deposition that would be produced in fast reactor test fuel pins during the M8 experiment series. Because of the decision to place TREAT into a standby state in 1994, the M8 series of transients were never performed. However, potentially valuable information relevant for validation is available in the M8-CAL measurement data, if properly interpreted. This article describes the current state of the process of recovery of useful data from M8-CAL measurements and quantification of biases and uncertainties to potentially apply to the validation of multiphysics methods.

  3. Interpretation of energy deposition data from historical operation of the transient test facility (TREAT)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DeHart, Mark D.; Baker, Benjamin A.; Ortensi, Javier

    The Transient Test Reactor (TREAT) at Idaho National Laboratory will resume operations in late 2017 after a 23 year hiatus while maintained in a cold standby state. Over that time period, computational power and simulation capabilities have increased substantially and now allow for new multiphysics modeling possibilities that were not practical or feasible for most of TREAT's operational history. Hence the return of TREAT to operational service provides a unique opportunity to apply state-of-the-art software and associated methods in the modeling and simulation of general three-dimensional steady state and kinetic behavior for reactor operation, and for coupling of the core power transient model to experiment simulations. However, measurements taken in previous operations were intended to predict power deposition in experimental samples, with little consideration of three-dimensional core power distributions. Hence, interpretation of data for the purpose of validation of modern methods can be challenging. For the research discussed herein, efforts are described for the process of proper interpretation of data from the most recent calibration experiments performed in the core, the M8 calibration series (M8-CAL). These measurements were taken between 1990 and 1993 using a set of fission wires and test fuel pins to estimate the power deposition that would be produced in fast reactor test fuel pins during the M8 experiment series. Because of the decision to place TREAT into a standby state in 1994, the M8 series of transients were never performed. However, potentially valuable information relevant for validation is available in the M8-CAL measurement data, if properly interpreted. This article describes the current state of the process of recovery of useful data from M8-CAL measurements and quantification of biases and uncertainties to potentially apply to the validation of multiphysics methods.

  4. A Novel Analysis Method for Paired-Sample Microbial Ecology Experiments.

    PubMed

    Olesen, Scott W; Vora, Suhani; Techtmann, Stephen M; Fortney, Julian L; Bastidas-Oyanedel, Juan R; Rodríguez, Jorge; Hazen, Terry C; Alm, Eric J

    2016-01-01

    Many microbial ecology experiments use sequencing data to measure a community's response to an experimental treatment. In a common experimental design, two units, one control and one experimental, are sampled before and after the treatment is applied to the experimental unit. The four resulting samples contain information about the dynamics of organisms that respond to the treatment, but there are no analytical methods designed to extract exactly this type of information from this configuration of samples. Here we present an analytical method specifically designed to visualize and generate hypotheses about microbial community dynamics in experiments that have paired samples and few or no replicates. The method is based on the Poisson lognormal distribution, long studied in macroecology, which we found accurately models the abundance distribution of taxa counts from 16S rRNA surveys. To demonstrate the method's validity and potential, we analyzed an experiment that measured the effect of crude oil on ocean microbial communities in microcosm. Our method identified known oil degraders as well as two clades, Maricurvus and Rhodobacteraceae, that responded to amendment with oil but do not include known oil degraders. Our approach is sensitive to organisms that increased in abundance only in the experimental unit but less sensitive to organisms that increased in both control and experimental units, thus mitigating the role of "bottle effects".
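
    The Poisson lognormal distribution mentioned above has no closed-form likelihood, but it can be fitted by approximating the latent lognormal integral with Gauss-Hermite quadrature and maximizing numerically. The sketch below is a generic maximum-likelihood fit of that distribution to a made-up vector of taxon counts, not the authors' published analysis pipeline.

        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import poisson

        # Gauss-Hermite nodes/weights for integrating over the lognormal latent rate.
        nodes, weights = np.polynomial.hermite.hermgauss(40)

        def pln_logpmf(counts, mu, sigma):
            """log P(k) for a Poisson-lognormal: Poisson rate exp(x), x ~ N(mu, sigma^2)."""
            lam = np.exp(np.clip(mu + np.sqrt(2.0) * sigma * nodes, -30.0, 30.0))
            pmf = poisson.pmf(counts[:, None], lam[None, :])   # shape (n_counts, n_nodes)
            return np.log(pmf @ weights / np.sqrt(np.pi) + 1e-300)

        def fit_pln(counts):
            """Maximum-likelihood estimates of (mu, sigma) by numerical optimization."""
            def nll(theta):
                mu, log_sigma = theta
                return -pln_logpmf(counts, mu, np.exp(log_sigma)).sum()
            res = minimize(nll, x0=np.array([np.log(counts.mean() + 1.0), 0.0]),
                           method="Nelder-Mead")
            mu, log_sigma = res.x
            return mu, np.exp(log_sigma)

        counts = np.array([0, 1, 0, 3, 12, 2, 0, 45, 7, 1, 0, 2])  # made-up 16S taxon counts
        print(fit_pln(counts))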

  5. Statistical Calibration and Validation of a Homogeneous Ventilated Wall-Interference Correction Method for the National Transonic Facility

    NASA Technical Reports Server (NTRS)

    Walker, Eric L.

    2005-01-01

    Wind tunnel experiments will continue to be a primary source of validation data for many types of mathematical and computational models in the aerospace industry. The increased emphasis on accuracy of data acquired from these facilities requires understanding of the uncertainty of not only the measurement data but also any correction applied to the data. One of the largest and most critical corrections made to these data is due to wall interference. In an effort to understand the accuracy and suitability of these corrections, a statistical validation process for wall interference correction methods has been developed. This process is based on the use of independent cases which, after correction, are expected to produce the same result. Comparison of these independent cases with respect to the uncertainty in the correction process establishes a domain of applicability based on the capability of the method to provide reasonable corrections with respect to customer accuracy requirements. The statistical validation method was applied to the version of the Transonic Wall Interference Correction System (TWICS) recently implemented in the National Transonic Facility at NASA Langley Research Center. The TWICS code generates corrections for solid and slotted wall interference in the model pitch plane based on boundary pressure measurements. Before validation could be performed on this method, it was necessary to calibrate the ventilated wall boundary condition parameters. Discrimination comparisons are used to determine the most representative of three linear boundary condition models which have historically been used to represent longitudinally slotted test section walls. Of the three linear boundary condition models implemented for ventilated walls, the general slotted wall model was the most representative of the data. The TWICS code using the calibrated general slotted wall model was found to be valid to within the process uncertainty for test section Mach numbers less than or equal to 0.60. The scatter among the mean corrected results of the bodies of revolution validation cases was within one count of drag on a typical transport aircraft configuration for Mach numbers at or below 0.80 and two counts of drag for Mach numbers at or below 0.90.

  6. Improvement of a mixture experiment model relating the component proportions to the size of nanonized itraconazole particles in extemporary suspensions

    DOE PAGES

    Pattarino, Franco; Piepel, Greg; Rinaldi, Maurizio

    2018-03-03

    A paper by Foglio Bonda et al. published previously in this journal (2016, Vol. 83, pp. 175–183) discussed the use of mixture experiment design and modeling methods to study how the proportions of three components in an extemporaneous oral suspension affected the mean diameter of drug particles (Zave). The three components were itraconazole (ITZ), Tween 20 (TW20), and Methocel® E5 (E5). This commentary addresses some errors and other issues in the previous paper, and also discusses an improved model relating proportions of ITZ, TW20, and E5 to Zave. The improved model contains six of the 10 terms in the full-cubic mixture model, which were selected using a different cross-validation procedure than used in the previous paper. In conclusion, compared to the four-term model presented in the previous paper, the improved model fit the data better, had excellent cross-validation performance, and the predicted Zave of a validation point was within model uncertainty of the measured value.
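
    For illustration only, the sketch below builds the 10 terms of the Scheffé full-cubic model for a three-component mixture and scores candidate sub-models (always retaining the three linear blending terms) by a leave-one-out PRESS criterion. The design points and responses are invented, and the selection rule is a generic stand-in for the cross-validation procedure referred to above, not the authors' exact method.

        import itertools
        import numpy as np

        def cubic_terms(x1, x2, x3):
            """The 10 terms of the Scheffe full-cubic model for a 3-component mixture."""
            return np.column_stack([
                x1, x2, x3,
                x1 * x2, x1 * x3, x2 * x3,
                x1 * x2 * (x1 - x2), x1 * x3 * (x1 - x3), x2 * x3 * (x2 - x3),
                x1 * x2 * x3,
            ])

        def loo_press(X, y):
            """Leave-one-out predicted residual sum of squares for a least-squares fit."""
            press = 0.0
            for i in range(len(y)):
                keep = np.arange(len(y)) != i
                beta, *_ = np.linalg.lstsq(X[keep], y[keep], rcond=None)
                press += (y[i] - X[i] @ beta) ** 2
            return press

        # Hypothetical design points (proportions summing to 1) and invented Zave responses.
        rng = np.random.default_rng(0)
        props = rng.dirichlet(np.ones(3), size=12)
        y = 300 + 50 * props[:, 0] - 80 * props[:, 1] * props[:, 2] + rng.normal(0, 5, 12)

        X_full = cubic_terms(*props.T)
        best = min(
            (loo_press(X_full[:, [0, 1, 2, *extra]], y), (0, 1, 2, *extra))
            for r in range(0, 8)
            for extra in itertools.combinations(range(3, 10), r)
        )
        print("best term subset (column indices):", best[1], "PRESS:", round(best[0], 2))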

  7. A Comprehensive Validation Approach Using The RAVEN Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alfonsi, Andrea; Rabiti, Cristian; Cogliati, Joshua J

    2015-06-01

    The RAVEN computer code, developed at the Idaho National Laboratory, is a generic software framework for performing parametric and probabilistic analysis based on the response of complex system codes. RAVEN is a multi-purpose probabilistic and uncertainty quantification platform, capable of communicating with any system code. A natural extension of the RAVEN capabilities is the implementation of an integrated validation methodology, involving several different metrics, that represents an evolution of the methods currently used in the field. State-of-the-art validation approaches use neither exploration of the input space through sampling strategies nor a comprehensive variety of metrics needed to interpret the code responses with respect to experimental data. The RAVEN code addresses both of these gaps. In the following sections, the employed methodology and its application to the newly developed thermal-hydraulic code RELAP-7 are reported. The validation approach has been applied to an integral effect experiment representing natural circulation, based on the activities performed by EG&G Idaho. Four different experiment configurations have been considered and nodalized.

  8. Validation of the Small Hot Jet Acoustic Rig for Jet Noise Research

    NASA Technical Reports Server (NTRS)

    Bridges, James; Brown, Clifford A.

    2005-01-01

    The development and acoustic validation of the Small Hot Jet Aeroacoustic Rig (SHJAR) is documented. Originally conceived to support fundamental research in jet noise, the rig has been designed and developed using the best practices of the industry. While validating the rig for acoustic work, a method of characterizing all extraneous rig noise was developed. With this in hand, the researcher can know when the jet data being measured are contaminated and can design the experiment around this limitation. Also considered is the question of uncertainty, where it is shown that there is a fundamental uncertainty of roughly 0.5 dB in even the best experiments, confirmed by repeatability studies. One area not generally accounted for in the uncertainty analysis is the variation that can result from differences in the initial condition of the nozzle shear layer. This initial condition was modified and the differences in both flow and sound were documented. The bottom line is that extreme caution must be applied when working on small jet rigs, but highly accurate results can be obtained, independent of scale.

  9. Improvement of a mixture experiment model relating the component proportions to the size of nanonized itraconazole particles in extemporary suspensions.

    PubMed

    Pattarino, Franco; Piepel, Greg; Rinaldi, Maurizio

    2018-05-30

    A paper by Foglio Bonda et al. published previously in this journal (2016, Vol. 83, pp. 175-183) discussed the use of mixture experiment design and modeling methods to study how the proportions of three components in an extemporaneous oral suspension affected the mean diameter of drug particles (Zave). The three components were itraconazole (ITZ), Tween 20 (TW20), and Methocel® E5 (E5). This commentary addresses some errors and other issues in the previous paper, and also discusses an improved model relating proportions of ITZ, TW20, and E5 to Zave. The improved model contains six of the 10 terms in the full-cubic mixture model, which were selected using a different cross-validation procedure than used in the previous paper. Compared to the four-term model presented in the previous paper, the improved model fit the data better, had excellent cross-validation performance, and the predicted Zave of a validation point was within model uncertainty of the measured value. Copyright © 2018 Elsevier B.V. All rights reserved.

  10. Using simulated fluorescence cell micrographs for the evaluation of cell image segmentation algorithms.

    PubMed

    Wiesmann, Veit; Bergler, Matthias; Palmisano, Ralf; Prinzen, Martin; Franz, Daniela; Wittenberg, Thomas

    2017-03-18

    Manual assessment and evaluation of fluorescent micrograph cell experiments is time-consuming and tedious. Automated segmentation pipelines can ensure efficient and reproducible evaluation and analysis with constant high quality for all images of an experiment. Such cell segmentation approaches are usually validated and rated in comparison to manually annotated micrographs. Nevertheless, manual annotations are prone to errors and display inter- and intra-observer variability which influence the validation results of automated cell segmentation pipelines. We present a new approach to simulate fluorescent cell micrographs that provides an objective ground truth for the validation of cell segmentation methods. The cell simulation was evaluated twofold: (1) An expert observer study shows that the proposed approach generates realistic fluorescent cell micrograph simulations. (2) An automated segmentation pipeline on the simulated fluorescent cell micrographs reproduces segmentation performances of that pipeline on real fluorescent cell micrographs. The proposed simulation approach produces realistic fluorescent cell micrographs with corresponding ground truth. The simulated data is suited to evaluate image segmentation pipelines more efficiently and reproducibly than it is possible on manually annotated real micrographs.

  11. Improvement of a mixture experiment model relating the component proportions to the size of nanonized itraconazole particles in extemporary suspensions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pattarino, Franco; Piepel, Greg; Rinaldi, Maurizio

    A paper by Foglio Bonda et al. published previously in this journal (2016, Vol. 83, pp. 175–183) discussed the use of mixture experiment design and modeling methods to study how the proportions of three components in an extemporaneous oral suspension affected the mean diameter of drug particles (Zave). The three components were itraconazole (ITZ), Tween 20 (TW20), and Methocel® E5 (E5). This commentary addresses some errors and other issues in the previous paper, and also discusses an improved model relating proportions of ITZ, TW20, and E5 to Zave. The improved model contains six of the 10 terms in the full-cubic mixture model, which were selected using a different cross-validation procedure than used in the previous paper. In conclusion, compared to the four-term model presented in the previous paper, the improved model fit the data better, had excellent cross-validation performance, and the predicted Zave of a validation point was within model uncertainty of the measured value.

  12. Lessons Learned Designing and Using an Online Discussion Forum for Care Coordinators in Primary Care.

    PubMed

    Ferrante, Jeanne M; Friedman, Asia; Shaw, Eric K; Howard, Jenna; Cohen, Deborah J; Shahidi, Laleh

    2015-10-18

    While an increasing number of researchers are using online discussion forums for qualitative research, few authors have documented their experiences and lessons learned to demonstrate this method's viability and validity in health services research. We comprehensively describe our experiences, from start to finish, of designing and using an asynchronous online discussion forum for collecting and analyzing information elicited from care coordinators in Patient-Centered Medical Homes across the United States. Our lessons learned from each phase, including planning, designing, implementing, using, and ending this private online discussion forum, provide some recommendations for other health services researchers considering this method. An asynchronous online discussion forum is a feasible, efficient, and effective method to conduct a qualitative study, particularly when subjects are health professionals. © The Author(s) 2015.

  13. Assessing the stability of human locomotion: a review of current measures

    PubMed Central

    Bruijn, S. M.; Meijer, O. G.; Beek, P. J.; van Dieën, J. H.

    2013-01-01

    Falling poses a major threat to the steadily growing population of the elderly in modern-day society. A major challenge in the prevention of falls is the identification of individuals who are at risk of falling owing to an unstable gait. At present, several methods are available for estimating gait stability, each with its own advantages and disadvantages. In this paper, we review the currently available measures: the maximum Lyapunov exponent (λS and λL), the maximum Floquet multiplier, variability measures, long-range correlations, extrapolated centre of mass, stabilizing and destabilizing forces, foot placement estimator, gait sensitivity norm and maximum allowable perturbation. We explain what these measures represent and how they are calculated, and we assess their validity, divided up into construct validity, predictive validity in simple models, convergent validity in experimental studies, and predictive validity in observational studies. We conclude that (i) the validity of variability measures and λS is best supported across all levels, (ii) the maximum Floquet multiplier and λL have good construct validity, but negative predictive validity in models, negative convergent validity and (for λL) negative predictive validity in observational studies, (iii) long-range correlations lack construct validity and predictive validity in models and have negative convergent validity, and (iv) measures derived from perturbation experiments have good construct validity, but data are lacking on convergent validity in experimental studies and predictive validity in observational studies. In closing, directions for future research on dynamic gait stability are discussed. PMID:23516062

  14. Pesticide analysis in teas and chamomile by liquid chromatography and gas chromatography tandem mass spectrometry using a modified QuEChERS method: validation and pilot survey in real samples.

    PubMed

    Lozano, Ana; Rajski, Łukasz; Belmonte-Valles, Noelia; Uclés, Ana; Uclés, Samanta; Mezcua, Milagros; Fernández-Alba, Amadeo R

    2012-12-14

    This paper presents the validation of a modified QuEChERS method in four matrices - green tea, red tea, black tea and chamomile. The experiments were carried out using blank samples spiked with a solution of 86 pesticides (insecticides, fungicides and herbicides) at four levels - 10, 25, 50 and 100 μg/kg. The samples were extracted according to the citrate QuEChERS protocol; however, to reduce the amount of coextracted matrix compounds, calcium chloride was employed instead of magnesium sulphate in the clean-up step. The samples were analysed by LC-MS/MS and GC-MS/MS. Included in the scope of validation were recovery, linearity, matrix effects, limits of detection and quantitation, as well as intra-day and inter-day precision. The validated method was used in a real sample survey carried out on 75 samples purchased in ten different countries. In all matrices, recoveries of the majority of compounds were in the 70-120% range and were characterised by precision lower than 20%. In 85% of pesticide/matrix combinations the analytes can be detected quantitatively by the proposed method at the European Union Maximum Residue Level. The analysis of the real samples revealed that a large number of teas and chamomiles sold in the European Union contain pesticides whose use is not approved, as well as pesticides in concentrations above the EU MRLs. Copyright © 2012 Elsevier B.V. All rights reserved.

  15. Development and validation of sensitive LC/MS/MS method for quantitative bioanalysis of levonorgestrel in rat plasma and application to pharmacokinetics study.

    PubMed

    Ananthula, Suryatheja; Janagam, Dileep R; Jamalapuram, Seshulatha; Johnson, James R; Mandrell, Timothy D; Lowe, Tao L

    2015-10-15

    A rapid, sensitive, selective and accurate LC/MS/MS method was developed for the quantitative determination of levonorgestrel (LNG) in rat plasma and further validated for specificity, linearity, accuracy, precision, sensitivity, matrix effect, recovery efficiency and stability. A liquid-liquid extraction procedure using a hexane:ethyl acetate mixture at an 80:20 v:v ratio was employed to efficiently extract LNG from rat plasma. A reversed-phase Luna C18(2) column (50×2.0 mm i.d., 3 μm) installed on an AB SCIEX Triple Quad™ 4500 LC/MS/MS system was used to perform the chromatographic separation. LNG was identified within 2 min with high specificity. The calibration curve was linear over the 0.5-50 ng·mL(-1) concentration range. The developed method was validated for intra-day and inter-day accuracy and precision, whose values fell within the acceptable limits. The matrix effect was found to be minimal. Recovery efficiency at three quality control (QC) concentrations, 0.5 (low), 5 (medium) and 50 (high) ng·mL(-1), was found to be >90%. The stability of LNG at various stages of the experiment, including storage, extraction and analysis, was evaluated using QC samples, and the results showed that LNG was stable under all conditions. This validated method was successfully used to study the pharmacokinetics of LNG in rats after SubQ injection, demonstrating its applicability in relevant preclinical studies. Copyright © 2015 Elsevier B.V. All rights reserved.

  16. Improving the sensitivity and specificity of a bioanalytical assay for the measurement of certolizumab pegol.

    PubMed

    Smeraglia, John; Silva, John-Paul; Jones, Kieran

    2017-08-01

    In order to evaluate placental transfer of certolizumab pegol (CZP), a more sensitive and selective bioanalytical assay was required to accurately measure low CZP concentrations in infant and umbilical cord blood. Results & methodology: A new electrochemiluminescence immunoassay was developed to measure CZP levels in human plasma. Validation experiments demonstrated improved selectivity (no matrix interference observed) and a detection range of 0.032-5.0 μg/ml. Accuracy and precision met acceptance criteria (mean total error ≤20.8%). Dilution linearity and sample stability were acceptable and sufficient to support the method. The electrochemiluminescence immunoassay was validated for measuring low CZP concentrations in human plasma. The method demonstrated a more than tenfold increase in sensitivity compared with previous assays, and improved selectivity for intact CZP.

  17. Simulation of magnetic particles in microfluidic channels

    NASA Astrophysics Data System (ADS)

    Gusenbauer, Markus; Schrefl, Thomas

    2018-01-01

    In the field of biomedicine the applications of magnetic beads have increased immensely in the last decade. Drug delivery, magnetic resonance imaging, bioseparation and hyperthermia are only a small sample of their uses. Starting from microscale particles, research is focusing more and more on nanoscale particles. We are investigating and validating a method for simulating magnetic beads in a microfluidic flow, which will help to manipulate beads in a controlled and reproducible manner. We use the soft-matter simulation package ESPResSo to simulate magnetic particle dynamics in a lattice Boltzmann flow with applied external magnetic fields. Laminar as well as turbulent flow conditions in microfluidic systems can be analyzed, while particles tend to agglomerate due to magnetic interactions. The proposed simulation methods are validated against experiments from the literature.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    English, Shawn A.; Briggs, Timothy M.; Nelson, Stacy M.

    Simulations of low velocity impact with a flat cylindrical indenter upon a carbon fiber fabric reinforced polymer laminate are rigorously validated. Comparison of the impact energy absorption between the model and experiment is used as the validation metric. Additionally, non-destructive evaluation, including ultrasonic scans and three-dimensional computed tomography, provides qualitative validation of the models. The simulations include delamination, matrix cracks and fiber breaks. An orthotropic damage and failure constitutive model, capable of predicting progressive damage and failure, is developed in conjunction with the simulations and described. An ensemble of simulations incorporating model parameter uncertainties is used to predict a response distribution, which is then compared to experimental output using appropriate statistical methods. Lastly, the model form errors are exposed and corrected for use in an additional blind validation analysis. The result is a quantifiable confidence in material characterization and model physics when simulating low velocity impact in structures of interest.

  19. Integrated software for the detection of epileptogenic zones in refractory epilepsy.

    PubMed

    Mottini, Alejandro; Miceli, Franco; Albin, Germán; Nuñez, Margarita; Ferrándo, Rodolfo; Aguerrebere, Cecilia; Fernandez, Alicia

    2010-01-01

    In this paper we present integrated software designed to help nuclear medicine physicians in the detection of epileptogenic zones (EZ) by means of ictal-interictal SPECT and MR images. The tool was designed to be flexible, user-friendly and efficient. A novel detection method (a-contrario analysis) was included along with the classical detection method (subtraction analysis). The software's performance was evaluated with two separate sets of validation studies: visual interpretation of 12 patient images by an experienced observer, and objective analysis of virtual brain phantom experiments by the proposed numerical observers. Our results support the potential use of the proposed software to help nuclear medicine physicians detect EZ in clinical practice.

  20. Fully automated registration of first-pass myocardial perfusion MRI using independent component analysis.

    PubMed

    Milles, J; van der Geest, R J; Jerosch-Herold, M; Reiber, J H C; Lelieveldt, B P F

    2007-01-01

    This paper presents a novel method for registration of cardiac perfusion MRI. The method corrects for breathing motion without any manual interaction, using Independent Component Analysis (ICA) to extract physiologically relevant features together with their time-intensity behavior. A time-varying reference image mimicking intensity changes in the data of interest is computed based on the results of ICA and used to compute the displacement caused by breathing for each frame. Qualitative and quantitative validation of the method was carried out using 46 clinical-quality, short-axis perfusion MR datasets comprising 100 images each. Validation experiments showed a reduction of the average LV motion from 1.26+/-0.87 to 0.64+/-0.46 pixels. Time-intensity curves are also improved after registration, with the average error between registered data and the manual gold standard reduced from 2.65+/-7.89% to 0.87+/-3.88%. We conclude that this fully automatic ICA-based method shows excellent accuracy, robustness and computation speed, adequate for use in a clinical environment.

  1. A novel multi-target regression framework for time-series prediction of drug efficacy.

    PubMed

    Li, Haiqing; Zhang, Wei; Chen, Ying; Guo, Yumeng; Li, Guo-Zheng; Zhu, Xiaoxin

    2017-01-18

    Learning from small samples is a challenging pharmacokinetic problem to which statistical methods can be applied. Pharmacokinetic data are special because of their small sample sizes and high dimensionality, which makes it difficult to adopt conventional methods to predict the efficacy of traditional Chinese medicine (TCM) prescriptions. The main purpose of our study is to obtain some knowledge of the correlations in TCM prescriptions. Here, a novel method named the Multi-target Regression Framework is proposed to deal with the problem of efficacy prediction. We exploit the correlation between values at different time points and add the prediction targets of previous time points as features to predict the value at the current time point. Several experiments were conducted to test the validity of our method, and the results of leave-one-out cross-validation clearly demonstrate the competitiveness of our framework. Compared with linear regression, artificial neural networks, and partial least squares, support vector regression combined with our framework demonstrates the best performance and appears to be more suitable for this task.
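
    A minimal sketch of the general idea described above, on invented data: each time point's efficacy value is predicted from baseline features plus the known targets of the preceding time points, and performance is scored by leave-one-out cross-validation with support vector regression. The feature layout, dimensions and values are assumptions, not the authors' dataset.

        import numpy as np
        from sklearn.model_selection import LeaveOneOut
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVR

        rng = np.random.default_rng(1)
        n_samples, n_features, n_times = 20, 8, 4
        X_base = rng.normal(size=(n_samples, n_features))              # invented baseline features
        Y = np.cumsum(rng.normal(size=(n_samples, n_times)), axis=1)   # invented efficacy series

        errors = []
        for t in range(n_times):
            # Augment the features with the targets observed at earlier time points.
            X_aug = np.hstack([X_base, Y[:, :t]])
            y_t = Y[:, t]
            model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
            fold_err = []
            for train, test in LeaveOneOut().split(X_aug):
                model.fit(X_aug[train], y_t[train])
                fold_err.append((model.predict(X_aug[test])[0] - y_t[test][0]) ** 2)
            errors.append(np.mean(fold_err))

        print("leave-one-out mean squared error per time point:", np.round(errors, 3))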

  2. A novel multi-target regression framework for time-series prediction of drug efficacy

    PubMed Central

    Li, Haiqing; Zhang, Wei; Chen, Ying; Guo, Yumeng; Li, Guo-Zheng; Zhu, Xiaoxin

    2017-01-01

    Learning from small samples is a challenging pharmacokinetic problem to which statistical methods can be applied. Pharmacokinetic data are special because of their small sample sizes and high dimensionality, which makes it difficult to adopt conventional methods to predict the efficacy of traditional Chinese medicine (TCM) prescriptions. The main purpose of our study is to obtain some knowledge of the correlations in TCM prescriptions. Here, a novel method named the Multi-target Regression Framework is proposed to deal with the problem of efficacy prediction. We exploit the correlation between values at different time points and add the prediction targets of previous time points as features to predict the value at the current time point. Several experiments were conducted to test the validity of our method, and the results of leave-one-out cross-validation clearly demonstrate the competitiveness of our framework. Compared with linear regression, artificial neural networks, and partial least squares, support vector regression combined with our framework demonstrates the best performance and appears to be more suitable for this task. PMID:28098186

  3. Numerical modeling of local scour around hydraulic structure in sandy beds by dynamic mesh method

    NASA Astrophysics Data System (ADS)

    Fan, Fei; Liang, Bingchen; Bai, Yuchuan; Zhu, Zhixia; Zhu, Yanjun

    2017-10-01

    Local scour, a non-negligible factor in hydraulic engineering, endangers the safety of hydraulic structures. In this work, a numerical model for simulating local scour was constructed, based on the open source code computational fluid dynamics model OpenFOAM. We consider both the bedload and suspended load sediment transport in the scour model and adopt the dynamic mesh method to simulate the evolution of the bed elevation. We use the finite area method to project data between the three-dimensional flow model and the two-dimensional (2D) scour model. We also improved the 2D sand slide method and added it to the scour model to correct the bed bathymetry when the bed slope angle exceeds the angle of repose. Moreover, to validate our scour model, we conducted and compared the results of three experiments with those of the developed model. The validation results show that our developed model can reliably simulate local scour.

  4. Assessing culture via the Internet: methods and techniques for psychological research.

    PubMed

    Barry, D T

    2001-02-01

    This study examines the acculturation experiences of Arabic immigrants and assesses the utility of the Internet as a data collection tool. Based on in-depth pilot interview data from 10 male Arabic immigrants and items selected from pre-existing measures, the Male Arabic Ethnic Identity Measure (MAEIM) was developed. Male Arab immigrants (115 males) were solicited through traditional methods in addition to the Internet. Satisfactory reliability and validity were reported for the MAEIM. No significant differences emerged between the Internet and Midwestern samples. The Internet proved to be an effective method for soliciting a relatively large, geographically dispersed sample of Arabic immigrants. The use of the Internet as a research tool is examined in the context of anonymity, networking, low-cost, perceived interactive control, methodological rigor, and external validity. The Internet was an effective vehicle for addressing concerns raised by prospective participants. It is suggested that the Internet may be an important method to assess culture-relevant variables in further research on Arab and other immigrant populations.

  5. The analysis of bottom forming process for hybrid heating device

    NASA Astrophysics Data System (ADS)

    Bałon, Paweł; Świątoniowski, Andrzej; Kiełbasa, Bartłomiej

    2017-10-01

    In this paper the authors present an unusual method of bottom forming applicable to various industrial purposes, including the manufacture of water heaters and pressure equipment. The method allows a stainless steel bottom to be formed into a pre-determined shape conforming to the DIN standard, which specifies the most advantageous cross-sectional dimensions with respect to working pressure loading. The authors verified the validity of the method numerically and experimentally, producing a tool designed to form bottoms of the specified geometry. Many problems are encountered during the design and production of such parts, especially excessive sheet wrinkling over a large area of the part. The experiment showed that designing such elements without experience and numerical analysis would result in highly wrinkled parts, a defect that would make them impossible to assemble with the cylindrical part. Many tool shops employ a method for drawing elements with a spherical surface that involves additional spinning, stamping, and grading operations, which greatly increases the cost of parts production. The authors present and compare two forming methods for spherical and parabolic objects, and experimentally confirm the validity of the sheet reversing method with adequate pressure force. The applied method produces parts in one drawing operation, followed by an operation based on laser or water cutting to obtain a round blank. This reduces tooling costs by requiring just one tool, which can be placed on any hydraulic press with a minimum force of 2 000 kN.

  6. Compressed domain ECG biometric with two-lead features

    NASA Astrophysics Data System (ADS)

    Lee, Wan-Jou; Chang, Wen-Whei

    2016-07-01

    This study presents a new method to combine ECG biometrics with data compression within a common JPEG2000 framework. We target the two-lead ECG configuration that is routinely used in long-term heart monitoring. Incorporation of compressed-domain biometric techniques enables faster person identification as it by-passes the full decompression. Experiments on public ECG databases demonstrate the validity of the proposed method for biometric identification with high accuracies on both healthy and diseased subjects.

  7. Surface Tension of Solids in the Absence of Adsorption

    PubMed Central

    2009-01-01

    A method has been recently proposed for determining the value of the surface tension of a solid in the absence of adsorption, γS0, using material properties determined from vapor adsorption experiments. If valid, the value obtained for γS0 must be independent of the vapor used. We apply the proposed method to determine the value of γS0 for four solids using at least two vapors for each solid and find results that support the proposed method for determining γS0. PMID:19719092

  8. Emitter signal separation method based on multi-level digital channelization

    NASA Astrophysics Data System (ADS)

    Han, Xun; Ping, Yifan; Wang, Sujun; Feng, Ying; Kuang, Yin; Yang, Xinquan

    2018-02-01

    To solve the problem of emitter separation in a complex electromagnetic environment, a signal separation method based on multi-level digital channelization is proposed in this paper. A two-level structure that divides the signal into different channels is designed first; after that, the peaks of the different channels are tracked using a tracking filter, and signals that coincide in the time domain are separated in the time-frequency domain. Finally, the time-domain waveforms of the different signals are recovered by the inverse transformation. The validity of the proposed method is demonstrated by experiment.

  9. Fecal electrolyte testing for evaluation of unexplained diarrhea: Validation of body fluid test accuracy in the absence of a reference method.

    PubMed

    Voskoboev, Nikolay V; Cambern, Sarah J; Hanley, Matthew M; Giesen, Callen D; Schilling, Jason J; Jannetto, Paul J; Lieske, John C; Block, Darci R

    2015-11-01

    Validation of tests performed on body fluids other than blood or urine can be challenging due to the lack of a reference method to confirm accuracy. The aim of this study was to evaluate alternate assessments of accuracy that laboratories can rely on to validate body fluid tests in the absence of a reference method, using the example of sodium (Na(+)), potassium (K(+)), and magnesium (Mg(2+)) testing in stool fluid. Validations of fecal Na(+), K(+), and Mg(2+) were performed on the Roche cobas 6000 c501 (Roche Diagnostics) using residual stool specimens submitted for clinical testing. Spiked recovery, mixing studies, and serial dilutions were performed and % recovery of each analyte was calculated to assess accuracy. Results were confirmed by comparison to a reference method (ICP-OES, PerkinElmer). Mean recoveries for fecal electrolytes were Na(+) upon spiking=92%, mixing=104%, and dilution=105%; K(+) upon spiking=94%, mixing=96%, and dilution=100%; and Mg(2+) upon spiking=93%, mixing=98%, and dilution=100%. When autoanalyzer results were compared to reference ICP-OES results, Na(+) had a slope=0.94, intercept=4.1, and R(2)=0.99; K(+) had a slope=0.99, intercept=0.7, and R(2)=0.99; and Mg(2+) had a slope=0.91, intercept=-4.6, and R(2)=0.91. Calculated osmotic gaps using both methods were highly correlated, with slope=0.95, intercept=4.5, and R(2)=0.97. Acid pretreatment increased magnesium recovery from a subset of clinical specimens. A combination of mixing, spiking, and dilution recovery experiments is an acceptable surrogate for assessing accuracy in body fluid validations in the absence of a reference method. Copyright © 2015 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
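
    The three accuracy checks named above (spiking, mixing and serial dilution) all reduce to percent-recovery calculations against an expected value. A generic sketch with invented concentrations rather than the study's data:

        def spike_recovery(measured_spiked, measured_base, spike_added):
            """% recovery of a known amount spiked into a stool-fluid pool."""
            return 100.0 * (measured_spiked - measured_base) / spike_added

        def mixing_recovery(measured_mix, conc_a, conc_b, frac_a=0.5):
            """% recovery for a mixture of two pools combined in proportion frac_a."""
            expected = frac_a * conc_a + (1.0 - frac_a) * conc_b
            return 100.0 * measured_mix / expected

        def dilution_recovery(measured_diluted, measured_neat, dilution_factor):
            """% recovery of a serial dilution relative to the neat specimen."""
            return 100.0 * measured_diluted * dilution_factor / measured_neat

        # Invented example values (mmol/L) for a sodium validation.
        print(spike_recovery(measured_spiked=92.0, measured_base=42.0, spike_added=54.0))
        print(mixing_recovery(measured_mix=65.0, conc_a=42.0, conc_b=90.0))
        print(dilution_recovery(measured_diluted=21.5, measured_neat=42.0, dilution_factor=2))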

  10. A multi-frequency iterative imaging method for discontinuous inverse medium problem

    NASA Astrophysics Data System (ADS)

    Zhang, Lei; Feng, Lixin

    2018-06-01

    The inverse medium problem with a discontinuous refractive index is a particularly challenging class of inverse problem. We employ primal-dual theory and the fast solution of integral equations, and propose a new iterative imaging method. The regularization parameter is selected by the method of generalized cross-validation. Based on multi-frequency measurements of the scattered field, a recursive linearization algorithm is presented that proceeds from low to high frequency. We also discuss the strategy for selecting the initial guess using semi-analytical approaches. Numerical experiments are presented to show the effectiveness of the proposed method.
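
    Generalized cross-validation for choosing a Tikhonov regularization parameter can be written compactly through the SVD of the discretized forward operator. The sketch below is the standard textbook form of that criterion applied to a made-up ill-conditioned linear system; it is not the paper's discontinuous-medium scattering solver.

        import numpy as np

        def gcv_lambda(A, b, lambdas):
            """Pick the Tikhonov parameter minimizing the GCV score, via the SVD of A."""
            m, _ = A.shape
            U, s, _ = np.linalg.svd(A, full_matrices=False)
            beta = U.T @ b                   # projections of b on the left singular vectors
            resid_out = b @ b - beta @ beta  # part of b outside the column space of A
            scores = []
            for lam in lambdas:
                f = s**2 / (s**2 + lam)      # Tikhonov filter factors
                resid = np.sum(((1.0 - f) * beta) ** 2) + resid_out
                trace = m - np.sum(f)        # trace of (I - influence matrix)
                scores.append(m * resid / trace**2)
            return lambdas[int(np.argmin(scores))], scores

        # Made-up ill-conditioned system.
        rng = np.random.default_rng(2)
        A = rng.normal(size=(60, 30)) @ np.diag(np.logspace(0, -6, 30))
        b = A @ rng.normal(size=30) + 1e-3 * rng.normal(size=60)

        lam_opt, _ = gcv_lambda(A, b, np.logspace(-12, 0, 50))
        print("GCV-selected regularization parameter:", lam_opt)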

  11. Reduced basis ANOVA methods for partial differential equations with high-dimensional random inputs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liao, Qifeng, E-mail: liaoqf@shanghaitech.edu.cn; Lin, Guang, E-mail: guanglin@purdue.edu

    2016-07-15

    In this paper we present a reduced basis ANOVA approach for partial differential equations (PDEs) with random inputs. The ANOVA method combined with stochastic collocation methods provides model reduction in high-dimensional parameter space through decomposing high-dimensional inputs into unions of low-dimensional inputs. In this work, to further reduce the computational cost, we investigate spatial low-rank structures in the ANOVA-collocation method, and develop efficient spatial model reduction techniques using hierarchically generated reduced bases. We present a general mathematical framework of the methodology, validate its accuracy and demonstrate its efficiency with numerical experiments.

  12. A simplified method for extracting androgens from avian egg yolks

    USGS Publications Warehouse

    Kozlowski, C.P.; Bauman, J.E.; Hahn, D.C.

    2009-01-01

    Female birds deposit significant amounts of steroid hormones into the yolks of their eggs. Studies have demonstrated that these hormones, particularly androgens, affect nestling growth and development. In order to measure androgen concentrations in avian egg yolks, most authors follow the extraction methods outlined by Schwabl (1993. Proc. Nat. Acad. Sci. USA 90:11446-11450). We describe a simplified method for extracting androgens from avian egg yolks. Our method, which has been validated through recovery and linearity experiments, consists of a single ethanol precipitation that produces substantially higher recoveries than those reported by Schwabl.

  13. INL Experimental Program Roadmap for Thermal Hydraulic Code Validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Glenn McCreery; Hugh McIlroy

    2007-09-01

    Advanced computer modeling and simulation tools and protocols will be heavily relied on for a wide variety of system studies, engineering design activities, and other aspects of the Next Generation Nuclear Plant (NGNP) Very High Temperature Reactor (VHTR), the DOE Global Nuclear Energy Partnership (GNEP), and light-water reactors. The goal is for all modeling and simulation tools to be demonstrated to be accurate and reliable through a formal Verification and Validation (V&V) process, especially where such tools are to be used to establish safety margins and support regulatory compliance, or to design a system in a manner that reduces the role of expensive mockups and prototypes. Recent literature identifies specific experimental principles that must be followed in order to ensure that experimental data meet the standards required for a "benchmark" database. Even for well conducted experiments, missing experimental details, such as geometrical definition, data reduction procedures, and manufacturing tolerances, have led to poor benchmark calculations. The INL has a long and deep history of research in thermal hydraulics, especially in the 1960s through 1980s when many programs such as LOFT and Semiscale were devoted to light-water reactor safety research, the EBR-II fast reactor was in operation, and a strong geothermal energy program was established. The past can serve as a partial guide for reinvigorating thermal hydraulic research at the laboratory. However, new research programs need to fully incorporate modern experimental methods such as measurement techniques using the latest instrumentation, computerized data reduction, and scaling methodology. The path forward for establishing experimental research for code model validation will require benchmark experiments conducted in suitable facilities located at the INL. This document describes thermal hydraulic facility requirements and candidate buildings and presents examples of suitable validation experiments related to VHTRs, sodium-cooled fast reactors, and light-water reactors. These experiments range from relatively low-cost benchtop experiments for investigating individual phenomena to large electrically-heated integral facilities for investigating reactor accidents and transients.

  14. 2-D Circulation Control Airfoil Benchmark Experiments Intended for CFD Code Validation

    NASA Technical Reports Server (NTRS)

    Englar, Robert J.; Jones, Gregory S.; Allan, Brian G.; Lin, Johb C.

    2009-01-01

    A current NASA Research Announcement (NRA) project being conducted by Georgia Tech Research Institute (GTRI) personnel and NASA collaborators includes the development of Circulation Control (CC) blown airfoils to improve subsonic aircraft high-lift and cruise performance. The emphasis of this program is the development of CC active flow control concepts for high-lift augmentation, drag control, and cruise efficiency. A collaboration in this project includes work by NASA research engineers, whereas CFD validation and flow physics experimental research are part of NASA's systematic approach to developing design and optimization tools for CC applications to fixed-wing aircraft. The design space for CESTOL-type aircraft focuses on geometries that depend on advanced flow control technologies, including Circulation Control aerodynamics. The ability to consistently predict advanced aircraft performance requires improvements in design tools to include these advanced concepts. Validation of these tools will be based on experimental methods applied to complex flows that go beyond conventional aircraft modeling techniques. This paper focuses on recent and ongoing benchmark high-lift experiments and CFD efforts intended to provide 2-D CFD validation data sets related to NASA's Cruise Efficient Short Take Off and Landing (CESTOL) study. Both the experimental data and related CFD predictions are discussed.

  15. Development and Validation of a Monte Carlo Simulation Tool for Multi-Pinhole SPECT

    PubMed Central

    Mok, Greta S. P.; Du, Yong; Wang, Yuchuan; Frey, Eric C.; Tsui, Benjamin M. W.

    2011-01-01

    Purpose: In this work, we developed and validated a Monte Carlo simulation (MCS) tool for investigation and evaluation of multi-pinhole (MPH) SPECT imaging. Procedures: This tool was based on a combination of the SimSET and MCNP codes. Photon attenuation and scatter in the object, as well as penetration and scatter through the collimator detector, are modeled in this tool. It allows accurate and efficient simulation of MPH SPECT with focused pinhole apertures and user-specified photon energy, aperture material, and imaging geometry. The MCS method was validated by comparing the point response function (PRF), detection efficiency (DE), and image profiles obtained from point sources and phantom experiments. A prototype single-pinhole collimator and focused four- and five-pinhole collimators fitted on a small animal imager were used for the experimental validations. We have also compared computational speed among various simulation tools for MPH SPECT, including SimSET-MCNP, MCNP, SimSET-GATE, and GATE for simulating projections of a hot sphere phantom. Results: We found good agreement between the MCS and experimental results for PRF, DE, and image profiles, indicating the validity of the simulation method. The relative computational speeds for SimSET-MCNP, MCNP, SimSET-GATE, and GATE are 1: 2.73: 3.54: 7.34, respectively, for 120-view simulations. We also demonstrated the application of this MCS tool in small animal imaging by generating a set of low-noise MPH projection data of a 3D digital mouse whole body phantom. Conclusions: The new method is useful for studying MPH collimator designs, data acquisition protocols, image reconstructions, and compensation techniques. It also has great potential to be applied for modeling the collimator-detector response with penetration and scatter effects for MPH in the quantitative reconstruction method. PMID:19779896

  16. A practical method to test the validity of the standard Gumbel distribution in logit-based multinomial choice models of travel behavior

    DOE PAGES

    Ye, Xin; Garikapati, Venu M.; You, Daehyun; ...

    2017-11-08

    Most multinomial choice models (e.g., the multinomial logit model) adopted in practice assume an extreme-value Gumbel distribution for the random components (error terms) of utility functions. This distributional assumption offers a closed-form likelihood expression when the utility maximization principle is applied to model choice behaviors. As a result, model coefficients can be easily estimated using the standard maximum likelihood estimation method. However, maximum likelihood estimators are consistent and efficient only if distributional assumptions on the random error terms are valid. It is therefore critical to test the validity of underlying distributional assumptions on the error terms that form the basis of parameter estimation and policy evaluation. In this paper, a practical yet statistically rigorous method is proposed to test the validity of the distributional assumption on the random components of utility functions in both the multinomial logit (MNL) model and multiple discrete-continuous extreme value (MDCEV) model. Based on a semi-nonparametric approach, a closed-form likelihood function that nests the MNL or MDCEV model being tested is derived. The proposed method allows traditional likelihood ratio tests to be used to test violations of the standard Gumbel distribution assumption. Simulation experiments are conducted to demonstrate that the proposed test yields acceptable Type-I and Type-II error probabilities at commonly available sample sizes. The test is then applied to three real-world discrete and discrete-continuous choice models. For all three models, the proposed test rejects the validity of the standard Gumbel distribution in most utility functions, calling for the development of robust choice models that overcome adverse effects of violations of distributional assumptions on the error terms in random utility functions.

  17. A practical method to test the validity of the standard Gumbel distribution in logit-based multinomial choice models of travel behavior

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ye, Xin; Garikapati, Venu M.; You, Daehyun

    Most multinomial choice models (e.g., the multinomial logit model) adopted in practice assume an extreme-value Gumbel distribution for the random components (error terms) of utility functions. This distributional assumption offers a closed-form likelihood expression when the utility maximization principle is applied to model choice behaviors. As a result, model coefficients can be easily estimated using the standard maximum likelihood estimation method. However, maximum likelihood estimators are consistent and efficient only if distributional assumptions on the random error terms are valid. It is therefore critical to test the validity of underlying distributional assumptions on the error terms that form the basis of parameter estimation and policy evaluation. In this paper, a practical yet statistically rigorous method is proposed to test the validity of the distributional assumption on the random components of utility functions in both the multinomial logit (MNL) model and multiple discrete-continuous extreme value (MDCEV) model. Based on a semi-nonparametric approach, a closed-form likelihood function that nests the MNL or MDCEV model being tested is derived. The proposed method allows traditional likelihood ratio tests to be used to test violations of the standard Gumbel distribution assumption. Simulation experiments are conducted to demonstrate that the proposed test yields acceptable Type-I and Type-II error probabilities at commonly available sample sizes. The test is then applied to three real-world discrete and discrete-continuous choice models. For all three models, the proposed test rejects the validity of the standard Gumbel distribution in most utility functions, calling for the development of robust choice models that overcome adverse effects of violations of distributional assumptions on the error terms in random utility functions.

  18. A novel, rapid and automated conductometric method to evaluate surfactant-cells interactions by means of critical micellar concentration analysis.

    PubMed

    Tiecco, Matteo; Corte, Laura; Roscini, Luca; Colabella, Claudia; Germani, Raimondo; Cardinali, Gianluigi

    2014-07-25

    Conductometry is widely used to determine the critical micellar concentration and micellar aggregate surface properties of amphiphiles. Current conductivity experiments on surfactant solutions are typically carried out by manual pipetting, yielding a few tens of reading points within a couple of hours. In order to study the properties of surfactant-cell interactions, each amphiphile must be tested under different conditions against several types of cells. This calls for complex experimental designs that make the application of current methods seriously time-consuming, especially because long experiments risk altering the cells independently of the surfactant action. In this paper we present a novel, accurate and rapid automated procedure to obtain conductometric curves with several hundred reading points within tens of minutes. The method was validated with surfactant solutions alone and in combination with Saccharomyces cerevisiae cells. An easy-to-use R script calculates conductometric parameters and their statistical significance, with a graphical interface to visualize data and results. The validations showed that the procedure works in the same manner with surfactant alone or in combination with cells, yielding around 1000 reading points within 20 min with high accuracy, as determined by regression analysis. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
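
    Conductometric CMC determination typically reduces to locating the breakpoint between two nearly linear branches of the conductivity-versus-concentration curve. The sketch below performs that breakpoint search on synthetic titration data; it is a generic illustration in Python, not the authors' R script.

        import numpy as np

        def find_cmc(conc, kappa):
            """Fit straight lines below/above each candidate breakpoint and return the
            concentration splitting the data with the smallest total squared residual."""
            best = (np.inf, None)
            for i in range(3, len(conc) - 3):            # require >= 3 points per branch
                sse = 0.0
                for sl in (slice(None, i), slice(i, None)):
                    coef = np.polyfit(conc[sl], kappa[sl], 1)
                    sse += np.sum((kappa[sl] - np.polyval(coef, conc[sl])) ** 2)
                if sse < best[0]:
                    best = (sse, conc[i])
            return best[1]

        # Synthetic titration: the conductivity slope drops above an assumed CMC of 8 mM.
        rng = np.random.default_rng(3)
        conc = np.linspace(0.5, 20.0, 200)               # surfactant concentration [mM]
        kappa = np.where(conc < 8.0, 60.0 * conc, 60.0 * 8.0 + 25.0 * (conc - 8.0))
        kappa = kappa + rng.normal(0.0, 2.0, conc.size)  # measurement noise [uS/cm]

        print(f"estimated CMC: {find_cmc(conc, kappa):.2f} mM")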

  19. Improving validation methods for molecular diagnostics: application of Bland-Altman, Deming and simple linear regression analyses in assay comparison and evaluation for next-generation sequencing.

    PubMed

    Misyura, Maksym; Sukhai, Mahadeo A; Kulasignam, Vathany; Zhang, Tong; Kamel-Reid, Suzanne; Stockley, Tracy L

    2018-02-01

    A standard approach in test evaluation is to compare results of the assay in validation to results from previously validated methods. For quantitative molecular diagnostic assays, comparison of test values is often performed using simple linear regression and the coefficient of determination (R²), using R² as the primary metric of assay agreement. However, the use of R² alone does not adequately quantify constant or proportional errors required for optimal test evaluation. More extensive statistical approaches, such as Bland-Altman and expanded interpretation of linear regression methods, can be used to more thoroughly compare data from quantitative molecular assays. We present the application of Bland-Altman and linear regression statistical methods to evaluate quantitative outputs from next-generation sequencing (NGS) assays. NGS-derived data sets from assay validation experiments were used to demonstrate the utility of the statistical methods. Both Bland-Altman and linear regression were able to detect the presence and magnitude of constant and proportional error in quantitative values of NGS data. Deming linear regression was used in the context of assay comparison studies, while simple linear regression was used to analyse serial dilution data. The Bland-Altman statistical approach was also adapted to quantify assay accuracy, including constant and proportional errors, and precision where theoretical and empirical values were known. The complementary application of the statistical methods described in this manuscript enables more extensive evaluation of performance characteristics of quantitative molecular assays, prior to implementation in the clinical molecular laboratory. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
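
    A compact sketch of the two comparison statistics named above: Bland-Altman bias with 95% limits of agreement, and Deming regression with an assumed error-variance ratio of 1, applied to made-up paired measurements rather than real NGS output.

        import numpy as np

        def bland_altman(x, y):
            """Mean bias and 95% limits of agreement between two paired methods."""
            diff = y - x
            bias, sd = diff.mean(), diff.std(ddof=1)
            return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

        def deming(x, y, delta=1.0):
            """Deming regression slope and intercept with error-variance ratio delta."""
            sxx, syy = np.var(x, ddof=1), np.var(y, ddof=1)
            sxy = np.cov(x, y, ddof=1)[0, 1]
            slope = (syy - delta * sxx + np.sqrt((syy - delta * sxx) ** 2
                     + 4 * delta * sxy ** 2)) / (2 * sxy)
            return slope, y.mean() - slope * x.mean()

        # Made-up paired measurements (e.g. variant allele fractions, %) from two assays.
        rng = np.random.default_rng(4)
        x = rng.uniform(1, 50, 40)
        y = 1.03 * x + 0.5 + rng.normal(0, 1.0, 40)

        print("Bland-Altman bias and limits:", bland_altman(x, y))
        print("Deming slope and intercept:", deming(x, y))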

  20. [Reliability and validity of Driving Anger Scale in professional drivers in China].

    PubMed

    Li, Z; Yang, Y M; Zhang, C; Li, Y; Hu, J; Gao, L W; Zhou, Y X; Zhang, X J

    2017-11-10

    Objective: To assess the reliability and validity of the Chinese version of the Driving Anger Scale (DAS) in professional drivers in China and provide a scientific basis for the application of the scale in drivers in China. Methods: Professional drivers, including taxi drivers, bus drivers, truck drivers and school bus drivers, were selected to complete the questionnaire. Cronbach's α and split-half reliability were calculated to evaluate the reliability of the DAS, and content, construct, discriminant and convergent validity analyses were performed to measure the validity of the scale. Results: The overall Cronbach's α of the DAS was 0.934 and the split-half reliability was 0.874. The correlation coefficients of the subscales with the total scale were 0.639-0.922. The simplified version of the DAS supported the presupposed six-factor structure, which explained 56.371% of the total variance in exploratory factor analysis. The DAS had good convergent and discriminant validity, with a calibration experiment success rate of 100%. Conclusion: The DAS has good reliability and validity in professional drivers in China, and its use in drivers is worth promoting.
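    For readers unfamiliar with the reliability statistics quoted above, the following minimal sketch shows how Cronbach's α and Spearman-Brown-corrected split-half reliability are typically computed from item-level data; the data layout and item count are illustrative assumptions, not the DAS data.

```python
# Minimal sketch (assumed item-level data layout) of the two reliability
# statistics reported above: Cronbach's alpha and split-half reliability with
# the Spearman-Brown correction.
import numpy as np

def cronbach_alpha(items):
    """items: 2-D array, rows = respondents, columns = scale items."""
    items = np.asarray(items, float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

def split_half(items):
    """Odd-even split with Spearman-Brown correction."""
    items = np.asarray(items, float)
    half1 = items[:, 0::2].sum(axis=1)
    half2 = items[:, 1::2].sum(axis=1)
    r = np.corrcoef(half1, half2)[0, 1]
    return 2 * r / (1 + r)

# Example with simulated 5-point Likert responses (item count illustrative)
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 1))
data = np.clip(np.round(3 + latent + rng.normal(scale=1.0, size=(200, 33))), 1, 5)
print("alpha:", round(cronbach_alpha(data), 3), "split-half:", round(split_half(data), 3))
```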

  1. [Translation and validation in French of the First-Time Father Questionnaire].

    PubMed

    Capponi, I; Carquillat, P; Premberg, Å; Vendittelli, F; Guittier, M-J

    2016-09-01

    For fathers, being present at a birth for the first time is not an insignificant event. Witnessing suffering can cause feelings of loneliness and powerlessness, which may be associated with postnatal problems such as depression. However, without a validated French-language tool concerning the experience of childbirth for fathers, we are limited in our ability to develop our understanding of their experiences, to establish links between these experiences and their distress (anxiety, depression, etc.), or to develop appropriate methods of support. Our objective was to translate and validate the Swedish "First-Time Father Questionnaire" with a French-speaking sample. The tool was translated using a translation/back-translation process (using two independent agencies, with a pre-test on 30 new fathers as well as exchanges with the Swedish authors). The French version was then tested with 154 new fathers at 1 month post-partum. Factorial analysis followed by multi-trait analysis and variance analyses were conducted, with subgroups contrasted according to the mode of delivery. The factorial structure is satisfactory, retaining 19 items and reproducing 54.12% of the variance. Professional support, worry, and prenatal preparation constitute its 3 dimensions. Internal consistency, homogeneity, and the discriminating capacity of the questionnaire are good. Validation of the questionnaire shows good metrological qualities. It can therefore be used in the perinatal field to evaluate the childbirth experience for first-time fathers. Copyright © 2016 Elsevier Masson SAS. All rights reserved.

  2. Determination of sulfonamide antibiotics and metabolites in liver, muscle and kidney samples by pressurized liquid extraction or ultrasound-assisted extraction followed by liquid chromatography-quadrupole linear ion trap-tandem mass spectrometry (HPLC-QqLIT-MS/MS).

    PubMed

    Hoff, Rodrigo Barcellos; Pizzolato, Tânia Mara; Peralba, Maria do Carmo Ruaro; Díaz-Cruz, M Silvia; Barceló, Damià

    2015-03-01

    Sulfonamides are widely used in human and veterinary medicine. The presence of sulfonamide residues in food is an issue of great concern. In the present work, a method for the targeted analysis of residues of 16 sulfonamides and metabolites in liver of several species has been developed and validated. Extraction and clean-up have been statistically optimized using central composite design experiments. Two extraction methods have been developed, validated and compared: i) pressurized liquid extraction, in which samples were defatted with hexane and subsequently extracted with acetonitrile, and ii) ultrasound-assisted extraction with acetonitrile and further liquid-liquid extraction with hexane. Extracts have been analyzed by liquid chromatography-quadrupole linear ion trap-tandem mass spectrometry. The validation procedure was based on Commission Decision 2002/657/EC and included the assessment of parameters such as decision limit (CCα), detection capability (CCβ), sensitivity, selectivity, accuracy and precision. The method's performance has been satisfactory, with CCα values within the range of 111.2-161.4 µg kg(-1), limits of detection of 10 µg kg(-1) and accuracy values around 100% for all compounds. Copyright © 2014 Elsevier B.V. All rights reserved.

  3. [Study on freshness evaluation of ice-stored large yellow croaker (Pseudosciaena crocea) using near infrared spectroscopy].

    PubMed

    Liu, Yuan; Chen, Wei-Hua; Hou, Qiao-Juan; Wang, Xi-Chang; Dong, Ruo-Yan; Wu, Hao

    2014-04-01

    Near infrared spectroscopy (NIR) was used in this experiment to evaluate the freshness of ice-stored large yellow croaker (Pseudosciaena crocea) during different storage periods, with TVB-N used as the freshness index. By comparing the correlation coefficients and standard deviations of the calibration and validation sets of models established with different pretreatment methods (alone and in combination), different modeling methods and different wavelength regions, the best TVB-N models for market-sold ice-stored large yellow croaker were established to predict freshness quickly. The results show that the best-performing model was obtained using normalization by closure (Ncl) with 1st derivative (Dbl) and normalization to unit length (Nle) with 1st derivative as the pretreatments, partial least squares (PLS) as the modeling method, and the wavelength regions of 5 000-7 144 and 7 404-10 000 cm(-1). The calibration model gave a correlation coefficient of 0.992 with a standard error of calibration of 1.045, and the validation model gave a correlation coefficient of 0.999 with a standard error of prediction of 0.990. This experiment combined several pretreatment methods and selected the best wavelength region with good results, and shows promise for rapid freshness detection and quality evaluation of large yellow croaker in the market.
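    The following sketch is a simplified illustration (not the authors' exact pipeline) of the modeling strategy described above: a first-derivative pretreatment of NIR spectra followed by a PLS regression model evaluated by correlation coefficient and standard error on calibration and validation sets. The spectra, reference TVB-N values, and parameter choices below are placeholders; wavelength-region selection and the Ncl/Nle normalizations are omitted.

```python
# Illustrative sketch: first-derivative pretreatment of NIR spectra followed by
# a PLS regression model, evaluated by correlation coefficient and standard
# error on calibration and validation sets.
import numpy as np
from scipy.signal import savgol_filter
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
X = rng.normal(size=(120, 500))            # placeholder spectra (samples x wavenumbers)
y = X[:, 100] * 3 - X[:, 300] * 2 + rng.normal(scale=0.5, size=120)  # placeholder TVB-N

X_d1 = savgol_filter(X, window_length=11, polyorder=2, deriv=1, axis=1)  # 1st derivative

X_cal, X_val, y_cal, y_val = train_test_split(X_d1, y, test_size=0.3, random_state=0)
pls = PLSRegression(n_components=8).fit(X_cal, y_cal)

for name, Xs, ys in [("calibration", X_cal, y_cal), ("validation", X_val, y_val)]:
    pred = pls.predict(Xs).ravel()
    r = np.corrcoef(ys, pred)[0, 1]
    se = np.sqrt(np.mean((ys - pred) ** 2))
    print(f"{name}: r = {r:.3f}, standard error = {se:.3f}")
```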

  4. X-ray luminescence computed tomography imaging based on X-ray distribution model and adaptively split Bregman method

    PubMed Central

    Chen, Dongmei; Zhu, Shouping; Cao, Xu; Zhao, Fengjun; Liang, Jimin

    2015-01-01

    X-ray luminescence computed tomography (XLCT) has become a promising imaging technology for biological applications based on phosphor nanoparticles. There are mainly three kinds of XLCT imaging systems: pencil beam XLCT, narrow beam XLCT and cone beam XLCT. Narrow beam XLCT can be regarded as a balance between the pencil beam mode and the cone beam mode in terms of imaging efficiency and image quality. The collimated X-ray beams are assumed to be parallel in traditional narrow beam XLCT. However, we observe that in our prototype narrow beam XLCT the cone beam X-rays are collimated into beams with fan-shaped broadening rather than parallel ones. Hence we incorporate the fan-shaped distribution of the X-ray beams in the physical model and collect optical data from only two perpendicular directions to further shorten the scanning time. Meanwhile, we propose a depth-related adaptive regularized split Bregman (DARSB) method for reconstruction. The simulation experiments show that the proposed physical model and method achieve better results in location error, Dice coefficient, mean square error and intensity error than the traditional split Bregman method, and validate the feasibility of the method. The phantom experiment achieved a location error of less than 1.1 mm and validated that incorporating fan-shaped X-ray beams in our model achieves better results than assuming parallel X-rays. PMID:26203388
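    The depth-related adaptive regularization (DARSB) proposed in the paper is not reproduced here, but the family of methods it builds on is the split Bregman iteration for L1-regularized least squares. The sketch below shows the generic, non-adaptive form on a small synthetic problem, purely as orientation.

```python
# Generic split Bregman iteration for min_x ||A x - b||_2^2 + lam * ||x||_1,
# shown only to illustrate the family of methods the paper builds on; the
# depth-related adaptive weighting of the proposed DARSB method is not reproduced.
import numpy as np

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def split_bregman_l1(A, b, lam=0.1, mu=1.0, n_iter=100):
    n = A.shape[1]
    x = np.zeros(n)
    d = np.zeros(n)
    bk = np.zeros(n)
    Atb = A.T @ b
    M = 2 * (A.T @ A) + mu * np.eye(n)        # system matrix for the x-update
    for _ in range(n_iter):
        x = np.linalg.solve(M, 2 * Atb + mu * (d - bk))
        d = soft_threshold(x + bk, lam / mu)
        bk = bk + x - d
    return x

# Small synthetic test: sparse source, Gaussian sensing matrix
rng = np.random.default_rng(2)
A = rng.normal(size=(60, 100))
x_true = np.zeros(100)
x_true[[10, 40, 70]] = [1.0, -0.5, 0.8]
b = A @ x_true + rng.normal(scale=0.01, size=60)
x_rec = split_bregman_l1(A, b, lam=0.05, mu=1.0, n_iter=200)
print("largest recovered components at indices:", np.argsort(-np.abs(x_rec))[:3])
```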

  5. [Comparative data regarding two HPLC methods for determination of isoniazid].

    PubMed

    Gârbuleţ, Daniela; Spac, A F; Dorneanu, V

    2009-01-01

    For the determination of isoniazid (isonicotinic acid hydrazide - HIN), two different HPLC methods were developed and validated. Both experiments were performed using a Waters 2695 liquid chromatograph and a Waters 2489 UV detector. The first method (I) used a Nucleosil 100-10 C18 column (250 x 4.6 mm), a mobile phase formed by a mixture of acetonitrile/10(-2) M oxalic acid (80/20) and a flow of 1.5 mL/min; detection was done at 230 nm. The second method (II) used a Luna 100-5 C18 column (250 x 4.6 mm), a mobile phase formed by a mixture of methanol/acetate buffer, pH = 5.0 (20/80), and a flow of 1 mL/min; detection was done at 270 nm. Both methods were validated: the correlation coefficients were 0.9998 (I) and 0.9999 (II), the detection limits were 0.6 microg/mL (I) and 0.055 microg/mL (II), and the quantitation limits were 1.9 microg/mL (I) and 0.2 microg/mL (II). The system precision (RSD = 0.1692% (I) and 0.2000% (II)), the method precision (RSD = 1.1844% (I) and 0.6170% (II)) and the intermediate precision (RSD = 1.8058% (I) and 0.5970% (II)) were also studied. The accuracy was good, with calculated recoveries of 102.66% (I) and 101.36% (II). Both validated methods were applied to HIN determination from tablets with good and comparable results.
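    The validation figures reported above (detection and quantitation limits, precision as RSD, and recovery) can be reproduced from calibration and replicate data with a few lines of code. The sketch below uses one common convention (LOD = 3.3σ/S, LOQ = 10σ/S from the residual standard deviation of the calibration line); the paper does not state which convention was applied, and the example numbers are invented.

```python
# Hedged sketch of typical HPLC validation calculations: calibration linearity,
# LOD/LOQ from the residual standard deviation of the calibration line,
# precision as %RSD, and recovery.
import numpy as np

def calibration_stats(conc, response):
    conc, response = np.asarray(conc, float), np.asarray(response, float)
    slope, intercept = np.polyfit(conc, response, 1)
    pred = slope * conc + intercept
    r = np.corrcoef(response, pred)[0, 1]
    sigma = np.sqrt(np.sum((response - pred) ** 2) / (len(conc) - 2))  # residual SD
    return {"slope": slope, "intercept": intercept, "r": r,
            "LOD": 3.3 * sigma / slope, "LOQ": 10 * sigma / slope}

def rsd_percent(values):
    values = np.asarray(values, float)
    return 100 * values.std(ddof=1) / values.mean()

def recovery_percent(found, added):
    return 100 * np.asarray(found, float) / np.asarray(added, float)

# Example with illustrative data
conc = np.array([1, 2, 5, 10, 20, 40])          # microg/mL
area = np.array([11, 23, 54, 109, 218, 441])    # peak areas
print(calibration_stats(conc, area))
print("system precision RSD%:", round(rsd_percent([100.1, 100.4, 99.8, 100.2, 100.0, 99.9]), 3))
print("recovery %:", recovery_percent([10.2, 20.3], [10.0, 20.0]))
```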

  6. A Study of a Super-Cooling Technique for Removal of Rubber from Solid-Rubber Tires.

    DTIC Science & Technology

    environmental pollution. In answering these questions, an experiment is conducted to validate the concept and to determine liquid...is performed to compare the costs of the super-cooling technique with those of the brake drum lathe method of rubber removal. Safety and environmental pollution factors are also investigated and

  7. Quantitative impurity analysis of monoclonal antibody size heterogeneity by CE-LIF: example of development and validation through a quality-by-design framework.

    PubMed

    Michels, David A; Parker, Monica; Salas-Solano, Oscar

    2012-03-01

    This paper describes the framework of quality by design applied to the development, optimization and validation of a sensitive capillary electrophoresis-sodium dodecyl sulfate (CE-SDS) assay for monitoring impurities, produced in the manufacture of therapeutic MAb products, that potentially impact drug efficacy or patient safety. Drug substance or drug product samples are derivatized with fluorogenic 3-(2-furoyl)quinoline-2-carboxaldehyde and nucleophilic cyanide before separation by CE-SDS coupled to LIF detection. Three design-of-experiments studies enabled critical labeling parameters to meet method requirements for detecting minor impurities while building precision and robustness into the assay during development. The screening design predicted optimal conditions to control labeling artifacts, while two full factorial designs demonstrated method robustness through control of the temperature and cyanide parameters within the normal operating range. Subsequent validation according to International Conference on Harmonisation guidelines showed the CE-SDS/LIF assay was specific, accurate, and precise (RSD ≤ 0.8%) for relative peak distribution, and linear (R > 0.997) over the range of 0.5-1.5 mg/mL with LOD and LOQ of 10 ng/mL and 35 ng/mL, respectively. Validation confirmed the system suitability criteria used as a level of control to ensure reliable method performance. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
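    As a side note on the design-of-experiments terminology used above, a full factorial design simply enumerates every combination of the chosen factor levels. The sketch below shows how such a design can be generated; the factor names and levels are illustrative, not the actual CE-SDS labeling parameters or ranges studied in the paper.

```python
# Minimal sketch of enumerating a full factorial design; factor names and
# levels are illustrative placeholders, not the paper's actual parameters.
from itertools import product

factors = {
    "labeling_temperature_C": [4, 25, 37],
    "cyanide_excess_mM": [1, 5, 10],
    "dye_to_protein_ratio": [50, 100],
}
design = [dict(zip(factors, levels)) for levels in product(*factors.values())]
print(f"{len(design)} runs")          # 3 x 3 x 2 = 18 full-factorial runs
for run in design[:3]:
    print(run)
```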

  8. Current status of validation for robotic surgery simulators - a systematic review.

    PubMed

    Abboudi, Hamid; Khan, Mohammed S; Aboumarzouk, Omar; Guru, Khurshid A; Challacombe, Ben; Dasgupta, Prokar; Ahmed, Kamran

    2013-02-01

    To analyse studies validating the effectiveness of robotic surgery simulators. The MEDLINE(®), EMBASE(®) and PsycINFO(®) databases were systematically searched until September 2011. References from retrieved articles were reviewed to broaden the search. The simulator name, training tasks, participant level, training duration and evaluation scoring were extracted from each study. We also extracted data on feasibility, validity, cost-effectiveness, reliability and educational impact. We identified 19 studies investigating simulation options in robotic surgery. There are five different robotic surgery simulation platforms available on the market. In all, 11 studies sought opinion and compared performance between two different groups: 'expert' and 'novice'. Experts ranged in experience from 21 to 2200 robotic cases. The novice groups consisted of participants with no prior experience on a robotic platform and were often medical students or junior doctors. The Mimic dV-Trainer(®), ProMIS(®), SimSurgery Educational Platform(®) (SEP) and Intuitive systems have shown face, content and construct validity. The Robotic Surgical Simulator(™) system has only been face and content validated. All of the simulators except SEP have shown educational impact. Feasibility and cost-effectiveness of simulation systems were not evaluated in any trial. Virtual reality simulators were shown to be effective training tools for junior trainees. Simulation training holds the greatest potential to be used as an adjunct to traditional training methods to equip the next generation of robotic surgeons with the skills required to operate safely. However, current simulation models have only been validated in small studies. There is no evidence to suggest that one type of simulator provides more effective training than any other. More research is needed to validate simulated environments further and investigate the effectiveness of animal and cadaveric training in robotic surgery. © 2012 BJU International.

  9. Simultaneous acquisition for T2-T2 exchange and T1-T2 correlation NMR experiments

    NASA Astrophysics Data System (ADS)

    Montrazi, Elton T.; Lucas-Oliveira, Everton; Araujo-Ferreira, Arthur G.; Barsi-Andreeta, Mariane; Bonagamba, Tito J.

    2018-04-01

    NMR measurements of longitudinal and transverse relaxation times and their multidimensional correlations provide useful information about molecular dynamics. However, these experiments are very time-consuming, and many researchers have proposed faster experiments to mitigate this issue. This paper presents a new way to simultaneously perform T2-T2 exchange and T1-T2 correlation experiments by taking advantage of the storage time and the two-step phase cycling used for running the relaxation exchange experiment. The data corresponding to each step are either summed or subtracted to produce the T2-T2 and T1-T2 data, enhancing the information obtained while maintaining the experiment duration. By comparing the results of this technique with traditional NMR experiments, it was possible to validate the method.
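    The key idea, as described, is that the two phase-cycling steps already acquired for the relaxation exchange experiment can be combined in two ways to yield two data sets from a single measurement. The numpy sketch below only illustrates that bookkeeping; which combination corresponds to the T2-T2 exchange data and which to the T1-T2 correlation data depends on the pulse sequence, so the labels in the code are assumptions.

```python
# Minimal numpy sketch of the sum/difference idea described above: the two
# phase-cycle steps acquired in one measurement are added to isolate one
# relaxation pathway and subtracted to isolate the other. The labels below
# (which combination yields T1-T2 vs. T2-T2 data) are assumptions.
import numpy as np

def combine_phase_cycle(step1, step2):
    """step1, step2: complex 2-D acquisitions for the two phase-cycling steps."""
    summed = step1 + step2        # assumed to retain the T1-T2 correlation pathway
    subtracted = step1 - step2    # assumed to retain the T2-T2 exchange pathway
    return summed, subtracted

# Placeholder data with shape (indirect dimension x direct echoes)
rng = np.random.default_rng(3)
s1 = rng.normal(size=(64, 2048)) + 1j * rng.normal(size=(64, 2048))
s2 = rng.normal(size=(64, 2048)) + 1j * rng.normal(size=(64, 2048))
t1t2_data, t2t2_data = combine_phase_cycle(s1, s2)
print(t1t2_data.shape, t2t2_data.shape)
```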

  10. Research on target tracking in coal mine based on optical flow method

    NASA Astrophysics Data System (ADS)

    Xue, Hongye; Xiao, Qingwei

    2015-03-01

    To recognize, track and count the bolting machine in coal mine video images, a real-time target tracking method based on Lucas-Kanade sparse optical flow is proposed in this paper. In the method, we judge whether the moving target deviates from its trajectory, and we predict and correct the position of the moving target. The method solves the problem of failing to track or losing the target because of weak light, uneven illumination and occlusion. Using the VC++ platform and the OpenCV library, we implemented the recognition and tracking. The validity of the method is verified by the experimental results.
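    A minimal OpenCV sketch of the sparse Lucas-Kanade tracking that the paper builds on is given below. The video file name is hypothetical, and the trajectory-deviation check and position correction used for the bolting machine are replaced by a simple displacement threshold.

```python
# Illustrative OpenCV sketch of sparse Lucas-Kanade feature tracking; the
# paper's trajectory-deviation check and position correction are replaced by
# a simple displacement threshold. The input file name is hypothetical.
import cv2
import numpy as np

cap = cv2.VideoCapture("mine_video.avi")          # hypothetical input file
lk_params = dict(winSize=(21, 21), maxLevel=3,
                 criteria=(cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 30, 0.01))

ok, old_frame = cap.read()
old_gray = cv2.cvtColor(old_frame, cv2.COLOR_BGR2GRAY)
p0 = cv2.goodFeaturesToTrack(old_gray, maxCorners=100, qualityLevel=0.3, minDistance=7)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    p1, status, err = cv2.calcOpticalFlowPyrLK(old_gray, gray, p0, None, **lk_params)
    good_new = p1[status.ravel() == 1]
    good_old = p0[status.ravel() == 1]
    # Placeholder for the deviation check: discard points that jump too far
    displacement = np.linalg.norm(good_new - good_old, axis=-1)
    good_new = good_new[displacement.ravel() < 30]
    old_gray, p0 = gray, good_new.reshape(-1, 1, 2)
    if len(p0) == 0:
        break
cap.release()
```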

  11. Good reliability and validity for a new utility instrument measuring the birth experience, the Labor and Delivery Index.

    PubMed

    Gärtner, Fania R; de Miranda, Esteriek; Rijnders, Marlies E; Freeman, Liv M; Middeldorp, Johanna M; Bloemenkamp, Kitty W M; Stiggelbout, Anne M; van den Akker-van Marle, M Elske

    2015-10-01

    To validate the Labor and Delivery Index (LADY-X), a new delivery-specific utility measure. In a test-retest design, women were surveyed online, 6 to 8 weeks postpartum and again 1 to 2 weeks later. For reliability testing, we assessed the standard error of measurement (S.E.M.) and the intraclass correlation coefficient (ICC). For construct validity, we tested hypotheses on the association with comparison instruments (Mackey Childbirth Satisfaction Rating Scale and Wijma Delivery Experience Questionnaire), both on domain and total score levels. We assessed known-group differences using eight obstetrical indicators: method and place of birth, induction, transfer, control over pain medication, complications concerning mother and child, and experienced control. The questionnaire was completed by 308 women, 257 (83%) completed the retest. The distribution of LADY-X scores was skewed. The reliability was good, as the ICC exceeded 0.80 and the S.E.M. was 0.76. Requirements for good construct validity were fulfilled: all hypotheses for convergent and divergent validity were confirmed, and six of eight hypotheses for known-group differences were confirmed as all differences were statistically significant (P-values: <0.001-0.023), but for two tests, difference scores did not exceed the S.E.M. The LADY-X demonstrates good reliability and construct validity. Despite its skewed distribution, the LADY-X can discriminate between groups. With the preference weights available, the LADY-X might fulfill the need for a utility measure for cost-effectiveness studies for perinatal care interventions. Copyright © 2015 Elsevier Inc. All rights reserved.
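    For orientation, the two reliability statistics reported above can be computed as in the sketch below: a two-way random-effects, absolute-agreement, single-measurement ICC(2,1) and S.E.M. = SD x sqrt(1 - ICC). The paper does not state which ICC form was used, so ICC(2,1) and the simulated scores are assumptions.

```python
# Hedged sketch of test-retest reliability statistics: ICC(2,1) (two-way
# random effects, absolute agreement, single measurement) and the standard
# error of measurement SEM = SD * sqrt(1 - ICC).
import numpy as np

def icc_2_1(data):
    """data: 2-D array, rows = subjects, columns = measurement occasions."""
    data = np.asarray(data, float)
    n, k = data.shape
    grand = data.mean()
    ms_rows = k * np.sum((data.mean(axis=1) - grand) ** 2) / (n - 1)
    ms_cols = n * np.sum((data.mean(axis=0) - grand) ** 2) / (k - 1)
    resid = data - data.mean(axis=1, keepdims=True) - data.mean(axis=0, keepdims=True) + grand
    ms_err = np.sum(resid ** 2) / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

def sem(data, icc):
    scores = np.asarray(data, float).ravel()
    return scores.std(ddof=1) * np.sqrt(1 - icc)

# Example: simulated test-retest scores for 250 respondents
rng = np.random.default_rng(4)
true = rng.uniform(0.4, 1.0, size=250)
scores = np.column_stack([true + rng.normal(0, 0.05, 250), true + rng.normal(0, 0.05, 250)])
icc = icc_2_1(scores)
print("ICC(2,1):", round(icc, 3), "S.E.M.:", round(sem(scores, icc), 3))
```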

  12. Design and Development Computer-Based E-Learning Teaching Material for Improving Mathematical Understanding Ability and Spatial Sense of Junior High School Students

    NASA Astrophysics Data System (ADS)

    Nurjanah; Dahlan, J. A.; Wibisono, Y.

    2017-02-01

    This paper describes the design and development of computer-based e-learning teaching materials for improving the mathematical understanding ability and spatial sense of junior high school students. The particular aims are (1) to obtain the teaching material design, the evaluation model, and the instruments to measure mathematical understanding ability and spatial sense; (2) to conduct trials of the computer-based e-learning teaching material model, assessment, and instruments; (3) to complete the computer-based e-learning teaching material and assessment models for developing mathematical understanding ability and spatial sense; and (4) to produce the research product, namely computer-based e-learning teaching materials in the form of an interactive learning disc. The research method used in this study is developmental research, conducted through thought experiments and instruction experiments. The results show that the teaching materials could be used very well, based on the validation of the computer-based e-learning teaching materials by 5 multimedia experts. The face and content validity judgements of the 5 validators agree for each test item of mathematical understanding ability and spatial sense. The reliability coefficients of the mathematical understanding ability and spatial sense tests are 0.929 and 0.939, respectively, which is very high, while the validity of both tests meets high and very high criteria.

  13. On the statistical equivalence of restrained-ensemble simulations with the maximum entropy method

    PubMed Central

    Roux, Benoît; Weare, Jonathan

    2013-01-01

    An issue of general interest in computer simulations is to incorporate information from experiments into a structural model. An important caveat in pursuing this goal is to avoid corrupting the resulting model with spurious and arbitrary biases. While the problem of biasing thermodynamic ensembles can be formulated rigorously using the maximum entropy method introduced by Jaynes, the approach can be cumbersome in practical applications with the need to determine multiple unknown coefficients iteratively. A popular alternative strategy to incorporate the information from experiments is to rely on restrained-ensemble molecular dynamics simulations. However, the fundamental validity of this computational strategy remains in question. Here, it is demonstrated that the statistical distribution produced by restrained-ensemble simulations is formally consistent with the maximum entropy method of Jaynes. This clarifies the underlying conditions under which restrained-ensemble simulations will yield results that are consistent with the maximum entropy method. PMID:23464140

  14. Comparison between prediction and experiment for all-movable wing and body combinations at supersonic speeds: lift, pitching moment, and hinge moment

    NASA Technical Reports Server (NTRS)

    Nielsen, Jack N; Kaattari, George E; Drake, William C

    1952-01-01

    A simple method is presented for estimating lift, pitching-moment, and hinge-moment characteristics of all-movable wings in the presence of a body as well as the characteristics of wing-body combinations employing such wings. In general, good agreement between the method and experiment was obtained for the lift and pitching moment of the entire wing-body combination and for the lift of the wing in the presence of the body. The method is valid for moderate angles of attack, wing deflection angles, and width of gap between wing and body. The method of estimating hinge moment was not considered sufficiently accurate for triangular all-movable wings. An alternate procedure is proposed based on the experimental moment characteristics of the wing alone. Further theoretical and experimental work is required to substantiate fully the proposed procedure.

  15. Gaseous Sulfate Solubility in Glass: Experimental Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bliss, Mary

    2013-11-30

    Sulfate solubility in glass is a key parameter in many commercial glasses and nuclear waste glasses. This report summarizes key publications specific to sulfate solubility experimental methods and the underlying physical chemistry calculations. The published methods and experimental data are used to verify the calculations in this report and are expanded to a range of current technical interest. The calculations and experimental methods described in this report will guide several experiments on sulfate solubility and saturation for the Hanford Waste Treatment Plant Enhanced Waste Glass Models effort. There are several tables of sulfate gas equilibrium values at high temperature to guide experimental gas mixing and to achieve desired SO3 levels. This report also describes the necessary equipment and best practices to perform sulfate saturation experiments for molten glasses. Results and findings will be published when experimental work is finished and this report is validated from the data obtained.

  16. Parameter Estimation and Image Reconstruction of Rotating Targets with Vibrating Interference in the Terahertz Band

    NASA Astrophysics Data System (ADS)

    Yang, Qi; Deng, Bin; Wang, Hongqiang; Qin, Yuliang

    2017-07-01

    Rotation is one of the typical micro-motions of radar targets. In many cases, rotation of the targets is accompanied by vibrating interference, which significantly affects parameter estimation and imaging, especially in the terahertz band. In this paper, we propose a parameter estimation method and an image reconstruction method based on the inverse Radon transform, time-frequency analysis, and its inverse. The method can separate and estimate the rotating Doppler and the vibrating Doppler simultaneously and can obtain high-quality reconstructed images after vibration compensation. In addition, a 322-GHz radar system and a 25-GHz commercial radar are introduced, and experiments on rotating corner reflectors are carried out. The results of the simulation and experiments verify the validity of the methods, laying a foundation for practical terahertz radar processing.

  17. A practical method of predicting the loudness of complex electrical stimuli

    NASA Astrophysics Data System (ADS)

    McKay, Colette M.; Henshall, Katherine R.; Farrell, Rebecca J.; McDermott, Hugh J.

    2003-04-01

    The output of speech processors for multiple-electrode cochlear implants consists of current waveforms with complex temporal and spatial patterns. The majority of existing processors output sequential biphasic current pulses. This paper describes a practical method of calculating loudness estimates for such stimuli, in addition to the relative loudness contributions from different cochlear regions. The method can be used either to manipulate the loudness or levels in existing processing strategies, or to control intensity cues in novel sound processing strategies. The method is based on a loudness model described by McKay et al. [J. Acoust. Soc. Am. 110, 1514-1524 (2001)] with the addition of the simplifying approximation that current pulses falling within a temporal integration window of several milliseconds' duration contribute independently to the overall loudness of the stimulus. Three experiments were carried out with six implantees who use the CI24M device manufactured by Cochlear Ltd. The first experiment validated the simplifying assumption, and allowed loudness growth functions to be calculated for use in the loudness prediction method. The following experiments confirmed the accuracy of the method using multiple-electrode stimuli with various patterns of electrode locations and current levels.

  18. Overview of the 2014 Edition of the International Handbook of Evaluated Reactor Physics Benchmark Experiments (IRPhEP Handbook)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    John D. Bess; J. Blair Briggs; Jim Gulliford

    2014-10-01

    The International Reactor Physics Experiment Evaluation Project (IRPhEP) is a widely recognized, world-class program. The work of the IRPhEP is documented in the International Handbook of Evaluated Reactor Physics Benchmark Experiments (IRPhEP Handbook). Integral data from the IRPhEP Handbook are used by reactor safety and design, nuclear data, criticality safety, and analytical methods development specialists worldwide to perform necessary validations of their calculational techniques. The IRPhEP Handbook is among the most frequently quoted references in the nuclear industry and is expected to be a valuable resource for future decades.

  19. Computer-aided assessment of regional abdominal fat with food residue removal in CT.

    PubMed

    Makrogiannis, Sokratis; Caturegli, Giorgio; Davatzikos, Christos; Ferrucci, Luigi

    2013-11-01

    Separate quantification of abdominal subcutaneous and visceral fat regions is essential to understand the role of regional adiposity as a risk factor in epidemiological studies. Fat quantification is often based on computed tomography (CT) because fat density is distinct from other tissue densities in the abdomen. However, the presence of intestinal food residues with densities similar to fat may reduce fat quantification accuracy. We introduce an abdominal fat quantification method in CT with particular interest in food residue removal. Total fat was identified in the feature space of Hounsfield units and divided into subcutaneous and visceral components using model-based segmentation. Regions of food residues were identified and removed from visceral fat using a machine learning method integrating intensity, texture, and spatial information. Cost-weighting and bagging techniques were investigated to address class imbalance. We validated our automated food residue removal technique against semimanual quantifications. Our feature selection experiments indicated that joint intensity and texture features produce the highest classification accuracy, at 95%. We explored generalization capability using k-fold cross-validation and receiver operating characteristic (ROC) analysis with variable k. Losses in accuracy and area under the ROC curve between the maximum and minimum k were limited to 0.1% and 0.3%. We validated tissue segmentation against reference semimanual delineations. The Dice similarity scores were as high as 93.1 for subcutaneous fat and 85.6 for visceral fat. Computer-aided regional abdominal fat quantification is a reliable computational tool for large-scale epidemiological studies. Our proposed intestinal food residue reduction scheme is an original contribution of this work. Validation experiments indicate very good accuracy and generalization capability. Published by Elsevier Inc.
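    The evaluation strategy described above (k-fold cross-validation of a bagged classifier, scored by accuracy and ROC analysis for several values of k) can be sketched as follows. The features and labels are placeholders; the actual intensity/texture feature extraction and the cost-weighting variants studied in the paper are not reproduced.

```python
# Illustrative sketch of k-fold cross-validation of a bagged classifier that
# separates food-residue from fat voxels, scored by accuracy and ROC AUC.
# Features/labels are synthetic placeholders.
import numpy as np
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import StratifiedKFold
from sklearn.metrics import accuracy_score, roc_auc_score

rng = np.random.default_rng(5)
X = rng.normal(size=(2000, 10))                 # placeholder intensity/texture features
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=1.0, size=2000) > 0).astype(int)

for k in (3, 5, 10):
    accs, aucs = [], []
    for train_idx, test_idx in StratifiedKFold(n_splits=k, shuffle=True, random_state=0).split(X, y):
        clf = BaggingClassifier(n_estimators=25, random_state=0).fit(X[train_idx], y[train_idx])
        prob = clf.predict_proba(X[test_idx])[:, 1]
        accs.append(accuracy_score(y[test_idx], prob > 0.5))
        aucs.append(roc_auc_score(y[test_idx], prob))
    print(f"k={k}: accuracy={np.mean(accs):.3f}, AUC={np.mean(aucs):.3f}")
```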

  20. Bond Graph Modeling and Validation of an Energy Regenerative System for Emulsion Pump Tests

    PubMed Central

    Li, Yilei; Zhu, Zhencai; Chen, Guoan

    2014-01-01

    The test system for emulsion pumps is facing serious challenges due to its huge energy consumption and waste. To address this energy issue, a novel energy regenerative system (ERS) for emulsion pump tests is briefly introduced first. Modeling such an ERS spanning multiple energy domains needs a unified and systematic approach, and bond graph modeling is well suited for this task. The bond graph model of this ERS is developed by first considering the separate components before assembling them together, as is the state-space equation. Both numerical simulation and experiments are carried out to validate the bond graph model of this ERS. Moreover, the simulation and experiment results show that this ERS not only satisfies the test requirements, but also could save at least 25% of energy consumption compared to the original test system, demonstrating that it is a promising method of energy regeneration for emulsion pump tests. PMID:24967428

  1. Unsteady Aerodynamic Validation Experiences From the Aeroelastic Prediction Workshop

    NASA Technical Reports Server (NTRS)

    Heeg, Jennifer; Chawlowski, Pawel

    2014-01-01

    The AIAA Aeroelastic Prediction Workshop (AePW) was held in April 2012, bringing together communities of aeroelasticians, computational fluid dynamicists and experimentalists. The extended objective was to assess the state of the art in computational aeroelastic methods as practical tools for the prediction of static and dynamic aeroelastic phenomena. As a step in this process, workshop participants analyzed unsteady aerodynamic and weakly-coupled aeroelastic cases. Forced oscillation and unforced system experiments and computations have been compared for three configurations. This paper emphasizes interpretation of the experimental data, computational results and their comparisons from the perspective of validation of unsteady system predictions. The issues examined in detail are variability introduced by input choices for the computations, post-processing, and static aeroelastic modeling. The final issue addressed is interpreting unsteady information that is present in experimental data that is assumed to be steady, and the resulting consequences on the comparison data sets.

  2. Upgrades for the CMS simulation

    DOE PAGES

    Lange, D. J.; Hildreth, M.; Ivantchenko, V. N.; ...

    2015-05-22

    Over the past several years, the CMS experiment has made significant changes to its detector simulation application. The geometry has been generalized to include modifications being made to the CMS detector for 2015 operations, as well as model improvements to the simulation geometry of the current CMS detector and the implementation of a number of approved and possible future detector configurations. These include both completely new tracker and calorimetry systems. We have completed the transition to Geant4 version 10, and we have made significant progress in reducing the CPU resources required to run our Geant4 simulation. These gains have been achieved through both technical improvements and numerical techniques. Substantial speed improvements have been achieved without changing the physics validation benchmarks that the experiment uses to validate our simulation application for use in production. As a result, we will discuss the methods that we implemented and the corresponding demonstrated performance improvements deployed for our 2015 simulation application.

  3. Validation of a 30-year-old process for the manufacture of L-asparaginase from Erwinia chrysanthemi.

    PubMed

    Gervais, David; Allison, Nigel; Jennings, Alan; Jones, Shane; Marks, Trevor

    2013-04-01

    A 30-year-old manufacturing process for the biologic product L-asparaginase from the plant pathogen Erwinia chrysanthemi was rigorously qualified and validated, with a high level of agreement between validation data and the 6-year process database. L-Asparaginase exists in its native state as a tetrameric protein and is used as a chemotherapeutic agent in the treatment regimen for Acute Lymphoblastic Leukaemia (ALL). The manufacturing process involves fermentation of the production organism, extraction and purification of the L-asparaginase to make drug substance (DS), and finally formulation and lyophilisation to generate drug product (DP). The extensive manufacturing experience with the product was used to establish ranges for all process parameters and product quality attributes. The product and in-process intermediates were rigorously characterised, and new assays, such as size-exclusion and reversed-phase UPLC, were developed, validated, and used to analyse several pre-validation batches. Finally, three prospective process validation batches were manufactured and product quality data generated using both the existing and the new analytical methods. These data demonstrated the process to be robust, highly reproducible and consistent, and the validation was successful, contributing to the granting of an FDA product license in November, 2011.

  4. New Reactor Physics Benchmark Data in the March 2012 Edition of the IRPhEP Handbook

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    John D. Bess; J. Blair Briggs; Jim Gulliford

    2012-11-01

    The International Reactor Physics Experiment Evaluation Project (IRPhEP) was established to preserve integral reactor physics experimental data, including separate or special effects data for nuclear energy and technology applications. Numerous experiments that have been performed worldwide represent a large investment of infrastructure, expertise, and cost, and are valuable resources of data for present and future research. These valuable assets provide the basis for recording, development, and validation of methods. If the experimental data are lost, the high cost to repeat many of these measurements may be prohibitive. The purpose of the IRPhEP is to provide an extensively peer-reviewed set of reactor physics-related integral data that can be used by reactor designers and safety analysts to validate the analytical tools used to design next-generation reactors and establish the safety basis for operation of these reactors. Contributors from around the world collaborate in the evaluation and review of selected benchmark experiments for inclusion in the International Handbook of Evaluated Reactor Physics Benchmark Experiments (IRPhEP Handbook) [1]. Several new evaluations have been prepared for inclusion in the March 2012 edition of the IRPhEP Handbook.

  5. Experimental Database with Baseline CFD Solutions: 2-D and Axisymmetric Hypersonic Shock-Wave/Turbulent-Boundary-Layer Interactions

    NASA Technical Reports Server (NTRS)

    Marvin, Joseph G.; Brown, James L.; Gnoffo, Peter A.

    2013-01-01

    A database compilation of hypersonic shock-wave/turbulent boundary layer experiments is provided. The experiments selected for the database are either 2D or axisymmetric, and include both compression corner and impinging type SWTBL interactions. The strength of the interactions ranges from attached to incipient separation to fully separated flows. The experiments were chosen based on criteria to ensure the quality of the datasets, relevance to NASA's missions, and usefulness for validation and uncertainty assessment of CFD Navier-Stokes predictive methods, both now and in the future. The emphasis in the selected datasets is on surface pressures and surface heating throughout the interaction, but some wall shear stress distributions and flowfield profiles are included. Also included, for selected cases, are example CFD grids and setup information, along with surface pressure and wall heating results from simulations using current NASA real-gas Navier-Stokes codes, against which future CFD investigators can compare and evaluate physics modeling improvements and carry out validation and uncertainty assessments of future CFD code developments. The experimental database is presented in tables in the Appendices describing each experiment, and is also provided in computer-readable ASCII files located on a companion DVD.

  6. Validation of a two-dimensional liquid chromatography method for quality control testing of pharmaceutical materials.

    PubMed

    Yang, Samuel H; Wang, Jenny; Zhang, Kelly

    2017-04-07

    Despite the advantages of 2D-LC, there is currently little to no work demonstrating the suitability of 2D-LC methods for use in a quality control (QC) environment for good manufacturing practice (GMP) tests. This lack of information becomes more critical as the availability of commercial 2D-LC instrumentation has significantly increased and more testing facilities begin to acquire 2D-LC capabilities. It is increasingly important that the transferability of developed 2D-LC methods be assessed in terms of reproducibility, robustness and performance across different laboratories worldwide. The work presented here focuses on the evaluation of a heart-cutting 2D-LC method used for the analysis of a pharmaceutical material, where a key co-eluting impurity in the first dimension (1D) is resolved from the main peak and analyzed in the second dimension (2D). A design-of-experiments (DOE) approach was taken in the collection of the data, and the results were then modeled in order to evaluate method robustness using statistical modeling software. This quality-by-design (QbD) approach gives a deeper understanding of the impact of these 2D-LC critical method attributes (CMAs) and how they affect overall method performance. Although multiple parameters may be critical from a method development point of view, a special focus of this work is the evaluation, from a method validation perspective, of unique 2D-LC critical method attributes that transcend conventional method development and validation. The 2D-LC method attributes are evaluated for their recovery, peak shape, and resolution of the two co-eluting compounds in question on the 2D. In the method, linearity, accuracy, precision, repeatability, and sensitivity are assessed, along with day-to-day, analyst-to-analyst, and lab-to-lab (instrument-to-instrument) assessments. The results of this validation study demonstrate that the 2D-LC method is accurate, sensitive, and robust, and is ultimately suitable for QC testing with good method transferability. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Nanoliter microfluidic hybrid method for simultaneous screening and optimization validated with crystallization of membrane proteins

    PubMed Central

    Li, Liang; Mustafi, Debarshi; Fu, Qiang; Tereshko, Valentina; Chen, Delai L.; Tice, Joshua D.; Ismagilov, Rustem F.

    2006-01-01

    High-throughput screening and optimization experiments are critical to a number of fields, including chemistry and structural and molecular biology. The separation of these two steps may introduce false negatives and a time delay between initial screening and subsequent optimization. Although a hybrid method combining both steps may address these problems, miniaturization is required to minimize sample consumption. This article reports a “hybrid” droplet-based microfluidic approach that combines the steps of screening and optimization into one simple experiment and uses nanoliter-sized plugs to minimize sample consumption. Many distinct reagents were sequentially introduced as ≈140-nl plugs into a microfluidic device and combined with a substrate and a diluting buffer. Tests were conducted in ≈10-nl plugs containing different concentrations of a reagent. Methods were developed to form plugs of controlled concentrations, index concentrations, and incubate thousands of plugs inexpensively and without evaporation. To validate the hybrid method and demonstrate its applicability to challenging problems, crystallization of model membrane proteins and handling of solutions of detergents and viscous precipitants were demonstrated. By using 10 μl of protein solution, ≈1,300 crystallization trials were set up within 20 min by one researcher. This method was compatible with growth, manipulation, and extraction of high-quality crystals of membrane proteins, demonstrated by obtaining high-resolution diffraction images and solving a crystal structure. This robust method requires inexpensive equipment and supplies, should be especially suitable for use in individual laboratories, and could find applications in a number of areas that require chemical, biochemical, and biological screening and optimization. PMID:17159147

  8. Residual stress measurement in a metal microdevice by micro Raman spectroscopy

    NASA Astrophysics Data System (ADS)

    Song, Chang; Du, Liqun; Qi, Leijie; Li, Yu; Li, Xiaojun; Li, Yuanqi

    2017-10-01

    Large residual stresses induced during the electroforming process cannot be ignored when fabricating reliable metal microdevices. Accurate measurement is the basis for studying residual stress. Because of the micron-scale topological feature sizes of metal microdevices, the residual stress in them can hardly be measured by common methods. In this manuscript, a methodology is proposed to measure the residual stress in a metal microdevice using micro Raman spectroscopy (MRS). To estimate the residual stress in the metal material, micron-sized β-SiC particles were mixed into the electroforming solution for codeposition. First, the expression relating the Raman shifts to the induced biaxial stress for β-SiC was derived based on the theory of phonon deformation potentials and Hooke's law. Corresponding micro-electroforming experiments were performed, and the residual stress in the Ni-SiC composite layer was measured by both x-ray diffraction (XRD) and MRS methods. Then, the validity of the MRS measurements was verified by comparison with the residual stress measured by the XRD method. The reliability of the MRS method was further validated by a statistical Student's t-test. The MRS measurements were found to have no systematic error in comparison with the XRD measurements, which confirms that the residual stresses measured by the MRS method are reliable. In addition, the MRS method, with which the residual stress in a micro inertial switch was measured, has been confirmed to be a convincing experimental tool for estimating the residual stress in metal microdevices with micron-order topological feature sizes.

  9. Gamifying Self-Management of Chronic Illnesses: A Mixed-Methods Study.

    PubMed

    AlMarshedi, Alaa; Wills, Gary; Ranchhod, Ashok

    2016-09-09

    Self-management of chronic illnesses is an ongoing issue in health care research. Gamification is a concept that arose in the field of computer science and has been borrowed by many other disciplines. It is perceived by many that gamification can improve the self-management experience of people with chronic illnesses. This paper discusses the validation of a framework (called The Wheel of Sukr) that was introduced to achieve this goal. This research aims to (1) discuss a gamification framework targeting the self-management of chronic illnesses and (2) validate the framework by diabetic patients, medical professionals, and game experts. A mixed-method approach was used to validate the framework. Expert interviews (N=8) were conducted in order to validate the themes of the framework. Additionally, diabetic participants completed a questionnaire (N=42) in order to measure their attitudes toward the themes of the framework. The results provide a validation of the framework. This indicates that gamification might improve the self-management of chronic illnesses, such as diabetes. Namely, the eight themes in the Wheel of Sukr (fun, esteem, socializing, self-management, self-representation, motivation, growth, sustainability) were perceived positively by 71% (30/42) of the participants with P value <.001. In this research, both the interviews and the questionnaire yielded positive results that validate the framework (The Wheel of Sukr). Generally, this study indicates an overall acceptance of the notion of gamification in the self-management of diabetes.

  10. Methodology for turbulence code validation: Quantification of simulation-experiment agreement and application to the TORPEX experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ricci, Paolo; Theiler, C.; Fasoli, A.

    A methodology for plasma turbulence code validation is discussed, focusing on quantitative assessment of the agreement between experiments and simulations. The present work extends the analysis carried out in a previous paper [P. Ricci et al., Phys. Plasmas 16, 055703 (2009)] where the validation observables were introduced. Here, it is discussed how to quantify the agreement between experiments and simulations with respect to each observable, how to define a metric to evaluate this agreement globally, and - finally - how to assess the quality of a validation procedure. The methodology is then applied to the simulation of the basic plasma physics experiment TORPEX [A. Fasoli et al., Phys. Plasmas 13, 055902 (2006)], considering both two-dimensional and three-dimensional simulation models.
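    The abstract does not reproduce the metric itself, but the general pattern of such composite validation metrics is to normalize each observable's simulation-experiment discrepancy by the combined uncertainties and then combine the per-observable terms with weights. The sketch below shows that generic pattern only; the specific normalizations, weights and quality factors defined by Ricci et al. are not reproduced, and the example numbers are invented.

```python
# Generic sketch of a composite simulation-experiment agreement metric: each
# observable contributes a discrepancy normalized by the combined experimental
# and simulation uncertainties, combined here by a simple weighted average.
# This is not the specific metric defined by Ricci et al.
import numpy as np

def composite_agreement(exp, exp_err, sim, sim_err, weights=None):
    exp, exp_err = np.asarray(exp, float), np.asarray(exp_err, float)
    sim, sim_err = np.asarray(sim, float), np.asarray(sim_err, float)
    d = np.abs(sim - exp) / np.sqrt(exp_err ** 2 + sim_err ** 2)   # per-observable discrepancy
    w = np.ones_like(d) if weights is None else np.asarray(weights, float)
    return float(np.sum(w * d) / np.sum(w))

# Example: three validation observables with illustrative values
chi = composite_agreement(exp=[1.2, 0.35, 40.0], exp_err=[0.1, 0.05, 5.0],
                          sim=[1.0, 0.30, 52.0], sim_err=[0.15, 0.04, 6.0],
                          weights=[2.0, 1.0, 1.0])
print("composite discrepancy:", round(chi, 2))
```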

  11. Self-homodyne free-space optical communication system based on orthogonally polarized binary phase shift keying.

    PubMed

    Cai, Guangyu; Sun, Jianfeng; Li, Guangyuan; Zhang, Guo; Xu, Mengmeng; Zhang, Bo; Yue, Chaolei; Liu, Liren

    2016-06-10

    A self-homodyne laser communication system based on orthogonally polarized binary phase shift keying is demonstrated. The working principles of this method and the structure of a transceiver are described using theoretical calculations. Moreover, the signal-to-noise ratio, sensitivity, and bit error rate are analyzed for the amplifier-noise-limited case. The reported experiment validates the feasibility of the proposed method and demonstrates its advantageous sensitivity as a self-homodyne communication system.

  12. Calculating the Bending Modulus for Multicomponent Lipid Membranes in Different Thermodynamic Phases

    PubMed Central

    2013-01-01

    We establish a computational approach to extract the bending modulus, KC, for lipid membranes from relatively small-scale molecular simulations. Fluctuations in the splay of individual pairs of lipids faithfully inform on KC in multicomponent membranes over a large range of rigidities in different thermodynamic phases. Predictions are validated by experiments even where the standard spectral analysis-based methods fail. The local nature of this method potentially allows its extension to calculations of KC in protein-laden membranes. PMID:24039553

  13. Design and realization of retina-like three-dimensional imaging based on a MOEMS mirror

    NASA Astrophysics Data System (ADS)

    Cao, Jie; Hao, Qun; Xia, Wenze; Peng, Yuxin; Cheng, Yang; Mu, Jiaxing; Wang, Peng

    2016-07-01

    To balance the conflicting requirements of high resolution, large field of view and real-time imaging, a retina-like imaging method based on time-of-flight (TOF) is proposed. Mathematical models of 3D imaging based on a MOEMS mirror are developed. Based on this method, we perform simulations of retina-like scanning properties, including compression of redundant information and rotation and scaling invariance. To validate the theory, we developed a prototype and conducted relevant experiments. The preliminary results agree well with the simulations.

  14. Error analysis of finite element method for Poisson–Nernst–Planck equations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun, Yuzhou; Sun, Pengtao; Zheng, Bin

    A priori error estimates of the finite element method for time-dependent Poisson-Nernst-Planck equations are studied in this work. We obtain optimal error estimates in the L^∞(H^1) and L^2(H^1) norms and suboptimal error estimates in the L^∞(L^2) norm with linear elements, and optimal error estimates in the L^∞(L^2) norm with quadratic or higher-order elements, for both semi- and fully discrete finite element approximations. Numerical experiments are also given to validate the theoretical results.

  15. Calibration of the LHAASO-KM2A electromagnetic particle detectors using charged particles within the extensive air showers

    NASA Astrophysics Data System (ADS)

    Lv, Hongkui; He, Huihai; Sheng, Xiangdong; Liu, Jia; Chen, Songzhan; Liu, Ye; Hou, Chao; Zhao, Jing; Zhang, Zhongquan; Wu, Sha; Wang, Yaping; Lhaaso Collaboration

    2018-07-01

    In the Large High Altitude Air Shower Observatory (LHAASO), a one square kilometer array (KM2A) with 5242 electromagnetic particle detectors (EDs) and 1171 muon detectors (MDs) is designed to study ultra-high-energy gamma-ray astronomy and cosmic ray physics. The remote site and the large number of detectors demand a robust and automatic calibration procedure. In this paper, a self-calibration method that relies on the measurement of charged particles within extensive air showers is proposed. The method is fully validated by Monte Carlo simulation and successfully applied in a KM2A prototype array experiment. Experimental results show that the self-calibration method can be used to determine the detector time offset constants at the sub-nanosecond level and the number density of particles collected by each ED with an accuracy of a few percent, which is adequate to meet the physical requirements of the LHAASO experiment. This software calibration also offers an ideal way to monitor detector performance in real time for next-generation ground-based EAS experiments covering areas of a square kilometer and above.

  16. Temperature and heat flux datasets of a complex object in a fire plume for the validation of fire and thermal response codes.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jernigan, Dann A.; Blanchat, Thomas K.

    It is necessary to improve understanding and develop temporally- and spatially-resolved integral scale validation data of the heat flux incident to a complex object, in addition to measuring the thermal response of said object located within the fire plume, for the validation of the SIERRA/FUEGO/SYRINX fire and SIERRA/CALORE codes. To meet this objective, a complex calorimeter with sufficient instrumentation to allow validation of the coupling between FUEGO/SYRINX/CALORE has been designed, fabricated, and tested in the Fire Laboratory for Accreditation of Models and Experiments (FLAME) facility. Validation experiments are specifically designed for direct comparison with the computational predictions. Making meaningful comparison between the computational and experimental results requires careful characterization and control of the experimental features or parameters used as inputs into the computational model. Validation experiments must be designed to capture the essential physical phenomena, including all relevant initial and boundary conditions. This report presents the data validation steps and processes, the results of the penlight radiant heat experiments (for the purpose of validating the CALORE heat transfer modeling of the complex calorimeter), and the results of the fire tests in FLAME.

  17. Developments in Sensitivity Methodologies and the Validation of Reactor Physics Calculations

    DOE PAGES

    Palmiotti, Giuseppe; Salvatores, Massimo

    2012-01-01

    Sensitivity methodologies have proved remarkably successful when adopted in the reactor physics field. Sensitivity coefficients can be used for different objectives, such as uncertainty estimates, design optimization, determination of target accuracy requirements, adjustment of input parameters, and evaluation of the representativity of an experiment with respect to a reference design configuration. A review of the methods used is provided, and several examples illustrate the success of the methodology in reactor physics. A new application, the improvement of basic nuclear parameters using integral experiments, is also described.

  18. The PeptideAtlas Project.

    PubMed

    Deutsch, Eric W

    2010-01-01

    PeptideAtlas is a multi-species compendium of peptides observed with tandem mass spectrometry methods. Raw mass spectrometer output files are collected from the community and reprocessed through a uniform analysis and validation pipeline that continues to advance. The results are loaded into a database and the information derived from the raw data is returned to the community via several web-based data exploration tools. The PeptideAtlas resource is useful for experiment planning, improving genome annotation, and other data mining projects. PeptideAtlas has become especially useful for planning targeted proteomics experiments.

  19. OPTHYLIC: An Optimised Tool for Hybrid Limits Computation

    NASA Astrophysics Data System (ADS)

    Busato, Emmanuel; Calvet, David; Theveneaux-Pelzer, Timothée

    2018-05-01

    A software tool, computing observed and expected upper limits on Poissonian process rates using a hybrid frequentist-Bayesian CLs method, is presented. This tool can be used for simple counting experiments where only signal, background and observed yields are provided or for multi-bin experiments where binned distributions of discriminating variables are provided. It allows the combination of several channels and takes into account statistical and systematic uncertainties, as well as correlations of systematic uncertainties between channels. It has been validated against other software tools and analytical calculations, for several realistic cases.
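    For a single counting channel, the hybrid frequentist-Bayesian CLs construction can be illustrated as below: the background uncertainty is marginalized by sampling a prior (the Bayesian part), CL_{s+b} and CL_b are Poisson tail probabilities of observing at most n_obs events, and the upper limit is the signal strength at which CLs = CL_{s+b}/CL_b falls to 1 - CL. This is a simplified sketch, not the OPTHYLIC implementation, and the yields are invented.

```python
# Hedged sketch of a hybrid frequentist-Bayesian CLs upper limit for one
# counting channel: background uncertainty marginalized by sampling an assumed
# Gaussian prior; CLs = CL_{s+b} / CL_b from Poisson tail probabilities.
import numpy as np
from scipy.stats import poisson

def cls(mu, n_obs, s, b, sigma_b, n_toys=20000, seed=0):
    rng = np.random.default_rng(seed)
    b_toys = np.clip(rng.normal(b, sigma_b, n_toys), 0, None)   # assumed Gaussian prior on b
    p_sb = poisson.cdf(n_obs, mu * s + b_toys).mean()           # CL_{s+b}
    p_b = poisson.cdf(n_obs, b_toys).mean()                     # CL_b
    return p_sb / p_b

def upper_limit(n_obs, s, b, sigma_b, cl=0.95):
    # bisection on the signal strength mu such that CLs(mu) = 1 - cl
    lo, hi = 0.0, 50.0
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if cls(mid, n_obs, s, b, sigma_b) > 1 - cl:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Example: expected signal yield 3.0, background 5.0 +/- 1.0, 4 events observed
print("95% CL upper limit on mu:", round(upper_limit(n_obs=4, s=3.0, b=5.0, sigma_b=1.0), 2))
```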

  20. An augmented classical least squares method for quantitative Raman spectral analysis against component information loss.

    PubMed

    Zhou, Yan; Cao, Hui

    2013-01-01

    We propose an augmented classical least squares (ACLS) calibration method for quantitative Raman spectral analysis against component information loss. Raman spectral signals with low analyte concentration correlations were selected and used as substitutes for the unknown quantitative component information during the CLS calibration procedure. The number of selected signals was determined using the leave-one-out root-mean-square error of cross-validation (RMSECV) curve. An ACLS model was built based on the augmented concentration matrix and the reference spectral signal matrix. The proposed method was compared with partial least squares (PLS) and principal component regression (PCR) using one example: a data set recorded from an analyte concentration determination experiment using Raman spectroscopy. A 2-fold cross-validation with a Venetian blinds strategy was used to evaluate the predictive power of the proposed method. One-way analysis of variance (ANOVA) was used to assess the difference in predictive power between the proposed method and the existing methods. The results indicated that the proposed method is effective at increasing the robustness of the traditional CLS model against component information loss, and that its predictive power is comparable to that of PLS or PCR.
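    A bare-bones sketch of the CLS calibration and the leave-one-out RMSECV curve mentioned above is given below; the augmentation step (selecting low-correlation spectral signals and appending them to the concentration matrix) is only indicated schematically, and the synthetic data are ours.

```python
# Hedged sketch of classical least squares (CLS) calibration with a
# leave-one-out RMSECV; the ACLS augmentation is only indicated schematically.
import numpy as np

def cls_fit(C, S):
    """C: (samples x components) concentrations; S: (samples x wavelengths) spectra."""
    return np.linalg.pinv(C) @ S          # estimated pure-component spectra K

def cls_predict(S, K):
    return S @ np.linalg.pinv(K)          # estimated concentrations

def rmsecv_loo(C, S):
    errs = []
    for i in range(C.shape[0]):
        keep = np.arange(C.shape[0]) != i
        K = cls_fit(C[keep], S[keep])
        pred = cls_predict(S[i:i + 1], K)
        errs.append((pred - C[i:i + 1]) ** 2)
    return np.sqrt(np.mean(errs))

# Synthetic two-component example with one unmodeled interferent
rng = np.random.default_rng(6)
pure = rng.random((3, 200))                       # 2 analytes + 1 interferent
conc_full = rng.random((30, 3))
S = conc_full @ pure + rng.normal(scale=0.01, size=(30, 200))
C_known = conc_full[:, :2]                        # interferent concentration "lost"
print("RMSECV, known components only:", round(rmsecv_loo(C_known, S), 4))
# Augmentation (schematic): append a surrogate column, e.g. a selected spectral signal
C_aug = np.column_stack([C_known, S[:, 150]])
print("RMSECV, augmented:", round(rmsecv_loo(C_aug, S), 4))
```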

  1. Shared control of a medical robot with haptic guidance.

    PubMed

    Xiong, Linfei; Chng, Chin Boon; Chui, Chee Kong; Yu, Peiwu; Li, Yao

    2017-01-01

    Tele-operation of robotic surgery reduces radiation exposure during interventional radiological operations. However, endoscope vision without force feedback on the surgical tool increases the difficulty of precise manipulation and the risk of tissue damage. Shared control of vision and force provides a novel approach to enhanced control with haptic guidance, which could lead to greater dexterity and better maneuverability during MIS surgery. The paper presents an innovative shared control method for a robotic minimally invasive surgery system, in which vision and haptic feedback are incorporated to provide guidance cues to the clinician during surgery. The incremental potential field (IPF) method is utilized to generate a guidance path based on the anatomy of the tissue and the surgical tool interaction. Haptic guidance is provided at the master end to assist the clinician during tele-operated surgical robotic tasks. The approach has been validated with path-following and virtual tumor-targeting experiments. The experimental results demonstrate that, compared with vision-only guidance, shared control with vision and haptics improved the accuracy and efficiency of surgical robotic manipulation, reducing the tool-position error and execution time. The validation experiments demonstrate that the shared control approach could help the surgical robot system provide stable assistance and precise performance in executing the designated surgical task. The methodology could also be implemented with other surgical robots with different surgical tools and applications.

  2. Electrolysis Performance Improvement and Validation Experiment

    NASA Technical Reports Server (NTRS)

    Schubert, Franz H.

    1992-01-01

    Viewgraphs on electrolysis performance improvement and validation experiment are presented. Topics covered include: water electrolysis: an ever increasing need/role for space missions; static feed electrolysis (SFE) technology: a concept developed for space applications; experiment objectives: why test in microgravity environment; and experiment description: approach, hardware description, test sequence and schedule.

  3. A CFD validation roadmap for hypersonic flows

    NASA Technical Reports Server (NTRS)

    Marvin, Joseph G.

    1992-01-01

    A roadmap for computational fluid dynamics (CFD) code validation is developed. The elements of the roadmap are consistent with air-breathing vehicle design requirements and related to the important flow path components: forebody, inlet, combustor, and nozzle. Building block and benchmark validation experiments are identified along with their test conditions and measurements. Based on evaluation criteria, recommendations for an initial CFD validation data base are given, and gaps are identified where future experiments would provide the needed validation data.

  4. A CFD validation roadmap for hypersonic flows

    NASA Technical Reports Server (NTRS)

    Marvin, Joseph G.

    1993-01-01

    A roadmap for computational fluid dynamics (CFD) code validation is developed. The elements of the roadmap are consistent with air-breathing vehicle design requirements and related to the important flow path components: forebody, inlet, combustor, and nozzle. Building block and benchmark validation experiments are identified along with their test conditions and measurements. Based on evaluation criteria, recommendations for an initial CFD validation data base are given, and gaps are identified where future experiments would provide the needed validation data.

  5. Validation of a continuous flow method for the determination of soluble iron in atmospheric dust and volcanic ash.

    PubMed

    Simonella, Lucio E; Gaiero, Diego M; Palomeque, Miriam E

    2014-10-01

    Iron is an essential micronutrient for phytoplankton growth and is supplied to the remote areas of the ocean mainly through atmospheric dust/ash. The amount of soluble Fe in dust/ash is a major source of uncertainty in modeling Fe dissolution and deposition to the surface ocean. Currently in the literature, there exist almost as many different methods to estimate fractional solubility as researchers in the field, making it difficult to compare results between research groups. Also, an important constraint on evaluating Fe solubility in atmospheric dust is the limited mass of sample, which is usually only available in microgram to milligram amounts. A continuous flow (CF) method that can be run with a low mass of sediment (<10 mg) was tested against a standard method which requires about 1 g of sediment (the BCR method of the European Union). For validation of the CF experiment, we ran both methods using South American surface sediment and deposited volcanic ash. Both materials tested are easily eroded by wind and are representative of atmospheric dust/ash exported from this region. The uncertainty of the CF method was obtained from seven replicates of one surface sediment sample and showed very good reproducibility. The replicates were run on different days over a span of two years, and the uncertainty ranged between 8 and 22% (the uncertainty for the standard method was 6-19%). Compared to other standardized methods, the CF method allows studies of the dissolution kinetics of metals and consumes fewer reagents and less time (<3 h). The method validated here is suggested for use as a standardized method for Fe solubility studies on dust/ash. Copyright © 2014 Elsevier B.V. All rights reserved.

  6. Reconstruction of gene regulatory modules from RNA silencing of IFN-α modulators: experimental set-up and inference method.

    PubMed

    Grassi, Angela; Di Camillo, Barbara; Ciccarese, Francesco; Agnusdei, Valentina; Zanovello, Paola; Amadori, Alberto; Finesso, Lorenzo; Indraccolo, Stefano; Toffolo, Gianna Maria

    2016-03-12

    Inference of gene regulation from expression data may help to unravel regulatory mechanisms involved in complex diseases or in the action of specific drugs. A challenging task for many researchers working in the field of systems biology is to build up an experiment with a limited budget and produce a dataset suitable to reconstruct putative regulatory modules worthy of biological validation. Here, we focus on small-scale gene expression screens and we introduce a novel experimental set-up and a customized method of analysis to make inference on regulatory modules starting from genetic perturbation data, e.g. knockdown and overexpression data. To illustrate the utility of our strategy, it was applied to produce and analyze a dataset of quantitative real-time RT-PCR data, in which the interferon-α (IFN-α) transcriptional response in endothelial cells is investigated by RNA silencing of two candidate IFN-α modulators, STAT1 and IFIH1. A putative regulatory module was reconstructed by our method, revealing an intriguing feed-forward loop, in which STAT1 regulates IFIH1 and they both negatively regulate IFNAR1. STAT1 regulation of IFNAR1 was validated experimentally at the protein level. A detailed description of the experimental set-up and of the analysis procedure is reported, with the intent of inspiring other scientists who want to carry out similar experiments to reconstruct gene regulatory modules starting from perturbations of possible regulators. Application of our approach to the study of IFN-α transcriptional response modulators in endothelial cells has led to many interesting novel findings and new biological hypotheses worthy of validation.

  7. Research on Flow Field Perception Based on Artificial Lateral Line Sensor System.

    PubMed

    Liu, Guijie; Wang, Mengmeng; Wang, Anyi; Wang, Shirui; Yang, Tingting; Malekian, Reza; Li, Zhixiong

    2018-03-11

    In nature, the lateral line of fish is a peculiar and important organ for sensing the surrounding hydrodynamic environment, preying, escaping from predators and schooling. In this paper, by imitating the mechanism of fish lateral canal neuromasts, we developed an artificial lateral line system composed of micro-pressure sensors. Through hydrodynamic simulations, an optimized sensor structure was obtained and pressure distribution models of the lateral surface were established in uniform flow and turbulent flow. In corresponding underwater experiments, the validity of the numerical simulation method was verified by comparing the experimental data with the simulation results. In addition, effective methods are proposed and validated for flow velocity estimation and attitude perception in turbulent flow, and shape recognition of obstacles is realized by a neural network algorithm.

  8. High salinity relay as a post-harvest processing method for reducing Vibrio vulnificus levels in oysters (Crassostrea virginica).

    PubMed

    Audemard, Corinne; Kator, Howard I; Reece, Kimberly S

    2018-08-20

    High salinity relay of Eastern oysters (Crassostrea virginica) was evaluated as a post-harvest processing (PHP) method for reducing Vibrio vulnificus. This approach relies on the exposure of oysters to natural high salinity waters and preserves a live product compared to previously approved PHPs. Although results of prior studies evaluating high salinity relay as a means to decrease V. vulnificus levels were promising, validation of this method as a PHP following approved guidelines is required. This study was designed to provide data for validation of this method following Food and Drug Administration (FDA) PHP validation guidelines. During each of 3 relay experiments, oysters cultured from 3 different Chesapeake Bay sites of contrasting salinities (10-21 psu) were relayed without acclimation to high salinity waters (31-33 psu) for up to 28 days. Densities of V. vulnificus and densities of total and pathogenic Vibrio parahaemolyticus (as tdh positive strains) were measured using an MPN-quantitative PCR approach. Overall, 9 lots of oysters were relayed with 6 exhibiting initial V. vulnificus >10,000/g. As recommended by the FDA PHP validation guidelines, these lots reached both the 3.52 log reduction and the <30 MPN/g densities requirements for V. vulnificus after 14 to 28 days of relay. Densities of total and pathogenic V. parahaemolyticus in relayed oysters were significantly lower than densities at the sites of origin suggesting an additional benefit associated with high salinity relay. While relay did not have a detrimental effect on oyster condition, oyster mortality levels ranged from 2 to 61% after 28 days of relay. Although the identification of the factors implicated in oyster mortality will require further examination, this study strongly supports the validation of high salinity relay as an effective PHP method to reduce levels of V. vulnificus in oysters to endpoint levels approved for human consumption. Copyright © 2018 Elsevier B.V. All rights reserved.

  9. Procedure-specific assessment tool for flexible pharyngo-laryngoscopy: gathering validity evidence and setting pass-fail standards.

    PubMed

    Melchiors, Jacob; Petersen, K; Todsen, T; Bohr, A; Konge, Lars; von Buchwald, Christian

    2018-06-01

    The attainment of specific identifiable competencies is the primary measure of progress in the modern medical education system. To be feasible, the system therefore requires a method for accurately assessing competence. Evidence of validity needs to be gathered before an assessment tool can be implemented in the training and assessment of physicians. According to contemporary validity theory, this evidence must be gathered from specific sources in a structured and rigorous manner. Flexible pharyngo-laryngoscopy (FPL) is central to the work of the otorhinolaryngologist. We aim to evaluate the flexible pharyngo-laryngoscopy assessment tool (FLEXPAT) created in a previous study and to establish a pass-fail level for proficiency. Eighteen physicians with different levels of experience (novices, intermediates, and experienced) were recruited to the study. Each performed an FPL on two patients. These procedures were video recorded, blinded, and assessed by two specialists. The score was expressed as the percentage of the maximum possible score. Cronbach's α was used to analyze internal consistency of the data, and a generalizability analysis was performed. The scores of the three different groups were explored, and a pass-fail level was determined using the contrasting-groups standard-setting method. Internal consistency was strong, with a Cronbach's α of 0.86. We found a generalizability coefficient of 0.72, sufficient for moderate-stakes assessment. We found a significant difference between the novice and experienced groups (p < 0.001) and a strong correlation between experience and score (Pearson's r = 0.75). The pass-fail level was established at 72% of the maximum score. Applying this pass-fail level in the test population resulted in half of the intermediate group receiving a failing score. We gathered validity evidence for the FLEXPAT according to the contemporary framework as described by Messick. Our results support a claim of validity and are comparable to other studies exploring clinical assessment tools. The high rate of physicians underperforming in the intermediate group demonstrates the need for continued educational intervention. Based on our work, we recommend the use of the FLEXPAT in clinical assessment of FPL and the application of a pass-fail level of 72% for proficiency.
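    For reference, the internal-consistency statistic reported above (Cronbach's α = 0.86) can be computed directly from an item-score matrix. The sketch below uses synthetic scores, not the FLEXPAT data; the matrix shape (subjects × items) is an assumption.

    ```python
    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        """Cronbach's alpha: (k / (k - 1)) * (1 - sum of item variances / total variance)."""
        k = items.shape[1]
        item_variances = items.var(axis=0, ddof=1)
        total_variance = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1.0 - item_variances.sum() / total_variance)

    scores = np.array([[3, 4, 4, 5],      # illustrative item scores for 4 subjects
                       [2, 2, 3, 3],
                       [4, 5, 5, 5],
                       [1, 2, 2, 3]])
    print(cronbach_alpha(scores))
    ```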

  10. Four experimental demonstrations of active vibration control for flexible structures

    NASA Technical Reports Server (NTRS)

    Phillips, Doug; Collins, Emmanuel G., Jr.

    1990-01-01

    Laboratory experiments designed to test prototype active-vibration-control systems under development for future flexible space structures are described, summarizing previously reported results. The control-synthesis technique employed for all four experiments was the maximum-entropy optimal-projection (MEOP) method (Bernstein and Hyland, 1988). Consideration is given to: (1) a pendulum experiment on large-amplitude LF dynamics; (2) a plate experiment on broadband vibration suppression in a two-dimensional structure; (3) a multiple-hexagon experiment combining the factors studied in (1) and (2) to simulate the complexity of a large space structure; and (4) the NASA Marshall ACES experiment on a lightweight deployable 45-foot beam. Extensive diagrams, drawings, graphs, and photographs are included. The results are shown to validate the MEOP design approach, demonstrating that good performance is achievable using relatively simple low-order decentralized controllers.

  11. Validating the Changes to Self-identity After Total Laryngectomy.

    PubMed

    Bickford, Jane; Coveney, John; Baker, Janet; Hersh, Deborah

    2018-05-25

    A total laryngectomy often prolongs life but results in long-term disablement, disfigurement, and complex care needs. Current clinical practice addresses the surgical options, procedures, and immediate recovery. Less support is available longer-term despite significant changes to aspects of personhood and ongoing medical needs. The aim of this study was to explore the experience of living with and/or supporting individuals with a laryngectomy at least 1 year after surgery. Constructivist grounded theory methods and symbolic interactionism were used to guide collection and analysis of interview data from 28 participants (12 individuals with a laryngectomy, 9 primary supporters, and 7 health professionals). The phenomena of "validating the altered self after total laryngectomy" highlighted how individuals, postlaryngectomy, navigate and negotiate interactions due to the disruption of their self-expression, related competencies, and roles. Several reframing patterns representing validation of the self emerged from the narratives. They were as follows: destabilized, resigned, resolute, and transformed. The data describe the influence of the processes of developing competence and building resilience, combined with contextual factors, for example, timing and turning points; being supported; and personal factors on these reframing patterns. The findings further our understanding of the long-term subjective experience of identity change after laryngectomy and call attention to the persisting need for psychosocial support. This research provides important evidence for evaluating and strengthening the continuum of services (specialist to community) and supporting social participation, regardless of communication method, and for competency training for all involved to optimize person-centered practices.

  12. Final Report of Research Conducted For DE-AI02-08ER64546

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patrick Minnis

    2012-03-28

    Research was conducted for 3-4 years to use ARM data to validate satellite cloud retrievals and to support the development of improved techniques for remotely sensing clouds and radiative fluxes from space to complement the ARM surface measurement program. This final report summarizes the results and publications during the last 2 years of the studies. Since our last report covering the 2009 period, we published four papers that were accepted during the previous reporting period and revised and published a fifth one. Our efforts to intercalibrate selected channels on several polar orbiting and geostationary satellite imagers, which are funded in part by ASR, resulted in methods that were accepted as part of the international Global Space-based Intercalibration System (GSICS) calibration algorithms. We developed a new empirical method for correcting the spectral differences between comparable channels on various imagers that will be used to correct the calibrations of the satellite data used for ARM. We documented our cloud retrievals for the VAMOS Ocean-Cloud-Atmosphere-Land Study Regional Experiment (VOCALS-Rex; ARM participated with an AAF contribution) in the context of the entire experiment. We used our VOCALS satellite data along with the aircraft measurements to better understand the relationships between aerosols and liquid water path in marine stratus clouds. We continued our efforts to validate and improve the satellite cloud retrievals for ARM and to use ARM data to validate retrievals for other purposes.

  13. [Multi-mathematical modelings for compatibility optimization of Jiangzhi granules].

    PubMed

    Yang, Ming; Zhang, Li; Ge, Yingli; Lu, Yanliu; Ji, Guang

    2011-12-01

    To investigate a method of multi-index activity evaluation and combination optimization for multi-component Chinese herbal formulas. Following a scheme of uniform experimental design, efficacy experiments, multi-index evaluation, least absolute shrinkage and selection operator (LASSO) modeling, an evolutionary optimization algorithm and a validation experiment, we optimized the combination of Jiangzhi granules based on the activity indexes of blood serum ALT, AST, TG, TC, HDL and LDL, the TG level of liver tissue, and the ratio of liver tissue to body weight. The analytic hierarchy process (AHP) combined with criteria importance through intercriteria correlation (CRITIC) was more reasonable and objective for multi-index evaluation, since it reflected both the rank order of the activity indexes and the objective sample data. The LASSO model accurately captured the relationship between different combinations of Jiangzhi granules and the comprehensive activity indexes. In the validation experiment, the optimized combination of Jiangzhi granules showed better comprehensive activity indexes than the original formula. AHP combined with CRITIC can be used for multi-index activity evaluation, and the LASSO algorithm is suitable for combination optimization of Chinese herbal formulas.
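    A minimal illustration of the LASSO regression step described above: fitting a sparse linear model from component-dose design variables to a composite activity index. The data, dose ranges, and regularization strength are synthetic placeholders, not the Jiangzhi granule data.

    ```python
    import numpy as np
    from sklearn.linear_model import Lasso

    rng = np.random.default_rng(0)
    X = rng.uniform(0.0, 1.0, size=(30, 6))      # 30 formula combinations, 6 component doses
    y = X @ np.array([0.8, 0.0, 0.5, 0.0, 0.3, 0.0]) + rng.normal(0, 0.05, 30)

    model = Lasso(alpha=0.01).fit(X, y)          # sparse coefficients highlight active components
    print(model.coef_)
    # The fitted model can then be searched (e.g., by an evolutionary algorithm) for the
    # component combination that optimizes the predicted composite activity index.
    ```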

  14. Measuring Dispositional Flow: Validity and reliability of the Dispositional Flow State Scale 2, Italian version.

    PubMed

    Riva, Eleonora F M; Riva, Giuseppe; Talò, Cosimo; Boffi, Marco; Rainisio, Nicola; Pola, Linda; Diana, Barbara; Villani, Daniela; Argenton, Luca; Inghilleri, Paolo

    2017-01-01

    The aim of this study is to evaluate the psychometric properties of the Italian version of the Dispositional Flow Scale-2 (DFS-2) for use with Italian adults, young adults and adolescents. In accordance with the guidelines for test adaptation, the scale was translated using the back-translation method. Item comprehension was checked according to the latest standards on culturally sensitive translation. The scale thus produced was administered to 843 individuals (60.69% female) aged between 15 and 74. The sample is balanced between workers and students. The main activities defined by the subjects allow the sample to be divided into three categories: students, workers, and athletes (professional and semi-professional). The confirmatory factor analysis, conducted using the Maximum Likelihood Estimator (MLM), showed acceptable fit indexes. Reliability and validity were verified, and structural invariance was verified across 6 categories of Flow experience and for 3 subsamples with different fields of action. Correlational analysis shows significantly high correlations between the nine dimensions. Our data confirmed the validity and reliability of the Italian DFS-2 in measuring Flow experiences. The scale is reliable for use with Italian adults, young adults and adolescents. The Italian version of the scale is suitable for evaluating the subjective trait tendency to experience Flow in different contexts, such as sport, study and work.

  15. Intuition in emergency nursing: a phenomenological study.

    PubMed

    Lyneham, Joy; Parkinson, Camillus; Denholm, Carey

    2008-04-01

    The evidence of experience of intuitive knowing in the clinical setting has to this point been only informal and anecdotal. Reported experiences thus need to be either validated or refuted so that the place of intuition in emergency nursing can be determined. The history, nature and component themes captured within the intuitive practice of emergency nursing are described. This study was informed by the philosophy and method of phenomenology. Participants were 14 experienced emergency nurses. Through their narrative accounts and recall of events, their experience of knowing was captured. Through a Van Manen process and a Gadamerian analysis, six themes associated with the ways in which the participants experienced intuition in clinical practice were identified. This paper reveals the six emerging themes as knowledge, experience, connection, feeling, syncretism and trust.

  16. Shape control of an adaptive wing for transonic drag reduction

    NASA Astrophysics Data System (ADS)

    Austin, Fred; Van Nostrand, William C.

    1995-05-01

    Theory and experiments to control the static shape of flexible structures by employing internal translational actuators are summarized, and plans to extend the work to adaptive wings are presented. Significant reductions in shock-induced drag are achievable during transonic cruise by small adaptive modifications to the wing cross-sectional profile. Actuators are employed as truss elements of active ribs to deform the wing cross section. An adaptive-rib model was constructed, and experiments validated the shape-control theory. Plans for future development under an ARPA/AFWAL contract include payoff assessments of the method on an actual aircraft, the development of inchworm TERFENOL-D actuators, and the development of a method to optimize the wing cross-sectional shapes by direct-drag measurements.

  17. The construction and assessment of a statistical model for the prediction of protein assay data.

    PubMed

    Pittman, J; Sacks, J; Young, S Stanley

    2002-01-01

    The focus of this work is the development of a statistical model for a bioinformatics database whose distinctive structure makes model assessment an interesting and challenging problem. The key components of the statistical methodology, including a fast approximation to the singular value decomposition and the use of adaptive spline modeling and tree-based methods, are described, and preliminary results are presented. These results are shown to compare favorably to selected results achieved using comparative methods. An attempt to determine the predictive ability of the model through the use of cross-validation experiments is discussed. In conclusion, a synopsis of the results of these experiments and their implications for the analysis of bioinformatic databases in general is presented.
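    A minimal sketch of assessing predictive ability by cross-validation, as discussed above. The regressor and the synthetic descriptor/response data are placeholders standing in for the paper's adaptive-spline and tree-based models and its assay data.

    ```python
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.tree import DecisionTreeRegressor

    rng = np.random.default_rng(0)
    X = rng.normal(size=(120, 10))                                   # descriptor matrix
    y = X[:, 0] - 2.0 * X[:, 3] + rng.normal(scale=0.5, size=120)    # assay response

    scores = cross_val_score(DecisionTreeRegressor(max_depth=3), X, y,
                             cv=5, scoring="r2")                     # 5-fold cross-validated R^2
    print(scores.mean(), scores.std())
    ```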

  18. A novel analysis method for paired-sample microbial ecology experiments

    DOE PAGES

    Olesen, Scott W.; Vora, Suhani; Techtmann, Stephen M.; ...

    2016-05-06

    Many microbial ecology experiments use sequencing data to measure a community's response to an experimental treatment. In a common experimental design, two units, one control and one experimental, are sampled before and after the treatment is applied to the experimental unit. The four resulting samples contain information about the dynamics of organisms that respond to the treatment, but there are no analytical methods designed to extract exactly this type of information from this configuration of samples. Here we present an analytical method specifically designed to visualize and generate hypotheses about microbial community dynamics in experiments that have paired samples and few or no replicates. The method is based on the Poisson lognormal distribution, long studied in macroecology, which we found accurately models the abundance distribution of taxa counts from 16S rRNA surveys. To demonstrate the method's validity and potential, we analyzed an experiment that measured the effect of crude oil on ocean microbial communities in microcosm. Our method identified known oil degraders as well as two clades, Maricurvus and Rhodobacteraceae, that responded to amendment with oil but do not include known oil degraders. Furthermore, our approach is sensitive to organisms that increased in abundance only in the experimental unit but less sensitive to organisms that increased in both control and experimental units, thus mitigating the role of bottle effects.
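    A minimal sketch of the Poisson-lognormal abundance model referred to above: log-mean abundances are drawn from a normal distribution and observed counts are Poisson draws around them. The parameter values are illustrative assumptions, not the authors' fitted values or analysis code.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def sample_poisson_lognormal(mu, sigma, n_taxa):
        """Simulate 16S-style taxon counts from a Poisson-lognormal model."""
        log_abundance = rng.normal(mu, sigma, size=n_taxa)
        return rng.poisson(np.exp(log_abundance))

    counts_before = sample_poisson_lognormal(mu=1.0, sigma=1.5, n_taxa=500)
    counts_after = sample_poisson_lognormal(mu=1.2, sigma=1.5, n_taxa=500)
    # Paired before/after comparisons in control and experimental units can then be
    # evaluated against this distribution to flag taxa that responded to the treatment.
    ```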

  19. A novel analysis method for paired-sample microbial ecology experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olesen, Scott W.; Vora, Suhani; Techtmann, Stephen M.

    Many microbial ecology experiments use sequencing data to measure a community's response to an experimental treatment. In a common experimental design, two units, one control and one experimental, are sampled before and after the treatment is applied to the experimental unit. The four resulting samples contain information about the dynamics of organisms that respond to the treatment, but there are no analytical methods designed to extract exactly this type of information from this configuration of samples. Here we present an analytical method specifically designed to visualize and generate hypotheses about microbial community dynamics in experiments that have paired samples and few or no replicates. The method is based on the Poisson lognormal distribution, long studied in macroecology, which we found accurately models the abundance distribution of taxa counts from 16S rRNA surveys. To demonstrate the method's validity and potential, we analyzed an experiment that measured the effect of crude oil on ocean microbial communities in microcosm. Our method identified known oil degraders as well as two clades, Maricurvus and Rhodobacteraceae, that responded to amendment with oil but do not include known oil degraders. Furthermore, our approach is sensitive to organisms that increased in abundance only in the experimental unit but less sensitive to organisms that increased in both control and experimental units, thus mitigating the role of bottle effects.

  20. Investigation of Springback Associated with Composite Material Component Fabrication (MSFC Center Director's Discretionary Fund Final Report, Project 94-09)

    NASA Technical Reports Server (NTRS)

    Benzie, M. A.

    1998-01-01

    The objective of this research project was to examine processing and design parameters in the fabrication of composite components to obtain a better understanding of, and attempt to minimize, springback associated with composite materials. To accomplish this, both processing and design parameters were included in a Taguchi-designed experiment. Composite angled panels were fabricated by hand layup techniques, and the fabricated panels were inspected for springback effects. This experiment yielded several significant results. The confirmation experiment validated the reproducibility of the factorial effects, the recognized error, and the reliability of the experiment. The material used in the design of tooling needs to be a major consideration when fabricating composite components, as expected. The factors dealing with resin flow, however, raise several potentially serious material and design questions that must be dealt with up front in order to minimize springback: the viscosity of the resin, vacuum bagging of the part for cure, and the curing method selected. These factors directly affect design, material selection, and processing methods.

  1. Patient Experience and Satisfaction with Inpatient Service: Development of Short Form Survey Instrument Measuring the Core Aspect of Inpatient Experience

    PubMed Central

    Wong, Eliza L. Y.; Coulter, Angela; Hewitson, Paul; Cheung, Annie W. L.; Yam, Carrie H. K.; Lui, Siu fai; Tam, Wilson W. S.; Yeoh, Eng-kiong

    2015-01-01

    Patient experience reflects quality of care from the patients’ perspective; therefore, patients’ experiences are important data in the evaluation of the quality of health services. The development of an abbreviated, reliable and valid instrument for measuring inpatients’ experience would reflect the key aspect of inpatient care from patients’ perspective as well as facilitate quality improvement by cultivating patient engagement and allow the trends in patient satisfaction and experience to be measured regularly. The study developed a short-form inpatient instrument and tested its ability to capture a core set of inpatients’ experiences. The Hong Kong Inpatient Experience Questionnaire (HKIEQ) was established in 2010; it is an adaptation of the General Inpatient Questionnaire of the Care Quality Commission created by the Picker Institute in United Kingdom. This study used a consensus conference and a cross-sectional validation survey to create and validate a short-form of the Hong Kong Inpatient Experience Questionnaire (SF-HKIEQ). The short-form, the SF-HKIEQ, consisted of 18 items derived from the HKIEQ. The 18 items mainly covered relational aspects of care under four dimensions of the patient’s journey: hospital staff, patient care and treatment, information on leaving the hospital, and overall impression. The SF-HKIEQ had a high degree of face validity, construct validity and internal reliability. The validated SF-HKIEQ reflects the relevant core aspects of inpatients’ experience in a hospital setting. It provides a quick reference tool for quality improvement purposes and a platform that allows both healthcare staff and patients to monitor the quality of hospital care over time. PMID:25860775

  2. Validation of vision-based range estimation algorithms using helicopter flight data

    NASA Technical Reports Server (NTRS)

    Smith, Phillip N.

    1993-01-01

    The objective of this research was to demonstrate the effectiveness of an optic flow method for passive range estimation using a Kalman-filter implementation with helicopter flight data. This paper is divided into the following areas: (1) ranging algorithm; (2) flight experiment; (3) analysis methodology; (4) results; and (5) concluding remarks. The discussion is presented in viewgraph format.
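    As background to the Kalman-filter implementation mentioned above, the following is a generic scalar Kalman filter update, not the paper's ranging algorithm: the state is a single quantity (e.g., an inverse-range-like variable) updated from noisy measurements, and all model and noise values are assumptions.

    ```python
    import numpy as np

    def kalman_step(x, P, z, F=1.0, H=1.0, Q=1e-4, R=1e-2):
        x_pred = F * x                            # predict state
        P_pred = F * P * F + Q                    # predict covariance
        K = P_pred * H / (H * P_pred * H + R)     # Kalman gain
        x_new = x_pred + K * (z - H * x_pred)     # update with measurement residual
        P_new = (1.0 - K * H) * P_pred
        return x_new, P_new

    x, P = 0.0, 1.0
    for z in np.random.default_rng(0).normal(0.05, 0.1, size=50):   # simulated measurements
        x, P = kalman_step(x, P, z)
    print(x, P)
    ```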

  3. New Tools for Learning: A Case of Organizational Problem Analysis Derived from Debriefing Records in a Medical Center

    ERIC Educational Resources Information Center

    Holzmann, Vered; Mischari, Shoshana; Goldberg, Shoshana; Ziv, Amitai

    2012-01-01

    Purpose: This article aims to present a unique systematic and validated method for creating a linkage between past experiences and management of future occurrences in an organization. Design/methodology/approach: The study is based on actual data accumulated in a series of projects performed in a major medical center. Qualitative and quantitative…

  4. Assessment of Trauma History and Trauma-Related Problems in Ethnic Minority Child Populations: An INFORMED Approach

    ERIC Educational Resources Information Center

    de Arellano, Michael A.; Danielson, Carla Kmett

    2008-01-01

    Youth who experience traumatic events are at risk for a range of negative outcomes, including posttraumatic stress disorder, other anxiety disorders, depression, substance use, and health risk behaviors. It is important to identify valid methods to assess individuals for exposure to traumatic events, as well as the types of problems or symptoms…

  5. Walk a Mile in My Shoes: Stakeholder Accounts of Testing Experience with a Computer-Administered Test

    ERIC Educational Resources Information Center

    Fox, Janna; Cheng, Liying

    2015-01-01

    In keeping with the trend to elicit multiple stakeholder responses to operational tests as part of test validation, this exploratory mixed methods study examines test-taker accounts of an Internet-based (i.e., computer-administered) test in the high-stakes context of proficiency testing for university admission. In 2013, as language testing…

  6. Lightweight evacuated multilayer insulation systems for the space shuttle vehicle

    NASA Technical Reports Server (NTRS)

    Barclay, D. L.; Bell, J. E.; Zimmerman, D. K.

    1973-01-01

    The elements in the evacuated multilayer insulation system were investigated, and the major weight contributors for optimization selected. Outgassing tests were conducted on candidate vacuum jacket materials and experiments were conducted to determine the vacuum and structural integrity of selected vacuum jacket configurations. A nondestructive proof test method, applicable to externally pressurized shells, was validated on this program.

  7. The teratology testing of cosmetics.

    PubMed

    Spézia, François; Barrow, Paul C

    2013-01-01

    In Europe, the developmental toxicity testing (including teratogenicity) of new cosmetic ingredients is performed according to the Cosmetics Directive 76/768/EEC: only alternatives leading to full replacement of animal experiments should be used. This chapter presents the three scientifically validated animal alternative methods for the assessment of embryotoxicity: the embryonic stem cell test (EST), the micromass (MM) assay, and the whole embryo culture (WEC) assay.

  8. Green's function methods in heavy ion shielding

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Costen, Robert C.; Shinn, Judy L.; Badavi, Francis F.

    1993-01-01

    An analytic solution to the heavy ion transport in terms of Green's function is used to generate a highly efficient computer code for space applications. The efficiency of the computer code is accomplished by a nonperturbative technique extending Green's function over the solution domain. The computer code can also be applied to accelerator boundary conditions to allow code validation in laboratory experiments.

  9. Radiocardiography in clinical cardiology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pierson, R.N. Jr.; Alam, S.; Kemp, H.G.

    1977-01-01

    Quantitative radiocardiography provides a variety of noninvasive measurements of value in cardiology. A gamma camera and computer processing are required for most of these measurements. The advantages of ease, economy, and safety of these procedures are, in part, offset by the complexity of as yet unstandardized methods and incomplete validation of results. The expansion of these techniques will inevitably be rapid. Their careful performance requires, for the moment, a major and perhaps dedicated effort by at least one member of the professional team, if the pitfalls that lead to unrecognized error are to be avoided. We may anticipate more automated and reliable results with increased experience and validation.

  10. ALHAT System Validation

    NASA Technical Reports Server (NTRS)

    Brady, Tye; Bailey, Erik; Crain, Timothy; Paschall, Stephen

    2011-01-01

    NASA has embarked on a multiyear technology development effort to develop a safe and precise lunar landing capability. The Autonomous Landing and Hazard Avoidance Technology (ALHAT) Project is investigating a range of landing hazard detection methods while developing a hazard avoidance capability to best field test the proper set of relevant autonomous GNC technologies. Ultimately, the advancement of these technologies through the ALHAT Project will provide an ALHAT System capable of enabling next generation lunar lander vehicles to globally land precisely and safely regardless of lighting condition. This paper provides an overview of the ALHAT System and describes recent validation experiments that have advanced the highly capable GNC architecture.

  11. Calculated criticality for ²³⁵U/graphite systems using the VIM Monte Carlo code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collins, P.J.; Grasseschi, G.L.; Olsen, D.N.

    1992-01-01

    Calculations for highly enriched uranium and graphite systems gained renewed interest recently for the new production modular high-temperature gas-cooled reactor (MHTGR). Experiments to validate the physics calculations for these systems are being prepared for the Transient Reactor Test Facility (TREAT) reactor at Argonne National Laboratory (ANL-West) and in the Compact Nuclear Power Source facility at Los Alamos National Laboratory. The continuous-energy Monte Carlo code VIM, or equivalently the MCNP code, can utilize fully detailed models of the MHTGR and serve as benchmarks for the approximate multigroup methods necessary in full reactor calculations. Validation of these codes and their associated nuclear data did not exist for highly enriched ²³⁵U/graphite systems. Experimental data, used in development of more approximate methods, dates back to the 1960s. The authors have selected two independent sets of experiments for calculation with the VIM code. The carbon-to-uranium (C/U) ratios encompass the range from 2,000, representative of the new production MHTGR, to 10,000 in the fuel of TREAT. Calculations used the ENDF/B-V data.

  12. Spatial-temporal discriminant analysis for ERP-based brain-computer interface.

    PubMed

    Zhang, Yu; Zhou, Guoxu; Zhao, Qibin; Jin, Jing; Wang, Xingyu; Cichocki, Andrzej

    2013-03-01

    Linear discriminant analysis (LDA) has been widely adopted to classify event-related potentials (ERP) in brain-computer interfaces (BCI). Good classification performance of an ERP-based BCI usually requires sufficient data recordings for effective training of the LDA classifier, and hence a long system calibration time, which may reduce the system's practicability and cause user resistance to the BCI system. In this study, we introduce a spatial-temporal discriminant analysis (STDA) for ERP classification. As a multiway extension of LDA, the STDA method tries to maximize the discriminant information between target and nontarget classes by collaboratively finding two projection matrices in the spatial and temporal dimensions, which effectively reduces the feature dimensionality in the discriminant analysis and hence significantly decreases the number of required training samples. The proposed STDA method was validated with dataset II of the BCI Competition III and a dataset recorded from our own experiments, and compared to the state-of-the-art algorithms for ERP classification. Online experiments were additionally implemented for validation. The superior classification performance with few training samples shows that STDA is effective in reducing the system calibration time and improving the classification accuracy, thereby enhancing the practicability of ERP-based BCIs.
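    For orientation, the sketch below shows the LDA baseline that STDA extends: classifying vectorized ERP epochs as target vs. non-target. The random data and the flattening of the channel-by-time features are illustrative assumptions; STDA itself learns separate spatial and temporal projection matrices, which is not shown here.

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(0)
    n_epochs, n_channels, n_samples = 200, 8, 50
    X = rng.normal(size=(n_epochs, n_channels, n_samples))
    y = rng.integers(0, 2, size=n_epochs)          # 1 = target, 0 = non-target
    X[y == 1, :, 20:30] += 0.5                      # crude "P300-like" deflection

    X_vec = X.reshape(n_epochs, -1)                 # flatten space x time features
    clf = LinearDiscriminantAnalysis().fit(X_vec[:150], y[:150])
    print(clf.score(X_vec[150:], y[150:]))          # held-out accuracy
    ```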

  13. A Bayesian Active Learning Experimental Design for Inferring Signaling Networks.

    PubMed

    Ness, Robert O; Sachs, Karen; Mallick, Parag; Vitek, Olga

    2018-06-21

    Machine learning methods for learning network structure are applied to quantitative proteomics experiments and reverse-engineer intracellular signal transduction networks. They provide insight into the rewiring of signaling within the context of a disease or a phenotype. To learn the causal patterns of influence between proteins in the network, the methods require experiments that include targeted interventions that fix the activity of specific proteins. However, the interventions are costly and add experimental complexity. We describe an active learning strategy for selecting optimal interventions. Our approach takes as inputs pathway databases and historic data sets, expresses them in form of prior probability distributions on network structures, and selects interventions that maximize their expected contribution to structure learning. Evaluations on simulated and real data show that the strategy reduces the detection error of validated edges as compared with an unguided choice of interventions and avoids redundant interventions, thereby increasing the effectiveness of the experiment.

  14. Experiment Analysis and Modelling of Compaction Behaviour of Ag60Cu30Sn10 Mixed Metal Powders

    NASA Astrophysics Data System (ADS)

    Zhou, Mengcheng; Huang, Shangyu; Liu, Wei; Lei, Yu; Yan, Shiwei

    2018-03-01

    A novel process method combining powder compaction and sintering was employed to fabricate thin sheets of cadmium-free silver-based filler metals, and the compaction densification behaviour of Ag60Cu30Sn10 mixed metal powders was investigated experimentally. Based on the equivalent density method, the density-dependent Drucker-Prager Cap (DPC) model was introduced to model the powder compaction behaviour. Various experimental procedures were completed to determine the model parameters. The friction coefficients in lubricated and unlubricated dies were experimentally determined. The determined material parameters were validated by experiments and numerical simulation of the powder compaction process using a user subroutine (USDFLD) in ABAQUS/Standard. The good agreement between the simulated and experimental results indicates that the determined model parameters are able to describe the compaction behaviour of the multicomponent mixed metal powders and can be further used for process optimization simulations.

  15. The Impact of Preceptor and Student Learning Styles on Experiential Performance Measures

    PubMed Central

    Cox, Craig D.; Seifert, Charles F.

    2012-01-01

    Objectives. To identify preceptors’ and students’ learning styles to determine how these impact students’ performance on pharmacy practice experience assessments. Methods. Students and preceptors were asked to complete a validated Pharmacist’s Inventory of Learning Styles (PILS) questionnaire to identify dominant and secondary learning styles. The significance of “matched” and “unmatched” learning styles between students and preceptors was evaluated based on performance on both subjective and objective practice experience assessments. Results. Sixty-one percent of 67 preceptors and 57% of 72 students who participated reported “assimilator” as their dominant learning style. No differences were found between student and preceptor performance on evaluations, regardless of learning style match. Conclusion. Determination of learning styles may encourage preceptors to use teaching methods to challenge students during pharmacy practice experiences; however, this does not appear to impact student or preceptor performance. PMID:23049100

  16. An alternative method for calibration of flow field flow fractionation channels for hydrodynamic radius determination: The nanoemulsion method (featuring multi angle light scattering).

    PubMed

    Bolinsson, Hans; Lu, Yi; Hall, Stephen; Nilsson, Lars; Håkansson, Andreas

    2018-01-19

    This study suggests a novel method for determination of the channel height in asymmetrical flow field-flow fractionation (AF4), which can be used for calibration of the channel for hydrodynamic radius determinations. The novel method uses an oil-in-water nanoemulsion together with multi angle light scattering (MALS) and elution theory to determine channel height from an AF4 experiment. The method is validated using two orthogonal methods; first, by using standard particle elution experiments and, secondly, by imaging an assembled and carrier liquid filled channel by x-ray computed tomography (XCT). It is concluded that the channel height can be determined with approximately the same accuracy as with the traditional channel height determination technique. However, the nanoemulsion method can be used under more challenging conditions than standard particles, as the nanoemulsion remains stable in a wider pH range than the previously used standard particles. Moreover, the novel method is also more cost effective. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Predicting gene regulatory networks of soybean nodulation from RNA-Seq transcriptome data.

    PubMed

    Zhu, Mingzhu; Dahmen, Jeremy L; Stacey, Gary; Cheng, Jianlin

    2013-09-22

    High-throughput RNA sequencing (RNA-Seq) is a revolutionary technique to study the transcriptome of a cell under various conditions at a systems level. Despite the wide application of RNA-Seq techniques to generate experimental data in the last few years, few computational methods are available to analyze this huge amount of transcription data. The computational methods for constructing gene regulatory networks from RNA-Seq expression data of hundreds or even thousands of genes are particularly lacking and urgently needed. We developed an automated bioinformatics method to predict gene regulatory networks from the quantitative expression values of differentially expressed genes based on RNA-Seq transcriptome data of a cell in different stages and conditions, integrating transcriptional, genomic and gene function data. We applied the method to the RNA-Seq transcriptome data generated for soybean root hair cells in three different development stages of nodulation after rhizobium infection. The method predicted a soybean nodulation-related gene regulatory network consisting of 10 regulatory modules common for all three stages, and 24, 49 and 70 modules separately for the first, second and third stage, each containing both a group of co-expressed genes and several transcription factors collaboratively controlling their expression under different conditions. 8 of 10 common regulatory modules were validated by at least two kinds of validations, such as independent DNA binding motif analysis, gene function enrichment test, and previous experimental data in the literature. We developed a computational method to reliably reconstruct gene regulatory networks from RNA-Seq transcriptome data. The method can generate valuable hypotheses for interpreting biological data and designing biological experiments such as ChIP-Seq, RNA interference, and yeast two hybrid experiments.

  18. Forensic Uncertainty Quantification of Explosive Dispersal of Particles

    NASA Astrophysics Data System (ADS)

    Hughes, Kyle; Park, Chanyoung; Haftka, Raphael; Kim, Nam-Ho

    2017-06-01

    In addition to the numerical challenges of simulating the explosive dispersal of particles, validation of the simulation is often plagued with poor knowledge of the experimental conditions. The level of experimental detail required for validation is beyond what is usually included in the literature. This presentation proposes the use of forensic uncertainty quantification (UQ) to investigate validation-quality experiments to discover possible sources of uncertainty that may have been missed in initial design of experiments or under-reported. The current experience of the authors has found that by making an analogy to crime scene investigation when looking at validation experiments, valuable insights may be gained. One examines all the data and documentation provided by the validation experimentalists, corroborates evidence, and quantifies large sources of uncertainty a posteriori with empirical measurements. In addition, it is proposed that forensic UQ may benefit from an independent investigator to help remove possible implicit biases and increases the likelihood of discovering unrecognized uncertainty. Forensic UQ concepts will be discussed and then applied to a set of validation experiments performed at Eglin Air Force Base. This work was supported in part by the U.S. Department of Energy, National Nuclear Security Administration, Advanced Simulation and Computing Program.

  19. Sector-Based Detection for Hands-Free Speech Enhancement in Cars

    NASA Astrophysics Data System (ADS)

    Lathoud, Guillaume; Bourgeois, Julien; Freudenberger, Jürgen

    2006-12-01

    Adaptation control of beamforming interference cancellation techniques is investigated for in-car speech acquisition. Two efficient adaptation control methods are proposed that avoid target cancellation. The "implicit" method varies the step-size continuously, based on the filtered output signal. The "explicit" method decides in a binary manner whether to adapt or not, based on a novel estimate of target and interference energies. It estimates the average delay-sum power within a volume of space, for the same cost as the classical delay-sum. Experiments on real in-car data validate both methods, including a case with background road noise recorded while driving.

  20. A simple method to eliminate shielding currents for magnetization perpendicular to superconducting tapes wound into coils

    NASA Astrophysics Data System (ADS)

    Kajikawa, Kazuhiro; Funaki, Kazuo

    2011-12-01

    Application of an external AC magnetic field parallel to superconducting tapes helps in eliminating the magnetization caused by the shielding current induced in the flat faces of the tapes. This method helps in realizing a magnet system with high-temperature superconducting tapes for magnetic resonance imaging (MRI) and nuclear magnetic resonance (NMR) applications. The effectiveness of the proposed method is validated by numerical calculations carried out using the finite-element method and experiments performed using a commercially available superconducting tape. The field uniformity for a single-layer solenoid coil after the application of an AC field is also estimated by a theoretical consideration.

  1. High resolution particle tracking method by suppressing the wavefront aberrations

    NASA Astrophysics Data System (ADS)

    Chang, Xinyu; Yang, Yuan; Kou, Li; Jin, Lei; Lu, Junsheng; Hu, Xiaodong

    2018-01-01

    Digital in-line holographic microscopy is one of the most efficient methods for particle tracking as it can precisely measure the axial position of particles. However, imaging systems are often limited by detector noise, image distortions and human operator misjudgment, which make the particles hard to locate. A general method is used to solve this problem. The normalized holograms of particles were reconstructed to the pupil plane and then fit to a linear superposition of the Zernike polynomial functions to suppress the aberrations. Experiments were carried out to validate the method, and the results show that nanometer-scale resolution was achieved even when the holograms were poorly recorded.

  2. The use of capillary electrophoresis as part of a specificity testing strategy for mitoguazone dihydrochloride HPLC methods.

    PubMed

    Thomson, C E; Gray, M R; Baxter, M P

    1997-05-01

    Capillary electrophoresis (CE) has been used as part of a validation experiment designed to prove the specificity of high performance liquid chromatography (HPLC) methods used for analysis of mitoguazone dihydrochloride drug substance. Data regarding accuracy, precision and sensitivity of the CE methods are presented as well as a comparison of results obtained from CE, HPLC and thin-layer chromatography (TLC) analysis of samples stressed under a variety of conditions. It was concluded that, not only were the HPLC methods being investigated specific, but that CE could potentially be used to replace HPLC for the routine assay of mitoguazone dihydrochloride.

  3. Use of the FDA nozzle model to illustrate validation techniques in computational fluid dynamics (CFD) simulations

    PubMed Central

    Hariharan, Prasanna; D’Souza, Gavin A.; Horner, Marc; Morrison, Tina M.; Malinauskas, Richard A.; Myers, Matthew R.

    2017-01-01

    A “credible” computational fluid dynamics (CFD) model has the potential to provide a meaningful evaluation of safety in medical devices. One major challenge in establishing “model credibility” is to determine the required degree of similarity between the model and experimental results for the model to be considered sufficiently validated. This study proposes a “threshold-based” validation approach that provides a well-defined acceptance criterion, which is a function of how close the simulation and experimental results are to the safety threshold, for establishing the model validity. The validation criterion developed following the threshold approach is not only a function of Comparison Error, E (which is the difference between experiments and simulations) but also takes into account the risk to patient safety because of E. The method is applicable for scenarios in which a safety threshold can be clearly defined (e.g., the viscous shear-stress threshold for hemolysis in blood contacting devices). The applicability of the new validation approach was tested on the FDA nozzle geometry. The context of use (COU) was to evaluate if the instantaneous viscous shear stress in the nozzle geometry at Reynolds numbers (Re) of 3500 and 6500 was below the commonly accepted threshold for hemolysis. The CFD results (“S”) of velocity and viscous shear stress were compared with inter-laboratory experimental measurements (“D”). The uncertainties in the CFD and experimental results due to input parameter uncertainties were quantified following the ASME V&V 20 standard. The CFD models for both Re = 3500 and 6500 could not be sufficiently validated by performing a direct comparison between CFD and experimental results using the Student’s t-test. However, following the threshold-based approach, a Student’s t-test comparing |S-D| and |Threshold-S| showed that relative to the threshold, the CFD and experimental datasets for Re = 3500 were statistically similar and the model could be considered sufficiently validated for the COU. However, for Re = 6500, at certain locations where the shear stress is close to the hemolysis threshold, the CFD model could not be considered sufficiently validated for the COU. Our analysis showed that the model could be sufficiently validated either by reducing the uncertainties in experiments, simulations, and the threshold or by increasing the sample size for the experiments and simulations. The threshold approach can be applied to all types of computational models and provides an objective way of determining model credibility and of evaluating medical devices. PMID:28594889

  4. Use of the FDA nozzle model to illustrate validation techniques in computational fluid dynamics (CFD) simulations.

    PubMed

    Hariharan, Prasanna; D'Souza, Gavin A; Horner, Marc; Morrison, Tina M; Malinauskas, Richard A; Myers, Matthew R

    2017-01-01

    A "credible" computational fluid dynamics (CFD) model has the potential to provide a meaningful evaluation of safety in medical devices. One major challenge in establishing "model credibility" is to determine the required degree of similarity between the model and experimental results for the model to be considered sufficiently validated. This study proposes a "threshold-based" validation approach that provides a well-defined acceptance criteria, which is a function of how close the simulation and experimental results are to the safety threshold, for establishing the model validity. The validation criteria developed following the threshold approach is not only a function of Comparison Error, E (which is the difference between experiments and simulations) but also takes in to account the risk to patient safety because of E. The method is applicable for scenarios in which a safety threshold can be clearly defined (e.g., the viscous shear-stress threshold for hemolysis in blood contacting devices). The applicability of the new validation approach was tested on the FDA nozzle geometry. The context of use (COU) was to evaluate if the instantaneous viscous shear stress in the nozzle geometry at Reynolds numbers (Re) of 3500 and 6500 was below the commonly accepted threshold for hemolysis. The CFD results ("S") of velocity and viscous shear stress were compared with inter-laboratory experimental measurements ("D"). The uncertainties in the CFD and experimental results due to input parameter uncertainties were quantified following the ASME V&V 20 standard. The CFD models for both Re = 3500 and 6500 could not be sufficiently validated by performing a direct comparison between CFD and experimental results using the Student's t-test. However, following the threshold-based approach, a Student's t-test comparing |S-D| and |Threshold-S| showed that relative to the threshold, the CFD and experimental datasets for Re = 3500 were statistically similar and the model could be considered sufficiently validated for the COU. However, for Re = 6500, at certain locations where the shear stress is close the hemolysis threshold, the CFD model could not be considered sufficiently validated for the COU. Our analysis showed that the model could be sufficiently validated either by reducing the uncertainties in experiments, simulations, and the threshold or by increasing the sample size for the experiments and simulations. The threshold approach can be applied to all types of computational models and provides an objective way of determining model credibility and for evaluating medical devices.

  5. Estimation of Unsteady Aerodynamic Models from Dynamic Wind Tunnel Data

    NASA Technical Reports Server (NTRS)

    Murphy, Patrick; Klein, Vladislav

    2011-01-01

    Demanding aerodynamic modelling requirements for military and civilian aircraft have motivated researchers to improve computational and experimental techniques and to pursue closer collaboration in these areas. Model identification and validation techniques are key components of this research. This paper presents mathematical model structures and identification techniques that have been used successfully to model more general aerodynamic behaviours in single-degree-of-freedom dynamic testing. Model parameters, characterizing aerodynamic properties, are estimated using linear and nonlinear regression methods in both the time and frequency domains. Steps in identification, including model structure determination, parameter estimation, and model validation, are addressed in this paper with examples using data from one-degree-of-freedom dynamic wind tunnel and water tunnel experiments. These techniques offer a methodology for expanding the utility of computational methods in application to flight dynamics, stability, and control problems. Since flight test is not always an option for early model validation, time history comparisons are commonly made between computational and experimental results, and model adequacy is inferred from corroborating results. An extension to this conventional approach is offered in which more general model parameter estimates and their standard errors are compared.
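
    A minimal sketch of the linear-regression step described above: estimating model parameters from time-history data and reporting their standard errors, which is what the proposed extension compares. The regressors, coefficients, and noise level are synthetic placeholders, not the paper's model structure or wind-tunnel data.

```python
# Least-squares parameter estimation with standard errors (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
N = 500
time = np.linspace(0, 10, N)
alpha = 0.1 * np.sin(2 * np.pi * 0.5 * time)                  # angle of attack [rad]
q = np.gradient(alpha, time)                                  # pitch rate [rad/s]
Cm = -0.8 * alpha - 5.0 * q + 0.002 * rng.standard_normal(N)  # "measured" pitching moment

X = np.column_stack([alpha, q])                   # regressor matrix
theta, *_ = np.linalg.lstsq(X, Cm, rcond=None)    # parameter estimates

residuals = Cm - X @ theta
sigma2 = residuals @ residuals / (N - X.shape[1])  # residual variance
cov = sigma2 * np.linalg.inv(X.T @ X)              # parameter covariance
std_err = np.sqrt(np.diag(cov))                    # standard errors

for name, est, se in zip(["Cm_alpha", "Cm_q"], theta, std_err):
    print(f"{name} = {est:+.3f} +/- {se:.3f}")
```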

  6. Nuclear magnetic resonance signal dynamics of liquids in the presence of distant dipolar fields, revisited

    PubMed Central

    Barros, Wilson; Gochberg, Daniel F.; Gore, John C.

    2009-01-01

    The description of the nuclear magnetic resonance magnetization dynamics in the presence of long-range dipolar interactions, which is based upon approximate solutions of Bloch–Torrey equations including the effect of a distant dipolar field, has been revisited. New experiments show that approximate analytic solutions have a broader regime of validity as well as dependencies on pulse-sequence parameters that seem to have been overlooked. In order to explain these experimental results, we developed a new method consisting of calculating the magnetization via an iterative formalism where both diffusion and distant dipolar field contributions are treated as integral operators incorporated into the Bloch–Torrey equations. The solution can be organized as a perturbative series, whereby access to higher order terms allows one to set better boundaries on validity regimes for analytic first-order approximations. Finally, the method legitimizes the use of simple analytic first-order approximations under less demanding experimental conditions, predicts new pulse-sequence parameter dependencies for the range of validity, and clarifies weak points in previous calculations. PMID:19425789

  7. [Analysis of the stability and adaptability of near infrared spectra qualitative analysis model].

    PubMed

    Cao, Wu; Li, Wei-jun; Wang, Ping; Zhang, Li-ping

    2014-06-01

    The stability and adaptability of near-infrared spectra qualitative analysis models were studied. Separate modeling can significantly improve model stability, but its ability to improve model adaptability is limited. Joint modeling can improve not only the adaptability but also the stability of the model; at the same time, compared with separate modeling, it shortens the modeling time, reduces the modeling workload, extends the term of validity of the model, and improves modeling efficiency. The model adaptability experiment shows that the correct recognition rate of the separate modeling method is relatively low and cannot meet application requirements, whereas the joint modeling method reaches a correct recognition rate of 90% and significantly enhances the recognition effect. The model stability experiment shows that the identification results of the joint model are better than those of the separate model, and the method has good application value.

  8. An image-based automatic recognition method for the flowering stage of maize

    NASA Astrophysics Data System (ADS)

    Yu, Zhenghong; Zhou, Huabing; Li, Cuina

    2018-03-01

    In this paper, we propose an image-based approach for automatically recognizing the flowering stage of maize. A modified HOG/SVM detection framework is first adopted to detect the ears of maize. Then, we use low-rank matrix recovery technology to precisely extract the ears at the pixel level. Finally, a new feature called the color gradient histogram is proposed as an indicator to determine the flowering stage. Comparative experiments were carried out to verify the validity of our method, and the results indicate that it can meet the demands of practical observation. A hedged feature-extraction sketch is given below.
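
    The abstract does not define the color gradient histogram precisely, so the sketch below shows only one plausible reading: a per-channel histogram of gradient magnitudes computed over the extracted ear region. The function name, bin count, and normalisation are assumptions, not the authors' definition.

```python
# Illustrative (assumed) colour-gradient-histogram feature for a detected ear patch.
import numpy as np

def color_gradient_histogram(ear_rgb: np.ndarray, bins: int = 16) -> np.ndarray:
    """ear_rgb: HxWx3 float array in [0, 1]; pixels outside the ear masked to 0."""
    features = []
    for c in range(3):                              # per colour channel
        gy, gx = np.gradient(ear_rgb[:, :, c])      # spatial gradients
        magnitude = np.hypot(gx, gy)
        hist, _ = np.histogram(magnitude, bins=bins, range=(0.0, 1.0))
        features.append(hist / max(hist.sum(), 1))  # normalised per-channel histogram
    return np.concatenate(features)                 # 3 * bins feature vector

# Usage: feed the feature into a classifier or threshold rule to flag flowering.
ear = np.random.default_rng(1).random((64, 64, 3))  # placeholder for an extracted ear patch
print(color_gradient_histogram(ear).shape)          # (48,)
```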

  9. The Development of a Web-Based Urban Soundscape Evaluation System

    NASA Astrophysics Data System (ADS)

    Sudarsono, A. S.; Sarwono, J.

    2018-05-01

    Acoustic quality is one of the important aspects of urban design. It is usually evaluated based on how loud the urban environment is. However, this approach does not consider people’s perception of the urban acoustic environment. Therefore, a different method has been developed based on the perception of the acoustic environment using the concept of soundscape. Soundscape is defined as the acoustic environment perceived by people who are part of the environment. This approach considers the relationship between the sound source, the environment, and the people. The analysis of soundscape considers many aspects such as cultural aspects, people’s expectations, people’s experience of space, and social aspects. Soundscape affects many aspects of human life such as culture, health, and the quality of life. Urban soundscape management and planning must be integrated with the other aspects of urban design, in both the design and improvement stages. The soundscape concept seeks to make the acoustic environment as pleasant as possible in a space with or without uncomfortable sound sources. Soundscape planning includes the design of physical features to achieve a positive perceptual outcome. It is vital to gather data regarding the relationship between humans and the components of a soundscape, e.g., sound sources, features of the physical environment, the functions of a space, and the expectation of the sound source. The data can be measured and gathered using several soundscape evaluation methods. Soundscape evaluation is usually conducted using in-situ surveys and laboratory experiments with a multi-speaker system. Although these methods have been validated and are widely used in soundscape analysis, there are some limitations in their application. The in-situ survey needs to be done with many people at the same time because it is hard to replicate the acoustic environment. Conversely, the laboratory experiment can be repeated without difficulty, but it requires a room with a multi-speaker reproduction system. This project used a different method for soundscape analysis, developed around headphone reproduction via the internet. The internet system for data gathering has been established; a website can reproduce high-quality audio and includes a system for designing online questionnaires. Furthermore, the development of a virtual reality system allows the reproduction of virtual audio-visual stimuli on a website. Although the website has an established system to gather the required data, the problem is the validation of the reproduction system for soundscape analysis, which needs to be done with consideration of several factors: the suitable recording system, the effect of headphone variation, the calibration of the system, and the perception result from internet-based acoustic environment reproduction. This study aims to develop and validate a web-based urban soundscape evaluation method. By using this method, the experiment can be repeated easily and data can be gathered from many respondents. Furthermore, the simplicity of the system allows for application by stakeholders in urban design. The data gathered from this system are important for the design of an urban area with consideration of the acoustic aspects.

  10. The Development and Validation of a Life Experience Inventory for the Identification of Creative Electrical Engineers.

    ERIC Educational Resources Information Center

    Michael, William B.; Colson, Kenneth R.

    1979-01-01

    The construction and validation of the Life Experience Inventory (LEI) for the identification of creative electrical engineers are described. Using the number of patents held or pending as a criterion measure, the LEI was found to have high concurrent validity. (JKS)

  11. A novel quality by design approach for developing an HPLC method to analyze herbal extracts: A case study of sugar content analysis.

    PubMed

    Shao, Jingyuan; Cao, Wen; Qu, Haibin; Pan, Jianyang; Gong, Xingchu

    2018-01-01

    The aim of this study was to present a novel analytical quality by design (AQbD) approach for developing an HPLC method to analyze herbal extracts. In this approach, critical method attributes (CMAs) and critical method parameters (CMPs) of the analytical method were determined using the same data collected from screening experiments. The HPLC-ELSD method for separation and quantification of sugars in Codonopsis Radix extract (CRE) samples and Astragali Radix extract (ARE) samples was developed as an example with the novel AQbD approach. Potential CMAs and potential CMPs were identified from the Analytical Target Profile. After the screening experiments, the retention time of the D-glucose peak of CRE samples, the signal-to-noise ratio of the D-glucose peak of CRE samples, and the retention time of the sucrose peak in ARE samples were considered CMAs. The initial and final composition of the mobile phase, the flow rate, and the column temperature were found to be CMPs using a standard partial regression coefficient method. The probability-based design space was calculated using a Monte-Carlo simulation method and verified by experiments. The optimized method was validated to be accurate and precise, and it was then applied to the analysis of CRE and ARE samples. The present AQbD approach is efficient and suitable for samples with complex compositions.
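
    A probability-based design space of the kind mentioned above is typically mapped by Monte-Carlo simulation over the CMPs. The sketch below is a hedged illustration only: the response model, its coefficients, the noise level, and the acceptance limit are invented placeholders, not the fitted models or criteria from this study.

```python
# Hedged Monte-Carlo design-space sketch: probability that a CMA meets its limit
# over a grid of two CMPs (placeholder model and numbers throughout).
import numpy as np

rng = np.random.default_rng(0)

def predicted_retention(flow_rate, temperature):
    # assumed regression model fitted from screening experiments (placeholder coefficients)
    return 20.0 - 8.0 * flow_rate - 0.05 * temperature

def probability_of_success(flow_rate, temperature, n_sim=2000, sigma=0.4, limit=12.0):
    # propagate model uncertainty; require the CMA (retention time) to stay below the limit
    rt = predicted_retention(flow_rate, temperature) + sigma * rng.standard_normal(n_sim)
    return np.mean(rt < limit)

# Points with, e.g., p >= 0.9 would form the probability-based design space.
flows = np.linspace(0.8, 1.2, 5)    # mL/min
temps = np.linspace(25.0, 40.0, 4)  # deg C
for T in temps:
    row = [f"{probability_of_success(f, T):.2f}" for f in flows]
    print(f"T={T:4.1f} C: " + "  ".join(row))
```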

  12. Cross Cultural Adaptation, Validity, and Reliability of the Farsi Breastfeeding Attrition Prediction Tools in Iranian Pregnant Women

    PubMed Central

    Mortazavi, Forough; Mousavi, Seyed Abbas; Chaman, Reza; Khosravi, Ahmad; Janke, Jill R.

    2015-01-01

    Background: The rate of exclusive breastfeeding in Iran is decreasing. The breastfeeding attrition prediction tool (BAPT) has been validated and used in predicting premature weaning. Objectives: We aimed to translate the BAPT into Farsi, assess its content validity, and examine its reliability and validity for identifying exclusive breastfeeding discontinuation in Iran. Materials and Methods: The BAPT was translated into Farsi and the content validity of the Farsi version was assessed. It was administered to 356 pregnant women in the third trimester of pregnancy, residents of a city in the northeast of Iran. The structural integrity of the four-factor model was assessed with confirmatory factor analysis (CFA) and exploratory factor analysis (EFA). Reliability was assessed using Cronbach’s alpha coefficient and item-subscale correlations. Validity was assessed using known-group comparison (128 with vs. 228 without breastfeeding experience) and predictive validity (80 successes vs. 265 failures in exclusive breastfeeding). Results: The internal consistency of the whole instrument (49 items) was 0.775. CFA provided an acceptable fit to the a priori four-factor model (Chi-square/df = 1.8, Root Mean Square Error of Approximation (RMSEA) = 0.049, Standardized Root Mean Square Residual (SRMR) = 0.064, Comparative Fit Index (CFI) = 0.911). The difference in mean breastfeeding control (BFC) between the participants with and without breastfeeding experience was significant (P < 0.001). In addition, the total score of the BAPT and the score of the BFC subscale were higher at four months postpartum in women who were exclusively breastfeeding than in women who were not (P < 0.05). Conclusions: This study validated the Farsi version of the BAPT. It is useful for researchers who want to identify women in Iran at higher risk of exclusive breastfeeding (EBF) discontinuation. PMID:26019910
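
    The internal-consistency figure reported above (0.775 for 49 items) is a Cronbach's alpha. A minimal sketch of that computation is shown below; the respondent-by-item score matrix is a synthetic placeholder, not the study's questionnaire data.

```python
# Cronbach's alpha: alpha = k/(k-1) * (1 - sum(item variances) / variance(total score)).
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """scores: (n_respondents, n_items) array of item scores."""
    k = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1).sum()
    total_variance = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_variances / total_variance)

rng = np.random.default_rng(0)
latent = rng.normal(size=(356, 1))                  # shared trait across respondents
scores = latent + 0.8 * rng.normal(size=(356, 10))  # 10 correlated placeholder items
print(f"alpha = {cronbach_alpha(scores):.3f}")
```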

  13. Coherent entropy induced and acoustic noise separation in compact nozzles

    NASA Astrophysics Data System (ADS)

    Tao, Wenjie; Schuller, Thierry; Huet, Maxime; Richecoeur, Franck

    2017-04-01

    A method to separate entropy-induced noise from an acoustic pressure wave in a harmonically perturbed flow through a nozzle is presented. It is tested on an original experimental setup generating simultaneously acoustic and temperature fluctuations in an air flow that is accelerated by a convergent nozzle. The setup mimics the direct and indirect noise contributions to the acoustic pressure field in a confined combustion chamber by producing synchronized acoustic and temperature fluctuations, without dealing with the complexity of the combustion process. It allows the generation of temperature fluctuations with amplitudes up to 10 K in the frequency range from 10 to 100 Hz. The noise separation technique uses experiments with and without temperature fluctuations to determine the relative level of acoustic and entropy fluctuations in the system and to identify the nozzle response to these forcing waves. It requires multi-point measurements of acoustic pressure and temperature. The separation method is first validated with direct numerical simulations of the nonlinear Euler equations. These simulations are used to investigate the conditions for which the separation technique is valid and yield trends similar to the experiments for the investigated flow operating conditions. The separation method then successfully gives the acoustic reflection coefficient but does not recover the entropy reflection coefficient predicted by compact nozzle theory, owing to the sensitivity of the method to signal noise under the explored experimental conditions. This methodology provides a framework for experimental investigation of direct and indirect combustion noise originating from synchronized perturbations.

  14. Development and validation of a HPLC method for the assay of dapivirine in cell-based and tissue permeability experiments.

    PubMed

    das Neves, José; Sarmento, Bruno; Amiji, Mansoor; Bahia, Maria Fernanda

    2012-12-12

    Dapivirine, a non-nucleoside reverse transcriptase inhibitor, is being currently used for the development of potential anti-HIV microbicide formulations and delivery systems. A new high-performance liquid chromatography (HPLC) method with UV detection was developed for the assay of this drug in different biological matrices, namely cell lysates, receptor media from permeability experiments, and homogenates of mucosal tissues. The method used a reversed-phase C18 column with a mobile phase composed of trifluoroacetic acid solution (0.1%, v/v) and acetonitrile in a gradient mode. The injection volume was 50 μL and the flow rate 1 mL/min. The total run time was 12 min and UV detection was performed at 290 nm for dapivirine and the internal standard (IS) diphenylamine. A Box-Behnken experimental design was used to study different experimental variables of the method, namely the ratio of the mobile phase components and the gradient time, and their influence on responses such as the retention factor, tailing factor, and theoretical plates for dapivirine and the IS, as well as the peak resolution between both compounds. The optimized method was further validated and its usefulness assessed for in vitro and ex vivo experiments using dapivirine or dapivirine-loaded nanoparticles. The method was shown to be selective, linear, accurate, and precise in the range of 0.02-1.5 μg/mL. Other chromatographic parameters, namely carry-over, lower limit of quantification (0.02 μg/mL), limit of detection (0.006 μg/mL), recovery (equal to or higher than 90.7%), and sample stability at different storage conditions, were also determined and found adequate for the intended purposes. The method was successfully used for cell uptake assays and permeability studies across cell monolayers and pig genital mucosal tissues. Overall, the proposed method provides a simple, versatile and reliable way for studying the behavior of dapivirine in different biological matrices and assessing its potential as an anti-HIV microbicide drug. Copyright © 2012 Elsevier B.V. All rights reserved.
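
    The Box-Behnken design mentioned above has a simple construction in coded levels (-1/0/+1): for each pair of factors, the four ±1 combinations with the remaining factors at the centre, plus a few centre points. The sketch below is a generic, hedged illustration; the factor names, number of factors, and number of centre points are assumptions, since the abstract does not give the full design.

```python
# Generic three-factor Box-Behnken design in coded levels (illustrative only).
from itertools import combinations, product
import numpy as np

def box_behnken(n_factors: int = 3, n_center: int = 3) -> np.ndarray:
    runs = []
    for i, j in combinations(range(n_factors), 2):       # every pair of factors
        for a, b in product((-1, 1), repeat=2):          # +/-1 corners of that pair
            run = [0] * n_factors                        # remaining factors at centre
            run[i], run[j] = a, b
            runs.append(run)
    runs.extend([[0] * n_factors for _ in range(n_center)])  # centre points
    return np.array(runs)

design = box_behnken()
print(design.shape)  # (15, 3): 12 edge runs + 3 centre points
# The coded columns could then be mapped to, e.g., mobile-phase composition and
# gradient time before running the screening experiments.
```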

  15. The Use of Simulation to Teach Nursing Students and Clinicians Palliative Care and End-of-Life Communication: A Systematic Review.

    PubMed

    Smith, Madison B; Macieira, Tamara G R; Bumbach, Michael D; Garbutt, Susan J; Citty, Sandra W; Stephen, Anita; Ansell, Margaret; Glover, Toni L; Keenan, Gail

    2018-01-01

    To present the findings of a systematic review on the use of simulation-based learning experiences (SBLEs) to teach communication skills to nursing students and clinicians who provide palliative and end-of-life care to patients and their families. Palliative care communication skills are fundamental to providing holistic patient care. Since nurses have the greatest amount of direct exposure to patients, building such communication competencies is essential. However, exposure to patients and families receiving palliative and end-of-life care is often limited, resulting in few opportunities to learn these skills in the clinical setting. Simulation-based learning experiences can be used to supplement didactic teaching and clinical experiences to build the requisite communication skills. Searches of CINAHL, MEDLINE, PsychINFO, ERIC, and Web of Science electronic databases and Grey Literature returned 442 unique records. Thirty articles met the established criteria, including the requirement that the SBLE contain a nursing role. Simulation-based learning experiences are being used to teach palliative and end-of-life communication skills to nursing students and clinicians. Lack of standardization, poor evaluation methods, and limited exposure to the entire interprofessional team make it difficult to identify and disseminate validated best practices. While the need for further research is acknowledged, we recommend this evidence be augmented by training programs that utilize SBLEs through (1) applying standards, (2) clearly specifying goals and objectives, (3) integrating externally validated scenarios, and (4) employing rigorous evaluation methods and measures that link the SBLE to the training objectives and desired clinician practice behaviors and patient outcomes.

  16. Development and validation of a simplified titration method for monitoring volatile fatty acids in anaerobic digestion.

    PubMed

    Sun, Hao; Guo, Jianbin; Wu, Shubiao; Liu, Fang; Dong, Renjie

    2017-09-01

    The volatile fatty acid (VFA) concentration has been considered one of the most sensitive process performance indicators in the anaerobic digestion (AD) process. However, the accurate determination of VFA concentrations in AD processes normally requires advanced equipment and complex pretreatment procedures. A simplified method with fewer sample pretreatment procedures and improved accuracy is greatly needed, particularly for on-site application. This report outlines improvements to the Nordmann method, one of the most popular titrations used for VFA monitoring. The influence of interfering ion and solid subsystems in titrated samples on result accuracy is discussed. The total solids content in titrated samples was the main factor affecting accuracy in VFA monitoring. Moreover, a strong linear correlation was established between the total solids content and the difference in VFA measurements between the traditional Nordmann equation and gas chromatography (GC). Accordingly, a simplified titration method was developed and validated using a semi-continuous experiment of chicken manure anaerobic digestion with various organic loading rates. The good agreement of the results obtained by this method with the GC results strongly supports the potential application of this method to VFA monitoring. Copyright © 2017. Published by Elsevier Ltd.
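
    One plausible reading of the correction idea described above is to regress the titration-GC difference against total solids and use that fit to correct new titration readings. The sketch below is an assumption-laden illustration; the arrays are placeholder calibration values, not the study's measurements, and the corrected equation is not the authors' published formula.

```python
# Hedged sketch: linear TS-based correction of titration VFA values (placeholder data).
import numpy as np

ts = np.array([2.0, 4.0, 6.0, 8.0, 10.0])            # total solids [%]
vfa_titration = np.array([1.8, 2.6, 3.5, 4.5, 5.6])  # Nordmann titration [g/L]
vfa_gc = np.array([1.7, 2.3, 3.0, 3.7, 4.5])         # gas chromatography [g/L]

# Fit the titration-GC difference as a linear function of total solids.
slope, intercept = np.polyfit(ts, vfa_titration - vfa_gc, 1)

def corrected_vfa(vfa_tit, total_solids):
    """Remove the TS-dependent bias from a new titration reading."""
    return vfa_tit - (slope * total_solids + intercept)

print(f"corrected VFA = {corrected_vfa(4.0, 7.0):.2f} g/L")
```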

  17. Laplace Inversion of Low-Resolution NMR Relaxometry Data Using Sparse Representation Methods

    PubMed Central

    Berman, Paula; Levi, Ofer; Parmet, Yisrael; Saunders, Michael; Wiesman, Zeev

    2013-01-01

    Low-resolution nuclear magnetic resonance (LR-NMR) relaxometry is a powerful tool that can be harnessed for characterizing constituents in complex materials. Conversion of the relaxation signal into a continuous distribution of relaxation components is an ill-posed inverse Laplace transform problem. The most common numerical method implemented today for dealing with this kind of problem is based on L2-norm regularization. However, sparse representation methods via L1 regularization and convex optimization are a relatively new approach for effective analysis and processing of digital images and signals. In this article, a numerical optimization method for analyzing LR-NMR data is presented; it includes non-negativity constraints and L1 regularization and applies the convex optimization solver PDCO, a primal-dual interior method for convex objectives that allows general linear constraints to be treated as linear operators. The integrated approach includes validation of analyses by simulations, testing repeatability of experiments, and validation of the model and its statistical assumptions. The proposed method provides better resolved and more accurate solutions than those suggested by existing tools. © 2013 Wiley Periodicals, Inc. Concepts Magn Reson Part A 42A: 72–88, 2013. PMID:23847452
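
    The inversion problem described above amounts to minimising ||Kx - y||^2 + lambda*||x||_1 subject to x >= 0, where K is the Laplace kernel. The sketch below illustrates that formulation with a generic bound-constrained optimiser rather than the PDCO solver used in the paper; the synthetic decay, grid sizes, and regularisation weight are placeholders.

```python
# Hedged sketch: non-negative, L1-regularized inversion of a multi-exponential decay.
import numpy as np
from scipy.optimize import minimize

t = np.linspace(0.001, 1.0, 200)        # acquisition times [s]
T2 = np.logspace(-3, 0, 60)             # candidate relaxation times [s]
K = np.exp(-t[:, None] / T2[None, :])   # Laplace kernel K[i, j] = exp(-t_i / T2_j)

rng = np.random.default_rng(0)
x_true = np.zeros(T2.size)
x_true[[20, 45]] = [1.0, 0.5]                         # two-component "sample"
y = K @ x_true + 0.01 * rng.standard_normal(t.size)   # noisy decay signal

lam = 0.05
def objective(x):
    r = K @ x - y
    return r @ r + lam * x.sum()        # with x >= 0, sum(x) equals the L1 norm
def gradient(x):
    return 2 * K.T @ (K @ x - y) + lam

res = minimize(objective, np.zeros(T2.size), jac=gradient,
               method="L-BFGS-B", bounds=[(0, None)] * T2.size)
print("recovered peaks near T2 =", T2[res.x > 0.1])
```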

  18. Development and Validation of a Sensitive Method for Trace Nickel Determination by Slotted Quartz Tube Flame Atomic Absorption Spectrometry After Dispersive Liquid-Liquid Microextraction.

    PubMed

    Yolcu, Şükran Melda; Fırat, Merve; Chormey, Dotse Selali; Büyükpınar, Çağdaş; Turak, Fatma; Bakırdere, Sezgin

    2018-05-01

    In this study, dispersive liquid-liquid microextraction was systematically optimized for the preconcentration of nickel after forming a complex with diphenylcarbazone. The measurement output of the flame atomic absorption spectrometer was further enhanced by fitting a custom-cut slotted quartz tube to the flame burner head. The extraction method increased the amount of nickel reaching the flame, and the slotted quartz tube increased the residence time of nickel atoms in the flame to record higher absorbance. The two methods combined gave about a 90-fold enhancement in sensitivity over conventional flame atomic absorption spectrometry. The optimized method was applicable over a wide linear concentration range, and it gave a detection limit of 2.1 µg L⁻¹. Low relative standard deviations at the lowest concentration in the linear calibration plot indicated high precision for both the extraction process and the instrumental measurements. A coal fly ash standard reference material (SRM 1633c) was used to determine the accuracy of the method, and the experimental results were compatible with the certified value. Spiked recovery tests were also used to validate the applicability of the method.
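
    Two routine calculations sit behind figures like the detection limit and spiked-recovery results reported above. The sketch below shows the common 3.3·sigma/slope convention for a detection limit and a simple recovery percentage; all numbers are placeholders, and the paper may have used a different detection-limit convention.

```python
# Hedged sketch: detection limit from a calibration curve and a spike recovery (placeholder data).
import numpy as np

conc = np.array([5.0, 10.0, 25.0, 50.0, 100.0])           # ug/L standards
absorbance = np.array([0.011, 0.021, 0.052, 0.105, 0.212])

slope, intercept = np.polyfit(conc, absorbance, 1)          # linear calibration
residual_sd = np.std(absorbance - (slope * conc + intercept), ddof=2)
lod = 3.3 * residual_sd / slope                             # common 3.3*sigma/slope convention
print(f"LOD ~ {lod:.2f} ug/L")

found_spiked, found_unspiked, added = 48.2, 0.9, 50.0       # ug/L, placeholder results
recovery = (found_spiked - found_unspiked) / added * 100
print(f"Spike recovery = {recovery:.1f} %")
```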

  19. Laplace Inversion of Low-Resolution NMR Relaxometry Data Using Sparse Representation Methods.

    PubMed

    Berman, Paula; Levi, Ofer; Parmet, Yisrael; Saunders, Michael; Wiesman, Zeev

    2013-05-01

    Low-resolution nuclear magnetic resonance (LR-NMR) relaxometry is a powerful tool that can be harnessed for characterizing constituents in complex materials. Conversion of the relaxation signal into a continuous distribution of relaxation components is an ill-posed inverse Laplace transform problem. The most common numerical method implemented today for dealing with this kind of problem is based on L2-norm regularization. However, sparse representation methods via L1 regularization and convex optimization are a relatively new approach for effective analysis and processing of digital images and signals. In this article, a numerical optimization method for analyzing LR-NMR data is presented; it includes non-negativity constraints and L1 regularization and applies the convex optimization solver PDCO, a primal-dual interior method for convex objectives that allows general linear constraints to be treated as linear operators. The integrated approach includes validation of analyses by simulations, testing repeatability of experiments, and validation of the model and its statistical assumptions. The proposed method provides better resolved and more accurate solutions than those suggested by existing tools. © 2013 Wiley Periodicals, Inc. Concepts Magn Reson Part A 42A: 72-88, 2013.

  20. Student-Directed Video Validation of Psychomotor Skills Performance: A Strategy to Facilitate Deliberate Practice, Peer Review, and Team Skill Sets.

    PubMed

    DeBourgh, Gregory A; Prion, Susan K

    2017-03-22

    Background: Essential nursing skills for safe practice are not limited to technical skills, but include abilities for determining salience among clinical data within dynamic practice environments, demonstrating clinical judgment and reasoning, problem-solving abilities, and teamwork competence. Effective instructional methods are needed to prepare new nurses for entry to practice in contemporary healthcare settings. Method: This mixed-methods descriptive study explored self-reported perceptions of a process of self-recording videos for psychomotor skill performance evaluation in a convenience sample of 102 pre-licensure students. Results: Students reported gains in confidence and skill acquisition from using team skills to record individual videos of skill performance, and described the importance of teamwork, peer support, and deliberate practice. Conclusion: Although time consuming, the production of student-directed video validations of psychomotor skill performance is an authentic task with meaningful accountabilities that is well received by students as an effective, satisfying learning experience that increases confidence and competence in performing psychomotor skills.
