Sample records for previously validated method

  1. Comparative assessment of three standardized robotic surgery training methods.

    PubMed

    Hung, Andrew J; Jayaratna, Isuru S; Teruya, Kara; Desai, Mihir M; Gill, Inderbir S; Goh, Alvin C

    2013-10-01

    To evaluate three standardized robotic surgery training methods, inanimate, virtual reality and in vivo, for their construct validity. To explore the concept of cross-method validity, where the relative performance of each method is compared. Robotic surgical skills were prospectively assessed in 49 participating surgeons who were classified as follows: 'novice/trainee': urology residents, previous experience <30 cases (n = 38) and 'experts': faculty surgeons, previous experience ≥30 cases (n = 11). Three standardized, validated training methods were used: (i) structured inanimate tasks; (ii) virtual reality exercises on the da Vinci Skills Simulator (Intuitive Surgical, Sunnyvale, CA, USA); and (iii) a standardized robotic surgical task in a live porcine model with performance graded by the Global Evaluative Assessment of Robotic Skills (GEARS) tool. A Kruskal-Wallis test was used to evaluate performance differences between novices and experts (construct validity). Spearman's correlation coefficient (ρ) was used to measure the association of performance across inanimate, simulation and in vivo methods (cross-method validity). Novice and expert surgeons had previously performed a median (range) of 0 (0-20) and 300 (30-2000) robotic cases, respectively (P < 0.001). Construct validity: experts consistently outperformed residents with all three methods (P < 0.001). Cross-method validity: overall performance of inanimate tasks significantly correlated with virtual reality robotic performance (ρ = -0.7, P < 0.001) and in vivo robotic performance based on GEARS (ρ = -0.8, P < 0.0001). Virtual reality performance and in vivo tissue performance were also found to be strongly correlated (ρ = 0.6, P < 0.001). We propose the novel concept of cross-method validity, which may provide a method of evaluating the relative value of various forms of skills education and assessment. We externally confirmed the construct validity of each featured training tool. 
© 2013 BJU International.
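The cross-method validity analysis in this record rests on Spearman's rank correlation between paired performance measures. A minimal self-contained sketch of that calculation (the task times and ratings below are hypothetical, not the study's data):

```python
def ranks(xs):
    # assign average ranks (1-based), handling ties
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of 1-based positions i..j
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    # Spearman's rho = Pearson correlation of the rank vectors
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# hypothetical completion times vs. expert ratings: faster times, higher ratings,
# hence a negative correlation, as with the inanimate-task scores in the record
times = [120, 95, 200, 150, 80]
scores = [4.0, 4.5, 2.0, 3.0, 5.0]
print(round(spearman(times, scores), 2))  # → -1.0
```

With real data one would normally call `scipy.stats.spearmanr`; the hand-rolled version just makes the rank-then-correlate logic explicit.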

  2. Validity in Mixed Methods Research in Education: The Application of Habermas' Critical Theory

    ERIC Educational Resources Information Center

    Long, Haiying

    2017-01-01

    The mixed methods approach has developed into the third methodological movement in educational research. Validity in mixed methods research, however, although an important issue, has not been examined as extensively as in quantitative and qualitative research. Additionally, previous discussions of validity in mixed methods research focus on research…

  3. Validation of "Teaching and Learning Guiding Principles Instrument" for Malaysian Higher Learning Institutions

    ERIC Educational Resources Information Center

    Rahman, Nurulhuda Abd; Masuwai, Azwani; Tajudin, Nor'ain Mohd; Tek, Ong Eng; Adnan, Mazlini

    2016-01-01

    Purpose: This study was aimed at establishing, through the validation of the "Teaching and Learning Guiding Principles Instrument" (TLGPI), the validity and reliability of the underlying factor structure of the Teaching and Learning Guiding Principles (TLGP) generated by a previous study. Method: A survey method was used to collect data…

  4. Modeling, implementation, and validation of arterial travel time reliability.

    DOT National Transportation Integrated Search

    2013-11-01

    Previous research funded by the Florida Department of Transportation (FDOT) developed a method for estimating travel time reliability for arterials. This method was not initially implemented or validated using field data. This project evaluated and r...

  5. Least Squares Distance Method of Cognitive Validation and Analysis for Binary Items Using Their Item Response Theory Parameters

    ERIC Educational Resources Information Center

    Dimitrov, Dimiter M.

    2007-01-01

    The validation of cognitive attributes required for correct answers on binary test items or tasks has been addressed in previous research through the integration of cognitive psychology and psychometric models using parametric or nonparametric item response theory, latent class modeling, and Bayesian modeling. All previous models, each with their…

  6. Rapid analysis of aminoglycoside antibiotics in bovine tissues using disposable pipette extraction and ultrahigh performance liquid chromatography - tandem mass spectrometry

    USDA-ARS?s Scientific Manuscript database

    A high-throughput qualitative screening and identification method for 9 aminoglycosides of regulatory interest has been developed, validated, and implemented for bovine kidney, liver, and muscle tissues. The method involves extraction at previously validated conditions, cleanup using disposable pip...

  7. Who Needs Replication?

    ERIC Educational Resources Information Center

    Porte, Graeme

    2013-01-01

    In this paper, the editor of a recent Cambridge University Press book on research methods discusses replicating previous key studies to throw more light on their reliability and generalizability. Replication research is presented as an accepted method of validating previous research by providing comparability between the original and replicated…

  8. CFD Analysis of the SBXC Glider Airframe

    DTIC Science & Technology

    2016-06-01

    …based mathematically on finite element methods. To validate and verify the methodology developed, a mathematical comparison was made with the previous research data...greater than 15 m/s. SUBJECT TERMS: finite element method, computational fluid dynamics, Y Plus, mesh element quality, aerodynamic data, fluid...

  9. Global approach for the validation of an in-line Raman spectroscopic method to determine the API content in real-time during a hot-melt extrusion process.

    PubMed

    Netchacovitch, L; Thiry, J; De Bleye, C; Dumont, E; Cailletaud, J; Sacré, P-Y; Evrard, B; Hubert, Ph; Ziemons, E

    2017-08-15

    Since the Food and Drug Administration (FDA) published a guidance based on the Process Analytical Technology (PAT) approach, real-time analyses during manufacturing processes have been expanding rapidly. In this study, in-line Raman spectroscopic analyses were performed during a Hot-Melt Extrusion (HME) process to determine the Active Pharmaceutical Ingredient (API) content in real-time. The method was validated based on a univariate and a multivariate approach, and the analytical performances of the obtained models were compared. Moreover, on the one hand, in-line data were correlated with the real API concentration present in the sample as quantified by a previously validated off-line confocal Raman microspectroscopic method. On the other hand, in-line data were also treated as a function of the concentration based on the weighing of the components in the prepared mixture. The importance of developing quantitative methods based on the use of a reference method was thus highlighted. The method was validated according to the total error approach, fixing the acceptance limits at ±15% and the α risk at 5%. This method meets the requirements of the European Pharmacopoeia norms for the uniformity of content of single-dose preparations. The validation proves that future results will fall within the acceptance limits with a previously defined probability. Finally, the in-line validated method was compared with the off-line one to demonstrate its suitability for routine analyses. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Reliability and validity of quantifying absolute muscle hardness using ultrasound elastography.

    PubMed

    Chino, Kentaro; Akagi, Ryota; Dohi, Michiko; Fukashiro, Senshi; Takahashi, Hideyuki

    2012-01-01

    Muscle hardness is a mechanical property that represents transverse muscle stiffness. A quantitative method that uses ultrasound elastography for quantifying absolute human muscle hardness has been previously devised; however, its reliability and validity have not been completely verified. This study aimed to verify the reliability and validity of this quantitative method. The Young's moduli of seven tissue-mimicking materials (in vitro; Young's modulus range, 20-80 kPa; increments of 10 kPa) and the human medial gastrocnemius muscle (in vivo) were quantified using ultrasound elastography. On the basis of the strain/Young's modulus ratio of two reference materials, one hard and one soft (Young's moduli of 7 and 30 kPa, respectively), the Young's moduli of the tissue-mimicking materials and medial gastrocnemius muscle were calculated. The intra- and inter-investigator reliability of the method was confirmed on the basis of acceptably low coefficient of variations (≤6.9%) and substantially high intraclass correlation coefficients (≥0.77) obtained from all measurements. The correlation coefficient between the Young's moduli of the tissue-mimicking materials obtained using a mechanical method and ultrasound elastography was 0.996, which was equivalent to values previously obtained using magnetic resonance elastography. The Young's moduli of the medial gastrocnemius muscle obtained using ultrasound elastography were within the range of values previously obtained using magnetic resonance elastography. The reliability and validity of the quantitative method for measuring absolute muscle hardness using ultrasound elastography were thus verified.
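The calibration step described in this record, computing an unknown Young's modulus from the strains of two reference materials of known moduli, can be sketched under the simplifying assumption that the applied stress is uniform across materials (so strain = stress / E). The strain values below are hypothetical, not the study's measurements:

```python
def youngs_modulus(strain_target, strain_refs, e_refs):
    # Under a common applied stress, strain = stress / E, so each
    # reference material yields a stress estimate; average the
    # estimates, then invert for the target's modulus.
    stresses = [s * e for s, e in zip(strain_refs, e_refs)]
    sigma = sum(stresses) / len(stresses)
    return sigma / strain_target

# hypothetical strains for the 7 kPa (soft) and 30 kPa (hard) references
e = youngs_modulus(strain_target=0.010,
                   strain_refs=[0.060, 0.014],
                   e_refs=[7.0, 30.0])
print(round(e, 1))  # → 42.0 (kPa)
```

The study's exact scaling procedure is not reproduced in the abstract; this sketch only illustrates the two-reference calibration idea.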

  11. Reliability and Validity of Quantifying Absolute Muscle Hardness Using Ultrasound Elastography

    PubMed Central

    Chino, Kentaro; Akagi, Ryota; Dohi, Michiko; Fukashiro, Senshi; Takahashi, Hideyuki

    2012-01-01

    Muscle hardness is a mechanical property that represents transverse muscle stiffness. A quantitative method that uses ultrasound elastography for quantifying absolute human muscle hardness has been previously devised; however, its reliability and validity have not been completely verified. This study aimed to verify the reliability and validity of this quantitative method. The Young’s moduli of seven tissue-mimicking materials (in vitro; Young’s modulus range, 20–80 kPa; increments of 10 kPa) and the human medial gastrocnemius muscle (in vivo) were quantified using ultrasound elastography. On the basis of the strain/Young’s modulus ratio of two reference materials, one hard and one soft (Young’s moduli of 7 and 30 kPa, respectively), the Young’s moduli of the tissue-mimicking materials and medial gastrocnemius muscle were calculated. The intra- and inter-investigator reliability of the method was confirmed on the basis of acceptably low coefficient of variations (≤6.9%) and substantially high intraclass correlation coefficients (≥0.77) obtained from all measurements. The correlation coefficient between the Young’s moduli of the tissue-mimicking materials obtained using a mechanical method and ultrasound elastography was 0.996, which was equivalent to values previously obtained using magnetic resonance elastography. The Young’s moduli of the medial gastrocnemius muscle obtained using ultrasound elastography were within the range of values previously obtained using magnetic resonance elastography. The reliability and validity of the quantitative method for measuring absolute muscle hardness using ultrasound elastography were thus verified. PMID:23029231

  12. Quantum-state anomaly detection for arbitrary errors using a machine-learning technique

    NASA Astrophysics Data System (ADS)

    Hara, Satoshi; Ono, Takafumi; Okamoto, Ryo; Washio, Takashi; Takeuchi, Shigeki

    2016-10-01

    The accurate detection of small deviations in given density matrices is important for quantum information processing; it is a difficult task because of the intrinsic fluctuation in density matrices reconstructed from a limited number of experiments. We previously proposed a method for decoherence error detection using a machine-learning technique [S. Hara, T. Ono, R. Okamoto, T. Washio, and S. Takeuchi, Phys. Rev. A 89, 022104 (2014), 10.1103/PhysRevA.89.022104]. However, the previous method is not valid when the errors are just changes in phase. Here, we propose a method that is valid for arbitrary errors in density matrices. The performance of the proposed method is verified using both numerical simulation data and real experimental data.

  13. Analytical difficulties facing today's regulatory laboratories: issues in method validation.

    PubMed

    MacNeil, James D

    2012-08-01

    The challenges facing analytical laboratories today are not unlike those faced in the past, although both the degree of complexity and the rate of change have increased. Challenges such as development and maintenance of expertise, maintenance and updating of equipment, and the introduction of new test methods have always been familiar themes for analytical laboratories, but international guidelines for laboratories involved in the import and export testing of food require management of such changes in a context which includes quality assurance, accreditation, and method validation considerations. Decisions as to when a change in a method requires re-validation of the method, or on the design of a validation scheme for a complex multi-residue method, require a well-considered strategy based on a current knowledge of international guidance documents and regulatory requirements, as well as the laboratory's quality system requirements. Validation demonstrates that a method is 'fit for purpose', so the requirement for validation should be assessed in terms of the intended use of a method and, in the case of change or modification of a method, whether that change or modification may affect a previously validated performance characteristic. In general, method validation involves method scope, calibration-related parameters, method precision, and recovery. Any method change which may affect method scope or any performance parameters will require re-validation. Some typical situations involving change in methods are discussed and a decision process proposed for selection of appropriate validation measures. © 2012 John Wiley & Sons, Ltd.
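The decision rule stated in this record, re-validate whenever a change may affect method scope or a validated performance characteristic, reduces to a simple set intersection. A minimal illustrative sketch; the parameter names are hypothetical, not taken from the guidance documents the record cites:

```python
# validated performance characteristics named in the abstract
VALIDATED_PARAMETERS = {"scope", "calibration", "precision", "recovery"}

def requires_revalidation(affected):
    # Re-validation is triggered when the set of aspects a method change
    # affects intersects the set of validated performance characteristics.
    return bool(set(affected) & VALIDATED_PARAMETERS)

print(requires_revalidation({"scope"}))          # → True
print(requires_revalidation({"documentation"}))  # → False
```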

  14. Assessment of a condition-specific quality-of-life measure for patients with developmentally absent teeth: validity and reliability testing.

    PubMed

    Akram, A J; Ireland, A J; Postlethwaite, K C; Sandy, J R; Jerreat, A S

    2013-11-01

    This article describes the process of validity and reliability testing of a condition-specific quality-of-life measure for patients with hypodontia presenting for orthodontic treatment. The development of the instrument is described in a previous article. The study was set at the Royal Devon and Exeter NHS Foundation Trust and Musgrove Park Hospital, Taunton. The child perception questionnaire was used as a standard against which to test criterion validity. The Bland and Altman method was used to check agreement between the two questionnaires. Construct validity was tested using principal component analysis on the four sections of the questionnaire. Test-retest reliability was tested using the intraclass correlation coefficient and the Bland and Altman method. Cronbach's alpha was used to test internal consistency reliability. Overall the questionnaire showed good reliability, criterion validity, and construct validity. This, together with previous evidence of good face and content validity, suggests that the instrument may prove useful in clinical practice and further research. This study has demonstrated that the newly developed condition-specific quality-of-life questionnaire is both valid and reliable for use in young patients with hypodontia. © 2013 John Wiley & Sons A/S. Published by Blackwell Publishing Ltd.
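Internal consistency reliability of the kind this record reports is conventionally computed as Cronbach's alpha: the number of items scaled by one minus the ratio of summed item variances to total-score variance. A minimal sketch with hypothetical questionnaire responses:

```python
def variance(xs):
    # sample variance (n - 1 denominator)
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(items):
    # items: one list of scores per questionnaire item, same respondents in order
    k = len(items)
    totals = [sum(col) for col in zip(*items)]  # per-respondent total score
    item_var = sum(variance(it) for it in items)
    return k / (k - 1) * (1 - item_var / variance(totals))

# hypothetical responses: 3 items rated by 4 respondents
items = [[3, 4, 2, 5],
         [2, 4, 2, 4],
         [3, 5, 1, 5]]
print(round(cronbach_alpha(items), 2))  # → 0.94
```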

  15. An Early Years Toolbox for Assessing Early Executive Function, Language, Self-Regulation, and Social Development: Validity, Reliability, and Preliminary Norms

    ERIC Educational Resources Information Center

    Howard, Steven J.; Melhuish, Edward

    2017-01-01

    Several methods of assessing executive function (EF), self-regulation, language development, and social development in young children have been developed over previous decades. Yet new technologies make available methods of assessment not previously considered. In resolving conceptual and pragmatic limitations of existing tools, the Early Years…

  16. Gold-standard evaluation of a folksonomy-based ontology learning model

    NASA Astrophysics Data System (ADS)

    Djuana, E.

    2018-03-01

    Folksonomy, a product of the collaborative tagging process, has been acknowledged for its potential to improve the categorization and searching of web resources. However, folksonomies contain ambiguities such as synonymy and polysemy, as well as differing levels of abstraction (the generality problem). To maximize their potential, methods have been proposed for associating folksonomy tags with semantics and structural relationships, such as ontology learning. This paper evaluates our previous work on ontology learning using a gold-standard evaluation approach, comparing it with a notable state-of-the-art work and several baselines. The results show that our method is comparable to the state-of-the-art work, further validating an approach that had previously been validated using a task-based evaluation.

  17. Assessment of the Validity of the Research Diagnostic Criteria for Temporomandibular Disorders: Overview and Methodology

    PubMed Central

    Schiffman, Eric L.; Truelove, Edmond L.; Ohrbach, Richard; Anderson, Gary C.; John, Mike T.; List, Thomas; Look, John O.

    2011-01-01

    AIMS The purpose of the Research Diagnostic Criteria for Temporomandibular Disorders (RDC/TMD) Validation Project was to assess the diagnostic validity of this examination protocol. An overview is presented, including Axis I and II methodology and descriptive statistics for the study participant sample. This paper details the development of reliable methods to establish the reference standards for assessing criterion validity of the Axis I RDC/TMD diagnoses. Validity testing for the Axis II biobehavioral instruments was based on previously validated reference standards. METHODS The Axis I reference standards were based on the consensus of 2 criterion examiners independently performing a comprehensive history, clinical examination, and evaluation of imaging. Intersite reliability was assessed annually for criterion examiners and radiologists. Criterion exam reliability was also assessed within study sites. RESULTS Study participant demographics were comparable to those of participants in previous studies using the RDC/TMD. Diagnostic agreement of the criterion examiners with each other and with the consensus-based reference standards was excellent with all kappas ≥ 0.81, except for osteoarthrosis (moderate agreement, k = 0.53). Intrasite criterion exam agreement with reference standards was excellent (k ≥ 0.95). Intersite reliability of the radiologists for detecting computed tomography-disclosed osteoarthrosis and magnetic resonance imaging-disclosed disc displacement was good to excellent (k = 0.71 and 0.84, respectively). CONCLUSION The Validation Project study population was appropriate for assessing the reliability and validity of the RDC/TMD Axis I and II. The reference standards used to assess the validity of Axis I TMD were based on reliable and clinically credible methods. PMID:20213028

  18. Assessing Discriminative Performance at External Validation of Clinical Prediction Models

    PubMed Central

    Nieboer, Daan; van der Ploeg, Tjeerd; Steyerberg, Ewout W.

    2016-01-01

    Introduction External validation studies are essential to study the generalizability of prediction models. Recently a permutation test, focusing on discrimination as quantified by the c-statistic, was proposed to judge whether a prediction model is transportable to a new setting. We aimed to evaluate this test and compare it to previously proposed procedures to judge any changes in c-statistic from development to external validation setting. Methods We compared the use of the permutation test to the use of benchmark values of the c-statistic following from a previously proposed framework to judge transportability of a prediction model. In a simulation study we developed a prediction model with logistic regression on a development set and validated it in the validation set. We concentrated on two scenarios: 1) the case-mix was more heterogeneous and predictor effects were weaker in the validation set compared to the development set, and 2) the case-mix was less heterogeneous in the validation set and predictor effects were identical in the validation and development set. Furthermore we illustrated the methods in a case study using 15 datasets of patients suffering from traumatic brain injury. Results The permutation test indicated that the validation and development set were homogeneous in scenario 1 (in almost all simulated samples) and heterogeneous in scenario 2 (in 17%-39% of simulated samples). Previously proposed benchmark values of the c-statistic and the standard deviation of the linear predictors correctly pointed at the more heterogeneous case-mix in scenario 1 and the less heterogeneous case-mix in scenario 2. Conclusion The recently proposed permutation test may provide misleading results when externally validating prediction models in the presence of case-mix differences between the development and validation population.
To correctly interpret the c-statistic found at external validation it is crucial to disentangle case-mix differences from incorrect regression coefficients. PMID:26881753
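The c-statistic this record centres on is the probability that a randomly chosen event receives a higher predicted risk than a randomly chosen non-event (ties counting half). A minimal sketch with hypothetical predictions and outcomes:

```python
def c_statistic(preds, outcomes):
    # concordance over all (event, non-event) pairs; ties count 0.5
    events = [p for p, y in zip(preds, outcomes) if y == 1]
    nonevents = [p for p, y in zip(preds, outcomes) if y == 0]
    concordant = 0.0
    for e in events:
        for n in nonevents:
            if e > n:
                concordant += 1.0
            elif e == n:
                concordant += 0.5
    return concordant / (len(events) * len(nonevents))

# hypothetical predicted risks and observed binary outcomes
preds = [0.9, 0.8, 0.6, 0.4, 0.2]
outcomes = [1, 1, 0, 1, 0]
print(round(c_statistic(preds, outcomes), 2))  # → 0.83 (i.e. 5/6)
```

For binary outcomes this is identical to the area under the ROC curve, which is why external-validation studies often report the two interchangeably.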

  19. A Diagnostic Marker to Discriminate Childhood Apraxia of Speech from Speech Delay: IV. the Pause Marker Index

    ERIC Educational Resources Information Center

    Shriberg, Lawrence D.; Strand, Edythe A.; Fourakis, Marios; Jakielski, Kathy J.; Hall, Sheryl D.; Karlsson, Heather B.; Mabie, Heather L.; McSweeny, Jane L.; Tilkens, Christie M.; Wilson, David L.

    2017-01-01

    Purpose: Three previous articles provided rationale, methods, and several forms of validity support for a diagnostic marker of childhood apraxia of speech (CAS), termed the pause marker (PM). Goals of the present article were to assess the validity and stability of the PM Index (PMI) to scale CAS severity. Method: PM scores and speech, prosody,…

  20. Validity and reliability of bioelectrical impedance analysis and skinfold thickness in predicting body fat in military personnel.

    PubMed

    Aandstad, Anders; Holtberget, Kristian; Hageberg, Rune; Holme, Ingar; Anderssen, Sigmund A

    2014-02-01

    Previous studies show that body composition is related to injury risk and physical performance in soldiers. Thus, valid methods for measuring body composition in military personnel are needed. The frequently used body mass index method is not a valid measure of body composition in soldiers, but the reliability and validity of alternative field methods are less investigated in military personnel. Thus, we carried out test and retest of skinfold (SKF), single frequency bioelectrical impedance analysis (SF-BIA), and multifrequency bioelectrical impedance analysis measurements in 65 male and female soldiers. Several validated equations were used to predict percent body fat from these methods. Dual-energy X-ray absorptiometry was also measured, and acted as the criterion method. Results showed that SF-BIA was the most reliable method in both genders. In women, SF-BIA was also the most valid method, whereas SKF or a combination of SKF and SF-BIA produced the highest validity in men. Reliability and validity varied substantially among the equations examined. The best methods and equations produced test-retest 95% limits of agreement below ±1% points, whereas the corresponding validity figures were ±3.5% points. Each investigator and practitioner must consider whether such measurement errors are acceptable for their specific use. Reprint & Copyright © 2014 Association of Military Surgeons of the U.S.
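The test-retest 95% limits of agreement cited in this record follow the standard Bland-Altman form: mean difference between occasions, plus and minus 1.96 standard deviations of the differences. A minimal sketch with hypothetical percent-body-fat measurements (not the study's data):

```python
def limits_of_agreement(test, retest):
    # Bland-Altman 95% limits: mean difference ± 1.96 * SD of differences
    diffs = [a - b for a, b in zip(test, retest)]
    n = len(diffs)
    mean = sum(diffs) / n
    sd = (sum((d - mean) ** 2 for d in diffs) / (n - 1)) ** 0.5
    return mean - 1.96 * sd, mean + 1.96 * sd

# hypothetical %-fat readings on the same subjects, two occasions
test = [18.2, 22.5, 15.1, 27.8, 20.3]
retest = [18.0, 22.9, 15.3, 27.5, 20.4]
lo, hi = limits_of_agreement(test, retest)
print(round(lo, 2), round(hi, 2))
```

A narrow interval around zero (as with the record's best methods, within ±1 percentage point) indicates that a retest would rarely differ from the first measurement by more than that amount.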

  1. Discriminant Validity Assessment: Use of Fornell & Larcker criterion versus HTMT Criterion

    NASA Astrophysics Data System (ADS)

    Hamid, M. R. Ab; Sami, W.; Mohmad Sidek, M. H.

    2017-09-01

    Assessment of discriminant validity is a must in any research that involves latent variables, for the prevention of multicollinearity issues. The Fornell and Larcker criterion is the most widely used method for this purpose. However, a new method for establishing discriminant validity has emerged: the heterotrait-monotrait (HTMT) ratio of correlations. This article therefore presents the results of discriminant validity assessment using both methods. Data from a previous study, involving 429 respondents, were used for empirical validation of a value-based excellence model in higher education institutions (HEI) in Malaysia. From the analysis, the convergent, divergent and discriminant validity were established and admissible using the Fornell and Larcker criterion. However, discriminant validity is an issue when employing the HTMT criterion. This shows that the latent variables under study faced the issue of multicollinearity and should be examined in further detail. It also implies that the HTMT criterion is a stringent measure that can detect a possible lack of discrimination among the latent variables. In conclusion, the instrument, which consisted of six latent variables, was still lacking in terms of discriminant validity and should be explored further.
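The HTMT ratio this record applies compares the average correlation between items of different constructs against the geometric mean of the average within-construct item correlations. A minimal two-construct sketch; the item correlations below are hypothetical, and the 0.85-0.90 cut-offs mentioned in the comment are the commonly cited thresholds, not values from this record:

```python
def htmt(r_within_a, r_within_b, r_between):
    # r_within_a / r_within_b: correlations among items of the same construct
    # r_between: correlations between items of construct A and construct B
    mean = lambda xs: sum(xs) / len(xs)
    return mean(r_between) / (mean(r_within_a) * mean(r_within_b)) ** 0.5

# hypothetical item correlations for two latent variables
within_a = [0.70, 0.65, 0.68]
within_b = [0.72, 0.69, 0.75]
between = [0.60, 0.66, 0.63, 0.62, 0.65, 0.61, 0.66, 0.64, 0.60]
ratio = htmt(within_a, within_b, between)
print(round(ratio, 2))  # ratios above the ~0.85-0.90 thresholds flag a problem
```

Here the between-construct correlations are nearly as strong as the within-construct ones, so the ratio lands above 0.85: exactly the kind of discriminant-validity failure the record describes HTMT catching where Fornell-Larcker did not.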

  2. A novel validation and calibration method for motion capture systems based on micro-triangulation.

    PubMed

    Nagymáté, Gergely; Tuchband, Tamás; Kiss, Rita M

    2018-06-06

    Motion capture systems are widely used to measure human kinematics. Nevertheless, users must consider system errors when evaluating their results. Most validation techniques for these systems are based on relative distance and displacement measurements. In contrast, our study aimed to analyse the absolute volume accuracy of optical motion capture systems by means of engineering surveying reference measurement of the marker coordinates (uncertainty: 0.75 mm). The method is exemplified on an 18-camera OptiTrack Flex13 motion capture system. The absolute accuracy was defined by the root mean square error (RMSE) between the coordinates measured by the camera system and by engineering surveying (micro-triangulation). The original RMSE of 1.82 mm, caused by scaling error, was reduced to 0.77 mm, while the correlation of errors with their distance from the origin fell from 0.855 to 0.209. A more easily applied but less accurate absolute accuracy compensation method, using a tape measure over large distances, was also tested; it produced scaling compensation similar to that of the surveying method or of direct wand-size compensation by a high-precision 3D scanner. The presented validation methods can be less precise in some respects than previous techniques, but they address an error type which has not been and cannot be studied with the previous validation methods. Copyright © 2018 Elsevier Ltd. All rights reserved.
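The absolute-accuracy figure in this record is an RMSE over marker coordinates: the root mean square of the point-to-point Euclidean distances between camera-measured and surveyed positions. A minimal sketch with hypothetical coordinates (not the study's measurements):

```python
def rmse_3d(measured, reference):
    # root mean square of point-to-point Euclidean distances
    sq = [sum((m - r) ** 2 for m, r in zip(pm, pr))
          for pm, pr in zip(measured, reference)]
    return (sum(sq) / len(sq)) ** 0.5

# hypothetical marker coordinates in mm: camera system vs. surveying reference
cam = [(100.8, 200.1, 50.0), (301.0, 149.4, 75.6)]
ref = [(100.0, 200.0, 50.5), (300.0, 150.0, 75.0)]
print(round(rmse_3d(cam, ref), 2))  # → 1.14
```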

  3. JaCVAM-organized international validation study of the in vivo rodent alkaline comet assay for the detection of genotoxic carcinogens: I. Summary of pre-validation study results.

    PubMed

    Uno, Yoshifumi; Kojima, Hajime; Omori, Takashi; Corvi, Raffaella; Honma, Masamistu; Schechtman, Leonard M; Tice, Raymond R; Burlinson, Brian; Escobar, Patricia A; Kraynak, Andrew R; Nakagawa, Yuzuki; Nakajima, Madoka; Pant, Kamala; Asano, Norihide; Lovell, David; Morita, Takeshi; Ohno, Yasuo; Hayashi, Makoto

    2015-07-01

    The in vivo rodent alkaline comet assay (comet assay) is used internationally to investigate the in vivo genotoxic potential of test chemicals. This assay, however, has not previously been formally validated. The Japanese Center for the Validation of Alternative Methods (JaCVAM), with the cooperation of the U.S. NTP Interagency Center for the Evaluation of Alternative Toxicological Methods (NICEATM)/the Interagency Coordinating Committee on the Validation of Alternative Methods (ICCVAM), the European Centre for the Validation of Alternative Methods (ECVAM), and the Japanese Environmental Mutagen Society/Mammalian Mutagenesis Study Group (JEMS/MMS), organized an international validation study to evaluate the reliability and relevance of the assay for identifying genotoxic carcinogens, using liver and stomach as target organs. The ultimate goal of this validation effort was to establish an Organisation for Economic Co-operation and Development (OECD) test guideline. The purpose of the pre-validation studies (i.e., Phase 1 through 3), conducted in four or five laboratories with extensive comet assay experience, was to optimize the protocol to be used during the definitive validation study. Copyright © 2015 Elsevier B.V. All rights reserved.

  4. Validation of an asthma questionnaire for use in healthcare workers

    PubMed Central

    Delclos, G L; Arif, A A; Aday, L; Carson, A; Lai, D; Lusk, C; Stock, T; Symanski, E; Whitehead, L W; Benavides, F G; Antó, J M

    2006-01-01

    Background Previous studies have described increased occurrence of asthma among healthcare workers, but to our knowledge there are no validated survey questionnaires with which to study this occupational group. Aims To develop, validate, and refine a new survey instrument on asthma for use in epidemiological studies of healthcare workers. Methods An initial draft questionnaire, designed by a multidisciplinary team, used previously validated questions where possible; the occupational exposure section was developed by updating health services specific chemical lists through hospital walk‐through surveys and review of material safety data sheets. A cross‐sectional validation study was conducted in 118 non‐smoking subjects, who also underwent bronchial challenge testing, an interview with an industrial hygienist, and measurement of specific IgE antibodies to common aeroallergens. Results The final version consisted of 43 main questions in four sections. Time to completion of the questionnaire ranged from 13 to 25 minutes. Test–retest reliability of asthma and allergy items ranged from 75% to 94%, and internal consistency for these items was excellent (Cronbach's α ⩾ 0.86). Against methacholine challenge, an eight-item combination of asthma-related symptoms had a sensitivity of 71% and specificity of 70%; against a physician diagnosis of asthma, this same combination showed a sensitivity of 79% and specificity of 98%. Agreement between self‐reported exposures and industrial hygienist review was similar to previous studies and only moderate, indicating the need to incorporate more reliable methods of exposure assessment. Against the aeroallergen panel, the best combinations of sensitivity and specificity were obtained for a history of allergies to dust, dust mite, and animals. 
Conclusions Initial evaluation of this new questionnaire indicates good validity and reliability, and further field testing and cross‐validation in a larger healthcare worker population is in progress. The need for development of more reliable occupational exposure assessment methods that go beyond self‐report is underscored. PMID:16497858
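The sensitivity and specificity figures this record reports against the methacholine challenge come from a standard 2x2 comparison of questionnaire result versus reference test. A minimal sketch with hypothetical questionnaire and challenge results (not the study's data):

```python
def sens_spec(test_positive, condition_positive):
    # 2x2 comparison of a screening result against a reference standard
    pairs = list(zip(test_positive, condition_positive))
    tp = sum(1 for t, c in pairs if t and c)          # true positives
    fn = sum(1 for t, c in pairs if not t and c)      # false negatives
    tn = sum(1 for t, c in pairs if not t and not c)  # true negatives
    fp = sum(1 for t, c in pairs if t and not c)      # false positives
    return tp / (tp + fn), tn / (tn + fp)

# hypothetical symptom-combination results vs. methacholine challenge
questionnaire = [1, 1, 1, 0, 1, 0, 0, 0, 1, 0]
challenge     = [1, 1, 1, 1, 0, 0, 0, 0, 0, 1]
sens, spec = sens_spec(questionnaire, challenge)
print(sens, spec)  # → 0.6 0.6
```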

  5. Validity of Evidence-Derived Criteria for Reactive Attachment Disorder: Indiscriminately Social/Disinhibited and Emotionally Withdrawn/Inhibited Types

    ERIC Educational Resources Information Center

    Gleason, Mary Margaret; Fox, Nathan A.; Drury, Stacy; Smyke, Anna; Egger, Helen L.; Nelson, Charles A., III; Gregas, Matthew C.; Zeanah, Charles H.

    2011-01-01

    Objective: This study examined the validity of criteria for indiscriminately social/disinhibited and emotionally withdrawn/inhibited reactive attachment disorder (RAD). Method: As part of a longitudinal intervention trial of previously institutionalized children, caregiver interviews and direct observational measurements provided continuous and…

  6. A gas-kinetic BGK scheme for the compressible Navier-Stokes equations

    NASA Technical Reports Server (NTRS)

    Xu, Kun

    2000-01-01

    This paper presents an improved gas-kinetic scheme based on the Bhatnagar-Gross-Krook (BGK) model for the compressible Navier-Stokes equations. The current method extends the previous gas-kinetic Navier-Stokes solver developed by Xu and Prendergast by implementing a general nonequilibrium state to represent the gas distribution function at the beginning of each time step. As a result, the restriction of the previous scheme, namely that the particle collision time must be less than the time step for the BGK Navier-Stokes solution to be valid, is removed. The applicable regime of the current method is therefore greatly enlarged, and the Navier-Stokes solution can be obtained accurately regardless of the ratio between the collision time and the time step. The gas-kinetic Navier-Stokes solver developed by Chou and Baganoff is the limiting case of the current method, valid only under that limiting condition. This paper also presents the appropriate implementation of boundary conditions for the kinetic scheme, different kinetic limiting cases, and the Prandtl number fix. The connection among artificial dissipative central schemes, Godunov-type schemes, and the gas-kinetic BGK method is discussed. Many numerical tests are included to validate the current method.
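
    The significance of removing the collision-time restriction can be seen in a toy relaxation equation df/dt = (g - f)/τ, which mimics the BGK collision term: a forward-Euler update diverges once the time step exceeds the relaxation scale, whereas the exact exponential update (conceptually what gas-kinetic schemes use) is stable for any ratio Δt/τ. This is an illustrative toy model, not the scheme itself:

```python
import math

def relax_explicit(f, g, tau, dt):
    # forward-Euler update of df/dt = (g - f)/tau; monotone only for dt < 2*tau
    return f + dt / tau * (g - f)

def relax_exact(f, g, tau, dt):
    # exact integration over one step: unconditionally stable for any dt/tau
    return g + (f - g) * math.exp(-dt / tau)
```

With dt = 3τ, the explicit step overshoots the equilibrium value g, while the exact step always decays monotonically toward it.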

  7. Valid methods: the quality assurance of test method development, validation, approval, and transfer for veterinary testing laboratories.

    PubMed

    Wiegers, Ann L

    2003-07-01

    Third-party accreditation is a valuable tool to demonstrate a laboratory's competence to conduct testing. Accreditation, internationally and in the United States, has been discussed previously. However, accreditation is only one part of establishing data credibility. A validated test method is the first component of a valid measurement system. Validation is defined as confirmation by examination and the provision of objective evidence that the particular requirements for a specific intended use are fulfilled. The international and national standard ISO/IEC 17025 recognizes the importance of validated methods and requires that laboratory-developed methods or methods adopted by the laboratory be appropriate for the intended use. Validated methods are therefore required and their use agreed to by the client (i.e., end users of the test results such as veterinarians, animal health programs, and owners). ISO/IEC 17025 also requires that the introduction of methods developed by the laboratory for its own use be a planned activity conducted by qualified personnel with adequate resources. This article discusses considerations and recommendations for the conduct of veterinary diagnostic test method development, validation, evaluation, approval, and transfer to the user laboratory in the ISO/IEC 17025 environment. These recommendations are based on those of nationally and internationally accepted standards and guidelines, as well as those of reputable and experienced technical bodies. They are also based on the author's experience in the evaluation of method development and transfer projects, validation data, and the implementation of quality management systems in the area of method development.

  8. Using cluster ensemble and validation to identify subtypes of pervasive developmental disorders.

    PubMed

    Shen, Jess J; Lee, Phil-Hyoun; Holden, Jeanette J A; Shatkay, Hagit

    2007-10-11

    Pervasive Developmental Disorders (PDD) are neurodevelopmental disorders characterized by impairments in social interaction, communication and behavior. Given the diversity and varying severity of PDD, diagnostic tools attempt to identify homogeneous subtypes within PDD. Identifying subtypes can lead to targeted etiology studies and to effective type-specific intervention. Cluster analysis can suggest coherent subsets in data; however, different methods and assumptions lead to different results. Several previous studies applied clustering to PDD data, varying in number and characteristics of the produced subtypes. Most studies used a relatively small dataset (fewer than 150 subjects), and all applied only a single clustering method. Here we study a relatively large dataset (358 PDD patients), using an ensemble of three clustering methods. The results are evaluated using several validation methods, and consolidated through an integration step. Four clusters are identified, analyzed and compared to subtypes previously defined by the widely used diagnostic tool DSM-IV.
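
    A common way to consolidate an ensemble of clusterings is a co-association matrix: for each pair of subjects, count the fraction of methods that place them in the same cluster, then re-cluster that matrix. The following is a minimal sketch of the co-association step only; the integration procedure actually used in this study may differ:

```python
def co_association(labelings, n):
    """labelings: list of cluster-label lists (one per clustering method) over n subjects.
    Returns an n x n matrix of pairwise same-cluster frequencies in [0, 1]."""
    m = [[0.0] * n for _ in range(n)]
    for labels in labelings:
        for i in range(n):
            for j in range(n):
                if labels[i] == labels[j]:
                    m[i][j] += 1.0 / len(labelings)
    return m
```

Pairs with co-association near 1 are consistently grouped together by all methods and form the stable cores of candidate subtypes.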

  9. Using Cluster Ensemble and Validation to Identify Subtypes of Pervasive Developmental Disorders

    PubMed Central

    Shen, Jess J.; Lee, Phil Hyoun; Holden, Jeanette J.A.; Shatkay, Hagit

    2007-01-01

    Pervasive Developmental Disorders (PDD) are neurodevelopmental disorders characterized by impairments in social interaction, communication and behavior. Given the diversity and varying severity of PDD, diagnostic tools attempt to identify homogeneous subtypes within PDD. Identifying subtypes can lead to targeted etiology studies and to effective type-specific intervention. Cluster analysis can suggest coherent subsets in data; however, different methods and assumptions lead to different results. Several previous studies applied clustering to PDD data, varying in number and characteristics of the produced subtypes. Most studies used a relatively small dataset (fewer than 150 subjects), and all applied only a single clustering method. Here we study a relatively large dataset (358 PDD patients), using an ensemble of three clustering methods. The results are evaluated using several validation methods, and consolidated through an integration step. Four clusters are identified, analyzed and compared to subtypes previously defined by the widely used diagnostic tool DSM-IV. PMID:18693920

  10. Developing and validating a nutrition knowledge questionnaire: key methods and considerations.

    PubMed

    Trakman, Gina Louise; Forsyth, Adrienne; Hoye, Russell; Belski, Regina

    2017-10-01

    To outline key statistical considerations and detailed methodologies for the development and evaluation of a valid and reliable nutrition knowledge questionnaire. Literature on questionnaire development in a range of fields was reviewed and a set of evidence-based guidelines specific to the creation of a nutrition knowledge questionnaire have been developed. The recommendations describe key qualitative methods and statistical considerations, and include relevant examples from previous papers and existing nutrition knowledge questionnaires. Where details have been omitted for the sake of brevity, the reader has been directed to suitable references. We recommend an eight-step methodology for nutrition knowledge questionnaire development as follows: (i) definition of the construct and development of a test plan; (ii) generation of the item pool; (iii) choice of the scoring system and response format; (iv) assessment of content validity; (v) assessment of face validity; (vi) purification of the scale using item analysis, including item characteristics, difficulty and discrimination; (vii) evaluation of the scale including its factor structure and internal reliability, or Rasch analysis, including assessment of dimensionality and internal reliability; and (viii) gathering of data to re-examine the questionnaire's properties, assess temporal stability and confirm construct validity. Several of these methods have previously been overlooked. The measurement of nutrition knowledge is an important consideration for individuals working in the nutrition field. Improved methods in the development of nutrition knowledge questionnaires, such as the use of factor analysis or Rasch analysis, will enable more confidence in reported measures of nutrition knowledge.
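
    Step (vi) of the recommended methodology, item analysis, typically rests on two per-item statistics: difficulty (the proportion answering correctly) and discrimination (the correlation of the item with the rest of the scale). A minimal sketch with made-up 0/1 responses, not data from any cited questionnaire:

```python
def item_difficulty(item_scores):
    # proportion of respondents answering the item correctly (0/1 scoring)
    return sum(item_scores) / len(item_scores)

def item_discrimination(item_scores, rest_scores):
    # Pearson correlation between the item score and the total of the remaining items
    n = len(item_scores)
    mx = sum(item_scores) / n
    my = sum(rest_scores) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(item_scores, rest_scores))
    vx = sum((x - mx) ** 2 for x in item_scores)
    vy = sum((y - my) ** 2 for y in rest_scores)
    return cov / (vx * vy) ** 0.5
```

Items with difficulty near 0 or 1, or with low discrimination, are candidates for removal during scale purification.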

  11. Large scale study of multiple-molecule queries

    PubMed Central

    2009-01-01

    Background In ligand-based screening, as well as in other chemoinformatics applications, one seeks to effectively search large repositories of molecules in order to retrieve molecules that are typically similar to a single lead molecule. However, in some cases, multiple molecules from the same family are available to seed the query and search for other members of the same family. Multiple-molecule query methods have been less studied than single-molecule query methods. Furthermore, the previous studies have relied on proprietary data and sometimes have not used proper cross-validation methods to assess the results. In contrast, here we develop and compare multiple-molecule query methods using several large publicly available data sets and background sets. We also create a framework based on a strict cross-validation protocol to allow unbiased benchmarking for direct comparison in future studies across several performance metrics. Results Fourteen different multiple-molecule query methods were defined and benchmarked using: (1) 41 publicly available data sets of related molecules with similar biological activity; and (2) publicly available background data sets consisting of up to 175,000 molecules randomly extracted from the ChemDB database and other sources. Eight of the fourteen methods were parameter free, and six of them fit one or two free parameters to the data using a careful cross-validation protocol. All the methods were assessed and compared for their ability to retrieve members of the same family against the background data set by using several performance metrics including the Area Under the Accumulation Curve (AUAC), Area Under the Curve (AUC), F1-measure, and BEDROC metrics. Consistent with the previous literature, the best parameter-free methods are the MAX-SIM and MIN-RANK methods, which score a molecule to a family by the maximum similarity, or minimum ranking, obtained across the family.
Two new parameterized methods introduced in this study, the Exponential Tanimoto Discriminant (ETD) and the Tanimoto Power Discriminant (TPD), together with the previously defined Binary Kernel Discriminant (BKD), outperform most other methods but are more complex, requiring one or two parameters to be fit to the data. Conclusion Fourteen methods for multiple-molecule querying of chemical databases, including the novel methods ETD and TPD, are validated using publicly available data sets, standard cross-validation protocols, and established metrics. The best results are obtained with ETD, TPD, BKD, MAX-SIM, and MIN-RANK. These results can be replicated and compared with the results of future studies using data freely downloadable from http://cdb.ics.uci.edu/. PMID:20298525
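
    The MAX-SIM rule described above scores a candidate molecule by its maximum Tanimoto similarity to any member of the query family. A minimal sketch over binary fingerprints represented as Python sets of "on" bit indices; this is illustrative, not the paper's implementation:

```python
def tanimoto(a, b):
    # |A intersect B| / |A union B| for binary fingerprints given as sets of on-bits
    return len(a & b) / len(a | b)

def max_sim(candidate, family):
    # MAX-SIM: score the candidate by its best match across the query family
    return max(tanimoto(candidate, fp) for fp in family)
```

Ranking a background database by `max_sim` against the family, then measuring how early true family members appear, is what metrics such as AUAC and BEDROC quantify.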

  12. Seeking a Valid Gold Standard for an Innovative, Dialect-Neutral Language Test

    ERIC Educational Resources Information Center

    Pearson, Barbara Zurer; Jackson, Janice E.; Wu, Haotian

    2014-01-01

    Purpose: In this study, the authors explored alternative gold standards to validate an innovative, dialect-neutral language assessment. Method: Participants were 78 African American children, ages 5;0 (years;months) to 6;11. Twenty participants had previously been identified as having language impairment. The Diagnostic Evaluation of Language…

  13. Development and Validation of Rapid Assessment Indices for Condition of Coastal Wetlands in Southern New England USA

    EPA Science Inventory

    The goals of this study were to develop and validate a Rapid Assessment Method (RAM) for assessing the condition of coastal wetlands in New England, USA. Eighty-one coastal wetland sites were assessed; nested within these were ten reference sites which were previously assessed us...

  14. Time series modeling of human operator dynamics in manual control tasks

    NASA Technical Reports Server (NTRS)

    Biezad, D. J.; Schmidt, D. K.

    1984-01-01

    A time-series technique is presented for identifying the dynamic characteristics of the human operator in manual control tasks from relatively short records of experimental data. Control of system excitation signals used in the identification is not required. The approach is a multi-channel identification technique for modeling multi-input/multi-output situations. The method presented includes statistical tests for validity, is designed for digital computation, and yields estimates for the frequency responses of the human operator. A comprehensive relative power analysis may also be performed for validated models. This method is applied to several sets of experimental data; the results are discussed and shown to compare favorably with previous research findings. New results are also presented for a multi-input task that has not been previously modeled to demonstrate the strengths of the method.
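
    In its simplest single-input, single-output form, time-series identification of this kind reduces to least-squares regression of the current output on lagged outputs and inputs (an ARX model). A first-order toy sketch, much simpler than the multi-channel technique of the paper:

```python
def fit_arx1(u, y):
    """Fit y[t] = a*y[t-1] + b*u[t-1] by least squares via the 2x2 normal equations."""
    s_yy = s_uu = s_yu = s_y1y = s_u1y = 0.0
    for t in range(1, len(y)):
        y1, u1 = y[t - 1], u[t - 1]
        s_yy += y1 * y1
        s_uu += u1 * u1
        s_yu += y1 * u1
        s_y1y += y1 * y[t]
        s_u1y += u1 * y[t]
    det = s_yy * s_uu - s_yu * s_yu
    a = (s_y1y * s_uu - s_u1y * s_yu) / det
    b = (s_u1y * s_yy - s_y1y * s_yu) / det
    return a, b
```

On noise-free data generated by such a model the true parameters are recovered exactly; with real operator data, residual statistics provide the kind of validity tests the abstract mentions.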

  15. Time Series Modeling of Human Operator Dynamics in Manual Control Tasks

    NASA Technical Reports Server (NTRS)

    Biezad, D. J.; Schmidt, D. K.

    1984-01-01

    A time-series technique is presented for identifying the dynamic characteristics of the human operator in manual control tasks from relatively short records of experimental data. Control of system excitation signals used in the identification is not required. The approach is a multi-channel identification technique for modeling multi-input/multi-output situations. The method presented includes statistical tests for validity, is designed for digital computation, and yields estimates for the frequency response of the human operator. A comprehensive relative power analysis may also be performed for validated models. This method is applied to several sets of experimental data; the results are discussed and shown to compare favorably with previous research findings. New results are also presented for a multi-input task that has not been previously modeled to demonstrate the strengths of the method.

  16. Validation of an optimized method for the determination of iodine in human breast milk by inductively coupled plasma mass spectrometry (ICPMS) after tetramethylammonium hydroxide extraction.

    PubMed

    Huynh, Dao; Zhou, Shao Jia; Gibson, Robert; Palmer, Lyndon; Muhlhausler, Beverly

    2015-01-01

    In this study a novel method to determine iodine concentrations in human breast milk was developed and validated. The iodine was analyzed by inductively coupled plasma mass spectrometry (ICPMS) following tetramethylammonium hydroxide (TMAH) extraction at 90°C in disposable polypropylene tubes. While similar approaches have been used previously, this method adopted a shorter extraction time (1 h vs. 3 h) and used antimony (Sb) as the internal standard, which exhibited greater stability in breast milk and milk powder matrices compared to tellurium (Te). Method validation included: defining iodine linearity up to 200 μg L⁻¹; confirming recovery of iodine from NIST 1549 milk powder. A recovery of 94-98% was also achieved for the NIST 1549 milk powder and human breast milk samples spiked with sodium iodide and thyroxine (T4) solutions. The method quantitation limit (MQL) for human breast milk was 1.6 μg L⁻¹. The intra-assay and inter-assay coefficient of variation for the breast milk samples and NIST powder were <1% and <3.5%, respectively. NIST 1549 milk powder, human breast milk samples and calibration standards spiked with the internal standard were all stable for at least 2.5 months after extraction. The results of the validation process confirmed that this newly developed method provides greater accuracy and precision in the assessment of iodine concentrations in human breast milk than previous methods and therefore offers a more reliable approach for assessing iodine concentrations in human breast milk. Copyright © 2014 Elsevier GmbH. All rights reserved.

  17. Determination of Chondroitin Sulfate Content in Raw Materials and Dietary Supplements by High-Performance Liquid Chromatography with UV Detection After Enzymatic Hydrolysis: Single-Laboratory Validation First Action 2015.11.

    PubMed

    Brunelle, Sharon L

    2016-01-01

    A previously validated method for determination of chondroitin sulfate in raw materials and dietary supplements was submitted to the AOAC Expert Review Panel (ERP) for Stakeholder Panel on Dietary Supplements Set 1 Ingredients (Anthocyanins, Chondroitin, and PDE5 Inhibitors) for consideration of First Action Official Methods(SM) status. The ERP evaluated the single-laboratory validation results against AOAC Standard Method Performance Requirements 2014.009. With recoveries of 100.8-101.6% in raw materials and 105.4-105.8% in finished products and precision of 0.25-1.8% RSDr within-day and 1.6-4.72% RSDr overall, the ERP adopted the method for First Action Official Methods status and provided recommendations for achieving Final Action status.
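
    The recovery and repeatability (RSDr) figures cited in this record are computed straightforwardly from replicate measurements; a minimal sketch using hypothetical replicates, not the submitted validation data:

```python
def recovery_pct(measured_mean, expected):
    # mean measured value as a percentage of the known/spiked value
    return 100.0 * measured_mean / expected

def rsd_pct(replicates):
    # relative standard deviation (%) of replicate measurements
    n = len(replicates)
    mean = sum(replicates) / n
    sd = (sum((x - mean) ** 2 for x in replicates) / (n - 1)) ** 0.5
    return 100.0 * sd / mean
```

Within-day RSDr uses replicates from a single day; the overall figure pools replicates across days.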

  18. Factor Analysis Methods and Validity Evidence: A Systematic Review of Instrument Development across the Continuum of Medical Education

    ERIC Educational Resources Information Center

    Wetzel, Angela Payne

    2011-01-01

    Previous systematic reviews indicate a lack of reporting of reliability and validity evidence in subsets of the medical education literature. Psychology and general education reviews of factor analysis also indicate gaps between current and best practices; yet, a comprehensive review of exploratory factor analysis in instrument development across…

  19. The convergent and concurrent validity of trait-based prototype assessment of personality disorder categories in homeless persons.

    PubMed

    Samuel, Douglas B; Connolly, Adrian J; Ball, Samuel A

    2012-09-01

    The DSM-5 proposal indicates that personality disorders (PDs) be defined as collections of maladaptive traits but does not provide a specific diagnostic method. However, researchers have previously suggested that PD constructs can be assessed by comparing individuals' trait profiles with those prototypic of PDs and evidence from the five-factor model (FFM) suggests that these prototype matching scores converge moderately with traditional PD instruments. The current study investigates the convergence of FFM PD prototypes with interview-assigned PD diagnoses in a sample of 99 homeless individuals. This sample had very high rates of PDs, which extends previous research on samples with more modest prevalence rates. Results indicated that diagnostic agreement between these methods was generally low but consistent with the agreement previously observed between explicit PD measures. Furthermore, trait-based and diagnostic interview scores evinced similar relationships with clinically important indicators such as abuse history and past suicide attempts. These findings demonstrate the validity of prototype methods and suggest their consideration for assessing trait-defined PD types within DSM-5.

  20. Validating a Prognostic Scoring System for Postmastectomy Locoregional Recurrence in Breast Cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cheng, Skye Hung-Chun, E-mail: skye@kfsyscc.org; Clinical Research Office, Koo Foundation Sun Yat-Sen Cancer Center, Taipei, Taiwan; Department of Radiation Oncology, Duke University Medical Center, Durham, North Carolina

    2013-03-15

    Purpose: This study is designed to validate a previously developed locoregional recurrence risk (LRR) scoring system and further define which groups of patients with breast cancer would benefit from postmastectomy radiation therapy (PMRT). Methods and Materials: An LRR risk scoring system was developed previously at our institution using breast cancer patients initially treated with modified radical mastectomy between 1990 and 2001. The LRR score comprised 4 factors: patient age, lymphovascular invasion, estrogen receptor negativity, and number of involved lymph nodes. We sought to validate the original study by examining a new dataset of 1545 patients treated between 2002 and 2007. Results: The 1545 patients were scored according to the previously developed criteria: 920 (59.6%) were low risk (score 0-1), 493 (31.9%) intermediate risk (score 2-3), and 132 (8.5%) were high risk (score ≥4). The 5-year locoregional control rates with and without PMRT in low-risk, intermediate-risk, and high-risk groups were 98% versus 97% (P=.41), 97% versus 91% (P=.0005), and 89% versus 50% (P=.0002) respectively. Conclusions: This analysis of an additional 1545 patients treated between 2002 and 2007 validates our previously reported LRR scoring system and suggests appropriate patients for whom PMRT will be beneficial. Independent validation of this scoring system by other institutions is recommended.
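
    The risk stratification in this record maps a summed score to the three groups used in the analysis. The abstract does not state how many points each of the 4 factors contributes, so the sketch below takes a precomputed score and applies only the published cut-offs:

```python
def lrr_risk_group(score):
    # cut-offs from the abstract: 0-1 low, 2-3 intermediate, >=4 high
    if score <= 1:
        return "low"
    if score <= 3:
        return "intermediate"
    return "high"
```

Patients in the intermediate- and high-risk groups are the ones for whom the reported locoregional control rates differed significantly with versus without PMRT.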

  1. Rapid analysis of aminoglycoside antibiotics in bovine tissues using disposable pipette extraction and ultrahigh performance liquid chromatography-tandem mass spectrometry.

    PubMed

    Lehotay, Steven J; Mastovska, Katerina; Lightfield, Alan R; Nuñez, Alberto; Dutko, Terry; Ng, Chilton; Bluhm, Louis

    2013-10-25

    A high-throughput qualitative screening and identification method for 9 aminoglycosides of regulatory interest has been developed, validated, and implemented for bovine kidney, liver, and muscle tissues. The method involves extraction at previously validated conditions, cleanup using disposable pipette extraction, and analysis by a 3 min ultrahigh-performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) method. The drug analytes include neomycin, streptomycin, dihydrostreptomycin, and spectinomycin, which have residue tolerances in bovine in the US, and kanamycin, gentamicin, apramycin, amikacin, and hygromycin, which do not have US tolerances established in bovine tissues. Tobramycin was used as an internal standard. An additional drug, paromomycin, was also validated in the method, but it was dropped during implementation due to conversion of neomycin into paromomycin. Proposed fragmentation patterns for the monitored ions of each analyte were elucidated with the aid of high resolution MS using a quadrupole-time-of-flight instrument. Recoveries from spiking experiments at regulatory levels of concern showed that all analytes averaged 70-120% recoveries in all tissues, except hygromycin, which averaged 61% recovery. Lowest calibrated levels were as low as 0.005 μg/g in matrix extracts, which approximately corresponded to the limit of detection for screening purposes. Drug identifications at levels <0.05 μg/g were made in spiked and/or real samples for all analytes and tissues tested. Analyses of 60 samples from 20 slaughtered cattle previously screened positive for aminoglycosides showed that this method worked well in practice. The UHPLC-MS/MS method has several advantages compared to the previous microbial inhibition screening assay, especially for distinguishing individual drugs from a mixture and improving identification of gentamicin in tissue samples. Published by Elsevier B.V.

  2. Validation of a new ELISA method for in vitro potency testing of hepatitis A vaccines.

    PubMed

    Morgeaux, S; Variot, P; Daas, A; Costanzo, A

    2013-01-01

    The goal of the project was to standardise a new in vitro method in replacement of the existing standard method for the determination of hepatitis A virus antigen content in hepatitis A vaccines (HAV) marketed in Europe. This became necessary due to issues with the method used previously, requiring the use of commercial test kits. The selected candidate method, not based on commercial kits, had already been used for many years by an Official Medicines Control Laboratory (OMCL) for routine testing and batch release of HAV. After a pre-qualification phase (Phase 1) that showed the suitability of the commercially available critical ELISA reagents for the determination of antigen content in marketed HAV present on the European market, an international collaborative study (Phase 2) was carried out in order to fully validate the method. Eleven laboratories took part in the collaborative study. They performed assays with the candidate standard method and, in parallel, for comparison purposes, with their own in-house validated methods where these were available. The study demonstrated that the new assay provides a more reliable and reproducible method when compared to the existing standard method. A good correlation of the candidate standard method with the in vivo immunogenicity assay in mice was shown previously for both potent and sub-potent (stressed) vaccines. Thus, the new standard method validated during the collaborative study may be implemented readily by manufacturers and OMCLs for routine batch release but also for in-process control or consistency testing. The new method was approved in October 2012 by Group of Experts 15 of the European Pharmacopoeia (Ph. Eur.) as the standard method for in vitro potency testing of HAV. The relevant texts will be revised accordingly. Critical reagents such as coating reagent and detection antibodies have been adopted by the Ph. Eur. Commission and are available from the EDQM as Ph. Eur. Biological Reference Reagents (BRRs).

  3. Memory Indexing: A Novel Method for Tracing Memory Processes in Complex Cognitive Tasks

    ERIC Educational Resources Information Center

    Renkewitz, Frank; Jahn, Georg

    2012-01-01

    We validate an eye-tracking method applicable for studying memory processes in complex cognitive tasks. The method is tested with a task on probabilistic inferences from memory. It provides valuable data on the time course of processing, thus clarifying previous results on heuristic probabilistic inference. Participants learned cue values of…

  4. Validation of life-charts documented with the personal life-chart app - a self-monitoring tool for bipolar disorder.

    PubMed

    Schärer, Lars O; Krienke, Ute J; Graf, Sandra-Mareike; Meltzer, Katharina; Langosch, Jens M

    2015-03-14

    Long-term monitoring in bipolar affective disorders constitutes an important therapeutic and preventive method. The present study examines the validity of the Personal Life-Chart App (PLC App) in both German and English. This App is based on the National Institute of Mental Health's Life-Chart Method, the de facto standard for long-term monitoring in the treatment of bipolar disorders. Methods have largely been replicated from 2 previous Life-Chart studies. The participants documented Life-Charts with the PLC App on a daily basis. Clinicians assessed manic and depressive symptoms in clinical interviews using the Inventory of Depressive Symptomatology, clinician-rated (IDS-C) and the Young Mania Rating Scale (YMRS) on a monthly basis on average. Spearman correlations of the total scores of IDS-C and YMRS were calculated with both the Life-Chart functional impairment rating and mood rating documented with the PLC App. 44 subjects used the PLC App in German and 10 subjects used the PLC App in English. 118 clinical interviews from the German sub-sample and 97 from the English sub-sample were analysed separately. The results in both sub-samples are similar to previous Life-Chart validation studies. Again statistically significant high correlations were found between the Life-Chart function rating assigned through the PLC App and well-established observer-rated methods. Again correlations were weaker for the Life-Chart mood rating than for the Life-Chart function impairment. No relevant correlation was found between the Life-Chart mood rating and YMRS in the German sub-sample. This study gives further evidence for the validity of the Life-Chart method as a valid tool for the recognition of both manic and depressive episodes. Documenting Life-Charts with the PLC App (English and German) does not seem to impair the validity of patient ratings.

  5. Determination of free sulphydryl groups in wheat gluten under the influence of different time and temperature of incubation: method validation.

    PubMed

    Rakita, Slađana; Pojić, Milica; Tomić, Jelena; Torbica, Aleksandra

    2014-05-01

    The aim of the present study was to determine the characteristics of an analytical method for determination of free sulphydryl (SH) groups of wheat gluten performed with previous gluten incubation for variable times (45, 90 and 135 min) at variable temperatures (30 and 37°C), in order to determine its fitness-for-purpose. It was observed that the increase in temperature and gluten incubation time caused the increase in the amount of free SH groups, with more dynamic changes at 37°C. The method characteristics identified as relevant were: linearity, limit of detection, limit of quantification, precision (repeatability and reproducibility) and measurement uncertainty, which were checked within the validation protocol, while the method performance was monitored by X- and R-control charts. Identified method characteristics demonstrated its acceptable fitness-for-purpose, when assay included previous gluten incubation at 30°C. Although the method repeatability at 37°C was acceptable, the corresponding reproducibility did not meet the performance criterion on the basis of HORRAT value (HORRAT<2). Copyright © 2013 Elsevier Ltd. All rights reserved.
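
    The HORRAT value used as the reproducibility criterion here is the ratio of the observed reproducibility RSD to the value predicted by the Horwitz equation, PRSD_R = 2^(1 - 0.5·log10 C), where C is the analyte concentration as a mass fraction; HORRAT ≤ 2 is conventionally taken as acceptable. A minimal sketch:

```python
import math

def horwitz_prsd(mass_fraction):
    # predicted reproducibility RSD (%) from the Horwitz equation
    return 2 ** (1 - 0.5 * math.log10(mass_fraction))

def horrat(observed_rsd_pct, mass_fraction):
    # ratio of observed to predicted reproducibility RSD; <= 2 is usually acceptable
    return observed_rsd_pct / horwitz_prsd(mass_fraction)
```

For example, at a mass fraction of 10⁻⁶ (1 ppm) the Horwitz equation predicts a reproducibility RSD of 16%, so an observed RSD of 8% gives HORRAT = 0.5.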

  6. Validation of catchment models for predicting land-use and climate change impacts. 1. Method

    NASA Astrophysics Data System (ADS)

    Ewen, J.; Parkin, G.

    1996-02-01

    Computer simulation models are increasingly being proposed as tools capable of giving water resource managers accurate predictions of the impact of changes in land-use and climate. Previous validation testing of catchment models is reviewed, and it is concluded that the methods used do not clearly test a model's fitness for such a purpose. A new generally applicable method is proposed. This involves the direct testing of fitness for purpose, uses established scientific techniques, and may be implemented within a quality assured programme of work. The new method is applied in Part 2 of this study (Parkin et al., J. Hydrol., 175:595-613, 1996).

  7. An accelerated photo-magnetic imaging reconstruction algorithm based on an analytical forward solution and a fast Jacobian assembly method

    NASA Astrophysics Data System (ADS)

    Nouizi, F.; Erkol, H.; Luk, A.; Marks, M.; Unlu, M. B.; Gulsen, G.

    2016-10-01

    We previously introduced photo-magnetic imaging (PMI), an imaging technique that illuminates the medium under investigation with near-infrared light and measures the induced temperature increase using magnetic resonance thermometry (MRT). Using a multiphysics solver combining photon migration and heat diffusion, PMI models the spatiotemporal distribution of temperature variation and recovers high resolution optical absorption images using these temperature maps. In this paper, we present a new fast non-iterative reconstruction algorithm for PMI. This new algorithm uses analytic methods during the resolution of the forward problem and the assembly of the sensitivity matrix. We validate our new analytic-based algorithm with the first generation finite element method (FEM) based reconstruction algorithm previously developed by our team. The validation is performed using, first synthetic data and afterwards, real MRT measured temperature maps. Our new method accelerates the reconstruction process 30-fold when compared to a single iteration of the FEM-based algorithm.

  8. Improvement of a mixture experiment model relating the component proportions to the size of nanonized itraconazole particles in extemporary suspensions

    DOE PAGES

    Pattarino, Franco; Piepel, Greg; Rinaldi, Maurizio

    2018-03-03

    A paper by Foglio Bonda et al. published previously in this journal (2016, Vol. 83, pp. 175–183) discussed the use of mixture experiment design and modeling methods to study how the proportions of three components in an extemporaneous oral suspension affected the mean diameter of drug particles (Z_ave). The three components were itraconazole (ITZ), Tween 20 (TW20), and Methocel® E5 (E5). This commentary addresses some errors and other issues in the previous paper, and also discusses an improved model relating proportions of ITZ, TW20, and E5 to Z_ave. The improved model contains six of the 10 terms in the full-cubic mixture model, which were selected using a different cross-validation procedure than used in the previous paper. In conclusion, compared to the four-term model presented in the previous paper, the improved model fit the data better, had excellent cross-validation performance, and the predicted Z_ave of a validation point was within model uncertainty of the measured value.

  9. Improvement of a mixture experiment model relating the component proportions to the size of nanonized itraconazole particles in extemporary suspensions.

    PubMed

    Pattarino, Franco; Piepel, Greg; Rinaldi, Maurizio

    2018-05-30

    A paper by Foglio Bonda et al. published previously in this journal (2016, Vol. 83, pp. 175-183) discussed the use of mixture experiment design and modeling methods to study how the proportions of three components in an extemporaneous oral suspension affected the mean diameter of drug particles (Z_ave). The three components were itraconazole (ITZ), Tween 20 (TW20), and Methocel® E5 (E5). This commentary addresses some errors and other issues in the previous paper, and also discusses an improved model relating proportions of ITZ, TW20, and E5 to Z_ave. The improved model contains six of the 10 terms in the full-cubic mixture model, which were selected using a different cross-validation procedure than used in the previous paper. Compared to the four-term model presented in the previous paper, the improved model fit the data better, had excellent cross-validation performance, and the predicted Z_ave of a validation point was within model uncertainty of the measured value. Copyright © 2018 Elsevier B.V. All rights reserved.

  10. Improvement of a mixture experiment model relating the component proportions to the size of nanonized itraconazole particles in extemporary suspensions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pattarino, Franco; Piepel, Greg; Rinaldi, Maurizio

    A paper by Foglio Bonda et al. published previously in this journal (2016, Vol. 83, pp. 175–183) discussed the use of mixture experiment design and modeling methods to study how the proportions of three components in an extemporaneous oral suspension affected the mean diameter of drug particles (Z_ave). The three components were itraconazole (ITZ), Tween 20 (TW20), and Methocel® E5 (E5). This commentary addresses some errors and other issues in the previous paper, and also discusses an improved model relating proportions of ITZ, TW20, and E5 to Z_ave. The improved model contains six of the 10 terms in the full-cubic mixture model, which were selected using a different cross-validation procedure than used in the previous paper. In conclusion, compared to the four-term model presented in the previous paper, the improved model fit the data better, had excellent cross-validation performance, and the predicted Z_ave of a validation point was within model uncertainty of the measured value.
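    The mixture-model fitting the commentary above builds on can be sketched in a few lines. This is a minimal illustration with synthetic data, not the paper's ITZ/TW20/E5 measurements: a first-order Scheffé model Z = b1·x1 + b2·x2 + b3·x3 fitted by least squares (the full-cubic model discussed above adds the cross-product and cubic terms that the cross-validation procedure selects among).

```python
def fit_scheffe_linear(X, y):
    """Least-squares fit of a first-order Scheffe mixture model
    Z = b1*x1 + b2*x2 + b3*x3 (component proportions sum to 1).
    Solves the 3x3 normal equations by Gauss-Jordan elimination."""
    n = 3
    XtX = [[sum(r[i] * r[j] for r in X) for j in range(n)] for i in range(n)]
    Xty = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(n)]
    M = [XtX[i] + [Xty[i]] for i in range(n)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(n):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [a - f * b for a, b in zip(M[r], M[col])]
    return [M[i][n] / M[i][i] for i in range(n)]

# Simplex-centroid design with hypothetical responses generated from known
# blending coefficients (10, 20, 30), so the fit should recover them.
design = [(1, 0, 0), (0, 1, 0), (0, 0, 1),
          (0.5, 0.5, 0), (0.5, 0, 0.5), (0, 0.5, 0.5),
          (1 / 3, 1 / 3, 1 / 3)]
z = [10 * a + 20 * b + 30 * c for a, b, c in design]
coeffs = fit_scheffe_linear(design, z)
```

    Cross-validated term selection, as in the commentary, would repeat such a fit on held-out subsets for each candidate set of terms.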

  11. An optimized method to calculate error correction capability of tool influence function in frequency domain

    NASA Astrophysics Data System (ADS)

    Wang, Jia; Hou, Xi; Wan, Yongjian; Shi, Chunyan

    2017-10-01

    An optimized method to calculate the error correction capability of the tool influence function (TIF) under given polishing conditions is proposed, based on a smoothing spectral function. The basic mathematical model for the method is established theoretically. A set of polishing experimental data obtained with a rigid conformal tool is used to validate the optimized method. The calculated results quantitatively indicate the error correction capability of the TIF for different spatial frequency errors under given polishing conditions. Comparative analysis shows that the optimized method is simpler in form than the previous method and achieves the same accuracy with less computation time.

  12. Towards a conceptual framework demonstrating the effectiveness of audiovisual patient descriptions (patient video cases): a review of the current literature

    PubMed Central

    2012-01-01

    Background Technological advances have enabled the widespread use of video cases via web-streaming and online download as an educational medium. The use of real subjects to demonstrate acute pathology should aid the education of health care professionals. However, the methodology by which this effect may be tested is not clear. Methods We undertook a literature review of major databases, identified articles relevant to using patient video cases as educational interventions, extracted the methodologies used and assessed these methods for internal and construct validity. Results A review of 2532 abstracts revealed 23 studies meeting the inclusion criteria, of which 18 were included in the final review. Medical students were the most commonly studied group (10 articles), with a spread of learner satisfaction, knowledge and behaviour outcomes tested. Only two of the studies fulfilled defined criteria on achieving internal and construct validity. The heterogeneity of articles meant it was not possible to perform any meta-analysis. Conclusions Previous studies have not clearly classified which facet of training or educational outcome they aimed to explore, and had poor internal and construct validity. Future research should aim to validate a particular outcome measure, preferably by reproducing previous work rather than adopting new methods. In particular, cognitive processing enhancement, demonstrated in a number of the medical student studies, should be tested at a postgraduate level. PMID:23256787

  13. The convergent and discriminant validity of burnout measures in sport: a multi-trait/multi-method analysis.

    PubMed

    Cresswell, Scott L; Eklund, Robert C

    2006-02-01

    Athlete burnout research has been hampered by the lack of an adequate measurement tool. The Athlete Burnout Questionnaire (ABQ) and the Maslach Burnout Inventory General Survey (MBI-GS) are two recently developed self-report instruments designed to assess burnout. The convergent and discriminant validity of the ABQ and MBI-GS were assessed through multi-trait/multi-method analysis with a sporting population. Overall, the ABQ and the MBI-GS displayed acceptable convergent validity with matching subscales highly correlated, and satisfactory internal discriminant validity with lower correlations between non-matching subscales. Both scales also indicated an adequate discrimination between the concepts of burnout and depression. These findings add support to previous findings in non-sporting populations that depression and burnout are separate constructs. Based on the psychometric results, construct validity analysis and practical considerations, the results support the use of the ABQ to assess athlete burnout.
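    The core of a multi-trait/multi-method check like the one above is a correlation matrix: matching subscales measured by different instruments should correlate highly (convergent validity), while non-matching subscales should correlate less (discriminant validity). A minimal sketch with hypothetical subscale scores, not the ABQ/MBI-GS data:

```python
def pearson(xs, ys):
    """Pearson correlation coefficient between two score vectors."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (sx * sy)

# Hypothetical scores for five athletes: the same trait measured by two
# instruments (convergent pair) and an unrelated trait (discriminant pair).
exhaustion_a = [1, 2, 3, 4, 5]   # e.g. an ABQ-style subscale
exhaustion_b = [2, 3, 4, 5, 6]   # e.g. an MBI-GS-style subscale
unrelated = [5, 1, 4, 2, 3]

convergent = pearson(exhaustion_a, exhaustion_b)   # high
discriminant = pearson(exhaustion_a, unrelated)    # low in magnitude
```

    A full MTMM analysis arranges all such correlations into a matrix and compares the matched (validity diagonal) entries against the rest.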

  14. Alternating Renewal Process Models for Behavioral Observation: Simulation Methods, Software, and Validity Illustrations

    ERIC Educational Resources Information Center

    Pustejovsky, James E.; Runyon, Christopher

    2014-01-01

    Direct observation recording procedures produce reductive summary measurements of an underlying stream of behavior. Previous methodological studies of these recording procedures have employed simulation methods for generating random behavior streams, many of which amount to special cases of a statistical model known as the alternating renewal…
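    The idea in the abstract above — simulating a behavior stream as an alternating renewal process and then applying a recording procedure to it — can be sketched as follows. Exponential durations are used here as one of the special cases such simulation studies employ, and momentary time sampling as the recording rule; all parameter values are illustrative:

```python
import random

def simulate_stream(mean_on, mean_off, horizon, rng):
    """Alternating renewal behavior stream: exponential 'off' gaps and
    'on' (behavior) episodes alternate until the horizon is reached.
    Returns a time-ordered list of (start, end) behavior episodes."""
    t, episodes = 0.0, []
    while t < horizon:
        t += rng.expovariate(1.0 / mean_off)   # gap between behaviors
        start = t
        t += rng.expovariate(1.0 / mean_on)    # behavior episode
        if start < horizon:
            episodes.append((start, min(t, horizon)))
    return episodes

def momentary_time_sampling(episodes, horizon, interval):
    """MTS recording rule: probe at fixed intervals and score whether the
    behavior is occurring at that instant; return the scored proportion."""
    hits, probes, i = 0, 0, 0
    t = interval
    while t <= horizon:
        while i < len(episodes) and episodes[i][1] < t:
            i += 1
        if i < len(episodes) and episodes[i][0] <= t:
            hits += 1
        probes += 1
        t += interval
    return hits / probes

# Illustrative parameters: mean episode 2 s, mean gap 3 s, so the true
# prevalence is 2 / (2 + 3) = 0.4; the MTS estimate should land close.
rng = random.Random(42)
eps = simulate_stream(2.0, 3.0, 10_000.0, rng)
mts = momentary_time_sampling(eps, 10_000.0, 1.0)
```

    Other recording rules (whole-interval, partial-interval) can be scored against the same simulated episodes to compare their biases.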

  15. Fractal Clustering and Knowledge-driven Validation Assessment for Gene Expression Profiling.

    PubMed

    Wang, Lu-Yong; Balasubramanian, Ammaiappan; Chakraborty, Amit; Comaniciu, Dorin

    2005-01-01

    DNA microarray experiments generate a substantial amount of information about global gene expression. Gene expression profiles can be represented as points in multi-dimensional space. It is essential to identify relevant groups of genes in biomedical research. Clustering is helpful for pattern recognition in gene expression profiles. A number of clustering techniques have been introduced. However, these traditional methods mainly rely on shape-based assumptions or a distance metric to cluster the points in multi-dimensional linear Euclidean space. Their results show poor consistency with the functional annotation of genes in previous validation studies. From a different perspective, we propose a fractal clustering method that clusters genes using intrinsic (fractal) dimension from modern geometry. This method clusters points in such a way that points in the same cluster are more self-affine among themselves than to points in other clusters. We assess this method using annotation-based validation assessment for gene clusters. It shows that this method is superior in identifying functionally related gene groups to other traditional methods.
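    The notion of intrinsic (fractal) dimension that the clustering above relies on can be illustrated with a simple box-counting estimate: points lying on a line embedded in 2-D space have intrinsic dimension ≈ 1 regardless of the embedding dimension. This is a self-contained sketch of the dimension estimate only, not the authors' clustering implementation:

```python
import math

def box_count_dimension(points, scales):
    """Box-counting estimate of intrinsic dimension: the slope of
    log N(eps) against log(1/eps), fitted by least squares, where
    N(eps) is the number of occupied boxes of side eps."""
    log_inv_eps, log_n = [], []
    for eps in scales:
        boxes = {(int(x / eps), int(y / eps)) for x, y in points}
        log_inv_eps.append(math.log(1.0 / eps))
        log_n.append(math.log(len(boxes)))
    n = len(scales)
    mx = sum(log_inv_eps) / n
    my = sum(log_n) / n
    return sum((a - mx) * (b - my) for a, b in zip(log_inv_eps, log_n)) / \
        sum((a - mx) ** 2 for a in log_inv_eps)

# Points along the diagonal of the unit square: they live in 2-D but have
# intrinsic dimension 1, which the estimate recovers.
pts = [(i / 5000.0, i / 5000.0) for i in range(5000)]
dim = box_count_dimension(pts, [2.0 ** -k for k in range(2, 8)])
```

    A fractal clustering scheme assigns a point to the cluster whose intrinsic dimension changes least when the point is added.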

  16. Validation of a physical anthropology methodology using mandibles for gender estimation in a Brazilian population

    PubMed Central

    CARVALHO, Suzana Papile Maciel; BRITO, Liz Magalhães; de PAIVA, Luiz Airton Saavedra; BICUDO, Lucilene Arilho Ribeiro; CROSATO, Edgard Michel; de OLIVEIRA, Rogério Nogueira

    2013-01-01

    Validation studies of physical anthropology methods in the different population groups are extremely important, especially in cases in which the population variations may cause problems in the identification of a native individual by the application of norms developed for different communities. Objective This study aimed to estimate the gender of skeletons by application of the method of Oliveira, et al. (1995), previously used in a population sample from Northeast Brazil. Material and Methods The accuracy of this method was assessed for a population from Southeast Brazil and validated by statistical tests. The method used two mandibular measurements, namely the bigonial distance and the mandibular ramus height. The sample was composed of 66 skulls and the method was applied by two examiners. The results were statistically analyzed by the paired t test, logistic discriminant analysis and logistic regression. Results The results demonstrated that the application of the method of Oliveira, et al. (1995) in this population achieved very different outcomes between genders, with 100% for females and only 11% for males, which may be explained by ethnic differences. However, statistical adjustment of measurement data for the population analyzed allowed accuracy of 76.47% for males and 78.13% for females, with the creation of a new discriminant formula. Conclusion It was concluded that methods involving physical anthropology present a high rate of accuracy for human identification, easy application, low cost and simplicity; however, the methodologies must be validated for the different populations due to differences in ethnic patterns, which are directly related to the phenotypic aspects. In this specific case, the method of Oliveira, et al. (1995) presented good accuracy and may be used for gender estimation in Brazil in two geographic regions, namely Northeast and Southeast; however, for other regions of the country (North, Central West and South), previous methodological adjustment is recommended, as demonstrated in this study. PMID:24037076
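    A logistic discriminant of the kind the study above derives takes the two mandibular measurements and returns a sex classification. The coefficients below are hypothetical placeholders for illustration only, not the discriminant formula published in the paper:

```python
import math

def p_male(bigonial_mm, ramus_height_mm, b0, b1, b2):
    """Logistic discriminant: P(male) from bigonial distance and
    mandibular ramus height. Coefficients are supplied by the caller."""
    z = b0 + b1 * bigonial_mm + b2 * ramus_height_mm
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical coefficients (NOT the paper's): larger mandibular
# measurements push the score toward 'male'.
B0, B1, B2 = -30.0, 0.15, 0.25

def classify(bigonial_mm, ramus_height_mm):
    return "male" if p_male(bigonial_mm, ramus_height_mm, B0, B1, B2) > 0.5 \
        else "female"
```

    Population-specific validation, as the study argues, amounts to re-estimating B0, B1, B2 on local reference skulls before applying the cutoff.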

  17. The influence of validity criteria on Immediate Post-Concussion Assessment and Cognitive Testing (ImPACT) test-retest reliability among high school athletes.

    PubMed

    Brett, Benjamin L; Solomon, Gary S

    2017-04-01

    Research findings to date on the stability of Immediate Post-Concussion Assessment and Cognitive Testing (ImPACT) Composite scores have been inconsistent, requiring further investigation. The use of test validity criteria across these studies also has been inconsistent. Using multiple measures of stability, we examined test-retest reliability of repeated ImPACT baseline assessments in high school athletes across various validity criteria reported in previous studies. A total of 1146 high school athletes completed baseline cognitive testing using the online ImPACT test battery at two time points approximately two years apart. No participant sustained a concussion between assessments. Five forms of validity criteria used in previous test-retest studies were applied to the data, and differences in reliability were compared. Intraclass correlation coefficients (ICCs) for composite scores ranged from .47 (95% confidence interval, CI [.38, .54]) to .83 (95% CI [.81, .85]) and showed little change across the two-year interval for all five sets of validity criteria. Regression-based methods (RBMs) examining test-retest stability demonstrated a lack of significant change in composite scores across the two-year interval for all forms of validity criteria, with no cases falling outside the expected range of 90% confidence intervals. The application of more stringent validity criteria does not alter test-retest reliability, nor does it account for some of the variation observed across previously performed studies. As such, the ImPACT manual validity criteria should be utilized in the determination of test validity and in the individualized approach to concussion management. Potential future efforts to improve test-retest reliability are discussed.
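    The test-retest stability reported above rests on intraclass correlation coefficients, which fall out of a one-way ANOVA decomposition. A minimal ICC(1,1) sketch with hypothetical baseline/retest score pairs (no ImPACT composites are reproduced here):

```python
def icc_1_1(pairs):
    """One-way random-effects intraclass correlation ICC(1,1) for k = 2
    measurements per subject: (MSB - MSW) / (MSB + (k-1)*MSW)."""
    k, n = 2, len(pairs)
    grand = sum(a + b for a, b in pairs) / (k * n)
    means = [(a + b) / 2.0 for a, b in pairs]
    msb = k * sum((m - grand) ** 2 for m in means) / (n - 1)
    msw = sum((a - m) ** 2 + (b - m) ** 2
              for (a, b), m in zip(pairs, means)) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Hypothetical scores for 10 athletes; the retest is shifted by a constant
# +1 to mimic a small practice effect, which lowers ICC only slightly.
pairs = [(float(s), s + 1.0) for s in range(10)]
icc = icc_1_1(pairs)
```

    Published ImPACT studies typically use two-way ICC variants; the one-way form here is the simplest member of the family and illustrates the MSB/MSW mechanics.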

  18. Developing Enhanced Blood–Brain Barrier Permeability Models: Integrating External Bio-Assay Data in QSAR Modeling

    PubMed Central

    Wang, Wenyi; Kim, Marlene T.; Sedykh, Alexander

    2015-01-01

    Purpose Experimental Blood–Brain Barrier (BBB) permeability models for drug molecules are expensive and time-consuming. As alternative methods, several traditional Quantitative Structure-Activity Relationship (QSAR) models have been developed previously. In this study, we aimed to improve the predictivity of traditional QSAR BBB permeability models by employing relevant public bio-assay data in the modeling process. Methods We compiled a BBB permeability database consisting of 439 unique compounds from various resources. The database was split into a modeling set of 341 compounds and a validation set of 98 compounds. A consensus QSAR modeling workflow was employed on the modeling set to develop various QSAR models. A five-fold cross-validation approach was used to validate the developed models, and the resulting models were used to predict the external validation set compounds. Furthermore, we used previously published membrane transporter models to generate relevant transporter profiles for target compounds. The transporter profiles were used as additional biological descriptors to develop hybrid QSAR BBB models. Results The consensus QSAR models have R² = 0.638 for five-fold cross-validation and R² = 0.504 for external validation. The consensus model developed by pooling chemical and transporter descriptors showed better predictivity (R² = 0.646 for five-fold cross-validation and R² = 0.526 for external validation). Moreover, several external bio-assays that correlate with BBB permeability were identified using our automatic profiling tool. Conclusions The BBB permeability models developed in this study can be useful for early evaluation of new compounds (e.g., new drug candidates). The combination of chemical and biological descriptors shows a promising direction to improve the current traditional QSAR models. PMID:25862462
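    The five-fold cross-validated R² reported above pools out-of-fold predictions and compares them with the observed values. A generic sketch, with a toy one-descriptor linear model standing in for the consensus QSAR workflow (all data here are synthetic):

```python
def kfold_r2(xs, ys, k, fit, predict):
    """k-fold cross-validated R^2: out-of-fold predictions are pooled
    and compared with the observed values."""
    n = len(xs)
    preds = [0.0] * n
    for fold in range(k):
        train = [i for i in range(n) if i % k != fold]
        test = [i for i in range(n) if i % k == fold]
        model = fit([xs[i] for i in train], [ys[i] for i in train])
        for i in test:
            preds[i] = predict(model, xs[i])
    mean_y = sum(ys) / n
    ss_res = sum((y - p) ** 2 for y, p in zip(ys, preds))
    ss_tot = sum((y - mean_y) ** 2 for y in ys)
    return 1.0 - ss_res / ss_tot

def fit_line(xs, ys):
    """Least-squares line, standing in for a real QSAR model."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

def predict_line(model, x):
    a, b = model
    return a + b * x

# Noise-free linear data: the cross-validated R^2 is (numerically) 1.
xs = [float(i) for i in range(20)]
ys = [2.0 * x + 1.0 for x in xs]
cv_r2 = kfold_r2(xs, ys, 5, fit_line, predict_line)
```

    With real, noisy descriptors the out-of-fold R² drops below the resubstitution R², which is exactly the honest-performance gap cross-validation is meant to expose.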

  19. Variable Case Detection and Many Unreported Cases of Surgical-Site Infection Following Colon Surgery and Abdominal Hysterectomy in a Statewide Validation.

    PubMed

    Calderwood, Michael S; Huang, Susan S; Keller, Vicki; Bruce, Christina B; Kazerouni, N Neely; Janssen, Lynn

    2017-09-01

    OBJECTIVE To assess hospital surgical-site infection (SSI) identification and reporting following colon surgery and abdominal hysterectomy via a statewide external validation. METHODS Infection preventionists (IPs) from the California Department of Public Health (CDPH) performed on-site SSI validation for surgical procedures performed in hospitals that voluntarily participated. Validation involved chart review of SSI cases previously reported by hospitals plus review of patient records flagged for review by claims codes suggestive of SSI. We assessed the sensitivity of traditional surveillance and the added benefit of claims-based surveillance. We also evaluated the positive predictive value of claims-based surveillance (ie, workload efficiency). RESULTS Upon validation review, CDPH IPs identified 239 SSIs following colon surgery at 42 hospitals and 76 SSIs following abdominal hysterectomy at 34 hospitals. For colon surgery, traditional surveillance had a sensitivity of 50% (47% for deep incisional or organ/space [DI/OS] SSI), compared to 84% (88% for DI/OS SSI) for claims-based surveillance. For abdominal hysterectomy, traditional surveillance had a sensitivity of 68% (67% for DI/OS SSI) compared to 74% (78% for DI/OS SSI) for claims-based surveillance. Claims-based surveillance was also efficient, with 1 SSI identified for every 2 patients flagged for review who had undergone abdominal hysterectomy and for every 2.6 patients flagged for review who had undergone colon surgery. Overall, CDPH identified previously unreported SSIs in 74% of validation hospitals performing colon surgery and 35% of validation hospitals performing abdominal hysterectomy. CONCLUSIONS Claims-based surveillance is a standardized approach that hospitals can use to augment traditional surveillance methods and health departments can use for external validation. Infect Control Hosp Epidemiol 2017;38:1091-1097.

  20. Experiences Using Formal Methods for Requirements Modeling

    NASA Technical Reports Server (NTRS)

    Easterbrook, Steve; Lutz, Robyn; Covington, Rick; Kelly, John; Ampo, Yoko; Hamilton, David

    1996-01-01

    This paper describes three case studies in the lightweight application of formal methods to requirements modeling for spacecraft fault protection systems. The case studies differ from previously reported applications of formal methods in that formal methods were applied very early in the requirements engineering process, to validate the evolving requirements. The results were fed back into the projects, to improve the informal specifications. For each case study, we describe what methods were applied, how they were applied, how much effort was involved, and what the findings were. In all three cases, the formal modeling provided a cost effective enhancement of the existing verification and validation processes. We conclude that the benefits gained from early modeling of unstable requirements more than outweigh the effort needed to maintain multiple representations.

  1. A brief measure of attitudes toward mixed methods research in psychology.

    PubMed

    Roberts, Lynne D; Povee, Kate

    2014-01-01

    The adoption of mixed methods research in psychology has trailed behind other social science disciplines. Teaching psychology students, academics, and practitioners about mixed methodologies may increase the use of mixed methods within the discipline. However, tailoring and evaluating education and training in mixed methodologies requires an understanding of, and way of measuring, attitudes toward mixed methods research in psychology. To date, no such measure exists. In this article we present the development and initial validation of a new measure: Attitudes toward Mixed Methods Research in Psychology. A pool of 42 items developed from previous qualitative research on attitudes toward mixed methods research, along with validation measures, was administered via an online survey to a convenience sample of 274 psychology students, academics and psychologists. Principal axis factoring with varimax rotation on a subset of the sample produced a four-factor, 12-item solution. Confirmatory factor analysis on a separate subset of the sample indicated that a higher-order four-factor model provided the best fit to the data. The four factors ('Limited Exposure', '(in)Compatibility', 'Validity', and 'Tokenistic Qualitative Component') each have acceptable internal reliability. Known-groups validity analyses based on preferred research orientation and self-rated mixed methods research skills, and convergent and divergent validity analyses based on measures of attitudes toward psychology as a science and scientist and practitioner orientation, provide initial validation of the measure. This brief, internally reliable measure can be used in assessing attitudes toward mixed methods research in psychology, measuring change in attitudes as part of the evaluation of mixed methods education, and in larger research programs.
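    The internal reliability of each factor's items mentioned above is conventionally summarized with Cronbach's alpha. A minimal sketch with hypothetical item scores (the measure's actual items are not reproduced here):

```python
def cronbach_alpha(items):
    """Cronbach's alpha: items is a list of per-item score vectors over
    the same respondents, in the same order.
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total))."""
    k, n = len(items), len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1.0 - sum(var(it) for it in items) / var(totals))

# Two perfectly consistent hypothetical items give alpha = 1.0; noisier
# items pull alpha down.
alpha_perfect = cronbach_alpha([[1, 2, 3, 4, 5], [1, 2, 3, 4, 5]])
alpha_noisy = cronbach_alpha([[1, 2, 3, 4, 5], [2, 1, 4, 3, 5]])
```

    "Acceptable internal reliability" in scale-development papers usually means alpha above roughly 0.7 for each factor's item set.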

  2. Interlaboratory study of a liquid chromatography method for erythromycin: determination of uncertainty.

    PubMed

    Dehouck, P; Vander Heyden, Y; Smeyers-Verbeke, J; Massart, D L; Marini, R D; Chiap, P; Hubert, Ph; Crommen, J; Van de Wauw, W; De Beer, J; Cox, R; Mathieu, G; Reepmeyer, J C; Voigt, B; Estevenon, O; Nicolas, A; Van Schepdael, A; Adams, E; Hoogmartens, J

    2003-08-22

    Erythromycin is a mixture of macrolide antibiotics produced by Saccharopolyspora erythraea during fermentation. A new method for the analysis of erythromycin by liquid chromatography has previously been developed. It makes use of an Astec C18 polymeric column. After validation in one laboratory, the method was now validated in an interlaboratory study. Validation studies are commonly used to test the fitness of the analytical method prior to its use for routine quality testing. The data derived in the interlaboratory study can be used to make an uncertainty statement as well. The relationship between validation and uncertainty statements is not clear to many analysts, and there is a need to show how the existing data, derived during validation, can be used in practice. Eight laboratories participated in this interlaboratory study. The set-up allowed the determination of the repeatability variance, s²r, and the between-laboratory variance, s²L. Combining s²r and s²L gives the reproducibility variance, s²R. It has been shown how these data can be used in future by a single laboratory that wants to make an uncertainty statement concerning the same analysis.
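    The variance decomposition described above — repeatability variance s²r plus between-laboratory variance s²L giving the reproducibility variance s²R — falls out of a balanced one-way ANOVA over laboratories. A sketch with hypothetical assay results (not the study's erythromycin data):

```python
def variance_components(labs):
    """Balanced one-way interlaboratory design: p labs, n replicates
    each. Returns (s2r, s2L, s2R) with s2r = MS_within,
    s2L = (MS_between - MS_within) / n, and s2R = s2r + s2L."""
    p, n = len(labs), len(labs[0])
    lab_means = [sum(x) / n for x in labs]
    grand = sum(lab_means) / p
    ms_between = n * sum((m - grand) ** 2 for m in lab_means) / (p - 1)
    ms_within = sum((x - m) ** 2
                    for lab, m in zip(labs, lab_means)
                    for x in lab) / (p * (n - 1))
    s2r = ms_within
    s2L = max(0.0, (ms_between - ms_within) / n)  # clip negative estimates
    return s2r, s2L, s2r + s2L

# Hypothetical results: three labs, duplicate determinations each.
s2r, s2L, s2R = variance_components([[10.0, 12.0], [14.0, 16.0],
                                     [12.0, 14.0]])
```

    A single laboratory can then quote a standard uncertainty based on the reproducibility standard deviation, sqrt(s2R), for results obtained with the same method.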

  3. Assessing Discriminative Performance at External Validation of Clinical Prediction Models.

    PubMed

    Nieboer, Daan; van der Ploeg, Tjeerd; Steyerberg, Ewout W

    2016-01-01

    External validation studies are essential to study the generalizability of prediction models. Recently a permutation test, focusing on discrimination as quantified by the c-statistic, was proposed to judge whether a prediction model is transportable to a new setting. We aimed to evaluate this test and compare it to previously proposed procedures to judge any changes in c-statistic from development to external validation setting. We compared the use of the permutation test to the use of benchmark values of the c-statistic following from a previously proposed framework to judge transportability of a prediction model. In a simulation study we developed a prediction model with logistic regression on a development set and validated it in the validation set. We concentrated on two scenarios: 1) the case-mix was more heterogeneous and predictor effects were weaker in the validation set compared to the development set, and 2) the case-mix was less heterogeneous in the validation set and predictor effects were identical in the validation and development set. Furthermore, we illustrated the methods in a case study using 15 datasets of patients suffering from traumatic brain injury. The permutation test indicated that the validation and development set were homogeneous in scenario 1 (in almost all simulated samples) and heterogeneous in scenario 2 (in 17%-39% of simulated samples). Previously proposed benchmark values of the c-statistic and the standard deviation of the linear predictors correctly pointed at the more heterogeneous case-mix in scenario 1 and the less heterogeneous case-mix in scenario 2. The recently proposed permutation test may provide misleading results when externally validating prediction models in the presence of case-mix differences between the development and validation population. To correctly interpret the c-statistic found at external validation it is crucial to disentangle case-mix differences from incorrect regression coefficients.
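    The c-statistic at issue above is the probability that a randomly chosen event receives a higher predicted risk than a randomly chosen non-event (ties counted as one half). A minimal sketch of its computation, on made-up predictions:

```python
def c_statistic(scores, labels):
    """Concordance (c) statistic: the probability that a randomly chosen
    event (label 1) is scored higher than a randomly chosen non-event
    (label 0), counting ties as 0.5."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > q else 0.5 if p == q else 0.0
               for p in pos for q in neg)
    return wins / (len(pos) * len(neg))

# Perfectly separating predictions give c = 1.0; mixed ranking lowers c.
c_perfect = c_statistic([0.9, 0.8, 0.3, 0.2], [1, 1, 0, 0])
c_mixed = c_statistic([0.2, 0.8, 0.3, 0.9], [1, 1, 0, 0])
```

    The abstract's point is that a drop in this quantity at external validation can reflect a narrower case-mix rather than miscalibrated coefficients, so the two causes must be disentangled before judging the model.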

  4. Validation of a physical anthropology methodology using mandibles for gender estimation in a Brazilian population.

    PubMed

    Carvalho, Suzana Papile Maciel; Brito, Liz Magalhães; Paiva, Luiz Airton Saavedra de; Bicudo, Lucilene Arilho Ribeiro; Crosato, Edgard Michel; Oliveira, Rogério Nogueira de

    2013-01-01

    Validation studies of physical anthropology methods in the different population groups are extremely important, especially in cases in which the population variations may cause problems in the identification of a native individual by the application of norms developed for different communities. This study aimed to estimate the gender of skeletons by application of the method of Oliveira, et al. (1995), previously used in a population sample from Northeast Brazil. The accuracy of this method was assessed for a population from Southeast Brazil and validated by statistical tests. The method used two mandibular measurements, namely the bigonial distance and the mandibular ramus height. The sample was composed of 66 skulls and the method was applied by two examiners. The results were statistically analyzed by the paired t test, logistic discriminant analysis and logistic regression. The results demonstrated that the application of the method of Oliveira, et al. (1995) in this population achieved very different outcomes between genders, with 100% for females and only 11% for males, which may be explained by ethnic differences. However, statistical adjustment of measurement data for the population analyzed allowed accuracy of 76.47% for males and 78.13% for females, with the creation of a new discriminant formula. It was concluded that methods involving physical anthropology present a high rate of accuracy for human identification, easy application, low cost and simplicity; however, the methodologies must be validated for the different populations due to differences in ethnic patterns, which are directly related to the phenotypic aspects. In this specific case, the method of Oliveira, et al. (1995) presented good accuracy and may be used for gender estimation in Brazil in two geographic regions, namely Northeast and Southeast; however, for other regions of the country (North, Central West and South), previous methodological adjustment is recommended, as demonstrated in this study.

  5. Experiences Using Lightweight Formal Methods for Requirements Modeling

    NASA Technical Reports Server (NTRS)

    Easterbrook, Steve; Lutz, Robyn; Covington, Rick; Kelly, John; Ampo, Yoko; Hamilton, David

    1997-01-01

    This paper describes three case studies in the lightweight application of formal methods to requirements modeling for spacecraft fault protection systems. The case studies differ from previously reported applications of formal methods in that formal methods were applied very early in the requirements engineering process, to validate the evolving requirements. The results were fed back into the projects, to improve the informal specifications. For each case study, we describe what methods were applied, how they were applied, how much effort was involved, and what the findings were. In all three cases, formal methods enhanced the existing verification and validation processes, by testing key properties of the evolving requirements, and helping to identify weaknesses. We conclude that the benefits gained from early modeling of unstable requirements more than outweigh the effort needed to maintain multiple representations.

  6. A Comparison of Assessment Methods and Raters in Product Creativity

    ERIC Educational Resources Information Center

    Lu, Chia-Chen; Luh, Ding-Bang

    2012-01-01

    Although previous studies have attempted to use different experiences of raters to rate product creativity by adopting the Consensus Assessment Method (CAT) approach, the validity of replacing CAT with another measurement tool has not been adequately tested. This study aimed to compare raters with different levels of experience (expert vs.…

  7. Assessing the Effectiveness of "Wise Guys": A Mixed-Methods Approach

    ERIC Educational Resources Information Center

    Herrman, Judith W.; Gordon, Mellissa; Rahmer, Brian; Moore, Christopher C.; Habermann, Barbara; Haigh, Katherine M.

    2017-01-01

    Previous research raised questions on the validity of survey studies with the teen population. As one response, our team implemented a mixed-methods study to evaluate an evidence-based, interactive curriculum, "Wise Guys," which is designed to promote healthy relationships and sexual behavior in young men ages 4-17. The current study…

  8. Landscape scale estimation of soil carbon stock using 3D modelling.

    PubMed

    Veronesi, F; Corstanje, R; Mayr, T

    2014-07-15

    Soil C is the largest pool of carbon in the terrestrial biosphere, and yet the processes of C accumulation, transformation and loss are poorly accounted for. This, in part, is due to the fact that soil C is not uniformly distributed through the soil depth profile, and most current landscape-level predictions of C do not adequately account for the vertical distribution of soil C. In this study, we apply a method based on simple soil-specific depth functions to map the soil C stock in three dimensions at landscape scale. We used soil C and bulk density data from the Soil Survey for England and Wales to map an area in the West Midlands region of approximately 13,948 km(2). We applied a method which describes the variation through the soil profile and interpolates this across the landscape using well-established soil drivers such as relief, land cover and geology. The results indicate that this mapping method can effectively reproduce the observed variation in the soil profile samples. The mapping results were assessed using cross-validation and an independent validation. The cross-validation resulted in an R(2) of 36% for soil C and 44% for BULKD. These results are generally in line with previous validated studies. In addition, an independent validation was undertaken, comparing the predictions against the National Soil Inventory (NSI) dataset. The majority of the residuals of this validation are within ± 5% of soil C, indicating a high level of accuracy in replicating topsoil values. In addition, the results were compared to a previous study estimating the carbon stock of the UK. We discuss the implications of our results within the context of soil C loss factors such as erosion and the impact on regional C process models. Copyright © 2014 Elsevier B.V. All rights reserved.

  9. Semiquantitative determination of mesophilic, aerobic microorganisms in cocoa products using the Soleris NF-TVC method.

    PubMed

    Montei, Carolyn; McDougal, Susan; Mozola, Mark; Rice, Jennifer

    2014-01-01

    The Soleris Non-fermenting Total Viable Count method was previously validated for a wide variety of food products, including cocoa powder. A matrix extension study was conducted to validate the method for use with cocoa butter and cocoa liquor. Test samples included naturally contaminated cocoa liquor and cocoa butter inoculated with natural microbial flora derived from cocoa liquor. A probability of detection statistical model was used to compare Soleris results at multiple test thresholds (dilutions) with aerobic plate counts determined using the AOAC Official Method 966.23 dilution plating method. Results of the two methods were not statistically different at any dilution level in any of the three trials conducted. The Soleris method offers the advantage of results within 24 h, compared to the 48 h required by standard dilution plating methods.

  10. Determination of Phosphorus and Potassium in Commercial Inorganic Fertilizers by Inductively Coupled Plasma-Optical Emission Spectrometry: Single-Laboratory Validation, First Action 2015.18.

    PubMed

    Thiex, Nancy J

    2016-07-01

    A previously validated method for the determination of both citrate-EDTA-soluble P and K and acid-soluble P and K in commercial inorganic fertilizers by inductively coupled plasma-optical emission spectrometry was submitted to the expert review panel (ERP) for fertilizers for consideration of First Action Official Method(SM) status. The ERP evaluated the single-laboratory validation results and recommended the method for First Action Official Method status and provided recommendations for achieving Final Action. Validation materials ranging from 4.4 to 52.4% P2O5 (1.7-22.7% P) and 3-62% K2O (2.5-51.1% K) were used for the validation. Recoveries from validation materials for citrate-soluble P and K ranged from 99.3 to 124.9% P and from 98.4 to 100.7% K. Recoveries from validation materials for acid-soluble "total" P and K ranged from 95.53 to 99.40% P and from 98.36 to 107.28% K. Values of r for citrate-soluble P and K, expressed as RSD, ranged from 0.28 to 1.30% for P and from 0.41 to 1.52% for K. Values of r for total P and K, expressed as RSD, ranged from 0.71 to 1.13% for P and from 0.39 to 1.18% for K. Based on the validation data, the ERP recommended the method (with alternatives for the citrate-soluble and the acid-soluble extractions) for First Action Official Method status and provided recommendations for achieving Final Action status.

  11. Cluster analysis of European Y-chromosomal STR haplotypes using the discrete Laplace method.

    PubMed

    Andersen, Mikkel Meyer; Eriksen, Poul Svante; Morling, Niels

    2014-07-01

    The European Y-chromosomal short tandem repeat (STR) haplotype distribution has previously been analysed in various ways. Here, we introduce a new way of analysing population substructure, using a method based on clustering within the discrete Laplace exponential family that models the probability distribution of the Y-STR haplotypes. Creating a consistent statistical model of the haplotypes enables us to perform a wide range of analyses. Haplotype frequency estimation using the discrete Laplace method has previously been validated. In this paper we investigate how the discrete Laplace method can be used for cluster analysis, thereby validating it further. A practically important point is that the calculations can be performed on an ordinary computer. We identified two sub-clusters of the Eastern and Western European Y-STR haplotypes, similar to the results of previous studies. We also compared pairwise distances between geographically separated samples with those obtained using the AMOVA method and found good agreement. Further analyses that are impossible with AMOVA were made using the discrete Laplace method: analysing homogeneity in two different ways and calculating marginal STR distributions. We found that the Y-STR haplotypes from, for example, Finland were relatively homogeneous, as opposed to the relatively heterogeneous Y-STR haplotypes from Lublin, Eastern Poland and Berlin, Germany. We demonstrated that the observed distributions of alleles at each locus were similar to the expected ones. We also compared pairwise distances between geographically separated samples from Africa with those obtained using the AMOVA method, again finding good agreement. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
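The per-locus building block of the discrete Laplace method is the discrete Laplace distribution on the integers, P(X = x) = ((1 − p)/(1 + p))·p^|x|, which models deviations in repeat number from a central haplotype at each STR locus. A minimal sketch of that pmf (a simplified illustration, not the authors' full clustering implementation):

```python
def discrete_laplace_pmf(x, p):
    """pmf of the discrete Laplace distribution on the integers:
    P(X = x) = (1 - p) / (1 + p) * p**abs(x), for 0 < p < 1."""
    return (1.0 - p) / (1.0 + p) * p ** abs(x)

p = 0.3
# Probabilities over a wide symmetric range of deviations should sum to ~1
total = sum(discrete_laplace_pmf(x, p) for x in range(-50, 51))
print(total)
```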

  12. Evidence-based dentistry: analysis of dental anxiety scales for children.

    PubMed

    Al-Namankany, A; de Souza, M; Ashley, P

    2012-03-09

    To review paediatric dental anxiety measures (DAMs) and assess the statistical methods used for their validation and the clinical implications. Four computerised databases were searched for the period 1960 to January 2011 using pre-specified search terms, to assess the method of validation, including reliability as intra-observer agreement ('repeatability' or 'stability') and inter-observer agreement ('reproducibility'), and all types of validity. Fourteen paediatric DAMs were validated predominantly in schools rather than in the clinical setting, while five of the DAMs were not validated at all. The DAMs that were validated were tested against other paediatric DAMs that may not themselves have been validated previously. Reliability was not assessed for four of the DAMs; where it was assessed, it was usually 'good' or 'acceptable'. None of the studies of current DAMs used a formal sample size calculation. Diversity was seen between the measures, which ranged from a few simple pictograms to lists of questions reported by either the individual or an observer. To date there is no scale that can be considered a gold standard, and there is a need to develop further an anxiety scale with a cognitive component for children and adolescents.

  13. Optimal electromagnetic energy transmission and real-time dissipation in extended media.

    PubMed

    Glasgow, S; Ware, M

    2014-02-24

    Pulse reshaping effects that give rise to fast and slow light phenomena are inextricably linked to the dynamics of energy exchange between the pulse and the propagation medium. Energy that is dissipated from the pulse can no longer participate in this exchange process, but previous methods of calculating real-time dissipation are not valid for extended propagation media. We present a method for calculating real-time dissipation that is valid for electromagnetic pulse propagation in extended media. This method allows one to divide the energy stored in an extended medium into the portion that can be later transmitted out of the medium, and that portion which must be lost to either dissipation or reflection.

  14. A brief measure of attitudes toward mixed methods research in psychology

    PubMed Central

    Roberts, Lynne D.; Povee, Kate

    2014-01-01

    The adoption of mixed methods research in psychology has trailed behind other social science disciplines. Teaching psychology students, academics, and practitioners about mixed methodologies may increase the use of mixed methods within the discipline. However, tailoring and evaluating education and training in mixed methodologies requires an understanding of, and a way of measuring, attitudes toward mixed methods research in psychology. To date, no such measure exists. In this article we present the development and initial validation of a new measure: Attitudes toward Mixed Methods Research in Psychology. A pool of 42 items, developed from previous qualitative research on attitudes toward mixed methods research, was administered along with validation measures via an online survey to a convenience sample of 274 psychology students, academics and psychologists. Principal axis factoring with varimax rotation on a subset of the sample produced a four-factor, 12-item solution. Confirmatory factor analysis on a separate subset of the sample indicated that a higher-order four-factor model provided the best fit to the data. The four factors ('Limited Exposure,' '(in)Compatibility,' 'Validity,' and 'Tokenistic Qualitative Component') each have acceptable internal reliability. Known-groups validity analyses based on preferred research orientation and self-rated mixed methods research skills, and convergent and divergent validity analyses based on measures of attitudes toward psychology as a science and scientist and practitioner orientation, provide initial validation of the measure. This brief, internally reliable measure can be used in assessing attitudes toward mixed methods research in psychology, measuring change in attitudes as part of the evaluation of mixed methods education, and in larger research programs. PMID:25429281

  15. Measuring Empathy in Pharmacy Students

    PubMed Central

    Van Winkle, Lon J.; Hojat, Mohammadreza

    2011-01-01

    Objective. To validate the Jefferson Scale of Empathy-Health Profession Students version (JSE-HPS) in pharmacy students. Methods. The JSE-HPS (20 items), adapted from the original Jefferson Scale of Empathy for use among students in the healthcare professions, was completed by 187 first-year pharmacy students at Midwestern University Chicago College of Pharmacy. Results. Two factors, “perspective-taking” and “compassionate care,” emerged from factor analysis in this study, accounting for 31% and 8% of the variance, respectively. These factors are similar to the prominent ones reported in previous research involving physicians and medical students, supporting the construct validity of this instrument for pharmacy students. In the current study, mean JSE-HPS score was comparable to those reported for medical students, and consistent with previous findings with medical students and physicians. Women scored significantly higher than men. Conclusions. Findings support the construct validity and reliability of the JSE-HPS for measuring empathy in pharmacy students. PMID:21931447

  16. Application of a Method of Estimating DIF for Polytomous Test Items.

    ERIC Educational Resources Information Center

    Camilli, Gregory; Congdon, Peter

    1999-01-01

    Demonstrates a method for studying differential item functioning (DIF) that can be used with dichotomous or polytomous items and that is valid for data that follow a partial credit Item Response Theory model. A simulation study shows that positively biased Type I error rates are in accord with results from previous studies. (SLD)

  17. Secondary Schools Principals and Their Job Satisfaction: A Test of Process Theories

    ERIC Educational Resources Information Center

    Maforah, Tsholofelo Paulinah

    2015-01-01

    The study aims to test the validity of process theories on the job satisfaction of previously disadvantaged Secondary School principals in the North West province. A mixed-method approach consisting of both quantitative and qualitative methods was used for the study. A questionnaire was administered during the quantitative phase with a sample that…

  18. Assessing Adolescents' Understanding of and Reactions to Stress in Different Cultures: Results of a Mixed-Methods Approach

    ERIC Educational Resources Information Center

    Nastasi, Bonnie K.; Hitchcock, John H.; Burkholder, Gary; Varjas, Kristen; Sarkar, Sreeroopa; Jayasena, Asoka

    2007-01-01

    This article expands on an emerging mixed-method approach for validating culturally-specific constructs (see Hitchcock et al., 2005). Previous work established an approach for dealing with cultural impacts when assessing psychological constructs and the current article extends these efforts into studying stress reactions among adolescents in Sri…

  19. Early Prediction of Intensive Care Unit-Acquired Weakness: A Multicenter External Validation Study.

    PubMed

    Witteveen, Esther; Wieske, Luuk; Sommers, Juultje; Spijkstra, Jan-Jaap; de Waard, Monique C; Endeman, Henrik; Rijkenberg, Saskia; de Ruijter, Wouter; Sleeswijk, Mengalvio; Verhamme, Camiel; Schultz, Marcus J; van Schaik, Ivo N; Horn, Janneke

    2018-01-01

    An early diagnosis of intensive care unit-acquired weakness (ICU-AW) is often not possible due to impaired consciousness. To avoid a diagnostic delay, we previously developed a prediction model, based on single-center data from 212 patients (development cohort), to predict ICU-AW at 2 days after ICU admission. The objective of this study was to investigate the external validity of the original prediction model in a new, multicenter cohort and, if necessary, to update the model. Newly admitted ICU patients who were mechanically ventilated at 48 hours after ICU admission were included. Predictors were prospectively recorded, and the outcome ICU-AW was defined by an average Medical Research Council score <4. In the validation cohort, consisting of 349 patients, we analyzed performance of the original prediction model by assessment of calibration and discrimination. Additionally, we updated the model in this validation cohort. Finally, we evaluated a new prediction model based on all patients of the development and validation cohort. Of 349 analyzed patients in the validation cohort, 190 (54%) developed ICU-AW. Both model calibration and discrimination of the original model were poor in the validation cohort. The area under the receiver operating characteristics curve (AUC-ROC) was 0.60 (95% confidence interval [CI]: 0.54-0.66). Model updating methods improved calibration but not discrimination. The new prediction model, based on all patients of the development and validation cohort (total of 536 patients) had a fair discrimination, AUC-ROC: 0.70 (95% CI: 0.66-0.75). The previously developed prediction model for ICU-AW showed poor performance in a new independent multicenter validation cohort. Model updating methods improved calibration but not discrimination. The newly derived prediction model showed fair discrimination. This indicates that early prediction of ICU-AW is still challenging and needs further attention.

  20. A New MI-Based Visualization Aided Validation Index for Mining Big Longitudinal Web Trial Data

    PubMed Central

    Zhang, Zhaoyang; Fang, Hua; Wang, Honggang

    2016-01-01

    Web-delivered clinical trials generate big, complex data. To help untangle the heterogeneity of treatment effects, unsupervised learning methods have been widely applied. However, identifying valid patterns is a priority but challenging issue for these methods. This paper, built upon our previous research on multiple imputation (MI)-based fuzzy clustering and validation, proposes a new MI-based visualization-aided validation index (MIVOOS) to determine the optimal number of clusters for big incomplete longitudinal Web-trial data with inflated zeros. Different from a recently developed fuzzy clustering validation index, MIVOOS uses overlap and separation measures more suitable for Web-trial data and, unlike the widely used Xie and Beni (XB) index, does not depend on the choice of fuzzifier. By optimizing the view angles of 3-D projections using Sammon mapping, the optimal 2-D projection-guided MIVOOS is obtained to better visualize and verify the patterns in conjunction with trajectory patterns. Compared with XB and VOS, our newly proposed MIVOOS shows its robustness in validating big Web-trial data under different missing-data mechanisms, using real and simulated Web-trial data. PMID:27482473
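For reference, the Xie and Beni (XB) index that MIVOOS is contrasted with divides the membership-weighted within-cluster compactness by the minimum separation between cluster centers; lower values indicate compact, well-separated clusters. A minimal numpy sketch under that standard definition (the data and memberships below are synthetic illustrations, not Web-trial data):

```python
import numpy as np

def xie_beni(X, centers, U, m=2.0):
    """Xie-Beni validity index for fuzzy clustering.
    X: (n, d) data; centers: (c, d); U: (c, n) membership matrix;
    m: fuzzifier. Lower is better."""
    n = X.shape[0]
    # Compactness: membership-weighted squared distances to each center
    d2 = ((X[None, :, :] - centers[:, None, :]) ** 2).sum(axis=2)   # (c, n)
    compact = (U ** m * d2).sum()
    # Separation: minimum squared distance between distinct centers
    c2 = ((centers[None, :, :] - centers[:, None, :]) ** 2).sum(axis=2)
    np.fill_diagonal(c2, np.inf)
    return compact / (n * c2.min())

# Toy example: two well-separated blobs with near-crisp memberships
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.2, (30, 2)), rng.normal(5, 0.2, (30, 2))])
centers = np.array([[0.0, 0.0], [5.0, 5.0]])
U = np.zeros((2, 60))
U[0, :30] = 1.0
U[1, 30:] = 1.0
print(xie_beni(X, centers, U))
```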

  1. Mathematic models for a ray tracing method and its applications in wireless optical communications.

    PubMed

    Zhang, Minglun; Zhang, Yangan; Yuan, Xueguang; Zhang, Jinnan

    2010-08-16

    This paper presents a new ray tracing method, which contains a whole set of mathematic models, and its validity is verified by simulations. In addition, both theoretical analysis and simulation results show that the computational complexity of the method is much lower than that of previous ones. Therefore, the method can be used to rapidly calculate the impulse response of wireless optical channels for complicated systems.

  2. Development and validation of a matrix solid-phase dispersion method to determine acrylamide in coffee and coffee substitutes.

    PubMed

    Soares, Cristina M Dias; Alves, Rita C; Casal, Susana; Oliveira, M Beatriz P P; Fernandes, José Oliveira

    2010-04-01

    The present study describes the development and validation of a new method based on a matrix solid-phase dispersion (MSPD) sample preparation procedure followed by GC-MS for the determination of acrylamide levels in coffee (ground and brewed) and coffee substitute samples. Samples were dispersed in C18 sorbent and the mixture was packed into a preconditioned custom-made ISOLUTE bilayered SPE column (C18/Multimode; 1 g + 1 g). Acrylamide was subsequently eluted with water, derivatized with bromine, and quantified by GC-MS in SIM mode. The MSPD/GC-MS method presented an LOD of 5 microg/kg and an LOQ of 10 microg/kg. Intra- and interday precisions ranged from 2% to 4% and from 4% to 10%, respectively. To evaluate the performance of the method, 11 samples of ground and brewed coffee and coffee substitutes were analyzed simultaneously by the developed method and by a previously validated method based on a liquid-extraction (LE) procedure; the results of the two methods were highly correlated.

  3. [Selection of risk and diagnosis in diabetic polyneuropathy. Validation of method of new systems].

    PubMed

    Jurado, Jerónimo; Caula, Jacinto; Pou i Torelló, Josep Maria

    2006-06-30

    In a previous study we developed a specific algorithm, the polyneuropathy selection method (PSM) with 4 parameters (age, HDL-C, HbA1c, and retinopathy), to select patients at risk of diabetic polyneuropathy (DPN). We also developed a simplified method for DPN diagnosis: outpatient polyneuropathy diagnosis (OPD), with 4 variables (symptoms and 3 objective tests). To confirm the validity of conventional tests for DPN diagnosis; to validate the discriminatory power of the PSM and the diagnostic value of OPD by evaluating their relationship to electrodiagnosis studies and objective clinical neurological assessment; and to evaluate the correlation of DPN and pro-inflammatory status. Cross-sectional, crossed association for PSM validation. Paired samples for OPD validation. Primary care in 3 counties. Random sample of 75 subjects from the type-2 diabetes census for PSM evaluation. Thirty DPN patients and 30 non-DPN patients (from 2 DM2 sub-groups in our earlier study) for OPD evaluation. The gold standard for DPN diagnosis will be studied by means of a clinical neurological study (symptoms, physical examination, and sensitivity tests) and electrodiagnosis studies (sensitivity and motor EMG). Risks of neuropathy, macroangiopathy and pro-inflammatory status (PCR, TNF soluble fraction and total TGF-beta1) will be studied in every subject. Electrodiagnosis studies should confirm the validity of conventional tests for DPN diagnosis. PSM and OPD will be valid methods for selecting patients at risk and diagnosing DPN. There will be a significant relationship between DPN and pro-inflammatory tests.

  4. Diagnostic accuracy of different caries risk assessment methods. A systematic review.

    PubMed

    Senneby, Anna; Mejàre, Ingegerd; Sahlin, Nils-Eric; Svensäter, Gunnel; Rohlin, Madeleine

    2015-12-01

    To evaluate the accuracy of different methods used to identify individuals with increased risk of developing dental coronal caries. Studies on the following methods were included: previous caries experience, tests using microbiota, buffering capacity, salivary flow rate, oral hygiene, dietary habits and sociodemographic variables. QUADAS-2 was used to assess risk of bias. Sensitivity, specificity, predictive values, and likelihood ratios (LR) were calculated. Quality of evidence, based on ≥3 studies of a method, was rated according to GRADE. PubMed, the Cochrane Library, Web of Science and the reference lists of included publications were searched up to January 2015. From 5776 identified articles, 18 were included. Assessment of study quality identified methodological limitations concerning study design, test technology and reporting. No study presented a low risk of bias in all domains. Three or more studies were found only for previous caries experience and salivary mutans streptococci, and the quality of evidence for these methods was low; evidence regarding other methods was lacking. For previous caries experience, sensitivity ranged between 0.21 and 0.94 and specificity between 0.20 and 1.0. Tests using salivary mutans streptococci resulted in low sensitivity and high specificity. For children with primary teeth at baseline, the pooled LR for a positive test was 3 for previous caries experience and 4 for salivary mutans streptococci, given a threshold of ≥10⁵ CFU/mL. Evidence on the validity of the analysed methods used for caries risk assessment is limited. As methodological quality was low, there is a need to improve study design. Low validity of the analysed methods may mean that patients at increased risk are not identified, whereas others are falsely identified as being at risk. As caries risk assessment guides individualized decisions on interventions and intervals for patient recall, improved performance based on best evidence is greatly needed. Copyright © 2015 Elsevier Ltd. All rights reserved.
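The likelihood ratios reported above are derived from sensitivity and specificity in the standard way: LR+ = sensitivity/(1 − specificity) and LR− = (1 − sensitivity)/specificity. A minimal sketch with hypothetical 2x2-table counts, purely for illustration:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity and likelihood ratios from a 2x2 table
    of a risk-assessment test against observed caries development."""
    sens = tp / (tp + fn)                 # true positives / all diseased
    spec = tn / (tn + fp)                 # true negatives / all healthy
    lr_pos = sens / (1.0 - spec)          # LR for a positive test
    lr_neg = (1.0 - sens) / spec          # LR for a negative test
    return sens, spec, lr_pos, lr_neg

# Hypothetical counts for illustration only (not from any included study)
sens, spec, lr_pos, lr_neg = diagnostic_metrics(tp=60, fp=20, fn=40, tn=80)
print(sens, spec, lr_pos, lr_neg)
```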

  5. Cross-validation pitfalls when selecting and assessing regression and classification models.

    PubMed

    Krstajic, Damjan; Buturovic, Ljubomir J; Leahy, David E; Thomas, Simon

    2014-03-29

    We address the problem of selecting and assessing classification and regression models using cross-validation. Current state-of-the-art methods can yield models with high variance, rendering them unsuitable for a number of practical applications including QSAR. In this paper we describe and evaluate best practices which improve reliability and increase confidence in selected models. A key operational component of the proposed methods is cloud computing which enables routine use of previously infeasible approaches. We describe in detail an algorithm for repeated grid-search V-fold cross-validation for parameter tuning in classification and regression, and we define a repeated nested cross-validation algorithm for model assessment. As regards variable selection and parameter tuning we define two algorithms (repeated grid-search cross-validation and double cross-validation), and provide arguments for using the repeated grid-search in the general case. We show results of our algorithms on seven QSAR datasets. The variation of the prediction performance, which is the result of choosing different splits of the dataset in V-fold cross-validation, needs to be taken into account when selecting and assessing classification and regression models. We demonstrate the importance of repeating cross-validation when selecting an optimal model, as well as the importance of repeating nested cross-validation when assessing a prediction error.
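The repeated grid-search V-fold cross-validation idea can be sketched compactly: run V-fold CV several times over different random splits, average the error per candidate parameter value, and pick the minimiser. The sketch below is a simplified stand-in (polynomial degree as the tuned parameter, synthetic data), not the authors' full algorithm or their nested-CV assessment step:

```python
import numpy as np

def vfold_mse(x, y, degree, v, rng):
    """Mean squared error of a polynomial fit of the given degree,
    estimated by one V-fold cross-validation run."""
    idx = rng.permutation(len(y))
    folds = np.array_split(idx, v)
    errs = []
    for fold in folds:
        train = np.setdiff1d(idx, fold)
        coef = np.polyfit(x[train], y[train], degree)
        errs.append(np.mean((y[fold] - np.polyval(coef, x[fold])) ** 2))
    return float(np.mean(errs))

def repeated_grid_search_cv(x, y, degrees, v=5, repeats=10, seed=0):
    """Repeat V-fold CV over several random splits, average the error per
    candidate parameter, and return the minimiser (a simplified sketch of
    the repeated grid-search procedure described in the abstract)."""
    rng = np.random.default_rng(seed)
    mean_err = {d: np.mean([vfold_mse(x, y, d, v, rng) for _ in range(repeats)])
                for d in degrees}
    return min(mean_err, key=mean_err.get), mean_err

rng = np.random.default_rng(42)
x = rng.uniform(-2, 2, 150)
y = x ** 2 + rng.normal(0, 0.3, 150)   # quadratic ground truth
best, errs = repeated_grid_search_cv(x, y, degrees=[1, 2, 3, 4])
print(best)
```

Averaging over repeats addresses exactly the split-to-split variation the abstract highlights: a single V-fold run can favour a suboptimal parameter purely because of how the folds happened to fall.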

  6. An Investigation of Agility Issues in Scrum Teams Using Agility Indicators

    NASA Astrophysics Data System (ADS)

    Pikkarainen, Minna; Wang, Xiaofeng

    Agile software development methods have emerged and become increasingly popular in recent years; yet the issues encountered by software development teams that strive to achieve agility using agile methods have not been explored systematically. Built upon a previous study that established a set of indicators of agility, this study investigates what issues are manifested in software development teams using agile methods, focusing particularly on Scrum teams. In other words, the goal of the chapter is to evaluate Scrum teams using agility indicators and thereby to further validate the previously presented agility indicators in additional cases. A multiple case study research method is employed. The findings of the study reveal that teams using Scrum do not necessarily achieve agility in terms of team autonomy, sharing, stability and embraced uncertainty. Possible reasons include a previous plan-driven organizational culture, resistance towards the Scrum roles, and changing resources.

  7. Validation of a Russian Language Oswestry Disability Index Questionnaire.

    PubMed

    Yu, Elizabeth M; Nosova, Emily V; Falkenstein, Yuri; Prasad, Priya; Leasure, Jeremi M; Kondrashov, Dimitriy G

    2016-11-01

    Study Design  Retrospective reliability and validity study. Objective  To validate a recently translated Russian language version of the Oswestry Disability Index (R-ODI) using standardized methods detailed in previous validations in other languages. Methods  We included all subjects who were seen in our spine surgery clinic, were over the age of 18, and were fluent in the Russian language. The R-ODI was translated by six bilingual people and combined into a consensus version. R-ODI and visual analog scale (VAS) questionnaires for leg and back pain were distributed to subjects during both their initial and follow-up visits. Test validity, stability, and internal consistency were measured using standardized psychometric methods. Results  Ninety-seven subjects participated in the study. No change in the meaning of the questions on the R-ODI was noted with translation from English to Russian. There was a significant positive correlation between R-ODI and VAS scores for both the leg and the back during both the initial and follow-up visits (p < 0.01 for all). The instrument was shown to have high internal consistency (Cronbach α = 0.82) and moderate test-retest stability (intraclass correlation coefficient = 0.70). Conclusions  The R-ODI is both valid and reliable for use among the Russian-speaking population in the United States.
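The Cronbach's α quoted above is the usual internal-consistency statistic, α = k/(k − 1)·(1 − Σ item variances / variance of the total score) for k items. A minimal sketch on synthetic questionnaire data (illustrative only, not the R-ODI responses):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_subjects, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance of total score)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_var / total_var)

# Synthetic data: a common latent score plus independent item noise,
# which yields high internal consistency by construction
rng = np.random.default_rng(0)
latent = rng.normal(0, 1, (100, 1))
scores = latent + rng.normal(0, 0.7, (100, 10))
print(round(cronbach_alpha(scores), 2))
```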

  8. LC-UV assay method and UPLC/Q-TOF-MS characterisation of saponins from Ilex paraguariensis A. St. Hil. (mate) unripe fruits.

    PubMed

    Peixoto, Maria Paula Garofo; Kaiser, Samuel; Verza, Simone Gasparin; de Resende, Pedro Ernesto; Treter, Janine; Pavei, Cabral; Borré, Gustavo Luís; Ortega, George González

    2012-01-01

    Ilex paraguariensis A. St. Hil. (mate) is known in several South American countries through the use of its leaves in stimulant herbal beverages. High saponin contents have been reported in mate leaves and unripe fruits, which possess dissimilar compositions. Two LC-UV methods previously reported for the assay of mate saponins focused on mate leaves and on quantification of the less polar saponin fraction in mate fruits. To develop and validate an LC-UV method to assay the total content of saponins in unripe mate fruits and to characterise the chemical structure of the triterpenic saponins by UPLC/Q-TOF-MS. From unripe mate fruits, a crude ethanolic extract (EX40) was prepared and the mate saponin fraction (MSF) purified by solid-phase extraction. The LC-UV method was validated using ilexoside II as external standard. The UPLC/Q-TOF-MS conditions were adapted from the LC-UV method to obtain the fragmentation patterns of the main saponins present in unripe fruits. Both the LC-UV and UPLC/Q-TOF-MS methods indicate a wide range of Ilex saponin polarities. The ilexoside II and total saponin contents of EX40 were 8.20% (w/w) and 47.60% (w/w), respectively. The total saponin content in unripe fruits was 7.28% (w/w). The saponins present in the MSF, characterised by UPLC/Q-TOF-MS, are derived mainly from ursolic/oleanolic, acetyl ursolic or pomolic acid. The validated LC-UV method was shown to be linear, precise and accurate, to cover several saponins previously isolated from Ilex species, and could be applied to the quality control of unripe fruit saponins. Copyright © 2011 John Wiley & Sons, Ltd.

  9. Validity of self-reported lunch recalls in Swedish school children aged 6-8 years.

    PubMed

    Hunsberger, Monica; Pena, Pablo; Lissner, Lauren; Grafström, Lisen; Vanaelst, Barbara; Börnhorst, Claudia; Pala, Valeria; Eiben, Gabriele

    2013-09-18

    Previous studies have suggested that young children are inaccurate reporters of dietary intake. The purpose of this study was to validate a single recall of the previous day's school lunch reported by 6-8 year old Swedish children and to assess teacher-recorded intake of the same meal in a standardized food journal. An additional research question was whether parents could report their child's intake of the previous day's lunch. Subjects constituted a convenience sample from the large, multi-country study Identification and prevention of Dietary- and lifestyle-induced health EFfects In Children and infantS (IDEFICS). Both the children's recalls and the teachers' records were validated by comparing the results with the duplicate plate reference method. Twenty-five children (12 boys/13 girls) aged 6-8 years participated in the validation study at one school in western Sweden. Children were accurate self-reporters of their dietary intake at lunch, with no significant difference between reported and weighed intake (mean difference (SD): 7 (50) kcal, p=0.49). Teachers significantly over-reported intake (mean difference (SD): 65 (79) kcal, p=0.01). For both methods, child-reported and teacher-recorded, correlations with weighed intake were strong (Pearson's r=0.92, p<0.001 and r=0.83, p<0.001, respectively). Bland-Altman plots showed strong agreement between child-reported and weighed intakes but confirmed systematic differences between teacher records and weighed intakes. Foods were recalled by children with a food-match rate of 90%. In all cases parents themselves were unable to report on quantities consumed, and only four of the 25 children had parents with knowledge of the food items consumed. Children 6-8 years of age accurately recalled their school lunch intake for one occasion, while teachers recorded it with less accuracy. Our findings suggest that children as young as six years of age may be better able to report on their dietary intake than previously suggested, at least for one main meal at school. Teacher-recorded intake provides a satisfactory estimate but with greater systematic deviation from the weighed intake. Parents were not able to report on their children's school lunches consumed on the previous day.
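Bland-Altman agreement statistics of the kind used above reduce to the mean difference between methods (the bias) and its 95% limits of agreement, bias ± 1.96 SD of the differences. A minimal sketch with synthetic intake data (the numbers are illustrative, not the study's data):

```python
import numpy as np

def bland_altman_stats(method_a, method_b):
    """Bland-Altman agreement statistics: mean difference (bias) and
    95% limits of agreement (bias +/- 1.96 * SD of the differences)."""
    a, b = np.asarray(method_a, float), np.asarray(method_b, float)
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical intakes (kcal): reported vs. weighed, for illustration only
rng = np.random.default_rng(0)
weighed = rng.normal(500, 80, 25)
reported = weighed + rng.normal(7, 50, 25)   # small bias, moderate spread
bias, lo, hi = bland_altman_stats(reported, weighed)
print(round(bias), round(lo), round(hi))
```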

  10. Paediatric Automatic Phonological Analysis Tools (APAT).

    PubMed

    Saraiva, Daniela; Lousada, Marisa; Hall, Andreia; Jesus, Luis M T

    2017-12-01

    To develop the paediatric Automatic Phonological Analysis Tools (APAT) and to estimate inter- and intrajudge reliability, content validity, and concurrent validity. The APAT were constructed using Excel spreadsheets with formulas. The tools were presented to an expert panel for content validation. The corpus used in the Portuguese standardized test Teste Fonético-Fonológico - ALPE, produced by 24 children with phonological delay or phonological disorder, was recorded, transcribed, and then inserted into the APAT, and the reliability and validity of the APAT were analyzed. The APAT present strong inter- and intrajudge reliability (>97%). Content validity was also supported (ICC = 0.71), and concurrent validity revealed strong correlations between the computerized and manual (traditional) methods. The development of these tools helps to fill existing gaps in clinical practice and research, since no valid and reliable tools for automatic phonological analysis of different corpora were previously available.

  11. Implementation and Validation of an Impedance Eduction Technique

    NASA Technical Reports Server (NTRS)

    Watson, Willie R.; Jones, Michael G.; Gerhold, Carl H.

    2011-01-01

    Implementation of a pressure gradient method of impedance eduction in two NASA Langley flow ducts is described. The Grazing Flow Impedance Tube only supports plane-wave sources, while the Curved Duct Test Rig supports sources that contain higher-order modes. Multiple exercises are used to validate this new impedance eduction method. First, synthesized data for a hard wall insert and a conventional liner mounted in the Grazing Flow Impedance Tube are used as input to the two impedance eduction methods, the pressure gradient method and a previously validated wall pressure method. Comparisons between the two results are excellent. Next, data measured in the Grazing Flow Impedance Tube are used as input to both methods. Results from the two methods compare quite favorably for sufficiently low Mach numbers but this comparison degrades at Mach 0.5, especially when the hard wall insert is used. Finally, data measured with a hard wall insert mounted in the Curved Duct Test Rig are used as input to the pressure gradient method. Significant deviation from the known solution is observed, which is believed to be largely due to 3-D effects in this flow duct. Potential solutions to this issue are currently being explored.

  12. JaCVAM-organized international validation study of the in vivo rodent alkaline comet assay for detection of genotoxic carcinogens: II. Summary of definitive validation study results.

    PubMed

    Uno, Yoshifumi; Kojima, Hajime; Omori, Takashi; Corvi, Raffaella; Honma, Masamitsu; Schechtman, Leonard M; Tice, Raymond R; Beevers, Carol; De Boeck, Marlies; Burlinson, Brian; Hobbs, Cheryl A; Kitamoto, Sachiko; Kraynak, Andrew R; McNamee, James; Nakagawa, Yuzuki; Pant, Kamala; Plappert-Helbig, Ulla; Priestley, Catherine; Takasawa, Hironao; Wada, Kunio; Wirnitzer, Uta; Asano, Norihide; Escobar, Patricia A; Lovell, David; Morita, Takeshi; Nakajima, Madoka; Ohno, Yasuo; Hayashi, Makoto

    2015-07-01

    The in vivo rodent alkaline comet assay (comet assay) is used internationally to investigate the in vivo genotoxic potential of test chemicals. This assay, however, has not previously been formally validated. The Japanese Center for the Validation of Alternative Methods (JaCVAM), with the cooperation of the U.S. NTP Interagency Center for the Evaluation of Alternative Toxicological Methods (NICEATM)/the Interagency Coordinating Committee on the Validation of Alternative Methods (ICCVAM), the European Centre for the Validation of Alternative Methods (ECVAM), and the Japanese Environmental Mutagen Society/Mammalian Mutagenesis Study Group (JEMS/MMS), organized an international validation study to evaluate the reliability and relevance of the assay for identifying genotoxic carcinogens, using liver and stomach as target organs. The ultimate goal of this exercise was to establish an Organisation for Economic Co-operation and Development (OECD) test guideline. The study protocol was optimized in the pre-validation studies, and then the definitive (4th phase) validation study was conducted in two steps. In the 1st step, assay reproducibility was confirmed among laboratories using four coded reference chemicals and the positive control ethyl methanesulfonate. In the 2nd step, the predictive capability was investigated using 40 coded chemicals with known genotoxic and carcinogenic activity (i.e., genotoxic carcinogens, genotoxic non-carcinogens, non-genotoxic carcinogens, and non-genotoxic non-carcinogens). Based on the results obtained, the in vivo comet assay is concluded to be highly capable of identifying genotoxic chemicals and therefore can serve as a reliable predictor of rodent carcinogenicity. Copyright © 2015 Elsevier B.V. All rights reserved.

  13. Classification of burn wounds using support vector machines

    NASA Astrophysics Data System (ADS)

    Acha, Begona; Serrano, Carmen; Palencia, Sergio; Murillo, Juan Jose

    2004-05-01

    The purpose of this work is to improve a previous method developed by the authors for the classification of burn wounds into their depths. The inputs of the system are color and texture information, as these are the characteristics observed by physicians in order to give a diagnosis. Our previous work consisted of segmenting the burn wound from the rest of the image and classifying the burn into its depth. In this paper we focus on the classification problem only. We previously proposed the use of a Fuzzy-ARTMAP neural network (NN); here, we take advantage of newer, powerful classification tools such as Support Vector Machines (SVM). We apply the five-fold cross-validation scheme to divide the database into training and validation sets. Then, we apply a feature selection method to each classifier, which gives us the set of features that yields the smallest classification error for that classifier. The features used to classify are first-order statistical parameters extracted from the L*, u* and v* color components of the image. The feature selection algorithms used are the Sequential Forward Selection (SFS) and Sequential Backward Selection (SBS) methods. As the data in this problem are not linearly separable, the SVM was trained using several different kernels. The validation process shows that the SVM, when using a Gaussian kernel of variance 1, outperforms the other classifiers, yielding a classification error rate of 0.7%, whereas the Fuzzy-ARTMAP NN attained 1.6%.
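
    The five-fold scheme described above partitions the database so that every sample is used for validation exactly once. A minimal index-based sketch in plain Python (the study's actual SVM and Fuzzy-ARTMAP classifiers and burn-image features are not reproduced here):

```python
def k_fold_indices(n_samples, k=5):
    """Partition sample indices into k disjoint folds; return (train, validation) splits."""
    folds = [list(range(i, n_samples, k)) for i in range(k)]
    splits = []
    for i in range(k):
        validation = folds[i]
        train = [idx for j, fold in enumerate(folds) if j != i for idx in fold]
        splits.append((sorted(train), sorted(validation)))
    return splits

# Every sample lands in exactly one validation fold across the 5 splits.
splits = k_fold_indices(10, k=5)
```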

  14. Validation of a method to detect cocaine and its metabolites in nails by gas chromatography-mass spectrometry.

    PubMed

    Valente-Campos, Simone; Yonamine, Mauricio; de Moraes Moreau, Regina Lucia; Silva, Ovandir Alves

    2006-06-02

    The objective of the present work was to compare previously published methods and provide validation data for a method to simultaneously detect cocaine (COC), benzoylecgonine (BE), and norcocaine (NCOC) in nails. Finger- and toenail samples (5 mg) were cut into very small pieces and submitted to an initial procedure for external decontamination. Methanol (3 ml) was used to release the analytes from the matrix. A cleanup step was performed simultaneously by solid-phase extraction (SPE), and the residue was derivatized with pentafluoropropionic anhydride/pentafluoropropanol (PFPA/PFP). Gas chromatography-mass spectrometry (GC-MS) was used to detect the analytes in selected ion monitoring (SIM) mode. The validation parameters of the method were: recovery, intra- and inter-assay precision, and limit of detection (LOD) of the analytes. The limits of detection were 3.5 ng/mg for NCOC and 3.0 ng/mg for COC and BE. Good intra-assay precision was observed for all detected substances (coefficient of variation (CV) < 11%). The inter-assay precisions for norcocaine and benzoylecgonine were <4%. Deuterated internal standards were used for the intra- and inter-assay precision studies. Toenail and fingernail samples from eight declared cocaine users were submitted to the validated method.
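
    The coefficient of variation quoted above for precision is the sample standard deviation divided by the mean, expressed as a percentage. A minimal sketch (the replicate values are invented, not the study's measurements):

```python
import statistics

def coefficient_of_variation(values):
    """CV as a percentage: sample standard deviation over the mean."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Illustrative replicate measurements (ng/mg) of one analyte.
replicates = [10.2, 9.8, 10.5, 10.1, 9.9]
cv = coefficient_of_variation(replicates)  # a few percent, within the <11% criterion
```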

  15. Fruit and Vegetable Plate Waste among Students in a Suburban School District Participating in the National School Lunch Program

    ERIC Educational Resources Information Center

    Handforth, Kellyn M.; Gilboy, Mary Beth; Harris, Jeffrey; Melia, Nicole

    2016-01-01

    Purpose/Objectives: The purpose of this project was to assess fruit and vegetable plate waste, examine patterns of selection and consumption of specific fruit and vegetable subgroups, and analyze for differences across gender, grade level, and school. Methods: A previously-validated digital photography method was used to collect plate waste data…

  16. Evolving forecasting classifications and applications in health forecasting

    PubMed Central

    Soyiri, Ireneous N; Reidpath, Daniel D

    2012-01-01

    Health forecasting forewarns the health community about future health situations and disease episodes so that health systems can better allocate resources and manage demand. The tools used for developing health forecasts and for measuring their accuracy and validity are commonly not defined, although they are usually adapted forms of statistical procedures. This review identifies previous typologies used in classifying the forecasting methods commonly used to forecast health conditions or situations. It then discusses the strengths and weaknesses of these methods and presents the choices available for measuring the accuracy of health-forecasting models, including a note on the discrepancies in the modes of validation. PMID:22615533
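
    As an illustration of the kind of accuracy measures such reviews compare, two common ones (mean absolute error and mean absolute percentage error) can be sketched as follows; the admission counts and forecasts are invented:

```python
def mean_absolute_error(actual, predicted):
    """Average absolute difference between observed and forecast values."""
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

def mean_absolute_percentage_error(actual, predicted):
    """MAE scaled by the observed values, as a percentage (actuals must be nonzero)."""
    return 100.0 * sum(abs((a - p) / a) for a, p in zip(actual, predicted)) / len(actual)

# Illustrative weekly admission counts vs. a hypothetical model's forecasts.
actual = [120, 135, 128, 140]
forecast = [118, 130, 131, 138]
mae = mean_absolute_error(actual, forecast)  # (2 + 5 + 3 + 2) / 4 = 3.0
```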

  17. The Elastic Behaviour of Sintered Metallic Fibre Networks: A Finite Element Study by Beam Theory

    PubMed Central

    Bosbach, Wolfram A.

    2015-01-01

    Background The finite element method has complemented research in the field of network mechanics in the past years in numerous studies about various materials. Numerical predictions and the planning efficiency of experimental procedures are two of the motivational aspects for these numerical studies. The widespread availability of high performance computing facilities has been the enabler for the simulation of sufficiently large systems. Objectives and Motivation In the present study, finite element models were built for sintered, metallic fibre networks and validated by previously published experimental stiffness measurements. The validated models were the basis for predictions about so far unknown properties. Materials and Methods The finite element models were built by transferring previously published skeletons of fibre networks into finite element models. Beam theory was applied as a simplification method. Results and Conclusions The obtained material stiffness is not a constant but rather a function of variables such as sample size and boundary conditions. Beam theory offers an efficient finite element method for the simulated fibre networks. The experimental results can be approximated by the simulated systems. Two worthwhile aspects for future work will be the influence of size and shape and the mechanical interaction with matrix materials. PMID:26569603

  18. A Systematic Method for Verification and Validation of Gyrokinetic Microstability Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bravenec, Ronald

    My original proposal for the period Feb. 15, 2014 through Feb. 14, 2017 called for an integrated validation and verification effort carried out by myself with collaborators. The validation component would require experimental profile and power-balance analysis. In addition, it would require running the gyrokinetic codes varying the input profiles within experimental uncertainties to seek agreement with experiment before discounting a code as invalidated. Therefore, validation would require a major increase of effort over my previous grant periods, which covered only code verification (code benchmarking). Consequently, I had requested full-time funding. Instead, I am being funded at somewhat less than half time (5 calendar months per year). As a consequence, I decided to forego the validation component and to only continue the verification efforts.

  19. An Argument Against Augmenting the Lagrangean for Nonholonomic Systems

    NASA Technical Reports Server (NTRS)

    Roithmayr, Carlos M.; Hodges, Dewey H.

    2009-01-01

    Although it is known that correct dynamical equations of motion for a nonholonomic system cannot be obtained from a Lagrangean that has been augmented with a sum of the nonholonomic constraint equations weighted with multipliers, previous publications suggest otherwise. An example has been proposed in support of augmentation and purportedly demonstrates that an accepted method fails to produce correct equations of motion whereas augmentation leads to correct equations; this paper shows that in fact the opposite is true. The correct equations, previously discounted on the basis of a flawed application of the Newton-Euler method, are verified by using Kane's method and a new approach to determining the directions of constraint forces. A correct application of the Newton-Euler method reproduces valid equations.

  20. Figure-ground segmentation based on class-independent shape priors

    NASA Astrophysics Data System (ADS)

    Li, Yang; Liu, Yang; Liu, Guojun; Guo, Maozu

    2018-01-01

    We propose a method to generate figure-ground segmentation by incorporating shape priors into the graph-cuts algorithm. Given an image, we first obtain a linear representation of the image and then apply directional chamfer matching to generate class-independent, nonparametric shape priors, which provide shape clues for the graph-cuts algorithm. We then enforce the shape priors in a graph-cuts energy function to produce the object segmentation. In contrast to previous segmentation methods, the proposed method shares shape knowledge across different semantic classes and does not require class-specific model training. Therefore, the approach obtains high-quality segmentation for objects. We experimentally validate that the proposed method outperforms previous approaches on the challenging PASCAL VOC 2010/2012 and Berkeley (BSD300) segmentation datasets.

  1. Validation of an analytical method for nitrous oxide (N2O) laughing gas by headspace gas chromatography coupled to mass spectrometry (HS-GC-MS): forensic application to a lethal intoxication.

    PubMed

    Giuliani, N; Beyer, J; Augsburger, M; Varlet, V

    2015-03-01

    Drug abuse is a widespread problem affecting both teenagers and adults. Nitrous oxide is becoming increasingly popular as an inhalation drug, causing harmful neurological and hematological effects. Some gas chromatography-mass spectrometry (GC-MS) methods for nitrous oxide measurement have been previously described. The main drawbacks of these methods are a lack of sensitivity for forensic applications and an inability to quantitatively determine the concentration of gas present. The present study provides a validated HS-GC-MS method that incorporates hydrogen sulfide as a suitable internal standard, allowing the quantification of nitrous oxide. Upon analysis, sample and internal standard have similar retention times and are eluted quickly from the molecular sieve 5Å PLOT capillary column and the Porabond Q column, therefore providing rapid data collection whilst preserving well-defined peaks. After validation, the method was applied to a real case of N2O intoxication, indicating the concentrations found in a mono-intoxication. Copyright © 2015 Elsevier B.V. All rights reserved.

  2. Validity and Interrater Reliability of the Visual Quarter-Waste Method for Assessing Food Waste in Middle School and High School Cafeteria Settings.

    PubMed

    Getts, Katherine M; Quinn, Emilee L; Johnson, Donna B; Otten, Jennifer J

    2017-11-01

    Measuring food waste (ie, plate waste) in school cafeterias is an important tool to evaluate the effectiveness of school nutrition policies and interventions aimed at increasing consumption of healthier meals. Visual assessment methods are frequently applied in plate waste studies because they are more convenient than weighing. The visual quarter-waste method has become a common tool in studies of school meal waste and consumption, but previous studies of its validity and reliability have used correlation coefficients, which measure association but not necessarily agreement. The aims of this study were to determine, using a statistic measuring interrater agreement, whether the visual quarter-waste method is valid and reliable for assessing food waste in a school cafeteria setting when compared with the gold standard of weighed plate waste. To evaluate validity, researchers used the visual quarter-waste method and weighed food waste from 748 trays at four middle schools and five high schools in one school district in Washington State during May 2014. To assess interrater reliability, researcher pairs independently assessed 59 of the same trays using the visual quarter-waste method. Both validity and reliability were assessed using a weighted κ coefficient. For validity, as compared with the measured weight, 45% of foods assessed using the visual quarter-waste method were in almost perfect agreement, 42% of foods were in substantial agreement, 10% were in moderate agreement, and 3% were in slight agreement. For interrater reliability between pairs of visual assessors, 46% of foods were in perfect agreement, 31% were in almost perfect agreement, 15% were in substantial agreement, and 8% were in moderate agreement. These results suggest that the visual quarter-waste method is a valid and reliable tool for measuring plate waste in school cafeteria settings. Copyright © 2017 Academy of Nutrition and Dietetics. Published by Elsevier Inc. All rights reserved.
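
    The weighted κ coefficient used above credits partial agreement between ordinal categories rather than counting only exact matches. A minimal sketch of a linearly weighted kappa in Python, assuming the five quarter-waste categories are coded 0-4 (the ratings shown are invented, not the study's data):

```python
def weighted_kappa(rater_a, rater_b, n_categories):
    """Linearly weighted Cohen's kappa for two raters on an ordinal scale 0..n-1."""
    n = len(rater_a)
    # Joint distribution of the two raters' category assignments.
    observed = [[0.0] * n_categories for _ in range(n_categories)]
    for a, b in zip(rater_a, rater_b):
        observed[a][b] += 1.0 / n
    marginal_a = [sum(row) for row in observed]
    marginal_b = [sum(observed[i][j] for i in range(n_categories))
                  for j in range(n_categories)]

    def weight(i, j):
        # Linear disagreement weight: 0 on the diagonal, 1 at maximum distance.
        return abs(i - j) / (n_categories - 1)

    observed_disagreement = sum(weight(i, j) * observed[i][j]
                                for i in range(n_categories)
                                for j in range(n_categories))
    expected_disagreement = sum(weight(i, j) * marginal_a[i] * marginal_b[j]
                                for i in range(n_categories)
                                for j in range(n_categories))
    return 1.0 - observed_disagreement / expected_disagreement

# Perfect agreement across the five waste categories yields kappa = 1.
kappa = weighted_kappa([0, 1, 2, 3, 4], [0, 1, 2, 3, 4], 5)
```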

  3. Determination of Nitrogen, Phosphorus, and Potassium Release Rates of Slow- and Controlled-Release Fertilizers: Single-Laboratory Validation, First Action 2015.15.

    PubMed

    Thiex, Nancy

    2016-01-01

    A previously validated method for the determination of nitrogen release patterns of slow- and controlled-release fertilizers (SRFs and CRFs, respectively) was submitted to the Expert Review Panel (ERP) for Fertilizers for consideration of First Action Official Method (SM) status. The ERP evaluated the single-laboratory validation results, recommended the method for First Action Official Method status, and provided recommendations for achieving Final Action. The 180 day soil incubation-column leaching technique was demonstrated to be a robust and reliable method for characterizing N release patterns from SRFs and CRFs. The method was reproducible, and the results were only slightly affected by variations in environmental factors such as microbial activity, soil moisture, temperature, and texture. The release of P and K was also studied, but at fewer replications than for N. Optimization experiments on the accelerated 74 h extraction method indicated that temperature was the only factor found to substantially influence nutrient-release rates from the materials studied, and an optimized extraction profile was established as follows: 2 h at 25°C, 2 h at 50°C, 20 h at 55°C, and 50 h at 60°C.

  4. Jet production in the CoLoRFulNNLO method: Event shapes in electron-positron collisions

    NASA Astrophysics Data System (ADS)

    Del Duca, Vittorio; Duhr, Claude; Kardos, Adam; Somogyi, Gábor; Szőr, Zoltán; Trócsányi, Zoltán; Tulipánt, Zoltán

    2016-10-01

    We present the CoLoRFulNNLO method to compute higher order radiative corrections to jet cross sections in perturbative QCD. We apply our method to the computation of event shape observables in electron-positron collisions at NNLO accuracy and validate our code by comparing our predictions to previous results in the literature. We also calculate for the first time jet cone energy fraction at NNLO.

  5. A Conflict Management Scale for Pharmacy

    PubMed Central

    Gregory, Paul A.; Martin, Craig

    2009-01-01

    Objectives To develop and establish the validity and reliability of a conflict management scale specific to pharmacy practice and education. Methods A multistage inventory-item development process was undertaken involving 93 pharmacists and using a previously described explanatory model for conflict in pharmacy practice. A 19-item inventory was developed, field tested, and validated. Results The conflict management scale (CMS) demonstrated an acceptable degree of reliability and validity for use in educational or practice settings to promote self-reflection and self-awareness regarding individuals' conflict management styles. Conclusions The CMS provides a unique, pharmacy-specific method for individuals to determine and reflect upon their own conflict management styles. As part of an educational program to facilitate self-reflection and heighten self-awareness, the CMS may be a useful tool to promote discussions related to an important part of pharmacy practice. PMID:19960081

  6. Hovering Dual-Spin Vehicle Groundwork for Bias Momentum Sizing Validation Experiment

    NASA Technical Reports Server (NTRS)

    Rothhaar, Paul M.; Moerder, Daniel D.; Lim, Kyong B.

    2008-01-01

    Angular bias momentum offers significant stability augmentation for hovering flight vehicles. The reliance of the vehicle on thrust vectoring for agility and disturbance rejection is greatly reduced with significant levels of stored angular momentum in the system. A methodical procedure for bias momentum sizing has been developed in previous studies. This current study provides groundwork for experimental validation of that method using an experimental vehicle called the Dual-Spin Test Device, a thrust-levitated platform. Using measured data the vehicle's thrust vectoring units are modeled and a gust environment is designed and characterized. Control design is discussed. Preliminary experimental results of the vehicle constrained to three rotational degrees of freedom are compared to simulation for a case containing no bias momentum to validate the simulation. A simulation of a bias momentum dominant case is presented.

  7. An Early Years Toolbox for Assessing Early Executive Function, Language, Self-Regulation, and Social Development: Validity, Reliability, and Preliminary Norms

    PubMed Central

    Howard, Steven J.; Melhuish, Edward

    2016-01-01

    Several methods of assessing executive function (EF), self-regulation, language development, and social development in young children have been developed over previous decades. Yet new technologies make available methods of assessment not previously considered. In resolving conceptual and pragmatic limitations of existing tools, the Early Years Toolbox (EYT) offers substantial advantages for early assessment of language, EF, self-regulation, and social development. In the current study, results of our large-scale administration of this toolbox to 1,764 preschool and early primary school students indicated very good reliability, convergent validity with existing measures, and developmental sensitivity. Results were also suggestive of better capture of children’s emerging abilities relative to comparison measures. Preliminary norms are presented, showing a clear developmental trajectory across half-year age groups. The accessibility of the EYT, as well as its advantages over existing measures, offers considerably enhanced opportunities for objective measurement of young children’s abilities to enable research and educational applications. PMID:28503022

  8. Space Suit Joint Torque Testing

    NASA Technical Reports Server (NTRS)

    Valish, Dana J.

    2011-01-01

    In 2009 and early 2010, a test was performed to quantify the torque required to manipulate joints in several existing operational and prototype space suits in an effort to develop joint torque requirements appropriate for a new Constellation Program space suit system. The same test method was levied on the Constellation space suit contractors to verify that their suit design meets the requirements. However, because the original test was set up and conducted by a single test operator there was some question as to whether this method was repeatable enough to be considered a standard verification method for Constellation or other future space suits. In order to validate the method itself, a representative subset of the previous test was repeated, using the same information that would be available to space suit contractors, but set up and conducted by someone not familiar with the previous test. The resultant data was compared using graphical and statistical analysis and a variance in torque values for some of the tested joints was apparent. Potential variables that could have affected the data were identified and re-testing was conducted in an attempt to eliminate these variables. The results of the retest will be used to determine if further testing and modification is necessary before the method can be validated.

  9. The Expanding Role of Applications in the Development and Validation of CFD at NASA

    NASA Technical Reports Server (NTRS)

    Schuster, David M.

    2010-01-01

    This paper focuses on the recent escalation in application of CFD to manned and unmanned flight projects at NASA and the need to often apply these methods to problems for which little or no previous validation data directly applies. The paper discusses the evolution of NASA's CFD development from a strict Develop, Validate, Apply strategy to sometimes allowing for a Develop, Apply, Validate approach. The risks of this approach and some of its unforeseen benefits are discussed and tied to specific operational examples. There are distinct advantages for the CFD developer that is able to operate in this paradigm, and recommendations are provided for those inclined and willing to work in this environment.

  10. An extended Lagrangian method

    NASA Technical Reports Server (NTRS)

    Liou, Meng-Sing

    1992-01-01

    A unique formulation for describing fluid motion is presented. The method, referred to as the 'extended Lagrangian method', is interesting from both theoretical and numerical points of view. The formulation offers accuracy in numerical solution by avoiding the numerical diffusion that results from the mixing of fluxes in the Eulerian description. Meanwhile, it also avoids the inaccuracy incurred due to the geometry and variable interpolations used by previous Lagrangian methods. Unlike previously proposed Lagrangian methods, which are valid only for supersonic flows, the present method is general and capable of treating subsonic flows as well as supersonic flows. The method proposed in this paper is robust and stable. It automatically adapts to flow features without resorting to clustering, thereby maintaining rather uniform grid spacing throughout and a large time step. Moreover, the method is shown to resolve multi-dimensional discontinuities with a high level of accuracy, similar to that found in one-dimensional problems.

  11. Validating MEDIQUAL Constructs

    NASA Astrophysics Data System (ADS)

    Lee, Sang-Gun; Min, Jae H.

    In this paper, we validate the MEDIQUAL constructs across different media users in help desk service. In previous research, only two end-user constructs were used: assurance and responsiveness. In this paper, we extend the MEDIQUAL constructs to include reliability, empathy, assurance, tangibles, and responsiveness, based on the SERVQUAL theory. The results suggest that: 1) the five MEDIQUAL constructs are validated through factor analysis; that is, the constructs have relatively high correlations between measures of the same construct using different methods and low correlations between measures of constructs that are expected to differ; and 2) the five MEDIQUAL constructs have a statistically significant effect on media users' satisfaction in help desk service, as shown by regression analysis.

  12. Item Development and Validity Testing for a Self- and Proxy Report: The Safe Driving Behavior Measure

    PubMed Central

    Classen, Sherrilene; Winter, Sandra M.; Velozo, Craig A.; Bédard, Michel; Lanford, Desiree N.; Brumback, Babette; Lutz, Barbara J.

    2010-01-01

    OBJECTIVE We report on item development and validity testing of a self-report older adult safe driving behaviors measure (SDBM). METHOD On the basis of theoretical frameworks (Precede–Proceed Model of Health Promotion, Haddon’s matrix, and Michon’s model), existing driving measures, and previous research and guided by measurement theory, we developed items capturing safe driving behavior. Item development was further informed by focus groups. We established face validity using peer reviewers and content validity using expert raters. RESULTS Peer review indicated acceptable face validity. Initial expert rater review yielded a scale content validity index (CVI) rating of 0.78, with 44 of 60 items rated ≥0.75. Sixteen unacceptable items (≤0.5) required major revision or deletion. The next CVI scale average was 0.84, indicating acceptable content validity. CONCLUSION The SDBM has relevance as a self-report to rate older drivers. Future pilot testing of the SDBM comparing results with on-road testing will define criterion validity. PMID:20437917
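
    The CVI figures above follow the usual convention: an item's CVI is the proportion of experts rating it relevant (e.g., 3 or 4 on a 4-point relevance scale), and the scale CVI averages the item CVIs. A minimal sketch under that assumption (the expert ratings are invented):

```python
def item_cvi(ratings, relevant_threshold=3):
    """Proportion of experts rating the item at or above the relevance threshold."""
    return sum(r >= relevant_threshold for r in ratings) / len(ratings)

def scale_cvi(items):
    """Scale-level CVI as the average of the item-level CVIs."""
    cvis = [item_cvi(ratings) for ratings in items]
    return sum(cvis) / len(cvis)

# Four illustrative items, each rated by four experts on a 1-4 scale.
ratings = [
    [4, 4, 3, 4],  # item CVI = 1.00
    [3, 4, 2, 3],  # item CVI = 0.75
    [4, 3, 3, 2],  # item CVI = 0.75
    [2, 2, 4, 3],  # item CVI = 0.50
]
overall = scale_cvi(ratings)  # average of the item CVIs = 0.75
```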

  13. Objective assessment based on motion-related metrics and technical performance in laparoscopic suturing.

    PubMed

    Sánchez-Margallo, Juan A; Sánchez-Margallo, Francisco M; Oropesa, Ignacio; Enciso, Silvia; Gómez, Enrique J

    2017-02-01

    The aim of this study is to present the construct and concurrent validity of a motion-tracking method of laparoscopic instruments based on an optical pose tracker and determine its feasibility as an objective assessment tool of psychomotor skills during laparoscopic suturing. A group of novice ([Formula: see text] laparoscopic procedures), intermediate (11-100 laparoscopic procedures) and experienced ([Formula: see text] laparoscopic procedures) surgeons performed three intracorporeal sutures on an ex vivo porcine stomach. Motion analysis metrics were recorded using the proposed tracking method, which employs an optical pose tracker to determine the laparoscopic instruments' position. Construct validation was measured for all 10 metrics across the three groups and between pairs of groups. Concurrent validation was measured against a previously validated suturing checklist. Checklists were completed by two independent surgeons over blinded video recordings of the task. Eighteen novices, 15 intermediates and 11 experienced surgeons took part in this study. Execution time and path length travelled by the laparoscopic dissector presented construct validity. Experienced surgeons required significantly less time ([Formula: see text]), travelled less distance using both laparoscopic instruments ([Formula: see text]) and made more efficient use of the work space ([Formula: see text]) compared with novice and intermediate surgeons. Concurrent validation showed strong correlation between both the execution time and path length and the checklist score ([Formula: see text] and [Formula: see text], [Formula: see text]). The suturing performance was successfully assessed by the motion analysis method. Construct and concurrent validity of the motion-based assessment method has been demonstrated for the execution time and path length metrics. This study demonstrates the efficacy of the presented method for objective evaluation of psychomotor skills in laparoscopic suturing. However, this method does not take into account the quality of the suture. Thus, future work will focus on developing new methods combining motion analysis and qualitative outcome evaluation to provide a complete performance assessment to trainees.

  14. DNS of Flows over Periodic Hills using a Discontinuous-Galerkin Spectral-Element Method

    NASA Technical Reports Server (NTRS)

    Diosady, Laslo T.; Murman, Scott M.

    2014-01-01

    Direct numerical simulation (DNS) of turbulent compressible flows is performed using a higher-order space-time discontinuous-Galerkin finite-element method. The numerical scheme is validated by performing DNS of the evolution of the Taylor-Green vortex and turbulent flow in a channel. The higher-order method is shown to provide increased accuracy relative to low-order methods at a given number of degrees of freedom. The turbulent flow over a periodic array of hills in a channel is simulated at Reynolds number 10,595 using an 8th-order scheme in space and a 4th-order scheme in time. These results are validated against previous large eddy simulation (LES) results. A preliminary analysis provides insight into how these detailed simulations can be used to improve Reynolds-averaged Navier-Stokes (RANS) modeling.

  15. Validation of a reversed phase high performance thin layer chromatographic-densitometric method for secoisolariciresinol diglucoside determination in flaxseed.

    PubMed

    Coran, Silvia A; Bartolucci, Gianluca; Bambagiotti-Alberti, Massimo

    2008-10-17

    The validation of an HPTLC-densitometric method for the determination of secoisolariciresinol diglucoside (SDG) in flaxseed was performed, improving the reproducibility of a previously reported HPTLC densitometric procedure by the use of fully wettable reversed phase plates (silica gel 60 RP18W F(254S), 10 cm x 10 cm) with MeOH:HCOOH 0.1% (40:60, v/v) as the mobile phase. The analysis required only the alkaline hydrolysis in aqueous medium of undefatted samples and densitometry at 282 nm of the HPTLC runs. The method was validated following the protocol proposed by the Société Française des Sciences et Techniques Pharmaceutiques (SFSTP), giving rise to a dependable and high-throughput procedure well suited to routine application. SDG was quantified in the range of 321-1071 ng, with RSDs of repeatability and intermediate precision not exceeding 3.61% and accuracy inside the acceptance limits. Flaxseed of five cultivars of different origin was selected as the test bed.

  16. Folding free energy surfaces of three small proteins under crowding: validation of the postprocessing method by direct simulation

    NASA Astrophysics Data System (ADS)

    Qin, Sanbo; Mittal, Jeetain; Zhou, Huan-Xiang

    2013-08-01

    We have developed a ‘postprocessing’ method for modeling biochemical processes such as protein folding under crowded conditions (Qin and Zhou 2009 Biophys. J. 97 12-19). In contrast to the direct simulation approach, in which the protein undergoing folding is simulated along with crowders, the postprocessing method requires only the folding simulation without crowders. The influence of the crowders is then obtained by taking conformations from the crowder-free simulation and calculating the free energy of transferring each conformation into the crowded solution. This postprocessing yields the folding free energy surface of the protein under crowding. Here the postprocessing results for the folding of three small proteins under ‘repulsive’ crowding are validated against those obtained previously by the direct simulation approach (Mittal and Best 2010 Biophys. J. 98 315-20). This validation confirms the accuracy of the postprocessing approach and highlights its distinct advantages in modeling biochemical processes under cell-like crowded conditions, such as enabling an atomistic representation of the test proteins.

  17. Comprehensive GMO detection using real-time PCR array: single-laboratory validation.

    PubMed

    Mano, Junichi; Harada, Mioko; Takabatake, Reona; Furui, Satoshi; Kitta, Kazumi; Nakamura, Kosuke; Akiyama, Hiroshi; Teshima, Reiko; Noritake, Hiromichi; Hatano, Shuko; Futo, Satoshi; Minegishi, Yasutaka; Iizuka, Tayoshi

    2012-01-01

    We have developed a real-time PCR array method to comprehensively detect genetically modified (GM) organisms. In the method, genomic DNA extracted from an agricultural product is analyzed using various qualitative real-time PCR assays on a 96-well PCR plate, targeting individual GM events, recombinant DNA (r-DNA) segments, taxon-specific DNAs, and donor organisms of the respective r-DNAs. In this article, we report the single-laboratory validation of both the DNA extraction methods and the component PCR assays constituting the real-time PCR array. We selected DNA extraction methods for specified plant matrixes, i.e., maize flour, soybean flour, and ground canola seeds, then evaluated the DNA quantity, DNA fragmentation, and PCR inhibition of the resultant DNA extracts. For the component PCR assays, we evaluated the specificity and LOD. All DNA extraction methods and component PCR assays satisfied the criteria set on the basis of previous reports.

  18. De novo synthesis of trideuteromethyl esters of amino acids for use in GC-MS and GC-tandem MS exemplified for ADMA in human plasma and urine: standardization, validation, comparison and proof of evidence for their aptitude as internal standards.

    PubMed

    Tsikas, Dimitrios

    2009-08-01

    Asymmetric dimethylarginine (ADMA, N(G),N(G)-dimethyl-L-arginine) is an endogenous inhibitor of nitric oxide (NO) synthesis, a potential risk factor for cardiovascular diseases and a powerful biochemical parameter in clinical studies. In our previous work we have reported on a GC-tandem MS method for the accurate and precise quantification of ADMA in biological fluids using de novo synthesized [(2)H(3)]-methyl ester ADMA (d(3)Me-ADMA) as internal standard (IS). This method provides basal ADMA concentrations in biological fluids that agree with those obtained by other groups using other validated methods for ADMA. De novo synthesized stable-isotope labeled analogues are generally considered not ideal IS, because they must be prepared in a matrix different from that of the biological sample. Recently, [2,3,3,4,4,5,5-(2)H(7)]-ADMA (d(7)-ADMA) has become commercially available and we took this opportunity to test the reliability of the de novo synthesized d(3)Me-ADMA as an IS for ADMA in GC-tandem MS. In this article, we report on the re-validation of the previously reported GC-tandem MS method for ADMA in human plasma and urine using d(7)-ADMA as IS, and on comparative quantitative analyses of ADMA by GC-tandem MS using d(7)-ADMA and d(3)Me-ADMA. After thorough standardization of d(7)-ADMA and method validation, we obtained by GC-tandem MS very similar ADMA concentrations in plasma and urine using d(7)-ADMA and d(3)Me-ADMA. The present study gives proof of evidence for the aptitude of d(3)Me-ADMA as IS in GC-tandem MS and suggests that de novo synthesis of stable-isotope labeled alkyl esters of amino acids and amino acid derivatives may be a generally applicable approach in mass spectrometry-based methods for amino acids. This approach is especially useful for amino acids for which no stable-isotope labeled analogues are commercially available.

  19. A Complex Systems Approach to Causal Discovery in Psychiatry.

    PubMed

    Saxe, Glenn N; Statnikov, Alexander; Fenyo, David; Ren, Jiwen; Li, Zhiguo; Prasad, Meera; Wall, Dennis; Bergman, Nora; Briggs, Ernestine C; Aliferis, Constantin

    2016-01-01

    Conventional research methodologies and data analytic approaches in psychiatric research are unable to reliably infer causal relations without experimental designs, or to make inferences about the functional properties of the complex systems in which psychiatric disorders are embedded. This article describes a series of studies to validate a novel hybrid computational approach, the Complex Systems-Causal Network (CS-CN) method, designed to integrate causal discovery within a complex systems framework for psychiatric research. The CS-CN method was first applied to an existing dataset on psychopathology in 163 children hospitalized with injuries (validation study). Next, it was applied to a much larger dataset of traumatized children (replication study). Finally, the CS-CN method was applied in a controlled experiment using a 'gold standard' dataset for causal discovery and compared with other methods for accurately detecting causal variables (resimulation controlled experiment). The CS-CN method successfully detected a causal network of 111 variables and 167 bivariate relations in the initial validation study. This causal network had well-defined adaptive properties, and a set of variables was found that disproportionately contributed to these properties. Modeling the removal of these variables resulted in significant loss of adaptive properties. The CS-CN method was successfully applied in the replication study and performed better than traditional statistical methods, and similarly to state-of-the-art causal discovery algorithms in the causal detection experiment. The CS-CN method was validated, replicated, and yielded both novel and previously validated findings related to risk factors and potential treatments of psychiatric disorders. The novel approach yields both fine-grain (micro) and high-level (macro) insights and thus represents a promising approach for complex systems-oriented research in psychiatry.

  20. Knowledge discovery by accuracy maximization

    PubMed Central

    Cacciatore, Stefano; Luchinat, Claudio; Tenori, Leonardo

    2014-01-01

    Here we describe KODAMA (knowledge discovery by accuracy maximization), an unsupervised and semisupervised learning algorithm that performs feature extraction from noisy and high-dimensional data. Unlike other data mining methods, the peculiarity of KODAMA is that it is driven by an integrated procedure of cross-validation of the results. The discovery of a local manifold’s topology is led by a classifier through a Monte Carlo procedure of maximization of cross-validated predictive accuracy. Briefly, our approach differs from previous methods in its integrated validation of the results, which ensures the highest robustness of the obtained solution. This robustness is demonstrated on experimental datasets of gene expression and metabolomics, where KODAMA compares favorably with other existing feature extraction methods. KODAMA is then applied to an astronomical dataset, revealing unexpected features. Interesting and not easily predictable features are also found in the analysis of the State of the Union speeches by American presidents: KODAMA reveals an abrupt linguistic transition sharply separating all post-Reagan from all pre-Reagan speeches. The transition occurs during Reagan’s presidency and not from its beginning. PMID:24706821
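
The quantity KODAMA maximizes, cross-validated predictive accuracy, can be illustrated with a minimal sketch. The 1-nearest-neighbour classifier and the toy two-cluster dataset below are illustrative assumptions, not the authors' implementation; KODAMA searches over feature representations for the one that maximizes this score, whereas the sketch only evaluates it once:

```python
import random

def one_nn_predict(train, labels, point):
    """Label of the training point nearest to `point` (squared distance)."""
    best = min(range(len(train)),
               key=lambda i: sum((a - b) ** 2 for a, b in zip(train[i], point)))
    return labels[best]

def cross_validated_accuracy(data, labels, k=3, seed=0):
    """k-fold cross-validated accuracy of a 1-NN classifier."""
    idx = list(range(len(data)))
    random.Random(seed).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    correct = 0
    for fold in folds:
        held_out = set(fold)
        train = [data[i] for i in idx if i not in held_out]
        tlabels = [labels[i] for i in idx if i not in held_out]
        for i in fold:
            if one_nn_predict(train, tlabels, data[i]) == labels[i]:
                correct += 1
    return correct / len(data)

# Two well-separated clusters: the cross-validated accuracy should be perfect.
data = [(0.0, 0.0), (0.1, 0.2), (0.2, 0.1), (5.0, 5.0), (5.1, 5.2), (5.2, 5.1)]
labels = ["a", "a", "a", "b", "b", "b"]
print(cross_validated_accuracy(data, labels))
```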

  1. Glossary of reference terms for alternative test methods and their validation.

    PubMed

    Ferrario, Daniele; Brustio, Roberta; Hartung, Thomas

    2014-01-01

    This glossary was developed to provide technical references to support work in the field of alternatives to animal testing. It was compiled from various existing reference documents coming from different sources and is meant to be a point of reference on alternatives to animal testing. Given the ever-increasing number of alternative test methods and approaches developed over the last decades, a combination, revision, and harmonization of earlier published collections of terms used in the validation of such methods is required. The need to update previous glossary efforts came from the acknowledgement that new words have emerged with the development of new approaches, while others have become obsolete, and the meaning of some terms has partially changed over time. With this glossary we intend to provide guidance on issues related to the validation of new or updated testing methods consistent with current approaches. Moreover, because of new developments and technologies, a glossary needs to be a living, constantly updated document. An Internet-based version based on this compilation may be found at http://altweb.jhsph.edu/, allowing the addition of new material.

  2. Space Suit Joint Torque Measurement Method Validation

    NASA Technical Reports Server (NTRS)

    Valish, Dana; Eversley, Karina

    2012-01-01

    In 2009 and early 2010, a test method was developed and performed to quantify the torque required to manipulate joints in several existing operational and prototype space suits. This was done in an effort to develop joint torque requirements appropriate for a new Constellation Program space suit system. The same test method was levied on the Constellation space suit contractors to verify that their suit design met the requirements. However, because the original test was set up and conducted by a single test operator, there was some question as to whether this method was repeatable enough to be considered a standard verification method for Constellation or other future development programs. In order to validate the method itself, a representative subset of the previous test was repeated, using the same information that would be available to space suit contractors, but set up and conducted by someone not familiar with the previous test. The resultant data were compared using graphical and statistical analysis; the results indicated a significant variance in the values reported for a subset of the re-tested joints. Potential variables that could have affected the data were identified, and a third round of testing was conducted in an attempt to eliminate and/or quantify the effects of these variables. The results of the third test effort will be used to determine whether or not the proposed joint torque methodology can be applied to future space suit development contracts.

  3. 40 CFR 152.93 - Citation of a previously submitted valid study.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Data Submitters' Rights § 152.93 Citation of a previously submitted valid study. An applicant may demonstrate compliance for a data requirement by citing a valid study previously submitted to the Agency. The... the original data submitter, the applicant may cite the study only in accordance with paragraphs (b...

  4. Validation of the revised Mystical Experience Questionnaire in experimental sessions with psilocybin.

    PubMed

    Barrett, Frederick S; Johnson, Matthew W; Griffiths, Roland R

    2015-11-01

    The 30-item revised Mystical Experience Questionnaire (MEQ30) was previously developed within an online survey of mystical-type experiences occasioned by psilocybin-containing mushrooms. The rated experiences occurred on average eight years before completion of the questionnaire. The current paper validates the MEQ30 using data from experimental studies with controlled doses of psilocybin. Data were pooled and analyzed from five laboratory experiments in which participants (n=184) received a moderate to high oral dose of psilocybin (at least 20 mg/70 kg). Results of confirmatory factor analysis demonstrate the reliability and internal validity of the MEQ30. Structural equation models demonstrate the external and convergent validity of the MEQ30 by showing that latent variable scores on the MEQ30 positively predict persisting change in attitudes, behavior, and well-being attributed to experiences with psilocybin while controlling for the contribution of the participant-rated intensity of drug effects. These findings support the use of the MEQ30 as an efficient measure of individual mystical experiences. A method to score a "complete mystical experience" that was used in previous versions of the mystical experience questionnaire is validated in the MEQ30, and a stand-alone version of the MEQ30 is provided for use in future research. © The Author(s) 2015.

  5. Validation of the revised Mystical Experience Questionnaire in experimental sessions with psilocybin

    PubMed Central

    Barrett, Frederick S; Johnson, Matthew W; Griffiths, Roland R

    2016-01-01

    The 30-item revised Mystical Experience Questionnaire (MEQ30) was previously developed within an online survey of mystical-type experiences occasioned by psilocybin-containing mushrooms. The rated experiences occurred on average eight years before completion of the questionnaire. The current paper validates the MEQ30 using data from experimental studies with controlled doses of psilocybin. Data were pooled and analyzed from five laboratory experiments in which participants (n=184) received a moderate to high oral dose of psilocybin (at least 20 mg/70 kg). Results of confirmatory factor analysis demonstrate the reliability and internal validity of the MEQ30. Structural equation models demonstrate the external and convergent validity of the MEQ30 by showing that latent variable scores on the MEQ30 positively predict persisting change in attitudes, behavior, and well-being attributed to experiences with psilocybin while controlling for the contribution of the participant-rated intensity of drug effects. These findings support the use of the MEQ30 as an efficient measure of individual mystical experiences. A method to score a “complete mystical experience” that was used in previous versions of the mystical experience questionnaire is validated in the MEQ30, and a stand-alone version of the MEQ30 is provided for use in future research. PMID:26442957

  6. A method to determine the mammographic regions that show early changes due to the development of breast cancer

    NASA Astrophysics Data System (ADS)

    Karemore, Gopal; Nielsen, Mads; Karssemeijer, Nico; Brandt, Sami S.

    2014-11-01

    It is now well understood that changes in the mammographic parenchymal pattern are an indicator of breast cancer risk, and we have developed a statistical method that estimates the mammogram regions where the parenchymal changes due to breast cancer occur. This region of interest is computed from a score map by utilising the anatomical breast coordinate system developed in our previous work. The method also makes an automatic scale selection to avoid overfitting, while the region estimates are computed by a nested cross-validation scheme. In this way, it is possible to recover those mammogram regions that show a significant difference in classification scores between the cancer and the control group. Our experiments suggested that the most significant mammogram region is the region behind the nipple, which is consistent with previous findings from other research groups. This result was obtained from cross-validation experiments on independent training, validation and testing sets from a case-control study of 490 women, of whom 245 were diagnosed with breast cancer within a period of 2-4 years after the baseline mammograms. We additionally generalised the estimated region to another study, mini-MIAS, and showed that the transferred region estimate gives at least a similar classification result compared to using the whole breast region. Our method is therefore likely to improve both preclinical and follow-up breast cancer screening, although a larger study population will be required to test this hypothesis.
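
The nested cross-validation scheme this record relies on separates model (here, scale) selection from performance estimation. A minimal index-level sketch, with fold counts chosen arbitrarily for illustration:

```python
def folds(indices, k):
    """Split an index list into k interleaved folds."""
    return [indices[i::k] for i in range(k)]

def nested_splits(n, outer_k=3, inner_k=3):
    """Yield (train, validation, test) index triples for nested CV.

    Each outer test fold is held out entirely; the remaining indices
    are re-split into train/validation folds, so any model selection
    performed in the inner loop never sees the test data.
    """
    idx = list(range(n))
    for test in folds(idx, outer_k):
        rest = [i for i in idx if i not in test]
        for val in folds(rest, inner_k):
            train = [i for i in rest if i not in val]
            yield train, val, test

for train, val, test in nested_splits(9):
    # The held-out test fold never leaks into the inner loop.
    assert not set(test) & (set(train) | set(val))
print(len(list(nested_splits(9))), "train/validation/test triples")
```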

  7. Transformer Incipient Fault Prediction Using Combined Artificial Neural Network and Various Particle Swarm Optimisation Techniques.

    PubMed

    Illias, Hazlee Azil; Chai, Xin Rui; Abu Bakar, Ab Halim; Mokhlis, Hazlie

    2015-01-01

    It is important to predict the incipient fault in transformer oil accurately so that maintenance can be performed correctly, reducing maintenance costs and minimising error. Dissolved gas analysis (DGA) has been widely used to predict the incipient fault in power transformers. However, the existing DGA methods sometimes yield inaccurate predictions of the incipient fault in transformer oil because each method is only suitable for certain conditions. Many previous works have reported on the use of intelligence methods to predict transformer faults, but it is believed that the accuracy of the previously proposed methods can still be improved. Since a combination of artificial neural network (ANN) and particle swarm optimisation (PSO) techniques has never been used in the previously reported work, this work proposes combining ANN with various PSO techniques to predict the transformer incipient fault. The advantages of PSO are simplicity and easy implementation. The effectiveness of the various PSO techniques in combination with ANN is validated by comparison with the results from the actual fault diagnosis, an existing diagnosis method, and ANN alone. Comparison of the results from the proposed methods with the previously reported work was also performed to show the improvement of the proposed methods. It was found that the proposed ANN-Evolutionary PSO method yields a higher percentage of correct transformer fault-type identification than the existing diagnosis method and previously reported works.
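
The PSO component combined with the ANN above can be sketched in its textbook form. The sphere objective and all coefficients below are illustrative assumptions; in the study the fitness would instead come from the ANN's diagnosis error:

```python
import random

def pso(objective, dim=2, n_particles=20, iters=200, seed=1):
    """Minimise `objective` with a basic global-best particle swarm."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-5.0, 5.0) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                  # each particle's best position
    gbest = min(pbest, key=objective)[:]         # swarm's best position
    w, c1, c2 = 0.7, 1.5, 1.5                    # inertia, cognitive, social weights
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if objective(pos[i]) < objective(pbest[i]):
                pbest[i] = pos[i][:]
                if objective(pbest[i]) < objective(gbest):
                    gbest = pbest[i][:]
    return gbest

def sphere(x):
    return sum(v * v for v in x)

best = pso(sphere)
print([round(v, 3) for v in best])  # should lie near the origin
```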

  8. Transformer Incipient Fault Prediction Using Combined Artificial Neural Network and Various Particle Swarm Optimisation Techniques

    PubMed Central

    2015-01-01

    It is important to predict the incipient fault in transformer oil accurately so that maintenance can be performed correctly, reducing maintenance costs and minimising error. Dissolved gas analysis (DGA) has been widely used to predict the incipient fault in power transformers. However, the existing DGA methods sometimes yield inaccurate predictions of the incipient fault in transformer oil because each method is only suitable for certain conditions. Many previous works have reported on the use of intelligence methods to predict transformer faults, but it is believed that the accuracy of the previously proposed methods can still be improved. Since a combination of artificial neural network (ANN) and particle swarm optimisation (PSO) techniques has never been used in the previously reported work, this work proposes combining ANN with various PSO techniques to predict the transformer incipient fault. The advantages of PSO are simplicity and easy implementation. The effectiveness of the various PSO techniques in combination with ANN is validated by comparison with the results from the actual fault diagnosis, an existing diagnosis method, and ANN alone. Comparison of the results from the proposed methods with the previously reported work was also performed to show the improvement of the proposed methods. It was found that the proposed ANN-Evolutionary PSO method yields a higher percentage of correct transformer fault-type identification than the existing diagnosis method and previously reported works. PMID:26103634

  9. Factor analysis methods and validity evidence: A systematic review of instrument development across the continuum of medical education

    NASA Astrophysics Data System (ADS)

    Wetzel, Angela Payne

    Previous systematic reviews indicate a lack of reporting of reliability and validity evidence in subsets of the medical education literature. Psychology and general education reviews of factor analysis also indicate gaps between current and best practices; yet, a comprehensive review of exploratory factor analysis in instrument development across the continuum of medical education had not been previously identified. Therefore, the purpose of this study was a critical review of instrument development articles employing exploratory factor or principal component analysis published in medical education (2006-2010) to describe and assess the reporting of methods and validity evidence based on the Standards for Educational and Psychological Testing and factor analysis best practices. Data extraction of 64 articles measuring a variety of constructs published throughout the peer-reviewed medical education literature indicates significant errors in the translation of exploratory factor analysis best practices to current practice. Further, techniques for establishing validity evidence tend to derive from a limited scope of methods, including reliability statistics to support internal structure and support for test content. Instruments reviewed for this study lacked supporting evidence based on relationships with other variables and response process, and evidence based on consequences of testing was not evident. Findings suggest a need for further professional development within the medical education researcher community related to (1) appropriate factor analysis methodology and reporting and (2) the importance of pursuing multiple sources of reliability and validity evidence to construct a well-supported argument for the inferences made from the instrument. Medical education researchers and educators should be cautious in adopting instruments from the literature and carefully review available evidence. Finally, editors and reviewers are encouraged to recognize this gap in best practices and to promote instrument development research that is more consistent through the peer-review process.

  10. Reliability and Validity of a New Method for Isometric Back Extensor Strength Evaluation Using A Hand-Held Dynamometer.

    PubMed

    Park, Hee-Won; Baek, Sora; Kim, Hong Young; Park, Jung-Gyoo; Kang, Eun Kyoung

    2017-10-01

    To investigate the reliability and validity of a new method for isometric back extensor strength measurement using a portable dynamometer. A chair equipped with a small portable dynamometer (Power Track II Commander Muscle Tester) was designed. A total of 15 men (mean age, 34.8±7.5 years) and 15 women (mean age, 33.1±5.5 years) with no current back problems or previous history of back surgery were recruited. Subjects were asked to push against the back of the chair while seated, and their isometric back extensor strength was measured by the portable dynamometer. Test-retest reliability was assessed with the intraclass correlation coefficient (ICC). For the validity assessment, the isometric back extensor strength of all subjects was also measured by a widely used physical performance evaluation instrument, the BTE PrimusRS system. The limits of agreement (LoA) from the Bland-Altman plot were evaluated between the two methods. The test-retest reliability was excellent (ICC=0.82; 95% confidence interval, 0.65-0.91). The Bland-Altman plots demonstrated acceptable agreement between the two methods: the lower 95% LoA was -63.1 N and the upper 95% LoA was 61.1 N. This study shows that isometric back extensor strength measurement using a portable dynamometer has good reliability and validity.
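
The Bland-Altman limits of agreement reported in this record are simply the mean of the paired differences plus or minus 1.96 standard deviations of those differences. A sketch with made-up readings, not the study's data:

```python
import statistics

def limits_of_agreement(a, b):
    """Return (bias, lower, upper) 95% Bland-Altman limits for paired data."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)          # sample SD of the differences
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical paired strength readings in newtons (illustrative values).
hand_held = [310.0, 295.0, 330.0, 280.0, 305.0]
reference = [300.0, 300.0, 320.0, 290.0, 300.0]
bias, lower, upper = limits_of_agreement(hand_held, reference)
print(round(bias, 2), round(lower, 2), round(upper, 2))
```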

  11. Determination of Major Phenolic Compounds in Echinacea spp. Raw Materials and Finished Products by High-Performance Liquid Chromatography with Ultraviolet Detection: Single-Laboratory Validation Matrix Extension

    PubMed Central

    Brown, Paula N.; Chan, Michael; Paley, Lori; Betz, Joseph M.

    2013-01-01

    A method previously validated to determine caftaric acid, chlorogenic acid, cynarin, echinacoside, and cichoric acid in echinacea raw materials has been successfully applied to dry extract and liquid tincture products in response to North American consumer needs. Single-laboratory validation was used to assess the repeatability, accuracy, selectivity, LOD, LOQ, analyte stability (ruggedness), and linearity of the method, with emphasis on finished products. Repeatability precision for each phenolic compound was between 1.04 and 5.65% RSD, with HorRat values between 0.30 and 1.39 for raw and dry extract finished products. HorRat values for tinctures were between 0.09 and 1.10. Accuracy of the method was determined through spike recovery studies. Recovery of each compound from raw material negative control (ginseng) was between 90 and 114%, while recovery from the finished product negative control (maltodextrin and magnesium stearate) was between 97 and 103%. A study was conducted to determine if cichoric acid, a major phenolic component of Echinacea purpurea (L.) Moench and E. angustifolia DC, degrades during sample preparation (extraction) and HPLC analysis. No significant degradation was observed over an extended testing period using the validated method. PMID:22165004

  12. Determination of major phenolic compounds in Echinacea spp. raw materials and finished products by high-performance liquid chromatography with ultraviolet detection: single-laboratory validation matrix extension.

    PubMed

    Brown, Paula N; Chan, Michael; Paley, Lori; Betz, Joseph M

    2011-01-01

    A method previously validated to determine caftaric acid, chlorogenic acid, cynarin, echinacoside, and cichoric acid in echinacea raw materials has been successfully applied to dry extract and liquid tincture products in response to North American consumer needs. Single-laboratory validation was used to assess the repeatability, accuracy, selectivity, LOD, LOQ, analyte stability (ruggedness), and linearity of the method, with emphasis on finished products. Repeatability precision for each phenolic compound was between 1.04 and 5.65% RSD, with HorRat values between 0.30 and 1.39 for raw and dry extract finished products. HorRat values for tinctures were between 0.09 and 1.10. Accuracy of the method was determined through spike recovery studies. Recovery of each compound from raw material negative control (ginseng) was between 90 and 114%, while recovery from the finished product negative control (maltodextrin and magnesium stearate) was between 97 and 103%. A study was conducted to determine if cichoric acid, a major phenolic component of Echinacea purpurea (L.) Moench and E. angustifolia DC, degrades during sample preparation (extraction) and HPLC analysis. No significant degradation was observed over an extended testing period using the validated method.

  13. The use of children's drawings in the evaluation and treatment of child sexual, emotional, and physical abuse.

    PubMed

    Peterson, L W; Hardin, M; Nitsch, M J

    1995-05-01

    Primary care physicians can be instrumental in the initial identification of potential sexual, emotional, and physical abuse of children. We reviewed the use of children's artwork as a method of communicating individual and family functioning. A quantitative method of analyzing children's artwork provides more reliability and validity than some methods used previously. A new scoring system was developed that uses individual human figure drawings and kinetic family drawings. This scoring system was based on research with 842 children (341 positively identified as sexually molested, 252 positively not sexually molested but having emotional or behavioral problems, and 249 "normal" public school children). This system is more comprehensive than previous systems of assessment of potential abuse.

  14. Validation of Regression-Based Myogenic Correction Techniques for Scalp and Source-Localized EEG

    PubMed Central

    McMenamin, Brenton W.; Shackman, Alexander J.; Maxwell, Jeffrey S.; Greischar, Lawrence L.; Davidson, Richard J.

    2008-01-01

    EEG and EEG source-estimation are susceptible to electromyographic (EMG) artifacts generated by the cranial muscles. EMG can mask genuine effects or masquerade as a legitimate effect, even in low frequencies such as alpha (8-13 Hz). Although regression-based correction has been used previously, only cursory attempts at validation exist and its utility for source-localized data is unknown. To address this, EEG was recorded from 17 participants while neurogenic and myogenic activity were factorially varied. We assessed the sensitivity and specificity of four regression-based techniques: between-subjects, between-subjects using difference scores, within-subjects condition-wise, and within-subjects epoch-wise, on the scalp and in data modeled using the LORETA algorithm. Although the within-subjects epoch-wise technique showed superior performance on the scalp, no technique succeeded in the source space. Aside from validating the novel epoch-wise method on the scalp, we highlight methods requiring further development. PMID:19298626
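
The simplest form of the regression-based correction assessed in this record estimates an EMG-to-EEG transfer coefficient by least squares and subtracts the scaled EMG reference from the contaminated channel. The simulated signals below are illustrative assumptions; the paper evaluates several between- and within-subjects variants of this idea:

```python
import math
import random

def regress_out(signal, reference):
    """Subtract the least-squares projection of `reference` from `signal`."""
    beta = (sum(x * y for x, y in zip(signal, reference))
            / sum(y * y for y in reference))
    corrected = [x - beta * y for x, y in zip(signal, reference)]
    return corrected, beta

n = 1000
rng = random.Random(0)
neural = [math.sin(2 * math.pi * 10 * t / n) for t in range(n)]  # genuine oscillation
emg = [rng.gauss(0.0, 1.0) for _ in range(n)]                    # myogenic reference
contaminated = [s + 0.8 * m for s, m in zip(neural, emg)]        # true mixing weight 0.8
cleaned, beta = regress_out(contaminated, emg)
print(round(beta, 2))  # recovered coefficient, close to the true 0.8
```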

  15. Experimental validation of calculated atomic charges in ionic liquids

    NASA Astrophysics Data System (ADS)

    Fogarty, Richard M.; Matthews, Richard P.; Ashworth, Claire R.; Brandt-Talbot, Agnieszka; Palgrave, Robert G.; Bourne, Richard A.; Vander Hoogerstraete, Tom; Hunt, Patricia A.; Lovelock, Kevin R. J.

    2018-05-01

    A combination of X-ray photoelectron spectroscopy and near edge X-ray absorption fine structure spectroscopy has been used to provide an experimental measure of nitrogen atomic charges in nine ionic liquids (ILs). These experimental results are used to validate charges calculated with three computational methods: charges from electrostatic potentials using a grid-based method (ChelpG), natural bond orbital population analysis, and the atoms in molecules approach. By combining these results with those from a previous study on sulfur, we find that ChelpG charges provide the best description of the charge distribution in ILs. However, we find that ChelpG charges can lead to significant conformational dependence and therefore advise that small differences in ChelpG charges (<0.3 e) should be interpreted with care. We use these validated charges to provide physical insight into nitrogen atomic charges for the ILs probed.

  16. Validation of the self-assessment teamwork tool (SATT) in a cohort of nursing and medical students.

    PubMed

    Roper, Lucinda; Shulruf, Boaz; Jorm, Christine; Currie, Jane; Gordon, Christopher J

    2018-02-09

    Poor teamwork has been implicated in medical error, and teamwork training has been shown to improve patient care. Simulation is an effective educational method for teamwork training. Post-simulation reflection aims to promote learning, and we have previously developed a self-assessment teamwork tool (SATT) for health students to measure teamwork performance. This study aimed to evaluate the psychometric properties of a revised self-assessment teamwork tool. The tool was tested in 257 medical and nursing students after their participation in one of several mass casualty simulations. Using exploratory and confirmatory factor analysis, the revised self-assessment teamwork tool was shown to have strong construct validity and high reliability, and the construct demonstrated invariance across groups (Medicine & Nursing). The modified SATT was shown to be a reliable and valid student self-assessment tool. The SATT is a quick and practical method of guiding students' reflection on important teamwork skills.

  17. Reply to Comment on "Fringe projection profilometry with nonparallel illumination: a least-squares approach"

    NASA Astrophysics Data System (ADS)

    Chen, Lujie; Quan, Chenggen

    2006-07-01

We have confirmed that a mathematical expression in our previous Letter [Chen and Quan, Opt. Lett. 30, 2101 (2005)] should be modified. The modification, however, does not affect the validity of the method reported, the results obtained and the subsequent conclusions made.

  18. Gilligan's Moral Orientation Hypothesis: Strategies of Justification and Practical Deliberation.

    ERIC Educational Resources Information Center

    Keefer, Matthew Wilks

    Previous studies failed to determine whether Gilligan's (1982) justice and care perspectives represent two distinct orientations of moral reasoning. Using methods developed in research on reasoning and discourse processes, a study used a discursive framework to validate an alternate methodology for the investigation of moral orientation reasoning.…

  19. Academic and Recreational Reading Motivation of Teacher Candidates

    ERIC Educational Resources Information Center

    Lancellot, Michael

    2017-01-01

    The purpose of this mixed methods study was to determine relationships among teacher candidates' academic and recreational reading motivation. This study utilized a previously designed, reliable, and valid instrument called the Adult Reading Motivation Scale with permission from Schutte and Malouff (2007). The instrument included a pool of 50…

  20. Scale of Academic Emotion in Science Education: Development and Validation

    ERIC Educational Resources Information Center

    Chiang, Wen-Wei; Liu, Chia-Ju

    2014-01-01

    Contemporary research into science education has generally been conducted from the perspective of "conceptual change" in learning. This study sought to extend previous work by recognizing that human rationality can be influenced by the emotions generated by the learning environment and specific actions related to learning. Methods used…

  1. TLE uncertainty estimation using robust weighted differencing

    NASA Astrophysics Data System (ADS)

    Geul, Jacco; Mooij, Erwin; Noomen, Ron

    2017-05-01

    Accurate knowledge of satellite orbit errors is essential for many types of analyses. Unfortunately, for two-line elements (TLEs) this is not available. This paper presents a weighted differencing method using robust least-squares regression for estimating many important error characteristics. The method is applied to both classic and enhanced TLEs, compared to previous implementations, and validated using Global Positioning System (GPS) solutions for the GOCE satellite in Low-Earth Orbit (LEO), prior to its re-entry. The method is found to be more accurate than previous TLE differencing efforts in estimating initial uncertainty, as well as error growth. The method also proves more reliable and requires no data filtering (such as outlier removal). Sensitivity analysis shows a strong relationship between argument of latitude and covariance (standard deviations and correlations), which the method is able to approximate. Overall, the method proves accurate, computationally fast, and robust, and is applicable to any object in the satellite catalogue (SATCAT).
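Robust least-squares regression of the kind described can be sketched with iteratively reweighted least squares (IRLS) using Huber weights and an MAD-based scale. This is a generic illustration (the function name, tuning constant and stopping rule are ours, not the paper's estimator); it shows why no explicit outlier filtering is needed, since discordant observations are simply down-weighted:

```python
import numpy as np

def irls_huber(X, y, k=1.345, iters=50, tol=1e-10):
    """Robust linear fit by iteratively reweighted least squares.

    Observations with large residuals (relative to a robust MAD scale)
    receive Huber weights < 1, so outliers are down-weighted rather
    than removed. Hypothetical sketch, not the paper's exact method.
    """
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    for _ in range(iters):
        r = y - X @ beta
        s = np.median(np.abs(r - np.median(r))) / 0.6745  # MAD scale
        s = max(s, 1e-12)
        u = np.abs(r) / s
        w = np.where(u <= k, 1.0, k / u)                  # Huber weights
        sw = np.sqrt(w)
        beta_new = np.linalg.lstsq(sw[:, None] * X, sw * y, rcond=None)[0]
        if np.max(np.abs(beta_new - beta)) < tol:
            beta = beta_new
            break
        beta = beta_new
    return beta
```

On a clean linear trend with a single gross outlier, the recovered slope stays near the truth, where ordinary least squares would be pulled far off.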

  2. Validation methodology in publications describing epidemiological registration methods of dental caries: a systematic review.

    PubMed

    Sjögren, P; Ordell, S; Halling, A

    2003-12-01

The aim was to describe and systematically review the methodology and reporting of validation in publications describing epidemiological registration methods for dental caries. BASIC RESEARCH METHODOLOGY: Literature searches were conducted in six scientific databases. All publications fulfilling the predetermined inclusion criteria were assessed for methodology and reporting of validation using a checklist including items described previously as well as new items. The frequency of endorsement of the assessed items was analysed. Moreover, the type and strength of evidence were evaluated. Reporting of predetermined items relating to methodology of validation and the frequency of endorsement of the assessed items were of primary interest. Initially, 588 publications were located; 74 eligible publications were obtained, 23 of which fulfilled the inclusion criteria and remained throughout the analyses. A majority of the studies reported the methodology of validation. The reported methodology of validation was generally inadequate, according to the recommendations of evidence-based medicine. The frequencies of reporting the assessed items (frequencies of endorsement) ranged from 4 to 84 per cent. A majority of the publications contributed to a low strength of evidence. There seems to be a need to improve the methodology and the reporting of validation in publications describing professionally registered caries epidemiology. Four of the items assessed in this study are potentially discriminative for quality assessments of reported validation.

  3. Comparing the validity of different sources of information on emergency department visits: a latent class analysis.

    PubMed

    Dendukuri, Nandini; McCusker, Jane; Bellavance, François; Cardin, Sylvie; Verdon, Josée; Karp, Igor; Belzile, Eric

    2005-03-01

    Emergency department (ED) use in Quebec may be measured from varied sources, eg, patient's self-reports, hospital medical charts, and provincial health insurance claims databases. Determining the relative validity of each source is complicated because none is a gold standard. We sought to compare the validity of different measures of ED use without arbitrarily assuming one is perfect. Data were obtained from a nursing liaison intervention study for frail seniors visiting EDs at 4 university-affiliated hospitals in Montreal. The number of ED visits during 2 consecutive follow-up periods of 1 and 4 months after baseline was obtained from patient interviews, from medical charts of participating hospitals, and from the provincial health insurance claims database. Latent class analysis was used to estimate the validity of each source. The impact of the following covariates on validity was evaluated: hospital visited, patient's demographic/clinical characteristics, risk of functional decline, nursing liaison intervention, duration of recall, previous ED use, and previous hospitalization. The patient's self-report was found to be the least accurate (sensitivity: 70%, specificity: 88%). Claims databases had the greatest validity, especially after defining claims made on consecutive days as part of the same ED visit (sensitivity: 98%, specificity: 98%). The validity of the medical chart was intermediate. Lower sensitivity (or under-reporting) on the self-report appeared to be associated with higher age, low comorbidity and shorter length of recall. The claims database is the most valid method of measuring ED use among seniors in Quebec compared with hospital medical charts and patient-reported use.
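The latent class idea used above, estimating the sensitivity and specificity of several imperfect data sources when none is a gold standard, can be sketched with a two-class EM algorithm under conditional independence. This is a generic illustration, not the authors' model (which also incorporated covariates and repeated follow-up periods):

```python
import numpy as np

def lca_em(results, iters=500):
    """EM for a 2-class latent class model with J conditionally
    independent binary sources and no gold standard.

    results: (N, J) array of 0/1 outcomes (e.g. ED visit recorded: yes/no).
    Returns (prevalence, sensitivities, specificities). Minimal sketch.
    """
    R = np.asarray(results, float)
    N, J = R.shape
    pi, se, sp = 0.5, np.full(J, 0.8), np.full(J, 0.8)
    for _ in range(iters):
        # E-step: posterior probability each subject is a true positive
        lp = pi * np.prod(se**R * (1 - se)**(1 - R), axis=1)
        ln = (1 - pi) * np.prod((1 - sp)**R * sp**(1 - R), axis=1)
        w = lp / (lp + ln)
        # M-step: update prevalence and per-source accuracy
        pi = w.mean()
        se = (w[:, None] * R).sum(axis=0) / w.sum()
        sp = ((1 - w)[:, None] * (1 - R)).sum(axis=0) / (1 - w).sum()
    return pi, se, sp
```

With three sources and two latent classes the model is exactly identified, which is the minimal setting in which such validity estimates can be recovered from agreement patterns alone.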

  4. Repetitive deliberate fires: Development and validation of a methodology to detect series.

    PubMed

    Bruenisholz, Eva; Delémont, Olivier; Ribaux, Olivier; Wilson-Wilde, Linzi

    2017-08-01

The detection of repetitive deliberate fire events is challenging and still often ineffective due to a case-by-case approach. A previous study provided a critical review of the situation and an analysis of the main challenges. That study suggested that the intelligence process, integrating forensic data, could be a valid framework for follow-up and systematic analysis, provided it is adapted to the specificities of repetitive deliberate fires. In the current manuscript, a specific methodology to detect series of deliberate fires, i.e. fires set by the same perpetrators, is presented and validated. It is based on case profiles relying on specific elements previously identified. The method was validated using a dataset of approximately 8000 deliberate fire events collected over 12 years in a Swiss state. Twenty possible series were detected, including 6 of 9 known series. These results are very promising and lead the way to a systematic implementation of this methodology in an intelligence framework, whilst demonstrating the need and benefit of increasing the collection of forensic-specific information to strengthen the value of links between cases. Crown Copyright © 2017. Published by Elsevier B.V. All rights reserved.

  5. Reliability and Validity of a New Method for Isometric Back Extensor Strength Evaluation Using A Hand-Held Dynamometer

    PubMed Central

    2017-01-01

    Objective To investigate the reliability and validity of a new method for isometric back extensor strength measurement using a portable dynamometer. Methods A chair equipped with a small portable dynamometer was designed (Power Track II Commander Muscle Tester). A total of 15 men (mean age, 34.8±7.5 years) and 15 women (mean age, 33.1±5.5 years) with no current back problems or previous history of back surgery were recruited. Subjects were asked to push the back of the chair while seated, and their isometric back extensor strength was measured by the portable dynamometer. Test-retest reliability was assessed with intraclass correlation coefficient (ICC). For the validity assessment, isometric back extensor strength of all subjects was measured by a widely used physical performance evaluation instrument, BTE PrimusRS system. The limit of agreement (LoA) from the Bland-Altman plot was evaluated between two methods. Results The test-retest reliability was excellent (ICC=0.82; 95% confidence interval, 0.65–0.91). The Bland-Altman plots demonstrated acceptable agreement between the two methods: the lower 95% LoA was −63.1 N and the upper 95% LoA was 61.1 N. Conclusion This study shows that isometric back extensor strength measurement using a portable dynamometer has good reliability and validity. PMID:29201818
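The 95% limits of agreement quoted above follow Bland and Altman's standard recipe: the mean of the paired differences (bias) plus or minus 1.96 times their standard deviation. A minimal sketch of that computation (illustrative data, not the study's measurements):

```python
import numpy as np

def bland_altman_limits(a, b):
    """95% limits of agreement between two measurement methods.

    Returns (bias, lower LoA, upper LoA) for paired measurements a, b.
    Minimal sketch of the standard Bland-Altman calculation.
    """
    d = np.asarray(a, float) - np.asarray(b, float)
    bias = d.mean()
    sd = d.std(ddof=1)          # sample SD of the differences
    return bias, bias - 1.96 * sd, bias + 1.96 * sd
```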

  6. Advanced forensic validation for human spermatozoa identification using SPERM HY-LITER™ Express with quantitative image analysis.

    PubMed

    Takamura, Ayari; Watanabe, Ken; Akutsu, Tomoko

    2017-07-01

Identification of human semen is indispensable for the investigation of sexual assaults. Fluorescence staining methods using commercial kits, such as the series of SPERM HY-LITER™ kits, have been useful to detect human sperm via strong fluorescence. These kits have been examined from various forensic aspects. However, because of a lack of evaluation methods, these studies provided neither objective, quantitative descriptions of the results nor clear criteria for the decisions reached. In addition, the variety of validations was considerably limited. In this study, we conducted more advanced validations of SPERM HY-LITER™ Express using our established image analysis method. Use of this method enabled objective and specific identification of fluorescent sperm spots and quantitative comparisons of sperm detection performance under complex experimental conditions. For body fluid mixtures, we examined interference with the fluorescence staining from other body fluid components. Effects of sample decomposition were simulated under high-humidity and high-temperature conditions. Semen with very low sperm concentrations, such as azoospermia and oligospermia samples, represented the most challenging cases in application of the kit. Finally, the tolerance of the kit against various acidic and basic environments was analyzed. The validations herein provide previously unobtainable information useful for practical applications of the SPERM HY-LITER™ Express kit. Moreover, the versatility of our image analysis method toward various complex cases was demonstrated.

  7. Predicting implementation from organizational readiness for change: a study protocol

    PubMed Central

    2011-01-01

Background There is widespread interest in measuring organizational readiness to implement evidence-based practices in clinical care. However, there are a number of challenges to validating organizational measures, including inferential bias arising from the halo effect and method bias - two threats to validity that, while well-documented by organizational scholars, are often ignored in health services research. We describe a protocol to comprehensively assess the psychometric properties of a previously developed survey, the Organizational Readiness to Change Assessment. Objectives Our objective is to conduct a comprehensive assessment of the psychometric properties of the Organizational Readiness to Change Assessment incorporating methods specifically to address threats from halo effect and method bias. Methods and Design We will conduct three sets of analyses using longitudinal, secondary data from four partner projects, each testing interventions to improve the implementation of an evidence-based clinical practice. Partner projects field the Organizational Readiness to Change Assessment at baseline (n = 208 respondents; 53 facilities), and prospectively assess the degree to which the evidence-based practice is implemented. We will conduct predictive and concurrent validity analyses using hierarchical linear modeling and multivariate regression, respectively. For predictive validity, the outcome is the change from baseline to follow-up in the use of the evidence-based practice. We will use intra-class correlations derived from hierarchical linear models to assess inter-rater reliability. Two partner projects will also field measures of job satisfaction for convergent and discriminant validity analyses, and will field Organizational Readiness to Change Assessment measures at follow-up for concurrent validity (n = 158 respondents; 33 facilities). 
Convergent and discriminant validities will test associations between organizational readiness and different aspects of job satisfaction: satisfaction with leadership, which should be highly correlated with readiness, versus satisfaction with salary, which should be less correlated with readiness. Content validity will be assessed using an expert panel and modified Delphi technique. Discussion We propose a comprehensive protocol for validating a survey instrument for assessing organizational readiness to change that specifically addresses key threats of bias related to halo effect, method bias and questions of construct validity that often go unexplored in research using measures of organizational constructs. PMID:21777479

  8. A New Z Score Curve of the Coronary Arterial Internal Diameter Using the Lambda-Mu-Sigma Method in a Pediatric Population.

    PubMed

    Kobayashi, Tohru; Fuse, Shigeto; Sakamoto, Naoko; Mikami, Masashi; Ogawa, Shunichi; Hamaoka, Kenji; Arakaki, Yoshio; Nakamura, Tsuneyuki; Nagasawa, Hiroyuki; Kato, Taichi; Jibiki, Toshiaki; Iwashima, Satoru; Yamakawa, Masaru; Ohkubo, Takashi; Shimoyama, Shinya; Aso, Kentaro; Sato, Seiichi; Saji, Tsutomu

    2016-08-01

    Several coronary artery Z score models have been developed. However, a Z score model derived by the lambda-mu-sigma (LMS) method has not been established. Echocardiographic measurements of the proximal right coronary artery, left main coronary artery, proximal left anterior descending coronary artery, and proximal left circumflex artery were prospectively collected in 3,851 healthy children ≤18 years of age and divided into developmental and validation data sets. In the developmental data set, smooth curves were fitted for each coronary artery using linear, logarithmic, square-root, and LMS methods for both sexes. The relative goodness of fit of these models was compared using the Bayesian information criterion. The best-fitting model was tested for reproducibility using the validation data set. The goodness of fit of the selected model was visually compared with that of the previously reported regression models using a Q-Q plot. Because the internal diameter of each coronary artery was not similar between sexes, sex-specific Z score models were developed. The LMS model with body surface area as the independent variable showed the best goodness of fit; therefore, the internal diameter of each coronary artery was transformed into a sex-specific Z score on the basis of body surface area using the LMS method. In the validation data set, a Q-Q plot of each model indicated that the distribution of Z scores in the LMS models was closer to the normal distribution compared with previously reported regression models. Finally, the final models for each coronary artery in both sexes were developed using the developmental and validation data sets. A Microsoft Excel-based Z score calculator was also created, which is freely available online (http://raise.umin.jp/zsp/calculator/). Novel LMS models with which to estimate the sex-specific Z score of each internal coronary artery diameter were generated and validated using a large pediatric population. 
Copyright © 2016 American Society of Echocardiography. Published by Elsevier Inc. All rights reserved.
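The LMS approach above converts a measurement into a Z score with Cole's formula, Z = ((X/M)^L - 1)/(L*S), reducing to ln(X/M)/S when L = 0, where L, M and S are the fitted Box-Cox power, median and coefficient of variation at the child's body surface area. A minimal sketch (the parameter values in the test are illustrative, not the paper's fitted curves):

```python
import math

def lms_z(x, L, M, S):
    """LMS (lambda-mu-sigma) Z score for a measurement x.

    L: Box-Cox power, M: median, S: coefficient of variation,
    all taken from the fitted reference curve at the subject's
    covariate value (here, body surface area). Cole's formula.
    """
    if abs(L) < 1e-12:
        return math.log(x / M) / S   # limiting case L -> 0
    return ((x / M) ** L - 1.0) / (L * S)
```

A measurement equal to the median M always maps to Z = 0, regardless of L and S.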

  9. Validity of self-reported lunch recalls in Swedish school children aged 6–8 years

    PubMed Central

    2013-01-01

    Background Previous studies have suggested that young children are inaccurate reporters of dietary intake. The purpose of this study was to validate a single recall of the previous day’s school lunch reported by 6–8 year old Swedish children and to assess teacher-recorded intake of the same meal in a standardized food journal. An additional research question was whether parents could report their child’s intake of the previous day’s lunch. Subjects constituted a convenience sample from the large, multi-country study Identification and prevention of Dietary- and lifestyle-induced health EFfects In Children and infantS (IDEFICS). Validations of both children’s recalls and teachers’ records were made by comparing results with the duplicate plate reference method. Findings Twenty-five children (12 boys/13 girls) aged 6–8 years participated in the validation study at one school in western Sweden. Children were accurate self-reporters of their dietary intake at lunch, with no significant difference between reported and weighed intake (Mean difference (SD): 7(50) kcals, p=0.49). Teachers significantly over-reported intake (Mean difference (SD): 65(79) kcals, p=0.01). For both methods, child-reported and teacher-recorded, correlations with weighed intake were strong (Pearson’s correlations r=0.92, p<0.001 and r=0.83, p<0.001 respectively). Bland-Altman plots showed strong agreement between child-reported and weighed intakes but confirmed systematic differences between teacher-records and weighed intakes. Foods were recalled by children with a food-match rate of 90%. In all cases parents themselves were unable to report on quantities consumed and only four of 25 children had parents with knowledge regarding food items consumed. Conclusions Children 6–8 years of age accurately recalled their school lunch intake for one occasion while teachers recorded with less accuracy. 
Our findings suggest that children as young as six years of age may be better able to report on their dietary intake than previously suggested, at least for one main meal at school. Teacher-recorded intake provides a satisfactory estimate but with greater systematic deviation from the weighed intake. Parents were not able to report on their children’s school lunches consumed on the previous day. PMID:24047239

  10. Recommendations for adaptation and validation of commercial kits for biomarker quantification in drug development.

    PubMed

    Khan, Masood U; Bowsher, Ronald R; Cameron, Mark; Devanarayan, Viswanath; Keller, Steve; King, Lindsay; Lee, Jean; Morimoto, Alyssa; Rhyne, Paul; Stephen, Laurie; Wu, Yuling; Wyant, Timothy; Lachno, D Richard

    2015-01-01

    Increasingly, commercial immunoassay kits are used to support drug discovery and development. Longitudinally consistent kit performance is crucial, but the degree to which kits and reagents are characterized by manufacturers is not standardized, nor are the approaches by users to adapt them and evaluate their performance through validation prior to use. These factors can negatively impact data quality. This paper offers a systematic approach to assessment, method adaptation and validation of commercial immunoassay kits for quantification of biomarkers in drug development, expanding upon previous publications and guidance. These recommendations aim to standardize and harmonize user practices, contributing to reliable biomarker data from commercial immunoassays, thus, enabling properly informed decisions during drug development.

  11. Stereological Analysis of Liver Biopsy Histology Sections as a Reference Standard for Validating Non-Invasive Liver Fat Fraction Measurements by MRI

    PubMed Central

    St. Pierre, Tim G.; House, Michael J.; Bangma, Sander J.; Pang, Wenjie; Bathgate, Andrew; Gan, Eng K.; Ayonrinde, Oyekoya T.; Bhathal, Prithi S.; Clouston, Andrew; Olynyk, John K.; Adams, Leon A.

    2016-01-01

    Background and Aims Validation of non-invasive methods of liver fat quantification requires a reference standard. However, using standard histopathology assessment of liver biopsies is problematical because of poor repeatability. We aimed to assess a stereological method of measuring volumetric liver fat fraction (VLFF) in liver biopsies and to use the method to validate a magnetic resonance imaging method for measurement of VLFF. Methods VLFFs were measured in 59 subjects (1) by three independent analysts using a stereological point counting technique combined with the Delesse principle on liver biopsy histological sections and (2) by three independent analysts using the HepaFat-Scan® technique on magnetic resonance images of the liver. Bland Altman statistics and intraclass correlation (IC) were used to assess the repeatability of each method and the bias between the methods of liver fat fraction measurement. Results Inter-analyst repeatability coefficients for the stereology and HepaFat-Scan® methods were 8.2 (95% CI 7.7–8.8)% and 2.4 (95% CI 2.2–2.5)% VLFF respectively. IC coefficients were 0.86 (95% CI 0.69–0.93) and 0.990 (95% CI 0.985–0.994) respectively. Small biases (≤3.4%) were observable between two pairs of analysts using stereology while no significant biases were observable between any of the three pairs of analysts using HepaFat-Scan®. A bias of 1.4±0.5% VLFF was observed between the HepaFat-Scan® method and the stereological method. Conclusions Repeatability of the stereological method is superior to the previously reported performance of assessment of hepatic steatosis by histopathologists and is a suitable reference standard for validating non-invasive methods of measurement of VLFF. PMID:27501242

  12. Torso-Tank Validation of High-Resolution Electrogastrography (EGG): Forward Modelling, Methodology and Results.

    PubMed

    Calder, Stefan; O'Grady, Greg; Cheng, Leo K; Du, Peng

    2018-04-27

Electrogastrography (EGG) is a non-invasive method for measuring gastric electrical activity. Recent simulation studies have attempted to extend the current clinical utility of the EGG, in particular by providing a theoretical framework for distinguishing specific gastric slow wave dysrhythmias. In this paper we implement an experimental setup called a 'torso-tank' with the aim of expanding and experimentally validating these previous simulations. The torso-tank was developed using an adult male torso phantom with 190 electrodes embedded throughout the torso. The gastric slow waves were reproduced using an artificial current source capable of producing 3D electrical fields. Multiple gastric dysrhythmias were reproduced based on high-resolution mapping data from cases of human gastric dysfunction (gastric re-entry, conduction blocks and ectopic pacemakers) in addition to normal test data. Each case was recorded and compared to the previously-presented simulated results. Qualitative and quantitative analyses were performed to quantify the accuracy, showing [Formula: see text] 1.8% difference, [Formula: see text] 0.99 correlation, and [Formula: see text] 0.04 normalised RMS error between experimental and simulated findings. These results reaffirm the previous findings; taken together, these methods present a promising morphology-based methodology for advancing the understanding and clinical applications of EGG.

  13. Fast sweeping methods for hyperbolic systems of conservation laws at steady state II

    NASA Astrophysics Data System (ADS)

    Engquist, Björn; Froese, Brittany D.; Tsai, Yen-Hsi Richard

    2015-04-01

    The idea of using fast sweeping methods for solving stationary systems of conservation laws has previously been proposed for efficiently computing solutions with sharp shocks. We further develop these methods to allow for a more challenging class of problems including problems with sonic points, shocks originating in the interior of the domain, rarefaction waves, and two-dimensional systems. We show that fast sweeping methods can produce higher-order accuracy. Computational results validate the claims of accuracy, sharp shock curves, and optimal computational efficiency.
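The sweeping idea, alternating ordered Gauss-Seidel passes so that information propagates along characteristics in each direction, is easiest to see in the classical 1D eikonal setting rather than the conservation-law systems treated in the paper. A hedged sketch of that simpler setting (our own illustration, not the authors' scheme):

```python
import numpy as np

def fast_sweep_eikonal_1d(f, h, src):
    """1D fast sweeping for the eikonal equation |u'| = f, u(src) = 0.

    Alternating left-to-right and right-to-left Gauss-Seidel sweeps,
    each taking the upwind update u[i] = min(u[i], u[nbr] + h*f[i]).
    In 1D a couple of sweep pairs reach the steady solution.
    """
    f = np.asarray(f, float)
    n = len(f)
    u = np.full(n, np.inf)
    u[src] = 0.0
    for _ in range(2):
        for i in range(n):                # left-to-right sweep
            if i > 0:
                u[i] = min(u[i], u[i - 1] + h * f[i])
        for i in range(n - 1, -1, -1):    # right-to-left sweep
            if i < n - 1:
                u[i] = min(u[i], u[i + 1] + h * f[i])
    return u
```

With f = 1 the solution is simply the distance to the source, which makes the directional role of each sweep easy to verify.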

  14. Applied Chaos Level Test for Validation of Signal Conditions Underlying Optimal Performance of Voice Classification Methods.

    PubMed

    Liu, Boquan; Polce, Evan; Sprott, Julien C; Jiang, Jack J

    2018-05-17

    The purpose of this study is to introduce a chaos level test to evaluate linear and nonlinear voice type classification method performances under varying signal chaos conditions without subjective impression. Voice signals were constructed with differing degrees of noise to model signal chaos. Within each noise power, 100 Monte Carlo experiments were applied to analyze the output of jitter, shimmer, correlation dimension, and spectrum convergence ratio. The computational output of the 4 classifiers was then plotted against signal chaos level to investigate the performance of these acoustic analysis methods under varying degrees of signal chaos. A diffusive behavior detection-based chaos level test was used to investigate the performances of different voice classification methods. Voice signals were constructed by varying the signal-to-noise ratio to establish differing signal chaos conditions. Chaos level increased sigmoidally with increasing noise power. Jitter and shimmer performed optimally when the chaos level was less than or equal to 0.01, whereas correlation dimension was capable of analyzing signals with chaos levels of less than or equal to 0.0179. Spectrum convergence ratio demonstrated proficiency in analyzing voice signals with all chaos levels investigated in this study. The results of this study corroborate the performance relationships observed in previous studies and, therefore, demonstrate the validity of the validation test method. The presented chaos level validation test could be broadly utilized to evaluate acoustic analysis methods and establish the most appropriate methodology for objective voice analysis in clinical practice.
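Jitter and shimmer, two of the four classifiers examined, are standard perturbation measures: the mean absolute difference between consecutive cycle periods (jitter) or peak amplitudes (shimmer), relative to the mean. A minimal sketch under that standard definition (we assume the relative, percent-style variants; the study's exact implementation may differ):

```python
import numpy as np

def jitter_shimmer(periods, amplitudes):
    """Relative jitter and shimmer from per-cycle periods/amplitudes.

    jitter  = mean |P[i+1] - P[i]| / mean P
    shimmer = mean |A[i+1] - A[i]| / mean A
    Standard perturbation definitions; minimal sketch.
    """
    p = np.asarray(periods, float)
    a = np.asarray(amplitudes, float)
    jitter = np.mean(np.abs(np.diff(p))) / np.mean(p)
    shimmer = np.mean(np.abs(np.diff(a))) / np.mean(a)
    return jitter, shimmer
```

Both measures are zero for a perfectly periodic signal, which is why their usefulness degrades as the chaos level of the signal rises.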

  15. Development and validation of a yoga module for Parkinson disease.

    PubMed

    Kakde, Noopur; Metri, Kashinath G; Varambally, Shivarama; Nagaratna, Raghuram; Nagendra, H R

    2017-03-25

Background Parkinson's disease (PD), a progressive neurodegenerative disease, affects motor and nonmotor functions, leading to severe debility and poor quality of life. Studies have reported the beneficial role of yoga in alleviating the symptoms of PD; however, a validated yoga module for PD is unavailable. This study developed and validated an integrated yoga module (IYM) for PD. Methods The IYM was prepared after a thorough review of classical yoga texts and previous findings. Twenty experienced yoga experts, who fulfilled the inclusion criteria, were selected to validate the content of the IYM. A total of 28 practices were included in the IYM, and each practice was discussed and rated as (i) not essential, (ii) useful but not essential, and (iii) essential; the content validity ratio (CVR) was calculated using Lawshe's formula. Results Data analysis revealed that of the 28 IYM practices, 21 exhibited significant content validity (cut-off value: 0.42, as calculated by applying Lawshe's formula for the CVR). Conclusions The IYM is valid for PD, with good content validity. However, future studies must determine the feasibility and efficacy of the developed module.
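Lawshe's formula referred to above is CVR = (n_e - N/2)/(N/2), where n_e of N panelists rate an item 'essential'. With this study's panel of 20 experts, the 0.42 cut-off is first exceeded when 15 of the 20 rate an item essential (CVR = 0.5). A minimal sketch:

```python
def content_validity_ratio(n_essential, n_experts):
    """Lawshe's content validity ratio: (n_e - N/2) / (N/2).

    Ranges from -1 (no expert rates the item essential) through 0
    (exactly half do) to +1 (all do).
    """
    half = n_experts / 2.0
    return (n_essential - half) / half
```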

  16. Ethinylestradiol and levonorgestrel preparations on the Belgian market: a comparative study.

    PubMed

    Vanheusden, V; De Braekeleer, K; Corthout, J

    2012-03-01

    Preparations formulated as coated or film-coated tablets, containing levonorgestrel and the combination ethinylestradiol/levonorgestrel, were evaluated in a comparative study. This study comprised in vitro dissolution, assay and content uniformity. The analytical methods were previously validated according to international guidelines. All examined products complied with the postulated requirements.

  17. Internal validation of the prognostic index for spine metastasis (PRISM) for stratifying survival in patients treated with spinal stereotactic radiosurgery.

    PubMed

    Jensen, Garrett; Tang, Chad; Hess, Kenneth R; Bishop, Andrew J; Pan, Hubert Y; Li, Jing; Yang, James N; Tannir, Nizar M; Amini, Behrang; Tatsui, Claudio; Rhines, Laurence; Brown, Paul D; Ghia, Amol J

    2017-01-01

We sought to validate the Prognostic Index for Spinal Metastases (PRISM), a scoring system that stratifies patients into subgroups by overall survival. Methods and materials: The PRISM was previously created from multivariate Cox regression with patients enrolled in prospective single-institution trials of stereotactic spine radiosurgery (SSRS) for spinal metastasis. We assess model calibration and discrimination within a validation cohort of patients treated off-trial with SSRS for metastatic disease at the same institution. The training and validation cohorts consisted of 205 and 249 patients respectively. Similar survival trends were observed across the 4 PRISM subgroups in both cohorts. Survival was significantly different between PRISM subgroups (P<0.0001). The C-index for the validation cohort was 0.68 after stratification into subgroups. We internally validated the PRISM with patients treated off-protocol, demonstrating that it can distinguish subgroups by survival, which will be useful for individualizing treatment of spinal metastases and stratifying patients for clinical trials.
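The C-index reported above is Harrell's concordance probability: among usable pairs (those where the earlier of the two times is an observed event, not a censoring), the fraction in which the higher-risk patient fails first. A minimal O(n²) sketch, with ties in risk score counted as half-concordant (illustrative only, not the study's software):

```python
def c_index(times, events, risk):
    """Harrell's concordance index for right-censored survival data.

    times:  follow-up times
    events: 1 if the time is an observed event, 0 if censored
    risk:   predicted risk score (higher = expected to fail sooner)
    """
    num = den = 0.0
    n = len(times)
    for i in range(n):
        for j in range(n):
            # usable pair: i fails before j's time and i's event is observed
            if times[i] < times[j] and events[i]:
                den += 1
                if risk[i] > risk[j]:
                    num += 1            # concordant
                elif risk[i] == risk[j]:
                    num += 0.5          # tied risk
    return num / den
```

A value of 0.5 corresponds to chance-level discrimination and 1.0 to perfect ranking, so the 0.68 above indicates modest but real separation.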

  18. Validating a benchmarking tool for audit of early outcomes after operations for head and neck cancer.

    PubMed

    Tighe, D; Sassoon, I; McGurk, M

    2017-04-01

    INTRODUCTION In 2013 all UK surgical specialties, with the exception of head and neck surgery, published outcome data adjusted for case mix for indicator operations. This paper reports a pilot study to validate a previously published risk adjustment score on patients from separate UK cancer centres. METHODS A case note audit was performed of 1,075 patients undergoing 1,218 operations for head and neck squamous cell carcinoma under general anaesthesia in 4 surgical centres. A logistic regression equation predicting for all complications, previously validated internally at sites A-C, was tested on a fourth external validation sample (site D, 172 operations) using receiver operating characteristic curves, Hosmer-Lemeshow goodness of fit analysis and Brier scores. RESULTS Thirty-day complication rates varied widely (34-51%) between the centres. The predictive score allowed imperfect risk adjustment (area under the curve: 0.70), with Hosmer-Lemeshow analysis suggesting good calibration. The Brier score changed from 0.19 for sites A-C to 0.23 when site D was also included, suggesting poor accuracy overall. CONCLUSIONS Marked differences in operative risk and patient case mix captured by the risk adjustment score do not explain all the differences in observed outcomes. Further investigation with different methods is recommended to improve modelling of risk. Morbidity is common, and usually has a major impact on patient recovery, ward occupancy, hospital finances and patient perception of quality of care. We hope comparative audit will highlight good performance and challenge underperformance where it exists.
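The discrimination and accuracy statistics used in this audit can be sketched directly: the area under the ROC curve via the rank-sum (Mann-Whitney) identity, and the Brier score as the mean squared error of the predicted probabilities. The data in the test are illustrative, not the study's (which fitted a logistic regression for 30-day complications):

```python
import numpy as np

def brier_score(y, p):
    """Mean squared difference between predicted probability and 0/1
    outcome; lower is better (0.25 is chance on a balanced sample)."""
    y = np.asarray(y, float)
    p = np.asarray(p, float)
    return np.mean((p - y) ** 2)

def auc(y, p):
    """Area under the ROC curve via the Mann-Whitney identity:
    the probability a random positive outranks a random negative,
    with ties counted as 0.5."""
    y = np.asarray(y)
    p = np.asarray(p, float)
    pos, neg = p[y == 1], p[y == 0]
    wins = (pos[:, None] > neg[None, :]).sum() \
        + 0.5 * (pos[:, None] == neg[None, :]).sum()
    return wins / (len(pos) * len(neg))
```

This makes the paper's finding concrete: AUC measures ranking only, while the Brier score also penalizes miscalibrated probabilities, which is why the two can move in different directions when a new site is added.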

  19. A Comparison of Reliability and Construct Validity between the Original and Revised Versions of the Rosenberg Self-Esteem Scale

    PubMed Central

    Wongpakaran, Nahathai

    2012-01-01

    Objective The Rosenberg Self-Esteem Scale (RSES) is a widely used instrument that has been tested for reliability and validity in many settings; however, some negatively worded items appear to have caused it to show low reliability in a number of studies. In this study, we revised the one negative item that had produced the worst outcome for the structure of the scale in previous studies, then re-analyzed the new version for its reliability and construct validity, comparing it to the original version with respect to fit indices. Methods In total, 851 students from Chiang Mai University (mean age: 19.51 ± 1.7 years; 57% female) participated in this study. Of these, 664 students completed the Thai version of the original RSES, containing five positively worded and five negatively worded items, while 187 students used the revised version, containing six positively worded and four negatively worded items. Confirmatory factor analysis was applied, using a uni-dimensional model with method effects and a correlated uniqueness approach. Results The revised version showed the same (good) level of reliability as the original but yielded a better model fit. The revised RSES demonstrated excellent fit statistics, with χ2=29.19 (df=19, n=187, p=0.063), GFI=0.970, TFI=0.969, NFI=0.964, CFI=0.987, SRMR=0.040 and RMSEA=0.054. Conclusion The revised version of the Thai RSES demonstrated an equivalent level of reliability but better construct validity when compared to the original. PMID:22396685

  20. Validating a benchmarking tool for audit of early outcomes after operations for head and neck cancer

    PubMed Central

    Sassoon, I; McGurk, M

    2017-01-01

    INTRODUCTION In 2013 all UK surgical specialties, with the exception of head and neck surgery, published outcome data adjusted for case mix for indicator operations. This paper reports a pilot study to validate a previously published risk adjustment score on patients from separate UK cancer centres. METHODS A case note audit was performed of 1,075 patients undergoing 1,218 operations for head and neck squamous cell carcinoma under general anaesthesia in 4 surgical centres. A logistic regression equation predicting for all complications, previously validated internally at sites A–C, was tested on a fourth external validation sample (site D, 172 operations) using receiver operating characteristic curves, Hosmer–Lemeshow goodness of fit analysis and Brier scores. RESULTS Thirty-day complication rates varied widely (34–51%) between the centres. The predictive score allowed imperfect risk adjustment (area under the curve: 0.70), with Hosmer–Lemeshow analysis suggesting good calibration. The Brier score changed from 0.19 for sites A–C to 0.23 when site D was also included, suggesting poor accuracy overall. CONCLUSIONS Marked differences in operative risk and patient case mix captured by the risk adjustment score do not explain all the differences in observed outcomes. Further investigation with different methods is recommended to improve modelling of risk. Morbidity is common, and usually has a major impact on patient recovery, ward occupancy, hospital finances and patient perception of quality of care. We hope comparative audit will highlight good performance and challenge underperformance where it exists. PMID:27917662

  1. Quantification of integrated HIV DNA by repetitive-sampling Alu-HIV PCR on the basis of Poisson statistics.

    PubMed

    De Spiegelaere, Ward; Malatinkova, Eva; Lynch, Lindsay; Van Nieuwerburgh, Filip; Messiaen, Peter; O'Doherty, Una; Vandekerckhove, Linos

    2014-06-01

    Quantification of integrated proviral HIV DNA by repetitive-sampling Alu-HIV PCR is a candidate virological tool to monitor the HIV reservoir in patients. However, the experimental procedures and data analysis of the assay are complex and hinder its widespread use. Here, we provide an improved and simplified data analysis method by adopting binomial and Poisson statistics. A modified analysis method on the basis of Poisson statistics was used to analyze the binomial data of positive and negative reactions from a 42-replicate Alu-HIV PCR by use of dilutions of an integration standard and on samples of 57 HIV-infected patients. Results were compared with the quantitative output of the previously described Alu-HIV PCR method. Poisson-based quantification of the Alu-HIV PCR was linearly correlated with the standard dilution series, indicating that absolute quantification with the Poisson method is a valid alternative for data analysis of repetitive-sampling Alu-HIV PCR data. Quantitative outputs of patient samples assessed by the Poisson method correlated with the previously described Alu-HIV PCR analysis, indicating that this method is a valid alternative for quantifying integrated HIV DNA. Poisson-based analysis of the Alu-HIV PCR data enables absolute quantification without the need of a standard dilution curve. Implementation of the CI estimation permits improved qualitative analysis of the data and provides a statistical basis for the required minimal number of technical replicates. © 2014 The American Association for Clinical Chemistry.
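    The Poisson step described above rests on the fact that an all-or-none PCR replicate is negative with probability exp(-λ) when template copies are Poisson-distributed across reactions. A minimal sketch of recovering mean copy number from a 42-replicate plate (illustrative counts, not patient data):

    ```python
    import math

    def poisson_copies(negative_wells, total_wells):
        """Maximum-likelihood mean template copies per reaction from the
        fraction of negative replicates, using P(negative) = exp(-lambda)."""
        if negative_wells == 0 or negative_wells == total_wells:
            # All-positive or all-negative plates give no finite estimate.
            raise ValueError("plate is off-scale; adjust input dilution")
        return -math.log(negative_wells / total_wells)

    # A 42-replicate plate (as in the assay described above) with 21 negatives:
    lam = poisson_copies(21, 42)
    print(round(lam, 3))  # -> 0.693 (ln 2) mean copies per reaction
    ```

    This is why absolute quantification no longer needs a standard dilution curve: the negative fraction alone fixes λ, and the binomial spread of negatives across replicates yields a confidence interval.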

  2. Automated Tumor Volumetry Using Computer-Aided Image Segmentation

    PubMed Central

    Bilello, Michel; Sadaghiani, Mohammed Salehi; Akbari, Hamed; Atthiah, Mark A.; Ali, Zarina S.; Da, Xiao; Zhan, Yiqang; O'Rourke, Donald; Grady, Sean M.; Davatzikos, Christos

    2015-01-01

    Rationale and Objectives Accurate segmentation of brain tumors, and quantification of tumor volume, is important for diagnosis, monitoring, and planning therapeutic intervention. Manual segmentation is not widely used because of time constraints. Previous efforts have mainly produced methods that are tailored to a particular type of tumor or acquisition protocol and have mostly failed to produce a method that functions on different tumor types and is robust to changes in scanning parameters, resolution, and image quality, thereby limiting their clinical value. Herein, we present a semiautomatic method for tumor segmentation that is fast, accurate, and robust to a wide variation in image quality and resolution. Materials and Methods A semiautomatic segmentation method based on the geodesic distance transform was developed and validated by using it to segment 54 brain tumors. Glioblastomas, meningiomas, and brain metastases were segmented. Qualitative validation was based on physician ratings provided by three clinical experts. Quantitative validation was based on comparing semiautomatic and manual segmentations. Results Tumor segmentations obtained using manual and automatic methods were compared quantitatively using the Dice measure of overlap. Subjective evaluation was performed by having human experts rate the computerized segmentations on a 0–5 rating scale where 5 indicated perfect segmentation. Conclusions The proposed method addresses a significant, unmet need in the field of neuro-oncology. Specifically, this method enables clinicians to obtain accurate and reproducible tumor volumes without the need for manual segmentation. PMID:25770633
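    The Dice measure of overlap used above is straightforward to compute from binary segmentation masks. A minimal sketch with toy masks (not the study's images):

    ```python
    import numpy as np

    def dice(a, b):
        """Dice overlap between two binary masks: 2|A ∩ B| / (|A| + |B|)."""
        a, b = np.asarray(a, bool), np.asarray(b, bool)
        denom = a.sum() + b.sum()
        return 1.0 if denom == 0 else float(2.0 * np.logical_and(a, b).sum() / denom)

    # Two 4x4-pixel square "tumors", offset by one pixel in each direction:
    manual = np.zeros((8, 8), bool); manual[2:6, 2:6] = True
    auto = np.zeros((8, 8), bool); auto[3:7, 3:7] = True
    print(dice(manual, auto))  # overlap is the 3x3 block -> 0.5625
    ```

    Dice ranges from 0 (no overlap) to 1 (identical masks), and is the standard quantitative comparison between semiautomatic and manual segmentations.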

  3. Validate or falsify: Lessons learned from a microscopy method claimed to be useful for detecting Borrelia and Babesia organisms in human blood.

    PubMed

    Aase, Audun; Hajdusek, Ondrej; Øines, Øivind; Quarsten, Hanne; Wilhelmsson, Peter; Herstad, Tove K; Kjelland, Vivian; Sima, Radek; Jalovecka, Marie; Lindgren, Per-Eric; Aaberge, Ingeborg S

    2016-01-01

    A modified microscopy protocol (the LM-method) was used to demonstrate what was interpreted as Borrelia spirochetes, and later also Babesia sp., in peripheral blood from patients. The method gained much publicity but was not validated prior to publication; validation, using appropriate scientific methodology including a control group, became the purpose of this study. Blood from 21 patients previously interpreted as positive for Borrelia and/or Babesia infection by the LM-method and from 41 healthy controls without known history of tick bite was collected, blinded and analysed for these pathogens by microscopy in two laboratories (by the LM-method and the conventional method, respectively), by PCR methods in five laboratories and by serology in one laboratory. Microscopy by the LM-method identified structures claimed to be Borrelia and/or Babesia in 66% of the blood samples of the patient group and in 85% of the healthy control group. Microscopy by the conventional method, performed for Babesia only, did not identify Babesia in any sample. PCR analysis detected Borrelia DNA in one sample of the patient group and in eight samples of the control group, whereas Babesia DNA was not detected in any of the blood samples. The structures interpreted as Borrelia and Babesia by the LM-method could not be verified by PCR. The method was thus falsified. This study underlines the importance of proper test validation before new or modified assays are introduced.

  4. Improving validation methods for molecular diagnostics: application of Bland-Altman, Deming and simple linear regression analyses in assay comparison and evaluation for next-generation sequencing

    PubMed Central

    Misyura, Maksym; Sukhai, Mahadeo A; Kulasignam, Vathany; Zhang, Tong; Kamel-Reid, Suzanne; Stockley, Tracy L

    2018-01-01

    Aims A standard approach in test evaluation is to compare results of the assay in validation to results from previously validated methods. For quantitative molecular diagnostic assays, comparison of test values is often performed using simple linear regression and the coefficient of determination (R2), using R2 as the primary metric of assay agreement. However, the use of R2 alone does not adequately quantify constant or proportional errors required for optimal test evaluation. More extensive statistical approaches, such as Bland-Altman and expanded interpretation of linear regression methods, can be used to more thoroughly compare data from quantitative molecular assays. Methods We present the application of Bland-Altman and linear regression statistical methods to evaluate quantitative outputs from next-generation sequencing assays (NGS). NGS-derived data sets from assay validation experiments were used to demonstrate the utility of the statistical methods. Results Both Bland-Altman and linear regression were able to detect the presence and magnitude of constant and proportional error in quantitative values of NGS data. Deming linear regression was used in the context of assay comparison studies, while simple linear regression was used to analyse serial dilution data. Bland-Altman statistical approach was also adapted to quantify assay accuracy, including constant and proportional errors, and precision where theoretical and empirical values were known. Conclusions The complementary application of the statistical methods described in this manuscript enables more extensive evaluation of performance characteristics of quantitative molecular assays, prior to implementation in the clinical molecular laboratory. PMID:28747393
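    As a sketch of the complementary statistics described above, the following computes a Bland-Altman bias with 95% limits of agreement and a Deming regression slope/intercept for paired assay measurements. The data are hypothetical variant-allele-fraction values with a built-in proportional error, not the authors' NGS data:

    ```python
    import numpy as np

    def bland_altman(x, y):
        """Bias and 95% limits of agreement between paired measurements."""
        d = np.asarray(y, float) - np.asarray(x, float)
        bias, sd = d.mean(), d.std(ddof=1)
        return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

    def deming(x, y, lam=1.0):
        """Deming regression slope/intercept; lam is the ratio of the two
        assays' error variances (1.0 = orthogonal regression)."""
        x, y = np.asarray(x, float), np.asarray(y, float)
        sxx, syy = np.var(x, ddof=1), np.var(y, ddof=1)
        sxy = np.cov(x, y)[0, 1]
        slope = (syy - lam * sxx
                 + np.sqrt((syy - lam * sxx) ** 2 + 4 * lam * sxy ** 2)) / (2 * sxy)
        return slope, y.mean() - slope * x.mean()

    # Reference assay (x) vs assay in validation (y) with ~10% proportional error:
    x = np.array([5, 10, 20, 30, 40, 50], float)
    y = 1.1 * x + np.array([0.2, -0.1, 0.3, -0.2, 0.1, -0.3])
    slope, intercept = deming(x, y)
    bias, limits = bland_altman(x, y)
    print(round(slope, 2), round(bias, 2))  # slope near 1.1, positive bias
    ```

    A slope away from 1 flags proportional error and a non-zero intercept flags constant error, which is exactly the information R2 alone cannot provide.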

  5. The HIrisPlex-S system for eye, hair and skin colour prediction from DNA: Introduction and forensic developmental validation.

    PubMed

    Chaitanya, Lakshmi; Breslin, Krystal; Zuñiga, Sofia; Wirken, Laura; Pośpiech, Ewelina; Kukla-Bartoszek, Magdalena; Sijen, Titia; Knijff, Peter de; Liu, Fan; Branicki, Wojciech; Kayser, Manfred; Walsh, Susan

    2018-07-01

    Forensic DNA Phenotyping (FDP), i.e. the prediction of human externally visible traits from DNA, has become a fast growing subfield within forensic genetics due to the intelligence information it can provide from DNA traces. FDP outcomes can help focus police investigations in search of unknown perpetrators, who are generally unidentifiable with standard DNA profiling. Therefore, we previously developed and forensically validated the IrisPlex DNA test system for eye colour prediction and the HIrisPlex system for combined eye and hair colour prediction from DNA traces. Here we introduce and forensically validate the HIrisPlex-S DNA test system (S for skin) for the simultaneous prediction of eye, hair, and skin colour from trace DNA. This FDP system consists of two SNaPshot-based multiplex assays targeting a total of 41 SNPs via a novel multiplex assay for 17 skin colour predictive SNPs and the previous HIrisPlex assay for 24 eye and hair colour predictive SNPs, 19 of which also contribute to skin colour prediction. The HIrisPlex-S system further comprises three statistical prediction models, the previously developed IrisPlex model for eye colour prediction based on 6 SNPs, the previous HIrisPlex model for hair colour prediction based on 22 SNPs, and the recently introduced HIrisPlex-S model for skin colour prediction based on 36 SNPs. In the forensic developmental validation testing, the novel 17-plex assay performed in full agreement with the Scientific Working Group on DNA Analysis Methods (SWGDAM) guidelines, as previously shown for the 24-plex assay. Sensitivity testing of the 17-plex assay revealed complete SNP profiles from as little as 63 pg of input DNA, equalling the previously demonstrated sensitivity threshold of the 24-plex HIrisPlex assay. 
Testing of simulated forensic casework samples such as blood, semen and saliva stains, of inhibited DNA samples, of low-quantity touch (trace) DNA samples and of artificially degraded DNA samples, as well as concordance testing, demonstrated the robustness, efficiency and forensic suitability of the new 17-plex assay, as previously shown for the 24-plex assay. Finally, we provide an update to the publicly available HIrisPlex website https://hirisplex.erasmusmc.nl/, now allowing the estimation of individual probabilities for 3 eye, 4 hair, and 5 skin colour categories from HIrisPlex-S input genotypes. The HIrisPlex-S DNA test represents the first forensically validated tool for skin colour prediction, and the first forensically validated tool for simultaneous eye, hair and skin colour prediction from DNA. Copyright © 2018 Elsevier B.V. All rights reserved.

  6. An IMU-to-Body Alignment Method Applied to Human Gait Analysis.

    PubMed

    Vargas-Valencia, Laura Susana; Elias, Arlindo; Rocon, Eduardo; Bastos-Filho, Teodiano; Frizera, Anselmo

    2016-12-10

    This paper presents a novel calibration procedure as a simple, yet powerful, method to place and align inertial sensors with body segments. The calibration can be easily replicated without the need of any additional tools. The proposed method is validated in three different applications: a computer mathematical simulation; a simplified joint composed of two semi-spheres interconnected by a universal goniometer; and a real gait test with five able-bodied subjects. Simulation results demonstrate that, after the calibration method is applied, the joint angles are correctly measured independently of previous sensor placement on the joint, thus validating the proposed procedure. In the cases of a simplified joint and a real gait test with human volunteers, the method also performs correctly, although secondary plane errors appear when compared with the simulation results. We believe that such errors are caused by limitations of the current inertial measurement unit (IMU) technology and fusion algorithms. In conclusion, the presented calibration procedure is an interesting option to solve the alignment problem when using IMUs for gait analysis.

  7. In-vivo detectability index: development and validation of an automated methodology

    NASA Astrophysics Data System (ADS)

    Smith, Taylor Brunton; Solomon, Justin; Samei, Ehsan

    2017-03-01

    The purpose of this study was to develop and validate a method to estimate patient-specific detectability indices directly from patients' CT images (i.e., "in vivo"). The method works by automatically extracting noise (NPS) and resolution (MTF) properties from each patient's CT series based on previously validated techniques. Patient images are thresholded into skin-air interfaces to form edge-spread functions, which are further binned, differentiated, and Fourier transformed to form the MTF. The NPS is likewise estimated from uniform areas of the image. These are combined with assumed task functions (reference function: 10 mm disk lesion with contrast of -15 HU) to compute detectability indices for a non-prewhitening matched filter model observer predicting observer performance. The results were compared to those from a previous human detection study on 105 subtle, hypo-attenuating liver lesions, using a two-alternative forced-choice (2AFC) method, over 6 dose levels using 16 readers. The in vivo detectability indices estimated for all patient images were compared to binary 2AFC outcomes with a generalized linear mixed-effects statistical model (Probit link function, linear terms only, no interactions, random term for readers). The model showed that the in vivo detectability indices were strongly predictive of 2AFC outcomes (P < 0.05). A linear comparison between the human detection accuracy and model-predicted detection accuracy (for like conditions) resulted in Pearson and Spearman correlation coefficients of 0.86 and 0.87, respectively. These data provide evidence that the in vivo detectability index could potentially be used to automatically estimate and track image quality in a clinical operation.
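    One common form of the non-prewhitening (NPW) model-observer detectability index combines the task function, MTF and NPS on a frequency grid: d'^2 = [Σ W²·MTF²]² / Σ W²·MTF²·NPS. The sketch below is illustrative only: it substitutes a Gaussian lesion for the disk task above, and uses a placeholder Gaussian MTF and flat NPS rather than patient-extracted measurements:

    ```python
    import numpy as np

    def npw_dprime(task, mtf, nps, df):
        """NPW detectability index on a 2D frequency grid (spacing df)."""
        num = (np.sum(task**2 * mtf**2) * df**2) ** 2
        den = np.sum(task**2 * mtf**2 * nps) * df**2
        return float(np.sqrt(num / den))

    n, df = 512, 0.01                       # frequency grid, cycles/mm
    f = np.fft.fftfreq(n, d=1.0 / (n * df))
    fx, fy = np.meshgrid(f, f)
    fr = np.hypot(fx, fy)                   # radial spatial frequency

    # Illustrative Gaussian lesion (contrast 15 HU, sigma = 2 mm): its 2D
    # Fourier transform is C * 2*pi*sigma^2 * exp(-2*pi^2*sigma^2*f^2).
    C, sigma = 15.0, 2.0
    task = C * 2 * np.pi * sigma**2 * np.exp(-2 * np.pi**2 * sigma**2 * fr**2)
    mtf = np.exp(-(fr / 0.5) ** 2)          # placeholder Gaussian MTF
    nps = np.full_like(fr, 50.0)            # placeholder flat NPS (HU^2 mm^2)

    print(round(npw_dprime(task, mtf, nps, df), 2))
    ```

    Higher d' predicts better lesion detectability; in the study this index is computed per patient from the extracted MTF and NPS and regressed against 2AFC outcomes.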

  8. On the validity of the Middlesex Hospital Questionnaire: a comparison of diagnostic self-ratings in psychiatric out-patients, general practice patients, and 'normals' based on the Hebrew version.

    PubMed

    Dasberg, H; Shalif, I

    1978-09-01

    The short clinical diagnostic self-rating scale for psycho-neurotic patients (The Middlesex Hospital Questionnaire) was translated into everyday Hebrew and tested on 216 subjects for: (1) concurrent validity with clinical diagnoses; (2) discriminatory validity on a psychoneurotic gradient of psychiatric out-patients, general practice patients, and normal controls; (3) validity of subscales and discrete items using matrices of Spearman rank correlation coefficients; (4) construct validity using Guttman's smallest space analysis based on coefficients of similarity. The Hebrew MHQ was found to retain its validity and to be easily applicable in waiting-room situations. It is a useful method for generating and substantiating hypotheses on psychosomatic and psychosocial interrelationships. The MHQ seems to enable the expression of the 'neurotic load' of a general practice subpopulation as a centile on a scale, thereby corroborating previous epidemiological findings on the high prevalence of neurotic illness in general practice. There is reason to believe that the MHQ is a valid instrument for the analysis of symptom profiles of subjects involved in future drug trials.

  9. Shock compression response of cold-rolled Ni/Al multilayer composites

    DOE PAGES

    Specht, Paul E.; Weihs, Timothy P.; Thadhani, Naresh N.

    2017-01-06

    Uniaxial strain, plate-on-plate impact experiments were performed on cold-rolled Ni/Al multilayer composites and the resulting Hugoniot was determined through time-resolved measurements combined with impedance matching. The experimental Hugoniot agreed with that previously predicted by two dimensional (2D) meso-scale calculations. Additional 2D meso-scale simulations were performed using the same computational method as the prior study to reproduce the experimentally measured free surface velocities and stress profiles. Finally, these simulations accurately replicated the experimental profiles, providing additional validation for the previous computational work.

  10. Automated tumor volumetry using computer-aided image segmentation.

    PubMed

    Gaonkar, Bilwaj; Macyszyn, Luke; Bilello, Michel; Sadaghiani, Mohammed Salehi; Akbari, Hamed; Atthiah, Mark A; Ali, Zarina S; Da, Xiao; Zhan, Yiqang; O'Rourke, Donald; Grady, Sean M; Davatzikos, Christos

    2015-05-01

    Accurate segmentation of brain tumors, and quantification of tumor volume, is important for diagnosis, monitoring, and planning therapeutic intervention. Manual segmentation is not widely used because of time constraints. Previous efforts have mainly produced methods that are tailored to a particular type of tumor or acquisition protocol and have mostly failed to produce a method that functions on different tumor types and is robust to changes in scanning parameters, resolution, and image quality, thereby limiting their clinical value. Herein, we present a semiautomatic method for tumor segmentation that is fast, accurate, and robust to a wide variation in image quality and resolution. A semiautomatic segmentation method based on the geodesic distance transform was developed and validated by using it to segment 54 brain tumors. Glioblastomas, meningiomas, and brain metastases were segmented. Qualitative validation was based on physician ratings provided by three clinical experts. Quantitative validation was based on comparing semiautomatic and manual segmentations. Tumor segmentations obtained using manual and automatic methods were compared quantitatively using the Dice measure of overlap. Subjective evaluation was performed by having human experts rate the computerized segmentations on a 0-5 rating scale where 5 indicated perfect segmentation. The proposed method addresses a significant, unmet need in the field of neuro-oncology. Specifically, this method enables clinicians to obtain accurate and reproducible tumor volumes without the need for manual segmentation. Copyright © 2015 AUR. Published by Elsevier Inc. All rights reserved.

  11. Visual analog rating of mood by people with aphasia.

    PubMed

    Haley, Katarina L; Womack, Jennifer L; Harmon, Tyson G; Williams, Sharon W

    2015-08-01

    Considerable attention has been given to the identification of depression in stroke survivors with aphasia, but there is more limited information about other mood states. Visual analog scales are often used to collect subjective information from people with aphasia. However, the validity of these methods for communicating about mood has not been established in people with moderately to severely impaired language. The dual purposes of this study were to characterize the relative endorsement of negative and positive mood states in people with chronic aphasia after stroke and to examine congruent validity for visual analog rating methods for people with a range of aphasia severity. Twenty-three left-hemisphere stroke survivors with aphasia were asked to indicate their present mood by using two published visual analog rating methods. The congruence between the methods was estimated through correlation analysis, and scores for different moods were compared. Endorsement was significantly stronger for "happy" than for mood states with negative valence. At the same time, several participants displayed pronounced negative mood compared to previously published norms for neurologically healthy adults. Results from the two rating methods were moderately and positively correlated. Positive mood is prominent in people with aphasia who are in the chronic stage of recovery after stroke, but negative moods can also be salient and individual presentations are diverse. Visual analog rating methods are valid methods for discussing mood with people with aphasia; however, design optimization should be explored.

  12. In vivo measurement of aerodynamic weight support in freely flying birds

    NASA Astrophysics Data System (ADS)

    Lentink, David; Haselsteiner, Andreas; Ingersoll, Rivers

    2014-11-01

    Birds dynamically change the shape of their wing during the stroke to support their body weight aerodynamically. The wing is partially folded during the upstroke, which suggests that the upstroke of birds might not actively contribute to aerodynamic force production. This hypothesis is supported by the significant mass difference between the large pectoralis muscle that powers the downstroke and the much smaller supracoracoideus that drives the upstroke. Previous work used indirect or incomplete techniques to measure the total force generated by bird wings, ranging from muscle force, airflow and wing surface pressure to detailed kinematics measurements coupled with bird mass-distribution models that derive net force through second derivatives. We have validated a new method that measures aerodynamic force in vivo, directly and time-resolved, in freely flying birds, which can resolve this question. The validation of the method, using independent force measurements on a quadcopter with pulsating thrust, shows that the aerodynamic force and impulse are measured within 2% accuracy and time-resolved. We demonstrate results for quadcopters and birds of similar weight and size. The method is scalable and can be applied to both engineered and natural flyers across taxa. The first author invented the method; the second and third authors validated it and present results for quadcopters and birds.

  13. Validation of a new method for finding the rotational axes of the knee using both marker-based roentgen stereophotogrammetric analysis and 3D video-based motion analysis for kinematic measurements.

    PubMed

    Roland, Michelle; Hull, M L; Howell, S M

    2011-05-01

    In a previous paper, we reported the virtual axis finder, a new method for finding the rotational axes of the knee. The virtual axis finder was validated through simulations that were subject to limitations. Hence, the objective of the present study was to perform a mechanical validation with two measurement modalities: 3D video-based motion analysis and marker-based roentgen stereophotogrammetric analysis (RSA). A two-rotational-axis mechanism was developed, which simulated internal-external (or longitudinal) and flexion-extension (FE) rotations. The actual axes of rotation were known with respect to the motion analysis and RSA markers within ± 0.0006 deg and ± 0.036 mm, and ± 0.0001 deg and ± 0.016 mm, respectively. The orientation and position root mean squared errors for identifying the longitudinal rotation (LR) and FE axes with video-based motion analysis (0.26 deg, 0.28 mm, 0.36 deg, and 0.25 mm, respectively) were smaller than with RSA (1.04 deg, 0.84 mm, 0.82 deg, and 0.32 mm, respectively). The random error, or precision, in the orientation and position was significantly better (p=0.01 and p=0.02, respectively) in identifying the LR axis with video-based motion analysis (0.23 deg and 0.24 mm) than with RSA (0.95 deg and 0.76 mm). There was no significant difference in the bias errors between measurement modalities. In comparing the mechanical validations to virtual validations, the virtual validations produced errors comparable to those of the mechanical validation. The only significant difference between the errors of the mechanical and virtual validations was the precision in the position of the LR axis while simulating video-based motion analysis (0.24 mm and 0.78 mm, p=0.019). These results indicate that video-based motion analysis with the equipment used in this study is the superior measurement modality for use with the virtual axis finder, but both measurement modalities produce satisfactory results.
The lack of significant differences between validation techniques suggests that the virtual sensitivity analysis previously performed was appropriately modeled. Thus, the virtual axis finder can be applied with a thorough understanding of its errors in a variety of test conditions.

  14. Solving Fluid Structure Interaction Problems with an Immersed Boundary Method

    NASA Technical Reports Server (NTRS)

    Barad, Michael F.; Brehm, Christoph; Kiris, Cetin C.

    2016-01-01

    An immersed boundary method for the compressible Navier-Stokes equations that can be used for moving boundary problems as well as fully coupled fluid-structure interaction is presented. The underlying Cartesian immersed boundary method of the Launch Ascent and Vehicle Aerodynamics (LAVA) framework, based on the locally stabilized immersed boundary method previously presented by the authors, is extended to account for unsteady boundary motion and coupled to linear and geometrically nonlinear structural finite element solvers. The approach is validated for moving boundary problems with prescribed body motion and fully coupled fluid-structure interaction problems. Keywords: Immersed Boundary Method, Higher-Order Finite Difference Method, Fluid Structure Interaction.

  15. Inferring Gene Regulatory Networks by Singular Value Decomposition and Gravitation Field Algorithm

    PubMed Central

    Zheng, Ming; Wu, Jia-nan; Huang, Yan-xin; Liu, Gui-xia; Zhou, You; Zhou, Chun-guang

    2012-01-01

    Reconstruction of gene regulatory networks (GRNs) is of utmost interest and has become a challenging computational problem in systems biology. Every existing inference algorithm based on gene expression profiles has its own advantages and disadvantages; in particular, the effectiveness and efficiency of previous algorithms are not high enough. In this work, we propose a novel inference algorithm for gene expression data based on a differential equation model. The algorithm combines two methods for inferring GRNs. Before reconstructing GRNs, the singular value decomposition method is used to decompose the gene expression data, determine the algorithm's solution space, and obtain all candidate solutions of GRNs. Within this generated family of candidate solutions, a modified gravitation field algorithm is used to infer GRNs, optimizing the criteria of the differential equation model and searching for the best network structure. The proposed algorithm is validated on both a simulated scale-free network and a real benchmark gene regulatory network from a networks database. The Bayesian method and the traditional differential equation model were also used to infer GRNs, and the results were compared with those of the proposed algorithm; genetic algorithm and simulated annealing approaches were likewise used to evaluate the gravitation field algorithm. The cross-validation results confirmed the effectiveness of our algorithm, which significantly outperforms previous algorithms. PMID:23226565
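    The SVD step described above can be illustrated for a linear model X' = A·X: when genes outnumber samples, SVD yields a minimum-norm particular solution plus a null space spanning the whole family of candidate connectivity matrices. A toy sketch (random data; the paper's gravitation field search for the best candidate is replaced here by an arbitrary choice of K):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    genes, times = 5, 3                           # underdetermined: 5 genes, 3 samples
    X = rng.standard_normal((genes, times))       # expression snapshots
    Xdot = rng.standard_normal((genes, times))    # illustrative time derivatives

    U, s, Vt = np.linalg.svd(X)                   # full SVD, U is genes x genes
    r = int(np.sum(s > 1e-10))                    # numerical rank of X
    A0 = Xdot @ np.linalg.pinv(X)                 # minimum-norm particular solution
    Un = U[:, r:]                                 # basis of the left null space of X

    # Every A0 + K @ Un.T also satisfies A X = Xdot, since Un.T @ X = 0;
    # a search heuristic would pick K (e.g. for sparsity or fit criteria).
    K = rng.standard_normal((genes, genes - r))
    A = A0 + K @ Un.T
    print(np.allclose(A @ X, Xdot))               # -> True
    ```

    This is why the decomposition "determines the solution space": the search algorithm only has to optimize over K, not over all of A.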

  16. Non-contact AFM measurement of the Hamaker constants of solids: Calibrating cantilever geometries.

    PubMed

    Fronczak, Sean G; Browne, Christopher A; Krenek, Elizabeth C; Beaudoin, Stephen P; Corti, David S

    2018-05-01

    Surface effects arising from roughness and deformation can negatively affect the results of AFM contact experiments. Using the non-contact portion of an AFM deflection curve is therefore desirable for estimating the Hamaker constant, A, of a solid material. A previously validated non-contact quasi-dynamic method for estimating A is revisited, in which the cantilever tip is now always represented by an "effective sphere". In addition to simplifying the previous method, accurate estimates of A can still be obtained even though precise knowledge of the nanoscale geometric features of the cantilever tip is no longer required. The tip's "effective" radius of curvature, R_eff, is determined from a "calibration" step, in which the tip's deflection at first contact with the surface is measured for a substrate with a known Hamaker constant. After R_eff is known for a given tip, estimates of A for other surfaces of interest are then determined. An experimental study was conducted to validate the new method, and the obtained results are in good agreement with predictions from the Lifshitz approximation, when available. Since R_eff accounts for all geometric uncertainties of the tip through a single fitted parameter, no visual fitting of the tip shape was required. Copyright © 2018 Elsevier Inc. All rights reserved.
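    As a simplified illustration of how a spring constant, a measured deflection at first contact and an effective tip radius relate to A, the textbook static sphere-plate approximation (not the authors' quasi-dynamic analysis) takes F = A·R/(6·D²); the jump-to-contact instability at |dF/dD| = k then gives D_c = 2·δ and A = 24·k·δ³/R_eff. All input values below are hypothetical:

    ```python
    def hamaker_from_jump_in(k, delta, r_eff):
        """Hamaker constant from the static sphere-plate jump-to-contact
        model: A = 24 * k * delta^3 / R_eff (SI units, result in joules)."""
        return 24.0 * k * delta**3 / r_eff

    k = 0.1          # N/m, soft contact-mode cantilever (hypothetical)
    delta = 0.8e-9   # m, deflection at first contact (hypothetical)
    r_eff = 30e-9    # m, effective tip radius from the calibration step
    A = hamaker_from_jump_in(k, delta, r_eff)
    print(f"{A:.2e}")  # about 4.1e-20 J, within the typical range for solids
    ```

    In the calibration step the same relation can be inverted: with A known for the reference substrate, the measured deflection fixes R_eff.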

  17. A network-based method for the identification of putative genes related to infertility.

    PubMed

    Wang, ShaoPeng; Huang, GuoHua; Hu, Qinghua; Zou, Quan

    2016-11-01

    Infertility has become one of the major health problems worldwide, with its incidence having risen markedly in recent decades. There is an urgent need to investigate the pathological mechanisms behind infertility and to design effective treatments. However, this is made difficult by the fact that various biological factors, including genetic factors, have been identified as related to infertility. A network-based method was established to identify new genes potentially related to infertility. A network constructed from human protein-protein interactions, seeded with previously validated infertility-related genes, enabled the identification of novel candidate genes. These genes were then filtered by a permutation test and by their functional and structural associations with infertility-related genes. Our method identified 23 novel genes with strong functional and structural associations with previously validated infertility-related genes. Substantial evidence indicates that the identified genes are strongly related to dysfunction of the four main biological processes of fertility: reproductive development and physiology, gametogenesis, meiosis and recombination, and hormone regulation. The newly discovered genes may provide new directions for investigating infertility. This article is part of a Special Issue entitled "System Genetics," guest-edited by Dr. Yudong Cai and Dr. Tao Huang. Copyright © 2016 Elsevier B.V. All rights reserved.
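    The permutation-test filter can be illustrated on a toy interaction graph (hypothetical node names, not the human interactome used in the paper): a candidate gene passes only if it sits closer to the seed genes than randomly chosen genes do.

```python
import random
from collections import deque

# Toy undirected interaction graph; "seedA"/"seedB" stand in for
# previously validated disease genes. Entirely illustrative.
EDGES = [("g1", "g2"), ("g2", "g3"), ("g3", "seedA"), ("seedA", "seedB"),
         ("seedB", "g4"), ("g4", "g5"), ("g5", "g6"), ("g6", "g7")]
graph = {}
for a, b in EDGES:
    graph.setdefault(a, set()).add(b)
    graph.setdefault(b, set()).add(a)

def dist(src, dst):
    """Shortest-path length by breadth-first search."""
    seen, queue = {src}, deque([(src, 0)])
    while queue:
        node, d = queue.popleft()
        if node == dst:
            return d
        for nb in graph[node]:
            if nb not in seen:
                seen.add(nb)
                queue.append((nb, d + 1))
    return float("inf")

seeds = ["seedA", "seedB"]

def proximity(gene):
    return min(dist(gene, s) for s in seeds)

def perm_p_value(gene, n_perm=1000, rng=random.Random(0)):
    """Fraction of randomly drawn genes at least as close to the seeds."""
    others = [g for g in graph if g not in seeds and g != gene]
    obs = proximity(gene)
    null = [proximity(rng.choice(others)) for _ in range(n_perm)]
    return sum(d <= obs for d in null) / n_perm

assert proximity("g3") == 1
assert perm_p_value("g3") < perm_p_value("g7")  # g3 is network-closer to the seeds
```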

  18. Three-dimensional Computational Fluid Dynamics Investigation of a Spinning Helicopter Slung Load

    NASA Technical Reports Server (NTRS)

    Theorn, J. N.; Duque, E. P. N.; Cicolani, L.; Halsey, R.

    2005-01-01

    After performing steady-state Computational Fluid Dynamics (CFD) calculations with OVERFLOW to validate the CFD method against static wind-tunnel data for a box-shaped cargo container, the same setup was used to investigate unsteady flow with a moving body. Results were compared with previously collected flight-test data in which the container is spinning.

  19. The Convergent and Concurrent Validity of Trait-Based Prototype Assessment of Personality Disorder Categories in Homeless Persons

    ERIC Educational Resources Information Center

    Samuel, Douglas B.; Connolly, Adrian J.; Ball, Samuel A.

    2012-01-01

    The "DSM-5" proposal indicates that personality disorders (PDs) be defined as collections of maladaptive traits but does not provide a specific diagnostic method. However, researchers have previously suggested that PD constructs can be assessed by comparing individuals' trait profiles with those prototypic of PDs and evidence from the…

  20. Validation of a Cognitive Diagnostic Model across Multiple Forms of a Reading Comprehension Assessment

    ERIC Educational Resources Information Center

    Clark, Amy K.

    2013-01-01

    The present study sought to fit a cognitive diagnostic model (CDM) across multiple forms of a passage-based reading comprehension assessment using the attribute hierarchy method. Previous research on CDMs for reading comprehension assessments served as a basis for the attributes in the hierarchy. The two attribute hierarchies were fit to data from…

  1. A phase one AR/C system design

    NASA Technical Reports Server (NTRS)

    Kachmar, Peter M.; Polutchko, Robert J.; Matusky, Martin; Chu, William; Jackson, William; Montez, Moises

    1991-01-01

    The Phase One AR&C System Design integrates an evolutionary design based on the legacy of previous mission successes, flight tested components from manned Rendezvous and Proximity Operations (RPO) space programs, and additional AR&C components validated using proven methods. The Phase One system has a modular, open architecture with the standardized interfaces proposed for Space Station Freedom system architecture.

  2. A Randomized Controlled Trial Validating the Impact of the LASER Model of Science Education on Student Achievement and Teacher Instruction

    ERIC Educational Resources Information Center

    Kaldon, Carolyn R.; Zoblotsky, Todd A.

    2014-01-01

    Previous research has linked inquiry-based science instruction (i.e., science instruction that engages students in doing science rather than just learning about science) with greater gains in student learning than text-book based methods (Vanosdall, Klentschy, Hedges & Weisbaum, 2007; Banilower, 2007; Ferguson 2009; Bredderman, 1983;…

  3. Rates of Physical Activity among Appalachian Adolescents in Ohio

    ERIC Educational Resources Information Center

    Hortz, Brian; Stevens, Emily; Holden, Becky; Petosa, R. Lingyak

    2009-01-01

    Purpose: The purpose of this study was to describe the physical activity behavior of high school students living in the Appalachian region of Ohio. Methods: A cross-sectional sample of 1,024 subjects from 11 schools in Appalachian Ohio was drawn. Previously validated instruments were used to measure physical activity behavior over 7 days.…

  4. A three-dimensional parabolic equation model of sound propagation using higher-order operator splitting and Padé approximants.

    PubMed

    Lin, Ying-Tsong; Collis, Jon M; Duda, Timothy F

    2012-11-01

    An alternating direction implicit (ADI) three-dimensional fluid parabolic equation solution method with enhanced accuracy is presented. The method uses a square-root Helmholtz operator splitting algorithm that retains cross-multiplied operator terms that have been previously neglected. With these higher-order cross terms, the valid angular range of the parabolic equation solution is improved. The method is tested for accuracy against an image solution in an idealized wedge problem. Computational efficiency improvements resulting from the ADI discretization are also discussed.
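    As a side note on the Padé machinery such solvers rely on (a generic illustration, not the paper's specific operator splitting): even the lowest-order [1/1] Padé approximant of the square-root operator sqrt(1+X) is markedly more accurate at wide angles (large X) than the first-order Taylor expansion, which is why rational approximants underpin wide-angle parabolic equation methods.

```python
import math

# Compare two scalar approximations of sqrt(1 + x), the symbol of the
# square-root Helmholtz operator; larger x corresponds to wider angles.

def pade11(x):
    """[1/1] Pade approximant of sqrt(1 + x)."""
    return (1.0 + 0.75 * x) / (1.0 + 0.25 * x)

def taylor1(x):
    """First-order (narrow-angle) Taylor expansion."""
    return 1.0 + 0.5 * x

x = 0.5  # a "wide-angle" operator argument
exact = math.sqrt(1.0 + x)
assert abs(pade11(x) - exact) < abs(taylor1(x) - exact)
```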

  5. Faster methods for estimating arc centre position during VAR and results from Ti-6Al-4V and INCONEL 718 alloys

    NASA Astrophysics Data System (ADS)

    Nair, B. G.; Winter, N.; Daniel, B.; Ward, R. M.

    2016-07-01

    Direct measurement of the flow of electric current during VAR is extremely difficult due to the aggressive environment, as the arc process itself controls the distribution of current. In previous studies the technique of “magnetic source tomography” was presented; it was shown to be effective, but it used a computationally intensive iterative method to analyse the distribution of arc centre position. In this paper we present faster computational methods, requiring less numerical optimisation, to determine the centre position of a single distributed arc. The algorithms were validated numerically on models and experimentally on measurements from titanium and nickel alloys (Ti-6Al-4V and INCONEL 718). The results are used to comment on the effects of process parameters on arc behaviour during VAR.

  6. Standardised evaluation of the performance of a simple membrane filtration-elution method to concentrate bacteriophages from drinking water.

    PubMed

    Méndez, Javier; Audicana, Ana; Isern, Ana; Llaneza, Julián; Moreno, Belén; Tarancón, María Luisa; Jofre, Juan; Lucena, Francisco

    2004-04-01

    The bacteriophage elution procedure described here, performed after adsorption to acetate-nitrate cellulose membrane filters, allows better recovery of phages concentrated from 1 l of water than previously used elution procedures. The improvement is due to the combined effect of the eluent (3% (w/v) beef extract, 3% (v/v) Tween 80, 0.5 M NaCl, pH 9.0) and the application of ultrasound instead of agitation or swirling. Average recovery was greatest for somatic coliphages, 82 +/- 7%, and lowest for phages infecting Bacteroides fragilis, 56 +/- 8%, with intermediate values for F-specific and F-specific RNA bacteriophages. Thus, the method recovered over 56% of all the phages suggested as surrogate indicators. The method was then validated according to an International Organization for Standardization validation procedure and implemented in routine laboratories, which obtained reproducible results.

  7. Reproducible diagnostic metabolites in plasma from typhoid fever patients in Asia and Africa.

    PubMed

    Näsström, Elin; Parry, Christopher M; Vu Thieu, Nga Tran; Maude, Rapeephan R; de Jong, Hanna K; Fukushima, Masako; Rzhepishevska, Olena; Marks, Florian; Panzner, Ursula; Im, Justin; Jeon, Hyonjin; Park, Seeun; Chaudhury, Zabeen; Ghose, Aniruddha; Samad, Rasheda; Van, Tan Trinh; Johansson, Anders; Dondorp, Arjen M; Thwaites, Guy E; Faiz, Abul; Antti, Henrik; Baker, Stephen

    2017-05-09

    Salmonella Typhi is the causative agent of typhoid. Typhoid is diagnosed by blood culture, a method that lacks sensitivity, portability and speed. We have previously shown that specific metabolomic profiles can be detected in the blood of typhoid patients from Nepal (Näsström et al., 2014). Here, we performed mass spectrometry on plasma from Bangladeshi and Senegalese patients with culture-confirmed typhoid fever, clinically suspected typhoid, and other febrile diseases including malaria. After applying supervised pattern recognition modelling, we could significantly distinguish metabolite profiles in plasma from the culture-confirmed typhoid patients. After comparing the direction of change and degree of multivariate significance, we identified 24 metabolites that were consistently up- or downregulated in a further Bangladeshi/Senegalese validation cohort and in the Nepali cohort from our previous work. We have identified and validated a metabolite panel that can distinguish typhoid from other febrile diseases, providing a new approach for typhoid diagnostics.

  8. A newly validated high-performance liquid chromatography method with diode array ultraviolet detection for analysis of the antimalarial drug primaquine in the blood plasma.

    PubMed

    Carmo, Ana Paula Barbosa do; Borborema, Manoella; Ribeiro, Stephan; De-Oliveira, Ana Cecilia Xavier; Paumgartten, Francisco Jose Roma; Moreira, Davyson de Lima

    2017-01-01

    Primaquine (PQ) diphosphate is an 8-aminoquinoline antimalarial drug with unique therapeutic properties. It is the only drug that prevents relapses of Plasmodium vivax or Plasmodium ovale infections. In this study, a fast, sensitive, cost-effective, and robust method for the extraction and high-performance liquid chromatography with diode array ultraviolet detection (HPLC-DAD-UV) analysis of PQ in blood plasma was developed and validated. After plasma protein precipitation, PQ was obtained by liquid-liquid extraction and analyzed by HPLC-DAD-UV with a modified-silica cyanopropyl column (250 mm × 4.6 mm i.d., 5 μm) as the stationary phase and a mixture of acetonitrile and 10 mM ammonium acetate buffer (pH 3.80) (45:55) as the mobile phase. The flow rate was 1.0 mL·min-1, the oven temperature was 50 °C, and absorbance was measured at 264 nm. The method was validated for linearity, intra-day and inter-day precision, accuracy, recovery, and robustness. The detection (LOD) and quantification (LOQ) limits were 1.0 and 3.5 ng·mL-1, respectively. The method was used to analyze the plasma of female DBA-2 mice treated with 20 mg·kg-1 (oral) PQ diphosphate. By combining a simple, low-cost extraction procedure with a sensitive, precise, accurate, and robust method, it was possible to analyze PQ in small volumes of plasma. The new method has lower LOD and LOQ limits and requires a shorter analysis time and smaller plasma volumes than previously reported HPLC methods with DAD-UV detection. The new validated method is suitable for kinetic studies of PQ in small rodents, including mouse models for the study of malaria.
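    LOD/LOQ figures of the kind quoted above are typically derived from a calibration line. A generic ICH-style computation (hypothetical calibration data, not the authors') takes LOD = 3.3·σ/S and LOQ = 10·σ/S, where σ is the residual standard deviation of the line and S its slope:

```python
import math
import statistics

# Hypothetical low-concentration calibration line.
conc = [1, 2, 5, 10, 20]                   # ng/mL
area = [10.2, 20.5, 50.1, 100.8, 199.6]    # detector response (arbitrary units)

n = len(conc)
mx, my = statistics.fmean(conc), statistics.fmean(area)
S = sum((x - mx) * (y - my) for x, y in zip(conc, area)) / \
    sum((x - mx) ** 2 for x in conc)       # least-squares slope
b = my - S * mx
resid = [y - (S * x + b) for x, y in zip(conc, area)]
sigma = math.sqrt(sum(r * r for r in resid) / (n - 2))  # residual SD

lod = 3.3 * sigma / S   # limit of detection, ng/mL
loq = 10.0 * sigma / S  # limit of quantification, ng/mL
assert 0 < lod < loq
```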

  9. Why Does a Method That Fails Continue To Be Used: The Answer

    PubMed Central

    Templeton, Alan R.

    2009-01-01

    It has been claimed that hundreds of researchers use nested clade phylogeographic analysis (NCPA) based on what the method promises rather than requiring objective validation of the method. The supposed failure of NCPA is based upon the argument that validating it by using positive controls ignored type I error, and that computer simulations have shown a high type I error. The first argument is factually incorrect: the previously published validation analysis fully accounted for both type I and type II errors. The simulations that indicate a 75% type I error rate have serious flaws and only evaluate outdated versions of NCPA. These outdated type I error rates fall precipitously when the 2003 version of single-locus NCPA is used or when the 2002 multi-locus version of NCPA is used. It is shown that the treewise type I errors in single-locus NCPA can be corrected to the desired nominal level by a simple statistical procedure, and that multilocus NCPA reconstructs a simulated scenario used to discredit NCPA with 100% accuracy. Hence, NCPA is not a failed method at all, but rather has been validated both by actual data and by simulated data in a manner that satisfies the published criteria given by its critics. The critics have come to different conclusions because they have focused on the pre-2002 versions of NCPA and have failed to take into account the extensive developments in NCPA since 2002. Hence, researchers can choose to use NCPA based upon objective critical validation that shows that NCPA delivers what it promises. PMID:19335340
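    The abstract does not spell out the "simple statistical procedure"; as a generic illustration only, one standard way to hold a familywise ("treewise") type I error at a nominal level across m tests is the Šidák adjustment:

```python
# Generic familywise error-rate control, not the paper's exact procedure:
# running each of m independent tests at alpha_per_test = 1 - (1-alpha)^(1/m)
# keeps the probability of any false positive across the family at alpha.

def sidak_per_test_alpha(alpha, m):
    return 1.0 - (1.0 - alpha) ** (1.0 / m)

alpha, m = 0.05, 12
per_test = sidak_per_test_alpha(alpha, m)
familywise = 1.0 - (1.0 - per_test) ** m   # back out the treewise rate
assert abs(familywise - alpha) < 1e-12
assert per_test < alpha                     # each test runs at a stricter level
```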

  10. Evaluation and validation of a multi-residue method based on biochip technology for the simultaneous screening of six families of antibiotics in muscle and aquaculture products.

    PubMed

    Gaudin, Valérie; Hedou, Celine; Soumet, Christophe; Verdon, Eric

    2016-01-01

    The Evidence Investigator™ system (Randox, UK) is a semi-automated biochip system. Its microarray kit II (AM II) can detect compounds from several families of antibiotics: quinolones, ceftiofur, thiamphenicol, streptomycin, tylosin and tetracyclines. The performance of this innovative system was evaluated for the detection of antibiotic residues in new matrices: muscle from different animal species and aquaculture products. The method was validated according to European Decision No. EC/2002/657 and the European guideline for the validation of screening methods, amounting to a complete initial validation. The false-positive rate was 0% in both muscle and aquaculture products. The detection capabilities (CCβ) for the 12 validated antibiotics (enrofloxacin, difloxacin, ceftiofur, desfuroyl ceftiofur cysteine disulfide, thiamphenicol, florfenicol, tylosin, tilmicosin, streptomycin, dihydrostreptomycin, tetracycline, doxycycline) were all below the respective maximum residue limits (MRLs) in muscle of different animal origins (bovine, ovine, porcine, poultry). No cross-reactions were observed with other antibiotics, whether from the six detected families or from other antibiotic families. The AM II kit could be applied to aquaculture products, but with higher detection capabilities than those obtained in muscle. The CCβ values in aquaculture products were 0.25, 0.10 and 0.5 times the respective MRLs for enrofloxacin, tylosin and oxytetracycline, respectively. The performance of the AM II kit was compared with other screening methods and with the performance characteristics previously determined in honey.

  11. Gene network inherent in genomic big data improves the accuracy of prognostic prediction for cancer patients.

    PubMed

    Kim, Yun Hak; Jeong, Dae Cheon; Pak, Kyoungjune; Goh, Tae Sik; Lee, Chi-Seung; Han, Myoung-Eun; Kim, Ji-Young; Liangwen, Liu; Kim, Chi Dae; Jang, Jeon Yeob; Cha, Wonjae; Oh, Sae-Ock

    2017-09-29

    Accurate prediction of prognosis is critical for therapeutic decisions regarding cancer patients. Many previously developed prognostic scoring systems have limitations in reflecting recent progress in the field of cancer biology such as microarray, next-generation sequencing, and signaling pathways. To develop a new prognostic scoring system for cancer patients, we used mRNA expression and clinical data in various independent breast cancer cohorts (n=1214) from the Molecular Taxonomy of Breast Cancer International Consortium (METABRIC) and Gene Expression Omnibus (GEO). A new prognostic score that reflects gene network inherent in genomic big data was calculated using Network-Regularized high-dimensional Cox-regression (Net-score). We compared its discriminatory power with those of two previously used statistical methods: stepwise variable selection via univariate Cox regression (Uni-score) and Cox regression via Elastic net (Enet-score). The Net scoring system showed better discriminatory power in prediction of disease-specific survival (DSS) than other statistical methods (p=0 in METABRIC training cohort, p=0.000331, 4.58e-06 in two METABRIC validation cohorts) when accuracy was examined by log-rank test. Notably, comparison of C-index and AUC values in receiver operating characteristic analysis at 5 years showed fewer differences between training and validation cohorts with the Net scoring system than other statistical methods, suggesting minimal overfitting. The Net-based scoring system also successfully predicted prognosis in various independent GEO cohorts with high discriminatory power. In conclusion, the Net-based scoring system showed better discriminative power than previous statistical methods in prognostic prediction for breast cancer patients. This new system will mark a new era in prognosis prediction for cancer patients.
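    The concordance index (C-index) compared above can be illustrated with a toy computation on hypothetical survival data (not the METABRIC cohorts): it is the fraction of usable patient pairs in which the patient who died earlier was assigned the higher predicted risk.

```python
from itertools import combinations

# Toy C-index: higher score = higher predicted risk; events: 1 = death observed.
def c_index(times, events, scores):
    num = den = 0
    for i, j in combinations(range(len(times)), 2):
        if times[i] == times[j]:
            continue                      # skip tied times for simplicity
        first = i if times[i] < times[j] else j
        other = j if first == i else i
        if not events[first]:
            continue                      # pair unusable if earlier time is censored
        den += 1
        if scores[first] > scores[other]:
            num += 1
        elif scores[first] == scores[other]:
            num += 0.5                    # ties in predicted risk count half
    return num / den

t = [5, 10, 3, 8, 12]                     # survival times (hypothetical)
e = [1, 0, 1, 1, 0]                       # 1 = event observed, 0 = censored
risk = [0.8, 0.3, 0.9, 0.6, 0.1]          # perfectly anti-ordered with time
assert c_index(t, e, risk) == 1.0         # perfect discrimination
```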

  12. Gene network inherent in genomic big data improves the accuracy of prognostic prediction for cancer patients

    PubMed Central

    Kim, Yun Hak; Jeong, Dae Cheon; Pak, Kyoungjune; Goh, Tae Sik; Lee, Chi-Seung; Han, Myoung-Eun; Kim, Ji-Young; Liangwen, Liu; Kim, Chi Dae; Jang, Jeon Yeob; Cha, Wonjae; Oh, Sae-Ock

    2017-01-01

    Accurate prediction of prognosis is critical for therapeutic decisions regarding cancer patients. Many previously developed prognostic scoring systems have limitations in reflecting recent progress in the field of cancer biology such as microarray, next-generation sequencing, and signaling pathways. To develop a new prognostic scoring system for cancer patients, we used mRNA expression and clinical data in various independent breast cancer cohorts (n=1214) from the Molecular Taxonomy of Breast Cancer International Consortium (METABRIC) and Gene Expression Omnibus (GEO). A new prognostic score that reflects gene network inherent in genomic big data was calculated using Network-Regularized high-dimensional Cox-regression (Net-score). We compared its discriminatory power with those of two previously used statistical methods: stepwise variable selection via univariate Cox regression (Uni-score) and Cox regression via Elastic net (Enet-score). The Net scoring system showed better discriminatory power in prediction of disease-specific survival (DSS) than other statistical methods (p=0 in METABRIC training cohort, p=0.000331, 4.58e-06 in two METABRIC validation cohorts) when accuracy was examined by log-rank test. Notably, comparison of C-index and AUC values in receiver operating characteristic analysis at 5 years showed fewer differences between training and validation cohorts with the Net scoring system than other statistical methods, suggesting minimal overfitting. The Net-based scoring system also successfully predicted prognosis in various independent GEO cohorts with high discriminatory power. In conclusion, the Net-based scoring system showed better discriminative power than previous statistical methods in prognostic prediction for breast cancer patients. This new system will mark a new era in prognosis prediction for cancer patients. PMID:29100405

  13. Development of a time-dependent incompressible Navier-Stokes solver based on a fractional-step method

    NASA Technical Reports Server (NTRS)

    Rosenfeld, Moshe

    1990-01-01

    The development, validation and application of a fractional step solution method of the time-dependent incompressible Navier-Stokes equations in generalized coordinate systems are discussed. A solution method that combines a finite-volume discretization with a novel choice of the dependent variables and a fractional step splitting to obtain accurate solutions in arbitrary geometries was previously developed for fixed-grids. In the present research effort, this solution method is extended to include more general situations, including cases with moving grids. The numerical techniques are enhanced to gain efficiency and generality.
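    The fractional-step idea itself (predict velocity without pressure, solve a pressure Poisson equation, project back to a divergence-free field) can be sketched on a 2D periodic grid with spectral derivatives. This illustrates the general splitting only, not the paper's finite-volume, generalized-coordinate, moving-grid solver:

```python
import numpy as np

# One fractional-step (projection) time step on a periodic unit box.
n, dt, nu = 32, 1e-3, 0.05
x = np.linspace(0.0, 1.0, n, endpoint=False)
X, Y = np.meshgrid(x, x, indexing="ij")
k = 2.0 * np.pi * np.fft.fftfreq(n, d=1.0 / n)
KX, KY = np.meshgrid(k, k, indexing="ij")
K2 = KX**2 + KY**2
K2s = K2.copy()
K2s[0, 0] = 1.0            # guard the mean mode in the Poisson solve

def d(f, K):               # spectral derivative
    return np.real(np.fft.ifft2(1j * K * np.fft.fft2(f)))

def lap(f):                # spectral Laplacian
    return np.real(np.fft.ifft2(-K2 * np.fft.fft2(f)))

# Velocity field with a deliberately non-solenoidal perturbation.
u = np.sin(2 * np.pi * X) * np.cos(2 * np.pi * Y) + 0.1 * np.sin(2 * np.pi * X)
v = -np.cos(2 * np.pi * X) * np.sin(2 * np.pi * Y)

# Step 1: predictor ignores pressure (advection + diffusion only).
us = u + dt * (-(u * d(u, KX) + v * d(u, KY)) + nu * lap(u))
vs = v + dt * (-(u * d(v, KX) + v * d(v, KY)) + nu * lap(v))

# Step 2: pressure Poisson equation enforces incompressibility.
rhs = (d(us, KX) + d(vs, KY)) / dt
phi = np.real(np.fft.ifft2(np.fft.fft2(rhs) / (-K2s)))

# Step 3: projection (corrector) removes the divergent part.
u_new = us - dt * d(phi, KX)
v_new = vs - dt * d(phi, KY)

div = d(u_new, KX) + d(v_new, KY)
assert np.max(np.abs(div)) < 1e-8   # corrected field is divergence-free
```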

  14. The LEAP™ Gesture Interface Device and Take-Home Laparoscopic Simulators: A Study of Construct and Concurrent Validity.

    PubMed

    Partridge, Roland W; Brown, Fraser S; Brennan, Paul M; Hennessey, Iain A M; Hughes, Mark A

    2016-02-01

    To assess the potential of the LEAP™ infrared motion tracking device to map laparoscopic instrument movement in a simulated environment. Simulator training is optimized when augmented by objective performance feedback. We explore the potential LEAP has to provide this in a way compatible with affordable take-home simulators. LEAP and the previously validated InsTrac visual tracking tool mapped expert and novice performances of a standardized simulated laparoscopic task. Ability to distinguish between the 2 groups (construct validity) and correlation between techniques (concurrent validity) were the primary outcome measures. Forty-three expert and 38 novice performances demonstrated significant differences in LEAP-derived metrics for instrument path distance (P < .001), speed (P = .002), acceleration (P < .001), motion smoothness (P < .001), and distance between the instruments (P = .019). Only instrument path distance demonstrated a correlation between LEAP and InsTrac tracking methods (novices: r = .663, P < .001; experts: r = .536, P < .001). Consistency of LEAP tracking was poor (average % time hands not tracked: 31.9%). The LEAP motion device is able to track the movement of hands using instruments in a laparoscopic box simulator. Construct validity is demonstrated by its ability to distinguish novice from expert performances. Only time and instrument path distance demonstrated concurrent validity with an existing tracking method however. A number of limitations to the tracking method used by LEAP have been identified. These need to be addressed before it can be considered an alternative to visual tracking for the delivery of objective performance metrics in take-home laparoscopic simulators. © The Author(s) 2015.
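    Metrics of the kind listed (path distance, speed, motion smoothness) can be computed from tracked positions roughly as follows. This is a hypothetical reconstruction, since neither the LEAP nor the InsTrac pipeline is given in the abstract:

```python
import math

# Compute simple motion metrics from position samples at a fixed interval.
def path_metrics(points, dt):
    """points: list of (x, y, z) samples; dt: sampling interval in seconds."""
    steps = [math.dist(a, b) for a, b in zip(points, points[1:])]
    distance = sum(steps)
    speeds = [s / dt for s in steps]
    mean_speed = sum(speeds) / len(speeds)
    # jerk-based smoothness proxy: lower total jerk = smoother motion
    accels = [(v2 - v1) / dt for v1, v2 in zip(speeds, speeds[1:])]
    jerk = sum(abs(a2 - a1) / dt for a1, a2 in zip(accels, accels[1:]))
    return distance, mean_speed, jerk

smooth = [(t * 0.01, 0.0, 0.0) for t in range(11)]   # steady straight motion
jerky = [(0.0, 0.0, 0.0), (0.05, 0.0, 0.0), (0.05, 0.0, 0.0),
         (0.1, 0.0, 0.0), (0.1, 0.0, 0.0), (0.15, 0.0, 0.0)]  # stop-and-go

d1, v1, j1 = path_metrics(smooth, 0.1)
d2, v2, j2 = path_metrics(jerky, 0.1)
assert math.isclose(d1, 0.10)   # 10 steps of 0.01
assert j1 < j2                  # steady motion scores as smoother
```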

  15. On the validity of the autobiographical emotional memory task for emotion induction.

    PubMed

    Mills, Caitlin; D'Mello, Sidney

    2014-01-01

    The Autobiographical Emotional Memory Task (AEMT), which involves recalling and writing about intense emotional experiences, is a widely used method to experimentally induce emotions. The validity of this method depends upon the extent to which it can induce specific desired emotions (intended emotions), while not inducing any other (incidental) emotions at different levels across one (or more) conditions. A review of recent studies that used this method indicated that most studies exclusively monitor post-writing ratings of the intended emotions, without assessing the possibility that the method may have differentially induced other incidental emotions as well. We investigated the extent of this issue by collecting both pre- and post-writing ratings of incidental emotions in addition to the intended emotions. Using methods largely adapted from previous studies, participants were assigned to write about a profound experience of anger or fear (Experiment 1) or happiness or sadness (Experiment 2). In line with previous research, results indicated that intended emotions (anger and fear) were successfully induced in the respective conditions in Experiment 1. However, disgust and sadness were also induced while writing about an angry experience compared to a fearful experience. Similarly, although happiness and sadness were induced in the appropriate conditions, Experiment 2 indicated that writing about a sad experience also induced disgust, fear, and anger, compared to writing about a happy experience. Possible resolutions to avoid the limitations of the AEMT to induce specific discrete emotions are discussed.

  16. On the Validity of the Autobiographical Emotional Memory Task for Emotion Induction

    PubMed Central

    Mills, Caitlin; D'Mello, Sidney

    2014-01-01

    The Autobiographical Emotional Memory Task (AEMT), which involves recalling and writing about intense emotional experiences, is a widely used method to experimentally induce emotions. The validity of this method depends upon the extent to which it can induce specific desired emotions (intended emotions), while not inducing any other (incidental) emotions at different levels across one (or more) conditions. A review of recent studies that used this method indicated that most studies exclusively monitor post-writing ratings of the intended emotions, without assessing the possibility that the method may have differentially induced other incidental emotions as well. We investigated the extent of this issue by collecting both pre- and post-writing ratings of incidental emotions in addition to the intended emotions. Using methods largely adapted from previous studies, participants were assigned to write about a profound experience of anger or fear (Experiment 1) or happiness or sadness (Experiment 2). In line with previous research, results indicated that intended emotions (anger and fear) were successfully induced in the respective conditions in Experiment 1. However, disgust and sadness were also induced while writing about an angry experience compared to a fearful experience. Similarly, although happiness and sadness were induced in the appropriate conditions, Experiment 2 indicated that writing about a sad experience also induced disgust, fear, and anger, compared to writing about a happy experience. Possible resolutions to avoid the limitations of the AEMT to induce specific discrete emotions are discussed. PMID:24776697

  17. Improving the sensitivity and specificity of a bioanalytical assay for the measurement of certolizumab pegol.

    PubMed

    Smeraglia, John; Silva, John-Paul; Jones, Kieran

    2017-08-01

    In order to evaluate placental transfer of certolizumab pegol (CZP), a more sensitive and selective bioanalytical assay was required to accurately measure low CZP concentrations in infant and umbilical cord blood. Results & methodology: A new electrochemiluminescence immunoassay was developed to measure CZP levels in human plasma. Validation experiments demonstrated improved selectivity (no matrix interference observed) and a detection range of 0.032-5.0 μg/ml. Accuracy and precision met acceptance criteria (mean total error ≤20.8%). Dilution linearity and sample stability were acceptable and sufficient to support the method. The electrochemiluminescence immunoassay was validated for measuring low CZP concentrations in human plasma. The method demonstrated a more than tenfold increase in sensitivity compared with previous assays, and improved selectivity for intact CZP.

  18. The Use of Virtual Reality in the Study of People's Responses to Violent Incidents.

    PubMed

    Rovira, Aitor; Swapp, David; Spanlang, Bernhard; Slater, Mel

    2009-01-01

    This paper reviews experimental methods for the study of the responses of people to violence in digital media, and in particular considers the issues of internal validity and ecological validity or generalisability of results to events in the real world. Experimental methods typically involve a significant level of abstraction from reality, with participants required to carry out tasks that are far removed from violence in real life, and hence their ecological validity is questionable. On the other hand studies based on field data, while having ecological validity, cannot control multiple confounding variables that may have an impact on observed results, so that their internal validity is questionable. It is argued that immersive virtual reality may provide a unification of these two approaches. Since people tend to respond realistically to situations and events that occur in virtual reality, and since virtual reality simulations can be completely controlled for experimental purposes, studies of responses to violence within virtual reality are likely to have both ecological and internal validity. This depends on a property that we call 'plausibility' - including the fidelity of the depicted situation with prior knowledge and expectations. We illustrate this with data from a previously published experiment, a virtual reprise of Stanley Milgram's 1960s obedience experiment, and also with pilot data from a new study being developed that looks at bystander responses to violent incidents.

  19. The Use of Virtual Reality in the Study of People's Responses to Violent Incidents

    PubMed Central

    Rovira, Aitor; Swapp, David; Spanlang, Bernhard; Slater, Mel

    2009-01-01

    This paper reviews experimental methods for the study of the responses of people to violence in digital media, and in particular considers the issues of internal validity and ecological validity or generalisability of results to events in the real world. Experimental methods typically involve a significant level of abstraction from reality, with participants required to carry out tasks that are far removed from violence in real life, and hence their ecological validity is questionable. On the other hand studies based on field data, while having ecological validity, cannot control multiple confounding variables that may have an impact on observed results, so that their internal validity is questionable. It is argued that immersive virtual reality may provide a unification of these two approaches. Since people tend to respond realistically to situations and events that occur in virtual reality, and since virtual reality simulations can be completely controlled for experimental purposes, studies of responses to violence within virtual reality are likely to have both ecological and internal validity. This depends on a property that we call ‘plausibility’ – including the fidelity of the depicted situation with prior knowledge and expectations. We illustrate this with data from a previously published experiment, a virtual reprise of Stanley Milgram's 1960s obedience experiment, and also with pilot data from a new study being developed that looks at bystander responses to violent incidents. PMID:20076762

  20. Quantitative Tagless Copurification: A Method to Validate and Identify Protein-Protein Interactions

    DOE PAGES

    Shatsky, Maxim; Dong, Ming; Liu, Haichuan; ...

    2016-04-20

    Identifying protein-protein interactions (PPIs) at an acceptable false discovery rate (FDR) is challenging. Previously we identified several hundred PPIs from affinity purification - mass spectrometry (AP-MS) data for the bacteria Escherichia coli and Desulfovibrio vulgaris. These two interactomes have lower FDRs than any of the nine interactomes proposed previously for bacteria and are more enriched in PPIs validated by other data than the nine earlier interactomes. To more thoroughly determine the accuracy of ours or other interactomes and to discover further PPIs de novo, here we present a quantitative tagless method that employs iTRAQ MS to measure the copurification of endogenous proteins through orthogonal chromatography steps. 5273 fractions from a four-step fractionation of a D. vulgaris protein extract were assayed, resulting in the detection of 1242 proteins. Protein partners from our D. vulgaris and E. coli AP-MS interactomes copurify as frequently as pairs belonging to three benchmark data sets of well-characterized PPIs. In contrast, the protein pairs from the nine other bacterial interactomes copurify two- to 20-fold less often. We also identify 200 high confidence D. vulgaris PPIs based on tagless copurification and colocalization in the genome. These PPIs are as strongly validated by other data as our AP-MS interactomes and overlap with our AP-MS interactome for D. vulgaris within 3% of expectation, once FDRs and false negative rates are taken into account. Finally, we reanalyzed data from two quantitative tagless screens of human cell extracts. We estimate that the novel PPIs reported in these studies have an FDR of at least 85% and find that less than 7% of the novel PPIs identified in each screen overlap. Our results establish that a quantitative tagless method can be used to validate and identify PPIs, but that such data must be analyzed carefully to minimize the FDR.
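    The FDR estimates above come from the study's own copurification statistics. As a generic illustration of FDR control on a list of p-values (illustrative values, not the paper's data), the standard Benjamini-Hochberg procedure works like this:

```python
# Benjamini-Hochberg: reject the k smallest p-values, where k is the largest
# rank with p_(k) <= alpha * k / m; this controls the expected FDR at alpha.

def benjamini_hochberg(pvals, alpha=0.05):
    """Return indices of hypotheses rejected at FDR level alpha."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k = 0
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= alpha * rank / m:
            k = rank
    return sorted(order[:k])

pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.074, 0.205, 0.212, 0.216]
rejected = benjamini_hochberg(pvals, alpha=0.05)
assert rejected == [0, 1]   # only the two smallest p-values survive
```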

  1. Quantitative Tagless Copurification: A Method to Validate and Identify Protein-Protein Interactions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shatsky, Maxim; Dong, Ming; Liu, Haichuan

    Identifying protein-protein interactions (PPIs) at an acceptable false discovery rate (FDR) is challenging. Previously we identified several hundred PPIs from affinity purification - mass spectrometry (AP-MS) data for the bacteria Escherichia coli and Desulfovibrio vulgaris. These two interactomes have lower FDRs than any of the nine interactomes proposed previously for bacteria and are more enriched in PPIs validated by other data than the nine earlier interactomes. To more thoroughly determine the accuracy of ours or other interactomes and to discover further PPIs de novo, here we present a quantitative tagless method that employs iTRAQ MS to measure the copurification ofmore » endogenous proteins through orthogonal chromatography steps. 5273 fractions from a four-step fractionation of a D. vulgaris protein extract were assayed, resulting in the detection of 1242 proteins. Protein partners from our D. vulgaris and E. coli AP-MS interactomes copurify as frequently as pairs belonging to three benchmark data sets of well-characterized PPIs. In contrast, the protein pairs from the nine other bacterial interactomes copurify two- to 20-fold less often. We also identify 200 high confidence D. vulgaris PPIs based on tagless copurification and colocalization in the genome. These PPIs are as strongly validated by other data as our AP-MS interactomes and overlap with our AP-MS interactome for D.vulgaris within 3% of expectation, once FDRs and false negative rates are taken into account. Finally, we reanalyzed data from two quantitative tagless screens of human cell extracts. We estimate that the novel PPIs reported in these studies have an FDR of at least 85% and find that less than 7% of the novel PPIs identified in each screen overlap. Our results establish that a quantitative tagless method can be used to validate and identify PPIs, but that such data must be analyzed carefully to minimize the FDR.« less

  2. Subarachnoid hemorrhage admissions retrospectively identified using a prediction model

    PubMed Central

    McIntyre, Lauralyn; Fergusson, Dean; Turgeon, Alexis; dos Santos, Marlise P.; Lum, Cheemun; Chassé, Michaël; Sinclair, John; Forster, Alan; van Walraven, Carl

    2016-01-01

    Objective: To create an accurate prediction model, using variables collected in widely available health administrative data records, to identify hospitalizations for primary subarachnoid hemorrhage (SAH). Methods: A previously established complete cohort of consecutive primary SAH patients was combined with a random sample of control hospitalizations. Chi-square recursive partitioning was used to derive and internally validate a model to predict the probability that a patient had primary SAH (due to aneurysm or arteriovenous malformation) using health administrative data. Results: A total of 10,322 hospitalizations, 631 of which had primary SAH (6.1%), were included in the study (5,122 derivation, 5,200 validation). In the validation patients, our recursive partitioning algorithm had a sensitivity of 96.5% (95% confidence interval [CI] 93.9–98.0), a specificity of 99.8% (95% CI 99.6–99.9), and a positive likelihood ratio of 483 (95% CI 254–879). In this population, patients meeting the algorithm's criteria had a 45% probability of truly having primary SAH. Conclusions: Routinely collected health administrative data can be used to accurately identify hospitalized patients with a high probability of having a primary SAH. Upon further validation, this algorithm may provide an easy and accurate method to create cohorts of primary SAH from either ruptured aneurysm or arteriovenous malformation. PMID:27629096
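    The sensitivity, specificity, and positive likelihood ratio reported above follow directly from a 2x2 confusion table. A minimal sketch, with hypothetical counts chosen only to roughly match the reported rates (not the study's actual data):

```python
# Sensitivity, specificity, and positive likelihood ratio from a 2x2 table.
# The counts below are hypothetical, for illustration only.
tp, fn = 609, 22    # SAH cases flagged / missed by the algorithm
tn, fp = 9672, 19   # non-SAH hospitalizations correctly passed / falsely flagged

sensitivity = tp / (tp + fn)                   # P(flagged | SAH)
specificity = tn / (tn + fp)                   # P(not flagged | no SAH)
lr_positive = sensitivity / (1 - specificity)  # how much a flag raises the odds

print(f"sensitivity = {sensitivity:.3f}")  # → sensitivity = 0.965
print(f"specificity = {specificity:.3f}")  # → specificity = 0.998
print(f"LR+ = {lr_positive:.0f}")          # → LR+ = 492
```

    With a low disease prevalence, even a very high LR+ still leaves a moderate post-test probability, which is why only 45% of flagged patients truly had SAH.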

  3. Validation of automated white matter hyperintensity segmentation.

    PubMed

    Smart, Sean D; Firbank, Michael J; O'Brien, John T

    2011-01-01

    Introduction. White matter hyperintensities (WMHs) are a common finding on MRI scans of older people and are associated with vascular disease. We compared three methods for automatically segmenting WMHs from MRI scans. Method. An operator manually segmented WMHs on MRI images from a 3T scanner. The scans were also segmented in a fully automated fashion by three different programmes. The voxel overlap between manual and automated segmentation was compared. Results. The between-observer overlap ratio was 63%. Using our previously described in-house software, we achieved an overlap of 62.2%. We also investigated the use of a modified version of SPM segmentation; however, this was not successful, with only 14% overlap. Discussion. Using our previously reported software, we demonstrated good segmentation of WMHs in a fully automated fashion.

  4. Accuracy of the visual estimation method as a predictor of food intake in Alzheimer's patients provided with different types of food.

    PubMed

    Amano, Nobuko; Nakamura, Tomiyo

    2018-02-01

    The visual estimation method is commonly used in hospitals and other care facilities to evaluate food intake through estimation of plate waste. In Japan, no previous studies have investigated the validity and reliability of this method under the routine conditions of a hospital setting. The present study aimed to evaluate the validity and reliability of the visual estimation method in long-term inpatients with different levels of eating disability caused by Alzheimer's disease, who were provided different therapeutic diets presented in various food types. This study was performed between February and April 2013 and included 82 patients with Alzheimer's disease. Plate waste was evaluated for the 3 main daily meals over a total of 21 days (7 consecutive days in each of the 3 months), yielding a total of 4851 meals, of which 3984 were included. Plate waste was measured by the nurses through the visual estimation method, and by the hospital's registered dietitians through the actual measurement method. The actual measurement method was first validated to serve as a reference, and the level of agreement between the two methods was then determined. The month, time of day, type of food provided, and patients' physical characteristics were considered in the analysis. For the 3984 meals included in the analysis, the level of agreement between the measurement methods was 78.4%; disagreement consisted of 3.8% underestimation and 17.8% overestimation. Cronbach's α (0.60, P < 0.001) indicated that the reliability of the visual estimation method was within the acceptable range. The visual estimation method was found to be a valid and reliable method for estimating food intake in patients with different levels of eating impairment. The successful implementation and use of the method depends upon adequate training and motivation of the nurses and care staff involved. 
Copyright © 2017 European Society for Clinical Nutrition and Metabolism. Published by Elsevier Ltd. All rights reserved.

  5. Development and validation of criterion-referenced clinically relevant fitness standards for maintaining physical independence in later years.

    PubMed

    Rikli, Roberta E; Jones, C Jessie

    2013-04-01

    To develop and validate criterion-referenced fitness standards for older adults that predict the level of capacity needed for maintaining physical independence into later life. The proposed standards were developed for use with a previously validated test battery for older adults, the Senior Fitness Test (Rikli, R. E., & Jones, C. J. (2001). Development and validation of a functional fitness test for community-residing older adults. Journal of Aging and Physical Activity, 6, 127-159; Rikli, R. E., & Jones, C. J. (1999a). Senior fitness test manual. Champaign, IL: Human Kinetics.). A criterion measure to assess physical independence was identified. Next, scores from a subset of 2,140 "moderate-functioning" older adults from a larger cross-sectional database, together with findings from longitudinal research on physical capacity and aging, were used as the basis for proposing fitness standards (performance cut points) associated with having the ability to function independently. Validity and reliability analyses were conducted to test the standards for their accuracy and consistency as predictors of physical independence. Performance standards are presented for men and women ages 60-94 indicating the level of fitness associated with remaining physically independent until late in life. Reliability and validity indicators for the standards ranged between .79 and .97. The proposed standards provide easy-to-use, previously unavailable methods for evaluating physical capacity in older adults relative to the level associated with physical independence. Most importantly, the standards can be used in planning interventions that target specific areas of weakness, thus reducing the risk of premature loss of mobility and independence.

  6. The Research Diagnostic Criteria for Temporomandibular Disorders. I: overview and methodology for assessment of validity.

    PubMed

    Schiffman, Eric L; Truelove, Edmond L; Ohrbach, Richard; Anderson, Gary C; John, Mike T; List, Thomas; Look, John O

    2010-01-01

    The purpose of the Research Diagnostic Criteria for Temporomandibular Disorders (RDC/TMD) Validation Project was to assess the diagnostic validity of this examination protocol. The aim of this article is to provide an overview of the project's methodology, descriptive statistics, and data for the study participant sample. This article also details the development of reliable methods to establish the reference standards for assessing criterion validity of the Axis I RDC/TMD diagnoses. The Axis I reference standards were based on the consensus of two criterion examiners independently performing a comprehensive history, clinical examination, and evaluation of imaging. Intersite reliability was assessed annually for criterion examiners and radiologists. Criterion examination reliability was also assessed within study sites. Study participant demographics were comparable to those of participants in previous studies using the RDC/TMD. Diagnostic agreement of the criterion examiners with each other and with the consensus-based reference standards was excellent, with all kappas ≥ 0.81, except for osteoarthrosis (moderate agreement, κ = 0.53). Intrasite criterion examiner agreement with reference standards was excellent (κ ≥ 0.95). Intersite reliability of the radiologists for detecting computed tomography-disclosed osteoarthrosis and magnetic resonance imaging-disclosed disc displacement was good to excellent (κ = 0.71 and 0.84, respectively). The Validation Project study population was appropriate for assessing the reliability and validity of the RDC/TMD Axis I and II. The reference standards used to assess the validity of Axis I TMD were based on reliable and clinically credible methods.
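    Chance-corrected agreement figures like the kappas quoted above follow the standard Cohen's kappa formula. A minimal, self-contained sketch with invented ratings (not the study's data):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    ca, cb = Counter(rater_a), Counter(rater_b)
    # Expected agreement if the two raters' labels were independent.
    expected = sum(ca[label] * cb[label] for label in set(ca) | set(cb)) / n**2
    return (observed - expected) / (1 - expected)

# Invented diagnoses from two examiners (1 = disorder present, 0 = absent).
a = [1, 1, 0, 0, 1, 0, 1, 0, 0, 0]
b = [1, 1, 0, 0, 1, 0, 0, 0, 0, 1]
print(round(cohens_kappa(a, b), 2))  # → 0.58
```

    Here 8 of 10 labels agree (observed 0.80), but 0.52 agreement is expected by chance, giving κ ≈ 0.58.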

  7. A novel multi-target regression framework for time-series prediction of drug efficacy.

    PubMed

    Li, Haiqing; Zhang, Wei; Chen, Ying; Guo, Yumeng; Li, Guo-Zheng; Zhu, Xiaoxin

    2017-01-18

    Extracting knowledge from small samples is a challenging pharmacokinetic problem to which statistical methods can be applied. Pharmacokinetic data are unusual in combining small sample sizes with high dimensionality, which makes it difficult to adopt conventional methods to predict the efficacy of a traditional Chinese medicine (TCM) prescription. The main purpose of our study is to obtain some knowledge of the correlations in TCM prescription. Here, a novel method named the Multi-target Regression Framework is proposed to deal with the problem of efficacy prediction. We exploit the correlation between the values of different time sequences and add the predictive targets of previous time points as features when predicting the value at the current time point. Several experiments were conducted to test the validity of our method, and the results of leave-one-out cross-validation clearly demonstrate the competitiveness of our framework. Compared with linear regression, artificial neural networks, and partial least squares, support vector regression combined with our framework demonstrates the best performance and appears to be more suitable for this task.
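    The framework idea, using an earlier time point's target as an extra feature for the current time point, can be illustrated on toy data. The sketch below uses ordinary least squares with leave-one-out cross-validation in place of the support vector regression used in the study; all data are synthetic:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 12 samples, 3 baseline features, efficacy measured at two times.
X = rng.normal(size=(12, 3))
y_t1 = X @ [0.5, -0.2, 0.1] + rng.normal(scale=0.5, size=12)   # time 1 efficacy
y_t2 = 0.8 * y_t1 + rng.normal(scale=0.05, size=12)            # time 2 efficacy

def loo_predictions(features, target):
    """Leave-one-out predictions from ordinary least squares."""
    preds = np.empty(len(target))
    for i in range(len(target)):
        mask = np.arange(len(target)) != i
        coef, *_ = np.linalg.lstsq(features[mask], target[mask], rcond=None)
        preds[i] = features[i] @ coef
    return preds

# Plain model: predict time-2 efficacy from baseline features only.
plain = loo_predictions(X, y_t2)
# Framework: append the previous time point's target as an extra feature.
augmented = loo_predictions(np.column_stack([X, y_t1]), y_t2)

def rmse(pred, true):
    return float(np.sqrt(np.mean((pred - true) ** 2)))

print("baseline-only RMSE:     ", rmse(plain, y_t2))
print("with lagged target RMSE:", rmse(augmented, y_t2))
```

    Because time-2 efficacy depends strongly on time-1 efficacy, the augmented model recovers structure the baseline features alone cannot.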

  8. A novel multi-target regression framework for time-series prediction of drug efficacy

    PubMed Central

    Li, Haiqing; Zhang, Wei; Chen, Ying; Guo, Yumeng; Li, Guo-Zheng; Zhu, Xiaoxin

    2017-01-01

    Extracting knowledge from small samples is a challenging pharmacokinetic problem to which statistical methods can be applied. Pharmacokinetic data are unusual in combining small sample sizes with high dimensionality, which makes it difficult to adopt conventional methods to predict the efficacy of a traditional Chinese medicine (TCM) prescription. The main purpose of our study is to obtain some knowledge of the correlations in TCM prescription. Here, a novel method named the Multi-target Regression Framework is proposed to deal with the problem of efficacy prediction. We exploit the correlation between the values of different time sequences and add the predictive targets of previous time points as features when predicting the value at the current time point. Several experiments were conducted to test the validity of our method, and the results of leave-one-out cross-validation clearly demonstrate the competitiveness of our framework. Compared with linear regression, artificial neural networks, and partial least squares, support vector regression combined with our framework demonstrates the best performance and appears to be more suitable for this task. PMID:28098186

  9. A Simple and Efficient Computational Approach to Chafed Cable Time-Domain Reflectometry Signature Prediction

    NASA Technical Reports Server (NTRS)

    Kowalski, Marc Edward

    2009-01-01

    A method for predicting the time-domain signatures of chafed coaxial cables is presented. The method is quasi-static in nature and is thus efficient enough to be included in inference and inversion routines. Unlike previously proposed models, the present approach places no restriction on the geometry or size of the chafe. The model is validated, and its speed illustrated, via comparison to simulations from a commercial three-dimensional electromagnetic simulator.

  10. T2* Mapping Provides Information That Is Statistically Comparable to an Arthroscopic Evaluation of Acetabular Cartilage.

    PubMed

    Morgan, Patrick; Nissi, Mikko J; Hughes, John; Mortazavi, Shabnam; Ellerman, Jutta

    2017-07-01

    Objectives: The purpose of this study was to validate T2* mapping as an objective, noninvasive method for the prediction of acetabular cartilage damage. Methods: This is the second step in the validation of T2*. In a previous study, we established a quantitative predictive model for identifying and grading acetabular cartilage damage. In this study, the model was applied to a second cohort of 27 consecutive hips to validate it. A clinical 3.0-T imaging protocol with T2* mapping was used. Acetabular regions of interest (ROIs) were identified on magnetic resonance images and graded using the previously established model. Each ROI was then graded in a blinded fashion by arthroscopy. Accurate surgical location of ROIs was facilitated with a 2-dimensional map projection of the acetabulum. A total of 459 ROIs were studied. Results: When T2* mapping and arthroscopic assessment were compared, 82% of ROIs were within 1 Beck group (of a total of 6 possible) and 32% of ROIs were classified identically. Disease prediction based on receiver operating characteristic curve analysis demonstrated a sensitivity of 0.713 and a specificity of 0.804. Model stability evaluation required no significant changes to the predictive model produced in the initial study. Conclusions: These results validate that T2* mapping provides statistically comparable information about acetabular cartilage when compared with arthroscopy. In contrast to arthroscopy, T2* mapping is quantitative, noninvasive, and can be used in follow-up. Unlike research quantitative magnetic resonance protocols, T2* takes little time and does not require a contrast agent. This may facilitate its use in the clinical sphere.

  11. The role of affect-driven impulsivity in gambling cognitions: A convenience-sample study with a Spanish version of the Gambling-Related Cognitions Scale.

    PubMed

    Del Prete, Francesco; Steward, Trevor; Navas, Juan F; Fernández-Aranda, Fernando; Jiménez-Murcia, Susana; Oei, Tian P S; Perales, José C

    2017-03-01

    Background and aims: Abnormal cognitions are among the most salient domain-specific features of gambling disorder. The aims of this study were: (a) to examine and validate a Spanish version of the Gambling-Related Cognitions Scale (GRCS; Raylu & Oei, 2004) and (b) to examine associations between cognitive distortion levels, impulsivity, and gambling behavior. Methods: This study first recruited a convenience sample of 500 adults who had gambled during the previous year. Participants were assessed using the Spanish version of the GRCS (GRCS-S) questionnaire, the UPPS-P impulsivity questionnaire, measures of gambling behavior, and potentially relevant confounders. Robust confirmatory factor analysis methods on half the sample were used to select the best models from a hypothesis-driven set. The best solutions were validated on the other half, and the resulting factors were later correlated with impulsivity dimensions (in the whole n = 500 factor analysis sample) and clinically relevant gambling indices (in a separate convenience sample of 137 disordered and non-disordered gamblers; validity sample). Results: This study supports the original five-factor model, suggests an alternative four-factor solution, and confirms the psychometric soundness of the GRCS-S. Importantly, cognitive distortions consistently correlated with affect- or motivation-driven aspects of impulsivity (urgency and sensation seeking), but not with cognitive impulsivity (lack of premeditation and lack of perseverance). Discussion and conclusions: Our findings suggest that the GRCS-S is a valid and reliable instrument to identify gambling cognitions in Spanish samples. Our results expand upon previous research signaling specific associations between gambling-related distortions and affect-driven impulsivity in line with models of motivated reasoning.

  12. The role of affect-driven impulsivity in gambling cognitions: A convenience-sample study with a Spanish version of the Gambling-Related Cognitions Scale

    PubMed Central

    Del Prete, Francesco; Steward, Trevor; Navas, Juan F.; Fernández-Aranda, Fernando; Jiménez-Murcia, Susana; Oei, Tian P. S.; Perales, José C.

    2017-01-01

    Background and aims: Abnormal cognitions are among the most salient domain-specific features of gambling disorder. The aims of this study were: (a) to examine and validate a Spanish version of the Gambling-Related Cognitions Scale (GRCS; Raylu & Oei, 2004) and (b) to examine associations between cognitive distortion levels, impulsivity, and gambling behavior. Methods: This study first recruited a convenience sample of 500 adults who had gambled during the previous year. Participants were assessed using the Spanish version of the GRCS (GRCS-S) questionnaire, the UPPS-P impulsivity questionnaire, measures of gambling behavior, and potentially relevant confounders. Robust confirmatory factor analysis methods on half the sample were used to select the best models from a hypothesis-driven set. The best solutions were validated on the other half, and the resulting factors were later correlated with impulsivity dimensions (in the whole n = 500 factor analysis sample) and clinically relevant gambling indices (in a separate convenience sample of 137 disordered and non-disordered gamblers; validity sample). Results: This study supports the original five-factor model, suggests an alternative four-factor solution, and confirms the psychometric soundness of the GRCS-S. Importantly, cognitive distortions consistently correlated with affect- or motivation-driven aspects of impulsivity (urgency and sensation seeking), but not with cognitive impulsivity (lack of premeditation and lack of perseverance). Discussion and conclusions: Our findings suggest that the GRCS-S is a valid and reliable instrument to identify gambling cognitions in Spanish samples. Our results expand upon previous research signaling specific associations between gambling-related distortions and affect-driven impulsivity in line with models of motivated reasoning. PMID:28118729

  13. Improving qPCR telomere length assays: Controlling for well position effects increases statistical power.

    PubMed

    Eisenberg, Dan T A; Kuzawa, Christopher W; Hayes, M Geoffrey

    2015-01-01

    Telomere length (TL) is commonly measured using quantitative PCR (qPCR). Although easier than the Southern blot of terminal restriction fragments (TRF) TL measurement method, one drawback of qPCR is that it introduces greater measurement error and thus reduces the statistical power of analyses. To address a potential source of measurement error, we consider the effect of well position on qPCR TL measurements. qPCR TL data from 3,638 people run on a Bio-Rad iCycler iQ are reanalyzed here. To evaluate measurement validity, correspondence with TRF measurements, with age, and between mother and offspring is examined. First, we present evidence for systematic variation in qPCR TL measurements in relation to thermocycler well position. Controlling for these well-position effects consistently improves measurement validity and yields estimated improvements in statistical power equivalent to increasing sample sizes by 16%. We additionally evaluated the linearity of the relationships between telomere and single-copy gene control amplicons and between qPCR and TRF measures. We find that, unlike some previous reports, our data exhibit linear relationships. We introduce the standard error in percent, a superior method for quantifying measurement error compared with the commonly used coefficient of variation. Using this measure, we find that excluding samples with high measurement error does not improve measurement validity in our study. Future studies using block-based thermocyclers should consider well-position effects. Since additional information can be gleaned from well-position corrections, rerunning analyses of previous results with well-position correction could serve as an independent test of the validity of those results. © 2015 Wiley Periodicals, Inc.
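    One simple way to control for systematic well-position effects, not necessarily the authors' exact model, is to estimate each well position's average deviation across many plates and subtract it. A sketch on synthetic data:

```python
import numpy as np

rng = np.random.default_rng(1)

n_plates, n_wells = 40, 96
true_tl = rng.normal(1.0, 0.15, size=(n_plates, n_wells))   # true relative T/S ratios
well_bias = rng.normal(0.0, 0.10, size=n_wells)             # systematic per-well effect
noise = rng.normal(0.0, 0.05, size=(n_plates, n_wells))     # residual assay noise
measured = true_tl + well_bias + noise

# Estimate each well position's systematic deviation across plates, then remove it.
well_effect = measured.mean(axis=0) - measured.mean()
corrected = measured - well_effect

def rmse(estimate):
    """Root-mean-square error against the simulated true values."""
    return float(np.sqrt(np.mean((estimate - true_tl) ** 2)))

print("RMSE before correction:", rmse(measured))
print("RMSE after correction: ", rmse(corrected))
```

    With enough plates, the per-well mean isolates the systematic bias, so the corrected values sit closer to the true ratios than the raw measurements.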

  14. Construct Validity of Fresh Frozen Human Cadaver as a Training Model in Minimal Access Surgery

    PubMed Central

    Macafee, David; Pranesh, Nagarajan; Horgan, Alan F.

    2012-01-01

    Background: The construct validity of fresh human cadaver as a training tool has not been established previously. The aims of this study were to investigate the construct validity of fresh frozen human cadaver as a method of training in minimal access surgery and to determine whether novices can be rapidly trained to a safe level of performance using this model. Methods: Junior surgical trainees who were novices in laparoscopic surgery (<3 laparoscopic procedures performed) performed 10 repetitions of a set of structured laparoscopic tasks on fresh frozen cadavers. Expert laparoscopists (>100 laparoscopic procedures) performed 3 repetitions of identical tasks. Performances were scored using a validated, objective Global Operative Assessment of Laparoscopic Skills scale. Scores for 3 consecutive repetitions were compared between experts and novices to determine construct validity. Furthermore, to determine whether the novices reached a safe level, a trimmed mean of the experts' scores was used to define a benchmark. A Mann-Whitney U test was used for the construct validity analysis and a 1-sample t test to compare the novice group's performance with the benchmark safe score. Results: Ten novices and 2 experts were recruited. Four of 5 tasks (nondominant-to-dominant hand transfer; simulated appendicectomy; intracorporeal and extracorporeal knot tying) showed construct validity. Novices' scores became comparable to benchmark scores between the eighth and tenth repetitions. Conclusion: Minimal access surgical training using fresh frozen human cadavers appears to have construct validity. The laparoscopic skills of novices can be accelerated to a safe level within 8 to 10 repetitions. PMID:23318058

  15. Validation on milk and sprouts of EN ISO 16654:2001 - Microbiology of food and animal feeding stuffs - Horizontal method for the detection of Escherichia coli O157.

    PubMed

    Tozzoli, Rosangela; Maugliani, Antonella; Michelacci, Valeria; Minelli, Fabio; Caprioli, Alfredo; Morabito, Stefano

    2018-05-08

    In 2006, the European Committee for Standardisation (CEN)/Technical Committee 275 - Food analysis - Horizontal methods/Working Group 6 - Microbiology of the food chain (TC275/WG6) launched a project to validate the method ISO 16654:2001 for the detection of Escherichia coli O157 in foodstuffs by evaluating its performance, in terms of sensitivity and specificity, through collaborative studies. Previously, a validation study had been conducted to assess the performance of Method No. 164 developed by the Nordic Committee for Food Analysis (NMKL), which likewise aims at detecting E. coli O157 in food and is based on a procedure equivalent to that of the ISO 16654:2001 standard. CEN therefore established that the validation data obtained for the NMKL Method 164 could be exploited for the ISO 16654:2001 validation project, integrated with new data obtained through two additional interlaboratory studies on milk and sprouts run in the framework of CEN mandate No. M381. The ISO 16654:2001 validation project was led by the European Union Reference Laboratory for Escherichia coli including VTEC (EURL-VTEC), which organized the collaborative validation study on milk in 2012, with 15 participating laboratories, and that on sprouts in 2014, with 14 participating laboratories. In both studies, a total of 24 samples were tested by each laboratory. Test materials were spiked with different concentrations of E. coli O157, and the 24 samples corresponded to eight replicates at each of three levels of contamination: zero, low, and high. The results submitted by the participating laboratories were analyzed to evaluate the sensitivity and specificity of the ISO 16654:2001 method when applied to milk and sprouts. The performance characteristics calculated from the data of the collaborative validation studies run under CEN mandate No. M381 gave a sensitivity of 100% and a specificity of 94.4% for the milk study. For the sprouts matrix, sensitivity was 75.9% for samples with a low level of contamination and 96.4% for samples spiked with a high level of E. coli O157, and specificity was 99.1%. Copyright © 2018 Elsevier B.V. All rights reserved.

  16. Functional classification of protein structures by local structure matching in graph representation.

    PubMed

    Mills, Caitlyn L; Garg, Rohan; Lee, Joslynn S; Tian, Liang; Suciu, Alexandru; Cooperman, Gene; Beuning, Penny J; Ondrechen, Mary Jo

    2018-03-31

    As a result of high-throughput protein structure initiatives, over 14,400 protein structures have been solved by structural genomics (SG) centers and participating research groups. While the totality of SG data represents a tremendous contribution to genomics and structural biology, reliable functional information for these proteins is generally lacking. Better functional predictions for SG proteins will add substantial value to the structural information already obtained. Our method described herein, Graph Representation of Active Sites for Prediction of Function (GRASP-Func), predicts quickly and accurately the biochemical function of proteins by representing residues at the predicted local active site as graphs rather than in Cartesian coordinates. We compare the GRASP-Func method to our previously reported method, structurally aligned local sites of activity (SALSA), using the ribulose phosphate binding barrel (RPBB), 6-hairpin glycosidase (6-HG), and Concanavalin A-like Lectins/Glucanase (CAL/G) superfamilies as test cases. In each of the superfamilies, SALSA and the much faster method GRASP-Func yield similar correct classification of previously characterized proteins, providing a validated benchmark for the new method. In addition, we analyzed SG proteins using our SALSA and GRASP-Func methods to predict function. Forty-one SG proteins in the RPBB superfamily, nine SG proteins in the 6-HG superfamily, and one SG protein in the CAL/G superfamily were successfully classified into one of the functional families in their respective superfamily by both methods. This improved, faster, validated computational method can yield more reliable predictions of function that can be used for a wide variety of applications by the community. © 2018 The Authors Protein Science published by Wiley Periodicals, Inc. on behalf of The Protein Society.

  17. A conflict management scale for pharmacy.

    PubMed

    Austin, Zubin; Gregory, Paul A; Martin, Craig

    2009-11-12

    To develop and establish the validity and reliability of a conflict management scale specific to pharmacy practice and education. A multistage inventory-item development process was undertaken involving 93 pharmacists and using a previously described explanatory model for conflict in pharmacy practice. A 19-item inventory was developed, field tested, and validated. The conflict management scale (CMS) demonstrated an acceptable degree of reliability and validity for use in educational or practice settings to promote self-reflection and self-awareness regarding individuals' conflict management styles. The CMS provides a unique, pharmacy-specific method for individuals to determine and reflect upon their own conflict management styles. As part of an educational program to facilitate self-reflection and heighten self-awareness, the CMS may be a useful tool to promote discussions related to an important part of pharmacy practice.

  18. Variability of Hormonal Stress Markers Collected from a Managed Dolphin Population

    DTIC Science & Technology

    2011-09-30

    physiological indicators of stress in wild marine mammals and the interrelationships between different stress markers can be used to estimate the impact of... samples will be processed for adrenocorticosteroids (ACTH, cortisol, aldosterone), catecholamines (epinephrine, norepinephrine), and thyroid hormones (T3 and T4) via radioimmunoassay (RIA). Radioimmunoassay methods have previously been validated for cortisol and aldosterone in this species (Houser

  19. Reconsidering the Utility of Case Study Designs for Researching School Reform in a Neo-Scientific Era: Insights from a Multiyear, Mixed-Methods Study

    ERIC Educational Resources Information Center

    Donmoyer, Robert; Galloway, Fred

    2010-01-01

    In recent years, policy makers and researchers once again have embraced the traditional idea that quasi-experimental research designs (or reasonable facsimiles) can provide the sort of valid and generalizable knowledge about "what works" that educational researchers had promised--but never really produced--during the previous century. Although…

  20. Psychiatric OSCE Performance of Students with and without a Previous Core Psychiatry Clerkship

    ERIC Educational Resources Information Center

    Goisman, Robert M.; Levin, Robert M.; Krupat, Edward; Pelletier, Stephen R.; Alpert, Jonathan E.

    2010-01-01

    Objective: The OSCE has been demonstrated to be a reliable and valid method by which to assess students' clinical skills. An OSCE station was used to determine whether or not students who had completed a core psychiatry clerkship demonstrated skills that were superior to those who had not taken the clerkship and which areas discriminated between…

  1. Validation of the content of the prevention protocol for early sepsis caused by Streptococcus agalactiae in newborns

    PubMed Central

    da Silva, Fabiana Alves; Vidal, Cláudia Fernanda de Lacerda; de Araújo, Ednaldo Cavalcante

    2015-01-01

    Objective: to validate the content of the prevention protocol for early sepsis caused by Streptococcus agalactiae in newborns. Method: a cross-sectional, descriptive, methodological study with a quantitative approach. The sample was composed of 15 judges: 8 obstetricians and 7 pediatricians. Validation occurred through assessment of the protocol's content by the judges, who received a data-collection instrument, a checklist, containing the 7 items that represent the requisites to be met by the protocol. Content validation was achieved by applying the Content Validity Index. Result: in the judging process, all the items representing requirements covered by the protocol obtained concordance within the established level (Content Validity Index > 0.75). Of the 7 items, 6 obtained full concordance (Content Validity Index 1.0) and the feasibility item obtained a Content Validity Index of 0.93. The global assessment of the instrument obtained a Content Validity Index of 0.99. Conclusion: content validation proved an efficient tool for adjusting the protocol according to the judgment of experienced professionals, which demonstrates the importance of validating instruments before use. It is expected that this study will encourage the adoption of universal screening by other institutions through validated protocols. PMID:26444165
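The Content Validity Index used in this record is straightforward to compute: an item-level CVI is the proportion of judges who rate the item as relevant (commonly a rating of 3 or 4 on a 4-point scale). A minimal sketch in Python; the ratings below are invented for illustration and are not the study's data:

```python
# Illustrative item-level Content Validity Index (I-CVI):
# the proportion of judges rating an item as relevant.
# The ratings and the 4-point scale are hypothetical, not from the study.

def content_validity_index(ratings, relevant=(3, 4)):
    """I-CVI: fraction of judges whose rating falls in the 'relevant' set.

    ratings  -- one rating per judge, e.g. on a 1-4 scale
    relevant -- rating values counted as agreement
    """
    agree = sum(1 for r in ratings if r in relevant)
    return agree / len(ratings)

# 15 judges, as in the study; the individual ratings are made up.
item_ratings = [4, 4, 3, 4, 4, 3, 4, 4, 4, 3, 4, 4, 4, 4, 2]
cvi = content_validity_index(item_ratings)
print(round(cvi, 2))  # 14/15 -> 0.93
```

With 15 judges, 14 in agreement gives an I-CVI of 14/15 ≈ 0.93, which clears the 0.75 acceptance level applied in the study.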

  2. Measurement properties of existing clinical assessment methods evaluating scapular positioning and function. A systematic review.

    PubMed

    Larsen, Camilla Marie; Juul-Kristensen, Birgit; Lund, Hans; Søgaard, Karen

    2014-10-01

    The aims were to compile a schematic overview of clinical scapular assessment methods and critically appraise the methodological quality of the involved studies. A systematic, computer-assisted literature search using Medline, CINAHL, SportDiscus and EMBASE was performed from inception to October 2013. Reference lists in articles were also screened for publications. From 50 articles, 54 method names were identified and categorized into three groups: (1) static positioning assessment (n = 19); (2) semi-dynamic assessment (n = 13); and (3) dynamic functional assessment (n = 22). Fifteen studies were excluded from evaluation due to no or few clinimetric results, leaving 35 studies for evaluation. Graded according to the COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) checklist, the methodological quality in the reliability and validity domains was "fair" (57%) to "poor" (43%), with only one study rated as "good". The reliability domain was most often investigated. Few of the assessment methods in the included studies that had "fair" or "good" measurement property ratings demonstrated acceptable results for both reliability and validity. We found a substantially larger number of clinical scapular assessment methods than previously reported. Using the COSMIN checklist, the methodological quality of the included measurement properties in the reliability and validity domains was in general "fair" to "poor". None were examined for all three domains: (1) reliability; (2) validity; and (3) responsiveness. Observational evaluation systems and assessment of scapular upward rotation seem suitably evidence-based for clinical use. Future studies should test and improve the clinimetric properties, especially diagnostic accuracy and responsiveness, to increase utility for clinical practice.

  3. Optimization of detection conditions and single-laboratory validation of a multiresidue method for the determination of 135 pesticides and 25 organic pollutants in grapes and wine by gas chromatography time-of-flight mass spectrometry.

    PubMed

    Dasgupta, Soma; Banerjee, Kaushik; Dhumal, Kondiba N; Adsule, Pandurang G

    2011-01-01

    This paper describes single-laboratory validation of a multiresidue method for the determination of 135 pesticides, 12 dioxin-like polychlorinated biphenyls, 12 polyaromatic hydrocarbons, and bisphenol A in grapes and wine by GC/time-of-flight MS in a total run time of 48 min. The method is based on extraction with ethyl acetate in a sample-to-solvent ratio of 1:1, followed by selective dispersive SPE cleanup for grapes and wine. The GC/MS conditions were optimized for the chromatographic separation and to achieve the highest S/N for all 160 target analytes, including temperature-sensitive compounds such as captan and captafol, which are prone to degradation during analysis. An average recovery of 80-120% with RSD < 10% could be attained for all analytes except 17, for which the average recoveries were 70-80%. LOQs ranged from 10 to 50 ng/g, with < 25% expanded uncertainties, for 155 compounds in grapes and 151 in wine. In the incurred grape and wine samples, residues of buprofezin, chlorpyriphos, metalaxyl, and myclobutanil were detected, with an RSD of < 5% (n = 6); the results were statistically similar to those of previously reported validated methods.

  4. Validation of an association rule mining-based method to infer associations between medications and problems.

    PubMed

    Wright, A; McCoy, A; Henkin, S; Flaherty, M; Sittig, D

    2013-01-01

    In a prior study, we developed methods for automatically identifying associations between medications and problems using association rule mining on a large clinical data warehouse and validated these methods at a single site that used a self-developed electronic health record. The aim of this study was to demonstrate the generalizability of these methods by validating them at an external site. We received data on medications and problems for 263,597 patients from the University of Texas Health Science Center at Houston Faculty Practice, an ambulatory practice that uses the Allscripts Enterprise commercial electronic health record product. We then conducted association rule mining to identify associated pairs of medications and problems and characterized these associations with five measures of interestingness (support, confidence, chi-square, interest, and conviction), and compared the top-ranked pairs to a gold standard. A total of 25,088 medication-problem pairs were identified that exceeded our confidence and support thresholds. An analysis of the top 500 pairs according to each measure of interestingness showed a high degree of accuracy for highly ranked pairs. The same technique was successfully employed at the University of Texas, and accuracy was comparable to our previous results. Top associations included many medications that are highly specific for a particular problem as well as a large number of common, accurate medication-problem pairs that reflect practice patterns.
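All five interestingness measures named in this abstract can be derived from the 2x2 co-occurrence table of a medication A and a problem B. A hedged sketch with invented counts (the function name and numbers are illustrative, not the study's data):

```python
# Sketch of the five interestingness measures named in the abstract,
# computed from co-occurrence counts of a medication (A) and a problem (B).
# All counts below are invented for illustration.

def interestingness(n_ab, n_a, n_b, n):
    """Support, confidence, lift (interest), conviction, and chi-square
    for the rule A -> B, given joint and marginal counts over n patients."""
    p_ab, p_a, p_b = n_ab / n, n_a / n, n_b / n
    support = p_ab
    confidence = p_ab / p_a
    lift = p_ab / (p_a * p_b)  # called "interest" in the abstract
    conviction = (1 - p_b) / (1 - confidence) if confidence < 1 else float("inf")
    # Pearson chi-square over the 2x2 contingency table of A and B
    chi2 = 0.0
    for a in (True, False):
        for b in (True, False):
            obs = (n_ab if a and b
                   else n_a - n_ab if a
                   else n_b - n_ab if b
                   else n - n_a - n_b + n_ab)
            exp = (p_a if a else 1 - p_a) * (p_b if b else 1 - p_b) * n
            chi2 += (obs - exp) ** 2 / exp
    return support, confidence, lift, conviction, chi2

# Invented counts: medication A in 100 of 10,000 patients, problem B in 200,
# and 90 patients with both.
s, c, l, v, x2 = interestingness(n_ab=90, n_a=100, n_b=200, n=10000)
print(f"support={s:.3f} confidence={c:.2f} lift={l:.1f} conviction={v:.1f}")
```

A high lift and conviction, as here, is the signature of the highly specific medication-problem pairs the study reports at the top of its rankings.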

  5. Fatty acid ethyl esters (FAEEs) as markers for alcohol in meconium: method validation and implementation of a screening program for prenatal drug exposure.

    PubMed

    Hastedt, Martin; Krumbiegel, Franziska; Gapert, René; Tsokos, Michael; Hartwig, Sven

    2013-09-01

    Alcohol consumption during pregnancy is a widespread problem and can cause severe fetal damage. As the diagnosis of fetal alcohol syndrome is difficult, the implementation of a reliable marker for alcohol consumption during pregnancy into meconium drug screening programs would be invaluable. A previously published gas chromatography mass spectrometry method for the detection of fatty acid ethyl esters (FAEEs) as alcohol markers in meconium was optimized and newly validated for a sample size of 50 mg. This method was applied to 122 cases from a drug-using population. The meconium samples were also tested for common drugs of abuse. In 73% of the cases, one or more drugs were found. Twenty percent of the samples tested positive for FAEEs at levels indicating significant alcohol exposure. Consequently, alcohol was found to be the third most frequently abused substance within the study group. This re-validated method provides an increase in testing sensitivity, is reliable and easily applicable as part of a drug screening program. It can be used as a non-invasive tool to detect high alcohol consumption in the last trimester of pregnancy. The introduction of FAEEs testing in meconium screening was found to be of particular use in a drug-using population.

  6. Methodologies on estimating the energy requirements for maintenance and determining the net energy contents of feed ingredients in swine: a review of recent work.

    PubMed

    Li, Zhongchao; Liu, Hu; Li, Yakui; Lv, Zhiqian; Liu, Ling; Lai, Changhua; Wang, Junjun; Wang, Fenglai; Li, Defa; Zhang, Shuai

    2018-01-01

    In the past two decades, a considerable amount of research has focused on the determination of the digestible energy (DE) and metabolizable energy (ME) contents of feed ingredients fed to swine. Compared with the DE and ME systems, the net energy (NE) system is assumed to give the most accurate estimate of the energy actually available to the animal. However, published data pertaining to the measured NE content of ingredients fed to growing pigs are limited. Therefore, the Feed Data Group at the Ministry of Agriculture Feed Industry Centre (MAFIC), located at China Agricultural University, has evaluated the NE content of many ingredients using indirect calorimetry. The present review summarizes the NE research conducted at MAFIC and compares these results with those from other research groups on methodological aspects. These research projects mainly focus on estimating the energy requirements for maintenance and their impact on the determination, prediction, and validation of the NE content of several ingredients fed to swine. The estimation of maintenance energy is affected by methodology, growth stage, and previous feeding level. The fasting heat production method and the curvilinear regression method were used at MAFIC to estimate the NE requirement for maintenance. The NE contents of different feedstuffs were determined using indirect calorimetry through a standard experimental procedure at MAFIC. Previously generated NE equations can also be used to predict NE in situations where calorimeters are not available. Although popular, caloric efficiency is not a generally accepted method to validate the energy content of individual feedstuffs. In the future, more accurate and dynamic NE prediction equations aimed at specific ingredients should be established, and more practical validation approaches need to be developed.

  7. Estimation of Sensory Analysis Cupping Test Arabica Coffee Using NIR Spectroscopy

    NASA Astrophysics Data System (ADS)

    Safrizal; Sutrisno; Lilik, P. E. N.; Ahmad, U.; Samsudin

    2018-05-01

    Flavor has become one of the most important coffee quality parameters; many coffee-consuming countries now require certain taste scores for the coffee they order. The cupping method of appraisal currently in use is the one designed by the Specialty Coffee Association of America (SCAA). Several previous studies have found that near-infrared spectroscopy (NIRS) can detect the chemical composition of certain materials, including compounds associated with flavor, so it may also be applicable to coffee powder. The aim of this research is to obtain the correlation between the NIRS spectrum and the cupping score assigned by a tester, and then to examine the possibility of sensory coffee-taste testing using the NIRS spectrum. The coffee samples were taken from various places, altitudes, and postharvest handling methods, and were prepared following the SCAA protocol; sensory analysis was done in two ways, with an expert tester and with the NIRS test. Calibration between the two showed that, without pretreatment, partial least squares (PLS) gave a cross-validation RMSE of 6.14; with Multiplicative Scatter Correction of the spectra, the cross-validation RMSE was 5.43; and the best cross-validation RMSE, 1.73, was achieved with de-trending correction. NIRS can therefore be used to predict the cupping score.
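Of the spectral pretreatments compared in this record, Multiplicative Scatter Correction (MSC) is the least self-explanatory: each spectrum is regressed against the mean spectrum, then shifted and rescaled by the fitted intercept and slope. A minimal sketch with tiny made-up "spectra" (real NIR spectra have hundreds of wavelengths):

```python
# Sketch of Multiplicative Scatter Correction (MSC), one of the spectral
# pretreatments compared in the abstract. The two 3-point "spectra" below
# are invented; the second is the first with multiplicative/additive scatter.

def msc(spectra):
    """Regress each spectrum on the mean spectrum (x ~ a*m + b) and
    return the corrected spectra (x - b) / a."""
    n_pts = len(spectra[0])
    mean = [sum(s[i] for s in spectra) / len(spectra) for i in range(n_pts)]
    m_bar = sum(mean) / n_pts
    corrected = []
    for s in spectra:
        s_bar = sum(s) / n_pts
        # ordinary least-squares slope and intercept of s against the mean
        a = (sum((mean[i] - m_bar) * (s[i] - s_bar) for i in range(n_pts))
             / sum((mean[i] - m_bar) ** 2 for i in range(n_pts)))
        b = s_bar - a * m_bar
        corrected.append([(v - b) / a for v in s])
    return corrected

spectra = [[1.0, 2.0, 3.0], [2.1, 4.1, 6.1]]  # second = 2*first + 0.1
corrected = msc(spectra)
print([round(v, 2) for v in corrected[1]])  # [1.55, 3.05, 4.55]
```

After correction both rows collapse onto the mean spectrum, which is exactly the scatter removal that improved the cross-validation RMSE from 6.14 to 5.43 in the study.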

  8. An Immersed Boundary Method for Solving the Compressible Navier-Stokes Equations with Fluid Structure Interaction

    NASA Technical Reports Server (NTRS)

    Brehm, Christoph; Barad, Michael F.; Kiris, Cetin C.

    2016-01-01

    An immersed boundary method for the compressible Navier-Stokes equations and the additional infrastructure needed to solve moving boundary problems and fully coupled fluid-structure interaction is described. All the methods described in this paper were implemented in NASA's LAVA solver framework. The underlying immersed boundary method is based on the locally stabilized immersed boundary method previously introduced by the authors. In the present paper this method is extended to account for all aspects involved in fluid-structure interaction simulations, such as fast geometry queries and stencil computations, the treatment of freshly cleared cells, and the coupling of the computational fluid dynamics solver with a linear structural finite element method. The current approach is validated for moving boundary problems with prescribed body motion and fully coupled fluid-structure interaction problems in 2D and 3D. As part of the validation procedure, results from the second AIAA aeroelastic prediction workshop are also presented. The current paper is regarded as a proof-of-concept study, while more advanced methods for fluid-structure interaction are currently being investigated, such as geometric and material nonlinearities and advanced coupling approaches.

  9. Verification on the use of the Inoue method for precisely determining glomerular filtration rate in Philippine pediatrics

    NASA Astrophysics Data System (ADS)

    Magcase, M. J. D. J.; Duyan, A. Q.; Carpio, J.; Carbonell, C. A.; Trono, J. D.

    2015-06-01

    The objective of this study is to validate the Inoue method so that it would be the preferential choice for determining glomerular filtration rate (GFR) in Philippine pediatrics. The study consisted of 36 patients ranging in age from 2 months to 19 years. The subjects used were those who had previously undergone the in-vitro method. The scintigrams from the in-vitro method were obtained and processed for split percentage uptake and for the parameters needed to obtain the Inoue GFR. The results of this paper show that the Inoue GFR correlates with the in-vitro method (r = 0.926). Thus, the Inoue method is a viable, simple, and practical technique for determining GFR in pediatric patients.

  10. Quantitative Method for Simultaneous Analysis of Acetaminophen and 6 Metabolites.

    PubMed

    Lammers, Laureen A; Achterbergh, Roos; Pistorius, Marcel C M; Romijn, Johannes A; Mathôt, Ron A A

    2017-04-01

    Hepatotoxicity after ingestion of high-dose acetaminophen [N-acetyl-para-aminophenol (APAP)] is caused by the metabolites of the drug. To gain more insight into factors influencing susceptibility to APAP hepatotoxicity, quantification of APAP and metabolites is important. A few methods have been developed to simultaneously quantify APAP and its most important metabolites. However, these methods require comprehensive sample preparation and long run times. The aim of this study was to develop and validate a simplified but sensitive method for the simultaneous quantification of acetaminophen, the main metabolites acetaminophen glucuronide and acetaminophen sulfate, and 4 cytochrome P450-mediated metabolites by using liquid chromatography with mass spectrometric (LC-MS) detection. The method was developed and validated for human plasma, and it entailed a single method for sample preparation, enabling quick processing of the samples, followed by an LC-MS method with a chromatographic run time of 9 minutes. The method was validated for selectivity, linearity, accuracy, imprecision, dilution integrity, recovery, process efficiency, ionization efficiency, and carryover effect. The method showed good selectivity without matrix interferences. For all analytes, the mean process efficiency was >86%, and the mean ionization efficiency was >94%. Furthermore, the accuracy was between 90.3% and 112% for all analytes, and the within- and between-run imprecision were <20% for the lower limit of quantification and <14.3% for the middle level and upper limit of quantification. The method presented here enables the simultaneous quantification of APAP and 6 of its metabolites. It is less time consuming than previously reported methods because it requires only a single and simple method for the sample preparation, followed by an LC-MS method with a short run time. Therefore, this analytical method is useful for both clinical and research purposes.

  11. Archeointensity estimates of a tenth-century kiln: first application of the Tsunakawa-Shaw paleointensity method to archeological relics

    NASA Astrophysics Data System (ADS)

    Kitahara, Yu; Yamamoto, Yuhji; Ohno, Masao; Kuwahara, Yoshihiro; Kameda, Shuichi; Hatakeyama, Tadahiro

    2018-05-01

    Paleomagnetic information reconstructed from archeological materials can be utilized to estimate the archeological age of excavated relics, in addition to revealing the geomagnetic secular variation and core dynamics. The direction and intensity of the Earth's magnetic field (archeodirection and archeointensity) can be ascertained using different methods, many of which have been proposed over the past decade. Among the new experimental techniques for archeointensity estimates is the Tsunakawa-Shaw method. This study demonstrates the validity of the Tsunakawa-Shaw method to reconstruct archeointensity from samples of baked clay from archeological relics. The validity of the approach was tested by comparison with the IZZI-Thellier method. The intensity values obtained coincided at the standard deviation (1σ) level. A total of 8 specimens for the Tsunakawa-Shaw method and 16 specimens for the IZZI-Thellier method, from 8 baked clay blocks collected from the surface of the kiln, were used in these experiments. Among them, 8 specimens (for the Tsunakawa-Shaw method) and 3 specimens (for the IZZI-Thellier method) passed a set of strict selection criteria used in the final evaluation of validity. Additionally, we performed rock magnetic experiments, mineral analysis, and paleodirection measurement to evaluate the suitability of the baked clay samples for paleointensity experiments, and hence confirmed that the sample properties were ideal for performing paleointensity experiments. It is notable that the newly estimated archeomagnetic intensity values are lower than those in previous studies that used other paleointensity methods for the tenth century in Japan.

  12. Design, calibration and validation of a novel 3D printed instrumented spatial linkage that measures changes in the rotational axes of the tibiofemoral joint.

    PubMed

    Bonny, Daniel P; Hull, M L; Howell, S M

    2014-01-01

    An accurate axis-finding technique is required to measure any changes from normal caused by total knee arthroplasty in the flexion-extension (F-E) and longitudinal rotation (LR) axes of the tibiofemoral joint. In a previous paper, we computationally determined how best to design and use an instrumented spatial linkage (ISL) to locate the F-E and LR axes such that rotational and translational errors were minimized. However, the ISL was not built and consequently was not calibrated; thus the errors in locating these axes were not quantified on an actual ISL. Moreover, previous methods to calibrate an ISL used calibration devices with accuracies that were either undocumented or insufficient for the device to serve as a gold-standard. Accordingly, the objectives were to (1) construct an ISL using the previously established guidelines, (2) calibrate the ISL using an improved method, and (3) quantify the error in measuring changes in the F-E and LR axes. A 3D printed ISL was constructed and calibrated using a coordinate measuring machine, which served as a gold standard. Validation was performed using a fixture that represented the tibiofemoral joint with an adjustable F-E axis and the errors in measuring changes to the positions and orientations of the F-E and LR axes were quantified. The resulting root mean squared errors (RMSEs) of the calibration residuals using the new calibration method were 0.24, 0.33, and 0.15 mm for the anterior-posterior, medial-lateral, and proximal-distal positions, respectively, and 0.11, 0.10, and 0.09 deg for varus-valgus, flexion-extension, and internal-external orientations, respectively. All RMSEs were below 0.29% of the respective full-scale range. When measuring changes to the F-E or LR axes, each orientation error was below 0.5 deg; when measuring changes in the F-E axis, each position error was below 1.0 mm. The largest position RMSE was when measuring a medial-lateral change in the LR axis (1.2 mm). 
Despite the large size of the ISL, these calibration residuals were better than those for previously published ISLs, particularly when measuring orientations, indicating that using a more accurate gold standard was beneficial in limiting the calibration residuals. The validation method demonstrated that this ISL is capable of accurately measuring clinically important changes (i.e. 1 mm and 1 deg) in the F-E and LR axes.

  13. Designing, optimization and validation of tetra-primer ARMS PCR protocol for genotyping mutations in caprine Fec genes

    PubMed Central

    Ahlawat, Sonika; Sharma, Rekha; Maitra, A.; Roy, Manoranjan; Tantia, M.S.

    2014-01-01

    New, quick, and inexpensive methods for genotyping novel caprine Fec gene polymorphisms through tetra-primer ARMS PCR were developed in the present investigation. Single nucleotide polymorphism (SNP) genotyping needs to be undertaken to establish associations between the identified mutations and traits of economic importance. In the current study, we have successfully genotyped three new SNPs identified in caprine fecundity genes, viz. T(-242)C (BMPR1B), G1189A (GDF9), and G735A (BMP15). The tetra-primer ARMS PCR protocol was optimized and validated for these SNPs with a short turn-around time and low cost. The optimized techniques were tested on 158 random samples of the Black Bengal goat breed. Samples with known genotypes for the described genes, previously tested in duplicate using sequencing methods, were employed for validation of the assay. Upon validation, complete concordance was observed between the tetra-primer ARMS PCR assays and the sequencing results. These results highlight the ability of tetra-primer ARMS PCR for genotyping mutations in Fec genes. Any associated SNP could be used to accelerate the improvement of goat reproductive traits by identifying highly prolific animals at an early stage of life. Our results provide direct evidence that tetra-primer ARMS-PCR is a rapid, reliable, and cost-effective method for SNP genotyping of mutations in caprine Fec genes. PMID:25606428

  14. Development and validation of a reversed phase liquid chromatographic method for analysis of oxytetracycline and related impurities.

    PubMed

    Kahsay, Getu; Shraim, Fairouz; Villatte, Philippe; Rotger, Jacques; Cassus-Coussère, Céline; Van Schepdael, Ann; Hoogmartens, Jos; Adams, Erwin

    2013-03-05

    A simple, robust and fast high-performance liquid chromatographic method is described for the analysis of oxytetracycline and its related impurities. The principal peak and impurities are all baseline separated in 20 min using an Inertsil C₈ (150 mm × 4.6 mm, 5 μm) column kept at 50 °C. The mobile phase consists of a gradient mixture of mobile phases A (0.05% trifluoroacetic acid in water) and B (acetonitrile-methanol-tetrahydrofuran, 80:15:5, v/v/v) pumped at a flow rate of 1.3 ml/min. UV detection was performed at 254 nm. The developed method was validated for its robustness, sensitivity, precision and linearity in the range from limit of quantification (LOQ) to 120%. The limits of detection (LOD) and LOQ were found to be 0.08 μg/ml and 0.32 μg/ml, respectively. This method allows the separation of oxytetracycline from all known and 5 unknown impurities, which is better than previously reported in the literature. Moreover, the simple mobile phase composition devoid of non-volatile buffers made the method suitable to interface with mass spectrometry for further characterization of unknown impurities. The developed method has been applied for determination of related substances in oxytetracycline bulk samples available from four manufacturers. The validation results demonstrate that the method is reliable for quantification of oxytetracycline and its impurities.

  15. Concurrent fNIRS-fMRI measurement to validate a method for separating deep and shallow fNIRS signals by using multidistance optodes

    PubMed Central

    Funane, Tsukasa; Sato, Hiroki; Yahata, Noriaki; Takizawa, Ryu; Nishimura, Yukika; Kinoshita, Akihide; Katura, Takusige; Atsumori, Hirokazu; Fukuda, Masato; Kasai, Kiyoto; Koizumi, Hideaki; Kiguchi, Masashi

    2015-01-01

    It has been reported that a functional near-infrared spectroscopy (fNIRS) signal can be contaminated by extracerebral contributions. Many algorithms using multidistance separations to address this issue have been proposed, but their spatial separation performance has rarely been validated with simultaneous measurements of fNIRS and functional magnetic resonance imaging (fMRI). We previously proposed a method for discriminating between deep and shallow contributions in fNIRS signals, referred to as the multidistance independent component analysis (MD-ICA) method. In this study, to validate the MD-ICA method from the spatial aspect, multidistance fNIRS, fMRI, and laser-Doppler-flowmetry signals were simultaneously obtained for 12 healthy adult males during three tasks. The fNIRS signal was separated into deep and shallow signals by using the MD-ICA method, and the correlation between the waveforms of the separated fNIRS signals and the gray matter blood oxygenation level-dependent signals was analyzed. A three-way analysis of variance (signal depth × Hb kind × task) indicated that the main effect of fNIRS signal depth on the correlation is significant [F(1,1286) = 5.34, p < 0.05]. This result indicates that the MD-ICA method successfully separates fNIRS signals into spatially deep and shallow signals, and the accuracy and reliability of the fNIRS signal will be improved with the method. PMID:26157983

  16. Validity and power of association testing in family-based sampling designs: evidence for and against the common wisdom.

    PubMed

    Knight, Stacey; Camp, Nicola J

    2011-04-01

    Current common wisdom posits that association analyses using family-based designs have inflated type 1 error rates (if relationships are ignored) and that independent controls are more powerful than familial controls. We explore these suppositions. We show theoretically that family-based designs can have deflated type 1 error rates. Through simulation, we examine the validity and power of family designs for several scenarios: cases from randomly or selectively ascertained pedigrees; and familial or independent controls. Family structures considered are as follows: sibships, nuclear families, moderate-sized and extended pedigrees. Three methods were considered with the χ² test for trend: variance correction (VC), weighted (weights assigned to account for genetic similarity), and naïve (ignoring relatedness), as well as the Modified Quasi-likelihood Score (MQLS) test. Selectively ascertained pedigrees had similar levels of disease enrichment; random ascertainment had no such restriction. Data for 1,000 cases and 1,000 controls were created under the null and alternate models. The VC and MQLS methods were always valid. The naïve method was anti-conservative if independent controls were used and valid or conservative in designs with familial controls. The weighted association method was generally valid for independent controls, and was conservative for familial controls. With regard to power, independent controls were more powerful for small-to-moderate selectively ascertained pedigrees, but familial and independent controls were equivalent in the extended pedigrees and familial controls were consistently more powerful for all randomly ascertained pedigrees. These results suggest a more complex situation than previously assumed, which has important implications for study design and analysis.
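The naïve method in this record is essentially the unadjusted Cochran-Armitage χ² test for trend, whose variance term assumes independent subjects; the VC and MQLS approaches replace that variance with one that accounts for relatedness. A sketch of the unadjusted statistic, with invented genotype counts:

```python
# Sketch of the naive chi-square test for trend (Cochran-Armitage), the
# baseline that ignores relatedness; VC and MQLS adjust its variance term.
# The genotype counts below are invented for illustration.

def trend_chi2(cases, controls, weights=(0, 1, 2)):
    """1-df chi-square trend statistic from per-genotype case/control counts,
    assuming independent (unrelated) subjects."""
    n_i = [c + d for c, d in zip(cases, controls)]
    N, R = sum(n_i), sum(cases)
    # score: weighted deviation of observed case counts from expectation
    u = sum(w * (r - n * R / N) for w, r, n in zip(weights, cases, n_i))
    # variance of the score under the null, assuming independence
    var = (R / N) * (1 - R / N) * (
        sum(w * w * n for w, n in zip(weights, n_i))
        - sum(w * n for w, n in zip(weights, n_i)) ** 2 / N)
    return u * u / var

# case/control counts for genotypes (aa, Aa, AA); 1,000 of each as in the study
stat = trend_chi2(cases=[200, 500, 300], controls=[300, 500, 200])
print(round(stat, 2))  # 40.0
```

In a family-based sample the independence assumption behind `var` fails, which is exactly why the statistic becomes anti-conservative with independent controls and conservative with familial ones.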

  17. Shock compression response of cold-rolled Ni/Al multilayer composites

    NASA Astrophysics Data System (ADS)

    Specht, Paul E.; Weihs, Timothy P.; Thadhani, Naresh N.

    2017-01-01

    Uniaxial strain, plate-on-plate impact experiments were performed on cold-rolled Ni/Al multilayer composites and the resulting Hugoniot was determined through time-resolved measurements combined with impedance matching. The experimental Hugoniot agreed with that previously predicted by two dimensional (2D) meso-scale calculations [Specht et al., J. Appl. Phys. 111, 073527 (2012)]. Additional 2D meso-scale simulations were performed using the same computational method as the prior study to reproduce the experimentally measured free surface velocities and stress profiles. These simulations accurately replicated the experimental profiles, providing additional validation for the previous computational work.

  18. An IMU-to-Body Alignment Method Applied to Human Gait Analysis

    PubMed Central

    Vargas-Valencia, Laura Susana; Elias, Arlindo; Rocon, Eduardo; Bastos-Filho, Teodiano; Frizera, Anselmo

    2016-01-01

    This paper presents a novel calibration procedure as a simple, yet powerful, method to place and align inertial sensors with body segments. The calibration can be easily replicated without the need of any additional tools. The proposed method is validated in three different applications: a computer mathematical simulation; a simplified joint composed of two semi-spheres interconnected by a universal goniometer; and a real gait test with five able-bodied subjects. Simulation results demonstrate that, after the calibration method is applied, the joint angles are correctly measured independently of previous sensor placement on the joint, thus validating the proposed procedure. In the cases of a simplified joint and a real gait test with human volunteers, the method also performs correctly, although secondary plane errors appear when compared with the simulation results. We believe that such errors are caused by limitations of the current inertial measurement unit (IMU) technology and fusion algorithms. In conclusion, the presented calibration procedure is an interesting option to solve the alignment problem when using IMUs for gait analysis. PMID:27973406

  19. 3D-QSAR studies of some reversible Acetyl cholinesterase inhibitors based on CoMFA and ligand protein interaction fingerprints using PC-LS-SVM and PLS-LS-SVM.

    PubMed

    Ghafouri, Hamidreza; Ranjbar, Mohsen; Sakhteman, Amirhossein

    2017-08-01

    A great challenge in medicinal chemistry is to develop methods for structural design based on the pattern of previously synthesized compounds. In this study, two different QSAR methods were established and compared for a series of piperidine acetylcholinesterase inhibitors. In one novel approach, PC-LS-SVM and PLS-LS-SVM were used for modeling 3D interaction descriptors; in the other, the same nonlinear techniques were used to build QSAR equations based on field descriptors. Different validation methods were used to evaluate the models, and the results revealed the greater applicability and predictive ability of the model generated by field descriptors (Q² (LOO-CV) = 1, R² (ext) = 0.97). External validation criteria revealed that both methods can be used to generate reasonable QSAR models. It was concluded that, owing to the ability of interaction descriptors to predict binding mode, this approach could be implemented in future 3D-QSAR software.

  20. Quantitative measurement of intact alpha-synuclein proteoforms from post-mortem control and Parkinson's disease brain tissue by intact protein mass spectrometry.

    PubMed

    Kellie, John F; Higgs, Richard E; Ryder, John W; Major, Anthony; Beach, Thomas G; Adler, Charles H; Merchant, Kalpana; Knierman, Michael D

    2014-07-23

    A robust top-down proteomics method is presented for profiling alpha-synuclein species from autopsied human frontal cortex brain tissue from Parkinson's disease cases and controls. The method was used to test the hypothesis that pathology-associated brain tissue has a different profile of post-translationally modified alpha-synuclein than control samples. The sample processing steps, mass spectrometry-based measurements, and data processing steps were validated. The intact protein quantitation method features extraction and integration of m/z data from each charge state of a detected alpha-synuclein species, and fitting of the data to a simple linear model that accounts for concentration and charge-state variability. The quantitation method was validated with serial dilutions of intact protein standards. Using the method on the human brain samples, several previously unreported modifications of alpha-synuclein were identified. Low levels of phosphorylated alpha-synuclein were detected in brain tissue fractions enriched for Lewy body pathology and were marginally significant between PD cases and controls (p = 0.03).

  1. A validated solid-liquid extraction method for the HPLC determination of polyphenols in apple tissues: comparison with pressurised liquid extraction.

    PubMed

    Alonso-Salces, Rosa M; Barranco, Alejandro; Corta, Edurne; Berrueta, Luis A; Gallo, Blanca; Vicente, Francisca

    2005-02-15

    A solid-liquid extraction procedure followed by reversed-phase high-performance liquid chromatography (RP-HPLC) with photodiode array detection (DAD) for the determination of polyphenols in freeze-dried apple peel and pulp is reported. The extraction step consists of sonicating 0.5 g of freeze-dried apple tissue with 30 mL of methanol-water-acetic acid (30:69:1, v/v/v) containing 2 g of ascorbic acid/L for 10 min in an ultrasonic bath. The whole method was validated, showing that it is robust, with high extraction efficiencies (peel: >91%, pulp: >95%) and appropriate precision (within-day: R.S.D. (n = 5) <5%; between-day: R.S.D. (n = 5) <7%) at the different concentration levels of polyphenols that can be found in apple samples. The method was compared with a previously published one consisting of pressurized liquid extraction (PLE) followed by RP-HPLC-DAD determination. The advantages and disadvantages of both methods are discussed.

  2. In situ and online monitoring of hydrodynamic flow profiles in microfluidic channels based upon microelectrochemistry: concept, theory, and validation.

    PubMed

    Amatore, Christian; Oleinick, Alexander; Klymenko, Oleksiy V; Svir, Irina

    2005-08-12

    Herein, we propose a method for reconstructing any plausible macroscopic hydrodynamic flow profile occurring locally within a rectangular microfluidic channel. The method is based on experimental currents measured at single or double microband electrodes embedded in one channel wall. A perfectly adequate quasiconformal mapping of spatial coordinates introduced in our previous work [Electrochem. Commun. 2004, 6, 1123] and an exponentially expanding time grid initially proposed in [J. Electroanal. Chem. 2003, 557, 75] are used, in conjunction with the solution of the corresponding variational problem by the Ritz method, for the numerical reconstruction of flow profiles. The concept of the method is presented and developed theoretically, and its validity is tested using pseudoexperimental currents emulated by simulation of the diffusion-convection problem in a channel flow cell, to which random Gaussian current noise is added. The flow profiles reconstructed by our method compare successfully with those introduced a priori into the simulations, even when these include significant distortions relative to either classical Poiseuille or electro-osmotic flows.

  3. Improving validation methods for molecular diagnostics: application of Bland-Altman, Deming and simple linear regression analyses in assay comparison and evaluation for next-generation sequencing.

    PubMed

    Misyura, Maksym; Sukhai, Mahadeo A; Kulasignam, Vathany; Zhang, Tong; Kamel-Reid, Suzanne; Stockley, Tracy L

    2018-02-01

    A standard approach in test evaluation is to compare results of the assay under validation to results from previously validated methods. For quantitative molecular diagnostic assays, comparison of test values is often performed using simple linear regression and the coefficient of determination (R²), with R² as the primary metric of assay agreement. However, R² alone does not adequately quantify the constant or proportional errors required for optimal test evaluation. More extensive statistical approaches, such as Bland-Altman analysis and expanded interpretation of linear regression methods, can be used to more thoroughly compare data from quantitative molecular assays. We present the application of Bland-Altman and linear regression statistical methods to evaluate quantitative outputs from next-generation sequencing (NGS) assays. NGS-derived data sets from assay validation experiments were used to demonstrate the utility of the statistical methods. Both Bland-Altman and linear regression were able to detect the presence and magnitude of constant and proportional error in quantitative values of NGS data. Deming linear regression was used in the context of assay comparison studies, while simple linear regression was used to analyse serial dilution data. The Bland-Altman approach was also adapted to quantify assay accuracy, including constant and proportional errors, and precision where theoretical and empirical values were known. The complementary application of the statistical methods described in this manuscript enables more extensive evaluation of the performance characteristics of quantitative molecular assays prior to implementation in the clinical molecular laboratory. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
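    The error decomposition this record describes can be sketched numerically: the Bland-Altman bias and limits of agreement come from the paired differences, while an ordinary least squares fit separates constant error (intercept) from proportional error (slope deviating from 1). The data below are hypothetical, not from the study.

```python
import numpy as np

# Hypothetical paired values from a previously validated assay (reference)
# and the assay under validation (candidate), e.g. variant allele fractions.
reference = np.array([5.0, 10.0, 20.0, 30.0, 40.0, 50.0])
candidate = np.array([5.8, 10.9, 21.2, 31.5, 41.1, 52.0])

# Bland-Altman: mean bias and 95% limits of agreement on paired differences.
diff = candidate - reference
bias = diff.mean()
sd = diff.std(ddof=1)
limits = (bias - 1.96 * sd, bias + 1.96 * sd)

# Simple linear regression: the intercept estimates constant error,
# the slope's departure from 1 estimates proportional error.
slope, intercept = np.polyfit(reference, candidate, 1)

print(f"bias = {bias:.2f}, limits of agreement = ({limits[0]:.2f}, {limits[1]:.2f})")
print(f"slope = {slope:.3f}, intercept = {intercept:.3f}")
```

    A slope near 1 with a nonzero intercept indicates purely constant error; a slope away from 1 indicates proportional error, which R² alone would not reveal.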

  4. Validation of holistic nursing competencies: role-delineation study, 2012.

    PubMed

    Erickson, Helen Lorraine; Erickson, Margaret Elizabeth; Campbell, Joan A; Brekke, Mary E; Sandor, M Kay

    2013-12-01

    The American Holistic Nurses Credentialing Corporation (AHNCC), the certifying body for nurses practicing within the precepts of holistic nursing, uses a systematic process to guide program development. A previous publication described its early work distinguishing basic and advanced holistic nursing and the development of related examinations. A more recent publication described the work of the AHNCC from 2004 to 2012, including a role-delineation study (RDS) undertaken to identify and validate competencies currently used by holistic nurses. A final report describes the RDS design, methods, and raw data. This article discusses the AHNCC's goals for undertaking the 2012 Holistic Nursing RDS and the implications for the certification programs.

  5. Validation of thermal effects of LED package by using Elmer finite element simulation method

    NASA Astrophysics Data System (ADS)

    Leng, Lai Siang; Retnasamy, Vithyacharan; Mohamad Shahimin, Mukhzeer; Sauli, Zaliman; Taniselass, Steven; Bin Ab Aziz, Muhamad Hafiz; Vairavan, Rajendaran; Kirtsaeng, Supap

    2017-02-01

    The overall performance of a light-emitting diode (LED) package is critically affected by heat. In this study, the open-source software Elmer FEM was used for thermal analysis of the LED package. To perform a complete simulation study, the Salome and ParaView packages were used as pre- and post-processors, and the thermal behaviour of the LED package was evaluated. The result was validated against a commercially licensed software package based on previous work. The difference between the two simulation results is less than 5%, which is tolerable and comparable.

  6. Endogenous Reference Genes and Their Quantitative Real-Time PCR Assays for Genetically Modified Bread Wheat (Triticum aestivum L.) Detection.

    PubMed

    Yang, Litao; Quan, Sheng; Zhang, Dabing

    2017-01-01

    Endogenous reference genes (ERGs) and their derived analytical methods are standard requirements for the analysis of genetically modified organisms (GMOs). Development and validation of suitable ERGs is the primary step in establishing assays that monitor genetically modified (GM) content in food/feed samples. Herein, we review the ERGs currently used for GM wheat analysis, such as ACC1, PKABA1, ALMT1, and Waxy-D1, and their performance in GM wheat analysis. We also discuss a model for developing and validating an ideal ERG for a plant species, based on our previous research.

  7. Validity and reliability of a scale to measure genital body image.

    PubMed

    Zielinski, Ruth E; Kane-Low, Lisa; Miller, Janis M; Sampselle, Carolyn

    2012-01-01

    Women's body image dissatisfaction extends to body parts usually hidden from view: their genitals. The ability to measure genital body image is limited by the lack of valid and reliable questionnaires. We subjected a previously developed questionnaire, the Genital Self-Image Scale (GSIS), to psychometric testing using a variety of methods. Five experts determined the content validity of the scale. Then, using four participant groups, factor analysis was performed to determine construct validity and to identify factors. Further construct validity was established using the contrasting-groups approach. Internal consistency and test-retest reliability were determined. Twenty-one of 29 items were considered content valid. Two items were added based on expert suggestions. Factor analysis yielded four factors, identified as Genital Confidence, Appeal, Function, and Comfort. The revised scale (GSIS-20) included 20 items explaining 59.4% of the variance. Women indicating an interest in genital cosmetic surgery exhibited significantly lower scores on the GSIS-20 than those who did not. The final 20-item scale exhibited internal reliability across all sample groups as well as test-retest reliability. The GSIS-20 provides a measure of genital body image demonstrating reliability and validity across several populations of women.

  8. Validation of Laser-Induced Fluorescent Photogrammetric Targets on Membrane Structures

    NASA Technical Reports Server (NTRS)

    Jones, Thomas W.; Dorrington, Adrian A.; Shortis, Mark R.; Hendricks, Aron R.

    2004-01-01

    The need for static and dynamic characterization of a new generation of inflatable space structures requires the advancement of classical metrology techniques. A new photogrammetric-based method for non-contact ranging and surface profiling has been developed at NASA Langley Research Center (LaRC) to support modal analyses and structural validation of this class of space structures. This full field measurement method, known as Laser-Induced Fluorescence (LIF) photogrammetry, has previously yielded promising experimental results. However, data indicating the achievable measurement precision had not been published. This paper provides experimental results that indicate the LIF-photogrammetry measurement precision for three different target types used on a reflective membrane structure. The target types were: (1) non-contact targets generated using LIF, (2) surface attached retro-reflective targets, and (3) surface attached diffuse targets. Results from both static and dynamic investigations are included.

  9. Validation of the analytical methods in the LWR code BOXER for gadolinium-loaded fuel pins

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paratte, J.M.; Arkuszewski, J.J.; Kamboj, B.K.

    1990-01-01

    Due to the very high absorption occurring in gadolinium-loaded fuel pins, calculations of lattices with such pins present are a demanding test of the analysis methods in light water reactor (LWR) cell and assembly codes. Considerable effort has, therefore, been devoted to the validation of code methods for gadolinia fuel. The goal of the work reported in this paper is to check the analysis methods in the LWR cell/assembly code BOXER and its associated cross-section processing code ETOBOX, by comparison of BOXER results with those from a very accurate Monte Carlo calculation for a gadolinium benchmark problem. Initial results of such a comparison have been previously reported. However, the Monte Carlo calculations, done with the MCNP code, were performed at Los Alamos National Laboratory using ENDF/B-V data, while the BOXER calculations were performed at the Paul Scherrer Institute using JEF-1 nuclear data. This difference in the basic nuclear data used for the two calculations, caused by the restricted nature of these evaluated data files, led to associated uncertainties in a comparison of the results for methods validation. In the joint investigations at the Georgia Institute of Technology and PSI, such uncertainty in this comparison was eliminated by using ENDF/B-V data for BOXER calculations at Georgia Tech.

  10. Lattice hydrodynamic model based traffic control: A transportation cyber-physical system approach

    NASA Astrophysics Data System (ADS)

    Liu, Hui; Sun, Dihua; Liu, Weining

    2016-11-01

    The lattice hydrodynamic model is a typical continuum traffic flow model, which properly describes the jamming transition of traffic flow. Previous studies of the lattice hydrodynamic model have shown that control methods have the potential to improve traffic conditions. In this paper, a new control method is applied to the lattice hydrodynamic model from a transportation cyber-physical system approach, in which only one lattice site needs to be controlled. Simulation verifies the feasibility and validity of this method, which can ensure the efficient and smooth operation of traffic flow.

  11. Effectively identifying regulatory hotspots while capturing expression heterogeneity in gene expression studies

    PubMed Central

    2014-01-01

    Expression quantitative trait loci (eQTL) mapping is a tool that can systematically identify genetic variation affecting gene expression. eQTL mapping studies have shown that certain genomic locations, referred to as regulatory hotspots, may affect the expression levels of many genes. Recently, studies have shown that various confounding factors may induce spurious regulatory hotspots. Here, we introduce a novel statistical method that effectively eliminates spurious hotspots while retaining genuine hotspots. Applied to simulated and real datasets, we validate that our method achieves greater sensitivity while retaining low false discovery rates compared to previous methods. PMID:24708878

  12. A systematic review of the quality of homeopathic clinical trials

    PubMed Central

    Jonas, Wayne B; Anderson, Rachel L; Crawford, Cindy C; Lyons, John S

    2001-01-01

    Background While a number of reviews of homeopathic clinical trials have been done, all have used methods dependent on allopathic diagnostic classifications foreign to homeopathic practice. In addition, no review has used established and validated quality criteria allowing direct comparison of the allopathic and homeopathic literature. Methods In a systematic review, we compared the quality of clinical-trial research in homeopathy to a sample of research on conventional therapies using a validated and system-neutral approach. All clinical trials on homeopathic treatments with parallel treatment groups published between 1945 and 1995 in English were selected. All were evaluated with an established set of 33 validity criteria previously validated on a broad range of health interventions across differing medical systems. Criteria covered statistical-conclusion, internal, construct and external validity. Reliability of criteria application is greater than 0.95. Results 59 studies met the inclusion criteria. Of these, 79% were from peer-reviewed journals, 29% used a placebo control, 51% used random assignment, and 86% failed to consider potentially confounding variables. The main validity problems were in measurement, where 96% did not report the proportion of subjects screened and 64% did not report attrition rate. 17% of subjects dropped out in studies where this was reported. There was practically no replication of or overlap in the conditions studied, and most studies were relatively small and done at a single site. Compared to research on conventional therapies, the overall quality of studies in homeopathy was worse and only slightly improved in more recent years. Conclusions Clinical homeopathic research is clearly in its infancy, with most studies using poor sampling and measurement techniques, few subjects, single sites and no replication. Many of these problems are correctable even within a "holistic" paradigm given sufficient research expertise, support and methods. PMID:11801202

  13. Validation of the Saskatoon Falls Prevention Consortium's Falls Screening and Referral Algorithm

    PubMed Central

    Lawson, Sara Nicole; Zaluski, Neal; Petrie, Amanda; Arnold, Cathy; Basran, Jenny

    2013-01-01

    ABSTRACT Purpose: To investigate the concurrent validity of the Saskatoon Falls Prevention Consortium's Falls Screening and Referral Algorithm (FSRA). Method: A total of 29 older adults (mean age 77.7 [SD 4.0] y) residing in an independent-living seniors' complex who met inclusion criteria completed a demographic questionnaire and the components of the FSRA and the Berg Balance Scale (BBS). The FSRA consists of the Elderly Fall Screening Test (EFST) and the Multi-factor Falls Questionnaire (MFQ); it is designed to categorize individuals into low, moderate, or high fall-risk categories to determine appropriate management pathways. A predictive model for probability of fall risk, based on previous research, was used to determine the concurrent validity of the FSRA. Results: The FSRA placed 79% of participants into the low-risk category, whereas the predictive model found the probability of fall risk to range from 0.04 to 0.74, with a mean of 0.35 (SD 0.25). No statistically significant correlation was found between the FSRA and the predictive model for probability of fall risk (Spearman's ρ=0.35, p=0.06). Conclusion: The FSRA lacks concurrent validity relative to a previously established model of fall risk and appears to over-categorize individuals into the low-risk group. Further research on the FSRA as an adequate tool to screen community-dwelling older adults for fall risk is recommended. PMID:24381379
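    The concurrent-validity analysis in this record reduces to a rank correlation between FSRA risk categories and model-predicted fall-risk probabilities; a minimal sketch with SciPy follows (the data are invented for illustration, not the study's):

```python
from scipy.stats import spearmanr

# Hypothetical participants: FSRA category (0 = low, 1 = moderate, 2 = high)
# and a model-predicted probability of falling for the same individuals.
fsra_category  = [0, 0, 1, 0, 2, 1, 0, 2, 1, 0]
predicted_risk = [0.10, 0.35, 0.30, 0.05, 0.70, 0.45, 0.20, 0.60, 0.25, 0.15]

# Spearman's rho handles the ordinal FSRA categories via ranks; a low,
# nonsignificant rho, as in the study, would argue against concurrent validity.
rho, p_value = spearmanr(fsra_category, predicted_risk)
print(f"Spearman's rho = {rho:.2f}, p = {p_value:.3f}")
```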

  14. Validation of Automated White Matter Hyperintensity Segmentation

    PubMed Central

    Smart, Sean D.; Firbank, Michael J.; O'Brien, John T.

    2011-01-01

    Introduction. White matter hyperintensities (WMHs) are a common finding on MRI scans of older people and are associated with vascular disease. We compared three methods for automatically segmenting WMHs from MRI scans. Method. An operator manually segmented WMHs on MRI images from a 3T scanner. The scans were also segmented in a fully automated fashion by three different programmes. The voxel overlap between manual and automated segmentation was compared. Results. The between-observer overlap ratio was 63%. Using our previously described in-house software, we obtained an overlap of 62.2%. We investigated the use of a modified version of SPM segmentation; however, this was not successful, with only 14% overlap. Discussion. Using our previously reported software, we demonstrated good segmentation of WMHs in a fully automated fashion. PMID:21904678
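    A voxel-overlap comparison of the kind this record reports can be sketched as a Dice coefficient on binary masks. The record does not specify its exact overlap ratio, so the Dice form and the tiny masks below are assumptions for illustration:

```python
import numpy as np

# Hypothetical binary WMH masks (1 = voxel labelled as hyperintensity).
manual    = np.array([[0, 1, 1, 0],
                      [0, 1, 1, 0],
                      [0, 0, 1, 0]])
automated = np.array([[0, 1, 0, 0],
                      [0, 1, 1, 0],
                      [0, 1, 1, 0]])

# Dice coefficient: 2*|A intersect B| / (|A| + |B|),
# a common voxel-overlap measure for segmentation agreement.
intersection = np.logical_and(manual, automated).sum()
dice = 2.0 * intersection / (manual.sum() + automated.sum())
print(f"voxel overlap (Dice) = {dice:.2f}")  # → 0.80
```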

  15. A voxel-based approach to gray matter asymmetries.

    PubMed

    Luders, E; Gaser, C; Jancke, L; Schlaug, G

    2004-06-01

    Voxel-based morphometry (VBM) was used to analyze gray matter (GM) asymmetries in a large sample (n = 60) of male and female professional musicians with and without absolute pitch (AP). We chose to examine these particular groups because previous studies using traditional region-of-interest (ROI) analyses have shown differences in hemispheric asymmetry related to AP and gender. Voxel-based methods may have advantages over traditional ROI-based methods since the analysis can be performed across the whole brain with minimal user bias. After determining that the VBM method was sufficiently sensitive for the detection of differences in GM asymmetries between groups, we found that male AP musicians were more leftward lateralized in the anterior region of the planum temporale (PT) than male non-AP musicians. This confirmed the results of previous studies using ROI-based methods that showed an association between PT asymmetry and the AP phenotype. We further observed that male non-AP musicians revealed an increased leftward GM asymmetry in the postcentral gyrus compared to female non-AP musicians, again corroborating results of a previously published study using ROI-based methods. By analyzing hemispheric GM differences across our entire sample, we were able to partially confirm findings of previous studies using traditional morphometric techniques, as well as more recent, voxel-based analyses. In addition, we found some unusually pronounced GM asymmetries in our musician sample not previously detected in subjects unselected for musical training. Since we were able to validate gender- and AP-related brain asymmetries previously described using traditional ROI-based morphometric techniques, the results of our analyses support the use of VBM for examinations of GM asymmetries.

  16. Setup of a Protocol of Molecular Diagnosis of β-Thalassemia Mutations in Tunisia using Denaturing High-Performance Liquid Chromatography (DHPLC).

    PubMed

    Sahli, Chaima Abdelhafidh; Ben Salem, Ikbel; Jouini, Latifa; Laouini, Naouel; Dabboubi, Rym; Hadj Fredj, Sondes; Siala, Hajer; Othmeni, Rym; Dakhlaoui, Boutheina; Fattoum, Slaheddine; Bibi, Amina; Messaoud, Taieb

    2016-09-01

    β-Thalassemia is one of the most prevalent autosomal recessive disorders worldwide. It presents great molecular heterogeneity, resulting from more than 200 causative mutations in the β-globin gene. In Tunisia, β-thalassemia is the most prevalent monogenic hemoglobin disorder, with a carrier rate of 2.21%. Efficient and reliable mutation-screening methods are essential in order to establish appropriate prevention programs for at-risk couples. The aim of the present study is to develop an efficient method based on denaturing high-performance liquid chromatography (DHPLC) in which the whole β-globin gene (HBB) is screened for mutations, covering about 90% of the mutation spectrum. We validated a DHPLC assay for direct genotyping of 11 known β-thalassemia mutations in the Tunisian population. The DHPLC assay was established based on the analysis of 62 archival β-thalassemia samples previously genotyped, then validated with full concordance on 50 blind, randomized samples previously genotyped by DNA sequencing, and with 96% consistency on 40 samples in a prospective study. Compared with other genotyping techniques, the DHPLC method can meet the requirements of direct genotyping of known β-thalassemia mutations in Tunisia and can be applied as a powerful tool for the genetic screening of prenatal and postnatal individuals. © 2016 Wiley Periodicals, Inc.

  17. Evaluation and comparison of predictive individual-level general surrogates.

    PubMed

    Gabriel, Erin E; Sachs, Michael C; Halloran, M Elizabeth

    2018-07-01

    An intermediate response measure that accurately predicts efficacy in a new setting at the individual level could be used both for prediction and personalized medical decisions. In this article, we define a predictive individual-level general surrogate (PIGS), which is an individual-level intermediate response that can be used to accurately predict individual efficacy in a new setting. While methods for evaluating trial-level general surrogates, which are predictors of trial-level efficacy, have been developed previously, few, if any, methods have been developed to evaluate individual-level general surrogates, and no methods have formalized the use of cross-validation to quantify the expected prediction error. Our proposed method uses existing methods of individual-level surrogate evaluation within a given clinical trial setting in combination with cross-validation over a set of clinical trials to evaluate surrogate quality and to estimate the absolute prediction error that is expected in a new trial setting when using a PIGS. Simulations show that our method performs well across a variety of scenarios. We use our method to evaluate and to compare candidate individual-level general surrogates over a set of multi-national trials of a pentavalent rotavirus vaccine.
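    The cross-validation idea in this record, fitting the surrogate-to-efficacy relationship on all trials but one, predicting the held-out trial, and averaging the absolute prediction errors, can be sketched as below. The linear working model and the trial-level data are hypothetical stand-ins, not the authors' estimator:

```python
import numpy as np

# Hypothetical per-trial summaries: mean surrogate response and observed
# efficacy in each of five trials.
surrogate = np.array([0.9, 1.4, 2.1, 2.8, 3.6])
efficacy  = np.array([0.22, 0.31, 0.45, 0.58, 0.74])

def loo_abs_error(x, y):
    """Leave-one-trial-out estimate of expected absolute prediction error."""
    errors = []
    for i in range(len(x)):
        mask = np.arange(len(x)) != i
        slope, intercept = np.polyfit(x[mask], y[mask], 1)  # fit without trial i
        pred = slope * x[i] + intercept                      # predict held-out trial
        errors.append(abs(pred - y[i]))
    return float(np.mean(errors))

print(f"expected absolute prediction error: {loo_abs_error(surrogate, efficacy):.4f}")
```

    A small held-out error across trials is what would qualify the intermediate response as a useful PIGS; a large error would flag it as setting-specific.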

  18. Uncertainty Footprint: Visualization of Nonuniform Behavior of Iterative Algorithms Applied to 4D Cell Tracking

    PubMed Central

    Wan, Y.; Hansen, C.

    2018-01-01

    Research on microscopy data from developing biological samples usually requires tracking individual cells over time. When cells are three-dimensionally and densely packed in a time-dependent scan of volumes, tracking results can become unreliable and uncertain. Not only are cell segmentation results often inaccurate to start with, but there is also no simple method to evaluate the tracking outcome. Previous cell tracking methods have been validated against benchmark data from real scans or artificial data, whose ground-truth results are established by manual work or simulation. However, the wide variety of real-world data makes an exhaustive validation impossible. Established cell tracking tools often fail on new data, whose issues are also difficult to diagnose with only manual examination. Therefore, data-independent tracking evaluation methods are desired for an explosion of microscopy data with increasing scale and resolution. In this paper, we propose the uncertainty footprint, an uncertainty quantification and visualization technique that examines nonuniformity at local convergence for an iterative evaluation process on a spatial domain supported by partially overlapping bases. We demonstrate that the patterns revealed by the uncertainty footprint indicate data processing quality in two algorithms from a typical cell tracking workflow: cell identification and association. A detailed analysis of the patterns further allows us to diagnose issues and design methods for improvements. A 4D cell tracking workflow equipped with the uncertainty footprint is capable of self-diagnosis and correction, achieving higher accuracy than previous methods whose evaluation is limited by manual examination. PMID:29456279

  19. Intercomparison of 3D pore-scale flow and solute transport simulation methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Xiaofan; Mehmani, Yashar; Perkins, William A.

    2016-09-01

    Multiple numerical approaches have been developed to simulate porous media fluid flow and solute transport at the pore scale. These include methods that (1) explicitly model the three-dimensional geometry of pore spaces and (2) conceptualize the pore space as a topologically consistent set of stylized pore bodies and pore throats. In previous work we validated a model of class 1, based on direct numerical simulation using computational fluid dynamics (CFD) codes, against magnetic resonance velocimetry (MRV) measurements of pore-scale velocities. Here we expand that validation to include additional models of class 1 based on the immersed-boundary method (IMB), the lattice Boltzmann method (LBM), and smoothed particle hydrodynamics (SPH), as well as a model of class 2 (a pore-network model, or PNM). The PNM approach used in the current study was recently improved and demonstrated to accurately simulate solute transport in a two-dimensional experiment. While the PNM approach is computationally much less demanding than direct numerical simulation methods, the effect on solute transport of conceptualizing complex three-dimensional pore geometries in the manner of PNMs has not been fully determined. We apply all four approaches (CFD, LBM, SPH and PNM) to simulate pore-scale velocity distributions and nonreactive solute transport, and intercompare the model results with previously reported experimental observations. Experimental observations are limited to measured pore-scale velocities, so solute transport comparisons are made only among the various models. Comparisons are drawn both in terms of macroscopic variables (e.g., permeability, solute breakthrough curves) and microscopic variables (e.g., local velocities and concentrations).

  20. [Epidemiological methods used in studies in the prevalence of Tourette syndrome].

    PubMed

    Stefanoff, Paweł; Mazurek, Jacek

    2003-01-01

    Tourette syndrome (TS) prevalence has been studied since the early 1980s. Its clinical course is characterised by the co-occurrence of motor and vocal tics. Results of previous epidemiological studies were surprisingly divergent: the prevalence varied from 0.5 to 115 cases per 10,000 population. The disease, previously recognised as extremely rare and severe, is now considered quite common, often with a moderate course. Selected methods used in studies of TS prevalence and an analysis of their possible impact on study results are presented. The studies were divided into three groups, studies of hospitalised populations, large-scale screenings, and studies involving school populations, according to the characteristics and size of the population, the methods of subject selection, and the diagnostic and screening methods used. Studies of hospitalised populations involved patients with the most severe symptoms, covered different age groups, and used different methods of final diagnosis confirmation; TS prevalence varied from 0.5 up to 15 cases per 10,000 population. Procedures used in large-scale screening studies made it possible to eliminate potential selection bias: large populations were studied using transparent and repeatable confirmation of diagnoses, whose validity was additionally checked in parallel validity studies; TS prevalence ranged from 4.3 to 10 cases per 10,000 population. The highest TS prevalence was obtained in studies involving schoolchildren, in which data were gathered from multiple sources (parents, teachers and children, as well as classroom observation) and diagnoses were made by experienced clinicians; TS prevalence in school population studies was between 36.2 and 115 per 10,000 population.

  1. Development and psychometric testing of a new instrument to measure factors influencing women's breast cancer prevention behaviors (ASSISTS).

    PubMed

    Khazaee-Pool, Maryam; Majlessi, Fereshteh; Montazeri, Ali; Pashaei, Tahereh; Gholami, Ali; Ponnet, Koen

    2016-07-22

    Breast cancer preventive behaviors have an extreme effect on women's health. Despite the benefits of preventive behaviors regarding breast cancer, they have not been implemented as routine care for healthy women. To assess this health issue, a reliable and valid scale is needed. The aim of the present study is to develop and examine the psychometric properties of a new scale, called the ASSISTS, in order to identify factors that affect women's breast cancer prevention behaviors. A multi-phase instrument development method was performed to develop the questionnaire from February 2012 to September 2014. The item pool was generated based on secondary analyses of previous qualitative data. Then, content and face validity were applied to provide a pre-final version of the scale. The scale validation was conducted with a sample of women recruited from health centers affiliated with Tehran University of Medical Sciences. The construct validity (both exploratory and confirmatory), convergent validity, discriminate validity, internal consistency reliability and test-retest analysis of the questionnaire were tested. Fifty-eight items were initially extracted from the secondary analysis of previous qualitative data. After content validity, this was reduced to 49 items. The exploratory factor analysis revealed seven factors (Attitude, supportive systems, self-efficacy, information seeking, stress management, stimulant and self-care) containing 33 items that jointly accounted for 60.62 % of the observed variance. The confirmatory factor analysis showed a model with appropriate fitness for the data. The Cronbach's alpha coefficient for the subscales ranged from 0.68 to 0.85, and the Intraclass Correlation Coefficient (ICC) ranged from 0.71 to 0.98; which is well above the acceptable thresholds. 
The findings showed that the designed questionnaire was a valid and reliable instrument for assessing factors affecting women's breast cancer prevention behaviors that can be used both in practice and in future studies.
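The reliability statistics reported above follow standard formulas; a minimal sketch of Cronbach's alpha for one subscale, where the hypothetical `scores` matrix (respondents × items) stands in for real questionnaire data:

```python
def cronbach_alpha(items):
    # items: one row per respondent, one score per questionnaire item
    k = len(items[0])

    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = [var([row[i] for row in items]) for i in range(k)]
    total_var = var([sum(row) for row in items])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Hypothetical 4-item subscale scored by 5 respondents (1-5 Likert scale)
scores = [[3, 4, 3, 4], [2, 2, 3, 2], [4, 5, 4, 5], [3, 3, 3, 3], [1, 2, 1, 2]]
alpha = cronbach_alpha(scores)
```

A subscale whose items move together, as in this toy matrix, yields an alpha near 1; values of 0.68 to 0.85, as reported, indicate acceptable to good internal consistency.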

  2. Digital multishaker modal testing

    NASA Technical Reports Server (NTRS)

    Blair, M.; Craig, R. R., Jr.

    1983-01-01

    A review of several modal testing techniques is made, along with brief discussions of their advantages and limitations. A new technique is presented which overcomes many of the previous limitations. Several simulated experiments are included to verify the validity and accuracy of the new method. Conclusions are drawn from the simulation studies and recommendations for further work are presented. The complete computer code configured for the simulation study is presented.

  3. Variability of Hormonal Stress Markers Collected from a Managed Dolphin Population

    DTIC Science & Technology

    2013-09-30

physiological indicators of stress in wild marine mammals and the interrelationships between different stress markers can be used to estimate the impact...Radioimmunoassay methods have previously been validated for cortisol and aldosterone in this species (Houser et al., 2011). Parallel processing of...for these hormones. Metabolites of cortisol, aldosterone and thyroid hormone will be extracted from fecal samples and measured via RIA using

  4. A new method to model electroconvulsive therapy in rats with increased construct validity and enhanced translational value.

    PubMed

    Theilmann, Wiebke; Löscher, Wolfgang; Socala, Katarzyna; Frieling, Helge; Bleich, Stefan; Brandt, Claudia

    2014-06-01

Electroconvulsive therapy is the most effective therapy for major depressive disorder (MDD). The remission rate is above 50% in previously pharmacoresistant patients, but the mechanisms of action are not fully understood. Electroconvulsive stimulation (ECS) in rodents mimics antidepressant electroconvulsive therapy (ECT) in humans and is widely used to investigate the underlying mechanisms of ECT. For the translational value of findings in animal models it is essential to establish models with the highest construct, face and predictive validity possible. The commonly used model for ECT in rodents does not meet the demand for high construct validity. For ECT, cortical surface electrodes are used to induce therapeutic seizures, whereas ECS in rodents is exclusively performed by auricular or corneal electrodes. However, the stimulation site has a major impact on the type and spread of the induced seizure activity and its antidepressant effect. We propose a method in which ECS is performed by screw electrodes placed above the motor cortex of rats to closely simulate the clinical situation and thereby increase the construct validity of the model. Cortical ECS in rats reliably induced seizures comparable to those of human ECT. Cortical ECS was more effective than auricular ECS at reducing immobility in the forced swim test. Importantly, auricular stimulation had a negative influence on the general health condition of the rats, with signs of fear during the stimulation sessions. These results suggest that auricular ECS in rats is not a suitable ECT model. Cortical ECS in rats promises to be a valid method to mimic ECT. Copyright © 2014 Elsevier Ltd. All rights reserved.

  5. North Atlantic observations sharpen meridional overturning projections

    NASA Astrophysics Data System (ADS)

    Olson, R.; An, S.-I.; Fan, Y.; Evans, J. P.; Caesar, L.

    2018-06-01

Atlantic Meridional Overturning Circulation (AMOC) projections are uncertain due to both model errors and internal climate variability. An AMOC slowdown projected by many climate models is likely to have considerable effects on many aspects of global and North Atlantic climate. Previous studies making probabilistic AMOC projections have broken new ground. However, they do not drift-correct or cross-validate the projections, and do not fully account for internal variability. Furthermore, they consider a limited subset of models, and ignore the skill of models at representing the temporal North Atlantic dynamics. We improve on previous work by applying Bayesian Model Averaging to weight 13 Coupled Model Intercomparison Project phase 5 models by their skill at modeling the AMOC strength and its temporal dynamics, as approximated by the northern North-Atlantic temperature-based AMOC Index. We make drift-corrected projections accounting for structural model errors and for internal variability. Cross-validation experiments give approximately correct empirical coverage probabilities, which validates our method. Our results present more evidence that the AMOC has likely already started slowing down. While weighting considerably moderates and sharpens our projections, our results are at the low end of previously published estimates. We project mean AMOC changes between the periods 1960-1999 and 2060-2099 of -4.0 Sv and -6.8 Sv for the RCP4.5 and RCP8.5 emissions scenarios, respectively. The corresponding average 90% credible intervals for our weighted experiments are [-7.2, -1.2] and [-10.5, -3.7] Sv, respectively, for the two scenarios.

  6. Method for in situ carbon deposition measurement for solid oxide fuel cells

    NASA Astrophysics Data System (ADS)

    Kuhn, J.; Kesler, O.

    2014-01-01

Previous methods to measure carbon deposition in solid oxide fuel cell (SOFC) anodes do not permit simultaneous electrochemical measurements. Electrochemical measurements supplemented with carbon deposition quantities create the opportunity to further understand how carbon affects SOFC performance and electrochemical impedance spectra (EIS). In this work, a method for measuring carbon in situ, termed here the quantification of gasified carbon (QGC), was developed. TGA experiments showed that carbon with a 100 h residence time in the SOFC was >99.8% gasified. Comparison of carbon mass measurements between the TGA and QGC shows good agreement. In situ measurements of carbon deposition in SOFCs at varying molar steam/carbon ratios were performed to further validate the QGC method, and suppression of carbon deposition with increasing steam concentration was observed, in agreement with previous studies. The technique can be used to investigate in situ carbon deposition and gasification behavior simultaneously with electrochemical measurements for a variety of fuels and operating conditions, such as determining conditions under which incipient carbon deposition is reversible.

  7. Development of a multiple-parameter nonlinear perturbation procedure for transonic turbomachinery flows: Preliminary application to design/optimization problems

    NASA Technical Reports Server (NTRS)

    Stahara, S. S.; Elliott, J. P.; Spreiter, J. R.

    1983-01-01

An investigation was conducted to continue the development of perturbation procedures and associated computational codes for rapidly determining approximations to nonlinear flow solutions, with the purpose of establishing a method for minimizing computational requirements associated with parametric design studies of transonic flows in turbomachines. The results reported here concern the extension of the previously developed successful method for single-parameter perturbations to simultaneous multiple-parameter perturbations, and the preliminary application of the multiple-parameter procedure in combination with an optimization method to a blade design/optimization problem. In order to provide as severe a test as possible of the method, attention is focused in particular on transonic flows which are highly supercritical. Flows past both isolated blades and compressor cascades, involving simultaneous changes in both flow and geometric parameters, are considered. Comparisons with the corresponding exact nonlinear solutions display remarkable accuracy and range of validity, in direct correspondence with previous results for single-parameter perturbations.

  8. Simple and rapid quantification of brominated vegetable oil in commercial soft drinks by LC–MS

    PubMed Central

    Chitranshi, Priyanka; da Costa, Gonçalo Gamboa

    2016-01-01

We report here a simple and rapid method for the quantification of brominated vegetable oil (BVO) in soft drinks based upon liquid chromatography–electrospray ionization mass spectrometry. Unlike previously reported methods, this novel method does not require hydrolysis, extraction or derivatization steps, but rather a simple "dilute and shoot" sample preparation. The quantification is conducted by mass spectrometry in selected ion recording mode with a single-point standard addition procedure. The method was validated in the range of 5–25 μg/mL BVO, encompassing the legal limit of 15 μg/mL established by the US FDA for fruit-flavored beverages in the US market. The method was characterized by excellent intra- and inter-assay accuracy (97.3–103.4%) and very low imprecision [0.5–3.6% (RSD)]. The direct nature of the quantification, simplicity, and excellent statistical performance of this methodology constitute clear advantages in relation to previously published methods for the analysis of BVO in soft drinks. PMID:27451219
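The single-point standard addition used for quantification reduces to a one-line calculation; a sketch with hypothetical peak areas, assuming a linear detector response and negligible dilution from the spike:

```python
def standard_addition(signal_sample, signal_spiked, spike_conc):
    # Single-point standard addition: the unknown concentration follows
    # from the signal increase caused by a known spike concentration.
    return spike_conc * signal_sample / (signal_spiked - signal_sample)

# Hypothetical SIR peak areas before and after a 15 ug/mL BVO spike
bvo_conc = standard_addition(1.2e5, 3.0e5, 15.0)
```

With these illustrative numbers the sample works out to 10 μg/mL, below the 15 μg/mL FDA limit mentioned above.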

  9. Validation of a Crowdsourcing Methodology for Developing a Knowledge Base of Related Problem-Medication Pairs

    PubMed Central

    Wright, A.; Krousel-Wood, M.; Thomas, E. J.; McCoy, J. A.; Sittig, D. F.

    2015-01-01

Background Clinical knowledge bases of problem-medication pairs are necessary for many informatics solutions that improve patient safety, such as clinical summarization. However, developing these knowledge bases can be challenging. Objective We sought to validate a previously developed crowdsourcing approach for generating a knowledge base of problem-medication pairs in a large, non-university health care system with a widely used, commercially available electronic health record. Methods We first retrieved medications and problems entered in the electronic health record by clinicians during routine care during a six-month study period. Following the previously published approach, we calculated the link frequency and link ratio for each pair, then identified a threshold cutoff for estimated problem-medication pair appropriateness through clinician review; problem-medication pairs meeting the threshold were included in the resulting knowledge base. We selected 50 medications and their gold standard indications to compare the resulting knowledge base to the pilot knowledge base developed previously and determine its recall and precision. Results The resulting knowledge base contained 26,912 pairs, had a recall of 62.3% and a precision of 87.5%, and outperformed the pilot knowledge base containing 11,167 pairs from the previous study, which had a recall of 46.9% and a precision of 83.3%. Conclusions We validated the crowdsourcing approach for generating a knowledge base of problem-medication pairs in a large non-university health care system with a widely used, commercially available electronic health record, indicating that the approach may be generalizable across healthcare settings and clinical systems. Further research is necessary to better evaluate the knowledge base, to compare crowdsourcing with other approaches, and to evaluate whether incorporating the knowledge into electronic health records improves patient outcomes. PMID:26171079
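The link-frequency/link-ratio filtering and recall/precision evaluation described in the Methods can be sketched as follows; the exact ratio definition is approximated from the description here, and all data are hypothetical:

```python
from collections import Counter

def build_knowledge_base(pairs, ratio_threshold):
    # pairs: (problem, medication) tuples observed together in the EHR.
    # Link frequency = co-occurrence count; link ratio = that count divided
    # by the medication's total link count (an approximation of the
    # published definition).
    pair_counts = Counter(pairs)
    med_counts = Counter(med for _, med in pairs)
    return {pair for pair, n in pair_counts.items()
            if n / med_counts[pair[1]] >= ratio_threshold}

def recall_precision(kb, gold):
    tp = len(kb & gold)
    return tp / len(gold), tp / len(kb)

# Hypothetical co-occurrence data and gold-standard indications
pairs = ([("hypertension", "lisinopril")] * 8 + [("cough", "lisinopril")] * 2
         + [("diabetes", "metformin")] * 9 + [("pcos", "metformin")])
kb = build_knowledge_base(pairs, ratio_threshold=0.5)
gold = {("hypertension", "lisinopril"), ("diabetes", "metformin"),
        ("pain", "ibuprofen")}
recall, precision = recall_precision(kb, gold)
```

Raising the ratio threshold trades recall for precision, which is the cutoff-selection step the clinicians performed by review.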

  10. A New Method for Reconstructing Sea-Level and Deep-Sea-Temperature Variability over the Past 5.3 Million Years

    NASA Astrophysics Data System (ADS)

    Rohling, E. J.

    2014-12-01

Ice volume (and hence sea level) and deep-sea temperature are key measures of global climate change. Sea level has been documented using several independent methods over the past 0.5 million years (Myr). Older periods, however, lack such independent validation; all existing records are related to deep-sea oxygen isotope (d18O) data that are influenced by processes unrelated to sea level. For deep-sea temperature, only one continuous high-resolution (Mg/Ca-based) record exists, with related sea-level estimates, spanning the past 1.5 Myr. We have recently presented a novel sea-level reconstruction, with associated estimates of deep-sea temperature, which independently validates the previous 0-1.5 Myr reconstruction and extends it back to 5.3 Myr ago. A series of caveats applies to this new method, especially in older times of its application, as is always the case with new methods. Independent validation exercises are needed to elucidate where consistency exists, and where solutions drift away from each other. A key observation from our new method is that a large temporal offset existed during the onset of Plio-Pleistocene ice ages, between a marked cooling step at 2.73 Myr ago and the first major glaciation at 2.15 Myr ago. This observation relies on relative changes within the dataset, which are more robust than absolute values. I will discuss our method and its main caveats and avenues for improvement.

  11. Development, Testing, and Validation of a Model-Based Tool to Predict Operator Responses in Unexpected Workload Transitions

    NASA Technical Reports Server (NTRS)

    Sebok, Angelia; Wickens, Christopher; Sargent, Robert

    2015-01-01

One human factors challenge is predicting operator performance in novel situations. Approaches such as drawing on relevant previous experience, and developing computational models to predict operator performance in complex situations, offer potential methods to address this challenge. A few concerns with modeling operator performance are that models need to be realistic, and that they need to be tested empirically and validated. In addition, many existing human performance modeling tools are complex and require that an analyst gain significant experience to be able to develop models for meaningful data collection. This paper describes an effort to address these challenges by developing an easy-to-use model-based tool, using models that were developed from a review of existing human performance literature and targeted experimental studies, and performing an empirical validation of key model predictions.

  12. A collaborative environment for developing and validating predictive tools for protein biophysical characteristics

    NASA Astrophysics Data System (ADS)

    Johnston, Michael A.; Farrell, Damien; Nielsen, Jens Erik

    2012-04-01

The exchange of information between experimentalists and theoreticians is crucial to improving the predictive ability of theoretical methods and hence our understanding of the related biology. However, many barriers exist which prevent the flow of information between the two disciplines. Enabling effective collaboration requires that experimentalists can easily apply computational tools to their data, share their data with theoreticians, and that both the experimental data and computational results are accessible to the wider community. We present a prototype collaborative environment for developing and validating predictive tools for protein biophysical characteristics. The environment is built on two central components: a new Python-based integration module, which allows theoreticians to provide and manage remote access to their programs; and PEATDB, a program for storing and sharing experimental data from protein biophysical characterisation studies. We demonstrate our approach by integrating PEATSA, a web-based service for predicting changes in protein biophysical characteristics, into PEATDB. Furthermore, we illustrate how the resulting environment aids method development using the Potapov dataset of experimentally measured ΔΔGfold values, previously employed to validate and train protein stability prediction algorithms.

  13. Nuclear magnetic resonance signal dynamics of liquids in the presence of distant dipolar fields, revisited

    PubMed Central

    Barros, Wilson; Gochberg, Daniel F.; Gore, John C.

    2009-01-01

    The description of the nuclear magnetic resonance magnetization dynamics in the presence of long-range dipolar interactions, which is based upon approximate solutions of Bloch–Torrey equations including the effect of a distant dipolar field, has been revisited. New experiments show that approximate analytic solutions have a broader regime of validity as well as dependencies on pulse-sequence parameters that seem to have been overlooked. In order to explain these experimental results, we developed a new method consisting of calculating the magnetization via an iterative formalism where both diffusion and distant dipolar field contributions are treated as integral operators incorporated into the Bloch–Torrey equations. The solution can be organized as a perturbative series, whereby access to higher order terms allows one to set better boundaries on validity regimes for analytic first-order approximations. Finally, the method legitimizes the use of simple analytic first-order approximations under less demanding experimental conditions, it predicts new pulse-sequence parameter dependencies for the range of validity, and clarifies weak points in previous calculations. PMID:19425789

  14. MMASS: an optimized array-based method for assessing CpG island methylation.

    PubMed

    Ibrahim, Ashraf E K; Thorne, Natalie P; Baird, Katie; Barbosa-Morais, Nuno L; Tavaré, Simon; Collins, V Peter; Wyllie, Andrew H; Arends, Mark J; Brenton, James D

    2006-01-01

We describe an optimized microarray method for identifying genome-wide CpG island methylation called microarray-based methylation assessment of single samples (MMASS), which directly compares methylated to unmethylated sequences within a single sample. To improve on previous methods, we used bioinformatic analysis to predict an optimized combination of methylation-sensitive enzymes that had the highest utility for CpG-island probes, and different methods to produce unmethylated representations of test DNA for more sensitive detection of differential methylation by hybridization. Subtraction or methylation-dependent digestion with McrBC was used with optimized (MMASS-v2) or previously described (MMASS-v1, MMASS-sub) methylation-sensitive enzyme combinations and compared with a published McrBC method. Comparison was performed using DNA from the cell line HCT116. We show that the distribution of methylation microarray data is inherently skewed and requires exogenous spiked controls for normalization, and that analysis of digestion of methylated and unmethylated control sequences together with linear fit models of replicate data showed superior statistical power for the MMASS-v2 method. Comparison with previous methylation data for HCT116 and validation of CpG islands from PXMP4, SFRP2, DCC, RARB and TSEN2 confirmed the accuracy of MMASS-v2 results. The MMASS-v2 method offers improved sensitivity and statistical power for high-throughput microarray identification of differential methylation.

  15. Discovery and validation of cell cycle arrest biomarkers in human acute kidney injury

    PubMed Central

    2013-01-01

    Introduction Acute kidney injury (AKI) can evolve quickly and clinical measures of function often fail to detect AKI at a time when interventions are likely to provide benefit. Identifying early markers of kidney damage has been difficult due to the complex nature of human AKI, in which multiple etiologies exist. The objective of this study was to identify and validate novel biomarkers of AKI. Methods We performed two multicenter observational studies in critically ill patients at risk for AKI - discovery and validation. The top two markers from discovery were validated in a second study (Sapphire) and compared to a number of previously described biomarkers. In the discovery phase, we enrolled 522 adults in three distinct cohorts including patients with sepsis, shock, major surgery, and trauma and examined over 300 markers. In the Sapphire validation study, we enrolled 744 adult subjects with critical illness and without evidence of AKI at enrollment; the final analysis cohort was a heterogeneous sample of 728 critically ill patients. The primary endpoint was moderate to severe AKI (KDIGO stage 2 to 3) within 12 hours of sample collection. Results Moderate to severe AKI occurred in 14% of Sapphire subjects. The two top biomarkers from discovery were validated. Urine insulin-like growth factor-binding protein 7 (IGFBP7) and tissue inhibitor of metalloproteinases-2 (TIMP-2), both inducers of G1 cell cycle arrest, a key mechanism implicated in AKI, together demonstrated an AUC of 0.80 (0.76 and 0.79 alone). Urine [TIMP-2]·[IGFBP7] was significantly superior to all previously described markers of AKI (P <0.002), none of which achieved an AUC >0.72. Furthermore, [TIMP-2]·[IGFBP7] significantly improved risk stratification when added to a nine-variable clinical model when analyzed using Cox proportional hazards model, generalized estimating equation, integrated discrimination improvement or net reclassification improvement. 
Finally, in sensitivity analyses [TIMP-2]·[IGFBP7] remained significant and superior to all other markers regardless of changes in reference creatinine method. Conclusions Two novel markers for AKI have been identified and validated in independent multicenter cohorts. Both markers are superior to existing markers, provide additional information over clinical variables and add mechanistic insight into AKI. Trial registration ClinicalTrials.gov number NCT01209169. PMID:23388612
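The reported AUC values can be computed without any model fitting via the Mann-Whitney formulation; a sketch using hypothetical urine [TIMP-2]·[IGFBP7] values:

```python
def auc(scores_pos, scores_neg):
    # Mann-Whitney formulation: probability that a randomly chosen case
    # scores higher than a randomly chosen non-case; ties count one half.
    wins = sum((p > n) + 0.5 * (p == n)
               for p in scores_pos for n in scores_neg)
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical urine [TIMP-2]*[IGFBP7] values for AKI and non-AKI patients
aki = [0.9, 1.4, 2.1, 0.4]
no_aki = [0.1, 0.3, 0.5, 0.2, 0.6]
result = auc(aki, no_aki)
```

An AUC of 0.5 corresponds to a non-informative marker, so the 0.80 reported for the combined marker indicates substantially better-than-chance discrimination.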

  16. Differences in quantitative assessment of myocardial scar and gray zone by LGE-CMR imaging using established gray zone protocols.

    PubMed

    Mesubi, Olurotimi; Ego-Osuala, Kelechi; Jeudy, Jean; Purtilo, James; Synowski, Stephen; Abutaleb, Ameer; Niekoop, Michelle; Abdulghani, Mohammed; Asoglu, Ramazan; See, Vincent; Saliaris, Anastasios; Shorofsky, Stephen; Dickfeld, Timm

    2015-02-01

Late gadolinium enhancement cardiac magnetic resonance (LGE-CMR) imaging is the gold standard for myocardial scar evaluation. Heterogeneous areas of scar ('gray zone') may serve as arrhythmogenic substrate. Various gray zone protocols have been correlated with clinical outcomes and ventricular tachycardia channels. This study assessed the quantitative differences in gray zone and scar core sizes as defined by previously validated signal intensity (SI) threshold algorithms. High-quality LGE-CMR images performed in 41 cardiomyopathy patients [ischemic (33) or non-ischemic (8)] were analyzed using previously validated SI threshold methods [Full Width at Half Maximum (FWHM), n-standard deviation (NSD) and modified-FWHM]. Myocardial scar was defined as scar core and gray zone using SI thresholds based on these methods. Scar core, gray zone and total scar sizes were then computed and compared among these models. The median gray zone mass was 2-3 times larger with FWHM (15 g, IQR: 8-26 g) compared to NSD or modified-FWHM (5 g, IQR: 3-9 g; and 8 g, IQR: 6-12 g, respectively, p < 0.001). Conversely, infarct core mass was 2.3 times larger with NSD (30 g, IQR: 17-53 g) versus FWHM and modified-FWHM (13 g, IQR: 7-23 g, p < 0.001). The gray zone extent (percentage of total scar that was gray zone) also varied significantly among the three methods: 51 % (IQR: 42-61 %), 17 % (IQR: 11-21 %) and 38 % (IQR: 33-43 %) for FWHM, NSD and modified-FWHM, respectively (p < 0.001). Considerable variability exists among the current methods for MRI-defined gray zone and scar core. Infarct core and total myocardial scar mass also differ using these methods. Further evaluation of the most accurate quantification method is needed.
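The SI-threshold methods compared above differ only in where the cutoffs sit; a deliberately simplified sketch, in which the "remote mean + 2 SD" gray-zone cutoff and half-maximum core cutoff are illustrative stand-ins rather than the exact published FWHM/NSD definitions:

```python
def classify_scar(si, remote_mean, remote_sd, si_max):
    # Illustrative cutoffs only: pixels above half the maximum scar SI
    # count as core (FWHM-style); pixels above the remote-myocardium
    # mean + 2 SD (NSD-style) but below the core cutoff count as gray zone.
    core_cut = 0.5 * si_max
    gray_cut = remote_mean + 2 * remote_sd
    if si >= core_cut:
        return "core"
    if si >= gray_cut:
        return "gray"
    return "normal"

# Hypothetical remote-myocardium statistics and maximum scar SI
labels = [classify_scar(si, 10.0, 5.0, 100.0) for si in (60, 30, 12)]
```

Because the core and gray-zone cutoffs are set independently by each protocol, the same image can yield very different core/gray-zone partitions, which is exactly the variability the study quantifies.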

  17. Reproducible diagnostic metabolites in plasma from typhoid fever patients in Asia and Africa

    PubMed Central

    Näsström, Elin; Parry, Christopher M; Vu Thieu, Nga Tran; Maude, Rapeephan R; de Jong, Hanna K; Fukushima, Masako; Rzhepishevska, Olena; Marks, Florian; Panzner, Ursula; Im, Justin; Jeon, Hyonjin; Park, Seeun; Chaudhury, Zabeen; Ghose, Aniruddha; Samad, Rasheda; Van, Tan Trinh; Johansson, Anders; Dondorp, Arjen M; Thwaites, Guy E; Faiz, Abul; Antti, Henrik; Baker, Stephen

    2017-01-01

Salmonella Typhi is the causative agent of typhoid. Typhoid is diagnosed by blood culture, a method that lacks sensitivity, portability and speed. We have previously shown that specific metabolomic profiles can be detected in the blood of typhoid patients from Nepal (Näsström et al., 2014). Here, we performed mass spectrometry on plasma from Bangladeshi and Senegalese patients with culture-confirmed typhoid fever, clinically suspected typhoid, and other febrile diseases including malaria. After applying supervised pattern recognition modelling, we could significantly distinguish metabolite profiles in plasma from the culture-confirmed typhoid patients. After comparing the direction of change and degree of multivariate significance, we identified 24 metabolites that were consistently up- or downregulated in a further Bangladeshi/Senegalese validation cohort and in the Nepali cohort from our previous work. We have identified and validated a metabolite panel that can distinguish typhoid from other febrile diseases, providing a new approach for typhoid diagnostics. DOI: http://dx.doi.org/10.7554/eLife.15651.001 PMID:28483042

  18. Validation of light water reactor calculation methods and JEF-1-based data libraries by TRX and BAPL critical experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paratte, J.M.; Pelloni, S.; Grimm, P.

    1991-04-01

    This paper analyzes the capability of various code systems and JEF-1-based nuclear data libraries to compute light water reactor lattices by comparing calculations with results from thermal reactor benchmark experiments TRX and BAPL and with previously published values. With the JEF-1 evaluation, eigenvalues are generally well predicted within 8 mk (1 mk = 0.001) or less by all code systems, and all methods give reasonable results for the measured reaction rate ratios within, or not too far from, the experimental uncertainty.

  19. Distributed synchronization of networked drive-response systems: A nonlinear fixed-time protocol.

    PubMed

    Zhao, Wen; Liu, Gang; Ma, Xi; He, Bing; Dong, Yunfeng

    2017-11-01

The distributed synchronization of networked drive-response systems is investigated in this paper. A novel nonlinear protocol is proposed to ensure that the tracking errors converge to zero in a fixed time. By comparison with previous synchronization methods, the present method considers more practical conditions, and the synchronization time does not depend on arbitrary initial conditions but can be pre-assigned offline according to the task requirements. Finally, the feasibility and validity of the presented protocol have been illustrated by a numerical simulation. Copyright © 2017. Published by Elsevier Ltd.
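Fixed-time convergence of the kind described can be illustrated with the standard scalar error dynamics ė = -α|e|^p sgn(e) - β|e|^q sgn(e) with 0 < p < 1 < q, whose settling time is bounded by 1/(α(1-p)) + 1/(β(q-1)) regardless of the initial error; a simulation sketch (the specific networked protocol in the paper is more involved):

```python
def simulate(e0, alpha=1.0, beta=1.0, p=0.5, q=1.5, dt=1e-3, t_end=5.0):
    # Euler integration of the fixed-time error dynamics
    #   de/dt = -alpha*|e|**p*sign(e) - beta*|e|**q*sign(e),  0 < p < 1 < q,
    # whose settling time is bounded by 1/(alpha*(1-p)) + 1/(beta*(q-1))
    # (= 4 s with these defaults) independently of e0.
    e = e0
    for _ in range(int(t_end / dt)):
        step = dt * (alpha * abs(e) ** p + beta * abs(e) ** q)
        if step >= abs(e):
            e = 0.0  # avoid numerical chattering across zero
        else:
            e -= step if e > 0 else -step
    return e

# Both a large and a small initial error settle inside the same time bound
e_large, e_small = simulate(100.0), simulate(0.5)
```

The |e|^q term dominates far from the origin and the |e|^p term near it, which is why the settling-time bound does not grow with the initial error.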

  20. Successive ratio subtraction as a novel manipulation of ratio spectra for quantitative determination of a mixture of furosemide, spironolactone and canrenone

    NASA Astrophysics Data System (ADS)

    Emam, Aml A.; Abdelaleem, Eglal A.; Naguib, Ibrahim A.; Abdallah, Fatma F.; Ali, Nouruddin W.

    2018-03-01

Furosemide and spironolactone are commonly prescribed antihypertensive drugs. Canrenone is the main degradation product and main metabolite of spironolactone. Ratio subtraction and extended ratio subtraction spectrophotometric methods were previously applied for quantitation of only binary mixtures. An extension of the above-mentioned methods, successive ratio subtraction, is introduced in the present work for quantitative determination of ternary mixtures exemplified by furosemide, spironolactone and canrenone. Manipulating the ratio spectra of the ternary mixture allowed their determination at 273.6 nm, 285 nm and 240 nm and in the concentration ranges of 2-16 μg mL⁻¹, 4-32 μg mL⁻¹ and 1-18 μg mL⁻¹ for furosemide, spironolactone and canrenone, respectively. Method specificity was ensured by application to laboratory-prepared mixtures. The introduced method was shown to be accurate and precise. Validation of the developed method was performed with respect to ICH guidelines, and its validity was further confirmed by application to the pharmaceutical formulation. Statistical comparison between the obtained results and those obtained from the reported HPLC method was performed using Student's t-test and the F ratio test, and no significant difference was observed.

  1. Development and validation of a turbulent flow chromatography and tandem mass spectrometry method for the quantitation of methotrexate and its metabolites 7-hydroxy methotrexate and DAMPA in serum

    PubMed Central

    Schofield, Ryan C.; Ramanathan, Lakshmi V.; Murata, Kazunori; Grace, Marie; Fleisher, Martin; Pessin, Melissa S.; Carlow, Dean C.

    2016-01-01

A rapid and simple turbulent flow liquid chromatography (TFC–LC) method implementing positive heated electrospray ionization (HESI) for the accurate and precise determination of methotrexate (MTX), 7-hydroxy methotrexate (7-OH MTX), and 4-amino-4-deoxy-N10-methylpteroic acid (DAMPA) concentrations in serum was developed. MTX was isolated from serum samples (100 μL) after protein precipitation with methanol containing formic acid and internal standard (MTX-D3), followed by centrifugation. The supernatant was injected into the turbulent flow liquid chromatography system, followed by positive electrospray ionization tandem mass spectrometry (TFC–LC–MS/MS), and quantified using a six-point calibration curve. For MTX and DAMPA the assays were linear from 10 to 1000 nmol/L and for 7-OH MTX from 20 to 2000 nmol/L. Dilutions of 10, 100 and 1000-fold were validated, giving a clinically reportable range of 10 nmol/L to 5 × 10⁵ nmol/L. Within-day and between-day precisions at concentrations spanning the analytical measurement ranges were less than 10% for all three analytes. MTX, DAMPA and 7-OH MTX were sufficiently stable under all relevant analytical conditions. No significant matrix effect was observed during the method validation. The TFC–LC–MS/MS MTX method was also compared with three other clinically validated MTX assays: a dihydrofolate reductase (DHFR) inhibition assay, an immunoassay based on fluorescence polarization and a previously developed LC–MS/MS assay. PMID:26322588

  2. Determination of C-glucosidic ellagitannins in Lythri salicariae herba by ultra-high performance liquid chromatography coupled with charged aerosol detector: method development and validation.

    PubMed

    Granica, Sebastian; Piwowarski, Jakub P; Kiss, Anna K

    2014-01-01

    Lythri salicariae herba is a pharmacopoeial plant material used by patients in the form of infusions in the treatment of acute diarrhoea. According to its pharmacopoeial monograph it is standardised for total tannin content, which should be not less than 5.0% using pyrogallol as a standard. Previous studies have shown that aqueous extracts from Lythri herba contain mainly ellagitannins, among which vescalagin, castalagin and salicarinins A and B are the dominating constituents. To develop and validate an efficient UHPLC method coupled with a charged aerosol detector (CAD) for quantification of four major ellagitannins in Lythri salicariae herba and in one commercial preparation. Extraction conditions of ellagitannins from plant material were optimised. The relative response factors for vescalagin, castalagin and salicarinins A and B using gallic acid as an external standard were determined for the CAD detector. Then, a UHPLC method for quantification of ellagitannins was developed and validated. Four major ellagitannins were quantified in four samples of Lythri herba and in one commercial preparation. The sum of ellagitannins for each sample was determined, which varied from 30.66 to 48.80 mg/g of raw material and 16.57 mg per capsule for the preparation investigated. The first validated UHPLC-CAD method for quantification of four major ellagitannins was developed. The universality of the CAD response was evaluated, and it is shown that although all compounds analysed have similar structures, their CAD response differs significantly. Copyright © 2013 John Wiley & Sons, Ltd.

  3. Accurate prediction of secreted substrates and identification of a conserved putative secretion signal for type III secretion systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Samudrala, Ram; Heffron, Fred; McDermott, Jason E.

    2009-04-24

    The type III secretion system is an essential component for virulence in many Gram-negative bacteria. Though components of the secretion system apparatus are conserved, its substrates, effector proteins, are not. We have used a machine learning approach to identify new secreted effectors. The method integrates evolutionary measures, such as the pattern of homologs in a range of other organisms, and sequence-based features, such as G+C content, amino acid composition and the N-terminal 30 residues of the protein sequence. The method was trained on known effectors from Salmonella typhimurium and validated on a corresponding set of effectors from Pseudomonas syringae, after eliminating effectors with detectable sequence similarity. The method was able to identify all of the known effectors in P. syringae with a specificity of 84% and sensitivity of 82%. The reciprocal validation, training on P. syringae and validating on S. typhimurium, gave similar results, with a specificity of 86% at a sensitivity of 87%. These results show that type III effectors in disparate organisms share common features. We found that maximal performance is attained by including an N-terminal sequence of only 30 residues, which agrees with previous studies indicating that this region contains the secretion signal. We then used the method to define the most important residues in this putative secretion signal. Finally, we present novel predictions of secreted effectors in S. typhimurium, some of which have been experimentally validated, and apply the method to predict secreted effectors in the genetically intractable human pathogen Chlamydia trachomatis. This approach is a novel and effective way to identify secreted effectors in a broad range of pathogenic bacteria for further experimental characterization and provides insight into the nature of the type III secretion signal.
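    The sensitivity and specificity figures quoted above follow the standard confusion-matrix definitions. A minimal sketch (the counts below are invented to reproduce percentages of the same order, not the study's actual tallies):

```python
# sensitivity = TP / (TP + FN): fraction of true effectors recovered.
# specificity = TN / (TN + FP): fraction of non-effectors correctly rejected.
def sensitivity(tp, fn):
    return tp / (tp + fn)

def specificity(tn, fp):
    return tn / (tn + fp)

# Hypothetical counts, not from the paper:
tp, fn = 41, 9      # known effectors found / missed
tn, fp = 84, 16     # non-effectors rejected / falsely flagged

print(round(sensitivity(tp, fn), 2))  # 0.82
print(round(specificity(tn, fp), 2))  # 0.84
```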

  4. High salinity relay as a post-harvest processing method for reducing Vibrio vulnificus levels in oysters (Crassostrea virginica).

    PubMed

    Audemard, Corinne; Kator, Howard I; Reece, Kimberly S

    2018-08-20

    High salinity relay of Eastern oysters (Crassostrea virginica) was evaluated as a post-harvest processing (PHP) method for reducing Vibrio vulnificus. This approach relies on the exposure of oysters to natural high salinity waters and, compared with previously approved PHPs, preserves a live product. Although results of prior studies evaluating high salinity relay as a means to decrease V. vulnificus levels were promising, validation of this method as a PHP following approved guidelines is required. This study was designed to provide data for validation of this method following Food and Drug Administration (FDA) PHP validation guidelines. During each of 3 relay experiments, oysters cultured at 3 different Chesapeake Bay sites of contrasting salinities (10-21 psu) were relayed without acclimation to high salinity waters (31-33 psu) for up to 28 days. Densities of V. vulnificus and of total and pathogenic Vibrio parahaemolyticus (as tdh-positive strains) were measured using an MPN-quantitative PCR approach. Overall, 9 lots of oysters were relayed, 6 of which exhibited initial V. vulnificus densities >10,000/g. As recommended by the FDA PHP validation guidelines, these lots reached both the 3.52 log reduction and the <30 MPN/g density requirements for V. vulnificus after 14 to 28 days of relay. Densities of total and pathogenic V. parahaemolyticus in relayed oysters were significantly lower than densities at the sites of origin, suggesting an additional benefit of high salinity relay. While relay did not have a detrimental effect on oyster condition, oyster mortality ranged from 2 to 61% after 28 days of relay. Although identification of the factors implicated in oyster mortality will require further examination, this study strongly supports the validation of high salinity relay as an effective PHP method to reduce levels of V. vulnificus in oysters to endpoint levels approved for human consumption. Copyright © 2018 Elsevier B.V. All rights reserved.
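    The FDA endpoint cited above is a conjunction of two criteria: a ≥3.52 log10 reduction from the initial density and a final density below 30 MPN/g. A minimal sketch of that check (the example densities are invented, not the study's measurements):

```python
import math

def log10_reduction(initial_mpn_per_g, final_mpn_per_g):
    # Orders of magnitude removed: log10(initial / final).
    return math.log10(initial_mpn_per_g / final_mpn_per_g)

def meets_endpoint(initial_mpn_per_g, final_mpn_per_g):
    # Both criteria must hold: >=3.52 log10 reduction AND final < 30 MPN/g.
    return (log10_reduction(initial_mpn_per_g, final_mpn_per_g) >= 3.52
            and final_mpn_per_g < 30)

print(round(log10_reduction(1.2e5, 25), 2))   # 3.68
print(meets_endpoint(1.2e5, 25))              # True
```

    Note that a large reduction alone is not sufficient: a lot starting at very high densities can achieve 3.52 logs yet still fail the <30 MPN/g endpoint.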

  5. Extension of the validation of AOAC Official Method 2005.06 for dc-GTX2,3: interlaboratory study.

    PubMed

    Ben-Gigirey, Begoña; Rodríguez-Velasco, María L; Gago-Martínez, Ana

    2012-01-01

    AOAC Official Method 2005.06 for the determination of saxitoxin (STX)-group toxins in shellfish by LC with fluorescence detection and precolumn oxidation was previously validated and adopted First Action following a collaborative study. However, the method was not validated for all key STX-group toxins, and procedures to quantify some of them were not provided. With more STX-group toxin standards commercially available and modifications to procedures, it was possible to overcome some of these difficulties. The European Union Reference Laboratory for Marine Biotoxins conducted an interlaboratory exercise to extend the AOAC Official Method 2005.06 validation to dc-GTX2,3 and to compile precision data for several STX-group toxins. This paper reports the study design and the results obtained. The performance characteristics for dc-GTX2,3 (intralaboratory and interlaboratory precision, recovery, and theoretical quantification limit) were evaluated. The mean recoveries obtained for dc-GTX2,3 were, in general, low (53.1-58.6%). The RSD for reproducibility (RSD(R)%) for dc-GTX2,3 in all samples ranged from 28.2 to 45.7%, and HorRat values ranged from 1.5 to 2.8. The article also describes a hydrolysis protocol to convert GTX6 to NEO, which has proven useful for the quantification of GTX6 while a GTX6 standard remains unavailable. The performance of the participant laboratories in applying this method was compared with that obtained in the original collaborative study of the method. Intralaboratory and interlaboratory precision data for several STX-group toxins, including dc-NEO and GTX6, are reported here. This study can be useful for laboratories determining STX-group toxins to fully implement AOAC Official Method 2005.06 for official paralytic shellfish poisoning control. However, the overall quantitative performance of the method was poor for certain toxins.

  6. [Development and validation of a questionnaire about the main variables affecting the individual investor's behavior in the Stock Exchange].

    PubMed

    Pascual-Ezama, David; San Martín Castellanos, Rafael; Gil-Gómez de Liaño, Beatriz; Scandroglio, Bárbara

    2010-11-01

    There is a considerable lack of information about the methodology used in most studies of individual investors' behavior. The studies reviewed do not report the method used to select the items or the psychometric properties of the questionnaires. Given the importance of investment in the Stock Exchange nowadays, it seems relevant to have a reliable instrument for understanding individual investors' behavior in the Stock Exchange. Therefore, the goal of the present work was to validate a questionnaire on the main variables involved in individual investors' behavior in the Stock Exchange. Based on previous studies, we elaborated a questionnaire using the Delphi methodology with a group of experts. The internal consistency (Cronbach's alpha = .934) and validity evidence of the questionnaire show that it may be an effective instrument and can be applied with some assurance.
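    The internal-consistency statistic reported above, Cronbach's alpha, is alpha = k/(k-1) * (1 - sum of item variances / variance of the total scores). A minimal sketch on an invented response matrix (the data below are illustrative only, not the study's):

```python
from statistics import pvariance

def cronbach_alpha(rows):
    # rows: one list of item scores per respondent.
    k = len(rows[0])
    items = list(zip(*rows))                       # per-item columns
    item_var = sum(pvariance(col) for col in items)
    total_var = pvariance([sum(r) for r in rows])
    return k / (k - 1) * (1 - item_var / total_var)

# Hypothetical 4 respondents x 3 items, scored 1-4:
responses = [[1, 2, 1], [2, 1, 2], [3, 3, 2], [4, 4, 4]]
print(round(cronbach_alpha(responses), 2))  # 0.93
```

    Alpha approaches 1 as items covary strongly (respondents who score high on one item score high on the others), which is why it is read as evidence of internal consistency.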

  7. The Alcohol Relapse Situation Appraisal Questionnaire: Development and Validation

    PubMed Central

    Martin, Rosemarie A.; MacKinnon, Selene M.; Johnson, Jennifer E.; Myers, Mark G.; Cook, Travis A. R.; Rohsenow, Damaris J.

    2011-01-01

    Background The role of cognitive appraisal of the threat of alcohol relapse has received little attention. A previous instrument, the Relapse Situation Appraisal Questionnaire (RSAQ), was developed to assess cocaine users’ primary appraisal of the threat of situations posing a high risk for cocaine relapse. The purpose of the present study was to modify the RSAQ in order to measure primary appraisal in situations involving a high risk for alcohol relapse. Methods The development and psychometric properties of this instrument, the Alcohol Relapse Situation Appraisal Questionnaire (A-RSAQ), were examined with two samples of abstinent adults with alcohol abuse or dependence. Factor structure and validity were examined in Study 1 (N=104). Confirmation of the factor structure and predictive validity were assessed in Study 2 (N=161). Results Results demonstrated construct, discriminant and predictive validity and reliability of the A-RSAQ. Discussion Results support the important role of primary appraisal of degree of risk in alcohol relapse situations. PMID:21237586

  8. Bem Sex Role Inventory Validation in the International Mobility in Aging Study.

    PubMed

    Ahmed, Tamer; Vafaei, Afshin; Belanger, Emmanuelle; Phillips, Susan P; Zunzunegui, Maria-Victoria

    2016-09-01

    This study investigated the measurement structure of the Bem Sex Role Inventory (BSRI) using different factor analysis methods. Most previous validity studies applied exploratory factor analysis (EFA) to examine the BSRI. We aimed to assess the psychometric properties and construct validity of the 12-item short-form BSRI administered to a sample of 1,995 older adults from wave 1 of the International Mobility in Aging Study (IMIAS). We used Cronbach's alpha to assess internal consistency reliability and confirmatory factor analysis (CFA) to assess psychometric properties. EFA revealed a three-factor model, which was further tested with CFA and compared with the original two-factor structure. Results revealed that a two-factor solution (instrumentality-expressiveness) has satisfactory construct validity and superior fit to the data compared with the three-factor solution. The two-factor solution confirms expected gender differences in older adults. The 12-item BSRI provides a brief, psychometrically sound, and reliable instrument for international samples of older adults.

  9. Radiated Sound Power from a Curved Honeycomb Panel

    NASA Technical Reports Server (NTRS)

    Robinson, Jay H.; Buehrle, Ralph D.; Klos, Jacob; Grosveld, Ferdinand W.

    2003-01-01

    The validation of finite element and boundary element models for the vibro-acoustic response of a curved honeycomb core composite aircraft panel was completed. The finite element and boundary element models were previously validated separately. The validation process was hampered significantly by the method in which the panel was installed in the test facility: the fixture used was made primarily of fiberboard, and the panel was held in a groove in the fiberboard by a compression fitting made of plastic tubing. The validated model is intended to be used to evaluate noise reduction concepts on both an experimental and an analytical basis simultaneously. An initial parametric study of the influence of core thickness on the radiated sound power from this panel was subsequently conducted using this numerical model. The study was significantly influenced by strong boundary condition effects but indicated that the radiated sound power from this panel was insensitive to core thickness, primarily due to the offsetting effects of added mass and added stiffness in the frequency range investigated.

  10. Rapid determination of flavonoids and phenolic acids in grape juices and wines by RP-HPLC/DAD: Method validation and characterization of commercial products of the new Brazilian varieties of grape.

    PubMed

    Padilha, Carla Valéria da Silva; Miskinis, Gabriela Aquino; de Souza, Marcelo Eduardo Alves Olinda; Pereira, Giuliano Elias; de Oliveira, Débora; Bordignon-Luiz, Marilde Terezinha; Lima, Marcos Dos Santos

    2017-08-01

    A method for rapid determination of phenolic compounds by reversed-phase high-performance liquid chromatography (RP-HPLC), using a new faster-resolution column, was validated and used to characterize commercial products made from the new Brazilian grape varieties of the Northeast of Brazil. The in vitro antioxidant activity was also measured. The method showed linearity (R > 0.9995), good precision (CV < 2.78%), recovery (91.8-105.1%) and limits of detection (0.04-0.85 mg L-1) and quantification (0.04-1.41 mg L-1) in line with previously published methods, with a run time of only 25 min. The results obtained in the characterization of the samples differed from those for juices and wines from other world regions, mainly because of the high values of (-)-epigallocatechin and trans-caftaric acid. The products analyzed showed high antioxidant activity, especially the wine samples, with values higher than those of wines from different regions of the world. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. Ultrasound SIV measurement of helical valvular flow behind the great saphenous vein

    NASA Astrophysics Data System (ADS)

    Park, Jun Hong; Kim, Jeong Ju; Lee, Sang Joon; Yeom, Eunseop; Experimental Fluid Mechanics Laboratory Team; Microthermal & Microfluidic Measurements Laboratory Collaboration

    2017-11-01

    Dysfunction of venous valves and the induced secondary abnormal flow are closely associated with venous diseases. Detailed analysis of venous valvular flow is therefore invaluable from biological and medical perspectives. However, most previous studies on venous perivalvular flows were based on qualitative analyses, and these flows have not yet been quantitatively characterized. In this study, 3D valvular flows under in vitro and in vivo conditions were experimentally investigated using ultrasound speckle image velocimetry (SIV) to analyze their flow characteristics. The results for the in vitro model obtained by the SIV technique were compared with those derived by numerical simulation and the color Doppler method to validate its measurement accuracy. Blood flow in the human great saphenous vein was then measured using SIV with respect to a dimensionless index, the helical intensity. The results obtained by the SIV method matched well with those obtained by the numerical simulation and the color Doppler method. The hemodynamic characteristics of 3D valvular flows measured by the validated SIV method would be helpful in the diagnosis of valve-related venous diseases.

  12. VALIDATION OF MICROSATELLITE MARKERS FOR USE IN GENOTYPING POLYCLONAL PLASMODIUM FALCIPARUM INFECTIONS

    PubMed Central

    GREENHOUSE, BRYAN; MYRICK, ALISSA; DOKOMAJILAR, CHRISTIAN; WOO, JONATHAN M.; CARLSON, ELAINE J.; ROSENTHAL, PHILIP J.; DORSEY, GRANT

    2006-01-01

    Genotyping methods for Plasmodium falciparum drug efficacy trials have not been standardized and may fail to accurately distinguish recrudescence from new infection, especially in high transmission areas where polyclonal infections are common. We developed a simple method for genotyping using previously identified microsatellites and capillary electrophoresis, validated this method using mixtures of laboratory clones, and applied the method to field samples. Two microsatellite markers produced accurate results for single-clone but not polyclonal samples. Four other microsatellite markers were as sensitive as, and more specific than, commonly used genotyping techniques based on merozoite surface proteins 1 and 2. When applied to samples from 15 patients in Burkina Faso with recurrent parasitemia after treatment with sulphadoxine-pyrimethamine, the addition of these four microsatellite markers to msp1 and msp2 genotyping resulted in a reclassification of outcomes that strengthened the association between dhfr 59R, an anti-folate resistance mutation, and recrudescence (P = 0.31 versus P = 0.03). Four microsatellite markers performed well on polyclonal samples and may provide a valuable addition to genotyping for clinical drug efficacy studies in high transmission areas. PMID:17123974

  13. Validating a Coarse-Grained Potential Energy Function through Protein Loop Modelling

    PubMed Central

    MacDonald, James T.; Kelley, Lawrence A.; Freemont, Paul S.

    2013-01-01

    Coarse-grained (CG) methods for sampling protein conformational space have the potential to increase computational efficiency by reducing the degrees of freedom. The gain in computational efficiency of CG methods often comes at the expense of non-protein-like local conformational features. This could cause problems when transitioning to full atom models in a hierarchical framework. Here, a CG potential energy function was validated by applying it to the problem of loop prediction. A novel method to sample the conformational space of backbone atoms was benchmarked using a standard test set consisting of 351 distinct loops. This method used a sequence-independent CG potential energy function representing the protein using α-carbon positions only, sampling conformations with a Monte Carlo simulated annealing based protocol. Backbone atoms were added using a method previously described and then gradient minimised in the Rosetta force field. Despite the CG potential energy function being sequence-independent, the method performed similarly to methods that explicitly use either fragments of known protein backbones with similar sequences or residue-specific φ/ψ-maps to restrict the search space. The method was also able to predict with sub-Angstrom accuracy two out of seven loops from recently solved crystal structures of proteins with low sequence and structure similarity to previously deposited structures in the PDB. The ability to sample realistic loop conformations directly from a potential energy function enables the incorporation of additional geometric restraints and the use of more advanced sampling methods in a way that is not easily possible with fragment replacement methods, and also enables multi-scale simulations for protein design and protein structure prediction. These restraints could be derived from experimental data or could be design restraints in the case of computational protein design. C++ source code is available for download from http://www.sbg.bio.ic.ac.uk/phyre2/PD2/. PMID:23824634

  14. An evaluation of pediatric dental patient education materials using contemporary health literacy measures.

    PubMed

    Kang, Edith; Fields, Henry W; Cornett, Sandy; Beck, F Michael

    2005-01-01

    The purpose of this study was to determine the appropriateness of nationally available dental information materials according to the suitability assessment of materials (SAM) method. Clinically related, professionally produced patient dental health education materials (N=22) provided by the American Academy of Pediatric Dentistry (AAPD) were evaluated using the SAM method, which had previously been judged valid and reliable. A rater was trained by an experienced health literacy evaluator to establish validity. The rater then rated all materials on 5 categories of assessment (content, literacy demand, graphics, layout and typography, and learning stimulation/motivation) and an overall assessment, and re-rated 5 materials to establish intrarater reliability. Agreement with the experienced rater (validity) was K=0.43, and intrarater reliability across all ratings was K=0.52. The consistently weakest categories were content, graphics, and learning stimulation, while reading level, as part of literacy demand, was often not suitable. The overall suitability of the AAPD materials was generally classified as superior. Reliable and valid evaluation of available dental patient information materials can be accomplished. The materials were largely superior, but there is great variability within the categories of evaluation. The categories of content, graphics, and learning stimulation require attention and could raise the overall quality of the materials.

  15. A validation procedure for a LADAR system radiometric simulation model

    NASA Astrophysics Data System (ADS)

    Leishman, Brad; Budge, Scott; Pack, Robert

    2007-04-01

    The USU LadarSIM software package is a ladar system engineering tool that has recently been enhanced to include modeling of the radiometry of ladar beam footprints. This paper discusses our validation of the radiometric model and presents a practical approach to future validation work. To validate complicated and interrelated factors affecting radiometry, a systematic approach had to be developed: data for known parameters were first gathered, then unknown parameters of the system were determined from simulation test scenarios. This was done in a way that isolated as many unknown variables as possible, then built on the previously obtained results. First, the appropriate voltage threshold levels of the discrimination electronics were set by analyzing the number of false alarms seen in actual data sets. With this threshold set, the system noise was then adjusted to achieve the appropriate number of dropouts. Once a suitable noise level was found, the range errors of the simulated and actual data sets were compared and studied. Predicted errors in range measurements were analyzed using two methods: first by examining the range error of a surface with known reflectivity, and second by examining the range errors for specific detectors with known responsivities. This provided insight into the discrimination method and receiver electronics used in the actual system.

  16. Using meta-differential evolution to enhance a calculation of a continuous blood glucose level.

    PubMed

    Koutny, Tomas

    2016-09-01

    We developed a new model of glucose dynamics. The model calculates blood glucose level as a function of transcapillary glucose transport. In previous studies, we validated the model with animal experiments, using an analytical method to determine model parameters. In this study, we validate the model with subjects with type 1 diabetes, combining the analytical method with meta-differential evolution. To validate the model with human patients, we obtained a data set from a type 1 diabetes study coordinated by the Jaeb Center for Health Research. We calculated a continuous blood glucose level from the continuously measured interstitial fluid glucose level, using 6 different scenarios to ensure robust validation of the calculation. Over 96% of calculated blood glucose levels fell within zones A and B of the Clarke Error Grid. No data set required any correction of model parameters during the time course of measurement. We successfully verified the possibility of calculating a continuous blood glucose level for subjects with type 1 diabetes. This study signals a successful transition of our research from animal experiments to human patients. Researchers can test our model with their data on-line at https://diabetes.zcu.cz. Copyright © 2016 The Author. Published by Elsevier Ireland Ltd. All rights reserved.

  17. Incidence of post-operative adhesions following Misgav Ladach caesarean section--a comparative study.

    PubMed

    Fatusić, Zlatan; Hudić, Igor

    2009-02-01

    To evaluate the incidence of peritoneal adhesions as a post-operative complication after caesarean section performed by the Misgav Ladach method and compare it with that following traditional caesarean section methods (Pfannenstiel-Dörffler, low midline laparotomy-Dörffler). The analysis is retrospective and based on medical documentation of the Clinic for Gynecology and Obstetrics, University Clinical Centre, Tuzla, Bosnia and Herzegovina (data from 1 January 2001 to 31 December 2005). We analysed repeat caesarean sections according to the method used at the previous caesarean section (200 by the Misgav Ladach method, 100 by the Pfannenstiel-Dörffler method and 100 by low midline laparotomy-Dörffler). Adhesion scores were assigned using a previously validated scoring system. We found a statistically significant difference (p < 0.05) in the incidence of peritoneal adhesions at second and third caesarean section between the Misgav Ladach method and the Pfannenstiel-Dörffler and low midline laparotomy-Dörffler methods. The difference in incidence of peritoneal adhesions between the low midline laparotomy-Dörffler and Pfannenstiel-Dörffler methods was not statistically significant (p > 0.05). The mean pelvic adhesion score was significantly lower in the Misgav Ladach group (0.43 +/- 0.79) than in the Pfannenstiel-Dörffler (0.71 +/- 1.27) and low midline laparotomy-Dörffler groups (0.99 +/- 1.49) (p < 0.05). Our study showed that the Misgav Ladach method of caesarean section is associated with a lower incidence of peritoneal adhesions as a post-operative complication of a previous caesarean section.

  18. Identification of volatiles by headspace gas chromatography with simultaneous flame ionization and mass spectrometric detection.

    PubMed

    Tiscione, Nicholas B; Yeatman, Dustin Tate; Shan, Xiaoqin; Kahl, Joseph H

    2013-10-01

    Volatiles are frequently abused as inhalants. The methods used for identification are generally nonspecific if analyzed concurrently with ethanol or require an additional analytical procedure that employs mass spectrometry. A previously published technique utilizing a capillary flow technology splitter to simultaneously quantitate and confirm ethyl alcohol by flame ionization and mass spectrometric detection after headspace sampling and gas chromatographic separation was evaluated for the detection of inhalants. Methanol, isopropanol, acetone, acetaldehyde, toluene, methyl ethyl ketone, isoamyl alcohol, isobutyl alcohol, n-butyl alcohol, 1,1-difluoroethane, 1,1,1-trifluoroethane, 1,1,1,2-tetrafluoroethane (Norflurane, HFC-134a), chloroethane, trichlorofluoromethane (Freon®-11), dichlorodifluoromethane (Freon®-12), dichlorofluoromethane (Freon®-21), chlorodifluoromethane (Freon®-22) and 1,2-dichlorotetrafluoroethane (Freon®-114) were validated for qualitative identification by this method. The validation for qualitative identification included evaluation of matrix effects, sensitivity, carryover, specificity, repeatability and ruggedness/robustness.

  19. Noninvasive evaluation of left ventricular elastance according to pressure-volume curves modeling in arterial hypertension.

    PubMed

    Bonnet, Benjamin; Jourdan, Franck; du Cailar, Guilhem; Fesler, Pierre

    2017-08-01

    End-systolic left ventricular (LV) elastance (Ees) has been previously calculated and validated invasively using LV pressure-volume (P-V) loops. Noninvasive methods have been proposed, but clinical application remains complex. The aims of the present study were to 1) estimate Ees by modeling the LV P-V curve during ejection (the "ejection P-V curve" method) and validate the method against existing published LV P-V loop data, and 2) test the clinical applicability of noninvasively detecting a difference in Ees between normotensive and hypertensive subjects. On the basis of the ejection P-V curve and a linear relationship between elastance and time during ejection, we used a nonlinear least-squares method to fit the pressure waveform. We then computed the slope and intercept of time-varying elastance as well as the volume intercept (V0). As a validation, 22 P-V loops obtained from previous invasive studies were digitized and analyzed using the ejection P-V curve method. To test clinical applicability, ejection P-V curves were obtained from 33 hypertensive and 32 normotensive subjects with carotid tonometry and real-time three-dimensional echocardiography during the same procedure. A good univariate relationship (r² = 0.92, P < 0.005) and good limits of agreement were found between the invasive calculation of Ees and our newly proposed ejection P-V curve method. In hypertensive patients, an increase in arterial elastance (Ea) was compensated by a parallel increase in Ees without change in Ea/Ees. In addition, the clinical reproducibility of our method was similar to that of another noninvasive method. In conclusion, Ees and V0 can be estimated noninvasively from modeling of the P-V curve during ejection. This approach was found to be reproducible and sensitive enough to detect an expected increase in LV contractility in hypertensive patients. Because of its noninvasive nature, this methodology may have clinical implications in various disease states. NEW & NOTEWORTHY The use of real-time three-dimensional echocardiography-derived left ventricular volumes in conjunction with carotid tonometry was found to be reproducible and sensitive enough to detect expected differences in left ventricular elastance in arterial hypertension. Because of its noninvasive nature, this methodology may have clinical implications in various disease states. Copyright © 2017 the American Physiological Society.

  20. Analysis of cocoa flavanols and procyanidins (DP 1-10) in cocoa-containing ingredients and products by rapid resolution liquid chromatography: single-laboratory validation.

    PubMed

    Machonis, Philip R; Jones, Matthew A; Kwik-Uribe, Catherine

    2014-01-01

    Recently, a multilaboratory validation (MLV) of AOAC Official Method 2012.24 for the determination of cocoa flavanols and procyanidins (CF-CP) in cocoa-based ingredients and products determined that the method was robust, reliable, and transferrable. Due to the complexity of the CF-CP molecules, this method required a run time exceeding 1 h to achieve acceptable separations. To address this issue, a rapid resolution normal-phase LC method was developed and a single-laboratory validation (SLV) study was conducted. Flavanols and procyanidins with a degree of polymerization (DP) up to 10 were eluted in 15 min using a binary gradient applied to a diol stationary phase, detected using fluorescence detection, and reported as a total sum of DP 1-10. Quantification was achieved using (-)-epicatechin-based relative response factors for DP 2-10. Spike recovery samples and seven different types of cocoa-based samples were analyzed to evaluate the accuracy, precision, LOD, LOQ, and linearity of the method. The within-day precision of the reported content for the samples was 1.15-5.08%, and the overall precision was 3.97-13.61%. Spike-recovery experiments demonstrated recoveries of over 98%. The results of this SLV were compared with those previously obtained in the MLV and found to be consistent. The translation to rapid resolution LC allowed for an 80% reduction in analysis time and solvent usage, while retaining the accuracy and reliability of the original method. The savings in both cost and time make this rapid method well-suited for routine laboratory use.
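    Relative response factors (RRFs) let a single (-)-epicatechin calibration cover oligomers for which no standard exists: each DP's peak area is divided by the calibration slope scaled by that DP's RRF, and the results are summed. A minimal sketch (slope, RRFs, and peak areas below are invented placeholders, not the method's actual values):

```python
# Hypothetical calibration slope for (-)-epicatechin: area per (mg/mL).
epicatechin_slope = 1.2e4

# Assumed RRFs per degree of polymerization and measured peak areas
# (all values invented for illustration):
rrf = {1: 1.00, 2: 0.85, 3: 0.72}
peak_areas = {1: 6000.0, 2: 5100.0, 3: 4320.0}

def concentration(dp, area):
    # area = slope * RRF(dp) * concentration  =>  solve for concentration.
    return area / (epicatechin_slope * rrf[dp])

# Report the total as a sum over the quantified DP classes.
total = sum(concentration(dp, a) for dp, a in peak_areas.items())
print(round(total, 3))
```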

  1. Catch-up validation study of an in vitro skin irritation test method based on an open source reconstructed epidermis (phase II).

    PubMed

    Groeber, F; Schober, L; Schmid, F F; Traube, A; Kolbus-Hernandez, S; Daton, K; Hoffmann, S; Petersohn, D; Schäfer-Korting, M; Walles, H; Mewes, K R

    2016-10-01

    To replace the Draize skin irritation assay (OECD guideline 404), several test methods based on reconstructed human epidermis (RHE) have been developed and adopted in OECD test guideline 439. However, all validated test methods in the guideline are linked to RHE provided by only three companies; the availability of these test models is thus dependent on the commercial interest of the producers. To overcome this limitation, and thus to increase the accessibility of in vitro skin irritation testing, an open source reconstructed epidermis (OS-REp) was introduced. To demonstrate the capacity of the OS-REp in regulatory risk assessment, a catch-up validation study was performed. The participating laboratories used in-house generated OS-REp to assess the set of 20 reference substances according to the performance standards amending OECD test guideline 439. Testing was performed under blinded conditions. The within-laboratory reproducibility of 87% and the inter-laboratory reproducibility of 85% demonstrate the high reliability of irritancy testing using the OS-REp protocol. In addition, the prediction capacity, with an accuracy of 80%, was comparable to previously published RHE-based test protocols. Taken together, the results indicate that the OS-REp test method can be used as a standalone alternative skin irritation test replacing OECD test guideline 404. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  2. A method to validate quantitative high-frequency power doppler ultrasound with fluorescence in vivo video microscopy.

    PubMed

    Pinter, Stephen Z; Kim, Dae-Ro; Hague, M Nicole; Chambers, Ann F; MacDonald, Ian C; Lacefield, James C

    2014-08-01

    Flow quantification with high-frequency (>20 MHz) power Doppler ultrasound can be performed objectively using the wall-filter selection curve (WFSC) method to select the cutoff velocity that yields a best-estimate color pixel density (CPD). An in vivo video microscopy system (IVVM) is combined with high-frequency power Doppler ultrasound to provide a method for validation of CPD measurements based on WFSCs in mouse testicular vessels. The ultrasound and IVVM systems are instrumented so that the mouse remains on the same imaging platform when switching between the two modalities. In vivo video microscopy provides gold-standard measurements of vascular diameter to validate power Doppler CPD estimates. Measurements in four image planes from three mice exhibit wide variation in the optimal cutoff velocity and indicate that a predetermined cutoff velocity setting can introduce significant errors in studies intended to quantify vascularity. Consistent with previously published flow-phantom data, in vivo WFSCs exhibited three characteristic regions and detectable plateaus. Selection of a cutoff velocity at the right end of the plateau yielded a CPD close to the gold-standard vascular volume fraction estimated using IVVM. An investigator can implement the WFSC method to help adapt cutoff velocity to current blood flow conditions and thereby improve the accuracy of power Doppler for quantitative microvascular imaging. Copyright © 2014 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.
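
    The WFSC procedure selects the cutoff velocity at the right end of a plateau in the CPD-versus-cutoff curve. One plausible plateau-detection rule can be sketched as follows (the tolerance, the sample sweep, and the `plateau_right_end` helper are assumptions for illustration, not the authors' implementation):

```python
def plateau_right_end(cutoffs, cpd, tol=0.02):
    """Return the cutoff at the right end of the longest flat run (plateau)
    of the CPD-versus-cutoff curve; falls back to the first cutoff if no
    two adjacent points lie within tol of each other."""
    best_len, best_end, start = 0, 0, 0
    for i in range(1, len(cpd)):
        if abs(cpd[i] - cpd[i - 1]) > tol:
            start = i  # flat run broken; restart it here
        if i - start > best_len:
            best_len, best_end = i - start, i
    return cutoffs[best_end]

# Hypothetical wall-filter sweep: CPD falls, plateaus, then falls again
cutoffs = [0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5]  # assumed units (mm/s)
cpd = [0.90, 0.60, 0.41, 0.40, 0.39, 0.25, 0.10]
chosen = plateau_right_end(cutoffs, cpd)
print(chosen)  # 2.5
```

    Picking the plateau's right end, rather than a fixed preset cutoff, is what adapts the filter to the current flow conditions.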

  3. HPLC-electrospray mass spectrometric assay for the determination of (R,R)-fenoterol in rat plasma.

    PubMed

    Siluk, Danuta; Kim, Hee Seung; Cole, Tyler; Wainer, Irving W

    2008-11-04

    A fast and specific liquid chromatography-mass spectrometry method for the determination of (R,R)-fenoterol ((R,R)-Fen) in rat plasma has been developed and validated. (R,R)-Fen was extracted from 125 μl of plasma using solid phase extraction and analyzed on an Atlantis HILIC Silica 3 μm column. The mobile phase was composed of acetonitrile:ammonium acetate (pH 4.1; 20 mM) (85:15, v/v), at a flow rate of 0.2 ml/min. The lower limit of detection (LLOD) was 2 ng/ml. The procedure was validated and applied to the analysis of plasma samples from rats previously administered (R,R)-Fen as an intravenous bolus.

  4. Tracking Biases: An Update to the Validity and Reliability of Alcohol Retail Sales Data for Estimating Population Consumption in Scotland

    PubMed Central

    Henderson, Audrey; Robinson, Mark; McAdams, Rachel; McCartney, Gerry; Beeston, Clare

    2016-01-01

    Aims To highlight the importance of monitoring biases when using retail sales data to estimate population alcohol consumption. Methods Previously, we identified and where possible quantified sources of bias that may lead to under- or overestimation of alcohol consumption in Scotland. Here, we update findings by using more recent data and by quantifying emergent biases. Results Underestimation resulting from the net effect of biases on population consumption in Scotland increased from −4% in 2010 to −7% in 2013. Conclusion Biases that might impact on the validity and reliability of sales data when estimating population consumption should be routinely monitored and updated. PMID:26419684

  5. Validation of a predictive model for survival and growth of Salmonella Typhimurium DT104 on chicken skin for extrapolation to a previous history of frozen storage

    USDA-ARS?s Scientific Manuscript database

    A predictive model for survival and growth of Salmonella Typhimurium DT104 on chicken skin was evaluated for its ability to predict survival and growth of the same organism after frozen storage for 6 days at -20 C. Experimental methods used to collect data for model development were the same as tho...

  6. Oversimplifying quantum factoring.

    PubMed

    Smolin, John A; Smith, Graeme; Vargo, Alexander

    2013-07-11

    Shor's quantum factoring algorithm exponentially outperforms known classical methods. Previous experimental implementations have used simplifications dependent on knowing the factors in advance. However, as we show here, all composite numbers admit simplification of the algorithm to a circuit equivalent to flipping coins. The difficulty of a particular experiment therefore depends on the level of simplification chosen, not the size of the number factored. Valid implementations should not make use of the answer sought.

  7. Shortened Nonword Repetition Task (NWR-S): A Simple, Quick, and Less Expensive Outcome to Identify Children with Combined Specific Language and Reading Impairment

    ERIC Educational Resources Information Center

    le Clercq, Carlijn M. P.; van der Schroeff, Marc P.; Rispens, Judith E.; Ruytjens, Liesbet; Goedegebure, André; van Ingen, Gijs; Franken, Marie-Christine

    2017-01-01

    Purpose: The purpose of this research note was to validate a simplified version of the Dutch nonword repetition task (NWR; Rispens & Baker, 2012). The NWR was shortened and scoring was transformed to correct/incorrect nonwords, resulting in the shortened NWR (NWR-S). Method: NWR-S and NWR performance were compared in the previously published…

  8. More Easily Cultivated Than Identified: Classical Isolation With Molecular Identification of Vaginal Bacteria

    PubMed Central

    Srinivasan, Sujatha; Munch, Matthew M.; Sizova, Maria V.; Fiedler, Tina L.; Kohler, Christina M.; Hoffman, Noah G.; Liu, Congzhou; Agnew, Kathy J.; Marrazzo, Jeanne M.; Epstein, Slava S.; Fredricks, David N.

    2016-01-01

    Background. Women with bacterial vaginosis (BV) have complex communities of anaerobic bacteria. There are no cultivated isolates of several bacteria identified using molecular methods and associated with BV. It is unclear whether this is due to the inability to adequately propagate these bacteria or to correctly identify them in culture. Methods. Vaginal fluid from 15 women was plated on 6 different media using classical cultivation approaches. Individual isolates were identified by 16S ribosomal RNA (rRNA) gene sequencing and compared with validly described species. Bacterial community profiles in vaginal samples were determined using broad-range 16S rRNA gene polymerase chain reaction and pyrosequencing. Results. We isolated and identified 101 distinct bacterial strains spanning 6 phyla including (1) novel strains with <98% 16S rRNA sequence identity to validly described species, (2) closely related species within a genus, (3) bacteria previously isolated from body sites other than the vagina, and (4) known bacteria formerly isolated from the vagina. Pyrosequencing showed that novel strains Peptoniphilaceae DNF01163 and Prevotellaceae DNF00733 were prevalent in women with BV. Conclusions. We isolated a diverse set of novel and clinically significant anaerobes from the human vagina using conventional approaches with systematic molecular identification. Several previously “uncultivated” bacteria are amenable to conventional cultivation. PMID:27449870

  9. Computation of Pressurized Gas Bearings Using CE/SE Method

    NASA Technical Reports Server (NTRS)

    Cioc, Sorin; Dimofte, Florin; Keith, Theo G., Jr.; Fleming, David P.

    2003-01-01

    The space-time conservation element and solution element (CE/SE) method is extended to compute compressible viscous flows in pressurized thin fluid films. This numerical scheme has previously been used successfully to solve a wide variety of compressible flow problems, including flows with large and small discontinuities. In this paper, the method is applied to calculate the pressure distribution in a hybrid gas journal bearing. The formulation of the problem is presented, including the modeling of the feeding system. The numerical results obtained are compared with experimental data. Good agreement between the computed results and the test data was obtained, thus validating the CE/SE method for such problems.

  10. Application of the ratio difference spectrophotometry to the determination of ibuprofen and famotidine in their combined dosage form: comparison with previously published spectrophotometric methods.

    PubMed

    Zaazaa, Hala E; Elzanfaly, Eman S; Soudi, Aya T; Salem, Maissa Y

    2015-05-15

    A ratio difference (RD) spectrophotometric method was developed for the determination of ibuprofen and famotidine in their mixture form. Ibuprofen and famotidine were determined in the presence of each other by the RD method, where linearity was obtained from 50 to 600 μg/mL and 2.5 to 25 μg/mL for ibuprofen and famotidine, respectively. The suggested method was validated according to ICH guidelines and successfully applied to the analysis of ibuprofen and famotidine in their pharmaceutical dosage forms without interference from any additives or excipients. Copyright © 2015 Elsevier B.V. All rights reserved.

  11. Redrawing the US Obesity Landscape: Bias-Corrected Estimates of State-Specific Adult Obesity Prevalence

    PubMed Central

    Ward, Zachary J.; Long, Michael W.; Resch, Stephen C.; Gortmaker, Steven L.; Cradock, Angie L.; Giles, Catherine; Hsiao, Amber; Wang, Y. Claire

    2016-01-01

    Background State-level estimates from the Centers for Disease Control and Prevention (CDC) underestimate the obesity epidemic because they use self-reported height and weight. We describe a novel bias-correction method and produce corrected state-level estimates of obesity and severe obesity. Methods Using non-parametric statistical matching, we adjusted self-reported data from the Behavioral Risk Factor Surveillance System (BRFSS) 2013 (n = 386,795) using measured data from the National Health and Nutrition Examination Survey (NHANES) (n = 16,924). We validated our national estimates against NHANES and estimated bias-corrected state-specific prevalence of obesity (BMI≥30) and severe obesity (BMI≥35). We compared these results with previous adjustment methods. Results Compared to NHANES, self-reported BRFSS data underestimated national prevalence of obesity by 16% (28.67% vs 34.01%), and severe obesity by 23% (11.03% vs 14.26%). Our method was not significantly different from NHANES for obesity or severe obesity, while previous methods underestimated both. Only four states had a corrected obesity prevalence below 30%, with four exceeding 40%; in contrast, most states were below 30% in CDC maps. Conclusions Twelve million adults with obesity (including 6.7 million with severe obesity) were misclassified by CDC state-level estimates. Previous bias-correction methods also resulted in underestimates. Accurate state-level estimates are necessary to plan for resources to address the obesity epidemic. PMID:26954566
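
    The non-parametric statistical matching described above is, in spirit, a nearest-neighbour substitution of measured values for self-reported ones. A minimal hot-deck-style sketch under that assumption (the data and the `bias_correct` helper are hypothetical; the actual method matches on many covariates, not BMI alone):

```python
def bias_correct(self_reported, reference_pairs):
    """Replace each self-reported BMI with the measured BMI of the nearest
    self-report match in a reference survey (hot-deck-style matching)."""
    corrected = []
    for bmi in self_reported:
        # nearest neighbour on the self-reported value only
        _, measured = min(reference_pairs, key=lambda pair: abs(pair[0] - bmi))
        corrected.append(measured)
    return corrected

# Hypothetical (self-reported BMI, measured BMI) pairs from a reference survey
reference = [(26.8, 28.1), (29.4, 31.2), (31.1, 33.0)]
survey = [27.0, 29.5, 31.0]
corrected = bias_correct(survey, reference)
print(corrected)  # [28.1, 31.2, 33.0]
```

    Because measured BMI tends to exceed self-reported BMI, the corrected values shift the prevalence estimates upward, as in the study's state-level maps.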

  12. Percolation in real multiplex networks

    NASA Astrophysics Data System (ADS)

    Bianconi, Ginestra; Radicchi, Filippo

    2016-12-01

    We present an exact mathematical framework able to describe site-percolation transitions in real multiplex networks. Specifically, we consider the average percolation diagram valid over an infinite number of random configurations where nodes are present in the system with given probability. The approach relies on the locally treelike ansatz, so that it is expected to accurately reproduce the true percolation diagram of sparse multiplex networks with negligible number of short loops. The performance of our theory is tested in social, biological, and transportation multiplex graphs. When compared against previously introduced methods, we observe improvements in the prediction of the percolation diagrams in all networks analyzed. Results from our method confirm previous claims about the robustness of real multiplex networks, in the sense that the average connectedness of the system does not exhibit any significant abrupt change as its individual components are randomly destroyed.
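
    For intuition, single-layer site percolation can be simulated directly: occupy each node with probability p and measure the largest connected cluster. This sketch (the random-graph construction and helper functions are assumptions) does not implement the paper's multiplex mutual-connectivity calculation, only the underlying single-network process:

```python
import random
from collections import deque

def largest_component_fraction(adj, present):
    """Fraction of all nodes lying in the largest cluster of occupied nodes."""
    seen, best = set(), 0
    for s in present:
        if s in seen:
            continue
        queue, size = deque([s]), 0
        seen.add(s)
        while queue:
            u = queue.popleft()
            size += 1
            for v in adj[u]:
                if v in present and v not in seen:
                    seen.add(v)
                    queue.append(v)
        best = max(best, size)
    return best / len(adj)

def site_percolation(adj, p, rng):
    """Occupy each node independently with probability p, measure the giant cluster."""
    present = {u for u in adj if rng.random() < p}
    return largest_component_fraction(adj, present)

# Sparse random graph as a stand-in for one network layer (assumed, not the paper's data)
rng = random.Random(0)
n, avg_deg = 2000, 4
adj = {u: set() for u in range(n)}
for _ in range(n * avg_deg // 2):
    u, v = rng.randrange(n), rng.randrange(n)
    if u != v:
        adj[u].add(v)
        adj[v].add(u)

f_low = site_percolation(adj, 0.2, rng)   # subcritical: only tiny clusters survive
f_high = site_percolation(adj, 0.9, rng)  # supercritical: a giant cluster remains
print(round(f_low, 3), round(f_high, 3))
```

    The locally treelike ansatz mentioned in the abstract lets this kind of diagram be computed analytically rather than by simulation, provided short loops are rare.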

  13. Experimental Design for a Macrofoam Swab Study Relating the Recovery Efficiency and False Negative Rate to Low Concentrations of Two Bacillus anthracis Surrogates on Four Surface Materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Piepel, Gregory F.; Hutchison, Janine R.

    2014-04-16

    This report describes the experimental design for a laboratory study to quantify the recovery efficiencies and false negative rates of a validated, macrofoam swab sampling method for low concentrations of Bacillus anthracis Sterne (BAS) and Bacillus atrophaeus (BG) spores on four surface materials (stainless steel, glass, vinyl tile, plastic light cover panel). Two analytical methods (plating/counting and polymerase chain reaction) will be used. Only one previous study has investigated the false negative rate as a function of the test factors affecting it. The surrogates BAS and BG have not been tested together in the same study previously. Hence, this study will provide for completing gaps in the available information on the performance of macrofoam swab sampling at low concentrations.

  14. Experimental Design for a Macrofoam-Swab Study Relating the Recovery Efficiency and False Negative Rate to Low Concentrations of Two Bacillus anthracis Surrogates on Four Surface Materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Piepel, Gregory F.; Hutchison, Janine R.

    This report describes the experimental design for a laboratory study to quantify the recovery efficiencies and false negative rates of a validated, macrofoam-swab sampling method for low concentrations of Bacillus anthracis Sterne (BAS) and Bacillus atrophaeus (BG) spores on four surface materials (stainless steel, glass, vinyl tile, plastic light cover panel). Two analytical methods (culture and polymerase chain reaction) will be used. Only one previous study has investigated how the false negative rate depends on test factors. The surrogates BAS and BG have not been tested together in the same study previously. Hence, this study will provide for completing gaps in the available information on the performance of macrofoam-swab sampling at low concentrations.

  15. New valid spectrofluorimetric method for determination of selected cephalosporins in different pharmaceutical formulations using safranin as fluorophore.

    PubMed

    Derayea, Sayed M; Ahmed, Hytham M; Abdelmageed, Osama H; Haredy, Ahmed M

    2016-01-15

    A new validated spectrofluorimetric method has been developed for the determination of some cephalosporins, namely cefepime, cefaclor, cefadroxil, cefpodoxime and cefexime. The method was based on the reaction of these drugs with safranin in slightly alkaline medium (pH 8.0) to form ion-association complexes. The fluorescent products were extracted into chloroform and their fluorescence intensities were measured at 544-565 nm after excitation at 518-524 nm. The reaction conditions influencing product formation and stability were investigated and optimized. The relative fluorescence intensity was proportional to the drug concentration in the linear ranges of 0.15-1.35, 0.35-1.25, 0.35-1.25, 0.20-1.44 and 0.20-1.25 μg/mL for cefepime, cefaclor, cefadroxil, cefpodoxime proxetil and cefexime, respectively. The detection limits were 40, 100, 100, 60 and 70 ng/mL, respectively. The performance of the developed method was evaluated using Student's t-test and the variance ratio F-test to assess the significance of the proposed method relative to the reference spectrophotometric method. Various pharmaceutical formulations were successfully analyzed using the proposed method, and the results were in good agreement with those of previously reported methods. Copyright © 2015. Published by Elsevier B.V.

  16. New valid spectrofluorimetric method for determination of selected cephalosporins in different pharmaceutical formulations using safranin as fluorophore

    NASA Astrophysics Data System (ADS)

    Derayea, Sayed M.; Ahmed, Hytham M.; Abdelmageed, Osama H.; Haredy, Ahmed M.

    2016-01-01

    A new validated spectrofluorimetric method has been developed for the determination of some cephalosporins, namely cefepime, cefaclor, cefadroxil, cefpodoxime and cefexime. The method was based on the reaction of these drugs with safranin in slightly alkaline medium (pH 8.0) to form ion-association complexes. The fluorescent products were extracted into chloroform and their fluorescence intensities were measured at 544-565 nm after excitation at 518-524 nm. The reaction conditions influencing product formation and stability were investigated and optimized. The relative fluorescence intensity was proportional to the drug concentration in the linear ranges of 0.15-1.35, 0.35-1.25, 0.35-1.25, 0.20-1.44 and 0.20-1.25 μg/mL for cefepime, cefaclor, cefadroxil, cefpodoxime proxetil and cefexime, respectively. The detection limits were 40, 100, 100, 60 and 70 ng/mL, respectively. The performance of the developed method was evaluated using Student's t-test and the variance ratio F-test to assess the significance of the proposed method relative to the reference spectrophotometric method. Various pharmaceutical formulations were successfully analyzed using the proposed method, and the results were in good agreement with those of previously reported methods.

  17. Implementing statistical equating for MRCP(UK) Parts 1 and 2.

    PubMed

    McManus, I C; Chis, Liliana; Fox, Ray; Waller, Derek; Tang, Peter

    2014-09-26

    The MRCP(UK) exam, in 2008 and 2010, changed the standard-setting of its Part 1 and Part 2 examinations from a hybrid Angoff/Hofstee method to statistical equating using Item Response Theory, the reference group being UK graduates. The present paper considers the implementation of the change, whether the pass rate increased amongst non-UK candidates, any possible role of Differential Item Functioning (DIF), and changes in examination predictive validity after the change. Data from the MRCP(UK) Part 1 exam (2003-2013) and Part 2 exam (2005-2013) were analysed. Inspection suggested that Part 1 pass rates were stable after the introduction of statistical equating, but showed greater annual variation, probably because stronger candidates took the examination earlier. Pass rates seemed to have increased in non-UK graduates after equating was introduced, but this was not associated with any changes in DIF. Statistical modelling of the pass rates for non-UK graduates found that pass rates, in both Part 1 and Part 2, were increasing year on year, with the changes probably beginning before the introduction of equating. The predictive validity of Part 1 for Part 2 was higher with statistical equating than with the previous hybrid Angoff/Hofstee method, confirming the utility of IRT-based statistical equating. Statistical equating was successfully introduced into the MRCP(UK) Part 1 and Part 2 written examinations, resulting in higher predictive validity than the previous Angoff/Hofstee standard-setting. Concerns about an artefactual increase in pass rates for non-UK candidates after equating were shown not to be well founded. Most likely the changes resulted from a genuine increase in candidate ability, albeit for reasons which remain unclear, coupled with a cognitive illusion giving the impression of a step-change immediately after equating began. Statistical equating provides a robust standard-setting method with a better theoretical foundation than judgemental techniques such as Angoff; it is more straightforward, and requires far less examiner time, while providing a more valid result. The present study provides a detailed case study of introducing statistical equating and of the issues which may need to be considered with its introduction.
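
    IRT-based equating is too involved to sketch in a few lines, but the core idea of placing two test forms on a common scale can be illustrated with simple linear equating (a deliberate simplification of the MRCP(UK) procedure; the form statistics below are invented):

```python
def linear_equate(score, mean_new, sd_new, mean_ref, sd_ref):
    """Map a score on a new test form onto the reference form's scale,
    so both forms share a common metric (linear equating)."""
    return mean_ref + sd_ref * (score - mean_new) / sd_new

# Invented form statistics: new form mean 55 (SD 10), reference form mean 50 (SD 12)
equated = linear_equate(score=60, mean_new=55, sd_new=10, mean_ref=50, sd_ref=12)
print(equated)  # 56.0
```

    Equating of this kind keeps the passing standard constant across sittings even when form difficulty or cohort ability varies, which is why pass-rate changes after its introduction reflect candidates rather than the standard.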

  18. Can training in empathetic validation improve medical students' communication with patients suffering pain? A test of concept.

    PubMed

    Linton, Steven J; Flink, Ida K; Nilsson, Emma; Edlund, Sara

    2017-05-01

    Patient-centered, empathetic communication has been recommended as a means of improving the health care of patients suffering pain. However, training health care providers has been a problem, since programs may be time-consuming and difficult to learn. Validation, a form of empathetic response that communicates that what a patient experiences is accepted as true, has been suggested as an appropriate method for improving communication with patients suffering pain. We studied the immediate effects on communication of providing medical students with a two-session (45 minutes each) program in validation skills. A one-group pretest-posttest design was employed with 22 volunteer medical students. To control patient variables, actors simulated 1 of 2 patient scenarios (randomly assigned at pretest and posttest). Video recordings were blindly evaluated. Self-ratings of validation and satisfaction were also employed. Observed validation responses increased significantly after training and corresponded to significant reductions in invalidating responses. Both the patient simulators and the medical students were significantly more satisfied after the training. We demonstrated that training in empathetic validation results in improved communication, thus extending previous findings to a medical setting with patients suffering pain. Our results suggest that it would be feasible to provide validation training for health care providers, and this warrants further investigation in controlled studies.

  19. Advances in shock timing experiments on the National Ignition Facility

    NASA Astrophysics Data System (ADS)

    Robey, H. F.; Celliers, P. M.; Moody, J. D.; Sater, J.; Parham, T.; Kozioziemski, B.; Dylla-Spears, R.; Ross, J. S.; LePape, S.; Ralph, J. E.; Hohenberger, M.; Dewald, E. L.; Berzak Hopkins, L.; Kroll, J. J.; Yoxall, B. E.; Hamza, A. V.; Boehly, T. R.; Nikroo, A.; Landen, O. L.; Edwards, M. J.

    2016-03-01

    Recent advances in shock timing experiments and analysis techniques now enable shock measurements to be performed in cryogenic deuterium-tritium (DT) ice layered capsule implosions on the National Ignition Facility (NIF). Previous measurements of shock timing in inertial confinement fusion (ICF) implosions were performed in surrogate targets, where the solid DT ice shell and central DT gas were replaced with a continuous liquid deuterium (D2) fill. These previous experiments pose two surrogacy issues: a material surrogacy due to the difference in species (D2 vs. DT) and densities of the materials used, and a geometric surrogacy due to the presence of an additional interface (ice/gas) that was absent in the liquid-filled targets. This report presents experimental data and a new analysis method for validating the assumptions underlying this surrogate technique.

  20. Fall Risk Assessment Through Automatic Combination of Clinical Fall Risk Factors and Body-Worn Sensor Data.

    PubMed

    Greene, Barry R; Redmond, Stephen J; Caulfield, Brian

    2017-05-01

    Falls are the leading global cause of accidental death and disability in older adults and are the most common cause of injury and hospitalization. Accurate, early identification of patients at risk of falling could lead to timely intervention and a reduction in the incidence of fall-related injury and associated costs. We report a statistical method for fall risk assessment using standard clinical fall risk factors (N = 748). We also report a means of improving this method by automatically combining it with a fall risk assessment algorithm based on inertial sensor data and the timed-up-and-go test. Furthermore, we provide validation data on the sensor-based fall risk assessment method using a statistically independent dataset. Results obtained using cross-validation on a sample of 292 community dwelling older adults suggest that a combined clinical and sensor-based approach yields a classification accuracy of 76.0%, compared to either 73.6% for sensor-based assessment alone, or 68.8% for clinical risk factors alone. Increasing the cohort size by adding an additional 130 subjects from a separate recruitment wave (N = 422), and applying the same model building and validation method, resulted in a decrease in classification performance (68.5% for the combined classifier, 66.8% for sensor data alone, and 58.5% for clinical data alone). This suggests that heterogeneity between cohorts may be a major challenge when attempting to develop fall risk assessment algorithms which generalize well. Independent validation of the sensor-based fall risk assessment algorithm on an independent cohort of 22 community dwelling older adults yielded a classification accuracy of 72.7%. Results suggest that the present method compares well to previously reported sensor-based fall risk assessment methods.
Implementation of objective fall risk assessment methods on a large scale has the potential to improve quality of care and lead to a reduction in associated hospital costs, due to fewer admissions and reduced injuries due to falling.
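
    The abstract does not specify how the clinical and sensor-based assessments were combined; one generic possibility is late fusion of the two classifiers' predicted probabilities, sketched here with hypothetical values (the `combine_risk` helper and all numbers are assumptions, not the authors' model):

```python
def combine_risk(p_clinical, p_sensor, w=0.5):
    """Late fusion: weighted average of two classifiers' fall-risk probabilities."""
    return w * p_clinical + (1 - w) * p_sensor

def classify(p, threshold=0.5):
    """1 = flagged as at risk of falling."""
    return int(p >= threshold)

# Hypothetical per-patient probabilities (clinical model, sensor model):
pairs = [(0.30, 0.70), (0.80, 0.60), (0.20, 0.10)]
preds = [classify(combine_risk(c, s)) for c, s in pairs]
print(preds)  # [1, 1, 0]
```

    The weight w, or a full stacked model, would be tuned by cross-validation, which is where the cohort-heterogeneity problem noted in the abstract shows up.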

  1. Optimization of the freeze-drying cycle: adaptation of the pressure rise analysis model to non-instantaneous isolation valves.

    PubMed

    Chouvenc, P; Vessot, S; Andrieu, J; Vacus, P

    2005-01-01

    The principal aim of this study is to extend the previously presented pressure rise analysis (PRA) model, used for monitoring the product temperature and the resistance to mass transfer of the dried layer during primary drying, to a pilot freeze-dryer equipped with a non-instantaneous isolation valve. This method, derived from the original MTM method previously published, consists of rapidly interrupting (within a few seconds) the water vapour flow from the sublimation chamber to the condenser and analysing the resulting dynamics of the total chamber pressure increase. The valve effect on the pressure rise profile observed during the isolation valve closing period was corrected by introducing into the initial PRA model a valve characteristic function factor, which turned out to be independent of the operating conditions. This new extended PRA model was validated by implementing the two types of valves successively and by analysing the pressure rise kinetics data with the corresponding PRA models under the same operating conditions. The coherence and consistency of the identified parameter values (sublimation front temperature, dried layer mass transfer resistance) allowed validation of this extended PRA model with a non-instantaneous isolation valve. These results confirm that the PRA method, with or without an instantaneous isolation valve, is appropriate for on-line monitoring of product characteristics during freeze-drying. The advantages of PRA are that the method is rapid, non-invasive, and global. Consequently, PRA might become a powerful and promising tool not only for the control of pilot freeze-dryers but also for industrial freeze-dryers equipped with external condensers.

  2. Towards parsimony in habit measurement: Testing the convergent and predictive validity of an automaticity subscale of the Self-Report Habit Index

    PubMed Central

    2012-01-01

    Background The twelve-item Self-Report Habit Index (SRHI) is the most popular measure of energy-balance related habits. This measure characterises habit by automatic activation, behavioural frequency, and relevance to self-identity. Previous empirical research suggests that the SRHI may be abbreviated with no losses in reliability or predictive utility. Drawing on recent theorising suggesting that automaticity is the ‘active ingredient’ of habit-behaviour relationships, we tested whether an automaticity-specific SRHI subscale could capture habit-based behaviour patterns in self-report data. Methods A content validity task was undertaken to identify a subset of automaticity indicators within the SRHI. The reliability, convergent validity and predictive validity of the automaticity item subset was subsequently tested in secondary analyses of all previous SRHI applications, identified via systematic review, and in primary analyses of four raw datasets relating to energy‐balance relevant behaviours (inactive travel, active travel, snacking, and alcohol consumption). Results A four-item automaticity subscale (the ‘Self-Report Behavioural Automaticity Index’; ‘SRBAI’) was found to be reliable and sensitive to two hypothesised effects of habit on behaviour: a habit-behaviour correlation, and a moderating effect of habit on the intention-behaviour relationship. Conclusion The SRBAI offers a parsimonious measure that adequately captures habitual behaviour patterns. The SRBAI may be of particular utility in predicting future behaviour and in studies tracking habit formation or disruption. PMID:22935297

  3. Therapeutic Misconception in Research Subjects: Development and Validation of a Measure

    PubMed Central

    Appelbaum, Paul S.; Anatchkova, Milena; Albert, Karen; Dunn, Laura B.; Lidz, Charles W.

    2013-01-01

    Background Therapeutic misconception (TM), which occurs when research subjects fail to appreciate the distinction between the imperatives of clinical research and ordinary treatment, may undercut the process of obtaining meaningful consent to clinical research participation. Previous studies have found TM is widespread, but progress in addressing TM has been stymied by the absence of a validated method for assessing its presence. Purpose The goal of this study was to develop and validate a theoretically grounded measure of TM, assess its diagnostic accuracy, and test previous findings regarding its prevalence. Methods 220 participants were recruited from clinical trials at 4 academic medical centers in the U.S. Participants completed a 28-item Likert-type questionnaire to assess the presence of beliefs associated with TM, and a semi-structured TM interview designed to elicit their perceptions of the nature of the clinical trial in which they were participating. Data from the questionnaires were subjected to factor analysis and items with poor factor loadings were excluded. This resulted in a 10-item scale, with 3 strongly correlated factors and excellent internal consistency; the fit indices of the model across 10 training sets were consistent with the original results, suggesting a stable factor solution. Results The scale was validated against the TM interview, with significantly higher scores among subjects coded as displaying evidence of TM. ROC analysis based on a 10-fold internal cross-validation yielded AUC = 0.682 for any evidence of TM. When sensitivity (0.72) and specificity (0.61) were both optimized, Positive Predictive Value was 0.65 and Negative Predictive Value was 0.68, with a Positive Likelihood Ratio of 1.89, and a Negative Likelihood Ratio of 0.47. 50.5% (n=101) of participants manifested evidence of TM on the TM interview, a somewhat lower rate than in most previous studies. 
Limitations The predictive value of the scale compared with the “gold standard” clinical interview is modest, although similar to other instruments based on self-report assessing states of mind rather than discrete symptoms. Thus, although the scale can offer evidence of which subjects are at risk for distortions in their decisions and to what degree, it will not allow researchers to conclude definitively that TM is present in a given subject. Conclusions The development of a reliable and valid TM scale, even with modest predictive power, should permit investigators in clinical trials to identify subjects with tendencies to misinterpret the nature of the situation and to provide additional information to them. It should also stimulate research on how best to decrease TM and facilitate meaningful informed consent to clinical research. PMID:22942217
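
    The reported predictive values follow from sensitivity, specificity, and prevalence via Bayes' rule. Recomputing them as a check (the small differences from the published likelihood ratios of 1.89 and 0.47 presumably reflect unrounded inputs):

```python
def diagnostic_metrics(sens, spec, prevalence):
    """PPV, NPV and likelihood ratios from sensitivity, specificity and prevalence."""
    ppv = sens * prevalence / (sens * prevalence + (1 - spec) * (1 - prevalence))
    npv = spec * (1 - prevalence) / (spec * (1 - prevalence) + (1 - sens) * prevalence)
    lr_pos = sens / (1 - spec)
    lr_neg = (1 - sens) / spec
    return ppv, npv, lr_pos, lr_neg

# Values reported in the abstract: sens 0.72, spec 0.61, TM prevalence 50.5%
ppv, npv, lr_pos, lr_neg = diagnostic_metrics(0.72, 0.61, 0.505)
print(round(ppv, 2), round(npv, 2), round(lr_pos, 2), round(lr_neg, 2))  # 0.65 0.68 1.85 0.46
```

    A positive likelihood ratio below 2 is modest, consistent with the authors' caution that the scale flags risk of TM rather than diagnosing it.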

  4. Improving machine learning reproducibility in genetic association studies with proportional instance cross validation (PICV).

    PubMed

    Piette, Elizabeth R; Moore, Jason H

    2018-01-01

    Machine learning methods and conventions are increasingly employed for the analysis of large, complex biomedical data sets, including genome-wide association studies (GWAS). Reproducibility of machine learning analyses of GWAS can be hampered by biological and statistical factors, particularly so for the investigation of non-additive genetic interactions. Application of traditional cross validation to a GWAS data set may result in poor consistency between the training and testing data set splits due to an imbalance of the interaction genotypes relative to the data as a whole. We propose a new cross validation method, proportional instance cross validation (PICV), that preserves the original distribution of an independent variable when splitting the data set into training and testing partitions. We apply PICV to simulated GWAS data with epistatic interactions of varying minor allele frequencies and prevalences and compare performance to that of a traditional cross validation procedure in which individuals are randomly allocated to training and testing partitions. Sensitivity and positive predictive value are significantly improved across all tested scenarios for PICV compared to traditional cross validation. We also apply PICV to GWAS data from a study of primary open-angle glaucoma to investigate a previously reported interaction, which fails to replicate significantly; PICV, however, improves the consistency of testing and training results. Application of traditional machine learning procedures to biomedical data may require modifications to better suit intrinsic characteristics of the data, such as the potential for highly imbalanced genotype distributions in the case of epistasis detection. The reproducibility of genetic interaction findings can be improved by considering this variable imbalance in cross validation implementation, such as with PICV. 
This approach may be extended to problems in other domains in which imbalanced variable distributions are a concern.
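    A minimal sketch of proportional (stratified) train/test splitting in the spirit of PICV: each level of the preserved variable (e.g. genotype) is split in the same ratio, so the test partition mirrors the original distribution. This is an illustration of the idea, not the authors' implementation.

```python
import random
from collections import defaultdict

def proportional_split(labels, test_frac=0.2, seed=0):
    """Split instance indices so each label keeps the same train/test ratio."""
    rng = random.Random(seed)
    by_label = defaultdict(list)
    for idx, lab in enumerate(labels):
        by_label[lab].append(idx)
    train, test = [], []
    for lab in sorted(by_label):          # deterministic iteration order
        idxs = by_label[lab]
        rng.shuffle(idxs)                 # randomize within each stratum
        n_test = round(len(idxs) * test_frac)
        test.extend(idxs[:n_test])
        train.extend(idxs[n_test:])
    return train, test

# Imbalanced genotype distribution, as in an epistasis study
labels = ["AA"] * 80 + ["Aa"] * 15 + ["aa"] * 5
train, test = proportional_split(labels, test_frac=0.2)
```

    A purely random 20% split could easily draw zero "aa" individuals into the test partition; the proportional split guarantees each genotype is represented in its original ratio.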

  5. Biomechanical validation of an artificial tooth–periodontal ligament–bone complex for in vitro orthodontic load measurement

    PubMed Central

    Xia, Zeyang; Chen, Jie

    2014-01-01

    Objectives To develop an artificial tooth–periodontal ligament (PDL)–bone complex (ATPBC) that simulates clinical crown displacement. Material and Methods An ATPBC was created. It had a socket hosting a tooth, with a thin layer of silicone mixture in between to simulate the PDL. The complex was attached to a device that allows a controlled force to be applied to the crown and the resulting crown displacement to be measured. Crown displacements were compared with previously published data for validation. Results The ATPBC whose PDL was made of two silicones, 50% gasket sealant No. 2 and 50% RTV 587 silicone, with a thickness of 0.3 mm, simulated the PDL well. Its mechanical behaviors, (1) the force-displacement relationship, (2) stress relaxation, (3) creep, and (4) hysteresis, were validated against the published results. Conclusion The ATPBC simulated the crown displacement behavior reported in biological studies well. PMID:22970752

  6. Statistical detection of EEG synchrony using empirical Bayesian inference.

    PubMed

    Singh, Archana K; Asoh, Hideki; Takeda, Yuji; Phillips, Steven

    2015-01-01

    There is growing interest in understanding how the brain utilizes synchronized oscillatory activity to integrate information across functionally connected regions. Computing phase-locking values (PLV) between EEG signals is a popular method for quantifying such synchronizations and elucidating their role in cognitive tasks. However, the high dimensionality of PLV data incurs a serious multiple testing problem. Standard multiple testing methods in neuroimaging research (e.g., false discovery rate, FDR) suffer severe loss of power, because they fail to exploit the complex dependence structure between hypotheses that vary across the spectral, temporal and spatial dimensions. Previously, we showed that a hierarchical FDR and optimal discovery procedures could be effectively applied to PLV analysis to provide better power than FDR. In this article, we revisit the multiple comparison problem from a new Empirical Bayes perspective and propose the application of the local FDR method (locFDR; Efron, 2001) to PLV synchrony analysis, computing FDR as the posterior probability that an observed statistic belongs to the null hypothesis. We demonstrate the application of Efron's Empirical Bayes approach to PLV synchrony analysis for the first time. We use simulations to validate the specificity and sensitivity of locFDR, and a real EEG dataset from a visual search study for experimental validation. We also compare locFDR with hierarchical FDR and optimal discovery procedures in both simulation and experimental analyses. Our simulation results showed that locFDR can effectively control false positives without compromising the power of PLV synchrony inference. Applying locFDR to the experimental data detected more significant discoveries than our previously proposed methods, whereas the standard FDR method failed to detect any significant discoveries.
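    For reference, the statistic in question can be sketched directly: the PLV is the magnitude of the trial-averaged unit phasor of the phase difference between two signals, lying in [0, 1]. Phases would normally come from a wavelet or Hilbert transform; here they are supplied directly for illustration.

```python
import cmath

def plv(phases_a, phases_b):
    """PLV = |mean over trials of exp(i * phase difference)|, in [0, 1]."""
    n = len(phases_a)
    s = sum(cmath.exp(1j * (a - b)) for a, b in zip(phases_a, phases_b))
    return abs(s) / n

# A constant phase lag across trials gives perfect locking (PLV = 1),
# regardless of the lag itself.
a = [0.1 * k for k in range(100)]
b = [p - 0.7 for p in a]
locked = plv(a, b)
```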

  7. Development of a simple chromatographic method for the determination of piracetam in human plasma and its pharmacokinetic evaluation.

    PubMed

    Barkat, K; Ahmad, M; Minhas, M U; Malik, M Z; Sohail, M

    2014-07-01

    The objective of this study was to develop an accurate and reproducible HPLC method for the determination of piracetam in human plasma and to evaluate the pharmacokinetic parameters of 800 mg piracetam. A simple, rapid, accurate, precise and sensitive high-pressure liquid chromatography method was developed and subsequently validated for the determination of piracetam. This study presents the results of a randomized, single-dose, single-period study in 18 healthy male volunteers to assess the pharmacokinetic parameters of 800 mg piracetam tablets. Various pharmacokinetic parameters were determined from plasma for piracetam and found to be in good agreement with previously reported values. The data were analyzed using Kinetica® version 4.4 according to a non-compartmental model of pharmacokinetic analysis, and in comparison with previous studies, no significant differences were found for the tested product. The major pharmacokinetic parameters for piracetam were as follows: t1/2 was (4.40 ± 0.179) h; Tmax was (2.33 ± 0.105) h; Cmax was (14.53 ± 0.282) µg/mL; AUC(0-∞) was (59.19 ± 4.402) µg · h/mL; AUMC(0-∞) was (367.23 ± 38.96) µg · h(2)/mL; Ke was (0.16 ± 0.006) h(-1); MRT was (5.80 ± 0.227) h; Vd was (96.36 ± 8.917) L. A rapid, accurate and precise high-pressure liquid chromatography method was developed and validated before the study. It is concluded that this method is very useful for the analysis of pharmacokinetic parameters in human plasma, supports assessment of the safety and efficacy of piracetam, and can be used effectively in medical practice. © Georg Thieme Verlag KG Stuttgart · New York.
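    A minimal non-compartmental sketch of how parameters like these are derived (illustrative only, not the Kinetica workflow): Cmax/Tmax are read off the curve, AUC(0-t) comes from the trapezoidal rule, ke from a log-linear fit of the terminal points, and t1/2 = ln(2)/ke. The time/concentration values below are synthetic.

```python
import math

def nca(times, conc, n_terminal=3):
    """Basic non-compartmental parameters; times in h, conc in ug/mL."""
    cmax = max(conc)
    tmax = times[conc.index(cmax)]
    # linear trapezoidal AUC from first to last sample
    auc = sum((conc[i] + conc[i + 1]) / 2 * (times[i + 1] - times[i])
              for i in range(len(times) - 1))
    # log-linear regression on the last n_terminal points for ke
    xs = times[-n_terminal:]
    ys = [math.log(c) for c in conc[-n_terminal:]]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    ke = -sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
         sum((x - mx) ** 2 for x in xs)
    return {"Cmax": cmax, "Tmax": tmax, "AUC0-t": auc,
            "ke": ke, "t_half": math.log(2) / ke}

# Synthetic mono-exponential profile with ke = 0.16 h^-1
times = [0.5, 1, 2, 4, 6, 8, 12]
conc = [15.0 * math.exp(-0.16 * t) for t in times]
pk = nca(times, conc)
```

    On this exact mono-exponential input, the fitted ke recovers 0.16 h⁻¹ and t1/2 recovers ln(2)/0.16 ≈ 4.33 h, close to the reported half-life.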

  8. BMAA extraction of cyanobacteria samples: which method to choose?

    PubMed

    Lage, Sandra; Burian, Alfred; Rasmussen, Ulla; Costa, Pedro Reis; Annadotter, Heléne; Godhe, Anna; Rydberg, Sara

    2016-01-01

    β-N-Methylamino-L-alanine (BMAA), a neurotoxin reportedly produced by cyanobacteria, diatoms and dinoflagellates, is proposed to be linked to the development of neurological diseases. BMAA has been found in aquatic and terrestrial ecosystems worldwide, both in its phytoplankton producers and in several invertebrate and vertebrate organisms that bioaccumulate it. LC-MS/MS is the most frequently used analytical technique in BMAA research due to its high selectivity, though consensus is lacking as to the best extraction method to apply. This study accordingly surveys the efficiency of three extraction methods regularly used in BMAA research to extract BMAA from cyanobacteria samples. The results obtained provide insights into possible reasons for the BMAA concentration discrepancies in previous publications. In addition, in accordance with the method validation guidelines for analysing cyanotoxins, the TCA protein precipitation method, followed by AQC derivatization and LC-MS/MS analysis, is now validated for extracting protein-bound (after protein hydrolysis) and free BMAA from the cyanobacterial matrix. BMAA biological variability was also tested through the extraction of diatom and cyanobacteria species, revealing high variance in BMAA levels (0.0080-2.5797 μg g(-1) DW).

  9. A new validated HPTLC method for quantitative determination of 1,5-dicaffeoylquinic acid in Inula crithmoides roots.

    PubMed

    Aboul-Ela, Maha Ahmed; El-Lakany, Abdalla Mohamed; Shams Eldin, Safa Mohamed; Hammoda, Hala Mostafa

    2012-10-01

    1,5-Dicaffeoylquinic acid (1,5-DCQA), a potent HIV-1 integrase inhibitor, is currently undergoing evaluation as a promising novel HIV therapeutic agent. This work aims at developing an accurate, rapid, repeatable and robust HPTLC method for the determination of 1,5-DCQA in its natural sources. 1,5-DCQA is the major component of the n-butanol fraction, the most biologically active hepatoprotective fraction, of Inula crithmoides root extract. Thus, it is of interest to evaluate the plant roots as a potential source of 1,5-DCQA using a fully validated HPTLC method. The percentage of 1,5-DCQA in the studied plant (0.035% w/w) was found to be similar to those previously determined in other antioxidant herbal drugs in which 1,5-DCQA is the main phenolic constituent. The results obtained show that the described HPTLC method is suitable for routine use in the quality control of herbal raw materials, extracts and pharmaceutical preparations containing 1,5-DCQA. No HPTLC method has previously been reported in the literature for the determination of 1,5-DCQA in medicinal plants.

  10. Nonlinear modelling of high-speed catenary based on analytical expressions of cable and truss elements

    NASA Astrophysics Data System (ADS)

    Song, Yang; Liu, Zhigang; Wang, Hongrui; Lu, Xiaobing; Zhang, Jing

    2015-10-01

    Due to the intrinsic nonlinear characteristics and complex structure of the high-speed catenary system, a modelling method is proposed based on the analytical expressions of nonlinear cable and truss elements. A calculation procedure for solving the initial equilibrium state is proposed based on the Newton-Raphson iteration method. The deformed configuration of the catenary system, as well as the initial length of each wire, can be calculated. The accuracy and validity of the computed initial equilibrium state are verified by comparison with the separate model method, the absolute nodal coordinate formulation and other methods in the previous literature. The proposed model is then combined with a lumped pantograph model and a dynamic simulation procedure is proposed, with accuracy guaranteed by multiple iterative calculations in each time step. The dynamic performance of the proposed model is validated by comparison with EN 50318, the results of finite element method software and a SIEMENS simulation report, respectively. Finally, the influence of the catenary design parameters (such as the reserved sag and pre-tension) on the dynamic performance is preliminarily analysed using the proposed model.
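    As a generic illustration of the Newton-Raphson iteration used for such equilibrium problems (a toy scalar case, not the authors' full cable/truss model): solve the classic catenary relation sag = a·(cosh(span/2a) − 1) for the catenary parameter a.

```python
import math

def catenary_parameter(span, sag, a0=100.0, tol=1e-10, max_iter=50):
    """Newton-Raphson solve of f(a) = a*(cosh(span/(2a)) - 1) - sag = 0."""
    a = a0
    for _ in range(max_iter):
        x = span / (2 * a)
        f = a * (math.cosh(x) - 1) - sag
        # df/da, by the chain rule on x = span/(2a)
        df = (math.cosh(x) - 1) - x * math.sinh(x)
        step = f / df
        a -= step
        if abs(step) < tol:
            break
    return a

a = catenary_parameter(span=60.0, sag=1.2)   # e.g. a 60 m span with 1.2 m sag
residual = a * (math.cosh(60.0 / (2 * a)) - 1) - 1.2
```

    The converged value agrees with the small-sag parabolic estimate a ≈ span²/(8·sag) ≈ 375 m, a quick sanity check on the iteration.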

  11. Leaner and greener analysis of cannabinoids.

    PubMed

    Mudge, Elizabeth M; Murch, Susan J; Brown, Paula N

    2017-05-01

    There is an explosion in the number of labs analyzing cannabinoids in marijuana (Cannabis sativa L., Cannabaceae), but existing methods are inefficient, require expert analysts, and use large volumes of potentially environmentally damaging solvents. The objective of this work was to develop and validate an accurate method for analyzing cannabinoids in cannabis raw materials and finished products that is more efficient and uses fewer toxic solvents. An HPLC-DAD method was developed for eight cannabinoids in cannabis flowers and oils using a statistically guided optimization plan based on the principles of green chemistry. A single-laboratory validation determined the linearity, selectivity, accuracy, repeatability, intermediate precision, limit of detection, and limit of quantitation of the method. Amounts of individual cannabinoids above the limit of quantitation in the flowers ranged from 0.02 to 14.9% w/w, with repeatability ranging from 0.78 to 10.08% relative standard deviation. The intermediate precision determined using HorRat ratios ranged from 0.3 to 2.0. The LOQs for individual cannabinoids in flowers ranged from 0.02 to 0.17% w/w. This is a significant improvement over previous methods and is suitable for a wide range of applications including regulatory compliance, clinical studies, direct patient medical services, and commercial suppliers.

  12. Closed loop statistical performance analysis of N-K knock controllers

    NASA Astrophysics Data System (ADS)

    Peyton Jones, James C.; Shayestehmanesh, Saeed; Frey, Jesse

    2017-09-01

    The closed loop performance of engine knock controllers cannot be rigorously assessed from single experiments or simulations, because knock behaves as a random process and the response therefore also belongs to a random distribution. In this work a new method is proposed for computing the distributions and expected values of the closed loop response, both in steady state and in response to disturbances. The method takes as its input the control law and the knock propensity characteristic of the engine, which is mapped from open loop steady state tests. The method is applicable to the 'n-k' class of knock controllers, in which the control action is a function only of the number of cycles n since the last control move and the number k of knock events that have occurred in this time. A Cumulative Summation (CumSum) based controller falls within this category, and the method is used to investigate the performance of this controller in a deeper and more rigorous way than has previously been possible. The results are validated using extensive Monte Carlo simulations, which confirm both the validity of the method and its high computational efficiency.
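    A toy Monte Carlo check in the spirit of the paper's validation step (all numbers invented for illustration, and the controller is a simple n-k style rule, not the CumSum law): knock is a Bernoulli event whose probability rises with spark advance (the mapped knock propensity characteristic), the controller retards on every knock and advances after N knock-free cycles, and repeated runs yield a distribution of the closed loop response rather than a single answer.

```python
import random

def simulate(runs=200, cycles=2000, n_adv=20, step=0.5, seed=1):
    """Monte Carlo distribution of final spark advance under an n-k rule."""
    rng = random.Random(seed)
    final_advance = []
    for _ in range(runs):
        adv, quiet = 10.0, 0
        for _ in range(cycles):
            # invented knock propensity map: probability grows with advance
            p_knock = min(0.9, max(0.0, 0.02 * (adv - 5.0)))
            if rng.random() < p_knock:   # knock event: retard immediately
                adv -= step
                quiet = 0
            else:                        # knock-free: advance after n_adv cycles
                quiet += 1
                if quiet >= n_adv:
                    adv += step
                    quiet = 0
        final_advance.append(adv)
    return final_advance

dist = simulate()
mean_adv = sum(dist) / len(dist)
```

    The point of the paper is that such brute-force ensembles are expensive; its analytical method reproduces the same distributions directly from the control law and the propensity map.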

  13. Population-based validation of a German version of the Brief Resilience Scale

    PubMed Central

    Wenzel, Mario; Stieglitz, Rolf-Dieter; Kunzler, Angela; Bagusat, Christiana; Helmreich, Isabella; Gerlicher, Anna; Kampa, Miriam; Kubiak, Thomas; Kalisch, Raffael; Lieb, Klaus; Tüscher, Oliver

    2018-01-01

    Smith and colleagues developed the Brief Resilience Scale (BRS) to assess the individual ability to recover from stress despite significant adversity. This study aimed to validate the German version of the BRS. We used data from a population-based (sample 1: n = 1,481) and a representative (sample 2: n = 1,128) sample of participants from the German general population (age ≥ 18) to assess reliability and validity. Confirmatory factor analyses (CFA) were conducted to compare one- and two-factorial models from previous studies with a method-factor model that specifically accounts for the wording of the items. Reliability was analyzed, and convergent validity was measured by correlating BRS scores with mental health measures, coping, social support, and optimism. Reliability was good (α = .85, ω = .85 for both samples). The method-factor model showed excellent model fit (sample 1: χ2/df = 7.544; RMSEA = .07; CFI = .99; SRMR = .02; sample 2: χ2/df = 1.166; RMSEA = .01; CFI = 1.00; SRMR = .01), which was significantly better than the one-factor model (Δχ2(4) = 172.71, p < .001) or the two-factor model (Δχ2(3) = 31.16, p < .001). The BRS was positively correlated with well-being, social support, optimism, and the coping strategies active coping, positive reframing, acceptance, and humor. It was negatively correlated with somatic symptoms, anxiety and insomnia, social dysfunction, depression, and the coping strategies religion, denial, venting, substance use, and self-blame. To conclude, our results provide evidence for the reliability and validity of the German adaptation of the BRS, as well as the unidimensional structure of the scale once method effects are accounted for. PMID:29438435
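    The reliability coefficient α reported above is computed with the standard Cronbach formula, α = k/(k−1) · (1 − Σ item variances / variance of total score). A self-contained sketch with invented item data:

```python
def cronbach_alpha(items):
    """items: list of per-item score lists, all of equal length."""
    k = len(items)
    n = len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

# Six perfectly parallel items -> alpha = 1 by construction.
base = [3, 4, 2, 5, 4, 3, 1, 5, 2, 4]
alpha = cronbach_alpha([base[:] for _ in range(6)])
```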

  14. Prediction and analysis of beta-turns in proteins by support vector machine.

    PubMed

    Pham, Tho Hoan; Satou, Kenji; Ho, Tu Bao

    2003-01-01

    The tight turn has long been recognized as the third important feature of proteins, after the alpha-helix and beta-sheet. Tight turns play an important role in globular proteins from both structural and functional points of view. More than 90% of tight turns are beta-turns. Analysis and prediction of beta-turns in particular, and tight turns in general, are very useful for the design of new molecules such as drugs, pesticides, and antigens. In this paper, we introduce a support vector machine (SVM) approach to the prediction and analysis of beta-turns. We have investigated two aspects of applying SVMs to the prediction and analysis of beta-turns. First, we developed a new SVM method, called BTSVM, which predicts the beta-turns of a protein from its sequence. The prediction results on a dataset of 426 non-homologous protein chains, using a sevenfold cross-validation technique, showed that our method is superior to previous methods. Second, we analyzed how amino acid positions support (or prevent) the formation of beta-turns based on the "multivariable" classification model of a linear SVM. This model is more general than those of previous statistical methods. Our analysis results are more comprehensive and easier to use than previously published analysis results.
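    The sevenfold cross-validation protocol used to evaluate the predictor can be sketched as follows: the data are split into 7 folds, each fold serves once as the held-out test set, and the accuracies are averaged. A trivial majority-class stand-in replaces the actual SVM so the example stays self-contained; the labels are invented.

```python
import random

def sevenfold_accuracy(y, k=7, seed=0):
    """k-fold cross-validated accuracy of a majority-class stand-in model."""
    idx = list(range(len(y)))
    random.Random(seed).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    accs = []
    for f in range(k):
        test = folds[f]
        train = [i for g in range(k) if g != f for i in folds[g]]
        # stand-in "model": predict the majority label of the training split
        majority = max(set(y[i] for i in train),
                       key=lambda lab: sum(y[i] == lab for i in train))
        accs.append(sum(y[i] == majority for i in test) / len(test))
    return sum(accs) / k

y = [1] * 70 + [0] * 30    # invented 70/30 turn vs non-turn labels
acc = sevenfold_accuracy(y)
```

    With the real BTSVM model in place of the stand-in, this protocol yields an unbiased accuracy estimate because every chain is tested exactly once on a model that never saw it.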

  15. Uncertainty propagation for statistical impact prediction of space debris

    NASA Astrophysics Data System (ADS)

    Hoogendoorn, R.; Mooij, E.; Geul, J.

    2018-01-01

    Predictions of the impact time and location of space debris in a decaying trajectory are highly influenced by uncertainties. The traditional Monte Carlo (MC) method can be used to perform accurate statistical impact predictions, but requires a large computational effort. A method is investigated that directly propagates a Probability Density Function (PDF) in time, which has the potential to obtain more accurate results with less computational effort. The decaying trajectory of Delta-K rocket stages was used to test the methods using a six degrees-of-freedom state model. The PDF of the state of the body was propagated in time to obtain impact-time distributions. This Direct PDF Propagation (DPP) method results in a multi-dimensional scattered dataset of the PDF of the state, which is highly challenging to process. No accurate results could be obtained, because of the structure of the DPP data and the high dimensionality. Therefore, the DPP method is less suitable for practical uncontrolled entry problems and the traditional MC method remains superior. Additionally, the MC method was used with two improved uncertainty models to obtain impact-time distributions, which were validated using observations of true impacts. For one of the two uncertainty models, statistically more valid impact-time distributions were obtained than in previous research.
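    A minimal Monte Carlo uncertainty-propagation sketch in the spirit of the MC approach the paper retains (not its 6-DOF model; the toy relation and all figures are invented): an uncertain parameter is sampled repeatedly, each sample is pushed through the model, and the result is an impact-time distribution rather than a single number.

```python
import random
import statistics

def impact_time(ballistic_coeff, t_ref=600.0, b_ref=100.0):
    """Invented toy model: a higher ballistic coefficient decays more slowly."""
    return t_ref * (ballistic_coeff / b_ref) ** 0.5

rng = random.Random(42)
# Sample the uncertain coefficient (clamped to stay physical) and propagate.
samples = [impact_time(max(1.0, rng.gauss(100.0, 10.0))) for _ in range(5000)]
t_mean = statistics.mean(samples)
t_sd = statistics.stdev(samples)
```

    The cost the paper highlights is visible even here: the model must be evaluated once per sample, which is why a direct PDF propagation was attractive in principle despite proving impractical for the high-dimensional entry problem.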

  16. Standardized assessment of psychosocial factors and their influence on medically confirmed health outcomes in workers: a systematic review.

    PubMed

    Rosário, Susel; Fonseca, João A; Nienhaus, Albert; da Costa, José Torres

    2016-01-01

    Previous studies of psychosocial work factors have indicated their importance for workers' health. However, to what extent health problems can be attributed to the nature of the work environment or other psychosocial factors is not clear. No previous systematic review has used inclusion criteria based on specific medical evaluation of work-related health outcomes and the use of validated instruments for the assessment of the psychosocial (work) environment. The aim of this systematic review is to summarize the evidence on the relationship between the psychosocial work environment and workers' health, based on studies that used standardized and validated instruments to assess the psychosocial work environment and that focused on medically confirmed health outcomes. A systematic review of the literature was carried out by searching the databases PubMed, B-ON, Science Direct, PsycARTICLES, and Psychology and Behavioral Sciences Collection, and the search engine Google Scholar, using appropriate search terms for studies published from 2004 to 2014. This review follows the recommendations of the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement. Studies were included in the review if they presented data on validated psychosocial assessment method(s) for the study population and specific medical evaluation of work-related health outcome(s). In total, the search strategy yielded 10,623 references, of which 10 studies (seven prospective cohort and three cross-sectional) met the inclusion criteria. Most studies (7/10) observed an adverse effect of poor psychosocial work factors on workers' health: 3 on sickness absence and 4 on cardiovascular diseases. The other 3 studies reported detrimental effects on sleep and on disease-associated biomarkers. 
    A more consistent effect was observed in studies of higher methodological quality that used a prospective design together with validated instruments for the assessment of the psychosocial (work) environment and clinical evaluation. More prospective studies are needed to strengthen the evidence on the effects of work-related psychosocial factors on workers' health.

  17. Evaluation of the Irritable Bowel Syndrome Quality of Life (IBS-QOL) questionnaire in diarrheal-predominant irritable bowel syndrome patients

    PubMed Central

    2013-01-01

    Background Diarrhea-predominant irritable bowel syndrome (IBS-d) significantly diminishes the health-related quality of life (HRQOL) of patients. Psychological and social impacts are common with many IBS-d patients reporting comorbid depression, anxiety, decreased intimacy, and lost working days. The Irritable Bowel Syndrome Quality of Life (IBS-QOL) questionnaire is a 34-item instrument developed and validated for measurement of HRQOL in non-subtyped IBS patients. The current paper assesses this previously-validated instrument employing data collected from 754 patients who participated in a randomized clinical trial of a novel treatment, eluxadoline, for IBS-d. Methods Psychometric methods common to HRQOL research were employed to evaluate the IBS-QOL. Many of the historical analyses of the IBS-QOL validations were used. Other techniques that extended the original methods were applied where more appropriate for the current dataset. In IBS-d patients, we analyzed the items and substructure of the IBS-QOL via item reduction, factor structure, internal consistency, reproducibility, construct validity, and ability to detect change. Results This study supports the IBS-QOL as a psychometrically valid measure. Factor analyses suggested that IBS-specific QOL as measured by the IBS-QOL is a unidimensional construct. Construct validity was further buttressed by significant correlations between IBS-QOL total scores and related measures of IBS-d severity including the historically-relevant Irritable Bowel Syndrome Adequate Relief (IBS-AR) item and the FDA’s Clinical Responder definition. The IBS-QOL also showed a significant ability to detect change as evidenced by analysis of treatment effects. A minority of the items, unrelated to the IBS-d, performed less well by the standards set by the original authors. Conclusions We established that the IBS-QOL total score is a psychometrically valid measure of HRQOL in IBS-d patients enrolled in this study. 
    Our analyses suggest that the IBS-QOL items demonstrate very good construct validity and the ability to detect changes due to treatment effects. Furthermore, our analyses suggest that the IBS-QOL items measure a univariate construct, and we believe that further modeling of the IBS-QOL using an item response theory (IRT) approach, under both non-treatment and treatment conditions, would greatly further our understanding, as item-based methods could be used to develop a short form. PMID:24330412

  18. Testing survey-based methods for rapid monitoring of child mortality, with implications for summary birth history data.

    PubMed

    Brady, Eoghan; Hill, Kenneth

    2017-01-01

    Under-five mortality estimates are increasingly used in low- and middle-income countries to target interventions and measure performance against global development goals. Two new methods to rapidly estimate under-5 mortality based on Summary Birth Histories (SBH) were described in a previous paper and tested with the data then available. This analysis tests the methods using data appropriate to each method from 5 countries that lack vital registration systems. SBH data are collected across many countries through censuses and surveys, and indirect methods often rely upon their quality to estimate mortality rates. The Birth History Imputation method imputes data from a recent Full Birth History (FBH) onto the birth, death and age distribution of the SBH to produce estimates based on the resulting distribution of child mortality. DHS FBHs and MICS SBHs are used for all five countries. In the implementation, 43 of 70 estimates are within 20% of validation estimates (61%). Mean Absolute Relative Error is 17.7%. 1 of 7 countries produces acceptable estimates. The Cohort Change method considers the differences in births and deaths between repeated Summary Birth Histories at 1- or 2-year intervals to estimate the mortality rate in that period. SBHs are taken from Brazil's PNAD surveys 2004-2011 and validated against IGME estimates. 2 of 10 estimates are within 10% of validation estimates. Mean Absolute Relative Error is greater than 100%. Appropriate testing of these new methods demonstrates that they do not produce sufficiently good estimates from the data available. We conclude that this is due to the poor quality of most SBH data included in the study. This has wider implications for the next round of censuses and future household surveys across many low- and middle-income countries.
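    The validation metrics used above, sketched explicitly with invented numbers: the mean absolute relative error between method estimates and the validation series, and the share of estimates falling within a tolerance band.

```python
def mare(estimates, reference):
    """Mean absolute relative error of estimates against reference values."""
    errors = [abs(e - r) / r for e, r in zip(estimates, reference)]
    return sum(errors) / len(errors)

def within_band(estimates, reference, band=0.20):
    """Count of estimates within +/- band of the reference value."""
    return sum(abs(e - r) / r <= band
               for e, r in zip(estimates, reference))

est = [55.0, 48.0, 90.0, 31.0]   # invented under-5 mortality estimates
ref = [50.0, 52.0, 70.0, 30.0]   # invented validation values
```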

  19. fMRI capture of auditory hallucinations: Validation of the two-steps method.

    PubMed

    Leroy, Arnaud; Foucher, Jack R; Pins, Delphine; Delmaire, Christine; Thomas, Pierre; Roser, Mathilde M; Lefebvre, Stéphanie; Amad, Ali; Fovet, Thomas; Jaafari, Nemat; Jardri, Renaud

    2017-10-01

    Our purpose was to validate a reliable method to capture brain activity concomitant with hallucinatory events, which constitute frequent and disabling experiences in schizophrenia. Capturing hallucinations using functional magnetic resonance imaging (fMRI) remains very challenging. We previously developed a method based on a two-steps strategy including (1) multivariate data-driven analysis of per-hallucinatory fMRI recording and (2) selection of the components of interest based on a post-fMRI interview. However, two tests still need to be conducted to rule out critical pitfalls of conventional fMRI capture methods before this two-steps strategy can be adopted in hallucination research: replication of these findings on an independent sample and assessment of the reliability of the hallucination-related patterns at the subject level. To do so, we recruited a sample of 45 schizophrenia patients suffering from frequent hallucinations, 20 schizophrenia patients without hallucinations and 20 matched healthy volunteers; all participants underwent four different experiments. The main findings are (1) high accuracy in reporting unexpected sensory stimuli in an MRI setting; (2) good detection concordance between hypothesis-driven and data-driven analysis methods (as used in the two-steps strategy) when controlled unexpected sensory stimuli are presented; (3) good agreement of the two-steps method with the online button-press approach to capture hallucinatory events; (4) high spatial consistency of hallucinatory-related networks detected using the two-steps method on two independent samples. By validating the two-steps method, we advance toward the possible transfer of such technology to new image-based therapies for hallucinations. Hum Brain Mapp 38:4966-4979, 2017. © 2017 Wiley Periodicals, Inc.

  20. Method validation for control determination of mercury in fresh fish and shrimp samples by solid sampling thermal decomposition/amalgamation atomic absorption spectrometry.

    PubMed

    Torres, Daiane Placido; Martins-Teixeira, Maristela Braga; Cadore, Solange; Queiroz, Helena Müller

    2015-01-01

    A method for the determination of total mercury in fresh fish and shrimp samples by solid sampling thermal decomposition/amalgamation atomic absorption spectrometry (TDA AAS) has been validated following international foodstuff protocols in order to fulfill the Brazilian National Residue Control Plan. The experimental parameters had been previously studied and optimized according to specific legislation on validation and on inorganic contaminants in foodstuff. Linearity, sensitivity, specificity, detection and quantification limits, precision (repeatability and within-laboratory reproducibility), robustness and accuracy of the method were evaluated. Linearity of response was satisfactory for the two concentration ranges available on the TDA AAS equipment, between approximately 25.0 and 200.0 μg kg(-1) (quadratic regression) and between 250.0 and 2000.0 μg kg(-1) (linear regression) of mercury. The residuals for both ranges were homoscedastic and independent, with normal distribution. Correlation coefficients obtained for these ranges were higher than 0.995. The limit of quantification (LOQ) and the limit of detection of the method (LDM), based on the signal standard deviation (SD) for a low-mercury sample, were 3.0 and 1.0 μg kg(-1), respectively. Repeatability of the method was better than 4%. Within-laboratory reproducibility achieved a relative SD better than 6%. Robustness of the method was evaluated and pointed to sample mass as a significant factor. Accuracy (assessed as analyte recovery) was calculated on the basis of the repeatability and ranged from 89% to 99%. The results obtained showed the suitability of the present method for direct mercury measurement in fresh fish and shrimp samples and the importance of monitoring the analysis conditions for food control purposes. Additionally, the competence of this method was recognized by accreditation under the ISO/IEC 17025 standard.
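    The abstract states that the limits come from the signal SD of a low-mercury sample; the 3×/10× factors below are the common ICH-style convention, used here only to illustrate the arithmetic (the blank SD and calibration slope are invented, chosen to land near the reported 1.0 and 3.0 μg kg⁻¹).

```python
def detection_limits(blank_sd, slope):
    """Return (LOD, LOQ) in concentration units for a calibration slope.

    LOD = 3 * SD / slope and LOQ = 10 * SD / slope, the usual
    signal-SD convention (not necessarily the exact factors used here).
    """
    lod = 3 * blank_sd / slope
    loq = 10 * blank_sd / slope
    return lod, loq

# Invented figures: SD of 0.015 signal units, slope of 0.05 units per ug/kg
lod, loq = detection_limits(blank_sd=0.015, slope=0.05)
```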

  1. NetMHCpan, a Method for Quantitative Predictions of Peptide Binding to Any HLA-A and -B Locus Protein of Known Sequence

    PubMed Central

    Nielsen, Morten; Lundegaard, Claus; Blicher, Thomas; Lamberth, Kasper; Harndahl, Mikkel; Justesen, Sune; Røder, Gustav; Peters, Bjoern; Sette, Alessandro; Lund, Ole; Buus, Søren

    2007-01-01

    Background Binding of peptides to Major Histocompatibility Complex (MHC) molecules is the single most selective step in the recognition of pathogens by the cellular immune system. The human MHC class I system (HLA-I) is extremely polymorphic. The number of registered HLA-I molecules has now surpassed 1500. Characterizing the specificity of each separately would be a major undertaking. Principal Findings Here, we have drawn on a large database of known peptide-HLA-I interactions to develop a bioinformatics method, which takes both peptide and HLA sequence information into account, and generates quantitative predictions of the affinity of any peptide-HLA-I interaction. Prospective experimental validation of peptides predicted to bind to previously untested HLA-I molecules, cross-validation, and retrospective prediction of known HIV immune epitopes and endogenous presented peptides, all successfully validate this method. We further demonstrate that the method can be applied to perform a clustering analysis of MHC specificities and suggest using this clustering to select particularly informative novel MHC molecules for future biochemical and functional analysis. Conclusions Encompassing all HLA molecules, this high-throughput computational method lends itself to epitope searches that are not only genome- and pathogen-wide, but also HLA-wide. Thus, it offers a truly global analysis of immune responses supporting rational development of vaccines and immunotherapy. It also promises to provide new basic insights into HLA structure-function relationships. The method is available at http://www.cbs.dtu.dk/services/NetMHCpan. PMID:17726526

  2. Zero-G experimental validation of a robotics-based inertia identification algorithm

    NASA Astrophysics Data System (ADS)

    Bruggemann, Jeremy J.; Ferrel, Ivann; Martinez, Gerardo; Xie, Pu; Ma, Ou

    2010-04-01

    The need to efficiently identify the changing inertial properties of on-orbit spacecraft is becoming more critical as satellite on-orbit services, such as refueling and repairing, become increasingly aggressive and complex. This need stems from the fact that a spacecraft's control system relies on the knowledge of the spacecraft's inertia parameters. However, the inertia parameters may change during flight for reasons such as fuel usage, payload deployment or retrieval, and docking/capturing operations. New Mexico State University's Dynamics, Controls, and Robotics Research Group has proposed a robotics-based method of identifying unknown spacecraft inertia properties. Previous methods require firing known thrusts and then measuring the thrust and the resulting velocity and acceleration changes. The new method utilizes the concept of momentum conservation, while employing a robotic device powered by renewable energy to excite the state of the satellite. Thus, it requires no fuel usage or force and acceleration measurements. The method has been well studied in theory and demonstrated by simulation. However, its experimental validation is challenging because a 6-degree-of-freedom motion in a zero-gravity condition is required. This paper presents an on-going effort to test the inertia identification method onboard the NASA zero-G aircraft. The design and capability of the test unit are discussed in addition to the flight data. This paper also introduces the design and development of an air-bearing-based test used to partially validate the method, as well as the approach used to obtain reference values for the test system's inertia parameters for comparison with the algorithm results.
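    The momentum-conservation idea can be pictured in a planar toy case. The numbers below are hypothetical, and the actual method solves a 3-D least-squares system for the full set of inertia parameters; this sketch only shows why no force or acceleration measurement is needed.

```python
def identify_base_inertia(I_arm, omega_arm, omega_base):
    """Planar toy version of momentum-conservation identification.
    A free-floating system that starts at rest keeps zero total angular
    momentum while the arm moves:
        I_base * omega_base + I_arm * omega_arm = 0
    so the unknown base inertia follows from rate measurements alone."""
    return -I_arm * omega_arm / omega_base

# Hypothetical numbers: an arm of inertia 2.0 kg*m^2 spins at 0.9 rad/s
# while the base counter-rotates at -0.12 rad/s.
print(round(identify_base_inertia(2.0, 0.9, -0.12), 3))  # 15.0
```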

  3. Wrestlers' minimal weight: anthropometry, bioimpedance, and hydrostatic weighing compared.

    PubMed

    Oppliger, R A; Nielsen, D H; Vance, C G

    1991-02-01

    The need for accurate assessment of minimal wrestling weight among interscholastic wrestlers has been well documented. Previous research has demonstrated the validity of anthropometric methods for this purpose, but little research has examined the validity of bioelectrical impedance (BIA) measurements, and comparisons between BIA systems have received limited attention. With these two objectives, we compared the prediction of minimal weight (MW) among 57 interscholastic wrestlers using three anthropometric methods (skinfolds (SF) and two skeletal dimensions equations) and three BIA systems (Berkeley Medical Research (BMR), RJL, and Valhalla (VAL)). All methods showed high correlations (r values greater than 0.92) with hydrostatic weighing (HW) and between methods (r values greater than 0.90). The standard errors of estimate (SEE) were relatively small for all methods, especially for SF and the three BIA systems (SEE less than 0.70 kg). The total errors of prediction (E) for RJL and VAL (E = 4.4 and 3.9 kg) were significantly larger than the nonsignificant BMR and SF values (E = 2.3 and 1.8 kg, respectively). Significant mean differences were observed between HW and RJL, VAL, and the two skeletal dimensions equations, but nonsignificant differences were observed between HW, BMR, and SF. BMR differed significantly from the RJL and VAL systems. The results suggest that RJL and VAL have potential application for this subpopulation. Prediction equation refinement with the addition of selected anthropometric measurements or moderating variables may enhance their utility. However, within the scope of our study, SF and BMR BIA appear to be the most valid methods for determining MW in interscholastic wrestlers.
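    The agreement statistics used here are simple to compute. A sketch of the Pearson correlation and the total error of prediction E (root mean squared difference, which penalizes bias as well as scatter); the weights below are illustrative, not the study's data.

```python
import math

def pearson_r(x, y):
    """Pearson correlation between predicted and criterion values."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = math.sqrt(sum((v - mx) ** 2 for v in x))
    sy = math.sqrt(sum((v - my) ** 2 for v in y))
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (sx * sy)

def total_error(pred, crit):
    """Total error of prediction E = sqrt(mean squared difference)."""
    return math.sqrt(sum((p - c) ** 2 for p, c in zip(pred, crit)) / len(pred))

pred = [60.1, 65.4, 70.2, 58.8]   # minimal weight by a field method (kg)
crit = [59.5, 66.0, 69.8, 59.2]   # criterion from hydrostatic weighing (kg)
print(round(pearson_r(pred, crit), 3), round(total_error(pred, crit), 3))
```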

  4. Smoking-related deaths averted due to three years of policy progress

    PubMed Central

    Ellis, Jennifer A; Mays, Darren; Huang, An-Tsun

    2013-01-01

    Objective To evaluate the global impact of adopting highest-level MPOWER tobacco control policies in different countries and territories from 2007 to 2010. Methods Policy effect sizes based on previously validated SimSmoke models were applied to determine the reduction in the number of smokers as a result of policy adoption during this period. Based on previous research suggesting that half of all smokers die from smoking, we also derived the estimated smoking-attributable deaths (SADs) averted due to MPOWER policy implementation. The results from use of this simple yet powerful method are consistent with those predicted by previously validated SimSmoke models. Findings In total, 41 countries adopted at least one highest-level MPOWER policy between 2007 and 2010. As a result of all policies adopted during this period, the number of smokers is estimated to have dropped by 14.8 million, with a total of 7.4 million SADs averted. The largest numbers of SADs were averted as a result of increased cigarette taxes (3.5 million), smoke-free air laws (2.5 million), health warnings (700 000), cessation treatments (380 000), and bans on tobacco marketing (306 000). Conclusion From 2007 to 2010, 41 countries and territories took action that will collectively prevent nearly 7.5 million smoking-related deaths globally. These findings demonstrate the magnitude of the actions already taken by countries and underscore the potential for millions of additional lives to be saved with continued adoption of MPOWER policies. PMID:23825878
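    The back-of-envelope calculation behind the headline number is direct: halve the estimated reduction in the number of smokers.

```python
def sads_averted(smokers_reduced, fatal_fraction=0.5):
    """Smoking-attributable deaths averted, under the paper's premise
    that half of all smokers eventually die from smoking."""
    return smokers_reduced * fatal_fraction

print(sads_averted(14.8e6))  # 7400000.0 -> the reported 7.4 million
```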

  5. Cross-Cultural Adaptation and Psychometric Testing of the Brazilian Version of the Self-Care of Heart Failure Index Version 6.2

    PubMed Central

    Ávila, Christiane Wahast; Riegel, Barbara; Pokorski, Simoni Chiarelli; Camey, Suzi; Silveira, Luana Claudia Jacoby; Rabelo-Silva, Eneida Rejane

    2013-01-01

    Objective. To adapt and evaluate the psychometric properties of the Brazilian version of the SCHFI v 6.2. Methods. With the approval of the original author, we conducted a complete cross-cultural adaptation of the instrument (translation, synthesis, back translation, synthesis of back translation, expert committee review, and pretesting). The adapted version was named Brazilian version of the self-care of heart failure index v 6.2. The psychometric properties assessed were face validity and content validity (by expert committee review), construct validity (convergent validity and confirmatory factor analysis), and reliability. Results. Face validity and content validity were indicative of semantic, idiomatic, experiential, and conceptual equivalence. Convergent validity was demonstrated by a significant though moderate correlation (r = −0.51) on comparison with equivalent question scores of the previously validated Brazilian European heart failure self-care behavior scale. Confirmatory factor analysis supported the original three-factor model as having the best fit, although similar results were obtained for inadequate fit indices. The reliability of the instrument, as expressed by Cronbach's alpha, was 0.40, 0.82, and 0.93 for the self-care maintenance, self-care management, and self-care confidence scales, respectively. Conclusion. The SCHFI v 6.2 was successfully adapted for use in Brazil. Nevertheless, further studies should be carried out to improve its psychometric properties. PMID:24163765
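    Cronbach's alpha, the reliability statistic reported above, is straightforward to compute from item-level scores; a minimal sketch with illustrative data, not the study's.

```python
def cronbach_alpha(items):
    """items: one list of scores per scale item, respondents in the same order.
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)"""
    k, n = len(items), len(items[0])

    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

# Two perfectly consistent items give the maximum alpha of 1.0.
print(cronbach_alpha([[1, 2, 3], [1, 2, 3]]))  # 1.0
```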

  6. A strategy for extending the applicability of a validated plasma calibration curve to quantitative measurements in multiple tissue homogenate samples: a case study from a rat tissue distribution study of JI-101, a triple kinase inhibitor.

    PubMed

    Gurav, Sandip Dhondiram; Jeniffer, Sherine; Punde, Ravindra; Gilibili, Ravindranath Reddy; Giri, Sanjeev; Srinivas, Nuggehally R; Mullangi, Ramesh

    2012-04-01

    A general practice in bioanalysis is that, whatever the biological matrix the analyte is being quantified in, the validation is performed in the same matrix as per regulatory guidelines. In this paper, we are presenting the applicability of a validated LC-MS/MS method in rat plasma for JI-101, to estimate the concentrations of JI-101 in various tissues that were harvested in a rat tissue distribution study. A simple protein precipitation technique was used to extract JI-101 and internal standard from the tissue homogenates. The recovery of JI-101 in all the matrices was found to be >70%. Chromatographic separation was achieved with a binary gradient of mobile phase A (acetonitrile) and mobile phase B (0.2% formic acid in water) at a flow rate of 0.30 mL/min on a Prodigy ODS column with a total run time of 4.0 min. The MS/MS ion transitions monitored were 466.1 → 265 for JI-101 and 180.1 → 110.1 for internal standard. The linearity range was 5.02-4017 ng/mL. The JI-101 levels were quantifiable in the various tissue samples harvested in this study. Therefore, the use of a previously validated JI-101 assay in plasma circumvented the tedious process of method development/validation in various tissue matrices. Copyright © 2011 John Wiley & Sons, Ltd.
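    Once a calibration curve is validated in plasma, tissue-homogenate concentrations are read off the same linear fit. A sketch of the back-calculation with hypothetical fit parameters (not the study's values).

```python
def backcalc(response, slope, intercept):
    """Back-calculate analyte concentration (ng/mL) from a linear
    calibration fit: response = slope * conc + intercept.
    Slope and intercept here are hypothetical."""
    return (response - intercept) / slope

print(backcalc(5100.0, slope=10.0, intercept=100.0))  # 500.0 ng/mL
```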

  7. Validation of a new formula for predicting body weight in a Mexican population with overweight and obesity.

    PubMed

    Quiroz-Olguín, Gabriela; Serralde-Zúñiga, Aurora Elizabeth; Saldaña-Morales, Vianey; Guevara-Cruz, Martha

    2013-01-01

    Body weight measurement is of critical importance when evaluating the nutritional status of patients entering a hospital. In some situations, such as the case of patients who are bedridden or in wheelchairs, these measurements cannot be obtained using standardized methods. We have designed and validated a formula for predicting body weight. To design and validate a formula for predicting body weight using circumference-based equations. The following anthropometric measurements were taken for a sample of 76 patients: weight (kg), calf circumference, average arm circumference, waist circumference, hip circumference, wrist circumference and demispan. All circumferences were taken in centimetres (cm), and gender and age were taken into account. This equation was validated in 85 individuals from a different population. The correlation with the new equation was analyzed and compared to a previously validated method. The equation for weight prediction was the following: Weight = 0.524 (WC) - 0.176 (age) + 0.484 (HC) + 0.613 (DS) + 0.704 (CC) + 2.75 (WrC) - 3.330 (if female) - 140.87. The correlation coefficient was 0.96 for the total group of patients, 0.971 for men and 0.961 for women (p < 0.0001 for all measurements). The equation we developed is accurate and can be used to estimate body weight in overweight and/or obese patients with mobility problems, such as bedridden patients or patients in wheelchairs. Copyright © AULA MEDICA EDICIONES 2013. Published by AULA MEDICA. All rights reserved.
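    The published equation translates directly into code. The example measurements below are hypothetical, used only to exercise the formula.

```python
def predicted_weight(wc, age, hc, ds, cc, wrc, female):
    """Predicted body weight (kg) from waist (WC), hip (HC), calf (CC) and
    wrist (WrC) circumferences and demispan (DS), all in cm, plus age and
    sex, per the equation reported in the study."""
    return (0.524 * wc - 0.176 * age + 0.484 * hc + 0.613 * ds
            + 0.704 * cc + 2.75 * wrc - (3.330 if female else 0.0) - 140.87)

# Hypothetical patient: WC 100, age 50, HC 105, DS 78, CC 36, WrC 16, female.
print(round(predicted_weight(100, 50, 105, 78, 36, 16, True), 1))  # 67.4
```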

  8. A meta-analysis of the validity of FFQ targeted to adolescents.

    PubMed

    Tabacchi, Garden; Filippi, Anna Rita; Amodio, Emanuele; Jemni, Monèm; Bianco, Antonino; Firenze, Alberto; Mammina, Caterina

    2016-05-01

    The present work is aimed at meta-analysing validity studies of FFQ for adolescents, to investigate their overall accuracy and variables that can affect it negatively. A meta-analysis of sixteen original articles was performed within the ASSO Project (Adolescents and Surveillance System in the Obesity prevention). The articles assessed the validity of FFQ for adolescents, compared with food records or 24 h recalls, with regard to energy and nutrient intakes. Pearson's or Spearman's correlation coefficients, means/standard deviations, kappa agreement, percentiles and mean differences/limits of agreement (Bland-Altman method) were extracted. Pooled estimates were calculated and heterogeneity tested for correlation coefficients and means/standard deviations. A subgroup analysis assessed variables influencing FFQ accuracy. An overall fair/high correlation between FFQ and reference method was found; a good agreement, measured through the intake mean comparison for all nutrients except sugar, carotene and K, was observed. Kappa values showed fair/moderate agreement; an overall good ability to rank adolescents according to energy and nutrient intakes was evidenced by data of percentiles; absolute validity was not confirmed by mean differences/limits of agreement. Interviewer administration mode, consumption interval of the previous year/6 months and high number of food items are major contributors to heterogeneity and thus can reduce FFQ accuracy. The meta-analysis shows that FFQ are accurate tools for collecting data and could be used for ranking adolescents in terms of energy and nutrient intakes. It suggests how the design and the validation of a new FFQ should be addressed.
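    Pooling correlation coefficients across validation studies is typically done on Fisher's z scale with weights of n - 3; a minimal fixed-effect sketch (illustrative inputs, not the meta-analysis data).

```python
import math

def pooled_r(rs, ns):
    """Fixed-effect pooled correlation: average Fisher-z transforms of the
    study correlations, weighted by n - 3, then transform back to r."""
    zs = [math.atanh(r) for r in rs]
    ws = [n - 3 for n in ns]
    zbar = sum(w * z for w, z in zip(ws, zs)) / sum(ws)
    return math.tanh(zbar)

print(round(pooled_r([0.45, 0.60, 0.52], [120, 80, 200]), 3))
```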

  9. Outcome and Impact Evaluation of a Transgender Health Course for Health Profession Students.

    PubMed

    Braun, Hannan M; Garcia-Grossman, Ilana R; Quiñones-Rivera, Andrea; Deutsch, Madeline B

    2017-02-01

    Being transgender is associated with numerous health disparities, and transgender individuals face mistreatment and discrimination in healthcare settings. At the same time, healthcare professionals report inadequate preparation to care for transgender people, and patients often have to teach their own medical providers about transgender care. Our study aimed to evaluate the impact of an elective course for health profession students in transgender health that was implemented to address these gaps in provider knowledge. Students participated in a 10-session, lunch-hour elective course during the spring of 2015. To evaluate impact, course participants completed pre-, immediately post-, and 3-month postcourse questionnaires, including a previously validated nine-item transphobia scale, to determine the course's effect on knowledge, attitudes, and beliefs about transgender health. Forty-six students completed the pre- and immediately postelective questionnaire (74% response rate). Compared with pre-elective surveys, immediately postelective scores demonstrated increased knowledge in most domains and reduced transphobia. Specific knowledge domains with improvements included terminology, best practices for collecting gender identity, awareness of the DSM-V gender dysphoria diagnosis, medications used for gender affirmation, and relevant federal policies. A previously validated transphobia scale was found to have good reliability in the current sample. This elective course led to positive short-term changes in measures of multiple knowledge domains and reduced measures of transphobia among health profession students. Further study is needed to assess the long-term impact. Our methods and findings, including the demonstration of reliability of a previously validated nine-item transphobia scale, serve as formative data for the future development of theory-based transgender medicine curricula and measures.

  10. Predicting ionizing radiation exposure using biochemically-inspired genomic machine learning.

    PubMed

    Zhao, Jonathan Z L; Mucaki, Eliseos J; Rogan, Peter K

    2018-01-01

    Background: Gene signatures derived from transcriptomic data using machine learning methods have shown promise for biodosimetry testing. These signatures may not be sufficiently robust for large scale testing, as their performance has not been adequately validated on external, independent datasets. The present study develops human and murine signatures with biochemically-inspired machine learning that are strictly validated using k-fold and traditional approaches. Methods: Gene Expression Omnibus (GEO) datasets of exposed human and murine lymphocytes were preprocessed via nearest neighbor imputation, and expression of genes implicated in the literature to be responsive to radiation exposure (n=998) was then ranked by Minimum Redundancy Maximum Relevance (mRMR). Optimal signatures were derived by backward, complete, and forward sequential feature selection using Support Vector Machines (SVM), and validated using k-fold or traditional validation on independent datasets. Results: The best human signatures we derived exhibit k-fold validation accuracies of up to 98% (DDB2, PRKDC, TPP2, PTPRE, and GADD45A) when validated over 209 samples and traditional validation accuracies of up to 92% (DDB2, CD8A, TALDO1, PCNA, EIF4G2, LCN2, CDKN1A, PRKCH, ENO1, and PPM1D) when validated over 85 samples. Some human signatures are specific enough to differentiate between chemotherapy and radiotherapy. Certain multi-class murine signatures have sufficient granularity in dose estimation to inform eligibility for cytokine therapy (assuming these signatures could be translated to humans). We compiled a list of the most frequently appearing genes in the top 20 human and mouse signatures. More frequently appearing genes among an ensemble of signatures may indicate greater impact of these genes on the performance of individual signatures. Several genes in the signatures we derived are present in previously proposed signatures.
Conclusions: Gene signatures for ionizing radiation exposure derived by machine learning have low error rates in externally validated, independent datasets, and exhibit high specificity and granularity for dose estimation.
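    The mRMR ranking idea (maximize relevance to the label, minimize redundancy with already-chosen genes) can be sketched with plain correlations standing in for the mutual-information criteria used in practice; gene names and expression values below are invented.

```python
import math

def corr(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = math.sqrt(sum((v - mx) ** 2 for v in x))
    sy = math.sqrt(sum((v - my) ** 2 for v in y))
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (sx * sy)

def mrmr(features, target, k):
    """Greedy mRMR sketch: the next gene maximizes |corr with target|
    minus the mean |corr| with genes already selected."""
    chosen = []
    while len(chosen) < k:
        def score(name):
            rel = abs(corr(features[name], target))
            red = (sum(abs(corr(features[name], features[c])) for c in chosen)
                   / len(chosen)) if chosen else 0.0
            return rel - red
        best = max((f for f in features if f not in chosen), key=score)
        chosen.append(best)
    return chosen

genes = {"g1": [1, 2, 3, 5], "g2": [2, 4, 6, 10], "g3": [2, 1, 4, 3]}
print(mrmr(genes, target=[1, 2, 3, 4], k=2))  # skips g2: redundant with g1
```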

  11. Mathematical multi-scale model of the cardiovascular system including mitral valve dynamics. Application to ischemic mitral insufficiency

    PubMed Central

    2011-01-01

    Background Valve dysfunction is a common cardiovascular pathology. Despite significant clinical research, there is little formal study of how valve dysfunction affects overall circulatory dynamics. Validated models would offer the ability to better understand these dynamics and thus optimize diagnosis, as well as surgical and other interventions. Methods A cardiovascular and circulatory system (CVS) model has already been validated in silico, and in several animal model studies. It accounts for valve dynamics using Heaviside functions to simulate a physiologically accurate "open on pressure, close on flow" law. However, it does not consider real-time valve opening dynamics and therefore does not fully capture valve dysfunction, particularly where the dysfunction involves partial closure. This research describes an updated version of this previous closed-loop CVS model that includes the progressive opening of the mitral valve, and is defined over the full cardiac cycle. Results Simulations of the cardiovascular system with a healthy mitral valve are performed, and the global hemodynamic behaviour is studied and compared with previously validated results. The error between the resulting pressure-volume (PV) loops of the already validated CVS model and of the new CVS model that includes the progressive opening of the mitral valve is assessed and remains within typical measurement error and variability. Simulations of ischemic mitral insufficiency are also performed. Pressure-volume loops, transmitral flow evolution and mitral valve aperture area evolution follow reported measurements in shape, amplitude and trends. Conclusions The resulting cardiovascular system model including mitral valve dynamics provides a foundation for clinical validation and the study of valvular dysfunction in vivo. The overall models and results could readily be generalised to other cardiac valves. PMID:21942971
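    The difference from a Heaviside valve law can be pictured as a first-order relaxation of the aperture area instead of an instantaneous jump. This is a schematic sketch with invented time constants, not the paper's model equations.

```python
def step_aperture(area, target, tau, dt):
    """Progressive valve opening: relax the aperture area toward its
    target state (0 = closed, 1 = open) with time constant tau, rather
    than switching instantaneously as a Heaviside law would."""
    return area + dt / tau * (target - area)

area = 0.0
for _ in range(100):          # a pressure gradient holds the target open
    area = step_aperture(area, 1.0, tau=0.02, dt=0.001)
print(round(area, 3))         # ~0.994: nearly open, but not a step change
```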

  12. Estimation of hand hygiene opportunities on an adult medical ward using 24-hour camera surveillance: validation of the HOW2 Benchmark Study.

    PubMed

    Diller, Thomas; Kelly, J William; Blackhurst, Dawn; Steed, Connie; Boeker, Sue; McElveen, Danielle C

    2014-06-01

    We previously published a formula to estimate the number of hand hygiene opportunities (HHOs) per patient-day using the World Health Organization's "Five Moments for Hand Hygiene" methodology (HOW2 Benchmark Study). HHOs can be used as a denominator for calculating hand hygiene compliance rates when product utilization data are available. This study validates the previously derived HHO estimate using 24-hour video surveillance of health care worker hand hygiene activity. The validation study utilized 24-hour video surveillance recordings of 26 patients' hospital stays to measure the actual number of HHOs per patient-day on a medicine ward in a large teaching hospital. Statistical methods were used to compare these results to those obtained by episodic observation of patient activity in the original derivation study. Total hours of data collection were 81.3 and 1,510.8, resulting in 1,740 and 4,522 HHOs in the derivation and validation studies, respectively. Comparisons of the mean and median HHOs per 24-hour period did not differ significantly. HHOs were 71.6 (95% confidence interval: 64.9-78.3) and 73.9 (95% confidence interval: 69.1-84.1), respectively. This study validates the HOW2 Benchmark Study and confirms that expected numbers of HHOs can be estimated from the unit's patient census and patient-to-nurse ratio. These data can be used as denominators in calculations of hand hygiene compliance rates from electronic monitoring using the "Five Moments for Hand Hygiene" methodology. Copyright © 2014 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Mosby, Inc. All rights reserved.
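    With a validated HHOs-per-patient-day estimate as the denominator, a compliance rate follows directly from product-utilization counts; the dispenser and census numbers below are hypothetical.

```python
def compliance_rate(dispense_events, hho_per_patient_day, patient_days):
    """Hand-hygiene compliance with product-utilization counts as the
    numerator and estimated opportunities (HHOs) as the denominator."""
    return dispense_events / (hho_per_patient_day * patient_days)

# A ward with 100 patient-days and the validated ~73.9 HHOs/patient-day:
print(round(compliance_rate(4000, 73.9, 100), 2))  # 0.54
```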

  13. An SSP-PCR method for the rapid detection of disease-associated alleles HLA-A*29 and HLA-B*51.

    PubMed

    Amstutz, U; Schaerer, D; Andrey, G; Wirthmueller, U; Largiadèr, C R

    2018-05-15

    HLA-A*29 and HLA-B*51 are associated with birdshot uveitis and Behçet's disease, respectively, and are used as a diagnostic criterion in patients with suspected disease, requiring their detection in diagnostic laboratories. While commercial tests for individual HLA alleles are available for other disease-associated HLA variants, no similar allele-specific assays are available for HLA-A*29 and -B*51. Here, we report SSP-PCR methods for the detection of HLA-A*29 and -B*51 using a single PCR reaction per allele. The assays were tested in 30 and 32 previously HLA-typed samples, respectively, representing >97% of HLA-A alleles and >93% of HLA-B alleles in a European population. A concordance of 100% was observed with previous typing results, validating these methods for use in a diagnostic or research context. This article is protected by copyright. All rights reserved.

  14. Prediction of VO2max in Children and Adolescents Using Exercise Testing and Physical Activity Questionnaire Data

    ERIC Educational Resources Information Center

    Black, Nate E.; Vehrs, Pat R.; Fellingham, Gilbert W.; George, James D.; Hager, Ron

    2016-01-01

    Purpose: The purpose of this study was to evaluate the use of a treadmill walk-jog-run exercise test previously validated in adults and physical activity questionnaire data to estimate maximum oxygen consumption (VO2max) in boys (n = 62) and girls (n = 66) aged 12 to 17 years old. Methods: Data were collected from Physical Activity…

  15. A Diagnostic Marker to Discriminate Childhood Apraxia of Speech from Speech Delay: III. Theoretical Coherence of the Pause Marker with Speech Processing Deficits in Childhood Apraxia of Speech

    ERIC Educational Resources Information Center

    Shriberg, Lawrence D.; Strand, Edythe A.; Fourakis, Marios; Jakielski, Kathy J.; Hall, Sheryl D.; Karlsson, Heather B.; Mabie, Heather L.; McSweeny, Jane L.; Tilkens, Christie M.; Wilson, David L.

    2017-01-01

    Purpose: Previous articles in this supplement described rationale for and development of the pause marker (PM), a diagnostic marker of childhood apraxia of speech (CAS), and studies supporting its validity and reliability. The present article assesses the theoretical coherence of the PM with speech processing deficits in CAS. Method: PM and other…

  16. Life-cycle economics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lunde, P.J.

    1982-09-01

    In a continuation of previous economic analyses, life-cycle economics of solar projects are discussed using the concept of net present value (NPV) or net worth. The discount rate is defined and illustrated and a life-cycle analysis is worked out based on no down payment and a 25-year loan. The advantages of rising NPV are discussed and illustrated using an energy conserving $100 storm window as an example. Real payback period is discussed and it is concluded that NPV is the only valid method for the evaluation of an investment. Return on investment is cited as a satisfactory alternative method. (MJJ)
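    The NPV calculation is a one-liner. The storm-window figures below are hypothetical stand-ins (purchase price $100, an assumed $12/year fuel saving over 25 years, 8% discount rate); a positive NPV means the investment is worthwhile.

```python
def npv(rate, cashflows):
    """Net present value; cashflows[0] occurs now (negative = outlay)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

print(round(npv(0.08, [-100] + [12] * 25), 2))  # positive -> worth buying
```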

  17. Higher Order Corrections in the CoLoRFulNNLO Framework

    NASA Astrophysics Data System (ADS)

    Somogyi, G.; Kardos, A.; Szőr, Z.; Trócsányi, Z.

    We discuss the CoLoRFulNNLO method for computing higher order radiative corrections to jet cross sections in perturbative QCD. We apply our method to the calculation of event shapes and jet rates in three-jet production in electron-positron annihilation. We validate our code by comparing our predictions to previous results in the literature and present the jet cone energy fraction distribution at NNLO accuracy. We also present preliminary NNLO results for the three-jet rate using the Durham jet clustering algorithm matched to resummed predictions at NLL accuracy, and a comparison to LEP data.

  18. Diagnostics for insufficiencies of posterior calculations in Bayesian signal inference.

    PubMed

    Dorn, Sebastian; Oppermann, Niels; Ensslin, Torsten A

    2013-11-01

    We present an error-diagnostic validation method for posterior distributions in Bayesian signal inference, an advancement of previous work. It transfers deviations from the correct posterior into characteristic deviations from a uniform distribution of a quantity constructed for this purpose. We show that this method is able to reveal and discriminate several kinds of numerical and approximation errors, as well as their impact on the posterior distribution. To this end we present four typical analytical examples of posteriors with incorrect variance, skewness, position of the maximum, or normalization. We show further how this test can be applied to multidimensional signals.
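    The core trick, mapping posterior deviations onto deviations from uniformity, is the probability integral transform. A sketch testing simulated samples against a claimed standard-normal posterior (a toy stand-in for the paper's constructed diagnostic quantity):

```python
import math, random

def pit(samples, cdf):
    """Probability integral transform: if samples truly come from the
    posterior with CDF F, then u = F(x) is Uniform(0, 1); departures
    from uniformity flag errors in the posterior calculation."""
    return [cdf(x) for x in samples]

random.seed(1)
samples = [random.gauss(0, 1) for _ in range(2000)]   # 'correct' posterior
normal_cdf = lambda x: 0.5 * (1 + math.erf(x / math.sqrt(2)))
u = pit(samples, normal_cdf)
print(abs(sum(u) / len(u) - 0.5) < 0.05)  # True: mean of u sits near 1/2
```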

  19. A second-order accurate immersed boundary-lattice Boltzmann method for particle-laden flows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Qiang; Fan, Liang-Shih, E-mail: fan.1@osu.edu

    A new immersed boundary-lattice Boltzmann method (IB-LBM) is presented for fully resolved simulations of incompressible viscous flows laden with rigid particles. The immersed boundary method (IBM) recently developed by Breugem (2012) [19] is adopted in the present method, including the retraction technique, the multi-direct forcing method and the direct account of the inertia of the fluid contained within the particles. The present IB-LBM is, however, further improved with the implementation of high-order Runge–Kutta schemes in the coupled fluid–particle interaction. The major challenge in implementing high-order Runge–Kutta schemes in the LBM is that the flow information such as density and velocity cannot be directly obtained at a fractional time step from the LBM, since the LBM only provides the flow information at an integer time step. This challenge is overcome in the present IB-LBM by extrapolating the flow field around particles from the known flow field at the previous integer time step. The newly calculated fluid–particle interactions from the previous fractional time steps of the current integer time step are also accounted for in the extrapolation. The IB-LBM with high-order Runge–Kutta schemes developed in this study is validated by several benchmark applications. It is demonstrated, for the first time, that the IB-LBM has the capacity to resolve the translational and rotational motion of particles with second-order accuracy. The optimal retraction distances for spheres and tubes that help the method achieve second-order accuracy are found to be around 0.30 and −0.47 times the lattice spacing, respectively. Simulations of the Stokes flow through a simple cubic lattice of rotating spheres indicate that the lift force produced by the Magnus effect can be very significant in view of the magnitude of the drag force when the practical rotating speed of the spheres is encountered.
This finding may lead to more comprehensive studies of the effect of particle rotation on fluid–solid drag laws. It is also demonstrated that, when the third-order or fourth-order Runge–Kutta scheme is used, the numerical stability of the present IB-LBM is better than that of all methods in the literature, including the previous IB-LBMs and the methods combining the IBM with a traditional incompressible Navier–Stokes solver. Highlights: • The IBM is embedded in the LBM using Runge–Kutta time schemes. • The effectiveness of the present IB-LBM is validated by benchmark applications. • For the first time, the IB-LBM achieves second-order accuracy. • The numerical stability of the present IB-LBM is better than that of previous methods.
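    The fractional-step issue arises because classical Runge–Kutta stages sample the right-hand side between integer steps. A generic RK4 step for reference (this is the standard scheme, not the paper's LBM coupling itself):

```python
def rk4_step(f, t, y, h):
    """One classical fourth-order Runge-Kutta step; note the stage
    evaluations at t + h/2, the 'fractional' times that a plain LBM
    update cannot supply directly."""
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

# Integrate dy/dt = y from y(0) = 1 to t = 1; the result approximates e.
t, y, h = 0.0, 1.0, 0.1
for _ in range(10):
    y = rk4_step(lambda t, y: y, t, y, h)
    t += h
print(y)  # close to 2.718281828...
```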

  20. SU-D-BRD-06: Automated Population-Based Planning for Whole Brain Radiation Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schreibmann, E; Fox, T; Crocker, I

    2014-06-01

    Purpose: Treatment planning for whole brain radiation treatment is technically a simple process, but in practice it consumes valuable clinical time on repetitive and tedious tasks. This report presents a method that automatically segments the relevant target and normal tissues and creates a treatment plan within a few minutes after patient simulation. Methods: Segmentation is performed automatically through morphological operations on the soft tissue. The treatment plan is generated by searching a database of previous cases for patients with similar anatomy. In this search, each database case is ranked in terms of similarity using a customized metric designed for sensitivity by including only geometrical changes that affect the dose distribution. The database case with the best match is automatically modified to replace the relevant patient information and isocenter position while maintaining the original beam and MLC settings. Results: Fifteen patients were used to validate the method. In each of these cases the anatomy was accurately segmented, with mean Dice coefficients of 0.970 ± 0.008 for the brain, 0.846 ± 0.009 for the eyes and 0.672 ± 0.111 for the lens as compared with clinical segmentations. Each case was then matched against a database of 70 validated treatment plans and the best matching plan (termed the auto-plan) was compared retrospectively with the clinical plan in terms of brain coverage and maximum doses to critical structures. Maximum doses were reduced by up to 20.809 Gy for the left eye (mean 3.533), by 13.352 (1.311) for the right eye, and by 27.471 (4.856) and 25.218 (6.315) for the left and right lens. Time from simulation to auto-plan was 3-4 minutes. Conclusion: Automated database-based matching is an alternative to classical treatment planning that improves quality while providing a cost-effective solution to planning by modifying previously validated plans to match a current patient's anatomy.
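    Retrieving the best-matching prior plan is, at its core, a nearest-neighbor search over anatomy features. Plain Euclidean distance below stands in for the paper's customized, dose-aware metric, and the cases and feature values are hypothetical.

```python
import math

def best_match(patient_features, database):
    """Rank stored plans by anatomical similarity and return the closest
    case; its beam and MLC settings would then be reused."""
    return min(database,
               key=lambda case: math.dist(case["features"], patient_features))

db = [{"id": "case-A", "features": [10.0, 4.2]},
      {"id": "case-B", "features": [9.1, 4.0]}]
print(best_match([9.0, 4.1], db)["id"])  # case-B
```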

  1. Multicenter validation of the diagnostic accuracy of a blood-based gene expression test for assessing obstructive coronary artery disease in nondiabetic patients.

    PubMed

    Rosenberg, Steven; Elashoff, Michael R; Beineke, Philip; Daniels, Susan E; Wingrove, James A; Tingley, Whittemore G; Sager, Philip T; Sehnert, Amy J; Yau, May; Kraus, William E; Newby, L Kristin; Schwartz, Robert S; Voros, Szilard; Ellis, Stephen G; Tahirkheli, Naeem; Waksman, Ron; McPherson, John; Lansky, Alexandra; Winn, Mary E; Schork, Nicholas J; Topol, Eric J

    2010-10-05

    Diagnosing obstructive coronary artery disease (CAD) in at-risk patients can be challenging and typically requires both noninvasive imaging methods and coronary angiography, the gold standard. Previous studies have suggested that peripheral blood gene expression can indicate the presence of CAD. Objective: to validate a previously developed 23-gene, expression-based classification test for diagnosis of obstructive CAD in nondiabetic patients. Design: multicenter prospective trial with blood samples obtained before coronary angiography (ClinicalTrials.gov registration number: NCT00500617). Setting: 39 centers in the United States. Patients: an independent validation cohort of 526 nondiabetic patients with a clinical indication for coronary angiography. Measurements: receiver-operating characteristic (ROC) analysis of the classifier score measured by real-time polymerase chain reaction, additivity to clinical factors, and reclassification of patient disease likelihood versus disease status defined by quantitative coronary angiography. Obstructive CAD was defined as 50% or greater stenosis in 1 or more major coronary arteries by quantitative coronary angiography. Results: the area under the ROC curve (AUC) was 0.70 ± 0.02 (P < 0.001); the test added to clinical variables (Diamond-Forrester method) (AUC, 0.72 with the test vs. 0.66 without; P = 0.003) and added somewhat to an expanded clinical model (AUC, 0.745 with the test vs. 0.732 without; P = 0.089). The test improved net reclassification over both the Diamond-Forrester method and the expanded clinical model (P < 0.001). At a score threshold corresponding to a 20% likelihood of obstructive CAD (14.75), the sensitivity and specificity were 85% and 43% (yielding a negative predictive value of 83% and a positive predictive value of 46%), with 33% of patient scores below this threshold. Limitation: patients with chronic inflammatory disorders, elevated levels of leukocytes or cardiac protein markers, or diabetes were excluded.
A noninvasive whole-blood test based on gene expression and demographic characteristics may be useful for assessing obstructive CAD in nondiabetic patients without known CAD. Primary funding source: CardioDx.
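The AUC figures above have a rank interpretation (the Mann-Whitney statistic): the probability that a randomly chosen patient with obstructive CAD receives a higher score than a randomly chosen patient without. A minimal sketch, assuming binary labels and a hypothetical `auc` helper (not the study's analysis code):

```python
def auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney statistic:
    the fraction of positive/negative pairs in which the positive
    scores higher than the negative (ties count half)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

An AUC of 0.5 corresponds to a test with no discrimination; the reported 0.70 means a diseased patient outscores a non-diseased one about 70% of the time.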

  2. Prediction of psychosis across protocols and risk cohorts using automated language analysis

    PubMed Central

    Corcoran, Cheryl M.; Carrillo, Facundo; Fernández‐Slezak, Diego; Bedi, Gillinder; Klim, Casimir; Javitt, Daniel C.; Bearden, Carrie E.; Cecchi, Guillermo A.

    2018-01-01

    Language and speech are the primary source of data for psychiatrists to diagnose and treat mental disorders. In psychosis, the very structure of language can be disturbed, including semantic coherence (e.g., derailment and tangentiality) and syntactic complexity (e.g., concreteness). Subtle disturbances in language are evident in schizophrenia even prior to first psychosis onset, during prodromal stages. Using computer‐based natural language processing analyses, we previously showed that, among English‐speaking clinical (e.g., ultra) high‐risk youths, baseline reduction in semantic coherence (the flow of meaning in speech) and in syntactic complexity could predict subsequent psychosis onset with high accuracy. Herein, we aimed to cross‐validate these automated linguistic analytic methods in a second larger risk cohort, also English‐speaking, and to discriminate speech in psychosis from normal speech. We identified an automated machine‐learning speech classifier – comprising decreased semantic coherence, greater variance in that coherence, and reduced usage of possessive pronouns – that had an 83% accuracy in predicting psychosis onset (intra‐protocol), a cross‐validated accuracy of 79% of psychosis onset prediction in the original risk cohort (cross‐protocol), and a 72% accuracy in discriminating the speech of recent‐onset psychosis patients from that of healthy individuals. The classifier was highly correlated with previously identified manual linguistic predictors. Our findings support the utility and validity of automated natural language processing methods to characterize disturbances in semantics and syntax across stages of psychotic disorder. The next steps will be to apply these methods in larger risk cohorts to further test reproducibility, also in languages other than English, and identify sources of variability. 
This technology has the potential to improve prediction of psychosis outcome among at‐risk youths and identify linguistic targets for remediation and preventive intervention. More broadly, automated linguistic analysis can be a powerful tool for diagnosis and treatment across neuropsychiatry. PMID:29352548
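    The coherence features of the classifier (decreased semantic coherence and greater variance in that coherence) are typically computed from similarities between consecutive sentence embeddings. A minimal sketch assuming sentence vectors are already available (hypothetical helpers, not the published pipeline):

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def coherence_features(sentence_vectors):
    """First-order semantic coherence: cosine similarity between
    consecutive sentence embeddings; returns its mean and variance,
    two of the feature types described in the abstract."""
    sims = [cosine(u, v)
            for u, v in zip(sentence_vectors, sentence_vectors[1:])]
    mean = sum(sims) / len(sims)
    var = sum((s - mean) ** 2 for s in sims) / len(sims)
    return mean, var
```

    Lower mean coherence indicates more abrupt shifts in meaning between sentences, the kind of derailment the classifier exploits.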

  3. Prediction of psychosis across protocols and risk cohorts using automated language analysis.

    PubMed

    Corcoran, Cheryl M; Carrillo, Facundo; Fernández-Slezak, Diego; Bedi, Gillinder; Klim, Casimir; Javitt, Daniel C; Bearden, Carrie E; Cecchi, Guillermo A

    2018-02-01

    Language and speech are the primary source of data for psychiatrists to diagnose and treat mental disorders. In psychosis, the very structure of language can be disturbed, including semantic coherence (e.g., derailment and tangentiality) and syntactic complexity (e.g., concreteness). Subtle disturbances in language are evident in schizophrenia even prior to first psychosis onset, during prodromal stages. Using computer-based natural language processing analyses, we previously showed that, among English-speaking clinical (e.g., ultra) high-risk youths, baseline reduction in semantic coherence (the flow of meaning in speech) and in syntactic complexity could predict subsequent psychosis onset with high accuracy. Herein, we aimed to cross-validate these automated linguistic analytic methods in a second larger risk cohort, also English-speaking, and to discriminate speech in psychosis from normal speech. We identified an automated machine-learning speech classifier - comprising decreased semantic coherence, greater variance in that coherence, and reduced usage of possessive pronouns - that had an 83% accuracy in predicting psychosis onset (intra-protocol), a cross-validated accuracy of 79% of psychosis onset prediction in the original risk cohort (cross-protocol), and a 72% accuracy in discriminating the speech of recent-onset psychosis patients from that of healthy individuals. The classifier was highly correlated with previously identified manual linguistic predictors. Our findings support the utility and validity of automated natural language processing methods to characterize disturbances in semantics and syntax across stages of psychotic disorder. The next steps will be to apply these methods in larger risk cohorts to further test reproducibility, also in languages other than English, and identify sources of variability. 
This technology has the potential to improve prediction of psychosis outcome among at-risk youths and identify linguistic targets for remediation and preventive intervention. More broadly, automated linguistic analysis can be a powerful tool for diagnosis and treatment across neuropsychiatry. © 2018 World Psychiatric Association.

  4. Assessment of Lower Limb Muscle Strength and Power Using Hand-Held and Fixed Dynamometry: A Reliability and Validity Study

    PubMed Central

    Perraton, Luke G.; Bower, Kelly J.; Adair, Brooke; Pua, Yong-Hao; Williams, Gavin P.; McGaw, Rebekah

    2015-01-01

    Introduction: Hand-held dynamometry (HHD) has never previously been used to examine isometric muscle power. Rate of force development (RFD) is often used for muscle power assessment; however, no consensus currently exists on the most appropriate method of calculation. The aim of this study was to examine the reliability of different algorithms for RFD calculation and to examine the intra-rater, inter-rater, and inter-device reliability of HHD, as well as the concurrent validity of HHD for the assessment of isometric lower limb muscle strength and power. Methods: 30 healthy young adults (age: 23 ± 5 years; 15 male) were assessed in two sessions. Isometric muscle strength and power were measured as peak force and RFD, respectively, using two HHDs (Lafayette Model-01165 and Hoggan microFET2) and a criterion-reference KinCom dynamometer. Statistical analysis of reliability and validity comprised intraclass correlation coefficients (ICC), Pearson correlations, concordance correlations, standard error of measurement, and minimal detectable change. Results: Comparison of RFD methods revealed that a peak 200-ms moving window algorithm provided optimal reliability. Intra-rater, inter-rater, and inter-device reliability analysis of peak force and RFD revealed mostly good to excellent reliability (coefficients ≥ 0.70) for all muscle groups. Concurrent validity analysis showed moderate to excellent relationships between HHD and fixed dynamometry for the hip and knee (ICCs ≥ 0.70) for both peak force and RFD, with mostly poor to good results for the ankle muscles (ICCs = 0.31-0.79). Conclusions: Hand-held dynamometry has good to excellent reliability and validity for most measures of isometric lower limb strength and power in a healthy population, particularly for proximal muscle groups. To aid implementation we have created freely available software to extract these variables from data stored on the Lafayette device.
Future research should examine the reliability and validity of these variables in clinical populations. PMID:26509265
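    The peak 200-ms moving-window RFD algorithm that proved most reliable can be sketched as the largest mean slope of the force-time curve over any 200-ms window (a hypothetical `peak_rfd` helper; uniform sampling assumed, not the authors' software):

```python
def peak_rfd(force, dt, window_s=0.2):
    """Peak rate of force development: the largest average slope of
    the force-time curve over any window of length window_s (default
    200 ms), for samples spaced dt seconds apart."""
    n = max(1, round(window_s / dt))  # samples spanned by one window
    # change in force across each candidate window position
    diffs = [force[i + n] - force[i] for i in range(len(force) - n)]
    return max(diffs) / (n * dt)
```

    A usage note: with force in newtons and dt in seconds, the result is in N/s, the unit in which RFD is normally reported.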

  5. Verification and Validation of a Coordinate Transformation Method in Axisymmetric Transient Magnetics.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ashcraft, C. Chace; Niederhaus, John Henry; Robinson, Allen C.

    We present a verification and validation analysis of a coordinate-transformation-based numerical solution method for the two-dimensional axisymmetric magnetic diffusion equation, implemented in the finite-element simulation code ALEGRA. The transformation, suggested by Melissen and Simkin, yields an equation set perfectly suited for linear finite elements and for problems with large jumps in material conductivity near the axis. The verification analysis examines transient magnetic diffusion in a rod or wire in a very low conductivity background by first deriving an approximate analytic solution using perturbation theory. This approach for generating a reference solution is shown to be not fully satisfactory. A specialized approach for manufacturing an exact solution is then used to demonstrate second-order convergence under spatial and temporal refinement. For this new implementation, a significant improvement relative to previously available formulations is observed. Benefits in accuracy for computed current density and Joule heating are also demonstrated. The validation analysis examines the circuit-driven explosion of a copper wire using resistive magnetohydrodynamics modeling, in comparison to experimental tests. The new implementation matches the accuracy of the existing formulation, with both formulations capturing the experimental burst time and action to within approximately 2%.

  6. Consensus Induced Fit Docking (cIFD): methodology, validation, and application to the discovery of novel Crm1 inhibitors

    NASA Astrophysics Data System (ADS)

    Kalid, Ori; Toledo Warshaviak, Dora; Shechter, Sharon; Sherman, Woody; Shacham, Sharon

    2012-11-01

    We present the Consensus Induced Fit Docking (cIFD) approach for adapting a protein binding site to accommodate multiple diverse ligands for virtual screening. This novel approach results in a single binding site structure that can bind diverse chemotypes and is thus highly useful for efficient structure-based virtual screening. We first describe the cIFD method and its validation on three targets that were previously shown to be challenging for docking programs (COX-2, estrogen receptor, and HIV reverse transcriptase). We then demonstrate the application of cIFD to the challenging discovery of irreversible Crm1 inhibitors. We report the identification of 33 novel Crm1 inhibitors, which resulted from the testing of 402 purchased compounds selected from a screening set containing 261,680 compounds. This corresponds to a hit rate of 8.2%. The novel Crm1 inhibitors reveal diverse chemical structures, validating the utility of the cIFD method in a real-world drug discovery project. This approach offers a pragmatic way to implicitly account for protein flexibility without the additional computational costs of ensemble docking or including full protein flexibility during virtual screening.

  7. Analytical method for analysis of electromagnetic scattering from inhomogeneous spherical structures using duality principles

    NASA Astrophysics Data System (ADS)

    Kiani, M.; Abdolali, A.; Safari, M.

    2018-03-01

    In this article, an analytical approach is presented for the analysis of electromagnetic (EM) scattering from radially inhomogeneous spherical structures (RISSs) based on the duality principle. Owing to the spherical symmetry, similar angular dependencies in all regions are handled using spherical harmonics. To extract the radial dependency, the system of differential equations of wave propagation along the direction of inhomogeneity is equated with the dual planar one. A general duality between the electromagnetic fields and parameters and the scattering parameters of the two structures is introduced. The validity of the proposed approach is verified through a comprehensive example. The presented approach reduces a complicated problem in spherical coordinates to an easy, well-posed, and previously solved problem in planar geometry. The approach is valid for all continuously varying inhomogeneity profiles. One of the major advantages of the proposed method is the capability of studying two general and applicable types of RISSs. As an interesting application, a class of lens antennas based on the physical concept of gradient-refractive-index material is introduced. The approach is used to analyze the EM scattering from the structure and validate the strong performance of the lens.

  8. A simple, rapid and validated high-performance liquid chromatography method suitable for clinical measurements of human mercaptalbumin and non-mercaptalbumin.

    PubMed

    Yasukawa, Keiko; Shimosawa, Tatsuo; Okubo, Shigeo; Yatomi, Yutaka

    2018-01-01

    Background: Human mercaptalbumin and human non-mercaptalbumin have been reported as markers for various pathological conditions, such as kidney and liver diseases. These markers play important roles in redox regulation throughout the body. Despite their recognition in various pathophysiologic conditions, measurements of human mercaptalbumin and non-mercaptalbumin have not become widespread because of the technical complexity and long measurement times of conventional methods. Methods: Based on previous reports, we explored the optimal analytical conditions for a high-performance liquid chromatography method using an anion-exchange column packed with a hydrophilic polyvinyl alcohol gel. The method was then validated using performance tests as well as measurements of serum samples from various patients. Results: We successfully established a reliable high-performance liquid chromatography method with an analytical time of only 12 min per test. The repeatability (within-day variability) and reproducibility (day-to-day variability) were 0.30% and 0.27% (CV), respectively. A very good correlation was obtained with the results of the conventional method. Conclusions: A practical method for the clinical measurement of human mercaptalbumin and non-mercaptalbumin was established. This high-performance liquid chromatography method is expected to be a powerful tool for expanding clinical usefulness and elucidating the roles of albumin in redox reactions throughout the human body.

  9. Development and Validation of a New Blade Element Momentum Skewed-Wake Model within AeroDyn: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ning, S. A.; Hayman, G.; Damiani, R.

    Blade element momentum methods, though conceptually simple, are highly useful for analyzing wind turbine aerodynamics and are widely used in many design and analysis applications. A new version of AeroDyn is being developed to take advantage of new robust solution methodologies, conform to a new modularization framework for the National Renewable Energy Laboratory's FAST, utilize advanced skewed-wake analysis methods, fix limitations of previous implementations, and enable modeling of highly flexible and nonstraight blades. This paper reviews blade element momentum theory and several of the options available for analyzing skewed inflow. AeroDyn implementation details are described for the benefit of users and developers. These new options are compared to solutions from the previous version of AeroDyn and to experimental data. Finally, recommendations are given on how one might select from the various available solution approaches.

  10. Validation of a spectrophotometric method for quantification of carboxyhemoglobin.

    PubMed

    Luchini, Paulo D; Leyton, Jaime F; Strombech, Maria de Lourdes C; Ponce, Julio C; Jesus, Maria das Graças S; Leyton, Vilma

    2009-10-01

    The measurement of carboxyhemoglobin (COHb) levels in blood is a valuable procedure for confirming exposure to carbon monoxide (CO), whether for forensic or occupational purposes. A previously described method using spectrophotometric readings at 420 and 432 nm, after reduction of oxyhemoglobin (O(2)Hb) and methemoglobin with sodium hydrosulfite solution, leads to an exponential curve. This curve, used with pre-established factors, serves well for low concentrations (1-7%) or for high concentrations (> 20%), but very rarely for both. The authors have observed that small variations in the previously described factors F1, F2, and F3, obtained from readings for 100% COHb and 100% O(2)Hb, translate into significant changes in COHb% results, and they propose that these factors be determined every time COHb is measured, by reading CO- and O(2)-saturated samples. This practice leads to an increase in accuracy and precision.

  11. Optimized multiparametric flow cytometric analysis of circulating endothelial cells and their subpopulations in peripheral blood of patients with solid tumors: a technical analysis.

    PubMed

    Zhou, Fangbin; Zhou, Yaying; Yang, Ming; Wen, Jinli; Dong, Jun; Tan, Wenyong

    2018-01-01

    Circulating endothelial cells (CECs) and their subpopulations could be potential novel biomarkers for various malignancies. However, reliable enumeration methods are warranted to further improve their clinical utility. This study aimed to optimize a flow cytometric method (FCM) assay for CECs and their subpopulations in the peripheral blood of patients with solid cancers. An FCM assay was used to detect and identify CECs. A panel of 60 blood samples, comprising 44 metastatic cancer patients and 16 healthy controls, was used in this study. Several key issues of CEC enumeration were integrated and investigated, including sample material and anticoagulant selection, optimal titration of antibodies, lysis/wash procedures for blood sample preparation, conditions of sample storage, sufficient cell events to enhance the signal, fluorescence-minus-one controls instead of isotype controls to reduce background noise, optimal selection of cell surface markers, and the reproducibility of our method. Wilcoxon and Mann-Whitney U tests were used to determine statistically significant differences. In this validation study, we refined a five-color FCM method to detect CECs and their subpopulations in the peripheral blood of patients with solid tumors. Several key technical issues regarding preanalytical elements, FCM data acquisition, and analysis were addressed. Furthermore, we clinically validated the utility of our method. The baseline levels of mature CECs, endothelial progenitor cells, and activated CECs were higher in cancer patients than in healthy subjects (P < 0.01). However, there was no significant difference in resting CEC levels between healthy subjects and cancer patients (P = 0.193). We integrated and comprehensively addressed significant technical issues found in previously published assays and validated the reproducibility and sensitivity of our proposed method. Future work is required to explore the potential of our optimized method in clinical oncologic applications.

  12. Validation of a Monte Carlo code system for grid evaluation with interference effect on Rayleigh scattering

    NASA Astrophysics Data System (ADS)

    Zhou, Abel; White, Graeme L.; Davidson, Rob

    2018-02-01

    Anti-scatter grids are commonly used in x-ray imaging systems to reduce the scatter radiation reaching the image receptor. Anti-scatter grid performance can be simulated and validated through Monte Carlo (MC) methods. Our recently reported work modified existing MC codes, resulting in improved performance when simulating x-ray imaging. The aim of this work is to validate the transmission of x-ray photons through grids computed by the recently reported new MC codes against experimental results and results previously reported in other literature. The results of this work show that the scatter-to-primary ratio (SPR) and the transmission of primary (Tp), scatter (Ts), and total (Tt) radiation determined using this new MC code system agree strongly with the experimental results and the results reported in the literature. Tp, Ts, Tt, and SPR determined by this new MC simulation code system are valid. These results also show that the interference effect on Rayleigh scattering should not be neglected in the evaluation of both mammographic and general grids. Our new MC simulation code system has been shown to be valid and can be used for analysing and evaluating grid designs.

  13. Validity of a small low-cost triaxial accelerometer with integrated logger for uncomplicated measurements of postures and movements of head, upper back and upper arms.

    PubMed

    Dahlqvist, Camilla; Hansson, Gert-Åke; Forsman, Mikael

    2016-07-01

    Repetitive work and work in constrained postures are risk factors for developing musculoskeletal disorders. Low-cost, user-friendly technical methods to quantify these risks are needed. The aims were to validate the inclination angles and velocities of one model of the new generation of accelerometers with integrated data loggers against a previously validated accelerometer, and to compare measurements using a plain reference posture with those using a standardized one. All mean (n = 12 subjects) angular RMS differences across 4 work tasks and 4 body parts were <2.5°, and all mean median angular velocity differences were <5.0°/s. The mean correlation between the inclination signal pairs was 0.996. This model of the new generation of triaxial accelerometers proved comparable to the validated accelerometer with a data logger, making it well suited, for both researchers and practitioners, to measuring postures and movements during work. Further work is needed to validate the plain reference posture for the upper arms. Copyright © 2016 Elsevier Ltd and The Ergonomics Society. All rights reserved.
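    The two validation statistics above, the angular RMS difference between signal pairs and the median angular velocity, can be sketched as follows (hypothetical helpers; angles in degrees and a fixed sampling rate assumed, not the study's analysis code):

```python
import math
import statistics

def rms_difference(a, b):
    """Root-mean-square difference between two equally long
    inclination signals (degrees)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

def median_angular_velocity(angles, fs):
    """Median of the absolute sample-to-sample angular velocity
    (degrees per second), for angles sampled at fs Hz."""
    vel = [abs(b - a) * fs for a, b in zip(angles, angles[1:])]
    return statistics.median(vel)
```

    Comparing these two summary values between the new logger and the reference instrument is what yields the <2.5° and <5.0°/s differences reported.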

  14. The reliability and validity of fatigue measures during multiple-sprint work: an issue revisited.

    PubMed

    Glaister, Mark; Howatson, Glyn; Pattison, John R; McInnes, Gill

    2008-09-01

    The ability to repeatedly produce a high-power output or sprint speed is a key fitness component of most field and court sports. The aim of this study was to evaluate the validity and reliability of eight different approaches to quantify this parameter in tests of multiple-sprint performance. Ten physically active men completed two trials of each of two multiple-sprint running protocols with contrasting recovery periods. Protocol 1 consisted of 12 x 30-m sprints repeated every 35 seconds; protocol 2 consisted of 12 x 30-m sprints repeated every 65 seconds. All testing was performed in an indoor sports facility, and sprint times were recorded using twin-beam photocells. All but one of the formulae showed good construct validity, as evidenced by similar within-protocol fatigue scores. However, the assumptions on which many of the formulae were based, combined with poor or inconsistent test-retest reliability (coefficient of variation range: 0.8-145.7%; intraclass correlation coefficient range: 0.09-0.75), suggested many problems regarding logical validity. In line with previous research, the results support the percentage decrement calculation as the most valid and reliable method of quantifying fatigue in tests of multiple-sprint performance.
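    The percentage decrement score endorsed here is commonly calculated (in the Fitzsimons-style formulation; the paper's exact variant may differ) as the percentage by which total sprint time exceeds the ideal time, i.e. the fastest sprint repeated for every sprint:

```python
def percentage_decrement(sprint_times):
    """Percentage decrement score for repeated sprint times (s):
    100 * (total time / ideal time - 1), where the ideal time is
    the fastest sprint multiplied by the number of sprints.
    Common Fitzsimons-style formulation; variants exist."""
    ideal = min(sprint_times) * len(sprint_times)
    return 100.0 * (sum(sprint_times) / ideal - 1.0)
```

    A score of 0% means every sprint matched the fastest one; larger values indicate greater fatigue across the set.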

  15. Complex-Difference Constrained Compressed Sensing Reconstruction for Accelerated PRF Thermometry with Application to MRI Induced RF Heating

    PubMed Central

    Cao, Zhipeng; Oh, Sukhoon; Otazo, Ricardo; Sica, Christopher T.; Griswold, Mark A.; Collins, Christopher M.

    2014-01-01

    Purpose: To introduce a novel compressed sensing reconstruction method to accelerate proton resonance frequency (PRF) shift temperature imaging for evaluating MRI-induced radiofrequency (RF) heating. Methods: A compressed sensing approach that exploits the sparsity of the complex difference between post-heating and baseline images is proposed to accelerate PRF temperature mapping. The method exploits intra- and inter-image correlations to promote sparsity and remove shared aliasing artifacts. Validations were performed on simulations and on retrospectively undersampled data acquired in ex vivo and in vivo studies, comparing performance with previously proposed techniques. Results: The proposed complex-difference constrained compressed sensing reconstruction method improved the reconstruction of smooth and local PRF temperature change images compared to various available reconstruction methods in a simulation study, a retrospective study with heating of a human forearm in vivo, and a retrospective study with heating of a sample of beef ex vivo. Conclusion: Complex-difference-based compressed sensing with a fully sampled baseline image improves reconstruction accuracy for accelerated PRF thermometry. It can be used to improve the volumetric coverage and temporal resolution in the evaluation of RF heating due to MRI, and may help facilitate and validate temperature-based methods for safety assurance. PMID:24753099

  16. Development and validation of a spectrophotometric method for the determination of macrolide antibiotics by using 2,4-dinitrophenylhydrazine.

    PubMed

    Abdelmageed, Osama H

    2007-01-01

    A simple, novel, sensitive, and specific spectrophotometric method was developed and validated for the determination of azithromycin (AZ), clarithromycin (CLA), and roxithromycin (ROX) in bulk powders and their dosage forms. The proposed method was based on the interaction of any of the cited drugs with 2,4-dinitrophenylhydrazine in the presence of an acid catalyst, followed by treatment with a methanolic solution of potassium hydroxide; an intensely colored chromogen was formed that was measured in dimethylformamide, as the diluting solvent, at 542-545, 523-526, and 539-542 nm for AZ, CLA, and ROX, respectively. All variables affecting the development of the measured chromogens were studied and optimized. Beer's law was obeyed in the concentration ranges of 5-40, 5-35, and 5-35 microg/mL for AZ, CLA, and ROX, respectively, with good correlation coefficients (0.9991-0.9999). The limits of detection for this method ranged from 0.77 to 1.47 microg/mL, and the relative standard deviations were 1.24-1.8%. The proposed method was applied successfully to the determination of the 3 drugs in pure bulk form, tablets, and suspensions without interference from commonly encountered additives. The results compared favorably with those of a previously reported method. The mechanism of the reaction was also studied.

  17. UNCLES: method for the identification of genes differentially consistently co-expressed in a specific subset of datasets.

    PubMed

    Abu-Jamous, Basel; Fa, Rui; Roberts, David J; Nandi, Asoke K

    2015-06-04

    Collective analysis of the increasing number of emerging gene expression datasets is required. The recently proposed binarisation of consensus partition matrices (Bi-CoPaM) method can combine clustering results from multiple datasets to identify, in a tuneable manner, the subsets of genes which are consistently co-expressed in all of the provided datasets. However, validation of results and parameter setting are issues that complicate the design of such methods. Moreover, although it is common practice to test methods by applying them to synthetic datasets, the mathematical models used to synthesise such datasets are usually based on approximations which may not always be sufficiently representative of real datasets. Here, we propose an unsupervised method for the unification of clustering results from multiple datasets using external specifications (UNCLES). This method can identify the subsets of genes consistently co-expressed in a subset of datasets while being poorly co-expressed in another subset of datasets, as well as the subsets of genes consistently co-expressed in all given datasets. We also propose the M-N scatter plots validation technique and adopt it to set the parameters of UNCLES, such as the number of clusters, automatically. Additionally, we propose an approach for the synthesis of gene expression datasets using real data profiles in a way which combines the ground-truth knowledge of synthetic data with the realistic expression values of real data, and which therefore overcomes the problem of faithfulness in modelling synthetic expression data. By application to those datasets, we validate UNCLES while comparing it with other conventional clustering methods and, of particular relevance, biclustering methods. We further validate UNCLES by application to a set of 14 real genome-wide yeast datasets, as it produces focused clusters that conform well to known biological facts.
Furthermore, in-silico-based hypotheses regarding the function of a few previously unknown genes in those focused clusters are drawn. The UNCLES method, the M-N scatter plots technique, and the expression data synthesis approach will have wide application for the comprehensive analysis of genomic and other sources of multiple complex biological datasets. Moreover, the derived in-silico-based biological hypotheses represent subjects for future functional studies.

  18. HPLC–electrospray mass spectrometric assay for the determination of (R,R)-fenoterol in rat plasma

    PubMed Central

    Siluk, Danuta; Kim, Hee Seung; Cole, Tyler; Wainer, Irving W.

    2008-01-01

    A fast and specific liquid chromatography-mass spectrometry method for the determination of (R,R)-fenoterol ((R,R)-Fen) in rat plasma has been developed and validated. (R,R)-Fen was extracted from 125 µl of plasma using solid-phase extraction and analyzed on an Atlantis HILIC Silica 3-µm column. The mobile phase was composed of acetonitrile:ammonium acetate (pH 4.1; 20 mM) (85:15, v/v) at a flow rate of 0.2 ml/min. The lower limit of detection (LLOD) was 2 ng/ml. The procedure was validated and applied to the analysis of plasma samples from rats previously administered (R,R)-Fen as an intravenous bolus. PMID:18617349

  19. A new 4π(LS)-γ coincidence counter at NCBJ RC POLATOM with TDCR detector in the beta channel.

    PubMed

    Ziemek, T; Jęczmieniowski, A; Cacko, D; Broda, R; Lech, E

    2016-03-01

    A new 4π(LS)-γ coincidence system (TDCRG) was built at the NCBJ RC POLATOM. The counter consists of a TDCR detector in the beta channel and scintillation detector with NaI(Tl) crystal in the gamma channel. The system is equipped with a digital board with FPGA, which records and analyses coincidences in the TDCR detector and coincidences between the beta and gamma channels. The characteristics of the system and a scheme of the FPGA implementation with behavioral simulation are given. The TDCRG counter was validated by activity measurements on (14)C and (60)Co solutions standardized in RC POLATOM using previously validated methods. Copyright © 2015 Elsevier Ltd. All rights reserved.

  20. Project on Elite Athlete Commitment (PEAK): III. An examination of the external validity across gender, and the expansion and clarification of the Sport Commitment Model.

    PubMed

    Scanlan, Tara K; Russell, David G; Magyar, T Michelle; Scanlan, Larry A

    2009-12-01

    The Sport Commitment Model was further tested using the Scanlan Collaborative Interview Method to examine its generalizability to New Zealand's elite female amateur netball team, the Silver Ferns. Results supported or clarified Sport Commitment Model predictions, revealed avenues for model expansion, and elucidated the functions of perceived competence and enjoyment in the commitment process. A comparison and contrast of the in-depth interview data from the Silver Ferns with previous interview data from a comparable elite team of amateur male athletes allowed assessment of model external validity, tested the generalizability of the underlying mechanisms, and separated gender differences from discrepancies that simply reflected team or idiosyncratic differences.

  1. On cat's eyes and multiple disjoint cells natural convection flow in tall tilted cavities

    NASA Astrophysics Data System (ADS)

    Báez, Elsa; Nicolás, Alfredo

    2014-10-01

    Natural convection fluid flow in air-filled tall tilted cavities is studied numerically with a direct projection method applied to the unsteady Boussinesq approximation in primitive variables. The study focuses on the so-called cat's eyes and multiple disjoint cells that appear as the aspect ratio A and the angle of inclination ϕ of the cavity vary. Results have previously been reported in both primitive and stream function-vorticity variables; the former are validated against the latter, which in turn were validated through mesh-size and time-step independence studies. The new results, combined with the previous ones, reveal the invariant fluid-motion and heat-transfer properties of this thermal phenomenon, which is the novelty of this work.

  2. An Exploration Based Cognitive Bias Test for Mice: Effects of Handling Method and Stereotypic Behaviour.

    PubMed

    Novak, Janja; Bailoo, Jeremy D; Melotti, Luca; Rommen, Jonas; Würbel, Hanno

    2015-01-01

    Behavioural tests to assess affective states are widely used in human research and have recently been extended to animals. These tests assume that affective state influences cognitive processing, and that animals in a negative affective state interpret ambiguous information as predicting a negative outcome (displaying a negative cognitive bias). Most of these tests, however, require long discrimination training. The aim of this study was to validate an exploration-based cognitive bias test, using two different handling methods, since previous studies have shown that standard tail handling of mice increases physiological and behavioural measures of anxiety compared with cupped handling. We therefore hypothesised that tail-handled mice would display a negative cognitive bias. We handled 28 female CD-1 mice for 16 weeks using either tail handling or cupped handling. The mice were then trained in an eight-arm radial maze, where two adjacent arms predicted a positive outcome (darkness and food), while the two opposite arms predicted a negative outcome (no food, white noise and light). After six days of training, the mice were also given access to the four previously unavailable intermediate, ambiguous arms of the radial maze and tested for cognitive bias. We were unable to validate this test, as mice from both handling groups displayed a similar pattern of exploration. Furthermore, we examined whether maze exploration is affected by the expression of stereotypic behaviour in the home cage. Mice with higher levels of stereotypic behaviour spent more time in the positive arms and avoided the ambiguous arms, displaying a negative cognitive bias. While this test needs further validation, our results indicate that it may allow the assessment of affective state in mice with minimal training, a major confound in current cognitive bias paradigms.

  3. An Exploration Based Cognitive Bias Test for Mice: Effects of Handling Method and Stereotypic Behaviour

    PubMed Central

    Novak, Janja; Bailoo, Jeremy D.; Melotti, Luca; Rommen, Jonas; Würbel, Hanno

    2015-01-01

    Behavioural tests to assess affective states are widely used in human research and have recently been extended to animals. These tests assume that affective state influences cognitive processing, and that animals in a negative affective state interpret ambiguous information as predicting a negative outcome (displaying a negative cognitive bias). Most of these tests, however, require long discrimination training. The aim of this study was to validate an exploration-based cognitive bias test, using two different handling methods, since previous studies have shown that standard tail handling of mice increases physiological and behavioural measures of anxiety compared with cupped handling. We therefore hypothesised that tail-handled mice would display a negative cognitive bias. We handled 28 female CD-1 mice for 16 weeks using either tail handling or cupped handling. The mice were then trained in an eight-arm radial maze, where two adjacent arms predicted a positive outcome (darkness and food), while the two opposite arms predicted a negative outcome (no food, white noise and light). After six days of training, the mice were also given access to the four previously unavailable intermediate, ambiguous arms of the radial maze and tested for cognitive bias. We were unable to validate this test, as mice from both handling groups displayed a similar pattern of exploration. Furthermore, we examined whether maze exploration is affected by the expression of stereotypic behaviour in the home cage. Mice with higher levels of stereotypic behaviour spent more time in the positive arms and avoided the ambiguous arms, displaying a negative cognitive bias. While this test needs further validation, our results indicate that it may allow the assessment of affective state in mice with minimal training, a major confound in current cognitive bias paradigms. PMID:26154309

  4. Content Validity of Temporal Bone Models Printed Via Inexpensive Methods and Materials.

    PubMed

    Bone, T Michael; Mowry, Sarah E

    2016-09-01

    We hypothesized that computed tomographic (CT) scans of 3-D-printed temporal bone models would be within 15% accuracy of CT scans of the corresponding cadaveric temporal bones. Previous studies have evaluated the face validity of 3-D-printed temporal bone models designed to train otolaryngology residents. The purpose of this study was to determine the content validity of temporal bone models printed using inexpensive printers and materials. Four cadaveric temporal bones were randomly selected, and clinical temporal bone CT scans were obtained. Models were generated using previously described methods in acrylonitrile butadiene styrene (ABS) plastic using the Makerbot Replicator 2× and Hyrel printers. Models were radiographically scanned using the same protocol as the cadaveric bones. Four images from each cadaveric CT series and four corresponding images from the model CT series were selected, and voxel values were normalized to black or white. Scan slices were compared using PixelDiff software. Gross anatomic structures in the model scans were evaluated by four board-certified otolaryngologists on a 4-point scale. The mean pixel difference between the cadaver and model scans was 14.25 ± 2.30% at the four selected CT slices. The mean cortical bone width difference and mean external auditory canal width difference were 0.58 ± 0.66 mm and 0.55 ± 0.46 mm, respectively. Expert raters felt the mastoid air cells were well represented (2.5 ± 0.5), while middle ear and otic capsule structures were not accurately rendered (all averaged <1.8). These results suggest that the models would be sufficient adjuncts to cadaver temporal bones for training residents in cortical mastoidectomies, but less effective for middle ear procedures.

  5. Skin friction measurements by a new nonintrusive double-laser-beam oil viscosity balance technique

    NASA Technical Reports Server (NTRS)

    Monson, D. J.; Higuchi, H.

    1980-01-01

    A portable dual-laser-beam interferometer that nonintrusively measures skin friction by monitoring the thickness change of an oil film subject to shear stress is described. The method is an advance over past versions in that the troublesome and error-introducing need to measure the distance to the oil leading edge and the starting time for the oil flow has been eliminated. The validity of the method was verified by measuring oil viscosity in the laboratory, and then using those results to measure skin friction beneath the turbulent boundary layer in a low-speed wind tunnel. The dual-laser-beam skin friction measurements are compared with Preston tube measurements, with mean velocity profile data in a 'law-of-the-wall' coordinate system, and with computations based on turbulent boundary-layer theory. Excellent agreement is found in all cases. This validation and the aforementioned improvements appear to make the present form of the instrument usable to measure skin friction reliably and nonintrusively in a wide range of flow situations in which previous methods are not practical.

  6. Skin Friction Measurements by a Dual-Laser-Beam Interferometer Technique

    NASA Technical Reports Server (NTRS)

    Monson, D. J.; Higuchi, H.

    1981-01-01

    A portable dual-laser-beam interferometer that nonintrusively measures skin friction by monitoring the thickness change of an oil film subject to shear stress is described. The method is an advance over past versions in that the troublesome and error-introducing need to measure the distance to the oil leading edge and the starting time for the oil flow has been eliminated. The validity of the method was verified by measuring oil viscosity in the laboratory, and then using those results to measure skin friction beneath the turbulent boundary layer in a low-speed wind tunnel. The dual-laser-beam skin friction measurements are compared with Preston tube measurements, with mean velocity profile data in a "law-of-the-wall" coordinate system, and with computations based on turbulent boundary-layer theory. Excellent agreement is found in all cases. This validation and the aforementioned improvements appear to make the present form of the instrument usable to measure skin friction reliably and nonintrusively in a wide range of flow situations in which previous methods are not practical.

  7. Noninvasive recording of electrocardiogram in conscious rat: A new device.

    PubMed

    Kumar, Pradeep; Srivastava, Pooja; Gupta, Ankit; Bajpai, Manish

    2017-01-01

    The electrocardiogram (ECG) is an important tool for the study of cardiac electrophysiology in both human beings and experimental animals. Existing methods of ECG recording in small animals such as the rat have several limitations, and ECG recordings from the anesthetized rat lack validity for heart rate (HR) variability analysis. The aim of the present study was to validate the ECG data from the new device against the ECG of the anesthetized rat. The ECG was recorded on a student's physiograph (BioDevice, Ambala) with a suitable coupler and electrodes in six animals, first by the newly developed device in the conscious state and second in the anesthetized state (the established technique). Analysis of the data using an unpaired t-test showed no significant difference (P < 0.05) in QTc, QRS, and HR recorded by the new device and the established technique in rats. No previous study describes a similar ECG recording in the conscious state in rats. The present method may therefore be a highly physiological and inexpensive alternative to other methods. In this study the animals were not restrained, only gently secured, which represents a potential strength of the study.

  8. Predicting drug-induced liver injury using ensemble learning methods and molecular fingerprints.

    PubMed

    Ai, Haixin; Chen, Wen; Zhang, Li; Huang, Liangchao; Yin, Zimo; Hu, Huan; Zhao, Qi; Zhao, Jian; Liu, Hongsheng

    2018-05-21

    Drug-induced liver injury (DILI) is a major safety concern in the drug-development process, and various methods have been proposed to predict the hepatotoxicity of compounds during the early stages of drug trials. In this study, we developed an ensemble model using three machine learning algorithms and 12 molecular fingerprints from a dataset containing 1,241 diverse compounds. The ensemble model achieved an average accuracy of 71.1±2.6%, sensitivity of 79.9±3.6%, specificity of 60.3±4.8%, and area under the receiver operating characteristic curve (AUC) of 0.764±0.026 in five-fold cross-validation and an accuracy of 84.3%, sensitivity of 86.9%, specificity of 75.4%, and AUC of 0.904 in an external validation dataset of 286 compounds collected from the Liver Toxicity Knowledge Base (LTKB). Compared with previous methods, the ensemble model achieved relatively high accuracy and sensitivity. We also identified several substructures related to DILI. In addition, we provide a web server offering access to our models (http://ccsipb.lnu.edu.cn/toxicity/HepatoPred-EL/).
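
    The soft-voting idea behind such an ensemble, averaging base-model probabilities and then scoring accuracy, sensitivity, specificity and AUC, can be sketched as follows. This is an illustrative numpy sketch, not the paper's implementation; the model outputs and labels below are invented placeholders.

```python
import numpy as np

def ensemble_metrics(prob_lists, y_true, threshold=0.5):
    """Average base-model probabilities (soft voting) and score the ensemble.

    prob_lists : list of 1-D arrays of per-compound predicted probabilities,
                 one array per base model.
    y_true     : 1-D array of 0/1 labels (1 = hepatotoxic).
    Returns (accuracy, sensitivity, specificity, auc).
    """
    p = np.mean(prob_lists, axis=0)              # soft-voting ensemble
    y_pred = (p >= threshold).astype(int)
    tp = np.sum((y_pred == 1) & (y_true == 1))
    tn = np.sum((y_pred == 0) & (y_true == 0))
    fp = np.sum((y_pred == 1) & (y_true == 0))
    fn = np.sum((y_pred == 0) & (y_true == 1))
    acc = (tp + tn) / len(y_true)
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    # Rank-based AUC: probability that a random positive outranks a random
    # negative, with ties counted as one half.
    pos, neg = p[y_true == 1], p[y_true == 0]
    auc = (np.mean(pos[:, None] > neg[None, :])
           + 0.5 * np.mean(pos[:, None] == neg[None, :]))
    return acc, sens, spec, auc
```

    In the paper the base models are trained classifiers over molecular fingerprints; here any arrays of predicted probabilities will do.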

  9. Verification and Validation of Multisegmented Mooring Capabilities in FAST v8

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andersen, Morten T.; Wendt, Fabian F.; Robertson, Amy N.

    2016-07-01

    The quasi-static and dynamic mooring modules of the open-source aero-hydro-servo-elastic wind turbine simulation software, FAST v8, have previously been verified and validated, but only for mooring arrangements consisting of single lines connecting each fairlead and anchor. This paper extends the previous verification and validation efforts to focus on the multisegmented mooring capability of the FAST v8 modules: MAP++, MoorDyn, and the OrcaFlex interface. The OC3-Hywind spar buoy system tested by the DeepCwind consortium at the MARIN ocean basin, which includes a multisegmented bridle layout of the mooring system, was used for the verification and validation activities.

  10. Methods for Computationally Efficient Structured CFD Simulations of Complex Turbomachinery Flows

    NASA Technical Reports Server (NTRS)

    Herrick, Gregory P.; Chen, Jen-Ping

    2012-01-01

    This research presents more efficient computational methods by which to perform multi-block structured Computational Fluid Dynamics (CFD) simulations of turbomachinery, thus facilitating higher-fidelity solutions of complicated geometries and their associated flows. This computational framework offers flexibility in allocating resources to balance process count and wall-clock computation time, while facilitating research interests of simulating axial compressor stall inception with more complete gridding of the flow passages and rotor tip clearance regions than is typically practiced with structured codes. The paradigm presented herein facilitates CFD simulation of previously impractical geometries and flows. These methods are validated and demonstrate improved computational efficiency when applied to complicated geometries and flows.

  11. Three-dimensional optic axis determination using variable-incidence-angle polarization-optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Ugryumova, Nadezhda; Gangnus, Sergei V.; Matcher, Stephen J.

    2006-08-01

    Polarization-sensitive optical coherence tomography (PSOCT) is a powerful technique for nondestructively mapping the retardance and fast-axis orientation of birefringent biological tissues. Previous studies have concentrated on the case where the optic axis lies in the plane of the surface. We describe a method to determine the polar angle of the optic axis of a uniaxial birefringent tissue by making PSOCT measurements along a number of incident illumination directions. The method is validated on equine flexor tendon, yielding a variability of 4% for the true birefringence and 3% for the polar angle. We use the method to map the polar angle of fibers in the transitional region of equine cartilage.

  12. Wavelet-Based Motion Artifact Removal for Electrodermal Activity

    PubMed Central

    Chen, Weixuan; Jaques, Natasha; Taylor, Sara; Sano, Akane; Fedor, Szymon; Picard, Rosalind W.

    2017-01-01

    Electrodermal activity (EDA) recording is a powerful, widely used tool for monitoring psychological or physiological arousal. However, analysis of EDA is hampered by its sensitivity to motion artifacts. We propose a method for removing motion artifacts from EDA, measured as skin conductance (SC), using a stationary wavelet transform (SWT). We modeled the wavelet coefficients as a Gaussian mixture distribution corresponding to the underlying skin conductance level (SCL) and skin conductance responses (SCRs). The goodness-of-fit of the model was validated on ambulatory SC data. We evaluated the proposed method in comparison with three previous approaches. Our method achieved a greater reduction of artifacts while retaining motion-artifact-free data. PMID:26737714
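
    The SWT-based scheme described here can be sketched with a one-level undecimated Haar transform: split the signal into a slow approximation band (SCL-like) and a detail band (SCRs and fast artifacts), attenuate improbably large detail coefficients, and invert. This numpy sketch is a simplification of the paper's method, which uses a multi-level SWT and a fitted Gaussian mixture; the robust-MAD threshold below is a generic assumption, not the authors' rule.

```python
import numpy as np

def haar_swt(x):
    """One-level undecimated (stationary) Haar transform, circular boundary."""
    xs = np.roll(x, 1)
    cA = (x + xs) / np.sqrt(2.0)   # approximation: slow SCL-like component
    cD = (x - xs) / np.sqrt(2.0)   # detail: SCRs and fast transients
    return cA, cD

def haar_iswt(cA, cD):
    """Inverse of haar_swt, averaging the two redundant reconstructions."""
    r1 = (cA + cD) / np.sqrt(2.0)
    r2 = np.roll(cA - cD, -1) / np.sqrt(2.0)
    return 0.5 * (r1 + r2)

def suppress_artifacts(x, k=3.0):
    """Zero detail coefficients that are improbably large under a Gaussian
    model of normal activity, using a robust MAD scale estimate."""
    cA, cD = haar_swt(x)
    sigma = 1.4826 * np.median(np.abs(cD - np.median(cD)))
    cD = np.where(np.abs(cD) > k * sigma, 0.0, cD)
    return haar_iswt(cA, cD)
```

    Without thresholding the transform reconstructs the input exactly; with it, large transient spikes are attenuated while the slow conductance level passes through.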

  13. Diagnosing Students' Understanding of the Nature of Models

    NASA Astrophysics Data System (ADS)

    Gogolin, Sarah; Krüger, Dirk

    2017-10-01

    Students' understanding of models in science has been subject to a number of investigations. The instruments the researchers used are suitable for educational research but, due to their complexity, cannot be employed directly by teachers. This article presents forced choice (FC) tasks, which, assembled as a diagnostic instrument, are supposed to measure students' understanding of the nature of models efficiently, while being sensitive enough to detect differences between individuals. In order to evaluate if the diagnostic instrument is suitable for its intended use, we propose an approach that complies with the demand to integrate students' responses to the tasks into the validation process. Evidence for validity was gathered based on relations to other variables and on students' response processes. Students' understanding of the nature of models was assessed using three methods: FC tasks, open-ended tasks and interviews (N = 448). Furthermore, concurrent think-aloud protocols (N = 30) were performed. The results suggest that the method and the age of the students have an effect on their understanding of the nature of models. A good understanding of the FC tasks as well as a convergence in the findings across the three methods was documented for grades eleven and twelve. This indicates that teachers can use the diagnostic instrument for an efficient and, at the same time, valid diagnosis for this group. Finally, the findings of this article may provide a possible explanation for alternative findings from previous studies as a result of specific methods that were used.

  14. Landing Gear Noise Prediction and Analysis for Tube-and-Wing and Hybrid-Wing-Body Aircraft

    NASA Technical Reports Server (NTRS)

    Guo, Yueping; Burley, Casey L.; Thomas, Russell H.

    2016-01-01

    Improvements and extensions to landing gear noise prediction methods are developed. New features include installation effects such as reflection from the aircraft, gear truck angle effect, local flow calculation at the landing gear locations, gear size effect, and directivity for various gear designs. These new features have not only significantly improved the accuracy and robustness of the prediction tools, but also have enabled applications to unconventional aircraft designs and installations. Systematic validations of the improved prediction capability are then presented, including parametric validations in functional trends as well as validations in absolute amplitudes, covering a wide variety of landing gear designs, sizes, and testing conditions. The new method is then applied to selected concept aircraft configurations in the portfolio of the NASA Environmentally Responsible Aviation Project envisioned for the timeframe of 2025. The landing gear noise levels are on the order of 2 to 4 dB higher than previously reported predictions due to increased fidelity in accounting for installation effects and gear design details. With the new method, it is now possible to reveal and assess the unique noise characteristics of landing gear systems for each type of aircraft. To address the inevitable uncertainties in predictions of landing gear noise models for future aircraft, an uncertainty analysis is given, using the method of Monte Carlo simulation. The standard deviation of the uncertainty in predicting the absolute level of landing gear noise is quantified and determined to be 1.4 EPNL dB.
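
    The Monte Carlo uncertainty analysis mentioned above can be illustrated with a minimal sketch: sample uncertain component noise levels, sum them on an energy basis, and take the standard deviation of the resulting total level. The component names, mean levels, and sigmas below are invented for illustration and are not the paper's values.

```python
import numpy as np

rng = np.random.default_rng(0)

def total_level_db(component_db):
    """Incoherent (energy) sum of component noise levels, in dB."""
    return 10.0 * np.log10(np.sum(10.0 ** (component_db / 10.0), axis=-1))

# Hypothetical mean levels (dB) for three gear noise components, each with
# an assumed 1-sigma modelling uncertainty in dB.
means = np.array([88.0, 85.0, 80.0])
sigmas = np.array([1.5, 1.5, 2.0])

# Monte Carlo: perturb component levels, recompute the total, take the spread.
samples = rng.normal(means, sigmas, size=(100_000, 3))
totals = total_level_db(samples)
print(f"total = {totals.mean():.1f} dB, 1-sigma uncertainty = {totals.std():.2f} dB")
```

    Note how the loudest component dominates the uncertainty of the total, a generic feature of energy-summed noise predictions.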

  15. Spontaneous Swallow Frequency Compared with Clinical Screening in the Identification of Dysphagia in Acute Stroke

    PubMed Central

    Crary, Michael A.; Carnaby, Giselle D.; Sia, Isaac

    2017-01-01

    Background The aim of this study was to compare spontaneous swallow frequency analysis (SFA) with clinical screening protocols for identification of dysphagia in acute stroke. Methods In all, 62 patients with acute stroke were evaluated for spontaneous swallow frequency rates using a validated acoustic analysis technique. Independent of SFA, these same patients received a routine nurse-administered clinical dysphagia screening as part of standard stroke care. Both screening tools were compared against a validated clinical assessment of dysphagia for acute stroke. In addition, psychometric properties of SFA were compared against published, validated clinical screening protocols. Results Spontaneous SFA differentiates patients with versus without dysphagia after acute stroke. Using a previously identified cut point based on swallows per minute, spontaneous SFA demonstrated superior ability to identify dysphagia cases compared with a nurse-administered clinical screening tool. In addition, spontaneous SFA demonstrated equal or superior psychometric properties to 4 validated, published clinical dysphagia screening tools. Conclusions Spontaneous SFA has high potential to identify dysphagia in acute stroke with psychometric properties equal or superior to clinical screening protocols. PMID:25088166

  16. HLPI-Ensemble: Prediction of human lncRNA-protein interactions based on ensemble strategy.

    PubMed

    Hu, Huan; Zhang, Li; Ai, Haixin; Zhang, Hui; Fan, Yetian; Zhao, Qi; Liu, Hongsheng

    2018-03-27

    LncRNAs play an important role in many biological processes and in disease progression by binding to related proteins. However, experimental methods for studying lncRNA-protein interactions are time-consuming and expensive. Although a few models have been designed to predict ncRNA-protein interactions, they share common drawbacks that limit their predictive performance. In this study, we present a model called HLPI-Ensemble, designed specifically for human lncRNA-protein interactions. HLPI-Ensemble adopts an ensemble strategy based on three mainstream machine learning algorithms, Support Vector Machines (SVM), Random Forests (RF) and Extreme Gradient Boosting (XGB), to generate HLPI-SVM Ensemble, HLPI-RF Ensemble and HLPI-XGB Ensemble, respectively. The results of 10-fold cross-validation show that HLPI-SVM Ensemble, HLPI-RF Ensemble and HLPI-XGB Ensemble achieved AUCs of 0.95, 0.96 and 0.96, respectively, on the test dataset. Furthermore, we compared the performance of the HLPI-Ensemble models with previous models on an external validation dataset. The results show that the false positives (FPs) of the HLPI-Ensemble models are much lower than those of the previous models, and that the other evaluation indicators of the HLPI-Ensemble models are also higher. This further shows that the HLPI-Ensemble models are superior to previous models in predicting human lncRNA-protein interactions. HLPI-Ensemble is publicly available at: http://ccsipb.lnu.edu.cn/hlpiensemble/

  17. Fired Cartridge Case Identification Using Optical Images and the Congruent Matching Cells (CMC) Method

    PubMed Central

    Tong, Mingsi; Song, John; Chu, Wei; Thompson, Robert M

    2014-01-01

    The Congruent Matching Cells (CMC) method for ballistics identification was invented at the National Institute of Standards and Technology (NIST). The CMC method is based on the correlation of pairs of small correlation cells instead of the correlation of entire images. Four identification parameters – TCCF, Tθ, Tx and Ty are proposed for identifying correlated cell pairs originating from the same firearm. The correlation conclusion (matching or non-matching) is determined by whether the number of CMC is ≥ 6. This method has been previously validated using a set of 780 pair-wise 3D topography images. However, most ballistic images stored in current local and national databases are in an optical intensity (grayscale) format. As a result, the reliability of applying the CMC method on optical intensity images is an important issue. In this paper, optical intensity images of breech face impressions captured on the same set of 40 cartridge cases are correlated and analyzed for the validation test of CMC method using optical images. This includes correlations of 63 pairs of matching images and 717 pairs of non-matching images under top ring lighting. Tests of the method do not produce any false identification (false positive) or false exclusion (false negative) results, which support the CMC method and the proposed identification criterion, C = 6, for firearm breech face identifications using optical intensity images. PMID:26601045

  18. Fired Cartridge Case Identification Using Optical Images and the Congruent Matching Cells (CMC) Method.

    PubMed

    Tong, Mingsi; Song, John; Chu, Wei; Thompson, Robert M

    2014-01-01

    The Congruent Matching Cells (CMC) method for ballistics identification was invented at the National Institute of Standards and Technology (NIST). The CMC method is based on the correlation of pairs of small correlation cells instead of the correlation of entire images. Four identification parameters - TCCF, Tθ, Tx and Ty - are proposed for identifying correlated cell pairs originating from the same firearm. The correlation conclusion (matching or non-matching) is determined by whether the number of CMC is ≥ 6. This method has been previously validated using a set of 780 pair-wise 3D topography images. However, most ballistic images stored in current local and national databases are in an optical intensity (grayscale) format. As a result, the reliability of applying the CMC method on optical intensity images is an important issue. In this paper, optical intensity images of breech face impressions captured on the same set of 40 cartridge cases are correlated and analyzed for the validation test of the CMC method using optical images. This includes correlations of 63 pairs of matching images and 717 pairs of non-matching images under top ring lighting. Tests of the method do not produce any false identification (false positive) or false exclusion (false negative) results, which support the CMC method and the proposed identification criterion, C = 6, for firearm breech face identifications using optical intensity images.
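
    The decision rule described in these two records (a cell pair is congruent when its correlation value and registration offsets all fall within thresholds, and a match is declared when the CMC count reaches C = 6) can be sketched as follows. The threshold values here are illustrative placeholders, not NIST's published settings.

```python
def count_cmc(cells, t_ccf=0.5, t_theta=6.0, t_x=20.0, t_y=20.0):
    """Count congruent matching cells.

    cells : iterable of (ccf, dtheta, dx, dy) tuples, one per registered cell
            pair, where ccf is the cross-correlation maximum and (dtheta, dx,
            dy) are the cell's registration offsets relative to the consensus
            pattern. Thresholds are hypothetical, for illustration only.
    """
    n = 0
    for ccf, dtheta, dx, dy in cells:
        if (ccf >= t_ccf and abs(dtheta) <= t_theta
                and abs(dx) <= t_x and abs(dy) <= t_y):
            n += 1
    return n

def same_firearm(cells, c=6):
    """Identification criterion: declare a match when the CMC count >= C."""
    return count_cmc(cells) >= c
```

    A genuine match tends to yield many congruent cells at a common registration, while a non-match yields scattered offsets and few, if any, congruent cells.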

  19. Risk prediction models of breast cancer: a systematic review of model performances.

    PubMed

    Anothaisintawee, Thunyarat; Teerawattananon, Yot; Wiratkapun, Chollathip; Kasamesup, Vijj; Thakkinstian, Ammarin

    2012-05-01

    A growing number of risk prediction models have been developed to estimate breast cancer risk in individual women. However, the performance of these models is questionable. We therefore conducted a study to systematically review previous risk prediction models. The results of this review help to identify the most reliable model and indicate the strengths and weaknesses of each model, to guide future model development. We searched MEDLINE (PubMed) from 1949 and EMBASE (Ovid) from 1974 until October 2010. Observational studies that constructed models using regression methods were selected. Information about model development and performance was extracted. Twenty-five out of 453 studies were eligible. Of these, 18 developed prediction models and 7 validated existing prediction models. Up to 13 variables were included in the models, and sample sizes for each study ranged from 550 to 2,404,636. Internal validation was performed for four models, while five models had external validation. The Gail model and the Rosner and Colditz model were the significant models that were subsequently modified by other scholars. Calibration performance of most models was fair to good (expected/observed ratio: 0.87-1.12), but discriminatory accuracy was poor to fair both in internal validation (concordance statistic: 0.53-0.66) and in external validation (concordance statistic: 0.56-0.63). Most models yielded relatively poor discrimination in both internal and external validation. This poor discriminatory accuracy might reflect a lack of knowledge about risk factors, heterogeneous subtypes of breast cancer, and different distributions of risk factors across populations. In addition, the concordance statistic itself is insensitive to improvements in discrimination. Therefore, newer methods such as the net reclassification index should be considered when evaluating the performance of a newly developed model.
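
    The two performance measures quoted throughout this review, the expected/observed (E/O) calibration ratio and the concordance statistic, can be computed directly from predicted risks and observed outcomes. A minimal numpy sketch, with invented data in the usage example:

```python
import numpy as np

def expected_observed_ratio(pred_risk, outcomes):
    """Calibration: expected events (sum of predicted risks) divided by
    observed events. A ratio near 1 indicates good calibration."""
    return np.sum(pred_risk) / np.sum(outcomes)

def concordance_statistic(pred_risk, outcomes):
    """Discrimination: probability that a randomly chosen case receives a
    higher predicted risk than a randomly chosen non-case (ties count 1/2).
    0.5 is chance-level; 1.0 is perfect discrimination."""
    pred_risk = np.asarray(pred_risk, dtype=float)
    outcomes = np.asarray(outcomes)
    cases = pred_risk[outcomes == 1]
    controls = pred_risk[outcomes == 0]
    greater = np.mean(cases[:, None] > controls[None, :])
    ties = np.mean(cases[:, None] == controls[None, :])
    return greater + 0.5 * ties
```

    The concordance values of 0.53-0.66 reported above sit only slightly above the chance level of 0.5, which is what the review means by poor discrimination.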

  20. Automated finite element meshing of the lumbar spine: Verification and validation with 18 specimen-specific models.

    PubMed

    Campbell, J Q; Coombs, D J; Rao, M; Rullkoetter, P J; Petrella, A J

    2016-09-06

    The purpose of this study was to seek broad verification and validation of human lumbar spine finite element models created using a previously published automated algorithm. The automated algorithm takes segmented CT scans of lumbar vertebrae, automatically identifies important landmarks and contact surfaces, and creates a finite element model. Mesh convergence was evaluated by examining changes in key output variables in response to mesh density. Semi-direct validation was performed by comparing experimental results for a single specimen to the automated finite element model results for that specimen with calibrated material properties from a prior study. Indirect validation was based on a comparison of results from automated finite element models of 18 individual specimens, all using one set of generalized material properties, to a range of data from the literature. A total of 216 simulations were run and compared to 186 experimental data ranges in all six primary bending modes up to 7.8 Nm with follower loads up to 1000 N. Mesh convergence results showed less than a 5% difference in key variables when the original mesh density was doubled. The semi-direct validation results showed that the automated method produced results comparable to manual finite element modeling methods. The indirect validation results showed a wide range of outcomes due to variations in the geometry alone. The studies showed that the automated models can be used to reliably evaluate lumbar spine biomechanics, specifically within our intended context of use: in pure bending modes, under relatively low non-injurious simulated in vivo loads, to predict torque-rotation response, disc pressures, and facet forces. Copyright © 2016 Elsevier Ltd. All rights reserved.

  1. An extended validation of the last generation of particle finite element method for free surface flows

    NASA Astrophysics Data System (ADS)

    Gimenez, Juan M.; González, Leo M.

    2015-03-01

    In this paper, a new generation of the particle method known as the Particle Finite Element Method (PFEM), which combines convective particle movement with a fixed mesh resolution, is applied to free surface flows. This variant, previously described in the literature as PFEM-2, can use larger time steps than other similar numerical tools, which implies shorter computational times while maintaining the accuracy of the computation. PFEM-2 has already been extended to free surface problems, and the main topic of this paper is a deep validation of this methodology for a wider range of flows. To accomplish this task, improved versions of discontinuous and continuous enriched basis functions for the pressure field have been developed to capture the free surface dynamics without artificial diffusion or undesired numerical effects when different density ratios are involved. A collection of problems has been carefully selected such that a wide variety of Froude numbers, density ratios and dominant dissipative cases are covered, with the intention of presenting a general methodology, not restricted to a particular range of parameters, that is capable of using large time steps. The results of the free-surface problems solved, which include the Rayleigh-Taylor instability, sloshing problems, viscous standing waves and the dam break problem, are compared with well-validated numerical alternatives or experimental measurements, obtaining accurate approximations for such complex flows.

  2. Automated solid-phase extraction-liquid chromatography-tandem mass spectrometry analysis of 11-nor-Delta9-tetrahydrocannabinol-9-carboxylic acid in human urine specimens: application to a high-throughput urine analysis laboratory.

    PubMed

    Robandt, P V; Klette, K L; Sibum, M

    2009-10-01

    An automated solid-phase extraction coupled with liquid chromatography and tandem mass spectrometry (SPE-LC-MS-MS) method for the analysis of 11-nor-Delta(9)-tetrahydrocannabinol-9-carboxylic acid (THC-COOH) in human urine specimens was developed. The method was linear (R(2) = 0.9986) to 1000 ng/mL with no carryover evidenced at 2000 ng/mL. Limits of quantification and detection were found to be 2 ng/mL. Interrun precision was evaluated at the 15 ng/mL level over nine batches spanning 15 days (n = 45); the coefficient of variation (%CV) was found to be 5.5% over the course of the validation. Intrarun precision of a 15 ng/mL control (n = 5) ranged from 0.58% CV to 7.4% CV for the same set of analytical batches. Interference was tested using (+/-)-11-hydroxy-Delta(9)-tetrahydrocannabinol, cannabidiol, (-)-Delta(8)-tetrahydrocannabinol, and cannabinol. One hundred and nineteen specimens previously found to contain THC-COOH by a validated gas chromatography-mass spectrometry (GC-MS) procedure were reanalyzed with the SPE-LC-MS-MS method, and excellent agreement was found (R(2) = 0.9925) in the parallel comparison study. The automated SPE procedure eliminates the human factors of specimen handling, extraction, and derivatization, thereby reducing labor costs and rework resulting from human error or technique issues. Additionally, method runtime is greatly reduced (e.g., during parallel studies the SPE-LC-MS-MS instrument was often finished with analysis by the time the technician finished the offline SPE and derivatization procedure prior to the GC-MS analysis).

  3. Assessment of mitral regurgitation severity by Doppler color flow mapping of the vena contracta in dogs.

    PubMed

    Di Marcello, M; Terzo, E; Locatelli, C; Palermo, V; Sala, E; Dall'Aglio, E; Bussadori, C M; Spalla, I; Brambilla, P G

    2014-01-01

    Quantitative and semiquantitative methods have been proposed for the assessment of mitral regurgitation (MR) severity, though all are associated with limitations. Measurement of vena contracta width (VCW) has been used in clinical practice. The aim was to measure the VCW in dogs with different levels of MR severity. In this retrospective study, two hundred and seventy-nine dogs were classified according to five levels of MR severity. Effective regurgitant orifice area (EROA) and regurgitant volume (RV), calculated by the proximal isovelocity surface area (PISA) method, were measured and indexed to body surface area (BSA). Descriptive statistics were calculated for VCW and the VCW index for all categories of MR severity. Spearman's rank correlation coefficients (ρs) were calculated to compare the results of the different methods (VCW and VCW index vs RV PISA, RV PISA index, EROA and EROA index), and between VCW and VCW index versus MR severity. All Spearman's rank correlation coefficients were significant (P < .001). The median VCW was 2.9 mm (IQR 2.5-3.4) in the group previously classified as mild-to-moderate and 4.6 mm (IQR 4.1-5.4) in the moderate-to-severe group. The median VCW index was 4.4 mm/m(2) (IQR 4.2-5.5) in mild-to-moderate MR and 10.8 mm/m(2) (IQR 9.4-12.8) in moderate-to-severe MR. Although this is not a validation study against a previously validated invasive gold standard, the VCW method proved easy to employ and might be an additional tool for quantifying disease severity that supports, rather than replaces, data from other techniques in daily clinical practice and research. Copyright © 2014 by the American College of Veterinary Internal Medicine.
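
    Agreement between methods of this kind is typically quantified with Spearman's rank correlation. A minimal sketch using SciPy; the paired VCW/EROA values below are hypothetical, not the study's measurements.

```python
from scipy.stats import spearmanr

# Hypothetical paired measurements: vena contracta width (mm) and
# effective regurgitant orifice area (cm^2) for six dogs.
vcw  = [2.5, 2.9, 3.4, 4.1, 4.6, 5.4]
eroa = [0.10, 0.12, 0.18, 0.25, 0.30, 0.41]

rho, p = spearmanr(vcw, eroa)
print(rho)  # 1.0 for these monotonically increasing sample values
```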

  4. Level A in vitro-in vivo correlation: Application to establish a dissolution test for artemether and lumefantrine tablets.

    PubMed

    Rivelli, Graziella Gomes; Ricoy, Letícia Brandão Magalhães; César, Isabela Costa; Fernandes, Christian; Pianetti, Gérson Antônio

    2018-06-05

    Malaria is the most prevalent parasitic infection worldwide. Artemisinin-based combination therapy (ACT) has been proposed as a promising treatment for malaria, and artemether + lumefantrine (20 + 120 mg) is the recommended combination in endemic areas. Despite its widespread use, there is still scarce information about the dissolution of artemether and lumefantrine, reflected in the absence of a specific method in pharmacopoeias and international compendia. Because of their low solubility, both artemether and lumefantrine are candidates for in vitro-in vivo correlation (IVIVC) studies. Previous equilibrium solubility studies were carried out for both drugs using the shake-flask method, together with dissolution profiles. Experiments were conducted over a range of parameters such as medium composition, pH and surfactants. In vivo data obtained in a previous pharmacokinetic study were used to select the optimum conditions for the dissolution test, based on IVIVC. For drug quantitation, a selective method by high-performance liquid chromatography was optimized and validated. For this dosage form, the best dissolution conditions found for artemether were: paddles, 900 mL of dissolution medium containing phosphate buffer pH 6.8 with 1.0% sodium lauryl sulfate, and a rotation speed of 100 rpm. The same conditions were obtained for lumefantrine, except for the dissolution medium, which was pH 1.2 with 1.0% polysorbate 80. After obtaining the curve of in vitro dissolved fraction versus in vivo absorbed fraction, the calculated coefficient of determination (R squared) was close to 1.00 for both drugs, indicating a level A correlation. Therefore, a novel method for assessing the dissolution of artemether and lumefantrine tablets was established and validated. Copyright © 2018 Elsevier B.V. All rights reserved.
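
    A level A correlation rests on regressing the in vivo absorbed fraction on the in vitro dissolved fraction at matched time points and checking that R squared is close to 1. A minimal sketch with hypothetical fractions (not the study's data):

```python
import numpy as np

# Hypothetical level-A IVIVC data: fraction dissolved in vitro vs
# fraction absorbed in vivo at matched time points.
f_dissolved = np.array([0.12, 0.35, 0.58, 0.75, 0.90, 0.98])
f_absorbed  = np.array([0.10, 0.33, 0.55, 0.74, 0.88, 0.97])

slope, intercept = np.polyfit(f_dissolved, f_absorbed, 1)
pred = slope * f_dissolved + intercept
ss_res = np.sum((f_absorbed - pred) ** 2)
ss_tot = np.sum((f_absorbed - f_absorbed.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot
print(round(r_squared, 3))  # close to 1.00, consistent with a level A correlation
```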

  5. A comparison of reliability and construct validity between the original and revised versions of the Rosenberg Self-Esteem Scale.

    PubMed

    Wongpakaran, Tinakon; Wongpakaran, Nahathai

    2012-03-01

    The Rosenberg Self-Esteem Scale (RSES) is a widely used instrument that has been tested for reliability and validity in many settings; however, some negatively worded items appear to have caused it to show low reliability in a number of studies. In this study, we revised the one negative item that had produced the worst outcome in previous studies in terms of the structure of the scale, then re-analyzed the new version for its reliability and construct validity, comparing it to the original version with respect to fit indices. In total, 851 students from Chiang Mai University (mean age: 19.51±1.7, 57% of whom were female) participated in this study. Of these, 664 students completed the Thai version of the original RSES, containing five positively worded and five negatively worded items, while 187 students used the revised version containing six positively worded and four negatively worded items. Confirmatory factor analysis was applied, using a uni-dimensional model with method effects and a correlated uniqueness approach. The revised version showed the same level of reliability (good) as the original, but yielded a better model fit. The revised RSES demonstrated excellent fit statistics, with χ²=29.19 (df=19, n=187, p=0.063), GFI=0.970, TLI=0.969, NFI=0.964, CFI=0.987, SRMR=0.040 and RMSEA=0.054. The revised version of the Thai RSES demonstrated an equivalent level of reliability but better construct validity when compared to the original.
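
    The reported RMSEA can be cross-checked from the quoted χ², df and n using the standard formula RMSEA = sqrt(max(χ² − df, 0) / (df · (n − 1))):

```python
import math

# Fit statistics reported above for the revised RSES
chi2, df, n = 29.19, 19, 187

rmsea = math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))
print(round(rmsea, 3))  # 0.054, matching the reported RMSEA
```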

  6. Hierarchical Clustering on the Basis of Inter-Job Similarity as a Tool in Validity Generalization

    ERIC Educational Resources Information Center

    Mobley, William H.; Ramsay, Robert S.

    1973-01-01

    The present research was stimulated by three related problems frequently faced in validation research: viable procedures for combining similar jobs in order to assess the validity of various predictors, for assessing groups of jobs represented in previous validity studies, and for assessing the applicability of validity findings between units.…

  7. The Philosophy, Theoretical Bases, and Implementation of the AHAAH Model for Evaluation of Hazard from Exposure to Intense Sounds

    DTIC Science & Technology

    2018-04-01

    empirical, external energy-damage correlation methods for evaluating hearing damage risk associated with impulsive noise exposure. AHAAH applies the...is validated against the measured results of human exposures to impulsive sounds, and unlike wholly empirical correlation approaches, AHAAH’s...a measured level (LAEQ8 of 85 dB). The approach in MIL-STD-1474E is very different. Previous standards tried to find a correlation between some

  8. Assessment of effect of Yb3+ ion pairs on a highly Yb-doped double-clad fibre laser

    NASA Astrophysics Data System (ADS)

    Vallés, J. A.; Martín, J. C.; Berdejo, V.; Cases, R.; Álvarez, J. M.; Rebolledo, M. Á.

    2018-03-01

    Using a previously validated characterization method based on careful measurement of the characteristic parameters and fluorescence emission spectra of a highly Yb-doped double-clad fibre, we evaluate the contribution of ion-pair-induced processes to the output power of a double-clad Yb-doped fibre ring laser. This contribution is shown to be insignificant, contrary to the analyses of other authors, who overestimate the role of ion pairs.

  9. Limb-Enhancer Genie: An accessible resource of accurate enhancer predictions in the developing limb

    DOE PAGES

    Monti, Remo; Barozzi, Iros; Osterwalder, Marco; ...

    2017-08-21

    Epigenomic mapping of enhancer-associated chromatin modifications facilitates the genome-wide discovery of tissue-specific enhancers in vivo. However, reliance on single chromatin marks leads to high rates of false-positive predictions. More sophisticated, integrative methods have been described, but commonly suffer from limited accessibility to the resulting predictions and reduced biological interpretability. Here we present the Limb-Enhancer Genie (LEG), a collection of highly accurate, genome-wide predictions of enhancers in the developing limb, available through a user-friendly online interface. We predict limb enhancers using a combination of > 50 published limb-specific datasets and clusters of evolutionarily conserved transcription factor binding sites, taking advantage of the patterns observed at previously in vivo validated elements. By combining different statistical models, our approach outperforms current state-of-the-art methods and provides interpretable measures of feature importance. Our results indicate that including a previously unappreciated score that quantifies tissue-specific nuclease accessibility significantly improves prediction performance. We demonstrate the utility of our approach through in vivo validation of newly predicted elements. Moreover, we describe general features that can guide the type of datasets to include when predicting tissue-specific enhancers genome-wide, while providing an accessible resource to the general biological community and facilitating the functional interpretation of genetic studies of limb malformations.

  10. Validation of a Commercially Available Enzyme ImmunoAssay for the Determination of Oxytocin in Plasma Samples from Seven Domestic Animal Species.

    PubMed

    Bienboire-Frosini, Cecile; Chabaud, Camille; Cozzi, Alessandro; Codecasa, Elisa; Pageat, Patrick

    2017-01-01

    The neurohormone oxytocin (OT) has a broad range of behavioral effects in mammals. It modulates a multitude of social behaviors, e.g., affiliative and sexual interactions. Consequently, the role of OT in various animal species is increasingly explored. However, several issues have been raised regarding peripheral OT measurement. Indeed, various methods have been described, leading to assay discrepancies and inconsistent results. This highlights the need for a recognized and reliable method to measure peripheral OT. Our aim was to validate a method combining a pre-extraction step, previously demonstrated as essential by several authors, with a commercially available enzyme immunoassay (EIA) for OT measurement, using plasma from seven domestic species (cat, dog, horse, cow, pig, sheep, and goat). The Oxytocin EIA kit (Enzo Life Sciences) was used to assay the solid-phase extracted samples following the manufacturer's instructions with slight modifications. For all species except dogs and cats, concentration factors were applied to work above the kit's sensitivity (15 pg/ml). To validate the method, the following performance characteristics were evaluated using validation samples (VS) at various concentrations in each species: extraction efficiency via spiking tests and intra- and inter-assay precision, allowing for the calculation of total errors. Parallelism studies to assess matrix effects could not be performed because basal concentrations were too low. Quantification ranges and associated precision profiles were established to account for the various OT plasma concentrations in each species. According to guidelines for bioanalytical validation of immunoassays, the measurements were sufficiently precise and accurate in each species to achieve a total error ≤30% in each VS sample. In each species, the inter-assay precision after 3 runs was acceptable, except in low-concentration samples. The linearity under dilution of dog and cat samples was verified. Although matrix effects assessments are lacking, our results indicate that OT plasma levels can be reliably measured in several domestic animal species by the method described here. Studies involving samples with low OT plasma concentrations should pay attention to reproducibility issues. This work opens new perspectives to reliably study peripheral OT in a substantial number of domestic animal species in various behavioral contexts.

  11. Resolved-particle simulation by the Physalis method: Enhancements and new capabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sierakowski, Adam J., E-mail: sierakowski@jhu.edu; Prosperetti, Andrea; Faculty of Science and Technology and J.M. Burgers Centre for Fluid Dynamics, University of Twente, P.O. Box 217, 7500 AE Enschede

    2016-03-15

    We present enhancements and new capabilities of the Physalis method for simulating disperse multiphase flows using particle-resolved simulation. The current work enhances the previous method by incorporating a new type of pressure-Poisson solver that couples with a new Physalis particle pressure boundary condition scheme and a new particle interior treatment to significantly improve overall numerical efficiency. Further, we implement a more efficient method of calculating the Physalis scalar products and incorporate short-range particle interaction models. We provide validation and benchmarking for the Physalis method against experiments of a sedimenting particle and of normal wall collisions. We conclude with an illustrative simulation of 2048 particles sedimenting in a duct. In the appendix, we present a complete and self-consistent description of the analytical development and numerical methods.

  12. Genetic demographic networks: Mathematical model and applications.

    PubMed

    Kimmel, Marek; Wojdyła, Tomasz

    2016-10-01

    Recent improvement in the quality of genetic data obtained from extinct human populations and their ancestors encourages searching for answers to basic questions regarding human population history. The most common and successful are model-based approaches, in which genetic data are compared to data obtained from an assumed demography model. Using such an approach, it is possible to either validate or adjust the assumed demography. Model fit to data can be obtained based on reverse-time coalescent simulations or forward-time simulations. In this paper we introduce a computational method based on a mathematical equation that allows one to obtain joint distributions of pairs of individuals under a specified demography model, each of them characterized by a genetic variant at a chosen locus. The two individuals are randomly sampled from either the same or two different populations. The model assumes three types of demographic events (split, merge and migration). Populations evolve according to the time-continuous Moran model with drift and Markov-process mutation. This latter process is described by the Lyapunov-type equation introduced by O'Brien and generalized in our previous works. Application of this equation constitutes an original contribution. In the results section of the paper we present sample applications of our model to both simulated and literature-based demographies. Among others, we include a study of the Slavs-Balts-Finns genetic relationship, in which we model splits and migrations between the Balts and Slavs. We also include another example that involves the migration rates between farmers and hunter-gatherers, based on modern and ancient DNA samples; this latter process was previously studied using coalescent simulations. Our results are in general agreement with the previous method, which provides validation of our approach. Although our model is not an alternative to simulation methods in the practical sense, it provides an algorithm to compute pairwise distributions of alleles in the case of haploid non-recombining loci, such as mitochondrial and Y-chromosome loci in humans. Copyright © 2016 Elsevier Inc. All rights reserved.

  13. The concurrent validity and reliability of a low-cost, high-speed camera-based method for measuring the flight time of vertical jumps.

    PubMed

    Balsalobre-Fernández, Carlos; Tejero-González, Carlos M; del Campo-Vecino, Juan; Bavaresco, Nicolás

    2014-02-01

    Flight time is the most accurate and frequently used variable when assessing the height of vertical jumps. The purpose of this study was to analyze the validity and reliability of an alternative method (i.e., the HSC-Kinovea method) for measuring the flight time and height of vertical jumps using a low-cost high-speed Casio Exilim FH-25 camera (HSC). To this end, 25 subjects performed a total of 125 vertical jumps on an infrared (IR) platform while simultaneously being recorded with the HSC at 240 fps. Subsequently, 2 observers with no experience in video analysis analyzed the 125 videos independently using the open-license Kinovea 0.8.15 software. The flight times obtained were then converted into vertical jump heights, and the intraclass correlation coefficient (ICC), Bland-Altman plot, and Pearson correlation coefficient were calculated for those variables. The results showed perfect agreement (ICC = 1, p < 0.0001) between both observers' measurements of flight time and jump height, and highly reliable agreement (ICC = 0.997, p < 0.0001) between the observers' measurements of flight time and jump height using the HSC-Kinovea method and those obtained using the IR system, explaining 99.5% (p < 0.0001) of the variance shared with the IR platform. As a result, besides requiring no previous experience with this technology, the HSC-Kinovea method can be considered to provide similarly valid and reliable measurements of flight time and vertical jump height as more expensive (i.e., IR) equipment. As such, coaches from many sports could use the HSC-Kinovea method to measure the flight time and height of their athletes' vertical jumps.
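
    Converting flight time to jump height relies on the standard projectile relation h = g·t²/8, valid when takeoff and landing heights are equal. A minimal sketch, including the frame-count-to-time conversion for 240 fps video:

```python
G = 9.81  # gravitational acceleration, m/s^2

def flight_time_from_frames(n_frames, fps=240):
    """Flight time (s) from the number of airborne frames in high-speed video."""
    return n_frames / fps

def jump_height_from_flight_time(t_flight):
    """Vertical jump height (m) from flight time (s): h = g * t^2 / 8."""
    return G * t_flight ** 2 / 8

t = flight_time_from_frames(120)                  # 120 frames at 240 fps = 0.5 s
print(round(jump_height_from_flight_time(t), 3))  # 0.307 m
```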

  14. Scoring Methods for Building Genotypic Scores: An Application to Didanosine Resistance in a Large Derivation Set

    PubMed Central

    Houssaini, Allal; Assoumou, Lambert; Miller, Veronica; Calvez, Vincent; Marcelin, Anne-Geneviève; Flandre, Philippe

    2013-01-01

    Background: Several attempts have been made to determine HIV-1 resistance from genotype resistance testing. We compare scoring methods for building weighted genotypic scores with commonly used systems for determining whether the virus of an HIV-infected patient is resistant. Methods and Principal Findings: Three statistical methods (linear discriminant analysis, support vector machine and logistic regression) were used to determine the weights of mutations involved in HIV resistance. We compared these weighted scores with known interpretation systems (ANRS, REGA and Stanford HIV-db) to classify patients as resistant or not. Our methodology is illustrated on the Forum for Collaborative HIV Research didanosine database (N = 1453). The database was divided into four samples according to the country of enrolment (France, USA/Canada, Italy and Spain/UK/Switzerland). The total sample and the four country-based samples allow external validation (one sample is used to estimate a score and the other samples are used to validate it). We used the observed precision to compare the performance of newly derived scores with other interpretation systems. Our results show that newly derived scores performed better than or similarly to existing interpretation systems, even with external validation sets. No difference was found between the three methods investigated. Our analysis identified four new mutations associated with didanosine resistance: D123S, Q207K, H208Y and K223Q. Conclusions: We explored the potential of three statistical methods to construct weighted scores for didanosine resistance. Our proposed scores performed at least as well as existing interpretation systems, and previously unrecognized didanosine-resistance-associated mutations were identified. This approach could be used for building scores of genotypic resistance to other antiretroviral drugs. PMID:23555613
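
    One of the three weighting approaches named above, logistic regression over mutation indicators, can be sketched as follows. The data are synthetic: the mutation layout and effect structure are invented for illustration and are not taken from the didanosine database.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic training set: rows are viral sequences, columns are presence (1)
# or absence (0) of four candidate mutations; y = 1 means phenotypic resistance.
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(200, 4))
# In this toy model, resistance is driven by mutations 0 and 2 only.
y = ((2.0 * X[:, 0] + 1.5 * X[:, 2]) >= 1.5).astype(int)

model = LogisticRegression().fit(X, y)
weights = model.coef_[0]           # per-mutation weights form the genotypic score
print(np.round(weights, 2))        # mutations 0 and 2 receive the largest weights
```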

  15. Do Current Recommendations for Upper Instrumented Vertebra Predict Shoulder Imbalance? An Attempted Validation of Level Selection for Adolescent Idiopathic Scoliosis.

    PubMed

    Bjerke, Benjamin T; Cheung, Zoe B; Shifflett, Grant D; Iyer, Sravisht; Derman, Peter B; Cunningham, Matthew E

    2015-10-01

    Shoulder balance in adolescent idiopathic scoliosis (AIS) patients is associated with patient satisfaction and self-image. However, few validated systems exist for selecting the upper instrumented vertebra (UIV) to achieve post-surgical shoulder balance. The purpose of this study was to examine existing UIV selection criteria and correlate them with post-surgical shoulder balance in AIS patients. Patients who underwent spinal fusion at age 10-18 years for AIS over a 6-year period were reviewed. All patients with a minimum of 1 year of radiographic follow-up were included. Imbalance was defined as radiographic shoulder height |RSH| ≥ 15 mm at latest follow-up. Three UIV selection methods were considered: Lenke, Ilharreborde, and Trobisch. A recommended UIV was determined using each method from pre-surgical radiographs. The recommended UIV for each method was compared to the actual UIV instrumented; concordance between these levels was defined as "Correct" UIV selection, and discordance was defined as "Incorrect" selection. One hundred seventy-one patients were included with 2.3 ± 1.1 years of follow-up. For all methods, "Correct" UIV selection resulted in more shoulder imbalance than "Incorrect" UIV selection. Overall shoulder imbalance incidence improved from 31.0% (53/171) to 15.2% (26/171). The incidence of new shoulder imbalance in patients with previously level shoulders was 8.8%. We could not identify a set of UIV selection criteria that accurately predicted post-surgical shoulder balance; further validated measures are needed in this area. The complexity of proximal thoracic curve correction is underscored by a case example in which shoulder imbalance occurred despite "Correct" UIV selection by all methods.

  16. The Integral Theory System Questionnaire: an anatomically directed questionnaire to determine pelvic floor dysfunctions in women.

    PubMed

    Wagenlehner, Florian Martin Erich; Fröhlich, Oliver; Bschleipfer, Thomas; Weidner, Wolfgang; Perletti, Gianpaolo

    2014-06-01

    Anatomical damage to pelvic floor structures may cause multiple symptoms. The Integral Theory System Questionnaire (ITSQ) is a holistic questionnaire that uses symptoms to help locate damage in specific connective tissue structures as a guide to reconstructive surgery. It is based on the integral theory, which states that pelvic floor symptoms and prolapse are both caused by lax suspensory ligaments. The aim of the present study was to psychometrically validate the ITSQ. Established psychometric properties, including validity, reliability, and responsiveness, were considered for evaluation. Criterion validity was assessed in a cohort of 110 women with pelvic floor dysfunctions by analyzing the correlation of questionnaire responses with objective clinical data. Test-retest analysis was performed with questionnaires from 47 patients. Cronbach's alpha and "split-half" reliability coefficients were calculated for internal consistency analysis. The psychometric properties of the ITSQ were comparable to those of previously validated pelvic floor questionnaires. Face validity and content validity were approved by an expert group of the International Collaboration of Pelvic Floor surgeons. Convergent validity, assessed using a Bayesian method, was at least as accurate as the expert assessment of anatomical defects. Objective data measurements in patients demonstrated significant correlations with ITSQ domains, fulfilling criterion validity. Internal consistency values ranged from 0.85 to 0.89 in different scenarios. The ITSQ proved accurate and is able to serve as a holistic pelvic floor questionnaire directing symptoms to site-specific pelvic floor reconstructive surgery.

  17. Multisensor system for toxic gases detection generated on indoor environments

    NASA Astrophysics Data System (ADS)

    Durán, C. M.; Monsalve, P. A. G.; Mosquera, C. J.

    2016-11-01

    This work describes a wireless multisensor system for the detection of toxic gases generated in indoor environments (e.g., underground coal mines). The artificial multisensor system proposed in this study was developed using a set of six low-cost chemical gas sensors (MQ series) with overlapping sensitivities to detect hazardous gases in the air. A statistical parameter was applied to the data set, and two pattern recognition methods, Principal Component Analysis (PCA) and Discriminant Function Analysis (DFA), were used for feature selection. The toxic gas categories were then classified with a Probabilistic Neural Network (PNN) in order to validate the results previously obtained. Tests were carried out to verify the feasibility of the application through a wireless communication model, which allowed the sensor signals to be monitored and stored for appropriate analysis. The success rate in discriminating the measurements was 100%, using an artificial neural network with leave-one-out cross-validation.
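
    The PNN-with-leave-one-out pipeline can be sketched with a minimal Parzen-window classifier after PCA feature reduction. The sensor responses below are synthetic stand-ins for the six MQ sensor signals, and the kernel width is an arbitrary choice; this is an illustration of the general technique, not the study's implementation.

```python
import numpy as np
from sklearn.decomposition import PCA

def pnn_predict(X_train, y_train, x, sigma=1.0):
    """Minimal probabilistic neural network: score each class by the mean
    Gaussian kernel over its training points, return the best class."""
    scores = {}
    for c in np.unique(y_train):
        d = X_train[y_train == c] - x
        scores[c] = np.mean(np.exp(-np.sum(d * d, axis=1) / (2 * sigma ** 2)))
    return max(scores, key=scores.get)

# Synthetic 6-sensor responses for two gas categories (well separated).
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.3, (20, 6)), rng.normal(3, 0.3, (20, 6))])
y = np.array([0] * 20 + [1] * 20)

Xp = PCA(n_components=2).fit_transform(X)   # feature reduction, as in the study

# Leave-one-out cross-validation
hits = sum(pnn_predict(np.delete(Xp, i, 0), np.delete(y, i), Xp[i]) == y[i]
           for i in range(len(y)))
print(hits / len(y))  # 1.0 for these well-separated sample clusters
```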

  18. Improved spectrophotometric analysis of fullerenes C60 and C70 in high-solubility organic solvents.

    PubMed

    Törpe, Alexander; Belton, Daniel J

    2015-01-01

    Fullerenes are among a number of recently discovered carbon allotropes that exhibit unique and versatile properties. The analysis of these materials is of great importance and interest. We present previously unreported spectroscopic data for C60 and C70 fullerenes in high-solubility solvents, including error bounds, so as to allow reliable colorimetric analysis of these materials. The Beer-Lambert-Bouguer law is found to be valid at all wavelengths. The measured data were highly reproducible, and yielded high-precision molar absorbance coefficients for C60 and C70 in o-xylene and o-dichlorobenzene, which both exhibit a high solubility for these fullerenes, and offer the prospect of improved extraction efficiency. A photometric method for a C60/C70 mixture analysis was validated with standard mixtures, and subsequently improved for real samples by correcting for light scattering, using a power-law fit. The method was successfully applied to the analysis of C60/C70 mixtures extracted from fullerene soot.
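
    Colorimetric analysis of a C60/C70 mixture amounts to solving a small linear system from the Beer-Lambert-Bouguer law at two wavelengths. A sketch with placeholder molar absorption coefficients and absorbances (not the coefficients measured in the study):

```python
import numpy as np

# Beer-Lambert-Bouguer law at two wavelengths gives A = (E * path) @ c,
# where c holds the molar concentrations of C60 and C70.
E = np.array([[60000.0,  4000.0],    # epsilon at wavelength 1: [C60, C70], L/(mol*cm)
              [10000.0, 30000.0]])   # epsilon at wavelength 2: [C60, C70]
path = 1.0                           # cuvette path length, cm

A = np.array([0.64, 0.40])           # measured absorbances (hypothetical)
c60, c70 = np.linalg.solve(E * path, A)
print(c60, c70)                      # molar concentrations (here ~1e-5 M each)
```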

  19. Validation and application of a quantitative real-time PCR assay to detect common wheat adulteration of durum wheat for pasta production.

    PubMed

    Carloni, Elisa; Amagliani, Giulia; Omiccioli, Enrica; Ceppetelli, Veronica; Del Mastro, Michele; Rotundo, Luca; Brandi, Giorgio; Magnani, Mauro

    2017-06-01

    Pasta is the Italian product par excellence and it is now popular worldwide. Pasta of a superior quality is made with pure durum wheat. In Italy, addition of Triticum aestivum (common wheat) during manufacturing is not allowed and, without adequate labeling, its presence is considered an adulteration. PCR-related techniques can be employed for the detection of common wheat contaminations. In this work, we demonstrated that a previously published method for the detection of T. aestivum, based on the gliadin gene, is inadequate. Moreover, a new molecular method, based on DNA extraction from semolina and real-time PCR determination of T. aestivum in Triticum spp., was validated. This multiplex real-time PCR, based on the dual-labeled probe strategy, guarantees target detection specificity and sensitivity in a short period of time. Moreover, the molecular analysis of common wheat contamination in commercial wheat and flours is described for the first time. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. Transferability and inter-laboratory variability assessment of the in vitro bovine oocyte fertilization test.

    PubMed

    Tessaro, Irene; Modina, Silvia C; Crotti, Gabriella; Franciosi, Federica; Colleoni, Silvia; Lodde, Valentina; Galli, Cesare; Lazzari, Giovanna; Luciano, Alberto M

    2015-01-01

    The dramatic increase in the number of animals required for reproductive toxicity testing necessitates the validation of alternative methods to reduce the use of laboratory animals. As we previously demonstrated for the in vitro maturation test of bovine oocytes, the present study describes the transferability assessment and the inter-laboratory variability of an in vitro test able to identify chemical effects on the process of bovine oocyte fertilization. Eight chemicals with well-known toxic properties (benzo[a]pyrene, busulfan, cadmium chloride, cycloheximide, diethylstilbestrol, ketoconazole, methylacetoacetate, mifepristone/RU-486) were tested in two well-trained laboratories. The statistical analysis demonstrated no differences in the EC50 values for each chemical, either within laboratories (between runs) or between laboratories. We therefore conclude that the bovine in vitro fertilization test could advance toward the validation process as an alternative in vitro method and become part of an integrated testing strategy to predict chemical hazards to mammalian fertility. Copyright © 2015 Elsevier Inc. All rights reserved.
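
    EC50 values of the kind compared here are commonly obtained by fitting a four-parameter logistic dose-response curve. A minimal sketch with SciPy; the concentrations and fertilization rates are hypothetical, not the study's data.

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, bottom, top, ec50, hill):
    """Four-parameter logistic dose-response curve; EC50 is the
    concentration at the half-maximal effect."""
    return bottom + (top - bottom) / (1 + (x / ec50) ** hill)

conc = np.array([0.01, 0.1, 1.0, 10.0, 100.0])   # chemical concentration (uM)
rate = np.array([0.82, 0.80, 0.55, 0.15, 0.05])  # in vitro fertilization rate

popt, _ = curve_fit(four_pl, conc, rate,
                    p0=[0.05, 0.8, 1.0, 1.0], bounds=(0, np.inf))
print(round(popt[2], 2))   # fitted EC50
```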

  1. A Method of Retrospective Computerized System Validation for Drug Manufacturing Software Considering Modifications

    NASA Astrophysics Data System (ADS)

    Takahashi, Masakazu; Fukue, Yoshinori

    This paper proposes a Retrospective Computerized System Validation (RCSV) method for Drug Manufacturing Software (DMSW) that takes software modification into account. Because DMSW used for quality management and facility control has a large impact on drug quality, regulatory agencies require proof of the adequacy of DMSW functions and performance, based on development documents and test results. In particular, the work of demonstrating the adequacy of previously developed DMSW from existing documents and operational records is called RCSV. When DMSW that has already undergone RCSV is modified, it is difficult to secure consistency between the development documents and test results for the modified parts and the existing documents and operational records for the unmodified parts, which makes conducting RCSV difficult. In this paper, we propose (a) a definition of the document architecture, (b) a definition of the descriptive items and levels in the documents, (c) management of design information using a database, (d) exhaustive testing, and (e) an integrated RCSV procedure. As a result, we were able to conduct adequate RCSV while securing consistency.

  2. A Bayesian Approach for Measurements of Stray Neutrons at Proton Therapy Facilities: Quantifying Neutron Dose Uncertainty.

    PubMed

    Dommert, M; Reginatto, M; Zboril, M; Fiedler, F; Helmbrecht, S; Enghardt, W; Lutz, B

    2017-11-28

    Bonner sphere measurements are typically analyzed using unfolding codes. It is well known that standard unfolding procedures have difficulty providing reliable uncertainty estimates. An alternative approach is to analyze the data using Bayesian parameter estimation. This method provides reliable estimates of the uncertainties of neutron spectra, leading to rigorous estimates of the uncertainty of the dose. We extend previous Bayesian approaches and apply the method to stray neutrons in proton therapy environments by introducing a new parameterized model which describes the main features of the expected neutron spectra. The parameterization is based on information that is available from measurements and detailed Monte Carlo simulations. This approach was validated against the results of a Bonner sphere experiment carried out at the experimental hall of the OncoRay proton therapy facility in Dresden. © The Author 2017. Published by Oxford University Press. All rights reserved.

  3. A methodology for the analysis of differential coexpression across the human lifespan.

    PubMed

    Gillis, Jesse; Pavlidis, Paul

    2009-09-22

    Differential coexpression is a change in coexpression between genes that may reflect 'rewiring' of transcriptional networks. It has previously been hypothesized that such changes might be occurring over time in the lifespan of an organism. While both coexpression and differential expression of genes have been previously studied in life stage change or aging, differential coexpression has not. Generalizing differential coexpression analysis to many time points presents a methodological challenge. Here we introduce a method for analyzing changes in coexpression across multiple ordered groups (e.g., over time) and extensively test its validity and usefulness. Our method is based on the use of the Haar basis set to efficiently represent changes in coexpression at multiple time scales, and thus represents a principled and generalizable extension of the idea of differential coexpression to life stage data. We used published microarray studies categorized by age to test the methodology. We validated the methodology by testing our ability to reconstruct Gene Ontology (GO) categories using our measure of differential coexpression and compared this result to using coexpression alone. Our method allows significant improvement in characterizing these groups of genes. Further, we examine the statistical properties of our measure of differential coexpression and establish that the results are significant both statistically and by an improvement in semantic similarity. In addition, we found that our method finds more significant changes in gene relationships compared to several other methods of expressing temporal relationships between genes, such as coexpression over time. Differential coexpression over age generates significant and biologically relevant information about the genes producing it. Our Haar basis methodology for determining age-related differential coexpression performs better than other tested methods. 
The Haar basis set also lends itself to ready interpretation in terms of both evolutionary and physiological mechanisms of aging and can be seen as a natural generalization of two-category differential coexpression.
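
    The Haar projection at the core of the method can be sketched in a few lines of numpy. This is an illustrative reconstruction under stated assumptions, not the authors' implementation: expression data are assumed pre-binned into 2^k ordered age groups, and all names are hypothetical.

```python
import numpy as np

def haar_matrix(n):
    """Orthonormal Haar basis for n = 2**k points (rows are basis vectors)."""
    if n == 1:
        return np.array([[1.0]])
    h = haar_matrix(n // 2)
    top = np.kron(h, [1.0, 1.0])                    # coarse (averaging) rows
    bottom = np.kron(np.eye(n // 2), [1.0, -1.0])   # fine (difference) rows
    m = np.vstack([top, bottom])
    return m / np.linalg.norm(m, axis=1, keepdims=True)

def differential_coexpression_scores(expr_by_group, gene_a, gene_b):
    """Correlate two genes within each ordered group, then project the
    correlation profile onto the Haar basis; nonzero detail coefficients
    flag changes in coexpression across the ordering (e.g., over age)."""
    corrs = np.array([np.corrcoef(g[gene_a], g[gene_b])[0, 1]
                      for g in expr_by_group])
    return haar_matrix(len(corrs)) @ corrs
```

    A gene pair whose correlation flips sign between early and late groups loads entirely on a detail coefficient, which is the multi-scale analogue of two-category differential coexpression.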

  4. Rapid Sampling of Hydrogen Bond Networks for Computational Protein Design.

    PubMed

    Maguire, Jack B; Boyken, Scott E; Baker, David; Kuhlman, Brian

    2018-05-08

    Hydrogen bond networks play a critical role in determining the stability and specificity of biomolecular complexes, and the ability to design such networks is important for engineering novel structures, interactions, and enzymes. One key feature of hydrogen bond networks that makes them difficult to rationally engineer is that they are highly cooperative and are not energetically favorable until the hydrogen bonding potential has been satisfied for all buried polar groups in the network. Existing computational methods for protein design are ill-equipped for creating these highly cooperative networks because they rely on energy functions and sampling strategies that are focused on pairwise interactions. To enable the design of complex hydrogen bond networks, we have developed a new sampling protocol in the molecular modeling program Rosetta that explicitly searches for sets of amino acid mutations that can form self-contained hydrogen bond networks. For a given set of designable residues, the protocol often identifies many alternative sets of mutations/networks, and we show that it can readily be applied to large sets of residues at protein-protein interfaces or in the interior of proteins. The protocol builds on a recently developed method in Rosetta for designing hydrogen bond networks that has been experimentally validated for small symmetric systems but was not extensible to many larger protein structures and complexes. The sampling protocol we describe here not only recapitulates previously validated designs with performance improvements but also yields viable hydrogen bond networks for cases where the previous method fails, such as the design of large, asymmetric interfaces relevant to engineering protein-based therapeutics.

  5. Three-dimensional surface deformation derived from airborne interferometric UAVSAR: Application to the Slumgullion Landslide

    USGS Publications Warehouse

    Delbridge, Brent G.; Burgmann, Roland; Fielding, Eric; Hensley, Scott; Schulz, William

    2016-01-01

    In order to provide surface geodetic measurements with “landslide-wide” spatial coverage, we develop and validate a method for the characterization of 3-D surface deformation using the unique capabilities of the Uninhabited Aerial Vehicle Synthetic Aperture Radar (UAVSAR) airborne repeat-pass radar interferometry system. We apply our method at the well-studied Slumgullion Landslide, which is 3.9 km long and moves persistently at rates up to ∼2 cm/day. A comparison with concurrent GPS measurements validates this method and shows that it provides reliable and accurate 3-D surface deformation measurements. The UAVSAR-derived vector velocity field accurately captures the sharp boundaries defining previously identified kinematic units and geomorphic domains within the landslide. We acquired data across the landslide during spring and summer and find that the landslide moves more slowly during summer except at its head, presumably in response to spatiotemporal variations in snowmelt infiltration. In order to constrain the mechanics controlling landslide motion from surface velocity measurements, we present an inversion framework for the extraction of slide thickness and basal geometry from dense 3-D surface velocity fields. We find that the average depth of the Slumgullion Landslide is 7.5 m, several meters less than previous depth estimates. We show that by considering a viscoplastic rheology, we can derive tighter theoretical bounds on the rheological parameter relating mean horizontal flow rate to surface velocity. Using inclinometer data for slow-moving, clay-rich landslides across the globe, we find a consistent value for the rheological parameter of 0.85 ± 0.08.

  6. Simultaneous Determination of Crypto-Chlorogenic Acid, Isoquercetin, and Astragalin Contents in Moringa oleifera Leaf Extracts by TLC-Densitometric Method.

    PubMed

    Vongsak, Boonyadist; Sithisarn, Pongtip; Gritsanapan, Wandee

    2013-01-01

    Moringa oleifera Lamarck (Moringaceae) is used as a multipurpose medicinal plant for the treatment of various diseases. Isoquercetin, astragalin, and crypto-chlorogenic acid have previously been found to be major active components in the leaves of this plant. In this study, a thin-layer chromatography (TLC)-densitometric method was developed and validated for simultaneous quantification of these major components in the 70% ethanolic extracts of M. oleifera leaves collected from 12 locations. The average amounts of crypto-chlorogenic acid, isoquercetin, and astragalin were found to be 0.0473, 0.0427, and 0.0534% dry weight, respectively. The method was validated for linearity, precision, accuracy, limit of detection, limit of quantitation, and robustness. Linearity was obtained in the range of 100-500 ng/spot with a correlation coefficient (r) over 0.9961. Intraday and interday precisions demonstrated relative standard deviations of less than 5%. The accuracy of the method was confirmed by determining the recovery. The average recoveries of each component from the extracts were in the range of 98.28 to 99.65%. Additionally, the leaves from Chiang Mai province contained the highest amounts of all active components. The proposed TLC-densitometric method was simple, accurate, precise, and cost-effective for routine quality control of M. oleifera leaf extracts.

  7. Validation of MIMGO: a method to identify differentially expressed GO terms in a microarray dataset

    PubMed Central

    2012-01-01

    Background We previously proposed an algorithm for the identification of GO terms that commonly annotate genes whose expression is upregulated or downregulated in some microarray data compared with other microarray data. We call these “differentially expressed GO terms” and have named the algorithm “matrix-assisted identification method of differentially expressed GO terms” (MIMGO). MIMGO can also identify microarray data in which genes annotated with a differentially expressed GO term are upregulated or downregulated. However, MIMGO has not yet been validated on a real microarray dataset using all available GO terms. Findings We combined Gene Set Enrichment Analysis (GSEA) with MIMGO to identify differentially expressed GO terms in a yeast cell cycle microarray dataset. GSEA followed by MIMGO (GSEA + MIMGO) correctly identified (p < 0.05) microarray data in which genes annotated with differentially expressed GO terms are upregulated. We found that GSEA + MIMGO was slightly less effective than, or comparable to, GSEA (Pearson), a method that uses Pearson’s correlation as a metric, at detecting true differentially expressed GO terms. However, unlike other methods including GSEA (Pearson), GSEA + MIMGO can comprehensively identify the microarray data in which genes annotated with a differentially expressed GO term are upregulated or downregulated. Conclusions MIMGO is a reliable method for comprehensively identifying differentially expressed GO terms. PMID:23232071

  8. Creation and validation of a novel body condition scoring method for the magellanic penguin (Spheniscus magellanicus) in the zoo setting.

    PubMed

    Clements, Julie; Sanchez, Jessica N

    2015-11-01

    This research aims to validate a novel, visual body scoring system created for the Magellanic penguin (Spheniscus magellanicus) suitable for the zoo practitioner. Magellanics go through marked seasonal fluctuations in body mass gains and losses. A standardized multi-variable visual body condition guide may provide a more sensitive and objective assessment tool compared to the previously used single-variable method. Accurate body condition scores paired with seasonal weight variation measurements give veterinary and keeper staff a clearer understanding of an individual's nutritional status. San Francisco Zoo staff previously used a nine-point body condition scale based on the classic bird standard of a single point of keel palpation with the bird restrained in hand, with no standard measure of reference assigned to each scoring category. We created a novel, visual body condition scoring system that assesses subcutaneous fat and muscle at seven body landmarks, using illustrations and descriptive terms, without requiring restraint. The scores range from one, the least robust or under-conditioned, to five, the most robust or over-conditioned. The ratio of body weight to wing length was used as a "gold standard" index of body condition and compared to both the novel multi-variable and previously used single-variable body condition scores. The novel multi-variable scale showed improved agreement with the weight:wing ratio compared to the single-variable scale, demonstrating greater accuracy and reliability when a trained assessor uses the multi-variable body condition scoring system. Zoo staff may use this tool to manage both the colony and the individual to assist in seasonally appropriate Magellanic penguin nutrition assessment. © 2015 Wiley Periodicals, Inc.

  9. Inter-laboratory validation of an inexpensive streamlined method to measure inorganic arsenic in rice grain.

    PubMed

    Chaney, Rufus L; Green, Carrie E; Lehotay, Steven J

    2018-05-04

    With the establishment by CODEX of a 200 ng/g limit of inorganic arsenic (iAs) in polished rice grain, more analyses of iAs will be necessary to ensure compliance in regulatory and trade applications, to assess quality control in commercial rice production, and to conduct research involving iAs in rice crops. Although analytical methods using high-performance liquid chromatography-inductively coupled plasma-mass spectrometry (HPLC-ICP-MS) have been demonstrated for full speciation of As, this expensive and time-consuming approach is excessive when regulations are based only on iAs. We report a streamlined sample preparation and analysis of iAs in powdered rice based on heated extraction with 0.28 M HNO3 followed by hydride generation (HG) under control of acidity and other simple conditions. Analysis of iAs is then conducted using flow-injection HG and inexpensive ICP-atomic emission spectroscopy (AES) or other detection means. A key innovation compared with previous methods was to increase the acidity of the reagent solution with 4 M HCl (prior to reduction of As5+ to As3+), which minimized interferences from dimethylarsinic acid. An inter-laboratory method validation was conducted among 12 laboratories worldwide in the analysis of six shared blind duplicates and a NIST Standard Reference Material involving different types of rice and iAs levels. Also, four laboratories used the standard HPLC-ICP-MS method to analyze the samples. The results between the methods were not significantly different, and the Horwitz ratio averaged 0.52 for the new method, which meets official method validation criteria. Thus, the simpler, more versatile, and less expensive method may be used by laboratories for several purposes to accurately determine iAs in rice grain. Graphical abstract: Comparison of iAs results from the new and FDA methods.
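
    The Horwitz ratio cited as the acceptance criterion follows directly from the Horwitz power law. A minimal sketch; the 10.6% observed RSD below is a hypothetical value chosen only to illustrate how a HorRat near the reported 0.52 arises at the 200 ng/g CODEX level.

```python
import math

def predicted_rsd(mass_fraction):
    """Horwitz-predicted reproducibility RSD (%) at an analyte
    concentration expressed as a mass fraction (g/g)."""
    return 2 ** (1 - 0.5 * math.log10(mass_fraction))

def horwitz_ratio(observed_rsd_percent, mass_fraction):
    """HorRat = observed / predicted RSD; values of roughly 0.5-2
    are conventionally taken as acceptable method performance."""
    return observed_rsd_percent / predicted_rsd(mass_fraction)

# 200 ng/g inorganic As corresponds to a mass fraction of 2e-7 g/g.
print(round(predicted_rsd(2e-7), 1))        # predicted RSD ≈ 20.4 %
print(round(horwitz_ratio(10.6, 2e-7), 2))  # hypothetical 10.6 % RSD -> 0.52
```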

  10. Verification and Validation of Multisegmented Mooring Capabilities in FAST v8: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andersen, Morten T.; Wendt, Fabian; Robertson, Amy

    2016-08-01

    The quasi-static and dynamic mooring modules of the open-source aero-hydro-servo-elastic wind turbine simulation software, FAST v8, have previously been verified and validated, but only for mooring arrangements consisting of single lines connecting each fairlead and anchor. This paper extends the previous verification and validation efforts to focus on the multisegmented mooring capability of the FAST v8 modules: MAP++, MoorDyn, and the OrcaFlex interface. The OC3-Hywind spar buoy system tested by the DeepCwind consortium at the MARIN ocean basin, which includes a multisegmented bridle layout of the mooring system, was used for the verification and validation activities.

  11. Viability of Cross-Flow Fan with Helical Blades for Vertical Take-off and Landing Aircraft

    DTIC Science & Technology

    2012-09-01

    Using the computational fluid dynamics (CFD) software ANSYS CFX, a three-dimensional (3-D) straight-bladed model was validated against a previous study's experimental results.

  12. The structure of liquid metals probed by XAS

    NASA Astrophysics Data System (ADS)

    Filipponi, Adriano; Di Cicco, Andrea; Iesari, Fabio; Trapananti, Angela

    2017-08-01

    X-ray absorption spectroscopy (XAS) is a powerful technique to investigate the short-range order around selected atomic species in condensed matter. The theoretical framework and previous applications to undercooled elemental liquid metals are briefly reviewed. Specific results on undercooled liquid Ni, obtained using a peak-fitting approach validated on the spectra of solid Ni, are presented. This method provides clear evidence that a signature from close-packed triangular configurations of nearest neighbors survives in the liquid state and is clearly detectable below k ≈ 5 Å⁻¹, stimulating the improvement of data-analysis methods that properly account for the ensemble average, such as Reverse Monte Carlo.

  13. Radar Sensing for Intelligent Vehicles in Urban Environments

    PubMed Central

    Reina, Giulio; Johnson, David; Underwood, James

    2015-01-01

    Radar overcomes the shortcomings of laser, stereovision, and sonar because it can operate successfully in dusty, foggy, blizzard-blinding, and poorly lit scenarios. This paper presents a novel method for ground and obstacle segmentation based on radar sensing. The algorithm operates directly in the sensor frame, without the need for a separate synchronised navigation source, calibration parameters describing the location of the radar in the vehicle frame, or the geometric restrictions made in the previous main method in the field. Experimental results are presented in various urban scenarios to validate this approach, showing its potential applicability for advanced driving assistance systems and autonomous vehicle operations. PMID:26102493

  14. Radar Sensing for Intelligent Vehicles in Urban Environments.

    PubMed

    Reina, Giulio; Johnson, David; Underwood, James

    2015-06-19

    Radar overcomes the shortcomings of laser, stereovision, and sonar because it can operate successfully in dusty, foggy, blizzard-blinding, and poorly lit scenarios. This paper presents a novel method for ground and obstacle segmentation based on radar sensing. The algorithm operates directly in the sensor frame, without the need for a separate synchronised navigation source, calibration parameters describing the location of the radar in the vehicle frame, or the geometric restrictions made in the previous main method in the field. Experimental results are presented in various urban scenarios to validate this approach, showing its potential applicability for advanced driving assistance systems and autonomous vehicle operations.

  15. High-Fidelity Micromechanics Model Developed for the Response of Multiphase Materials

    NASA Technical Reports Server (NTRS)

    Aboudi, Jacob; Pindera, Marek-Jerzy; Arnold, Steven M.

    2002-01-01

    A new high-fidelity micromechanics model has been developed under funding from the NASA Glenn Research Center for predicting the response of multiphase materials with arbitrary periodic microstructures. The model's analytical framework is based on the homogenization technique, but the method of solution for the local displacement and stress fields borrows concepts previously employed in constructing the higher order theory for functionally graded materials. The resulting closed-form macroscopic and microscopic constitutive equations, valid for both uniaxial and multiaxial loading of periodic materials with elastic and inelastic constitutive phases, can be incorporated into a structural analysis computer code. Consequently, this model now provides an alternative, accurate method.

  16. EL_PSSM-RT: DNA-binding residue prediction by integrating ensemble learning with PSSM Relation Transformation.

    PubMed

    Zhou, Jiyun; Lu, Qin; Xu, Ruifeng; He, Yulan; Wang, Hongpeng

    2017-08-29

    Prediction of DNA-binding residues is important for understanding the protein-DNA recognition mechanism. Many computational methods have been proposed for this prediction, but most do not consider the relationships of evolutionary information between residues. In this paper, we first propose a novel residue encoding method, referred to as Position Specific Score Matrix (PSSM) Relation Transformation (PSSM-RT), which encodes residues by utilizing the relationships of evolutionary information between residues. PDNA-62 and PDNA-224 are used to evaluate PSSM-RT and two existing PSSM encoding methods by five-fold cross-validation. Performance evaluations indicate that PSSM-RT is more effective than previous methods, validating the point that the relationship of evolutionary information between residues is indeed useful in DNA-binding residue prediction. An ensemble learning classifier (EL_PSSM-RT) is also proposed, combining an ensemble learning model with PSSM-RT to better handle the imbalance between binding and non-binding residues in the datasets. EL_PSSM-RT is evaluated by five-fold cross-validation using PDNA-62 and PDNA-224 as well as two independent datasets, TS-72 and TS-61. Performance comparisons with existing predictors on the four datasets demonstrate that EL_PSSM-RT is the best-performing method, with improvements of 0.02-0.07 in MCC, 4.18-21.47% in ST, and 0.013-0.131 in AUC. Furthermore, we analyze the importance of the pair-relationships extracted by PSSM-RT, and the results validate the usefulness of PSSM-RT for encoding DNA-binding residues. In summary, we propose a novel method for the prediction of DNA-binding residues that incorporates relationships of evolutionary information and ensemble learning.
Performance evaluation shows that the relationship of evolutionary information between residues is indeed useful in DNA-binding residue prediction and ensemble learning can be used to address the data imbalance issue between binding and non-binding residues. A web service of EL_PSSM-RT ( http://hlt.hitsz.edu.cn:8080/PSSM-RT_SVM/ ) is provided for free access to the biological research community.
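
    The MCC figures quoted above are the standard Matthews correlation coefficient, which is well suited to the binding/non-binding class imbalance the authors address. A generic implementation (the confusion counts in the comment are invented for illustration):

```python
import math

def mcc(tp, tn, fp, fn):
    """Matthews correlation coefficient from confusion-matrix counts;
    returns 0.0 by convention when any marginal total is zero."""
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return (tp * tn - fp * fn) / denom if denom else 0.0

# e.g., mcc(90, 80, 20, 10) on invented counts gives ~0.70, while plain
# accuracy on the same counts (0.85) hides the false-positive burden.
```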

  17. Optimisation of an analytical method and results from the inter-laboratory comparison of the migration of regulated substances from food packaging into the new mandatory European Union simulant for dry foodstuffs.

    PubMed

    Jakubowska, Natalia; Beldì, Giorgia; Peychès Bach, Aurélie; Simoneau, Catherine

    2014-01-01

    This paper presents the outcome of the development, optimisation and validation at European Union level of an analytical method for using poly(2,6-diphenyl phenylene oxide; PPPO), stipulated in Regulation (EU) No. 10/2011 as food simulant E, for testing specific migration from plastics into dry foodstuffs. Two methods were developed for fortifying, respectively, PPPO and a low-density polyethylene (LDPE) film with surrogate substances relevant to food contact. A protocol for cleaning the PPPO and an efficient analytical method were developed for the quantification of butylhydroxytoluene (BHT), benzophenone (BP), diisobutylphthalate (DiBP), bis(2-ethylhexyl) adipate (DEHA) and 1,2-cyclohexanedicarboxylic acid, diisononyl ester (DINCH) from PPPO. A protocol for a migration test from plastics using small migration cells was also developed. The method was validated by an inter-laboratory comparison (ILC) with 16 national reference laboratories for food contact materials in the European Union. This allowed, for the first time, data to be obtained on the precision and laboratory performance of both migration and quantification. The results showed that the validation ILC was successful even taking into account the complexity of the exercise. The method performance was 7-9% repeatability standard deviation (rSD) for most substances (regardless of concentration), with 12% rSD for the high level of BHT and for DiBP at very low levels. The reproducibility standard deviation results for the 16 European Union laboratories were in the range of 20-30% for quantification from PPPO (for the three concentration levels of the five substances) and 15-40% for migration experiments from the fortified plastic at 60°C for 10 days with subsequent quantification.
Considering the lack of data previously available in the literature, this work has demonstrated that the validation of a method is possible both for migration from a film and for quantification into a corresponding simulant for specific migration.

  18. Identifying critical success factors for designing selection processes into postgraduate specialty training: the case of UK general practice.

    PubMed

    Plint, Simon; Patterson, Fiona

    2010-06-01

    The UK national recruitment process into general practice training has been developed over several years, with incremental introduction of stages which have been piloted and validated. Previously independent processes, which encouraged multiple applications and produced inconsistent outcomes, have been replaced by a robust national process which has high reliability and predictive validity, and is perceived to be fair by candidates and allocates applicants equitably across the country. Best selection practice involves a job analysis which identifies required competencies, then designs reliable assessment methods to measure them, and over the long term ensures that the process has predictive validity against future performance. The general practitioner recruitment process introduced machine markable short listing assessments for the first time in the UK postgraduate recruitment context, and also adopted selection centre workplace simulations. The key success factors have been identified as corporate commitment to the goal of a national process, with gradual convergence maintaining locus of control rather than the imposition of change without perceived legitimate authority.

  19. An atomic model of brome mosaic virus using direct electron detection and real-space optimization.

    PubMed

    Wang, Zhao; Hryc, Corey F; Bammes, Benjamin; Afonine, Pavel V; Jakana, Joanita; Chen, Dong-Hua; Liu, Xiangan; Baker, Matthew L; Kao, Cheng; Ludtke, Steven J; Schmid, Michael F; Adams, Paul D; Chiu, Wah

    2014-09-04

    Advances in electron cryo-microscopy have enabled structure determination of macromolecules at near-atomic resolution. However, structure determination, even using de novo methods, remains susceptible to model bias and overfitting. Here we describe a complete workflow for data acquisition, image processing, all-atom modelling and validation of brome mosaic virus, an RNA virus. Data were collected with a direct electron detector in integrating mode and an exposure beyond the traditional radiation damage limit. The final density map has a resolution of 3.8 Å as assessed by two independent data sets and maps. We used the map to derive an all-atom model with a newly implemented real-space optimization protocol. The validity of the model was verified by its match with the density map and a previous model from X-ray crystallography, as well as the internal consistency of models from independent maps. This study demonstrates a practical approach to obtain a rigorously validated atomic resolution electron cryo-microscopy structure.

  20. Validation of cryo-EM structure of IP₃R1 channel.

    PubMed

    Murray, Stephen C; Flanagan, John; Popova, Olga B; Chiu, Wah; Ludtke, Steven J; Serysheva, Irina I

    2013-06-04

    About a decade ago, three electron cryomicroscopy (cryo-EM) single-particle reconstructions of IP3R1 were reported at low resolution. It was disturbing that these structures bore little similarity to one another, even at the level of quaternary structure. Recently, we published an improved structure of IP3R1 at ∼1 nm resolution. However, this structure did not bear any resemblance to any of the three previously published structures, leading to the question of why the structure should be considered more reliable than the original three. Here, we apply several methods, including class-average/map comparisons, tilt-pair validation, and use of multiple refinement software packages, to give strong evidence for the reliability of our recent structure. The map resolution and feature resolvability are assessed with the gold standard criterion. This approach is generally applicable to assessing the validity of cryo-EM maps of other molecular machines. Copyright © 2013 Elsevier Ltd. All rights reserved.

  1. External validation of a simple clinical tool used to predict falls in people with Parkinson disease

    PubMed Central

    Duncan, Ryan P.; Cavanaugh, James T.; Earhart, Gammon M.; Ellis, Terry D.; Ford, Matthew P.; Foreman, K. Bo; Leddy, Abigail L.; Paul, Serene S.; Canning, Colleen G.; Thackeray, Anne; Dibble, Leland E.

    2015-01-01

    Background: Assessment of fall risk in an individual with Parkinson disease (PD) is a critical yet often time-consuming component of patient care. Recently a simple clinical prediction tool based only on fall history in the previous year, freezing of gait in the past month, and gait velocity <1.1 m/s was developed and accurately predicted future falls in a sample of individuals with PD. Methods: We sought to externally validate the utility of the tool by administering it to a different cohort of 171 individuals with PD. Falls were monitored prospectively for 6 months following predictor assessment. Results: The tool accurately discriminated future fallers from non-fallers (area under the curve [AUC] = 0.83; 95% CI 0.76-0.89), comparable to the developmental study. Conclusion: The results validated the utility of the tool for allowing clinicians to quickly and accurately identify an individual's risk of an impending fall. PMID:26003412
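
    The discrimination statistic reported above is the area under the ROC curve, which for a discrete predictor reduces to the Mann-Whitney probability that a randomly chosen faller outscores a randomly chosen non-faller. A sketch under an assumed 0-3 count encoding of the three items (the encoding and names are illustrative, not the published scoring):

```python
def tool_score(fell_last_year, freezing_last_month, gait_below_1p1ms):
    """Hypothetical 0-3 encoding: one point per positive predictor."""
    return int(fell_last_year) + int(freezing_last_month) + int(gait_below_1p1ms)

def auc(scores_fallers, scores_nonfallers):
    """AUC via the Mann-Whitney statistic: the probability that a
    randomly chosen faller scores above a non-faller (ties count 1/2)."""
    wins = ties = 0
    for f in scores_fallers:
        for n in scores_nonfallers:
            if f > n:
                wins += 1
            elif f == n:
                ties += 1
    return (wins + 0.5 * ties) / (len(scores_fallers) * len(scores_nonfallers))
```

    An AUC of 0.83 therefore means that in 83% of faller/non-faller pairs the faller carries the higher score.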

  2. Modal parameter identification using the log decrement method and band-pass filters

    NASA Astrophysics Data System (ADS)

    Liao, Yabin; Wells, Valana

    2011-10-01

    This paper presents a time-domain technique for identifying modal parameters of test specimens based on the log-decrement method. For lightly damped multi-degree-of-freedom or continuous systems, the conventional method is usually restricted to identification of fundamental-mode parameters only. Implementation of band-pass filters makes it possible for the proposed technique to extract modal information for higher modes. The method has been applied to a polymethyl methacrylate (PMMA) beam for complex modulus identification in the frequency range 10-1100 Hz. Results compare well with those obtained using the Least Squares method and with those previously published in the literature. The accuracy of the proposed method was further verified by experiments performed on a QuietSteel specimen with very low damping. The method is simple and fast. It can be used for a quick estimation of the modal parameters or as a complementary approach for validation purposes.
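
    The core of the technique, isolating one mode with a band-pass filter and applying the log decrement to its free decay, can be sketched with scipy. This is a generic reconstruction, not the authors' code; the filter order, the least-squares fit of the decrement, and the edge-peak trimming are assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def modal_damping(x, fs, f_lo, f_hi, skip=3):
    """Damping ratio of the mode inside [f_lo, f_hi] Hz from a
    free-decay record x sampled at fs Hz."""
    b, a = butter(4, [f_lo, f_hi], btype="bandpass", fs=fs)
    y = filtfilt(b, a, x)                      # zero-phase band-pass
    amps = y[find_peaks(y)[0]][skip:-skip]     # crest amplitudes, edges trimmed
    # Log decrement from a least-squares fit of ln(amplitude) vs peak index,
    # then the standard conversion delta -> zeta.
    delta = -np.polyfit(np.arange(len(amps)), np.log(amps), 1)[0]
    return delta / np.sqrt(4 * np.pi**2 + delta**2)
```

    Filtering first is what lets the same decrement formula be reused mode by mode on a multi-mode record.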

  3. Initialization of a fractional order identification algorithm applied for Lithium-ion battery modeling in time domain

    NASA Astrophysics Data System (ADS)

    Nasser Eddine, Achraf; Huard, Benoît; Gabano, Jean-Denis; Poinot, Thierry

    2018-06-01

    This paper deals with the initialization of a nonlinear identification algorithm used to accurately estimate the physical parameters of a Lithium-ion battery. A Randles electric equivalent circuit is used to describe the internal impedance of the battery. The diffusion phenomenon related to this modeling is represented using a fractional order method. The battery model is thus reformulated into a transfer function which can be identified through the Levenberg-Marquardt algorithm so as to ensure convergence to the physical parameters. An initialization method is proposed in this paper that takes into account previously acquired information about the static and dynamic system behavior. The method is validated using a noisy voltage response, while the precision of the final identification results is evaluated using the Monte-Carlo method.
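
    The role of a physically informed initial guess can be illustrated with scipy's Levenberg-Marquardt solver. For brevity this sketch fits a simplified integer-order Randles step response rather than the paper's fractional-order model; all names and parameter values are hypothetical.

```python
import numpy as np
from scipy.optimize import least_squares

def voltage_step(t, r0, r1, c1, i_step=1.0):
    """Voltage response of a simplified Randles cell R0 + (R1 || C1)
    to a current step (a stand-in for the fractional diffusion model)."""
    return i_step * (r0 + r1 * (1.0 - np.exp(-t / (r1 * c1))))

def identify(t, v_meas, theta0):
    """Levenberg-Marquardt fit of (R0, R1, C1); theta0 carries the
    prior knowledge of static (R0 + R1) and dynamic (time-constant)
    behaviour that the initialization method provides."""
    res = least_squares(lambda p: voltage_step(t, *p) - v_meas,
                        theta0, method="lm")
    return res.x
```

    With a poor theta0 the solver can stall in a local minimum, which is precisely why deriving the initial values from previously acquired static and dynamic behaviour matters.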

  4. Bulk Enthalpy Calculations in the Arc Jet Facility at NASA ARC

    NASA Technical Reports Server (NTRS)

    Thompson, Corinna S.; Prabhu, Dinesh; Terrazas-Salinas, Imelda; Mach, Jeffrey J.

    2011-01-01

    The Arc Jet Facilities at NASA Ames Research Center generate test streams with enthalpies ranging from 5 MJ/kg to 25 MJ/kg. The present work describes a rigorous method, based on equilibrium thermodynamics, for calculating the bulk enthalpy of the flow produced in two of these facilities. The motivation for this work is to determine a dimensionally-correct formula for calculating the bulk enthalpy that is at least as accurate as the conventional formulas that are currently used. Unlike previous methods, the new method accounts for the amount of argon that is present in the flow. Comparisons are made with bulk enthalpies computed from an energy balance method. An analysis of primary facility operating parameters and their associated uncertainties is presented in order to further validate the enthalpy calculations reported herein.
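
The conventional energy-balance approach that the paper compares against divides the net power absorbed by the gas by the total mass flow; the refinement described above additionally accounts for the argon in that flow. A minimal sketch of the energy-balance formula only, with all operating numbers hypothetical:

```python
def bulk_enthalpy_energy_balance(p_arc_w, q_cooling_w, mdot_air_kg_s, mdot_argon_kg_s):
    """Energy-balance bulk enthalpy (J/kg): net power absorbed by the gas
    divided by the total mass flow, with argon included in the flow rate."""
    mdot_total = mdot_air_kg_s + mdot_argon_kg_s
    return (p_arc_w - q_cooling_w) / mdot_total

# Hypothetical operating point: 20 MW arc power, 11 MW lost to cooling water,
# 0.70 kg/s air with 0.05 kg/s argon
h = bulk_enthalpy_energy_balance(20e6, 11e6, 0.70, 0.05)
print(f"{h / 1e6:.1f} MJ/kg")   # prints 12.0 MJ/kg, within the 5-25 MJ/kg range
```

Omitting the argon term from the denominator would overstate the bulk enthalpy, which is the kind of bias the paper's equilibrium-thermodynamics method is meant to avoid.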

  5. Position Accuracy Analysis of a Robust Vision-Based Navigation

    NASA Astrophysics Data System (ADS)

    Gaglione, S.; Del Pizzo, S.; Troisi, S.; Angrisano, A.

    2018-05-01

    Using images to determine camera position and attitude is a consolidated method, widely used for applications such as UAV navigation. In harsh environments, where GNSS could be degraded or denied, image-based positioning is a possible candidate for an integrated or alternative system. In this paper, such a method is investigated using a system based on a single camera and 3D maps. A robust estimation method is proposed in order to limit the effect of blunders or noisy measurements on the position solution. The proposed approach is tested using images collected in an urban canyon, where GNSS positioning is very inaccurate. A photogrammetric survey was previously performed to build the 3D model of the tested area. The position accuracy analysis is performed and the effect of the proposed robust method is validated.

  6. Jwalk and MNXL Web Server: Model Validation using Restraints from Crosslinking Mass Spectrometry.

    PubMed

    Bullock, J M A; Thalassinos, K; Topf, M

    2018-05-07

    Crosslinking Mass Spectrometry generates restraints that can be used to model proteins and protein complexes. Previously, we have developed two methods to help users achieve better modelling performance from their crosslinking restraints: Jwalk, to estimate solvent-accessible distances between crosslinked residues, and MNXL, to assess the quality of models based on these distances. Here we present the Jwalk and MNXL webservers, which streamline the process of validating monomeric protein models using restraints from crosslinks. We demonstrate this by using the MNXL server to filter models of varying quality, selecting the most native-like ones. The webserver and source code are freely available from jwalk.ismb.lon.ac.uk and mnxl.ismb.lon.ac.uk. m.topf@cryst.bbk.ac.uk, j.bullock@cryst.bbk.ac.uk.

  7. Experimental evaluation of certification trails using abstract data type validation

    NASA Technical Reports Server (NTRS)

    Wilson, Dwight S.; Sullivan, Gregory F.; Masson, Gerald M.

    1993-01-01

    Certification trails are a recently introduced and promising approach to fault detection and fault tolerance. Recent experimental work reveals many cases in which a certification-trail approach allows for significantly faster program execution time than a basic time-redundancy approach. Algorithms for answer-validation of abstract data types allow a certification-trail approach to be used for a wide variety of problems. An attempt to assess the performance of algorithms utilizing certification trails on abstract data types is reported. Specifically, this method was applied to the following problems: heapsort, Huffman tree, shortest path, and skyline. Previous results used certification trails specific to a particular problem and implementation. The approach allows certification trails to be localized to 'data structure modules,' making the use of this technique transparent to the user of such modules.

  8. Computerized content analysis of some adolescent writings of Napoleon Bonaparte: a test of the validity of the method.

    PubMed

    Gottschalk, Louis A; DeFrancisco, Don; Bechtel, Robert J

    2002-08-01

    The aim of this study was to test the validity of a computer software program previously demonstrated to be capable of making DSM-IV neuropsychiatric diagnoses from the content analysis of speech or verbal texts. In this report, the computer program was applied to three personal writings of Napoleon Bonaparte when he was 12 to 16 years of age. The accuracy of the neuropsychiatric evaluations derived from the computerized content analysis of these writings of Napoleon was independently corroborated by two biographers who have described pertinent details concerning his life situations, moods, and other emotional reactions during this adolescent period of his life. The relevance of this type of computer technology to psychohistorical research and clinical psychiatry is suggested.

  9. Spatial-temporal features of thermal images for Carpal Tunnel Syndrome detection

    NASA Astrophysics Data System (ADS)

    Estupinan Roldan, Kevin; Ortega Piedrahita, Marco A.; Benitez, Hernan D.

    2014-02-01

    Disorders associated with repeated trauma account for about 60% of all occupational illnesses, with Carpal Tunnel Syndrome (CTS) being the most frequently consulted today. Infrared Thermography (IT) has come to play an important role in the field of medicine. IT is non-invasive and detects diseases by measuring temperature variations. IT represents a possible alternative to prevalent methods for diagnosis of CTS (i.e. nerve conduction studies and electromyography). This work presents a set of spatial-temporal features extracted from thermal images taken of healthy and ill patients. Support Vector Machine (SVM) classifiers test this feature space with Leave One Out (LOO) validation error. The results of the proposed approach show linear separability and lower validation errors when compared to features used in previous works that do not account for temperature spatial variability.

  10. Accurate Determination of the Frequency Response Function of Submerged and Confined Structures by Using PZT-Patches†.

    PubMed

    Presas, Alexandre; Valentin, David; Egusquiza, Eduard; Valero, Carme; Egusquiza, Mònica; Bossio, Matias

    2017-03-22

    Accurately determining the dynamic response of a structure is of relevant interest in many engineering applications. Particularly, it is of paramount importance to determine the Frequency Response Function (FRF) for structures subjected to dynamic loads in order to avoid resonance and fatigue problems that can drastically reduce their useful life. One challenging case is the experimental determination of the FRF of submerged and confined structures, such as hydraulic turbines, which are greatly affected by dynamic problems, as reported in many cases in the past. The utilization of classical, calibrated exciters such as instrumented hammers or shakers to determine the FRF in such structures can be very complex due to the confinement of the structure and because their use can disturb the boundary conditions, affecting the experimental results. For such cases, Piezoelectric Patches (PZTs), which are very light, thin and small, could be a very good option. Nevertheless, the main drawback of these exciters is that their calibration as dynamic force transducers (voltage/force relationship) has not been successfully obtained in the past. Therefore, in this paper, a method to accurately determine the FRF of submerged and confined structures by using PZTs is developed and validated. The method consists of experimentally determining some characteristic parameters that define the FRF, with an uncalibrated PZT exciting the structure. These experimentally determined parameters are then introduced into a validated numerical model of the tested structure. In this way, the FRF of the structure can be estimated with good accuracy. With respect to previous studies, where only the natural frequencies and mode shapes were considered, this paper discusses and experimentally demonstrates the best excitation characteristics for also obtaining the damping ratios, and proposes a procedure to fully determine the FRF. The method proposed here has been validated for the structure vibrating in air by comparing the FRF obtained experimentally with a calibrated exciter (impact hammer) against the FRF obtained with the described method. Finally, the same methodology has been applied to the structure submerged and close to a rigid wall, where it is extremely important not to modify the boundary conditions for an accurate determination of the FRF. As experimentally shown in this paper, in such cases the use of PZTs combined with the proposed methodology gives much more accurate estimations of the FRF than other calibrated exciters typically used for the same purpose. Therefore, the validated methodology proposed in this paper can be used to obtain the FRF of a generic submerged and confined structure, without a previous calibration of the PZT.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bonior, Jason D; Hu, Zhen; Guo, Terry N.

    This letter presents an experimental demonstration of software-defined-radio-based wireless tomography using computer-hosted radio devices called Universal Software Radio Peripheral (USRP). This experimental brief follows our vision and previous theoretical study of wireless tomography that combines wireless communication and RF tomography to provide a novel approach to remote sensing. Automatic data acquisition is performed inside an RF anechoic chamber. Semidefinite relaxation is used for phase retrieval, and the Born iterative method is utilized for imaging the target. Experimental results are presented, validating our vision of wireless tomography.

  12. Methods for Reachability-based Hybrid Controller Design

    DTIC Science & Technology

    2012-05-10

    approaches for airport runways (Teo and Tomlin, 2003). The results of the reachability calculations were validated in extensive simulations as well as ... UAV flight experiments (Jang and Tomlin, 2005; Teo, 2005). While the focus of these previous applications lies largely in safety verification, the work ... B([15, 0], a0) × [−π, π]) \ V, ∀ qi ∈ Q, where a0 = 30 m is the protected radius (chosen based upon published data of the wingspan of a Boeing KC-135

  13. Revisiting the Schönbein ozone measurement methodology

    NASA Astrophysics Data System (ADS)

    Ramírez-González, Ignacio A.; Añel, Juan A.; Saiz-López, Alfonso; García-Feal, Orlando; Cid, Antonio; Mejuto, Juan Carlos; Gimeno, Luis

    2017-04-01

    Through the 19th century, the Schönbein method gained a lot of popularity because of the ease with which it allowed tropospheric ozone to be measured. Traditionally it has been considered that Schönbein measurements are not accurate enough to be useful. Detractors of this method argue that it is sensitive to meteorological conditions, the most important being the influence of relative humidity. As a consequence, the data obtained by this method have usually been discarded. Here we revisit this method, taking into account that values measured during the 19th century were taken using different measurement papers. We explore several concentrations of starch and potassium iodide, the basis of this measurement method. Our results are compared with previous ones in the literature. The validity of the Schönbein methodology is discussed, taking into account humidity and other meteorological variables.

  14. Local discretization method for overdamped Brownian motion on a potential with multiple deep wells.

    PubMed

    Nguyen, P T T; Challis, K J; Jack, M W

    2016-11-01

    We present a general method for transforming the continuous diffusion equation describing overdamped Brownian motion on a time-independent potential with multiple deep wells to a discrete master equation. The method is based on an expansion in localized basis states of local metastable potentials that match the full potential in the region of each potential well. Unlike previous basis methods for discretizing Brownian motion on a potential, this approach is valid for periodic potentials with varying multiple deep wells per period and can also be applied to nonperiodic systems. We apply the method to a range of potentials and find that potential wells that are deep compared to five times the thermal energy can be associated with a discrete localized state while shallower wells are better incorporated into the local metastable potentials of neighboring deep potential wells.

  15. Local discretization method for overdamped Brownian motion on a potential with multiple deep wells

    NASA Astrophysics Data System (ADS)

    Nguyen, P. T. T.; Challis, K. J.; Jack, M. W.

    2016-11-01

    We present a general method for transforming the continuous diffusion equation describing overdamped Brownian motion on a time-independent potential with multiple deep wells to a discrete master equation. The method is based on an expansion in localized basis states of local metastable potentials that match the full potential in the region of each potential well. Unlike previous basis methods for discretizing Brownian motion on a potential, this approach is valid for periodic potentials with varying multiple deep wells per period and can also be applied to nonperiodic systems. We apply the method to a range of potentials and find that potential wells that are deep compared to five times the thermal energy can be associated with a discrete localized state while shallower wells are better incorporated into the local metastable potentials of neighboring deep potential wells.

  16. An operant-based detection method for inferring tinnitus in mice.

    PubMed

    Zuo, Hongyan; Lei, Debin; Sivaramakrishnan, Shobhana; Howie, Benjamin; Mulvany, Jessica; Bao, Jianxin

    2017-11-01

    Subjective tinnitus is a hearing disorder in which a person perceives sound when no external sound is present. It can be acute or chronic. Because our current understanding of its pathology is incomplete, no effective cures have yet been established. Mouse models are useful for studying the pathophysiology of tinnitus as well as for developing therapeutic treatments. We have developed a new method for determining acute and chronic tinnitus in mice, called sound-based avoidance detection (SBAD). The SBAD method utilizes one paradigm to detect tinnitus and another paradigm to monitor possible confounding factors, such as motor impairment, loss of motivation, and deficits in learning and memory. The SBAD method has succeeded in monitoring both acute and chronic tinnitus in mice. Its detection ability is further validated by functional studies demonstrating an abnormal increase in neuronal activity in the inferior colliculus of mice that had previously been identified as having tinnitus by the SBAD method. The SBAD method provides a new means by which investigators can detect tinnitus in a single mouse accurately and with more control over potential confounding factors than existing methods. This work establishes a new behavioral method for detecting tinnitus in mice. The detection outcome is consistent with functional validation. One key advantage of mouse models is they provide researchers the opportunity to utilize an extensive array of genetic tools. This new method could lead to a deeper understanding of the molecular pathways underlying tinnitus pathology. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Automatic brain caudate nuclei segmentation and classification in diagnostic of Attention-Deficit/Hyperactivity Disorder.

    PubMed

    Igual, Laura; Soliva, Joan Carles; Escalera, Sergio; Gimeno, Roger; Vilarroya, Oscar; Radeva, Petia

    2012-12-01

    We present a fully automatic diagnostic imaging test for Attention-Deficit/Hyperactivity Disorder diagnosis assistance based on previously found evidence of caudate nucleus volumetric abnormalities. The proposed method consists of several steps: a new automatic method for external and internal segmentation of the caudate based on Machine Learning methodologies, and the definition of a set of new volume relation features, 3D Dissociated Dipoles, used for caudate representation and classification. We separately validate these contributions using real data from a pediatric population and show precise internal caudate segmentation and the discrimination power of the diagnostic test, with significant performance improvements in comparison to other state-of-the-art methods. Copyright © 2012 Elsevier Ltd. All rights reserved.

  18. Optical information encryption based on incoherent superposition with the help of the QR code

    NASA Astrophysics Data System (ADS)

    Qin, Yi; Gong, Qiong

    2014-01-01

    In this paper, a novel optical information encryption approach is proposed with the help of the QR code. The method is based on the concept of incoherent superposition, which we introduce for the first time. The information to be encrypted is first transformed into the corresponding QR code, and thereafter the QR code is further encrypted analytically into two phase-only masks by use of the intensity superposition of two diffraction wave fields. The proposed method has several advantages over previous interference-based methods, such as a higher security level, better robustness against noise attack, and a more relaxed working condition. Numerical simulation results and results captured with an actual smartphone are shown to validate our proposal.

  19. Morphology vs morphokinetics: a retrospective comparison of inter-observer and intra-observer agreement between embryologists on blastocysts with known implantation outcome.

    PubMed

    Adolfsson, Emma; Andershed, Anna Nowosad

    2018-06-18

    Our primary aim was to compare morphology and morphokinetics with respect to inter- and intra-observer agreement for blastocysts with known implantation outcome. Our secondary aim was to validate the ability of the morphokinetic parameters to predict pregnancy using a previously published selection algorithm, and to compare this to standard morphology assessment. Two embryologists made independent blinded annotations on two occasions, using time-lapse images and morphology evaluations based on the Gardner-Schoolcraft criteria, of 99 blastocysts with known implantation outcome. Inter- and intra-observer agreement was calculated and compared using the two methods. The embryos were grouped based on their morphological score, and on their morphokinetic class using a previously published selection algorithm. The implantation rates for each group were calculated and compared. There was moderate agreement for morphology, with agreement on the same embryo score in 55 of 99 cases. The highest agreement rate was found for expansion grade, followed by trophectoderm and inner cell mass. Correlation with pregnancy was inconclusive. For morphokinetics, almost perfect agreement was found for early and late embryo development events, and strong agreement for day-2 and day-3 events. When applying the selection algorithm, the embryo distributions were uneven, and correlation with pregnancy was inconclusive. Time-lapse annotation is consistent and accurate, but our external validation of a previously published selection algorithm was unsuccessful.

  20. Discovery and validation of information theory-based transcription factor and cofactor binding site motifs.

    PubMed

    Lu, Ruipeng; Mucaki, Eliseos J; Rogan, Peter K

    2017-03-17

    Data from ChIP-seq experiments can be used to derive the genome-wide binding specificities of transcription factors (TFs) and other regulatory proteins. We analyzed 765 ENCODE ChIP-seq peak datasets of 207 human TFs with a novel motif discovery pipeline based on recursive, thresholded entropy minimization. This approach, while obviating the need to compensate for skewed nucleotide composition, distinguishes true binding motifs from noise, quantifies the strengths of individual binding sites based on computed affinity and detects adjacent cofactor binding sites that coordinate with the targets of primary, immunoprecipitated TFs. We obtained contiguous and bipartite information theory-based position weight matrices (iPWMs) for 93 sequence-specific TFs, discovered 23 cofactor motifs for 127 TFs and revealed six high-confidence novel motifs. The reliability and accuracy of these iPWMs were determined via four independent validation methods, including the detection of experimentally proven binding sites, explanation of the effects of characterized SNPs, comparison with previously published motifs and statistical analyses. We also predict previously unreported TF coregulatory interactions (e.g. TF complexes). These iPWMs constitute a powerful tool for predicting the effects of sequence variants in known binding sites, performing mutation analysis on regulatory SNPs and predicting previously unrecognized binding sites and target genes. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
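
The information-theoretic weighting behind a position weight matrix can be illustrated with the standard per-column information content for DNA, R = 2 − H bits. This is a generic sketch of that textbook quantity under a uniform background, not the paper's recursive entropy-minimization pipeline, and the toy motif is invented:

```python
import math

def column_information(freqs):
    """Shannon information content (bits) of one motif column: R = 2 - H
    for DNA's 4-letter alphabet (background correction omitted for brevity)."""
    h = -sum(f * math.log2(f) for f in freqs if f > 0.0)
    return 2.0 - h

# Toy 3-column motif frequency matrix (each column over A, C, G, T)
motif = [
    [1.00, 0.00, 0.00, 0.00],   # fully conserved A -> 2 bits
    [0.50, 0.50, 0.00, 0.00],   # A/C split         -> 1 bit
    [0.25, 0.25, 0.25, 0.25],   # uniform           -> 0 bits
]
print([column_information(col) for col in motif])   # prints [2.0, 1.0, 0.0]
```

Columns with near-zero information are effectively noise, which is the intuition behind thresholded entropy minimization distinguishing true binding motifs from background.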

  1. How to test validity in orthodontic research: a mixed dentition analysis example.

    PubMed

    Donatelli, Richard E; Lee, Shin-Jae

    2015-02-01

    The data used to test the validity of a prediction method should be different from the data used to generate the prediction model. In this study, we explored whether an independent data set is mandatory for testing the validity of a new prediction method and how validity can be tested without independent new data. Several validation methods were compared in an example using the data from a mixed dentition analysis with a regression model. The validation errors of real mixed dentition analysis data and simulation data were analyzed for increasingly large data sets. The validation results of both the real and the simulation studies demonstrated that the leave-1-out cross-validation method had the smallest errors. The largest errors occurred in the traditional simple validation method. The differences between the validation methods diminished as the sample size increased. The leave-1-out cross-validation method seems to be an optimal validation method for improving the prediction accuracy in a data set with limited sample sizes. Copyright © 2015 American Association of Orthodontists. Published by Elsevier Inc. All rights reserved.
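
The leave-1-out procedure the study found optimal can be sketched for a simple regression model: each observation is predicted by a model fit to all the other observations, and the squared prediction errors are averaged. A minimal sketch with hypothetical tooth-width numbers, not the study's data:

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

def loo_cv_error(xs, ys):
    """Leave-one-out cross-validation: refit without each point,
    predict it, and average the squared prediction errors."""
    errs = []
    for i in range(len(xs)):
        tx, ty = xs[:i] + xs[i + 1:], ys[:i] + ys[i + 1:]
        a, b = fit_line(tx, ty)
        errs.append((ys[i] - (a + b * xs[i])) ** 2)
    return sum(errs) / len(errs)

# Hypothetical mixed dentition data: predict arch-segment width from incisor width
xs = [20.5, 21.0, 22.3, 23.1, 24.0, 25.2, 26.1]
ys = [21.0, 21.4, 22.0, 22.9, 23.5, 24.6, 25.0]
print(round(loo_cv_error(xs, ys), 3))
```

Unlike the traditional simple validation split, every observation serves as a test case exactly once, which is why this estimator is attractive for limited sample sizes.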

  2. Water quality assessment with hierarchical cluster analysis based on Mahalanobis distance.

    PubMed

    Du, Xiangjun; Shao, Fengjing; Wu, Shunyao; Zhang, Hanlin; Xu, Si

    2017-07-01

    Water quality assessment is crucial for the assessment of marine eutrophication, prediction of harmful algal blooms, and environmental protection. Previous studies have developed many numeric modeling methods and data-driven approaches for water quality assessment. Cluster analysis, an approach widely used for grouping data, has also been employed. However, there are complex correlations between water quality variables, which play important roles in water quality assessment but have often been overlooked. In this paper, we analyze correlations between water quality variables and propose an alternative method for water quality assessment using hierarchical cluster analysis based on Mahalanobis distance. Further, we cluster water quality data collected from the coastal waters of the Bohai Sea and North Yellow Sea of China, and apply the clustering results to evaluate water quality. To evaluate validity, we also cluster the water quality data with cluster analysis based on Euclidean distance, which is widely adopted in previous studies. The results show that our method is more suitable for water quality assessment with many correlated water quality variables. To our knowledge, this is the first attempt to apply Mahalanobis distance to coastal water quality assessment.
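
The key property of Mahalanobis distance used here is that, unlike Euclidean distance, it discounts differences aligned with the correlation structure of the variables. A minimal two-variable sketch; the covariance matrix and points are illustrative, not from the study:

```python
import math

def mahalanobis_2d(u, v, cov):
    """Mahalanobis distance between two 2-D points for a given covariance
    matrix; correlation between variables is removed via its inverse."""
    (a, b), (c, d) = cov
    det = a * d - b * c
    inv = [[d / det, -b / det], [-c / det, a / det]]
    dx = [u[0] - v[0], u[1] - v[1]]
    # d^2 = dx^T * cov^-1 * dx
    d2 = sum(dx[i] * inv[i][j] * dx[j] for i in range(2) for j in range(2))
    return math.sqrt(d2)

# Two strongly correlated water-quality variables (hypothetical units)
cov = [[1.0, 0.8], [0.8, 1.0]]
along = mahalanobis_2d([0, 0], [1, 1], cov)     # step along the correlation
across = mahalanobis_2d([0, 0], [1, -1], cov)   # step against it
print(round(along, 3), round(across, 3))        # Euclidean gives sqrt(2) for both
```

Hierarchical clustering then proceeds as usual, but on a pairwise distance matrix built from this metric instead of the Euclidean one.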

  3. Asthma Symptom Utility Index: Reliability, validity, responsiveness and the minimal important difference in adult asthma patients

    PubMed Central

    Bime, Christian; Wei, Christine Y.; Holbrook, Janet T.; Sockrider, Marianna M.; Revicki, Dennis A.; Wise, Robert A.

    2012-01-01

    Background The evaluation of asthma symptoms is a core outcome measure in asthma clinical research. The Asthma Symptom Utility Index (ASUI) was developed to assess frequency and severity of asthma symptoms. The psychometric properties of the ASUI are not well characterized and a minimal important difference (MID) is not established. Objectives We assessed the reliability, validity, and responsiveness to change of the ASUI in a population of adult asthma patients. We also sought to determine the MID for the ASUI. Methods Adult asthma patients (n = 1648) from two previously completed multicenter randomized trials were included. Demographic information, spirometry, ASUI scores, and other asthma questionnaire scores were obtained at baseline and during follow-up visits. Participants also kept a daily asthma diary. Results Internal consistency reliability of the ASUI was 0.74 (Cronbach’s alpha). Test-retest reliability was 0.76 (intra-class correlation). Construct validity was demonstrated by significant correlations between ASUI scores and Asthma Control Questionnaire (ACQ) scores (Spearman correlation r = −0.79, 95% CI [−0.85, −0.75], P<0.001) and Mini Asthma Quality of Life Questionnaire (Mini AQLQ) scores (r = 0.59, 95% CI [0.51, 0.61], P<0.001). Responsiveness to change was demonstrated, with significant differences between mean changes in ASUI score across groups of participants differing by 10% in the percent predicted FEV1 (P<0.001), and by 0.5 points in ACQ score (P < 0.001). Anchor-based methods and statistical methods support an MID for the ASUI of 0.09 points. Conclusions The ASUI is reliable, valid, and responsive to changes in asthma control over time. The MID of the ASUI (range of scores 0–1) is 0.09. PMID:23026499
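
The internal-consistency statistic reported above, Cronbach's alpha, can be computed directly from the item variances and the variance of the total scores. A minimal sketch with an invented 4-item, 6-respondent score table, not the ASUI data:

```python
def cronbach_alpha(items):
    """Cronbach's alpha: `items` is a list of item-score lists,
    all scored over the same respondents in the same order."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    # Total score per respondent across all items
    totals = [sum(item[j] for item in items) for j in range(n)]
    item_var_sum = sum(var(item) for item in items)
    return (k / (k - 1)) * (1 - item_var_sum / var(totals))

# Hypothetical 4-item symptom scale scored for 6 respondents
items = [
    [2, 3, 3, 4, 4, 5],
    [2, 2, 3, 4, 5, 5],
    [1, 3, 3, 3, 4, 5],
    [2, 3, 4, 4, 4, 5],
]
print(round(cronbach_alpha(items), 2))
```

Alpha approaches 1 as the items covary strongly relative to their individual variances; values around 0.7-0.8, as for the ASUI, are conventionally taken as acceptable internal consistency.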

  4. Removal of BCG artifacts from EEG recordings inside the MR scanner: a comparison of methodological and validation-related aspects.

    PubMed

    Vanderperren, Katrien; De Vos, Maarten; Ramautar, Jennifer R; Novitskiy, Nikolay; Mennes, Maarten; Assecondi, Sara; Vanrumste, Bart; Stiers, Peter; Van den Bergh, Bea R H; Wagemans, Johan; Lagae, Lieven; Sunaert, Stefan; Van Huffel, Sabine

    2010-04-15

    Multimodal approaches are of growing interest in the study of neural processes. To this end much attention has been paid to the integration of electroencephalographic (EEG) and functional magnetic resonance imaging (fMRI) data because of their complementary properties. However, the simultaneous acquisition of both types of data causes serious artifacts in the EEG, with amplitudes that may be much larger than those of EEG signals themselves. The most challenging of these artifacts is the ballistocardiogram (BCG) artifact, caused by pulse-related electrode movements inside the magnetic field. Despite numerous efforts to find a suitable approach to remove this artifact, still a considerable discrepancy exists between current EEG-fMRI studies. This paper attempts to clarify several methodological issues regarding the different approaches with an extensive validation based on event-related potentials (ERPs). More specifically, Optimal Basis Set (OBS) and Independent Component Analysis (ICA) based methods were investigated. Their validation was not only performed with measures known from previous studies on the average ERPs, but most attention was focused on task-related measures, including their use on trial-to-trial information. These more detailed validation criteria enabled us to find a clearer distinction between the most widely used cleaning methods. Both OBS and ICA proved to be able to yield equally good results. However, ICA methods needed more parameter tuning, thereby making OBS more robust and easy to use. Moreover, applying OBS prior to ICA can optimize the data quality even more, but caution is recommended since the effect of the additional ICA step may be strongly subject-dependent. Copyright 2010 Elsevier Inc. All rights reserved.

  5. An Arrayed Genome-Scale Lentiviral-Enabled Short Hairpin RNA Screen Identifies Lethal and Rescuer Gene Candidates

    PubMed Central

    Bhinder, Bhavneet; Antczak, Christophe; Ramirez, Christina N.; Shum, David; Liu-Sullivan, Nancy; Radu, Constantin; Frattini, Mark G.

    2013-01-01

    RNA interference technology is becoming an integral tool for target discovery and validation. With perhaps the exception of only a few studies published using arrayed short hairpin RNA (shRNA) libraries, most reports have been based on either pooled siRNA or shRNA libraries, or arrayed siRNA libraries. For this purpose, we have developed a workflow and performed an arrayed genome-scale shRNA lethality screen against the TRC1 library in HeLa cells. The resulting targets would be a valuable resource of candidates toward a better understanding of cellular homeostasis. Using a high-stringency hit nomination method encompassing criteria of at least three active hairpins per gene and filtering for potential off-target effects (OTEs), referred to as the Bhinder–Djaballah analysis method, we identified 1,252 lethal and 6 rescuer gene candidates, knockdown of which resulted in severe cell death or enhanced growth, respectively. Cross-referencing individual hairpins with the TRC1 validated clone database, 239 of the 1,252 candidates were deemed independently validated with at least three validated clones. Through our systematic OTE analysis, we identified 31 microRNAs (miRNAs) in lethal genes and 2 in rescuer genes, all having a seed heptamer mimic in the corresponding shRNA hairpins and a likely cause of the OTEs observed in our screen, perhaps revealing a previously unknown essentiality of these miRNAs in cellular viability. Taken together, we report on a methodology for performing large-scale arrayed shRNA screens, a comprehensive analysis method to nominate high-confidence hits, and a performance assessment of the TRC1 library highlighting the intracellular inefficiencies of shRNA processing in general. PMID:23198867

  6. Developmental and internal validation of a novel 13 loci STR multiplex method for Cannabis sativa DNA profiling.

    PubMed

    Houston, Rachel; Birck, Matthew; Hughes-Stamm, Sheree; Gangitano, David

    2017-05-01

    Marijuana (Cannabis sativa L.) is a plant cultivated and trafficked worldwide as a source of fiber (hemp), medicine, and intoxicant. The development of a validated method using molecular techniques such as short tandem repeats (STRs) could serve as an intelligence tool to link multiple cases by means of genetic individualization or association of cannabis samples. For this purpose, a 13 loci STR multiplex method was developed, optimized, and validated according to relevant ISFG and SWGDAM guidelines. The STR multiplex consists of 13 previously described C. sativa STR loci: ANUCS501, 9269, 4910, 5159, ANUCS305, 9043, B05, 1528, 3735, CS1, D02, C11, and H06. A sequenced allelic ladder consisting of 56 alleles was designed to accurately genotype 101 C. sativa samples from three seizures provided by a U.S. Customs and Border Protection crime lab. Using an optimal range of DNA (0.5-1.0 ng), validation studies revealed well-balanced electropherograms (inter-locus balance range: 0.500-1.296), relatively balanced heterozygous peaks (mean peak height ratio of 0.83 across all loci) with minimal artifacts and stutter ratio (mean stutter of 0.021 across all loci). This multi-locus system is relatively sensitive (0.13 ng of template DNA) with a combined power of discrimination of 1 in 55 million. The 13 STR panel was found to be species specific for C. sativa; however, non-specific peaks were produced with Humulus lupulus. The results of this research demonstrate the robustness and applicability of this 13 loci STR system for forensic DNA profiling of marijuana samples. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Sorbent, Sublimation, and Icing Modeling Methods: Experimental Validation and Application to an Integrated MTSA Subassembly Thermal Model

    NASA Technical Reports Server (NTRS)

    Bower, Chad; Padilla, Sebastian; Iacomini, Christie; Paul, Heather L.

    2010-01-01

    This paper details the validation of modeling methods for the three core components of a Metabolic heat regenerated Temperature Swing Adsorption (MTSA) subassembly, developed for use in a Portable Life Support System (PLSS). The first core component in the subassembly is a sorbent bed, used to capture and reject metabolically produced carbon dioxide (CO2). The sorbent bed performance can be augmented with a temperature swing driven by a liquid CO2 (LCO2) sublimation heat exchanger (SHX) for cooling the sorbent bed, and a condensing, icing heat exchanger (CIHX) for warming the sorbent bed. As part of the overall MTSA effort, scaled design validation test articles for each of these three components have been independently tested in laboratory conditions. Previously described modeling methodologies developed for implementation in Thermal Desktop and SINDA/FLUINT are reviewed and updated, their application in test article models outlined, and the results of those model correlations relayed. Assessment of the applicability of each modeling methodology to the challenge of simulating the response of the test articles and their extensibility to a full scale integrated subassembly model is given. The independently verified and validated modeling methods are applied to the development of a MTSA subassembly prototype model and predictions of the subassembly performance are given. These models and modeling methodologies capture simulation of several challenging and novel physical phenomena in the Thermal Desktop and SINDA/FLUINT software suite. Novel methodologies include CO2 adsorption front tracking and associated thermal response in the sorbent bed, heat transfer associated with sublimation of entrained solid CO2 in the SHX, and water mass transfer in the form of ice as low as 210 K in the CIHX.

  8. The medline UK filter: development and validation of a geographic search filter to retrieve research about the UK from OVID medline.

    PubMed

    Ayiku, Lynda; Levay, Paul; Hudson, Tom; Craven, Jenny; Barrett, Elizabeth; Finnegan, Amy; Adams, Rachel

    2017-07-13

    A validated geographic search filter for the retrieval of research about the United Kingdom (UK) from bibliographic databases had not previously been published. To develop and validate a geographic search filter to retrieve research about the UK from OVID medline with high recall and precision. Three gold standard sets of references were generated using the relative recall method. The sets contained references to studies about the UK which had informed National Institute for Health and Care Excellence (NICE) guidance. The first and second sets were used to develop and refine the medline UK filter. The third set was used to validate the filter. Recall, precision and number-needed-to-read (NNR) were calculated using a case study. The validated medline UK filter demonstrated 87.6% relative recall against the third gold standard set. In the case study, the medline UK filter demonstrated 100% recall, 11.4% precision and a NNR of nine. A validated geographic search filter to retrieve research about the UK with high recall and precision has been developed. The medline UK filter can be applied to systematic literature searches in OVID medline for topics with a UK focus. © 2017 Crown copyright. Health Information and Libraries Journal © 2017 Health Libraries Group. This article is published with the permission of the Controller of HMSO and the Queen's Printer for Scotland.
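The case-study metrics above are related by NNR = 1/precision, rounded up to whole records. A minimal sketch, using hypothetical retrieval counts chosen only to reproduce the reported figures (100% recall, 11.4% precision, NNR of nine):

```python
import math

def filter_metrics(tp, fp, fn):
    """Recall, precision and number-needed-to-read for a search filter.
    tp: relevant records retrieved; fp: irrelevant records retrieved;
    fn: relevant records missed by the filter."""
    recall = tp / (tp + fn)
    precision = tp / (tp + fp)
    nnr = math.ceil(1 / precision)  # records read per relevant record found
    return recall, precision, nnr

recall, precision, nnr = filter_metrics(tp=8, fp=62, fn=0)
print(recall, round(precision, 3), nnr)  # → 1.0 0.114 9
```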

  9. Deformable image registration for tissues with large displacements

    PubMed Central

    Huang, Xishi; Ren, Jing; Green, Mark

    2017-01-01

    Abstract. Image registration for internal organs and soft tissues is considered extremely challenging due to organ shifts and tissue deformation caused by patients’ movements such as respiration and repositioning. In our previous work, we proposed a fast registration method for deformable tissues with small rotations. We extend our method to deformable registration of soft tissues with large displacements. We analyzed the deformation field of the liver by decomposing the deformation into shift, rotation, and pure deformation components and concluded that in many clinical cases, the liver deformation contains large rotations and small deformations. This analysis justified the use of linear elastic theory in our image registration method. We also proposed a region-based neuro-fuzzy transformation model to seamlessly stitch together local affine and local rigid models in different regions. We have performed the experiments on a liver MRI image set and showed the effectiveness of the proposed registration method. We have also compared the performance of the proposed method with the previous method on tissues with large rotations and showed that the proposed method outperformed the previous method when dealing with the combination of pure deformation and large rotations. Validation results show that we can achieve a target registration error of 1.87±0.87 mm and an average centerline distance error of 1.28±0.78 mm. The proposed technique has the potential to significantly improve registration capabilities and the quality of intraoperative image guidance. To the best of our knowledge, this is the first time that the complex displacement of the liver is explicitly separated into local pure deformation and rigid motion. PMID:28149924

  10. A rapid and efficient newly established method to detect COL1A1-PDGFB gene fusion in dermatofibrosarcoma protuberans

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yokoyama, Yoko; Shimizu, Akira; Okada, Etsuko

    Highlights: • We developed a new method to rapidly identify the COL1A1-PDGFB fusion in DFSP. • A new PCR method using a single primer pair detected the COL1A1-PDGFB fusion in DFSP. • This is the first report of DFSP with a novel COL1A1 breakpoint in exon 5. -- Abstract: The detection of fusion transcripts of the collagen type 1α1 (COL1A1) and platelet-derived growth factor-BB (PDGFB) genes by genetic analysis has been recognized as a reliable and valuable molecular tool for the diagnosis of dermatofibrosarcoma protuberans (DFSP). To detect the COL1A1-PDGFB fusion, most previous reports performed reverse transcription polymerase chain reaction (RT-PCR) using multiplex forward primers from COL1A1. However, this approach poses technical difficulties with respect to the handling of multiple primers and reagents in the procedure. The objective of this study was to establish a rapid, easy, and efficient one-step PCR method using only a single primer pair to detect the fusion transcripts of COL1A1 and PDGFB in DFSP. To validate the new method, we compared the results of RT-PCR in five patients with DFSP between the previous method using multiplex primers and our established one-step RT-PCR using a single primer pair. In all cases of DFSP, the COL1A1-PDGFB fusion was detected by both the previous method and the newly established one-step PCR. Importantly, we detected a novel COL1A1 breakpoint in exon 5. The newly developed method is valuable for rapidly identifying COL1A1-PDGFB fusion transcripts in DFSP.

  11. Prediction of early death among patients enrolled in phase I trials: development and validation of a new model based on platelet count and albumin.

    PubMed

    Ploquin, A; Olmos, D; Lacombe, D; A'Hern, R; Duhamel, A; Twelves, C; Marsoni, S; Morales-Barrera, R; Soria, J-C; Verweij, J; Voest, E E; Schöffski, P; Schellens, J H; Kramar, A; Kristeleit, R S; Arkenau, H-T; Kaye, S B; Penel, N

    2012-09-25

    Selecting patients with 'sufficient life expectancy' for Phase I oncology trials remains challenging. The Royal Marsden Hospital Score (RMS) previously identified high-risk patients as those with ≥ 2 of the following: albumin <35 g l(-1); LDH > upper limit of normal; >2 metastatic sites. This study developed an alternative prognostic model, and compared its performance with that of the RMS. The primary end point was the 90-day mortality rate. The new model was developed from the same database as RMS, but it used Chi-squared Automatic Interaction Detection (CHAID). The ROC characteristics of both methods were then validated in an independent database of 324 patients enrolled in European Organisation for Research and Treatment of Cancer Phase I trials of cytotoxic agents between 2000 and 2009. The CHAID method identified high-risk patients as those with albumin <33 g l(-1), or albumin ≥ 33 g l(-1) together with platelet counts ≥ 400,000 mm(-3). In the validation data set, the rates of correctly classified patients were 0.79 vs 0.67 for the CHAID model and RMS, respectively. The negative predictive values (NPV) were similar for the CHAID model and RMS. The CHAID model and RMS provided a similarly high level of NPV, but the CHAID model gave better accuracy in the validation set. Both the CHAID model and RMS may improve the screening process in phase I trials.
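The CHAID rule described above reduces to a two-variable decision. A sketch of the classification step as stated in the abstract (thresholds from the text; function name and example values are ours):

```python
def chaid_high_risk(albumin_g_per_l, platelets_per_mm3):
    """High-risk rule from the CHAID model as described in the abstract:
    albumin < 33 g/l, or albumin >= 33 g/l with platelets >= 400,000/mm3."""
    if albumin_g_per_l < 33:
        return True
    return platelets_per_mm3 >= 400_000

print(chaid_high_risk(30, 250_000))  # → True  (low albumin)
print(chaid_high_risk(36, 450_000))  # → True  (high platelet count)
print(chaid_high_risk(36, 250_000))  # → False
```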

  12. Calibration and validation of wearable monitors.

    PubMed

    Bassett, David R; Rowlands, Alex; Trost, Stewart G

    2012-01-01

    Wearable monitors are increasingly being used to objectively monitor physical activity in research studies within the field of exercise science. Calibration and validation of these devices are vital to obtaining accurate data. This article is aimed primarily at the physical activity measurement specialist, although the end user who is conducting studies with these devices also may benefit from knowing about this topic. Initially, wearable physical activity monitors should undergo unit calibration to ensure interinstrument reliability. The next step is to simultaneously collect both raw signal data (e.g., acceleration) from the wearable monitors and rates of energy expenditure, so that algorithms can be developed to convert the direct signals into energy expenditure. This process should use multiple wearable monitors and a large and diverse subject group and should include a wide range of physical activities commonly performed in daily life (from sedentary to vigorous). New methods of calibration now use "pattern recognition" approaches to train the algorithms on various activities, and they provide estimates of energy expenditure that are much better than those previously available with the single-regression approach. Once a method of predicting energy expenditure has been established, the next step is to examine its predictive accuracy by cross-validating it in other populations. In this article, we attempt to summarize the best practices for calibration and validation of wearable physical activity monitors. Finally, we conclude with some ideas for future research that will move the field of physical activity measurement forward.

  13. Development and Validation of an Agency for Healthcare Research and Quality Indicator for Mortality After Congenital Heart Surgery Harmonized With Risk Adjustment for Congenital Heart Surgery (RACHS-1) Methodology.

    PubMed

    Jenkins, Kathy J; Koch Kupiec, Jennifer; Owens, Pamela L; Romano, Patrick S; Geppert, Jeffrey J; Gauvreau, Kimberlee

    2016-05-20

    The National Quality Forum previously approved a quality indicator for mortality after congenital heart surgery developed by the Agency for Healthcare Research and Quality (AHRQ). Several parameters of the validated Risk Adjustment for Congenital Heart Surgery (RACHS-1) method were included, but others differed. As part of the National Quality Forum endorsement maintenance process, developers were asked to harmonize the 2 methodologies. Parameters that were identical between the 2 methods were retained. AHRQ's Healthcare Cost and Utilization Project State Inpatient Databases (SID) 2008 were used to select optimal parameters where differences existed, with a goal to maximize model performance and face validity. Inclusion criteria were not changed and included all discharges for patients <18 years with International Classification of Diseases, Ninth Revision, Clinical Modification procedure codes for congenital heart surgery or nonspecific heart surgery combined with congenital heart disease diagnosis codes. The final model includes procedure risk group, age (0-28 days, 29-90 days, 91-364 days, 1-17 years), low birth weight (500-2499 g), other congenital anomalies (Clinical Classifications Software 217, except for 758.xx), multiple procedures, and transfer-in status. Among 17 945 eligible cases in the SID 2008, the c statistic for model performance was 0.82. In the SID 2013 validation data set, the c statistic was 0.82. Risk-adjusted mortality rates by center ranged from 0.9% to 4.1% (5th-95th percentile). Congenital heart surgery programs can now obtain national benchmarking reports by applying AHRQ Quality Indicator software to hospital administrative data, based on the harmonized RACHS-1 method, with high discrimination and face validity. © 2016 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley Blackwell.
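The c statistic quoted above is a concordance probability: the chance that a randomly chosen death receives a higher predicted risk than a randomly chosen survivor, with ties counted as one half. A toy sketch with hypothetical risk scores (not SID data):

```python
def c_statistic(scores_pos, scores_neg):
    """Concordance (c statistic / AUC): fraction of (positive, negative)
    pairs where the positive case has the higher predicted risk,
    counting ties as 1/2."""
    concordant = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                concordant += 1.0
            elif p == n:
                concordant += 0.5
    return concordant / (len(scores_pos) * len(scores_neg))

# Hypothetical predicted mortality risks
deaths = [0.30, 0.22, 0.18]
survivors = [0.05, 0.10, 0.22, 0.08]
print(round(c_statistic(deaths, survivors), 2))  # → 0.88
```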

  14. Predictive validity of the UK clinical aptitude test in the final years of medical school: a prospective cohort study

    PubMed Central

    2014-01-01

    Background The UK Clinical Aptitude Test (UKCAT) was designed to address issues identified with traditional methods of selection. This study aims to examine the predictive validity of the UKCAT and compare this to traditional selection methods in the senior years of medical school. This was a follow-up study of two cohorts of students from two medical schools who had previously taken part in a study examining the predictive validity of the UKCAT in first year. Methods The sample consisted of 4th and 5th Year students who commenced their studies at the University of Aberdeen or University of Dundee medical schools in 2007. Data collected were: demographics (gender and age group), UKCAT scores; Universities and Colleges Admissions Service (UCAS) form scores; admission interview scores; Year 4 and 5 degree examination scores. Pearson’s correlations were used to examine the relationships between admissions variables, examination scores, gender and age group, and to select variables for multiple linear regression analysis to predict examination scores. Results Ninety-nine and 89 students at Aberdeen medical school from Years 4 and 5 respectively, and 51 Year 4 students in Dundee, were included in the analysis. Neither UCAS form nor interview scores were statistically significant predictors of examination performance. Conversely, the UKCAT yielded statistically significant validity coefficients between .24 and .36 in four of five assessments investigated. Multiple regression analysis showed the UKCAT made a statistically significant unique contribution to variance in examination performance in the senior years. Conclusions Results suggest the UKCAT appears to predict performance better in the later years of medical school compared to earlier years and provides modest supportive evidence for the UKCAT’s role in student selection within these institutions. 
Further research is needed to assess the predictive validity of the UKCAT against professional and behavioural outcomes as the cohort commences working life. PMID:24762134

  15. Computer Simulations of Coronary Blood Flow Through a Constriction

    DTIC Science & Technology

    2014-03-01

    interventional procedures (e.g., stent deployment). Building off previous models that have been partially validated with experimental data, this thesis continues to develop the...the artery and increase blood flow. Generally a stent, or a mesh wire tube, is permanently inserted in order to scaffold open the artery wall

  16. Crypto-Giardia antigen rapid test versus conventional modified Ziehl-Neelsen acid fast staining method for diagnosis of cryptosporidiosis.

    PubMed

    Zaglool, Dina Abdulla Muhammad; Mohamed, Amr; Khodari, Yousif Abdul Wahid; Farooq, Mian Usman

    2013-03-01

    To evaluate the validity of the Crypto-Giardia antigen rapid test (CA-RT) in comparison with the conventional modified Ziehl-Neelsen acid fast (MZN-AF) staining method for the diagnosis of cryptosporidiosis. Fifteen preserved stool samples from previously confirmed infections were used as positive controls and 40 stool samples from healthy people were used as negative controls. A total of 85 stool samples were collected from patients with suspected cryptosporidiosis over 6 months during the period from January till June, 2011. The study was conducted in the department of parasitology, central laboratory, Alnoor Specialist Hospital, Makkah, Saudi Arabia. All samples were subjected to CA-RT and the conventional MZN-AF staining method. Validation parameters including sensitivity (SN), specificity (SP), accuracy index (AI), positive predictive value (PPV), and negative predictive value (NPV) were evaluated for both tests. Out of 15 positive controls, CA-RT detected 13 (86.7%) while MZN-AF detected 11 (73.3%) positive cases. However, CA-RT detected no positive case among the 40 normal controls, whereas MZN-AF classified 2 (5%) as positive. Based on these results, the SN, SP, AI, PPV and NPV were higher for CA-RT than for the MZN-AF staining method, i.e., 86.7% vs. 73.3%, 100% vs. 95%, 96.4% vs. 89.1%, 100% vs. 84.6% and 95.2% vs. 90.5%, respectively. Out of a total of 85 suspected specimens, CA-RT detected 7 (8.2%) and MZN-AF detected 6 (7.1%) cases as positive. The CA-RT immunoassay is more valid and reliable than the MZN-AF staining method. Copyright © 2013 Hainan Medical College. Published by Elsevier B.V. All rights reserved.
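The validation parameters above follow directly from the control counts given in the abstract (13/15 positives and 40/40 negatives for CA-RT; 11/15 and 38/40 for MZN-AF). A short sketch reproducing them:

```python
def diagnostic_validity(tp, fn, tn, fp):
    """Sensitivity, specificity, accuracy index, PPV and NPV (as %)."""
    sn = tp / (tp + fn) * 100
    sp = tn / (tn + fp) * 100
    ai = (tp + tn) / (tp + fn + tn + fp) * 100
    ppv = tp / (tp + fp) * 100
    npv = tn / (tn + fn) * 100
    return [round(x, 1) for x in (sn, sp, ai, ppv, npv)]

# Counts from the abstract: 15 positive and 40 negative controls
print(diagnostic_validity(tp=13, fn=2, tn=40, fp=0))  # CA-RT  → [86.7, 100.0, 96.4, 100.0, 95.2]
print(diagnostic_validity(tp=11, fn=4, tn=38, fp=2))  # MZN-AF → [73.3, 95.0, 89.1, 84.6, 90.5]
```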

  17. Determination of Efavirenz in Human Dried Blood Spots by Reversed-Phase High Performance Liquid Chromatography with UV Detection

    PubMed Central

    Hoffman, Justin T; Rossi, Steven S; Espina-Quinto, Rowena; Letendre, Scott; Capparelli, Edmund V

    2013-01-01

    Background Previously published methods for determination of efavirenz (EFV) in human dried blood spots (DBS) employ costly and complex liquid chromatography/mass spectrometry. We describe the validation and evaluation of a simple and inexpensive high-performance liquid chromatography (HPLC) method for EFV quantification in human DBS and dried plasma spots (DPS), using ultraviolet (UV) detection appropriate for resource-limited settings. Methods 100 μl of heparinized whole blood or plasma were spotted onto blood collection cards, dried, punched, and eluted. Eluates were injected onto a C-18 reversed-phase HPLC column. EFV was separated isocratically using a potassium phosphate and ACN mobile phase. UV detection was at 245 nm. Quantitation was by use of external calibration standards. Following validation, the method was evaluated using whole blood and plasma from HIV-positive patients undergoing EFV therapy. Results Mean recovery of drug from dried blood spots was 91.5%. The method was linear over the validated concentration range of 0.3125-20.0 μg/mL. A good correlation (Spearman r=0.96) between paired plasma and DBS EFV concentrations from the clinical samples was observed, and hematocrit level was not found to be a significant determinant of the EFV DBS level. The mean observed CDBS/Cplasma ratio was 0.68. A good correlation (Spearman r=0.96) between paired plasma and DPS EFV concentrations from the clinical samples was observed. The mean percent deviation of DPS samples from plasma samples was 1.68%. Conclusions Dried whole blood spot or dried plasma spot sampling is well suited for monitoring EFV therapy in resource-limited settings, particularly when high sensitivity is not essential. PMID:23503446

  18. Segmental analysis of amphetamines in hair using a sensitive UHPLC-MS/MS method.

    PubMed

    Jakobsson, Gerd; Kronstrand, Robert

    2014-06-01

    A sensitive and robust ultra high performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) method was developed and validated for quantification of amphetamine, methamphetamine, 3,4-methylenedioxyamphetamine and 3,4-methylenedioxy methamphetamine in hair samples. Segmented hair (10 mg) was incubated in 2 M sodium hydroxide (80°C, 10 min) before liquid-liquid extraction with isooctane followed by centrifugation and evaporation of the organic phase to dryness. The residue was reconstituted in methanol:formate buffer pH 3 (20:80). The total run time was 4 min and after optimization of UHPLC-MS/MS parameters validation included selectivity, matrix effects, recovery, process efficiency, calibration model and range, lower limit of quantification, precision and bias. The calibration curve ranged from 0.02 to 12.5 ng/mg, and the recovery was between 62 and 83%. During validation the bias was less than ±7% and the imprecision was less than 5% for all analytes. In routine analysis, fortified control samples demonstrated an imprecision <13% and control samples made from authentic hair demonstrated an imprecision <26%. The method was applied to samples from a controlled study of amphetamine intake as well as forensic hair samples previously analyzed with an ultra high performance liquid chromatography time of flight mass spectrometry (UHPLC-TOF-MS) screening method. The proposed method was suitable for quantification of these drugs in forensic cases including violent crimes, autopsy cases, drug testing and re-granting of driving licences. This study also demonstrated that if hair samples are divided into several short segments, the time point for intake of a small dose of amphetamine can be estimated, which might be useful when drug facilitated crimes are investigated. Copyright © 2014 John Wiley & Sons, Ltd.

  19. Internal wave energy flux from density perturbations in nonlinear stratifications

    NASA Astrophysics Data System (ADS)

    Lee, Frank M.; Allshouse, Michael R.; Swinney, Harry L.; Morrison, P. J.

    2017-11-01

    Tidal flow over the topography at the bottom of the ocean, whose density varies with depth, generates internal gravity waves that have a significant impact on the energy budget of the ocean. Thus, understanding the energy flux (J = p v) is important, but it is difficult to measure simultaneously the pressure and velocity perturbation fields, p and v . In a previous work, a Green's-function-based method was developed to calculate the instantaneous p, v , and thus J , given a density perturbation field for a constant buoyancy frequency N. Here we extend the previous analytic Green's function work to include nonuniform N profiles, namely the tanh-shaped and linear cases, because background density stratifications that occur in the ocean and some experiments are nonlinear. In addition, we present a finite-difference method for the general case where N has an arbitrary profile. Each method is validated against numerical simulations. The methods we present can be applied to measured density perturbation data by using our MATLAB graphical user interface EnergyFlux. PJM was supported by the U.S. Department of Energy Contract DE-FG05-80ET-53088. HLS and MRA were supported by ONR Grant No. N000141110701.

  20. A partial least squares based spectrum normalization method for uncertainty reduction for laser-induced breakdown spectroscopy measurements

    NASA Astrophysics Data System (ADS)

    Li, Xiongwei; Wang, Zhe; Lui, Siu-Lung; Fu, Yangting; Li, Zheng; Liu, Jianming; Ni, Weidou

    2013-10-01

    A bottleneck of the wide commercial application of laser-induced breakdown spectroscopy (LIBS) technology is its relatively high measurement uncertainty. A partial least squares (PLS) based normalization method was proposed to improve pulse-to-pulse measurement precision for LIBS based on our previous spectrum standardization method. The proposed model utilized multi-line spectral information of the measured element and characterized the signal fluctuations due to the variation of plasma characteristic parameters (plasma temperature, electron number density, and total number density) for signal uncertainty reduction. The model was validated by the application of copper concentration prediction in 29 brass alloy samples. The results demonstrated an improvement on both measurement precision and accuracy over the generally applied normalization as well as our previously proposed simplified spectrum standardization method. The average relative standard deviation (RSD), average of the standard error (error bar), the coefficient of determination (R²), the root-mean-square error of prediction (RMSEP), and average value of the maximum relative error (MRE) were 1.80%, 0.23%, 0.992, 1.30%, and 5.23%, respectively, while those for the generally applied spectral area normalization were 3.72%, 0.71%, 0.973, 1.98%, and 14.92%, respectively.
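The relative standard deviation used above as the precision figure is simply the sample standard deviation over the mean, expressed as a percentage. A minimal sketch with hypothetical replicate intensities:

```python
from statistics import mean, stdev

def rsd_percent(values):
    """Relative standard deviation (%): the pulse-to-pulse precision
    metric, sample standard deviation divided by the mean, times 100."""
    return stdev(values) / mean(values) * 100

# Hypothetical replicate LIBS line-intensity measurements
replicates = [98.0, 100.0, 102.0]
print(round(rsd_percent(replicates), 1))  # → 2.0
```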

  1. Retrieval evaluation and distance learning from perceived similarity between endomicroscopy videos.

    PubMed

    André, Barbara; Vercauteren, Tom; Buchner, Anna M; Wallace, Michael B; Ayache, Nicholas

    2011-01-01

    Evaluating content-based retrieval (CBR) is challenging because it requires an adequate ground-truth. When the available ground-truth is limited to textual metadata such as pathological classes, retrieval results can only be evaluated indirectly, for example in terms of classification performance. In this study we first present a tool to generate perceived similarity ground-truth that enables direct evaluation of endomicroscopic video retrieval. This tool uses a four-point Likert scale and collects subjective pairwise similarities perceived by multiple expert observers. We then evaluate against the generated ground-truth a previously developed dense bag-of-visual-words method for endomicroscopic video retrieval. Confirming the results of previous indirect evaluation based on classification, our direct evaluation shows that this method significantly outperforms several other state-of-the-art CBR methods. In a second step, we propose to improve the CBR method by learning an adjusted similarity metric from the perceived similarity ground-truth. By minimizing a margin-based cost function that differentiates similar and dissimilar video pairs, we learn a weight vector applied to the visual word signatures of videos. Using cross-validation, we demonstrate that the learned similarity distance is significantly better correlated with the perceived similarity than the original visual-word-based distance.
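The margin-based learning step described above can be sketched as an iterative update of per-visual-word weights: shrink weights along dimensions that separate similar pairs, grow them along dimensions that separate dissimilar pairs. Everything below (the update rule, margin, learning rate, and data) is an illustrative simplification, not the authors' exact cost function:

```python
def learn_weights(pairs, dim, lr=0.1, margin=1.0, epochs=20):
    """Toy margin-based metric learning: learn a nonnegative weight per
    visual word so that weighted squared distances are small for similar
    pairs and large for dissimilar ones.
    `pairs` holds (signature_a, signature_b, is_similar) tuples."""
    w = [1.0] * dim
    for _ in range(epochs):
        for a, b, similar in pairs:
            sq = [(x - y) ** 2 for x, y in zip(a, b)]
            d = sum(wi * si for wi, si in zip(w, sq))
            if similar and d > margin:        # similar pair too far apart
                w = [max(0.0, wi - lr * si) for wi, si in zip(w, sq)]
            elif not similar and d < margin:  # dissimilar pair too close
                w = [wi + lr * si for wi, si in zip(w, sq)]
    return w

# Hypothetical 2-word signatures: similar pairs differ only in word 0,
# dissimilar pairs differ only in word 1
pairs = [([3, 0], [1, 0], True), ([0, 2], [0, 0], False)]
w = learn_weights(pairs, dim=2)
print(w[0] < w[1])  # → True: the learned metric discounts word 0
```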

  2. Measuring global monopole velocities, one by one

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lopez-Eiguren, Asier; Urrestilla, Jon; Achúcarro, Ana, E-mail: asier.lopez@ehu.eus, E-mail: jon.urrestilla@ehu.eus, E-mail: achucar@lorentz.leidenuniv.nl

    We present an estimation of the average velocity of a network of global monopoles in a cosmological setting using large numerical simulations. In order to obtain the value of the velocity, we improve some already known methods, and present a new one. This new method estimates individual global monopole velocities in a network, by means of detecting each monopole position in the lattice and following the path described by each one of them. Using our new estimate we can settle an open question previously posed in the literature: velocity-dependent one-scale (VOS) models for global monopoles predict two branches of scaling solutions, one with monopoles moving at subluminal speeds and one with monopoles moving at luminal speeds. Previous attempts to estimate monopole velocities had large uncertainties and were not able to settle that question. Our simulations find no evidence of a luminal branch. We also estimate the values of the parameters of the VOS model. With our new method we can also study the microphysics of the complicated dynamics of individual monopoles. Finally we use our large simulation volume to compare the results from the different estimator methods, as well as to assess the validity of the numerical approximations made.

  3. Assessing Adolescent Mindfulness: Validation of an Adapted Mindful Attention Awareness Scale in Adolescent Normative and Psychiatric Populations

    ERIC Educational Resources Information Center

    Brown, Kirk Warren; West, Angela Marie; Loverich, Tamara M.; Biegel, Gina M.

    2011-01-01

    Interest in mindfulness-based interventions for children and adolescents is burgeoning, bringing with it the need for validated instruments to assess mindfulness in youths. The present studies were designed to validate among adolescents a measure of mindfulness previously validated for adults (e.g., Brown & Ryan, 2003), which we herein call…

  4. Soldier Dimensions in Combat Models

    DTIC Science & Technology

    1990-05-07

    and performance. Questionnaires, SQTs, and ARTEPs were often used. Many scales had estimates of reliability but few had validity data. Most studies...pending its validation. Research plans were provided for applications in simulated combat and with simulation devices, for data previously gathered...regarding reliability and validity. Lack of information following an instrument indicates neither reliability nor validity information was provided by the

  5. Measurement of left ventricular torsion using block-matching-based speckle tracking for two-dimensional echocardiography

    NASA Astrophysics Data System (ADS)

    Sun, Feng-Rong; Wang, Xiao-Jing; Wu, Qiang; Yao, Gui-Hua; Zhang, Yun

    2013-01-01

    Left ventricular (LV) torsion is a sensitive and global index of LV systolic and diastolic function, but how to noninvasively measure it is challenging. Two-dimensional echocardiography and the block-matching based speckle tracking method were used to measure LV torsion. Main advantages of the proposed method over the previous ones are summarized as follows: (1) The method is automatic, except for manually selecting some endocardium points on the end-diastolic frame in initialization step. (2) The diamond search strategy is applied, with a spatial smoothness constraint introduced into the sum of absolute differences matching criterion; and the reference frame during the search is determined adaptively. (3) The method is capable of removing abnormal measurement data automatically. The proposed method was validated against that using Doppler tissue imaging and some preliminary clinical experimental studies were presented to illustrate clinical values of the proposed method.

  6. Comparison of Random Forest and Support Vector Machine classifiers using UAV remote sensing imagery

    NASA Astrophysics Data System (ADS)

    Piragnolo, Marco; Masiero, Andrea; Pirotti, Francesco

    2017-04-01

    In recent years, surveying with unmanned aerial vehicles (UAVs) has been attracting a great deal of attention due to decreasing costs, higher precision and flexibility of use. UAVs have been applied for geomorphological investigations, forestry, precision agriculture, cultural heritage assessment and for archaeological purposes. They can also be used for land use and land cover classification (LULC). In the literature, there are two main types of approaches for classification of remote sensing imagery: pixel-based and object-based. On one hand, the pixel-based approach mostly uses training areas to define classes and respective spectral signatures. On the other hand, object-based classification considers pixels, scale, spatial information and texture information for creating homogeneous objects. Machine learning methods have been applied successfully for classification, and their use is increasing due to the availability of faster computing capabilities. Such methods learn a model from previously labelled training data. Two machine learning methods which have given good results in previous investigations are Random Forest (RF) and Support Vector Machine (SVM). The goal of this work is to compare the RF and SVM methods for classifying LULC using images collected with a fixed-wing UAV. The processing chain for classification uses packages in R, an open-source scripting language for data analysis, which provides all necessary algorithms. The imagery was acquired and processed in November 2015 with cameras recording reflectivity in the red, blue, green and near-infrared wavelengths over a test area on the Agripolis campus, in Italy. Images were processed and ortho-rectified with Agisoft Photoscan. The ortho-rectified image is the full data set, and the test set is derived from partial sub-setting of the full data set. Different tests have been carried out, using from 2% to 20% of the total.
Ten training sets and ten validation sets were obtained from each test set. The control dataset consists of an independent visual classification of the whole area by an expert. The classes are (i) broadleaf, (ii) building, (iii) grass, (iv) headland access path, (v) road, (vi) sowed land and (vii) vegetable. RF and SVM were applied to each test set, and their performance was evaluated with three accuracy metrics: the Kappa index, classification accuracy and classification error. All three were calculated in three different ways: with K-fold cross-validation, on the validation test set and on the full test set. The analysis indicates that SVM obtains better scores under K-fold cross-validation and on the validation test set, whereas RF achieves the better result on the full test set. SVM also appears to perform better with smaller training sets, whereas RF improves as the training sets grow larger.
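The three accuracy metrics named above all derive from the confusion matrix of predicted versus reference labels. A minimal pure-Python sketch (the class labels below are invented for illustration, not the study's data):

```python
from collections import Counter

def confusion_metrics(truth, pred):
    """Classification accuracy, classification error and Cohen's Kappa
    from two aligned label lists."""
    n = len(truth)
    correct = sum(t == p for t, p in zip(truth, pred))
    accuracy = correct / n
    error = 1.0 - accuracy
    # Expected chance agreement from the row/column marginals
    t_counts, p_counts = Counter(truth), Counter(pred)
    classes = set(truth) | set(pred)
    p_e = sum(t_counts[c] * p_counts[c] for c in classes) / n ** 2
    kappa = (accuracy - p_e) / (1.0 - p_e)
    return accuracy, error, kappa

# Hypothetical labels for illustration only
truth = ["grass", "grass", "grass", "road", "road", "road", "building", "building"]
pred  = ["grass", "grass", "road",  "road", "road", "building", "building", "building"]
acc, err, kappa = confusion_metrics(truth, pred)
# → accuracy 0.75, error 0.25, kappa ≈ 0.6279
```

Kappa discounts chance agreement, which is why it can rank classifiers differently from raw accuracy when class frequencies are unbalanced.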

  7. Validation and application of a multiresidue method based on liquid chromatography-tandem mass spectrometry for evaluating the plant uptake of 74 microcontaminants in crops irrigated with treated municipal wastewater.

    PubMed

    Martínez-Piernas, A B; Polo-López, M I; Fernández-Ibáñez, P; Agüera, A

    2018-01-26

    Reuse of treated wastewater for agricultural purposes can mitigate water stress in regions where water scarcity is a widespread problem. However, the long-term environmental consequences of this practice are still unknown. It has been demonstrated that irrigation with reclaimed water leads to accumulation and translocation of some microcontaminants (MCs) in soil and crops; so far, however, only a small group of contaminants has been investigated. This study aims to develop and validate a simple and efficient multiresidue method based on QuEChERS (Quick, Easy, Cheap, Effective, Rugged and Safe) extraction coupled to liquid chromatography-tandem mass spectrometry (LC-MS/MS). The novelty of the study lies in the large number of MCs analyzed (74), some of them not previously investigated, in three commodities (lettuce, radish and strawberry). Optimized conditions yielded good results for the three commodities under study: up to 84% of the compounds were recovered within a 70-120% range, with good repeatability (relative standard deviations below 20% in most cases). Method detection limits (MDLs) and method quantification limits (MQLs) ranged from 0.01 to 2 ng/g. The proposed method was successfully applied to assess the potential uptake of MCs by lettuce and radish crops irrigated with wastewater under controlled conditions for 3 and 1.5 months, respectively. Twelve compounds were detected in the crops, at concentrations ranging from 0.03 to 57.6 ng/g; N-formyl-4-aminoantipyrine (4FAA) was the most concentrated compound. The application of this method demonstrated for the first time the accumulation of five contaminants of emerging concern (CECs) not previously reported: 4FAA, N-acetyl-4-aminoantipyrine (4AAA), hydrochlorothiazide, mepivacaine and venlafaxine. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Development and validation of an improved method for the determination of chloropropanols in paperboard food packaging by GC-MS.

    PubMed

    Mezouari, S; Liu, W Yun; Pace, G; Hartman, T G

    2015-01-01

    The objective of this study was to develop an improved analytical method for the determination of 3-chloro-1,2-propanediol (3-MCPD) and 1,3-dichloropropanol (1,3-DCP) in paper-type food packaging. The established method includes aqueous extraction, matrix spiking with a deuterated surrogate internal standard (3-MCPD-d₅), clean-up using Extrelut solid-phase extraction, derivatisation using a silylation reagent, and GC-MS analysis of the chloropropanols as their corresponding trimethylsilyl ethers. The new method is applicable to food-grade packaging samples using European Commission standard aqueous extraction and aqueous food simulant migration tests. In this improved method, the derivatisation procedure was optimised, and the cost and time of the analysis were reduced by using ten times less sample, solvents and reagents than in previously described methods. Overall, the validation data demonstrate that the method is precise and reliable. The limit of detection (LOD) in the aqueous extract was 0.010 mg kg⁻¹ (w/w) for both 3-MCPD and 1,3-DCP. Analytical precision had a relative standard deviation (RSD) of 3.36% for 3-MCPD and 7.65% for 1,3-DCP. The new method was satisfactorily applied to the analysis of over 100 commercial paperboard packaging samples. The data are being used to guide the product development of a next generation of wet-strength resins with reduced chloropropanol content, and also for risk assessments to calculate the virtual safe dose (VSD).
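The precision figures reported above are relative standard deviations of replicate determinations. A minimal sketch of the calculation (the replicate values are hypothetical, for illustration only):

```python
import statistics

def rsd_percent(replicates):
    """Relative standard deviation: sample SD as a percentage of the mean."""
    return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

# Hypothetical replicate results (arbitrary concentration units)
rsd = rsd_percent([9.8, 10.0, 10.2])
# → 2.0 (%)
```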

  9. Erratum: Development, appraisal, validation and implementation of a consensus protocol for the assessment of cerebral amyloid angiopathy in post-mortem brain tissue.

    PubMed

    Love, Seth; Chalmers, Katy; Ince, Paul; Esiri, Margaret; Attems, Johannes; Kalaria, Raj; Jellinger, Kurt; Yamada, Masahito; McCarron, Mark; Minett, Thais; Matthews, Fiona; Greenberg, Steven; Mann, David; Kehoe, Patrick Gavin

    2015-01-01

    In a collaboration involving 11 groups with research interests in cerebral amyloid angiopathy (CAA), we used a two-stage process to develop and in turn validate a new consensus protocol and scoring scheme for the assessment of CAA and associated vasculopathic abnormalities in post-mortem brain tissue. Stage one used an iterative Delphi-style survey to develop the consensus protocol. The resultant scoring scheme was tested on a series of digital images and paraffin sections that were circulated blind to a number of scorers, and the scoring scheme and choice of staining methods were refined by open-forum discussion. The agreed protocol scored parenchymal and meningeal CAA on a 0-3 scale, capillary CAA as present/absent, and vasculopathy on a 0-2 scale, in each of the four cortical lobes, which were scored separately. A further assessment involving three centres was then undertaken: neuropathologists in Bristol, Oxford and Sheffield independently scored sections from 75 cases (25 from each centre), and high inter-rater reliability was demonstrated. Stage two used the results of the three-centre assessment to validate the protocol by investigating previously described associations between APOE genotype (previously determined) and both CAA and vasculopathy. The association of capillary CAA (with or without arteriolar CAA) with APOE ε4 was confirmed. However, APOE ε2 was also found to be a strong risk factor for the development of CAA, not only in Alzheimer's disease (AD) but also in elderly non-demented controls. Further validation of this protocol and scoring scheme is encouraged, to aid its wider adoption and to facilitate collaborative and replication studies of CAA. [This corrects the article on p. 19 in vol. 3, PMID: 24754000.]

  10. Development, appraisal, validation and implementation of a consensus protocol for the assessment of cerebral amyloid angiopathy in post-mortem brain tissue

    PubMed Central

    Love, Seth; Chalmers, Katy; Ince, Paul; Esiri, Margaret; Attems, Johannes; Kalaria, Raj; Jellinger, Kurt; Yamada, Masahito; McCarron, Mark; Minett, Thais; Matthews, Fiona; Greenberg, Steven; Mann, David; Kehoe, Patrick Gavin

    2015-01-01

    In a collaboration involving 11 groups with research interests in cerebral amyloid angiopathy (CAA), we used a two-stage process to develop and in turn validate a new consensus protocol and scoring scheme for the assessment of CAA and associated vasculopathic abnormalities in post-mortem brain tissue. Stage one used an iterative Delphi-style survey to develop the consensus protocol. The resultant scoring scheme was tested on a series of digital images and paraffin sections that were circulated blind to a number of scorers, and the scoring scheme and choice of staining methods were refined by open-forum discussion. The agreed protocol scored parenchymal and meningeal CAA on a 0-3 scale, capillary CAA as present/absent, and vasculopathy on a 0-2 scale, in each of the four cortical lobes, which were scored separately. A further assessment involving three centres was then undertaken: neuropathologists in Bristol, Oxford and Sheffield independently scored sections from 75 cases (25 from each centre), and high inter-rater reliability was demonstrated. Stage two used the results of the three-centre assessment to validate the protocol by investigating previously described associations between APOE genotype (previously determined) and both CAA and vasculopathy. The association of capillary CAA (with or without arteriolar CAA) with APOE ε4 was confirmed. However, APOE ε2 was also found to be a strong risk factor for the development of CAA, not only in Alzheimer's disease (AD) but also in elderly non-demented controls. Further validation of this protocol and scoring scheme is encouraged, to aid its wider adoption and to facilitate collaborative and replication studies of CAA. PMID:26807344

  11. Development, appraisal, validation and implementation of a consensus protocol for the assessment of cerebral amyloid angiopathy in post-mortem brain tissue

    PubMed Central

    Love, Seth; Chalmers, Katy; Ince, Paul; Esiri, Margaret; Attems, Johannes; Jellinger, Kurt; Yamada, Masahito; McCarron, Mark; Minett, Thais; Matthews, Fiona; Greenberg, Steven; Mann, David; Kehoe, Patrick Gavin

    2014-01-01

    In a collaboration involving 11 groups with research interests in cerebral amyloid angiopathy (CAA), we used a two-stage process to develop and in turn validate a new consensus protocol and scoring scheme for the assessment of CAA and associated vasculopathic abnormalities in post-mortem brain tissue. Stage one used an iterative Delphi-style survey to develop the consensus protocol. The resultant scoring scheme was tested on a series of digital images and paraffin sections that were circulated blind to a number of scorers, and the scoring scheme and choice of staining methods were refined by open-forum discussion. The agreed protocol scored parenchymal and meningeal CAA on a 0-3 scale, capillary CAA as present/absent, and vasculopathy on a 0-2 scale, in each of the four cortical lobes, which were scored separately. A further assessment involving three centres was then undertaken: neuropathologists in Bristol, Oxford and Sheffield independently scored sections from 75 cases (25 from each centre), and high inter-rater reliability was demonstrated. Stage two used the results of the three-centre assessment to validate the protocol by investigating previously described associations between APOE genotype (previously determined) and both CAA and vasculopathy. The association of capillary CAA (with or without arteriolar CAA) with APOE ε4 was confirmed. However, APOE ε2 was also found to be a strong risk factor for the development of CAA, not only in Alzheimer's disease (AD) but also in elderly non-demented controls. Further validation of this protocol and scoring scheme is encouraged, to aid its wider adoption and to facilitate collaborative and replication studies of CAA. PMID:24754000

  12. Using expired air carbon monoxide to determine smoking status during pregnancy: preliminary identification of an appropriately sensitive and specific cut-point.

    PubMed

    Bailey, Beth A

    2013-10-01

    Measurement of carbon monoxide in expired air samples (ECO) is a non-invasive, cost-effective biochemical marker of smoking. Cut-points of 6-10 ppm have been established, though the appropriate cut-points for pregnant women have been debated because of metabolic changes during pregnancy. This study assessed whether an ECO cut-point could be established that identifies at least 90% of pregnant smokers while misidentifying fewer than 10% of non-smokers. Pregnant women (N = 167) completed a validated self-report smoking assessment and a urine drug screen (UDS) for cotinine, and provided an expired air sample, twice during pregnancy. Half of the women reported non-smoking status early (51%) and late (53%) in pregnancy, confirmed by UDS. Using a traditional 8 ppm cut-point for the early pregnancy reading, only 1% of non-smokers were incorrectly identified as smokers, but only 56% of all smokers, and 67% of those who had smoked 5+ cigarettes in the previous 24 h, were identified. At a 4 ppm cut-point, in contrast, only 8% of non-smokers were misclassified as smokers, while 90% of all smokers and 96% of those who had smoked 5+ cigarettes in the previous 24 h were identified. False positives were explained by heavy second-hand smoke exposure and marijuana use. Results were similar for late pregnancy ECO, with ROC analysis revealing an area under the curve of 0.95 for early pregnancy readings and 0.94 for late pregnancy readings. A lower 4 ppm ECO cut-point may therefore be necessary to identify pregnant smokers using expired air samples, and this cut-point appears valid throughout pregnancy. Work is ongoing to validate these findings in larger samples, but it appears that, if an appropriate cut-point is used, ECO is a valid method for determining smoking status in pregnancy. Copyright © 2013 Elsevier Ltd. All rights reserved.
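The trade-off the study describes is between sensitivity and specificity as the cut-point moves. A minimal sketch of how the two are computed for a given cut-point (the ECO readings below are invented for illustration):

```python
def sens_spec(smoker_ppm, nonsmoker_ppm, cut_point):
    """Sensitivity and specificity of an ECO cut-point: readings at or
    above the cut-point are classified as 'smoker'."""
    tp = sum(v >= cut_point for v in smoker_ppm)       # smokers caught
    tn = sum(v < cut_point for v in nonsmoker_ppm)     # non-smokers cleared
    return tp / len(smoker_ppm), tn / len(nonsmoker_ppm)

# Hypothetical ECO readings (ppm) for illustration only
smokers = [4, 5, 9, 12]
non_smokers = [1, 2, 3, 5]
sens, spec = sens_spec(smokers, non_smokers, cut_point=4)
# → sensitivity 1.0, specificity 0.75
```

Sweeping the cut-point over all observed values and plotting sensitivity against (1 - specificity) yields the ROC curve whose area the abstract reports.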

  13. Validation of an instrument to assess barriers to care-seeking for accidental bowel leakage in women: the BCABL questionnaire

    PubMed Central

    Brown, Heidi Wendell; Wise, Meg E.; Westenberg, Danielle; Schmuhl, Nicholas B.; Brezoczky, Kelly Lewis; Rogers, Rebecca G.; Constantine, Melissa L.

    2017-01-01

    Introduction and hypothesis: Fewer than 30% of women with accidental bowel leakage (ABL) seek care, despite the existence of effective, minimally invasive therapies. We developed and validated a condition-specific instrument to assess barriers to care-seeking for ABL in women. Methods: Adult women with ABL completed an electronic survey about condition severity, patient activation, previous care-seeking, and demographics. The Barriers to Care-seeking for Accidental Bowel Leakage (BCABL) instrument contained 42 potential items completed at baseline and again 2 weeks later. Paired t tests evaluated test–retest reliability. Factor analysis evaluated factor structure and guided item retention. Cronbach’s alpha evaluated internal consistency. Within- and across-factor item means generated a summary BCABL score used to evaluate scale validity against six external criterion measures. Results: Among 1,677 click-throughs, 736 (44%) entered the survey; 95% of eligible female respondents (427 out of 458) provided complete data. Fifty-three percent of respondents had previously sought care for their ABL; median age was 62 years (range 27–89); mean Vaizey score was 12.8 (SD = 5.0), indicating moderate to severe ABL. Test–retest reliability was excellent for all items. Factor extraction via oblique rotation resulted in a final structure of 16 items in six domains, within which internal consistency was high. All six external criterion measures correlated significantly with the BCABL score. Conclusions: The BCABL questionnaire, with 16 items mapping to six domains, has excellent criterion validity and test–retest reliability when administered electronically in women with ABL. The BCABL can be used to identify care-seeking barriers for ABL in different populations, inform targeted interventions, and measure their effectiveness. PMID:28236039
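The internal-consistency statistic used here, Cronbach's alpha, compares the summed item variances with the variance of the total score. A minimal sketch with hypothetical item responses (not the study's data):

```python
import statistics

def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score lists (one list per item,
    aligned by respondent): alpha = k/(k-1) * (1 - sum(item vars)/total var)."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent totals
    item_var_sum = sum(statistics.variance(it) for it in items)
    return (k / (k - 1)) * (1.0 - item_var_sum / statistics.variance(totals))

# Three hypothetical items answered by four respondents
items = [[1, 2, 3, 4],
         [1, 2, 3, 4],
         [1, 2, 3, 4]]
alpha = cronbach_alpha(items)
# Perfectly correlated items → alpha = 1.0
```

Real scales fall below 1.0; values around 0.7-0.9 are conventionally read as acceptable-to-good internal consistency.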

  14. Novel prediction model of renal function after nephrectomy from automated renal volumetry with preoperative multidetector computed tomography (MDCT).

    PubMed

    Isotani, Shuji; Shimoyama, Hirofumi; Yokota, Isao; Noma, Yasuhiro; Kitamura, Kousuke; China, Toshiyuki; Saito, Keisuke; Hisasue, Shin-ichi; Ide, Hisamitsu; Muto, Satoru; Yamaguchi, Raizo; Ukimura, Osamu; Gill, Inderbir S; Horie, Shigeo

    2015-10-01

    A predictive model of postoperative renal function may influence the planning of nephrectomy. Our aims were to develop a novel predictive model combining clinical indices with computer volumetry of the preserved renal cortex volume (RCV) on multidetector computed tomography (MDCT), and to prospectively validate the model's performance. A total of 60 patients undergoing radical nephrectomy from 2011 to 2013 participated, comprising a development cohort of 39 patients and an external validation cohort of 21 patients. RCV was calculated by voxel counting using software (Vincent, FUJIFILM). Renal function before and after radical nephrectomy was assessed via the estimated glomerular filtration rate (eGFR). Factors affecting postoperative eGFR were examined by regression analysis, with backward elimination, to develop the model. The model was then externally validated and its performance compared with that of previously reported models. The postoperative eGFR value was associated with age, preoperative eGFR, preserved renal parenchymal volume (RPV), preserved RCV, % RPV alteration and % RCV alteration (p < 0.01). The variables significantly correlated with % eGFR alteration were % RCV preservation (r = 0.58, p < 0.01) and % RPV preservation (r = 0.54, p < 0.01). The resulting regression model was: postoperative eGFR = 57.87 - 0.55(age) - 15.01(body surface area) + 0.30(preoperative eGFR) + 52.92(%RCV preservation). A strong correlation was seen between postoperative eGFR and the model's estimate (r = 0.83; p < 0.001), and in the external validation cohort (n = 21) the model outperformed previously reported models. Combining MDCT renal volumetry with clinical indices might therefore yield an important tool for predicting postoperative renal function.
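The regression equation above can be applied directly. A sketch, assuming body surface area in m² and %RCV preservation expressed as a fraction of 1 (the abstract does not state the units), with illustrative inputs:

```python
def predicted_postop_egfr(age, bsa, preop_egfr, rcv_preserved):
    """Regression model from the abstract; rcv_preserved is assumed to be
    the preserved fraction of renal cortex volume (0-1)."""
    return (57.87 - 0.55 * age - 15.01 * bsa
            + 0.30 * preop_egfr + 52.92 * rcv_preserved)

# Illustrative patient: 60 years old, BSA 1.7 m2, preoperative eGFR 80,
# 80% of renal cortex volume preserved
egfr = predicted_postop_egfr(60, 1.7, 80, 0.8)
# → about 65.7
```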

  15. 34 CFR 462.11 - What must an application contain?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... the methodology and procedures used to measure the reliability of the test. (h) Construct validity... previous test, and results from validity, reliability, and equating or standard-setting studies undertaken... NRS educational functioning levels (content validity). Documentation of the extent to which the items...

  16. Simulation of anisoplanatic imaging through optical turbulence using numerical wave propagation with new validation analysis

    NASA Astrophysics Data System (ADS)

    Hardie, Russell C.; Power, Jonathan D.; LeMaster, Daniel A.; Droege, Douglas R.; Gladysz, Szymon; Bose-Pillai, Santasri

    2017-07-01

    We present a numerical wave propagation method for simulating imaging of an extended scene under anisoplanatic conditions. While isoplanatic simulation is relatively common, few tools are specifically designed for simulating the imaging of extended scenes under anisoplanatic conditions. We provide a complete description of the proposed simulation tool, including the wave propagation method used. Our approach computes an array of point spread functions (PSFs) for a two-dimensional grid on the object plane. The PSFs are then used in a spatially varying weighted sum operation, with an ideal image, to produce a simulated image with realistic optical turbulence degradation. The degradation includes spatially varying warping and blurring. To produce the PSF array, we generate a series of extended phase screens. Simulated point sources are numerically propagated from an array of positions on the object plane, through the phase screens, and ultimately to the focal plane of the simulated camera. Note that the optical path for each PSF will be different and thus will pass through a different portion of the extended phase screens. These different paths give rise to a spatially varying PSF that produces anisoplanatic effects. We use a method for defining the individual phase screen statistics that we have not seen used in previous anisoplanatic simulations. We also present a validation analysis. In particular, we compare simulated outputs with the theoretical anisoplanatic tilt correlation and a derived differential tilt variance statistic. This is in addition to comparing the long- and short-exposure PSFs and the isoplanatic angle. We believe this analysis represents the most thorough validation of an anisoplanatic simulation to date. The current work is also unique in that we simulate and validate both constant and varying Cn2(z) profiles. Furthermore, we simulate sequences with both temporally independent and temporally correlated turbulence effects. 
Temporal correlation is introduced by generating even larger extended phase screens and translating this block of screens in front of the propagation area. Our validation analysis shows an excellent match between the simulation statistics and the theoretical predictions. Thus, we think this tool can be used effectively to study optical anisoplanatic turbulence and to aid in the development of image restoration methods.
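The spatially varying weighted-sum operation described above can be illustrated in one dimension: each output pixel blends the responses of neighbouring PSFs according to interpolation weights, so the effective blur changes across the field of view. A toy pure-Python sketch (the PSFs and linear weighting are invented for illustration, not the paper's actual kernels):

```python
def convolve(img, psf):
    """Zero-padded 1-D convolution with a centred PSF of odd length."""
    half = len(psf) // 2
    out = []
    for i in range(len(img)):
        acc = 0.0
        for k, p in enumerate(psf):
            j = i + k - half
            if 0 <= j < len(img):
                acc += p * img[j]
        out.append(acc)
    return out

def anisoplanatic_blur(img, psf_left, psf_right):
    """Blend two PSF responses with linearly varying weights so the
    effective PSF changes across the field of view."""
    left = convolve(img, psf_left)
    right = convolve(img, psf_right)
    n = len(img)
    return [(1 - i / (n - 1)) * left[i] + (i / (n - 1)) * right[i]
            for i in range(n)]

# A point source blurred more strongly toward the right of the field:
# identity PSF on the left edge, 3-tap box blur on the right edge
img = [0.0, 0.0, 4.0, 0.0, 0.0]
out = anisoplanatic_blur(img, [1.0], [1/3, 1/3, 1/3])
```

The full simulator does the analogous operation in two dimensions over a grid of PSFs, which is what produces the spatially varying warping and blurring.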

  17. Non-invasive tissue temperature measurements based on quantitative diffuse optical spectroscopy (DOS) of water.

    PubMed

    Chung, S H; Cerussi, A E; Merritt, S I; Ruth, J; Tromberg, B J

    2010-07-07

    We describe the development of a non-invasive method for quantitative tissue temperature measurement using broadband diffuse optical spectroscopy (DOS). Our approach is based on well-characterized opposing shifts in near-infrared (NIR) water absorption spectra that appear with temperature and macromolecular binding state. Unlike conventional reflectance methods, DOS is used to generate scattering-corrected tissue water absorption spectra. This allows us to separate the macromolecular bound-water contribution from the thermally induced spectral shift using the temperature isosbestic point at 996 nm. The method was validated in Intralipid tissue phantoms by correlating DOS with thermistor measurements (R = 0.96), with a difference of 1.1 ± 0.91 °C over a range of 28-48 °C. Once validated, thermal and hemodynamic (i.e. oxy- and deoxy-hemoglobin concentration) changes were measured simultaneously and continuously in the forearms of human subjects during mild cold stress. DOS-measured arm temperatures were consistent with previously reported invasive deep tissue temperature studies. These results suggest that DOS can be used for non-invasive, co-registered measurements of absolute temperature and hemoglobin parameters in thick tissues, a potentially important approach for optimizing thermal diagnostics and therapeutics.

  18. Non-invasive fetal sex determination by maternal plasma sequencing and application in X-linked disorder counseling.

    PubMed

    Pan, Xiaoyu; Zhang, Chunlei; Li, Xuchao; Chen, Shengpei; Ge, Huijuan; Zhang, Yanyan; Chen, Fang; Jiang, Hui; Jiang, Fuman; Zhang, Hongyun; Wang, Wei; Zhang, Xiuqing

    2014-12-01

    Our aim was to develop a fetal sex determination method based on maternal plasma sequencing (MPS) and to assess its performance and potential use in X-linked disorder counseling. A total of 900 cases of MPS data from a previous study were reviewed, of which 100 were used as the training set and 800 as the validation set. The percentage of sequencing reads uniquely mapped to the Y chromosome was calculated and used to classify male and female cases. Eight pregnant women who are carriers of Duchenne muscular dystrophy (DMD) mutations were recruited, and their plasma was subjected to multiplex sequencing and fetal sex determination analysis. In the training set, our method reached a sensitivity of 96% and a false positive rate of 0% for detecting male cases. The blinded validation showed that 421 of 423 male cases and 374 of 377 female cases were successfully identified, corresponding to a sensitivity of 99.53% and a specificity of 99.20% for fetal sex determination, from as early as 12 gestational weeks. Fetal sex was correctly identified in all eight DMD genetic counseling cases and confirmed by amniocentesis. High accuracy of non-invasive fetal sex determination can thus be achieved with MPS, and the method can potentially be used in prenatal genetic counseling.
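The classifier described reduces to thresholding the fraction of reads uniquely mapped to the Y chromosome. A minimal sketch (the threshold value and read counts below are invented for illustration; the study's actual threshold is learned from the training set):

```python
def classify_fetal_sex(y_reads, total_reads, threshold=0.002):
    """Call 'male' when the fraction of uniquely mapped reads on the
    Y chromosome reaches a trained threshold (value here is illustrative)."""
    y_fraction = y_reads / total_reads
    return "male" if y_fraction >= threshold else "female"

# Illustrative read counts
classify_fetal_sex(30, 10_000)   # fraction 0.0030 → "male"
classify_fetal_sex(5, 10_000)    # fraction 0.0005 → "female"
```

The small nonzero Y-read fraction in female pregnancies (from mismapped reads) is why a trained threshold, rather than a zero test, is needed.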

  19. [Assessment of an Evaluation System for Psychiatry Learning].

    PubMed

    Campo-Cabal, Gerardo

    2012-01-01

    Through the analysis of a teaching evaluation system for a psychiatry course aimed at medical students, the author reviews the basic elements taken into account in a teaching assessment process. The assessment methods used were analysed, as were the grades obtained by the students in the four groups into which they were divided. The selected assessment methods are appropriate for evaluating the educational objectives; the contents are selected by means of a specification matrix; and there is a high correlation between the grades obtained in previous academic periods and those obtained in the course, demonstrating the validity of the results (whether considering the whole exam or just a part of it). Most of the students fall on the right side of the grading curve, meaning that the majority acquire the expected knowledge. The assessment system used in the psychopathology course is fair, valid and reliable, specifically concerning the objective methods used, but the conceptual evaluation should be improved or, preferably, eliminated as a constituent part of the evaluation system. Copyright © 2012 Asociación Colombiana de Psiquiatría. Published by Elsevier España. All rights reserved.

  20. Validation of pharmaceutical potency determinations by quantitative nuclear magnetic resonance spectrometry.

    PubMed

    Webster, Gregory K; Marsden, Ian; Pommerening, Cynthia A; Tyrakowski, Christina M

    2010-05-01

    With the changing development paradigms in the pharmaceutical industry, laboratories are challenged to release materials for clinical studies with rapid turnaround times. To minimize costs, many businesses are looking for ways to use early Good Manufacturing Practice (GMP) materials of active pharmaceutical ingredients (APIs) for Good Laboratory Practice (GLP) toxicology studies. To achieve this, the analytical laboratory releases the material under one of three scenarios: (1) holding the GLP release until full GMP testing is ready; (2) issuing a separate lot number for a portion of the GMP material and releasing that material for GLP use; or (3) releasing the lot of material for GLP use with alternate (equivalent) method(s) not specified for GMP release testing. Many companies are finding the third scenario advantageous in terms of cost and efficiency through the use of quantitative nuclear magnetic resonance (q-NMR). q-NMR has proved to be a single-point replacement for routine early development testing that previously combined elements of identity testing, chromatographic assay, moisture analysis, residual solvent analysis and elemental analysis. This study highlights that q-NMR can be validated to meet current regulatory analytical method guidelines for routine pharmaceutical analysis.
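The quantitation underlying q-NMR potency assays is a ratio against an internal calibrant, a standard relation not spelled out in the abstract: with I the integrated signal area, N the number of protons contributing to that signal, M the molar mass, m the weighed mass and P the calibrant's purity, analyte purity follows from the ratio of normalized integrals. A sketch with hypothetical values:

```python
def qnmr_purity(i_a, i_cal, n_a, n_cal, m_a, m_cal, mass_a, mass_cal, p_cal):
    """Analyte purity from quantitative NMR against an internal calibrant:
    ratio of proton-normalized integrals, scaled by molar and weighed masses."""
    return (i_a / i_cal) * (n_cal / n_a) * (m_a / m_cal) * (mass_cal / mass_a) * p_cal

# Hypothetical example: equal integrals and proton counts, analyte molar
# mass twice the calibrant's, half as much calibrant weighed in
purity = qnmr_purity(i_a=1.0, i_cal=1.0, n_a=1, n_cal=1,
                     m_a=200.0, m_cal=100.0,
                     mass_a=20.0, mass_cal=10.0, p_cal=1.0)
# → 1.0 (i.e. 100% pure)
```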
