Science.gov

Sample records for purpose validated method

  1. Validity of a simple videogrammetric method to measure the movement of all hand segments for clinical purposes.

    PubMed

    Sancho-Bru, Joaquín L; Jarque-Bou, Néstor J; Vergara, Margarita; Pérez-González, Antonio

    2014-02-01

    Hand movement measurement is important in the clinical, ergonomics and biomechanical fields. Videogrammetric techniques allow the measurement of hand movement without interfering with natural hand behaviour. However, an accurate measurement of hand movement requires the use of a high number of markers, which limits its applicability in clinical practice (60 markers would be needed for the hand and wrist). In this work, a simple method that uses a reduced number of markers (29), based on a simplified kinematic model of the hand, is proposed and evaluated. A set of experiments was performed to evaluate the errors associated with the kinematic simplification, together with an evaluation of its accuracy, repeatability and reproducibility. The global error attributed to the kinematic simplification was 6.68°. The method has small errors in repeatability and reproducibility (3.43° and 4.23°, respectively) and shows no statistically significant difference from the use of electronic goniometers. The relevance of the work lies in the ability to measure all degrees of freedom of the hand with a reduced number of markers without interfering with natural hand behaviour, which makes the method suitable for clinical applications, as well as for ergonomic and biomechanical purposes. PMID:24503512

  2. Validating Analytical Methods

    ERIC Educational Resources Information Center

    Ember, Lois R.

    1977-01-01

    The procedures utilized by the Association of Official Analytical Chemists (AOAC) to develop, evaluate, and validate analytical methods for the analysis of chemical pollutants are detailed. Methods validated by AOAC are used by the EPA and FDA in their enforcement programs and are granted preferential treatment by the courts. (BT)

  3. Fit for purpose validated method for the determination of the strontium isotopic signature in mineral water samples by multi-collector inductively coupled plasma mass spectrometry

    NASA Astrophysics Data System (ADS)

    Brach-Papa, Christophe; Van Bocxstaele, Marleen; Ponzevera, Emmanuel; Quétel, Christophe R.

    2009-03-01

    A robust method allowing the routine determination of n(87Sr)/n(86Sr) with at least five significant decimal digits for large sets of mineral water samples is described. It is based on two consecutive chromatographic separations of Sr combined with multi-collector inductively coupled plasma mass spectrometry (MC-ICPMS) measurements. Separations are performed using commercial pre-packed columns filled with "Sr resin" to overcome isobaric interferences affecting the determination of strontium isotope ratios. The careful method validation scheme applied is described. It included investigations of all parameters influencing both the chromatographic separations and the MC-ICPMS measurements, as well as a test on a synthetic sample made of an aliquot of the NIST SRM 987 certified reference material dispersed in a saline matrix to mimic complex samples. Correction for mass discrimination was done internally using the n(88Sr)/n(86Sr) ratio. For comparing mineral waters originating from different geological backgrounds or identifying counterfeits, calculations involved the well-known consensus value (1/0.1194) ± 0 as reference. The typical uncertainty budget estimated for these results was 40 'ppm' relative (k = 2). It increased to 150 'ppm' (k = 2) for the establishment of stand-alone results, taking into account a relative difference of about 126 'ppm' systematically observed between measured and certified values of the NIST SRM 987. In case of suspicion of a deviation of the n(88Sr)/n(86Sr) ratio (worst-case scenario), our proposal was to use the NIST SRM 987 value 8.37861 ± 0.00325 (k = 2) as reference and assign a typical relative uncertainty budget of 300 'ppm' (k = 2). This method is thus fit for purpose and was applied to eleven French samples.
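
    A minimal sketch of the internal mass-bias correction described in this record, assuming the commonly used exponential law; the isotope masses are standard atomic-mass values and the measured ratios are invented for illustration.

    ```python
    # Sketch of internal mass-discrimination correction (exponential law),
    # normalizing to the consensus n(88Sr)/n(86Sr) = 1/0.1194 noted above.
    import math

    M86, M87, M88 = 85.9092606, 86.9088775, 87.9056125  # atomic masses (u)
    R88_86_REF = 1 / 0.1194                              # consensus reference ratio

    def correct_sr_ratio(r87_86_meas, r88_86_meas):
        """Correct a measured 87Sr/86Sr ratio for instrumental mass bias."""
        # Fractionation exponent from the deviation of the measured 88/86 ratio
        f = math.log(R88_86_REF / r88_86_meas) / math.log(M88 / M86)
        # Apply the exponential law to the 87/86 ratio
        return r87_86_meas * (M87 / M86) ** f

    print(correct_sr_ratio(0.71035, 8.3520))  # corrected 87Sr/86Sr (example input)
    ```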

  4. Content Validation of the Purpose Dimension.

    ERIC Educational Resources Information Center

    LaPlante, Marilyn J.; Jewett, Ann E.

    1987-01-01

    The article reports on LaPlante's research (1973) on evaluation of the purpose dimension of the Purpose Process Curriculum Framework, which established a set of criteria for evaluating the framework and demonstrated that the Delphi technique is appropriate for study of physical education curriculum. (CB)

  5. Construct Validity in Formative Assessment: Purpose and Practices

    ERIC Educational Resources Information Center

    Rix, Samantha

    2012-01-01

    This paper examines the utilization of construct validity in formative assessment for classroom-based purposes. Construct validity pertains to the notion that interpretations are made by educators who analyze test scores during formative assessment. The purpose of this paper is to note the challenges that educators face when interpreting these…

  6. Homework Purpose Scale for High School Students: A Validation Study

    ERIC Educational Resources Information Center

    Xu, Jianzhong

    2010-01-01

    The purpose of this study is to test the validity of scores on the Homework Purpose Scale using 681 rural and 306 urban high school students. First, confirmatory factor analysis was conducted on the rural sample. The results reveal that the Homework Purpose Scale comprises three separate yet related factors, including Learning-Oriented Reasons,…

  7. Homework Purpose Scale for Middle School Students: A Validation Study

    ERIC Educational Resources Information Center

    Xu, Jianzhong

    2011-01-01

    The purpose of the present study is to test the validity of scores on the Homework Purpose Scale (HPS) for middle school students. The participants were 1,181 eighth graders in the southeastern United States, including (a) 699 students in urban school districts and (b) 482 students in rural school districts. First, confirmatory factor analysis was…

  8. Review of surface steam sterilization for validation purposes.

    PubMed

    van Doornmalen, Joost; Kopinga, Klaas

    2008-03-01

    Sterilization is an essential step in the process of producing sterile medical devices. To guarantee sterility, the process of sterilization must be validated. Because there is no direct way to measure sterility, the techniques applied to validate the sterilization process are based on statistical principles. Steam sterilization is the most frequently applied sterilization method worldwide and can be validated either by indicators (chemical or biological) or by physical measurements. The steam sterilization conditions are described in the literature. Starting from these conditions, criteria for the validation of steam sterilization are derived and can be described in terms of physical parameters. Physical validation of steam sterilization appears to be an adequate and efficient validation method that can be considered an alternative to indicator validation. Moreover, physical validation can be used for effective troubleshooting in steam sterilization processes. PMID:18313509

  9. VAN method lacks validity

    NASA Astrophysics Data System (ADS)

    Jackson, David D.; Kagan, Yan Y.

    Varotsos and colleagues (the VAN group) claim to have successfully predicted many earthquakes in Greece. Several authors have refuted these claims, as reported in the May 27, 1996, special issue of Geophysical Research Letters and a recent book, A Critical Review of VAN [Lighthill 1996]. Nevertheless, the myth persists. Here we summarize why the VAN group's claims lack validity. The VAN group observes electrical potential differences that they call "seismic electric signals" (SES) weeks before and hundreds of kilometers away from some earthquakes, claiming that SES are somehow premonitory. This would require that increases in stress or decreases in strength cause the electrical variations, or that some regional process first causes the electrical signals and then helps trigger the earthquakes. Here we adopt their notation SES to refer to the electrical variations, without accepting any link to the quakes.

  10. Toward a Unified Validation Framework in Mixed Methods Research

    ERIC Educational Resources Information Center

    Dellinger, Amy B.; Leech, Nancy L.

    2007-01-01

    The primary purpose of this article is to further discussions of validity in mixed methods research by introducing a validation framework to guide thinking about validity in this area. To justify the use of this framework, the authors discuss traditional terminology and validity criteria for quantitative and qualitative research, as well as…

  11. Random Qualitative Validation: A Mixed-Methods Approach to Survey Validation

    ERIC Educational Resources Information Center

    Van Duzer, Eric

    2012-01-01

    The purpose of this paper is to introduce the process and value of Random Qualitative Validation (RQV) in the development and interpretation of survey data. RQV is a method of gathering clarifying qualitative data that improves the validity of the quantitative analysis. This paper is concerned with validity in relation to the participants'…

  12. Purposes and methods of scoring earthquake forecasts

    NASA Astrophysics Data System (ADS)

    Zhuang, J.

    2010-12-01

    There are two kinds of purposes in studies on earthquake prediction or forecasting: one is to give a systematic estimate of earthquake risk in a particular region and period in order to advise governments and enterprises on disaster reduction; the other is to search for reliable precursors that can be used to improve earthquake prediction or forecasts. For the first purpose a complete score is necessary, while for the latter a partial score, which can evaluate whether the forecasts or predictions have advantages over a well-known reference model, suffices. This study reviews different scoring methods for evaluating the performance of earthquake predictions and forecasts. In particular, the recently developed gambling scoring method shows its capacity to find good points in an earthquake prediction algorithm or model that are not in a reference model, even if its overall performance is no better than that of the reference model.
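
    As a concrete illustration of scoring a forecast against a reference model (a simple log-likelihood gain, not the gambling score itself), the sketch below compares two models over hypothetical space-time bins; all probabilities and outcomes are invented.

    ```python
    # Hypothetical comparison of a candidate forecast with a reference model
    # using a per-bin log-likelihood score; a positive gain favours the candidate.
    import math

    def log_likelihood(probs, outcomes):
        return sum(math.log(p if hit else 1.0 - p)
                   for p, hit in zip(probs, outcomes))

    outcomes  = [1, 0, 0, 1, 0]            # did an earthquake occur in each bin?
    candidate = [0.6, 0.2, 0.1, 0.5, 0.2]  # candidate model probabilities
    reference = [0.3, 0.3, 0.3, 0.3, 0.3]  # reference model probabilities

    gain = log_likelihood(candidate, outcomes) - log_likelihood(reference, outcomes)
    print(f"log probability gain over reference: {gain:.3f}")
    ```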

  13. External Validity in Policy Evaluations That Choose Sites Purposively

    ERIC Educational Resources Information Center

    Olsen, Robert B.; Orr, Larry L.; Bell, Stephen H.; Stuart, Elizabeth A.

    2013-01-01

    Evaluations of the impact of social programs are often carried out in multiple sites, such as school districts, housing authorities, local TANF offices, or One-Stop Career Centers. Most evaluations select sites purposively following a process that is nonrandom. Unfortunately, purposive site selection can produce a sample of sites that is not…

  14. Development of a systematic computer vision-based method to analyse and compare images of false identity documents for forensic intelligence purposes-Part I: Acquisition, calibration and validation issues.

    PubMed

    Auberson, Marie; Baechler, Simon; Zasso, Michaël; Genessay, Thibault; Patiny, Luc; Esseiva, Pierre

    2016-03-01

    Following their detection and seizure by police and border guard authorities, false identity and travel documents are usually scanned, producing digital images. This research investigates the potential of these images to classify false identity documents, highlight links between documents produced by the same modus operandi or the same source, and thus support forensic intelligence efforts. Inspired by previous research work about digital images of Ecstasy tablets, a systematic and complete method has been developed to acquire, collect, process and compare images of false identity documents. This first part of the article highlights the critical steps of the method and the development of a prototype that processes regions of interest extracted from images. Acquisition conditions have been fine-tuned in order to optimise reproducibility and comparability of images. Different filters and comparison metrics have been evaluated and the performance of the method has been assessed using two calibration and validation sets of documents, made up of 101 Italian driving licenses and 96 Portuguese passports seized in Switzerland, among which some were known to come from common sources. Results indicate that the use of Hue and Edge filters, or their combination, to extract profiles from images, followed by the comparison of profiles with a Canberra distance-based metric, provides the most accurate classification of documents. The method also appears to be quick, efficient and inexpensive. It can be easily operated from remote locations and shared amongst different organisations, which makes it very convenient for future operational applications. The method could serve as a first fast triage method that may help target more resource-intensive profiling methods (based on a visual, physical or chemical examination of documents for instance). Its contribution to forensic intelligence and its application to several sets of false identity documents seized by police and border guards will be
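
    The profile-comparison step can be illustrated with the Canberra distance named in the abstract; the sketch below uses SciPy's implementation on invented Hue-filter profiles.

    ```python
    # Comparing two invented feature profiles with the Canberra distance:
    # d(a, b) = sum_i |a_i - b_i| / (|a_i| + |b_i|); smaller = more similar.
    import numpy as np
    from scipy.spatial.distance import canberra

    profile_a = np.array([0.12, 0.40, 0.08, 0.33, 0.07])  # e.g., Hue-filter profile
    profile_b = np.array([0.10, 0.42, 0.09, 0.30, 0.09])

    print(canberra(profile_a, profile_b))
    ```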

  15. Design for validation, based on formal methods

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.

    1990-01-01

    Validation of ultra-reliable systems decomposes into two subproblems: (1) quantification of the probability of system failure due to physical failure; and (2) establishing that design errors are not present. Methods of design, testing, and analysis of ultra-reliable software are discussed. It is concluded that a design-for-validation approach based on formal methods is needed for the digital flight control systems problem, and also that formal methods will play a major role in the development of future high-reliability digital systems.

  16. Purpose in Life Test assessment using latent variable methods.

    PubMed

    Harlow, L L; Newcomb, M D; Bentler, P M

    1987-09-01

    A psychometric assessment was conducted on a slightly revised version of the Purpose in Life Test (PIL-R). Factor analyses revealed a large general factor plus four primary factors comprising lack of purpose in life, positive sense of purpose, motivation for meaning, and existential confusion. Validity models showed that the PIL-R was positively related to a construct of happiness and was negatively related to suicidality and meaninglessness. Reliability estimates ranged from 0.78 to 0.86. The revised version can be presented compactly and may be less confusing to subjects than the original PIL. PMID:3664045

  17. [Validation and verification of microbiology methods].

    PubMed

    Camaró-Sala, María Luisa; Martínez-García, Rosana; Olmos-Martínez, Piedad; Catalá-Cuenca, Vicente; Ocete-Mochón, María Dolores; Gimeno-Cardona, Concepción

    2015-01-01

    Clinical microbiologists should ensure, to the maximum level allowed by scientific and technical development, the reliability of results. This implies that, in addition to meeting the technical criteria that ensure their validity, tests must be performed under conditions that allow comparable results to be obtained regardless of the laboratory performing them. In this sense, the use of recognized and accepted reference methods is the most effective tool for providing these guarantees. Activities related to the verification and validation of analytical methods have become very important, given the continuous development and updating of techniques, increasingly complex analytical equipment, and the interest of professionals in ensuring quality processes and results. The definitions of validation and verification are described, along with the different types of validation/verification, the types of methods, and the level of validation necessary depending on the degree of standardization. The situations in which validation/verification is mandatory and/or recommended are discussed, including those particularly related to validation in Microbiology. The paper stresses the importance of promoting the use of reference strains as controls in Microbiology and the use of standard controls, as well as the importance of participation in External Quality Assessment programs to demonstrate technical competence. Emphasis is placed on how to calculate some of the parameters required for validation/verification, such as accuracy and precision. The development of these concepts can be found in SEIMC microbiological procedure number 48: «Validation and verification of microbiological methods» www.seimc.org/protocols/microbiology. PMID:24958671
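
    As an illustration of the accuracy and precision calculations the document refers to, the sketch below computes relative bias and coefficient of variation from invented replicate counts of a reference strain.

    ```python
    # Accuracy (relative bias) and precision (CV%) from replicate measurements
    # of a reference material; all numbers are invented for illustration.
    import statistics

    replicates = [98.0, 102.0, 101.0, 99.0, 100.5]  # e.g., recovered counts
    reference_value = 100.0

    mean = statistics.mean(replicates)
    bias_pct = 100.0 * (mean - reference_value) / reference_value
    cv_pct = 100.0 * statistics.stdev(replicates) / mean
    print(f"bias = {bias_pct:+.1f}%, CV = {cv_pct:.1f}%")
    ```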

  18. Experimental validation of structural optimization methods

    NASA Technical Reports Server (NTRS)

    Adelman, Howard M.

    1992-01-01

    The topic of validating structural optimization methods by use of experimental results is addressed. The need to validate the methods as a way of effecting a greater and accelerated acceptance of formal optimization methods by practicing engineering designers is described. The range of validation strategies is defined, including comparison of optimization results with more traditional design approaches, establishing the accuracy of the analyses used, and finally experimental validation of the optimization results. Examples of the use of experimental results to validate optimization techniques are described, including experimental validation of the following: optimum design of a trussed beam; combined control-structure design of a cable-supported beam simulating an actively controlled space structure; minimum-weight design of a beam with frequency constraints; minimization of the vibration response of a helicopter rotor blade; minimum-weight design of a turbine blade disk; aeroelastic optimization of an aircraft vertical fin; airfoil shape optimization for drag minimization; optimization of the shape of a hole in a plate for stress minimization; optimization to minimize beam dynamic response; and structural optimization of a low-vibration helicopter rotor.

  19. ASTM Validates Air Pollution Test Methods

    ERIC Educational Resources Information Center

    Chemical and Engineering News, 1973

    1973-01-01

    The American Society for Testing and Materials (ASTM) has validated six basic methods for measuring pollutants in ambient air as the first part of its Project Threshold. Aim of the project is to establish nationwide consistency in measuring pollutants; determining precision, accuracy and reproducibility of 35 standard measuring methods. (BL)

  20. A Practical Guide to Immunoassay Method Validation

    PubMed Central

    Andreasson, Ulf; Perret-Liaudet, Armand; van Waalwijk van Doorn, Linda J. C.; Blennow, Kaj; Chiasserini, Davide; Engelborghs, Sebastiaan; Fladby, Tormod; Genc, Sermin; Kruse, Niels; Kuiperij, H. Bea; Kulic, Luka; Lewczuk, Piotr; Mollenhauer, Brit; Mroczko, Barbara; Parnetti, Lucilla; Vanmechelen, Eugeen; Verbeek, Marcel M.; Winblad, Bengt; Zetterberg, Henrik; Koel-Simmelink, Marleen; Teunissen, Charlotte E.

    2015-01-01

    Biochemical markers have a central position in the diagnosis and management of patients in clinical medicine, as well as in clinical research and drug development, including for brain disorders such as Alzheimer's disease. The enzyme-linked immunosorbent assay (ELISA) is frequently used for measurement of low-abundance biomarkers. However, the quality of ELISA methods varies, which may introduce both systematic and random errors. This underscores the need for more rigorous control of assay performance, regardless of whether the assay is used in a research setting, in clinical routine, or in drug development. The aim of a method validation is to present objective evidence that a method fulfills the requirements for its intended use. Although much has been published on which parameters to investigate in a method validation, less is available at a detailed level on how to perform the corresponding experiments. To remedy this, standard operating procedures (SOPs) with step-by-step instructions for a number of different validation parameters are included in the present work, together with a validation report template, which allows for a well-ordered presentation of the results. Even though the SOPs were developed with the intended use for immunochemical methods and for multicenter evaluations, most of them are generic and can be used for other technologies as well. PMID:26347708
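
    One SOP-style parameter of the kind the paper standardizes, sketched under invented data: intra- and inter-assay precision of a QC sample expressed as CV%.

    ```python
    # Intra-assay CV (within one run) and inter-assay CV (across runs) for a
    # single QC sample; replicate values are invented.
    import statistics

    runs = [[105.0, 98.0, 101.0], [95.0, 99.0, 97.0], [103.0, 100.0, 106.0]]

    intra_cvs = [100 * statistics.stdev(r) / statistics.mean(r) for r in runs]
    run_means = [statistics.mean(r) for r in runs]
    inter_cv = 100 * statistics.stdev(run_means) / statistics.mean(run_means)

    print("intra-assay CV per run:", [round(cv, 1) for cv in intra_cvs])
    print(f"inter-assay CV: {inter_cv:.1f}%")
    ```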

  21. Fit-for-purpose bioanalytical cross-validation for LC-MS/MS assays in clinical studies.

    PubMed

    Xu, Xiaohui; Ji, Qin C; Jemal, Mohammed; Gleason, Carol; Shen, Jim X; Stouffer, Bruce; Arnold, Mark E

    2013-01-01

    The paradigm shift of globalized research and conducting clinical studies at different geographic locations worldwide to access broader patient populations has resulted in an increased need to correlate bioanalytical results generated in multiple laboratories, often across national borders. Cross-validations of bioanalytical methods are often implemented to demonstrate the equivalency of bioanalytical results. Regulatory agencies, such as the US FDA and European Medicines Agency, have included the requirement of cross-validations in their respective bioanalytical validation guidance and guidelines. While those documents provide high-level expectations, the detailed implementation is at the discretion of each individual organization. At Bristol-Myers Squibb, we practice a fit-for-purpose approach for conducting cross-validations for small-molecule bioanalytical methods using LC-MS/MS. A step-by-step proposal on the overall strategy, procedures and technical details for conducting a successful cross-validation is presented herein. A case study utilizing the proposed cross-validation approach to rule out method variability as the potential cause of high variance observed in PK studies is also presented. PMID:23256474

  22. Validation of qualitative microbiological test methods.

    PubMed

    IJzerman-Boon, Pieta C; van den Heuvel, Edwin R

    2015-01-01

    This paper considers a statistical model for the detection mechanism of qualitative microbiological test methods, with a parameter for the detection proportion (the probability of detecting a single organism) and a parameter for the false positive rate. It is demonstrated that the detection proportion and the bacterial density cannot be estimated separately, not even in a multiple dilution experiment; only their product can be estimated, changing the interpretation of the most probable number estimator. The asymptotic power of the likelihood ratio statistic for comparing an alternative method with the compendial method is optimal for a single dilution experiment. The bacterial density should either be close to two CFUs per test unit or equal to zero, depending on differences in the model parameters between the two test methods. The proposed strategy for method validation is to use these two dilutions and test for differences in the two model parameters, addressing the validation parameters specificity and accuracy. Robustness of these two parameters might still be required, but all other validation parameters can be omitted. A confidence interval-based approach for the ratio of the detection proportions of the two methods is recommended, since it is most informative and close to the power of the likelihood ratio test. PMID:25412584
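
    A sketch consistent with the detection model described here, showing why only the product of detection proportion and density is identifiable; the functional form assumed is the standard binomial-thinning-of-Poisson model, and all parameter values are invented.

    ```python
    # P(test positive) = 1 - (1 - q) * exp(-p * d), with detection proportion p,
    # false-positive rate q, and mean density d per test unit. Two parameter
    # sets with the same product p*d are indistinguishable.
    import math

    def prob_positive(p, q, d):
        return 1.0 - (1.0 - q) * math.exp(-p * d)

    print(prob_positive(p=0.8, q=0.01, d=2.5))  # p*d = 2.0
    print(prob_positive(p=0.5, q=0.01, d=4.0))  # p*d = 2.0 -> same probability
    ```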

  23. Validation of an alternative microbiological method for tissue products.

    PubMed

    Suessner, Susanne; Hennerbichler, Simone; Schreiberhuber, Stefanie; Stuebl, Doris; Gabriel, Christian

    2014-06-01

    According to the European Pharmacopoeia, sterility testing of products includes an incubation time of 14 days in thioglycollate medium and soya-bean casein medium, so a long period of time is needed for product testing. We therefore designed a study to evaluate an alternative method for sterility testing. The aim of this study was to reduce the incubation time for the routinely produced products in our tissue bank (cornea and amnion grafts) while obtaining the same detection limit, accuracy and recovery rates as the reference method described in the European Pharmacopoeia. The study included two steps of validation. The primary validation compared the reference method with the alternative method: eight bacterial and two fungal test strains were tested at their preferred milieu, using a geometric dilution series from 10 to 0.625 colony forming units per 10 ml culture media. Subsequent to this evaluation, the second part of the study included the validation of the fertility of the culture media and parallel testing of the two methods on products. For this purpose two product batches were tested in three independent runs. In the validation we could not find any aberration between the alternative and the reference method. In addition, the recovery rate of each microorganism was between 83.33 and 100 %. The alternative method showed non-inferiority regarding accuracy to the reference method. On the basis of this study we reduced sterility testing for cornea and amniotic grafts to 9 days. PMID:24810914

  24. Model-Based Method for Sensor Validation

    NASA Technical Reports Server (NTRS)

    Vatan, Farrokh

    2012-01-01

    Fault detection, diagnosis, and prognosis are essential tasks in the operation of autonomous spacecraft, instruments, and in situ platforms. One of NASA's key mission requirements is robust state estimation. Sensing, using a wide range of sensors and sensor fusion approaches, plays a central role in robust state estimation, and there is a need to diagnose sensor failure as well as component failure. Sensor validation can be considered part of the larger effort of improving reliability and safety. The standard methods for solving the sensor validation problem are based on probabilistic analysis of the system, of which the method based on Bayesian networks is the most popular. Such methods can only predict the most probable faulty sensors, subject to the initial probabilities defined for the failures. The method developed in this work is instead based on a model-based approach and provides the faulty sensors (if any) that can be logically inferred from the model of the system and the sensor readings (observations). The method is also more suitable for systems in which it is hard, or even impossible, to find the probability functions of the system. The method starts with a new mathematical description of the problem and develops a very efficient and systematic algorithm for its solution. The method builds on the concepts of analytical redundancy relations (ARRs).

  25. Validation methods for flight crucial systems

    NASA Technical Reports Server (NTRS)

    Holt, H. M.

    1983-01-01

    Research to develop techniques that can aid in determining the reliability and performance of digital electronic fault-tolerant systems, which have a probability of catastrophic system failure on the order of 10^-9 at 10 hours, is reviewed. The computer-aided reliability estimation program (CARE III) provides general-purpose reliability analysis and a design tool for fault-tolerant systems; a large reduction of state size; and a fault-handling model based on a probabilistic description of detection, isolation, and recovery mechanisms. The application of design proof techniques as part of the design and development of the software-implemented fault-tolerance computer is mentioned. Emulation techniques and experimental procedures are verified using specimens of fault-tolerant computers and the capabilities of the validation research laboratory, AIRLAB.

  26. [Methods of risk assessment and their validation].

    PubMed

    Baracco, Alessandro

    2014-01-01

    A review of the literature shows several methods for assessing the risk of biomechanical overload of the musculoskeletal system in activities with repetitive strain of the upper limbs and in manual material handling. The application of these methods should allow the quantification of risk for the working population, the identification of preventive measures to reduce the risk, the evaluation of their effectiveness, and the design of a specific health surveillance scheme. In this paper we analyze the factors which must be taken into account in Occupational Medicine to implement a process of validation of these methods. In conclusion, we believe that new methods will be needed in the future that can analyze and reduce risk already in the design phase of the production process. PMID:25558718

  27. New method of deposition of biomolecules for bioelectronic purposes

    NASA Astrophysics Data System (ADS)

    Morales, P.; Sperandei, M.

    1994-02-01

    A laser-induced plasma vaporization and ionization technique is proposed for electric-field-assisted deposition of proteins. Experiments were carried out depositing thick layers of lysozyme on metal, creating submillimeter patterns. The enzymatic activity of horseradish peroxidase deposited by this method was tested, together with that of the enzyme laccase from a micro-organism. The applicability of this method to the construction of nanometric patterns for bioelectronic purposes is discussed.

  28. Establishing the Content Validity of Tests Designed To Serve Multiple Purposes: Bridging Secondary-Postsecondary Mathematics.

    ERIC Educational Resources Information Center

    Burstein, Leigh; And Others

    A method is presented for determining the content validity of a series of secondary school mathematics tests. These tests are part of the Mathematics Diagnostic Testing Project (MDTP), a collaborative effort by California university systems to develop placement examinations and a means to document student preparation in mathematics. Content…

  29. Validation of analytical methods involved in dissolution assays: acceptance limits and decision methodologies.

    PubMed

    Rozet, E; Ziemons, E; Marini, R D; Boulanger, B; Hubert, Ph

    2012-11-01

    Dissolution tests are key elements to ensure continuing product quality and performance. The ultimate goal of these tests is to assure consistent product quality within a defined set of specification criteria. Validation of an analytical method aimed at assessing the dissolution profile of products or at verifying pharmacopoeial compliance should demonstrate that this analytical method is able to correctly declare two dissolution profiles as similar or drug products as compliant with respect to their specifications. It is essential to ensure that these analytical methods are fit for their purpose, and method validation is aimed at providing this guarantee. However, even in the ICH Q2 guideline there is no information explaining how to decide whether the method under validation is valid for its final purpose or not. Are all the validation criteria needed to ensure that a Quality Control (QC) analytical method for a dissolution test is valid? What acceptance limits should be set on these criteria? How should one decide on a method's validity? These are the questions that this work aims at answering. Focus is placed on complying with the current implementation of the Quality by Design (QbD) principles in the pharmaceutical industry in order to correctly define the Analytical Target Profile (ATP) of analytical methods involved in dissolution tests. Analytical method validation is then the natural demonstration that the developed methods are fit for their intended purpose, and no longer the unconsidered checklist approach still generally performed to complete the filing required to obtain product marketing authorization. PMID:23084050
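
    For the declaring-profiles-similar decision discussed here, the standard regulatory criterion is the f2 similarity factor (f2 ≥ 50 conventionally taken as similar); the sketch below computes it on invented dissolution data.

    ```python
    # f2 = 50 * log10(100 / sqrt(1 + mean((R - T)^2))) over paired time points.
    import math

    def f2_similarity(ref, test):
        msd = sum((r - t) ** 2 for r, t in zip(ref, test)) / len(ref)
        return 50.0 * math.log10(100.0 / math.sqrt(1.0 + msd))

    ref  = [25.0, 45.0, 70.0, 85.0, 92.0]   # % dissolved, reference product
    test = [22.0, 43.0, 66.0, 82.0, 90.0]   # % dissolved, test product

    print(f"f2 = {f2_similarity(ref, test):.1f}")  # >= 50 -> profiles similar
    ```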

  30. Space Suit Joint Torque Measurement Method Validation

    NASA Technical Reports Server (NTRS)

    Valish, Dana; Eversley, Karina

    2012-01-01

    In 2009 and early 2010, a test method was developed and performed to quantify the torque required to manipulate joints in several existing operational and prototype space suits. This was done in an effort to develop joint torque requirements appropriate for a new Constellation Program space suit system. The same test method was levied on the Constellation space suit contractors to verify that their suit design met the requirements. However, because the original test was set up and conducted by a single test operator there was some question as to whether this method was repeatable enough to be considered a standard verification method for Constellation or other future development programs. In order to validate the method itself, a representative subset of the previous test was repeated, using the same information that would be available to space suit contractors, but set up and conducted by someone not familiar with the previous test. The resultant data was compared using graphical and statistical analysis; the results indicated a significant variance in values reported for a subset of the re-tested joints. Potential variables that could have affected the data were identified and a third round of testing was conducted in an attempt to eliminate and/or quantify the effects of these variables. The results of the third test effort will be used to determine whether or not the proposed joint torque methodology can be applied to future space suit development contracts.

  31. Formal methods and digital systems validation for airborne systems

    NASA Technical Reports Server (NTRS)

    Rushby, John

    1993-01-01

    This report has been prepared to supplement a forthcoming chapter on formal methods in the FAA Digital Systems Validation Handbook. Its purpose is as follows: to outline the technical basis for formal methods in computer science; to explain the use of formal methods in the specification and verification of software and hardware requirements, designs, and implementations; to identify the benefits, weaknesses, and difficulties in applying these methods to digital systems used on board aircraft; and to suggest factors for consideration when formal methods are offered in support of certification. These latter factors assume the context for software development and assurance described in RTCA document DO-178B, 'Software Considerations in Airborne Systems and Equipment Certification,' Dec. 1992.

  32. Softcopy quality ruler method: implementation and validation

    NASA Astrophysics Data System (ADS)

    Jin, Elaine W.; Keelan, Brian W.; Chen, Junqing; Phillips, Jonathan B.; Chen, Ying

    2009-01-01

    A softcopy quality ruler method was implemented for the International Imaging Industry Association (I3A) Camera Phone Image Quality (CPIQ) Initiative. This work extends ISO 20462 Part 3 by creating reference digital images of known subjective image quality, complementing the hardcopy Standard Reference Stimuli (SRS). The softcopy ruler method was developed using images from a Canon EOS 1Ds Mark II D-SLR digital still camera (DSC) and a Kodak P880 point-and-shoot DSC. Images were viewed on an Apple 30-inch Cinema Display at a viewing distance of 34 inches. Ruler images were made for 16 scenes. Thirty ruler images were generated for each scene, representing ISO 20462 Standard Quality Scale (SQS) values of approximately 2 to 31 at an increment of one just noticeable difference (JND), by adjusting the system modulation transfer function (MTF). A Matlab GUI was developed to display the ruler and test images side-by-side, with a user-adjustable ruler level controlled by a slider. A validation study was performed at Kodak, Vista Point Technology, and Aptina Imaging, in which all three companies set up a similar viewing lab to run the softcopy ruler method. The results show that the three sets of data are in reasonable agreement with each other, with the differences within the range expected from observer variability. Compared to previous implementations of the quality ruler, the slider-based user interface allows approximately 2x faster assessments with 21.6% better precision.

  33. A special purpose knowledge-based face localization method

    NASA Astrophysics Data System (ADS)

    Hassanat, Ahmad; Jassim, Sabah

    2008-04-01

    This paper is concerned with face localization for a visual speech recognition (VSR) system. Face detection and localization have received a great deal of attention in the last few years, because they are an essential pre-processing step in many techniques that handle or deal with faces (e.g., age, face, gender, race and visual speech recognition). We present an efficient method for localizing human faces in video images captured on constrained mobile devices, under a wide variation in lighting conditions. We use a multiphase method that may include all or some of the following steps, starting with image pre-processing, followed by a special-purpose edge detection, then an image refinement step. The output image is passed through a discrete wavelet decomposition procedure, and the computed LL sub-band at a certain level is transformed into a binary image that is scanned using a special template to select a number of possible candidate locations. Finally, we fuse the scores from the wavelet step with scores determined by color information for the candidate locations and employ a form of fuzzy logic to distinguish face from non-face locations. We present results of a large number of experiments to demonstrate that the proposed face localization method is efficient and achieves a high level of accuracy that outperforms existing general-purpose face detection methods.

  34. Tripless control method for general-purpose inverters

    SciTech Connect

    Mutoh, N.; Ueda, A. (Hitachi Research Lab.); Nandoh, K.; Ibori, S.

    1992-09-01

    In this paper a new control method is described. This method prevents general-purpose inverters without current regulators from tripping easily, i.e., keeps them tripless no matter how their load is varied, and enables motors to rotate stably at high frequencies. The control is performed using only current sensors and is a combination of PWM control and torque control. The PWM control changes from an asynchronous mode to a synchronized mode when the modulation ratio becomes more than one, which enables the carrier wave frequency to be varied continuously with the inverter frequency. As a result, motors can rotate stably over a wide frequency range. The torque control uses a real and reactive component detector, magnetic flux compensator, slip compensator, and current limit controller.

  35. FDIR Strategy Validation with the B Method

    NASA Astrophysics Data System (ADS)

    Sabatier, D.; Dellandrea, B.; Chemouil, D.

    2008-08-01

    In a formation flying satellite system, the FDIR strategy (Failure Detection, Isolation and Recovery) is paramount. When a failure occurs, satellites should be able to take appropriate reconfiguration actions to obtain the best possible results given the failure, ranging from avoiding satellite-to-satellite collision to continuing the mission without disturbance if possible. To achieve this goal, each satellite in the formation has an implemented FDIR strategy that governs how it detects failures (from tests or by deduction) and how it reacts (reconfiguration using redundant equipment, avoidance manoeuvres, etc.). The goal is to protect the satellites first and the mission as much as possible. In a project initiated by the CNES, ClearSy is experimenting with the B Method to validate the FDIR strategies, developed by Thales Alenia Space, of the inter-satellite positioning and communication devices that will be used for the SIMBOL-X (2-satellite configuration) and PEGASE (3-satellite configuration) missions, and potentially for other missions afterward. These radio frequency metrology sensor devices provide satellite positioning and inter-satellite communication in formation flying. This article presents the results of this experiment.

  36. Some useful statistical methods for model validation.

    PubMed Central

    Marcus, A H; Elias, R W

    1998-01-01

    Although formal hypothesis tests provide a convenient framework for displaying the statistical results of empirical comparisons, standard tests should not be used without consideration of underlying measurement error structure. As part of the validation process, predictions of individual blood lead concentrations from models with site-specific input parameters are often compared with blood lead concentrations measured in field studies that also report lead concentrations in environmental media (soil, dust, water, paint) as surrogates for exposure. Measurements of these environmental media are subject to several sources of variability, including temporal and spatial sampling, sample preparation and chemical analysis, and data entry or recording. Adjustments for measurement error must be made before statistical tests can be used to empirically compare environmental data with model predictions. This report illustrates the effect of measurement error correction using a real dataset of child blood lead concentrations for an undisclosed midwestern community. We illustrate both the apparent failure of some standard regression tests and the success of adjustment of such tests for measurement error using the SIMEX (simulation-extrapolation) procedure. This procedure adds simulated measurement error to model predictions and then subtracts the total measurement error, analogous to the method of standard additions used by analytical chemists. PMID:9860913
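
    A compact sketch of the SIMEX idea summarized above: re-estimate a regression slope after adding simulated measurement error at increasing levels, then extrapolate back to zero total error (lambda = -1). The data and error variance are invented.

    ```python
    # SIMEX on a simple linear regression with an error-contaminated predictor.
    import numpy as np

    rng = np.random.default_rng(0)
    n, sigma_u = 500, 0.5                        # known measurement-error SD
    x = rng.normal(0, 1, n)
    y = 2.0 * x + rng.normal(0, 1, n)            # true slope = 2
    w = x + rng.normal(0, sigma_u, n)            # observed, error-prone predictor

    lams = np.array([0.0, 0.5, 1.0, 1.5, 2.0])   # added-error variance levels
    slopes = [np.mean([np.polyfit(w + rng.normal(0, sigma_u * np.sqrt(l), n), y, 1)[0]
                       for _ in range(50)])      # average over pseudo datasets
              for l in lams]

    quad = np.polyfit(lams, slopes, 2)           # quadratic extrapolant
    print("naive slope:", round(slopes[0], 3),
          "SIMEX estimate:", round(np.polyval(quad, -1.0), 3))
    ```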

  37. Validation of Multilevel Constructs: Validation Methods and Empirical Findings for the EDI

    ERIC Educational Resources Information Center

    Forer, Barry; Zumbo, Bruno D.

    2011-01-01

    The purposes of this paper are to highlight the foundations of multilevel construct validation, describe two methodological approaches and associated analytic techniques, and then apply these approaches and techniques to the multilevel construct validation of a widely-used school readiness measure called the Early Development Instrument (EDI;…

  38. Methods for developing and validating survivability distributions

    SciTech Connect

    Williams, R.L.

    1993-10-01

    A previous report explored and discussed statistical methods and procedures that may be applied to validate the survivability of a complex system of systems that cannot be tested as an entity. It described a methodology where Monte Carlo simulation was used to develop the system survivability distribution from the component distributions using a system model that registers the logical interactions of the components to perform system functions. This paper discusses methods that can be used to develop the required survivability distributions based upon three sources of knowledge. These are (1) available test results; (2) little or no available test data, but a good understanding of the physical laws and phenomena which can be applied by computer simulation; and (3) neither test data nor adequate knowledge of the physics are known, in which case, one must rely upon, and quantify, the judgement of experts. This paper describes the relationship between the confidence bounds that can be placed on survivability and the number of tests conducted. It discusses the procedure for developing system level survivability distributions from the distributions for lower levels of integration. It demonstrates application of these techniques by defining a communications network for a Hypothetical System Architecture. A logic model for the performance of this communications network is developed, as well as the survivability distributions for the nodes and links based on two alternate data sets, reflecting the effects of increased testing of all elements. It then shows how this additional testing could be optimized by concentrating only on those elements contained in the low-order fault sets which the methodology identifies.
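
    The confidence-bound/test-count relationship mentioned in this abstract can be made concrete with the standard zero-failure (success-run) formula, sketched below; the confidence level chosen is illustrative.

    ```python
    # With n tests and zero failures, the lower one-sided bound R on
    # survivability at confidence C satisfies R**n = 1 - C.
    def lower_bound_reliability(n_tests, confidence):
        return (1.0 - confidence) ** (1.0 / n_tests)

    for n in (10, 30, 100, 300):
        print(f"n = {n:3d}: R >= {lower_bound_reliability(n, 0.90):.4f}")
    # More tests tighten the bound: n=10 gives R >= 0.794, n=300 gives R >= 0.992.
    ```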

  39. OWL-based reasoning methods for validating archetypes.

    PubMed

    Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás

    2013-04-01

    Some modern Electronic Healthcare Record (EHR) architectures and standards are based on the dual model-based architecture, which defines two conceptual levels: the reference model and the archetype model. Such architectures represent EHR domain knowledge by means of archetypes, which are considered by many researchers to play a fundamental role in the achievement of semantic interoperability in healthcare. Consequently, formal methods for validating archetypes are necessary. In recent years, there has been increasing interest in exploring how semantic web technologies in general, and ontologies in particular, can facilitate the representation and management of archetypes, including binding to terminologies, but no solution based on such technologies had been provided to date to validate archetypes. Our approach represents archetypes by means of OWL ontologies. This makes it possible to combine the two levels of the dual model-based architecture in one modeling framework, which can also integrate terminologies available in OWL format. The validation method consists of reasoning on those ontologies to find modeling errors in archetypes: incorrect restrictions over the reference model, non-conformant archetype specializations and inconsistent terminological bindings. The archetypes available in the repositories supported by the openEHR Foundation and the NHS Connecting for Health Program, the two largest publicly available ones, have been analyzed with our validation method. For this purpose, we implemented a software tool called Archeck. Our results show that around one fifth of archetype specializations contain modeling errors, the most common mistakes being related to coded terms and terminological bindings. The analysis of each repository reveals that different patterns of errors are found in the two repositories. This result reinforces the need for serious efforts to improve archetype design processes. PMID:23246613
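
    A minimal sketch of the kind of consistency check such a method relies on, assuming the owlready2 library and a hypothetical ontology file name (Archeck itself is not publicly specified here).

    ```python
    # Load an OWL rendering of an archetype, run a reasoner, and report
    # unsatisfiable classes (classes inferred equivalent to owl:Nothing).
    from owlready2 import get_ontology, sync_reasoner, Nothing

    onto = get_ontology("file://archetype_model.owl").load()  # hypothetical file
    with onto:
        sync_reasoner()  # runs the bundled HermiT reasoner (requires Java)

    errors = [c for c in onto.classes() if Nothing in c.equivalent_to]
    print("unsatisfiable classes:", errors)
    ```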

  40. The method of measurement system software automatic validation using business rules management system

    NASA Astrophysics Data System (ADS)

    Zawistowski, Piotr

    2015-09-01

    The method of measurement system software automatic validation using a business rules management system (BRMS) is discussed in this paper. The article contains a description of a new approach to measurement system execution validation, a description of the implementation of the system that supports this validation, and examples documenting the correctness of the approach. In the new approach, a BRMS is used for measurement system execution validation. Such systems have previously been used neither for software execution validation nor for measurement systems. The benefits of using them for these purposes are discussed as well.

  41. Evaluating regional vulnerability to climate change: purposes and methods

    SciTech Connect

    Malone, Elizabeth L.; Engle, Nathan L.

    2011-03-15

    As the emphasis in climate change research, international negotiations, and developing-country activities has shifted from mitigation to adaptation, vulnerability has emerged as a bridge between impacts on one side and the need for adaptive changes on the other. Still, the term vulnerability remains abstract, its meaning changing with the scale, focus, and purpose of each assessment. Understanding regional vulnerability has advanced over the past several decades, with studies using a combination of indicators, case studies and analogues, stakeholder-driven processes, and scenario-building methodologies. As regions become increasingly relevant scales of inquiry for bridging the aggregate and local, for every analysis, it is perhaps most appropriate to ask three “what” questions: “What/who is vulnerable?,” “What is vulnerability?,” and “Vulnerable to what?” The answers to these questions will yield different definitions of vulnerability as well as different methods for assessing it.

  42. Purpose and methods of a Pollution Prevention Awareness Program

    SciTech Connect

    Flowers, P.A.; Irwin, E.F.; Poligone, S.E.

    1994-08-15

    The purpose of the Pollution Prevention Awareness Program (PPAP), which is required by DOE Order 5400.1, is to foster the philosophy that prevention is superior to remediation. The goal of the program is to incorporate pollution prevention into the decision-making process at every level throughout the organization. The objectives are to instill awareness, disseminate information, provide training and rewards for identifying the true source or cause of wastes, and encourage employee participation in solving environmental issues and preventing pollution. PPAP at the Oak Ridge Y-12 Plant was created several years ago and continues to grow. We believe that we have implemented several unique methods of communicating environmental awareness to promote a more active work force in identifying ways of reducing pollution.

  43. Validation of a Theoretical Model of Diagnostic Classroom Assessment: A Mixed Methods Study

    ERIC Educational Resources Information Center

    Koh, Nancy

    2012-01-01

    The purpose of the study was to validate a theoretical model of diagnostic, formative classroom assessment called, "Proximal Assessment for Learner Diagnosis" (PALD). To achieve its purpose, the study employed a two-stage, mixed-methods design. The study utilized multiple data sources from 11 elementary level mathematics teachers who…

  44. The Value of Qualitative Methods in Social Validity Research

    ERIC Educational Resources Information Center

    Leko, Melinda M.

    2014-01-01

    One quality indicator of intervention research is the extent to which the intervention has a high degree of social validity, or practicality. In this study, I drew on Wolf's framework for social validity and used qualitative methods to ascertain five middle school teachers' perceptions of the social validity of System 44®--a phonics-based…

  45. A Clinical Method for Identifying Scapular Dyskinesis, Part 2: Validity

    PubMed Central

    Tate, Angela R; McClure, Philip; Kareha, Stephen; Irwin, Dominic; Barbe, Mary F

    2009-01-01

    Context: Although clinical methods for detecting scapular dyskinesis have been described, evidence supporting the validity of these methods is lacking. Objective: To determine the validity of the scapular dyskinesis test, a visually based method of identifying abnormal scapular motion. A secondary purpose was to explore the relationship between scapular dyskinesis and shoulder symptoms. Design: Validation study comparing 3-dimensional measures of scapular motion among participants clinically judged as having either normal motion or scapular dyskinesis. Setting: University athletic training facilities. Patients or Other Participants: A sample of 142 collegiate athletes (National Collegiate Athletic Association Division I and Division III) participating in sports requiring overhead use of the arm was rated, and 66 of these underwent 3-dimensional testing. Intervention(s): Volunteers were viewed by 2 raters while performing weighted shoulder flexion and abduction. The right and left sides were rated independently as normal, subtle dyskinesis, or obvious dyskinesis using the scapular dyskinesis test. Symptoms were assessed using the Penn Shoulder Score. Main Outcome Measure(s): Athletes judged as having either normal motion or obvious dyskinesis underwent 3-dimensional electromagnetic kinematic testing while performing the same movements. The kinematic data from both groups were compared via multifactor analysis of variance with post hoc testing using the least significant difference procedure. The relationship between symptoms and scapular dyskinesis was evaluated by odds ratios. Results: Differences were found between the normal and obvious dyskinesis groups. Participants with obvious dyskinesis showed less scapular upward rotation (P < .001), less clavicular elevation (P < .001), and greater clavicular protraction (P = .044). The presence of shoulder symptoms was not different between the normal and obvious dyskinesis volunteers (odds ratio = 0.79, 95

  46. Statistical Methods for Rapid Aerothermal Analysis and Design Technology: Validation

    NASA Technical Reports Server (NTRS)

    DePriest, Douglas; Morgan, Carolyn

    2003-01-01

    The cost and safety goals for NASA's next generation of reusable launch vehicles (RLV) will require that rapid high-fidelity aerothermodynamic design tools be used early in the design cycle. To meet these requirements, it is desirable to identify adequate statistical models that quantify and improve the accuracy, extend the applicability, and enable combined analyses using existing prediction tools. The initial research work focused on establishing suitable candidate models for these purposes. The second phase focused on assessing the performance of these models in accurately predicting the heat rate for a given candidate data set. This validation work compared models and methods that may be useful in predicting the heat rate.

  47. Estimates of External Validity Bias When Impact Evaluations Select Sites Purposively

    ERIC Educational Resources Information Center

    Stuart, Elizabeth A.; Olsen, Robert B.; Bell, Stephen H.; Orr, Larry L.

    2012-01-01

    While there has been some increasing interest in external validity, most work to this point has been in assessing the similarity of a randomized trial sample and a population of interest (e.g., Stuart et al., 2010; Tipton, 2011). The goal of this research is to calculate empirical estimates of the external validity bias in educational intervention…

  48. Development and Validation of a Reading-Related Assessment Battery in Malay for the Purpose of Dyslexia Assessment

    ERIC Educational Resources Information Center

    Lee, Lay Wah

    2008-01-01

    Malay is an alphabetic language with transparent orthography. A Malay reading-related assessment battery which was conceptualised based on the International Dyslexia Association definition of dyslexia was developed and validated for the purpose of dyslexia assessment. The battery consisted of ten tests: Letter Naming, Word Reading, Non-word…

  49. Illustrating a Mixed-Method Approach for Validating Culturally Specific Constructs

    ERIC Educational Resources Information Center

    Hitchcock, J.H.; Nastasi, B.K.; Dai, D.Y.; Newman, J.; Jayasena, A.; Bernstein-Moore, R.; Sarkar, S.; Varjas, K.

    2005-01-01

    The purpose of this article is to illustrate a mixed-method approach (i.e., combining qualitative and quantitative methods) for advancing the study of construct validation in cross-cultural research. The article offers a detailed illustration of the approach using the responses 612 Sri Lankan adolescents provided to an ethnographic survey. Such…

  50. Purpose in Life in Emerging Adulthood: Development and Validation of a New Brief Measure

    PubMed Central

    Hill, Patrick L.; Edmonds, Grant W.; Peterson, Missy; Luyckx, Koen; Andrews, Judy A.

    2015-01-01

    Accruing evidence points to the value of studying purpose in life across adolescence and emerging adulthood. Research, though, is needed to understand the unique role of purpose in life in predicting well-being and developmentally relevant outcomes during emerging adulthood. The current studies (total n = 669) found support for the development of a new brief measure of purpose in life using data from American and Canadian samples, while demonstrating evidence for two important findings. First, purpose in life predicted well-being during emerging adulthood, even when controlling for the Big Five personality traits. Second, purpose in life was positively associated with self-image and negatively associated with delinquency, again controlling for personality traits. Findings are discussed with respect to how studying purpose in life can help understand which individuals are more likely to experience positive transitions into adulthood. PMID:26958072

  51. External validation of a Cox prognostic model: principles and methods

    PubMed Central

    2013-01-01

    Background: A prognostic model should not enter clinical practice unless it has been demonstrated that it performs a useful role. External validation denotes evaluation of model performance in a sample independent of that used to develop the model. Unlike for logistic regression models, external validation of Cox models is sparsely treated in the literature. Successful validation of a model means achieving satisfactory discrimination and calibration (prediction accuracy) in the validation sample. Validating Cox models is not straightforward because event probabilities are estimated relative to an unspecified baseline function. Methods: We describe statistical approaches to external validation of a published Cox model according to the level of published information, specifically (1) the prognostic index only, (2) the prognostic index together with Kaplan-Meier curves for risk groups, and (3) the first two plus the baseline survival curve (the estimated survival function at the mean prognostic index across the sample). The most challenging task, requiring level 3 information, is assessing calibration, for which we suggest a method of approximating the baseline survival function. Results: We apply the methods to two comparable datasets in primary breast cancer, treating one as derivation and the other as validation sample. Results are presented for discrimination and calibration. We demonstrate plots of survival probabilities that can assist model evaluation. Conclusions: Our validation methods are applicable to a wide range of prognostic studies and provide researchers with a toolkit for external validation of a published Cox model. PMID:23496923
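
    A sketch of the level-1 validation described above (prognostic index only): assess discrimination in an independent sample with Harrell's c-index, here via the lifelines package; the coefficients and data are invented.

    ```python
    # Discrimination of a published prognostic index on a validation sample.
    import numpy as np
    from lifelines.utils import concordance_index

    rng = np.random.default_rng(1)
    n = 200
    age, grade = rng.normal(60, 10, n), rng.integers(1, 4, n)
    pi = 0.03 * age + 0.5 * grade                 # assumed published coefficients
    time = rng.exponential(100 * np.exp(-(pi - pi.mean())))  # survival times
    event = rng.random(n) < 0.7                   # True = event observed

    # Higher prognostic index implies shorter survival, hence the minus sign.
    c = concordance_index(time, -pi, event_observed=event)
    print(f"Harrell's c-index in the validation sample: {c:.2f}")
    ```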

  12. Bioanalytical method validation: An updated review.

    PubMed

    Tiwari, Gaurav; Tiwari, Ruchi

    2010-10-01

    The development of sound bioanalytical method(s) is of paramount importance during the process of drug discovery and development, culminating in a marketing approval. The objective of this paper is to review sample preparation of drugs in biological matrices and to provide practical approaches for determining selectivity, specificity, limit of detection, lower limit of quantitation, linearity, range, accuracy, precision, recovery, stability, ruggedness, and robustness of liquid chromatographic methods to support pharmacokinetic (PK), toxicokinetic, bioavailability, and bioequivalence studies. Bioanalysis, employed for the quantitative determination of drugs and their metabolites in biological fluids, plays a significant role in the evaluation and interpretation of bioequivalence, PK, and toxicokinetic studies. Selective and sensitive analytical methods for the quantitative evaluation of drugs and their metabolites are critical for the successful conduct of pre-clinical and/or biopharmaceutics and clinical pharmacology studies. PMID:23781413
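
    The accuracy and precision criteria reviewed here are typically evaluated on replicate quality-control samples. A minimal sketch of that calculation, with hypothetical concentrations and the commonly cited acceptance limits of plus or minus 15% (20% at the LLOQ), might look as follows:

    ```python
    import numpy as np

    # Hypothetical QC replicates (ng/mL) at one concentration level.
    nominal = 50.0
    measured = np.array([48.2, 51.0, 49.5, 50.8, 47.9])

    bias_pct = (measured.mean() - nominal) / nominal * 100  # accuracy (%RE)
    cv_pct = measured.std(ddof=1) / measured.mean() * 100   # precision (%CV)

    # Typical regulatory acceptance: within 15% (20% at the LLOQ).
    print(f"accuracy bias = {bias_pct:+.1f}%, precision CV = {cv_pct:.1f}%")
    ```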

  13. Design and validation of a general purpose robotic testing system for musculoskeletal applications.

    PubMed

    Noble, Lawrence D; Colbrunn, Robb W; Lee, Dong-Gil; van den Bogert, Antonie J; Davis, Brian L

    2010-02-01

    Orthopaedic research on in vitro forces applied to bones, tendons, and ligaments during joint loading has been difficult to perform because of limitations with existing robotic simulators in applying full-physiological loading to the joint under investigation in real time. The objectives of the current work are as follows: (1) describe the design of a musculoskeletal simulator developed to support in vitro testing of cadaveric joint systems, (2) provide component and system-level validation results, and (3) demonstrate the simulator's usefulness for specific applications of the foot-ankle complex and knee. The musculoskeletal simulator allows researchers to simulate a variety of loading conditions on cadaver joints via motorized actuators that simulate muscle forces while simultaneously contacting the joint with an external load applied by a specialized robot. Multiple foot and knee studies have been completed at the Cleveland Clinic to demonstrate the simulator's capabilities. Using a variety of general-use components, experiments can be designed to test other musculoskeletal joints as well (e.g., hip, shoulder, facet joints of the spine). The accuracy of the tendon actuators in generating a target force profile during simulated walking was found to be highly variable and dependent on stance position. Repeatability (the ability of the system to generate the same tendon forces when the same experimental conditions are repeated) results showed that repeat forces were within the measurement accuracy of the system. Synchronization system accuracy was determined to be 6.7 ± 2.0 ms, based on timing measurements from the robot and tendon actuators. The positioning error of the robot ranged from 10 μm to 359 μm, depending on measurement condition (e.g., loaded or unloaded, quasistatic or dynamic motion, centralized movements or extremes of travel, maximum value or root-mean-square, and x-, y- or z-axis motion). Algorithms and methods for controlling…

  14. Exploring valid and reliable assessment methods for care management education.

    PubMed

    Gennissen, Lokke; Stammen, Lorette; Bueno-de-Mesquita, Jolien; Wieringa, Sietse; Busari, Jamiu

    2016-07-01

    Purpose It is assumed that the use of valid and reliable assessment methods can facilitate the development of medical residents' management and leadership competencies. To justify this assertion, the perceptions of an expert panel of health care leaders were explored on assessment methods used for evaluating care management (CM) development in Dutch residency programs. This paper aims to investigate how assessors and trainees value these methods and to examine any inherent benefits or shortcomings that emerge when they are applied in practice. Design/methodology/approach A Delphi survey was conducted among members of the platform for medical leadership in The Netherlands. This panel of experts was made up of clinical educators, practitioners and residents interested in CM education. Findings Of the respondents, 40 (55.6 per cent) and 31 (43 per cent) participated in the first and second rounds of the Delphi survey, respectively. The respondents agreed that assessment methods currently being used to measure residents' CM competencies were weak, though feasible for use in many residency programs. Multi-source feedback (MSF, 92.1 per cent), portfolio/e-portfolio (86.8 per cent) and knowledge testing (76.3 per cent) were identified as the most commonly known assessment methods, with familiarity rates exceeding 75 per cent. Practical implications The findings suggested that an "assessment framework" comprising MSF, portfolios, individual process improvement projects or self-reflections and observations in clinical practice should be used to measure CM competencies in residents. Originality/value This study reaffirms the need for objective methods to assess CM skills in post-graduate medical education, as there was not a single assessment method that stood out as the best instrument. PMID:27397747

  15. Comparative assessment of bioanalytical method validation guidelines for pharmaceutical industry.

    PubMed

    Kadian, Naveen; Raju, Kanumuri Siva Rama; Rashid, Mamunur; Malik, Mohd Yaseen; Taneja, Isha; Wahajuddin, Muhammad

    2016-07-15

    The concepts, importance, and application of bioanalytical method validation have been discussed for a long time, and validation of bioanalytical methods is widely accepted as pivotal before they are taken into routine use. The United States Food and Drug Administration (USFDA) guidelines issued in 2001 have served as the reference for every guideline released since, be it from the European Medicines Agency (EMA) in Europe, the National Health Surveillance Agency (ANVISA) in Brazil, the Ministry of Health, Labour and Welfare (MHLW) in Japan, or any other authority addressing bioanalytical method validation. After 12 years, USFDA released its new draft guideline for comments in 2013, which covers the latest parameters and topics encountered in bioanalytical method validation and moves towards the harmonization of bioanalytical method validation across the globe. Even though the regulatory agencies are in general agreement, significant variations exist in acceptance criteria and methodology. The present review highlights the variations, similarities and comparison between bioanalytical method validation guidelines issued by major regulatory authorities worldwide. Additionally, other evaluation parameters such as matrix effect and incurred sample reanalysis, including other stability aspects, have been discussed to provide ease of access for designing a bioanalytical method and its validation complying with the majority of drug authority guidelines. PMID:27179186

  16. Method validation for chemical composition determination by electron microprobe with wavelength dispersive spectrometer

    NASA Astrophysics Data System (ADS)

    Herrera-Basurto, R.; Mercader-Trejo, F.; Muñoz-Madrigal, N.; Juárez-García, J. M.; Rodriguez-López, A.; Manzano-Ramírez, A.

    2016-07-01

    The main goal of method validation is to demonstrate that the method is suitable for its intended purpose. One of the advantages of analytical method validation is that it translates into a level of confidence about the measurement results reported to satisfy a specific objective. Elemental composition determination by wavelength dispersive spectrometer (WDS) microanalysis has been used over extremely wide areas, mainly in the field of materials science and in impurity determinations in geological, biological and food samples. However, little information is reported about the validation of the applied methods. Herein, results of the in-house method validation for elemental composition determination by WDS are shown. SRM 482, a set of binary Cu-Au alloys of different compositions, was used during the validation protocol, following the recommendations for method validation proposed by Eurachem. This paper can be taken as a reference for the evaluation of the validation parameters most frequently requested to obtain accreditation under the requirements of the ISO/IEC 17025 standard: selectivity, limit of detection, linear interval, sensitivity, precision, trueness and uncertainty. A model for uncertainty estimation was proposed, including systematic and random errors. In addition, parameters evaluated during the validation process were also considered as part of the uncertainty model.
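
    One simple way to combine random and systematic contributions into an expanded uncertainty, in the spirit of the model described (though not necessarily the authors' exact formulation), is sketched below. The WDS replicates, the certified value, and the reference uncertainty are assumed for illustration only.

    ```python
    import numpy as np

    # Hypothetical WDS mass-fraction measurements of Cu in SRM 482 (wt %).
    replicates = np.array([40.2, 39.8, 40.5, 40.1, 39.9])
    certified = 40.1    # certified reference value (assumed)
    u_certified = 0.1   # standard uncertainty of the reference (assumed)

    u_random = replicates.std(ddof=1) / np.sqrt(len(replicates))  # type A
    bias = replicates.mean() - certified             # systematic component
    u_bias = np.sqrt(u_certified**2 + u_random**2)   # uncertainty of the bias

    # Combine random and systematic contributions, then expand with k = 2.
    u_combined = np.sqrt(u_random**2 + u_bias**2)
    U = 2 * u_combined
    print(f"bias = {bias:+.2f} wt %, result = "
          f"{replicates.mean():.2f} +/- {U:.2f} wt % (k = 2)")
    ```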

  17. Validation of Analytical Methods for Biomarkers Employed in Drug Development

    PubMed Central

    Chau, Cindy H.; Rixe, Olivier; McLeod, Howard; Figg, William D.

    2008-01-01

    The role of biomarkers in drug discovery and development has gained precedence over the years. As biomarkers become integrated into drug development and clinical trials, quality assurance and in particular assay validation becomes essential with the need to establish standardized guidelines for analytical methods used in biomarker measurements. New biomarkers can revolutionize both the development and use of therapeutics, but is contingent upon the establishment of a concrete validation process that addresses technology integration and method validation as well as regulatory pathways for efficient biomarker development. This perspective focuses on the general principles of the biomarker validation process with an emphasis on assay validation and the collaborative efforts undertaken by various sectors to promote the standardization of this procedure for efficient biomarker development. PMID:18829475

  18. Validation of population-based disease simulation models: a review of concepts and methods

    PubMed Central

    2010-01-01

    Background Computer simulation models are used increasingly to support public health research and policy, but questions about their quality persist. The purpose of this article is to review the principles and methods for validation of population-based disease simulation models. Methods We developed a comprehensive framework for validating population-based chronic disease simulation models and used this framework in a review of published model validation guidelines. Based on the review, we formulated a set of recommendations for gathering evidence of model credibility. Results Evidence of model credibility derives from examining: 1) the process of model development, 2) the performance of a model, and 3) the quality of decisions based on the model. Many important issues in model validation are insufficiently addressed by current guidelines. These issues include a detailed evaluation of different data sources, graphical representation of models, computer programming, model calibration, between-model comparisons, sensitivity analysis, and predictive validity. The role of external data in model validation depends on the purpose of the model (e.g., decision analysis versus prediction). More research is needed on the methods of comparing the quality of decisions based on different models. Conclusion As the role of simulation modeling in population health is increasing and models are becoming more complex, there is a need for further improvements in model validation methodology and common standards for evaluating model credibility. PMID:21087466

  19. Unexpected but Most Welcome: Mixed Methods for the Validation and Revision of the Participatory Evaluation Measurement Instrument

    ERIC Educational Resources Information Center

    Daigneault, Pierre-Marc; Jacob, Steve

    2014-01-01

    Although combining methods is nothing new, more contributions about why and how to mix methods for validation purposes are needed. This article presents a case of validating the inferences drawn from the Participatory Evaluation Measurement Instrument, an instrument that purports to measure stakeholder participation in evaluation. Although the…

  20. Validity Argument for Assessing L2 Pragmatics in Interaction Using Mixed Methods

    ERIC Educational Resources Information Center

    Youn, Soo Jung

    2015-01-01

    This study investigates the validity of assessing L2 pragmatics in interaction using mixed methods, focusing on the evaluation inference. Open role-plays that are meaningful and relevant to the stakeholders in an English for Academic Purposes context were developed for classroom assessment. For meaningful score interpretations and accurate…

  1. Reliability and validity of optoelectronic method for biophotonical measurements

    NASA Astrophysics Data System (ADS)

    Karpienko, Katarzyna; Wróbel, Maciej S.; Urniaż, Rafał

    2013-11-01

    Reliability and validity of measurements are of utmost importance when assessing the measuring capability of instruments developed for research. In order to perform a legitimate experiment, the instruments used must be both reliable and valid. Reliability estimates the degree of precision of a measurement, the extent to which a measurement is internally consistent. Validity is the usefulness of an instrument for performing accurate measurements of the quantities it was designed to measure. Statistical analysis for reliability and validity control of a low-coherence interferometry method for refractive index measurements of biological fluids is presented. The low-coherence interferometer is sensitive to the optical path difference between interfering beams, which depends on the refractive index of the measured material. To assess the validity and reliability of the proposed method for blood measurements, a statistical analysis of the method was performed on several substances with known refractive indices. Analysis of the low-coherence interferograms considered the mean distances between fringes. The statistical analysis for validity and reliability consisted of Grubbs' test for outliers, the Shapiro-Wilk test for normality, Student's t-test, standard deviation, the coefficient of determination and Pearson's r correlation. Overall, the tests confirmed the statistical significance of the measurement method at a confidence level of p < 0.0001.
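
    Several of the tests listed are available in standard statistics libraries. The sketch below runs the normality and offset checks on fringe-distance data; the reference value and the measurements are invented for illustration and are not the study's data.

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical fringe-distance measurements (µm) for a reference fluid.
    d = np.array([102.1, 101.8, 102.4, 102.0, 101.9, 102.2, 101.7, 102.3])
    reference = 102.0  # expected distance from the known refractive index

    w, p_norm = stats.shapiro(d)               # Shapiro-Wilk normality test
    t, p_t = stats.ttest_1samp(d, reference)   # Student's t-test vs. reference

    print(f"Shapiro-Wilk p = {p_norm:.3f} (consistent with normal if p > 0.05)")
    print(f"t-test p = {p_t:.3f} (no systematic offset if p > 0.05)")
    print(f"SD = {d.std(ddof=1):.3f} µm")
    ```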

  2. High Explosive Verification and Validation: Systematic and Methodical Approach

    NASA Astrophysics Data System (ADS)

    Scovel, Christina; Menikoff, Ralph

    2011-06-01

    Verification and validation of high explosive (HE) models does not fit the standard mold for several reasons. First, there are no non-trivial test problems with analytic solutions. Second, an HE model depends on a burn rate and the equations of state (EOS) of both the reactants and products. Third, there is a wide range of detonation phenomena, from initiation under various stimuli to propagation of curved detonation fronts with non-rigid confining materials. Fourth, in contrast to a shock wave in a non-reactive material, the reaction-zone width is physically significant and affects the behavior of a detonation wave. Because of these issues, a systematic and methodical approach to HE V & V is needed. Our plan is to build a test suite from the ground up. We have started with the cylinder test and have run simulations with several EOS models and burn models. We have compared with data and cross-compared the different runs to check the sensitivity to model parameters. A related issue for V & V is what experimental data are available for calibrating and testing models. For this purpose we have started a web-based high explosive database (HED). The current status of HED will be discussed.

  3. Development and validation of a new fallout transport method using variable spectral winds. Doctoral thesis

    SciTech Connect

    Hopkins, A.T.

    1984-09-01

    The purpose of this research was to develop and validate a fallout prediction method using variable transport calculations. The new method uses National Meteorological Center (NMC) spectral coefficients to compute wind vectors along the space- and time-varying trajectories of falling particles. The method was validated by comparing computed and actual cloud trajectories from a Mount St. Helens volcanic eruption and a high dust cloud. In summary, this research demonstrated the feasibility of using spectral coefficients for fallout transport calculations, developed a two-step smearing model to treat variable winds, and showed that uncertainties in spectral winds do not contribute significantly to the error in computed dose rate.

  4. An introduction to clinical microeconomic analysis: purposes and analytic methods.

    PubMed

    Weintraub, W S; Mauldin, P D; Becker, E R

    1994-06-01

    The recent concern with health care economics has fostered the development of a new discipline that is generally called clinical microeconomics. This is a discipline in which microeconomic methods are used to study the economics of specific medical therapies. It is possible to perform stand alone cost analyses, but more profound insight into the medical decision making process may be accomplished by combining cost studies with measures of outcome. This is most often accomplished with cost-effectiveness or cost-utility studies. In cost-effectiveness studies there is one measure of outcome, often death. In cost-utility studies there are multiple measures of outcome, which must be grouped together to give an overall picture of outcome or utility. There are theoretical limitations to the determination of utility that must be accepted to perform this type of analysis. A summary statement of outcome is quality adjusted life years (QALYs), which is utility times socially discounted survival. Discounting is used because people value a year of future life less than a year of present life. Costs are made up of in-hospital direct, professional, follow-up direct, and follow-up indirect costs. Direct costs are for medical services. Indirect costs reflect opportunity costs such as lost time at work. Cost estimates are often based on marginal costs, or the cost for one additional procedure of the same type. Finally, an overall statistic may be generated as cost per unit increase in effectiveness, such as dollars per QALY. (ABSTRACT TRUNCATED AT 250 WORDS) PMID:10151059
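
    The discounting arithmetic behind QALYs is straightforward. A toy example, with an assumed 3% annual discount rate, a utility of 0.8 and a hypothetical incremental cost (none of these figures come from the article):

    ```python
    # Hypothetical patient: 5 remaining life-years at utility 0.8 each,
    # discounted at 3% per year, as commonly done in cost-utility analysis.
    utility = 0.8
    discount_rate = 0.03
    years = 5

    qalys = sum(utility / (1 + discount_rate) ** t for t in range(years))
    cost = 25_000.0  # hypothetical incremental cost of the therapy ($)
    print(f"QALYs = {qalys:.2f}, cost per QALY = ${cost / qalys:,.0f}")
    ```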

  5. Testing and Validation of Computational Methods for Mass Spectrometry.

    PubMed

    Gatto, Laurent; Hansen, Kasper D; Hoopmann, Michael R; Hermjakob, Henning; Kohlbacher, Oliver; Beyer, Andreas

    2016-03-01

    High-throughput methods based on mass spectrometry (proteomics, metabolomics, lipidomics, etc.) produce a wealth of data that cannot be analyzed without computational methods. The impact of the choice of method on the overall result of a biological study is often underappreciated, but different methods can result in very different biological findings. It is thus essential to evaluate and compare the correctness and relative performance of computational methods. The volume of the data as well as the complexity of the algorithms render unbiased comparisons challenging. This paper discusses some problems and challenges in testing and validation of computational methods. We discuss the different types of data (simulated and experimental validation data) as well as different metrics to compare methods. We also introduce a new public repository for mass spectrometric reference data sets (http://compms.org/RefData) that contains a collection of publicly available data sets for performance evaluation for a wide range of different methods. PMID:26549429

  6. Triangulation, Respondent Validation, and Democratic Participation in Mixed Methods Research

    ERIC Educational Resources Information Center

    Torrance, Harry

    2012-01-01

    Over the past 10 years or so the "Field" of "Mixed Methods Research" (MMR) has increasingly been exerting itself as something separate, novel, and significant, with some advocates claiming paradigmatic status. Triangulation is an important component of mixed methods designs. Triangulation has its origins in attempts to validate research findings…

  7. Quantitative assessment of gene expression network module-validation methods.

    PubMed

    Li, Bing; Zhang, Yingying; Yu, Yanan; Wang, Pengqian; Wang, Yongcheng; Wang, Zhong; Wang, Yongyan

    2015-01-01

    Validation of pluripotent modules in diverse networks holds enormous potential for systems biology and network pharmacology. An arising challenge is how to assess the accuracy of discovering all potential modules from multi-omic networks and validating their architectural characteristics based on innovative computational methods beyond function enrichment and biological validation. To display the framework progress in this domain, we systematically divided the existing Computational Validation Approaches based on Modular Architecture (CVAMA) into topology-based approaches (TBA) and statistics-based approaches (SBA). We compared the available module validation methods based on 11 gene expression datasets; partially consistent results in the form of homogeneous models were obtained with each individual approach, whereas contradictory results were found between TBA and SBA. The TBA of the Zsummary value had a higher Validation Success Ratio (VSR) (51%) and a higher Fluctuation Ratio (FR) (80.92%), whereas the SBA of the approximately unbiased (AU) p-value had a lower VSR (12.3%) and a lower FR (45.84%). A gray-area simulation study revealed consistent results for these two models and indicated a lower Variation Ratio (VR) (8.10%) for TBA at 6 simulated levels. Despite facing many novel challenges and evidence limitations, CVAMA may offer novel insights into modular networks. PMID:26470848

  8. Adapting CEF-Descriptors for Rating Purposes: Validation by a Combined Rater Training and Scale Revision Approach

    ERIC Educational Resources Information Center

    Harsch, Claudia; Martin, Guido

    2012-01-01

    We explore how a local rating scale can be based on the Common European Framework CEF-proficiency scales. As part of the scale validation (Alderson, 1991; Lumley, 2002), we examine which adaptations are needed to turn CEF-proficiency descriptors into a rating scale for a local context, and to establish a practicable method to revise the initial…

  9. Cochlear Dummy Electrodes for Insertion Training and Research Purposes: Fabrication, Mechanical Characterization, and Experimental Validation.

    PubMed

    Kobler, Jan-Philipp; Dhanasingh, Anandhan; Kiran, Raphael; Jolly, Claude; Ortmaier, Tobias

    2015-01-01

    To develop skills sufficient for hearing preservation cochlear implant surgery, surgeons need to perform several electrode insertion trials in ex vivo temporal bones, thereby consuming relatively expensive electrode carriers. The objectives of this study were to evaluate the insertion characteristics of cochlear electrodes in a plastic scala tympani model and to fabricate radio opaque polymer filament dummy electrodes of equivalent mechanical properties. In addition, this study should aid the design and development of new cochlear electrodes. Automated insertion force measurement is a new technique to reproducibly analyze and evaluate the insertion dynamics and mechanical characteristics of an electrode. Mechanical properties of MED-EL's FLEX28, FLEX24, and FLEX20 electrodes were assessed with the help of an automated insertion tool. Statistical analysis of the overall mechanical behavior of the electrodes and factors influencing the insertion force are discussed. Radio opaque dummy electrodes of comparable characteristics were fabricated based on insertion force measurements. The platinum-iridium wires were replaced by polymer filament to provide sufficient stiffness to the electrodes and to eradicate the metallic artifacts in X-ray and computed tomography (CT) images. These low-cost dummy electrodes are cheap alternatives for surgical training and for in vitro, ex vivo, and in vivo research purposes. PMID:26247024

  10. Cochlear Dummy Electrodes for Insertion Training and Research Purposes: Fabrication, Mechanical Characterization, and Experimental Validation

    PubMed Central

    Kobler, Jan-Philipp; Dhanasingh, Anandhan; Kiran, Raphael; Jolly, Claude; Ortmaier, Tobias

    2015-01-01

    To develop skills sufficient for hearing preservation cochlear implant surgery, surgeons need to perform several electrode insertion trials in ex vivo temporal bones, thereby consuming relatively expensive electrode carriers. The objectives of this study were to evaluate the insertion characteristics of cochlear electrodes in a plastic scala tympani model and to fabricate radio opaque polymer filament dummy electrodes of equivalent mechanical properties. In addition, this study should aid the design and development of new cochlear electrodes. Automated insertion force measurement is a new technique to reproducibly analyze and evaluate the insertion dynamics and mechanical characteristics of an electrode. Mechanical properties of MED-EL's FLEX28, FLEX24, and FLEX20 electrodes were assessed with the help of an automated insertion tool. Statistical analysis of the overall mechanical behavior of the electrodes and factors influencing the insertion force are discussed. Radio opaque dummy electrodes of comparable characteristics were fabricated based on insertion force measurements. The platinum-iridium wires were replaced by polymer filament to provide sufficient stiffness to the electrodes and to eradicate the metallic artifacts in X-ray and computed tomography (CT) images. These low-cost dummy electrodes are cheap alternatives for surgical training and for in vitro, ex vivo, and in vivo research purposes. PMID:26247024

  11. Beyond Correctness: Development and Validation of Concept-Based Categorical Scoring Rubrics for Diagnostic Purposes

    ERIC Educational Resources Information Center

    Arieli-Attali, Meirav; Liu, Ying

    2016-01-01

    Diagnostic assessment approaches intend to provide fine-grained reports of what students know and can do, focusing on their areas of strengths and weaknesses. However, current application of such diagnostic approaches is limited by the scoring method for item responses; important diagnostic information, such as type of errors and strategy use is…

  12. Visualization of vasculature with convolution surfaces: method, validation and evaluation.

    PubMed

    Oeltze, Steffen; Preim, Bernhard

    2005-04-01

    We present a method for visualizing vasculature based on clinical computed tomography or magnetic resonance data. The vessel skeleton as well as the diameter information per voxel serve as input. Our method adheres to these data, while producing smooth transitions at branchings and closed, rounded ends by means of convolution surfaces. We examine the filter design with respect to irritating bulges, unwanted blending and the correct visualization of the vessel diameter. The method has been applied to a large variety of anatomic trees. We discuss the validation of the method by means of a comparison to other visualization methods. Surface distance measures are carried out to perform a quantitative validation. Furthermore, we present the evaluation of the method which has been accomplished on the basis of a survey by 11 radiologists and surgeons. PMID:15822811

  13. Recommendations on biomarker bioanalytical method validation by GCC.

    PubMed

    Hougton, Richard; Gouty, Dominique; Allinson, John; Green, Rachel; Losauro, Mike; Lowes, Steve; LeLacheur, Richard; Garofolo, Fabio; Couerbe, Philippe; Bronner, Stéphane; Struwe, Petra; Schiebl, Christine; Sangster, Timothy; Pattison, Colin; Islam, Rafiq; Garofolo, Wei; Pawula, Maria; Buonarati, Mike; Hayes, Roger; Cameron, Mark; Nicholson, Robert; Harman, Jake; Wieling, Jaap; De Boer, Theo; Reuschel, Scott; Cojocaru, Laura; Harter, Tammy; Malone, Michele; Nowatzke, William

    2012-10-01

    The 5th GCC in Barcelona (Spain) and 6th GCC in San Antonio (TX, USA) events provided a unique opportunity for CRO leaders to openly share opinions and perspectives, and to agree upon recommendations on biomarker bioanalytical method validation. PMID:23157353

  14. The Relationship between Method and Validity in Social Science Research.

    ERIC Educational Resources Information Center

    MacKinnon, David; And Others

    An endless debate in social science research focuses on whether or not there is a philosophical basis for justifying the application of scientific methods to social inquiry. A review of the philosophies of various scholars in the field indicates that there is no single procedure for arriving at a valid statement in a scientific inquiry. Natural…

  15. ePortfolios: The Method of Choice for Validation

    ERIC Educational Resources Information Center

    Scott, Ken; Kim, Jichul

    2015-01-01

    Community colleges have long been institutions of higher education in the arenas of technical education and training, as well as preparing students for transfer to universities. While students are engaged in their student learning outcomes, projects, research, and community service, how have these students validated their work? One method of…

  16. DEVELOPMENT AND VALIDATION OF A TEST METHOD FOR ACRYLONITRILE EMISSIONS

    EPA Science Inventory

    Acrylonitrile (AN) has been identified as a suspected carcinogen and may be regulated in the future as a hazardous air pollutant under Section 112 of the Clean Air Act. A method was validated that utilizes a midget impinger containing methanol for trapping AN vapors followed by a...

  17. Recommendations for Use and Fit-for-Purpose Validation of Biomarker Multiplex Ligand Binding Assays in Drug Development.

    PubMed

    Jani, Darshana; Allinson, John; Berisha, Flora; Cowan, Kyra J; Devanarayan, Viswanath; Gleason, Carol; Jeromin, Andreas; Keller, Steve; Khan, Masood U; Nowatzke, Bill; Rhyne, Paul; Stephen, Laurie

    2016-01-01

    Multiplex ligand binding assays (LBAs) are increasingly being used to support many stages of drug development. The complexity of multiplex assays creates many unique challenges in comparison to single-plexed assays, leading to various adjustments during validation, and potentially during sample analysis, to accommodate all of the analytes being measured. This often requires a compromise in decision making with respect to choosing final assay conditions and acceptance criteria for some key assay parameters, depending on the intended use of the assay. The critical parameters that are impacted by the added challenges of multiplexing include the minimum required dilution (MRD), quality control samples that span the range of all analytes being measured, quantitative ranges that can be compromised for certain targets, parallelism for all analytes of interest, cross-talk across assays, and freeze-thaw stability across analytes, among many others. Thus, these challenges also increase the complexity of validating the performance of the assay for its intended use. This paper describes the challenges encountered with multiplex LBAs, discusses the underlying causes, and provides solutions to help overcome these challenges. Finally, we provide recommendations on how to perform a fit-for-purpose-based validation, emphasizing issues that are unique to multiplex kit assays. PMID:26377333

  18. Validation of a previous day recall for measuring the location and purpose of active and sedentary behaviors compared to direct observation

    PubMed Central

    2014-01-01

    Purpose Gathering contextual information (i.e., location and purpose) about active and sedentary behaviors is an advantage of self-report tools such as previous day recalls (PDR). However, the validity of PDRs for measuring context has not been empirically tested. The purpose of this paper was to compare PDR estimates of location and purpose to direct observation (DO). Methods Fifteen adult (18–75 y) and 15 adolescent (12–17 y) participants were directly observed during at least one segment of the day (i.e., morning, afternoon or evening). Participants completed their normal daily routine while trained observers recorded the location (i.e., home, community, work/school), purpose (e.g., leisure, transportation) and whether the behavior was sedentary or active. The day following the observation, participants completed an unannounced PDR. Estimates of time in each context were compared between PDR and DO. Intra-class correlations (ICC), percent agreement and Kappa statistics were calculated. Results For adults, percent agreement was 85% or greater for each location and ICC values ranged from 0.71 to 0.96. The PDR-reported purpose of adults' behaviors was highly correlated with DO for household activities and work (ICCs of 0.84 and 0.88, respectively). Transportation was not significantly correlated with DO (ICC = -0.08). For adolescents, reported classification of activity location was 80.8% or greater. The ICCs for purpose of adolescents' behaviors ranged from 0.46 to 0.78. Participants were most accurate in classifying the location and purpose of the behaviors in which they spent the most time. Conclusions This study suggests that adults and adolescents can accurately report where and why they spend time in behaviors using a PDR. This information on behavioral context is essential for translating the evidence for specific behavior-disease associations to health interventions and public policy. PMID:24490619
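
    Percent agreement and Cohen's kappa, two of the statistics used in this study, can be computed as sketched below. The minute-level location labels are invented, and scikit-learn's cohen_kappa_score supplies the chance-corrected statistic.

    ```python
    from sklearn.metrics import cohen_kappa_score

    # Hypothetical minute-by-minute location labels from direct observation
    # (DO) and the previous day recall (PDR).
    do  = ["home", "home", "work", "work", "community", "home", "work", "home"]
    pdr = ["home", "home", "work", "community", "community", "home", "work", "work"]

    agreement = sum(a == b for a, b in zip(do, pdr)) / len(do) * 100
    kappa = cohen_kappa_score(do, pdr)  # agreement corrected for chance
    print(f"percent agreement = {agreement:.1f}%, kappa = {kappa:.2f}")
    ```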

  19. Methods for causal inference from gene perturbation experiments and validation.

    PubMed

    Meinshausen, Nicolai; Hauser, Alain; Mooij, Joris M; Peters, Jonas; Versteeg, Philip; Bühlmann, Peter

    2016-07-01

    Inferring causal effects from observational and interventional data is a highly desirable but ambitious goal. Many of the computational and statistical methods are plagued by fundamental identifiability issues, instability, and unreliable performance, especially for large-scale systems with many measured variables. We present software and provide some validation of a recently developed methodology based on an invariance principle, called invariant causal prediction (ICP). The ICP method quantifies confidence probabilities for inferring causal structures and thus leads to more reliable and confirmatory statements for causal relations and predictions of external intervention effects. We validate the ICP method and some other procedures using large-scale genome-wide gene perturbation experiments in Saccharomyces cerevisiae. The results suggest that prediction and prioritization of future experimental interventions, such as gene deletions, can be improved by using our statistical inference techniques. PMID:27382150
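
    The invariance principle behind ICP can be illustrated with a toy implementation: accept every predictor subset whose regression residuals look identically distributed across environments, then intersect the accepted subsets. This is a deliberate simplification of the published method (which uses more careful tests and confidence statements); the data are simulated, and the test choices (ANOVA on residual means, Levene on variances) are illustrative assumptions.

    ```python
    import numpy as np
    from itertools import combinations
    from scipy import stats

    def icp(X, y, env, alpha=0.05):
        """Toy invariant causal prediction: intersect all predictor subsets
        whose regression residuals appear invariant across environments."""
        n, p = X.shape
        accepted = []
        for k in range(p + 1):
            for S in combinations(range(p), k):
                Xs = X[:, S] if S else np.zeros((n, 0))
                A = np.hstack([Xs, np.ones((n, 1))])  # add intercept column
                beta, *_ = np.linalg.lstsq(A, y, rcond=None)
                r = y - A @ beta
                groups = [r[env == e] for e in np.unique(env)]
                # invariance check: equal residual means and variances
                _, p_mean = stats.f_oneway(*groups)
                _, p_var = stats.levene(*groups)
                if min(p_mean, p_var) > alpha:
                    accepted.append(set(S))
        return set.intersection(*accepted) if accepted else set()

    rng = np.random.default_rng(0)
    env = np.repeat([0, 1], 100)                 # two environments
    x1 = rng.normal(size=200) + env              # shifted cause of y
    x2 = rng.normal(size=200)                    # irrelevant variable
    y = 2.0 * x1 + rng.normal(scale=0.5, size=200)
    x3 = y + rng.normal(size=200) + 2 * env      # effect of y, not a cause
    X = np.column_stack([x1, x2, x3])
    print(icp(X, y, env))  # ideally a subset of the true causes, {0}
    ```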

  20. Methods for causal inference from gene perturbation experiments and validation

    PubMed Central

    Meinshausen, Nicolai; Hauser, Alain; Mooij, Joris M.; Peters, Jonas; Versteeg, Philip; Bühlmann, Peter

    2016-01-01

    Inferring causal effects from observational and interventional data is a highly desirable but ambitious goal. Many of the computational and statistical methods are plagued by fundamental identifiability issues, instability, and unreliable performance, especially for large-scale systems with many measured variables. We present software and provide some validation of a recently developed methodology based on an invariance principle, called invariant causal prediction (ICP). The ICP method quantifies confidence probabilities for inferring causal structures and thus leads to more reliable and confirmatory statements for causal relations and predictions of external intervention effects. We validate the ICP method and some other procedures using large-scale genome-wide gene perturbation experiments in Saccharomyces cerevisiae. The results suggest that prediction and prioritization of future experimental interventions, such as gene deletions, can be improved by using our statistical inference techniques. PMID:27382150

  1. Validation of cleaning method for various parts fabricated at a Beryllium facility

    SciTech Connect

    Davis, Cynthia M.

    2015-12-15

    This study evaluated and documented a cleaning process that is used to clean parts that are fabricated at a beryllium facility at Los Alamos National Laboratory. The purpose of evaluating this cleaning process was to validate and approve it for future use to assure beryllium surface levels are below the Department of Energy’s release limits without the need to sample all parts leaving the facility. Inhaling or coming in contact with beryllium can cause an immune response that can result in an individual becoming sensitized to beryllium, which can then lead to a disease of the lungs called chronic beryllium disease, and possibly lung cancer. Thirty aluminum and thirty stainless steel parts were fabricated on a lathe in the beryllium facility, as well as thirty-two beryllium parts, for the purpose of testing a parts cleaning method that involved the use of ultrasonic cleaners. A cleaning method was created, documented, validated, and approved, to reduce beryllium contamination.

  2. LC-MS quantification of protein drugs: validating protein LC-MS methods with predigestion immunocapture.

    PubMed

    Duggan, Jeffrey; Ren, Bailuo; Mao, Yan; Chen, Lin-Zhi; Philip, Elsy

    2016-09-01

    A refinement of protein LC-MS bioanalysis is to use predigestion immunoaffinity capture to extract the drug from matrix prior to digestion. Because of their increased sensitivity, such hybrid assays have been successfully validated and applied to a number of clinical studies; however, they can also be subject to potential interferences from antidrug antibodies, circulating ligands or other matrix components specific to patient populations and/or dosed subjects. The purpose of this paper is to describe validation experiments that measure immunocapture efficiency, digestion efficiency, matrix effect and selectivity/specificity that can be used during method optimization and validation to test the resistance of the method to these potential interferences. The designs and benefits of these experiments are discussed in this report using an actual assay case study. PMID:27532431

  3. The Bland-Altman Method Should Not Be Used in Regression Cross-Validation Studies

    ERIC Educational Resources Information Center

    O'Connor, Daniel P.; Mahar, Matthew T.; Laughlin, Mitzi S.; Jackson, Andrew S.

    2011-01-01

    The purpose of this study was to demonstrate the bias in the Bland-Altman (BA) limits of agreement method when it is used to validate regression models. Data from 1,158 men were used to develop three regression equations to estimate maximum oxygen uptake (R² = 0.40, 0.61, and 0.82, respectively). The equations were evaluated in a…
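
    For reference, the BA limits of agreement this study criticizes are computed as the mean difference plus or minus 1.96 standard deviations of the differences. A minimal sketch with hypothetical measured and regression-estimated values (the study's point is that, for regression estimates, these limits are biased because the errors correlate with the magnitude of the measurement):

    ```python
    import numpy as np

    # Hypothetical measured vs. regression-estimated VO2max (mL/kg/min).
    measured  = np.array([42.1, 38.5, 50.2, 45.3, 36.8, 48.9])
    estimated = np.array([40.8, 39.9, 47.5, 46.1, 38.2, 46.7])

    diff = estimated - measured
    bias = diff.mean()
    half_width = 1.96 * diff.std(ddof=1)  # 95% limits of agreement
    print(f"bias = {bias:+.2f}, limits of agreement = "
          f"[{bias - half_width:.2f}, {bias + half_width:.2f}]")
    ```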

  4. Methods for Geometric Data Validation of 3d City Models

    NASA Astrophysics Data System (ADS)

    Wagner, D.; Alam, N.; Wewetzer, M.; Pries, M.; Coors, V.

    2015-12-01

    Geometric quality of 3D city models is crucial for data analysis and simulation tasks, which are part of modern applications of the data (e.g. potential heating energy consumption of city quarters, solar potential, etc.). Geometric quality in these contexts is, however, a different concept than it is for 2D maps. In the latter case, aspects such as positional or temporal accuracy and correctness represent typical quality metrics of the data. They are defined in ISO 19157 and should be mentioned as part of the metadata. 3D data has a far wider range of aspects which influence its quality, plus the idea of quality itself is application dependent. Thus, concepts for the definition of quality are needed, including methods to validate these definitions. Quality in this sense means internal validation and detection of inconsistent or wrong geometry according to a predefined set of rules. A useful starting point would be to have correct geometry in accordance with ISO 19107. A valid solid should consist of planar faces which touch their neighbours exclusively in defined corner points and edges. No gaps between them are allowed, and the whole feature must be 2-manifold. In this paper, we present methods to validate common geometric requirements for building geometry. Different checks based on several algorithms have been implemented to validate a set of rules derived from the solid definition mentioned above (e.g. water tightness of the solid or planarity of its polygons), as they were developed for the software tool CityDoctor. The method of each check is specified, with a special focus on the discussion of tolerance values where they are necessary. The checks include polygon level checks to validate the correctness of each polygon, i.e. closure of the bounding linear ring and planarity. On the solid level, which is only validated if the polygons have passed validation, correct polygon orientation is checked, after self-intersections outside of defined corner points and edges…
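
    A planarity check of the kind described (fit a plane to the polygon's vertices and flag any vertex farther than a tolerance from it) can be sketched as follows. The tolerance and the roof polygon are illustrative assumptions, not CityDoctor's actual values.

    ```python
    import numpy as np

    def is_planar(vertices, tol=0.01):
        """Fit a plane through the vertices and test whether every vertex
        lies within `tol` (model units) of it."""
        v = np.asarray(vertices, dtype=float)
        centroid = v.mean(axis=0)
        # plane normal = singular vector with the smallest singular value
        _, _, vt = np.linalg.svd(v - centroid)
        normal = vt[-1]
        distances = np.abs((v - centroid) @ normal)
        return bool(distances.max() <= tol)

    # A roof polygon with one vertex lifted out of plane (hypothetical data):
    poly = [(0, 0, 0), (4, 0, 0), (4, 3, 0.05), (0, 3, 0)]
    print(is_planar(poly))  # False with the 0.01 tolerance above
    ```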

  5. Making clinical trials more relevant: improving and validating the PRECIS tool for matching trial design decisions to trial purpose

    PubMed Central

    2013-01-01

    Background If you want to know which of two or more healthcare interventions is most effective, the randomised controlled trial is the design of choice. Randomisation, however, does not itself promote the applicability of the results to situations other than the one in which the trial was done. A tool published in 2009, PRECIS (PRagmatic Explanatory Continuum Indicator Summaries), aimed to help trialists design trials that produced results matched to the aim of the trial, be that supporting clinical decision-making, or increasing knowledge of how an intervention works. Though generally positive, groups evaluating the tool have also found weaknesses, mainly that its inter-rater reliability is not clear, that it needs a scoring system and that some new domains might be needed. The aims of the study are to: (1) produce an improved and validated version of the PRECIS tool; and (2) use this tool to compare the internal validity of, and effect estimates from, a set of explanatory and pragmatic trials matched by intervention. Methods The study has four phases. Phase 1 involves brainstorming and a two-round Delphi survey of authors who cited PRECIS. In Phase 2, the Delphi results will then be discussed and alternative versions of PRECIS-2 developed and user-tested by experienced trialists. Phase 3 will evaluate the validity and reliability of the most promising PRECIS-2 candidate using a sample of 15 to 20 trials rated by 15 international trialists. We will assess inter-rater reliability, and raters' subjective global ratings of pragmatism compared to PRECIS-2 to assess convergent and face validity. Phase 4, to determine if pragmatic trials sacrifice internal validity in order to achieve applicability, will compare the internal validity and effect estimates of matched explanatory and pragmatic trials of the same intervention, condition and participants. Effect sizes for the trials will then be compared in a meta-regression. The Cochrane Risk of Bias scores will be compared with the…

  6. Validating silicon polytrodes with paired juxtacellular recordings: method and dataset.

    PubMed

    Neto, Joana P; Lopes, Gonçalo; Frazão, João; Nogueira, Joana; Lacerda, Pedro; Baião, Pedro; Aarts, Arno; Andrei, Alexandru; Musa, Silke; Fortunato, Elvira; Barquinha, Pedro; Kampff, Adam R

    2016-08-01

    Cross-validating new methods for recording neural activity is necessary to accurately interpret and compare the signals they measure. Here we describe a procedure for precisely aligning two probes for in vivo "paired-recordings" such that the spiking activity of a single neuron is monitored with both a dense extracellular silicon polytrode and a juxtacellular micropipette. Our new method allows for efficient, reliable, and automated guidance of both probes to the same neural structure with micrometer resolution. We also describe a new dataset of paired-recordings, which is available online. We propose that our novel targeting system, and ever expanding cross-validation dataset, will be vital to the development of new algorithms for automatically detecting/sorting single-units, characterizing new electrode materials/designs, and resolving nagging questions regarding the origin and nature of extracellular neural signals. PMID:27306671

  7. Validating silicon polytrodes with paired juxtacellular recordings: method and dataset

    PubMed Central

    Lopes, Gonçalo; Frazão, João; Nogueira, Joana; Lacerda, Pedro; Baião, Pedro; Aarts, Arno; Andrei, Alexandru; Musa, Silke; Fortunato, Elvira; Barquinha, Pedro; Kampff, Adam R.

    2016-01-01

    Cross-validating new methods for recording neural activity is necessary to accurately interpret and compare the signals they measure. Here we describe a procedure for precisely aligning two probes for in vivo “paired-recordings” such that the spiking activity of a single neuron is monitored with both a dense extracellular silicon polytrode and a juxtacellular micropipette. Our new method allows for efficient, reliable, and automated guidance of both probes to the same neural structure with micrometer resolution. We also describe a new dataset of paired-recordings, which is available online. We propose that our novel targeting system, and ever expanding cross-validation dataset, will be vital to the development of new algorithms for automatically detecting/sorting single-units, characterizing new electrode materials/designs, and resolving nagging questions regarding the origin and nature of extracellular neural signals. PMID:27306671

  8. Validation of chemistry models employed in a particle simulation method

    NASA Technical Reports Server (NTRS)

    Haas, Brian L.; Mcdonald, Jeffrey D.

    1991-01-01

    The chemistry models employed in a statistical particle simulation method, as implemented on the Intel iPSC/860 multiprocessor computer, are validated and applied. Chemical relaxation of five-species air in adiabatic gas reservoirs involves 34 simultaneous dissociation, recombination, and atomic-exchange reactions. The reaction rates employed in the analytic solutions are obtained from Arrhenius experimental correlations as functions of temperature for reservoirs in thermal equilibrium. Favorable agreement with the analytic solutions validates the simulation when applied to relaxation of O2 toward equilibrium in reservoirs dominated by dissociation and recombination, respectively, and when applied to relaxation of air in the temperature range 5000 to 30,000 K. A flow of O2 over a circular cylinder at high Mach number is simulated to demonstrate application of the method to multidimensional reactive flows.
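
    Rate evaluation from Arrhenius correlations, as used for the analytic reference solutions, typically takes the modified form k(T) = A * T^n * exp(-Ea / (R*T)). The sketch below uses illustrative O2 dissociation parameters, not necessarily the values used in the paper.

    ```python
    import numpy as np

    def arrhenius(T, A, n, Ea):
        """Modified Arrhenius rate k(T) = A * T**n * exp(-Ea / (R*T))."""
        R = 8.314  # J/(mol K)
        return A * T**n * np.exp(-Ea / (R * T))

    # Illustrative O2 + M dissociation parameters (assumed for this sketch):
    # A in cm^3/(mol s), Ea in J/mol.
    A, n, Ea = 2.0e21, -1.5, 4.94e5
    for T in (5000.0, 10000.0, 30000.0):
        print(f"T = {T:7.0f} K  k = {arrhenius(T, A, n, Ea):.3e} cm^3/(mol s)")
    ```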

  9. Prognostics of Power Electronics, Methods and Validation Experiments

    NASA Technical Reports Server (NTRS)

    Kulkarni, Chetan S.; Celaya, Jose R.; Biswas, Gautam; Goebel, Kai

    2012-01-01

    Failure of electronic devices is a concern for future electric aircraft, which will see an increase in electronics used to drive and control safety-critical equipment throughout the aircraft. As a result, investigation of precursors to failure in electronics and prediction of the remaining life of electronic components is of key importance. DC-DC power converters are power electronics systems employed typically as sourcing elements for avionics equipment. Current research efforts in prognostics for these power systems focus on the identification of failure mechanisms and the development of accelerated aging methodologies and systems to accelerate the aging process of test devices, while continuously measuring key electrical and thermal parameters. Preliminary model-based prognostics algorithms have been developed making use of empirical degradation models and physics-inspired degradation models, with focus on key components like electrolytic capacitors and power MOSFETs (metal-oxide-semiconductor field-effect transistors). This paper presents current results on the development of validation methods for prognostics algorithms for power electrolytic capacitors, particularly the use of accelerated aging systems for algorithm validation. Validation of prognostics algorithms presents difficulties in practice due to the lack of run-to-failure experiments in deployed systems. By using accelerated experiments, we circumvent this problem in order to define initial validation activities.

  10. Method development and validation for pharmaceutical tablets analysis using transmission Raman spectroscopy.

    PubMed

    Li, Yi; Igne, Benoît; Drennen, James K; Anderson, Carl A

    2016-02-10

    The objective of the study is to demonstrate the development and validation of a transmission Raman spectroscopic method using the ICH-Q2 Guidance as a template. Specifically, Raman spectroscopy was used to determine niacinamide content in tablet cores. A 3-level, 2-factor full factorial design was utilized to generate a partial least-squares model for active pharmaceutical ingredient quantification. Validation of the transmission Raman model was focused on figures of merit from three independent batches manufactured at pilot scale. The resultant model statistics were evaluated along with the linearity, accuracy, precision and robustness assessments. Method specificity was demonstrated by accurate determination of niacinamide in the presence of niacin (an expected related substance). The method was demonstrated as fit for purpose and had the desirable characteristics of very short analysis times (∼2.5s per tablet). The resulting method was used for routine content uniformity analysis of single dosage units in a stability study. PMID:26656945
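
    A partial least-squares calibration of the kind described can be sketched with scikit-learn. The simulated spectra, the three-level design and the component count below are stand-ins for the paper's pilot-scale data, chosen only to make the example self-contained.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.metrics import r2_score

    # Hypothetical calibration set: rows are preprocessed transmission Raman
    # spectra; y is niacinamide content (% label claim) from the DoE batches.
    rng = np.random.default_rng(1)
    pure = rng.normal(size=500)                  # stand-in analyte spectrum
    y_cal = np.repeat([70.0, 100.0, 130.0], 9)   # 3-level factorial levels
    X_cal = np.outer(y_cal, pure) + rng.normal(scale=5.0, size=(27, 500))

    pls = PLSRegression(n_components=3)
    pls.fit(X_cal, y_cal)

    # "Validation batch" drawn from the same simulated model:
    y_val = np.array([85.0, 100.0, 115.0])
    X_val = np.outer(y_val, pure) + rng.normal(scale=5.0, size=(3, 500))
    pred = pls.predict(X_val).ravel()
    print(f"R2 = {r2_score(y_val, pred):.3f}, predictions = {pred.round(1)}")
    ```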

  11. VALUE - Validating and Integrating Downscaling Methods for Climate Change Research

    NASA Astrophysics Data System (ADS)

    Maraun, Douglas; Widmann, Martin; Benestad, Rasmus; Kotlarski, Sven; Huth, Radan; Hertig, Elke; Wibig, Joanna; Gutierrez, Jose

    2013-04-01

    Our understanding of global climate change is mainly based on General Circulation Models (GCMs) with a relatively coarse resolution. Since climate change impacts are mainly experienced on regional scales, high-resolution climate change scenarios need to be derived from GCM simulations by downscaling. Several projects have been carried out over recent years to validate the performance of statistical and dynamical downscaling, yet several aspects have not been systematically addressed: variability on sub-daily, decadal and longer time-scales, extreme events, spatial variability and inter-variable relationships. Different downscaling approaches, such as dynamical downscaling, statistical downscaling and bias correction approaches, have not been systematically compared. Furthermore, collaboration between different communities, in particular regional climate modellers, statistical downscalers and statisticians, has been limited. To address these gaps, the EU Cooperation in Science and Technology (COST) action VALUE (www.value-cost.eu) has been established. VALUE is a research network, running from 2012 to 2015, with participants currently from 23 European countries. Its main aim is to systematically validate and develop downscaling methods for climate change research in order to improve regional climate change scenarios for use in climate impact studies. Inspired by the co-design idea of the international research initiative "future earth", stakeholders of climate change information have been involved in the definition of research questions to be addressed and are actively participating in the network. The key idea of VALUE is to identify the relevant weather and climate characteristics required as input for a wide range of impact models and to define an open framework to systematically validate these characteristics. Based on a range of benchmark data sets, in principle every downscaling method can be validated and compared with competing methods. The results of…

  12. Validation of an Impedance Eduction Method in Flow

    NASA Technical Reports Server (NTRS)

    Watson, Willie R.; Jones, Michael G.; Parrott, Tony L.

    2004-01-01

    This paper reports results of a research effort to validate a method for educing the normal incidence impedance of a locally reacting liner, located in a grazing incidence, nonprogressive acoustic wave environment with flow. The results presented in this paper test the ability of the method to reproduce the measured normal incidence impedance of a solid steel plate and two soft test liners in a uniform flow. The test liners are known to be locally reacting and exhibit no measurable amplitude-dependent impedance nonlinearities or flow effects. Baseline impedance spectra for these liners were therefore established from measurements in a conventional normal incidence impedance tube. A key feature of the method is the expansion of the unknown impedance function as a piecewise continuous polynomial with undetermined coefficients. Stewart's adaptation of the Davidon-Fletcher-Powell optimization algorithm is used to educe the normal incidence impedance at each Mach number by optimizing an objective function. The method is shown to reproduce the measured normal incidence impedance spectrum for each of the test liners, thus validating its usefulness for determining the normal incidence impedance of test liners for a broad range of source frequencies and flow Mach numbers.

  13. Validated HPTLC method of analysis for artemether and its formulations.

    PubMed

    Tayade, Nitin G; Nagarsenker, Mangal S

    2007-02-19

    A simple, sensitive, precise and rapid high-performance thin-layer chromatographic (HPTLC) method of analysis for artemether, both as a bulk drug and in pharmaceutical formulations, was developed and validated. The method employed TLC aluminum plates precoated with silica gel 60F-254 as the stationary phase. The solvent system consisted of toluene-ethyl acetate-formic acid (8:2:0.3, v/v/v) as the mobile phase. Densitometric analysis of artemether was carried out in the reflectance mode at 565 nm. The system was found to give compact spots for artemether (Rf value of 0.50 ± 0.03). The linear regression analysis data for the calibration plots showed a good linear relationship, with r² = 0.9904 in the concentration range 200-1000 ng per spot. The mean values of the correlation coefficient, slope and intercept were 0.9904 ± 0.011, 7.27 ± 0.11 and 166.24 ± 56.92, respectively. The method was validated for precision, accuracy, recovery and robustness. The limits of detection and quantitation were 65.91 and 197.74 ng per spot, respectively. The method has been successfully applied in the analysis of lipid based parenteral formulations and a marketed oral solid dosage formulation. PMID:17045768

  14. Brazilian Center for the Validation of Alternative Methods (BraCVAM) and the process of validation in Brazil.

    PubMed

    Presgrave, Octavio; Moura, Wlamir; Caldeira, Cristiane; Pereira, Elisabete; Bôas, Maria H Villas; Eskes, Chantra

    2016-03-01

    The need for the creation of a Brazilian centre for the validation of alternative methods was recognised in 2008, and members of academia, industry and existing international validation centres immediately engaged with the idea. In 2012, co-operation between the Oswaldo Cruz Foundation (FIOCRUZ) and the Brazilian Health Surveillance Agency (ANVISA) instigated the establishment of the Brazilian Center for the Validation of Alternative Methods (BraCVAM), which was officially launched in 2013. The Brazilian validation process follows OECD Guidance Document No. 34, where BraCVAM functions as the focal point to identify and/or receive requests from parties interested in submitting tests for validation. BraCVAM then informs the Brazilian National Network on Alternative Methods (RENaMA) of promising assays, which helps with prioritisation and contributes to the validation studies of selected assays. A Validation Management Group supervises the validation study, and the results obtained are peer-reviewed by an ad hoc Scientific Review Committee, organised under the auspices of BraCVAM. Based on the peer-review outcome, BraCVAM will prepare recommendations on the validated test method, which will be sent to the National Council for the Control of Animal Experimentation (CONCEA). CONCEA is in charge of the regulatory adoption of all validated test methods in Brazil, following an open public consultation. PMID:27031604

  15. Flexibility and applicability of β-expectation tolerance interval approach to assess the fitness of purpose of pharmaceutical analytical methods.

    PubMed

    Bouabidi, A; Talbi, M; Bourichi, H; Bouklouze, A; El Karbane, M; Boulanger, B; Brik, Y; Hubert, Ph; Rozet, E

    2012-12-01

    An innovative, versatile strategy using total error has been proposed to decide on a method's validity; it controls the risk of accepting an unsuitable assay together with the ability to predict the reliability of future results. This strategy is based on the simultaneous combination of the systematic (bias) and random (imprecision) errors of analytical methods. Using validation standards, both types of error are combined through the use of a prediction interval or β-expectation tolerance interval. Finally, an accuracy profile is built by connecting all the upper tolerance limits on one side and all the lower tolerance limits on the other. This profile, combined with pre-specified acceptance limits, allows the evaluation of the validity of any quantitative analytical method and thus its fitness for its intended purpose. In this work, the accuracy profile approach was evaluated on several types of analytical methods encountered in the pharmaceutical industrial field, covering different pharmaceutical matrices. The four studied examples depict the flexibility and applicability of this approach for different matrices ranging from tablets to syrups, different techniques such as liquid chromatography or UV spectrophotometry, and different categories of assays commonly encountered in the pharmaceutical industry, i.e. content assays, dissolution assays, and quantitative impurity assays. The accuracy profile approach assesses the fitness for purpose of these methods for their future routine application. It also allows the selection of the most suitable calibration curve, the adequate evaluation of a potential matrix effect together with efficient solutions, and the correct definition of the limits of quantification of the studied analytical procedures. PMID:22615163
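
    A minimal sketch of the core calculation, assuming invented recovery data at a single concentration level and treating them as one i.i.d. sample; the published approach additionally separates within- and between-series variance components for intermediate precision.

    ```python
    # Beta-expectation tolerance (prediction) interval for one level of an
    # accuracy profile, in its simplest i.i.d. form. Data are invented
    # placeholders expressed as percent recovery of the nominal value.
    import numpy as np
    from scipy import stats

    recoveries = np.array([98.2, 101.5, 99.8, 100.9, 97.6, 100.2])  # %
    beta = 0.95                    # expectation level of the interval
    n = recoveries.size
    mean, sd = recoveries.mean(), recoveries.std(ddof=1)

    t = stats.t.ppf((1 + beta) / 2, df=n - 1)
    half_width = t * sd * np.sqrt(1 + 1 / n)
    low, high = mean - half_width, mean + half_width

    # The method is declared valid at this level if [low, high] lies within
    # pre-specified acceptance limits, e.g. 100 +/- 5 %.
    print(f"{beta:.0%}-expectation tolerance interval: [{low:.1f}, {high:.1f}] %")
    ```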

  16. Validation of spectrophotometric method for lactulose assay in syrup preparation

    NASA Astrophysics Data System (ADS)

    Mahardhika, Andhika Bintang; Novelynda, Yoshella; Damayanti, Sophi

    2015-09-01

    Lactulose is a synthetic disaccharide widely used in the food and pharmaceutical fields. In the pharmaceutical field, lactulose is used as an osmotic laxative in a syrup dosage form. This research aimed to validate a spectrophotometric method to determine the level of lactulose in syrup preparations and a commercial sample. Lactulose is hydrolyzed by hydrochloric acid to form fructose and galactose. The fructose is then reacted with resorcinol reagent, forming a compound with an absorption peak at 485 nm. The analytical method was validated, and thereafter the lactulose content of syrup preparations was determined. The calibration curve was linear in the range of 30-100 μg/mL with a correlation coefficient (r) of 0.9996, a coefficient of variation of the procedure (Vxo) of 1.1%, a limit of detection of 2.32 μg/mL, and a limit of quantitation of 7.04 μg/mL. The accuracy test for the lactulose assay in the syrup preparation showed recoveries of 96.6 to 100.8%. Repeatability tests of the lactulose assay in standard solution and in the sample syrup preparation showed coefficients of variation (CV) of 0.75% and 0.7%, respectively. The intermediate precision (interday) test gave coefficients of variation of 1.06% on the first day, 0.99% on the second day, and 0.95% on the third day. This research yielded a valid analytical method; the lactulose levels in syrup preparations of samples A, B and C were 101.6, 100.5, and 100.6%, respectively.
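
    As a small worked illustration of the calibration step (with invented absorbance values, not the study's data), the curve is fitted by least squares and a sample concentration is read back from its absorbance:

    ```python
    # Sketch of a spectrophotometric calibration: fit absorbance at 485 nm
    # against lactulose standards, then invert the line for a sample. The
    # study's curve spanned 30-100 ug/mL with r = 0.9996; these numbers are
    # placeholders.
    import numpy as np

    conc = np.array([30, 45, 60, 75, 90, 100], dtype=float)   # ug/mL
    absorbance = np.array([0.21, 0.31, 0.42, 0.52, 0.63, 0.70])

    slope, intercept = np.polyfit(conc, absorbance, 1)
    r = np.corrcoef(conc, absorbance)[0, 1]

    a_sample = 0.47                        # measured sample absorbance
    c_sample = (a_sample - intercept) / slope
    print(f"r = {r:.4f}; sample concentration ~ {c_sample:.1f} ug/mL")
    ```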

  17. Simulation Methods and Validation Criteria for Modeling Cardiac Ventricular Electrophysiology

    PubMed Central

    Krishnamoorthi, Shankarjee; Perotti, Luigi E.; Borgstrom, Nils P.; Ajijola, Olujimi A.; Frid, Anna; Ponnaluri, Aditya V.; Weiss, James N.; Qu, Zhilin; Klug, William S.; Ennis, Daniel B.; Garfinkel, Alan

    2014-01-01

    We describe a sequence of methods to produce a partial differential equation model of the electrical activation of the ventricles. In our framework, we incorporate the anatomy and cardiac microstructure obtained from magnetic resonance imaging and diffusion tensor imaging of a New Zealand White rabbit, the Purkinje structure and the Purkinje-muscle junctions, and an electrophysiologically accurate model of the ventricular myocytes and tissue, which includes transmural and apex-to-base gradients of action potential characteristics. We solve the electrophysiology governing equations using the finite element method and compute both a 6-lead precordial electrocardiogram (ECG) and the activation wavefronts over time. We are particularly concerned with the validation of the various methods used in our model and, in this regard, propose a series of validation criteria that we consider essential. These include producing a physiologically accurate ECG, a correct ventricular activation sequence, and the inducibility of ventricular fibrillation. Among other components, we conclude that a Purkinje geometry with a high density of Purkinje muscle junctions covering the right and left ventricular endocardial surfaces as well as transmural and apex-to-base gradients in action potential characteristics are necessary to produce ECGs and time activation plots that agree with physiological observations. PMID:25493967

  18. A General Method of Empirical Q-matrix Validation.

    PubMed

    de la Torre, Jimmy; Chiu, Chia-Yi

    2016-06-01

    In contrast to unidimensional item response models that postulate a single underlying proficiency, cognitive diagnosis models (CDMs) posit multiple, discrete skills or attributes, thus allowing CDMs to provide a finer-grained assessment of examinees' test performance. A common component of CDMs for specifying the attributes required for each item is the Q-matrix. Although construction of the Q-matrix is typically performed by domain experts, it remains, to a large extent, a subjective process, and misspecifications in the Q-matrix, if left unchecked, can have important practical implications. To address this concern, this paper proposes a discrimination index that can be used with a wide class of CDMs subsumed by the generalized deterministic input, noisy "and" gate model to empirically validate Q-matrix specifications by identifying and replacing misspecified entries in the Q-matrix. The rationale for using the index as the basis for the proposed validation method is provided in the form of mathematical proofs of several relevant lemmas and a theorem. The feasibility of the proposed method was examined using simulated data generated under various conditions. The proposed method is illustrated using fraction subtraction data. PMID:25943366

  19. Method validation for methanol quantification present in working places

    NASA Astrophysics Data System (ADS)

    Muna, E. D. M.; Bizarri, C. H. B.; Maciel, J. R. M.; da Rocha, G. P.; de Araújo, I. O.

    2015-01-01

    Given the widespread use of methanol by different industry sectors and the high toxicity associated with this substance, an analytical method is needed that can determine methanol levels in the air of working environments sensitively, precisely and accurately. Based on the methodology established by the National Institute for Occupational Safety and Health (NIOSH), a method for the determination of methanol collected in silica gel tubes was validated; its effectiveness was demonstrated through participation in the international collaborative program sponsored by the American Industrial Hygiene Association (AIHA).

  20. Determination of Al in cake mix: Method validation and estimation of measurement uncertainty

    NASA Astrophysics Data System (ADS)

    Andrade, G.; Rocha, O.; Junqueira, R.

    2016-07-01

    An analytical method for the determination of Al in cake mix was developed. Acceptable values were obtained for the following parameters: linearity, detection limit (LOD, 5.00 mg·kg⁻¹), quantification limit (LOQ, 12.5 mg·kg⁻¹), recovery (between 91 and 102%), relative standard deviation under repeatability and within-reproducibility conditions (<20.0%), and measurement uncertainty (<10.0%). The results of the validation process showed that the proposed method is fit for purpose.

  1. Determination of methylmercury in marine biota samples: method validation.

    PubMed

    Carrasco, Luis; Vassileva, Emilia

    2014-05-01

    Regulatory authorities are expected to measure concentrations of contaminants in foodstuffs, but determining the total amount alone is not sufficient for fully judging the impact on human health. In particular, the methylation of metals generally increases their toxicity; therefore, validated analytical methods producing reliable results for the assessment of methylated species are highly needed. Nowadays, there is no legal limit for methylmercury (MeHg) in food matrices; hence, no standardized method for the determination of MeHg exists within the international jurisdiction. Contemplating the possibility of a future legislative limit, a method for low-level determination of MeHg in marine biota matrices, based on aqueous-phase ethylation followed by purge and trap and gas chromatography (GC) coupled to pyrolysis-atomic fluorescence spectrometry (Py-AFS) detection, has been developed and validated. Five different extraction procedures, namely acid and alkaline leaching assisted by microwave and conventional oven heating, as well as enzymatic digestion, were evaluated in terms of their efficiency to extract MeHg from the Scallop soft tissue IAEA-452 Certified Reference Material. Alkaline extraction with 25% (w/w) KOH in methanol, microwave-assisted extraction (MAE) with 5M HCl and enzymatic digestion with protease XIV yielded the highest extraction recoveries. Standard addition or the introduction of a dilution step were successfully applied to overcome the matrix effects observed when microwave-assisted extraction using 25% (w/w) KOH in methanol or 25% (w/v) aqueous TMAH was used. ISO 17025 and Eurachem guidelines were followed to perform the validation of the methodology. Accordingly, blanks, selectivity, calibration curve, linearity (0.9995), working range (1-800 pg), recovery (97%), precision, traceability, limit of detection (0.45 pg), limit of quantification (0.85 pg) and expanded uncertainty (15.86%, k=2) were assessed with the Fish protein DORM-3 Certified Reference Material.
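
    Two of the figures of merit above reduce to short calculations. A sketch with illustrative numbers (not the study's raw data): recovery against a certified value, and an expanded uncertainty obtained by combining standard-uncertainty contributions in quadrature with coverage factor k = 2.

    ```python
    # Recovery against a certified reference material, and expanded
    # uncertainty U = k * u_combined. All numbers are illustrative.
    import math

    measured = 0.0247    # MeHg mass fraction found in the CRM (placeholder)
    certified = 0.0255   # certified value of the CRM (placeholder)

    recovery = 100.0 * measured / certified
    print(f"recovery = {recovery:.0f} %")        # abstract reports 97 %

    # Combine independent relative standard uncertainties in quadrature,
    # then expand with coverage factor k = 2 (~95 % confidence).
    u_components = [0.031, 0.052, 0.040]   # e.g. precision, bias, calibration
    u_combined = math.sqrt(sum(u**2 for u in u_components))
    U = 2 * u_combined
    print(f"expanded uncertainty U (k=2) = {100*U:.1f} % relative")
    ```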

  2. Validation of a numerical method for unsteady flow calculations

    SciTech Connect

    Giles, M.; Haimes, R. . Dept. of Aeronautics and Astronautics)

    1993-01-01

    This paper describes and validates a numerical method for the calculation of unsteady inviscid and viscous flows. A companion paper compares experimental measurements of unsteady heat transfer on a transonic rotor with the corresponding computational results. The mathematical model is the Reynolds-averaged unsteady Navier-Stokes equations for a compressible ideal gas. Quasi-three-dimensionality is included through the use of a variable streamtube thickness. The numerical algorithm is unusual in two respects: (a) for reasons of efficiency and flexibility, it uses a hybrid Navier-Stokes/Euler method, and (b) to allow for the computation of stator/rotor combinations with arbitrary pitch ratio, a novel space-time coordinate transformation is used. Several test cases are presented to validate the performance of the computer program, UNSFLO. These include: (a) unsteady, inviscid flat plate cascade flows; (b) steady and unsteady, viscous flat plate cascade flows; and (c) steady turbine heat transfer and loss prediction. In the first two sets of cases comparisons are made with theory, and in the third the comparison is with experimental data.

  3. Experimental validation of boundary element methods for noise prediction

    NASA Technical Reports Server (NTRS)

    Seybert, A. F.; Oswald, Fred B.

    1992-01-01

    Experimental validation of methods to predict radiated noise is presented. A combined finite element and boundary element model was used to predict the vibration and noise of a rectangular box excited by a mechanical shaker. The predicted noise was compared to sound power measured by the acoustic intensity method. Inaccuracies in the finite element model shifted the resonance frequencies by about 5 percent. The predicted and measured sound power levels agree within about 2.5 dB. In a second experiment, measured vibration data were used with a boundary element model to predict noise radiation from the top of an operating gearbox. The predicted and measured sound power for the gearbox agree within about 3 dB.
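
    To put the quoted dB agreement in physical terms, sound power level is L_W = 10*log10(W/W_ref), so a 2.5 dB discrepancy corresponds to a power ratio of about 1.78. A small check with placeholder power values:

    ```python
    # What a "2.5 dB" sound-power discrepancy means as a ratio. The two
    # radiated-power values below are placeholders, not measurement data.
    import math

    w_predicted = 3.2e-6    # W, predicted radiated power (placeholder)
    w_measured = 1.8e-6     # W, measured by acoustic intensity (placeholder)

    delta_db = 10 * math.log10(w_predicted / w_measured)
    print(f"difference = {delta_db:.1f} dB "
          f"(power ratio {w_predicted / w_measured:.2f})")
    ```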

  4. Video quality experts group: the quest for valid objective methods

    NASA Astrophysics Data System (ADS)

    Corriveau, Philip J.; Webster, Arthur A.; Rohaly, Ann M.; Libert, John M.

    2000-06-01

    Subjective assessment methods have been used reliably for many years to evaluate video quality. They continue to provide the most reliable assessments compared to objective methods. Some issues that arise with subjective assessment include the cost of conducting the evaluations and the fact that these methods cannot easily be used to monitor video quality in real time. Furthermore, traditional, analog objective methods, while still necessary, are not sufficient to measure the quality of digitally compressed video systems. Thus, there is a need to develop new objective methods utilizing the characteristics of the human visual system. While several new objective methods have been developed, there is to date no internationally standardized method. The Video Quality Experts Group (VQEG) was formed in October 1997 to address video quality issues. The group is composed of experts from various backgrounds and affiliations, including participants from several internationally recognized organizations working in the field of video quality assessment. The majority of participants are active in the International Telecommunications Union (ITU) and VQEG combines the expertise and resources found in several ITU Study Groups to work towards a common goal. The first task undertaken by VQEG was to provide a validation of objective video quality measurement methods leading to Recommendations in both the Telecommunications (ITU-T) and Radiocommunication (ITU-R) sectors of the ITU. To this end, VQEG designed and executed a test program to compare subjective video quality evaluations to the predictions of a number of proposed objective measurement methods for video quality in the bit rate range of 768 kb/s to 50 Mb/s. The results of this test show that there is no objective measurement system that is currently able to replace subjective testing. Depending on the metric used for evaluation, the performance of eight or nine models was found to be statistically equivalent, leading to the

  5. Validated spectrophotometric methods for determination of some oral hypoglycemic drugs.

    PubMed

    Farouk, M; Abdel-Satar, O; Abdel-Aziz, O; Shaaban, M

    2011-02-01

    Four accurate, precise, rapid, reproducible, and simple spectrophotometric methods were validated for the determination of repaglinide (RPG), pioglitazone hydrochloride (PGL) and rosiglitazone maleate (RGL). The first two methods were based on the formation of a charge-transfer purple-colored complex of chloranilic acid with RPG and RGL, with molar absorptivities of 1.23 × 10³ and 8.67 × 10² l·mol⁻¹·cm⁻¹ and Sandell's sensitivities of 0.367 and 0.412 μg·cm⁻², respectively, and an ion-pair yellow-colored complex of bromophenol blue with RPG, PGL and RGL, with molar absorptivities of 8.86 × 10³, 6.95 × 10³, and 7.06 × 10³ l·mol⁻¹·cm⁻¹, respectively, and a Sandell's sensitivity of 0.051 μg·cm⁻² for all ion-pair complexes. The influence of different parameters on color formation was studied to determine optimum conditions for the visible spectrophotometric methods. The other spectrophotometric methods were adopted for the determination of the studied drugs in the presence of their acid, alkaline and oxidative degradation products, using derivative and pH-induced difference spectrophotometry as stability-indicating techniques. All the proposed methods were validated according to the International Conference on Harmonization guidelines and successfully applied to the determination of the studied drugs in pure form and in pharmaceutical preparations, with good recoveries ranging between 98.7-101.4%, 98.2-101.3%, and 99.9-101.4% for RPG, PGL, and RGL, respectively. Relative standard deviations did not exceed 1.6%, indicating that the proposed methods have good repeatability and reproducibility. All the obtained results were statistically compared to those of the official method used for RPG analysis and the manufacturers' methods used for PGL and RGL analysis, respectively, and no significant differences were found. PMID:22466095
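
    The reported Sandell's sensitivities can be cross-checked from the molar absorptivities: for a 1 cm path, an absorbance of 0.001 corresponds to an areal density S = MW/ε in μg·cm⁻². The sketch below uses nominal literature molecular weights, which are an assumption on my part and are not stated in the abstract:

    ```python
    # Cross-check of reported Sandell's sensitivities. With A = eps*c and a
    # 1 cm path, A = 0.001 gives c = 0.001/eps mol/L; converting to ug/cm^2
    # yields S = MW/eps. Molecular weights are nominal literature values
    # (assumed, not taken from the abstract).
    drugs = {
        # name: (molar absorptivity L/mol/cm, molecular weight g/mol)
        "repaglinide (chloranilic acid)":   (1.23e3, 452.6),
        "rosiglitazone (chloranilic acid)": (8.67e2, 357.4),
    }
    for name, (eps, mw) in drugs.items():
        sandell = mw / eps              # ug/cm^2 per 0.001 absorbance
        print(f"{name}: S ~ {sandell:.3f} ug/cm^2")
    # prints ~0.368 and ~0.412, consistent with the abstract's 0.367 / 0.412
    ```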

  6. 40 CFR 712.5 - Method of identification of substances for reporting purposes.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    Method of identification of substances for reporting purposes. § 712.5, Title 40 (Protection of Environment), Code of Federal Regulations. ENVIRONMENTAL PROTECTION AGENCY (CONTINUED), TOXIC SUBSTANCES CONTROL ACT, CHEMICAL INFORMATION RULES, General Provisions § 712.5 Method of identification of substances for...

  7. Knowledge Transmission versus Social Transformation: A Critical Analysis of Purpose in Elementary Social Studies Methods Textbooks

    ERIC Educational Resources Information Center

    Butler, Brandon M.; Suh, Yonghee; Scott, Wendy

    2015-01-01

    In this article, the authors investigate the extent to which 9 elementary social studies methods textbooks present the purpose of teaching and learning social studies. Using Stanley's three perspectives on teaching social studies (knowledge transmission, method of intelligence, and social transformation), we analyze how these texts prepare…

  8. Examining the Content Validity of the WHOQOL-BREF from Respondents' Perspective by Quantitative Methods

    ERIC Educational Resources Information Center

    Yao, Grace; Wu, Chia-Huei; Yang, Cheng-Ta

    2008-01-01

    Content validity, the extent to which a measurement reflects the specific intended domain of content, is a basic type of validity for a valid measurement. It was usually examined qualitatively and relied on experts' subjective judgments, not on respondents' responses. Therefore, the purpose of this study was to introduce and demonstrate how to use…

  9. Validated spectrofluorometric methods for determination of amlodipine besylate in tablets

    NASA Astrophysics Data System (ADS)

    Abdel-Wadood, Hanaa M.; Mohamed, Niveen A.; Mahmoud, Ashraf M.

    2008-08-01

    Two simple and sensitive spectrofluorometric methods have been developed and validated for the determination of amlodipine besylate (AML) in tablets. The first method was based on the condensation reaction of AML with ninhydrin and phenylacetaldehyde in buffered medium (pH 7.0), resulting in the formation of a green fluorescent product that exhibits excitation and emission maxima at 375 and 480 nm, respectively. The second method was based on the reaction of AML with 7-chloro-4-nitro-2,1,3-benzoxadiazole (NBD-Cl) in a buffered medium (pH 8.6), resulting in the formation of a highly fluorescent product that was measured fluorometrically at 535 nm (λex 480 nm). The factors affecting the reactions were studied and optimized. Under the optimum reaction conditions, linear relationships with good correlation coefficients (0.9949-0.9997) were found between the fluorescence intensity and the concentration of AML in the ranges 0.35-1.8 and 0.55-3.0 μg ml⁻¹ for the ninhydrin and NBD-Cl methods, respectively. The detection limits were 0.09 and 0.16 μg ml⁻¹ for the first and second methods, respectively. The precisions of the methods were satisfactory; relative standard deviations ranged from 1.69 to 1.98%. The proposed methods were successfully applied to the analysis of AML in pure form and in pharmaceutical dosage forms with good accuracy; recovery percentages ranged from 100.4 to 100.8 ± 1.70-2.32%. The results compared favorably with those of the reported method.

  10. Computational Methods for RNA Structure Validation and Improvement.

    PubMed

    Jain, Swati; Richardson, David C; Richardson, Jane S

    2015-01-01

    With increasing recognition of the roles RNA molecules and RNA/protein complexes play in an unexpected variety of biological processes, understanding of RNA structure-function relationships is of high current importance. To make clean biological interpretations from three-dimensional structures, it is imperative to have high-quality, accurate RNA crystal structures available, and the community has thoroughly embraced that goal. However, due to the many degrees of freedom inherent in RNA structure (especially for the backbone), it is a significant challenge to succeed in building accurate experimental models for RNA structures. This chapter describes the tools and techniques our research group and our collaborators have developed over the years to help RNA structural biologists both evaluate and achieve better accuracy. Expert analysis of large, high-resolution, quality-conscious RNA datasets provides the fundamental information that enables automated methods for robust and efficient error diagnosis in validating RNA structures at all resolutions. The even more crucial goal of correcting the diagnosed outliers has steadily developed toward highly effective, computationally based techniques. Automation enables solving complex issues in large RNA structures, but cannot circumvent the need for thoughtful examination of local details, and so we also provide some guidance for interpreting and acting on the results of current structure validation for RNA. PMID:26068742

  11. Indentation Measurements to Validate Dynamic Elasticity Imaging Methods.

    PubMed

    Altahhan, Khaldoon N; Wang, Yue; Sobh, Nahil; Insana, Michael F

    2016-09-01

    We describe macro-indentation techniques for estimating the elastic modulus of soft hydrogels. Our study describes (a) conditions under which quasi-static indentation can validate dynamic shear-wave imaging estimates and (b) how each of these techniques uniquely biases modulus estimates as they couple to the sample geometry. Harmonic shear waves between 25 and 400 Hz were imaged using ultrasonic Doppler and optical coherence tomography methods to estimate shear dispersion. From the shear-wave speed of sound, average elastic moduli of homogeneous samples were estimated. These results are compared directly with macroscopic indentation measurements measured two ways. One set of measurements applied Hertzian theory to the loading phase of the force-displacement curves using samples treated to minimize surface adhesion forces. A second set of measurements applied Johnson-Kendall-Roberts theory to the unloading phase of the force-displacement curve when surface adhesions were significant. All measurements were made using gelatin hydrogel samples of different sizes and concentrations. Agreement within 5% among elastic modulus estimates was achieved for a range of experimental conditions. Consequently, a simple quasi-static indentation measurement using a common gel can provide elastic modulus measurements that help validate dynamic shear-wave imaging estimates. PMID:26376923
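
    A minimal sketch of the Hertzian analysis of a loading curve, with synthetic data standing in for a measurement: for a rigid spherical indenter on a soft half-space, F = (4/3)·E/(1 - ν²)·√R·δ^(3/2), and E is recovered by least-squares fitting. The radius, Poisson ratio and modulus below are assumed placeholder values.

    ```python
    # Extracting an elastic modulus from the loading phase of an indentation
    # force-displacement curve via the Hertz model for a rigid sphere on a
    # soft half-space. Synthetic data; parameters are placeholders.
    import numpy as np
    from scipy.optimize import curve_fit

    R = 2.0e-3     # indenter radius, m (assumed)
    nu = 0.5       # Poisson ratio assumed for a nearly incompressible gel

    def hertz(d, E):
        # F = (4/3) * E/(1 - nu^2) * sqrt(R) * d^(3/2)
        return (4.0 / 3.0) * (E / (1 - nu**2)) * np.sqrt(R) * d**1.5

    depth = np.linspace(1e-5, 3e-4, 30)          # indentation depth, m
    rng = np.random.default_rng(1)
    force = hertz(depth, 12e3) * (1 + 0.02 * rng.normal(size=depth.size))

    (E_fit,), _ = curve_fit(hertz, depth, force, p0=[1e4])
    print(f"fitted E ~ {E_fit/1e3:.1f} kPa")     # ~12 kPa by construction
    ```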

  12. The Equivalence of Positive and Negative Methods of Validating a Learning Hierarchy.

    ERIC Educational Resources Information Center

    Kee, Kevin N.; White, Richard T.

    1979-01-01

    The compound nature of Gagne's original definition of learning hierarchies leads to two methods of validation, the positive and negative methods. Sections of a hierarchy that had been validated by the negative method were subjected to test by the more cumbersome positive method, and again were found to be valid. (Author/RD)

  13. Validating for Use and Interpretation: A Mixed Methods Contribution Illustrated

    ERIC Educational Resources Information Center

    Morell, Linda; Tan, Rachael Jin Bee

    2009-01-01

    Researchers in the areas of psychology and education strive to understand the intersections among validity, educational measurement, and cognitive theory. Guided by a mixed model conceptual framework, this study investigates how respondents' opinions inform the validation argument. Validity evidence for a science assessment was collected through…

  14. Line profile reconstruction: validation and comparison of reconstruction methods

    NASA Astrophysics Data System (ADS)

    Tsai, Ming-Yi; Yost, Michael G.; Wu, Chang-Fu; Hashmonay, Ram A.; Larson, Timothy V.

    Currently, open path Fourier transform infrared (OP-FTIR) spectrometers have been applied in some fenceline monitoring, but their use has been limited because path-integrated concentration measurements typically only provide an estimate of the average concentration. We present a series of experiments that further explore the use of path-integrated measurements to reconstruct various pollutant distributions along a linear path. Our experiments were conducted in a ventilation chamber using an OP-FTIR instrument to monitor a tracer-gas release over a fenceline configuration. These experiments validate a line profile method (1-D reconstruction). Additionally, we expand current reconstruction techniques by applying the Bootstrap to our measurements. We compared our reconstruction results to our point samplers using the concordance correlation factor (CCF). Of the four different release types, three were successfully reconstructed with CCFs greater than 0.9. The difficult reconstruction involved a narrow release where the pollutant was limited to one segment of the segmented beampath. In general, of the three reconstruction methods employed, the average of the bootstrapped reconstructions was found to have the highest CCFs when compared to the point samplers. Furthermore, the bootstrap method was the most flexible and allowed a determination of the uncertainty surrounding our reconstructions.
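
    A minimal sketch of the bootstrap idea applied here, with invented path-integrated values and the reconstruction reduced to a simple mean for brevity: resample the measurements with replacement, recompute the estimate for each resample, and use the spread of the results as the reconstruction uncertainty.

    ```python
    # Bootstrap uncertainty for a reconstruction, reduced to its essentials.
    # The "reconstruction" here is just a mean; the measurements are
    # invented path-integrated concentrations.
    import numpy as np

    rng = np.random.default_rng(42)
    measurements = np.array([1.8, 2.1, 1.9, 2.4, 2.0, 2.2])   # a.u.

    boot_estimates = [
        rng.choice(measurements, size=measurements.size, replace=True).mean()
        for _ in range(2000)
    ]
    low, high = np.percentile(boot_estimates, [2.5, 97.5])
    print(f"bootstrap mean = {np.mean(boot_estimates):.2f}, "
          f"95% interval = [{low:.2f}, {high:.2f}]")
    ```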

  15. [Validation of a HPLC method for ochratoxin A determination].

    PubMed

    Bulea, Delia; Spac, A F; Dorneanu, V

    2011-01-01

    Ochratoxin A is a mycotoxin produced by various species of Aspergillus and Penicillium. Ochratoxin A has been detected in cereals and cereal products, coffee beans, beer, wine, spices, pig's kidney and cow's milk. For ochratoxin A, a HPLC method was developed and validated. Ochratoxin A was determined by RP-HPLC, using a liquid chromatograph type HP 1090 Series II, equiped with a fluorescence detector. The analysis was performed with a Phenomenex column, type Luna C18(2) 100A (150 x 4.6 mm; 5 microm) with a mobile phase consisting of a mixture of acetonitrile/water/acid acetic (99/99/2), a flow of 0.7 mL/min. For detection, the wavelenght of excitation was 228 nm and wavelenght of emision was 423 nm. The calibration graph was linear in 6.25-50 ng/mL concentration range (r2 = 0,9991). The detection limits was 1.6 ng/mL and the quantification limit was 4.9 ng/mL. The method precision (RSD = 2.4975%) and the accuracy (recovery was 100.1%) were studied. The HPLC method was applyed for ochratoxin A from food samples with good results. PMID:21870763

  16. Testing and Validation of the Dynamic Inertia Measurement Method

    NASA Technical Reports Server (NTRS)

    Chin, Alexander W.; Herrera, Claudia Y.; Spivey, Natalie D.; Fladung, William A.; Cloutier, David

    2015-01-01

    The Dynamic Inertia Measurement (DIM) method uses a ground vibration test setup to determine the mass properties of an object using information from frequency response functions. Most conventional mass properties testing involves using spin tables or pendulum-based swing tests, which for large aerospace vehicles becomes increasingly difficult and time-consuming, and therefore expensive, to perform. The DIM method has been validated on small test articles but has not been successfully proven on large aerospace vehicles. In response, the National Aeronautics and Space Administration Armstrong Flight Research Center (Edwards, California) conducted mass properties testing on an "iron bird" test article that is comparable in mass and scale to a fighter-type aircraft. The simple two-I-beam design of the "iron bird" was selected to ensure accurate analytical mass properties. Traditional swing testing was also performed to compare the level of effort, amount of resources, and quality of data with the DIM method. The DIM test showed favorable results for the center of gravity and moments of inertia; however, the products of inertia showed disagreement with analytical predictions.

  17. Validation of a digital PCR method for quantification of DNA copy number concentrations by using a certified reference material.

    PubMed

    Deprez, Liesbet; Corbisier, Philippe; Kortekaas, Anne-Marie; Mazoua, Stéphane; Beaz Hidalgo, Roxana; Trapmann, Stefanie; Emons, Hendrik

    2016-09-01

    Digital PCR has become the emerging technique for the sequence-specific detection and quantification of nucleic acids for various applications. During the past years, numerous reports on the development of new digital PCR methods have been published. Maturation of these developments into reliable analytical methods suitable for diagnostic or other routine testing purposes requires their validation for the intended use. Here, the results of an in-house validation of a droplet digital PCR method are presented. This method is intended for the quantification of the absolute copy number concentration of a purified linearized plasmid in solution with a nucleic acid background. It has been investigated which factors within the measurement process have a significant effect on the measurement results, and the contribution to the overall measurement uncertainty has been estimated. A comprehensive overview is provided on all the aspects that should be investigated when performing an in-house method validation of a digital PCR method. PMID:27617230
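
    The core quantification step in droplet digital PCR is a short Poisson calculation: from the fraction of negative droplets, the mean number of copies per droplet is lambda = -ln(N_neg/N_total), and dividing by the droplet volume gives the copy-number concentration. The counts and volume below are typical orders of magnitude, not the study's data.

    ```python
    # Standard Poisson estimate used in droplet digital PCR. Droplet count
    # and volume are plausible placeholders, not values from the paper.
    import math

    n_total = 15000          # accepted droplets
    n_negative = 9000        # droplets with no amplification
    v_droplet_ul = 0.85e-3   # droplet volume in microlitres (~0.85 nL)

    lam = -math.log(n_negative / n_total)   # mean copies per droplet
    conc = lam / v_droplet_ul               # copies per microlitre
    print(f"lambda = {lam:.3f} copies/droplet; "
          f"concentration ~ {conc:.0f} copies/uL")
    ```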

  18. Formal methods and their role in digital systems validation for airborne systems

    NASA Technical Reports Server (NTRS)

    Rushby, John

    1995-01-01

    This report is based on one prepared as a chapter for the FAA Digital Systems Validation Handbook (a guide to assist FAA certification specialists with advanced technology issues). Its purpose is to explain the use of formal methods in the specification and verification of software and hardware requirements, designs, and implementations; to identify the benefits, weaknesses, and difficulties in applying these methods to digital systems used in critical applications; and to suggest factors for consideration when formal methods are offered in support of certification. The presentation concentrates on the rationale for formal methods and on their contribution to assurance for critical applications within a context such as that provided by DO-178B (the guidelines for software used on board civil aircraft); it is intended as an introduction for those to whom these topics are new.

  19. Determination of formaldehyde in food and feed by an in-house validated HPLC method.

    PubMed

    Wahed, P; Razzaq, Md A; Dharmapuri, S; Corrales, M

    2016-07-01

    Formalin is carcinogenic and detrimental to public health. The illegal addition of formalin (37% formaldehyde and 14% methanol) to foods to extend their shelf-life is considered to be a common practice in Bangladesh. The lack of accurate methods and the ubiquitous presence of formaldehyde in foods make the detection of illegally added formalin challenging. With the aim of helping regulatory authorities, a sensitive high performance liquid chromatography method was validated for the quantitative determination of formaldehyde in mango, fish and milk. The method was fit for purpose and showed good analytical performance in terms of specificity, linearity, precision, recovery and robustness. The expanded uncertainty was <35%. The validated method was applied to screen samples of fruits, vegetables, fresh fish, milk and fish feed collected from different local markets in Dhaka, Bangladesh. Levels of formaldehyde in food samples were compared with published data. The applicability of the method to different food matrices gives it potential as a reference standard method. PMID:26920321

  20. Validation of the Benefit Forecasting Method: Organization Development Program to Increase Health Organization Membership. Training and Development Research Center, Project Number Eleven.

    ERIC Educational Resources Information Center

    Sleezer, Catherine M.; And Others

    This project is the sixth in a series of studies designed to validate the Training and Development Benefit Forecasting Method (BFM) sponsored by the Training and Development Research Center (TDRC) at the University of Minnesota. The purpose of this study was to validate the BFM's ability to forecast the benefits of an organization development…

  1. An Anatomically Validated Brachial Plexus Contouring Method for Intensity Modulated Radiation Therapy Planning

    SciTech Connect

    Van de Velde, Joris; Audenaert, Emmanuel; Speleers, Bruno; Vercauteren, Tom; Mulliez, Thomas; Vandemaele, Pieter; Achten, Eric; Kerckaert, Ingrid; D'Herde, Katharina; De Neve, Wilfried; Van Hoof, Tom

    2013-11-15

    Purpose: To develop contouring guidelines for the brachial plexus (BP) using anatomically validated cadaver datasets. Magnetic resonance imaging (MRI) and computed tomography (CT) were used to obtain detailed visualizations of the BP region, with the goal of achieving maximal inclusion of the actual BP in a small contoured volume while also accommodating anatomic variations. Methods and Materials: CT and MRI were obtained for 8 cadavers positioned for intensity modulated radiation therapy. 3-dimensional reconstructions of soft tissue (from MRI) and bone (from CT) were combined to create 8 separate enhanced CT project files. Dissection of the corresponding cadavers anatomically validated the reconstructions created. Seven enhanced CT project files were then automatically fitted, separately in different regions, to obtain a single dataset of superimposed BP regions that incorporated anatomic variations. From this dataset, improved BP contouring guidelines were developed. These guidelines were then applied to the 7 original CT project files and also to 1 additional file, left out from the superimposing procedure. The percentage of BP inclusion was compared with the published guidelines. Results: The anatomic validation procedure showed a high level of conformity for the BP regions examined between the 3-dimensional reconstructions generated and the dissected counterparts. Accurate and detailed BP contouring guidelines were developed, which provided corresponding guidance for each level in a clinical dataset. An average margin of 4.7 mm around the anatomically validated BP contour is sufficient to accommodate anatomic variations. Using the new guidelines, 100% inclusion of the BP was achieved, compared with a mean inclusion of 37.75% when published guidelines were applied. Conclusion: Improved guidelines for BP delineation were developed using combined MRI and CT imaging, with validation by anatomic dissection.

  2. STANDARDIZATION AND VALIDATION OF MICROBIOLOGICAL METHODS FOR EXAMINATION OF BIOSOLIDS

    EPA Science Inventory

    The objective of this presentation is to discuss pathogens of concern in biosolids, the analytical techniques used to evaluate microorganisms in biosolids, and to discuss standardization and validation of analytical protocols for microbes within a complex matrix. Implications of ...

  3. MICROORGANISMS IN BIOSOLIDS: ANALYTICAL METHODS DEVELOPMENT, STANDARDIZATION, AND VALIDATION

    EPA Science Inventory

    The objective of this presentation is to discuss pathogens of concern in biosolids, the analytical techniques used to evaluate microorganisms in biosolids, and to discuss standardization and validation of analytical protocols for microbes within such a complex matrix. Implicatio...

  4. Comments on "validation of two innovative methods to measure contaminant mass flux in groundwater" by Goltz et al.

    NASA Astrophysics Data System (ADS)

    Sun, Kerang

    2014-12-01

    I wish to comment on the paper published by Goltz et al. in this journal, titled "Validation of two innovative methods to measure contaminant mass flux in groundwater" (Goltz et al., 2009). The paper presents the results of experiments Goltz et al. conducted in an artificial aquifer for the purpose of validating two recently developed methods to measure contaminant mass flux in groundwater: the tandem circulation well (TCW) method and the modified integral pumping test (MIPT) method. Their experimental results showed that the TCW method, implemented using both the multi-dipole technique and the tracer test technique, successfully estimated the mass fluxes with respective accuracies within 2% and 16% of the known values. The MIPT method, on the other hand, underestimated the mass flux by as much as 70%. My comments focus on the MIPT method.

  5. Cleaning validation 2: development and validation of an ion chromatographic method for the detection of traces of CIP-100 detergent.

    PubMed

    Resto, Wilfredo; Hernández, Darimar; Rey, Rosamil; Colón, Héctor; Zayas, José

    2007-05-01

    An ion chromatographic method with conductivity detection was developed and validated for the determination of traces of a clean-in-place (CIP) detergent in cleaning validation. It was shown to be linear, with a squared correlation coefficient (r(2)) of 0.9999, and gave average recoveries of 71.4% (area response factor) from stainless steel surfaces and 101% from cotton. The repeatability was found to be 2.17%, with an intermediate precision of 1.88% across the range. The method was also shown to be sensitive, with a detection limit (DL) of 0.13 ppm and a quantitation limit (QL) of 0.39 ppm for EDTA, which translates to less than 1 μL of CIP diluted in 100 mL of diluent in both cases. The EDTA signal was well resolved from typical ions encountered in water samples and from any other interference presented by swabs and surfaces. The validated method can be applied to cleaning validation samples and is suitable for inclusion in a rapid and reliable cleaning validation program. PMID:17344013

  6. VALIDATION OF ANALYTICAL METHODS AND INSTRUMENTATION FOR BERYLLIUM MEASUREMENT: REVIEW AND SUMMARY OF AVAILABLE GUIDES, PROCEDURES, AND PROTOCOLS

    SciTech Connect

    Ekechukwu, A

    2009-05-27

    Method validation is the process of evaluating whether an analytical method is acceptable for its intended purpose. For pharmaceutical methods, guidelines from the United States Pharmacopeia (USP), International Conference on Harmonisation (ICH), and the United States Food and Drug Administration (USFDA) provide a framework for performing such validations. In general, methods for regulatory compliance must include studies on specificity, linearity, accuracy, precision, range, detection limit, quantitation limit, and robustness. Elements of these guidelines are readily adapted to the issue of validation for beryllium sampling and analysis. This document provides a listing of available sources which can be used to validate analytical methods and/or instrumentation for beryllium determination. A literature review was conducted of available standard methods and publications used for method validation and/or quality control. A comprehensive listing of the articles, papers and books reviewed is given in the Appendix. Available validation documents and guides are listed therein; each has a brief description of application and use. In the referenced sources, there are varying approaches to validation and varying descriptions of the validation process at different stages in method development. This discussion focuses on validation and verification of fully developed methods and instrumentation that have been offered up for use or approval by other laboratories or official consensus bodies such as ASTM International, the International Standards Organization (ISO) and the Association of Official Analytical Chemists (AOAC). This review was conducted as part of a collaborative effort to investigate and improve the state of validation for measuring beryllium in the workplace and the environment. Documents and publications from the United States and Europe are included. Unless otherwise specified, all referenced documents were published in English.

  7. Guidelines for the validation of qualitative multi-residue methods used to detect pesticides in food.

    PubMed

    Mol, H G J; Reynolds, S L; Fussell, R J; Stajnbaher, D

    2012-08-01

    There is a current trend for many laboratories to develop and use qualitative gas chromatography-mass spectrometry (GC-MS) and liquid chromatography-mass spectrometry (LC-MS) based multi-residue methods (MRMs) in order to greatly increase the number of pesticides that they can target. Before these qualitative MRMs can be used for the monitoring of pesticide residues in food, their fitness-for-purpose needs to be established by initial method validation. This paper sets out to assess the performances of two such qualitative MRMs against a set of parameters and criteria that might be suitable for their effective validation. As expected, the ease of detection was often dependent on the particular pesticide/commodity combinations that were targeted, especially at the lowest concentrations tested (0.01 mg/kg). The two examples also clearly demonstrated that the percentage of pesticides detected was dependent on many factors, but particularly on the capabilities of the automated software/library packages and the parameters and threshold settings selected for operation. Another very important consideration was the condition of the chromatographic system and detector at the time of analysis. If the system was relatively clean, the detection rate was much higher than if it had become contaminated over time by previous injections of sample extracts. The parameters and criteria suggested for method validation of qualitative MRMs are aimed at achieving a 95% confidence level of pesticide detection. However, any pesticide that is 'detected' will need subsequent analysis for quantification and, depending on the qualitative method used, further evidence of identity. PMID:22851355

  8. Use of Validation by Enterprises for Human Resource and Career Development Purposes. Cedefop Reference Series No 96

    ERIC Educational Resources Information Center

    Cedefop - European Centre for the Development of Vocational Training, 2014

    2014-01-01

    European enterprises give high priority to assessing skills and competences, seeing this as crucial for recruitment and human resource management. Based on a survey of 400 enterprises, 20 in-depth case studies and interviews with human resource experts in 10 countries, this report analyses the main purposes of competence assessment, the standards…

  9. Validation of a new method to measure contact and flight times during treadmill running.

    PubMed

    Ogueta-Alday, Ana; Morante, Juan C; Rodríguez-Marroyo, Jose A; García-López, Juan

    2013-05-01

    The purpose of this study was to validate a new method to measure contact and flight times during treadmill running and to test its reliability and sensitivity. Fifteen well-trained runners performed 7 sets of running at different speeds (from 10 to 22 km·h⁻¹). Contact and flight times were simultaneously recorded by a high-speed video system (gold standard method) and a new method based on laser technology (SportJump System Pro). Athletes were classified according to their foot strike pattern (rearfoot vs. midfoot and forefoot). The new method overestimated contact time and underestimated flight time with respect to the gold standard method (p < 0.001). However, relationships and intraclass correlation coefficients (ICCs) between the two systems were very strong (r and ICC > 0.99, p < 0.001). Contact time differences between the 2 systems depended on running speed (p < 0.001) but not on foot strike pattern or runners' body mass, which allowed the differences in contact and flight times to be corrected. The new method was sensitive enough to detect small differences in contact time (<20 ms) as running speed increased and as the type of foot strike pattern changed. Additionally, a low intraindividual step variability (coefficient of variation = 2.0 ± 0.5%) and high intra- (ICC = 0.998) and interobserver (ICC = 0.977) reliability were shown. In conclusion, the new method was validated, being reliable and sensitive for detecting small differences in contact and flight times during treadmill running. Therefore, it could be used to compare biomechanical variables between groups in cross-sectional studies and to verify the influence of some independent variables (i.e., training, running economy, or performance) on running biomechanics. PMID:22836607
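
    A sketch of how contact and flight times can be derived from a sampled on/off occlusion signal of the kind such laser systems produce (synthetic signal, assumed 1 kHz sampling): run lengths of consecutive "contact" samples give contact times, and runs of "no contact" give flight times.

    ```python
    # Deriving contact and flight times from a sampled binary signal
    # (1 = beam occluded, foot on belt). The signal is synthetic and the
    # sampling rate is an assumed placeholder.
    import numpy as np

    fs = 1000                                   # sampling rate, Hz (assumed)
    signal = np.array([0]*80 + [1]*240 + [0]*120 + [1]*250 + [0]*90)

    edges = np.flatnonzero(np.diff(signal)) + 1     # indices where state flips
    runs = np.split(signal, edges)                  # consecutive same-state runs
    durations_ms = [(run[0], 1000 * len(run) / fs) for run in runs]

    contact = [d for v, d in durations_ms if v == 1]
    flight = [d for v, d in durations_ms if v == 0][1:-1]  # drop edge runs
    print("contact times (ms):", contact, "| flight times (ms):", flight)
    ```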

  10. Measurement Practices: Methods for Developing Content-Valid Student Examinations.

    ERIC Educational Resources Information Center

    Bridge, Patrick D.; Musial, Joseph; Frank, Robert; Roe, Thomas; Sawilowsky, Shlomo

    2003-01-01

    Reviews the fundamental principles associated with achieving a high level of content validity when developing tests for students. Suggests that the short-term efforts necessary to develop and integrate measurement theory into practice will lead to long-term gains for students, faculty, and academic institutions. (Includes 21 references.)…

  11. Cost-Benefit Considerations in Choosing among Cross-Validation Methods.

    ERIC Educational Resources Information Center

    Murphy, Kevin R.

    There are two general methods of cross-validation: empirical estimation, and formula estimation. In choosing a specific cross-validation procedure, one should consider both costs (e.g., inefficient use of available data in estimating regression parameters) and benefits (e.g., accuracy in estimating population cross-validity). Empirical…

  12. Validation Methods for Fault-Tolerant avionics and control systems, working group meeting 1

    NASA Technical Reports Server (NTRS)

    1979-01-01

    The proceedings of the first working group meeting on validation methods for fault tolerant computer design are presented. The state of the art in fault tolerant computer validation was examined in order to provide a framework for future discussions concerning research issues for the validation of fault tolerant avionics and flight control systems. The development of positions concerning critical aspects of the validation process are given.

  13. Testing alternative ground water models using cross-validation and other methods

    USGS Publications Warehouse

    Foglia, L.; Mehl, S.W.; Hill, M.C.; Perona, P.; Burlando, P.

    2007-01-01

    Many methods can be used to test alternative ground water models. Of concern in this work are methods able to (1) rank alternative models (also called model discrimination) and (2) identify observations important to parameter estimates and predictions (equivalent to the purpose served by some types of sensitivity analysis). Some of the measures investigated are computationally efficient; others are computationally demanding. The latter are generally needed to account for model nonlinearity. The efficient model discrimination methods investigated include the information criteria: the corrected Akaike information criterion, Bayesian information criterion, and generalized cross-validation. The efficient sensitivity analysis measures used are dimensionless scaled sensitivity (DSS), composite scaled sensitivity, and parameter correlation coefficient (PCC); the other statistics are DFBETAS, Cook's D, and observation-prediction statistic. Acronyms are explained in the introduction. Cross-validation (CV) is a computationally intensive nonlinear method that is used for both model discrimination and sensitivity analysis. The methods are tested using up to five alternative parsimoniously constructed models of the ground water system of the Maggia Valley in southern Switzerland. The alternative models differ in their representation of hydraulic conductivity. A new method for graphically representing CV and sensitivity analysis results for complex models is presented and used to evaluate the utility of the efficient statistics. The results indicate that for model selection, the information criteria produce similar results at much smaller computational cost than CV. For identifying important observations, the only obviously inferior linear measure is DSS; the poor performance was expected because DSS does not include the effects of parameter correlation and PCC reveals large parameter correlations. © 2007 National Ground Water Association.
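
    A minimal sketch of the information-criteria comparison for least-squares models, with invented sums of squared errors and parameter counts for two hypothetical hydraulic-conductivity parameterizations: AIC = n·ln(SSE/n) + 2k, AICc adds the small-sample correction, and BIC = n·ln(SSE/n) + k·ln(n); the smallest value is preferred.

    ```python
    # Ranking alternative least-squares models with information criteria.
    # SSE values, parameter counts and model names are invented placeholders.
    import math

    n = 50                                   # number of observations
    models = {"K-zones-2": (12.4, 4),        # name: (SSE, k parameters)
              "K-zones-5": (10.9, 7)}

    for name, (sse, k) in models.items():
        aic = n * math.log(sse / n) + 2 * k
        aicc = aic + 2 * k * (k + 1) / (n - k - 1)   # small-sample correction
        bic = n * math.log(sse / n) + k * math.log(n)
        print(f"{name}: AICc = {aicc:.1f}, BIC = {bic:.1f}")
    # the model with the smallest criterion value is preferred
    ```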

  14. Validation of a high performance liquid chromatography method for the stabilization of epigallocatechin gallate.

    PubMed

    Fangueiro, Joana F; Parra, Alexander; Silva, Amélia M; Egea, Maria A; Souto, Eliana B; Garcia, Maria L; Calpena, Ana C

    2014-11-20

    Epigallocatechin gallate (EGCG) is a green tea catechin with potential health benefits, such as anti-oxidant, anti-carcinogenic and anti-inflammatory effects. In general, EGCG is highly susceptible to degradation and therefore presents stability problems. The present paper focused on the stability of EGCG in HEPES (N-2-hydroxyethylpiperazine-N'-2-ethanesulfonic acid) medium with regard to pH, storage temperature and the presence of ascorbic acid, a reducing agent. The evaluation of EGCG in HEPES buffer demonstrated that this molecule is not able to maintain its physicochemical properties and potential beneficial effects, since it is partially or completely degraded, depending on the EGCG concentration. The storage temperatures most suitable for maintaining its structure were the lower ones (4 or -20 °C). pH 3.5 provided greater stability than pH 7.4. However, the presence of a reducing agent (i.e., ascorbic acid) provided greater protection against degradation of EGCG. A method based on RP-HPLC with UV-vis detection was validated for two media: water and a biocompatible physiological medium composed of Transcutol®P, ethanol and ascorbic acid. Quantification of EGCG using the pure compound requires a validated HPLC method, which can then be applied in pharmacokinetic and pharmacodynamic studies. PMID:25175728

  15. The development and validation of methods for evaluating the immune system in preweaning piglets.

    PubMed

    Zeigler, Brandon M; Cameron, Mark; Nelson, Keith; Bailey, Kristi; Weiner, Myra L; Mahadevan, Brinda; Thorsrud, Bjorn

    2015-10-01

    The preweaning piglet has been found to be a valuable research model for testing ingredients used in infant formula. As part of the safety assessment, the neonate's immune system is an important component to be evaluated. In this study, three concurrent strategies were developed to assess immune system status. The methods included (1) immunophenotyping to assess circulating innate immune cell populations, (2) monitoring of circulating cytokines, particularly in response to a positive control agent, and (3) monitoring of localized gastrointestinal tissue cytokines using immunohistochemistry (IHC), particularly in response to a positive control agent. All assays were validated using white papers and regulatory guidance within a GLP environment. To validate the assays, precision, accuracy and sample stability were evaluated as needed using a fit-for-purpose approach. In addition, animals were treated with proinflammatory substances to distinguish positive from negative signals. In conclusion, these three methods were confirmed to be robust assays for evaluating the immune system and GIT-specific immune responses of preweaning piglets. PMID:26341191

  16. An Automatic Method for Geometric Segmentation of Masonry Arch Bridges for Structural Engineering Purposes

    NASA Astrophysics Data System (ADS)

    Riveiro, B.; DeJong, M.; Conde, B.

    2016-06-01

    Despite the tremendous advantages of laser scanning technology for the geometric characterization of built constructions, important limitations prevent its more widespread implementation in the structural engineering domain. Even though the technology provides extensive and accurate information to perform structural assessment and health monitoring, many practitioners are resistant to it due to the processing times involved. Thus, new methods are required that can automatically process LiDAR data and subsequently provide an automatic and organized interpretation. This paper presents a new method for fully automated point cloud segmentation of masonry arch bridges. The method efficiently creates segmented, spatially related and organized point clouds, each of which contains the relevant geometric data for a particular component (pier, arch, spandrel wall, etc.) of the structure. The segmentation procedure comprises a heuristic approach for the separation of different vertical walls; image processing tools adapted to voxel structures then allow the efficient segmentation of the main structural elements of the bridge. The proposed methodology provides the essential processed data required for structural assessment of masonry arch bridges based on geometric anomalies. The method is validated using a representative sample of masonry arch bridges in Spain.

  17. The validation of analytical methods for drug substances and drug products in UK pharmaceutical laboratories.

    PubMed

    Clarke, G S

    1994-05-01

    Results are presented of a survey on method validation of analytical procedures used in the testing of drug substances and finished products, covering most major research-based pharmaceutical companies with laboratories in the UK. The results indicate that although method validation shows an essential similarity across laboratories (in particular, chromatographic assay methods are validated in a similar manner in most laboratories), there is much diversity in the detailed application of validation parameters. Testing procedures for drug substances are broadly similar to those for finished products. Many laboratories validate methods at the clinical trial stage to the same extent and detail as at the marketing authorization application (MAA)/new drug application (NDA) submission stage; however, only a small minority of laboratories apply the same criteria to methodology at the pre-clinical trial stage. Extensive details of method validation parameters are included in the summary tables of this survey, together with details of the median response given for the validation of the most extensively applied methods. These median response details could be useful in suggesting a harmonized approach to method validation as applied by UK pharmaceutical laboratories. Such guidelines would extend beyond the recommendations made to date by regulatory authorities and pharmacopoeias in that minimum requirements for each method validation parameter, e.g. number of replicates, range and tolerance, could be harmonized, both between laboratories and also in Product Licence submissions. PMID:7948185

  18. Optimal combining of ground-based sensors for the purpose of validating satellite-based rainfall estimates

    NASA Technical Reports Server (NTRS)

    Krajewski, Witold F.; Rexroth, David T.; Kiriaki, Kiriakie

    1991-01-01

    Two problems related to radar rainfall estimation are described. The first part is a description of a preliminary data analysis for the purpose of statistical estimation of rainfall from multiple (radar and raingage) sensors. Raingage, radar, and joint radar-raingage estimation is described, and some results are given. Statistical parameters of rainfall spatial dependence are calculated and discussed in the context of optimal estimation. Quality control of radar data is also described. The second part describes radar scattering by ellipsoidal raindrops. An analytical solution is derived for the Rayleigh scattering regime. Single and volume scattering are presented. Comparison calculations with the known results for spheres and oblate spheroids are shown.

  19. A simple method to generate adipose stem cell-derived neurons for screening purposes.

    PubMed

    Bossio, Caterina; Mastrangelo, Rosa; Morini, Raffaella; Tonna, Noemi; Coco, Silvia; Verderio, Claudia; Matteoli, Michela; Bianco, Fabio

    2013-10-01

    Strategies for differentiating mesenchymal stem cells (MSCs) toward neuronal cells for screening purposes suffer from both quality and quantity issues. Differentiated cells are often scarce relative to the starting undifferentiated population, and the differentiation process is usually quite long, with a high risk of contamination and low yield. Here, we describe a novel, simple method to induce direct differentiation of MSCs into neuronal cells without neurosphere formation. Differentiated cells show clear morphological changes, express neuron-specific markers, respond functionally to depolarizing stimuli, and display electrophysiological properties similar to those of developing neurons. The method described here represents a valuable tool for future strategies aimed at personalized in vitro screening of therapeutic agents. PMID:23468184

  20. Convergent validity of a novel method for quantifying rowing training loads.

    PubMed

    Tran, Jacqueline; Rice, Anthony J; Main, Luana C; Gastin, Paul B

    2015-01-01

    Elite rowers complete rowing-specific and non-specific training, incorporating continuous and interval-like efforts spanning the intensity spectrum. However, established training load measures are unsuitable for use in some modes and intensities. Consequently, a new measure known as the T2minute method was created. The method quantifies load as the time spent in a range of training zones (time-in-zone), multiplied by intensity- and mode-specific weighting factors that scale the relative stress of different intensities and modes to the demands of on-water rowing. The purpose of this study was to examine the convergent validity of the T2minute method with Banister's training impulse (TRIMP), Lucia's TRIMP and Session-RPE when quantifying elite rowing training. Fourteen elite rowers (12 males, 2 females) were monitored during four weeks of routine training. Unadjusted T2minute loads (using coaches' estimates of time-in-zone) demonstrated moderate-to-strong correlations with Banister's TRIMP, Lucia's TRIMP and Session-RPE (rho: 0.58, 0.55 and 0.42, respectively). Adjusting T2minute loads by using actual time-in-zone data resulted in stronger correlations between the T2minute method and Banister's TRIMP and Lucia's TRIMP (rho: 0.85 and 0.81, respectively). The T2minute method is an appropriate in-field measure of elite rowing training loads, particularly when actual time-in-zone values are used to quantify load. PMID:25083912
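
    A minimal sketch of the T2minute idea as the abstract describes it: load is time-in-zone multiplied by mode- and intensity-specific weighting factors and summed. The factor values and mode/zone names below are hypothetical placeholders, not the published weightings.

      # Hypothetical weighting factors scaling each (mode, zone) to on-water rowing demands.
      WEIGHTS = {("ergometer", "T2"): 1.0, ("cycling", "T2"): 0.8, ("on-water", "T4"): 1.4}

      def t2minute_load(time_in_zone):
          """time_in_zone: dict mapping (mode, zone) -> minutes spent."""
          return sum(WEIGHTS[key] * minutes for key, minutes in time_in_zone.items())

      session = {("ergometer", "T2"): 40, ("cycling", "T2"): 30}
      print(t2minute_load(session))  # 40*1.0 + 30*0.8 = 64.0 T2-equivalent minutes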

  1. Optimization and Validation of an ETAAS Method for the Determination of Nickel in Postmortem Material.

    PubMed

    Dudek-Adamska, Danuta; Lech, Teresa; Kościelniak, Paweł

    2015-01-01

    In this article, the optimization and validation of a procedure for the determination of total nickel in wet-digested samples of human body tissues (internal organs) for forensic toxicological purposes are presented. Four experimental setups of electrothermal atomic absorption spectrometry (ETAAS) on a Solaar MQZe (Thermo Electron Co.) were compared, using (i) no modifier, (ii) magnesium nitrate, (iii) palladium nitrate and (iv) a magnesium nitrate and ammonium dihydrogen phosphate mixture as chemical modifiers. It was ascertained that ETAAS without any modifier, with pyrolysis and atomization temperatures of 1,300 and 2,400°C, respectively, can be used to determine total nickel at reference levels in biological materials, as well as at the levels found in chronic or acute poisonings. The method developed was validated, yielding a linear calibration range from 0.76 to 15.0 µg/L, a limit of detection of 0.23 µg/L, a limit of quantification of 0.76 µg/L, precision (as relative standard deviation) up to 10% and accuracy of 97.1% for the analysis of certified material (SRM 1577c Bovine Liver) and within a range from 99.2 to 109.9% for the recovery of fortified liver samples. PMID:25868556
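
    Figures of merit like those reported here are commonly derived from the calibration line. The sketch below shows the standard ICH-style calculation (LOD = 3.3·s/slope, LOQ = 10·s/slope, with s the residual standard deviation); it illustrates the general formulae, not necessarily the exact computation used by the authors.

      import numpy as np

      def lod_loq(conc, signal):
          """ICH-style estimates from a straight-line calibration: LOD = 3.3*s/b,
          LOQ = 10*s/b, with s the residual standard deviation and b the slope."""
          b, a = np.polyfit(conc, signal, 1)
          resid = signal - (b * conc + a)
          s = np.sqrt(np.sum(resid**2) / (len(conc) - 2))
          return 3.3 * s / b, 10 * s / b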

  2. VALIDATION OF A METHOD FOR ESTIMATING LONG-TERM EXPOSURES BASED ON SHORT-TERM MEASUREMENTS

    EPA Science Inventory

    A method for estimating long-term exposures from short-term measurements is validated using data from a recent EPA study of exposure to fine particles. The method was developed a decade ago but long-term exposure data to validate it did not exist until recently. In this paper, ...

  3. ECVAM's approach to intellectual property rights in the validation of alternative methods.

    PubMed

    Linge, Jens P; Hartung, Thomas

    2007-08-01

    In this article, we discuss how intellectual property rights affect the validation of alternative methods at ECVAM. We point out recent cases and summarise relevant EU and OECD documents. Finally, we discuss guidelines for dealing with intellectual property rights during the validation of alternative methods at ECVAM. PMID:17850189

  4. Data on the verification and validation of segmentation and registration methods for diffusion MRI.

    PubMed

    Esteban, Oscar; Zosso, Dominique; Daducci, Alessandro; Bach-Cuadra, Meritxell; Ledesma-Carbayo, María J; Thiran, Jean-Philippe; Santos, Andres

    2016-09-01

    The verification and validation of segmentation and registration methods is a necessary assessment in the development of new processing methods. However, verification and validation of diffusion MRI (dMRI) processing methods is challenging because of the lack of gold-standard data. The data described here are related to the research article entitled "Surface-driven registration method for the structure-informed segmentation of diffusion MR images" [1], in which publicly available data are used to derive gold-standard reference data for validating and evaluating segmentation and registration methods in dMRI. PMID:27508235

  5. Validation of two ribosomal RNA removal methods for microbial metatranscriptomics

    SciTech Connect

    He, Shaomei; Wurtzel, Omri; Singh, Kanwar; Froula, Jeff L; Yilmaz, Suzan; Tringe, Susannah G; Wang, Zhong; Chen, Feng; Lindquist, Erika A; Sorek, Rotem; Hugenholtz, Philip

    2010-10-01

    The predominance of rRNAs in the transcriptome is a major technical challenge in sequence-based analysis of cDNAs from microbial isolates and communities. Several approaches have been applied to deplete rRNAs from (meta)transcriptomes, but no systematic investigation of potential biases introduced by any of these approaches has been reported. Here we validated the effectiveness and fidelity of the two most commonly used approaches, subtractive hybridization and exonuclease digestion, as well as combinations of these treatments, on two synthetic five-microorganism metatranscriptomes using massively parallel sequencing. We found that the effectiveness of rRNA removal was a function of community composition and RNA integrity for these treatments. Subtractive hybridization alone introduced the least bias in relative transcript abundance, whereas exonuclease and in particular combined treatments greatly compromised mRNA abundance fidelity. Illumina sequencing itself also can compromise quantitative data analysis by introducing a G+C bias between runs.

  6. An evaluation of alternate production methods for Pu-238 general purpose heat source pellets

    SciTech Connect

    Mark Borland; Steve Frank

    2009-06-01

    For the past half century, the National Aeronautics and Space Administration (NASA) has used Radioisotope Thermoelectric Generators (RTG) to power deep space satellites. Fabrication of heat sources for RTGs, specifically General Purpose Heat Sources (GPHSs), has remained essentially unchanged since their development in the 1970s. Meanwhile, 30 years of technological advancements have been made in the applicable fields of chemistry, manufacturing and control systems. This paper evaluates alternative processes that could be used to produce Pu-238 fueled heat sources. Specifically, it discusses the production of the plutonium-oxide granules, which are the input stream to the ceramic pressing and sintering processes. Alternate chemical processes are compared to current methods to determine whether alternative fabrication processes could reduce the hazards, especially the production of respirable fines, while producing an equivalent GPHS product.

  7. STATISTICAL VALIDATION OF SULFATE QUANTIFICATION METHODS USED FOR ANALYSIS OF ACID MINE DRAINAGE

    EPA Science Inventory

    Turbidimetric method (TM), ion chromatography (IC) and inductively coupled plasma atomic emission spectrometry (ICP-AES) with and without acid digestion have been compared and validated for the determination of sulfate in mining wastewater. Analytical methods were chosen to compa...

  8. Comparison of Machine Learning Methods for the Purpose Of Human Fall Detection

    NASA Astrophysics Data System (ADS)

    Strémy, Maximilián; Peterková, Andrea

    2014-12-01

    According to several studies, the European population has been aging rapidly over recent years. It is therefore important to ensure that the aging population is able to live independently without the support of the working-age population. The same studies indicate that falls are the most dangerous and frequent accidents in the everyday life of the aging population. In our paper, we present a system to detect human falls visually, i.e. using no wearable equipment. For this purpose, we used a Kinect sensor, which provides the human body position in Cartesian coordinates. Direct capture of the human body is possible because the Kinect sensor has a depth camera as well as an infrared camera. The first step in our research was to detect postures and classify fall accidents. We compared selected machine learning methods, namely Naive Bayes, decision trees and SVM, on the task of recognizing human postures (standing, sitting and lying). The highest classification accuracy, over 93.3%, was achieved by the decision tree method.
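
    A minimal sketch of posture classification in the spirit described here, using a scikit-learn decision tree on joint-coordinate feature vectors. The data below are synthetic stand-ins for real Kinect skeleton frames.

      import numpy as np
      from sklearn.model_selection import cross_val_score
      from sklearn.tree import DecisionTreeClassifier

      # X: one row per frame of Kinect joint coordinates (e.g., 20 joints x 3 axes),
      # y: posture labels 0=standing, 1=sitting, 2=lying. Synthetic stand-ins here.
      rng = np.random.default_rng(0)
      X = rng.normal(size=(300, 60))
      y = rng.integers(0, 3, size=300)

      clf = DecisionTreeClassifier(max_depth=5, random_state=0)
      print(cross_val_score(clf, X, y, cv=5).mean())  # mean classification accuracy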

  9. 25 CFR 309.8 - For marketing purposes, what is the recommended method of identifying authentic Indian products?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 25 Indians 2 2011-04-01 2011-04-01 false For marketing purposes, what is the recommended method of identifying authentic Indian products? 309.8 Section 309.8 Indians INDIAN ARTS AND CRAFTS BOARD, DEPARTMENT OF THE INTERIOR PROTECTION OF INDIAN ARTS AND CRAFTS PRODUCTS § 309.8 For marketing purposes, what is the recommended method of...

  10. 25 CFR 309.8 - For marketing purposes, what is the recommended method of identifying authentic Indian products?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 25 Indians 2 2013-04-01 2013-04-01 false For marketing purposes, what is the recommended method of identifying authentic Indian products? 309.8 Section 309.8 Indians INDIAN ARTS AND CRAFTS BOARD, DEPARTMENT OF THE INTERIOR PROTECTION OF INDIAN ARTS AND CRAFTS PRODUCTS § 309.8 For marketing purposes, what is the recommended method of...

  11. 25 CFR 309.8 - For marketing purposes, what is the recommended method of identifying authentic Indian products?

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 25 Indians 2 2012-04-01 2012-04-01 false For marketing purposes, what is the recommended method of identifying authentic Indian products? 309.8 Section 309.8 Indians INDIAN ARTS AND CRAFTS BOARD, DEPARTMENT OF THE INTERIOR PROTECTION OF INDIAN ARTS AND CRAFTS PRODUCTS § 309.8 For marketing purposes, what is the recommended method of...

  12. 25 CFR 309.8 - For marketing purposes, what is the recommended method of identifying authentic Indian products?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 25 Indians 2 2014-04-01 2014-04-01 false For marketing purposes, what is the recommended method of identifying authentic Indian products? 309.8 Section 309.8 Indians INDIAN ARTS AND CRAFTS BOARD, DEPARTMENT OF THE INTERIOR PROTECTION OF INDIAN ARTS AND CRAFTS PRODUCTS § 309.8 For marketing purposes, what is the recommended method of...

  13. 25 CFR 309.8 - For marketing purposes, what is the recommended method of identifying authentic Indian products?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 25 Indians 2 2010-04-01 2010-04-01 false For marketing purposes, what is the recommended method of identifying authentic Indian products? 309.8 Section 309.8 Indians INDIAN ARTS AND CRAFTS BOARD, DEPARTMENT OF THE INTERIOR PROTECTION OF INDIAN ARTS AND CRAFTS PRODUCTS § 309.8 For marketing purposes, what is the recommended method of...

  14. FIELD VALIDATION OF EPA (ENVIRONMENTAL PROTECTION AGENCY) REFERENCE METHOD 23

    EPA Science Inventory

    The accuracy and precision of U.S. Environmental Protection Agency Reference Method 23 was evaluated at a trichloroethylene degreasing facility and an ethylene dichloride plant. The method consists of a procedure for obtaining an integrated sample followed by gas chromatographic ...

  15. Testing and Validation of the Dynamic Inertia Measurement Method

    NASA Technical Reports Server (NTRS)

    Chin, Alexander; Herrera, Claudia; Spivey, Natalie; Fladung, William; Cloutier, David

    2015-01-01

    This presentation describes the DIM method and how it measures the inertia properties of an object by analyzing the frequency response functions measured during a ground vibration test (GVT). The DIM method has been in development at the University of Cincinnati and has shown success on a variety of small scale test articles. The NASA AFRC version was modified for larger applications.

  16. Validation of a Generic qHNMR Method for Natural Products Analysis†

    PubMed Central

    Gödecke, Tanja; Napolitano, José G.; Rodríguez-Brasco, María F.; Chen, Shao-Nong; Jaki, Birgit U.; Lankin, David C.; Pauli, Guido F.

    2014-01-01

    Introduction: Nuclear magnetic resonance (NMR) spectroscopy is increasingly employed in the quantitative analysis and quality control (QC) of natural products (NPs), including botanical dietary supplements (BDSs). The establishment of qHNMR-based QC protocols requires method validation. Objective: To develop and validate a generic qHNMR method, optimizing acquisition and processing parameters with specific attention to the requirements for the analysis of complex NP samples, including botanicals and purity assessment of NP isolates. Methodology: In order to establish the validated qHNMR method, samples containing two highly pure reference materials were used. The influence of acquisition and processing parameters on method validation was examined, and general aspects of validating qHNMR methods are discussed. Subsequently, the established method was applied to the analysis of two natural product samples: a purified reference compound and a crude mixture. Results: The accuracy and precision of qHNMR using internal or external calibration were compared, using a validated method suitable for complex samples. The impact of post-acquisition processing on method validation was examined using three software packages: TopSpin, MNova, and NUTS. The dynamic range of the developed qHNMR method was 5,000:1 with a limit of detection (LOD) of better than 10 μM. The limit of quantification (LOQ) depends on the desired level of accuracy and the experiment time spent. Conclusions: This study revealed that acquisition parameters, processing parameters, and processing software all contribute to qHNMR method validation. A validated method with high dynamic range and a general workflow for qHNMR analysis of NPs is proposed. PMID:23740625
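
    Quantitation in qHNMR with an internal calibrant typically relies on the standard relation between integrals, proton counts, molar masses and weighed masses. A small sketch of that generic relation follows; it is not a prescription taken from this paper.

      def qhnmr_purity(I_a, I_cal, N_a, N_cal, M_a, M_cal, m_a, m_cal, P_cal):
          """Purity of the analyte from integrals (I), proton counts (N), molar
          masses (M), weighed masses (m) and calibrant purity (P_cal) -- the
          standard internal-calibration relation."""
          return (I_a / I_cal) * (N_cal / N_a) * (M_a / M_cal) * (m_cal / m_a) * P_cal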

  17. Flight critical system design guidelines and validation methods

    NASA Technical Reports Server (NTRS)

    Holt, H. M.; Lupton, A. O.; Holden, D. G.

    1984-01-01

    Efforts being expended at NASA-Langley to define a validation methodology, techniques for comparing advanced systems concepts, and design guidelines for characterizing fault tolerant digital avionics are described with an emphasis on the capabilities of AIRLAB, an environmentally controlled laboratory. AIRLAB has VAX 11/750 and 11/780 computers with an aggregate of 22 Mb memory and over 650 Mb storage, interconnected at 256 kbaud. An additional computer is programmed to emulate digital devices. Ongoing work is easily accessed at user stations by either chronological or key word indexing. The CARE III program aids in analyzing the capabilities of test systems to recover from faults. An additional code, the semi-Markov unreliability program (SURE) generates upper and lower reliability bounds. The AIRLAB facility is mainly dedicated to research on designs of digital flight-critical systems which must have acceptable reliability before incorporation into aircraft control systems. The digital systems would be too costly to submit to a full battery of flight tests and must be initially examined with the AIRLAB simulation capabilities.

  18. Production of general purpose heat source (GPHS) using advanced manufacturing methods

    NASA Astrophysics Data System (ADS)

    Miller, Roger G.

    1996-03-01

    Mankind will continue to explore the stars through the use of unmanned spacecraft until the technology and costs are compatible with sending travelers to the outer planets of our solar system and beyond. Unmanned probes of the present and future will be needed to develop the necessary technologies and obtain the information that will make such travel possible. Because of the significant costs incurred, modern manufacturing technologies must be used to lower the investment needed, even when it is shared by international partnerships. For over 30 years, radioisotopes have provided the heat from which electrical power is extracted. Electric power for future spacecraft will be provided by Radioisotope Thermoelectric Generators (RTG), Radioisotopic Thermophotovoltaic systems (RTPV), radioisotope Stirling systems, or a combination of these. All of these systems will be thermally driven by General Purpose Heat Source (GPHS) fueled clads in some configuration. The GPHS clad contains a 238PuO2 pellet encapsulated in an iridium alloy container. Historically, the fabrication of the iridium alloy shells has been performed at EG&G Mound and Oak Ridge National Laboratory (ORNL), and girth welding at Westinghouse Savannah River Corporation (WSRC) and Los Alamos National Laboratory (LANL). This paper describes the use of laser processing for welding, drilling, cutting, and machining, together with other manufacturing methods, to reduce the costs of producing GPHS fueled clad components and completed assemblies. Incorporation of new quality technologies will complement these manufacturing methods to reduce cost.

  19. Validation of the WHO Hemoglobin Color Scale Method

    PubMed Central

    Darshana, Leeniyagala Gamaralalage Thamal; Uluwaduge, Deepthi Inoka

    2014-01-01

    This study was carried out to evaluate the diagnostic accuracy of the WHO color scale in screening for anemia during blood donor selection in Sri Lanka. A comparative cross-sectional study was conducted by the Medical Laboratory Sciences Unit of the University of Sri Jayewardenepura in collaboration with the National Blood Transfusion Centre, Sri Lanka. A total of 100 subjects participated in this study. The hemoglobin value of each participant was analyzed by both the WHO color scale method and the cyanmethemoglobin method. A Bland-Altman plot was used to determine the agreement between the two methods. Sensitivity, specificity, predictive values, and false-positive and false-negative rates were calculated. The sensitivity of the WHO color scale was very low. The highest sensitivity observed was 55.55% at hemoglobin concentrations >13.1 g/dL and the lowest was 28.57% at hemoglobin concentrations between 7.1 and 9.0 g/dL. The mean difference between the WHO color scale and the cyanmethemoglobin method was 0.2 g/dL (95% confidence interval: 3.2 g/dL above and 2.8 g/dL below). Even though the WHO color scale is an inexpensive and portable method for field studies, the overall results of this study indicate that it is an inaccurate method for screening anemia during blood donation. PMID:24839555
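
    The Bland-Altman comparison used here can be reproduced in a few lines. This generic sketch computes the bias and 95% limits of agreement from paired measurements; the variable names are illustrative.

      import numpy as np

      def bland_altman(a, b):
          """Mean difference (bias) and 95% limits of agreement between two methods."""
          diff = np.asarray(a) - np.asarray(b)
          bias = diff.mean()
          loa = 1.96 * diff.std(ddof=1)
          return bias, bias - loa, bias + loa

      # a: WHO color scale readings, b: cyanmethemoglobin results (g/dL), paired by subject
      print(bland_altman([12.0, 13.5, 11.0, 14.0], [11.8, 13.9, 10.5, 13.6]))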

  20. Thermogravimetric desorption and de novo tests I: method development and validation.

    PubMed

    Tsytsik, Palina; Czech, Jan; Carleer, Robert; Reggers, Guy; Buekens, Alfons

    2008-08-01

    Thermogravimetric analysis (TGA) has been combined with evolved gas analysis (EGA) to simulate the thermal behaviour of filter dust samples under inert (desorption) and oxidising (de novo test) conditions. Emphasis is on studying the de novo formation of dioxins, surrogates and precursors in filter dust derived from thermal processes such as municipal solid waste incineration and metallurgy. A new method is tested for sampling and analysing dioxin surrogates and precursors in the TGA effluent, which are collected on sampling tubes; the adsorbed compounds are then desorbed and quantified by TD-GC-MS. The major sources of error and losses are considered, including potential sorbent artefacts, possible breakthrough of volatiles through the sampling tubes, and possible losses of semi-volatiles due to incomplete desorption or re-condensation inside the TG analyser. The method is optimised and validated for di- to hexa-chlorinated benzenes in the range of 10-1000 ppb, with average recovery exceeding 85%. The results are compared with data obtained in similar studies performed by other research groups. The method thus provides the means for simulating de novo synthesis of dioxins in fly ash and facilitates reliable and easy estimation of de novo activity, comparable with the results of other studies, combined with wide flexibility in testing conditions. PMID:18556042

  1. PEM fuel cell fault detection and identification using differential method: simulation and experimental validation

    NASA Astrophysics Data System (ADS)

    Frappé, E.; de Bernardinis, A.; Bethoux, O.; Candusso, D.; Harel, F.; Marchand, C.; Coquery, G.

    2011-05-01

    PEM fuel cell performance and lifetime strongly depend on polymer membrane and MEA hydration. As the internal moisture is very sensitive to the operating conditions (temperature, stoichiometry, load current, water management…), keeping the optimal working point is complex and requires real-time monitoring. This article focuses on PEM fuel cell stack health diagnosis and, more precisely, on stack fault detection monitoring. It aims to define new, simple and effective methods for obtaining relevant information on the usual faults or malfunctions occurring in a fuel cell stack. For this purpose, the authors present a fault detection method using a simple and non-intrusive on-line technique based on the space signature of the cell voltages. The objective is to minimize the number of embedded sensors and the instrumentation in order to obtain a precise, reliable and economical solution for a mass-market application. Very few sensors are indeed needed for this monitoring, and the associated algorithm can be implemented on-line. The technique is validated on a 20-cell PEMFC stack and proves particularly efficient in the flooding case. In effect, it uses the stack itself as a sensor, which provides quick feedback on its state of health.
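
    The space-signature idea can be caricatured as looking for cells whose voltages drift away from the stack's spatial pattern. The sketch below flags cells deviating from the median by a chosen fraction; the threshold and the logic are hypothetical simplifications, not the authors' algorithm.

      import numpy as np

      def voltage_signature_alarm(cell_voltages, rel_threshold=0.05):
          """Flag cells whose voltage deviates from the stack median by more than
          rel_threshold (fractional). A growing low-voltage group may indicate flooding."""
          v = np.asarray(cell_voltages)
          deviation = (v - np.median(v)) / np.median(v)
          return np.where(deviation < -rel_threshold)[0]  # indices of suspect cells

      stack = [0.68] * 17 + [0.61, 0.60, 0.59]          # 20-cell stack, three cells sagging
      print(voltage_signature_alarm(stack))             # -> [17 18 19]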

  2. Validation of doubly labeled water method using a ruminant

    SciTech Connect

    Fancy, S.G.; Blanchard, J.M.; Holleman, D.F.; Kokjer, K.J.; White, R.G.

    1986-07-01

    CO₂ production (CDP, ml CO₂·g⁻¹·h⁻¹) by captive caribou and reindeer (Rangifer tarandus) was measured using the doubly labeled water method (³H₂O and H₂¹⁸O) and compared with CO₂ expiration rates (VCO₂), adjusted for CO₂ losses in CH₄ and urine, as determined by open-circuit respirometry. CDP calculated from samples of blood or urine from a reindeer in winter was 1-3% higher than the adjusted VCO₂. Differences between values derived by the two methods of 5-20% were found in summer trials with caribou. None of these differences were statistically significant (P > 0.05). Differences in summer could in part be explained by the net deposition of ³H, ¹⁸O, and unlabeled CO₂ in antlers and other growing tissues. Total body water volumes calculated from ³H₂O dilution were up to 15% higher than those calculated from H₂¹⁸O dilution. The doubly labeled water method appears to be a reasonably accurate method for measuring CDP by caribou and reindeer in winter when growth rates are low, but the method may overestimate CDP by rapidly growing and/or fattening animals.
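
    In outline, the doubly labeled water calculation estimates the two isotope elimination rates from the decay of enrichment over time and converts their divergence into CO₂ production. The sketch below uses the classical simplified relation rCO₂ ≈ (N/2)(kO − kH), ignoring isotope fractionation corrections; it illustrates the principle only, not the exact equations used in this study.

      import numpy as np

      def elimination_rate(enrichment, t_hours):
          """Slope of ln(enrichment) vs time gives the isotope elimination rate (1/h)."""
          return -np.polyfit(t_hours, np.log(enrichment), 1)[0]

      def co2_production(N_ml, k_o, k_h):
          """Simplified Lifson-McClintock relation (fractionation ignored):
          rCO2 = (N/2) * (kO - kH), with N the body-water pool (ml)."""
          return N_ml / 2.0 * (k_o - k_h)  # ml CO2 per hour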

  3. Validation of a Numerical Method for Determining Liner Impedance

    NASA Technical Reports Server (NTRS)

    Watson, Willie R.; Jones, Michael G.; Tanner, Sharon E.; Parrott, Tony L.

    1996-01-01

    This paper reports the initial results of a test series to evaluate a method for determining the normal incidence impedance of a locally reacting acoustically absorbing liner, located on the lower wall of a duct in a grazing incidence, multi-modal, non-progressive acoustic wave environment without flow. This initial evaluation is accomplished by testing the methods' ability to converge to the known normal incidence impedance of a solid steel plate, and to the normal incidence impedance of an absorbing test specimen whose impedance was measured in a conventional normal incidence tube. The method is shown to converge to the normal incident impedance values and thus to be an adequate tool for determining the impedance of specimens in a grazing incidence, multi-modal, nonprogressive acoustic wave environment for a broad range of source frequencies.

  4. Differences among methods to validate genomic evaluations for dairy cattle

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Two methods of testing predictions from genomic evaluations were investigated. Data used were from the April 2010 and August 2006 official USDA genetic evaluations of dairy cattle. The training data set consisted of both cows and bulls that were proven (had own or daughter information) as of Augus...

  5. Application of neural networks and geomorphometry method for purposes of urban planning (Kazan, Russia)

    NASA Astrophysics Data System (ADS)

    Yermolaev, Oleg; Selivanov, Renat

    2013-04-01

    The landscape structure of a territory imposes serious limitations on the adoption of certain planning decisions. Differentiation of the relief into separate elementary geomorphological sections yields the basis for the most adequate determination of the boundaries of urban geosystems. This paper presents the results of testing relief classification methods based on Artificial Neural Networks (ANN), namely Kohonen's Self-Organizing Maps (SOM), for the automated zoning of a modern city's territory, using the city of Kazan as an example. The developed model of the restored landscapes represents the city territory as a system of geomorphologically homogeneous terrains. The main research objectives were: development of a digital elevation model of the city of Kazan; testing of relief classification methods based on ANN and expert estimations; creation of a SOM-based map of urban geosystems; verification of the classification results, with clarification and enlargement of landscape units; and determination of the applicability of the method for zoning the territories of big cities, identifying its strengths and weaknesses. The first stage comprised analysis and digitization of a detailed large-scale topographic map of Kazan, producing a digital elevation model with a grid size of 10 m. These data were used to build analytical maps of several morphometric characteristics of the relief: height, slope, aspect, and profile and plan curvature. The calculated morphometric values were transformed into a data matrix. The software uses unsupervised training algorithms, redistributing weight coefficients for each operational-territorial unit. After several iterations of the training process, the neural network gradually clusters groups of operational-territorial units with similar sets of morphometric parameters. 81 classes have been distinguished. Such atomism
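
    For readers unfamiliar with SOMs, a compact NumPy sketch of Kohonen training on a matrix of per-unit morphometric values follows. The grid size, learning-rate and neighbourhood schedules are arbitrary illustrative choices, not the settings used in this study.

      import numpy as np

      def train_som(X, grid=(9, 9), iters=5000, lr0=0.5, sigma0=3.0, seed=0):
          """Minimal Kohonen SOM: X is (n_samples, n_features) of morphometric values."""
          rng = np.random.default_rng(seed)
          gy, gx = np.mgrid[0:grid[0], 0:grid[1]]
          coords = np.column_stack([gy.ravel(), gx.ravel()]).astype(float)
          W = rng.normal(size=(grid[0] * grid[1], X.shape[1]))
          for t in range(iters):
              x = X[rng.integers(len(X))]
              bmu = np.argmin(((W - x) ** 2).sum(axis=1))          # best-matching unit
              frac = t / iters
              lr, sigma = lr0 * (1 - frac), sigma0 * (1 - frac) + 0.5
              h = np.exp(-((coords - coords[bmu]) ** 2).sum(axis=1) / (2 * sigma ** 2))
              W += lr * h[:, None] * (x - W)                       # pull neighbourhood toward x
          return W  # each unit is a prototype; samples map to their nearest unit (class)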

  6. A Validation of Elements, Methods, and Barriers to Inclusive High School Service-Learning Programs

    ERIC Educational Resources Information Center

    Dymond, Stacy K.; Chun, Eul Jung; Kim, Rah Kyung; Renzaglia, Adelle

    2013-01-01

    A statewide survey of coordinators of inclusive high school service-learning programs was conducted to validate elements, methods, and barriers to including students with and without disabilities in service-learning. Surveys were mailed to 655 service-learning coordinators; 190 (29%) returned a completed survey. Findings support the validity of…

  7. Double Cross-Validation in Multiple Regression: A Method of Estimating the Stability of Results.

    ERIC Educational Resources Information Center

    Rowell, R. Kevin

    In multiple regression analysis, where resulting predictive equation effectiveness is subject to shrinkage, it is especially important to evaluate result replicability. Double cross-validation is an empirical method by which an estimate of invariance or stability can be obtained from research data. A procedure for double cross-validation is…
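
    A minimal sketch of double cross-validation with scikit-learn: the sample is split in half, a regression equation is developed on each half and scored on the other, and the drop from the fitted R² estimates shrinkage. The names and the 50/50 split are illustrative.

      import numpy as np
      from sklearn.linear_model import LinearRegression
      from sklearn.model_selection import train_test_split

      def double_cross_validate(X, y, seed=0):
          """Fit a regression on each half-sample and score it on the other half;
          the two cross-applied R^2 values indicate how much the equation shrinks."""
          Xa, Xb, ya, yb = train_test_split(X, y, test_size=0.5, random_state=seed)
          r2_ab = LinearRegression().fit(Xa, ya).score(Xb, yb)
          r2_ba = LinearRegression().fit(Xb, yb).score(Xa, ya)
          return r2_ab, r2_ba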

  8. Cost-Benefit Considerations in Choosing among Cross-Validation Methods.

    ERIC Educational Resources Information Center

    Murphy, Kevin R.

    1984-01-01

    Outlines costs and benefits associated with different cross-validation strategies; in particular the way in which the study design affects the cost and benefits of different types of cross-validation. Suggests that the choice between empirical estimation methods and formula estimates involves a trade-off between accuracy and simplicity. (JAC)

  9. A Framework for Mixing Methods in Quantitative Measurement Development, Validation, and Revision: A Case Study

    ERIC Educational Resources Information Center

    Luyt, Russell

    2012-01-01

    A framework for quantitative measurement development, validation, and revision that incorporates both qualitative and quantitative methods is introduced. It extends and adapts Adcock and Collier's work, and thus, facilitates understanding of quantitative measurement development, validation, and revision as an integrated and cyclical set of…

  10. Establishing Survey Validity and Reliability for American Indians Through “Think Aloud” and Test–Retest Methods

    PubMed Central

    Hauge, Cindy Horst; Jacobs-Knight, Jacque; Jensen, Jamie L.; Burgess, Katherine M.; Puumala, Susan E.; Wilton, Georgiana; Hanson, Jessica D.

    2015-01-01

    The purpose of this study was to use a mixed-methods approach to determine the validity and reliability of measurements used within an alcohol-exposed pregnancy prevention program for American Indian women. To develop validity, content experts provided input into the survey measures, and a “think aloud” methodology was conducted with 23 American Indian women. After revising the measurements based on this input, a test–retest was conducted with 79 American Indian women who were randomized to complete either the original measurements or the new, modified measurements. The test–retest revealed that some of the questions performed better for the modified version, whereas others appeared to be more reliable for the original version. The mixed-methods approach was a useful methodology for gathering feedback on survey measurements from American Indian participants and in indicating specific survey questions that needed to be modified for this population. PMID:25888693

  11. VALIDATION OF AN EMISSION MEASUREMENT METHOD FOR INORGANIC ARSENIC FROM STATIONARY SOURCES: PROPOSED METHOD 108. LABORATORY AND FIELD TEST EVALUATION

    EPA Science Inventory

    The United States Environmental Protection Agency (USEPA) has listed inorganic arsenic emissions as a hazardous air pollutant. USEPA proposed Method 108 for the measurement of these emissions from stationary sources has been subjected to validation studies in this work. Laborator...

  12. Validation needs of seismic probabilistic risk assessment (PRA) methods applied to nuclear power plants

    SciTech Connect

    Kot, C.A.; Srinivasan, M.G.; Hsieh, B.J.

    1985-01-01

    An effort to validate seismic PRA methods is in progress. The work concentrates on the validation of plant response and fragility estimates through the use of test data and information from actual earthquake experience. Validation needs have been identified in the areas of soil-structure interaction, structural response and capacity, and equipment fragility. Of particular concern is the adequacy of linear methodology to predict nonlinear behavior. While many questions can be resolved through the judicious use of dynamic test data, other aspects can only be validated by means of input and response measurements during actual earthquakes. A number of past, ongoing, and planned testing programs which can provide useful validation data have been identified, and validation approaches for specific problems are being formulated.

  13. System identification methods for aircraft flight control development and validation

    NASA Technical Reports Server (NTRS)

    Tischler, Mark B.

    1995-01-01

    System-identification methods compose a mathematical model, or series of models, from measurements of inputs and outputs of dynamic systems. The extracted models allow the characterization of the response of the overall aircraft or component subsystem behavior, such as actuators and on-board signal processing algorithms. This paper discusses the use of frequency-domain system-identification methods for the development and integration of aircraft flight-control systems. The extraction and analysis of models of varying complexity from nonparametric frequency-responses to transfer-functions and high-order state-space representations is illustrated using the Comprehensive Identification from FrEquency Responses (CIFER) system-identification facility. Results are presented for test data of numerous flight and simulation programs at the Ames Research Center including rotorcraft, fixed-wing aircraft, advanced short takeoff and vertical landing (ASTOVL), vertical/short takeoff and landing (V/STOL), tiltrotor aircraft, and rotor experiments in the wind tunnel. Excellent system characterization and dynamic response prediction is achieved for this wide class of systems. Examples illustrate the role of system-identification technology in providing an integrated flow of dynamic response data around the entire life-cycle of aircraft development from initial specifications, through simulation and bench testing, and into flight-test optimization.
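
    A small example of the nonparametric first step in frequency-domain system identification: estimating a frequency response from measured input/output records with Welch spectra. This is the standard H1 estimator, shown for illustration; it is not CIFER itself.

      import numpy as np
      from scipy import signal

      def h1_frf(u, y, fs, nperseg=1024):
          """H1 frequency-response estimate H1(f) = Puy(f) / Puu(f),
          from input u and output y sampled at fs."""
          f, Puu = signal.welch(u, fs=fs, nperseg=nperseg)
          _, Puy = signal.csd(u, y, fs=fs, nperseg=nperseg)
          return f, Puy / Puu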

  14. Experimental Validation for Hot Stamping Process by Using Taguchi Method

    NASA Astrophysics Data System (ADS)

    Fawzi Zamri, Mohd; Lim, Syh Kai; Razlan Yusoff, Ahmad

    2016-02-01

    The demand for reduced gas emissions, energy savings and safer vehicles has driven the development of Ultra High Strength Steel (UHSS). To be strengthened, UHSS material such as boron steel must undergo a hot stamping process involving heating at a certain temperature for a certain time. In this paper, the Taguchi method is applied to determine the appropriate parameters of thickness, heating temperature and heating time to achieve the optimum strength of boron steel. The experiment is conducted using a flat, square hot stamping tool with a tensile dog-bone specimen as the blank product. The tensile strength and hardness are then measured as responses. The results showed that lower thickness and higher heating temperature and heating time give higher strength and hardness in the final product. In conclusion, boron steel blanks are able to achieve up to 1200 MPa tensile strength and 650 HV hardness.
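
    In Taguchi analysis, each run's replicate responses are usually condensed into a signal-to-noise ratio; for strength and hardness, the larger-is-better form applies. A minimal sketch with illustrative values (not the paper's data):

      import numpy as np

      def sn_larger_is_better(y):
          """Taguchi larger-is-better signal-to-noise ratio for replicate responses y."""
          y = np.asarray(y, dtype=float)
          return -10.0 * np.log10(np.mean(1.0 / y**2))

      # e.g. tensile-strength replicates (MPa) for one orthogonal-array run
      print(sn_larger_is_better([1180, 1210, 1195]))  # higher S/N is better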

  15. Content Validity of National Post Marriage Educational Program Using Mixed Methods

    PubMed Central

    MOHAJER RAHBARI, Masoumeh; SHARIATI, Mohammad; KERAMAT, Afsaneh; YUNESIAN, Masoud; ESLAMI, Mohammad; MOUSAVI, Seyed Abbas; MONTAZERI, Ali

    2015-01-01

    Background: Although the content validity of programs is mostly assessed with qualitative methods, this study used both qualitative and quantitative methods to validate the content of the post-marriage training program provided for newly married couples. Content validation is a preliminary step in obtaining the authorization required to install the program in the country's health care system. Methods: This mixed-methods content validation study was carried out in four steps, with three expert panels. Altogether, 24 expert panelists were involved in the three qualitative and quantitative panels: 6 in the first, item-development panel; 12 in the reduction panel, 4 of whom were common with the first panel; and 10 executive experts in the last panel, organized to evaluate the psychometric properties (CVR and CVI) and face validity of 57 educational objectives. Results: The raw content of the post-marriage program had been written by professional experts of the Ministry of Health; through the qualitative expert panel, the content was further developed by generating 3 topics and refining one topic and its respective content. In the second panel, six further objectives were deleted, three for falling below the agreement cut-off point and three by experts' consensus. In the quantitative assessment, the validity of all items was above 0.8 and their content validity indices (0.8-1) were fully appropriate. Conclusion: This study provides good evidence for the validation and accreditation of the national post-marriage program planned for newly married couples in the country's health centers in the near future. PMID:26056672
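
    The CVR and CVI statistics used in the quantitative panels follow standard definitions (Lawshe's content validity ratio and the proportion of experts rating an item relevant). A small sketch with hypothetical ratings:

      def cvr(n_essential, n_panelists):
          """Lawshe content validity ratio: (ne - N/2) / (N/2)."""
          return (n_essential - n_panelists / 2) / (n_panelists / 2)

      def cvi(relevance_ratings):
          """Item-level CVI: share of experts rating the item relevant (3 or 4 on a 1-4 scale)."""
          return sum(r >= 3 for r in relevance_ratings) / len(relevance_ratings)

      print(cvr(9, 10), cvi([4, 3, 4, 4, 2, 4, 3, 4, 4, 4]))  # 0.8 and 0.9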

  16. Validity assessment of the detection method of maize event Bt10 through investigation of its molecular structure.

    PubMed

    Milcamps, Anne; Rabe, Scott; Cade, Rebecca; De Framond, Anic J; Henriksson, Peter; Kramer, Vance; Lisboa, Duarte; Pastor-Benito, Susana; Willits, Michael G; Lawrence, David; Van den Eede, Guy

    2009-04-22

    In March 2005, U.S. authorities informed the European Commission of the inadvertent release of the unauthorized maize GM event Bt10 into their market and, subsequently, the grain channel. In the United States, measures were taken to eliminate Bt10 from seed and grain supplies; in the European Union, an embargo on maize gluten and brewer's grain imports was implemented unless consignments were certified free of Bt10 using a Bt10-specific PCR detection method. With the aim of assessing the validity of the Bt10 detection method, an in-depth analysis of the molecular organization of the genetic modification of this event was carried out both by the company Syngenta, which produced the event, and by the European Commission Joint Research Centre, which validated the detection method. Using a variety of molecular analytical tools, both organizations found the genetic modification of event Bt10 to be very complex in structure, with rearrangements, inversions, and multiple copies of the structural elements (cry1Ab, pat, and the amp gene), interspersed with small genomic maize fragments. Southern blot analyses demonstrated that all Bt10 elements were tightly linked on one large fragment, including the region that would generate the event-specific PCR amplicon of the Bt10 detection method. This study proposes a hypothetical map of the insert of event Bt10 and concludes that the validated detection method for event Bt10 is fit for its purpose. PMID:19368351

  17. Validation of an evacuated canister method for measuring part-per-billion levels of chemical warfare agent simulants.

    PubMed

    Coffey, Christopher C; LeBouf, Ryan F; Calvert, Catherine A; Slaven, James E

    2011-08-01

    The National Institute for Occupational Safety and Health (NIOSH) research on direct-reading instruments (DRIs) needed an instantaneous sampling method to provide independent confirmation of the concentrations of chemical warfare agent (CWA) simulants. It was determined that evacuated canisters would be the method of choice. There is no method specifically validated for volatile organic compounds (VOCs) in the NIOSH Manual of Analytical Methods. The purpose of this study was to validate an evacuated canister method for sampling seven specific VOCs that can be used as a simulant for CWA agents (cyclohexane) or influence the DRI measurement of CWA agents (acetone, chloroform, methylene chloride, methyl ethyl ketone, hexane, and carbon tetrachloride [CCl4]). The method used 6-L evacuated stainless-steel fused silica-lined canisters to sample the atmosphere containing VOCs. The contents of the canisters were then introduced into an autosampler/preconcentrator using a microscale purge and trap (MPT) method. The MPT method trapped and concentrated the VOCs in the air sample and removed most of the carbon dioxide and water vapor. After preconcentration, the samples were analyzed using a gas chromatograph with a mass selective detector. The method was tested, evaluated, and validated using the NIOSH recommended guidelines. The evaluation consisted of determining the optimum concentration range for the method; the sample stability over 30 days; and the accuracy, precision, and bias of the method. This method meets the NIOSH guidelines for six of the seven compounds (excluding acetone) tested in the range of 2.3-50 parts per billion (ppb), making it suitable for sampling of these VOCs at the ppb level. PMID:21874953

  18. Development and validity of a method for the evaluation of printed education material

    PubMed Central

    Castro, Mauro Silveira; Pilger, Diogo; Fuchs, Flávio Danni; Ferreira, Maria Beatriz Cardoso

    Objectives: To develop and study the validity of an instrument for the evaluation of Printed Education Materials (PEM); to evaluate the use of acceptability indices; and to identify possible influences of professional background. Methods: An instrument for PEM evaluation was developed in three steps: domain identification, item generation and instrument design. An easy-to-read PEM was developed for the education of patients with systemic hypertension and its treatment with hydrochlorothiazide. Construct validity was measured on the basis of errors purposely introduced into the PEM, which served as extreme groups. An acceptability index was applied, taking into account the rate of professionals who should approve each item. Participants were 10 physicians (9 men) and 5 nurses (all women). Results: Many professionals identified the intentional errors of a crude character. Few participants identified errors that needed more careful evaluation, and no one detected the intentional error that required analysis of the literature. Physicians considered 95.8% of the items of the PEM acceptable, and nurses 29.2%. The differences between the scores were statistically significant for 27% of the items. In the overall evaluation, 66.6% were considered acceptable. The analysis of each item revealed a behavioral pattern for each professional group. Conclusions: The use of instruments for the evaluation of printed education materials is necessary and may improve the quality of the PEMs available to patients. Acceptability indices are not always fully correct, nor do they necessarily represent high-quality information. The professional experience, practice pattern, and perhaps the gender of the reviewers may influence their evaluation. An analysis of PEMs by professionals in communication and drug information, and by patients, should be carried out to improve the quality of the proposed material. PMID:25214924

  19. A validated high performance liquid chromatographic method for the analysis of Goldenseal.

    PubMed

    Li, Wenkui; Fitzloff, John F

    2002-03-01

    Goldenseal (Hydrastis canadensis L.) has emerged as one of the top ten herbal supplements on the worldwide market. A rapid, simple and validated high performance liquid chromatographic method, with photodiode array detection, has been developed for the analysis of commercial Goldenseal products. Samples were treated by sonication with acidified methanol/water. The method was validated for LOD, LOQ, linearity, reproducibility and recovery with good results. PMID:11902811

  20. 40 CFR Appendix D to Part 63 - Alternative Validation Procedure for EPA Waste and Wastewater Methods

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... validated according to the procedures in Sections 5.1 and 5.3 of Test Method 301, 40 CFR part 63, appendix A... Method 25D of 40 CFR part 60, appendix A. 2.1. Sampling and Analysis 2.1.1. For each waste matrix... 40 Protection of Environment 14 2011-07-01 2011-07-01 false Alternative Validation Procedure...

  1. Validity of Eye Movement Methods and Indices for Capturing Semantic (Associative) Priming Effects

    ERIC Educational Resources Information Center

    Odekar, Anshula; Hallowell, Brooke; Kruse, Hans; Moates, Danny; Lee, Chao-Yang

    2009-01-01

    Purpose: The purpose of this investigation was to evaluate the usefulness of eye movement methods and indices as a tool for studying priming effects by verifying whether eye movement indices capture semantic (associative) priming effects in a visual cross-format (written word to semantically related picture) priming paradigm. Method: In the…

  2. General purpose nonlinear system solver based on Newton-Krylov method.

    Energy Science and Technology Software Center (ESTSC)

    2013-12-01

    KINSOL is part of a software family called SUNDIALS: SUite of Nonlinear and Differential/Algebraic equation Solvers [1]. KINSOL is a general-purpose nonlinear system solver based on Newton-Krylov and fixed-point solver technologies [2].
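
    SciPy exposes a solver of the same Newton-Krylov family, which is convenient for illustrating how such a solver is called. This is not KINSOL itself, merely an analogous interface on a toy system.

      import numpy as np
      from scipy.optimize import newton_krylov

      def residual(x):
          # mildly nonlinear coupled system F(x) = 0; diagonally dominant, so the
          # Newton-Krylov iteration converges easily from a zero initial guess
          return x + 0.1 * np.sin(np.roll(x, 1)) - 1.0

      solution = newton_krylov(residual, np.zeros(5))
      print(solution, np.max(np.abs(residual(solution))))  # residual ~ 0 at the root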

  3. Bioanalytical method validation: concepts, expectations and challenges in small molecule and macromolecule--a report of PITTCON 2013 symposium.

    PubMed

    Bashaw, Edward D; DeSilva, Binodh; Rose, Mark J; Wang, Yow-Ming C; Shukla, Chinmay

    2014-05-01

    The concepts, importance, and implications of bioanalytical method validation have been discussed and debated for a long time. The recent high-profile issues related to bioanalytical method validation at both Cetero Houston and the former MDS Canada have brought this topic back into the limelight. Hence, a symposium on bioanalytical method validation, with the aim of revisiting the building blocks as well as discussing the challenges and implications for the bioanalysis of both small molecules and macromolecules, was featured at the PITTCON 2013 Conference and Expo. The symposium was cosponsored by the American Chemical Society (ACS)-Division of Analytical Chemistry and the Analysis and Pharmaceutical Quality (APQ) Section of the American Association of Pharmaceutical Scientists (AAPS), and featured leading speakers from the Food & Drug Administration (FDA), academia, and industry. In this symposium, the speakers shared several unique examples, and the session also provided a platform to discuss the need for continuous vigilance over bioanalytical methods during drug discovery and development. The purpose of this article is to provide a concise report on the materials that were presented. PMID:24700273

  4. Single Lab Validation of a LC/UV/FLD/MS Method for Simultaneous Determination of Water-soluble Vitamins in Multi-Vitamin Dietary Supplements

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The purpose of this study was to develop a Single-Lab Validated Method using high-performance liquid chromatography (HPLC) with different detectors (diode array detector - DAD, fluorescence detector - FLD, and mass spectrometer - MS) for determination of seven B-complex vitamins (B1 - thiamin, B2 – ...

  5. 40 CFR Appendix D to Part 63 - Alternative Validation Procedure for EPA Waste and Wastewater Methods

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... to Environmental Protection Agency methods developed by the Office of Water and the Office of... validated according to the procedures in Sections 5.1 and 5.3 of Test Method 301, 40 CFR part 63, appendix A... Method 25D of 40 CFR part 60, appendix A. 2.1. Sampling and Analysis 2.1.1. For each waste...

  6. 40 CFR Appendix D to Part 63 - Alternative Validation Procedure for EPA Waste and Wastewater Methods

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... validated according to the procedures in Sections 5.1 and 5.3 of Test Method 301, 40 CFR part 63, appendix A... Method 25D of 40 CFR part 60, appendix A. 2.1. Sampling and Analysis 2.1.1. For each waste matrix... EPA Waste and Wastewater Methods D Appendix D to Part 63 Protection of Environment...

  7. 40 CFR Appendix D to Part 63 - Alternative Validation Procedure for EPA Waste and Wastewater Methods

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... validated according to the procedures in Sections 5.1 and 5.3 of Test Method 301, 40 CFR part 63, appendix A... Method 25D of 40 CFR part 60, appendix A. 2.1. Sampling and Analysis 2.1.1. For each waste matrix... EPA Waste and Wastewater Methods D Appendix D to Part 63 Protection of Environment...

  8. 40 CFR Appendix D to Part 63 - Alternative Validation Procedure for EPA Waste and Wastewater Methods

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... validated according to the procedures in Sections 5.1 and 5.3 of Test Method 301, 40 CFR part 63, appendix A... Method 25D of 40 CFR part 60, appendix A. 2.1. Sampling and Analysis 2.1.1. For each waste matrix... EPA Waste and Wastewater Methods D Appendix D to Part 63 Protection of Environment...

  9. Alternative Methods for Validating Admissions and Course Placement Criteria. AIR 1995 Annual Forum Paper.

    ERIC Educational Resources Information Center

    Noble, Julie; Sawyer, Richard

    Correlational methods are compared to an alternative method based on decision theory and logistic regression for providing validity evidence for college admissions and course placement criteria. The advantages and limitations of both methods are examined. The correlation coefficient measures the strength of the linear statistical relationship…

  10. Evaluation of the quantitative performances of supercritical fluid chromatography: from method development to validation.

    PubMed

    Dispas, Amandine; Lebrun, Pierre; Ziemons, Eric; Marini, Roland; Rozet, Eric; Hubert, Philippe

    2014-08-01

    Recently, the number of papers about SFC has increased drastically, but little of this work has focused on the quantitative performance of the technique. In order to prove the potential of UHPSFC, the present work discusses the different steps of the analytical life cycle of a method: from development to validation and application. Moreover, the quantitative performance of UHPSFC was evaluated in comparison with UHPLC, which is the main technique used for quality control in the pharmaceutical industry and can therefore be considered a reference. The methods were developed using a Design Space strategy, leading to the optimization of a robust method. In this context, when the Design Space optimization guarantees quality, no further robustness study is required prior to validation. The methods were then geometrically transferred in order to reduce the analysis time. The UHPSFC and UHPLC methods were validated based on the total error approach using accuracy profiles. Even if UHPLC showed better precision and sensitivity, the UHPSFC method is able to give accurate results over a dosing range larger than the 80-120% range required by the European Medicines Agency. Consequently, UHPSFC results are valid and could be used for the control of the active substance in a finished pharmaceutical product. Finally, the validated UHPSFC method was used to analyse real samples and gave results similar to those of the reference method (UHPLC). PMID:24513349

  11. Fuzzy-logic based strategy for validation of multiplex methods: example with qualitative GMO assays.

    PubMed

    Bellocchi, Gianni; Bertholet, Vincent; Hamels, Sandrine; Moens, W; Remacle, José; Van den Eede, Guy

    2010-02-01

    This paper illustrates the advantages that a fuzzy-based aggregation method could bring into the validation of a multiplex method for GMO detection (DualChip GMO kit, Eppendorf). Guidelines for validation of chemical, bio-chemical, pharmaceutical and genetic methods have been developed and ad hoc validation statistics are available and routinely used, for in-house and inter-laboratory testing, and decision-making. Fuzzy logic allows summarising the information obtained by independent validation statistics into one synthetic indicator of overall method performance. The microarray technology, introduced for simultaneous identification of multiple GMOs, poses specific validation issues (patterns of performance for a variety of GMOs at different concentrations). A fuzzy-based indicator for overall evaluation is illustrated in this paper, and applied to validation data for different genetically modified elements. Remarks were drawn on the analytical results. The fuzzy-logic based rules were shown to be applicable to improve interpretation of results and facilitate overall evaluation of the multiplex method. PMID:19533405
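
    The core of such fuzzy aggregation is mapping each validation statistic onto a [0,1] membership and combining the memberships into one indicator. The sketch below uses linear memberships and a plain average; the expert-weighted rules of the actual method are richer, and the limits shown are hypothetical.

      def membership(value, favorable, unfavorable):
          """Linear fuzzy membership: 1 at/beyond the favorable limit, 0 at/beyond
          the unfavorable limit, linear in between."""
          t = (value - unfavorable) / (favorable - unfavorable)
          return min(1.0, max(0.0, t))

      # hypothetical module scores for one assay
      recovery_bias, rsd, tpr = 5.0, 4.0, 0.97    # |recovery-100| %, precision %, sensitivity
      scores = [membership(recovery_bias, 0, 20), # accuracy: smaller bias is better
                membership(rsd, 0, 15),           # precision: smaller RSD is better
                membership(tpr, 1.0, 0.8)]        # sensitivity: larger is better
      print(sum(scores) / len(scores))            # synthetic overall performance indicator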

  12. Determination of methylmercury in marine sediment samples: method validation and occurrence data.

    PubMed

    Carrasco, Luis; Vassileva, Emilia

    2015-01-01

    The determination of methylmercury (MeHg) in sediment samples is a difficult task due to the extremely low MeHg/THg (total mercury) ratio and species interconversion. Here, we present the method validation of a cost-effective fit-for-purpose analytical procedure for the measurement of MeHg in sediments, which is based on aqueous phase ethylation, followed by purge and trap and hyphenated gas chromatography-pyrolysis-atomic fluorescence spectrometry (GC-Py-AFS) separation and detection. Four different extraction techniques, namely acid and alkaline leaching followed by solvent extraction and evaporation, microwave-assisted extraction with 2-mercaptoethanol, and acid leaching, solvent extraction and back extraction into sodium thiosulfate, were examined regarding their potential to selectively extract MeHg from estuarine sediment IAEA-405 certified reference material (CRM). The procedure based on acid leaching with HNO3/CuSO4, solvent extraction and back extraction into Na2S2O3 yielded the highest extraction recovery, i.e., 94±3% and offered the possibility to perform the extraction of a large number of samples in a short time, by eliminating the evaporation step. The artifact formation of MeHg was evaluated by high performance liquid chromatography coupled to inductively coupled plasma mass spectrometry (HPLC-ICP-MS), using isotopically enriched Me(201)Hg and (202)Hg and it was found to be nonexistent. A full validation approach in line with ISO 17025 and Eurachem guidelines was followed. With this in mind, blanks, selectivity, working range (1-800 pg), linearity (0.9995), recovery (94-96%), repeatability (3%), intermediate precision (4%), limit of detection (0.45 pg) and limit of quantification (0.85 pg) were systematically assessed with CRM IAEA-405. The uncertainty budget was calculated and the major contribution to the combined uncertainty (16.24%, k=2) was found to arise from the uncertainty associated with recovery (74.1%). Demonstration of traceability of

  13. Development and validation of videotaped scenarios: a method for targeting specific participant groups.

    PubMed

    Noel, Nora E; Maisto, Stephen A; Johnson, James D; Jackson, Lee A; Goings, Christopher D; Hagman, Brett T

    2008-04-01

    Researchers using scenarios often neglect to validate perceived content and salience of embedded stimuli specifically with intended participants, even when such meaning is integral to the study. For example, sex and aggression stimuli are heavily influenced by culture, so participants may not perceive what researchers intended in sexual aggression scenarios. Using four studies, the authors describe the method of scenario validation to produce two videos assessing alcohol-related sexual aggression. Both videos are identical except for the presence in one video of antiforce cues that are extremely salient to the young heterosexual men. Focus groups and questionnaires validate these men's perceptions that (a) the woman was sexually interested, (b) the sexual cues were salient, (c) the antiforce cues were salient (antiaggression video only), and (e) these antiforce cues inhibited acceptance of forced sex. Results show the value of carefully selecting and validating content when assessing socially volatile variables and provide a useful template for developing culturally valid scenarios. PMID:18252938

  14. A Comparative Study of Information-Based Source Number Estimation Methods and Experimental Validations on Mechanical Systems

    PubMed Central

    Cheng, Wei; Zhang, Zhousuo; Cao, Hongrui; He, Zhengjia; Zhu, Guanwen

    2014-01-01

    This paper investigates one eigenvalue decomposition-based source number estimation method, and three information-based source number estimation methods, namely the Akaike Information Criterion (AIC), Minimum Description Length (MDL) and Bayesian Information Criterion (BIC), and improves BIC as Improved BIC (IBIC) to make it more efficient and easier for calculation. The performances of the abovementioned source number estimation methods are studied comparatively with numerical case studies, which contain a linear superposition case and a both linear superposition and nonlinear modulation mixing case. A test bed with three sound sources is constructed to test the performances of these methods on mechanical systems, and source separation is carried out to validate the effectiveness of the experimental studies. This work can benefit model order selection, complexity analysis of a system, and applications of source separation to mechanical systems for condition monitoring and fault diagnosis purposes. PMID:24776935
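
    The information criteria named above have compact closed forms based on the eigenvalues of the sample covariance matrix. As a hedged illustration, the sketch below follows the classic Wax-Kailath formulation of AIC/MDL source number estimation; it is a generic textbook version, not the authors' code, and all function and variable names are ours.

```python
import numpy as np

def estimate_source_number(X, criterion="MDL"):
    """Estimate the number of sources from multichannel data X (p channels x N samples)
    using eigenvalue-based information criteria (Wax-Kailath style sketch)."""
    p, N = X.shape
    # Eigenvalues of the sample covariance matrix, sorted in descending order
    eigvals = np.linalg.eigvalsh(np.cov(X))[::-1]
    eigvals = np.clip(eigvals, 1e-12, None)          # guard against log(0)
    scores = []
    for k in range(p):                               # candidate source counts 0..p-1
        tail = eigvals[k:]                           # the p-k smallest eigenvalues
        arith = tail.mean()
        geom = np.exp(np.log(tail).mean())
        loglik = N * (p - k) * np.log(arith / geom)  # log-likelihood ratio term
        if criterion == "AIC":
            penalty = k * (2 * p - k)
        else:                                        # MDL
            penalty = 0.5 * k * (2 * p - k) * np.log(N)
        scores.append(loglik + penalty)
    return int(np.argmin(scores))
```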

  15. VDA, a Method of Choosing a Better Algorithm with Fewer Validations

    PubMed Central

    Kluger, Yuval

    2011-01-01

    The multitude of bioinformatics algorithms designed for performing a particular computational task presents end-users with the problem of selecting the most appropriate computational tool for analyzing their biological data. The choice of the best available method is often based on expensive experimental validation of the results. We propose an approach to design validation sets for method comparison and performance assessment that are effective in terms of cost and discrimination power. Validation Discriminant Analysis (VDA) is a method for designing a minimal validation dataset to allow reliable comparisons between the performances of different algorithms. Implementation of our VDA approach achieves this reduction by selecting predictions that maximize the minimum Hamming distance between algorithmic predictions in the validation set. We show that VDA can be used to correctly rank algorithms according to their performances. These results are further supported by simulations and by realistic algorithmic comparisons in silico. VDA is a novel, cost-efficient method for minimizing the number of validation experiments necessary for reliable performance estimation and fair comparison between algorithms. Our VDA software is available at http://sourceforge.net/projects/klugerlab/files/VDA/ PMID:22046256
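
    The max-min Hamming distance selection that defines VDA lends itself to a simple greedy approximation. The sketch below is a hypothetical illustration of that idea; the released VDA software may implement the selection differently, and all names here are ours.

```python
import numpy as np

def select_validation_set(pred, m):
    """Greedily pick m candidate predictions (columns of the binary matrix `pred`,
    shape n_algorithms x n_candidates) so that the minimum pairwise Hamming
    distance between algorithms, restricted to the picked columns, is maximized."""
    n_alg, n_cand = pred.shape
    pairs = [(i, j) for i in range(n_alg) for j in range(i + 1, n_alg)]
    # disagree[p, c] == 1 when algorithm pair p disagrees on candidate c
    disagree = np.array([(pred[i] != pred[j]).astype(int) for i, j in pairs])
    chosen, dist = [], np.zeros(len(pairs), dtype=int)
    remaining = set(range(n_cand))
    for _ in range(m):
        # add the candidate that maximizes the resulting minimum pairwise distance
        best = max(remaining, key=lambda c: (dist + disagree[:, c]).min())
        chosen.append(best)
        remaining.remove(best)
        dist += disagree[:, best]
    return chosen
```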

  16. VDA, a method of choosing a better algorithm with fewer validations.

    PubMed

    Strino, Francesco; Parisi, Fabio; Kluger, Yuval

    2011-01-01

    The multitude of bioinformatics algorithms designed for performing a particular computational task presents end-users with the problem of selecting the most appropriate computational tool for analyzing their biological data. The choice of the best available method is often based on expensive experimental validation of the results. We propose an approach to design validation sets for method comparison and performance assessment that are effective in terms of cost and discrimination power. Validation Discriminant Analysis (VDA) is a method for designing a minimal validation dataset to allow reliable comparisons between the performances of different algorithms. Implementation of our VDA approach achieves this reduction by selecting predictions that maximize the minimum Hamming distance between algorithmic predictions in the validation set. We show that VDA can be used to correctly rank algorithms according to their performances. These results are further supported by simulations and by realistic algorithmic comparisons in silico. VDA is a novel, cost-efficient method for minimizing the number of validation experiments necessary for reliable performance estimation and fair comparison between algorithms. Our VDA software is available at http://sourceforge.net/projects/klugerlab/files/VDA/ PMID:22046256

  17. Development and validation of an ionic chromatography method for the determination of nitrate, nitrite and chloride in meat.

    PubMed

    Lopez-Moreno, Cristina; Perez, Isabel Viera; Urbano, Ana M

    2016-03-01

    The purpose of this study was to develop and validate a method for the analysis of certain preservatives in meat, and to obtain a suitable Certified Reference Material (CRM) for this task. The preservatives studied were NO3(-), NO2(-) and Cl(-), which serve as important antimicrobial agents in meat, inhibiting the growth of spoilage bacteria. The meat samples were prepared using a treatment that allowed the production of a CRM of known concentration that is highly homogeneous and stable over time. Matrix effects were also studied to evaluate their influence on the analytical signal for the ions of interest, showing that the matrix does not affect the final result. An assessment of the signal variation over time was carried out for the ions. Although the chloride and nitrate signals remained stable for the duration of the study, the nitrite signal decreased appreciably with time. A mathematical treatment of the data gave a stable nitrite signal, yielding a method suitable for the validation of these anions in meat. A statistical study was performed for the validation of the method, in which precision, accuracy, uncertainty and other parameters were evaluated, with satisfactory results. PMID:26471608

  18. School Discipline: Have We Lost Our Sense of Purpose in Our Search for a Good Method?

    ERIC Educational Resources Information Center

    Burton, Mary Alice Blanford

    The general economic and psychological evolution in America from a producer society to a consumer society has resulted in a conflict of purposes for American educators regarding school discipline. Consequently, contemporary American educators, unlike their forerunners, have ignored the long term social goals of classroom discipline. They have,…

  19. Validation of an in-vivo proton beam range check method in an anthropomorphic pelvic phantom using dose measurements

    SciTech Connect

    Bentefour, El H. Prieels, Damien; Tang, Shikui; Cascio, Ethan W.; Testa, Mauro; Lu, Hsiao-Ming; Samuel, Deepak; Gottschalk, Bernard

    2015-04-15

    Purpose: In-vivo dosimetry and beam range verification in proton therapy could play a significant role in validating and improving proton treatments. In-vivo beam range verification, in particular, could enable new treatment techniques, one of which could be the use of anterior fields for prostate treatment instead of the opposed lateral fields of current practice. This paper reports a validation study of an in-vivo range verification method which can reduce the range uncertainty to submillimeter levels and potentially allow for in-vivo dosimetry. Methods: An anthropomorphic pelvic phantom is used to validate the clinical potential of the time-resolved dose method for range verification in the case of prostate treatment using range-modulated anterior proton beams. The method uses a 3 × 4 matrix of 1 mm diodes mounted in a water balloon and read by an ADC system at 100 kHz. The method is first validated against beam range measurements obtained by dose extinction. The validation is first completed in a water phantom and then in the pelvic phantom, for both open-field and treatment-field configurations. The beam range results are then compared with the water equivalent path length (WEPL) values computed from the treatment planning system XIO. Results: Beam range measurements from the time-resolved dose method and the dose extinction method agree with submillimeter precision in the water phantom. For the pelvic phantom, when discarding two of the diodes that show signs of significant range mixing, the two methods agree within ±1 mm. A dose of only 7 mGy is sufficient to achieve this result. The comparison with the WEPL computed by the treatment planning system (XIO) shows that XIO underestimates the proton beam range. Quantifying the exact XIO range underestimation depends on the strategy used to evaluate the WEPL results; by our best evaluation, XIO underestimates the treatment beam range by between 1.7% and 4.1%. Conclusions: Time-resolved dose

  20. Wastewater standards and extraction chemistry in validation of microwave-assisted EPA method 3015A

    SciTech Connect

    Link, D.D.; Walter, P.J.; Kingston, H.M. . Dept. of Chemistry and Biochemistry)

    1999-07-15

    The difficulties associated with the control and transfer of environmental leach methods are discussed. Optimized EPA Method 3015A, a microwave-assisted leach of wastewater and drinking water matrices and aqueous extracts, is evaluated. The option to add HCl in addition to HNO3 provides better complexation and recovery of certain metals regulated by the Resource Conservation and Recovery Act (RCRA) than the original HNO3-only Method 3015. Also discussed is the preparation and appropriate use of simulated wastewater standards. Standard reference materials for a wastewater matrix are unavailable, and this novel approach provides NIST-traceability of results for the first time on this matrix type. Leach concentrations from these simulated standards were determined using both the 5 mL HNO3 option and the 4 mL HNO3 + 1 mL HCl option of new Method 3015A. Validation of the new mixed-acid option of Method 3015A has been provided by evaluating its performance on the 23 elements for which original Method 3015 was validated. In addition, validation is provided for boron, mercury, and strontium, elements that were not validated in original Method 3015. Method 3015A has been developed into a method capable of evaluating 26 elements in a single, efficient, 20-min procedure.

  1. Application of EU guidelines for the validation of screening methods for veterinary drugs.

    PubMed

    Stolker, Alida A M Linda

    2012-08-01

    Commission Decision (CD) 2002/657/EC describes detailed rules for method validation within the framework of residue monitoring programmes. The approach described in this CD is criteria-based. For (qualitative) screening methods, the most important criterion is that the CCβ has to be below any regulatory limit. Especially when microbiological or immunochemical methods are involved, the approach described in the CD is not easily applied. For example, such methods detect a large number of analytes (all antibiotics) within several different matrices (meat, milk, fish, eggs, etc.). It is not completely clear whether all those analytes and all matrices have to be taken into account during method validation. To clarify this, a working group of EU Reference Laboratories devised a practical approach to validating multi-analyte, multi-matrix screening methods. It describes how many analyte/matrix combinations have to be tested and how these combinations are selected. Furthermore, it describes how to determine CCβ for screening methods in relation to a large list of compounds and maximum residue limits (MRLs). First, for each analyte/matrix combination, the 'cut-off' level - i.e. the level at which the method separates blanks from contaminated samples - is established. The validation is preferably performed at a concentration of 50% of the regulatory limit, and a minimum set of 20 different samples has to be tested. Experience with applying these guidelines shows that the validation approach is very practical, with some caveats: one has to be careful when selecting 'representative' analytes and matrices, and it is strongly recommended to collect additional validation data during the routine application of the method. PMID:22851358
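
    For intuition, the cut-off and false-negative check described above might be coded as the simplified sketch below. The specific thresholding rule (blank mean plus a one-sided t-quantile) and all names are assumptions for illustration, not the EURL protocol itself.

```python
import numpy as np
from scipy import stats

def screening_cutoff(blank_responses, spiked_responses, beta=0.05):
    """Illustrative cut-off check for a qualitative screening method: the cut-off
    must separate blanks from samples spiked at the screening target concentration
    (e.g. 50% of the MRL) with a false-negative rate <= beta. Simplified sketch."""
    blanks = np.asarray(blank_responses, float)
    spiked = np.asarray(spiked_responses, float)
    # One-sided threshold above the blank population (95% one-tailed, t-based)
    cutoff = blanks.mean() + stats.t.ppf(0.95, len(blanks) - 1) * blanks.std(ddof=1)
    false_neg = np.mean(spiked <= cutoff)  # fraction of spiked samples missed
    return cutoff, false_neg, false_neg <= beta
```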

  2. Increased efficacy for in-house validation of real-time PCR GMO detection methods.

    PubMed

    Scholtens, I M J; Kok, E J; Hougs, L; Molenaar, B; Thissen, J T N M; van der Voet, H

    2010-03-01

    To improve the efficacy of the in-house validation of GMO detection methods (DNA isolation and real-time PCR, polymerase chain reaction), a study was performed to gain insight into the contribution of the different steps of the GMO detection method to the repeatability and in-house reproducibility. In the present study, 19 methods for (GM) soy, maize, canola and potato were validated in-house, of which 14 on the basis of an 8-day validation scheme using eight different samples and five on the basis of a more concise validation protocol. In this way, data were obtained with respect to the detection limit, accuracy and precision. Decision limits were also calculated for declaring non-conformance (>0.9%) with 95% reliability. In order to estimate the contribution of the different steps in the GMO analysis to the total variation, variance components were estimated using REML (residual maximum likelihood). From these components, relative standard deviations for repeatability and reproducibility (RSD(r) and RSD(R)) were calculated, as shown in the sketch below. The results showed that not only the PCR reaction but also the factors 'DNA isolation' and 'PCR day' are important contributors to the total variance and should therefore be included in the in-house validation. It is proposed to use a statistical model to estimate these factors from a large dataset of initial validations, so that for similar GMO methods in the future only the PCR step needs to be validated. The resulting data are discussed in the light of agreed European criteria for qualified GMO detection methods. PMID:20012027
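
    Once variance components for the individual steps have been estimated (e.g. by REML), combining them into RSD(r) and RSD(R) is simple arithmetic. The sketch below assumes a three-component model (PCR, DNA isolation, PCR day); the component names are hypothetical and the paper's exact model may differ.

```python
import numpy as np

def repeatability_reproducibility(var_pcr, var_isolation, var_day, mean_conc):
    """Combine variance components into relative standard deviations for
    repeatability (within-run only) and in-house reproducibility (all steps)."""
    rsd_r = np.sqrt(var_pcr) / mean_conc
    rsd_R = np.sqrt(var_pcr + var_isolation + var_day) / mean_conc
    return rsd_r, rsd_R
```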

  3. Validation of High-Performance Thin-Layer Chromatographic Methods for the Identification of Botanicals in a cGMP Environment

    PubMed Central

    REICH, EIKE; SCHIBLI, ANNE; DEBATT, ALISON

    2009-01-01

    Current Good Manufacturing Practice (cGMP) for botanicals stipulates the use of appropriate methods for the identification of raw materials. Due to natural variability, chemical analysis of plant material is a great challenge and requires special approaches. This paper presents a comprehensive proposal for the process of validating qualitative high-performance thin-layer chromatographic (HPTLC) methods, proving that such methods are suitable for the purpose. The steps of the validation process are discussed and illustrated with examples taken from a project aiming at the validation of methods for the identification of green tea leaf, ginseng root, eleuthero root, echinacea root, black cohosh rhizome, licorice root, kava root, milk thistle aerial parts, feverfew aerial parts, and ginger root. The appendix of the paper, which includes complete documentation and method write-ups for those plants, is available on the J. AOAC Int. Website (http://www.atypon-link.com/AOAC/loi/jaoi). PMID:18376581

  4. Verification, Validation, and Solution Quality in Computational Physics: CFD Methods Applied to Ice Sheet Physics

    NASA Technical Reports Server (NTRS)

    Thompson, David E.

    2005-01-01

    Procedures and methods for verification of coding algebra and for validation of models and calculations used in the aerospace computational fluid dynamics (CFD) community would be efficacious if used by the glacier dynamics modeling community. This paper presents some of those methods, and how they might be applied to uncertainty management supporting code verification and model validation for glacier dynamics. The similarities and differences between their use in CFD analysis and the proposed application of these methods to glacier modeling are discussed. After establishing sources of uncertainty and methods for code verification, the paper looks at a representative sampling of verification and validation efforts that are underway in the glacier modeling community, and establishes a context for these within an overall solution quality assessment. Finally, a vision of a new information architecture and interactive scientific interface is introduced and advocated.

  5. Validation procedures for quantitative gluten ELISA methods: AOAC allergen community guidance and best practices.

    PubMed

    Koerner, Terry B; Abbott, Michael; Godefroy, Samuel Benrejeb; Popping, Bert; Yeung, Jupiter M; Diaz-Amigo, Carmen; Roberts, James; Taylor, Steve L; Baumert, Joseph L; Ulberth, Franz; Wehling, Paul; Koehler, Peter

    2013-01-01

    The food allergen analytical community is endeavoring to create harmonized guidelines for the validation of food allergen ELISA methodologies to help protect food-sensitive individuals and promote consumer confidence. This document provides additional guidance to existing method validation publications for quantitative food allergen ELISA methods. The gluten-specific criteria provided in this document are divided into sections for information required by the method developer about the assay and information for the implementation of the multilaboratory validation study. Many of these recommendations and guidance are built upon the widely accepted Codex Alimentarius definitions and recommendations for gluten-free foods. The information in this document can be used as the basis of a harmonized validation protocol for any ELISA method for gluten, whether proprietary or nonproprietary, that will be submitted to AOAC and/or regulatory authorities or other bodies for status recognition. Future work is planned for the implementation of this guidance document for the validation of gluten methods and the creation of gluten reference materials. PMID:24282943

  6. Bridging the gap between comprehensive extraction protocols in plant metabolomics studies and method validation.

    PubMed

    Bijttebier, Sebastiaan; Van der Auwera, Anastasia; Foubert, Kenn; Voorspoels, Stefan; Pieters, Luc; Apers, Sandra

    2016-09-01

    It is vital to pay much attention to the design of extraction methods developed for plant metabolomics, as any non-extracted or converted metabolites will greatly affect the overall quality of the metabolomics study. Method validation is however often omitted in plant metabolome studies, as the well-established methodologies for classical targeted analyses such as recovery optimization cannot be strictly applied. The aim of the present study is to thoroughly evaluate state-of-the-art comprehensive extraction protocols for plant metabolomics with liquid chromatography-photodiode array-accurate mass mass spectrometry (LC-PDA-amMS) by bridging the gap with method validation. Validation of an extraction protocol in untargeted plant metabolomics should ideally be accomplished by validating the protocol for all possible outcomes, i.e. for all secondary metabolites potentially present in the plant. In an effort to approach this ideal validation scenario, two plant matrices were selected based on their wide versatility of phytochemicals: meadowsweet (Filipendula ulmaria) for its polyphenols content, and spicy paprika powder (from the genus Capsicum) for its apolar phytochemicals content (carotenoids, phytosterols, capsaicinoids). These matrices were extracted with comprehensive extraction protocols adapted from literature and analysed with a generic LC-PDA-amMS characterization platform that was previously validated for broad range phytochemical analysis. The performance of the comprehensive sample preparation protocols was assessed based on extraction efficiency, repeatability and intermediate precision and on ionization suppression/enhancement evaluation. The manuscript elaborates on the finding that none of the extraction methods allowed to exhaustively extract the metabolites. Furthermore, it is shown that depending on the extraction conditions enzymatic degradation mechanisms can occur. Investigation of the fractions obtained with the different extraction methods

  7. External Standards or Standard Additions? Selecting and Validating a Method of Standardization.

    ERIC Educational Resources Information Center

    Harvey, David

    2002-01-01

    Reports an experiment which is suitable for an introductory course in analytical chemistry and which illustrates the importance of matrix effects when selecting a method of standardization. Asserts that students learn how a spike recovery is used to validate an analytical method, and obtain practical experience in the difference between performing…

  8. Multi-laboratory validation of a standard method for quantifying proanthocyanidins in cranberry powders

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The objective of this study was to validate an improved 4-dimethylaminocinnamaldehyde (DMAC) colorimetric method using a commercially available standard (procyanidin A2), for the standard method for quantification of proanthocyanidins (PACs) in cranberry powders, in order to establish dosage guideli...

  9. Method of validating measurement data of a process parameter from a plurality of individual sensor inputs

    DOEpatents

    Scarola, Kenneth; Jamison, David S.; Manazir, Richard M.; Rescorl, Robert L.; Harmon, Daryl L.

    1998-01-01

    A method for generating a validated measurement of a process parameter at a point in time by using a plurality of individual sensor inputs from a scan of said sensors at said point in time. The sensor inputs from said scan are stored and a first validation pass is initiated by computing an initial average of all stored sensor inputs. Each sensor input is deviation-checked by comparing it, including a preset tolerance, against the initial average. If the first deviation check is unsatisfactory, the sensor which produced the unsatisfactory input is flagged as suspect. It is then determined whether at least two of the inputs have not been flagged as suspect and are therefore considered good inputs. If two or more inputs are good, a second validation pass is initiated by computing a second average of all the good sensor inputs, and deviation-checking the good inputs by comparing each good input, including a preset tolerance, against the second average. If the second deviation check is satisfactory, the second average is displayed as the validated measurement and the suspect sensors are flagged as bad. A validation fault occurs if at least two inputs are not considered good, or if the second deviation check is not satisfactory. In the latter situation the inputs from all the sensors are compared against the last validated measurement, and the value from the sensor input that deviates least from the last valid measurement is displayed.
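
    The two-pass procedure summarized above maps directly onto code. The following is a minimal sketch of that logic as described in the abstract; the scalar tolerance and all names are simplifications of ours, not the patent's implementation.

```python
def validate_measurement(inputs, tolerance, last_valid):
    """Two-pass validation of redundant sensor inputs from one scan.
    Returns (validated_value, fault_flag)."""
    first_avg = sum(inputs) / len(inputs)
    # First pass: any sensor whose reading misses the average by more than the
    # preset tolerance is flagged as suspect
    good = [x for x in inputs if abs(x - first_avg) <= tolerance]
    if len(good) >= 2:
        second_avg = sum(good) / len(good)
        # Second pass: deviation-check the good inputs against their own average
        if all(abs(x - second_avg) <= tolerance for x in good):
            return second_avg, False
    # Validation fault: display the input closest to the last valid measurement
    fallback = min(inputs, key=lambda x: abs(x - last_valid))
    return fallback, True
```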

  10. Tutorial review on validation of liquid chromatography-mass spectrometry methods: part I.

    PubMed

    Kruve, Anneli; Rebane, Riin; Kipper, Karin; Oldekop, Maarja-Liisa; Evard, Hanno; Herodes, Koit; Ravio, Pekka; Leito, Ivo

    2015-04-22

    This is Part I of a tutorial review intended to give an overview of the state of the art of method validation in liquid chromatography-mass spectrometry (LC-MS) and to discuss specific issues that arise with MS (and MS/MS) detection in LC (as opposed to "conventional" detectors). Part I briefly introduces the principles of operation of LC-MS (emphasizing the aspects important from the validation point of view, in particular the ionization process and ionization suppression/enhancement); reviews the main validation guideline documents; and discusses in detail the following performance parameters: selectivity/specificity/identity, ruggedness/robustness, limit of detection, limit of quantification, decision limit and detection capability. For every method performance characteristic, its essence and terminology are addressed, the current status of its treatment is reviewed, and recommendations are given on how to determine it, specifically in the case of LC-MS methods. PMID:25819785

  11. Bioanalytical method validation considerations for LC-MS/MS assays of therapeutic proteins.

    PubMed

    Duggan, Jeffrey X; Vazvaei, Faye; Jenkins, Rand

    2015-01-01

    This paper highlights the recommendations of a group of industry scientists in validating regulated bioanalytical LC-MS/MS methods for protein therapeutics in a 2015 AAPSJ White Paper. This group recommends that most of the same precision and accuracy validation criteria used for ligand-binding assays (LBAs) be applied to LC-MS/MS-based assays where proteins are quantified using the LC-MS/MS signal from a surrogate peptide after proteolytic digestion (PrD-LCMS methods). PrD-LCMS methods are generally more complex than small molecule LC-MS/MS assays and may often include LBA procedures, leading to the recommendation for a combination of chromatographic and LBA validation strategies and appropriate acceptance criteria. Several key aspects of this bioanalytical approach that are discussed in the White Paper are treated here in additional detail. These topics include selectivity/specificity, matrix effect, digestion efficiency, stability and critical reagent considerations. PMID:26110712

  12. Display format, highlight validity, and highlight method: Their effects on search performance

    NASA Technical Reports Server (NTRS)

    Donner, Kimberly A.; Mckay, Tim D.; Obrien, Kevin M.; Rudisill, Marianne

    1991-01-01

    Display format and highlight validity have been shown to affect visual display search performance; however, previous studies were conducted on small, artificial displays of alphanumeric stimuli. A study manipulating these variables was conducted using realistic, complex Space Shuttle information displays. A 2x2x3 within-subjects analysis of variance found that search times were faster for items in reformatted displays than in current displays. Responses to valid applications of highlighting were significantly faster than responses to non-highlighted or invalidly highlighted applications. The significant format-by-highlight-validity interaction showed that there was little difference in response time between current and reformatted displays when highlighting was validly applied; however, under the non-highlighted or invalid highlight conditions, search times were faster with reformatted displays. A separate within-subjects analysis of variance of display format, highlight validity, and several highlight methods did not reveal a main effect of highlight method. In addition, observed display search times were compared to search times predicted by Tullis' Display Analysis Program. The benefits of highlighting and reformatting displays to enhance search, and the necessity of considering highlight validity and format characteristics in tandem when predicting search performance, are discussed.

  13. Validity and Feasibility of a Digital Diet Estimation Method for Use with Preschool Children: A Pilot Study

    ERIC Educational Resources Information Center

    Nicklas, Theresa A.; O'Neil, Carol E.; Stuff, Janice; Goodell, Lora Suzanne; Liu, Yan; Martin, Corby K.

    2012-01-01

    Objective: The goal of the study was to assess the validity and feasibility of a digital diet estimation method for use with preschool children in "Head Start." Methods: Preschool children and their caregivers participated in validation (n = 22) and feasibility (n = 24) pilot studies. Validity was determined in the metabolic research unit using…

  14. 40 CFR 712.5 - Method of identification of substances for reporting purposes.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... chemical have no reporting responsibilities under this Part. Note, however, that any method of extraction... (CONTINUED) TOXIC SUBSTANCES CONTROL ACT CHEMICAL INFORMATION RULES General Provisions § 712.5 Method of... otherwise required, respondents must report only about quantities of a chemical that is defined as...

  15. 40 CFR 712.5 - Method of identification of substances for reporting purposes.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... chemical have no reporting responsibilities under this Part. Note, however, that any method of extraction... (CONTINUED) TOXIC SUBSTANCES CONTROL ACT CHEMICAL INFORMATION RULES General Provisions § 712.5 Method of... otherwise required, respondents must report only about quantities of a chemical that is defined as...

  16. 27 CFR 46.261 - Purpose of an alternate method or procedure.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... TOBACCO PRODUCTS AND CIGARETTE PAPERS AND TUBES Floor Stocks Tax on Certain Tobacco Products, Cigarette Papers, and Cigarette Tubes Held for Sale on April 1, 2009 Alternate Methods Or Procedures § 46.261... method or procedure. 46.261 Section 46.261 Alcohol, Tobacco Products and Firearms ALCOHOL AND TOBACCO...

  17. Use of glyceroltriheptanoate as marker for processed animal by-products: development and validation of an analytical method.

    PubMed

    von Holst, C; Boix, A; Bellorini, S; Serano, F; Androni, S; Verkuylen, B; Margry, R

    2009-04-01

    A recently published European Regulation requires that the artificial marker glycerol triheptanoate (GTH) be added to processed animal by-products (ABPs) prohibited from entering the food chain. The objective of this new requirement is to allow full traceability and ensure that these materials are disposed of in a proper way. Here, we report the development and single-laboratory validation of an analytical method for the determination of GTH in meat and bone meal plus animal fat. The method comprises three steps: (1) extraction of GTH from the samples with petroleum ether when analysing meat and bone meal, or dissolution of the sample in n-hexane when analysing fat; (2) clean-up of the extract using commercially available SPE cartridges; (3) determination of GTH by GC/MS or GC with flame ionisation detection (FID). The results of the validation study demonstrated that the relative standard deviation for intermediate precision varied between 2.5 and 8.2%, depending on GTH concentration and the detector utilised. In all cases, the relative recovery rate was above 96%. The limit of quantification was 16 mg kg(-1) (GTH/fat content of the sample) with MS as detector and 20 mg kg(-1) with FID. Moreover, the method has been successfully applied in a second laboratory, indicating its transferability. Considering the minimum GTH concentration in ABPs of 250 mg kg(-1), the method is considered suitable for the intended purpose and can be utilised by EU Member State laboratories for official control and monitoring. PMID:19680920

  18. 27 CFR 46.261 - Purpose of an alternate method or procedure.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... TOBACCO PRODUCTS AND CIGARETTE PAPERS AND TUBES Floor Stocks Tax on Certain Tobacco Products, Cigarette Papers, and Cigarette Tubes Held for Sale on April 1, 2009 Alternate Methods Or Procedures §...

  19. 27 CFR 46.261 - Purpose of an alternate method or procedure.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... TOBACCO PRODUCTS AND CIGARETTE PAPERS AND TUBES Floor Stocks Tax on Certain Tobacco Products, Cigarette Papers, and Cigarette Tubes Held for Sale on April 1, 2009 Alternate Methods Or Procedures §...

  20. 27 CFR 46.261 - Purpose of an alternate method or procedure.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... TOBACCO PRODUCTS AND CIGARETTE PAPERS AND TUBES Floor Stocks Tax on Certain Tobacco Products, Cigarette Papers, and Cigarette Tubes Held for Sale on April 1, 2009 Alternate Methods Or Procedures §...

  1. 27 CFR 46.261 - Purpose of an alternate method or procedure.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... TOBACCO PRODUCTS AND CIGARETTE PAPERS AND TUBES Floor Stocks Tax on Certain Tobacco Products, Cigarette Papers, and Cigarette Tubes Held for Sale on April 1, 2009 Alternate Methods Or Procedures §...

  2. From the Bronx to Bengifunda (and Other Lines of Flight): Deterritorializing Purposes and Methods in Science Education Research

    ERIC Educational Resources Information Center

    Gough, Noel

    2011-01-01

    In this essay I explore a number of questions about purposes and methods in science education research prompted by my reading of Wesley Pitts' ethnographic study of interactions among four students and their teacher in a chemistry classroom in the Bronx, New York City. I commence three "lines of flight" (small acts of Deleuzo-Guattarian…

  3. Validated spectrofluorimetric method for the determination of clonazepam in pharmaceutical preparations.

    PubMed

    Ibrahim, Fawzia; El-Enany, Nahed; Shalan, Shereen; Elsharawy, Rasha

    2016-05-01

    A simple, highly sensitive and validated spectrofluorimetric method was applied to the determination of clonazepam (CLZ). The method is based on reduction of the nitro group of clonazepam with zinc/CaCl2; the product is then reacted with 2-cyanoacetamide (2-CNA) in the presence of ammonia (25%), yielding a highly fluorescent product. The produced fluorophore exhibits strong fluorescence intensity at λem = 383 nm after excitation at λex = 333 nm. The method was rectilinear over a concentration range of 0.1-0.5 ng/mL with a limit of detection (LOD) of 0.0057 ng/mL and a limit of quantification (LOQ) of 0.017 ng/mL. The method was fully validated according to ICH guidelines and successfully applied to the determination of CLZ in its tablets, with a mean percentage recovery of 100.10 ± 0.75%. Statistical analysis of the results obtained using the proposed method showed no significant difference from those obtained using a reference method in terms of accuracy and precision. Copyright © 2015 John Wiley & Sons, Ltd. PMID:26335592
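
    Since validation followed ICH guidelines, the reported LOD and LOQ were presumably derived from calibration data via the customary 3.3σ/S and 10σ/S formulas. The sketch below shows that generic computation; it is not the authors' own calculation, and all names are ours.

```python
import numpy as np

def ich_lod_loq(conc, response):
    """LOD/LOQ from a calibration line per the common ICH Q2 formulas
    LOD = 3.3*sigma/S and LOQ = 10*sigma/S, where S is the slope and sigma
    the residual standard deviation of the regression."""
    conc, response = np.asarray(conc, float), np.asarray(response, float)
    slope, intercept = np.polyfit(conc, response, 1)
    residuals = response - (slope * conc + intercept)
    sigma = np.sqrt(np.sum(residuals**2) / (len(conc) - 2))  # residual SD
    return 3.3 * sigma / slope, 10.0 * sigma / slope
```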

  4. Interlaboratory Validation of the Leaching Environmental Assessment Framework (LEAF) Method 1313 and Method 1316

    EPA Science Inventory

    This document summarizes the results of an interlaboratory study conducted to generate precision estimates for two parallel batch leaching methods which are part of the Leaching Environmental Assessment Framework (LEAF). These methods are: (1) Method 1313: Liquid-Solid Partition...

  5. Developing and Validating the Youth Conduct Problems Scale-Rwanda: A Mixed Methods Approach

    PubMed Central

    Ng, Lauren C.; Kanyanganzi, Frederick; Munyanah, Morris; Mushashi, Christine; Betancourt, Theresa S.

    2014-01-01

    This study developed and validated the Youth Conduct Problems Scale-Rwanda (YCPS-R). Qualitative free listing (n = 74) and key informant interviews (n = 47) identified local conduct problems, which were compared to existing standardized conduct problem scales and used to develop the YCPS-R. The YCPS-R was cognitively tested by 12 youth and caregiver participants, and assessed for test-retest and inter-rater reliability in a sample of 64 youth. Finally, a purposive sample of 389 youth and their caregivers were enrolled in a validity study. Validity was assessed by comparing YCPS-R scores to conduct disorder, which was diagnosed with the Mini International Neuropsychiatric Interview for Children, and to functional impairment scores on the World Health Organization Disability Assessment Schedule Child Version. ROC analyses assessed the YCPS-R's ability to discriminate between youth with and without conduct disorder. Qualitative data identified a local presentation of youth conduct problems that did not match previously standardized measures. Therefore, the YCPS-R was developed solely from local conduct problems. Cognitive testing indicated that the YCPS-R was understandable and required little modification. The YCPS-R demonstrated good reliability, construct, criterion, and discriminant validity, and fair classification accuracy. The YCPS-R is a locally derived measure of Rwandan youth conduct problems that demonstrated good psychometric properties and could be used for further research. PMID:24949628

  6. Validation of the Eriksen method for the exact Foldy-Wouthuysen representation

    NASA Astrophysics Data System (ADS)

    Silenko, A. Ya.

    2013-05-01

    The Eriksen method is proven to yield a correct and exact result when a sufficient condition for exact transformation to the Foldy-Wouthuysen (FW) representation is satisfied. Therefore, the Eriksen method is confirmed as valid. This makes it possible to establish the limits within which the approximate "step-by-step" methods are applicable. The latter is done by comparing the relativistic formulas for a Hamiltonian operator in the FW representation (obtained using those methods) with the known expression for the first terms of the series defining the expansion of this operator in powers of v/c, as found by applying the Eriksen method.

  7. Suitability of analytical methods to measure solubility for the purpose of nanoregulation.

    PubMed

    Tantra, Ratna; Bouwmeester, Hans; Bolea, Eduardo; Rey-Castro, Carlos; David, Calin A; Dogné, Jean-Michel; Jarman, John; Laborda, Francisco; Laloy, Julie; Robinson, Kenneth N; Undas, Anna K; van der Zande, Meike

    2016-01-01

    Solubility is an important physicochemical parameter in nanoregulation. If a nanomaterial is completely soluble, then from a risk assessment point of view its disposal can be treated in much the same way as that of "ordinary" chemicals, which simplifies testing and characterisation regimes. This review assesses potential techniques for the measurement of nanomaterial solubility and evaluates their performance against a set of analytical criteria (based on satisfying the requirements of the cosmetics regulation as well as the need to quantify the concentration of free (hydrated) ions). Our findings show that no universal method exists. A complementary approach is thus recommended, comprising an atomic spectrometry-based method in conjunction with an electrochemical (or colorimetric) method. This article shows that although some techniques are more commonly used than others, a huge research gap remains, related to the need to ensure data reliability. PMID:26001188

  8. Factor analysis methods and validity evidence: A systematic review of instrument development across the continuum of medical education

    NASA Astrophysics Data System (ADS)

    Wetzel, Angela Payne

    Previous systematic reviews indicate a lack of reporting of reliability and validity evidence in subsets of the medical education literature. Psychology and general education reviews of factor analysis also indicate gaps between current and best practices; yet, a comprehensive review of exploratory factor analysis in instrument development across the continuum of medical education had not been previously conducted. Therefore, the purpose of this study was a critical review of instrument development articles employing exploratory factor or principal component analysis published in medical education (2006-2010), to describe and assess the reporting of methods and validity evidence based on the Standards for Educational and Psychological Testing and factor analysis best practices. Data extraction from 64 articles measuring a variety of constructs published throughout the peer-reviewed medical education literature indicates significant errors in the translation of exploratory factor analysis best practices to current practice. Further, techniques for establishing validity evidence tend to derive from a limited scope of methods, including reliability statistics to support internal structure and support for test content. Instruments reviewed for this study lacked supporting evidence based on relationships with other variables and response process, and evidence based on consequences of testing was not evident. Findings suggest a need for further professional development within the medical education researcher community related to (1) appropriate factor analysis methodology and reporting and (2) the importance of pursuing multiple sources of reliability and validity evidence to construct a well-supported argument for the inferences made from the instrument. Medical education researchers and educators should be cautious in adopting instruments from the literature and carefully review available evidence. Finally, editors and reviewers are encouraged to recognize

  9. Review of gaseous methods of killing poultry on-farm for disease control purposes.

    PubMed

    Raj, A B M; Sandilands, V; Sparks, N H C

    2006-08-19

    Poultry may need to be culled in the event of an outbreak of disease. Gassing has advantages over mechanical and electrical methods or overdoses of anaesthetics because large numbers can be killed simultaneously and little or no handling of the birds is required. However, gaseous killing methods may have welfare implications for the birds, which may find various gases more or less aversive, may undergo respiratory distress and/or experience convulsions, and may remain conscious for a considerable time before they die. In addition, the gases used may present health and safety risks to human operators, and be difficult to supply and deliver. PMID:16921011

  10. Experimental validation of applied strain sensors: importance, methods and still unsolved challenges

    NASA Astrophysics Data System (ADS)

    Habel, Wolfgang R.; Schukar, Vivien G.; Mewis, Franziska; Kohlhoff, Harald

    2013-09-01

    Fiber-optic strain sensors are increasingly used in very different technical fields. Sensors are provided with specifications defined by the manufacturer or ascertained by the interested user. If deformation sensors are to be used to evaluate the long-term behavior of safety-relevant structures or to monitor critical structure components, their performance and signal stability must be of high quality to enable reliable data recording. The measurement system must therefore be validated according to established technical rules and standards before its application and after. In some cases, not all details of the complex characteristic and performance of applied fiber-optic sensors are sufficiently understood, or can be validated because of a lack of knowledge and methods to check the sensors' behavior. This contribution focusses therefore on the importance of serious validation in avoiding a decrease or even deterioration of the sensors' function. Methods for validation of applied sensors are discussed and should reveal weaknesses in validation of embedded or integrated fiber-optic deformation and/or strain sensors. An outlook to some research work that has to be carried out to ensure a well-accepted practical use of fiber-optic sensors is given.

  11. A validation framework for microbial forensic methods based on statistical pattern recognition

    SciTech Connect

    Velsko, S P

    2007-11-12

    This report discusses a general approach to validating microbial forensic methods that attempt to simultaneously distinguish among many hypotheses concerning the manufacture of a questioned biological agent sample. It focuses on the concrete example of determining growth medium from chemical or molecular properties of a bacterial agent to illustrate the concepts involved.

  12. An Empirically Based Method of Q-Matrix Validation for the DINA Model: Development and Applications

    ERIC Educational Resources Information Center

    de la Torre, Jimmy

    2008-01-01

    Most model fit analyses in cognitive diagnosis assume that a Q matrix is correct after it has been constructed, without verifying its appropriateness. Consequently, any model misfit attributable to the Q matrix cannot be addressed and remedied. To address this concern, this paper proposes an empirically based method of validating a Q matrix used…

  13. 77 FR 61610 - Interagency Coordinating Committee on the Validation of Alternative Methods Evaluation Report and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-10

    ...-18, 2010 meeting (75 FR 26758, May 12, 2010) for comment. The public was also given an opportunity to....niehs.nih.gov/methods/ocutox/reducenum.htm ) for comment by the broad stakeholder community (76 FR 50220... HUMAN SERVICES National Institutes of Health Interagency Coordinating Committee on the Validation...

  14. Validity of a Simulation Game as a Method for History Teaching

    ERIC Educational Resources Information Center

    Corbeil, Pierre; Laveault, Dany

    2011-01-01

    The aim of this research is, first, to determine the validity of a simulation game as a method of teaching and an instrument for the development of reasoning and, second, to study the relationship between learning and students' behavior toward games. The participants were college students in a History of International Relations course, with two…

  15. Assessment of management in general practice: validation of a practice visit method.

    PubMed Central

    van den Hombergh, P; Grol, R; van den Hoogen, H J; van den Bosch, W J

    1998-01-01

    BACKGROUND: Practice management (PM) in general practice is as yet ill-defined; a systematic description of its domain, as well as a valid method to assess it, are necessary for research and assessment. AIM: To develop and validate a method to assess PM of general practitioners (GPs) and practices. METHOD: Relevant and potentially discriminating indicators were selected from a systematic framework of 2410 elements of PM to be used in an assessment method (VIP = visit instrument PM). The method was first tested in a pilot study and, after revision, was evaluated in order to select discriminating indicators and to determine validity of dimensions (factor and reliability analysis, linear regression). RESULTS: One hundred and ten GPs were assessed with the practice visit method using 249 indicators; 208 of these discriminated sufficiently at practice level or at GP level. Factor analysis resulted in 34 dimensions and in a taxonomy of PM. Dimensions and indicators showed marked variation between GPs and practices. Training practices scored higher on five dimensions; single-handed and dispensing practices scored lower on delegated tasks, but higher on accessibility and availability. CONCLUSION: A visit method to assess PM has been developed and its validity studied systematically. The taxonomy and dimensions of PM were in line with other classifications. Selection of a balanced number of useful and relevant indicators was nevertheless difficult. The dimensions could discriminate between groups of GPs and practices, establishing the value of the method for assessment. The VIP method could be an important contribution to the introduction of continuous quality improvement in the profession. PMID:10198481

  16. Comparison of different mass transport calculation methods for wind erosion quantification purposes

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Quantitative estimation of the material transported by the wind is essential in the study and control of wind erosion, although methods for its calculation are still controversial. Sampling the dust cloud at discrete heights, fitting an equation to the data, and integrating this equation from the so...

  17. [Comparison of the selected methods of cord processing for transplantation purposes].

    PubMed

    Ołdak, T; Machaj, E K; Gajkowska, A; Kruszewski, M; Kłos, M; Szczecina, R; Czajkowski, K; Kuczyńska-Sicińska, J; Pojda, Z

    2000-09-01

    Human umbilical cord blood (UCB) has been successfully used as a source of allogeneic hematopoietic cells for transplantation. Banking of UCB requires volume reduction to decrease storage space, costs and the volume of infused DMSO. In order to select an optimal method for volume reduction, we compared several methods of cord blood processing, namely buffy coat centrifugation, red cell lysis, and hydroxyethyl starch (HES), methylcellulose and gelatin sedimentation. The viability of cells and the recoveries of total white blood cells, mononuclear cells and CD34+ cells were evaluated. We also compared the efficacy of red cell depletion from the original UCB sample. Buffy coat centrifugation, red cell lysis, HES, gelatin or methylcellulose resulted in high mononuclear cell recoveries, whereas high hematopoietic cell recovery was observed only after HES sedimentation and buffy coat processing. The HES sedimentation procedure, compared to buffy coat processing, is more time- and labor-consuming but resulted in higher red blood cell and platelet depletion. Either method can be recommended as the method of choice for umbilical cord blood processing before banking. PMID:11083012

  18. Low-cost extrapolation method for maximal LTE radio base station exposure estimation: test and validation.

    PubMed

    Verloock, Leen; Joseph, Wout; Gati, Azeddine; Varsier, Nadège; Flach, Björn; Wiart, Joe; Martens, Luc

    2013-06-01

    An experimental validation of a low-cost method for the extrapolation and estimation of the maximal electromagnetic-field exposure from long-term evolution (LTE) radio base station installations is presented. No knowledge of downlink band occupation or service characteristics is required for the low-cost method. The method is applicable in situ. It only requires a basic spectrum analyser with appropriate field probes, without the need for expensive dedicated LTE decoders. The method is validated both in the laboratory and in situ, for a single-input single-output antenna LTE system and a 2×2 multiple-input multiple-output system, with low deviations in comparison with signals measured using dedicated LTE decoders. PMID:23179190

  19. Development and Validation of Stability Indicating RP-HPLC Method for Voriconazole.

    PubMed

    Khetre, A B; Sinha, P K; Damle, Mrinalini C; Mehendre, R

    2009-09-01

    This study describes the development and validation of a stability-indicating HPLC method for voriconazole, an antifungal drug. Voriconazole was subjected to stress degradation under different conditions recommended by the International Conference on Harmonization, and the samples so generated were used to develop a stability-indicating high performance liquid chromatographic method. The peak for voriconazole was well resolved from the peaks of degradation products, using a Hypersil C18 (250x4.6 mm) column and a mobile phase comprising acetonitrile:water (40:60, v/v) at a flow rate of 1 ml/min. Detection was carried out using a photodiode array detector. A linear response (r > 0.99) was observed in the range of 5-25 μg/ml. The method showed good recoveries (average 100.06%) and low relative standard deviations for intra- and inter-day precision. The method was also validated for specificity and robustness. PMID:20502568

  20. Validity and reliability of an alternative method for measuring power output during six-second all-out cycling.

    PubMed

    Watson, Martin; Bibbo, Daniele; Duffy, Charles R; Riches, Philip E; Conforto, Silvia; Macaluso, Andrea

    2014-08-01

    In a laboratory setting where both a mechanically braked cycling ergometer and a motion analysis (MA) system are available, flywheel angular displacement can be estimated using MA. The purpose of this investigation was to assess the validity and reliability of an MA method for measuring maximal power output (Pmax) in comparison with a force transducer (FT) method. Eight males and eight females undertook three identical sessions, separated by 4 to 6 days, the first being a familiarization session. Individuals performed three 6-second sprints against 50% of the maximal resistance needed to complete two pedal revolutions, with a 3-minute rest between trials. Power was determined independently using both MA and FT analyses. Validity: MA recorded significantly higher Pmax than FT (P < .05). Bland-Altman plots showed a systematic bias in the difference between the measures of the two systems; this difference increased as power increased. Repeatability: Intraclass correlation coefficients were on average 0.90 ± 0.05 in males and 0.85 ± 0.08 in females. Measuring Pmax by MA, therefore, is as appropriate for use in exercise physiology research as Pmax measured by FT, provided that the bias between these measurement methods is allowed for. PMID:24977624
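
    On a mechanically braked ergometer, instantaneous power can be recovered from flywheel kinematics as (frictional torque + inertial torque) × angular velocity, which is the quantity a motion analysis system can estimate from angular displacement. The sketch below illustrates that relationship under stated assumptions; it is not the authors' MA pipeline, and all names are hypothetical.

```python
import numpy as np

def power_from_flywheel(theta, t, braking_force, flywheel_radius, flywheel_inertia):
    """Peak power from flywheel angular displacement theta(t):
    P = (F_brake * r + I * d_omega/dt) * omega, i.e. frictional plus
    inertial torque times angular velocity."""
    omega = np.gradient(np.asarray(theta, float), t)   # angular velocity (rad/s)
    alpha = np.gradient(omega, t)                      # angular acceleration (rad/s^2)
    torque = braking_force * flywheel_radius + flywheel_inertia * alpha
    power = torque * omega
    return power.max()  # Pmax over the 6-s sprint
```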

  1. A multiscale finite element model validation method of composite cable-stayed bridge based on Probability Box theory

    NASA Astrophysics Data System (ADS)

    Zhong, Rumian; Zong, Zhouhong; Niu, Jie; Liu, Qiqi; Zheng, Peijuan

    2016-05-01

    Modeling and simulation are routinely implemented to predict the behavior of complex structures. These tools powerfully unite theoretical foundations, numerical models and experimental data, which include associated uncertainties and errors. A new methodology for multi-scale finite element (FE) model validation is proposed in this paper. The method is based on a two-step updating method, a novel approach to obtain coupling parameters in the gluing sub-regions of a multi-scale FE model, and upon Probability Box (P-box) theory, which can provide a lower and upper bound for the purpose of quantifying and transmitting the uncertainty of structural parameters. The structural health monitoring data of Guanhe Bridge, a composite cable-stayed bridge with large span, and Monte Carlo simulation were used to verify the proposed method. The results show satisfactory accuracy: the overlap ratio index of each modal frequency is over 89%, with a small average absolute value of relative errors, and the CDF of the normal distribution coincides well with the measured frequencies of Guanhe Bridge. The validated multiscale FE model may be further used in structural damage prognosis and safety prognosis.

  2. Refraction-based X-ray Computed Tomography for Biomedical Purpose Using Dark Field Imaging Method

    NASA Astrophysics Data System (ADS)

    Sunaguchi, Naoki; Yuasa, Tetsuya; Huo, Qingkai; Ichihara, Shu; Ando, Masami

    We have proposed a tomographic x-ray imaging system using DFI (dark field imaging) optics along with a data-processing method to extract information on refraction from the measured intensities, and a reconstruction algorithm to reconstruct a refractive-index field from the projections generated from the extracted refraction information. The DFI imaging system consists of a tandem optical system of Bragg- and Laue-case crystals, a positioning device system for a sample, and two CCD (charge coupled device) cameras. Then, we developed a software code to simulate the data-acquisition, data-processing, and reconstruction methods to investigate the feasibility of the proposed methods. Finally, in order to demonstrate its efficacy, we imaged a sample with DCIS (ductal carcinoma in situ) excised from a breast cancer patient using a system constructed at the vertical wiggler beamline BL-14C in KEK-PF. Its CT images depicted a variety of fine histological structures, such as milk ducts, duct walls, secretions, adipose and fibrous tissue. They correlate well with histological sections.

  3. E-Flux2 and SPOT: Validated Methods for Inferring Intracellular Metabolic Flux Distributions from Transcriptomic Data

    PubMed Central

    Kim, Min Kyung; Lane, Anatoliy; Kelley, James J.; Lun, Desmond S.

    2016-01-01

    Background Several methods have been developed to predict system-wide and condition-specific intracellular metabolic fluxes by integrating transcriptomic data with genome-scale metabolic models. While powerful in many settings, existing methods have several shortcomings, and it is unclear which method has the best accuracy in general because of limited validation against experimentally measured intracellular fluxes. Results We present a general optimization strategy for inferring intracellular metabolic flux distributions from transcriptomic data coupled with genome-scale metabolic reconstructions. It consists of two different template models called DC (determined carbon source model) and AC (all possible carbon sources model) and two different new methods called E-Flux2 (E-Flux method combined with minimization of l2 norm) and SPOT (Simplified Pearson cOrrelation with Transcriptomic data), which can be chosen and combined depending on the availability of knowledge on carbon source or objective function. This enables us to simulate a broad range of experimental conditions. We examined E. coli and S. cerevisiae as representative prokaryotic and eukaryotic microorganisms respectively. The predictive accuracy of our algorithm was validated by calculating the uncentered Pearson correlation between predicted fluxes and measured fluxes. To this end, we compiled 20 experimental conditions (11 in E. coli and 9 in S. cerevisiae), of transcriptome measurements coupled with corresponding central carbon metabolism intracellular flux measurements determined by 13C metabolic flux analysis (13C-MFA), which is the largest dataset assembled to date for the purpose of validating inference methods for predicting intracellular fluxes. In both organisms, our method achieves an average correlation coefficient ranging from 0.59 to 0.87, outperforming a representative sample of competing methods. Easy-to-use implementations of E-Flux2 and SPOT are available as part of the open
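
    The core E-Flux step - scaling reaction flux bounds by normalized gene expression before solving the flux balance linear program - can be illustrated in a few lines. The sketch below is didactic, with simplifying assumptions (symmetric expression-scaled bounds, a single biomass objective); it is not the published E-Flux2/SPOT implementation, which additionally minimizes the l2 norm of the flux vector.

```python
import numpy as np
from scipy.optimize import linprog

def e_flux_like(S, expression, biomass_idx, vmax=1000.0):
    """Toy illustration of the E-Flux idea: scale reaction flux bounds by the
    normalized expression of the associated genes, then maximize biomass flux
    subject to steady state S v = 0. S is metabolites x reactions."""
    e = np.asarray(expression, float) / np.max(expression)   # normalize to [0, 1]
    bounds = [(-vmax * ei, vmax * ei) for ei in e]           # expression-scaled bounds
    c = np.zeros(S.shape[1]); c[biomass_idx] = -1.0          # maximize biomass flux
    res = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=bounds, method="highs")
    return res.x  # flux distribution at the optimum
```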

  4. Development and Validation of a Terbium-Sensitized Luminescence Analytical Method for Deferiprone

    PubMed Central

    Manzoori Lashkar, Jamshid; Amjadi, Mohammad; Soleymani, Jafar; Tamizi, Elnaz; Panahi-Azar, Vahid; Jouyban, Abolghasem

    2012-01-01

    A sensitive fluorometric method for the determination of deferiprone (DFP), based on the formation of a luminescent complex with Tb³⁺ ions in aqueous solution, is reported. The maximum excitation and emission wavelengths were 295 and 545 nm, respectively. The effects of various factors on the luminescence intensity of the system were investigated and optimized, and the method was then validated under the optimum conditions. The validation results indicated that the relative intensity at 545 nm is linearly related to the DFP concentration in aqueous solution over the range 7.2 × 10⁻⁹ to 1.4 × 10⁻⁵ M; the detection and quantification limits were calculated as 6.3 × 10⁻⁹ and 2.1 × 10⁻⁸ M, respectively; the precision and accuracy of the method were within 5%; and the recovery was between 100.1% and 102.3%. These results indicate that the method is simple, time-saving, specific, accurate and precise for the determination of DFP in aqueous solution. After optimization and validation, the method was successfully applied to the determination of DFP in tablet dosage forms. The stoichiometry of the Tb³⁺-DFP complex was found to be 1:3, and the complex formation constant was 1.6 × 10¹⁶. PMID:24250504
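
    The abstract does not state which convention produced the detection and quantification limits, but their ratio (about 3.3) is consistent with the common LOD = 3σ/S and LOQ = 10σ/S rules, where S is the calibration slope and σ the residual standard deviation. A Python sketch with invented calibration data:

      import numpy as np

      # hypothetical calibration: DFP concentration (M) vs relative intensity
      conc = np.array([1e-8, 1e-7, 1e-6, 5e-6, 1e-5])
      intensity = np.array([2.1, 20.8, 210.5, 1049.0, 2102.0])

      slope, intercept = np.polyfit(conc, intensity, 1)
      sigma = np.std(intensity - (slope * conc + intercept), ddof=2)

      lod = 3.0 * sigma / slope    # detection limit, 3*sigma/S convention
      loq = 10.0 * sigma / slope   # quantification limit, 10*sigma/S convention
      print(f"LOD ~ {lod:.2e} M, LOQ ~ {loq:.2e} M")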

  5. Potential of accuracy profile for method validation in inductively coupled plasma spectrochemistry

    NASA Astrophysics Data System (ADS)

    Mermet, J. M.; Granier, G.

    2012-10-01

    Method validation is usually performed over a range of concentrations for which analytical criteria must be verified. One important criterion in quantitative analysis is accuracy, i.e. the combined contribution of trueness and precision. The study of accuracy over this range is called an accuracy profile and provides experimental tolerance intervals. Comparison with acceptability limits fixed by the end user defines a validity domain. This work describes the computations involved in building the tolerance intervals, in particular for intermediate precision from within-laboratory experiments and for reproducibility from interlaboratory studies. The computation is based on ISO 5725-4 and on previously published work. Moreover, the bias uncertainty is also computed to verify the bias contribution to accuracy. The various types of accuracy profile behavior are illustrated with results obtained by ICP-MS and ICP-AES. This procedure allows the analyst to define a validity domain unambiguously for a given accuracy. However, because the experiments are time-consuming, the accuracy profile approach is mainly suited to method validation.
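
    As a rough illustration of the tolerance-interval computation, the Python sketch below builds a β-expectation tolerance interval at a single concentration level from replicate recoveries. It is deliberately simplified: the ISO 5725-4-based procedure described above separates within- and between-series variance components, which this one-level sketch ignores.

      import numpy as np
      from scipy import stats

      def beta_expectation_interval(measured, true_value, beta=0.80):
          """Simplified one-level beta-expectation tolerance interval,
          expressed as relative bias (%) for comparison with the
          end user's acceptability limits."""
          rel = 100.0 * (np.asarray(measured, float) - true_value) / true_value
          bias, s = rel.mean(), rel.std(ddof=1)
          k = stats.t.ppf(1.0 - (1.0 - beta) / 2.0, df=len(rel) - 1)
          return bias - k * s, bias + k * s

      # hypothetical replicate results at a 10 mg/L level
      low, high = beta_expectation_interval([9.8, 10.3, 10.1, 9.9, 10.2], 10.0)
      print(f"tolerance interval: [{low:.1f}%, {high:.1f}%]")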

  6. Standardization of water purification in the central dialysis fluid delivery system: validation and parametric method.

    PubMed

    Tomo, Tadashi; Shinoda, Tosiho

    2009-01-01

    The central dialysis fluid delivery system (CDDS) has mainly been used for hemodialysis therapy in Japan. Validation and a parametric method are necessary for the quality control of dialysis fluid in CDDS. Validation is a concept for assuring system compatibility and product quality; it covers the manufacturing and quality control methods, including the system design and equipment of the manufacturing facility and the manufacturing procedures and processes. Confirmed results must be kept within acceptable limits and must be documented in a record. Important parameters for validating CDDS include: (1) setting the sterilized area; (2) deciding the sterilization level; (3) confirming the maximum bioburden; (4) verifying the performance of the endotoxin-retentive filter and reverse osmosis (RO) module; and (5) defining checkpoints for the purity of dialysis water in the system. Applying the concepts of validation and a parametric method to the management of CDDS enables the supply of purified dialysis fluid, or online-prepared substitution fluid, that meets the 2008 standards of the Japanese Society for Dialysis Therapy. PMID:19556762

  7. Assembly for collecting samples for purposes of identification or analysis and method of use

    DOEpatents

    Thompson, Cyril V [Knoxville, TN; Smith, Rob R [Knoxville, TN

    2010-02-02

    An assembly, and an associated method, for collecting a sample of material to be characterized with diagnostic equipment includes or utilizes an elongated member having a proximal end, with which the user manipulates the assembly, and a distal end. A collection tip capable of being placed into contact with the material to be characterized is supported upon the distal end. The collection tip includes a body of chemically inert porous material that binds a sample of the material when the tip is placed into contact with it, thereby holding the sample for subsequent introduction to the diagnostic equipment.

  8. Multiple methods, maps, and management applications: Purpose made seafloor maps in support of ocean management

    NASA Astrophysics Data System (ADS)

    Brown, Craig J.; Sameoto, Jessica A.; Smith, Stephen J.

    2012-08-01

    The establishment of multibeam echosounders (MBES) as a mainstream tool in ocean mapping has facilitated integrative approaches toward nautical charting, benthic habitat mapping, and seafloor geotechnical surveys. The inherent bathymetric and backscatter information generated by MBES enables marine scientists to present highly accurate bathymetric data with a spatial resolution closely matching that of terrestrial mapping. Furthermore, developments in data collection and processing of MBES backscatter, combined with the quality of the co-registered depth information, have resulted in the increasing preferential use of multibeam technology over conventional sidescan sonar for the production of benthic habitat maps. A range of post-processing approaches can generate customized map products to meet multiple ocean management needs, thus extracting maximum value from a single survey data set. Based on recent studies over German Bank off SW Nova Scotia, Canada, we show how primary MBES bathymetric and backscatter data, along with supplementary data (i.e. in situ video and stills), were processed using a variety of methods to generate a series of maps. Methods conventionally used for classification of multi-spectral data were tested for classification of the MBES data set to produce a map summarizing broad bio-physical characteristics of the seafloor (i.e. a benthoscape map), which is of value for use in many aspects of marine spatial planning. A species-specific habitat map for the sea scallop Placopecten magellanicus was also generated from the MBES data by applying a Species Distribution Modeling (SDM) method to spatially predict habitat suitability, which offers tremendous promise for use in fisheries management. In addition, we explore the challenges of incorporating benthic community data into maps based on species information derived from a large number of seafloor photographs. Through the process of applying multiple methods to generate multiple maps for
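
    The abstract names a Species Distribution Modeling step without specifying the model, so the Python sketch below uses a random forest purely as a stand-in: per-pixel MBES features (depth, backscatter, slope) and habitat labels from ground-truth stations are invented for illustration.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      # hypothetical training data: rows are ground-truthed grid cells with
      # depth (m), backscatter (dB) and slope (deg) as features
      X = np.array([[-45.2, -22.1, 1.3],
                    [-60.8, -18.4, 0.7],
                    [-52.3, -27.9, 2.1],
                    [-71.5, -15.0, 0.4]])
      y = np.array([1, 0, 1, 0])   # 1 = suitable scallop habitat (invented labels)

      model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
      # habitat-suitability probability for an unlabelled grid cell
      print(model.predict_proba([[-55.0, -20.5, 1.0]])[0, 1])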

  9. Co-validation of three methods for optical characterization of point-focus concentrators

    SciTech Connect

    Wendelin, T.J.; Grossman, J.W.

    1994-10-01

    Three different methods for characterizing point-focus solar concentrator optical performance have been developed for specific applications. These methods include a laser ray trace technique called the Scanning Hartmann Optical Test, a video imaging process called the 2f Technique and actual on-sun testing in conjunction with optical computer modeling. Three concentrator test articles, each of a different design, were characterized using at least two of the methods and, in one case, all three. The results of these tests are compared in order to validate the methods. Excellent agreement is observed in the results, suggesting that the techniques provide consistent and accurate characterizations of solar concentrator optics.

  10. Co-validation of three methods for optical characterization of point-focus concentrators

    NASA Astrophysics Data System (ADS)

    Wendelin, T. J.; Grossman, J. W.

    Three different methods for characterizing point-focus solar concentrator optical performance have been developed for specific applications. These methods include a laser ray trace technique called the Scanning Hartmann Optical Test, a video imaging process called the 2f Technique and actual on-sun testing in conjunction with optical computer modeling. Three concentrator test articles, each of a different design, were characterized using at least two of the methods and, in one case, all three. The results of these tests are compared in order to validate the methods. Excellent agreement is observed in the results, suggesting that the techniques provide consistent and accurate characterizations of solar concentrator optics.