Science.gov

Sample records for purpose validated method

  1. Validity of a simple videogrammetric method to measure the movement of all hand segments for clinical purposes.

    PubMed

    Sancho-Bru, Joaquín L; Jarque-Bou, Néstor J; Vergara, Margarita; Pérez-González, Antonio

    2014-02-01

Hand movement measurement is important in the clinical, ergonomic and biomechanical fields. Videogrammetric techniques allow the measurement of hand movement without interfering with natural hand behaviour. However, accurate measurement of hand movement requires a high number of markers, which limits its applicability in clinical practice (60 markers would be needed for the hand and wrist). In this work, a simple method that uses a reduced number of markers (29), based on a simplified kinematic model of the hand, is proposed and evaluated. A set of experiments was performed to evaluate the errors associated with the kinematic simplification, together with the method's accuracy, repeatability and reproducibility. The global error attributed to the kinematic simplification was 6.68°. The method has small repeatability and reproducibility errors (3.43° and 4.23°, respectively) and shows no statistically significant difference from electronic goniometers. The relevance of the work lies in the ability to measure all degrees of freedom of the hand with a reduced number of markers, without interfering with natural hand behaviour, which makes it suitable for use in clinical applications as well as for ergonomic and biomechanical purposes. PMID:24503512

  2. Validating Analytical Methods

    ERIC Educational Resources Information Center

    Ember, Lois R.

    1977-01-01

    The procedures utilized by the Association of Official Analytical Chemists (AOAC) to develop, evaluate, and validate analytical methods for the analysis of chemical pollutants are detailed. Methods validated by AOAC are used by the EPA and FDA in their enforcement programs and are granted preferential treatment by the courts. (BT)

  3. Fit for purpose validated method for the determination of the strontium isotopic signature in mineral water samples by multi-collector inductively coupled plasma mass spectrometry

    NASA Astrophysics Data System (ADS)

    Brach-Papa, Christophe; Van Bocxstaele, Marleen; Ponzevera, Emmanuel; Quétel, Christophe R.

    2009-03-01

A robust method allowing the routine determination of n(87Sr)/n(86Sr) with at least five significant decimal digits for large sets of mineral water samples is described. It is based on two consecutive chromatographic separations of Sr combined with multi-collector inductively coupled plasma mass spectrometry (MC-ICPMS) measurements. Separations are performed using commercial pre-packed columns filled with "Sr resin" to overcome isobaric interferences affecting the determination of strontium isotope ratios. The careful method validation scheme applied is described. It included investigations of all parameters influencing both the chromatographic separations and the MC-ICPMS measurements, as well as a test on a synthetic sample made of an aliquot of the NIST SRM 987 certified reference material dispersed in a saline matrix to mimic complex samples. Correction for mass discrimination was done internally using the n(88Sr)/n(86Sr) ratio. For comparing mineral waters originating from different geological backgrounds or identifying counterfeits, calculations used the well-known consensus value 1/0.1194 (taken with zero uncertainty) as reference. The typical relative uncertainty budget estimated for these results was 40 'ppm' (k = 2). It increased to 150 'ppm' (k = 2) for the establishment of stand-alone results, taking into account a relative difference of about 126 'ppm' systematically observed between measured and certified values of the NIST SRM 987. Where a deviation of the n(88Sr)/n(86Sr) ratio was suspected (worst-case scenario), our proposal was to use the NIST SRM 987 value 8.37861 ± 0.00325 (k = 2) as reference and assign a typical relative uncertainty budget of 300 'ppm' (k = 2). This method is thus fit for purpose and was applied to eleven French samples.
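The internal mass-discrimination correction mentioned in this record can be sketched with the widely used exponential law, taking 1/0.1194 as the reference n(88Sr)/n(86Sr) value; the measured ratios below are hypothetical and for illustration only.

```python
import math

# Isotope masses (u), from standard atomic mass tables
M86, M87, M88 = 85.9092607, 86.9088775, 87.9056122

# Consensus reference ratio n(88Sr)/n(86Sr) = 1/0.1194, as in the abstract
R88_TRUE = 1.0 / 0.1194

def correct_mass_bias(r87_meas, r88_meas):
    """Exponential-law internal correction of the measured 87Sr/86Sr ratio,
    using the measured 88Sr/86Sr and its consensus value."""
    beta = math.log(R88_TRUE / r88_meas) / math.log(M88 / M86)
    return r87_meas * (M87 / M86) ** beta

# Hypothetical measured ratios, for illustration only
corrected = correct_mass_bias(r87_meas=0.7085, r88_meas=8.30)
print(round(corrected, 5))  # ≈ 0.7117
```

If the measured 88Sr/86Sr already equals the consensus value, beta is zero and the 87Sr/86Sr ratio passes through unchanged.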

  4. Content Validation of the Purpose Dimension.

    ERIC Educational Resources Information Center

    LaPlante, Marilyn J.; Jewett, Ann E.

    1987-01-01

    The article reports on LaPlante's research (1973) on evaluation of the purpose dimension of the Purpose Process Curriculum Framework, which established a set of criteria for evaluating the framework and demonstrated that the Delphi technique is appropriate for study of physical education curriculum. (CB)

  5. Construct Validity in Formative Assessment: Purpose and Practices

    ERIC Educational Resources Information Center

    Rix, Samantha

    2012-01-01

    This paper examines the utilization of construct validity in formative assessment for classroom-based purposes. Construct validity pertains to the notion that interpretations are made by educators who analyze test scores during formative assessment. The purpose of this paper is to note the challenges that educators face when interpreting these…

  6. Homework Purpose Scale for High School Students: A Validation Study

    ERIC Educational Resources Information Center

    Xu, Jianzhong

    2010-01-01

    The purpose of this study is to test the validity of scores on the Homework Purpose Scale using 681 rural and 306 urban high school students. First, confirmatory factor analysis was conducted on the rural sample. The results reveal that the Homework Purpose Scale comprises three separate yet related factors, including Learning-Oriented Reasons,…

  7. Homework Purpose Scale for Middle School Students: A Validation Study

    ERIC Educational Resources Information Center

    Xu, Jianzhong

    2011-01-01

    The purpose of the present study is to test the validity of scores on the Homework Purpose Scale (HPS) for middle school students. The participants were 1,181 eighth graders in the southeastern United States, including (a) 699 students in urban school districts and (b) 482 students in rural school districts. First, confirmatory factor analysis was…

  8. Review of surface steam sterilization for validation purposes.

    PubMed

    van Doornmalen, Joost; Kopinga, Klaas

    2008-03-01

    Sterilization is an essential step in the process of producing sterile medical devices. To guarantee sterility, the process of sterilization must be validated. Because there is no direct way to measure sterility, the techniques applied to validate the sterilization process are based on statistical principles. Steam sterilization is the most frequently applied sterilization method worldwide and can be validated either by indicators (chemical or biological) or physical measurements. The steam sterilization conditions are described in the literature. Starting from these conditions, criteria for the validation of steam sterilization are derived and can be described in terms of physical parameters. Physical validation of steam sterilization appears to be an adequate and efficient validation method that could be considered as an alternative for indicator validation. Moreover, physical validation can be used for effective troubleshooting in steam sterilizing processes. PMID:18313509
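The abstract argues that steam sterilization can be validated from physical measurements alone. One standard way, not spelled out in the abstract, of reducing a time/temperature record to a single figure is the F0 lethality integral (equivalent minutes at 121.1 °C, assuming z = 10 °C):

```python
def f0(temps_c, dt_min, z=10.0, t_ref=121.1):
    """Integrate lethality over a sampled temperature record:
    each sample contributes dt_min minutes weighted by 10**((T - Tref)/z)."""
    return sum(10 ** ((t - t_ref) / z) * dt_min for t in temps_c)

# Hypothetical plateau: 15 min held exactly at 121.1 degC, sampled each minute
print(f0([121.1] * 15, dt_min=1.0))  # 15.0 equivalent minutes
```

A cycle run 10 °C cooler accumulates lethality ten times more slowly, which is why a physical record of the plateau temperature is so informative for validation.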

  9. VAN method lacks validity

    NASA Astrophysics Data System (ADS)

    Jackson, David D.; Kagan, Yan Y.

Varotsos and colleagues (the VAN group) claim to have successfully predicted many earthquakes in Greece. Several authors have refuted these claims, as reported in the May 27, 1996, special issue of Geophysical Research Letters and a recent book, A Critical Review of VAN [Lighthill 1996]. Nevertheless, the myth persists. Here we summarize why the VAN group's claims lack validity. The VAN group observes electrical potential differences that they call "seismic electric signals" (SES) weeks before and hundreds of kilometers away from some earthquakes, claiming that SES are somehow premonitory. This would require that increases in stress or decreases in strength cause the electrical variations, or that some regional process first causes the electrical signals and then helps trigger the earthquakes. Here we adopt their notation SES to refer to the electrical variations, without accepting any link to the quakes.

  10. Toward a Unified Validation Framework in Mixed Methods Research

    ERIC Educational Resources Information Center

    Dellinger, Amy B.; Leech, Nancy L.

    2007-01-01

    The primary purpose of this article is to further discussions of validity in mixed methods research by introducing a validation framework to guide thinking about validity in this area. To justify the use of this framework, the authors discuss traditional terminology and validity criteria for quantitative and qualitative research, as well as…

  11. Random Qualitative Validation: A Mixed-Methods Approach to Survey Validation

    ERIC Educational Resources Information Center

    Van Duzer, Eric

    2012-01-01

    The purpose of this paper is to introduce the process and value of Random Qualitative Validation (RQV) in the development and interpretation of survey data. RQV is a method of gathering clarifying qualitative data that improves the validity of the quantitative analysis. This paper is concerned with validity in relation to the participants'…

  12. Purposes and methods of scoring earthquake forecasts

    NASA Astrophysics Data System (ADS)

    Zhuang, J.

    2010-12-01

Studies of earthquake prediction or forecasting have two kinds of purposes: one is to give a systematic estimate of earthquake risk in a particular region and period, in order to advise governments and enterprises on disaster reduction; the other is to search for reliable precursors that can be used to improve earthquake prediction or forecasts. For the first case a complete score is necessary, while for the latter a partial score, which can be used to evaluate whether the forecasts or predictions have some advantage over a well-known model, is sufficient. This study reviews different scoring methods for evaluating the performance of earthquake prediction and forecasts. In particular, the recently developed gambling scoring method shows its capacity to find good points in an earthquake prediction algorithm or model that are not in a reference model, even if its overall performance is no better than that of the reference model.

  13. External Validity in Policy Evaluations That Choose Sites Purposively

    ERIC Educational Resources Information Center

    Olsen, Robert B.; Orr, Larry L.; Bell, Stephen H.; Stuart, Elizabeth A.

    2013-01-01

    Evaluations of the impact of social programs are often carried out in multiple sites, such as school districts, housing authorities, local TANF offices, or One-Stop Career Centers. Most evaluations select sites purposively following a process that is nonrandom. Unfortunately, purposive site selection can produce a sample of sites that is not…

  14. Development of a systematic computer vision-based method to analyse and compare images of false identity documents for forensic intelligence purposes-Part I: Acquisition, calibration and validation issues.

    PubMed

    Auberson, Marie; Baechler, Simon; Zasso, Michaël; Genessay, Thibault; Patiny, Luc; Esseiva, Pierre

    2016-03-01

Following their detection and seizure by police and border guard authorities, false identity and travel documents are usually scanned, producing digital images. This research investigates the potential of these images to classify false identity documents, to highlight links between documents produced by the same modus operandi or source, and thus to support forensic intelligence efforts. Inspired by previous research on digital images of Ecstasy tablets, a systematic and complete method has been developed to acquire, collect, process and compare images of false identity documents. This first part of the article highlights the critical steps of the method and the development of a prototype that processes regions of interest extracted from images. Acquisition conditions have been fine-tuned in order to optimise the reproducibility and comparability of images. Different filters and comparison metrics have been evaluated, and the performance of the method has been assessed using two calibration and validation sets of documents, made up of 101 Italian driving licenses and 96 Portuguese passports seized in Switzerland, among which some were known to come from common sources. Results indicate that the use of Hue and Edge filters, or their combination, to extract profiles from images, followed by comparison of the profiles with a Canberra distance-based metric, provides the most accurate classification of documents. The method also appears to be quick, efficient and inexpensive. It can be easily operated from remote locations and shared amongst different organisations, which makes it very convenient for future operational applications. The method could serve as a first fast triage method that may help target more resource-intensive profiling methods (based on a visual, physical or chemical examination of documents, for instance). Its contribution to forensic intelligence and its application to several sets of false identity documents seized by police and border guards will be
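A minimal sketch of the Canberra distance-based comparison the authors found most accurate; the hue-channel profiles here are hypothetical stand-ins for the regions of interest extracted from document scans.

```python
def canberra(p, q):
    """Canberra distance between two equal-length profiles; terms where
    both values are zero contribute nothing (standard convention)."""
    return sum(abs(a - b) / (abs(a) + abs(b))
               for a, b in zip(p, q) if a or b)

# Hypothetical hue-channel profiles from two document scans
d_same = canberra([0.2, 0.5, 0.3], [0.21, 0.49, 0.30])
d_diff = canberra([0.2, 0.5, 0.3], [0.6, 0.1, 0.3])
print(d_same < d_diff)  # True: profiles from a common source score closer
```

Because each term is normalized by the magnitude of its pair of values, the Canberra distance is sensitive to small differences in low-intensity bins, which suits the comparison of subtle printing variations.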

  15. Design for validation, based on formal methods

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.

    1990-01-01

Validation of ultra-reliable systems decomposes into two subproblems: (1) quantification of the probability of system failure due to physical failure; and (2) establishing that design errors are not present. Methods of design, testing, and analysis of ultra-reliable software are discussed. It is concluded that a design-for-validation approach based on formal methods is needed for the digital flight control systems problem, and that formal methods will play a major role in the development of future high-reliability digital systems.

  16. Purpose in Life Test assessment using latent variable methods.

    PubMed

    Harlow, L L; Newcomb, M D; Bentler, P M

    1987-09-01

    A psychometric assessment was conducted on a slightly revised version of the Purpose in Life Test (PIL-R). Factor analyses revealed a large general factor plus four primary factors comprising lack of purpose in life, positive sense of purpose, motivation for meaning, and existential confusion. Validity models showed that the PIL-R was positively related to a construct of happiness and was negatively related to suicidality and meaninglessness. Reliability estimates ranged from 0.78 to 0.86. The revised version can be presented compactly and may be less confusing to subjects than the original PIL. PMID:3664045

17. [Validation and verification of microbiology methods].

    PubMed

    Camaró-Sala, María Luisa; Martínez-García, Rosana; Olmos-Martínez, Piedad; Catalá-Cuenca, Vicente; Ocete-Mochón, María Dolores; Gimeno-Cardona, Concepción

    2015-01-01

Clinical microbiologists should ensure, to the maximum level allowed by scientific and technical development, the reliability of their results. This implies that, in addition to meeting the technical criteria to ensure their validity, tests must be performed under conditions that allow comparable results to be obtained regardless of the laboratory that performs them. In this sense, the use of recognized and accepted reference methods is the most effective tool for providing these guarantees. The activities related to verification and validation of analytical methods have become very important, given the continuous development and updating of techniques, increasingly complex analytical equipment, and the interest of professionals in ensuring quality processes and results. The definitions of validation and verification are described, along with the different types of validation/verification, the types of methods, and the level of validation necessary depending on the degree of standardization. The situations in which validation/verification is mandatory and/or recommended are discussed, including those particularly related to validation in microbiology. The paper stresses the importance of promoting the use of reference strains and standard controls in microbiology, as well as the importance of participation in External Quality Assessment programs to demonstrate technical competence. Emphasis is placed on how to calculate some of the parameters required for validation/verification, such as accuracy and precision. The development of these concepts can be found in the microbiological procedure SEIMC number 48: «Validation and verification of microbiological methods» www.seimc.org/protocols/microbiology. PMID:24958671

  18. Experimental validation of structural optimization methods

    NASA Technical Reports Server (NTRS)

    Adelman, Howard M.

    1992-01-01

The topic of validating structural optimization methods by use of experimental results is addressed. The need to validate the methods, as a way of effecting a greater and accelerated acceptance of formal optimization methods by practicing engineering designers, is described. The range of validation strategies is defined, which includes comparison of optimization results with more traditional design approaches, establishing the accuracy of the analyses used, and finally experimental validation of the optimization results. Examples of the use of experimental results to validate optimization techniques are described. The examples include experimental validation of the following: optimum design of a trussed beam; combined control-structure design of a cable-supported beam simulating an actively controlled space structure; minimum-weight design of a beam with frequency constraints; minimization of the vibration response of a helicopter rotor blade; minimum-weight design of a turbine blade disk; aeroelastic optimization of an aircraft vertical fin; airfoil shape optimization for drag minimization; optimization of the shape of a hole in a plate for stress minimization; optimization to minimize beam dynamic response; and structural optimization of a low-vibration helicopter rotor.

  19. ASTM Validates Air Pollution Test Methods

    ERIC Educational Resources Information Center

    Chemical and Engineering News, 1973

    1973-01-01

    The American Society for Testing and Materials (ASTM) has validated six basic methods for measuring pollutants in ambient air as the first part of its Project Threshold. Aim of the project is to establish nationwide consistency in measuring pollutants; determining precision, accuracy and reproducibility of 35 standard measuring methods. (BL)

  20. A Practical Guide to Immunoassay Method Validation

    PubMed Central

    Andreasson, Ulf; Perret-Liaudet, Armand; van Waalwijk van Doorn, Linda J. C.; Blennow, Kaj; Chiasserini, Davide; Engelborghs, Sebastiaan; Fladby, Tormod; Genc, Sermin; Kruse, Niels; Kuiperij, H. Bea; Kulic, Luka; Lewczuk, Piotr; Mollenhauer, Brit; Mroczko, Barbara; Parnetti, Lucilla; Vanmechelen, Eugeen; Verbeek, Marcel M.; Winblad, Bengt; Zetterberg, Henrik; Koel-Simmelink, Marleen; Teunissen, Charlotte E.

    2015-01-01

Biochemical markers have a central position in the diagnosis and management of patients in clinical medicine, as well as in clinical research and drug development, including for brain disorders such as Alzheimer's disease. The enzyme-linked immunosorbent assay (ELISA) is frequently used for measurement of low-abundance biomarkers. However, the quality of ELISA methods varies, which may introduce both systematic and random errors. This underscores the need for more rigorous control of assay performance, regardless of whether the assay is used in a research setting, in clinical routine, or in drug development. The aim of a method validation is to present objective evidence that a method fulfills the requirements for its intended use. Although much has been published on which parameters to investigate in a method validation, less is available at a detailed level on how to perform the corresponding experiments. To remedy this, standard operating procedures (SOPs) with step-by-step instructions for a number of different validation parameters are included in the present work, together with a validation report template that allows for a well-ordered presentation of the results. Even though the SOPs were developed with immunochemical methods and multicenter evaluations in mind, most of them are generic and can be used for other technologies as well. PMID:26347708

  1. Fit-for-purpose bioanalytical cross-validation for LC-MS/MS assays in clinical studies.

    PubMed

    Xu, Xiaohui; Ji, Qin C; Jemal, Mohammed; Gleason, Carol; Shen, Jim X; Stouffer, Bruce; Arnold, Mark E

    2013-01-01

The paradigm shift of globalized research and conducting clinical studies at different geographic locations worldwide to access broader patient populations has resulted in an increased need to correlate bioanalytical results generated in multiple laboratories, often across national borders. Cross-validations of bioanalytical methods are often implemented to demonstrate the equivalence of bioanalytical results. Regulatory agencies, such as the US FDA and the European Medicines Agency, have included the requirement for cross-validation in their respective bioanalytical validation guidance and guidelines. While those documents provide high-level expectations, the detailed implementation is at the discretion of each individual organization. At Bristol-Myers Squibb, we practice a fit-for-purpose approach to cross-validation of small-molecule bioanalytical methods using LC-MS/MS. A step-by-step proposal covering the overall strategy, procedures and technical details for conducting a successful cross-validation is presented herein. A case study utilizing the proposed cross-validation approach to rule out method variability as the potential cause of high variance observed in PK studies is also presented. PMID:23256474

  2. Validation of qualitative microbiological test methods.

    PubMed

    IJzerman-Boon, Pieta C; van den Heuvel, Edwin R

    2015-01-01

This paper considers a statistical model for the detection mechanism of qualitative microbiological test methods with a parameter for the detection proportion (the probability to detect a single organism) and a parameter for the false positive rate. It is demonstrated that the detection proportion and the bacterial density cannot be estimated separately, not even in a multiple dilution experiment. Only the product can be estimated, changing the interpretation of the most probable number estimator. The asymptotic power of the likelihood ratio statistic for comparing an alternative method with the compendial method is optimal for a single dilution experiment. The bacterial density should either be close to two CFUs per test unit or equal to zero, depending on differences in the model parameters between the two test methods. The proposed strategy for method validation is to use these two dilutions and test for differences in the two model parameters, addressing the validation parameters specificity and accuracy. Robustness of these two parameters might still be required, but all other validation parameters can be omitted. A confidence interval-based approach for the ratio of the detection proportions of the two methods is recommended, since it is most informative and its power is close to that of the likelihood ratio test. PMID:25412584
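The identifiability problem the authors prove, that only the product of the detection proportion and the bacterial density can be estimated, can be illustrated with a standard Poisson detection model (the exact parametrization here is an assumption, not taken from the paper):

```python
import math

def p_positive(lam, p_detect, fpr=0.0):
    """Probability a test unit reads positive when organism counts are
    Poisson(lam) and each organism is detected independently with
    probability p_detect; fpr is the false positive rate."""
    return fpr + (1 - fpr) * (1 - math.exp(-p_detect * lam))

# Confounding: only the product p_detect * lam is identifiable.
# (lam=4, p=0.5) and (lam=2, p=1.0) give identical positive rates.
print(p_positive(4, 0.5) == p_positive(2, 1.0))  # True
```

No experiment that observes only positive/negative outcomes can separate the two factors, which is why the paper reinterprets the most probable number estimator as estimating the product.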

  3. Validation of an alternative microbiological method for tissue products.

    PubMed

    Suessner, Susanne; Hennerbichler, Simone; Schreiberhuber, Stefanie; Stuebl, Doris; Gabriel, Christian

    2014-06-01

According to the European Pharmacopoeia, sterility testing of products includes an incubation time of 14 days in thioglycollate medium and soya-bean casein medium, so a long period of time is needed for product testing. We therefore designed a study to evaluate an alternative method for sterility testing. The aim of this study was to reduce the incubation time for the routinely produced products in our tissue bank (cornea and amnion grafts) while obtaining the same detection limit, accuracy and recovery rates as the reference method described in the European Pharmacopoeia. The study included two steps of validation. The primary validation compared the reference method with the alternative method: eight bacterial and two fungal test strains were tested at their preferred milieu, using a geometric dilution series from 10 to 0.625 colony-forming units per 10 ml culture media. Following this evaluation, the second part of the study included validation of the fertility of the culture media and parallel testing of the two methods on products; for this purpose two product batches were tested in three independent runs. In the validation we found no deviation between the alternative and the reference method. In addition, the recovery rate of each microorganism was between 83.33 and 100 %. The alternative method showed non-inferiority to the reference method regarding accuracy. As a result of this study, we reduced sterility testing for cornea and amniotic grafts to 9 days. PMID:24810914

  4. Model-Based Method for Sensor Validation

    NASA Technical Reports Server (NTRS)

    Vatan, Farrokh

    2012-01-01

Fault detection, diagnosis, and prognosis are essential tasks in the operation of autonomous spacecraft, instruments, and in situ platforms. One of NASA's key mission requirements is robust state estimation. Sensing, using a wide range of sensors and sensor fusion approaches, plays a central role in robust state estimation, and there is a need to diagnose sensor failure as well as component failure. Sensor validation can be considered part of the larger effort of improving reliability and safety. The standard methods for solving the sensor validation problem are based on probabilistic analysis of the system, of which the method based on Bayesian networks is the most popular. These methods can only predict the most probable faulty sensors, and the predictions are subject to the initial probabilities defined for the failures. The method developed in this work is based on a model-based approach and provides the faulty sensors (if any) that can be logically inferred from the model of the system and the sensor readings (observations). The method is also more suitable for systems where it is hard, or even impossible, to find the probability functions of the system. The method starts with a new mathematical description of the problem and develops a very efficient and systematic algorithm for its solution. The method builds on the concept of analytical redundancy relations (ARRs).

  5. Validation methods for flight crucial systems

    NASA Technical Reports Server (NTRS)

    Holt, H. M.

    1983-01-01

Research to develop techniques that can aid in determining the reliability and performance of digital electronic fault-tolerant systems, which have a probability of catastrophic system failure on the order of 10^-9 at 10 hours, is reviewed. The computer-aided reliability estimation program (CARE III) provides general-purpose reliability analysis and a design tool for fault-tolerant systems; a large reduction of state size; and a fault-handling model based on a probabilistic description of detection, isolation, and recovery mechanisms. The application of design proof techniques as part of the design and development of the software-implemented fault-tolerance computer is mentioned. Emulation techniques and experimental procedures are verified using specimens of fault-tolerant computers and the capabilities of the validation research laboratory, AIRLAB.

  6. [Methods of risk assessment and their validation].

    PubMed

    Baracco, Alessandro

    2014-01-01

A review of the literature shows several methods for the risk assessment of biomechanical overload of the musculoskeletal system in activities involving repetitive strain of the upper limbs and manual material handling. The application of these methods should allow the quantification of risk for the working population, the identification of preventive measures to reduce the risk and of their effectiveness, and the design of a specific health surveillance scheme. In this paper we analyze the factors which must be taken into account in occupational medicine to implement a process of validation of these methods. In conclusion, we believe that new methods will be needed in the future that can analyze and reduce risk already in the design phase of the production process. PMID:25558718

  7. New method of deposition of biomolecules for bioelectronic purposes

    NASA Astrophysics Data System (ADS)

    Morales, P.; Sperandei, M.

    1994-02-01

    A laser induced plasma vaporization and ionization technique is proposed for electric field assisted deposition of proteins. Experiments were carried out depositing thick layers of lysozyme on metal, creating submillimeter patterns. The enzymatic activity of horseradish peroxidase deposited by this method was tested, together with that of the enzyme laccase from a micro-organism. The applicability of this method to the construction of nanometric patterns for bioelectronic purposes is discussed.

  8. Establishing the Content Validity of Tests Designed To Serve Multiple Purposes: Bridging Secondary-Postsecondary Mathematics.

    ERIC Educational Resources Information Center

    Burstein, Leigh; And Others

    A method is presented for determining the content validity of a series of secondary school mathematics tests. These tests are part of the Mathematics Diagnostic Testing Project (MDTP), a collaborative effort by California university systems to develop placement examinations and a means to document student preparation in mathematics. Content…

  9. Validation of analytical methods involved in dissolution assays: acceptance limits and decision methodologies.

    PubMed

    Rozet, E; Ziemons, E; Marini, R D; Boulanger, B; Hubert, Ph

    2012-11-01

Dissolution tests are key elements in ensuring continuing product quality and performance. The ultimate goal of these tests is to assure consistent product quality within a defined set of specification criteria. Validation of an analytical method aimed at assessing the dissolution profile of products, or at verifying pharmacopoeial compliance, should demonstrate that the method is able to correctly declare two dissolution profiles as similar or drug products as compliant with their specifications. It is essential to ensure that these analytical methods are fit for their purpose, and method validation is aimed at providing this guarantee. However, even in the ICH Q2 guideline there is no information explaining how to decide whether the method under validation is valid for its final purpose or not. Are all the validation criteria needed to ensure that a quality control (QC) analytical method for a dissolution test is valid? What acceptance limits should be set on these criteria? How should one decide on a method's validity? These are the questions this work aims to answer. Focus is placed on complying with the current implementation of Quality by Design (QbD) principles in the pharmaceutical industry, in order to correctly define the Analytical Target Profile (ATP) of analytical methods involved in dissolution tests. Analytical method validation then becomes the natural demonstration that the developed methods are fit for their intended purpose, rather than the unconsidered checklist approach still generally performed to complete the filing required to obtain product marketing authorization. PMID:23084050
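One common criterion, not detailed in the abstract, for declaring two dissolution profiles similar is the f2 similarity factor; a minimal sketch with hypothetical percent-dissolved profiles:

```python
import math

def f2(ref, test):
    """f2 similarity factor; profiles with f2 >= 50 are conventionally
    declared similar (roughly a <= 10% average point-wise difference)."""
    n = len(ref)
    msd = sum((r - t) ** 2 for r, t in zip(ref, test)) / n  # mean squared difference
    return 50 * math.log10(100 / math.sqrt(1 + msd))

# Hypothetical % dissolved at successive time points
print(round(f2([20, 45, 70, 90], [19, 46, 69, 91]), 1))  # → 92.5
```

Identical profiles give f2 = 100, and the score falls as the profiles diverge, which is what makes it a convenient single-number acceptance criterion.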

  10. Space Suit Joint Torque Measurement Method Validation

    NASA Technical Reports Server (NTRS)

    Valish, Dana; Eversley, Karina

    2012-01-01

    In 2009 and early 2010, a test method was developed and performed to quantify the torque required to manipulate joints in several existing operational and prototype space suits. This was done in an effort to develop joint torque requirements appropriate for a new Constellation Program space suit system. The same test method was levied on the Constellation space suit contractors to verify that their suit designs met the requirements. However, because the original test was set up and conducted by a single test operator, there was some question as to whether this method was repeatable enough to be considered a standard verification method for Constellation or other future development programs. In order to validate the method itself, a representative subset of the previous test was repeated, using the same information that would be available to space suit contractors, but set up and conducted by someone not familiar with the previous test. The resulting data were compared using graphical and statistical analysis; the results indicated a significant variance in the values reported for a subset of the re-tested joints. Potential variables that could have affected the data were identified, and a third round of testing was conducted in an attempt to eliminate and/or quantify the effects of these variables. The results of the third test effort will be used to determine whether the proposed joint torque methodology can be applied to future space suit development contracts.
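A simple operator-to-operator repeatability screen of the kind implied above, flagging joints whose re-tested torque departs from the original run by more than a tolerance, might look like this (joint names, torque values, and the 10% tolerance are all hypothetical):

```python
def flag_nonrepeatable(run_a, run_b, tolerance=0.10):
    """Return the joints whose re-tested torque differs from the
    original run by more than `tolerance` (as a fraction of the
    original value) -- a crude screen for repeatability problems.

    run_a, run_b: dicts mapping joint name -> measured torque (N·m).
    """
    flagged = []
    for joint, a in run_a.items():
        b = run_b[joint]
        rel = abs(a - b) / max(abs(a), 1e-12)
        if rel > tolerance:
            flagged.append(joint)
    return flagged

# Hypothetical torques (N·m) for three suit joints, two operators
op1 = {"shoulder": 12.0, "elbow": 6.0, "knee": 18.0}
op2 = {"shoulder": 12.5, "elbow": 7.5, "knee": 18.4}
print(flag_nonrepeatable(op1, op2))  # only the elbow exceeds 10%
```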

  11. Formal methods and digital systems validation for airborne systems

    NASA Technical Reports Server (NTRS)

    Rushby, John

    1993-01-01

    This report has been prepared to supplement a forthcoming chapter on formal methods in the FAA Digital Systems Validation Handbook. Its purpose is as follows: to outline the technical basis for formal methods in computer science; to explain the use of formal methods in the specification and verification of software and hardware requirements, designs, and implementations; to identify the benefits, weaknesses, and difficulties in applying these methods to digital systems used on board aircraft; and to suggest factors for consideration when formal methods are offered in support of certification. These latter factors assume the context for software development and assurance described in RTCA document DO-178B, 'Software Considerations in Airborne Systems and Equipment Certification,' Dec. 1992.

  12. Softcopy quality ruler method: implementation and validation

    NASA Astrophysics Data System (ADS)

    Jin, Elaine W.; Keelan, Brian W.; Chen, Junqing; Phillips, Jonathan B.; Chen, Ying

    2009-01-01

    A softcopy quality ruler method was implemented for the International Imaging Industry Association (I3A) Camera Phone Image Quality (CPIQ) Initiative. This work extends ISO 20462 Part 3 by virtue of creating reference digital images of known subjective image quality, complementing the hardcopy Standard Reference Stimuli (SRS). The softcopy ruler method was developed using images from a Canon EOS 1Ds Mark II D-SLR digital still camera (DSC) and a Kodak P880 point-and-shoot DSC. Images were viewed on an Apple 30-inch Cinema Display at a viewing distance of 34 inches. Ruler images were made for 16 scenes. Thirty ruler images were generated for each scene, representing ISO 20462 Standard Quality Scale (SQS) values of approximately 2 to 31 at an increment of one just noticeable difference (JND) by adjusting the system modulation transfer function (MTF). A Matlab GUI was developed to display the ruler and test images side-by-side with a user-adjustable ruler level controlled by a slider. A validation study was performed at Kodak, Vista Point Technology, and Aptina Imaging in which all three companies set up a similar viewing lab to run the softcopy ruler method. The results show that the three sets of data are in reasonable agreement with each other, with the differences within the range expected from observer variability. Compared to previous implementations of the quality ruler, the slider-based user interface allows approximately 2x faster assessments with 21.6% better precision.

  13. A special purpose knowledge-based face localization method

    NASA Astrophysics Data System (ADS)

    Hassanat, Ahmad; Jassim, Sabah

    2008-04-01

    This paper is concerned with face localization for a visual speech recognition (VSR) system. Face detection and localization have received a great deal of attention in the last few years, because they are an essential pre-processing step in many techniques that deal with faces (e.g., age, face, gender, race and visual speech recognition). We present an efficient method for localizing human faces in video images captured on constrained mobile devices under a wide variation in lighting conditions. We use a multiphase method that may include all or some of the following steps: image pre-processing, followed by special-purpose edge detection, then an image refinement step. The output image is passed through a discrete wavelet decomposition, and the computed LL sub-band at a certain level is transformed into a binary image that is scanned using a special template to select a number of candidate locations. Finally, we fuse the scores from the wavelet step with scores determined by color information for the candidate locations and employ a form of fuzzy logic to distinguish face from non-face locations. We present results of a large number of experiments to demonstrate that the proposed face localization method is efficient and achieves a level of accuracy that outperforms existing general-purpose face detection methods.
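The wavelet step can be illustrated with a toy approximation. The sketch below stands in for a full DWT by repeated 2x2 Haar-style averaging to form an LL sub-band, then binarizes at the sub-band mean; the test image and the mean threshold are illustrative assumptions, not the authors' exact procedure:

```python
import numpy as np

def ll_subband(img, level=2):
    """Approximate the wavelet LL sub-band by repeated 2x2 block
    averaging (an illustrative stand-in for a real Haar DWT)."""
    out = img.astype(float)
    for _ in range(level):
        h, w = out.shape
        out = out[: h - h % 2, : w - w % 2]       # drop odd row/col
        out = 0.25 * (out[0::2, 0::2] + out[1::2, 0::2]
                      + out[0::2, 1::2] + out[1::2, 1::2])
    return out

def binarize(subband):
    """Threshold the sub-band at its mean, giving the binary image
    that a candidate-selection template would scan."""
    return (subband > subband.mean()).astype(np.uint8)

img = np.zeros((16, 16))
img[4:12, 4:12] = 200                 # bright "face-like" region
mask = binarize(ll_subband(img, level=2))
print(mask.shape, int(mask.sum()))    # → (4, 4) 4
```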

  14. Tripless control method for general-purpose inverters

    SciTech Connect

    Mutoh, N.; Ueda, A. (Hitachi Research Lab.); Nandoh, K.; Ibori, S.

    1992-09-01

    This paper describes a new control method that prevents general-purpose inverters without current regulators from tripping, no matter how their load varies, and enables motors to rotate stably at high frequencies. The control is performed using only current sensors and combines PWM control with torque control. The PWM control switches from an asynchronized mode to a synchronized mode when the modulation ratio exceeds one, which enables the carrier-wave frequency to be varied continuously with the inverter frequency. As a result, motors can rotate stably over a wide frequency range. The torque control uses a real and reactive component detector, a magnetic flux compensator, a slip compensator, and a current limit controller.

  15. FDIR Strategy Validation with the B Method

    NASA Astrophysics Data System (ADS)

    Sabatier, D.; Dellandrea, B.; Chemouil, D.

    2008-08-01

    In a formation-flying satellite system, the FDIR (Failure Detection, Isolation and Recovery) strategy is paramount. When a failure occurs, satellites should be able to take appropriate reconfiguration actions to obtain the best possible result given the failure, ranging from avoiding satellite-to-satellite collision to continuing the mission without disturbance if possible. To achieve this goal, each satellite in the formation implements an FDIR strategy that governs how it detects failures (from tests or by deduction) and how it reacts (reconfiguration using redundant equipment, avoidance manoeuvres, etc.). The goal is to protect the satellites first and the mission as much as possible. In a project initiated by CNES, ClearSy is experimenting with the B Method to validate the FDIR strategies, developed by Thales Alenia Space, of the inter-satellite positioning and communication devices that will be used for the SIMBOL-X (2-satellite configuration) and PEGASE (3-satellite configuration) missions, and potentially for other missions afterward. These radio-frequency metrology sensor devices provide satellite positioning and inter-satellite communication in formation flying. This article presents the results of this experiment.

  16. Some useful statistical methods for model validation.

    PubMed Central

    Marcus, A H; Elias, R W

    1998-01-01

    Although formal hypothesis tests provide a convenient framework for displaying the statistical results of empirical comparisons, standard tests should not be used without consideration of the underlying measurement error structure. As part of the validation process, predictions of individual blood lead concentrations from models with site-specific input parameters are often compared with blood lead concentrations measured in field studies that also report lead concentrations in environmental media (soil, dust, water, paint) as surrogates for exposure. Measurements of these environmental media are subject to several sources of variability, including temporal and spatial sampling, sample preparation and chemical analysis, and data entry or recording. Adjustments for measurement error must be made before statistical tests can be used to empirically compare environmental data with model predictions. This report illustrates the effect of measurement error correction using a real dataset of child blood lead concentrations for an undisclosed midwestern community. We illustrate both the apparent failure of some standard regression tests and the success of adjusting such tests for measurement error using the SIMEX (simulation-extrapolation) procedure. This procedure adds simulated measurement error to model predictions and then subtracts the total measurement error, analogous to the method of standard additions used by analytical chemists. PMID:9860913
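The SIMEX idea can be sketched for a simple regression slope: add *extra* simulated error at increasing inflation factors, watch the slope attenuate further, and extrapolate back to the no-error case. The true slope, error variances, and quadratic extrapolant below are illustrative assumptions, not the report's data:

```python
import numpy as np

rng = np.random.default_rng(0)

# True relationship y = 2*x, but x is observed with known error sigma_u
n, sigma_u = 2000, 0.5
x_true = rng.normal(0, 1, n)
y = 2.0 * x_true + rng.normal(0, 0.2, n)
x_obs = x_true + rng.normal(0, sigma_u, n)

def slope(x, y):
    """OLS slope = cov(x, y) / var(x)."""
    return np.cov(x, y)[0, 1] / np.var(x, ddof=1)

# SIMEX: add extra simulated error scaled by lambda, average over reps
lambdas = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
slopes = []
for lam in lambdas:
    reps = [slope(x_obs + rng.normal(0, np.sqrt(lam) * sigma_u, n), y)
            for _ in range(50)]
    slopes.append(np.mean(reps))

# Quadratic extrapolation of slope(lambda) back to lambda = -1
coeffs = np.polyfit(lambdas, slopes, 2)
naive = slopes[0]
simex = np.polyval(coeffs, -1.0)
print(round(naive, 2), round(simex, 2))  # naive slope is attenuated; SIMEX is closer to 2
```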

  17. Validation of Multilevel Constructs: Validation Methods and Empirical Findings for the EDI

    ERIC Educational Resources Information Center

    Forer, Barry; Zumbo, Bruno D.

    2011-01-01

    The purposes of this paper are to highlight the foundations of multilevel construct validation, describe two methodological approaches and associated analytic techniques, and then apply these approaches and techniques to the multilevel construct validation of a widely-used school readiness measure called the Early Development Instrument (EDI;…

  18. Methods for developing and validating survivability distributions

    SciTech Connect

    Williams, R.L.

    1993-10-01

    A previous report explored and discussed statistical methods and procedures that may be applied to validate the survivability of a complex system of systems that cannot be tested as an entity. It described a methodology where Monte Carlo simulation was used to develop the system survivability distribution from the component distributions using a system model that registers the logical interactions of the components to perform system functions. This paper discusses methods that can be used to develop the required survivability distributions based upon three sources of knowledge. These are (1) available test results; (2) little or no available test data, but a good understanding of the physical laws and phenomena which can be applied by computer simulation; and (3) neither test data nor adequate knowledge of the physics are known, in which case, one must rely upon, and quantify, the judgement of experts. This paper describes the relationship between the confidence bounds that can be placed on survivability and the number of tests conducted. It discusses the procedure for developing system level survivability distributions from the distributions for lower levels of integration. It demonstrates application of these techniques by defining a communications network for a Hypothetical System Architecture. A logic model for the performance of this communications network is developed, as well as the survivability distributions for the nodes and links based on two alternate data sets, reflecting the effects of increased testing of all elements. It then shows how this additional testing could be optimized by concentrating only on those elements contained in the low-order fault sets which the methodology identifies.
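The Monte Carlo approach described, propagating component survivability through a logic model of component interactions, can be sketched with a toy network (the topology and probabilities below are hypothetical, not the paper's Hypothetical System Architecture):

```python
import random

random.seed(1)

def system_survives(p):
    """Logic model for a toy communications network: the source
    reaches the sink via path A (nodes a1 AND a2) OR path B (node b1).
    `p` maps component name -> boolean survive/fail draw."""
    path_a = p["a1"] and p["a2"]
    path_b = p["b1"]
    return path_a or path_b

# Component survival probabilities (hypothetical)
prob = {"a1": 0.9, "a2": 0.9, "b1": 0.8}

trials = 100_000
hits = 0
for _ in range(trials):
    draw = {c: random.random() < pc for c, pc in prob.items()}
    hits += system_survives(draw)

estimate = hits / trials
# Closed form for two independent parallel paths: 1 - q_A * q_B
exact = 1 - (1 - 0.9 * 0.9) * (1 - 0.8)
print(round(estimate, 3), round(exact, 3))
```

For a toy network the closed form is available as a check; the point of the simulation is that the same logic-model function scales to systems where no closed form exists.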

  19. OWL-based reasoning methods for validating archetypes.

    PubMed

    Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás

    2013-04-01

    Some modern Electronic Healthcare Record (EHR) architectures and standards are based on the dual model-based architecture, which defines two conceptual levels: reference model and archetype model. Such architectures represent EHR domain knowledge by means of archetypes, which are considered by many researchers to play a fundamental role in achieving semantic interoperability in healthcare. Consequently, formal methods for validating archetypes are necessary. In recent years, there has been increasing interest in exploring how semantic web technologies in general, and ontologies in particular, can facilitate the representation and management of archetypes, including binding to terminologies, but no solution based on such technologies had been provided to date to validate archetypes. Our approach represents archetypes by means of OWL ontologies. This permits combining the two levels of the dual model-based architecture in one modeling framework that can also integrate terminologies available in OWL format. The validation method consists of reasoning on those ontologies to find modeling errors in archetypes: incorrect restrictions over the reference model, non-conformant archetype specializations, and inconsistent terminological bindings. The archetypes available in the repositories supported by the openEHR Foundation and the NHS Connecting for Health Program, the two largest publicly available repositories, have been analyzed with our validation method. For this purpose, we implemented a software tool called Archeck. Our results show that around 1/5 of archetype specializations contain modeling errors, the most common mistakes being related to coded terms and terminological bindings. The analysis of each repository reveals that different patterns of errors are found in the two repositories. This result reinforces the need for serious efforts to improve archetype design processes. PMID:23246613

  20. The method of measurement system software automatic validation using business rules management system

    NASA Astrophysics Data System (ADS)

    Zawistowski, Piotr

    2015-09-01

    A method for automatic validation of measurement system software using a business rules management system (BRMS) is discussed in this paper. The article contains a description of this new approach to validating measurement system execution, a description of the implementation of the system that supports the validation, and examples documenting the correctness of the approach. In the new approach, a BRMS is used to validate measurement system execution. Such systems have previously been used neither for software execution validation nor for measurement systems. The benefits of using them for these purposes are discussed as well.

  1. Purpose and methods of a Pollution Prevention Awareness Program

    SciTech Connect

    Flowers, P.A.; Irwin, E.F.; Poligone, S.E.

    1994-08-15

    The purpose of the Pollution Prevention Awareness Program (PPAP), which is required by DOE Order 5400.1, is to foster the philosophy that prevention is superior to remediation. The goal of the program is to incorporate pollution prevention into the decision-making process at every level throughout the organization. The objectives are to instill awareness, disseminate information, provide training and rewards for identifying the true source or cause of wastes, and encourage employee participation in solving environmental issues and preventing pollution. PPAP at the Oak Ridge Y-12 Plant was created several years ago and continues to grow. We believe that we have implemented several unique methods of communicating environmental awareness to promote a more active work force in identifying ways of reducing pollution.

  2. Evaluating regional vulnerability to climate change: purposes and methods

    SciTech Connect

    Malone, Elizabeth L.; Engle, Nathan L.

    2011-03-15

    As the emphasis in climate change research, international negotiations, and developing-country activities has shifted from mitigation to adaptation, vulnerability has emerged as a bridge between impacts on one side and the need for adaptive changes on the other. Still, the term vulnerability remains abstract, its meaning changing with the scale, focus, and purpose of each assessment. Understanding regional vulnerability has advanced over the past several decades, with studies using a combination of indicators, case studies and analogues, stakeholder-driven processes, and scenario-building methodologies. As regions become increasingly relevant scales of inquiry for bridging the aggregate and local, for every analysis, it is perhaps most appropriate to ask three “what” questions: “What/who is vulnerable?,” “What is vulnerability?,” and “Vulnerable to what?” The answers to these questions will yield different definitions of vulnerability as well as different methods for assessing it.

  3. Validation of a Theoretical Model of Diagnostic Classroom Assessment: A Mixed Methods Study

    ERIC Educational Resources Information Center

    Koh, Nancy

    2012-01-01

    The purpose of the study was to validate a theoretical model of diagnostic, formative classroom assessment called, "Proximal Assessment for Learner Diagnosis" (PALD). To achieve its purpose, the study employed a two-stage, mixed-methods design. The study utilized multiple data sources from 11 elementary level mathematics teachers who…

  4. The Value of Qualitative Methods in Social Validity Research

    ERIC Educational Resources Information Center

    Leko, Melinda M.

    2014-01-01

    One quality indicator of intervention research is the extent to which the intervention has a high degree of social validity, or practicality. In this study, I drew on Wolf's framework for social validity and used qualitative methods to ascertain five middle schoolteachers' perceptions of the social validity of System 44®--a phonics-based…

  5. A Clinical Method for Identifying Scapular Dyskinesis, Part 2: Validity

    PubMed Central

    Tate, Angela R; McClure, Philip; Kareha, Stephen; Irwin, Dominic; Barbe, Mary F

    2009-01-01

    Context: Although clinical methods for detecting scapular dyskinesis have been described, evidence supporting the validity of these methods is lacking. Objective: To determine the validity of the scapular dyskinesis test, a visually based method of identifying abnormal scapular motion. A secondary purpose was to explore the relationship between scapular dyskinesis and shoulder symptoms. Design: Validation study comparing 3-dimensional measures of scapular motion among participants clinically judged as having either normal motion or scapular dyskinesis. Setting: University athletic training facilities. Patients or Other Participants: A sample of 142 collegiate athletes (National Collegiate Athletic Association Division I and Division III) participating in sports requiring overhead use of the arm was rated, and 66 of these underwent 3-dimensional testing. Intervention(s): Volunteers were viewed by 2 raters while performing weighted shoulder flexion and abduction. The right and left sides were rated independently as normal, subtle dyskinesis, or obvious dyskinesis using the scapular dyskinesis test. Symptoms were assessed using the Penn Shoulder Score. Main Outcome Measure(s): Athletes judged as having either normal motion or obvious dyskinesis underwent 3-dimensional electromagnetic kinematic testing while performing the same movements. The kinematic data from both groups were compared via multifactor analysis of variance with post hoc testing using the least significant difference procedure. The relationship between symptoms and scapular dyskinesis was evaluated by odds ratios. Results: Differences were found between the normal and obvious dyskinesis groups. Participants with obvious dyskinesis showed less scapular upward rotation (P < .001), less clavicular elevation (P < .001), and greater clavicular protraction (P  =  .044). The presence of shoulder symptoms was not different between the normal and obvious dyskinesis volunteers (odds ratio  =  0.79, 95
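The odds-ratio analysis used above to relate dyskinesis to symptoms can be sketched from a 2x2 table; the counts below are hypothetical, not the study's data, and the confidence interval uses Woolf's logit method:

```python
import math

def odds_ratio(exposed_cases, exposed_noncases,
               unexposed_cases, unexposed_noncases):
    """Odds ratio with a 95% CI (Woolf's logit method) from a 2x2 table."""
    a, b = exposed_cases, exposed_noncases
    c, d = unexposed_cases, unexposed_noncases
    orr = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
    lo = math.exp(math.log(orr) - 1.96 * se)
    hi = math.exp(math.log(orr) + 1.96 * se)
    return orr, lo, hi

# Hypothetical counts: symptomatic vs. asymptomatic athletes,
# split by obvious dyskinesis (exposed) vs. normal motion (unexposed)
orr, lo, hi = odds_ratio(10, 20, 12, 24)
print(round(orr, 2))   # a CI spanning 1 means no detectable association
```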

  6. Statistical Methods for Rapid Aerothermal Analysis and Design Technology: Validation

    NASA Technical Reports Server (NTRS)

    DePriest, Douglas; Morgan, Carolyn

    2003-01-01

    The cost and safety goals for NASA's next generation of reusable launch vehicles (RLV) will require that rapid high-fidelity aerothermodynamic design tools be used early in the design cycle. To meet these requirements, it is desirable to identify adequate statistical models that quantify and improve the accuracy, extend the applicability, and enable combined analyses using existing prediction tools. The initial research work focused on establishing suitable candidate models for these purposes. The second phase is focused on assessing the performance of these models in accurately predicting the heat rate for a given candidate data set. This validation work compared models and methods that may be useful in predicting the heat rate.

  7. Estimates of External Validity Bias When Impact Evaluations Select Sites Purposively

    ERIC Educational Resources Information Center

    Stuart, Elizabeth A.; Olsen, Robert B.; Bell, Stephen H.; Orr, Larry L.

    2012-01-01

    While there has been some increasing interest in external validity, most work to this point has been in assessing the similarity of a randomized trial sample and a population of interest (e.g., Stuart et al., 2010; Tipton, 2011). The goal of this research is to calculate empirical estimates of the external validity bias in educational intervention…

  8. Development and Validation of a Reading-Related Assessment Battery in Malay for the Purpose of Dyslexia Assessment

    ERIC Educational Resources Information Center

    Lee, Lay Wah

    2008-01-01

    Malay is an alphabetic language with transparent orthography. A Malay reading-related assessment battery which was conceptualised based on the International Dyslexia Association definition of dyslexia was developed and validated for the purpose of dyslexia assessment. The battery consisted of ten tests: Letter Naming, Word Reading, Non-word…

  9. Purpose in Life in Emerging Adulthood: Development and Validation of a New Brief Measure

    PubMed Central

    Hill, Patrick L.; Edmonds, Grant W.; Peterson, Missy; Luyckx, Koen; Andrews, Judy A.

    2015-01-01

    Accruing evidence points to the value of studying purpose in life across adolescence and emerging adulthood. Research though is needed to understand the unique role of purpose in life in predicting well-being and developmentally relevant outcomes during emerging adulthood. The current studies (total n = 669) found support for the development of a new brief measure of purpose in life using data from American and Canadian samples, while demonstrating evidence for two important findings. First, purpose in life predicted well-being during emerging adulthood, even when controlling for the Big Five personality traits. Second, purpose in life was positively associated with self-image and negatively associated with delinquency, again controlling for personality traits. Findings are discussed with respect to how studying purpose in life can help understand which individuals are more likely to experience positive transitions into adulthood. PMID:26958072

  10. Illustrating a Mixed-Method Approach for Validating Culturally Specific Constructs

    ERIC Educational Resources Information Center

    Hitchcock, J.H.; Nastasi, B.K.; Dai, D.Y.; Newman, J.; Jayasena, A.; Bernstein-Moore, R.; Sarkar, S.; Varjas, K.

    2005-01-01

    The purpose of this article is to illustrate a mixed-method approach (i.e., combining qualitative and quantitative methods) for advancing the study of construct validation in cross-cultural research. The article offers a detailed illustration of the approach using the responses 612 Sri Lankan adolescents provided to an ethnographic survey. Such…

  11. External validation of a Cox prognostic model: principles and methods

    PubMed Central

    2013-01-01

    Background A prognostic model should not enter clinical practice unless it has been demonstrated that it performs a useful role. External validation denotes evaluation of model performance in a sample independent of that used to develop the model. Unlike for logistic regression models, external validation of Cox models is sparsely treated in the literature. Successful validation of a model means achieving satisfactory discrimination and calibration (prediction accuracy) in the validation sample. Validating Cox models is not straightforward because event probabilities are estimated relative to an unspecified baseline function. Methods We describe statistical approaches to external validation of a published Cox model according to the level of published information, specifically (1) the prognostic index only, (2) the prognostic index together with Kaplan-Meier curves for risk groups, and (3) the first two plus the baseline survival curve (the estimated survival function at the mean prognostic index across the sample). The most challenging task, requiring level 3 information, is assessing calibration, for which we suggest a method of approximating the baseline survival function. Results We apply the methods to two comparable datasets in primary breast cancer, treating one as derivation and the other as validation sample. Results are presented for discrimination and calibration. We demonstrate plots of survival probabilities that can assist model evaluation. Conclusions Our validation methods are applicable to a wide range of prognostic studies and provide researchers with a toolkit for external validation of a published Cox model. PMID:23496923

  12. Bioanalytical method validation: An updated review.

    PubMed

    Tiwari, Gaurav; Tiwari, Ruchi

    2010-10-01

    The development of sound bioanalytical method(s) is of paramount importance during the process of drug discovery and development, culminating in a marketing approval. The objective of this paper is to review the sample preparation of drugs in biological matrices and to provide practical approaches for determining selectivity, specificity, limit of detection, lower limit of quantitation, linearity, range, accuracy, precision, recovery, stability, ruggedness, and robustness of liquid chromatographic methods to support pharmacokinetic (PK), toxicokinetic, bioavailability, and bioequivalence studies. Bioanalysis, employed for the quantitative determination of drugs and their metabolites in biological fluids, plays a significant role in the evaluation and interpretation of bioequivalence, PK, and toxicokinetic studies. Selective and sensitive analytical methods for quantitative evaluation of drugs and their metabolites are critical for the successful conduct of pre-clinical and/or biopharmaceutics and clinical pharmacology studies. PMID:23781413
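Several of the listed criteria can be estimated directly from a calibration curve. The sketch below uses the common ICH-style formulas LOD = 3.3·σ/S and LOQ = 10·σ/S, where σ is the residual standard deviation of the regression and S its slope; the standards and responses are hypothetical:

```python
import numpy as np

# Calibration standards: nominal concentration vs. instrument response
conc     = np.array([0.0, 0.5, 1.0, 2.0, 5.0, 10.0])     # e.g. ng/mL
response = np.array([0.02, 0.26, 0.51, 1.00, 2.49, 5.02])

slope, intercept = np.polyfit(conc, response, 1)
residuals = response - (slope * conc + intercept)
sigma = residuals.std(ddof=2)          # residual SD of the regression

# ICH-style detection and quantitation limits from the curve
lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope

# Linearity check via the correlation coefficient
r = np.corrcoef(conc, response)[0, 1]
print(r > 0.999, lod < loq)
```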

  13. Design and validation of a general purpose robotic testing system for musculoskeletal applications.

    PubMed

    Noble, Lawrence D; Colbrunn, Robb W; Lee, Dong-Gil; van den Bogert, Antonie J; Davis, Brian L

    2010-02-01

    Orthopaedic research on in vitro forces applied to bones, tendons, and ligaments during joint loading has been difficult to perform because of limitations with existing robotic simulators in applying full-physiological loading to the joint under investigation in real time. The objectives of the current work are as follows: (1) describe the design of a musculoskeletal simulator developed to support in vitro testing of cadaveric joint systems, (2) provide component and system-level validation results, and (3) demonstrate the simulator's usefulness for specific applications of the foot-ankle complex and knee. The musculoskeletal simulator allows researchers to simulate a variety of loading conditions on cadaver joints via motorized actuators that simulate muscle forces while simultaneously contacting the joint with an external load applied by a specialized robot. Multiple foot and knee studies have been completed at the Cleveland Clinic to demonstrate the simulator's capabilities. Using a variety of general-use components, experiments can be designed to test other musculoskeletal joints as well (e.g., hip, shoulder, facet joints of the spine). The accuracy of the tendon actuators to generate a target force profile during simulated walking was found to be highly variable and dependent on stance position. Repeatability (the ability of the system to generate the same tendon forces when the same experimental conditions are repeated) results showed that repeat forces were within the measurement accuracy of the system. It was determined that synchronization system accuracy was 6.7 ± 2.0 ms and was based on timing measurements from the robot and tendon actuators. The positioning error of the robot ranged from 10 μm to 359 μm, depending on measurement condition (e.g., loaded or unloaded, quasistatic or dynamic motion, centralized movements or extremes of travel, maximum value, or root-mean-square, and x-, y- or z-axis motion). Algorithms and methods for controlling

  14. Exploring valid and reliable assessment methods for care management education.

    PubMed

    Gennissen, Lokke; Stammen, Lorette; Bueno-de-Mesquita, Jolien; Wieringa, Sietse; Busari, Jamiu

    2016-07-01

    Purpose It is assumed that the use of valid and reliable assessment methods can facilitate the development of medical residents' management and leadership competencies. To justify this assertion, the perceptions of an expert panel of health care leaders were explored regarding assessment methods used for evaluating care management (CM) development in Dutch residency programs. This paper aims to investigate how assessors and trainees value these methods and to examine any inherent benefits or shortcomings when they are applied in practice. Design/methodology/approach A Delphi survey was conducted among members of the platform for medical leadership in The Netherlands. This panel of experts was made up of clinical educators, practitioners and residents interested in CM education. Findings Of the respondents, 40 (55.6 per cent) and 31 (43 per cent) participated in the first and second rounds of the Delphi survey, respectively. The respondents agreed that assessment methods currently being used to measure residents' CM competencies were weak, though feasible for use in many residency programs. Multi-source feedback (MSF, 92.1 per cent), portfolio/e-portfolio (86.8 per cent) and knowledge testing (76.3 per cent) were identified as the most commonly known assessment methods, with familiarity rates exceeding 75 per cent. Practical implications The findings suggested that an "assessment framework" comprising MSF, portfolios, individual process improvement projects or self-reflections, and observations in clinical practice should be used to measure CM competencies in residents. Originality/value This study reaffirms the need for objective methods to assess CM skills in post-graduate medical education, as no single assessment method stood out as the best instrument. PMID:27397747

  15. Comparative assessment of bioanalytical method validation guidelines for pharmaceutical industry.

    PubMed

    Kadian, Naveen; Raju, Kanumuri Siva Rama; Rashid, Mamunur; Malik, Mohd Yaseen; Taneja, Isha; Wahajuddin, Muhammad

    2016-07-15

    The concepts, importance, and application of bioanalytical method validation have been discussed for a long time, and validation of bioanalytical methods is widely accepted as pivotal before they are taken into routine use. The United States Food and Drug Administration (USFDA) guideline issued in 2001 has been referenced by every guideline released since, be it from the European Medicines Agency (EMA) in Europe, the National Health Surveillance Agency (ANVISA) in Brazil, the Ministry of Health, Labour and Welfare (MHLW) in Japan, or any other authority addressing bioanalytical method validation. After 12 years, the USFDA released a new draft guideline for comment in 2013, which covers the latest parameters and topics encountered in bioanalytical method validation and moves towards harmonization of bioanalytical method validation across the globe. Even though the regulatory agencies are in general agreement, significant variations exist in acceptance criteria and methodology. The present review highlights the variations, similarities and comparisons between the bioanalytical method validation guidelines issued by the major regulatory authorities worldwide. Additionally, other evaluation parameters such as matrix effect and incurred sample reanalysis, including other stability aspects, are discussed to provide ready access for designing a bioanalytical method and its validation in compliance with the majority of drug authority guidelines. PMID:27179186

  16. Method validation for chemical composition determination by electron microprobe with wavelength dispersive spectrometer

    NASA Astrophysics Data System (ADS)

    Herrera-Basurto, R.; Mercader-Trejo, F.; Muñoz-Madrigal, N.; Juárez-García, J. M.; Rodriguez-López, A.; Manzano-Ramírez, A.

    2016-07-01

    The main goal of method validation is to demonstrate that a method is suitable for its intended purpose. One advantage of analytical method validation is the level of confidence it provides in the measurement results reported to satisfy a specific objective. Elemental composition determination by wavelength dispersive spectrometer (WDS) microanalysis has been used across extremely wide areas, mainly in the field of materials science and in impurity determinations in geological, biological and food samples. However, little information is reported about the validation of the applied methods. Herein, results of the in-house method validation for elemental composition determination by WDS are shown. SRM 482, a set of Cu-Au binary alloys of different compositions, was used during the validation protocol, following the recommendations for method validation proposed by Eurachem. This paper can be taken as a reference for the evaluation of the validation parameters most frequently requested for accreditation under the requirements of the ISO/IEC 17025 standard: selectivity, limit of detection, linear interval, sensitivity, precision, trueness and uncertainty. A model for uncertainty estimation was proposed, including systematic and random errors. In addition, parameters evaluated during the validation process were also considered as part of the uncertainty model.
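The uncertainty model described combines systematic and random errors; the standard way to do this (as in GUM-style uncertainty budgets) is to add independent standard uncertainties in quadrature. A minimal sketch follows; the function names and the numerical values are illustrative assumptions, not figures from the paper:

```python
import math

def combined_uncertainty(u_random, u_systematic):
    """Combine independent random and systematic standard uncertainties
    in quadrature: u_c = sqrt(u_r^2 + u_s^2)."""
    return math.sqrt(u_random ** 2 + u_systematic ** 2)

def expanded_uncertainty(u_c, k=2):
    """Expanded uncertainty at coverage factor k (k=2 gives ~95% coverage
    for an approximately normal distribution)."""
    return k * u_c

# Hypothetical budget for a mass-fraction result (in wt. %):
u_c = combined_uncertainty(u_random=0.15, u_systematic=0.10)
U = expanded_uncertainty(u_c)  # value reported as x +/- U
```

The quadrature sum reflects the assumption that the two error sources are independent; correlated contributions would need covariance terms.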

  17. Validation of Analytical Methods for Biomarkers Employed in Drug Development

    PubMed Central

    Chau, Cindy H.; Rixe, Olivier; McLeod, Howard; Figg, William D.

    2008-01-01

    The role of biomarkers in drug discovery and development has gained precedence over the years. As biomarkers become integrated into drug development and clinical trials, quality assurance and in particular assay validation become essential, with the need to establish standardized guidelines for analytical methods used in biomarker measurements. New biomarkers can revolutionize both the development and use of therapeutics, but this is contingent upon the establishment of a concrete validation process that addresses technology integration and method validation as well as regulatory pathways for efficient biomarker development. This perspective focuses on the general principles of the biomarker validation process, with an emphasis on assay validation and the collaborative efforts undertaken by various sectors to promote the standardization of this procedure for efficient biomarker development. PMID:18829475

  18. Validation of population-based disease simulation models: a review of concepts and methods

    PubMed Central

    2010-01-01

    Background Computer simulation models are used increasingly to support public health research and policy, but questions about their quality persist. The purpose of this article is to review the principles and methods for validation of population-based disease simulation models. Methods We developed a comprehensive framework for validating population-based chronic disease simulation models and used this framework in a review of published model validation guidelines. Based on the review, we formulated a set of recommendations for gathering evidence of model credibility. Results Evidence of model credibility derives from examining: 1) the process of model development, 2) the performance of a model, and 3) the quality of decisions based on the model. Many important issues in model validation are insufficiently addressed by current guidelines. These issues include a detailed evaluation of different data sources, graphical representation of models, computer programming, model calibration, between-model comparisons, sensitivity analysis, and predictive validity. The role of external data in model validation depends on the purpose of the model (e.g., decision analysis versus prediction). More research is needed on the methods of comparing the quality of decisions based on different models. Conclusion As the role of simulation modeling in population health is increasing and models are becoming more complex, there is a need for further improvements in model validation methodology and common standards for evaluating model credibility. PMID:21087466

  19. Unexpected but Most Welcome: Mixed Methods for the Validation and Revision of the Participatory Evaluation Measurement Instrument

    ERIC Educational Resources Information Center

    Daigneault, Pierre-Marc; Jacob, Steve

    2014-01-01

    Although combining methods is nothing new, more contributions about why and how to mix methods for validation purposes are needed. This article presents a case of validating the inferences drawn from the Participatory Evaluation Measurement Instrument, an instrument that purports to measure stakeholder participation in evaluation. Although the…

  20. Validity Argument for Assessing L2 Pragmatics in Interaction Using Mixed Methods

    ERIC Educational Resources Information Center

    Youn, Soo Jung

    2015-01-01

    This study investigates the validity of assessing L2 pragmatics in interaction using mixed methods, focusing on the evaluation inference. Open role-plays that are meaningful and relevant to the stakeholders in an English for Academic Purposes context were developed for classroom assessment. For meaningful score interpretations and accurate…

  1. Reliability and validity of optoelectronic method for biophotonical measurements

    NASA Astrophysics Data System (ADS)

    Karpienko, Katarzyna; Wróbel, Maciej S.; Urniaż, Rafał

    2013-11-01

    Reliability and validity of measurements are of utmost importance when assessing the measuring capability of instruments developed for research. For an experiment to be legitimate, the instruments used must be both reliable and valid. Reliability estimates the degree of precision of a measurement, the extent to which the measurement is internally consistent. Validity is the usefulness of an instrument for performing accurate measurements of the quantities it was designed to measure. A statistical analysis for reliability and validity control of a low-coherence interferometry method for refractive index measurements of biological fluids is presented. The low-coherence interferometer is sensitive to the optical path difference between interfering beams. This difference depends on the refractive index of the measured material. To assess the validity and reliability of the proposed method for blood measurements, a statistical analysis was performed on several substances with known refractive indices. Analysis of the low-coherence interferograms considered the mean distances between fringes. The statistical analysis for validity and reliability consisted of Grubbs' test for outliers, the Shapiro-Wilk test for normal distribution, Student's t-test, standard deviation, coefficient of determination and the Pearson correlation coefficient. Overall, the tests demonstrated the high statistical significance of the measurement method at a significance level below 0.0001.
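Two of the statistics in the battery above, the Pearson correlation and the coefficient of determination, can be sketched directly. The refractive-index values below are hypothetical calibration data invented for illustration; they are not measurements from the paper:

```python
import math

def pearson_r(x, y):
    """Pearson correlation between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical data: known refractive indices of reference fluids vs.
# values recovered from the mean fringe spacing of the interferograms.
known    = [1.3330, 1.3440, 1.3560, 1.3680, 1.3790]
measured = [1.3328, 1.3443, 1.3557, 1.3684, 1.3788]

r = pearson_r(known, measured)
r2 = r ** 2  # coefficient of determination
```

A near-unity r and r2 between reference and recovered values is the kind of evidence of validity the abstract refers to; the outlier and normality tests (Grubbs', Shapiro-Wilk) would be run on the residuals.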

  2. High Explosive Verification and Validation: Systematic and Methodical Approach

    NASA Astrophysics Data System (ADS)

    Scovel, Christina; Menikoff, Ralph

    2011-06-01

    Verification and validation of high explosive (HE) models does not fit the standard mold for several reasons. First, there are no non-trivial test problems with analytic solutions. Second, an HE model depends on a burn rate and the equations of state (EOS) of both the reactants and products. Third, there is a wide range of detonation phenomena, from initiation under various stimuli to propagation of curved detonation fronts with non-rigid confining materials. Fourth, in contrast to a shock wave in a non-reactive material, the reaction-zone width is physically significant and affects the behavior of a detonation wave. Because of these issues, a systematic and methodical approach to HE V & V is needed. Our plan is to build a test suite from the ground up. We have started with the cylinder test and have run simulations with several EOS models and burn models. We have compared with data and cross-compared the different runs to check the sensitivity to model parameters. A related issue for V & V is what experimental data are available for calibrating and testing models. For this purpose we have started a web-based high explosive database (HED). The current status of HED will be discussed.

  3. Development and validation of a new fallout transport method using variable spectral winds. Doctoral thesis

    SciTech Connect

    Hopkins, A.T.

    1984-09-01

    The purpose of this research was to develop and validate a fallout prediction method using variable transport calculations. The new method uses National Meteorological Center (NMC) spectral coefficients to compute wind vectors along the space- and time-varying trajectories of falling particles. The method was validated by comparing computed and actual cloud trajectories from a Mount St. Helens volcanic eruption and a high dust cloud. In summary, this research demonstrated the feasibility of using spectral coefficients for fallout transport calculations, developed a two-step smearing model to treat variable winds, and showed that uncertainties in spectral winds do not contribute significantly to the error in computed dose rate.

  4. An introduction to clinical microeconomic analysis: purposes and analytic methods.

    PubMed

    Weintraub, W S; Mauldin, P D; Becker, E R

    1994-06-01

    The recent concern with health care economics has fostered the development of a new discipline that is generally called clinical microeconomics. This is a discipline in which microeconomic methods are used to study the economics of specific medical therapies. It is possible to perform stand-alone cost analyses, but more profound insight into the medical decision-making process may be gained by combining cost studies with measures of outcome. This is most often accomplished with cost-effectiveness or cost-utility studies. In cost-effectiveness studies there is one measure of outcome, often death. In cost-utility studies there are multiple measures of outcome, which must be grouped together to give an overall picture of outcome or utility. There are theoretical limitations to the determination of utility that must be accepted to perform this type of analysis. A summary statement of outcome is quality-adjusted life years (QALYs), which is utility times socially discounted survival. Discounting is used because people value a year of future life less than a year of present life. Costs are made up of in-hospital direct, professional, follow-up direct, and follow-up indirect costs. Direct costs are for medical services. Indirect costs reflect opportunity costs such as lost time at work. Cost estimates are often based on marginal costs, or the cost for one additional procedure of the same type. Finally, an overall statistic may be generated as cost per unit increase in effectiveness, such as dollars per QALY.(ABSTRACT TRUNCATED AT 250 WORDS) PMID:10151059
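The "utility times socially discounted survival" definition of QALYs, and the dollars-per-QALY summary statistic, can be sketched in a few lines. The 3% discount rate and all numbers below are hypothetical illustrations, not values from the article:

```python
def discounted_qalys(utilities, rate=0.03):
    """Sum utility-weighted life-years, discounting each future year.

    utilities: per-year utility weights (1.0 = full health, 0.0 = death).
    rate: annual social discount rate (3% is a common convention).
    """
    return sum(u / (1 + rate) ** t for t, u in enumerate(utilities))

def cost_effectiveness_ratio(delta_cost, delta_qalys):
    """Incremental cost per QALY gained for one therapy vs. another."""
    return delta_cost / delta_qalys

# Hypothetical comparison: a therapy adds 5 years at utility 0.8
# for an extra $40,000 over the alternative.
gained = discounted_qalys([0.8] * 5)
icer = cost_effectiveness_ratio(40_000, gained)  # dollars per QALY
```

Discounting shrinks each later year's contribution, which is exactly why the abstract notes that a year of future life is valued less than a year of present life.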

  5. Testing and Validation of Computational Methods for Mass Spectrometry.

    PubMed

    Gatto, Laurent; Hansen, Kasper D; Hoopmann, Michael R; Hermjakob, Henning; Kohlbacher, Oliver; Beyer, Andreas

    2016-03-01

    High-throughput methods based on mass spectrometry (proteomics, metabolomics, lipidomics, etc.) produce a wealth of data that cannot be analyzed without computational methods. The impact of the choice of method on the overall result of a biological study is often underappreciated, but different methods can result in very different biological findings. It is thus essential to evaluate and compare the correctness and relative performance of computational methods. The volume of the data as well as the complexity of the algorithms render unbiased comparisons challenging. This paper discusses some problems and challenges in testing and validation of computational methods. We discuss the different types of data (simulated and experimental validation data) as well as different metrics to compare methods. We also introduce a new public repository for mass spectrometric reference data sets ( http://compms.org/RefData ) that contains a collection of publicly available data sets for performance evaluation for a wide range of different methods. PMID:26549429

  6. Triangulation, Respondent Validation, and Democratic Participation in Mixed Methods Research

    ERIC Educational Resources Information Center

    Torrance, Harry

    2012-01-01

    Over the past 10 years or so the "Field" of "Mixed Methods Research" (MMR) has increasingly been exerting itself as something separate, novel, and significant, with some advocates claiming paradigmatic status. Triangulation is an important component of mixed methods designs. Triangulation has its origins in attempts to validate research findings…

  7. Quantitative assessment of gene expression network module-validation methods.

    PubMed

    Li, Bing; Zhang, Yingying; Yu, Yanan; Wang, Pengqian; Wang, Yongcheng; Wang, Zhong; Wang, Yongyan

    2015-01-01

    Validation of pluripotent modules in diverse networks holds enormous potential for systems biology and network pharmacology. An arising challenge is how to assess the accuracy of discovering all potential modules from multi-omic networks and validating their architectural characteristics with innovative computational methods that go beyond function enrichment and biological validation. To chart the progress in this domain, we systematically divided the existing Computational Validation Approaches based on Modular Architecture (CVAMA) into topology-based approaches (TBA) and statistics-based approaches (SBA). We compared the available module validation methods on 11 gene expression datasets; partially consistent results in the form of homogeneous models were obtained with each individual approach, whereas contradictory results were found between TBA and SBA. The TBA of the Zsummary value had a higher Validation Success Ratio (VSR) (51%) and a higher Fluctuation Ratio (FR) (80.92%), whereas the SBA of the approximately unbiased (AU) p-value had a lower VSR (12.3%) and a lower FR (45.84%). A gray-area simulation study revealed a consistent result for these two models and indicated a lower Variation Ratio (VR) (8.10%) of TBA at 6 simulated levels. Despite facing many novel challenges and evidence limitations, CVAMA may offer novel insights into modular networks. PMID:26470848

  8. Adapting CEF-Descriptors for Rating Purposes: Validation by a Combined Rater Training and Scale Revision Approach

    ERIC Educational Resources Information Center

    Harsch, Claudia; Martin, Guido

    2012-01-01

    We explore how a local rating scale can be based on the Common European Framework CEF-proficiency scales. As part of the scale validation (Alderson, 1991; Lumley, 2002), we examine which adaptations are needed to turn CEF-proficiency descriptors into a rating scale for a local context, and to establish a practicable method to revise the initial…

  9. Cochlear Dummy Electrodes for Insertion Training and Research Purposes: Fabrication, Mechanical Characterization, and Experimental Validation.

    PubMed

    Kobler, Jan-Philipp; Dhanasingh, Anandhan; Kiran, Raphael; Jolly, Claude; Ortmaier, Tobias

    2015-01-01

    To develop skills sufficient for hearing preservation cochlear implant surgery, surgeons need to perform several electrode insertion trials in ex vivo temporal bones, thereby consuming relatively expensive electrode carriers. The objectives of this study were to evaluate the insertion characteristics of cochlear electrodes in a plastic scala tympani model and to fabricate radio opaque polymer filament dummy electrodes of equivalent mechanical properties. In addition, this study should aid the design and development of new cochlear electrodes. Automated insertion force measurement is a new technique to reproducibly analyze and evaluate the insertion dynamics and mechanical characteristics of an electrode. Mechanical properties of MED-EL's FLEX(28), FLEX(24), and FLEX(20) electrodes were assessed with the help of an automated insertion tool. Statistical analysis of the overall mechanical behavior of the electrodes and factors influencing the insertion force are discussed. Radio opaque dummy electrodes of comparable characteristics were fabricated based on insertion force measurements. The platinum-iridium wires were replaced by polymer filament to provide sufficient stiffness to the electrodes and to eradicate the metallic artifacts in X-ray and computed tomography (CT) images. These low-cost dummy electrodes are cheap alternatives for surgical training and for in vitro, ex vivo, and in vivo research purposes. PMID:26247024

  11. Beyond Correctness: Development and Validation of Concept-Based Categorical Scoring Rubrics for Diagnostic Purposes

    ERIC Educational Resources Information Center

    Arieli-Attali, Meirav; Liu, Ying

    2016-01-01

    Diagnostic assessment approaches intend to provide fine-grained reports of what students know and can do, focusing on their areas of strengths and weaknesses. However, current application of such diagnostic approaches is limited by the scoring method for item responses; important diagnostic information, such as type of errors and strategy use is…

  12. Visualization of vasculature with convolution surfaces: method, validation and evaluation.

    PubMed

    Oeltze, Steffen; Preim, Bernhard

    2005-04-01

    We present a method for visualizing vasculature based on clinical computed tomography or magnetic resonance data. The vessel skeleton as well as the diameter information per voxel serve as input. Our method adheres to these data, while producing smooth transitions at branchings and closed, rounded ends by means of convolution surfaces. We examine the filter design with respect to irritating bulges, unwanted blending and the correct visualization of the vessel diameter. The method has been applied to a large variety of anatomic trees. We discuss the validation of the method by means of a comparison to other visualization methods. Surface distance measures are carried out to perform a quantitative validation. Furthermore, we present the evaluation of the method which has been accomplished on the basis of a survey by 11 radiologists and surgeons. PMID:15822811

  13. Recommendations on biomarker bioanalytical method validation by GCC.

    PubMed

    Houghton, Richard; Gouty, Dominique; Allinson, John; Green, Rachel; Losauro, Mike; Lowes, Steve; LeLacheur, Richard; Garofolo, Fabio; Couerbe, Philippe; Bronner, Stéphane; Struwe, Petra; Schiebl, Christine; Sangster, Timothy; Pattison, Colin; Islam, Rafiq; Garofolo, Wei; Pawula, Maria; Buonarati, Mike; Hayes, Roger; Cameron, Mark; Nicholson, Robert; Harman, Jake; Wieling, Jaap; De Boer, Theo; Reuschel, Scott; Cojocaru, Laura; Harter, Tammy; Malone, Michele; Nowatzke, William

    2012-10-01

    The 5th GCC in Barcelona (Spain) and 6th GCC in San Antonio (TX, USA) events provided a unique opportunity for CRO leaders to openly share opinions and perspectives, and to agree upon recommendations on biomarker bioanalytical method validation. PMID:23157353

  14. The Relationship between Method and Validity in Social Science Research.

    ERIC Educational Resources Information Center

    MacKinnon, David; And Others

    An endless debate in social science research focuses on whether or not there is a philosophical basis for justifying the application of scientific methods to social inquiry. A review of the philosophies of various scholars in the field indicates that there is no single procedure for arriving at a valid statement in a scientific inquiry. Natural…

  15. DEVELOPMENT AND VALIDATION OF A TEST METHOD FOR ACRYLONITRILE EMISSIONS

    EPA Science Inventory

    Acrylonitrile (AN) has been identified as a suspected carcinogen and may be regulated in the future as a hazardous air pollutant under Section 112 of the Clean Air Act. A method was validated that utilizes a midget impinger containing methanol for trapping AN vapors followed by a...

  16. ePortfolios: The Method of Choice for Validation

    ERIC Educational Resources Information Center

    Scott, Ken; Kim, Jichul

    2015-01-01

    Community colleges have long been institutions of higher education in the arenas of technical education and training, as well as preparing students for transfer to universities. While students are engaged in their student learning outcomes, projects, research, and community service, how have these students validated their work? One method of…

  17. Recommendations for Use and Fit-for-Purpose Validation of Biomarker Multiplex Ligand Binding Assays in Drug Development.

    PubMed

    Jani, Darshana; Allinson, John; Berisha, Flora; Cowan, Kyra J; Devanarayan, Viswanath; Gleason, Carol; Jeromin, Andreas; Keller, Steve; Khan, Masood U; Nowatzke, Bill; Rhyne, Paul; Stephen, Laurie

    2016-01-01

    Multiplex ligand binding assays (LBAs) are increasingly being used to support many stages of drug development. The complexity of multiplex assays creates many unique challenges in comparison to single-plexed assays leading to various adjustments for validation and potentially during sample analysis to accommodate all of the analytes being measured. This often requires a compromise in decision making with respect to choosing final assay conditions and acceptance criteria of some key assay parameters, depending on the intended use of the assay. The critical parameters that are impacted due to the added challenges associated with multiplexing include the minimum required dilution (MRD), quality control samples that span the range of all analytes being measured, quantitative ranges which can be compromised for certain targets, achieving parallelism for all analytes of interest, cross-talk across assays, freeze-thaw stability across analytes, among many others. Thus, these challenges also increase the complexity of validating the performance of the assay for its intended use. This paper describes the challenges encountered with multiplex LBAs, discusses the underlying causes, and provides solutions to help overcome these challenges. Finally, we provide recommendations on how to perform a fit-for-purpose-based validation, emphasizing issues that are unique to multiplex kit assays. PMID:26377333

  18. Validation of a previous day recall for measuring the location and purpose of active and sedentary behaviors compared to direct observation

    PubMed Central

    2014-01-01

    Purpose Gathering contextual information (i.e., location and purpose) about active and sedentary behaviors is an advantage of self-report tools such as previous day recalls (PDRs). However, the validity of PDRs for measuring context has not been empirically tested. The purpose of this paper was to compare PDR estimates of location and purpose to direct observation (DO). Methods Fifteen adult (18–75 y) and 15 adolescent (12–17 y) participants were directly observed during at least one segment of the day (i.e., morning, afternoon or evening). Participants completed their normal daily routine while trained observers recorded the location (i.e., home, community, work/school), purpose (e.g., leisure, transportation) and whether the behavior was sedentary or active. The day following the observation, participants completed an unannounced PDR. Estimates of time in each context were compared between PDR and DO. Intra-class correlations (ICC), percent agreement and Kappa statistics were calculated. Results For adults, percent agreement was 85% or greater for each location and ICC values ranged from 0.71 to 0.96. The PDR-reported purpose of adults' behaviors was highly correlated with DO for household activities and work (ICCs of 0.84 and 0.88, respectively). Transportation was not significantly correlated with DO (ICC = -0.08). For adolescents, reported classification of activity location was 80.8% or greater. The ICCs for the purpose of adolescents' behaviors ranged from 0.46 to 0.78. Participants were most accurate in classifying the location and purpose of the behaviors in which they spent the most time. Conclusions This study suggests that adults and adolescents can accurately report where and why they spend time in behaviors using a PDR. This information on behavioral context is essential for translating the evidence for specific behavior-disease associations to health interventions and public policy. PMID:24490619
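Two of the agreement statistics used in this validation, percent agreement and Cohen's kappa, are simple to compute. A minimal sketch follows; the location codes below are hypothetical example data, not observations from the study:

```python
from collections import Counter

def percent_agreement(a, b):
    """Fraction of paired observations with identical category labels."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    """Chance-corrected agreement between two categorical ratings."""
    n = len(a)
    po = percent_agreement(a, b)              # observed agreement
    ca, cb = Counter(a), Counter(b)
    pe = sum(ca[k] * cb[k] for k in ca) / n ** 2  # agreement expected by chance
    return (po - pe) / (1 - pe)

# Hypothetical location codes per observation window:
# H = home, C = community, W = work/school.
do  = ["H", "H", "C", "W", "H", "C", "W", "W", "H", "C"]  # direct observation
pdr = ["H", "H", "C", "W", "H", "W", "W", "W", "H", "C"]  # previous day recall

agree = percent_agreement(do, pdr)
kappa = cohens_kappa(do, pdr)
```

Kappa is the more conservative statistic because it subtracts the agreement two raters would reach by guessing from the marginal category frequencies alone.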

  19. Methods for causal inference from gene perturbation experiments and validation.

    PubMed

    Meinshausen, Nicolai; Hauser, Alain; Mooij, Joris M; Peters, Jonas; Versteeg, Philip; Bühlmann, Peter

    2016-07-01

    Inferring causal effects from observational and interventional data is a highly desirable but ambitious goal. Many of the computational and statistical methods are plagued by fundamental identifiability issues, instability, and unreliable performance, especially for large-scale systems with many measured variables. We present software and provide some validation of a recently developed methodology based on an invariance principle, called invariant causal prediction (ICP). The ICP method quantifies confidence probabilities for inferring causal structures and thus leads to more reliable and confirmatory statements for causal relations and predictions of external intervention effects. We validate the ICP method and some other procedures using large-scale genome-wide gene perturbation experiments in Saccharomyces cerevisiae. The results suggest that prediction and prioritization of future experimental interventions, such as gene deletions, can be improved by using our statistical inference techniques. PMID:27382150

  1. Validation of cleaning method for various parts fabricated at a Beryllium facility

    SciTech Connect

    Davis, Cynthia M.

    2015-12-15

    This study evaluated and documented a cleaning process that is used to clean parts that are fabricated at a beryllium facility at Los Alamos National Laboratory. The purpose of evaluating this cleaning process was to validate and approve it for future use to assure beryllium surface levels are below the Department of Energy’s release limits without the need to sample all parts leaving the facility. Inhaling or coming in contact with beryllium can cause an immune response that can result in an individual becoming sensitized to beryllium, which can then lead to a disease of the lungs called chronic beryllium disease, and possibly lung cancer. Thirty aluminum and thirty stainless steel parts were fabricated on a lathe in the beryllium facility, as well as thirty-two beryllium parts, for the purpose of testing a parts cleaning method that involved the use of ultrasonic cleaners. A cleaning method was created, documented, validated, and approved, to reduce beryllium contamination.

  2. LC-MS quantification of protein drugs: validating protein LC-MS methods with predigestion immunocapture.

    PubMed

    Duggan, Jeffrey; Ren, Bailuo; Mao, Yan; Chen, Lin-Zhi; Philip, Elsy

    2016-09-01

    A refinement of protein LC-MS bioanalysis is to use predigestion immunoaffinity capture to extract the drug from matrix prior to digestion. Because of their increased sensitivity, such hybrid assays have been successfully validated and applied to a number of clinical studies; however, they can also be subject to potential interferences from antidrug antibodies, circulating ligands or other matrix components specific to patient populations and/or dosed subjects. The purpose of this paper is to describe validation experiments that measure immunocapture efficiency, digestion efficiency, matrix effect and selectivity/specificity that can be used during method optimization and validation to test the resistance of the method to these potential interferences. The designs and benefits of these experiments are discussed in this report using an actual assay case study. PMID:27532431

  3. The Bland-Altman Method Should Not Be Used in Regression Cross-Validation Studies

    ERIC Educational Resources Information Center

    O'Connor, Daniel P.; Mahar, Matthew T.; Laughlin, Mitzi S.; Jackson, Andrew S.

    2011-01-01

    The purpose of this study was to demonstrate the bias in the Bland-Altman (BA) limits of agreement method when it is used to validate regression models. Data from 1,158 men were used to develop three regression equations to estimate maximum oxygen uptake (R[superscript 2] = 0.40, 0.61, and 0.82, respectively). The equations were evaluated in a…
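The Bland-Altman limits of agreement the study critiques are the mean difference between two methods plus or minus 1.96 standard deviations of the differences. A minimal sketch follows; the VO2max figures are hypothetical example data, not the study's 1,158-man dataset:

```python
import math

def bland_altman_limits(method_a, method_b):
    """Mean difference (bias) and 95% limits of agreement between
    paired measurements from two methods."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical maximum oxygen uptake values (ml/kg/min):
measured  = [42.0, 51.5, 38.2, 47.9, 55.1, 44.3]  # criterion measurement
estimated = [41.1, 52.8, 39.0, 46.5, 56.0, 43.7]  # regression estimate

bias, lower, upper = bland_altman_limits(measured, estimated)
```

The study's point is that when `estimated` comes from a regression fitted to the same kind of data, the differences are correlated with the magnitude of the measurement, so these limits give a biased picture of agreement; the computation itself is not at fault for comparing two independent measurement methods.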

  4. Making clinical trials more relevant: improving and validating the PRECIS tool for matching trial design decisions to trial purpose

    PubMed Central

    2013-01-01

    Background If you want to know which of two or more healthcare interventions is most effective, the randomised controlled trial is the design of choice. Randomisation, however, does not itself promote the applicability of the results to situations other than the one in which the trial was done. A tool published in 2009, PRECIS (PRagmatic Explanatory Continuum Indicator Summaries), aimed to help trialists design trials that produced results matched to the aim of the trial, be that supporting clinical decision-making or increasing knowledge of how an intervention works. Groups evaluating the tool have been generally positive but have also found weaknesses: its inter-rater reliability is not clear, it needs a scoring system, and some new domains might be needed. The aims of the study are to: (1) produce an improved and validated version of the PRECIS tool; and (2) use this tool to compare the internal validity of, and effect estimates from, a set of explanatory and pragmatic trials matched by intervention. Methods The study has four phases. Phase 1 involves brainstorming and a two-round Delphi survey of authors who cited PRECIS. In Phase 2, the Delphi results will be discussed and alternative versions of PRECIS-2 developed and user-tested by experienced trialists. Phase 3 will evaluate the validity and reliability of the most promising PRECIS-2 candidate using a sample of 15 to 20 trials rated by 15 international trialists. We will assess inter-rater reliability, and compare raters' subjective global ratings of pragmatism with PRECIS-2 to assess convergent and face validity. Phase 4, to determine whether pragmatic trials sacrifice internal validity in order to achieve applicability, will compare the internal validity and effect estimates of matched explanatory and pragmatic trials of the same intervention, condition and participants. Effect sizes for the trials will then be compared in a meta-regression. The Cochrane Risk of Bias scores will be compared with the

  5. Methods for Geometric Data Validation of 3d City Models

    NASA Astrophysics Data System (ADS)

    Wagner, D.; Alam, N.; Wewetzer, M.; Pries, M.; Coors, V.

    2015-12-01

    Geometric quality of 3D city models is crucial for data analysis and simulation tasks, which are part of modern applications of the data (e.g. potential heating energy consumption of city quarters, solar potential, etc.). Geometric quality in these contexts is, however, a different concept than it is for 2D maps. In the latter case, aspects such as positional or temporal accuracy and correctness represent typical quality metrics of the data. They are defined in ISO 19157 and should be mentioned as part of the metadata. 3D data has a far wider range of aspects which influence its quality, and the idea of quality itself is application dependent. Thus, concepts for the definition of quality are needed, including methods to validate these definitions. Quality in this sense means internal validation and detection of inconsistent or wrong geometry according to a predefined set of rules. A useful starting point is correct geometry in accordance with ISO 19107: a valid solid should consist of planar faces which touch their neighbours exclusively in defined corner points and edges, with no gaps between them, and the whole feature must be 2-manifold. In this paper, we present methods to validate common geometric requirements for building geometry. Different checks based on several algorithms have been implemented to validate a set of rules derived from the solid definition above (e.g. water tightness of the solid or planarity of its polygons), as developed for the software tool CityDoctor. The method of each check is specified, with a special focus on the discussion of tolerance values where they are necessary. The checks include polygon-level checks to validate the correctness of each polygon, i.e. closedness of the bounding linear ring and planarity. On the solid level, which is only validated if the polygons have passed validation, correct polygon orientation is checked, after self-intersections outside of defined corner points and edges
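    One of the polygon-level checks mentioned, planarity within a tolerance, can be sketched with plain vector algebra. The function and tolerance value below are illustrative, not CityDoctor's actual implementation:

```python
import math

def _sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def _cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def _dot(a, b):
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

def max_plane_deviation(points):
    """Maximum distance of any vertex from the plane spanned by the
    first three vertices (assumed non-collinear)."""
    p0, p1, p2 = points[0], points[1], points[2]
    n = _cross(_sub(p1, p0), _sub(p2, p0))
    norm = math.sqrt(_dot(n, n))
    n = (n[0] / norm, n[1] / norm, n[2] / norm)
    return max(abs(_dot(_sub(p, p0), n)) for p in points)

# A roof polygon (coordinates in metres) with one vertex 2 mm out of plane
poly = [(0, 0, 0), (4, 0, 0), (4, 3, 0), (0, 3, 0.002)]
is_planar = max_plane_deviation(poly) <= 0.01  # 1 cm tolerance -> True
```

The interesting engineering question, as the paper notes, is choosing the tolerance, since surveyed building data is never exactly planar.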

  6. Validating silicon polytrodes with paired juxtacellular recordings: method and dataset

    PubMed Central

    Lopes, Gonçalo; Frazão, João; Nogueira, Joana; Lacerda, Pedro; Baião, Pedro; Aarts, Arno; Andrei, Alexandru; Musa, Silke; Fortunato, Elvira; Barquinha, Pedro; Kampff, Adam R.

    2016-01-01

    Cross-validating new methods for recording neural activity is necessary to accurately interpret and compare the signals they measure. Here we describe a procedure for precisely aligning two probes for in vivo “paired-recordings” such that the spiking activity of a single neuron is monitored with both a dense extracellular silicon polytrode and a juxtacellular micropipette. Our new method allows for efficient, reliable, and automated guidance of both probes to the same neural structure with micrometer resolution. We also describe a new dataset of paired-recordings, which is available online. We propose that our novel targeting system, and ever expanding cross-validation dataset, will be vital to the development of new algorithms for automatically detecting/sorting single-units, characterizing new electrode materials/designs, and resolving nagging questions regarding the origin and nature of extracellular neural signals. PMID:27306671

  7. Validation of chemistry models employed in a particle simulation method

    NASA Technical Reports Server (NTRS)

    Haas, Brian L.; Mcdonald, Jeffrey D.

    1991-01-01

    The chemistry models employed in a statistical particle simulation method, as implemented on the Intel iPSC/860 multiprocessor computer, are validated and applied. Chemical relaxation of five-species air in adiabatic gas reservoirs involves 34 simultaneous dissociation, recombination, and atomic-exchange reactions. The reaction rates employed in the analytic solutions are obtained from Arrhenius experimental correlations as functions of temperature for reservoirs in thermal equilibrium. Favorable agreement with the analytic solutions validates the simulation when applied to relaxation of O2 toward equilibrium in reservoirs dominated by dissociation and recombination, respectively, and when applied to relaxation of air in the temperature range 5000 to 30,000 K. A flow of O2 over a circular cylinder at high Mach number is simulated to demonstrate application of the method to multidimensional reactive flows.
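    Particle-simulation chemistry models of this kind typically evaluate rate coefficients from (modified) Arrhenius fits to experimental data. A generic sketch follows; the constants are placeholders chosen for illustration, not the paper's actual correlations:

```python
import math

R_GAS = 8.314  # universal gas constant, J/(mol*K)

def arrhenius_rate(A, b, Ea, T):
    """Modified Arrhenius rate coefficient: k(T) = A * T**b * exp(-Ea/(R*T)).

    A: pre-exponential factor, b: temperature exponent,
    Ea: activation energy (J/mol), T: temperature (K).
    """
    return A * T**b * math.exp(-Ea / (R_GAS * T))

# Placeholder constants for an O2-dissociation-like reaction (illustrative)
k_5000 = arrhenius_rate(A=2.0e15, b=-0.5, Ea=4.0e5, T=5000.0)
k_20000 = arrhenius_rate(A=2.0e15, b=-0.5, Ea=4.0e5, T=20000.0)
# For Ea > 0 the rate coefficient rises steeply with temperature
```

In the validation described above, analytic relaxation solutions built from such correlations serve as the benchmark for the particle simulation.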

  8. Validating silicon polytrodes with paired juxtacellular recordings: method and dataset.

    PubMed

    Neto, Joana P; Lopes, Gonçalo; Frazão, João; Nogueira, Joana; Lacerda, Pedro; Baião, Pedro; Aarts, Arno; Andrei, Alexandru; Musa, Silke; Fortunato, Elvira; Barquinha, Pedro; Kampff, Adam R

    2016-08-01

    Cross-validating new methods for recording neural activity is necessary to accurately interpret and compare the signals they measure. Here we describe a procedure for precisely aligning two probes for in vivo "paired-recordings" such that the spiking activity of a single neuron is monitored with both a dense extracellular silicon polytrode and a juxtacellular micropipette. Our new method allows for efficient, reliable, and automated guidance of both probes to the same neural structure with micrometer resolution. We also describe a new dataset of paired-recordings, which is available online. We propose that our novel targeting system, and ever expanding cross-validation dataset, will be vital to the development of new algorithms for automatically detecting/sorting single-units, characterizing new electrode materials/designs, and resolving nagging questions regarding the origin and nature of extracellular neural signals. PMID:27306671

  9. Prognostics of Power Electronics, Methods and Validation Experiments

    NASA Technical Reports Server (NTRS)

    Kulkarni, Chetan S.; Celaya, Jose R.; Biswas, Gautam; Goebel, Kai

    2012-01-01

    Failure of electronic devices is a concern for future electric aircraft, which will see an increase in electronics used to drive and control safety-critical equipment throughout the aircraft. As a result, investigation of precursors to failure in electronics and prediction of the remaining life of electronic components are of key importance. DC-DC power converters are power electronics systems typically employed as sourcing elements for avionics equipment. Current research efforts in prognostics for these power systems focus on the identification of failure mechanisms and the development of accelerated aging methodologies and systems that accelerate the aging process of test devices while continuously measuring key electrical and thermal parameters. Preliminary model-based prognostics algorithms have been developed making use of empirical degradation models and physics-inspired degradation models, with focus on key components such as electrolytic capacitors and power MOSFETs (metal-oxide-semiconductor field-effect transistors). This paper presents current results on the development of validation methods for prognostics algorithms for power electrolytic capacitors, particularly the use of accelerated aging systems for algorithm validation. Validation of prognostics algorithms presents difficulties in practice due to the lack of run-to-failure experiments in deployed systems. By using accelerated experiments, we circumvent this problem in order to define initial validation activities.

  10. Method development and validation for pharmaceutical tablets analysis using transmission Raman spectroscopy.

    PubMed

    Li, Yi; Igne, Benoît; Drennen, James K; Anderson, Carl A

    2016-02-10

    The objective of the study is to demonstrate the development and validation of a transmission Raman spectroscopic method using the ICH-Q2 Guidance as a template. Specifically, Raman spectroscopy was used to determine niacinamide content in tablet cores. A 3-level, 2-factor full factorial design was utilized to generate a partial least-squares model for active pharmaceutical ingredient quantification. Validation of the transmission Raman model was focused on figures of merit from three independent batches manufactured at pilot scale. The resultant model statistics were evaluated along with the linearity, accuracy, precision and robustness assessments. Method specificity was demonstrated by accurate determination of niacinamide in the presence of niacin (an expected related substance). The method was demonstrated as fit for purpose and had the desirable characteristics of very short analysis times (∼2.5s per tablet). The resulting method was used for routine content uniformity analysis of single dosage units in a stability study. PMID:26656945

  11. VALUE - Validating and Integrating Downscaling Methods for Climate Change Research

    NASA Astrophysics Data System (ADS)

    Maraun, Douglas; Widmann, Martin; Benestad, Rasmus; Kotlarski, Sven; Huth, Radan; Hertig, Elke; Wibig, Joanna; Gutierrez, Jose

    2013-04-01

    Our understanding of global climate change is mainly based on General Circulation Models (GCMs) with a relatively coarse resolution. Since climate change impacts are mainly experienced on regional scales, high-resolution climate change scenarios need to be derived from GCM simulations by downscaling. Several projects have been carried out over recent years to validate the performance of statistical and dynamical downscaling, yet several aspects have not been systematically addressed: variability on sub-daily, decadal and longer time-scales, extreme events, spatial variability and inter-variable relationships. Different downscaling approaches, such as dynamical downscaling, statistical downscaling and bias correction, have not been systematically compared. Furthermore, collaboration between different communities, in particular regional climate modellers, statistical downscalers and statisticians, has been limited. To address these gaps, the EU Cooperation in Science and Technology (COST) action VALUE (www.value-cost.eu) has been brought to life. VALUE is a research network with participants from currently 23 European countries, running from 2012 to 2015. Its main aim is to systematically validate and develop downscaling methods for climate change research in order to improve regional climate change scenarios for use in climate impact studies. Inspired by the co-design idea of the international research initiative "Future Earth", stakeholders of climate change information have been involved in the definition of the research questions to be addressed and are actively participating in the network. The key idea of VALUE is to identify the relevant weather and climate characteristics required as input for a wide range of impact models and to define an open framework to systematically validate these characteristics. Based on a range of benchmark data sets, in principle every downscaling method can be validated and compared with competing methods. The results of

  12. Validation of an Impedance Eduction Method in Flow

    NASA Technical Reports Server (NTRS)

    Watson, Willie R.; Jones, Michael G.; Parrott, Tony L.

    2004-01-01

    This paper reports results of a research effort to validate a method for educing the normal incidence impedance of a locally reacting liner located in a grazing-incidence, nonprogressive acoustic wave environment with flow. The results presented in this paper test the ability of the method to reproduce the measured normal incidence impedance of a solid steel plate and two soft test liners in a uniform flow. The test liners are known to be locally reacting and exhibit no measurable amplitude-dependent impedance nonlinearities or flow effects. Baseline impedance spectra for these liners were therefore established from measurements in a conventional normal incidence impedance tube. A key feature of the method is the expansion of the unknown impedance function as a piecewise continuous polynomial with undetermined coefficients. Stewart's adaptation of the Davidon-Fletcher-Powell optimization algorithm is used to educe the normal incidence impedance at each Mach number by optimizing an objective function. The method is shown to reproduce the measured normal incidence impedance spectrum for each of the test liners, thus validating its usefulness for determining the normal incidence impedance of test liners for a broad range of source frequencies and flow Mach numbers.

  13. Validated HPTLC method of analysis for artemether and its formulations.

    PubMed

    Tayade, Nitin G; Nagarsenker, Mangal S

    2007-02-19

    A simple, sensitive, precise and rapid high-performance thin-layer chromatographic (HPTLC) method of analysis for artemether, both as a bulk drug and in pharmaceutical formulations, was developed and validated. The method employed TLC aluminum plates precoated with silica gel 60 F254 as the stationary phase. The solvent system consisted of toluene-ethyl acetate-formic acid (8:2:0.3, v/v/v) as mobile phase. Densitometric analysis of artemether was carried out in the reflectance mode at 565 nm. The system was found to give compact spots for artemether (Rf value of 0.50 +/- 0.03). The linear regression analysis data for the calibration plots showed a good linear relationship with r² = 0.9904 in the concentration range 200-1000 ng per spot. The mean values of the correlation coefficient, slope and intercept were 0.9904 +/- 0.011, 7.27 +/- 0.11 and 166.24 +/- 56.92, respectively. The method was validated for precision, accuracy, recovery and robustness. The limits of detection and quantitation were 65.91 and 197.74 ng per spot, respectively. The method has been successfully applied in the analysis of lipid-based parenteral formulations and a marketed oral solid dosage formulation. PMID:17045768

  14. Brazilian Center for the Validation of Alternative Methods (BraCVAM) and the process of validation in Brazil.

    PubMed

    Presgrave, Octavio; Moura, Wlamir; Caldeira, Cristiane; Pereira, Elisabete; Bôas, Maria H Villas; Eskes, Chantra

    2016-03-01

    The need for the creation of a Brazilian centre for the validation of alternative methods was recognised in 2008, and members of academia, industry and existing international validation centres immediately engaged with the idea. In 2012, co-operation between the Oswaldo Cruz Foundation (FIOCRUZ) and the Brazilian Health Surveillance Agency (ANVISA) instigated the establishment of the Brazilian Center for the Validation of Alternative Methods (BraCVAM), which was officially launched in 2013. The Brazilian validation process follows OECD Guidance Document No. 34, where BraCVAM functions as the focal point to identify and/or receive requests from parties interested in submitting tests for validation. BraCVAM then informs the Brazilian National Network on Alternative Methods (RENaMA) of promising assays, which helps with prioritisation and contributes to the validation studies of selected assays. A Validation Management Group supervises the validation study, and the results obtained are peer-reviewed by an ad hoc Scientific Review Committee, organised under the auspices of BraCVAM. Based on the peer-review outcome, BraCVAM will prepare recommendations on the validated test method, which will be sent to the National Council for the Control of Animal Experimentation (CONCEA). CONCEA is in charge of the regulatory adoption of all validated test methods in Brazil, following an open public consultation. PMID:27031604

  15. Flexibility and applicability of β-expectation tolerance interval approach to assess the fitness of purpose of pharmaceutical analytical methods.

    PubMed

    Bouabidi, A; Talbi, M; Bourichi, H; Bouklouze, A; El Karbane, M; Boulanger, B; Brik, Y; Hubert, Ph; Rozet, E

    2012-12-01

    An innovative, versatile strategy using Total Error has been proposed to decide on a method's validity while controlling the risk of accepting an unsuitable assay, together with the ability to predict the reliability of future results. This strategy is based on the simultaneous combination of the systematic (bias) and random (imprecision) errors of analytical methods. Using validation standards, both types of error are combined through the use of a prediction interval or β-expectation tolerance interval. Finally, an accuracy profile is built by connecting, on one hand, all the upper tolerance limits and, on the other hand, all the lower tolerance limits. This profile, combined with pre-specified acceptance limits, allows the evaluation of the validity of any quantitative analytical method and thus its fitness for its intended purpose. In this work, the accuracy profile approach was evaluated on several types of analytical methods encountered in the pharmaceutical industrial field, covering different pharmaceutical matrices. The four studied examples depict the flexibility and applicability of this approach for matrices ranging from tablets to syrups, for techniques such as liquid chromatography or UV spectrophotometry, and for the categories of assay commonly encountered in the pharmaceutical industry, i.e. content assays, dissolution assays, and quantitative impurity assays. The accuracy profile approach assesses the fitness of purpose of these methods for their future routine application. It also allows the selection of the most suitable calibration curve, the adequate evaluation of a potential matrix effect together with proposal of efficient solutions, and the correct definition of the limits of quantification of the studied analytical procedures. PMID:22615163
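    The β-expectation interval underlying the accuracy profile is usually computed with a Student-t quantile at each concentration level. The stdlib-only sketch below substitutes a normal quantile as a large-sample approximation, and the recovery data and acceptance limits are hypothetical:

```python
import statistics

def beta_expectation_interval(results, beta=0.95):
    """Approximate beta-expectation tolerance interval for a set of
    validation results at one concentration level.

    The exact form uses a Student-t quantile with n-1 degrees of freedom;
    a standard-normal quantile is used here as a large-sample stand-in.
    """
    n = len(results)
    mean = statistics.mean(results)
    s = statistics.stdev(results)
    z = statistics.NormalDist().inv_cdf((1 + beta) / 2)
    half = z * s * (1 + 1 / n) ** 0.5
    return mean - half, mean + half

# Hypothetical recoveries (%) from validation standards at one level
recoveries = [99.1, 100.4, 98.7, 101.2, 99.8, 100.1]
lo_pct, hi_pct = beta_expectation_interval(recoveries)
# The method is accepted at this level if (lo_pct, hi_pct) lies inside
# the pre-specified acceptance limits, e.g. [95, 105] for a content assay
```

Connecting such limits across concentration levels produces the accuracy profile described in the abstract.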

  16. Validation of spectrophotometric method for lactulose assay in syrup preparation

    NASA Astrophysics Data System (ADS)

    Mahardhika, Andhika Bintang; Novelynda, Yoshella; Damayanti, Sophi

    2015-09-01

    Lactulose is a synthetic disaccharide widely used in the food and pharmaceutical fields. In the pharmaceutical field, lactulose is used as an osmotic laxative in a syrup dosage form. This research aimed to validate a spectrophotometric method to determine the levels of lactulose in syrup preparations and in a commercial sample. Lactulose is hydrolyzed by hydrochloric acid to form fructose and galactose. The fructose is then reacted with resorcinol reagent, forming compounds that give an absorption peak at 485 nm. The analytical method was validated, and the lactulose content in syrup preparations was subsequently determined. The calibration curve was linear in the range of 30-100 μg/mL with a correlation coefficient (r) of 0.9996, a coefficient of variation of the method (Vxo) of 1.1 %, a limit of detection of 2.32 μg/mL, and a limit of quantitation of 7.04 μg/mL. The accuracy test for the lactulose assay in the syrup preparation showed recoveries of 96.6 to 100.8 %. Repeatability tests of the lactulose assay in a lactulose standard solution and in a syrup sample preparation showed coefficients of variation (CV) of 0.75 % and 0.7 %. The intermediate precision (interday) test gave coefficients of variation of 1.06 % on the first day, 0.99 % on the second day, and 0.95 % on the third day. This research yielded a valid analytical method; the levels of lactulose in syrup preparations of samples A, B, and C were 101.6, 100.5, and 100.6 %, respectively.

  17. A General Method of Empirical Q-matrix Validation.

    PubMed

    de la Torre, Jimmy; Chiu, Chia-Yi

    2016-06-01

    In contrast to unidimensional item response models that postulate a single underlying proficiency, cognitive diagnosis models (CDMs) posit multiple, discrete skills or attributes, thus allowing CDMs to provide a finer-grained assessment of examinees' test performance. A common component of CDMs for specifying the attributes required by each item is the Q-matrix. Although construction of the Q-matrix is typically performed by domain experts, it nonetheless remains, to a large extent, a subjective process, and misspecifications in the Q-matrix, if left unchecked, can have important practical implications. To address this concern, this paper proposes a discrimination index that can be used with a wide class of CDMs subsumed by the generalized deterministic input, noisy "and" gate model to empirically validate the Q-matrix specifications by identifying and replacing misspecified entries in the Q-matrix. The rationale for using the index as the basis for the proposed validation method is provided in the form of mathematical proofs of several relevant lemmas and a theorem. The feasibility of the proposed method was examined using simulated data generated under various conditions. The proposed method is illustrated using fraction subtraction data. PMID:25943366
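    The role of the Q-matrix can be illustrated with the ideal response of the basic DINA member of this model family: absent slips and guesses, an examinee answers an item correctly only if every attribute the item's Q-matrix row requires is mastered. This is a toy illustration of the data structure, not the paper's proposed discrimination index:

```python
def ideal_response(q_row, alpha):
    """DINA-model ideal response eta = prod_k alpha_k ** q_k:
    1 iff the examinee masters every attribute the item requires."""
    return int(all(a >= q for q, a in zip(q_row, alpha)))

# Hypothetical 3-attribute Q-matrix for 4 items
Q = [[1, 0, 0],   # item 1 requires attribute 1
     [1, 1, 0],   # item 2 requires attributes 1 and 2
     [0, 0, 1],   # item 3 requires attribute 3
     [1, 1, 1]]   # item 4 requires all three
alpha = [1, 1, 0]  # examinee masters attributes 1 and 2 only
etas = [ideal_response(row, alpha) for row in Q]
```

A misspecified entry in a Q-matrix row changes these ideal responses, which is why such errors propagate into the fitted model.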

  18. Simulation Methods and Validation Criteria for Modeling Cardiac Ventricular Electrophysiology

    PubMed Central

    Krishnamoorthi, Shankarjee; Perotti, Luigi E.; Borgstrom, Nils P.; Ajijola, Olujimi A.; Frid, Anna; Ponnaluri, Aditya V.; Weiss, James N.; Qu, Zhilin; Klug, William S.; Ennis, Daniel B.; Garfinkel, Alan

    2014-01-01

    We describe a sequence of methods to produce a partial differential equation model of the electrical activation of the ventricles. In our framework, we incorporate the anatomy and cardiac microstructure obtained from magnetic resonance imaging and diffusion tensor imaging of a New Zealand White rabbit, the Purkinje structure and the Purkinje-muscle junctions, and an electrophysiologically accurate model of the ventricular myocytes and tissue, which includes transmural and apex-to-base gradients of action potential characteristics. We solve the electrophysiology governing equations using the finite element method and compute both a 6-lead precordial electrocardiogram (ECG) and the activation wavefronts over time. We are particularly concerned with the validation of the various methods used in our model and, in this regard, propose a series of validation criteria that we consider essential. These include producing a physiologically accurate ECG, a correct ventricular activation sequence, and the inducibility of ventricular fibrillation. Among other components, we conclude that a Purkinje geometry with a high density of Purkinje muscle junctions covering the right and left ventricular endocardial surfaces as well as transmural and apex-to-base gradients in action potential characteristics are necessary to produce ECGs and time activation plots that agree with physiological observations. PMID:25493967

  19. Method validation for methanol quantification present in working places

    NASA Astrophysics Data System (ADS)

    Muna, E. D. M.; Bizarri, C. H. B.; Maciel, J. R. M.; da Rocha, G. P.; de Araújo, I. O.

    2015-01-01

    Given the widespread use of methanol by different industry sectors and the high toxicity associated with this substance, it is necessary to use an analytical method able to determine levels of methanol in the air of working environments in a sensitive, precise and accurate manner. Based on the methodology established by the National Institute for Occupational Safety and Health (NIOSH), a methodology for the determination of methanol collected in silica gel tubes was validated; its effectiveness was demonstrated through participation in the international collaborative program sponsored by the American Industrial Hygiene Association (AIHA).

  20. Determination of Al in cake mix: Method validation and estimation of measurement uncertainty

    NASA Astrophysics Data System (ADS)

    Andrade, G.; Rocha, O.; Junqueira, R.

    2016-07-01

    An analytical method for the determination of Al in cake mix was developed. Acceptable values were obtained for the following parameters: linearity; detection limit (LOD, 5.00 mg/kg); quantification limit (LOQ, 12.5 mg/kg); recovery assay values (between 91 and 102 %); relative standard deviation under repeatability and within-reproducibility conditions (<20.0 %); and measurement uncertainty (<10.0 %). The results of the validation process showed that the proposed method is fit for purpose.
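    The acceptance criteria quoted (recovery between 91 and 102 %, RSD below 20 %) rest on two routine calculations, sketched here with hypothetical replicate data rather than the study's results:

```python
import statistics

def recovery_percent(measured, spiked):
    """Spike recovery: measured amount as a percentage of the amount added."""
    return 100.0 * measured / spiked

def rsd_percent(replicates):
    """Relative standard deviation (precision under repeatability)."""
    return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

# Hypothetical replicate Al results (mg/kg) for a sample spiked at 100 mg/kg
replicates = [96.2, 98.5, 94.8, 97.9, 95.6]
rec = recovery_percent(statistics.mean(replicates), 100.0)
rsd = rsd_percent(replicates)
```

Both figures are then compared against the pre-set acceptance ranges to declare the method fit for purpose.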

  1. Determination of methylmercury in marine biota samples: method validation.

    PubMed

    Carrasco, Luis; Vassileva, Emilia

    2014-05-01

    Regulatory authorities are expected to measure concentrations of contaminants in foodstuffs, but determination of the total amount alone is not sufficient for fully judging the impact on human health. In particular, the methylation of metals generally increases their toxicity; therefore validated analytical methods producing reliable results for the assessment of methylated species are highly needed. Nowadays, there is no legal limit for methylmercury (MeHg) in food matrices; hence, no standardized method for the determination of MeHg exists within the international jurisdiction. Contemplating the possibility of a future legislative limit, a method for low-level determination of MeHg in marine biota matrices, based on aqueous-phase ethylation followed by purge-and-trap and gas chromatography (GC) coupled to pyrolysis-atomic fluorescence spectrometry (Py-AFS) detection, has been developed and validated. Five different extraction procedures, namely acid and alkaline leaching assisted by microwave and conventional oven heating, as well as enzymatic digestion, were evaluated in terms of their efficiency to extract MeHg from the Scallop soft tissue IAEA-452 Certified Reference Material. Alkaline extraction with 25% (w/w) KOH in methanol, microwave-assisted extraction (MAE) with 5M HCl and enzymatic digestion with protease XIV yielded the highest extraction recoveries. Standard addition or the introduction of a dilution step was successfully applied to overcome the matrix effects observed when microwave-assisted extraction using 25% (w/w) KOH in methanol or 25% (w/v) aqueous TMAH was used. ISO 17025 and Eurachem guidelines were followed to perform the validation of the methodology. Accordingly, blanks, selectivity, calibration curve, linearity (0.9995), working range (1-800 pg), recovery (97%), precision, traceability, limit of detection (0.45 pg), limit of quantification (0.85 pg) and expanded uncertainty (15.86%, k=2) were assessed with Fish protein Dorm-3 Certified
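    An expanded uncertainty with k = 2, as reported above, is conventionally obtained by combining independent standard-uncertainty components in quadrature and multiplying by the coverage factor. A sketch with hypothetical component values (not the budget behind this study's 15.86 %):

```python
import math

def expanded_uncertainty(components, k=2):
    """Combine independent standard-uncertainty components in quadrature
    and expand with coverage factor k (k = 2 gives ~95 % coverage)."""
    u_c = math.sqrt(sum(u * u for u in components))
    return k * u_c

# Hypothetical relative standard uncertainties (%) for a MeHg result:
# repeatability, calibration, and recovery/traceability contributions
U = expanded_uncertainty([5.0, 4.2, 3.5])
```

The dominant components of such a budget are usually identified during validation, which is where a breakdown like the one above would come from.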

  2. Validation of a numerical method for unsteady flow calculations

    SciTech Connect

    Giles, M.; Haimes, R. (Dept. of Aeronautics and Astronautics)

    1993-01-01

    This paper describes and validates a numerical method for the calculation of unsteady inviscid and viscous flows. A companion paper compares experimental measurements of unsteady heat transfer on a transonic rotor with the corresponding computational results. The mathematical model is the Reynolds-averaged unsteady Navier-Stokes equations for a compressible ideal gas. Quasi-three-dimensionality is included through the use of a variable streamtube thickness. The numerical algorithm is unusual in two respects: (a) for reasons of efficiency and flexibility, it uses a hybrid Navier-Stokes/Euler method, and (b) to allow for the computation of stator/rotor combinations with arbitrary pitch ratio, a novel space-time coordinate transformation is used. Several test cases are presented to validate the performance of the computer program, UNSFLO. These include: (a) unsteady, inviscid flat-plate cascade flows, (b) steady and unsteady viscous flat-plate cascade flows, and (c) steady turbine heat transfer and loss prediction. In the first two sets of cases, comparisons are made with theory; in the third, the comparison is with experimental data.

  3. Experimental validation of boundary element methods for noise prediction

    NASA Technical Reports Server (NTRS)

    Seybert, A. F.; Oswald, Fred B.

    1992-01-01

    Experimental validation of methods to predict radiated noise is presented. A combined finite element and boundary element model was used to predict the vibration and noise of a rectangular box excited by a mechanical shaker. The predicted noise was compared to sound power measured by the acoustic intensity method. Inaccuracies in the finite element model shifted the resonance frequencies by about 5 percent. The predicted and measured sound power levels agree within about 2.5 dB. In a second experiment, measured vibration data was used with a boundary element model to predict noise radiation from the top of an operating gearbox. The predicted and measured sound power for the gearbox agree within about 3 dB.

  4. Video quality experts group: the quest for valid objective methods

    NASA Astrophysics Data System (ADS)

    Corriveau, Philip J.; Webster, Arthur A.; Rohaly, Ann M.; Libert, John M.

    2000-06-01

    Subjective assessment methods have been used reliably for many years to evaluate video quality. They continue to provide the most reliable assessments compared to objective methods. Some issues that arise with subjective assessment include the cost of conducting the evaluations and the fact that these methods cannot easily be used to monitor video quality in real time. Furthermore, traditional, analog objective methods, while still necessary, are not sufficient to measure the quality of digitally compressed video systems. Thus, there is a need to develop new objective methods utilizing the characteristics of the human visual system. While several new objective methods have been developed, there is to date no internationally standardized method. The Video Quality Experts Group (VQEG) was formed in October 1997 to address video quality issues. The group is composed of experts from various backgrounds and affiliations, including participants from several internationally recognized organizations working in the field of video quality assessment. The majority of participants are active in the International Telecommunications Union (ITU), and VQEG combines the expertise and resources found in several ITU Study Groups to work towards a common goal. The first task undertaken by VQEG was to provide a validation of objective video quality measurement methods leading to Recommendations in both the Telecommunications (ITU-T) and Radiocommunication (ITU-R) sectors of the ITU. To this end, VQEG designed and executed a test program to compare subjective video quality evaluations to the predictions of a number of proposed objective measurement methods for video quality in the bit rate range of 768 kb/s to 50 Mb/s. The results of this test show that there is no objective measurement system that is currently able to replace subjective testing. Depending on the metric used for evaluation, the performance of eight or nine models was found to be statistically equivalent, leading to the

  5. Validated spectrophotometric methods for determination of some oral hypoglycemic drugs.

    PubMed

    Farouk, M; Abdel-Satar, O; Abdel-Aziz, O; Shaaban, M

    2011-02-01

    Four accurate, precise, rapid, reproducible, and simple spectrophotometric methods were validated for determination of repaglinide (RPG), pioglitazone hydrochloride (PGL) and rosiglitazone maleate (RGL). The first two methods were based on the formation of a charge-transfer purple-colored complex of chloranilic acid with RPG and RGL, with molar absorptivities of 1.23 × 10³ and 8.67 × 10² l·mol⁻¹·cm⁻¹ and Sandell's sensitivities of 0.367 and 0.412 μg·cm⁻², respectively, and of an ion-pair yellow-colored complex of bromophenol blue with RPG, PGL and RGL, with molar absorptivities of 8.86 × 10³, 6.95 × 10³, and 7.06 × 10³ l·mol⁻¹·cm⁻¹, respectively, and a Sandell's sensitivity of 0.051 μg·cm⁻² for all ion-pair complexes. The influence of different parameters on color formation was studied to determine optimum conditions for the visible spectrophotometric methods. The other spectrophotometric methods were adopted for determination of the studied drugs in the presence of their acid-, alkaline- and oxidative-degradates by computing derivative and pH-induced difference spectrophotometry, as stability-indicating techniques. All the proposed methods were validated according to the International Conference on Harmonization guidelines and successfully applied for determination of the studied drugs in pure form and in pharmaceutical preparations, with good extraction recovery ranges between 98.7-101.4%, 98.2-101.3%, and 99.9-101.4% for RPG, PGL, and RGL, respectively. Relative standard deviations did not exceed 1.6%, indicating that the proposed methods have good repeatability and reproducibility. All the obtained results were statistically compared to the official method used for RPG analysis and the manufacturers' methods used for PGL and RGL analysis, respectively; no significant differences were found. PMID:22466095
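    The recovery and precision figures quoted in records like this one come from two standard formulas, percent recovery and relative standard deviation. A minimal sketch, using hypothetical replicate values rather than the paper's data:

```python
from statistics import mean, stdev

def recovery_pct(found, added):
    """Extraction recovery as a percentage of the spiked (added) amount."""
    return 100.0 * found / added

def rsd_pct(values):
    """Relative standard deviation (precision) as a percentage of the mean."""
    return 100.0 * stdev(values) / mean(values)

# Hypothetical replicate determinations of a 10 ug/mL spike
found = [9.92, 10.05, 9.87, 10.11, 9.98]
recoveries = [recovery_pct(f, 10.0) for f in found]

print(round(mean(recoveries), 1))  # mean recovery, %  -> 99.9
print(round(rsd_pct(found), 2))    # RSD, %            -> 0.97
```

    An RSD below the 1.6% ceiling reported above is what "good repeatability" means in quantitative terms.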

  6. 40 CFR 712.5 - Method of identification of substances for reporting purposes.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 32 2012-07-01 2012-07-01 false Method of identification of substances for reporting purposes. 712.5 Section 712.5 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) TOXIC SUBSTANCES CONTROL ACT CHEMICAL INFORMATION RULES General Provisions § 712.5 Method of identification of substances for...

  7. Knowledge Transmission versus Social Transformation: A Critical Analysis of Purpose in Elementary Social Studies Methods Textbooks

    ERIC Educational Resources Information Center

    Butler, Brandon M.; Suh, Yonghee; Scott, Wendy

    2015-01-01

    In this article, the authors investigate the extent to which 9 elementary social studies methods textbooks present the purpose of teaching and learning social studies. Using Stanley's three perspectives of teaching social studies for knowledge transmission, method of intelligence, and social transformation, we analyze how these texts prepare…

  8. Examining the Content Validity of the WHOQOL-BRF from Respondents' Perspective by Quantitative Methods

    ERIC Educational Resources Information Center

    Yao, Grace; Wu, Chia-Huei; Yang, Cheng-Ta

    2008-01-01

    Content validity, the extent to which a measurement reflects the specific intended domain of content, is a basic type of validity for a valid measurement. It was usually examined qualitatively and relied on experts' subjective judgments, not on respondents' responses. Therefore, the purpose of this study was to introduce and demonstrate how to use…

  9. Validated spectrofluorometric methods for determination of amlodipine besylate in tablets

    NASA Astrophysics Data System (ADS)

    Abdel-Wadood, Hanaa M.; Mohamed, Niveen A.; Mahmoud, Ashraf M.

    2008-08-01

    Two simple and sensitive spectrofluorometric methods have been developed and validated for determination of amlodipine besylate (AML) in tablets. The first method was based on the condensation reaction of AML with ninhydrin and phenylacetaldehyde in buffered medium (pH 7.0), resulting in formation of a green fluorescent product, which exhibits excitation and emission maxima at 375 and 480 nm, respectively. The second method was based on the reaction of AML with 7-chloro-4-nitro-2,1,3-benzoxadiazole (NBD-Cl) in a buffered medium (pH 8.6), resulting in formation of a highly fluorescent product, which was measured fluorometrically at 535 nm (λex, 480 nm). The factors affecting the reactions were studied and optimized. Under the optimum reaction conditions, linear relationships with good correlation coefficients (0.9949-0.9997) were found between the fluorescence intensity and the concentrations of AML in the concentration ranges of 0.35-1.8 and 0.55-3.0 μg ml⁻¹ for the ninhydrin and NBD-Cl methods, respectively. The detection limits were 0.09 and 0.16 μg ml⁻¹ for the first and second methods, respectively. The precisions of the methods were satisfactory; the relative standard deviations ranged from 1.69 to 1.98%. The proposed methods were successfully applied to the analysis of AML in pure and pharmaceutical dosage forms with good accuracy; the recovery percentages ranged from 100.4 to 100.8 ± 1.70-2.32%. The results compared favorably with those of the reported method.
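    Detection and quantitation limits like those reported here are conventionally derived from the calibration line via the ICH-style relations LOD = 3.3·σ/S and LOQ = 10·σ/S, where σ is the residual standard deviation of the regression and S its slope. A sketch with hypothetical fluorescence readings (the concentration range mirrors the ninhydrin method above, but the signals are invented):

```python
# Ordinary least-squares calibration line plus ICH-style detection limits.
# The fluorescence readings below are hypothetical, not the paper's data.

def fit_line(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    # Residual standard deviation of the regression (n - 2 degrees of freedom)
    resid = [yi - (slope * xi + intercept) for xi, yi in zip(x, y)]
    s_res = (sum(r * r for r in resid) / (n - 2)) ** 0.5
    return slope, intercept, s_res

conc = [0.35, 0.6, 0.9, 1.2, 1.5, 1.8]                # ug/mL
signal = [70.2, 121.0, 180.5, 241.1, 299.8, 360.9]    # fluorescence intensity

slope, intercept, s_res = fit_line(conc, signal)
lod = 3.3 * s_res / slope   # limit of detection
loq = 10.0 * s_res / slope  # limit of quantitation
print(slope, lod, loq)
```

    The same two-line calculation underlies the ng/mL limits quoted in the HPLC records further down.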

  10. Indentation Measurements to Validate Dynamic Elasticity Imaging Methods.

    PubMed

    Altahhan, Khaldoon N; Wang, Yue; Sobh, Nahil; Insana, Michael F

    2016-09-01

    We describe macro-indentation techniques for estimating the elastic modulus of soft hydrogels. Our study describes (a) conditions under which quasi-static indentation can validate dynamic shear-wave imaging estimates and (b) how each of these techniques uniquely biases modulus estimates as they couple to the sample geometry. Harmonic shear waves between 25 and 400 Hz were imaged using ultrasonic Doppler and optical coherence tomography methods to estimate shear dispersion. From the shear-wave speed, average elastic moduli of homogeneous samples were estimated. These results are compared directly with macroscopic indentation measurements made in two ways. One set of measurements applied Hertzian theory to the loading phase of the force-displacement curves, using samples treated to minimize surface adhesion forces. A second set of measurements applied Johnson-Kendall-Roberts theory to the unloading phase of the force-displacement curve when surface adhesions were significant. All measurements were made using gelatin hydrogel samples of different sizes and concentrations. Agreement within 5% among elastic modulus estimates was achieved for a range of experimental conditions. Consequently, a simple quasi-static indentation measurement using a common gel can provide elastic modulus measurements that help validate dynamic shear-wave imaging estimates. PMID:26376923
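    A Hertzian loading-phase fit of the kind described reduces to a one-parameter least-squares problem, since for a spherical indenter F = (4/3)·E*·√R·δ^(3/2) with E* = E/(1 − ν²). The sketch below assumes an incompressible gel (ν = 0.5) and invented force-displacement data, not the paper's measurements:

```python
import math

def hertz_modulus(force_N, depth_m, R_m, nu=0.5):
    """Estimate E from Hertzian spherical contact, F = (4/3) * E* * sqrt(R) * d^1.5,
    where E* = E / (1 - nu^2). Least-squares slope of F vs d^1.5 through the origin."""
    x = [d ** 1.5 for d in depth_m]
    slope = sum(f * xi for f, xi in zip(force_N, x)) / sum(xi * xi for xi in x)
    e_star = slope * 3.0 / (4.0 * math.sqrt(R_m))
    return e_star * (1.0 - nu ** 2)

# Hypothetical loading curve for a soft gel indented by a 5 mm radius sphere
depth = [d * 1e-4 for d in (1, 2, 3, 4, 5)]                # m
E_true = 10e3                                               # 10 kPa gel
e_star = E_true / (1 - 0.5 ** 2)
force = [(4 / 3) * e_star * math.sqrt(5e-3) * d ** 1.5 for d in depth]

print(hertz_modulus(force, depth, 5e-3))  # recovers ~10000 Pa
```

    On real data the fit is restricted to small indentation depths, where the Hertz small-strain assumption holds.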

  11. Computational Methods for RNA Structure Validation and Improvement.

    PubMed

    Jain, Swati; Richardson, David C; Richardson, Jane S

    2015-01-01

    With increasing recognition of the roles RNA molecules and RNA/protein complexes play in an unexpected variety of biological processes, understanding of RNA structure-function relationships is of high current importance. To make clean biological interpretations from three-dimensional structures, it is imperative to have high-quality, accurate RNA crystal structures available, and the community has thoroughly embraced that goal. However, due to the many degrees of freedom inherent in RNA structure (especially for the backbone), it is a significant challenge to succeed in building accurate experimental models for RNA structures. This chapter describes the tools and techniques our research group and our collaborators have developed over the years to help RNA structural biologists both evaluate and achieve better accuracy. Expert analysis of large, high-resolution, quality-conscious RNA datasets provides the fundamental information that enables automated methods for robust and efficient error diagnosis in validating RNA structures at all resolutions. The even more crucial goal of correcting the diagnosed outliers has steadily developed toward highly effective, computationally based techniques. Automation enables solving complex issues in large RNA structures, but cannot circumvent the need for thoughtful examination of local details, and so we also provide some guidance for interpreting and acting on the results of current structure validation for RNA. PMID:26068742

  12. The Equivalence of Positive and Negative Methods of Validating a Learning Hierarchy.

    ERIC Educational Resources Information Center

    Kee, Kevin N.; White, Richard T.

    1979-01-01

    The compound nature of Gagne's original definition of learning hierarchies leads to two methods of validation, the positive and negative methods. Sections of a hierarchy that had been validated by the negative method were subjected to test by the more cumbersome positive method, and again were found to be valid. (Author/RD)

  13. Validating for Use and Interpretation: A Mixed Methods Contribution Illustrated

    ERIC Educational Resources Information Center

    Morell, Linda; Tan, Rachael Jin Bee

    2009-01-01

    Researchers in the areas of psychology and education strive to understand the intersections among validity, educational measurement, and cognitive theory. Guided by a mixed model conceptual framework, this study investigates how respondents' opinions inform the validation argument. Validity evidence for a science assessment was collected through…

  14. Line profile reconstruction: validation and comparison of reconstruction methods

    NASA Astrophysics Data System (ADS)

    Tsai, Ming-Yi; Yost, Michael G.; Wu, Chang-Fu; Hashmonay, Ram A.; Larson, Timothy V.

    Currently, open path Fourier transform infrared (OP-FTIR) spectrometers have been applied in some fenceline monitoring, but their use has been limited because path-integrated concentration measurements typically only provide an estimate of the average concentration. We present a series of experiments that further explore the use of path-integrated measurements to reconstruct various pollutant distributions along a linear path. Our experiments were conducted in a ventilation chamber using an OP-FTIR instrument to monitor a tracer-gas release over a fenceline configuration. These experiments validate a line profile method (1-D reconstruction). Additionally, we expand current reconstruction techniques by applying the bootstrap to our measurements. We compared our reconstruction results to our point samplers using the concordance correlation factor (CCF). Of the four different release types, three were successfully reconstructed with CCFs greater than 0.9. The difficult reconstruction involved a narrow release where the pollutant was limited to one segment of the segmented beam path. In general, of the three reconstruction methods employed, the average of the bootstrapped reconstructions had the highest CCFs when compared to the point samplers. Furthermore, the bootstrap method was the most flexible and allowed a determination of the uncertainty surrounding our reconstructions.
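    The concordance correlation factor used above to score reconstructions against point samplers is Lin's concordance coefficient, which penalizes both poor correlation and systematic offset between the two series. A minimal sketch with hypothetical concentration pairs:

```python
from statistics import mean

def ccf(x, y):
    """Lin's concordance correlation coefficient between reconstructed (x) and
    point-sampler (y) values: 2*cov / (var_x + var_y + (mean_x - mean_y)^2)."""
    mx, my = mean(x), mean(y)
    n = len(x)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    vx = sum((a - mx) ** 2 for a in x) / n
    vy = sum((b - my) ** 2 for b in y) / n
    return 2 * cov / (vx + vy + (mx - my) ** 2)

# Hypothetical concentrations at the point-sampler locations
reconstructed = [1.0, 2.1, 3.9, 2.0, 1.1]
measured      = [1.1, 2.0, 4.0, 2.2, 1.0]
print(round(ccf(reconstructed, measured), 3))  # -> 0.993
```

    A CCF of 1 requires the points to fall exactly on the identity line, which is why it is stricter than Pearson's r for validation work.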

  15. [Validation of a HPLC method for ochratoxin A determination].

    PubMed

    Bulea, Delia; Spac, A F; Dorneanu, V

    2011-01-01

    Ochratoxin A is a mycotoxin produced by various species of Aspergillus and Penicillium. Ochratoxin A has been detected in cereals and cereal products, coffee beans, beer, wine, spices, pig's kidney and cow's milk. For ochratoxin A, an HPLC method was developed and validated. Ochratoxin A was determined by RP-HPLC, using a liquid chromatograph type HP 1090 Series II equipped with a fluorescence detector. The analysis was performed with a Phenomenex column, type Luna C18(2) 100A (150 x 4.6 mm; 5 μm), with a mobile phase consisting of a mixture of acetonitrile/water/acetic acid (99/99/2) at a flow rate of 0.7 mL/min. For detection, the excitation wavelength was 228 nm and the emission wavelength was 423 nm. The calibration graph was linear over the 6.25-50 ng/mL concentration range (r2 = 0.9991). The detection limit was 1.6 ng/mL and the quantification limit was 4.9 ng/mL. The method precision (RSD = 2.4975%) and accuracy (recovery of 100.1%) were studied. The HPLC method was applied to ochratoxin A in food samples with good results. PMID:21870763

  16. Testing and Validation of the Dynamic Inertia Measurement Method

    NASA Technical Reports Server (NTRS)

    Chin, Alexander W.; Herrera, Claudia Y.; Spivey, Natalie D.; Fladung, William A.; Cloutier, David

    2015-01-01

    The Dynamic Inertia Measurement (DIM) method uses a ground vibration test setup to determine the mass properties of an object using information from frequency response functions. Most conventional mass properties testing involves using spin tables or pendulum-based swing tests, which for large aerospace vehicles becomes increasingly difficult and time-consuming, and therefore expensive, to perform. The DIM method has been validated on small test articles but has not been successfully proven on large aerospace vehicles. In response, the National Aeronautics and Space Administration Armstrong Flight Research Center (Edwards, California) conducted mass properties testing on an "iron bird" test article that is comparable in mass and scale to a fighter-type aircraft. The simple two-I-beam design of the "iron bird" was selected to ensure accurate analytical mass properties. Traditional swing testing was also performed to compare the level of effort, amount of resources, and quality of data with the DIM method. The DIM test showed favorable results for the center of gravity and moments of inertia; however, the products of inertia showed disagreement with analytical predictions.

  17. Validation of a digital PCR method for quantification of DNA copy number concentrations by using a certified reference material.

    PubMed

    Deprez, Liesbet; Corbisier, Philippe; Kortekaas, Anne-Marie; Mazoua, Stéphane; Beaz Hidalgo, Roxana; Trapmann, Stefanie; Emons, Hendrik

    2016-09-01

    Digital PCR has become the emerging technique for the sequence-specific detection and quantification of nucleic acids for various applications. During the past years, numerous reports on the development of new digital PCR methods have been published. Maturation of these developments into reliable analytical methods suitable for diagnostic or other routine testing purposes requires their validation for the intended use. Here, the results of an in-house validation of a droplet digital PCR method are presented. This method is intended for the quantification of the absolute copy number concentration of a purified linearized plasmid in solution with a nucleic acid background. It has been investigated which factors within the measurement process have a significant effect on the measurement results, and the contribution to the overall measurement uncertainty has been estimated. A comprehensive overview is provided on all the aspects that should be investigated when performing an in-house method validation of a digital PCR method. PMID:27617230
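    Although the paper's in-house validation procedure is not reproduced here, droplet digital PCR quantification generally rests on a Poisson correction: with a fraction p of positive droplets, the mean number of copies per droplet is λ = −ln(1 − p). A sketch assuming a nominal 0.85 nL droplet volume and invented counts:

```python
import math

def ddpcr_copies_per_ul(positive, total, droplet_volume_nl=0.85):
    """Poisson estimate of target concentration from droplet counts.
    lambda = -ln(1 - p) is the mean copies per droplet, where p is the
    fraction of positive droplets; dividing by the droplet volume gives
    copies per microlitre of reaction."""
    p = positive / total
    lam = -math.log(1.0 - p)
    return lam / (droplet_volume_nl * 1e-3)  # nL -> uL

# Hypothetical droplet counts from one well
print(round(ddpcr_copies_per_ul(4800, 15000), 1))  # -> 453.7
```

    The Poisson step matters because a single droplet can contain more than one copy; simply counting positives would underestimate the concentration.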

  18. Formal methods and their role in digital systems validation for airborne systems

    NASA Technical Reports Server (NTRS)

    Rushby, John

    1995-01-01

    This report is based on one prepared as a chapter for the FAA Digital Systems Validation Handbook (a guide to assist FAA certification specialists with advanced technology issues). Its purpose is to explain the use of formal methods in the specification and verification of software and hardware requirements, designs, and implementations; to identify the benefits, weaknesses, and difficulties in applying these methods to digital systems used in critical applications; and to suggest factors for consideration when formal methods are offered in support of certification. The presentation concentrates on the rationale for formal methods and on their contribution to assurance for critical applications within a context such as that provided by DO-178B (the guidelines for software used on board civil aircraft); it is intended as an introduction for those to whom these topics are new.

  19. Determination of formaldehyde in food and feed by an in-house validated HPLC method.

    PubMed

    Wahed, P; Razzaq, Md A; Dharmapuri, S; Corrales, M

    2016-07-01

    Formalin is carcinogenic and detrimental to public health. The illegal addition of formalin (37% formaldehyde and 14% methanol) to foods to extend their shelf-life is considered to be a common practice in Bangladesh. The lack of accurate methods and the ubiquitous presence of formaldehyde in foods make the detection of illegally added formalin challenging. With the aim of helping regulatory authorities, a sensitive high-performance liquid chromatography method was validated for the quantitative determination of formaldehyde in mango, fish and milk. The method was fit for purpose and showed good analytical performance in terms of specificity, linearity, precision, recovery and robustness. The expanded uncertainty was <35%. The validated method was applied to screen samples of fruits, vegetables, fresh fish, milk and fish feed collected from different local markets in Dhaka, Bangladesh. Levels of formaldehyde in food samples were compared with published data. The applicability of the method to different food matrices suggests it has potential as a reference standard method. PMID:26920321

  20. Validation of the Benefit Forecasting Method: Organization Development Program to Increase Health Organization Membership. Training and Development Research Center, Project Number Eleven.

    ERIC Educational Resources Information Center

    Sleezer, Catherine M.; And Others

    This project is the sixth in a series of studies designed to validate the Training and Development Benefit Forecasting Method (BFM) sponsored by the Training and Development Research Center (TDRC) at the University of Minnesota. The purpose of this study was to validate the BFM's ability to forecast the benefits of an organization development…

  1. An Anatomically Validated Brachial Plexus Contouring Method for Intensity Modulated Radiation Therapy Planning

    SciTech Connect

    Van de Velde, Joris; Audenaert, Emmanuel; Speleers, Bruno; Vercauteren, Tom; Mulliez, Thomas; Vandemaele, Pieter; Achten, Eric; Kerckaert, Ingrid; D'Herde, Katharina; De Neve, Wilfried; Van Hoof, Tom

    2013-11-15

    Purpose: To develop contouring guidelines for the brachial plexus (BP) using anatomically validated cadaver datasets. Magnetic resonance imaging (MRI) and computed tomography (CT) were used to obtain detailed visualizations of the BP region, with the goal of achieving maximal inclusion of the actual BP in a small contoured volume while also accommodating anatomic variations. Methods and Materials: CT and MRI were obtained for 8 cadavers positioned for intensity modulated radiation therapy. 3-dimensional reconstructions of soft tissue (from MRI) and bone (from CT) were combined to create 8 separate enhanced CT project files. Dissection of the corresponding cadavers anatomically validated the reconstructions created. Seven enhanced CT project files were then automatically fitted, separately in different regions, to obtain a single dataset of superimposed BP regions that incorporated anatomic variations. From this dataset, improved BP contouring guidelines were developed. These guidelines were then applied to the 7 original CT project files and also to 1 additional file, left out of the superimposing procedure. The percentage of BP inclusion was compared with the published guidelines. Results: The anatomic validation procedure showed a high level of conformity between the 3-dimensional reconstructions generated and their dissected counterparts for the BP regions examined. Accurate and detailed BP contouring guidelines were developed, which provided corresponding guidance for each level in a clinical dataset. An average margin of 4.7 mm around the anatomically validated BP contour is sufficient to accommodate anatomic variations. Using the new guidelines, 100% inclusion of the BP was achieved, compared with a mean inclusion of 37.75% when published guidelines were applied. Conclusion: Improved guidelines for BP delineation were developed using combined MRI and CT imaging with validation by anatomic dissection.

  2. MICROORGANISMS IN BIOSOLIDS: ANALYTICAL METHODS DEVELOPMENT, STANDARDIZATION, AND VALIDATION

    EPA Science Inventory

    The objective of this presentation is to discuss pathogens of concern in biosolids, the analytical techniques used to evaluate microorganisms in biosolids, and to discuss standardization and validation of analytical protocols for microbes within such a complex matrix. Implicatio...

  3. STANDARDIZATION AND VALIDATION OF MICROBIOLOGICAL METHODS FOR EXAMINATION OF BIOSOLIDS

    EPA Science Inventory

    The objective of this presentation is to discuss pathogens of concern in biosolids, the analytical techniques used to evaluate microorganisms in biosolids, and to discuss standardization and validation of analytical protocols for microbes within a complex matrix. Implications of ...

  4. Comments on "validation of two innovative methods to measure contaminant mass flux in groundwater" by Goltz et al.

    NASA Astrophysics Data System (ADS)

    Sun, Kerang

    2014-12-01

    I wish to comment on the paper published by Goltz et al. in this journal, titled "Validation of two innovative methods to measure contaminant mass flux in groundwater" (Goltz et al., 2009). The paper presents the results of experiments Goltz et al. conducted in an artificial aquifer for the purpose of validating two recently developed methods to measure contaminant mass flux in groundwater, the tandem circulation well (TCW) method and the modified integral pumping test (MIPT) method. Their experimental results showed that the TCW method, implemented using both the multi-dipole technique and the tracer test technique, successfully estimated the mass fluxes with respective accuracies within 2% and 16% of the known values. The MIPT method, on the other hand, underestimated the mass flux by as much as 70%. My comments focus on the MIPT method.

  5. Cleaning validation 2: development and validation of an ion chromatographic method for the detection of traces of CIP-100 detergent.

    PubMed

    Resto, Wilfredo; Hernández, Darimar; Rey, Rosamil; Colón, Héctor; Zayas, José

    2007-05-01

    An ion chromatographic method with conductivity detection was developed and validated for the determination of traces of a clean-in-place (CIP) detergent in cleaning validation samples. It was shown to be linear, with a squared correlation coefficient (r(2)) of 0.9999, and gave average recoveries of 71.4% (area response factor) from stainless steel surfaces and 101% from cotton. The repeatability was found to be 2.17%, with an intermediate precision of 1.88% across the range. The method was also shown to be sensitive, with a detection limit (DL) of 0.13 ppm and a quantitation limit (QL) of 0.39 ppm for EDTA, which translates to less than 1 μL of CIP diluted in 100 mL of diluent in both cases. The EDTA signal was well resolved from typical ions encountered in water samples and from any other interference presented by swabs and surfaces. The validated method is suitable for inclusion in a rapid and reliable cleaning validation program. PMID:17344013

  6. VALIDATION OF ANALYTICAL METHODS AND INSTRUMENTATION FOR BERYLLIUM MEASUREMENT: REVIEW AND SUMMARY OF AVAILABLE GUIDES, PROCEDURES, AND PROTOCOLS

    SciTech Connect

    Ekechukwu, A

    2009-05-27

    Method validation is the process of evaluating whether an analytical method is acceptable for its intended purpose. For pharmaceutical methods, guidelines from the United States Pharmacopeia (USP), International Conference on Harmonisation (ICH), and the United States Food and Drug Administration (USFDA) provide a framework for performing such validations. In general, methods for regulatory compliance must include studies on specificity, linearity, accuracy, precision, range, detection limit, quantitation limit, and robustness. Elements of these guidelines are readily adapted to the issue of validation for beryllium sampling and analysis. This document provides a listing of available sources which can be used to validate analytical methods and/or instrumentation for beryllium determination. A literature review was conducted of available standard methods and publications used for method validation and/or quality control. A comprehensive listing of the articles, papers and books reviewed is given in the Appendix. Available validation documents and guides are listed therein; each has a brief description of application and use. In the referenced sources, there are varying approaches to validation and varying descriptions of the validation process at different stages in method development. This discussion focuses on validation and verification of fully developed methods and instrumentation that have been offered up for use or approval by other laboratories or official consensus bodies such as ASTM International, the International Standards Organization (ISO) and the Association of Official Analytical Chemists (AOAC). This review was conducted as part of a collaborative effort to investigate and improve the state of validation for measuring beryllium in the workplace and the environment. Documents and publications from the United States and Europe are included. Unless otherwise specified, all referenced documents were published in English.

  7. Guidelines for the validation of qualitative multi-residue methods used to detect pesticides in food.

    PubMed

    Mol, H G J; Reynolds, S L; Fussell, R J; Stajnbaher, D

    2012-08-01

    There is a current trend for many laboratories to develop and use qualitative gas chromatography-mass spectrometry (GC-MS) and liquid chromatography-mass spectrometry (LC-MS) based multi-residue methods (MRMs) in order to greatly increase the number of pesticides that they can target. Before these qualitative MRMs can be used for the monitoring of pesticide residues in food, their fitness for purpose needs to be established by initial method validation. This paper sets out to assess the performance of two such qualitative MRMs against a set of parameters and criteria that might be suitable for their effective validation. As expected, the ease of detection was often dependent on the particular pesticide/commodity combinations that were targeted, especially at the lowest concentrations tested (0.01 mg/kg). The two examples also clearly demonstrated that the percentage of pesticides detected was dependent on many factors, but particularly on the capabilities of the automated software/library packages and the parameters and threshold settings selected for operation. Another very important consideration was the condition of the chromatographic system and detector at the time of analysis. If the system was relatively clean, then the detection rate was much higher than if it had become contaminated over time by previous injections of sample extracts. The parameters and criteria suggested for method validation of qualitative MRMs are aimed at achieving a 95% confidence level of pesticide detection. However, the presence of any pesticide that is 'detected' will need subsequent analysis for quantification and, depending on the qualitative method used, further evidence of identity. PMID:22851355

  8. Use of Validation by Enterprises for Human Resource and Career Development Purposes. Cedefop Reference Series No 96

    ERIC Educational Resources Information Center

    Cedefop - European Centre for the Development of Vocational Training, 2014

    2014-01-01

    European enterprises give high priority to assessing skills and competences, seeing this as crucial for recruitment and human resource management. Based on a survey of 400 enterprises, 20 in-depth case studies and interviews with human resource experts in 10 countries, this report analyses the main purposes of competence assessment, the standards…

  9. Validation of a new method to measure contact and flight times during treadmill running.

    PubMed

    Ogueta-Alday, Ana; Morante, Juan C; Rodríguez-Marroyo, Jose A; García-López, Juan

    2013-05-01

    The purpose of this study was to validate a new method to measure contact and flight times during treadmill running and to test its reliability and sensitivity. Fifteen well-trained runners performed 7 sets of running at different speeds (from 10 to 22 km·h⁻¹). Contact and flight times were simultaneously recorded by a high-speed video system (gold standard method) and a new method based on laser technology (SportJump System Pro). Athletes were classified according to their foot strike pattern (rearfoot vs. midfoot and forefoot). The new method overestimated the contact time and underestimated the flight time with respect to the gold standard method (p < 0.001). However, relationships and intraclass correlation coefficients (ICCs) between both systems were very strong (r and ICC > 0.99, p < 0.001). Contact time differences between the 2 systems depended on running speed (p < 0.001) but not on foot strike pattern or runners' body mass. This made it possible to correct the differences in contact and flight times. The new method was sensitive for detecting small differences in contact time (<20 ms) when the running speed increased and when the type of foot strike pattern changed. Additionally, a low intraindividual step variability (coefficient of variation = 2.0 ± 0.5%) and high intra- (ICC = 0.998) and interobserver (ICC = 0.977) reliability were shown. In conclusion, the new method was validated and shown to be reliable and sensitive for detecting small differences in contact and flight times during treadmill running. Therefore, it could be used to compare biomechanical variables between groups in cross-sectional studies and to verify the influence of some independent variables (i.e., training, running economy, or performance) on running biomechanics. PMID:22836607
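    Because the laser-video difference in contact time depended linearly on running speed, a regression-based correction of the kind the authors describe can be sketched as follows; the paired readings below are invented for illustration, not the study's data:

```python
def speed_bias_correction(speeds, laser_ct, video_ct):
    """Fit the laser-minus-video contact-time difference as a linear function
    of running speed, then return a function that corrects new laser readings."""
    n = len(speeds)
    diffs = [l - v for l, v in zip(laser_ct, video_ct)]
    ms, md = sum(speeds) / n, sum(diffs) / n
    b = sum((s - ms) * (d - md) for s, d in zip(speeds, diffs)) / \
        sum((s - ms) ** 2 for s in speeds)
    a = md - b * ms
    return lambda speed, ct: ct - (a + b * speed)

# Hypothetical paired contact times (ms) at each treadmill speed (km/h)
speeds = [10, 12, 14, 16, 18, 20, 22]
video  = [280, 260, 243, 228, 215, 204, 194]   # gold standard
laser  = [292, 271, 253, 237, 223, 211, 200]   # systematic overestimate
correct = speed_bias_correction(speeds, laser, video)

print(round(correct(14, 253), 1))  # -> 243.0, matching the video value
```

    The same fitted line, with opposite sign, corrects the flight times, since each step's contact and flight times sum to the step duration.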

  10. Measurement Practices: Methods for Developing Content-Valid Student Examinations.

    ERIC Educational Resources Information Center

    Bridge, Patrick D.; Musial, Joseph; Frank, Robert; Roe, Thomas; Sawilowsky, Shlomo

    2003-01-01

    Reviews the fundamental principles associated with achieving a high level of content validity when developing tests for students. Suggests that the short-term efforts necessary to develop and integrate measurement theory into practice will lead to long-term gains for students, faculty, and academic institutions. (Includes 21 references.)…

  11. Cost-Benefit Considerations in Choosing among Cross-Validation Methods.

    ERIC Educational Resources Information Center

    Murphy, Kevin R.

    There are two general methods of cross-validation: empirical estimation, and formula estimation. In choosing a specific cross-validation procedure, one should consider both costs (e.g., inefficient use of available data in estimating regression parameters) and benefits (e.g., accuracy in estimating population cross-validity). Empirical…

  12. Validation Methods for Fault-Tolerant avionics and control systems, working group meeting 1

    NASA Technical Reports Server (NTRS)

    1979-01-01

    The proceedings of the first working group meeting on validation methods for fault tolerant computer design are presented. The state of the art in fault tolerant computer validation was examined in order to provide a framework for future discussions concerning research issues for the validation of fault tolerant avionics and flight control systems. The development of positions concerning critical aspects of the validation process is described.

  13. Testing alternative ground water models using cross-validation and other methods

    USGS Publications Warehouse

    Foglia, L.; Mehl, S.W.; Hill, M.C.; Perona, P.; Burlando, P.

    2007-01-01

    Many methods can be used to test alternative ground water models. Of concern in this work are methods able to (1) rank alternative models (also called model discrimination) and (2) identify observations important to parameter estimates and predictions (equivalent to the purpose served by some types of sensitivity analysis). Some of the measures investigated are computationally efficient; others are computationally demanding. The latter are generally needed to account for model nonlinearity. The efficient model discrimination methods investigated include the information criteria: the corrected Akaike information criterion, Bayesian information criterion, and generalized cross-validation. The efficient sensitivity analysis measures used are dimensionless scaled sensitivity (DSS), composite scaled sensitivity, and parameter correlation coefficient (PCC); the other statistics are DFBETAS, Cook's D, and observation-prediction statistic. Acronyms are explained in the introduction. Cross-validation (CV) is a computationally intensive nonlinear method that is used for both model discrimination and sensitivity analysis. The methods are tested using up to five alternative parsimoniously constructed models of the ground water system of the Maggia Valley in southern Switzerland. The alternative models differ in their representation of hydraulic conductivity. A new method for graphically representing CV and sensitivity analysis results for complex models is presented and used to evaluate the utility of the efficient statistics. The results indicate that for model selection, the information criteria produce similar results at much smaller computational cost than CV. For identifying important observations, the only obviously inferior linear measure is DSS; the poor performance was expected because DSS does not include the effects of parameter correlation and PCC reveals large parameter correlations. © 2007 National Ground Water Association.
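The computational efficiency of the information criteria comes from needing only one calibration per model: each candidate's fit and parameter count yield a score directly. A sketch of ranking invented alternative models by AICc and BIC for least-squares fits (model names and error values are illustrative, not from the Maggia Valley study):

```python
import math

def aicc(n, k, sse):
    """Corrected Akaike information criterion for a least-squares fit."""
    aic = n * math.log(sse / n) + 2 * k
    return aic + 2 * k * (k + 1) / (n - k - 1)

def bic(n, k, sse):
    """Bayesian information criterion for a least-squares fit."""
    return n * math.log(sse / n) + k * math.log(n)

# Invented alternatives: (parameters k, sum of squared errors), mimicking
# increasingly refined hydraulic-conductivity zonations.
models = {"uniform K": (2, 140.0), "3-zone K": (4, 90.0), "9-zone K": (10, 85.0)}
n = 60  # number of observations

ranked = sorted(models, key=lambda name: aicc(n, *models[name]))
for name in ranked:
    k, sse = models[name]
    print(f"{name:10s}  AICc={aicc(n, k, sse):6.1f}  BIC={bic(n, k, sse):6.1f}")
```

Both criteria penalize the 9-zone model's extra parameters despite its slightly better fit, illustrating why parsimoniously constructed alternatives can win the ranking.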

  14. Validation of a high performance liquid chromatography method for the stabilization of epigallocatechin gallate.

    PubMed

    Fangueiro, Joana F; Parra, Alexander; Silva, Amélia M; Egea, Maria A; Souto, Eliana B; Garcia, Maria L; Calpena, Ana C

    2014-11-20

    Epigallocatechin gallate (EGCG) is a green tea catechin with potential health benefits, such as anti-oxidant, anti-carcinogenic and anti-inflammatory effects. In general, EGCG is highly susceptible to degradation and therefore presents stability problems. The present paper focused on the stability of EGCG in HEPES (N-2-hydroxyethylpiperazine-N'-2-ethanesulfonic acid) medium with regard to pH dependency, storage temperature and the presence of ascorbic acid, a reducing agent. The evaluation of EGCG in HEPES buffer demonstrated that this molecule is not capable of maintaining its physicochemical properties and potential beneficial effects, since it is partially or completely degraded, depending on the EGCG concentration. The storage temperatures most suitable for maintaining its structure were shown to be the lower values (4 or -20 °C). pH 3.5 provided greater stability than pH 7.4. However, the presence of a reducing agent (i.e., ascorbic acid) was shown to provide greater protection against degradation of EGCG. A validation method based on RP-HPLC with UV-vis detection was carried out for two media: water and a biocompatible physiological medium composed of Transcutol®P, ethanol and ascorbic acid. The quantification of pure EGCG requires a validated HPLC method that can be applied in pharmacokinetic and pharmacodynamic studies. PMID:25175728

  15. The development and validation of methods for evaluating the immune system in preweaning piglets.

    PubMed

    Zeigler, Brandon M; Cameron, Mark; Nelson, Keith; Bailey, Kristi; Weiner, Myra L; Mahadevan, Brinda; Thorsrud, Bjorn

    2015-10-01

    The preweaning piglet has been found to be a valuable research model for testing ingredients used in infant formula. As part of the safety assessment, the neonate's immune system is an important component that has to be evaluated. In this study, three concurrent strategies were developed to assess immune system status. The methods included (1) immunophenotyping to assess circulating innate immune cell populations, (2) monitoring of circulating cytokines, particularly in response to a positive control agent, and (3) monitoring of localized gastrointestinal tissue cytokines using immunohistochemistry (IHC), particularly in response to a positive control agent. All assays were validated using white papers and regulatory guidance within a GLP environment. To validate the assays, precision, accuracy and sample stability were evaluated as needed using a fit-for-purpose approach. In addition, animals were treated with proinflammatory substances to detect a positive versus negative signal. In conclusion, these three methods were confirmed to be robust assays to evaluate the immune system and GIT-specific immune responses of preweaning piglets. PMID:26341191

  16. An Automatic Method for Geometric Segmentation of Masonry Arch Bridges for Structural Engineering Purposes

    NASA Astrophysics Data System (ADS)

    Riveiro, B.; DeJong, M.; Conde, B.

    2016-06-01

    Despite the tremendous advantages of laser scanning technology for the geometric characterization of built constructions, there are important limitations preventing more widespread implementation in the structural engineering domain. Even though the technology provides extensive and accurate information to perform structural assessment and health monitoring, many people are resistant to the technology due to the processing times involved. Thus, new methods that can automatically process LiDAR data and subsequently provide an automatic and organized interpretation are required. This paper presents a new method for fully automated point cloud segmentation of masonry arch bridges. The method efficiently creates segmented, spatially related and organized point clouds, which each contain the relevant geometric data for a particular component (pier, arch, spandrel wall, etc.) of the structure. The segmentation procedure comprises a heuristic approach for the separation of the different vertical walls, followed by image processing tools adapted to voxel structures that allow the efficient segmentation of the main structural elements of the bridge. The proposed methodology provides the essential processed data required for structural assessment of masonry arch bridges based on geometric anomalies. The method is validated using a representative sample of masonry arch bridges in Spain.
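The voxel structure that the image-processing stage relies on can be pictured as a spatial hash: points are binned into cubic cells so that neighborhood operations run over cells rather than raw points. A generic sketch of this idea, not the authors' implementation:

```python
def voxelize(points, cell=0.2):
    """Bin 3-D points (metres) into cubic voxels keyed by integer indices.
    Generic illustration of a voxel structure for point cloud processing."""
    grid = {}
    for x, y, z in points:
        key = (int(x // cell), int(y // cell), int(z // cell))
        grid.setdefault(key, []).append((x, y, z))
    return grid

# Three points; the first two fall into the same 0.2 m voxel.
cloud = [(0.05, 0.10, 0.0), (0.10, 0.15, 0.05), (1.0, 0.0, 0.0)]
grid = voxelize(cloud)
print(len(grid), "occupied voxels for", len(cloud), "points")
```

Image-processing operators (connected components, morphology) then act on the occupied-voxel grid, which is what makes segmentation of large clouds tractable.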

  17. Optimal combining of ground-based sensors for the purpose of validating satellite-based rainfall estimates

    NASA Technical Reports Server (NTRS)

    Krajewski, Witold F.; Rexroth, David T.; Kiriaki, Kiriakie

    1991-01-01

    Two problems related to radar rainfall estimation are described. The first part is a description of a preliminary data analysis for the purpose of statistical estimation of rainfall from multiple (radar and raingage) sensors. Raingage, radar, and joint radar-raingage estimation is described, and some results are given. Statistical parameters of rainfall spatial dependence are calculated and discussed in the context of optimal estimation. Quality control of radar data is also described. The second part describes radar scattering by ellipsoidal raindrops. An analytical solution is derived for the Rayleigh scattering regime. Single and volume scattering are presented. Comparison calculations with the known results for spheres and oblate spheroids are shown.

  18. The validation of analytical methods for drug substances and drug products in UK pharmaceutical laboratories.

    PubMed

    Clarke, G S

    1994-05-01

    Results of a survey on method validation of analytical procedures used in the testing of drug substances and finished products, of most major research based pharmaceutical companies with laboratories in the UK, are presented. The results indicate that although method validation shows an essential similarity in different laboratories (in particular, chromatographic assay methods are validated in a similar manner in most laboratories), there is much diversity in the detailed application of validation parameters. Testing procedures for drug substances are broadly similar to finished products. Many laboratories validate methods at clinical trial stage to the same extent and detail as at the marketing authorization application (MAA)/new drug application (NDA) submission stage, however, only a small minority of laboratories apply the same criteria to methodology at pre-clinical trial stage. Extensive details of method validation parameters are included in the summary tables of this survey, together with details of the median response given for the validation of the most extensively applied methods. These median response details could be useful in suggesting a harmonized approach to method validation as applied by UK pharmaceutical laboratories. These guidelines would extend beyond the recommendations made to date by regulatory authorities and pharmacopoeias in that minimum requirements for each method validation parameter, e.g. number of replicates, range and tolerance, could be harmonized, both between laboratories and also in Product Licence submissions. PMID:7948185

  19. A simple method to generate adipose stem cell-derived neurons for screening purposes.

    PubMed

    Bossio, Caterina; Mastrangelo, Rosa; Morini, Raffaella; Tonna, Noemi; Coco, Silvia; Verderio, Claudia; Matteoli, Michela; Bianco, Fabio

    2013-10-01

    Strategies involved in mesenchymal stem cell (MSC) differentiation toward neuronal cells for screening purposes are characterized by quality and quantity issues. Differentiated cells are often scarce with respect to starting undifferentiated population, and the differentiation process is usually quite long, with high risk of contamination and low yield efficiency. Here, we describe a novel simple method to induce direct differentiation of MSCs into neuronal cells, without neurosphere formation. Differentiated cells are characterized by clear morphological changes, expression of neuronal specific markers, showing functional response to depolarizing stimuli and electrophysiological properties similar to those of developing neurons. The method described here represents a valuable tool for future strategies aimed at personalized screening of therapeutic agents in vitro. PMID:23468184

  20. Convergent validity of a novel method for quantifying rowing training loads.

    PubMed

    Tran, Jacqueline; Rice, Anthony J; Main, Luana C; Gastin, Paul B

    2015-01-01

    Elite rowers complete rowing-specific and non-specific training, incorporating continuous and interval-like efforts spanning the intensity spectrum. However, established training load measures are unsuitable for use in some modes and intensities. Consequently, a new measure known as the T2minute method was created. The method quantifies load as the time spent in a range of training zones (time-in-zone), multiplied by intensity- and mode-specific weighting factors that scale the relative stress of different intensities and modes to the demands of on-water rowing. The purpose of this study was to examine the convergent validity of the T2minute method with Banister's training impulse (TRIMP), Lucia's TRIMP and Session-RPE when quantifying elite rowing training. Fourteen elite rowers (12 males, 2 females) were monitored during four weeks of routine training. Unadjusted T2minute loads (using coaches' estimates of time-in-zone) demonstrated moderate-to-strong correlations with Banister's TRIMP, Lucia's TRIMP and Session-RPE (rho: 0.58, 0.55 and 0.42, respectively). Adjusting T2minute loads by using actual time-in-zone data resulted in stronger correlations between the T2minute method and Banister's TRIMP and Lucia's TRIMP (rho: 0.85 and 0.81, respectively). The T2minute method is an appropriate in-field measure of elite rowing training loads, particularly when actual time-in-zone values are used to quantify load. PMID:25083912
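At its core, the T2minute method is a weighted time-in-zone sum. A toy sketch with invented weighting factors (the published, rowing-scaled T2minute weights are not reproduced here):

```python
# Invented mode- and intensity-specific weighting factors, standing in for
# the published T2minute factors scaled to on-water rowing demands.
WEIGHTS = {("rowing", "T2"): 1.0, ("rowing", "T4"): 1.5, ("cycling", "T2"): 0.8}

def t2minute_load(sessions):
    """Sum time-in-zone (minutes) weighted by mode- and intensity-specific
    factors, mirroring how the T2minute method quantifies training load."""
    return sum(minutes * WEIGHTS[(mode, zone)] for mode, zone, minutes in sessions)

week = [("rowing", "T2", 60), ("cycling", "T2", 30), ("rowing", "T4", 10)]
print(t2minute_load(week))
```

The study's key practical point survives even in this toy form: the accuracy of the load depends on the accuracy of the time-in-zone inputs, which is why actual time-in-zone data outperformed coaches' estimates.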

  1. Optimization and Validation of an ETAAS Method for the Determination of Nickel in Postmortem Material.

    PubMed

    Dudek-Adamska, Danuta; Lech, Teresa; Kościelniak, Paweł

    2015-01-01

    In this article, the optimization and validation of a procedure for the determination of total nickel in wet-digested samples of human body tissues (internal organs) for forensic toxicological purposes are presented. Four experimental setups of electrothermal atomic absorption spectrometry (ETAAS) using a Solaar MQZe (Thermo Electron Co.) were compared, using (i) no modifier, (ii) magnesium nitrate, (iii) palladium nitrate and (iv) a magnesium nitrate and ammonium dihydrogen phosphate mixture as chemical modifiers. It was ascertained that ETAAS without any modifier, with 1,300/2,400°C as the pyrolysis and atomization temperatures, respectively, can be used to determine total nickel at reference levels in biological materials as well as at the levels found in chronic or acute poisonings. The method developed was validated, obtaining a linear calibration range from 0.76 to 15.0 μg/L, a limit of detection of 0.23 µg/L, a limit of quantification of 0.76 µg/L, precision (as relative standard deviation) up to 10% and accuracy of 97.1% for the analysis of certified material (SRM 1577c Bovine Liver) and within a range from 99.2 to 109.9% for the recovery of fortified liver samples. PMID:25868556

  2. VALIDATION OF A METHOD FOR ESTIMATING LONG-TERM EXPOSURES BASED ON SHORT-TERM MEASUREMENTS

    EPA Science Inventory

    A method for estimating long-term exposures from short-term measurements is validated using data from a recent EPA study of exposure to fine particles. The method was developed a decade ago but long-term exposure data to validate it did not exist until recently. In this paper, ...

  3. ECVAM's approach to intellectual property rights in the validation of alternative methods.

    PubMed

    Linge, Jens P; Hartung, Thomas

    2007-08-01

    In this article, we discuss how intellectual property rights affect the validation of alternative methods at ECVAM. We point out recent cases and summarise relevant EU and OECD documents. Finally, we discuss guidelines for dealing with intellectual property rights during the validation of alternative methods at ECVAM. PMID:17850189

  4. Data on the verification and validation of segmentation and registration methods for diffusion MRI.

    PubMed

    Esteban, Oscar; Zosso, Dominique; Daducci, Alessandro; Bach-Cuadra, Meritxell; Ledesma-Carbayo, María J; Thiran, Jean-Philippe; Santos, Andres

    2016-09-01

    The verification and validation of segmentation and registration methods is a necessary assessment in the development of new processing methods. However, verification and validation of diffusion MRI (dMRI) processing methods is challenging due to the lack of gold-standard data. The data described here are related to the research article entitled "Surface-driven registration method for the structure-informed segmentation of diffusion MR images" [1], in which publicly available data are used to derive gold-standard reference data to validate and evaluate segmentation and registration methods in dMRI. PMID:27508235

  5. Validation of two ribosomal RNA removal methods for microbial metatranscriptomics

    SciTech Connect

    He, Shaomei; Wurtzel, Omri; Singh, Kanwar; Froula, Jeff L; Yilmaz, Suzan; Tringe, Susannah G; Wang, Zhong; Chen, Feng; Lindquist, Erika A; Sorek, Rotem; Hugenholtz, Philip

    2010-10-01

    The predominance of rRNAs in the transcriptome is a major technical challenge in sequence-based analysis of cDNAs from microbial isolates and communities. Several approaches have been applied to deplete rRNAs from (meta)transcriptomes, but no systematic investigation of potential biases introduced by any of these approaches has been reported. Here we validated the effectiveness and fidelity of the two most commonly used approaches, subtractive hybridization and exonuclease digestion, as well as combinations of these treatments, on two synthetic five-microorganism metatranscriptomes using massively parallel sequencing. We found that the effectiveness of rRNA removal was a function of community composition and RNA integrity for these treatments. Subtractive hybridization alone introduced the least bias in relative transcript abundance, whereas exonuclease and in particular combined treatments greatly compromised mRNA abundance fidelity. Illumina sequencing itself can also compromise quantitative data analysis by introducing a G+C bias between runs.

  6. An evaluation of alternate production methods for Pu-238 general purpose heat source pellets

    SciTech Connect

    Mark Borland; Steve Frank

    2009-06-01

    For the past half century, the National Aeronautics and Space Administration (NASA) has used Radioisotope Thermoelectric Generators (RTG) to power deep space satellites. Fabricating heat sources for RTGs, specifically General Purpose Heat Sources (GPHSs), has remained essentially unchanged since their development in the 1970s. Meanwhile, 30 years of technological advancements have been made in the applicable fields of chemistry, manufacturing and control systems. This paper evaluates alternative processes that could be used to produce Pu-238 fueled heat sources. Specifically, this paper discusses the production of the plutonium-oxide granules, which are the input stream to the ceramic pressing and sintering processes. Alternate chemical processes are compared to current methods to determine if alternative fabrication processes could reduce the hazards, especially the production of respirable fines, while producing an equivalent GPHS product.

  7. STATISTICAL VALIDATION OF SULFATE QUANTIFICATION METHODS USED FOR ANALYSIS OF ACID MINE DRAINAGE

    EPA Science Inventory

    Turbidimetric method (TM), ion chromatography (IC) and inductively coupled plasma atomic emission spectrometry (ICP-AES) with and without acid digestion have been compared and validated for the determination of sulfate in mining wastewater. Analytical methods were chosen to compa...

  8. Comparison of Machine Learning Methods for the Purpose Of Human Fall Detection

    NASA Astrophysics Data System (ADS)

    Strémy, Maximilián; Peterková, Andrea

    2014-12-01

    According to several studies, the European population has been aging rapidly over recent years. It is therefore important to ensure that the aging population is able to live independently without the support of the working-age population. According to these studies, falls are the most dangerous and most frequent accidents in the everyday life of the aging population. In our paper, we present a system to track human falls by visual detection, i.e. using no wearable equipment. For this purpose, we used a Kinect sensor, which provides the human body position in Cartesian coordinates. It is possible to capture a human body directly because the Kinect sensor has a depth camera as well as an infrared camera. The first step in our research was to detect postures and classify fall accidents. We experimented with and compared selected machine learning methods, including Naive Bayes, decision trees and SVM, to compare their performance in recognizing human postures (standing, sitting and lying). The highest classification accuracy, over 93.3%, was achieved by the decision tree method.
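The posture-recognition step can be pictured as threshold rules on skeleton-derived features, which is essentially what a trained decision tree encodes. The features and thresholds below are invented for illustration, not taken from the paper:

```python
def classify_posture(head_height_m, vertical_extent_m):
    """Toy decision rules on two hypothetical skeleton features:
    height of the head above the floor, and the vertical extent
    spanned by the tracked joints (both in metres)."""
    if head_height_m < 0.5 and vertical_extent_m < 0.6:
        return "lying"      # whole body close to the floor
    if head_height_m < 1.1:
        return "sitting"    # head lowered, body still upright-ish
    return "standing"

for sample in [(1.65, 1.75), (0.90, 1.00), (0.30, 0.40)]:
    print(sample, "->", classify_posture(*sample))
```

A learned tree would pick such splits automatically from labeled Kinect frames; a fall event would then show up as a rapid transition into the "lying" class.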

  9. 25 CFR 309.8 - For marketing purposes, what is the recommended method of identifying authentic Indian products?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 25 Indians 2 2011-04-01 2011-04-01 false For marketing purposes, what is the recommended method of identifying authentic Indian products? 309.8 Section 309.8 Indians INDIAN ARTS AND CRAFTS BOARD, DEPARTMENT OF THE INTERIOR PROTECTION OF INDIAN ARTS AND CRAFTS PRODUCTS § 309.8 For marketing purposes, what is the recommended method of...

  10. 25 CFR 309.8 - For marketing purposes, what is the recommended method of identifying authentic Indian products?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 25 Indians 2 2013-04-01 2013-04-01 false For marketing purposes, what is the recommended method of identifying authentic Indian products? 309.8 Section 309.8 Indians INDIAN ARTS AND CRAFTS BOARD, DEPARTMENT OF THE INTERIOR PROTECTION OF INDIAN ARTS AND CRAFTS PRODUCTS § 309.8 For marketing purposes, what is the recommended method of...

  11. 25 CFR 309.8 - For marketing purposes, what is the recommended method of identifying authentic Indian products?

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 25 Indians 2 2012-04-01 2012-04-01 false For marketing purposes, what is the recommended method of identifying authentic Indian products? 309.8 Section 309.8 Indians INDIAN ARTS AND CRAFTS BOARD, DEPARTMENT OF THE INTERIOR PROTECTION OF INDIAN ARTS AND CRAFTS PRODUCTS § 309.8 For marketing purposes, what is the recommended method of...

  12. 25 CFR 309.8 - For marketing purposes, what is the recommended method of identifying authentic Indian products?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 25 Indians 2 2014-04-01 2014-04-01 false For marketing purposes, what is the recommended method of identifying authentic Indian products? 309.8 Section 309.8 Indians INDIAN ARTS AND CRAFTS BOARD, DEPARTMENT OF THE INTERIOR PROTECTION OF INDIAN ARTS AND CRAFTS PRODUCTS § 309.8 For marketing purposes, what is the recommended method of...

  13. 25 CFR 309.8 - For marketing purposes, what is the recommended method of identifying authentic Indian products?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 25 Indians 2 2010-04-01 2010-04-01 false For marketing purposes, what is the recommended method of identifying authentic Indian products? 309.8 Section 309.8 Indians INDIAN ARTS AND CRAFTS BOARD, DEPARTMENT OF THE INTERIOR PROTECTION OF INDIAN ARTS AND CRAFTS PRODUCTS § 309.8 For marketing purposes, what is the recommended method of...

  14. FIELD VALIDATION OF EPA (ENVIRONMENTAL PROTECTION AGENCY) REFERENCE METHOD 23

    EPA Science Inventory

    The accuracy and precision of U.S. Environmental Protection Agency Reference Method 23 was evaluated at a trichloroethylene degreasing facility and an ethylene dichloride plant. The method consists of a procedure for obtaining an integrated sample followed by gas chromatographic ...

  15. Testing and Validation of the Dynamic Inertia Measurement Method

    NASA Technical Reports Server (NTRS)

    Chin, Alexander; Herrera, Claudia; Spivey, Natalie; Fladung, William; Cloutier, David

    2015-01-01

    This presentation describes the DIM method and how it measures the inertia properties of an object by analyzing the frequency response functions measured during a ground vibration test (GVT). The DIM method has been in development at the University of Cincinnati and has shown success on a variety of small scale test articles. The NASA AFRC version was modified for larger applications.

  16. Validation of a Generic qHNMR Method for Natural Products Analysis†

    PubMed Central

    Gödecke, Tanja; Napolitano, José G.; Rodríguez-Brasco, María F.; Chen, Shao-Nong; Jaki, Birgit U.; Lankin, David C.; Pauli, Guido F.

    2014-01-01

    Introduction: Nuclear magnetic resonance (NMR) spectroscopy is increasingly employed in the quantitative analysis and quality control (QC) of natural products (NPs), including botanical dietary supplements (BDSs). The establishment of qHNMR-based QC protocols requires method validation. Objective: Develop and validate a generic qHNMR method; optimize acquisition and processing parameters, with specific attention to the requirements for the analysis of complex NP samples, including botanicals and purity assessment of NP isolates. Methodology: In order to establish the validated qHNMR method, samples containing two highly pure reference materials were used. The influence of acquisition and processing parameters on the method validation was examined, and general aspects of method validation of qHNMR methods are discussed. Subsequently, the established method was applied to the analysis of two natural product samples: a purified reference compound and a crude mixture. Results: The accuracy and precision of qHNMR using internal or external calibration were compared, using a validated method suitable for complex samples. The impact of post-acquisition processing on method validation was examined using three software packages: TopSpin, MNova, and NUTS. The dynamic range of the developed qHNMR method was 5,000:1 with a limit of detection (LOD) of better than 10 μM. The limit of quantification (LOQ) depends on the desired level of accuracy and the experiment time spent. Conclusions: This study revealed that acquisition parameters, processing parameters, and processing software all contribute to qHNMR method validation. A validated method with high dynamic range and a general workflow for qHNMR analysis of NPs is proposed. PMID:23740625
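Internal calibration in qHNMR rests on the proportionality between a signal's integral and the number of protons producing it. The standard internal-standard relation can be sketched as follows (variable names and the example numbers are illustrative):

```python
def qhnmr_purity_percent(i_a, i_cal, n_a, n_cal, m_a, m_cal, w_a, w_cal, p_cal):
    """Analyte purity (%) by internal-standard qHNMR.
    i: signal integral, n: protons in the integrated signal,
    m: molar mass (g/mol), w: weighed mass (mg), p_cal: calibrant purity (%).
    Standard relation: purity scales integrals per proton by the molar-mass
    and weighed-mass ratios of analyte vs. calibrant."""
    return (i_a / i_cal) * (n_cal / n_a) * (m_a / m_cal) * (w_cal / w_a) * p_cal

# Illustrative numbers only: equal integrals per proton, analyte with twice
# the calibrant's molar mass, half as much calibrant weighed in.
print(qhnmr_purity_percent(1.0, 1.0, 1, 1, 300.0, 150.0, 20.0, 10.0, 99.9))
```

External calibration replaces the weighed-in calibrant with a separately measured reference spectrum, which is where the accuracy/precision trade-off examined in the study arises.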

  17. Flight critical system design guidelines and validation methods

    NASA Technical Reports Server (NTRS)

    Holt, H. M.; Lupton, A. O.; Holden, D. G.

    1984-01-01

    Efforts being expended at NASA-Langley to define a validation methodology, techniques for comparing advanced systems concepts, and design guidelines for characterizing fault tolerant digital avionics are described with an emphasis on the capabilities of AIRLAB, an environmentally controlled laboratory. AIRLAB has VAX 11/750 and 11/780 computers with an aggregate of 22 Mb memory and over 650 Mb storage, interconnected at 256 kbaud. An additional computer is programmed to emulate digital devices. Ongoing work is easily accessed at user stations by either chronological or key word indexing. The CARE III program aids in analyzing the capabilities of test systems to recover from faults. An additional code, the semi-Markov unreliability program (SURE) generates upper and lower reliability bounds. The AIRLAB facility is mainly dedicated to research on designs of digital flight-critical systems which must have acceptable reliability before incorporation into aircraft control systems. The digital systems would be too costly to submit to a full battery of flight tests and must be initially examined with the AIRLAB simulation capabilities.

  18. Production of general purpose heat source (GPHS) using advanced manufacturing methods

    NASA Astrophysics Data System (ADS)

    Miller, Roger G.

    1996-03-01

    Mankind will continue to explore the stars through the use of unmanned spacecraft until the technology and costs are compatible with sending travelers to the outer planets of our solar system and beyond. Unmanned probes of the present and future will be necessary to develop the necessary technologies and obtain the information that will make this travel possible. Because of the significant costs incurred, modern manufacturing technologies must be used to lower the investment needed, even when it is shared by international partnerships. For over the last 30 years, radioisotopes have provided the heat from which electrical power is extracted. Electric power for future spacecraft will be provided by either Radioisotope Thermoelectric Generators (RTG), Radioisotopic Thermophotovoltaic systems (RTPV), radioisotope Stirling systems, or a combination of these. All of these systems will be thermally driven by General Purpose Heat Source (GPHS) fueled clads in some configuration. The GPHS clad contains a 238PuO2 pellet encapsulated in an iridium alloy container. Historically, the fabrication of the iridium alloy shells has been performed at EG&G Mound and Oak Ridge National Laboratory (ORNL), and girth welding at Westinghouse Savannah River Corporation (WSRC) and Los Alamos National Laboratory (LANL). This paper will describe the use of laser processing for welding, drilling, cutting, and machining, together with other manufacturing methods, to reduce the costs of producing GPHS fueled clad components and completed assemblies. Incorporation of new quality technologies will complement these manufacturing methods to reduce cost.

  19. Validation of the WHO Hemoglobin Color Scale Method

    PubMed Central

    Darshana, Leeniyagala Gamaralalage Thamal; Uluwaduge, Deepthi Inoka

    2014-01-01

    This study was carried out to evaluate the diagnostic accuracy of the WHO color scale in screening for anemia during blood donor selection in Sri Lanka. A comparative cross-sectional study was conducted by the Medical Laboratory Sciences Unit of the University of Sri Jayewardenepura in collaboration with the National Blood Transfusion Centre, Sri Lanka. A total of 100 subjects participated in this study. The hemoglobin value of each participant was analyzed by both the WHO color scale method and the cyanmethemoglobin method. A Bland-Altman plot was used to determine the agreement between the two methods. Sensitivity, specificity, predictive values, and false positive and false negative rates were calculated. The sensitivity of the WHO color scale was very low. The highest sensitivity observed was 55.55% at hemoglobin concentrations >13.1 g/dL and the lowest was 28.57% at hemoglobin concentrations between 7.1 and 9.0 g/dL. The mean difference between the WHO color scale and the cyanmethemoglobin method was 0.2 g/dL (95% limits of agreement: 3.2 g/dL above and 2.8 g/dL below). Even though the WHO color scale is an inexpensive and portable method for field studies, the overall results of this study lead to the conclusion that the WHO color scale is an inaccurate method for screening anemia during blood donations. PMID:24839555
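Screening statistics of the kind reported here follow directly from the 2×2 confusion counts against the reference method. A sketch with invented counts chosen to mirror the low sensitivity reported:

```python
def screening_stats(tp, fp, fn, tn):
    """Sensitivity, specificity and predictive values from a 2x2 table
    (screening test vs. reference method)."""
    sensitivity = tp / (tp + fn)   # anemic donors correctly flagged
    specificity = tn / (tn + fp)   # non-anemic donors correctly passed
    ppv = tp / (tp + fp)           # positive predictive value
    npv = tn / (tn + fn)           # negative predictive value
    return sensitivity, specificity, ppv, npv

# Hypothetical counts: 9 truly anemic donors, of whom the color scale
# catches only 5 (values invented for illustration, not from the study).
sens, spec, ppv, npv = screening_stats(tp=5, fp=10, fn=4, tn=81)
print(f"sensitivity={sens:.2%}  specificity={spec:.2%}")
```

For donor screening the cost of a false negative (accepting an anemic donor) dominates, which is why the low sensitivity drives the study's conclusion.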

  20. Thermogravimetric desorption and de novo tests I: method development and validation.

    PubMed

    Tsytsik, Palina; Czech, Jan; Carleer, Robert; Reggers, Guy; Buekens, Alfons

    2008-08-01

    Thermogravimetric analysis (TGA) has been combined with evolved gas analysis (EGA) with the purpose of simulating the thermal behaviour of filter dust samples under inert (desorption) and oxidising (de novo test) conditions. Emphasis is on studying the de novo formation of dioxins, surrogates and precursors arising from filter dust derived from thermal processes, such as municipal solid waste incineration and metallurgy. A new method is tested for sampling and analysing dioxin surrogates and precursors in the TGA effluent, which are collected on sampling tubes; the adsorbed compounds are subsequently desorbed and quantified by TD-GC-MS. The major sources of error and losses are considered, including potential sorbent artefacts, possible breakthrough of volatiles through the sampling tubes, and losses of semi-volatiles due to their incomplete desorption or re-condensation inside the TG analyser. The method is optimised and validated for di- to hexa-chlorinated benzenes in a range of 10-1000 ppb with average recovery exceeding 85%. The results are compared with data obtained in similar studies performed by other research groups. As a result, the method provides a means of simulating the de novo synthesis of dioxins in fly ash and facilitates reliable and easy estimation of de novo activity, comparable with the results of other studies, together with wide flexibility in testing conditions. PMID:18556042

  1. PEM fuel cell fault detection and identification using differential method: simulation and experimental validation

    NASA Astrophysics Data System (ADS)

    Frappé, E.; de Bernardinis, A.; Bethoux, O.; Candusso, D.; Harel, F.; Marchand, C.; Coquery, G.

    2011-05-01

    PEM fuel cell performance and lifetime strongly depend on the polymer membrane and MEA hydration. Because the internal moisture is very sensitive to the operating conditions (temperature, stoichiometry, load current, water management…), keeping the optimal working point is complex and requires real-time monitoring. This article focuses on PEM fuel cell stack health diagnosis and, more precisely, on stack fault detection monitoring. It defines new, simple and effective methods for obtaining relevant information on the usual faults and malfunctions occurring in a fuel cell stack. For this purpose, the authors present a fault detection method using a simple, non-intrusive on-line technique based on the space signature of the cell voltages. The objective is to minimize the number of embedded sensors and instrumentation in order to obtain a precise, reliable and economical solution for a mass-market application. Very few sensors are needed for this monitoring, and the associated algorithm can be implemented on-line. The technique is validated on a 20-cell PEMFC stack and proves particularly efficient in the flooding case. In effect, it uses the stack directly as a sensor, which provides quick feedback on its state of health.

  2. Validation of doubly labeled water method using a ruminant

    SciTech Connect

    Fancy, S.G.; Blanchard, J.M.; Holleman, D.F.; Kokjer, K.J.; White, R.G.

    1986-07-01

    CO₂ production (CDP, ml CO₂·g⁻¹·h⁻¹) by captive caribou and reindeer (Rangifer tarandus) was measured using the doubly labeled water method (³H₂O and H₂¹⁸O) and compared with CO₂ expiration rates (VCO₂), adjusted for CO₂ losses in CH₄ and urine, as determined by open-circuit respirometry. CDP calculated from samples of blood or urine from a reindeer in winter was 1-3% higher than the adjusted VCO₂. Differences of 5-20% between values derived by the two methods were found in summer trials with caribou. None of these differences was statistically significant (P greater than 0.05). The differences in summer could in part be explained by the net deposition of ³H, ¹⁸O, and unlabeled CO₂ in antlers and other growing tissues. Total body water volumes calculated from ³H₂O dilution were up to 15% higher than those calculated from H₂¹⁸O dilution. The doubly labeled water method appears to be a reasonably accurate method for measuring CDP by caribou and reindeer in winter when growth rates are low, but it may overestimate CDP by rapidly growing and/or fattening animals.

  3. Validation of a Numerical Method for Determining Liner Impedance

    NASA Technical Reports Server (NTRS)

    Watson, Willie R.; Jones, Michael G.; Tanner, Sharon E.; Parrott, Tony L.

    1996-01-01

    This paper reports the initial results of a test series to evaluate a method for determining the normal incidence impedance of a locally reacting, acoustically absorbing liner located on the lower wall of a duct in a grazing incidence, multi-modal, non-progressive acoustic wave environment without flow. This initial evaluation is accomplished by testing the method's ability to converge to the known normal incidence impedance of a solid steel plate, and to the normal incidence impedance of an absorbing test specimen whose impedance was measured in a conventional normal incidence tube. The method is shown to converge to the normal incidence impedance values and thus to be an adequate tool for determining the impedance of specimens in a grazing incidence, multi-modal, non-progressive acoustic wave environment for a broad range of source frequencies.

  4. Differences among methods to validate genomic evaluations for dairy cattle

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Two methods of testing predictions from genomic evaluations were investigated. Data used were from the April 2010 and August 2006 official USDA genetic evaluations of dairy cattle. The training data set consisted of both cows and bulls that were proven (had own or daughter information) as of Augus...

  5. Application of neural networks and geomorphometry method for purposes of urban planning (Kazan, Russia)

    NASA Astrophysics Data System (ADS)

    Yermolaev, Oleg; Selivanov, Renat

    2013-04-01

    The landscape structure of a territory imposes serious limitations on the adoption of certain planning decisions. Differentiation of the relief into separate elementary geomorphological sections yields the basis for the most adequate determination of the boundaries of urban geosystems. This paper presents the results of testing relief classification methods based on Artificial Neural Networks (ANN), specifically Kohonen's Self-Organizing Maps (SOM), for the automated zoning of a modern city's territory, using the city of Kazan as an example. The developed model of the restored landscapes represents the city territory as a system of geomorphologically homogeneous terrains. The main research objectives were: development of a digital elevation model of the city of Kazan; testing of relief classification methods based on ANN and expert estimations; creation of a SOM-based map of urban geosystems; verification of the classification results, with clarification and enlargement of landscape units; and determination of the applicability of the method for zoning large cities' territories, identifying its strengths and weaknesses. The first stage was the analysis and digitization of a detailed large-scale topographic map of Kazan, from which a digital elevation model with a grid size of 10 m was produced. These data were used to build analytical maps of several morphometric characteristics of the relief: height, slope, aspect, and profile and plan curvature. The calculated morphometric values were transformed into a data matrix. The software uses unsupervised training algorithms, with weight coefficients redistributed for each operational-territorial unit. After several iterations of this training process, the neural network gradually clusters operational-territorial units with similar sets of morphometric parameters; 81 classes have been distinguished. Such atomism…
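
    As a rough illustration of the clustering step described above, the sketch below trains a minimal one-dimensional Kohonen SOM on feature vectors in pure Python. It is a didactic toy under assumed hyperparameters, not the software or data used in the study.

```python
import math
import random

def train_som(data, n_units, epochs=50, lr0=0.5, radius0=None, seed=0):
    """Minimal 1-D Kohonen SOM: each unit holds a weight vector; the
    best-matching unit (BMU) and its neighbours move toward each sample,
    with learning rate and neighbourhood radius decaying over epochs."""
    rng = random.Random(seed)
    dim = len(data[0])
    weights = [[rng.random() for _ in range(dim)] for _ in range(n_units)]
    radius0 = radius0 or n_units / 2
    for epoch in range(epochs):
        frac = epoch / epochs
        lr = lr0 * (1 - frac)
        radius = max(radius0 * (1 - frac), 0.5)
        for x in data:
            bmu = min(range(n_units),
                      key=lambda i: sum((w - v) ** 2
                                        for w, v in zip(weights[i], x)))
            for i in range(n_units):
                # Gaussian neighbourhood on the 1-D unit lattice
                h = math.exp(-((i - bmu) ** 2) / (2 * radius ** 2))
                weights[i] = [w + lr * h * (v - w)
                              for w, v in zip(weights[i], x)]
    return weights

def classify(weights, x):
    """Assign a sample to the class (unit) of its best-matching weight vector."""
    return min(range(len(weights)),
               key=lambda i: sum((w - v) ** 2 for w, v in zip(weights[i], x)))
```

    In the study's setting, `data` would be the matrix of per-cell morphometric values (height, slope, aspect, curvatures) and the trained units would correspond to terrain classes.
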

  6. Cost-Benefit Considerations in Choosing among Cross-Validation Methods.

    ERIC Educational Resources Information Center

    Murphy, Kevin R.

    1984-01-01

    Outlines the costs and benefits associated with different cross-validation strategies, in particular the way in which study design affects the costs and benefits of different types of cross-validation. Suggests that the choice between empirical estimation methods and formula estimates involves a trade-off between accuracy and simplicity. (JAC)

  7. Double Cross-Validation in Multiple Regression: A Method of Estimating the Stability of Results.

    ERIC Educational Resources Information Center

    Rowell, R. Kevin

    In multiple regression analysis, where resulting predictive equation effectiveness is subject to shrinkage, it is especially important to evaluate result replicability. Double cross-validation is an empirical method by which an estimate of invariance or stability can be obtained from research data. A procedure for double cross-validation is…

  8. A Framework for Mixing Methods in Quantitative Measurement Development, Validation, and Revision: A Case Study

    ERIC Educational Resources Information Center

    Luyt, Russell

    2012-01-01

    A framework for quantitative measurement development, validation, and revision that incorporates both qualitative and quantitative methods is introduced. It extends and adapts Adcock and Collier's work, and thus, facilitates understanding of quantitative measurement development, validation, and revision as an integrated and cyclical set of…

  9. A Validation of Elements, Methods, and Barriers to Inclusive High School Service-Learning Programs

    ERIC Educational Resources Information Center

    Dymond, Stacy K.; Chun, Eul Jung; Kim, Rah Kyung; Renzaglia, Adelle

    2013-01-01

    A statewide survey of coordinators of inclusive high school service-learning programs was conducted to validate elements, methods, and barriers to including students with and without disabilities in service-learning. Surveys were mailed to 655 service-learning coordinators; 190 (29%) returned a completed survey. Findings support the validity of…

  10. Establishing Survey Validity and Reliability for American Indians Through “Think Aloud” and Test–Retest Methods

    PubMed Central

    Hauge, Cindy Horst; Jacobs-Knight, Jacque; Jensen, Jamie L.; Burgess, Katherine M.; Puumala, Susan E.; Wilton, Georgiana; Hanson, Jessica D.

    2015-01-01

    The purpose of this study was to use a mixed-methods approach to determine the validity and reliability of measurements used within an alcohol-exposed pregnancy prevention program for American Indian women. To develop validity, content experts provided input into the survey measures, and a “think aloud” methodology was conducted with 23 American Indian women. After revising the measurements based on this input, a test–retest was conducted with 79 American Indian women who were randomized to complete either the original measurements or the new, modified measurements. The test–retest revealed that some of the questions performed better for the modified version, whereas others appeared to be more reliable for the original version. The mixed-methods approach was a useful methodology for gathering feedback on survey measurements from American Indian participants and in indicating specific survey questions that needed to be modified for this population. PMID:25888693

  11. VALIDATION OF AN EMISSION MEASUREMENT METHOD FOR INORGANIC ARSENIC FROM STATIONARY SOURCES: PROPOSED METHOD 108. LABORATORY AND FIELD TEST EVALUATION

    EPA Science Inventory

    The United States Environmental Protection Agency (USEPA) has listed inorganic arsenic emissions as a hazardous air pollutant. USEPA proposed Method 108 for the measurement of these emissions from stationary sources has been subjected to validation studies in this work. Laborator...

  12. Validation needs of seismic probabilistic risk assessment (PRA) methods applied to nuclear power plants

    SciTech Connect

    Kot, C.A.; Srinivasan, M.G.; Hsieh, B.J.

    1985-01-01

    An effort to validate seismic PRA methods is in progress. The work concentrates on the validation of plant response and fragility estimates through the use of test data and information from actual earthquake experience. Validation needs have been identified in the areas of soil-structure interaction, structural response and capacity, and equipment fragility. Of particular concern is the adequacy of linear methodology to predict nonlinear behavior. While many questions can be resolved through the judicious use of dynamic test data, other aspects can only be validated by means of input and response measurements during actual earthquakes. A number of past, ongoing, and planned testing programs which can provide useful validation data have been identified, and validation approaches for specific problems are being formulated.

  13. System identification methods for aircraft flight control development and validation

    NASA Technical Reports Server (NTRS)

    Tischler, Mark B.

    1995-01-01

    System-identification methods construct a mathematical model, or series of models, from measurements of the inputs and outputs of dynamic systems. The extracted models allow the characterization of the overall aircraft response or of component subsystem behavior, such as actuators and on-board signal-processing algorithms. This paper discusses the use of frequency-domain system-identification methods for the development and integration of aircraft flight-control systems. The extraction and analysis of models of varying complexity, from nonparametric frequency responses to transfer functions and high-order state-space representations, is illustrated using the Comprehensive Identification from FrEquency Responses (CIFER) system-identification facility. Results are presented for test data from numerous flight and simulation programs at the Ames Research Center, including rotorcraft, fixed-wing aircraft, advanced short takeoff and vertical landing (ASTOVL), vertical/short takeoff and landing (V/STOL), tiltrotor aircraft, and rotor experiments in the wind tunnel. Excellent system characterization and dynamic response prediction are achieved for this wide class of systems. Examples illustrate the role of system-identification technology in providing an integrated flow of dynamic response data around the entire life cycle of aircraft development, from initial specifications, through simulation and bench testing, and into flight-test optimization.

  14. Experimental Validation for Hot Stamping Process by Using Taguchi Method

    NASA Astrophysics Data System (ADS)

    Fawzi Zamri, Mohd; Lim, Syh Kai; Razlan Yusoff, Ahmad

    2016-02-01

    The demand for reduced gas emissions, energy savings and safer vehicles has driven the development of Ultra High Strength Steel (UHSS) material. To strengthen a UHSS material such as boron steel, it must undergo a hot stamping process with heating at a certain temperature for a certain time. In this paper, the Taguchi method is applied to determine the appropriate parameters of thickness, heating temperature and heating time to achieve the optimum strength of boron steel. The experiment is conducted using a flat, square hot stamping tool with a tensile dog-bone blank. The tensile strength and hardness are then measured as responses. The results showed that lower thickness and higher heating temperature and heating time give higher strength and hardness in the final product. In conclusion, boron steel blanks are able to achieve up to 1200 MPa tensile strength and 650 HV hardness.
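
    In a Taguchi analysis of this kind, each parameter combination's measured responses are typically condensed into a signal-to-noise (S/N) ratio; for strength and hardness, the "larger-the-better" criterion applies. A minimal sketch (illustrative formulas, not the paper's data):

```python
import math

def sn_larger_the_better(responses):
    """Taguchi 'larger-the-better' S/N ratio in decibels:
    S/N = -10 * log10( (1/n) * sum(1 / y_i**2) )."""
    n = len(responses)
    return -10.0 * math.log10(sum(1.0 / (y * y) for y in responses) / n)

def best_setting(trials):
    """Pick the parameter setting whose responses maximize the S/N ratio.
    trials: dict mapping a setting label to its list of measured responses."""
    return max(trials, key=lambda k: sn_larger_the_better(trials[k]))
```
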

  15. Content Validity of National Post Marriage Educational Program Using Mixed Methods

    PubMed Central

    MOHAJER RAHBARI, Masoumeh; SHARIATI, Mohammad; KERAMAT, Afsaneh; YUNESIAN, Masoud; ESLAMI, Mohammad; MOUSAVI, Seyed Abbas; MONTAZERI, Ali

    2015-01-01

    Background: Although the content validity of programs is mostly assessed with qualitative methods, this study used both qualitative and quantitative methods to validate the content of the post-marriage training program provided for newly married couples. Content validation is a preliminary step toward obtaining the authorization required to install the program in the country's health care system. Methods: This mixed-method content validation study was carried out in four steps, with three expert panels. Altogether, 24 expert panelists were involved in the three qualitative and quantitative panels: 6 in the first, item-development panel; 12 in the item-reduction panel, 4 of whom also served on the first panel; and 10 executive experts in the last panel, organized to evaluate the psychometric properties (CVR, CVI and face validity) of 57 educational objectives. Results: The raw data of the post-marriage program had been written by professional experts of the Ministry of Health; through the qualitative expert panel, the content was further developed by generating 3 topics and refining one topic and its respective content. In the second panel, six further objectives were deleted, three for falling below the agreement cut-off point and three by experts' consensus. In the quantitative assessment, the validity of all items was above 0.8 and their content validity indices (0.8–1) were completely appropriate. Conclusion: This study provided good evidence for the validation and accreditation of the national post-marriage program planned for newly married couples in the country's health centers in the near future. PMID:26056672
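
    The CVR and CVI figures reported above follow standard formulas: Lawshe's content validity ratio, and the item-level CVI as the share of experts rating an item relevant. A quick sketch with hypothetical ratings:

```python
def cvr(n_essential, n_panelists):
    """Lawshe's content validity ratio: (n_e - N/2) / (N/2), where n_e is
    the number of panelists rating the item 'essential'. Ranges -1 to +1."""
    half = n_panelists / 2
    return (n_essential - half) / half

def i_cvi(ratings):
    """Item-level content validity index: the proportion of experts rating
    the item's relevance 3 or 4 on a 1-4 scale."""
    return sum(1 for r in ratings if r >= 3) / len(ratings)
```
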

  16. Validity assessment of the detection method of maize event Bt10 through investigation of its molecular structure.

    PubMed

    Milcamps, Anne; Rabe, Scott; Cade, Rebecca; De Framond, Anic J; Henriksson, Peter; Kramer, Vance; Lisboa, Duarte; Pastor-Benito, Susana; Willits, Michael G; Lawrence, David; Van den Eede, Guy

    2009-04-22

    In March 2005, U.S. authorities informed the European Commission of the inadvertent release of the unauthorized maize GM event Bt10 onto their market and, subsequently, into the grain channel. In the United States, measures were taken to eliminate Bt10 from seed and grain supplies; in the European Union, an embargo on maize gluten and brewer's grain imports was implemented unless consignments were certified free of Bt10 by a Bt10-specific PCR detection method. With the aim of assessing the validity of the Bt10 detection method, an in-depth analysis of the molecular organization of the genetic modification of this event was carried out both by the company Syngenta, which produced the event, and by the European Commission Joint Research Centre, which validated the detection method. Using a variety of molecular analytical tools, both organizations found the genetic modification of event Bt10 to be very complex in structure, with rearrangements, inversions, and multiple copies of the structural elements (cry1Ab, pat, and the amp gene), interspersed with small genomic maize fragments. Southern blot analyses demonstrated that all Bt10 elements were tightly linked on one large fragment, including the region that would generate the event-specific PCR amplicon of the Bt10 detection method. This study proposes a hypothetical map of the insert of event Bt10 and concludes that the validated detection method for event Bt10 is fit for its purpose. PMID:19368351

  17. Validation of an evacuated canister method for measuring part-per-billion levels of chemical warfare agent simulants.

    PubMed

    Coffey, Christopher C; LeBouf, Ryan F; Calvert, Catherine A; Slaven, James E

    2011-08-01

    The National Institute for Occupational Safety and Health (NIOSH) research on direct-reading instruments (DRIs) needed an instantaneous sampling method to provide independent confirmation of the concentrations of chemical warfare agent (CWA) simulants. Evacuated canisters were determined to be the method of choice. There is no method specifically validated for volatile organic compounds (VOCs) in the NIOSH Manual of Analytical Methods. The purpose of this study was to validate an evacuated canister method for sampling seven specific VOCs that serve as a CWA simulant (cyclohexane) or that influence the DRI measurement of CWAs (acetone, chloroform, methylene chloride, methyl ethyl ketone, hexane, and carbon tetrachloride [CCl4]). The method used 6-L evacuated stainless-steel fused-silica-lined canisters to sample the atmosphere containing VOCs. The contents of the canisters were then introduced into an autosampler/preconcentrator using a microscale purge and trap (MPT) method. The MPT method trapped and concentrated the VOCs in the air sample and removed most of the carbon dioxide and water vapor. After preconcentration, the samples were analyzed using a gas chromatograph with a mass selective detector. The method was tested, evaluated, and validated using the NIOSH recommended guidelines. The evaluation consisted of determining the optimum concentration range for the method; the sample stability over 30 days; and the accuracy, precision, and bias of the method. This method meets the NIOSH guidelines for six of the seven compounds (excluding acetone) tested in the range of 2.3-50 parts per billion (ppb), making it suitable for sampling these VOCs at the ppb level. PMID:21874953

  18. Development and validity of a method for the evaluation of printed education material

    PubMed Central

    Castro, Mauro Silveira; Pilger, Diogo; Fuchs, Flávio Danni; Ferreira, Maria Beatriz Cardoso

    Objectives To develop and study the validity of an instrument for the evaluation of printed education materials (PEMs); to evaluate the use of acceptability indices; and to identify possible influences of professional background. Methods An instrument for PEM evaluation was developed in three steps: domain identification, item generation and instrument design. An easy-to-read PEM was developed for the education of patients with systemic hypertension and its treatment with hydrochlorothiazide. Construct validity was measured based on previously established errors purposively introduced into the PEM, which served as extreme groups. An acceptability index was applied taking into account the rate of professionals who should approve each item. Participants were 10 physicians (9 men) and 5 nurses (all women). Results Many professionals identified the intentional errors of a crude character. Few participants identified errors that needed more careful evaluation, and no one detected the intentional error that required analysis of the literature. Physicians considered 95.8% of the items of the PEM acceptable, and nurses 29.2%. The differences between the scores were statistically significant for 27% of the items. In the overall evaluation, 66.6% of the items were considered acceptable. The analysis of each item revealed a behavioral pattern for each professional group. Conclusions Instruments for the evaluation of printed education materials are needed and may improve the quality of the PEMs available to patients. The acceptability indices are not always totally correct, nor do they always represent high quality of information. The professional experience, practice pattern, and perhaps the gender of the reviewers may influence their evaluation. An analysis of the PEM by professionals in communication and drug information, and by patients, should be carried out to improve the quality of the proposed material. PMID:25214924

  19. 40 CFR Appendix D to Part 63 - Alternative Validation Procedure for EPA Waste and Wastewater Methods

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... validated according to the procedures in Sections 5.1 and 5.3 of Test Method 301, 40 CFR part 63, appendix A... Method 25D of 40 CFR part 60, appendix A. 2.1. Sampling and Analysis 2.1.1. For each waste matrix... 40 Protection of Environment 14 2011-07-01 2011-07-01 false Alternative Validation Procedure...

  20. A validated high performance liquid chromatographic method for the analysis of Goldenseal.

    PubMed

    Li, Wenkui; Fitzloff, John F

    2002-03-01

    Goldenseal (Hydrastis canadensis L.) has emerged as one of the top ten herbal supplements on the worldwide market. A rapid, simple and validated high performance liquid chromatographic method, with photodiode array detection, has been developed for the analysis of commercial Goldenseal products. Samples were treated by sonication with acidified methanol/water. The method was validated for LOD, LOQ, linearity, reproducibility and recovery with good results. PMID:11902811

  1. Validity of Eye Movement Methods and Indices for Capturing Semantic (Associative) Priming Effects

    ERIC Educational Resources Information Center

    Odekar, Anshula; Hallowell, Brooke; Kruse, Hans; Moates, Danny; Lee, Chao-Yang

    2009-01-01

    Purpose: The purpose of this investigation was to evaluate the usefulness of eye movement methods and indices as a tool for studying priming effects by verifying whether eye movement indices capture semantic (associative) priming effects in a visual cross-format (written word to semantically related picture) priming paradigm. Method: In the…

  2. General purpose nonlinear system solver based on Newton-Krylov method.

    Energy Science and Technology Software Center (ESTSC)

    2013-12-01

    KINSOL is part of a software family called SUNDIALS: SUite of Nonlinear and Differential/Algebraic equation Solvers [1]. KINSOL is a general-purpose nonlinear system solver based on Newton-Krylov and fixed-point solver technologies [2].
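
    KINSOL itself is a C library, but the Newton-Krylov idea it implements is easy to demonstrate with SciPy's matrix-free solver. The system below is a hypothetical example chosen for this sketch, not part of the SUNDIALS record.

```python
import numpy as np
from scipy.optimize import newton_krylov

# Hypothetical two-unknown nonlinear system F(x) = 0:
#   x0 + 0.5*x1 - 1 = 0
#   x1 - x0**2      = 0
def F(x):
    return np.array([x[0] + 0.5 * x[1] - 1.0, x[1] - x[0] ** 2])

# Jacobian-free Newton-Krylov: the Jacobian is never formed explicitly;
# Jacobian-vector products are approximated by finite differences and the
# Newton step is solved with a Krylov iteration (LGMRES by default).
sol = newton_krylov(F, np.array([0.5, 0.5]), f_tol=1e-8)
```
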

  3. Bioanalytical method validation: concepts, expectations and challenges in small molecule and macromolecule--a report of PITTCON 2013 symposium.

    PubMed

    Bashaw, Edward D; DeSilva, Binodh; Rose, Mark J; Wang, Yow-Ming C; Shukla, Chinmay

    2014-05-01

    The concepts, importance, and implications of bioanalytical method validation have been discussed and debated for a long time. The recent high-profile issues related to bioanalytical method validation at both Cetero Houston and the former MDS Canada have brought this topic back into the limelight. Hence, a symposium on bioanalytical method validation, with the aim of revisiting the building blocks as well as discussing the challenges and implications for the bioanalysis of both small molecules and macromolecules, was featured at the PITTCON 2013 Conference and Expo. This symposium was cosponsored by the American Chemical Society (ACS) Division of Analytical Chemistry and the Analysis and Pharmaceutical Quality (APQ) Section of the American Association of Pharmaceutical Scientists (AAPS), and featured leading speakers from the Food & Drug Administration (FDA), academia, and industry. In this symposium, the speakers shared several unique examples, and the session also provided a platform to discuss the need for continuous vigilance over bioanalytical methods during drug discovery and development. The purpose of this article is to provide a concise report on the materials that were presented. PMID:24700273

  4. Single Lab Validation of a LC/UV/FLD/MS Method for Simultaneous Determination of Water-soluble Vitamins in Multi-Vitamin Dietary Supplements

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The purpose of this study was to develop a Single-Lab Validated Method using high-performance liquid chromatography (HPLC) with different detectors (diode array detector - DAD, fluorescence detector - FLD, and mass spectrometer - MS) for determination of seven B-complex vitamins (B1 - thiamin, B2 – ...

  5. 40 CFR Appendix D to Part 63 - Alternative Validation Procedure for EPA Waste and Wastewater Methods

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... to Environmental Protection Agency methods developed by the Office of Water and the Office of... validated according to the procedures in Sections 5.1 and 5.3 of Test Method 301, 40 CFR part 63, appendix A... Method 25D of 40 CFR part 60, appendix A. 2.1. Sampling and Analysis 2.1.1. For each waste...

  6. Alternative Methods for Validating Admissions and Course Placement Criteria. AIR 1995 Annual Forum Paper.

    ERIC Educational Resources Information Center

    Noble, Julie; Sawyer, Richard

    Correlational methods are compared to an alternative method based on decision theory and logistic regression for providing validity evidence for college admissions and course placement criteria. The advantages and limitations of both methods are examined. The correlation coefficient measures the strength of the linear statistical relationship…

  7. 40 CFR Appendix D to Part 63 - Alternative Validation Procedure for EPA Waste and Wastewater Methods

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... validated according to the procedures in Sections 5.1 and 5.3 of Test Method 301, 40 CFR part 63, appendix A... Method 25D of 40 CFR part 60, appendix A. 2.1. Sampling and Analysis 2.1.1. For each waste matrix... EPA Waste and Wastewater Methods D Appendix D to Part 63 Protection of Environment...

  8. 40 CFR Appendix D to Part 63 - Alternative Validation Procedure for EPA Waste and Wastewater Methods

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... validated according to the procedures in Sections 5.1 and 5.3 of Test Method 301, 40 CFR part 63, appendix A... Method 25D of 40 CFR part 60, appendix A. 2.1. Sampling and Analysis 2.1.1. For each waste matrix... EPA Waste and Wastewater Methods D Appendix D to Part 63 Protection of Environment...

  9. 40 CFR Appendix D to Part 63 - Alternative Validation Procedure for EPA Waste and Wastewater Methods

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... validated according to the procedures in Sections 5.1 and 5.3 of Test Method 301, 40 CFR part 63, appendix A... Method 25D of 40 CFR part 60, appendix A. 2.1. Sampling and Analysis 2.1.1. For each waste matrix... EPA Waste and Wastewater Methods D Appendix D to Part 63 Protection of Environment...

  10. Evaluation of the quantitative performances of supercritical fluid chromatography: from method development to validation.

    PubMed

    Dispas, Amandine; Lebrun, Pierre; Ziemons, Eric; Marini, Roland; Rozet, Eric; Hubert, Philippe

    2014-08-01

    Recently, the number of papers on SFC has increased drastically, but scientists have not truly focused on the quantitative performance of this technique. In order to demonstrate the potential of UHPSFC, the present work discusses the different steps of the analytical life cycle of a method, from development to validation and application. Moreover, the quantitative performance of UHPSFC was evaluated in comparison with UHPLC, which is the main technique used for quality control in the pharmaceutical industry and can therefore be considered a reference. The methods were developed using a Design Space strategy, leading to the optimization of robust methods. In this context, when the Design Space optimization guarantees quality, no further robustness study is required prior to validation. The methods were then geometrically transferred in order to reduce the analysis time. The UHPSFC and UHPLC methods were validated based on the total error approach using accuracy profiles. Even though UHPLC showed better precision and sensitivity, the UHPSFC method is able to give accurate results over a dosing range larger than the 80-120% range required by the European Medicines Agency. Consequently, the UHPSFC results are valid and could be used for the control of the active substance in a finished pharmaceutical product. Finally, the validated UHPSFC method was used to analyse real samples and gave results similar to those of the reference method (UHPLC). PMID:24513349

  11. Fuzzy-logic based strategy for validation of multiplex methods: example with qualitative GMO assays.

    PubMed

    Bellocchi, Gianni; Bertholet, Vincent; Hamels, Sandrine; Moens, W; Remacle, José; Van den Eede, Guy

    2010-02-01

    This paper illustrates the advantages that a fuzzy-based aggregation method could bring into the validation of a multiplex method for GMO detection (DualChip GMO kit, Eppendorf). Guidelines for validation of chemical, bio-chemical, pharmaceutical and genetic methods have been developed and ad hoc validation statistics are available and routinely used, for in-house and inter-laboratory testing, and decision-making. Fuzzy logic allows summarising the information obtained by independent validation statistics into one synthetic indicator of overall method performance. The microarray technology, introduced for simultaneous identification of multiple GMOs, poses specific validation issues (patterns of performance for a variety of GMOs at different concentrations). A fuzzy-based indicator for overall evaluation is illustrated in this paper, and applied to validation data for different genetically modified elements. Remarks were drawn on the analytical results. The fuzzy-logic based rules were shown to be applicable to improve interpretation of results and facilitate overall evaluation of the multiplex method. PMID:19533405

  12. Determination of methylmercury in marine sediment samples: method validation and occurrence data.

    PubMed

    Carrasco, Luis; Vassileva, Emilia

    2015-01-01

    The determination of methylmercury (MeHg) in sediment samples is a difficult task due to the extremely low MeHg/THg (total mercury) ratio and species interconversion. Here, we present the method validation of a cost-effective, fit-for-purpose analytical procedure for the measurement of MeHg in sediments, based on aqueous-phase ethylation, followed by purge and trap and hyphenated gas chromatography-pyrolysis-atomic fluorescence spectrometry (GC-Py-AFS) separation and detection. Four different extraction techniques, namely acid and alkaline leaching followed by solvent extraction and evaporation, microwave-assisted extraction with 2-mercaptoethanol, and acid leaching, solvent extraction and back extraction into sodium thiosulfate, were examined regarding their potential to selectively extract MeHg from estuarine sediment IAEA-405 certified reference material (CRM). The procedure based on acid leaching with HNO3/CuSO4, solvent extraction and back extraction into Na2S2O3 yielded the highest extraction recovery, i.e., 94±3%, and offered the possibility of extracting a large number of samples in a short time by eliminating the evaporation step. Artifact formation of MeHg was evaluated by high performance liquid chromatography coupled to inductively coupled plasma mass spectrometry (HPLC-ICP-MS), using isotopically enriched Me(201)Hg and (202)Hg, and was found to be nonexistent. A full validation approach in line with ISO 17025 and Eurachem guidelines was followed. With this in mind, blanks, selectivity, working range (1-800 pg), linearity (0.9995), recovery (94-96%), repeatability (3%), intermediate precision (4%), limit of detection (0.45 pg) and limit of quantification (0.85 pg) were systematically assessed with CRM IAEA-405. The uncertainty budget was calculated and the major contribution to the combined uncertainty (16.24%, k=2) was found to arise from the uncertainty associated with recovery (74.1%). Demonstration of traceability of…
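
    The recovery and uncertainty bookkeeping used in such an ISO 17025-style validation reduces to simple arithmetic; a generic sketch follows (the numbers in the usage example are hypothetical, not the IAEA-405 data).

```python
def recovery_percent(measured, certified):
    """Extraction recovery relative to a certified reference value, in %."""
    return 100.0 * measured / certified

def combined_standard_uncertainty(components):
    """Root-sum-of-squares combination of independent standard uncertainty
    components (GUM approach); multiply by k=2 for an ~95% expanded
    uncertainty."""
    return sum(u * u for u in components) ** 0.5
```
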

  13. Development and validation of videotaped scenarios: a method for targeting specific participant groups.

    PubMed

    Noel, Nora E; Maisto, Stephen A; Johnson, James D; Jackson, Lee A; Goings, Christopher D; Hagman, Brett T

    2008-04-01

    Researchers using scenarios often neglect to validate the perceived content and salience of embedded stimuli with their intended participants, even when such meaning is integral to the study. For example, sex and aggression stimuli are heavily influenced by culture, so participants may not perceive what researchers intended in sexual aggression scenarios. Across four studies, the authors describe a scenario-validation method used to produce two videos assessing alcohol-related sexual aggression. The videos are identical except for the presence in one of antiforce cues that are extremely salient to young heterosexual men. Focus groups and questionnaires validated these men's perceptions that (a) the woman was sexually interested, (b) the sexual cues were salient, (c) the antiforce cues were salient (antiaggression video only), and (d) these antiforce cues inhibited acceptance of forced sex. Results show the value of carefully selecting and validating content when assessing socially volatile variables and provide a useful template for developing culturally valid scenarios. PMID:18252938

  14. A Comparative Study of Information-Based Source Number Estimation Methods and Experimental Validations on Mechanical Systems

    PubMed Central

    Cheng, Wei; Zhang, Zhousuo; Cao, Hongrui; He, Zhengjia; Zhu, Guanwen

    2014-01-01

    This paper investigates one eigenvalue decomposition-based source number estimation method and three information-based source number estimation methods, namely the Akaike Information Criterion (AIC), Minimum Description Length (MDL) and Bayesian Information Criterion (BIC), and refines BIC into an Improved BIC (IBIC) that is more efficient and easier to calculate. The performances of these source number estimation methods are compared in numerical case studies, covering a purely linear superposition case and a mixed case combining linear superposition with nonlinear modulation. A test bed with three sound sources is constructed to test the performance of these methods on mechanical systems, and source separation is carried out to validate the effectiveness of the experimental studies. This work can benefit model order selection, complexity analysis of a system, and applications of source separation to mechanical systems for condition monitoring and fault diagnosis purposes. PMID:24776935
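    A minimal NumPy sketch of the eigenvalue-based AIC and MDL criteria named above (the classical Wax and Kailath forms; the paper's exact variants, including IBIC, may differ). The estimated source number is the candidate k that minimizes the chosen criterion over the eigenvalues of the sample covariance matrix:

```python
import numpy as np

def estimate_source_number(X, criterion="MDL"):
    """Estimate the number of sources from observations X (p sensors x N snapshots)."""
    p, N = X.shape
    R = (X @ X.conj().T) / N                      # sample covariance matrix
    lam = np.sort(np.linalg.eigvalsh(R))[::-1]    # eigenvalues, descending
    scores = []
    for k in range(p):                            # candidate source counts 0..p-1
        tail = lam[k:]                            # the p-k smallest eigenvalues
        # ratio of geometric mean to arithmetic mean of the noise eigenvalues
        g = np.exp(np.mean(np.log(tail))) / np.mean(tail)
        loglik = -N * (p - k) * np.log(g)         # negative log-likelihood term
        if criterion == "AIC":
            scores.append(2 * loglik + 2 * k * (2 * p - k))
        else:                                     # MDL
            scores.append(loglik + 0.5 * k * (2 * p - k) * np.log(N))
    return int(np.argmin(scores))
```

    For instance, mixing two independent signals onto six sensors with low additive noise leaves two dominant eigenvalues, and both criteria recover a source count of 2.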

  15. VDA, a Method of Choosing a Better Algorithm with Fewer Validations

    PubMed Central

    Kluger, Yuval

    2011-01-01

    The multitude of bioinformatics algorithms designed for performing a particular computational task presents end-users with the problem of selecting the most appropriate computational tool for analyzing their biological data. The choice of the best available method is often based on expensive experimental validation of the results. We propose an approach to design validation sets for method comparison and performance assessment that are effective in terms of cost and discrimination power. Validation Discriminant Analysis (VDA) is a method for designing a minimal validation dataset to allow reliable comparisons between the performances of different algorithms. Implementation of our VDA approach achieves this reduction by selecting predictions that maximize the minimum Hamming distance between algorithmic predictions in the validation set. We show that VDA can be used to correctly rank algorithms according to their performances. These results are further supported by simulations and by realistic algorithmic comparisons in silico. VDA is a novel, cost-efficient method for minimizing the number of validation experiments necessary for reliable performance estimation and fair comparison between algorithms. Our VDA software is available at http://sourceforge.net/projects/klugerlab/files/VDA/ PMID:22046256
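    A hedged sketch of the selection idea described in the abstract: given a 0/1 prediction matrix (algorithms by candidate items), greedily pick validation items so that the minimum pairwise Hamming distance between algorithms, restricted to the chosen items, is as large as possible. Function and variable names here are illustrative; the published VDA software implements its own optimization.

```python
import numpy as np
from itertools import combinations

def vda_select(preds, budget):
    """Greedily choose `budget` validation items from an (n_algorithms x n_items)
    0/1 prediction matrix so that the minimum pairwise Hamming distance between
    the algorithms' predictions on the chosen items is maximized."""
    n_alg = preds.shape[0]

    def min_pairwise(idx):
        sub = preds[:, idx]                       # predictions on chosen items only
        return min(int((sub[a] != sub[b]).sum())
                   for a, b in combinations(range(n_alg), 2))

    chosen, remaining = [], list(range(preds.shape[1]))
    for _ in range(budget):
        best = max(remaining, key=lambda j: min_pairwise(chosen + [j]))
        chosen.append(best)
        remaining.remove(best)
    return chosen
```

    On a toy matrix where only the first two items ever separate some pair of algorithms, the greedy pass selects exactly those two items.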

  16. VDA, a method of choosing a better algorithm with fewer validations.

    PubMed

    Strino, Francesco; Parisi, Fabio; Kluger, Yuval

    2011-01-01

    The multitude of bioinformatics algorithms designed for performing a particular computational task presents end-users with the problem of selecting the most appropriate computational tool for analyzing their biological data. The choice of the best available method is often based on expensive experimental validation of the results. We propose an approach to design validation sets for method comparison and performance assessment that are effective in terms of cost and discrimination power. Validation Discriminant Analysis (VDA) is a method for designing a minimal validation dataset to allow reliable comparisons between the performances of different algorithms. Implementation of our VDA approach achieves this reduction by selecting predictions that maximize the minimum Hamming distance between algorithmic predictions in the validation set. We show that VDA can be used to correctly rank algorithms according to their performances. These results are further supported by simulations and by realistic algorithmic comparisons in silico. VDA is a novel, cost-efficient method for minimizing the number of validation experiments necessary for reliable performance estimation and fair comparison between algorithms. Our VDA software is available at http://sourceforge.net/projects/klugerlab/files/VDA/ PMID:22046256

  17. Development and validation of an ionic chromatography method for the determination of nitrate, nitrite and chloride in meat.

    PubMed

    Lopez-Moreno, Cristina; Perez, Isabel Viera; Urbano, Ana M

    2016-03-01

    The purpose of this study was to validate a method for the analysis of certain preservatives in meat and to prepare a suitable Certified Reference Material (CRM) for that task. The preservatives studied were NO3(-), NO2(-) and Cl(-), which serve as important antimicrobial agents in meat, inhibiting the growth of spoilage bacteria. The meat samples were prepared using a treatment that yielded a CRM of known concentration that is highly homogeneous and stable over time. Matrix effects were also studied to evaluate their influence on the analytical signal for the ions of interest, showing that the matrix does not affect the final result. An assessment of signal variation over time was carried out for the ions. Although the chloride and nitrate signals remained stable for the duration of the study, the nitrite signal decreased appreciably with time. A mathematical treatment of the data gave a stable nitrite signal, yielding a method suitable for the validation of these anions in meat. A statistical study was performed for the validation of the method, in which precision, accuracy, uncertainty and other parameters were evaluated with satisfactory results. PMID:26471608

  18. School Discipline: Have We Lost Our Sense of Purpose in Our Search for a Good Method?

    ERIC Educational Resources Information Center

    Burton, Mary Alice Blanford

    The general economic and psychological evolution in America from a producer society to a consumer society has resulted in a conflict of purposes for American educators regarding school discipline. Consequently, contemporary American educators, unlike their forerunners, have ignored the long term social goals of classroom discipline. They have,…

  19. Validation of an in-vivo proton beam range check method in an anthropomorphic pelvic phantom using dose measurements

    SciTech Connect

    Bentefour, El H.; Prieels, Damien; Tang, Shikui; Cascio, Ethan W.; Testa, Mauro; Lu, Hsiao-Ming; Samuel, Deepak; Gottschalk, Bernard

    2015-04-15

    Purpose: In-vivo dosimetry and beam range verification in proton therapy could play a significant role in validating and improving proton treatments. In-vivo beam range verification, in particular, could enable new treatment techniques, one of which could be the use of anterior fields for prostate treatment instead of the opposed lateral fields used in current practice. This paper reports a validation study of an in-vivo range verification method which can reduce the range uncertainty to submillimeter levels and potentially allow for in-vivo dosimetry. Methods: An anthropomorphic pelvic phantom is used to validate the clinical potential of the time-resolved dose method for range verification in the case of prostate treatment using range-modulated anterior proton beams. The method uses a 3 × 4 matrix of 1 mm diodes mounted in a water balloon and read by an ADC system at 100 kHz. The method is first validated against beam range measurements obtained by dose extinction measurements. The validation is first completed in a water phantom and then in the pelvic phantom for both open-field and treatment-field configurations. The beam range results are then compared with the water equivalent path length (WEPL) values computed from the treatment planning system XIO. Results: Beam range measurements from the time-resolved dose method and the dose extinction method agree with submillimeter precision in the water phantom. For the pelvic phantom, when discarding two of the diodes that show signs of significant range mixing, the two methods agree within ±1 mm. A dose of only 7 mGy is sufficient to achieve this result. The comparison with the WEPL computed by the treatment planning system (XIO) shows that XIO underestimates the proton beam range. Quantifying the exact XIO range underestimation depends on the strategy used to evaluate the WEPL results. By our best evaluation, XIO underestimates the treatment beam range by between 1.7% and 4.1%. Conclusions: Time-resolved dose

  20. Wastewater standards and extraction chemistry in validation of microwave-assisted EPA method 3015A

    SciTech Connect

    Link, D.D.; Walter, P.J.; Kingston, H.M. (Dept. of Chemistry and Biochemistry)

    1999-07-15

    The difficulties associated with the control and transfer of environmental leach methods are discussed. Optimized EPA Method 3015A, a microwave-assisted leach of wastewater and drinking water matrices and aqueous extracts, is evaluated. The option to add HCl in addition to HNO3 provides better complexation and recovery of certain metals regulated by the Resource Conservation and Recovery Act (RCRA) than the original HNO3-only Method 3015. Also discussed is the preparation and appropriate use of simulated wastewater standards. Standard reference materials for a wastewater matrix are unavailable, and this novel approach provides NIST traceability of results for the first time for this matrix type. Leach concentrations from these simulated standards were determined using both the 5 mL HNO3 option and the 4 mL HNO3 plus 1 mL HCl option of new Method 3015A. Validation of the new mixed-acid option of Method 3015A has been provided by evaluating its performance on the 23 elements for which original Method 3015 was validated. In addition, validation is provided for boron, mercury, and strontium, elements that were not validated in original Method 3015. Method 3015A has thus been developed into a method capable of evaluating 26 elements in a single, efficient, 20-min procedure.

  1. Application of EU guidelines for the validation of screening methods for veterinary drugs.

    PubMed

    Stolker, Alida A M Linda

    2012-08-01

    Commission Decision (CD) 2002/657/EC describes detailed rules for method validation within the framework of residue monitoring programmes. The approach described in this CD is criteria-based. For (qualitative) screening methods, the most important criterion is that the CCβ has to be below any regulatory limit. Especially when microbiological or immunochemical methods are involved, the approach described in the CD is not easily applied. For example, such methods detect a large number of analytes (all antibiotics) within several different matrices (meat, milk, fish, eggs, etc.), and it is not completely clear whether all those analytes and all matrices have to be taken into account during method validation. To clarify this, a working group of EU Reference Laboratories devised a practical approach to validating multi-analyte, multi-matrix screening methods. It describes how many analyte/matrix combinations have to be tested and how these combinations are selected. Furthermore, it describes how to determine CCβ for screening methods in relation to a large list of compounds and maximum residue limits (MRLs). First, for each analyte/matrix combination, the 'cut-off' level, i.e. the level at which the method separates blanks from contaminated samples, is established. Validation is preferably performed at 50% of the regulatory limit, and a minimum set of 20 different samples has to be tested. Experience with applying these guidelines shows that the validation approach is very practical, with some caveats: one has to be careful when selecting 'representative' analytes and matrices, and it is strongly recommended to collect additional validation data during routine application of the method. PMID:22851358

  2. Increased efficacy for in-house validation of real-time PCR GMO detection methods.

    PubMed

    Scholtens, I M J; Kok, E J; Hougs, L; Molenaar, B; Thissen, J T N M; van der Voet, H

    2010-03-01

    To improve the efficacy of the in-house validation of GMO detection methods (DNA isolation and real-time PCR, polymerase chain reaction), a study was performed to gain insight into the contribution of the different steps of the GMO detection method to the repeatability and in-house reproducibility. In the present study, 19 methods for (GM) soy, maize, canola and potato were validated in-house, 14 of them on the basis of an 8-day validation scheme using eight different samples and five on the basis of a more concise validation protocol. In this way, data were obtained on the detection limit, accuracy and precision. Decision limits were also calculated for declaring non-conformance (>0.9%) with 95% reliability. In order to estimate the contribution of the different steps in the GMO analysis to the total variation, variance components were estimated using REML (residual maximum likelihood). From these components, relative standard deviations for repeatability and reproducibility (RSD(r) and RSD(R)) were calculated. The results showed that not only the PCR reaction but also the factors 'DNA isolation' and 'PCR day' contribute substantially to the total variance and should therefore be included in the in-house validation. It is proposed to use a statistical model to estimate these factors from a large dataset of initial validations so that, for similar GMO methods in the future, only the PCR step needs to be validated. The resulting data are discussed in the light of agreed European criteria for qualified GMO detection methods. PMID:20012027
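    The final step, turning estimated variance components into relative standard deviations for repeatability and in-house reproducibility, can be illustrated as follows. The component names and numbers are hypothetical examples; the paper's REML model defines the actual factors.

```python
import math

def relative_sds(mean, var_components):
    """Compute RSD(r) and RSD(R), in percent, from REML-style variance components.
    Repeatability uses only the residual (within-run) component; in-house
    reproducibility adds the between-factor components such as 'DNA isolation'
    and 'PCR day'."""
    s2_r = var_components["residual"]          # within-run variance only
    s2_R = sum(var_components.values())        # all components combined
    rsd_r = 100 * math.sqrt(s2_r) / mean
    rsd_R = 100 * math.sqrt(s2_R) / mean
    return rsd_r, rsd_R
```

    For a mean response of 1.0 with illustrative components 0.0004 (residual), 0.0009 (DNA isolation) and 0.0012 (PCR day), this gives RSD(r) = 2% and RSD(R) = 5%.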

  3. Validation of High-Performance Thin-Layer Chromatographic Methods for the Identification of Botanicals in a cGMP Environment

    PubMed Central

    REICH, EIKE; SCHIBLI, ANNE; DEBATT, ALISON

    2009-01-01

    Current Good Manufacturing Practices (cGMP) for botanicals stipulate the use of appropriate methods for identification of raw materials. Due to natural variability, chemical analysis of plant material is a great challenge and requires special approaches. This paper presents a comprehensive proposal for the process of validating qualitative high-performance thin-layer chromatographic (HPTLC) methods, proving that such methods are suitable for the purpose. The steps of the validation process are discussed and illustrated with examples taken from a project aiming at validation of methods for identification of green tea leaf, ginseng root, eleuthero root, echinacea root, black cohosh rhizome, licorice root, kava root, milk thistle aerial parts, feverfew aerial parts, and ginger root. The appendix of the paper, which includes complete documentation and method write-up for those plants, is available on the J. AOAC Int. Website (http://www.atypon-link.com/AOAC/loi/jaoi). PMID:18376581

  4. Verification, Validation, and Solution Quality in Computational Physics: CFD Methods Applied to Ice Sheet Physics

    NASA Technical Reports Server (NTRS)

    Thompson, David E.

    2005-01-01

    Procedures and methods for verification of coding algebra and for validation of models and calculations used in the aerospace computational fluid dynamics (CFD) community would be efficacious if used by the glacier dynamics modeling community. This paper presents some of those methods, and how they might be applied to uncertainty management supporting code verification and model validation for glacier dynamics. The similarities and differences between their use in CFD analysis and the proposed application of these methods to glacier modeling are discussed. After establishing sources of uncertainty and methods for code verification, the paper looks at a representative sampling of verification and validation efforts that are underway in the glacier modeling community, and establishes a context for these within an overall solution quality assessment. Finally, a vision of a new information architecture and interactive scientific interface is introduced and advocated.

  5. Validation procedures for quantitative gluten ELISA methods: AOAC allergen community guidance and best practices.

    PubMed

    Koerner, Terry B; Abbott, Michael; Godefroy, Samuel Benrejeb; Popping, Bert; Yeung, Jupiter M; Diaz-Amigo, Carmen; Roberts, James; Taylor, Steve L; Baumert, Joseph L; Ulberth, Franz; Wehling, Paul; Koehler, Peter

    2013-01-01

    The food allergen analytical community is endeavoring to create harmonized guidelines for the validation of food allergen ELISA methodologies to help protect food-sensitive individuals and promote consumer confidence. This document provides additional guidance to existing method validation publications for quantitative food allergen ELISA methods. The gluten-specific criterion provided in this document is divided into sections for information required by the method developer about the assay and information for the implementation of the multilaboratory validation study. Many of these recommendations and guidance are built upon the widely accepted Codex Alimentarius definitions and recommendations for gluten-free foods. The information in this document can be used as the basis of a harmonized validation protocol for any ELISA method for gluten, whether proprietary or nonproprietary, that will be submitted to AOAC and/or regulatory authorities or other bodies for status recognition. Future work is planned for the implementation of this guidance document for the validation of gluten methods and the creation of gluten reference materials. PMID:24282943

  6. Bridging the gap between comprehensive extraction protocols in plant metabolomics studies and method validation.

    PubMed

    Bijttebier, Sebastiaan; Van der Auwera, Anastasia; Foubert, Kenn; Voorspoels, Stefan; Pieters, Luc; Apers, Sandra

    2016-09-01

    Careful design of the extraction methods developed for plant metabolomics is vital, as any non-extracted or converted metabolites will greatly affect the overall quality of the metabolomics study. Method validation is, however, often omitted in plant metabolome studies, as the well-established methodologies for classical targeted analyses, such as recovery optimization, cannot be strictly applied. The aim of the present study is to thoroughly evaluate state-of-the-art comprehensive extraction protocols for plant metabolomics with liquid chromatography-photodiode array-accurate mass mass spectrometry (LC-PDA-amMS) by bridging the gap with method validation. Validation of an extraction protocol in untargeted plant metabolomics should ideally be accomplished by validating the protocol for all possible outcomes, i.e. for all secondary metabolites potentially present in the plant. In an effort to approach this ideal validation scenario, two plant matrices were selected based on their wide versatility of phytochemicals: meadowsweet (Filipendula ulmaria) for its polyphenol content, and spicy paprika powder (from the genus Capsicum) for its apolar phytochemical content (carotenoids, phytosterols, capsaicinoids). These matrices were extracted with comprehensive extraction protocols adapted from literature and analysed with a generic LC-PDA-amMS characterization platform that was previously validated for broad-range phytochemical analysis. The performance of the comprehensive sample preparation protocols was assessed based on extraction efficiency, repeatability and intermediate precision, and on ionization suppression/enhancement evaluation. The manuscript elaborates on the finding that none of the extraction methods allowed exhaustive extraction of the metabolites. Furthermore, it is shown that, depending on the extraction conditions, enzymatic degradation mechanisms can occur. Investigation of the fractions obtained with the different extraction methods

  7. External Standards or Standard Additions? Selecting and Validating a Method of Standardization.

    ERIC Educational Resources Information Center

    Harvey, David

    2002-01-01

    Reports an experiment which is suitable for an introductory course in analytical chemistry and which illustrates the importance of matrix effects when selecting a method of standardization. Asserts that students learn how a spike recovery is used to validate an analytical method, and obtain practical experience in the difference between performing…

  8. Multi-laboratory validation of a standard method for quantifying proanthocyanidins in cranberry powders

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The objective of this study was to validate an improved 4-dimethylaminocinnamaldehyde (DMAC) colorimetric method using a commercially available standard (procyanidin A2), for the standard method for quantification of proanthocyanidins (PACs) in cranberry powders, in order to establish dosage guideli...

  9. Method of validating measurement data of a process parameter from a plurality of individual sensor inputs

    DOEpatents

    Scarola, Kenneth; Jamison, David S.; Manazir, Richard M.; Rescorl, Robert L.; Harmon, Daryl L.

    1998-01-01

    A method for generating a validated measurement of a process parameter at a point in time by using a plurality of individual sensor inputs from a scan of said sensors at said point in time. The sensor inputs from said scan are stored and a first validation pass is initiated by computing an initial average of all stored sensor inputs. Each sensor input is deviation checked by comparing each input, within a preset tolerance, against the initial average input. If the first deviation check is unsatisfactory, the sensor which produced the unsatisfactory input is flagged as suspect. It is then determined whether at least two of the inputs have not been flagged as suspect and are therefore considered good inputs. If two or more inputs are good, a second validation pass is initiated by computing a second average of all the good sensor inputs, and deviation checking the good inputs by comparing each good input, within a preset tolerance, against the second average. If the second deviation check is satisfactory, the second average is displayed as the validated measurement and each suspect sensor is flagged as bad. A validation fault occurs if at least two inputs are not considered good, or if the second deviation check is not satisfactory. In the latter situation, the inputs from all the sensors are compared against the last validated measurement and the value from the sensor input that deviates the least from the last valid measurement is displayed.
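    The two-pass validation logic described above can be sketched compactly (tolerances and fallback behavior simplified from the patent text):

```python
def validate_measurement(inputs, tol, last_valid):
    """Two-pass validation of one scan of redundant sensor readings.
    inputs: readings from a single scan; tol: preset deviation tolerance;
    last_valid: the previous validated measurement (used on a validation fault)."""
    first_avg = sum(inputs) / len(inputs)
    # pass 1: flag as suspect any input deviating from the initial average
    good = [x for x in inputs if abs(x - first_avg) <= tol]
    if len(good) >= 2:
        second_avg = sum(good) / len(good)
        # pass 2: deviation-check the good inputs against the second average
        if all(abs(x - second_avg) <= tol for x in good):
            return second_avg
    # validation fault: display the input closest to the last validated value
    return min(inputs, key=lambda x: abs(x - last_valid))
```

    With readings [10.0, 10.2, 9.9, 12.0] and a tolerance of 1.0, the outlier 12.0 is flagged on the first pass and the validated value is the mean of the remaining three readings.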

  10. Tutorial review on validation of liquid chromatography-mass spectrometry methods: part I.

    PubMed

    Kruve, Anneli; Rebane, Riin; Kipper, Karin; Oldekop, Maarja-Liisa; Evard, Hanno; Herodes, Koit; Ravio, Pekka; Leito, Ivo

    2015-04-22

    This is Part I of a tutorial review intended to give an overview of the state of the art of method validation in liquid chromatography mass spectrometry (LC-MS) and to discuss specific issues that arise with MS (and MS/MS) detection in LC (as opposed to the "conventional" detectors). Part I briefly introduces the principles of operation of LC-MS (emphasizing the aspects important from the validation point of view, in particular the ionization process and ionization suppression/enhancement); reviews the main validation guideline documents; and discusses in detail the following performance parameters: selectivity/specificity/identity, ruggedness/robustness, limit of detection, limit of quantification, decision limit and detection capability. For every method performance characteristic, its essence and terminology are addressed, the current status of its treatment is reviewed, and recommendations are given on how to determine it, specifically in the case of LC-MS methods. PMID:25819785

  11. Bioanalytical method validation considerations for LC-MS/MS assays of therapeutic proteins.

    PubMed

    Duggan, Jeffrey X; Vazvaei, Faye; Jenkins, Rand

    2015-01-01

    This paper highlights the recommendations of a group of industry scientists in validating regulated bioanalytical LC-MS/MS methods for protein therapeutics in a 2015 AAPSJ White Paper. This group recommends that most of the same precision and accuracy validation criteria used for ligand-binding assays (LBAs) be applied to LC-MS/MS-based assays where proteins are quantified using the LC-MS/MS signal from a surrogate peptide after proteolytic digestion (PrD-LCMS methods). PrD-LCMS methods are generally more complex than small molecule LC-MS/MS assays and may often include LBA procedures, leading to the recommendation for a combination of chromatographic and LBA validation strategies and appropriate acceptance criteria. Several key aspects of this bioanalytical approach that are discussed in the White Paper are treated here in additional detail. These topics include selectivity/specificity, matrix effect, digestion efficiency, stability and critical reagent considerations. PMID:26110712

  12. Display format, highlight validity, and highlight method: Their effects on search performance

    NASA Technical Reports Server (NTRS)

    Donner, Kimberly A.; Mckay, Tim D.; Obrien, Kevin M.; Rudisill, Marianne

    1991-01-01

    Display format and highlight validity were shown to affect visual display search performance; however, those studies were conducted on small, artificial displays of alphanumeric stimuli. A study manipulating these variables was conducted using realistic, complex Space Shuttle information displays. A 2x2x3 within-subjects analysis of variance found that search times were faster for items in reformatted displays than in current displays. Responses to valid applications of highlighting were significantly faster than responses to nonhighlighted or invalidly highlighted applications. The significant format by highlight validity interaction showed that there was little difference in response time between current and reformatted displays when highlighting was validly applied; however, under the nonhighlighted or invalid highlight conditions, search times were faster with reformatted displays. A separate within-subjects analysis of variance of display format, highlight validity, and several highlight methods did not reveal a main effect of highlight method. In addition, observed display search times were compared to search times predicted by Tullis' Display Analysis Program. The benefits of highlighting and reformatting displays to enhance search, and the necessity of considering highlight validity and format characteristics in tandem when predicting search performance, are discussed.

  13. Validity and Feasibility of a Digital Diet Estimation Method for Use with Preschool Children: A Pilot Study

    ERIC Educational Resources Information Center

    Nicklas, Theresa A.; O'Neil, Carol E.; Stuff, Janice; Goodell, Lora Suzanne; Liu, Yan; Martin, Corby K.

    2012-01-01

    Objective: The goal of the study was to assess the validity and feasibility of a digital diet estimation method for use with preschool children in "Head Start." Methods: Preschool children and their caregivers participated in validation (n = 22) and feasibility (n = 24) pilot studies. Validity was determined in the metabolic research unit using…

  14. 40 CFR 712.5 - Method of identification of substances for reporting purposes.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... chemical have no reporting responsibilities under this Part. Note, however, that any method of extraction... (CONTINUED) TOXIC SUBSTANCES CONTROL ACT CHEMICAL INFORMATION RULES General Provisions § 712.5 Method of... otherwise required, respondents must report only about quantities of a chemical that is defined as...

  15. 40 CFR 712.5 - Method of identification of substances for reporting purposes.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... chemical have no reporting responsibilities under this Part. Note, however, that any method of extraction... (CONTINUED) TOXIC SUBSTANCES CONTROL ACT CHEMICAL INFORMATION RULES General Provisions § 712.5 Method of... otherwise required, respondents must report only about quantities of a chemical that is defined as...

  16. 27 CFR 46.261 - Purpose of an alternate method or procedure.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... TOBACCO PRODUCTS AND CIGARETTE PAPERS AND TUBES Floor Stocks Tax on Certain Tobacco Products, Cigarette Papers, and Cigarette Tubes Held for Sale on April 1, 2009 Alternate Methods Or Procedures § 46.261... method or procedure. 46.261 Section 46.261 Alcohol, Tobacco Products and Firearms ALCOHOL AND TOBACCO...

  17. Use of glyceroltriheptanoate as marker for processed animal by-products: development and validation of an analytical method.

    PubMed

    von Holst, C; Boix, A; Bellorini, S; Serano, F; Androni, S; Verkuylen, B; Margry, R

    2009-04-01

    A recently published European Regulation requires that an artificial marker, glycerol triheptanoate (GTH), be added to processed animal by-products (ABPs) prohibited from entering the food chain. The objective of this new requirement is to allow full traceability and ensure that these materials are disposed of in a proper way. Here, we report the development and single-laboratory validation of an analytical method for the determination of GTH in meat and bone meal plus animal fat. The method comprises three steps: (1) extraction of GTH from the samples with petroleum ether when analysing meat and bone meal, or dissolution of the sample in n-hexane when analysing fat; (2) clean-up of the extract using commercially available SPE cartridges; (3) determination of GTH by GC/MS or GC with flame ionisation detection (FID). The results of the validation study demonstrated that the relative standard deviation for intermediate precision varied between 2.5% and 8.2%, depending on GTH concentration and the detector used. In all cases, the relative recovery rate was above 96%. The limit of quantification was 16 mg kg(-1) (GTH/fat content of the sample) with MS detection and 20 mg kg(-1) with FID. Moreover, the method has been successfully applied in a second laboratory, indicating its transferability. Considering the minimum GTH concentration in ABPs of 250 mg kg(-1), the method is considered suitable for the intended purpose and can be utilised by EU Member States laboratories for official control and monitoring. PMID:19680920

  18. 27 CFR 46.261 - Purpose of an alternate method or procedure.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... TOBACCO PRODUCTS AND CIGARETTE PAPERS AND TUBES Floor Stocks Tax on Certain Tobacco Products, Cigarette Papers, and Cigarette Tubes Held for Sale on April 1, 2009 Alternate Methods Or Procedures §...

  19. 27 CFR 46.261 - Purpose of an alternate method or procedure.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... TOBACCO PRODUCTS AND CIGARETTE PAPERS AND TUBES Floor Stocks Tax on Certain Tobacco Products, Cigarette Papers, and Cigarette Tubes Held for Sale on April 1, 2009 Alternate Methods Or Procedures §...

  20. 27 CFR 46.261 - Purpose of an alternate method or procedure.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... TOBACCO PRODUCTS AND CIGARETTE PAPERS AND TUBES Floor Stocks Tax on Certain Tobacco Products, Cigarette Papers, and Cigarette Tubes Held for Sale on April 1, 2009 Alternate Methods Or Procedures §...

  1. 27 CFR 46.261 - Purpose of an alternate method or procedure.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... TOBACCO PRODUCTS AND CIGARETTE PAPERS AND TUBES Floor Stocks Tax on Certain Tobacco Products, Cigarette Papers, and Cigarette Tubes Held for Sale on April 1, 2009 Alternate Methods Or Procedures §...

  2. From the Bronx to Bengifunda (and Other Lines of Flight): Deterritorializing Purposes and Methods in Science Education Research

    ERIC Educational Resources Information Center

    Gough, Noel

    2011-01-01

    In this essay I explore a number of questions about purposes and methods in science education research prompted by my reading of Wesley Pitts' ethnographic study of interactions among four students and their teacher in a chemistry classroom in the Bronx, New York City. I commence three "lines of flight" (small acts of Deleuzo-Guattarian…

  3. Validated spectrofluorimetric method for the determination of clonazepam in pharmaceutical preparations.

    PubMed

    Ibrahim, Fawzia; El-Enany, Nahed; Shalan, Shereen; Elsharawy, Rasha

    2016-05-01

    A simple, highly sensitive and validated spectrofluorimetric method was applied in the determination of clonazepam (CLZ). The method is based on reduction of the nitro group of clonazepam with zinc/CaCl2; the product is then reacted with 2-cyanoacetamide (2-CNA) in the presence of ammonia (25%), yielding a highly fluorescent product. The fluorophore exhibits strong fluorescence intensity at λem = 383 nm after excitation at λex = 333 nm. The method was rectilinear over a concentration range of 0.1-0.5 ng/mL, with a limit of detection (LOD) of 0.0057 ng/mL and a limit of quantification (LOQ) of 0.017 ng/mL. The method was fully validated according to ICH guidelines and successfully applied to the determination of CLZ in its tablets, with a mean percentage recovery of 100.10 ± 0.75%. Statistical comparison of the results obtained using the proposed method with those obtained using a reference method showed no significant difference between the two methods in terms of accuracy and precision. Copyright © 2015 John Wiley & Sons, Ltd. PMID:26335592

  4. Interlaboratory Validation of the Leaching Environmental Assessment Framework (LEAF) Method 1313 and Method 1316

    EPA Science Inventory

    This document summarizes the results of an interlaboratory study conducted to generate precision estimates for two parallel batch leaching methods which are part of the Leaching Environmental Assessment Framework (LEAF). These methods are: (1) Method 1313: Liquid-Solid Partition...

  5. Developing and Validating the Youth Conduct Problems Scale-Rwanda: A Mixed Methods Approach

    PubMed Central

    Ng, Lauren C.; Kanyanganzi, Frederick; Munyanah, Morris; Mushashi, Christine; Betancourt, Theresa S.

    2014-01-01

    This study developed and validated the Youth Conduct Problems Scale-Rwanda (YCPS-R). Qualitative free listing (n = 74) and key informant interviews (n = 47) identified local conduct problems, which were compared to existing standardized conduct problem scales and used to develop the YCPS-R. The YCPS-R was cognitive tested by 12 youth and caregiver participants, and assessed for test-retest and inter-rater reliability in a sample of 64 youth. Finally, a purposive sample of 389 youth and their caregivers were enrolled in a validity study. Validity was assessed by comparing YCPS-R scores to conduct disorder, which was diagnosed with the Mini International Neuropsychiatric Interview for Children, and functional impairment scores on the World Health Organization Disability Assessment Schedule Child Version. ROC analyses assessed the YCPS-R's ability to discriminate between youth with and without conduct disorder. Qualitative data identified a local presentation of youth conduct problems that did not match previously standardized measures. Therefore, the YCPS-R was developed solely from local conduct problems. Cognitive testing indicated that the YCPS-R was understandable and required little modification. The YCPS-R demonstrated good reliability, construct, criterion, and discriminant validity, and fair classification accuracy. The YCPS-R is a locally-derived measure of Rwandan youth conduct problems that demonstrated good psychometric properties and could be used for further research. PMID:24949628
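
The ROC analysis described above reduces, for a binary diagnosis, to the rank-based AUC: the probability that a randomly chosen youth with conduct disorder scores higher than one without. A sketch with hypothetical YCPS-R scores, not the study's data:

```python
def roc_auc(scores_pos, scores_neg):
    """AUC = P(score of a positive case > score of a negative case); ties count 0.5.
    Equivalent to the Mann-Whitney U statistic normalised by n_pos * n_neg."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical YCPS-R totals for youth with / without a conduct-disorder diagnosis
with_cd = [14, 11, 9, 16]
without_cd = [5, 7, 9, 3, 6]
print(roc_auc(with_cd, without_cd))
```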

  6. Validation of the Eriksen method for the exact Foldy-Wouthuysen representation

    NASA Astrophysics Data System (ADS)

    Silenko, A. Ya.

    2013-05-01

    The Eriksen method is proven to yield a correct and exact result when a sufficient condition for exact transformation to the Foldy-Wouthuysen (FW) representation is satisfied. The Eriksen method is therefore confirmed as valid. This makes it possible to establish the limits within which the approximate "step-by-step" methods are applicable. The latter is done by comparing the relativistic formulas for the Hamiltonian operator in the FW representation (obtained using those methods) with the known expression for the first terms of the series that defines the expansion of this operator in powers of v/c, as found by applying the Eriksen method.

  7. Suitability of analytical methods to measure solubility for the purpose of nanoregulation.

    PubMed

    Tantra, Ratna; Bouwmeester, Hans; Bolea, Eduardo; Rey-Castro, Carlos; David, Calin A; Dogné, Jean-Michel; Jarman, John; Laborda, Francisco; Laloy, Julie; Robinson, Kenneth N; Undas, Anna K; van der Zande, Meike

    2016-01-01

    Solubility is an important physicochemical parameter in nanoregulation. If a nanomaterial is completely soluble, then from a risk-assessment point of view its disposal can be treated much in the same way as that of "ordinary" chemicals, which simplifies testing and characterisation regimes. This review assesses potential techniques for the measurement of nanomaterial solubility and evaluates their performance against a set of analytical criteria (based on satisfying the requirements governed by the cosmetic regulation as well as the need to quantify the concentration of free (hydrated) ions). Our findings show that no universal method exists. A complementary approach is thus recommended, comprising an atomic spectrometry-based method in conjunction with an electrochemical (or colorimetric) method. This article shows that although some techniques are more commonly used than others, a huge research gap remains related to the need to ensure data reliability. PMID:26001188

  8. Review of gaseous methods of killing poultry on-farm for disease control purposes.

    PubMed

    Raj, A B M; Sandilands, V; Sparks, N H C

    2006-08-19

    Poultry may need to be culled in the event of an outbreak of disease. Gassing has advantages over mechanical and electrical methods or overdoses of anaesthetics because large numbers can be killed simultaneously and little or no handling of the birds is required. However, gaseous killing methods may have welfare implications for the birds, which may find various gases more or less aversive, may undergo respiratory distress and/or experience convulsions, and may remain conscious for a considerable time before they die. In addition, the gases used may present health and safety risks to human operators, and be difficult to supply and deliver. PMID:16921011

  9. Factor analysis methods and validity evidence: A systematic review of instrument development across the continuum of medical education

    NASA Astrophysics Data System (ADS)

    Wetzel, Angela Payne

    Previous systematic reviews indicate a lack of reporting of reliability and validity evidence in subsets of the medical education literature. Psychology and general education reviews of factor analysis also indicate gaps between current and best practices; yet, a comprehensive review of exploratory factor analysis in instrument development across the continuum of medical education had not previously been conducted. The purpose of this study was therefore a critical review of instrument development articles employing exploratory factor or principal component analysis published in medical education (2006-2010), to describe and assess the reporting of methods and validity evidence based on the Standards for Educational and Psychological Testing and factor analysis best practices. Data extracted from 64 articles, measuring a variety of constructs and published throughout the peer-reviewed medical education literature, indicate significant errors in the translation of exploratory factor analysis best practices to current practice. Further, techniques for establishing validity evidence tend to derive from a limited scope of methods, mainly reliability statistics to support internal structure and support for test content. Instruments reviewed for this study lacked supporting evidence based on relationships with other variables and response process, and evidence based on consequences of testing was not evident. Findings suggest a need for further professional development within the medical education research community related to (1) appropriate factor analysis methodology and reporting and (2) the importance of pursuing multiple sources of reliability and validity evidence to construct a well-supported argument for the inferences made from the instrument. Medical education researchers and educators should be cautious in adopting instruments from the literature and should carefully review the available evidence. Finally, editors and reviewers are encouraged to recognize

  10. Experimental validation of applied strain sensors: importance, methods and still unsolved challenges

    NASA Astrophysics Data System (ADS)

    Habel, Wolfgang R.; Schukar, Vivien G.; Mewis, Franziska; Kohlhoff, Harald

    2013-09-01

    Fiber-optic strain sensors are increasingly used in very different technical fields. Sensors are provided with specifications defined by the manufacturer or ascertained by the interested user. If deformation sensors are to be used to evaluate the long-term behavior of safety-relevant structures or to monitor critical structural components, their performance and signal stability must be of high quality to enable reliable data recording. The measurement system must therefore be validated according to established technical rules and standards, both before and after its application. In some cases, not all details of the complex characteristics and performance of applied fiber-optic sensors are sufficiently understood or can be validated, because of a lack of knowledge and methods for checking the sensors' behavior. This contribution therefore focuses on the importance of rigorous validation in avoiding a decrease, or even deterioration, of the sensors' function. Methods for validation of applied sensors are discussed, with the aim of revealing weaknesses in the validation of embedded or integrated fiber-optic deformation and/or strain sensors. An outlook is given on research work that must still be carried out to ensure well-accepted practical use of fiber-optic sensors.

  11. An Empirically Based Method of Q-Matrix Validation for the DINA Model: Development and Applications

    ERIC Educational Resources Information Center

    de la Torre, Jimmy

    2008-01-01

    Most model fit analyses in cognitive diagnosis assume that a Q matrix is correct after it has been constructed, without verifying its appropriateness. Consequently, any model misfit attributable to the Q matrix cannot be addressed and remedied. To address this concern, this paper proposes an empirically based method of validating a Q matrix used…

  12. A validation framework for microbial forensic methods based on statistical pattern recognition

    SciTech Connect

    Velsko, S P

    2007-11-12

    This report discusses a general approach to validating microbial forensic methods that attempt to simultaneously distinguish among many hypotheses concerning the manufacture of a questioned biological agent sample. It focuses on the concrete example of determining growth medium from chemical or molecular properties of a bacterial agent to illustrate the concepts involved.

  13. 77 FR 61610 - Interagency Coordinating Committee on the Validation of Alternative Methods Evaluation Report and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-10

    ...-18, 2010 meeting (75 FR 26758, May 12, 2010) for comment. The public was also given an opportunity to....niehs.nih.gov/methods/ocutox/reducenum.htm ) for comment by the broad stakeholder community (76 FR 50220... HUMAN SERVICES National Institutes of Health Interagency Coordinating Committee on the Validation...

  14. Validity of a Simulation Game as a Method for History Teaching

    ERIC Educational Resources Information Center

    Corbeil, Pierre; Laveault, Dany

    2011-01-01

    The aim of this research is, first, to determine the validity of a simulation game as a method of teaching and an instrument for the development of reasoning and, second, to study the relationship between learning and students' behavior toward games. The participants were college students in a History of International Relations course, with two…

  15. Assessment of management in general practice: validation of a practice visit method.

    PubMed Central

    van den Hombergh, P; Grol, R; van den Hoogen, H J; van den Bosch, W J

    1998-01-01

    BACKGROUND: Practice management (PM) in general practice is as yet ill-defined; a systematic description of its domain, as well as a valid method to assess it, are necessary for research and assessment. AIM: To develop and validate a method to assess PM of general practitioners (GPs) and practices. METHOD: Relevant and potentially discriminating indicators were selected from a systematic framework of 2410 elements of PM to be used in an assessment method (VIP = visit instrument PM). The method was first tested in a pilot study and, after revision, was evaluated in order to select discriminating indicators and to determine validity of dimensions (factor and reliability analysis, linear regression). RESULTS: One hundred and ten GPs were assessed with the practice visit method using 249 indicators; 208 of these discriminated sufficiently at practice level or at GP level. Factor analysis resulted in 34 dimensions and in a taxonomy of PM. Dimensions and indicators showed marked variation between GPs and practices. Training practices scored higher on five dimensions; single-handed and dispensing practices scored lower on delegated tasks, but higher on accessibility and availability. CONCLUSION: A visit method to assess PM has been developed and its validity studied systematically. The taxonomy and dimensions of PM were in line with other classifications. Selection of a balanced number of useful and relevant indicators was nevertheless difficult. The dimensions could discriminate between groups of GPs and practices, establishing the value of the method for assessment. The VIP method could be an important contribution to the introduction of continuous quality improvement in the profession. PMID:10198481

  16. [Comparison of the selected methods of cord processing for transplantation purposes].

    PubMed

    Ołdak, T; Machaj, E K; Gajkowska, A; Kruszewski, M; Kłos, M; Szczecina, R; Czajkowski, K; Kuczyńska-Sicińska, J; Pojda, Z

    2000-09-01

    Human umbilical cord blood (UCB) has been successfully used as a source of allogeneic hematopoietic cells for transplantation. Banking of UCB requires volume reduction to decrease storage space, costs and the volume of infused DMSO. In order to select an optimal method for volume reduction, we compared several methods of cord blood processing, namely buffy coat centrifugation, red cell lysis, and hydroxyethyl starch (HES), methylcellulose and gelatin sedimentations. The viability of cells and the recoveries of total white blood cells, mononuclear cells and CD34+ cells were evaluated. We also compared the efficacy of red cell depletion from the original UCB sample. Buffy coat centrifugation, red cell lysis, and HES, gelatin or methylcellulose sedimentation resulted in high mononuclear cell recoveries, whereas high hematopoietic cell recovery was observed only after HES sedimentation and buffy coat processing. The HES sedimentation procedure is more time- and labor-consuming than buffy coat processing but resulted in greater depletion of red blood cells and platelets. Both methods can be recommended as methods of choice for umbilical cord blood processing before banking. PMID:11083012

  17. Comparison of different mass transport calculation methods for wind erosion quantification purposes

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Quantitative estimation of the material transported by the wind is essential in the study and control of wind erosion, although methods for its calculation are still controversial. Sampling the dust cloud at discrete heights, fitting an equation to the data, and integrating this equation from the so...
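
One common variant of the procedure described above is to fit an exponential decline of horizontal mass flux with height and integrate it over the sampled layer. The model choice, heights and sampler catches below are illustrative assumptions only:

```python
import math

def fit_exponential(heights, fluxes):
    """Least-squares fit of q(z) = a * exp(-b * z) via log-linear regression."""
    n = len(heights)
    xs, ys = heights, [math.log(q) for q in fluxes]
    xbar, ybar = sum(xs) / n, sum(ys) / n
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
             / sum((x - xbar) ** 2 for x in xs))
    a = math.exp(ybar - slope * xbar)   # intercept back-transformed from log space
    b = -slope                          # decay rate with height
    return a, b

# Hypothetical sampler catches: flux (kg m^-2 day^-1) at discrete heights (m)
heights = [0.05, 0.10, 0.20, 0.50, 1.00]
fluxes = [4.1, 3.3, 2.2, 0.75, 0.12]
a, b = fit_exponential(heights, fluxes)
# Mass transport per unit width = integral of q(z) dz from 0 to 1 m
transport = (a / b) * (1.0 - math.exp(-b * 1.0))
print(round(a, 2), round(b, 2), round(transport, 2))
```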

  18. Development and Validation of Stability Indicating RP-HPLC Method for Voriconazole.

    PubMed

    Khetre, A B; Sinha, P K; Damle, Mrinalini C; Mehendre, R

    2009-09-01

    This study describes the development and validation of a stability-indicating HPLC method for voriconazole, an antifungal drug. Voriconazole was subjected to stress degradation under different conditions recommended by the International Conference on Harmonization. The samples so generated were used to develop a stability-indicating high-performance liquid chromatographic method for voriconazole. The peak for voriconazole was well resolved from the peaks of degradation products using a Hypersil C18 (250×4.6 mm) column and a mobile phase comprising acetonitrile:water (40:60, v/v) at a flow rate of 1 ml/min. Detection was carried out using a photodiode array detector. A linear response (r > 0.99) was observed in the range of 5-25 µg/ml. The method showed good recoveries (average 100.06%), and the relative standard deviations for intra- and inter-day precision were determined. The method was also validated for specificity and robustness. PMID:20502568

  19. Low-cost extrapolation method for maximal LTE radio base station exposure estimation: test and validation.

    PubMed

    Verloock, Leen; Joseph, Wout; Gati, Azeddine; Varsier, Nadège; Flach, Björn; Wiart, Joe; Martens, Luc

    2013-06-01

    An experimental validation of a low-cost method for extrapolation and estimation of the maximal electromagnetic-field exposure from long-term evolution (LTE) radio base station installations is presented. No knowledge of downlink band occupation or service characteristics is required for the low-cost method. The method is applicable in situ: it requires only a basic spectrum analyser with appropriate field probes, without the need for expensive dedicated LTE decoders. The method is validated both in the laboratory and in situ, for a single-input single-output antenna LTE system and a 2×2 multiple-input multiple-output system, with low deviations in comparison with signals measured using dedicated LTE decoders. PMID:23179190

  20. Validity and reliability of an alternative method for measuring power output during six-second all-out cycling.

    PubMed

    Watson, Martin; Bibbo, Daniele; Duffy, Charles R; Riches, Philip E; Conforto, Silvia; Macaluso, Andrea

    2014-08-01

    In a laboratory setting where both a mechanically-braked cycling ergometer and a motion analysis (MA) system are available, flywheel angular displacement can be estimated by using MA. The purpose of this investigation was to assess the validity and reliability of a MA method for measuring maximal power output (Pmax) in comparison with a force transducer (FT) method. Eight males and eight females undertook three identical sessions, separated by 4 to 6 days; the first being a familiarization session. Individuals performed three 6-second sprints against 50% of the maximal resistance to complete two pedal revolutions with a 3-minute rest between trials. Power was determined independently using both MA and FT analyses. Validity: MA recorded significantly higher Pmax than FT (P < .05). Bland-Altman plots showed that there was a systematic bias in the difference between the measures of the two systems. This difference increased as power increased. Repeatability: Intraclass correlation coefficients were on average 0.90 ± 0.05 in males and 0.85 ± 0.08 in females. Measuring Pmax by MA, therefore, is as appropriate for use in exercise physiology research as Pmax measured by FT, provided that a bias between these measurements methods is allowed for. PMID:24977624
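
The Bland-Altman comparison used above (bias and 95% limits of agreement between the MA and FT power measures) can be sketched as follows; the paired Pmax values are invented for illustration:

```python
from statistics import mean, stdev

def bland_altman(a, b):
    """Bias and 95% limits of agreement between paired measurements."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = mean(diffs)
    sd = stdev(diffs)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical paired Pmax values (W): motion analysis vs force transducer
ma = [802, 910, 745, 988, 853]
ft = [780, 880, 735, 950, 835]
bias, low, high = bland_altman(ma, ft)
print(round(bias, 1), round(low, 1), round(high, 1))
```

A positive bias here would mirror the study's finding that MA records higher power than FT; plotting the differences against the means would reveal whether the bias grows with power.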

  1. A multiscale finite element model validation method of composite cable-stayed bridge based on Probability Box theory

    NASA Astrophysics Data System (ADS)

    Zhong, Rumian; Zong, Zhouhong; Niu, Jie; Liu, Qiqi; Zheng, Peijuan

    2016-05-01

    Modeling and simulation are routinely implemented to predict the behavior of complex structures. These tools powerfully unite theoretical foundations, numerical models and experimental data, which include associated uncertainties and errors. A new methodology for multi-scale finite element (FE) model validation is proposed in this paper. The method is based on a two-step updating method, a novel approach for obtaining the coupling parameters in the gluing sub-regions of a multi-scale FE model, and on Probability Box (P-box) theory, which can provide lower and upper bounds for the purpose of quantifying and transmitting the uncertainty of structural parameters. The structural health monitoring data of Guanhe Bridge, a long-span composite cable-stayed bridge, and Monte Carlo simulation were used to verify the proposed method. The results show satisfactory accuracy: the overlap ratio index of each modal frequency is over 89%, the average absolute value of the relative errors is small, and the CDF of the normal distribution coincides well with the measured frequencies of Guanhe Bridge. The validated multiscale FE model may be further used in structural damage prognosis and safety prognosis.

  2. Refraction-based X-ray Computed Tomography for Biomedical Purpose Using Dark Field Imaging Method

    NASA Astrophysics Data System (ADS)

    Sunaguchi, Naoki; Yuasa, Tetsuya; Huo, Qingkai; Ichihara, Shu; Ando, Masami

    We have proposed a tomographic x-ray imaging system using DFI (dark field imaging) optics along with a data-processing method to extract information on refraction from the measured intensities, and a reconstruction algorithm to reconstruct a refractive-index field from the projections generated from the extracted refraction information. The DFI imaging system consists of a tandem optical system of Bragg- and Laue-case crystals, a positioning device system for a sample, and two CCD (charge coupled device) cameras. Then, we developed a software code to simulate the data-acquisition, data-processing, and reconstruction methods to investigate the feasibility of the proposed methods. Finally, in order to demonstrate its efficacy, we imaged a sample with DCIS (ductal carcinoma in situ) excised from a breast cancer patient using a system constructed at the vertical wiggler beamline BL-14C in KEK-PF. Its CT images depicted a variety of fine histological structures, such as milk ducts, duct walls, secretions, adipose and fibrous tissue. They correlate well with histological sections.

  3. Development and Validation of a Terbium-Sensitized Luminescence Analytical Method for Deferiprone

    PubMed Central

    Manzoori Lashkar, Jamshid; Amjadi, Mohammad; Soleymani, Jafar; Tamizi, Elnaz; Panahi-Azar, Vahid; Jouyban, Abolghasem

    2012-01-01

    A sensitive fluorometric method for the determination of deferiprone (DFP), based on the formation of a luminescent complex with Tb3+ ions in aqueous solutions, is reported. The maximum excitation and emission wavelengths were 295 and 545 nm, respectively. The effects of various factors on the luminescence intensity of the system were investigated and optimized; under the optimum conditions, the method was validated. The validation results indicated that the relative intensity at 545 nm has a linear relationship with the concentration of DFP in aqueous solutions in the range of 7.2 × 10(-9) to 1.4 × 10(-5) M; the detection and quantification limits were calculated as 6.3 × 10(-9) and 2.1 × 10(-8) M, respectively; the precision and accuracy of the method were lower than 5%; and the recovery was between 100.1% and 102.3%. The results indicated that this method is simple, time-saving, specific, accurate and precise for the determination of DFP in aqueous solutions. After optimization and validation, the method was successfully applied to the determination of DFP in tablet dosage forms. The stoichiometry of the Tb3+-DFP complex was found to be 1:3 and the complex formation constant was 1.6 × 10(16). PMID:24250504
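
Detection and quantification limits of the kind reported above are commonly estimated from the blank-signal noise and the calibration slope (LOD ≈ 3.3σ/S, LOQ ≈ 10σ/S, per ICH convention); the numbers below are hypothetical, not the paper's:

```python
def lod_loq(sigma_blank, slope):
    """ICH-style estimates: LOD = 3.3*sigma/S, LOQ = 10*sigma/S."""
    return 3.3 * sigma_blank / slope, 10.0 * sigma_blank / slope

# Hypothetical values: blank-signal standard deviation and calibration slope
sigma, slope = 0.8, 400.0   # intensity units, intensity per µM
lod, loq = lod_loq(sigma, slope)
print(lod, loq)   # concentrations in µM under these assumed units
```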

  4. E-Flux2 and SPOT: Validated Methods for Inferring Intracellular Metabolic Flux Distributions from Transcriptomic Data

    PubMed Central

    Kim, Min Kyung; Lane, Anatoliy; Kelley, James J.; Lun, Desmond S.

    2016-01-01

    Background Several methods have been developed to predict system-wide and condition-specific intracellular metabolic fluxes by integrating transcriptomic data with genome-scale metabolic models. While powerful in many settings, existing methods have several shortcomings, and it is unclear which method has the best accuracy in general because of limited validation against experimentally measured intracellular fluxes. Results We present a general optimization strategy for inferring intracellular metabolic flux distributions from transcriptomic data coupled with genome-scale metabolic reconstructions. It consists of two different template models called DC (determined carbon source model) and AC (all possible carbon sources model) and two different new methods called E-Flux2 (E-Flux method combined with minimization of the l2 norm) and SPOT (Simplified Pearson cOrrelation with Transcriptomic data), which can be chosen and combined depending on the availability of knowledge on the carbon source or objective function. This enables us to simulate a broad range of experimental conditions. We examined E. coli and S. cerevisiae as representative prokaryotic and eukaryotic microorganisms, respectively. The predictive accuracy of our algorithm was validated by calculating the uncentered Pearson correlation between predicted fluxes and measured fluxes. To this end, we compiled 20 experimental conditions (11 in E. coli and 9 in S. cerevisiae) of transcriptome measurements coupled with corresponding central carbon metabolism intracellular flux measurements determined by 13C metabolic flux analysis (13C-MFA), which is the largest dataset assembled to date for the purpose of validating inference methods for predicting intracellular fluxes. In both organisms, our method achieves an average correlation coefficient ranging from 0.59 to 0.87, outperforming a representative sample of competing methods. Easy-to-use implementations of E-Flux2 and SPOT are available as part of the open
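
The validation metric named above, the uncentered Pearson correlation between predicted and 13C-MFA-measured flux vectors, is simply the cosine similarity of the raw vectors; a sketch with invented fluxes:

```python
import math

def uncentered_pearson(u, v):
    """Uncentered Pearson correlation: cosine similarity of the raw vectors,
    i.e. Pearson's r without subtracting the means."""
    dot = sum(x * y for x, y in zip(u, v))
    norm_u = math.sqrt(sum(x * x for x in u))
    norm_v = math.sqrt(sum(y * y for y in v))
    return dot / (norm_u * norm_v)

# Hypothetical fluxes (mmol/gDW/h) for a handful of central-carbon reactions
predicted = [10.0, 4.2, 1.1, 7.5]
measured = [9.4, 5.0, 0.8, 8.1]
print(round(uncentered_pearson(predicted, measured), 3))
```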

  5. Potential of accuracy profile for method validation in inductively coupled plasma spectrochemistry

    NASA Astrophysics Data System (ADS)

    Mermet, J. M.; Granier, G.

    2012-10-01

    Method validation is usually performed over a range of concentrations for which analytical criteria must be verified. One important criterion in quantitative analysis is accuracy, i.e. the contribution of both trueness and precision. The study of accuracy over this range is called an accuracy profile and provides experimental tolerance intervals. Comparison with acceptability limits fixed by the end user defines a validity domain. This work describes the computation involved in the building of the tolerance intervals, particularly for the intermediate precision with within-laboratory experiments and for the reproducibility with interlaboratory studies. Computation is based on ISO 5725-4 and on previously published work. Moreover, the bias uncertainty is also computed to verify the bias contribution to accuracy. The various types of accuracy profile behavior are exemplified with results obtained by using ICP-MS and ICP-AES. This procedure allows the analyst to define unambiguously a validity domain for a given accuracy. However, because the experiments are time-consuming, the accuracy profile method is mainly dedicated to method validation.
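
As a simplified illustration of the β-expectation tolerance intervals described above (one concentration level, one variance component; the full ISO 5725-4 computation pools within- and between-series variances), with hypothetical recovery data:

```python
from statistics import mean, stdev
from math import sqrt

def tolerance_interval(results, t_crit):
    """Approximate beta-expectation tolerance interval: mean +/- t * s * sqrt(1 + 1/n).
    t_crit is the two-sided Student t quantile for the chosen beta and n-1 df."""
    n = len(results)
    m, s = mean(results), stdev(results)
    half_width = t_crit * s * sqrt(1.0 + 1.0 / n)
    return m - half_width, m + half_width

# Hypothetical recoveries (%) at one level; t(0.975, df=5) ~ 2.571
recoveries = [98.4, 101.2, 99.5, 100.8, 97.9, 100.1]
low, high = tolerance_interval(recoveries, 2.571)
# The interval is then compared with the end user's acceptability limits, e.g. 95-105%
print(round(low, 1), round(high, 1))
```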

  6. Standardization of water purification in the central dialysis fluid delivery system: validation and parametric method.

    PubMed

    Tomo, Tadashi; Shinoda, Tosiho

    2009-01-01

    The central dialysis fluid delivery system (CDDS) is the main system used for hemodialysis therapy in Japan. Validation and a parametric method are necessary for the quality control of dialysis fluid in CDDS. Validation is a concept for assuring system compatibility and product quality; it covers the manufacturing and quality-control methods, including the system design and equipment of the manufacturing facility and the manufacturing procedures and processes. Confirmed results must be kept within acceptable limits and must be documented in a record. Important parameters for validating CDDS include: (1) setting the sterilized area; (2) deciding the sterilization level; (3) confirming the maximum bio-burden; (4) the performance of the endotoxin-retentive filter and reverse osmosis (RO) module; and (5) checkpoints for the purity of dialysis water in the system. Taking the concept of validation and a parametric method into consideration in the management of CDDS enables the supply of purified dialysis fluid, or of online-prepared substitution fluid, that meets the 2008 standards of the Japanese Society for Dialysis Therapy. PMID:19556762

  7. Multiple methods, maps, and management applications: Purpose made seafloor maps in support of ocean management

    NASA Astrophysics Data System (ADS)

    Brown, Craig J.; Sameoto, Jessica A.; Smith, Stephen J.

    2012-08-01

    The establishment of multibeam echosounders (MBES) as a mainstream tool in ocean mapping has facilitated integrative approaches toward nautical charting, benthic habitat mapping, and seafloor geotechnical surveys. The inherent bathymetric and backscatter information generated by MBES enables marine scientists to present highly accurate bathymetric data with a spatial resolution closely matching that of terrestrial mapping. Furthermore, developments in data collection and processing of MBES backscatter, combined with the quality of the co-registered depth information, have resulted in the increasing preferential use of multibeam technology over conventional sidescan sonar for the production of benthic habitat maps. A range of post-processing approaches can generate customized map products to meet multiple ocean management needs, thus extracting maximum value from a single survey data set. Based on recent studies over German Bank off SW Nova Scotia, Canada, we show how primary MBES bathymetric and backscatter data, along with supplementary data (i.e. in situ video and stills), were processed using a variety of methods to generate a series of maps. Methods conventionally used for classification of multi-spectral data were tested for classification of the MBES data set to produce a map summarizing broad bio-physical characteristics of the seafloor (i.e. a benthoscape map), which is of value for use in many aspects of marine spatial planning. A species-specific habitat map for the sea scallop Placopecten magellanicus was also generated from the MBES data by applying a Species Distribution Modeling (SDM) method to spatially predict habitat suitability, which offers tremendous promise for use in fisheries management. In addition, we explore the challenges of incorporating benthic community data into maps based on species information derived from a large number of seafloor photographs. Through the process of applying multiple methods to generate multiple maps for

  8. Assembly for collecting samples for purposes of identification or analysis and method of use

    DOEpatents

    Thompson, Cyril V. [Knoxville, TN]; Smith, Rob R. [Knoxville, TN]

    2010-02-02

    An assembly, and an associated method, for collecting a sample of material to be characterized with diagnostic equipment includes an elongated member having a proximal end, by which the user manipulates the assembly, and a distal end. A collection tip capable of being placed into contact with the material to be characterized is supported upon the distal end. The collection tip includes a body of chemically-inert porous material that binds a sample of material when the tip is placed into contact with the material, thereby holding the sample for subsequent introduction to the diagnostic equipment.

  9. Co-validation of three methods for optical characterization of point-focus concentrators

    NASA Astrophysics Data System (ADS)

    Wendelin, T. J.; Grossman, J. W.

    Three different methods for characterizing point-focus solar concentrator optical performance have been developed for specific applications. These methods include a laser ray trace technique called the Scanning Hartmann Optical Test, a video imaging process called the 2f Technique and actual on-sun testing in conjunction with optical computer modeling. Three concentrator test articles, each of a different design, were characterized using at least two of the methods and, in one case, all three. The results of these tests are compared in order to validate the methods. Excellent agreement is observed in the results, suggesting that the techniques provide consistent and accurate characterizations of solar concentrator optics.

  10. Co-validation of three methods for optical characterization of point-focus concentrators

    SciTech Connect

    Wendelin, T.J.; Grossman, J.W.

    1994-10-01

    Three different methods for characterizing point-focus solar concentrator optical performance have been developed for specific applications. These methods include a laser ray trace technique called the Scanning Hartmann Optical Test, a video imaging process called the 2f Technique and actual on-sun testing in conjunction with optical computer modeling. Three concentrator test articles, each of a different design, were characterized using at least two of the methods and, in one case, all three. The results of these tests are compared in order to validate the methods. Excellent agreement is observed in the results, suggesting that the techniques provide consistent and accurate characterizations of solar concentrator optics.

  11. PAH detection in Quercus robur leaves and Pinus pinaster needles: A fast method for biomonitoring purpose.

    PubMed

    De Nicola, F; Concha Graña, E; Aboal, J R; Carballeira, A; Fernández, J Á; López Mahía, P; Prada Rodríguez, D; Muniategui Lorenzo, S

    2016-06-01

    Because plant matrices are complex and heterogeneous, extraction procedures should be standardized for each individual biomonitor. Here we describe a matrix solid-phase dispersion extraction method, previously used for moss samples, improved and modified for the analysis of PAHs in Quercus robur leaves and Pinus pinaster needles, species widely used in biomonitoring studies across Europe. The improvement over the previous procedure is the use of Florisil combined with additional clean-up sorbents (10% deactivated silica for pine needles and PSA for oak leaves), since these matrices are rich in interfering compounds, as shown by gas chromatography-mass spectrometry analyses acquired in full-scan mode. The selected procedures achieved good trueness (90-120% for most compounds), high precision (intermediate precision between 2% and 12%) and good sensitivity using only 250 mg of sample (limits of quantification lower than 3 and 1.5 ng g(-1) for pine and oak, respectively). These methods proved reliable for PAH analysis and, being fast, can be used in biomonitoring studies of PAH air contamination. PMID:27130099

  12. Owner-collected swabs of pets: a method fit for the purpose of zoonoses research.

    PubMed

    Möbius, N; Hille, K; Verspohl, J; Wefstaedt, P; Kreienbrock, L

    2013-09-01

    As part of the preparation of a large cohort study in the entire German population, this study examined the feasibility of cat and dog owners collecting nasal and oral swabs of their animals at home as a method of assessing exposure to zoonoses. In veterinary clinics in Hannover, Germany, 100 pet owners were recruited. Nasal and oral swabs of pets were taken by a veterinarian at the clinic, and owners took swabs at home. Swabs were analysed for bacterial growth and compared (owner vs. vet) using Cohen's kappa and McNemar's test. The return rate of kits was 92%, and 77% of owners thought veterinary assistance was unnecessary for swabbing the mouth. Agreement between owner- and vet-collected oral swabs was 78% for Gram-positive and 87% for Gram-negative bacterial growth, with similar results for nasal swabs. Although sample quality differed, this method made it possible to obtain swabs from pets and thereby information about colonization with zoonotic pathogens. PMID:23114113
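As a rough illustration of the agreement statistics named in the abstract above, here is a minimal sketch of Cohen's kappa and an exact McNemar test for a 2x2 owner-vs-vet agreement table; the function names and example counts are hypothetical, not taken from the study:

```python
import numpy as np
from math import comb

def cohens_kappa(table):
    """Cohen's kappa for a 2x2 agreement table [[a, b], [c, d]]."""
    table = np.asarray(table, dtype=float)
    n = table.sum()
    po = np.trace(table) / n                          # observed agreement
    pe = (table.sum(0) * table.sum(1)).sum() / n**2   # chance agreement
    return (po - pe) / (1 - pe)

def mcnemar_exact(b, c):
    """Exact (binomial) McNemar test p-value from the two discordant counts."""
    n = b + c
    k = min(b, c)
    # two-sided binomial tail with p = 0.5 under the null
    p = sum(comb(n, i) for i in range(k + 1)) / 2**n
    return min(1.0, 2 * p)
```

With kappa quantifying chance-corrected agreement and McNemar's test checking whether owners and vets disagree symmetrically, the two statistics answer complementary questions about the paired swab results.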

  13. Cross-validation of component models: a critical look at current methods.

    PubMed

    Bro, R; Kjeldahl, K; Smilde, A K; Kiers, H A L

    2008-03-01

    In regression, cross-validation is an effective and popular approach that is used to decide, for example, the number of underlying features, and to estimate the average prediction error. The basic principle of cross-validation is to leave out part of the data, build a model, and then predict the left-out samples. While such an approach can also be envisioned for component models such as principal component analysis (PCA), most current implementations do not comply with the essential requirement that the predictions should be independent of the entity being predicted. Further, these methods have not been properly reviewed in the literature. In this paper, we review the most commonly used generic PCA cross-validation schemes and assess how well they work in various scenarios. PMID:18214448
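The requirement that predictions be independent of the entity being predicted can be met by element-wise cross-validation, in which each held-out entry is treated as missing and imputed by iterative low-rank SVD. The sketch below is one generic scheme of this kind, written as our own illustration rather than a reproduction of any specific method reviewed in the paper:

```python
import numpy as np

def pca_predict_missing(X, mask, n_comp, n_iter=200):
    """Impute masked entries of X by iterative rank-n_comp SVD (EM-style).
    mask: boolean array, True where the entry is held out."""
    Xf = X.copy()
    # initialize held-out entries with their column means (masked values excluded)
    col_means = np.nanmean(np.where(mask, np.nan, X), axis=0)
    Xf[mask] = col_means[np.nonzero(mask)[1]]
    for _ in range(n_iter):
        U, s, Vt = np.linalg.svd(Xf, full_matrices=False)
        Xhat = (U[:, :n_comp] * s[:n_comp]) @ Vt[:n_comp]
        Xf[mask] = Xhat[mask]     # only the held-out entries are updated
    return Xf

def ekf_press(X, n_comp):
    """Leave-one-element-out PRESS for a PCA model of rank n_comp."""
    press = 0.0
    for i in range(X.shape[0]):
        for j in range(X.shape[1]):
            mask = np.zeros_like(X, dtype=bool)
            mask[i, j] = True
            pred = pca_predict_missing(X, mask, n_comp)[i, j]
            press += (X[i, j] - pred) ** 2
    return press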

  14. Validating 3D Seismic Velocity Models Using the Spectral Element Method

    NASA Astrophysics Data System (ADS)

    Maceira, M.; Rowe, C. A.; Allen, R. M.; Obrebski, M. J.

    2010-12-01

    As seismic instrumentation, data storage and dissemination, and computational power improve, seismic velocity models attempt to resolve smaller structures and cover larger areas. However, it is unclear how accurate these velocity models are and, while the best models available are used for event determination, it is difficult to put uncertainties on seismic event parameters. Model validation is typically done using resolution tests that assume the imaging theory used is accurate and thus only consider the impact of the data coverage on resolution. We present the results of a more rigorous approach to model validation via full three-dimensional waveform propagation using Spectral Element Methods (SEM). This approach makes no assumptions about the theory used to generate the models but requires substantial computational resources. We first validate 3D tomographic models for the Western USA generated using both ray-theoretical and finite-frequency methods. The Dynamic North America (DNA) Models of P- and S-velocity structure (DNA09-P and DNA09-S) use teleseismic body-wave traveltime residuals recorded at over 800 seismic stations provided by the EarthScope USArray and regional seismic networks. We performed systematic computations of synthetics for the dataset used to generate the DNA models. Direct comparison of these synthetic seismograms to the actual observations allows us to accurately assess and validate the models. Implementation of the method for a densely instrumented region such as that covered by the DNA model provides a useful testbed for the validation methods that we will subsequently apply to other, more challenging study areas.

  15. A method in search of a purpose: the internal morality of medicine.

    PubMed

    Arras, J D

    2001-12-01

    I begin this commentary with an expanded typology of theories that endorse an internal morality of medicine. I then subject these theories to a philosophical critique. I argue that the more robust claims for an internal morality fail to establish a stand-alone method for bioethics because they ignore crucial non-medical values, violate norms of justice and fail to establish the normativity of medical values. I then argue that weaker versions of internalism avoid such problems, but at the cost of failing to provide a clear sense in which their moral norms are internal or can ground a comprehensive approach to moral problems. Finally, I explore various functions that an internal morality might serve, concluding with the observation that, while there may be a core of good sense to the notion of an internal morality of medicine, our expectations for it must be drastically lowered. PMID:11735054

  16. A method of setting limits for the purpose of quality assurance.

    PubMed

    Sanghangthum, Taweap; Suriyapee, Sivalee; Kim, Gwe-Ya; Pawlicki, Todd

    2013-10-01

    The result from any assurance measurement needs to be checked against some limits for acceptability. There are two types of limits: those that define clinical acceptability (action limits) and those that are meant to serve as a warning that the measurement is close to the action limits (tolerance limits). Currently, there is no standard procedure to set these limits. In this work, we propose an operational procedure to set tolerance limits and action limits. The approach to establish the limits is based on techniques of quality engineering using control charts and a process capability index. The method is different for tolerance limits and action limits, with action limits being categorized into those that are specified and unspecified. The procedure is to first ensure process control using the I-MR control charts. Then, the tolerance limits are set equal to the control chart limits on the I chart. Action limits are determined using the Cpm process capability index with the requirement that the process must be in-control. The limits from the proposed procedure are compared to an existing or conventional method. Four examples are investigated: two of volumetric modulated arc therapy (VMAT) point dose quality assurance (QA) and two of routine linear accelerator output QA. The tolerance limits range from about 6% larger to 9% smaller than conventional action limits for VMAT QA cases. For the linac output QA, tolerance limits are about 60% smaller than conventional action limits. The operational procedure described in this work is based on established quality management tools and will provide a systematic guide to set up tolerance and action limits for different equipment and processes. PMID:24043363
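The two quantities at the heart of this procedure, individuals-chart control limits derived from the average moving range and the Cpm capability index, can be sketched as follows. The function names and example data are our own, not from the paper; the standard I-chart constant 2.66 = 3/d2 (d2 = 1.128 for moving ranges of span 2) is assumed:

```python
import numpy as np

def imr_limits(x):
    """Individuals (I) chart limits from the average moving range (MR-bar).
    2.66 = 3/d2, with d2 = 1.128 for moving ranges of size 2."""
    x = np.asarray(x, float)
    mr_bar = np.mean(np.abs(np.diff(x)))   # average absolute successive difference
    center = x.mean()
    return center - 2.66 * mr_bar, center + 2.66 * mr_bar

def cpm(x, lsl, usl, target):
    """Taguchi (Cpm) capability index: penalizes deviation from the target,
    not just spread about the process mean."""
    x = np.asarray(x, float)
    tau = np.sqrt(np.mean((x - target) ** 2))
    return (usl - lsl) / (6 * tau)
```

In the proposed scheme the I-chart limits would serve as tolerance limits, while Cpm (computed against specified action limits as LSL/USL) indicates whether the in-control process is capable of meeting them.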

  17. A method of setting limits for the purpose of quality assurance

    NASA Astrophysics Data System (ADS)

    Sanghangthum, Taweap; Suriyapee, Sivalee; Kim, Gwe-Ya; Pawlicki, Todd

    2013-10-01

    The result from any assurance measurement needs to be checked against some limits for acceptability. There are two types of limits: those that define clinical acceptability (action limits) and those that are meant to serve as a warning that the measurement is close to the action limits (tolerance limits). Currently, there is no standard procedure to set these limits. In this work, we propose an operational procedure to set tolerance limits and action limits. The approach to establish the limits is based on techniques of quality engineering using control charts and a process capability index. The method is different for tolerance limits and action limits, with action limits being categorized into those that are specified and unspecified. The procedure is to first ensure process control using the I-MR control charts. Then, the tolerance limits are set equal to the control chart limits on the I chart. Action limits are determined using the Cpm process capability index with the requirement that the process must be in-control. The limits from the proposed procedure are compared to an existing or conventional method. Four examples are investigated: two of volumetric modulated arc therapy (VMAT) point dose quality assurance (QA) and two of routine linear accelerator output QA. The tolerance limits range from about 6% larger to 9% smaller than conventional action limits for VMAT QA cases. For the linac output QA, tolerance limits are about 60% smaller than conventional action limits. The operational procedure described in this work is based on established quality management tools and will provide a systematic guide to set up tolerance and action limits for different equipment and processes.

  18. Validity and feasibility of a digital diet estimation method for use with preschool children: a pilot study

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The goal of the study was to assess the validity and feasibility of a digital diet estimation method for use with preschool children in Head Start. Preschool children and their caregivers participated in validation (n=22) and feasibility (n=24) pilot studies. Validity was determined in the metabolic...

  19. Use of Empirical Bayes Methods in the Study of the Validity of Academic Predictors of Graduate School Performance.

    ERIC Educational Resources Information Center

    Braun, Henry I.; Jones, Douglas H.

    Classical statistical methods and the small enrollments in graduate departments have constrained the Graduate Record Examinations (GRE) Validity Study Service to providing only validities for single predictors. Estimates of the validity of two or more predictors, used jointly, are considered too unreliable because the corresponding prediction…

  20. Optimization and validation of a new CE method for the determination of pantoprazole enantiomers.

    PubMed

    Guan, Jin; Yan, Feng; Shi, Shuang; Wang, Silin

    2012-06-01

    A new CE method using sulfobutylether-beta-cyclodextrin (SBE-beta-CD) as a chiral additive was developed and validated for the determination of pantoprazole enantiomers. The primary factors affecting its separation efficiency, which include chiral selector, buffer pH, organic additive, and applied voltage, were optimized. The best results were obtained using a buffer consisting of 50 mM borax-150 mM phosphate adjusted to pH 6.5, 20 mg/mL SBE-beta-CD, and a 10 kV applied voltage. The optimized method was validated for linearity, precision and accuracy, and proved to be robust. The LOD and LOQ for R-(+)-pantoprazole were 0.9 and 2.5 μg/mL, respectively. The method is capable of determining a minimum limit of 0.1% (w/w) of R-enantiomer in S-(-)-pantoprazole bulk samples. PMID:22736366

  1. Tutorial review on validation of liquid chromatography-mass spectrometry methods: part II.

    PubMed

    Kruve, Anneli; Rebane, Riin; Kipper, Karin; Oldekop, Maarja-Liisa; Evard, Hanno; Herodes, Koit; Ravio, Pekka; Leito, Ivo

    2015-04-22

    This is Part II of a tutorial review intended to give an overview of the state of the art of method validation in liquid chromatography mass spectrometry (LC-MS) and to discuss specific issues that arise with MS (and MS-MS) detection in LC (as opposed to the "conventional" detectors). Part II starts by briefly introducing the main quantitation methods and then addresses the performance characteristics related to quantification: linearity of signal, sensitivity, precision, trueness, accuracy, stability and measurement uncertainty. The last section is devoted to practical considerations in validation. For each performance characteristic, its essence and terminology are addressed, the current status of its treatment is reviewed, and recommendations are given on how to handle it, specifically in the case of LC-MS methods. PMID:25819784

  2. Validated UV-spectrophotometric method for the evaluation of the efficacy of makeup remover.

    PubMed

    Charoennit, P; Lourith, N

    2012-04-01

    A UV-spectrophotometric method for the analysis of makeup remover was developed and validated according to ICH guidelines. Three makeup removers, whose main ingredients were vegetable oil (A), mineral oil and silicone (B), and mineral oil and water (C), were sampled in this study. Ethanol was the optimal solvent because it did not interfere with the maximum absorbance of the liquid foundation at 250 nm. The linearity was determined over a range of makeup concentrations from 0.540 to 1.412 mg mL⁻¹ (R² = 0.9977). The accuracy of this method was determined by analysing low, intermediate and high concentrations of the liquid foundation, giving 78.59-91.57% recoveries with a relative standard deviation of <2% (0.56-1.45%). This result demonstrates the validity and reliability of the method. The reproducibilities were 97.32 ± 1.79, 88.34 ± 2.69 and 95.63 ± 2.94 for preparations A, B and C respectively, within the acceptable limits set forth by the ASEAN analytical validation guidelines, confirming the precision of the method under the same operating conditions over a short time interval as well as the inter-assay precision within the laboratory. The proposed method is therefore a simple, rapid, accurate, precise and inexpensive technique for the routine analysis of makeup remover efficacy. PMID:22243432

  3. Validity of Demirjian's and modified Demirjian's methods in age estimation for Korean juveniles and adolescents.

    PubMed

    Lee, Sang-Seob; Kim, Dongjae; Lee, Saebomi; Lee, U-Young; Seo, Joong Seok; Ahn, Yong Woo; Han, Seung-Ho

    2011-09-10

    In estimating the age of juveniles and adolescents, the teeth are employed primarily because of their low variability and because their development is less affected by endocrine and nutritional status. Demirjian established criteria for evaluating the maturity of teeth, and his method has been used throughout the world. However, several studies have shown that Demirjian's method is inappropriate for populations other than the one on which it is based. Consequently, some researchers have modified Demirjian's method using data from several different populations, and Demirjian himself published a revised method to overcome other shortcomings of his original method. The aim of this study was to test the validity of Demirjian's method and the modified methods (Demirjian's revised, Willems', Chaillet's and a new Korean method) for Korean juveniles and adolescents. 1483 digital orthopantomograms (754 males and 729 females, age range 3-16 years) were collected, and a new age estimation method based on Korean population data was derived. Dental age was estimated according to each method, and validity was evaluated using the differences between chronological and dental age. Inter- and intra-observer reliability was excellent. A statistically significant difference between chronological and dental age was observed for both sexes in all methods except the new Korean method (both sexes) and Demirjian's revised method (males only). However, when analyzing the absolute and squared differences, Willems' method was found to be the most accurate for the Korean population for both sexes, followed closely by the new Korean method. In conclusion, both Willems' method and the new Korean method developed in the present study were proven suitable for the Korean population. PMID:21561728

  4. A Thematic Review of Interactive Whiteboard Use in Science Education: Rationales, Purposes, Methods and General Knowledge

    NASA Astrophysics Data System (ADS)

    Ormanci, Ummuhan; Cepni, Salih; Deveci, Isa; Aydin, Ozhan

    2015-10-01

    In Turkey and many other countries, the importance of the interactive whiteboard (IWB) is increasing, and as a result, projects and studies are being conducted regarding the use of the IWB in classrooms. Accordingly, many issues are being researched in these countries, such as the IWB's contribution to the education process, its use in classroom settings and the problems that occur when using it. In this context, reviewing and analyzing studies on the use of the IWB has important implications for educators, researchers and teachers. This study aims to review and analyze studies conducted on the use of the IWB in the field of science. As a thematic review of the research was deemed appropriate, articles available in the literature were analyzed using a matrix consisting of general features (type of journal, year and demographic properties) and content features (rationales, aims, research methods, samples, data collections, results and suggestions). According to the findings, most studies on the use of IWBs were motivated by deficiencies in the current literature, whereas studies whose rationale was linked to the nature of science education are rare. There were also studies that focused on the effects of the IWB on student academic success and learning outcomes. Within this context, there is a clear need for further research on the use of IWBs in science education and on the effect of IWBs on students' skills.

  5. A hot-wire method based thermal conductivity measurement apparatus for teaching purposes

    NASA Astrophysics Data System (ADS)

    Alvarado, S.; Marín, E.; Juárez, A. G.; Calderón, A.; Ivanov, R.

    2012-07-01

    The implementation of an automated system based on the hot-wire technique is described for the measurement of the thermal conductivity of liquids using equipment easily available in modern physics laboratories at high schools and universities (basically a precision current source, a voltage meter, a data acquisition card, a personal computer and a high purity platinum wire). The wire, which is immersed in the investigated sample, is heated by passing a constant electrical current through it, and its temperature evolution, ΔT, is measured as a function of time, t, for several values of the current. A straightforward methodology is then used for data processing in order to obtain the liquid thermal conductivity. The starting point is the well-known linear relationship between ΔT and ln(t) predicted for long heating times by a model based on a solution of the heat conduction equation for an infinite linear heat source embedded in an infinite medium into which heat is conducted without convective and radiative heat losses. A criterion is used to verify that the selected linear region is the one that matches the conditions imposed by the theoretical model. As a consequence, the method involves least-squares fits in linear, semi-logarithmic (semi-log) and log-log graphs, so that it becomes attractive not only for teaching about heat transfer and thermal property measurement techniques, but also as a good exercise for students of undergraduate physics and engineering courses learning about these kinds of mathematical functional relationships between variables. The functionality of the experiment was demonstrated by measuring the thermal conductivity of liquid samples with well known thermal properties.
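The data-processing step described above can be sketched in a few lines, assuming the ideal line-source model ΔT = (q / 4πk)·ln(t) + const, where q is the electrical power dissipated per unit wire length. The function name and parameters are illustrative, not from the paper:

```python
import numpy as np

def thermal_conductivity(t, dT, current, resistance, length):
    """Estimate thermal conductivity k from the long-time linear region
    of dT vs ln(t). Ideal line-source model:
        dT = (q / (4 pi k)) * ln(t) + const
    with q the electrical power dissipated per unit wire length (W/m)."""
    slope, _ = np.polyfit(np.log(t), dT, 1)   # least-squares fit on semi-log axes
    q = current**2 * resistance / length      # Joule heating per unit length
    return q / (4 * np.pi * slope)
```

In practice one would first restrict `t` and `dT` to the verified linear region, as the abstract's selection criterion requires, before performing the fit.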

  6. Validation of the Endopep-MS method for qualitative detection of active botulinum neurotoxins in human and chicken serum

    PubMed Central

    Björnstad, Kristian; Åberg, Annica Tevell; Kalb, Suzanne R.; Wang, Dongxia; Barr, John R.; Bondesson, Ulf; Hedeland, Mikael

    2015-01-01

    Botulinum neurotoxins (BoNTs) are highly toxic proteases produced by anaerobic bacteria. Traditionally, a mouse bioassay (MBA) has been used for detection of BoNTs, but laboratories have long worked on alternative methods for their detection. One of the most promising in vitro methods is a combined enzymatic and mass spectrometric assay called Endopep-MS. However, no comprehensive validation of the method has been presented. The main purpose of this work was to perform an in-house validation for the qualitative analysis of BoNT-A, B, C, C/D, D, D/C, E, and F in serum. The limit of detection (LOD), selectivity, precision, stability in matrix and solution, and correlation with the MBA were evaluated. The LOD was equal to or even better than that of the MBA for BoNT-A, B, D/C, E, and F. Furthermore, Endopep-MS was for the first time successfully used to differentiate between BoNT-C, D and their mosaics C/D and D/C by different combinations of antibodies and target peptides. In addition, sequential antibody capture was presented as a new way to multiplex the method when only a small sample volume is available. In the comparison with the MBA, all the samples analyzed were positive for BoNT-C/D with both methods. These results indicate that the Endopep-MS method is a good alternative to the MBA as the gold standard for BoNT detection based on its sensitivity, selectivity and speed, and because it does not require experimental animals. PMID:25228079

  7. Interlaboratory Validation of the Leaching Environmental Assessment Framework (LEAF) Method 1314 and Method 1315

    EPA Science Inventory

    This report summarizes the results of an interlaboratory study conducted to generate precision estimates for two leaching methods under review by the U.S. EPA’s OSWER for inclusion into the EPA’s SW-846: Method 1314: Liquid-Solid Partitioning as a Function of Liquid...

  8. Development and validation of a molecular size distribution method for polysaccharide vaccines.

    PubMed

    Clément, G; Dierick, J-F; Lenfant, C; Giffroy, D

    2014-01-01

    Determination of the molecular size distribution of vaccine products by high performance size exclusion chromatography coupled to refractive index detection is important during the manufacturing process. Partial elution of high molecular weight compounds in the void volume of the chromatographic column is responsible for variation in the results obtained with a reference method using a TSK G5000PWXL chromatographic column. GlaxoSmithKline Vaccines has developed an alternative method relying on the selection of a different chromatographic column with a wider separation range and the generation of a dextran calibration curve to determine the optimal molecular weight cut-off values for all tested products. Validation of this method was performed according to The International Conference on Harmonisation of Technical Requirements for Registration of Pharmaceuticals for Human Use (ICH). The new method detected product degradation with the same sensitivity as that observed for the reference method. All validation parameters were within the pre-specified range. Precision (relative standard deviation (RSD) of mean values) was < 5 per cent (intra-assay) and < 10 per cent (inter-assay). Sample recovery was > 70 per cent for all polysaccharide conjugates and for the Haemophilus influenzae type B final container vaccine. All results obtained for robustness met the acceptance criteria defined in the validation protocol (≤ 2 × RSD, or ≤ 2 per cent difference between the modified and the reference parameter values when RSD = 0 per cent). The new method was shown to be a suitable quality control method for the release and stability follow-up of polysaccharide-containing vaccines. The new method gave comparable results to the reference method, but with less intra- and inter-assay variability. PMID:25655242
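The dextran calibration curve described above is, in essence, a linear fit of log10(molecular weight) against retention time, from which a molecular weight cut-off can be converted into a retention-time threshold. A minimal sketch with hypothetical function names and data:

```python
import numpy as np

def sec_calibration(ret_times, mol_weights):
    """Linear SEC calibration: fit log10(MW) vs retention time.
    Returns a function mapping retention time -> apparent MW."""
    slope, intercept = np.polyfit(ret_times, np.log10(mol_weights), 1)
    return lambda t: 10 ** (slope * np.asarray(t) + intercept)

def cutoff_time(ret_times, mol_weights, mw_cutoff):
    """Retention time at which the calibration crosses a given MW cut-off."""
    slope, intercept = np.polyfit(ret_times, np.log10(mol_weights), 1)
    return (np.log10(mw_cutoff) - intercept) / slope
```

A real SEC calibration may require a higher-order polynomial over a wide separation range; the linear form is the simplest case.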

  9. Estimating Rooftop Suitability for PV: A Review of Methods, Patents, and Validation Techniques

    SciTech Connect

    Melius, J.; Margolis, R.; Ong, S.

    2013-12-01

    A number of methods have been developed using remote sensing data to estimate rooftop area suitable for the installation of photovoltaics (PV) at various geospatial resolutions. This report reviews the literature and patents on methods for estimating rooftop-area appropriate for PV, including constant-value methods, manual selection methods, and GIS-based methods. This report also presents NREL's proposed method for estimating suitable rooftop area for PV using Light Detection and Ranging (LiDAR) data in conjunction with a GIS model to predict areas with appropriate slope, orientation, and sunlight. NREL's method is validated against solar installation data from New Jersey, Colorado, and California to compare modeled results to actual on-the-ground measurements.

  10. Determination of proline in honey: comparison between official methods, optimization and validation of the analytical methodology.

    PubMed

    Truzzi, Cristina; Annibaldi, Anna; Illuminati, Silvia; Finale, Carolina; Scarponi, Giuseppe

    2014-05-01

    The study compares official spectrophotometric methods for the determination of proline content in honey - those of the International Honey Commission (IHC) and the Association of Official Analytical Chemists (AOAC) - with the original Ough method. Results show that the extra time-consuming treatment stages added by the IHC method with respect to the Ough method are pointless. We demonstrate that the AOAC method proves to be the best in terms of accuracy and time saving. The optimized waiting time for the absorbance recording is set at 35 min from the removal of the reaction tubes from the boiling bath used in the sample treatment. The optimized method was validated in the matrix: linearity up to 1800 mg L(-1), limit of detection 20 mg L(-1), limit of quantification 61 mg L(-1). The method was applied to 43 unifloral honey samples from the Marche region, Italy. PMID:24360478
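Limits of detection and quantification like those quoted above are conventionally derived from a linear calibration curve as LOD = 3.3σ/S and LOQ = 10σ/S, where σ is the residual standard deviation of the fit and S its slope (the ICH approach). A minimal sketch, with hypothetical data:

```python
import numpy as np

def lod_loq(conc, signal):
    """ICH-style detection limits from a linear calibration curve:
    LOD = 3.3*sigma/S, LOQ = 10*sigma/S, where sigma is the residual
    standard deviation of the fit and S the calibration slope."""
    conc, signal = np.asarray(conc, float), np.asarray(signal, float)
    S, b = np.polyfit(conc, signal, 1)
    resid = signal - (S * conc + b)
    sigma = np.sqrt(np.sum(resid**2) / (len(conc) - 2))   # n-2 degrees of freedom
    return 3.3 * sigma / S, 10 * sigma / S
```

Other variants use the standard deviation of a blank rather than the calibration residuals; the abstract does not state which was used here.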

  11. The concurrent validity of three computerized methods of muscle activity onset detection.

    PubMed

    Carter, Sylvester; Gutierrez, Gregory

    2015-10-01

    Although the visual (VIS) method for muscle activation onset detection has been the gold standard, this method has been criticized because of its moderate reproducibility and for being laborious. The simple threshold (STH), approximated generalized likelihood-step (AGL-step), and k-means (KM) algorithms are more repeatable and less laborious but require validation for gait speeds encountered in clinical research. We therefore assessed the intra-rater reliability of the VIS method and the concurrent validity of the algorithms against the VIS method for 3 gait speeds. We recruited 10 healthy young adults (4 male, 6 female; mean age 28.5±4.2). Participants completed 10 walking trials each at 3 speeds. Electromyographic data from 1 gait cycle (GC) were collected from 6 right lower extremity muscles during each trial. We used custom LabVIEW programs to determine muscle activity onset for all 4 methods. Repeatability coefficients for the VIS method ranged from 12.51% to 45.08% of the GC, depending on the muscle. The AGL-step algorithm agreed best with the VIS method (root mean squared error (RMSE) 0.86-6.95% of GC), followed by the STH (1.19-15.6% of GC) and KM (4.6-16.9% of GC) methods. A single rater demonstrated large errors (RMSE 8-23% of GC) between VIS assessments. Based on this study's parameters, the AGL-step agreed best with the VIS method and may be an alternative to the VIS. PMID:26250750
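A simple-threshold (STH) style detector, in the spirit of the algorithms compared above, can be sketched as follows. The baseline window length, the multiplier h, and the minimum-duration parameter are illustrative assumptions, not the study's actual settings:

```python
import numpy as np

def sth_onset(emg, fs, baseline_ms=100, h=3.0, min_ms=25):
    """Simple-threshold onset detection: return the first sample index where
    the rectified signal stays above (baseline mean + h * baseline SD)
    for at least min_ms milliseconds. Returns None if no onset is found."""
    nb = int(baseline_ms * fs / 1000)          # baseline window in samples
    x = np.abs(emg - np.mean(emg[:nb]))        # rectify about the baseline mean
    thr = x[:nb].mean() + h * x[:nb].std()
    run = int(min_ms * fs / 1000)              # required consecutive samples
    above = x > thr
    for i in range(len(x) - run + 1):
        if above[i:i + run].all():
            return i
    return None
```

Practical implementations usually low-pass filter the rectified EMG (a linear envelope) before thresholding; that step is omitted here for brevity.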

  12. Verification and validation of the maximum entropy method of moment reconstruction of energy dependent neutron flux

    NASA Astrophysics Data System (ADS)

    Crawford, Douglas Spencer

    Verification and validation of reconstructed neutron flux based on the maximum entropy method are presented in this paper. The verification is carried out by comparing the neutron flux spectrum from the maximum entropy method with Monte Carlo N-Particle 5 version 1.40 (MCNP5) and Attila-7.1.0-beta (Attila). A spherical 100% 235U critical assembly is modeled as the test case for comparing the three methods. The verification error range for the maximum entropy method is 15% to 23%, with MCNP5 taken as the comparison standard; the Attila relative error for the critical assembly is 20% to 35%. Validation is accomplished by comparison against a neutron flux spectrum back-calculated from foil activation measurements performed in the GODIVA experiment. The error range of the reconstructed flux compared to GODIVA is 0%-10%, that of the MCNP5 spectrum is 0%-20%, and that of Attila is 0%-35%. With respect to the GODIVA experiment, the maximum entropy method for reconstructing flux is thus shown to be a fast, reliable method compared with both Monte Carlo methods (MCNP5) and 30-energy-group methods (Attila).
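The core idea of maximum entropy reconstruction, finding the least-informative spectrum consistent with a set of measured moments, can be sketched on a discrete energy grid. This generic dual gradient-descent solver is our own illustration, not the paper's implementation; the function names and grid are hypothetical:

```python
import numpy as np

def maxent_reconstruct(x, features, mu, lr=1.0, n_iter=5000):
    """Discrete maximum-entropy distribution p on grid x matching the
    moment constraints mu_k = sum_i p_i f_k(x_i). The maxent solution has
    the form p_i proportional to exp(-sum_k lam_k f_k(x_i)); the Lagrange
    multipliers lam are found by gradient descent on the convex dual."""
    F = np.array([f(x) for f in features])   # (K, N) feature matrix
    lam = np.zeros(len(features))
    for _ in range(n_iter):
        w = np.exp(-lam @ F)
        p = w / w.sum()                      # current candidate distribution
        grad = mu - F @ p                    # dual gradient: target - model moments
        lam -= lr * grad
    return p
```

With a single mean constraint the solution is an exponentially tilted distribution; adding more moment constraints (higher powers, group-wise responses) narrows the family toward the measured spectrum, which is the spirit of the moment-reconstruction approach described above.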

  13. HPLC-UV method validation for the identification and quantification of bioactive amines in commercial eggs.

    PubMed

    de Figueiredo, Tadeu Chaves; de Assis, Débora Cristina Sampaio; Menezes, Liliane Denize Miranda; da Silva, Guilherme Resende; Lanza, Isabela Pereira; Heneine, Luiz Guilherme Dias; Cançado, Silvana de Vasconcelos

    2015-09-01

    A quantitative and confirmatory high-performance liquid chromatography with ultraviolet detection (HPLC-UV) method for the determination of bioactive amines in the albumen and yolk of commercial eggs was developed, optimized and validated by analyte extraction with trichloroacetic acid and pre-column derivatization with dansyl chloride. Phenylethylamine, putrescine, cadaverine, histamine, tyramine, spermidine and spermine standards were used to evaluate the following performance parameters: limit of detection (LoD), limit of quantification (LoQ), selectivity, linearity, precision, recovery and ruggedness. The LoD of the method ranged from 0.2 to 0.3 mg kg(-1) for the yolk matrix and from 0.2 to 0.4 mg kg(-1) for the albumen matrix; the LoQ ranged from 0.7 to 1.0 mg kg(-1) for the yolk matrix and from 0.7 to 1.1 mg kg(-1) for the albumen matrix. The validated method exhibited excellent selectivity and separation of all amines, with coefficients of determination higher than 0.99. The obtained recovery values were from 90.5% to 108.3%, and the relative standard deviation (RSD) was lower than 10% under repeatability conditions for the studied analytes. The performance parameters show the validated method to be adequate for the determination of bioactive amines in egg albumen and yolk. PMID:26003718
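LoD and LoQ figures like those above are commonly derived from the calibration curve via the ICH Q2(R1) convention (LoD = 3.3σ/S, LoQ = 10σ/S, with σ the residual SD and S the slope); the sketch below shows that standard calculation, which is not necessarily the exact procedure used in this paper:

```python
import numpy as np

def lod_loq(conc, response):
    """ICH Q2(R1)-style estimates from a linear calibration curve:
    LoD = 3.3*sigma/S and LoQ = 10*sigma/S, where sigma is the residual
    standard deviation of the fit and S is its slope."""
    slope, intercept = np.polyfit(conc, response, 1)
    residuals = response - (slope * conc + intercept)
    sigma = residuals.std(ddof=2)  # two fitted parameters
    return 3.3 * sigma / slope, 10.0 * sigma / slope
```

Note that LoQ/LoD is fixed at 10/3.3 ≈ 3.03 under this convention, which matches the ratios reported in many validation studies.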

  14. Development and Validation of HPTLC Method for the Estimation of Almotriptan Malate in Tablet Dosage Form.

    PubMed

    Suneetha, A; Syamasundar, B

    2010-09-01

    A new, simple, precise and accurate high-performance thin layer chromatographic method has been proposed for the determination of almotriptan malate in a tablet dosage form. The drug was separated on aluminum plates precoated with silica gel 60 GF(254), with butanol:acetic acid:water (3:1:1) as the mobile phase. Quantitative analysis was performed by densitometric scanning at 300 nm. The method was validated for linearity, accuracy, precision and robustness. The calibration plot was linear over the range of 100-700 ng/band for almotriptan malate. The method was successfully applied to the analysis of the drug in a pharmaceutical dosage form. PMID:21694997

  15. Tocopherol and tocotrienol analysis in raw and cooked vegetables: a validated method with emphasis on sample preparation.

    PubMed

    Knecht, Katharina; Sandfuchs, Katja; Kulling, Sabine E; Bunzel, Diana

    2015-02-15

    Vegetables can be important dietary sources of vitamin E. However, data on vitamin E in raw and cooked vegetables are in part conflicting, indicating analytical pitfalls. The purpose of the study was to develop and validate an HPLC-FLD method for tocochromanol (tocopherols and tocotrienols) analysis equally suitable for raw and cooked vegetables. Significant instability of tocochromanols was observed in raw broccoli and carrot homogenates. Tocochromanols could be stabilized by freeze-drying or ascorbic acid addition prior to homogenization. The optimized protocol for tocochromanol analysis included knife and ball milling of freeze-dried vegetable pieces. Direct acetone extraction of vegetable powders allowed for satisfactory recoveries and precision. A significant decrease of tocochromanols in baked compared to raw vegetables was shown, the extent of which varied widely between vegetables. For some raw vegetables, such as spinach or broccoli, underestimation of vitamin E in nutrient databases cannot be ruled out and should be examined. PMID:25236193

  16. Method for Pre-Conditioning a Measured Surface Height Map for Model Validation

    NASA Technical Reports Server (NTRS)

    Sidick, Erkin

    2012-01-01

    This software allows one to up-sample or down-sample a measured surface map for model validation, not only without introducing any re-sampling errors, but also while eliminating the existing measurement noise and measurement errors. Because the re-sampling of a surface map is accomplished based on the analytical expressions of Zernike polynomials and a power spectral density model, such re-sampling does not introduce the aliasing and interpolation errors produced by conventional interpolation and the FFT-based (fast-Fourier-transform-based) spatial-filtering method. Also, this new method automatically eliminates the measurement noise and other measurement errors such as artificial discontinuity. The developmental cycle of an optical system, such as a space telescope, includes, but is not limited to, the following two steps: (1) deriving requirements or specs on the optical quality of individual optics before they are fabricated, through optical modeling and simulations, and (2) validating the optical model using the measured surface height maps after all optics are fabricated. There are a number of computational issues related to model validation, one of which is the "pre-conditioning" or pre-processing of the measured surface maps before using them in a model validation software tool. This software addresses the following issues: (1) up- or down-sampling a measured surface map to match the gridded data format of a model validation tool, and (2) eliminating the surface measurement noise or measurement errors such that the resulting surface height map is continuous or smoothly varying. So far, the preferred method used for re-sampling a surface map has been two-dimensional interpolation. The main problem with this method is that the same pixel can take different values when the method of interpolation is changed among options such as "nearest," "linear," "cubic," and "spline" fitting in MATLAB.
The conventional, FFT-based spatial filtering method used to
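The Zernike part of the re-sampling can be sketched as an ordinary least-squares fit of low-order disk polynomials to the measured map, followed by analytic evaluation on the new grid; because the evaluation is analytic, no interpolation is involved. The six-term basis here is an illustrative subset, and the power-spectral-density component of the method is omitted entirely:

```python
import numpy as np

def zernike_basis(x, y):
    """A few low-order Zernike-like terms on the unit disk (illustrative):
    piston, x-tilt, y-tilt, defocus, and the two primary astigmatisms."""
    r2 = x**2 + y**2
    return np.stack([np.ones_like(x), x, y, 2 * r2 - 1,
                     x**2 - y**2, 2 * x * y], axis=-1)

def resample_surface(z_meas, n_out):
    """Fit the basis to the measured n x n map (inside the unit disk) by
    least squares, then evaluate analytically on an n_out x n_out grid."""
    n = z_meas.shape[0]
    xs = np.linspace(-1, 1, n)
    X, Y = np.meshgrid(xs, xs)
    mask = X**2 + Y**2 <= 1
    A = zernike_basis(X[mask], Y[mask])
    coef, *_ = np.linalg.lstsq(A, z_meas[mask], rcond=None)
    xo = np.linspace(-1, 1, n_out)
    Xo, Yo = np.meshgrid(xo, xo)
    Zo = zernike_basis(Xo, Yo) @ coef
    Zo[Xo**2 + Yo**2 > 1] = np.nan  # undefined outside the aperture
    return Zo
```

Because the fitted coefficients, not pixel values, carry the surface to the new grid, every output resolution is mutually consistent, unlike the interpolation-dependent results the abstract describes.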

  17. Validation of a partial coherence interferometry method for estimating retinal shape

    PubMed Central

    Verkicharla, Pavan K.; Suheimat, Marwan; Pope, James M.; Sepehrband, Farshid; Mathur, Ankit; Schmid, Katrina L.; Atchison, David A.

    2015-01-01

    To validate a simple partial coherence interferometry (PCI) based retinal shape method, estimates of retinal shape were determined in 60 young adults using off-axis PCI, with three stages of modeling using variants of the Le Grand model eye, and magnetic resonance imaging (MRI). Stages 1 and 2 involved a basic model eye without and with surface ray deviation, respectively, and Stage 3 used a model with individual ocular biometry and ray deviation at surfaces. Considering the theoretical uncertainty of MRI (12-14%), the results of the study indicate good agreement between MRI and all three stages of PCI modeling, with <4% and <7% differences in retinal shapes along the horizontal and vertical meridians, respectively. Stages 2 and 3 gave slightly different retinal co-ordinates than Stage 1, and we recommend the intermediate Stage 2 as providing a simple and valid method of determining retinal shape from PCI data. PMID:26417496

  18. Validation Study for Analytical Method of Diarrhetic Shellfish Poisons in 9 Kinds of Shellfish.

    PubMed

    Yamaguchi, Mizuka; Yamaguchi, Takahiro; Kakimoto, Kensaku; Nagayoshi, Haruna; Okihashi, Masahiro; Kajimura, Keiji

    2016-01-01

    A method was developed for the simultaneous determination of okadaic acid, dinophysistoxin-1 and dinophysistoxin-2 in shellfish using ultra-performance liquid chromatography with tandem mass spectrometry. Shellfish poisons in spiked samples were extracted with methanol and 90% methanol, and were hydrolyzed with 2.5 mol/L sodium hydroxide solution. Purification was done on an HLB solid-phase extraction column. The method was validated in accordance with the notification of the Ministry of Health, Labour and Welfare of Japan. In the validation study in nine kinds of shellfish, the trueness was 79-101%, and the repeatability and within-laboratory reproducibility were less than 12% and 16%, respectively. The trueness and precision met the target values of the notification. PMID:26936305
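Trueness and repeatability figures of this kind reduce to two simple statistics over replicate spiked results; the sketch below shows the generic calculation (within-laboratory reproducibility would additionally require a nested day/analyst design, which is omitted):

```python
import numpy as np

def validation_stats(measured, spiked):
    """Trueness (%) and repeatability RSDr (%) from replicate results on a
    sample spiked at a known concentration; a generic calculation, not the
    MHLW protocol's exact formulas."""
    m = np.asarray(measured, float)
    trueness = 100.0 * m.mean() / spiked          # mean recovery
    rsd_r = 100.0 * m.std(ddof=1) / m.mean()      # repeatability RSD
    return trueness, rsd_r
```

For example, replicates of 9, 10 and 11 at a 10 mg/kg spike give 100% trueness and a 10% repeatability RSD.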

  19. Validity of Standard Measures of Family Planning Service Quality: Findings from the Simulated Client Method

    PubMed Central

    Tumlinson, Katherine; Speizer, Ilene S.; Curtis, Sian L.; Pence, Brian W.

    2014-01-01

    Despite widespread endorsement within the field of international family planning of the importance of quality of care as a reproductive right, the field has yet to develop validated data collection instruments to accurately assess quality in terms of its public health importance. This study, conducted among 19 higher-volume public and private facilities in Kisumu, Kenya, used the simulated client method to test the validity of three standard data collection instruments included in large-scale facility surveys: provider interviews, client interviews, and observation of client-provider interactions. The results showed low specificity and positive predictive values for a number of quality indicators in each of the three instruments, suggesting that quality of care may be overestimated by traditional methods. Revised approaches to measuring family planning service quality may be needed to ensure accurate assessment of programs and to better inform quality improvement interventions. PMID:25469929
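The validity measures cited (specificity and positive predictive value) come from a standard 2x2 comparison of each instrument's classification against the simulated-client observation treated as the reference; a minimal sketch:

```python
def specificity_ppv(tp, fp, tn, fn):
    """Specificity = TN/(TN+FP); positive predictive value = TP/(TP+FP),
    treating the simulated-client observation as the reference standard.
    Counts are per quality indicator."""
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)
    return specificity, ppv
```

A high false-positive count (providers or clients reporting a service element the simulated client did not observe) drives both measures down, which is exactly the pattern the abstract interprets as overestimation of quality.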

  20. Comparison of Assertive Community Treatment Fidelity Assessment Methods: Reliability and Validity.

    PubMed

    Rollins, Angela L; McGrew, John H; Kukla, Marina; McGuire, Alan B; Flanagan, Mindy E; Hunt, Marcia G; Leslie, Doug L; Collins, Linda A; Wright-Berryman, Jennifer L; Hicks, Lia J; Salyers, Michelle P

    2016-03-01

    Assertive community treatment is known for improving consumer outcomes, but is difficult to implement. On-site fidelity measurement can help ensure model adherence, but is costly in large systems. This study compared reliability and validity of three methods of fidelity assessment (on-site, phone-administered, and expert-scored self-report) using a stratified random sample of 32 mental health intensive case management teams from the Department of Veterans Affairs. Overall, phone, and to a lesser extent, expert-scored self-report fidelity assessments compared favorably to on-site methods in inter-rater reliability and concurrent validity. If used appropriately, these alternative protocols hold promise in monitoring large-scale program fidelity with limited resources. PMID:25721146

  1. Validation of a partial coherence interferometry method for estimating retinal shape.

    PubMed

    Verkicharla, Pavan K; Suheimat, Marwan; Pope, James M; Sepehrband, Farshid; Mathur, Ankit; Schmid, Katrina L; Atchison, David A

    2015-09-01

    To validate a simple partial coherence interferometry (PCI) based retinal shape method, estimates of retinal shape were determined in 60 young adults using off-axis PCI, with three stages of modeling using variants of the Le Grand model eye, and magnetic resonance imaging (MRI). Stage 1 and 2 involved a basic model eye without and with surface ray deviation, respectively and Stage 3 used model with individual ocular biometry and ray deviation at surfaces. Considering the theoretical uncertainty of MRI (12-14%), the results of the study indicate good agreement between MRI and all three stages of PCI modeling with <4% and <7% differences in retinal shapes along horizontal and vertical meridians, respectively. Stage 2 and Stage 3 gave slightly different retinal co-ordinates than Stage 1 and we recommend the intermediate Stage 2 as providing a simple and valid method of determining retinal shape from PCI data. PMID:26417496

  2. Validated spectrophotometric methods for simultaneous determination of troxerutin and carbazochrome in dosage form

    NASA Astrophysics Data System (ADS)

    Khattab, Fatma I.; Ramadan, Nesrin K.; Hegazy, Maha A.; Al-Ghobashy, Medhat A.; Ghoniem, Nermine S.

    2015-03-01

    Four simple, accurate, sensitive and precise spectrophotometric methods were developed and validated for simultaneous determination of Troxerutin (TXN) and Carbazochrome (CZM) in their bulk powders, laboratory-prepared mixtures and pharmaceutical dosage forms. Method A is first-derivative spectrophotometry (D1), in which TXN and CZM were determined at 294 and 483.5 nm, respectively. Method B is first derivative of ratio spectra (DD1), in which the peak amplitudes at 248 nm for TXN and 439 nm for CZM were used for their determination. Method C is ratio subtraction (RS), in which TXN was determined at its λmax (352 nm) in the presence of CZM, which was determined by D1 at 483.5 nm. Method D is mean centering of the ratio spectra (MCR), in which the mean-centered values at 300 nm and 340.0 nm were used for the two drugs, respectively. The two compounds were simultaneously determined in the concentration ranges of 5.00-50.00 μg mL-1 and 0.5-10.0 μg mL-1 for TXN and CZM, respectively. The methods were validated according to the ICH guidelines and the results were statistically compared to the manufacturer's method.

  3. Challenges in the analytical method development and validation for an unstable active pharmaceutical ingredient.

    PubMed

    Sajonz, Peter; Wu, Yan; Natishan, Theresa K; McGachy, Neil T; Detora, David

    2006-03-01

    A sensitive high-performance liquid chromatography (HPLC) impurity profile method for the antibiotic ertapenem is developed and subsequently validated. The method utilizes an Inertsil phenyl column at ambient temperature, gradient elution with aqueous sodium phosphate buffer at pH 8, and acetonitrile as the mobile phase. The linearity, method precision, method ruggedness, limit of quantitation, and limit of detection of the impurity profile HPLC method are found to be satisfactory. The method is determined to be specific, as judged by resolving ertapenem from in-process impurities in crude samples and degradation products that arise from solid state thermal and light stress, acid, base, and oxidative stressed solutions. In addition, evidence is obtained by photodiode array detection studies that no degradate or impurity having a different UV spectrum coeluted with the major component in stressed or unstressed samples. The challenges during the development and validation of the method are discussed. The difficulties of analyzing an unstable active pharmaceutical ingredient (API) are addressed. Several major impurities/degradates of the API have very different UV response factors from the API. These impurities/degradates are synthesized or prepared by controlled degradation and the relative response factors are determined. PMID:16620508

  4. Experimental Validation of Normalized Uniform Load Surface Curvature Method for Damage Localization

    PubMed Central

    Jung, Ho-Yeon; Sung, Seung-Hoon; Jung, Hyung-Jo

    2015-01-01

    In this study, we experimentally validated the normalized uniform load surface (NULS) curvature method, which has been developed recently to assess damage localization in beam-type structures. The normalization technique allows for the accurate assessment of damage localization with greater sensitivity irrespective of the damage location. In this study, damage to a simply supported beam was numerically and experimentally investigated on the basis of the changes in the NULS curvatures, which were estimated from the modal flexibility matrices obtained from the acceleration responses under an ambient excitation. Two damage scenarios were considered, a single-damage case and a multiple-damage case, created by reducing the bending stiffness (EI) of the affected element(s). Numerical simulations were performed using MATLAB as a preliminary step. During the validation experiments, a series of tests were performed. It was found that the damage locations could be identified successfully without any false-positive or false-negative detections using the proposed method. For comparison, the damage detection performances were compared with those of two other well-known methods based on the modal flexibility matrix, namely, the uniform load surface (ULS) method and the ULS curvature method. It was confirmed that the proposed method is more effective for investigating the damage locations of simply supported beams than the two conventional methods in terms of sensitivity to damage under measurement noise. PMID:26501286
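The pipeline the abstract describes (modal flexibility from identified modes, deflection under a uniform load, normalization, then curvature) can be sketched as below. The max-abs normalization step is an assumption; the paper's exact normalization may differ:

```python
import numpy as np

def nuls_curvature(phi, omega, dx):
    """Damage-localization index from modal data (illustrative sketch).
    phi: (n_dof, n_modes) mass-normalized mode shapes; omega: natural
    frequencies in rad/s; dx: sensor spacing along the beam."""
    F = (phi / omega**2) @ phi.T          # modal flexibility matrix
    uls = F.sum(axis=1)                   # deflection under a uniform unit load
    nuls = uls / np.abs(uls).max()        # normalization (max-abs scaling assumed)
    return np.gradient(np.gradient(nuls, dx), dx)  # curvature via 2nd derivative
```

Localized damage softens the beam locally, so a sharp change in this curvature profile relative to the baseline state flags the damaged element.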

  5. The convergent validity between two objective methods for quantifying training load in young taekwondo athletes.

    PubMed

    Haddad, Monoem; Chaouachi, Anis; Castagna, Carlo; Wong, Del P; Chamari, Karim

    2012-01-01

    Various studies have used objective heart rate (HR)-based methods to assess training load (TL). The common methods are Banister's Training Impulse (TRIMP; which weights the duration using an exponential weighting factor) and Edwards' TL (a summated HR-zone score). Both methods use the direct physiological measure of HR as a fundamental part of the calculation. To eliminate the redundancy of using various methods to quantify the same construct (i.e., TL), we have to verify whether these methods are strongly convergent and interchangeable. Therefore, the aim of this study was to investigate the convergent validity between Banister's TRIMP and Edwards' TL for the assessment of internal TL. HRs were recorded and analyzed during 10 training weeks of the preseason period in 10 male Taekwondo (TKD) athletes. The TL was calculated using Banister's TRIMP and Edwards' TL. The Pearson product-moment correlation coefficient was used to evaluate the convergent validity between the 2 methods. Very large to nearly perfect relationships were found between individual Banister's TRIMP and Edwards' TL values (r values from 0.80 to 0.99; p < 0.001). Pooled Banister's TRIMP and pooled Edwards' TL (pooled data n = 284) were very largely correlated (r = 0.89; p < 0.05; 95% confidence interval: 0.86-0.91). In conclusion, these findings suggest that these 2 objective methods, measuring a similar construct, are interchangeable. PMID:21904234
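The two load measures can be sketched from a per-minute heart-rate series. The a = 0.64 and b = 1.92 constants are the commonly cited male Banister weights, and the zone cut-offs are Edwards' five 10%-of-HRmax bands; both are assumptions about the exact variants used in this study:

```python
import numpy as np

def banister_trimp(hr_per_min, hr_rest, hr_max, a=0.64, b=1.92):
    """Banister's TRIMP: per-minute sum of dHR-ratio * a * exp(b * dHR-ratio).
    a and b are the commonly cited male constants (assumed here)."""
    dhr = (np.asarray(hr_per_min, float) - hr_rest) / (hr_max - hr_rest)
    return float(np.sum(dhr * a * np.exp(b * dhr)))

def edwards_tl(hr_per_min, hr_max):
    """Edwards' summated HR-zone score: minutes spent in the 50-60% ...
    90-100% of HRmax bands, weighted 1 through 5."""
    pct = np.asarray(hr_per_min, float) / hr_max
    zones = [(0.5, 0.6), (0.6, 0.7), (0.7, 0.8), (0.8, 0.9), (0.9, 1.0001)]
    return float(sum(w * np.count_nonzero((pct >= lo) & (pct < hi))
                     for w, (lo, hi) in enumerate(zones, start=1)))
```

For a 60-minute session at a steady 150 bpm (rest 60, max 200), Edwards' score is 3 x 60 = 180; both measures grow with duration and intensity, which is why they correlate so strongly in practice.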

  6. Reliability and validity of eight dental age estimation methods for adults.

    PubMed

    Soomer, Helena; Ranta, Helena; Lincoln, Michael J; Penttilä, Antti; Leibur, Edvitar

    2003-01-01

    This paper evaluates the reliability and validity of eight published dental age estimation methods for adults that may aid in victim identification. Age was calculated on 20 Caucasian teeth of known age according to the methods of Kvaal (for in situ and extracted teeth), Solheim (for in situ and sectioned teeth), Lamendin (for extracted teeth), Johanson (for sectioned teeth) and Bang (for extracted and sectioned teeth) by one independent observer. For each method, mean age error and standard error were assessed as the measures of accuracy and precision. In addition, method simplicity, requirements for tooth preparation and the equipment necessary were assessed and recommendations given for forensic use in various situations. Methods for sectioned teeth gave more reliable results when compared to methods for intact teeth. PMID:12570217

  7. Experimental validation of a modal flexibility-based damage detection method for a cyber-physical system

    NASA Astrophysics Data System (ADS)

    Martinez-Castro, Rosana E.; Eskew, Edward L.; Jang, Shinae

    2014-03-01

    The detection and localization of damage in a timely manner is critical in order to avoid the failure of structures. When a structure is subjected to an unscheduled impulsive force, the resulting damage can lead to failure in a very short period of time. As such, a monitoring strategy that can adapt to variability in the environment and that anticipates changes in physical processes has the potential to detect, locate and mitigate damage. These requirements can be met by a cyber-physical system (CPS) equipped with Wireless Smart Sensor Network (WSSN) systems that is capable of measuring and analyzing dynamic responses in real time using on-board, in-network processing. The Eigenparameter Decomposition of Structural Flexibility Change (ED) Method is validated with real data and is considered for use in the computational core of this CPS. The condition screening is implemented on a damaged structure and compared to an original baseline calculation, hence providing a supervised learning environment. An experimental laboratory study on a 5-story shear building with three damage conditions subjected to an impulsive force was chosen to validate the effectiveness of the proposed method in locating and quantifying the extent of damage. A numerical simulation of the same building subjected to band-limited white noise was also developed for this purpose. The effectiveness of the ED Method in locating damage is compared to that of the Damage Index Method. With some modifications, the ED Method is capable of locating and quantifying damage satisfactorily in a shear building subjected to a predominantly lower-frequency-content excitation.

  8. A Comparative Study of Methods To Validate Formaldehyde Decontamination of Biological Safety Cabinets

    PubMed Central

    Munro, Kerry; Lanser, Janice; Flower, Robert

    1999-01-01

    Methods of validation of formaldehyde decontamination of biological safety cabinets were compared. Decontamination of metal strips inoculated with Mycobacterium bovis, poliovirus, or Bacillus spp. spores was compared with the results obtained with three biological indicators. Conditions for successful decontamination, particularly relative humidity, were defined. The Attest 1291 biological indicator was the only biological indicator which was an aid in the detection of gross decontamination failure. PMID:9925635

  9. Experimental validation of finite element and boundary element methods for predicting structural vibration and radiated noise

    NASA Technical Reports Server (NTRS)

    Seybert, A. F.; Wu, T. W.; Wu, X. F.

    1994-01-01

    This research report is presented in three parts. In the first part, acoustical analyses were performed on modes of vibration of the housing of a transmission of a gear test rig developed by NASA. The modes of vibration of the transmission housing were measured using experimental modal analysis. The boundary element method (BEM) was used to calculate the sound pressure and sound intensity on the surface of the housing and the radiation efficiency of each mode. The radiation efficiency of each of the transmission housing modes was then compared to theoretical results for a finite baffled plate. In the second part, analytical and experimental validation of methods to predict structural vibration and radiated noise are presented. A rectangular box excited by a mechanical shaker was used as a vibrating structure. Combined finite element method (FEM) and boundary element method (BEM) models of the apparatus were used to predict the noise level radiated from the box. The FEM was used to predict the vibration, while the BEM was used to predict the sound intensity and total radiated sound power using surface vibration as the input data. Vibration predicted by the FEM model was validated by experimental modal analysis; noise predicted by the BEM was validated by measurements of sound intensity. Three types of results are presented for the total radiated sound power: sound power predicted by the BEM model using vibration data measured on the surface of the box; sound power predicted by the FEM/BEM model; and sound power measured by an acoustic intensity scan. In the third part, the structure used in part two was modified. A rib was attached to the top plate of the structure. The FEM and BEM were then used to predict structural vibration and radiated noise respectively. The predicted vibration and radiated noise were then validated through experimentation.

  10. Determination of paraquat and diquat: LC-MS method optimization and validation.

    PubMed

    Pizzutti, Ionara R; Vela, Giovana M E; de Kok, André; Scholten, Jos M; Dias, Jonatan V; Cardoso, Carmem D; Concenço, Germani; Vivian, Rafael

    2016-10-15

    This study describes the optimization and single-laboratory validation of a single-residue method for the determination of two bipyridylium herbicides, paraquat and diquat, in cowpeas by UPLC-MS/MS in a total run time of 9.3 min. The method is based on extraction with an acidified methanol-water mixture. Different extraction parameters (extraction solvent composition, temperature, sample extract filtration, and pre-treatment of the laboratory sample) were evaluated in order to optimize the extraction method efficiency. Isotopically labeled internal standards, Paraquat-D6 and Diquat-D4, were used and added to the test portions prior to extraction. The method validation was performed by analyzing spiked samples at three concentrations (10, 20 and 50 μg kg(-1)), with seven replicates (n = 7) for each concentration. Linearity (r(2)) of analytical curves, accuracy (trueness as recovery % and precision as RSD %), instrument and method limits of detection and quantification (LOD and LOQ) and matrix effects were determined. Average recoveries obtained for diquat were between 77 and 85%, with RSD values ≤20% for all spike levels studied. On the other hand, paraquat showed average recoveries between 68 and 103%, with RSDs in the range 14.4-25.4%. The method LOQ was 10 and 20 μg kg(-1) for diquat and paraquat, respectively. The matrix effect was significant for both pesticides. Consequently, matrix-matched calibration standards and isotopically labeled (IL) analogues as internal standards for the target analytes are required for application in routine analysis. The validated method was successfully applied to cowpea samples obtained from various field studies. PMID:27173559

  11. Validation of analysis methods for assessing flawed piping subjected to dynamic loading

    SciTech Connect

    Olson, R.J.; Wolterman, R.L.; Wilkowski, G.M.; Kot, C.A.

    1994-08-01

    Argonne National Laboratory and Battelle have jointly conducted a research program for the USNRC to evaluate the ability of current engineering analysis methods and one state-of-the-art analysis method to predict the behavior of a circumferentially surface-cracked pipe system in a water-hammer experiment. The experimental data used in the evaluation were from the HDR Test Group E31 series conducted by the Kernforschungszentrum Karlsruhe (KfK) in Germany. The incentive for this evaluation was that simplified engineering methods, as well as newer "state-of-the-art" fracture analysis methods, have typically been validated only with static experimental data. Hence, these dynamic experiments were of high interest. High-rate dynamic loading can be classified as either repeating, e.g., seismic, or nonrepeating, e.g., water hammer. Development of experimental data and validation of cracked-pipe analyses under seismic loading (repeating dynamic loads) are being pursued separately within the NRC's International Piping Integrity Research Group (IPIRG) program. This report describes developmental and validation efforts to predict crack stability under water-hammer loading, as well as comparisons using currently used analysis procedures. Current fracture analysis methods use elastic stress analysis loads decoupled from the fracture mechanics analysis, while state-of-the-art methods employ nonlinear cracked-pipe time-history finite element analyses. The results showed that the current decoupled methods were conservative in their predictions, whereas the cracked-pipe finite element analyses were more accurate, yet slightly conservative. The nonlinear time-history cracked-pipe finite element analyses conducted in this program were also attractive in that they were done on a small Apollo DN5500 workstation, whereas other cracked-pipe dynamic analyses conducted in Europe on the same experiments required the use of a CRAY2 supercomputer and were less accurate.

  12. Capillary isoelectric focusing method development and validation for investigation of recombinant therapeutic monoclonal antibody.

    PubMed

    Suba, Dávid; Urbányi, Zoltán; Salgó, András

    2015-10-10

    Capillary isoelectric focusing (cIEF) is a basic and highly accurate routine analytical tool for proving the identity of protein drugs in quality control (QC) and release tests in the biopharmaceutical industry. Commercial "out-of-the-box" kits provide easy and rapid isoelectric focusing of monoclonal antibody drug proteins, but their use in routine testing is costly. A capillary isoelectric focusing method was therefore developed and validated for identification testing of monoclonal antibody drug products with isoelectric points between 7.0 and 9.0. The method provides a good pH gradient for internal calibration (R(2)>0.99) and good resolution between all of the isoform peaks (R=2), while minimizing the time and complexity of sample preparation (no urea or salt used). The method is highly reproducible and suitable for validation and method transfer to any QC laboratory. A further advantage is that it uses commercially available chemicals that can be purchased from any supplier. Interaction with the capillary walls was minimized (avoiding precipitation and adsorption as far as possible), and synthetic small-molecule isoelectric point markers were used instead of peptide- or protein-based markers. The developed method was validated according to the recent ICH guideline (Q2(R1)). Relative standard deviation results were below 0.2% for isoelectric points and below 4% for the normalized migration times. The method is robust to buffer components from different lots and to neutral capillaries with different types of inner coating. The fluorocarbon-coated column was chosen for cost-effectiveness. PMID:26025812
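The internal calibration described (a pH gradient with R² > 0.99) amounts to a linear fit of marker pI against migration time, after which sample peaks are converted to isoelectric points by interpolation; a generic sketch, not the paper's software:

```python
import numpy as np

def pi_from_markers(marker_times, marker_pis, sample_time):
    """Fit pI vs. migration time for the pI markers, return the
    interpolated pI of a sample peak and the R^2 of the calibration line."""
    marker_times = np.asarray(marker_times, float)
    marker_pis = np.asarray(marker_pis, float)
    slope, intercept = np.polyfit(marker_times, marker_pis, 1)
    fit = slope * marker_times + intercept
    ss_res = np.sum((marker_pis - fit) ** 2)
    ss_tot = np.sum((marker_pis - marker_pis.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot
    return slope * sample_time + intercept, r2
```

An R² near 1 confirms the gradient is linear over the focusing window, which is the acceptance criterion the abstract quotes.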

  13. Validation of finite element and boundary element methods for predicting structural vibration and radiated noise

    NASA Technical Reports Server (NTRS)

    Seybert, A. F.; Wu, X. F.; Oswald, Fred B.

    1992-01-01

    Analytical and experimental validation of methods to predict structural vibration and radiated noise are presented. A rectangular box excited by a mechanical shaker was used as a vibrating structure. Combined finite element method (FEM) and boundary element method (BEM) models of the apparatus were used to predict the noise radiated from the box. The FEM was used to predict the vibration, and the surface vibration was used as input to the BEM to predict the sound intensity and sound power. Vibration predicted by the FEM model was validated by experimental modal analysis. Noise predicted by the BEM was validated by sound intensity measurements. Three types of results are presented for the total radiated sound power: (1) sound power predicted by the BEM modeling using vibration data measured on the surface of the box; (2) sound power predicted by the FEM/BEM model; and (3) sound power measured by a sound intensity scan. The sound power predicted from the BEM model using measured vibration data yields an excellent prediction of radiated noise. The sound power predicted by the combined FEM/BEM model also gives a good prediction of radiated noise except for a shift of the natural frequencies that are due to limitations in the FEM model.

  14. A validated method for analysis of Swerchirin in Swertia longifolia Boiss. by high performance liquid chromatography

    PubMed Central

    Shekarchi, M.; Hajimehdipoor, H.; Khanavi, M.; Adib, N.; Bozorgi, M.; Akbari-Adergani, B.

    2010-01-01

    Swertia spp. (Gentianaceae) grow widely in eastern and southern Asian countries and are used in traditional medicine for gastrointestinal disorders. Swerchirin, one of the xanthones in Swertia spp., has many pharmacological properties, such as antimalarial, antihepatotoxic, and hypoglycemic effects. Because of its pharmacological importance, in this investigation Swerchirin was purified from Swertia longifolia Boiss. as one of the main components and quantified by means of a validated high performance liquid chromatography (HPLC) technique. Aerial parts of the plant were extracted with 80% acetone. Phenolic and non-phenolic constituents of the extract were separated from each other during several processes. The phenolic fraction was injected into the semi-preparative HPLC system, which consisted of a C18 column and gradient elution with methanol and 0.1% formic acid. Using this method, we were able to purify six xanthones from the plant, in order to use them as standard materials. The analytical method was validated for Swerchirin, one of the most pharmacologically important components of the plant, with respect to validation parameters such as selectivity, linearity (r2 > 0.9998), precision (≤3.3%), and accuracy, which was measured by determination of recovery (98-107%). The limits of detection and quantitation were found to be 2.1 and 6.3 μg/mL, respectively. On account of its speed and accuracy, the UV-HPLC method may be used for quantitative analysis of Swerchirin. PMID:20548931

  15. HPLC method development, validation, and impurity characterization of a potent antitumor indenoisoquinoline, LMP776 (NSC 725776).

    PubMed

    Wang, Jennie; Liu, Mingtao; Yang, Chun; Wu, Xiaogang; Wang, Euphemia; Liu, Paul

    2016-05-30

    An HPLC method for the assay of a DNA topoisomerase inhibitor, LMP776 (NSC 725776), has been developed and validated. The stress testing of LMP776 was carried out in accordance with International Conference on Harmonization (ICH) guidelines Q1A (R2) under acidic, alkaline, oxidative, thermolytic, and photolytic conditions. The separation of LMP776 from its impurities and degradation products was achieved within 40 min on a Supelco Discovery HS F5 column (150 mm × 4.6 mm i.d., 5 μm) with a gradient mobile phase comprising 38-80% acetonitrile in water, with 0.1% trifluoroacetic acid in both phases. LC/MS was used to obtain mass data for characterization of impurities and degradation products. One major impurity was isolated through chloroform extraction and identified by NMR. The proposed HPLC assay method was validated for specificity, linearity (concentration range 0.25-0.75 mg/mL, r = 0.9999), accuracy (recovery 98.6-100.4%), precision (RSD ≤ 1.4%), and sensitivity (LOD 0.13 μg/mL). The validated method was used in the stability study of the LMP776 drug substance in conformance with the ICH Q1A (R2) guideline. PMID:26970596

  16. Pressure ulcer prevention algorithm content validation: a mixed-methods, quantitative study.

    PubMed

    van Rijswijk, Lia; Beitz, Janice M

    2015-04-01

    Translating pressure ulcer prevention (PUP) evidence-based recommendations into practice remains challenging for a variety of reasons, including the perceived quality, validity, and usability of the research or the guideline itself. Following the development and face validation testing of an evidence-based PUP algorithm, additional stakeholder input and testing were needed. Using convenience sampling methods, wound care experts attending a national wound care conference and a regional wound ostomy continence nursing (WOCN) conference and/or graduates of a WOCN program were invited to participate in an Institutional Review Board-approved, mixed-methods quantitative survey with qualitative components to examine algorithm content validity. After participants provided written informed consent, demographic variables were collected and participants were asked to comment on and rate the relevance and appropriateness of each of the 26 algorithm decision points/steps using standard content validation study procedures. All responses were anonymous. Descriptive summary statistics, mean relevance/appropriateness scores, and the content validity index (CVI) were calculated. Qualitative comments were transcribed and thematically analyzed. Of the 553 wound care experts invited, 79 (average age 52.9 years, SD 10.1; range 23-73) consented to participate and completed the study (a response rate of 14%). Most (67, 85%) were female, registered (49, 62%) or advanced practice (12, 15%) nurses, and had > 10 years of health care experience (88, 92%). Other health disciplines included medical doctors, physical therapists, nurse practitioners, and certified nurse specialists. Almost all had received formal wound care education (75, 95%). On a Likert-type scale of 1 (not relevant/appropriate) to 4 (very relevant and appropriate), the average score for the entire algorithm/all decision points (N = 1,912) was 3.72 with an overall CVI of 0.94 (out of 1). 
The only decision point/step recommendation
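    The content validity index reported above has a standard computation: each item's CVI is the proportion of experts rating it 3 or 4 on the 4-point relevance scale, and the scale-level CVI averages those proportions across items. A minimal sketch with invented ratings (not the study's data):

```python
def item_cvi(ratings):
    """Proportion of raters scoring an item 3 or 4 (relevant/very relevant)."""
    return sum(1 for r in ratings if r >= 3) / len(ratings)

def scale_cvi(items):
    """Scale-level CVI: average of item-level CVIs across all items."""
    return sum(item_cvi(r) for r in items) / len(items)

# Invented ratings: each inner list is one decision point, one entry per expert
ratings_by_item = [
    [4, 4, 3, 4],   # decision point 1
    [3, 4, 4, 2],   # decision point 2
    [4, 3, 4, 4],   # decision point 3
]
print(round(scale_cvi(ratings_by_item), 2))   # 0.92
```

A CVI of 0.94 over 26 decision points, as reported above, means nearly all expert ratings fell in the 3-4 band.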

  17. Methodology of method comparison studies evaluating the validity of cardiac output monitors: a stepwise approach and checklist.

    PubMed

    Montenij, L J; Buhre, W F; Jansen, J R; Kruitwagen, C L; de Waal, E E

    2016-06-01

    The validity of each new cardiac output (CO) monitor should be established before implementation in clinical practice. For this purpose, method comparison studies investigate the accuracy and precision against a reference technique. With the emergence of continuous CO monitors, the ability to detect changes in CO, in addition to its absolute value, has gained interest. Therefore, method comparison studies increasingly include assessment of trending ability in the data analysis. A number of methodological challenges arise in method comparison research with respect to the application of Bland-Altman and trending analysis. Failure to face these methodological challenges will lead to misinterpretation and erroneous conclusions. We therefore review the basic principles and pitfalls of Bland-Altman analysis in method comparison studies concerning new CO monitors. In addition, the concept of clinical concordance is introduced to evaluate trending ability from a clinical perspective. The primary scope of this review is to provide a complete overview of the pitfalls in CO method comparison research, whereas other publications focused on a single aspect of the study design or data analysis. This leads to a stepwise approach and checklist for a complete data analysis and data representation. PMID:27199309
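    The core Bland-Altman quantities the review discusses, bias and limits of agreement, reduce to a few lines; real CO method comparison studies add the refinements the review catalogs (repeated-measures corrections, percentage error, trending analysis) that this sketch, with invented paired readings, omits:

```python
from statistics import mean, stdev

def bland_altman(reference, test):
    """Bias (mean of paired differences) and 95% limits of
    agreement (bias +/- 1.96 SD of the differences)."""
    diffs = [t - r for r, t in zip(reference, test)]
    bias = mean(diffs)
    half_width = 1.96 * stdev(diffs)
    return bias, bias - half_width, bias + half_width

# Invented paired cardiac output readings, L/min
ref_co  = [4.8, 5.2, 6.1, 5.5, 4.9]   # reference technique
test_co = [5.0, 5.1, 6.4, 5.8, 5.0]   # new monitor
bias, lower, upper = bland_altman(ref_co, test_co)
print(round(bias, 2))   # 0.16
```

Whether such a bias and interval are acceptable is a clinical judgment, which is exactly the pitfall the checklist approach above is meant to formalize.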

  18. Quantitative validation of the 3D SAR profile of hyperthermia applicators using the gamma method.

    PubMed

    de Bruijne, Maarten; Samaras, Theodoros; Chavannes, Nicolas; van Rhoon, Gerard C

    2007-06-01

    For quality assurance of hyperthermia treatment planning systems, quantitative validation of the electromagnetic model of an applicator is essential. The objective of this study was to validate a finite-difference time-domain (FDTD) model implementation of the Lucite cone applicator (LCA) for superficial hyperthermia. The validation involved (i) the assessment of the match between the predicted and measured 3D specific absorption rate (SAR) distribution, and (ii) the assessment of the ratio between model power and real-world power. The 3D SAR distribution of seven LCAs was scanned in a phantom bath using the DASY4 dosimetric measurement system. The same set-up was modelled in SEMCAD X. The match between the predicted and the measured SAR distribution was quantified with the gamma method, which combines distance-to-agreement and dose difference criteria. Good quantitative agreement was observed: more than 95% of the measurement points met the acceptance criteria 2 mm/2% for all applicators. The ratio between measured and predicted power absorption ranged from 0.75 to 0.92 (mean 0.85). This study shows that quantitative validation of hyperthermia applicator models is feasible and is worth considering as a part of hyperthermia quality assurance procedures. PMID:17505090
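    The gamma criterion used above combines a distance-to-agreement tolerance (2 mm) with a value-difference tolerance (2%): a measured point passes when the minimum combined metric over the predicted distribution is ≤ 1. A simplified 1-D sketch, using an absolute normalized value difference in place of the full percent-of-maximum normalization, with an invented profile:

```python
import math

def gamma_index(meas_pos, meas_val, calc_pos, calc_val, dta=2.0, dd=0.02):
    """Minimum combined distance / value-difference metric for one
    measured point against a calculated profile; the point passes
    the acceptance criteria when the returned gamma is <= 1."""
    return min(
        math.sqrt(((cp - meas_pos) / dta) ** 2 + ((cv - meas_val) / dd) ** 2)
        for cp, cv in zip(calc_pos, calc_val)
    )

# Invented 1-D SAR profile: positions in mm, values normalized to 1
calc_pos = [0.0, 1.0, 2.0]
calc_val = [1.00, 0.90, 0.80]

g = gamma_index(1.0, 0.905, calc_pos, calc_val)
print(g <= 1.0)   # True: this point passes 2 mm / 2%
```

The "more than 95% of points met 2 mm/2%" statement above is then simply the fraction of measured points with gamma ≤ 1.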

  19. Development and Validation of a Method to Measure Lumbosacral Motion Using Ultrasound Imaging.

    PubMed

    van den Hoorn, Wolbert; Coppieters, Michel W; van Dieën, Jaap H; Hodges, Paul W

    2016-05-01

    The study aim was to validate an ultrasound imaging technique to measure sagittal plane lumbosacral motion. Direct and indirect measures of lumbosacral angle change were developed and validated. Lumbosacral angle was estimated as the angle between lines through two landmarks on the sacrum and the lowest lumbar vertebra. A distance measure was made between the sacrum and lumbar vertebrae, and angle was estimated after distance was calibrated to angle. This method was tested in an in vitro spine and an in vivo porcine spine and validated against video and fluoroscopy measures, respectively. R², regression coefficients and mean absolute differences between ultrasound measures and validation measures were, respectively: 0.77, 0.982, 0.67° (in vitro, angle); 0.97, 0.992, 0.82° (in vitro, distance); 0.94, 0.995, 2.1° (in vivo, angle); and 0.95, 0.997, 1.7° (in vivo, distance). Lumbosacral motion can be accurately measured with ultrasound. This provides a basis to develop measurements for use in humans. PMID:26895754
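    The agreement statistics reported above (R², a regression coefficient near 1, and mean absolute difference in degrees) are quick to compute once the paired measurements exist. A sketch with invented angle pairs, not the study's data:

```python
def r_squared(actual, predicted):
    """Coefficient of determination of predicted vs. actual values."""
    mean_a = sum(actual) / len(actual)
    ss_res = sum((a - p) ** 2 for a, p in zip(actual, predicted))
    ss_tot = sum((a - mean_a) ** 2 for a in actual)
    return 1.0 - ss_res / ss_tot

def mean_abs_diff(a, b):
    """Mean absolute difference between paired measurements."""
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

# Invented ultrasound vs. fluoroscopy angles (degrees)
us    = [10.0, 12.5, 15.0, 18.0, 21.0]
fluor = [10.5, 12.0, 15.5, 18.5, 20.5]
print(round(mean_abs_diff(us, fluor), 2))   # 0.5
```

Reporting all three together, as the study does, guards against a high R² masking a systematic offset.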

  20. Development of a benchtop baking method for chemically leavened crackers. II. Validation of the method

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A benchtop baking method has been developed to predict the contribution of gluten functionality to overall flour performance for chemically leavened crackers. Using a diagnostic formula and procedure, dough rheology was analyzed to evaluate the extent of gluten development during mixing and machinin...

  1. 76 FR 28664 - Method 301-Field Validation of Pollutant Measurement Methods From Various Waste Media

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-18

    ... of 40 CFR part 63 on June 3, 1991. We proposed amendments to Method 301 on December 22, 2004 (69 FR... proposed on December 22, 2004, EPA promulgated a rule on September 13, 2010 (75 FR 55636), that moves all... terms of Executive Order 12866 (58 FR 51735, October 4, 1993) and is therefore not subject to...

  2. External Standards or Standard Addition? Selecting and Validating a Method of Standardization

    NASA Astrophysics Data System (ADS)

    Harvey, David T.

    2002-05-01

    A common feature of many problem-based laboratories in analytical chemistry is a lengthy independent project involving the analysis of "real-world" samples. Students research the literature, adapting and developing a method suitable for their analyte, sample matrix, and problem scenario. Because these projects encompass the complete analytical process, students must consider issues such as obtaining a representative sample, selecting a method of analysis, developing a suitable standardization, validating results, and implementing appropriate quality assessment/quality control practices. Most textbooks and monographs suitable for an undergraduate course in analytical chemistry, however, provide only limited coverage of these important topics. The need for short laboratory experiments emphasizing important facets of method development, such as selecting a method of standardization, is evident. The experiment reported here, which is suitable for an introductory course in analytical chemistry, illustrates the importance of matrix effects when selecting a method of standardization. Students also learn how a spike recovery is used to validate an analytical method, and gain practical experience of the difference between performing an external standardization and a standard addition.
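    The arithmetic behind standard addition, one of the standardization choices the experiment explores, is a short exercise: fit instrument response against the amount of standard added, then extrapolate to zero response; the magnitude of the x-intercept estimates the analyte concentration already in the sample. A sketch with invented data:

```python
def linfit(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Invented standard-addition data: added standard (ppm) vs. response
added  = [0.0, 1.0, 2.0, 3.0]
signal = [0.20, 0.30, 0.40, 0.50]

slope, intercept = linfit(added, signal)
conc = intercept / slope   # x-intercept magnitude
print(round(conc, 2))      # 2.0 ppm of analyte in the sample
```

Because the calibration is built inside the sample itself, the matrix affects standards and analyte identically, which is precisely why standard addition wins when matrix effects are strong.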

  3. Method validation for determination of alkaloid content in goldenseal root powder.

    PubMed

    Weber, Holly A; Zart, Matthew K; Hodges, Andrew E; White, Kellie D; Barnes, Sarah M; Moody, Leslie A; Clark, Alice P; Harris, Roger K; Overstreet, J Diane; Smith, Cynthia S

    2003-01-01

    A fast, practical ambient extraction methodology followed by isocratic liquid chromatography (LC) analysis with UV detection was validated for the determination of berberine, hydrastine, and canadine in goldenseal (Hydrastis canadensis L.) root powder. The method was also validated for palmatine, a major alkaloid present in the possible bioadulterants Coptis, Oregon grape root, and barberry bark. Alkaloid standard solutions were linear over the evaluated concentration ranges. The analytical method was linear for alkaloid extraction using 0.3-2 g goldenseal root powder/100 mL extraction solvent. Precision of the method was demonstrated using 10 replicate extractions of 0.5 g goldenseal root powder, with percent relative standard deviation for all 4 alkaloids ≤ 1.6. Alkaloid recovery was determined by spiking each alkaloid into triplicate aliquots of neat goldenseal root powder. Recoveries ranged from 92.3% for palmatine to 101.9% for hydrastine. Ruggedness of the method was evaluated by performing multiple analyses of goldenseal root powder from 3 suppliers over a 2-year period. The method was also used to analyze Coptis root, Oregon grape root, barberry bark, and celandine herb, which are possible goldenseal bioadulterants. The resulting chromatographic profiles of the bioadulterants were significantly different from that of goldenseal. The method was directly transferred to LC with mass spectrometry, which was used to confirm the presence of goldenseal alkaloids tetrahydroberberastine, berberastine, canadaline, berberine, hydrastine, and canadine, as well as alkaloids from the bioadulterants, including palmatine, jatrorrhizine, and coptisine. PMID:12852562

  4. Evaluating Processes, Parameters and Observations Using Cross Validation and Computationally Frugal Sensitivity Analysis Methods

    NASA Astrophysics Data System (ADS)

    Foglia, L.; Mehl, S.; Hill, M. C.

    2013-12-01

    Sensitivity analysis methods are used to identify the measurements most likely to provide important information for model development and predictions, and therefore to identify critical processes. Methods range from computationally demanding Monte Carlo and cross-validation methods to very computationally efficient linear methods. The methods are able to account for interrelations between parameters, but some argue that because linear methods neglect the effects of model nonlinearity, they are not worth considering when examining complex, nonlinear models of environmental systems. However, when faced with the computationally demanding models needed to simulate, for example, climate change, the chance of obtaining fundamental insights (such as important parameters and relationships between predictions and parameters) with few model runs is tempting. In the first part of this work, comparisons of local sensitivity analysis and cross-validation are conducted using a nonlinear groundwater model of the Maggia Valley, Southern Switzerland; sensitivity analyses are then applied to an integrated hydrological model of the same system, where the impact of additional processes and of using different sets of observations on the model results is considered; applicability to models of a variety of situations (climate, water quality, water management) is inferred. Results show that the frugal linear methods produced about 70% of the insight from about 2% of the model runs required by the computationally demanding methods. Regarding important observations, linear methods were not always able to distinguish between moderately important and unimportant observations. However, they consistently identified the most important observations, which are critical to characterize relationships between parameters and to assess the worth of potential new data collection efforts. Importance both for estimating parameters and for making predictions of interest was readily identified. 
The results suggest that it can be advantageous to consider local

  5. Diffuse reflectance near infrared-chemometric methods development and validation of amoxicillin capsule formulations

    PubMed Central

    Khan, Ahmed Nawaz; Khar, Roop Krishen; Ajayakumar, P. V.

    2016-01-01

    Objective: The aim of the present study was to establish near infrared-chemometric methods that could be used effectively for quality profiling, through identification and quantification of amoxicillin (AMOX) in formulated capsules similar to commercial products. The methods were modeled so that a large number of market products could be evaluated easily and quickly. Materials and Methods: A Thermo Scientific Antaris II near infrared analyzer with TQ Analyst chemometric software was used for the development and validation of the identification and quantification models. Several AMOX formulations were composed with four excipients: microcrystalline cellulose, magnesium stearate, croscarmellose sodium and colloidal silicon dioxide. Development included quadratic mixture formulation design, near infrared spectrum acquisition, spectral pretreatment and outlier detection. The developed methods were validated in terms of specificity, accuracy, precision, linearity, and robustness, according to the guidelines prescribed by the International Conference on Harmonization (ICH) and the European Medicines Agency (EMA). Results: In diffuse reflectance mode, an identification model based on discriminant analysis was successfully processed with 76 formulations; the same samples were also used for quantitative analysis using a partial least squares algorithm with four latent variables, giving a correlation coefficient of 0.9937 with 2.17% root mean square error of calibration (RMSEC), 2.38% root mean square error of prediction (RMSEP), and 2.43% root mean square error of cross-validation (RMSECV). Conclusion: The proposed model established a good relationship between the spectral information and AMOX identity as well as content. The resulting values show the performance of the proposed models, which offer an alternative for AMOX capsule evaluation relative to the well-established high-performance liquid chromatography method. 
Ultimately three commercial products were successfully evaluated using developed
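    The RMSEC, RMSEP, and RMSECV figures quoted above share a single formula, the root mean square error between reference and predicted contents; they differ only in which samples it is computed over (calibration set, independent prediction set, or cross-validation folds). A sketch with invented values:

```python
import math

def rmse(reference, predicted):
    """Root mean square error between reference and predicted values."""
    return math.sqrt(sum((r - p) ** 2 for r, p in zip(reference, predicted))
                     / len(reference))

# Invented AMOX contents (% of label claim): reference assay vs. NIR model
reference = [100.0, 98.0, 102.0, 99.0]
predicted = [101.0, 97.0, 103.0, 98.0]
print(rmse(reference, predicted))   # 1.0
```

Comparing the three variants is the point: an RMSEC far below RMSEP or RMSECV would signal an overfitted calibration, whereas the closely matched 2.17/2.38/2.43% values above suggest the model generalizes.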

  6. The development and validation of a single SNaPshot multiplex for tiger species and subspecies identification--implications for forensic purposes.

    PubMed

    Kitpipit, Thitika; Tobe, Shanan S; Kitchener, Andrew C; Gill, Peter; Linacre, Adrian

    2012-03-01

    The tiger (Panthera tigris) is currently listed on Appendix I of the Convention on International Trade in Endangered Species of Wild Fauna and Flora; this affords it the highest level of international protection. To aid in the investigation of alleged illegal trade in tiger body parts and derivatives, molecular approaches have been developed to identify biological material as being of tiger origin. Some countries also require knowledge of the exact tiger subspecies present in order to prosecute anyone alleged to be trading in tiger products. In this study we aimed to develop and validate a reliable single assay to identify tiger species and subspecies simultaneously; the test is based on identification of single nucleotide polymorphisms (SNPs) within the tiger mitochondrial genome. Mitochondrial DNA sequences from four of the five putative tiger subspecies that currently exist in the wild were obtained and combined with DNA sequence data from 492 tigers and 349 other mammalian species available on GenBank. From the sequence data, a total of 11 SNP loci were identified as suitable for further analyses. Five SNPs were species-specific for the tiger and six were subspecies-specific: three specific to P. t. sumatrae and the other three specific to P. t. tigris. The multiplex assay was able to reliably identify 15 voucher tiger samples. The sensitivity of the test was 15,000 mitochondrial DNA copies (approximately 0.26 pg), indicating that it will work on trace amounts of tissue, bone or hair samples. This simple test will add to the DNA-based methods currently being used to identify the presence of tiger within mixed samples. PMID:21723800
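    Once alleles have been called, a SNP panel of this kind reduces to profile matching: species-level markers are checked first, then subspecies-level markers. The sketch below uses invented locus names and alleles (not the study's markers) purely to show that two-tier logic:

```python
# Reference SNP profiles; all locus names and alleles are invented
PROFILES = {
    "P. tigris (species)": {"s1": "A", "s2": "G", "s3": "T"},
    "P. t. tigris":        {"u1": "C", "u2": "A", "u3": "G"},
    "P. t. sumatrae":      {"u1": "T", "u2": "G", "u3": "C"},
}

def classify(sample):
    """Return every reference label whose SNP profile the sample
    matches at all loci; an empty list means no call."""
    return [name for name, ref in PROFILES.items()
            if all(sample.get(locus) == allele for locus, allele in ref.items())]

# Invented sample: tiger species markers plus the sumatrae subtype
sample = {"s1": "A", "s2": "G", "s3": "T", "u1": "T", "u2": "G", "u3": "C"}
print(classify(sample))   # ['P. tigris (species)', 'P. t. sumatrae']
```

A real forensic assay adds the layers the abstract describes, validated multiplex chemistry, voucher samples, and sensitivity limits, on top of this matching step.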

  7. Validation of simultaneous volumetric and spectrophotometric methods for the determination of captopril in pharmaceutical formulations.

    PubMed

    Rahman, Nafisur; Singh, Manisha; Hoda, Nasrul

    2005-01-01

    Simple, sensitive and economical simultaneous volumetric and spectrophotometric methods for the determination of captopril have been developed. The methods were based on the reaction of captopril with potassium iodate in HCl medium. Amaranth was used as an indicator to detect the end-point of the titration in the aqueous layer. The iodine formed during the titration was extracted into CCl4 and subsequently determined spectrophotometrically at 510 nm. Beer's law was obeyed in the concentration range of 120-520 μg ml-1. Rigorous statistical analyses were performed for the validation of the proposed methods. The proposed methods were successfully applied to the determination of captopril in dosage forms. Comparison of the means of the proposed procedures with those of reference methods using point and interval hypothesis tests showed no statistically significant difference. PMID:15927181
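    The spectrophotometric step above rests on Beer's law, A = εbc, so once a calibration has fixed the absorptivity, concentration is a one-line calculation. The ε value below is invented for illustration, not taken from the paper:

```python
def concentration(absorbance, epsilon, path_cm=1.0):
    """Beer's law rearranged: c = A / (epsilon * b). The result is in
    whatever concentration unit epsilon was calibrated against."""
    return absorbance / (epsilon * path_cm)

# Invented absorptivity in (microgram/ml)^-1 cm^-1, 1 cm cell
print(round(concentration(0.45, 0.0015), 1))   # 300.0 microgram/ml
```

Linearity over a stated range, such as the 120-520 μg ml-1 above, is what licenses this rearrangement in routine use.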

  8. Two validated HPLC methods for the quantification of alizarin and other anthraquinones in Rubia tinctorum cultivars.

    PubMed

    Derksen, Goverdina C H; Lelyveld, Gerrit P; van Beek, Teris A; Capelle, Anthony; de Groot, A E

    2004-01-01

    Direct and indirect HPLC-UV methods for the quantitative determination of anthraquinones in dried madder root have been developed, validated and compared. In the direct method, madder root was extracted twice with refluxing ethanol-water. This method allowed the determination of the two major native anthraquinone glycosides lucidin primeveroside and ruberythric acid. In the indirect extraction method, the anthraquinone glycosides were first converted into aglycones by endogenous enzymes and the aglycones were subsequently extracted with tetrahydrofuran-water and then analysed. In this case the anthraquinones alizarin, purpurin and nordamnacanthal may be determined. The content of nordamnacanthal is proportional to the amount of lucidin primeveroside originally present. The indirect extraction method is easier to apply. Different madder cultivars were screened for their anthraquinone content. PMID:15599964

  9. Evidence flow graph methods for validation and verification of expert systems

    NASA Technical Reports Server (NTRS)

    Becker, Lee A.; Green, Peter G.; Bhatnagar, Jayant

    1988-01-01

    This final report describes the results of an investigation into the use of evidence flow graph techniques for performing validation and verification of expert systems. This was approached by developing a translator to convert horn-clause rule bases into evidence flow graphs, a simulation program, and methods of analysis. These tools were then applied to a simple rule base which contained errors. It was found that the method was capable of identifying a variety of problems, for example that the order of presentation of input data or small changes in critical parameters could affect the output from a set of rules.

  10. Evidence flow graph methods for validation and verification of expert systems

    NASA Technical Reports Server (NTRS)

    Becker, Lee A.; Green, Peter G.; Bhatnagar, Jayant

    1989-01-01

    The results of an investigation into the use of evidence flow graph techniques for performing validation and verification of expert systems are given. A translator to convert horn-clause rule bases into evidence flow graphs, a simulation program, and methods of analysis were developed. These tools were then applied to a simple rule base which contained errors. It was found that the method was capable of identifying a variety of problems, for example that the order of presentation of input data or small changes in critical parameters could affect the output from a set of rules.

  11. A validated HPTLC method for estimation of moxifloxacin hydrochloride in tablets.

    PubMed

    Dhillon, Vandana; Chaudhary, Alok Kumar

    2010-10-01

    A simple HPTLC method with high accuracy, precision and reproducibility was developed for the routine estimation of moxifloxacin hydrochloride in tablets available on the market and was validated for various parameters according to ICH guidelines. Moxifloxacin hydrochloride was estimated at 292 nm by densitometry using Silica gel 60 F254 as the stationary phase and a premix of methylene chloride:methanol:strong ammonia solution:acetonitrile (10:10:5:10) as the mobile phase. The method was linear in the range of 9-54 nanograms, with a correlation coefficient >0.99. The regression equation was: AUC = 65.57 × (amount in nanograms) + 163 (r² = 0.9908). PMID:23781417

  12. Validation of rapid assessment methods to determine streamflow duration classes in the Pacific Northwest, USA.

    PubMed

    Nadeau, Tracie-Lynn; Leibowitz, Scott G; Wigington, Parker J; Ebersole, Joseph L; Fritz, Ken M; Coulombe, Robert A; Comeleo, Randy L; Blocksom, Karen A

    2015-07-01

    United States Supreme Court rulings have created uncertainty regarding U.S. Clean Water Act (CWA) authority over certain waters, and established new data and analytical requirements for determining CWA jurisdiction. Thus, rapid assessment methods are needed that can differentiate between ephemeral, intermittent, and perennial streams. We report on the validation of several methods. The first (Interim Method) was developed through best professional judgment (BPJ); an alternative (Revised Method) resulted from statistical analysis. We tested the Interim Method on 178 study reaches in Oregon, and constructed the Revised Method based on statistical analysis of the Oregon data. Next, we evaluated the regional applicability of the methods on 86 study reaches across a variety of hydrologic landscapes in Washington and Idaho. During the second phase, we also compared the Revised Method with a similar approach (Combined Method) based on combined field data from Oregon, Washington, and Idaho. We further compared field-based methods with a GIS-based approach (GIS Method) that used the National Hydrography Dataset and a synthetic stream network. Evaluations of all methods compared results with actual streamflow duration classes. The Revised Method correctly determined known streamflow duration 83.9% of the time, versus 62.3% accuracy of the Interim Method and 43.6% accuracy for the GIS-based approach. The Combined Method did not significantly outperform the Revised Method. Analysis showed biological indicators most accurately discriminate streamflow duration classes. While BPJ established a testable hypothesis, this study illustrates the importance of quantitative field testing of rapid assessment methods. Results support a consistent method applicable across the Pacific Northwest. PMID:25931296

  13. Validation of Rapid Assessment Methods to Determine Streamflow Duration Classes in the Pacific Northwest, USA

    NASA Astrophysics Data System (ADS)

    Nadeau, Tracie-Lynn; Leibowitz, Scott G.; Wigington, Parker J.; Ebersole, Joseph L.; Fritz, Ken M.; Coulombe, Robert A.; Comeleo, Randy L.; Blocksom, Karen A.

    2015-07-01

    United States Supreme Court rulings have created uncertainty regarding U.S. Clean Water Act (CWA) authority over certain waters, and established new data and analytical requirements for determining CWA jurisdiction. Thus, rapid assessment methods are needed that can differentiate between ephemeral, intermittent, and perennial streams. We report on the validation of several methods. The first (Interim Method) was developed through best professional judgment (BPJ); an alternative (Revised Method) resulted from statistical analysis. We tested the Interim Method on 178 study reaches in Oregon, and constructed the Revised Method based on statistical analysis of the Oregon data. Next, we evaluated the regional applicability of the methods on 86 study reaches across a variety of hydrologic landscapes in Washington and Idaho. During the second phase, we also compared the Revised Method with a similar approach (Combined Method) based on combined field data from Oregon, Washington, and Idaho. We further compared field-based methods with a GIS-based approach (GIS Method) that used the National Hydrography Dataset and a synthetic stream network. Evaluations of all methods compared results with actual streamflow duration classes. The Revised Method correctly determined known streamflow duration 83.9 % of the time, versus 62.3 % accuracy of the Interim Method and 43.6 % accuracy for the GIS-based approach. The Combined Method did not significantly outperform the Revised Method. Analysis showed biological indicators most accurately discriminate streamflow duration classes. While BPJ established a testable hypothesis, this study illustrates the importance of quantitative field testing of rapid assessment methods. Results support a consistent method applicable across the Pacific Northwest.

  14. Validity of a Manual Soft Tissue Profile Prediction Method Following Mandibular Setback Osteotomy

    PubMed Central

    Kolokitha, Olga-Elpis

    2007-01-01

    Objectives The aim of this study was to determine the validity of a manual cephalometric method used for predicting the post-operative soft tissue profiles of patients who underwent mandibular setback surgery and compare it to a computerized cephalometric prediction method (Dentofacial Planner). Lateral cephalograms of 18 adults with mandibular prognathism taken at the end of pre-surgical orthodontics and approximately one year after surgery were used. Methods To test the validity of the manual method the prediction tracings were compared to the actual post-operative tracings. The Dentofacial Planner software was used to develop the computerized post-surgical prediction tracings. Both manual and computerized prediction printouts were analyzed by using the cephalometric system PORDIOS. Statistical analysis was performed by means of t-tests. Results Comparison between manual prediction tracings and the actual post-operative profile showed that the manual method results in more convex soft tissue profiles; the upper lip was found in a more prominent position, upper lip thickness was increased, and the mandible and lower lip were found in a less posterior position than in the actual profiles. Comparison between computerized and manual prediction methods showed that in the manual method upper lip thickness was increased, the upper lip was found in a more anterior position and the lower anterior facial height was increased as compared to the computerized prediction method. Conclusions Cephalometric simulation of post-operative soft tissue profile following orthodontic-surgical management of mandibular prognathism imposes certain limitations related to the methods employed. However, both manual and computerized prediction methods remain a useful tool for patient communication. PMID:19212468

  15. Validation Study of a Method for Assessing Complex Ill-Structured Problem Solving by Using Causal Representations

    ERIC Educational Resources Information Center

    Eseryel, Deniz; Ifenthaler, Dirk; Ge, Xun

    2013-01-01

    The important but little understood problem that motivated this study was the lack of research on valid assessment methods to determine progress in higher-order learning in situations involving complex and ill-structured problems. Without a valid assessment method, little progress can occur in instructional design research with regard to designing…

  16. Validation of Modifications to the ANSR® Salmonella Method for Improved Ease of Use.

    PubMed

    Caballero, Oscar; Alles, Susan; Walton, Kayla; Gray, R Lucas; Mozola, Mark; Rice, Jennifer

    2015-01-01

    This paper describes the results of a study to validate minor reagent formulation and procedural changes to the ANSR® Salmonella method, AOAC Performance Tested Method™ 061203. In order to improve ease of use and diminish risk of amplicon contamination, the lyophilized reagent components were reformulated for increased solubility, thus eliminating the need to mix by pipetting. In the alternative procedure, an aliquot of the lysate is added to lyophilized ANSR reagents, immediately capped, and briefly mixed by vortex. Results of the validation study with ice cream, peanut butter, dry dog food, raw ground turkey, raw ground beef, and sponge samples from a stainless steel surface showed no statistically significant differences in performance between the ANSR method and the U.S. Food and Drug Administration Bacteriological Analytical Manual or U.S. Department of Agriculture-Food Safety and Inspection Services Microbiology Laboratory Guidebook reference culture procedures. Results of inclusivity and exclusivity testing were unchanged from those of the original validation study; exclusivity was 100% and inclusivity was 99.1% with only a single strain of Salmonella Weslaco testing negative. Robustness testing was also conducted, with variations to lysis buffer volume, lysis time, and sample volume having no demonstrable effect on assay results. PMID:26086257

  17. Method of Administration of PROMIS Scales Did Not Significantly Impact Score Level, Reliability or Validity

    PubMed Central

    Bjorner, Jakob B.; Rose, Matthias; Gandek, Barbara; Stone, Arthur A.; Junghaenel, Doerte U.; Ware, John E.

    2014-01-01

    Objective To test the impact of method of administration (MOA) on the score level, reliability, and validity of scales developed in the Patient-Reported Outcomes Measurement Information System (PROMIS). Study Design and Setting Two non-overlapping parallel forms, each containing 8 items from each of three PROMIS item banks (Physical Function, Fatigue, and Depression), were completed by 923 adults with COPD, depression, or rheumatoid arthritis. In a randomized crossover design, subjects answered one form by interactive voice response (IVR) technology, paper questionnaire (PQ), personal digital assistant (PDA), or personal computer (PC), and a second form by PC, in the same administration. Method equivalence was evaluated through analyses of difference scores, intraclass correlations (ICC), and convergent/discriminant validity. Results In the difference-score analyses, no significant mode differences were found, and all confidence intervals were within the pre-specified minimally important difference (MID) of 0.2 SD. Parallel-forms reliabilities were very high (ICC=0.85-0.93). Only one across-mode ICC was significantly lower than the corresponding same-mode ICC. Tests of validity showed no differential effect by MOA. Participants preferred the screen interfaces over PQ and IVR. Conclusion We found no statistically or clinically significant differences in score levels or psychometric properties of IVR, PQ, or PDA administration as compared with PC. PMID:24262772

  18. Validated LC Method for the Estimation of Voriconazole in Bulk and Formulation.

    PubMed

    Patel, C N; Dave, J B; Patel, J V; Panigrahi, B

    2009-11-01

    A reversed-phase high performance liquid chromatographic method was developed and validated for the estimation of voriconazole in bulk and formulation using a Prominence diode array detector. The mobile phase was a combination of water:acetonitrile (35:65 % v/v) and the detection wavelength was 256 nm. The retention time of voriconazole was 3.95 min. The method was linear over the range 0.1 to 2 μg/ml, with a regression coefficient of 0.999. The method was validated according to ICH guidelines. Quantification was done by calculating the peak area; the detection limit and quantitation limit were 0.026 μg/ml and 0.1 μg/ml, respectively. There was no significant difference in the intra-day and inter-day analysis of voriconazole determined for three different concentrations using this method. The present method can be applied for the determination of voriconazole in quality control of formulations without interference from the excipients. PMID:20376229
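
    The ICH-style detection and quantitation limits cited in records like this one are commonly estimated from the calibration line as LOD = 3.3σ/S and LOQ = 10σ/S, where σ is the residual standard deviation of the regression and S its slope. A minimal sketch, using made-up calibration data rather than any values from the paper:

```python
import numpy as np

# Hypothetical calibration data: concentration (ug/mL) vs. peak area
conc = np.array([0.1, 0.5, 1.0, 1.5, 2.0])
area = np.array([1020, 5110, 10150, 15280, 20190])

# Least-squares fit of the calibration line
slope, intercept = np.polyfit(conc, area, 1)
residuals = area - (slope * conc + intercept)
# Residual standard deviation (n - 2 degrees of freedom for a 2-parameter fit)
sigma = np.sqrt(np.sum(residuals**2) / (len(conc) - 2))

# ICH Q2(R1)-style estimates based on the calibration curve
lod = 3.3 * sigma / slope
loq = 10 * sigma / slope
print(f"slope={slope:.1f}, LOD={lod:.4f} ug/mL, LOQ={loq:.4f} ug/mL")
```

    The same σ/S construction applies whether σ comes from the calibration residuals, the intercept standard deviation, or blank measurements; ICH Q2 allows any of these.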

  19. Development and Validation of HPTLC Method for Estimation of Tenoxicam and its Formulations.

    PubMed

    Chandel, S; Barhate, C R; Srivastava, A R; Kulkarni, S R; Kulkarni, S K; Kapadia, C J

    2012-01-01

    A simple, precise, accurate and rapid high performance thin layer chromatographic method has been developed and validated for the estimation of tenoxicam in microemulsion gels. Tenoxicam was chromatographed on a silica gel 60 F254 TLC plate as the stationary phase. The mobile phase was toluene:ethyl acetate:formic acid (6:4:0.3 v/v/v), which gave a dense and compact spot of tenoxicam with an Rf value of 0.38±0.03. Quantification was carried out at 379 nm. The method was validated in terms of linearity, accuracy, precision and specificity. To establish the suitability, accuracy and precision of the proposed method, recovery studies were performed at three concentration levels. Statistical analysis proved that the proposed method is accurate and reproducible, with linearity in the range of 100 to 400 ng. The limit of detection and limit of quantification for tenoxicam were 25 and 50 ng/spot, respectively. The proposed method can be employed for the routine analysis of tenoxicam as well as of its pharmaceutical formulations. PMID:23204620

  20. Development and Validation of HPTLC Method for Estimation of Tenoxicam and its Formulations

    PubMed Central

    Chandel, S.; Barhate, C. R.; Srivastava, A. R.; Kulkarni, S. R.; Kapadia, C. J.

    2012-01-01

    A simple, precise, accurate and rapid high performance thin layer chromatographic method has been developed and validated for the estimation of tenoxicam in microemulsion gels. Tenoxicam was chromatographed on a silica gel 60 F254 TLC plate as the stationary phase. The mobile phase was toluene:ethyl acetate:formic acid (6:4:0.3 v/v/v), which gave a dense and compact spot of tenoxicam with an Rf value of 0.38±0.03. Quantification was carried out at 379 nm. The method was validated in terms of linearity, accuracy, precision and specificity. To establish the suitability, accuracy and precision of the proposed method, recovery studies were performed at three concentration levels. Statistical analysis proved that the proposed method is accurate and reproducible, with linearity in the range of 100 to 400 ng. The limit of detection and limit of quantification for tenoxicam were 25 and 50 ng/spot, respectively. The proposed method can be employed for the routine analysis of tenoxicam as well as of its pharmaceutical formulations. PMID:23204620

  1. Determination of rupatadine in pharmaceutical formulations by a validated stability-indicating MEKC method.

    PubMed

    Nogueira, Daniele Rubert; da Silva Sangoi, Maximiliano; da Silva, Lucélia Magalhães; Todeschini, Vítor; Dalmora, Sérgio Luiz

    2008-09-01

    A stability-indicating MEKC method was developed and validated for the analysis of rupatadine in tablet dosage forms, using nimesulide as internal standard. The MEKC method was performed on a fused-silica capillary (50 μm id; effective length, 40 cm). The BGE consisted of 15 mM borate buffer and 25 mM anionic detergent SDS solution at pH 10. The capillary temperature was maintained at 35°C and the applied voltage was 25 kV. Injection was performed in the hydrodynamic mode at 50 mbar for 5 s, with detection by a photodiode array detector set at 205 nm. The method was linear in the range of 0.5-150 μg/mL (r2=0.9996). The specificity and stability-indicating capability of the method were proven through degradation studies, including MS analysis, which also showed that there was no interference from the excipients and no increase in cytotoxicity. The accuracy was 99.98%, with bias lower than 1.06%. The LOD and LOQ were 0.1 and 0.5 μg/mL, respectively. The proposed method was successfully applied to the quantitative analysis of rupatadine in pharmaceutical formulations, and the results were compared to a validated RP-LC method, showing a non-significant difference (p>0.05). PMID:18693320

  2. Development and validation of an HPLC-MS/MS method to determine clopidogrel in human plasma.

    PubMed

    Liu, Gangyi; Dong, Chunxia; Shen, Weiwei; Lu, Xiaopei; Zhang, Mengqi; Gui, Yuzhou; Zhou, Qinyi; Yu, Chen

    2016-01-01

    A quantitative method for clopidogrel using online-SPE tandem LC-MS/MS was developed and fully validated according to the well-established FDA guidelines. The method achieves adequate sensitivity for pharmacokinetic studies, with a lower limit of quantification (LLOQ) as low as 10 pg/mL. Chromatographic separations were performed on reversed-phase Kromasil Eternity-2.5-C18-UHPLC columns. Positive electrospray ionization in multiple reaction monitoring (MRM) mode was employed for signal detection, and a deuterated analogue (clopidogrel-d4) was used as internal standard (IS). Adjustments in sample preparation, including the introduction of an online-SPE system, proved to be the most effective way to solve analyte back-conversion in clinical samples. Pooled clinical samples (two levels) were prepared and successfully used as real-sample quality controls (QC) in the validation of back-conversion testing under different conditions. The results showed that the real samples were stable at room temperature for 24 h. Linearity, precision, extraction recovery, matrix effect on spiked QC samples, and stability tests on both spiked QCs and real-sample QCs stored under different conditions met the acceptance criteria. This online-SPE method was successfully applied to a bioequivalence study of 75 mg single-dose clopidogrel tablets in 48 healthy male subjects. PMID:26904399

  3. Update from the Japanese Center for the Validation of Alternative Methods (JaCVAM).

    PubMed

    Kojima, Hajime

    2013-12-01

    The Japanese Center for the Validation of Alternative Methods (JaCVAM) was established in 2005 to promote the use of alternatives to animal testing in regulatory studies, thereby replacing, reducing, or refining the use of animals, according to the Three Rs principles. JaCVAM assesses the utility, limitations, and suitability for regulatory use of test methods needed to determine the safety of chemicals and other materials. JaCVAM also organises and performs validation studies of new test methods, when necessary. In addition, JaCVAM co-operates and collaborates with similar organisations in related fields, both in Japan and internationally, which also enables JaCVAM to provide input during the establishment of guidelines for new alternative experimental methods. These activities help facilitate application and approval processes for the manufacture and sale of pharmaceuticals, chemicals, pesticides, and other products, as well as revisions to standards for cosmetic products. In this manner, JaCVAM plays a leadership role in the introduction of new alternative experimental methods for regulatory acceptance in Japan. PMID:24512226

  4. Glycol ethers - validation procedures for tube/pump and dosimeter monitoring methods

    SciTech Connect

    Langhorst, M.L.

    1984-06-01

    Methods were developed and validated for personal monitoring of exposures to airborne glycol ethers, for both short-term and long-term time-weighted averages. Either a 600 mg charcoal tube or a 780 mg silica gel tube is recommended for monitoring nine glycol ethers, depending upon the humidity and the other organic compounds to be monitored. The charcoal tube allows maximum sensitivity and is unaffected by high-humidity conditions. Two-phase solvent desorption with CS2 and water allows aqueous-phase recoveries of DOWANOL EM, PM, EE, DM, DPM, and TM glycol ethers. DOWANOL EB, DB, and TPM glycol ethers are partitioned between the two layers, necessitating chromatographic analysis of both layers. The silica gel tube method can be used to monitor all nine glycol ethers tested, but is affected by high-humidity conditions, resulting in significant breakthrough of the more volatile glycol ethers. The 3M organic vapor monitor can accurately and conveniently determine exposure concentrations for DOWANOL EM, EE, and PM glycol ethers, but sensitivities may be inadequate for sampling periods of less than one hour. These methods were validated at levels down to 0.1 times the Dow internal exposure guidelines for those substances with Dow exposure guidelines, and well above the current ACGIH and OSHA guidelines. This paper also illustrates validation procedures for tube/pump and dosimeter methods, allowing good definition of method accuracy and precision. Some screening experiments are described for diffusional dosimeters to check the most important parameters in a minimum of time. This methodology will allow assessment of human airborne exposures relative to the new toxicology data available on animals.

  5. Validation of a method to directly and specifically measure nitrite in biological matrices.

    PubMed

    Almeida, Luis E F; Kamimura, Sayuri; Kenyon, Nicholas; Khaibullina, Alfia; Wang, Li; de Souza Batista, Celia M; Quezado, Zenaide M N

    2015-02-15

    The bioactivity of nitric oxide (NO) is influenced by chemical species generated through reactions with proteins, lipids and metals, and by its conversion to nitrite and nitrate. A better understanding of the functions played by each of these species could be achieved by developing selective assays capable of distinguishing nitrite from other NO species. Nagababu and Rifkind developed a method using acetic and ascorbic acids to measure nitrite-derived NO in plasma. Here, we adapted, optimized, and validated this method to assay nitrite in tissues. The method yielded linear measurements over 1-300 pmol of nitrite and was validated for tissue preserved in a nitrite stabilization solution composed of potassium ferricyanide, N-ethylmaleimide and NP-40. When samples were processed with chloroform, but not with methanol, ethanol, acetic acid or acetonitrile, reliable and reproducible nitrite measurements in up to 20 sample replicates were obtained. The method's accuracy was ≈90% in tissue and 99.9% in plasma. In mice, under basal conditions, the brain, heart, lung, liver, spleen and kidney cortex had similar nitrite levels. In addition, nitrite tissue levels were similar regardless of when the organs were processed: immediately upon collection, kept in stabilization solution for later analysis, or frozen and processed later. After intraperitoneal nitrite injections, rapidly changing nitrite concentrations in tissue and plasma could be measured and were shown to change in significantly distinct patterns. This validated method could be valuable for investigations of nitrite biology in conditions such as sickle cell disease, cardiovascular disease, and diabetes, where nitrite is thought to play a role. PMID:25445633

  6. Glycol ethers--validation procedures for tube/pump and dosimeter monitoring methods.

    PubMed

    Langhorst, M L

    1984-06-01

    Methods were developed and validated for personal monitoring of exposures to airborne glycol ethers, both short-term and long-term time-weighted-averages. Either a 600 mg charcoal tube or a 780 mg silica gel tube is recommended for monitoring nine glycol ethers, depending upon the humidity and other organic compounds to be monitored. The charcoal tube allows maximum sensitivity and is unaffected by high humidity conditions. Two-phase solvent desorption with CS2 and water allows aqueous phase recoveries of DOWANOL EM, PM, EE, DM, DPM, and TM glycol ethers. DOWANOL EB, DB and TPM glycol ethers are partitioned between the two layers, necessitating chromatographic analysis of both layers. The silica gel tube method can be used to monitor all nine glycol ethers tested, but is affected by high humidity conditions, resulting in significant breakthrough of the more volatile glycol ethers. The 3M organic vapor monitor can accurately and conveniently determine exposure concentrations for DOWANOL EM, EE, and PM glycol ethers, but sensitivities may be inadequate for sampling periods less than one hour. These methods were validated at levels down to 0.1 times the Dow internal exposure guidelines for those substances with Dow exposure guidelines and well above the current ACGIH and OSHA guidelines. This paper also illustrates validation procedures for tube/pump and dosimeter methods, allowing good definition of method accuracy and precision. Some screening experiments are described for diffusional dosimeters to check the most important parameters in a minimum of time. This methodology will allow assessment of human airborne exposures relative to the new toxicology data available on animals. PMID:6331145

  7. Quantitative Imaging Methods for the Development and Validation of Brain Biomechanics Models

    PubMed Central

    Bayly, Philip V.; Clayton, Erik H.; Genin, Guy M.

    2013-01-01

    Rapid deformation of brain tissue in response to head impact or acceleration can lead to numerous pathological changes, both immediate and delayed. Modeling and simulation hold promise for illuminating the mechanisms of traumatic brain injury (TBI) and for developing preventive devices and strategies. However, mathematical models have predictive value only if they satisfy two conditions. First, they must capture the biomechanics of the brain as both a material and a structure, including the mechanics of brain tissue and its interactions with the skull. Second, they must be validated by direct comparison with experimental data. Emerging imaging technologies and recent imaging studies provide important data for these purposes. This review describes these techniques and data, with an emphasis on magnetic resonance imaging approaches. In combination, these imaging tools promise to extend our understanding of brain biomechanics and improve our ability to study TBI in silico. PMID:22655600

  8. Optimization and validation of spectrophotometric methods for determination of finasteride in dosage and biological forms

    PubMed Central

    Amin, Alaa S.; Kassem, Mohammed A.

    2012-01-01

    Aim and Background: Three simple, accurate and sensitive spectrophotometric methods for the determination of finasteride in pure, dosage and biological forms, and in the presence of its oxidative degradates, were developed. Materials and Methods: These methods are indirect: a known excess of oxidant (potassium permanganate for method A, ceric sulfate [Ce(SO4)2] for method B, and N-bromosuccinimide (NBS) for method C) is added to finasteride in acid medium, and the unreacted oxidant is determined by measuring the decrease in absorbance of methylene blue for method A, chromotrope 2R for method B, and amaranth for method C at a suitable maximum wavelength (λmax 663, 528, and 520 nm, respectively). The reaction conditions for each method were optimized. Results: Regression analysis of the Beer's law plots showed good correlation in the concentration ranges of 0.12-3.84 μg mL-1 for method A, 0.12-3.28 μg mL-1 for method B, and 0.14-3.56 μg mL-1 for method C. The apparent molar absorptivity, Sandell sensitivity, and detection and quantification limits were evaluated. The stoichiometric ratio between finasteride and each oxidant was estimated. The validity of the proposed methods was tested by analyzing dosage forms and biological samples containing finasteride, with relative standard deviations ≤0.95. Conclusion: The proposed methods could successfully determine the studied drug in the presence of varying excesses of its oxidative degradation products, with recoveries between 99.0 and 101.4%, 99.2 and 101.6%, and 99.6 and 101.0% for methods A, B, and C, respectively. PMID:23781478
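
    The indirect scheme shared by the three methods (a measured excess of oxidant consumes the drug, the leftover oxidant bleaches a dye, and the absorbance decrease therefore tracks drug concentration) reduces to an ordinary linear calibration. A minimal sketch with invented calibration figures, not the paper's data:

```python
import numpy as np

# Hypothetical calibration: finasteride (ug/mL) vs. decrease in methylene
# blue absorbance at 663 nm (A_blank - A_sample), method A style
conc = np.array([0.12, 0.96, 1.92, 2.88, 3.84])
delta_a = np.array([0.031, 0.248, 0.497, 0.744, 0.992])

# Fit the calibration line and check its linearity
slope, intercept = np.polyfit(conc, delta_a, 1)
r = np.corrcoef(conc, delta_a)[0, 1]

# Quantify an unknown sample from its measured absorbance decrease
unknown_delta = 0.50
unknown_conc = (unknown_delta - intercept) / slope
print(f"slope={slope:.4f}, r={r:.4f}, unknown ~= {unknown_conc:.2f} ug/mL")
```

    Because the dye signal falls as drug concentration rises, plotting the absorbance *decrease* against concentration restores a direct Beer's-law-style relationship.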

  9. A validated RP-HPLC method for quantitative determination of related impurities of ursodeoxycholic acid (API) by refractive index detection.

    PubMed

    Peepliwal, Ashok; Bonde, C G; Bothara, K G

    2011-03-25

    An isocratic RP-HPLC method was developed and validated for the quantitative determination of ursodeoxycholic acid (UDCA) and its related impurities. Given the low molar absorptivity of UDCA, a refractive index detector was used to detect the impurities on a Phenomenex Luna C18, 150 mm × 4.6 mm, 5 μm column. The mobile phase was 0.1% acetic acid/methanol (30:70, v/v) and the flow rate was 0.8 ml/min. The detector and column temperatures were maintained at 40°C. The method is linear over the range of 0.25-3.5 μg/ml for all impurities, with coefficients of correlation (r2) ≥0.9945. The accuracy of the method was demonstrated at three levels in the range of 50-150% of the specification limit, and recoveries were found to be in the range of 97.11-100.75%. The precision for all related impurities was below 3.5% RSD. The method was applied to a commercial bulk drug sample for assay purposes. PMID:21095088

  10. From the Bronx to Bengifunda (and other lines of flight): deterritorializing purposes and methods in science education research

    NASA Astrophysics Data System (ADS)

    Gough, Noel

    2011-03-01

    In this essay I explore a number of questions about purposes and methods in science education research prompted by my reading of Wesley Pitts' ethnographic study of interactions among four students and their teacher in a chemistry classroom in the Bronx, New York City. I commence three 'lines of flight' (small acts of Deleuzo-Guattarian deterritorialization) that depart from the conceptual territory regulated by science education's dominant systems of signification and make new connections within and beyond that territory. I offer neither a comprehensive review nor a thorough critique of Wesley's paper but, rather, suggest some alternative directions for science education research in the genre he exemplifies.

  11. Validation and recommendation of methods to measure biogas production potential of animal manure.

    PubMed

    Pham, C H; Triolo, J M; Cu, T T T; Pedersen, L; Sommer, S G

    2013-06-01

    In developing countries, biogas energy production is seen as a technology that can provide clean energy in poor regions and reduce pollution caused by animal manure. Laboratories in these countries have little access to advanced gas measuring equipment, which may limit research aimed at improving local adapted biogas production. They may also be unable to produce valid estimates of an international standard that can be used for articles published in international peer-reviewed science journals. This study tested and validated methods for measuring total biogas and methane (CH4) production using batch fermentation and for characterizing the biomass. The biochemical methane potential (BMP) (CH4 NL kg(-1) VS) of pig manure, cow manure and cellulose determined with the Moller and VDI methods was not significantly different in this test (p>0.05). The biodegradability using a ratio of BMP and theoretical BMP (TBMP) was slightly higher using the Hansen method, but differences were not significant. Degradation rate assessed by methane formation rate showed wide variation within the batch method tested. The first-order kinetics constant k for the cumulative methane production curve was highest when two animal manures were fermented using the VDI 4630 method, indicating that this method was able to reach steady conditions in a shorter time, reducing fermentation duration. In precision tests, the repeatability of the relative standard deviation (RSDr) for all batch methods was very low (4.8 to 8.1%), while the reproducibility of the relative standard deviation (RSDR) varied widely, from 7.3 to 19.8%. In determination of biomethane concentration, the values obtained using the liquid replacement method (LRM) were comparable to those obtained using gas chromatography (GC). This indicates that the LRM method could be used to determine biomethane concentration in biogas in laboratories with limited access to GC. PMID:25049861
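
    The first-order kinetics constant k mentioned in this record describes the cumulative methane curve B(t) = B0(1 - exp(-kt)). Assuming the ultimate potential B0 is read off the plateau of the curve, k can be estimated by linearizing the model; the yield data below are invented for illustration, not taken from the study:

```python
import numpy as np

# Hypothetical cumulative methane yield (NL CH4 per kg VS) over fermentation days
t = np.array([2, 4, 6, 8, 10, 15, 20, 30], dtype=float)
b = np.array([95, 162, 210, 243, 267, 300, 314, 325], dtype=float)

b0 = 330.0  # assumed ultimate methane potential (plateau), NL per kg VS

# B(t) = B0 * (1 - exp(-k t)) linearizes to ln(1 - B/B0) = -k t,
# so k is the (negated) slope of a through-origin least-squares fit
y = np.log(1.0 - b / b0)
k = -np.sum(t * y) / np.sum(t * t)
print(f"first-order rate constant k ~= {k:.3f} per day")
```

    A larger k means the batch reaches steady conditions sooner, which is the basis for the paper's observation that the VDI 4630 method shortens the required fermentation duration.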

  12. Validated spectrophotometric methods for determination of sodium valproate based on charge transfer complexation reactions

    NASA Astrophysics Data System (ADS)

    Belal, Tarek S.; El-Kafrawy, Dina S.; Mahrous, Mohamed S.; Abdel-Khalek, Magdi M.; Abo-Gharam, Amira H.

    2016-02-01

    This work presents the development, validation and application of four simple and direct spectrophotometric methods for determination of sodium valproate (VP) through charge transfer complexation reactions. The first method is based on the reaction of the drug with p-chloranilic acid (p-CA) in acetone to give a purple colored product with maximum absorbance at 524 nm. The second method depends on the reaction of VP with dichlone (DC) in dimethylformamide forming a reddish orange product measured at 490 nm. The third method is based upon the interaction of VP and picric acid (PA) in chloroform resulting in the formation of a yellow complex measured at 415 nm. The fourth method involves the formation of a yellow complex peaking at 361 nm upon the reaction of the drug with iodine in chloroform. Experimental conditions affecting the color development were studied and optimized. Stoichiometry of the reactions was determined. The proposed spectrophotometric procedures were effectively validated with respect to linearity, ranges, precision, accuracy, specificity, robustness, detection and quantification limits. Calibration curves of the formed color products with p-CA, DC, PA and iodine showed good linear relationships over the concentration ranges 24-144, 40-200, 2-20 and 1-8 μg/mL respectively. The proposed methods were successfully applied to the assay of sodium valproate in tablets and oral solution dosage forms with good accuracy and precision. Assay results were statistically compared to a reference pharmacopoeial HPLC method where no significant differences were observed between the proposed methods and reference method.

  13. Validation and Recommendation of Methods to Measure Biogas Production Potential of Animal Manure

    PubMed Central

    Pham, C. H.; Triolo, J. M.; Cu, T. T. T.; Pedersen, L.; Sommer, S. G.

    2013-01-01

    In developing countries, biogas energy production is seen as a technology that can provide clean energy in poor regions and reduce pollution caused by animal manure. Laboratories in these countries have little access to advanced gas measuring equipment, which may limit research aimed at improving local adapted biogas production. They may also be unable to produce valid estimates of an international standard that can be used for articles published in international peer-reviewed science journals. This study tested and validated methods for measuring total biogas and methane (CH4) production using batch fermentation and for characterizing the biomass. The biochemical methane potential (BMP) (CH4 NL kg−1 VS) of pig manure, cow manure and cellulose determined with the Moller and VDI methods was not significantly different in this test (p>0.05). The biodegradability using a ratio of BMP and theoretical BMP (TBMP) was slightly higher using the Hansen method, but differences were not significant. Degradation rate assessed by methane formation rate showed wide variation within the batch method tested. The first-order kinetics constant k for the cumulative methane production curve was highest when two animal manures were fermented using the VDI 4630 method, indicating that this method was able to reach steady conditions in a shorter time, reducing fermentation duration. In precision tests, the repeatability of the relative standard deviation (RSDr) for all batch methods was very low (4.8 to 8.1%), while the reproducibility of the relative standard deviation (RSDR) varied widely, from 7.3 to 19.8%. In determination of biomethane concentration, the values obtained using the liquid replacement method (LRM) were comparable to those obtained using gas chromatography (GC). This indicates that the LRM method could be used to determine biomethane concentration in biogas in laboratories with limited access to GC. PMID:25049861

  14. Validated spectrophotometric methods for determination of sodium valproate based on charge transfer complexation reactions.

    PubMed

    Belal, Tarek S; El-Kafrawy, Dina S; Mahrous, Mohamed S; Abdel-Khalek, Magdi M; Abo-Gharam, Amira H

    2016-02-15

    This work presents the development, validation and application of four simple and direct spectrophotometric methods for determination of sodium valproate (VP) through charge transfer complexation reactions. The first method is based on the reaction of the drug with p-chloranilic acid (p-CA) in acetone to give a purple colored product with maximum absorbance at 524 nm. The second method depends on the reaction of VP with dichlone (DC) in dimethylformamide forming a reddish orange product measured at 490 nm. The third method is based upon the interaction of VP and picric acid (PA) in chloroform resulting in the formation of a yellow complex measured at 415 nm. The fourth method involves the formation of a yellow complex peaking at 361 nm upon the reaction of the drug with iodine in chloroform. Experimental conditions affecting the color development were studied and optimized. Stoichiometry of the reactions was determined. The proposed spectrophotometric procedures were effectively validated with respect to linearity, ranges, precision, accuracy, specificity, robustness, detection and quantification limits. Calibration curves of the formed color products with p-CA, DC, PA and iodine showed good linear relationships over the concentration ranges 24-144, 40-200, 2-20 and 1-8 μg/mL respectively. The proposed methods were successfully applied to the assay of sodium valproate in tablets and oral solution dosage forms with good accuracy and precision. Assay results were statistically compared to a reference pharmacopoeial HPLC method where no significant differences were observed between the proposed methods and reference method. PMID:26574649

  15. Modification and Validation of Conceptual Design Aerodynamic Prediction Method HASC95 With VTXCHN

    NASA Technical Reports Server (NTRS)

    Albright, Alan E.; Dixon, Charles J.; Hegedus, Martin C.

    1996-01-01

    A conceptual/preliminary-design-level subsonic aerodynamic prediction code, HASC (High Angle of Attack Stability and Control), has been improved in several areas, validated, and documented. The improved code includes refined methodologies for increased accuracy and robustness, and simplified input/output files. An engineering method called VTXCHN (Vortex Chine) for predicting nose vortex shedding from circular and non-circular forebodies with sharp chine edges has been improved and integrated into the HASC code. This report contains a summary of the modifications, a description of the code, a user's guide, and validation of HASC. Appendices include a discussion of a new HASC utility code, listings of sample input and output files, and a discussion of the application of HASC to buffet analysis.

  16. IMPROVED COMPUTATIONAL NEUTRONICS METHODS AND VALIDATION PROTOCOLS FOR THE ADVANCED TEST REACTOR

    SciTech Connect

    David W. Nigg; Joseph W. Nielsen; Benjamin M. Chase; Ronnie K. Murray; Kevin A. Steuhm

    2012-04-01

    The Idaho National Laboratory (INL) is in the process of modernizing the various reactor physics modeling and simulation tools used to support operation and safety assurance of the Advanced Test Reactor (ATR). Key accomplishments so far have encompassed both computational as well as experimental work. A new suite of stochastic and deterministic transport theory based reactor physics codes and their supporting nuclear data libraries (HELIOS, KENO6/SCALE, NEWT/SCALE, ATTILA, and an extended implementation of MCNP5) has been installed at the INL. Corresponding models of the ATR and ATRC are now operational with all five codes, demonstrating the basic feasibility of the new code packages for their intended purpose. Of particular importance, a set of as-run core depletion HELIOS calculations for all ATR cycles since August 2009 was successfully completed during 2011. This demonstration supported a decision late in the year to proceed with the phased incorporation of the HELIOS methodology into the ATR fuel cycle management process beginning in 2012. On the experimental side of the project, new hardware was fabricated, measurement protocols were finalized, and the first four of six planned physics code validation experiments based on neutron activation spectrometry were conducted at the ATRC facility. Data analysis for the first three experiments, focused on characterization of the neutron spectrum in one of the ATR flux traps, has been completed. The six experiments will ultimately form the basis for a flexible, easily-repeatable ATR physics code validation protocol that is consistent with applicable ASTM standards.

  17. Improved computational neutronics methods and validation protocols for the advanced test reactor

    SciTech Connect

    Nigg, D. W.; Nielsen, J. W.; Chase, B. M.; Murray, R. K.; Steuhm, K. A.; Unruh, T.

    2012-07-01

    The Idaho National Laboratory (INL) is in the process of updating the various reactor physics modeling and simulation tools used to support operation and safety assurance of the Advanced Test Reactor (ATR). Key accomplishments so far have encompassed both computational as well as experimental work. A new suite of stochastic and deterministic transport theory based reactor physics codes and their supporting nuclear data libraries (HELIOS, KENO6/SCALE, NEWT/SCALE, ATTILA, and an extended implementation of MCNP5) has been installed at the INL. Corresponding models of the ATR and ATRC are now operational with all five codes, demonstrating the basic feasibility of the new code packages for their intended purposes. On the experimental side of the project, new hardware was fabricated, measurement protocols were finalized, and the first four of six planned physics code validation experiments based on neutron activation spectrometry have been conducted at the ATRC facility. Data analysis for the first three experiments, focused on characterization of the neutron spectrum in one of the ATR flux traps, has been completed. The six experiments will ultimately form the basis for flexible and repeatable ATR physics code validation protocols that are consistent with applicable national standards. (authors)

  18. SWeRF—A Method for Estimating the Relevant Fine Particle Fraction in Bulk Materials for Classification and Labelling Purposes

    PubMed Central

    2014-01-01

    In accordance with the European regulation for classification, labelling and packaging of substances and mixtures (CLP), as well as the criteria set out in the Globally Harmonized System (GHS), the fine fraction of crystalline silica (CS) has been classified for specific target organ toxicity, the target organ in this case being the lung. Generic cut-off values for products containing a fine fraction of CS trigger the need for a method to quantify the fine fraction of CS in bulk materials. This article describes the so-called SWeRF method, the size-weighted relevant fine fraction. The SWeRF method combines the particle size distribution of a powder with probability factors from the EN 481 standard and allows the relevant fine fraction of a material to be calculated. The SWeRF method has been validated with a number of industrial minerals. This will enable manufacturers and blenders to apply the CLP and GHS criteria for the classification of mineral products containing a fine fraction of CS (respirable crystalline silica, RCS). PMID:24389081
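The SWeRF calculation described above can be sketched numerically: each size bin of the measured particle size distribution is weighted by the probability that a particle of that aerodynamic diameter is respirable. Below is a minimal Python sketch, assuming the EN 481 respirable convention (inhalable fraction times the complement of a lognormal CDF with median 4.25 μm and GSD 1.5); the example PSD is invented for illustration and is not from the paper.

```python
import math

def inhalable_fraction(d_um):
    """EN 481 inhalable convention, as a fraction (0..1)."""
    return 0.5 * (1.0 + math.exp(-0.06 * d_um))

def respirable_fraction(d_um):
    """EN 481 respirable convention: inhalable fraction times the
    complement of a lognormal CDF (median 4.25 um, GSD 1.5)."""
    z = math.log(d_um / 4.25) / math.log(1.5)
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return inhalable_fraction(d_um) * (1.0 - cdf)

def swerf(sizes_um, mass_fractions):
    """Size-weighted relevant fine fraction: PSD mass fractions
    weighted by the respirable probability at each bin midpoint."""
    total = sum(mass_fractions)
    return sum(w * respirable_fraction(d)
               for d, w in zip(sizes_um, mass_fractions)) / total

# Illustrative PSD only: bin midpoints (um) and mass fractions
sizes = [1.0, 2.0, 4.0, 8.0, 16.0]
fractions = [0.05, 0.10, 0.20, 0.30, 0.35]
print(f"SWeRF = {swerf(sizes, fractions):.3f}")
```

The weighting function falls steeply above a few micrometres, which is why coarse material contributes little to the relevant fine fraction.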

  19. A fast and reliable method for GHB quantitation in whole blood by GC-MS/MS (TQD) for forensic purposes.

    PubMed

    Castro, André L; Tarelho, Sónia; Dias, Mário; Reis, Flávio; Teixeira, Helena M

    2016-02-01

    Gamma-hydroxybutyric acid (GHB) is an endogenous compound with a history of clinical use since the 1960s. However, due to its secondary effects, it has become a controlled substance, entering the illicit market. A fully validated, sensitive and reproducible method for the quantification of GHB in whole blood by methanolic precipitation and GC-MS/MS (TQD) is presented. Using 100 μL of whole blood, the results included an LOD and LLOQ of 0.1 mg/L and a recovery of 86% over a working range of 0.1-100 mg/L. The method is sensitive and specific enough to detect the presence of GHB in small amounts of whole blood (both ante-mortem and post-mortem) and is, to the authors' knowledge, the first GC-MS/MS (TQD) method that uses different precursor and product ions for the identification of GHB and GHB-D6 (internal standard). Hence, this method may be especially useful for the study of endogenous values in this biological sample. PMID:26678181
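Quantification against a deuterated internal standard such as GHB-D6 typically proceeds by fitting a line of analyte/IS peak-area ratio versus calibrator concentration and back-calculating unknowns from it. A minimal sketch under that assumption; the calibrator levels span the reported working range, but the response ratios are made up for illustration and are not the paper's data.

```python
# Hypothetical internal-standard calibration (illustrative data only).
def fit_line(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

# Calibrator concentrations (mg/L) spanning the reported 0.1-100 mg/L range
conc  = [0.1, 0.5, 1, 5, 10, 50, 100]
# Peak-area ratios GHB / GHB-D6 (synthetic, perfectly linear for the sketch)
ratio = [0.002, 0.010, 0.020, 0.100, 0.200, 1.000, 2.000]

slope, intercept = fit_line(conc, ratio)

def quantify(sample_ratio):
    """Back-calculate concentration from a measured area ratio."""
    return (sample_ratio - intercept) / slope

print(quantify(0.400))  # ratio 0.400 -> 20 mg/L with this synthetic curve
```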

  20. Validation of the Mass-Extraction-Window for Quantitative Methods Using Liquid Chromatography High Resolution Mass Spectrometry.

    PubMed

    Glauser, Gaétan; Grund, Baptiste; Gassner, Anne-Laure; Menin, Laure; Henry, Hugues; Bromirski, Maciej; Schütz, Frédéric; McMullen, Justin; Rochat, Bertrand

    2016-03-15

    A paradigm shift is underway in the field of quantitative liquid chromatography-mass spectrometry (LC-MS) analysis thanks to the arrival of recent high-resolution mass spectrometers (HRMS). The capability of HRMS to perform sensitive and reliable quantifications of a large variety of analytes in HR-full scan mode is showing that it is now realistic to perform quantitative and qualitative analysis with the same instrument. Moreover, HR-full scan acquisition offers a global view of sample extracts and allows retrospective investigations as virtually all ionized compounds are detected with a high sensitivity. In time, the versatility of HRMS together with the increasing need for relative quantification of hundreds of endogenous metabolites should promote a shift from triple-quadrupole MS to HRMS. However, a current "pitfall" in quantitative LC-HRMS analysis is the lack of HRMS-specific guidance for validated quantitative analyses. Indeed, false positive and false negative HRMS detections are rare, albeit possible, if inadequate parameters are used. Here, we investigated two key parameters for the validation of LC-HRMS quantitative analyses: the mass accuracy (MA) and the mass-extraction-window (MEW) that is used to construct the extracted-ion-chromatograms. We propose MA-parameters, graphs, and equations to calculate rational MEW width for the validation of quantitative LC-HRMS methods. MA measurements were performed on four different LC-HRMS platforms. Experimentally determined MEW values ranged between 5.6 and 16.5 ppm and depended on the HRMS platform, its working environment, the calibration procedure, and the analyte considered. The proposed procedure provides a fit-for-purpose MEW determination and prevents false detections. PMID:26836506
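Constructing an extracted-ion chromatogram from a mass-extraction-window boils down to keeping, per scan, only the centroids within a ppm tolerance of the target m/z. A Python sketch, assuming the MEW is a full width in ppm centered on the target (the paper's exact windowing convention may differ); the scan data are invented for illustration.

```python
def mew_bounds(target_mz, mew_ppm):
    """m/z bounds of the extraction window, treating the MEW as a
    full width in ppm centered on the target (an assumption here)."""
    half = target_mz * mew_ppm / 2e6
    return target_mz - half, target_mz + half

def extract_ion_chromatogram(scans, target_mz, mew_ppm):
    """Sum centroid intensities falling inside the window, per scan.
    `scans` is a list of (retention_time, [(mz, intensity), ...])."""
    lo, hi = mew_bounds(target_mz, mew_ppm)
    return [(rt, sum(i for mz, i in peaks if lo <= mz <= hi))
            for rt, peaks in scans]

# Illustrative scan: one centroid 4 ppm off target, one 20 ppm off
scans = [(0.5, [(300.0012, 1000.0), (300.0060, 500.0)])]
xic_10ppm = extract_ion_chromatogram(scans, 300.0, 10.0)  # window: +/-5 ppm
print(xic_10ppm)  # only the 4 ppm centroid survives
```

A window that is too narrow relative to the instrument's mass accuracy drops true signal (false negatives); one that is too wide admits interferences (false positives), which is why the paper ties the MEW to measured mass accuracy.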

  1. Validated spectrophotometric methods for simultaneous determination of Omeprazole, Tinidazole and Doxycycline in their ternary mixture.

    PubMed

    Lotfy, Hayam M; Hegazy, Maha A; Mowaka, Shereen; Mohamed, Ekram Hany

    2016-01-15

    A comparative study of smart spectrophotometric techniques for the simultaneous determination of Omeprazole (OMP), Tinidazole (TIN) and Doxycycline (DOX) without prior separation steps is presented. These techniques consist of several consecutive steps utilizing zero-order, ratio, or derivative spectra. The proposed techniques comprise nine different methods, namely direct spectrophotometry, dual wavelength, first derivative-zero crossing, amplitude factor, spectrum subtraction, ratio subtraction, derivative ratio-zero crossing, constant center, and successive derivative ratio. The calibration graphs are linear over the concentration ranges of 1-20 μg/mL, 5-40 μg/mL and 2-30 μg/mL for OMP, TIN and DOX, respectively. These methods were tested by analyzing synthetic mixtures of the above drugs and successfully applied to a commercial pharmaceutical preparation. The methods were validated according to ICH guidelines; accuracy, precision, and repeatability were found to be within acceptable limits. PMID:26322842

  2. Validated spectrofluorometric method for determination of gemfibrozil in self nanoemulsifying drug delivery systems (SNEDDS)

    NASA Astrophysics Data System (ADS)

    Sierra Villar, Ana M.; Calpena Campmany, Ana C.; Bellowa, Lyda Halbaut; Trenchs, Monserrat Aróztegui; Naveros, Beatriz Clares

    2013-09-01

    A spectrofluorometric method has been developed and validated for the determination of gemfibrozil. The method is based on the excitation and emission capacities of gemfibrozil, with excitation and emission wavelengths of 276 and 304 nm, respectively. The method allows the determination of the drug in a self-nanoemulsifying drug delivery system (SNEDDS) designed to improve its intestinal absorption. The results showed linear relationships with good correlation coefficients (r2 > 0.999) and low limits of detection and quantification (LOD of 0.075 μg mL-1 and LOQ of 0.226 μg mL-1) in the range of 0.2-5 μg mL-1; the method likewise showed good robustness and stability. The amounts of gemfibrozil released from SNEDDS contained in gastro-resistant hard gelatine capsules were thus analysed, and release studies could be performed satisfactorily.
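Limits of detection and quantification from a calibration line are commonly estimated, per ICH Q2(R1), as LOD = 3.3σ/S and LOQ = 10σ/S, where σ is the residual standard deviation of the regression and S its slope. A small sketch with illustrative numbers (not the paper's regression output, though chosen to give values of similar magnitude):

```python
def lod_loq(sigma, slope):
    """ICH Q2(R1)-style estimates from a calibration line:
    LOD = 3.3*sigma/S, LOQ = 10*sigma/S."""
    return 3.3 * sigma / slope, 10.0 * sigma / slope

# Illustrative values: residual SD in fluorescence units,
# slope in fluorescence units per (ug/mL)
sigma, slope = 0.009, 0.40
lod, loq = lod_loq(sigma, slope)
print(f"LOD = {lod:.3f} ug/mL, LOQ = {loq:.3f} ug/mL")
```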

  3. IEEE/NASA Workshop on Leveraging Applications of Formal Methods, Verification, and Validation

    NASA Technical Reports Server (NTRS)

    Margaria, Tiziana (Editor); Steffen, Bernhard (Editor); Hinchey, Michael G.

    2005-01-01

    This volume contains the Preliminary Proceedings of the 2005 IEEE ISoLA Workshop on Leveraging Applications of Formal Methods, Verification, and Validation, with a special track on the theme of Formal Methods in Human and Robotic Space Exploration. The workshop was held on 23-24 September 2005 at the Loyola College Graduate Center, Columbia, MD, USA. The idea behind the workshop arose from the experience and feedback of ISoLA 2004, the 1st International Symposium on Leveraging Applications of Formal Methods, held in Paphos (Cyprus) the previous October-November. ISoLA 2004 met the need for a forum where developers, users, and researchers could discuss issues related to the adoption and use of rigorous tools and methods for the specification, analysis, verification, certification, construction, testing, and maintenance of systems from the point of view of their different application domains.

  4. Validated spectrophotometric methods for simultaneous determination of Omeprazole, Tinidazole and Doxycycline in their ternary mixture

    NASA Astrophysics Data System (ADS)

    Lotfy, Hayam M.; Hegazy, Maha A.; Mowaka, Shereen; Mohamed, Ekram Hany

    2016-01-01

    A comparative study of smart spectrophotometric techniques for the simultaneous determination of Omeprazole (OMP), Tinidazole (TIN) and Doxycycline (DOX) without prior separation steps is presented. These techniques consist of several consecutive steps utilizing zero-order, ratio, or derivative spectra. The proposed techniques comprise nine different methods, namely direct spectrophotometry, dual wavelength, first derivative-zero crossing, amplitude factor, spectrum subtraction, ratio subtraction, derivative ratio-zero crossing, constant center, and successive derivative ratio. The calibration graphs are linear over the concentration ranges of 1-20 μg/mL, 5-40 μg/mL and 2-30 μg/mL for OMP, TIN and DOX, respectively. These methods were tested by analyzing synthetic mixtures of the above drugs and successfully applied to a commercial pharmaceutical preparation. The methods were validated according to ICH guidelines; accuracy, precision, and repeatability were found to be within acceptable limits.

  5. Validating a nondestructive optical method for apportioning colored particulate matter into black carbon and additional components

    NASA Astrophysics Data System (ADS)

    Yan, Beizhan; Kennedy, Daniel; Miller, Rachel L.; Cowin, James P.; Jung, Kyung-hwa; Perzanowski, Matt; Balletta, Marco; Perera, Federica P.; Kinney, Patrick L.; Chillrud, Steven N.

    2011-12-01

    Exposure to black carbon (BC) is associated with a variety of adverse health outcomes. A number of optical methods for estimating BC on Teflon filters have been adopted, but most assume all light absorption is due to BC even though other sources of colored particulate matter exist. Recently, a four-wavelength optical reflectance measurement for distinguishing second-hand cigarette smoke (SHS) from soot-BC was developed (Brook et al., 2010; Lawless et al., 2004). However, the method has been validated for neither soot-BC nor SHS, and little work has been done on the methodological issues of optical reflectance measurements for samples that could contain SHS, BC, and other colored particles. We refined this method using a lab-modified integrating sphere with absorption measured continuously from 350 nm to 1000 nm. Furthermore, we characterized the absorption spectra of additional components of particulate matter (PM) on PM2.5 filters, including ammonium sulfate, hematite, goethite, and magnetite. Finally, we validated this method for BC by comparison to other standard methods. Use of synthesized data indicates that it is important to optimize the choice of wavelengths to minimize computational errors as additional components (more than 2) are added to the apportionment model of colored components. We found that substantial errors are introduced when using the 4 wavelengths suggested by Lawless et al. to quantify four substances, while an optimized choice of wavelengths can reduce model-derived error from over 10% to less than 2%. For environmental samples, the method was sensitive for estimating airborne levels of BC and SHS, but not mass loadings of iron oxides and sulfate. Duplicate samples collected in NYC show high reproducibility (points consistent with a 1:1 line, R2 = 0.95). BC data measured by this method were consistent with those measured by other optical methods, including Aethalometer and Smoke-stain Reflectometer (SSR); although the SSR loses sensitivity at
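The apportionment described above amounts to solving a small linear system: the measured absorbance at each chosen wavelength is modeled as a sum of component unit spectra times component loadings, and the conditioning of that system is what makes the wavelength choice matter. A hedged numpy sketch with invented unit spectra for two components (the real spectra and wavelengths would come from the calibration work in the paper):

```python
import numpy as np

# Rows: chosen wavelengths; columns: assumed unit absorption spectra of
# two colored components (e.g. soot-BC and SHS). Values are illustrative.
E = np.array([
    [0.90, 0.60],
    [0.70, 0.30],
    [0.55, 0.12],
    [0.45, 0.05],
])

true_loadings = np.array([2.0, 1.0])
absorbance = E @ true_loadings          # synthetic measured absorbances

# Apportion by linear least squares; the condition number indicates how
# strongly the wavelength choice amplifies measurement error.
loadings, *_ = np.linalg.lstsq(E, absorbance, rcond=None)
print(np.round(loadings, 6))            # recovers [2, 1] on noise-free data
print(f"condition number: {np.linalg.cond(E):.1f}")
```

With noisy data, a poorly conditioned wavelength set inflates the recovered loadings' errors, which mirrors the paper's finding that an optimized wavelength choice reduces model-derived error.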

  6. Validating a nondestructive optical method for apportioning colored particulate matter into black carbon and additional components

    PubMed Central

    Yan, Beizhan; Kennedy, Daniel; Miller, Rachel L.; Cowin, James P.; Jung, Kyung-hwa; Perzanowski, Matt; Balletta, Marco; Perera, Federica P.; Kinney, Patrick L.; Chillrud, Steven N.

    2011-01-01

    Exposure to black carbon (BC) is associated with a variety of adverse health outcomes. A number of optical methods for estimating BC on Teflon filters have been adopted, but most assume all light absorption is due to BC even though other sources of colored particulate matter exist. Recently, a four-wavelength optical reflectance measurement for distinguishing second-hand cigarette smoke (SHS) from soot-BC was developed (Brook et al., 2010; Lawless et al., 2004). However, the method has been validated for neither soot-BC nor SHS, and little work has been done on the methodological issues of optical reflectance measurements for samples that could contain SHS, BC, and other colored particles. We refined this method using a lab-modified integrating sphere with absorption measured continuously from 350 nm to 1000 nm. Furthermore, we characterized the absorption spectra of additional components of particulate matter (PM) on PM2.5 filters, including ammonium sulfate, hematite, goethite, and magnetite. Finally, we validated this method for BC by comparison to other standard methods. Use of synthesized data indicates that it is important to optimize the choice of wavelengths to minimize computational errors as additional components (more than 2) are added to the apportionment model of colored components. We found that substantial errors are introduced when using the 4 wavelengths suggested by Lawless et al. to quantify four substances, while an optimized choice of wavelengths can reduce model-derived error from over 10% to less than 2%. For environmental samples, the method was sensitive for estimating airborne levels of BC and SHS, but not mass loadings of iron oxides and sulfate. Duplicate samples collected in NYC show high reproducibility (points consistent with a 1:1 line, R2 = 0.95). BC data measured by this method were consistent with those measured by other optical methods, including Aethalometer and Smoke-stain Reflectometer (SSR); although the SSR loses sensitivity at

  7. Chemometric approach to open validation protocols: Prediction of validation parameters in multi-residue ultra-high performance liquid chromatography-tandem mass spectrometry methods.

    PubMed

    Alladio, Eugenio; Pirro, Valentina; Salomone, Alberto; Vincenti, Marco; Leardi, Riccardo

    2015-06-01

    The recent technological advancements of liquid chromatography-tandem mass spectrometry allow the simultaneous determination of tens, or even hundreds, of target analytes. In such cases, the traditional approach to quantitative method validation presents three major drawbacks: (i) it is extremely laborious, repetitive and rigid; (ii) it does not allow new target analytes to be introduced without restarting the validation from the beginning; and (iii) it is performed on spiked blank matrices, whose very nature is significantly modified by the addition of a large number of spiking substances, especially at high concentration. In the present study, several predictive chemometric models were developed from closed sets of analytes in order to estimate validation parameters for molecules of the same class not included in the original training set. Retention time, matrix effect, recovery, and detection and quantification limits were predicted with partial least squares regression. In particular, iterative stepwise elimination, iterative predictors weighting and genetic algorithm approaches were applied and compared to achieve effective variable selection. These procedures were applied to data reported in our previously validated ultra-high performance liquid chromatography-tandem mass spectrometry multi-residue method for the determination of pharmaceutical and illicit drugs in oral fluid samples in accordance with national and international guidelines. The partial least squares model was then successfully tested on naloxone and lormetazepam, in order to introduce these new compounds into the validated oral fluid method, which adopts reverse-phase chromatography. Retention time, matrix effect, recovery, limit of detection and limit of quantification for naloxone and lormetazepam were predicted by the model and compared favourably with their corresponding experimental values. The whole study represents a proof-of-concept of chemometrics potential to

  8. Experimental comparison and validation of hot-ball method with guarded hot plate method on polyurethane foams

    NASA Astrophysics Data System (ADS)

    Hudec, Ján; Glorieux, Christ; Dieška, Peter; Kubičár, Ľudovít

    2016-07-01

    The hot-ball method is an innovative transient method for measuring thermophysical properties. Its principle is based on heating a small ball, embedded in the measured medium, with constant heating power while simultaneously measuring the ball's temperature response from the moment heating is initiated. The shape of the temperature response depends on the thermophysical properties of the medium in which the sensor is placed. The method is patented by the Institute of Physics, SAS, where the method and sensors based on it are being developed. At the beginning of sensor development we focused on monitoring applications, where relative precision is much more important than accuracy. Meanwhile, the quality of the sensors has improved enough for a new application: absolute measurement of the thermophysical parameters of low-thermal-conductivity materials. This paper describes the experimental verification and validation of measurements by the hot-ball method. Thanks to cooperation with the Laboratory of Soft Matter and Biophysics of the Catholic University of Leuven in Belgium, the established guarded hot plate method was used as a reference. Details of the measuring setups, a description of the experiments and the results of the comparison are presented.
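In the idealized model commonly used for hot-ball sensors, a sphere heated at constant power q in an infinite medium reaches a steady-state temperature rise ΔT = q/(4πλr), so the medium's thermal conductivity can be read off as λ = q/(4πrΔT). A sketch of that working equation under those assumptions (real sensors require calibration, and the numbers below are illustrative, chosen to land near a typical polyurethane-foam conductivity):

```python
import math

def thermal_conductivity(power_w, radius_m, delta_T_steady_K):
    """Idealized hot-ball relation: for a constantly heated sphere in an
    infinite medium, dT_steady = q / (4*pi*lambda*r), hence
    lambda = q / (4*pi*r*dT_steady). Sketch only; uncalibrated."""
    return power_w / (4.0 * math.pi * radius_m * delta_T_steady_K)

# Illustrative numbers: 2 mW into a 1 mm radius ball, 5.3 K steady rise
lam = thermal_conductivity(0.002, 1.0e-3, 5.3)
print(f"lambda ~ {lam:.3f} W/(m*K)")  # ~0.03 W/(m*K), PU-foam territory
```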

  9. Bridging the Gap Between Validation and Implementation of Non-Animal Veterinary Vaccine Potency Testing Methods

    PubMed Central

    Dozier, Samantha; Brown, Jeffrey; Currie, Alistair

    2011-01-01

    Simple Summary Many vaccines are tested for quality in experiments that require the use of large numbers of animals in procedures that often cause significant pain and distress. Newer technologies have fostered the development of vaccine quality control tests that reduce or eliminate the use of animals, but the availability of these newer methods has not guaranteed their acceptance by regulators or use by manufacturers. We discuss a strategic approach that has been used to assess and ultimately increase the use of non-animal vaccine quality tests in the U.S. and U.K. Abstract In recent years, technologically advanced high-throughput techniques have been developed that replace, reduce or refine animal use in vaccine quality control tests. Following validation, these tests are slowly being accepted for use by international regulatory authorities. Because regulatory acceptance itself has not guaranteed that approved humane methods are adopted by manufacturers, various organizations have sought to foster the preferential use of validated non-animal methods by interfacing with industry and regulatory authorities. After noticing this gap between regulation and uptake by industry, we began developing a paradigm that seeks to narrow the gap and quicken implementation of new replacement, refinement or reduction guidance. A systematic analysis of our experience in promoting the transparent implementation of validated non-animal vaccine potency assays has led to the refinement of our paradigmatic process, presented here, by which interested parties can assess the local regulatory acceptance of methods that reduce animal use and integrate them into quality control testing protocols, or ensure the elimination of peripheral barriers to their use, particularly for potency and other tests carried out on production batches. PMID:26486625

  10. Development and Validation of a Computational Method for Assessment of Missense Variants in Hypertrophic Cardiomyopathy

    PubMed Central

    Jordan, Daniel M.; Kiezun, Adam; Baxter, Samantha M.; Agarwala, Vineeta; Green, Robert C.; Murray, Michael F.; Pugh, Trevor; Lebo, Matthew S.; Rehm, Heidi L.; Funke, Birgit H.; Sunyaev, Shamil R.

    2011-01-01

    Assessing the significance of novel genetic variants revealed by DNA sequencing is a major challenge to the integration of genomic techniques with medical practice. Many variants remain difficult to classify by traditional genetic methods. Computational methods have been developed that could contribute to classifying these variants, but they have not been properly validated and are generally not considered mature enough to be used effectively in a clinical setting. We developed a computational method for predicting the effects of missense variants detected in patients with hypertrophic cardiomyopathy (HCM). We used a curated clinical data set of 74 missense variants in six genes associated with HCM to train and validate an automated predictor. The predictor is based on support vector regression and uses phylogenetic and structural features specific to genes involved in HCM. Ten-fold cross validation estimated our predictor's sensitivity at 94% (95% confidence interval: 83%–98%) and specificity at 89% (95% confidence interval: 72%–100%). This corresponds to an odds ratio of 10 for a prediction of pathogenic (95% confidence interval: 4.0–infinity), or an odds ratio of 9.9 for a prediction of benign (95% confidence interval: 4.6–21). Coverage (proportion of variants for which a prediction was made) was 57% (95% confidence interval: 49%–64%). This performance exceeds that of existing methods that are not specifically designed for HCM. The accuracy of this predictor provides support for the clinical use of automated predictions alongside family segregation and population frequency data in the interpretation of new missense variants and suggests future development of similar tools for other diseases. PMID:21310275

  11. [Development and validation of method for the determination of cynarin, luteolin in plasma].

    PubMed

    Kulza, Maksymilian; Malinowska, Katarzyna; Woźniak, Anna; Seńczuk-Przybyłowska, Monika; Nowak, Gerard; Florek, Ewa

    2012-01-01

    The aim of this study was to develop and validate the method of cynarin and luteolin, the main constituents of artichoke (Cynara scolymus L.) leaf extract, determination in plasma. The compounds were separated using the high-performance liquid chromatography technique with diode array detection (HPLC-DAD). The analysis was preceded by liquid-liquid extraction using as the extracting agent ethyl acetate. The HPLC separation was performed on C18 column under gradient conditions using a mobile phase - 0,05% trifluoroacetic acid in water and methanol. The detector was set at lambda=330 nm. The validation was related to linearity, sensitivity (LOD and LOQ), accuracy and repeatability. In the validated method the linearity was achieved within concentration range 1,5625 - 50,0 microg/cm3 for the cynarin (R2=0,9989) and 1,5625 - 200,0 microg/cm3 for the luteolin (R2=0998). The limits of detection for cynarin and luteolin was: 0,75 microg/cm3 and 0,1 microg/cm3 and the limits of quatification: 2,25 microg/cm3 and 0,2 microg/cm3, respectively. Coefficient of variation for the inter-day and the intra-day analysis, which is a precision and accuracy parameter, do not exceed 10%. Recovery was 67% for the cynarin and 96% for the luteolin. The practical application of this method was proved by analysis of plasma samples from rats. The animals were administrated artichoke leaf extract - orally and intraperitoneally at a dose of 3 g/kg body weight or pure substances - intraperitoneally at a dose 1 mg/kg of luteolin and 0,5 mg/kg of cynarin. The presence of investigated compounds was proved only in samples after intraperitoneal administration of pure substances. The developed method is used to determine simultaneously cynarin and luteolin, after intraperitoneal administration of pure compounds. PMID:23421076

  12. AAPS and US FDA Crystal City VI workshop on bioanalytical method validation for biomarkers.

    PubMed

    Lowes, Steve; Ackermann, Bradley L

    2016-02-01

    Crystal City VI Workshop on Bioanalytical Method Validation of Biomarkers, Renaissance Baltimore Harborplace Hotel, Baltimore, MD, USA, 28-29 September 2015. The Crystal City VI workshop was organized by the American Association of Pharmaceutical Scientists in association with the US FDA to continue discussion on the bioanalysis of biomarkers. An outcome of the Crystal City V workshop, convened following release of the draft FDA Guidance for Industry on Bioanalytical Method Validation in 2013, was the need for further discussion of biomarker methods. Biomarkers ultimately became the sole focal point of Crystal City VI, a meeting attended by approximately 200 people and composed of industry scientists and regulators from around the world. The meeting format included several panel discussions to maximize the opportunity for dialogue among participants. Following an initial session on the general topic of biomarker assays and intended use, more focused sessions were held on chromatographic (LC-MS) and ligand-binding assays. In addition to participation by the drug development community, there was significant representation from clinical testing laboratories. The experience of this latter group, collectively identified as practitioners of CLIA (Clinical Laboratory Improvement Amendments), helped shape the discussion and takeaways from the meeting. While the need to operate within the framework of the current BMV guidance was clearly acknowledged, a general understanding that biomarker method validation cannot be adequately depicted by current PK-centric guidelines emerged as a consensus of the meeting. This report is not intended to constitute the official proceedings of Crystal City VI, which are expected to be published in early 2016. PMID:26795584

  13. Probability of identification: a statistical model for the validation of qualitative botanical identification methods.

    PubMed

    LaBudde, Robert A; Harnly, James M

    2012-01-01

    A qualitative botanical identification method (BIM) is an analytical procedure that returns a binary result (1 = Identified, 0 = Not Identified). A BIM may be used by a buyer, manufacturer, or regulator to determine whether a botanical material being tested is the same as the target (desired) material, or whether it contains excessive nontarget (undesirable) material. The report describes the development and validation of studies for a BIM based on the proportion of replicates identified, or probability of identification (POI), as the basic observed statistic. The statistical procedures proposed for data analysis follow closely those of the probability of detection, and harmonize the statistical concepts and parameters between quantitative and qualitative method validation. Use of POI statistics also harmonizes statistical concepts for botanical, microbiological, toxin, and other analyte identification methods that produce binary results. The POI statistical model provides a tool for graphical representation of response curves for qualitative methods, reporting of descriptive statistics, and application of performance requirements. Single collaborator and multicollaborative study examples are given. PMID:22468371
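The POI statistic itself is just a binomial proportion (replicates identified out of replicates tested), so descriptive statistics and interval estimates follow standard binomial machinery. A sketch using a Wilson score interval, which is a common choice for POD/POI-style statistics (an assumption here; the report's exact interval method may differ):

```python
import math

def poi(identified, replicates):
    """Probability of identification: proportion of replicates identified."""
    return identified / replicates

def wilson_interval(k, n, z=1.96):
    """Approximate 95% Wilson score interval for a binomial proportion."""
    p = k / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return center - half, center + half

# Example: 11 of 12 replicates identified at one concentration level
print(poi(11, 12))
print(wilson_interval(11, 12))
```

Plotting POI (with its interval) against concentration or adulterant level gives the response curves the report describes for qualitative methods.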

  14. Validation of the Hygicult E dipslides method in surface hygiene control: a Nordic collaborative study.

    PubMed

    Salo, Satu; Alanko, Timo; Sjöberg, Anna-Maija; Wirtanen, Gun

    2002-01-01

    A collaborative study with Enterobacteriaceae was conducted to validate Hygicult E dipslides by comparison with violet red bile glucose agar (VRBGA) contact plates and swabbing, using stainless steel surfaces artificially contaminated with microbes at various levels. Twelve laboratories participated in the validation procedure. The total number of collaborative samples was 108. The microbial level in each sample was assessed in triplicate by using the 3 above-mentioned methods. No Enterobacteriaceae were used at the low inoculation level. At the middle inoculation level, the percentages detached from the test surfaces were 16.6 with the Hygicult E method, 15.3 with the contact plate method, and 14.6 with swabbing; at the high inoculation level, the percentages were 14.5, 15.8, and 9.8, respectively. The percentage of acceptable results after the removal of outliers was 97.2. Repeatability relative standard deviations ranged from 33.4 to 44.9%; reproducibility relative standard deviations ranged from 45.2 to 77.1%. The Hygicult E dipslide, VRBGA contact plate, and swabbing methods gave similar results at all 3 microbial levels tested: <1.0 colony-forming units (CFU)/cm2 at the low level, 1.2-1.3 CFU/cm2 at the middle level (theoretical yield 8.0 CFU/cm2), and 1.2-2.0 CFU/cm2 at the high level (theoretical yield 12.5 CFU/cm2). PMID:11990024

  15. Validation of a UV Spectrometric Method for the Assay of Tolfenamic Acid in Organic Solvents

    PubMed Central

    Ahmed, Sofia; Mustaan, Nafeesa; Sheraz, Muhammad Ali; Nabi, Syeda Ayesha Ahmed un; Ahmad, Iqbal

    2015-01-01

    The present study was carried out to validate a UV spectrometric method for the assay of tolfenamic acid (TA) in organic solvents. TA is insoluble in water; therefore, a total of thirteen commonly used organic solvents in which the drug is soluble were selected. Fresh stock solutions of TA in each solvent at a concentration of 1 × 10−4 M (2.62 mg%) were prepared for the assay. The method has been validated according to the International Conference on Harmonization guideline, and parameters such as linearity, range, accuracy, precision, sensitivity, and robustness have been studied. Although the method was found to be efficient for the determination of TA in all solvents, on the basis of the statistical data 1-octanol, followed by ethanol and methanol, performed better than the other studied solvents. TA stock solutions in each solvent were stable for 24 hours, whether stored at room temperature (25 ± 1°C) or refrigerated (2-8°C). A shift in the absorption maxima was observed for TA in the various solvents, indicating drug-solvent interactions. The studied method is simple, rapid, economical, accurate, and precise for the assay of TA in different organic solvents. PMID:26783497
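The stated stock concentration can be checked by unit conversion: 1 × 10⁻⁴ mol/L of tolfenamic acid (C14H12ClNO2, molar mass about 261.7 g/mol) corresponds to 26.17 mg/L, i.e. 2.62 mg per 100 mL (mg%), matching the abstract. A one-function sketch of the arithmetic:

```python
def molar_to_mg_percent(molarity_mol_per_l, molar_mass_g_per_mol):
    """Convert mol/L to mg per 100 mL (mg%)."""
    mg_per_l = molarity_mol_per_l * molar_mass_g_per_mol * 1000.0
    return mg_per_l / 10.0  # 1 L = 10 x 100 mL

# Tolfenamic acid C14H12ClNO2, molar mass ~261.7 g/mol
conc = molar_to_mg_percent(1e-4, 261.7)
print(f"{conc:.2f} mg%")  # 2.62 mg%, as stated in the abstract
```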

  16. Determination of phosphine in plant materials: method optimization and validation in interlaboratory comparison tests.

    PubMed

    Amrein, Thomas M; Ringier, Lara; Amstein, Nathalie; Clerc, Laurence; Bernauer, Sabine; Baumgartner, Thomas; Roux, Bernard; Stebler, Thomas; Niederer, Markus

    2014-03-01

    The optimization and validation of a method for the determination of phosphine in plant materials are described. The method is based on headspace sampling over the sample heated in 5% sulfuric acid. Critical factors such as sample amount, equilibration conditions, method of quantitation, and matrix effects are discussed, and validation data are presented. Grinding of coarse samples does not lead to lower results and is a prerequisite for standard addition experiments, which are the most reliable approach to quantitation because of notable matrix effects. Two interlaboratory comparisons showed that results varied considerably and that an uncertainty of measurement of about 50% must be assumed. Flame photometric and mass spectrometric detection gave similar results. The proposed method is well reproducible within one laboratory, and results from the authors' laboratories using different injection and detection techniques are very close to each other. The considerable variation in the interlaboratory comparisons shows that this analysis is still challenging in practice and that further proficiency testing is needed. PMID:24564743
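Quantitation by standard addition, which the abstract identifies as the most reliable approach given the matrix effects, works by spiking the sample with known analyte amounts, fitting response versus added amount, and extrapolating the line back to its x-intercept. A minimal sketch with invented spike levels and responses (not the paper's data):

```python
def standard_addition(added, response):
    """Fit response vs. added-analyte amount by least squares; the
    unspiked analyte content is the magnitude of the x-intercept,
    i.e. intercept/slope."""
    n = len(added)
    mx, my = sum(added) / n, sum(response) / n
    sxx = sum((x - mx) ** 2 for x in added)
    sxy = sum((x - mx) * (y - my) for x, y in zip(added, response))
    slope = sxy / sxx
    intercept = my - slope * mx
    return intercept / slope

# Illustrative spikes (ug phosphine added) and detector responses
added    = [0.0, 0.5, 1.0, 2.0]
response = [0.40, 0.60, 0.80, 1.20]   # linear: slope 0.4, intercept 0.40
print(standard_addition(added, response))  # -> 1.0 ug in the sample
```

Because the calibration line is built in the sample's own matrix, the matrix effect cancels, which is why the approach tolerates the suppression/enhancement the abstract reports.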

  17. Development and validation of a heated canister-based source sampling method

    SciTech Connect

    Crawford, R.J.; Elam, D.L.

    1994-12-31

    In response to the Clean Air Act Amendments of 1990, the US pulp and paper industry, through the American Forest and Paper Association (AF&PA), has instituted a program to characterize hazardous air pollutant (HAP) emissions from a variety of sources at 16 facilities. To meet some of the specific needs of this program, a method based on EPA Method 18 has been developed that uses a heated sampling system to transfer source gas samples to a heated, SUMMA-polished stainless steel canister. After sampling, the canister is kept hot in an insulated box and transferred to an on-site mobile laboratory. All of the analytical system components are also heated so that moisture cannot condense in the sample before it is analyzed. An initial mill screening study, a laboratory evaluation/validation, and an EPA Method 301 validation on pulp mill sources have all been completed with acceptable results. This method is being used to quantitate 26 VOCs, e.g., methanol, acetone, methylene chloride, chloroform, benzene, methyl ethyl ketone, and methyl isobutyl ketone.

  18. Development and Validation of Stability-indicating HPLC Method for Simultaneous Estimation of Cefixime and Linezolid

    PubMed Central

    Patel, Nidhi S.; Tandel, Falguni B.; Patel, Yogita D.; Thakkar, Kartavya B.

    2014-01-01

    A stability-indicating reverse phase high performance liquid chromatography method was developed and validated for cefixime and linezolid. The wavelength selected for quantitation was 276 nm. The method has been validated for linearity, accuracy, precision, robustness, limit of detection and limit of quantitation. Linearity was observed in the concentration range of 2-12 μg/ml for cefixime and 6-36 μg/ml for linezolid. Separation was achieved on a Phenomenex Luna C18 (250×4.6 mm, 5 μm) column using phosphate buffer (pH 7):methanol (60:40 v/v) as the mobile phase at a flow rate of 1 ml/min. The retention times of cefixime and linezolid were found to be 3.127 min and 11.986 min, respectively. During forced degradation, the drug product was exposed to hydrolysis (acid and base), H2O2, thermal degradation and photodegradation. The degradation was found to be 10 to 20% for both cefixime and linezolid under the given conditions. The method specifically estimates both drugs in the presence of all the degradants generated during the forced degradation study. The developed method is simple, specific and economic, and can be used for simultaneous estimation of cefixime and linezolid in tablet dosage form. PMID:25593387

  20. Quantification of histone modifications by parallel-reaction monitoring: a method validation.

    PubMed

    Sowers, James L; Mirfattah, Barsam; Xu, Pei; Tang, Hui; Park, In Young; Walker, Cheryl; Wu, Ping; Laezza, Fernanda; Sowers, Lawrence C; Zhang, Kangling

    2015-10-01

    Abnormal epigenetic reprogramming is one of the major causes of irregular gene expression and regulatory pathway perturbations in cells, resulting in unhealthy cell development or disease. Accurate measurement of these epigenetic changes, especially complex histone modifications, is very important, and the methods for such measurements are not trivial. Following our previous introduction of PRM for targeting histone modifications (Tang, H.; Fang, H.; Yin, E.; Brasier, A. R.; Sowers, L. C.; Zhang, K. Multiplexed parallel reaction monitoring targeting histone modifications on the QExactive mass spectrometer. Anal. Chem. 2014, 86 (11), 5526-34), herein we validated this method by varying the protein/trypsin ratios via serial dilutions. Our data demonstrated that PRM with SILAC histones as internal standards allowed reproducible measurements of histone H3/H4 acetylation and methylation in samples whose histone contents differed by at least one order of magnitude. The method was further validated with histones isolated from histone H3 K36 trimethyltransferase SETD2 knockout mouse embryonic fibroblast (MEF) cells. Furthermore, histone acetylation and methylation in human neural stem cells (hNSC) treated with ascorbic acid phosphate (AAP) were measured by this method, revealing that H3 K36 trimethylation was significantly down-regulated by 6 days of treatment with vitamin C. PMID:26356480

  1. Development and Validation of RP-HPLC Method for the Estimation of Ivabradine Hydrochloride in Tablets

    PubMed Central

    Seerapu, Sunitha; Srinivasan, B. P.

    2010-01-01

    A simple, sensitive, precise and robust reverse-phase high-performance liquid chromatographic method for analysis of ivabradine hydrochloride in pharmaceutical formulations was developed and validated as per ICH guidelines. The separation was performed on an SS Wakosil C18AR, 250×4.6 mm, 5 μm column with methanol:25 mM phosphate buffer (60:40 v/v), adjusted to pH 6.5 with orthophosphoric acid added dropwise, as the mobile phase. A well-defined chromatographic peak of ivabradine hydrochloride was exhibited, with a retention time of 6.55±0.05 min and a tailing factor of 1.14, at a flow rate of 0.8 ml/min and ambient temperature when monitored at 285 nm. The linear regression analysis data for the calibration plots showed a good linear relationship, with R=0.9998, in the concentration range of 30-210 μg/ml. The method was validated for precision, recovery and robustness. Intra- and inter-day precision (% relative standard deviation) were always less than 2%. The method showed mean recoveries of 99.00 and 98.55% for Ivabrad and Inapure tablets, respectively. The proposed method has been successfully applied to the commercial tablets without any interference from excipients. PMID:21695008
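    The linearity claim above (R = 0.9998 over 30-210 μg/ml) rests on an ordinary least-squares calibration plot. A minimal sketch of how such a correlation coefficient is computed, using hypothetical calibration points rather than the paper's data:

```python
def correlation_coefficient(x, y):
    """Pearson r for a calibration plot (detector response vs. concentration)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

# Hypothetical calibration points over the reported 30-210 μg/ml range.
conc = [30, 60, 90, 120, 150, 180, 210]     # μg/ml
area = [102, 201, 299, 405, 498, 601, 702]  # peak area (arbitrary units)
r = correlation_coefficient(conc, area)
print(round(r, 4))  # very close to 1 for a linear calibration
```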

  2. Validation of an Association Rule Mining-Based Method to Infer Associations Between Medications and Problems

    PubMed Central

    Wright, A.; McCoy, A.; Henkin, S.; Flaherty, M.; Sittig, D.

    2013-01-01

    Background In a prior study, we developed methods for automatically identifying associations between medications and problems using association rule mining on a large clinical data warehouse, and validated these methods at a single site that used a self-developed electronic health record. Objective To demonstrate the generalizability of these methods by validating them at an external site. Methods We received data on medications and problems for 263,597 patients from the University of Texas Health Science Center at Houston Faculty Practice, an ambulatory practice that uses the Allscripts Enterprise commercial electronic health record product. We then conducted association rule mining to identify associated pairs of medications and problems, characterized these associations with five measures of interestingness (support, confidence, chi-square, interest and conviction), and compared the top-ranked pairs to a gold standard. Results 25,088 medication-problem pairs were identified that exceeded our confidence and support thresholds. An analysis of the top 500 pairs according to each measure of interestingness showed a high degree of accuracy for highly ranked pairs. Conclusion The same technique was successfully employed at the University of Texas, and accuracy was comparable to our previous results. Top associations included many medications that are highly specific for a particular problem, as well as a large number of common, accurate medication-problem pairs that reflect practice patterns. PMID:23650491
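    The interestingness measures named in the abstract are standard association rule statistics. A small sketch (with hypothetical counts; chi-square, the fifth measure, is omitted for brevity) of how support, confidence, interest (lift) and conviction are computed for a rule "medication A → problem B":

```python
def interestingness(n_ab, n_a, n_b, n_total):
    """Support, confidence, interest (lift) and conviction for a rule A -> B,
    computed from co-occurrence counts over n_total patients."""
    p_b = n_b / n_total
    support = n_ab / n_total                 # P(A and B)
    confidence = n_ab / n_a                  # P(B | A)
    interest = confidence / p_b              # lift: >1 means positive association
    conviction = (1 - p_b) / (1 - confidence) if confidence < 1 else float("inf")
    return support, confidence, interest, conviction

# Hypothetical counts: of 10,000 patients, 1,000 take drug A, 1,500 have
# problem B, and 900 have both.
s, c, i, v = interestingness(n_ab=900, n_a=1000, n_b=1500, n_total=10000)
print(round(s, 4), round(c, 4), round(i, 4), round(v, 4))  # → 0.09 0.9 6.0 8.5
```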

  3. Improvement and validation of the method to determine neutral detergent fiber in feed.

    PubMed

    Hiraoka, Hisaaki; Fukunaka, Rie; Ishikuro, Eiichi; Enishi, Osamu; Goto, Tetsuhisa

    2012-10-01

    To improve the performance of the analytical method for neutral detergent fiber in feed with heat-stable α-amylase treatment (aNDFom), the process of adding the heat-stable α-amylase, as well as other analytical conditions, was examined. In the new process, starch in the samples was removed by adding amylase to the neutral detergent (ND) solution twice: just after the start of heating and immediately after refluxing. We also examined the effects of the use of sodium sulfite and of the drying and ashing conditions for aNDFom analysis by this modified amylase-addition method. A collaborative study to validate the new method was carried out with 15 laboratories, which analyzed two samples, alfalfa pellet and dairy mixed feed, as blind duplicates. Ten laboratories used a conventional apparatus and five used a Fibertec(®)-type apparatus; there were no significant differences in aNDFom values between the two refluxing apparatuses. The aNDFom values in alfalfa pellet and dairy mixed feed were 388 g/kg and 145 g/kg, the coefficients of variation for repeatability and reproducibility (CVr and CVR) were 1.3% and 2.9%, and the HorRat values were 0.8 and 1.1, respectively. The new method was validated with 5.8% uncertainty (k = 2) from the collaborative study. PMID:23035708
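    The HorRat values quoted above compare the observed among-laboratory RSD to the RSD predicted by the Horwitz equation. A minimal sketch with illustrative numbers (not an attempt to reproduce the study's exact figures, which depend on details not given in the abstract):

```python
def horrat(rsd_r_percent, concentration_mass_fraction):
    """HorRat = observed reproducibility RSD / RSD predicted by the Horwitz
    equation, PRSD(%) = 2 * C**(-0.1505), where C is the analyte concentration
    expressed as a dimensionless mass fraction. Values near 1 indicate
    reproducibility typical for the concentration level."""
    prsd = 2 * concentration_mass_fraction ** -0.1505
    return rsd_r_percent / prsd

# Illustrative: an analyte at 388 g/kg (C = 0.388) with an observed CV_R of 2.9%.
print(round(horrat(2.9, 0.388), 2))  # → 1.26
```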

  4. Risk-based criteria to support validation of detection methods for drinking water and air.

    SciTech Connect

    MacDonell, M.; Bhattacharyya, M.; Finster, M.; Williams, M.; Picel, K.; Chang, Y.-S.; Peterson, J.; Adeshina, F.; Sonich-Mullin, C.; Environmental Science Division; EPA

    2009-02-18

    This report was prepared to support the validation of analytical methods for threat contaminants under the U.S. Environmental Protection Agency (EPA) National Homeland Security Research Center (NHSRC) program. It is designed to serve as a resource for certain applications of benchmark and fate information for homeland security threat contaminants. The report identifies risk-based criteria from existing health benchmarks for drinking water and air for potential use as validation targets. The focus is on benchmarks for chronic public exposures. The priority sources are standard EPA concentration limits for drinking water and air, along with oral and inhalation toxicity values. Many contaminants identified as homeland security threats to drinking water or air would convert to other chemicals within minutes to hours of being released. For this reason, a fate analysis has been performed to identify potential transformation products and removal half-lives in air and water so appropriate forms can be targeted for detection over time. The risk-based criteria presented in this report to frame method validation are expected to be lower than actual operational targets based on realistic exposures following a release. Note that many target criteria provided in this report are taken from available benchmarks without assessing the underlying toxicological details. That is, although the relevance of the chemical form and analogues are evaluated, the toxicological interpretations and extrapolations conducted by the authoring organizations are not. It is also important to emphasize that such targets in the current analysis are not health-based advisory levels to guide homeland security responses. This integrated evaluation of chronic public benchmarks and contaminant fate has identified more than 200 risk-based criteria as method validation targets across numerous contaminants and fate products in drinking water and air combined. The gap in directly applicable values is

  5. Validation of different spectrophotometric methods for determination of vildagliptin and metformin in binary mixture

    NASA Astrophysics Data System (ADS)

    Abdel-Ghany, Maha F.; Abdel-Aziz, Omar; Ayad, Miriam F.; Tadros, Mariam M.

    New, simple, specific, accurate, precise and reproducible spectrophotometric methods have been developed and subsequently validated for determination of vildagliptin (VLG) and metformin (MET) in binary mixture. A zero-order spectrophotometric method was the first method, used for determination of MET in the range of 2-12 μg mL⁻¹ by measuring the absorbance at 237.6 nm. The second method was a derivative spectrophotometric technique, utilized for determination of MET at 247.4 nm in the range of 1-12 μg mL⁻¹. A derivative ratio spectrophotometric method was the third technique, used for determination of VLG in the range of 4-24 μg mL⁻¹ at 265.8 nm. The fourth and fifth methods, adopted for determination of VLG in the range of 4-24 μg mL⁻¹, were ratio subtraction and mean centering spectrophotometric methods, respectively. All the results were statistically compared with the reported methods using one-way analysis of variance (ANOVA). The developed methods were satisfactorily applied to analysis of the investigated drugs and proved to be specific and accurate for their quality control in pharmaceutical dosage forms.

  6. Continental-scale Validation of MODIS-based and LEDAPS Landsat ETM+ Atmospheric Correction Methods

    NASA Technical Reports Server (NTRS)

    Ju, Junchang; Roy, David P.; Vermote, Eric; Masek, Jeffrey; Kovalskyy, Valeriy

    2012-01-01

    The potential of Landsat data processing to provide systematic continental-scale products has been demonstrated by several projects, including the NASA Web-enabled Landsat Data (WELD) project. The recent free availability of Landsat data increases the need for robust and efficient atmospheric correction algorithms applicable to large-volume Landsat data sets. This paper compares the accuracy of two Landsat atmospheric correction methods: a MODIS-based method and the Landsat Ecosystem Disturbance Adaptive Processing System (LEDAPS) method. Both methods are based on the 6SV radiative transfer code but have different atmospheric characterization approaches. The MODIS-based method uses the MODIS Terra derived dynamic aerosol type, aerosol optical thickness, and water vapor to atmospherically correct ETM+ acquisitions in each coincident orbit. The LEDAPS method uses aerosol characterizations derived independently from each Landsat acquisition, assumes a fixed continental aerosol type, and uses ancillary water vapor. Validation results are presented comparing ETM+ atmospherically corrected data generated using these two methods with AERONET-corrected ETM+ data for 95 subsets of 10 km × 10 km at 30 m resolution, a total of nearly 8 million 30 m pixels, located across the conterminous United States. The results indicate that the MODIS-based method has better accuracy than the LEDAPS method for the ETM+ red and longer-wavelength bands.

  7. Development and Validation of Spectrophotometric, Atomic Absorption and Kinetic Methods for Determination of Moxifloxacin Hydrochloride

    PubMed Central

    Abdellaziz, Lobna M.; Hosny, Mervat M.

    2011-01-01

    Three simple spectrophotometric and atomic absorption spectrometric methods are developed and validated for the determination of moxifloxacin HCl in pure form and in pharmaceutical formulations. Method (A) is a kinetic method based on the oxidation of moxifloxacin HCl by Fe3+ ion in the presence of 1,10-phenanthroline (o-phen). Method (B) describes spectrophotometric procedures for determination of moxifloxacin HCl based on its ability to reduce Fe(III) to Fe(II), which is rapidly converted to the corresponding stable coloured complex after reacting with 2,2′-bipyridyl (bipy). The formation of the tris-complexes formed in methods (A) and (B) was carefully studied, and their absorbances were measured at 510 and 520 nm, respectively. Method (C) is based on the formation of an orange-red ion-pair associate between the drug and bismuth(III) tetraiodide in acidic medium, which can be quantitatively determined by three different procedures. The formed precipitate is either filtered off, dissolved in acetone and quantified spectrophotometrically at 462 nm (Procedure 1), or decomposed by hydrochloric acid, with the bismuth content determined by direct atomic absorption spectrometry (Procedure 2); the residual unreacted metal complex in the filtrate is also determined through its metal content by an indirect atomic absorption spectrometric technique (Procedure 3). All the proposed methods were validated according to the International Conference on Harmonization (ICH) guidelines. The three proposed methods permit the determination of moxifloxacin HCl in the ranges of (0.8–6, 0.8–4) for methods A and B, and (16–96, 16–96 and 16–72) for procedures 1–3 in method C. The limits of detection and quantitation were calculated, and the precision of the methods was satisfactory; the values of relative standard deviation did not exceed 2%. The proposed methods were successfully applied to determine the drug in its pharmaceutical formulations.

  8. Development and Validation of Liquid Chromatographic Method for Estimation of Naringin in Nanoformulation

    PubMed Central

    Musmade, Kranti P.; Trilok, M.; Dengale, Swapnil J.; Bhat, Krishnamurthy; Reddy, M. S.; Musmade, Prashant B.; Udupa, N.

    2014-01-01

    A simple, precise, accurate, rapid, and sensitive reverse phase high performance liquid chromatography (RP-HPLC) method with UV detection has been developed and validated for quantification of naringin (NAR) in novel pharmaceutical formulation. NAR is a polyphenolic flavonoid present in most of the citrus plants having variety of pharmacological activities. Method optimization was carried out by considering the various parameters such as effect of pH and column. The analyte was separated by employing a C18 (250.0 × 4.6 mm, 5 μm) column at ambient temperature in isocratic conditions using phosphate buffer pH 3.5: acetonitrile (75 : 25% v/v) as mobile phase pumped at a flow rate of 1.0 mL/min. UV detection was carried out at 282 nm. The developed method was validated according to ICH guidelines Q2(R1). The method was found to be precise and accurate on statistical evaluation with a linearity range of 0.1 to 20.0 μg/mL for NAR. The intra- and interday precision studies showed good reproducibility with coefficients of variation (CV) less than 1.0%. The mean recovery of NAR was found to be 99.33 ± 0.16%. The proposed method was found to be highly accurate, sensitive, and robust. The proposed liquid chromatographic method was successfully employed for the routine analysis of said compound in developed novel nanopharmaceuticals. The presence of excipients did not show any interference on the determination of NAR, indicating method specificity. PMID:26556205

  9. Method for counting motor units in mice and validation using a mathematical model.

    PubMed

    Major, Lora A; Hegedus, Janka; Weber, Douglas J; Gordon, Tessa; Jones, Kelvin E

    2007-02-01

    Weakness and atrophy are clinical signs that accompany muscle denervation resulting from motor neuron disease, peripheral neuropathies, and injury. Advances in our understanding of the genetics and molecular biology of these disorders have led to the development of therapeutic alternatives designed to slow denervation and promote reinnervation. Preclinical in vitro research gave rise to the need for a method for measuring these effects in animal models. Our goal was to develop an efficient method to determine the number of motor neurons making functional connections to muscle in a transgenic mouse model of amyotrophic lateral sclerosis (ALS). We developed a novel protocol for motor unit number estimation (MUNE) using incremental stimulation. The method involves analysis of twitch waveforms using a new software program, ITS-MUNE, designed for interactive calculation of motor unit number. The method was validated by testing simulated twitch data from a mathematical model of the neuromuscular system. Computer simulations followed the same stimulus-response protocol and produced waveform data that were indistinguishable from experiments. We show that our MUNE protocol is valid, with high precision and small bias across a wide range of motor unit numbers. The method is especially useful for large muscle groups, where MUNE could not previously be done using manual methods. The results are reproducible across naïve and expert analysts, making the method suitable for easy implementation. The ITS-MUNE analysis method has the potential to quantitatively measure the progression of motor neuron diseases and therefore the efficacy of treatments designed to alleviate the pathologic processes of muscle denervation. PMID:17151224

  10. A novel method for extraction of a proteinous coagulant from Plantago ovata seeds for water treatment purposes.

    PubMed

    Ramavandi, Bahman; Hashemi, Seyedenayat; Kafaei, Raheleh

    2015-01-01

    Several chemicals have been applied in the process of coagulant extraction from herbal seeds, and the best extraction has been obtained in the presence of KCl or NaNO3 [1-3], and NaCl [4]. However, the main challenges posed by these methods of coagulant extraction are their relatively low efficiency for water treatment purposes and the formation of dissolved organic matter during the treatment process. In these methods the salts, which have a monovalent metal cation (Na(+) and K(+)), are deposited in the internal structure and pores of the coagulant, and may be useful for the coagulation/flocculation process. In this research, we found that the modified method produced a denser protein; therefore, the modified procedure was better than the older one for removal of turbidity and hardness from contaminated water. Here we describe a method where: •According to the Hardy-Schulze rule, we applied Fe(3+) ions instead of Na(+) and K(+) for the extraction of protein from Plantago ovata seeds.•The method was narrowed to extract protein by ethanol (defatting) and by ammonium acetate and CM-Sepharose (protein extraction).•Two consecutive elutriations of the crude extract were performed directly using 0.025-M FeCl3 and 0.05-M FeCl3 on the basis of ion-exchange processes. PMID:26150999

  12. Determination of benzimidazoles and levamisole residues in milk by liquid chromatography-mass spectrometry: screening method development and validation.

    PubMed

    Jedziniak, Piotr; Szprengier-Juszkiewicz, Teresa; Olejnik, Małgorzata

    2009-11-13

    A screening method for the determination of residues of 19 benzimidazoles (parent drugs and their metabolites) and levamisole in bovine milk has been developed and validated. Milk samples were extracted with ethyl acetate, and the sample extracts were cleaned up by liquid-liquid partitioning with hexane and acidic ethanol. Liquid chromatography-single-quadrupole mass spectrometry was used for the separation and determination of the analytes. The method was validated in bovine milk according to the CD 2002/657/EC criteria, with an alternative validation approach applied for the "sum MRL" substances. The method was successfully verified in a CRL proficiency test. PMID:19656518

  13. Volume and Surface Measurements from Tomographic Images: In Vivo Validation of an Unsupervised Method

    NASA Astrophysics Data System (ADS)

    Alyassin, Abdalmajeid Musa

    The maximum unit normal component method (MUNC) used for surface area measurement and the divergence theorem algorithm (DTA) used for volume measurement were evaluated in vitro and validated in vivo. To evaluate these methods in vitro, their accuracy and precision were investigated at varying conditions of signal-to-noise ratio (SNR), sampling, volume averaging, and orientation. These algorithms were also enhanced to provide interactive surface area and volume measurements for regions bounded by orthogonal cut planes. The in vitro evaluations showed that a minimum SNR of 6:1 was necessary to provide accurate surface area and volume measurements. This test also revealed surface area measurements were more sensitive to noise than volume measurement. Sampling tests showed that at least twelve samples across the shortest dimension of simulated objects are necessary to provide accurate surface area and volume measurements. Volume averaging tests, however, revealed that at least seven voxels across the diameter (51.44 mm) of a computed tomography wooden sphere image are necessary for accurate surface area and volume measurements. Orientation tests indicated that the accuracy of the measured surface area and volume was primarily dependent on the number of samples across the shortest dimension of the object. Interactive measurement tests proved that the enhanced algorithms can provide accurate and precise interactive surface area and volume measurements. To validate the investigated algorithms in vivo, an unsupervised method incorporating these algorithms was developed for measuring surface area and volume of the urinary bladder using dual-echo, T2-weighted magnetic resonance images. Accuracy and precision of the unsupervised method in estimating urine volumes in vivo for nine normal subjects were <5%. Results from in vitro evaluations show that volume measured using the DTA method compares well with the volume measured by a proven voxel counting method. The MUNC method
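    The divergence theorem algorithm (DTA) mentioned above turns a volume integral into a sum over the bounding surface. A generic sketch of this idea for a closed, consistently oriented triangle mesh (not the author's implementation), verified here on a unit cube:

```python
def mesh_volume(vertices, triangles):
    """Volume of a closed, consistently outward-oriented triangle mesh via the
    divergence theorem: V = (1/6) * |sum over faces of v0 . (v1 x v2)|,
    i.e. the sum of signed tetrahedra spanned by each face and the origin."""
    total = 0.0
    for a, b, c in triangles:
        (x0, y0, z0) = vertices[a]
        (x1, y1, z1) = vertices[b]
        (x2, y2, z2) = vertices[c]
        # scalar triple product v0 . (v1 x v2): 6x the signed tetra volume
        total += (x0 * (y1 * z2 - z1 * y2)
                  - y0 * (x1 * z2 - z1 * x2)
                  + z0 * (x1 * y2 - y1 * x2))
    return abs(total) / 6.0

# Unit cube with outward-facing (counter-clockwise from outside) triangles.
cube_vertices = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0),
                 (0, 0, 1), (1, 0, 1), (1, 1, 1), (0, 1, 1)]
cube_triangles = [(0, 2, 1), (0, 3, 2),   # bottom (z = 0)
                  (4, 5, 6), (4, 6, 7),   # top (z = 1)
                  (0, 1, 5), (0, 5, 4),   # front (y = 0)
                  (2, 3, 7), (2, 7, 6),   # back (y = 1)
                  (0, 4, 7), (0, 7, 3),   # left (x = 0)
                  (1, 2, 6), (1, 6, 5)]   # right (x = 1)
print(mesh_volume(cube_vertices, cube_triangles))  # → 1.0
```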

  14. Experimental methods to validate measures of emotional state and readiness for duty in critical operations.

    SciTech Connect

    Weston, Louise Marie

    2007-09-01

    A recent report on criticality accidents in nuclear facilities indicates that human error played a major role in a significant number of incidents with serious consequences and that some of these human errors may be related to the emotional state of the individual. A pre-shift test to detect a deleterious emotional state could reduce the occurrence of such errors in critical operations. The effectiveness of pre-shift testing is a challenge because of the need to gather predictive data in a relatively short test period and the potential occurrence of learning effects due to a requirement for frequent testing. This report reviews the different types of reliability and validity methods and testing and statistical analysis procedures to validate measures of emotional state. The ultimate value of a validation study depends upon the percentage of human errors in critical operations that are due to the emotional state of the individual. A review of the literature to identify the most promising predictors of emotional state for this application is highly recommended.

  15. Validation of a Method for Assessing Resident Physicians’ Quality Improvement Proposals

    PubMed Central

    Leenstra, James L.; Beckman, Thomas J.; Reed, Darcy A.; Mundell, William C.; Thomas, Kris G.; Krajicek, Bryan J.; Cha, Stephen S.; Kolars, Joseph C.

    2007-01-01

    BACKGROUND Residency programs involve trainees in quality improvement (QI) projects to evaluate competency in systems-based practice and practice-based learning and improvement. Valid approaches to assess QI proposals are lacking. OBJECTIVE We developed an instrument for assessing resident QI proposals—the Quality Improvement Proposal Assessment Tool (QIPAT-7)—and determined its validity and reliability. DESIGN QIPAT-7 content was initially obtained from a national panel of QI experts. Through an iterative process, the instrument was refined, pilot-tested, and revised. PARTICIPANTS Seven raters used the instrument to assess 45 resident QI proposals. MEASUREMENTS Principal factor analysis was used to explore the dimensionality of instrument scores. Cronbach’s alpha and intraclass correlations were calculated to determine internal consistency and interrater reliability, respectively. RESULTS QIPAT-7 items comprised a single factor (eigenvalue = 3.4) suggesting a single assessment dimension. Interrater reliability for each item (range 0.79 to 0.93) and internal consistency reliability among the items (Cronbach’s alpha = 0.87) were high. CONCLUSIONS This method for assessing resident physician QI proposals is supported by content and internal structure validity evidence. QIPAT-7 is a useful tool for assessing resident QI proposals. Future research should determine the reliability of QIPAT-7 scores in other residency and fellowship training programs. Correlations should also be made between assessment scores and criteria for QI proposal success such as implementation of QI proposals, resident scholarly productivity, and improved patient outcomes. PMID:17602270
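    Cronbach's alpha, used above as the internal consistency measure, is straightforward to compute from an item-by-subject score matrix. A minimal sketch with hypothetical ratings (not the study's data):

```python
def cronbach_alpha(scores):
    """scores: one row per subject, one column per instrument item.
    alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores),
    using population variances throughout."""
    k = len(scores[0])

    def pvar(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_vars = [pvar([row[i] for row in scores]) for i in range(k)]
    total_var = pvar([sum(row) for row in scores])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Hypothetical ratings: 4 proposals scored on 3 instrument items.
ratings = [[2, 3, 3], [4, 4, 5], [3, 4, 4], [5, 5, 5]]
print(round(cronbach_alpha(ratings), 3))  # → 0.953
```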

  16. Reliability and Validity of Prisoner Self-Reports Gathered Using the Life Event Calendar Method

    PubMed Central

    Sutton, James E.; Bellair, Paul E.; Kowalski, Brian R.; Light, Ryan; Hutcherson, Donald T.

    2013-01-01

    Data collection using the life event calendar method is growing, but reliability is not well established. We examine test-retest reliability of monthly self-reports of criminal behavior collected using a life event calendar from a random sample of minimum and medium security prisoners. Tabular analysis indicates substantial agreement between self-reports of drug dealing, property, and violent crime during a baseline interview (test) and a follow-up (retest) approximately three weeks later. Hierarchical analysis reveals that criminal activity reported during the initial test is strongly associated with responses given in the retest, and that the relationship varies only by the lag in days between the initial interview and the retest. Analysis of validity reveals that self-reported incarceration history is strongly predictive of official incarceration history although we were unable to address whether subjects could correctly identify the months they were incarcerated. African Americans and older subjects provide more valid responses but in practical terms the differences in validity are not large. PMID:24031156

  17. Validation of the intrinsic spatial efficiency method for non cylindrical homogeneous sources using MC simulation

    NASA Astrophysics Data System (ADS)

    Ortiz-Ramírez, Pablo; Ruiz, Andrés

    2016-07-01

    Monte Carlo simulation of gamma spectroscopy systems is common practice nowadays; the most popular codes for this purpose are MCNP and Geant4. The intrinsic spatial efficiency method is a general and absolute method to determine the absolute efficiency of a spectroscopy system for any extended source, but until now it had been demonstrated experimentally only for cylindrical sources. Because preparing sources of arbitrary shape is difficult, the simplest way to extend the validation is to simulate the spectroscopy system and the source. In this work we present the validation of the intrinsic spatial efficiency method for sources with different geometries and for photons with an energy of 661.65 keV. The simulation does not consider matrix effects (self-attenuation), so these results are only preliminary. The MC simulation is carried out using the FLUKA code, and the absolute efficiency of the detector is determined by two methods: the statistical count of the full energy peak (FEP) area (the traditional method) and the intrinsic spatial efficiency method. The results show total agreement between the absolute efficiencies determined by the two methods, with a relative bias of less than 1% in all cases.
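
The "traditional method" mentioned above reduces to a ratio of detected to emitted photons. A minimal sketch with a hypothetical ¹³⁷Cs measurement (the 0.851 emission probability of the 661.65 keV line is a standard nuclear-data value; the counts, activity, and live time are invented):

```python
def fep_absolute_efficiency(net_counts, activity_bq, emission_prob, live_time_s):
    """Absolute full-energy-peak efficiency: detected / emitted photons."""
    emitted = activity_bq * emission_prob * live_time_s
    return net_counts / emitted

# Hypothetical measurement: 5 kBq source, 10 min live time, 12000 net FEP counts
eff = fep_absolute_efficiency(12_000, 5_000, 0.851, 600)
```

The intrinsic spatial efficiency method instead builds the extended-source efficiency from a spatial efficiency map, which is what the simulation benchmarks this ratio against.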

  18. Validity and reliability of different kinematics methods used for bike fitting.

    PubMed

    Fonda, Borut; Sarabon, Nejc; Li, François-Xavier

    2014-01-01

    The most common bike fitting method to set the seat height is based on the knee angle when the pedal is in its lowest position, i.e. bottom dead centre (BDC). However, there is no consensus on which method should be used to measure the knee angle. Therefore, the first aim of this study was to compare three dynamic methods with each other and against a static method. The second aim was to test the intra-session reliability of the knee angle at BDC measured by the dynamic methods. Eleven cyclists performed five 3-min cycling trials: three at different seat heights (25°, 30° and 35° knee angle at BDC according to the static measure) and two at the preferred seat height. Thirteen infrared cameras (3D), a high-speed camera (2D), and an electrogoniometer were used to measure the knee angle during pedalling when the pedal was at BDC. Compared to 3D kinematics, all other methods statistically significantly underestimated the knee angle (P = 0.00; η² = 0.73). All three dynamic methods were found to be substantially different from the static measure (effect sizes between 0.4 and 0.6). All dynamic methods achieved good intra-session reliability. 2D kinematics is a valid tool for knee angle assessment during bike fitting; however, for higher precision one should apply a correction factor by adding 2.2° to the measured value. PMID:24499342
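
The knee angle in question is the angle at the knee marker between the hip and ankle markers. A 2D sketch with hypothetical marker coordinates (metres), including the +2.2° correction the authors recommend for 2D measurements:

```python
import math

def joint_angle(a, b, c):
    """Included angle at vertex b (degrees) formed by points a-b-c."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    cos_t = (v1[0] * v2[0] + v1[1] * v2[1]) / (math.hypot(*v1) * math.hypot(*v2))
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_t))))

# Hypothetical hip, knee, ankle marker positions at bottom dead centre
hip, knee, ankle = (0.00, 1.00), (0.30, 0.55), (0.25, 0.10)
flexion = 180.0 - joint_angle(hip, knee, ankle)   # knee flexion angle at BDC
corrected = flexion + 2.2                         # authors' 2D correction factor
```

With these coordinates the flexion lands near the 25–35° band used for the seat-height conditions above.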

  19. Development and Validation of New Spectrophotometric Methods to Determine Enrofloxacin in Pharmaceuticals

    NASA Astrophysics Data System (ADS)

    Rajendraprasad, N.; Basavaiah, K.

    2015-07-01

    Four spectrophotometric methods, based on oxidation with cerium(IV), are developed and investigated to determine EFX (enrofloxacin) in pure form and in dosage forms. The first and second methods (A and B) are direct: after the oxidation of EFX with cerium(IV) in acid medium, the absorbance of the reduced and of the unreacted oxidant is measured at 275 and 320 nm, respectively. In the third (C) and fourth (D) methods, after the reaction between EFX and the oxidant is complete, the surplus oxidant is treated with either N-phenylanthranilic acid (NPA) or Alizarin Red S (ARS) dye and the absorbance of the oxidized NPA or ARS is measured at 440 or 420 nm. The methods showed good linearity over the concentration ranges of 0.5-5.0, 1.25-12.5, 10.0-100.0, and 6.0-60.0 μg/ml for methods A, B, C, and D, respectively, with apparent molar absorptivity values of 4.42 × 10⁴, 8.7 × 10³, 9.31 × 10², and 2.28 × 10³ l/(mol·cm). The limits of detection (LOD) and quantification (LOQ), Sandell's sensitivity values, and other validation results are also reported. The proposed methods are successfully applied to determine EFX in pure form and in dosage forms.
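
Direct spectrophotometric methods like A and B rest on the Beer-Lambert law, A = εlc. A sketch converting an absorbance reading to a mass concentration; the ~359.4 g/mol molar mass of enrofloxacin is an approximate literature value and the absorbance reading is hypothetical:

```python
def conc_ug_per_ml(absorbance, molar_absorptivity_l_mol_cm, molar_mass_g_mol,
                   path_cm=1.0):
    """Beer-Lambert: A = eps * l * c  ->  c in mol/l, converted to ug/ml."""
    c_mol_l = absorbance / (molar_absorptivity_l_mol_cm * path_cm)
    return c_mol_l * molar_mass_g_mol * 1000.0   # g/l = mg/ml -> x1000 = ug/ml

# Method A (eps = 4.42e4 l/(mol*cm)) with a hypothetical absorbance of 0.5
c = conc_ug_per_ml(0.5, 4.42e4, 359.4)
```

The resulting concentration falls inside method A's stated 0.5–5.0 μg/ml linear range.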

  20. Validate or falsify: Lessons learned from a microscopy method claimed to be useful for detecting Borrelia and Babesia organisms in human blood.

    PubMed

    Aase, Audun; Hajdusek, Ondrej; Øines, Øivind; Quarsten, Hanne; Wilhelmsson, Peter; Herstad, Tove K; Kjelland, Vivian; Sima, Radek; Jalovecka, Marie; Lindgren, Per-Eric; Aaberge, Ingeborg S

    2016-06-01

    Background: A modified microscopy protocol (the LM-method) was used to demonstrate what was interpreted as Borrelia spirochetes, and later also Babesia sp., in peripheral blood from patients. The method gained much publicity but was not validated prior to publication; validation, using appropriate scientific methodology including a control group, became the purpose of this study. Methods: Blood from 21 patients previously interpreted as positive for Borrelia and/or Babesia infection by the LM-method and from 41 healthy controls without known history of tick bite was collected, blinded, and analysed for these pathogens by microscopy in two laboratories (by the LM-method and the conventional method, respectively), by PCR methods in five laboratories, and by serology in one laboratory. Results: Microscopy by the LM-method identified structures claimed to be Borrelia and/or Babesia in 66% of the blood samples of the patient group and in 85% of the healthy control group. Microscopy by the conventional method (performed for Babesia only) did not identify Babesia in any sample. PCR analysis detected Borrelia DNA in one sample of the patient group and in eight samples of the control group, whereas Babesia DNA was not detected in any of the blood samples. Conclusions: The structures interpreted as Borrelia and Babesia by the LM-method could not be verified by PCR; the method was thus falsified. This study underlines the importance of proper test validation before new or modified assays are introduced. PMID:27030913

  1. Development and Validation of Stability-Indicating Derivative Spectrophotometric Methods for Determination of Dronedarone Hydrochloride

    NASA Astrophysics Data System (ADS)

    Chadha, R.; Bali, A.

    2016-05-01

    Rapid, sensitive, cost-effective, and reproducible stability-indicating derivative spectrophotometric methods have been developed for the estimation of dronedarone HCl employing peak-zero (P-0) and peak-peak (P-P) techniques, and their stability-indicating potential was assessed in forced-degraded solutions of the drug. The methods were validated with respect to linearity, accuracy, precision, and robustness. Excellent linearity was observed at concentrations of 2-40 μg/ml (r² = 0.9986). LOD and LOQ values for the proposed methods ranged from 0.42-0.46 μg/ml and 1.21-1.27 μg/ml, respectively, and excellent recovery of the drug was obtained in the tablet samples (99.70 ± 0.84%).
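
LOD and LOQ values like those above are conventionally derived from the calibration line as 3.3σ/S and 10σ/S (σ: residual standard deviation, S: slope), per ICH guidance. A sketch with hypothetical calibration points:

```python
import statistics

def lod_loq(x, y):
    """ICH-style LOD/LOQ from a linear calibration: (3.3*sd/S, 10*sd/S)."""
    n = len(x)
    mx, my = statistics.fmean(x), statistics.fmean(y)
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    intercept = my - slope * mx
    rss = sum((b - (intercept + slope * a)) ** 2 for a, b in zip(x, y))
    sd = (rss / (n - 2)) ** 0.5          # residual standard deviation
    return 3.3 * sd / slope, 10 * sd / slope

# Hypothetical absorbance readings over a 2-40 ug/ml calibration range
x = [2, 5, 10, 20, 30, 40]
y = [0.051, 0.124, 0.248, 0.502, 0.748, 1.003]
lod, loq = lod_loq(x, y)
```

A perfectly linear calibration gives zero residuals and hence LOD = LOQ = 0; real scatter pushes both up in a fixed 3.3 : 10 ratio.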

  3. Antioxidant Activity and Validation of Quantification Method for Lycopene Extracted from Tomato.

    PubMed

    Cefali, Letícia Caramori; Cazedey, Edith Cristina Laignier; Souza-Moreira, Tatiana Maria; Correa, Marcos Antônio; Salgado, Hérida Regina Nunes; Isaac, Vera Lucia Borges

    2015-01-01

    Lycopene is a carotenoid found in tomatoes with potent antioxidant activity. The aims of the study were to obtain an extract containing lycopene from four types of tomatoes, to validate an HPLC quantification method for the extracts, and to assess their antioxidant activity. Results revealed that all the tomatoes analyzed contained lycopene and showed antioxidant activity; salad tomato presented the highest concentration of this carotenoid and the highest antioxidant activity. The quantification method exhibited linearity with a correlation coefficient of 0.9992. Tests for the assessment of precision, accuracy, and robustness achieved coefficients of variation of less than 5%. The LOD and LOQ were 0.0012 and 0.0039 μg/mL, respectively. Salad tomato can be used as a source of lycopene for the development of topical formulations, and based on the tests performed, the chosen method for the identification and quantification of lycopene was considered to be linear, precise, exact, selective, and robust. PMID:26525253

  4. A tutorial on the validation of qualitative methods: from the univariate to the multivariate approach.

    PubMed

    López, M Isabel; Callao, M Pilar; Ruisánchez, Itziar

    2015-09-01

    This tutorial provides an overview of the validation of qualitative analytical methods, with particular focus on their main performance parameters, for both univariate and multivariate methods. We discuss specific parameters (sensitivity, specificity, false positive and false negative rates), global parameters (efficiency, Youden's index and likelihood ratio), and those parameters that have a quantitative connotation, since they are usually associated with concentration values (decision limit, detection capability and unreliability region). Some methodologies that can be used to estimate these parameters are also described: the use of contingency tables for the specific and global parameters, and the performance characteristic curve (PCC) for those with a quantitative connotation. To date, the PCC has been less commonly used in multivariate methods. To illustrate the proposals summarized in this tutorial, two case studies are discussed at the end: one for a univariate qualitative analysis and the other for a multivariate one. PMID:26388364
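
The specific and global parameters listed in the tutorial all fall out of a single 2×2 contingency table. A sketch with hypothetical screening counts:

```python
def qualitative_performance(tp, fn, fp, tn):
    """Performance parameters of a qualitative method from a 2x2 table."""
    sens = tp / (tp + fn)                       # sensitivity (true positive rate)
    spec = tn / (tn + fp)                       # specificity (true negative rate)
    return {
        "sensitivity": sens,
        "specificity": spec,
        "false positive rate": 1 - spec,
        "false negative rate": 1 - sens,
        "efficiency": (tp + tn) / (tp + fn + fp + tn),
        "Youden index": sens + spec - 1,
        "likelihood ratio": sens / (1 - spec) if spec < 1 else float("inf"),
    }

# Hypothetical screening of 50 truly positive and 50 truly negative samples
params = qualitative_performance(tp=45, fn=5, fp=3, tn=47)
```

The concentration-linked parameters (decision limit, detection capability) need the performance characteristic curve, i.e. these same rates tracked as a function of concentration.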

  5. Validation of an HPLC method on short columns to assay ketoconazole and formaldehyde in shampoo.

    PubMed

    Nguyen, Minh Nguyet A; Tallieu, L; Plaizier-Vercammen, J; Massart, D L; Vander Heyden, Y

    2003-04-24

    An HPLC method to determine ketoconazole and formaldehyde simultaneously in an anti-dandruff shampoo, originally developed on a long column, was transferred to two short columns with similar stationary-phase properties but with at most 30% of the initial length. Using the conventional column as reference, the fast HPLC methods on the short columns were validated. The validation characteristics consisted of selectivity, linearity range, precision (repeatability and time-different intermediate precision), bias, and robustness. For the ketoconazole assay, linearity for peak area was found in the concentration range up to 0.20 mg/ml. For formaldehyde, two calibration ranges (0-10 × 10⁻⁵% and 0-10 × 10⁻⁴%) were linear for both peak area and peak height. The assays for both ketoconazole and formaldehyde in these ranges showed no bias and acceptable precision, although the precision found with the short columns was slightly worse than with the long one. The robustness tests were performed applying a Plackett-Burman design: for the ketoconazole assay, 6 factors were examined in a 12-experiment design, and for formaldehyde, 11 factors in 16 experiments. The methods were found to be robust. Despite the somewhat poorer precision, the transfer appears successful, and the assays obtained on the short columns are applicable for fast routine analysis. PMID:12852444
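
Plackett-Burman designs like those used for the robustness tests screen up to N−1 two-level factors in N runs. A sketch generating the standard 12-run design from its cyclic generator row (the 6-factor ketoconazole screening would use 6 of the 11 columns, with the remainder as dummy factors; the 16-run design for formaldehyde is a different construction):

```python
import numpy as np

def plackett_burman_12():
    """12-run Plackett-Burman design: 11 cyclic shifts of the standard
    generator row plus a final all-minus run."""
    gen = np.array([1, 1, -1, 1, 1, 1, -1, -1, -1, 1, -1])
    rows = [np.roll(gen, i) for i in range(11)] + [-np.ones(11, dtype=int)]
    return np.vstack(rows)

design = plackett_burman_12()   # rows: runs, columns: factor levels (+1 / -1)
```

The design is balanced (each column has six highs and six lows) and orthogonal, which is what lets main effects be estimated independently.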

  6. Simultaneous quantification of paracetamol, acetylsalicylic acid and papaverine with a validated HPLC method.

    PubMed

    Kalmár, Eva; Gyuricza, Anett; Kunos-Tóth, Erika; Szakonyi, Gerda; Dombi, György

    2014-01-01

    Combined drug products have the advantages of better patient compliance and possible synergic effects. The simultaneous application of several active ingredients at a time is therefore frequently chosen. However, the quantitative analysis of such medicines can be challenging. The aim of this study is to provide a validated method for the investigation of a multidose packed oral powder that contained acetylsalicylic acid, paracetamol and papaverine-HCl. Reversed-phase high-pressure liquid chromatography was used. The Agilent Zorbax SB-C18 column was found to be the most suitable of the three different stationary phases tested for the separation of the components of this sample. The key parameters in the method development (apart from the nature of the column) were the pH of the aqueous phase (set to 3.4) and the ratio of the organic (acetonitrile) and the aqueous (25 mM phosphate buffer) phases, which was varied from 7:93 (v/v) to 25:75 (v/v) in a linear gradient, preceded by an initial hold. The method was validated: linearity, precision (repeatability and intermediate precision), accuracy, specificity and robustness were all tested, and the results met the ICH guidelines. PMID:24344050

  7. Validated HPTLC Method for Quantification of Luteolin and Apigenin in Premna mucronata Roxb., Verbenaceae

    PubMed Central

    Patel, Nayan G.; Patel, Kalpana G.; Patel, Kirti V.; Gandhi, Tejal R.

    2015-01-01

    A simple, rapid, and precise high-performance thin-layer chromatographic method was developed for quantitative estimation of luteolin and apigenin in Premna mucronata Roxb., family Verbenaceae. Separation was performed on silica gel 60 F254 HPTLC plates using toluene : ethyl acetate : formic acid (6 : 4 : 0.3) as mobile phase for elution of markers from extract. The determination was carried out in fluorescence mode using densitometric absorbance-reflection mode at 366 nm for both luteolin and apigenin. The methanolic extract of Premna mucronata was found to contain 10.2 mg/g % luteolin and 0.165 mg/g % of apigenin. The method was validated in terms of linearity, LOD and LOQ, accuracy, precision, and specificity. The calibration curve was found to be linear between 200 and 1000 ng/band for luteolin and 50 and 250 ng/band for apigenin. For luteolin and apigenin, the limit of detection was found to be 42.6 ng/band and 7.97 ng/band while the limit of quantitation was found to be 129.08 ng/band and 24.155 ng/band, respectively. This developed validated method is capable of quantifying and resolving luteolin and apigenin and can be applicable for routine analysis of extract and plant as a whole. PMID:26421008

  8. Validated method for the quantification of free and total carnitine, butyrobetaine, and acylcarnitines in biological samples.

    PubMed

    Minkler, Paul E; Stoll, Maria S K; Ingalls, Stephen T; Kerner, Janos; Hoppel, Charles L

    2015-09-01

    A validated quantitative method for the determination of free and total carnitine, butyrobetaine, and acylcarnitines is presented. The versatile method has four components: (1) isolation using strong cation-exchange solid-phase extraction, (2) derivatization with pentafluorophenacyl trifluoromethanesulfonate, (3) sequential ion-exchange/reversed-phase (ultra) high-performance liquid chromatography [(U)HPLC] using a strong cation-exchange trap in series with a fused-core HPLC column, and (4) detection with electrospray ionization multiple reaction monitoring (MRM) mass spectrometry (MS). Standardized carnitine along with 65 synthesized, standardized acylcarnitines (including short-chain, medium-chain, long-chain, dicarboxylic, hydroxylated, and unsaturated acyl moieties) were used to construct multiple-point calibration curves, resulting in accurate and precise quantification. Separation of the 65 acylcarnitines was accomplished in a single chromatogram in as little as 14 min. Validation studies were performed showing a high level of accuracy, precision, and reproducibility. The method provides capabilities unavailable by tandem MS procedures, making it an ideal approach for confirmation of newborn screening results and for clinical and basic research projects, including treatment protocol studies, acylcarnitine biomarker studies, and metabolite studies using plasma, urine, tissue, or other sample matrixes. PMID:26270397

  9. Comparative Study in Laboratory Rats to Validate Sperm Quality Methods and Endpoints

    NASA Technical Reports Server (NTRS)

    Price, W. A.; Briggs, G. B.; Alexander, W. K.; Still, K. R.; Grasman, K. A.

    2000-01-01

    The Naval Health Research Center, Detachment (Toxicology) performs toxicity studies in laboratory animals to characterize the risk of exposure to chemicals of Navy interest. Research was conducted at the Toxicology Detachment at WPAFB, OH, in collaboration with Wright State University, Department of Biological Sciences, for the validation of new bioassay methods for evaluating reproductive toxicity. The Hamilton Thorne sperm analyzer was used to evaluate sperm damage produced by exposure to a known testicular toxicant, methoxyacetic acid, and by inhalation exposure to JP-8 and JP-5 in laboratory rats. Sperm quality parameters (sperm concentration, motility, and morphology) were evaluated to provide evidence of sperm damage. The Hamilton Thorne sperm analyzer utilizes a DNA-specific fluorescent stain (similar to flow cytometry) and digitized optical computer analysis to detect sperm cell damage. Computer-assisted sperm analysis (CASA) is a more rapid, robust, predictive, and sensitive method for characterizing reproductive toxicity. The results presented in this poster report validation information showing that exposure to methoxyacetic acid causes reproductive toxicity, whereas inhalation exposure to JP-8 and JP-5 had no significant effects. The CASA method detects early changes that result in reproductive deficits, and these data will be used in a continuing program to characterize the toxicity of chemicals, and combinations of chemicals, of military interest to formulate permissible exposure limits.

  10. Measure profile surrogates: A method to validate the performance of epileptic seizure prediction algorithms

    NASA Astrophysics Data System (ADS)

    Kreuz, Thomas; Andrzejak, Ralph G.; Mormann, Florian; Kraskov, Alexander; Stögbauer, Harald; Elger, Christian E.; Lehnertz, Klaus; Grassberger, Peter

    2004-06-01

    In a growing number of publications it is claimed that epileptic seizures can be predicted by analyzing the electroencephalogram (EEG) with different characterizing measures. However, many of these studies suffer from a severe lack of statistical validation: only rarely are results subjected to a statistical test and verified against some null hypothesis H0 in order to quantify their significance. In this paper we propose a method to statistically validate the performance of measures used to predict epileptic seizures. From measure profiles rendered by applying a moving-window technique to the EEG, we first generate an ensemble of surrogates by constrained randomization using simulated annealing. Subsequently the seizure prediction algorithm is applied to the original measure profile and to the surrogates. If detectable changes before seizure onset exist, the highest performance values should be obtained for the original measure profiles, and the null hypothesis "the measure is not suited for seizure prediction" can be rejected. We demonstrate our method by applying two measures of synchronization to a quasicontinuous EEG recording and by evaluating their predictive performance using a straightforward seizure prediction statistic. We stress that the proposed method is rather universal and can be applied to many other prediction and detection problems.
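
The logic of a surrogate test can be sketched in a few lines. Note the simplifications: plain random shuffling stands in for the paper's constrained randomization via simulated annealing, and the "measure profile" and pre-seizure drop statistic below are toy inventions:

```python
import random

def surrogate_p_value(profile, statistic, n_surrogates=99, seed=0):
    """One-sided surrogate test: how often do shuffled profiles reach the
    original statistic? A small p rejects 'not suited for prediction'."""
    rng = random.Random(seed)
    original = statistic(profile)
    hits = sum(
        statistic(rng.sample(profile, len(profile))) >= original
        for _ in range(n_surrogates)
    )
    return (hits + 1) / (n_surrogates + 1)

def preseizure_drop(p):
    """Drop of the measure in the window just before 'seizure onset' at t=80."""
    pre = p[70:80]
    return sum(p) / len(p) - sum(pre) / len(pre)

# Toy profile: a synchronization measure dips shortly before onset
profile = [1.0] * 70 + [0.2] * 10 + [1.0] * 20
p_value = surrogate_p_value(profile, preseizure_drop)
```

Shuffling destroys the temporal placement of the dip while preserving the amplitude distribution, so a genuine pre-ictal change yields a statistic the surrogates rarely match.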

  11. Glossary of reference terms for alternative test methods and their validation.

    PubMed

    Ferrario, Daniele; Brustio, Roberta; Hartung, Thomas

    2014-01-01

    This glossary was developed to provide technical references to support work in the field of alternatives to animal testing. It was compiled from various existing reference documents from different sources and is meant to be a point of reference on alternatives to animal testing. Given the ever-increasing number of alternative test methods and approaches developed over the last decades, a combination, revision, and harmonization of earlier published collections of terms used in the validation of such methods is required. The need to update previous glossary efforts arose from the acknowledgement that new words have emerged with the development of new approaches, others have become obsolete, and the meaning of some terms has partially changed over time. With this glossary we intend to provide guidance on issues related to the validation of new or updated testing methods consistent with current approaches. Moreover, because of new developments and technologies, a glossary needs to be a living, constantly updated document. An Internet-based version of this compilation may be found at http://altweb.jhsph.edu/, allowing the addition of new material. PMID:24819604

  12. Validation of quantitative method for azoxystrobin residues in green beans and peas.

    PubMed

    Abdelraheem, Ehab M H; Hassan, Sayed M; Arief, Mohamed M H; Mohammad, Somaia G

    2015-09-01

    This study presents a method validation for the extraction and quantitative analysis of azoxystrobin residues in green beans and peas using HPLC-UV, with results confirmed by GC-MS. The method involved initial extraction with acetonitrile after the addition of salts (magnesium sulfate and sodium chloride), followed by a cleanup step with activated neutral carbon. The validation parameters linearity, matrix effect, LOQ, specificity, trueness, and repeatability precision were assessed. The spiking levels for the trueness and precision experiments were 0.1, 0.5, and 3 mg/kg. For HPLC-UV analysis, mean recoveries ranged from 83.69% to 91.58% for green beans and from 81.99% to 107.85% for peas. For GC-MS analysis, mean recoveries ranged from 76.29% to 94.56% for green beans and from 80.77% to 100.91% for peas. According to these results, the method has proven efficient for the extraction and determination of azoxystrobin residues in green beans and peas. PMID:25842334
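
Trueness and repeatability figures like those above come from spiked-recovery replicates. A sketch with hypothetical replicate results at the 0.5 mg/kg spiking level (the numbers are invented, chosen only to land in the reported recovery range):

```python
import statistics

def recovery_percent(measured, spiked):
    """Trueness: measured amount as a percentage of the spiked amount."""
    return 100.0 * measured / spiked

def rsd_percent(values):
    """Repeatability precision as relative standard deviation."""
    return 100.0 * statistics.stdev(values) / statistics.fmean(values)

# Hypothetical azoxystrobin found (mg/kg) in five replicates spiked at 0.5 mg/kg
found = [0.44, 0.46, 0.45, 0.43, 0.46]
recoveries = [recovery_percent(m, 0.5) for m in found]
mean_recovery = statistics.fmean(recoveries)
```
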

  13. Method validation program for the long duration sampling of PCDDs/PCDFs in ambient air

    SciTech Connect

    Maisel, B.E.; Hunt, G.T.; Hoyt, M.P.; Rowe, N.; Scarfo, L.

    1994-12-31

    A method validation program was completed to assess the technical viability of extended, long-duration sampling periods (15- and 30-day) for the collection of PCDDs/PCDFs in ambient air, in lieu of the 48-hour sampling periods typically employed. This long-duration approach, if successful, would provide measurement data more representative of average ambient PCDD/PCDF levels on an annual basis, and hence provide enhanced support of the 1.0 pg/m³ annual ambient standard for PCDDs/PCDFs (expressed as 1987 EPA toxic equivalents) required by Connecticut regulation. The method validation program utilized nine collocated PUF samplers, which were operated for 15-day and 30-day periods during each of two seasonal monitoring campaigns (winter and summer). Samples were analyzed using high resolution gas chromatography/high resolution mass spectrometry (HRGC/HRMS) based on EPA Method 8290. Each PUF cartridge consisted of two foam halves; the top half PUF and filter were analyzed as a single sample, separately from the bottom half PUF section. This approach provided an assessment of analyte breakthrough in the sampling system for large sample volumes of approximately 4,000 m³ and 8,000 m³ for the 15-day and 30-day periods, respectively.

  14. Self-validated Variance-based Methods for Sensitivity Analysis of Model Outputs

    SciTech Connect

    Tong, C

    2009-04-20

    Global sensitivity analysis (GSA) has the advantage over local sensitivity analysis in that GSA does not require strong model assumptions such as linearity or monotonicity. As a result, GSA methods such as those based on variance decomposition are well suited to multi-physics models, which are often plagued by large nonlinearities. However, as with many other sampling-based methods, an inadequate sample size can badly pollute the result accuracy. A natural remedy is to adaptively increase the sample size until sufficient accuracy is obtained. This paper proposes an iterative methodology comprising mechanisms for guiding sample size selection and self-assessing result accuracy. The elegant features of the proposed methodology are the adaptive refinement strategies for stratified designs. We first apply this iterative methodology to the design of a self-validated first-order sensitivity analysis algorithm. We then extend the methodology to design a self-validated second-order sensitivity analysis algorithm based on refining replicated orthogonal array designs. Several numerical experiments are given to demonstrate the effectiveness of these methods.
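
A first-order variance-based index can be estimated with the standard Saltelli sampling scheme. The sketch below is a generic illustration of variance decomposition, not the paper's adaptive stratified refinement; the toy additive model has exact indices 1/14, 4/14, and 9/14:

```python
import numpy as np

def first_order_sobol(model, d, n=50_000, seed=1):
    """First-order Sobol' indices S_i = V[E(Y|X_i)] / V(Y) on [0,1]^d,
    using the Saltelli (2010) estimator with two independent sample blocks."""
    rng = np.random.default_rng(seed)
    A = rng.uniform(size=(n, d))
    B = rng.uniform(size=(n, d))
    y_a, y_b = model(A), model(B)
    var_y = np.var(np.concatenate([y_a, y_b]))
    s = np.empty(d)
    for i in range(d):
        ab = A.copy()
        ab[:, i] = B[:, i]                      # replace only column i
        s[i] = np.mean(y_b * (model(ab) - y_a)) / var_y
    return s

# Toy model Y = X0 + 2*X1 + 3*X2: variance shares are coefficient^2 / 14
S = first_order_sobol(lambda x: x[:, 0] + 2 * x[:, 1] + 3 * x[:, 2], d=3)
```

The paper's point is precisely that the Monte Carlo error of estimates like these must be self-assessed and the sampling refined until it is acceptably small.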

  15. Method Development and Validation for UHPLC-MS-MS Determination of Hop Prenylflavonoids in Human Serum

    PubMed Central

    Yuan, Yang; Qiu, Xi; Nikolic, Dejan; Dahl, Jeffrey H.; van Breemen, Richard B.

    2013-01-01

    Hops (Humulus lupulus L.) are used in the brewing of beer, and hop extracts containing prenylated compounds such as xanthohumol and 8-prenylnaringenin are under investigation as dietary supplements for cancer chemoprevention and for the management of hot flashes in menopausal women. To facilitate clinical studies of hop safety and efficacy, a selective, sensitive, and fast ultra-high pressure liquid chromatography tandem mass spectrometry (UHPLC-MS-MS) method was developed and validated for the simultaneous determination of the hop prenylflavonoids xanthohumol (XN), isoxanthohumol (IX), 6-prenylnaringenin (6-PN), and 8-prenylnaringenin (8-PN) in human serum. The analytical method requires 300 μL of human serum, which is processed using liquid-liquid extraction. UHPLC separation was carried out in 2.5 min with gradient elution using a reversed-phase column containing 1.6 μm packing material. Prenylflavonoids were measured using negative ion electrospray mass spectrometry with collision-induced dissociation and selected reaction monitoring. The method was validated and showed good accuracy and precision, with a lower limit of quantitation of 0.50 ng/mL (1.4 nM) for XN and 1.0 ng/mL for 6-PN (2.9 nM) and IX (2.8 nM) in serum. PMID:23451393
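
The mass-to-molar conversions quoted in the abstract follow from nM = 1000 × (ng/mL) / (g/mol). A sketch using approximate literature molar masses (xanthohumol ≈ 354.4 g/mol, 6-prenylnaringenin ≈ 340.4 g/mol):

```python
def ng_per_ml_to_nm(conc_ng_ml, molar_mass_g_mol):
    """Convert ng/mL to nmol/L: 1 ng/mL = 1e-6 g/l, so nM = 1000 * c / M."""
    return 1000.0 * conc_ng_ml / molar_mass_g_mol

xn_lloq_nm = ng_per_ml_to_nm(0.50, 354.4)    # xanthohumol LLOQ, ~1.4 nM
pn6_lloq_nm = ng_per_ml_to_nm(1.0, 340.4)    # 6-prenylnaringenin, ~2.9 nM
```
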

  16. Development and validation of an analytical method for the determination of 4-hexylresorcinol in food.

    PubMed

    Kim, Young-Hyun; Kim, Jae-Min; Lee, Jong Seok; Gang, Seong-Ran; Lim, Ho-Soo; Kim, Meehye; Lee, Ok-Hwan

    2016-01-01

    This study presents a method validation for the extraction and quantitative analysis of 4-hexylresorcinol residues in shrimp and crab meat using HPLC-FLD. We focused on the collaborative analysis of the shrimp and crab meat samples and developed an LC-MS/MS method for correct confirmation of the identity of the compound. The validation parameters selectivity, linearity, LOD, LOQ, accuracy, precision, and measurement uncertainty were assessed. The measurement uncertainty was based on the precision study, data related to the performance of the analytical process, and the quantification of 4-hexylresorcinol. For HPLC-FLD analysis, the recoveries of 4-hexylresorcinol from samples spiked at levels of 0.2-10.0 ppm ranged from 92.54% to 97.67%, with RSDs between 0.07% and 1.88%. According to these results, the method has proven appropriate for the extraction and determination of 4-hexylresorcinol and can be used to maintain the safety of shrimp and crab products containing 4-hexylresorcinol residues. PMID:26213080

  17. Validation of a new method for immobilising kinetoplastid parasites for live cell imaging

    PubMed Central

    Price, Helen P.; MacLean, Lorna; Marrison, Joanne; O’Toole, Peter J.; Smith, Deborah F.

    2010-01-01

    The kinetoplastid parasites are responsible for three of the ten most neglected tropical diseases as classified by the WHO. Recent advances in molecular and cellular analyses have allowed rapid progress in our understanding of the biology of these lethal pathogens. In this study we validate a new method for immobilising Trypanosoma brucei and Leishmania major parasites while maintaining a high level of viability. This allows reproducible live cell imaging of these highly motile organisms, thus enabling a full complement of advanced microscopic techniques to be utilised to better understand these pathogenic species. PMID:19815033

  18. Validate the universal pattern decomposition method using satellite data acquired over the Three Gorges region

    NASA Astrophysics Data System (ADS)

    Zhang, L.; Fujiwara, N.; Furumi, S.; Muramatsu, K.; Daigo, M.; Zhang, L.

    2005-10-01

    The universal pattern decomposition method (UPDM) has been successfully applied to simulated data for the Landsat/ETM+, Terra/MODIS, ADEOS-II/GLI, and 92-band CONTINUE sensors using ground-measured data. This paper validates the UPDM using MODIS and ETM+ data acquired over the Three Gorges region of China. The reduced χ² values of the selected area D, the area with the smallest terrain influence, are 0.000409 (MODIS) and 0.000181 (ETM+), and the average linear regression factor between MODIS and ETM+ is 1.0077, with an rms of 0.0082. The results demonstrate that the UPDM coefficients are sensor-independent.

  19. Validation of nested PCR and a selective biochemical method as alternatives for mycoplasma detection.

    PubMed

    Cheong, Kyung Ah; Agrawal, Santosh Rani; Lee, Ai-Young

    2011-04-01

    Direct culture is the most common way to reliably detect mycoplasma, but it is not practical for the quality control of cell therapeutics because of the elaborate culture medium, the prolonged incubation time, and the large sample volumes required. Here, we chose two alternative methods using commercial detection kits, a PCR mycoplasma detection kit with nested PCR and the selective biochemical method MycoAlert(®), and validated them with the direct culture method as a reference. We tested eight mycoplasma species and five validation parameters: specificity, detection limit, robustness, repeatability, and ruggedness, based on the regulatory guidelines in the US Pharmacopoeia. All experiments were performed using fibroblasts spiked with mycoplasma. Specificity tests for both methods covered all mycoplasma species, except Mycoplasma pneumoniae and M. genitalium for the nested PCR and Ureaplasma urealyticum for the MycoAlert(®) assay. Regarding the detection limit, the nested PCR proved to be as sensitive as the direct culture method and more sensitive than the MycoAlert(®) assay. The predicted median for probit = 0.9 was 54 (44-76) CFU/ml for M. hyorhinis and 16 (13-23) CFU/ml for M. hominis by the nested PCR, but 431 (346-593) CFU/ml and 105 (87-142) CFU/ml, respectively, with MycoAlert(®). Changes in the concentration of reagents, reagent lot, or individual analysts did not influence the results of the examined methods. The results of this study support nested PCR as a valuable alternative for mycoplasma detection. PMID:20806253

  20. Validated Method for the Determination of Piroxicam by Capillary Zone Electrophoresis and Its Application to Tablets

    PubMed Central

    Dal, Arın Gül; Oktayer, Zeynep; Doğrukol-Ak, Dilek

    2014-01-01

    A simple and rapid capillary zone electrophoretic method was developed and validated in this study for the determination of piroxicam in tablets. The separation of piroxicam was conducted in a fused-silica capillary using 10 mM borate buffer (pH 9.0) containing 10% (v/v) methanol as the background electrolyte. The optimum conditions were a 25 kV separation voltage and a 1 s injection time. Analysis was carried out with UV detection at 204 nm, with naproxen sodium as the internal standard. The method was linear over the range of 0.23–28.79 µg/mL, and accuracy and precision were within acceptable limits (<2%). The LOD and LOQ were 0.07 and 0.19 µg/mL, respectively. The method was applied to tablet dosage forms, and the tablet content was found to be within USP-24 limits. For comparison, a UV spectrophotometric method was also developed, and the difference between the two methods was found to be insignificant. The capillary zone electrophoretic method developed in this study is rapid, simple, and suitable for routine analysis of piroxicam in pharmaceutical tablets. PMID:25295220
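    LOD and LOQ figures of this kind are commonly derived from the calibration line via the ICH relations LOD = 3.3·s/S and LOQ = 10·s/S, where s is the residual standard deviation and S the slope. A sketch with invented calibration points (not the study's data, and not necessarily the exact procedure these authors used):

```python
# Sketch: LOD/LOQ from a least-squares calibration line using the standard
# ICH formulas. Calibration points are invented for illustration.
from math import sqrt

# hypothetical calibration: concentration (ug/mL) vs. peak-area ratio
conc = [1.0, 2.0, 4.0, 8.0, 16.0]
area = [0.52, 0.99, 2.03, 3.98, 8.02]

n = len(conc)
xbar, ybar = sum(conc) / n, sum(area) / n
slope = sum((x - xbar) * (y - ybar) for x, y in zip(conc, area)) / \
        sum((x - xbar) ** 2 for x in conc)
intercept = ybar - slope * xbar

# residual standard deviation about the calibration line
ss_res = sum((y - (intercept + slope * x)) ** 2 for x, y in zip(conc, area))
s_residual = sqrt(ss_res / (n - 2))

lod = 3.3 * s_residual / slope    # ICH detection limit
loq = 10.0 * s_residual / slope   # ICH quantification limit
print(f"LOD ~ {lod:.3f} ug/mL, LOQ ~ {loq:.3f} ug/mL")
```

    By construction the LOQ is always 10/3.3 ≈ 3× the LOD under these formulas, consistent with the ratio reported in the record.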

  1. Quantitative Tagless Copurification: A Method to Validate and Identify Protein-Protein Interactions.

    PubMed

    Shatsky, Maxim; Dong, Ming; Liu, Haichuan; Yang, Lee Lisheng; Choi, Megan; Singer, Mary E; Geller, Jil T; Fisher, Susan J; Hall, Steven C; Hazen, Terry C; Brenner, Steven E; Butland, Gareth; Jin, Jian; Witkowska, H Ewa; Chandonia, John-Marc; Biggin, Mark D

    2016-06-01

    Identifying protein-protein interactions (PPIs) at an acceptable false discovery rate (FDR) is challenging. Previously we identified several hundred PPIs from affinity purification-mass spectrometry (AP-MS) data for the bacteria Escherichia coli and Desulfovibrio vulgaris. These two interactomes have lower FDRs than any of the nine interactomes proposed previously for bacteria and are more enriched in PPIs validated by other data than those nine earlier interactomes. To more thoroughly determine the accuracy of our interactomes and others, and to discover further PPIs de novo, here we present a quantitative tagless method that employs iTRAQ MS to measure the copurification of endogenous proteins through orthogonal chromatography steps. 5273 fractions from a four-step fractionation of a D. vulgaris protein extract were assayed, resulting in the detection of 1242 proteins. Protein partners from our D. vulgaris and E. coli AP-MS interactomes copurify as frequently as pairs belonging to three benchmark data sets of well-characterized PPIs. In contrast, the protein pairs from the nine other bacterial interactomes copurify two- to 20-fold less often. We also identify 200 high-confidence D. vulgaris PPIs based on tagless copurification and colocalization in the genome. These PPIs are as strongly validated by other data as our AP-MS interactomes and overlap with our AP-MS interactome for D. vulgaris within 3% of expectation, once FDRs and false negative rates are taken into account. Finally, we reanalyzed data from two quantitative tagless screens of human cell extracts. We estimate that the novel PPIs reported in these studies have an FDR of at least 85% and find that less than 7% of the novel PPIs identified in each screen overlap. Our results establish that a quantitative tagless method can be used to validate and identify PPIs, but that such data must be analyzed carefully to minimize the FDR. PMID:27099342

  2. Comparing 10 Methods for Solution Verification, and Linking to Model Validation

    SciTech Connect

    Logan, R W; Nitta, C K

    2005-03-23

    Grid convergence is often assumed as a given during computational analyses involving discretization of an assumed continuum process. In practical use of finite difference and finite element analyses, perfect grid convergence is rarely achieved or assured, and this fact must be addressed to make statements about model validation or the use of models in risk analysis. We have previously provided a 4-step implementation of a quantitative V&V process. One of the steps in this 4-step process is Solution Verification: the process of assuring that a model approximating a physical reality with a discretized continuum (e.g. finite element) code converges in each discretized domain to a converged answer on the quantity of subsequent validation interest. The modeling reality is that we are often modeling a problem with a discretized code precisely because it is neither continuous spatially (e.g. contact and impact) nor smooth in the relevant physics (e.g. shocks, melting, etc.). The typical result is a non-monotonic convergence plot that can lead to spurious conclusions about the order of convergence, and a lack of means to estimate residual solution verification error or uncertainty at confidence. We compare ten techniques for grid convergence assessment, each formulated to enable quantification of solution verification uncertainty at confidence and of the order of convergence for monotonic and non-monotonic mesh convergence studies. The more rigorous of these methods require a minimum of four grids in a grid convergence study to quantify the grid convergence uncertainty. The methods supply the quantitative terms for the solution verification error and uncertainty estimates needed for inclusion in subsequent model validation, confidence, and reliability analyses. Naturally, most such methodologies are still evolving, and this work represents the views of the authors and not necessarily the views of Lawrence Livermore National Laboratory.
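    A building block common to most grid-convergence assessment techniques is the observed order of convergence computed from three systematically refined grids, followed by Richardson extrapolation of the grid-converged value. A minimal sketch with invented monotonic data (this is the textbook ingredient, not any one of the paper's ten methods in full):

```python
# Observed order of convergence and Richardson extrapolation from a
# three-grid study with constant refinement ratio r. Data are invented.
from math import log

def observed_order(f_coarse, f_medium, f_fine, r):
    """p = ln((f_coarse - f_medium) / (f_medium - f_fine)) / ln(r)"""
    return log((f_coarse - f_medium) / (f_medium - f_fine)) / log(r)

def richardson_extrapolate(f_medium, f_fine, r, p):
    """Estimate of the grid-converged value from the two finest grids."""
    return f_fine + (f_fine - f_medium) / (r ** p - 1.0)

# hypothetical monotonic grid-convergence study, refinement ratio 2
p = observed_order(1.16, 1.04, 1.01, r=2)
f_exact = richardson_extrapolate(1.04, 1.01, r=2, p=p)
print(f"observed order ~ {p:.2f}, extrapolated value ~ {f_exact:.3f}")
```

    Non-monotonic data make the log argument negative, which is exactly where this classical estimate breaks down and the more robust methods compared in the paper are needed.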

  3. Assessment of the sustainability of dual-purpose farms by the IDEA method in the subtropical area of central Mexico.

    PubMed

    Salas-Reyes, Isela Guadalupe; Arriaga-Jordán, Carlos Manuel; Rebollar-Rebollar, Samuel; García-Martínez, Anastacio; Albarrán-Portillo, Benito

    2015-08-01

    The objective of this study was to assess the sustainability of 10 dual-purpose cattle farms in a subtropical area of central Mexico. The IDEA method (Indicateurs de Durabilité des Exploitations Agricoles) was applied, which comprises agroecological, socio-territorial and economic scales (scores from 0 to 100 points per scale). A sample of 47 farms, from a total of 91 registered in the local livestock growers association, was analysed with principal component analysis and cluster analysis. From these results, 10 farms were selected for the in-depth study reported here, with continuous milk production throughout the year as the selection criterion. Farms scored 88 and 86 points on the agroecological scale in the rainy and dry seasons, respectively. On the socio-territorial scale, scores were 73 points for both seasons, with employment and services being the strongest component. Scores on the economic scale were 64 and 56 points for the rainy and dry seasons, respectively, when no cost is charged for family labour; these decrease to 59 and 45 points when an opportunity cost for family labour is considered. Dual-purpose farms in the subtropical area of central Mexico thus have medium sustainability, with the economic scale being both the limiting factor and an area of opportunity. PMID:25958175

  4. Molecular characterization and validation of commercially available methods for haptoglobin measurement in bottlenose dolphin☆

    PubMed Central

    Segawa, Takao; Amatsuji, Hazumu; Suzuki, Kento; Suzuki, Miwa; Yanagisawa, Makio; Itou, Takuya; Sakai, Takeo; Nakanishi, Teruyuki

    2013-01-01

    Haptoglobin (Hp) is a positive acute-phase protein and a valuable marker of inflammation in both human and veterinary medicine. The aim of this study was to characterize Hp in dolphins at the molecular level and to validate commercially available Hp measurement methods, namely an Hp-ELISA (originally designed for pigs) and an Hp–hemoglobin (Hb) binding assay. The dolphin Hp (dHp) amino acid sequence appeared most similar to pig Hp by sequence homology and phylogenetic clustering. Amino acid sequence analysis revealed that dHp comprises the Hp1 form of α1 and β chains. The anti-pig Hp antibody cross-reacted with both recombinant dHp, expressed in Escherichia coli, and dHp from serum. The intra- and inter-assay imprecision of the pig Hp-ELISA and the Hp–Hb binding assay was found to be tolerable for the determination of Hp in dolphin, and there was no significant discrepancy between the two methods. The ability of the assays to differentiate between healthy and inflammation groups was investigated, and a significant increase in Hp concentration was detected under inflammatory conditions. Thus, Hp is a useful inflammation marker for dolphins, and the Hp concentration in dolphin serum samples can be reliably measured using the commercially available pig Hp-ELISA and Hp–Hb binding assay. PMID:24600559

  5. A validated new method for nevirapine quantitation in human plasma via high-performance liquid chromatography.

    PubMed

    Silverthorn, Courtney F; Parsons, Teresa L

    2006-01-01

    A fully validated and clinically relevant assay was developed for the assessment of nevirapine concentrations in neonate blood plasma samples. Solid-phase extraction with an acid-base wash series was used to prepare subject samples for analysis. Samples were separated by high-performance liquid chromatography on a C8 reverse-phase column in an isocratic mobile phase and detected at 280 nm. The retention times of nevirapine and its internal standard were 5.0 and 6.9 min, respectively. The method was validated by assessment of accuracy and precision (statistical values <15%), specificity, and stability. The assay was linear in the range 25-10,000 ng/mL (r2 > 0.996) and the average recovery was 93% (n = 18). The lower limit of quantification (relative standard deviation <20%) was determined to be 25 ng/mL for 50 microL of plasma, allowing detection of as little as 1.25 ng of nevirapine in a sample. This represents an increase in sensitivity of up to 30-fold over previously published methods. PMID:15920701

  6. Validation of a novel sequential cultivation method for the production of enzymatic cocktails from Trichoderma strains.

    PubMed

    Florencio, C; Cunha, F M; Badino, A C; Farinas, C S

    2015-02-01

    The development of new cost-effective bioprocesses for the production of cellulolytic enzymes is needed in order to ensure that the conversion of biomass becomes economically viable. The aim of this study was to determine whether a novel sequential solid-state and submerged fermentation method (SF) could be validated for different strains of the Trichoderma genus. Cultivation of the Trichoderma reesei Rut-C30 reference strain under SF using sugarcane bagasse as substrate was shown to be favorable for endoglucanase (EGase) production, resulting in up to a 4.2-fold improvement compared with conventional submerged fermentation. Characterization of the enzymes in terms of the optimum pH and temperature for EGase activity, and comparison of the hydrolysis profiles obtained using a synthetic substrate, did not reveal any qualitative differences among the cultivation conditions investigated. However, the thermostability of the EGase was influenced by the type of carbon source and cultivation system. All three Trichoderma strains tested (T. reesei Rut-C30, Trichoderma harzianum, and Trichoderma sp. INPA 666) achieved higher enzymatic productivity when cultivated under SF, hence validating the proposed SF method for use with different Trichoderma strains. These results suggest that this bioprocess configuration is very promising for the cellulosic biofuels industry. PMID:25399068

  7. Ecological validity and the study of publics: The case for organic public engagement methods.

    PubMed

    Gehrke, Pat J

    2014-01-01

    This essay argues for a method of public engagement grounded in criteria of ecological validity. Motivated by what Hammersley called the responsibility that comes with intellectual authority, "to seek, as far as possible, to ensure the validity of their conclusions and to participate in rational debate about those conclusions" (1993: 29), organic public engagement follows the empirical turn in citizenship theory and in rhetorical studies of actually existing publics. Rather than shaping citizens into either the compliant subjects of the cynical view or the deliberatively disciplined subjects of the idealist view, organic public engagement takes Asen's advice that "we should ask: how do people enact citizenship?" (2004: 191). In short, organic engagement methods engage publics in the places where they already exist and through the discourses and social practices by which they enact their status as publics. Such engagements can generate practical middle-range theories that facilitate future actions and decisions attentive to the local ecologies of diverse publics. PMID:23887250

  8. Validity of the t-plot method to assess microporosity in hierarchical micro/mesoporous materials.

    PubMed

    Galarneau, Anne; Villemot, François; Rodriguez, Jeremy; Fajula, François; Coasne, Benoit

    2014-11-11

    The t-plot method is a well-known technique for determining the micro- and/or mesoporous volumes and the specific surface area of a sample by comparison with a reference adsorption isotherm for a nonporous material having the same surface chemistry. In this paper, the validity of the t-plot method is discussed for hierarchical porous materials exhibiting both micro- and mesoporosity. Different hierarchical zeolites with MCM-41-type ordered mesoporosity were prepared using pseudomorphic transformation; for comparison, simple mechanical mixtures of microporous and mesoporous materials were also considered. We first show an intrinsic failure of the t-plot method: it does not capture the fact that, for a given surface chemistry and pressure, the thickness of the film adsorbed in micropores or small mesopores (<10σ, σ being the diameter of the adsorbate) increases with decreasing pore size (curvature effect). We further show that this effect, which arises because the surface area, and hence the free energy, of the curved gas/liquid interface decreases with increasing film thickness, is captured by the simple thermodynamic model of Derjaguin. The consequences of this drawback for the ability of the t-plot method to estimate the micro- and mesoporous volumes of hierarchical samples are then discussed, and an abacus is given to correct the microporous volume underestimated by the t-plot method. PMID:25232908
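    The classical t-plot construction under examination can be sketched as follows: in the multilayer region, the adsorbed volume V plotted against the statistical film thickness t is linear, with the intercept estimating the micropore volume and the slope the external surface area. The data below are invented, and the curvature correction the paper shows to be necessary is deliberately absent here.

```python
# Sketch of a classical t-plot: fit the linear multilayer region of V vs. t;
# intercept = micropore volume. Points are invented for illustration.

# statistical thickness t (nm) vs. adsorbed volume V (cm3 liquid per g)
t = [0.35, 0.45, 0.55, 0.65, 0.75]
V = [0.190, 0.210, 0.230, 0.250, 0.270]

n = len(t)
tbar, Vbar = sum(t) / n, sum(V) / n
slope = sum((x - tbar) * (y - Vbar) for x, y in zip(t, V)) / \
        sum((x - tbar) ** 2 for x in t)
v_micro = Vbar - slope * tbar   # intercept: classical micropore volume estimate

print(f"micropore volume ~ {v_micro:.3f} cm3/g, slope ~ {slope:.3f}")
```

    The paper's point is that in small pores the film is thicker than this reference-isotherm t assumes, so the intercept systematically underestimates the true micropore volume.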

  9. Validation of a new restraint docking method for solution structure determinations of protein-ligand complexes.

    PubMed

    Polshakov, V I; Morgan, W D; Birdsall, B; Feeney, J

    1999-06-01

    A new method is proposed for docking ligands into proteins in cases where an NMR-determined solution structure of a related complex is available. The method uses a set of experimentally determined values for protein-ligand, ligand-ligand, and protein-protein restraints for residues in or near the binding site, combined with a set of protein-protein restraints, involving all the other residues, taken from the list of restraints previously used to generate the reference structure of a related complex. This approach differs from ordinary docking methods, in which the calculation uses fixed atomic coordinates from the reference structure rather than the restraints used to determine it. The binding-site residues influenced by replacing the reference ligand with the new ligand were identified by monitoring differences in 1H chemical shifts. The method has been validated by showing excellent agreement between structures of the L. casei dihydrofolate reductase-trimetrexate complex calculated by conventional methods, using a full experimentally determined set of restraints, and those obtained with this new restraint docking method based on an L. casei dihydrofolate reductase-methotrexate reference structure. PMID:10610140

  10. The role of validated analytical methods in JECFA drug assessments and evaluation for recommending MRLs.

    PubMed

    Boison, Joe O

    2016-05-01

    The Joint Food and Agriculture Organization and World Health Organization (FAO/WHO) Expert Committee on Food Additives (JECFA) is one of three Codex committees tasked with applying risk analysis and relying on independent scientific advice provided by expert bodies organized by FAO/WHO when developing standards. While not officially part of the Codex Alimentarius Commission structure, JECFA provides independent scientific advice to the Commission and its specialist committees such as the Codex Committee on Residues of Veterinary Drugs in Foods (CCRVDF) in setting maximum residue limits (MRLs) for veterinary drugs. Codex methods of analysis (Types I, II, III, and IV) are defined in the Codex Procedural Manual as are criteria to be used for selecting methods of analysis. However, if a method is to be used under a single laboratory condition to support regulatory work, it must be validated according to an internationally recognized protocol and the use of the method must be embedded in a quality assurance system in compliance with ISO/IEC 17025:2005. This paper examines the attributes of the methods used to generate residue depletion data for drug registration and/or licensing and for supporting regulatory enforcement initiatives that experts consider to be useful and appropriate in their assessment of methods of analysis. Copyright © 2016 Her Majesty the Queen in Right of Canada. Drug Testing and Analysis © 2016 John Wiley & Sons, Ltd. PMID:27443214

  11. Method validation and dissipation dynamics of chlorfenapyr in squash and okra.

    PubMed

    Abdel Ghani, Sherif B; Abdallah, Osama I

    2016-03-01

    A QuEChERS method combined with GC-IT-MS was developed and validated for the determination of chlorfenapyr residues in squash and okra matrices. Method accuracy, repeatability, linearity and specificity were investigated, and the matrix effect was assessed. Determination coefficients (R²) were 0.9992 and 0.9987 for the two matrices. LODs were 2.4 and 2.2 µg/kg, while LOQs were 8.2 and 7.3 µg/kg. Method accuracy ranged from 92.76% to 106.49%, and method precision (RSD) was ≤12.59%. A field trial to assess chlorfenapyr dissipation behaviour was carried out, and the developed method was employed in analyzing the field samples. Dissipation followed first-order kinetics in both crops, with half-life values (t1/2) ranging from 0.2 to 6.58 days and determination coefficients (R²) ranging from 0.78 to 0.96. The method was also used to survey chlorfenapyr residues in squash and okra samples collected from the market, and the monitoring results are discussed. PMID:26471587
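    First-order dissipation and the associated half-life, as reported here, amount to fitting ln C against time and taking t1/2 = ln 2 / k. A sketch with synthetic residue data (the study's field values are not reproduced):

```python
# Sketch: first-order dissipation kinetics. Fit ln(C) vs. time by least
# squares; the negated slope is the rate constant k and t1/2 = ln(2)/k.
# Residue values are synthetic, generated from an assumed k = 0.3 per day.
from math import exp, log

days = [0, 1, 3, 7, 14]
conc = [2.0 * exp(-0.3 * t) for t in days]   # synthetic residues, mg/kg

lnc = [log(c) for c in conc]
n = len(days)
tbar, ybar = sum(days) / n, sum(lnc) / n
slope = sum((t - tbar) * (y - ybar) for t, y in zip(days, lnc)) / \
        sum((t - tbar) ** 2 for t in days)

k = -slope                # first-order rate constant (per day)
t_half = log(2) / k       # dissipation half-life (days)
print(f"k ~ {k:.3f} /day, half-life ~ {t_half:.2f} days")
```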

  12. Optimized and validated spectrophotometric methods for the determination of nicorandil in drug formulations and biological fluids.

    PubMed

    Rahman, Nafisur; Ahmad Khan, Nadeem; Hejaz Azmi, Syed Najmul

    2004-07-01

    Two simple, sensitive and economical spectrophotometric methods have been developed for the determination of nicorandil in drug formulations and biological fluids. Method A is based on the reaction of the drug with brucine-sulphanilic acid reagent in sulphuric acid medium, producing a yellow-coloured product that absorbs maximally at 410 nm. Method B depends on the formation of an intensely blue-coloured product resulting from the interaction of an electrophilic intermediate of 3-methyl-2-benzothiazolinone hydrazone hydrochloride (MBTH) with the oxidized product of 4-(methylamino)phenol sulphate (metol), in the presence of nicorandil as an oxidizing agent in sulphuric acid medium. The coloured product shows an absorbance maximum at 560 nm. Under the optimized experimental conditions, Beer's law is obeyed in the concentration ranges of 2.5-35.0 and 0.40-2.2 µg/mL for Methods A and B, respectively. Both methods have been successfully applied to the determination of nicorandil in drug formulations and biological fluids. The results are validated statistically and through recovery studies. To establish the bias and performance of the proposed methods, point and interval hypothesis tests were performed; the experimental true bias of all samples is smaller than ±2%. PMID:15231427

  13. Spectrophotometric method for the determination, validation, spectroscopic and thermal analysis of diphenhydramine in pharmaceutical preparation

    NASA Astrophysics Data System (ADS)

    Ulu, Sevgi Tatar; Elmali, Fikriye Tuncel

    2010-09-01

    A sensitive, simple and rapid spectrophotometric method was developed for the determination of diphenhydramine in pharmaceutical preparations. The method was based on the charge-transfer complex of the drug, as n-electron donor, with 2,3-dichloro-5,6-dicyano-p-benzoquinone (DDQ), as π-acceptor. The formation of this complex was confirmed by UV-vis, FTIR and 1H NMR spectroscopy and by thermal analysis. The proposed method was validated according to the ICH guidelines with respect to linearity, limit of detection, limit of quantification, accuracy, precision, recovery and robustness. The linearity range for diphenhydramine was 12.5-150 μg/mL, with acceptable correlation coefficients. The detection and quantification limits were 2.09 and 6.27 μg/mL, respectively. The proposed and reference methods were applied to the determination of the drug in syrup. The preparation was also analyzed by a reference method, and statistical comparison by t- and F-tests revealed no significant difference between the results of the two methods with respect to mean values and standard deviations at the 95% confidence level.

  14. Spectrophotometric method for the determination, validation, spectroscopic and thermal analysis of diphenhydramine in pharmaceutical preparation.

    PubMed

    Ulu, Sevgi Tatar; Elmali, Fikriye Tuncel

    2010-09-15

    A sensitive, simple and rapid spectrophotometric method was developed for the determination of diphenhydramine in pharmaceutical preparations. The method was based on the charge-transfer complex of the drug, as n-electron donor, with 2,3-dichloro-5,6-dicyano-p-benzoquinone (DDQ), as pi-acceptor. The formation of this complex was confirmed by UV-vis, FTIR and (1)H NMR spectroscopy and by thermal analysis. The proposed method was validated according to the ICH guidelines with respect to linearity, limit of detection, limit of quantification, accuracy, precision, recovery and robustness. The linearity range for diphenhydramine was 12.5-150 microg/mL, with acceptable correlation coefficients. The detection and quantification limits were 2.09 and 6.27 microg/mL, respectively. The proposed and reference methods were applied to the determination of the drug in syrup. The preparation was also analyzed by a reference method, and statistical comparison by t- and F-tests revealed no significant difference between the results of the two methods with respect to mean values and standard deviations at the 95% confidence level. PMID:20621611
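    The t- and F-test comparison of a proposed method against a reference method, as used in records like the one above, can be sketched with the standard formulas. The recoveries below are invented, and the 95% critical values (t = 2.228 for 10 degrees of freedom, F = 5.05 for 5 and 5 degrees of freedom) are taken from standard statistical tables.

```python
# Sketch: compare two analytical methods with an F-test on variances and a
# pooled two-sample t-test on means. Recovery data are invented; critical
# values are tabulated 95% two-sided values for these degrees of freedom.
from math import sqrt

proposed  = [99.2, 100.1, 99.8, 100.4, 99.5, 100.0]   # recoveries, %
reference = [99.6, 100.3, 99.9, 100.5, 99.4, 100.2]

def mean_var(xs):
    m = sum(xs) / len(xs)
    v = sum((x - m) ** 2 for x in xs) / (len(xs) - 1)   # sample variance
    return m, v

m1, v1 = mean_var(proposed)
m2, v2 = mean_var(reference)
n1, n2 = len(proposed), len(reference)

# F-test for equality of variances (larger variance on top)
f_stat = max(v1, v2) / min(v1, v2)

# pooled two-sample t-test for equality of means
pooled = ((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2)
t_stat = (m1 - m2) / sqrt(pooled * (1 / n1 + 1 / n2))

# tabulated 95% critical values: F(0.05; 5, 5) = 5.05, t(0.05; 10 df) = 2.228
print("variances comparable:", f_stat < 5.05)
print("means comparable:", abs(t_stat) < 2.228)
```

    Statistics below the critical values mean no significant difference between the methods at the 95% confidence level, which is the conclusion drawn in the record.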

  15. Further validation to the variational method to obtain flow relations for generalized Newtonian fluids

    NASA Astrophysics Data System (ADS)

    Sochi, Taha

    2015-05-01

    We continue our investigation into the use of the variational method to derive flow relations for generalized Newtonian fluids in confined geometries. While previous investigations used the straight circular tube geometry with eight fluid rheological models to demonstrate and establish the variational method, the focus here is on the plane long thin slit geometry using the same eight rheological models, namely: Newtonian, power law, Ree-Eyring, Carreau, Cross, Casson, Bingham and Herschel-Bulkley. We demonstrate how the variational principle, based on minimizing the total stress in the flow conduit, can be used to derive analytical expressions previously obtained by other methods, or used in conjunction with numerical procedures to obtain solutions virtually identical to those from well-established methods of fluid dynamics. In this regard, we use the method of Weissenberg-Rabinowitsch-Mooney-Schofield (WRMS), with our adaptation from the circular pipe geometry to the long thin slit geometry, to derive analytical formulae for the eight types of fluid; these formulae are used for comparison and validation of the variational formulae and numerical solutions. Although some of these examples may be of limited practical value, the optimization principle on which the variational method is based has significant theoretical value, as it reveals the tendency of the flow system to assume a configuration that minimizes the total stress. Our proposal also offers a new methodology for tackling common problems in fluid dynamics and rheology.

  16. Extension and validation of a method for locating damaged members in large space trusses

    NASA Technical Reports Server (NTRS)

    Smith, Suzanne Weaver

    1988-01-01

    The damage location approach employs the control system capabilities of the structure to test the structure and measure its dynamic response. The measurements are then used in a system identification algorithm to produce a model of the damaged structure. This model is compared to one for the undamaged structure to find regions of reduced stiffness, which indicate the location of damage. Kabe's stiffness matrix adjustment method was the central identification algorithm. The strength of his method is that, with minimal data, it preserves the representation of the physical connectivity of the structure in the resulting model of the damaged truss; however, extensive storage and computational effort are required as a result. Extending the damage location method to overcome these problems is the first part of the current work: the central system identification algorithm is replaced with the MSMT method of stiffness matrix adjustment, previously derived by generalizing an optimal-update secant method from quasi-Newton approaches for nonlinear optimization. Validation of the extended damage location method is the second goal.

  17. Experimental validation of theoretical methods to estimate the energy radiated by elastic waves during an impact

    NASA Astrophysics Data System (ADS)

    Farin, Maxime; Mangeney, Anne; Rosny, Julien de; Toussaint, Renaud; Sainte-Marie, Jacques; Shapiro, Nikolaï M.

    2016-02-01

    Estimating the energy lost to elastic waves during an impact is an important problem in seismology and in industry. We propose three complementary methods to estimate the elastic energy radiated by bead impacts on thin plates and thick blocks from the generated vibration. The first two methods are based on the direct wave front and are shown to be equivalent; the third makes use of the diffuse regime. These methods are tested in laboratory impact experiments and give the same results, with error bars of 40 percent and 300 percent for impacts on a smooth plate and on a rough block, respectively. We show that these methods are relevant for establishing the energy budget of an impact. On plates of glass and PMMA, the radiated elastic energy increases from 2 percent to almost 100 percent of the total energy lost as the bead diameter approaches the plate thickness; the rest of the lost energy is dissipated by viscoelasticity. For beads larger than the plate thickness, plastic deformation occurs and reduces the amount of energy radiated as elastic waves. On a concrete block, the energy dissipation during the impact is principally inelastic, because only 0.2-2 percent of the energy lost by the bead is transported by elastic waves. The radiated elastic energy estimated with the presented methods is quantitatively validated by Hertz's model of elastic impact.

  18. Comparison of sample preparation methods, validation of an UPLC-MS/MS procedure for the quantification of tetrodotoxin present in marine gastropods and analysis of pufferfish.

    PubMed

    Nzoughet, Judith Kouassi; Campbell, Katrina; Barnes, Paul; Cooper, Kevin M; Chevallier, Olivier P; Elliott, Christopher T

    2013-02-15

    Tetrodotoxin (TTX) is one of the most potent marine neurotoxins reported. The global distribution of this toxin is spreading, with the European Atlantic coastline now affected; climate change and increasing pollution have been suggested as underlying causes. In the present study, two different sample preparation techniques were used to extract TTX from trumpet shell and pufferfish samples. Both extraction procedures (accelerated solvent extraction (ASE) and a simple solvent extraction) provided good recoveries (80-92%). A UPLC-MS/MS method was developed for the analysis of TTX and validated following the guidelines of Commission Decision 2002/657/EC for chemical contaminant analysis. The performance of this procedure was demonstrated to be fit for purpose. This study is the first report on the use of ASE as a means of TTX extraction, the use of UPLC-MS/MS for TTX analysis, and the validation of this method for TTX in gastropods. PMID:23194566

  19. Using Self- and Peer-Assessments for Summative Purposes: Analysing the Relative Validity of the AASL (Authentic Assessment for Sustainable Learning) Model

    ERIC Educational Resources Information Center

    Kearney, Sean; Perkins, Timothy; Kennedy-Clark, Shannon

    2016-01-01

    The purpose of this paper is to provide a proof of concept of collaborative peer-, self- and lecturer-assessment processes. The research presented here is part of an ongoing study on self- and peer assessment in higher education. The authentic assessment for sustainable learning (AASL) model is evaluated in terms of the correlations between…

  20. Validated LC-MS-MS Method for Multiresidual Analysis of 13 Illicit Phenethylamines in Amniotic Fluid.

    PubMed

    Burrai, Lucia; Nieddu, Maria; Carta, Antonio; Trignano, Claudia; Sanna, Raimonda; Boatto, Gianpiero

    2016-04-01

    A multi-residue analytical method was developed for the determination in amniotic fluid (AF) of 13 illicit phenethylamines, including 12 compounds never before investigated in this matrix. Samples were subjected to solid-phase extraction using hydrophilic-lipophilic balance cartridges, which gave good recoveries and low matrix effects on analysis of the extracts. Quantification was performed by liquid chromatography-electrospray tandem mass spectrometry. A water-acetonitrile mobile phase containing 0.1% formic acid, used with a C18 reversed-phase column, provided adequate separation, resolution and signal-to-noise ratio for the analytes and the internal standard. The final optimized method was validated according to international guidelines. A monitoring campaign to assess fetal exposure to these 13 substances of abuse was performed on AF test samples obtained from pregnant women. All mothers (n = 194) reported no use of drugs of abuse during pregnancy, and this was confirmed by the analytical data. PMID:26755540

  1. Field validation of test methods for solidified waste evaluation -- a status report

    SciTech Connect

    Stegemann, J.A.; Caldwell, R.J.; Shi, C.

    1996-12-31

    Application of solidification/stabilization as a treatment technology for hazardous wastes has been hindered by the lack of a regulatory approval mechanism for solidified wastes. The Wastewater Technology Centre (WTC) has developed a protocol for the evaluation of solidified wastes, which uses the performance of a solidified product in twelve laboratory test methods to recommend one of four categories of utilization and disposal. To facilitate acceptance of the protocol, the WTC has initiated a validation study of the test methods. A 63 m³ field test cell has been constructed using electric arc furnace dust solidified with an activated blast furnace slag binder system. The behaviour of the solidified waste in the field will be investigated by monitoring leachate and testing core samples, and compared with properties measured in the laboratory.

  2. Approach to a Method of Integrated Evaluation of Thermal Fatigue and its Validation Using SPECTRA

    NASA Astrophysics Data System (ADS)

    Oumaya, Toru; Nakamura, Akira; Takenaka, Nobuyuki

    Thermal fatigue may initiate at a T-junction or branched-off line where high- and low-temperature fluids mix. These are common piping elements in nuclear power plants. To ensure structural integrity against thermal fatigue during the design phase, it is important to estimate the thermal load from design specifications such as flow rate, temperature difference and pipe diameter. IMAT-F, an evaluation method integrating thermal-hydraulic and structural analysis, was developed in this study to determine thermal load precisely, without safety margins or conservative engineering judgment. The method was validated by numerical flow simulations of the high-cycle thermal fatigue experiment SPECTRA, conducted by the Japan Atomic Energy Agency. Results confirmed that IMAT-F can accurately simulate fluid and pipe-wall temperature fluctuation using fluid-structure coupled analysis. The thermal stress fluctuation resulting from the distribution of temperature fluctuation in the pipe wall was then calculated, and the fatigue life under this fluctuating stress was estimated for comparison with the experimental results.

  3. Validation of Inlet and Exhaust Boundary Conditions for a Cartesian Method

    NASA Technical Reports Server (NTRS)

    Pandya, Shishir A.; Murman, Scott M.; Aftosmis, Michael J.

    2004-01-01

    Inlets and exhaust nozzles are often omitted in aerodynamic simulations of aircraft because of the complexity involved in modeling engine details and flow physics. The omission is often inappropriate, however, since inlet and plume flows can have a substantial effect on vehicle aerodynamics. A method for modeling the effect of inlets and exhaust plumes using boundary conditions within an inviscid Cartesian flow solver is presented. This approach couples with both CAD systems and legacy geometry to provide an automated tool suitable for parameter studies. The method is validated using two- and three-dimensional test problems, which are compared with both theoretical and experimental results. The numerical results demonstrate excellent agreement with theory and available data, even for extremely strong jets and very sensitive inlets.

  4. Validation of the activity expansion method with ultrahigh pressure shock equations of state

    NASA Astrophysics Data System (ADS)

    Rogers, Forrest J.; Young, David A.

    1997-11-01

    Laser shock experiments have recently been used to measure the equation of state (EOS) of matter in the ultrahigh pressure region between condensed matter and a weakly coupled plasma. Some ultrahigh pressure data from nuclear-generated shocks are also available. Matter at these conditions has proven very difficult to treat theoretically. The many-body activity expansion method (ACTEX) has been used for some time to calculate EOS and opacity data in this region, for use in modeling inertial confinement fusion and stellar interior plasmas. In the present work, we carry out a detailed comparison with the available experimental data in order to validate the method. The agreement is good, showing that ACTEX adequately describes strongly shocked matter.

  5. Fit-for-purpose chromatographic method for the determination of amikacin in human plasma for the dosage control of patients.

    PubMed

    Ezquer-Garin, C; Escuder-Gilabert, L; Martín-Biosca, Y; Lisart, R Ferriols; Sagrado, S; Medina-Hernández, M J

    2016-04-01

    In this paper, a simple, rapid and sensitive method based on liquid chromatography with fluorimetric detection (HPLC-FLD) for the determination of amikacin (AMK) in human plasma is developed. Determination is performed by pre-column derivatization of AMK with ortho-phthalaldehyde (OPA) in the presence of N-acetyl-L-cysteine (NAC) at pH 9.5 for 5 min at 80 °C. To our knowledge, this is the first time that NAC has been used in AMK derivatization. Derivatization conditions (pH, AMK/OPA/NAC molar ratios, temperature and reaction time) are optimized to obtain a single derivative that is stable at room temperature. Separation of the derivative is achieved on a reversed-phase LC column (Kromasil C18, 5 μm, 150 × 4.6 mm i.d.) with a mobile phase of 0.05 M phosphate buffer:acetonitrile (80:20, v/v) pumped at a flow rate of 1.0 mL/min. Detection is performed at 337 and 439 nm excitation and emission wavelengths, respectively. The method is fit for purpose as a competitive alternative to the method currently used in many hospitals for AMK dosage control: fluorescence polarization immunoassay (FPIA). The method exhibits linearity in the 0.17-10 µg mL(-1) concentration range with a squared correlation coefficient higher than 0.995. Trueness and intermediate precision are estimated using spiked drug-free plasma samples, and fulfill current UNE-EN ISO 15189:2007 accreditation schemes. Finally, for the first time, statistical comparison against the FPIA method is performed using plasma samples from 31 patients under treatment with AMK. PMID:26838437
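The linearity criterion quoted above (squared correlation coefficient above 0.995 over 0.17-10 µg/mL) is an ordinary least-squares check on a calibration series. A minimal sketch, with hypothetical peak-area data rather than the paper's:

```python
# Illustrative only: least-squares calibration check of the kind of linearity
# figure reported above (r^2 > 0.995). Peak-area data are hypothetical.

def linear_fit(x, y):
    """Return slope, intercept and r^2 of the least-squares line y = a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((yi - (slope * xi + intercept)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return slope, intercept, 1 - ss_res / ss_tot

conc = [0.17, 0.5, 1, 2, 5, 10]               # ug/mL, spiked plasma standards
area = [2.0, 6.1, 11.9, 24.2, 59.8, 120.3]    # fluorescence peak area (a.u.)
slope, intercept, r2 = linear_fit(conc, area)  # r2 should exceed 0.995
```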

  6. Validated spectrophotometric methods for the estimation of moxifloxacin in bulk and pharmaceutical formulations

    NASA Astrophysics Data System (ADS)

    Motwani, Sanjay K.; Chopra, Shruti; Ahmad, Farhan J.; Khar, Roop K.

    2007-10-01

    New, simple, cost-effective, accurate and reproducible UV-spectrophotometric methods were developed and validated for the estimation of moxifloxacin in bulk and pharmaceutical formulations. Moxifloxacin was estimated at 296 nm in 0.1 N hydrochloric acid (pH 1.2) and at 289 nm in phosphate buffer (pH 7.4). Beer's law was obeyed in the concentration range of 1-12 μg ml⁻¹ (r² = 0.9999) in hydrochloric acid and 1-14 μg ml⁻¹ (r² = 0.9998) in the phosphate buffer medium. The apparent molar absorptivity and Sandell's sensitivity coefficient were found to be 4.63 × 10⁴ l mol⁻¹ cm⁻¹ and 9.5 ng cm⁻²/0.001 A in hydrochloric acid, and 4.08 × 10⁴ l mol⁻¹ cm⁻¹ and 10.8 ng cm⁻²/0.001 A in phosphate buffer medium, respectively, indicating the high sensitivity of the proposed methods. These methods were tested and validated for various parameters according to ICH guidelines. The detection and quantitation limits were found to be 0.0402 and 0.1217 μg ml⁻¹ in hydrochloric acid and 0.0384 and 0.1163 μg ml⁻¹ in phosphate buffer medium, respectively. The proposed methods were successfully applied for the determination of moxifloxacin in pharmaceutical formulations (tablets, i.v. infusions, eye drops and polymeric nanoparticles). The results demonstrated that the procedures are accurate, precise and reproducible (relative standard deviation <2%), while being simple, cheap and less time-consuming, and hence can be suitably applied for the estimation of moxifloxacin in different dosage forms and in dissolution studies.
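The detection and quantitation limits above follow the ICH Q2 calibration-based formulas LOD = 3.3σ/S and LOQ = 10σ/S, with S the calibration slope and σ the residual standard deviation of the line. A sketch with invented absorbance data (not the authors' values):

```python
# Sketch of ICH Q2(R1) calibration-based limits: LOD = 3.3*sigma/S and
# LOQ = 10*sigma/S. Absorbance data are invented for illustration.

def lod_loq(conc, signal):
    n = len(conc)
    mx, my = sum(conc) / n, sum(signal) / n
    sxx = sum((x - mx) ** 2 for x in conc)
    slope = sum((x - mx) * (y - my) for x, y in zip(conc, signal)) / sxx
    intercept = my - slope * mx
    resid = [y - (slope * x + intercept) for x, y in zip(conc, signal)]
    sigma = (sum(r * r for r in resid) / (n - 2)) ** 0.5   # residual SD
    return 3.3 * sigma / slope, 10 * sigma / slope

conc = [1, 2, 4, 6, 8, 10, 12]                             # ug/mL standards
absorbance = [0.081, 0.160, 0.323, 0.478, 0.642, 0.799, 0.961]
lod, loq = lod_loq(conc, absorbance)
```

By construction LOQ/LOD is always 10/3.3, so the two limits scale together with the noise of the calibration line.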

  7. Overcoming barriers to validation of non-animal partial replacement methods/Integrated Testing Strategies: the report of an EPAA-ECVAM workshop.

    PubMed

    Kinsner-Ovaskainen, Agnieszka; Akkan, Zerrin; Casati, Silvia; Coecke, Sandra; Corvi, Raffaella; Dal Negro, Gianni; De Bruijn, Jack; De Silva, Odile; Gribaldo, Laura; Griesinger, Claudius; Jaworska, Joanna; Kreysa, Joachim; Maxwell, Gavin; McNamee, Pauline; Price, Anna; Prieto, Pilar; Schubert, Roland; Tosti, Luca; Worth, Andrew; Zuang, Valerie

    2009-09-01

    The use of Integrated Testing Strategies (ITS) in toxicological hazard identification and characterisation is becoming increasingly common as a method for enabling the integration of diverse types of toxicology data. At present, there are no existing procedures and guidelines for the construction and validation of ITS, so a joint EPAA WG5-ECVAM workshop was held with the following objectives: a) to investigate the role of ITS and the need for validation of ITS in the different industry sectors (pharmaceuticals, cosmetics, chemicals); b) to formulate a common definition of ITS applicable across different sectors; c) to explore how and when Three Rs methods are used within ITS; and d) to propose a validation rationale for ITS and for alternative methods that are foreseen to be used within ITS. The EPAA provided a platform for comparing experiences with ITS across different industry sectors. It became clear that every ITS has to be adapted to the product type, R&D stage, and regulatory context. However, common features of ITS were also identified, and this permitted the formulation of a general definition of ITS in a regulatory context. The definition served as a basis for discussing the needs, rationale and process of formal ITS validation. One of the main conclusions was that a formal validation should not be required, unless the strategy will serve as full replacement of an in vivo study used for regulatory purposes. Finally, several challenges and bottlenecks to the ITS validation were identified, and it was agreed that a roadmap on how to address these barriers would be established by the EPAA partners. PMID:19807215

  8. Assessing the validity of prospective hazard analysis methods: a comparison of two techniques

    PubMed Central

    2014-01-01

    Background Prospective Hazard Analysis techniques such as Healthcare Failure Modes and Effects Analysis (HFMEA) and Structured What If Technique (SWIFT) have the potential to increase safety by identifying risks before an adverse event occurs. Published accounts of their application in healthcare have identified benefits, but the reliability of some methods has been found to be low. The aim of this study was to examine the validity of SWIFT and HFMEA by comparing their outputs in the process of risk assessment, and comparing the results with risks identified by retrospective methods. Methods The setting was a community-based anticoagulation clinic, in which risk assessment activities had been previously performed and were available. A SWIFT and an HFMEA workshop were conducted consecutively on the same day by experienced experts. Participants were a mixture of pharmacists, administrative staff and software developers. Both methods produced lists of risks scored according to the method’s procedure. Participants’ views about the value of the workshops were elicited with a questionnaire. Results SWIFT identified 61 risks and HFMEA identified 72 risks. For each method, fewer than half of its hazards were also identified by the other method. There was also little overlap between the results of the workshops and risks identified by prior root cause analysis, staff interviews or clinical governance board discussions. Participants’ feedback indicated that the workshops were viewed as useful. Conclusions Although there was limited overlap, both methods raised important hazards. Scoping the problem area had a considerable influence on the outputs. The opportunity for teams to discuss their work from a risk perspective is valuable, but these methods cannot be relied upon in isolation to provide a comprehensive description. Multiple methods for identifying hazards should be used and data from different sources should be integrated to give a comprehensive view of risk in a system.
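The overlap comparison in the Results is a simple set computation: how many hazards each method shares with the other. A sketch with invented hazard labels (the study itself reported 61 SWIFT and 72 HFMEA risks, with under half shared):

```python
# Sketch of the overlap comparison: shared hazards and the fraction of each
# method's list that the other method also found. Hazard labels are invented.

def overlap_stats(a, b):
    sa, sb = set(a), set(b)
    shared = sa & sb
    return len(shared), len(shared) / len(sa), len(shared) / len(sb)

swift = {"dose-error", "missed-INR", "wrong-patient", "late-letter", "IT-outage"}
hfmea = {"dose-error", "missed-INR", "staff-absence", "label-mixup",
         "IT-outage", "phone-failure"}
n_shared, frac_swift, frac_hfmea = overlap_stats(swift, hfmea)
```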

  9. Validation of the Use of Dried Blood Spot (DBS) Method to Assess Vitamin A Status

    PubMed Central

    Fallah, Elham; Peighambardoust, Seyed Hadi

    2012-01-01

    Background: Vitamin A deficiency is an important dietary deficiency in the world; thus, the necessity of screening for deficient populations is obvious. This paper introduces a fast, cheap and relatively reliable method, called the “dried blood spot” (DBS) method, for screening deficient populations. The validity of this method for retinol measurement was investigated. Method: The “precision” and “agreement” criteria of the DBS method were assessed. The precision was calculated and compared with that of plasma using an F-test. The agreement was evaluated using a Bland-Altman plot. Results: The imprecision of retinol measurements in dried spots was not significantly different from that of the control (plasma). A good correlation coefficient (r²=0.78) was obtained for dried spots’ retinol measurements versus plasma retinol analysis (P < 0.01). A paired t-test showed no significant difference between the DBS and plasma methods on a group level. Imprecision of the DBS measurement was acceptable compared to that of the plasma method. Conclusion: Application of DBS standard samples, in which part of the plasma was replaced with artificial plasma, was shown to be a reliable calibration means for retinol measurements in DBS samples. Retinol in dried spots was stable for 90 days. Overall, the DBS method provided a precise measurement of retinol, with results comparable to the measurement of retinol in plasma. PMID:24688932
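The Bland-Altman agreement analysis used above reduces to the mean bias of paired differences and its 95% limits of agreement (bias ± 1.96 SD). A minimal sketch with hypothetical retinol values (µmol/L), not the study's data:

```python
# Minimal Bland-Altman computation: mean bias and 95% limits of agreement
# between paired DBS and plasma measurements. Values are hypothetical.

def bland_altman(a, b):
    d = [x - y for x, y in zip(a, b)]
    n = len(d)
    bias = sum(d) / n
    sd = (sum((x - bias) ** 2 for x in d) / (n - 1)) ** 0.5
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

dbs    = [0.82, 1.10, 1.48, 0.95, 1.31, 0.99, 1.21, 0.74]
plasma = [0.85, 1.05, 1.55, 0.93, 1.25, 1.05, 1.15, 0.78]
bias, lo_limit, hi_limit = bland_altman(dbs, plasma)
```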

  10. Validated spectrophotometric and chromatographic methods for simultaneous determination of ketorolac tromethamine and phenylephrine hydrochloride.

    PubMed

    Belal, T S; El-Kafrawy, D S; Mahrous, M S; Abdel-Khalek, M M; Abo-Gharam, A H

    2016-07-01

    This work describes five simple and reliable spectrophotometric and chromatographic methods for analysis of the binary mixture of ketorolac tromethamine (KTR) and phenylephrine hydrochloride (PHE). Method I is based on conventional Amax and derivative spectrophotometry with the zero-crossing technique, where KTR was determined using its Amax and (1)D amplitudes at 323 and 341 nm, respectively, while PHE was determined by measuring the (1)D amplitudes at 248.5 nm. Method II involves the application of ratio spectra derivative spectrophotometry. For KTR, 12 μg/mL PHE was used as a divisor and the (1)DD amplitudes at 265 nm were plotted against KTR concentrations; likewise, using 4 μg/mL KTR as divisor, the (1)DD amplitudes at 243.5 nm were found proportional to PHE concentrations. Method III depends on ratio-difference measurement, where the peak-to-trough amplitudes between 260 and 284 nm were measured and correlated to KTR concentration; similarly, the peak-to-trough amplitudes between 235 and 260 nm in the PHE ratio spectra were recorded. For Method IV, the two compounds were separated using Merck HPTLC sheets of silica gel 60 F254 and a mobile phase composed of chloroform/methanol/ammonia (70:30:2, by volume), followed by densitometric measurement of the KTR and PHE spots at 320 and 278 nm, respectively. Method V depends on HPLC-DAD. Effective chromatographic separation was achieved using a Zorbax Eclipse Plus C8 column (4.6 × 250 mm, 5 μm) with a mobile phase consisting of 0.05 M o-phosphoric acid and acetonitrile (50:50, by volume) at a flow rate of 1 mL/min and detection at 313 and 274 nm for KTR and PHE, respectively. Analytical performance of the developed methods was statistically validated according to the ICH guidelines with respect to linearity, ranges, precision, accuracy, and detection and quantification limits. The validated spectrophotometric and chromatographic methods were successfully applied to the simultaneous analysis of KTR and PHE in synthetic mixtures.

  11. VALIDATION OF ANALYTICAL METHODS AND INSTRUMENTATION FOR BERYLLIUM MEASUREMENT: REVIEW AND SUMMARY OF AVAILABLE GUIDES, PROCEDURES, AND PROTOCOLS

    SciTech Connect

    Ekechukwu, A.

    2008-12-17

    This document provides a listing of available sources that can be used to validate analytical methods and/or instrumentation for beryllium determination. A literature review was conducted of available standard methods and publications used for method validation and/or quality control. A comprehensive listing of the articles, papers, and books reviewed is given in Appendix 1. Available validation documents and guides are listed in the appendix; each has a brief description of application and use. In the referenced sources, there are varying approaches to validation and varying descriptions of validation at different stages in method development. This discussion focuses on validation and verification of fully developed methods and instrumentation that have been offered up for use or approval by other laboratories or official consensus bodies such as ASTM International, the International Standards Organization (ISO) and the Association of Official Analytical Chemists (AOAC). This review was conducted as part of a collaborative effort to investigate and improve the state of validation for measuring beryllium in the workplace and the environment. Documents and publications from the United States and Europe are included. Unless otherwise specified, all documents were published in English.

  12. Validation of analytical methods and instrumentation for beryllium measurement: review and summary of available guides, procedures, and protocols.

    PubMed

    Ekechukwu, Amy; Hendricks, Warren; White, Kenneth T; Liabastre, Albert; Archuleta, Melecita; Hoover, Mark D

    2009-12-01

    This document provides a listing of available sources that can be used to validate analytical methods and/or instrumentation for beryllium determination. A literature review was conducted of available standard methods and publications used for method validation and/or quality control. An annotated listing of the articles, papers, and books reviewed is given in the Appendix. Available validation documents and guides are listed therein; each has a brief description of application and use. In the referenced sources, there are varying approaches to validation and varying descriptions of the validation process at different stages in method development. This discussion focuses on validation and verification of fully developed methods and instrumentation that have been offered for use or approval by other laboratories or official consensus bodies such as ASTM International, the International Standards Organization, the International Electrotechnical Commission, and the Association of Official Analytical Chemists. This review was conducted as part of a collaborative effort to investigate and improve the state of validation for measuring beryllium in the workplace and the environment. Documents and publications from the United States and Europe are included. PMID:19894179

  13. Validation of the MODIS Land-Surface Temperature Products with Temperature and Radiance-based Methods

    NASA Astrophysics Data System (ADS)

    Wan, Z.; Zhang, Y.; Zhang, Q.

    2003-12-01

    A major field campaign was conducted in Railroad Valley, NV, in June 2003. Ground-based measurements were made on the clear-sky days from June 26 to 30. Sky radiance and surface-leaving TIR radiance in sunshine and shadow conditions were measured with a Bomem TIR spectroradiometer. Diurnal surface temperatures were measured with four TIR radiometers. Six radiosonde balloons were launched during the period of clear-sky days to measure the atmospheric temperature and water vapor profiles. MODIS Airborne Simulator (MAS) data were acquired in a daytime flight and a nighttime flight on June 27. An excellent match between the measured spectral sky radiance and the radiance calculated with the atmospheric radiative transfer code MODTRAN4.0, based on the measured atmospheric profiles, provides solid evidence of the good quality of both the TIR spectroradiometer and the radiative transfer code. The measured surface-leaving TIR radiance in sunshine and shadow conditions was used to retrieve playa surface spectral emissivity by a sun-shadow method. The band-averaged emissivities calculated from the retrieved spectral emissivity agree within 0.005 with those used in the MODIS split-window LST algorithm for the site. Terra and Aqua MODIS 1 km LST products were validated with a temperature-based method using the LSTs measured by the TIR radiometers at night. This method is limited by the spatial variation in LSTs, which is clearly visible in the day and night MAS images. The LST products were also validated in day and night conditions with a radiance-based method, which is based on the MODTRAN code, measured surface emissivity and atmospheric profiles. The LST accuracies are better than 1 K in all seven Aqua cases, where zenith viewing angles are up to 56°, and in four of six Terra cases. The LST accuracy is better than 1.5 K in the remaining two Terra cases, with viewing angles at 54° and 60°. The accuracy of nighttime LSTs at viewing angles within 47° is better than 0

  14. [Development and validation of event-specific quantitative PCR method for genetically modified maize LY038].

    PubMed

    Mano, Junichi; Masubuchi, Tomoko; Hatano, Shuko; Futo, Satoshi; Koiwa, Tomohiro; Minegishi, Yasutaka; Noguchi, Akio; Kondo, Kazunari; Akiyama, Hiroshi; Teshima, Reiko; Kurashima, Takeyo; Takabatake, Reona; Kitta, Kazumi

    2013-01-01

    In this article, we report a novel real-time PCR-based analytical method for quantitation of the GM maize event LY038. We designed LY038-specific and maize endogenous reference DNA-specific PCR amplifications. After confirming the specificity and linearity of the LY038-specific PCR amplification, we determined the conversion factor required to calculate the weight-based genetically modified organism (GMO) content in a multilaboratory evaluation. Finally, in order to validate the developed method, an interlaboratory collaborative trial according to internationally harmonized guidelines was performed with blind DNA samples containing LY038 at mixing levels of 0, 0.5, 1.0, 5.0 and 10.0%. The precision of the method was evaluated as the relative standard deviation of reproducibility (RSDR), and the values obtained were all less than 25%. The limit of quantitation of the method was judged to be 0.5%, based on the definition in the ISO 24276 guideline. The results of the collaborative trial suggest that the developed quantitative method is suitable for practical testing of LY038 maize. PMID:23470871
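The precision measure used in the collaborative trial, RSDR, is simply the between-laboratory standard deviation expressed as a percentage of the mean, checked against the <25% acceptance threshold. A sketch with hypothetical laboratory results for the 1.0% blind sample:

```python
# Sketch of the collaborative-trial precision measure: relative standard
# deviation of reproducibility (RSD_R), acceptance threshold < 25%.
# Laboratory results below are hypothetical.

def rsd_percent(values):
    n = len(values)
    mean = sum(values) / n
    sd = (sum((v - mean) ** 2 for v in values) / (n - 1)) ** 0.5
    return 100 * sd / mean

labs = [0.92, 1.08, 1.11, 0.85, 1.02, 0.97, 1.15, 0.90]   # measured GMO %
rsd_r = rsd_percent(labs)                                 # about 10.8 %
```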

  15. Validation of a novel derivatization method for GC-ECD determination of acrylamide in food.

    PubMed

    Notardonato, Ivan; Avino, Pasquale; Centola, Angela; Cinelli, Giuseppe; Russo, Mario Vincenzo

    2013-07-01

    This paper proposes a new method for quantitative analysis of acrylamide in cereal-based foods and potato chips. The method uses reaction with trifluoroacetic anhydride, and analyses the resulting derivative by gas chromatography with electron-capture detection (GC-ECD). The effects of derivatization conditions, including temperature, reaction time, and catalyst, on the acylation reaction were evaluated. Chromatographic analysis was performed on an SE-54 capillary column. Under the optimum conditions, good retention and peak response were achieved for the acrylamide derivative. The analytical method was fully validated by assessment of LODs and LOQs (1 ng g(-1) and 25 ng g(-1), with relative standard deviations (RSD) of 2.1 and 3.6, respectively), linearity (R = 0.9935 over the range 0.03-10 μg g(-1)), and extraction recovery (>96%, with RSD below 2.0, for acrylamide spiked at 1, 20, 50, and 100 ng g(-1); 99.8% for acrylamide content >1000 ng g(-1)). The method requires no clean-up of the acrylamide derivative before injection. The method has been successfully used to determine acrylamide levels in different commercial cereal-based foods, French fries, and potato chips. PMID:23660693

  16. Validated UPLC method for determination of unbound bile acids in colesevelam HCl tablets.

    PubMed

    Vallapragada, Venkata Vivekanand; Inti, Gopichand; Vidiyala, Sudhakar Rao; Jadi, Sreeramulu

    2015-01-01

    A simple, precise and accurate gradient reversed-phase ultra-performance liquid chromatographic method was developed for the quantitative determination of bile acids [glycocholic acid (GCA), glycochenodeoxycholic acid (GCDA) and taurodeoxycholic acid (TDCA)] in an in vitro bile acid-binding study of Welchol tablets. The method was developed using a Phenomenex Kinetex C18 (50 × 2.10 mm, 1.7 µm) column with a mobile phase containing a gradient mixture of solvent A, consisting of 0.02 M tetrabutylammonium phosphate (pH 7.5), and solvent B, consisting of acetonitrile. The eluted compounds were monitored at 210 nm and the runtime was within 2 min. The binding parameter constants of Colesevelam HCl 625 mg tablets were determined by UPLC using the Langmuir approximation at pH 6.8. The method is selective and capable of detecting bile acids in the presence of placebo matrix. The method has been validated with a lower limit of quantitation of 0.01 mM for bile acids. A linear response function was established for the concentration range 0.01-30.0 mM (r > 0.99) for GCA, GCDA and TDCA. The intra- and interday precision values for the bile acids met the acceptance criteria of the Food and Drug Administration guidelines. The developed method was applied to in vitro bile acid-binding studies of Colesevelam HCl tablets. PMID:24795077
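The Langmuir approximation mentioned above models binding as q = NKC/(1+KC), and its parameters can be extracted from measured data via the common linearization C/q = C/N + 1/(NK), a straight line in C. A sketch with synthetic data (the parameter values are assumptions, not the paper's results):

```python
# Sketch of a Langmuir binding fit via the linearization C/q = C/N + 1/(N*K):
# the slope gives the capacity N, the intercept gives the affinity K.
# N_true and K_true below are assumed values used to generate synthetic data.

def langmuir_fit(c_free, q_bound):
    x = c_free
    y = [c / q for c, q in zip(c_free, q_bound)]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
            sum((xi - mx) ** 2 for xi in x)
    intercept = my - slope * mx
    return 1 / slope, slope / intercept          # N, K

N_true, K_true = 2.0, 1.5                        # mmol/g, per mM (assumed)
c = [0.1, 0.25, 0.5, 1.0, 2.0, 5.0]              # free bile acid, mM
q = [N_true * K_true * ci / (1 + K_true * ci) for ci in c]
N_fit, K_fit = langmuir_fit(c, q)                # recovers N_true, K_true
```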

  17. Physically consistent viscosity of polyphase rocks: a new method and its validation

    NASA Astrophysics Data System (ADS)

    Huet, B.; Yamato, P.; Grasemann, B.

    2012-04-01

    Metamorphic reactions constitute one of the major processes inducing strain localisation and influencing the strength of the lithosphere. However, this process has so far seldom been explicitly taken into account in large-scale thermomechanical models. Such a development requires calculating the strength of any rock from its mineralogical composition and the strength of its components. Most existing strength models for polyphase rocks are empirical; those that are physically consistent provide only strength bounds and/or lead to long and complex calculations, which is not suitable for large-scale modelling. Here, we present a new method to calculate the bulk viscosity of a polyphase rock knowing the fraction and the creep parameters of each phase constituting the rock. This analytical method minimizes the power dissipated in the polyphase rock using the Lagrange multiplier technique. It is simple, quickly yields values of the bulk viscosity as well as the partitioning of stress and strain rate between phases, and allows us to re-evaluate the classical bounds and to compute a close, physically consistent approximation of the bulk viscosity and bulk creep parameters. The method is then tested and validated against experimental data and numerical models under simple shear conditions. Finally, we present an application of the method to the evolution of strength in a subducting slab.
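For context, the classical bounds the authors re-evaluate are the uniform-strain-rate (arithmetic) upper bound and uniform-stress (harmonic) lower bound on aggregate viscosity, with the geometric mean a common intermediate estimate. A sketch with illustrative phase fractions and viscosities (not values from the paper):

```python
import math

# Classical viscosity bounds for a polyphase aggregate:
# arithmetic average (iso-strain-rate, upper), harmonic average (iso-stress,
# lower), and the geometric-mean estimate in between. Values are illustrative.

def viscosity_bounds(phi, eta):
    upper = sum(f * e for f, e in zip(phi, eta))          # iso-strain-rate
    lower = 1.0 / sum(f / e for f, e in zip(phi, eta))    # iso-stress
    geo = math.exp(sum(f * math.log(e) for f, e in zip(phi, eta)))
    return lower, geo, upper

# 70% strong phase (1e21 Pa s) mixed with 30% weak phase (1e19 Pa s)
lo_bound, geo_mean, up_bound = viscosity_bounds([0.7, 0.3], [1e21, 1e19])
```

Note how far apart the bounds sit for a strong viscosity contrast, which is precisely why a physically consistent estimate between them is useful.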

  18. Description and validation of a limb scatter retrieval method for Odin/OSIRIS

    NASA Astrophysics Data System (ADS)

    Tukiainen, S.; Hassinen, S.; SeppäLä, A.; Auvinen, H.; KyröLä, E.; Tamminen, J.; Haley, C. S.; Lloyd, N.; Verronen, P. T.

    2008-02-01

    In this paper we present the Modified Onion Peeling (MOP) inversion method, used here for the first time to retrieve vertical profiles of stratospheric trace gases from Odin/OSIRIS limb scatter measurements. Since the original publication of the method in 2002, it has undergone major modifications, discussed here. The MOP method now uses a spectral microwindow for the NO2 retrieval, instead of the wide UV-visible band used for the ozone, air and aerosol retrievals. We give a brief description of the algorithm itself and show its performance with both simulated and real data. Retrieved ozone and NO2 profiles from the OSIRIS measurements were compared with data from the GOMOS and HALOE instruments. No more than 5% difference was found between OSIRIS daytime and GOMOS nighttime ozone profiles between 21 and 45 km. The difference between OSIRIS and HALOE sunset NO2 mixing ratio profiles was at most 25% between 20 and 40 km. The neutral air density was compared with ECMWF analysis data, and around 5% difference was found at altitudes from 20 to 55 km. However, OSIRIS observations yield as much as 80% greater aerosol number density than GOMOS observations between 15 and 35 km. These validation results indicate that the quality of the MOP ozone, NO2 and neutral air products is good. The new version of the method introduced here is also easily extended to retrieve additional species of interest.

  19. Compatible validated spectrofluorimetric and spectrophotometric methods for determination of vildagliptin and saxagliptin by factorial design experiments.

    PubMed

    Abdel-Aziz, Omar; Ayad, Miriam F; Tadros, Mariam M

    2015-04-01

    Simple, selective and reproducible spectrofluorimetric and spectrophotometric methods have been developed for the determination of vildagliptin and saxagliptin in bulk and in their pharmaceutical dosage forms. The first proposed spectrofluorimetric method is based on the dansylation reaction of the amino group of vildagliptin with dansyl chloride to form a highly fluorescent product. The formed product was measured spectrofluorimetrically at 455 nm after excitation at 345 nm. Beer's law was obeyed in a concentration range of 100-600 μg ml(-1). The second proposed spectrophotometric method is based on the charge-transfer complex of saxagliptin with tetrachloro-1,4-benzoquinone (p-chloranil). The formed charge-transfer complex was measured spectrophotometrically at 530 nm. Beer's law was obeyed in a concentration range of 100-850 μg ml(-1). The third proposed spectrophotometric method is based on the condensation reaction of the primary amino group of saxagliptin with formaldehyde and acetylacetone to form a yellow-colored product, known as the Hantzsch reaction, measured at 342.5 nm. Beer's law was obeyed in a concentration range of 50-300 μg ml(-1). All variables were studied to optimize the reaction conditions using factorial design. The developed methods were validated and proved to be specific and accurate for quality control of vildagliptin and saxagliptin in their pharmaceutical dosage forms. PMID:25613694

  20. Compatible validated spectrofluorimetric and spectrophotometric methods for determination of vildagliptin and saxagliptin by factorial design experiments

    NASA Astrophysics Data System (ADS)

    Abdel-Aziz, Omar; Ayad, Miriam F.; Tadros, Mariam M.

    2015-04-01

    Simple, selective and reproducible spectrofluorimetric and spectrophotometric methods have been developed for the determination of vildagliptin and saxagliptin in bulk and in their pharmaceutical dosage forms. The first proposed spectrofluorimetric method is based on the dansylation reaction of the amino group of vildagliptin with dansyl chloride to form a highly fluorescent product. The formed product was measured spectrofluorimetrically at 455 nm after excitation at 345 nm. Beer's law was obeyed in a concentration range of 100-600 μg ml-1. The second proposed spectrophotometric method is based on the charge-transfer complex of saxagliptin with tetrachloro-1,4-benzoquinone (p-chloranil). The formed charge-transfer complex was measured spectrophotometrically at 530 nm. Beer's law was obeyed in a concentration range of 100-850 μg ml-1. The third proposed spectrophotometric method is based on the condensation reaction of the primary amino group of saxagliptin with formaldehyde and acetylacetone to form a yellow-colored product, known as the Hantzsch reaction, measured at 342.5 nm. Beer's law was obeyed in a concentration range of 50-300 μg ml-1. All variables were studied to optimize the reaction conditions using factorial design. The developed methods were validated and proved to be specific and accurate for quality control of vildagliptin and saxagliptin in their pharmaceutical dosage forms.

  1. European validation of Real-Time PCR method for detection of Salmonella spp. in pork meat.

    PubMed

    Delibato, Elisabetta; Rodriguez-Lazaro, David; Gianfranceschi, Monica; De Cesare, Alessandra; Comin, Damiano; Gattuso, Antonietta; Hernandez, Marta; Sonnessa, Michele; Pasquali, Frédérique; Sreter-Lancz, Zuzsanna; Saiz-Abajo, María-José; Pérez-De-Juan, Javier; Butrón, Javier; Prukner-Radovcic, Estella; Horvatek Tomic, Danijela; Johannessen, Gro S; Jakočiūnė, Džiuginta; Olsen, John E; Chemaly, Marianne; Le Gall, Francoise; González-García, Patricia; Lettini, Antonia Anna; Lukac, Maja; Quesne, Segolénè; Zampieron, Claudia; De Santis, Paola; Lovari, Sarah; Bertasi, Barbara; Pavoni, Enrico; Proroga, Yolande T R; Capuano, Federico; Manfreda, Gerardo; De Medici, Dario

    2014-08-01

The classical microbiological method for detection of Salmonella spp. requires more than five days for final confirmation, so an alternative methodology is needed for detecting this pathogen, particularly in food categories with a short shelf-life. This study presents an international (European-level) ISO 16140-based validation study of a non-proprietary Real-Time PCR-based method that can deliver final results the day after sample analysis. It is based on an ISO-compatible enrichment coupled to a simple, inexpensive DNA extraction and a consolidated Real-Time PCR assay. Thirteen laboratories from seven European countries participated in this trial, with pork meat selected as the food model. The observed limit of detection was down to 10 CFU per 25 g of sample, with excellent concordance and accordance values between samples and laboratories (100%). In addition, excellent values were obtained for relative accuracy, specificity and sensitivity (100%) when the results of the Real-Time PCR-based method were compared with those of the ISO 6579:2002 standard method. The results of this international trial demonstrate that the evaluated Real-Time PCR-based method is an excellent alternative to the ISO standard: it matches the standard's performance while dramatically shortening the analytical process, and it can easily be implemented routinely by Competent Authority and food industry laboratories. PMID:24513055

  2. Spectrofluorimetric Method for Estimation of Curcumin in Rat Blood Plasma: Development and Validation

    NASA Astrophysics Data System (ADS)

    Trivedi, J.; Variya, B.; Gandhi, H.; Rathod, S. P.

    2016-01-01

Curcumin is a medicinally important phytoconstituent of curcuminoids. The present study describes the development of a simple spectrofluorimetric method for the estimation of curcumin in rat plasma, with excitation at 257 nm and emission at 504 nm. Sample preparation involves only two steps: extraction of curcumin and drying of the extract. The samples are then reconstituted with ethyl acetate, and relative fluorescence intensity is measured using a spectrofluorimeter. The method was validated as per CDER guidelines. The method was linear over the range of 100-500 ng/mL, with accuracy and precision within 2% RSD. The LOD and LOQ were 15.3 and 46.1 ng/mL, respectively. The method was applied to pharmacokinetic evaluation in rats; AUC, Cmax, and Tmax were found to be 5580 ± 1006 h × ng/mL, 1526 ± 209 ng/mL, and 2.97 ± 0.28 h, respectively, with a plasma half-life of 1.14 ± 0.27 h.
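The pharmacokinetic figures above (AUC, Cmax, Tmax) reduce to standard non-compartmental arithmetic. A minimal sketch using the linear trapezoidal rule; the concentration-time profile below is hypothetical, not the study's raw data:

```python
def trapezoidal_auc(times, concs):
    """AUC(0 -> t_last) by the linear trapezoidal rule."""
    return sum((t2 - t1) * (c1 + c2) / 2
               for t1, t2, c1, c2 in zip(times, times[1:], concs, concs[1:]))

def cmax_tmax(times, concs):
    """Peak concentration and the time at which it occurs."""
    i = max(range(len(concs)), key=concs.__getitem__)
    return concs[i], times[i]

# Hypothetical profile (h, ng/mL) -- illustrative only
times = [0.0, 0.5, 1.0, 2.0, 3.0, 4.0, 6.0]
concs = [0.0, 400.0, 900.0, 1500.0, 1200.0, 700.0, 200.0]

auc = trapezoidal_auc(times, concs)      # h x ng/mL
cmax, tmax = cmax_tmax(times, concs)
```

Estimating the terminal half-life would additionally require a log-linear regression over the terminal phase, which is omitted here.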

  3. Validation of prescribing appropriateness criteria for older Australians using the RAND/UCLA appropriateness method

    PubMed Central

    Basger, Benjamin Joseph; Chen, Timothy Frank; Moles, Rebekah Jane

    2012-01-01

Objective To further develop and validate previously published national prescribing appropriateness criteria to assist in identifying drug-related problems (DRPs) for commonly occurring medications and medical conditions in older (≥65 years old) Australians. Design RAND/UCLA appropriateness method. Participants A panel of medication management experts was identified, consisting of geriatricians/pharmacologists, clinical pharmacists and disease management advisors to organisations that produce Australian evidence-based therapeutic publications. This resulted in a round-one panel of 15 members and a round-two panel of 12 members. Main outcome measure Agreement on all criteria. Results Forty-eight prescribing criteria were rated. In the first rating round, conducted via email, median panel ratings indicated disagreement on 17 of the criteria. During a face-to-face second-round meeting, discussion resulted in 25 criteria being retained after amendment, 14 criteria agreed with no changes required, and 9 criteria deleted. Two new criteria were added, resulting in a final validated list of 41 prescribing appropriateness criteria. After round two, agreement was reached for all 41 criteria, as measured by median panel ratings and the dispersion of panel ratings based on the interpercentile range. Conclusions A set of 41 Australian prescribing appropriateness criteria was validated by an expert panel. Use of these criteria, together with clinical judgement and other medication review processes such as patient interview, is intended to help improve patient care by efficiently detecting potential DRPs related to commonly occurring medicines and medical conditions in older Australians. These criteria may also contribute to the medication management education of healthcare professionals. PMID:22983875
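The agreement arithmetic described above (median panel ratings plus dispersion via an interpercentile range) can be sketched as follows. This is a simplified illustration: the 30th-70th percentile IPR, the fixed dispersion threshold and the ratings are assumptions, and the full RAND/UCLA method compares the IPR with an asymmetry-adjusted value (IPRAS) rather than a constant.

```python
from statistics import median

def percentile(sorted_vals, p):
    """Linear-interpolation percentile (p in 0..100) of pre-sorted data."""
    k = (len(sorted_vals) - 1) * p / 100
    lo, hi = int(k), min(int(k) + 1, len(sorted_vals) - 1)
    return sorted_vals[lo] + (sorted_vals[hi] - sorted_vals[lo]) * (k - lo)

def rate_criterion(ratings, ipr_threshold=2.0):
    """Classify one criterion from its 1-9 panel ratings (simplified)."""
    vals = sorted(ratings)
    med = median(vals)
    ipr = percentile(vals, 70) - percentile(vals, 30)
    label = ("appropriate" if med >= 7
             else "uncertain" if med >= 4
             else "inappropriate")
    return med, ipr, ipr <= ipr_threshold, label

# Twelve hypothetical round-two ratings for a single criterion
med, ipr, agreed, label = rate_criterion([7, 8, 8, 9, 7, 8, 9, 7, 8, 8, 9, 7])
```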

  4. Development and validation of a LC-MS method for quantitation of ergot alkaloids in lateral saphenous vein tissue

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A liquid chromatography-mass spectrometry (LC/MS) method for simultaneous quantitation of seven ergot alkaloids (lysergic acid, ergonovine, ergovaline, ergocornine, ergotamine, ergocryptine and ergocrystine) in vascular tissue was developed and validated. Reverse-phase chromatography, coupled to an...

  5. CE-C(4)D method development and validation for the assay of ciprofloxacin.

    PubMed

    Paul, Prasanta; Van Laeken, Christophe; Sänger-van de Griend, Cari; Adams, Erwin; Van Schepdael, Ann

    2016-09-10

A capillary electrophoresis method with capacitively coupled contactless conductivity detection (CE-C(4)D) has been developed, optimized and validated for the determination of ciprofloxacin. Ciprofloxacin is a fluoroquinolone antibiotic with broad-spectrum bactericidal activity, recommended for complicated respiratory infections, sexually transmitted diseases, tuberculosis, bacterial diarrhea, etc. Method development focused on the quality by design (QbD) approach. During method development, multiple buffers were tried at different ionic strengths; however, the optimized method finally involved a very simple background electrolyte, 10 mM monosodium citrate without pH adjustment. The optimized CE-C(4)D method used an uncoated fused-silica capillary (59/39 cm, 50 μm i.d.) and hydrodynamic sample injection at a pressure of 0.5 p.s.i. for 5 s. The separation was conducted for 10 min at normal polarity with a voltage of 20 kV, corresponding to a current of 5.9 μA. LiCl (1 mg/mL) was used as an internal standard. The optimized method is robust and accurate (recovery >98%), eluting the ciprofloxacin peak within five minutes with good linearity (R(2) > 0.999) over the concentration range of 0.0126-0.8 mg/mL. Repeatability, expressed as the percentage relative standard deviation (%RSD) of the relative peak areas (RPA), was good both intra-day (<3%) and inter-day (3.1%). This method, proven to be free of matrix interference, showed that the estimated percent content of ciprofloxacin (102%) was within the official requirements. Moreover, owing to its ease of use and robustness, the method should also be applicable in less well-controlled laboratory environments. PMID:27386824

  6. A validated spectrofluorimetric method for the determination of nifuroxazide through coumarin formation using experimental design

    PubMed Central

    2013-01-01

Background Nifuroxazide (NF) is an oral nitrofuran antibiotic with a wide range of bactericidal activity against gram-positive and gram-negative enteropathogenic organisms. It is formulated either alone, as an intestinal antiseptic, or in combination with drotaverine (DV) for the treatment of gastroenteritis accompanied by gastrointestinal spasm. Spectrofluorimetry is a convenient and sensitive technique for pharmaceutical quality control. The proposed spectrofluorimetric method allows the determination of NF either alone or in a binary mixture with DV. Furthermore, experimental conditions were optimized using experimental design, which has many advantages over the older one-variable-at-a-time (OVAT) approach. Results A novel and sensitive spectrofluorimetric method was designed and validated for the determination of NF in pharmaceutical formulations. The method was based on the formation of a highly fluorescent coumarin compound through the reaction between NF and ethyl acetoacetate (EAA) with sulfuric acid as catalyst. The fluorescence was measured at 390 nm upon excitation at 340 nm. Experimental design was used to optimize the experimental conditions: the volumes of EAA and sulfuric acid, temperature and heating time were considered the critical factors to be studied in order to establish optimum fluorescence, and each pair of factors was tested at three levels. Regression analysis revealed good correlation between fluorescence intensity and concentration over the range 20-400 ng ml⁻¹. The suggested method was successfully applied to the determination of NF in pure and capsule forms. The procedure was validated in terms of linearity, accuracy, precision, limit of detection and limit of quantification. The selectivity of the method was investigated by analysis of NF in the presence of the co-mixed drug DV, where no interference was observed. The reaction pathway was suggested and the structure of the fluorescent product proposed.
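The pairwise three-level design described above amounts to enumerating a 3 × 3 grid of runs per factor pair. A minimal sketch; the factor names and levels are hypothetical placeholders, not the paper's actual settings:

```python
from itertools import product

# One two-factor, three-level block: 3 x 3 = 9 experimental runs
eaa_volume_mL = [0.5, 1.0, 1.5]      # assumed levels
heating_time_min = [10, 20, 30]      # assumed levels

runs = list(product(eaa_volume_mL, heating_time_min))
```

A full factorial over all four critical factors at three levels would instead give 3⁴ = 81 runs, which is why pairwise blocks keep the experimental effort manageable.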

  7. Fatal Intoxication Involving 3-MeO-PCP: A Case Report and Validated Method.

    PubMed

    Bakota, Erica; Arndt, Crystal; Romoser, Amelia A; Wilson, Stephen K

    2016-09-01

    We present in this case report a validated method for accurate quantitative analysis of 3-methoxy phencyclidine (3-MeO-PCP) to determine postmortem blood concentrations of this PCP analog. A 29-year-old male with a history of illicit drug use was found unresponsive in his bed with a bag of white powder next to him. Resuscitation efforts were unsuccessful and the individual was pronounced dead 9 minutes after arrival to the hospital. Initial ELISA screening suggested the presence of PCP in the decedent's blood. However, confirmatory testing revealed no detectable PCP. Instead, a large peak corresponding to a m/z 274.218 species with retention time similar to PCP was present on a LC-TOF-MS drug screen, suggesting a possible PCP analog. This mass corresponds specifically to a methoxy-PCP analog, several of which are available for purchase online. Standards for 3-MeO-PCP and 4-MeO-PCP were obtained and injected on the same instrument. Although the 3- and 4-MeO-PCP analogs have identical masses and retention times, they are still distinguishable through their mass spectra. The peak from the decedent's sample matched both the mass spectrum and the retention time of 3-MeO-PCP. A quantitative LC-MS-MS method was subsequently developed and validated for casework. Analysis using this method revealed a concentration of 139 ± 41 µg/L 3-MeO-PCP in the decedent's blood. Diphenhydramine (4.1 ± 0.7 mg/L), marijuana metabolite (presumptive positive, confirmation not performed) and a small amount of amphetamine (<0.10 mg/L) were also found in the decedent's blood. The cause of death was determined to be combined 3-MeO-PCP, diphenhydramine and amphetamine toxicity. The manner of death was certified as an accident. PMID:27339479

  8. Development and Content Validation of the Information Assessment Method for Patients and Consumers

    PubMed Central

    Bartlett, Gillian; Grad, Roland M; Tang, David L; Johnson-Lafleur, Janique; Shulha, Michael; Barbosa Galvão, Maria Cristiane; Ricarte, Ivan LM; Stephenson, Randolph; Shohet, Linda; Hutsul, Jo-Anne; Repchinsky, Carol A; Rosenberg, Ellen; Burnand, Bernard; Légaré, France; Dunikowski, Lynn; Murray, Susan; Boruff, Jill; Frati, Francesca; Kloda, Lorie; Macaulay, Ann; Lagarde, François; Doray, Geneviève

    2014-01-01

Background Online consumer health information addresses health problems, self-care, disease prevention, and health care services and is intended for the general public. Using this information, people can improve their knowledge, participation in health decision-making, and health. However, there are no comprehensive instruments to evaluate the value of health information from a consumer perspective. Objective We collaborated with information providers to develop and validate the Information Assessment Method for all (IAM4all) that can be used to collect feedback from information consumers (including patients), and to enable a two-way knowledge translation between information providers and consumers. Methods Content validation steps were followed to develop the IAM4all questionnaire. The first version was based on a theoretical framework from information science, a critical literature review and prior work. Then, 16 laypersons were interviewed on their experience with online health information and specifically their impression of the IAM4all questionnaire. Based on the summaries and interpretations of interviews, questionnaire items were revised, added, and excluded, thus creating the second version of the questionnaire. Subsequently, a panel of 12 information specialists and 8 health researchers participated in an online survey to rate each questionnaire item for relevance, clarity, representativeness, and specificity. The result of this expert panel contributed to the third, current, version of the questionnaire. Results The current version of the IAM4all questionnaire is structured by four levels of outcomes of information seeking/receiving: situational relevance, cognitive impact, information use, and health benefits. Following the interviews and the expert panel survey, 9 questionnaire items were confirmed as relevant, clear, representative, and specific. To improve readability and accessibility for users with a lower level of literacy, 19 items were reworded.

  9. Validating a local Arterial Input Function method for improved perfusion quantification in stroke

    PubMed Central

    Willats, Lisa; Christensen, Soren; K Ma, Henry; A Donnan, Geoffrey; Connelly, Alan; Calamante, Fernando

    2011-01-01

    In bolus-tracking perfusion magnetic resonance imaging (MRI), temporal dispersion of the contrast bolus due to stenosis or collateral supply presents a significant problem for accurate perfusion quantification in stroke. One means to reduce the associated perfusion errors is to deconvolve the bolus concentration time-course data with local Arterial Input Functions (AIFs) measured close to the capillary bed and downstream of the arterial abnormalities causing dispersion. Because the MRI voxel resolution precludes direct local AIF measurements, they must be extrapolated from the surrounding data. To date, there have been no published studies directly validating these local AIFs. We assess the effectiveness of local AIFs in reducing dispersion-induced perfusion error by measuring the residual dispersion remaining in the local AIF deconvolved perfusion maps. Two approaches to locating the local AIF voxels are assessed and compared with a global AIF deconvolution across 19 bolus-tracking data sets from patients with stroke. The local AIF methods reduced dispersion in the majority of data sets, suggesting more accurate perfusion quantification. Importantly, the validation inherently identifies potential areas for perfusion underestimation. This is valuable information for the identification of at-risk tissue and management of stroke patients. PMID:21629260
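The deconvolution step at the heart of this kind of perfusion quantification can be sketched with a truncated-SVD pseudo-inverse of the AIF convolution matrix. The curves below are idealized (a decaying-exponential input and residue function), and the truncation threshold is set low for noiseless synthetic data; this illustrates the general technique, not the study's local-AIF implementation.

```python
import numpy as np

def svd_deconvolve(aif, tissue, dt, rel_threshold=1e-3):
    """Estimate the flow-scaled residue function CBF*R(t) from
    tissue(t) = dt * conv(aif, CBF*R)(t) via a truncated-SVD inverse.
    rel_threshold discards small singular values; noisy clinical data
    would need a much larger value (on the order of 0.1-0.2)."""
    n = len(aif)
    # Lower-triangular Toeplitz convolution matrix built from the AIF
    A = dt * np.array([[aif[i - j] if i >= j else 0.0 for j in range(n)]
                       for i in range(n)])
    U, s, Vt = np.linalg.svd(A)
    s_inv = np.where(s > rel_threshold * s.max(), 1.0 / s, 0.0)
    return Vt.T @ (s_inv * (U.T @ tissue))

dt = 1.0
t = np.arange(20, dtype=float)
aif = np.exp(-t / 3.0)                  # idealized arterial input curve
true_cbf, mtt = 0.6, 4.0
residue = true_cbf * np.exp(-t / mtt)   # flow-scaled residue function
tissue = dt * np.convolve(aif, residue)[: len(t)]

est = svd_deconvolve(aif, tissue, dt)   # peak of est estimates CBF
```

Dispersion of the bolus corresponds to deconvolving with a broadened AIF, which biases the recovered peak downward; this is exactly the error the local-AIF approach above aims to reduce.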

  10. Method validation and analysis of nine dithiocarbamates in fruits and vegetables by LC-MS/MS.

    PubMed

    Schmidt, B; Christensen, H B; Petersen, A; Sloth, J J; Poulsen, M E

    2013-01-01

An analytical method for the separation and quantitative determination of nine dithiocarbamates (DTCs) in fruits and vegetables by LC-MS/MS was developed, validated and applied to samples purchased in local supermarkets. The nine DTCs were ziram, ferbam, thiram, maneb, zineb, nabam, metiram, mancozeb and propineb. Validation parameters of mean recovery for two matrices at two concentration levels, relative repeatability (RSDr), relative within-laboratory reproducibility (RSDR) and LOD were obtained for the nine DTCs. The results from the analysis of fruits and vegetables served as the basis for an exposure assessment within the given commodities and a risk assessment comparing the calculated exposure to the acceptable daily intake and acute reference dose for various exposure groups. The analysis indicated positive findings of DTCs in apples, pears, plums, table grapes, papaya and broccoli at concentrations ranging from 0.03 mg/kg to 2.69 mg/kg, expressed as the equivalent amount of CS2. None of the values exceeded the maximum residue levels (MRLs) set by the European Union; moreover, it was not possible to state whether illegal use had taken place, because the LC-MS/MS analysis lacked a clear differentiation between the various DTCs. The exposure and risk assessment showed that the acute reference dose was exceeded only for maneb: in apples for infants in the United Kingdom and in apple juice for children in Germany. PMID:23799268
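The acute exposure assessment above is, at its core, simple arithmetic: residue level times portion size divided by body weight, compared against a toxicological reference value. A sketch in which only the 2.69 mg/kg residue comes from the abstract; the portion size, body weight and reference dose are hypothetical placeholders:

```python
def acute_exposure_mg_per_kg_bw(residue_mg_per_kg, portion_g, body_weight_kg):
    """Single-sitting exposure in mg per kg body weight."""
    return residue_mg_per_kg * (portion_g / 1000.0) / body_weight_kg

residue = 2.69     # mg/kg as CS2 equivalents (highest value reported above)
portion = 200.0    # g of apple in one sitting (assumed)
bw = 15.0          # kg, small child (assumed)
arfd = 0.02        # mg/kg bw, illustrative ARfD-style threshold (assumed)

exposure = acute_exposure_mg_per_kg_bw(residue, portion, bw)
exceeds = exposure > arfd
```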

  11. Advanced validation of CFD-FDTD combined method using highly applicable solver for reentry blackout prediction

    NASA Astrophysics Data System (ADS)

    Takahashi, Yusuke

    2016-01-01

An analysis model of the plasma flow and electromagnetic waves around a reentry vehicle, for predicting radio-frequency blackout during aerodynamic heating, was developed in this study. The model was validated against experimental results from the radio attenuation measurement program. The plasma flow properties, such as electron number density, in the shock layer and wake region were obtained using a newly developed unstructured-grid solver that incorporates real-gas effect models and can treat thermochemically non-equilibrium flow. To predict the propagation of electromagnetic waves in the plasma, a frequency-dependent finite-difference time-domain method was used. The complicated behaviour of electromagnetic waves in the plasma layer during atmospheric reentry was clarified at several altitudes. The prediction performance of the combined model was evaluated against profiles and peak values of the electron number density in the plasma layer. In addition, to validate the models, the signal losses measured during communication with the reentry vehicle were directly compared with the predicted results. The study suggests that the present analysis model accurately predicts radio-frequency blackout and the plasma-induced attenuation of communication signals.
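The physics behind blackout prediction can be illustrated with the electron plasma frequency: a wave is cut off wherever the local plasma frequency exceeds the link frequency. A back-of-envelope sketch using CODATA constants; the 2.3 GHz S-band link frequency and the 10^18 m^-3 shock-layer density are assumed example values, not the paper's data:

```python
import math

E_CHARGE = 1.602176634e-19    # C
E_MASS   = 9.1093837015e-31   # kg
EPS0     = 8.8541878128e-12   # F/m

def plasma_frequency_hz(n_e_per_m3):
    """Electron plasma frequency for number density n_e (m^-3)."""
    return math.sqrt(n_e_per_m3 * E_CHARGE**2 / (EPS0 * E_MASS)) / (2 * math.pi)

def critical_density_per_m3(f_hz):
    """Electron density above which a wave of frequency f is cut off."""
    return EPS0 * E_MASS * (2 * math.pi * f_hz)**2 / E_CHARGE**2

f_link = 2.3e9                                 # S-band telemetry (assumed)
n_crit = critical_density_per_m3(f_link)       # ~6.6e16 m^-3
blackout = plasma_frequency_hz(1e18) > f_link  # dense shock-layer plasma
```

Collisional damping in the real shock layer shifts this sharp cutoff into a gradual attenuation, which is why the full frequency-dependent FDTD treatment above is needed for quantitative signal-loss prediction.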

  12. Validation of the femoral anteversion measurement method used in imageless navigation.

    PubMed

    Turley, Glen A; Ahmed, Shahbaz M Y; Williams, Mark A; Griffin, Damian R

    2012-01-01

    Total hip arthroplasty restores lost mobility to patients suffering from osteoarthritis and acute trauma. In recent years, navigated surgery has been used to control prosthetic component placement. Furthermore, there has been increasing research on what constitutes correct placement. This has resulted in the definition of a safe-zone for acetabular cup orientation. However, there is less definition with regard to femoral anteversion and how it should be measured. This study assesses the validity of the femoral anteversion measurement method used in imageless navigation, with particular attention to how the neutral rotation of the femur is defined. CT and gait analysis methodologies are used to validate the reference which defines this neutral rotation, i.e., the ankle epicondyle piriformis (AEP) plane. The findings of this study indicate that the posterior condylar axis is a reliable reference for defining the neutral rotation of the femur. In imageless navigation, when these landmarks are not accessible, the AEP plane provides a useful surrogate to the condylar axis, providing a reliable baseline for femoral anteversion measurement. PMID:22681336

  13. Validation of RF CCP Discharge Model against Experimental Data using PIC Method

    NASA Astrophysics Data System (ADS)

    Icenhour, Casey; Kummerer, Theresa; Green, David L.; Smithe, David; Shannon, Steven

    2014-10-01

The particle-in-cell (PIC) simulation method is a well-known standard for the simulation of laboratory plasma discharges. Using parallel computation on the Titan supercomputer at Oak Ridge National Laboratory (ORNL), this research is concerned with the validation of a radio-frequency (RF) capacitively coupled plasma (CCP) discharge PIC model against previously obtained experimental data. The plasma sources under simulation are 10-100 mTorr argon plasmas with 13 MHz and 27 MHz sources operating at 50-200 W under both pulsed and constant-power conditions. Plasma parameters of interest in the validation include peak electron density, electron temperature, and RF plasma sheath voltages and thicknesses. The plasma is modeled using the VSim plasma simulation tool, developed by the Tech-X Corporation. The implementation used here is a two-dimensional electromagnetic model with a corresponding external circuit model of the experimental setup. The goal of this study is to develop models for more complex RF plasma systems utilizing highly parallel computing technologies and methodology. This work is carried out with the support of Oak Ridge National Laboratory and the Tech-X Corporation.
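At the core of any PIC code is a time-staggered (leapfrog) particle push. The toy below advances a single particle in a uniform field, for which leapfrog reproduces the constant-acceleration solution exactly; it is a minimal illustration of the integrator only, not the 2D electromagnetic VSim model described above.

```python
# Leapfrog push: velocity is staggered half a step behind position.
q_over_m = -1.0   # normalized charge-to-mass ratio
E = 2.0           # uniform electric field (normalized)
dt = 0.01

x, v = 0.0, 0.0
v += 0.5 * dt * q_over_m * E      # offset velocity by half a step
for _ in range(100):              # advance to t = 1.0
    x += dt * v                   # drift with the half-step velocity
    v += dt * q_over_m * E        # kick to the next half step

# Constant acceleration a = (q/m)*E = -2, so analytically x(1) = a/2 = -1
```

A full PIC cycle adds two more stages around this push: depositing particle charge/current onto the grid, and solving the field equations on that grid to get the force for the next step.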

  14. A new validated analytical method for the quality control of red ginseng products

    PubMed Central

    Kim, Il-Woung; Cha, Kyu-Min; Wee, Jae Joon; Ye, Michael B.; Kim, Si-Kwan

    2013-01-01

The main active components of Panax ginseng are ginsenosides. Ginsenosides Rb1 and Rg1 are accepted worldwide as marker substances for quality control. The analytical methods currently used to detect these two compounds unfairly penalize steamed and dried (red) P. ginseng preparations, because red ginseng has a lower content of these ginsenosides than white ginseng. To manufacture red ginseng products from fresh ginseng, the roots are exposed to high temperatures for many hours. This heating process converts the naturally occurring ginsenosides Rb1 and Rg1 into artifact ginsenosides such as Rg3, Rg5, Rh1 and Rh2, among others. This study highlights the absurdity of the current analytical practice by investigating the time-dependent changes in the contents of crude saponin and the major natural and artifact ginsenosides during simmering. The results lead us to recommend (20S)- and (20R)-ginsenoside Rg3 as new reference materials to complement the current reference materials, ginsenosides Rb1 and Rg1. An attempt has also been made to establish validated qualitative and quantitative analytical procedures for these four compounds that meet International Conference on Harmonisation (ICH) guidelines for specificity, linearity, range, accuracy, precision, detection limit, quantitation limit, robustness and system suitability. Based on these results, we suggest a validated analytical procedure that conforms to ICH guidelines and values equally the ginsenoside contents of white and red ginseng preparations. PMID:24235862

  15. Experimental validation of a method characterizing bow tie filters in CT scanners using a real-time dose probe

    SciTech Connect

    McKenney, Sarah E.; Nosratieh, Anita; Gelskey, Dale; Yang Kai; Huang Shinying; Chen Lin; Boone, John M.

    2011-03-15

Purpose: Beam-shaping or "bow tie" (BT) filters are used to spatially modulate the x-ray beam in a CT scanner, but the conventional method of step-and-shoot measurement to characterize a beam's profile is tedious and time-consuming. The theory behind the characterization of bow tie relative attenuation (COBRA) method, which relies on a real-time dosimeter to address the issues of conventional measurement techniques, was previously demonstrated using computer simulations. In this study, the feasibility of the COBRA theory is further validated experimentally through the employment of a prototype real-time radiation meter and a known BT filter. Methods: The COBRA method consisted of four basic steps: (1) The probe was placed at the edge of a scanner's field of view; (2) a real-time signal train was collected as the scanner's gantry rotated with the x-ray beam on; (3) the signal train, without a BT filter, was modeled using peak values measured in the signal train of step 2; and (4) the relative attenuation of the BT filter was estimated from filtered and unfiltered data sets. The prototype probe was first verified to have an isotropic and linear response to incident x-rays. The COBRA method was then tested on a dedicated breast CT scanner with a custom-designed BT filter and compared to the conventional step-and-shoot characterization of the BT filter. Using basis decomposition of dual energy signal data, the thickness of the filter was estimated and compared to the BT filter's manufacturing specifications. The COBRA method was also demonstrated with a clinical whole body CT scanner using the body BT filter. The relative attenuation was calculated at four discrete x-ray tube potentials and used to estimate the thickness of the BT filter. Results: The prototype probe was found to have a linear and isotropic response to x-rays. The relative attenuation produced from the COBRA method fell within the error of the relative attenuation measured with the step-and-shoot method
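Step (4) above, estimating relative attenuation from the filtered and unfiltered signal trains, is a pointwise ratio; taking the negative logarithm of that ratio gives the attenuation line integral through the filter. A sketch with synthetic signal values, not measured data:

```python
import math

# Modelled (no-filter) and measured (through-filter) signals vs. angle
unfiltered = [100.0, 100.0, 100.0, 100.0, 100.0]
filtered   = [ 95.0,  70.0,  40.0,  70.0,  95.0]

rel_attenuation = [f / u for f, u in zip(filtered, unfiltered)]
mu_t = [-math.log(r) for r in rel_attenuation]   # line integral of mu
```

From `mu_t` and a known attenuation coefficient at the measured tube potential, the filter thickness along each ray can then be estimated, as the abstract describes for the dual-energy case.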

  16. Collaborative validation of the quantification method for suspected allergens and test of an automated data treatment.

    PubMed

    Chaintreau, Alain; Cicchetti, Esmeralda; David, Nathalie; Earls, Andy; Gimeno, Pascal; Grimaud, Béatrice; Joulain, Daniel; Kupfermann, Nikolai; Kuropka, Gryta; Saltron, Frédéric; Schippa, Christine

    2011-10-28

Previous publications investigated different data treatment strategies for quantification of volatile suspected allergens by GC/MS. This publication presents the validation results obtained on "ready to inject" samples under reproducibility conditions following inter-laboratory ring-testing. The approach is based on the monitoring of three selected ions per analyte using two different GC capillary columns. To aid the analysts, a decisional tree is used for guidance during the interpretation of the analytical results. The method is evaluated using a fragrance oil concentrate spiked with all suspected allergens to mimic the difficulty of a real sample extract or perfume oil. At the concentrations of 10 and 100 mg/kg, imposed by Directive 76/768/EEC for labeling of leave-on and rinse-off cosmetics, the mean bias is +14% and -4%, respectively. The method is linear for all analytes, and the prediction intervals for each analyte have been determined. To speed up the analyst's task, an automated data treatment is also proposed. The method mean bias is slightly shifted towards negative values, but the method prediction intervals are close to those resulting from the decisional tree. PMID:21945622
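The mean-bias figures quoted above (+14% and -4%) reduce to averaging the relative deviation of measured from spiked concentrations at each level. A sketch with synthetic values chosen to land on +14%; these are not the ring-test data:

```python
def mean_bias_percent(measured, spiked):
    """Mean relative bias (%) of measured vs. spiked concentrations."""
    return 100.0 * sum((m - s) / s for m, s in zip(measured, spiked)) / len(spiked)

spiked   = [10.0, 10.0, 10.0, 10.0]   # mg/kg spike level (synthetic)
measured = [11.0, 12.0, 11.5, 11.1]   # synthetic lab results

bias = mean_bias_percent(measured, spiked)
```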

  17. Diagnostic Methods of Helicobacter pylori Infection for Epidemiological Studies: Critical Importance of Indirect Test Validation.

    PubMed

    Miftahussurur, Muhammad; Yamaoka, Yoshio

    2016-01-01

Among the methods developed to detect H. pylori infection, the gold standard remains debatable, especially for epidemiological studies. Owing to the decreasing sensitivity of direct diagnostic tests (histopathology and/or immunohistochemistry [IHC], rapid urease test [RUT], and culture), several indirect tests, including antibody-based tests (serology and urine test), the urea breath test (UBT), and the stool antigen test (SAT), have been developed to diagnose H. pylori infection. Among the indirect tests, UBT and SAT are the best methods for determining active infection. While antibody-based tests, especially serology, are widely available and relatively sensitive, their specificity is low. Guidelines indicate that no single test can be considered the gold standard for the diagnosis of H. pylori infection and that each method's advantages and disadvantages should be considered. Based on four epidemiological studies, culture and RUT present sensitivities of 74.2-90.8% and 83.3-86.9% and specificities of 97.7-98.8% and 95.1-97.2%, respectively, when IHC is used as the gold standard. The sensitivity of serology is quite high, but that of the urine test was lower than that of the other methods. Thus, validation of indirect tests is important, although some commercial kits propose universal cut-off values. PMID:26904678
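The sensitivity and specificity figures quoted above reduce to confusion-matrix arithmetic against the chosen gold standard (IHC here). A minimal sketch with hypothetical counts, not the studies' data:

```python
def sensitivity(tp, fn):
    """True-positive rate: diseased subjects correctly detected."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """True-negative rate: healthy subjects correctly cleared."""
    return tn / (tn + fp)

# Hypothetical 2x2 table: index test vs. IHC gold standard
tp, fn, fp, tn = 83, 17, 3, 97

sens = sensitivity(tp, fn)   # 0.83
spec = specificity(tn, fp)   # 0.97
```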

  18. Validation of MIMGO: a method to identify differentially expressed GO terms in a microarray dataset

    PubMed Central

    2012-01-01

    Background We previously proposed an algorithm for the identification of GO terms that commonly annotate genes whose expression is upregulated or downregulated in some microarray data compared with in other microarray data. We call these “differentially expressed GO terms” and have named the algorithm “matrix-assisted identification method of differentially expressed GO terms” (MIMGO). MIMGO can also identify microarray data in which genes annotated with a differentially expressed GO term are upregulated or downregulated. However, MIMGO has not yet been validated on a real microarray dataset using all available GO terms. Findings We combined Gene Set Enrichment Analysis (GSEA) with MIMGO to identify differentially expressed GO terms in a yeast cell cycle microarray dataset. GSEA followed by MIMGO (GSEA + MIMGO) correctly identified (p < 0.05) microarray data in which genes annotated to differentially expressed GO terms are upregulated. We found that GSEA + MIMGO was slightly less effective than, or comparable to, GSEA (Pearson), a method that uses Pearson’s correlation as a metric, at detecting true differentially expressed GO terms. However, unlike other methods including GSEA (Pearson), GSEA + MIMGO can comprehensively identify the microarray data in which genes annotated with a differentially expressed GO term are upregulated or downregulated. Conclusions MIMGO is a reliable method to identify differentially expressed GO terms comprehensively. PMID:23232071

  19. Development and validation of a new method for measuring friction between skin and nonwoven materials.

    PubMed

    Cottenden, A M; Wong, W K; Cottenden, D J; Farbrot, A

    2008-07-01

    A new method for measuring the coefficient of friction between nonwoven materials and the curved surface of the volar forearm has been developed and validated. The method was used to measure the coefficient of static friction for three different nonwoven materials on the normal (dry) and over-hydrated volar forearms of five female volunteers (ages 18-44). The method proved simple to run and had good repeatability: the coefficient of variation (standard deviation expressed as a percentage of the mean) for triplets of repeat measurements was usually (80 per cent of the time) less than 10 per cent. Measurements involving the geometrically simpler configuration of pulling a weighted fabric sample horizontally across a quasi-planar area of volar forearm skin proved experimentally more difficult and had poorer repeatability. However, correlations between values of coefficient of static friction derived using the two methods were good (R = 0.81 for normal (dry) skin, and 0.91 for over-hydrated skin). Measurements of the coefficient of static friction for the three nonwovens for normal (dry) and for over-hydrated skin varied in the ranges of about 0.3-0.5 and 0.9-1.3, respectively. In agreement with Amontons' law, coefficients of friction were invariant with normal pressure over the entire experimental range (0.1-8.2 kPa). PMID:18756696
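Friction against a curved surface such as the volar forearm is commonly analysed with the capstan (belt-friction) relation, where the tension ratio across a wrap angle θ gives the coefficient of friction as μ = ln(T_high/T_low)/θ. Treating the paper's curved-arm method as capstan-like is an assumption here, and the tensions and wrap angle below are illustrative only:

```python
import math

def capstan_mu(t_high, t_low, wrap_angle_rad):
    """Coefficient of friction from the capstan equation (assumed model)."""
    return math.log(t_high / t_low) / wrap_angle_rad

theta = math.pi / 2               # quarter-turn wrap over the arm (assumed)
mu = capstan_mu(2.0, 1.0, theta)  # 2:1 tension ratio (illustrative)
```

With these placeholder numbers μ ≈ 0.44, which happens to sit inside the 0.3-0.5 range reported above for normal (dry) skin.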

  20. Quantitative analysis of eugenol in clove extract by a validated HPLC method.

    PubMed

    Yun, So-Mi; Lee, Myoung-Heon; Lee, Kwang-Jick; Ku, Hyun-Ok; Son, Seong-Wan; Joo, Yi-Seok

    2010-01-01

    Clove (Eugenia caryophyllata) is a well-known medicinal plant used for diarrhea, digestive disorders, or in antiseptics in Korea. Eugenol is the main active ingredient of clove and has been chosen as a marker compound for the chemical evaluation or QC of clove. This paper reports the development and validation of an HPLC-diode array detection (DAD) method for the determination of eugenol in clove. HPLC separation was accomplished on an XTerra RP18 column (250 x 4.6 mm id, 5 microm) with an isocratic mobile phase of 60% methanol and DAD at 280 nm. Calibration graphs were linear with very good correlation coefficients (r2 > 0.9999) from 12.5 to 1000 ng/mL. The LOD was 0.81 ng/mL and the LOQ 2.47 ng/mL. The method showed good intraday precision (%RSD 0.08-0.27%) and interday precision (%RSD 0.32-1.19%). The method was applied to the analysis of eugenol from clove cultivated in various countries (Indonesia, Singapore, and China). Quantitative analysis of the 15 clove samples showed that the content of eugenol varied significantly, ranging from 163 to 1049 ppb. Based on the results of this study, the HPLC method for the determination of eugenol is sufficiently accurate to evaluate the quality and safety of clove. PMID:21313806
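The abstract reports LOD and LOQ values but not the formulas behind them; their ratio (2.47/0.81 ≈ 3.05 ≈ 10/3.3) is consistent with the common ICH Q2 convention of LOD = 3.3σ/slope and LOQ = 10σ/slope. A minimal sketch of that calculation, assuming the ICH formulas and using synthetic responses rather than the study's data:

```python
# Calibration-curve fit with ICH-style LOD/LOQ estimates (3.3*sigma/slope
# and 10*sigma/slope). The formulas are the common ICH Q2 convention and
# are an assumption here; the abstract reports only the resulting values.
import numpy as np

conc = np.array([12.5, 25.0, 50.0, 100.0, 250.0, 500.0, 1000.0])  # ng/mL
rng = np.random.default_rng(0)
resp = 0.85 * conc + 1.2 + rng.normal(0.0, 0.5, conc.size)  # synthetic signal

slope, intercept = np.polyfit(conc, resp, 1)
sigma = (resp - (slope * conc + intercept)).std(ddof=2)  # residual std dev

lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope
print(round(lod, 3), round(loq, 3))  # the LOQ/LOD ratio is fixed at 10/3.3
```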

  1. Cerebral blood flow determinations using fluorescent microspheres: variations on the sedimentation method validated.

    PubMed

    Powers, K M; Schimmel, C; Glenny, R W; Bernards, C M

    1999-03-01

    We validate a modification of the sedimentation method for measuring fluorescent microspheres (FM) that improves the determination of regional cerebral blood flow (rCBF). Our FM method for rCBF determination is compared to the radioactive microspheres (RM) method for rCBF measurement by simultaneous injection of one radioactive and two fluorescently labeled doses, at two separate time points, into the left ventricle of a pig. The pig was killed, and the brain and spinal cord were removed and divided into 92 pieces averaging 0.83 g. Our modifications to FM analysis by sedimentation include: 2 instead of 1 week of autolysis, pellet washing with 1% Triton X-100 instead of 0.25% Tween 80, phosphate buffer addition during rinse, fluorescent dye extraction using 2-ethoxyethylacetate instead of 2-(2-ethoxyethoxy)ethyl acetate, and polypropylene instead of glass tubes. Comparing rCBF using Sc(46) RM to yellow-green and orange FM yielded mean differences of 0.026 and 0.021 ml/min per piece, respectively. Sn(113) RM compared to blue-green and scarlet FM gave mean differences of -0.010 and 0.137 ml/min per piece, respectively. All RM-FM differences, except those for scarlet FM, are within acceptable limits. This assay provides a reliable method for determining rCBF. PMID:11230812

  2. At-sea Validation of a Birefringence Method for Determining PIC Concentrations in Seawater

    NASA Astrophysics Data System (ADS)

    Guay, C. K.; Bishop, J. K.

    2001-12-01

    We have previously described a spectrophotometer-based method for making optical measurements of particulate inorganic carbon (PIC) in seawater. This method, based on the extreme birefringence of calcium carbonate (CaCO3) relative to other major components of marine particulate matter, was developed in the laboratory using sample suspensions prepared from calcareous marine sediment material and varying amounts of non-birefringent diatomaceous earth. Here we report the first successful measurements of birefringence signals in natural seawater samples, which were obtained during a recent cruise to the North Pacific off the California coast. The spectrophotometer-based method was used onboard to measure PIC in samples collected from Niskin bottle casts in a variety of environments (nearshore to open ocean, eutrophic to oligotrophic). These samples contained a diverse mixture of particles, including calcareous, siliceous and organic material. Birefringence signals clearly above the detection level were observed in several samples, with the strongest signals occurring in productive surface waters off Point Concepcion. The spectrophotometer-based method was validated against PIC concentrations determined by chemical analysis of particulate matter collected by filtration of the Niskin bottle samples and from large-volume (1000's of L) in situ filtration performed immediately after the Niskin casts. In addition, these data were compared with in situ birefringence measurements made using a prototype profiling PIC sensor deployed on the rosette during the Niskin casts.

  3. Columbia River Stock Identification Study; Validation of Genetic Method, 1980-1981 Final Report.

    SciTech Connect

    Milner, George B.; Teel, David J.; Utter, Fred M.

    1981-06-01

    The reliability of a method for obtaining maximum likelihood estimates of component stocks in mixed populations of salmonids through the frequency of genetic variants in a mixed population and in potentially contributing stocks was tested in 1980. A data base of 10 polymorphic loci from 14 hatchery stocks of spring chinook salmon of the Columbia River was used to estimate proportions of these stocks in four different "blind" mixtures whose true composition was only revealed subsequent to obtaining estimates. The accuracy and precision of these blind tests have validated the genetic method as a valuable means for identifying components of stock mixtures. Properties of the genetic method were further examined by simulation studies using the pooled data of the four blind tests as a mixed fishery. Replicated tests with sample sizes between 100 and 1,000 indicated that actual standard deviations on estimated contributions were consistently lower than calculated standard deviations; this difference diminished as sample size increased. It is recommended that future applications of the method be preceded by simulation studies that will identify appropriate levels of sampling required for acceptable levels of accuracy and precision. Variables in such studies include the stocks involved, the loci used, and the genetic differentiation among stocks. 8 refs., 6 figs., 4 tabs.
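Maximum-likelihood estimation of stock proportions from genotype frequencies is typically done with an EM-style iteration: compute each fish's posterior stock membership from the current mixture proportions, then update the proportions from the posteriors. The toy sketch below assumes precomputed per-stock genotype likelihoods and synthetic numbers; it is an illustration of the general technique, not the study's actual software.

```python
# Toy EM sketch of maximum-likelihood stock-composition estimation:
# given each sampled fish's genotype likelihood under each candidate stock,
# iterate E and M steps until the mixture proportions converge.
# All numbers are synthetic, not data from the study.
import numpy as np

rng = np.random.default_rng(42)
true_props = np.array([0.6, 0.3, 0.1])          # hidden stock composition
n_fish = 500
stock_of_fish = rng.choice(3, size=n_fish, p=true_props)

# Crude genotype likelihoods: high under the true stock, low elsewhere.
lik = rng.uniform(0.01, 0.05, size=(n_fish, 3))
lik[np.arange(n_fish), stock_of_fish] = rng.uniform(0.85, 0.95, size=n_fish)

props = np.full(3, 1 / 3)                        # start from equal proportions
for _ in range(200):
    post = props * lik                           # E-step: membership weights
    post /= post.sum(axis=1, keepdims=True)
    props = post.mean(axis=0)                    # M-step: update proportions
print(np.round(props, 2))                        # close to the true 0.6/0.3/0.1
```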

  4. Diagnostic Methods of Helicobacter pylori Infection for Epidemiological Studies: Critical Importance of Indirect Test Validation

    PubMed Central

    Miftahussurur, Muhammad; Yamaoka, Yoshio

    2016-01-01

    Among the methods developed to detect H. pylori infection, determining the gold standard remains debatable, especially for epidemiological studies. Due to the decreasing sensitivity of direct diagnostic tests (histopathology and/or immunohistochemistry [IHC], rapid urease test [RUT], and culture), several indirect tests, including antibody-based tests (serology and urine test), urea breath test (UBT), and stool antigen test (SAT) have been developed to diagnose H. pylori infection. Among the indirect tests, UBT and SAT became the best methods to determine active infection. While antibody-based tests, especially serology, are widely available and relatively sensitive, their specificity is low. Guidelines indicated that no single test can be considered as the gold standard for the diagnosis of H. pylori infection and that one should consider the method's advantages and disadvantages. Based on four epidemiological studies, culture and RUT present a sensitivity of 74.2–90.8% and 83.3–86.9% and a specificity of 97.7–98.8% and 95.1–97.2%, respectively, when using IHC as a gold standard. The sensitivity of serology is quite high, but that of the urine test was lower compared with that of the other methods. Thus, indirect test validation is important although some commercial kits propose universal cut-off values. PMID:26904678
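The sensitivity and specificity figures quoted above come from the standard 2x2 comparison of each indirect test against a gold standard (here, IHC). A minimal sketch of that calculation, with illustrative counts rather than the studies' data:

```python
# Sensitivity and specificity of an indirect test against a gold standard
# (e.g., IHC), computed from a 2x2 confusion table. Counts are illustrative.
def sens_spec(tp: int, fn: int, tn: int, fp: int) -> tuple[float, float]:
    sensitivity = tp / (tp + fn)   # fraction of infected correctly detected
    specificity = tn / (tn + fp)   # fraction of uninfected correctly cleared
    return sensitivity, specificity

sens, spec = sens_spec(tp=83, fn=17, tn=95, fp=5)
print(f"sensitivity={sens:.1%} specificity={spec:.1%}")  # 83.0% / 95.0%
```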

  5. Method validation of a survey of thevetia cardiac glycosides in serum samples.

    PubMed

    Kohls, Sarah; Scholz-Böttcher, Barbara; Rullkötter, Jürgen; Teske, Jörg

    2012-02-10

    A sensitive and specific liquid chromatography tandem mass spectrometry (HPLC-ESI(+)-MS/MS) procedure was developed and validated for the identification and quantification of thevetin B and further cardiac glycosides in human serum. The seeds of Yellow Oleander (Thevetia peruviana) contain cardiac glycosides that can cause serious intoxication. A mixture of six thevetia glycosides was extracted from these seeds and characterized. Thevetin B, isolated and efficiently purified from that mixture, is the main component and can be used as evidence. Solid phase extraction (SPE) proved to be an effective sample preparation method. Digoxin-d3 was used as the internal standard. Although ion suppression occurs, the limit of detection (LOD) is 0.27 ng/ml serum for thevetin B. Recovery is higher than 94%, and accuracy and precision were proficient. Method refinement was carried out with regard to developing a general screening method for cardiac glycosides. The assay is linear over the range of 0.5-8 ng/ml serum. Finally, the method was applied to a case of thevetia seed ingestion. PMID:21376490

  6. Validation of a GC-MS screening method for anabolizing agents in aqueous nutritional supplements.

    PubMed

    Thuyne, W Van; Delbeke, F T

    2005-01-01

    A sensitive and selective method for the screening of anabolizing agents in aqueous nutritional supplements is described and validated. A total of 28 different anabolizing agents are screened for, including testosterone and prohormones, nandrolone and prohormones, stanozolol, and metandienone. The different analytes are extracted from the aqueous nutritional supplements by liquid-liquid extraction with a mixture of pentane and freshly distilled diethylether (1:1) after the supplements have been made alkaline with a NaHCO3-K2CO3 (2:1) buffer. The anabolizing agents are derivatized with a mixture of MSTFA-NH4I-ethanethiol (320:1:2) as routinely used for the screening of anabolic steroids extracted from urine. The derivatives are analyzed by gas chromatography (GC)-mass spectrometry (MS) in the selective ion monitoring mode. The limits of detection range from 1 to 10 ng/mL. One aqueous nutritional supplement (creatine serum) was analyzed with this screening method and was found to contain dehydroepiandrosterone (DHEA) at very low concentrations. The presence of DHEA could be confirmed with GC-MS-MS. Results of the application of this method and a similar method for solid nutritional supplements previously described are given. PMID:15808000

  7. Validation of a UV spectrophotometric method for the determination of melatonine in solid dosage forms.

    PubMed

    Pérez, R F; Lemus, I G; Bocic, R V; Pérez, M V; García-Madrid, R

    2001-01-01

    The aim of the work described in this paper was to provide a fast, easy, inexpensive, precise, and accurate method for the determination of melatonine in solid pharmaceutical dosage forms. The developed method is based on a UV first-derivative spectrophotometric determination, which exhibits excellent linearity in aqueous solutions (r2 = 0.996) for analyte concentrations of 1.5-4.5 mg/dL within a pH range of 5-9. Neither excipients present in the formulation nor indole adulterants, such as tryptophan (up to 5%), interfere with the assay. A study of variation parameters showed that sonication temperature was the main factor for successful determination. At temperatures of <45 degrees C, the sample dissolved completely, and accurate spectrophotometric measurements were obtained. A study was conducted of all the parameters established by the United States Pharmacopeia, 23rd Rev., to validate an analytical method for a solid pharmaceutical form, i.e., linearity, range, accuracy, precision, and specificity. All the parameters were in accordance with the acceptance criteria of the Comité de Guías Oficiales de Validación de la Dirección General de Control de Insumos para la Salud de Méjico. In addition, robustness and content uniformity tests were performed to substantiate the usefulness of the method. PMID:11601453

  8. Study on correlation methods for damage detection: Simulation and experimental validation

    NASA Astrophysics Data System (ADS)

    Dall'Acqua, D.; Di Maio, D.

    2014-05-01

    Interest in damage detection methods has been growing in recent years, whether to verify the operating condition of existing structures or to strengthen quality control on production lines; these are only some of the applications for which damage detection methods have been developed. Much past research has addressed damage detection using vibration analysis, especially through changes in mode shapes and natural frequencies. In the present study, correlation methods based on ODSs have been developed. The structure under consideration is a steel plate. The correlation methods presented are based on the comparison of the ODSs generated by two FEM models of the plate, one defined as pristine and the other as damaged. The latter was modelled by adding a single-node mass element to the model surface; this mass element was chosen to simulate a magnet attached to the plate surface in the experimental case. Several simulations were performed using combinations of mass and position, for a total of 16 cases. By studying the correlation between an ODS pair obtained at the same excitation frequency and position, it is possible to identify the presence of damage in the structure. The experimental model validation was performed using the best excitation condition obtained by simulation, which can point out large differences between the damaged and undamaged ODSs.
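The abstract does not name the specific correlation metric; a common choice for comparing two deflection-shape vectors is the modal assurance criterion (MAC), which equals 1 for identical shapes and drops below 1 when a local change (such as an added mass) perturbs the shape. A sketch under that assumption, with a simplified one-dimensional ODS standing in for the plate:

```python
# MAC-style correlation between a pristine and a "damaged" operating
# deflection shape (ODS) at one excitation frequency. The study's exact
# metric is not stated; the modal assurance criterion is assumed here.
import numpy as np

def mac(phi_a: np.ndarray, phi_b: np.ndarray) -> float:
    """MAC = |a.b|^2 / ((a.a)(b.b)); equals 1 for identical shapes."""
    num = abs(np.vdot(phi_a, phi_b)) ** 2
    return num / (np.vdot(phi_a, phi_a).real * np.vdot(phi_b, phi_b).real)

x = np.linspace(0.0, 1.0, 50)
ods_pristine = np.sin(2 * np.pi * x)             # simplified 1-D ODS
ods_damaged = ods_pristine.copy()
ods_damaged[20:25] *= 0.7                        # local change from added mass
print(round(mac(ods_pristine, ods_damaged), 4))  # below 1 flags a local change
```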

  9. Development and validation of personal monitoring methods for low levels of acrylonitrile in workplace atmosphere. I. Test atmosphere generation and solvent desorption methods

    SciTech Connect

    Melcher, R.G.; Borders, R.A.; Coyne, L.B.

    1986-03-01

    The purpose of this study was to optimize monitoring methods and to investigate new technology for the determination of low levels of acrylonitrile (0.05 to 5 ppm) in workplace atmospheres. In the first phase of the study, a dynamic atmosphere generation system was developed to produce low levels of acrylonitrile in simulated workplace atmospheres. Various potential sorbents were investigated in the second phase, and the candidate methods were compared in a laboratory validation study over a concentration range from 0.05 to 5 ppm acrylonitrile in the presence of potential interferences and under relative humidity conditions from 30% to 95% RH. A collection tube containing 600 mg Pittsburgh coconut base charcoal was found to be the optimum tube for sampling for a full 8-hr shift. No breakthrough was observed over the concentrations and humidities tested. The recovery was 91.3% with a total relative precision of +/-21% over the test range, and the recovery was not affected by storage for up to five weeks.

  10. Development and validation spectroscopic methods for the determination of lomefloxacin in bulk and pharmaceutical formulations

    NASA Astrophysics Data System (ADS)

    El-Didamony, A. M.; Hafeez, S. M.

    2016-01-01

    Four simple, sensitive spectrophotometric and spectrofluorimetric methods (A-D) for the determination of antibacterial drug lomefloxacin (LMFX) in pharmaceutical formulations have been developed. Method A is based on formation of a ternary complex between Pd(II), eosin and LMFX in the presence of methyl cellulose as surfactant and acetate-HCl buffer pH 4.0. Spectrophotometrically, under the optimum conditions, the ternary complex showed an absorption maximum at 530 nm. Methods B and C are based on redox reactions between LMFX and KMnO4 in acid and alkaline media. In the indirect spectrophotometric method B, the drug solution is treated with a known excess of KMnO4 in H2SO4 medium, followed by determination of the unreacted oxidant by reacting it with safranin O in the same medium at λmax = 520 nm. The direct spectrophotometric method C involves treating the alkaline solution of LMFX with KMnO4 and measuring the bluish green product at 604 nm. Method D is based on the chelation of LMFX with Zr(IV) to produce a fluorescent chelate. At the optimum reaction conditions, the drug-metal chelate showed an excitation maximum at 280 nm and an emission maximum at 443 nm. The optimum experimental parameters for the reactions have been studied. The validity of the described procedures was assessed. Statistical analysis of the results has been carried out, revealing high accuracy and good precision. The proposed methods were successfully applied for the determination of the selected drug in pharmaceutical preparations with good recoveries.

  11. Development and Validation of HPLC and HPTLC Methods for Determination of Cefoperazone and Its Related Impurities.

    PubMed

    Abdelaleem, Eglal A; Naguib, Ibrahim A; Zaazaa, Hala E; Hussein, Essraa A

    2016-02-01

    Validated sensitive and highly selective methods were developed for the quantitative determination of cefoperazone sodium (CEF) in the presence of its reported impurities: 7-aminocephalosporanic acid (7-ACA) and 5-mercapto-1-methyl-tetrazole (5-MER). Method A is high-performance liquid chromatography (HPLC), where the mixture of CEF and the reported impurities (7-ACA and 5-MER) was separated on a C8 column (5 µm particle size, 250 × 4.6 mm i.d.) using methanol:0.05 M KH2PO4 buffer (22.5:77.5 v/v, pH 7.5) as a mobile phase. The three components were detected at 254 nm over a concentration range of 10-90 µg mL(-1) with a mean percentage recovery of 99.67% (SD 1.465). Method B is high-performance thin layer chromatography (HPTLC), where the mixture of CEF and the reported impurities was separated on silica gel HPTLC F254 plates using (acetone:methanol:ethyl acetate:2% sodium lauryl sulfate:glacial acetic acid) (3:2:3:0.8:0.2, by volume) as a developing system and scanning at 254 nm over a concentration range of 1-10 µg per band with a mean percentage recovery of 99.95% (SD 1.335). The proposed methods were statistically compared with a reported HPLC method with no significant difference regarding accuracy and precision, indicating the ability of the proposed methods to be reliable and suitable for routine analysis of the drug product. The proposed HPTLC method proved to be more sensitive, while the HPLC gave more reproducible results besides saving time. PMID:26306573

  12. Validation of PCR methods for quantitation of genetically modified plants in food.

    PubMed

    Hübner, P; Waiblinger, H U; Pietsch, K; Brodmann, P

    2001-01-01

    For enforcement of the recently introduced labeling threshold for genetically modified organisms (GMOs) in food ingredients, quantitative detection methods such as quantitative competitive (QC-PCR) and real-time PCR are applied by official food control laboratories. The experiences of 3 European food control laboratories in validating such methods were compared to describe realistic performance characteristics of quantitative PCR detection methods. The limit of quantitation (LOQ) of GMO-specific, real-time PCR was experimentally determined to reach 30-50 target molecules, which is close to theoretical prediction. Starting PCR with 200 ng genomic plant DNA, the LOQ depends primarily on the genome size of the target plant and ranges from 0.02% for rice to 0.7% for wheat. The precision of quantitative PCR detection methods, expressed as relative standard deviation (RSD), varied from 10 to 30%. Using Bt176 corn containing test samples and applying Bt176 specific QC-PCR, mean values deviated from true values by -7 to 18%, with an average of 2+/-10%. Ruggedness of real-time PCR detection methods was assessed in an interlaboratory study analyzing commercial, homogeneous food samples. Roundup Ready soybean DNA contents were determined in the range of 0.3 to 36%, relative to soybean DNA, with RSDs of about 25%. Taking the precision of quantitative PCR detection methods into account, suitable sample plans and sample sizes for GMO analysis are suggested. Because quantitative GMO detection methods measure GMO contents of samples in relation to reference material (calibrants), high priority must be given to international agreements and standardization on certified reference materials. PMID:11767156
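The genome-size dependence described above is simple arithmetic: with a fixed 200 ng DNA input, the smallest quantifiable GMO fraction is the mass of the ~30-50 quantifiable target copies divided by the input mass. A sketch of that back-of-the-envelope calculation, using rough diploid (2C) genome masses from the literature as assumptions (they are not given in the abstract):

```python
# Back-of-the-envelope LOQ percentage: mass of the minimum quantifiable
# copy number relative to a 200 ng DNA input. Genome masses are rough
# 2C (diploid) literature values, assumed here for illustration.
def loq_percent(quantifiable_copies: int, genome_mass_pg: float,
                input_ng: float = 200.0) -> float:
    """Percentage of the input DNA represented by the minimum copy number."""
    return 100.0 * quantifiable_copies * genome_mass_pg / (input_ng * 1000.0)

for species, genome_2c_pg in [("rice", 1.0), ("wheat", 34.6)]:
    print(species, round(loq_percent(40, genome_2c_pg), 3))
# rice comes out near 0.02% and wheat near 0.7%, in line with the abstract
```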

  13. Validated kinetic spectrophotometric method for the determination of metoprolol tartrate in pharmaceutical formulations.

    PubMed

    Rahman, Nafisur; Rahman, Habibur; Azmi, Syed Najmul Hejaz

    2005-08-01

    A kinetic spectrophotometric method has been described for the determination of metoprolol tartrate in pharmaceutical formulations. The method is based on reaction of the drug with alkaline potassium permanganate at 25+/-1 degrees C. The reaction is followed spectrophotometrically by measuring the change in absorbance at 610 nm as a function of time. The initial rate and fixed time (at 15.0 min) methods are utilized for constructing the calibration graphs to determine the concentration of the drug. Both the calibration graphs are linear in the concentration range of 1.46 x 10(-6)-8.76 x 10(-6) M (10.0-60.0 microg per 10 ml). The calibration data resulted in the linear regression equations of log (rate)=3.634+0.999 log C and A=6.300 x 10(-4)+6.491 x 10(-2) C for initial-rate and fixed time methods, respectively. The limits of quantitation for initial rate and fixed time methods are 0.04 and 0.10 microg ml(-1), respectively. The activation parameters such as E(a), DeltaH(double dagger), DeltaS(double dagger) and DeltaG(double dagger) are also evaluated for the reaction and found to be 90.73 kJ mol(-1), 88.20 kJ mol(-1), 84.54 J K(-1) mol(-1) and 63.01 kJ mol(-1), respectively. The results are validated statistically and through recovery studies. The method has been successfully applied to the determination of metoprolol tartrate in pharmaceutical formulations. Statistical comparison of the results with the reference method shows excellent agreement and indicates no significant difference in accuracy and precision. PMID:16079525
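Recovering an unknown concentration from either calibration method is straightforward algebra on the two regression equations quoted in the abstract; the sketch below simply inverts them (C in mol/L for the initial-rate fit, and the fixed-time fit in the concentration units of its quoted calibration range).

```python
# Inverting the two calibration equations quoted above to recover the
# analyte concentration from a measured signal.
import math

def conc_from_initial_rate(rate: float) -> float:
    """Invert log(rate) = 3.634 + 0.999 log C (C in mol/L)."""
    return 10 ** ((math.log10(rate) - 3.634) / 0.999)

def conc_from_fixed_time(absorbance: float) -> float:
    """Invert A = 6.300e-4 + 6.491e-2 * C (fixed-time method, 15 min)."""
    return (absorbance - 6.300e-4) / 6.491e-2

# Round trip at an in-range concentration:
c_true = 5.0e-6
rate = 10 ** (3.634 + 0.999 * math.log10(c_true))
print(conc_from_initial_rate(rate))  # recovers ~5e-6 M
```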

  14. Development and validation of an UPLC-MS/MS method for the determination of ionophoric and synthetic coccidiostats in vegetables.

    PubMed

    Broekaert, N; Van Peteghem, C; Daeseleire, E; Sticker, D; Van Poucke, C

    2011-12-01

    In poultry farming, anticoccidial drugs are widely used as feed additives for the prevention and treatment of coccidiosis. Because coccidiostats and veterinary medicines, in general, are often poorly absorbed, manure from treated animals may contain high concentrations of these compounds. Experimental studies have shown that the uptake of veterinary medicines by plants from soil containing contaminated manure may occur. This leads to several questions regarding the impact on the environment, resistance problems, and public health and allergy issues. This work describes the development of a quantification method for coccidiostats in vegetables. Vegetables were spiked at 100 μg kg(-1) (dry weight) with coccidiostats (monensin, narasin, lasalocid A, salinomycin, diclazuril, and nicarbazin) in order to optimize the extraction and clean-up. Possible critical factors (e.g., extraction solvent) were statistically examined by linear regression with the use of Plackett-Burman and full factorial designs. Final extracts were analyzed with ultra-performance liquid chromatography tandem mass spectrometry operating in multiple-reaction monitoring mode. Both the synthetic and ionophoric coccidiostats could be determined in a single run with an analysis time of 5 min. The developed method was validated taking into account the requirements of the Commission Decision 2002/657/EC as a guideline. The method is regarded as applicable for its intended purposes with quantification limits between 0.30 and 2.98 μg kg(-1). This method could be used to establish possible maximum residue limits for coccidiostats in vegetables, as already exist for eggs, meat, and milk. PMID:21984012

  15. AOAC Official Method℠ Matrix Extension Validation Study of Assurance GDS™ for the Detection of Salmonella in Selected Spices.

    PubMed

    Feldsine, Philip; Kaur, Mandeep; Shah, Khyati; Immerman, Amy; Jucker, Markus; Lienau, Andrew

    2015-01-01

    Assurance GDS™ for Salmonella Tq has been validated according to the AOAC INTERNATIONAL Methods Committee Guidelines for Validation of Microbiological Methods for Food and Environmental Surfaces for the detection of selected foods and environmental surfaces (Official Method of Analysis℠ 2009.03, Performance Tested Method℠ No. 050602). The method also completed AFNOR validation (following the ISO 16140 standard) compared to the reference method EN ISO 6579. For AFNOR, GDS was given a scope covering all human food, animal feed stuff, and environmental surfaces (Certificate No. TRA02/12-01/09). Results showed that Assurance GDS for Salmonella (GDS) has high sensitivity and is equivalent to the reference culture methods for the detection of motile and non-motile Salmonella. As part of the aforementioned validations, inclusivity and exclusivity studies, stability, and ruggedness studies were also conducted. Assurance GDS has 100% inclusivity and exclusivity among the 100 Salmonella serovars and 35 non-Salmonella organisms analyzed. To add to the scope of the Assurance GDS for Salmonella method, a matrix extension study was conducted, following the AOAC guidelines, to validate the application of the method for selected spices, specifically curry powder, cumin powder, and chili powder, for the detection of Salmonella. PMID:26268975

  16. Advances in the Development and Validation of Test Methods in the United States

    PubMed Central

    Casey, Warren M.

    2016-01-01

    The National Toxicology Program Interagency Center for the Evaluation of Alternative Toxicological Methods (NICEATM) provides validation support for US Federal agencies and the US Tox21 interagency consortium, an interagency collaboration that is using high throughput screening (HTS) and other advanced approaches to better understand and predict chemical hazards to humans and the environment. HTS data from assays relevant to the estrogen receptor signaling pathway are used as an example of how HTS data can be combined with computational modeling to meet the needs of US agencies. A brief summary of US efforts in the areas of biologics testing, acute toxicity, and skin sensitization is also provided. PMID:26977254

  17. Flight test validation of a frequency-based system identification method on an F-15 aircraft

    NASA Technical Reports Server (NTRS)

    Schkolnik, Gerard S.; Orme, John S.; Hreha, Mark A.

    1995-01-01

    A frequency-based performance identification approach was evaluated using flight data from the NASA F-15 Highly Integrated Digital Electronic Control aircraft. The approach used frequency separation to identify the effectiveness of multiple controls simultaneously as an alternative to independent control identification methods. Fourier transformations converted measured control and response data into frequency domain representations. Performance gradients were formed using multiterm frequency matching of control and response frequency domain models. An objective function was generated using these performance gradients. This function was formally optimized to produce a coordinated control trim set. This algorithm was applied to longitudinal acceleration and evaluated using two control effectors: nozzle throat area and inlet first ramp. Three criteria were investigated to validate the approach: simultaneous gradient identification, gradient frequency dependency, and repeatability. This report describes the flight test results. These data demonstrate that the approach can accurately identify performance gradients during simultaneous control excitation independent of excitation frequency.
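The core idea of the approach, frequency separation, is that when each control is excited at its own distinct frequency, the effectiveness of every control can be read off simultaneously from the ratio of the response spectrum to that control's spectrum at its excitation frequency. A minimal synthetic sketch of this (the signals and gradients below are made up, not F-15 flight data):

```python
# Sketch of frequency-separation identification: excite two controls at
# distinct frequencies, then recover each control's effectiveness from the
# ratio of response to control spectra at its own excitation frequency.
# All signals and gradient values are synthetic, not flight data.
import numpy as np

fs, n = 100.0, 1000                        # sample rate (Hz), sample count
t = np.arange(n) / fs
u1 = np.sin(2 * np.pi * 1.0 * t)           # control 1 excited at 1 Hz
u2 = np.sin(2 * np.pi * 3.0 * t)           # control 2 excited at 3 Hz
g1, g2 = 0.8, -0.3                         # true performance gradients
response = g1 * u1 + g2 * u2               # linearized response signal

U1, U2, Y = np.fft.rfft(u1), np.fft.rfft(u2), np.fft.rfft(response)
k1 = int(1.0 * n / fs)                     # FFT bin for 1 Hz
k2 = int(3.0 * n / fs)                     # FFT bin for 3 Hz
print((Y[k1] / U1[k1]).real, (Y[k2] / U2[k2]).real)  # recovers g1 and g2
```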

  18. Validation of qPCR Methods for the Detection of Mycobacterium in New World Animal Reservoirs

    PubMed Central

    Housman, Genevieve; Malukiewicz, Joanna; Boere, Vanner; Grativol, Adriana D.; Pereira, Luiz Cezar M.; Silva, Ita de Oliveira e; Ruiz-Miranda, Carlos R.; Truman, Richard; Stone, Anne C.

    2015-01-01

    Zoonotic pathogens that cause leprosy (Mycobacterium leprae) and tuberculosis (Mycobacterium tuberculosis complex, MTBC) continue to impact modern human populations. Therefore, methods able to survey mycobacterial infection in potential animal hosts are necessary for proper evaluation of human exposure threats. Here we tested for mycobacterial-specific single- and multi-copy loci using qPCR. In a trial study in which armadillos were artificially infected with M. leprae, these techniques were specific and sensitive to pathogen detection, while more traditional ELISAs were only specific. These assays were then employed in a case study to detect M. leprae as well as MTBC in wild marmosets. All marmosets were negative for M. leprae DNA, but 14 were positive for the mycobacterial rpoB gene assay. Targeted capture and sequencing of rpoB and other MTBC genes validated the presence of mycobacterial DNA in these samples and revealed that qPCR is useful for identifying mycobacterial-infected animal hosts. PMID:26571269

  19. A Method of Retrospective Computerized System Validation for Drug Manufacturing Software Considering Modifications

    NASA Astrophysics Data System (ADS)

    Takahashi, Masakazu; Fukue, Yoshinori

    This paper proposes a Retrospective Computerized System Validation (RCSV) method for Drug Manufacturing Software (DMSW) that takes software modification into account. Because DMSW used for quality management and facility control has a major impact on drug quality, regulatory agencies require proof of the adequacy of DMSW functions and performance, based on development documents and test results. The work of demonstrating this adequacy for previously developed DMSW from existing documents and operational records is called RCSV. When DMSW that has already undergone RCSV is modified, it is difficult to secure consistency between the documents and test results for the modified parts and the existing documents and operational records for the unmodified parts, which makes conducting RCSV difficult. In this paper, we propose (a) a definition of the document architecture, (b) a definition of the descriptive items and levels in the documents, (c) management of design information using a database, (d) exhaustive testing, and (e) an integrated RCSV procedure. As a result, adequate RCSV could be conducted while securing consistency.

  20. Angle-of-attack validation of a new zonal CFD method for airfoil simulations

    NASA Technical Reports Server (NTRS)

    Yoo, Sungyul; Summa, J. Michael; Strash, Daniel J.

    1990-01-01

    The angle-of-attack validation of a new concept suggested by Summa (1990) for coupling potential and viscous flow methods has been investigated for two-dimensional airfoil simulations. The fully coupled potential/Navier-Stokes code, ZAP2D (Zonal Aerodynamics Program 2D), has been used to compute the flow field around an NACA 0012 airfoil for a range of angles of attack up to stall at a Mach number of 0.3 and a Reynolds number of 3 million. ZAP2D calculations for domain sizes ranging from 25 down to 0.12 chord lengths are compared with the ARC2D large-domain solution as well as with experimental data.

  2. Validation of selected analytical methods using accuracy profiles to assess the impact of a Tobacco Heating System on indoor air quality.

    PubMed

    Mottier, Nicolas; Tharin, Manuel; Cluse, Camille; Crudo, Jean-René; Lueso, María Gómez; Goujon-Ginglinger, Catherine G; Jaquier, Anne; Mitova, Maya I; Rouget, Emmanuel G R; Schaller, Mathieu; Solioz, Jennifer

    2016-09-01

Studies in environmentally controlled rooms have been used over the years to assess the impact of environmental tobacco smoke on indoor air quality. As new tobacco products are developed, it is important to determine their impact on air quality when used indoors. Before such an assessment can take place, it is essential that the analytical methods used to assess indoor air quality are validated and shown to be fit for their intended purpose. Consequently, for this assessment, an environmentally controlled room was built and seven analytical methods, representing eighteen analytes, were validated. The validations were carried out with smoking machines using a matrix-based approach applying the accuracy profile procedure. The performances of the methods were compared for all three matrices under investigation: background air samples, the environmental aerosol of the Tobacco Heating System THS 2.2 (a heat-not-burn tobacco product developed by Philip Morris International), and the environmental tobacco smoke of a cigarette. The environmental aerosol generated by the THS 2.2 device did not have any appreciable impact on the performances of the methods. The comparison between the background and THS 2.2 environmental aerosol samples generated by smoking machines showed that only five compounds were present at higher levels when THS 2.2 was used in the environmentally controlled room. Regarding environmental tobacco smoke from cigarettes, the yields of all analytes were clearly above those obtained with the other two air sample types. PMID:27343591

  3. A validated ultra high pressure liquid chromatographic method for qualification and quantification of folic acid in pharmaceutical preparations.

    PubMed

    Deconinck, E; Crevits, S; Baten, P; Courselle, P; De Beer, J

    2011-04-01

A fully validated UHPLC method for the identification and quantification of folic acid in pharmaceutical preparations was developed. The starting conditions for the development were derived from the HPLC conditions of a previously validated method. These starting conditions were tested on four different UHPLC columns: Grace Vision HT™ C18-P, C18, C18-HL and C18-B (2 mm × 100 mm, 1.5 μm). After selection of the stationary phase, the method was further optimised by testing two aqueous and two organic phases and by adapting it to a gradient method. The resulting method was fully validated based on its measurement uncertainty (accuracy profile) and robustness tests. A UHPLC method was obtained for the identification and quantification of folic acid in pharmaceutical preparations, which will cut analysis times and solvent consumption. PMID:21168299

  4. A validated HPTLC method for determination of terbutaline sulfate in biological samples: Application to pharmacokinetic study

    PubMed Central

    Faiyazuddin, Md.; Rauf, Abdul; Ahmad, Niyaz; Ahmad, Sayeed; Iqbal, Zeenat; Talegaonkar, Sushma; Bhatnagar, Aseem; Khar, Roop K.; Ahmad, Farhan J.

    2011-01-01

Terbutaline sulfate (TBS) was assayed in biological samples by a validated HPTLC method. Densitometric analysis of TBS was carried out at 366 nm on precoated TLC aluminum plates with silica gel 60F254 as a stationary phase and chloroform–methanol (9.0:1.0, v/v) as a mobile phase. TBS was well resolved at RF 0.34 ± 0.02. In all matrices, the calibration curve appeared linear (r2 ⩾ 0.9943) in the tested range of 100–1000 ng spot−1 with a limit of quantification of 18.35 ng spot−1. Drug recovery from biological fluids averaged ⩾95.92%. In both matrices, the drug degraded rapidly: its half-life (T0.5) ranged from 9.92 to 12.41 h at 4 °C and from 6.31 to 9.13 h at 20 °C. Frozen at −20 °C, the drug was stable for at least 2 months (losses <10%). The maximum plasma concentration (Cpmax) was found to be 5875.03 ± 114 ng mL−1, significantly higher than the maximum saliva concentration (Csmax, 1501.69 ± 96 ng mL−1). Therefore, the validated method could be used to carry out pharmacokinetic studies of TBS from novel drug delivery systems. PMID:23960758

  5. Validation of image-based method for extraction of coronary morphometry.

    PubMed

    Wischgoll, Thomas; Choy, Jenny Susana; Ritman, Erik L; Kassab, Ghassan S

    2008-03-01

An accurate analysis of the spatial distribution of blood flow in any organ must be based on detailed morphometry (diameters, lengths, vessel numbers, and branching pattern) of the organ vasculature. Despite the significance of detailed morphometric data, there is a relative scarcity of data on 3D vascular anatomy. One of the major reasons is that the process of morphometric data collection is labor intensive. The objective of this study is to validate a novel segmentation algorithm for semi-automation of morphometric data extraction. The utility of the method is demonstrated in porcine coronary arteries imaged by computerized tomography (CT). The coronary arteries of five porcine hearts were injected with a contrast-enhancing polymer. The coronary arterial tree proximal to 1 mm was extracted from the 3D CT images. By determining the centerlines of the extracted vessels, the vessel radii and lengths were identified for various vessel segments. The extraction algorithm described in this paper is based on a topological analysis of a vector field generated by normal vectors of the extracted vessel wall. With this approach, special focus is placed on achieving the highest accuracy of the measured values. To validate the algorithm, the results were compared to optical measurements of the main trunk of the coronary arteries with microscopy. The agreement was found to be excellent, with a root mean square deviation between computed vessel diameters and optical measurements of 0.16 mm (<10% of the mean value) and an average deviation of 0.08 mm. The utility and future applications of the proposed method to speed up morphometric measurements of vascular trees are discussed. PMID:18228141

  6. Development, Quantification, Method Validation, and Stability Study of a Novel Fucoxanthin-Fortified Milk.

    PubMed

    Mok, Il-Kyoon; Yoon, Jung-Ro; Pan, Cheol-Ho; Kim, Sang Min

    2016-08-10

To extend the scope of application of fucoxanthin, a marine carotenoid, whole milk (WM) and skimmed milk (SM) were fortified with fucoxanthin isolated from the microalga Phaeodactylum tricornutum to a final concentration of 8 μg/mL milk solution. Using these liquid systems, a fucoxanthin analysis method combining extraction and HPLC-DAD was developed and validated by accuracy, precision, system suitability, and robustness tests. The method demonstrated good linearity over the range of 0.125-100 μg/mL fucoxanthin with R(2) = 1.0000, and all validation data supported its adequacy for fucoxanthin analysis in milk solution. To investigate fucoxanthin stability during milk production and distribution, fucoxanthin content was examined during storage, pasteurization, and drying processes under various conditions. Milk solutions had a stabilizing effect on fucoxanthin over the 1-month storage period. The degradation rate constant (k) of fucoxanthin during this storage period suggested that fucoxanthin stability might be negatively correlated with decreases in temperature and increases in protein content, such as casein and whey protein, in the milk matrix. In a comparison between SM and WM, fucoxanthin in SM always showed better stability than that in WM during storage and three kinds of drying processes. This effect was also deduced to be related to protein content. In the pasteurization step, >91% of fucoxanthin was retained after three pasteurization processes, even though the above trend was not found. This study demonstrated for the first time that milk products can be used as a basic food matrix for fucoxanthin application and that protein content in milk is an important factor for fucoxanthin stability. PMID:27455130
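As an illustration of how a degradation rate constant such as the k above is typically obtained, here is a minimal sketch assuming first-order kinetics (common for carotenoid degradation) and NumPy; the time points and concentrations are invented, not data from the study:

```python
import numpy as np

# First-order degradation: C(t) = C0 * exp(-k*t), so k is the negative
# slope of ln(C) versus time.
t_days = np.array([0.0, 7.0, 14.0, 21.0, 28.0])
conc = np.array([8.0, 7.1, 6.3, 5.6, 5.0])   # illustrative ug/mL values

k = -np.polyfit(t_days, np.log(conc), 1)[0]  # rate constant, per day
half_life_days = np.log(2) / k
```

A smaller fitted k corresponds to slower degradation, which is how storage conditions (temperature, protein content) would be compared.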

  7. Validation of doubly labeled water method for energy expenditure in postsurgical infants

    SciTech Connect

    Jones, P.J.H.; Winthrop, A.L.; Schoeller, D.A.; Swyer, P.R.; Filler, R.M.; Smith, J.M.; Heim, T.

    1986-03-05

To validate the doubly labeled water method (²H₂¹⁸O) in infants without concurrent water balance, carbon dioxide production rate (rCO₂) and energy expenditure (EE) were measured for 5 or 6 days by ²H₂¹⁸O and periodic open-circuit respiratory gas exchange in 5 infants (mean age: 5.3 wk, range 1-14 wk). Following abdominal surgery (mean = 10.3 d, range 7-18 d), infants were maintained on constant oral or parenteral nutrition 4 d prior to and during the study. This avoided changes in baseline isotopic enrichment of body water, since diet-induced changes in relative isotopic abundance of ²H and ¹⁸O could introduce significant errors in rCO₂. For ²H₂¹⁸O, they assumed insensible water loss would be proportional to respiration volume and body surface area and hence rCO₂. This calculated insensible loss averaged 18% of water turnover. EE was calculated using measured respiratory quotients (EEm) and dietary intake (EEi) data. In 6 blinded studies with 5 infants, rCO₂ = 34.2 L/d (range 27.3-48.0), EEm = 191 kcal/d (133-266) and EEi = 197 kcal/d (138-281). Percent differences (± SD) from respiratory gas exchange were -1.1 ± 5.8, -1.1 ± 5.8, and 1.4 ± 5.2, respectively. These findings demonstrate the validity of the doubly labeled water method for determining energy expenditure without concurrent water balance studies.
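The core arithmetic of the doubly labeled water technique can be sketched as follows. This is the simplified Lifson-type relation from the general DLW literature, omitting the fractionation and insensible-water-loss corrections that the study above applies; the numbers are illustrative, not from the paper:

```python
# 18O leaves body water as both water and CO2, while 2H leaves only as
# water, so CO2 production is proportional to the difference between the
# two isotope elimination rate constants.
def rco2_mol_per_day(n_body_water_mol, k_o_per_day, k_h_per_day):
    """Simplified Lifson relation: rCO2 = (N/2) * (kO - kH)."""
    return 0.5 * n_body_water_mol * (k_o_per_day - k_h_per_day)

# Illustrative values only (not taken from the study):
rco2 = rco2_mol_per_day(150.0, 0.12, 0.10)   # mol CO2 per day
```

Energy expenditure is then obtained from rCO₂ and a respiratory quotient (measured or assumed from diet), which is why the abstract reports both EEm and EEi.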

  8. Determining 'age at death' for forensic purposes using human bone by a laboratory-based biomechanical analytical method.

    PubMed

    Zioupos, P; Williams, A; Christodoulou, G; Giles, R

    2014-05-01

    be applied worldwide following stringent laboratory protocols. As such, this technique contributes significantly to improving age estimation and therefore identification methods for forensic and other purposes. PMID:24286969

  9. Validation of free flow electrophoresis as a novel plasma and serum processing and fractionation method in biobanking.

    PubMed

    Gaillard, Gwenaelle; Trezzi, Jean-Pierre; Betsou, Fotini

    2012-08-01

Free flow electrophoresis (FFE) is a fractionation method based on isoelectric focusing (IEF). We validate the reproducibility of the method and show that it can be applied by biobanks in order to fractionate fluid biospecimens efficiently and reproducibly and to facilitate downstream proteomic applications. We also propose a simple method allowing researchers to assess the reproducibility of each FFE run. PMID:24849883

  10. Scope extension validation protocol: inclusion of analytes and matrices in an LC-MS/MS sulfonamide residues method.

    PubMed

    Hoff, Rodrigo Barcellos; Barreto, Fabiano; Melo, Jéssica; Martins, Magda Targa; Pizzolato, Tânia Mara; Peralba, Maria do Carmo Ruaro

    2014-01-01

Validation is a required process for analytical methods. However, scope extension, i.e. inclusion of more analytes, other matrices and/or minor changes in extraction procedures, can be achieved without a full validation protocol, which is time-consuming and laborious for the laboratory. This paper presents a simple and rugged protocol for validation in the case of extension of scope. Based on a previously reported method for analysis of sulfonamide residues using LC-MS/MS, inclusion of more analytes, metabolites and matrices, and optimisation of the extraction procedure, are presented in detail. Initially, the method was applied only to liver samples. In this work, milk, eggs and feed were also added to the scope. Several case-specific validation protocols are proposed for extension of scope. PMID:24195474

  11. Estimating chimpanzee population size with nest counts: validating methods in Taï National Park.

    PubMed

    Kouakou, Célestin Yao; Boesch, Christophe; Kuehl, Hjalmar

    2009-06-01

Successful conservation and management of wild animals require reliable estimates of their population size. Ape surveys almost always rely on counts of sleeping nests, as the animals occur at low densities and visibility is low in tropical forests. The reliability of standing-crop nest counts and marked-nest counts, the most widely used methods, has not been tested on populations of known size. Therefore, the question of which method is more appropriate for surveying chimpanzee populations remains open, and comparisons among sites are difficult. This study aimed to test the validity of these two methods by comparing their estimates to the known population size of three habituated chimpanzee communities in Taï National Park [Boesch et al., Am J Phys Anthropol 130:103-115, 2006; Boesch et al., Am J Primatol 70:519-532, 2008]. In addition to transect surveys, we made observations on nest production rate and nest lifetime. Taï chimpanzees built 1.143 nests per day. The mean nest lifetime of 141 fresh nests was 91.22 days. Estimate precision for the two methods did not differ considerably (difference of coefficient of variation <5%). The estimate of mean nest decay time was more precise (CV=6.46%) when we used covariates (tree species, rainfall, nest height and age) to model nest decay rate than when we took a simple mean of nest decay times (CV=9.17%). The two survey methods produced point estimates of chimpanzee abundance that were similar and reliable: i.e. for both methods the true chimpanzee abundance was included within the 95% confidence interval of the estimate. We recommend further research on covariate modeling of nest decay times as one way to improve precision and reduce the costs of conducting nest surveys. PMID:19235865
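For context, the standing-crop conversion that underlies such surveys is a one-line formula; here is a sketch using the production rate and nest lifetime reported above (the conversion itself is the standard one from the ape-survey literature, not code from this study, and the nest density is invented):

```python
# Chimpanzee density from a standing-crop nest count: divide nest density
# by (nest production rate * mean nest lifetime), using the Tai values
# reported in the abstract.
NESTS_PER_IND_PER_DAY = 1.143
NEST_LIFETIME_DAYS = 91.22

def chimp_density(nest_density_per_km2):
    """Weaned individuals per km^2 from nests per km^2."""
    return nest_density_per_km2 / (NESTS_PER_IND_PER_DAY * NEST_LIFETIME_DAYS)

density = chimp_density(500.0)   # ~4.8 individuals/km^2 for 500 nests/km^2
```

The sensitivity of the result to the decay-time parameter is why the abstract emphasizes covariate modeling of nest decay.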

  12. Experimental validation of a method for removing the capacitive leakage artifact from electrical bioimpedance spectroscopy measurements

    NASA Astrophysics Data System (ADS)

    Buendia, R.; Seoane, F.; Gil-Pita, R.

    2010-11-01

Often when performing electrical bioimpedance (EBI) spectroscopy measurements, the obtained EBI data present a hook-like deviation, which is most noticeable at high frequencies in the impedance plane. The deviation is due to a capacitive leakage effect caused by the presence of stray capacitances. In addition to the data deviation being remarkably noticeable at high frequencies in the phase and the reactance spectra, the measured EBI is also altered in the resistance and the modulus. If this EBI data deviation is not properly removed, it interferes with subsequent data analysis processes, especially with Cole model-based analyses. In other words, to perform any accurate analysis of the EBI spectroscopy data, the hook deviation must be properly removed. Td compensation is a method used to compensate for the hook deviation present in EBI data; it consists of multiplying the obtained spectrum, Zmeas(ω), by a complex exponential of the form exp(-jωTd). Although the method is well known and accepted, Td compensation cannot entirely correct the hook-like deviation; moreover, it lacks solid scientific grounds. In this work, the Td compensation method is revisited, and it is shown that it should not be used to correct the effect of a capacitive leakage; furthermore, a more developed approach for correcting the hook deviation caused by the capacitive leakage is proposed. The method includes a novel correcting expression and a process for selecting the proper values of the expressions, which are complex and frequency dependent. The correctness of the novel method is validated with experimental data obtained from measurements in three different EBI applications. The obtained results confirm the sufficiency and feasibility of the correcting method.
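The classical Td compensation that the abstract critiques is easy to state in code. A minimal sketch assuming NumPy (the delay value and spectrum are invented; the paper's improved correction is not reproduced here):

```python
import numpy as np

# Classical Td compensation: multiply the measured spectrum Zmeas(w) by
# exp(-j*w*Td), which rotates the phase at each angular frequency w.
def td_compensate(z_meas, freq_hz, td_s):
    w = 2.0 * np.pi * np.asarray(freq_hz)
    return np.asarray(z_meas) * np.exp(-1j * w * td_s)

freqs = np.logspace(3, 6, 4)                            # 1 kHz .. 1 MHz
z = np.array([100 - 5j, 98 - 8j, 95 - 15j, 90 - 30j])   # toy hooked spectrum
z_corr = td_compensate(z, freqs, td_s=5e-9)
```

Because |exp(-jωTd)| = 1, the multiplication only rotates the phase and leaves the modulus untouched, which is consistent with the abstract's point that this correction cannot fix the deviation observed in the resistance and modulus.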

  13. Validation and Clinical Evaluation of a Novel Method To Measure Miltefosine in Leishmaniasis Patients Using Dried Blood Spot Sample Collection

    PubMed Central

    Rosing, H.; Hillebrand, M. J. X.; Blesson, S.; Mengesha, B.; Diro, E.; Hailu, A.; Schellens, J. H. M.; Beijnen, J. H.

    2016-01-01

To facilitate future pharmacokinetic studies of combination treatments against leishmaniasis in remote regions in which the disease is endemic, a simple, cheap sampling method is required for miltefosine quantification. The aims of this study were to validate a liquid chromatography-tandem mass spectrometry method to quantify miltefosine in dried blood spot (DBS) samples and to validate its use with Ethiopian patients with visceral leishmaniasis (VL). Since hematocrit (Ht) levels are typically severely decreased in VL patients, returning to normal during treatment, the method was evaluated over a range of clinically relevant Ht values. Miltefosine was extracted from DBS samples using a simple method of pretreatment with methanol, resulting in >97% recovery. The method was validated over a calibration range of 10 to 2,000 ng/ml, and accuracy and precision were within ±11.2% and ≤7.0% (≤19.1% at the lower limit of quantification), respectively. The method was accurate and precise for blood spot volumes between 10 and 30 μl and for Ht levels of 20 to 35%, although a linear effect of Ht levels on miltefosine quantification was observed in the bioanalytical validation. DBS samples were stable for at least 162 days at 37°C. Clinical validation of the method using paired DBS and plasma samples from 16 VL patients showed a median observed DBS/plasma miltefosine concentration ratio of 0.99, with good correlation (Pearson's r = 0.946). Correcting for patient-specific Ht levels did not further improve the concordance between the sampling methods. This successfully validated method to quantify miltefosine in DBS samples was demonstrated to be a valid and practical alternative to venous blood sampling that can be applied in future miltefosine pharmacokinetic studies with leishmaniasis patients, without Ht correction. PMID:26787691

  14. Validation Methods Research for Fault-Tolerant Avionics and Control Systems: Working Group Meeting, 2

    NASA Technical Reports Server (NTRS)

    Gault, J. W. (Editor); Trivedi, K. S. (Editor); Clary, J. B. (Editor)

    1980-01-01

The validation process comprises the activities required to ensure the agreement of system realization with system specification. A preliminary validation methodology for fault-tolerant systems is documented. A general framework for a validation methodology is presented, along with a set of specific tasks intended for the validation of two specimen systems, SIFT and FTMP. Two major areas of research are identified. First are the activities required to support the ongoing development of the validation process itself; second are those required to support the design, development, and understanding of fault-tolerant systems.

  15. Reflectance Estimation from Urban Terrestrial Images: Validation of a Symbolic Ray-Tracing Method on Synthetic Data

    NASA Astrophysics Data System (ADS)

    Coubard, F.; Brédif, M.; Paparoditis, N.; Briottet, X.

    2011-04-01

    Terrestrial geolocalized images are nowadays widely used on the Internet, mainly in urban areas, through immersion services such as Google Street View. On the long run, we seek to enhance the visualization of these images; for that purpose, radiometric corrections must be performed to free them from illumination conditions at the time of acquisition. Given the simultaneously acquired 3D geometric model of the scene with LIDAR or vision techniques, we face an inverse problem where the illumination and the geometry of the scene are known and the reflectance of the scene is to be estimated. Our main contribution is the introduction of a symbolic ray-tracing rendering to generate parametric images, for quick evaluation and comparison with the acquired images. The proposed approach is then based on an iterative estimation of the reflectance parameters of the materials, using a single rendering pre-processing. We validate the method on synthetic data with linear BRDF models and discuss the limitations of the proposed approach with more general non-linear BRDF models.

  16. Validation of a gas chromatographic method to quantify sesquiterpenes in copaiba oils.

    PubMed

    Sousa, João Paulo B; Brancalion, Ana P S; Souza, Ariana B; Turatti, Izabel C C; Ambrósio, Sérgio R; Furtado, Niege A J C; Lopes, Norberto P; Bastos, Jairo K

    2011-03-25

Copaifera species (Leguminoseae) are popularly known as "copaiba" or "copaíva". The oleoresins obtained from the trunk of these species have been extensively used in folk medicine and are commercialized in Brazil as crude oil and in several pharmaceutical and cosmetic products. This work reports a fully validated method for the quantification of β-caryophyllene, α-copaene, and α-humulene in distinct copaiba oleoresins available commercially. Essential oil samples (100 μL) were dissolved in 20 mL of hexanes containing internal standard (1,2,4,5-tetramethylbenzene, 3.0 mM) in a 25 mL glass flask. A 1 μL aliquot was injected into the GC-FID system. A fused-silica capillary column (HP-5, coated with 5% phenyl-methylsiloxane) was used for this study. The developed method gave a good detection response, with linearity in the range of 0.10-18.74 mM. Limits of detection and quantitation ranged between 0.003 and 0.091 mM. β-Caryophyllene, α-copaene, and α-humulene were recovered in a range from 74.71% to 88.31%, displaying RSDs lower than 10% and relative errors between -11.69% and -25.30%. Therefore, this method could be considered an analytical tool for the quality control of different Copaifera oil samples and their products in both cosmetic and pharmaceutical companies. PMID:21095089
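Quantification against an internal standard such as the 1,2,4,5-tetramethylbenzene used above follows standard response-factor arithmetic. A generic sketch, in which the peak areas and relative response factor are placeholders rather than values from the paper:

```python
# Internal-standard quantification: the analyte concentration follows from
# the analyte/IS peak-area ratio, the known IS concentration, and a
# relative response factor (RRF) determined during calibration.
C_ISTD_MM = 3.0   # internal standard concentration from the abstract (mM)

def analyte_conc_mM(area_analyte, area_istd, rrf):
    return (area_analyte / area_istd) * C_ISTD_MM / rrf

conc = analyte_conc_mM(1.0e5, 1.0e5, 1.2)   # placeholder areas and RRF
```

Ratioing against the internal standard cancels injection-volume and detector-drift variation, which is why GC-FID methods like this one report RSDs on the area ratio rather than on raw areas.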

  17. Development and validation of RP-HPLC method for quantification of glipizide in biological macromolecules.

    PubMed

    Pani, Nihar Ranjan; Acharya, Sujata; Patra, Sradhanjali

    2014-04-01

Glipizide (GPZ) has been widely used in the treatment of type-2 diabetes as an insulin secretagogue. Multiunit chitosan-based GPZ floating microspheres were prepared by the ionotropic gelation method for gastroretentive delivery, using sodium tripolyphosphate as the cross-linking agent. A pharmacokinetic study of the microspheres was carried out in rabbits, and plasma samples were analyzed by a newly developed and validated high-performance liquid chromatographic method. The method was developed on a Hypersil ODS-18 column using a mobile phase of 10 mM phosphate buffer (pH 3.5) and methanol (25:75, v/v). The eluate was monitored at 230 nm with a flow rate of 1 mL/min. The calibration curve was linear over the concentration range of 25.38-2046.45 ng/mL. Retention times of GPZ and the internal standard (gliclazide) were 7.32 and 9.02 min, respectively. Maximum plasma drug concentration, area under the plasma drug concentration-time curve, and elimination half-life for GPZ floating microspheres were 2.88±0.29 μg mL(-1), 38.46±2.26 μg h mL(-1), and 13.55±1.36 h, respectively. When the fraction of drug dissolved from the microspheres at pH 7.4 was plotted against the fraction of drug absorbed, a linear correlation (R(2)=0.991) was obtained in the in vitro-in vivo correlation study. PMID:24418334
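The in vitro-in vivo correlation step described above amounts to a linear regression of fraction absorbed on fraction dissolved. A minimal sketch with invented fractions, assuming NumPy:

```python
import numpy as np

# Level-A style IVIVC: regress fraction of drug absorbed (in vivo) on
# fraction dissolved (in vitro) and report the coefficient of determination.
f_dissolved = np.array([0.1, 0.3, 0.5, 0.7, 0.9])    # invented fractions
f_absorbed = np.array([0.12, 0.28, 0.52, 0.69, 0.91])

slope, intercept = np.polyfit(f_dissolved, f_absorbed, 1)
r2 = np.corrcoef(f_dissolved, f_absorbed)[0, 1] ** 2
```

A slope near 1 with high R² indicates that the in vitro dissolution test predicts in vivo absorption, which is the claim the abstract's R²=0.991 supports.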

  18. Validation of a mass spectrometry-based method for milk traces detection in baked food.

    PubMed

Lamberti, Cristina; Acquadro, Elena; Corpillo, Davide; Giribaldi, Marzia; Decastelli, Lucia; Garino, Cristiano; Arlorio, Marco; Ricciardi, Carlo; Cavallarin, Laura; Giuffrida, Maria Gabriella

    2016-05-15

A simple validated LC-MS/MS-based method was set up to detect milk contamination in bakery products, taking the effects of food processing into account for the evaluation of allergen recovery and quantification. Incurred cookies were prepared at eight levels of milk contamination and were cooked to expose all milk components, including allergenic proteins, to food processing conditions. Remarkable results were obtained in terms of sufficiently low LOD and LOQ (1.3 and 4 mg/kg cookies, respectively). Precision was calculated as intra-day repeatability (RSD in the 5-20% range) and inter-day repeatability (4 days; RSD never exceeded 12%). The extraction recovery values ranged from 20% to 26%. Method applicability was evaluated by analysing commercial cookies labelled either as "milk-free" or "may contain milk". Although the ELISA methodology is considered the gold standard for detecting allergens in foods, this robust LC-MS/MS approach should be a useful confirmatory method for assessing and certifying "milk-free" food products. PMID:26775952
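Detection and quantification limits like those reported above are commonly estimated with the ICH formulas LOD = 3.3·σ/S and LOQ = 10·σ/S, where σ is the standard deviation of the response and S the calibration slope. A generic sketch (σ and S are chosen for illustration so the magnitudes roughly match those above; they are not values from the study):

```python
# ICH-style limits from calibration statistics: sigma is the standard
# deviation of the response (e.g. of the blank or the intercept) and
# slope is the calibration-curve slope.
def lod_loq(sigma, slope):
    return 3.3 * sigma / slope, 10.0 * sigma / slope

lod, loq = lod_loq(sigma=0.4, slope=1.0)   # ~1.32 and 4.0, in sigma/slope units
```

By construction LOQ is about three times LOD, which matches the ratio of the 1.3 and 4 mg/kg figures reported in the abstract.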

  19. Development and validation of QuEChERS method for estimation of chlorantraniliprole residue in vegetables.

    PubMed

    Singh, Balwinder; Kar, Abhijit; Mandal, Kousik; Kumar, Rajinder; Sahoo, Sanjay Kumar

    2012-12-01

An easy, simple and efficient analytical method was standardized and validated for the estimation of residues of chlorantraniliprole in different vegetables, comprising brinjal, cabbage, capsicum, cauliflower, okra, and tomato. The QuEChERS method was used for the extraction and cleanup of chlorantraniliprole residues in these vegetables. Final clear extracts in ethyl acetate were concentrated under vacuum and reconstituted into high performance liquid chromatograph (HPLC) grade acetonitrile, and residues were estimated using an HPLC equipped with a PDA detector system and C(18) column, and confirmed by liquid chromatograph mass spectrometer (LC-MS/MS) and high performance thin layer chromatograph (HPTLC). HPLC grade acetonitrile:water (80:20, v/v) was used as the mobile phase at 0.4 mL/min. Chlorantraniliprole presented a distinct peak at a retention time of 9.82 min. Consistent recoveries ranging from 85% to 96% for chlorantraniliprole were observed when samples were spiked at 0.10, 0.25, 0.50, and 1.00 mg/kg levels. The limit of quantification of this method was determined to be 0.10 mg/kg. PMID:22853564

  20. Validation of an HPLC method for direct measurement of steviol equivalents in foods.

    PubMed

    Bartholomees, Uria; Struyf, Tom; Lauwers, Olivier; Ceunen, Stijn; Geuns, Jan M C

    2016-01-01

Steviol glycosides are intense natural sweeteners used in foods and beverages. Their acceptable daily intake, expressed as steviol equivalents, is set at 0-4 mg/kg body weight. We report the development and validation of an RP-HPLC method with fluorometric detection of derivatized isosteviol, formed by acid hydrolysis of steviol glycosides. Dihydroisosteviol was used as an internal standard. Using this method, the amount of steviol equivalents in commercial steviol glycoside mixtures and different foods can be directly quantified. The method was successfully tested on strawberry jam, low-fat milk, a soft drink, yogurt and a commercial mixture of steviol glycosides. Calibration curves were linear between 0.01 and 1.61 mM steviol equivalents, with a quantification limit of 0.2 nmol. The % RSD of intra-day precision varied between 0.4% and 4%, whereas inter-day precision varied between 0.4% and 5%, for high and medium concentrations, and between 3% and 8% for low concentrations. Accuracy of the analysis varied between 99% and 115%. PMID:26212970
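Expressing glycoside amounts as steviol equivalents is a molecular-weight conversion. A minimal sketch using commonly published molecular weights (these constants and the example amounts are assumptions for illustration, not values from the abstract):

```python
# Steviol equivalents: scale each glycoside mass by MW(steviol)/MW(glycoside)
# and sum over the mixture. Molecular weights are standard literature values.
MW_STEVIOL = 318.45
MW_GLYCOSIDE = {"stevioside": 804.87, "rebaudioside A": 967.01}

def steviol_equivalents_mg(amounts_mg):
    """amounts_mg: mapping of glycoside name -> mass in mg."""
    return sum(mass * MW_STEVIOL / MW_GLYCOSIDE[name]
               for name, mass in amounts_mg.items())

se = steviol_equivalents_mg({"stevioside": 100.0, "rebaudioside A": 100.0})
```

The direct hydrolysis-to-isosteviol approach in the abstract avoids this per-glycoside bookkeeping entirely, since every glycoside is converted to the same measurable backbone.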