Science.gov

Sample records for purpose validated method

  1. Validity for What Purpose?

    ERIC Educational Resources Information Center

    Shepard, Lorrie A.

    2013-01-01

    Background/Context: The evolution of validity understandings from mid-century to now has emphasized that test validity depends on test purpose--adding consequence considerations to issues of interpretation and evidentiary warrants. Purpose: To consider the tensions created by multiple purposes for assessment and sketch briefly how we got to where…

  2. Critical analysis of several analytical method validation strategies in the framework of the fit for purpose concept.

    PubMed

    Bouabidi, A; Rozet, E; Fillet, M; Ziemons, E; Chapuzet, E; Mertens, B; Klinkenberg, R; Ceccato, A; Talbi, M; Streel, B; Bouklouze, A; Boulanger, B; Hubert, Ph

    2010-05-07

    Analytical method validation is a mandatory step at the end of development in all analytical laboratories. It is a highly regulated step of the life cycle of a quantitative analytical method. However, even though some documents have been published, there is a lack of clear guidance on the methodology to follow to adequately decide when a method can be considered valid. This situation has led to the availability of several methodological approaches, and it is therefore the responsibility of the analyst to choose the best one. The classical decision processes encountered during method validation evaluation are compared, namely the descriptive, difference and equivalence approaches. Furthermore, a validation approach using the accuracy profile, computed by means of the beta-expectation tolerance interval and total measurement error, is also available. In the present paper, all of these validation approaches were applied to the validation of two analytical methods. The evaluation of the producer and consumer risks by Monte Carlo simulations was also performed in order to compare the appropriateness of the various approaches. The classical methodologies give rise to inadequate and contradictory conclusions which do not allow them to meet the objective of method validation, i.e. to give sufficient guarantees that each of the future results generated by the method during routine use will be close enough to the true value. It is found that the validation methodology which gives the most guarantees with regard to the reliability or adequacy of the decision to consider a method valid is the one based on the accuracy profile.
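
    The beta-expectation tolerance interval at the heart of the accuracy-profile approach can be sketched in a few lines. This is a minimal illustration, assuming normally distributed results and using a large-sample normal quantile in place of the Student-t factor used in the published procedures; the recovery data and the ±5% acceptance limits are invented for the example:

```python
from statistics import NormalDist, mean, stdev

def beta_expectation_interval(results, beta=0.95):
    """Large-sample sketch of a beta-expectation tolerance interval:
    the interval expected to contain a fraction `beta` of future
    results. (Published accuracy-profile procedures use a Student-t
    factor; the normal quantile here is a large-n approximation.)"""
    n = len(results)
    m, s = mean(results), stdev(results)
    k = NormalDist().inv_cdf((1 + beta) / 2) * (1 + 1 / n) ** 0.5
    return m - k * s, m + k * s

# Decision rule: the method is declared valid at this concentration
# level if the interval lies within the acceptance limits, e.g. +/-5%
# around a nominal recovery of 100 (hypothetical replicate data).
recoveries = [99.2, 100.4, 98.7, 101.1, 99.9, 100.6, 99.4, 100.2]
lo, hi = beta_expectation_interval(recoveries)
print(f"beta-expectation interval: ({lo:.2f}, {hi:.2f})")
```

    The interval directly expresses the guarantee the abstract describes: that future routine results are expected to fall close enough to the true value.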

  3. Validity of a simple videogrammetric method to measure the movement of all hand segments for clinical purposes.

    PubMed

    Sancho-Bru, Joaquín L; Jarque-Bou, Néstor J; Vergara, Margarita; Pérez-González, Antonio

    2014-02-01

    Hand movement measurement is important in the clinical, ergonomics and biomechanics fields. Videogrammetric techniques allow the measurement of hand movement without interfering with natural hand behaviour. However, an accurate measurement of hand movement requires a high number of markers, which limits its applicability in clinical practice (60 markers would be needed for the hand and wrist). In this work, a simple method that uses a reduced number of markers (29), based on a simplified kinematic model of the hand, is proposed and evaluated. A set of experiments was performed to evaluate the errors associated with the kinematic simplification, together with an evaluation of its accuracy, repeatability and reproducibility. The global error attributed to the kinematic simplification was 6.68°. The method has small errors in repeatability and reproducibility (3.43° and 4.23°, respectively) and shows no statistically significant difference from the use of electronic goniometers. The relevance of the work lies in the ability to measure all degrees of freedom of the hand with a reduced number of markers without interfering with natural hand behaviour, which makes it suitable for clinical applications, as well as for ergonomic and biomechanical purposes.

  4. Validating Analytical Methods

    ERIC Educational Resources Information Center

    Ember, Lois R.

    1977-01-01

    The procedures utilized by the Association of Official Analytical Chemists (AOAC) to develop, evaluate, and validate analytical methods for the analysis of chemical pollutants are detailed. Methods validated by AOAC are used by the EPA and FDA in their enforcement programs and are granted preferential treatment by the courts. (BT)

  5. Fit for purpose validated method for the determination of the strontium isotopic signature in mineral water samples by multi-collector inductively coupled plasma mass spectrometry

    NASA Astrophysics Data System (ADS)

    Brach-Papa, Christophe; Van Bocxstaele, Marleen; Ponzevera, Emmanuel; Quétel, Christophe R.

    2009-03-01

    A robust method allowing the routine determination of n(87Sr)/n(86Sr) with at least five significant decimal digits for large sets of mineral water samples is described. It is based on two consecutive chromatographic separations of Sr combined with multi-collector inductively coupled plasma mass spectrometry (MC-ICPMS) measurements. Separations are performed using commercial pre-packed columns filled with "Sr resin" to overcome isobaric interferences affecting the determination of strontium isotope ratios. The careful method validation scheme applied is described. It included investigations of all parameters influencing both the chromatographic separations and the MC-ICPMS measurements, as well as a test on a synthetic sample made of an aliquot of the NIST SRM 987 certified reference material dispersed in a saline matrix to mimic complex samples. Correction for mass discrimination was done internally using the n(88Sr)/n(86Sr) ratio. For comparing mineral waters originating from different geological backgrounds or identifying counterfeits, calculations used the well-known consensus value n(88Sr)/n(86Sr) = 1/0.1194 (taken as exact) as reference. The typical uncertainty budget estimated for these results was 40 'ppm' relative (k = 2). It increased to 150 'ppm' (k = 2) for the establishment of stand-alone results, taking into account a relative difference of about 126 'ppm' systematically observed between measured and certified values of the NIST SRM 987. Where a deviation of the n(88Sr)/n(86Sr) ratio was suspected (worst-case scenario), our proposal was to use the NIST SRM 987 value 8.37861 ± 0.00325 (k = 2) as reference and assign a typical relative uncertainty budget of 300 'ppm' (k = 2). This method is thus fit for purpose and was applied to eleven French samples.
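
    The internal mass-bias correction using the n(88Sr)/n(86Sr) ratio is conventionally done with the exponential law. A generic sketch of that correction (not the paper's exact uncertainty treatment); the isotope masses come from standard atomic-mass tables and the measured ratios in the example are invented:

```python
import math

# Isotope masses (u), from standard atomic mass tables.
M86, M87, M88 = 85.9092607, 86.9088775, 87.9056122
R88_86_TRUE = 1 / 0.1194  # consensus n(88Sr)/n(86Sr) used as reference

def mass_bias_corrected_87_86(r87_86_meas, r88_86_meas):
    """Exponential-law internal mass-bias correction, the standard
    approach for MC-ICPMS Sr isotope ratio measurements."""
    # Fractionation exponent from the deviation of the measured 88/86
    # ratio from its reference value:
    f = math.log(R88_86_TRUE / r88_86_meas) / math.log(M88 / M86)
    # Apply the same per-mass fractionation to the 87/86 ratio:
    return r87_86_meas * (M87 / M86) ** f

# Example with hypothetical raw ratios affected by instrumental
# mass fractionation:
print(f"{mass_bias_corrected_87_86(0.71724, 8.4592):.5f}")
```

    When the measured 88/86 ratio equals the reference value, the correction leaves the 87/86 ratio unchanged, which is a convenient sanity check on any implementation.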

  6. External Validity in Policy Evaluations that Choose Sites Purposively.

    PubMed

    Olsen, Robert B; Orr, Larry L; Bell, Stephen H; Stuart, Elizabeth A

    2013-01-01

    Evaluations of the impact of social programs are often carried out in multiple "sites," such as school districts, housing authorities, local TANF offices, or One-Stop Career Centers. Most evaluations select sites purposively following a process that is nonrandom. Unfortunately, purposive site selection can produce a sample of sites that is not representative of the population of interest for the program. In this paper, we propose a conceptual model of purposive site selection. We begin with the proposition that a purposive sample of sites can usefully be conceptualized as a random sample of sites from some well-defined population, for which the sampling probabilities are unknown and vary across sites. This proposition allows us to derive a formal, yet intuitive, mathematical expression for the bias in the pooled impact estimate when sites are selected purposively. This formula helps us to better understand the consequences of selecting sites purposively, and the factors that contribute to the bias. Additional research is needed to obtain evidence on how large the bias tends to be in actual studies that select sites purposively, and to develop methods to increase the external validity of these studies.
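
    The paper's key proposition, that a purposive sample behaves like a random sample with unknown, site-varying selection probabilities, can be illustrated with a small simulation. This is a hypothetical example (the population, impacts, and selection rule are invented), showing how impact-correlated selection biases the pooled estimate:

```python
import random

random.seed(1)

# Hypothetical population of sites: each has a true program impact.
impacts = [random.gauss(0.20, 0.10) for _ in range(10_000)]

# Purposive selection modeled as unknown sampling probabilities that
# are correlated with impact (e.g. higher-impact sites volunteer more):
probs = [max(0.01, 0.1 + 0.5 * d) for d in impacts]

pop_mean = sum(impacts) / len(impacts)
# Expected value of the pooled estimate over repeated purposive draws
# is the probability-weighted mean of the site impacts:
sel_mean = sum(p * d for p, d in zip(probs, impacts)) / sum(probs)

bias = sel_mean - pop_mean
print(f"population mean impact:             {pop_mean:.3f}")
print(f"expected purposive-sample estimate: {sel_mean:.3f}")
print(f"external-validity bias:             {bias:+.3f}")
```

    The bias is positive here because selection probability rises with impact; it vanishes only when the selection probabilities are uncorrelated with the site impacts, which matches the formal covariance-based expression the authors derive.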

  7. Construct Validity in Formative Assessment: Purpose and Practices

    ERIC Educational Resources Information Center

    Rix, Samantha

    2012-01-01

    This paper examines the utilization of construct validity in formative assessment for classroom-based purposes. Construct validity pertains to the notion that interpretations are made by educators who analyze test scores during formative assessment. The purpose of this paper is to note the challenges that educators face when interpreting these…

  8. Simple validated LC-MS/MS method for the determination of atropine and scopolamine in plasma for clinical and forensic toxicological purposes.

    PubMed

    Koželj, Gordana; Perharič, Lucija; Stanovnik, Lovro; Prosen, Helena

    2014-08-05

    A liquid chromatography-tandem mass spectrometry (LC-MS/MS) method for the determination of atropine and scopolamine in 100 μL human plasma was developed and validated. Sample pretreatment consisted of protein precipitation with acetonitrile followed by a concentration step. Analytes and levobupivacaine (internal standard) were separated on a Zorbax XDB-CN column (75 mm × 4.6 mm i.d., 3.5 μm) with gradient elution (purified water, acetonitrile, formic acid). The triple quadrupole MS was operated in ESI positive mode. The matrix effect was estimated for deproteinised plasma samples. Selected reaction monitoring (SRM) was used for quantification in the range of 0.10-50.00 ng/mL. Interday precision for both tropanes and intraday precision for atropine were <10%; intraday precision for scopolamine was <14%, and <18% at the lower limit of quantification (LLOQ). Mean interday and intraday accuracies for atropine were within ±7% and for scopolamine within ±11%. The method can be used for the determination of therapeutic and toxic levels of both compounds and has been successfully applied to a study of the pharmacodynamic and pharmacokinetic properties of tropanes, in which plasma samples of volunteers were collected at fixed time intervals after ingestion of a buckwheat meal spiked with five low doses of tropanes.
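
    The precision and accuracy figures quoted above are standard bioanalytical validation metrics. A generic sketch of how they are computed from replicate quality-control measurements (the replicate values and the nominal 5.00 ng/mL level are invented for illustration, not the authors' data):

```python
from statistics import mean, stdev

def precision_accuracy(measured, nominal):
    """Precision as percent relative standard deviation (%RSD) and
    accuracy as mean relative bias (%) against the nominal value."""
    m = mean(measured)
    rsd = 100 * stdev(measured) / m
    bias = 100 * (m - nominal) / nominal
    return rsd, bias

# Hypothetical QC replicates at a nominal 5.00 ng/mL atropine level:
qc = [4.81, 5.12, 4.95, 5.23, 4.77, 5.08]
rsd, bias = precision_accuracy(qc, 5.00)
print(f"precision (RSD): {rsd:.1f}%   accuracy (bias): {bias:+.1f}%")
```

    Against the criteria reported in the abstract (precision <10%, atropine accuracy within ±7%), these hypothetical replicates would pass.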

  9. Homework Purpose Scale for Middle School Students: A Validation Study

    ERIC Educational Resources Information Center

    Xu, Jianzhong

    2011-01-01

    The purpose of the present study is to test the validity of scores on the Homework Purpose Scale (HPS) for middle school students. The participants were 1,181 eighth graders in the southeastern United States, including (a) 699 students in urban school districts and (b) 482 students in rural school districts. First, confirmatory factor analysis was…

  10. VAN method lacks validity

    NASA Astrophysics Data System (ADS)

    Jackson, David D.; Kagan, Yan Y.

    Varotsos and colleagues (the VAN group) claim to have successfully predicted many earthquakes in Greece. Several authors have refuted these claims, as reported in the May 27, 1996, special issue of Geophysical Research Letters and a recent book, A Critical Review of VAN [Lighthill, 1996]. Nevertheless, the myth persists. Here we summarize why the VAN group's claims lack validity. The VAN group observes electrical potential differences that they call “seismic electric signals” (SES) weeks before and hundreds of kilometers away from some earthquakes, claiming that SES are somehow premonitory. This would require that increases in stress or decreases in strength cause the electrical variations, or that some regional process first causes the electrical signals and then helps trigger the earthquakes. Here we adopt their notation SES to refer to the electrical variations, without accepting any link to the quakes.

  11. Toward a Unified Validation Framework in Mixed Methods Research

    ERIC Educational Resources Information Center

    Dellinger, Amy B.; Leech, Nancy L.

    2007-01-01

    The primary purpose of this article is to further discussions of validity in mixed methods research by introducing a validation framework to guide thinking about validity in this area. To justify the use of this framework, the authors discuss traditional terminology and validity criteria for quantitative and qualitative research, as well as…

  12. Quality by design compliant analytical method validation.

    PubMed

    Rozet, E; Ziemons, E; Marini, R D; Boulanger, B; Hubert, Ph

    2012-01-03

    The concept of quality by design (QbD) has recently been adopted for the development of pharmaceutical processes to ensure a predefined product quality. Focus on applying the QbD concept to analytical methods has increased, as these methods are fully integrated within pharmaceutical processes and especially within the process control strategy. In addition, there is a need to switch from the traditional checklist implementation of method validation requirements to a validation approach that provides a high level of assurance of method reliability, in order to adequately measure the critical quality attributes (CQAs) of the drug product. The intended purpose of analytical methods is directly related to the final decision that will be made with the results they generate. The final aim for quantitative impurity assays is to correctly declare a substance or a product compliant with the corresponding product specifications. For content assays, the aim is similar: making the correct decision about product compliance with respect to the specification limits. It is for these reasons that the fitness of these methods should be defined, as they are key elements of the analytical target profile (ATP). Therefore, validation criteria, corresponding acceptance limits, and method validation decision approaches should be set in accordance with the final use of these analytical procedures. This work proposes a general methodology to achieve this, in order to align method validation with the QbD framework and philosophy. β-Expectation tolerance intervals are implemented to decide on the validity of analytical methods. The proposed methodology is also applied to the validation of analytical procedures dedicated to the quantification of impurities or active pharmaceutical ingredients (API) in drug substances or drug products, and its applicability is illustrated with two case studies.

  13. Random Qualitative Validation: A Mixed-Methods Approach to Survey Validation

    ERIC Educational Resources Information Center

    Van Duzer, Eric

    2012-01-01

    The purpose of this paper is to introduce the process and value of Random Qualitative Validation (RQV) in the development and interpretation of survey data. RQV is a method of gathering clarifying qualitative data that improves the validity of the quantitative analysis. This paper is concerned with validity in relation to the participants'…

  14. External Validity in Policy Evaluations That Choose Sites Purposively

    ERIC Educational Resources Information Center

    Olsen, Robert B.; Orr, Larry L.; Bell, Stephen H.; Stuart, Elizabeth A.

    2013-01-01

    Evaluations of the impact of social programs are often carried out in multiple sites, such as school districts, housing authorities, local TANF offices, or One-Stop Career Centers. Most evaluations select sites purposively following a process that is nonrandom. Unfortunately, purposive site selection can produce a sample of sites that is not…

  15. Are analysts doing method validation in liquid chromatography?

    PubMed

    Ruiz-Angel, M J; García-Alvarez-Coque, M C; Berthod, A; Carda-Broch, S

    2014-08-01

    Method validation has been applied to reported analytical methods for decades. Even before this protocol was defined, authors already validated their methods in some way, without full awareness, because they wished to assure the quality of their work. Validation is an applied approach to verify that a method is suitable and rugged enough to function as a quality control tool in different locations and at different times. The performance parameters and statistical protocols followed throughout a validation study vary with the source of the guidelines. Before single-laboratory validation, an analytical method should be fully developed and optimized. The purpose of validation is to confirm the performance parameters determined during method development, and it should provide information on how the method will perform under routine use. An unstable method may require re-validation. Further method development and optimization will be needed if validation results do not meet the accepted performance standards. When possible, the validation protocol should also be conducted as a collaborative study by multiple laboratories, on different instruments, with different reagents and standards. At this point, it would be interesting to know how people are validating their methods. Are they evaluating all defined validation parameters? Are they indicating the guidelines followed? Is re-validation really in current use? Is validation performed by a single laboratory, or is it collaborative work by several laboratories? Is it an evolving discipline? In this survey, we try to answer these questions with a focus on the field of liquid chromatography.

  16. Advances in validation, risk and uncertainty assessment of bioanalytical methods.

    PubMed

    Rozet, E; Marini, R D; Ziemons, E; Boulanger, B; Hubert, Ph

    2011-06-25

    Bioanalytical method validation is a mandatory step to evaluate the ability of developed methods to provide accurate results for their routine application, in order to trust the critical decisions that will be made with them. Even though several guidelines exist to help perform bioanalytical method validations, there is still a need to clarify the meaning and interpretation of bioanalytical method validation criteria and methodology. Different interpretations can be made of the validation guidelines, as well as of the definitions of the validation criteria, leading to diverse experimental designs implemented to try to fulfil these criteria. Finally, different decision methodologies can also be read into these guidelines. Therefore, the risk that a validated bioanalytical method may be unfit for its future purpose depends on the analyst's personal interpretation of these guidelines. The objective of this review is thus to discuss and highlight several essential aspects of method validation, not restricted to chromatographic methods but also covering ligand-binding assays, owing to their increasing role in the biopharmaceutical industry. The points reviewed are the common validation criteria: selectivity, standard curve, trueness, precision, accuracy, limits of quantification and range, dilutional integrity, and analyte stability. Definitions, methodology, experimental design and decision criteria are reviewed. Two other points closely connected to method validation are also examined: incurred sample reproducibility testing and measurement uncertainty, as they are highly linked to the reliability of bioanalytical results. Their additional implementation is foreseen to strongly reduce the risk of having validated a bioanalytical method that is unfit for its purpose.

  17. Validation Methods for Direct Writing Assessment.

    ERIC Educational Resources Information Center

    Miller, M. David; Crocker, Linda

    1990-01-01

    This review of methods for validating writing assessments was conceptualized within a framework suggested by S. Messick (1989) that included five operational components of construct validation: (1) content representativeness; (2) structural fidelity; (3) nomological validity; (4) criterion-related validity; and (5) nomothetic span. (SLD)

  18. Analytical difficulties facing today's regulatory laboratories: issues in method validation.

    PubMed

    MacNeil, James D

    2012-08-01

    The challenges facing analytical laboratories today are not unlike those faced in the past, although both the degree of complexity and the rate of change have increased. Challenges such as the development and maintenance of expertise, the maintenance and updating of equipment, and the introduction of new test methods have always been familiar themes for analytical laboratories, but international guidelines for laboratories involved in the import and export testing of food require management of such changes in a context that includes quality assurance, accreditation, and method validation considerations. Decisions as to when a change in a method requires re-validation, or on the design of a validation scheme for a complex multi-residue method, require a well-considered strategy based on current knowledge of international guidance documents and regulatory requirements, as well as the laboratory's quality system requirements. Validation demonstrates that a method is 'fit for purpose', so the requirement for validation should be assessed in terms of the intended use of a method and, in the case of a change or modification, whether that change may affect a previously validated performance characteristic. In general, method validation involves method scope, calibration-related parameters, method precision, and recovery. Any method change which may affect method scope or any performance parameters will require re-validation. Some typical situations involving method changes are discussed, and a decision process is proposed for the selection of appropriate validation measures.

  19. Development of a systematic computer vision-based method to analyse and compare images of false identity documents for forensic intelligence purposes-Part I: Acquisition, calibration and validation issues.

    PubMed

    Auberson, Marie; Baechler, Simon; Zasso, Michaël; Genessay, Thibault; Patiny, Luc; Esseiva, Pierre

    2016-03-01

    Following their detection and seizure by police and border guard authorities, false identity and travel documents are usually scanned, producing digital images. This research investigates the potential of these images to classify false identity documents, highlight links between documents produced by the same modus operandi or the same source, and thus support forensic intelligence efforts. Inspired by previous research on digital images of Ecstasy tablets, a systematic and complete method has been developed to acquire, collect, process and compare images of false identity documents. This first part of the article highlights the critical steps of the method and the development of a prototype that processes regions of interest extracted from images. Acquisition conditions have been fine-tuned in order to optimise the reproducibility and comparability of images. Different filters and comparison metrics have been evaluated, and the performance of the method has been assessed using two calibration and validation sets of documents, made up of 101 Italian driving licenses and 96 Portuguese passports seized in Switzerland, among which some were known to come from common sources. Results indicate that the use of Hue and Edge filters, or their combination, to extract profiles from images, followed by the comparison of profiles with a Canberra distance-based metric, provides the most accurate classification of documents. The method also appears to be quick, efficient and inexpensive. It can be easily operated from remote locations and shared amongst different organisations, which makes it very convenient for future operational applications. The method could serve as a fast first triage step that may help target more resource-intensive profiling methods (based on a visual, physical or chemical examination of documents, for instance). Its contribution to forensic intelligence and its application to several sets of false identity documents seized by police and border guards will be…
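
    The Canberra distance used to compare document profiles is a simple weighted L1 metric. A self-contained sketch, with invented profile values standing in for the Hue/Edge profiles extracted from scanned regions of interest:

```python
def canberra(profile_a, profile_b):
    """Canberra distance between two intensity profiles: each term is
    the absolute difference normalized by the sum of magnitudes, so
    every profile position contributes on a comparable scale."""
    return sum(
        abs(a - b) / (abs(a) + abs(b))
        for a, b in zip(profile_a, profile_b)
        if a != 0 or b != 0  # 0/0 terms are conventionally zero
    )

# Two documents from a common source should yield near-identical
# profiles and hence a small distance (values are illustrative):
same_source = canberra([0.9, 0.4, 0.7, 0.1], [0.88, 0.41, 0.69, 0.12])
different = canberra([0.9, 0.4, 0.7, 0.1], [0.2, 0.8, 0.3, 0.6])
print(f"same source: {same_source:.3f}   different: {different:.3f}")
```

    The per-term normalization is what makes the metric sensitive to relative rather than absolute differences, which suits profiles whose overall intensity may vary between scans.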

  20. Design for validation, based on formal methods

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.

    1990-01-01

    Validation of ultra-reliable systems decomposes into two subproblems: (1) quantification of the probability of system failure due to physical failure; (2) establishing that design errors are not present. Methods for the design, testing, and analysis of ultra-reliable software are discussed. It is concluded that a design-for-validation approach based on formal methods is needed for the digital flight control systems problem, and also that formal methods will play a major role in the development of future high-reliability digital systems.

  1. [Validation and verification of microbiology methods].

    PubMed

    Camaró-Sala, María Luisa; Martínez-García, Rosana; Olmos-Martínez, Piedad; Catalá-Cuenca, Vicente; Ocete-Mochón, María Dolores; Gimeno-Cardona, Concepción

    2015-01-01

    Clinical microbiologists should ensure, to the maximum level allowed by scientific and technical development, the reliability of their results. This implies that, in addition to meeting the technical criteria that ensure their validity, tests must be performed under conditions that allow comparable results to be obtained regardless of the laboratory performing them. In this sense, the use of recognized and accepted reference methods is the most effective tool for providing these guarantees. Activities related to the verification and validation of analytical methods have become very important, given the continuous development and updating of techniques, increasingly complex analytical equipment, and the interest of professionals in ensuring quality processes and results. The definitions of validation and verification are described, along with the different types of validation/verification, the types of methods, and the level of validation necessary depending on the degree of standardization. The situations in which validation/verification is mandatory and/or recommended are discussed, including those particularly related to validation in microbiology. The paper stresses the importance of promoting the use of reference strains as controls in microbiology and the use of standard controls, as well as the importance of participation in External Quality Assessment programs to demonstrate technical competence. Emphasis is placed on how to calculate some of the parameters required for validation/verification, such as accuracy and precision. The development of these concepts can be found in SEIMC microbiology procedure number 48, «Validation and verification of microbiological methods» (www.seimc.org/protocols/microbiology).
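
    The accuracy and precision calculations the abstract refers to can be sketched for a typical microbiological verification: replicate colony counts of a reference strain against a known inoculum. The counts and the 50 CFU inoculum below are hypothetical, and precision is expressed on log10-transformed counts, a common convention for microbial data:

```python
from math import log10
from statistics import mean, stdev

# Hypothetical replicate colony counts (CFU) of a reference strain,
# against an expected inoculum of 50 CFU per plate.
counts = [47, 52, 44, 55, 49, 51]
expected = 50

# Accuracy as percent recovery of the expected inoculum:
accuracy = 100 * mean(counts) / expected

# Precision as %RSD of the log10-transformed counts:
logs = [log10(c) for c in counts]
precision = 100 * stdev(logs) / mean(logs)

print(f"accuracy:  {accuracy:.0f}% recovery")
print(f"precision: {precision:.1f}% RSD (log10 CFU)")
```

    Acceptance criteria for such parameters are set by the laboratory's quality system; the point of the sketch is only the mechanics of the two calculations.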

  2. Experimental validation of structural optimization methods

    NASA Technical Reports Server (NTRS)

    Adelman, Howard M.

    1992-01-01

    The topic of validating structural optimization methods through the use of experimental results is addressed. The need to validate the methods, as a way of effecting a greater and accelerated acceptance of formal optimization methods by practicing engineering designers, is described. The range of validation strategies is defined, which includes comparison of optimization results with more traditional design approaches, establishing the accuracy of the analyses used, and finally experimental validation of the optimization results. Examples of the use of experimental results to validate optimization techniques are described. The examples include experimental validation of the following: optimum design of a trussed beam; combined control-structure design of a cable-supported beam simulating an actively controlled space structure; minimum-weight design of a beam with frequency constraints; minimization of the vibration response of a helicopter rotor blade; minimum-weight design of a turbine blade disk; aeroelastic optimization of an aircraft vertical fin; airfoil shape optimization for drag minimization; optimization of the shape of a hole in a plate for stress minimization; optimization to minimize beam dynamic response; and structural optimization of a low-vibration helicopter rotor.

  3. ASTM Validates Air Pollution Test Methods

    ERIC Educational Resources Information Center

    Chemical and Engineering News, 1973

    1973-01-01

    The American Society for Testing and Materials (ASTM) has validated six basic methods for measuring pollutants in ambient air as the first part of its Project Threshold. Aim of the project is to establish nationwide consistency in measuring pollutants; determining precision, accuracy and reproducibility of 35 standard measuring methods. (BL)

  4. A Practical Guide to Immunoassay Method Validation.

    PubMed

    Andreasson, Ulf; Perret-Liaudet, Armand; van Waalwijk van Doorn, Linda J C; Blennow, Kaj; Chiasserini, Davide; Engelborghs, Sebastiaan; Fladby, Tormod; Genc, Sermin; Kruse, Niels; Kuiperij, H Bea; Kulic, Luka; Lewczuk, Piotr; Mollenhauer, Brit; Mroczko, Barbara; Parnetti, Lucilla; Vanmechelen, Eugeen; Verbeek, Marcel M; Winblad, Bengt; Zetterberg, Henrik; Koel-Simmelink, Marleen; Teunissen, Charlotte E

    2015-01-01

    Biochemical markers have a central position in the diagnosis and management of patients in clinical medicine, as well as in clinical research and drug development, including for brain disorders such as Alzheimer's disease. The enzyme-linked immunosorbent assay (ELISA) is frequently used for the measurement of low-abundance biomarkers. However, the quality of ELISA methods varies, which may introduce both systematic and random errors. This underscores the need for more rigorous control of assay performance, regardless of whether the assay is used in a research setting, in clinical routine, or in drug development. The aim of a method validation is to present objective evidence that a method fulfils the requirements for its intended use. Although much has been published on which parameters to investigate in a method validation, less is available at a detailed level on how to perform the corresponding experiments. To remedy this, standard operating procedures (SOPs) with step-by-step instructions for a number of different validation parameters are included in the present work, together with a validation report template, which allows for a well-ordered presentation of the results. Even though the SOPs were developed with the intended use for immunochemical methods and for multicenter evaluations, most of them are generic and can be used for other technologies as well.

  5. Determination of lead and cadmium in seawater by differential pulse anodic stripping voltammetry: fit-for-purpose partial validation and internal quality aspects.

    PubMed

    Bisetty, K; Gumede, N J; Escuder-Gilabert, L; Sagrado, S

    2008-09-01

    The main thrust of this work involves method validation, quality control and sample uncertainty estimations related to the determination of cadmium and lead in marine water by anodic stripping voltammetry. We have followed a step-by-step protocol to evaluate and harmonize the internal quality aspects of this method. Such protocol involves a statement of the method's scope (analytes, matrices, concentration level) and requisites (external and/or internal); selection of the method's (fit-for-purpose) features; prevalidation and validation of the intermediate accuracy (under intermediate precision conditions) and its assessment (by Monte Carlo simulation); validation of other required features of the method (if applicable); and a validity statement in terms of a "fit-for-purpose" decision, harmonized validation-control-uncertainty statistics (the "u-approach") and short-term routine work (with the aim of proposing virtually "ready-to-use" methods).

  6. Validation of qualitative microbiological test methods.

    PubMed

    IJzerman-Boon, Pieta C; van den Heuvel, Edwin R

    2015-01-01

    This paper considers a statistical model for the detection mechanism of qualitative microbiological test methods, with a parameter for the detection proportion (the probability of detecting a single organism) and a parameter for the false positive rate. It is demonstrated that the detection proportion and the bacterial density cannot be estimated separately, not even in a multiple-dilution experiment. Only their product can be estimated, changing the interpretation of the most probable number estimator. The asymptotic power of the likelihood ratio statistic for comparing an alternative method with the compendial method is optimal for a single-dilution experiment. The bacterial density should either be close to two CFUs per test unit or equal to zero, depending on differences in the model parameters between the two test methods. The proposed strategy for method validation is to use these two dilutions and test for differences in the two model parameters, addressing the validation parameters specificity and accuracy. Robustness of these two parameters might still be required, but all other validation parameters can be omitted. A confidence-interval-based approach for the ratio of the detection proportions of the two methods is recommended, since it is most informative and close in power to the likelihood ratio test.
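
    The non-identifiability result can be made concrete with a small sketch of the detection model. Assuming Poisson-distributed organisms with mean `density` per test unit, each detected independently with probability `detection_prop`, plus a false positive rate (the parameter names are mine, not the paper's notation):

```python
from math import exp

def p_positive(density, detection_prop, false_pos_rate):
    """Probability that a qualitative test reads positive: a false
    positive, or at least one of the Poisson(density) organisms is
    detected (each independently with probability detection_prop,
    so the number detected is Poisson(detection_prop * density))."""
    return false_pos_rate + (1 - false_pos_rate) * (
        1 - exp(-detection_prop * density)
    )

# Non-identifiability: the positive probability depends on the two
# parameters only through their product, so these two very different
# settings are observationally indistinguishable.
a = p_positive(density=4.0, detection_prop=0.5, false_pos_rate=0.01)
b = p_positive(density=2.0, detection_prop=1.0, false_pos_rate=0.01)
print(f"{a:.6f} == {b:.6f}")
```

    This is exactly why only the product of detection proportion and density is estimable, and why the most probable number estimator must be reinterpreted accordingly.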

  7. Validation of an alternative microbiological method for tissue products.

    PubMed

    Suessner, Susanne; Hennerbichler, Simone; Schreiberhuber, Stefanie; Stuebl, Doris; Gabriel, Christian

    2014-06-01

    According to the European Pharmacopoeia, sterility testing of products requires an incubation time of 14 days in thioglycollate medium and soya-bean casein medium, so a large period of time is needed for product testing. We therefore designed a study to evaluate an alternative method for sterility testing. The aim of this study was to reduce the incubation time for the routinely produced products in our tissue bank (cornea and amnion grafts) while obtaining the same detection limit, accuracy and recovery rates as the reference method described in the European Pharmacopoeia. The study included two steps of validation. The primary validation compared the reference method with the alternative method: eight bacterial and two fungal test strains were tested at their preferred milieu, using a geometric dilution series from 10 to 0.625 colony-forming units per 10 ml of culture medium. Subsequent to this evaluation, the second part of the study validated the fertility of the culture media and tested the two methods in parallel by investigating products; for this purpose, two product batches were tested in three independent runs. In the validation we could not find any aberration between the alternative and the reference method, and the recovery rate of each microorganism was between 83.33 and 100 %. The alternative method showed non-inferiority to the reference method regarding accuracy. Based on this study, we reduced sterility testing for cornea and amniotic grafts to 9 days.

  8. Validation and verification of measurement methods in clinical chemistry.

    PubMed

    Theodorsson, Elvar

    2012-02-01

    The present overview of validation and verification procedures in clinical chemistry focuses on the use of harmonized concepts and nomenclature, fitness-for-purpose evaluations and procedures for minimizing overall measurement and diagnostic uncertainty. The need for mutually accepted validation procedures in all fields of bioanalysis becomes obvious when laboratories implement international accreditation and certification standards or their equivalents. The guide on bioanalytical method validation published by the US FDA in 2001 represents a sensible compromise between thoroughness and cost-effectiveness. In the absence of comprehensive international agreements, this document has also been successfully adapted in other fields of bioanalysis. European and international efforts aiming for consensus in the entire field of bioanalysis are currently being made. Manufacturers of highly automated in vitro diagnostic methods provide the majority of measurement methods used unmodified in clinical chemistry. Although validated by the manufacturers for their intended use and fitness-for-purpose, these methods need to be verified in the circumstances of the end-users. As yet, there is unfortunately no general agreement on the extent of the verification procedures needed.

  9. Model-Based Method for Sensor Validation

    NASA Technical Reports Server (NTRS)

    Vatan, Farrokh

    2012-01-01

    Fault detection, diagnosis, and prognosis are essential tasks in the operation of autonomous spacecraft, instruments, and in situ platforms. One of NASA's key mission requirements is robust state estimation. Sensing, using a wide range of sensors and sensor fusion approaches, plays a central role in robust state estimation, and there is a need to diagnose sensor failure as well as component failure. Sensor validation can be considered part of the larger effort of improving reliability and safety. The standard methods for solving the sensor validation problem are based on probabilistic analysis of the system, of which the method based on Bayesian networks is the most popular. However, these methods can only predict the most probable faulty sensors, and their predictions are subject to the initial probabilities defined for the failures. The method developed in this work takes a model-based approach and identifies the faulty sensors (if any) that can be logically inferred from the model of the system and the sensor readings (observations). It is also more suitable for systems in which it is hard, or even impossible, to find the probability functions of the system. The method starts with a new mathematical description of the problem and develops a very efficient and systematic algorithm for its solution. It builds on the concept of analytical redundancy relations (ARRs).

  10. Validation methods for flight crucial systems

    NASA Technical Reports Server (NTRS)

    Holt, H. M.

    1983-01-01

    Research to develop techniques that can aid in determining the reliability and performance of digital electronic fault-tolerant systems with a probability of catastrophic system failure on the order of 10(-9) at 10 hours is reviewed. The computer-aided reliability estimation program (CARE III) provides a general-purpose reliability analysis and design tool for fault-tolerant systems, featuring a large reduction in state size and a fault-handling model based on a probabilistic description of detection, isolation, and recovery mechanisms. The application of design proof techniques as part of the design and development of the software-implemented fault-tolerance computer is mentioned. Emulation techniques and experimental procedures are verified using specimens of fault-tolerant computers and the capabilities of the validation research laboratory, AIRLAB.

  11. Establishing the Content Validity of Tests Designed To Serve Multiple Purposes: Bridging Secondary-Postsecondary Mathematics.

    ERIC Educational Resources Information Center

    Burstein, Leigh; And Others

    A method is presented for determining the content validity of a series of secondary school mathematics tests. These tests are part of the Mathematics Diagnostic Testing Project (MDTP), a collaborative effort by California university systems to develop placement examinations and a means to document student preparation in mathematics. Content…

  12. Validation of analytical methods involved in dissolution assays: acceptance limits and decision methodologies.

    PubMed

    Rozet, E; Ziemons, E; Marini, R D; Boulanger, B; Hubert, Ph

    2012-11-02

    Dissolution tests are key elements to ensure continuing product quality and performance. The ultimate goal of these tests is to assure consistent product quality within a defined set of specification criteria. Validation of an analytical method aimed at assessing the dissolution profile of products or at verifying pharmacopoeial compliance should demonstrate that the method is able to correctly declare two dissolution profiles as similar, or drug products as compliant with respect to their specifications. It is essential to ensure that these analytical methods are fit for their purpose, and method validation is aimed at providing this guarantee. However, even the ICH Q2 guideline gives no information on how to decide whether a method under validation is valid for its final purpose or not. Are all the validation criteria needed to ensure that a Quality Control (QC) analytical method for a dissolution test is valid? What acceptance limits should be set on these criteria? How should one decide about a method's validity? These are the questions this work aims to answer. Focus is placed on complying with the current implementation of Quality by Design (QbD) principles in the pharmaceutical industry, in order to correctly define the Analytical Target Profile (ATP) of analytical methods involved in dissolution tests. Analytical method validation is then the natural demonstration that the developed methods are fit for their intended purpose, and no longer the unconsidered checklist approach still generally performed to complete the filing required to obtain product marketing authorization.

  13. Space Suit Joint Torque Measurement Method Validation

    NASA Technical Reports Server (NTRS)

    Valish, Dana; Eversley, Karina

    2012-01-01

    In 2009 and early 2010, a test method was developed and performed to quantify the torque required to manipulate joints in several existing operational and prototype space suits, in an effort to develop joint torque requirements appropriate for a new Constellation Program space suit system. The same test method was levied on the Constellation space suit contractors to verify that their suit designs met the requirements. However, because the original test was set up and conducted by a single test operator, there was some question as to whether the method was repeatable enough to be considered a standard verification method for Constellation or other future development programs. In order to validate the method itself, a representative subset of the previous test was repeated, using the same information that would be available to space suit contractors, but set up and conducted by someone not familiar with the previous test. The resultant data were compared using graphical and statistical analysis; the results indicated a significant variance in the values reported for a subset of the re-tested joints. Potential variables that could have affected the data were identified, and a third round of testing was conducted in an attempt to eliminate and/or quantify their effects. The results of the third test effort will be used to determine whether the proposed joint torque methodology can be applied to future space suit development contracts.

  14. Formal methods and digital systems validation for airborne systems

    NASA Technical Reports Server (NTRS)

    Rushby, John

    1993-01-01

    This report has been prepared to supplement a forthcoming chapter on formal methods in the FAA Digital Systems Validation Handbook. Its purpose is as follows: to outline the technical basis for formal methods in computer science; to explain the use of formal methods in the specification and verification of software and hardware requirements, designs, and implementations; to identify the benefits, weaknesses, and difficulties in applying these methods to digital systems used on board aircraft; and to suggest factors for consideration when formal methods are offered in support of certification. These latter factors assume the context for software development and assurance described in RTCA document DO-178B, 'Software Considerations in Airborne Systems and Equipment Certification,' Dec. 1992.

  15. A special purpose knowledge-based face localization method

    NASA Astrophysics Data System (ADS)

    Hassanat, Ahmad; Jassim, Sabah

    2008-04-01

    This paper is concerned with face localization for a visual speech recognition (VSR) system. Face detection and localization have received a great deal of attention in the last few years, because they are an essential pre-processing step in many techniques that handle or deal with faces (e.g. age, face, gender, race and visual speech recognition). We present an efficient method for localizing human faces in video images captured on constrained mobile devices, under a wide variation in lighting conditions. We use a multiphase method that may include all or some of the following steps: image pre-processing, followed by special-purpose edge detection, then an image refinement step. The output image is passed through a discrete wavelet decomposition procedure, and the computed LL sub-band at a certain level is transformed into a binary image that is scanned using a special template to select a number of candidate locations. Finally, we fuse the scores from the wavelet step with scores determined by color information for the candidate locations, and employ a form of fuzzy logic to distinguish face from non-face locations. We present the results of a large number of experiments to demonstrate that the proposed face localization method is efficient and achieves a level of accuracy that outperforms existing general-purpose face detection methods.

  16. Validation of the total organic carbon (TOC) swab sampling and test method.

    PubMed

    Glover, Chris

    2006-01-01

    For cleaning validation purposes, the combination of swab sampling and the total organic carbon (TOC) test method provides a useful mechanism for monitoring the cleanliness of equipment surfaces. The TOC test method is an ideal choice for monitoring carbon-containing residuals. The sample and test method validation ("TOC Swabbing Method Validation", BV-000-BC-078-01, Bayer Healthcare) demonstrated quantifiable recovery of albumin down to 25 microg. The validation characteristics included accuracy, repeatability and intermediate precision, specificity, linearity, assay range, detection and quantitation limits, and robustness.

  17. Key elements of bioanalytical method validation for small molecules.

    PubMed

    Bansal, Surendra; DeStefano, Anthony

    2007-03-30

    Method validation is a process that demonstrates that a method will successfully meet or exceed the minimum standards recommended in the Food and Drug Administration (FDA) guidance for accuracy, precision, selectivity, sensitivity, reproducibility, and stability. This article discusses the validation of bioanalytical methods for small molecules with emphasis on chromatographic techniques. We present current thinking on validation requirements as described in the current FDA Guidance and subsequent 2006 Bioanalytical Methods Validation Workshop white paper.

  18. Validation of Multilevel Constructs: Validation Methods and Empirical Findings for the EDI

    ERIC Educational Resources Information Center

    Forer, Barry; Zumbo, Bruno D.

    2011-01-01

    The purposes of this paper are to highlight the foundations of multilevel construct validation, describe two methodological approaches and associated analytic techniques, and then apply these approaches and techniques to the multilevel construct validation of a widely-used school readiness measure called the Early Development Instrument (EDI;…

  19. Some useful statistical methods for model validation.

    PubMed Central

    Marcus, A H; Elias, R W

    1998-01-01

    Although formal hypothesis tests provide a convenient framework for displaying the statistical results of empirical comparisons, standard tests should not be used without consideration of the underlying measurement error structure. As part of the validation process, predictions of individual blood lead concentrations from models with site-specific input parameters are often compared with blood lead concentrations measured in field studies that also report lead concentrations in environmental media (soil, dust, water, paint) as surrogates for exposure. Measurements of these environmental media are subject to several sources of variability, including temporal and spatial sampling, sample preparation and chemical analysis, and data entry or recording. Adjustments for measurement error must be made before statistical tests can be used to empirically compare environmental data with model predictions. This report illustrates the effect of measurement error correction using a real dataset of child blood lead concentrations for an undisclosed midwestern community. We illustrate both the apparent failure of some standard regression tests and the success of adjusting such tests for measurement error using the SIMEX (simulation-extrapolation) procedure. This procedure adds simulated measurement error to model predictions and then subtracts the total measurement error, analogous to the method of standard additions used by analytical chemists. PMID:9860913
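
The SIMEX idea is easy to demonstrate on a toy problem: attenuation of a regression slope caused by a noisy covariate. The data, error variance and quadratic extrapolant below are illustrative assumptions, not the blood lead data used in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y = 2*x + noise, but x is only observed through an
# error-prone surrogate w with known measurement error sigma_u.
n, true_slope, sigma_u = 2000, 2.0, 0.8
x = rng.normal(0.0, 1.0, n)
w = x + rng.normal(0.0, sigma_u, n)
y = true_slope * x + rng.normal(0.0, 0.5, n)

def slope(w, y):
    return np.polyfit(w, y, 1)[0]

naive = slope(w, y)           # attenuated toward zero by measurement error

# SIMEX: add extra simulated error with variance zeta * sigma_u**2, track
# the average slope as a function of zeta, then extrapolate the trend back
# to zeta = -1 (i.e. subtract the total measurement error).
zetas = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
means = [np.mean([slope(w + rng.normal(0.0, np.sqrt(z) * sigma_u, n), y)
                  for _ in range(200)])
         for z in zetas]

simex = np.polyval(np.polyfit(zetas, means, 2), -1.0)
print(round(naive, 2), round(simex, 2))   # simex lies closer to 2.0
```

The naive slope is biased toward zero; extrapolating the simulated-error trend back to ζ = −1 recovers most, though not all, of the attenuation with this simple quadratic extrapolant.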

  20. FDIR Strategy Validation with the B Method

    NASA Astrophysics Data System (ADS)

    Sabatier, D.; Dellandrea, B.; Chemouil, D.

    2008-08-01

    In a formation flying satellite system, the FDIR strategy (Failure Detection, Isolation and Recovery) is paramount. When a failure occurs, the satellites should be able to take appropriate reconfiguration actions to obtain the best possible outcome given the failure, ranging from avoiding satellite-to-satellite collision to continuing the mission without disturbance if possible. To achieve this goal, each satellite in the formation implements an FDIR strategy that governs how it detects failures (from tests or by deduction) and how it reacts (reconfiguration using redundant equipment, avoidance manoeuvres, etc.). The goal is to protect the satellites first and the mission as much as possible. In a project initiated by CNES, ClearSy is experimenting with the B Method to validate the FDIR strategies, developed by Thales Alenia Space, of the inter-satellite positioning and communication devices that will be used for the SIMBOL-X (two-satellite configuration) and PEGASE (three-satellite configuration) missions, and potentially for other missions afterward. These radio-frequency metrology sensor devices provide satellite positioning and inter-satellite communication in formation flying. This article presents the results of this experiment.

  1. OWL-based reasoning methods for validating archetypes.

    PubMed

    Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás

    2013-04-01

    Some modern Electronic Healthcare Record (EHR) architectures and standards are based on the dual model-based architecture, which defines two conceptual levels: the reference model and the archetype model. Such architectures represent EHR domain knowledge by means of archetypes, which are considered by many researchers to play a fundamental role in achieving semantic interoperability in healthcare. Consequently, formal methods for validating archetypes are necessary. In recent years, there has been increasing interest in exploring how semantic web technologies in general, and ontologies in particular, can facilitate the representation and management of archetypes, including binding to terminologies, but no solution based on such technologies has been provided to date to validate archetypes. Our approach represents archetypes by means of OWL ontologies. This makes it possible to combine the two levels of the dual model-based architecture in one modeling framework, which can also integrate terminologies available in OWL format. The validation method consists of reasoning on those ontologies to find modeling errors in archetypes: incorrect restrictions over the reference model, non-conformant archetype specializations and inconsistent terminological bindings. The archetypes available in the repositories supported by the openEHR Foundation and the NHS Connecting for Health Program, the two largest publicly available repositories, have been analyzed with our validation method. For this purpose, we have implemented a software tool called Archeck. Our results show that around one fifth of archetype specializations contain modeling errors, the most common mistakes being related to coded terms and terminological bindings. The analysis also reveals that different patterns of errors are found in the two repositories. This result reinforces the need for serious efforts to improve archetype design processes.

  2. Methods for developing and validating survivability distributions

    SciTech Connect

    Williams, R.L.

    1993-10-01

    A previous report explored and discussed statistical methods and procedures that may be applied to validate the survivability of a complex system of systems that cannot be tested as an entity. It described a methodology where Monte Carlo simulation was used to develop the system survivability distribution from the component distributions using a system model that registers the logical interactions of the components to perform system functions. This paper discusses methods that can be used to develop the required survivability distributions based upon three sources of knowledge. These are (1) available test results; (2) little or no available test data, but a good understanding of the physical laws and phenomena which can be applied by computer simulation; and (3) neither test data nor adequate knowledge of the physics are known, in which case, one must rely upon, and quantify, the judgement of experts. This paper describes the relationship between the confidence bounds that can be placed on survivability and the number of tests conducted. It discusses the procedure for developing system level survivability distributions from the distributions for lower levels of integration. It demonstrates application of these techniques by defining a communications network for a Hypothetical System Architecture. A logic model for the performance of this communications network is developed, as well as the survivability distributions for the nodes and links based on two alternate data sets, reflecting the effects of increased testing of all elements. It then shows how this additional testing could be optimized by concentrating only on those elements contained in the low-order fault sets which the methodology identifies.
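
The approach described above (propagate component survivability uncertainty through a logic model by Monte Carlo) can be sketched as follows. The network, the test counts and the Beta model for test-based uncertainty are all hypothetical stand-ins, not the report's actual Hypothetical System Architecture:

```python
import random, statistics

rng = random.Random(42)

# Hypothetical communications network: both nodes must survive, plus at
# least one of two redundant links.  Test results (survivals, failures)
# are invented; Beta(s+1, f+1) encodes the test-based uncertainty in each
# component's survivability.
components = {
    "node_A": (19, 1),
    "node_B": (18, 2),
    "link_1": (9, 1),
    "link_2": (8, 2),
}

def system_survivability(p):
    """Logic model registering how components combine into the function."""
    link_or = 1.0 - (1.0 - p["link_1"]) * (1.0 - p["link_2"])
    return p["node_A"] * p["node_B"] * link_or

draws = sorted(
    system_survivability(
        {name: rng.betavariate(s + 1, f + 1)
         for name, (s, f) in components.items()}
    )
    for _ in range(20000)
)
lower = draws[int(0.05 * len(draws))]       # one-sided 95% lower bound
median = statistics.median(draws)
print(f"median {median:.3f}, 95% lower bound {lower:.3f}")
```

More testing tightens the component Beta distributions and hence the lower confidence bound on system survivability, which is how the methodology identifies where additional tests pay off.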

  3. Validation of a Cost-Efficient Multi-Purpose SNP Panel for Disease Based Research

    PubMed Central

    Hou, Liping; Phillips, Christopher; Azaro, Marco; Brzustowicz, Linda M.; Bartlett, Christopher W.

    2011-01-01

    Background: Here we present convergent methodologies, using theoretical calculations, empirical assessment on in-house and publicly available datasets, and in silico simulations, that validate a panel of SNPs for a variety of necessary tasks in human genetics disease research before resources are committed to larger-scale genotyping studies on those samples. While large-scale, well-funded human genetic studies routinely have up to a million SNP genotypes, samples in a human genetics laboratory that are not yet part of such studies may be productively utilized in pilot projects or as part of targeted follow-up work. Such smaller-scale applications require at least some genome-wide genotype data for quality-control purposes, such as DNA "barcoding" to detect swaps or contamination, determining familial relationships between samples, and correcting biases due to population effects such as population stratification in pilot studies. Principal Findings: Empirical performance in classifying the relationship type for any two given DNA samples (e.g., full siblings, parental, etc.) indicated that for outbred populations the panel performs well enough to classify relationships in extended families, and therefore also in smaller structures such as trios and for twin zygosity testing. Additionally, familial relationships do not significantly diminish the (mean match) probability of sharing SNP genotypes in pedigrees, further indicating the uniqueness of the "barcode." Simulation using these SNPs for an African American case-control disease association study demonstrated that population stratification, even in complex admixed samples, can be adequately corrected under a range of disease models using the SNP panel. Conclusion: The panel has been validated for use in a variety of human disease genetics research tasks, including sample barcoding, relationship verification, population substructure detection and statistical correction. Given the ease of genotyping
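
The "barcode" uniqueness argument rests on the random match probability of the panel, which under Hardy-Weinberg equilibrium is a simple product over SNPs. The panel size and allele frequencies below are invented for illustration:

```python
import random

rng = random.Random(7)

# Hypothetical panel: allele frequencies for 30 autosomal SNPs.
allele_freqs = [rng.uniform(0.3, 0.5) for _ in range(30)]

def genotype_freqs(p):
    """Genotype frequencies under Hardy-Weinberg equilibrium."""
    q = 1.0 - p
    return [p * p, 2.0 * p * q, q * q]

# Probability that two unrelated individuals share the entire genotype
# "barcode": product over SNPs of the per-SNP genotype match probability.
match_prob = 1.0
for p in allele_freqs:
    match_prob *= sum(f * f for f in genotype_freqs(p))

print(f"random match probability: {match_prob:.2e}")   # vanishingly small
```

Even this modest 30-SNP panel makes an accidental full-genotype match between unrelated samples astronomically unlikely, which is what makes such panels usable for detecting swaps and contamination.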

  4. Purpose and methods of a Pollution Prevention Awareness Program

    SciTech Connect

    Flowers, P.A.; Irwin, E.F.; Poligone, S.E.

    1994-08-15

    The purpose of the Pollution Prevention Awareness Program (PPAP), which is required by DOE Order 5400.1, is to foster the philosophy that prevention is superior to remediation. The goal of the program is to incorporate pollution prevention into the decision-making process at every level throughout the organization. The objectives are to instill awareness, disseminate information, provide training and rewards for identifying the true source or cause of wastes, and encourage employee participation in solving environmental issues and preventing pollution. PPAP at the Oak Ridge Y-12 Plant was created several years ago and continues to grow. We believe that we have implemented several unique methods of communicating environmental awareness to promote a more active work force in identifying ways of reducing pollution.

  5. Validation of Alternative In Vitro Methods to Animal Testing: Concepts, Challenges, Processes and Tools.

    PubMed

    Griesinger, Claudius; Desprez, Bertrand; Coecke, Sandra; Casey, Warren; Zuang, Valérie

    This chapter explores the concepts, processes, tools and challenges relating to the validation of alternative methods for toxicity and safety testing. In general terms, validation is the process of assessing the appropriateness and usefulness of a tool for its intended purpose. Validation is routinely used in various contexts in science, technology, and the manufacturing and services sectors. It serves to assess the fitness-for-purpose of devices, systems and software, up to entire methodologies. In the area of toxicity testing, validation plays an indispensable role: "alternative approaches" are increasingly replacing animal models as predictive tools, and it needs to be demonstrated that these novel methods are fit for purpose. Alternative approaches include in vitro test methods, non-testing approaches such as predictive computer models, and entire testing and assessment strategies composed of method suites, data sources and decision-aiding tools. Data generated with alternative approaches are ultimately used for decision-making on public health and the protection of the environment. It is therefore essential that the underlying methods and methodologies are thoroughly characterised, assessed and transparently documented through validation studies involving impartial actors. Importantly, validation serves as a filter to ensure that only test methods able to produce data that help to address legislative requirements (e.g. the EU's REACH legislation) are accepted as official testing tools and, owing to the globalisation of markets, recognised at the international level (e.g. through inclusion in OECD test guidelines). Since validation creates a credible and transparent evidence base on test methods, it provides a quality stamp, supporting companies developing and marketing alternative methods and creating considerable business opportunities. Validation of alternative methods is conducted through scientific studies assessing two key hypotheses, reliability and relevance of the

  6. Is it really necessary to validate an analytical method or not? That is the question.

    PubMed

    Rambla-Alegre, Maria; Esteve-Romero, Josep; Carda-Broch, Samuel

    2012-04-06

    Method validation is an important requirement in the practice of chemical analysis. However, awareness of its importance, why it should be done and when, and exactly what needs to be done, seems to be poor amongst analytical chemists. Much advice related to method validation already exists in the literature, especially for particular methods, but more often than not it is underused. Some analysts see method validation as something that can only be done in collaboration with other laboratories and therefore do not attempt it. In addition, analysts' understanding of method validation is inhibited by the fact that many of the technical terms used in evaluating methods vary across sectors of analytical measurement, both in their meaning and in the way they are determined. Validation applies to a defined protocol, for the determination of a specified analyte over a specified range of concentrations in a particular type of test material, used for a specified purpose. In general, validation should check that the method performs adequately for that purpose throughout the range of analyte concentrations and test materials to which it is applied. It follows that these features, together with a statement of any fitness-for-purpose criteria, should be completely specified before any validation takes place.

  7. Likelihood ratio data to report the validation of a forensic fingerprint evaluation method.

    PubMed

    Ramos, Daniel; Haraksim, Rudolf; Meuwly, Didier

    2017-02-01

    The data to which the authors refer throughout this article are likelihood ratios (LRs) computed from the comparison of fingermarks with 5-12 minutiae against fingerprints. These LR data are used for the validation of an LR method in forensic evidence evaluation. They present a necessary asset for conducting validation experiments when validating LR methods used in forensic evidence evaluation and for setting up validation reports. These data can also be used as a baseline for comparing fingermark evidence in the same minutiae configuration as presented in D. Meuwly, D. Ramos and R. Haraksim [1], although the reader should keep in mind that different feature extraction algorithms and different AFIS systems may produce different LR values. Moreover, these data may serve as a reproducibility exercise, in order to train the generation of validation reports of forensic methods, according to [1]. Alongside the data, a justification and motivation for the methods used is given. These methods calculate LRs from the fingerprint/mark data and are subject to a validation procedure. The choice of using real forensic fingerprints for validation and simulated data for development is described and justified. Validation criteria are set for the purpose of validating the LR methods, which are used to calculate the LR values from the data, and for the validation report. For privacy and data protection reasons, the original fingerprint/mark images cannot be shared; however, these images do not constitute the core data for the validation, unlike the LRs, which are shared.
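
A common quantitative criterion in validation reports for LR methods is the log-likelihood-ratio cost (Cllr), computed directly from same-source and different-source LR sets like those shared here. A minimal sketch (the example LR values are invented):

```python
import math

def cllr(lrs_same_source, lrs_diff_source):
    """Log-likelihood-ratio cost: a standard performance metric for
    LR-based evidence evaluation.  Lower is better; an uninformative
    system that always outputs LR = 1 scores exactly 1.0."""
    ss = sum(math.log2(1 + 1 / lr) for lr in lrs_same_source)
    ds = sum(math.log2(1 + lr) for lr in lrs_diff_source)
    return 0.5 * (ss / len(lrs_same_source) + ds / len(lrs_diff_source))

# Invented example LRs: same-source comparisons should yield LR >> 1,
# different-source comparisons LR << 1.
print(cllr([120.0, 45.0, 8.0], [0.02, 0.15, 0.4]))   # well below 1.0
print(cllr([1.0, 1.0], [1.0, 1.0]))                  # exactly 1.0
```

The metric penalizes strong LRs that point the wrong way much more heavily than weak ones, so it captures both discrimination and calibration of the method under validation.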

  8. [Validation and regulatory acceptance of alternative methods for toxicity evaluation].

    PubMed

    Ohno, Yasuo

    2004-01-01

    For regulatory acceptance of alternative methods (AMs) to animal toxicity tests, their reproducibility and relevance should be determined by intra- and inter-laboratory validation. Appropriate procedures for the validation and regulatory acceptance of AMs were recommended by the OECD in 1996. According to those principles, several in vitro methods, such as skin corrosivity tests and phototoxicity tests, were evaluated and accepted by ECVAM (the European Centre for the Validation of Alternative Methods), ICCVAM (the Interagency Coordinating Committee on the Validation of Alternative Methods), and the OECD. Because of the difficulties in conducting inter-laboratory validation, and the relatively short period remaining until the EU's ban on animal experiments for the safety evaluation of cosmetics, ECVAM and ICCVAM have recently started cooperating in the validation and evaluation of AMs. It is also necessary to establish JaCVAM (the Japanese Center for the Validation of Alternative Methods) to contribute to this issue and to the evaluation of new toxicity tests originating in Japan.

  9. The Value of Qualitative Methods in Social Validity Research

    ERIC Educational Resources Information Center

    Leko, Melinda M.

    2014-01-01

    One quality indicator of intervention research is the extent to which the intervention has a high degree of social validity, or practicality. In this study, I drew on Wolf's framework for social validity and used qualitative methods to ascertain five middle schoolteachers' perceptions of the social validity of System 44®--a phonics-based reading…

  10. Statistical Methods for Rapid Aerothermal Analysis and Design Technology: Validation

    NASA Technical Reports Server (NTRS)

    DePriest, Douglas; Morgan, Carolyn

    2003-01-01

    The cost and safety goals for NASA's next generation of reusable launch vehicles (RLVs) will require that rapid, high-fidelity aerothermodynamic design tools be used early in the design cycle. To meet these requirements, it is desirable to identify adequate statistical models that quantify and improve the accuracy, extend the applicability, and enable combined analyses using existing prediction tools. The initial research work focused on establishing suitable candidate models for these purposes. The second phase focused on assessing the performance of these models in accurately predicting the heat rate for a given candidate data set. This validation work compared models and methods that may be useful in predicting the heat rate.

  11. Purpose in Life in Emerging Adulthood: Development and Validation of a New Brief Measure.

    PubMed

    Hill, Patrick L; Edmonds, Grant W; Peterson, Missy; Luyckx, Koen; Andrews, Judy A

    2016-05-01

    Accruing evidence points to the value of studying purpose in life across adolescence and emerging adulthood. Research, however, is needed to understand the unique role of purpose in life in predicting well-being and developmentally relevant outcomes during emerging adulthood. The current studies (total n = 669) found support for the development of a new brief measure of purpose in life using data from American and Canadian samples, while demonstrating evidence for two important findings. First, purpose in life predicted well-being during emerging adulthood, even when controlling for the Big Five personality traits. Second, purpose in life was positively associated with self-image and negatively associated with delinquency, again controlling for personality traits. Findings are discussed with respect to how studying purpose in life can help identify which individuals are more likely to experience positive transitions into adulthood.

  12. Exploring valid and reliable assessment methods for care management education.

    PubMed

    Gennissen, Lokke; Stammen, Lorette; Bueno-de-Mesquita, Jolien; Wieringa, Sietse; Busari, Jamiu

    2016-07-04

    Purpose It is assumed that the use of valid and reliable assessment methods can facilitate the development of medical residents' management and leadership competencies. To justify this assertion, the perceptions of an expert panel of health care leaders were explored on assessment methods used for evaluating care management (CM) development in Dutch residency programs. This paper aims to investigate how assessors and trainees value these methods and examine for any inherent benefits or shortcomings when they are applied in practice. Design/methodology/approach A Delphi survey was conducted among members of the platform for medical leadership in The Netherlands. This panel of experts was made up of clinical educators, practitioners and residents interested in CM education. Findings Of the respondents, 40 (55.6 per cent) and 31 (43 per cent) participated in the first and second rounds of the Delphi survey, respectively. The respondents agreed that assessment methods currently being used to measure residents' CM competencies were weak, though feasible for use in many residency programs. Multi-source feedback (MSF, 92.1 per cent), portfolio/e-portfolio (86.8 per cent) and knowledge testing (76.3 per cent) were identified as the most commonly known assessment methods with familiarity rates exceeding 75 per cent. Practical implications The findings suggested that an "assessment framework" comprising MSF, portfolios, individual process improvement projects or self-reflections and observations in clinical practice should be used to measure CM competencies in residents. Originality/value This study reaffirms the need for objective methods to assess CM skills in post-graduate medical education, as there was not a single assessment method that stood out as the best instrument.

  13. Bioanalytical method validation: An updated review.

    PubMed

    Tiwari, Gaurav; Tiwari, Ruchi

    2010-10-01

    The development of sound bioanalytical method(s) is of paramount importance during the process of drug discovery and development, culminating in a marketing approval. The objective of this paper is to review the sample preparation of drug in biological matrix and to provide practical approaches for determining selectivity, specificity, limit of detection, lower limit of quantitation, linearity, range, accuracy, precision, recovery, stability, ruggedness, and robustness of liquid chromatographic methods to support pharmacokinetic (PK), toxicokinetic, bioavailability, and bioequivalence studies. Bioanalysis, employed for the quantitative determination of drugs and their metabolites in biological fluids, plays a significant role in the evaluation and interpretation of bioequivalence, PK, and toxicokinetic studies. Selective and sensitive analytical methods for quantitative evaluation of drugs and their metabolites are critical for the successful conduct of pre-clinical and/or biopharmaceutics and clinical pharmacology studies.
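    The accuracy and precision figures of merit discussed above are typically computed from quality-control (QC) replicates. The following minimal sketch illustrates the standard calculations; the concentration values, the nominal level, and the acceptance limits cited in the comment are hypothetical, and the actual limits come from the applicable guideline:

```python
from statistics import mean, stdev

def accuracy_precision(measured, nominal):
    """Per-level accuracy (% bias) and precision (% CV) from QC replicates.

    Typical guideline acceptance limits (e.g., within +/-15% bias and
    <=15% CV, relaxed to 20% at the LLOQ) are applied to these numbers;
    this sketch only computes them.
    """
    m = mean(measured)
    bias_pct = (m - nominal) / nominal * 100.0  # systematic deviation
    cv_pct = stdev(measured) / m * 100.0        # relative scatter
    return bias_pct, cv_pct

# Hypothetical QC replicates at a nominal 50 ng/mL level:
bias, cv = accuracy_precision([48.2, 51.0, 49.5, 50.8, 47.9], 50.0)
```

Repeating this at the low, mid, and high QC levels over several runs gives the within-run and between-run accuracy and precision reported in a validation package.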

  14. Bioanalytical method validation: An updated review

    PubMed Central

    Tiwari, Gaurav; Tiwari, Ruchi

    2010-01-01

    The development of sound bioanalytical method(s) is of paramount importance during the process of drug discovery and development, culminating in a marketing approval. The objective of this paper is to review the sample preparation of drug in biological matrix and to provide practical approaches for determining selectivity, specificity, limit of detection, lower limit of quantitation, linearity, range, accuracy, precision, recovery, stability, ruggedness, and robustness of liquid chromatographic methods to support pharmacokinetic (PK), toxicokinetic, bioavailability, and bioequivalence studies. Bioanalysis, employed for the quantitative determination of drugs and their metabolites in biological fluids, plays a significant role in the evaluation and interpretation of bioequivalence, PK, and toxicokinetic studies. Selective and sensitive analytical methods for quantitative evaluation of drugs and their metabolites are critical for the successful conduct of pre-clinical and/or biopharmaceutics and clinical pharmacology studies. PMID:23781413

  15. Modified cross-validation as a method for estimating parameter

    NASA Astrophysics Data System (ADS)

    Shi, Chye Rou; Adnan, Robiah

    2014-12-01

    Best subsets regression is an effective approach for identifying models that attain the objective with as few predictors as is prudent. Subset models may estimate the regression coefficients and predict future responses with smaller variance than the full model using all predictors. The question of how to pick the subset size λ depends on the bias-variance trade-off. There are various methods for picking the subset size λ; commonly, one picks the smallest model that minimizes an estimate of the expected prediction error. Since datasets are often small, repeated K-fold cross-validation is the most widely used method to estimate prediction error and select a model; the data are reshuffled and re-stratified before each round. However, the "one-standard-error" rule of repeated K-fold cross-validation always picks the most parsimonious model. The objective of this research is to modify the existing cross-validation method so as to avoid both overfitting and underfitting; a modified cross-validation method is proposed. This paper compares the existing and modified cross-validation methods. Our results indicate that the modified cross-validation method is better at submodel selection and evaluation than the other methods.
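    The "one-standard-error" rule criticized above can be sketched in a few lines: among candidate subset sizes, pick the smallest one whose mean cross-validation error is within one standard error of the overall minimum. The per-fold error values below are hypothetical illustrations, not data from the paper:

```python
from math import sqrt
from statistics import mean, stdev

def one_standard_error_rule(cv_errors):
    """Select a subset size by the one-standard-error rule.

    cv_errors maps subset size -> list of per-fold CV errors. The rule
    selects the *smallest* size whose mean error is within one standard
    error of the minimum mean error, favoring parsimonious models.
    """
    stats = {k: (mean(v), stdev(v) / sqrt(len(v))) for k, v in cv_errors.items()}
    best_mean, best_se = min(stats.values(), key=lambda t: t[0])
    threshold = best_mean + best_se
    return min(k for k, (m, _) in stats.items() if m <= threshold)

# Hypothetical repeated 5-fold CV errors for subset sizes 1..4:
errors = {
    1: [3.0, 3.2, 2.9, 3.1, 3.0],
    2: [2.1, 2.0, 2.2, 2.1, 1.9],
    3: [1.9, 2.2, 1.8, 2.1, 2.1],
    4: [2.0, 2.2, 1.9, 2.1, 2.0],
}
chosen = one_standard_error_rule(errors)
```

Here size 3 has the smallest mean error, but size 2 falls within one standard error of it, so the rule selects the smaller model; this built-in preference for parsimony is what the proposed modification aims to temper.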

  16. Method validation for chemical composition determination by electron microprobe with wavelength dispersive spectrometer

    NASA Astrophysics Data System (ADS)

    Herrera-Basurto, R.; Mercader-Trejo, F.; Muñoz-Madrigal, N.; Juárez-García, J. M.; Rodriguez-López, A.; Manzano-Ramírez, A.

    2016-07-01

    The main goal of method validation is to demonstrate that the method is suitable for its intended purpose. One advantage of analytical method validation is the level of confidence it provides in the measurement results reported for a specific objective. Elemental composition determination by wavelength dispersive spectrometer (WDS) microanalysis has been applied over extremely wide areas, mainly in the field of materials science and in impurity determinations in geological, biological and food samples. However, little information is reported about the validation of the applied methods. Herein, results of the in-house method validation for elemental composition determination by WDS are shown. SRM 482, a series of Cu-Au binary alloys of different compositions, was used during the validation protocol, following the recommendations for method validation proposed by Eurachem. This paper can serve as a reference for the evaluation of the validation parameters most frequently requested for accreditation under the requirements of the ISO/IEC 17025 standard: selectivity, limit of detection, linear interval, sensitivity, precision, trueness and uncertainty. A model for uncertainty estimation was proposed, including systematic and random errors. In addition, parameters evaluated during the validation process were also considered as part of the uncertainty model.
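    An uncertainty model combining systematic and random components, as described above, typically sums independent standard uncertainties in quadrature and multiplies by a coverage factor. A minimal GUM-style sketch follows; the component names and magnitudes in the budget are hypothetical, not the authors' actual budget:

```python
from math import sqrt

def combined_uncertainty(u_components, coverage_factor=2.0):
    """Combine independent standard uncertainty components in quadrature
    (root-sum-of-squares) and expand with a coverage factor (k = 2 for
    roughly 95% confidence)."""
    u_c = sqrt(sum(u * u for u in u_components.values()))
    return u_c, coverage_factor * u_c

# Hypothetical budget for a WDS mass-fraction result, in weight percent:
budget = {
    "repeatability": 0.12,        # random: scatter of replicate readings
    "reference_material": 0.08,   # systematic: certified-value uncertainty
    "calibration_drift": 0.05,    # systematic: beam-current drift
}
u_c, U = combined_uncertainty(budget)
```

Correlated components would instead require covariance terms; the quadrature sum above assumes independence.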

  17. Comparative assessment of bioanalytical method validation guidelines for pharmaceutical industry.

    PubMed

    Kadian, Naveen; Raju, Kanumuri Siva Rama; Rashid, Mamunur; Malik, Mohd Yaseen; Taneja, Isha; Wahajuddin, Muhammad

    2016-07-15

    The concepts, importance, and application of bioanalytical method validation have been discussed for a long time, and validation of bioanalytical methods is widely accepted as pivotal before they are taken into routine use. The United States Food and Drug Administration (USFDA) guidelines issued in 2001 have served as the reference for every guideline released since, be it that of the European Medicines Agency (EMA) in Europe, the National Health Surveillance Agency (ANVISA) in Brazil, the Ministry of Health, Labour and Welfare (MHLW) in Japan, or any other bioanalytical method validation guideline. After 12 years, the USFDA released its new draft guideline for comment in 2013, which covers the latest parameters and topics encountered in bioanalytical method validation and moves toward the harmonization of bioanalytical method validation across the globe. Even though the regulatory agencies are in general agreement, significant variations exist in acceptance criteria and methodology. The present review highlights the variations and similarities among, and provides a comparison of, the bioanalytical method validation guidelines issued by major regulatory authorities worldwide. Additionally, other evaluation parameters such as matrix effect and incurred sample reanalysis, including other stability aspects, are discussed to provide ease of access when designing a bioanalytical method and its validation in compliance with the majority of drug authority guidelines.

  18. Valid methods: the quality assurance of test method development, validation, approval, and transfer for veterinary testing laboratories.

    PubMed

    Wiegers, Ann L

    2003-07-01

    Third-party accreditation is a valuable tool to demonstrate a laboratory's competence to conduct testing. Accreditation, internationally and in the United States, has been discussed previously. However, accreditation is only one part of establishing data credibility. A validated test method is the first component of a valid measurement system. Validation is defined as confirmation by examination and the provision of objective evidence that the particular requirements for a specific intended use are fulfilled. The international and national standard ISO/IEC 17025 recognizes the importance of validated methods and requires that laboratory-developed methods or methods adopted by the laboratory be appropriate for the intended use. Validated methods are therefore required and their use agreed to by the client (i.e., end users of the test results such as veterinarians, animal health programs, and owners). ISO/IEC 17025 also requires that the introduction of methods developed by the laboratory for its own use be a planned activity conducted by qualified personnel with adequate resources. This article discusses considerations and recommendations for the conduct of veterinary diagnostic test method development, validation, evaluation, approval, and transfer to the user laboratory in the ISO/IEC 17025 environment. These recommendations are based on those of nationally and internationally accepted standards and guidelines, as well as those of reputable and experienced technical bodies. They are also based on the author's experience in the evaluation of method development and transfer projects, validation data, and the implementation of quality management systems in the area of method development.

  19. Methods for validating chronometry of computerized tests.

    PubMed

    Salmon, Joshua P; Jones, Stephanie A H; Wright, Chris P; Butler, Beverly C; Klein, Raymond M; Eskes, Gail A

    2017-03-01

    Determining the speed at which a task is performed (i.e., reaction time) can be a valuable tool in both research and clinical assessments. However, standard computer hardware employed for measuring reaction times (e.g., computer monitor, keyboard, or mouse) can add nonrepresentative noise to the data, potentially compromising the accuracy of measurements and the conclusions drawn from the data. Therefore, an assessment of the accuracy and precision of measurement should be included along with the development of computerized tests and assessment batteries that rely on reaction times as the dependent variable. This manuscript outlines three methods for assessing the temporal accuracy of reaction time data (one employing external chronometry). Using example data collected with the Dalhousie Computerized Attention Battery (DalCAB), we discuss the detection, measurement, and correction of nonrepresentative noise in reaction time measurement. The details presented in this manuscript should act as a cautionary tale to any researchers or clinicians who gather reaction time data but have not yet considered methods for verifying the internal chronometry of the software and/or hardware being used.
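    One way to use external chronometry of the kind described above is to pair software-recorded reaction times with reference timestamps from external instrumentation, then estimate the systematic offset (hardware latency, to be subtracted) and the added jitter. A minimal sketch under that assumption; the paired millisecond values are hypothetical, not DalCAB data:

```python
from statistics import mean, stdev

def characterize_latency(software_rts, external_rts):
    """Estimate the systematic offset and added noise of software-measured
    reaction times against an external chronometry reference.

    Returns (bias, jitter): the mean difference is a fixed latency that
    can be subtracted out; the standard deviation of the differences is
    irreducible timing noise added by the hardware/software chain.
    """
    diffs = [s - e for s, e in zip(software_rts, external_rts)]
    return mean(diffs), stdev(diffs)

# Hypothetical paired trials, in milliseconds:
software = [352.1, 410.7, 298.9, 371.4, 333.0]
external = [340.0, 399.5, 287.2, 359.0, 321.6]
bias, jitter = characterize_latency(software, external)
corrected = [rt - bias for rt in software]  # remove the fixed offset
```

The fixed offset is correctable after the fact; the jitter is not, so its magnitude bounds the precision claims one can make for the battery.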

  20. Unexpected but Most Welcome: Mixed Methods for the Validation and Revision of the Participatory Evaluation Measurement Instrument

    ERIC Educational Resources Information Center

    Daigneault, Pierre-Marc; Jacob, Steve

    2014-01-01

    Although combining methods is nothing new, more contributions about why and how to mix methods for validation purposes are needed. This article presents a case of validating the inferences drawn from the Participatory Evaluation Measurement Instrument, an instrument that purports to measure stakeholder participation in evaluation. Although the…

  1. Validation of population-based disease simulation models: a review of concepts and methods

    PubMed Central

    2010-01-01

    Background Computer simulation models are used increasingly to support public health research and policy, but questions about their quality persist. The purpose of this article is to review the principles and methods for validation of population-based disease simulation models. Methods We developed a comprehensive framework for validating population-based chronic disease simulation models and used this framework in a review of published model validation guidelines. Based on the review, we formulated a set of recommendations for gathering evidence of model credibility. Results Evidence of model credibility derives from examining: 1) the process of model development, 2) the performance of a model, and 3) the quality of decisions based on the model. Many important issues in model validation are insufficiently addressed by current guidelines. These issues include a detailed evaluation of different data sources, graphical representation of models, computer programming, model calibration, between-model comparisons, sensitivity analysis, and predictive validity. The role of external data in model validation depends on the purpose of the model (e.g., decision analysis versus prediction). More research is needed on the methods of comparing the quality of decisions based on different models. Conclusion As the role of simulation modeling in population health is increasing and models are becoming more complex, there is a need for further improvements in model validation methodology and common standards for evaluating model credibility. PMID:21087466

  2. Validation of analytic methods for biomarkers used in drug development.

    PubMed

    Chau, Cindy H; Rixe, Olivier; McLeod, Howard; Figg, William D

    2008-10-01

    The role of biomarkers in drug discovery and development has gained precedence over the years. As biomarkers become integrated into drug development and clinical trials, quality assurance and, in particular, assay validation become essential with the need to establish standardized guidelines for analytic methods used in biomarker measurements. New biomarkers can revolutionize both the development and use of therapeutics but are contingent on the establishment of a concrete validation process that addresses technology integration and method validation as well as regulatory pathways for efficient biomarker development. This perspective focuses on the general principles of the biomarker validation process with an emphasis on assay validation and the collaborative efforts undertaken by various sectors to promote the standardization of this procedure for efficient biomarker development.

  3. Recommendations on biomarker bioanalytical method validation by GCC.

    PubMed

    Hougton, Richard; Gouty, Dominique; Allinson, John; Green, Rachel; Losauro, Mike; Lowes, Steve; LeLacheur, Richard; Garofolo, Fabio; Couerbe, Philippe; Bronner, Stéphane; Struwe, Petra; Schiebl, Christine; Sangster, Timothy; Pattison, Colin; Islam, Rafiq; Garofolo, Wei; Pawula, Maria; Buonarati, Mike; Hayes, Roger; Cameron, Mark; Nicholson, Robert; Harman, Jake; Wieling, Jaap; De Boer, Theo; Reuschel, Scott; Cojocaru, Laura; Harter, Tammy; Malone, Michele; Nowatzke, William

    2012-10-01

    The 5th GCC in Barcelona (Spain) and 6th GCC in San Antonio (TX, USA) events provided a unique opportunity for CRO leaders to openly share opinions and perspectives, and to agree upon recommendations on biomarker bioanalytical method validation.

  4. Validity Argument for Assessing L2 Pragmatics in Interaction Using Mixed Methods

    ERIC Educational Resources Information Center

    Youn, Soo Jung

    2015-01-01

    This study investigates the validity of assessing L2 pragmatics in interaction using mixed methods, focusing on the evaluation inference. Open role-plays that are meaningful and relevant to the stakeholders in an English for Academic Purposes context were developed for classroom assessment. For meaningful score interpretations and accurate…

  5. A Model-Based Method for Content Validation of Automatically Generated Test Items

    ERIC Educational Resources Information Center

    Zhang, Xinxin; Gierl, Mark

    2016-01-01

    The purpose of this study is to describe a methodology to recover the item model used to generate multiple-choice test items with a novel graph theory approach. Beginning with the generated test items and working backward to recover the original item model provides a model-based method for validating the content used to automatically generate test…

  6. The external validity of results derived from ecstasy users recruited using purposive sampling strategies.

    PubMed

    Topp, Libby; Barker, Bridget; Degenhardt, Louisa

    2004-01-07

    This study sought to compare the patterns and correlates of 'recent' and 'regular' ecstasy use estimated on the basis of two datasets generated in 2001 in New South Wales, Australia, from a probability and a non-probability sample. The first was the National Drug Strategy Household Survey (NDSHS), a multistage probability sample of the general population; and the second was the Illicit Drug Reporting System (IDRS) Party Drugs Module, for which regular ecstasy users were recruited using purposive sampling strategies. NDSHS recent ecstasy users (any use in the preceding 12 months) were compared on a range of demographic and drug use variables to NDSHS regular ecstasy users (at least monthly use in the preceding 12 months) and purposively sampled regular ecstasy users (at least monthly use in the preceding 6 months). The demographic characteristics of the three samples were consistent. Among all three, the mean age was approximately 25 years, and a majority (60%) of subjects were male, relatively well-educated, and currently employed or studying. Patterns of ecstasy use were similar among the three samples, although compared to recent users, regular users were likely to report more frequent use of ecstasy. All samples were characterised by extensive polydrug use, although the two samples of regular ecstasy users reported higher rates of other illicit drug use than the sample of recent users. The similarities between the demographic and drug use characteristics of the samples are striking, and suggest that, at least in NSW, purposive sampling that seeks to draw from a wide cross-section of users and to sample a relatively large number of individuals, can give rise to samples of ecstasy users that may be considered sufficiently representative to reasonably warrant the drawing of inferences relating to the entire population. These findings may partially offset concerns that purposive samples of ecstasy users are likely to remain a primary source of ecstasy

  7. Development and validation of a new fallout transport method using variable spectral winds. Doctoral thesis

    SciTech Connect

    Hopkins, A.T.

    1984-09-01

    The purpose of this research was to develop and validate a fallout prediction method using variable transport calculations. The new method uses National Meteorological Center (NMC) spectral coefficients to compute wind vectors along the space- and time-varying trajectories of falling particles. The method was validated by comparing computed and actual cloud trajectories from a Mount St. Helens volcanic eruption and a high dust cloud. In summary, this research demonstrated the feasibility of using spectral coefficients for fallout transport calculations, developed a two-step smearing model to treat variable winds, and showed that uncertainties in spectral winds do not contribute significantly to the error in computed dose rate.

  8. International Harmonization and Cooperation in the Validation of Alternative Methods.

    PubMed

    Barroso, João; Ahn, Il Young; Caldeira, Cristiane; Carmichael, Paul L; Casey, Warren; Coecke, Sandra; Curren, Rodger; Desprez, Bertrand; Eskes, Chantra; Griesinger, Claudius; Guo, Jiabin; Hill, Erin; Roi, Annett Janusch; Kojima, Hajime; Li, Jin; Lim, Chae Hyung; Moura, Wlamir; Nishikawa, Akiyoshi; Park, HyeKyung; Peng, Shuangqing; Presgrave, Octavio; Singer, Tim; Sohn, Soo Jung; Westmoreland, Carl; Whelan, Maurice; Yang, Xingfen; Yang, Ying; Zuang, Valérie

    The development and validation of scientific alternatives to animal testing is important not only from an ethical perspective (implementation of the 3Rs), but also to improve safety-assessment decision making through the use of mechanistic information of higher relevance to humans. To be effective in these efforts, it is imperative that validation centres, industry, regulatory bodies, academia and other interested parties ensure strong international cooperation, cross-sector collaboration and intense communication in the design, execution, and peer review of validation studies. Such an approach is critical to achieving harmonized and more transparent approaches to method validation, peer review and recommendation, which will ultimately expedite the international acceptance of valid alternative methods or strategies by regulatory authorities and their implementation and use by stakeholders. It also achieves greater efficiency and effectiveness by avoiding duplication of effort and leveraging limited resources. With these goals in view, the International Cooperation on Alternative Test Methods (ICATM) was established in 2009 by validation centres from Europe, the USA, Canada and Japan. ICATM was later joined by Korea in 2011 and currently also counts Brazil and China as observers. This chapter describes the existing differences across world regions and the major efforts carried out to achieve consistent international cooperation and harmonization in the validation and adoption of alternative approaches to animal testing.

  9. Reliability and validity of optoelectronic method for biophotonical measurements

    NASA Astrophysics Data System (ADS)

    Karpienko, Katarzyna; Wróbel, Maciej S.; UrniaŻ, Rafał

    2013-11-01

    Reliability and validity of measurements are of utmost importance when assessing the measuring capability of instruments developed for research. In order to perform a legitimate experiment, the instruments used must be both reliable and valid. Reliability estimates the degree of precision of a measurement, the extent to which a measurement is internally consistent. Validity is the usefulness of an instrument for performing accurate measurements of the quantities it was designed to measure. A statistical analysis for reliability and validity control of a low-coherence interferometry method for refractive index measurements of biological fluids is presented. The low-coherence interferometer is sensitive to the optical path difference between interfering beams. This difference depends on the refractive index of the measured material. To assess the validity and reliability of the proposed method for blood measurements, a statistical analysis was performed on several substances with known refractive indices. The analysis of the low-coherence interferograms considered the mean distances between fringes. The statistical analysis for validity and reliability consisted of Grubbs' test for outliers, the Shapiro-Wilk test for normal distribution, Student's t-test, standard deviation, the coefficient of determination and the Pearson correlation. Overall, the tests demonstrated high statistical significance of the measurement method at a confidence level < 0.0001.
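    Of the statistics listed above, the Pearson correlation and the coefficient of determination are straightforward to compute from first principles. A minimal sketch follows; the refractive indices and mean fringe distances are hypothetical calibration values, not the authors' data:

```python
from math import sqrt
from statistics import mean

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two paired samples."""
    mx, my = mean(xs), mean(ys)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / sqrt(sxx * syy)

# Hypothetical calibration: known refractive indices of reference fluids
# vs. mean fringe distance (arbitrary units) from the interferograms:
n_known = [1.333, 1.360, 1.382, 1.401, 1.420]
fringe = [210.5, 206.1, 202.8, 199.9, 197.2]
r = pearson_r(n_known, fringe)
r_squared = r ** 2  # coefficient of determination
```

A strong (here negative) correlation between the known indices and the measured fringe spacing is what supports the claim that the instrument measures what it was designed to measure.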

  10. [Data validation methods and discussion on Chinese materia medica resource survey].

    PubMed

    Zhang, Yue; Ma, Wei-Feng; Zhang, Xiao-Bo; Zhu, Shou-Dong; Guo, Lan-Ping; Wang, Xing-Xing

    2013-07-01

    Since the beginning of the fourth national survey of Chinese materia medica resources, pilot surveys have been conducted in 22 provinces. The survey teams have reported an immense volume of data, which places very high demands on the construction of the database system. In order to ensure quality, it is necessary to check and validate the data in the database system. Data validation is an important method for ensuring the validity, integrity and accuracy of census data. This paper comprehensively introduces the data validation system of the database for the fourth national survey of Chinese materia medica resources, and further improves the design ideas and programs for data validation. The purpose of this study is to help the survey work proceed smoothly.
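    Validation of census records of this kind typically combines completeness checks (integrity) with range and consistency checks (accuracy). The sketch below illustrates the idea; the field names and coordinate bounds are hypothetical, not the survey database's actual schema or rules:

```python
def validate_record(record):
    """Return a list of validation errors for one survey record.

    Checks completeness of required fields, then plausibility of the
    coordinates against a rough bounding box for mainland China.
    """
    errors = []
    for field in ("species_name", "province", "latitude", "longitude"):
        if not record.get(field):
            errors.append(f"missing {field}")
    lat, lon = record.get("latitude"), record.get("longitude")
    if lat is not None and not (3.0 <= lat <= 54.0):
        errors.append("latitude out of plausible range")
    if lon is not None and not (73.0 <= lon <= 136.0):
        errors.append("longitude out of plausible range")
    return errors

ok = validate_record({"species_name": "Glycyrrhiza uralensis",
                      "province": "Gansu", "latitude": 36.1, "longitude": 103.8})
bad = validate_record({"species_name": "", "province": "Gansu",
                       "latitude": 60.2, "longitude": 103.8})
```

Running such rules on every submission, and rejecting or flagging records that return errors, is how a database system enforces the validity, integrity and accuracy goals described.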

  11. Testing and Validation of Computational Methods for Mass Spectrometry.

    PubMed

    Gatto, Laurent; Hansen, Kasper D; Hoopmann, Michael R; Hermjakob, Henning; Kohlbacher, Oliver; Beyer, Andreas

    2016-03-04

    High-throughput methods based on mass spectrometry (proteomics, metabolomics, lipidomics, etc.) produce a wealth of data that cannot be analyzed without computational methods. The impact of the choice of method on the overall result of a biological study is often underappreciated, but different methods can result in very different biological findings. It is thus essential to evaluate and compare the correctness and relative performance of computational methods. The volume of the data as well as the complexity of the algorithms render unbiased comparisons challenging. This paper discusses some problems and challenges in testing and validation of computational methods. We discuss the different types of data (simulated and experimental validation data) as well as different metrics to compare methods. We also introduce a new public repository for mass spectrometric reference data sets ( http://compms.org/RefData ) that contains a collection of publicly available data sets for performance evaluation for a wide range of different methods.

  12. Triangulation, Respondent Validation, and Democratic Participation in Mixed Methods Research

    ERIC Educational Resources Information Center

    Torrance, Harry

    2012-01-01

    Over the past 10 years or so the "Field" of "Mixed Methods Research" (MMR) has increasingly been exerting itself as something separate, novel, and significant, with some advocates claiming paradigmatic status. Triangulation is an important component of mixed methods designs. Triangulation has its origins in attempts to validate research findings…

  13. Adapting CEF-Descriptors for Rating Purposes: Validation by a Combined Rater Training and Scale Revision Approach

    ERIC Educational Resources Information Center

    Harsch, Claudia; Martin, Guido

    2012-01-01

    We explore how a local rating scale can be based on the Common European Framework CEF-proficiency scales. As part of the scale validation (Alderson, 1991; Lumley, 2002), we examine which adaptations are needed to turn CEF-proficiency descriptors into a rating scale for a local context, and to establish a practicable method to revise the initial…

  14. Quantitative assessment of gene expression network module-validation methods.

    PubMed

    Li, Bing; Zhang, Yingying; Yu, Yanan; Wang, Pengqian; Wang, Yongcheng; Wang, Zhong; Wang, Yongyan

    2015-10-16

    Validation of pluripotent modules in diverse networks holds enormous potential for systems biology and network pharmacology. An arising challenge is how to assess the accuracy of discovering all potential modules from multi-omic networks and validating their architectural characteristics based on innovative computational methods beyond function enrichment and biological validation. To display the framework progress in this domain, we systematically divided the existing Computational Validation Approaches based on Modular Architecture (CVAMA) into topology-based approaches (TBA) and statistics-based approaches (SBA). We compared the available module validation methods based on 11 gene expression datasets, and partially consistent results in the form of homogeneous models were obtained with each individual approach, whereas discrepant contradictory results were found between TBA and SBA. The TBA of the Zsummary value had a higher Validation Success Ratio (VSR) (51%) and a higher Fluctuation Ratio (FR) (80.92%), whereas the SBA of the approximately unbiased (AU) p-value had a lower VSR (12.3%) and a lower FR (45.84%). The Gray area simulated study revealed a consistent result for these two models and indicated a lower Variation Ratio (VR) (8.10%) of TBA at 6 simulated levels. Despite facing many novel challenges and evidence limitations, CVAMA may offer novel insights into modular networks.

  15. Beyond Correctness: Development and Validation of Concept-Based Categorical Scoring Rubrics for Diagnostic Purposes

    ERIC Educational Resources Information Center

    Arieli-Attali, Meirav; Liu, Ying

    2016-01-01

    Diagnostic assessment approaches intend to provide fine-grained reports of what students know and can do, focusing on their areas of strengths and weaknesses. However, current application of such diagnostic approaches is limited by the scoring method for item responses; important diagnostic information, such as type of errors and strategy use is…

  16. Validation of a previous day recall for measuring the location and purpose of active and sedentary behaviors compared to direct observation

    PubMed Central

    2014-01-01

    Purpose Gathering contextual information (i.e., location and purpose) about active and sedentary behaviors is an advantage of self-report tools such as previous day recalls (PDR). However, the validity of PDRs for measuring context has not been empirically tested. The purpose of this paper was to compare PDR estimates of location and purpose to direct observation (DO). Methods Fifteen adult (18–75 y) and 15 adolescent (12–17 y) participants were directly observed during at least one segment of the day (i.e., morning, afternoon or evening). Participants completed their normal daily routine while trained observers recorded the location (i.e., home, community, work/school), purpose (e.g., leisure, transportation) and whether the behavior was sedentary or active. The day following the observation, participants completed an unannounced PDR. Estimates of time in each context were compared between PDR and DO. Intra-class correlations (ICC), percent agreement and Kappa statistics were calculated. Results For adults, percent agreement was 85% or greater for each location and ICC values ranged from 0.71 to 0.96. The PDR-reported purposes of adults' behaviors were highly correlated with DO for household activities and work (ICCs of 0.84 and 0.88, respectively). Transportation was not significantly correlated with DO (ICC = -0.08). For adolescents, reported classification of activity location agreed 80.8% of the time or more. The ICCs for the purposes of adolescents' behaviors ranged from 0.46 to 0.78. Participants were most accurate in classifying the location and purpose of the behaviors in which they spent the most time. Conclusions This study suggests that adults and adolescents can accurately report where and why they spend time in behaviors using a PDR. This information on behavioral context is essential for translating the evidence for specific behavior-disease associations to health interventions and public policy. PMID:24490619
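    Two of the agreement statistics reported above, percent agreement and Cohen's kappa, can be computed directly from paired categorical codes. A minimal sketch under that assumption; the location labels below are hypothetical illustrations, not the study's data:

```python
def agreement_and_kappa(rater_a, rater_b):
    """Percent agreement and Cohen's kappa between two sets of
    categorical codes (here: PDR-reported vs. directly observed
    locations). Kappa corrects observed agreement for chance agreement
    expected from each method's marginal category frequencies.
    """
    n = len(rater_a)
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n  # observed
    cats = set(rater_a) | set(rater_b)
    p_e = sum((rater_a.count(c) / n) * (rater_b.count(c) / n)
              for c in cats)                                  # chance
    kappa = (p_o - p_e) / (1 - p_e)
    return 100 * p_o, kappa

# Hypothetical per-segment location codes from PDR and DO:
pdr = ["home", "home", "work", "community", "home", "work", "home", "community"]
do  = ["home", "home", "work", "home",      "home", "work", "home", "community"]
pct_agreement, kappa = agreement_and_kappa(pdr, do)
```

Percent agreement alone can look flattering when one category dominates; kappa's chance correction is why both are conventionally reported together.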

  17. Recommendations for Use and Fit-for-Purpose Validation of Biomarker Multiplex Ligand Binding Assays in Drug Development.

    PubMed

    Jani, Darshana; Allinson, John; Berisha, Flora; Cowan, Kyra J; Devanarayan, Viswanath; Gleason, Carol; Jeromin, Andreas; Keller, Steve; Khan, Masood U; Nowatzke, Bill; Rhyne, Paul; Stephen, Laurie

    2016-01-01

    Multiplex ligand binding assays (LBAs) are increasingly being used to support many stages of drug development. The complexity of multiplex assays creates many unique challenges in comparison to single-plexed assays leading to various adjustments for validation and potentially during sample analysis to accommodate all of the analytes being measured. This often requires a compromise in decision making with respect to choosing final assay conditions and acceptance criteria of some key assay parameters, depending on the intended use of the assay. The critical parameters that are impacted due to the added challenges associated with multiplexing include the minimum required dilution (MRD), quality control samples that span the range of all analytes being measured, quantitative ranges which can be compromised for certain targets, achieving parallelism for all analytes of interest, cross-talk across assays, freeze-thaw stability across analytes, among many others. Thus, these challenges also increase the complexity of validating the performance of the assay for its intended use. This paper describes the challenges encountered with multiplex LBAs, discusses the underlying causes, and provides solutions to help overcome these challenges. Finally, we provide recommendations on how to perform a fit-for-purpose-based validation, emphasizing issues that are unique to multiplex kit assays.

  18. Validation of three-dimensional Euler methods for vibrating cascade aerodynamics

    SciTech Connect

    Gerolymos, G.A.; Vallet, I.

    1996-10-01

    The purpose of this work is to validate a time-nonlinear three-dimensional Euler solver for vibrating cascade aerodynamics by comparison with available theoretical semi-analytical results for flat-plate cascades. First the method is validated with respect to the purely two-dimensional theory of Verdon (for supersonic flow) by computing two-dimensional vibration (spanwise constant) in linear three-dimensional cascades. Then the method is validated by comparison with the theoretical results of Namba and the computational results of He and Denton, for subsonic flow in a linear three-dimensional cascade with a three-dimensional vibratory mode. Finally the method is compared with results of Chi for two subsonic rotating annular cascades of helicoidal flat plates. Quite satisfactory agreement is obtained for all the cases studied. A first code-to-code comparison is also presented.

  19. The Relationship between Method and Validity in Social Science Research.

    ERIC Educational Resources Information Center

    MacKinnon, David; And Others

    An endless debate in social science research focuses on whether or not there is a philosophical basis for justifying the application of scientific methods to social inquiry. A review of the philosophies of various scholars in the field indicates that there is no single procedure for arriving at a valid statement in a scientific inquiry. Natural…

  20. ePortfolios: The Method of Choice for Validation

    ERIC Educational Resources Information Center

    Scott, Ken; Kim, Jichul

    2015-01-01

    Community colleges have long been institutions of higher education in the arenas of technical education and training, as well as preparing students for transfer to universities. While students are engaged in their student learning outcomes, projects, research, and community service, how have these students validated their work? One method of…

  1. Youden test application in robustness assays during method validation.

    PubMed

    Karageorgou, Eftichia; Samanidou, Victoria

    2014-08-01

    Analytical method validation is a vital step following method development for ensuring reliable and accurate method performance. Among the examined figures of merit, the robustness/ruggedness study allows us to test the performance characteristics of the analytical process when operating conditions are altered, either deliberately or not. This study yields useful information and is a fundamental part of method validation. Since many experiments are required, this step is highly demanding in time and consumables. To avoid the difficult task of performing too many experiments, the Youden test, which makes use of fractional factorial designs, has proved to be a very effective approach. The main advantage of the Youden test is that it keeps the required time and effort to a minimum, since only a limited number of determinations have to be made, using combinations of the chosen investigated factors. Typical applications of this robustness test found in the literature, covering a wide variety of sample matrices, are briefly discussed in this review.
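    The Youden (Youden-Steiner) design referred to above screens seven factors in only eight runs; each factor's effect is estimated as the difference between the mean responses at its high and low levels. A minimal sketch with hypothetical assay responses:

```python
# Youden-Steiner ruggedness design: 7 factors, two levels each, in only
# 8 experiments. Each factor effect is the difference between the mean
# response at the high (+1) and low (-1) level (4 runs each).

DESIGN = [  # rows = experiments, columns = factors A..G
    [+1, +1, +1, +1, +1, +1, +1],
    [+1, +1, -1, +1, -1, -1, -1],
    [+1, -1, +1, -1, +1, -1, -1],
    [+1, -1, -1, -1, -1, +1, +1],
    [-1, +1, +1, -1, -1, +1, -1],
    [-1, +1, -1, -1, +1, -1, +1],
    [-1, -1, +1, +1, -1, -1, +1],
    [-1, -1, -1, +1, +1, +1, -1],
]

def youden_effects(results):
    effects = []
    for j in range(7):
        hi = [r for row, r in zip(DESIGN, results) if row[j] == +1]
        lo = [r for row, r in zip(DESIGN, results) if row[j] == -1]
        effects.append(sum(hi) / 4.0 - sum(lo) / 4.0)
    return effects

# Hypothetical assay responses (e.g. % recovery) for the 8 runs:
responses = [99.8, 100.1, 99.9, 100.2, 99.7, 100.0, 99.8, 100.1]
effects = youden_effects(responses)  # one effect estimate per factor
```

Effects large relative to the method's precision flag factors whose variation the method does not tolerate.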

  2. Cleaning validation of ofloxacin on pharmaceutical manufacturing equipment and validation of desired HPLC method.

    PubMed

    Arayne, M Saeed; Sultana, Najma; Sajid, S Shahnawaz; Ali, S Shahid

    2008-01-01

    Inadequate cleaning of a pharmaceutical manufacturing plant or inadequate purging of the individual pieces of equipment used in multi-product manufacturing, or of equipment not dedicated to individual products, may lead to contamination of the next batch of pharmaceuticals manufactured using the same equipment. Challenges for cleaning validation are encountered especially when developing sensitive analytical methods capable of detecting traces of active pharmaceutical ingredients that are likely to remain on the surface of the pharmaceutical equipment after cleaning. A method's inability to detect some residuals could mean that either the method is not sensitive enough to the residue in question or the sampling procedure is inadequate. A sensitive and reproducible reversed-phase, high-performance liquid chromatographic method was developed for the determination of ofloxacin in swab samples. The method for determining ofloxacin residues on manufacturing equipment surfaces was validated with regard to precision, linearity, accuracy, specificity, limit of quantification, and percent recovery from the equipment surface, as well as the stability of a potential contaminant in a cleaning validation process. The active compound was selectively quantified in a sample matrix and swab material in amounts as low as 0.55 ng/mL. The swabbing procedure using cotton swabs was validated. A mean recovery from stainless steel plate of close to 85% was obtained. Chromatography was carried out on a pre-packed Merck (Darmstadt, Germany) Lichrospher model 100 RP-18 (5.0 microm, 250 mm x 4.0 mm) column using a mixture of sodium lauryl sulfate (0.024% aqueous solution), acetonitrile, and glacial acetic acid (500:480:20, v/v) as the mobile phase at a flow rate of 1.5 mL/min with a column temperature of 35 degrees C and 294 nm detection. The assay was linear over the concentration range of 2 ng/mL to 2000 ng/mL (R approximately 0.99998). The method was validated for accuracy and precision.

  3. Validation of cleaning method for various parts fabricated at a Beryllium facility

    SciTech Connect

    Davis, Cynthia M.

    2015-12-15

    This study evaluated and documented a cleaning process that is used to clean parts that are fabricated at a beryllium facility at Los Alamos National Laboratory. The purpose of evaluating this cleaning process was to validate and approve it for future use to assure beryllium surface levels are below the Department of Energy’s release limits without the need to sample all parts leaving the facility. Inhaling or coming in contact with beryllium can cause an immune response that can result in an individual becoming sensitized to beryllium, which can then lead to a disease of the lungs called chronic beryllium disease, and possibly lung cancer. Thirty aluminum and thirty stainless steel parts were fabricated on a lathe in the beryllium facility, as well as thirty-two beryllium parts, for the purpose of testing a parts cleaning method that involved the use of ultrasonic cleaners. A cleaning method was created, documented, validated, and approved, to reduce beryllium contamination.

  4. How to develop and validate a total organic carbon method for cleaning applications.

    PubMed

    Clark, K

    2001-01-01

    Good Manufacturing Practices require that the cleaning of drug manufacturing equipment be validated. Common analytical techniques used in the validation process include HPLC, UV/Vis, and Total Organic Carbon (TOC). HPLC and UV/Vis are classified as specific methods that identify and measure appropriate active substances. TOC is classified as a non-specific method and can detect all carbon-containing compounds, including active substances, excipients, and cleaning agents. The disadvantage of specific methods is that a new procedure must be developed for every active drug substance that is manufactured. This development process can be very time consuming and tedious. In contrast, one TOC method can potentially be used for all products. A TOC method is sensitive to the ppb range and is less time consuming than HPLC or UV/Vis. USP TOC methods are standard for Water for Injection and Purified Water, and simple modifications of these methods can be used for cleaning validation. The purpose of this study is to demonstrate how to develop and validate a TOC method for cleaning applications. Performance parameters evaluated in this study include linearity, MDL, LOQ, accuracy, precision, and swab recovery.

  5. The Bland-Altman Method Should Not Be Used in Regression Cross-Validation Studies

    ERIC Educational Resources Information Center

    O'Connor, Daniel P.; Mahar, Matthew T.; Laughlin, Mitzi S.; Jackson, Andrew S.

    2011-01-01

    The purpose of this study was to demonstrate the bias in the Bland-Altman (BA) limits of agreement method when it is used to validate regression models. Data from 1,158 men were used to develop three regression equations to estimate maximum oxygen uptake (R[superscript 2] = 0.40, 0.61, and 0.82, respectively). The equations were evaluated in a…
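    For reference, the Bland-Altman limits of agreement criticized in this study are simply the mean of the paired differences plus or minus 1.96 standard deviations. A minimal sketch with hypothetical VO2max values (the study's point is that these limits become biased when one variable is a regression estimate):

```python
import math

# Bland-Altman limits of agreement: mean difference +/- 1.96 SD of the
# paired differences. Values below are illustrative, not the study data.

def bland_altman(measured, predicted):
    diffs = [m - p for m, p in zip(measured, predicted)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical maximum oxygen uptake values (mL/kg/min):
vo2_measured  = [42.1, 38.5, 51.0, 45.2, 39.9, 47.3]
vo2_predicted = [41.0, 39.8, 49.5, 46.0, 38.7, 46.1]
bias, lower, upper = bland_altman(vo2_measured, vo2_predicted)
```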

  6. LC-MS quantification of protein drugs: validating protein LC-MS methods with predigestion immunocapture.

    PubMed

    Duggan, Jeffrey; Ren, Bailuo; Mao, Yan; Chen, Lin-Zhi; Philip, Elsy

    2016-09-01

    A refinement of protein LC-MS bioanalysis is to use predigestion immunoaffinity capture to extract the drug from matrix prior to digestion. Because of their increased sensitivity, such hybrid assays have been successfully validated and applied to a number of clinical studies; however, they can also be subject to potential interferences from antidrug antibodies, circulating ligands or other matrix components specific to patient populations and/or dosed subjects. The purpose of this paper is to describe validation experiments that measure immunocapture efficiency, digestion efficiency, matrix effect and selectivity/specificity that can be used during method optimization and validation to test the resistance of the method to these potential interferences. The designs and benefits of these experiments are discussed in this report using an actual assay case study.

  7. An individual and dynamic Body Segment Inertial Parameter validation method using ground reaction forces.

    PubMed

    Hansen, Clint; Venture, Gentiane; Rezzoug, Nasser; Gorce, Philippe; Isableu, Brice

    2014-05-07

    Over the last decades, a variety of research has been conducted with the goal of improving Body Segment Inertial Parameter (BSIP) estimations, but to our knowledge a real validation has never been completely successful, because no ground truth is available. The aim of this paper is to propose a validation method for a BSIP identification method (IM) and to confirm the results by comparing contact forces recalculated through inverse dynamics with those obtained from a force plate. Furthermore, the results are compared with the estimation method recently proposed by Dumas et al. (2007). Additionally, the results are cross-validated with a high-velocity overarm throwing movement. Across all conditions, higher correlations, smaller error metrics and smaller RMSE values are found for the proposed BSIP estimation (IM), which shows its advantage over recently proposed methods such as that of Dumas et al. (2007). The purpose of the paper is to validate an already proposed method and to show that this method can be of significant advantage compared to conventional methods.

  8. Methods for Geometric Data Validation of 3d City Models

    NASA Astrophysics Data System (ADS)

    Wagner, D.; Alam, N.; Wewetzer, M.; Pries, M.; Coors, V.

    2015-12-01

    Geometric quality of 3D city models is crucial for data analysis and simulation tasks, which are part of modern applications of the data (e.g. potential heating energy consumption of city quarters, solar potential, etc.). Geometric quality in these contexts is, however, a different concept than it is for 2D maps. In the latter case, aspects such as positional or temporal accuracy and correctness represent typical quality metrics of the data. They are defined in ISO 19157 and should be mentioned as part of the metadata. 3D data has a far wider range of aspects which influence its quality, and the idea of quality itself is application dependent. Thus, concepts for the definition of quality are needed, including methods to validate these definitions. Quality in this sense means internal validation and detection of inconsistent or wrong geometry according to a predefined set of rules. A useful starting point would be to have correct geometry in accordance with ISO 19107. A valid solid should consist of planar faces which touch their neighbours exclusively in defined corner points and edges. No gaps between them are allowed, and the whole feature must be 2-manifold. In this paper, we present methods to validate common geometric requirements for building geometry. Different checks based on several algorithms have been implemented to validate a set of rules derived from the solid definition mentioned above (e.g. water tightness of the solid or planarity of its polygons), as they were developed for the software tool CityDoctor. The method of each check is specified, with a special focus on the discussion of tolerance values where they are necessary. The checks include polygon-level checks to validate the correctness of each polygon, i.e. closure of the bounding linear ring and planarity. On the solid level, which is only validated if the polygons have passed validation, correct polygon orientation is checked, after self-intersections outside of defined corner points and edges
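    Two of the polygon-level checks mentioned above, closure of the bounding linear ring and planarity, can be sketched as follows. This is a simplified illustration, not CityDoctor's implementation, and the tolerance value is illustrative:

```python
# Minimal polygon-level checks for 3D building geometry: ring closure and
# planarity within a tolerance. Simplified sketch; the tolerance default
# is illustrative, not a CityDoctor setting.

def ring_closed(ring):
    return ring[0] == ring[-1]

def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])

def is_planar(ring, tol=1e-6):
    # Plane through the first three vertices; every other vertex must lie
    # within tol of that plane.
    p0, p1, p2 = ring[0], ring[1], ring[2]
    n = cross(sub(p1, p0), sub(p2, p0))
    norm = dot(n, n) ** 0.5
    if norm == 0:
        return False  # degenerate: first three vertices are collinear
    return all(abs(dot(sub(p, p0), n)) / norm <= tol for p in ring[3:])

square = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0), (0, 0, 0)]
bent   = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0.5), (0, 0, 0)]
```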

  9. Validity of body composition methods across ethnic population groups.

    PubMed

    Deurenberg, P; Deurenberg-Yap, M

    2003-10-01

    Most in vivo body composition methods rely on assumptions that may vary among different population groups as well as within the same population group. The assumptions are based on in vitro body composition (carcass) analyses. The majority of body composition studies were performed on Caucasians, and much of the information on the validity of methods and assumptions was available only for this ethnic group. It is assumed that these assumptions are also valid for other ethnic groups. However, if apparent differences across ethnic groups in body composition 'constants' and body composition 'rules' are not taken into account, biased information on body composition will be the result. This in turn may lead to misclassification of obesity or underweight at an individual as well as a population level. There is a need for more cross-ethnic population studies on body composition. Those studies should be carried out carefully, with adequate methodology and standardization, for the obtained information to be valuable.

  10. Validation of chemistry models employed in a particle simulation method

    NASA Technical Reports Server (NTRS)

    Haas, Brian L.; Mcdonald, Jeffrey D.

    1991-01-01

    The chemistry models employed in a statistical particle simulation method, as implemented on the Intel iPSC/860 multiprocessor computer, are validated and applied. Chemical relaxation of five-species air in adiabatic gas reservoirs involves 34 simultaneous dissociation, recombination, and atomic-exchange reactions. The reaction rates employed in the analytic solutions are obtained from Arrhenius experimental correlations as functions of temperature for adiabatic gas reservoirs in thermal equilibrium. Favorable agreement with the analytic solutions validates the simulation when applied to relaxation of O2 toward equilibrium in reservoirs dominated by dissociation and recombination, respectively, and when applied to relaxation of air in the temperature range 5000 to 30,000 K. A flow of O2 over a circular cylinder at high Mach number is simulated to demonstrate application of the method to multidimensional reactive flows.
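    The Arrhenius correlations mentioned above give each reaction rate coefficient as a function of temperature, k(T) = A·T^b·exp(-Ea/(kB·T)). A minimal sketch with hypothetical, O2-dissociation-like constants (not the paper's values):

```python
import math

# Arrhenius-form rate coefficient k(T) = A * T**b * exp(-Ea / (kB * T)),
# the kind of experimental correlation the simulation's reaction rates are
# drawn from. All constants below are illustrative, not the paper's.

KB = 1.380649e-23  # Boltzmann constant, J/K

def arrhenius(T, A, b, Ea):
    return A * T**b * math.exp(-Ea / (KB * T))

# Hypothetical dissociation-like parameters (Ea ~ 59,500 K times kB):
A, b, Ea = 1.0e-8, -1.5, 8.2e-19
k_low  = arrhenius(5000.0, A, b, Ea)    # slow at the low end of the range
k_high = arrhenius(30000.0, A, b, Ea)   # much faster at 30,000 K
```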

  11. Validating silicon polytrodes with paired juxtacellular recordings: method and dataset

    PubMed Central

    Lopes, Gonçalo; Frazão, João; Nogueira, Joana; Lacerda, Pedro; Baião, Pedro; Aarts, Arno; Andrei, Alexandru; Musa, Silke; Fortunato, Elvira; Barquinha, Pedro; Kampff, Adam R.

    2016-01-01

    Cross-validating new methods for recording neural activity is necessary to accurately interpret and compare the signals they measure. Here we describe a procedure for precisely aligning two probes for in vivo “paired-recordings” such that the spiking activity of a single neuron is monitored with both a dense extracellular silicon polytrode and a juxtacellular micropipette. Our new method allows for efficient, reliable, and automated guidance of both probes to the same neural structure with micrometer resolution. We also describe a new dataset of paired-recordings, which is available online. We propose that our novel targeting system, and ever expanding cross-validation dataset, will be vital to the development of new algorithms for automatically detecting/sorting single-units, characterizing new electrode materials/designs, and resolving nagging questions regarding the origin and nature of extracellular neural signals. PMID:27306671

  12. Method development and validation for pharmaceutical tablets analysis using transmission Raman spectroscopy.

    PubMed

    Li, Yi; Igne, Benoît; Drennen, James K; Anderson, Carl A

    2016-02-10

    The objective of the study is to demonstrate the development and validation of a transmission Raman spectroscopic method using the ICH-Q2 Guidance as a template. Specifically, Raman spectroscopy was used to determine niacinamide content in tablet cores. A 3-level, 2-factor full factorial design was utilized to generate a partial least-squares model for active pharmaceutical ingredient quantification. Validation of the transmission Raman model was focused on figures of merit from three independent batches manufactured at pilot scale. The resultant model statistics were evaluated along with the linearity, accuracy, precision and robustness assessments. Method specificity was demonstrated by accurate determination of niacinamide in the presence of niacin (an expected related substance). The method was demonstrated as fit for purpose and had the desirable characteristics of very short analysis times (∼2.5s per tablet). The resulting method was used for routine content uniformity analysis of single dosage units in a stability study.

  13. Prognostics of Power Electronics, Methods and Validation Experiments

    NASA Technical Reports Server (NTRS)

    Kulkarni, Chetan S.; Celaya, Jose R.; Biswas, Gautam; Goebel, Kai

    2012-01-01

    Failure of electronic devices is a concern for future electric aircraft that will see an increase of electronics to drive and control safety-critical equipment throughout the aircraft. As a result, investigation of precursors to failure in electronics and prediction of the remaining life of electronic components are of key importance. DC-DC power converters are power electronics systems employed typically as sourcing elements for avionics equipment. Current research efforts in prognostics for these power systems focus on the identification of failure mechanisms and the development of accelerated aging methodologies and systems to accelerate the aging process of test devices, while continuously measuring key electrical and thermal parameters. Preliminary model-based prognostics algorithms have been developed making use of empirical degradation models and physics-inspired degradation models, with a focus on key components such as electrolytic capacitors and power MOSFETs (metal-oxide-semiconductor field-effect transistors). This paper presents current results on the development of validation methods for prognostics algorithms of power electrolytic capacitors, particularly the use of accelerated aging systems for algorithm validation. Validation of prognostics algorithms presents difficulties in practice due to the lack of run-to-failure experiments in deployed systems. By using accelerated experiments, we circumvent this problem in order to define initial validation activities.

  14. Calculation spreadsheet for uncertainty estimation of measurement results in gamma-ray spectrometry and its validation for quality assurance purpose.

    PubMed

    Ceccatelli, Alessia; Dybdal, Ashild; Fajgelj, Ales; Pitois, Aurelien

    2017-03-03

    An Excel calculation spreadsheet has been developed to estimate the uncertainty of measurement results in γ-ray spectrometry. It considers all relevant uncertainty components and calculates the combined standard uncertainty of the measurement result. The calculation spreadsheet has been validated using two independent open access software and is available for download free of charge at: https://nucleus.iaea.org/rpst/ReferenceProducts/Analytical_Methods/index.htm. It provides a simple and easy-to-use template for estimating the uncertainty of γ-ray spectrometry measurement results and supports the radioanalytical laboratories seeking accreditation for their measurements using γ-ray spectrometry.
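    The core of such a spreadsheet is the combination of independent relative standard uncertainties in quadrature. A minimal sketch with illustrative component values (the actual spreadsheet covers many more contributions):

```python
import math

# Combining independent relative standard uncertainties in quadrature, as
# done for a gamma-spectrometry activity result. The component values and
# the activity below are illustrative, not taken from the spreadsheet.

def combined_relative_uncertainty(components):
    return math.sqrt(sum(u ** 2 for u in components))

components = {                       # relative standard uncertainties
    "counting statistics": 0.015,
    "detection efficiency": 0.030,
    "gamma emission probability": 0.005,
    "sample mass": 0.002,
}
u_rel = combined_relative_uncertainty(components.values())
activity = 125.0                     # Bq, hypothetical measurement result
u_activity = activity * u_rel        # combined standard uncertainty, Bq
```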

  15. Purposeful Sampling for Qualitative Data Collection and Analysis in Mixed Method Implementation Research.

    PubMed

    Palinkas, Lawrence A; Horwitz, Sarah M; Green, Carla A; Wisdom, Jennifer P; Duan, Naihua; Hoagwood, Kimberly

    2015-09-01

    Purposeful sampling is widely used in qualitative research for the identification and selection of information-rich cases related to the phenomenon of interest. Although there are several different purposeful sampling strategies, criterion sampling appears to be used most commonly in implementation research. However, combining sampling strategies may be more appropriate to the aims of implementation research and more consistent with recent developments in quantitative methods. This paper reviews the principles and practice of purposeful sampling in implementation research, summarizes types and categories of purposeful sampling strategies and provides a set of recommendations for use of single strategy or multistage strategy designs, particularly for state implementation research.

  16. Flexibility and applicability of β-expectation tolerance interval approach to assess the fitness of purpose of pharmaceutical analytical methods.

    PubMed

    Bouabidi, A; Talbi, M; Bourichi, H; Bouklouze, A; El Karbane, M; Boulanger, B; Brik, Y; Hubert, Ph; Rozet, E

    2012-12-01

    An innovative, versatile strategy using total error has been proposed to decide on a method's validity; it controls the risk of accepting an unsuitable assay and provides the ability to predict the reliability of future results. This strategy is based on the simultaneous combination of the systematic (bias) and random (imprecision) errors of analytical methods. Using validation standards, both types of error are combined through the use of a prediction interval or β-expectation tolerance interval. Finally, an accuracy profile is built by connecting, on the one hand, all the upper tolerance limits, and on the other hand, all the lower tolerance limits. This profile, combined with pre-specified acceptance limits, allows the evaluation of the validity of any quantitative analytical method and thus its fitness for its intended purpose. In this work, the accuracy profile approach was evaluated on several types of analytical methods encountered in the pharmaceutical industrial field, covering different pharmaceutical matrices. The four studied examples depict the flexibility and applicability of this approach for different matrices ranging from tablets to syrups, different techniques such as liquid chromatography or UV spectrophotometry, and different categories of assays commonly encountered in the pharmaceutical industry, i.e. content assays, dissolution assays, and quantitative impurity assays. The accuracy profile approach assesses the fitness for purpose of these methods for their future routine application. It also allows the selection of the most suitable calibration curve, the adequate evaluation of a potential matrix effect together with proposals for efficient solutions, and the correct definition of the limits of quantification of the studied analytical procedures.
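    For a single series of validation standards, the β-expectation tolerance interval underlying the accuracy profile reduces to mean ± t((1+β)/2, n-1)·s·√(1 + 1/n); with several series, variance components for intermediate precision replace s. A simplified one-series sketch with hypothetical recovery data:

```python
import math

# One-series beta-expectation tolerance interval:
#   mean +/- t_{(1+beta)/2, n-1} * s * sqrt(1 + 1/n)
# This is a simplified sketch; full accuracy profiles use intermediate-
# precision variance components across several series.

def beta_expectation_interval(values, t_quantile):
    n = len(values)
    mean = sum(values) / n
    s = math.sqrt(sum((v - mean) ** 2 for v in values) / (n - 1))
    half = t_quantile * s * math.sqrt(1.0 + 1.0 / n)
    return mean - half, mean + half

# Hypothetical recoveries (%) at one concentration level, n = 6. The t
# quantile for beta = 0.95 and df = 5 is hardcoded; in general compute it
# with scipy.stats.t.ppf(0.975, n - 1).
recoveries = [99.2, 100.4, 98.9, 101.1, 99.7, 100.3]
low, high = beta_expectation_interval(recoveries, t_quantile=2.571)

# Fitness for purpose: the interval must stay inside the acceptance
# limits, here +/-5% around 100%.
valid = (low >= 95.0) and (high <= 105.0)
```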

  17. Validation of an Impedance Eduction Method in Flow

    NASA Technical Reports Server (NTRS)

    Watson, Willie R.; Jones, Michael G.; Parrott, Tony L.

    2004-01-01

    This paper reports results of a research effort to validate a method for educing the normal incidence impedance of a locally reacting liner, located in a grazing incidence, nonprogressive acoustic wave environment with flow. The results presented in this paper test the ability of the method to reproduce the measured normal incidence impedance of a solid steel plate and two soft test liners in a uniform flow. The test liners are known to be locally reacting and exhibit no measurable amplitude-dependent impedance nonlinearities or flow effects. Baseline impedance spectra for these liners were therefore established from measurements in a conventional normal incidence impedance tube. A key feature of the method is the expansion of the unknown impedance function as a piecewise continuous polynomial with undetermined coefficients. Stewart's adaptation of the Davidon-Fletcher-Powell optimization algorithm is used to educe the normal incidence impedance at each Mach number by optimizing an objective function. The method is shown to reproduce the measured normal incidence impedance spectrum for each of the test liners, thus validating its usefulness for determining the normal incidence impedance of test liners for a broad range of source frequencies and flow Mach numbers.

  18. Total organic carbon method for aspirin cleaning validation.

    PubMed

    Holmes, A J; Vanderwielen, A J

    1997-01-01

    Cleaning validation is the process of assuring that cleaning procedures effectively remove the residue from manufacturing equipment/facilities below a predetermined level. This is necessary to assure the quality of future products using the equipment, to prevent cross-contamination, and as a World Health Organization Good Manufacturing Practices requirement. We have applied the Total Organic Carbon (TOC) analysis method to a number of pharmaceutical products. In this article we discuss the TOC method that we developed for measuring residual aspirin on aluminum, stainless steel, painted carbon steel, and plexiglass. These are all surfaces that are commonly found as part of pharmaceutical production equipment. The method offers low detection capability (parts per million levels) and rapid sample analysis time. The recovery values ranged from 25% for aluminum to about 75% for plexiglass with a precision of 13% or less. The results for the plexiglass tended to vary with the age of the surface making the determination of an accurate recovery value difficult for this type of surface. We found that the TOC method is applicable for determining residual aspirin on pharmaceutical surfaces and will be useful for cleaning validation.
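    The recovery and precision figures reported above follow from simple formulas: recovery = found/spiked × 100, and precision expressed as %RSD. A minimal sketch with hypothetical swab-plate results (not the study's data):

```python
import math

# Surface recovery and precision for a cleaning-validation method:
# recovery = found / spiked * 100, precision as %RSD across replicates.
# The values below are hypothetical, chosen to fall within the 25-75%
# recovery range the article reports across surfaces.

def recovery_pct(found, spiked):
    return 100.0 * found / spiked

def rsd_pct(values):
    n = len(values)
    mean = sum(values) / n
    sd = math.sqrt(sum((v - mean) ** 2 for v in values) / (n - 1))
    return 100.0 * sd / mean

# Hypothetical aspirin found (ug) on 5 plates each spiked with 10.0 ug:
found = [7.4, 7.9, 7.2, 7.6, 7.8]
recoveries = [recovery_pct(f, 10.0) for f in found]
mean_recovery = sum(recoveries) / len(recoveries)
precision = rsd_pct(recoveries)
```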

  19. Bioanalytical method development and validation: Critical concepts and strategies.

    PubMed

    Moein, Mohammad Mahdi; El Beqqali, Aziza; Abdel-Rehim, Mohamed

    2017-02-01

    Bioanalysis is an essential part of drug discovery and development. Bioanalysis concerns the analysis of analytes (drugs, metabolites, biomarkers) in biological samples and involves several steps, from sample collection to sample analysis and data reporting. The first step is sample collection from clinical or preclinical studies and shipment of the samples to the laboratory for analysis. The second step is sample clean-up (sample preparation), a very important step in bioanalysis. In order to reach reliable results, a robust and stable sample preparation method should be applied. The role of sample preparation is to remove interferences from the sample matrix and improve analytical system performance. Sample preparation is often labor intensive and time consuming. The last step is sample analysis and detection. For separation and detection, liquid chromatography-tandem mass spectrometry (LC-MS/MS) is the method of choice in bioanalytical laboratories, owing to the high selectivity and high sensitivity of the LC-MS/MS technique. In addition, information about the analyte's chemical structure and chemical properties should be known before bioanalytical work begins. This review provides an overview of bioanalytical method development and validation. The main principles of method validation are discussed, GLP and regulated bioanalysis are described, and commonly used sample preparation techniques are presented. In addition, the role of LC-MS/MS in modern bioanalysis is discussed. The present review focuses on the bioanalysis of small molecules.

  20. Method validation strategies involved in non-targeted metabolomics.

    PubMed

    Naz, Shama; Vallejo, Maria; García, Antonia; Barbas, Coral

    2014-08-01

    Non-targeted metabolomics is the hypothesis-generating, global, unbiased analysis of all the small-molecule metabolites present within a biological system under a given set of conditions. It includes several common steps such as selection of biological samples, sample pre-treatment, analytical conditions set-up, data acquisition, data analysis by chemometrics, database searching and biological interpretation. Non-targeted metabolomics offers the potential for a holistic approach in the area of biomedical research in order to improve disease diagnosis and to understand its pathological mechanisms. Various analytical methods have been developed based on nuclear magnetic resonance spectroscopy (NMR) and mass spectrometry (MS) coupled with different separation techniques. The key point in any analytical method development is the validation of every step to obtain a reliable and reproducible result, and non-targeted metabolomics is no exception, although its analytical challenges are completely new and different from those of targeted methods. This review paper describes the validation strategies currently in use and recommends steps to consider during the development of a non-targeted metabolomics analytical method.

  1. Methodology for the validation of analytical methods involved in uniformity of dosage units tests.

    PubMed

    Rozet, E; Ziemons, E; Marini, R D; Boulanger, B; Hubert, Ph

    2013-01-14

    Validation of analytical methods is required prior to their routine use. In addition, the current implementation of the Quality by Design (QbD) framework in the pharmaceutical industry aims at improving the quality of the end products starting from the early design stage. However, neither the regulatory guidelines nor the published methodologies for assessing method validation propose decision methodologies that effectively take into account the final purpose of the developed analytical method. In this work a solution is proposed for the specific case of validating analytical methods involved in the assessment of the content uniformity or uniformity of dosage units of a batch of pharmaceutical drug products, as described in the European and US pharmacopoeias. This methodology uses statistical tolerance intervals as decision tools. Moreover, it adequately defines the Analytical Target Profile of analytical methods in order to obtain methods that allow correct decisions to be made about content uniformity or uniformity of dosage units with high probability. The applicability of the proposed methodology is further illustrated using an HPLC-UV assay as well as a near-infrared spectrophotometric method.

  2. Determination of Al in cake mix: Method validation and estimation of measurement uncertainty

    NASA Astrophysics Data System (ADS)

    Andrade, G.; Rocha, O.; Junqueira, R.

    2016-07-01

    An analytical method for the determination of Al in cake mix was developed. Acceptable values were obtained for the following parameters: linearity, limit of detection (LOD, 5.00 mg kg-1), limit of quantification (LOQ, 12.5 mg kg-1), recovery (between 91 and 102%), relative standard deviation under repeatability and within-reproducibility conditions (<20.0%), and measurement uncertainty (<10.0%). The results of the validation process showed that the proposed method is fit for purpose.
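
Recovery and repeatability figures of the kind reported above come from simple replicate statistics. A minimal sketch, with invented spike-recovery data (the 91-102% and <20% acceptance limits are the ones quoted in the abstract):

```python
import statistics

# Hypothetical spike-recovery data for Al (mg/kg): one spiked amount, 6 replicates
spiked = 50.0
found = [47.1, 49.8, 51.0, 46.3, 50.2, 48.5]

recoveries = [100 * f / spiked for f in found]
mean_rec = statistics.mean(recoveries)
# relative standard deviation under repeatability conditions
rsd = 100 * statistics.stdev(found) / statistics.mean(found)
print(f"mean recovery = {mean_rec:.1f}%, RSD = {rsd:.1f}%")

assert 91 <= mean_rec <= 102, "recovery outside acceptance range"
assert rsd < 20.0, "repeatability RSD outside acceptance range"
```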

  3. Brazilian Center for the Validation of Alternative Methods (BraCVAM) and the process of validation in Brazil.

    PubMed

    Presgrave, Octavio; Moura, Wlamir; Caldeira, Cristiane; Pereira, Elisabete; Bôas, Maria H Villas; Eskes, Chantra

    2016-03-01

    The need for the creation of a Brazilian centre for the validation of alternative methods was recognised in 2008, and members of academia, industry and existing international validation centres immediately engaged with the idea. In 2012, co-operation between the Oswaldo Cruz Foundation (FIOCRUZ) and the Brazilian Health Surveillance Agency (ANVISA) instigated the establishment of the Brazilian Center for the Validation of Alternative Methods (BraCVAM), which was officially launched in 2013. The Brazilian validation process follows OECD Guidance Document No. 34, where BraCVAM functions as the focal point to identify and/or receive requests from parties interested in submitting tests for validation. BraCVAM then informs the Brazilian National Network on Alternative Methods (RENaMA) of promising assays, which helps with prioritisation and contributes to the validation studies of selected assays. A Validation Management Group supervises the validation study, and the results obtained are peer-reviewed by an ad hoc Scientific Review Committee, organised under the auspices of BraCVAM. Based on the peer-review outcome, BraCVAM will prepare recommendations on the validated test method, which will be sent to the National Council for the Control of Animal Experimentation (CONCEA). CONCEA is in charge of the regulatory adoption of all validated test methods in Brazil, following an open public consultation.

  4. Characterization and validation of a Portuguese natural reference soil to be used as substrate for ecotoxicological purposes.

    PubMed

    Caetano, A L; Gonçalves, F; Sousa, J P; Cachada, A; Pereira, E; Duarte, A C; Ferreira da Silva, E; Pereira, R

    2012-03-01

    This study describes the first attempt to validate a Portuguese natural soil (PTRS1) as a reference soil for ecotoxicological purposes, with two aims: (i) to obtain ecotoxicological data for the derivation of Soil Screening Values (SSVs) with regional relevance, the soil acting as a substrate to be spiked with ranges of concentrations of the chemicals under evaluation; and (ii) to act as a control and as a substrate for the dilution of contaminated soils in ecotoxicological assays performed to evaluate their ecotoxicity, in tier 2 of risk assessment frameworks applied to contaminated land. PTRS1 is a cambisol from a granitic area integrated in the Central Iberian Zone. Chemical characterization of the soil in terms of pseudo-total metals, PAHs, PCBs and pesticide contents showed that some metals (Ba, Be, Co, Cr and V) surpass the Dutch Target Values (DTVs) corrected for the percentage of organic matter and clay of the PTRS1. Nevertheless, these metals displayed total concentrations below the background concentrations described for Portuguese soils in general. The same was observed for aldrin, endosulfan I, endosulfan II, heptachlor epoxide and heptachlor; however, the corrected DTVs become negligible. The performance of invertebrate and plant species commonly used in standard ecotoxicological assays was not compromised by either the soil properties or the soil metal contents. The results obtained suggest that PTRS1 can be used as a natural reference soil in ecotoxicological assays carried out under the scope of ecological risk assessment.

  5. Validation of spectrophotometric method for lactulose assay in syrup preparation

    NASA Astrophysics Data System (ADS)

    Mahardhika, Andhika Bintang; Novelynda, Yoshella; Damayanti, Sophi

    2015-09-01

    Lactulose is a synthetic disaccharide widely used in the food and pharmaceutical fields. In the pharmaceutical field, lactulose is used as an osmotic laxative in a syrup dosage form. This research aimed to validate a spectrophotometric method for determining the level of lactulose in syrup preparations and in commercial samples. Lactulose is hydrolyzed by hydrochloric acid to form fructose and galactose. The fructose is reacted with resorcinol reagent, forming a compound with an absorption peak at 485 nm. The analytical method was validated, and thereafter the lactulose content in syrup preparations was determined. The calibration curve was linear in the range of 30-100 μg/mL with a correlation coefficient (r) of 0.9996, a coefficient of variation of the method (Vxo) of 1.1%, a limit of detection of 2.32 μg/mL, and a limit of quantitation of 7.04 μg/mL. The accuracy test for the lactulose assay in the syrup preparation showed recoveries of 96.6 to 100.8%. Repeatability tests of the lactulose assay in a standard lactulose solution and in the syrup preparation gave coefficients of variation (CV) of 0.75% and 0.7%, respectively. The intermediate precision (inter-day) test gave coefficients of variation of 1.06% on the first day, 0.99% on the second day, and 0.95% on the third day. This research yielded a valid analytical method, and the lactulose levels in syrup preparations of samples A, B and C were 101.6, 100.5, and 100.6%, respectively.
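
Calibration-curve figures of merit like those above (r, LOD, LOQ) can be computed from an ordinary least-squares fit. A sketch with invented absorbance data, using the common signal-based estimates LOD = 3.3·s(y/x)/slope and LOQ = 10·s(y/x)/slope:

```python
import math

# Hypothetical calibration data: concentration (ug/mL) vs absorbance at 485 nm
conc = [30, 40, 50, 60, 70, 80, 90, 100]
absb = [0.151, 0.200, 0.252, 0.305, 0.349, 0.402, 0.449, 0.501]
n = len(conc)

mx = sum(conc) / n
my = sum(absb) / n
sxx = sum((x - mx) ** 2 for x in conc)
sxy = sum((x - mx) * (y - my) for x, y in zip(conc, absb))
slope = sxy / sxx
intercept = my - slope * mx

# residual standard deviation of the regression, s(y/x)
resid = [y - (slope * x + intercept) for x, y in zip(conc, absb)]
s_yx = math.sqrt(sum(r * r for r in resid) / (n - 2))

lod = 3.3 * s_yx / slope
loq = 10 * s_yx / slope
r = sxy / math.sqrt(sxx * sum((y - my) ** 2 for y in absb))
print(f"slope={slope:.5f}, r={r:.4f}, LOD={lod:.2f} ug/mL, LOQ={loq:.2f} ug/mL")
```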

  6. Validation of two methods for fatty acids analysis in eggs.

    PubMed

    Mazalli, Mônica R; Bragagnolo, Neura

    2007-05-01

    A comparative study between two methods (lipid extraction followed by saponification and methylation, and direct methylation) to determine the fatty acids in egg yolk was carried out. Direct methylation of the samples resulted in lower fatty acid content and greater variation in the results than lipid extraction followed by saponification and methylation. The low repeatability observed for the direct HCl methylation method was probably due to less efficient extraction and conversion of the fatty acids into their methyl esters as compared to the same procedure starting with the lipid extract. As the lipid extraction followed by esterification method was shown to be more precise, it was validated using powdered egg certified as reference material (RM 8415, NIST) and applied to samples of egg, egg enriched with polyunsaturated omega-3 fatty acids (n-3 PUFA), and commercial spray-dried whole egg powder.

  7. Validation of a hybrid life-cycle inventory analysis method.

    PubMed

    Crawford, Robert H

    2008-08-01

    The life-cycle inventory analysis step of a life-cycle assessment (LCA) may currently suffer from several limitations, mainly concerned with the use of incomplete and unreliable data sources and methods of assessment. Many past LCA studies have used traditional inventory analysis methods, namely process analysis and input-output analysis. More recently, hybrid inventory analysis methods have been developed, combining these two traditional methods in an attempt to minimise their limitations. In light of recent improvements, these hybrid methods need to be compared and validated, as they too have been considered to have several limitations. This paper evaluates a recently developed hybrid inventory analysis method which aims to address the limitations of previous methods. It was found that the truncation associated with process analysis can be up to 87%, reflecting the considerable shortcomings in the quantity of process data currently available. Capital inputs were found to account for up to 22% of the total inputs to a particular product. These findings suggest that current best-practice methods are sufficiently accurate for most typical applications, but this is heavily dependent upon data quality and availability. The use of input-output data assists in improving the system boundary completeness of life-cycle inventories. However, the use of input-output analysis alone does not always provide an accurate model for replacing process data. Further improvements in the quantity of process data currently available are needed to increase the reliability of life-cycle inventories.

  8. Method validation for methanol quantification present in working places

    NASA Astrophysics Data System (ADS)

    Muna, E. D. M.; Bizarri, C. H. B.; Maciel, J. R. M.; da Rocha, G. P.; de Araújo, I. O.

    2015-01-01

    Given the widespread use of methanol in different industry sectors and the high toxicity associated with this substance, an analytical method is needed that can determine methanol levels in the air of working environments sensitively, precisely and accurately. Based on the methodology established by the National Institute for Occupational Safety and Health (NIOSH), a method for the determination of methanol collected in silica gel tubes was validated; its effectiveness was demonstrated through participation in the international collaborative program sponsored by the American Industrial Hygiene Association (AIHA).

  9. Systematic method for the validation of long-term temperature measurements

    NASA Astrophysics Data System (ADS)

    Abdel-Jaber, H.; Glisic, B.

    2016-12-01

    Structural health monitoring (SHM) is the process of collecting and analyzing measurements of various structural and environmental parameters on a structure for the purpose of formulating conclusions on the performance and condition of the structure. Accurate long-term temperature data is critical for SHM applications, as it is often used to compensate other measurements (e.g., strain) or to understand the thermal behavior of the structure. Despite the need for accurate long-term temperature data, there are currently no validation methods to ensure the accuracy of collected data. This paper presents a novel method for the validation of long-term temperature measurements from any type of sensor. The method relies on modeling the dependence of temperature measurements inside a structure on ambient temperature measurements collected from a reliable nearby weather tower. The model is then used to predict future measurements and assess whether or not future measurements conform to predictions. The paper presents both the model selection process and the sensor malfunction detection process. To illustrate and validate the method, it is applied to data from a monitoring system installed on a real structure, Streicker Bridge on the Princeton University campus. Application of the method to data collected from about forty sensors over five years showed the potential of the method to categorize normal sensor function, as well as to characterize sensor defects and minor drift.
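
The predict-and-compare idea above can be sketched as a simple residual check: fit a model of sensor temperature against ambient temperature, then flag new readings whose residual exceeds a multiple of the training residual spread. All data values below are invented, and a real deployment would use a richer model (lags, seasonality) than a single linear fit:

```python
import statistics

# Illustrative training data: ambient (weather-tower) vs in-structure sensor (deg C)
ambient = [-5, 0, 5, 10, 15, 20, 25, 30]
sensor = [-2.1, 1.8, 5.9, 10.2, 13.8, 18.1, 22.0, 26.2]
n = len(ambient)

mx = sum(ambient) / n
my = sum(sensor) / n
slope = sum((a - mx) * (s - my) for a, s in zip(ambient, sensor)) / \
    sum((a - mx) ** 2 for a in ambient)
intercept = my - slope * mx
# spread of the training residuals
resid_sd = statistics.stdev(s - (slope * a + intercept)
                            for a, s in zip(ambient, sensor))

def is_anomalous(ambient_t, sensor_t, k=3.0):
    """Flag a reading whose residual exceeds k standard deviations."""
    return abs(sensor_t - (slope * ambient_t + intercept)) > k * resid_sd

print(is_anomalous(12, 11.5), is_anomalous(12, 25.0))
```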

  10. Establishing survey validity and reliability for American Indians through "think aloud" and test-retest methods.

    PubMed

    Hauge, Cindy Horst; Jacobs-Knight, Jacque; Jensen, Jamie L; Burgess, Katherine M; Puumala, Susan E; Wilton, Georgiana; Hanson, Jessica D

    2015-06-01

    The purpose of this study was to use a mixed-methods approach to determine the validity and reliability of measurements used within an alcohol-exposed pregnancy prevention program for American Indian women. To develop validity, content experts provided input into the survey measures, and a "think aloud" methodology was conducted with 23 American Indian women. After revising the measurements based on this input, a test-retest was conducted with 79 American Indian women who were randomized to complete either the original measurements or the new, modified measurements. The test-retest revealed that some of the questions performed better for the modified version, whereas others appeared to be more reliable for the original version. The mixed-methods approach was a useful methodology for gathering feedback on survey measurements from American Indian participants and in indicating specific survey questions that needed to be modified for this population.

  11. Knowledge Transmission versus Social Transformation: A Critical Analysis of Purpose in Elementary Social Studies Methods Textbooks

    ERIC Educational Resources Information Center

    Butler, Brandon M.; Suh, Yonghee; Scott, Wendy

    2015-01-01

    In this article, the authors investigate the extent to which 9 elementary social studies methods textbooks present the purpose of teaching and learning social studies. Using Stanley's three perspectives of teaching social studies for knowledge transmission, method of intelligence, and social transformation, we analyze how these texts prepare…

  12. Experimental validation of boundary element methods for noise prediction

    NASA Technical Reports Server (NTRS)

    Seybert, A. F.; Oswald, Fred B.

    1992-01-01

    Experimental validation of methods to predict radiated noise is presented. A combined finite element and boundary element model was used to predict the vibration and noise of a rectangular box excited by a mechanical shaker. The predicted noise was compared to sound power measured by the acoustic intensity method. Inaccuracies in the finite element model shifted the resonance frequencies by about 5 percent. The predicted and measured sound power levels agree within about 2.5 dB. In a second experiment, measured vibration data was used with a boundary element model to predict noise radiation from the top of an operating gearbox. The predicted and measured sound power for the gearbox agree within about 3 dB.

  13. Examining the Content Validity of the WHOQOL-BREF from Respondents' Perspective by Quantitative Methods

    ERIC Educational Resources Information Center

    Yao, Grace; Wu, Chia-Huei; Yang, Cheng-Ta

    2008-01-01

    Content validity, the extent to which a measurement reflects the specific intended domain of content, is a basic type of validity for a valid measurement. It was usually examined qualitatively and relied on experts' subjective judgments, not on respondents' responses. Therefore, the purpose of this study was to introduce and demonstrate how to use…

  14. Validated spectrophotometric methods for determination of some oral hypoglycemic drugs.

    PubMed

    Farouk, M; Abdel-Satar, O; Abdel-Aziz, O; Shaaban, M

    2011-02-01

    Four accurate, precise, rapid, reproducible, and simple spectrophotometric methods were validated for determination of repaglinide (RPG), pioglitazone hydrochloride (PGL) and rosiglitazone maleate (RGL). The first two methods were based on the formation of a charge-transfer purple-colored complex of chloranilic acid with RPG and RGL, with molar absorptivities of 1.23 × 10³ and 8.67 × 10² L·mol⁻¹·cm⁻¹ and Sandell's sensitivities of 0.367 and 0.412 μg·cm⁻², respectively, and of an ion-pair yellow-colored complex of bromophenol blue with RPG, PGL and RGL, with molar absorptivities of 8.86 × 10³, 6.95 × 10³, and 7.06 × 10³ L·mol⁻¹·cm⁻¹, respectively, and a Sandell's sensitivity of 0.051 μg·cm⁻² for all ion-pair complexes. The influence of different parameters on color formation was studied to determine optimum conditions for the visible spectrophotometric methods. The other spectrophotometric methods were adopted for determination of the studied drugs in the presence of their acid-, alkaline- and oxidative-degradates, using derivative and pH-induced difference spectrophotometry as stability-indicating techniques. All the proposed methods were validated according to the International Conference on Harmonization guidelines and successfully applied for determination of the studied drugs in pure form and in pharmaceutical preparations, with good extraction recovery ranges of 98.7-101.4%, 98.2-101.3%, and 99.9-101.4% for RPG, PGL, and RGL, respectively. Relative standard deviations did not exceed 1.6%, indicating that the proposed methods have good repeatability and reproducibility. All the obtained results were statistically compared to the official method used for RPG analysis and the manufacturers' methods used for PGL and RGL analysis, respectively; no significant differences were found.

  15. Validation and applications of an expedited tablet friability method.

    PubMed

    Osei-Yeboah, Frederick; Sun, Changquan Calvin

    2015-04-30

    The harmonized monograph on the tablet friability test in the United States Pharmacopeia (USP), European Pharmacopeia (Pharm. Eur.), and Japanese Pharmacopeia (JP) is designed to assess the adequacy of mechanical strength of a batch of tablets. Currently, its potential applications in formulation development have been limited by the batch requirement, which is both labor- and material-intensive. To this end, we have developed an expedited tablet friability test method using the existing USP test apparatus. The validity of the expedited friability method is established by showing that friability data from the expedited method are not statistically different from those from the standard pharmacopeial method, using materials of very different mechanical properties, i.e., microcrystalline cellulose and dibasic calcium phosphate dihydrate. Using the expedited friability method, we have shown that the relationship between tablet friability and tablet mechanical strength follows a power law expression. Furthermore, potential applications of this expedited friability test in facilitating systematic and efficient tablet formulation and tooling design are demonstrated with examples.

  16. Validated spectrofluorometric methods for determination of amlodipine besylate in tablets

    NASA Astrophysics Data System (ADS)

    Abdel-Wadood, Hanaa M.; Mohamed, Niveen A.; Mahmoud, Ashraf M.

    2008-08-01

    Two simple and sensitive spectrofluorometric methods have been developed and validated for determination of amlodipine besylate (AML) in tablets. The first method was based on the condensation reaction of AML with ninhydrin and phenylacetaldehyde in buffered medium (pH 7.0), resulting in formation of a green fluorescent product which exhibits excitation and emission maxima at 375 and 480 nm, respectively. The second method was based on the reaction of AML with 7-chloro-4-nitro-2,1,3-benzoxadiazole (NBD-Cl) in a buffered medium (pH 8.6), resulting in formation of a highly fluorescent product which was measured fluorometrically at 535 nm (λex 480 nm). The factors affecting the reactions were studied and optimized. Under the optimum reaction conditions, linear relationships with good correlation coefficients (0.9949-0.9997) were found between the fluorescence intensity and the concentration of AML in the ranges of 0.35-1.8 and 0.55-3.0 μg ml⁻¹ for the ninhydrin and NBD-Cl methods, respectively. The limits of detection were 0.09 and 0.16 μg ml⁻¹ for the first and second method, respectively. The precisions of the methods were satisfactory; the relative standard deviations ranged from 1.69 to 1.98%. The proposed methods were successfully applied to the analysis of AML in pure and pharmaceutical dosage forms with good accuracy; the recovery percentages ranged from 100.4-100.8 ± 1.70-2.32%. The results compared favorably with those of the reported method.

  17. Methods for detecting residues of cleaning agents during cleaning validation.

    PubMed

    Westman, L; Karlsson, G

    2000-01-01

    Cleaning validation procedures are carried out in order to assure that residues of cleaning agents are within acceptable limits after the cleaning process. Cleaning agents often consist of a mixture of various surfactants which are in a highly diluted state after the water rinsing procedure has been completed. This makes it difficult to find appropriate analytical methods that are sensitive enough to detect the cleaning agents. In addition, it is advantageous for the analytical methods to be simple to perform and to give results quickly. In this study, four different analytical methods are compared: visual detection of foam, pH, conductivity measurements, and analysis of total organic carbon (TOC). TOC was used as a reference method when evaluating the other three potential methods. The analyses were performed on different dilutions of the cleaning agents Vips Neutral, RBS-25, Debisan and Perform. The results demonstrated that the most sensitive method for analysis of Vips Neutral, Debisan and Perform is visual detection of foam, by which it is possible to detect concentrations of cleaning agents down to 10 micrograms/mL. RBS-25 was not detected below 200 micrograms/mL, probably because it is formulated with low-foaming surfactants. TOC analysis is less sensitive but has the advantage of being a quantitative analysis, while visual detection of foam is a semi-quantitative method. Visual detection of foam is easy to perform, gives a quick result, and requires no expensive instrumentation. The sensitivity of each method was found to be dependent upon the type of cleaning agent that was analyzed.

  18. Spectrum-transformed sequential testing method for signal validation applications

    SciTech Connect

    Gross, K.C.; Hoyer, K.K.

    1992-06-01

    The Sequential Probability Ratio Test (SPRT) has proven to be a valuable tool in a variety of reactor applications for signal validation and for sensor and equipment operability surveillance. One drawback of the conventional SPRT method is that its domain of application is limited to signals that are contaminated by Gaussian white noise. Non-Gaussian process variables contaminated by serial correlation can produce higher-than-specified rates of false alarms and missed alarms for SPRT-based surveillance systems. To overcome this difficulty we present here the development and computer implementation of a new technique, the spectrum-transformed sequential testing method. This method retains the excellent surveillance advantage of the SPRT (extremely high sensitivity for very early annunciation of the onset of disturbances in monitored signals), and its false-alarm and missed-alarm probabilities are unaffected by the presence of serial correlation in the data. Example applications of the new method to serially-correlated reactor variables are demonstrated using data recorded from EBR-II.
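
The conventional SPRT that this abstract builds on can be sketched for the Gaussian mean-shift case: accumulate the log-likelihood ratio between a degraded hypothesis N(mu1, sigma) and a normal hypothesis N(mu0, sigma), stopping at Wald's thresholds. The sample values below are invented:

```python
import math

def sprt(samples, mu0, mu1, sigma, alpha=0.01, beta=0.01):
    """Wald's SPRT for a shift in the mean of Gaussian white noise.
    Returns ('H1'|'H0', n_used), or ('continue', n) if undecided."""
    upper = math.log((1 - beta) / alpha)   # accept H1 (disturbance)
    lower = math.log(beta / (1 - alpha))   # accept H0 (normal)
    llr = 0.0
    for i, x in enumerate(samples, 1):
        # log-likelihood ratio increment for N(mu1, sigma) vs N(mu0, sigma)
        llr += (mu1 - mu0) * (x - (mu0 + mu1) / 2) / sigma ** 2
        if llr >= upper:
            return "H1", i
        if llr <= lower:
            return "H0", i
    return "continue", len(samples)

# Hypothetical sensor readings drifting from mu0 = 0 toward mu1 = 1
drifted = [0.9, 1.2, 0.8, 1.1, 1.0, 0.95, 1.3, 0.7]
print(sprt(drifted, mu0=0.0, mu1=1.0, sigma=0.5))
```

This illustrates the early-annunciation property the abstract describes: the test typically stops after only a few samples once a disturbance is present. The spectrum-transformed extension of the paper addresses serially correlated inputs, which this plain SPRT does not.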

  20. Validating for Use and Interpretation: A Mixed Methods Contribution Illustrated

    ERIC Educational Resources Information Center

    Morell, Linda; Tan, Rachael Jin Bee

    2009-01-01

    Researchers in the areas of psychology and education strive to understand the intersections among validity, educational measurement, and cognitive theory. Guided by a mixed model conceptual framework, this study investigates how respondents' opinions inform the validation argument. Validity evidence for a science assessment was collected through…

  1. Validation of a digital PCR method for quantification of DNA copy number concentrations by using a certified reference material.

    PubMed

    Deprez, Liesbet; Corbisier, Philippe; Kortekaas, Anne-Marie; Mazoua, Stéphane; Beaz Hidalgo, Roxana; Trapmann, Stefanie; Emons, Hendrik

    2016-09-01

    Digital PCR has emerged as a key technique for the sequence-specific detection and quantification of nucleic acids in various applications. In recent years, numerous reports on the development of new digital PCR methods have been published. Maturing these developments into reliable analytical methods suitable for diagnostic or other routine testing purposes requires their validation for the intended use. Here, the results of an in-house validation of a droplet digital PCR method are presented. The method is intended for quantification of the absolute copy number concentration of a purified linearized plasmid in solution with a nucleic acid background. The factors within the measurement process that have a significant effect on the measurement results were identified, and their contributions to the overall measurement uncertainty were estimated. A comprehensive overview is provided of all the aspects that should be investigated when performing an in-house validation of a digital PCR method.
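
The core copy-number calculation in partition-based digital PCR is a Poisson correction: the mean number of copies per partition is λ = -ln(fraction of negative partitions), and concentration follows from the partition volume. A minimal sketch with invented partition counts (the ~0.85 nL droplet volume is an assumed value, not taken from this study):

```python
import math

# Illustrative droplet digital PCR readout
total_partitions = 15000
negative_partitions = 9000
partition_volume_ul = 0.00085  # assumed ~0.85 nL droplet volume

# Poisson correction: copies per partition
lam = -math.log(negative_partitions / total_partitions)
copies_per_ul = lam / partition_volume_ul
print(f"lambda = {lam:.4f} copies/partition, {copies_per_ul:.0f} copies/uL")
```

Validation of such a method then asks how factors like partition volume uncertainty and misclassified partitions propagate into the reported copy number concentration.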

  2. Forward Modeling of Electromagnetic Methods Using General Purpose Finite Element Software

    NASA Astrophysics Data System (ADS)

    Butler, S. L.

    2015-12-01

    Electromagnetic methods are widely used in mineral exploration and environmental applications and are increasingly being used in hydrocarbon exploration. Forward modeling of electromagnetic methods remains challenging and is mostly carried out using purpose-built research software. General purpose commercial modeling software has become increasingly flexible and powerful in recent years and is now capable of modeling field geophysical electromagnetic techniques. In this contribution, I will show examples of the use of commercial finite element modeling software Comsol Multiphysics for modeling frequency and time-domain electromagnetic techniques as well as for modeling the Very Low Frequency technique and magnetometric resistivity. Comparisons are made with analytical solutions, benchmark numerical solutions, analog experiments and field data. Although some calculations take too long to be practical as part of an inversion scheme, I suggest that modeling of this type will be useful for modeling novel techniques and for educational purposes.

  3. Testing and Validation of the Dynamic Inertia Measurement Method

    NASA Technical Reports Server (NTRS)

    Chin, Alexander W.; Herrera, Claudia Y.; Spivey, Natalie D.; Fladung, William A.; Cloutier, David

    2015-01-01

    The Dynamic Inertia Measurement (DIM) method uses a ground vibration test setup to determine the mass properties of an object using information from frequency response functions. Most conventional mass properties testing involves using spin tables or pendulum-based swing tests, which for large aerospace vehicles becomes increasingly difficult and time-consuming, and therefore expensive, to perform. The DIM method has been validated on small test articles but has not been successfully proven on large aerospace vehicles. In response, the National Aeronautics and Space Administration Armstrong Flight Research Center (Edwards, California) conducted mass properties testing on an "iron bird" test article that is comparable in mass and scale to a fighter-type aircraft. The simple two-I-beam design of the "iron bird" was selected to ensure accurate analytical mass properties. Traditional swing testing was also performed to compare the level of effort, amount of resources, and quality of data with the DIM method. The DIM test showed favorable results for the center of gravity and moments of inertia; however, the products of inertia showed disagreement with analytical predictions.

  4. A validated stability indicating LC method for oxcarbazepine.

    PubMed

    Pathare, D B; Jadhav, A S; Shingare, M S

    2007-04-11

    The present paper describes the development of a stability-indicating reversed-phase liquid chromatographic (RPLC) method for oxcarbazepine in the presence of its impurities and of degradation products generated from forced decomposition studies. The drug substance was subjected to stress conditions of hydrolysis, oxidation, photolysis and thermal degradation. Degradation of oxcarbazepine was observed under base hydrolysis; the drug was found to be stable under the other stress conditions attempted. Successful separation of the drug from the synthetic impurities and the degradation product formed under stress conditions was achieved on a C18 column using a mixture of aqueous 0.02 M potassium dihydrogen phosphate-acetonitrile-methanol (45:35:20, v/v/v) as the mobile phase. The developed HPLC method was validated with respect to linearity, accuracy, precision, specificity and robustness. The method, covering both related-substances determination and assay of oxcarbazepine, can be used to evaluate the quality of regular production samples. It can also be used to test stability samples of oxcarbazepine.

  5. Formal methods and their role in digital systems validation for airborne systems

    NASA Technical Reports Server (NTRS)

    Rushby, John

    1995-01-01

    This report is based on one prepared as a chapter for the FAA Digital Systems Validation Handbook (a guide to assist FAA certification specialists with advanced technology issues). Its purpose is to explain the use of formal methods in the specification and verification of software and hardware requirements, designs, and implementations; to identify the benefits, weaknesses, and difficulties in applying these methods to digital systems used in critical applications; and to suggest factors for consideration when formal methods are offered in support of certification. The presentation concentrates on the rationale for formal methods and on their contribution to assurance for critical applications within a context such as that provided by DO-178B (the guidelines for software used on board civil aircraft); it is intended as an introduction for those to whom these topics are new.

  6. New validated method for piracetam HPLC determination in human plasma.

    PubMed

    Curticapean, Augustin; Imre, Silvia

    2007-01-10

    A new method for the HPLC determination of piracetam in human plasma was developed and validated by a new approach. The simple determination by UV detection was performed on the supernatant obtained from plasma after protein precipitation with perchloric acid. The chromatographic separation of piracetam under gradient elution was achieved at room temperature with a RP-18 LiChroSpher 100 column and an aqueous mobile phase containing acetonitrile and methanol. The quantitative determination of piracetam was performed at 200 nm with a lower limit of quantification LLQ = 2 μg/ml. At this limit, the coefficient of variation and the difference between the mean and the nominal concentration were CV% = 9.7 and bias% = 0.9 for the intra-day assay, and CV% = 19.1 and bias% = -7.45 for the between-days assay. For precision, CV% ranged from 1.8 to 11.6 in the intra-day and between-days assays, and for accuracy, bias% ranged from 2.3 to 14.9. In addition, the stability of piracetam under different conditions was verified. Piracetam proved to be stable in plasma for 4 weeks at -20 degrees C and for 36 h at 20 degrees C in the supernatant after protein precipitation. The new method was used for a bioequivalence study of two medicines containing 800 mg piracetam.
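
The CV% and bias% figures quoted in such validations are straightforward to compute from replicate determinations. A minimal sketch with invented intra-day replicates at the LLQ level:

```python
import statistics

def cv_and_bias(measured, nominal):
    """Precision (CV%) and accuracy (bias%) of replicate determinations."""
    mean = statistics.mean(measured)
    cv = 100 * statistics.stdev(measured) / mean
    bias = 100 * (mean - nominal) / nominal
    return cv, bias

# Hypothetical intra-day replicates at the 2 ug/mL LLQ level
replicates = [2.1, 1.9, 2.2, 1.8, 2.0, 2.1]
cv, bias = cv_and_bias(replicates, nominal=2.0)
print(f"CV% = {cv:.1f}, bias% = {bias:.1f}")
```

Typical bioanalytical acceptance criteria allow up to 20% CV and bias at the LLQ and 15% elsewhere, which is why the reported 19.1% between-days CV at the LLQ is still acceptable.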

  7. 76 FR 28664 - Method 301-Field Validation of Pollutant Measurement Methods From Various Waste Media

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-18

    ... isotopic spiking? You must analyze the bias, precision, relative standard deviation, and data acceptance... established from data obtained during your validation test. Methods that have bias correction factors outside... data or that they be liberalized (e.g., increase the minimum hold time from 24 hours to 48 to 72...

  8. Design, development and method validation of a novel multi-resonance microwave sensor for moisture measurement.

    PubMed

    Peters, Johanna; Taute, Wolfgang; Bartscher, Kathrin; Döscher, Claas; Höft, Michael; Knöchel, Reinhard; Breitkreutz, Jörg

    2017-04-08

    Microwave sensor systems using resonance technology at a single resonance in the range of 2-3 GHz have been shown to be a rapid and reliable tool for moisture determination in solid materials, including pharmaceutical granules. So far, their application has been limited to lower moisture ranges, or limitations above certain moisture contents have had to be accepted. The aim of the present study was to develop a novel multi-resonance sensor system in order to expand the measurement range. A novel sensor using additional resonances over a wide frequency band was therefore designed and used to investigate the inherent limitations of first-generation sensor systems and material-related limits. Using granule samples with different moisture contents, an experimental protocol for calibration and validation of the method was established. Pursuant to this protocol, a multiple linear regression (MLR) prediction model built by correlating microwave moisture values with the moisture determined by Karl Fischer titration was chosen and rated using conventional criteria such as the coefficient of determination (R(2)) and the root mean square error of calibration (RMSEC). Using different operators, analysis dates and ambient conditions, the method was fully validated following the guidance of ICH Q2(R1). The study clearly identified causes of the measurement uncertainties of first-generation sensor systems, confirming the approach of overcoming them with additional resonances. The established prediction model could be validated in the range of 7.6-19.6%, demonstrating its fitness for its future purpose: moisture content determination during wet granulation.
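    The calibration metrics named above (MLR model, R(2), RMSEC) can be sketched as follows; the predictor matrix and Karl Fischer reference values below are invented for illustration only:

```python
import numpy as np

# Hypothetical calibration set: two resonance-derived predictors per sample
# (e.g. resonance shift and broadening) against Karl Fischer moisture (%).
X = np.array([[0.10, 0.21], [0.32, 0.35], [0.55, 0.48],
              [0.71, 0.60], [0.90, 0.77], [1.10, 0.95]])
y = np.array([7.6, 9.8, 12.1, 14.0, 16.5, 19.6])

A = np.column_stack([np.ones(len(X)), X])      # design matrix with intercept
coef, *_ = np.linalg.lstsq(A, y, rcond=None)   # MLR fit
pred = A @ coef

rmsec = np.sqrt(np.mean((y - pred) ** 2))      # root mean square error of calibration
r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
```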

  9. An Investigation of Pre-Service Middle School Mathematics Teachers' Ability to Conduct Valid Proofs, Methods Used, and Reasons for Invalid Arguments

    ERIC Educational Resources Information Center

    Demiray, Esra; Isiksal Bostan, Mine

    2017-01-01

    The purposes of this study are to investigate Turkish pre-service middle school mathematics teachers' ability in conducting valid proofs for statements regarding numbers and algebra in terms of their year of enrollment in a teacher education program, to determine the proof methods used in their valid proofs, and to examine the reasons for their…

  10. Methods to validate tooth-supporting regenerative therapies.

    PubMed

    Padial-Molina, Miguel; Marchesan, Julie T; Taut, Andrei D; Jin, Qiming; Giannobile, William V; Rios, Hector F

    2012-01-01

    In humans, microbially induced inflammatory periodontal diseases are the primary initiators that disrupt the functional and structural integrity of the periodontium (i.e., the alveolar bone, the periodontal ligament, and the cementum). The reestablishment of its original structure, properties, and function constitutes a significant challenge in the development of new therapies to regenerate tooth-supporting defects. Preclinical models represent an important in vivo tool to critically evaluate and analyze the key aspects of novel regenerative therapies, including (1) safety, (2) effectiveness, (3) practicality, and (4) functional and structural stability over time. Therefore, these models provide foundational data that supports the clinical validation and the development of novel innovative regenerative periodontal technologies. Steps are provided on the use of the root fenestration animal model for the proper evaluation of periodontal outcome measures using the following parameters: descriptive histology, histomorphometry, immunostaining techniques, three-dimensional imaging, electron microscopy, gene expression analyses, and safety assessments. These methods will prepare investigators and assist them in identifying the key end points that can then be adapted to later stage human clinical trials.

  11. When Educational Material Is Delivered: A Mixed Methods Content Validation Study of the Information Assessment Method

    PubMed Central

    2017-01-01

    Background The Information Assessment Method (IAM) allows clinicians to report the cognitive impact, clinical relevance, intention to use, and expected patient health benefits associated with clinical information received by email. More than 15,000 Canadian physicians and pharmacists use the IAM in continuing education programs. In addition, information providers can use IAM ratings and feedback comments from clinicians to improve their products. Objective Our general objective was to validate the IAM questionnaire for the delivery of educational material (ecological and logical content validity). Our specific objectives were to measure the relevance and evaluate the representativeness of IAM items for assessing information received by email. Methods A 3-part mixed methods study was conducted (convergent design). In part 1 (quantitative longitudinal study), the relevance of IAM items was measured. Participants were 5596 physician members of the Canadian Medical Association who used the IAM. A total of 234,196 ratings were collected in 2012. The relevance of IAM items with respect to their main construct was calculated using descriptive statistics (relevance ratio R). In part 2 (qualitative descriptive study), the representativeness of IAM items was evaluated. A total of 15 family physicians completed semistructured face-to-face interviews. For each construct, we evaluated the representativeness of IAM items using a deductive-inductive thematic qualitative data analysis. In part 3 (mixing quantitative and qualitative parts), results from quantitative and qualitative analyses were reviewed, juxtaposed in a table, discussed with experts, and integrated. Thus, our final results are derived from the views of users (ecological content validation) and experts (logical content validation). Results Of the 23 IAM items, 21 were validated for content, while 2 were removed. 
In part 1 (quantitative results), 21 items were deemed relevant, while 2 items were deemed not relevant.

  12. How to qualify and validate wear simulation devices and methods.

    PubMed

    Heintze, S D

    2006-08-01

    The clinical significance of increased wear can mainly be attributed to impaired aesthetic appearance and/or functional restrictions. Little is known about the systemic effects of swallowed or inhaled worn particles that derive from restorations. As wear measurements in vivo are complicated and time-consuming, wear simulation devices and methods have been developed without, however, systematic examination of the factors that influence important wear parameters. Wear simulation devices should simulate processes that occur in the oral cavity during mastication, namely force, force profile, contact time, sliding movement, clearance of worn material, etc. Different devices that use different force actuator principles are available. Those with the highest citation frequency in the literature are - in descending order - the Alabama, ACTA, OHSU, Zurich and MTS wear simulators. When following the FDA guidelines on good laboratory practice (GLP), only the expensive MTS wear simulator is a qualified machine to test wear in vitro; the force exerted by the hydraulic actuator is controlled and regulated during all movements of the stylus. All the other simulators lack control and regulation of force development during dynamic loading of the flat specimens. This may explain the high coefficient of variation of the results in some wear simulators (28-40%) and the poor reproducibility of wear results when dental databases are searched for wear results of specific dental materials (differences of 22-72% for the same material). As most of the machines are not qualifiable, wear methods applying these machines may have a sound concept but cannot be validated. Only with the MTS method have wear parameters and influencing factors been documented and verified. A good compromise with regard to cost, practicability and robustness is the Willytec chewing simulator, which uses weights as force actuators and step motors for vertical and lateral movements. 
The Ivoclar wear method run on

  13. An Anatomically Validated Brachial Plexus Contouring Method for Intensity Modulated Radiation Therapy Planning

    SciTech Connect

    Van de Velde, Joris; Audenaert, Emmanuel; Speleers, Bruno; Vercauteren, Tom; Mulliez, Thomas; Vandemaele, Pieter; Achten, Eric; Kerckaert, Ingrid; D'Herde, Katharina; De Neve, Wilfried; Van Hoof, Tom

    2013-11-15

    Purpose: To develop contouring guidelines for the brachial plexus (BP) using anatomically validated cadaver datasets. Magnetic resonance imaging (MRI) and computed tomography (CT) were used to obtain detailed visualizations of the BP region, with the goal of achieving maximal inclusion of the actual BP in a small contoured volume while also accommodating anatomic variations. Methods and Materials: CT and MRI were obtained for 8 cadavers positioned for intensity modulated radiation therapy. Three-dimensional reconstructions of soft tissue (from MRI) and bone (from CT) were combined to create 8 separate enhanced CT project files. Dissection of the corresponding cadavers anatomically validated the reconstructions created. Seven enhanced CT project files were then automatically fitted, separately in different regions, to obtain a single dataset of superimposed BP regions that incorporated anatomic variations. From this dataset, improved BP contouring guidelines were developed. These guidelines were then applied to the 7 original CT project files and also to 1 additional file, left out from the superimposing procedure. The percentage of BP inclusion was compared with the published guidelines. Results: The anatomic validation procedure showed a high level of conformity between the 3-dimensional reconstructions generated and the dissected counterparts for the BP regions examined. Accurate and detailed BP contouring guidelines were developed, which provided corresponding guidance for each level in a clinical dataset. An average margin of 4.7 mm around the anatomically validated BP contour is sufficient to accommodate anatomic variations. Using the new guidelines, 100% inclusion of the BP was achieved, compared with a mean inclusion of 37.75% when published guidelines were applied. Conclusion: Improved guidelines for BP delineation were developed using combined MRI and CT imaging with validation by anatomic dissection.

  14. MICROORGANISMS IN BIOSOLIDS: ANALYTICAL METHODS DEVELOPMENT, STANDARDIZATION, AND VALIDATION

    EPA Science Inventory

    The objective of this presentation is to discuss pathogens of concern in biosolids, the analytical techniques used to evaluate microorganisms in biosolids, and to discuss standardization and validation of analytical protocols for microbes within such a complex matrix. Implicatio...

  15. STANDARDIZATION AND VALIDATION OF MICROBIOLOGICAL METHODS FOR EXAMINATION OF BIOSOLIDS

    EPA Science Inventory

    The objective of this presentation is to discuss pathogens of concern in biosolids, the analytical techniques used to evaluate microorganisms in biosolids, and to discuss standardization and validation of analytical protocols for microbes within a complex matrix. Implications of ...

  16. Should the AOAC use-dilution method be continued for regulatory purposes?

    PubMed

    Omidbakhsh, Navid

    2012-01-01

    Despite its very poor reproducibility, AOAC INTERNATIONAL's use-dilution method (UDM) for bactericidal activity (AOAC Methods 964.02, 955.14, and 955.15) has been required by the U.S. Environmental Protection Agency (EPA) since 1953 for regulatory purposes, while methods with better reproducibility have been adopted in Canada and Australia. This study reviews the UDM from a statistical perspective. Additionally, the test's expected results were compared to those obtained from actual evaluation of several formulations. Significant gaps were identified between the reproducibility of the test data predicted by statistical analysis and that of the data presented to the EPA for product registration. The UDM's poor reproducibility, along with its qualitative nature, requires the concentration of the active ingredient to be high enough to ensure that all or most carriers are free of any viable organisms. This is not in accord with current trends towards sustainability, human safety, and environmental protection. It is recommended that the use of the method for regulatory purposes be phased out as soon as possible and that methods with better design and reproducibility be adopted instead.

  17. VALIDATION OF ANALYTICAL METHODS AND INSTRUMENTATION FOR BERYLLIUM MEASUREMENT: REVIEW AND SUMMARY OF AVAILABLE GUIDES, PROCEDURES, AND PROTOCOLS

    SciTech Connect

    Ekechukwu, A

    2009-05-27

    Method validation is the process of evaluating whether an analytical method is acceptable for its intended purpose. For pharmaceutical methods, guidelines from the United States Pharmacopeia (USP), the International Conference on Harmonisation (ICH), and the United States Food and Drug Administration (USFDA) provide a framework for performing such validations. In general, methods for regulatory compliance must include studies on specificity, linearity, accuracy, precision, range, detection limit, quantitation limit, and robustness. Elements of these guidelines are readily adapted to the issue of validation for beryllium sampling and analysis. This document provides a listing of available sources which can be used to validate analytical methods and/or instrumentation for beryllium determination. A literature review was conducted of available standard methods and publications used for method validation and/or quality control. A comprehensive listing of the articles, papers and books reviewed is given in the Appendix. Available validation documents and guides are listed therein; each has a brief description of application and use. In the referenced sources, there are varying approaches to validation and varying descriptions of the validation process at different stages in method development. This discussion focuses on validation and verification of fully developed methods and instrumentation that have been offered up for use or approval by other laboratories or official consensus bodies such as ASTM International, the International Standards Organization (ISO) and the Association of Official Analytical Chemists (AOAC). This review was conducted as part of a collaborative effort to investigate and improve the state of validation for measuring beryllium in the workplace and the environment. Documents and publications from the United States and Europe are included. Unless otherwise specified, all referenced documents were published in English.

  18. Cleaning validation 2: development and validation of an ion chromatographic method for the detection of traces of CIP-100 detergent.

    PubMed

    Resto, Wilfredo; Hernández, Darimar; Rey, Rosamil; Colón, Héctor; Zayas, José

    2007-05-09

    An ion chromatographic method with conductivity detection was developed and validated for the determination of traces of a clean-in-place (CIP) detergent in support of cleaning validation. It was shown to be linear, with a squared correlation coefficient (r(2)) of 0.9999, with average recoveries of 71.4% (area response factor) from stainless steel surfaces and 101% from cotton. Repeatability was found to be 2.17% and intermediate precision 1.88% across the range. The method was also shown to be sensitive, with a detection limit (DL) of 0.13 ppm and a quantitation limit (QL) of 0.39 ppm for EDTA, which translates to less than 1 microL of CIP diluted in 100 mL of diluent in both cases. The EDTA signal was well resolved from typical ions encountered in water samples and from other interferences arising from swabs and surfaces. The method could be applied to cleaning validation samples and is suitable for inclusion in a rapid and reliable cleaning validation program.
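    The DL and QL figures above follow the usual calibration-based approach. A minimal sketch of the ICH Q2(R1)-style formulas is shown below; the sigma and slope values are hypothetical, chosen only to illustrate the arithmetic:

```python
def detection_limit(sigma, slope):
    """ICH Q2(R1)-style detection limit: DL = 3.3 * sigma / S, where sigma is
    the residual standard deviation of the calibration line and S its slope."""
    return 3.3 * sigma / slope

def quantitation_limit(sigma, slope):
    """ICH Q2(R1)-style quantitation limit: QL = 10 * sigma / S."""
    return 10 * sigma / slope

# Hypothetical calibration figures chosen only to illustrate the arithmetic
print(detection_limit(0.004, 0.1), quantitation_limit(0.004, 0.1))
```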

  19. A Toxocara cati eggs concentration method from cats' faeces, for experimental and diagnostic purposes.

    PubMed

    Cardillo, N; Sommerfelt, I; Fariña, F; Pasqualetti, M; Pérez, M; Ercole, M; Rosa, A; Ribicich, M

    2014-09-01

    Toxocariosis is a zoonotic parasitic infection distributed worldwide, now considered a neglected disease associated with poverty. For experimental infection of animals and for the development of diagnostics in humans, it is necessary to obtain large numbers of Toxocara spp. larval eggs. The percentage of Toxocara cati eggs recovered from the faeces of infected cats was determined using a novel egg concentration method. The McMaster egg counting technique and the concentration method were applied to 20 positive faecal samples obtained from naturally infected cats. The mean percentage of eggs recovered by the concentration method was 24.37% higher than the count obtained by the McMaster egg counting technique. The main advantage of this method is that a small final volume with a high number of recovered eggs, and hence a good-quality inoculum for experimental and diagnostic purposes, can be obtained.

  20. Use of Validation by Enterprises for Human Resource and Career Development Purposes. Cedefop Reference Series No 96

    ERIC Educational Resources Information Center

    Cedefop - European Centre for the Development of Vocational Training, 2014

    2014-01-01

    European enterprises give high priority to assessing skills and competences, seeing this as crucial for recruitment and human resource management. Based on a survey of 400 enterprises, 20 in-depth case studies and interviews with human resource experts in 10 countries, this report analyses the main purposes of competence assessment, the standards…

  1. Determination of validation threshold for coordinate measuring methods using a metrological compatibility model

    NASA Astrophysics Data System (ADS)

    Gromczak, Kamila; Gąska, Adam; Kowalski, Marek; Ostrowska, Ksenia; Sładek, Jerzy; Gruza, Maciej; Gąska, Piotr

    2017-01-01

    The following paper presents a practical approach to the validation process of coordinate measuring methods at an accredited laboratory, using a statistical model of metrological compatibility. The statistical analysis of measurement results obtained using a highly accurate system was intended to determine the permissible validation threshold values. The threshold value constitutes the primary criterion for the acceptance or rejection of the validated method, and depends on both the differences between measurement results with corresponding uncertainties and the individual correlation coefficient. The article specifies and explains the types of measuring methods that were subject to validation and defines the criterion value governing their acceptance or rejection in the validation process.
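    The acceptance criterion described above compares the difference between two measurement results with their combined uncertainty, taking the correlation coefficient into account. The following is a minimal sketch of a VIM-style metrological compatibility check under these assumptions; the function name, the coverage factor k=2 and the numbers are ours, not the paper's:

```python
import math

def compatible(x1, u1, x2, u2, r=0.0, k=2.0):
    """VIM-style metrological compatibility check: two results agree when
    their difference is within k times the combined standard uncertainty
    of that difference, accounting for the correlation coefficient r."""
    u_diff = math.sqrt(u1 ** 2 + u2 ** 2 - 2 * r * u1 * u2)
    return abs(x1 - x2) <= k * u_diff

# Validated method vs. reference coordinate measurement (mm), made-up numbers
print(compatible(10.012, 0.004, 10.005, 0.003, r=0.2))
```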

  2. Validated UPLC method for the fast and sensitive determination of steroid residues in support of cleaning validation in formulation area.

    PubMed

    Fekete, Szabolcs; Fekete, Jeno; Ganzler, Katalin

    2009-04-05

    An ultra performance liquid chromatographic (UPLC) method was developed for the simultaneous determination of seven steroid (dienogest, finasteride, gestodene, levonorgestrel, estradiol, ethinylestradiol, and norethisterone acetate) active pharmaceutical ingredient (API) residues. A new, generic method is presented with which the cleaning process of a steroid-producing equipment line used for the production of various pharmaceuticals can be verified. The UPLC method was validated using a UPLC BEH C18 column with a particle size of 1.7 microm (50 mm x 2.1 mm) and acetonitrile-water (48:52, v/v) as mobile phase at a flow rate of 0.55 ml/min. Method development and method validation for cleaning control analysis are described. The rapid UPLC method is suitable for cleaning control assays within the good manufacturing practices (GMP) of the pharmaceutical industry.

  3. Comparison of manual and automated nucleic acid extraction methods from clinical specimens for microbial diagnosis purposes.

    PubMed

    Wozniak, Aniela; Geoffroy, Enrique; Miranda, Carolina; Castillo, Claudia; Sanhueza, Francia; García, Patricia

    2016-11-01

    The choice of nucleic acid (NA) extraction method for molecular diagnosis in microbiology is of major importance because of the low microbial loads and the diverse nature of microorganisms and clinical specimens. The NA yield of different extraction methods has mostly been studied using spiked samples; information from real human clinical specimens is scarce. The purpose of this study was to compare the performance of manual low-cost extraction methods (a Qiagen kit and a salting-out extraction method) with the automated high-cost MagNAPure Compact method. According to cycle threshold values for different pathogens, MagNAPure is as efficient as Qiagen for NA extraction from noncomplex clinical specimens (nasopharyngeal swabs, skin swabs, plasma, respiratory specimens). In contrast, according to cycle threshold values for RNAseP, the MagNAPure method may not be appropriate for NA extraction from blood. We believe that MagNAPure's versatility, reduced risk of cross-contamination, and reduced hands-on time compensate for its high cost.

  4. Independent data validation of an in vitro method for ...

    EPA Pesticide Factsheets

    In vitro bioaccessibility assays (IVBA) estimate arsenic (As) relative bioavailability (RBA) in contaminated soils to improve the accuracy of site-specific human exposure assessments and risk calculations. For an IVBA assay to gain acceptance for use in risk assessment, it must be shown to reliably predict in vivo RBA that is determined in an established animal model. Previous studies correlating soil As IVBA with RBA have been limited by the use of few soil types as the source of As. Furthermore, the predictive value of As IVBA assays has not been validated using an independent set of As-contaminated soils. Therefore, the current study was undertaken to develop a robust linear model to predict As RBA in mice using an IVBA assay and to independently validate the predictive capability of this assay using a unique set of As-contaminated soils. Thirty-six As-contaminated soils varying in soil type, As contaminant source, and As concentration were included in this study, with 27 soils used for initial model development and nine soils used for independent model validation. The initial model reliably predicted As RBA values in the independent data set, with a mean As RBA prediction error of 5.3% (range 2.4 to 8.4%). Following validation, all 36 soils were used for final model development, resulting in a linear model with the equation: RBA = 0.59 * IVBA + 9.8 and R2 of 0.78. The in vivo-in vitro correlation and independent data validation presented here provide
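    The final model reported above is a simple linear transform, so a site-specific point estimate can be sketched directly; the example IVBA value below is hypothetical:

```python
def predict_rba(ivba_percent, slope=0.59, intercept=9.8):
    """Point estimate of arsenic relative bioavailability (%) from an in
    vitro bioaccessibility result (%), using the linear model reported
    in the abstract (RBA = 0.59 * IVBA + 9.8)."""
    return slope * ivba_percent + intercept

# e.g. a hypothetical soil with 50% in vitro bioaccessibility
print(round(predict_rba(50.0), 1))  # ≈ 39.3
```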

  5. The establishment of tocopherol reference intervals for Hungarian adult population using a validated HPLC method.

    PubMed

    Veres, Gábor; Szpisjak, László; Bajtai, Attila; Siska, Andrea; Klivényi, Péter; Ilisz, István; Földesi, Imre; Vécsei, László; Zádori, Dénes

    2017-02-09

    Evidence suggests that a decreased α-tocopherol (the most biologically active substance in the vitamin E group) level can cause neurological symptoms, most likely ataxia. The aim of the current study was to provide the first reference intervals for serum tocopherols in the adult Hungarian population with an appropriate sample size, recruiting healthy control subjects and neurological patients suffering from conditions without symptoms of ataxia, myopathy or cognitive deficiency. A validated HPLC method applying a diode array detector and rac-tocol as internal standard was utilized for this purpose. Furthermore, serum cholesterol levels were determined for data normalization. The calculated 2.5-97.5% reference intervals for α-, β/γ- and δ-tocopherols were 24.62-54.67, 0.81-3.69 and 0.29-1.07 μmol/L, respectively, whereas the tocopherol/cholesterol ratios were 5.11-11.27, 0.14-0.72 and 0.06-0.22 μmol/mmol, respectively. The establishment of these reference intervals may improve the diagnostic accuracy of tocopherol measurements in certain neurological conditions with decreased tocopherol levels. Moreover, the current study draws special attention to possible pitfalls in the complex process of determining reference intervals, including the selection of the study population, the application of an internal standard, method validation and the calculation of tocopherol/cholesterol ratios.
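    Nonparametric 2.5-97.5% reference intervals such as those above are commonly taken as sample percentiles. A minimal sketch with simulated values follows; the distribution parameters are invented, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(42)
# Simulated serum alpha-tocopherol values (umol/L) for a reference cohort;
# the distribution parameters are invented, not the study's data.
alpha_toc = rng.normal(loc=38.0, scale=7.5, size=120)

# Nonparametric 2.5-97.5% reference interval
low, high = np.percentile(alpha_toc, [2.5, 97.5])
print(round(low, 1), round(high, 1))
```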

  6. Validation of a high performance liquid chromatography method for the stabilization of epigallocatechin gallate.

    PubMed

    Fangueiro, Joana F; Parra, Alexander; Silva, Amélia M; Egea, Maria A; Souto, Eliana B; Garcia, Maria L; Calpena, Ana C

    2014-11-20

    Epigallocatechin gallate (EGCG) is a green tea catechin with potential health benefits, such as anti-oxidant, anti-carcinogenic and anti-inflammatory effects. In general, EGCG is highly susceptible to degradation and therefore presents stability problems. The present paper focused on the stability of EGCG in HEPES (N-2-hydroxyethylpiperazine-N'-2-ethanesulfonic acid) medium as a function of pH, storage temperature and the presence of ascorbic acid, a reducing agent. The evaluation of EGCG in HEPES buffer demonstrated that the molecule is not able to maintain its physicochemical properties and potential beneficial effects, since it is partially or completely degraded, depending on the EGCG concentration. Lower storage temperatures (4 or -20 °C) were the most suitable for maintaining its structure, and pH 3.5 provided greater stability than pH 7.4. The presence of a reducing agent (i.e., ascorbic acid), however, provided the greatest protection against degradation of EGCG. A validation method based on RP-HPLC with UV-vis detection was carried out for two media: water and a biocompatible physiological medium composed of Transcutol®P, ethanol and ascorbic acid. The quantification of EGCG using the pure compound requires a validated HPLC method, which could then be applied in pharmacokinetic and pharmacodynamic studies.

  7. Testing alternative ground water models using cross-validation and other methods

    USGS Publications Warehouse

    Foglia, L.; Mehl, S.W.; Hill, M.C.; Perona, P.; Burlando, P.

    2007-01-01

    Many methods can be used to test alternative ground water models. Of concern in this work are methods able to (1) rank alternative models (also called model discrimination) and (2) identify observations important to parameter estimates and predictions (equivalent to the purpose served by some types of sensitivity analysis). Some of the measures investigated are computationally efficient; others are computationally demanding. The latter are generally needed to account for model nonlinearity. The efficient model discrimination methods investigated include the information criteria: the corrected Akaike information criterion, Bayesian information criterion, and generalized cross-validation. The efficient sensitivity analysis measures used are dimensionless scaled sensitivity (DSS), composite scaled sensitivity, and parameter correlation coefficient (PCC); the other statistics are DFBETAS, Cook's D, and observation-prediction statistic. Acronyms are explained in the introduction. Cross-validation (CV) is a computationally intensive nonlinear method that is used for both model discrimination and sensitivity analysis. The methods are tested using up to five alternative parsimoniously constructed models of the ground water system of the Maggia Valley in southern Switzerland. The alternative models differ in their representation of hydraulic conductivity. A new method for graphically representing CV and sensitivity analysis results for complex models is presented and used to evaluate the utility of the efficient statistics. The results indicate that for model selection, the information criteria produce similar results at much smaller computational cost than CV. For identifying important observations, the only obviously inferior linear measure is DSS; the poor performance was expected because DSS does not include the effects of parameter correlation and PCC reveals large parameter correlations. ?? 2007 National Ground Water Association.
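    The least-squares forms of the information criteria compared above can be sketched as follows; the SSE, n and k values are illustrative, not the study's results:

```python
import math

def aic_c(sse, n, k):
    """Corrected Akaike information criterion (least-squares form) for a
    model with n observations, k estimated parameters and error sum of
    squares sse."""
    aic = n * math.log(sse / n) + 2 * k
    return aic + 2 * k * (k + 1) / (n - k - 1)

def bic(sse, n, k):
    """Bayesian information criterion, least-squares form."""
    return n * math.log(sse / n) + k * math.log(n)

# Hypothetical comparison: a 3-parameter and a 6-parameter model fit to
# the same 30 observations; the simpler model wins despite a larger SSE.
print(aic_c(sse=12.4, n=30, k=3) < aic_c(sse=11.9, n=30, k=6))
```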

  8. An Automatic Method for Geometric Segmentation of Masonry Arch Bridges for Structural Engineering Purposes

    NASA Astrophysics Data System (ADS)

    Riveiro, B.; DeJong, M.; Conde, B.

    2016-06-01

    Despite the tremendous advantages of laser scanning technology for the geometric characterization of built constructions, important limitations prevent more widespread implementation in the structural engineering domain. Even though the technology provides extensive and accurate information with which to perform structural assessment and health monitoring, many practitioners are resistant to it because of the processing times involved. Thus, new methods are required that can automatically process LiDAR data and subsequently provide an automatic, organized interpretation. This paper presents a new method for fully automated point cloud segmentation of masonry arch bridges. The method efficiently creates segmented, spatially related and organized point clouds, each containing the relevant geometric data for a particular component (pier, arch, spandrel wall, etc.) of the structure. The segmentation procedure comprises a heuristic approach for the separation of the different vertical walls, after which image processing tools adapted to voxel structures allow the efficient segmentation of the main structural elements of the bridge. The proposed methodology provides the essential processed data required for the structural assessment of masonry arch bridges based on geometric anomalies. The method is validated using a representative sample of masonry arch bridges in Spain.

  9. Validation Methods for Fault-Tolerant avionics and control systems, working group meeting 1

    NASA Technical Reports Server (NTRS)

    1979-01-01

    The proceedings of the first working group meeting on validation methods for fault tolerant computer design are presented. The state of the art in fault tolerant computer validation was examined in order to provide a framework for future discussions concerning research issues for the validation of fault tolerant avionics and flight control systems. The development of positions concerning critical aspects of the validation process is described.

  10. A simple method to generate adipose stem cell-derived neurons for screening purposes.

    PubMed

    Bossio, Caterina; Mastrangelo, Rosa; Morini, Raffaella; Tonna, Noemi; Coco, Silvia; Verderio, Claudia; Matteoli, Michela; Bianco, Fabio

    2013-10-01

    Strategies involved in mesenchymal stem cell (MSC) differentiation toward neuronal cells for screening purposes are characterized by quality and quantity issues. Differentiated cells are often scarce relative to the starting undifferentiated population, and the differentiation process is usually quite long, with a high risk of contamination and low yield efficiency. Here, we describe a novel, simple method to induce direct differentiation of MSCs into neuronal cells without neurosphere formation. Differentiated cells are characterized by clear morphological changes, expression of neuron-specific markers, functional responses to depolarizing stimuli and electrophysiological properties similar to those of developing neurons. The method described here represents a valuable tool for future strategies aimed at personalized screening of therapeutic agents in vitro.

  11. Validity in Mixed Methods Research in Education: The Application of Habermas' Critical Theory

    ERIC Educational Resources Information Center

    Long, Haiying

    2017-01-01

    Mixed methods approach has developed into the third methodological movement in educational research. Validity in mixed methods research as an important issue, however, has not been examined as extensively as that of quantitative and qualitative research. Additionally, the previous discussions of validity in mixed methods research focus on research…

  12. Optimal combining of ground-based sensors for the purpose of validating satellite-based rainfall estimates

    NASA Technical Reports Server (NTRS)

    Krajewski, Witold F.; Rexroth, David T.; Kiriaki, Kiriakie

    1991-01-01

    Two problems related to radar rainfall estimation are described. The first part is a description of a preliminary data analysis for the purpose of statistical estimation of rainfall from multiple (radar and raingage) sensors. Raingage, radar, and joint radar-raingage estimation is described, and some results are given. Statistical parameters of rainfall spatial dependence are calculated and discussed in the context of optimal estimation. Quality control of radar data is also described. The second part describes radar scattering by ellipsoidal raindrops. An analytical solution is derived for the Rayleigh scattering regime. Single and volume scattering are presented. Comparison calculations with the known results for spheres and oblate spheroids are shown.

  13. A photographic method to measure food item intake. Validation in geriatric institutions.

    PubMed

    Pouyet, Virginie; Cuvelier, Gérard; Benattar, Linda; Giboreau, Agnès

    2015-01-01

    From both a clinical and research perspective, measuring food intake is an important issue in geriatric institutions. However, weighing food in this context can be complex, particularly when the items remaining on a plate (side dish, meat or fish and sauce) need to be weighed separately following consumption. A method based on photography that involves taking photographs after a meal to determine food intake consequently seems to be a good alternative. This method enables the storage of raw data so that unhurried analyses can be performed to distinguish the food items present in the images. Therefore, the aim of this paper was to validate a photographic method to measure food intake in terms of differentiating food item intake in the context of a geriatric institution. Sixty-six elderly residents took part in this study, which was performed in four French nursing homes. Four dishes of standardized portions were offered to the residents during 16 different lunchtimes. Three non-trained assessors then independently estimated both the total and specific food item intakes of the participants using images of their plates taken after the meal (photographic method) and a reference image of one plate taken before the meal. Total food intakes were also recorded by weighing the food. To test the reliability of the photographic method, agreements between different assessors and agreements among various estimates made by the same assessor were evaluated. To test the accuracy and specificity of this method, food intake estimates for the four dishes were compared with the food intakes determined using the weighed food method. To illustrate the added value of the photographic method, food consumption differences between the dishes were explained by investigating the intakes of specific food items. 
Although the assessors were not specifically trained for this purpose, the results demonstrated good agreement between assessors and among the various estimates made by the same assessor.

  14. Convergent validity of a novel method for quantifying rowing training loads.

    PubMed

    Tran, Jacqueline; Rice, Anthony J; Main, Luana C; Gastin, Paul B

    2015-01-01

    Elite rowers complete rowing-specific and non-specific training, incorporating continuous and interval-like efforts spanning the intensity spectrum. However, established training load measures are unsuitable for use in some modes and intensities. Consequently, a new measure known as the T2minute method was created. The method quantifies load as the time spent in a range of training zones (time-in-zone), multiplied by intensity- and mode-specific weighting factors that scale the relative stress of different intensities and modes to the demands of on-water rowing. The purpose of this study was to examine the convergent validity of the T2minute method with Banister's training impulse (TRIMP), Lucia's TRIMP and Session-RPE when quantifying elite rowing training. Fourteen elite rowers (12 males, 2 females) were monitored during four weeks of routine training. Unadjusted T2minute loads (using coaches' estimates of time-in-zone) demonstrated moderate-to-strong correlations with Banister's TRIMP, Lucia's TRIMP and Session-RPE (rho: 0.58, 0.55 and 0.42, respectively). Adjusting T2minute loads by using actual time-in-zone data resulted in stronger correlations between the T2minute method and Banister's TRIMP and Lucia's TRIMP (rho: 0.85 and 0.81, respectively). The T2minute method is an appropriate in-field measure of elite rowing training loads, particularly when actual time-in-zone values are used to quantify load.
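The weighted time-in-zone calculation at the heart of the T2minute method can be sketched in a few lines. The weighting factors below are illustrative placeholders, not the published T2minute weights, and the mode and zone names are invented for the example.

```python
# Sketch of a T2minute-style load: sum over sessions of time-in-zone
# multiplied by an intensity- and mode-specific weighting factor that
# scales each session's stress to the reference of on-water rowing.
ILLUSTRATIVE_WEIGHTS = {
    ("rowing", "T2"): 1.0,     # on-water rowing at T2 intensity is the reference
    ("rowing", "T1"): 1.4,     # harder intensity, larger weight
    ("ergometer", "T2"): 0.95,
    ("cycling", "T2"): 0.8,    # non-specific mode, smaller weight
}

def t2minute_load(sessions):
    """sessions: iterable of (mode, zone, minutes) -> weighted load."""
    return sum(minutes * ILLUSTRATIVE_WEIGHTS[(mode, zone)]
               for mode, zone, minutes in sessions)

week = [("rowing", "T2", 90), ("ergometer", "T2", 60), ("cycling", "T2", 45)]
print(t2minute_load(week))  # total weekly load in T2-minute units
```

Replacing coaches' estimated minutes with measured time-in-zone values changes only the `minutes` inputs, which is consistent with the adjustment reported above strengthening the correlations without altering the method itself.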

  15. Voltammetric determination of copper in selected pharmaceutical preparations--validation of the method.

    PubMed

    Lutka, Anna; Maruszewska, Małgorzata

    2011-01-01

    The conditions for voltammetric determination of copper in pharmaceutical preparations were established and validated. The three selected preparations, Zincuprim (A), Wapń, cynk, miedź z wit. C (B), and Vigor complete (V), contained different salts and quantities of copper(II) and an increasing number of accompanying ingredients. To transfer copper into solution, samples of powdered tablets of the first two preparations underwent extraction, and those of the third a mineralization procedure. The concentration of copper in solution was determined by differential pulse voltammetry (DP) by comparison with a standard. In the validation process, the selectivity, accuracy, precision and linearity of the DP determination of copper in the three preparations were estimated. Copper was determined within the concentration range of 1-9 ppm (1-9 microg/mL): the mean recoveries approached 102% (A), 100% (B) and 102% (V); the relative standard deviations of the determinations (RSD) were 0.79-1.59% (A), 0.62-0.85% (B) and 1.68-2.28% (V), respectively. The mean recoveries and RSDs satisfied the requirements for analyte concentrations at the 1-10 ppm level. Statistical verification confirmed that the tested voltammetric method is suitable for the determination of copper in pharmaceutical preparations.

  16. An evaluation of alternate production methods for Pu-238 general purpose heat source pellets

    SciTech Connect

    Mark Borland; Steve Frank

    2009-06-01

    For the past half century, the National Aeronautics and Space Administration (NASA) has used Radioisotope Thermoelectric Generators (RTG) to power deep space satellites. Fabricating heat sources for RTGs, specifically General Purpose Heat Sources (GPHSs), has remained essentially unchanged since their development in the 1970s. Meanwhile, 30 years of technological advancements have been made in the applicable fields of chemistry, manufacturing and control systems. This paper evaluates alternative processes that could be used to produce Pu-238 fueled heat sources. Specifically, this paper discusses the production of the plutonium-oxide granules, which are the input stream to the ceramic pressing and sintering processes. Alternate chemical processes are compared to current methods to determine if alternative fabrication processes could reduce the hazards, especially the production of respirable fines, while producing an equivalent GPHS product.

  17. FIELD VALIDATION OF SEDIMENT TOXICITY IDENTIFICATION AND EVALUATION METHODS

    EPA Science Inventory

    Sediment Toxicity Identification and Evaluation (TIE) methods have been developed for both porewaters and whole sediments. These relatively simple laboratory methods are designed to identify specific toxicants or classes of toxicants in sediments; however, the question of whethe...

  18. Comparison of Machine Learning Methods for the Purpose of Human Fall Detection

    NASA Astrophysics Data System (ADS)

    Strémy, Maximilián; Peterková, Andrea

    2014-12-01

    According to several studies, the European population has been aging rapidly over recent years. It is therefore important to ensure that the aging population is able to live independently, without the support of the working-age population. According to these studies, falls are the most dangerous and frequent accidents in the everyday life of the aging population. In our paper, we present a system to track human falls by visual detection, i.e. using no wearable equipment. For this purpose, we used a Kinect sensor, which provides the human body position in Cartesian coordinates. The Kinect sensor can capture a human body directly because it has both a depth camera and an infrared camera. The first step in our research was to detect postures and classify fall accidents. We experimented with and compared selected machine learning methods, including Naive Bayes, decision trees and SVMs, to assess their performance in recognizing human postures (standing, sitting and lying). The highest classification accuracy, over 93.3%, was achieved by the decision tree method.
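The posture-classification step lends itself to a compact illustration. The sketch below implements a Gaussian Naive Bayes classifier from scratch, one of the methods compared above; the two features (skeleton height above the floor and bounding-box width/height ratio) and all training values are invented, since the abstract does not specify the Kinect feature set actually used.

```python
import math
from collections import defaultdict

def fit(samples):
    """samples: list of (label, feature_vector) -> per-class prior, means, variances."""
    by_class = defaultdict(list)
    for label, x in samples:
        by_class[label].append(x)
    model = {}
    for label, rows in by_class.items():
        n = len(rows)
        means = [sum(col) / n for col in zip(*rows)]
        vars_ = [sum((v - m) ** 2 for v in col) / n + 1e-6  # smoothed variance
                 for col, m in zip(zip(*rows), means)]
        model[label] = (n / len(samples), means, vars_)
    return model

def predict(model, x):
    """Return the class with the highest Gaussian log-posterior for x."""
    def log_posterior(prior, means, vars_):
        ll = math.log(prior)
        for v, m, s2 in zip(x, means, vars_):
            ll += -0.5 * math.log(2 * math.pi * s2) - (v - m) ** 2 / (2 * s2)
        return ll
    return max(model, key=lambda c: log_posterior(*model[c]))

# Invented training data: (posture, [height above floor (m), width/height ratio])
train = [("standing", [1.7, 0.35]), ("standing", [1.6, 0.40]),
         ("sitting",  [1.1, 0.80]), ("sitting",  [1.0, 0.90]),
         ("lying",    [0.2, 3.00]), ("lying",    [0.3, 2.50])]
model = fit(train)
print(predict(model, [0.25, 2.8]))  # lying
```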

  19. Development and Validation of an HPLC Method for the Analysis of Sirolimus in Drug Products

    PubMed Central

    Islambulchilar, Ziba; Ghanbarzadeh, Saeed; Emami, Shahram; Valizadeh, Hadi; Zakeri-Milani, Parvin

    2012-01-01

    Purpose: The aim of this study was to develop a simple, rapid and sensitive reverse-phase high performance liquid chromatography (RP-HPLC) method for quantification of sirolimus (SRL) in pharmaceutical dosage forms. Methods: The chromatographic system employed isocratic elution on a Knauer C18 column (5 μm, 4.6 × 150 mm), with a mobile phase consisting of acetonitrile and ammonium acetate buffer at a flow rate of 1.5 ml/min. The analyte was detected and quantified at 278 nm using an ultraviolet detector. The method was validated as per ICH guidelines. Results: The standard curve was found to have a linear relationship (r2 > 0.99) over the analytical range of 125–2000 ng/ml. For all quality control (QC) standards in intraday and interday assays, the accuracy and precision ranges were -0.96 to 6.30 and 0.86 to 13.74 respectively, demonstrating precision and accuracy over the analytical range. Samples were stable during the preparation and analysis procedure. Conclusion: The rapid and sensitive method developed can therefore be used for the routine analysis of sirolimus, such as in dissolution and stability assays of pre- and post-marketed dosage forms. PMID:24312784

  20. Polymorphism in nimodipine raw materials: development and validation of a quantitative method through differential scanning calorimetry.

    PubMed

    Riekes, Manoela Klüppel; Pereira, Rafael Nicolay; Rauber, Gabriela Schneider; Cuffini, Silvia Lucia; de Campos, Carlos Eduardo Maduro; Silva, Marcos Antonio Segatto; Stulzer, Hellen Karine

    2012-11-01

    Due to the physical-chemical and therapeutic impacts of polymorphism, its monitoring in raw materials is necessary. The purpose of this study was to develop and validate a quantitative method to determine the polymorphic content of nimodipine (NMP) raw materials based on differential scanning calorimetry (DSC). The polymorphs required for the development of the method were characterized through DSC, X-ray powder diffraction (XRPD) and Raman spectroscopy and their polymorphic identity was confirmed. The developed method was found to be linear, robust, precise, accurate and specific. Three different samples obtained from distinct suppliers (NMP 1, NMP 2 and NMP 3) were firstly characterized through XRPD and DSC as polymorphic mixtures. The determination of their polymorphic identity revealed that all samples presented the Modification I (Mod I) or metastable form in greatest proportion. Since the commercial polymorph is Mod I, the polymorphic characteristic of the samples analyzed needs to be investigated. Thus, the proposed method provides a useful tool for the monitoring of the polymorphic content of NMP raw materials.

  1. Sample size considerations of prediction-validation methods in high-dimensional data for survival outcomes.

    PubMed

    Pang, Herbert; Jung, Sin-Ho

    2013-04-01

    A variety of prediction methods are used to relate high-dimensional genome data with a clinical outcome using a prediction model. Once a prediction model is developed from a data set, it should be validated using a resampling method or an independent data set. Although the existing prediction methods have been intensively evaluated by many investigators, there has not been a comprehensive study investigating the performance of the validation methods, especially with a survival clinical outcome. Understanding the properties of the various validation methods can allow researchers to perform more powerful validations while controlling for type I error. In addition, a sample size calculation strategy based on these validation methods is lacking. We conduct extensive simulations to examine the statistical properties of these validation strategies. In both simulations and a real data example, we found that 10-fold cross-validation with permutation gave the best power while controlling type I error close to the nominal level. Based on this, we have also developed a sample size calculation method that can be used to design a validation study with a user-chosen combination of prediction and validation methods. Microarray and genome-wide association study data are used as illustrations. The power calculation method presented here can be used for the design of any biomedical study involving high-dimensional data and survival outcomes.
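The winning strategy, 10-fold cross-validation combined with label permutation, can be illustrated in miniature: cross-validated performance on the real labels is compared with a null distribution obtained by repeating the same CV after shuffling the labels. A trivial nearest-centroid predictor on one-dimensional data stands in for the paper's high-dimensional survival models; all data, fold scheme, and names are invented for the example.

```python
import random

def cv_accuracy(xs, ys, k=10):
    """k-fold cross-validated accuracy of a nearest-centroid classifier."""
    idx = list(range(len(xs)))
    folds = [idx[i::k] for i in range(k)]
    correct = 0
    for fold in folds:
        train = [i for i in idx if i not in fold]
        # nearest-centroid rule: the class whose training mean is closer wins
        m0 = [xs[i] for i in train if ys[i] == 0]
        m1 = [xs[i] for i in train if ys[i] == 1]
        c0, c1 = sum(m0) / len(m0), sum(m1) / len(m1)
        for i in fold:
            pred = 0 if abs(xs[i] - c0) < abs(xs[i] - c1) else 1
            correct += pred == ys[i]
    return correct / len(xs)

def permutation_p_value(xs, ys, n_perm=200, seed=1):
    """Compare observed CV accuracy against a shuffled-label null distribution."""
    rng = random.Random(seed)
    observed = cv_accuracy(xs, ys)
    hits = 0
    for _ in range(n_perm):
        shuffled = ys[:]
        rng.shuffle(shuffled)
        hits += cv_accuracy(xs, shuffled) >= observed
    return observed, (hits + 1) / (n_perm + 1)  # add-one-smoothed p-value

# Synthetic two-class data: class means separated by two standard deviations
rng = random.Random(0)
xs = [rng.gauss(0, 1) for _ in range(50)] + [rng.gauss(2, 1) for _ in range(50)]
ys = [0] * 50 + [1] * 50
obs, p = permutation_p_value(xs, ys)
print(f"CV accuracy={obs:.2f}, permutation p={p:.3f}")
```

A small permutation p-value indicates that the cross-validated accuracy is unlikely under the no-association null, which is how this strategy controls type I error.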

  2. Critical Values for Lawshe's Content Validity Ratio: Revisiting the Original Methods of Calculation

    ERIC Educational Resources Information Center

    Ayre, Colin; Scally, Andrew John

    2014-01-01

    The content validity ratio originally proposed by Lawshe is widely used to quantify content validity and yet methods used to calculate the original critical values were never reported. Methods for original calculation of critical values are suggested along with tables of exact binomial probabilities.
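Lawshe's ratio and the exact-binomial route to critical values can be sketched as follows. The strict one-tailed cut-off used here (tail probability at or below 0.05 under a fair-coin null) is an assumption for illustration; published tables may differ slightly depending on the convention adopted.

```python
from math import comb

def cvr(n_essential, n_panelists):
    """Lawshe's content validity ratio: (n_e - N/2) / (N/2)."""
    half = n_panelists / 2
    return (n_essential - half) / half

def critical_cvr(n_panelists, alpha=0.05):
    """Smallest CVR whose 'essential' count is significant under an
    exact one-tailed binomial test with p = 0.5 (panelists guessing)."""
    for n_e in range(n_panelists + 1):
        tail = sum(comb(n_panelists, k) for k in range(n_e, n_panelists + 1))
        tail /= 2 ** n_panelists  # P(X >= n_e) under the fair-coin null
        if tail <= alpha:
            return cvr(n_e, n_panelists)
    return None

print(cvr(9, 10))        # 0.8
print(critical_cvr(10))  # smallest significant CVR for a 10-person panel
```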

  3. VALIDATION OF A METHOD FOR ESTIMATING LONG-TERM EXPOSURES BASED ON SHORT-TERM MEASUREMENTS

    EPA Science Inventory

    A method for estimating long-term exposures from short-term measurements is validated using data from a recent EPA study of exposure to fine particles. The method was developed a decade ago but long-term exposure data to validate it did not exist until recently. In this paper, ...

  4. A Systematic Review of Validated Methods for Identifying Cerebrovascular Accident or Transient Ischemic Attack Using Administrative Data

    PubMed Central

    Andrade, Susan E.; Harrold, Leslie R.; Tjia, Jennifer; Cutrona, Sarah L.; Saczynski, Jane S.; Dodd, Katherine S.; Goldberg, Robert J.; Gurwitz, Jerry H.

    2012-01-01

    Purpose: To perform a systematic review of the validity of algorithms for identifying cerebrovascular accidents (CVAs) or transient ischemic attacks (TIAs) using administrative and claims data. Methods: PubMed and Iowa Drug Information Service (IDIS) searches of the English language literature were performed to identify studies published between 1990 and 2010 that evaluated the validity of algorithms for identifying CVAs (ischemic and hemorrhagic strokes, intracranial hemorrhage and subarachnoid hemorrhage) and/or TIAs in administrative data. Two study investigators independently reviewed the abstracts and articles to determine relevant studies according to pre-specified criteria. Results: A total of 35 articles met the criteria for evaluation. Of these, 26 articles provided data to evaluate the validity of stroke, 7 reported the validity of TIA, 5 reported the validity of intracranial bleeds (intracerebral hemorrhage and subarachnoid hemorrhage), and 10 studies reported the validity of algorithms to identify the composite endpoints of stroke/TIA or cerebrovascular disease. Positive predictive values (PPVs) varied depending on the specific outcomes and algorithms evaluated. Specific algorithms to evaluate the presence of stroke and intracranial bleeds were found to have high PPVs (80% or greater). Algorithms to evaluate TIAs in adult populations were generally found to have PPVs of 70% or greater. Conclusions: The algorithms and definitions to identify CVAs and TIAs using administrative and claims data differ greatly in the published literature. The choice of the algorithm employed should be determined by the stroke subtype of interest. PMID:22262598
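The positive predictive value used throughout the review above is simply the chart-review-confirmed fraction of algorithm-flagged cases. A minimal computation, with invented counts for illustration:

```python
def ppv(true_positives, false_positives):
    """Positive predictive value: confirmed cases / all flagged cases."""
    return true_positives / (true_positives + false_positives)

# e.g. chart review confirms 164 of 200 algorithm-flagged stroke cases
print(f"PPV = {ppv(164, 36):.0%}")  # PPV = 82%
```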

  5. Principles and Methods to Guide Education for Purpose: A Brazilian Experience

    ERIC Educational Resources Information Center

    Araujo, Ulisses F.; Arantes, Valeria Amorim; Danza, Hanna Cebel; Pinheiro, Viviane Potenza Guimarães; Garbin, Monica

    2016-01-01

    This article presents a Brazilian experience in training teachers to educate for purpose. Understanding that purpose is a value to be constructed through real-world and contextualised experiences, the authors discuss some psychological processes that underlie purpose development. Then the authors show how these processes are used in a purpose…

  6. Validation of two ribosomal RNA removal methods for microbial metatranscriptomics

    SciTech Connect

    He, Shaomei; Wurtzel, Omri; Singh, Kanwar; Froula, Jeff L; Yilmaz, Suzan; Tringe, Susannah G; Wang, Zhong; Chen, Feng; Lindquist, Erika A; Sorek, Rotem; Hugenholtz, Philip

    2010-10-01

    The predominance of rRNAs in the transcriptome is a major technical challenge in sequence-based analysis of cDNAs from microbial isolates and communities. Several approaches have been applied to deplete rRNAs from (meta)transcriptomes, but no systematic investigation of potential biases introduced by any of these approaches has been reported. Here we validated the effectiveness and fidelity of the two most commonly used approaches, subtractive hybridization and exonuclease digestion, as well as combinations of these treatments, on two synthetic five-microorganism metatranscriptomes using massively parallel sequencing. We found that the effectiveness of rRNA removal was a function of community composition and RNA integrity for these treatments. Subtractive hybridization alone introduced the least bias in relative transcript abundance, whereas exonuclease and in particular combined treatments greatly compromised mRNA abundance fidelity. Illumina sequencing itself also can compromise quantitative data analysis by introducing a G+C bias between runs.

  7. STATISTICAL VALIDATION OF SULFATE QUANTIFICATION METHODS USED FOR ANALYSIS OF ACID MINE DRAINAGE

    EPA Science Inventory

    Turbidimetric method (TM), ion chromatography (IC) and inductively coupled plasma atomic emission spectrometry (ICP-AES) with and without acid digestion have been compared and validated for the determination of sulfate in mining wastewater. Analytical methods were chosen to compa...

  8. A guideline for the validation of likelihood ratio methods used for forensic evidence evaluation.

    PubMed

    Meuwly, Didier; Ramos, Daniel; Haraksim, Rudolf

    2016-04-26

    This Guideline proposes a protocol for the validation of forensic evaluation methods at the source level, using the Likelihood Ratio framework as defined within the Bayes' inference model. In the context of the inference of identity of source, the Likelihood Ratio is used to evaluate the strength of the evidence for a trace specimen, e.g. a fingermark, and a reference specimen, e.g. a fingerprint, to originate from common or different sources. Some theoretical aspects of probabilities necessary for this Guideline were discussed prior to its elaboration, which started after a workshop of forensic researchers and practitioners involved in this topic. In the workshop, the following questions were addressed: "which aspects of a forensic evaluation scenario need to be validated?", "what is the role of the LR as part of a decision process?" and "how to deal with uncertainty in the LR calculation?". The question "what to validate?" focuses on the validation methods and criteria, while "how to validate?" deals with the implementation of the validation protocol. Answers to these questions were deemed necessary with several objectives. First, concepts typical of validation standards [1], such as performance characteristics, performance metrics and validation criteria, will be adapted or applied by analogy to the LR framework. Second, a validation strategy will be defined. Third, validation methods will be described. Finally, a validation protocol and an example of a validation report will be proposed, which can be applied to the forensic fields developing and validating LR methods for the evaluation of the strength of evidence at the source level under the following propositions.

  9. Modified method for enhanced recovery of Toxocara cati larvae for diagnostic and therapeutic purposes.

    PubMed

    Zibaei, Mohammad; Uga, Shoji

    2016-10-01

    Human toxocariasis, an extraintestinal migration of Toxocara species, is a worldwide helminthic zoonosis, particularly in developing countries. Toxocara cati is one of the most common helminths of cats, and the disease it causes is potentially preventable. Diagnosis and treatment depend on the demonstration of antibodies against specific excretory-secretory antigens of Toxocara larvae by immunological assays. This study provides a simple manual technique, which can be performed in any laboratory, for recovering a large number of Toxocara cati larvae from thick-shelled eggs. The required devices are a manual homogenizer and a 40 μm mesh filter membrane; the rest of the materials and solutions are standard laboratory ware. With the modified method, larval yields were 2.7 times higher (3000 larvae/ml) and the time spent performing the method was shorter (75 min). Further benefits over existing techniques are ease and repeatability, inexpensive and convenient materials, simplicity of performance, and the shorter time required to recover Toxocara cati larvae for subsequent cultivation and harvest of larval excretory-secretory antigens for diagnostic or therapeutic purposes.

  10. Production of general purpose heat source (GPHS) using advanced manufacturing methods

    NASA Astrophysics Data System (ADS)

    Miller, Roger G.

    1996-03-01

    Mankind will continue to explore the stars through the use of unmanned spacecraft until the technology and costs are compatible with sending travelers to the outer planets of our solar system and beyond. Unmanned probes of the present and future will be necessary to develop the required technologies and obtain the information that will make this travel possible. Because of the significant costs incurred, modern manufacturing technologies must be used to lower the investment needed, even when shared by international partnerships. For over 30 years, radioisotopes have provided the heat from which electrical power is extracted. Electric power for future spacecraft will be provided by either Radioisotope Thermoelectric Generators (RTG), Radioisotopic Thermophotovoltaic systems (RTPV), radioisotope Stirling systems, or a combination of these. All of these systems will be thermally driven by General Purpose Heat Source (GPHS) fueled clads in some configuration. The GPHS clad contains a 238PuO2 pellet encapsulated in an iridium alloy container. Historically, the fabrication of the iridium alloy shells has been performed at EG&G Mound and Oak Ridge National Laboratory (ORNL), and girth welding at Westinghouse Savannah River Corporation (WSRC) and Los Alamos National Laboratory (LANL). This paper describes the use of laser processing for welding, drilling, cutting, and machining, along with other manufacturing methods, to reduce the costs of producing GPHS fueled clad components and completed assemblies. Incorporation of new quality technologies will complement these manufacturing methods to reduce cost.

  11. Testing and Validation of the Dynamic Inertia Measurement Method

    NASA Technical Reports Server (NTRS)

    Chin, Alexander; Herrera, Claudia; Spivey, Natalie; Fladung, William; Cloutier, David

    2015-01-01

    This presentation describes the DIM method and how it measures the inertia properties of an object by analyzing the frequency response functions measured during a ground vibration test (GVT). The DIM method has been in development at the University of Cincinnati and has shown success on a variety of small scale test articles. The NASA AFRC version was modified for larger applications.

  12. [Genetic tests: definition, methods, validity and clinical utility].

    PubMed

    Lagos L, Marcela; Poggi M, Helena

    2010-01-01

    The knowledge of the human genome has led to an explosion of genetic tests available for clinical use. The methodologies used in these tests vary widely, allowing studies ranging from whole chromosomes to the analysis of a single nucleotide. Prior to their use in the clinical setting, these tests should undergo an evaluation that includes analytical and clinical validation and determination of clinical utility, as with any other test, including requirements for quality assurance. Recently, the CDC (Centers for Disease Control and Prevention, USA) published a guideline on Good Laboratory Practices for Molecular Genetic Testing for Heritable Diseases and Conditions, covering the pre-analytical, analytical and post-analytical phases of the tests. The document covers the importance of proper selection of tests, the availability of information on the performance of the techniques used, quality control practices, the training of the personnel involved, and the reporting of results to allow adequate interpretation, including sensitivity and specificity. Considering that recent advances in genetics have changed and will continue to affect clinical practice, genetic tests must meet quality and safety requirements to enable their optimal use.

  13. Flight critical system design guidelines and validation methods

    NASA Technical Reports Server (NTRS)

    Holt, H. M.; Lupton, A. O.; Holden, D. G.

    1984-01-01

    Efforts being expended at NASA-Langley to define a validation methodology, techniques for comparing advanced systems concepts, and design guidelines for characterizing fault tolerant digital avionics are described with an emphasis on the capabilities of AIRLAB, an environmentally controlled laboratory. AIRLAB has VAX 11/750 and 11/780 computers with an aggregate of 22 Mb memory and over 650 Mb storage, interconnected at 256 kbaud. An additional computer is programmed to emulate digital devices. Ongoing work is easily accessed at user stations by either chronological or key word indexing. The CARE III program aids in analyzing the capabilities of test systems to recover from faults. An additional code, the semi-Markov unreliability program (SURE) generates upper and lower reliability bounds. The AIRLAB facility is mainly dedicated to research on designs of digital flight-critical systems which must have acceptable reliability before incorporation into aircraft control systems. The digital systems would be too costly to submit to a full battery of flight tests and must be initially examined with the AIRLAB simulation capabilities.

  14. Validation of a Numerical Method for Determining Liner Impedance

    NASA Technical Reports Server (NTRS)

    Watson, Willie R.; Jones, Michael G.; Tanner, Sharon E.; Parrott, Tony L.

    1996-01-01

    This paper reports the initial results of a test series to evaluate a method for determining the normal incidence impedance of a locally reacting acoustically absorbing liner, located on the lower wall of a duct in a grazing incidence, multi-modal, non-progressive acoustic wave environment without flow. This initial evaluation is accomplished by testing the method's ability to converge to the known normal incidence impedance of a solid steel plate, and to the normal incidence impedance of an absorbing test specimen whose impedance was measured in a conventional normal incidence tube. The method is shown to converge to the normal incidence impedance values and thus to be an adequate tool for determining the impedance of specimens in a grazing incidence, multi-modal, non-progressive acoustic wave environment for a broad range of source frequencies.

  15. Validated spectrofluorimetric method for determination of selected aminoglycosides

    NASA Astrophysics Data System (ADS)

    Omar, Mahmoud A.; Ahmed, Hytham M.; Hammad, Mohamed A.; Derayea, Sayed M.

    2015-01-01

    A new, sensitive and selective spectrofluorimetric method was developed for the determination of three aminoglycoside drugs in different dosage forms, namely neomycin sulfate (NEO), tobramycin (TOB) and kanamycin sulfate (KAN). The method is based on the Hantzsch condensation reaction between the primary amino group of the aminoglycosides and acetylacetone and formaldehyde at pH 2.7, yielding highly fluorescent yellow derivatives measured at emission and excitation wavelengths of 471 nm and 410 nm, respectively. The fluorescence intensity was directly proportional to the concentration over the ranges 10-60, 40-100 and 5-50 ng/mL for NEO, TOB and KAN, respectively. The proposed method was applied successfully to the determination of these drugs in their pharmaceutical dosage forms.

  16. Validation of ESR analyzer using Westergren ESR method.

    PubMed

    Sikka, Meera; Tandon, Rajesh; Rusia, Usha; Madan, Nishi

    2007-07-01

    Erythrocyte sedimentation rate (ESR) is one of the most frequently ordered laboratory tests. ESR analyzers were developed to provide a quick and efficient measure of ESR. We compared the results of ESR obtained by an ESR analyzer with those obtained by the Westergren method in a group of 75 patients. Linear regression analysis showed a good correlation between the two results (r = 0.818, p < 0.01). The intraclass correlation was 0.82. The analyzer method had the advantages of safety, decreased technician time and improved patient care by providing quick results.
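The agreement statistics above come from a straightforward paired comparison of the two methods. A sketch of the Pearson correlation on invented paired readings (mm/h); the numbers below are not from the study:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two paired samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical paired ESR readings for the same patients (mm/h)
westergren = [5, 12, 20, 35, 50, 75, 90]
analyzer   = [7, 10, 24, 33, 55, 70, 95]
print(f"r = {pearson_r(westergren, analyzer):.3f}")
```

Note that a high r shows the two methods rank patients similarly; agreement in absolute values is usually assessed separately, e.g. with the intraclass correlation also reported above.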

  17. 25 CFR 309.8 - For marketing purposes, what is the recommended method of identifying authentic Indian products?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 25 Indians 2 2013-04-01 2013-04-01 false For marketing purposes, what is the recommended method of identifying authentic Indian products? 309.8 Section 309.8 Indians INDIAN ARTS AND CRAFTS BOARD, DEPARTMENT OF THE INTERIOR PROTECTION OF INDIAN ARTS AND CRAFTS PRODUCTS § 309.8 For marketing purposes, what is...

  18. 25 CFR 309.8 - For marketing purposes, what is the recommended method of identifying authentic Indian products?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 25 Indians 2 2014-04-01 2014-04-01 false For marketing purposes, what is the recommended method of identifying authentic Indian products? 309.8 Section 309.8 Indians INDIAN ARTS AND CRAFTS BOARD, DEPARTMENT OF THE INTERIOR PROTECTION OF INDIAN ARTS AND CRAFTS PRODUCTS § 309.8 For marketing purposes, what is...

  19. 25 CFR 309.8 - For marketing purposes, what is the recommended method of identifying authentic Indian products?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 25 Indians 2 2010-04-01 2010-04-01 false For marketing purposes, what is the recommended method of identifying authentic Indian products? 309.8 Section 309.8 Indians INDIAN ARTS AND CRAFTS BOARD, DEPARTMENT OF THE INTERIOR PROTECTION OF INDIAN ARTS AND CRAFTS PRODUCTS § 309.8 For marketing purposes, what is...

  20. 25 CFR 309.8 - For marketing purposes, what is the recommended method of identifying authentic Indian products?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 25 Indians 2 2011-04-01 2011-04-01 false For marketing purposes, what is the recommended method of identifying authentic Indian products? 309.8 Section 309.8 Indians INDIAN ARTS AND CRAFTS BOARD, DEPARTMENT OF THE INTERIOR PROTECTION OF INDIAN ARTS AND CRAFTS PRODUCTS § 309.8 For marketing purposes, what is...

  1. 25 CFR 309.8 - For marketing purposes, what is the recommended method of identifying authentic Indian products?

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 25 Indians 2 2012-04-01 2012-04-01 false For marketing purposes, what is the recommended method of identifying authentic Indian products? 309.8 Section 309.8 Indians INDIAN ARTS AND CRAFTS BOARD, DEPARTMENT OF THE INTERIOR PROTECTION OF INDIAN ARTS AND CRAFTS PRODUCTS § 309.8 For marketing purposes, what is...

  2. Maladjustment of Bully-Victims: Validation with Three Identification Methods

    ERIC Educational Resources Information Center

    Yang, An; Li, Xiang; Salmivalli, Christina

    2016-01-01

    Although knowledge on the psychosocial (mal)adjustment of bully-victims, children who bully others and are victimised by others, has been increasing, the findings have been principally gained utilising a single method to identify bully-victims. The present study examined the psychosocial adjustment of bully-victims (as compared with pure bullies…

  3. Obtaining Valid Response Rates: Considerations beyond the Tailored Design Method.

    ERIC Educational Resources Information Center

    Huang, Judy Y.; Hubbard, Susan M.; Mulvey, Kevin P.

    2003-01-01

    Reports on the use of the tailored design method (TDM) to achieve high survey response in two separate studies of the dissemination of Treatment Improvement Protocols (TIPs). Findings from these two studies identify six factors that may have influenced nonresponse, and show that use of the TDM does not, in itself, guarantee a high response rate. (SLD)

  4. The Language Teaching Methods Scale: Reliability and Validity Studies

    ERIC Educational Resources Information Center

    Okmen, Burcu; Kilic, Abdurrahman

    2016-01-01

    The aim of this research is to develop a scale to determine the language teaching methods used by English teachers. The research sample consisted of 300 English teachers who taught at Duzce University and in primary schools, secondary schools and high schools in the Provincial Management of National Education in the city of Duzce in 2013-2014…

  5. Use of multiple cluster analysis methods to explore the validity of a community outcomes concept map.

    PubMed

    Orsi, Rebecca

    2017-02-01

    Concept mapping is now a commonly-used technique for articulating and evaluating programmatic outcomes. However, research regarding validity of knowledge and outcomes produced with concept mapping is sparse. The current study describes quantitative validity analyses using a concept mapping dataset. We sought to increase the validity of concept mapping evaluation results by running multiple cluster analysis methods and then using several metrics to choose from among solutions. We present four different clustering methods based on analyses using the R statistical software package: partitioning around medoids (PAM), fuzzy analysis (FANNY), agglomerative nesting (AGNES) and divisive analysis (DIANA). We then used the Dunn and Davies-Bouldin indices to assist in choosing a valid cluster solution for a concept mapping outcomes evaluation. We conclude that the validity of the outcomes map is high, based on the analyses described. Finally, we discuss areas for further concept mapping methods research.
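    As a rough illustration of choosing among cluster solutions with internal validity indices (the study used R's cluster package together with the Dunn and Davies-Bouldin indices), the sketch below runs an AGNES-style agglomerative clustering in SciPy and scores candidate solutions with a hand-rolled Dunn index. The data and parameters are invented.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist, squareform

def dunn_index(dist, labels):
    """Dunn index: min between-cluster distance / max within-cluster
    diameter. Higher values indicate better-separated, compact clusters."""
    clusters = np.unique(labels)
    min_between, max_within = np.inf, 0.0
    for ci in clusters:
        mask_i = labels == ci
        within = dist[np.ix_(mask_i, mask_i)]
        max_within = max(max_within, within.max())
        for cj in clusters:
            if cj <= ci:
                continue
            between = dist[np.ix_(mask_i, labels == cj)]
            min_between = min(min_between, between.min())
    return min_between / max_within

rng = np.random.default_rng(0)
pts = np.vstack([rng.normal(0, 0.3, (20, 2)),     # two synthetic clusters
                 rng.normal(5, 0.3, (20, 2))])
dist = squareform(pdist(pts))
tree = linkage(pts, method="average")             # AGNES-style agglomeration
scores = {k: dunn_index(dist, fcluster(tree, k, criterion="maxclust"))
          for k in (2, 3, 4)}
best_k = max(scores, key=scores.get)              # solution with best index
```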

  6. A Framework for Mixing Methods in Quantitative Measurement Development, Validation, and Revision: A Case Study

    ERIC Educational Resources Information Center

    Luyt, Russell

    2012-01-01

    A framework for quantitative measurement development, validation, and revision that incorporates both qualitative and quantitative methods is introduced. It extends and adapts Adcock and Collier's work, and thus, facilitates understanding of quantitative measurement development, validation, and revision as an integrated and cyclical set of…

  7. Double Cross-Validation in Multiple Regression: A Method of Estimating the Stability of Results.

    ERIC Educational Resources Information Center

    Rowell, R. Kevin

    In multiple regression analysis, where resulting predictive equation effectiveness is subject to shrinkage, it is especially important to evaluate result replicability. Double cross-validation is an empirical method by which an estimate of invariance or stability can be obtained from research data. A procedure for double cross-validation is…
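    A minimal sketch of the double cross-validation procedure described above: the sample is split in half, a regression equation is fitted on each half, and each equation is scored on the opposite half, so the drop from the fitting-half R-squared indicates shrinkage. The data and coefficients below are invented.

```python
import numpy as np

def double_cross_validation(X, y, seed=0):
    """Fit OLS on each random half and score it on the opposite half."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(y))
    half_a, half_b = np.array_split(idx, 2)
    cross_r2 = []
    for fit_idx, test_idx in ((half_a, half_b), (half_b, half_a)):
        Xf = np.column_stack([np.ones(len(fit_idx)), X[fit_idx]])
        beta, *_ = np.linalg.lstsq(Xf, y[fit_idx], rcond=None)
        Xt = np.column_stack([np.ones(len(test_idx)), X[test_idx]])
        resid = y[test_idx] - Xt @ beta
        cross_r2.append(1 - (resid ** 2).sum()
                        / ((y[test_idx] - y[test_idx].mean()) ** 2).sum())
    return cross_r2          # one cross-validated R^2 per half

rng = np.random.default_rng(1)
X = rng.standard_normal((200, 3))
y = X @ np.array([1.5, -2.0, 0.5]) + rng.normal(0, 0.5, 200)
r2_a, r2_b = double_cross_validation(X, y)
```

Similar cross-validated R-squared values in both directions suggest a stable, replicable equation.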

  8. A Validation of Elements, Methods, and Barriers to Inclusive High School Service-Learning Programs

    ERIC Educational Resources Information Center

    Dymond, Stacy K.; Chun, Eul Jung; Kim, Rah Kyung; Renzaglia, Adelle

    2013-01-01

    A statewide survey of coordinators of inclusive high school service-learning programs was conducted to validate elements, methods, and barriers to including students with and without disabilities in service-learning. Surveys were mailed to 655 service-learning coordinators; 190 (29%) returned a completed survey. Findings support the validity of…

  9. Validation of a new ELISA method for in vitro potency testing of hepatitis A vaccines.

    PubMed

    Morgeaux, S; Variot, P; Daas, A; Costanzo, A

    2013-01-01

    The goal of the project was to standardise a new in vitro method to replace the existing standard method for the determination of hepatitis A virus antigen content in hepatitis A vaccines (HAV) marketed in Europe. This became necessary due to issues with the previous method, which required the use of commercial test kits. The selected candidate method, not based on commercial kits, had already been used for many years by an Official Medicines Control Laboratory (OMCL) for routine testing and batch release of HAV. After a pre-qualification phase (Phase 1) that showed the suitability of the commercially available critical ELISA reagents for the determination of antigen content in HAV marketed in Europe, an international collaborative study (Phase 2) was carried out in order to fully validate the method. Eleven laboratories took part in the collaborative study. They performed assays with the candidate standard method and, in parallel, for comparison purposes, with their own in-house validated methods where these were available. The study demonstrated that the new assay is more reliable and reproducible than the existing standard method. A good correlation of the candidate standard method with the in vivo immunogenicity assay in mice was shown previously for both potent and sub-potent (stressed) vaccines. Thus, the new standard method validated during the collaborative study may be implemented readily by manufacturers and OMCLs for routine batch release but also for in-process control or consistency testing. The new method was approved in October 2012 by Group of Experts 15 of the European Pharmacopoeia (Ph. Eur.) as the standard method for in vitro potency testing of HAV. The relevant texts will be revised accordingly. Critical reagents such as the coating reagent and detection antibodies have been adopted by the Ph. Eur. Commission and are available from the EDQM as Ph. Eur. Biological Reference Reagents (BRRs).

  10. Assessment and validation of a simple automated method for the detection of gait events and intervals.

    PubMed

    Ghoussayni, Salim; Stevens, Christopher; Durham, Sally; Ewins, David

    2004-12-01

    A simple and rapid automatic method for the detection of gait events at the foot could speed up, and possibly increase the repeatability of, gait analysis and evaluations of treatments for pathological gaits. The aim of this study was to compare and validate a kinematic-based algorithm used in the detection of four gait events: heel contact, heel rise, toe contact and toe off. Force platform data are often used to obtain the start and end of contact phases, but not usually heel rise and toe contact events. For this purpose, synchronised kinematic, kinetic and video data were captured from 12 healthy adult subjects walking both barefoot and shod at slow and normal self-selected speeds. The data were used to determine the gait events using three methods: force, visual inspection and the algorithm. Ninety percent of all timings given by the algorithm were within one frame (16.7 ms) of those from visual inspection. There were no statistically significant differences between the visual and algorithm timings. For both heel and toe contact the differences between the three methods were within 1.5 frames, whereas for heel rise and toe off the differences between the force method on one side and the visual and algorithm methods on the other were larger and more varied (up to 175 ms). In addition, the algorithm method provided the duration of three intervals, heel contact to toe contact, toe contact to heel rise and heel rise to toe off, which are not readily available from force platform data. The ability to automatically and reliably detect the timings of these four gait events and three intervals using kinematic data alone is an asset to clinical gait analysis.

  11. Stress degradation studies on betahistine and development of a validated stability-indicating assay method.

    PubMed

    Khedr, Alaa; Sheha, Mahmoud

    2008-06-15

    The purpose of this work was to study the stability of betahistine (BET) under different stress conditions and to develop a sensitive stability-indicating high-performance liquid chromatographic (HPLC) assay method. The stress conditions applied included heat, moisture, acid-base, and ultraviolet (UV) light. Betahistine and its decomposition products were derivatized by reaction with dansyl chloride (Dan-Cl) and analyzed by HPLC equipped with a fluorescence detector (FL) set at 336 and 531 nm as excitation and emission wavelengths, respectively. The drug was particularly labile under UV light and in oxygen-rich media. Two potential degradation products could be separated and identified by spectral methods. The chromatographic method involved a Zorbax Eclipse XDB-C(18) column kept at 30+/-2 degrees C and gradient elution with a mobile phase composed of acetonitrile and 0.02 mol L(-1) sodium acetate. The response factor of dansylated BET monitored by fluorescence detection was 32 times greater than its UV response. The calibration curve of BET in bulk form was linear from 0.005 to 4.2 ng microL(-1). Intraday and interday precision were less than 0.04% (CV), and accuracy was between 99.2% and 100.9% over 2.0 ng microL(-1). The limit of detection was 0.002 ng microL(-1). The method was also validated for sample stability during reaction, robustness and selectivity. The method was applied to purity testing of betahistine in tablet form.
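    The linearity and detection-limit figures above can be illustrated with a standard least-squares calibration. The peak-area values below are invented, and the LOD/LOQ formulas are the generic ICH residual-SD estimates (3.3 and 10 times sigma over slope), not necessarily the calculation the authors used.

```python
import numpy as np

# Hypothetical calibration data: concentration (ng/uL) vs. peak area
conc = np.array([0.005, 0.05, 0.5, 1.0, 2.0, 4.2])
area = np.array([0.9, 9.8, 101.0, 198.0, 405.0, 840.0])

slope, intercept = np.polyfit(conc, area, 1)
pred = slope * conc + intercept
# residual standard deviation of the regression (n - 2 degrees of freedom)
resid_sd = np.sqrt(((area - pred) ** 2).sum() / (len(conc) - 2))

lod = 3.3 * resid_sd / slope      # ICH-style limit of detection
loq = 10.0 * resid_sd / slope     # ICH-style limit of quantification
r = np.corrcoef(conc, area)[0, 1]
```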

  12. System identification methods for aircraft flight control development and validation

    NASA Technical Reports Server (NTRS)

    Tischler, Mark B.

    1995-01-01

    System-identification methods construct a mathematical model, or series of models, from measurements of the inputs and outputs of dynamic systems. The extracted models allow the characterization of the response of the overall aircraft or of component subsystem behavior, such as actuators and on-board signal processing algorithms. This paper discusses the use of frequency-domain system-identification methods for the development and integration of aircraft flight-control systems. The extraction and analysis of models of varying complexity, from nonparametric frequency responses to transfer functions and high-order state-space representations, is illustrated using the Comprehensive Identification from FrEquency Responses (CIFER) system-identification facility. Results are presented for test data from numerous flight and simulation programs at the Ames Research Center, including rotorcraft, fixed-wing aircraft, advanced short takeoff and vertical landing (ASTOVL), vertical/short takeoff and landing (V/STOL), tiltrotor aircraft, and rotor experiments in the wind tunnel. Excellent system characterization and dynamic response prediction are achieved for this wide class of systems. Examples illustrate the role of system-identification technology in providing an integrated flow of dynamic response data around the entire life cycle of aircraft development, from initial specifications, through simulation and bench testing, and into flight-test optimization.
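    A nonparametric frequency response of the kind mentioned above can be estimated from input/output records with the standard H1 estimator, H(f) = Pxy(f) / Pxx(f). This is a generic sketch of the idea, not the CIFER algorithm; the "true" system and all parameters are invented.

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(1)
u = rng.standard_normal(65536)        # broadband excitation input
b = [1.0, 0.5]                        # "true" system: a simple FIR filter
y = signal.lfilter(b, [1.0], u)       # measured output

# Welch cross- and auto-spectral densities, then the H1 estimator
f, Pxy = signal.csd(u, y, nperseg=1024)
_, Pxx = signal.welch(u, nperseg=1024)
H = Pxy / Pxx                         # estimated frequency response

idx = 100                             # pick a mid-band frequency bin
est_gain = abs(H[idx])
true_gain = abs(b[0] + b[1] * np.exp(-2j * np.pi * f[idx]))
```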

  13. Validity assessment of the detection method of maize event Bt10 through investigation of its molecular structure.

    PubMed

    Milcamps, Anne; Rabe, Scott; Cade, Rebecca; De Framond, Anic J; Henriksson, Peter; Kramer, Vance; Lisboa, Duarte; Pastor-Benito, Susana; Willits, Michael G; Lawrence, David; Van den Eede, Guy

    2009-04-22

    In March 2005, U.S. authorities informed the European Commission of the inadvertent release of the unauthorized maize GM event Bt10 into their market and subsequently into the grain channel. In the United States, measures were taken to eliminate Bt10 from seed and grain supplies; in the European Union, an embargo on imports of maize gluten and brewer's grain was implemented unless they were certified free of Bt10 by a Bt10-specific PCR detection method. With the aim of assessing the validity of the Bt10 detection method, an in-depth analysis of the molecular organization of the genetic modification of this event was carried out both by the company Syngenta, which produced the event, and by the European Commission Joint Research Centre, which validated the detection method. Using a variety of molecular analytical tools, both organizations found the genetic modification of event Bt10 to be very complex in structure, with rearrangements, inversions, and multiple copies of the structural elements (cry1Ab, pat, and the amp gene), interspersed with small genomic maize fragments. Southern blot analyses demonstrated that all Bt10 elements were tightly linked on one large fragment, including the region that would generate the event-specific PCR amplicon of the Bt10 detection method. This study proposes a hypothetical map of the insert of event Bt10 and concludes that the validated detection method for event Bt10 is fit for its purpose.

  14. Validation of a spectrophotometric method for quantification of carboxyhemoglobin.

    PubMed

    Luchini, Paulo D; Leyton, Jaime F; Strombech, Maria de Lourdes C; Ponce, Julio C; Jesus, Maria das Graças S; Leyton, Vilma

    2009-10-01

    The measurement of carboxyhemoglobin (COHb) levels in blood is a valuable procedure to confirm exposure to carbon monoxide (CO) for either forensic or occupational purposes. A previously described method using spectrophotometric readings at 420 and 432 nm after reduction of oxyhemoglobin (O(2)Hb) and methemoglobin with sodium hydrosulfite solution leads to an exponential curve. This curve, used with pre-established factors, serves well for low concentrations (1-7%) or for high concentrations (> 20%), but very rarely for both. The authors have observed that small variations in the previously described factors F1, F2, and F3, obtained from readings for 100% COHb and 100% O(2)Hb, translate into significant changes in COHb% results, and propose that these factors be determined every time COHb is measured, by reading CO- and O(2)-saturated samples. This practice leads to an increase in accuracy and precision.

  15. Experimental Validation for Hot Stamping Process by Using Taguchi Method

    NASA Astrophysics Data System (ADS)

    Fawzi Zamri, Mohd; Lim, Syh Kai; Razlan Yusoff, Ahmad

    2016-02-01

    The demand for reduced gas emissions, energy savings and safer vehicles has driven the development of Ultra High Strength Steel (UHSS) materials. To strengthen a UHSS material such as boron steel, it must undergo a hot stamping process, with heating at a certain temperature for a certain time. In this paper, the Taguchi method is applied to determine the appropriate parameters of thickness, heating temperature and heating time to achieve the optimum strength of boron steel. The experiment is conducted using a flat square hot stamping tool with a tensile dog-bone blank. The tensile strength and hardness are then measured as responses. The results showed that lower thickness, higher heating temperature and longer heating time give higher strength and hardness in the final product. In conclusion, boron steel blanks were able to achieve up to 1200 MPa tensile strength and 650 HV hardness.
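    For a larger-the-better response such as tensile strength, the Taguchi analysis ranks parameter settings by their signal-to-noise ratio. A minimal sketch with invented replicate readings:

```python
import numpy as np

def sn_larger_is_better(y):
    """Taguchi S/N ratio for a larger-the-better response:
    S/N = -10 * log10(mean(1 / y^2)), in dB."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y ** 2))

# Hypothetical replicated tensile-strength readings (MPa) for two settings
setting_a = [1150.0, 1180.0, 1165.0]
setting_b = [980.0, 1010.0, 995.0]
# The setting with the higher S/N ratio is preferred.
sn_a = sn_larger_is_better(setting_a)
sn_b = sn_larger_is_better(setting_b)
```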

  16. A Model Incorporating the Rationale and Purpose for Conducting Mixed-Methods Research in Special Education and beyond

    ERIC Educational Resources Information Center

    Collins, Kathleen M. T.; Onwuegbuzie, Anthony J.; Sutton, Ida L.

    2006-01-01

    This article provides a typology of reasons for conducting mixed-methods research in special education. The mixed-methods research process is described along with the role of the rationale and purpose of study. The reasons given in the literature for utilizing mixed-methods research are explicated, and the limitations of these reason frameworks…

  17. General purpose nonlinear system solver based on Newton-Krylov method.

    SciTech Connect

    2013-12-01

    KINSOL is part of a software family called SUNDIALS: SUite of Nonlinear and Differential/Algebraic equation Solvers [1]. KINSOL is a general-purpose nonlinear system solver based on Newton-Krylov and fixed-point solver technologies [2].
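    KINSOL itself is a C library distributed with SUNDIALS; as a rough illustration of the same Newton-Krylov idea in Python, SciPy's newton_krylov solves F(x) = 0 using a Krylov method for the inner linear solves, so no explicit Jacobian matrix is required. The toy system below is invented.

```python
import numpy as np
from scipy.optimize import newton_krylov

def residual(x):
    # simple nonlinear system: each component solves x**3 + x - 2 = 0,
    # whose real root is x = 1
    return x ** 3 + x - 2.0

# Newton-Krylov iteration from a zero initial guess
solution = newton_krylov(residual, np.zeros(5), f_tol=1e-10)
```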

  18. Validation of an evacuated canister method for measuring part-per-billion levels of chemical warfare agent simulants.

    PubMed

    Coffey, Christopher C; LeBouf, Ryan F; Calvert, Catherine A; Slaven, James E

    2011-08-01

    The National Institute for Occupational Safety and Health (NIOSH) research on direct-reading instruments (DRIs) needed an instantaneous sampling method to provide independent confirmation of the concentrations of chemical warfare agent (CWA) simulants. It was determined that evacuated canisters would be the method of choice. There is no method specifically validated for volatile organic compounds (VOCs) in the NIOSH Manual of Analytical Methods. The purpose of this study was to validate an evacuated canister method for sampling seven specific VOCs that can be used as a simulant for CWA agents (cyclohexane) or influence the DRI measurement of CWA agents (acetone, chloroform, methylene chloride, methyl ethyl ketone, hexane, and carbon tetrachloride [CCl4]). The method used 6-L evacuated stainless-steel fused silica-lined canisters to sample the atmosphere containing VOCs. The contents of the canisters were then introduced into an autosampler/preconcentrator using a microscale purge and trap (MPT) method. The MPT method trapped and concentrated the VOCs in the air sample and removed most of the carbon dioxide and water vapor. After preconcentration, the samples were analyzed using a gas chromatograph with a mass selective detector. The method was tested, evaluated, and validated using the NIOSH recommended guidelines. The evaluation consisted of determining the optimum concentration range for the method; the sample stability over 30 days; and the accuracy, precision, and bias of the method. This method meets the NIOSH guidelines for six of the seven compounds (excluding acetone) tested in the range of 2.3-50 parts per billion (ppb), making it suitable for sampling of these VOCs at the ppb level.
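    A sketch of the kind of accuracy/precision/bias summary the NIOSH evaluation calls for. The recovery data are invented, and the one-line accuracy estimate (|bias| + 1.96 times the SD of the measured/true ratios) is a common normal-theory shortcut, not the exact NIOSH calculation.

```python
import numpy as np

def bias_precision_accuracy(measured, true_value):
    """Summarise a method against a known concentration: bias,
    precision (relative SD), and an approximate accuracy A such that
    ~95% of results fall within +/-A of the true value."""
    ratios = np.asarray(measured, dtype=float) / true_value
    bias = ratios.mean() - 1.0
    srt = ratios.std(ddof=1)
    accuracy = abs(bias) + 1.96 * srt
    return bias, srt, accuracy

measured = [48.0, 51.5, 49.0, 50.5, 52.0, 47.5]   # hypothetical results, ppb
bias, srt, accuracy = bias_precision_accuracy(measured, 50.0)
# a method is often judged acceptable when accuracy <= 0.25 (i.e. 25%)
```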

  19. Development and validity of a method for the evaluation of printed education material

    PubMed Central

    Castro, Mauro Silveira; Pilger, Diogo; Fuchs, Flávio Danni; Ferreira, Maria Beatriz Cardoso

    Objectives To develop and study the validity of an instrument for evaluation of Printed Education Materials (PEM); to evaluate the use of acceptability indices; to identify possible influences of professional aspects. Methods An instrument for PEM evaluation was developed which included tree steps: domain identification, item generation and instrument design. A reading to easy PEM was developed for education of patient with systemic hypertension and its treatment with hydrochlorothiazide. Construct validity was measured based on previously established errors purposively introduced into the PEM, which served as extreme groups. An acceptability index was applied taking into account the rate of professionals who should approve each item. Participants were 10 physicians (9 men) and 5 nurses (all women). Results Many professionals identified intentional errors of crude character. Few participants identified errors that needed more careful evaluation, and no one detected the intentional error that required literature analysis. Physicians considered as acceptable 95.8% of the items of the PEM, and nurses 29.2%. The differences between the scoring were statistically significant in 27% of the items. In the overall evaluation, 66.6% were considered as acceptable. The analysis of each item revealed a behavioral pattern for each professional group. Conclusions The use of instruments for evaluation of printed education materials is required and may improve the quality of the PEM available for the patients. Not always are the acceptability indices totally correct or represent high quality of information. The professional experience, the practice pattern, and perhaps the gendre of the reviewers may influence their evaluation. An analysis of the PEM by professionals in communication, in drug information, and patients should be carried out to improve the quality of the proposed material. PMID:25214924

  20. Reliability-targeted HPLC-UV method validation--a protocol enrichment perspective.

    PubMed

    Dharuman, Joghee Gowder; Vasudevan, Mahalingam

    2014-02-01

    Method validation is important in analytical chemistry to ensure the reliability of an analytical method. Guidelines provided by the regulatory bodies can be used as a general framework to assess the validity of a method. Since these guidelines do not focus exclusively on the reliability of analytical results, this study aimed to combine a few recently evolved strategies that may render analytical method validation more reliable and trustworthy. In this research, the analytical error function was determined by appropriate polynomial regression statistics, which determine the range of analyte concentrations that may lead to more accurate measurements by producing the least possible total error in the assay, and which can be regarded as a reliable weighting method. The reliability of the analytical results over a particular concentration range was assessed by a Bayesian probability study. In order to ensure the applicability of this approach, it was applied to the validation of an HPLC-UV assay method dedicated to the quantification of cefepime and tazobactam in human plasma. A comparison between the newer approach and the usual method validation revealed that the application of the analytical error function and Bayesian analysis at the end of the validation process can produce significant improvements in the analytical results.
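    The "analytical error function" idea, modelling how the assay SD grows with concentration and then weighting the calibration accordingly, can be sketched as follows. All numbers are hypothetical, and the first-order error model is an assumption (the paper uses polynomial regression without specifying the order here).

```python
import numpy as np

# Hypothetical per-level standard deviations from replicate injections
levels = np.array([0.5, 1.0, 5.0, 10.0, 50.0])     # concentrations
sds = np.array([0.03, 0.05, 0.22, 0.41, 2.1])      # observed SD per level

# error function: SD(c) ~ a + b*c (first-order polynomial fit)
b_coef, a_coef = np.polyfit(levels, sds, 1)
def error_fn(c):
    return a_coef + b_coef * c

# weighted calibration: numpy.polyfit's w multiplies the residuals,
# so w = 1/SD gives the usual 1/variance least-squares weighting
areas = np.array([5.2, 10.1, 49.8, 99.7, 501.0])
slope, intercept = np.polyfit(levels, areas, 1, w=1.0 / error_fn(levels))
```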

  1. Convergent Validity of Three Methods for Measuring Postoperative Complications

    PubMed Central

    Fritz, Bradley A.; Escallier, Krisztina E.; Abdallah, Arbi Ben; Oberhaus, Jordan; Becker, Jennifer; Geczi, Kristin; McKinnon, Sherry; Helsten, Dan L.; Sharma, Anshuman; Wildes, Troy S.; Avidan, Michael S.

    2016-01-01

    Background Anesthesiologists need tools to accurately track postoperative outcomes. The accuracy of patient report in identifying a wide variety of postoperative complications after diverse surgical procedures has not previously been investigated. Methods In this cohort study, 1,578 adult surgical patients completed a survey at least 30 days after their procedure asking if they had experienced any of 18 complications while in the hospital after surgery. Patient responses were compared to the results of an automated electronic chart review and (for a random subset of 750 patients) to a manual chart review. Results from automated chart review were also compared to those from manual chart review. Forty-two randomly selected patients were contacted by telephone to explore reasons for discrepancies between patient report and manual chart review. Results Comparisons between patient report, automated chart review, and manual chart review demonstrated poor-to-moderate positive agreement (range, 0 to 58%) and excellent negative agreement (range, 82 to 100%). Discordance between patient report and manual chart review was frequently explicable by patients reporting events that happened outside the time period of interest. Conclusions Patient report can provide information about subjective experiences or events that happen after hospital discharge, but often yields different results from chart review for specific in-hospital complications. Effective in-hospital communication with patients and thoughtful survey design may increase the quality of patient-reported complication data. PMID:27028469
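    The positive/negative agreement statistics reported above have a simple closed form for two binary raters; a sketch with invented data (one complication, patient report vs. chart review):

```python
def agreement(a, b):
    """Positive and negative agreement between two binary raters:
    pos = 2*both / (2*both + discordant), and symmetrically for neg."""
    both = sum(1 for x, y in zip(a, b) if x and y)
    only_a = sum(1 for x, y in zip(a, b) if x and not y)
    only_b = sum(1 for x, y in zip(a, b) if y and not x)
    neither = sum(1 for x, y in zip(a, b) if not x and not y)
    pos = 2 * both / (2 * both + only_a + only_b)
    neg = 2 * neither / (2 * neither + only_a + only_b)
    return pos, neg

# hypothetical responses: 1 = complication reported/found, 0 = not
patient = [1, 1, 0, 0, 0, 0, 0, 1, 0, 0]
chart   = [1, 0, 0, 0, 0, 0, 0, 0, 0, 0]
pos_agree, neg_agree = agreement(patient, chart)
```

With rare events, negative agreement is high almost automatically, which is why the study's pattern (excellent negative, poor-to-moderate positive agreement) is so common.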

  2. 78 FR 56718 - Draft Guidance for Industry on Bioanalytical Method Validation; Availability

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-13

    ... HUMAN SERVICES Food and Drug Administration Draft Guidance for Industry on Bioanalytical Method Validation; Availability AGENCY: Food and Drug Administration, HHS. ACTION: Notice. SUMMARY: The Food and... revising the guidance to reflect advancements in the science and technology of bioanalytical...

  3. Single Lab Validation of a LC/UV/FLD/MS Method for Simultaneous Determination of Water-soluble Vitamins in Multi-Vitamin Dietary Supplements

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The purpose of this study was to develop a Single-Lab Validated Method using high-performance liquid chromatography (HPLC) with different detectors (diode array detector - DAD, fluorescence detector - FLD, and mass spectrometer - MS) for determination of seven B-complex vitamins (B1 - thiamin, B2 – ...

  4. The Interagency Coordinating Committee on the Validation of Alternative Methods (ICCVAM): a review of the ICCVAM test method evaluation process and current international collaborations with the European Centre for the Validation of Alternative Methods (ECVAM).

    PubMed

    Stokes, William S; Schechtman, Leonard M; Hill, Richard N

    2002-12-01

    Over the last decade, national authorities in the USA and Europe have launched initiatives to validate new and improved toxicological test methods. In the USA, the Interagency Coordinating Committee on the Validation of Alternative Methods (ICCVAM) and its supporting National Toxicology Program Interagency Center for the Evaluation of Alternative Toxicological Methods (NICEATM) were established by the Federal Government to work with test developers and Federal agencies to facilitate the validation, review, and adoption of new scientifically sound test methods, including alternatives that can refine, reduce, and replace animal use. In Europe, the European Centre for the Validation of Alternative Methods (ECVAM) was established to conduct validation studies on alternative test methods. Despite differences in organisational structure and processes, both organisations seek to achieve the adoption and use of alternative test methods. Accordingly, both have adopted similar validation and regulatory acceptance criteria. Collaborations and processes have also evolved to facilitate the international adoption of new test methods recommended by ECVAM and ICCVAM. These collaborations involve the sharing of expertise and data for test-method workshops and independent scientific peer reviews, and the adoption of processes to expedite the consideration of test methods already reviewed by the other organisation. More recently, NICEATM and ECVAM initiated a joint international validation study on in vitro methods for assessing acute systemic toxicity. These collaborations are expected to contribute to accelerated international adoption of harmonised new test methods that will support improved public health and provide for reduced and more-humane use of laboratory animals.

  5. Bioanalytical method validation: concepts, expectations and challenges in small molecule and macromolecule--a report of PITTCON 2013 symposium.

    PubMed

    Bashaw, Edward D; DeSilva, Binodh; Rose, Mark J; Wang, Yow-Ming C; Shukla, Chinmay

    2014-05-01

    The concepts, importance, and implications of bioanalytical method validation have been discussed and debated for a long time. The recent high-profile issues related to bioanalytical method validation at both Cetero Houston and the former MDS Canada have brought this topic back into the limelight. Hence, a symposium on bioanalytical method validation, with the aim of revisiting the building blocks as well as discussing the challenges and implications for the bioanalysis of both small molecules and macromolecules, was featured at the PITTCON 2013 Conference and Expo. This symposium was cosponsored by the American Chemical Society (ACS) Division of Analytical Chemistry and the Analysis and Pharmaceutical Quality (APQ) Section of the American Association of Pharmaceutical Scientists (AAPS) and featured leading speakers from the Food & Drug Administration (FDA), academia, and industry. In this symposium, the speakers shared several unique examples, and the session also provided a platform to discuss the need for continuous vigilance of bioanalytical methods during drug discovery and development. The purpose of this article is to provide a concise report on the materials that were presented.

  6. Development and Validation of a Cultural Method for the Detection and Isolation of Salmonella in Cloves.

    PubMed

    Zhang, Guodong; Ali, Laila; Gill, Vikas; Tatavarthy, Aparna; Deng, Xiaohong; Hu, Lijun; Brown, Eric W; Hammack, Thomas S

    2017-03-01

    Detection of Salmonella in some spices, such as cloves, remains a challenge due to their inherent antimicrobial properties. The purpose of this study was to develop an effective detection method for Salmonella from spices, using cloves as a model. Two clove varieties, Ceylon and Madagascar, were used in the study. Cloves were inoculated with Salmonella enterica subsp. enterica serotypes Montevideo, Typhimurium, or Weltevreden at about 1, 3, or 6 log CFU/25 g. Two test portion sizes, 10 and 25 g, were compared. After adding Trypticase soy broth (TSB) to the weighed cloves for preenrichment, three preenrichment methods were compared: cloves were left in the TSB for 24 h during preenrichment (PreE1), or the cloves-TSB mixture was shaken vigorously for 30 s (PreE2) or 60 s (PreE3), and the decanted material was transferred to a new bag for 24 h of preenrichment. The rest of the procedures were carried out according to the U.S. Food and Drug Administration Bacteriological Analytical Manual (BAM). At the low inoculation level (<1 log CFU/25 g), the detection rate was low across the three preenrichment methods, with the highest for PreE3 and lowest for PreE1. At the medium and high inoculation levels (3 and 6 log CFU/25 g), all samples from PreE2 and PreE3 were positive for Salmonella, whereas PreE1 produced only 12 positive samples from the 48 samples at the medium inoculation level and 38 positive samples from the 48 samples at the high inoculation level. Therefore, PreE3 with 25 g of cloves per sample was more effective than the other two tested methods. This newly designed method was then validated by comparison with the BAM method in six trials, with each trial consisting of 40 test samples. The results showed that PreE3 detected Salmonella in 88 of 120 inoculated test samples, compared with only 31 positive from 120 test samples with the BAM method. Thus, our newly designed method PreE3 was more sensitive and easier to operate than the current BAM method for
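    Taking the counts quoted above at face value (88/120 vs. 31/120) and treating the two series as independent, which is an approximation since the samples were prepared in parallel trials, the difference in detection rates can be checked with a chi-squared test:

```python
import numpy as np
from scipy.stats import chi2_contingency

# rows: PreE3, BAM; columns: Salmonella detected, not detected
table = np.array([[88, 120 - 88],
                  [31, 120 - 31]])
chi2, p_value, dof, _ = chi2_contingency(table)
rate_pre3, rate_bam = table[:, 0] / table.sum(axis=1)
```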

  7. Evaluation of the quantitative performances of supercritical fluid chromatography: from method development to validation.

    PubMed

    Dispas, Amandine; Lebrun, Pierre; Ziemons, Eric; Marini, Roland; Rozet, Eric; Hubert, Philippe

    2014-08-01

    Recently, the number of papers about SFC has increased drastically, but scientists have not truly focused on the quantitative performance of this technique. In order to prove the potential of UHPSFC, the present work discusses the different steps of the analytical life cycle of a method: from development to validation and application. Moreover, the quantitative performance of UHPSFC was evaluated in comparison with UHPLC, the main technique used for quality control in the pharmaceutical industry, which can therefore be considered a reference. The methods were developed using a Design Space strategy, leading to the optimization of robust methods. In this context, when the Design Space optimization demonstrates a guarantee of quality, no further robustness study is required prior to validation. The methods were then geometrically transferred in order to reduce the analysis time. The UHPSFC and UHPLC methods were validated based on the total error approach using accuracy profiles. Even though UHPLC showed better precision and sensitivity, the UHPSFC method is able to give accurate results over a dosing range larger than the 80-120% range required by the European Medicines Agency. Consequently, UHPSFC results are valid and could be used for the control of the active substance in a finished pharmaceutical product. Finally, the validated UHPSFC method was used to analyse real samples and gave results similar to those of the reference method (UHPLC).

  8. Fuzzy-logic based strategy for validation of multiplex methods: example with qualitative GMO assays.

    PubMed

    Bellocchi, Gianni; Bertholet, Vincent; Hamels, Sandrine; Moens, W; Remacle, José; Van den Eede, Guy

    2010-02-01

    This paper illustrates the advantages that a fuzzy-based aggregation method can bring to the validation of a multiplex method for GMO detection (DualChip GMO kit, Eppendorf). Guidelines for the validation of chemical, biochemical, pharmaceutical and genetic methods have been developed, and ad hoc validation statistics are available and routinely used for in-house and inter-laboratory testing and decision-making. Fuzzy logic allows the information obtained by independent validation statistics to be summarised into one synthetic indicator of overall method performance. The microarray technology, introduced for the simultaneous identification of multiple GMOs, poses specific validation issues (patterns of performance for a variety of GMOs at different concentrations). A fuzzy-based indicator for overall evaluation is illustrated in this paper and applied to validation data for different genetically modified elements, and conclusions are drawn from the analytical results. The fuzzy-logic based rules were shown to improve the interpretation of results and to facilitate the overall evaluation of the multiplex method.
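    The aggregation idea described above can be sketched in a few lines: each validation statistic is mapped through a membership function to a [0, 1] degree of "good performance", and the degrees are combined into one synthetic indicator. The statistics, breakpoints, and weights below are hypothetical placeholders, not the calibration used with the DualChip GMO kit.

```python
def trapezoid(x, a, b, c, d):
    """Trapezoidal membership function: 0 outside [a, d], 1 on [b, c],
    linear in between."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

# Hypothetical validation statistics, each rescaled so larger is better.
stats = {"accuracy": 0.92, "precision": 0.85, "sensitivity": 0.70}
weights = {"accuracy": 0.4, "precision": 0.3, "sensitivity": 0.3}

# Map each statistic to a degree of membership in "good performance".
memberships = {k: trapezoid(v, 0.5, 0.9, 1.0, 1.01) for k, v in stats.items()}

# One synthetic indicator of overall method performance.
indicator = sum(weights[k] * memberships[k] for k in stats)
print(indicator)
```

The weighted mean is only one possible aggregation operator; fuzzy frameworks also use min/max or ordered weighted averages depending on how strictly each criterion must be met.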

  9. Dynamic Time Warping compared to established methods for validation of musculoskeletal models.

    PubMed

    Gaspar, Martin; Welke, Bastian; Seehaus, Frank; Hurschler, Christof; Schwarze, Michael

    2017-04-11

    By means of multi-body musculoskeletal simulation, important variables such as internal joint forces and moments, which cannot be measured directly, can be estimated. Validation can proceed by qualitative or by quantitative methods. Especially when comparing time-dependent signals, many methods do not perform well, and validation is often limited to qualitative approaches. The aim of the present study was to investigate the capabilities of the Dynamic Time Warping (DTW) algorithm for comparing time series, which can quantify phase as well as amplitude errors. We contrast the sensitivity of DTW with other established metrics: the Pearson correlation coefficient, cross-correlation, the metric according to Geers, RMSE, and normalized RMSE (nRMSE). This study is based on two data sets, one representing direct validation and the other indirect validation. Direct validation was performed in the context of clinical gait analysis on trans-femoral amputees fitted with a six-component force-moment sensor; measured forces and moments from the amputees' socket-prosthesis were compared to simulated forces and moments. Indirect validation was performed in the context of surface EMG measurements on a cohort of healthy subjects, with measurements taken of seven muscles of the leg and compared to simulated muscle activations. Regarding direct validation, a positive linear relation can be seen between the results of RMSE and nRMSE and those of DTW. For indirect validation, a negative linear relation exists between DTW and both Pearson correlation and cross-correlation. We propose the DTW algorithm for use in both direct and indirect quantitative validation, as it correlates well with the methods that are most suitable for each task. However, in direct validation it should be used together with methods yielding a dimensional error value, so that results can be interpreted more readily.
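    The core of DTW can be sketched in a few lines. This is the generic textbook formulation (absolute difference as local cost, standard step pattern), not the specific implementation or metrics used in the study:

```python
import numpy as np

def dtw_distance(x, y):
    """Minimal Dynamic Time Warping: minimum cumulative alignment cost
    between two 1-D signals, with |x_i - y_j| as the local cost."""
    n, m = len(x), len(y)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(x[i - 1] - y[j - 1])
            D[i, j] = cost + min(D[i - 1, j],      # stretch x
                                 D[i, j - 1],      # stretch y
                                 D[i - 1, j - 1])  # match
    return D[n, m]

t = np.linspace(0, 2 * np.pi, 100)
measured = np.sin(t)
simulated = np.sin(t - 0.3)  # same amplitude, phase-shifted

print(dtw_distance(measured, measured))   # 0.0 for identical signals
print(dtw_distance(measured, simulated))  # small: DTW absorbs the phase shift
```

Because the diagonal path is always admissible, the DTW cost is never larger than the point-wise L1 distance; a pure phase shift, which inflates RMSE, leaves DTW almost unchanged, which is what makes it attractive for comparing time-dependent signals.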

  10. Development and Validation of UV-Visible Spectrophotometric Method for Simultaneous Determination of Eperisone and Paracetamol in Solid Dosage Form

    PubMed Central

    Khanage, Shantaram Gajanan; Mohite, Popat Baban; Jadhav, Sandeep

    2013-01-01

    Purpose: Eperisone Hydrochloride (EPE) is a potent new-generation antispasmodic drug used in the treatment of moderate to severe pain in combination with Paracetamol (PAR). Both drugs are available in a combination tablet dosage form at doses of 50 mg EPE and 325 mg PAR, respectively. Methods: The method is based on the Q-absorbance ratio approach for the simultaneous determination of EPE and PAR. The absorbance ratio method uses the ratio of absorbances at two selected wavelengths, one of which is an iso-absorptive point and the other the λmax of one of the two components. EPE and PAR show an iso-absorptive point at 260 nm in methanol; the second wavelength used is 249 nm, the λmax of PAR in methanol. Results: Linearity was obtained in the concentration ranges of 5-25 μg/mL for EPE and 2-10 μg/mL for PAR. The proposed method was effectively applied to the tablet dosage form for the estimation of both drugs. The accuracy and reproducibility results were close to 100%, with RSD within 2%. The results of the analysis were validated statistically and found to be satisfactory, and the proposed method was validated as per ICH guidelines. Conclusion: A simple, precise and economical spectrophotometric method has been developed for the estimation of EPE and PAR in a pharmaceutical formulation. PMID:24312876
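    At its core, any two-wavelength determination of a binary mixture, including the Q-absorbance ratio method, reduces to solving two Beer-Lambert equations in two unknown concentrations. The sketch below shows this with illustrative absorptivity values, not figures from the paper:

```python
import numpy as np

# Beer-Lambert at two wavelengths: A = eps_EPE * C_EPE + eps_PAR * C_PAR.
# Absorptivities (AU per ug/mL) are placeholders for illustration only.
# Row 1: 260 nm (iso-absorptive point, so both absorptivities are equal);
# row 2: 249 nm (lambda-max of PAR).
E = np.array([[0.040, 0.040],   # 260 nm
              [0.025, 0.068]])  # 249 nm

c_true = np.array([15.0, 6.0])  # ug/mL of EPE and PAR in a mixture
A = E @ c_true                  # simulated mixture absorbances

# Recover both concentrations from the two absorbance readings.
c_est = np.linalg.solve(E, A)
print(c_est)  # recovers c_true up to floating point
```

The iso-absorptive point makes the first equation depend only on the total concentration, which is what lets the Q-ratio formulation split the two contributions using a single additional wavelength.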

  11. Development and validation of an ionic chromatography method for the determination of nitrate, nitrite and chloride in meat.

    PubMed

    Lopez-Moreno, Cristina; Perez, Isabel Viera; Urbano, Ana M

    2016-03-01

    The purpose of this study was to develop and validate a method for the analysis of certain preservatives in meat and to obtain a suitable Certified Reference Material (CRM) for this task. The preservatives studied were NO3(-), NO2(-) and Cl(-), as they serve as important antimicrobial agents in meat, inhibiting the growth of spoilage bacteria. The meat samples were prepared using a treatment that allowed the production of a CRM of known concentration that is highly homogeneous and stable over time. Matrix effects were also studied to evaluate their influence on the analytical signal of the ions of interest, showing that the matrix does not affect the final result. An assessment of the variation of the signal over time was carried out for the ions. Although the chloride and nitrate signals remained stable for the duration of the study, the nitrite signal decreased appreciably with time; a mathematical treatment of the data gave a stable nitrite signal, yielding a method suitable for the determination of these anions in meat. A statistical study was needed for the validation of the method, in which precision, accuracy, uncertainty and other parameters were evaluated, with satisfactory results.

  12. Validation of an in-vivo proton beam range check method in an anthropomorphic pelvic phantom using dose measurements

    SciTech Connect

    Bentefour, El H. Prieels, Damien; Tang, Shikui; Cascio, Ethan W.; Testa, Mauro; Lu, Hsiao-Ming; Samuel, Deepak; Gottschalk, Bernard

    2015-04-15

    Purpose: In-vivo dosimetry and beam range verification in proton therapy could play a significant role in proton treatment validation and improvement. In-vivo beam range verification, in particular, could enable new treatment techniques, one of which could be the use of anterior fields for prostate treatment instead of the opposed lateral fields of current practice. This paper reports a validation study of an in-vivo range verification method which can reduce the range uncertainty to submillimeter levels and potentially allow for in-vivo dosimetry. Methods: An anthropomorphic pelvic phantom was used to validate the clinical potential of the time-resolved dose method for range verification in the case of prostate treatment using range-modulated anterior proton beams. The method uses a 3 × 4 matrix of 1 mm diodes mounted in a water balloon and read by an ADC system at 100 kHz. The method was first validated against beam range measurements obtained by dose extinction, first in a water phantom and then in the pelvic phantom, for both open-field and treatment-field configurations. The beam range results were then compared with the water equivalent path length (WEPL) values computed from the treatment planning system XIO. Results: Beam range measurements from the time-resolved dose method and the dose extinction method agree with submillimeter precision in the water phantom. For the pelvic phantom, when discarding two of the diodes that showed signs of significant range mixing, the two methods agree within ±1 mm. A dose of only 7 mGy is sufficient to achieve this result. The comparison to the WEPL computed by the treatment planning system (XIO) shows that XIO underestimates the proton beam range. Quantifying the exact XIO range underestimation depends on the strategy used to evaluate the WEPL results; to our best evaluation, XIO underestimates the treatment beam range by a minimum of 1.7% and a maximum of 4.1%. Conclusions: Time-resolved dose

  14. Validation of the greater trochanter method with radiographic measurements of frontal plane hip joint centers and knee mechanical axis angles and two other hip joint center methods.

    PubMed

    Bennett, Hunter J; Shen, Guangping; Weinhandl, Joshua T; Zhang, Songning

    2016-09-06

    Several motion capture methods exist for predicting hip joint centers (HJCs), including regression models, functional joint methods, and projections from the greater trochanters. While regression and functional methods have been compared to imaging techniques, the greater trochanter method has not previously been validated. The purpose of this study was to compare frontal-plane HJCs and knee mechanical axis angles estimated using the greater trochanter method, a regression method (Bell), and a functional method against those obtained using radiographs. Thirty-five participants underwent a long-standing anteroposterior radiograph and performed static and functional motion capture trials. The Bell, functional, and trochanter HJCs were constructed to predict mechanical axes and to compare HJC locations. One-way repeated-measures ANOVAs were used to compare mechanical axes and HJC locations estimated by the motion capture methods with those measured using radiographs (p < 0.05). All methods overestimated mechanical axes compared to radiographs (by <2°) but did not differ from one another. Mediolateral HJC locations and inter-HJC widths were similar between methods; however, inter-HJC widths were underestimated (by 3.7% on average) compared to radiographs. The Bell HJC was more superior and anterior than both the functional and trochanter HJCs, and the trochanter HJC was more posterior than both other methods. The Bell method outperformed the other methods in leg length predictions compared to radiographs. Although differences existed between methods, all frontal-plane HJC location differences were <1.7 cm. This study validated the trochanter HJC prediction method mediolaterally and vertically (with small respective correction factors). Therefore, all HJC methods appear viable for predicting mechanical axes and frontal-plane HJC locations compared with radiographs.

  15. Validation of high-performance thin-layer chromatographic methods for the identification of botanicals in a cGMP environment.

    PubMed

    Reich, Eike; Schibli, Anne; DeBatt, Alison

    2008-01-01

    Current Good Manufacturing Practices (cGMP) for botanicals stipulate the use of appropriate methods for the identification of raw materials. Due to natural variability, chemical analysis of plant material is a great challenge and requires special approaches. This paper presents a comprehensive proposal for the process of validating qualitative high-performance thin-layer chromatographic (HPTLC) methods, proving that such methods are suitable for the purpose. The steps of the validation process are discussed and illustrated with examples taken from a project aimed at validating methods for the identification of green tea leaf, ginseng root, eleuthero root, echinacea root, black cohosh rhizome, licorice root, kava root, milk thistle aerial parts, feverfew aerial parts, and ginger root. The appendix of the paper, which includes complete documentation and method write-ups for those plants, is available on the J. AOAC Int. website (http://www.atypon-link.com/AOAC/loi/jaoi).

  16. Increased efficacy for in-house validation of real-time PCR GMO detection methods.

    PubMed

    Scholtens, I M J; Kok, E J; Hougs, L; Molenaar, B; Thissen, J T N M; van der Voet, H

    2010-03-01

    To improve the efficacy of the in-house validation of GMO detection methods (DNA isolation and real-time PCR, polymerase chain reaction), a study was performed to gain insight into the contribution of the different steps of a GMO detection method to repeatability and in-house reproducibility. In the present study, 19 methods for (GM) soy, maize, canola, and potato were validated in-house, 14 of them on the basis of an 8-day validation scheme using eight different samples and five on the basis of a more concise validation protocol. In this way, data were obtained on the detection limit, accuracy, and precision. Decision limits were also calculated for declaring non-conformance (>0.9%) with 95% reliability. In order to estimate the contribution of the different steps in the GMO analysis to the total variation, variance components were estimated using REML (residual maximum likelihood). From these components, relative standard deviations for repeatability and reproducibility (RSD(r) and RSD(R)) were calculated. The results showed that not only the PCR reaction but also the factors 'DNA isolation' and 'PCR day' contribute importantly to the total variance and should therefore be included in the in-house validation. It is proposed to use a statistical model to estimate these factors from a large dataset of initial validations so that, for similar GMO methods in the future, only the PCR step needs to be validated. The resulting data are discussed in the light of agreed European criteria for qualified GMO detection methods.
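    The step from estimated variance components to RSD(r) and RSD(R) is a short calculation: repeatability uses only the within-run component, while in-house reproducibility adds the between-day and between-isolation components. The component values and mean below are illustrative placeholders, not data from the study:

```python
import math

# Illustrative variance components (squared SDs) for one hypothetical
# GMO method; not values from the study.
var_pcr = 0.010  # within-run PCR variance (repeatability component)
var_day = 0.006  # between-PCR-day variance
var_dna = 0.008  # between-DNA-isolation variance
mean_gm = 0.9    # mean measured GM content (%)

# Repeatability RSD: within-run variation only.
rsd_r = math.sqrt(var_pcr) / mean_gm * 100

# In-house reproducibility RSD: within-run plus day and isolation effects.
rsd_R = math.sqrt(var_pcr + var_day + var_dna) / mean_gm * 100

print(round(rsd_r, 1), round(rsd_R, 1))
```

Because the components add under the square root, RSD(R) is always at least RSD(r); the gap between the two is exactly the contribution of the 'DNA isolation' and 'PCR day' factors that the study argues must be part of in-house validation.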

  17. Verification, Validation, and Solution Quality in Computational Physics: CFD Methods Applied to Ice Sheet Physics

    NASA Technical Reports Server (NTRS)

    Thompson, David E.

    2005-01-01

    Procedures and methods for verification of coding algebra and for validation of models and calculations used in the aerospace computational fluid dynamics (CFD) community would be efficacious if used by the glacier dynamics modeling community. This paper presents some of those methods, and how they might be applied to uncertainty management supporting code verification and model validation for glacier dynamics. The similarities and differences between their use in CFD analysis and the proposed application of these methods to glacier modeling are discussed. After establishing sources of uncertainty and methods for code verification, the paper looks at a representative sampling of verification and validation efforts that are underway in the glacier modeling community, and establishes a context for these within an overall solution quality assessment. Finally, a vision of a new information architecture and interactive scientific interface is introduced and advocated.

  18. An experimental validation method for questioning techniques that assess sensitive issues.

    PubMed

    Moshagen, Morten; Hilbig, Benjamin E; Erdfelder, Edgar; Moritz, Annie

    2014-01-01

    Studies addressing sensitive issues often yield distorted prevalence estimates due to socially desirable responding. Several techniques have been proposed to reduce this bias, including indirect questioning, psychophysiological lie detection, and bogus pipeline procedures. However, the increase in resources required by these techniques is warranted only if there is a substantial increase in validity compared to direct questions. A convincing demonstration of superior validity requires a criterion reflecting the "true" prevalence of a sensitive attribute. Unfortunately, such criteria are notoriously difficult to obtain, which is why validation studies often proceed indirectly by simply comparing estimates obtained with different methods. Comparative validation studies, however, provide only weak evidence, since the exact increase in validity (if any) remains unknown. To remedy this problem, we propose a simple method that allows the "true" prevalence of a sensitive behavior to be measured experimentally. The basic idea is to elicit normatively problematic behavior in a way that ensures conclusive knowledge of the prevalence rate of this behavior. This prevalence measure can then serve as an external validation criterion in a second step. An empirical demonstration of this method is provided.

  19. Validation of an extraction method for Cry1Ab protein from soil.

    PubMed

    Mueting, Sara A; Strain, Katherine E; Lydy, Michael J

    2014-01-01

    Corn expressing insecticidal proteins derived from Bacillus thuringiensis (Bt corn) has increased in usage in the United States from 8% of total corn acreage in 1996 to 67% in 2012. Because of this increase, it is important to be able to monitor the fate and transport of the insecticidal Bt proteins to evaluate environmental exposure and effects. Accurate and validated methods are needed to quantify these proteins in environmental matrices. A method to extract Bt Cry1Ab proteins from 3 soil types using a 10× phosphate-buffered saline with Tween buffer and a commercially available enzyme-linked immunosorbent assay (ELISA) was validated through a series of 6 tests. The validation process for Cry1Ab extractions in soil has not yet been reported in the scientific literature. The extraction buffer and each soil matrix were tested and validated for the ELISA. Extraction efficiencies were 41%, 74%, and 89% for the 3 soil types and were significantly correlated with the organic matter content of the soil. Despite low recoveries, consistent results with low coefficients of variation allowed for accurate measurements. Through validating this method with 3 different soils, a sensitive, specific, precise, and accurate quantification of Bt Cry1Ab was developed. The validation process can be expanded and implemented in other environmental matrices, adding consistency to data across a wide range of samples.

  20. Validation procedures for quantitative gluten ELISA methods: AOAC allergen community guidance and best practices.

    PubMed

    Koerner, Terry B; Abbott, Michael; Godefroy, Samuel Benrejeb; Popping, Bert; Yeung, Jupiter M; Diaz-Amigo, Carmen; Roberts, James; Taylor, Steve L; Baumert, Joseph L; Ulberth, Franz; Wehling, Paul; Koehler, Peter

    2013-01-01

    The food allergen analytical community is endeavoring to create harmonized guidelines for the validation of food allergen ELISA methodologies to help protect food-sensitive individuals and promote consumer confidence. This document provides additional guidance to existing method validation publications for quantitative food allergen ELISA methods. The gluten-specific criteria provided in this document are divided into sections for information required by the method developer about the assay and information for the implementation of the multilaboratory validation study. Many of these recommendations and guidance are built upon the widely accepted Codex Alimentarius definitions and recommendations for gluten-free foods. The information in this document can be used as the basis of a harmonized validation protocol for any ELISA method for gluten, whether proprietary or nonproprietary, that will be submitted to AOAC and/or regulatory authorities or other bodies for status recognition. Future work is planned for the implementation of this guidance document for the validation of gluten methods and the creation of gluten reference materials.

  1. Bridging the gap between comprehensive extraction protocols in plant metabolomics studies and method validation.

    PubMed

    Bijttebier, Sebastiaan; Van der Auwera, Anastasia; Foubert, Kenn; Voorspoels, Stefan; Pieters, Luc; Apers, Sandra

    2016-09-07

    It is vital to pay close attention to the design of extraction methods developed for plant metabolomics, as any non-extracted or converted metabolites will greatly affect the overall quality of the metabolomics study. Method validation is, however, often omitted in plant metabolome studies, as the well-established methodologies for classical targeted analyses, such as recovery optimization, cannot be strictly applied. The aim of the present study is to thoroughly evaluate state-of-the-art comprehensive extraction protocols for plant metabolomics with liquid chromatography-photodiode array-accurate mass mass spectrometry (LC-PDA-amMS) by bridging the gap with method validation. Validation of an extraction protocol in untargeted plant metabolomics should ideally be accomplished by validating the protocol for all possible outcomes, i.e. for all secondary metabolites potentially present in the plant. In an effort to approach this ideal validation scenario, two plant matrices were selected based on their wide versatility of phytochemicals: meadowsweet (Filipendula ulmaria) for its polyphenol content, and spicy paprika powder (from the genus Capsicum) for its apolar phytochemical content (carotenoids, phytosterols, capsaicinoids). These matrices were extracted with comprehensive extraction protocols adapted from the literature and analysed with a generic LC-PDA-amMS characterization platform previously validated for broad-range phytochemical analysis. The performance of the comprehensive sample preparation protocols was assessed based on extraction efficiency, repeatability, intermediate precision, and evaluation of ionization suppression/enhancement. The manuscript elaborates on the finding that none of the extraction methods allowed exhaustive extraction of the metabolites. Furthermore, it is shown that, depending on the extraction conditions, enzymatic degradation mechanisms can occur. Investigation of the fractions obtained with the different extraction methods

  2. Inter-laboratory validation of standardized method to determine permeability of plastic films

    Technology Transfer Automated Retrieval System (TEKTRAN)

    To support regulations controlling soil fumigation, we are standardizing the laboratory method we developed to measure the permeability of plastic films to fumigant vapors. The method was validated using an inter-laboratory comparison with 7 participants. Each participant evaluated the mass transfer...

  3. Method of validating measurement data of a process parameter from a plurality of individual sensor inputs

    DOEpatents

    Scarola, Kenneth; Jamison, David S.; Manazir, Richard M.; Rescorl, Robert L.; Harmon, Daryl L.

    1998-01-01

    A method for generating a validated measurement of a process parameter at a point in time by using a plurality of individual sensor inputs from a scan of said sensors at said point in time. The sensor inputs from said scan are stored, and a first validation pass is initiated by computing an initial average of all stored sensor inputs. Each sensor input is deviation checked by comparing each input, including a preset tolerance, against the initial average. If the first deviation check is unsatisfactory, the sensor which produced the unsatisfactory input is flagged as suspect. It is then determined whether at least two of the inputs have not been flagged as suspect and are therefore considered good inputs. If two or more inputs are good, a second validation pass is initiated by computing a second average of all the good sensor inputs and deviation checking the good inputs by comparing each good input, including a preset tolerance, against the second average. If the second deviation check is satisfactory, the second average is displayed as the validated measurement and any suspect sensor is flagged as bad. A validation fault occurs if at least two inputs are not considered good, or if the second deviation check is not satisfactory. In the latter situation, the inputs from all the sensors are compared against the last validated measurement, and the value from the sensor input that deviates the least from the last validated measurement is displayed.
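    The two-pass scheme described above can be sketched as follows. The function name, the example readings, and the exact handling of the tolerance are illustrative simplifications of the patent's description:

```python
def validate_scan(inputs, tolerance, last_valid):
    """Two-pass sensor validation sketch.
    inputs: one reading per sensor from a single scan;
    tolerance: allowed deviation from the pass average;
    last_valid: previous validated measurement, used as a fallback."""
    # First pass: average all inputs, flag outliers as suspect.
    first_avg = sum(inputs) / len(inputs)
    good = [x for x in inputs if abs(x - first_avg) <= tolerance]

    if len(good) >= 2:
        # Second pass: re-average only the good inputs and re-check them.
        second_avg = sum(good) / len(good)
        if all(abs(x - second_avg) <= tolerance for x in good):
            return second_avg  # the validated measurement

    # Validation fault: fall back to the input closest to the
    # last validated measurement.
    return min(inputs, key=lambda x: abs(x - last_valid))

# One sensor (14.0) disagrees; the other three are averaged.
print(validate_scan([10.0, 10.2, 9.9, 14.0], tolerance=1.5, last_valid=10.0))
```

Note that a single large outlier pulls the first-pass average toward itself, so the tolerance must be wide enough that the consistent sensors still pass the first check; this is why the second, outlier-free average is the one displayed.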

  4. Reliability and validity of the AutoCAD software method in lumbar lordosis measurement

    PubMed Central

    Letafatkar, Amir; Amirsasan, Ramin; Abdolvahabi, Zahra; Hadadnezhad, Malihe

    2011-01-01

    Objective The aim of this study was to determine the reliability and validity of the AutoCAD software method for lumbar lordosis measurement. Methods Fifty healthy volunteers with a mean age of 23 ± 1.80 years were enrolled. A lateral lumbar radiograph was taken of all participants, and the lordosis was measured according to the Cobb method. Afterward, the degree of lumbar lordosis was measured via the AutoCAD software and flexible ruler methods. The study was conducted in two parts: intratester and intertester evaluations of reliability, as well as the validity of the flexible ruler and software methods. Results Based on the intraclass correlation coefficient, the reliability and validity of AutoCAD in measuring lumbar lordosis were 0.984 and 0.962, respectively. Conclusions AutoCAD was shown to be a reliable and valid method to measure lordosis. It is suggested that this method may replace those that are costly or involve health risks, such as radiography, in evaluating lumbar lordosis. PMID:22654681

  5. Validation of USP apparatus 4 method for microsphere in vitro release testing using Risperdal Consta.

    PubMed

    Rawat, Archana; Stippler, Erika; Shah, Vinod P; Burgess, Diane J

    2011-11-28

    The current manuscript addresses the need for a validated in vitro release testing method for controlled-release parenteral microspheres. A USP apparatus 4 method was validated with the objective of possible compendial adaptation for microsphere in vitro release testing. Commercial microspheres (Risperdal Consta) were used for method validation. Accelerated and real-time release tests were conducted. The accelerated method had a significantly reduced test duration and showed a good correlation with the real-time release profile (with a limited number of sample analyses). Accelerated conditions were used for method validation (robustness and reproducibility). The robustness testing results revealed that release from the microspheres was not flow-rate dependent and was not affected by minor variations in the method (such as cell preparation technique, amount of microspheres, flow-through cell size, and size of glass beads). The significant difference in the release profile with small variations (±0.5°C) in temperature was shown to be due to a change in risperidone-catalyzed PLGA degradation in response to temperature. The accelerated method was reproducible, as changing the system/equipment or the analyst did not affect the release profile. This work establishes the suitability of the modified USP apparatus 4 for possible compendial adaptation for drug release testing of microspheres.

  6. Comprehensive validation scheme for in situ fiber optics dissolution method for pharmaceutical drug product testing.

    PubMed

    Mirza, Tahseen; Liu, Qian Julie; Vivilecchia, Richard; Joshi, Yatindra

    2009-03-01

    There has been growing interest during the past decade in the use of fiber optics for dissolution testing. Use of this novel technology is mainly confined to research and development laboratories; it has not yet emerged as a tool for end-product release testing despite its ability to generate in situ results and improve efficiency. One potential reason may be the lack of clear validation guidelines that can be applied to assess the suitability of fiber optics. This article describes a comprehensive validation scheme and the development of a reliable, robust, reproducible, and cost-effective dissolution test using fiber optics technology. The test was successfully applied to characterize the dissolution behavior of a 40-mg immediate-release tablet dosage form under development at Novartis Pharmaceuticals, East Hanover, New Jersey. The method was validated for the following parameters: linearity, precision, accuracy, specificity, and robustness. In particular, robustness was evaluated in terms of probe sampling depth and probe orientation. The in situ fiber optic method was found to be comparable to the existing manual-sampling dissolution method. Finally, the fiber optic dissolution test was successfully performed by different operators on different days to further establish the validity of the method. The results demonstrate that fiber optics technology can be successfully validated for end-product dissolution/release testing.

  7. Display format, highlight validity, and highlight method: Their effects on search performance

    NASA Technical Reports Server (NTRS)

    Donner, Kimberly A.; Mckay, Tim D.; Obrien, Kevin M.; Rudisill, Marianne

    1991-01-01

    Display format and highlight validity were shown to affect visual display search performance; however, these studies were conducted on small, artificial displays of alphanumeric stimuli. A study manipulating these variables was conducted using realistic, complex Space Shuttle information displays. A 2x2x3 within-subjects analysis of variance found that search times were faster for items in reformatted displays than in current displays. Responses to validly highlighted items were significantly faster than responses to non-highlighted or invalidly highlighted items. The significant format by highlight validity interaction showed that there was little difference in response time between current and reformatted displays when highlighting was valid; however, under the non-highlighted or invalid-highlight conditions, search times were faster with reformatted displays. A separate within-subjects analysis of variance of display format, highlight validity, and several highlight methods did not reveal a main effect of highlight method. In addition, observed display search times were compared to search times predicted by Tullis' Display Analysis Program. The benefits of highlighting and reformatting displays to enhance search, and the necessity of considering highlight validity and format characteristics in tandem when predicting search performance, are discussed.

  8. Development and assessment of disinfectant efficacy test methods for regulatory purposes.

    PubMed

    Tomasino, Stephen F

    2013-05-01

    The United States Environmental Protection Agency regulates pesticidal products, including products with antimicrobial activity. Test guidelines have been established to inform manufacturers of which methodology is appropriate to support a specific efficacy claim. This paper highlights efforts designed to improve current methods and the development and assessment of new test methods.

  9. 40 CFR 712.5 - Method of identification of substances for reporting purposes.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... chemical have no reporting responsibilities under this Part. Note, however, that any method of extraction... (CONTINUED) TOXIC SUBSTANCES CONTROL ACT CHEMICAL INFORMATION RULES General Provisions § 712.5 Method of... otherwise required, respondents must report only about quantities of a chemical that is defined as...

  10. 40 CFR 712.5 - Method of identification of substances for reporting purposes.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... chemical have no reporting responsibilities under this Part. Note, however, that any method of extraction... (CONTINUED) TOXIC SUBSTANCES CONTROL ACT CHEMICAL INFORMATION RULES General Provisions § 712.5 Method of... otherwise required, respondents must report only about quantities of a chemical that is defined as...

  11. 40 CFR 712.5 - Method of identification of substances for reporting purposes.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... chemical have no reporting responsibilities under this Part. Note, however, that any method of extraction... (CONTINUED) TOXIC SUBSTANCES CONTROL ACT CHEMICAL INFORMATION RULES General Provisions § 712.5 Method of... otherwise required, respondents must report only about quantities of a chemical that is defined as...

  12. From the Bronx to Bengifunda (and Other Lines of Flight): Deterritorializing Purposes and Methods in Science Education Research

    ERIC Educational Resources Information Center

    Gough, Noel

    2011-01-01

    In this essay I explore a number of questions about purposes and methods in science education research prompted by my reading of Wesley Pitts' ethnographic study of interactions among four students and their teacher in a chemistry classroom in the Bronx, New York City. I commence three "lines of flight" (small acts of Deleuzo-Guattarian…

  13. Defining the knee joint flexion-extension axis for purposes of quantitative gait analysis: an evaluation of methods.

    PubMed

    Schache, Anthony G; Baker, Richard; Lamoreux, Larry W

    2006-08-01

    Minimising measurement variability associated with hip axial rotation and avoiding knee joint angle cross-talk are two fundamental objectives of any method used to define the knee joint flexion-extension axis for purposes of quantitative gait analysis. The aim of this experiment was to compare three different methods of defining this axis: the knee alignment device (KAD) method, a method based on the transepicondylar axis (TEA) and an alternative numerical method (Dynamic). The former two are common approaches that have been applied clinically in many quantitative gait analysis laboratories; the latter is an optimisation procedure. A cohort of 20 subjects performed three different functional tasks (normal gait; squat; non-weight-bearing knee flexion) on repeated occasions. Three-dimensional hip and knee angles were computed using the three alternative methods of defining the knee joint flexion-extension axis. The repeatability of hip axial rotation measurements during normal gait was significantly better for the Dynamic method (p<0.01). Furthermore, both the variance in the knee varus-valgus kinematic profile and the degree of knee joint angle cross-talk were smallest for the Dynamic method across all functional tasks. The Dynamic method therefore provided superior results in comparison to the KAD and TEA-based methods and thus represents an attractive solution for orientating the knee joint flexion-extension axis for purposes of quantitative gait analysis.

  14. Validation of a method for the analysis of biogenic amines: histamine instability during wine sample storage.

    PubMed

    Bach, Benoit; Le Quere, Stephanie; Vuchot, Patrick; Grinbaum, Magali; Barnavon, Laurent

    2012-06-30

    This paper reports on the development of an optimized method for the simultaneous analysis of eight biogenic amines (histamine, methylamine, ethylamine, tyramine, putrescine, cadaverine, phenethylamine, and isoamylamine). The proposed analytical method has the following advantages: easy derivatization of wine, quantification of biogenic amines, and complete degradation of excess derivatization reagent during sample preparation in order to preserve the column. It consists of reversed-phase HPLC separation and UV-vis detection of the aminoenones formed by the reaction of amino compounds with the derivatization reagent diethyl ethoxymethylenemalonate (DEEMM). The usefulness of this technique was confirmed following the oenological guidelines for validation, quality control and uncertainty assessment (OIV Oeno 10/2005). The method was validated and proposed as a reference method to the International Organization of Vine and Wine (OIV). As a specific application of the proposed method, the biogenic amine content of Rhône valley wines was investigated.

  15. Novel validated spectrofluorimetric methods for the determination of taurine in energy drinks and human urine.

    PubMed

    Sharaf El Din, M K; Wahba, M E K

    2015-03-01

    Two sensitive, selective, economical and validated spectrofluorimetric methods were developed for the determination of taurine in energy drinks and spiked human urine. Method I is based on fluorimetric determination of the amino acid through its reaction with Hantzsch reagent to form a highly fluorescent product measured at 490 nm after excitation at 419 nm. Method II is based on the reaction of taurine with tetracyanoethylene, yielding a fluorescent charge-transfer complex measured at λex/λem of 360/450 nm. The proposed methods were subjected to detailed validation procedures and were statistically compared with the reference method, with which the results obtained were in good agreement. Method I was further applied to determine taurine in energy drinks and spiked human urine, giving promising results. Moreover, the stoichiometry of the reactions was studied, and reaction mechanisms were postulated.

  16. Validity and Feasibility of a Digital Diet Estimation Method for Use with Preschool Children: A Pilot Study

    ERIC Educational Resources Information Center

    Nicklas, Theresa A.; O'Neil, Carol E.; Stuff, Janice; Goodell, Lora Suzanne; Liu, Yan; Martin, Corby K.

    2012-01-01

    Objective: The goal of the study was to assess the validity and feasibility of a digital diet estimation method for use with preschool children in "Head Start." Methods: Preschool children and their caregivers participated in validation (n = 22) and feasibility (n = 24) pilot studies. Validity was determined in the metabolic research unit using…

  17. 27 CFR 46.261 - Purpose of an alternate method or procedure.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... TOBACCO PRODUCTS AND CIGARETTE PAPERS AND TUBES Floor Stocks Tax on Certain Tobacco Products, Cigarette Papers, and Cigarette Tubes Held for Sale on April 1, 2009 Alternate Methods Or Procedures §...

  18. 27 CFR 46.261 - Purpose of an alternate method or procedure.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... TOBACCO PRODUCTS AND CIGARETTE PAPERS AND TUBES Floor Stocks Tax on Certain Tobacco Products, Cigarette Papers, and Cigarette Tubes Held for Sale on April 1, 2009 Alternate Methods Or Procedures §...

  19. 27 CFR 46.261 - Purpose of an alternate method or procedure.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... TOBACCO PRODUCTS AND CIGARETTE PAPERS AND TUBES Floor Stocks Tax on Certain Tobacco Products, Cigarette Papers, and Cigarette Tubes Held for Sale on April 1, 2009 Alternate Methods Or Procedures §...

  20. 27 CFR 46.261 - Purpose of an alternate method or procedure.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... TOBACCO PRODUCTS AND CIGARETTE PAPERS AND TUBES Floor Stocks Tax on Certain Tobacco Products, Cigarette Papers, and Cigarette Tubes Held for Sale on April 1, 2009 Alternate Methods Or Procedures §...

  1. 27 CFR 46.261 - Purpose of an alternate method or procedure.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... TOBACCO PRODUCTS AND CIGARETTE PAPERS AND TUBES Floor Stocks Tax on Certain Tobacco Products, Cigarette Papers, and Cigarette Tubes Held for Sale on April 1, 2009 Alternate Methods Or Procedures §...

  2. Developing and validating the Youth Conduct Problems Scale-Rwanda: a mixed methods approach.

    PubMed

    Ng, Lauren C; Kanyanganzi, Frederick; Munyanah, Morris; Mushashi, Christine; Betancourt, Theresa S

    2014-01-01

    This study developed and validated the Youth Conduct Problems Scale-Rwanda (YCPS-R). Qualitative free listing (n = 74) and key informant interviews (n = 47) identified local conduct problems, which were compared to existing standardized conduct problem scales and used to develop the YCPS-R. The YCPS-R was cognitively tested with 12 youth and caregiver participants, and assessed for test-retest and inter-rater reliability in a sample of 64 youth. Finally, a purposive sample of 389 youth and their caregivers was enrolled in a validity study. Validity was assessed by comparing YCPS-R scores to conduct disorder, diagnosed with the Mini International Neuropsychiatric Interview for Children, and to functional impairment scores on the World Health Organization Disability Assessment Schedule Child Version. ROC analyses assessed the YCPS-R's ability to discriminate between youth with and without conduct disorder. Qualitative data identified a local presentation of youth conduct problems that did not match previously standardized measures; therefore, the YCPS-R was developed solely from local conduct problems. Cognitive testing indicated that the YCPS-R was understandable and required little modification. The YCPS-R demonstrated good reliability; construct, criterion, and discriminant validity; and fair classification accuracy. The YCPS-R is a locally derived measure of Rwandan youth conduct problems that demonstrated good psychometric properties and could be used for further research.
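
    Discrimination of the kind assessed above is usually summarized by the area under the ROC curve (AUC). A minimal sketch using the rank-based (Mann-Whitney) formulation; the scale scores below are hypothetical, not the study's data:

```python
def roc_auc(scores_pos, scores_neg):
    """Area under the ROC curve, computed as the probability that a
    randomly chosen positive case scores higher than a randomly chosen
    negative case (ties count 0.5) -- the Mann-Whitney formulation."""
    wins = 0.0
    for p in scores_pos:
        for q in scores_neg:
            wins += 1.0 if p > q else 0.5 if p == q else 0.0
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical YCPS-R totals for youth with and without a conduct
# disorder diagnosis (illustrative values only)
auc = roc_auc([14, 11, 9, 8], [3, 5, 8, 2])
```

    An AUC of 0.5 is chance-level discrimination; values in the 0.7-0.8 range are often described as "fair", consistent with the classification accuracy reported above.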

  3. Best Practices in Stability Indicating Method Development and Validation for Non-clinical Dose Formulations.

    PubMed

    Henry, Teresa R; Penn, Lara D; Conerty, Jason R; Wright, Francesca E; Gorman, Gregory; Pack, Brian W

    2016-11-01

    Non-clinical dose formulations (also known as pre-clinical or GLP formulations) play a key role in early drug development. These formulations are used to introduce active pharmaceutical ingredients (APIs) into test organisms for both pharmacokinetic and toxicological studies. Since these studies are ultimately used to support dose and safety ranges in human studies, it is important not only to understand the concentration and PK/PD of the active ingredient but also to generate safety data for likely process impurities and degradation products of the active ingredient. As such, many in the industry have chosen to develop and validate methods which can accurately detect and quantify the active ingredient along with impurities and degradation products. Such methods often provide trendable results which are predictive of stability, thus leading to the name: stability-indicating methods. This document provides an overview of best practices for those choosing to include development and validation of such methods as part of their non-clinical drug development program. It is intended to support teams who are either new to stability-indicating method development and validation or who are less familiar with the requirements of validation due to their position within the product development life cycle.

  4. Factor analysis methods and validity evidence: A systematic review of instrument development across the continuum of medical education

    NASA Astrophysics Data System (ADS)

    Wetzel, Angela Payne

    Previous systematic reviews indicate a lack of reporting of reliability and validity evidence in subsets of the medical education literature. Psychology and general education reviews of factor analysis also indicate gaps between current and best practices; yet, a comprehensive review of exploratory factor analysis in instrument development across the continuum of medical education had not been previously identified. Therefore, the purpose of this study was a critical review of instrument development articles employing exploratory factor or principal component analysis published in medical education (2006-2010) to describe and assess the reporting of methods and validity evidence based on the Standards for Educational and Psychological Testing and factor analysis best practices. Data extraction of 64 articles measuring a variety of constructs that have been published throughout the peer-reviewed medical education literature indicates significant errors in the translation of exploratory factor analysis best practices to current practice. Further, techniques for establishing validity evidence tend to derive from a limited scope of methods, including reliability statistics to support internal structure and support for test content. Instruments reviewed for this study lacked supporting evidence based on relationships with other variables and response process, and evidence based on consequences of testing was not evident. Findings suggest a need for further professional development within the medical education researcher community related to (1) appropriate factor analysis methodology and reporting and (2) the importance of pursuing multiple sources of reliability and validity evidence to construct a well-supported argument for the inferences made from the instrument. Medical education researchers and educators should be cautious in adopting instruments from the literature and carefully review available evidence. Finally, editors and reviewers are encouraged to recognize…
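
    One of the factor-analysis best practices such reviews weigh current work against is using a defensible factor-retention rule. Horn's parallel analysis is a widely recommended alternative to the "eigenvalue > 1" rule; the sketch below uses simulated one-factor data, not any instrument from the review:

```python
import numpy as np

def parallel_analysis(data, n_iter=200, seed=0):
    """Horn's parallel analysis: retain the factors whose observed
    correlation-matrix eigenvalues exceed the mean eigenvalues obtained
    from random normal data of the same shape."""
    rng = np.random.default_rng(seed)
    n, p = data.shape
    obs = np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]
    rand = np.zeros(p)
    for _ in range(n_iter):
        r = rng.standard_normal((n, p))
        rand += np.sort(np.linalg.eigvalsh(np.corrcoef(r, rowvar=False)))[::-1]
    return int(np.sum(obs > rand / n_iter))

# Simulated responses driven by a single latent factor (6 items, n = 300)
gen = np.random.default_rng(42)
latent = gen.standard_normal((300, 1))
items = 0.9 * latent @ np.ones((1, 6)) + 0.5 * gen.standard_normal((300, 6))
k = parallel_analysis(items)  # number of factors to retain
```

    On data like this, parallel analysis retains one factor, whereas the "eigenvalue > 1" rule can over-retain noise components.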

  5. Using non-linear methods to investigate the criterion validity of traffic-psychological test batteries.

    PubMed

    Risser, R; Chaloupka, Ch; Grundler, W; Sommer, M; Häusler, J; Kaufmann, C

    2008-01-01

    In several countries in Europe (among others Germany and Austria), persons who have lost their driver's licence have to undergo a psychological test in order to regain it. The article discusses the validity of two test batteries of the Expert System Traffic, assessed against standardized driving tests [Schuhfried, G., 2005. Manual Expert System Traffic (XPSV). Schuhfried GmbH, Mödling]. A global evaluation of the respondents' performance in a standardized driving test was used as the criterion measure in order to divide the subjects into drivers classified as relatively safe or unsafe. Artificial neural networks were used to calculate the criterion validity. This yielded superior classification rates and validity coefficients compared to classical multivariate methods such as logistic regression. The stability and generalizability of the results were empirically demonstrated using a jack-knife validation, an internal bootstrap validation and an independent validation sample which completed the test batteries and the standardized driving test as part of a so-called traffic-psychological assessment, which is compulsory in Austria in all cases where the driver's licence has been withdrawn, e.g., when caught driving under the influence of alcohol. Moreover, the procedure allows the calculation of incremental validities, which enables assessment of the relative importance of the individual predictor variables. This contributes to the transparency of the results obtained with artificial neural networks. In summary, the results provide empirical evidence of the validity of the traffic-psychological test batteries used in this study. The practical implications of the results for traffic-psychological assessment are described.
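
    An internal bootstrap validation of the kind mentioned above can be illustrated by resampling classification outcomes to put a confidence interval on the classification rate. The 80-of-100 split below is hypothetical, not the study's result:

```python
import random

def bootstrap_ci(outcomes, n_boot=2000, alpha=0.05, seed=1):
    """Percentile bootstrap confidence interval for a classification rate;
    `outcomes` holds 1 for a correctly classified case, 0 otherwise."""
    rng = random.Random(seed)
    n = len(outcomes)
    rates = sorted(sum(rng.choices(outcomes, k=n)) / n for _ in range(n_boot))
    return rates[int(n_boot * alpha / 2)], rates[int(n_boot * (1 - alpha / 2)) - 1]

# Hypothetical outcome: 80 of 100 drivers classified consistently with
# the driving-test criterion (illustrative numbers only)
lo, hi = bootstrap_ci([1] * 80 + [0] * 20)
```

    A narrow interval around the observed rate supports the stability claim; the jack-knife variant simply leaves out one case at a time instead of resampling with replacement.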

  6. Interlaboratory Validation of the Leaching Environmental Assessment Framework (LEAF) Method 1313 and Method 1316

    EPA Science Inventory

    This document summarizes the results of an interlaboratory study conducted to generate precision estimates for two parallel batch leaching methods which are part of the Leaching Environmental Assessment Framework (LEAF). These methods are: (1) Method 1313: Liquid-Solid Partition...

  7. Cleaning validation 1: development and validation of a chromatographic method for the detection of traces of LpHse detergent.

    PubMed

    Zayas, José; Colón, Héctor; Garced, Omar; Ramos, Leyda M

    2006-05-03

    A high-performance liquid chromatography (HPLC) method for the detection of traces of LpHse (4-tert-amylphenol and 2-phenylphenol) has been developed and validated. The method was shown to be linear in the range from 0.5 to 10.00 ppm in solution. The method was also shown to be accurate, with a recovery of up to 95% by area response for amylphenol and up to 94% by area response for phenylphenol from metal surfaces (4''x4'' unpolished 304 stainless steel plates) by means of swab sampling. The reproducibility of the method, reported as the pooled relative standard deviation from solutions, was determined to be 1.61% by area response and 1.52% by height response for amylphenol, and 5.40% by area response and 13.77% by height response for phenylphenol. The developed method was also shown to be rugged through comparison of preparations made by different analysts. The limit of detection was established to be 0.076 ppm by peak area and 0.079 ppm by peak height for amylphenol and 0.34 ppm by peak area and 0.82 ppm by peak height for phenylphenol from solution, and 1.77 ppb by peak area and 1.23 ppm by peak height for amylphenol and 1.23 ppm by peak area and 1.44 ppm by peak height for phenylphenol from the metal recovery studies. The limit of quantitation was established to be 0.25 ppm by peak area and 0.26 ppm by peak height for amylphenol and 1.14 ppm by peak area and 2.73 ppm by peak height for phenylphenol from solution, and 3.89 ppm by peak area and 4.11 ppm by peak height for amylphenol and 4.11 ppm by peak area and 4.79 ppm by peak height for phenylphenol from the metal plate recovery studies. This method can be employed to determine the presence of LpHse residues on cleaned equipment on which the detergent was used.

  8. Validated spectrofluorimetric method for the determination of clonazepam in pharmaceutical preparations.

    PubMed

    Ibrahim, Fawzia; El-Enany, Nahed; Shalan, Shereen; Elsharawy, Rasha

    2016-05-01

    A simple, highly sensitive and validated spectrofluorimetric method was applied to the determination of clonazepam (CLZ). The method is based on reduction of the nitro group of clonazepam with zinc/CaCl2; the product is then reacted with 2-cyanoacetamide (2-CNA) in the presence of ammonia (25%), yielding a highly fluorescent product. The produced fluorophore exhibits strong fluorescence at λem = 383 nm after excitation at λex = 333 nm. The method was linear over a concentration range of 0.1-0.5 ng/mL, with a limit of detection (LOD) of 0.0057 ng/mL and a limit of quantification (LOQ) of 0.017 ng/mL. The method was fully validated according to ICH guidelines and successfully applied to the determination of CLZ in its tablets, with a mean percentage recovery of 100.10 ± 0.75%. Statistical comparison of the results obtained using the proposed method with those obtained using a reference method showed no significant difference between the two methods in terms of accuracy and precision.
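
    LOD and LOQ figures like those above are commonly derived from the calibration line following the ICH Q2 approach: LOD = 3.3·σ/S and LOQ = 10·σ/S, where σ is the residual standard deviation of the regression and S its slope. A sketch with hypothetical calibration data, not the paper's measurements:

```python
import math

def lod_loq(conc, resp):
    """ICH-style LOD/LOQ from a calibration line: 3.3*sigma/S and
    10*sigma/S, with sigma the residual standard deviation of the
    regression and S its slope."""
    n = len(conc)
    mx, my = sum(conc) / n, sum(resp) / n
    sxx = sum((x - mx) ** 2 for x in conc)
    slope = sum((x - mx) * (y - my) for x, y in zip(conc, resp)) / sxx
    intercept = my - slope * mx
    ss_res = sum((y - slope * x - intercept) ** 2 for x, y in zip(conc, resp))
    sigma = math.sqrt(ss_res / (n - 2))
    return 3.3 * sigma / slope, 10 * sigma / slope

# Hypothetical calibration over 0.1-0.5 ng/mL (fluorescence intensity)
lod, loq = lod_loq([0.1, 0.2, 0.3, 0.4, 0.5],
                   [10.1, 20.4, 29.6, 40.2, 49.8])
```

    By construction LOQ/LOD = 10/3.3 ≈ 3, which matches the roughly threefold gap between the reported 0.0057 and 0.017 ng/mL values.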

  9. Citizen Decision Making, Reflective Thinking and Simulation Gaming: A Marriage of Purpose, Method and Strategy.

    ERIC Educational Resources Information Center

    White, Charles S.

    1985-01-01

    A conception of citizen decision making based on participatory democratic theory is most likely to foster effective citizenship. An examination of social studies traditions suggests that reflective thinking as a teaching method is congenial to this conception. Simulation gaming is a potentially powerful instructional strategy for supporting…

  10. Comparative analysis of rodent tissue preservation methods and nucleic acid extraction techniques for virus screening purposes.

    PubMed

    Yama, Ines N; Garba, Madougou; Britton-Davidian, Janice; Thiberville, Simon-Djamel; Dobigny, Gauthier; Gould, Ernest A; de Lamballerie, Xavier; Charrel, Remi N

    2013-05-01

    The polymerase chain reaction (PCR) has become an essential method for the detection of viruses in tissue specimens. However, it is well known that the presence of PCR inhibitors in tissue samples may cause false-negative results. Hence the identification of PCR inhibitors and the evaluation and optimization of nucleic acid extraction and preservation methods are of prime concern in virus discovery programs dealing with animal tissues. Accordingly, to monitor and remove inhibitors, we performed comparative analyses of two commonly used tissue storage methods and five RNA purification techniques using a variety of animal tissues containing quantified levels of added MS2 bacteriophage as the indicator of inhibition. The results showed (i) no significant difference between the two methods of sample preservation, namely direct storage at -80°C or storage in RNAlater at 4°C; (ii) rodent lung tissue contained lower levels of inhibitors than liver, kidney and spleen; and (iii) RNA extraction using the EZ1+PK RNA kit was the most effective procedure for removal of RT-PCR inhibitors.
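
    A spiked indicator such as MS2 is typically read out as a cycle-threshold (Ct) shift: if the spike amplifies much later in the sample extract than in a clean control, inhibition is flagged. The 2-cycle cut-off below is a common convention assumed for illustration, not a value from the study:

```python
def inhibited(ct_spiked_sample, ct_spiked_control, max_shift=2.0):
    """Flag RT-PCR inhibition when the spiked internal control (e.g. MS2
    phage) amplifies at least `max_shift` cycles later in the sample
    extract than in a clean-buffer control. The 2-cycle cut-off is an
    assumed convention, not taken from the study."""
    return (ct_spiked_sample - ct_spiked_control) >= max_shift

inhibited(30.5, 27.0)  # True: a 3.5-cycle delay suggests inhibition
inhibited(27.8, 27.0)  # False: a 0.8-cycle delay is within tolerance
```

    Because each cycle roughly doubles template, a 3.5-cycle delay corresponds to an apparent ~10-fold signal loss, which is why even small Ct shifts matter.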

  11. Comparison of different mass transport calculation methods for wind erosion quantification purposes

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Quantitative estimation of the material transported by the wind is essential in the study and control of wind erosion, although methods for its calculation are still controversial. Sampling the dust cloud at discrete heights, fitting an equation to the data, and integrating this equation from the so...

  12. A Classroom-Based Assessment Method to Test Speaking Skills in English for Specific Purposes

    ERIC Educational Resources Information Center

    Alberola Colomar, María Pilar

    2014-01-01

    This article presents and analyses a classroom-based assessment method to test students' speaking skills in a variety of professional settings in tourism. The assessment system has been implemented in the Communication in English for Tourism course, as part of the Tourism Management degree programme, at Florida Universitaria (affiliated to the…

  13. Validation of a three-dimensional hand scanning and dimension extraction method with dimension data.

    PubMed

    Li, Zhizhong; Chang, Chien-Chi; Dempsey, Patrick G; Ouyang, Lusha; Duan, Jiyang

    2008-11-01

    A three-level experiment was developed to validate a 3-D hand scanning and dimension extraction method against dimension data. At the first level, a resin hand model of a participant was fabricated to test the repeatability of the dimension data obtained by the 3-D method. At the second level, the actual hand of that participant was measured repeatedly using both the 3-D method and the traditional manual measurement method. The repeatability of both methods was investigated and compared, and the influence of posture keeping, surface deformation and other human factors was also examined at this level. At the third level, a group of participants was recruited and their hands were measured using both methods to examine any differences between the two methods in descriptive statistics. Significant differences, which varied among dimension types (length, depth/breadth, and circumference), were found between the 3-D method and the traditional method. 3-D anthropometric measurement and dimension extraction is a promising technology. The proposed three-level experiment provides a systematic method for validating the repeatability of a 3-D method and the compatibility between dimension data from a 3-D method and a traditional method.

  14. 40 CFR Appendix D to Part 63 - Alternative Validation Procedure for EPA Waste and Wastewater Methods

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... EPA Waste and Wastewater Methods D Appendix D to Part 63 Protection of Environment ENVIRONMENTAL... Procedure for EPA Waste and Wastewater Methods 1. Applicability This procedure is to be applied exclusively.... For the purposes of this appendix, “waste” means waste and wastewater. 2. Procedure This...

  15. 40 CFR Appendix D to Part 63 - Alternative Validation Procedure for EPA Waste and Wastewater Methods

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... EPA Waste and Wastewater Methods D Appendix D to Part 63 Protection of Environment ENVIRONMENTAL... Procedure for EPA Waste and Wastewater Methods 1. Applicability This procedure is to be applied exclusively.... For the purposes of this appendix, “waste” means waste and wastewater. 2. Procedure This...

  16. 40 CFR Appendix D to Part 63 - Alternative Validation Procedure for EPA Waste and Wastewater Methods

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... EPA Waste and Wastewater Methods D Appendix D to Part 63 Protection of Environment ENVIRONMENTAL... Procedure for EPA Waste and Wastewater Methods 1. Applicability This procedure is to be applied exclusively.... For the purposes of this appendix, “waste” means waste and wastewater. 2. Procedure This...

  17. 40 CFR Appendix D to Part 63 - Alternative Validation Procedure for EPA Waste and Wastewater Methods

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... EPA Waste and Wastewater Methods D Appendix D to Part 63 Protection of Environment ENVIRONMENTAL... Procedure for EPA Waste and Wastewater Methods 1. Applicability This procedure is to be applied exclusively.... For the purposes of this appendix, “waste” means waste and wastewater. 2. Procedure This...

  18. Reference method for detection of Pgp mediated multidrug resistance in human hematological malignancies: a method validated by the laboratories of the French Drug Resistance Network.

    PubMed

    Huet, S; Marie, J P; Gualde, N; Robert, J

    1998-12-15

    Multidrug resistance (MDR) associated with overexpression of the MDR1 gene and of its product, P-glycoprotein (Pgp), plays an important role in limiting cancer treatment efficacy. Many studies have investigated Pgp expression in clinical samples of hematological malignancies but have failed to give a definitive conclusion on its usefulness. One convenient method for fluorescent detection of Pgp in malignant cells is flow cytometry, which, however, gives variable results from one laboratory to another, partly due to the lack of a rigorously tested reference method. The purpose of this technical note is to describe each step of a reference flow cytometric method. Guidelines for sample handling, staining and analysis have been established both for Pgp detection with monoclonal antibodies directed against extracellular epitopes (MRK16, UIC2 and 4E3) and for measurement of Pgp functional activity with Rhodamine 123 as a fluorescent probe. Both methods have been validated on cultured cell lines and clinical samples by 12 laboratories of the French Drug Resistance Network. This cross-validated multicentric study points out the steps crucial to the accuracy and reproducibility of the results, such as cell viability, data analysis and the expression of results.

  19. A validation framework for microbial forensic methods based on statistical pattern recognition

    SciTech Connect

    Velsko, S P

    2007-11-12

    This report discusses a general approach to validating microbial forensic methods that attempt to simultaneously distinguish among many hypotheses concerning the manufacture of a questioned biological agent sample. It focuses on the concrete example of determining growth medium from chemical or molecular properties of a bacterial agent to illustrate the concepts involved.

  20. Validation of Western North America Models based on finite-frequency and ray theory imaging methods

    SciTech Connect

    Larmat, Carene; Maceira, Monica; Porritt, Robert W.; Higdon, David Mitchell; Rowe, Charlotte Anne; Allen, Richard M.

    2015-02-02

    We validate seismic models developed for western North America, with a focus on the effect of the imaging method on data fit. We use the DNA09 models, for which our collaborators provide models built with both the body-wave finite-frequency (FF) approach and the ray-theory (RT) approach, with the data selection, processing and reference models held the same.

  1. Validity of a Simulation Game as a Method for History Teaching

    ERIC Educational Resources Information Center

    Corbeil, Pierre; Laveault, Dany

    2011-01-01

    The aim of this research is, first, to determine the validity of a simulation game as a method of teaching and an instrument for the development of reasoning and, second, to study the relationship between learning and students' behavior toward games. The participants were college students in a History of International Relations course, with two…

  2. Validation Specimen for Contour Method Extension to Multiple Residual Stress Components

    SciTech Connect

    Pagliaro, Pierluigi; Prime, Michael B; Zuccarello, B; Clausen, Bjorn; Watkins, Thomas R

    2007-01-01

    A new theoretical development of the contour method, which allows the user to measure the three normal residual stress components on cross-sections of a generic mechanical part, is presented. To validate this theoretical development, a residual stress test specimen was designed, fabricated and then tested with different experimental techniques.

  3. Validation of a new method for quantification of ammonia volatilization from agricultural field plots

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A low-cost method for measuring atmospheric ammonia (NH3) concentration was developed and validated for use in static chambers. This technique uses glass tubes coated with oxalic acid to adsorb NH3 from the air. The advantage of this procedure is that it can be used to quantify NH3 emissions from field p...

  4. Validity and reliability of an alternative method for measuring power output during six-second all-out cycling.

    PubMed

    Watson, Martin; Bibbo, Daniele; Duffy, Charles R; Riches, Philip E; Conforto, Silvia; Macaluso, Andrea

    2014-08-01

    In a laboratory setting where both a mechanically-braked cycling ergometer and a motion analysis (MA) system are available, flywheel angular displacement can be estimated by using MA. The purpose of this investigation was to assess the validity and reliability of an MA method for measuring maximal power output (Pmax) in comparison with a force transducer (FT) method. Eight males and eight females undertook three identical sessions, separated by 4 to 6 days, the first being a familiarization session. Individuals performed three 6-second sprints against 50% of the maximal resistance to complete two pedal revolutions, with a 3-minute rest between trials. Power was determined independently using both MA and FT analyses. Validity: MA recorded significantly higher Pmax than FT (P < .05). Bland-Altman plots showed a systematic bias in the difference between the measures of the two systems, and this difference increased as power increased. Repeatability: Intraclass correlation coefficients were on average 0.90 ± 0.05 in males and 0.85 ± 0.08 in females. Measuring Pmax by MA, therefore, is as appropriate for use in exercise physiology research as Pmax measured by FT, provided that a bias between these measurement methods is allowed for.
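
    A Bland-Altman comparison of the kind used above reduces to a few lines of code. This is an illustrative sketch; the paired peak-power values below are hypothetical, not the study's data.

```python
# Bland-Altman agreement between two measurement methods: mean bias and
# 95% limits of agreement. Power values (W) are hypothetical.

def bland_altman(a, b):
    """Return mean bias and 95% limits of agreement between paired measures."""
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = (sum((d - bias) ** 2 for d in diffs) / (n - 1)) ** 0.5
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical Pmax readings from motion analysis (MA) vs force transducer (FT)
ma = [812, 905, 1010, 760, 980, 870]
ft = [790, 880, 975, 750, 950, 845]
bias, (lo, hi) = bland_altman(ma, ft)
print(f"bias = {bias:.1f} W, limits of agreement = [{lo:.1f}, {hi:.1f}] W")
```

    A positive bias with limits that widen at higher powers is the pattern the study reports between MA and FT.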

  5. A multiscale finite element model validation method of composite cable-stayed bridge based on Probability Box theory

    NASA Astrophysics Data System (ADS)

    Zhong, Rumian; Zong, Zhouhong; Niu, Jie; Liu, Qiqi; Zheng, Peijuan

    2016-05-01

    Modeling and simulation are routinely used to predict the behavior of complex structures, uniting theoretical foundations, numerical models and experimental data together with their associated uncertainties and errors. A new methodology for multi-scale finite element (FE) model validation is proposed in this paper. The method is based on a two-step updating procedure, a novel approach to obtaining the coupling parameters in the gluing sub-regions of a multi-scale FE model, and on Probability Box (P-box) theory, which provides lower and upper bounds for quantifying and propagating the uncertainty of structural parameters. The structural health monitoring data of the Guanhe Bridge, a long-span composite cable-stayed bridge, and Monte Carlo simulation were used to verify the proposed method. The results show satisfactory accuracy, as the overlap ratio index of each modal frequency is over 89% without the average absolute value of relative errors, and the CDF of the normal distribution coincides well with the measured frequencies of the Guanhe Bridge. The validated multiscale FE model may be further used in structural damage prognosis and safety prognosis.
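
    An interval overlap index of the kind reported above can be illustrated as follows. Both the definition used here (intersection length over measured-interval length) and the frequency bounds are assumptions for illustration, not taken from the paper.

```python
# Overlap ratio between a P-box-predicted modal-frequency interval and a
# measured interval (hypothetical definition and numbers).

def overlap_ratio(pred, meas):
    """Fraction of the measured interval covered by the predicted interval."""
    inter = max(0.0, min(pred[1], meas[1]) - max(pred[0], meas[0]))
    return inter / (meas[1] - meas[0])

predicted = (1.95, 2.15)   # Hz: lower/upper bound of a modal frequency
measured = (1.90, 2.10)    # Hz: interval from monitoring data
ratio = overlap_ratio(predicted, measured)
print(f"overlap ratio = {ratio:.0%}")
```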

  6. Validation of three new methods for determination of metal emissions using a modified Environmental Protection Agency Method 301.

    PubMed

    Yanca, Catherine A; Barth, Douglas C; Petterson, Krag A; Nakanishi, Michael P; Cooper, John A; Johnsen, Bruce E; Lambert, Richard H; Bivins, Daniel G

    2006-12-01

    Three new methods applicable to the determination of hazardous metal concentrations in stationary source emissions were developed and evaluated for use in U.S. Environmental Protection Agency (EPA) compliance applications. Two of the three independent methods, a continuous emissions monitor-based method (Xact) and an X-ray-based filter method (XFM), are used to measure metal emissions. The third method involves a quantitative aerosol generator (QAG), which produces a reference aerosol used to evaluate the measurement methods. A modification of EPA Method 301 was used to validate the three methods for As, Cd, Cr, Pb, and Hg, representing three hazardous waste combustor Maximum Achievable Control Technology (MACT) metal categories (low volatile, semivolatile, and volatile). The modified procedure tested the methods using more stringent criteria than EPA Method 301; these criteria included accuracy, precision, and linearity. The aerosol generation method was evaluated in the laboratory by comparing actual with theoretical aerosol concentrations. The measurement methods were evaluated at a hazardous waste combustor (HWC) by comparing measured with reference aerosol concentrations. The QAG, Xact, and XFM met the modified Method 301 validation criteria. All three of the methods demonstrated precisions and accuracies on the order of 5%. In addition, correlation coefficients for each method were on the order of 0.99, confirming the methods' linear response and high precision over a wide range of concentrations. The measurement methods should be applicable to emissions from a wide range of sources, and the reference aerosol generator should be applicable to additional analytes. EPA recently approved an alternative monitoring petition for an HWC at Eli Lilly's Tippecanoe site in Lafayette, IN, in which the Xact is used for demonstrating compliance with the HWC MACT metal emissions (low volatile, semivolatile, and volatile). The QAG reference aerosol generator was approved as

  7. Refraction-based X-ray Computed Tomography for Biomedical Purpose Using Dark Field Imaging Method

    NASA Astrophysics Data System (ADS)

    Sunaguchi, Naoki; Yuasa, Tetsuya; Huo, Qingkai; Ichihara, Shu; Ando, Masami

    We have proposed a tomographic x-ray imaging system using DFI (dark field imaging) optics, along with a data-processing method to extract refraction information from the measured intensities and a reconstruction algorithm to recover a refractive-index field from the projections generated from that information. The DFI imaging system consists of a tandem optical system of Bragg- and Laue-case crystals, a sample positioning system, and two CCD (charge-coupled device) cameras. We then developed a simulation code for the data acquisition, data processing, and reconstruction methods to investigate their feasibility. Finally, to demonstrate its efficacy, we imaged a sample with DCIS (ductal carcinoma in situ) excised from a breast cancer patient, using a system constructed at the vertical wiggler beamline BL-14C at KEK-PF. The CT images depicted a variety of fine histological structures, such as milk ducts, duct walls, secretions, and adipose and fibrous tissue, which correlate well with histological sections.

  8. Assessment of management in general practice: validation of a practice visit method.

    PubMed Central

    van den Hombergh, P; Grol, R; van den Hoogen, H J; van den Bosch, W J

    1998-01-01

    BACKGROUND: Practice management (PM) in general practice is as yet ill-defined; a systematic description of its domain, as well as a valid method to assess it, are necessary for research and assessment. AIM: To develop and validate a method to assess PM of general practitioners (GPs) and practices. METHOD: Relevant and potentially discriminating indicators were selected from a systematic framework of 2410 elements of PM to be used in an assessment method (VIP = visit instrument PM). The method was first tested in a pilot study and, after revision, was evaluated in order to select discriminating indicators and to determine validity of dimensions (factor and reliability analysis, linear regression). RESULTS: One hundred and ten GPs were assessed with the practice visit method using 249 indicators; 208 of these discriminated sufficiently at practice level or at GP level. Factor analysis resulted in 34 dimensions and in a taxonomy of PM. Dimensions and indicators showed marked variation between GPs and practices. Training practices scored higher on five dimensions; single-handed and dispensing practices scored lower on delegated tasks, but higher on accessibility and availability. CONCLUSION: A visit method to assess PM has been developed and its validity studied systematically. The taxonomy and dimensions of PM were in line with other classifications. Selection of a balanced number of useful and relevant indicators was nevertheless difficult. The dimensions could discriminate between groups of GPs and practices, establishing the value of the method for assessment. The VIP method could be an important contribution to the introduction of continuous quality improvement in the profession. PMID:10198481

  9. Development and validation of a novel RP-HPLC method for the analysis of reduced glutathione.

    PubMed

    Sutariya, Vijaykumar; Wehrung, Daniel; Geldenhuys, Werner J

    2012-03-01

    The objective of this study was the development, optimization, and validation of a novel reverse-phase high-pressure liquid chromatography (RP-HPLC) method for the quantification of reduced glutathione in pharmaceutical formulations utilizing simple UV detection. The separation utilized a C18 column at room temperature, and UV absorption was measured at 215 nm. The mobile phase was an isocratic flow of a 50/50 (v/v) mixture of water (pH 7.0) and acetonitrile flowing at 1.0 mL/min. Validation assessed the method's ability in seven categories: linearity, range, limit of detection, limit of quantification, accuracy, precision, and selectivity. Analysis of the system suitability showed acceptable levels of suitability in all categories. Likewise, the method displayed an acceptable degree of linearity (r2 = 0.9994) over a concentration range of 2.5-60 µg/mL. The detection limit and quantification limit were 0.6 and 1.8 µg/mL, respectively. The percent recovery of the method was 98.80-100.79%. Following validation, the method was employed in the determination of glutathione in pharmaceutical formulations in the form of a conjugate and a nanoparticle. The proposed method offers a simple, accurate, and inexpensive way to quantify reduced glutathione.
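
    The linearity, detection-limit, and quantification-limit figures reported above follow the standard ICH-style formulas (LOD = 3.3σ/S, LOQ = 10σ/S, with σ the residual standard deviation and S the calibration slope). The sketch below applies them to hypothetical calibration data, not the paper's measurements.

```python
# Least-squares calibration line with ICH-style LOD/LOQ estimates.
# Response values are hypothetical illustration data.
import numpy as np

conc = np.array([2.5, 5.0, 10.0, 20.0, 40.0, 60.0])          # ug/mL
resp = np.array([0.051, 0.099, 0.202, 0.398, 0.801, 1.195])  # peak area (a.u.)

slope, intercept = np.polyfit(conc, resp, 1)
pred = slope * conc + intercept
ss_res = np.sum((resp - pred) ** 2)
r2 = 1.0 - ss_res / np.sum((resp - resp.mean()) ** 2)

sigma = np.sqrt(ss_res / (len(conc) - 2))  # residual standard deviation
lod = 3.3 * sigma / slope                  # ICH detection limit
loq = 10.0 * sigma / slope                 # ICH quantification limit
print(f"r2 = {r2:.4f}, LOD = {lod:.2f} ug/mL, LOQ = {loq:.2f} ug/mL")
```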

  10. Evaluation Protocol for Review of Method Validation Data by the AOAC Stakeholder Panel on Infant Formula and Adult Nutritionals Expert Review Panel.

    PubMed

    Gill, Brendon D; Indyk, Harvey E; Blake, Christopher J; Konings, Erik J M; Jacobs, Wesley A; Sullivan, Darryl M

    2015-01-01

    Methods under consideration as part of the AOAC Stakeholder Panel on Infant Formula and Adult Nutritionals process are to be evaluated against a set of Standard Method Performance Requirements (SMPRs) via peer review by an expert review panel (ERP). A validation protocol and a checklist have been developed to assist the ERP to evaluate experimental data and to compare multiple candidate methods for each nutrient. Method performance against validation parameters mandated in the SMPRs as well as additional criteria are to be scored, with the method selected by the ERP proceeding to multilaboratory study prior to Final Action approval. These methods are intended to be used by the infant formula industry for the purposes of dispute resolution.

  11. Validated spectrophotometric methods for the simultaneous determination of telmisartan and atorvastatin in bulk and tablets

    PubMed Central

    Ilango, Kaliappan; Kumar, Pushpangadhan S. Shiji

    2012-01-01

    Aim: Three simple, accurate, and reproducible spectrophotometric methods have been developed and validated for simultaneous estimation of telmisartan (TELM) and atorvastatin (ATV) in combined tablet dosage form. Materials and Methods: The first method is based on first-order derivative spectroscopy. The sampling wavelengths were 223 nm (zero crossing of TELM), where ATV showed considerable absorbance, and 272 nm (zero crossing of ATV), where TELM showed considerable absorbance. The second method, Q-analysis (absorbance ratio), involves formation of a Q-absorbance equation using the respective absorptivity values at 280.9 nm (isosbestic point) and 296.0 nm (λmax of TELM). The third method involves determination using the multicomponent mode method; the sampling wavelengths selected were 296.0 and 246.9 nm. Results: TELM and ATV followed linearity in the concentration ranges of 5–40 and 4–32 μg/ml for method I, and 5–30 and 2–24 μg/ml for methods II and III, respectively. Mean recoveries for all three methods were found satisfactory. All methods were validated according to International Conference on Harmonization Q2B guidelines. Conclusion: The developed methods are simple, precise, rugged, and economical. The utility of the methods has been demonstrated by analysis of a commercially available tablet dosage form. PMID:23781490
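
    Simultaneous two-component spectrophotometry of this kind reduces to solving a small linear system: Beer's law at two wavelengths gives two equations in the two unknown concentrations. The absorptivity and absorbance values below are hypothetical illustration numbers, not taken from the paper.

```python
# Two-component simultaneous determination via Beer's law at two wavelengths
# (Vierordt's method), solved as a 2x2 linear system. Values are hypothetical.
import numpy as np

# rows: the two wavelengths; columns: the two analytes (TELM, ATV)
A_coeff = np.array([[0.045, 0.012],    # absorptivities at wavelength 1
                    [0.010, 0.038]])   # absorptivities at wavelength 2
absorbance = np.array([0.520, 0.430])  # measured mixture absorbances

conc = np.linalg.solve(A_coeff, absorbance)  # path length assumed 1 cm
print(f"TELM ~ {conc[0]:.2f} ug/mL, ATV ~ {conc[1]:.2f} ug/mL")
```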

  12. GC Method Validation for the Analysis of Menthol in Suppository Pharmaceutical Dosage Form.

    PubMed

    Abualhasan, Murad N; Zaid, Abdel Naser; Jaradat, Nidal; Mousa, Ayman

    2017-01-01

    Menthol is widely used as a fragrance and flavor in the food and cosmetic industries. It is also used in the medical and pharmaceutical fields for its various biological effects. Gas chromatography (GC) is considered to be a sensitive method for the analysis of menthol. A GC separation was developed using a capillary column (VF-624) and a flame ionization detector (FID). The method was validated per ICH guidelines for various parameters such as precision, linearity, accuracy, solution stability, robustness, limit of detection, and limit of quantification. The tested validation parameters were found to be within acceptable limits. The method was successfully applied for the quantification of menthol in suppository formulations. Quality control departments and official pharmacopeias can use our developed method in the analysis of menthol in pharmaceutical dosage formulations and raw materials.

  13. [Validation of the HPLC method in the determination of dioxopromethazine and phenylephrine in eye drops].

    PubMed

    Hudecová, T; Hatrík, S; Zimová, N; Havránek, E

    2002-03-01

    The present paper introduces a rapid HPLC method for the determination of dioxopromethazine and phenylephrine in eye drops. The method uses a modified C18 stationary phase optimized for the separation of basic compounds and a methanol/1.5 mM phosphoric acid (60/40 v/v, pH 3.02) mobile phase. The flow rate is set to 2 ml/min, the sample volume is 20 microliters, and the compounds are detected at 275 nm. Prior to analysis, the eye drops are diluted with water in a ratio of 1:50. The developed HPLC method and the chromatographic system were validated according to the procedure for the validation of chromatographic systems and methods.

  14. GC Method Validation for the Analysis of Menthol in Suppository Pharmaceutical Dosage Form

    PubMed Central

    Zaid, Abdel Naser; Jaradat, Nidal; Mousa, Ayman

    2017-01-01

    Menthol is widely used as a fragrance and flavor in the food and cosmetic industries. It is also used in the medical and pharmaceutical fields for its various biological effects. Gas chromatography (GC) is considered to be a sensitive method for the analysis of menthol. A GC separation was developed using a capillary column (VF-624) and a flame ionization detector (FID). The method was validated per ICH guidelines for various parameters such as precision, linearity, accuracy, solution stability, robustness, limit of detection, and limit of quantification. The tested validation parameters were found to be within acceptable limits. The method was successfully applied for the quantification of menthol in suppository formulations. Quality control departments and official pharmacopeias can use our developed method in the analysis of menthol in pharmaceutical dosage formulations and raw materials. PMID:28367216

  15. Low-cost extrapolation method for maximal LTE radio base station exposure estimation: test and validation.

    PubMed

    Verloock, Leen; Joseph, Wout; Gati, Azeddine; Varsier, Nadège; Flach, Björn; Wiart, Joe; Martens, Luc

    2013-06-01

    An experimental validation of a low-cost method for extrapolating and estimating the maximal electromagnetic-field exposure from long-term evolution (LTE) radio base station installations is presented. The low-cost method requires no knowledge of downlink band occupation or service characteristics and is applicable in situ: it needs only a basic spectrum analyser with appropriate field probes, without expensive dedicated LTE decoders. The method is validated both in the laboratory and in situ, for a single-input single-output antenna LTE system and a 2×2 multiple-input multiple-output system, with low deviations in comparison with signals measured using dedicated LTE decoders.
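
    A common form of such worst-case extrapolation scales the field measured on a single always-on reference-signal resource element by the square root of the number of subcarriers, since powers (proportional to E²) add at full traffic load. This sketch assumes that form and uses hypothetical numbers; it illustrates the general idea rather than the paper's exact procedure.

```python
# Worst-case LTE exposure extrapolation sketch. All values are hypothetical.
import math

e_rs = 0.05           # V/m, field of one reference-signal resource element
n_subcarriers = 1200  # 20 MHz LTE channel: 100 resource blocks x 12 subcarriers

# At full load every subcarrier transmits; powers add, so fields scale
# as sqrt(N): E_max = E_RS * sqrt(N).
e_max = e_rs * math.sqrt(n_subcarriers)
print(f"E_max ~ {e_max:.2f} V/m")
```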

  16. Aspects of bioanalytical method validation for the quantitative determination of trace elements.

    PubMed

    Levine, Keith E; Tudan, Christopher; Grohse, Peter M; Weber, Frank X; Levine, Michael A; Kim, Yu-Seon J

    2011-08-01

    Bioanalytical methods are used to quantitatively determine the concentration of drugs, biotransformation products or other specified substances in biological matrices and are often used to provide critical data to pharmacokinetic or bioequivalence studies in support of regulatory submissions. In order to ensure that bioanalytical methods are capable of generating reliable, reproducible data that meet or exceed current regulatory guidance, they are subjected to a rigorous method validation process. At present, regulatory guidance does not necessarily account for nuances specific to trace element determinations. This paper is intended to provide the reader with guidance related to trace element bioanalytical method validation from the authors' perspective for two prevalent and powerful instrumental techniques: inductively coupled plasma-optical emission spectrometry and inductively coupled plasma-MS.

  17. Development and validation of stability-indicating HPLC method for determination of cefpirome sulfate.

    PubMed

    Zalewski, Przemysław; Skibiński, Robert; Cielecka-Piontek, Judyta; Bednarek-Rajewska, Katarzyna

    2014-01-01

    A stability-indicating LC assay method was developed and validated for the quantitative determination of cefpirome sulfate (CPS) in the presence of degradation products formed during forced degradation studies. An isocratic HPLC method was developed with a Lichrospher RP-18 column (5 μm particle size, 125 mm x 4 mm) and 12 mM ammonium acetate-acetonitrile (90 : 10 v/v) as the mobile phase. The flow rate of the mobile phase was 1.0 mL/min, the detection wavelength was 270 nm and the temperature was 30 degrees C. Cefpirome sulfate, like other cephalosporins, was subjected to stress degradation conditions in aqueous solutions, including hydrolysis, oxidation, photolysis and thermal degradation. The developed method was validated with regard to linearity, accuracy, precision, selectivity and robustness. The method was applied successfully for identification and determination of cefpirome sulfate in pharmaceuticals and during kinetic studies.

  18. Titanium oxide thin films obtained with physical and chemical vapour deposition methods for optical biosensing purposes.

    PubMed

    Dominik, M; Leśniewski, A; Janczuk, M; Niedziółka-Jönsson, J; Hołdyński, M; Wachnicki, Ł; Godlewski, M; Bock, W J; Śmietana, M

    2017-07-15

    This work discusses the application of titanium oxide (TiOx) thin films deposited by physical (reactive magnetron sputtering, RMS) and chemical (atomic layer deposition, ALD) vapour deposition methods as a functional coating for label-free optical biosensors. The films were applied as a coating for two types of sensors, one based on the localised surface plasmon resonance (LSPR) of gold nanoparticles deposited on a glass plate and the other on a long-period grating (LPG) induced in an optical fibre. Optical and structural properties of the TiOx thin films were investigated and discussed. It was found that the deposition method has a significant influence on the optical properties and composition of the films, but negligible impact on the effectiveness of TiOx surface silanization. A higher oxygen content and lower Ti content in the ALD films lead to layers with a higher refractive index and a slightly higher extinction coefficient than in the RMS TiOx. Moreover, application of the TiOx film, independently of the deposition method, enables not only tuning of the spectral response of the investigated biosensors but also, in the case of LSPR, enhanced ability for biofunctionalization: the TiOx film mechanically protects the nanoparticles and changes the biofunctionalization procedure to one typical for oxides. TiOx-coated LSPR and LPG sensors with refractive index sensitivities close to 30 and 3400 nm/RIU, respectively, were investigated. The ability for molecular recognition was evaluated with the well-known complex formation between avidin and biotin as a model system. The shift in resonance wavelength reached 3 and 13.2 nm for the LSPR and LPG sensors, respectively. Any modification in TiOx properties resulting from the biofunctionalization process can also be clearly detected.

  19. Assembly for collecting samples for purposes of identification or analysis and method of use

    DOEpatents

    Thompson, Cyril V [Knoxville, TN; Smith, Rob R [Knoxville, TN

    2010-02-02

    An assembly and an associated method for collecting a sample of material desired to be characterized with diagnostic equipment includes or utilizes an elongated member having a proximal end with which the assembly is manipulated by a user and a distal end. In addition, a collection tip which is capable of being placed into contact with the material to be characterized is supported upon the distal end. The collection tip includes a body of chemically-inert porous material for binding a sample of material when the tip is placed into contact with the material and thereby holds the sample of material for subsequent introduction to the diagnostic equipment.

  20. Reliability and concurrent validity of a novel method allowing for in-shoe measurement of navicular drop

    PubMed Central

    2014-01-01

    Background Increased navicular drop is associated with increased risk of lower extremity overuse injuries, and foot orthoses are often prescribed to reduce navicular drop. For laboratory studies, transparent shoes may be used to monitor the effect of orthoses, but no clinically feasible methods exist. We have developed a stretch-sensor that allows for in-shoe measurement of navicular drop, but its reliability and validity are unknown. The purpose of this study was to investigate: 1) the reliability of the stretch-sensor for measuring navicular drop, and 2) the concurrent validity of the stretch-sensor compared to the static navicular drop test. Methods Intra- and inter-rater reliability was tested on 27 participants walking on a treadmill on two separate days. The stretch-sensor was positioned 20 mm posterior to the tip of the medial malleolus and 20 mm posterior to the navicular tuberosity. The participants walked six minutes on the treadmill before navicular drop was measured. Reliability was quantified by the Intraclass Correlation Coefficient (ICC 2.1) and agreement was quantified by Limits of Agreement (LOA). To assess concurrent validity, static navicular drop was measured with the stretch-sensor and compared with static navicular drop measured with a ruler on 27 new participants. Linear regression was used to measure concurrent validity. Results The reliability of the stretch-sensor was acceptable for barefoot measurement (intra- and inter-rater ICC: 0.76-0.84) but lower for in-shoe measurement (ICC: 0.65). There was a significant association between static navicular drop measured with the stretch-sensor compared with a ruler (r = 0.745, p < 0.001). Conclusion This study suggests that the stretch-sensor has acceptable reliability for dynamic barefoot measurement of navicular drop. Furthermore, the stretch-sensor shows concurrent validity compared with the static navicular drop test as performed by Brody. This new simple method may hold promise for both
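
    The ICC(2,1) statistic used above (two-way random effects, absolute agreement, single measurement, after Shrout and Fleiss) can be computed from a two-way ANOVA decomposition. A minimal sketch with hypothetical navicular-drop data:

```python
# ICC(2,1) from mean squares of a two-way layout (subjects x sessions).
# The navicular-drop values (mm) are hypothetical.
import numpy as np

def icc_2_1(data):
    """data: n_subjects x k_raters (or sessions) array."""
    data = np.asarray(data, dtype=float)
    n, k = data.shape
    grand = data.mean()
    ms_r = k * np.sum((data.mean(axis=1) - grand) ** 2) / (n - 1)  # subjects
    ms_c = n * np.sum((data.mean(axis=0) - grand) ** 2) / (k - 1)  # sessions
    resid = (data - data.mean(axis=1, keepdims=True)
                  - data.mean(axis=0, keepdims=True) + grand)
    ms_e = np.sum(resid ** 2) / ((n - 1) * (k - 1))
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

# six participants measured on two days
drops = [[6.1, 6.4], [8.0, 7.6], [5.2, 5.5], [9.1, 8.8], [7.3, 7.0], [4.8, 5.1]]
print(f"ICC(2,1) = {icc_2_1(drops):.2f}")
```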

  1. E-Flux2 and SPOT: Validated Methods for Inferring Intracellular Metabolic Flux Distributions from Transcriptomic Data

    PubMed Central

    Kim, Min Kyung; Lane, Anatoliy; Kelley, James J.; Lun, Desmond S.

    2016-01-01

    Background Several methods have been developed to predict system-wide and condition-specific intracellular metabolic fluxes by integrating transcriptomic data with genome-scale metabolic models. While powerful in many settings, existing methods have several shortcomings, and it is unclear which method has the best accuracy in general because of limited validation against experimentally measured intracellular fluxes. Results We present a general optimization strategy for inferring intracellular metabolic flux distributions from transcriptomic data coupled with genome-scale metabolic reconstructions. It consists of two different template models called DC (determined carbon source model) and AC (all possible carbon sources model) and two new methods called E-Flux2 (E-Flux method combined with minimization of the l2 norm) and SPOT (Simplified Pearson cOrrelation with Transcriptomic data), which can be chosen and combined depending on the availability of knowledge of the carbon source or objective function. This enables us to simulate a broad range of experimental conditions. We examined E. coli and S. cerevisiae as representative prokaryotic and eukaryotic microorganisms, respectively. The predictive accuracy of our algorithm was validated by calculating the uncentered Pearson correlation between predicted and measured fluxes. To this end, we compiled 20 experimental conditions (11 in E. coli and 9 in S. cerevisiae) of transcriptome measurements coupled with corresponding central carbon metabolism intracellular flux measurements determined by 13C metabolic flux analysis (13C-MFA), which is the largest dataset assembled to date for the purpose of validating inference methods for predicting intracellular fluxes. In both organisms, our method achieves an average correlation coefficient ranging from 0.59 to 0.87, outperforming a representative sample of competing methods. Easy-to-use implementations of E-Flux2 and SPOT are available as part of the open
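
    The validation metric named above, the uncentered Pearson correlation, differs from the ordinary correlation in that the vectors are not mean-centered before computing the cosine of the angle between them. A minimal sketch with hypothetical flux vectors:

```python
# Uncentered Pearson correlation between predicted and measured flux vectors.
# The flux values are hypothetical, not the paper's data.
import math

def uncentered_pearson(x, y):
    num = sum(a * b for a, b in zip(x, y))
    den = math.sqrt(sum(a * a for a in x) * sum(b * b for b in y))
    return num / den

predicted = [100.0, 45.2, 30.1, 12.7, 8.3]  # fluxes, % of glucose uptake
measured = [100.0, 50.0, 28.5, 10.9, 9.1]   # 13C-MFA values
r = uncentered_pearson(predicted, measured)
print(f"uncentered r = {r:.3f}")
```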

  2. Owner-collected swabs of pets: a method fit for the purpose of zoonoses research.

    PubMed

    Möbius, N; Hille, K; Verspohl, J; Wefstaedt, P; Kreienbrock, L

    2013-09-01

    As part of the preparation of a large cohort study of the entire German population, this study examined the feasibility of cat and dog owners collecting nasal and oral swabs from their animals at home as a method of assessing exposure to zoonoses. In veterinary clinics in Hannover, Germany, 100 pet owners were recruited. Nasal and oral swabs of the pets were taken by a veterinarian at the clinic, and the owners took swabs at home. Swabs were analysed for bacterial growth and compared (owner vs. vet) using Cohen's kappa and McNemar's test. The return rate of kits was 92%, and 77% of owners thought it unnecessary to have veterinarian assistance to swab the mouth. Oral swabs showed 78% agreement for Gram-positive bacterial growth and 87% agreement for Gram-negative bacterial growth, with similar results for the nasal swabs. Although sample quality differed, this method made it possible to receive swabs from pets in order to obtain information about colonization with zoonotic pathogens.
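
    Cohen's kappa for a binary owner-vs-veterinarian comparison can be computed directly from the 2×2 agreement table. The counts below are hypothetical, chosen only to reproduce the 78% observed agreement reported above; they are not the study's data.

```python
# Cohen's kappa from a 2x2 agreement table for a binary culture outcome
# (growth / no growth). Counts are hypothetical.

def cohens_kappa(a, b, c, d):
    """a: both positive, b: vet+/owner-, c: vet-/owner+, d: both negative."""
    n = a + b + c + d
    p_obs = (a + d) / n
    p_exp = ((a + b) * (a + c) + (c + d) * (b + d)) / n ** 2
    return (p_obs - p_exp) / (1 - p_exp)

kappa = cohens_kappa(a=60, b=10, c=12, d=18)
print(f"kappa = {kappa:.2f}")  # observed agreement is (60 + 18) / 100 = 78%
```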

  3. PAH detection in Quercus robur leaves and Pinus pinaster needles: A fast method for biomonitoring purpose.

    PubMed

    De Nicola, F; Concha Graña, E; Aboal, J R; Carballeira, A; Fernández, J Á; López Mahía, P; Prada Rodríguez, D; Muniategui Lorenzo, S

    2016-06-01

    Due to the complexity and heterogeneity of plant matrices, a new procedure should be standardized for each biomonitor. Here, a matrix solid-phase dispersion extraction method, previously used for moss samples, is improved and modified for the analysis of PAHs in Quercus robur leaves and Pinus pinaster needles, species widely used in biomonitoring studies across Europe. The improvements over the previous procedure are the use of Florisil with additional clean-up sorbents, 10% deactivated silica for pine needles and PSA for oak leaves, as these matrices are rich in interfering compounds, as shown by gas chromatography-mass spectrometry analyses acquired in full-scan mode. Good trueness (values in the range 90-120% for most compounds), high precision (intermediate precision between 2% and 12%) and good sensitivity using only 250 mg of sample (limits of quantification lower than 3 and 1.5 ng g(-1) for pine and oak, respectively) were achieved by the selected procedures. These methods proved reliable for PAH analysis and, with the advantage of speed, can be used in biomonitoring studies of atmospheric PAH contamination.

  4. Blood Density Is Nearly Equal to Water Density: A Validation Study of the Gravimetric Method of Measuring Intraoperative Blood Loss.

    PubMed

    Vitello, Dominic J; Ripper, Richard M; Fettiplace, Michael R; Weinberg, Guy L; Vitello, Joseph M

    2015-01-01

    Purpose. The gravimetric method of weighing surgical sponges is used to quantify intraoperative blood loss. The dry mass minus the wet mass of the gauze equals the volume of blood lost. This method assumes that the density of blood is equivalent to water (1 gm/mL). This study's purpose was to validate the assumption that the density of blood is equivalent to water and to correlate density with hematocrit. Methods. 50 µL of whole blood was weighed from eighteen rats. A distilled water control was weighed for each blood sample. The averages of the blood and water were compared utilizing a Student's unpaired, one-tailed t-test. The masses of the blood samples and the hematocrits were compared using a linear regression. Results. The average mass of the eighteen blood samples was 0.0489 g and that of the distilled water controls was 0.0492 g. The t-test showed P = 0.2269 and R2 = 0.03154. The hematocrit values ranged from 24% to 48%. The linear regression R2 value was 0.1767. Conclusions. The R2 value comparing the blood and distilled water masses suggests high correlation between the two populations. Linear regression showed the hematocrit was not proportional to the mass of the blood. The study confirmed that the measured density of blood is similar to water.
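
    The unpaired Student's t-statistic used above can be sketched in a few lines; the gram values below are hypothetical, merely centered near the reported means, and the comparison against a critical value is omitted.

```python
# Pooled-variance (Student's) unpaired t-statistic for two small samples.
# Masses (g) of 50 uL aliquots are hypothetical illustration values.
import math

def t_statistic(x, y):
    """Two-sample Student's t with pooled variance."""
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    vx = sum((v - mx) ** 2 for v in x) / (nx - 1)
    vy = sum((v - my) ** 2 for v in y) / (ny - 1)
    pooled = ((nx - 1) * vx + (ny - 1) * vy) / (nx + ny - 2)
    return (mx - my) / math.sqrt(pooled * (1 / nx + 1 / ny))

blood = [0.0488, 0.0491, 0.0487, 0.0490, 0.0489]
water = [0.0490, 0.0491, 0.0489, 0.0492, 0.0488]
t = t_statistic(blood, water)
print(f"t = {t:.2f}")  # small |t| -> no evidence the two densities differ
```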

  5. Blood Density Is Nearly Equal to Water Density: A Validation Study of the Gravimetric Method of Measuring Intraoperative Blood Loss

    PubMed Central

    Vitello, Dominic J.; Ripper, Richard M.; Fettiplace, Michael R.; Weinberg, Guy L.; Vitello, Joseph M.

    2015-01-01

    Purpose. The gravimetric method of weighing surgical sponges is used to quantify intraoperative blood loss. The dry mass minus the wet mass of the gauze equals the volume of blood lost. This method assumes that the density of blood is equivalent to water (1 gm/mL). This study's purpose was to validate the assumption that the density of blood is equivalent to water and to correlate density with hematocrit. Methods. 50 µL of whole blood was weighed from eighteen rats. A distilled water control was weighed for each blood sample. The averages of the blood and water were compared utilizing a Student's unpaired, one-tailed t-test. The masses of the blood samples and the hematocrits were compared using a linear regression. Results. The average mass of the eighteen blood samples was 0.0489 g and that of the distilled water controls was 0.0492 g. The t-test showed P = 0.2269 and R2 = 0.03154. The hematocrit values ranged from 24% to 48%. The linear regression R2 value was 0.1767. Conclusions. The R2 value comparing the blood and distilled water masses suggests high correlation between the two populations. Linear regression showed the hematocrit was not proportional to the mass of the blood. The study confirmed that the measured density of blood is similar to water. PMID:26464949

  6. A Thematic Review of Interactive Whiteboard Use in Science Education: Rationales, Purposes, Methods and General Knowledge

    NASA Astrophysics Data System (ADS)

    Ormanci, Ummuhan; Cepni, Salih; Deveci, Isa; Aydin, Ozhan

    2015-10-01

    In Turkey and many other countries, the importance of the interactive whiteboard (IWB) is increasing, and as a result, projects and studies are being conducted regarding the use of the IWB in classrooms. Accordingly, in these countries, many issues are being researched, such as the IWB's contribution to the education process, its use in classroom settings and problems that occur when using the IWB. In this context, the research and analysis of studies regarding the use of the IWB have important implications for educators, researchers and teachers. This study aims to review and analyze studies conducted regarding the use of the IWB in the field of science. Accordingly, as a thematic review of the research was deemed appropriate, extant articles available in the literature were analyzed using a matrix that consisted of general features (type of journal, year and demographic properties) and content features (rationales, aims, research methods, samples, data collections, results and suggestions). According to the findings, it was concluded that the studies regarding the use of IWBs were conducted due to deficiencies in the current literature. However, there are rare studies in which the reasons for the research were associated with the nature of science education. There were also studies that focused on the effects of the IWB on student academic success and learning outcomes. Within this context, it is evident that there is a need for further research concerning the use of IWBs in science education and for studies regarding the effect of IWBs on students' skills.

  7. Validity and feasibility of a digital diet estimation method for use with preschool children: a pilot study

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The goal of the study was to assess the validity and feasibility of a digital diet estimation method for use with preschool children in Head Start. Preschool children and their caregivers participated in validation (n=22) and feasibility (n=24) pilot studies. Validity was determined in the metabolic...

  8. Validated UV-spectrophotometric method for the evaluation of the efficacy of makeup remover.

    PubMed

    Charoennit, P; Lourith, N

    2012-04-01

    A UV-spectrophotometric method for the analysis of makeup remover was developed and validated according to ICH guidelines. Three makeup removers for which the main ingredients consisted of vegetable oil (A), mineral oil and silicone (B) and mineral oil and water (C) were sampled in this study. Ethanol was the optimal solvent because it did not interfere with the maximum absorbance of the liquid foundation at 250 nm. The linearity was determined over a range of makeup concentrations from 0.540 to 1.412 mg mL⁻¹ (R² = 0.9977). The accuracy of this method was determined by analysing low, intermediate and high concentrations of the liquid foundation and gave 78.59-91.57% recoveries with a relative standard deviation of <2% (0.56-1.45%). This result demonstrates the validity and reliability of this method. The reproducibilities were 97.32 ± 1.79, 88.34 ± 2.69 and 95.63 ± 2.94 for preparations A, B and C respectively, within the acceptable limits set forth by the ASEAN analytical validation guidelines, ensuring the precision of the method under the same operating conditions over a short time interval as well as the inter-assay precision within the laboratory. The proposed method is therefore a simple, rapid, accurate, precise and inexpensive technique for the routine analysis of makeup remover efficacy.
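    Recovery and relative standard deviation figures of the kind quoted above are computed the same way across validation reports. A minimal sketch, using hypothetical replicate readings rather than the study's data:

```python
# Sketch of the recovery and precision (RSD) calculations behind figures
# like those above. Replicate values are hypothetical, not from the study.
from statistics import mean, stdev

def percent_recovery(measured, spiked):
    """Recovery (%) = measured amount / known spiked amount * 100."""
    return 100.0 * measured / spiked

def rsd_percent(replicates):
    """Relative standard deviation (%) = sample SD / mean * 100."""
    return 100.0 * stdev(replicates) / mean(replicates)

replicates = [0.98, 0.97, 0.99, 0.98]  # hypothetical mg/mL readings
print(round(percent_recovery(mean(replicates), 1.000), 1))  # prints 98.0
print(round(rsd_percent(replicates), 2))
```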

  9. A Conventional Method for Valid "Actual Soil pH" Measurement.

    PubMed

    Oman, Srečko F

    2012-12-01

    After recognition of the Suspension Effect problem in potentiometric measurements in aqueous suspensions, no scientific consensus was reached about its cause and nature. Numerous conventional methods of soil pH measurement were therefore introduced for practical soil pH determination. Most of the results of these methods are not valid with regard to the international pH scale. The method proposed in the present work rejects improper procedures and introduces correct soil sampling and a suitable pH measuring technique, as follows: the indicator glass electrode, substituting for roots in the soil, is inserted in a partly diluted sample suspension of the original soil, and the modified reference electrode contacts the sample in a manner that eliminates the abnormal liquid junction potential. "Actual soil pH values" measured in this way are valid, but the method used is a conventional one. Namely, the irreversible potential of the glass electrode includes the suspension effect of the first kind (SE1) and is a mixed steady-state potential. It is considered by convention as a substitute for and equivalent to the equilibrium potential, which as a rule does not exist in a suspension. The soil pH values measured by the proposed conventional method are reproducible and valid with regard to the international pH scale. They could be considered as the pH values, with an uncertainty of ±0.1 pH unit, to which the roots are exposed.

  10. Determination of vitamin C in foods: current state of method validation.

    PubMed

    Spínola, Vítor; Llorent-Martínez, Eulogio J; Castilho, Paula C

    2014-11-21

    Vitamin C is one of the most important vitamins, so reliable information about its content in foodstuffs is a concern to both consumers and quality control agencies. However, the heterogeneity of food matrixes and the potential degradation of this vitamin during its analysis create enormous challenges. This review addresses the development and validation of high-performance liquid chromatography methods for vitamin C analysis in food commodities, during the period 2000-2014. The main characteristics of vitamin C are mentioned, along with the strategies adopted by most authors during sample preparation (freezing and acidification) to avoid vitamin oxidation. After that, the advantages and handicaps of different analytical methods are discussed. Finally, the main aspects concerning method validation for vitamin C analysis are critically discussed. Parameters such as selectivity, linearity, limit of quantification, and accuracy were studied by most authors. Recovery experiments during accuracy evaluation were in general satisfactory, with usual values between 81 and 109%. However, few methods considered vitamin C stability during the analytical process, and the study of the precision was not always clear or complete. Potential future improvements regarding proper method validation are indicated to conclude this review.

  11. Critical review of near-infrared spectroscopic methods validations in pharmaceutical applications.

    PubMed

    De Bleye, C; Chavez, P-F; Mantanus, J; Marini, R; Hubert, Ph; Rozet, E; Ziemons, E

    2012-10-01

    Based on the large number of publications reported over the past five years, near-infrared spectroscopy (NIRS) is increasingly considered an attractive and promising analytical tool with regard to Process Analytical Technology (PAT) and Green Chemistry. Few of the reviewed publications present a thoroughly validated NIRS method, even though some guidelines have been published by different groups and regulatory authorities. However, as for any analytical method, the validation of a NIRS method is a mandatory step at the end of development, in order to give enough guarantees that each of the future results generated during routine use will be close enough to the true value. The revised European Pharmacopoeia chapter on near-infrared spectroscopy (2.2.40), recently published in Pharmeuropa, both introduces PAT concepts and agrees well with this mandatory step: it suggests using analytical performance characteristics similar to those required for any analytical procedure, with acceptance criteria consistent with the intended use of the method. In this context, this review gives a comprehensive and critical overview of the methodologies applied to assess the validity of quantitative NIRS methods used in pharmaceutical applications.

  12. Validation of the Endopep-MS method for qualitative detection of active botulinum neurotoxins in human and chicken serum

    PubMed Central

    Björnstad, Kristian; Åberg, Annica Tevell; Kalb, Suzanne R.; Wang, Dongxia; Barr, John R.; Bondesson, Ulf; Hedeland, Mikael

    2015-01-01

    Botulinum neurotoxins (BoNTs) are highly toxic proteases produced by anaerobic bacteria. Traditionally, a mouse bioassay (MBA) has been used for detection of BoNTs, but for a long time, laboratories have worked with alternative methods for their detection. One of the most promising in vitro methods is a combination of an enzymatic and mass spectrometric assay called Endopep-MS. However, no comprehensive validation of the method has been presented. The main purpose of this work was to perform an in-house validation for the qualitative analysis of BoNT-A, B, C, C/D, D, D/C, E, and F in serum. The limit of detection (LOD), selectivity, precision, stability in matrix and solution, and correlation with the MBA were evaluated. The LOD was equal to or even better than that of the MBA for BoNT-A, B, D/C, E, and F. Furthermore, Endopep-MS was for the first time successfully used to differentiate between BoNT-C, D and their mosaics C/D and D/C by different combinations of antibodies and target peptides. In addition, sequential antibody capture was presented as a new way to multiplex the method when only a small sample volume is available. In the comparison with the MBA, all the samples analyzed were positive for BoNT-C/D with both methods. These results indicate that the Endopep-MS method is a good alternative to the MBA as the gold standard for BoNT detection based on its sensitivity, selectivity, speed, and that it does not require experimental animals. PMID:25228079

  13. Interlaboratory Validation of the Leaching Environmental Assessment Framework (LEAF) Method 1314 and Method 1315

    EPA Science Inventory

    This report summarizes the results of an interlaboratory study conducted to generate precision estimates for two leaching methods under review by the U.S. EPA’s OSWER for inclusion into the EPA’s SW-846: Method 1314: Liquid-Solid Partitioning as a Function of Liquid...

  14. Development and validation of a molecular size distribution method for polysaccharide vaccines.

    PubMed

    Clément, G; Dierick, J-F; Lenfant, C; Giffroy, D

    2014-01-01

    Determination of the molecular size distribution of vaccine products by high performance size exclusion chromatography coupled to refractive index detection is important during the manufacturing process. Partial elution of high molecular weight compounds in the void volume of the chromatographic column is responsible for variation in the results obtained with a reference method using a TSK G5000PWXL chromatographic column. GlaxoSmithKline Vaccines has developed an alternative method relying on the selection of a different chromatographic column with a wider separation range and the generation of a dextran calibration curve to determine the optimal molecular weight cut-off values for all tested products. Validation of this method was performed according to The International Conference on Harmonisation of Technical Requirements for Registration of Pharmaceuticals for Human Use (ICH). The new method detected product degradation with the same sensitivity as that observed for the reference method. All validation parameters were within the pre-specified range. Precision (relative standard deviation (RSD) of mean values) was < 5 per cent (intra-assay) and < 10 per cent (inter-assay). Sample recovery was > 70 per cent for all polysaccharide conjugates and for the Haemophilus influenzae type B final container vaccine. All results obtained for robustness met the acceptance criteria defined in the validation protocol (≤ 2 times (RSD) or ≤ 2 per cent difference between the modified and the reference parameter value if RSD = 0 per cent). The new method was shown to be a suitable quality control method for the release and stability follow-up of polysaccharide-containing vaccines. The new method gave comparable results to the reference method, but with less intra- and inter-assay variability.

  15. Determination of proline in honey: comparison between official methods, optimization and validation of the analytical methodology.

    PubMed

    Truzzi, Cristina; Annibaldi, Anna; Illuminati, Silvia; Finale, Carolina; Scarponi, Giuseppe

    2014-05-01

    The study compares official spectrophotometric methods for the determination of proline content in honey - those of the International Honey Commission (IHC) and the Association of Official Analytical Chemists (AOAC) - with the original Ough method. Results show that the extra time-consuming treatment stages added by the IHC method with respect to the Ough method are pointless. We demonstrate that the AOAC method proves to be the best in terms of accuracy and time saving. The optimized waiting time for the absorbance recording is set at 35 min from the removal of reaction tubes from the boiling bath used in the sample treatment. The optimized method was validated in the matrix: linearity up to 1800 mg L⁻¹, limit of detection 20 mg L⁻¹, limit of quantification 61 mg L⁻¹. The method was applied to 43 unifloral honey samples from the Marche region, Italy.
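    Detection and quantification limits like those reported above are commonly derived from calibration data via the ICH formulas LOD = 3.3σ/S and LOQ = 10σ/S, where σ is the standard deviation of the blank (or of the calibration residuals) and S the calibration slope. The sketch below uses hypothetical values of σ and slope chosen only for illustration; it is not the authors' actual calculation:

```python
# Sketch of the ICH-style LOD/LOQ estimate: LOD = 3.3*sigma/S and
# LOQ = 10*sigma/S. The sigma and slope below are hypothetical.

def lod_loq(sigma, slope):
    """Return (LOD, LOQ) per the ICH formulas."""
    return 3.3 * sigma / slope, 10.0 * sigma / slope

sigma = 0.0030   # SD of blank response (absorbance units), hypothetical
slope = 0.00049  # calibration slope (absorbance per mg/L), hypothetical
lod, loq = lod_loq(sigma, slope)
print(round(lod), round(loq))  # prints 20 61
```

Note that the LOQ/LOD ratio is fixed at 10/3.3 ≈ 3 by construction, which is consistent with the 20 and 61 mg L⁻¹ pair quoted in the abstract.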

  16. Estimating Rooftop Suitability for PV: A Review of Methods, Patents, and Validation Techniques

    SciTech Connect

    Melius, J.; Margolis, R.; Ong, S.

    2013-12-01

    A number of methods have been developed using remote sensing data to estimate rooftop area suitable for the installation of photovoltaics (PV) at various geospatial resolutions. This report reviews the literature and patents on methods for estimating rooftop-area appropriate for PV, including constant-value methods, manual selection methods, and GIS-based methods. This report also presents NREL's proposed method for estimating suitable rooftop area for PV using Light Detection and Ranging (LiDAR) data in conjunction with a GIS model to predict areas with appropriate slope, orientation, and sunlight. NREL's method is validated against solar installation data from New Jersey, Colorado, and California to compare modeled results to actual on-the-ground measurements.

  17. HPLC-UV method validation for the identification and quantification of bioactive amines in commercial eggs.

    PubMed

    de Figueiredo, Tadeu Chaves; de Assis, Débora Cristina Sampaio; Menezes, Liliane Denize Miranda; da Silva, Guilherme Resende; Lanza, Isabela Pereira; Heneine, Luiz Guilherme Dias; Cançado, Silvana de Vasconcelos

    2015-09-01

    A quantitative and confirmatory high-performance liquid chromatography with ultraviolet detection (HPLC-UV) method for the determination of bioactive amines in the albumen and yolk of commercial eggs was developed, optimized and validated by analyte extraction with trichloroacetic acid and pre-column derivatization with dansyl chloride. Phenylethylamine, putrescine, cadaverine, histamine, tyramine, spermidine and spermine standards were used to evaluate the following performance parameters: limit of detection (LoD), limit of quantification (LoQ), selectivity, linearity, precision, recovery and ruggedness. The LoD of the method was defined from 0.2 to 0.3 mg kg⁻¹ for the yolk matrix and from 0.2 to 0.4 mg kg⁻¹ for the albumen matrix; the LoQ was from 0.7 to 1.0 mg kg⁻¹ for the yolk matrix and from 0.7 to 1.1 mg kg⁻¹ for the albumen matrix. The validated method exhibited excellent selectivity and separation of all amines with coefficients of determination higher than 0.99. The obtained recovery values were from 90.5% to 108.3%, and the relative standard deviation (RSD) was lower than 10% under repeatability conditions for the studied analytes. The performance parameters show the validated method to be adequate for the determination of bioactive amines in egg albumen and yolk.

  18. Development and validation of UFLC-MS/MS method for determination of bosentan in rat plasma.

    PubMed

    Atila, Alptug; Ozturk, Murat; Kadioglu, Yucel; Halici, Zekai; Turkan, Didar; Yayla, Muhammed; Un, Harun

    2014-08-01

    A rapid, simple and sensitive UFLC-MS/MS method was developed and validated for the determination of bosentan in rat plasma using etodolac as an internal standard (IS) after liquid-liquid extraction with diethyl ether-chloroform (4:1, v/v). Bosentan and IS were detected using electrospray ionization in positive ion multiple reaction monitoring mode by monitoring the transitions m/z 551.90→201.90 and 288.20→172.00, respectively. Chromatographic separation was performed on the Inertsil ODS-4 column with a gradient mobile phase, which consisted of 0.1% acetic acid with 5 mM ammonium acetate in water for solvent A and 5 mM ammonium acetate in acetonitrile-methanol (50:50, v/v) for solvent B at a flow rate of 0.3 mL/min. The method was sensitive with 0.5 ng/mL as the lower limit of quantitation (LLOQ) and the standard calibration curve for bosentan was linear (r > 0.997) over the studied concentration range (0.5-2000 ng/mL). The proposed method was fully validated by determining specificity, linearity, LLOQ, precision and accuracy, recovery, matrix effect and stability. The validated method was successfully applied to plasma samples obtained from rats.

  19. Improvement of the validity of the simplified modal method for designing a subwavelength dielectric transmission grating.

    PubMed

    Jing, Xufeng; Zhang, Junchao; Tian, Ying; Jin, Shangzhong

    2014-01-10

    To accurately and easily design the diffraction characteristics of a rectangular transmission grating under the illumination of Littrow mounting, the validity and limitation of the simplified modal method is evaluated by a comparison of diffraction efficiencies predicted by the modal approach to exact results calculated with rigorous coupled-wave analysis. The influence of the grating normalized period, the normalized groove depth, and the fill factor on the accuracy of the modal method is quantitatively determined. More importantly, the reflection effect of two propagating grating modes with the optical thin-film model and the nonsymmetrical Fabry-Perot model is proposed and applied in the modal method to improve the accuracy of the calculated diffraction efficiencies. Generally, it is found that the thin-film model of reflection loss is valid at the smaller normalized period, but the Fabry-Perot model can exactly calculate the reflection loss of grating modes at the larger normalized period. Based on the fact that the validity of the modal approach is determined independently of the incident wavelength, the exact design and analysis of grating diffraction elements can be implemented at different wavelengths by simply scaling the grating parameters. Moreover, the polarization effect of diffraction properties on the limitation of the modal method without and with the reflection loss of grating modes is clearly demonstrated.

  20. Comparison of the quantitative performances and measurement uncertainty estimates obtained during method validation versus routine applications of a novel hydrophilic interaction chromatography method for the determination of cidofovir in human plasma.

    PubMed

    Lecomte, F; Hubert, C; Demarche, S; De Bleye, C; Dispas, A; Jost, M; Frankenne, F; Ceccato, A; Rozet, E; Hubert, Ph

    2012-01-05

    Method validation is essential to ensure that an analytical method is fit for its intended purpose. Additionally, it is advisable to estimate measurement uncertainty in order to allow a correct interpretation of the results generated by analytical methods. Measurement uncertainty can be efficiently estimated during method validation as a top-down approach. However, method validation predictions of the quantitative performances of the assay and estimations of measurement uncertainty may be far away from the real performances obtained during the routine application of this assay. In this work, the predictions of the quantitative performances and measurement uncertainty estimations obtained from a method validation are compared to those obtained during routine applications of a bioanalytical method. For that purpose, a new hydrophilic interaction chromatography (HILIC) method was used. This method was developed for the determination of cidofovir, an antiviral drug, in human plasma. Cidofovir (CDV) is a highly polar molecule presenting three ionizable functions. Therefore, it is an interesting candidate for determination by HILIC mode. CDV is an acyclic cytidine monophosphate analog that has a broad antiviral spectrum and is currently undergoing evaluation in clinical trials as a topical agent for treatment of papillomavirus infections. The analytical conditions were optimized by means of a design-of-experiments approach in order to obtain robust analytical conditions. These were necessary to enable the comparisons mentioned above. After a sample clean-up by means of solid phase extraction, the chromatographic analysis was performed on bare silica stationary phase using a mixture of acetonitrile-ammonium hydrogen carbonate (pH 7.0; 20 mM) (72:28, v/v) as mobile phase. This newly developed bioanalytical method was then fully validated according to FDA (Food and Drug Administration) requirements using a total error approach that guaranteed that each of the future

  1. A statistical method (cross-validation) for bone loss region detection after spaceflight

    PubMed Central

    Zhao, Qian; Li, Wenjun; Li, Caixia; Chu, Philip W.; Kornak, John; Lang, Thomas F.

    2010-01-01

    Astronauts experience bone loss after long spaceflight missions. Identifying specific regions that undergo the greatest losses (e.g. the proximal femur) could reveal information about the processes of bone loss in disuse and disease. Methods for detecting such regions, however, remain an open problem. This paper focuses on statistical methods to detect such regions. We perform statistical parametric mapping to obtain t-maps of changes in images, and propose a new cross-validation method to select an optimum suprathreshold for forming clusters of pixels. Once these candidate clusters are formed, we use permutation testing of longitudinal labels to derive significant changes. PMID:20632144
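    The final step described above, permutation testing of longitudinal labels, can be sketched generically: shuffle the group labels many times and ask how often a difference as large as the observed one arises by chance. The function and data below are hypothetical illustrations, not the study's:

```python
# Generic sketch of a label-permutation test: shuffle the pooled values,
# re-split them, and count how often the shuffled mean difference is at
# least as large as the observed one. All data below are hypothetical.
import random

def permutation_p_value(group_a, group_b, n_perm=10_000, seed=0):
    rng = random.Random(seed)
    pooled = group_a + group_b
    n_a = len(group_a)
    observed = abs(sum(group_a) / n_a - sum(group_b) / len(group_b))
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        a, b = pooled[:n_a], pooled[n_a:]
        if abs(sum(a) / len(a) - sum(b) / len(b)) >= observed:
            hits += 1
    return hits / n_perm

# Hypothetical percent BMD changes for two pixel clusters
cluster = [-4.1, -3.8, -4.5, -3.9, -4.2]   # candidate loss region
background = [-0.2, 0.1, -0.4, 0.0, -0.3]  # reference region
print(permutation_p_value(cluster, background))  # small p => significant loss
```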

  2. Validation of source biasing method for its use in CSNS beamline shielding calculation.

    PubMed

    Liang, Tai-ran; Shen, Fei; Liang, Tian-jiao; Yin, Wen; Yu, Quan-zhi; Yu, Chun-xu

    2014-12-01

    The Chinese Spallation Neutron Source (CSNS) is a high-performance pulsed neutron source with 20 neutron beamlines for neutron scattering instruments. The shielding design of these beamlines usually requires Monte Carlo (MC) calculation, and the use of variance reduction methods is critical to carrying out efficient, reliable MC shielding calculations. This paper discusses the source biasing method based on the actual source term and geometry model of a CSNS neutron beamline. The dose distribution throughout the geometry model was calculated with the FLUKA MC code. The full analogue calculation and the biased calculation were compared, and it was validated that the source biasing method can effectively improve the calculation efficiency.
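    Source biasing is a form of importance sampling: source particles are drawn from a biased distribution q instead of the true distribution p, and each carries the weight p/q so that tallies remain unbiased while the rare, penetrating particles are sampled more often. A toy sketch of the principle, not the actual FLUKA setup; the two-bin spectrum is hypothetical:

```python
# Toy sketch of source biasing as importance sampling (not the FLUKA
# implementation): sample from a biased spectrum q, weight each particle
# by p/q, and verify that weighted tallies recover the true spectrum p.
import random

rng = random.Random(42)

def sample_biased_source(n):
    """Yield (energy_bin, weight). True spectrum p: 90% low-E, 10% high-E.
    Biased spectrum q: 50/50, oversampling the penetrating high-E particles."""
    p = {"low": 0.9, "high": 0.1}
    q = {"low": 0.5, "high": 0.5}
    for _ in range(n):
        kind = "high" if rng.random() < q["high"] else "low"
        yield kind, p[kind] / q[kind]

# Unbiased estimate of the high-energy fraction from the biased sample
total = 0.0
high = 0.0
for kind, w in sample_biased_source(100_000):
    total += w
    if kind == "high":
        high += w
print(round(high / total, 2))  # recovers ~0.1 despite 50/50 sampling
```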

  3. Method for Pre-Conditioning a Measured Surface Height Map for Model Validation

    NASA Technical Reports Server (NTRS)

    Sidick, Erkin

    2012-01-01

    This software allows one to up-sample or down-sample a measured surface map for model validation, not only without introducing any re-sampling errors, but also eliminating the existing measurement noise and measurement errors. Because the re-sampling of a surface map is accomplished based on the analytical expressions of Zernike polynomials and a power spectral density model, such re-sampling does not introduce any aliasing and interpolation errors as is done by the conventional interpolation and FFT-based (fast-Fourier-transform-based) spatial-filtering method. Also, this new method automatically eliminates the measurement noise and other measurement errors such as artificial discontinuity. The developmental cycle of an optical system, such as a space telescope, includes, but is not limited to, the following two steps: (1) deriving requirements or specs on the optical quality of individual optics before they are fabricated through optical modeling and simulations, and (2) validating the optical model using the measured surface height maps after all optics are fabricated. There are a number of computational issues related to model validation, one of which is the "pre-conditioning" or pre-processing of the measured surface maps before using them in a model validation software tool. This software addresses the following issues: (1) up- or down-sampling a measured surface map to match it with the gridded data format of a model validation tool, and (2) eliminating the surface measurement noise or measurement errors such that the resulting surface height map is continuous or smoothly varying. So far, the preferred method used for re-sampling a surface map is two-dimensional interpolation. The main problem of this method is that the same pixel can take different values when the method of interpolation is changed among the different methods such as the "nearest," "linear," "cubic," and "spline" fitting in Matlab. The conventional, FFT-based spatial filtering method used to

  4. Experimental validation of a modal flexibility-based damage detection method for a cyber-physical system

    NASA Astrophysics Data System (ADS)

    Martinez-Castro, Rosana E.; Eskew, Edward L.; Jang, Shinae

    2014-03-01

    The detection and localization of damage in a timely manner is critical in order to avoid the failure of structures. When a structure is subjected to an unscheduled impulsive force, the resulting damage can lead to failure in a very short period of time. As such, a monitoring strategy that can adapt to variability in the environment and that anticipates changes in physical processes has the potential of detecting, locating and mitigating damage. These requirements can be met by a cyber-physical system (CPS) equipped with Wireless Smart Sensor Network (WSSN) systems that is capable of measuring and analyzing dynamic responses in real time using on-board in network processing. The Eigenparameter Decomposition of Structural Flexibility Change (ED) Method is validated with real data and considered to be used in the computational core of this CPS. The condition screening is implemented on a damaged structure and compared to an original baseline calculation, hence providing a supervised learning environment. An experimental laboratory study on a 5-story shear building with three damage conditions subjected to an impulsive force has been chosen to validate the effectiveness of the method proposed to locate and quantify the extent of damage. A numerical simulation of the same building subject to band-limited white noise has also been developed with this purpose. The effectiveness of the ED Method to locate damage is compared to that of the Damage Index Method. With some modifications, the ED Method is capable of locating and quantifying damage satisfactorily in a shear building subject to a lower frequency content predominant excitation.
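    The modal-flexibility idea underlying methods of this family can be sketched generically: the flexibility matrix is assembled from measured modes as F = Σᵢ φᵢφᵢᵀ/ωᵢ², so damage, which lowers frequencies, shows up as increased flexibility at the damaged degrees of freedom. This is not the ED Method itself, and the 2-DOF frequencies and mode shapes below are hypothetical:

```python
# Generic sketch of damage localization via modal flexibility change
# (not the ED Method itself): F = sum_i phi_i phi_i^T / omega_i^2.
# All frequencies and mode shapes below are hypothetical.
import math

def flexibility(freqs_hz, modes):
    """Modal flexibility matrix from natural frequencies (Hz) and
    mass-normalized mode shapes (one list of DOF values per mode)."""
    n = len(modes[0])
    F = [[0.0] * n for _ in range(n)]
    for f, phi in zip(freqs_hz, modes):
        w2 = (2.0 * math.pi * f) ** 2
        for i in range(n):
            for j in range(n):
                F[i][j] += phi[i] * phi[j] / w2
    return F

baseline = flexibility([2.0, 5.5], [[0.6, 0.8], [0.8, -0.6]])
damaged = flexibility([1.7, 5.3], [[0.5, 0.9], [0.9, -0.5]])

# Diagonal flexibility change; the largest increase flags the damaged DOF
delta = [damaged[i][i] - baseline[i][i] for i in range(2)]
print(delta.index(max(delta)))  # prints 1
```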

  5. Optimization and validation of a high-performance liquid chromatography method for the analysis of cardiac glycosides in Digitalis lanata.

    PubMed

    Pellati, Federica; Bruni, Renato; Bellardi, Maria Grazia; Bertaccini, Assunta; Benvenuti, Stefania

    2009-04-10

    In this study, a simple and reliable HPLC method for the qualitative and quantitative analysis of cardiac glycosides in Digitalis lanata Ehrh. raw material was developed and applied to healthy and phytoplasma-infected plants. The target analytes cover a broad range of secondary metabolites, including primary, secondary and tertiary glycosides and the corresponding aglycones. The sample preparation was carried out by sonication of the plant material with 70% (v/v) aqueous methanol at room temperature, followed by reversed-phase solid-phase extraction purification from interfering pigments. The HPLC analyses were performed on a Symmetry C(18) column (75 mm x 4.6 mm I.D., 3.5 µm), with a gradient elution composed of water and acetonitrile, at a flow rate of 1.0 mL/min. The column temperature was set at 20 °C and the photodiode array detector monitored the eluent at 220 nm. The method was validated with respect to ICH guidelines and the validation parameters were found to be highly satisfactory. The application of the method to the analysis of D. lanata leaves indicated that air-drying was the optimum method for raw material processing when compared with freeze-drying. The analysis of healthy and phytoplasma-infected plants demonstrated that the secondary metabolite mainly affected by the pathogen presence was lanatoside C (153.2 µg/100 mg versus 76.1 µg/100 mg). Considering the importance of D. lanata plant material as a source of cardiac glycosides, the developed method can be considered suitable for the phytochemical analysis and for the quality assurance of D. lanata used for pharmaceutical purposes.

  6. Determining an appropriate method for the purpose of land allocation for ecotourism development (case study: Taleghan County, Iran).

    PubMed

    Aliani, H; Kafaky, S Babaie; Saffari, A; Monavari, S M

    2016-11-01

    Appropriate management and planning of suitable areas for the development of ecotourism activities can play an important role in ensuring proper use of the environment. Due to the complexity of nature, applying different tools and models - particularly multi-criteria methods - can be useful in order to achieve these goals. In this study, to identify suitable areas (land allocation) for ecotourism activities in Taleghan County, weighted linear combination (WLC) using a geographical information system (GIS), fuzzy logic, and the analytical network process (ANP) were used. To compare the applicability of each of these methods in achieving the goal, the results were compared with the previous model presented by Makhdoum. The results showed that the WLC and ANP methods are more efficient than the Makhdoum model in allocating lands for recreational areas and ecotourism purposes, since the concomitant use of fuzzy logic and ANP for ranking and weighting the criteria provides more flexible and logical conditions. Furthermore, the method makes it possible to involve ecological, economic, and social criteria simultaneously in the evaluation process in order to allocate land for ecotourism purposes.
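    A weighted linear combination reduces each map cell to a weighted sum of standardized criterion scores. A minimal sketch, with hypothetical criteria and weights rather than the study's ANP-derived ones:

```python
# Minimal sketch of weighted linear combination (WLC) suitability scoring:
# each cell's score is the weighted sum of criterion values standardized
# to [0, 1]. Criteria, weights, and cell values are all hypothetical.

def wlc_score(criteria, weights):
    """criteria: {name: value in [0, 1]}; weights: {name: weight}, summing to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[k] * criteria[k] for k in weights)

weights = {"slope": 0.3, "vegetation": 0.4, "road_access": 0.3}
cells = {
    "A": {"slope": 0.9, "vegetation": 0.8, "road_access": 0.6},
    "B": {"slope": 0.4, "vegetation": 0.5, "road_access": 0.9},
}
scores = {name: wlc_score(c, weights) for name, c in cells.items()}
best = max(scores, key=scores.get)
print(best, round(scores[best], 2))  # the more suitable cell and its score
```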

  7. Comparison of Assertive Community Treatment Fidelity Assessment Methods: Reliability and Validity.

    PubMed

    Rollins, Angela L; McGrew, John H; Kukla, Marina; McGuire, Alan B; Flanagan, Mindy E; Hunt, Marcia G; Leslie, Doug L; Collins, Linda A; Wright-Berryman, Jennifer L; Hicks, Lia J; Salyers, Michelle P

    2016-03-01

    Assertive community treatment is known for improving consumer outcomes, but is difficult to implement. On-site fidelity measurement can help ensure model adherence, but is costly in large systems. This study compared reliability and validity of three methods of fidelity assessment (on-site, phone-administered, and expert-scored self-report) using a stratified random sample of 32 mental health intensive case management teams from the Department of Veterans Affairs. Overall, phone, and to a lesser extent, expert-scored self-report fidelity assessments compared favorably to on-site methods in inter-rater reliability and concurrent validity. If used appropriately, these alternative protocols hold promise in monitoring large-scale program fidelity with limited resources.

  8. [Validation Study for Analytical Method of Diarrhetic Shellfish Poisons in 9 Kinds of Shellfish].

    PubMed

    Yamaguchi, Mizuka; Yamaguchi, Takahiro; Kakimoto, Kensaku; Nagayoshi, Haruna; Okihashi, Masahiro; Kajimura, Keiji

    2016-01-01

    A method was developed for the simultaneous determination of okadaic acid, dinophysistoxin-1, and dinophysistoxin-2 in shellfish using ultra-performance liquid chromatography with tandem mass spectrometry. Shellfish poisons in spiked samples were extracted with methanol and 90% methanol and hydrolyzed with 2.5 mol/L sodium hydroxide solution. Purification was done on an HLB solid-phase extraction column. The method was validated in accordance with the notification of the Ministry of Health, Labour and Welfare of Japan. In the validation study of nine kinds of shellfish, the trueness was 79-101%, and the repeatability and within-laboratory reproducibility were less than 12% and 16%, respectively. The trueness and precision met the target values of the notification.
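    The trueness and repeatability figures reported in studies like this one come from replicate analyses of spiked samples; a minimal sketch of that calculation, using illustrative values rather than the study's data:

```python
# A minimal sketch of how trueness and repeatability are derived from
# replicate analyses of spiked samples. The spike level and measured
# concentrations are illustrative, not the study's data.
import statistics

def trueness_and_rsd(measured, spiked):
    """Trueness as mean recovery (%) and repeatability as RSD (%)."""
    mean = statistics.mean(measured)
    trueness = 100.0 * mean / spiked
    rsd = 100.0 * statistics.stdev(measured) / mean
    return trueness, rsd

measured = [0.92, 0.88, 0.95, 0.90, 0.91]  # found, mg/kg, in 5 replicates
trueness, rsd = trueness_and_rsd(measured, spiked=1.00)  # 91.2%, ~2.8% RSD
```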

  9. Validation of a liquid chromatographic method for determination of tacrolimus in pharmaceutical dosage forms.

    PubMed

    Moyano, María A; Simionato, Laura D; Pizzorno, María T; Segall, Adriana I

    2006-01-01

    An accurate, simple, and reproducible liquid chromatographic method was developed and validated for the determination of tacrolimus in capsules. The analysis is performed at room temperature on a reversed-phase C18 column with UV detection at 210 nm. The mobile phase is methanol-water (90 + 10) at a constant flow rate of 0.8 mL/min. The method was validated in terms of linearity, precision, accuracy, and specificity by forced decomposition of tacrolimus, using acid, base, water, hydrogen peroxide, heat, and light. The response was linear in the range of 0.09-0.24 mg/mL (r2 = 0.9997). The relative standard deviation values for intra- and interday precision studies were 1.28 and 2.91%, respectively. Recoveries ranged from 98.06 to 102.52%.
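    Linearity figures such as the r2 = 0.9997 reported over 0.09-0.24 mg/mL reduce to an ordinary least-squares fit of detector response against concentration. A minimal sketch with synthetic peak areas, not the study's data:

```python
# A minimal sketch of a calibration linearity check: least-squares fit and
# coefficient of determination over the validated range. Peak areas are
# synthetic, not the study's data.

def linear_fit_r2(x, y):
    """Ordinary least-squares slope, intercept, and r^2."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((yi - (slope * xi + intercept)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return slope, intercept, 1.0 - ss_res / ss_tot

conc = [0.09, 0.12, 0.15, 0.18, 0.21, 0.24]  # mg/mL
area = [9.1, 12.0, 15.2, 17.9, 21.1, 24.0]   # synthetic peak areas
slope, intercept, r2 = linear_fit_r2(conc, area)  # slope ~99.5, r2 > 0.999
```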

  10. RETROSPECTIVE METHOD VALIDATION AND UNCERTAINTY ESTIMATION FOR ACTINIDES DETERMINATION IN EXCRETA BY ALPHA SPECTROMETRY.

    PubMed

    Hernández, C; Sierra, I

    2016-09-01

    Two essential technical requirements of the ISO 17025 standard for accreditation of testing and calibration laboratories are the validation of methods and the estimation of all sources of uncertainty that may affect the analytical result. The Bioelimination Laboratory of the Radiation Dosimetry Service of CIEMAT (Spain) uses alpha spectrometry to quantify alpha emitters (Pu, Am, Th, U and Cm isotopes) in urine and faecal samples from workers exposed to internal radiation. Therefore, as a step prior to achieving ISO 17025 accreditation, the laboratory has performed retrospective studies based on results obtained over the past few years to validate the analytical method. Uncertainty estimation was done by identifying and quantifying all the contributions; finally, the overall combined standard uncertainty was calculated.

  11. Validation of a partial coherence interferometry method for estimating retinal shape

    PubMed Central

    Verkicharla, Pavan K.; Suheimat, Marwan; Pope, James M.; Sepehrband, Farshid; Mathur, Ankit; Schmid, Katrina L.; Atchison, David A.

    2015-01-01

    To validate a simple partial coherence interferometry (PCI) based retinal shape method, estimates of retinal shape were determined in 60 young adults using off-axis PCI, with three stages of modeling using variants of the Le Grand model eye, and using magnetic resonance imaging (MRI). Stages 1 and 2 involved a basic model eye without and with ray deviation at surfaces, respectively, and Stage 3 used a model with individual ocular biometry and ray deviation at surfaces. Considering the theoretical uncertainty of MRI (12-14%), the results of the study indicate good agreement between MRI and all three stages of PCI modeling, with <4% and <7% differences in retinal shapes along the horizontal and vertical meridians, respectively. Stages 2 and 3 gave slightly different retinal co-ordinates than Stage 1, and we recommend the intermediate Stage 2 as providing a simple and valid method of determining retinal shape from PCI data. PMID:26417496

  12. Quantification of santonin in eight species of Artemisia from Kazakhstan by means of HPLC-UV: Method development and validation

    PubMed Central

    Bekezhanova, Tolkyn; Sadykova; Shukirbekova, Alma

    2017-01-01

    Santonin, a powerful anthelmintic drug that was formerly used to treat worms, is the main constituent of Artemisia cina. However, due to its toxicity to humans, it is no longer in use. Kazakhstan is looking to introduce this plant as an anthelmintic drug for veterinary purposes, despite the known toxic properties of santonin. The objective of this study was to develop a fast and specific method for the identification of santonin and its precise quantitation using HPLC-UV in order to avoid unnecessary intoxication, which is paramount for the development of veterinary medicines. The results showed that santonin elutes at around 5.7 minutes in this reliable HPLC method. The method was validated by investigating parameters such as precision, accuracy, reproducibility, and recovery. The method was used to identify and quantify santonin in leaves of A. scoparia, A. foetida, A. gmelinni, A. schrenkiana, A. frigida, A. sublesingiana, A. terra-albae, and A. absinthium from Kazakhstan, as well as in three different extracts of leaves of A. cina. This study has provided a faster and simpler method for the identification and quantification of this compound in other species of Artemisia of economic importance. PMID:28301522

  13. Quantification of santonin in eight species of Artemisia from Kazakhstan by means of HPLC-UV: Method development and validation.

    PubMed

    Sakipova, Zuriyadda; Wong, Nikki Siu Hai; Bekezhanova, Tolkyn; Sadykova; Shukirbekova, Alma; Boylan, Fabio

    2017-01-01

    Santonin, a powerful anthelmintic drug that was formerly used to treat worms, is the main constituent of Artemisia cina. However, due to its toxicity to humans, it is no longer in use. Kazakhstan is looking to introduce this plant as an anthelmintic drug for veterinary purposes, despite the known toxic properties of santonin. The objective of this study was to develop a fast and specific method for the identification of santonin and its precise quantitation using HPLC-UV in order to avoid unnecessary intoxication, which is paramount for the development of veterinary medicines. The results showed that santonin elutes at around 5.7 minutes in this reliable HPLC method. The method was validated by investigating parameters such as precision, accuracy, reproducibility, and recovery. The method was used to identify and quantify santonin in leaves of A. scoparia, A. foetida, A. gmelinni, A. schrenkiana, A. frigida, A. sublesingiana, A. terra-albae, and A. absinthium from Kazakhstan, as well as in three different extracts of leaves of A. cina. This study has provided a faster and simpler method for the identification and quantification of this compound in other species of Artemisia of economic importance.

  14. Validated spectrophotometric methods for simultaneous determination of troxerutin and carbazochrome in dosage form

    NASA Astrophysics Data System (ADS)

    Khattab, Fatma I.; Ramadan, Nesrin K.; Hegazy, Maha A.; Al-Ghobashy, Medhat A.; Ghoniem, Nermine S.

    2015-03-01

    Four simple, accurate, sensitive, and precise spectrophotometric methods were developed and validated for the simultaneous determination of Troxerutin (TXN) and Carbazochrome (CZM) in their bulk powders, laboratory-prepared mixtures, and pharmaceutical dosage forms. Method A is first-derivative spectrophotometry (D1), in which TXN and CZM were determined at 294 and 483.5 nm, respectively. Method B is first derivative of ratio spectra (DD1), in which the peak amplitudes at 248 nm for TXN and 439 nm for CZM were used for their determination. Method C is ratio subtraction (RS), in which TXN was determined at its λmax (352 nm) in the presence of CZM, which was determined by D1 at 483.5 nm. Method D is mean centering of the ratio spectra (MCR), in which the mean-centered values at 300 nm and 340.0 nm were used for the two drugs, respectively. The two compounds were simultaneously determined in the concentration ranges of 5.00-50.00 μg mL-1 and 0.5-10.0 μg mL-1 for TXN and CZM, respectively. The methods were validated according to the ICH guidelines, and the results were statistically compared to those of the manufacturer's method.

  15. Experimental validation of normalized uniform load surface curvature method for damage localization.

    PubMed

    Jung, Ho-Yeon; Sung, Seung-Hoon; Jung, Hyung-Jo

    2015-10-16

    In this study, we experimentally validated the normalized uniform load surface (NULS) curvature method, which has been developed recently to assess damage localization in beam-type structures. The normalization technique allows for the accurate assessment of damage localization with greater sensitivity irrespective of the damage location. Damage to a simply supported beam was numerically and experimentally investigated on the basis of the changes in the NULS curvatures, which were estimated from the modal flexibility matrices obtained from the acceleration responses under an ambient excitation. Two damage scenarios were considered, a single damage case and a multiple damage case, created by reducing the bending stiffness (EI) of the affected element(s). Numerical simulations were performed using MATLAB as a preliminary step. During the validation experiments, a series of tests were performed. It was found that the damage locations could be identified successfully without any false-positive or false-negative detections using the proposed method. For comparison, the damage detection performances were compared with those of two other well-known methods based on the modal flexibility matrix, namely, the uniform load surface (ULS) method and the ULS curvature method. It was confirmed that the proposed method is more effective for investigating the damage locations of simply supported beams than the two conventional methods in terms of sensitivity to damage under measurement noise.
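    The quantity such methods inspect can be sketched briefly: the deflection under a unit uniform load is the row sum of the modal flexibility matrix, and its curvature is estimated by central differences. The toy flexibility matrix below is illustrative, and the paper's normalization scheme is omitted.

```python
# A minimal sketch of the uniform load surface and its curvature. The toy
# flexibility matrix is illustrative (not from a real beam model), and the
# paper's normalization scheme is omitted.

def uls_deflection(flexibility):
    """Uniform load surface: row sums of the modal flexibility matrix."""
    return [sum(row) for row in flexibility]

def curvature(u, h=1.0):
    """Central-difference curvature at the interior nodes."""
    return [(u[i - 1] - 2 * u[i] + u[i + 1]) / h ** 2 for i in range(1, len(u) - 1)]

flex = [  # toy 5-node flexibility matrix
    [0, 0, 0, 0, 0],
    [1, 1, 1, 0, 0],
    [1, 2, 1, 0, 0],
    [0, 1, 1, 1, 0],
    [0, 0, 0, 0, 0],
]
u = uls_deflection(flex)   # [0, 3, 4, 3, 0]
kappa = curvature(u)       # uniform here; local damage would perturb it locally
```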

  16. Challenges in the analytical method development and validation for an unstable active pharmaceutical ingredient.

    PubMed

    Sajonz, Peter; Wu, Yan; Natishan, Theresa K; McGachy, Neil T; Detora, David

    2006-03-01

    A sensitive high-performance liquid chromatography (HPLC) impurity profile method for the antibiotic ertapenem is developed and subsequently validated. The method utilizes an Inertsil phenyl column at ambient temperature, gradient elution with aqueous sodium phosphate buffer at pH 8, and acetonitrile as the mobile phase. The linearity, method precision, method ruggedness, limit of quantitation, and limit of detection of the impurity profile HPLC method are found to be satisfactory. The method is determined to be specific, as judged by resolving ertapenem from in-process impurities in crude samples and degradation products that arise from solid state thermal and light stress, acid, base, and oxidative stressed solutions. In addition, evidence is obtained by photodiode array detection studies that no degradate or impurity having a different UV spectrum coeluted with the major component in stressed or unstressed samples. The challenges during the development and validation of the method are discussed. The difficulties of analyzing an unstable active pharmaceutical ingredient (API) are addressed. Several major impurities/degradates of the API have very different UV response factors from the API. These impurities/degradates are synthesized or prepared by controlled degradation and the relative response factors are determined.

  17. A Simple and Valid Method to Determine Thermoregulatory Sweating Threshold and Sensitivity

    DTIC Science & Technology

    2009-01-01

    2009 Journal Article, Journal of Applied Physiology. Research Institute of Environmental Medicine, Natick, MA 01760-5007. Approved for public release; distribution unlimited.

  18. Experimental validation of finite element and boundary element methods for predicting structural vibration and radiated noise

    NASA Technical Reports Server (NTRS)

    Seybert, A. F.; Wu, T. W.; Wu, X. F.

    1994-01-01

    This research report is presented in three parts. In the first part, acoustical analyses were performed on modes of vibration of the housing of a transmission of a gear test rig developed by NASA. The modes of vibration of the transmission housing were measured using experimental modal analysis. The boundary element method (BEM) was used to calculate the sound pressure and sound intensity on the surface of the housing and the radiation efficiency of each mode. The radiation efficiency of each of the transmission housing modes was then compared to theoretical results for a finite baffled plate. In the second part, analytical and experimental validation of methods to predict structural vibration and radiated noise are presented. A rectangular box excited by a mechanical shaker was used as a vibrating structure. Combined finite element method (FEM) and boundary element method (BEM) models of the apparatus were used to predict the noise level radiated from the box. The FEM was used to predict the vibration, while the BEM was used to predict the sound intensity and total radiated sound power using surface vibration as the input data. Vibration predicted by the FEM model was validated by experimental modal analysis; noise predicted by the BEM was validated by measurements of sound intensity. Three types of results are presented for the total radiated sound power: sound power predicted by the BEM model using vibration data measured on the surface of the box; sound power predicted by the FEM/BEM model; and sound power measured by an acoustic intensity scan. In the third part, the structure used in part two was modified. A rib was attached to the top plate of the structure. The FEM and BEM were then used to predict structural vibration and radiated noise respectively. The predicted vibration and radiated noise were then validated through experimentation.

  19. Capillary isoelectric focusing method development and validation for investigation of recombinant therapeutic monoclonal antibody.

    PubMed

    Suba, Dávid; Urbányi, Zoltán; Salgó, András

    2015-10-10

    Capillary isoelectric focusing (cIEF) is a basic and highly accurate routine analytical tool used to prove the identity of protein drugs in quality control (QC) and release tests in the biopharmaceutical industry. Some commercially available "out-of-the-box" kits provide easy and rapid isoelectric focusing solutions for investigating monoclonal antibody drug proteins, but their use in routine testing is costly. A capillary isoelectric focusing method was developed and validated for identification testing of monoclonal antibody drug products with isoelectric points between 7.0 and 9.0. The method provides a good pH gradient for internal calibration (R(2)>0.99) and good resolution between all of the isoform peaks (R=2), while minimizing the time and complexity of sample preparation (no urea or salt used). The method is highly reproducible, and it is suitable for validation and method transfer to any QC laboratory. Another advantage of the method is that it operates with commercially available chemicals that can be purchased from any supplier. Interaction with the capillary walls (precipitation and adsorption) was minimized as far as possible, and synthetic small-molecule isoelectric markers were used instead of peptide- or protein-based markers. The developed method was validated according to the current ICH guideline (Q2(R1)). Relative standard deviation results were below 0.2% for isoelectric points and below 4% for the normalized migration times. The method is robust to buffer components from different lots and to neutral capillaries with different types of inner coating. The fluorocarbon-coated column was chosen for reasons of cost-effectiveness.

  20. Determination of paraquat and diquat: LC-MS method optimization and validation.

    PubMed

    Pizzutti, Ionara R; Vela, Giovana M E; de Kok, André; Scholten, Jos M; Dias, Jonatan V; Cardoso, Carmem D; Concenço, Germani; Vivian, Rafael

    2016-10-15

    This study describes the optimization and single-laboratory validation of a single-residue method for the determination of two bipyridylium herbicides, paraquat and diquat, in cowpeas by UPLC-MS/MS in a total run time of 9.3 min. The method is based on extraction with an acidified methanol-water mixture. Different extraction parameters (extraction solvent composition, temperature, sample extract filtration, and pre-treatment of the laboratory sample) were evaluated in order to optimize extraction efficiency. Isotopically labeled internal standards, Paraquat-D6 and Diquat-D4, were added to the test portions prior to extraction. The method validation was performed by analyzing spiked samples at three concentrations (10, 20 and 50 μg kg(-1)), with seven replicates (n=7) at each concentration. Linearity (r(2)) of the analytical curves, accuracy (trueness as recovery % and precision as RSD %), instrument and method limits of detection and quantification (LOD and LOQ), and matrix effects were determined. Average recoveries obtained for diquat were between 77 and 85%, with RSD values ⩽20% for all spike levels studied. Paraquat showed average recoveries between 68 and 103%, with RSDs in the range 14.4-25.4%. The method LOQ was 10 and 20 μg kg(-1) for diquat and paraquat, respectively. The matrix effect was significant for both pesticides; consequently, matrix-matched calibration standards and isotopically labeled (IL) analogues as internal standards for the target analytes are required for application in routine analysis. The validated method was successfully applied to cowpea samples obtained from various field studies.
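    Quantification against an isotopically labelled internal standard, as used here with Paraquat-D6 and Diquat-D4, can be sketched in a few lines. The peak areas, internal-standard level, and response factor below are illustrative, not the study's data.

```python
# A minimal sketch of quantification against an isotopically labelled (IL)
# internal standard. Peak areas, the IS level, and the response factor are
# illustrative values.

def quantify_with_il_standard(area_analyte, area_is, conc_is, response_factor=1.0):
    """Concentration from the analyte / internal-standard response ratio.

    Because the labelled standard co-extracts and co-ionizes with the analyte,
    extraction losses and matrix effects largely cancel in the ratio.
    """
    return (area_analyte / area_is) * conc_is / response_factor

# e.g. a diquat peak area of 5200 against a Diquat-D4 area of 10000,
# with the IS spiked at 20 ug/kg:
conc = quantify_with_il_standard(area_analyte=5200, area_is=10000, conc_is=20.0)
```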

  1. Validation of analysis methods for assessing flawed piping subjected to dynamic loading

    SciTech Connect

    Olson, R.J.; Wolterman, R.L.; Wilkowski, G.M.; Kot, C.A.

    1994-08-01

    Argonne National Laboratory and Battelle have jointly conducted a research program for the USNRC to evaluate the ability of current engineering analysis methods and one state-of-the-art analysis method to predict the behavior of a circumferentially surface-cracked pipe system in a water-hammer experiment. The experimental data used in the evaluation were from the HDR Test Group E31 series conducted by the Kernforschungszentrum Karlsruhe (KfK) in Germany. The incentive for this evaluation was that simplified engineering methods, as well as newer "state-of-the-art" fracture analysis methods, have typically been validated only with static experimental data; hence, these dynamic experiments were of high interest. High-rate dynamic loading can be classified as either repeating, e.g., seismic, or nonrepeating, e.g., water hammer. Development of experimental data and validation of cracked-pipe analyses under seismic loading (repeating dynamic loads) are being pursued separately within the NRC's International Piping Integrity Research Group (IPIRG) program. This report describes developmental and validation efforts to predict crack stability under water-hammer loading, as well as comparisons using currently used analysis procedures. Current fracture analysis methods use elastic stress analysis loads decoupled from the fracture mechanics analysis, while state-of-the-art methods employ nonlinear cracked-pipe time-history finite element analyses. The results showed that the current decoupled methods were conservative in their predictions, whereas the cracked-pipe finite element analyses were more accurate, yet slightly conservative. The nonlinear time-history cracked-pipe finite element analyses conducted in this program were also attractive in that they were done on a small Apollo DN5500 workstation, whereas other cracked-pipe dynamic analyses conducted in Europe on the same experiments required the use of a CRAY2 supercomputer and were less accurate.

  2. Validation of finite element and boundary element methods for predicting structural vibration and radiated noise

    NASA Technical Reports Server (NTRS)

    Seybert, A. F.; Wu, X. F.; Oswald, Fred B.

    1992-01-01

    Analytical and experimental validation of methods to predict structural vibration and radiated noise are presented. A rectangular box excited by a mechanical shaker was used as a vibrating structure. Combined finite element method (FEM) and boundary element method (BEM) models of the apparatus were used to predict the noise radiated from the box. The FEM was used to predict the vibration, and the surface vibration was used as input to the BEM to predict the sound intensity and sound power. Vibration predicted by the FEM model was validated by experimental modal analysis. Noise predicted by the BEM was validated by sound intensity measurements. Three types of results are presented for the total radiated sound power: (1) sound power predicted by the BEM modeling using vibration data measured on the surface of the box; (2) sound power predicted by the FEM/BEM model; and (3) sound power measured by a sound intensity scan. The sound power predicted from the BEM model using measured vibration data yields an excellent prediction of radiated noise. The sound power predicted by the combined FEM/BEM model also gives a good prediction of radiated noise except for a shift of the natural frequencies that are due to limitations in the FEM model.

  3. Bridging the Gap Between Validation and Implementation of Non-Animal Veterinary Vaccine Potency Testing Methods.

    PubMed

    Dozier, Samantha; Brown, Jeffrey; Currie, Alistair

    2011-11-29

    In recent years, technologically advanced high-throughput techniques have been developed that replace, reduce or refine animal use in vaccine quality control tests. Following validation, these tests are slowly being accepted for use by international regulatory authorities. Because regulatory acceptance itself has not guaranteed that approved humane methods are adopted by manufacturers, various organizations have sought to foster the preferential use of validated non-animal methods by interfacing with industry and regulatory authorities. After noticing this gap between regulation and uptake by industry, we began developing a paradigm that seeks to narrow the gap and quicken implementation of new replacement, refinement or reduction guidance. A systematic analysis of our experience in promoting the transparent implementation of validated non-animal vaccine potency assays has led to the refinement of our paradigmatic process, presented here, by which interested parties can assess the local regulatory acceptance of methods that reduce animal use and integrate them into quality control testing protocols, or ensure the elimination of peripheral barriers to their use, particularly for potency and other tests carried out on production batches.

  4. Validity and reliability of a method for retrospective evaluation of chlorophenate exposure in the lumber industry

    SciTech Connect

    Hertzman, C.; Teschke, K.; Dimich-Ward, H.; Ostry, A.

    1988-01-01

    This paper describes the validity and reliability of a method to retrospectively assess exposure to antisapstain agents used in sawmills (chlorophenates). The method is based on experienced workers' estimates of exposure for each job title at the sawmill where they work. At a pilot mill, 10 randomly selected workers estimated the frequency and duration of exposure to chlorophenates for all 59 job titles. The reliability of their mean exposure estimates was very high, with an intraclass correlation coefficient for all raters of 0.91 (based on a calculated index of exposure). To assess validity, urinary chlorophenate levels were measured for 86% of the workers at the mill during the summer and/or fall and compared to the experienced workers' estimates of exposure. The correlations between workers' exposure estimates and urinary chlorophenate levels for each job title were consistently above 0.65 in all analyses and greater than 0.72 when summer and fall urine sample results were averaged. The evidence indicates that the validity and reliability of worker exposure estimates are high enough to justify investigation of the method's generalizability to other sawmills.
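    The reported inter-rater reliability is an intraclass correlation coefficient; a one-way ICC of the kind used for agreement across raters can be sketched as follows, with illustrative ratings (rows are job titles, columns are raters), not the study's data.

```python
# A minimal sketch of a one-way intraclass correlation coefficient, ICC(1,1),
# the kind of statistic behind the reported inter-rater reliability of 0.91.
# All ratings are illustrative.
import statistics

def icc_oneway(ratings):
    """ICC(1,1) = (MS_between - MS_within) / (MS_between + (k - 1) * MS_within)."""
    n, k = len(ratings), len(ratings[0])
    grand = statistics.mean(x for row in ratings for x in row)
    row_means = [statistics.mean(row) for row in ratings]
    ms_between = k * sum((m - grand) ** 2 for m in row_means) / (n - 1)
    ms_within = sum((x - m) ** 2
                    for row, m in zip(ratings, row_means)
                    for x in row) / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

ratings = [   # exposure index for 4 job titles, each rated by 3 workers
    [8, 9, 8],
    [2, 1, 2],
    [5, 5, 6],
    [9, 9, 8],
]
icc = icc_oneway(ratings)   # close agreement across raters -> ICC near 1
```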

  5. Pressure ulcer prevention algorithm content validation: a mixed-methods, quantitative study.

    PubMed

    van Rijswijk, Lia; Beitz, Janice M

    2015-04-01

    Translating pressure ulcer prevention (PUP) evidence-based recommendations into practice remains challenging for a variety of reasons, including the perceived quality, validity, and usability of the research or the guideline itself. Following the development and face validation testing of an evidence-based PUP algorithm, additional stakeholder input and testing were needed. Using convenience sampling methods, wound care experts attending a national wound care conference and a regional wound ostomy continence nursing (WOCN) conference and/or graduates of a WOCN program were invited to participate in an Institutional Review Board-approved, mixed-methods quantitative survey with qualitative components to examine algorithm content validity. After participants provided written informed consent, demographic variables were collected and participants were asked to comment on and rate the relevance and appropriateness of each of the 26 algorithm decision points/steps using standard content validation study procedures. All responses were anonymous. Descriptive summary statistics, mean relevance/appropriateness scores, and the content validity index (CVI) were calculated. Qualitative comments were transcribed and thematically analyzed. Of the 553 wound care experts invited, 79 (average age 52.9 years, SD 10.1; range 23-73) consented to participate and completed the study (a response rate of 14%). Most (67, 85%) were female, registered (49, 62%) or advanced practice (12, 15%) nurses, and had > 10 years of health care experience (88, 92%). Other health disciplines included medical doctors, physical therapists, nurse practitioners, and certified nurse specialists. Almost all had received formal wound care education (75, 95%). On a Likert-type scale of 1 (not relevant/appropriate) to 4 (very relevant and appropriate), the average score for the entire algorithm/all decision points (N = 1,912) was 3.72, with an overall CVI of 0.94 (out of 1).
The only decision point/step recommendation
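    The content validity index behind the reported 0.94 is a simple proportion-based statistic; a minimal sketch, where on the study's 1-4 scale a rating of 3 or 4 counts as relevant, and the expert ratings below are illustrative:

```python
# A minimal sketch of the content validity index (CVI) calculation. On a
# 1-4 relevance scale, ratings of 3 or 4 count as "relevant"; the expert
# ratings below are illustrative, not the study's data.

def item_cvi(ratings, relevant=(3, 4)):
    """I-CVI: proportion of raters scoring an item as relevant."""
    return sum(r in relevant for r in ratings) / len(ratings)

def scale_cvi(items):
    """S-CVI/Ave: average of the item-level CVIs."""
    return sum(item_cvi(ratings) for ratings in items) / len(items)

items = [            # 3 decision points rated by 5 experts
    [4, 4, 3, 4, 2],
    [3, 4, 4, 4, 4],
    [4, 3, 4, 2, 4],
]
cvi = scale_cvi(items)   # (0.8 + 1.0 + 0.8) / 3
```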

  6. Development and Validation of a Method to Measure Lumbosacral Motion Using Ultrasound Imaging.

    PubMed

    van den Hoorn, Wolbert; Coppieters, Michel W; van Dieën, Jaap H; Hodges, Paul W

    2016-05-01

    The aim of the study was to validate an ultrasound imaging technique for measuring sagittal-plane lumbosacral motion. Direct and indirect measures of lumbosacral angle change were developed and validated. The lumbosacral angle was estimated as the angle between lines through two landmarks on the sacrum and two landmarks on the lowest lumbar vertebra. A distance measure was also made between the sacrum and the lumbar vertebrae, and the angle was estimated after the distance was calibrated to angle. The method was tested in an in vitro spine and an in vivo porcine spine and validated against video and fluoroscopy measures, respectively. R(2), regression coefficients, and mean absolute differences between the ultrasound measures and the validation measures were, respectively: 0.77, 0.982, 0.67° (in vitro, angle); 0.97, 0.992, 0.82° (in vitro, distance); 0.94, 0.995, 2.1° (in vivo, angle); and 0.95, 0.997, 1.7° (in vivo, distance). Lumbosacral motion can be accurately measured with ultrasound. This provides a basis for developing measurements for use in humans.
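    The direct angle measure described reduces to the angle between two landmark lines; a minimal sketch, with illustrative landmark coordinates (e.g. mm in the image plane), not measured data:

```python
# A minimal sketch of the direct angle measure: each bone contributes a line
# through two imaged landmarks, and the lumbosacral angle is the angle between
# the two lines. Coordinates are illustrative.
import math

def line_angle(p1, p2):
    """Orientation of the line through p1 -> p2, in degrees."""
    return math.degrees(math.atan2(p2[1] - p1[1], p2[0] - p1[0]))

def lumbosacral_angle(sacrum_pts, lumbar_pts):
    """Signed sagittal-plane angle between the lumbar and sacral lines."""
    return line_angle(*lumbar_pts) - line_angle(*sacrum_pts)

sacrum = ((0.0, 0.0), (10.0, 2.0))   # two sacral landmarks
lumbar = ((12.0, 3.0), (22.0, 8.0))  # two lumbar landmarks
angle = lumbosacral_angle(sacrum, lumbar)   # ~15.3 degrees
```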

  7. Validation of 3D Seismic Velocity Models Using the Spectral Element Method

    NASA Astrophysics Data System (ADS)

    Maceira, M.; Larmat, C. S.; Porritt, R. W.; Higdon, D.; Allen, R. M.

    2012-12-01

    For over a decade now, many research institutions have been focusing on addressing the Earth's 3D heterogeneities and complexities by improving tomographic methods. Utilizing dense array datasets, these efforts have led to unprecedented 3D seismic images, but little has been done in terms of model validation or absolute assessment of model uncertainty. Furthermore, the question "How good is a 3D geophysical model at representing the Earth's true physics?" remains largely unaddressed at a time when 3D Earth models are used for societal and energy security. In the last few years, new horizons have opened up in Earth structure imaging with the advent of new numerical and mathematical methods in computational seismology and the statistical sciences. We use these methods to tackle the question of model validation, taking advantage of the unique and extensive High Performance Computing resources available at Los Alamos National Laboratory. We present results from a study focused on validating 3D models of the western USA generated using both ray-theoretical and finite-frequency approximations; in this manner we validate not just the model but also the imaging technique. For this test case, we utilize the Dynamic North America (DNA) model family of UC Berkeley, as the models are readily available in both formulations. We evaluate model performance by comparing observed and synthetic seismograms generated using the Spectral Element Method. Results show that both the finite-frequency and ray-theoretical DNA09 models predict the observations well. Waveform cross-correlation coefficients show a difference in performance between the two formulations only at the shortest periods (<15 s), with no perceptible difference at longer periods (50-200 s). At those shortest periods, and based on statistical analyses of S-wave phase delay measurements, finite frequency shows an improvement over ray theory. 
We are also investigating the breakdown of ray

  8. Development of a benchtop baking method for chemically leavened crackers. II. Validation of the method

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A benchtop baking method has been developed to predict the contribution of gluten functionality to overall flour performance for chemically leavened crackers. Using a diagnostic formula and procedure, dough rheology was analyzed to evaluate the extent of gluten development during mixing and machinin...

  9. Validation of a two-dimensional liquid chromatography method for quality control testing of pharmaceutical materials.

    PubMed

    Yang, Samuel H; Wang, Jenny; Zhang, Kelly

    2017-04-07

    Despite the advantages of 2D-LC, there has been little to no work demonstrating the suitability of 2D-LC methods for use in a quality control (QC) environment for good manufacturing practice (GMP) tests. This lack of information becomes more critical as the availability of commercial 2D-LC instrumentation increases and more testing facilities acquire 2D-LC capabilities. It is increasingly important that the transferability of developed 2D-LC methods be assessed in terms of reproducibility, robustness, and performance across different laboratories worldwide. The work presented here focuses on the evaluation of a heart-cutting 2D-LC method used for the analysis of a pharmaceutical material, in which a key co-eluting impurity in the first dimension ((1)D) is resolved from the main peak and analyzed in the second dimension ((2)D). A design-of-experiments (DOE) approach was taken in the collection of the data, and the results were then modeled in order to evaluate method robustness using statistical modeling software. This quality-by-design (QbD) approach gives a deeper understanding of the impact of these 2D-LC critical method attributes (CMAs) and how they affect overall method performance. Although multiple parameters may be critical from a method development point of view, a special focus of this work is the evaluation, from a method validation perspective, of unique 2D-LC critical method attributes that transcend conventional method development and validation. The 2D-LC method attributes are evaluated for recovery, peak shape, and resolution of the two co-eluting compounds in question on the (2)D. Linearity, accuracy, precision, repeatability, and sensitivity are assessed, along with day-to-day, analyst-to-analyst, and lab-to-lab (instrument-to-instrument) variability. 
The results of this validation study demonstrate that the 2D-LC method is accurate, sensitive, and robust.
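A two-level full-factorial design over a handful of critical method attributes can be sketched as follows; the factor names and ranges are hypothetical illustrations, not the CMAs of the published method:

```python
from itertools import product

# Hypothetical 2D-LC critical method attributes (CMAs) and two-level ranges.
factors = {
    "heartcut_width_s": (6.0, 12.0),      # width of the 1D heart-cut window
    "d2_gradient_time_min": (0.5, 1.0),   # 2D gradient time
    "d2_flow_ml_min": (2.0, 3.0),         # 2D flow rate
}

# Two-level full-factorial design: 2**3 = 8 runs
design = [dict(zip(factors, levels)) for levels in product(*factors.values())]

def main_effect(runs, responses, factor):
    """Mean response at the high level of a factor minus that at the low level."""
    hi_level = max(run[factor] for run in runs)
    lo_level = min(run[factor] for run in runs)
    hi = [r for run, r in zip(runs, responses) if run[factor] == hi_level]
    lo = [r for run, r in zip(runs, responses) if run[factor] == lo_level]
    return sum(hi) / len(hi) - sum(lo) / len(lo)
```

In a robustness study of this kind, a response such as (2)D resolution or recovery would be measured for each of the 8 runs and the factor effects compared against the method's acceptance limits.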

  10. The development and validation of a single SNaPshot multiplex for tiger species and subspecies identification--implications for forensic purposes.

    PubMed

    Kitpipit, Thitika; Tobe, Shanan S; Kitchener, Andrew C; Gill, Peter; Linacre, Adrian

    2012-03-01

    The tiger (Panthera tigris) is currently listed on Appendix I of the Convention on International Trade in Endangered Species of Wild Fauna and Flora; this affords it the highest level of international protection. To aid in the investigation of alleged illegal trade in tiger body parts and derivatives, molecular approaches have been developed to identify biological material as tiger in origin. Some countries also require knowledge of the exact tiger subspecies present in order to prosecute anyone alleged to be trading in tiger products. In this study we aimed to develop and validate a reliable single assay to identify tiger species and subspecies simultaneously; the test is based on identification of single nucleotide polymorphisms (SNPs) within the tiger mitochondrial genome. Mitochondrial DNA sequences from four of the five extant putative tiger subspecies that currently exist in the wild were obtained and combined with DNA sequence data from 492 tigers and 349 other mammalian species available on GenBank. From the sequence data, a total of 11 SNP loci were identified as suitable for further analyses. Five SNPs were species-specific for tiger and six were subspecies-specific: three specific to P. t. sumatrae and three specific to P. t. tigris. The multiplex assay reliably identified 15 voucher tiger samples. The sensitivity of the test was 15,000 mitochondrial DNA copies (approximately 0.26 pg), indicating that it will work on trace amounts of tissue, bone or hair samples. This simple test will add to the DNA-based methods currently being used to identify the presence of tiger within mixed samples.
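The reported sensitivity (15,000 mtDNA copies, roughly 0.26 pg) can be sanity-checked by converting copy number to mass; the genome length below is an assumed approximate value for felid mitochondrial DNA, not a figure from the paper:

```python
AVOGADRO = 6.022e23          # molecules per mole
BP_MASS_G_PER_MOL = 660.0    # average mass of one double-stranded base pair, g/mol
TIGER_MT_GENOME_BP = 16_990  # assumed approximate tiger mitogenome length, bp

def copies_to_picograms(copies, genome_bp=TIGER_MT_GENOME_BP):
    """Mass in picograms of a given number of double-stranded DNA copies."""
    grams = copies * genome_bp * BP_MASS_G_PER_MOL / AVOGADRO
    return grams * 1e12

mass_pg = copies_to_picograms(15_000)  # on the order of 0.3 pg
```

The result lands close to the ~0.26 pg quoted in the abstract, consistent with a mitogenome of roughly 17 kb.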

  11. Chiral purity assay for Flindokalner using tandem mass spectrometry: method development, validation, and benchmarking.

    PubMed

    Young, Brandy L; Cooks, R G; Madden, Michelle C; Bair, Michael; Jia, Jingpin; Aubry, Anne-Françoise; Miller, Scott A

    2007-04-11

    The present work demonstrates the application and validation of a mass spectrometry method for quantitative chiral purity determination. The particular compound analyzed is Flindokalner, a Bristol-Myers Squibb drug candidate for post-stroke neuroprotection. Chiral quantification of Flindokalner was achieved using tandem mass spectrometry (MS/MS) and the kinetic method, a gas phase method used for thermochemical and chiral determinations. The MS/MS method was validated and benchmarked against two separate chromatographic techniques, chiral high performance liquid chromatography with ultraviolet detection (LC/UV) and achiral high performance liquid chromatography with circular dichroism detection (LC/CD). The chiral purity determination of Flindokalner using MS/MS proved to be rapid (3 min run time for each sample) and to have accuracy and precision comparable to the chiral LC/UV and achiral LC/CD methods. This method represents an alternative to commonly used chromatographic techniques as a means of chiral purity determination and is particularly useful in rapid screening experiments.

  12. Development and validation of SCALE nuclear analysis methods for high temperature gas-cooled reactors

    SciTech Connect

    Gehin, Jess C; Jessee, Matthew Anderson; Williams, Mark L; Lee, Deokjung; Goluoglu, Sedat; Ilas, Germina; Ilas, Dan; Bowman, Steve A

    2010-01-01

    In support of the U.S. Nuclear Regulatory Commission, ORNL is updating the nuclear analysis methods and data in the SCALE code system to support modeling of HTGRs. Development activities include methods used for reactor physics, criticality safety, and radiation shielding. This paper focuses on the nuclear methods in support of reactor physics, which primarily include lattice physics for cross-section processing of both prismatic and pebble-bed designs, Monte Carlo depletion methods and efficiency improvements for double heterogeneous fuels, and validation against relevant experiments. These method enhancements are being validated using available experimental data from the HTTR and HTR-10 startup and initial criticality experiments. Results obtained with three-dimensional Monte Carlo models of the HTTR initial core critical configurations with SCALE6/KENO show excellent agreement between the continuous energy and multigroup methods, and the results are consistent with results obtained by others. A three-dimensional multigroup Monte Carlo model for the initial critical core of the HTR-10 has been developed with SCALE6/KENO based on the benchmark specifications included in the IRPhE Handbook. The core eigenvalue obtained with this model is in very good agreement with the corresponding value obtained with a consistent continuous energy MCNP5 core model.

  13. Validation of a Smartphone Image-Based Dietary Assessment Method for Pregnant Women

    PubMed Central

    Ashman, Amy M.; Collins, Clare E.; Brown, Leanne J.; Rae, Kym M.; Rollo, Megan E.

    2017-01-01

    Image-based dietary records could lower the participant burden associated with traditional prospective methods of dietary assessment. They have been used in children, adolescents and adults, but have not been evaluated in pregnant women. The current study evaluated the relative validity of the DietBytes image-based dietary assessment method for assessing energy and nutrient intakes. Pregnant women collected image-based dietary records (via a smartphone application) of all food, drinks and supplements consumed over three non-consecutive days. Intakes from the image-based method were compared to intakes from three 24-h recalls, conducted on random days, once per week, in the weeks following the image-based record. Data were analyzed using nutrient analysis software. Agreement between methods was ascertained using Pearson correlations and Bland-Altman plots. Twenty-five women (27 recruited, one withdrew, one incomplete), median age 29 years, 15 primiparas, eight Aboriginal Australians, completed image-based records for analysis. Significant correlations between the two methods were observed for energy, macronutrients and fiber (r = 0.58–0.84, all p < 0.05), and for micronutrients both including (r = 0.47–0.94, all p < 0.05) and excluding (r = 0.40–0.85, all p < 0.05) supplements in the analysis. Bland-Altman plots confirmed acceptable agreement with no systematic bias. The DietBytes method demonstrated acceptable relative validity for assessment of nutrient intakes of pregnant women. PMID:28106758
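Agreement statistics of the kind summarized in a Bland-Altman plot (mean bias and 95% limits of agreement) can be computed as below; the paired energy intakes are invented for illustration, not data from this study:

```python
import statistics

def bland_altman(method_a, method_b):
    """Bland-Altman mean bias and 95% limits of agreement between two methods."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Illustrative paired energy intakes (kJ/day) from two assessment methods
image_based = [8200, 7600, 9100, 8800, 7900]
recalls = [8000, 7900, 8900, 9000, 7700]
bias, lo, hi = bland_altman(image_based, recalls)
```

A bias near zero with both limits of agreement inside a pre-specified tolerance is the usual criterion for "acceptable agreement with no systematic bias".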

  14. Validation of quantitative and qualitative methods for detecting allergenic ingredients in processed foods in Japan.

    PubMed

    Sakai, Shinobu; Adachi, Reiko; Akiyama, Hiroshi; Teshima, Reiko

    2013-06-19

    A labeling system for food allergenic ingredients was established in Japan in April 2002. To monitor the labeling, the Japanese government announced official methods for detecting allergens in processed foods in November 2002. The official methods consist of quantitative screening tests using enzyme-linked immunosorbent assays (ELISAs) and qualitative confirmation tests using Western blotting or polymerase chain reactions (PCR). In addition, the Japanese government designated 10 μg protein/g food (the corresponding allergenic ingredient soluble protein weight/food weight), determined by ELISA, as the labeling threshold. To standardize the official methods, the criteria for the validation protocol were described in the official guidelines. This paper, which was presented at the Advances in Food Allergen Detection Symposium, ACS National Meeting and Expo, San Diego, CA, Spring 2012, describes the validation protocol outlined in the official Japanese guidelines, the results of interlaboratory studies for the quantitative detection method (ELISA for crustacean proteins) and the qualitative detection method (PCR for shrimp and crab DNAs), and the reliability of the detection methods.

  15. Validation of a sensitive ion chromatography method for determination of monoethylsulfate in Indinavir sulfate drug substance.

    PubMed

    Prasanna, S John; Sharma, Hemant Kumar; Mukkanti, K; Sivakumaran, M; Pavan Kumar, K S R; Kumar, V Jagadeesh

    2009-12-05

    The present study relates to the optimization of an ion chromatography method to determine the content of monoethylsulfate at very low levels in Indinavir sulfate drug substance, and the subsequent validation of the method to prove its suitability, reliability and sensitivity. Monoethylsulfate is a potential impurity of Indinavir sulfate and may form during preparation as well as during storage. The ion chromatography method was developed to enhance the detection level by introducing a suppressor, and to minimize acquisition time by using a suitable buffer of 3.2 mmol sodium carbonate and 1 mmol sodium hydrogen carbonate in water as eluent. The retention time of monoethylsulfate was about 9.5 min and the total acquisition time was 25 min. The optimized method was validated to prove its performance characteristics by demonstrating selectivity, sensitivity (limit of detection and quantification), linearity, precision and accuracy. The established limits of detection and quantification of monoethylsulfate in Indinavir sulfate were found to be 24 ng/ml and 74 ng/ml, respectively, and the overall percent accuracy (recovery) of samples evaluated at different concentration levels was found to be 97.1%, indicating the sensitivity and accuracy of the optimized ion chromatography method.
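Limits of detection and quantification are commonly estimated from the calibration-curve slope and the standard deviation of the response (the ICH Q2 approach); the sigma and slope values below are illustrative placeholders, not figures from this study:

```python
def lod_loq(sigma, slope):
    """ICH Q2-style estimates: LOD = 3.3*sigma/S and LOQ = 10*sigma/S,
    where sigma is the SD of the response and S is the calibration slope."""
    return 3.3 * sigma / slope, 10.0 * sigma / slope

# Illustrative values only; response units and concentration units are hypothetical.
lod, loq = lod_loq(sigma=0.9, slope=0.12)
```

With sigma and slope fitted from an actual low-level calibration, the same two lines yield reportable LOD/LOQ values in the concentration units of the curve.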

  16. Development and validation of RP-HPLC method for estimation of Cefotaxime sodium in marketed formulations.

    PubMed

    Lalitha, N; Pai, Pn Sanjay

    2009-12-01

    An RP-HPLC assay method has been developed and validated for cefotaxime. An isocratic RP-HPLC separation was developed on a SS Wakosil II-C8 column (250 mm × 4.6 mm i.d., 5 μm) using a mobile phase of ammonium acetate buffer (pH 6.8) and acetonitrile (85:15 v/v) with UV detection at 252 nm and a flow rate of 0.8 ml/min. The proposed method was validated for sensitivity, selectivity, linearity, accuracy, precision, ruggedness, robustness and solution stability. The response of the drug was linear in the concentration range of 10-70 μg/ml. The limit of detection and limit of quantification were found to be 0.3 μg/ml and 0.6 μg/ml, respectively. The % recovery ranged within 97-102%. Method, system, interday and intraday precision were found to be within the limits of the acceptance criteria. The method was found to be rugged when the analysis was carried out by a different analyst, and was sensitive and efficient with 2216 theoretical plates, 0.1128 mm HETP and a tailing factor of 1. The method is suitable for the quality control of cefotaxime in injection formulations.
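Column-efficiency figures like those quoted (2216 plates, 0.1128 mm HETP, tailing factor 1) follow from the standard pharmacopoeial formulas; the peak measurements below are chosen for illustration (they are not from the published chromatogram, though they happen to reproduce the quoted values):

```python
def plates_half_height(t_r, w_half):
    """Plate count from peak width at half height: N = 5.54 * (tR / w0.5)**2."""
    return 5.54 * (t_r / w_half) ** 2

def tailing_factor(w_5pct, f_5pct):
    """USP tailing factor T = W0.05 / (2 * f), measured at 5% of peak height."""
    return w_5pct / (2.0 * f_5pct)

# Illustrative peak measurements in minutes:
n = plates_half_height(t_r=6.0, w_half=0.30)   # about 2216 plates
t = tailing_factor(w_5pct=0.50, f_5pct=0.25)   # 1.0, a symmetric peak
hetp_mm = 250.0 / n                            # HETP = column length / N, about 0.113 mm
```

HETP here uses the stated 250 mm column length, so the three figures are mutually consistent.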

  17. Development and validation of event-specific quantitative PCR method for genetically modified maize MIR604.

    PubMed

    Mano, Junichi; Furui, Satoshi; Takashima, Kaori; Koiwa, Tomohiro; Futo, Satoshi; Minegishi, Yasutaka; Akiyama, Hiroshi; Teshima, Reiko; Kurashima, Takeyo; Takabatake, Reona; Kitta, Kazumi

    2012-01-01

    A GM maize event, MIR604, has been widely distributed and an analytical method to quantify its content is required to monitor the validity of food labeling. Here we report a novel real-time PCR-based quantitation method for MIR604 maize. We developed real-time PCR assays specific for MIR604 using event-specific primers designed by the trait developer, and for maize endogenous starch synthase IIb gene (SSIIb). Then, we determined the conversion factor, which is required to calculate the weight-based GM maize content from the copy number ratio of MIR604-specific DNA to the endogenous reference DNA. Finally, to validate the developed method, an interlaboratory collaborative trial according to the internationally harmonized guidelines was performed with blind samples containing MIR604 at the mixing levels of 0, 0.5, 1.0, 5.0 and 10.0%. The reproducibility (RSDr) of the developed method was evaluated to be less than 25%. The limit of quantitation of the method was estimated to be 0.5% based on the ISO 24276 guideline. These results suggested that the developed method would be suitable for practical quantitative analyses of MIR604 maize.
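The weight-based GM content is obtained by dividing the event-to-reference copy-number ratio by the conversion factor, as described above; all numbers below, including the conversion factor, are hypothetical illustrations rather than values from the MIR604 study:

```python
def gm_content_percent(event_copies, reference_copies, conversion_factor):
    """Weight-based GM content (%) from real-time PCR copy numbers:
    (event/reference copy ratio) / Cf * 100."""
    ratio = event_copies / reference_copies
    return ratio / conversion_factor * 100.0

# Hypothetical copy numbers for the MIR604-specific target and the SSIIb
# endogenous reference, with a made-up conversion factor:
content = gm_content_percent(event_copies=480,
                             reference_copies=100_000,
                             conversion_factor=0.48)
```

In practice the conversion factor is determined experimentally from pure GM material, which is why it must be established before the method can report weight-based percentages.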

  18. Validity of a Novel Method for Estimation of Low-Density Lipoprotein Cholesterol Levels in Diabetic Patients

    PubMed Central

    Chaen, Hideto; Kinchiku, Shigesumi; Kajiya, Shoko; Uenomachi, Hitoshi; Yuasa, Toshinori; Takasaki, Kunitsugu; Ohishi, Mitsuru

    2016-01-01

    Aim: Low-density lipoprotein cholesterol (LDL-C) is routinely estimated using the Friedewald equation [LDL-C(F)]. A novel method for estimating LDL-C [LDL-C(M)], recently proposed by Martin et al., was reported to be more accurate than the Friedewald formula in subjects in the United States. The validity of LDL-C(M) in different races and in patients with diabetes mellitus (DM) has not been elucidated. The purpose of this study was to validate the LDL-C(M) estimates in a Japanese population with type 2 DM by comparison with LDL-C(F) and directly measured LDL-C [LDL-C(D)]. Methods: Both LDL-C(M) and LDL-C(F) levels were compared against LDL-C(D), measured by a selective solubilization method, in 1,828 Japanese patients with type 2 DM. Results: On linear regression analysis, LDL-C(M) showed a stronger correlation with LDL-C(D) than LDL-C(F) did (R = 0.979 vs. R = 0.953, respectively). We further analyzed the effect of serum triglyceride (TG) concentrations on the accuracy of LDL-C(F) and LDL-C(M). Although LDL-C levels showed a positive correlation with TG levels, LDL-C(F) levels tended to diverge from LDL-C(D) levels more than LDL-C(M) did as TG levels changed. Conclusion: We demonstrated, for the first time, that Martin's method estimates LDL-C levels more accurately than the Friedewald equation in Japanese patients with DM. PMID:27592628
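The Friedewald equation is LDL-C = TC − HDL-C − TG/5 (all in mg/dL), while Martin's method keeps the same form but replaces the fixed divisor 5 with an adjustable factor looked up from a table keyed on TG and non-HDL-C levels. A minimal sketch; the divisor passed to the Martin-style function is a placeholder, since the full lookup table is not reproduced here:

```python
def ldl_friedewald(tc, hdl, tg):
    """Friedewald estimate (mg/dL): LDL-C = TC - HDL-C - TG/5.
    Not valid for TG >= 400 mg/dL."""
    return tc - hdl - tg / 5.0

def ldl_martin(tc, hdl, tg, divisor):
    """Martin-style estimate: same form, but the TG divisor is an adjustable
    factor taken from a table keyed on TG and non-HDL-C (supplied by caller)."""
    return tc - hdl - tg / divisor

ldl_f = ldl_friedewald(tc=200.0, hdl=50.0, tg=150.0)  # 200 - 50 - 30 = 120 mg/dL
```

The adjustable divisor is what lets the Martin method track LDL-C(D) more closely across TG levels, which is exactly the divergence pattern this study examined.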

  19. From the Bronx to Bengifunda (and other lines of flight): deterritorializing purposes and methods in science education research

    NASA Astrophysics Data System (ADS)

    Gough, Noel

    2011-03-01

    In this essay I explore a number of questions about purposes and methods in science education research prompted by my reading of Wesley Pitts' ethnographic study of interactions among four students and their teacher in a chemistry classroom in the Bronx, New York City. I commence three `lines of flight' (small acts of Deleuzo-Guattarian deterritorialization) that depart from the conceptual territory regulated by science education's dominant systems of signification and make new connections within and beyond that territory. I offer neither a comprehensive review nor a thorough critique of Wesley's paper but, rather, suggest some alternative directions for science education research in the genre he exemplifies.

  20. Diffuse reflectance near infrared-chemometric methods development and validation of amoxicillin capsule formulations

    PubMed Central

    Khan, Ahmed Nawaz; Khar, Roop Krishen; Ajayakumar, P. V.

    2016-01-01

    Objective: The aim of the present study was to establish near infrared-chemometric methods that could be used effectively for quality profiling, through identification and quantification of amoxicillin (AMOX), in formulated capsules similar to commercial products. The methods were modeled so that a large number of market products could be evaluated easily and quickly. Materials and Methods: A Thermo Scientific Antaris II near infrared analyzer with TQ Analyst chemometric software was used for the development and validation of the identification and quantification models. Several AMOX formulations were composed with four excipients: microcrystalline cellulose, magnesium stearate, croscarmellose sodium and colloidal silicon dioxide. Development included quadratic mixture formulation design, near infrared spectrum acquisition, spectral pretreatment and outlier detection. The developed methods were validated in terms of specificity, accuracy, precision, linearity, and robustness according to the guidelines of the International Conference on Harmonisation (ICH) and the European Medicines Agency (EMA). Results: In diffuse reflectance mode, an identification model based on discriminant analysis was successfully processed with 76 formulations; the same samples were also used for quantitative analysis using a partial least squares algorithm with four latent variables, giving a correlation coefficient of 0.9937, a 2.17% root mean square error of calibration (RMSEC), a 2.38% root mean square error of prediction (RMSEP), and a 2.43% root mean square error of cross-validation (RMSECV). Conclusion: The proposed models established a good relationship between the spectral information and AMOX identity as well as content. The results show the performance of the proposed models, which offer an alternative for AMOX capsule evaluation relative to the well-established high-performance liquid chromatography method. Ultimately, three commercial products were successfully evaluated using the developed methods.

  1. Quantitative Imaging Methods for the Development and Validation of Brain Biomechanics Models

    PubMed Central

    Bayly, Philip V.; Clayton, Erik H.; Genin, Guy M.

    2013-01-01

    Rapid deformation of brain tissue in response to head impact or acceleration can lead to numerous pathological changes, both immediate and delayed. Modeling and simulation hold promise for illuminating the mechanisms of traumatic brain injury (TBI) and for developing preventive devices and strategies. However, mathematical models have predictive value only if they satisfy two conditions. First, they must capture the biomechanics of the brain as both a material and a structure, including the mechanics of brain tissue and its interactions with the skull. Second, they must be validated by direct comparison with experimental data. Emerging imaging technologies and recent imaging studies provide important data for these purposes. This review describes these techniques and data, with an emphasis on magnetic resonance imaging approaches. In combination, these imaging tools promise to extend our understanding of brain biomechanics and improve our ability to study TBI in silico. PMID:22655600

  2. Validation of a New Placebo Interferential Current Method: A New Placebo Method of Electrostimulation.

    PubMed

    Mendonça Araújo, Fernanda; Alves Menezes, Mayara; Martins de Araújo, Ariane; Abner Dos Santos Sousa, Thiago; Vasconcelos Lima, Lucas; Ádan Nunes Carvalho, Elyson; Melo DeSantana, Josimari

    2016-04-05

    OBJECTIVE: The present study aimed to investigate whether a new placebo device for interferential current (IFC), which delivers current during only the first 40 seconds of stimulation, is effective at promoting adequate subject blinding. METHODS: Seventy-five subjects were recruited and enrolled into three groups: active IFC, inactive placebo, and new placebo. Pressure pain threshold (PPT), cutaneous sensory threshold (CST), and pain intensity were measured before and after the intervention. After the final assessment, the subjects and the investigator who applied the current were asked about the type of stimulation administered. RESULTS: None of the placebo forms studied resulted in significant changes to PPT, CST, or pain intensity. Subjects stimulated with active IFC at high intensities (> 17 mA) showed higher PPT and CST and lower pain intensity than subjects stimulated at low intensities (p < 0.03). The new placebo method blinded the investigator in 100% of cases and 60% of the subjects stimulated, whereas the inactive placebo blinded the investigator in 0% of cases and 34% of subjects. CONCLUSION: The new placebo IFC method was effective for blinding research investigators and most of the active IFC-treated subjects, providing an appropriate placebo method.

  3. Determination of the content of desmopressin in pharmaceutical preparations by HPLC and validation of the method.

    PubMed

    Dudkiewicz-Wilczyńska, Jadwiga; Snycerski, Andrzej; Tautt, Jadwiga

    2002-01-01

    The aim of this study was to apply high performance liquid chromatography to the determination of desmopressin content in pharmaceutical preparations and to validate the method. Satisfactory results were obtained using a Luna C8 column (5 μm, 100 × 4.6 mm) and a mobile phase containing 0.067 M phosphate buffer (pH = 7) and acetonitrile in the proportion 83:17. It has been shown that the developed method shows good precision and accuracy and can be applied to the qualitative and quantitative analysis of pharmaceutical preparations containing desmopressin.

  4. Evidence flow graph methods for validation and verification of expert systems

    NASA Technical Reports Server (NTRS)

    Becker, Lee A.; Green, Peter G.; Bhatnagar, Jayant

    1988-01-01

    This final report describes the results of an investigation into the use of evidence flow graph techniques for performing validation and verification of expert systems. This was approached by developing a translator to convert horn-clause rule bases into evidence flow graphs, a simulation program, and methods of analysis. These tools were then applied to a simple rule base which contained errors. It was found that the method was capable of identifying a variety of problems, for example that the order of presentation of input data or small changes in critical parameters could affect the output from a set of rules.

  5. Control rod heterogeneity effects in liquid-metal fast breeder reactors: Method developments and experimental validation

    SciTech Connect

    Carta, M.; Granget, G.; Palmiotti, G.; Salvatores, M.; Soule, R.

    1988-11-01

    The control rod worth assessment in a large liquid-metal fast breeder reactor is strongly dependent on the actual arrangement of the absorber pins inside the control rod subassemblies. The so-called heterogeneity effects (i.e., the effects on the rod reactivity of the actual rod internal geometry versus homogenization of the absorber atoms over all the subassembly volume) have been evaluated, using explicit and variational methods to derive appropriate cross sections. An experimental program performed at the MASURCA facility has been used to validate these methods.

  6. Evidence flow graph methods for validation and verification of expert systems

    NASA Technical Reports Server (NTRS)

    Becker, Lee A.; Green, Peter G.; Bhatnagar, Jayant

    1989-01-01

    The results of an investigation into the use of evidence flow graph techniques for performing validation and verification of expert systems are given. A translator to convert horn-clause rule bases into evidence flow graphs, a simulation program, and methods of analysis were developed. These tools were then applied to a simple rule base which contained errors. It was found that the method was capable of identifying a variety of problems, for example that the order of presentation of input data or small changes in critical parameters could affect the output from a set of rules.

  7. Development and validation of a GC/MS method for the simultaneous determination of levetiracetam and lamotrigine in whole blood.

    PubMed

    Nikolaou, Panagiota; Papoutsis, Ioannis; Dona, Artemisia; Spiliopoulou, Chara; Athanaselis, Sotiris

    2015-01-01

    A sensitive and accurate gas chromatography-mass spectrometric method was developed and validated for the simultaneous determination of levetiracetam and lamotrigine in whole blood. A solid-phase extraction (SPE) procedure using HF Bond Elut C18 columns followed by derivatization using N-methyl-N-tert-butyldimethylsilyl-trifluoroacetamide (MTBSTFA) with 1% tert-butyldimethylsilyl chloride (TBDMSCl) was used. In this assay, levetiracetam-d6 was used as internal standard. Limits of detection and quantification were 0.15 and 0.50 μg/mL, respectively, for both analytes. The method proved linear within the concentration range of 0.50-50.0 μg/mL (R(2) ≥ 0.992) for both analytes. Absolute recovery was found to be at least 90.0 and 97.2% for levetiracetam and lamotrigine, respectively. Intra-day and inter-day accuracy values for both analytes ranged from -6.5 to 4.2 and -6.6 to 3.0%, respectively, whereas the respective precision values were less than 11.4 and 8.3%. The developed method was successfully used in our laboratory for quantification of levetiracetam and lamotrigine blood concentrations during the investigation of forensic cases where these antiepileptic drugs were involved. This method could also be used for therapeutic drug monitoring purposes.
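Accuracy and precision figures of the kind reported here are typically expressed as percent bias of replicate QC measurements against the nominal concentration and as percent relative standard deviation; the replicate values below are invented for illustration:

```python
import statistics

def percent_bias(measured, nominal):
    """Accuracy as % bias of the mean measured concentration vs. nominal."""
    return (statistics.mean(measured) - nominal) / nominal * 100.0

def percent_rsd(measured):
    """Precision as relative standard deviation (%CV) of replicates."""
    return statistics.stdev(measured) / statistics.mean(measured) * 100.0

# Illustrative replicate measurements at a hypothetical 10 ug/mL QC level:
reps = [9.6, 9.8, 10.1, 9.9, 10.0]
bias = percent_bias(reps, nominal=10.0)
rsd = percent_rsd(reps)
```

Bias within about ±15% and RSD below 15% are common bioanalytical acceptance criteria at QC levels above the LOQ.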

  8. IMPROVED COMPUTATIONAL NEUTRONICS METHODS AND VALIDATION PROTOCOLS FOR THE ADVANCED TEST REACTOR

    SciTech Connect

    David W. Nigg; Joseph W. Nielsen; Benjamin M. Chase; Ronnie K. Murray; Kevin A. Steuhm

    2012-04-01

    The Idaho National Laboratory (INL) is in the process of modernizing the various reactor physics modeling and simulation tools used to support operation and safety assurance of the Advanced Test Reactor (ATR). Key accomplishments so far have encompassed both computational as well as experimental work. A new suite of stochastic and deterministic transport theory based reactor physics codes and their supporting nuclear data libraries (HELIOS, KENO6/SCALE, NEWT/SCALE, ATTILA, and an extended implementation of MCNP5) has been installed at the INL. Corresponding models of the ATR and ATRC are now operational with all five codes, demonstrating the basic feasibility of the new code packages for their intended purpose. Of particular importance, a set of as-run core depletion HELIOS calculations for all ATR cycles since August 2009 was successfully completed during 2011. This demonstration supported a decision late in the year to proceed with the phased incorporation of the HELIOS methodology into the ATR fuel cycle management process beginning in 2012. On the experimental side of the project, new hardware was fabricated, measurement protocols were finalized, and the first four of six planned physics code validation experiments based on neutron activation spectrometry were conducted at the ATRC facility. Data analysis for the first three experiments, focused on characterization of the neutron spectrum in one of the ATR flux traps, has been completed. The six experiments will ultimately form the basis for a flexible, easily-repeatable ATR physics code validation protocol that is consistent with applicable ASTM standards.

  9. Voltammetric determination of zinc in compound pharmaceutical preparations--validation of method.

    PubMed

    Lutka, Anna; Bukowska, Honorata

    2009-01-01

    The conditions for voltammetric determination of zinc in compound pharmaceutical preparations were established and validated. The three investigated preparations (Organic zinc (A); Calcium, zinc, copper with vitamin C (B); Vigor complete (V)) contained different salts of zinc(II) and an increasing number of other components. Samples of powdered tablets of each preparation underwent mineralization or extraction procedures in order to transfer the zinc ions into solution. The concentration of zinc in solution was determined by differential pulse voltammetry (DP). The selectivity, accuracy, precision and linearity of the DP determination of zinc in the three preparations were estimated in the validation process. Zinc was determined within the concentration range of 1-12 ppm (1-12 μg/mL): the mean recoveries approached 96-99% (A), 96-102% (B) and 106% (V); the relative standard deviations of the determinations (RSD) were 2.97-2.98% (A), 2.83-3.84% (B) and 2.65% (V). The mean recoveries and the errors of determination satisfied the requirements for an analyte concentration at the level of 1-20 ppm (12).

  10. The voodoo doll task: Introducing and validating a novel method for studying aggressive inclinations.

    PubMed

    Dewall, C Nathan; Finkel, Eli J; Lambert, Nathaniel M; Slotter, Erica B; Bodenhausen, Galen V; Pond, Richard S; Renzetti, Claire M; Fincham, Frank D

    2013-01-01

    Aggression pervades modern life. To understand the root causes of aggression, researchers have developed several methods to assess aggressive inclinations. The current article introduces a new behavioral method-the voodoo doll task (VDT)-that offers a reliable and valid trait and state measure of aggressive inclinations across settings and relationship contexts. Drawing on theory and research on the law of similarity and magical beliefs (Rozin, Millman, & Nemeroff [1986], Journal of Personality and Social Psychology, 50, 703-712), we propose that people transfer characteristics of a person onto a voodoo doll representing that person. As a result, causing harm to a voodoo doll by stabbing it with pins may have important psychological similarities to causing actual harm to the person the voodoo doll represents. Nine methodologically diverse studies (total N = 1,376) showed that the VDT had strong reliability, construct validity, and convergent validity. Discussion centers on the importance of magical beliefs in understanding the causes of aggressive inclinations.

  11. Validation of structural analysis methods using burner liner cyclic rig test data

    NASA Technical Reports Server (NTRS)

    Thompson, R.

    1983-01-01

    The objectives of the hot section technology (HOST) burner liner cyclic rig test program are basically threefold: (1) to assist in developing predictive tools needed to improve design analyses and procedures for the efficient and accurate prediction of burner liner structural response; (2) to calibrate, evaluate and validate these predictive tools by comparing the predicted results with the experimental data generated in the tests; and (3) to evaluate existing as well as advanced temperature and strain measurement instrumentation, both contact and noncontact, in a simulated engine cycle environment. The data generated will include measurements of the thermal environment (metal surface temperatures) as well as structural (strain) and life (fatigue) responses of simulated burner liners and specimens under controlled boundary and operating conditions. These data will be used to calibrate, compare and validate analytical theories, methodologies and design procedures, as well as improvements in them, for predicting liner temperatures, stress-strain responses and cycles to failure. Comparison of predicted results with experimental data will be used to show where the predictive theories, etc. need improvements. In addition, as the predictive tools, as well as the tests, test methods, and data acquisition and reduction techniques, are developed and validated, a proven, integrated analysis/experiment method will be developed to determine the cyclic life of a simulated burner liner.

  12. Methods for validation of the mass distribution of a full body finite element model - biomed 2011.

    PubMed

    Thompson, A Bradley; Rhyne, Ashley C; Moreno, Daniel P; Gayzik, F Scott; Stitzel, Joel D

    2011-01-01

    Accurate mass distribution in computational human body models is essential for kinematic and kinetic validation. The purpose of this study was to validate the mass distribution of the 50th percentile male model (M50) developed as part of the Global Human Body Models Consortium (GHBMC) project. The body segment centers of gravity (CGs) of M50 were compared against published data in two ways: using a homogeneous body surface CAD model, and using a finite element model (FEM). Both the CAD and FEM models were generated from image data collected from the same 50th percentile male subject. Each model was partitioned into 11 segments using segment planes constructed from bony landmarks acquired from the subject. CGs of the CAD and FEM models were computed using commercially available software packages. Deviation of the FEM and CAD CGs from the literature data was 5.8% and 5.6%, respectively, when normalized by a regional characteristic length. Deviation between the FEM and CAD CGs averaged 2.4% when normalized in the same fashion. Unlike the CAD model and the literature, which both assume homogeneous mass distribution, the FEM CG data account for the varying densities of anatomical structures by virtue of the assigned material properties. This analysis validates the CGs determined from each model by comparing them directly against well-known literature studies that rely only on anthropometric landmarks. The results of this study will help enhance the biofidelity of the GHBMC M50 model.
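The normalization used above, the Euclidean offset between two CG estimates divided by a regional characteristic length, can be sketched as follows; the coordinates and characteristic length are invented for illustration:

```python
import math

def normalized_cg_deviation(cg_a, cg_b, characteristic_length):
    """Euclidean distance between two segment CG estimates, expressed as a
    percentage of a regional characteristic length."""
    dist = math.dist(cg_a, cg_b)  # straight-line distance between the CGs
    return dist / characteristic_length * 100.0

# Hypothetical thorax CGs in mm from an FEM and a literature regression,
# normalized by an assumed 100 mm characteristic length:
dev = normalized_cg_deviation((10.0, 250.0, 30.0),
                              (13.0, 254.0, 30.0),
                              characteristic_length=100.0)
```

Expressing the offset as a percentage of a segment-specific length makes deviations comparable across body segments of very different sizes.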

  13. Validation of the Mass-Extraction-Window for Quantitative Methods Using Liquid Chromatography High Resolution Mass Spectrometry.

    PubMed

    Glauser, Gaétan; Grund, Baptiste; Gassner, Anne-Laure; Menin, Laure; Henry, Hugues; Bromirski, Maciej; Schütz, Frédéric; McMullen, Justin; Rochat, Bertrand

    2016-03-15

    A paradigm shift is underway in the field of quantitative liquid chromatography-mass spectrometry (LC-MS) analysis thanks to the arrival of recent high-resolution mass spectrometers (HRMS). The capability of HRMS to perform sensitive and reliable quantifications of a large variety of analytes in HR-full scan mode is showing that it is now realistic to perform quantitative and qualitative analysis with the same instrument. Moreover, HR-full scan acquisition offers a global view of sample extracts and allows retrospective investigations as virtually all ionized compounds are detected with a high sensitivity. In time, the versatility of HRMS together with the increasing need for relative quantification of hundreds of endogenous metabolites should promote a shift from triple-quadrupole MS to HRMS. However, a current "pitfall" in quantitative LC-HRMS analysis is the lack of HRMS-specific guidance for validated quantitative analyses. Indeed, false positive and false negative HRMS detections are rare, albeit possible, if inadequate parameters are used. Here, we investigated two key parameters for the validation of LC-HRMS quantitative analyses: the mass accuracy (MA) and the mass-extraction-window (MEW) that is used to construct the extracted-ion-chromatograms. We propose MA-parameters, graphs, and equations to calculate rational MEW width for the validation of quantitative LC-HRMS methods. MA measurements were performed on four different LC-HRMS platforms. Experimentally determined MEW values ranged between 5.6 and 16.5 ppm and depended on the HRMS platform, its working environment, the calibration procedure, and the analyte considered. The proposed procedure provides a fit-for-purpose MEW determination and prevents false detections.
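The relationship between a ppm-defined MEW and absolute m/z limits, and the mass-accuracy calculation it rests on, are straightforward. A minimal sketch with invented example values (not the paper's data or equations):

```python
def ppm_error(measured_mz: float, theoretical_mz: float) -> float:
    """Mass accuracy: deviation of a measured m/z from theory, in ppm."""
    return (measured_mz - theoretical_mz) / theoretical_mz * 1e6

def mew_bounds(mz: float, half_width_ppm: float) -> tuple:
    """Lower/upper m/z limits of an extraction window of +/- half_width_ppm,
    as used to build an extracted-ion chromatogram."""
    delta = mz * half_width_ppm / 1e6
    return (mz - delta, mz + delta)

# A +/-10 ppm window around m/z 500.0000 spans 499.9950 to 500.0050
lo, hi = mew_bounds(500.0, 10.0)
```

A window that is too narrow relative to the instrument's mass accuracy produces false negatives (the peak falls outside the window); one that is too wide admits interferences, hence the paper's emphasis on a fit-for-purpose MEW.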

  14. Hydrophilic interaction liquid chromatography method development and validation for the assay of HEPES zwitterionic buffer.

    PubMed

    Xu, Xiaolong; Gevaert, Bert; Bracke, Nathalie; Yao, Han; Wynendaele, Evelien; De Spiegeleer, Bart

    2017-02-20

    HEPES is a zwitterionic buffer component used as a raw material in the GMP manufacturing of advanced therapy medicinal products (ATMPs), hence requiring an adequate assay method with sufficient selectivity toward related impurities. Therefore, a hydrophilic interaction chromatography (HILIC) method was developed. The effects of several factors on the retention behavior of HEPES, its analogue EPPS and its starting material isethionate were investigated: pH, ion concentration and organic solvent ratio of the mobile phase, as well as column temperature. Moreover, stress testing resulted in the N-oxide degradant, identified by high-resolution MS. The final method consisted of an isocratic system with an aqueous (pH 2.0 with H3PO4)/acetonitrile (35:65, v/v) mobile phase on a zwitterionic HILIC (Obelisc N) column with a flow rate of 0.5 mL/min and UV detection at 195 nm. The assay method for HEPES was validated, obtaining adequate linearity (R² = 0.999), precision (RSD of 0.5%) and accuracy (recovery of 100.08%). Finally, the applicability of the validated method was demonstrated by analysis of samples from different suppliers.

  15. Development and Validation of Stability Indicating Spectroscopic Method for Content Analysis of Ceftriaxone Sodium in Pharmaceuticals

    PubMed Central

    Ethiraj, Revathi; Thiruvengadam, Ethiraj; Sampath, Venkattapuram Saravanan; Vahid, Abdul; Raj, Jithin

    2014-01-01

    A simple, selective, and stability-indicating spectroscopic method has been selected and validated for the assay of ceftriaxone sodium in powder-for-injection dosage forms. The proposed method is based on the measurement of the absorbance of ceftriaxone sodium in aqueous medium at 241 nm. The method obeys Beer's law in the range of 5–50 μg/mL with a correlation coefficient of 0.9983. The apparent molar absorptivity and Sandell's sensitivity were found to be 2.046 × 10³ L mol⁻¹ cm⁻¹ and 0.02732 μg/cm² per 0.001 absorbance unit, respectively. This study indicated that ceftriaxone sodium was degraded in acid medium and also underwent oxidative degradation. The percent relative standard deviation associated with all the validation parameters was less than 2%, showing compliance with the acceptance criteria of the International Conference on Harmonization Q2(R1) guidelines (2005). The proposed method was then successfully applied to the determination of ceftriaxone sodium in a sterile preparation, and the results were comparable with reported methods. PMID:27355020

  16. Update from the Japanese Center for the Validation of Alternative Methods (JaCVAM).

    PubMed

    Kojima, Hajime

    2013-12-01

    The Japanese Center for the Validation of Alternative Methods (JaCVAM) was established in 2005 to promote the use of alternatives to animal testing in regulatory studies, thereby replacing, reducing, or refining the use of animals, according to the Three Rs principles. JaCVAM assesses the utility, limitations and suitability for use in regulatory studies, of test methods needed to determine the safety of chemicals and other materials. JaCVAM also organises and performs validation studies of new test methods, when necessary. In addition, JaCVAM co-operates and collaborates with similar organisations in related fields, both in Japan and internationally, which also enables JaCVAM to provide input during the establishment of guidelines for new alternative experimental methods. These activities help facilitate application and approval processes for the manufacture and sale of pharmaceuticals, chemicals, pesticides, and other products, as well as for revisions to standards for cosmetic products. In this manner, JaCVAM plays a leadership role in the introduction of new alternative experimental methods for regulatory acceptance in Japan.

  17. Development and Validation of HPTLC Method for Estimation of Tenoxicam and its Formulations

    PubMed Central

    Chandel, S.; Barhate, C. R.; Srivastava, A. R.; Kulkarni, S. R.; Kapadia, C. J.

    2012-01-01

    A simple, precise, accurate and rapid high-performance thin layer chromatographic method has been developed and validated for the estimation of tenoxicam in microemulsion gels. Tenoxicam was chromatographed on a silica gel 60 F254 TLC plate as the stationary phase. The mobile phase was toluene:ethyl acetate:formic acid (6:4:0.3 v/v/v), which gave a dense and compact spot of tenoxicam with an Rf value of 0.38±0.03. The quantification was carried out at 379 nm. The method was validated in terms of linearity, accuracy, precision and specificity. To confirm the suitability, accuracy and precision of the proposed method, recovery studies were performed at three concentration levels. Statistical analysis proved that the proposed method is accurate and reproducible, with linearity in the range of 100 to 400 ng. The limit of detection and limit of quantification for tenoxicam were 25 and 50 μg/spot, respectively. The proposed method can be employed for the routine analysis of tenoxicam as well as of its pharmaceutical formulations. PMID:23204620

  18. Development and Validation of HPTLC Method for Estimation of Tenoxicam and its Formulations.

    PubMed

    Chandel, S; Barhate, C R; Srivastava, A R; Kulkarni, S R; Kulkarni, S K; Kapadia, C J

    2012-01-01

    A simple, precise, accurate and rapid high-performance thin layer chromatographic method has been developed and validated for the estimation of tenoxicam in microemulsion gels. Tenoxicam was chromatographed on a silica gel 60 F254 TLC plate as the stationary phase. The mobile phase was toluene:ethyl acetate:formic acid (6:4:0.3 v/v/v), which gave a dense and compact spot of tenoxicam with an Rf value of 0.38±0.03. The quantification was carried out at 379 nm. The method was validated in terms of linearity, accuracy, precision and specificity. To confirm the suitability, accuracy and precision of the proposed method, recovery studies were performed at three concentration levels. Statistical analysis proved that the proposed method is accurate and reproducible, with linearity in the range of 100 to 400 ng. The limit of detection and limit of quantification for tenoxicam were 25 and 50 μg/spot, respectively. The proposed method can be employed for the routine analysis of tenoxicam as well as of its pharmaceutical formulations.

  19. Development and validation of an HPLC–MS/MS method to determine clopidogrel in human plasma

    PubMed Central

    Liu, Gangyi; Dong, Chunxia; Shen, Weiwei; Lu, Xiaopei; Zhang, Mengqi; Gui, Yuzhou; Zhou, Qinyi; Yu, Chen

    2015-01-01

    A quantitative method for clopidogrel using online-SPE tandem LC–MS/MS was developed and fully validated according to the well-established FDA guidelines. The method achieves adequate sensitivity for pharmacokinetic studies, with a lower limit of quantification (LLOQ) as low as 10 pg/mL. Chromatographic separations were performed on reversed-phase Kromasil Eternity-2.5-C18-UHPLC columns for both methods. Positive electrospray ionization in multiple reaction monitoring (MRM) mode was employed for signal detection, and a deuterated analogue (clopidogrel-d4) was used as the internal standard (IS). Adjustments in sample preparation, including the introduction of an online-SPE system, proved to be the most effective way to solve analyte back-conversion in clinical samples. Pooled clinical samples (two levels) were prepared and successfully used as real-sample quality controls (QC) in the validation of back-conversion testing under different conditions. The results showed that the real samples were stable at room temperature for 24 h. Linearity, precision, extraction recovery, matrix effect on spiked QC samples, and stability tests on both spiked QCs and real-sample QCs stored under different conditions met the acceptance criteria. This online-SPE method was successfully applied to a bioequivalence study of 75 mg single-dose clopidogrel tablets in 48 healthy male subjects. PMID:26904399

  20. Validation of a method to directly and specifically measure nitrite in biological matrices.

    PubMed

    Almeida, Luis E F; Kamimura, Sayuri; Kenyon, Nicholas; Khaibullina, Alfia; Wang, Li; de Souza Batista, Celia M; Quezado, Zenaide M N

    2015-02-15

    The bioactivity of nitric oxide (NO) is influenced by chemical species generated through reactions with proteins, lipids, metals, and its conversion to nitrite and nitrate. A better understanding of the functions played by each of these species could be achieved by developing selective assays capable of distinguishing nitrite from other NO species. Nagababu and Rifkind developed a method using acetic and ascorbic acids to measure nitrite-derived NO in plasma. Here, we adapted, optimized, and validated this method to assay nitrite in tissues. The method yielded linear measurements over 1-300 pmol of nitrite and was validated for tissue preserved in a nitrite stabilization solution composed of potassium ferricyanide, N-ethylmaleimide and NP-40. When samples were processed with chloroform, but not with methanol, ethanol, acetic acid or acetonitrile, reliable and reproducible nitrite measurements in up to 20 sample replicates were obtained. The method's accuracy in tissue was ≈ 90% and in plasma 99.9%. In mice, during basal conditions, brain, heart, lung, liver, spleen and kidney cortex had similar nitrite levels. In addition, nitrite tissue levels were similar regardless of when organs were processed: immediately upon collection, kept in stabilization solution for later analysis, or frozen and later processed. After intraperitoneal nitrite injections, rapidly changing nitrite concentrations in tissue and plasma could be measured and were shown to change in significantly distinct patterns. This validated method could be valuable for investigations of nitrite biology in conditions such as sickle cell disease, cardiovascular disease, and diabetes, where nitrite is thought to play a role.

  1. A comparison of a 30-cluster survey method used in India and a purposive method in the estimation of immunization coverages in Tamil Nadu.

    PubMed

    Murthy, B N; Ezhil, R; Venkatasubramanian, S; Ramalingam, N; Periannan, V; Ganesan, R; Ramani, N; Selvaraj, V

    1995-01-01

    A 30-cluster survey method that is employed for estimating immunization coverages by the Government of India (GOI) was compared with a Purposive method, to investigate whether the likely omission of SC/ST and backward classes in the former would lead to the reporting of higher coverages. The essential difference between the two methods is in the manner in which the first household is selected in the chosen first stage sampling units (villages). With the GOI method, it is often close to the village centre, whereas with the Purposive method it is always in the periphery or in a pocket consisting of SC/ST or backward classes. A concurrent comparison of the two methods in three districts in Tamil Nadu showed no real differences in the coverage with DPT and BCG vaccines. However, the coverage was consistently higher by the GOI method in the case of the Polio vaccine (by 1.5%, 3.1% and 5.3% in the 3 districts), and the Measles vaccine (by 4.8%, 13.3% and 13.9%); the average difference was 3.3% for Polio vaccine (p = 0.08) and 7.3% for Measles vaccine (p = 0.01).

  2. Modification and Validation of Conceptual Design Aerodynamic Prediction Method HASC95 With VTXCHN

    NASA Technical Reports Server (NTRS)

    Albright, Alan E.; Dixon, Charles J.; Hegedus, Martin C.

    1996-01-01

    A conceptual/preliminary design level subsonic aerodynamic prediction code HASC (High Angle of Attack Stability and Control) has been improved in several areas, validated, and documented. The improved code includes improved methodologies for increased accuracy and robustness, and simplified input/output files. An engineering method called VTXCHN (Vortex Chine) for predicting nose vortex shedding from circular and non-circular forebodies with sharp chine edges has been improved and integrated into the HASC code. This report contains a summary of modifications, a description of the code, a user's guide, and validation of HASC. Appendices include discussion of a new HASC utility code, listings of sample input and output files, and a discussion of the application of HASC to buffet analysis.

  3. Determination of some phenolic compounds in red wine by RP-HPLC: method development and validation.

    PubMed

    Burin, Vívian Maria; Arcari, Stefany Grützmann; Costa, Léa Luzia Freitas; Bordignon-Luiz, Marilde T

    2011-09-01

    A methodology employing reversed-phase high-performance liquid chromatography (RP-HPLC) was developed and validated for the simultaneous determination of five phenolic compounds in red wine. The chromatographic separation was carried out on a C18 column with water acidified with acetic acid (pH 2.6) as solvent A and a 20:80 (v/v) mixture of solvent A and acetonitrile as solvent B in the mobile phase. The validation parameters included selectivity, linearity, range, limits of detection and quantitation, precision and accuracy, using an internal standard. All calibration curves were linear (R² > 0.999) within the range, and good precision (RSD < 2.6%) and recovery (80-120%) were obtained for all compounds. This method was applied to quantify phenolics in red wine samples from Santa Catarina State, Brazil, and well-separated peaks for the phenolic compounds in these wines were observed.

  4. SWeRF--A method for estimating the relevant fine particle fraction in bulk materials for classification and labelling purposes.

    PubMed

    Pensis, Ingeborg; Luetzenkirchen, Frank; Friede, Bernd

    2014-05-01

    In accordance with the European regulation for classification, labelling and packaging of substances and mixtures (CLP), as well as the criteria set out in the Globally Harmonized System (GHS), the fine fraction of crystalline silica (CS) has been classified for specific target organ toxicity, the specific organ in this case being the lung. Generic cut-off values for products containing a fine fraction of CS trigger the need for a method to quantify the fine fraction of CS in bulk materials. This article describes the so-called SWeRF method, the size-weighted relevant fine fraction. The SWeRF method combines the particle size distribution of a powder with probability factors from the EN 481 standard and allows the relevant fine fraction of a material to be calculated. The SWeRF method has been validated with a number of industrial minerals. This will enable manufacturers and blenders to apply the CLP and GHS criteria for the classification of mineral products containing a fine fraction of CS.
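The SWeRF calculation itself is a weighted sum over the measured particle size distribution. A minimal sketch, approximating the EN 481 respirable convention by its lognormal form (median 4.25 μm, geometric standard deviation 1.5); the bin diameters and mass fractions below are invented for illustration and this is not the standard's exact tabulated convention:

```python
import math

def respirable_probability(d_um: float) -> float:
    """Approximate EN 481 respirable convention: complement of a lognormal
    CDF with median 4.25 um and geometric standard deviation 1.5."""
    z = math.log(d_um / 4.25) / math.log(1.5)
    return 0.5 * math.erfc(z / math.sqrt(2.0))

def swerf(bin_diameters_um, mass_fractions):
    """Size-weighted relevant fine fraction: each size bin's mass fraction
    weighted by the respirable probability at the bin diameter."""
    return sum(f * respirable_probability(d)
               for d, f in zip(bin_diameters_um, mass_fractions))

# Illustrative three-bin distribution (fractions sum to 1)
fine_fraction = swerf([1.0, 5.0, 20.0], [0.2, 0.3, 0.5])
```

Small particles receive weights near 1, coarse particles near 0, so the result is the fraction of the bulk material relevant for the fine-fraction cut-off values.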

  5. A fast and reliable method for GHB quantitation in whole blood by GC-MS/MS (TQD) for forensic purposes.

    PubMed

    Castro, André L; Tarelho, Sónia; Dias, Mário; Reis, Flávio; Teixeira, Helena M

    2016-02-05

    Gamma-hydroxybutyric acid (GHB) is an endogenous compound with a history of clinical use since the 1960s. However, due to its secondary effects, it has become a controlled substance, entering the illicit market. A fully validated, sensitive and reproducible method for the quantification of GHB in whole blood by methanolic precipitation and GC-MS/MS (TQD) is presented. Using 100 μL of whole blood, the method achieved an LOD and LLOQ of 0.1 mg/L and a recovery of 86% over a working range of 0.1-100 mg/L. This method is sensitive and specific enough to detect the presence of GHB in small amounts of whole blood (both ante-mortem and post-mortem) and is, to the authors' knowledge, the first GC-MS/MS (TQD) method that uses different precursor and product ions for the identification of GHB and GHB-D6 (internal standard). Hence, this method may be especially useful for the study of endogenous values in this biological sample.

  6. SWeRF—A Method for Estimating the Relevant Fine Particle Fraction in Bulk Materials for Classification and Labelling Purposes

    PubMed Central

    2014-01-01

    In accordance with the European regulation for classification, labelling and packaging of substances and mixtures (CLP), as well as the criteria set out in the Globally Harmonized System (GHS), the fine fraction of crystalline silica (CS) has been classified for specific target organ toxicity, the specific organ in this case being the lung. Generic cut-off values for products containing a fine fraction of CS trigger the need for a method to quantify the fine fraction of CS in bulk materials. This article describes the so-called SWeRF method, the size-weighted relevant fine fraction. The SWeRF method combines the particle size distribution of a powder with probability factors from the EN 481 standard and allows the relevant fine fraction of a material to be calculated. The SWeRF method has been validated with a number of industrial minerals. This will enable manufacturers and blenders to apply the CLP and GHS criteria for the classification of mineral products containing a fine fraction of CS. PMID:24389081

  7. Enhanced high-performance liquid chromatography method for the determination of retinoic acid in plasma. Development, optimization and validation.

    PubMed

    Teglia, Carla M; Gil García, María D; Galera, María Martínez; Goicoechea, Héctor C

    2014-08-01

    When determining endogenous compounds in biological samples, the lack of blank or analyte-free matrix samples requires alternative strategies for calibration and quantitation. This article deals with the development, optimization and validation of a high-performance liquid chromatography method for the determination of retinoic acid in plasma, obtaining at the same time information about its isomers and taking into account the basal concentration of these endobiotics. An experimental design was used for the optimization of three variables (mobile phase composition, flow rate and column temperature) through a central composite design. Four responses were selected for optimization purposes: area under the peaks, number of peaks, analysis time, and resolution between the first principal peak and the following one. The optimum conditions were a mobile phase consisting of methanol 83.4% (v/v), acetonitrile 0.6% (v/v) and acidified aqueous solution 16.0% (v/v), a flow rate of 0.68 mL min⁻¹ and a column temperature of 37.10 °C. Detection was performed at 350 nm by a diode array detector. The method was validated following a holistic approach that included not only the classical parameters related to method performance but also robustness and the expected proportion of acceptable results lying inside predefined acceptability intervals, i.e., the uncertainty of measurements. The validation results indicated high selectivity and good precision, studied at four concentration levels, with RSD less than 5.0% for retinoic acid (less than 7.5% at the LOQ concentration level) in intra- and inter-assay precision studies. Linearity was proved over the range 0.00489 to 15.109 ng mL⁻¹ of retinoic acid, and the recovery, studied at four different fortification levels in human plasma samples, varied from 99.5% to 106.5% for retinoic acid. The applicability of the method was demonstrated by determining retinoic acid and

  8. Validation of a new hand-held electronic data capture method for continuous monitoring of subjective appetite sensations

    PubMed Central

    2011-01-01

    Background When large scale trials are investigating the effects of interventions on appetite, it is paramount to efficiently monitor large amounts of human data. The original hand-held Electronic Appetite Ratings System (EARS) was designed to facilitate the administration and data management of visual analogue scales (VAS) of subjective appetite sensations. The purpose of this study was to validate a novel hand-held method (EARS II (HP® iPAQ)) against the standard pen-and-paper (P&P) method and the previously validated EARS. Methods Twelve participants (5 male, 7 female, aged 18-40) were involved in a fully repeated measures design. Participants were randomly assigned, in a crossover design, to either high fat (>48% fat) or low fat (<28% fat) meal days, one week apart, and completed ratings using the three data capture methods ordered according to a Latin square. The first set of appetite sensations was completed in a fasted state, immediately before a fixed breakfast. Thereafter, appetite sensations were completed every thirty minutes for 4 h. An ad libitum lunch was provided immediately before completing a final set of appetite sensations. Results Repeated measures ANOVAs were conducted for ratings of hunger, fullness and desire to eat. There were no significant differences between P&P and either EARS or EARS II (p > 0.05). Correlation coefficients between P&P and EARS II, controlling for age and gender, were computed on area-under-the-curve (AUC) ratings. R² values for Hunger (0.89), Fullness (0.96) and Desire to Eat (0.95) were statistically significant (p < 0.05). Conclusions EARS II was sensitive to the impact of a meal and the recovery of appetite during the postprandial period and is therefore an effective device for monitoring appetite sensations. This study provides evidence and support for further validation of the novel EARS II method for monitoring appetite sensations during large scale studies. 
The added versatility means that future uses of the system provides
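The area-under-the-curve summary used to compare the capture methods is ordinarily computed by the trapezoidal rule over the timed VAS ratings; a minimal sketch (the times and ratings below are invented, not the study's data):

```python
def auc_trapezoid(times_min, ratings_mm):
    """Area under a VAS rating curve (mm x min) by the trapezoidal rule.
    times_min must be sorted ascending and match ratings_mm in length."""
    area = 0.0
    for i in range(1, len(times_min)):
        dt = times_min[i] - times_min[i - 1]
        area += 0.5 * (ratings_mm[i] + ratings_mm[i - 1]) * dt
    return area

# Illustrative: hunger sampled at 0, 30 and 60 min
auc = auc_trapezoid([0, 30, 60], [0, 10, 10])  # 450.0 mm*min
```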

  9. Validated spectrophotometric methods for determination of sodium valproate based on charge transfer complexation reactions

    NASA Astrophysics Data System (ADS)

    Belal, Tarek S.; El-Kafrawy, Dina S.; Mahrous, Mohamed S.; Abdel-Khalek, Magdi M.; Abo-Gharam, Amira H.

    2016-02-01

    This work presents the development, validation and application of four simple and direct spectrophotometric methods for determination of sodium valproate (VP) through charge transfer complexation reactions. The first method is based on the reaction of the drug with p-chloranilic acid (p-CA) in acetone to give a purple colored product with maximum absorbance at 524 nm. The second method depends on the reaction of VP with dichlone (DC) in dimethylformamide forming a reddish orange product measured at 490 nm. The third method is based upon the interaction of VP and picric acid (PA) in chloroform resulting in the formation of a yellow complex measured at 415 nm. The fourth method involves the formation of a yellow complex peaking at 361 nm upon the reaction of the drug with iodine in chloroform. Experimental conditions affecting the color development were studied and optimized. Stoichiometry of the reactions was determined. The proposed spectrophotometric procedures were effectively validated with respect to linearity, ranges, precision, accuracy, specificity, robustness, detection and quantification limits. Calibration curves of the formed color products with p-CA, DC, PA and iodine showed good linear relationships over the concentration ranges 24-144, 40-200, 2-20 and 1-8 μg/mL respectively. The proposed methods were successfully applied to the assay of sodium valproate in tablets and oral solution dosage forms with good accuracy and precision. Assay results were statistically compared to a reference pharmacopoeial HPLC method where no significant differences were observed between the proposed methods and reference method.

  10. Validated spectrophotometric methods for determination of sodium valproate based on charge transfer complexation reactions.

    PubMed

    Belal, Tarek S; El-Kafrawy, Dina S; Mahrous, Mohamed S; Abdel-Khalek, Magdi M; Abo-Gharam, Amira H

    2016-02-15

    This work presents the development, validation and application of four simple and direct spectrophotometric methods for determination of sodium valproate (VP) through charge transfer complexation reactions. The first method is based on the reaction of the drug with p-chloranilic acid (p-CA) in acetone to give a purple colored product with maximum absorbance at 524 nm. The second method depends on the reaction of VP with dichlone (DC) in dimethylformamide forming a reddish orange product measured at 490 nm. The third method is based upon the interaction of VP and picric acid (PA) in chloroform resulting in the formation of a yellow complex measured at 415 nm. The fourth method involves the formation of a yellow complex peaking at 361 nm upon the reaction of the drug with iodine in chloroform. Experimental conditions affecting the color development were studied and optimized. Stoichiometry of the reactions was determined. The proposed spectrophotometric procedures were effectively validated with respect to linearity, ranges, precision, accuracy, specificity, robustness, detection and quantification limits. Calibration curves of the formed color products with p-CA, DC, PA and iodine showed good linear relationships over the concentration ranges 24-144, 40-200, 2-20 and 1-8 μg/mL respectively. The proposed methods were successfully applied to the assay of sodium valproate in tablets and oral solution dosage forms with good accuracy and precision. Assay results were statistically compared to a reference pharmacopoeial HPLC method where no significant differences were observed between the proposed methods and reference method.

  11. The Telemark Breast Score: a Valid Method for Evaluation of Outcome after Breast Surgery

    PubMed Central

    Stark, Birgit

    2017-01-01

    Background: “Telemark Breast Score” (TBS) has been developed at Telemark Hospital in Norway for evaluation of results after breast surgery based on standardized patients’ photographs taken as a part of daily routine. Its reliability has recently been tested and approved. The external validity of the TBS was assessed by matching its data against the internationally recognized Breast-Q (BQ) questionnaire as a further step to study the validity of this new tool. Methods: The ideal distribution of breast volume is 45% of the total volume above and 55% below the nipple, and a 40° slope line at the upper pole. TBS makes the evaluation of these parameters of breast aesthetics more explicit. The method has been tested on photographs from 31 patients operated on for breast cancer with the Deep Inferior Perforator Flap. The evaluation was done by an independent experienced plastic surgeon earlier participating in the test–retests. The external validity of TBS was investigated against domains 1 and 3 of the BQ reconstruction module. The concordance between ratings was analyzed. Results: Concordance between TBS items and BQ domain 1 items regarding patient satisfaction, and between TBS items and BQ domain 3 items regarding how the patient experienced the outcome of breast reconstruction was relatively high except for 6 comparisons where we could not statistically ensure that more pairs were concordant than discordant. A total of 178 comparisons appeared to be concordant. This means that for all other comparisons, there was a preponderance of pairs of concordant observations, which indicates that measurements from the 2 instruments follow each other. Conclusion: The present data indicate that the TBS can be recommended as a valid tool to professionals for assessment of the outcome after breast reconstruction. PMID:28280676

  12. Validity of bag urine culture for predicting urinary tract infections in febrile infants: a paired comparison of urine collection methods

    PubMed Central

    Kim, Geun-A

    2015-01-01

    Purpose Catheter urine (CATH-U) and suprapubic aspiration (SPA) are reliable urine collection methods for confirming urinary tract infections (UTI) in infants. However, noninvasive and easily accessible collecting bag urine (CBU) is widely used, despite its high contamination rate. This study investigated the validity of CBU cultures for diagnosing UTIs, using CATH-U culture results as the gold standard. Methods We retrospectively analyzed 210 infants, 2- to 24-month-old, who presented to a tertiary care hospital's pediatrics department between September 2008 and August 2013. We reviewed the results of CBU and CATH-U cultures from the same infants. Results CBU results, relative to CATH-U culture results (≥10⁴ colony-forming units [CFU]/mL) were widely variable, ranging from no growth to ≥10⁵ CFU/mL. A CBU cutoff value of ≥10⁵ CFU/mL resulted in false-positive and false-negative rates of 18% and 24%, respectively. The probability of a UTI increased when the CBU bacterial count was ≥10⁵/mL for all infants, both uncircumcised male infants and female infants (likelihood ratios [LRs], 4.16, 4.11, and 4.11, respectively). UTIs could not be excluded for female infants with a CBU bacterial density of 10⁴-10⁵ CFU/mL (LR, 1.40). The LRs for predicting UTIs based on a positive dipstick test and a positive urinalysis were 4.19 and 3.11, respectively. Conclusion The validity of obtaining urine sample from a sterile bag remains questionable. Inconclusive culture results from CBU should be confirmed with a more reliable method. PMID:26124849
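Likelihood ratios like those reported above combine with a pre-test probability via Bayes' rule in odds form. A short sketch; the 10% pre-test probability is an assumed value for illustration, not a figure from the study:

```python
def positive_likelihood_ratio(sensitivity: float, specificity: float) -> float:
    """LR+ = sensitivity / (1 - specificity)."""
    return sensitivity / (1.0 - specificity)

def post_test_probability(pre_test_prob: float, lr: float) -> float:
    """Convert probability to odds, apply the likelihood ratio, convert back."""
    odds = pre_test_prob / (1.0 - pre_test_prob) * lr
    return odds / (1.0 + odds)

# With the paper's LR of 4.16 and an assumed 10% pre-test probability of UTI:
p = post_test_probability(0.10, 4.16)  # about 0.32
```

An LR near 1 (such as the 1.40 reported for intermediate colony counts in female infants) barely shifts the pre-test probability, which is why such results are inconclusive.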

  13. Validating a nondestructive optical method for apportioning colored particulate matter into black carbon and additional components

    PubMed Central

    Yan, Beizhan; Kennedy, Daniel; Miller, Rachel L.; Cowin, James P.; Jung, Kyung-hwa; Perzanowski, Matt; Balletta, Marco; Perera, Federica P.; Kinney, Patrick L.; Chillrud, Steven N.

    2011-01-01

    Exposure to black carbon (BC) is associated with a variety of adverse health outcomes. A number of optical methods for estimating BC on Teflon filters have been adopted, but most assume all light absorption is due to BC, while other sources of colored particulate matter exist. Recently, a four-wavelength optical reflectance measurement for distinguishing second-hand cigarette smoke (SHS) from soot-BC was developed (Brook et al., 2010; Lawless et al., 2004). However, the method has not been validated for either soot-BC or SHS, and little work has been done on the methodological issues of optical reflectance measurements for samples that could contain SHS, BC, and other colored particles. We refined this method using a lab-modified integrating sphere with absorption measured continuously from 350 nm to 1000 nm. Furthermore, we characterized the absorption spectra of additional components of particulate matter (PM) on PM2.5 filters, including ammonium sulfate, hematite, goethite, and magnetite. Finally, we validated this method for BC by comparison to other standard methods. Use of synthesized data indicates that it is important to optimize the choice of wavelengths to minimize computational errors as additional components (more than 2) are added to the apportionment model of colored components. We found that substantial errors are introduced when using the 4 wavelengths suggested by Lawless et al. to quantify four substances, while an optimized choice of wavelengths can reduce model-derived error from over 10% to less than 2%. For environmental samples, the method was sensitive for estimating airborne levels of BC and SHS, but not mass loadings of iron oxides and sulfate. Duplicate samples collected in NYC show high reproducibility (points consistent with a 1:1 line, R² = 0.95). BC data measured by this method were consistent with those measured by other optical methods, including Aethalometer and Smoke-stain Reflectometer (SSR); although the SSR loses sensitivity at
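The apportionment step described above amounts to solving a small least-squares problem: measured absorbances at several wavelengths are modeled as a linear mix of reference component spectra. A pure-Python sketch for two components via the normal equations; the spectra and absorbance values are invented numbers, not the paper's reference data:

```python
def solve2(a11, a12, a21, a22, b1, b2):
    """Solve a 2x2 linear system by Cramer's rule."""
    det = a11 * a22 - a12 * a21
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

def apportion(absorbance, spectra):
    """Least-squares split of a measured absorbance vector into two
    reference component spectra (e.g. soot-BC and SHS)."""
    s1, s2 = spectra
    g11 = sum(x * x for x in s1)             # normal-equation Gram matrix
    g12 = sum(x * y for x, y in zip(s1, s2))
    g22 = sum(y * y for y in s2)
    b1 = sum(x * a for x, a in zip(s1, absorbance))
    b2 = sum(y * a for y, a in zip(s2, absorbance))
    return solve2(g11, g12, g12, g22, b1, b2)

# Illustrative: measured absorbance is 2.0 x spectrum1 + 0.5 x spectrum2
c1, c2 = apportion([2.5, 3.0, 3.5, 4.0],
                   ([1.0, 1.0, 1.0, 1.0], [1.0, 2.0, 3.0, 4.0]))
```

The paper's point about wavelength choice corresponds here to how well-conditioned the Gram matrix is: nearly collinear component spectra at the chosen wavelengths make the solution numerically unstable.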

  14. Validating a nondestructive optical method for apportioning colored particulate matter into black carbon and additional components.

    PubMed

    Yan, Beizhan; Kennedy, Daniel; Miller, Rachel L; Cowin, James P; Jung, Kyung-Hwa; Perzanowski, Matt; Balletta, Marco; Perera, Federica P; Kinney, Patrick L; Chillrud, Steven N

    2011-12-01

    Exposure to black carbon (BC) is associated with a variety of adverse health outcomes. A number of optical methods for estimating BC on Teflon filters have been adopted, but most assume all light absorption is due to BC even though other sources of colored particulate matter exist. Recently, a four-wavelength optical reflectance measurement for distinguishing secondhand cigarette smoke (SHS) from soot-BC was developed (Brook et al., 2010; Lawless et al., 2004). However, the method has not been validated for either soot-BC or SHS, and little work has examined the methodological issues of optical reflectance measurements for samples that could contain SHS, BC, and other colored particles. We refined this method using a lab-modified integrating sphere with absorption measured continuously from 350 nm to 1000 nm. Furthermore, we characterized the absorption spectra of additional components of particulate matter (PM) on PM2.5 filters, including ammonium sulfate, hematite, goethite, and magnetite. Finally, we validated this method for BC by comparison to other standard methods. Use of synthesized data indicates that it is important to optimize the choice of wavelengths to minimize computational errors as additional components (more than 2) are added to the apportionment model of colored components. We found that substantial errors are introduced when using the 4 wavelengths suggested by Lawless et al. to quantify four substances, while an optimized choice of wavelengths can reduce model-derived error from over 10% to less than 2%. For environmental samples, the method was sensitive for estimating airborne levels of BC and SHS, but not mass loadings of iron oxides and sulfate. Duplicate samples collected in NYC show high reproducibility (points consistent with a 1:1 line, R2 = 0.95). BC data measured by this method were consistent with those measured by other optical methods, including the Aethalometer and the Smoke-stain Reflectometer (SSR), although the SSR loses sensitivity at

  15. Validating a nondestructive optical method for apportioning colored particulate matter into black carbon and additional components

    NASA Astrophysics Data System (ADS)

    Yan, Beizhan; Kennedy, Daniel; Miller, Rachel L.; Cowin, James P.; Jung, Kyung-hwa; Perzanowski, Matt; Balletta, Marco; Perera, Federica P.; Kinney, Patrick L.; Chillrud, Steven N.

    2011-12-01

    Exposure to black carbon (BC) is associated with a variety of adverse health outcomes. A number of optical methods for estimating BC on Teflon filters have been adopted, but most assume all light absorption is due to BC even though other sources of colored particulate matter exist. Recently, a four-wavelength optical reflectance measurement for distinguishing secondhand cigarette smoke (SHS) from soot-BC was developed (Brook et al., 2010; Lawless et al., 2004). However, the method has not been validated for either soot-BC or SHS, and little work has examined the methodological issues of optical reflectance measurements for samples that could contain SHS, BC, and other colored particles. We refined this method using a lab-modified integrating sphere with absorption measured continuously from 350 nm to 1000 nm. Furthermore, we characterized the absorption spectra of additional components of particulate matter (PM) on PM2.5 filters, including ammonium sulfate, hematite, goethite, and magnetite. Finally, we validated this method for BC by comparison to other standard methods. Use of synthesized data indicates that it is important to optimize the choice of wavelengths to minimize computational errors as additional components (more than 2) are added to the apportionment model of colored components. We found that substantial errors are introduced when using the 4 wavelengths suggested by Lawless et al. to quantify four substances, while an optimized choice of wavelengths can reduce model-derived error from over 10% to less than 2%. For environmental samples, the method was sensitive for estimating airborne levels of BC and SHS, but not mass loadings of iron oxides and sulfate. Duplicate samples collected in NYC show high reproducibility (points consistent with a 1:1 line, R2 = 0.95). BC data measured by this method were consistent with those measured by other optical methods, including the Aethalometer and the Smoke-stain Reflectometer (SSR), although the SSR loses sensitivity at
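The multi-wavelength apportionment described above can be sketched as a small least-squares problem. The coefficients and loadings below are hypothetical illustrations (the paper does not publish its absorption efficiencies); the point is that absorbance at each wavelength is modeled as a linear combination of component loadings, and a nearly singular normal-equations matrix (poorly chosen wavelengths) is what inflates the model-derived error.

```python
# Sketch of two-component optical apportionment (hypothetical coefficients).
# Absorbance at wavelength i: A_i = k_BC[i]*m_BC + k_SHS[i]*m_SHS.
# Solving the overdetermined system by ordinary least squares shows how
# loadings are recovered; ill-chosen wavelengths make the normal-equations
# determinant small and amplify noise in the recovered loadings.

def apportion(k1, k2, absorbances):
    """Least-squares solve for (m1, m2) in A = k1*m1 + k2*m2 (normal equations)."""
    s11 = sum(a * a for a in k1)
    s12 = sum(a * b for a, b in zip(k1, k2))
    s22 = sum(b * b for b in k2)
    b1 = sum(a * y for a, y in zip(k1, absorbances))
    b2 = sum(b * y for b, y in zip(k2, absorbances))
    det = s11 * s22 - s12 * s12          # small det => poorly conditioned
    m1 = (s22 * b1 - s12 * b2) / det
    m2 = (s11 * b2 - s12 * b1) / det
    return m1, m2

# Hypothetical unit-mass absorption spectra at three wavelengths:
k_bc  = [0.90, 0.80, 0.70]   # BC: nearly flat across the visible/NIR
k_shs = [0.60, 0.25, 0.05]   # SHS: absorbs mainly at short wavelengths

# Synthesize absorbances for known loadings, then recover them.
true_bc, true_shs = 2.0, 1.5
A = [kb * true_bc + ks * true_shs for kb, ks in zip(k_bc, k_shs)]
m_bc, m_shs = apportion(k_bc, k_shs, A)
print(round(m_bc, 6), round(m_shs, 6))   # recovers 2.0 and 1.5
```

Adding more components (iron oxides, sulfate) adds columns to this system, which is why the abstract stresses optimizing the wavelength set as the component count grows.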

  16. IEEE/NASA Workshop on Leveraging Applications of Formal Methods, Verification, and Validation

    NASA Technical Reports Server (NTRS)

    Margaria, Tiziana (Editor); Steffen, Bernhard (Editor); Hinchey, Michael G.

    2005-01-01

    This volume contains the Preliminary Proceedings of the 2005 IEEE ISoLA Workshop on Leveraging Applications of Formal Methods, Verification, and Validation, with a special track on the theme of Formal Methods in Human and Robotic Space Exploration. The workshop was held on 23-24 September 2005 at the Loyola College Graduate Center, Columbia, MD, USA. The idea behind the Workshop arose from the experience and feedback of ISoLA 2004, the 1st International Symposium on Leveraging Applications of Formal Methods, held in Paphos (Cyprus) the previous October-November. ISoLA 2004 addressed the need for a forum where developers, users, and researchers could discuss issues related to the adoption and use of rigorous tools and methods for the specification, analysis, verification, certification, construction, testing, and maintenance of systems from the point of view of their different application domains.

  17. Validity boundary of orbital-free molecular dynamics method corresponding to thermal ionization of shell structure

    NASA Astrophysics Data System (ADS)

    Gao, Chang; Zhang, Shen; Kang, Wei; Wang, Cong; Zhang, Ping; He, X. T.

    2016-11-01

    With 6LiD as an example, we show that the applicable region of the orbital-free molecular dynamics (OFMD) method over a large temperature range is determined by the thermal ionization of bound electrons in shell structures. The validity boundary of the OFMD method is defined roughly by the balance point between the average thermal energy of an electron and the ionization energy of the lowest localized electronic state. This theoretical proposition is based on the observation that the deviation of the OFMD method originates from its less accurate description of the charge density in partially ionized shells, as compared with the results of the extended first-principles molecular dynamics method, which reproduces the charge density of shell structures well.
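The balance point described above can be estimated numerically. Assuming "average thermal energy of an electron" means (3/2)kB*T (an assumption for illustration; the paper may use a different definition), the boundary temperature for a bound state of ionization energy E is T = E / ((3/2)kB):

```python
# Rough estimate of the validity-boundary temperature sketched above:
# the temperature at which (3/2)*kB*T (assumed definition of the average
# thermal energy of an electron) equals the ionization energy E of the
# lowest localized electronic state.

KB_EV = 8.617333e-5  # Boltzmann constant in eV/K

def boundary_temperature(ionization_energy_ev):
    """Temperature (K) where (3/2)*kB*T equals the ionization energy."""
    return ionization_energy_ev / (1.5 * KB_EV)

# Example with a hypothetical 13.6 eV ionization energy (hydrogen-like):
print(f"{boundary_temperature(13.6):.3e} K")  # ~1.05e5 K
```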

  18. Validated spectrophotometric methods for simultaneous determination of Omeprazole, Tinidazole and Doxycycline in their ternary mixture

    NASA Astrophysics Data System (ADS)

    Lotfy, Hayam M.; Hegazy, Maha A.; Mowaka, Shereen; Mohamed, Ekram Hany

    2016-01-01

    A comparative study of smart spectrophotometric techniques for the simultaneous determination of Omeprazole (OMP), Tinidazole (TIN) and Doxycycline (DOX) without prior separation steps is presented. These techniques consist of several consecutive steps utilizing zero-order, ratio, or derivative spectra. The proposed techniques adopt nine simple methods, namely direct spectrophotometry, dual wavelength, first derivative-zero crossing, amplitude factor, spectrum subtraction, ratio subtraction, derivative ratio-zero crossing, constant center, and successive derivative ratio. The calibration graphs are linear over the concentration ranges of 1-20 μg/mL, 5-40 μg/mL and 2-30 μg/mL for OMP, TIN and DOX, respectively. These methods were tested by analyzing synthetic mixtures of the above drugs and successfully applied to a commercial pharmaceutical preparation. The methods were validated according to the ICH guidelines; accuracy, precision, and repeatability were found to be within the acceptable limits.
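The dual-wavelength method listed above can be illustrated with a minimal sketch. The spectra and concentrations are hypothetical: the idea is to pick two wavelengths at which the interfering drug absorbs equally, so that the absorbance difference depends only on the analyte.

```python
# Minimal sketch of the dual-wavelength principle (hypothetical spectra):
# at two wavelengths where the interferent's absorbances are equal, the
# absorbance difference cancels the interferent and is proportional to
# the analyte concentration.

def dual_wavelength_signal(spectrum, wl1, wl2):
    """Absorbance difference A(wl1) - A(wl2) for a spectrum stored as {nm: A}."""
    return spectrum[wl1] - spectrum[wl2]

# Hypothetical unit-concentration spectra (absorbance per ug/mL):
analyte     = {280: 0.050, 310: 0.020}
interferent = {280: 0.030, 310: 0.030}   # equal at both wavelengths by design

c_analyte, c_interferent = 8.0, 5.0      # ug/mL in the mixture
mixture = {wl: analyte[wl] * c_analyte + interferent[wl] * c_interferent
           for wl in (280, 310)}

# The interferent cancels in the difference; dividing by the analyte's
# per-concentration difference recovers its concentration.
delta_per_conc = dual_wavelength_signal(analyte, 280, 310)
conc = dual_wavelength_signal(mixture, 280, 310) / delta_per_conc
print(round(conc, 6))  # 8.0
```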

  19. [Validity of the modern fetal monitoring methods in the decision of emergency obstetric operations].

    PubMed

    Issel, E P; Bollmann, R; Prenzlau, P

    1975-01-01

    The validity of modern fetal monitoring methods for deciding the indication of urgent obstetric operations is examined. The reliability of modern fetal surveillance is studied in cases of doubtful fetal heart action. To date there is no method for exactly estimating the degree of damage to the fetus. In such a precarious situation all available methods for diagnosing the fetal condition should be used, because the results of any single method offer insufficient evidence. Drawing on the literature, alterations in the ECG of the dying fetus are interpreted and distinguished from artefacts. In cases of doubtful fetal heart action we recommend, in addition to the clinical findings, recording the fetal ECG, checking the actual fetal pH, and performing an ultrasound examination.

  20. Chemometric approach to open validation protocols: Prediction of validation parameters in multi-residue ultra-high performance liquid chromatography-tandem mass spectrometry methods.

    PubMed

    Alladio, Eugenio; Pirro, Valentina; Salomone, Alberto; Vincenti, Marco; Leardi, Riccardo

    2015-06-09

    The recent technological advancements of liquid chromatography-tandem mass spectrometry allow the simultaneous determination of tens, or even hundreds, of target analytes. In such cases, the traditional approach to quantitative method validation presents three major drawbacks: (i) it is extremely laborious, repetitive and rigid; (ii) it does not allow new target analytes to be introduced without restarting the validation from the beginning; and (iii) it is performed on spiked blank matrices, whose very nature is significantly modified by the addition of a large number of spiking substances, especially at high concentration. In the present study, several predictive chemometric models were developed from closed sets of analytes in order to estimate validation parameters for molecules of the same class not included in the original training set. Retention time, matrix effect, recovery, detection and quantification limits were predicted with the partial least squares regression method. In particular, iterative stepwise elimination, iterative predictor weighting and genetic algorithm approaches were used and compared to achieve effective variable selection. These procedures were applied to data reported in our previously validated ultra-high performance liquid chromatography-tandem mass spectrometry multi-residue method for the determination of pharmaceutical and illicit drugs in oral fluid samples in accordance with national and international guidelines. Then, the partial least squares model was successfully tested on naloxone and lormetazepam, in order to introduce these new compounds into the validated oral fluid method, which adopts reverse-phase chromatography. Retention time, matrix effect, recovery, limit of detection and limit of quantification parameters for naloxone and lormetazepam were predicted by the model and then positively compared with their corresponding experimental values.
The whole study represents a proof of concept of the potential of chemometrics to

  1. Bridging the Gap Between Validation and Implementation of Non-Animal Veterinary Vaccine Potency Testing Methods

    PubMed Central

    Dozier, Samantha; Brown, Jeffrey; Currie, Alistair

    2011-01-01

    Simple Summary Many vaccines are tested for quality in experiments that require the use of large numbers of animals in procedures that often cause significant pain and distress. Newer technologies have fostered the development of vaccine quality control tests that reduce or eliminate the use of animals, but the availability of these newer methods has not guaranteed their acceptance by regulators or use by manufacturers. We discuss a strategic approach that has been used to assess and ultimately increase the use of non-animal vaccine quality tests in the U.S. and U.K. Abstract In recent years, technologically advanced high-throughput techniques have been developed that replace, reduce or refine animal use in vaccine quality control tests. Following validation, these tests are slowly being accepted for use by international regulatory authorities. Because regulatory acceptance itself has not guaranteed that approved humane methods are adopted by manufacturers, various organizations have sought to foster the preferential use of validated non-animal methods by interfacing with industry and regulatory authorities. After noticing this gap between regulation and uptake by industry, we began developing a paradigm that seeks to narrow the gap and quicken implementation of new replacement, refinement or reduction guidance. A systematic analysis of our experience in promoting the transparent implementation of validated non-animal vaccine potency assays has led to the refinement of our paradigmatic process, presented here, by which interested parties can assess the local regulatory acceptance of methods that reduce animal use and integrate them into quality control testing protocols, or ensure the elimination of peripheral barriers to their use, particularly for potency and other tests carried out on production batches. PMID:26486625

  2. AAPS and US FDA Crystal City VI workshop on bioanalytical method validation for biomarkers.

    PubMed

    Lowes, Steve; Ackermann, Bradley L

    2016-02-01

    Crystal City VI Workshop on Bioanalytical Method Validation of Biomarkers, Renaissance Baltimore Harborplace Hotel, Baltimore, MD, USA, 28-29 September 2015. The Crystal City VI workshop was organized by the American Association of Pharmaceutical Scientists in association with the US FDA to continue discussion on the bioanalysis of biomarkers. An outcome of the Crystal City V workshop, convened following release of the draft FDA Guidance for Industry on Bioanalytical Methods Validation in 2013, was the need for further discussion of biomarker methods. Biomarkers ultimately became the sole focal point for Crystal City VI, a meeting attended by approximately 200 people and composed of industry scientists and regulators from around the world. The meeting format included several panel discussions to maximize the opportunity for dialogue among participants. Following an initial session on the general topic of biomarker assays and intended use, more focused sessions were held on chromatographic (LC-MS) and ligand-binding assays. In addition to participation by the drug development community, significant representation was present from clinical testing laboratories. The experience of this latter group, collectively identified as practitioners of CLIA (Clinical Laboratory Improvement Amendments), helped shape the discussion and takeaways from the meeting. While the need to operate within the framework of the current BMV guidance was clearly acknowledged, a general understanding that biomarker method validation cannot be adequately depicted by current PK-centric guidelines emerged as a consensus from the meeting. This report is not intended to constitute the official proceedings from Crystal City VI, which is expected to be published in early 2016.

  3. Experimental comparison and validation of hot-ball method with guarded hot plate method on polyurethane foams

    NASA Astrophysics Data System (ADS)

    Hudec, Ján; Glorieux, Christ; Dieška, Peter; Kubičár, Ľudovít

    2016-07-01

    The Hot-ball method is an innovative transient method for measuring thermophysical properties. The principle is based on heating a small ball, incorporated in the measured medium, with constant heating power while simultaneously measuring the ball's temperature response from the moment heating is initiated. The shape of the temperature response depends on the thermophysical properties of the medium in which the sensor is placed. This method is patented by the Institute of Physics, SAS, where the method and sensors based on it are being developed. At the beginning of sensor development we focused on monitoring applications, where relative precision is much more important than accuracy. Since then, sensor quality has improved enough for a new application - absolute measurement of the thermophysical parameters of low-thermal-conductivity materials. This paper describes experimental verification and validation of measurement by the hot-ball method. Thanks to cooperation with the Laboratory of Soft Matter and Biophysics of the Catholic University of Leuven in Belgium, the established Guarded Hot Plate method was used as a reference. Details of the measuring setups, a description of the experiments, and the results of the comparison are presented.

  4. Validation of a two-plate microbiological method for screening antibiotic residues in shrimp tissue.

    PubMed

    Dang, Pham Kim; Degand, Guy; Danyi, Sophie; Pierret, Gilles; Delahaut, Philippe; Ton, Vu Dinh; Maghuin-Rogister, Guy; Scippo, Marie-Louise

    2010-07-05

    Microbiological inhibition screening tests could play an important role in detecting antibiotic residues in animal food products, but very few are available for aquaculture products in general, and for shrimp in particular. A two-plate microbiological method to screen shrimp for residues of the most commonly used antibiotics has been developed and validated according to criteria derived from European Commission Decision 2002/657/EC. Bacillus subtilis was used as a strain sensitive to the target antibiotics. Culture conditions on Petri plates (pH of the medium) were selected to enhance the capacity for antibiotic detection. Antibiotic residues were extracted from shrimp using acetonitrile/acetone (70/30, v/v) before application on Petri plates seeded with B. subtilis. The method was validated using spiked blank tissues as well as shrimp treated with enrofloxacin and tetracycline, two antibiotics often used in shrimp production. For tetracyclines and (fluoro)quinolones, the detection capability was below the maximum residue limit (MRL), while it was around the MRL for sulfonamides. The specificity of the microbiological screening was 100% in all cases, while the sensitivity and accuracy were 100% in almost all cases. The capacity of the method to detect contaminated samples was confirmed on antibiotic-treated shrimp, analyzed in parallel with a confirmatory method (liquid chromatography coupled to mass spectrometry, LC-MS).

  5. Method validation using weighted linear regression models for quantification of UV filters in water samples.

    PubMed

    da Silva, Claudia Pereira; Emídio, Elissandro Soares; de Marchi, Mary Rosa Rodrigues

    2015-01-01

    This paper describes the validation of a method consisting of solid-phase extraction followed by gas chromatography-tandem mass spectrometry for the analysis of the ultraviolet (UV) filters benzophenone-3, ethylhexyl salicylate, ethylhexyl methoxycinnamate and octocrylene. The method validation criteria included evaluation of selectivity, analytical curve, trueness, precision, limits of detection and limits of quantification. The non-weighted linear regression model has traditionally been used for calibration, but it is not necessarily the optimal model in all cases. Because the assumption of homoscedasticity was not met for the analytical data in this work, a weighted least squares linear regression was used for the calibration method. The evaluated analytical parameters were satisfactory for the analytes and showed recoveries at four fortification levels between 62% and 107%, with relative standard deviations less than 14%. The detection limits ranged from 7.6 to 24.1 ng L(-1). The proposed method was used to determine the amount of UV filters in water samples from water treatment plants in Araraquara and Jau in São Paulo, Brazil.
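The weighted least-squares calibration discussed above has a simple closed form. The sketch below uses hypothetical calibration data; the weights shown (1/x^2) are one common practical choice when, as here, the response variance grows with concentration.

```python
# Sketch of weighted least-squares calibration. With heteroscedastic data,
# each point gets a weight (commonly w = 1/s^2 with s the response SD at
# that level; 1/x or 1/x^2 are frequent practical surrogates). Closed-form
# weighted slope and intercept:

def wls_fit(x, y, w):
    """Weighted least-squares line y = a + b*x; returns (a, b)."""
    sw = sum(w)
    xbar = sum(wi * xi for wi, xi in zip(w, x)) / sw
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sw
    sxy = sum(wi * (xi - xbar) * (yi - ybar) for wi, xi, yi in zip(w, x, y))
    sxx = sum(wi * (xi - xbar) ** 2 for wi, xi in zip(w, x))
    b = sxy / sxx
    a = ybar - b * xbar
    return a, b

# Hypothetical calibration levels (ng/L) and instrument responses:
x = [10, 50, 100, 500, 1000]
y = [21, 102, 198, 1010, 1985]
w = [1.0 / xi**2 for xi in x]        # 1/x^2 weighting
a, b = wls_fit(x, y, w)
print(round(a, 3), round(b, 4))
```

Compared with the unweighted fit, the 1/x^2 weights keep the high-concentration points from dominating the regression, which improves trueness near the low end of the analytical range.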

  6. Quantification of histone modifications by parallel-reaction monitoring: a method validation.

    PubMed

    Sowers, James L; Mirfattah, Barsam; Xu, Pei; Tang, Hui; Park, In Young; Walker, Cheryl; Wu, Ping; Laezza, Fernanda; Sowers, Lawrence C; Zhang, Kangling

    2015-10-06

    Abnormal epigenetic reprogramming is one of the major causes of irregular gene expression and regulatory pathway perturbations in cells, resulting in unhealthy cell development or disease. Accurate measurement of these changes in epigenetic modifications, especially the complex histone modifications, is very important, and the methods for such measurements are not trivial. Following our previous introduction of PRM for targeting histone modifications (Tang, H.; Fang, H.; Yin, E.; Brasier, A. R.; Sowers, L. C.; Zhang, K. Multiplexed parallel reaction monitoring targeting histone modifications on the QExactive mass spectrometer. Anal. Chem. 2014, 86 (11), 5526-34), herein we validated this method by varying the protein/trypsin ratios via serial dilutions. Our data demonstrated that PRM with SILAC histones as internal standards allowed reproducible measurements of histone H3/H4 acetylation and methylation in samples whose histone contents differ by at least one order of magnitude. The method was further validated with histones isolated from histone H3 K36 trimethyltransferase SETD2 knockout mouse embryonic fibroblast (MEF) cells. Furthermore, histone acetylation and methylation in human neural stem cells (hNSC) treated with ascorbic acid phosphate (AAP) were measured by this method, revealing that H3 K36 trimethylation was significantly down-regulated by 6 days of treatment with vitamin C.

  7. Development and method validation for the determination of nitroimidazole residues in salmon, tilapia and shrimp muscle.

    PubMed

    Watson, Lynn; Potter, Ross; MacNeil, James D; Murphy, Cory

    2014-01-01

    The use of nitroimidazoles in aquacultured fish has been banned in many countries due to the suspected mutagenic and carcinogenic effects of these compounds. In response to the need to conduct residue testing for these compounds in fish, a simple, rapid, and sensitive method was developed and validated that is suitable for regulatory monitoring of nitroimidazole residues and their hydroxy metabolites in fish muscle tissue. Following solvent extraction of homogenized tissue and clean-up using a C18 SPE cartridge, analyses were conducted by ultra-performance liquid chromatography-tandem mass spectrometry (UPLC-MS/MS). A precursor ion and two product ions were monitored for each of the parent compounds and metabolites included in the method. The validated method has an analytical range from 1 to 50 ng/g in the representative species (tilapia, salmon, and shrimp), with LODs and LOQs ranging from 0.07 to 1.0 ng/g and 0.21 to 3.0 ng/g, respectively, depending on the analyte. Recoveries ranged from 81 to 124% and repeatability was between 4 and 17%. HorRat values were within typical limits of acceptability for a single laboratory. Working standards were stable for 12 months, extracts for 5 days, and tissues for 2 months under appropriate storage conditions. This method was determined to be suitable for routine use for screening, quantification, and confirmation of nitroimidazole residues in a regulatory fish residue monitoring program.
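The LOD and LOQ figures above follow the usual ICH-style relations LOD = 3.3*sigma/S and LOQ = 10*sigma/S (sigma: standard deviation of the response, e.g. of blanks or the calibration intercept; S: calibration slope). Note that the quoted LOQs (0.21 and 3.0 ng/g) are exactly 3x the LODs (0.07 and 1.0 ng/g), consistent with the 10/3.3 ratio. A minimal sketch with hypothetical numbers:

```python
# ICH Q2-style detection and quantification limits:
# LOD = 3.3*sigma/S, LOQ = 10*sigma/S, so LOQ/LOD = 10/3.3 ~= 3.03.

def lod_loq(sigma, slope):
    """Return (LOD, LOQ) from response SD and calibration slope."""
    return 3.3 * sigma / slope, 10.0 * sigma / slope

sigma, slope = 0.042, 1.98          # hypothetical response SD and slope
lod, loq = lod_loq(sigma, slope)
print(f"LOD = {lod:.3f} ng/g, LOQ = {loq:.3f} ng/g")
```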

  8. Development and validation of an HPLC-MS/MS method for the early diagnosis of aspergillosis.

    PubMed

    Cerqueira, Letícia B; de Francisco, Thais M G; Gasparetto, João C; Campos, Francinete R; Pontarolo, Roberto

    2014-01-01

    Invasive aspergillosis is an opportunistic infection that is mainly caused by Aspergillus fumigatus, which is known to produce several secondary metabolites, including gliotoxin, the most abundant metabolite produced during hyphal growth. The diagnosis of invasive aspergillosis is often made late in the infection because of the lack of reliable and feasible diagnostic techniques; therefore, early detection is critical to begin treatment and avoid more serious complications. The present work reports the development and validation of an HPLC-MS/MS method for the detection of gliotoxin in the serum of patients with suspected aspergillosis. Chromatographic separation was achieved using an XBridge C18 column (150 × 2.1 mm id; 5 μm particle size) maintained at 25 °C with the corresponding guard column (XBridge C18, 10 × 2.1 mm id, 5 μm particle size). The mobile phase was composed of a gradient of water and acetonitrile/water (95:5 v/v), both containing 1 mM ammonium formate, at a flow rate of 0.45 mL min(-1). Data from the validation studies demonstrate that this new method is highly sensitive, selective, linear, precise, accurate and free from matrix interference. The developed method was successfully applied to samples from patients suspected of having aspergillosis. Therefore, the developed method has considerable potential as a diagnostic technique for aspergillosis.

  9. Development and validation of sensitive spectrophotometric method for determination of two antiepileptics in pharmaceutical formulations

    NASA Astrophysics Data System (ADS)

    Gouda, Ayman A.; Malah, Zakia Al

    2013-03-01

    Rapid, sensitive and validated spectrophotometric methods for the determination of two antiepileptics (gabapentin (GAB) and pregabalin (PRG)) in pure form and in pharmaceutical formulations were developed. The methods are based on the formation of charge transfer complexes between the drug and the chromogenic reagents quinalizarin (Quinz) and alizarin red S (ARS) in methanolic medium, with absorption maxima at 571 and 528 nm for GAB and 572 and 538 nm for PRG using Quinz and ARS, respectively. The reaction conditions, such as the type of solvent, reagent concentration and reaction time, were optimized. Beer's law is obeyed in the concentration ranges 0.4-8.0 and 0.5-10 μg mL-1 for GAB and PRG using Quinz and ARS, respectively. The molar absorptivity, Sandell sensitivity, and detection and quantification limits were also calculated. The correlation coefficients were ⩾0.9992 with a relative standard deviation (RSD%) of ⩽1.76. The methods were successfully applied to the determination of GAB and PRG in pharmaceutical formulations; their validity was assessed by applying the standard addition technique, and the results were compared with those obtained using reported methods.

  10. Validation of a UV Spectrometric Method for the Assay of Tolfenamic Acid in Organic Solvents

    PubMed Central

    Ahmed, Sofia; Mustaan, Nafeesa; Sheraz, Muhammad Ali; Nabi, Syeda Ayesha Ahmed un; Ahmad, Iqbal

    2015-01-01

    The present study has been carried out to validate a UV spectrometric method for the assay of tolfenamic acid (TA) in organic solvents. TA is insoluble in water; therefore, a total of thirteen commonly used organic solvents in which the drug is soluble were selected. Fresh stock solutions of TA in each solvent at a concentration of 1 × 10−4 M (2.62 mg%) were prepared for the assay. The method has been validated according to the International Conference on Harmonization guideline, and parameters such as linearity, range, accuracy, precision, sensitivity, and robustness have been studied. Although the method was found to be efficient for the determination of TA in all solvents, on the basis of the statistical data 1-octanol, followed by ethanol and methanol, was found to be comparatively better than the other studied solvents. No change in the stability of the TA stock solutions was observed in any solvent for 24 hours, stored either at room temperature (25 ± 1°C) or refrigerated (2–8°C). A shift in the absorption maxima was observed for TA in various solvents, indicating drug-solvent interactions. The studied method is simple, rapid, economical, accurate, and precise for the assay of TA in different organic solvents. PMID:26783497
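The concentration quoted above (1 × 10−4 M expressed as 2.62 mg%) can be checked with a short conversion, taking the molar mass of tolfenamic acid (C14H12ClNO2) as approximately 261.7 g/mol:

```python
# Checking the stock-solution concentration: convert molarity to mg%
# (mg per 100 mL) using the molar mass of tolfenamic acid (~261.7 g/mol).

MOLAR_MASS_TA = 261.7   # g/mol, approximate

def molar_to_mg_percent(molarity, molar_mass):
    """Convert mol/L to mg per 100 mL (mg%)."""
    mg_per_litre = molarity * molar_mass * 1000
    return mg_per_litre / 10   # 1 L = 10 x 100 mL

print(round(molar_to_mg_percent(1e-4, MOLAR_MASS_TA), 2))  # 2.62, as stated
```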

  11. Validation of experimental whole-body SAR assessment method in a complex indoor environment.

    PubMed

    Bamba, Aliou; Joseph, Wout; Vermeeren, Gunter; Tanghe, Emmeric; Gaillot, Davy Paul; Andersen, Jørgen B; Nielsen, Jesper Ødum; Lienard, Martine; Martens, Luc

    2013-02-01

    Experimentally assessing the whole-body specific absorption rate (SAR(wb)) in a complex indoor environment is very challenging. An experimental method based on room electromagnetics theory (accounting for only the line-of-sight component as a specular path) is validated using numerical simulations with the finite-difference time-domain method. Furthermore, the method accounts for diffuse multipath components (DMC) in the total absorption rate by considering the reverberation time of the investigated room, which describes all the losses in a complex indoor environment. The advantage of the proposed method is that it avoids the computational burden of numerical approaches because it does not require any discretization. Results show good agreement between measurement and computation at 2.8 GHz as long as the plane-wave assumption is valid, that is, at large distances from the transmitter. Relative deviations of 0.71% and 4% were obtained for the far-field scenarios, and 77.5% for the near-field scenario. The contribution of the DMC to the total absorption rate is also quantified here, which has never been investigated before. It is found that the DMC may represent an important part of the total absorption rate; its contribution may reach up to 90% for certain scenarios in an indoor environment.

  12. Validation of an Association Rule Mining-Based Method to Infer Associations Between Medications and Problems

    PubMed Central

    Wright, A.; McCoy, A.; Henkin, S.; Flaherty, M.; Sittig, D.

    2013-01-01

    Background In a prior study, we developed methods for automatically identifying associations between medications and problems using association rule mining on a large clinical data warehouse and validated these methods at a single site which used a self-developed electronic health record. Objective To demonstrate the generalizability of these methods by validating them at an external site. Methods We received data on medications and problems for 263,597 patients from the University of Texas Health Science Center at Houston Faculty Practice, an ambulatory practice that uses the Allscripts Enterprise commercial electronic health record product. We then conducted association rule mining to identify associated pairs of medications and problems, characterized these associations with five measures of interestingness (support, confidence, chi-square, interest and conviction), and compared the top-ranked pairs to a gold standard. Results 25,088 medication-problem pairs were identified that exceeded our confidence and support thresholds. An analysis of the top 500 pairs according to each measure of interestingness showed a high degree of accuracy for highly-ranked pairs. Conclusion The same technique was successfully employed at the University of Texas and accuracy was comparable to our previous results. Top associations included many medications that are highly specific for a particular problem as well as a large number of common, accurate medication-problem pairs that reflect practice patterns. PMID:23650491
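The five interestingness measures named above can be computed for a single medication-problem pair from a 2x2 contingency table. The sketch below uses hypothetical counts (the pair and cohort are illustrative, not from the study):

```python
# Support, confidence, interest (lift), conviction, and chi-square for one
# medication-problem pair, from contingency counts: n11 patients with both,
# n10 medication only, n01 problem only, n00 neither.

def interestingness(n11, n10, n01, n00):
    n = n11 + n10 + n01 + n00
    p_med, p_prob, p_both = (n11 + n10) / n, (n11 + n01) / n, n11 / n
    support = p_both
    confidence = p_both / p_med                  # P(problem | medication)
    interest = p_both / (p_med * p_prob)         # a.k.a. lift
    conviction = (1 - p_prob) / (1 - confidence) if confidence < 1 else float("inf")
    # Pearson chi-square over the 2x2 table
    chi2 = 0.0
    for obs, pr, pc in [(n11, p_med, p_prob), (n10, p_med, 1 - p_prob),
                        (n01, 1 - p_med, p_prob), (n00, 1 - p_med, 1 - p_prob)]:
        expected = n * pr * pc
        chi2 += (obs - expected) ** 2 / expected
    return support, confidence, interest, conviction, chi2

# Hypothetical cohort of 1000 patients for one medication-problem pair:
s, c, lift, conv, x2 = interestingness(n11=80, n10=20, n01=120, n00=780)
print(f"support={s:.3f} confidence={c:.2f} lift={lift:.2f} "
      f"conviction={conv:.2f} chi2={x2:.1f}")
```

A lift well above 1 and a large chi-square, as in this toy example, are what push a pair toward the top of the ranked lists the study evaluated.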

  13. Probability of identification: a statistical model for the validation of qualitative botanical identification methods.

    PubMed

    LaBudde, Robert A; Harnly, James M

    2012-01-01

    A qualitative botanical identification method (BIM) is an analytical procedure that returns a binary result (1 = Identified, 0 = Not Identified). A BIM may be used by a buyer, manufacturer, or regulator to determine whether a botanical material being tested is the same as the target (desired) material, or whether it contains excessive nontarget (undesirable) material. The report describes the development and validation of studies for a BIM based on the proportion of replicates identified, or probability of identification (POI), as the basic observed statistic. The statistical procedures proposed for data analysis follow closely those of the probability of detection, and harmonize the statistical concepts and parameters between quantitative and qualitative method validation. Use of POI statistics also harmonizes statistical concepts for botanical, microbiological, toxin, and other analyte identification methods that produce binary results. The POI statistical model provides a tool for graphical representation of response curves for qualitative methods, reporting of descriptive statistics, and application of performance requirements. Single collaborator and multicollaborative study examples are given.
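The basic POI statistic described above is just the proportion of replicates identified, reported with a confidence interval. The sketch below pairs it with a Wilson score interval, one common choice for binomial proportions (the report may prescribe a different interval):

```python
# Probability of identification (POI) for a qualitative method returning
# binary results, with an approximate 95% Wilson score confidence interval.
import math

def poi_with_wilson(identified, n, z=1.96):
    """Return (POI, lower, upper) for `identified` positives out of n replicates."""
    p = identified / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return p, centre - half, centre + half

# e.g. 27 of 30 replicates identified at some concentration of target material:
p, lo, hi = poi_with_wilson(identified=27, n=30)
print(f"POI = {p:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```

Plotting POI (with its interval) against target concentration gives the qualitative response curve the abstract describes, directly analogous to probability-of-detection curves.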

  14. Degradation Pathway for Eplerenone by Validated Stability Indicating UP-LC Method.

    PubMed

    Sudhakar Babu, Kondru; Madireddy, Venkataramanna; Indukuri, Venkata Somaraju

    2012-01-01

    Degradation pathway for eplerenone is established as per ICH recommendations by a validated, stability-indicating reverse phase liquid chromatographic method. Eplerenone is subjected to acid, base, oxidative, thermal, and photolytic stress conditions. Significant degradation is observed under acid and base stress. Four impurities are studied and the major degradant (RRT about 0.31) was identified by LC-MS and spectral analysis. The stress samples are assayed against a qualified reference standard and the mass balance is found close to 99.5%. Efficient chromatographic separation is achieved on a Waters Symmetry C18 stationary phase with a simple mobile phase combination delivered in gradient mode, and quantification is carried out at 240 nm at a flow rate of 1.0 mL min(-1). In the developed LC method the resolution between eplerenone and four potential impurities (imp-1, imp-2, imp-3, and imp-4) is found to be greater than 4.0. Regression analysis shows an r value (correlation coefficient) of greater than 0.999 for eplerenone and the four potential impurities. This method is capable of detecting the impurities of eplerenone at a level of 0.020% with respect to a test concentration of 1.0 mg mL(-1) for a 20 μL injection volume. The developed UPLC method is validated with respect to specificity, linearity and range, accuracy, precision, and robustness for impurities and assay determination.

  15. Development and Validation of Stability-indicating HPLC Method for Simultaneous Estimation of Cefixime and Linezolid

    PubMed Central

    Patel, Nidhi S.; Tandel, Falguni B.; Patel, Yogita D.; Thakkar, Kartavya B.

    2014-01-01

    A stability-indicating reverse phase high performance liquid chromatography method was developed and validated for cefixime and linezolid. The wavelength selected for quantitation was 276 nm. The method has been validated for linearity, accuracy, precision, robustness, limit of detection and limit of quantitation. Linearity was observed in the concentration range of 2-12 μg/ml for cefixime and 6-36 μg/ml for linezolid. For RP-HPLC, the separation was achieved on a Phenomenex Luna C18 (250×4.6 mm) 5 μm column using phosphate buffer (pH 7):methanol (60:40 v/v) as mobile phase with a flow rate of 1 ml/min. The retention times of cefixime and linezolid were found to be 3.127 min and 11.986 min, respectively. During forced degradation, the drug product was exposed to hydrolysis (acid and base), H2O2, thermal degradation and photodegradation. The degradation was found to be 10 to 20% for both cefixime and linezolid under the given conditions. The method specifically estimates both drugs in the presence of all the degradants generated during the forced degradation study. The developed method is simple, specific and economical, and can be used for simultaneous estimation of cefixime and linezolid in tablet dosage form. PMID:25593387

  16. Risk-based criteria to support validation of detection methods for drinking water and air.

    SciTech Connect

    MacDonell, M.; Bhattacharyya, M.; Finster, M.; Williams, M.; Picel, K.; Chang, Y.-S.; Peterson, J.; Adeshina, F.; Sonich-Mullin, C.; Environmental Science Division; EPA

    2009-02-18

    This report was prepared to support the validation of analytical methods for threat contaminants under the U.S. Environmental Protection Agency (EPA) National Homeland Security Research Center (NHSRC) program. It is designed to serve as a resource for certain applications of benchmark and fate information for homeland security threat contaminants. The report identifies risk-based criteria from existing health benchmarks for drinking water and air for potential use as validation targets. The focus is on benchmarks for chronic public exposures. The priority sources are standard EPA concentration limits for drinking water and air, along with oral and inhalation toxicity values. Many contaminants identified as homeland security threats to drinking water or air would convert to other chemicals within minutes to hours of being released. For this reason, a fate analysis has been performed to identify potential transformation products and removal half-lives in air and water so appropriate forms can be targeted for detection over time. The risk-based criteria presented in this report to frame method validation are expected to be lower than actual operational targets based on realistic exposures following a release. Note that many target criteria provided in this report are taken from available benchmarks without assessing the underlying toxicological details. That is, although the relevance of the chemical form and analogues are evaluated, the toxicological interpretations and extrapolations conducted by the authoring organizations are not. It is also important to emphasize that such targets in the current analysis are not health-based advisory levels to guide homeland security responses. This integrated evaluation of chronic public benchmarks and contaminant fate has identified more than 200 risk-based criteria as method validation targets across numerous contaminants and fate products in drinking water and air combined. The gap in directly applicable values is

  17. Validation of a Method to Accurately Correct Anterior Superior Iliac Spine Marker Occlusion

    PubMed Central

    Hoffman, Joshua T.; McNally, Michael P.; Wordeman, Samuel C.; Hewett, Timothy E.

    2015-01-01

    Anterior superior iliac spine (ASIS) marker occlusion commonly occurs during three-dimensional (3-D) motion capture of dynamic tasks with deep hip flexion. The purpose of this study was to validate a universal technique to correct ASIS occlusion. 420 ms of bilateral ASIS marker occlusion was simulated in fourteen drop vertical jump (DVJ) trials (n=14). Kinematic and kinetic hip data calculated for pelvic segments based on iliac crest (IC) marker and virtual ASIS (produced by our algorithm and a commercial virtual join) trajectories were compared to true ASIS marker tracking data. Root mean squared errors (RMSEs; mean ± standard deviation) and intra-class correlations (ICCs) between pelvic tracking based on virtual ASIS trajectories filled by our algorithm and true ASIS position were 2.3±0.9° (ICC=0.982) flexion/extension and 0.8±0.2° (ICC=0.954) abduction/adduction for hip angles, and 0.40±0.17 N-m (ICC=1.000) and 1.05±0.36 N-m (ICC=0.998) for sagittal and frontal plane moments. RMSEs for IC pelvic tracking were 6.9±1.8° (ICC=0.888) flexion/extension and 0.8±0.3° (ICC=0.949) abduction/adduction for hip angles, and 0.31±0.13 N-m (ICC=1.000) and 1.48±0.69 N-m (ICC=0.996) for sagittal and frontal plane moments. Finally, the commercially-available virtual join demonstrated RMSEs of 4.4±1.5° (ICC=0.945) flexion/extension and 0.7±0.2° (ICC=0.972) abduction/adduction for hip angles, and 0.97±0.62 N-m (ICC=1.000) and 1.49±0.67 N-m (ICC=0.996) for sagittal and frontal plane moments. The presented algorithm exceeded the a priori ICC cutoff of 0.95 for excellent validity and is an acceptable tracking alternative. While ICCs for the commercially available virtual join did not exhibit excellent correlation, good validity was observed for all kinematics and kinetics. IC marker pelvic tracking is not a valid alternative. PMID:25704531
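    The RMSE agreement statistic used throughout this record reduces to a one-line computation over paired samples. A minimal sketch (the ICC, which the study also reports, requires a mixed-model formulation and is not shown):

```python
from math import sqrt

def rmse(true_series, estimated_series):
    """Root mean squared error between a true and a reconstructed trajectory."""
    assert len(true_series) == len(estimated_series)
    return sqrt(sum((t - e) ** 2 for t, e in zip(true_series, estimated_series))
                / len(true_series))
```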

  18. Validated stability-indicating spectrofluorimetric methods for the determination of ebastine in pharmaceutical preparations.

    PubMed

    Ibrahim, Fawzia; El-Din, Mohie Khaled Sharaf; Eid, Manal Ibrahim; Wahba, Mary Elias Kamel

    2011-03-08

    Two sensitive, selective, economic, and validated spectrofluorimetric methods were developed for the determination of ebastine (EBS) in pharmaceutical preparations depending on reaction with its tertiary amino group. Method I involves condensation of the drug with mixed anhydrides (citric and acetic anhydrides) producing a product with intense fluorescence, which was measured at 496 nm after excitation at 388 nm. Method (IIA) describes quantitative fluorescence quenching of eosin upon addition of the studied drug where the decrease in the fluorescence intensity was directly proportional to the concentration of ebastine; the fluorescence quenching was measured at 553 nm after excitation at 457 nm. This method was extended to (Method IIB) to apply first and second derivative synchronous spectrofluorimetric method (FDSFS & SDSFS) for the simultaneous analysis of EBS in presence of its alkaline, acidic, and UV degradation products. The proposed methods were successfully applied for the determination of the studied compound in its dosage forms. The results obtained were in good agreement with those obtained by a comparison method. Both methods were utilized to investigate the kinetics of the degradation of the drug.

  19. Validated stability-indicating spectrofluorimetric methods for the determination of ebastine in pharmaceutical preparations

    PubMed Central

    2011-01-01

    Two sensitive, selective, economic, and validated spectrofluorimetric methods were developed for the determination of ebastine (EBS) in pharmaceutical preparations depending on reaction with its tertiary amino group. Method I involves condensation of the drug with mixed anhydrides (citric and acetic anhydrides) producing a product with intense fluorescence, which was measured at 496 nm after excitation at 388 nm. Method (IIA) describes quantitative fluorescence quenching of eosin upon addition of the studied drug where the decrease in the fluorescence intensity was directly proportional to the concentration of ebastine; the fluorescence quenching was measured at 553 nm after excitation at 457 nm. This method was extended to (Method IIB) to apply first and second derivative synchronous spectrofluorimetric method (FDSFS & SDSFS) for the simultaneous analysis of EBS in presence of its alkaline, acidic, and UV degradation products. The proposed methods were successfully applied for the determination of the studied compound in its dosage forms. The results obtained were in good agreement with those obtained by a comparison method. Both methods were utilized to investigate the kinetics of the degradation of the drug. PMID:21385439

  20. Validation of different spectrophotometric methods for determination of vildagliptin and metformin in binary mixture

    NASA Astrophysics Data System (ADS)

    Abdel-Ghany, Maha F.; Abdel-Aziz, Omar; Ayad, Miriam F.; Tadros, Mariam M.

    New, simple, specific, accurate, precise and reproducible spectrophotometric methods have been developed and subsequently validated for determination of vildagliptin (VLG) and metformin (MET) in binary mixture. A zero order spectrophotometric method was the first method used for determination of MET in the range of 2-12 μg mL-1, measuring the absorbance at 237.6 nm. The second method was a derivative spectrophotometric technique, used for determination of MET at 247.4 nm in the range of 1-12 μg mL-1. A derivative ratio spectrophotometric method was the third technique, used for determination of VLG in the range of 4-24 μg mL-1 at 265.8 nm. The fourth and fifth methods, adopted for determination of VLG in the range of 4-24 μg mL-1, were ratio subtraction and mean centering spectrophotometric methods, respectively. All the results were statistically compared with the reported methods using one-way analysis of variance (ANOVA). The developed methods were satisfactorily applied to analysis of the investigated drugs and proved to be specific and accurate for their quality control in pharmaceutical dosage forms.

  1. Continental-scale Validation of MODIS-based and LEDAPS Landsat ETM+ Atmospheric Correction Methods

    NASA Technical Reports Server (NTRS)

    Ju, Junchang; Roy, David P.; Vermote, Eric; Masek, Jeffrey; Kovalskyy, Valeriy

    2012-01-01

    The potential of Landsat data processing to provide systematic continental scale products has been demonstrated by several projects including the NASA Web-enabled Landsat Data (WELD) project. The recent free availability of Landsat data increases the need for robust and efficient atmospheric correction algorithms applicable to large volume Landsat data sets. This paper compares the accuracy of two Landsat atmospheric correction methods: a MODIS-based method and the Landsat Ecosystem Disturbance Adaptive Processing System (LEDAPS) method. Both methods are based on the 6SV radiative transfer code but have different atmospheric characterization approaches. The MODIS-based method uses the MODIS Terra derived dynamic aerosol type, aerosol optical thickness, and water vapor to atmospherically correct ETM+ acquisitions in each coincident orbit. The LEDAPS method uses aerosol characterizations derived independently from each Landsat acquisition and assumes a fixed continental aerosol type and uses ancillary water vapor. Validation results are presented comparing ETM+ atmospherically corrected data generated using these two methods with AERONET corrected ETM+ data for 95 10 km×10 km 30 m subsets, a total of nearly 8 million 30 m pixels, located across the conterminous United States. The results indicate that the MODIS-based method has better accuracy than the LEDAPS method for the ETM+ red and longer wavelength bands.

  2. Validation of different spectrophotometric methods for determination of vildagliptin and metformin in binary mixture.

    PubMed

    Abdel-Ghany, Maha F; Abdel-Aziz, Omar; Ayad, Miriam F; Tadros, Mariam M

    2014-05-05

    New, simple, specific, accurate, precise and reproducible spectrophotometric methods have been developed and subsequently validated for determination of vildagliptin (VLG) and metformin (MET) in binary mixture. A zero order spectrophotometric method was the first method used for determination of MET in the range of 2-12 μg mL(-1), measuring the absorbance at 237.6 nm. The second method was a derivative spectrophotometric technique, used for determination of MET at 247.4 nm in the range of 1-12 μg mL(-1). A derivative ratio spectrophotometric method was the third technique, used for determination of VLG in the range of 4-24 μg mL(-1) at 265.8 nm. The fourth and fifth methods, adopted for determination of VLG in the range of 4-24 μg mL(-1), were ratio subtraction and mean centering spectrophotometric methods, respectively. All the results were statistically compared with the reported methods using one-way analysis of variance (ANOVA). The developed methods were satisfactorily applied to analysis of the investigated drugs and proved to be specific and accurate for their quality control in pharmaceutical dosage forms.

  3. Development and validation of spectrophotometric, atomic absorption and kinetic methods for determination of moxifloxacin hydrochloride.

    PubMed

    Abdellaziz, Lobna M; Hosny, Mervat M

    2011-01-01

    Three simple spectrophotometric and atomic absorption spectrometric methods are developed and validated for the determination of moxifloxacin HCl in pure form and in pharmaceutical formulations. Method (A) is a kinetic method based on the oxidation of moxifloxacin HCl by Fe(3+) ion in the presence of 1,10 o-phenanthroline (o-phen). Method (B) describes spectrophotometric procedures for determination of moxifloxacin HCl based on its ability to reduce Fe(III) to Fe(II), which was rapidly converted to the corresponding stable coloured complex after reacting with 2,2' bipyridyl (bipy). The formation of the tris-complexes formed in methods (A) and (B) was carefully studied and their absorbances were measured at 510 and 520 nm, respectively. Method (C) is based on the formation of an ion-pair associate between the drug and bismuth(III) tetraiodide in acidic medium to form orange-red ion-pair associates. This associate can be quantitatively determined by three different procedures. The formed precipitate is either filtered off, dissolved in acetone and quantified spectrophotometrically at 462 nm (Procedure 1), or decomposed by hydrochloric acid, and the bismuth content is determined by direct atomic absorption spectrometry (Procedure 2). Also, the residual unreacted metal complex in the filtrate is determined through its metal content using an indirect atomic absorption spectrometric technique (Procedure 3). All the proposed methods were validated according to the International Conference on Harmonization (ICH) guidelines; the three proposed methods permit the determination of moxifloxacin HCl in the range of (0.8-6, 0.8-4) for methods A and B, and (16-96, 16-96 and 16-72) for procedures 1-3 in method C. The limits of detection and quantitation were calculated; the precision of the methods was satisfactory, and the values of relative standard deviations did not exceed 2%. The proposed methods were successfully applied to determine the drug in its pharmaceutical formulations.

  4. Development and Validation of Spectrophotometric, Atomic Absorption and Kinetic Methods for Determination of Moxifloxacin Hydrochloride

    PubMed Central

    Abdellaziz, Lobna M.; Hosny, Mervat M.

    2011-01-01

    Three simple spectrophotometric and atomic absorption spectrometric methods are developed and validated for the determination of moxifloxacin HCl in pure form and in pharmaceutical formulations. Method (A) is a kinetic method based on the oxidation of moxifloxacin HCl by Fe3+ ion in the presence of 1,10 o-phenanthroline (o-phen). Method (B) describes spectrophotometric procedures for determination of moxifloxacin HCl based on its ability to reduce Fe (III) to Fe (II), which was rapidly converted to the corresponding stable coloured complex after reacting with 2,2′ bipyridyl (bipy). The formation of the tris-complexes formed in methods (A) and (B) was carefully studied and their absorbances were measured at 510 and 520 nm, respectively. Method (C) is based on the formation of an ion-pair associate between the drug and bismuth (III) tetraiodide in acidic medium to form orange-red ion-pair associates. This associate can be quantitatively determined by three different procedures. The formed precipitate is either filtered off, dissolved in acetone and quantified spectrophotometrically at 462 nm (Procedure 1), or decomposed by hydrochloric acid, and the bismuth content is determined by direct atomic absorption spectrometry (Procedure 2). Also, the residual unreacted metal complex in the filtrate is determined through its metal content using an indirect atomic absorption spectrometric technique (Procedure 3). All the proposed methods were validated according to the International Conference on Harmonization (ICH) guidelines; the three proposed methods permit the determination of moxifloxacin HCl in the range of (0.8–6, 0.8–4) for methods A and B, and (16–96, 16–96 and 16–72) for procedures 1–3 in method C. The limits of detection and quantitation were calculated; the precision of the methods was satisfactory, and the values of relative standard deviations did not exceed 2%. The proposed methods were successfully applied to determine the drug in its pharmaceutical

  5. Performance of the Tariff Method: validation of a simple additive algorithm for analysis of verbal autopsies

    PubMed Central

    2011-01-01

    Background Verbal autopsies provide valuable information for studying mortality patterns in populations that lack reliable vital registration data. Methods for transforming verbal autopsy results into meaningful information for health workers and policymakers, however, are often costly or complicated to use. We present a simple additive algorithm, the Tariff Method (termed Tariff), which can be used for assigning individual cause of death and for determining cause-specific mortality fractions (CSMFs) from verbal autopsy data. Methods Tariff calculates a score, or "tariff," for each cause, for each sign/symptom, across a pool of validated verbal autopsy data. The tariffs are summed for a given response pattern in a verbal autopsy, and this sum (score) provides the basis for predicting the cause of death in a dataset. We implemented this algorithm and evaluated the method's predictive ability, both in terms of chance-corrected concordance at the individual cause assignment level and in terms of CSMF accuracy at the population level. The analysis was conducted separately for adult, child, and neonatal verbal autopsies across 500 pairs of train-test validation verbal autopsy data. Results Tariff is capable of outperforming physician-certified verbal autopsy in most cases. In terms of chance-corrected concordance, the method achieves 44.5% in adults, 39% in children, and 23.9% in neonates. CSMF accuracy was 0.745 in adults, 0.709 in children, and 0.679 in neonates. Conclusions Verbal autopsies can be an efficient means of obtaining cause of death data, and Tariff provides an intuitive, reliable method for generating individual cause assignment and CSMFs. The method is transparent and flexible and can be readily implemented by users without training in statistics or computer science. PMID:21816107
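    The additive scoring the abstract describes can be sketched in a few lines; the symptom names and tariff values in the usage example below are invented placeholders, not figures from the study:

```python
def tariff_predict(response, tariffs):
    """Assign a cause of death by summing sign/symptom tariffs per cause.

    response: set of endorsed signs/symptoms from one verbal autopsy.
    tariffs:  {cause: {symptom: tariff}} learned from validated VA data.
    Returns the top-scoring cause and the full score table.
    """
    scores = {cause: sum(t.get(symptom, 0.0) for symptom in response)
              for cause, t in tariffs.items()}
    return max(scores, key=scores.get), scores
```

    Cause-specific mortality fractions then follow by tallying the predicted causes over a dataset; this transparency is what makes the method usable without statistical training.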

  6. Development and Validation of Liquid Chromatographic Method for Estimation of Naringin in Nanoformulation

    PubMed Central

    Musmade, Kranti P.; Trilok, M.; Dengale, Swapnil J.; Bhat, Krishnamurthy; Reddy, M. S.; Musmade, Prashant B.; Udupa, N.

    2014-01-01

    A simple, precise, accurate, rapid, and sensitive reverse phase high performance liquid chromatography (RP-HPLC) method with UV detection has been developed and validated for quantification of naringin (NAR) in novel pharmaceutical formulation. NAR is a polyphenolic flavonoid present in most of the citrus plants having variety of pharmacological activities. Method optimization was carried out by considering the various parameters such as effect of pH and column. The analyte was separated by employing a C18 (250.0 × 4.6 mm, 5 μm) column at ambient temperature in isocratic conditions using phosphate buffer pH 3.5: acetonitrile (75 : 25% v/v) as mobile phase pumped at a flow rate of 1.0 mL/min. UV detection was carried out at 282 nm. The developed method was validated according to ICH guidelines Q2(R1). The method was found to be precise and accurate on statistical evaluation with a linearity range of 0.1 to 20.0 μg/mL for NAR. The intra- and interday precision studies showed good reproducibility with coefficients of variation (CV) less than 1.0%. The mean recovery of NAR was found to be 99.33 ± 0.16%. The proposed method was found to be highly accurate, sensitive, and robust. The proposed liquid chromatographic method was successfully employed for the routine analysis of said compound in developed novel nanopharmaceuticals. The presence of excipients did not show any interference on the determination of NAR, indicating method specificity. PMID:26556205

  7. Validated spectrophotometric and spectrofluorimetric methods for determination of chloroaluminum phthalocyanine in nanocarriers.

    PubMed

    Siqueira-Moura, M P; Primo, F L; Peti, A P F; Tedesco, A C

    2010-01-01

    UV-VIS spectrophotometric and spectrofluorimetric methods have been developed and validated for the quantification of chloroaluminum phthalocyanine (ClAlPc) in nanocarriers. In order to validate the methods, the linearity, limit of detection (LOD), limit of quantification (LOQ), precision, accuracy, and selectivity were examined according to USP 30 and ICH guidelines. Linearity ranges were found to be 0.50-3.00 microg x mL(-1) (Y = 0.3829 X [ClAlPc, microg x mL(-1)] + 0.0126; r = 0.9992) for spectrophotometry, and 0.05-1.00 microg x mL(-1) (Y = 2.24 x 10(6) X [ClAlPc, microg x mL(-1)] + 9.74 x 10(4); r = 0.9978) for spectrofluorimetry. In addition, ANOVA and lack-of-fit tests demonstrated that the regression equations were statistically significant (p<0.05) and that the linear model is fully adequate for both analytical methods. The LOD values were 0.09 and 0.01 microg x mL(-1), while the LOQs were 0.27 and 0.04 microg x mL(-1) for the spectrophotometric and spectrofluorimetric methods, respectively. Repeatability and intermediate precision for the proposed methods showed relative standard deviations (RSD) between 0.58% and 4.80%. The percent recovery ranged from 98.9% to 102.7% for spectrophotometric analyses and from 94.2% to 101.2% for spectrofluorimetry. No interferences from common excipients were detected and both methods were considered specific. Therefore, the methods are accurate, precise, specific, and reproducible and hence can be applied for quantification of ClAlPc in nanoemulsions (NE) and nanocapsules (NC).
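    LOD/LOQ figures like those in these validation records are conventionally derived from the calibration line per ICH Q2(R1): LOD = 3.3*sd/S and LOQ = 10*sd/S, where sd is the residual standard deviation and S the slope. A generic sketch of that computation, not the authors' exact procedure (the calibration data in the test are invented):

```python
def lod_loq(conc, signal):
    """ICH Q2(R1)-style LOD and LOQ from a least-squares calibration line."""
    n = len(conc)
    mx, my = sum(conc) / n, sum(signal) / n
    sxx = sum((x - mx) ** 2 for x in conc)
    sxy = sum((x - mx) * (y - my) for x, y in zip(conc, signal))
    slope = sxy / sxx
    intercept = my - slope * mx
    # residual standard deviation with n - 2 degrees of freedom
    resid_sd = (sum((y - (slope * x + intercept)) ** 2
                    for x, y in zip(conc, signal)) / (n - 2)) ** 0.5
    return 3.3 * resid_sd / slope, 10 * resid_sd / slope
```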

  8. Validation of a standardised method for determining beryllium in human urine at nanogram level.

    PubMed

    Devoy, Jérôme; Melczer, Mathieu; Antoine, Guillaume; Remy, Aurélie; Heilier, Jean-François

    2013-10-01

    The potential toxicity of beryllium at low levels of exposure means that a biological and/or air monitoring strategy may be required to monitor the exposure of subjects. The main objective of the work presented in this manuscript was to develop and validate a sensitive and reproducible method for determining levels of beryllium in human urine and to establish reference values in workers and in non-occupationally exposed people. A chelate of beryllium acetylacetonate formed from beryllium(II) in human urine was pre-concentrated on a SPE C18 cartridge and eluted with methanol. After drying the eluate, the residue was solubilised in nitric acid and analysed by atomic absorption spectrometry and/or inductively coupled plasma mass spectrometry. The proposed method is 4 to 100 times more sensitive than other methods currently in routine use. The new method was validated with the concordance correlation coefficient test for beryllium concentrations ranging from 10 to 100 ng/L. Creatinine concentration, urine pH, interfering compounds and freeze-thaw cycles were found to have only slight effects on the performance of the method (less than 6%). The effectiveness of the two analytical techniques was compared statistically with each other and to direct analysis techniques. Even with a detection limit of 0.6 ng/L (obtained with inductively coupled plasma mass spectrometry), the method is not sensitive enough to detect levels in non-occupationally exposed persons. The method performance does however appear to be suitable for monitoring worker exposure in some industrial settings and it could therefore be of use in biological monitoring strategies.

  9. A validated high performance thin layer chromatography method for determination of yohimbine hydrochloride in pharmaceutical preparations

    PubMed Central

    Badr, Jihan M.

    2013-01-01

    Background: Yohimbine is an indole alkaloid used as a promising therapy for erectile dysfunction. A number of methods were reported for the analysis of yohimbine in the bark or in pharmaceutical preparations. Materials and Method: In the present work, a simple and sensitive high performance thin layer chromatographic method is developed for determination of yohimbine (occurring as yohimbine hydrochloride) in pharmaceutical preparations and validated according to International Conference of Harmonization (ICH) guidelines. The method employed thin layer chromatography aluminum sheets precoated with silica gel as the stationary phase and the mobile phase consisted of chloroform:methanol:ammonia (97:3:0.2), which gave compact bands of yohimbine hydrochloride. Results: Linear regression data for the calibration curves of standard yohimbine hydrochloride showed a good linear relationship over a concentration range of 80–1000 ng/spot with respect to the area and correlation coefficient (R2) was 0.9965. The method was evaluated regarding accuracy, precision, selectivity, and robustness. Limits of detection and quantitation were recorded as 5 and 40 ng/spot, respectively. The proposed method efficiently separated yohimbine hydrochloride from other components even in complex mixture containing powdered plants. The amount of yohimbine hydrochloride ranged from 2.3 to 5.2 mg/tablet or capsule in preparations containing the pure alkaloid, while it varied from zero (0) to 1.5–1.8 mg/capsule in dietary supplements containing powdered yohimbe bark. Conclusion: We concluded that this method employing high performance thin layer chromatography (HPTLC) in quantitative determination of yohimbine hydrochloride in pharmaceutical preparations is efficient, simple, accurate, and validated. PMID:23661986

  10. A validated HPLC method for the determination of octocrylene in solid lipid nanoparticle systems.

    PubMed

    Berkman, M S; Yazan, Y

    2011-02-01

    UV filters are traditionally classified as chemical absorbers and physical blockers depending on their mechanism of action. In this study, one of the most important chemical UVB absorbers, octocrylene, was incorporated into Solid Lipid Nanoparticle (SLN) systems, which themselves have UV blocking potential similar to physical blockers. Determination of octocrylene in the formulations was performed by HPLC (High Performance Liquid Chromatography) using a new validated method based on the ICH harmonised tripartite guideline "validation of analytical procedures Q2(R1)". Determination and validation studies were carried out on a 4.6 x 250 mm, 5 microm C18 ACE column using an optimized mobile phase of acetonitrile:water (75:25, v/v) at a flow rate of 1.5 mL x min(-1). UV detection was performed at 210 nm and the column temperature was adjusted to 50 degrees C. Cyclosporine A was used as an internal standard (IS). The specified working range was derived from linearity studies and kept in the concentration range of 2.5-5.5 x 10(-5) M. Good correlation and accuracy were obtained. Limit of detection (LOD) and limit of quantitation (LOQ) values were determined to be 1.64 x 10(-6) M and 4.97 x 10(-6) M, respectively. Octocrylene recovery (%) results of the SLN formulations stored at 25 degrees C, 4 degrees C and 40 degrees C for 360 days were investigated and compared to freshly prepared samples.

  11. On protocols and measures for the validation of supervised methods for the inference of biological networks.

    PubMed

    Schrynemackers, Marie; Küffner, Robert; Geurts, Pierre

    2013-12-03

    Networks provide a natural representation of molecular biology knowledge, in particular to model relationships between biological entities such as genes, proteins, drugs, or diseases. Because of the effort, the cost, or the lack of the experiments necessary for the elucidation of these networks, computational approaches for network inference have been frequently investigated in the literature. In this paper, we examine the assessment of supervised network inference. Supervised inference is based on machine learning techniques that infer the network from a training sample of known interacting and possibly non-interacting entities and additional measurement data. While these methods are very effective, their reliable validation in silico poses a challenge, since both prediction and validation need to be performed on the basis of the same partially known network. Cross-validation techniques need to be specifically adapted to classification problems on pairs of objects. We perform a critical review and assessment of protocols and measures proposed in the literature and derive specific guidelines how to best exploit and evaluate machine learning techniques for network inference. Through theoretical considerations and in silico experiments, we analyze in depth how important factors influence the outcome of performance estimation. These factors include the amount of information available for the interacting entities, the sparsity and topology of biological networks, and the lack of experimentally verified non-interacting pairs.
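    The adaptation the paper examines, cross-validation for classification problems on pairs of objects, amounts to holding out nodes rather than individual pairs, so that test pairs between two training nodes, pairs with one unseen node, and pairs between two unseen nodes are scored separately. A schematic sketch under that reading (function and variable names are illustrative):

```python
import random

def pair_cv_splits(nodes, pairs, n_folds=3, seed=0):
    """Split node pairs for network-inference CV, distinguishing pairs between
    training nodes (L x L), pairs with one held-out node (L x T), and pairs
    between two held-out nodes (T x T), which should be evaluated separately."""
    rng = random.Random(seed)
    shuffled = list(nodes)
    rng.shuffle(shuffled)
    folds = [set(shuffled[i::n_folds]) for i in range(n_folds)]
    for held_out in folds:
        train_nodes = set(nodes) - held_out
        ll = [(a, b) for a, b in pairs if a in train_nodes and b in train_nodes]
        lt = [(a, b) for a, b in pairs if (a in held_out) != (b in held_out)]
        tt = [(a, b) for a, b in pairs if a in held_out and b in held_out]
        yield ll, lt, tt
```

    Reporting performance on the three pair sets separately avoids the optimistic bias that arises when predictions for well-characterized nodes are averaged with those for unseen ones.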

  12. Experimental methods to validate measures of emotional state and readiness for duty in critical operations.

    SciTech Connect

    Weston, Louise Marie

    2007-09-01

    A recent report on criticality accidents in nuclear facilities indicates that human error played a major role in a significant number of incidents with serious consequences and that some of these human errors may be related to the emotional state of the individual. A pre-shift test to detect a deleterious emotional state could reduce the occurrence of such errors in critical operations. Effective pre-shift testing is challenging because of the need to gather predictive data in a relatively short test period and the potential for learning effects arising from frequent testing. This report reviews the different types of reliability and validity methods, along with the testing and statistical-analysis procedures needed to validate measures of emotional state. The ultimate value of a validation study depends upon the percentage of human errors in critical operations that are due to the emotional state of the individual. A review of the literature to identify the most promising predictors of emotional state for this application is highly recommended.

  13. Physical activity assessment in the general population; validated self-report methods.

    PubMed

    Ara, Ignacio; Aparicio-Ugarriza, Raquel; Morales-Barco, David; Nascimento de Souza, Wysllenny; Mata, Esmeralda; González-Gross, Marcela

    2015-02-26

    Self-reported questionnaires have been commonly used to assess physical activity levels in large cohort studies. As a result, strong and convincing evidence that physical activity can protect health is widely recognized. However, validation studies using objective measures of physical activity or energy expenditure (doubly labelled water, accelerometers, pedometers, etc.) indicate that the accuracy and precision of survey techniques are limited. Physical activity questionnaires may fail to estimate non-vigorous physical activity in particular. They have a disproportionate focus on volitional exercise (i.e. biking, jogging, and walking), while not capturing the activities of daily living and low-to-moderate-intensity movements. Energy expenditure estimates from these data are not recommended. On the other hand, although objective tools should be the measurement of choice for assessing physical activity level, self-reported questionnaires remain valid and have many advantages, e.g. low cost. These recalls are designed and validated for different age groups and provide valuable and important information, mainly about physical activity patterns. Future studies will require more precision and accuracy in physical activity measurement than traditional survey methods provide. We conclude that a mixed approach combining objective and subjective techniques, involving novel devices and electronic capture of physical activity questionnaires, will probably be more effective.

  14. Methods for validating the performance of wearable motion-sensing devices under controlled conditions

    NASA Astrophysics Data System (ADS)

    Bliley, Kara E; Kaufman, Kenton R; Gilbert, Barry K

    2009-04-01

    This paper presents validation methods for assessing the accuracy and precision of motion-sensing device (i.e. accelerometer) measurements. The main goals of this paper were to assess the accuracy and precision of these measurements against a gold standard, to determine whether differences in manufacturing and assembly significantly affected device performance, and to determine whether measurement differences due to manufacturing and assembly could be corrected by applying certain post-processing techniques to the measurement data during analysis. In this paper, the validation of a posture and activity detector (PAD), a device containing a tri-axial accelerometer, is described. Validation of the PAD devices required the design of two test fixtures: a test fixture to position the device in a known orientation, and a test fixture to rotate the device at known velocities and accelerations. Device measurements were compared to these known orientations and accelerations. Several post-processing techniques were utilized in an attempt to reduce variability in the measurement error among the devices. In conclusion, some of the measurement errors due to the inevitable differences in manufacturing and assembly were significantly reduced (p < 0.01) by these post-processing techniques.
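One simple form such a post-processing correction could take (an assumed gain/offset model for illustration, not the paper's actual technique) is a per-device least-squares calibration against the fixture's known accelerations:

```python
# Hypothetical sketch: fit a per-device gain and offset against known fixture
# values, then apply it to reduce manufacturing-related measurement error.
import numpy as np

def fit_correction(known, measured):
    """Least-squares gain and offset so that gain*measured + offset ~ known."""
    gain, offset = np.polyfit(measured, known, 1)
    return gain, offset

known    = np.array([-1.0, -0.5, 0.0, 0.5, 1.0])        # fixture accelerations (g)
measured = np.array([-1.08, -0.56, -0.02, 0.51, 1.05])  # invented raw device output
gain, offset = fit_correction(known, measured)
corrected = gain * measured + offset
# The worst-case error shrinks after correction
print(np.max(np.abs(corrected - known)) < np.max(np.abs(measured - known)))  # True
```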

  15. Validation of analytical methods in compliance with good manufacturing practice: a practical approach

    PubMed Central

    2013-01-01

    Background The quality and safety of cell therapy products must be maintained throughout their production and quality control cycle, ensuring their final use in the patient. We validated the Limulus Amebocyte Lysate (LAL) test and immunophenotype according to International Conference on Harmonization Q2 Guidelines and the EU Pharmacopoeia, considering accuracy, precision, repeatability, linearity and range. Methods For the endotoxin test we used a kinetic chromogenic LAL test. As this is a limit test for the control of impurities, in compliance with International Conference on Harmonization Q2 Guidelines and the EU Pharmacopoeia, we evaluated the specificity and detection limit. For the immunophenotype test, an identity test, we evaluated specificity through the Fluorescence Minus One method, and we repeated all experiments three times to verify precision. The immunophenotype validation required a performance qualification of the flow cytometer using two types of standard beads, which have to be used daily to check that the cytometer is reproducibly set up. The results were then compared. Collected data were statistically analyzed by calculating the mean, standard deviation and coefficient of variation percentage (CV%). Results The LAL test is repeatable and specific. The spike recovery value of each sample was between 0.25 EU/ml and 1 EU/ml with a CV% < 10%. The correlation coefficient (≥ 0.980) and CV% (< 10%) of the standard curve tested in duplicate showed the test's linearity and a minimum detectable concentration of 0.005 EU/ml. The immunophenotype method, performed three times on our cell therapy products, is specific and repeatable, as shown by an inter-experiment CV% < 10%. Conclusions Our data demonstrate that the validated analytical procedures are suitable as quality controls for the batch release of cell therapy products. Our paper could offer an important contribution for the scientific community in the field of CTPs, above all to small Cell Factories such as ours.
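A minimal sketch of the precision statistics this abstract relies on (mean, standard deviation, and CV%), with hypothetical replicate values rather than the study's data:

```python
# CV% = 100 * sample standard deviation / mean, over replicate runs.
import statistics

def cv_percent(replicates):
    """Coefficient of variation in percent (n - 1 denominator for the SD)."""
    mean = statistics.mean(replicates)
    sd = statistics.stdev(replicates)
    return 100.0 * sd / mean

# Invented spike-recovery replicates (EU/ml); acceptance criterion: CV% < 10
runs = [0.51, 0.49, 0.52, 0.50, 0.48]
print(cv_percent(runs) < 10.0)  # True
```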

  16. Validation of quantitative method for azoxystrobin residues in green beans and peas.

    PubMed

    Abdelraheem, Ehab M H; Hassan, Sayed M; Arief, Mohamed M H; Mohammad, Somaia G

    2015-09-01

    This study presents a method validation for the extraction and quantitative analysis of azoxystrobin residues in green beans and peas using HPLC-UV, with the results confirmed by GC-MS. The method involved initial extraction with acetonitrile after the addition of salts (magnesium sulfate and sodium chloride), followed by a cleanup step with activated neutral carbon. The validation parameters linearity, matrix effect, LOQ, specificity, trueness and repeatability (precision) were assessed. The spiking levels for the trueness and precision experiments were 0.1, 0.5 and 3 mg/kg. For HPLC-UV analysis, mean recoveries ranged from 83.69% to 91.58% and from 81.99% to 107.85% for green beans and peas, respectively. For GC-MS analysis, mean recoveries ranged from 76.29% to 94.56% and from 80.77% to 100.91% for green beans and peas, respectively. According to these results, the method has proven efficient for the extraction and determination of azoxystrobin residues in green beans and peas.
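The spike-recovery figures above follow from a simple calculation, sketched here with invented replicate values (the 70-120% window in the comment is the commonly cited residue-analysis acceptance range, not a figure from this study):

```python
# Recovery % = 100 * measured / spiked, averaged per spiking level.
def mean_recovery(spiked_level, measured):
    """Mean recovery (%) for replicates spiked at one level (mg/kg)."""
    recoveries = [100.0 * m / spiked_level for m in measured]
    return sum(recoveries) / len(recoveries)

# Hypothetical replicates at the 0.5 mg/kg spiking level
r = mean_recovery(0.5, [0.42, 0.45, 0.44])
print(70 <= r <= 120)  # True: inside the commonly cited acceptance window
```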

  17. Validated HPAEC-PAD Method for the Determination of Fully Deacetylated Chitooligosaccharides

    PubMed Central

    Cao, Lidong; Wu, Jinlong; Li, Xiuhuan; Zheng, Li; Wu, Miaomiao; Liu, Pingping; Huang, Qiliang

    2016-01-01

    An efficient and sensitive analytical method based on high-performance anion exchange chromatography with pulsed amperometric detection (HPAEC-PAD) was established for the simultaneous separation and determination of glucosamine (GlcN)1 and chitooligosaccharides (COS) ranging from (GlcN)2 to (GlcN)6 without prior derivatization. Detection limits were 0.003 to 0.016 mg/L (corresponding to 0.4–0.6 pmol), and the linear range was 0.2 to 10 mg/L. The optimized analysis was carried out on a CarboPac-PA100 analytical column (4 × 250 mm) using isocratic elution with 0.2 M aqueous sodium hydroxide-water mixture (10:90, v/v) as the mobile phase at a 0.4 mL/min flow rate. Regression equations revealed a good linear relationship (R2 = 0.9979–0.9995, n = 7) within the test ranges. Quality parameters, including precision and accuracy, were fully validated and found to be satisfactory. The fully validated HPAEC-PAD method was readily applied for the quantification of (GlcN)1–6 in a commercial COS technical concentrate. The established method was also used to monitor the acid hydrolysis of a COS technical concentrate to ensure optimization of reaction conditions and minimization of (GlcN)1 degradation. PMID:27735860

  18. Validation of Broadly Filtered Diagonalization Method for Extracting Frequencies and Modes from High-Performance Computations

    SciTech Connect

    Austin, T.M.; Cary, J.R.; Werner, G.R.; Bellantoni, L.

    2009-06-01

    Recent developments have shown that one can get around the difficulties of finding the eigenvalues and eigenmodes of the large systems studied with high-performance computation by using broadly filtered diagonalization [G. R. Werner and J. R. Cary, J. Comput. Phys. 227, 5200 (2008)]. This method can be used in conjunction with any time-domain computation, in particular those that scale very well up to 10,000s of processors and beyond. Here we present results showing that this method accurately obtains both the modes and frequencies of electromagnetic cavities, even when frequencies are nearly degenerate. The application was to a well-characterized kaon separator cavity, the A15. The computations are shown to have a precision of a few parts in 10⁵. Because the computed frequency differed from the measured frequency by more than this amount, a careful validation study was undertaken to determine all sources of difference. Ultimately, more precise measurements of the cavity showed that the computations were correct, with the remaining differences accounted for by uncertainties in cavity dimensions and in atmospheric and thermal conditions. Thus, not only was the method validated, but it was shown to have the ability to predict differences in cavity dimensions from fabrication specifications.
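As a toy illustration of the underlying idea only (not of broadly filtered diagonalization itself, which also recovers the mode fields and resolves near-degenerate frequencies), cavity mode frequencies can be read off as peaks in the spectrum of a time-domain probe signal:

```python
# Toy sketch: extract the dominant frequency of a simulated probe signal.
import numpy as np

def dominant_frequency(signal, dt):
    """Return the frequency (Hz) of the largest spectral peak."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=dt)
    return freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC bin

dt = 1e-3
t = np.arange(0, 1.0, dt)
# Invented two-mode probe signal: a strong 50 Hz mode plus a weaker 120 Hz mode
probe = np.sin(2 * np.pi * 50.0 * t) + 0.3 * np.sin(2 * np.pi * 120.0 * t)
print(dominant_frequency(probe, dt))  # 50.0
```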

  19. Simultaneous quantification of paracetamol, acetylsalicylic acid and papaverine with a validated HPLC method.

    PubMed

    Kalmár, Eva; Gyuricza, Anett; Kunos-Tóth, Erika; Szakonyi, Gerda; Dombi, György

    2014-01-01

    Combined drug products have the advantages of better patient compliance and possible synergic effects. The simultaneous application of several active ingredients at a time is therefore frequently chosen. However, the quantitative analysis of such medicines can be challenging. The aim of this study is to provide a validated method for the investigation of a multidose packed oral powder that contained acetylsalicylic acid, paracetamol and papaverine-HCl. Reversed-phase high-pressure liquid chromatography was used. The Agilent Zorbax SB-C18 column was found to be the most suitable of the three different stationary phases tested for the separation of the components of this sample. The key parameters in the method development (apart from the nature of the column) were the pH of the aqueous phase (set to 3.4) and the ratio of the organic (acetonitrile) and the aqueous (25 mM phosphate buffer) phases, which was varied from 7:93 (v/v) to 25:75 (v/v) in a linear gradient, preceded by an initial hold. The method was validated: linearity, precision (repeatability and intermediate precision), accuracy, specificity and robustness were all tested, and the results met the ICH guidelines.

  20. Development and Validation of Reversed-Phase High Performance Liquid Chromatographic Method for Hydroxychloroquine Sulphate.

    PubMed

    Singh, A; Roopkishora; Singh, C L; Gupta, R; Kumar, S; Kumar, M

    2015-01-01

    In the present work, a new, simple reversed-phase high-performance liquid chromatographic method was developed and validated for the determination of hydroxychloroquine sulphate in blood plasma. Chloroquine sulphate was used as an internal standard. The chromatographic separation was achieved on an octadecylsilane Hypersil C18 column (250×6 mm, 5 μm) using a mobile phase of water and organic modifier (acetonitrile:methanol, 50:50, v/v) in a 75:25 v/v ratio, containing sodium 1-pentanesulfonate, with the pH maintained at 3.0 with orthophosphoric acid. A flow rate of 2.0 ml/min with detection at 343 nm was used in the analysis. The calibration curve of standard hydroxychloroquine sulphate was linear in the range 0.1-20.0 μg/ml. The method was validated with respect to linearity, range, precision, accuracy, specificity and robustness according to ICH guidelines. The method was found to be accurate and robust for the analysis of hydroxychloroquine sulphate in plasma samples.

  1. Method Development and Validation for UHPLC-MS-MS Determination of Hop Prenylflavonoids in Human Serum

    PubMed Central

    Yuan, Yang; Qiu, Xi; Nikolic, Dejan; Dahl, Jeffrey H.; van Breemen, Richard B.

    2013-01-01

    Hops (Humulus lupulus L.) are used in the brewing of beer, and hop extracts containing prenylated compounds such as xanthohumol and 8-prenylnaringenin are under investigation as dietary supplements for cancer chemoprevention and for the management of hot flashes in menopausal women. To facilitate clinical studies of hop safety and efficacy, a selective, sensitive, and fast ultra-high pressure liquid chromatography tandem mass spectrometry (UHPLC-MS-MS) method was developed and validated for the simultaneous determination of the hop prenylflavonoids xanthohumol, isoxanthohumol, 6-prenylnaringenin, and 8-prenylnaringenin in human serum. The analytical method requires 300 μL of human serum, which is processed using liquid-liquid extraction. UHPLC separation was carried out in 2.5 min with gradient elution using a reversed-phase column containing 1.6 μm packing material. Prenylflavonoids were measured using negative ion electrospray mass spectrometry with collision-induced dissociation and selected reaction monitoring. The method was validated and showed good accuracy and precision, with a lower limit of quantitation (LLOQ) in serum of 0.50 ng/mL (1.4 nM) for xanthohumol and 1.0 ng/mL (2.8-2.9 nM) for each of the other analytes. PMID:23451393

  2. Validated method for the analysis of goji berry, a rich source of zeaxanthin dipalmitate.

    PubMed

    Karioti, Anastasia; Bergonzi, Maria Camilla; Vincieri, Franco F; Bilia, Anna Rita

    2014-12-31

    In the present study, an HPLC-DAD method was developed for the determination of the main carotenoid, zeaxanthin dipalmitate, in the fruits of Lycium barbarum. The aim was to develop and optimize an extraction protocol allowing fast, exhaustive, and repeatable extraction, suitable for the labile carotenoid content. Use of liquid N2 allowed the grinding of the fruit. A step of ultrasonication with water efficiently removed the polysaccharides and enabled the exhaustive extraction of carotenoids with hexane/acetone 50:50. The assay was fast and simple and permitted the quality control of a large number of commercial samples, including fruits, juices, and a jam. The HPLC method was validated according to ICH guidelines and satisfied the requirements. Finally, the overall method was validated for precision (RSD ranging between 3.81% and 4.13%) and accuracy at three concentration levels. The recovery was between 94 and 107% with RSD values <2%, within the acceptable limits, especially if the difficulty of the matrix is taken into consideration.

  3. Validated method for the quantification of free and total carnitine, butyrobetaine, and acylcarnitines in biological samples.

    PubMed

    Minkler, Paul E; Stoll, Maria S K; Ingalls, Stephen T; Kerner, Janos; Hoppel, Charles L

    2015-09-01

    A validated quantitative method for the determination of free and total carnitine, butyrobetaine, and acylcarnitines is presented. The versatile method has four components: (1) isolation using strong cation-exchange solid-phase extraction, (2) derivatization with pentafluorophenacyl trifluoromethanesulfonate, (3) sequential ion-exchange/reversed-phase (ultra) high-performance liquid chromatography [(U)HPLC] using a strong cation-exchange trap in series with a fused-core HPLC column, and (4) detection with electrospray ionization multiple reaction monitoring (MRM) mass spectrometry (MS). Standardized carnitine along with 65 synthesized, standardized acylcarnitines (including short-chain, medium-chain, long-chain, dicarboxylic, hydroxylated, and unsaturated acyl moieties) were used to construct multiple-point calibration curves, resulting in accurate and precise quantification. Separation of the 65 acylcarnitines was accomplished in a single chromatogram in as little as 14 min. Validation studies were performed showing a high level of accuracy, precision, and reproducibility. The method provides capabilities unavailable by tandem MS procedures, making it an ideal approach for confirmation of newborn screening results and for clinical and basic research projects, including treatment protocol studies, acylcarnitine biomarker studies, and metabolite studies using plasma, urine, tissue, or other sample matrixes.

  4. Simultaneous Determination of Sitagliptin Phosphate Monohydrate and Metformin Hydrochloride in Tablets by a Validated UPLC Method.

    PubMed

    Malleswararao, Chellu S N; Suryanarayana, Mulukutla V; Mukkanti, Khagga

    2012-01-01

    A novel approach was used to develop and validate a rapid, specific, accurate and precise reversed-phase ultra-performance liquid chromatographic (UPLC) method for the simultaneous determination of sitagliptin phosphate monohydrate and metformin hydrochloride in pharmaceutical dosage forms. The chromatographic separation was achieved on an Acquity UPLC BEH C8 (100 × 2.1 mm, 1.7 μm) column using a buffer consisting of 10 mM potassium dihydrogen phosphate and 2 mM hexane-1-sulfonic acid sodium salt (pH adjusted to 5.50 with dilute phosphoric acid) and acetonitrile as the organic solvent in a gradient program. The flow rate was 0.2 mL min(-1) and the detection wavelength was 210 nm. The limit of detection (LOD) for sitagliptin phosphate monohydrate and metformin hydrochloride was 0.2 and 0.06 μg mL(-1), respectively. The limit of quantification (LOQ) for sitagliptin phosphate monohydrate and metformin hydrochloride was 0.7 and 0.2 μg mL(-1), respectively. The method was validated with respect to linearity, accuracy, precision, specificity and robustness. The method was also found to be stability-indicating.

  5. Validated HPAEC-PAD method for prebiotics determination in synbiotic fermented milks during shelf life.

    PubMed

    Borromei, Chiara; Cavazza, Antonella; Corradini, Claudio; Vatteroni, Claudia; Bazzini, Adelina; Ferrari, Raffaella; Merusi, Paolo

    2010-05-01

    Interest concerning functional food has been growing in recent years, and much attention has been focused on the choice of prebiotic fibers and probiotic microorganisms added to food products with the aim of improving health, producing synbiotic products. In the work reported here, an innovative analytical method performed by high-performance anion-exchange chromatography (HPAEC) with pulsed electrochemical detection has been optimized and validated for application to the study of prebiotic effects in synbiotic fermented milk prepared by addition of probiotics. The proposed method permits the evaluation of fructooligosaccharides and inulooligosaccharides with degrees of polymerization of 6-7 and 4-7, respectively. Quantitative determination was performed on oligosaccharides, whose standards are not commercially available, by employing calibration curves built by adding a known amount of the fiber used as an ingredient to the matrix. The work provides results from a parallel study on simultaneous variations of prebiotics and probiotics during the shelf life of fermented milk samples. The main advantage over time-consuming, classic enzymatic methods, whose results are limited only to average fiber content, is the possibility of dosing each carbohydrate by performing a single HPAEC run. Validation in terms of detection and quantitation limits, linearity, precision, and recovery was carried out.
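The calibration strategy described above, spiking known amounts of the fiber ingredient into the matrix because oligosaccharide standards are not commercially available, resembles the standard-addition approach. A hedged sketch with invented numbers (not the study's data): the unknown concentration is recovered from the x-intercept of the added-amount versus response line.

```python
# Standard-addition sketch: extrapolate the spiked calibration line back to
# zero response; the unknown concentration is intercept / slope.
import numpy as np

def standard_addition(added, response):
    slope, intercept = np.polyfit(added, response, 1)
    return intercept / slope  # concentration in the unspiked matrix

added = [0.0, 1.0, 2.0, 4.0]   # mg/mL of fiber spiked into the matrix (made up)
resp  = [3.0, 5.0, 7.0, 11.0]  # detector response (made up, perfectly linear)
print(round(standard_addition(added, resp), 3))  # 1.5
```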

  6. Comparative Study in Laboratory Rats to Validate Sperm Quality Methods and Endpoints

    NASA Technical Reports Server (NTRS)

    Price, W. A.; Briggs, G. B.; Alexander, W. K.; Still, K. R.; Grasman, K. A.

    2000-01-01

    The Naval Health Research Center, Detachment (Toxicology) performs toxicity studies in laboratory animals to characterize the risk of exposure to chemicals of Navy interest. Research was conducted at the Toxicology Detachment at WPAFB, OH, in collaboration with Wright State University, Department of Biological Sciences, for the validation of new bioassay methods for evaluating reproductive toxicity. The Hamilton Thorne sperm analyzer was used to evaluate sperm damage produced by exposure to a known testicular toxicant, methoxyacetic acid, and by inhalation exposure to JP-8 and JP-5 in laboratory rats. Sperm quality parameters (sperm concentration, motility, and morphology) were evaluated to provide evidence of sperm damage. The Hamilton Thorne sperm analyzer utilizes a DNA-specific fluorescent stain (similar to flow cytometry) and digitized optical computer analysis to detect sperm cell damage. Computer-assisted sperm analysis (CASA) is a more rapid, robust, predictive and sensitive method for characterizing reproductive toxicity. The results presented in this poster provide validation data showing that exposure to methoxyacetic acid causes reproductive toxicity, whereas inhalation exposure to JP-8 and JP-5 had no significant effects. The CASA method detects early changes that result in reproductive deficits, and these data will be used in a continuing program to characterize the toxicity of chemicals, and combinations of chemicals, of military interest in order to formulate permissible exposure limits.

  7. Development and Validation of GC-ECD Method for the Determination of Metamitron in Soil

    PubMed Central

    Tandon, Shishir; Kumar, Satyendra; Sand, N. K.

    2015-01-01

    This paper aims at developing and validating a convenient, rapid, and sensitive method for the estimation of metamitron in soil samples. Determination and quantification were carried out by gas chromatography on a microcapillary column with an electron capture detector. The compound was extracted from soil using methanol and cleaned up by C-18 SPE. After optimization, the method was validated by evaluating the analytical curves, linearity, limits of detection and quantification, precision (repeatability and intermediate precision), and accuracy (recovery). Recovery values ranged from 89 to 93.5% within the 0.05-2.0 µg L−1 range, with an average RSD of 1.80%. The precision (repeatability) ranged from 1.7034 to 1.9144% and the intermediate precision from 1.5685 to 2.1323%. The retention time was 6.3 minutes, and the minimum detectable and quantifiable limits were 0.02 ng mL−1 and 0.05 ng g−1, respectively. Good linearity (R2 = 0.998) of the calibration curves was obtained over the range from 0.05 to 2.0 µg L−1. The results indicate that the developed method is rapid and easy to perform, making it applicable for analysis in large pesticide-monitoring programmes. PMID:25733978

  8. Measure profile surrogates: A method to validate the performance of epileptic seizure prediction algorithms

    NASA Astrophysics Data System (ADS)

    Kreuz, Thomas; Andrzejak, Ralph G.; Mormann, Florian; Kraskov, Alexander; Stögbauer, Harald; Elger, Christian E.; Lehnertz, Klaus; Grassberger, Peter

    2004-06-01

    In a growing number of publications it is claimed that epileptic seizures can be predicted by analyzing the electroencephalogram (EEG) with different characterizing measures. However, many of these studies suffer from a severe lack of statistical validation. Only rarely are results passed to a statistical test and verified against some null hypothesis H0 in order to quantify their significance. In this paper we propose a method to statistically validate the performance of measures used to predict epileptic seizures. From measure profiles rendered by applying a moving-window technique to the electroencephalogram, we first generate an ensemble of surrogates by a constrained randomization using simulated annealing. Subsequently, the seizure prediction algorithm is applied to the original measure profile and to the surrogates. If detectable changes before seizure onset exist, the highest performance values should be obtained for the original measure profile, and the null hypothesis "the measure is not suited for seizure prediction" can be rejected. We demonstrate our method by applying two measures of synchronization to a quasicontinuous EEG recording and by evaluating their predictive performance using a straightforward seizure prediction statistic. We would like to stress that the proposed method is rather universal and can be applied to many other prediction and detection problems.
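The logic of the surrogate test can be sketched in a few lines. In this simplified illustration, random shuffles of the measure profile stand in for the paper's constrained simulated-annealing surrogates, and the p-value is the fraction of surrogates whose prediction performance matches or beats the original profile:

```python
# Simplified surrogate-based significance sketch (not the authors' algorithm).
import random

def surrogate_p_value(performance, profile, n_surrogates=99, seed=1):
    rng = random.Random(seed)
    original = performance(profile)
    count = 0
    for _ in range(n_surrogates):
        surrogate = profile[:]
        rng.shuffle(surrogate)  # destroys any preseizure structure
        if performance(surrogate) >= original:
            count += 1
    return (count + 1) / (n_surrogates + 1)

# Toy performance measure: mean of the last 5 samples (a "preseizure" window)
profile = [0.1] * 20 + [0.9] * 5  # invented profile with a clear preseizure rise
p = surrogate_p_value(lambda x: sum(x[-5:]) / 5, profile)
print(p <= 0.05)  # True: the original ordering outperforms the surrogates
```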

  9. Further validation of the hybrid particle-mesh method for vortex shedding flow simulations

    NASA Astrophysics Data System (ADS)

    Lee, Seung-Jae; Lee, Jun-Hyeok; Suh, Jung-Chun

    2015-11-01

    This is the continuation of a numerical study on vortex shedding from the blunt trailing edge of a hydrofoil. In our previous work (Lee et al., 2015), numerical schemes for efficient computations were successfully implemented, i.e. multiple domains, the approximation of domain boundary conditions using cubic spline functions, and particle-based domain decomposition for better load balancing. In this study, numerical results obtained with a hybrid particle-mesh method, which adopts the Vortex-In-Cell (VIC) method and the Brinkman penalization model, are further rigorously validated through comparison with experimental data at a Reynolds number of 2 × 10⁶. The effects of changes in numerical parameters are also explored herein. We find that the present numerical method enables us to reasonably simulate the vortex shedding phenomenon, as well as the turbulent wake of a hydrofoil.

  10. Method validation and uncertainty evaluation of organically bound tritium analysis in environmental sample.

    PubMed

    Huang, Yan-Jun; Zeng, Fan; Zhang, Bing; Chen, Chao-Feng; Qin, Hong-Juan; Wu, Lian-Sheng; Guo, Gui-Yin; Yang, Li-Tao; Shang-Guan, Zhi-Hong

    2014-08-01

    An analytical method for organically bound tritium (OBT) was developed in our laboratory. Optimized operating conditions and parameters were established for sample drying, special combustion, distillation, and measurement on a liquid scintillation spectrometer (LSC). Selected types of OBT samples, such as rice, corn, rapeseed, fresh lettuce and pork, were analyzed to validate the method's recovery-rate reproducibility and minimum detectable concentration, and the uncertainty for a typical low-level environmental sample was evaluated. The combustion-water recovery rate of the different dried environmental samples was kept at about 80%, and the minimum detectable concentration of OBT ranged from 0.61 to 0.89 Bq/kg (dry weight), depending on the hydrogen content. The results show that this method is suitable for OBT analysis of environmental samples, with a stable recovery rate, and that the combustion water from a sample weighing about 40 g provides a sufficient quantity for measurement on the LSC.

  11. Antioxidant Activity and Validation of Quantification Method for Lycopene Extracted from Tomato.

    PubMed

    Cefali, Letícia Caramori; Cazedey, Edith Cristina Laignier; Souza-Moreira, Tatiana Maria; Correa, Marcos Antônio; Salgado, Hérida Regina Nunes; Isaac, Vera Lucia Borges

    2015-01-01

    Lycopene is a carotenoid found in tomatoes with potent antioxidant activity. The aim of the study was to obtain an extract containing lycopene from four types of tomatoes, validate a quantification method for the extracts by HPLC, and assess their antioxidant activity. Results revealed that the tomatoes analyzed contained lycopene and exhibited antioxidant activity. Salad tomato presented the highest concentration of this carotenoid and the highest antioxidant activity. The quantification method exhibited linearity with a correlation coefficient of 0.9992. Tests for the assessment of precision, accuracy, and robustness achieved coefficients of variation of less than 5%. The LOD and LOQ were 0.0012 and 0.0039 μg/mL, respectively. Salad tomato can be used as a source of lycopene for the development of topical formulations, and based on the tests performed, the chosen method for the identification and quantification of lycopene was considered to be linear, precise, accurate, selective, and robust.

  12. Development and validation of an extraction method for the analysis of perfluoroalkyl substances in human hair.

    PubMed

    Kim, Da-Hye; Oh, Jeong-Eun

    2017-05-01

    Human hair has many advantages as a non-invasive sample; however, analytical methods for detecting perfluoroalkyl substances (PFASs) in human hair are still in the development stage. Therefore, the aim of this study was to develop and validate a method for monitoring 11 PFASs in human hair. Solid-phase extraction (SPE), ion-pairing extraction (IPE), a combined method (SPE+IPE) and solvent extraction with ENVI-carb clean-up were compared to develop an optimal extraction method using two types of hair sample (powder and piece forms). Analysis of PFASs was performed using liquid chromatography and tandem mass spectrometry. Among the four different extraction procedures, the SPE method using powdered hair showed the best extraction efficiency and recoveries ranged from 85.8 to 102%. The method detection limits for the SPE method were 0.114-0.796 ng/g and good precision (below 10%) and accuracy (66.4-110%) were obtained. In light of these results, SPE is considered the optimal method for PFAS extraction from hair. It was also successfully used to detect PFASs in human hair samples.

  13. Recommendations and best practices for reference standards and reagents used in bioanalytical method validation.

    PubMed

    Bower, Joseph F; McClung, Jennifer B; Watson, Carl; Osumi, Takahiko; Pastre, Kátia

    2014-03-01

    The continued globalization of pharmaceutics has increased the demand for companies to know and understand the regulations that exist across the globe. One hurdle facing pharmaceutical and biotechnology companies developing new drug candidates is interpreting the current regulatory guidance documents and industry publications associated with bioanalytical method validation (BMV) from each of the different agencies throughout the world. The objective of this commentary is to provide our opinions on the best practices for reference standards and key reagents, such as metabolites and internal standards used in the support of regulated bioanalysis based on a review of current regulatory guidance documents and industry white papers for BMV.

  14. Validated spectrofluorimetric methods for the determination of apixaban and tirofiban hydrochloride in pharmaceutical formulations

    NASA Astrophysics Data System (ADS)

    El-Bagary, Ramzia I.; Elkady, Ehab F.; Farid, Naira A.; Youssef, Nadia F.

    2017-03-01

    Apixaban and tirofiban hydrochloride are low-molecular-weight anticoagulants. The two drugs exhibit native fluorescence, which allows the development of simple and valid spectrofluorimetric methods for the determination of apixaban at λex/λem = 284/450 nm and tirofiban HCl at λex/λem = 227/300 nm in aqueous media. Different experimental parameters affecting the fluorescence intensities were carefully studied and optimized. The fluorescence intensity-concentration plots were linear over the ranges of 0.2-6 μg ml−1 for apixaban and 0.2-5 μg ml−1 for tirofiban HCl. The limits of detection were 0.017 and 0.019 μg ml−1 and the quantification limits were 0.057 and 0.066 μg ml−1 for apixaban and tirofiban HCl, respectively. The fluorescence quantum yields of apixaban and tirofiban were calculated as 0.43 and 0.49, respectively. Method validation covered linearity, specificity, accuracy, precision and robustness as per ICH guidelines. The proposed spectrofluorimetric methods were successfully applied to the determination of apixaban in Eliquis tablets and tirofiban HCl in Aggrastat intravenous infusion. Tolerance ratios were tested to study the effect of foreign interference from dosage-form excipients. Student's t- and F-tests revealed no statistically significant difference between the developed spectrofluorimetric methods and the comparison methods regarding accuracy and precision, so they can be applied to the analysis of apixaban and tirofiban HCl in QC laboratories as alternative methods.
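The quantum yields quoted above are commonly obtained by the comparative (single-point reference) approach, Φ = Φ_ref · (I/I_ref) · (A_ref/A) · (n/n_ref)², with I the integrated emission intensity, A the absorbance, and n the refractive index. A hedged sketch with placeholder reference values, not the paper's data:

```python
# Comparative quantum-yield estimate against a reference fluorophore.
def quantum_yield(phi_ref, I, I_ref, A, A_ref, n=1.333, n_ref=1.333):
    return phi_ref * (I / I_ref) * (A_ref / A) * (n / n_ref) ** 2

# All inputs below are invented placeholders for illustration
phi = quantum_yield(phi_ref=0.54, I=8.6e4, I_ref=1.08e5, A=0.05, A_ref=0.05)
print(round(phi, 3))  # 0.43
```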

  15. New Method for Simultaneous Quantification of 12 Ginsenosides in Red Ginseng Powder and Extract: In-house Method Validation.

    PubMed

    In, Gyo; Ahn, Nam-Geun; Bae, Bong-Seok; Han, Sung-Tai; Noh, Kil-Bong; Kim, Cheon-Suk

    2012-04-01

    For quality control of the components of Korean red ginseng powder and extract, a new method for the simultaneous quantification of 12 ginsenosides (Rg1, Re, Rf, Rh1, Rg2[S], Rg2[R], Rb1, Rc, Rb2, Rd, Rg3[S], and Rg3[R]) was studied. Unlike the official method, which quantifies only the marker substances ginsenosides Rg1 and Rb1, the proposed method was verified by in-house method validation against criteria including linearity, specificity, precision and accuracy. For red ginseng powder, recoveries averaged 95% to 105%, and analysis of variance gave relative standard deviations of 0.20% to 2.12%. For red ginseng extract, the average recovery was 90% to 99% and the relative standard deviation 0.39% to 2.40%. These results indicate that the proposed method can be used in the laboratory to determine 12 ginsenosides in red ginseng powder and extract. The method is also suitable for quality control of ginseng products and potentially offers time and cost benefits.

  16. Validated Method for the Determination of Piroxicam by Capillary Zone Electrophoresis and Its Application to Tablets

    PubMed Central

    Dal, Arın Gül; Oktayer, Zeynep; Doğrukol-Ak, Dilek

    2014-01-01

    A simple and rapid capillary zone electrophoretic method was developed and validated for the determination of piroxicam in tablets. The separation was conducted in a fused-silica capillary using 10 mM borate buffer (pH 9.0) containing 10% (v/v) methanol as the background electrolyte. The optimum conditions were a separation voltage of 25 kV and an injection time of 1 s. Analysis was carried out with UV detection at 204 nm, with naproxen sodium as internal standard. The method was linear over the range 0.23–28.79 µg/mL. Accuracy and precision were within acceptable limits (<2%). The LOD and LOQ were 0.07 and 0.19 µg/mL, respectively. The method was applied to tablet dosage forms, and the tablet content was found to be within USP-24 limits. For comparison, a UV spectrophotometric method was also developed, and the difference between the two methods was found to be insignificant. The capillary zone electrophoretic method developed in this study is rapid, simple, and suitable for routine analysis of piroxicam in pharmaceutical tablets. PMID:25295220

  17. Comparison of sample preparation methods, validation of an UPLC-MS/MS procedure for the quantification of tetrodotoxin present in marine gastropods and analysis of pufferfish.

    PubMed

    Nzoughet, Judith Kouassi; Campbell, Katrina; Barnes, Paul; Cooper, Kevin M; Chevallier, Olivier P; Elliott, Christopher T

    2013-02-15

    Tetrodotoxin (TTX) is one of the most potent marine neurotoxins reported. The global distribution of this toxin is spreading, with the European Atlantic coastline now affected; climate change and increasing pollution have been suggested as underlying causes. In the present study, two different sample preparation techniques were used to extract TTX from trumpet shell and pufferfish samples. Both extraction procedures, accelerated solvent extraction (ASE) and a simple solvent extraction, provided good recoveries (80-92%). A UPLC-MS/MS method was developed for the analysis of TTX and validated following the guidelines of Commission Decision 2002/657/EC for chemical contaminant analysis. The performance of this procedure was demonstrated to be fit for purpose. This study is the first report of the use of ASE as a means of TTX extraction, of UPLC-MS/MS for TTX analysis, and of the validation of such a method for TTX in gastropods.

  18. Optimization and single-laboratory validation study of a high-performance liquid chromatography (HPLC) method for the determination of phenolic Echinacea constituents.

    PubMed

    Brown, Paula N; Chan, Michael; Betz, Joseph M

    2010-07-01

    Three species of Echinacea (Echinacea purpurea, Echinacea angustifolia, and Echinacea pallida) are commonly used for medicinal purposes. The phenolic compounds caftaric acid, cichoric acid, echinacoside, cynarin, and chlorogenic acid are among the phytochemical constituents that may be responsible for the purported beneficial effects of the herb. Although methods for the analysis of these compounds have been published, documentation of their validity was inadequate, as the accuracy and precision of the detection and quantification of these phenolics were not systematically determined and/or reported. To address this issue, the high-performance liquid chromatography method originally developed by the Institute for Nutraceutical Advancement (INA) was reviewed, optimized, and validated for the detection and quantification of these phenolic compounds in Echinacea roots and aerial parts.

  19. Using Self- and Peer-Assessments for Summative Purposes: Analysing the Relative Validity of the AASL (Authentic Assessment for Sustainable Learning) Model

    ERIC Educational Resources Information Center

    Kearney, Sean; Perkins, Timothy; Kennedy-Clark, Shannon

    2016-01-01

    The purpose of this paper is to provide a proof of concept of collaborative peer-, self- and lecturer-assessment processes. The research presented here is part of an ongoing study on self- and peer assessments in higher education. The authentic assessment for sustainable learning (AASL) model is evaluated in terms of the correlations between…

  20. Quantitative Tagless Copurification: A Method to Validate and Identify Protein-Protein Interactions

    SciTech Connect

    Shatsky, Maxim; Dong, Ming; Liu, Haichuan; Yang, Lee Lisheng; Choi, Megan; Singer, Mary; Geller, Jil; Fisher, Susan; Hall, Steven; Hazen, Terry C.; Brenner, Steven; Butland, Gareth; Jin, Jian; Witkowska, H. Ewa; Chandonia, John-Marc; Biggin, Mark D.

    2016-04-20

    Identifying protein-protein interactions (PPIs) at an acceptable false discovery rate (FDR) is challenging. Previously we identified several hundred PPIs from affinity purification-mass spectrometry (AP-MS) data for the bacteria Escherichia coli and Desulfovibrio vulgaris. These two interactomes have lower FDRs than any of the nine interactomes proposed previously for bacteria and are more enriched in PPIs validated by other data than the nine earlier interactomes. To more thoroughly determine the accuracy of our and other interactomes, and to discover further PPIs de novo, here we present a quantitative tagless method that employs iTRAQ MS to measure the copurification of endogenous proteins through orthogonal chromatography steps. 5273 fractions from a four-step fractionation of a D. vulgaris protein extract were assayed, resulting in the detection of 1242 proteins. Protein partners from our D. vulgaris and E. coli AP-MS interactomes copurify as frequently as pairs belonging to three benchmark data sets of well-characterized PPIs. In contrast, the protein pairs from the nine other bacterial interactomes copurify two- to 20-fold less often. We also identify 200 high-confidence D. vulgaris PPIs based on tagless copurification and colocalization in the genome. These PPIs are as strongly validated by other data as our AP-MS interactomes and overlap with our AP-MS interactome for D. vulgaris within 3% of expectation, once FDRs and false negative rates are taken into account. Finally, we reanalyzed data from two quantitative tagless screens of human cell extracts. We estimate that the novel PPIs reported in these studies have an FDR of at least 85% and find that less than 7% of the novel PPIs identified in each screen overlap. Our results establish that a quantitative tagless method can be used to validate and identify PPIs, but that such data must be analyzed carefully to minimize the FDR.

  3. Cross-validation and Peeling Strategies for Survival Bump Hunting using Recursive Peeling Methods

    PubMed Central

    Dazard, Jean-Eudes; Choe, Michael; LeBlanc, Michael; Rao, J. Sunil

    2015-01-01

    We introduce a framework to build a survival/risk bump hunting model with a censored time-to-event response. Our Survival Bump Hunting (SBH) method is based on a recursive peeling procedure that uses a specific survival peeling criterion derived from non/semi-parametric statistics such as the hazards-ratio, the log-rank test or the Nelson-Aalen estimator. To optimize the tuning parameter of the model and validate it, we introduce an objective function based on survival or prediction-error statistics, such as the log-rank test and the concordance error rate. We also describe two alternative cross-validation techniques adapted to the joint task of decision-rule making by recursive peeling and survival estimation. Numerical analyses show the importance of replicated cross-validation and the differences between criteria and techniques in both low and high-dimensional settings. Although several non-parametric survival models exist, none addresses the problem of directly identifying local extrema. We show how SBH efficiently estimates extreme survival/risk subgroups unlike other models. This provides an insight into the behavior of commonly used models and suggests alternatives to be adopted in practice. Finally, our SBH framework was applied to a clinical dataset. In it, we identified subsets of patients characterized by clinical and demographic covariates with a distinct extreme survival outcome, for which tailored medical interventions could be made. An R package PRIMsrc (Patient Rule Induction Method in Survival, Regression and Classification settings) is available on CRAN (Comprehensive R Archive Network) and GitHub. PMID:27034730

  4. A simple and valid method to determine thermoregulatory sweating threshold and sensitivity.

    PubMed

    Cheuvront, Samuel N; Bearden, Shawn E; Kenefick, Robert W; Ely, Brett R; Degroot, David W; Sawka, Michael N; Montain, Scott J

    2009-07-01

    Sweating threshold temperature and sweating sensitivity responses are measured to evaluate thermoregulatory control. However, analytic approaches vary, and no standardized methodology has been validated. This study validated a simple and standardized method, segmented linear regression (SReg), for determination of sweating threshold temperature and sensitivity. Archived data were extracted for analysis from studies in which local arm sweat rate (m(sw); ventilated dew-point temperature sensor) and esophageal temperature (T(es)) were measured under a variety of conditions. The relationship m(sw)/T(es) from 16 experiments was analyzed by seven experienced raters (Rater), using a variety of empirical methods, and compared against SReg for the determination of sweating threshold temperature and sweating sensitivity values. Individual interrater differences (n = 324 comparisons) and differences between Rater and SReg (n = 110 comparisons) were evaluated within the context of biologically important limits of magnitude (LOM) via a modified Bland-Altman approach. The average Rater and SReg outputs for threshold temperature and sensitivity were compared (n = 16) using inferential statistics. Rater employed a very diverse set of criteria to determine the sweating threshold temperature and sweating sensitivity for the 16 data sets, but interrater differences were within the LOM for 95% (threshold) and 73% (sensitivity) of observations, respectively. Differences between mean Rater and SReg were within the LOM 90% (threshold) and 83% (sensitivity) of the time, respectively. Rater and SReg were not different by conventional t-test (P > 0.05). SReg provides a simple, valid, and standardized way to determine sweating threshold temperature and sweating sensitivity values for thermoregulatory studies.
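
    The segmented-regression idea behind SReg can be sketched as a two-segment least-squares fit: a flat baseline below the threshold temperature and a rising line above it, with the breakpoint chosen to minimize the total squared error. The following is an illustrative reconstruction under those assumptions, not the authors' published implementation, and the data are synthetic:

```python
import numpy as np

def segmented_fit(t_es, m_sw):
    """Flat-then-rising two-segment fit of sweat rate vs. core temperature.
    Returns (threshold_temperature, sensitivity). Hypothetical sketch of
    the segmented-regression idea, not the published SReg algorithm."""
    best_sse, best_bp, best_slope = np.inf, None, None
    for i in range(2, len(t_es) - 2):              # candidate breakpoints
        bp = t_es[i]
        pre = m_sw[t_es <= bp]                     # baseline segment
        post_x, post_y = t_es[t_es >= bp], m_sw[t_es >= bp]
        slope, intercept = np.polyfit(post_x, post_y, 1)
        sse = (np.sum((pre - pre.mean()) ** 2)
               + np.sum((post_y - (slope * post_x + intercept)) ** 2))
        if sse < best_sse:
            best_sse, best_bp, best_slope = sse, bp, slope
    return best_bp, best_slope

# Synthetic data: constant baseline sweating until 36.8 C, then linear rise
t = np.linspace(36.2, 37.6, 29)
sweat = np.where(t < 36.8, 0.05, 0.05 + 1.2 * (t - 36.8))
threshold, sensitivity = segmented_fit(t, sweat)
print(threshold, sensitivity)
```

    On noiseless data the grid search recovers the constructed breakpoint and slope; with real m(sw)/T(es) data the same procedure returns the breakpoint with minimum residual error, which is what makes it a standardized, rater-independent criterion.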

  5. Validated stability-indicating TLC method for the determination of noscapine.

    PubMed

    Ashour, Ahmed; Hegazy, Maha Abdel Monem; Moustafa, Azza Aziz; Kelani, Khadiga Omar; Fattah, Laila Elsayed Abdel

    2009-07-01

    A sensitive, selective, precise and stability-indicating thin-layer chromatographic (TLC) method was developed and validated for the analysis of noscapine, both as a bulk drug and in its formulation. The method employed TLC aluminium plates precoated with silica gel 60F-254 as the stationary phase. The solvent system consisted of chloroform-methanol (10:0.5 v/v). Densitometric analysis of noscapine and its degradation products was carried out in the absorbance mode at 254 nm. This system was found to give compact symmetrical spots for noscapine (R(f) value 0.85 +/- 0.04). Noscapine was subjected to acid and alkali hydrolysis, oxidation and photo degradation. The drug undergoes photo degradation and also degrades under acidic and basic conditions. The prepared degradation products were identified and verified through infrared (IR) and mass spectral analyses. The degraded products were also well resolved from the pure drug with significantly different R(f) values and they were quantitatively determined. The method was validated for linearity, precision, robustness, limit of detection (LOD), limit of quantitation (LOQ), specificity and accuracy. Linearity was found to be in the 1.0-10.0 microg, 0.4-3.2 microg, 1.0-9.0 microg and 0.5-5.0 microg/band ranges for noscapine, cotarnine, meconine and opionic acid, respectively. The polynomial regression analysis for the calibration plots showed a good polynomial relationship with r(2) values of 0.9998, 0.9989, 0.9996 and 0.9997 for noscapine and its three degradation products, cotarnine, meconine and opionic acid, respectively. Statistical analysis proves that the method is repeatable and specific for the estimation of noscapine. As this approach could effectively separate the drug from its degradation products it can be employed as a stability-indicating method in Quality Control laboratories.

  6. Fit-for-purpose chromatographic method for the determination of amikacin in human plasma for the dosage control of patients.

    PubMed

    Ezquer-Garin, C; Escuder-Gilabert, L; Martín-Biosca, Y; Lisart, R Ferriols; Sagrado, S; Medina-Hernández, M J

    2016-04-01

    In this paper, a simple, rapid and sensitive method based on liquid chromatography with fluorimetric detection (HPLC-FLD) for the determination of amikacin (AMK) in human plasma is developed. Determination is performed by pre-column derivatization of AMK with ortho-phthalaldehyde (OPA) in the presence of N-acetyl-L-cysteine (NAC) at pH 9.5 for 5 min at 80 °C. To our knowledge, this is the first time that NAC has been used in AMK derivatization. Derivatization conditions (pH, AMK/OPA/NAC molar ratios, temperature and reaction time) are optimized to obtain a single derivative that is stable at room temperature. Separation of the derivative is achieved on a reversed-phase LC column (Kromasil C18, 5 μm, 150 × 4.6 i.d. mm) with a mobile phase of 0.05 M phosphate buffer:acetonitrile (80:20, v/v) pumped at a flow rate of 1.0 mL/min. Detection is performed using 337 and 439 nm as excitation and emission wavelengths, respectively. The method is fit for purpose as a competitive alternative to the method currently used in many hospitals for AMK dosage control: fluorescence polarization immunoassay (FPIA). The method exhibits linearity in the 0.17-10 µg mL(-1) concentration range with a squared correlation coefficient higher than 0.995. Trueness and intermediate precision are estimated using spiked drug-free plasma samples, and fulfill current UNE-EN ISO 15189:2007 accreditation schemes. Finally, for the first time, statistical comparison against the FPIA method is demonstrated using plasma samples from 31 patients under treatment with AMK.

  7. Low-level (PPB) determination of cisplatin in cleaning validation (rinse water) samples. I. An atomic absorption spectrophotometric method.

    PubMed

    Raghavan, R; Mulligan, J A

    2000-04-01

    Suitable analytical methods are required for quantitative determination of trace levels of ingredients in samples obtained for purposes of cleaning validation. We describe below an atomic absorption method for the quantitation of cisplatin, an antineoplastic agent, in aqueous samples. Cisplatin was reacted with diethyldithiocarbamic acid (DDTC), sodium salt, to yield a platinum-DDTC (Pt-DDTC) complex. The Pt-DDTC chelate was extracted into methylene chloride, the extract was mixed with acetonitrile, and the platinum content was then determined using a Zeeman atomic absorption (AA) spectrophotometer. The extraction conditions and AA experimental conditions were set up such that the detection level could be extended to 0.5 ng/ml. Reproducible results were obtained at a quantitative working standard concentration of 5 PPB. The absorbance response was found to be a linear function of cisplatin concentration in the region between 0.5 PPB and 20 PPB, which is about 10% to 400% of the target analyte concentration of 5 PPB. The target analyte concentration was set at 5 PPB such that it was at least 10 times the detection limit of about 0.5 PPB.

  8. Ecological validity and the study of publics: The case for organic public engagement methods.

    PubMed

    Gehrke, Pat J

    2014-01-01

    This essay argues for a method of public engagement grounded in the criteria of ecological validity. Motivated by what Hammersley (1993: 29) called the responsibility that comes with intellectual authority, "to seek, as far as possible, to ensure the validity of their conclusions and to participate in rational debate about those conclusions", organic public engagement follows the empirical turn in citizenship theory and in rhetorical studies of actually existing publics. Rather than shaping citizens into either the compliant subjects of the cynical view or the deliberatively disciplined subjects of the idealist view, organic public engagement instead takes Asen's advice that "we should ask: how do people enact citizenship?" (2004: 191). In short, organic engagement methods engage publics in the places where they already exist and through the discourses and social practices by which they enact their status as publics. Such engagements can generate practical middle-range theories that facilitate future actions and decisions attentive to the local ecologies of diverse publics.

  9. A validated new method for nevirapine quantitation in human plasma via high-performance liquid chromatography.

    PubMed

    Silverthorn, Courtney F; Parsons, Teresa L

    2006-01-01

    A fully validated and clinically relevant assay was developed for the assessment of nevirapine concentrations in neonate blood plasma samples. Solid-phase extraction with an acid-base wash series was used to prepare subject samples for analysis. Samples were separated by high performance liquid chromatography and detected at 280 nm on a C8 reverse-phase column in an isocratic mobile phase. The retention times of nevirapine and its internal standard were 5.0 and 6.9 min, respectively. The method was validated by assessment of accuracy and precision (statistical values <15%), specificity, and stability. The assay was linear in the range 25-10,000 ng/mL (r2 > 0.996) and the average recovery was 93% (n = 18). The lower limit of quantification (relative standard deviation <20%) was determined to be 25 ng/mL for 50 microL of plasma, allowing detection of as little as 1.25 ng of nevirapine in a sample. This value represents an increase in sensitivity of up to 30-fold over previously published methods.

  10. Validation of a novel sequential cultivation method for the production of enzymatic cocktails from Trichoderma strains.

    PubMed

    Florencio, C; Cunha, F M; Badino, A C; Farinas, C S

    2015-02-01

    The development of new cost-effective bioprocesses for the production of cellulolytic enzymes is needed in order to ensure that the conversion of biomass becomes economically viable. The aim of this study was to determine whether a novel sequential solid-state and submerged fermentation method (SF) could be validated for different strains of the Trichoderma genus. Cultivation of the Trichoderma reesei Rut-C30 reference strain under SF using sugarcane bagasse as substrate was shown to be favorable for endoglucanase (EGase) production, resulting in up to 4.2-fold improvement compared with conventional submerged fermentation. Characterization of the enzymes in terms of the optimum pH and temperature for EGase activity and comparison of the hydrolysis profiles obtained using a synthetic substrate did not reveal any qualitative differences among the different cultivation conditions investigated. However, the thermostability of the EGase was influenced by the type of carbon source and cultivation system. All three strains of Trichoderma tested (T. reesei Rut-C30, Trichoderma harzianum, and Trichoderma sp INPA 666) achieved higher enzymatic productivity when cultivated under SF, hence validating the proposed SF method for use with different Trichoderma strains. The results suggest that this bioprocess configuration is a very promising development for the cellulosic biofuels industry.

  11. Design and validation of bending test method for characterization of miniature pediatric cortical bone specimens.

    PubMed

    Albert, Carolyne I; Jameson, John; Harris, Gerald

    2013-02-01

    Osteogenesis imperfecta is a genetic disorder of bone fragility; however, the effects of this disorder on bone material properties are not well understood. No study has yet measured bone material strength in humans with osteogenesis imperfecta. Small bone specimens are often extracted during routine fracture surgeries in children with osteogenesis imperfecta. These specimens could provide valuable insight into the effects of osteogenesis imperfecta on bone material strength; however, their small size poses a challenge to their mechanical characterization. In this study, a validated miniature three-point bending test is described that enables measurement of the flexural material properties of pediatric cortical osteotomy specimens as small as 5 mm in length. This method was validated extensively using bovine bone, and the effect of span/depth aspect ratio (5 vs 6) on the measured flexural properties was examined. The method provided reasonable results for both Young's modulus and flexural strength in bovine bone. With a span/depth ratio of 6, the median longitudinal modulus and flexural strength results were 16.1 (range: 14.4-19.3)GPa and 251 (range: 219-293)MPa, respectively. Finally, the pilot results from two osteotomy specimens from children with osteogenesis imperfecta are presented. These results provide the first measures of bone material strength in this patient population.

  12. Validated spectrofluorimetric method for the determination of tamsulosin in spiked human urine, pure and pharmaceutical preparations.

    PubMed

    Karasakal, A; Ulu, S T

    2014-05-01

    A novel, sensitive and selective spectrofluorimetric method was developed for the determination of tamsulosin in spiked human urine and pharmaceutical preparations. The proposed method is based on the reaction of tamsulosin with 1-dimethylaminonaphthalene-5-sulfonyl chloride in carbonate buffer pH 10.5 to yield a highly fluorescent derivative. The described method was validated and the analytical parameters of linearity, limit of detection (LOD), limit of quantification (LOQ), accuracy, precision, recovery and robustness were evaluated. The proposed method showed a linear dependence of the fluorescence intensity on drug concentration over the range 1.22 × 10(-7) to 7.35 × 10(-6)  M. LOD and LOQ were calculated as 1.07 × 10(-7) and 3.23 × 10(-7)  M, respectively. The proposed method was successfully applied for the determination of tamsulosin in pharmaceutical preparations and the obtained results were in good agreement with those obtained using the reference method.

  13. Spectrofluorimetric method for determination and validation of cefixime in pharmaceutical preparations through derivatization with 2-cyanoacetamide.

    PubMed

    Shah, Jasmin; Jan, M Rasul; Shah, Sultan; Inayatullah

    2011-03-01

    A simple, sensitive and accurate method has been developed for the spectrofluorimetric determination of cefixime in pure form and in pharmaceutical preparations. The method is based on the reaction of cefixime with 2-cyanoacetamide in the presence of 21% ammonia at 100 °C. The fluorescent reaction product showed maximum fluorescence intensity at λ 378 nm after excitation at λ 330 nm. The factors affecting the derivatization reaction were carefully studied and optimized. The fluorescence intensity versus concentration plot was rectilinear over the range of 0.02 to 4 μg mL(-1) with a correlation coefficient of 0.99036. The limit of detection (LOD) and limit of quantification (LOQ) were found to be 2.95 ng mL(-1) and 9.84 ng mL(-1), respectively. The proposed method was validated statistically and through recovery studies, and was successfully applied to the determination of cefixime in pure and dosage forms with percent recoveries from 98.117% to 100.38%. The results obtained were compared with those of the official HPLC method, and good agreement was found between them.

  14. The role of validated analytical methods in JECFA drug assessments and evaluation for recommending MRLs.

    PubMed

    Boison, Joe O

    2016-05-01

    The Joint Food and Agriculture Organization and World Health Organization (FAO/WHO) Expert Committee on Food Additives (JECFA) is one of three Codex committees tasked with applying risk analysis and relying on independent scientific advice provided by expert bodies organized by FAO/WHO when developing standards. While not officially part of the Codex Alimentarius Commission structure, JECFA provides independent scientific advice to the Commission and its specialist committees such as the Codex Committee on Residues of Veterinary Drugs in Foods (CCRVDF) in setting maximum residue limits (MRLs) for veterinary drugs. Codex methods of analysis (Types I, II, III, and IV) are defined in the Codex Procedural Manual as are criteria to be used for selecting methods of analysis. However, if a method is to be used under a single laboratory condition to support regulatory work, it must be validated according to an internationally recognized protocol and the use of the method must be embedded in a quality assurance system in compliance with ISO/IEC 17025:2005. This paper examines the attributes of the methods used to generate residue depletion data for drug registration and/or licensing and for supporting regulatory enforcement initiatives that experts consider to be useful and appropriate in their assessment of methods of analysis. Copyright © 2016 Her Majesty the Queen in Right of Canada. Drug Testing and Analysis © 2016 John Wiley & Sons, Ltd.

  15. Further validation to the variational method to obtain flow relations for generalized Newtonian fluids

    NASA Astrophysics Data System (ADS)

    Sochi, Taha

    2015-05-01

    We continue our investigation of the use of the variational method to derive flow relations for generalized Newtonian fluids in confined geometries. While in previous investigations we used the straight circular tube geometry with eight fluid rheological models to demonstrate and establish the variational method, the focus here is on the plane long thin slit geometry using those same eight rheological models, namely: Newtonian, power law, Ree-Eyring, Carreau, Cross, Casson, Bingham and Herschel-Bulkley. We demonstrate how the variational principle based on minimizing the total stress in the flow conduit can be used to derive analytical expressions, previously derived by other methods, or used in conjunction with numerical procedures to obtain numerical solutions that are virtually identical to those obtained from well-established methods of fluid dynamics. In this regard, we use the method of Weissenberg-Rabinowitsch-Mooney-Schofield (WRMS), adapted from the circular pipe geometry to the long thin slit geometry, to derive analytical formulae for the eight types of fluid; these derived formulae are used for comparison and validation of the variational formulae and numerical solutions. Although some of these examples may be of little practical value, the optimization principle on which the variational method is based has significant theoretical value, as it reveals the tendency of the flow system to assume a configuration that minimizes the total stress. Our proposal also offers a new methodology for tackling common problems in fluid dynamics and rheology.
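
    As a concrete instance of the slit-flow relations referred to above, the classical Newtonian result for a plane slit of gap h, width W and length L under pressure drop Δp is Q = W h³ Δp / (12 μ L). A minimal numerical check, showing only the Newtonian member of the eight models with illustrative (hypothetical) values:

```python
def slit_flow_newtonian(width, gap, length, dp, mu):
    """Volumetric flow rate (m^3/s) of a Newtonian fluid through a plane
    slit: Q = W * h**3 * dp / (12 * mu * L). Classical plane-Poiseuille
    result; all arguments in SI units."""
    return width * gap**3 * dp / (12.0 * mu * length)

# Illustrative values: 10 cm wide, 1 mm gap, 1 m long slit, 10 kPa
# pressure drop, water-like viscosity of 1 mPa.s
Q = slit_flow_newtonian(width=0.1, gap=1e-3, length=1.0, dp=1.0e4, mu=1.0e-3)
print(Q)  # flow rate in m^3/s
```

    The variational and WRMS derivations in the paper reduce to exactly this expression in the Newtonian limit, which is what makes the slit geometry a convenient validation case.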

  16. Method validation and dissipation dynamics of chlorfenapyr in squash and okra.

    PubMed

    Abdel Ghani, Sherif B; Abdallah, Osama I

    2016-03-01

    A QuEChERS method combined with GC-IT-MS was developed and validated for the determination of chlorfenapyr residues in squash and okra matrices. Method accuracy, repeatability, linearity and specificity were investigated, and the matrix effect was discussed. Determination coefficients (R²) were 0.9992 and 0.9987 in the two matrices. LODs were 2.4 and 2.2 μg/kg, while LOQs were 8.2 and 7.3 μg/kg. Method accuracy ranged from 92.76% to 106.49%. Method precision RSDs were ≤12.59%. A field trial to assess chlorfenapyr dissipation behavior was carried out, and the developed method was employed in analyzing the field samples. Dissipation behavior followed first-order kinetics in both crops. Half-life values (t1/2) ranged from 0.2 to 6.58 days, with determination coefficients (R²) ranging from 0.78 to 0.96. The developed method was also utilized for surveying chlorfenapyr residues in squash and okra samples collected from the market, and the monitoring results are discussed.
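
    The first-order dissipation analysis used above can be sketched in a few lines; a generic illustration (function and variable names are my own, not from the paper): regressing ln C against time yields the rate constant k, from which t1/2 = ln 2 / k.

```python
import math
import numpy as np

def first_order_half_life(days, residues):
    """Fit C(t) = C0 * exp(-k*t) by linear regression on ln C.

    Returns the rate constant k (per day), the half-life t1/2 = ln2/k,
    and the determination coefficient R^2 of the fit."""
    t = np.asarray(days, dtype=float)
    y = np.log(np.asarray(residues, dtype=float))
    slope, intercept = np.polyfit(t, y, 1)   # y = intercept + slope*t
    k = -slope
    y_hat = intercept + slope * t
    r2 = 1.0 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
    return k, math.log(2) / k, r2
```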

  17. Validity of the t-plot method to assess microporosity in hierarchical micro/mesoporous materials.

    PubMed

    Galarneau, Anne; Villemot, François; Rodriguez, Jeremy; Fajula, François; Coasne, Benoit

    2014-11-11

    The t-plot method is a well-known technique which allows determining the micro- and/or mesoporous volumes and the specific surface area of a sample by comparison with a reference adsorption isotherm of a nonporous material having the same surface chemistry. In this paper, the validity of the t-plot method is discussed in the case of hierarchical porous materials exhibiting both micro- and mesoporosities. Different hierarchical zeolites with MCM-41 type ordered mesoporosity are prepared using pseudomorphic transformation. For comparison, we also consider simple mechanical mixtures of microporous and mesoporous materials. We first show an intrinsic failure of the t-plot method; this method does not describe the fact that, for a given surface chemistry and pressure, the thickness of the film adsorbed in micropores or small mesopores (< 10σ, σ being the diameter of the adsorbate) increases with decreasing the pore size (curvature effect). We further show that such an effect, which arises from the fact that the surface area and, hence, the free energy of the curved gas/liquid interface decreases with increasing the film thickness, is captured using the simple thermodynamical model by Derjaguin. The effect of such a drawback on the ability of the t-plot method to estimate the micro- and mesoporous volumes of hierarchical samples is then discussed, and an abacus is given to correct the underestimated microporous volume by the t-plot method.
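
    The basic t-plot analysis discussed here can be sketched as follows; a hedged illustration (names and the fit window are my own choices): adsorbed volume is fitted linearly against the statistical film thickness of the nonporous reference, the intercept estimating the micropore volume and the slope the external surface area.

```python
import numpy as np

def t_plot(thickness_nm, v_ads, fit_range=(0.35, 0.50)):
    """Classical t-plot: linear fit of adsorbed volume (cm3 STP/g) vs the
    statistical film thickness t (nm) of a nonporous reference, over a
    chosen t-window. Intercept -> micropore volume proxy; slope -> external
    surface area (for N2 at 77 K, S [m2/g] ~ 1.547 * slope with t in nm)."""
    t = np.asarray(thickness_nm, float)
    v = np.asarray(v_ads, float)
    mask = (t >= fit_range[0]) & (t <= fit_range[1])
    slope, intercept = np.polyfit(t[mask], v[mask], 1)
    return intercept, 1.547 * slope
```

    The curvature effect described in the abstract is exactly what this linear construction misses: in micropores and small mesopores the film is thicker than on the flat reference at the same pressure, so the intercept underestimates the microporous volume.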

  18. Extension and validation of a method for locating damaged members in large space trusses

    NASA Technical Reports Server (NTRS)

    Smith, Suzanne Weaver

    1988-01-01

    The damage location approach employs the structure's control-system capabilities to excite the structure and measure its dynamic response. The measurements are then used in a system identification algorithm to produce a model of the damaged structure. The model is compared to one for the undamaged structure to find regions of reduced stiffness, which indicate the location of damage. Kabe's stiffness matrix adjustment method was the central identification algorithm. The strength of his method is that, with minimal data, it preserves the representation of the physical connectivity of the structure in the resulting model of the damaged truss. However, it required extensive storage and computational effort. Extension of the damage location method to overcome these problems is the first part of the current work. The central system identification algorithm is replaced with the MSMT method of stiffness matrix adjustment, which was previously derived by generalizing an optimal-update secant method from quasi-Newton approaches for nonlinear optimization. Validation of the extended damage location method is the second goal.
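
    The comparison step described above, finding regions of reduced stiffness between the identified and baseline models, can be illustrated schematically; this is a toy sketch with invented names, not the MSMT algorithm itself.

```python
import numpy as np

def flag_damaged_dofs(K_undamaged, K_identified, threshold=0.05):
    """Compare diagonal stiffness terms of the undamaged and identified
    models; DOFs whose stiffness dropped by more than `threshold`
    (fractional) are flagged as candidate damage locations."""
    d0 = np.diag(np.asarray(K_undamaged, float))
    d1 = np.diag(np.asarray(K_identified, float))
    reduction = (d0 - d1) / d0
    return [int(i) for i in np.where(reduction > threshold)[0]]
```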

  19. Chemometric approach for development, optimization, and validation of different chromatographic methods for separation of opium alkaloids.

    PubMed

    Acevska, J; Stefkov, G; Petkovska, R; Kulevanova, S; Dimitrovska, A

    2012-05-01

    The intense and continuously growing interest in the simultaneous determination of poppy alkaloids calls for the development and optimization of convenient high-throughput methods for assessing the qualitative and quantitative profile of alkaloids in poppy straw. Systematic optimization of two chromatographic methods (gas chromatography (GC)/flame ionization detector (FID)/mass spectrometry (MS) and reversed-phase (RP)-high-performance liquid chromatography (HPLC)/diode array detector (DAD)) for the separation of alkaloids from Papaver somniferum L. (Papaveraceae) was carried out. The effects of various conditions on the predefined chromatographic descriptors were investigated using chemometrics. A full factorial linear design of experiments was used to determine the relationship between chromatographic conditions and the retention behavior of the analytes, and a central composite circumscribed design was utilized for the final method optimization. By conducting the optimization in this rational manner, a great deal of excessive and unproductive laboratory work was avoided. The developed chromatographic methods were validated and compared with respect to resolving power, sensitivity, accuracy, speed, cost, ecological aspects, and compatibility with the poppy straw extraction procedure. Separation of the opium alkaloids using the GC/FID/MS method was achieved within 10 min, avoiding any derivatization step. This method has stronger resolving power, a shorter analysis time, and a better cost-effectiveness factor than the RP-HPLC/DAD method, and is in line with the "green trend" of analysis. The RP-HPLC/DAD method, on the other hand, displayed better sensitivity for all tested alkaloids. The proposed methods provide both fast screening and an accurate content assessment of the six alkaloids in poppy samples obtained from the selection program of Papaver strains.
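
    A full factorial design of the kind used for the screening step above is simply the enumeration of every combination of factor levels; a generic sketch (the factor names are invented for illustration):

```python
from itertools import product

def full_factorial(levels):
    """Enumerate every combination of factor levels.

    `levels` maps factor name -> list of levels, e.g.
    {"column_temp_C": [60, 80], "flow_mL_min": [0.8, 1.0, 1.2]}."""
    names = list(levels)
    return [dict(zip(names, combo)) for combo in product(*(levels[n] for n in names))]
```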

  20. Validation of a Three-Dimensional Method for Counting and Sizing Podocytes in Whole Glomeruli.

    PubMed

    Puelles, Victor G; van der Wolde, James W; Schulze, Keith E; Short, Kieran M; Wong, Milagros N; Bensley, Jonathan G; Cullen-McEwen, Luise A; Caruana, Georgina; Hokke, Stacey N; Li, Jinhua; Firth, Stephen D; Harper, Ian S; Nikolic-Paterson, David J; Bertram, John F

    2016-10-01

    Podocyte depletion is sufficient for the development of numerous glomerular diseases and can be absolute (loss of podocytes) or relative (reduced number of podocytes per volume of glomerulus). Commonly used methods to quantify podocyte depletion introduce bias, whereas gold standard stereologic methodologies are time consuming and impractical. We developed a novel approach for assessing podocyte depletion in whole glomeruli that combines immunofluorescence, optical clearing, confocal microscopy, and three-dimensional analysis. We validated this method in a transgenic mouse model of selective podocyte depletion, in which we determined dose-dependent alterations in several quantitative indices of podocyte depletion. This new approach provides a quantitative tool for the comprehensive and time-efficient analysis of podocyte depletion in whole glomeruli.

  1. Validation of a mass spectrometry method to quantify oak ellagitannins in wine samples.

    PubMed

    García-Estévez, Ignacio; Escribano-Bailón, M Teresa; Rivas-Gonzalo, Julián C; Alcalde-Eon, Cristina

    2012-02-15

    Detection and individual quantification of oak wood ellagitannins in oak barrel aged red wine samples are difficult mainly due to their low levels and the similarity between their structures. In this work, a quantification method using mass spectrometry has been developed and validated to quantify wine ellagitannins after sample fractionation with a previously reported method. The use of an internal standard is a requirement to correct mass signal variability. (-)-Gallocatechin, among the different tested compounds, was the only one that proved to be a suitable internal standard making possible the accurate and individual quantification of the main oak wood ellagitannins. The developed methodology has been used to detect and quantify these ellagitannins in different Spanish commercial wines, proving its usefulness.

  2. Field validation of test methods for solidified waste evaluation -- a status report

    SciTech Connect

    Stegemann, J.A.; Caldwell, R.J.; Shi, C.

    1996-12-31

    Application of solidification/stabilization as a treatment technology for hazardous wastes has been hindered by the lack of a regulatory approval mechanism for solidified wastes. The Wastewater Technology Centre (WTC) has developed a protocol for evaluation of solidified wastes, which uses the performance of a solidified product in twelve laboratory test methods to recommend one of four categories of utilization and disposal. In order to facilitate acceptance of the protocol, a validation study of the test methods has been initiated by the WTC. A 63 m³ field test cell has been constructed using electric arc furnace dust solidified with an activated blast furnace slag binder system. The behavior of the solidified waste in the field will be investigated by monitoring of leachate and testing of core samples, and compared with properties measured in the laboratory.

  3. Validation of Inlet and Exhaust Boundary Conditions for a Cartesian Method

    NASA Technical Reports Server (NTRS)

    Pandya, Shishir A.; Murman, Scott M.; Aftosmis, Michael J.

    2004-01-01

    Inlets and exhaust nozzles are often omitted in aerodynamic simulations of aircraft due to the complexities involved in the modeling of engine details and flow physics. However, the omission is often improper since inlet or plume flows may have a substantial effect on vehicle aerodynamics. A method for modeling the effect of inlets and exhaust plumes using boundary conditions within an inviscid Cartesian flow solver is presented. This approach couples with both CAD systems and legacy geometry to provide an automated tool suitable for parameter studies. The method is validated using two- and three-dimensional test problems which are compared with both theoretical and experimental results. The numerical results demonstrate excellent agreement with theory and available data, even for extremely strong jets and very sensitive inlets.

  4. Validated spectrophotometric methods for the estimation of moxifloxacin in bulk and pharmaceutical formulations

    NASA Astrophysics Data System (ADS)

    Motwani, Sanjay K.; Chopra, Shruti; Ahmad, Farhan J.; Khar, Roop K.

    2007-10-01

    New, simple, cost-effective, accurate and reproducible UV-spectrophotometric methods were developed and validated for the estimation of moxifloxacin in bulk and pharmaceutical formulations. Moxifloxacin was estimated at 296 nm in 0.1 N hydrochloric acid (pH 1.2) and at 289 nm in phosphate buffer (pH 7.4). Beer's law was obeyed in the concentration range of 1-12 μg ml⁻¹ (r² = 0.9999) in hydrochloric acid and 1-14 μg ml⁻¹ (r² = 0.9998) in the phosphate buffer medium. The apparent molar absorptivity and Sandell's sensitivity coefficient were found to be 4.63 × 10⁴ l mol⁻¹ cm⁻¹ and 9.5 ng cm⁻²/0.001 A in hydrochloric acid, and 4.08 × 10⁴ l mol⁻¹ cm⁻¹ and 10.8 ng cm⁻²/0.001 A in phosphate buffer media, respectively, indicating the high sensitivity of the proposed methods. These methods were tested and validated for various parameters according to ICH guidelines. The detection and quantitation limits were found to be 0.0402 and 0.1217 μg ml⁻¹ in hydrochloric acid and 0.0384 and 0.1163 μg ml⁻¹ in phosphate buffer medium, respectively. The proposed methods were successfully applied for the determination of moxifloxacin in pharmaceutical formulations (tablets, i.v. infusions, eye drops and polymeric nanoparticles). The results demonstrated that the procedures are accurate, precise and reproducible (relative standard deviation <2%), while being simple, cheap and less time consuming, and hence can be suitably applied for the estimation of moxifloxacin in different dosage forms and dissolution studies.
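
    Detection and quantitation limits of the kind reported above are commonly derived from the calibration line following the ICH Q2 approach (LOD = 3.3σ/S, LOQ = 10σ/S); a minimal sketch with invented names:

```python
import numpy as np

def lod_loq(conc, absorbance):
    """ICH-style limits from a calibration line: slope S and residual
    standard deviation sigma give LOD = 3.3*sigma/S, LOQ = 10*sigma/S."""
    x = np.asarray(conc, float)
    y = np.asarray(absorbance, float)
    slope, intercept = np.polyfit(x, y, 1)
    resid = y - (slope * x + intercept)
    sigma = resid.std(ddof=2)  # two parameters estimated from the data
    return 3.3 * sigma / slope, 10.0 * sigma / slope
```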

  5. Validation of the hypercapnic calibrated fMRI method using DOT-fMRI fusion imaging

    PubMed Central

    Yücel, Meryem A.; Evans, Karleyton C.; Selb, Juliette; Huppert, Theodore J.; Boas, David A.; Gagnon, Louis

    2014-01-01

    Calibrated functional Magnetic Resonance Imaging (fMRI) is a widely used method to investigate brain function in terms of physiological quantities such as the cerebral metabolic rate of oxygen (CMRO2). The first and one of the most common methods of fMRI calibration is hypercapnic calibration. This is achieved via simultaneous measures of blood-oxygenation-level dependent (BOLD) and arterial spin labeling (ASL) signals during a functional task that evokes regional changes in CMRO2. A subsequent acquisition is then required during which the subject inhales carbon dioxide for short periods of time. A calibration constant, typically labeled M, is then estimated from the hypercapnic data and is subsequently used together with the BOLD-ASL recordings to compute evoked changes in CMRO2 during the functional task. The computation of M assumes a constant CMRO2 during the CO2 inhalation, an assumption that has been questioned since the origin of calibrated fMRI. In this study we used Diffuse Optical Tomography (DOT) together with BOLD and ASL (an alternative calibration method that does not require any gas manipulation and therefore no constant-CMRO2 assumption) to cross-validate the estimation of M obtained from a traditional hypercapnic calibration. We found a high correlation between the M values (R=0.87, p<0.01) estimated using these two approaches. The findings serve to validate the hypercapnic fMRI calibration technique and suggest that the inter-subject variability routinely obtained for M is reproducible with an alternative method and might therefore reflect inter-subject physiological variability. PMID:25196509
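
    For context, hypercapnic calibration is typically based on the Davis model, in which M is obtained by assuming CMRO2 is unchanged during CO2 inhalation; a sketch with conventional literature parameter values (α ≈ 0.38, β ≈ 1.5), which are assumptions on my part, not values taken from this paper:

```python
def hypercapnic_M(dbold_rel, cbf_ratio, alpha=0.38, beta=1.5):
    """Davis-model estimate of M from a hypercapnic challenge.

    dbold_rel: fractional BOLD change (dS/S0); cbf_ratio: CBF/CBF0 from ASL.
    Assuming constant CMRO2, dS/S0 = M * (1 - f**(alpha - beta)), so
    M = (dS/S0) / (1 - f**(alpha - beta))."""
    return dbold_rel / (1.0 - cbf_ratio ** (alpha - beta))
```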

  6. Implementation and validation of a new prognostic large-scale cloud and precipitation scheme for climate and data-assimilation purposes

    NASA Astrophysics Data System (ADS)

    Lopez, Philippe

    2002-01-01

    A large-scale condensation scheme, able to treat separately both atmospheric cloud condensate and precipitation content in a prognostic way, has been implemented and validated in Météo-France's operational global model, ARPEGE. The proposed scheme can be used for climate simulations and short-range numerical weather prediction, although it was originally designed for the future variational assimilation of cloud and precipitation observations. The main originalities of the scheme, compared with other existing schemes having a similar moderate level of complexity, lie in the inclusion of a prognostic variable for rain and snow content, and in the use of a simple semi-Lagrangian treatment of the fall of precipitation. The calculations of large-scale condensation/evaporation and cloud fraction are based on probability-density functions, and the parametrized microphysical processes that involve precipitation are autoconversion, collection, and evaporation/sublimation. Various observations, which include satellite data from METEOSAT and from the Defense Meteorological Satellite Program's Special Sensor Microwave Imager, have been used for validating the cloud scheme within three-dimensional ARPEGE simulations at operational resolution for cases from the Fronts and Atlantic Storm-Track Experiment. These ARPEGE simulations have also been compared with 10 km runs obtained with the Met Office's Unified Model and with the French Méso-NH research model. In addition, cloud radar, ceilometer, and lidar observations from the Atmospheric Radiation Measurement project have been utilized for validating the simulation of a synoptic winter cloud system over the southern Great Plains in the USA. The behaviour of the scheme was then assessed at a coarser resolution, with a particular focus on the zonal-mean radiative budget of the earth and the zonal-mean cloud cover. Finally, the question of the sensitivity of the results from the new scheme to various parameters has been addressed.
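
    The "simple semi-Lagrangian treatment of the fall of precipitation" mentioned above can be illustrated in one dimension; a schematic sketch (not the ARPEGE implementation): the content arriving at a level after one step is the content that started at the departure point one fall-distance above it, which keeps the step stable for arbitrarily large time steps.

```python
import numpy as np

def semi_lagrangian_fall(q, z, w_fall, dt):
    """One semi-Lagrangian step for falling precipitation content q(z).

    z: ascending heights (m); w_fall: fall speed (m/s, positive downward).
    The value at z after dt is the (linearly interpolated) value that was
    at the departure point z + w_fall*dt; above the column top -> 0."""
    departure = np.asarray(z, float) + w_fall * dt
    return np.interp(departure, z, q, left=0.0, right=0.0)
```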

  7. Assessing the validity of prospective hazard analysis methods: a comparison of two techniques

    PubMed Central

    2014-01-01

    Background Prospective Hazard Analysis techniques such as Healthcare Failure Modes and Effects Analysis (HFMEA) and Structured What If Technique (SWIFT) have the potential to increase safety by identifying risks before an adverse event occurs. Published accounts of their application in healthcare have identified benefits, but the reliability of some methods has been found to be low. The aim of this study was to examine the validity of SWIFT and HFMEA by comparing their outputs in the process of risk assessment, and comparing the results with risks identified by retrospective methods. Methods The setting was a community-based anticoagulation clinic, in which risk assessment activities had been previously performed and were available. A SWIFT and an HFMEA workshop were conducted consecutively on the same day by experienced experts. Participants were a mixture of pharmacists, administrative staff and software developers. Both methods produced lists of risks scored according to the method’s procedure. Participants’ views about the value of the workshops were elicited with a questionnaire. Results SWIFT identified 61 risks and HFMEA identified 72 risks. For both methods less than half the hazards were identified by the other method. There was also little overlap between the results of the workshops and risks identified by prior root cause analysis, staff interviews or clinical governance board discussions. Participants’ feedback indicated that the workshops were viewed as useful. Conclusions Although there was limited overlap, both methods raised important hazards. Scoping the problem area had a considerable influence on the outputs. The opportunity for teams to discuss their work from a risk perspective is valuable, but these methods cannot be relied upon in isolation to provide a comprehensive description. Multiple methods for identifying hazards should be used and data from different sources should be integrated to give a comprehensive view of risk in a system.

  8. Validated spectrophotometric and chromatographic methods for simultaneous determination of ketorolac tromethamine and phenylephrine hydrochloride.

    PubMed

    Belal, T S; El-Kafrawy, D S; Mahrous, M S; Abdel-Khalek, M M; Abo-Gharam, A H

    2016-07-01

    This work describes five simple and reliable spectrophotometric and chromatographic methods for analysis of the binary mixture of ketorolac tromethamine (KTR) and phenylephrine hydrochloride (PHE). Method I is based on the use of conventional Amax and derivative spectrophotometry with the zero-crossing technique, where KTR was determined using its Amax and (1)D amplitudes at 323 and 341 nm respectively, while PHE was determined by measuring the (1)D amplitudes at 248.5 nm. Method II involves the application of ratio spectra derivative spectrophotometry. For KTR, 12 μg/mL PHE was used as a divisor and the (1)DD amplitudes at 265 nm were plotted against KTR concentrations; conversely, using 4 μg/mL KTR as divisor, the (1)DD amplitudes at 243.5 nm were found proportional to PHE concentrations. Method III depends on ratio-difference measurement, where the peak-to-trough amplitudes between 260 and 284 nm were measured and correlated to KTR concentration; similarly, the peak-to-trough amplitudes between 235 and 260 nm in the PHE ratio spectra were recorded. For method IV, the two compounds were separated using Merck HPTLC sheets of silica gel 60 F254 and a mobile phase composed of chloroform/methanol/ammonia (70:30:2, by volume), followed by densitometric measurement of KTR and PHE spots at 320 and 278 nm respectively. Method V depends on HPLC-DAD. Effective chromatographic separation was achieved using a Zorbax Eclipse Plus C8 column (4.6 × 250 mm, 5 μm) with a mobile phase consisting of 0.05 M o-phosphoric acid and acetonitrile (50:50, by volume) at a flow rate of 1 mL/min and detection at 313 and 274 nm for KTR and PHE respectively. Analytical performance of the developed methods was statistically validated according to the ICH guidelines with respect to linearity, ranges, precision, accuracy, and detection and quantification limits. The validated spectrophotometric and chromatographic methods were successfully applied to the simultaneous analysis of KTR and PHE in synthetic mixtures.
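
    The ratio-spectra derivative idea behind Method II can be sketched generically (illustrative names, not the authors' code): dividing the mixture spectrum by a standard spectrum of one component turns that component's contribution into a constant, which vanishes on differentiation, leaving a signal proportional to the other component alone.

```python
import numpy as np

def ratio_spectrum_derivative(wavelengths, mixture, divisor):
    """First derivative of the ratio spectrum (mixture / divisor) with
    respect to wavelength; the divisor component's contribution becomes
    a constant and drops out on differentiation."""
    ratio = np.asarray(mixture, float) / np.asarray(divisor, float)
    return np.gradient(ratio, np.asarray(wavelengths, float))
```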

  9. VALIDATION OF ANALYTICAL METHODS AND INSTRUMENTATION FOR BERYLLIUM MEASUREMENT: REVIEW AND SUMMARY OF AVAILABLE GUIDES, PROCEDURES, AND PROTOCOLS

    SciTech Connect

    Ekechukwu, A.

    2008-12-17

    This document provides a listing of available sources which can be used to validate analytical methods and/or instrumentation for beryllium determination. A literature review was conducted of available standard methods and publications used for method validation and/or quality control. A comprehensive listing of the articles, papers, and books reviewed is given in Appendix 1. Available validation documents and guides are listed in the appendix; each has a brief description of application and use. In the referenced sources, there are varying approaches to validation and varying descriptions of validation at different stages in method development. This discussion focuses on validation and verification of fully developed methods and instrumentation that have been offered for use or approval by other laboratories or official consensus bodies such as ASTM International, the International Standards Organization (ISO) and the Association of Official Analytical Chemists (AOAC). This review was conducted as part of a collaborative effort to investigate and improve the state of validation for measuring beryllium in the workplace and the environment. Documents and publications from the United States and Europe are included. Unless otherwise specified, all documents were published in English.

  10. Stress degradation studies on Telmisartan and development of a validated method by UV spectrophotometry in bulk and pharmaceutical dosage forms

    PubMed Central

    Patel, Komal; Dhudasia, Komal; Patel, Amit; Dave, Jayant; Patel, Chaganbhai

    2011-01-01

    Aim and Background: To develop and validate a simple, precise, accurate, and stability-indicating UV method for the estimation of Telmisartan (TELM). UV, HPLC, and HPTLC methods have previously been reported for the drug alone and in combination with other drugs; however, no such stability-indicating study had been reported. Materials and Methods: In both methods, TELM has its absorbance maximum at 296 nm. Method A covers method development and validation, and Method B covers the forced degradation study. Methanol was used as the solvent in both methods. Linearity was observed in the concentration range of 4–16 μg/ml. Validation experiments were performed to demonstrate system suitability, specificity, precision, linearity, accuracy, robustness, LOD, and LOQ as per International Conference on Harmonization guidelines. Furthermore, stability studies of TELM were carried out under acidic, alkali, neutral, oxidative, photolytic, and thermal degradation conditions as per stability-indicating assay methods. Results: The results of the analysis were validated, and recovery studies were carried out using the standard addition method by adding specific drug amounts (80%, 100%, and 120%), giving recoveries in the range of 99.26–101.26%. Conclusion: The proposed method can be successfully applied for method development, validation, and stability study of TELM. PMID:23781466
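
    The standard-addition recovery figures quoted above follow from a one-line calculation; a trivial but explicit sketch (names invented):

```python
def recovery_percent(found_total, preanalyzed, added):
    """Recovery (%) in a standard-addition study: the fraction of the spiked
    amount recovered on top of the pre-analyzed sample content."""
    return 100.0 * (found_total - preanalyzed) / added
```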

  11. Automated Method for Small-Animal PET Image Registration with Intrinsic Validation

    PubMed Central

    Pascau, Javier; Gispert, Juan Domingo; Michaelides, Michael; Thanos, Panayotis K.; Volkow, Nora D.; Vaquero, Juan José; Soto-Montenegro, Maria Luisa; Desco, Manuel

    2009-01-01

    Purpose: We propose and compare different registration approaches to align small-animal PET studies and a procedure to validate the results by means of objective registration consistency measurements. Procedures: We have applied a registration algorithm based on information theory, using different approaches to mask the reference image. The registration consistency allows for the detection of incorrect registrations. This methodology has been evaluated on a test dataset (FDG-PET rat brain images). Results: The results show that a multiresolution two-step registration approach based on the use of the whole image at the low resolution step, while masking the brain at the high resolution step, provides the best robustness (87.5% registration success) and highest accuracy (0.67-mm average). Conclusions: The major advantages of our approach are minimal user interaction and automatic assessment of the registration error, avoiding visual inspection of the results, thus facilitating the accurate, objective, and rapid analysis of large groups of rodent PET images. PMID:18670824
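
    Registration "based on information theory", as above, typically maximizes the mutual information between the intensities of the two images; a minimal histogram-based sketch (not the authors' implementation):

```python
import numpy as np

def mutual_information(img_a, img_b, bins=32):
    """Histogram estimate of the mutual information (in nats) between two
    aligned images; higher values mean the intensities are more predictive
    of each other, which is what intensity-based registration maximizes."""
    joint, _, _ = np.histogram2d(np.ravel(img_a), np.ravel(img_b), bins=bins)
    p = joint / joint.sum()
    px = p.sum(axis=1, keepdims=True)
    py = p.sum(axis=0, keepdims=True)
    nz = p > 0
    return float(np.sum(p[nz] * np.log(p[nz] / (px @ py)[nz])))
```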

  12. Development, validation and acceptance of alternative methods in the quality control of vaccines: A case report.

    PubMed

    Hendriksen, C F

    1995-12-01

    Information about levels of protection against tetanus is needed both for the assessment of immune status and for the estimation of the potency of batches of tetanus toxoid. Originally, levels of protective antibodies in human serum samples were titrated in an in vivo toxin neutralization (TN) test. Potency testing was based either on an indirect protection test or a direct challenge test. In the former test, rabbits or guinea pigs are immunized and bled, followed by titration of serum samples in a TN test. In the latter test, used in the quality control of tetanus toxoid for human use, protective immunity is induced by vaccination in guinea pigs or mice and subsequently challenging them directly with tetanus toxin. In the mid-1980s, an in vitro model was developed at the National Institute of Public Health and Environmental Protection (RIVM) for estimating levels of tetanus antitoxin in serum samples. This model, the toxin-binding inhibition (ToBI) test, was validated recently for both diagnostic testing and potency testing. Although accurate estimation of antitoxin levels is more important for diagnostic testing than for potency testing, the ToBI test has been adopted for determining the immune status but not for potency testing. One major reason is that there are no official guidelines for titration of human serum samples. By contrast, potency testing is performed in accordance with the monographs of national and international pharmacopoeias, which complicates acceptance for technical and bureaucratic reasons. This paper focuses on the validation studies performed at the RIVM. In particular, attention is paid to the various problems encountered. Suggestions for the role of the European Centre for Validation of Alternative Methods (ECVAM) in the quality control of vaccines are also presented.

  13. European validation of Real-Time PCR method for detection of Salmonella spp. in pork meat.

    PubMed

    Delibato, Elisabetta; Rodriguez-Lazaro, David; Gianfranceschi, Monica; De Cesare, Alessandra; Comin, Damiano; Gattuso, Antonietta; Hernandez, Marta; Sonnessa, Michele; Pasquali, Frédérique; Sreter-Lancz, Zuzsanna; Saiz-Abajo, María-José; Pérez-De-Juan, Javier; Butrón, Javier; Prukner-Radovcic, Estella; Horvatek Tomic, Danijela; Johannessen, Gro S; Jakočiūnė, Džiuginta; Olsen, John E; Chemaly, Marianne; Le Gall, Francoise; González-García, Patricia; Lettini, Antonia Anna; Lukac, Maja; Quesne, Segolénè; Zampieron, Claudia; De Santis, Paola; Lovari, Sarah; Bertasi, Barbara; Pavoni, Enrico; Proroga, Yolande T R; Capuano, Federico; Manfreda, Gerardo; De Medici, Dario

    2014-08-01

    The classical microbiological method for detection of Salmonella spp. requires more than five days for final confirmation, and consequently there is a need for an alternative methodology for detection of this pathogen, particularly in food categories with a short shelf-life. This study presents an international (European-level) ISO 16140-based validation study of a non-proprietary Real-Time PCR-based method that can generate final results the day following sample analysis. It is based on an ISO-compatible enrichment coupled to an easy and inexpensive DNA extraction and a consolidated Real-Time PCR assay. Thirteen laboratories from seven European countries participated in this trial, and pork meat was selected as the food model. The limit of detection observed was down to 10 CFU per 25 g of sample, with excellent concordance and accordance values between samples and laboratories (100%). In addition, excellent values were obtained for relative accuracy, specificity and sensitivity (100%) when the results of the Real-Time PCR-based method were compared to those of the ISO 6579:2002 standard method. The results of this international trial demonstrate that the evaluated Real-Time PCR-based method represents an excellent alternative to the ISO standard: it performs equally well and reliably, dramatically shortens the analytical process, and can easily be implemented routinely by Competent Authorities and food industry laboratories.
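
    The relative accuracy, sensitivity and specificity reported in ISO 16140-style comparisons are simple agreement ratios between paired results of the alternative and reference methods; a hedged sketch of the calculation:

```python
def relative_performance(alternative, reference):
    """ISO 16140-style comparison from paired boolean detection results.

    Returns (sensitivity, specificity, accuracy) of the alternative method
    relative to the reference, each as a percentage."""
    pairs = list(zip(alternative, reference))
    pos_agree = sum(1 for a, r in pairs if a and r)
    neg_agree = sum(1 for a, r in pairs if not a and not r)
    ref_pos = sum(1 for _, r in pairs if r)
    ref_neg = len(pairs) - ref_pos
    return (100.0 * pos_agree / ref_pos,
            100.0 * neg_agree / ref_neg,
            100.0 * (pos_agree + neg_agree) / len(pairs))
```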

  14. [Development and validation of event-specific quantitative PCR method for genetically modified maize LY038].

    PubMed

    Mano, Junichi; Masubuchi, Tomoko; Hatano, Shuko; Futo, Satoshi; Koiwa, Tomohiro; Minegishi, Yasutaka; Noguchi, Akio; Kondo, Kazunari; Akiyama, Hiroshi; Teshima, Reiko; Kurashima, Takeyo; Takabatake, Reona; Kitta, Kazumi

    2013-01-01

    In this article, we report a novel real-time PCR-based analytical method for quantitation of the GM maize event LY038. We designed LY038-specific and maize endogenous reference DNA-specific PCR amplifications. After confirming the specificity and linearity of the LY038-specific PCR amplification, we determined the conversion factor required to calculate the weight-based content of GM organism (GMO) in a multilaboratory evaluation. Finally, in order to validate the developed method, an interlaboratory collaborative trial according to the internationally harmonized guidelines was performed with blind DNA samples containing LY038 at the mixing levels of 0, 0.5, 1.0, 5.0 and 10.0%. The precision of the method was evaluated as the RSD of reproducibility (RSDR), and the values obtained were all less than 25%. The limit of quantitation of the method was judged to be 0.5% based on the definition of ISO 24276 guideline. The results from the collaborative trial suggested that the developed quantitative method would be suitable for practical testing of LY038 maize.
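
    In event-specific quantitation schemes of this kind, the weight-based GMO content is typically obtained from the event/endogenous copy-number ratio divided by a conversion factor (Cf) determined on pure GM material; a schematic sketch with invented numbers, not values from the LY038 trial:

```python
def gmo_percent(event_copies, endogenous_copies, cf):
    """GMO content (%) = (event/endogenous copy ratio) / Cf * 100, where Cf
    is the same ratio measured in 100% GM material."""
    return (event_copies / endogenous_copies) / cf * 100.0
```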

  15. Spectrofluorimetric Method for Estimation of Curcumin in Rat Blood Plasma: Development and Validation

    NASA Astrophysics Data System (ADS)

    Trivedi, J.; Variya, B.; Gandhi, H.; Rathod, S. P.

    2016-01-01

    Curcumin is a medicinally important phytoconstituent of curcuminoids. The present study describes the development of a simple method for estimation of curcumin in rat plasma. This method involves the use of spectrofluorimetry for evaluation of curcumin at an excitation wavelength of 257 nm and an emission wavelength of 504 nm. Sample preparation involves only two steps: extraction of curcumin and drying of the extract. Following this procedure, the samples are reconstituted with ethyl acetate, and relative fluorescence intensity is measured using a spectrofluorimeter. The method was validated as per CDER guidelines. The linearity of the method was found to be in the range of 100-500 ng/mL, with accuracy and precision lying within 2% RSD. The LOD and LOQ were found to be 15.3 and 46.1 ng/mL, respectively. The method was applied for pharmacokinetic evaluation in rats, and AUC, Cmax, and Tmax were found to be 5580 ± 1006 h × ng/mL, 1526 ± 209 ng/mL, and 2.97 ± 0.28 h, respectively, with a plasma half-life of 1.14 ± 0.27 h.
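
    Pharmacokinetic summary parameters like those reported (AUC, Cmax, Tmax) are standard non-compartmental quantities; a minimal sketch using the linear trapezoidal rule (illustrative, not the authors' software):

```python
import numpy as np

def nca_summary(times_h, conc):
    """Non-compartmental summary: AUC(0-t) by the linear trapezoidal rule,
    plus the observed Cmax and its time Tmax."""
    t = np.asarray(times_h, float)
    c = np.asarray(conc, float)
    auc = float(np.sum((c[1:] + c[:-1]) * np.diff(t)) / 2.0)
    i = int(np.argmax(c))
    return {"AUC": auc, "Cmax": float(c[i]), "Tmax": float(t[i])}
```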

  16. Compatible validated spectrofluorimetric and spectrophotometric methods for determination of vildagliptin and saxagliptin by factorial design experiments

    NASA Astrophysics Data System (ADS)

    Abdel-Aziz, Omar; Ayad, Miriam F.; Tadros, Mariam M.

    2015-04-01

    Simple, selective and reproducible spectrofluorimetric and spectrophotometric methods have been developed for the determination of vildagliptin and saxagliptin in bulk and their pharmaceutical dosage forms. The first proposed spectrofluorimetric method is based on the dansylation reaction of the amino group of vildagliptin with dansyl chloride to form a highly fluorescent product. The formed product was measured spectrofluorimetrically at 455 nm after excitation at 345 nm. Beer's law was obeyed in a concentration range of 100-600 μg ml-1. The second proposed spectrophotometric method is based on the charge transfer complex of saxagliptin with tetrachloro-1,4-benzoquinone (p-chloranil). The formed charge transfer complex was measured spectrophotometrically at 530 nm. Beer's law was obeyed in a concentration range of 100-850 μg ml-1. The third proposed spectrophotometric method is based on the condensation reaction of the primary amino group of saxagliptin with formaldehyde and acetyl acetone to form a yellow colored product known as Hantzsch reaction, measured at 342.5 nm. Beer's law was obeyed in a concentration range of 50-300 μg ml-1. All the variables were studied to optimize the reactions' conditions using factorial design. The developed methods were validated and proved to be specific and accurate for quality control of vildagliptin and saxagliptin in their pharmaceutical dosage forms.
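The factorial-design optimization of reaction conditions mentioned above can be illustrated with a small coded two-level full factorial. The factor names and responses below are hypothetical, chosen only to show how main effects are estimated from the design matrix:

```python
from itertools import product

# Hypothetical 2^3 full factorial for optimizing a derivatization
# reaction (factor names and responses are illustrative only).
factors = ["reagent_vol", "temperature", "reaction_time"]
levels = [-1, +1]  # coded low/high levels

runs = list(product(levels, repeat=len(factors)))
# One illustrative fluorescence response per run, in run order:
response = [41.0, 55.2, 43.5, 58.1, 40.2, 54.8, 44.1, 59.0]

def main_effect(i):
    """Main effect of factor i: mean response at +1 minus mean at -1."""
    hi = [y for run, y in zip(runs, response) if run[i] == +1]
    lo = [y for run, y in zip(runs, response) if run[i] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

for i, name in enumerate(factors):
    print(f"{name}: effect = {main_effect(i):+.2f}")
```

Factors with large main effects are then held at their favorable level; negligible effects indicate robustness to that variable.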

  17. Novel method and experimental validation of statistical calibration via Gaussianization in hot-wire anemometry

    NASA Astrophysics Data System (ADS)

    Gluzman, Igal; Cohen, Jacob; Oshman, Yaakov

    2016-11-01

We introduce a statistical method based on Gaussianization to estimate the nonlinear calibration curve of a hot-wire probe, which relates the input flow velocity to the output (measured) voltage. The method takes as input a measured sequence of voltage samples corresponding to different unknown flow velocities in the desired operational range, together with only two measured voltages whose (calibrated) flow velocities are known. The novel method is validated against standard calibration methods using data acquired by hot-wire probes in wind-tunnel experiments. We demonstrate the new calibration technique by placing the hot-wire probe in a region downstream of a cube-shaped body in a free stream of air. Testing the method relies on flow statistics that exist, among other locations, in a certain region of the turbulent wake formed downstream of the body, where two properties hold: first, the velocity signal should be as close to Gaussian as possible; second, the signal should cover the velocity range to be calibrated. The appropriate region in which to place the probe is determined by computing the first four statistical moments of the measured signals in different regions of the wake.
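Screening wake regions by their first four statistical moments can be sketched as follows. The moment formulas are standard; the voltage signal itself is simulated (assumed Gaussian) rather than measured:

```python
import random
import statistics

def moments(samples):
    """First four standardized moments: mean, variance,
    skewness, and excess kurtosis."""
    n = len(samples)
    mean = statistics.mean(samples)
    var = statistics.pvariance(samples, mean)
    std = var ** 0.5
    skew = sum(((x - mean) / std) ** 3 for x in samples) / n
    kurt = sum(((x - mean) / std) ** 4 for x in samples) / n - 3.0
    return mean, var, skew, kurt

# Illustrative stand-in for a hot-wire voltage record; a near-Gaussian
# signal has skewness ~ 0 and excess kurtosis ~ 0.
random.seed(0)
signal = [random.gauss(1.8, 0.05) for _ in range(10_000)]
mean, var, skew, kurt = moments(signal)
print(f"skewness = {skew:.3f}, excess kurtosis = {kurt:.3f}")
```

A region whose signal minimizes |skewness| and |excess kurtosis| while spanning the desired velocity range would be the candidate probe location.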

  18. Compatible validated spectrofluorimetric and spectrophotometric methods for determination of vildagliptin and saxagliptin by factorial design experiments.

    PubMed

    Abdel-Aziz, Omar; Ayad, Miriam F; Tadros, Mariam M

    2015-04-05

    Simple, selective and reproducible spectrofluorimetric and spectrophotometric methods have been developed for the determination of vildagliptin and saxagliptin in bulk and their pharmaceutical dosage forms. The first proposed spectrofluorimetric method is based on the dansylation reaction of the amino group of vildagliptin with dansyl chloride to form a highly fluorescent product. The formed product was measured spectrofluorimetrically at 455 nm after excitation at 345 nm. Beer's law was obeyed in a concentration range of 100-600 μg ml(-1). The second proposed spectrophotometric method is based on the charge transfer complex of saxagliptin with tetrachloro-1,4-benzoquinone (p-chloranil). The formed charge transfer complex was measured spectrophotometrically at 530 nm. Beer's law was obeyed in a concentration range of 100-850 μg ml(-1). The third proposed spectrophotometric method is based on the condensation reaction of the primary amino group of saxagliptin with formaldehyde and acetyl acetone to form a yellow colored product known as Hantzsch reaction, measured at 342.5 nm. Beer's law was obeyed in a concentration range of 50-300 μg ml(-1). All the variables were studied to optimize the reactions' conditions using factorial design. The developed methods were validated and proved to be specific and accurate for quality control of vildagliptin and saxagliptin in their pharmaceutical dosage forms.

  19. Development and validation of a fast SFC method for the analysis of flavonoids in plant extracts.

    PubMed

    Huang, Yang; Feng, Ying; Tang, Guangyun; Li, Minyi; Zhang, Tingting; Fillet, Marianne; Crommen, Jacques; Jiang, Zhengjin

    2017-03-12

Flavonoids from plants show a wide range of biological activities. In the present study, a rapid and highly efficient supercritical fluid chromatography (SFC) method was developed for the separation of 12 flavonoids. After careful optimization, the 12 flavonoids were baseline separated on a ZORBAX RX-SIL column using gradient elution. A 0.1% phosphoric acid solution in methanol was found to be the most suitable polar mobile phase component for the separation of flavonoids. From the viewpoint of retention and resolution, a backpressure of 200 bar and a temperature of 40°C were shown to give the best results. Compared with a previously developed reversed-phase liquid chromatography method, the SFC method provided flavonoid separations that were about three times faster, while maintaining good peak shape and comparable peak efficiency. This SFC method was validated and applied to the analysis of five flavonoids (kaempferol, luteolin, quercetin, luteoloside, buddleoside) present in Chrysanthemum morifolium Ramat. from different cultivars (Chuju, Gongju, Hangju, Boju). The results indicated good repeatability and sensitivity for the quantification of the five analytes, with RSDs for overall precision lower than 3%. The limits of detection ranged from 0.73 to 2.34 μg/mL, while the limits of quantification were between 2.19 and 5.86 μg/mL. The method showed that SFC can be employed as a useful tool for the quality assessment of Traditional Chinese medicines (TCMs) containing flavonoids as active components.

  20. A validated stability-indicating UPLC method for desloratadine and its impurities in pharmaceutical dosage forms.

    PubMed

    Rao, Dantu Durga; Satyanarayana, N V; Malleswara Reddy, A; Sait, Shakil S; Chakole, Dinesh; Mukkanti, K

    2010-02-05

A novel stability-indicating gradient reversed-phase ultra-performance liquid chromatographic (RP-UPLC) method was developed for the determination of the purity of desloratadine in the presence of its impurities and forced degradation products. The method was developed using a Waters Acquity BEH C18 column with a mobile phase containing a gradient mixture of solvents A and B. The eluted compounds were monitored at 280 nm. The run time was 8 min, within which desloratadine and its five impurities were well separated. Desloratadine was subjected to the stress conditions of oxidative, acid, base, hydrolytic, thermal and photolytic degradation. Desloratadine was found to degrade significantly under oxidative and thermal stress conditions and to be stable under acid, base, hydrolytic and photolytic degradation conditions. The degradation products were well resolved from the main peak and its impurities, thus proving the stability-indicating power of the method. The developed method was validated as per ICH guidelines with respect to specificity, linearity, limit of detection, limit of quantification, accuracy, precision and robustness. This method was also suitable for the assay determination of desloratadine in pharmaceutical dosage forms.

  1. Validation of a Stability-Indicating Method for Methylseleno-l-Cysteine (l-SeMC)

    PubMed Central

    Canady, Kristin; Cobb, Johnathan; Deardorff, Peter; Larson, Jami; White, Jonathan M.; Boring, Dan

    2016-01-01

    Methylseleno-l-cysteine (l-SeMC) is a naturally occurring amino acid analogue used as a general dietary supplement and is being explored as a chemopreventive agent. As a known dietary supplement, l-SeMC is not regulated as a pharmaceutical and there is a paucity of analytical methods available. To address the lack of methodology, a stability-indicating method was developed and validated to evaluate l-SeMC as both the bulk drug and formulated drug product (400 µg Se/capsule). The analytical approach presented is a simple, nonderivatization method that utilizes HPLC with ultraviolet detection at 220 nm. A C18 column with a volatile ion-pair agent and methanol mobile phase was used for the separation. The method accuracy was 99–100% from 0.05 to 0.15 mg/mL l-SeMC for the bulk drug, and 98–99% from 0.075 to 0.15 mg/mL l-SeMC for the drug product. Method precision was <1% for the bulk drug and was 3% for the drug product. The LOQ was 0.1 µg/mL l-SeMC or 0.002 µg l-SeMC on column. PMID:26199341

  2. Development and validation of a LC-MS method for quantitation of ergot alkaloids in lateral saphenous vein tissue

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A liquid chromatography-mass spectrometry (LC/MS) method for simultaneous quantitation of seven ergot alkaloids (lysergic acid, ergonovine, ergovaline, ergocornine, ergotamine, ergocryptine and ergocrystine) in vascular tissue was developed and validated. Reverse-phase chromatography, coupled to an...

  3. In-house validation and quality control of real-time PCR methods for GMO detection: a practical approach.

    PubMed

    Ciabatti, I; Froiio, A; Gatto, F; Amaddeo, D; Marchesi, U

    2006-01-01

    GMO detection and quantification methods in the EU are mainly based on real-time PCR. The analytical methods in use must be validated, first on an intra-laboratory scale and through a collaborative trial thereafter. Since a consensual protocol for intra-laboratory validation of real-time PCR methods is lacking, we provide a practical approach for the in-house validation of quantitative real-time PCR methods, establishing acceptability criteria and quality controls for PCR runs. Parameters such as limit of detection, limit of quantification, precision, trueness, linear dynamic range, PCR efficiency, robustness and specificity are considered. The protocol is sufficiently detailed to be directly applicable, increases the reliability of results and their harmonization among different laboratories, and represents a necessary preliminary step before proceeding to a time-consuming and costly full validation study.
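Of the parameters listed, PCR efficiency is conventionally derived from the slope of the standard curve (Cq versus log10 copy number) as E = 10^(-1/slope) - 1. A sketch with hypothetical Cq values (not from any particular assay):

```python
import statistics

# Hypothetical real-time PCR standard curve: log10(copy number) vs.
# mean Cq, used to derive the slope and amplification efficiency.
log_copies = [5, 4, 3, 2, 1]
cq = [18.1, 21.5, 24.8, 28.2, 31.6]

mx, my = statistics.mean(log_copies), statistics.mean(cq)
sxy = sum((x - mx) * (y - my) for x, y in zip(log_copies, cq))
sxx = sum((x - mx) ** 2 for x in log_copies)
slope = sxy / sxx

# Efficiency E = 10^(-1/slope) - 1; an ideal assay has slope ~ -3.32
# (E ~ 100%), and a common acceptance window is roughly 90-110%.
efficiency = 10 ** (-1 / slope) - 1
print(f"slope = {slope:.2f}, efficiency = {100 * efficiency:.1f}%")
```

The linear dynamic range is then taken as the span of standards over which this regression remains linear (e.g. R^2 above an acceptance threshold).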

  4. Development and validation of personal monitoring methods for low levels of acrylonitrile in workplace atmosphere. I. Test atmosphere generation and solvent desorption methods

    SciTech Connect

    Melcher, R.G.; Borders, R.A.; Coyne, L.B.

    1986-03-01

The purpose of this study was to optimize monitoring methods and to investigate new technology for the determination of low levels of acrylonitrile (0.05 to 5 ppm) in workplace atmospheres. In the first phase of the study, a dynamic atmosphere generation system was developed to produce low levels of acrylonitrile in simulated workplace atmospheres. Various potential sorbents were investigated in the second phase, and the candidate methods were compared in a laboratory validation study over a concentration range from 0.05 to 5 ppm acrylonitrile in the presence of potential interferences and under relative humidity conditions from 30% to 95% RH. A collection tube containing 600 mg of Pittsburgh coconut-base charcoal was found to be the optimum tube for sampling over a full 8-hr shift. No breakthrough was observed over the concentrations and humidities tested. The recovery was 91.3% with a total relative precision of +/-21% over the test range, and the recovery was not affected by storage for up to five weeks.

  5. Development and Validation of a New Fallout Transport Method Using Variable Spectral Winds

    NASA Astrophysics Data System (ADS)

    Hopkins, Arthur Thomas

A new method has been developed to incorporate variable winds into fallout transport calculations. The method uses spectral coefficients derived by the National Meteorological Center. Wind vector components are computed with the coefficients along the trajectories of falling particles. Spectral winds are used in the two-step method to compute dose rate on the ground, downwind of a nuclear cloud. First, the hotline is located by computing trajectories of particles from an initial, stabilized cloud, through spectral winds, to the ground. The line connecting the particle landing points is the hotline. Second, dose rate on and around the hotline is computed by analytically smearing the falling cloud's activity along the ground. The feasibility of using spectral winds for fallout particle transport was validated by computing Mount St. Helens ashfall locations and comparing calculations to fallout data. In addition, an ashfall equation was derived for computing volcanic ash mass/area on the ground. Ashfall data and the ashfall equation were used to back-calculate an aggregated particle size distribution for the Mount St. Helens eruption cloud. Further validation was performed by comparing computed and actual trajectories of a high explosive dust cloud (DIRECT COURSE). Using an error propagation formula, it was determined that uncertainties in spectral wind components produce less than four percent of the total dose rate variance. In summary, this research demonstrated the feasibility of using spectral coefficients for fallout transport calculations, developed a two-step smearing model to treat variable winds, and showed that uncertainties in spectral winds do not contribute significantly to the error in computed dose rate.
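The error-propagation step mentioned above can be illustrated with standard first-order (delta-method) variance propagation. The toy dose-rate model and all numbers below are made up purely to show the mechanics:

```python
# Illustrative first-order error propagation: the contribution of
# wind-component uncertainties to the variance of a computed dose
# rate D(u, v). The model form and numbers are hypothetical.
def dose_rate(u, v):
    # Toy stand-in for the fallout transport model's output
    return 100.0 - 2.0 * u + 1.5 * v

u0, v0 = 10.0, 5.0            # nominal wind components (m/s)
sigma_u, sigma_v = 0.8, 0.8   # assumed wind-component uncertainties

# Finite-difference sensitivities dD/du, dD/dv
h = 1e-4
dD_du = (dose_rate(u0 + h, v0) - dose_rate(u0 - h, v0)) / (2 * h)
dD_dv = (dose_rate(u0, v0 + h) - dose_rate(u0, v0 - h)) / (2 * h)

# var(D) ~ (dD/du)^2 var(u) + (dD/dv)^2 var(v)
var_D = dD_du ** 2 * sigma_u ** 2 + dD_dv ** 2 * sigma_v ** 2
print(f"sigma_D = {var_D ** 0.5:.2f}")
```

Comparing such a propagated variance against the total dose-rate variance is how a "less than four percent" contribution can be established.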

  6. A validated spectrofluorimetric method for the determination of nifuroxazide through coumarin formation using experimental design

    PubMed Central

    2013-01-01

Background Nifuroxazide (NF) is an oral nitrofuran antibiotic with a wide range of bactericidal activity against gram-positive and gram-negative enteropathogenic organisms. It is formulated either in single form, as an intestinal antiseptic, or in combination with drotaverine (DV) for the treatment of gastroenteritis accompanied by gastrointestinal spasm. Spectrofluorimetry is a convenient and sensitive technique for pharmaceutical quality control. The newly proposed spectrofluorimetric method allows NF determination either in single form or in binary mixture with DV. Furthermore, the experimental conditions were optimized using an experimental design approach, which has many advantages over the classical one-variable-at-a-time (OVAT) approach. Results A novel and sensitive spectrofluorimetric method was designed and validated for the determination of NF in pharmaceutical formulation. The method was based upon the formation of a highly fluorescent coumarin compound by the reaction between NF and ethyl acetoacetate (EAA) using sulfuric acid as catalyst. The fluorescence was measure