Science.gov

Sample records for purpose validated method

  1. Validity of a simple videogrammetric method to measure the movement of all hand segments for clinical purposes.

    PubMed

    Sancho-Bru, Joaquín L; Jarque-Bou, Néstor J; Vergara, Margarita; Pérez-González, Antonio

    2014-02-01

    Hand movement measurement is important in the clinical, ergonomics, and biomechanics fields. Videogrammetric techniques allow the measurement of hand movement without interfering with natural hand behaviour. However, accurate measurement of hand movement requires a high number of markers, which limits its applicability in clinical practice (60 markers would be needed for the hand and wrist). In this work, a simple method that uses a reduced number of markers (29), based on a simplified kinematic model of the hand, is proposed and evaluated. A set of experiments has been performed to evaluate the errors associated with the kinematic simplification, together with the method's accuracy, repeatability and reproducibility. The global error attributed to the kinematic simplification was 6.68°. The method has small repeatability and reproducibility errors (3.43° and 4.23°, respectively) and shows no statistically significant difference from electronic goniometers. The relevance of the work lies in the ability to measure all degrees of freedom of the hand with a reduced number of markers without interfering with natural hand behaviour, which makes the method suitable for clinical applications as well as for ergonomic and biomechanical purposes.
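
    The paper's simplified kinematic model is not given in the abstract, but the elementary computation behind any videogrammetric angle measurement is the angle formed at a joint by markers placed on adjacent segments. A minimal Python sketch with hypothetical marker coordinates (an illustration, not the authors' protocol):

      import numpy as np

      def joint_angle(p_prox, p_joint, p_dist):
          # Angle (degrees) at p_joint between the two adjacent segment vectors
          u = np.asarray(p_prox, float) - np.asarray(p_joint, float)
          v = np.asarray(p_dist, float) - np.asarray(p_joint, float)
          cos_t = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
          return np.degrees(np.arccos(np.clip(cos_t, -1.0, 1.0)))

      # Hypothetical 3-D marker positions (metres) for one finger in one frame
      print(joint_angle([0.00, 0.00, 0.00], [0.04, 0.00, 0.00], [0.07, 0.02, 0.00]))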

  2. Validating Analytical Methods

    ERIC Educational Resources Information Center

    Ember, Lois R.

    1977-01-01

    The procedures utilized by the Association of Official Analytical Chemists (AOAC) to develop, evaluate, and validate analytical methods for the analysis of chemical pollutants are detailed. Methods validated by AOAC are used by the EPA and FDA in their enforcement programs and are granted preferential treatment by the courts. (BT)

  3. Fit for purpose validated method for the determination of the strontium isotopic signature in mineral water samples by multi-collector inductively coupled plasma mass spectrometry

    NASA Astrophysics Data System (ADS)

    Brach-Papa, Christophe; Van Bocxstaele, Marleen; Ponzevera, Emmanuel; Quétel, Christophe R.

    2009-03-01

    A robust method allowing the routine determination of n(87Sr)/n(86Sr) with at least five significant decimal digits for large sets of mineral water samples is described. It is based on two consecutive chromatographic separations of Sr combined with multi-collector inductively coupled plasma mass spectrometry (MC-ICPMS) measurements. Separations are performed using commercial pre-packed columns filled with "Sr resin" to overcome isobaric interferences affecting the determination of strontium isotope ratios. The careful method validation scheme applied is described. It included investigations of all parameters influencing both the chromatographic separations and the MC-ICPMS measurements, as well as a test on a synthetic sample made of an aliquot of the NIST SRM 987 certified reference material dispersed in a saline matrix to mimic complex samples. Correction for mass discrimination was done internally using the n(88Sr)/n(86Sr) ratio. For comparing mineral waters originating from different geological backgrounds or identifying counterfeits, calculations used the well-known consensus value (1/0.1194) ± 0 as reference. The typical uncertainty budget estimated for these results was 40 'ppm' relative (k = 2). It increased to 150 'ppm' (k = 2) for the establishment of stand-alone results, taking into account a relative difference of about 126 'ppm' systematically observed between measured and certified values of the NIST SRM 987. In case of a suspected deviation of the n(88Sr)/n(86Sr) ratio (worst-case scenario), our proposal was to use the NIST SRM 987 value 8.37861 ± 0.00325 (k = 2) as reference and to assign a typical relative uncertainty budget of 300 'ppm' (k = 2). This method is thus fit for purpose and was applied to eleven French samples.
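
    Internal correction for mass discrimination of the kind described above is commonly implemented with the exponential fractionation law, normalizing to the consensus n(88Sr)/n(86Sr) value. A minimal Python sketch of such a correction with hypothetical raw readings (the authors' exact procedure may differ):

      import math

      # Atomic masses (u) of the relevant Sr isotopes
      M86, M87, M88 = 85.9092607, 86.9088775, 87.9056122
      R88_86_REF = 1.0 / 0.1194      # consensus n(88Sr)/n(86Sr) reference value

      def correct_87_86(r87_86_meas, r88_86_meas):
          # Exponential-law mass-bias factor from the measured 88/86 ratio
          f = math.log(R88_86_REF / r88_86_meas) / math.log(M88 / M86)
          # Apply the same law to correct the measured 87/86 ratio
          return r87_86_meas * (M87 / M86) ** f

      # Hypothetical raw MC-ICPMS ratios for one mineral water sample
      print(round(correct_87_86(0.71210, 8.4500), 5))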

  4. Validation of NDE methods

    NASA Astrophysics Data System (ADS)

    Phares, Brent M.; Washer, Glenn A.; Moore, Mark; Graybeal, Benjamin A.

    1999-02-01

    The NDE Validation Center is a national resource for the independent and quantitative evaluation of existing and emerging NDE techniques. The resources of the NDE Validation Center are available to federal and state agencies, the academic community, and industry. The NDE Validation Center is designed to perform critical evaluations of NDE technologies and to provide a source of information and guidance to users and developers of NDE systems. This paper describes the resources available at the Center and the initial efforts to validate the visual inspection of highway bridges. Efforts to evaluate various NDE methods for the inspection of bridge hanger pins are also described.

  5. Simple validated LC-MS/MS method for the determination of atropine and scopolamine in plasma for clinical and forensic toxicological purposes.

    PubMed

    Koželj, Gordana; Perharič, Lucija; Stanovnik, Lovro; Prosen, Helena

    2014-08-01

    A liquid chromatography-tandem mass spectrometry (LC-MS/MS) method for the determination of atropine and scopolamine in 100 μL human plasma was developed and validated. Sample pretreatment consisted of protein precipitation with acetonitrile followed by a concentration step. Analytes and levobupivacaine (internal standard) were separated on a Zorbax XDB-CN column (75 mm × 4.6 mm i.d., 3.5 μm) with gradient elution (purified water, acetonitrile, formic acid). The triple quadrupole MS was operated in ESI positive mode. The matrix effect was estimated for deproteinised plasma samples. Selected reaction monitoring (SRM) was used for quantification in the range of 0.10-50.00 ng/mL. Interday precision for both tropanes and intraday precision for atropine were <10%; intraday precision for scopolamine was <14%, and <18% at the lower limit of quantification (LLOQ). Mean interday and intraday accuracies for atropine were within ±7% and for scopolamine within ±11%. The method can be used for determination of therapeutic and toxic levels of both compounds and has been successfully applied to a study of the pharmacodynamic and pharmacokinetic properties of tropanes, in which plasma samples of volunteers were collected at fixed time intervals after ingestion of a buckwheat meal spiked with five low doses of tropanes.
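
    Quantification in assays of this kind typically regresses the analyte/internal-standard peak-area ratio against spiked concentration, often with 1/x^2 weighting. A hypothetical calibration sketch in Python (the weighting choice and all numbers are illustrative, not taken from the paper):

      import numpy as np

      # Hypothetical calibration: spiked concentration (ng/mL) vs peak-area ratio
      conc  = np.array([0.10, 0.50, 1.0, 5.0, 10.0, 50.0])
      ratio = np.array([0.011, 0.052, 0.10, 0.51, 1.02, 5.05])

      # 1/x^2 weighting of squared residuals -> pass 1/x to polyfit's w argument
      slope, intercept = np.polyfit(conc, ratio, 1, w=1.0 / conc)

      def back_calc(sample_ratio):
          # Concentration of an unknown back-calculated from its peak-area ratio
          return (sample_ratio - intercept) / slope

      print(round(back_calc(0.26), 3))   # ng/mL in a hypothetical plasma sample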

  6. Homework Purpose Scale for Middle School Students: A Validation Study

    ERIC Educational Resources Information Center

    Xu, Jianzhong

    2011-01-01

    The purpose of the present study is to test the validity of scores on the Homework Purpose Scale (HPS) for middle school students. The participants were 1,181 eighth graders in the southeastern United States, including (a) 699 students in urban school districts and (b) 482 students in rural school districts. First, confirmatory factor analysis was…

  7. Review of surface steam sterilization for validation purposes.

    PubMed

    van Doornmalen, Joost; Kopinga, Klaas

    2008-03-01

    Sterilization is an essential step in the process of producing sterile medical devices. To guarantee sterility, the process of sterilization must be validated. Because there is no direct way to measure sterility, the techniques applied to validate the sterilization process are based on statistical principles. Steam sterilization is the most frequently applied sterilization method worldwide and can be validated either by indicators (chemical or biological) or physical measurements. The steam sterilization conditions are described in the literature. Starting from these conditions, criteria for the validation of steam sterilization are derived and can be described in terms of physical parameters. Physical validation of steam sterilization appears to be an adequate and efficient validation method that could be considered as an alternative for indicator validation. Moreover, physical validation can be used for effective troubleshooting in steam sterilizing processes. PMID:18313509

  8. VAN method lacks validity

    NASA Astrophysics Data System (ADS)

    Jackson, David D.; Kagan, Yan Y.

    Varotsos and colleagues (the VAN group) claim to have successfully predicted many earthquakes in Greece. Several authors have refuted these claims, as reported in the May 27, 1996, special issue of Geophysical Research Letters and a recent book, A Critical Review of VAN [Lighthill 1996]. Nevertheless, the myth persists. Here we summarize why the VAN group's claims lack validity. The VAN group observes electrical potential differences that they call "seismic electric signals" (SES) weeks before and hundreds of kilometers away from some earthquakes, claiming that SES are somehow premonitory. This would require that increases in stress or decreases in strength cause the electrical variations, or that some regional process first causes the electrical signals and then helps trigger the earthquakes. Here we adopt their notation SES to refer to the electrical variations, without accepting any link to the quakes.

  9. Development of a systematic computer vision-based method to analyse and compare images of false identity documents for forensic intelligence purposes-Part I: Acquisition, calibration and validation issues.

    PubMed

    Auberson, Marie; Baechler, Simon; Zasso, Michaël; Genessay, Thibault; Patiny, Luc; Esseiva, Pierre

    2016-03-01

    Following their detection and seizure by police and border guard authorities, false identity and travel documents are usually scanned, producing digital images. This research investigates the potential of these images to classify false identity documents, highlight links between documents produced by the same modus operandi or the same source, and thus support forensic intelligence efforts. Inspired by previous research work on digital images of Ecstasy tablets, a systematic and complete method has been developed to acquire, collect, process and compare images of false identity documents. This first part of the article highlights the critical steps of the method and the development of a prototype that processes regions of interest extracted from images. Acquisition conditions have been fine-tuned in order to optimise the reproducibility and comparability of images. Different filters and comparison metrics have been evaluated, and the performance of the method has been assessed using two calibration and validation sets of documents, made up of 101 Italian driving licenses and 96 Portuguese passports seized in Switzerland, among which some were known to come from common sources. Results indicate that the use of Hue and Edge filters, or their combination, to extract profiles from images, followed by comparison of the profiles with a Canberra distance-based metric, provides the most accurate classification of documents. The method also appears to be quick, efficient and inexpensive. It can easily be operated from remote locations and shared amongst different organisations, which makes it very convenient for future operational applications. The method could serve as a first fast triage method that may help target more resource-intensive profiling methods (based on a visual, physical or chemical examination of documents, for instance). Its contribution to forensic intelligence and its application to several sets of false identity documents seized by police and border guards will be
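
    The profile comparison step described above rests on a Canberra distance-based metric, which weights each coordinate difference by the magnitudes of the coordinates. A minimal Python sketch using SciPy (the profile values are hypothetical):

      import numpy as np
      from scipy.spatial.distance import canberra

      # Hypothetical hue/edge profiles extracted from two scanned documents
      profile_a = np.array([0.12, 0.40, 0.25, 0.05, 0.18])
      profile_b = np.array([0.10, 0.42, 0.27, 0.04, 0.17])

      # Canberra distance: sum(|a - b| / (|a| + |b|)) over profile coordinates;
      # small values suggest the documents may share a common source
      print(canberra(profile_a, profile_b))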

  10. [Validation and verification of microbiology methods].

    PubMed

    Camaró-Sala, María Luisa; Martínez-García, Rosana; Olmos-Martínez, Piedad; Catalá-Cuenca, Vicente; Ocete-Mochón, María Dolores; Gimeno-Cardona, Concepción

    2015-01-01

    Clinical microbiologists should ensure, to the maximum level allowed by scientific and technical development, the reliability of their results. This implies that, in addition to meeting the technical criteria that ensure their validity, tests must be performed under conditions that allow comparable results to be obtained regardless of the laboratory performing them. In this sense, the use of recognized and accepted reference methods is the most effective tool for providing these guarantees. Activities related to the verification and validation of analytical methods have become very important, as there is continuous development and updating of techniques, increasingly complex analytical equipment, and an interest among professionals in ensuring quality processes and results. The definitions of validation and verification are described, along with the different types of validation/verification, the types of methods, and the level of validation necessary depending on the degree of standardization. The situations in which validation/verification is mandatory and/or recommended are discussed, including those particularly related to validation in Microbiology. The paper stresses the importance of promoting the use of reference strains as controls in Microbiology and the use of standard controls, as well as the importance of participation in External Quality Assessment programs to demonstrate technical competence. Emphasis is placed on how to calculate some of the parameters required for validation/verification, such as accuracy and precision. The development of these concepts can be found in the microbiological procedure SEIMC number 48: «Validation and verification of microbiological methods» www.seimc.org/protocols/microbiology. PMID:24958671

  11. Purpose in Life Test assessment using latent variable methods.

    PubMed

    Harlow, L L; Newcomb, M D; Bentler, P M

    1987-09-01

    A psychometric assessment was conducted on a slightly revised version of the Purpose in Life Test (PIL-R). Factor analyses revealed a large general factor plus four primary factors comprising lack of purpose in life, positive sense of purpose, motivation for meaning, and existential confusion. Validity models showed that the PIL-R was positively related to a construct of happiness and was negatively related to suicidality and meaninglessness. Reliability estimates ranged from 0.78 to 0.86. The revised version can be presented compactly and may be less confusing to subjects than the original PIL. PMID:3664045

  12. ASTM Validates Air Pollution Test Methods

    ERIC Educational Resources Information Center

    Chemical and Engineering News, 1973

    1973-01-01

    The American Society for Testing and Materials (ASTM) has validated six basic methods for measuring pollutants in ambient air as the first part of its Project Threshold. Aim of the project is to establish nationwide consistency in measuring pollutants; determining precision, accuracy and reproducibility of 35 standard measuring methods. (BL)

  13. Fit-for-purpose bioanalytical cross-validation for LC-MS/MS assays in clinical studies.

    PubMed

    Xu, Xiaohui; Ji, Qin C; Jemal, Mohammed; Gleason, Carol; Shen, Jim X; Stouffer, Bruce; Arnold, Mark E

    2013-01-01

    The paradigm shift of globalized research and the conduct of clinical studies at different geographic locations worldwide to access broader patient populations have resulted in an increased need to correlate bioanalytical results generated in multiple laboratories, often across national borders. Cross-validations of bioanalytical methods are often implemented to demonstrate the equivalency of bioanalytical results. Regulatory agencies, such as the US FDA and the European Medicines Agency, have included the requirement for cross-validations in their respective bioanalytical validation guidance and guidelines. While those documents provide high-level expectations, the detailed implementation is at the discretion of each individual organization. At Bristol-Myers Squibb, we practice a fit-for-purpose approach to conducting cross-validations for small-molecule bioanalytical methods using LC-MS/MS. A step-by-step proposal on the overall strategy, procedures and technical details for conducting a successful cross-validation is presented herein. A case study utilizing the proposed cross-validation approach to rule out method variability as the potential cause of high variance observed in PK studies is also presented. PMID:23256474

  14. Model-Based Method for Sensor Validation

    NASA Technical Reports Server (NTRS)

    Vatan, Farrokh

    2012-01-01

    Fault detection, diagnosis, and prognosis are essential tasks in the operation of autonomous spacecraft, instruments, and in situ platforms. One of NASA's key mission requirements is robust state estimation. Sensing, using a wide range of sensors and sensor fusion approaches, plays a central role in robust state estimation, and there is a need to diagnose sensor failure as well as component failure. Sensor validation can be considered part of the larger effort of improving reliability and safety. The standard methods for solving the sensor validation problem are based on probabilistic analysis of the system, of which the method based on Bayesian networks is the most popular. Such methods can only predict the most probable faulty sensors, subject to the initial probabilities defined for the failures. The method developed in this work is based on a model-based approach and provides the faulty sensors (if any) that can be logically inferred from the model of the system and the sensor readings (observations). The method is also more suitable for systems in which it is hard, or even impossible, to find the probability functions of the system. The method starts with a new mathematical description of the problem and develops a very efficient and systematic algorithm for its solution. The method builds on the concept of analytical redundancy relations (ARRs).

  15. Uncertainty profiles for the validation of analytical methods.

    PubMed

    Saffaj, T; Ihssane, B

    2011-09-15

    This article aims to present a new global strategy for the validation of analytical methods and the estimation of measurement uncertainty. Our purpose is to give researchers in the field of analytical chemistry access to a powerful tool for the evaluation of quantitative analytical procedures. Indeed, the proposed strategy facilitates analytical validation by providing a decision tool based on the uncertainty profile and the β-content tolerance interval. Equally important, this approach allows a good estimate of measurement uncertainty using the validation data, without recourse to additional experiments. In the example presented, we confirmed the applicability of this new strategy to the validation of a chromatographic bioanalytical method and obtained a good estimate of the measurement uncertainty without any extra effort or additional experiments. A comparative study with the SFSTP approach showed that both strategies selected the same calibration functions. The holistic character of the measurement uncertainty compared to the total error was influenced by our choice of uncertainty profile. Nevertheless, we think that adopting the uncertainty approach at the validation stage controls the risk of using the analytical method in the routine phase.
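
    A β-content tolerance interval of the kind used in such uncertainty profiles can be approximated in closed form with Howe's method. A Python sketch under that assumption (the authors' exact construction may differ), applied to hypothetical validation recoveries:

      import numpy as np
      from scipy import stats

      def beta_content_tolerance(x, beta=0.90, alpha=0.05):
          # Two-sided beta-content tolerance interval, Howe's approximation
          x = np.asarray(x, float)
          n, nu = len(x), len(x) - 1
          z = stats.norm.ppf((1 + beta) / 2)
          k = np.sqrt(nu * (1 + 1 / n) * z**2 / stats.chi2.ppf(alpha, nu))
          return x.mean() - k * x.std(ddof=1), x.mean() + k * x.std(ddof=1)

      # Hypothetical recoveries (%) from validation runs at one concentration
      print(beta_content_tolerance([98.2, 101.5, 99.8, 100.9, 97.6, 100.2]))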

  16. [Methods of risk assessment and their validation].

    PubMed

    Baracco, Alessandro

    2014-01-01

    The review of the literature shows several methods for the risk assessment of biomechanical overload of the musculoskeletal system in activities involving repetitive strain of the upper limbs and manual material handling. The application of these methods should allow the quantification of risk for the working population, the identification of preventive measures to reduce the risk, the assessment of their effectiveness, and the design of a specific health surveillance scheme. In this paper we analyze the factors which must be taken into account in Occupational Medicine to implement a process of validation of these methods. In conclusion, we believe that new methods will be needed in the future that are able to analyze and reduce risk already in the design phase of the production process. PMID:25558718

  17. Validation of self-reported motor-vehicle crash and related work leave in a multi-purpose prospective cohort.

    PubMed

    Pons-Villanueva, Juan; Seguí-Gómez, María

    2010-12-01

    Validation studies of self-reported motor vehicle crashes and related work leave are scarce. The Seguimiento Universidad de Navarra (SUN) study is a multi-purpose cohort study undertaken in Spain, within which motor vehicle crash risk factors are assessed. The objective of this study was to validate self-reported incidents of motor vehicle crashes (MVC) and related work leave through mailing and review of clinical notes. The method included questions in the cohort's questionnaire regarding motor vehicle crash incidents and work leave. We performed both a re-test and a criterion validity study for the self-reported answers. The results show a moderate Cohen's κ coefficient for both events: re-test reliability κ was 0.55 for MVC and 0.53 for MVC-related work leave, while criterion validity κ was 0.25 for MVC and 0.25 for MVC-related work leave. These results indicate moderate re-test agreement for both MVC and MVC-related work leave and fair agreement for criterion validity. The magnitude of the agreement is similar to that of comparable studies and allows the use of these data in epidemiological studies.
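
    The agreement statistic reported here, Cohen's κ, corrects the observed agreement for the agreement expected by chance. A self-contained Python sketch with a hypothetical 2 x 2 test/re-test table (not the study's data):

      import numpy as np

      def cohens_kappa(table):
          # table: square agreement matrix (rows: test, columns: re-test)
          t = np.asarray(table, float)
          n = t.sum()
          p_obs = np.trace(t) / n                                # observed agreement
          p_exp = (t.sum(axis=1) * t.sum(axis=0)).sum() / n**2   # chance agreement
          return (p_obs - p_exp) / (1 - p_exp)

      # Hypothetical yes/no crash reports at test and re-test
      print(round(cohens_kappa([[30, 10], [12, 148]]), 2))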

  18. Validation of analytical methods involved in dissolution assays: acceptance limits and decision methodologies.

    PubMed

    Rozet, E; Ziemons, E; Marini, R D; Boulanger, B; Hubert, Ph

    2012-11-01

    Dissolution tests are key elements to ensure continuing product quality and performance. The ultimate goal of these tests is to assure consistent product quality within a defined set of specification criteria. Validation of an analytical method aimed at assessing the dissolution profile of products or at verifying pharmacopoeial compliance should demonstrate that the method is able to correctly declare two dissolution profiles as similar, or drug products as compliant with their specifications. It is essential to ensure that these analytical methods are fit for their purpose, and method validation is aimed at providing this guarantee. However, even the ICH Q2 guideline gives no information on how to decide whether a method under validation is valid for its final purpose. Are all the validation criteria needed to ensure that a Quality Control (QC) analytical method for a dissolution test is valid? What acceptance limits should be set on these criteria? How should a method's validity be decided? These are the questions this work aims to answer. Focus is placed on complying with the current implementation of Quality by Design (QbD) principles in the pharmaceutical industry, in order to correctly define the Analytical Target Profile (ATP) of analytical methods involved in dissolution tests. Analytical method validation is then the natural demonstration that the developed methods are fit for their intended purpose, rather than the indiscriminate checklist approach still generally performed to complete the filing required to obtain product marketing authorization. PMID:23084050

  1. Space Suit Joint Torque Measurement Method Validation

    NASA Technical Reports Server (NTRS)

    Valish, Dana; Eversley, Karina

    2012-01-01

    In 2009 and early 2010, a test method was developed and performed to quantify the torque required to manipulate joints in several existing operational and prototype space suits. This was done in an effort to develop joint torque requirements appropriate for a new Constellation Program space suit system. The same test method was levied on the Constellation space suit contractors to verify that their suit design met the requirements. However, because the original test was set up and conducted by a single test operator, there was some question as to whether this method was repeatable enough to be considered a standard verification method for Constellation or other future development programs. In order to validate the method itself, a representative subset of the previous test was repeated, using the same information that would be available to space suit contractors, but set up and conducted by someone not familiar with the previous test. The resultant data were compared using graphical and statistical analysis; the results indicated a significant variance in the values reported for a subset of the re-tested joints. Potential variables that could have affected the data were identified, and a third round of testing was conducted in an attempt to eliminate and/or quantify the effects of these variables. The results of the third test effort will be used to determine whether or not the proposed joint torque methodology can be applied to future space suit development contracts.

  2. Formal methods and digital systems validation for airborne systems

    NASA Technical Reports Server (NTRS)

    Rushby, John

    1993-01-01

    This report has been prepared to supplement a forthcoming chapter on formal methods in the FAA Digital Systems Validation Handbook. Its purpose is as follows: to outline the technical basis for formal methods in computer science; to explain the use of formal methods in the specification and verification of software and hardware requirements, designs, and implementations; to identify the benefits, weaknesses, and difficulties in applying these methods to digital systems used on board aircraft; and to suggest factors for consideration when formal methods are offered in support of certification. These latter factors assume the context for software development and assurance described in RTCA document DO-178B, 'Software Considerations in Airborne Systems and Equipment Certification,' Dec. 1992.

  3. Validation of Multilevel Constructs: Validation Methods and Empirical Findings for the EDI

    ERIC Educational Resources Information Center

    Forer, Barry; Zumbo, Bruno D.

    2011-01-01

    The purposes of this paper are to highlight the foundations of multilevel construct validation, describe two methodological approaches and associated analytic techniques, and then apply these approaches and techniques to the multilevel construct validation of a widely-used school readiness measure called the Early Development Instrument (EDI;…

  4. Some useful statistical methods for model validation.

    PubMed Central

    Marcus, A H; Elias, R W

    1998-01-01

    Although formal hypothesis tests provide a convenient framework for displaying the statistical results of empirical comparisons, standard tests should not be used without consideration of underlying measurement error structure. As part of the validation process, predictions of individual blood lead concentrations from models with site-specific input parameters are often compared with blood lead concentrations measured in field studies that also report lead concentrations in environmental media (soil, dust, water, paint) as surrogates for exposure. Measurements of these environmental media are subject to several sources of variability, including temporal and spatial sampling, sample preparation and chemical analysis, and data entry or recording. Adjustments for measurement error must be made before statistical tests can be used to empirically compare environmental data with model predictions. This report illustrates the effect of measurement error correction using a real dataset of child blood lead concentrations for an undisclosed midwestern community. We illustrate both the apparent failure of some standard regression tests and the success of adjustment of such tests for measurement error using the SIMEX (simulation-extrapolation) procedure. This procedure adds simulated measurement error to model predictions and then subtracts the total measurement error, analogous to the method of standard additions used by analytical chemists. PMID:9860913
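
    SIMEX, as summarized above, deliberately adds extra simulated measurement error at several multiples of the known error variance, tracks how the estimate degrades, and extrapolates back to the zero-error case. A compact Python sketch for a regression slope on synthetic data (the report's blood-lead application would differ in detail):

      import numpy as np

      rng = np.random.default_rng(1)

      def simex_slope(x_obs, y, sigma_u, lambdas=(0.5, 1.0, 1.5, 2.0), b=200):
          # Naive slope at lambda = 0, then mean slopes with extra error added
          lams, slopes = [0.0], [np.polyfit(x_obs, y, 1)[0]]
          for lam in lambdas:
              sims = [np.polyfit(x_obs + rng.normal(0, np.sqrt(lam) * sigma_u,
                                                    len(x_obs)), y, 1)[0]
                      for _ in range(b)]
              lams.append(lam)
              slopes.append(np.mean(sims))
          # Quadratic extrapolation of slope(lambda) back to lambda = -1 (no error)
          return np.polyval(np.polyfit(lams, slopes, 2), -1.0)

      # Synthetic example: x observed with error of known sigma_u = 1.0
      x_true = rng.uniform(0, 10, 120)
      y = 2.0 * x_true + rng.normal(0, 1, 120)
      x_obs = x_true + rng.normal(0, 1.0, 120)
      print(round(simex_slope(x_obs, y, sigma_u=1.0), 2))  # naive fit is attenuated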

  5. OWL-based reasoning methods for validating archetypes.

    PubMed

    Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás

    2013-04-01

    Some modern Electronic Healthcare Record (EHR) architectures and standards are based on the dual model-based architecture, which defines two conceptual levels: the reference model and the archetype model. Such architectures represent EHR domain knowledge by means of archetypes, which are considered by many researchers to play a fundamental role in the achievement of semantic interoperability in healthcare. Consequently, formal methods for validating archetypes are necessary. In recent years, there has been increasing interest in exploring how semantic web technologies in general, and ontologies in particular, can facilitate the representation and management of archetypes, including binding to terminologies, but no solution based on such technologies has been provided to date to validate archetypes. Our approach represents archetypes by means of OWL ontologies. This makes it possible to combine the two levels of the dual model-based architecture in one modeling framework, which can also integrate terminologies available in OWL format. The validation method consists of reasoning on those ontologies to find modeling errors in archetypes: incorrect restrictions over the reference model, non-conformant archetype specializations, and inconsistent terminological bindings. The archetypes available in the repositories supported by the openEHR Foundation and the NHS Connecting for Health Program, the two largest publicly available ones, have been analyzed with our validation method. For this purpose, we have implemented a software tool called Archeck. Our results show that around one fifth of archetype specializations contain modeling errors, the most common mistakes being related to coded terms and terminological bindings. The analysis of each repository reveals that different patterns of errors are found in the two repositories. This result reinforces the need for serious efforts to improve archetype design processes. PMID:23246613

  6. The method of measurement system software automatic validation using business rules management system

    NASA Astrophysics Data System (ADS)

    Zawistowski, Piotr

    2015-09-01

    A method for the automatic validation of measurement system software using a business rules management system (BRMS) is discussed in this paper. The article contains a description of this new approach to validating measurement system execution, a description of the implementation of the system that supports the validation, and examples documenting the correctness of the approach. In the new approach, a BRMS is used to validate measurement system execution. Such systems have previously been used neither for software execution validation nor for measurement systems. The benefits of using them for these purposes are discussed as well.

  7. Evaluating regional vulnerability to climate change: purposes and methods

    SciTech Connect

    Malone, Elizabeth L.; Engle, Nathan L.

    2011-03-15

    As the emphasis in climate change research, international negotiations, and developing-country activities has shifted from mitigation to adaptation, vulnerability has emerged as a bridge between impacts on one side and the need for adaptive changes on the other. Still, the term vulnerability remains abstract, its meaning changing with the scale, focus, and purpose of each assessment. Understanding of regional vulnerability has advanced over the past several decades, with studies using a combination of indicators, case studies and analogues, stakeholder-driven processes, and scenario-building methodologies. As regions become increasingly relevant scales of inquiry for bridging the aggregate and the local, it is perhaps most appropriate for every analysis to ask three "what" questions: "What/who is vulnerable?", "What is vulnerability?", and "Vulnerable to what?" The answers to these questions will yield different definitions of vulnerability as well as different methods for assessing it.

  8. Purpose and methods of a Pollution Prevention Awareness Program

    SciTech Connect

    Flowers, P.A.; Irwin, E.F.; Poligone, S.E.

    1994-08-15

    The purpose of the Pollution Prevention Awareness Program (PPAP), which is required by DOE Order 5400.1, is to foster the philosophy that prevention is superior to remediation. The goal of the program is to incorporate pollution prevention into the decision-making process at every level throughout the organization. The objectives are to instill awareness, disseminate information, provide training and rewards for identifying the true source or cause of wastes, and encourage employee participation in solving environmental issues and preventing pollution. PPAP at the Oak Ridge Y-12 Plant was created several years ago and continues to grow. We believe that we have implemented several unique methods of communicating environmental awareness to promote a more active work force in identifying ways of reducing pollution.

  9. A Clinical Method for Identifying Scapular Dyskinesis, Part 2: Validity

    PubMed Central

    Tate, Angela R; McClure, Philip; Kareha, Stephen; Irwin, Dominic; Barbe, Mary F

    2009-01-01

    Context: Although clinical methods for detecting scapular dyskinesis have been described, evidence supporting the validity of these methods is lacking. Objective: To determine the validity of the scapular dyskinesis test, a visually based method of identifying abnormal scapular motion. A secondary purpose was to explore the relationship between scapular dyskinesis and shoulder symptoms. Design: Validation study comparing 3-dimensional measures of scapular motion among participants clinically judged as having either normal motion or scapular dyskinesis. Setting: University athletic training facilities. Patients or Other Participants: A sample of 142 collegiate athletes (National Collegiate Athletic Association Division I and Division III) participating in sports requiring overhead use of the arm was rated, and 66 of these underwent 3-dimensional testing. Intervention(s): Volunteers were viewed by 2 raters while performing weighted shoulder flexion and abduction. The right and left sides were rated independently as normal, subtle dyskinesis, or obvious dyskinesis using the scapular dyskinesis test. Symptoms were assessed using the Penn Shoulder Score. Main Outcome Measure(s): Athletes judged as having either normal motion or obvious dyskinesis underwent 3-dimensional electromagnetic kinematic testing while performing the same movements. The kinematic data from both groups were compared via multifactor analysis of variance with post hoc testing using the least significant difference procedure. The relationship between symptoms and scapular dyskinesis was evaluated by odds ratios. Results: Differences were found between the normal and obvious dyskinesis groups. Participants with obvious dyskinesis showed less scapular upward rotation (P < .001), less clavicular elevation (P < .001), and greater clavicular protraction (P = .044). The presence of shoulder symptoms was not different between the normal and obvious dyskinesis volunteers (odds ratio = 0.79, 95
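
    The symptom association reported here is an odds ratio, which for a 2 x 2 table is a simple cross-product of cell counts. A short Python illustration with hypothetical counts (not the study's data):

      # Hypothetical counts: [symptomatic, asymptomatic] in each rating group
      dyskinesis = [11, 24]
      normal = [13, 18]

      # Odds of symptoms in each group, and their ratio
      odds_ratio = (dyskinesis[0] / dyskinesis[1]) / (normal[0] / normal[1])
      print(round(odds_ratio, 2))   # values near 1 indicate little association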

  10. The Value of Qualitative Methods in Social Validity Research

    ERIC Educational Resources Information Center

    Leko, Melinda M.

    2014-01-01

    One quality indicator of intervention research is the extent to which the intervention has a high degree of social validity, or practicality. In this study, I drew on Wolf's framework for social validity and used qualitative methods to ascertain five middle schoolteachers' perceptions of the social validity of System 44®--a phonics-based…

  11. Statistical Methods for Rapid Aerothermal Analysis and Design Technology: Validation

    NASA Technical Reports Server (NTRS)

    DePriest, Douglas; Morgan, Carolyn

    2003-01-01

    The cost and safety goals for NASA's next generation of reusable launch vehicles (RLVs) will require that rapid high-fidelity aerothermodynamic design tools be used early in the design cycle. To meet these requirements, it is desirable to identify adequate statistical models that quantify and improve the accuracy, extend the applicability, and enable combined analyses using existing prediction tools. The initial research work focused on establishing suitable candidate models for these purposes. The second phase focused on assessing the performance of these models in accurately predicting the heat rate for a given candidate data set. This validation work compared models and methods that may be useful in predicting the heat rate.

  12. Measurement practices: methods for developing content-valid student examinations.

    PubMed

    Bridge, Patrick D; Musial, Joseph; Frank, Robert; Roe, Thomas; Sawilowsky, Shlomo

    2003-07-01

    Measurement experts generally agree that a systematic approach to test construction will probably result in an instrument with sound psychometric properties. One fundamental method is called the blueprint approach to test construction. A test blueprint is a tool used in the process of generating content-valid exams by linking the subject matter delivered during instruction and the items appearing on the test. Unfortunately, this procedure, as well as other educational measurement practices, is often overlooked. A survey of curriculum administrators at 144 United States and international medical schools was conducted to assess the importance and prevalence of test blueprinting in their schools. Although most found test blueprinting to be very important, few require the practice. The purpose of this paper is to review the fundamental principles associated with achieving a high level of content validity when developing tests for students. The short-term efforts necessary to develop and integrate measurement theory into practice will lead to long-term gains for students, faculty and academic institutions.

  13. Estimates of External Validity Bias When Impact Evaluations Select Sites Purposively

    ERIC Educational Resources Information Center

    Stuart, Elizabeth A.; Olsen, Robert B.; Bell, Stephen H.; Orr, Larry L.

    2012-01-01

    While there has been some increasing interest in external validity, most work to this point has been in assessing the similarity of a randomized trial sample and a population of interest (e.g., Stuart et al., 2010; Tipton, 2011). The goal of this research is to calculate empirical estimates of the external validity bias in educational intervention…

  14. Purpose in Life in Emerging Adulthood: Development and Validation of a New Brief Measure

    PubMed Central

    Hill, Patrick L.; Edmonds, Grant W.; Peterson, Missy; Luyckx, Koen; Andrews, Judy A.

    2015-01-01

    Accruing evidence points to the value of studying purpose in life across adolescence and emerging adulthood. Research, though, is needed to understand the unique role of purpose in life in predicting well-being and developmentally relevant outcomes during emerging adulthood. The current studies (total n = 669) found support for a new brief measure of purpose in life developed using data from American and Canadian samples, while demonstrating evidence for two important findings. First, purpose in life predicted well-being during emerging adulthood, even when controlling for the Big Five personality traits. Second, purpose in life was positively associated with self-image and negatively associated with delinquency, again controlling for personality traits. Findings are discussed with respect to how studying purpose in life can help in understanding which individuals are more likely to experience positive transitions into adulthood. PMID:26958072

  15. Capillary electrophoresis as an orthogonal technique in HPLC method validation.

    PubMed

    Jimidar, M Ilias; De Smet, Maurits; Sneyers, Rudy; Van Ael, Willy; Janssens, Willy; Redlich, Dirk; Cockaerts, Paul

    2003-01-01

    High-performance liquid chromatography is usually used to assay the main compound and organic impurity content of drug substance and drug product during pharmaceutical development. A crucial validation parameter of these methods is specificity: the ability to unequivocally assess the analyte in the presence of components expected to be present. Typically, these include impurities, degradation products, and matrices. Besides adequate chromatographic separation with sufficient selectivity, additional 2- or 3-D spectroscopic or chromatographic tools are frequently necessary for this purpose. In our current practice, HPLC is used with ultraviolet photodiode array detection and on-line mass spectrometry (LC-UVDAD-MS) during the assessment of specificity. Although this approach is very powerful and can solve the majority of problems, separation of isomers of the main compound is still difficult. Since HPLC usually cannot offer the required selectivity, and because of their similar molecular weights, structural isomers are not specifically detected using LC-MS. Capillary electrophoresis, on the other hand, offers high separation efficiency and can be applied as an adjunct to HPLC. Therefore, a set of highly selective CE methods is used orthogonally in the specificity assessment of HPLC methods.

  16. Exploring valid and reliable assessment methods for care management education.

    PubMed

    Gennissen, Lokke; Stammen, Lorette; Bueno-de-Mesquita, Jolien; Wieringa, Sietse; Busari, Jamiu

    2016-07-01

    Purpose: It is assumed that the use of valid and reliable assessment methods can facilitate the development of medical residents' management and leadership competencies. To justify this assertion, the perceptions of an expert panel of health care leaders were explored regarding assessment methods used for evaluating care management (CM) development in Dutch residency programs. This paper aims to investigate how assessors and trainees value these methods and to examine any inherent benefits or shortcomings when they are applied in practice. Design/methodology/approach: A Delphi survey was conducted among members of the platform for medical leadership in The Netherlands. This panel of experts was made up of clinical educators, practitioners and residents interested in CM education. Findings: Of the respondents, 40 (55.6 per cent) and 31 (43 per cent) participated in the first and second rounds of the Delphi survey, respectively. The respondents agreed that the assessment methods currently being used to measure residents' CM competencies were weak, though feasible for use in many residency programs. Multi-source feedback (MSF, 92.1 per cent), portfolio/e-portfolio (86.8 per cent) and knowledge testing (76.3 per cent) were identified as the most commonly known assessment methods, with familiarity rates exceeding 75 per cent. Practical implications: The findings suggested that an "assessment framework" comprising MSF, portfolios, individual process improvement projects or self-reflections, and observations in clinical practice should be used to measure CM competencies in residents. Originality/value: This study reaffirms the need for objective methods to assess CM skills in post-graduate medical education, as no single assessment method stood out as the best instrument. PMID:27397747

  17. Comparative assessment of bioanalytical method validation guidelines for pharmaceutical industry.

    PubMed

    Kadian, Naveen; Raju, Kanumuri Siva Rama; Rashid, Mamunur; Malik, Mohd Yaseen; Taneja, Isha; Wahajuddin, Muhammad

    2016-07-15

    The concepts, importance, and application of bioanalytical method validation have been discussed for a long time, and validation of bioanalytical methods is widely accepted as pivotal before they are taken into routine use. The United States Food and Drug Administration (USFDA) guidelines issued in 2001 have served as the reference for every guideline released since, be it from the European Medicines Agency (EMA) in Europe, the National Health Surveillance Agency (ANVISA) in Brazil, the Ministry of Health, Labour and Welfare (MHLW) in Japan, or any other authority, with regard to bioanalytical method validation. After 12 years, the USFDA released its new draft guideline for comments in 2013, which covers the latest parameters and topics encountered in bioanalytical method validation and moves toward the harmonization of bioanalytical method validation across the globe. Even though the regulatory agencies are in general agreement, significant variations exist in acceptance criteria and methodology. The present review highlights the variations and similarities among, and offers a comparison of, the bioanalytical method validation guidelines issued by major regulatory authorities worldwide. Additionally, other evaluation parameters such as the matrix effect and incurred sample reanalysis, including other stability aspects, are discussed to ease the design of a bioanalytical method and its validation in compliance with the majority of drug authority guidelines. PMID:27179186

  18. Method validation for chemical composition determination by electron microprobe with wavelength dispersive spectrometer

    NASA Astrophysics Data System (ADS)

    Herrera-Basurto, R.; Mercader-Trejo, F.; Muñoz-Madrigal, N.; Juárez-García, J. M.; Rodriguez-López, A.; Manzano-Ramírez, A.

    2016-07-01

    The main goal of method validation is to demonstrate that a method is suitable for its intended purpose. One advantage of analytical method validation is the level of confidence it provides in the measurement results reported to satisfy a specific objective. Elemental composition determination by wavelength dispersive spectrometer (WDS) microanalysis has been used in extremely wide-ranging areas, mainly in the field of materials science and for impurity determinations in geological, biological and food samples. However, little information is reported about the validation of the applied methods. Herein, results of the in-house validation of a method for elemental composition determination by WDS are shown. SRM 482, a set of binary Cu-Au alloys of different compositions, was used during the validation protocol, following the recommendations for method validation proposed by Eurachem. This paper can be taken as a reference for the evaluation of the validation parameters most frequently requested for accreditation under the requirements of the ISO/IEC 17025 standard: selectivity, limit of detection, linear interval, sensitivity, precision, trueness and uncertainty. A model for uncertainty estimation including systematic and random errors was proposed. In addition, parameters evaluated during the validation process were also considered as part of the uncertainty model.

  19. Objectivity and validity of EMG method in estimating anaerobic threshold.

    PubMed

    Kang, S-K; Kim, J; Kwon, M; Eom, H

    2014-08-01

    The purposes of this study were to verify and compare the performance of anaerobic threshold (AT) point estimates among different filtering intervals (9, 15, 20, 25, 30 s) and to investigate the interrelationships of AT point estimates obtained by the ventilatory threshold (VT) and by muscle fatigue thresholds determined from electromyographic (EMG) activity during incremental exercise on a cycle ergometer. 69 untrained male university students who nevertheless exercised regularly volunteered to participate in this study. The incremental exercise protocol was applied with a consistent stepwise increase in power output of 20 watts per minute until exhaustion. The AT point was also estimated in the same manner using the V-slope program with gas exchange parameters. In general, the estimated values of AT point time computed by the EMG method were more consistent across the 5 filtering intervals and demonstrated higher correlations among themselves when compared with the values obtained by the VT method. The results of the present study suggest that EMG signals could be used as an alternative or a new option for estimating the AT point. The proposed computing procedure, implemented in Matlab for the analysis of EMG signals, also appeared to be valid and reliable, as it produced nearly identical values and high correlations with the VT estimates. PMID:24988194
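
    The V-slope approach mentioned above locates the AT as the breakpoint where VCO2 begins to rise faster than VO2. A simple two-segment least-squares sketch in Python on synthetic ramp-test data (a simplified stand-in, not the V-slope program or the authors' Matlab code):

      import numpy as np

      def vslope_breakpoint(vo2, vco2):
          # Try each candidate break index; keep the one minimizing total SSE
          best_vo2, best_sse = None, np.inf
          for i in range(3, len(vo2) - 3):
              sse = 0.0
              for seg in (slice(0, i), slice(i, None)):
                  c = np.polyfit(vo2[seg], vco2[seg], 1)
                  sse += ((vco2[seg] - np.polyval(c, vo2[seg])) ** 2).sum()
              if sse < best_sse:
                  best_vo2, best_sse = vo2[i], sse
          return best_vo2

      # Synthetic ramp test: VCO2 rises faster above an AT near VO2 = 2.0 L/min
      vo2 = np.linspace(0.8, 3.2, 25)
      vco2 = np.where(vo2 < 2.0, 0.9 * vo2, 1.8 + 1.4 * (vo2 - 2.0))
      vco2 = vco2 + np.random.default_rng(0).normal(0, 0.02, 25)
      print(round(vslope_breakpoint(vo2, vco2), 2))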

  1. Unexpected but Most Welcome: Mixed Methods for the Validation and Revision of the Participatory Evaluation Measurement Instrument

    ERIC Educational Resources Information Center

    Daigneault, Pierre-Marc; Jacob, Steve

    2014-01-01

    Although combining methods is nothing new, more contributions about why and how to mix methods for validation purposes are needed. This article presents a case of validating the inferences drawn from the Participatory Evaluation Measurement Instrument, an instrument that purports to measure stakeholder participation in evaluation. Although the…

  2. Validity Argument for Assessing L2 Pragmatics in Interaction Using Mixed Methods

    ERIC Educational Resources Information Center

    Youn, Soo Jung

    2015-01-01

    This study investigates the validity of assessing L2 pragmatics in interaction using mixed methods, focusing on the evaluation inference. Open role-plays that are meaningful and relevant to the stakeholders in an English for Academic Purposes context were developed for classroom assessment. For meaningful score interpretations and accurate…

  3. High Explosive Verification and Validation: Systematic and Methodical Approach

    NASA Astrophysics Data System (ADS)

    Scovel, Christina; Menikoff, Ralph

    2011-06-01

    Verification and validation of high explosive (HE) models does not fit the standard mold for several reasons. First, there are no non-trivial test problems with analytic solutions. Second, an HE model depends on a burn rate and the equations of state (EOS) of both the reactants and products. Third, there is a wide range of detonation phenomena, from initiation under various stimuli to propagation of curved detonation fronts with non-rigid confining materials. Fourth, in contrast to a shock wave in a non-reactive material, the reaction-zone width is physically significant and affects the behavior of a detonation wave. Because of these issues, a systematic and methodical approach to HE V & V is needed. Our plan is to build a test suite from the ground up. We have started with the cylinder test and have run simulations with several EOS models and burn models. We have compared the results with data and cross-compared the different runs to check the sensitivity to model parameters. A related issue for V & V is what experimental data are available for calibrating and testing models. For this purpose we have started a Web-based high explosive database (HED). The current status of HED will be discussed.

  4. Development and Fit-for-Purpose Validation of a Soluble Human Programmed Death-1 Protein Assay.

    PubMed

    Ni, Yan G; Yuan, Xiling; Newitt, John A; Peterson, Jon E; Gleason, Carol R; Haulenbeek, Jonathan; Santockyte, Rasa; Lafont, Virginie; Marsilio, Frank; Neely, Robert J; DeSilva, Binodh; Piccoli, Steven P

    2015-07-01

    Programmed death-1 (PD-1) protein is a co-inhibitory receptor which negatively regulates immune cell activation and permits tumors to evade normal immune defense. Anti-PD-1 antibodies have been shown to restore immune cell activation and effector function, an exciting breakthrough in cancer immunotherapy. Recent reports have documented a soluble form of PD-1 (sPD-1) in the circulation of normal and disease-state individuals. A clinical assay to quantify sPD-1 would contribute to the understanding of sPD-1 function and facilitate the development of anti-PD-1 drugs. Here, we report the development and validation of an sPD-1 protein assay. The assay validation followed the framework for full validation of a biotherapeutic pharmacokinetic assay. A purified recombinant human PD-1 protein was characterized extensively and was identified as the assay reference material, which mimics the endogenous analyte in structure and function. The lower limit of quantitation (LLOQ) was determined to be 100 pg/mL, with a dynamic range spanning three logs to 10,000 pg/mL. The intra- and inter-assay imprecision were ≤15%, and the assay bias (percent deviation) was ≤10%. Potential matrix effects were investigated in sera from both normal healthy volunteers and selected cancer patients. Bulk-prepared frozen standards and pre-coated streptavidin plates were used in the assay to ensure consistency in assay performance over time. This assay appears to specifically measure total sPD-1 protein, since the human anti-PD-1 antibody nivolumab and the endogenous ligands of PD-1 protein, PD-L1 and PD-L2, do not interfere with the assay.
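
    The acceptance metrics quoted here, imprecision (%CV) and bias (percent deviation), are straightforward to compute from quality-control replicates. A Python sketch with hypothetical replicate values at the 100 pg/mL LLOQ level (illustrative numbers, not the study's data):

      import numpy as np

      def precision_and_bias(measured, nominal):
          # %CV (imprecision) and % deviation (bias) for one QC level
          m = np.asarray(measured, float)
          cv = 100 * m.std(ddof=1) / m.mean()
          bias = 100 * (m.mean() - nominal) / nominal
          return cv, bias

      # Hypothetical sPD-1 QC replicates (pg/mL) at the LLOQ
      cv, bias = precision_and_bias([104, 97, 109, 95, 101, 98], nominal=100)
      print(f"CV = {cv:.1f}%, bias = {bias:.1f}%")  # acceptance: <=15% and <=10%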

  5. Implementation and validation of a multi-purpose virtual spectrometer for large systems in complex environments.

    PubMed

    Barone, Vincenzo; Baiardi, Alberto; Biczysko, Malgorzata; Bloino, Julien; Cappelli, Chiara; Lipparini, Filippo

    2012-09-28

    Despite impressive advances of computational spectroscopy, a robust and user-friendly multi-frequency virtual spectrometer is not yet available. This contribution summarises ongoing efforts in our research group toward the implementation and validation of such a tool with special reference to the building blocks of biomolecules in their natural environment. Our integrated computational tool allows the computation of several kinds of spectra, including vibrational (e.g. IR, VCD), electronic (e.g. absorption, emission, ECD) as well as magnetic resonance (e.g. ESR, NMR) for both closed- and open-shell systems in vacuo and in condensed phases, and includes facilities for drawing, comparing, and modifying all the computed spectra. A number of test cases involving a combination of different spectroscopic ranges will be discussed in order to point out strengths, limitations, and ongoing developments of our research plan.

  6. Triangulation, Respondent Validation, and Democratic Participation in Mixed Methods Research

    ERIC Educational Resources Information Center

    Torrance, Harry

    2012-01-01

    Over the past 10 years or so the "Field" of "Mixed Methods Research" (MMR) has increasingly been exerting itself as something separate, novel, and significant, with some advocates claiming paradigmatic status. Triangulation is an important component of mixed methods designs. Triangulation has its origins in attempts to validate research findings…

  7. Terrestrial gastropods (Helix spp) as sentinels of primary DNA damage for biomonitoring purposes: a validation study.

    PubMed

    Angeletti, Dario; Sebbio, Claudia; Carere, Claudio; Cimmaruta, Roberta; Nascetti, Giuseppe; Pepe, Gaetano; Mosesso, Pasquale

    2013-04-01

    We validated the alkaline comet assay in two species of land snail (Helix aspersa and Helix vermiculata) to test their suitability as sentinels for primary DNA damage in polluted environments. The study was conducted under the framework of a biomonitoring program for a power station in Central Italy that had recently been converted from an oil-fired to a coal-fired plant. After optimizing test conditions, the comet assay was used to measure the % Tail DNA induced by in vitro exposure of hemocytes to different concentrations of a reactive oxygen species (H2O2). The treatment induced significant, concentration-dependent increases in this parameter, indicating the effectiveness of the assay in snail hemocytes. After evaluating possible differences between the two species, we sampled them in three field sites at different distances from the power station, and in two reference sites assumed to have low or no levels of pollution. No species differences emerged. Percent Tail DNA values in snails from the sites near the power station were higher than those from control sites. An inverse correlation emerged between % Tail DNA and distance from the power station, suggesting that primary DNA damage decreased with increasing distance from the pollution source. Detection of a gradient of heavy metal concentration in snail tissues suggests that these pollutants are a potential cause of the observed pattern. The comet assay appears to be a suitable assay and Helix spp. populations suitable sentinels to detect the genotoxic impact of pollutants. PMID:23444166

  9. Quantitative assessment of gene expression network module-validation methods.

    PubMed

    Li, Bing; Zhang, Yingying; Yu, Yanan; Wang, Pengqian; Wang, Yongcheng; Wang, Zhong; Wang, Yongyan

    2015-01-01

    Validation of pluripotent modules in diverse networks holds enormous potential for systems biology and network pharmacology. An emerging challenge is how to assess the accuracy of discovering all potential modules from multi-omic networks and validating their architectural characteristics based on innovative computational methods beyond function enrichment and biological validation. To chart progress in this domain, we systematically divided the existing Computational Validation Approaches based on Modular Architecture (CVAMA) into topology-based approaches (TBA) and statistics-based approaches (SBA). We compared the available module validation methods on 11 gene expression datasets; partially consistent results across homogeneous models were obtained with each individual approach, whereas discrepant, contradictory results were found between TBA and SBA. The TBA of the Zsummary value had a higher Validation Success Ratio (VSR) (51%) and a higher Fluctuation Ratio (FR) (80.92%), whereas the SBA of the approximately unbiased (AU) p-value had a lower VSR (12.3%) and a lower FR (45.84%). A gray-area simulation study revealed a consistent result for these two models and indicated a lower Variation Ratio (VR) (8.10%) of TBA at 6 simulated levels. Despite facing many novel challenges and evidence limitations, CVAMA may offer novel insights into modular networks. PMID:26470848

  10. A test of a proposed method for estimating validity of a multivariate composite predictor: extending the job component validity model.

    PubMed

    Hoffman, Calvin C; Morris, David; Luck, Gypsi

    2009-12-01

    In this study, a proposed extension to the job component validity model from the Position Analysis Questionnaire was tested. Job component validity, a form of synthetic validation, allows researchers to select useful predictors and to estimate the criterion-related validity of tests based on a job analysis that includes the Position Analysis Questionnaire. Morris and colleagues described a method for estimating the multiple correlation of a test battery assembled via job component validity estimates. In the current study, job component validity estimates, derived from the multiple correlation procedure proposed by Morris et al., were compared to unit-weighted validity estimates obtained in a criterion-related validity study of six job progressions. The multivariate job component validity estimates were comparable to unit-weighted validity coefficients obtained using supervisory ratings as criteria. Multivariate job component validity estimates were conservative compared to corrected unit-weighted validity coefficients.

  11. An introduction to clinical microeconomic analysis: purposes and analytic methods.

    PubMed

    Weintraub, W S; Mauldin, P D; Becker, E R

    1994-06-01

    The recent concern with health care economics has fostered the development of a new discipline that is generally called clinical microeconomics. This is a discipline in which microeconomic methods are used to study the economics of specific medical therapies. It is possible to perform stand-alone cost analyses, but more profound insight into the medical decision making process may be accomplished by combining cost studies with measures of outcome. This is most often accomplished with cost-effectiveness or cost-utility studies. In cost-effectiveness studies there is one measure of outcome, often death. In cost-utility studies there are multiple measures of outcome, which must be grouped together to give an overall picture of outcome or utility. There are theoretical limitations to the determination of utility that must be accepted to perform this type of analysis. A summary statement of outcome is quality adjusted life years (QALYs), which is utility times socially discounted survival. Discounting is used because people value a year of future life less than a year of present life. Costs are made up of in-hospital direct, professional, follow-up direct, and follow-up indirect costs. Direct costs are for medical services. Indirect costs reflect opportunity costs such as lost time at work. Cost estimates are often based on marginal costs, or the cost for one additional procedure of the same type. Finally, an overall statistic may be generated as cost per unit increase in effectiveness, such as dollars per QALY.(ABSTRACT TRUNCATED AT 250 WORDS) PMID:10151059
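
    To make the QALY arithmetic concrete, here is a minimal sketch that sums utility-weighted life-years while discounting future years; the 3% discount rate and the utility value are illustrative assumptions, not figures from the article.

```python
def discounted_qalys(utility, years, rate=0.03):
    """Sum utility-weighted life-years, discounting each future year."""
    return sum(utility / (1.0 + rate) ** t for t in range(int(years)))

# e.g. 10 years of survival at utility 0.8 with a 3% annual discount rate
print(round(discounted_qalys(0.8, 10), 2))   # about 7.03 QALYs rather than 8.0
```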

  12. Adapting CEF-Descriptors for Rating Purposes: Validation by a Combined Rater Training and Scale Revision Approach

    ERIC Educational Resources Information Center

    Harsch, Claudia; Martin, Guido

    2012-01-01

    We explore how a local rating scale can be based on the Common European Framework CEF-proficiency scales. As part of the scale validation (Alderson, 1991; Lumley, 2002), we examine which adaptations are needed to turn CEF-proficiency descriptors into a rating scale for a local context, and to establish a practicable method to revise the initial…

  13. Beyond Correctness: Development and Validation of Concept-Based Categorical Scoring Rubrics for Diagnostic Purposes

    ERIC Educational Resources Information Center

    Arieli-Attali, Meirav; Liu, Ying

    2016-01-01

    Diagnostic assessment approaches intend to provide fine-grained reports of what students know and can do, focusing on their areas of strengths and weaknesses. However, current application of such diagnostic approaches is limited by the scoring method for item responses; important diagnostic information, such as type of errors and strategy use is…

  14. ePortfolios: The Method of Choice for Validation

    ERIC Educational Resources Information Center

    Scott, Ken; Kim, Jichul

    2015-01-01

    Community colleges have long been institutions of higher education in the arenas of technical education and training, as well as preparing students for transfer to universities. While students are engaged in their student learning outcomes, projects, research, and community service, how have these students validated their work? One method of…

  15. Youden test application in robustness assays during method validation.

    PubMed

    Karageorgou, Eftichia; Samanidou, Victoria

    2014-08-01

    Analytical method validation is a vital step following method development, ensuring reliable and accurate method performance. Among the examined figures of merit, the robustness/ruggedness study tests the performance characteristics of the analytical process when operating conditions are altered, either deliberately or not. This study yields useful information and is a fundamental part of method validation. Since many experiments are required, this step is highly demanding in time and consumables. To avoid the difficult task of performing too many experiments, the Youden test, which makes use of fractional factorial designs, has proved to be a very effective approach. The main advantage of the Youden test is that it keeps the required time and effort to a minimum, since only a limited number of determinations have to be made, using combinations of the chosen investigated factors. Typical applications of this robustness test found in the literature, covering a wide variety of sample matrices, are briefly discussed in this review.
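
    For readers unfamiliar with the design underlying the Youden test, the sketch below lays out the standard Youden-Steiner matrix (seven factors, eight runs) and computes each factor's effect as the difference between the mean responses at the nominal and altered levels. The recovery values are invented for illustration.

```python
import numpy as np

# Youden-Steiner fractional factorial design: 7 factors, 8 runs
# (+1 = nominal condition, -1 = deliberately altered condition)
DESIGN = np.array([
    [+1, +1, +1, +1, +1, +1, +1],
    [+1, +1, -1, +1, -1, -1, -1],
    [+1, -1, +1, -1, +1, -1, -1],
    [+1, -1, -1, -1, -1, +1, +1],
    [-1, +1, +1, -1, -1, +1, -1],
    [-1, +1, -1, -1, +1, -1, +1],
    [-1, -1, +1, +1, -1, -1, +1],
    [-1, -1, -1, +1, +1, +1, -1],
])

def youden_effects(responses):
    """Effect of each factor: mean response at high level minus at low level."""
    r = np.asarray(responses, dtype=float)
    return np.array([r[DESIGN[:, j] == +1].mean() - r[DESIGN[:, j] == -1].mean()
                     for j in range(DESIGN.shape[1])])

# hypothetical recoveries (%) from the 8 robustness runs
print(youden_effects([99.8, 98.9, 100.4, 99.1, 98.7, 99.5, 99.0, 98.4]))
```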

  16. Recommendations for Use and Fit-for-Purpose Validation of Biomarker Multiplex Ligand Binding Assays in Drug Development.

    PubMed

    Jani, Darshana; Allinson, John; Berisha, Flora; Cowan, Kyra J; Devanarayan, Viswanath; Gleason, Carol; Jeromin, Andreas; Keller, Steve; Khan, Masood U; Nowatzke, Bill; Rhyne, Paul; Stephen, Laurie

    2016-01-01

    Multiplex ligand binding assays (LBAs) are increasingly being used to support many stages of drug development. The complexity of multiplex assays creates many unique challenges in comparison to single-plexed assays, leading to various adjustments during validation, and potentially during sample analysis, to accommodate all of the analytes being measured. This often requires a compromise in decision making with respect to choosing final assay conditions and acceptance criteria for some key assay parameters, depending on the intended use of the assay. The critical parameters affected by the added challenges of multiplexing include the minimum required dilution (MRD), quality control samples that span the range of all analytes being measured, quantitative ranges that can be compromised for certain targets, parallelism for all analytes of interest, cross-talk across assays, and freeze-thaw stability across analytes, among many others. These challenges also increase the complexity of validating the performance of the assay for its intended use. This paper describes the challenges encountered with multiplex LBAs, discusses the underlying causes, and provides solutions to help overcome these challenges. Finally, we provide recommendations on how to perform a fit-for-purpose validation, emphasizing issues that are unique to multiplex kit assays. PMID:26377333

  17. Validation of a previous day recall for measuring the location and purpose of active and sedentary behaviors compared to direct observation

    PubMed Central

    2014-01-01

    Purpose Gathering contextual information (i.e., location and purpose) about active and sedentary behaviors is an advantage of self-report tools such as previous day recalls (PDR). However, the validity of PDRs for measuring context has not been empirically tested. The purpose of this paper was to compare PDR estimates of location and purpose to direct observation (DO). Methods Fifteen adult (18–75 y) and 15 adolescent (12–17 y) participants were directly observed during at least one segment of the day (i.e., morning, afternoon or evening). Participants completed their normal daily routine while trained observers recorded the location (i.e., home, community, work/school), purpose (e.g., leisure, transportation) and whether the behavior was sedentary or active. The day following the observation, participants completed an unannounced PDR. Estimates of time in each context were compared between PDR and DO. Intra-class correlations (ICC), percent agreement and Kappa statistics were calculated. Results For adults, percent agreement was 85% or greater for each location and ICC values ranged from 0.71 to 0.96. The PDR-reported purpose of adults' behaviors was highly correlated with DO for household activities and work (ICCs of 0.84 and 0.88, respectively). Transportation was not significantly correlated with DO (ICC = -0.08). For adolescents, agreement on activity location was 80.8% or greater. The ICCs for the purpose of adolescents' behaviors ranged from 0.46 to 0.78. Participants were most accurate in classifying the location and purpose of the behaviors in which they spent the most time. Conclusions This study suggests that adults and adolescents can accurately report where and why they spend time in behaviors using a PDR. This information on behavioral context is essential for translating the evidence for specific behavior-disease associations to health interventions and public policy. PMID:24490619
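
    Percent agreement and Cohen's kappa, as used in this comparison, can be computed with a short generic sketch like the following; the minute-by-minute location codes are hypothetical, and this is a textbook implementation rather than the authors' analysis code.

```python
import numpy as np

def percent_agreement_and_kappa(rater_a, rater_b, categories):
    """Percent agreement and Cohen's kappa for two categorical codings."""
    a, b = np.asarray(rater_a), np.asarray(rater_b)
    p_obs = np.mean(a == b)
    # chance agreement from each rater's marginal category frequencies
    p_exp = sum(np.mean(a == c) * np.mean(b == c) for c in categories)
    kappa = (p_obs - p_exp) / (1.0 - p_exp)
    return 100.0 * p_obs, kappa

# hypothetical location codes: PDR self-report vs direct observation
pdr = ["home", "home", "work", "community", "work", "home"]
obs = ["home", "home", "work", "work", "work", "home"]
print(percent_agreement_and_kappa(pdr, obs, {"home", "work", "community"}))
```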

  18. A method for calibration and validation subset partitioning.

    PubMed

    Galvão, Roberto Kawakami Harrop; Araujo, Mário César Ugulino; José, Gledson Emídio; Pontes, Marcio José Coelho; Silva, Edvan Cirino; Saldanha, Teresa Cristina Bezerra

    2005-10-15

    This paper proposes a new method to divide a pool of samples into calibration and validation subsets for multivariate modelling. The proposed method is of value for analytical applications involving complex matrices, in which the composition variability of real samples cannot be easily reproduced by optimized experimental designs. A stepwise procedure is employed to select samples according to their differences in both x (instrumental responses) and y (predicted parameter) spaces. The proposed technique is illustrated in a case study involving the prediction of three quality parameters (specific mass and distillation temperatures at which 10 and 90% of the sample has evaporated) of diesel by NIR spectrometry and PLS modelling. For comparison, PLS models are also constructed by full cross-validation, as well as by using the Kennard-Stone and random sampling methods for calibration and validation subset partitioning. The obtained models are compared in terms of prediction performance by employing an independent set of samples not used for calibration or validation. The results of F-tests at 95% confidence level reveal that the proposed technique may be an advantageous alternative to the other three strategies.
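
    For reference, a minimal sketch of the classic Kennard-Stone maximin selection (one of the comparison methods) is shown below; the authors' proposed stepwise procedure additionally accounts for differences in the y (predicted parameter) space, which this sketch deliberately omits.

```python
import numpy as np

def kennard_stone(X, n_cal):
    """Select n_cal calibration samples by maximin Euclidean distance in x-space."""
    X = np.asarray(X, dtype=float)
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    # start with the two most distant samples
    selected = list(np.unravel_index(np.argmax(d), d.shape))
    remaining = [i for i in range(len(X)) if i not in selected]
    while len(selected) < n_cal:
        # pick the sample farthest from its nearest already-selected neighbour
        nearest = d[np.ix_(remaining, selected)].min(axis=1)
        selected.append(remaining.pop(int(np.argmax(nearest))))
    return selected, remaining   # calibration and validation indices

rng = np.random.default_rng(0)
cal, val = kennard_stone(rng.normal(size=(20, 3)), n_cal=12)
print(cal, val)
```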

  19. Methods for causal inference from gene perturbation experiments and validation

    PubMed Central

    Meinshausen, Nicolai; Hauser, Alain; Mooij, Joris M.; Peters, Jonas; Versteeg, Philip; Bühlmann, Peter

    2016-01-01

    Inferring causal effects from observational and interventional data is a highly desirable but ambitious goal. Many of the computational and statistical methods are plagued by fundamental identifiability issues, instability, and unreliable performance, especially for large-scale systems with many measured variables. We present software and provide some validation of a recently developed methodology based on an invariance principle, called invariant causal prediction (ICP). The ICP method quantifies confidence probabilities for inferring causal structures and thus leads to more reliable and confirmatory statements for causal relations and predictions of external intervention effects. We validate the ICP method and some other procedures using large-scale genome-wide gene perturbation experiments in Saccharomyces cerevisiae. The results suggest that prediction and prioritization of future experimental interventions, such as gene deletions, can be improved by using our statistical inference techniques. PMID:27382150
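
    The invariance principle behind ICP can be illustrated with a toy subset search that keeps predictor sets whose regression residuals look identically distributed across environments. This is a deliberately simplified stand-in using ANOVA and Levene tests as invariance proxies, not the authors' software, and the synthetic data are invented.

```python
import itertools
import numpy as np
from scipy import stats

def icp_candidate_sets(X, y, env, alpha=0.05):
    """Toy invariant-causal-prediction search: keep every predictor set whose
    regression residuals look identically distributed across environments,
    then intersect the accepted sets."""
    n, p = X.shape
    accepted = []
    for k in range(p + 1):
        for S in itertools.combinations(range(p), k):
            Xs = np.column_stack([np.ones(n)] + [X[:, j] for j in S])
            beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
            resid = y - Xs @ beta
            groups = [resid[env == e] for e in np.unique(env)]
            _, p_mean = stats.f_oneway(*groups)   # equal residual means?
            _, p_var = stats.levene(*groups)      # equal residual variances?
            if min(p_mean, p_var) > alpha:
                accepted.append(set(S))
    return set.intersection(*accepted) if accepted else set()

rng = np.random.default_rng(1)
env = np.repeat([0, 1], 100)
x1 = rng.normal(size=200) + env        # x1 is shifted (intervened on) in env 1
y = 2.0 * x1 + rng.normal(size=200)    # y is caused by x1 only
x2 = y + rng.normal(size=200)          # x2 is an effect of y, not a cause
# the true causal parent of y is x1 (index 0)
print(icp_candidate_sets(np.column_stack([x1, x2]), y, env))
```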

  1. Validation of cleaning method for various parts fabricated at a Beryllium facility

    SciTech Connect

    Davis, Cynthia M.

    2015-12-15

    This study evaluated and documented a cleaning process that is used to clean parts that are fabricated at a beryllium facility at Los Alamos National Laboratory. The purpose of evaluating this cleaning process was to validate and approve it for future use to assure beryllium surface levels are below the Department of Energy’s release limits without the need to sample all parts leaving the facility. Inhaling or coming in contact with beryllium can cause an immune response that can result in an individual becoming sensitized to beryllium, which can then lead to a disease of the lungs called chronic beryllium disease, and possibly lung cancer. Thirty aluminum and thirty stainless steel parts were fabricated on a lathe in the beryllium facility, as well as thirty-two beryllium parts, for the purpose of testing a parts cleaning method that involved the use of ultrasonic cleaners. A cleaning method was created, documented, validated, and approved, to reduce beryllium contamination.

  2. The Bland-Altman Method Should Not Be Used in Regression Cross-Validation Studies

    ERIC Educational Resources Information Center

    O'Connor, Daniel P.; Mahar, Matthew T.; Laughlin, Mitzi S.; Jackson, Andrew S.

    2011-01-01

    The purpose of this study was to demonstrate the bias in the Bland-Altman (BA) limits of agreement method when it is used to validate regression models. Data from 1,158 men were used to develop three regression equations to estimate maximum oxygen uptake (R² = 0.40, 0.61, and 0.82, respectively). The equations were evaluated in a…
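
    The bias criticized in this study can be reproduced qualitatively with a small simulation: because regression predictions shrink toward the mean, the Bland-Altman differences are necessarily correlated with the means, producing a spurious proportional-bias trend. The coefficients and sample size below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
truth = rng.normal(50, 8, size=500)                    # observed VO2max values
pred = 0.6 * truth + 20 + rng.normal(0, 4, size=500)   # regression estimate (R^2 < 1)

diff = pred - truth
mean = (pred + truth) / 2.0
loa = (diff.mean() - 1.96 * diff.std(ddof=1),
       diff.mean() + 1.96 * diff.std(ddof=1))
print("limits of agreement:", loa)
# with regression predictions the differences are correlated with the means,
# so the BA plot shows a spurious proportional-bias trend:
print("corr(diff, mean):", np.corrcoef(diff, mean)[0, 1])
```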

  3. Methods for Geometric Data Validation of 3d City Models

    NASA Astrophysics Data System (ADS)

    Wagner, D.; Alam, N.; Wewetzer, M.; Pries, M.; Coors, V.

    2015-12-01

    Geometric quality of 3D city models is crucial for data analysis and simulation tasks, which are part of modern applications of the data (e.g. potential heating energy consumption of city quarters, solar potential, etc.). Geometric quality in these contexts is, however, a different concept than for 2D maps. In the latter case, aspects such as positional or temporal accuracy and correctness represent typical quality metrics of the data. They are defined in ISO 19157 and should be mentioned as part of the metadata. 3D data have a far wider range of aspects that influence their quality, and the idea of quality itself is application-dependent. Thus, concepts for the definition of quality are needed, including methods to validate these definitions. Quality in this sense means internal validation and detection of inconsistent or wrong geometry according to a predefined set of rules. A useful starting point would be to have correct geometry in accordance with ISO 19107. A valid solid should consist of planar faces which touch their neighbours exclusively in defined corner points and edges. No gaps between them are allowed, and the whole feature must be 2-manifold. In this paper, we present methods to validate common geometric requirements for building geometry. Different checks based on several algorithms have been implemented to validate a set of rules derived from the solid definition mentioned above (e.g. water tightness of the solid or planarity of its polygons), as they were developed for the software tool CityDoctor. The method of each check is specified, with a special focus on the discussion of tolerance values where they are necessary. The checks include polygon-level checks to validate the correctness of each polygon, i.e. closure of the bounding linear ring and planarity. On the solid level, which is only validated if the polygons have passed validation, correct polygon orientation is checked, after self-intersections outside of defined corner points and edges
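
    A planarity check with an explicit tolerance, of the kind discussed above, might look like the following sketch, which measures vertex distances from a best-fit plane obtained via SVD; the tolerance value and sample quad are hypothetical, and the actual CityDoctor implementation may differ.

```python
import numpy as np

def is_planar(polygon, tol=0.01):
    """Check that all vertices of a 3D polygon lie within `tol` (in model
    units) of the best-fit plane through the polygon's vertices."""
    pts = np.asarray(polygon, dtype=float)
    centered = pts - pts.mean(axis=0)
    # plane normal = right singular vector with the smallest singular value
    _, _, vt = np.linalg.svd(centered)
    normal = vt[-1]
    distances = np.abs(centered @ normal)
    return bool(distances.max() <= tol)

# a quad with one vertex lifted out of plane by 5 cm (units: metres)
quad = [(0, 0, 0), (4, 0, 0), (4, 4, 0.05), (0, 4, 0)]
print(is_planar(quad, tol=0.01))   # False: exceeds the 1 cm tolerance
```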

  4. An individual and dynamic Body Segment Inertial Parameter validation method using ground reaction forces.

    PubMed

    Hansen, Clint; Venture, Gentiane; Rezzoug, Nasser; Gorce, Philippe; Isableu, Brice

    2014-05-01

    Over the last decades, a variety of research has been conducted with the goal of improving Body Segment Inertial Parameter (BSIP) estimates, but to our knowledge a real validation has never been completely successful, because no ground truth is available. The aim of this paper is to propose a validation method for a BSIP identification method (IM) and to confirm the results by comparing contact forces recalculated through inverse dynamics with those obtained from a force plate. Furthermore, the results are compared with the recently proposed estimation method of Dumas et al. (2007). Additionally, the results are cross-validated with a high-velocity overarm throwing movement. Across all conditions, higher correlations, smaller error metrics and smaller RMSE values are found for the proposed BSIP identification method (IM), showing its advantage over recently proposed methods such as that of Dumas et al. (2007). The purpose of the paper is to validate an already proposed method and to show that it can offer a significant advantage over conventional methods.

  5. Validating silicon polytrodes with paired juxtacellular recordings: method and dataset.

    PubMed

    Neto, Joana P; Lopes, Gonçalo; Frazão, João; Nogueira, Joana; Lacerda, Pedro; Baião, Pedro; Aarts, Arno; Andrei, Alexandru; Musa, Silke; Fortunato, Elvira; Barquinha, Pedro; Kampff, Adam R

    2016-08-01

    Cross-validating new methods for recording neural activity is necessary to accurately interpret and compare the signals they measure. Here we describe a procedure for precisely aligning two probes for in vivo "paired-recordings" such that the spiking activity of a single neuron is monitored with both a dense extracellular silicon polytrode and a juxtacellular micropipette. Our new method allows for efficient, reliable, and automated guidance of both probes to the same neural structure with micrometer resolution. We also describe a new dataset of paired-recordings, which is available online. We propose that our novel targeting system, and ever expanding cross-validation dataset, will be vital to the development of new algorithms for automatically detecting/sorting single-units, characterizing new electrode materials/designs, and resolving nagging questions regarding the origin and nature of extracellular neural signals.

  8. Prognostics of Power Electronics, Methods and Validation Experiments

    NASA Technical Reports Server (NTRS)

    Kulkarni, Chetan S.; Celaya, Jose R.; Biswas, Gautam; Goebel, Kai

    2012-01-01

    Failure of electronic devices is a concern for future electric aircraft that will see an increase of electronics to drive and control safety-critical equipment throughout the aircraft. As a result, investigation of precursors to failure in electronics and prediction of remaining life of electronic components is of key importance. DC-DC power converters are power electronics systems employed typically as sourcing elements for avionics equipment. Current research efforts in prognostics for these power systems focus on the identification of failure mechanisms and the development of accelerated aging methodologies and systems to accelerate the aging process of test devices, while continuously measuring key electrical and thermal parameters. Preliminary model-based prognostics algorithms have been developed making use of empirical degradation models and physics-inspired degradation models, with focus on key components like electrolytic capacitors and power MOSFETs (metal-oxide-semiconductor field-effect transistors). This paper presents current results on the development of validation methods for prognostics algorithms of power electrolytic capacitors, particularly the use of accelerated aging systems for algorithm validation. Validation of prognostics algorithms presents difficulties in practice due to the lack of run-to-failure experiments in deployed systems. By using accelerated experiments, we circumvent this problem in order to define initial validation activities.

  9. Method development and validation for pharmaceutical tablets analysis using transmission Raman spectroscopy.

    PubMed

    Li, Yi; Igne, Benoît; Drennen, James K; Anderson, Carl A

    2016-02-10

    The objective of the study is to demonstrate the development and validation of a transmission Raman spectroscopic method using the ICH-Q2 Guidance as a template. Specifically, Raman spectroscopy was used to determine niacinamide content in tablet cores. A 3-level, 2-factor full factorial design was utilized to generate a partial least-squares model for active pharmaceutical ingredient quantification. Validation of the transmission Raman model was focused on figures of merit from three independent batches manufactured at pilot scale. The resultant model statistics were evaluated along with the linearity, accuracy, precision and robustness assessments. Method specificity was demonstrated by accurate determination of niacinamide in the presence of niacin (an expected related substance). The method was demonstrated as fit for purpose and had the desirable characteristics of very short analysis times (∼2.5s per tablet). The resulting method was used for routine content uniformity analysis of single dosage units in a stability study.

  11. VALUE - Validating and Integrating Downscaling Methods for Climate Change Research

    NASA Astrophysics Data System (ADS)

    Maraun, Douglas; Widmann, Martin; Benestad, Rasmus; Kotlarski, Sven; Huth, Radan; Hertig, Elke; Wibig, Joanna; Gutierrez, Jose

    2013-04-01

    Our understanding of global climate change is mainly based on General Circulation Models (GCMs) with a relatively coarse resolution. Since climate change impacts are mainly experienced on regional scales, high-resolution climate change scenarios need to be derived from GCM simulations by downscaling. Several projects have been carried out over the last years to validate the performance of statistical and dynamical downscaling, yet several aspects have not been systematically addressed: variability on sub-daily, decadal and longer time-scales, extreme events, spatial variability and inter-variable relationships. Different downscaling approaches such as dynamical downscaling, statistical downscaling and bias correction approaches have not been systematically compared. Furthermore, collaboration between different communities, in particular regional climate modellers, statistical downscalers and statisticians, has been limited. To address these gaps, the EU Cooperation in Science and Technology (COST) action VALUE (www.value-cost.eu) has been brought to life. VALUE is a research network, currently with participants from 23 European countries, running from 2012 to 2015. Its main aim is to systematically validate and develop downscaling methods for climate change research in order to improve regional climate change scenarios for use in climate impact studies. Inspired by the co-design idea of the international research initiative "future earth", stakeholders of climate change information have been involved in the definition of research questions to be addressed and are actively participating in the network. The key idea of VALUE is to identify the relevant weather and climate characteristics required as input for a wide range of impact models and to define an open framework to systematically validate these characteristics. Based on a range of benchmark data sets, in principle every downscaling method can be validated and compared with competing methods. The results of

  12. [Patients on the move: validated methods to quantify physical activity].

    PubMed

    Bakker, Esmée A; Eijsvogels, Thijs M H; de Vegt, Femmie; Busser, Guus S F; Hopman, Maria T E; Verbeek, André L M

    2015-01-01

    Physical activity is an important component in the maintenance and improvement of general health; physical inactivity is, however, an increasing problem in the Netherlands. Requests for advice on physical activity are increasing within healthcare. Assessment of an individual's physical activity pattern is required to provide tailored advice. There are a number of methods for measuring physical activity; these are divided into subjective and objective methods. Subjective measures include physical activity questionnaires and diaries. Objective measures include indirect calorimetry, measurement with doubly labelled water, heart-rate monitoring and the use of an accelerometer or pedometer. The choice of method depends predominantly on the aim of the measurement and the availability of personnel, time and financial resources. In clinical practice a validated questionnaire is usually the preferred method, but when measuring effects this should be combined with an objective measurement instrument.

  13. Validation of an Impedance Eduction Method in Flow

    NASA Technical Reports Server (NTRS)

    Watson, Willie R.; Jones, Michael G.; Parrott, Tony L.

    2004-01-01

    This paper reports results of a research effort to validate a method for educing the normal incidence impedance of a locally reacting liner, located in a grazing incidence, nonprogressive acoustic wave environment with flow. The results presented in this paper test the ability of the method to reproduce the measured normal incidence impedance of a solid steel plate and two soft test liners in a uniform flow. The test liners are known to be locally reacting and exhibit no measurable amplitude-dependent impedance nonlinearities or flow effects. Baseline impedance spectra for these liners were therefore established from measurements in a conventional normal incidence impedance tube. A key feature of the method is the expansion of the unknown impedance function as a piecewise continuous polynomial with undetermined coefficients. Stewart's adaptation of the Davidon-Fletcher-Powell optimization algorithm is used to educe the normal incidence impedance at each Mach number by optimizing an objective function. The method is shown to reproduce the measured normal incidence impedance spectrum for each of the test liners, thus validating its usefulness for determining the normal incidence impedance of test liners for a broad range of source frequencies and flow Mach numbers.

  14. Biostatistical methods for the validation of alternative methods for in vitro toxicity testing.

    PubMed

    Edler, Lutz; Ittrich, Carina

    2003-06-01

    Statistical methods for the validation of toxicological in vitro test assays are developed and applied. Validation is performed either in comparison with in vivo assays or in comparison with other in vitro assays of established validity. Biostatistical methods are presented which are of potential use and benefit for the validation of alternative methods for the risk assessment of chemicals, providing at least an equivalent level of protection through in vitro toxicity testing to that obtained through the use of current in vivo methods. Characteristic indices are developed and determined. Qualitative outcomes are characterised by the rates of false-positive and false-negative predictions, sensitivity and specificity, and predictive values. Quantitative outcomes are characterised by regression coefficients derived from predictive models. The receiver operating characteristics (ROC) technique, applicable when a continuum of cut-off values is considered, is discussed in detail, in relation to its use for statistical modelling and statistical inference. The methods presented are examined for their use for the proof of safety and for toxicity detection and testing. We emphasise that the final validation of toxicity testing is human toxicity, and that the in vivo test itself is only a predictor with an inherent uncertainty. Therefore, the validation of the in vitro test has to account for the vagueness and uncertainty of the "gold standard" in vivo test. We address model selection and model validation, and a four-step scheme is proposed for the conduct of validation studies. Gaps and research needs are formulated to improve the validation of alternative methods for in vitro toxicity testing.
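
    The qualitative characteristic indices mentioned above (sensitivity, specificity, and predictive values derived from false-positive and false-negative counts) follow directly from a 2x2 classification table, as in this generic sketch; the chemical outcomes shown are invented.

```python
import numpy as np

def diagnostic_metrics(in_vitro_positive, in_vivo_toxic):
    """Sensitivity, specificity and predictive values of an in vitro assay
    against an (uncertain) in vivo reference classification."""
    a = np.asarray(in_vitro_positive, dtype=bool)
    b = np.asarray(in_vivo_toxic, dtype=bool)
    tp = np.sum(a & b); fp = np.sum(a & ~b)
    fn = np.sum(~a & b); tn = np.sum(~a & ~b)
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),   # positive predictive value
        "npv": tn / (tn + fn),   # negative predictive value
    }

# hypothetical outcomes for 10 chemicals
print(diagnostic_metrics([1,1,1,0,0,1,0,0,1,0], [1,1,0,0,0,1,0,1,1,0]))
```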

  15. Brazilian Center for the Validation of Alternative Methods (BraCVAM) and the process of validation in Brazil.

    PubMed

    Presgrave, Octavio; Moura, Wlamir; Caldeira, Cristiane; Pereira, Elisabete; Bôas, Maria H Villas; Eskes, Chantra

    2016-03-01

    The need for the creation of a Brazilian centre for the validation of alternative methods was recognised in 2008, and members of academia, industry and existing international validation centres immediately engaged with the idea. In 2012, co-operation between the Oswaldo Cruz Foundation (FIOCRUZ) and the Brazilian Health Surveillance Agency (ANVISA) instigated the establishment of the Brazilian Center for the Validation of Alternative Methods (BraCVAM), which was officially launched in 2013. The Brazilian validation process follows OECD Guidance Document No. 34, where BraCVAM functions as the focal point to identify and/or receive requests from parties interested in submitting tests for validation. BraCVAM then informs the Brazilian National Network on Alternative Methods (RENaMA) of promising assays, which helps with prioritisation and contributes to the validation studies of selected assays. A Validation Management Group supervises the validation study, and the results obtained are peer-reviewed by an ad hoc Scientific Review Committee, organised under the auspices of BraCVAM. Based on the peer-review outcome, BraCVAM will prepare recommendations on the validated test method, which will be sent to the National Council for the Control of Animal Experimentation (CONCEA). CONCEA is in charge of the regulatory adoption of all validated test methods in Brazil, following an open public consultation. PMID:27031604

  16. Validation of spectrophotometric method for lactulose assay in syrup preparation

    NASA Astrophysics Data System (ADS)

    Mahardhika, Andhika Bintang; Novelynda, Yoshella; Damayanti, Sophi

    2015-09-01

    Lactulose is a synthetic disaccharide widely used in the food and pharmaceutical fields. In the pharmaceutical field, lactulose is used as an osmotic laxative in a syrup dosage form. This research aimed to validate a spectrophotometric method for determining lactulose levels in syrup preparations and commercial samples. Lactulose is hydrolyzed by hydrochloric acid to form fructose and galactose. The fructose is then reacted with resorcinol reagent, forming compounds that give an absorption peak at 485 nm. The analytical method was validated, and the lactulose content in syrup preparations was then determined. The calibration curve was linear in the range of 30-100 μg/mL with a correlation coefficient (r) of 0.9996, coefficient of variation of the method (Vxo) of 1.1%, limit of detection of 2.32 μg/mL, and limit of quantitation of 7.04 μg/mL. The accuracy test for the lactulose assay in the syrup preparation showed recoveries of 96.6 to 100.8%. Repeatability tests of the lactulose assay in standard lactulose solution and in the syrup sample preparation showed coefficients of variation (CV) of 0.75% and 0.7%, respectively. The intermediate precision (interday) test gave coefficients of variation of 1.06% on the first day, 0.99% on the second day, and 0.95% on the third day. This research yielded a valid analytical method, and the lactulose levels in the syrup preparations of samples A, B, and C were 101.6, 100.5, and 100.6%, respectively.
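
    A generic sketch of the calibration figures of merit reported above (linearity, LOD, LOQ) follows. It uses the residual-standard-deviation estimators 3.3·s/slope and 10·s/slope, which may differ from the estimator actually used in this study; the standard readings are invented.

```python
import numpy as np

def calibration_stats(conc, absorbance):
    """Fit a linear calibration curve and estimate LOD/LOQ from the
    residual standard deviation (3.3*s/slope and 10*s/slope)."""
    conc = np.asarray(conc, float)
    absorbance = np.asarray(absorbance, float)
    slope, intercept = np.polyfit(conc, absorbance, 1)
    fitted = slope * conc + intercept
    s = np.sqrt(np.sum((absorbance - fitted) ** 2) / (len(conc) - 2))
    r = np.corrcoef(conc, absorbance)[0, 1]
    return {"slope": slope, "intercept": intercept, "r": r,
            "LOD": 3.3 * s / slope, "LOQ": 10 * s / slope}

# hypothetical standards across the 30-100 ug/mL working range
conc = [30, 40, 50, 60, 80, 100]
absorb = [0.21, 0.28, 0.35, 0.41, 0.55, 0.68]
print(calibration_stats(conc, absorb))
```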

  17. Experimental validation of an analytical method of calculating photon distributions

    SciTech Connect

    Wells, R.G.; Celler, A.; Harrop, R.

    1996-12-31

    We have developed a method for analytically calculating photon distributions in SPECT projections. This method models primary photon distributions as well as first and second order Compton scattering and Rayleigh scattering. It uses no free fitting parameters and so the projections produced are completely determined by the characteristics of the SPECT camera system, the energy of the isotope, an estimate of the source distribution and an attenuation map of the scattering object. The method was previously validated by comparison with Monte Carlo simulations and we are now verifying its accuracy with respect to phantom experiments. We have performed experiments using a Siemens MS3 SPECT camera system for a point source (2 mm in diameter) within a homogeneous water bath and a small spherical source (1 cm in diameter) within both a homogeneous water cylinder and a non-homogeneous medium consisting of air and water. Our technique reproduces well the distribution of photons in the experimentally acquired projections.

  18. Simulation Methods and Validation Criteria for Modeling Cardiac Ventricular Electrophysiology

    PubMed Central

    Krishnamoorthi, Shankarjee; Perotti, Luigi E.; Borgstrom, Nils P.; Ajijola, Olujimi A.; Frid, Anna; Ponnaluri, Aditya V.; Weiss, James N.; Qu, Zhilin; Klug, William S.; Ennis, Daniel B.; Garfinkel, Alan

    2014-01-01

    We describe a sequence of methods to produce a partial differential equation model of the electrical activation of the ventricles. In our framework, we incorporate the anatomy and cardiac microstructure obtained from magnetic resonance imaging and diffusion tensor imaging of a New Zealand White rabbit, the Purkinje structure and the Purkinje-muscle junctions, and an electrophysiologically accurate model of the ventricular myocytes and tissue, which includes transmural and apex-to-base gradients of action potential characteristics. We solve the electrophysiology governing equations using the finite element method and compute both a 6-lead precordial electrocardiogram (ECG) and the activation wavefronts over time. We are particularly concerned with the validation of the various methods used in our model and, in this regard, propose a series of validation criteria that we consider essential. These include producing a physiologically accurate ECG, a correct ventricular activation sequence, and the inducibility of ventricular fibrillation. Among other components, we conclude that a Purkinje geometry with a high density of Purkinje muscle junctions covering the right and left ventricular endocardial surfaces as well as transmural and apex-to-base gradients in action potential characteristics are necessary to produce ECGs and time activation plots that agree with physiological observations. PMID:25493967

  19. Determination of Al in cake mix: Method validation and estimation of measurement uncertainty

    NASA Astrophysics Data System (ADS)

    Andrade, G.; Rocha, O.; Junqueira, R.

    2016-07-01

    An analytical method for the determination of Al in cake mix was developed. Acceptable values were obtained for the following parameters: linearity, detection limit (LOD, 5.00 mg kg-1), quantification limit (LOQ, 12.5 mg kg-1), recovery (between 91 and 102%), relative standard deviation under repeatability and within-reproducibility conditions (<20.0%), and measurement uncertainty (<10.0%). The results of the validation process showed that the proposed method is fit for purpose.

  20. Determination of methylmercury in marine biota samples: method validation.

    PubMed

    Carrasco, Luis; Vassileva, Emilia

    2014-05-01

    Regulatory authorities are expected to measure concentrations of contaminants in foodstuffs, but the simple determination of the total amount is not sufficient for fully judging their impact on human health. In particular, the methylation of metals generally increases their toxicity; therefore validated analytical methods producing reliable results for the assessment of methylated species are highly needed. Nowadays, there is no legal limit for methylmercury (MeHg) in food matrices. Hence, no standardized method for the determination of MeHg exists within the international jurisdiction. Contemplating the possibility of a future legislative limit, a method for low-level determination of MeHg in marine biota matrices, based on aqueous-phase ethylation followed by purge and trap and gas chromatography (GC) coupled to pyrolysis-atomic fluorescence spectrometry (Py-AFS) detection, has been developed and validated. Five different extraction procedures, namely acid and alkaline leaching assisted by microwave and conventional oven heating, as well as enzymatic digestion, were evaluated in terms of their efficiency to extract MeHg from Scallop soft tissue IAEA-452 Certified Reference Material. Alkaline extraction with 25% (w/w) KOH in methanol, microwave-assisted extraction (MAE) with 5M HCl and enzymatic digestion with protease XIV yielded the highest extraction recoveries. Standard addition or the introduction of a dilution step were successfully applied to overcome the matrix effects observed when microwave-assisted extraction using 25% (w/w) KOH in methanol or 25% (w/v) aqueous TMAH were used. ISO 17025 and Eurachem guidelines were followed to perform the validation of the methodology. Accordingly, blanks, selectivity, calibration curve, linearity (0.9995), working range (1-800 pg), recovery (97%), precision, traceability, limit of detection (0.45 pg), limit of quantification (0.85 pg) and expanded uncertainty (15.86%, k=2) were assessed with Fish protein Dorm-3 Certified

  1. Flexibility and applicability of β-expectation tolerance interval approach to assess the fitness of purpose of pharmaceutical analytical methods.

    PubMed

    Bouabidi, A; Talbi, M; Bourichi, H; Bouklouze, A; El Karbane, M; Boulanger, B; Brik, Y; Hubert, Ph; Rozet, E

    2012-12-01

    An innovative, versatile strategy using total error has been proposed to decide on a method's validity; it controls the risk of accepting an unsuitable assay and provides the ability to predict the reliability of future results. This strategy is based on the simultaneous combination of the systematic (bias) and random (imprecision) errors of analytical methods. Using validation standards, both types of error are combined through the use of a prediction interval or β-expectation tolerance interval. Finally, an accuracy profile is built by connecting, on the one hand, all the upper tolerance limits and, on the other hand, all the lower tolerance limits. This profile, combined with pre-specified acceptance limits, allows the evaluation of the validity of any quantitative analytical method and thus its fitness for its intended purpose. In this work, the accuracy profile approach was evaluated on several types of analytical methods encountered in the pharmaceutical industrial field, covering different pharmaceutical matrices. The four studied examples illustrate the flexibility and applicability of this approach for matrices ranging from tablets to syrups, for different techniques such as liquid chromatography and UV spectrophotometry, and for the categories of assays commonly encountered in the pharmaceutical industry, i.e. content assays, dissolution assays, and quantitative impurity assays. The accuracy profile approach assesses the fitness for purpose of these methods for their future routine application. It also allows the selection of the most suitable calibration curve, supports the adequate evaluation of a potential matrix effect with efficient solutions, and enables the correct definition of the limits of quantification of the studied analytical procedures.
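
    In its simplest single-series form, a β-expectation tolerance interval coincides with a prediction interval, as in the sketch below; a full accuracy profile combines variance components from several validation series, so treat this as a conceptual illustration with invented recovery data.

```python
import numpy as np
from scipy import stats

def beta_expectation_interval(measurements, beta=0.95):
    """Two-sided beta-expectation tolerance interval: the interval expected
    to contain a fraction `beta` of future single results (single-series
    case, where it reduces to a prediction interval)."""
    x = np.asarray(measurements, dtype=float)
    n = len(x)
    mean, s = x.mean(), x.std(ddof=1)
    t = stats.t.ppf((1 + beta) / 2, df=n - 1)
    half = t * s * np.sqrt(1 + 1 / n)     # mean +/- t * s * sqrt(1 + 1/n)
    return mean - half, mean + half

# hypothetical recoveries (%) of a validation standard at one level
lo, hi = beta_expectation_interval([98.2, 101.5, 99.7, 100.9, 97.8, 100.3])
print(f"[{lo:.1f}%, {hi:.1f}%]  vs  acceptance limits [95%, 105%]")
```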

  3. Experimental validation of boundary element methods for noise prediction

    NASA Technical Reports Server (NTRS)

    Seybert, A. F.; Oswald, Fred B.

    1992-01-01

    Experimental validation of methods to predict radiated noise is presented. A combined finite element and boundary element model was used to predict the vibration and noise of a rectangular box excited by a mechanical shaker. The predicted noise was compared to sound power measured by the acoustic intensity method. Inaccuracies in the finite element model shifted the resonance frequencies by about 5 percent. The predicted and measured sound power levels agree within about 2.5 dB. In a second experiment, measured vibration data was used with a boundary element model to predict noise radiation from the top of an operating gearbox. The predicted and measured sound power for the gearbox agree within about 3 dB.

  4. Video quality experts group: the quest for valid objective methods

    NASA Astrophysics Data System (ADS)

    Corriveau, Philip J.; Webster, Arthur A.; Rohaly, Ann M.; Libert, John M.

    2000-06-01

    Subjective assessment methods have been used reliably for many years to evaluate video quality. They continue to provide the most reliable assessments compared to objective methods. Some issues that arise with subjective assessment include the cost of conducting the evaluations and the fact that these methods cannot easily be used to monitor video quality in real time. Furthermore, traditional, analog objective methods, while still necessary, are not sufficient to measure the quality of digitally compressed video systems. Thus, there is a need to develop new objective methods utilizing the characteristics of the human visual system. While several new objective methods have been developed, there is to date no internationally standardized method. The Video Quality Experts Group (VQEG) was formed in October 1997 to address video quality issues. The group is composed of experts from various backgrounds and affiliations, including participants from several internationally recognized organizations working in the field of video quality assessment. The majority of participants are active in the International Telecommunication Union (ITU), and VQEG combines the expertise and resources found in several ITU Study Groups to work towards a common goal. The first task undertaken by VQEG was to provide a validation of objective video quality measurement methods leading to Recommendations in both the Telecommunication (ITU-T) and Radiocommunication (ITU-R) sectors of the ITU. To this end, VQEG designed and executed a test program to compare subjective video quality evaluations to the predictions of a number of proposed objective measurement methods for video quality in the bit rate range of 768 kb/s to 50 Mb/s. The results of this test show that there is no objective measurement system that is currently able to replace subjective testing. Depending on the metric used for evaluation, the performance of eight or nine models was found to be statistically equivalent, leading to the

  5. Validation and applications of an expedited tablet friability method.

    PubMed

    Osei-Yeboah, Frederick; Sun, Changquan Calvin

    2015-04-30

    The harmonized monograph on the tablet friability test in the United States Pharmacopeia (USP), European Pharmacopeia (Pharm. Eur.), and Japanese Pharmacopeia (JP) is designed to assess the adequacy of the mechanical strength of a batch of tablets. Currently, its potential applications in formulation development have been limited by the batch requirement, which is both labor- and material-intensive. To address this, we have developed an expedited tablet friability test method using the existing USP test apparatus. The validity of the expedited friability method is established by showing that the friability data from the expedited method are not statistically different from those from the standard pharmacopeial method using materials of very different mechanical properties, i.e., microcrystalline cellulose and dibasic calcium phosphate dihydrate. Using the expedited friability method, we have shown that the relationship between tablet friability and tablet mechanical strength follows a power law expression. Furthermore, potential applications of this expedited friability test in facilitating systematic and efficient tablet formulation and tooling design are demonstrated with examples.
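
    The reported power-law relationship invites a small worked example. The sketch below fits F = a·TS^b by linear regression in log-log space; the tensile strength and friability values are invented stand-ins, not the paper's data.

      import numpy as np

      ts = np.array([0.5, 1.0, 1.5, 2.0, 3.0])       # tablet tensile strength, MPa
      fr = np.array([2.40, 0.62, 0.29, 0.17, 0.08])  # friability, % weight loss

      # A power law F = a * TS**b is linear in log-log coordinates.
      b, log_a = np.polyfit(np.log(ts), np.log(fr), 1)
      print(f"F = {np.exp(log_a):.2f} * TS^{b:.2f}")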

  6. Examining the Content Validity of the WHOQOL-BRF from Respondents' Perspective by Quantitative Methods

    ERIC Educational Resources Information Center

    Yao, Grace; Wu, Chia-Huei; Yang, Cheng-Ta

    2008-01-01

    Content validity, the extent to which a measurement reflects the specific intended domain of content, is a basic type of validity for a valid measurement. It was usually examined qualitatively and relied on experts' subjective judgments, not on respondents' responses. Therefore, the purpose of this study was to introduce and demonstrate how to use…

  7. Validated spectrofluorometric methods for determination of amlodipine besylate in tablets

    NASA Astrophysics Data System (ADS)

    Abdel-Wadood, Hanaa M.; Mohamed, Niveen A.; Mahmoud, Ashraf M.

    2008-08-01

    Two simple and sensitive spectrofluorometric methods have been developed and validated for the determination of amlodipine besylate (AML) in tablets. The first method was based on the condensation reaction of AML with ninhydrin and phenylacetaldehyde in buffered medium (pH 7.0), resulting in the formation of a green fluorescent product, which exhibits excitation and emission maxima at 375 and 480 nm, respectively. The second method was based on the reaction of AML with 7-chloro-4-nitro-2,1,3-benzoxadiazole (NBD-Cl) in a buffered medium (pH 8.6), resulting in the formation of a highly fluorescent product, which was measured fluorometrically at 535 nm (λex 480 nm). The factors affecting the reactions were studied and optimized. Under the optimum reaction conditions, linear relationships with good correlation coefficients (0.9949-0.9997) were found between the fluorescence intensity and the concentration of AML in the ranges of 0.35-1.8 and 0.55-3.0 μg ml-1 for the ninhydrin and NBD-Cl methods, respectively. The limits of detection were 0.09 and 0.16 μg ml-1 for the first and second method, respectively. The precisions of the methods were satisfactory; the relative standard deviations ranged from 1.69 to 1.98%. The proposed methods were successfully applied to the analysis of AML in pure form and in pharmaceutical dosage forms with good accuracy; the recovery percentages ranged from 100.4 to 100.8 (±1.70-2.32)%. The results compared favorably with those of the reported method.

  8. Protein structure validation using a semi-empirical method

    PubMed Central

    Lahiri, Tapobrata; Singh, Kalpana; Pal, Manoj Kumar; Verma, Gaurav

    2012-01-01

    Current practice for validating a predicted protein structural model is knowledge-based: scoring parameters are derived from already known structures, and a validation decision is obtained from this structural information. For example, the Ramachandran Score gives the percentage conformity with steric properties, a higher value of which implies higher acceptability. On the other hand, the Force-Field Energy Score gives conformity with energetic stability, a higher value of which implies lower acceptability. Naturally, setting these two scoring parameters as target objectives sometimes yields a set of multiple models for the same protein, for which acceptance based on one parameter, say the Ramachandran Score, may not agree well with acceptance of the same model based on the other parameter, say the energy score. The confusion set of such models can be further resolved by introducing parameters whose values are easily obtainable through experiments on the same protein. In this work it was found that the confusion regarding final acceptance of a model out of multiple models of the same protein can be removed using the Surface Rough Index, a parameter which can be obtained through a semi-empirical method from an ordinary microscopic image of the heat-denatured protein. PMID:23275692

  9. Computational Methods for RNA Structure Validation and Improvement.

    PubMed

    Jain, Swati; Richardson, David C; Richardson, Jane S

    2015-01-01

    With increasing recognition of the roles RNA molecules and RNA/protein complexes play in an unexpected variety of biological processes, understanding of RNA structure-function relationships is of high current importance. To make clean biological interpretations from three-dimensional structures, it is imperative to have high-quality, accurate RNA crystal structures available, and the community has thoroughly embraced that goal. However, due to the many degrees of freedom inherent in RNA structure (especially for the backbone), it is a significant challenge to succeed in building accurate experimental models for RNA structures. This chapter describes the tools and techniques our research group and our collaborators have developed over the years to help RNA structural biologists both evaluate and achieve better accuracy. Expert analysis of large, high-resolution, quality-conscious RNA datasets provides the fundamental information that enables automated methods for robust and efficient error diagnosis in validating RNA structures at all resolutions. The even more crucial goal of correcting the diagnosed outliers has steadily developed toward highly effective, computationally based techniques. Automation enables solving complex issues in large RNA structures, but cannot circumvent the need for thoughtful examination of local details, and so we also provide some guidance for interpreting and acting on the results of current structure validation for RNA. PMID:26068742

  10. Validating for Use and Interpretation: A Mixed Methods Contribution Illustrated

    ERIC Educational Resources Information Center

    Morell, Linda; Tan, Rachael Jin Bee

    2009-01-01

    Researchers in the areas of psychology and education strive to understand the intersections among validity, educational measurement, and cognitive theory. Guided by a mixed model conceptual framework, this study investigates how respondents' opinions inform the validation argument. Validity evidence for a science assessment was collected through…

  11. Validated HPLC method for quantifying permethrin in pharmaceutical formulations.

    PubMed

    García, E; García, A; Barbas, C

    2001-03-01

    An isocratic HPLC method for permethrin determination in raw material and in pharmaceutical presentations such as lotion and shampoo has been developed and validated following ICH recommendations. Cis- and trans-isomers, impurities and degradation products are well separated. The chromatographic analyses were performed on a 4 microm particle C-18 Nova-Pak (Waters, Madrid, Spain) column (15 x 0.39 cm) kept in a Biorad column oven at 35 degrees C. The mobile phase consisted of methanol-water (78:22, v/v) at a flow rate of 1 ml/min. UV detection was performed at 272 nm, and peaks were identified by their retention times as compared with standards and confirmed by their characteristic spectra using the photodiode array detector.

  12. Method and validity of transcranial sonography in movement disorders.

    PubMed

    Skoloudík, David; Walter, Uwe

    2010-01-01

    Transcranial sonography (TCS) of the brain parenchyma in patients with movement and other neurodegenerative disorders has developed with increasing dynamics during the past two decades. The specific advantages of TCS are its different visualization of brain structures compared to other neuroimaging methods, owing to its different physical imaging principle; high-resolution imaging of echogenic deep brain structures; real-time dynamic imaging with high temporal resolution; relatively low cost of technical equipment; wide availability; short investigation time; noninvasiveness; mobility and bedside availability; and little corruption by patients' movements. TCS has proved sensitive and reliable in detecting disease-specific alterations of brainstem structures and basal ganglia in various movement disorders. Here, we give an overview of the technical requirements and recommendations for the standardized application of TCS of deep brain structures in movement disorders. We discuss methodological potentials and limitations of TCS, its validity, and future developments.

  13. The Equivalence of Positive and Negative Methods of Validating a Learning Hierarchy.

    ERIC Educational Resources Information Center

    Kee, Kevin N.; White, Richard T.

    1979-01-01

    The compound nature of Gagne's original definition of learning hierarchies leads to two methods of validation, the positive and negative methods. Sections of a hierarchy that had been validated by the negative method were subjected to test by the more cumbersome positive method, and again were found to be valid. (Author/RD)

  14. [Methods for quantifying phasic skin conductance amplitudes: threats to validity?].

    PubMed

    Zimmer, H; Vossel, G

    1993-01-01

    Two methods of determining the event-related skin conductance response (SCR) amplitude are in common use. In one of these, the difference in conductance between the point of onset and the peak level of a single wave is measured (method 1). The second approach is to determine the difference between two measures, one characterizing the prestimulus level, the other the highest conductance point of the SCR reached within a fixed period following the stimulus (method 2). A problem with quantifying the SCR amplitude occurs when an SCR is elicited before an immediately preceding response has had time to recover, because in this case the two methods lead to quite different values. If the amplitude of each response is measured from its own individual deflection point, the measurable amplitude of the second response will be smaller when it occurs immediately after or in the ascending limb of the first response. The problem is most evident in situations with a high probability of response superimposition, such as when a large number of nonspecific responses occur at the same time as the SCRs. This is found in individuals with a high degree of electrodermal lability. Electrodermal lability refers to a psychophysiological construct that is operationally defined by the frequency of spontaneous electrodermal fluctuations. In the present study, we therefore systematically investigated the effects of the two scoring methods on SCR amplitude in relation to lability by analyzing electrodermal data from two habituation studies. As expected, several method-specific effects which were related to lability emerged. Results and questions concerning the relevance of the findings are discussed, with special emphasis on the validity of psychophysiological investigations.
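
    A small simulation makes the divergence between the two scoring methods concrete. In the sketch below a second SCR rides on the recovery limb of a first response; the waveform parameters, window lengths and baseline choices are invented and deliberately simplistic.

      import numpy as np

      t = np.linspace(0.0, 10.0, 2001)                 # time, s
      scr = (np.exp(-((t - 3.0) / 0.8) ** 2)           # first response
             + 0.8 * np.exp(-((t - 5.0) / 1.0) ** 2))  # second, on its recovery limb

      stim2 = 3.8                                      # onset of second stimulus, s
      seg = scr[(t >= stim2) & (t <= stim2 + 4.0)]
      peak_i = int(seg.argmax())

      # Method 1: from the response's own deflection point (local minimum
      # between stimulus onset and peak) up to the peak.
      amp1 = seg[peak_i] - seg[: peak_i + 1].min()

      # Method 2: from the prestimulus level (mean of the preceding second)
      # to the highest point within the fixed post-stimulus window.
      prestim = scr[(t >= stim2 - 1.0) & (t < stim2)].mean()
      amp2 = seg[peak_i] - prestim

      print(f"method 1: {amp1:.3f}, method 2: {amp2:.3f} (arbitrary units)")
      # With superimposition, method 2 can even turn negative (scored as no
      # response), while method 1 still recovers a sizeable amplitude.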

  15. [Validation of a HPLC method for ochratoxin A determination].

    PubMed

    Bulea, Delia; Spac, A F; Dorneanu, V

    2011-01-01

    Ochratoxin A is a mycotoxin produced by various species of Aspergillus and Penicillium. Ochratoxin A has been detected in cereals and cereal products, coffee beans, beer, wine, spices, pig's kidney and cow's milk. For ochratoxin A, an HPLC method was developed and validated. Ochratoxin A was determined by RP-HPLC, using a liquid chromatograph type HP 1090 Series II equipped with a fluorescence detector. The analysis was performed with a Phenomenex column, type Luna C18(2) 100A (150 x 4.6 mm; 5 microm), with a mobile phase consisting of a mixture of acetonitrile/water/acetic acid (99/99/2) at a flow rate of 0.7 mL/min. For detection, the excitation wavelength was 228 nm and the emission wavelength was 423 nm. The calibration graph was linear in the 6.25-50 ng/mL concentration range (r2 = 0.9991). The detection limit was 1.6 ng/mL and the quantification limit was 4.9 ng/mL. The method precision (RSD = 2.4975%) and accuracy (recovery of 100.1%) were studied. The HPLC method was applied to ochratoxin A determination in food samples with good results. PMID:21870763
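
    For readers unfamiliar with how such limits are obtained, the sketch below shows the common calibration-based route (LOD = 3.3·σ/slope, LOQ = 10·σ/slope, with σ the standard deviation of the regression residuals); the calibration points are invented, not the paper's raw data.

      import numpy as np

      conc = np.array([6.25, 12.5, 25.0, 37.5, 50.0])    # ng/mL
      signal = np.array([10.3, 20.9, 41.2, 62.4, 82.0])  # fluorescence units

      slope, intercept = np.polyfit(conc, signal, 1)
      resid = signal - (slope * conc + intercept)
      sigma = resid.std(ddof=2)  # residual SD; ddof=2 for the two fitted parameters

      print(f"LOD = {3.3 * sigma / slope:.2f} ng/mL, LOQ = {10 * sigma / slope:.2f} ng/mL")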

  16. Testing and Validation of the Dynamic Inertia Measurement Method

    NASA Technical Reports Server (NTRS)

    Chin, Alexander W.; Herrera, Claudia Y.; Spivey, Natalie D.; Fladung, William A.; Cloutier, David

    2015-01-01

    The Dynamic Inertia Measurement (DIM) method uses a ground vibration test setup to determine the mass properties of an object using information from frequency response functions. Most conventional mass properties testing involves using spin tables or pendulum-based swing tests, which for large aerospace vehicles becomes increasingly difficult and time-consuming, and therefore expensive, to perform. The DIM method has been validated on small test articles but has not been successfully proven on large aerospace vehicles. In response, the National Aeronautics and Space Administration Armstrong Flight Research Center (Edwards, California) conducted mass properties testing on an "iron bird" test article that is comparable in mass and scale to a fighter-type aircraft. The simple two-I-beam design of the "iron bird" was selected to ensure accurate analytical mass properties. Traditional swing testing was also performed to compare the level of effort, amount of resources, and quality of data with the DIM method. The DIM test showed favorable results for the center of gravity and moments of inertia; however, the products of inertia showed disagreement with analytical predictions.

  17. Knowledge Transmission versus Social Transformation: A Critical Analysis of Purpose in Elementary Social Studies Methods Textbooks

    ERIC Educational Resources Information Center

    Butler, Brandon M.; Suh, Yonghee; Scott, Wendy

    2015-01-01

    In this article, the authors investigate the extent to which 9 elementary social studies methods textbooks present the purpose of teaching and learning social studies. Using Stanley's three perspectives of teaching social studies for knowledge transmission, method of intelligence, and social transformation, we analyze how these texts prepare…

  18. Validation of a digital PCR method for quantification of DNA copy number concentrations by using a certified reference material.

    PubMed

    Deprez, Liesbet; Corbisier, Philippe; Kortekaas, Anne-Marie; Mazoua, Stéphane; Beaz Hidalgo, Roxana; Trapmann, Stefanie; Emons, Hendrik

    2016-09-01

    Digital PCR has become the emerging technique for the sequence-specific detection and quantification of nucleic acids for various applications. During the past years, numerous reports on the development of new digital PCR methods have been published. Maturation of these developments into reliable analytical methods suitable for diagnostic or other routine testing purposes requires their validation for the intended use. Here, the results of an in-house validation of a droplet digital PCR method are presented. This method is intended for the quantification of the absolute copy number concentration of a purified linearized plasmid in solution with a nucleic acid background. It has been investigated which factors within the measurement process have a significant effect on the measurement results, and the contribution to the overall measurement uncertainty has been estimated. A comprehensive overview is provided on all the aspects that should be investigated when performing an in-house method validation of a digital PCR method. PMID:27617230
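
    The core quantification step of droplet digital PCR is compact enough to write down. The hedged sketch below recovers the copy-number concentration from the fraction of positive droplets via Poisson statistics; the droplet counts and nominal droplet volume are invented, and the confidence interval is a simple delta-method approximation rather than a full uncertainty budget of the kind the paper describes.

      import math

      positives = 14_500             # droplets classified positive
      total = 20_000                 # accepted droplets
      droplet_volume_ul = 0.85e-3    # assumed droplet volume in microlitres

      p = positives / total
      lam = -math.log(1.0 - p)       # mean template copies per droplet (Poisson)
      conc = lam / droplet_volume_ul # copies per microlitre of reaction

      # Approximate 95% CI on lambda from the binomial error on p.
      se_p = math.sqrt(p * (1 - p) / total)
      se_lam = se_p / (1 - p)        # delta method: d(lambda)/dp = 1/(1-p)
      lo, hi = lam - 1.96 * se_lam, lam + 1.96 * se_lam
      print(f"{conc:.0f} copies/uL (lambda = {lam:.3f}, 95% CI {lo:.3f}-{hi:.3f})")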

  19. Formal methods and their role in digital systems validation for airborne systems

    NASA Technical Reports Server (NTRS)

    Rushby, John

    1995-01-01

    This report is based on one prepared as a chapter for the FAA Digital Systems Validation Handbook (a guide to assist FAA certification specialists with advanced technology issues). Its purpose is to explain the use of formal methods in the specification and verification of software and hardware requirements, designs, and implementations; to identify the benefits, weaknesses, and difficulties in applying these methods to digital systems used in critical applications; and to suggest factors for consideration when formal methods are offered in support of certification. The presentation concentrates on the rationale for formal methods and on their contribution to assurance for critical applications within a context such as that provided by DO-178B (the guidelines for software used on board civil aircraft); it is intended as an introduction for those to whom these topics are new.

  20. Radionuclide methods validation with FSV [Fort St. Vrain] data

    SciTech Connect

    Jovanovic, V.

    1989-09-29

    As part of the radionuclide methods verification program at GA, a fuel performance analysis of the Fort St. Vrain (FSV) core was performed using the reference fuel performance and fission gas release models. The purpose of the analysis was to predict the fuel and graphite temperature distributions, fuel particle failure, and fission gas release as a function of time, and to compare the predicted fission gas release with data taken as part of the FSV radiochemistry surveillance program. The analysis covered the entire operating time of the FSV plant except for the last 18 days prior to the final shutdown because the operating parameters and data for this period were not available when the analysis was performed. 3 refs., 29 figs., 5 tabs.

  1. Determination of formaldehyde in food and feed by an in-house validated HPLC method.

    PubMed

    Wahed, P; Razzaq, Md A; Dharmapuri, S; Corrales, M

    2016-07-01

    Formalin is carcinogenic and is detrimental to public health. The illegal addition of formalin (37% formaldehyde and 14% methanol) to foods to extend their shelf-life is considered to be a common practice in Bangladesh. The lack of accurate methods and the ubiquitous presence of formaldehyde in foods make the detection of illegally added formalin challenging. With the aim of helping regulatory authorities, a sensitive high performance liquid chromatography method was validated for the quantitative determination of formaldehyde in mango, fish and milk. The method was fit-for-purpose and showed good analytical performance in terms of specificity, linearity, precision, recovery and robustness. The expanded uncertainty was <35%. The validated method was applied to screen samples of fruits, vegetables, fresh fish, milk and fish feed collected from different local markets in Dhaka, Bangladesh. Levels of formaldehyde in food samples were compared with published data. The applicability of the method in different food matrices might mean it has potential as a reference standard method. PMID:26920321

  3. Forward Modeling of Electromagnetic Methods Using General Purpose Finite Element Software

    NASA Astrophysics Data System (ADS)

    Butler, S. L.

    2015-12-01

    Electromagnetic methods are widely used in mineral exploration and environmental applications and are increasingly being used in hydrocarbon exploration. Forward modeling of electromagnetic methods remains challenging and is mostly carried out using purpose-built research software. General purpose commercial modeling software has become increasingly flexible and powerful in recent years and is now capable of modeling field geophysical electromagnetic techniques. In this contribution, I will show examples of the use of commercial finite element modeling software Comsol Multiphysics for modeling frequency and time-domain electromagnetic techniques as well as for modeling the Very Low Frequency technique and magnetometric resistivity. Comparisons are made with analytical solutions, benchmark numerical solutions, analog experiments and field data. Although some calculations take too long to be practical as part of an inversion scheme, I suggest that modeling of this type will be useful for modeling novel techniques and for educational purposes.

  4. A validated GC/MS method for the determination of amisulpride in whole blood.

    PubMed

    Papoutsis, Ioannis; Rizopoulou, Anna; Nikolaou, Panagiota; Pistos, Constantinos; Spiliopoulou, Chara; Athanaselis, Sotiris

    2014-02-01

    A sensitive GC/MS method for the determination of amisulpride in whole blood was developed, optimized and validated. Sample preparation included solid-phase extraction using HF Bond Elut C18 cartridges and further derivatization with heptafluorobutyric anhydride (HFBA). The limits of detection and quantification were 3.00 and 10.0 μg/L, respectively. The calibration curves were linear up to 1000 μg/L (R(2)≥0.991). Absolute recovery ranged from 94.2 to 101%. Accuracy was found to be between -8.7 and 1.9%, and imprecision was less than 10.0%. The developed method covers the generally accepted therapeutic range and can also cover levels above it. This makes the method suitable for the determination of amisulpride not only for clinical purposes in psychiatric patients, but also in the investigation of forensic cases where amisulpride is involved.

  5. MICROORGANISMS IN BIOSOLIDS: ANALYTICAL METHODS DEVELOPMENT, STANDARDIZATION, AND VALIDATION

    EPA Science Inventory

    The objective of this presentation is to discuss pathogens of concern in biosolids, the analytical techniques used to evaluate microorganisms in biosolids, and to discuss standardization and validation of analytical protocols for microbes within such a complex matrix. Implicatio...

  6. An Anatomically Validated Brachial Plexus Contouring Method for Intensity Modulated Radiation Therapy Planning

    SciTech Connect

    Van de Velde, Joris; Audenaert, Emmanuel; Speleers, Bruno; Vercauteren, Tom; Mulliez, Thomas; Vandemaele, Pieter; Achten, Eric; Kerckaert, Ingrid; D'Herde, Katharina; De Neve, Wilfried; Van Hoof, Tom

    2013-11-15

    Purpose: To develop contouring guidelines for the brachial plexus (BP) using anatomically validated cadaver datasets. Magnetic resonance imaging (MRI) and computed tomography (CT) were used to obtain detailed visualizations of the BP region, with the goal of achieving maximal inclusion of the actual BP in a small contoured volume while also accommodating for anatomic variations. Methods and Materials: CT and MRI were obtained for 8 cadavers positioned for intensity modulated radiation therapy. 3-dimensional reconstructions of soft tissue (from MRI) and bone (from CT) were combined to create 8 separate enhanced CT project files. Dissection of the corresponding cadavers anatomically validated the reconstructions created. Seven enhanced CT project files were then automatically fitted, separately in different regions, to obtain a single dataset of superimposed BP regions that incorporated anatomic variations. From this dataset, improved BP contouring guidelines were developed. These guidelines were then applied to the 7 original CT project files and also to 1 additional file, left out from the superimposing procedure. The percentage of BP inclusion was compared with the published guidelines. Results: The anatomic validation procedure showed a high level of conformity for the BP regions examined between the 3-dimensional reconstructions generated and the dissected counterparts. Accurate and detailed BP contouring guidelines were developed, which provided corresponding guidance for each level in a clinical dataset. An average margin of 4.7 mm around the anatomically validated BP contour is sufficient to accommodate for anatomic variations. Using the new guidelines, 100% inclusion of the BP was achieved, compared with a mean inclusion of 37.75% when published guidelines were applied. Conclusion: Improved guidelines for BP delineation were developed using combined MRI and CT imaging with validation by anatomic dissection.

  7. VALIDATION OF ANALYTICAL METHODS AND INSTRUMENTATION FOR BERYLLIUM MEASUREMENT: REVIEW AND SUMMARY OF AVAILABLE GUIDES, PROCEDURES, AND PROTOCOLS

    SciTech Connect

    Ekechukwu, A

    2009-05-27

    Method validation is the process of evaluating whether an analytical method is acceptable for its intended purpose. For pharmaceutical methods, guidelines from the United States Pharmacopeia (USP), International Conference on Harmonisation (ICH), and the United States Food and Drug Administration (USFDA) provide a framework for performing such validations. In general, methods for regulatory compliance must include studies on specificity, linearity, accuracy, precision, range, detection limit, quantitation limit, and robustness. Elements of these guidelines are readily adapted to the issue of validation for beryllium sampling and analysis. This document provides a listing of available sources which can be used to validate analytical methods and/or instrumentation for beryllium determination. A literature review was conducted of available standard methods and publications used for method validation and/or quality control. A comprehensive listing of the articles, papers and books reviewed is given in the Appendix. Available validation documents and guides are listed therein; each has a brief description of application and use. In the referenced sources, there are varying approaches to validation and varying descriptions of the validation process at different stages in method development. This discussion focuses on validation and verification of fully developed methods and instrumentation that have been offered up for use or approval by other laboratories or official consensus bodies such as ASTM International, the International Standards Organization (ISO) and the Association of Official Analytical Chemists (AOAC). This review was conducted as part of a collaborative effort to investigate and improve the state of validation for measuring beryllium in the workplace and the environment. Documents and publications from the United States and Europe are included. Unless otherwise specified, all referenced documents were published in English.

  8. An overall uncertainty approach for the validation of analytical separation methods.

    PubMed

    Saffaj, T; Ihssane, B; Jhilal, F; Bouchafra, H; Laslami, S; Sosse, S Alaoui

    2013-08-21

    The aim of this paper is to recommend a new strategy for analytical validation based on the uncertainty profile as a graphical decision-making tool, and to exemplify a novel method to estimate the measurement uncertainty. Indeed, the innovative formula that we offer to assess the uncertainty is based on the calculation of the β-content tolerance interval. Three chemometric methodologies are presented to build the (β, γ) tolerance interval, namely: the Satterthwaite approximation, the GPQ (generalized pivotal quantity) method and the MLS (modified large-sample) procedure. Furthermore, we illustrate the applicability and flexibility of the uncertainty profile to assess the fitness for purpose of chromatographic and electrophoretic analytical methods, which use different instrumental techniques such as liquid chromatography (LC-UV, LC-MS), gas chromatography (GC-FID, GC-MS) and capillary electrophoresis (CE, CE-MS). In addition, we demonstrate here that (β, γ) tolerance intervals provide accurate estimates of the routine uncertainty. In particular, we show that there is no statistically significant difference between the uncertainties estimated by our methodology at the validation stage and those obtained from the routine phase.
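
    A hedged sketch of the β-content tolerance interval may clarify the calculation. It uses Howe's classical chi-square approximation rather than the Satterthwaite, GPQ or MLS constructions discussed in the paper, and the recovery data are invented.

      import numpy as np
      from scipy import stats

      def beta_content_interval(x, beta=0.90, gamma=0.95):
          # Two-sided interval containing at least a proportion beta of the
          # population with confidence gamma (Howe's approximation).
          n = len(x)
          df = n - 1
          z = stats.norm.ppf((1 + beta) / 2)
          k = z * np.sqrt(df * (1 + 1 / n) / stats.chi2.ppf(1 - gamma, df))
          m, s = np.mean(x), np.std(x, ddof=1)
          return m - k * s, m + k * s

      recoveries = np.array([98.6, 100.2, 99.4, 101.0, 99.8, 100.5, 98.9, 100.1])
      print(beta_content_interval(recoveries))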

  9. Use of Validation by Enterprises for Human Resource and Career Development Purposes. Cedefop Reference Series No 96

    ERIC Educational Resources Information Center

    Cedefop - European Centre for the Development of Vocational Training, 2014

    2014-01-01

    European enterprises give high priority to assessing skills and competences, seeing this as crucial for recruitment and human resource management. Based on a survey of 400 enterprises, 20 in-depth case studies and interviews with human resource experts in 10 countries, this report analyses the main purposes of competence assessment, the standards…

  10. Validation Methods for Fault-Tolerant avionics and control systems, working group meeting 1

    NASA Technical Reports Server (NTRS)

    1979-01-01

    The proceedings of the first working group meeting on validation methods for fault tolerant computer design are presented. The state of the art in fault tolerant computer validation was examined in order to provide a framework for future discussions concerning research issues for the validation of fault tolerant avionics and flight control systems. The development of positions concerning critical aspects of the validation process is described.

  11. Testing alternative ground water models using cross-validation and other methods

    USGS Publications Warehouse

    Foglia, L.; Mehl, S.W.; Hill, M.C.; Perona, P.; Burlando, P.

    2007-01-01

    Many methods can be used to test alternative ground water models. Of concern in this work are methods able to (1) rank alternative models (also called model discrimination) and (2) identify observations important to parameter estimates and predictions (equivalent to the purpose served by some types of sensitivity analysis). Some of the measures investigated are computationally efficient; others are computationally demanding. The latter are generally needed to account for model nonlinearity. The efficient model discrimination methods investigated include the information criteria: the corrected Akaike information criterion, Bayesian information criterion, and generalized cross-validation. The efficient sensitivity analysis measures used are dimensionless scaled sensitivity (DSS), composite scaled sensitivity, and parameter correlation coefficient (PCC); the other statistics are DFBETAS, Cook's D, and observation-prediction statistic. Acronyms are explained in the introduction. Cross-validation (CV) is a computationally intensive nonlinear method that is used for both model discrimination and sensitivity analysis. The methods are tested using up to five alternative parsimoniously constructed models of the ground water system of the Maggia Valley in southern Switzerland. The alternative models differ in their representation of hydraulic conductivity. A new method for graphically representing CV and sensitivity analysis results for complex models is presented and used to evaluate the utility of the efficient statistics. The results indicate that for model selection, the information criteria produce similar results at much smaller computational cost than CV. For identifying important observations, the only obviously inferior linear measure is DSS; the poor performance was expected because DSS does not include the effects of parameter correlation and PCC reveals large parameter correlations. © 2007 National Ground Water Association.
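
    As a sketch of the computationally efficient criteria named above, the hypothetical example below computes the corrected Akaike information criterion (AICc) and Bayesian information criterion (BIC) from each candidate model's sum of squared weighted residuals; the model names, residuals and observation count are invented.

      import numpy as np

      def aicc_bic(sswr, n_obs, n_params):
          # Gaussian-likelihood forms commonly used for least-squares models.
          aic = n_obs * np.log(sswr / n_obs) + 2 * n_params
          aicc = aic + 2 * n_params * (n_params + 1) / (n_obs - n_params - 1)
          bic = n_obs * np.log(sswr / n_obs) + n_params * np.log(n_obs)
          return aicc, bic

      models = {"uniform K": (412.0, 3), "3-zone K": (268.0, 5), "5-zone K": (245.0, 7)}
      for name, (sswr, k) in models.items():
          aicc, bic = aicc_bic(sswr, n_obs=96, n_params=k)
          print(f"{name:>10}: AICc = {aicc:6.1f}, BIC = {bic:6.1f}")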

  12. A photographic method to measure food item intake. Validation in geriatric institutions.

    PubMed

    Pouyet, Virginie; Cuvelier, Gérard; Benattar, Linda; Giboreau, Agnès

    2015-01-01

    From both a clinical and research perspective, measuring food intake is an important issue in geriatric institutions. However, weighing food in this context can be complex, particularly when the items remaining on a plate (side dish, meat or fish and sauce) need to be weighed separately following consumption. A method based on photography that involves taking photographs after a meal to determine food intake consequently seems to be a good alternative. This method enables the storage of raw data so that unhurried analyses can be performed to distinguish the food items present in the images. Therefore, the aim of this paper was to validate a photographic method to measure food intake in terms of differentiating food item intake in the context of a geriatric institution. Sixty-six elderly residents took part in this study, which was performed in four French nursing homes. Four dishes of standardized portions were offered to the residents during 16 different lunchtimes. Three non-trained assessors then independently estimated both the total and specific food item intakes of the participants using images of their plates taken after the meal (photographic method) and a reference image of one plate taken before the meal. Total food intakes were also recorded by weighing the food. To test the reliability of the photographic method, agreements between different assessors and agreements among various estimates made by the same assessor were evaluated. To test the accuracy and specificity of this method, food intake estimates for the four dishes were compared with the food intakes determined using the weighed food method. To illustrate the added value of the photographic method, food consumption differences between the dishes were explained by investigating the intakes of specific food items. Although they were not specifically trained for this purpose, the results demonstrated that the assessor estimates agreed between assessors and among various estimates made by the same

  13. Optimal combining of ground-based sensors for the purpose of validating satellite-based rainfall estimates

    NASA Technical Reports Server (NTRS)

    Krajewski, Witold F.; Rexroth, David T.; Kiriaki, Kiriakie

    1991-01-01

    Two problems related to radar rainfall estimation are described. The first part is a description of a preliminary data analysis for the purpose of statistical estimation of rainfall from multiple (radar and raingage) sensors. Raingage, radar, and joint radar-raingage estimation is described, and some results are given. Statistical parameters of rainfall spatial dependence are calculated and discussed in the context of optimal estimation. Quality control of radar data is also described. The second part describes radar scattering by ellipsoidal raindrops. An analytical solution is derived for the Rayleigh scattering regime. Single and volume scattering are presented. Comparison calculations with the known results for spheres and oblate spheroids are shown.

  14. Convergent validity of a novel method for quantifying rowing training loads.

    PubMed

    Tran, Jacqueline; Rice, Anthony J; Main, Luana C; Gastin, Paul B

    2015-01-01

    Elite rowers complete rowing-specific and non-specific training, incorporating continuous and interval-like efforts spanning the intensity spectrum. However, established training load measures are unsuitable for use in some modes and intensities. Consequently, a new measure known as the T2minute method was created. The method quantifies load as the time spent in a range of training zones (time-in-zone), multiplied by intensity- and mode-specific weighting factors that scale the relative stress of different intensities and modes to the demands of on-water rowing. The purpose of this study was to examine the convergent validity of the T2minute method with Banister's training impulse (TRIMP), Lucia's TRIMP and Session-RPE when quantifying elite rowing training. Fourteen elite rowers (12 males, 2 females) were monitored during four weeks of routine training. Unadjusted T2minute loads (using coaches' estimates of time-in-zone) demonstrated moderate-to-strong correlations with Banister's TRIMP, Lucia's TRIMP and Session-RPE (rho: 0.58, 0.55 and 0.42, respectively). Adjusting T2minute loads by using actual time-in-zone data resulted in stronger correlations between the T2minute method and Banister's TRIMP and Lucia's TRIMP (rho: 0.85 and 0.81, respectively). The T2minute method is an appropriate in-field measure of elite rowing training loads, particularly when actual time-in-zone values are used to quantify load.
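
    A hypothetical sketch of the T2minute bookkeeping makes the weighting scheme concrete: load is the sum over training bouts of time-in-zone multiplied by intensity- and mode-specific weights scaled to on-water rowing. The zone names and weighting factors below are invented, not the published T2minute coefficients.

      # Invented weights for illustration; on-water rowing is the reference mode.
      INTENSITY_WEIGHT = {"T1": 1.0, "T2": 1.0, "T3": 1.5, "T4": 2.0, "T5": 3.0}
      MODE_WEIGHT = {"rowing": 1.0, "ergometer": 0.95, "cycling": 0.8, "weights": 0.6}

      def t2minute_load(session):
          """session: iterable of (zone, mode, minutes) tuples."""
          return sum(minutes * INTENSITY_WEIGHT[zone] * MODE_WEIGHT[mode]
                     for zone, mode, minutes in session)

      session = [("T1", "rowing", 40), ("T3", "rowing", 20), ("T2", "cycling", 30)]
      print(f"T2minute load: {t2minute_load(session):.0f} units")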

  16. An Automatic Method for Geometric Segmentation of Masonry Arch Bridges for Structural Engineering Purposes

    NASA Astrophysics Data System (ADS)

    Riveiro, B.; DeJong, M.; Conde, B.

    2016-06-01

    Despite the tremendous advantages of laser scanning technology for the geometric characterization of built constructions, there are important limitations preventing more widespread implementation in the structural engineering domain. Even though the technology provides extensive and accurate information to perform structural assessment and health monitoring, many people are resistant to the technology due to the processing times involved. Thus, new methods that can automatically process LiDAR data and subsequently provide an automatic and organized interpretation are required. This paper presents a new method for fully automated point cloud segmentation of masonry arch bridges. The method efficiently creates segmented, spatially related and organized point clouds, which each contain the relevant geometric data for a particular component (pier, arch, spandrel wall, etc.) of the structure. The segmentation procedure comprises a heuristic approach for the separation of the different vertical walls, after which image processing tools adapted to voxel structures allow the efficient segmentation of the main structural elements of the bridge. The proposed methodology provides the essential processed data required for structural assessment of masonry arch bridges based on geometric anomalies. The method is validated using a representative sample of masonry arch bridges in Spain.

  17. FIELD VALIDATION OF SEDIMENT TOXICITY IDENTIFICATION AND EVALUATION METHODS

    EPA Science Inventory

    Sediment Toxicity Identification and Evaluation (TIE) methods have been developed for both porewaters and whole sediments. These relatively simple laboratory methods are designed to identify specific toxicants or classes of toxicants in sediments; however, the question of whethe...

  18. Voltammetric determination of copper in selected pharmaceutical preparations--validation of the method.

    PubMed

    Lutka, Anna; Maruszewska, Małgorzata

    2011-01-01

    The conditions for the voltammetric determination of copper in pharmaceutical preparations were established and validated. The three selected preparations, Zincuprim (A), Wapń, cynk, miedź z wit. C (B) and Vigor complete (V), contained different salts and different quantities of copper(II) and an increasing number of accompanying ingredients. To transfer the copper into solution, the samples of powdered tablets of the first and second preparations underwent an extraction procedure, and those of the third a mineralization procedure. The concentration of copper in solution was determined by differential pulse voltammetry (DP) using comparison with a standard. In the validation process, the selectivity, accuracy, precision and linearity of the DP determination of copper in the three preparations were estimated. Copper was determined within the concentration range of 1-9 ppm (1-9 microg/mL): the mean recoveries approached 102% (A), 100% (B) and 102% (V); the relative standard deviations (RSD) of the determinations were 0.79-1.59% (A), 0.62-0.85% (B) and 1.68-2.28% (V), respectively. The mean recoveries and RSDs satisfied the requirements for an analyte concentration at the 1-10 ppm level. The statistical verification confirmed that the tested voltammetric method is suitable for the determination of copper in pharmaceutical preparations.

  19. Design, Development, Validation, and Use of Synthetic Nucleic Acid Controls for Diagnostic Purposes and Application to Cystic Fibrosis Testing

    PubMed Central

    Christensen, Todd M.; Jama, Mohamed; Ponek, Victor; Lyon, Elaine; Wilson, Jean Amos; Hoffmann, Marcy L.; Bejjani, Bassem A.

    2007-01-01

    We have designed, tested, and validated synthetic DNA molecules that may be used as reference standard controls in the simultaneous detection of mutations in one or more genes. These controls consist of a mixture of oligonucleotides (100 to 120 bases long) each designed for the detection of one or more disease-causing mutation(s), depending on the proximity of the mutations to one another. Each control molecule is identical to 80 to 100 bases that span the targeted mutations. In addition, each oligonucleotide is tagged at the 5′ and 3′ ends with distinct nucleic acid sequences that allow for the design of complementary primers for polymerase chain reaction amplification. We designed the tags to amplify control molecules comprising 32 CFTR mutations, including the American College of Medical Genetics minimum carrier screening panel of 23, with one pair of primers in a single tube. We tested the performance of these controls on many platforms including the Applied Biosystems/Celera oligonucleotide ligation assay and the Tm Bioscience Tag-It platforms. All 32 mutations were detected consistently. This simple methodology allows for maximum flexibility and rapid implementation. It has not escaped our notice that the design of these molecules makes possible the production of similar controls for virtually any mutation or sequence of interest. PMID:17591930

  20. VALIDATION OF A METHOD FOR ESTIMATING LONG-TERM EXPOSURES BASED ON SHORT-TERM MEASUREMENTS

    EPA Science Inventory

    A method for estimating long-term exposures from short-term measurements is validated using data from a recent EPA study of exposure to fine particles. The method was developed a decade ago but long-term exposure data to validate it did not exist until recently. In this paper, ...

  2. Sample size considerations of prediction-validation methods in high-dimensional data for survival outcomes.

    PubMed

    Pang, Herbert; Jung, Sin-Ho

    2013-04-01

    A variety of prediction methods are used to relate high-dimensional genome data with a clinical outcome using a prediction model. Once a prediction model is developed from a data set, it should be validated using a resampling method or an independent data set. Although the existing prediction methods have been intensively evaluated by many investigators, there has not been a comprehensive study investigating the performance of the validation methods, especially with a survival clinical outcome. Understanding the properties of the various validation methods can allow researchers to perform more powerful validations while controlling for type I error. In addition, a sample size calculation strategy based on these validation methods is lacking. We conduct extensive simulations to examine the statistical properties of these validation strategies. In both simulations and a real data example, we found that 10-fold cross-validation with permutation gave the best power while controlling type I error close to the nominal level. Based on this, we have also developed a sample size calculation method for designing a validation study with a user-chosen combination of prediction and validation methods. Microarray and genome-wide association study data are used as illustrations. The power calculation method in this presentation can be used for the design of any biomedical study involving high-dimensional data and survival outcomes.
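
    The favoured validation strategy lends itself to a compact sketch: k-fold cross-validation combined with a label-permutation null. For brevity the example substitutes a binary outcome and logistic model for the survival setting of the paper, and all data are simulated.

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(0)
      X = rng.normal(size=(100, 50))   # "high-dimensional" feature matrix
      y = (X[:, 0] + 0.5 * rng.normal(size=100) > 0).astype(int)

      model = LogisticRegression(max_iter=1000)
      observed = cross_val_score(model, X, y, cv=10).mean()

      # Permutation null: shuffle the outcome, repeat the identical CV procedure.
      null = np.array([cross_val_score(model, X, rng.permutation(y), cv=10).mean()
                       for _ in range(200)])
      p_value = (1 + np.sum(null >= observed)) / (1 + len(null))
      print(f"CV accuracy {observed:.3f}, permutation p = {p_value:.3f}")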

  3. Validation of two ribosomal RNA removal methods for microbial metatranscriptomics

    SciTech Connect

    He, Shaomei; Wurtzel, Omri; Singh, Kanwar; Froula, Jeff L; Yilmaz, Suzan; Tringe, Susannah G; Wang, Zhong; Chen, Feng; Lindquist, Erika A; Sorek, Rotem; Hugenholtz, Philip

    2010-10-01

    The predominance of rRNAs in the transcriptome is a major technical challenge in sequence-based analysis of cDNAs from microbial isolates and communities. Several approaches have been applied to deplete rRNAs from (meta)transcriptomes, but no systematic investigation of potential biases introduced by any of these approaches has been reported. Here we validated the effectiveness and fidelity of the two most commonly used approaches, subtractive hybridization and exonuclease digestion, as well as combinations of these treatments, on two synthetic five-microorganism metatranscriptomes using massively parallel sequencing. We found that the effectiveness of rRNA removal was a function of community composition and RNA integrity for these treatments. Subtractive hybridization alone introduced the least bias in relative transcript abundance, whereas exonuclease and in particular combined treatments greatly compromised mRNA abundance fidelity. Illumina sequencing itself also can compromise quantitative data analysis by introducing a G+C bias between runs.

  4. Fit for purpose? A case study: validation of immunological endpoint assays for the detection of cellular and humoral responses to anti-tumour DNA fusion vaccines.

    PubMed

    Mander, Ann; Chowdhury, Ferdousi; Low, Lindsey; Ottensmeier, Christian H

    2009-05-01

    Clinical trials are governed by an increasingly stringent regulatory framework, which applies to all levels of trial conduct. Study-critical immunological endpoints, which define success or failure in early phase clinical immunological trials, require formal pre-trial validation. In this case study, we describe the assay validation process, during which the sensitivity and precision of immunological endpoint assays were defined. The purpose was the evaluation of two multicentre phase I/II clinical trials from our unit in Southampton, UK, which assess the effects of DNA fusion vaccines on immune responses in HLA-A2+ patients with carcinoembryonic antigen (CEA)-expressing malignancies and prostate cancer. Validated immunomonitoring is being performed using ELISA and IFNgamma ELISPOTs to assess humoral and cellular responses to the vaccines over time. The validated primary endpoint assay, a peptide-specific CD8+ IFNgamma ELISPOT, was tested in a pre-trial study and found to be suitable for the detection of low-frequency naturally occurring CEA- and prostate-derived tumour-antigen-specific T cells in patients with CEA-expressing malignancies and prostate cancer.

  5. Data on the verification and validation of segmentation and registration methods for diffusion MRI.

    PubMed

    Esteban, Oscar; Zosso, Dominique; Daducci, Alessandro; Bach-Cuadra, Meritxell; Ledesma-Carbayo, María J; Thiran, Jean-Philippe; Santos, Andres

    2016-09-01

    The verification and validation of segmentation and registration methods is a necessary assessment in the development of new processing methods. However, verification and validation of diffusion MRI (dMRI) processing methods is challenging for the lack of gold-standard data. The data described here are related to the research article entitled "Surface-driven registration method for the structure-informed segmentation of diffusion MR images" [1], in which publicly available data are used to derive golden-standard reference-data to validate and evaluate segmentation and registration methods in dMRI. PMID:27508235

  7. STATISTICAL VALIDATION OF SULFATE QUANTIFICATION METHODS USED FOR ANALYSIS OF ACID MINE DRAINAGE

    EPA Science Inventory

    Turbidimetric method (TM), ion chromatography (IC) and inductively coupled plasma atomic emission spectrometry (ICP-AES) with and without acid digestion have been compared and validated for the determination of sulfate in mining wastewater. Analytical methods were chosen to compa...

  8. Lead isotope ratios for bullets, a descriptive approach for investigative purposes and a new method for sampling of bullet lead.

    PubMed

    Sjåstad, Knut-Endre; Simonsen, Siri Lene; Andersen, Tom H

    2014-11-01

    To establish a link between a bullet and a suspected firearm, the investigation of striation marks is one of the cornerstones of the forensic laboratory. Nevertheless, on some occasions the bullet may be deformed to such an extent that traditional investigation of striation marks is impossible. Fragments of lead can then be subjected to lead isotope ratio determination in order to distinguish between bullets of different origin. This approach initially seems reasonable, since the abundance of the lead isotopes varies significantly in nature. To make a method valid for forensic purposes, it is important to have a fundamental understanding of the variation within a box of lead bullets and the expected variation between boxes. Studies of variability within and between boxes of ammunition are imperative for any type of forensic interpretation, in both an investigative and an evaluative context. This work presents an extensive study of variability within and between boxes of ammunition by use of multi-collector inductively coupled plasma mass spectrometry. As a first approximation to classify bullets to any given source, a simple and robust graphical method is presented. In addition, an easy-to-use sampling procedure for bullet lead is presented.
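
    As a loose illustration of the descriptive, investigative-level comparison, the sketch below screens a questioned fragment against the within-box spread of reference bullets; the isotope-ratio values and the 3-sigma cut-off are invented, and in casework the comparison would also be presented graphically.

      import numpy as np

      box = np.array([                   # columns: 207Pb/206Pb, 208Pb/206Pb
          [0.8521, 2.0861], [0.8523, 2.0865], [0.8519, 2.0858],
          [0.8522, 2.0863], [0.8520, 2.0860],
      ])
      questioned = np.array([0.8555, 2.0921])

      mean, sd = box.mean(axis=0), box.std(axis=0, ddof=1)
      z = (questioned - mean) / sd
      consistent = bool(np.all(np.abs(z) < 3))  # crude screen, not an evaluative LR
      print(f"z-scores: {z.round(1)}, consistent with box: {consistent}")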

  9. Testing and Validation of the Dynamic Inertia Measurement Method

    NASA Technical Reports Server (NTRS)

    Chin, Alexander; Herrera, Claudia; Spivey, Natalie; Fladung, William; Cloutier, David

    2015-01-01

    This presentation describes the DIM method and how it measures the inertia properties of an object by analyzing the frequency response functions measured during a ground vibration test (GVT). The DIM method has been in development at the University of Cincinnati and has shown success on a variety of small scale test articles. The NASA AFRC version was modified for larger applications.

  10. Validation of a Generic qHNMR Method for Natural Products Analysis

    PubMed Central

    Gödecke, Tanja; Napolitano, José G.; Rodríguez-Brasco, María F.; Chen, Shao-Nong; Jaki, Birgit U.; Lankin, David C.; Pauli, Guido F.

    2014-01-01

    Introduction: Nuclear magnetic resonance (NMR) spectroscopy is increasingly employed in the quantitative analysis and quality control (QC) of natural products (NPs) including botanical dietary supplements (BDSs). The establishment of qHNMR based QC protocols requires method validation. Objective: Develop and validate a generic qHNMR method. Optimize acquisition and processing parameters, with specific attention to the requirements for the analysis of complex NP samples, including botanicals and purity assessment of NP isolates. Methodology: In order to establish the validated qHNMR method, samples containing two highly pure reference materials were used. The influence of acquisition and processing parameters on the method validation was examined, and general aspects of method validation of qHNMR methods discussed. Subsequently, the established method was applied to the analysis of two natural products samples: a purified reference compound and a crude mixture. Results: The accuracy and precision of qHNMR using internal or external calibration were compared, using a validated method suitable for complex samples. The impact of post-acquisition processing on method validation was examined using three software packages: TopSpin, MNova, and NUTS. The dynamic range of the developed qHNMR method was 5,000:1 with a limit of detection (LOD) of better than 10 μM. The limit of quantification (LOQ) depends on the desired level of accuracy and experiment time spent. Conclusions: This study revealed that acquisition parameters, processing parameters, and processing software all contribute to qHNMR method validation. A validated method with high dynamic range and general workflow for qHNMR analysis of NPs is proposed. PMID:23740625

  11. Flight critical system design guidelines and validation methods

    NASA Technical Reports Server (NTRS)

    Holt, H. M.; Lupton, A. O.; Holden, D. G.

    1984-01-01

    Efforts being expended at NASA-Langley to define a validation methodology, techniques for comparing advanced systems concepts, and design guidelines for characterizing fault tolerant digital avionics are described with an emphasis on the capabilities of AIRLAB, an environmentally controlled laboratory. AIRLAB has VAX 11/750 and 11/780 computers with an aggregate of 22 Mb memory and over 650 Mb storage, interconnected at 256 kbaud. An additional computer is programmed to emulate digital devices. Ongoing work is easily accessed at user stations by either chronological or key word indexing. The CARE III program aids in analyzing the capabilities of test systems to recover from faults. An additional code, the semi-Markov unreliability program (SURE), generates upper and lower reliability bounds. The AIRLAB facility is mainly dedicated to research on designs of digital flight-critical systems which must have acceptable reliability before incorporation into aircraft control systems. The digital systems would be too costly to submit to a full battery of flight tests and must be initially examined with the AIRLAB simulation capabilities.

  12. PEM fuel cell fault detection and identification using differential method: simulation and experimental validation

    NASA Astrophysics Data System (ADS)

    Frappé, E.; de Bernardinis, A.; Bethoux, O.; Candusso, D.; Harel, F.; Marchand, C.; Coquery, G.

    2011-05-01

    PEM fuel cell performance and lifetime strongly depend on the polymer membrane and MEA hydration. As the internal moisture is very sensitive to the operating conditions (temperature, stoichiometry, load current, water management…), keeping the optimal working point is complex and requires real-time monitoring. This article focuses on PEM fuel cell stack health diagnosis and, more precisely, on stack fault detection monitoring. The paper defines new, simple and effective methods to obtain relevant information on the usual faults or malfunctions occurring in a fuel cell stack. For this purpose, the authors present a fault detection method using a simple and non-intrusive on-line technique based on the space signature of the cell voltages. The objective is to minimize the number of embedded sensors and instrumentation in order to obtain a precise, reliable and economic solution for mass-market applications. A very low number of sensors is indeed needed for this monitoring, and the associated algorithm can be implemented on-line. The technique is validated on a 20-cell PEMFC stack and proves particularly efficient in the flooding case. In effect, it uses the stack itself as a sensor, which enables quick feedback on its state of health.
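
    The general idea of a cell-voltage space signature can be illustrated in a few lines: normalize each cell voltage against the stack mean and inspect the shape of the resulting deviation vector. The threshold rule and the numbers below are illustrative assumptions, not the authors' algorithm.

      import numpy as np

      def cell_voltage_signature(v_cells):
          """Normalized deviation of each cell from the stack mean; the shape of
          this vector is the 'space signature' a diagnosis can inspect."""
          v = np.asarray(v_cells, dtype=float)
          return (v - v.mean()) / v.mean()

      def flag_fault(v_cells, threshold=0.05):
          sig = cell_voltage_signature(v_cells)
          return np.where(np.abs(sig) > threshold)[0]  # indices of abnormal cells

      # 20-cell stack with one cell drifting low (e.g., local flooding)
      v = [0.68] * 20
      v[7] = 0.55
      print(flag_fault(v))  # -> [7]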

  13. Validation of a spectrophotometric method for quantification of xanthone in biodegradable nanoparticles.

    PubMed

    Teixeira, M; Pinto, M M M; Barbosa, C M

    2004-04-01

    Xanthone has been incorporated for the first time in nanoparticles of poly(D,L-lactide-co-glycolide) (PLGA). For this purpose, the estimation of the xanthone content in the nanoparticles is a crucial tool for guaranteeing the reliability of the results. Thus, a simple spectrophotometric method was validated according to USP25 and ICH guidelines for its specificity, linearity, accuracy and precision. The method was found to be specific for xanthone in the presence of nanoparticle excipients. The calibration curve was linear over the concentration range of 0.5 to 4.0 microg/mL (r > 0.999). Recovery of xanthone from nanoparticles ranged from 86.5 to 95.9%. Repeatability (intra-assay precision) and intermediate precision were found to be acceptable, with relative standard deviation (RSD) values ranging from 0.3 to 3.0% and from 1.4 to 3.1%, respectively. The method was found to be suitable for the evaluation of xanthone content in nanoparticles of PLGA.

  14. Thermogravimetric desorption and de novo tests I: method development and validation.

    PubMed

    Tsytsik, Palina; Czech, Jan; Carleer, Robert; Reggers, Guy; Buekens, Alfons

    2008-08-01

    Thermogravimetric analysis (TGA) has been combined with evolved gas analysis (EGA) with the purpose of simulating the thermal behaviour of filter dust samples under inert (desorption) and de novo test oxidising conditions. Emphasis is on studying de novo formation of dioxins, surrogates and precursors arising from filter dust derived from thermal processes, such as municipal solid waste incineration and metallurgy. A new method is tested for sampling and analysing dioxin surrogates and precursors in the TGA effluent, which are collected on sampling tubes; the adsorbed compounds are subsequently desorbed and quantified by TD-GC-MS. The major sources of error and losses are considered, including potential sorbent artefacts, possible breakthrough of volatiles through the sampling tubes, and losses of semi-volatiles due to incomplete desorption or re-condensation inside the TG analyser. The method is optimised and validated for di- to hexa-chlorinated benzenes in a range of 10-1000 ppb, with average recovery exceeding 85%. The results are comparable with data obtained in similar studies performed by other research groups. The method thus provides a means of simulating de novo synthesis of dioxins in fly ash and facilitates reliable and easy estimation of de novo activity, while allowing wide flexibility in testing conditions.

  15. An evaluation of alternate production methods for Pu-238 general purpose heat source pellets

    SciTech Connect

    Mark Borland; Steve Frank

    2009-06-01

    For the past half century, the National Aeronautics and Space Administration (NASA) has used Radioisotope Thermoelectric Generators (RTG) to power deep space satellites. Fabricating heat sources for RTGs, specifically General Purpose Heat Sources (GPHSs), has remained essentially unchanged since their development in the 1970s. Meanwhile, 30 years of technological advancements have been made in the applicable fields of chemistry, manufacturing and control systems. This paper evaluates alternative processes that could be used to produce Pu-238-fueled heat sources. Specifically, this paper discusses the production of the plutonium oxide granules, which are the input stream to the ceramic pressing and sintering processes. Alternate chemical processes are compared to current methods to determine whether alternative fabrication processes could reduce the hazards, especially the production of respirable fines, while producing an equivalent GPHS product.

  16. Validation of doubly labeled water method using a ruminant

    SciTech Connect

    Fancy, S.G.; Blanchard, J.M.; Holleman, D.F.; Kokjer, K.J.; White, R.G.

    1986-07-01

    CO2 production (CDP, ml CO2/g/h) by captive caribou and reindeer (Rangifer tarandus) was measured using the doubly labeled water method ((3)H2O and H2(18)O) and compared with CO2 expiration rates (VCO2), adjusted for CO2 losses in CH4 and urine, as determined by open-circuit respirometry. CDP calculated from samples of blood or urine from a reindeer in winter was 1-3% higher than the adjusted VCO2. Differences between values derived by the two methods of 5-20% were found in summer trials with caribou. None of these differences were statistically significant (P greater than 0.05). Differences in summer could in part be explained by the net deposition of (3)H, (18)O, and unlabeled CO2 in antlers and other growing tissues. Total body water volumes calculated from (3)H2O dilution were up to 15% higher than those calculated from H2(18)O dilution. The doubly labeled water method appears to be a reasonably accurate method for measuring CDP by caribou and reindeer in winter when growth rates are low, but the method may overestimate CDP by rapidly growing and/or fattening animals.
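
    In its simplest form, the doubly labeled water calculation rests on the Lifson-McClintock relation: CO2 production is proportional to the body water pool times the difference between the oxygen and hydrogen isotope turnover rates. A minimal sketch, ignoring isotopic fractionation corrections; all numbers are illustrative, not data from the study.

      import numpy as np

      def cdp_dlw(n_body_water_mol, kO_per_h, kH_per_h):
          """CO2 production (mol/h) from the simplified Lifson-McClintock
          relation rCO2 = (N/2) * (kO - kH), ignoring fractionation."""
          return 0.5 * n_body_water_mol * (kO_per_h - kH_per_h)

      # Turnover rates fitted from the log-linear decline of isotope enrichment
      t = np.array([0.0, 24.0, 48.0, 96.0])        # h after dosing
      O18 = np.array([1.00, 0.86, 0.74, 0.55])     # normalized excess 18O
      H3 = np.array([1.00, 0.90, 0.81, 0.66])      # normalized excess 3H
      kO = -np.polyfit(t, np.log(O18), 1)[0]
      kH = -np.polyfit(t, np.log(H3), 1)[0]
      print(cdp_dlw(3300.0, kO, kH))               # mol CO2 per h (illustrative)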

  17. Validated spectrofluorimetric method for determination of selected aminoglycosides

    NASA Astrophysics Data System (ADS)

    Omar, Mahmoud A.; Ahmed, Hytham M.; Hammad, Mohamed A.; Derayea, Sayed M.

    2015-01-01

    A new, sensitive, and selective spectrofluorimetric method was developed for the determination of three aminoglycoside drugs in different dosage forms, namely neomycin sulfate (NEO), tobramycin (TOB) and kanamycin sulfate (KAN). The method is based on the Hantzsch condensation reaction between the primary amino group of the aminoglycosides and acetylacetone and formaldehyde at pH 2.7, yielding highly fluorescent yellow derivatives measured at emission and excitation wavelengths of 471 nm and 410 nm, respectively. The fluorescence intensity was directly proportional to the concentration over the ranges 10-60, 40-100 and 5-50 ng/mL for NEO, TOB and KAN, respectively. The proposed method was applied successfully to the determination of these drugs in their pharmaceutical dosage forms.

  18. Validation of ESR analyzer using Westergren ESR method.

    PubMed

    Sikka, Meera; Tandon, Rajesh; Rusia, Usha; Madan, Nishi

    2007-07-01

    Erythrocyte sedimentation rate (ESR) is one of the most frequently ordered laboratory tests. ESR analyzers were developed to provide a quick and efficient measure of ESR. We compared the results of ESR obtained by an ESR analyzer with those by the Westergren method in a group of 75 patients. Linear regression analysis showed a good correlation between the two results (r = 0.818, p < 0.01). The intraclass correlation was 0.82. The analyzer method had the advantages of safety, decreased technician time and improved patient care by providing quick results.
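
    A method comparison of this kind reduces to fitting paired measurements. A minimal sketch with invented numbers, not the study's data:

      from scipy.stats import linregress

      # Paired ESR results (mm/h) by the two methods (illustrative values)
      westergren = [5, 12, 20, 35, 50, 68, 80, 95]
      analyzer = [7, 10, 24, 31, 54, 60, 85, 90]

      fit = linregress(westergren, analyzer)
      print(f"r = {fit.rvalue:.3f}, slope = {fit.slope:.2f}, "
            f"intercept = {fit.intercept:.1f}")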

  19. Validity of the Kusnetz method for measuring radon progeny concentration

    SciTech Connect

    Duport, P.

    1998-12-31

    The standard Kusnetz and Rolle methods currently used to measure the concentration of potential alpha energy due to the presence of radon progeny were designed at a time when ventilation conditions were very different from those in current mines. This report reviews those methods and evaluates whether they are still reliable when the residence time of air underground is very short compared to the half-lives of the short-lived radon progeny, in particular under the new ventilation conditions that exist in Saskatchewan uranium mines. The uncertainty in measurements of potential alpha energy concentration is evaluated using Monte Carlo simulation.
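
    For orientation, the Kusnetz procedure reduces to a single division: a gross alpha count taken 40-90 min after sampling is converted to working levels through a time-dependent factor. A hedged sketch; the argument names and the k_factor value are placeholders, with real values coming from the published Kusnetz table and the instrument calibration.

      def kusnetz_working_level(net_cpm, efficiency, litres_sampled, k_factor):
          """Potential alpha energy concentration (working levels) from a
          delayed gross alpha count; k_factor depends on the delay between
          sampling and counting (placeholder value used below)."""
          return net_cpm / (efficiency * litres_sampled * k_factor)

      # Illustrative: 10 L/min for 5 min, counted some time after sampling
      print(kusnetz_working_level(net_cpm=450.0, efficiency=0.35,
                                  litres_sampled=50.0, k_factor=100.0))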

  20. Obtaining Valid Response Rates: Considerations beyond the Tailored Design Method.

    ERIC Educational Resources Information Center

    Huang, Judy Y.; Hubbard, Susan M.; Mulvey, Kevin P.

    2003-01-01

    Reports on the use of the tailored design method (TDM) to achieve high survey response in two separate studies of the dissemination of Treatment Improvement Protocols (TIPs). Findings from these two studies identify six factors that may have influenced nonresponse and show that use of the TDM does not, in itself, guarantee a high response rate. (SLD)

  1. Maladjustment of Bully-Victims: Validation with Three Identification Methods

    ERIC Educational Resources Information Center

    Yang, An; Li, Xiang; Salmivalli, Christina

    2016-01-01

    Although knowledge on the psychosocial (mal)adjustment of bully-victims, children who bully others and are victimised by others, has been increasing, the findings have been principally gained utilising a single method to identify bully-victims. The present study examined the psychosocial adjustment of bully-victims (as compared with pure bullies…

  2. Methods of Validating Learning Hierarchies with Applications to Mathematics Learning.

    ERIC Educational Resources Information Center

    Ekstrand, Judith M.

    The relationship between mathematics tests and the theoretical learning process was explored using alternative statistical methods and models. Data for over 1300 students in grade 5 using the mathematics subscales from the National Longitudinal Study of Mathematical Abilities (NLSMA) were analyzed. Results indicated that Bloom's taxonomy is weakly…

  3. A Permutation Method to Assess Heterogeneity in External Validation for Risk Prediction Models

    PubMed Central

    Wang, Ling-Yi; Lee, Wen-Chung

    2015-01-01

    The value of a developed prediction model depends on its performance outside the development sample. The key is therefore to externally validate the model on different but related independent data. In this study, we propose a permutation method to assess heterogeneity in the external validation of risk prediction models. The permutation p value measures the extent of homology between development and validation datasets. If p < 0.05, the model may not be directly transported to the external validation population without further revision or updating. Monte-Carlo simulations are conducted to evaluate the statistical properties of the proposed method, and two microarray breast cancer datasets are analyzed for demonstration. The permutation method is easy to implement and is recommended for routine use in external validation for risk prediction models. PMID:25606854
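
    The idea can be sketched as follows: compare the observed performance drop between development and validation data with the drops obtained when the pooled data are randomly re-split. This is a schematic illustration of a permutation p value, not the authors' exact algorithm; the logistic model and AUC as the performance measure are assumptions.

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score

      def perm_p_value(X_dev, y_dev, X_val, y_val, n_perm=500, seed=0):
          """Permutation check of development/validation homology."""
          rng = np.random.default_rng(seed)
          m = LogisticRegression(max_iter=1000).fit(X_dev, y_dev)
          obs_gap = (roc_auc_score(y_dev, m.predict_proba(X_dev)[:, 1])
                     - roc_auc_score(y_val, m.predict_proba(X_val)[:, 1]))
          X = np.vstack([X_dev, X_val])
          y = np.concatenate([y_dev, y_val])
          n_dev, gaps = len(y_dev), []
          for _ in range(n_perm):
              idx = rng.permutation(len(y))
              d, v = idx[:n_dev], idx[n_dev:]
              mp = LogisticRegression(max_iter=1000).fit(X[d], y[d])
              gaps.append(roc_auc_score(y[d], mp.predict_proba(X[d])[:, 1])
                          - roc_auc_score(y[v], mp.predict_proba(X[v])[:, 1]))
          return (np.sum(np.asarray(gaps) >= obs_gap) + 1) / (n_perm + 1)

      rng = np.random.default_rng(1)
      X1, X2 = rng.standard_normal((120, 5)), rng.standard_normal((80, 5))
      y1 = (X1[:, 0] + rng.standard_normal(120) > 0).astype(int)
      y2 = (X2[:, 0] + rng.standard_normal(80) > 0).astype(int)
      print(perm_p_value(X1, y1, X2, y2))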

  4. A Validation of Elements, Methods, and Barriers to Inclusive High School Service-Learning Programs

    ERIC Educational Resources Information Center

    Dymond, Stacy K.; Chun, Eul Jung; Kim, Rah Kyung; Renzaglia, Adelle

    2013-01-01

    A statewide survey of coordinators of inclusive high school service-learning programs was conducted to validate elements, methods, and barriers to including students with and without disabilities in service-learning. Surveys were mailed to 655 service-learning coordinators; 190 (29%) returned a completed survey. Findings support the validity of…

  5. Double Cross-Validation in Multiple Regression: A Method of Estimating the Stability of Results.

    ERIC Educational Resources Information Center

    Rowell, R. Kevin

    In multiple regression analysis, where resulting predictive equation effectiveness is subject to shrinkage, it is especially important to evaluate result replicability. Double cross-validation is an empirical method by which an estimate of invariance or stability can be obtained from research data. A procedure for double cross-validation is…
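
    The procedure itself is short: split the sample into halves, derive a prediction equation on each half, apply it to the opposite half, and compare the cross-validated correlations with the in-sample multiple R to estimate shrinkage. A minimal sketch assuming ordinary least squares:

      import numpy as np
      from sklearn.linear_model import LinearRegression

      def double_cross_validate(X, y, seed=0):
          """Fit on each half, predict the other; return both cross-validated rs."""
          rng = np.random.default_rng(seed)
          idx = rng.permutation(len(y))
          a, b = idx[:len(y) // 2], idx[len(y) // 2:]
          r_ab = np.corrcoef(LinearRegression().fit(X[a], y[a]).predict(X[b]), y[b])[0, 1]
          r_ba = np.corrcoef(LinearRegression().fit(X[b], y[b]).predict(X[a]), y[a])[0, 1]
          return r_ab, r_ba

      rng = np.random.default_rng(2)
      X = rng.standard_normal((100, 3))
      y = X @ np.array([1.0, 0.5, -0.3]) + rng.standard_normal(100)
      print(double_cross_validate(X, y))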

  6. A Framework for Mixing Methods in Quantitative Measurement Development, Validation, and Revision: A Case Study

    ERIC Educational Resources Information Center

    Luyt, Russell

    2012-01-01

    A framework for quantitative measurement development, validation, and revision that incorporates both qualitative and quantitative methods is introduced. It extends and adapts Adcock and Collier's work, and thus, facilitates understanding of quantitative measurement development, validation, and revision as an integrated and cyclical set of…

  7. Establishing Survey Validity and Reliability for American Indians Through “Think Aloud” and Test–Retest Methods

    PubMed Central

    Hauge, Cindy Horst; Jacobs-Knight, Jacque; Jensen, Jamie L.; Burgess, Katherine M.; Puumala, Susan E.; Wilton, Georgiana; Hanson, Jessica D.

    2015-01-01

    The purpose of this study was to use a mixed-methods approach to determine the validity and reliability of measurements used within an alcohol-exposed pregnancy prevention program for American Indian women. To establish validity, content experts provided input into the survey measures, and a “think aloud” methodology was conducted with 23 American Indian women. After revising the measurements based on this input, a test–retest was conducted with 79 American Indian women who were randomized to complete either the original measurements or the new, modified measurements. The test–retest revealed that some of the questions performed better in the modified version, whereas others appeared to be more reliable in the original version. The mixed-methods approach was a useful methodology for gathering feedback on survey measurements from American Indian participants and for indicating specific survey questions that needed to be modified for this population. PMID:25888693

  8. Production of general purpose heat source (GPHS) using advanced manufacturing methods

    NASA Astrophysics Data System (ADS)

    Miller, Roger G.

    1996-03-01

    Mankind will continue to explore the stars through the use of unmanned spacecraft until the technology and costs are compatible with sending travelers to the outer planets of our solar system and beyond. Unmanned probes of the present and future will be necessary to develop the necessary technologies and obtain the information that will make this travel possible. Because of the significant costs incurred, modern manufacturing technologies must be used to lower the investment needed, even when shared by international partnerships. For over 30 years, radioisotopes have provided the heat from which electrical power is extracted. Electric power for future spacecraft will be provided by either Radioisotope Thermoelectric Generators (RTG), Radioisotopic Thermophotovoltaic systems (RTPV), radioisotope Stirling systems, or a combination of these. All of these systems will be thermally driven by General Purpose Heat Source (GPHS) fueled clads in some configuration. The GPHS clad contains a 238PuO2 pellet encapsulated in an iridium alloy container. Historically, the fabrication of the iridium alloy shells has been performed at EG&G Mound and Oak Ridge National Laboratory (ORNL), and girth welding at Westinghouse Savannah River Corporation (WSRC) and Los Alamos National Laboratory (LANL). This paper describes the use of laser processing for welding, drilling, cutting, and machining, together with other manufacturing methods, to reduce the costs of producing GPHS fueled clad components and completed assemblies. Incorporation of new quality technologies will complement these manufacturing methods to reduce cost.

  9. Modified method to enhanced recovery of Toxocara cati larvae for the purposes of diagnostic and therapeutic.

    PubMed

    Zibaei, Mohammad; Uga, Shoji

    2016-10-01

    Human toxocariasis, the extraintestinal migration of Toxocara species, is a worldwide helminthic zoonosis, particularly in developing countries. Toxocara cati is one of the common helminths of cats, and the disease it causes is potentially preventable. Diagnosis and treatment depend on the demonstration of specific antibodies against the excretory-secretory antigens of Toxocara larvae by immunological assays. This study provides a simple manual technique, which can be performed in any laboratory, for recovering large numbers of Toxocara cati larvae from thick-shelled eggs. The required equipment comprises a manual homogenizer and a filter membrane of 40 μm mesh; the remaining materials and solutions are standard laboratory ware. With the modified method, larval yields were 2.7 times higher (3,000 larvae/ml) and the time spent performing the method was shorter (75 min). Further benefits over existing techniques are the inexpensive and convenient materials, ease and repeatability, and the shorter time required to recover Toxocara cati larvae for subsequent cultivation and harvest of the larval excretory-secretory antigens for diagnostic or therapeutic purposes. PMID:27502936

  10. Validation of a blood group genotyping method based on high-resolution melting curve analysis.

    PubMed

    Gong, Tianxiang; Hong, Ying; Wang, Naihong; Fu, Xuemei; Zhou, Changhua

    2014-01-01

    The detection of polymorphisms is the basis of blood group genotyping and phenotype prediction. Genotyping may be useful for determining blood groups when serologic results are unclear, and the development and application of different methods for blood group genotyping may be needed as a supplement to serologic typing. The purpose of this study is to establish an approach to blood group genotyping based on a melting curve analysis of real-time polymerase chain reaction (PCR). Using DNA extracted from whole blood, we developed and validated a DNA typing method for detecting the DO*01/DO*02, DI*01/DI*02, LU*01/LU*02, and GYPB*03/GYPB*04 alleles using a melting curve analysis. All assays were confirmed with a commercial reagent containing sequence-specific primers (PCR-SSP), and a subset of the samples was confirmed with sequencing. Results for all blood groups were within the range of specificity and assay variability. Genotypes of 300 blood donors were fully consistent with PCR-SSP data. The obtained genotype distribution is in complete concordance with existing data for the Chinese population. This approach to blood group genotyping has several advantages: lower contamination rates with PCR products in the laboratory, ease of performance, automation potential, and rapid cycling time.
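
    Genotype calling from a melting curve analysis ultimately reduces to matching observed melt-peak temperatures against the expected Tm of each allele-specific product. A schematic sketch; the Tm values and tolerance are invented placeholders, not assay parameters.

      def call_genotype(melt_peaks_c, tm_allele1, tm_allele2, tol=0.5):
          """Assign a genotype from observed melt-peak temperatures (deg C)."""
          has1 = any(abs(p - tm_allele1) <= tol for p in melt_peaks_c)
          has2 = any(abs(p - tm_allele2) <= tol for p in melt_peaks_c)
          if has1 and has2:
              return "heterozygous"
          if has1:
              return "allele 1 homozygous"
          if has2:
              return "allele 2 homozygous"
          return "no call"

      print(call_genotype([63.8, 59.1], tm_allele1=64.0, tm_allele2=59.0))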

  11. System identification methods for aircraft flight control development and validation

    NASA Technical Reports Server (NTRS)

    Tischler, Mark B.

    1995-01-01

    System-identification methods compose a mathematical model, or series of models, from measurements of inputs and outputs of dynamic systems. The extracted models allow the characterization of the response of the overall aircraft or of component subsystem behavior, such as actuators and on-board signal processing algorithms. This paper discusses the use of frequency-domain system-identification methods for the development and integration of aircraft flight-control systems. The extraction and analysis of models of varying complexity, from nonparametric frequency responses to transfer functions and high-order state-space representations, is illustrated using the Comprehensive Identification from FrEquency Responses (CIFER) system-identification facility. Results are presented for test data of numerous flight and simulation programs at the Ames Research Center, including rotorcraft, fixed-wing aircraft, advanced short takeoff and vertical landing (ASTOVL), vertical/short takeoff and landing (V/STOL), tiltrotor aircraft, and rotor experiments in the wind tunnel. Excellent system characterization and dynamic response prediction are achieved for this wide class of systems. Examples illustrate the role of system-identification technology in providing an integrated flow of dynamic response data around the entire life cycle of aircraft development, from initial specifications through simulation and bench testing and into flight-test optimization.
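
    At the nonparametric end of that model hierarchy sits the frequency response itself, which can be estimated from input/output spectra. A minimal sketch of an H1 frequency-response estimate on synthetic data; the second-order filter standing in for the vehicle dynamics is an illustrative assumption, and CIFER's actual algorithms are considerably more sophisticated.

      import numpy as np
      from scipy import signal

      fs = 200.0
      t = np.arange(0, 60, 1 / fs)
      u = np.random.default_rng(0).standard_normal(t.size)  # excitation input
      b, a = signal.butter(2, 10, fs=fs)                    # stand-in dynamics
      y = signal.lfilter(b, a, u)                           # measured output

      f, Suy = signal.csd(u, y, fs=fs, nperseg=1024)        # cross spectrum
      _, Suu = signal.welch(u, fs=fs, nperseg=1024)         # input autospectrum
      H = Suy / Suu                                         # H1 estimate
      mag_db = 20 * np.log10(np.abs(H))
      phase_deg = np.degrees(np.angle(H))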

  12. Validation needs of seismic probabilistic risk assessment (PRA) methods applied to nuclear power plants

    SciTech Connect

    Kot, C.A.; Srinivasan, M.G.; Hsieh, B.J.

    1985-01-01

    An effort to validate seismic PRA methods is in progress. The work concentrates on the validation of plant response and fragility estimates through the use of test data and information from actual earthquake experience. Validation needs have been identified in the areas of soil-structure interaction, structural response and capacity, and equipment fragility. Of particular concern is the adequacy of linear methodology to predict nonlinear behavior. While many questions can be resolved through the judicious use of dynamic test data, other aspects can only be validated by means of input and response measurements during actual earthquakes. A number of past, ongoing, and planned testing programs which can provide useful validation data have been identified, and validation approaches for specific problems are being formulated.

  13. Validation of a new ELISA method for in vitro potency testing of hepatitis A vaccines.

    PubMed

    Morgeaux, S; Variot, P; Daas, A; Costanzo, A

    2013-01-01

    The goal of the project was to standardise a new in vitro method to replace the existing standard method for the determination of hepatitis A virus antigen content in hepatitis A vaccines (HAV) marketed in Europe. This became necessary due to issues with the previously used method, which required commercial test kits. The selected candidate method, not based on commercial kits, had already been used for many years by an Official Medicines Control Laboratory (OMCL) for routine testing and batch release of HAV. After a pre-qualification phase (Phase 1) that showed the suitability of the commercially available critical ELISA reagents for the determination of antigen content in marketed HAV present on the European market, an international collaborative study (Phase 2) was carried out in order to fully validate the method. Eleven laboratories took part in the collaborative study. They performed assays with the candidate standard method and, in parallel, for comparison purposes, with their own in-house validated methods where these were available. The study demonstrated that the new assay provides a more reliable and reproducible method than the existing standard method. A good correlation of the candidate standard method with the in vivo immunogenicity assay in mice was shown previously for both potent and sub-potent (stressed) vaccines. Thus, the new standard method validated during the collaborative study may be implemented readily by manufacturers and OMCLs for routine batch release, but also for in-process control or consistency testing. The new method was approved in October 2012 by Group of Experts 15 of the European Pharmacopoeia (Ph. Eur.) as the standard method for in vitro potency testing of HAV. The relevant texts will be revised accordingly. Critical reagents such as the coating reagent and detection antibodies have been adopted by the Ph. Eur. Commission and are available from the EDQM as Ph. Eur. Biological Reference Reagents (BRRs).

  14. Assessment and validation of a simple automated method for the detection of gait events and intervals.

    PubMed

    Ghoussayni, Salim; Stevens, Christopher; Durham, Sally; Ewins, David

    2004-12-01

    A simple and rapid automatic method for the detection of gait events at the foot could speed up, and possibly increase the repeatability of, gait analysis and evaluations of treatments for pathological gaits. The aim of this study was to compare and validate a kinematic-based algorithm used in the detection of four gait events: heel contact, heel rise, toe contact and toe off. Force platform data are often used to obtain the start and end of contact phases, but not usually heel rise and toe contact events. For this purpose, synchronised kinematic, kinetic and video data were captured from 12 healthy adult subjects walking both barefoot and shod at slow and normal self-selected speeds. The data were used to determine the gait events using three methods: force, visual inspection and the algorithm. Ninety percent of all timings given by the algorithm were within one frame (16.7 ms) of visual inspection. There were no statistically significant differences between the visual and algorithm timings. For both heel and toe contact, the differences between the three methods were within 1.5 frames, whereas for heel rise and toe off the differences between the force method on one side and the visual and algorithm methods on the other were larger and more varied (up to 175 ms). In addition, the algorithm method provided the durations of three intervals, heel contact to toe contact, toe contact to heel rise and heel rise to toe off, which are not readily available from force platform data. The ability to automatically and reliably detect the timings of these four gait events and three intervals using kinematic data alone is an asset to clinical gait analysis.
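
    As an illustration of a kinematic event detector of this general type, the sketch below flags heel contact from heel-marker height and vertical velocity. The rule and thresholds are assumptions for illustration, not the validated algorithm's published criteria.

      import numpy as np

      def detect_heel_contact(heel_z, fs, vel_thresh=-0.05, height_thresh=0.02):
          """Frames where the heel marker is low and its downward velocity
          (m/s) levels off; heel_z in metres, fs in Hz."""
          vz = np.gradient(heel_z) * fs
          events = [i for i in range(1, len(heel_z))
                    if heel_z[i] < height_thresh
                    and vz[i - 1] < vel_thresh <= vz[i]]
          return np.array(events)

      t = np.linspace(0, 2, 200)                      # 2 s at 100 Hz
      heel_z = 0.05 + 0.05 * np.cos(2 * np.pi * t)    # toy heel trajectory
      print(detect_heel_contact(heel_z, fs=100))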

  15. Experimental Validation for Hot Stamping Process by Using Taguchi Method

    NASA Astrophysics Data System (ADS)

    Fawzi Zamri, Mohd; Lim, Syh Kai; Razlan Yusoff, Ahmad

    2016-02-01

    The demand for reduced gas emissions, energy saving and safer vehicles has driven the development of Ultra High Strength Steel (UHSS) materials. To strengthen a UHSS material such as boron steel, it must undergo a hot stamping process, with heating at a certain temperature for a certain time. In this paper, the Taguchi method is applied to determine the appropriate parameters of thickness, heating temperature and heating time to achieve the optimum strength of boron steel. The experiment is conducted using a flat, square hot stamping tool with a tensile dog-bone blank. The tensile strength and hardness are then measured as responses. The results showed that lower thickness and higher heating temperature and heating time give higher strength and hardness in the final product. In conclusion, boron steel blanks are able to achieve up to 1200 MPa tensile strength and 650 HV hardness.
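
    In a Taguchi analysis of this kind, each parameter combination is scored with a signal-to-noise ratio; for responses such as tensile strength and hardness, the 'larger is better' form applies. A minimal sketch with illustrative replicate values:

      import numpy as np

      def sn_larger_is_better(responses):
          """Taguchi S/N ratio for 'larger is better': -10*log10(mean(1/y^2))."""
          y = np.asarray(responses, dtype=float)
          return -10.0 * np.log10(np.mean(1.0 / y**2))

      # Replicate tensile strengths (MPa) for one parameter combination
      print(sn_larger_is_better([1180.0, 1150.0, 1210.0]))  # ~61.4 dB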

  16. Content Validity of National Post Marriage Educational Program Using Mixed Methods

    PubMed Central

    MOHAJER RAHBARI, Masoumeh; SHARIATI, Mohammad; KERAMAT, Afsaneh; YUNESIAN, Masoud; ESLAMI, Mohammad; MOUSAVI, Seyed Abbas; MONTAZERI, Ali

    2015-01-01

    Background: Although content validation of programs is mostly conducted with qualitative methods, this study used both qualitative and quantitative methods to validate the content of a post-marriage training program provided for newly married couples. Content validity is a preliminary step in obtaining the authorization required to install the program in the country's health care system. Methods: This mixed-methods content validation study was carried out in four steps, with three expert panels. Altogether, 24 expert panelists were involved in the qualitative and quantitative panels: 6 in the first, item-development panel; 12 in the item-reduction panel, 4 of whom were common with the first panel; and 10 executive experts in the last panel, organized to evaluate the psychometric properties (CVR, CVI and face validity) of 57 educational objectives. Results: The raw data of the post-marriage program had been written by professional experts of the Ministry of Health; using a qualitative expert panel, the content was further developed by generating 3 topics and refining one topic and its respective content. In the second panel, six further objectives were deleted, three for falling below the agreement cut-off point and three by experts' consensus. The validity of all items was above 0.8, and their content validity indices (0.8–1) were completely appropriate in the quantitative assessment. Conclusion: This study provided good evidence for the validation and accreditation of the national post-marriage program planned for newly married couples in health centers of the country in the near future. PMID:26056672
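
    The quantitative side of such a validation typically rests on two simple ratios: Lawshe's content validity ratio (CVR) and the content validity index (CVI). A minimal sketch; the panel sizes and ratings are illustrative, not the study's data.

      def cvr(n_essential, n_panelists):
          """Lawshe's content validity ratio: (ne - N/2) / (N/2)."""
          return (n_essential - n_panelists / 2) / (n_panelists / 2)

      def cvi(item_ratings, relevant=(3, 4)):
          """Item-level CVI: share of experts rating the item 3 or 4
          on a 4-point relevance scale."""
          return sum(r in relevant for r in item_ratings) / len(item_ratings)

      print(cvr(n_essential=9, n_panelists=10))   # 0.8
      print(cvi([4, 3, 4, 4, 2, 4, 3, 4, 4, 4]))  # 0.9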

  17. Validity assessment of the detection method of maize event Bt10 through investigation of its molecular structure.

    PubMed

    Milcamps, Anne; Rabe, Scott; Cade, Rebecca; De Framond, Anic J; Henriksson, Peter; Kramer, Vance; Lisboa, Duarte; Pastor-Benito, Susana; Willits, Michael G; Lawrence, David; Van den Eede, Guy

    2009-04-22

    In March 2005, U.S. authorities informed the European Commission of the inadvertent release of unauthorized maize GM event Bt10 in their market and subsequently the grain channel. In the United States measures were taken to eliminate Bt10 from seed and grain supplies; in the European Union an embargo for maize gluten and brewer's grain import was implemented unless certified of Bt10 absence with a Bt10-specific PCR detection method. With the aim of assessing the validity of the Bt10 detection method, an in-depth analysis of the molecular organization of the genetic modification of this event was carried out by both the company Syngenta, who produced the event, and the European Commission Joint Research Centre, who validated the detection method. Using a variety of molecular analytical tools, both organizations found the genetic modification of event Bt10 to be very complex in structure, with rearrangements, inversions, and multiple copies of the structural elements (cry1Ab, pat, and the amp gene), interspersed with small genomic maize fragments. Southern blot analyses demonstrated that all Bt10 elements were found tightly linked on one large fragment, including the region that would generate the event-specific PCR amplicon of the Bt10 detection method. This study proposes a hypothetical map of the insert of event Bt10 and concludes that the validated detection method for event Bt10 is fit for its purpose. PMID:19368351

  18. Convergent Validity of Three Methods for Measuring Postoperative Complications

    PubMed Central

    Fritz, Bradley A.; Escallier, Krisztina E.; Abdallah, Arbi Ben; Oberhaus, Jordan; Becker, Jennifer; Geczi, Kristin; McKinnon, Sherry; Helsten, Dan L.; Sharma, Anshuman; Wildes, Troy S.; Avidan, Michael S.

    2016-01-01

    Background Anesthesiologists need tools to accurately track postoperative outcomes. The accuracy of patient report in identifying a wide variety of postoperative complications after diverse surgical procedures has not previously been investigated. Methods In this cohort study, 1,578 adult surgical patients completed a survey at least 30 days after their procedure asking if they had experienced any of 18 complications while in the hospital after surgery. Patient responses were compared to the results of an automated electronic chart review and (for a random subset of 750 patients) to a manual chart review. Results from automated chart review were also compared to those from manual chart review. Forty-two randomly selected patients were contacted by telephone to explore reasons for discrepancies between patient report and manual chart review. Results Comparisons between patient report, automated chart review, and manual chart review demonstrated poor-to-moderate positive agreement (range, 0 to 58%) and excellent negative agreement (range, 82 to 100%). Discordance between patient report and manual chart review was frequently explicable by patients reporting events that happened outside the time period of interest. Conclusions Patient report can provide information about subjective experiences or events that happen after hospital discharge, but often yields different results from chart review for specific in-hospital complications. Effective in-hospital communication with patients and thoughtful survey design may increase the quality of patient-reported complication data. PMID:27028469

  19. Application of neural networks and geomorphometry method for purposes of urban planning (Kazan, Russia)

    NASA Astrophysics Data System (ADS)

    Yermolaev, Oleg; Selivanov, Renat

    2013-04-01

    The landscape structure of a territory imposes serious limitations on the adoption of certain decisions. Differentiation of the relief into separate elementary geomorphological sections yields the basis for the most adequate determination of the boundaries of urban geosystems. This paper presents the results of testing relief classification methods based on Artificial Neural Networks (ANN), specifically Kohonen's Self-Organizing Maps (SOM), for the automated zoning of a modern city's territory, using the city of Kazan as an example. The developed model of the restored landscapes represents the city territory as a system of geomorphologically homogeneous terrains. The main research objectives were: development of a digital model of the relief of the city of Kazan; testing of relief classification methods based on ANN and expert estimations; creation of a SOM-based map of urban geosystems; verification of the classification results, with clarification and enlargement of landscape units; and determination of the applicability of the method for zoning large cities' territories, identifying its strengths and weaknesses. The first stage comprised analysis and digitization of a detailed large-scale topographic map of Kazan, from which a digital model of the relief with a grid size of 10 m was produced. These data were used to build analytical maps of morphometric characteristics of the relief: height, slope, aspect, and profile and plan curvature. The calculated morphometric values were transformed into a data matrix. The software uses unsupervised training algorithms, redistributing weight coefficients for each operational-territorial unit. After several iterations of this training process, the neural network gradually clusters groups of operational-territorial units with similar sets of morphometric parameters; 81 classes have been distinguished.
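
    The clustering step can be illustrated with a toy self-organizing map: each map unit holds a weight vector in the space of morphometric parameters, the best-matching unit is found for each sample, and that unit and its neighbours are pulled toward the sample. This is a minimal sketch on synthetic features, not the study's software; the 9x9 grid merely mirrors the 81 classes reported.

      import numpy as np

      def train_som(data, grid=(9, 9), iters=5000, lr0=0.5, sigma0=3.0, seed=0):
          """Minimal Kohonen SOM for standardized morphometric features."""
          rng = np.random.default_rng(seed)
          h, w = grid
          weights = rng.standard_normal((h, w, data.shape[1]))
          yy, xx = np.mgrid[0:h, 0:w]
          for step in range(iters):
              x = data[rng.integers(len(data))]
              dist = np.linalg.norm(weights - x, axis=2)
              bi, bj = np.unravel_index(dist.argmin(), dist.shape)  # best unit
              frac = step / iters
              lr = lr0 * (1.0 - frac)
              sigma = sigma0 * (1.0 - frac) + 0.5
              nb = np.exp(-((yy - bi) ** 2 + (xx - bj) ** 2) / (2 * sigma ** 2))
              weights += lr * nb[..., None] * (x - weights)
          return weights

      # Five features per cell: height, slope, aspect, profile/plan curvature
      features = np.random.default_rng(1).standard_normal((2000, 5))
      som = train_som(features)  # 9 x 9 map -> up to 81 terrain classes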

  1. Field validation of the dnph method for aldehydes and ketones. Final report

    SciTech Connect

    Workman, G.S.; Steger, J.L.

    1996-04-01

    A stationary source emission test method for selected aldehydes and ketones has been validated. The method employs a sampling train with impingers containing 2,4-dinitrophenylhydrazine (DNPH) to derivatize the analytes. The resulting hydrazones are recovered and analyzed by high performance liquid chromatography. Nine analytes were studied; the method was validated for formaldehyde, acetaldehyde, propionaldehyde, acetophenone and isophorone. Acrolein, methyl ethyl ketone, methyl isobutyl ketone, and quinone did not meet the validation criteria. The study employed the validation techniques described in EPA Method 301, which uses train spiking to determine bias and collocated sampling trains to determine precision. The studies were carried out at a plywood veneer dryer and a polyester manufacturing plant.

  2. Development and validity of a method for the evaluation of printed education material

    PubMed Central

    Castro, Mauro Silveira; Pilger, Diogo; Fuchs, Flávio Danni; Ferreira, Maria Beatriz Cardoso

    Objectives: To develop and study the validity of an instrument for the evaluation of Printed Education Materials (PEM); to evaluate the use of acceptability indices; and to identify possible influences of professional background. Methods: An instrument for PEM evaluation was developed in three steps: domain identification, item generation and instrument design. An easy-to-read PEM was developed for the education of patients with systemic hypertension and its treatment with hydrochlorothiazide. Construct validity was measured based on previously established errors purposely introduced into the PEM, which served as extreme groups. An acceptability index was applied, taking into account the rate of professionals who should approve each item. Participants were 10 physicians (9 men) and 5 nurses (all women). Results: Many professionals identified the intentional errors of a crude character. Few participants identified errors that needed more careful evaluation, and no one detected the intentional error that required analysis of the literature. Physicians considered 95.8% of the items of the PEM acceptable; nurses, 29.2%. The differences in scoring were statistically significant for 27% of the items. In the overall evaluation, 66.6% were considered acceptable. The analysis of each item revealed a behavioral pattern for each professional group. Conclusions: Instruments for the evaluation of printed education materials are needed and may improve the quality of the PEMs available to patients. Acceptability indices are not always totally correct, nor do they always represent high quality of information. The professional experience, the practice pattern, and perhaps the gender of the reviewers may influence their evaluation. An analysis of the PEM by professionals in communication and drug information, and by patients, should be carried out to improve the quality of the proposed material. PMID:25214924

  3. Validity of Eye Movement Methods and Indices for Capturing Semantic (Associative) Priming Effects

    ERIC Educational Resources Information Center

    Odekar, Anshula; Hallowell, Brooke; Kruse, Hans; Moates, Danny; Lee, Chao-Yang

    2009-01-01

    Purpose: The purpose of this investigation was to evaluate the usefulness of eye movement methods and indices as a tool for studying priming effects by verifying whether eye movement indices capture semantic (associative) priming effects in a visual cross-format (written word to semantically related picture) priming paradigm. Method: In the…

  4. Probability of Detection (POD) as a statistical model for the validation of qualitative methods.

    PubMed

    Wehling, Paul; LaBudde, Robert A; Brunelle, Sharon L; Nelson, Maria T

    2011-01-01

    A statistical model is presented for use in validation of qualitative methods. This model, termed Probability of Detection (POD), harmonizes the statistical concepts and parameters between quantitative and qualitative method validation. POD characterizes method response with respect to concentration as a continuous variable. The POD model provides a tool for graphical representation of response curves for qualitative methods. In addition, the model allows comparisons between candidate and reference methods, and provides calculations of repeatability, reproducibility, and laboratory effects from collaborative study data. Single laboratory study and collaborative study examples are given.
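
    A common way to work with POD data is to model the probability of detection as a smooth function of log concentration and fit it to observed detection fractions. A hedged sketch using a logistic form; the model family and all numbers are illustrative, and the AOAC POD framework defines its own estimators and confidence intervals.

      import numpy as np
      from scipy.optimize import curve_fit

      def pod_curve(conc, b0, b1):
          """Logistic POD model over log10 concentration."""
          return 1.0 / (1.0 + np.exp(-(b0 + b1 * np.log10(conc))))

      # Detections out of 12 replicates per level (illustrative data)
      conc = np.array([0.04, 0.2, 1.0, 5.0, 25.0])   # e.g., CFU/g
      pod_obs = np.array([1, 4, 9, 12, 12]) / 12.0
      params, _ = curve_fit(pod_curve, conc, pod_obs, p0=[0.0, 1.0])
      print(pod_curve(np.array([0.5]), *params))     # estimated POD at 0.5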

  5. Validation of a commercially available automated canine-specific immunoturbidimetric method for measuring canine C-reactive protein

    PubMed Central

    Hillström, Anna; Hagman, Ragnvi; Tvedten, Harold; Kjelgaard-Hansen, Mads

    2014-01-01

    Background: Measurement of C-reactive protein (CRP) is used for diagnosing and monitoring systemic inflammatory disease in canine patients. An automated human immunoturbidimetric assay has been validated for measuring canine CRP, but cross-reactivity with canine CRP is unpredictable. Objective: The purpose of the study was to validate a new automated canine-specific immunoturbidimetric CRP method (Gentian cCRP). Methods: Studies of imprecision, accuracy, prozone effect, interference, limit of quantification, and stability under different storage conditions were performed. The new method was compared with a human CRP assay previously validated for canine CRP determination. Samples from 40 healthy dogs were analyzed to establish a reference interval. Results: Total imprecision was < 2.4% for 4 tested serum pools analyzed twice daily over 10 days. The method was linear under dilution, and no prozone effect was detected at a concentration of 1200 mg/L. Recovery after spiking serum with purified canine CRP at 2 different concentrations was 123% and 116%, respectively. No interference from hemoglobin or triglycerides (10 g/L) was detected. CRP was stable for 14 days at 4°C and 22°C. In the method comparison study, there was good agreement between the validated human CRP assay and the new canine-specific assay. Healthy dogs had CRP concentrations below the limit of quantification of the Gentian cCRP method (6.8 mg/L). Conclusions: The new canine-specific immunoturbidimetric CRP assay is a reliable and rapid method for measuring canine CRP, and its potential for automation makes it suitable for clinical use. PMID:24798319

  6. Production of general purpose heat source (GPHS) using advanced manufacturing methods

    SciTech Connect

    Miller, R.G.

    1996-03-01

    Mankind will continue to explore the stars through the use of unmanned spacecraft until the technology and costs are compatible with sending travelers to the outer planets of our solar system and beyond. Unmanned probes of the present and future will be necessary to develop the necessary technologies and obtain the information that will make this travel possible. Because of the significant costs incurred, modern manufacturing technologies must be used to lower the investment needed, even when shared by international partnerships. For over 30 years, radioisotopes have provided the heat from which electrical power is extracted. Electric power for future spacecraft will be provided by either Radioisotope Thermoelectric Generators (RTG), Radioisotopic Thermophotovoltaic systems (RTPV), radioisotope Stirling systems, or a combination of these. All of these systems will be thermally driven by General Purpose Heat Source (GPHS) fueled clads in some configuration. The GPHS clad contains a 238PuO2 pellet encapsulated in an iridium alloy container. Historically, the fabrication of the iridium alloy shells has been performed at EG&G Mound and Oak Ridge National Laboratory (ORNL), and girth welding at Westinghouse Savannah River Corporation (WSRC) and Los Alamos National Laboratory (LANL). This paper describes the use of laser processing for welding, drilling, cutting, and machining, together with other manufacturing methods, to reduce the costs of producing GPHS fueled clad components and completed assemblies. Incorporation of new quality technologies will complement these manufacturing methods to reduce cost. © 1996 American Institute of Physics.

  7. 25 CFR 309.8 - For marketing purposes, what is the recommended method of identifying authentic Indian products?

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    Title 25 (Indians), Vol. 2, 2012-04-01 edition, § 309.8: For marketing purposes, what is the recommended method of identifying authentic Indian products? (Indian Arts and Crafts Board, Department of the Interior; Protection of Indian Arts and Crafts Products.)

  8. 25 CFR 309.8 - For marketing purposes, what is the recommended method of identifying authentic Indian products?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    Title 25 (Indians), Vol. 2, 2011-04-01 edition, § 309.8: For marketing purposes, what is the recommended method of identifying authentic Indian products? (Indian Arts and Crafts Board, Department of the Interior; Protection of Indian Arts and Crafts Products.)

  9. 25 CFR 309.8 - For marketing purposes, what is the recommended method of identifying authentic Indian products?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    Title 25 (Indians), Vol. 2, 2014-04-01 edition, § 309.8: For marketing purposes, what is the recommended method of identifying authentic Indian products? (Indian Arts and Crafts Board, Department of the Interior; Protection of Indian Arts and Crafts Products.)

  10. 25 CFR 309.8 - For marketing purposes, what is the recommended method of identifying authentic Indian products?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    Title 25 (Indians), Vol. 2, 2013-04-01 edition, § 309.8: For marketing purposes, what is the recommended method of identifying authentic Indian products? (Indian Arts and Crafts Board, Department of the Interior; Protection of Indian Arts and Crafts Products.)

  11. Single Lab Validation of a LC/UV/FLD/MS Method for Simultaneous Determination of Water-soluble Vitamins in Multi-Vitamin Dietary Supplements

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The purpose of this study was to develop a Single-Lab Validated Method using high-performance liquid chromatography (HPLC) with different detectors (diode array detector - DAD, fluorescence detector - FLD, and mass spectrometer - MS) for determination of seven B-complex vitamins (B1 - thiamin, B2 – ...

  12. Bioanalytical method validation: concepts, expectations and challenges in small molecule and macromolecule--a report of PITTCON 2013 symposium.

    PubMed

    Bashaw, Edward D; DeSilva, Binodh; Rose, Mark J; Wang, Yow-Ming C; Shukla, Chinmay

    2014-05-01

    The concepts, importance, and implications of bioanalytical method validation have been discussed and debated for a long time. The recent high-profile issues related to bioanalytical method validation at both Cetero Houston and the former MDS Canada have brought this topic back into the limelight. Hence, a symposium on bioanalytical method validation, with the aim of revisiting the building blocks as well as discussing the challenges and implications for the bioanalysis of both small molecules and macromolecules, was featured at the PITTCON 2013 Conference and Expo. This symposium was cosponsored by the American Chemical Society (ACS) Division of Analytical Chemistry and the Analysis and Pharmaceutical Quality (APQ) Section of the American Association of Pharmaceutical Scientists (AAPS), and featured leading speakers from the Food & Drug Administration (FDA), academia, and industry. In this symposium, the speakers shared several unique examples, and the session also provided a platform to discuss the need for continuous vigilance of bioanalytical methods during drug discovery and development. The purpose of this article is to provide a concise report on the materials that were presented.

  13. Evaluation of the quantitative performances of supercritical fluid chromatography: from method development to validation.

    PubMed

    Dispas, Amandine; Lebrun, Pierre; Ziemons, Eric; Marini, Roland; Rozet, Eric; Hubert, Philippe

    2014-08-01

    Recently, the number of papers about SFC has increased drastically, but few have focused on the quantitative performance of this technique. In order to prove the potential of UHPSFC, the present work discusses the different steps of the analytical life cycle of a method: from development to validation and application. Moreover, the quantitative performance of UHPSFC was evaluated in comparison with UHPLC, which is the main technique used for quality control in the pharmaceutical industry and can therefore be considered a reference. The methods were developed using a Design Space strategy, leading to the optimization of a robust method. In this context, when the Design Space optimization shows a guarantee of quality, no further robustness study is required prior to validation. The methods were then geometrically transferred in order to reduce the analysis time. The UHPSFC and UHPLC methods were validated based on the total error approach using accuracy profiles. Even if UHPLC showed better precision and sensitivity, the UHPSFC method is able to give accurate results in a dosing range larger than the 80-120% range required by the European Medicines Agency. Consequently, UHPSFC results are valid and could be used for the control of the active substance in a finished pharmaceutical product. Finally, the validated UHPSFC method was used to analyse real samples and gave similar results to the reference method (UHPLC). PMID:24513349

  14. Determination of methylmercury in marine sediment samples: method validation and occurrence data.

    PubMed

    Carrasco, Luis; Vassileva, Emilia

    2015-01-01

    The determination of methylmercury (MeHg) in sediment samples is a difficult task due to the extremely low MeHg/THg (total mercury) ratio and species interconversion. Here, we present the method validation of a cost-effective fit-for-purpose analytical procedure for the measurement of MeHg in sediments, which is based on aqueous phase ethylation, followed by purge and trap and hyphenated gas chromatography-pyrolysis-atomic fluorescence spectrometry (GC-Py-AFS) separation and detection. Four different extraction techniques, namely acid and alkaline leaching followed by solvent extraction and evaporation, microwave-assisted extraction with 2-mercaptoethanol, and acid leaching, solvent extraction and back extraction into sodium thiosulfate, were examined regarding their potential to selectively extract MeHg from estuarine sediment IAEA-405 certified reference material (CRM). The procedure based on acid leaching with HNO3/CuSO4, solvent extraction and back extraction into Na2S2O3 yielded the highest extraction recovery, i.e., 94±3% and offered the possibility to perform the extraction of a large number of samples in a short time, by eliminating the evaporation step. The artifact formation of MeHg was evaluated by high performance liquid chromatography coupled to inductively coupled plasma mass spectrometry (HPLC-ICP-MS), using isotopically enriched Me(201)Hg and (202)Hg and it was found to be nonexistent. A full validation approach in line with ISO 17025 and Eurachem guidelines was followed. With this in mind, blanks, selectivity, working range (1-800 pg), linearity (0.9995), recovery (94-96%), repeatability (3%), intermediate precision (4%), limit of detection (0.45 pg) and limit of quantification (0.85 pg) were systematically assessed with CRM IAEA-405. The uncertainty budget was calculated and the major contribution to the combined uncertainty (16.24%, k=2) was found to arise from the uncertainty associated with recovery (74.1%). Demonstration of traceability of
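
    An uncertainty budget of the kind described combines independent relative contributions in quadrature and expands the result with a coverage factor. A minimal sketch; the budget entries are illustrative placeholders, except that recovery is treated as the dominant term, as in the paper.

      import numpy as np

      # Relative standard uncertainties (fractions), combined and expanded (k=2)
      rel_u = {"recovery": 0.074, "calibration": 0.025,
               "repeatability": 0.030, "mass and volume": 0.010}
      u_c = np.sqrt(sum(v ** 2 for v in rel_u.values()))
      print(f"u_c = {100 * u_c:.2f}% ; U (k=2) = {200 * u_c:.2f}%")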

  15. Fuzzy-logic based strategy for validation of multiplex methods: example with qualitative GMO assays.

    PubMed

    Bellocchi, Gianni; Bertholet, Vincent; Hamels, Sandrine; Moens, W; Remacle, José; Van den Eede, Guy

    2010-02-01

    This paper illustrates the advantages that a fuzzy-based aggregation method could bring into the validation of a multiplex method for GMO detection (DualChip GMO kit, Eppendorf). Guidelines for validation of chemical, bio-chemical, pharmaceutical and genetic methods have been developed and ad hoc validation statistics are available and routinely used, for in-house and inter-laboratory testing, and decision-making. Fuzzy logic allows summarising the information obtained by independent validation statistics into one synthetic indicator of overall method performance. The microarray technology, introduced for simultaneous identification of multiple GMOs, poses specific validation issues (patterns of performance for a variety of GMOs at different concentrations). A fuzzy-based indicator for overall evaluation is illustrated in this paper, and applied to validation data for different genetically modified elements. Remarks were drawn on the analytical results. The fuzzy-logic based rules were shown to be applicable to improve interpretation of results and facilitate overall evaluation of the multiplex method. PMID:19533405
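
    The aggregation idea can be shown in a few lines: each validation statistic is mapped through a membership function into [0, 1] ('good performance'), and the memberships are combined into one synthetic indicator. A schematic sketch; the linear memberships, cut-offs and simple averaging are assumptions, not the paper's fuzzy rules.

      import numpy as np

      def membership_good(value, worst, best):
          """Linear fuzzy membership in 'good performance' between worst and best."""
          return float(np.clip((value - worst) / (best - worst), 0.0, 1.0))

      # Illustrative validation statistics for one GMO element -> one indicator
      scores = [membership_good(0.96, worst=0.70, best=1.00),  # accuracy
                membership_good(0.90, worst=0.60, best=1.00),  # sensitivity
                membership_good(0.85, worst=0.60, best=1.00)]  # specificity
      print(np.mean(scores))  # aggregated indicator in [0, 1]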

  16. 25 CFR 309.8 - For marketing purposes, what is the recommended method of identifying authentic Indian products?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    Title 25 (Indians), Vol. 2, 2010-04-01 edition, § 309.8: For marketing purposes, what is the recommended method of identifying authentic Indian products? (a) The recommended method of marketing authentic… (Indian Arts and Crafts Board, Department of the Interior; Protection of Indian Arts and Crafts Products.)

  17. A Model Incorporating the Rationale and Purpose for Conducting Mixed-Methods Research in Special Education and beyond

    ERIC Educational Resources Information Center

    Collins, Kathleen M. T.; Onwuegbuzie, Anthony J.; Sutton, Ida L.

    2006-01-01

    This article provides a typology of reasons for conducting mixed-methods research in special education. The mixed-methods research process is described along with the role of the rationale and purpose of study. The reasons given in the literature for utilizing mixed-methods research are explicated, and the limitations of these reason frameworks…

  19. General purpose nonlinear system solver based on Newton-Krylov method.

    SciTech Connect

    2013-12-01

    KINSOL is part of a software family called SUNDIALS: SUite of Nonlinear and Differential/Algebraic equation Solvers [1]. KINSOL is a general-purpose nonlinear system solver based on Newton-Krylov and fixed-point solver technologies [2].
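
    KINSOL itself is a C library, but the underlying Newton-Krylov technique is easy to demonstrate with SciPy's implementation; the toy two-equation system and tolerance below are illustrative assumptions, not a SUNDIALS example.

    ```python
    import numpy as np
    from scipy.optimize import newton_krylov

    def F(x):
        """Toy nonlinear system: intersection of a circle of radius 2 with the line y = x."""
        return np.array([x[0] ** 2 + x[1] ** 2 - 4.0,
                         x[0] - x[1]])

    x0 = np.array([1.0, 1.0])               # initial guess
    sol = newton_krylov(F, x0, f_tol=1e-10) # Newton iteration with a Krylov linear solver
    print(sol)                              # approx. [1.41421356, 1.41421356]
    ```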

  20. Development and Validation of UV-Visible Spectrophotometric Method for Simultaneous Determination of Eperisone and Paracetamol in Solid Dosage Form

    PubMed Central

    Khanage, Shantaram Gajanan; Mohite, Popat Baban; Jadhav, Sandeep

    2013-01-01

    Purpose: Eperisone Hydrochloride (EPE) is a potent new-generation antispasmodic drug used in the treatment of moderate to severe pain in combination with Paracetamol (PAR). Both drugs are available in a combination tablet dosage form containing 50 mg of EPE and 325 mg of PAR. Methods: The method is based on the Q-absorption ratio approach for the simultaneous determination of EPE and PAR. The absorption ratio method uses the ratio of absorbances at two selected wavelengths, one of which is the iso-absorptive point and the other the λmax of one of the two components. EPE and PAR show an iso-absorptive point at 260 nm in methanol; the second wavelength used is 249 nm, the λmax of PAR in methanol. Results: Linearity was obtained in the concentration range of 5-25 μg/mL for EPE and 2-10 μg/mL for PAR. The proposed method was effectively applied to the tablet dosage form for estimation of both drugs. The accuracy and reproducibility results are close to 100% with 2% RSD. Results of the analysis were validated statistically and found to be satisfactory, and the proposed method was validated as per ICH guidelines. Conclusion: A simple, precise and economical spectrophotometric method has been developed for the estimation of EPE and PAR in pharmaceutical formulations. PMID:24312876
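
    The Q-absorption ratio calculation described in this abstract can be written out directly. In the sketch below, the absorbance readings and absorptivity coefficients are hypothetical placeholders; the abstract fixes only the two wavelengths (260 nm iso-absorptive point, 249 nm λmax of PAR).

    ```python
    # Measured absorbances of the mixture at the two analytical wavelengths (hypothetical)
    A_iso, A_249 = 0.62, 0.55

    # Hypothetical absorptivities; at the iso-absorptive point both drugs share one value
    a_iso = 0.040
    aE_249, aP_249 = 0.021, 0.052   # EPE and PAR at 249 nm

    Qm = A_249 / A_iso              # absorbance ratio of the mixture
    Qe = aE_249 / a_iso             # ratio for pure EPE
    Qp = aP_249 / a_iso             # ratio for pure PAR

    C_epe = (Qm - Qp) / (Qe - Qp) * A_iso / a_iso
    C_par = (Qm - Qe) / (Qp - Qe) * A_iso / a_iso
    print(f"EPE ~ {C_epe:.1f} ug/mL, PAR ~ {C_par:.1f} ug/mL")
    ```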

  1. Development of Data Validation Methods for System Configurations of Train Protection Systems

    NASA Astrophysics Data System (ADS)

    Shimazoe, Toshiyuki

    This paper proposes new methods of data validation that target configuration data prepared for train protection systems, as represented by the automatic train control (ATC) systems applied to the Shinkansen and other heavy-duty trains. This configuration data is assigned to a generic application program in order to realise specific applications according to track layouts and different local conditions. Errors introduced into the configuration data can lead to undesirable events, because such errors can compromise the safe separation of trains and safe train speeds. It is therefore a safety-critical issue to eliminate errors from the configuration data. For this reason, the author developed data validation methods that do not depend on manual checking, utilising Extensible Markup Language (XML) technologies. This paper illustrates that XML is useful for representing configuration data that requires flexible expression, and that XML Schema is valuable for performing syntax validation. Subsequently, semantic validation methods are proposed by means of Extensible Stylesheet Language Transformations (XSLT), providing a way to realise semantic validation without custom application programming. Additionally, reliable processes to ensure the checking of validation results are proposed, and the effectiveness of the proposed methods is demonstrated by application to actual configurations.
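
    Syntax validation of configuration data against an XML Schema, as proposed above, looks roughly like the sketch below; the file names are hypothetical, and the paper's actual schemas and XSLT-based semantic checks are not reproduced.

    ```python
    from lxml import etree

    # Load the schema and the configuration document (hypothetical file names)
    schema = etree.XMLSchema(etree.parse("track_layout_config.xsd"))
    config = etree.parse("track_layout_config.xml")

    if schema.validate(config):
        print("configuration passes syntax validation")
    else:
        for error in schema.error_log:        # report every schema violation found
            print(f"line {error.line}: {error.message}")
    ```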

  2. Development and validation of an ionic chromatography method for the determination of nitrate, nitrite and chloride in meat.

    PubMed

    Lopez-Moreno, Cristina; Perez, Isabel Viera; Urbano, Ana M

    2016-03-01

    The purpose of this study is to develop and validate a method for the analysis of certain preservatives in meat and to obtain a suitable Certified Reference Material (CRM) for this task. The preservatives studied were NO3(-), NO2(-) and Cl(-), as they serve as important antimicrobial agents in meat, inhibiting the growth of spoilage bacteria. The meat samples were prepared using a treatment that allowed the production of a CRM of known concentration that is highly homogeneous and stable in time. Matrix effects were also studied to evaluate their influence on the analytical signal for the ions of interest, showing that the matrix does not affect the final result. An assessment of the signal variation in time was carried out for the ions. Although the chloride and nitrate signals remained stable for the duration of the study, the nitrite signal decreased appreciably with time. A mathematical treatment of the data gave a stable nitrite signal, yielding a method suitable for the validation of these anions in meat. A statistical study was performed for the validation of the method, in which precision, accuracy, uncertainty and other parameters were evaluated, with satisfactory results. PMID:26471608

  4. Validation of an in-vivo proton beam range check method in an anthropomorphic pelvic phantom using dose measurements

    SciTech Connect

    Bentefour, El H. Prieels, Damien; Tang, Shikui; Cascio, Ethan W.; Testa, Mauro; Lu, Hsiao-Ming; Samuel, Deepak; Gottschalk, Bernard

    2015-04-15

    Purpose: In-vivo dosimetry and beam range verification in proton therapy could play a significant role in proton treatment validation and improvement. In-vivo beam range verification, in particular, could enable new treatment techniques, one of which could be the use of anterior fields for prostate treatment instead of the opposed lateral fields in current practice. This paper reports a validation study of an in-vivo range verification method which can reduce the range uncertainty to submillimeter levels and potentially allow for in-vivo dosimetry. Methods: An anthropomorphic pelvic phantom is used to validate the clinical potential of the time-resolved dose method for range verification in the case of prostate treatment using range-modulated anterior proton beams. The method uses a 3 × 4 matrix of 1 mm diodes mounted in a water balloon, read by an ADC system at 100 kHz. The method is first validated against beam range measurements obtained by dose extinction, first in a water phantom and then in the pelvic phantom for both open-field and treatment-field configurations. The beam range results are then compared with the water equivalent path length (WEPL) values computed from the treatment planning system XIO. Results: Beam range measurements from the time-resolved dose method and the dose extinction method agree with submillimeter precision in the water phantom. For the pelvic phantom, when discarding two of the diodes that show signs of significant range mixing, the two methods agree within ±1 mm. A dose of only 7 mGy is sufficient to achieve this result. The comparison with the WEPL computed by the treatment planning system (XIO) shows that XIO underestimates the proton beam range. Quantifying the exact XIO range underestimation depends on the strategy used to evaluate the WEPL results; by our best evaluation, XIO underestimates the treatment beam range by between 1.7% and 4.1%. Conclusions: Time-resolved dose

  5. Increased efficacy for in-house validation of real-time PCR GMO detection methods.

    PubMed

    Scholtens, I M J; Kok, E J; Hougs, L; Molenaar, B; Thissen, J T N M; van der Voet, H

    2010-03-01

    To improve the efficacy of the in-house validation of GMO detection methods (DNA isolation and real-time PCR, polymerase chain reaction), a study was performed to gain insight into the contribution of the different steps of the GMO detection method to the repeatability and in-house reproducibility. In the present study, 19 methods for (GM) soy, maize, canola and potato were validated in-house, 14 of them on the basis of an 8-day validation scheme using eight different samples and five on the basis of a more concise validation protocol. In this way, data were obtained with respect to the detection limit, accuracy and precision. Decision limits were also calculated for declaring non-conformance (>0.9%) with 95% reliability. In order to estimate the contribution of the different steps in the GMO analysis to the total variation, variance components were estimated using REML (residual maximum likelihood). From these components, relative standard deviations for repeatability and reproducibility (RSD(r) and RSD(R)) were calculated. The results showed that not only the PCR reaction but also the factors 'DNA isolation' and 'PCR day' are important contributors to the total variance and should therefore be included in the in-house validation. It is proposed to use a statistical model to estimate these factors from a large dataset of initial validations so that, for similar GMO methods in the future, only the PCR step needs to be validated. The resulting data are discussed in the light of agreed European criteria for qualified GMO detection methods. PMID:20012027
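
    The repeatability/reproducibility decomposition described above can be illustrated with classical one-way variance-component estimators; the sketch below uses ANOVA mean squares rather than REML for brevity, and the measurement values are invented.

    ```python
    import numpy as np

    # Hypothetical GMO content measurements (%), grouped by PCR day: 3 days x 3 replicates
    data = [np.array([0.92, 0.95, 0.90]),
            np.array([0.88, 0.91, 0.89]),
            np.array([0.97, 0.99, 0.95])]

    n = len(data[0])                                        # replicates per day
    grand_mean = np.mean(np.concatenate(data))
    ms_within = np.mean([np.var(d, ddof=1) for d in data])  # repeatability mean square
    ms_between = n * np.var([d.mean() for d in data], ddof=1)

    s2_r = ms_within                                        # within-day (repeatability) variance
    s2_day = max((ms_between - ms_within) / n, 0.0)         # 'PCR day' variance component

    rsd_r = np.sqrt(s2_r) / grand_mean
    rsd_R = np.sqrt(s2_r + s2_day) / grand_mean             # in-house reproducibility
    print(f"RSD_r = {rsd_r:.1%}, RSD_R = {rsd_R:.1%}")
    ```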

  7. Validation of Satellite-Derived Land Surface Temperature Products - Methods and Good Practice

    NASA Astrophysics Data System (ADS)

    Guillevic, P. C.; Hulley, G. C.; Hook, S. J.; Biard, J.; Ghent, D.

    2014-12-01

    Land Surface Temperature (LST) is a key variable for surface water and energy budget calculations that can be obtained globally and operationally from satellite observations. LST is used for many applications, including weather forecasting, short-term climate prediction, extreme weather monitoring, and irrigation and water resource management. In order to maximize the usefulness of LST for research and studies it is necessary to know the uncertainty in the LST measurement. Multiple validation methods and activities are necessary to assess LST compliance with the quality specifications of operational users. This work presents four different validation methods that have been widely used to determine the uncertainties in LST products derived from satellite measurements. 1) The temperature based validation method involves comparisons with ground-based measurements of LST. The method is strongly limited by the number and quality of available field stations. 2) Scene-based comparisons involve comparing a new satellite LST product with a heritage LST product. This method is not an absolute validation and satellite LST inter-comparisons alone do not provide an independent validation measurement. 3) The radiance-based validation method does not require ground-based measurements and is usually used for large scale validation effort or for LST products with coarser spatial resolution (> 1km). 4) Time series comparisons are used to detect problems that can occur during the instrument's life, e.g. calibration drift, or unrealistic outliers due to cloud coverage. This study enumerates the sources of errors associated with each method. The four different approaches are complementary and provide different levels of information about the quality of the retrieved LST. The challenges in retrieving the LST from satellite measurements are discussed using results obtained for MODIS and VIIRS. This work contributes to the objective of the Land Product Validation (LPV) sub-group of the

  8. Verification, Validation, and Solution Quality in Computational Physics: CFD Methods Applied to Ice Sheet Physics

    NASA Technical Reports Server (NTRS)

    Thompson, David E.

    2005-01-01

    Procedures and methods for verification of coding algebra and for validation of models and calculations used in the aerospace computational fluid dynamics (CFD) community would be efficacious if used by the glacier dynamics modeling community. This paper presents some of those methods, and how they might be applied to uncertainty management supporting code verification and model validation for glacier dynamics. The similarities and differences between their use in CFD analysis and the proposed application of these methods to glacier modeling are discussed. After establishing sources of uncertainty and methods for code verification, the paper looks at a representative sampling of verification and validation efforts that are underway in the glacier modeling community, and establishes a context for these within an overall solution quality assessment. Finally, a vision of a new information architecture and interactive scientific interface is introduced and advocated.

  9. Validation of a novel method for quantifying and comparing regional ACL elongations during uniaxial tensile loading.

    PubMed

    Beaulieu, Mélanie L; Haladik, Jeffrey A; Bey, Michael J; McLean, Scott G

    2012-10-11

    Given the complex three-dimensional (3D) knee joint loading associated with anterior cruciate ligament (ACL) injuries, accurate site- and bundle-specific strain measurements are critical. The purpose of this study was to quantify tensile load-induced migrations of radio-opaque markers injected directly into the ACL, as a first step in validating a roentgen stereophotogrammetric analysis-based method for measuring ligament strain. Small markers were inserted into the femur and tibia, as well as injected into the antero-medial bundle of the ACL of eight (42-56 yrs) femur-ACL-tibia complexes (FATCs). The FATCs were then loaded under tension along the ligament's longitudinal axis by a material testing machine from 10 N to 50 N, 100 N, 125 N, and 150 N, each over 10 load-unload cycles. Complexes were imaged before the loading protocol, between each loading sequence, and after the protocol via biplane radiography. Marker migrations within the ACL tissue were quantified as the difference in their 3D positions between the pre- and each post-loading condition. Negligible migration was evident, with the lowest average root mean square values observed along the longitudinal axis of the ACL, ranging from 0.128 to 0.219 mm. Further, neither marker location nor load magnitude significantly affected migration values. This innovative method, therefore, presents as a plausible means to measure global and regional ACL strains, as small as 0.75% strain. In particular, it may provide important new insights in ACL strain behaviors during complex 3D knee load states associated with ligament injury. PMID:22939290
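
    Marker migration in this design is the difference between a marker's 3D positions before and after loading, summarised as a per-axis root mean square. A sketch of that computation follows, with invented coordinates.

    ```python
    import numpy as np

    # Hypothetical 3D marker coordinates (mm) before and after a loading sequence
    pre = np.array([[12.10, 4.30, 8.70],
                    [10.50, 5.10, 9.20],
                    [11.80, 4.70, 8.90]])
    post = np.array([[12.18, 4.28, 8.62],
                     [10.61, 5.02, 9.31],
                     [11.87, 4.75, 8.82]])

    migration = post - pre                                 # per-marker displacement vectors
    rms_per_axis = np.sqrt((migration ** 2).mean(axis=0))  # RMS migration along each axis
    print(rms_per_axis)                                    # values ~0.1 mm indicate negligible migration
    ```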

  10. Laboratory and field validation of a Cry1Ab protein quantitation method for water.

    PubMed

    Strain, Katherine E; Whiting, Sara A; Lydy, Michael J

    2014-10-01

    The widespread planting of crops expressing insecticidal proteins derived from the soil bacterium Bacillus thuringiensis (Bt) has given rise to concerns regarding potential exposure of non-target species. These proteins are released from the plant throughout the growing season into soil and surface runoff and may enter adjacent waterways via runoff, erosion, aerial deposition of particulates, or plant debris. It is crucial to be able to accurately quantify Bt protein concentrations in the environment to aid in risk analyses and decision making. Enzyme-linked immunosorbent assay (ELISA) is commonly used for quantitation of Bt proteins in the environment; however, there are no published methods detailing and validating the extraction and quantitation of Bt proteins in water. The objective of the current study was to optimize the extraction of a Bt protein, Cry1Ab, from three water matrices and validate the ELISA method for specificity, precision, accuracy, stability, and sensitivity. Recovery of the Cry1Ab protein was matrix-dependent and ranged from 40 to 88% in the validated matrices, with an overall method detection limit of 2.1 ng/L. Precision between two plates and within a single plate was confirmed with a coefficient of variation less than 20%. The ELISA method was verified in field and laboratory samples, demonstrating the utility of the validated method. The implementation of a validated extraction and quantitation protocol adds consistency and reliability to field-collected data regarding transgenic products. PMID:25059137

  11. Bridging the gap between comprehensive extraction protocols in plant metabolomics studies and method validation.

    PubMed

    Bijttebier, Sebastiaan; Van der Auwera, Anastasia; Foubert, Kenn; Voorspoels, Stefan; Pieters, Luc; Apers, Sandra

    2016-09-01

    It is vital to pay close attention to the design of extraction methods developed for plant metabolomics, as any non-extracted or converted metabolites will greatly affect the overall quality of the metabolomics study. Method validation is, however, often omitted in plant metabolome studies, as the well-established methodologies for classical targeted analyses, such as recovery optimization, cannot be strictly applied. The aim of the present study is to thoroughly evaluate state-of-the-art comprehensive extraction protocols for plant metabolomics with liquid chromatography-photodiode array-accurate mass mass spectrometry (LC-PDA-amMS) by bridging the gap with method validation. Validation of an extraction protocol in untargeted plant metabolomics should ideally be accomplished by validating the protocol for all possible outcomes, i.e. for all secondary metabolites potentially present in the plant. In an effort to approach this ideal validation scenario, two plant matrices were selected based on their wide versatility of phytochemicals: meadowsweet (Filipendula ulmaria) for its polyphenol content, and spicy paprika powder (from the genus Capsicum) for its apolar phytochemical content (carotenoids, phytosterols, capsaicinoids). These matrices were extracted with comprehensive extraction protocols adapted from the literature and analysed with a generic LC-PDA-amMS characterization platform previously validated for broad-range phytochemical analysis. The performance of the comprehensive sample preparation protocols was assessed based on extraction efficiency, repeatability and intermediate precision, and on evaluation of ionization suppression/enhancement. The manuscript elaborates on the finding that none of the extraction methods allowed exhaustive extraction of the metabolites. Furthermore, it is shown that, depending on the extraction conditions, enzymatic degradation mechanisms can occur. Investigation of the fractions obtained with the different extraction methods

  12. Range of valid arguments for data reduction method in the steam flow fields

    NASA Astrophysics Data System (ADS)

    Šafařík, Pavel; Nový, Adam; Hajšman, Miroslav; Jícha, David

    2014-08-01

    The data reduction method is based on conservation laws and was developed for an ideal gas. The method is widely adopted in aerodynamic experimental research. The present paper attempts to extend this method to steam flows. The range of valid arguments for the data reduction method is determined by means of numerical calculations using the IAPWS-IF97 equations of state for steam.

  13. Inter-laboratory validation of standardized method to determine permeability of plastic films

    Technology Transfer Automated Retrieval System (TEKTRAN)

    To support regulations controlling soil fumigation, we are standardizing the laboratory method we developed to measure the permeability of plastic films to fumigant vapors. The method was validated using an inter-laboratory comparison with 7 participants. Each participant evaluated the mass transfer...

  14. Method of validating measurement data of a process parameter from a plurality of individual sensor inputs

    DOEpatents

    Scarola, Kenneth; Jamison, David S.; Manazir, Richard M.; Rescorl, Robert L.; Harmon, Daryl L.

    1998-01-01

    A method for generating a validated measurement of a process parameter at a point in time by using a plurality of individual sensor inputs from a scan of said sensors at said point in time. The sensor inputs from said scan are stored, and a first validation pass is initiated by computing an initial average of all stored sensor inputs. Each sensor input is deviation-checked by comparing each input, including a preset tolerance, against the initial average input. If the first deviation check is unsatisfactory, the sensor which produced the unsatisfactory input is flagged as suspect. It is then determined whether at least two of the inputs have not been flagged as suspect and are therefore considered good inputs. If two or more inputs are good, a second validation pass is initiated by computing a second average of all the good sensor inputs, and deviation-checking the good inputs by comparing each good input, including a preset tolerance, against the second average. If the second deviation check is satisfactory, the second average is displayed as the validated measurement and the suspect sensor is flagged as bad. A validation fault occurs if at least two inputs are not considered good, or if the second deviation check is not satisfactory. In the latter situation the inputs from all of the sensors are compared against the last validated measurement, and the value from the sensor input that deviates the least from the last valid measurement is displayed.
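
    A compact sketch of the two-pass logic described in this abstract follows; the tolerance handling and the fallback rule are simplified from the claim language.

    ```python
    def validate_scan(inputs, tol, last_valid):
        """Two-pass validation of one scan of redundant sensor inputs."""
        # Pass 1: deviation-check every input against the average of all inputs
        first_avg = sum(inputs) / len(inputs)
        good = [x for x in inputs if abs(x - first_avg) <= tol]
        if len(good) < 2:
            # Validation fault: fall back to the input closest to the last valid value
            return min(inputs, key=lambda x: abs(x - last_valid)), False
        # Pass 2: deviation-check the good inputs against their own average
        second_avg = sum(good) / len(good)
        if all(abs(x - second_avg) <= tol for x in good):
            return second_avg, True   # validated measurement; remaining inputs flagged bad
        return min(inputs, key=lambda x: abs(x - last_valid)), False

    value, valid = validate_scan([10.1, 10.2, 9.9, 14.7], tol=1.5, last_valid=10.0)
    print(value, valid)               # ~10.07, True -- the 14.7 outlier is excluded
    ```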

  15. Validation and application of an HPLC-CAD-TOF/MS method for identification and quantification of pharmaceutical counterions.

    PubMed

    Ilko, D; Nap, C J; Holzgrabe, U; Almeling, S

    2014-01-01

    A generic approach for the analysis of counterions of pharmaceutical reference substances, which are established by the laboratory department of the European Pharmacopoeia (Ph. Eur.), was developed. A mixed-mode chromatography method using charged aerosol detection (CAD), published by Zhang et al. and separating 25 commonly used pharmaceutical counterions, was selected for this purpose. The method was validated in terms of specificity, repeatability, limits of quantification (LOQs), linearity and range according to ICH guideline Q2(R1) and the Technical Guide for the Elaboration of Monographs of the Ph. Eur. Moreover, the applicability of the method for the purpose of counterion identification and quantification in drug substances, as well as for the control of inorganic ions as impurities, was demonstrated using selected examples of Ph. Eur. reference standards and other samples of substances for pharmaceutical use (e.g. cloxacillin sodium, somatostatin). It was shown that for identification of the parent substance as well as organic ions, the chromatographic system can easily be coupled to a mass selective detector without any modification.

  16. Statistical methods and software for validation studies on new in vitro toxicity assays.

    PubMed

    Schaarschmidt, Frank; Hothorn, Ludwig A

    2014-11-01

    When a new in vitro assay method is introduced, it should be validated against the best available knowledge or a reference standard assay. For assays resulting in a simple binary outcome, the data can be displayed as a 2×2 table. Based on the estimated sensitivity and specificity, and the assumed prevalence of true positives in the population of interest, the positive and negative predictive values of the new assay can be calculated. We briefly discuss the experimental design of validation experiments and previously published methods for computing confidence intervals for predictive values. The application of the methods is illustrated for two toxicological examples using tools available in the free software R: confidence intervals for predictive values are computed for a validation study of an in vitro test battery, and sample size calculation is illustrated for an acute toxicity assay. The R code necessary to reproduce the results is given.
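
    The predictive values referred to above follow from Bayes' rule given sensitivity, specificity and an assumed prevalence. A minimal sketch, with invented assay figures:

    ```python
    def predictive_values(sens, spec, prev):
        """Positive and negative predictive values via Bayes' rule."""
        ppv = sens * prev / (sens * prev + (1 - spec) * (1 - prev))
        npv = spec * (1 - prev) / (spec * (1 - prev) + (1 - sens) * prev)
        return ppv, npv

    # Hypothetical in vitro assay: 85% sensitivity, 90% specificity,
    # and an assumed 20% prevalence of true positives
    ppv, npv = predictive_values(0.85, 0.90, 0.20)
    print(f"PPV = {ppv:.2f}, NPV = {npv:.2f}")   # PPV = 0.68, NPV = 0.96
    ```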

  17. Display format, highlight validity, and highlight method: Their effects on search performance

    NASA Technical Reports Server (NTRS)

    Donner, Kimberly A.; Mckay, Tim D.; Obrien, Kevin M.; Rudisill, Marianne

    1991-01-01

    Display format and highlight validity have been shown to affect visual display search performance; however, earlier studies were conducted on small, artificial displays of alphanumeric stimuli. A study manipulating these variables was conducted using realistic, complex Space Shuttle information displays. A 2x2x3 within-subjects analysis of variance found that search times were faster for items in reformatted displays than in current displays. Responses to validly highlighted items were significantly faster than responses to non-highlighted or invalidly highlighted items. The significant format by highlight validity interaction showed that there was little difference in response time between current and reformatted displays when valid highlighting was applied; however, under the non-highlight or invalid-highlight conditions, search times were faster with reformatted displays. A separate within-subjects analysis of variance of display format, highlight validity, and several highlight methods did not reveal a main effect of highlight method. In addition, observed display search times were compared to search times predicted by Tullis' Display Analysis Program. The benefits of highlighting and reformatting displays to enhance search, and the necessity of considering highlight validity and format characteristics in tandem when predicting search performance, are discussed.

  18. A mixed method approach to clarify the construct validity of interprofessional collaboration: an empirical research illustration.

    PubMed

    Ødegård, Atle; Bjørkly, Stål

    2012-07-01

    The rapid development of empirical studies in the field of interprofessional collaboration (IPC) calls for a wide array of scientific approaches ranging from recruitment and motivation to measurement and design questions. Regardless of whether researchers choose qualitative or quantitative approaches, they must substantiate their findings. We argue that more attention should be given to reliability and validity issues to improve our understanding of IPC as a phenomenon and practice. A mixed methods approach is presented as a relevant design format for the study of IPC. This paper aims to argue that a combination of methodologies may be a feasible way to enhance our understanding of IPC, with a special focus on reliability and validity issues; illustrate the application of different methodologies in an IPC research project; and emphasize the distinction between validity and validation to mitigate possible obstacles in integrating qualitative and quantitative research in the study of IPC.

  19. Validation of the Endopep-MS method for qualitative detection of active botulinum neurotoxins in human and chicken serum.

    PubMed

    Björnstad, Kristian; Tevell Åberg, Annica; Kalb, Suzanne R; Wang, Dongxia; Barr, John R; Bondesson, Ulf; Hedeland, Mikael

    2014-11-01

    Botulinum neurotoxins (BoNTs) are highly toxic proteases produced by anaerobic bacteria. Traditionally, a mouse bioassay (MBA) has been used for detection of BoNTs, but for a long time, laboratories have worked with alternative methods for their detection. One of the most promising in vitro methods is a combination of an enzymatic and mass spectrometric assay called Endopep-MS. However, no comprehensive validation of the method has been presented. The main purpose of this work was to perform a validation for the qualitative analysis of BoNT-A, B, C, C/D, D, D/C, and F in serum. The limit of detection (LOD), selectivity, precision, stability in matrix and solution, and correlation with the MBA were evaluated. The LOD was equal to or even better than that of the MBA for BoNT-A, B, D/C, E, and F. Furthermore, Endopep-MS was for the first time successfully used to differentiate between BoNT-C and D and their mosaics C/D and D/C by different combinations of antibodies and target peptides. In addition, sequential antibody capture was presented as a new way to multiplex the method when only a small sample volume is available. In the comparison with the MBA, all the samples analyzed were positive for BoNT-C/D with both methods. These results indicate that the Endopep-MS method is a valid alternative to the MBA as the gold standard for BoNT detection based on its sensitivity, selectivity, and speed and that it does not require experimental animals.

  20. Novel validated spectrofluorimetric methods for the determination of taurine in energy drinks and human urine.

    PubMed

    Sharaf El Din, M K; Wahba, M E K

    2015-03-01

    Two sensitive, selective, economic and validated spectrofluorimetric methods were developed for the determination of taurine in energy drinks and spiked human urine. Method I is based on fluorimetric determination of the amino acid through its reaction with Hantzsch reagent to form a highly fluorescent product measured at 490 nm after excitation at 419 nm. Method II is based on the reaction of taurine with tetracyanoethylene yielding a fluorescent charge transfer complex, which was measured at λex/λem of 360 nm/450 nm. The proposed methods were subjected to detailed validation procedures, and were statistically compared with the reference method, where the results obtained were in good agreement. Method I was further applied to determine taurine in energy drinks and spiked human urine giving promising results. Moreover, the stoichiometry of the reactions was studied, and reaction mechanisms were postulated.

  1. Validity and Feasibility of a Digital Diet Estimation Method for Use with Preschool Children: A Pilot Study

    ERIC Educational Resources Information Center

    Nicklas, Theresa A.; O'Neil, Carol E.; Stuff, Janice; Goodell, Lora Suzanne; Liu, Yan; Martin, Corby K.

    2012-01-01

    Objective: The goal of the study was to assess the validity and feasibility of a digital diet estimation method for use with preschool children in "Head Start." Methods: Preschool children and their caregivers participated in validation (n = 22) and feasibility (n = 24) pilot studies. Validity was determined in the metabolic research unit using…

  2. 40 CFR Appendix D to Part 63 - Alternative Validation Procedure for EPA Waste and Wastewater Methods

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... validated according to the procedures in Sections 5.1 and 5.3 of Test Method 301, 40 CFR part 63, appendix A... Method 25D of 40 CFR part 60, appendix A. 2.1. Sampling and Analysis 2.1.1. For each waste matrix... = theoretical mass of compound spiked into spiked sample (µg). 3.1. Method Evaluation In order for the...

  3. 40 CFR Appendix D to Part 63 - Alternative Validation Procedure for EPA Waste and Wastewater Methods

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... validated according to the procedures in Sections 5.1 and 5.3 of Test Method 301, 40 CFR part 63, appendix A... Method 25D of 40 CFR part 60, appendix A. 2.1. Sampling and Analysis 2.1.1. For each waste matrix... = theoretical mass of compound spiked into spiked sample (µg). 3.1. Method Evaluation In order for the...

  4. 40 CFR Appendix D to Part 63 - Alternative Validation Procedure for EPA Waste and Wastewater Methods

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... validated according to the procedures in Sections 5.1 and 5.3 of Test Method 301, 40 CFR part 63, appendix A... Method 25D of 40 CFR part 60, appendix A. 2.1. Sampling and Analysis 2.1.1. For each waste matrix... = theoretical mass of compound spiked into spiked sample (µg). 3.1. Method Evaluation In order for the...

  5. 40 CFR Appendix D to Part 63 - Alternative Validation Procedure for EPA Waste and Wastewater Methods

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... validated according to the procedures in Sections 5.1 and 5.3 of Test Method 301, 40 CFR part 63, appendix A... Method 25D of 40 CFR part 60, appendix A. 2.1. Sampling and Analysis 2.1.1. For each waste matrix... = theoretical mass of compound spiked into spiked sample (µg). 3.1. Method Evaluation In order for the...

  6. 40 CFR Appendix D to Part 63 - Alternative Validation Procedure for EPA Waste and Wastewater Methods

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... validated according to the procedures in Sections 5.1 and 5.3 of Test Method 301, 40 CFR part 63, appendix A... Method 25D of 40 CFR part 60, appendix A. 2.1. Sampling and Analysis 2.1.1. For each waste matrix... = theoretical mass of compound spiked into spiked sample (µg). 3.1. Method Evaluation In order for the...

  7. Comparison of validity of assessment methods using indices of adjusted agreement.

    PubMed

    Nam, Jun-mo

    2007-02-10

    For comparing the validity of rating methods, the adjusted kappa (S coefficient) and Yule's Y index are better than Cohen's kappa which is affected by marginal probabilities. We consider a validity study in which a subject is assessed as exposed or not-exposed by two competing rating methods and the gold standard. We are interested in one of the methods, which is closer in agreement with the gold standard. We present statistical methods taking correlations into account for comparing the validity of the rating methods using S coefficient and Y index. We show how the S coefficient and Yule's Y index are related to sensitivity and specificity. In comparing the two rating methods, the preference is clear when the inference is the same for both S and Y. If the inference using S differs from that using Y, then it is not obvious how to decide a preference. This may occur when one rating method is better than the other in sensitivity but not in specificity. Numerical examples for comparing asbestos-exposure assessment methods are illustrated.
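
    For a 2×2 agreement table, both indices have simple closed forms: the S coefficient fixes chance agreement at 1/2, and Yule's Y is built from the cross-product terms. A sketch with invented counts:

    ```python
    import math

    def s_coefficient(a, b, c, d):
        """Adjusted kappa (Bennett's S) for a 2x2 agreement table."""
        po = (a + d) / (a + b + c + d)   # observed agreement
        return 2 * po - 1                # chance agreement fixed at 0.5

    def yules_y(a, b, c, d):
        """Yule's Y (coefficient of colligation)."""
        t, u = math.sqrt(a * d), math.sqrt(b * c)
        return (t - u) / (t + u)

    # Hypothetical counts: rating method vs. gold standard
    # a = both exposed, b/c = disagreements, d = both not exposed
    a, b, c, d = 40, 5, 8, 47
    print(f"S = {s_coefficient(a, b, c, d):.2f}, Y = {yules_y(a, b, c, d):.2f}")
    ```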

  8. Validated spectrofluorimetric method for the determination of clonazepam in pharmaceutical preparations.

    PubMed

    Ibrahim, Fawzia; El-Enany, Nahed; Shalan, Shereen; Elsharawy, Rasha

    2016-05-01

    A simple, highly sensitive and validated spectrofluorimetric method was applied to the determination of clonazepam (CLZ). The method is based on reduction of the nitro group of clonazepam with zinc/CaCl2; the product is then reacted with 2-cyanoacetamide (2-CNA) in the presence of ammonia (25%), yielding a highly fluorescent product. The produced fluorophore exhibits strong fluorescence intensity at λem = 383 nm after excitation at λex = 333 nm. The method was rectilinear over a concentration range of 0.1-0.5 ng/mL with a limit of detection (LOD) of 0.0057 ng/mL and a limit of quantification (LOQ) of 0.017 ng/mL. The method was fully validated according to ICH guidelines and successfully applied to the determination of CLZ in its tablets, with a mean percentage recovery of 100.10 ± 0.75%. Statistical analysis of the results obtained using the proposed method showed no significant difference from those obtained using a reference method in terms of accuracy and precision.

  9. Interlaboratory Validation of the Leaching Environmental Assessment Framework (LEAF) Method 1313 and Method 1316

    EPA Science Inventory

    This document summarizes the results of an interlaboratory study conducted to generate precision estimates for two parallel batch leaching methods which are part of the Leaching Environmental Assessment Framework (LEAF). These methods are: (1) Method 1313: Liquid-Solid Partition...

  10. Factor analysis methods and validity evidence: A systematic review of instrument development across the continuum of medical education

    NASA Astrophysics Data System (ADS)

    Wetzel, Angela Payne

    Previous systematic reviews indicate a lack of reporting of reliability and validity evidence in subsets of the medical education literature. Psychology and general education reviews of factor analysis also indicate gaps between current and best practices; yet, a comprehensive review of exploratory factor analysis in instrument development across the continuum of medical education had not been previously identified. Therefore, the purpose of this study was a critical review of instrument development articles employing exploratory factor or principal component analysis published in medical education (2006-2010), to describe and assess the reporting of methods and validity evidence based on the Standards for Educational and Psychological Testing and factor analysis best practices. Data extraction from 64 articles, measuring a variety of constructs published throughout the peer-reviewed medical education literature, indicates significant errors in the translation of exploratory factor analysis best practices to current practice. Further, techniques for establishing validity evidence tend to derive from a limited scope of methods, including reliability statistics to support internal structure and support for test content. Instruments reviewed for this study lacked supporting evidence based on relationships with other variables and response process, and evidence based on consequences of testing was not evident. Findings suggest a need for further professional development within the medical education researcher community related to (1) appropriate factor analysis methodology and reporting, and (2) the importance of pursuing multiple sources of reliability and validity evidence to construct a well-supported argument for the inferences made from the instrument. Medical education researchers and educators should be cautious in adopting instruments from the literature and carefully review available evidence. Finally, editors and reviewers are encouraged to recognize

  11. Translating pharmacodynamic biomarkers from bench to bedside: analytical validation and fit-for-purpose studies to qualify multiplex immunofluorescent assays for use on clinical core biopsy specimens.

    PubMed

    Marrero, Allison; Lawrence, Scott; Wilsker, Deborah; Voth, Andrea Regier; Kinders, Robert J

    2016-08-01

    Multiplex pharmacodynamic (PD) assays have the potential to increase sensitivity of biomarker-based reporting for new targeted agents, as well as revealing significantly more information about target and pathway activation than single-biomarker PD assays. Stringent methodology is required to ensure reliable and reproducible results. Common to all PD assays is the importance of reagent validation, assay and instrument calibration, and the determination of suitable response calibrators; however, multiplex assays, particularly those performed on paraffin specimens from tissue blocks, bring format-specific challenges adding a layer of complexity to assay development. We discuss existing multiplex approaches and the development of a multiplex immunofluorescence assay measuring DNA damage and DNA repair enzymes in response to anti-cancer therapeutics and describe how our novel method addresses known issues. PMID:27663477

  12. Validity of a Simulation Game as a Method for History Teaching

    ERIC Educational Resources Information Center

    Corbeil, Pierre; Laveault, Dany

    2011-01-01

    The aim of this research is, first, to determine the validity of a simulation game as a method of teaching and an instrument for the development of reasoning and, second, to study the relationship between learning and students' behavior toward games. The participants were college students in a History of International Relations course, with two…

  13. An Empirically Based Method of Q-Matrix Validation for the DINA Model: Development and Applications

    ERIC Educational Resources Information Center

    de la Torre, Jimmy

    2008-01-01

    Most model fit analyses in cognitive diagnosis assume that a Q matrix is correct after it has been constructed, without verifying its appropriateness. Consequently, any model misfit attributable to the Q matrix cannot be addressed and remedied. To address this concern, this paper proposes an empirically based method of validating a Q matrix used…

  14. A validation framework for microbial forensic methods based on statistical pattern recognition

    SciTech Connect

    Velsko, S P

    2007-11-12

    This report discusses a general approach to validating microbial forensic methods that attempt to simultaneously distinguish among many hypotheses concerning the manufacture of a questioned biological agent sample. It focuses on the concrete example of determining growth medium from chemical or molecular properties of a bacterial agent to illustrate the concepts involved.

  15. Assessment of management in general practice: validation of a practice visit method.

    PubMed Central

    van den Hombergh, P; Grol, R; van den Hoogen, H J; van den Bosch, W J

    1998-01-01

    BACKGROUND: Practice management (PM) in general practice is as yet ill-defined; a systematic description of its domain, as well as a valid method to assess it, are necessary for research and assessment. AIM: To develop and validate a method to assess PM of general practitioners (GPs) and practices. METHOD: Relevant and potentially discriminating indicators were selected from a systematic framework of 2410 elements of PM to be used in an assessment method (VIP = visit instrument PM). The method was first tested in a pilot study and, after revision, was evaluated in order to select discriminating indicators and to determine validity of dimensions (factor and reliability analysis, linear regression). RESULTS: One hundred and ten GPs were assessed with the practice visit method using 249 indicators; 208 of these discriminated sufficiently at practice level or at GP level. Factor analysis resulted in 34 dimensions and in a taxonomy of PM. Dimensions and indicators showed marked variation between GPs and practices. Training practices scored higher on five dimensions; single-handed and dispensing practices scored lower on delegated tasks, but higher on accessibility and availability. CONCLUSION: A visit method to assess PM has been developed and its validity studied systematically. The taxonomy and dimensions of PM were in line with other classifications. Selection of a balanced number of useful and relevant indicators was nevertheless difficult. The dimensions could discriminate between groups of GPs and practices, establishing the value of the method for assessment. The VIP method could be an important contribution to the introduction of continuous quality improvement in the profession. PMID:10198481

  16. Validation of three new methods for determination of metal emissions using a modified Environmental Protection Agency Method 301.

    PubMed

    Yanca, Catherine A; Barth, Douglas C; Petterson, Krag A; Nakanishi, Michael P; Cooper, John A; Johnsen, Bruce E; Lambert, Richard H; Bivins, Daniel G

    2006-12-01

    Three new methods applicable to the determination of hazardous metal concentrations in stationary source emissions were developed and evaluated for use in U.S. Environmental Protection Agency (EPA) compliance applications. Two of the three independent methods, a continuous emissions monitor-based method (Xact) and an X-ray-based filter method (XFM), are used to measure metal emissions. The third method involves a quantitative aerosol generator (QAG), which produces a reference aerosol used to evaluate the measurement methods. A modification of EPA Method 301 was used to validate the three methods for As, Cd, Cr, Pb, and Hg, representing three hazardous waste combustor Maximum Achievable Control Technology (MACT) metal categories (low volatile, semivolatile, and volatile). The modified procedure tested the methods using more stringent criteria than EPA Method 301; these criteria included accuracy, precision, and linearity. The aerosol generation method was evaluated in the laboratory by comparing actual with theoretical aerosol concentrations. The measurement methods were evaluated at a hazardous waste combustor (HWC) by comparing measured with reference aerosol concentrations. The QAG, Xact, and XFM met the modified Method 301 validation criteria. All three of the methods demonstrated precisions and accuracies on the order of 5%. In addition, correlation coefficients for each method were on the order of 0.99, confirming the methods' linear response and high precision over a wide range of concentrations. The measurement methods should be applicable to emissions from a wide range of sources, and the reference aerosol generator should be applicable to additional analytes. EPA recently approved an alternative monitoring petition for an HWC at Eli Lilly's Tippecanoe site in Lafayette, IN, in which the Xact is used for demonstrating compliance with the HWC MACT metal emissions (low volatile, semivolatile, and volatile). The QAG reference aerosol generator was approved as

  17. Low-cost extrapolation method for maximal LTE radio base station exposure estimation: test and validation.

    PubMed

    Verloock, Leen; Joseph, Wout; Gati, Azeddine; Varsier, Nadège; Flach, Björn; Wiart, Joe; Martens, Luc

    2013-06-01

    An experimental validation of a low-cost method for extrapolation and estimation of the maximal electromagnetic-field exposure from long-term evolution (LTE) radio base station installations is presented. No knowledge of downlink band occupation or service characteristics is required for the low-cost method, which is applicable in situ. It only requires a basic spectrum analyser with appropriate field probes, without the need for expensive dedicated LTE decoders. The method is validated both in the laboratory and in situ, for a single-input single-output antenna LTE system and a 2×2 multiple-input multiple-output system, with low deviations in comparison with signals measured using dedicated LTE decoders. PMID:23179190

  19. Defining the knee joint flexion-extension axis for purposes of quantitative gait analysis: an evaluation of methods.

    PubMed

    Schache, Anthony G; Baker, Richard; Lamoreux, Larry W

    2006-08-01

    Minimising measurement variability associated with hip axial rotation and avoiding knee joint angle cross-talk are two fundamental objectives of any method used to define the knee joint flexion-extension axis for purposes of quantitative gait analysis. The aim of this experiment was to compare three different methods of defining this axis: the knee alignment device (KAD) method, a method based on the transepicondylar axis (TEA) and an alternative numerical method (Dynamic). The former two methods are common approaches that have been applied clinically in many quantitative gait analysis laboratories; the latter is an optimisation procedure. A cohort of 20 subjects performed three different functional tasks (normal gait; squat; non-weight bearing knee flexion) on repeated occasions. Three-dimensional hip and knee angles were computed using the three alternative methods of defining the knee joint flexion-extension axis. The repeatability of hip axial rotation measurements during normal gait was found to be significantly better for the Dynamic method (p<0.01). Furthermore, both the variance in the knee varus-valgus kinematic profile and the degree of knee joint angle cross-talk were smallest for the Dynamic method across all functional tasks. The Dynamic method therefore provided superior results in comparison to the KAD and TEA-based methods and thus represents an attractive solution for orientating the knee joint flexion-extension axis for purposes of quantitative gait analysis.

  20. A multiscale finite element model validation method of composite cable-stayed bridge based on Probability Box theory

    NASA Astrophysics Data System (ADS)

    Zhong, Rumian; Zong, Zhouhong; Niu, Jie; Liu, Qiqi; Zheng, Peijuan

    2016-05-01

    Modeling and simulation are routinely implemented to predict the behavior of complex structures. These tools powerfully unite theoretical foundations, numerical models and experimental data, together with their associated uncertainties and errors. A new methodology for multi-scale finite element (FE) model validation is proposed in this paper. The method is based on a two-step updating procedure, a novel approach to obtain coupling parameters in the gluing sub-regions of a multi-scale FE model, and on Probability Box (P-box) theory, which can provide lower and upper bounds for the purpose of quantifying and transmitting the uncertainty of structural parameters. The structural health monitoring data of Guanhe Bridge, a long-span composite cable-stayed bridge, and Monte Carlo simulation were used to verify the proposed method. The results show satisfactory accuracy: the overlap ratio index of each modal frequency is over 89%, with low average absolute values of the relative errors, and the CDF of the normal distribution coincides well with the measured frequencies of Guanhe Bridge. The validated multiscale FE model may be further used in structural damage prognosis and safety prognosis.
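
    The overlap ratio index is not defined in this abstract; under one plausible reading (the overlap of a predicted frequency interval with a measured one, relative to the narrower of the two), it could be computed as in the sketch below, with invented bounds.

    ```python
    def overlap_ratio(pred_lo, pred_hi, meas_lo, meas_hi):
        """Overlap of two intervals as a fraction of the narrower interval
        (one plausible reading of the paper's overlap ratio index)."""
        overlap = max(0.0, min(pred_hi, meas_hi) - max(pred_lo, meas_lo))
        return overlap / min(pred_hi - pred_lo, meas_hi - meas_lo)

    # Hypothetical P-box bounds for one modal frequency (Hz) vs. a measured interval
    print(f"{overlap_ratio(0.512, 0.548, 0.518, 0.550):.0%}")   # ~94%
    ```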

  1. Validity and reliability of an alternative method for measuring power output during six-second all-out cycling.

    PubMed

    Watson, Martin; Bibbo, Daniele; Duffy, Charles R; Riches, Philip E; Conforto, Silvia; Macaluso, Andrea

    2014-08-01

    In a laboratory setting where both a mechanically-braked cycling ergometer and a motion analysis (MA) system are available, flywheel angular displacement can be estimated by using MA. The purpose of this investigation was to assess the validity and reliability of an MA method for measuring maximal power output (Pmax) in comparison with a force transducer (FT) method. Eight males and eight females undertook three identical sessions, separated by 4 to 6 days, the first being a familiarization session. Individuals performed three 6-second sprints against 50% of the maximal resistance to complete two pedal revolutions, with a 3-minute rest between trials. Power was determined independently using both MA and FT analyses. Validity: MA recorded significantly higher Pmax than FT (P < .05). Bland-Altman plots showed a systematic bias in the difference between the measures of the two systems, and this difference increased as power increased. Repeatability: intraclass correlation coefficients were on average 0.90 ± 0.05 in males and 0.85 ± 0.08 in females. Measuring Pmax by MA, therefore, is as appropriate for use in exercise physiology research as Pmax measured by FT, provided that the bias between these measurement methods is allowed for. PMID:24977624
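
    The Bland-Altman analysis and the proportional-bias check described above reduce to a few lines; the power values below are invented for illustration.

    ```python
    import numpy as np

    # Hypothetical Pmax values (W) from the two measurement systems
    pmax_ma = np.array([812, 950, 1010, 760, 890, 1105])   # motion analysis
    pmax_ft = np.array([790, 921, 975, 748, 861, 1060])    # force transducer

    diff = pmax_ma - pmax_ft
    bias = diff.mean()
    loa = 1.96 * diff.std(ddof=1)          # limits of agreement around the bias
    print(f"bias = {bias:.1f} W, limits of agreement = ±{loa:.1f} W")

    # A proportional bias (difference growing with power) shows up as a non-zero
    # slope when regressing the differences on the pairwise means
    means = (pmax_ma + pmax_ft) / 2
    slope, intercept = np.polyfit(means, diff, 1)
    print(f"proportional bias slope = {slope:.3f} W per W")
    ```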

  2. From the Bronx to Bengifunda (and Other Lines of Flight): Deterritorializing Purposes and Methods in Science Education Research

    ERIC Educational Resources Information Center

    Gough, Noel

    2011-01-01

    In this essay I explore a number of questions about purposes and methods in science education research prompted by my reading of Wesley Pitts' ethnographic study of interactions among four students and their teacher in a chemistry classroom in the Bronx, New York City. I commence three "lines of flight" (small acts of Deleuzo-Guattarian…

  3. General-purpose optimization methods for parallelization of digital terrain analysis based on cellular automata

    NASA Astrophysics Data System (ADS)

    Cheng, Guo; Liu, Lu; Jing, Ning; Chen, Luo; Xiong, Wei

    2012-08-01

    Solving traditional spatial analysis problems benefits from high performance geo-computation powered by parallel computing. Digital Terrain Analysis (DTA) is a typical example of data and computationally intensive spatial analysis problems and can be improved by parallelization technologies. Previous work on this topic has mainly focused on applying optimization schemes for specific DTA case studies. The task addressed in this paper, in contrast, is to find optimization methods that are generally applicable to the parallelization of DTA. By modeling a complex DTA problem with Cellular Automata (CA), we developed a temporal model that can describe the time cost of the solution. Three methods for optimizing different components in the temporal model are proposed: (1) a parallel loading/writing method that can improve the IO efficiency; (2) a best cell division method that can minimize the communication time among processes; and (3) a communication evolution overlapping method that can reduce the total time of evolutions and communications. The feasibilities and practical efficiencies of the proposed methods have been verified by comparative experiments conducted on an elevation dataset from North America using the Slope of Aspect (SOA) as an example of a general DTA problem. The results showed that the parallel performance of the SOA can be improved by applying the proposed methods individually or in an integrated fashion.
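
    The 'best cell division' optimization amounts to choosing a process grid that minimises the boundary cells exchanged between neighbours. The sketch below searches divisor pairs for the split with the smallest interior boundary; this cost model is a simplification of ours, not the paper's full temporal model.

    ```python
    def best_division(rows, cols, procs):
        """Pick the px x py process grid minimising total interior boundary length."""
        best = None
        for px in range(1, procs + 1):          # blocks along the row dimension
            if procs % px:
                continue
            py = procs // px                    # blocks along the column dimension
            boundary = (px - 1) * cols + (py - 1) * rows   # cells on interior cuts
            if best is None or boundary < best[0]:
                best = (boundary, px, py)
        return best[1], best[2]

    # A 1000 x 4000 cell grid on 16 processes: cut the long axis more often
    print(best_division(1000, 4000, 16))        # -> (2, 8)
    ```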

  4. 27 CFR 46.261 - Purpose of an alternate method or procedure.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... TOBACCO PRODUCTS AND CIGARETTE PAPERS AND TUBES Floor Stocks Tax on Certain Tobacco Products, Cigarette Papers, and Cigarette Tubes Held for Sale on April 1, 2009 Alternate Methods Or Procedures §...

  5. 27 CFR 46.261 - Purpose of an alternate method or procedure.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... TOBACCO PRODUCTS AND CIGARETTE PAPERS AND TUBES Floor Stocks Tax on Certain Tobacco Products, Cigarette Papers, and Cigarette Tubes Held for Sale on April 1, 2009 Alternate Methods Or Procedures §...

  6. 27 CFR 46.261 - Purpose of an alternate method or procedure.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... TOBACCO PRODUCTS AND CIGARETTE PAPERS AND TUBES Floor Stocks Tax on Certain Tobacco Products, Cigarette Papers, and Cigarette Tubes Held for Sale on April 1, 2009 Alternate Methods Or Procedures §...

  7. 27 CFR 46.261 - Purpose of an alternate method or procedure.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... TOBACCO PRODUCTS AND CIGARETTE PAPERS AND TUBES Floor Stocks Tax on Certain Tobacco Products, Cigarette Papers, and Cigarette Tubes Held for Sale on April 1, 2009 Alternate Methods Or Procedures §...

  8. 27 CFR 46.261 - Purpose of an alternate method or procedure.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... TOBACCO PRODUCTS AND CIGARETTE PAPERS AND TUBES Floor Stocks Tax on Certain Tobacco Products, Cigarette Papers, and Cigarette Tubes Held for Sale on April 1, 2009 Alternate Methods Or Procedures §...

  9. E-Flux2 and SPOT: Validated Methods for Inferring Intracellular Metabolic Flux Distributions from Transcriptomic Data

    PubMed Central

    Kim, Min Kyung; Lane, Anatoliy; Kelley, James J.; Lun, Desmond S.

    2016-01-01

    Background Several methods have been developed to predict system-wide and condition-specific intracellular metabolic fluxes by integrating transcriptomic data with genome-scale metabolic models. While powerful in many settings, existing methods have several shortcomings, and it is unclear which method has the best accuracy in general because of limited validation against experimentally measured intracellular fluxes. Results We present a general optimization strategy for inferring intracellular metabolic flux distributions from transcriptomic data coupled with genome-scale metabolic reconstructions. It consists of two different template models called DC (determined carbon source model) and AC (all possible carbon sources model) and two different new methods called E-Flux2 (E-Flux method combined with minimization of l2 norm) and SPOT (Simplified Pearson cOrrelation with Transcriptomic data), which can be chosen and combined depending on the availability of knowledge on carbon source or objective function. This enables us to simulate a broad range of experimental conditions. We examined E. coli and S. cerevisiae as representative prokaryotic and eukaryotic microorganisms, respectively. The predictive accuracy of our algorithm was validated by calculating the uncentered Pearson correlation between predicted fluxes and measured fluxes. To this end, we compiled 20 experimental conditions (11 in E. coli and 9 in S. cerevisiae) of transcriptome measurements coupled with corresponding central carbon metabolism intracellular flux measurements determined by 13C metabolic flux analysis (13C-MFA), which is the largest dataset assembled to date for the purpose of validating inference methods for predicting intracellular fluxes. In both organisms, our method achieves an average correlation coefficient ranging from 0.59 to 0.87, outperforming a representative sample of competing methods. Easy-to-use implementations of E-Flux2 and SPOT are available as part of the open
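
    The validation metric named above is simple to state in code. The sketch below computes the uncentered Pearson correlation between a predicted and a measured flux vector; the flux values are hypothetical placeholders.

    ```python
    # Uncentered Pearson correlation: a cosine-like similarity that does not
    # subtract the means, as used to score predicted vs 13C-MFA fluxes.
    import numpy as np

    def uncentered_pearson(v_pred, v_meas):
        v_pred = np.asarray(v_pred, float)
        v_meas = np.asarray(v_meas, float)
        return v_pred @ v_meas / (np.linalg.norm(v_pred) * np.linalg.norm(v_meas))

    print(uncentered_pearson([1.0, 0.4, 2.1], [0.9, 0.5, 2.3]))  # close to 1.0
    ```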

  10. Cross-validation of component models: a critical look at current methods.

    PubMed

    Bro, R; Kjeldahl, K; Smilde, A K; Kiers, H A L

    2008-03-01

    In regression, cross-validation is an effective and popular approach that is used to decide, for example, the number of underlying features, and to estimate the average prediction error. The basic principle of cross-validation is to leave out part of the data, build a model, and then predict the left-out samples. While such an approach can also be envisioned for component models such as principal component analysis (PCA), most current implementations do not comply with the essential requirement that the predictions should be independent of the entity being predicted. Further, these methods have not been properly reviewed in the literature. In this paper, we review the most commonly used generic PCA cross-validation schemes and assess how well they work in various scenarios.
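
    As a concrete illustration of the requirement discussed above, the sketch below implements one element-wise PCA cross-validation scheme in which left-out entries are treated as missing and re-predicted by iterative SVD imputation, so that no entry directly feeds its own prediction. This is a generic sketch, not the code of the paper.

    ```python
    # Element-wise PCA cross-validation via EM-style SVD imputation (sketch).
    import numpy as np

    def press_pca(X, n_comp, frac=0.1, n_iter=100, seed=0):
        rng = np.random.default_rng(seed)
        mask = rng.random(X.shape) < frac                   # entries to leave out
        Xw = np.where(mask, np.nan, X)
        filled = np.where(mask, np.nanmean(Xw, axis=0), X)  # init: column means
        for _ in range(n_iter):                             # iterative imputation
            mu = filled.mean(axis=0)
            U, s, Vt = np.linalg.svd(filled - mu, full_matrices=False)
            recon = (U[:, :n_comp] * s[:n_comp]) @ Vt[:n_comp] + mu
            filled = np.where(mask, recon, X)               # update left-out cells only
        return np.sum((X[mask] - filled[mask]) ** 2)        # PRESS for n_comp

    X = np.random.default_rng(1).normal(size=(30, 8))
    print([round(press_pca(X, k), 2) for k in (1, 2, 3)])   # pick k minimizing PRESS
    ```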

  11. Patient Monitoring in the Operating Room: Validation of Instrument Readings by Artificial Intelligence Methods

    PubMed Central

    Garfinkel, D.; Matsiras, P.V.; Mavrides, T.; McAdams, J.; Aukburg, S.J.

    1989-01-01

    Physiological monitoring in the operating room is needed to follow the patient's state of ventilation, circulation, etc. Parameters such as heart rate, blood pressure, and respiratory gas content are observed with devices of uncertain reliability. These provide speedy information in the form of cautions and alarms, which may indicate that corrective action is needed. In practice the large number of alarms when there is no hazard to the patient (false alarms) is a considerable problem. We describe a method of comparing and validating instrument readings in this situation involving a knowledge base whose core is a set of 36 rules. This was applied to 7803 warnings (6287 cautions and 1516 alarms) from 68 day surgery patients undergoing 115 hours of surgery. Most of the cautions were validated by our analysis, but 734 of the 1516 alarms were invalidated while 419 were validated and 363 left indeterminate. This translates to a potential reduction from one alarm every 4 minutes to one every 16 minutes.

  12. Co-validation of three methods for optical characterization of point-focus concentrators

    SciTech Connect

    Wendelin, T.J.; Grossman, J.W.

    1994-10-01

    Three different methods for characterizing point-focus solar concentrator optical performance have been developed for specific applications. These methods include a laser ray trace technique called the Scanning Hartmann Optical Test, a video imaging process called the 2f Technique and actual on-sun testing in conjunction with optical computer modeling. Three concentrator test articles, each of a different design, were characterized using at least two of the methods and, in one case, all three. The results of these tests are compared in order to validate the methods. Excellent agreement is observed in the results, suggesting that the techniques provide consistent and accurate characterizations of solar concentrator optics.

  13. Validating 3D Seismic Velocity Models Using the Spectral Element Method

    NASA Astrophysics Data System (ADS)

    Maceira, M.; Rowe, C. A.; Allen, R. M.; Obrebski, M. J.

    2010-12-01

    As seismic instrumentation, data storage and dissemination, and computational power improve, seismic velocity models attempt to resolve smaller structures and cover larger areas. However, it is unclear how accurate these velocity models are, and while the best available models are used for event determination, it is difficult to put uncertainties on seismic event parameters. Model validation is typically done using resolution tests that assume the imaging theory used is accurate and thus consider only the impact of data coverage on resolution. We present the results of a more rigorous approach to model validation via full three-dimensional waveform propagation using the Spectral Element Method (SEM). This approach makes no assumptions about the theory used to generate the models but requires substantial computational resources. We first validate 3D tomographic models for the western USA generated using both ray-theoretical and finite-frequency methods. The Dynamic North America (DNA) models of P- and S-velocity structure (DNA09-P and DNA09-S) use teleseismic body-wave traveltime residuals recorded at over 800 seismic stations provided by the EarthScope USArray and regional seismic networks. We performed systematic computations of synthetics for the dataset used to generate the DNA models. Direct comparison of these synthetic seismograms to the actual observations allows us to accurately assess and validate the models. Implementation of the method for a densely instrumented region such as that covered by the DNA model provides a useful testbed for the validation methods that we will subsequently apply to other, more challenging study areas.

  14. Blood Density Is Nearly Equal to Water Density: A Validation Study of the Gravimetric Method of Measuring Intraoperative Blood Loss.

    PubMed

    Vitello, Dominic J; Ripper, Richard M; Fettiplace, Michael R; Weinberg, Guy L; Vitello, Joseph M

    2015-01-01

    Purpose. The gravimetric method of weighing surgical sponges is used to quantify intraoperative blood loss. The wet mass minus the dry mass of the gauze equals the volume of blood lost. This method assumes that the density of blood is equivalent to that of water (1 g/mL). This study's purpose was to validate the assumption that the density of blood is equivalent to water and to correlate density with hematocrit. Methods. 50 µL of whole blood was weighed from each of eighteen rats. A distilled water control was weighed for each blood sample. The averages of the blood and water masses were compared using a Student's unpaired, one-tailed t-test. The masses of the blood samples and the hematocrits were compared using linear regression. Results. The average mass of the eighteen blood samples was 0.0489 g and that of the distilled water controls was 0.0492 g. The t-test gave P = 0.2269 and R(2) = 0.03154. The hematocrit values ranged from 24% to 48%. The linear regression R(2) value was 0.1767. Conclusions. The t-test indicates no statistically significant difference between the masses of the blood samples and the distilled water controls. Linear regression showed the hematocrit was not proportional to the mass of the blood. The study confirmed that the measured density of blood is similar to that of water.
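
    A minimal sketch of the gravimetric calculation being validated above, assuming the paper's conclusion that blood density is approximately 1 g/mL; the sponge masses are hypothetical.

    ```python
    # Gravimetric blood-loss estimate: mass gained by the sponge / blood density.
    BLOOD_DENSITY_G_PER_ML = 1.0  # validated approximation: blood ~ water

    def estimated_blood_loss_ml(dry_sponge_g, wet_sponge_g):
        """Blood volume absorbed by one sponge, in mL."""
        return (wet_sponge_g - dry_sponge_g) / BLOOD_DENSITY_G_PER_ML

    print(estimated_blood_loss_ml(12.5, 36.0))  # 23.5 mL for this sponge
    ```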

  15. Optimization and validation of a new CE method for the determination of pantoprazole enantiomers.

    PubMed

    Guan, Jin; Yan, Feng; Shi, Shuang; Wang, Silin

    2012-06-01

    A new CE method using sulfobutylether-beta-cyclodextrin (SBE-beta-CD) as chiral additive was developed and validated for the determination of pantoprazole enantiomers. The primary factors affecting separation efficiency, including chiral selector, buffer pH, organic additive, and applied voltage, were optimized. The best results were obtained using a buffer consisting of 50 mM borax-150 mM phosphate adjusted to pH 6.5, 20 mg/mL SBE-beta-CD, and a 10 kV applied voltage. The optimized method was validated for linearity, precision, and accuracy, and proved to be robust. The LOD and LOQ for R-(+)-pantoprazole were 0.9 and 2.5 μg/mL, respectively. The method is capable of determining a minimum limit of 0.1% (w/w) of R-enantiomer in S-(-)-pantoprazole bulk samples. PMID:22736366

  16. Validated UV-spectrophotometric method for the evaluation of the efficacy of makeup remover.

    PubMed

    Charoennit, P; Lourith, N

    2012-04-01

    A UV-spectrophotometric method for the analysis of makeup remover was developed and validated according to ICH guidelines. Three makeup removers, whose main ingredients consisted of vegetable oil (A), mineral oil and silicone (B), and mineral oil and water (C), were sampled in this study. Ethanol was the optimal solvent because it did not interfere with the maximum absorbance of the liquid foundation at 250 nm. The linearity was determined over a range of makeup concentrations from 0.540 to 1.412 mg mL⁻¹ (R² = 0.9977). The accuracy of this method was determined by analysing low, intermediate and high concentrations of the liquid foundation, giving recoveries of 78.59-91.57% with a relative standard deviation of <2% (0.56-1.45%). This result demonstrates the validity and reliability of this method. The reproducibilities were 97.32 ± 1.79%, 88.34 ± 2.69% and 95.63 ± 2.94% for preparations A, B and C, respectively, which are within the acceptable limits set forth by the ASEAN analytical validation guidelines; this ensures the precision of the method under the same operating conditions over a short time interval as well as the inter-assay precision within the laboratory. The proposed method is therefore a simple, rapid, accurate, precise and inexpensive technique for the routine analysis of makeup remover efficacy.

  17. Determination of vitamin C in foods: current state of method validation.

    PubMed

    Spínola, Vítor; Llorent-Martínez, Eulogio J; Castilho, Paula C

    2014-11-21

    Vitamin C is one of the most important vitamins, so reliable information about its content in foodstuffs is a concern to both consumers and quality control agencies. However, the heterogeneity of food matrices and the potential degradation of this vitamin during analysis create enormous challenges. This review addresses the development and validation of high-performance liquid chromatography methods for vitamin C analysis in food commodities during the period 2000-2014. The main characteristics of vitamin C are mentioned, along with the strategies adopted by most authors during sample preparation (freezing and acidification) to avoid vitamin oxidation. After that, the advantages and drawbacks of different analytical methods are discussed. Finally, the main aspects concerning method validation for vitamin C analysis are critically discussed. Parameters such as selectivity, linearity, limit of quantification, and accuracy were studied by most authors. Recovery experiments during accuracy evaluation were in general satisfactory, with usual values between 81 and 109%. However, few methods considered vitamin C stability during the analytical process, and the study of precision was not always clear or complete. Potential future improvements regarding proper method validation are indicated to conclude this review.

  18. Development and validation of a molecular size distribution method for polysaccharide vaccines.

    PubMed

    Clément, G; Dierick, J-F; Lenfant, C; Giffroy, D

    2014-01-01

    Determination of the molecular size distribution of vaccine products by high performance size exclusion chromatography coupled to refractive index detection is important during the manufacturing process. Partial elution of high molecular weight compounds in the void volume of the chromatographic column is responsible for variation in the results obtained with a reference method using a TSK G5000PWXL chromatographic column. GlaxoSmithKline Vaccines has developed an alternative method relying on the selection of a different chromatographic column with a wider separation range and the generation of a dextran calibration curve to determine the optimal molecular weight cut-off values for all tested products. Validation of this method was performed according to The International Conference on Harmonisation of Technical Requirements for Registration of Pharmaceuticals for Human Use (ICH). The new method detected product degradation with the same sensitivity as that observed for the reference method. All validation parameters were within the pre-specified range. Precision (relative standard deviation (RSD) of mean values) was < 5 per cent (intra-assay) and < 10 per cent (inter-assay). Sample recovery was > 70 per cent for all polysaccharide conjugates and for the Haemophilus influenzae type B final container vaccine. All results obtained for robustness met the acceptance criteria defined in the validation protocol (≤ 2 times (RSD) or ≤ 2 per cent difference between the modified and the reference parameter value if RSD = 0 per cent). The new method was shown to be a suitable quality control method for the release and stability follow-up of polysaccharide-containing vaccines. The new method gave comparable results to the reference method, but with less intra- and inter-assay variability. PMID:25655242

  19. Interlaboratory Validation of the Leaching Environmental Assessment Framework (LEAF) Method 1314 and Method 1315

    EPA Science Inventory

    This report summarizes the results of an interlaboratory study conducted to generate precision estimates for two leaching methods under review by the U.S. EPA’s OSWER for inclusion into the EPA’s SW-846: Method 1314: Liquid-Solid Partitioning as a Function of Liquid...

  20. Determination of proline in honey: comparison between official methods, optimization and validation of the analytical methodology.

    PubMed

    Truzzi, Cristina; Annibaldi, Anna; Illuminati, Silvia; Finale, Carolina; Scarponi, Giuseppe

    2014-05-01

    The study compares official spectrophotometric methods for the determination of proline content in honey - those of the International Honey Commission (IHC) and the Association of Official Analytical Chemists (AOAC) - with the original Ough method. Results show that the extra time-consuming treatment stages added by the IHC method with respect to the Ough method are unnecessary. We demonstrate that the AOAC method proves to be the best in terms of accuracy and time saving. The optimized waiting time for recording the absorbance is set at 35 min from the removal of the reaction tubes from the boiling bath used in the sample treatment. The optimized method was validated in the matrix: linearity up to 1800 mg L(-1), limit of detection 20 mg L(-1), limit of quantification 61 mg L(-1). The method was applied to 43 unifloral honey samples from the Marche region, Italy.

  1. Estimating Rooftop Suitability for PV: A Review of Methods, Patents, and Validation Techniques

    SciTech Connect

    Melius, J.; Margolis, R.; Ong, S.

    2013-12-01

    A number of methods have been developed using remote sensing data to estimate rooftop area suitable for the installation of photovoltaics (PV) at various geospatial resolutions. This report reviews the literature and patents on methods for estimating rooftop area appropriate for PV, including constant-value methods, manual selection methods, and GIS-based methods. This report also presents NREL's proposed method for estimating suitable rooftop area for PV using Light Detection and Ranging (LiDAR) data in conjunction with a GIS model to predict areas with appropriate slope, orientation, and sunlight. NREL's method is validated against solar installation data from New Jersey, Colorado, and California to compare modeled results to actual on-the-ground measurements.
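
    A hedged sketch of the GIS-style screening step described above: derive slope and facing direction from an elevation grid and flag cells meeting the criteria. The thresholds, grid convention (rows increasing southward), and toy roof are illustrative assumptions, not NREL's model.

    ```python
    # Slope/orientation screening of an elevation grid for PV suitability (sketch).
    import numpy as np

    def pv_suitable(dem, cell_m=1.0, max_slope_deg=37.0, flat_deg=10.0):
        dzdrow, dzdcol = np.gradient(dem, cell_m)   # rows assumed to increase southward
        slope = np.degrees(np.arctan(np.hypot(dzdcol, dzdrow)))
        south_facing = dzdrow < 0                   # surface drops toward the south
        # Accept south-facing pitched cells, or near-flat cells of any orientation.
        return (slope <= max_slope_deg) & (south_facing | (slope < flat_deg))

    dem = np.outer(np.linspace(5.0, 0.0, 20), np.ones(20))  # toy south-sloping roof
    print(pv_suitable(dem).mean())  # fraction of suitable cells (1.0 here)
    ```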

  2. Determination of proline in honey: comparison between official methods, optimization and validation of the analytical methodology.

    PubMed

    Truzzi, Cristina; Annibaldi, Anna; Illuminati, Silvia; Finale, Carolina; Scarponi, Giuseppe

    2014-05-01

    The study compares official spectrophotometric methods for the determination of proline content in honey - those of the International Honey Commission (IHC) and the Association of Official Analytical Chemists (AOAC) - with the original Ough method. Results show that the extra time-consuming treatment stages added by the IHC method with respect to the Ough method are unnecessary. We demonstrate that the AOAC method proves to be the best in terms of accuracy and time saving. The optimized waiting time for recording the absorbance is set at 35 min from the removal of the reaction tubes from the boiling bath used in the sample treatment. The optimized method was validated in the matrix: linearity up to 1800 mg L(-1), limit of detection 20 mg L(-1), limit of quantification 61 mg L(-1). The method was applied to 43 unifloral honey samples from the Marche region, Italy. PMID:24360478

  3. Tocopherol and tocotrienol analysis in raw and cooked vegetables: a validated method with emphasis on sample preparation.

    PubMed

    Knecht, Katharina; Sandfuchs, Katja; Kulling, Sabine E; Bunzel, Diana

    2015-02-15

    Vegetables can be important dietary sources of vitamin E. However, data on vitamin E in raw and cooked vegetables are in part conflicting, indicating analytical pitfalls. The purpose of the study was to develop and validate an HPLC-FLD method for tocochromanol (tocopherols and tocotrienols) analysis equally suitable for raw and cooked vegetables. Significant instability of tocochromanols was observed in raw broccoli and carrot homogenates. Tocochromanols could be stabilized by freeze-drying or ascorbic acid addition prior to homogenization. The optimized protocol for tocochromanol analysis included knife and ball milling of freeze-dried vegetable pieces. Direct acetone extraction of vegetable powders allowed for satisfactory recoveries and precisions. A significant decrease of tocochromanols in baked compared to raw vegetables was shown, the extent of which varied largely between vegetables. For some raw vegetables, such as spinach or broccoli, underestimation of vitamin E in nutrient databases cannot be ruled out and should be examined.

  4. Tocopherol and tocotrienol analysis in raw and cooked vegetables: a validated method with emphasis on sample preparation.

    PubMed

    Knecht, Katharina; Sandfuchs, Katja; Kulling, Sabine E; Bunzel, Diana

    2015-02-15

    Vegetables can be important dietary sources of vitamin E. However, data on vitamin E in raw and cooked vegetables are in part conflicting, indicating analytical pitfalls. The purpose of the study was to develop and validate an HPLC-FLD method for tocochromanol (tocopherols and tocotrienols) analysis equally suitable for raw and cooked vegetables. Significant instability of tocochromanols was observed in raw broccoli and carrot homogenates. Tocochromanols could be stabilized by freeze-drying or ascorbic acid addition prior to homogenization. The optimized protocol for tocochromanol analysis included knife and ball milling of freeze-dried vegetable pieces. Direct acetone extraction of vegetable powders allowed for satisfactory recoveries and precisions. A significant decrease of tocochromanols in baked compared to raw vegetables was shown, the extent of which varied largely between vegetables. For some raw vegetables, such as spinach or broccoli, underestimation of vitamin E in nutrient databases cannot be ruled out and should be examined. PMID:25236193

  5. Verification and validation of the maximum entropy method of moment reconstruction of energy dependent neutron flux

    NASA Astrophysics Data System (ADS)

    Crawford, Douglas Spencer

    Verification and validation of neutron flux reconstruction based on the maximum entropy method are presented in this paper. The verification is carried out by comparing the neutron flux spectrum from the maximum entropy method with Monte Carlo N-Particle 5 version 1.40 (MCNP5) and Attila-7.1.0-beta (Attila). A spherical 100% 235U critical assembly is modeled as the test case to compare the three methods. The verification error range for the maximum entropy method is 15% to 23%, with MCNP5 taken as the comparison standard. The Attila relative error for the critical assembly is 20% to 35%. Validation is accomplished by comparison with a neutron flux spectrum back-calculated from foil activation measurements performed in the GODIVA experiment (GODIVA). The error range of the reconstructed flux compared to GODIVA is 0%-10%; the error range of the MCNP5 neutron flux spectrum compared to GODIVA is 0%-20%, and the Attila error range compared to GODIVA is 0%-35%. The maximum entropy method for reconstructing flux is shown to be fast and reliable compared to either Monte Carlo methods (MCNP5) or 30-energy-group deterministic methods (Attila), with respect to the GODIVA experiment.
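
    For context, a hedged statement of the reconstruction underlying the method above, in our notation: the spectrum maximizing the Shannon entropy subject to a finite set of measured moments takes an exponential form whose Lagrange multipliers are fitted to the moment constraints:

    ```latex
    \max_{\phi}\; -\int \phi(E)\,\ln \phi(E)\, dE
    \quad \text{s.t.} \quad \int f_j(E)\,\phi(E)\, dE = \mu_j,\; j = 0,\dots,m
    \;\;\Longrightarrow\;\;
    \phi(E) = \exp\!\Big(-\lambda_0 - \sum_{j=1}^{m} \lambda_j f_j(E)\Big)
    ```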

  6. [Examination of myocardial perfusion with positron emission tomography: a clinically useful and valid method?].

    PubMed

    vom Dahl, J

    1997-02-01

    Positron emission tomography (PET) of the heart has gained widespread scientific and clinical acceptance with regard to two indications: (1) the detection of perfusion abnormalities by qualitative and semiquantitative analyses of perfusion images at rest and during physical or pharmacological stress, using well-validated perfusion tracers such as N-13 ammonia, Rb-82 rubidium chloride, or O-15 labeled water; and (2) viability imaging of myocardial regions with reduced contractility by combining perfusion measurements with substrate metabolism as assessed from F-18 deoxyglucose utilization. This overview summarizes the use of PET as a perfusion imaging method. With a sensitivity > 90% in combination with a high specificity, PET is today the best available nuclear imaging technique for the diagnosis of coronary artery disease (CAD). The short half-life of the perfusion tracers, in combination with highly sophisticated hardware and software, enables rapid PET studies with high patient throughput. The high diagnostic accuracy and the methodological advantages compared with conventional scintigraphy allow PET perfusion imaging to be used to detect subtle changes in perfusion reserve, and thus CAD, in high-risk but asymptomatic patients as well as in patients with proven CAD undergoing various forms of treatment such as risk factor reduction or coronary revascularization. In patients following orthotopic heart transplantation, evolving transplant vasculopathy can be detected at an early stage. Quantitative PET imaging at rest allows for the detection of myocardial viability, since cellular survival depends on the maintenance of a minimal perfusion, and structural changes correlate with the degree of perfusion reduction. Furthermore, quantitative assessment of the myocardial perfusion reserve reveals the magnitude and competence of collaterals in regions with occluded epicardial arteries and thus allows imaging of several coronary distribution territories in one noninvasive study. The cost of

  7. Multiple methods, maps, and management applications: Purpose made seafloor maps in support of ocean management

    NASA Astrophysics Data System (ADS)

    Brown, Craig J.; Sameoto, Jessica A.; Smith, Stephen J.

    2012-08-01

    The establishment of multibeam echosounders (MBES) as a mainstream tool in ocean mapping has facilitated integrative approaches toward nautical charting, benthic habitat mapping, and seafloor geotechnical surveys. The inherent bathymetric and backscatter information generated by MBES enables marine scientists to present highly accurate bathymetric data with a spatial resolution closely matching that of terrestrial mapping. Furthermore, developments in data collection and processing of MBES backscatter, combined with the quality of the co-registered depth information, have resulted in the increasing preferential use of multibeam technology over conventional sidescan sonar for the production of benthic habitat maps. A range of post-processing approaches can generate customized map products to meet multiple ocean management needs, thus extracting maximum value from a single survey data set. Based on recent studies over German Bank off SW Nova Scotia, Canada, we show how primary MBES bathymetric and backscatter data, along with supplementary data (i.e. in situ video and stills), were processed using a variety of methods to generate a series of maps. Methods conventionally used for classification of multi-spectral data were tested for classification of the MBES data set to produce a map summarizing broad bio-physical characteristics of the seafloor (i.e. a benthoscape map), which is of value for use in many aspects of marine spatial planning. A species-specific habitat map for the sea scallop Placopecten magellanicus was also generated from the MBES data by applying a Species Distribution Modeling (SDM) method to spatially predict habitat suitability, which offers tremendous promise for use in fisheries management. In addition, we explore the challenges of incorporating benthic community data into maps based on species information derived from a large number of seafloor photographs. Through the process of applying multiple methods to generate multiple maps for

  8. Assembly for collecting samples for purposes of identification or analysis and method of use

    DOEpatents

    Thompson, Cyril V [Knoxville, TN; Smith, Rob R [Knoxville, TN

    2010-02-02

    An assembly and an associated method for collecting a sample of material desired to be characterized with diagnostic equipment includes or utilizes an elongated member having a proximal end with which the assembly is manipulated by a user and a distal end. In addition, a collection tip which is capable of being placed into contact with the material to be characterized is supported upon the distal end. The collection tip includes a body of chemically-inert porous material for binding a sample of material when the tip is placed into contact with the material and thereby holds the sample of material for subsequent introduction to the diagnostic equipment.

  9. Method for Pre-Conditioning a Measured Surface Height Map for Model Validation

    NASA Technical Reports Server (NTRS)

    Sidick, Erkin

    2012-01-01

    This software allows one to up-sample or down-sample a measured surface map for model validation, not only without introducing any re-sampling errors, but also while eliminating existing measurement noise and measurement errors. Because the re-sampling of a surface map is accomplished based on the analytical expressions of Zernike polynomials and a power spectral density model, such re-sampling does not introduce any aliasing and interpolation errors, as is done by the conventional interpolation and FFT-based (fast-Fourier-transform-based) spatial-filtering methods. This new method also automatically eliminates measurement noise and other measurement errors such as artificial discontinuity. The developmental cycle of an optical system, such as a space telescope, includes, but is not limited to, the following two steps: (1) deriving requirements or specs on the optical quality of individual optics before they are fabricated, through optical modeling and simulations, and (2) validating the optical model using the measured surface height maps after all optics are fabricated. There are a number of computational issues related to model validation, one of which is the "pre-conditioning" or pre-processing of the measured surface maps before using them in a model validation software tool. This software addresses the following issues: (1) up- or down-sampling a measured surface map to match it with the gridded data format of a model validation tool, and (2) eliminating the surface measurement noise or measurement errors such that the resulting surface height map is continuous or smoothly varying. So far, the preferred method used for re-sampling a surface map has been two-dimensional interpolation. The main problem of this method is that the same pixel can take different values when the method of interpolation is changed among different methods such as the "nearest," "linear," "cubic," and "spline" fitting in Matlab. The conventional, FFT-based spatial filtering method used to
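
    The sketch below illustrates the Zernike-fitting idea described above under strong simplifications: a small, hand-coded polynomial basis is fit to a synthetic noisy map by least squares and re-evaluated on a finer grid, yielding a smooth, noise-free re-sampled surface. This is not the NASA tool; the basis size and data are assumptions.

    ```python
    # Least-squares Zernike-style fit and analytic re-evaluation on a new grid.
    import numpy as np

    def zernike_basis(x, y):
        """Six low-order Zernike-like terms on the unit square/disk."""
        r2 = x**2 + y**2
        return np.stack([np.ones_like(x), x, y,        # piston, x/y tilt
                         2*r2 - 1,                      # defocus
                         x**2 - y**2, 2*x*y], axis=-1)  # astigmatism terms

    def resample(measured, n_out):
        n = measured.shape[0]
        x, y = np.meshgrid(np.linspace(-1, 1, n), np.linspace(-1, 1, n))
        A = zernike_basis(x, y).reshape(-1, 6)
        coef, *_ = np.linalg.lstsq(A, measured.ravel(), rcond=None)  # fit coefficients
        xo, yo = np.meshgrid(np.linspace(-1, 1, n_out), np.linspace(-1, 1, n_out))
        return zernike_basis(xo, yo) @ coef            # smooth analytic re-evaluation

    rng = np.random.default_rng(0)
    x, y = np.meshgrid(np.linspace(-1, 1, 64), np.linspace(-1, 1, 64))
    noisy = 0.5 * (2*(x**2 + y**2) - 1) + 0.01 * rng.normal(size=x.shape)
    print(resample(noisy, 128).shape)  # (128, 128): up-sampled, noise-free map
    ```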

  10. PAH detection in Quercus robur leaves and Pinus pinaster needles: A fast method for biomonitoring purpose.

    PubMed

    De Nicola, F; Concha Graña, E; Aboal, J R; Carballeira, A; Fernández, J Á; López Mahía, P; Prada Rodríguez, D; Muniategui Lorenzo, S

    2016-06-01

    Due to the complexity and heterogeneity of plant matrices, a new procedure must be standardized for each individual biomonitor. Here we describe a matrix solid-phase dispersion extraction method, previously used for moss samples, improved and modified for the analysis of PAHs in Quercus robur leaves and Pinus pinaster needles, species widely used in biomonitoring studies across Europe. The improvements compared with the previous procedure are the use of Florisil together with further clean-up sorbents - 10% deactivated silica for pine needles and PSA for oak leaves - since these matrices are rich in interfering compounds, as shown by the gas chromatography-mass spectrometry analyses acquired in full scan mode. Good trueness, with values in the range 90-120% for most compounds, high precision (intermediate precision between 2% and 12%) and good sensitivity using only 250 mg of sample (limits of quantification lower than 3 and 1.5 ng g(-1) for pine and oak, respectively) were achieved by the selected procedures. These methods proved to be reliable for PAH analysis and, having the advantage of speed, can be used in biomonitoring studies of PAH air contamination. PMID:27130099

  11. Validation of a liquid chromatographic method for determination of tacrolimus in pharmaceutical dosage forms.

    PubMed

    Moyano, María A; Simionato, Laura D; Pizzorno, María T; Segall, Adriana I

    2006-01-01

    An accurate, simple, and reproducible liquid chromatographic method was developed and validated for the determination of tacrolimus in capsules. The analysis is performed at room temperature on a reversed-phase C18 column with UV detection at 210 nm. The mobile phase is methanol-water (90 + 10) at a constant flow rate of 0.8 mL/min. The method was validated in terms of linearity, precision, accuracy, and specificity by forced decomposition of tacrolimus, using acid, base, water, hydrogen peroxide, heat, and light. The response was linear in the range of 0.09-0.24 mg/mL (r2 = 0.9997). The relative standard deviation values for intra- and interday precision studies were 1.28 and 2.91%, respectively. Recoveries ranged from 98.06 to 102.52%.

  12. Comparison of Assertive Community Treatment Fidelity Assessment Methods: Reliability and Validity.

    PubMed

    Rollins, Angela L; McGrew, John H; Kukla, Marina; McGuire, Alan B; Flanagan, Mindy E; Hunt, Marcia G; Leslie, Doug L; Collins, Linda A; Wright-Berryman, Jennifer L; Hicks, Lia J; Salyers, Michelle P

    2016-03-01

    Assertive community treatment is known for improving consumer outcomes, but is difficult to implement. On-site fidelity measurement can help ensure model adherence, but is costly in large systems. This study compared reliability and validity of three methods of fidelity assessment (on-site, phone-administered, and expert-scored self-report) using a stratified random sample of 32 mental health intensive case management teams from the Department of Veterans Affairs. Overall, phone, and to a lesser extent, expert-scored self-report fidelity assessments compared favorably to on-site methods in inter-rater reliability and concurrent validity. If used appropriately, these alternative protocols hold promise in monitoring large-scale program fidelity with limited resources. PMID:25721146

  13. Validation of a partial coherence interferometry method for estimating retinal shape

    PubMed Central

    Verkicharla, Pavan K.; Suheimat, Marwan; Pope, James M.; Sepehrband, Farshid; Mathur, Ankit; Schmid, Katrina L.; Atchison, David A.

    2015-01-01

    To validate a simple partial coherence interferometry (PCI) based retinal shape method, estimates of retinal shape were determined in 60 young adults using off-axis PCI, with three stages of modeling using variants of the Le Grand model eye, and magnetic resonance imaging (MRI). Stages 1 and 2 involved a basic model eye without and with surface ray deviation, respectively, and Stage 3 used a model with individual ocular biometry and ray deviation at surfaces. Considering the theoretical uncertainty of MRI (12-14%), the results of the study indicate good agreement between MRI and all three stages of PCI modeling, with <4% and <7% differences in retinal shapes along the horizontal and vertical meridians, respectively. Stages 2 and 3 gave slightly different retinal co-ordinates than Stage 1, and we recommend the intermediate Stage 2 as providing a simple and valid method of determining retinal shape from PCI data. PMID:26417496

  14. [Validation Study for Analytical Method of Diarrhetic Shellfish Poisons in 9 Kinds of Shellfish].

    PubMed

    Yamaguchi, Mizuka; Yamaguchi, Takahiro; Kakimoto, Kensaku; Nagayoshi, Haruna; Okihashi, Masahiro; Kajimura, Keiji

    2016-01-01

    A method was developed for the simultaneous determination of okadaic acid, dinophysistoxin-1 and dinophysistoxin-2 in shellfish using ultra-performance liquid chromatography with tandem mass spectrometry. Shellfish poisons in spiked samples were extracted with methanol and 90% methanol, and were hydrolyzed with 2.5 mol/L sodium hydroxide solution. Purification was done on an HLB solid-phase extraction column. The method was validated in accordance with the notification of the Ministry of Health, Labour and Welfare of Japan. In the validation study in nine kinds of shellfish, the trueness, repeatability and within-laboratory reproducibility were 79-101%, less than 12%, and less than 16%, respectively. The trueness and precision met the target values of the notification.

  15. Comparison of Assertive Community Treatment Fidelity Assessment Methods: Reliability and Validity.

    PubMed

    Rollins, Angela L; McGrew, John H; Kukla, Marina; McGuire, Alan B; Flanagan, Mindy E; Hunt, Marcia G; Leslie, Doug L; Collins, Linda A; Wright-Berryman, Jennifer L; Hicks, Lia J; Salyers, Michelle P

    2016-03-01

    Assertive community treatment is known for improving consumer outcomes, but is difficult to implement. On-site fidelity measurement can help ensure model adherence, but is costly in large systems. This study compared reliability and validity of three methods of fidelity assessment (on-site, phone-administered, and expert-scored self-report) using a stratified random sample of 32 mental health intensive case management teams from the Department of Veterans Affairs. Overall, phone, and to a lesser extent, expert-scored self-report fidelity assessments compared favorably to on-site methods in inter-rater reliability and concurrent validity. If used appropriately, these alternative protocols hold promise in monitoring large-scale program fidelity with limited resources.

  16. RETROSPECTIVE METHOD VALIDATION AND UNCERTAINTY ESTIMATION FOR ACTINIDES DETERMINATION IN EXCRETA BY ALPHA SPECTROMETRY.

    PubMed

    Hernández, C; Sierra, I

    2016-09-01

    Two essential technical requirements of the ISO 17025 standard for the accreditation of testing and calibration laboratories are the validation of methods and the estimation of all sources of uncertainty that may affect the analytical result. The Bioelimination Laboratory of the Radiation Dosimetry Service at CIEMAT (Spain) uses alpha spectrometry to quantify alpha emitters (Pu, Am, Th, U and Cm isotopes) in urine and faecal samples from workers exposed to internal radiation. Therefore, as a step prior to achieving ISO 17025 accreditation, the laboratory has performed retrospective studies based on results obtained over the past few years to validate the analytical method. Uncertainty estimation was done by identifying and quantifying all the contributions, and finally the overall combined standard uncertainty was calculated. PMID:26424133
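
    For reference, a hedged statement of the combination step mentioned above in GUM-style notation, assuming uncorrelated input quantities:

    ```latex
    u_c(y) = \sqrt{\sum_{i=1}^{n} \left( \frac{\partial f}{\partial x_i} \right)^{2} u^{2}(x_i)},
    \qquad U = k\, u_c(y) \quad (k = 2 \text{ for} \approx 95\% \text{ coverage})
    ```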

  17. A method of setting limits for the purpose of quality assurance

    NASA Astrophysics Data System (ADS)

    Sanghangthum, Taweap; Suriyapee, Sivalee; Kim, Gwe-Ya; Pawlicki, Todd

    2013-10-01

    The result from any assurance measurement needs to be checked against some limits for acceptability. There are two types of limits: those that define clinical acceptability (action limits) and those that are meant to serve as a warning that the measurement is close to the action limits (tolerance limits). Currently, there is no standard procedure to set these limits. In this work, we propose an operational procedure to set tolerance limits and action limits. The approach to establishing the limits is based on techniques of quality engineering, using control charts and a process capability index. The method differs for tolerance limits and action limits, with action limits being categorized into those that are specified and unspecified. The procedure is first to ensure process control using the I-MR control charts. Then, the tolerance limits are set equal to the control chart limits on the I chart. Action limits are determined using the Cpm process capability index, with the requirement that the process be in control. The limits from the proposed procedure are compared to those of an existing, conventional method. Four examples are investigated: two of volumetric modulated arc therapy (VMAT) point dose quality assurance (QA) and two of routine linear accelerator output QA. The tolerance limits range from about 6% larger to 9% smaller than conventional action limits for the VMAT QA cases. For the linac output QA, tolerance limits are about 60% smaller than conventional action limits. The operational procedure described in this work is based on established quality management tools and provides a systematic guide to setting up tolerance and action limits for different equipment and processes.
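
    A minimal sketch of the two quantities the procedure above relies on, under standard control-chart conventions: individuals (I) chart limits derived from the average moving range (the proposed tolerance limits), and the Cpm capability index used for action limits. The dose-deviation data are made up.

    ```python
    # I-chart limits and Taguchi Cpm index (quality-engineering sketch).
    import numpy as np

    def i_chart_limits(x):
        """Individuals-chart control limits; 2.66 = 3/d2 with d2 = 1.128."""
        x = np.asarray(x, float)
        mr_bar = np.mean(np.abs(np.diff(x)))  # average moving range
        center = x.mean()
        return center - 2.66 * mr_bar, center + 2.66 * mr_bar

    def cpm(x, lsl, usl, target):
        """Cpm penalizes both spread and off-target centering."""
        x = np.asarray(x, float)
        tau = np.sqrt(np.var(x, ddof=1) + (x.mean() - target) ** 2)
        return (usl - lsl) / (6.0 * tau)

    dose_dev = np.array([0.4, -0.2, 0.1, 0.6, -0.5, 0.0, 0.3, -0.1])  # % error
    print(i_chart_limits(dose_dev), cpm(dose_dev, -3.0, 3.0, 0.0))
    ```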

  18. Comparison of the quantitative performances and measurement uncertainty estimates obtained during method validation versus routine applications of a novel hydrophilic interaction chromatography method for the determination of cidofovir in human plasma.

    PubMed

    Lecomte, F; Hubert, C; Demarche, S; De Bleye, C; Dispas, A; Jost, M; Frankenne, F; Ceccato, A; Rozet, E; Hubert, Ph

    2012-01-01

    Method validation is essential to ensure that an analytical method is fit for its intended purpose. Additionally, it is advisable to estimate measurement uncertainty in order to allow a correct interpretation of the results generated by analytical methods. Measurement uncertainty can be efficiently estimated during method validation as a top-down approach. However, method validation predictions of the quantitative performance of an assay, and estimations of measurement uncertainty, may be far from the real performance obtained during the routine application of the assay. In this work, the predictions of the quantitative performance and the measurement uncertainty estimations obtained from a method validation are compared with those obtained during routine applications of a bioanalytical method. For that purpose, a new hydrophilic interaction chromatography (HILIC) method was used. This method was developed for the determination of cidofovir, an antiviral drug, in human plasma. Cidofovir (CDV) is a highly polar molecule presenting three ionizable functions; it is therefore an interesting candidate for determination in HILIC mode. CDV is an acyclic cytidine monophosphate analog that has a broad antiviral spectrum and is currently undergoing evaluation in clinical trials as a topical agent for treatment of papillomavirus infections. The analytical conditions were optimized by means of a design-of-experiments approach in order to obtain robust analytical conditions, which were essential to enable the comparisons mentioned above. After sample clean-up by means of solid phase extraction, the chromatographic analysis was performed on a bare silica stationary phase using a mixture of acetonitrile-ammonium hydrogen carbonate (pH 7.0; 20 mM) (72:28, v/v) as mobile phase. This newly developed bioanalytical method was then fully validated according to FDA (Food and Drug Administration) requirements using a total error approach that guaranteed that each future

  19. Experimental validation of a modal flexibility-based damage detection method for a cyber-physical system

    NASA Astrophysics Data System (ADS)

    Martinez-Castro, Rosana E.; Eskew, Edward L.; Jang, Shinae

    2014-03-01

    The detection and localization of damage in a timely manner is critical in order to avoid the failure of structures. When a structure is subjected to an unscheduled impulsive force, the resulting damage can lead to failure in a very short period of time. As such, a monitoring strategy that can adapt to variability in the environment and that anticipates changes in physical processes has the potential of detecting, locating and mitigating damage. These requirements can be met by a cyber-physical system (CPS) equipped with Wireless Smart Sensor Network (WSSN) systems capable of measuring and analyzing dynamic responses in real time using on-board in-network processing. The Eigenparameter Decomposition of Structural Flexibility Change (ED) Method is validated with real data and considered for use in the computational core of this CPS. The condition screening is implemented on a damaged structure and compared to an original baseline calculation, hence providing a supervised learning environment. An experimental laboratory study on a 5-story shear building with three damage conditions subjected to an impulsive force was chosen to validate the effectiveness of the proposed method in locating and quantifying the extent of damage. A numerical simulation of the same building subjected to band-limited white noise was also developed for this purpose. The effectiveness of the ED Method in locating damage is compared to that of the Damage Index Method. With some modifications, the ED Method is capable of satisfactorily locating and quantifying damage in a shear building subjected to an excitation with predominantly low-frequency content.
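
    The sketch below illustrates the modal-flexibility machinery on which the ED Method builds, as we understand it from the abstract: assemble flexibility from mass-normalized mode shapes, then eigen-decompose the flexibility change so that the dominant eigenvector points toward the damaged region. The mode data are synthetic and the decomposition details are an assumption, not the authors' code.

    ```python
    # Flexibility change and its eigen-decomposition for damage localization (sketch).
    import numpy as np

    def flexibility(phi, omega):
        """F = sum_i phi_i phi_i^T / omega_i^2, for mass-normalized modes."""
        return (phi / omega**2) @ phi.T

    def damage_indicators(phi_ref, w_ref, phi_dmg, w_dmg):
        dF = flexibility(phi_dmg, w_dmg) - flexibility(phi_ref, w_ref)
        vals, vecs = np.linalg.eigh(dF)          # eigenparameter decomposition
        lead = vecs[:, np.argmax(np.abs(vals))]  # dominant flexibility-change shape
        return np.abs(lead)                      # peaks near the damaged story

    # Synthetic 5-DOF example: "damage" perturbs mode shapes and softens frequencies.
    rng = np.random.default_rng(0)
    phi = np.linalg.qr(rng.normal(size=(5, 3)))[0]  # 3 orthonormal "modes"
    w = np.array([5.0, 12.0, 20.0])
    print(damage_indicators(phi, w, phi + 0.02 * rng.normal(size=phi.shape), w * 0.97))
    ```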

  20. Experimental validation of normalized uniform load surface curvature method for damage localization.

    PubMed

    Jung, Ho-Yeon; Sung, Seung-Hoon; Jung, Hyung-Jo

    2015-10-16

    In this study, we experimentally validated the normalized uniform load surface (NULS) curvature method, which has been developed recently to assess damage localization in beam-type structures. The normalization technique allows for the accurate assessment of damage localization with greater sensitivity, irrespective of the damage location. Damage to a simply supported beam was numerically and experimentally investigated on the basis of the changes in the NULS curvatures, which were estimated from the modal flexibility matrices obtained from the acceleration responses under an ambient excitation. Two damage scenarios were considered for the single-damage case as well as the multiple-damage case by reducing the bending stiffness (EI) of the affected element(s). Numerical simulations were performed using MATLAB as a preliminary step. During the validation experiments, a series of tests was performed. It was found that the damage locations could be identified successfully, without any false-positive or false-negative detections, using the proposed method. For comparison, the damage detection performance was compared with that of two other well-known methods based on the modal flexibility matrix, namely, the uniform load surface (ULS) method and the ULS curvature method. It was confirmed that the proposed method is more effective for locating damage in simply supported beams than the two conventional methods in terms of sensitivity to damage under measurement noise.
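
    A hedged sketch of the quantities named above: the uniform load surface is the flexibility matrix applied to a unit load at every measured degree of freedom, and its curvature is taken by central differences. The normalization shown is one plausible choice, and the synthetic flexibility matrix is a stand-in, not the authors' data.

    ```python
    # ULS and NULS-style curvature from a modal flexibility matrix (sketch).
    import numpy as np

    def uls(F):
        """Deflection under a unit load at all DOFs: row sums of flexibility."""
        return F @ np.ones(F.shape[0])

    def curvature(u, h=1.0):
        """Second-order central difference along the beam."""
        return (u[:-2] - 2.0 * u[1:-1] + u[2:]) / h**2

    def nuls_curvature(F, h=1.0):
        u = uls(F)
        return curvature(u / np.linalg.norm(u), h)  # normalize, then differentiate

    rng = np.random.default_rng(2)
    A = rng.normal(size=(12, 12))
    F = A @ A.T / 12.0                  # stand-in symmetric flexibility matrix
    print(nuls_curvature(F).shape)      # (10,) curvature values along the beam;
    # damage raises local flexibility, so the curvature change peaks there.
    ```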

  1. Validated spectrophotometric methods for simultaneous determination of troxerutin and carbazochrome in dosage form.

    PubMed

    Khattab, Fatma I; Ramadan, Nesrin K; Hegazy, Maha A; Al-Ghobashy, Medhat A; Ghoniem, Nermine S

    2015-03-15

    Four simple, accurate, sensitive and precise spectrophotometric methods were developed and validated for the simultaneous determination of Troxerutin (TXN) and Carbazochrome (CZM) in their bulk powders, laboratory-prepared mixtures and pharmaceutical dosage forms. Method A is first-derivative spectrophotometry (D(1)), in which TXN and CZM were determined at 294 and 483.5 nm, respectively. Method B is first derivative of ratio spectra (DD(1)), in which the peak amplitudes at 248 nm for TXN and 439 nm for CZM were used for their determination. Method C is ratio subtraction (RS), in which TXN was determined at its λmax (352 nm) in the presence of CZM, which was determined by D(1) at 483.5 nm. Method D is mean centering of the ratio spectra (MCR), in which the mean-centered values at 300 nm and 340.0 nm were used for the two drugs, respectively. The two compounds were simultaneously determined in the concentration ranges of 5.00-50.00 μg mL(-1) and 0.5-10.0 μg mL(-1) for TXN and CZM, respectively. The methods were validated according to the ICH guidelines and the results were statistically compared to those of the manufacturer's method.
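
    As a rough illustration of the first-derivative step used in methods A and B above, the sketch below obtains a D(1) spectrum from synthetic absorbance data with a Savitzky-Golay filter, one common way to compute derivative spectra; the band shape and filter parameters are assumptions.

    ```python
    # First-derivative spectrum D(1) from absorbance data (sketch).
    import numpy as np
    from scipy.signal import savgol_filter

    wl = np.arange(230.0, 500.0, 0.5)                 # wavelength grid, nm
    absorbance = np.exp(-((wl - 294.0) / 20.0) ** 2)  # toy TXN-like band at 294 nm
    d1 = savgol_filter(absorbance, window_length=11, polyorder=3,
                       deriv=1, delta=0.5)            # dA/d(lambda)
    print(wl[np.argmax(d1)], wl[np.argmin(d1)])       # extrema on the band flanks
    ```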

  2. Experimental validation of finite element and boundary element methods for predicting structural vibration and radiated noise

    NASA Technical Reports Server (NTRS)

    Seybert, A. F.; Wu, T. W.; Wu, X. F.

    1994-01-01

    This research report is presented in three parts. In the first part, acoustical analyses were performed on modes of vibration of the housing of a transmission of a gear test rig developed by NASA. The modes of vibration of the transmission housing were measured using experimental modal analysis. The boundary element method (BEM) was used to calculate the sound pressure and sound intensity on the surface of the housing and the radiation efficiency of each mode. The radiation efficiency of each of the transmission housing modes was then compared to theoretical results for a finite baffled plate. In the second part, analytical and experimental validation of methods to predict structural vibration and radiated noise are presented. A rectangular box excited by a mechanical shaker was used as a vibrating structure. Combined finite element method (FEM) and boundary element method (BEM) models of the apparatus were used to predict the noise level radiated from the box. The FEM was used to predict the vibration, while the BEM was used to predict the sound intensity and total radiated sound power using surface vibration as the input data. Vibration predicted by the FEM model was validated by experimental modal analysis; noise predicted by the BEM was validated by measurements of sound intensity. Three types of results are presented for the total radiated sound power: sound power predicted by the BEM model using vibration data measured on the surface of the box; sound power predicted by the FEM/BEM model; and sound power measured by an acoustic intensity scan. In the third part, the structure used in part two was modified. A rib was attached to the top plate of the structure. The FEM and BEM were then used to predict structural vibration and radiated noise respectively. The predicted vibration and radiated noise were then validated through experimentation.

  3. A Comparative Study of Methods To Validate Formaldehyde Decontamination of Biological Safety Cabinets

    PubMed Central

    Munro, Kerry; Lanser, Janice; Flower, Robert

    1999-01-01

    Methods of validation of formaldehyde decontamination of biological safety cabinets were compared. Decontamination of metal strips inoculated with Mycobacterium bovis, poliovirus, or Bacillus spp. spores was compared with the results obtained with three biological indicators. Conditions for successful decontamination, particularly relative humidity, were defined. The Attest 1291 biological indicator was the only biological indicator which was an aid in the detection of gross decontamination failure. PMID:9925635

  4. Validation of an immersed thick boundary method for simulating fluid-structure interactions of deformable membranes

    NASA Astrophysics Data System (ADS)

    Sigüenza, J.; Mendez, S.; Ambard, D.; Dubois, F.; Jourdan, F.; Mozul, R.; Nicoud, F.

    2016-10-01

    This paper constitutes an extension of the work of Mendez et al. (2014) [36] to three-dimensional simulations of deformable membranes under flow. An immersed thick boundary method is used, combining the immersed boundary method with a three-dimensional modeling of the structural part. The immersed boundary method is adapted to unstructured grids for the fluid resolution, using the reproducing kernel particle method. An unstructured finite-volume flow solver for the incompressible Navier-Stokes equations is coupled with a finite-element solver for the structure. The validation process, relying on a number of test cases, demonstrates the efficiency of the method, and its robustness is illustrated by computing the dynamics of a tri-leaflet aortic valve. The proposed immersed thick boundary method is able to tackle applications involving both thin and thick membranes, closed and open, in significantly high Reynolds number flows and highly complex geometries.

  5. Determination of paraquat and diquat: LC-MS method optimization and validation.

    PubMed

    Pizzutti, Ionara R; Vela, Giovana M E; de Kok, André; Scholten, Jos M; Dias, Jonatan V; Cardoso, Carmem D; Concenço, Germani; Vivian, Rafael

    2016-10-15

    This study describes the optimization and single-laboratory validation of a single-residue method for the determination of two bipyridylium herbicides, paraquat and diquat, in cowpeas by UPLC-MS/MS in a total run time of 9.3 min. The method is based on extraction with an acidified methanol-water mixture. Different extraction parameters (extraction solvent composition, temperature, sample extract filtration, and pre-treatment of the laboratory sample) were evaluated in order to optimize extraction efficiency. Isotopically labeled internal standards, Paraquat-D6 and Diquat-D4, were added to the test portions prior to extraction. The method validation was performed by analyzing spiked samples at three concentrations (10, 20 and 50 μg kg(-1)), with seven replicates (n=7) for each concentration. Linearity (r(2)) of analytical curves, accuracy (trueness as recovery % and precision as RSD%), instrument and method limits of detection and quantification (LOD and LOQ) and matrix effects were determined. Average recoveries obtained for diquat were between 77 and 85%, with RSD values ⩽20% for all spike levels studied. Paraquat, on the other hand, showed average recoveries between 68 and 103%, with RSDs in the range 14.4-25.4%. The method LOQ was 10 and 20 μg kg(-1) for diquat and paraquat, respectively. The matrix effect was significant for both pesticides; consequently, matrix-matched calibration standards and the use of isotopically labeled (IL) analogues as internal standards for the target analytes are required for application in routine analysis. The validated method was successfully applied to cowpea samples obtained from various field studies. PMID:27173559

  6. Capillary isoelectric focusing method development and validation for investigation of recombinant therapeutic monoclonal antibody.

    PubMed

    Suba, Dávid; Urbányi, Zoltán; Salgó, András

    2015-10-10

    Capillary isoelectric focusing (cIEF) is a basic and highly accurate routine analytical tool used to prove the identity of protein drugs in quality control (QC) and release tests in the biopharmaceutical industry. Commercially available "out-of-the-box" kits provide easy and rapid isoelectric focusing solutions for investigating monoclonal antibody drug proteins, but their use in routine testing entails high costs. A capillary isoelectric focusing method was developed and validated for identification testing of monoclonal antibody drug products with isoelectric points between 7.0 and 9.0. The method provides a good pH gradient for internal calibration (R(2)>0.99) and good resolution between all of the isoform peaks (R=2), while minimizing the time and complexity of sample preparation (no urea or salt used). The method is highly reproducible and suitable for validation and method transfer to any QC laboratory. Another advantage of the method is that it operates with commercially available chemicals that can be purchased from any supplier. Interaction with the capillary walls was minimized to avoid precipitation and adsorption as far as possible, and synthetic small-molecule isoelectric markers were used instead of peptide- or protein-based markers. The developed method was validated according to the recent ICH guideline (Q2(R1)). Relative standard deviation results were below 0.2% for isoelectric points and below 4% for the normalized migration times. The method is robust to buffer components with different lot numbers and to neutral capillaries with different types of inner coating. The fluorocarbon-coated column was chosen for cost-effectiveness reasons. PMID:26025812

  7. Validation of analysis methods for assessing flawed piping subjected to dynamic loading

    SciTech Connect

    Olson, R.J.; Wolterman, R.L.; Wilkowski, G.M.; Kot, C.A.

    1994-08-01

    Argonne National Laboratory and Battelle have jointly conducted a research program for the USNRC to evaluate the ability of current engineering analysis methods, and of one state-of-the-art analysis method, to predict the behavior of a circumferentially surface-cracked pipe system in a water-hammer experiment. The experimental data used in the evaluation were from the HDR Test Group E31 series conducted by the Kernforschungszentrum Karlsruhe (KfK) in Germany. The incentive for this evaluation was that simplified engineering methods, as well as newer "state-of-the-art" fracture analysis methods, had typically been validated only against static experimental data; hence, these dynamic experiments were of high interest. High-rate dynamic loading can be classified as either repeating (e.g., seismic) or nonrepeating (e.g., water hammer). Development of experimental data and validation of cracked-pipe analyses under seismic (repeating dynamic) loading are being pursued separately within the NRC's International Piping Integrity Research Group (IPIRG) program. This report describes development and validation efforts to predict crack stability under water-hammer loading, together with comparisons against currently used analysis procedures. Current fracture analysis methods use elastic stress analysis loads decoupled from the fracture mechanics analysis, whereas state-of-the-art methods employ nonlinear cracked-pipe time-history finite element analyses. The results showed that the current decoupled methods were conservative in their predictions, whereas the cracked-pipe finite element analyses were more accurate, yet slightly conservative. The nonlinear time-history cracked-pipe finite element analyses conducted in this program were also attractive in that they were run on a small Apollo DN5500 workstation, whereas other cracked-pipe dynamic analyses of the same experiments, conducted in Europe, required a CRAY2 supercomputer and were less accurate.

  8. Bridging the Gap Between Validation and Implementation of Non-Animal Veterinary Vaccine Potency Testing Methods.

    PubMed

    Dozier, Samantha; Brown, Jeffrey; Currie, Alistair

    2011-11-29

    In recent years, technologically advanced high-throughput techniques have been developed that replace, reduce or refine animal use in vaccine quality control tests. Following validation, these tests are slowly being accepted for use by international regulatory authorities. Because regulatory acceptance itself has not guaranteed that approved humane methods are adopted by manufacturers, various organizations have sought to foster the preferential use of validated non-animal methods by interfacing with industry and regulatory authorities. After noticing this gap between regulation and uptake by industry, we began developing a paradigm that seeks to narrow the gap and quicken implementation of new replacement, refinement or reduction guidance. A systematic analysis of our experience in promoting the transparent implementation of validated non-animal vaccine potency assays has led to the refinement of our paradigmatic process, presented here, by which interested parties can assess the local regulatory acceptance of methods that reduce animal use and integrate them into quality control testing protocols, or ensure the elimination of peripheral barriers to their use, particularly for potency and other tests carried out on production batches.

  9. Validation of finite element and boundary element methods for predicting structural vibration and radiated noise

    NASA Technical Reports Server (NTRS)

    Seybert, A. F.; Wu, X. F.; Oswald, Fred B.

    1992-01-01

    Analytical and experimental validation of methods to predict structural vibration and radiated noise is presented. A rectangular box excited by a mechanical shaker was used as the vibrating structure. Combined finite element method (FEM) and boundary element method (BEM) models of the apparatus were used to predict the noise radiated from the box. The FEM was used to predict the vibration, and the surface vibration was used as input to the BEM to predict the sound intensity and sound power. Vibration predicted by the FEM model was validated by experimental modal analysis. Noise predicted by the BEM was validated by sound intensity measurements. Three types of results are presented for the total radiated sound power: (1) sound power predicted by the BEM model using vibration data measured on the surface of the box; (2) sound power predicted by the combined FEM/BEM model; and (3) sound power measured by a sound intensity scan. The sound power predicted from the BEM model using measured vibration data yields an excellent prediction of the radiated noise. The sound power predicted by the combined FEM/BEM model also gives a good prediction of radiated noise, except for a shift of the natural frequencies that is due to limitations in the FEM model.

  10. A validated method for analysis of Swerchirin in Swertia longifolia Boiss. by high performance liquid chromatography

    PubMed Central

    Shekarchi, M.; Hajimehdipoor, H.; Khanavi, M.; Adib, N.; Bozorgi, M.; Akbari-Adergani, B.

    2010-01-01

    Swertia spp. (Gentianaceae) grow widely in eastern and southern Asian countries and are used in traditional medicine for gastrointestinal disorders. Swerchirin, one of the xanthones in Swertia spp., has many pharmacological properties, such as antimalarial, antihepatotoxic, and hypoglycemic effects. Because of its pharmacological importance, in this investigation Swerchirin was purified from Swertia longifolia Boiss. as one of the main components and quantified by means of a validated high performance liquid chromatography (HPLC) technique. Aerial parts of the plant were extracted with 80% acetone. Phenolic and non-phenolic constituents of the extract were separated from each other over several steps. The phenolic fraction was injected into a semi-preparative HPLC system consisting of a C18 column eluted with a methanol:0.1% formic acid gradient. Using this method, we were able to purify six xanthones from the plant for use as standard materials. The analytical method was validated for Swerchirin, as the pharmacologically most important component of the plant, with respect to validation parameters such as selectivity, linearity (r² > 0.9998), precision (≤3.3%), and accuracy, the latter measured by determination of recovery (98-107%). The limits of detection and quantitation were found to be 2.1 and 6.3 μg/mL, respectively. On account of its speed and accuracy, the UV-HPLC method may be used for quantitative analysis of Swerchirin. PMID:20548931

  11. HPLC method development, validation, and impurity characterization of a potent antitumor indenoisoquinoline, LMP776 (NSC 725776).

    PubMed

    Wang, Jennie; Liu, Mingtao; Yang, Chun; Wu, Xiaogang; Wang, Euphemia; Liu, Paul

    2016-05-30

    An HPLC method for the assay of a DNA topoisomerase inhibitor, LMP776 (NSC 725776), has been developed and validated. The stress testing of LMP776 was carried out in accordance with International Conference on Harmonization (ICH) guidelines Q1A (R2) under acidic, alkaline, oxidative, thermolytic, and photolytic conditions. The separation of LMP776 from its impurities and degradation products was achieved within 40 min on a Supelco Discovery HS F5 column (150 mm × 4.6 mm i.d., 5 μm) with a gradient mobile phase comprising 38-80% acetonitrile in water, with 0.1% trifluoroacetic acid in both phases. LC/MS was used to obtain mass data for characterization of impurities and degradation products. One major impurity was isolated through chloroform extraction and identified by NMR. The proposed HPLC assay method was validated for specificity, linearity (concentration range 0.25-0.75 mg/mL, r = 0.9999), accuracy (recovery 98.6-100.4%), precision (RSD ≤ 1.4%), and sensitivity (LOD 0.13 μg/mL). The validated method was used in the stability study of the LMP776 drug substance in conformance with the ICH Q1A (R2) guideline. PMID:26970596

  12. A Thematic Review of Interactive Whiteboard Use in Science Education: Rationales, Purposes, Methods and General Knowledge

    NASA Astrophysics Data System (ADS)

    Ormanci, Ummuhan; Cepni, Salih; Deveci, Isa; Aydin, Ozhan

    2015-10-01

    In Turkey and many other countries, the importance of the interactive whiteboard (IWB) is increasing, and as a result, projects and studies are being conducted regarding the use of the IWB in classrooms. Accordingly, many issues are being researched in these countries, such as the IWB's contribution to the education process, its use in classroom settings, and the problems that occur when using it. In this context, reviewing and analyzing studies of IWB use has important implications for educators, researchers, and teachers. This study aims to review and analyze studies conducted on the use of the IWB in the field of science. As a thematic review of the research was deemed appropriate, articles available in the literature were analyzed using a matrix consisting of general features (type of journal, year, and demographic properties) and content features (rationales, aims, research methods, samples, data collections, results, and suggestions). According to the findings, most studies regarding the use of IWBs were motivated by deficiencies in the current literature; studies whose rationale is tied to the nature of science education are rare. There were also studies that focused on the effects of the IWB on student academic success and learning outcomes. Within this context, there is an evident need for further research on the use of IWBs in science education and on the effect of IWBs on students' skills.

  13. Pressure ulcer prevention algorithm content validation: a mixed-methods, quantitative study.

    PubMed

    van Rijswijk, Lia; Beitz, Janice M

    2015-04-01

    Translating pressure ulcer prevention (PUP) evidence-based recommendations into practice remains challenging for a variety of reasons, including the perceived quality, validity, and usability of the research or the guideline itself. Following the development and face validation testing of an evidence-based PUP algorithm, additional stakeholder input and testing were needed. Using convenience sampling methods, wound care experts attending a national wound care conference and a regional wound ostomy continence nursing (WOCN) conference and/or graduates of a WOCN program were invited to participate in an Institutional Review Board-approved, mixed-methods quantitative survey with qualitative components to examine algorithm content validity. After participants provided written informed consent, demographic variables were collected and participants were asked to comment on and rate the relevance and appropriateness of each of the 26 algorithm decision points/steps using standard content validation study procedures. All responses were anonymous. Descriptive summary statistics, mean relevance/appropriateness scores, and the content validity index (CVI) were calculated. Qualitative comments were transcribed and thematically analyzed. Of the 553 wound care experts invited, 79 (average age 52.9 years, SD 10.1; range 23-73) consented to participate and completed the study (a response rate of 14%). Most (67, 85%) were female, registered (49, 62%) or advanced practice (12, 15%) nurses, and had > 10 years of health care experience (88, 92%). Other health disciplines included medical doctors, physical therapists, nurse practitioners, and certified nurse specialists. Almost all had received formal wound care education (75, 95%). On a Likert-type scale of 1 (not relevant/appropriate) to 4 (very relevant and appropriate), the average score for the entire algorithm/all decision points (N = 1,912) was 3.72 with an overall CVI of 0.94 (out of 1). The only decision point/step recommendation
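
    For readers unfamiliar with the content validity index, a minimal sketch of the standard CVI computation used in such studies (ratings below are invented; on the 1-4 relevance scale, CVI is the proportion of ratings of 3 or 4):

      # Content validity index (CVI) for one decision point, ten raters.
      ratings = [4, 4, 3, 2, 4, 3, 4, 4, 3, 4]  # invented 1-4 relevance ratings

      cvi = sum(1 for r in ratings if r >= 3) / len(ratings)
      mean_score = sum(ratings) / len(ratings)
      print(f"mean relevance = {mean_score:.2f}, CVI = {cvi:.2f}")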

  14. A hot-wire method based thermal conductivity measurement apparatus for teaching purposes

    NASA Astrophysics Data System (ADS)

    Alvarado, S.; Marín, E.; Juárez, A. G.; Calderón, A.; Ivanov, R.

    2012-07-01

    The implementation of an automated system based on the hot-wire technique is described for measuring the thermal conductivity of liquids, using equipment readily available in modern physics laboratories at high schools and universities (basically a precision current source, a voltage meter, a data acquisition card, a personal computer, and a high-purity platinum wire). The wire, immersed in the sample under investigation, is heated by passing a constant electrical current through it, and its temperature evolution, ΔT, is measured as a function of time, t, for several values of the current. A straightforward methodology is then used for data processing in order to obtain the thermal conductivity of the liquid. The starting point is the well-known linear relationship between ΔT and ln(t) predicted for long heating times by a model based on a solution of the heat conduction equation for an infinite linear heat source embedded in an infinite medium into which heat is conducted without convective or radiative losses. A criterion is used to verify that the selected linear region is the one that matches the conditions imposed by the theoretical model. The method consequently involves least-squares fits in linear, semi-logarithmic (semi-log), and log-log graphs, so it is attractive not only for teaching heat transfer and thermal-property measurement techniques, but also as a good exercise for undergraduate physics and engineering students learning about these kinds of functional relationships between variables. The functionality of the experiment was demonstrated by measuring the thermal conductivity of liquid samples with well-known thermal properties.
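
    A minimal sketch of the described data reduction, assuming the standard long-time hot-wire model ΔT = (q/4πk)·ln(t) + C, with q the heating power per unit wire length; the data below are synthetic and the numbers illustrative:

      # Hot-wire data reduction: the slope of dT vs ln(t) yields k = q/(4*pi*slope).
      import numpy as np

      t = np.linspace(5.0, 60.0, 40)              # s, long-time window
      q = 1.2                                     # W/m, power per unit wire length
      k_true = 0.60                               # W/(m K), water-like liquid
      dT = q / (4 * np.pi * k_true) * np.log(t) + 0.05  # synthetic temperature rise

      slope, _ = np.polyfit(np.log(t), dT, 1)     # least-squares fit in semi-log
      k_est = q / (4 * np.pi * slope)
      print(f"estimated k = {k_est:.3f} W/(m K)")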

  15. 76 FR 28664 - Method 301-Field Validation of Pollutant Measurement Methods From Various Waste Media

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-18

    ... of 40 CFR part 63 on June 3, 1991. We proposed amendments to Method 301 on December 22, 2004 (69 FR... proposed on December 22, 2004, EPA promulgated a rule on September 13, 2010 (75 FR 55636), that moves all... terms of Executive Order 12866 (58 FR 51735, October 4, 1993) and is therefore not subject to...

  16. Development of a benchtop baking method for chemically leavened crackers. II. Validation of the method

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A benchtop baking method has been developed to predict the contribution of gluten functionality to overall flour performance for chemically leavened crackers. Using a diagnostic formula and procedure, dough rheology was analyzed to evaluate the extent of gluten development during mixing and machinin...

  17. Validation of quantitative and qualitative methods for detecting allergenic ingredients in processed foods in Japan.

    PubMed

    Sakai, Shinobu; Adachi, Reiko; Akiyama, Hiroshi; Teshima, Reiko

    2013-06-19

    A labeling system for food allergenic ingredients was established in Japan in April 2002. To monitor the labeling, the Japanese government announced official methods for detecting allergens in processed foods in November 2002. The official methods consist of quantitative screening tests using enzyme-linked immunosorbent assays (ELISAs) and qualitative confirmation tests using Western blotting or polymerase chain reactions (PCR). In addition, the Japanese government designated 10 μg protein/g food (the corresponding allergenic ingredient soluble protein weight/food weight), determined by ELISA, as the labeling threshold. To standardize the official methods, the criteria for the validation protocol were described in the official guidelines. This paper, which was presented at the Advances in Food Allergen Detection Symposium, ACS National Meeting and Expo, San Diego, CA, Spring 2012, describes the validation protocol outlined in the official Japanese guidelines, the results of interlaboratory studies for the quantitative detection method (ELISA for crustacean proteins) and the qualitative detection method (PCR for shrimp and crab DNAs), and the reliability of the detection methods.

  18. Development and validation of event-specific quantitative PCR method for genetically modified maize MIR604.

    PubMed

    Mano, Junichi; Furui, Satoshi; Takashima, Kaori; Koiwa, Tomohiro; Futo, Satoshi; Minegishi, Yasutaka; Akiyama, Hiroshi; Teshima, Reiko; Kurashima, Takeyo; Takabatake, Reona; Kitta, Kazumi

    2012-01-01

    A GM maize event, MIR604, has been widely distributed, and an analytical method to quantify its content is required to monitor the validity of food labeling. Here we report a novel real-time PCR-based quantitation method for MIR604 maize. We developed real-time PCR assays specific for MIR604, using event-specific primers designed by the trait developer, and for the maize endogenous starch synthase IIb gene (SSIIb). We then determined the conversion factor, which is required to calculate the weight-based GM maize content from the copy number ratio of MIR604-specific DNA to the endogenous reference DNA. Finally, to validate the developed method, an interlaboratory collaborative trial according to the internationally harmonized guidelines was performed with blind samples containing MIR604 at mixing levels of 0, 0.5, 1.0, 5.0, and 10.0%. The reproducibility (RSDR) of the developed method was evaluated to be less than 25%. The limit of quantitation was estimated to be 0.5% based on the ISO 24276 guideline. These results suggest that the developed method is suitable for practical quantitative analyses of MIR604 maize. PMID:23132355
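
    A minimal sketch of how such a conversion factor (Cf) is typically applied; the copy numbers and Cf value below are invented, and the actual factor is determined experimentally as described above:

      # GM content (%) from the copy-number ratio of event-specific DNA to the
      # endogenous SSIIb reference, divided by the conversion factor Cf.
      copies_mir604 = 480.0    # MIR604 event-specific target copies (invented)
      copies_ssiib = 12000.0   # endogenous SSIIb target copies (invented)
      cf = 0.40                # hypothetical Cf: ratio measured in 100% GM seed

      gm_content_pct = (copies_mir604 / copies_ssiib) / cf * 100.0
      print(f"GM maize content = {gm_content_pct:.2f}%")  # -> 10.00%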

  19. Validation of GOSAT SWIR XCO2 and XCH4 retrieved by PPDF-S method

    NASA Astrophysics Data System (ADS)

    Iwasaki, C.; Hayashida, S.; Imasu, R.; Ono, A.; Yokota, T.; Morino, I.; Yoshida, Y.; Oshchepkov, S.; Bril, A.

    2015-12-01

    The photon path length probability density function (PPDF)-based method is an effective algorithm for retrieving column-averaged concentrations of carbon dioxide and methane (XCO2 and XCH4) from Greenhouse gases Observing SATellite (GOSAT) data, even under conditions of high aerosol concentration. Few studies have validated XCO2 analyzed with a PPDF-based method, and no report describes validation for XCH4. In this study, we validated both XCO2 and XCH4 data analyzed using an advanced version of PPDF, PPDF-simultaneous (PPDF-S), by comparing the retrieved data with Total Carbon Column Observing Network (TCCON) data (GGG 2012 and 2014 releases) observed at 11 sites. As a first step, our validation procedure was applied to the NIES standard products of XCO2 and XCH4 (V02.xx) to confirm its performance. The biases ± their standard deviations, defined by the difference from TCCON (GGG 2012) data, were evaluated to be -1.94±1.79 ppm and -7.59±11.8 ppb for XCO2 and XCH4, respectively, using the radiance data used for NIES V02.11, and -1.63±1.88 ppm and -6.45±12.2 ppb using the data for NIES V02.21. The absolute values of these biases are slightly larger than those of Yoshida et al. (2013), reported as -1.56±1.88 ppm and -6.10±12.3 ppb for XCO2 and XCH4, respectively, using the radiance data used for NIES V02.00. The validation procedure was then applied to XCO2 and XCH4 analyzed using PPDF-S. The resulting biases are 0.16±1.77 ppm and 4.18±14.3 ppb for XCO2 and XCH4, respectively, using the radiance data used for NIES V02.11, and 0.27±1.72 ppm and 4.54±14.4 ppb using the data for NIES V02.21. We conclude that the PPDF-S method performs better for both XCO2 and XCH4 retrieval than the NIES standard products when TCCON (GGG 2012) data are used as the reference, although the standard products might be better for XCH4 if TCCON (GGG 2014) data are used.
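
    The bias ± standard deviation figures quoted above are simple statistics of the retrieval-minus-reference differences; a sketch with invented values:

      # Bias and standard deviation of (retrieval - TCCON) differences.
      import numpy as np

      xco2_gosat = np.array([394.1, 395.0, 393.2, 396.4, 394.8])  # ppm, retrieved
      xco2_tccon = np.array([395.9, 396.7, 395.3, 398.1, 396.5])  # ppm, reference

      diff = xco2_gosat - xco2_tccon
      print(f"bias = {diff.mean():.2f} ppm, std = {diff.std(ddof=1):.2f} ppm")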

  1. Diffuse reflectance near infrared-chemometric methods development and validation of amoxicillin capsule formulations

    PubMed Central

    Khan, Ahmed Nawaz; Khar, Roop Krishen; Ajayakumar, P. V.

    2016-01-01

    Objective: The aim of the present study was to establish near infrared-chemometric methods that can be used for quality profiling, through identification and quantification of amoxicillin (AMOX), of formulated capsules similar to commercial products. The methods were designed to allow easy and rapid evaluation of a large number of market products. Materials and Methods: A Thermo Scientific Antaris II near infrared analyzer with TQ Analyst chemometric software was used for the development and validation of the identification and quantification models. Several AMOX formulations were prepared with four excipients: microcrystalline cellulose, magnesium stearate, croscarmellose sodium, and colloidal silicon dioxide. Development included a quadratic mixture formulation design, near infrared spectrum acquisition, spectral pretreatment, and outlier detection. The methods were validated in terms of specificity, accuracy, precision, linearity, and robustness according to the guidelines of the International Conference on Harmonization (ICH) and the European Medicines Agency (EMA). Results: In diffuse reflectance mode, an identification model based on discriminant analysis was successfully processed with 76 formulations; the same samples were also used for quantitative analysis using a partial least squares algorithm with four latent variables, giving a correlation coefficient of 0.9937 with 2.17% root mean square error of calibration (RMSEC), 2.38% root mean square error of prediction (RMSEP), and 2.43% root mean square error of cross-validation (RMSECV). Conclusion: The proposed models established a good relationship between the spectral information and AMOX identity as well as content. The results show the performance of the proposed models, which offer an alternative for AMOX capsule evaluation relative to the well-established high-performance liquid chromatography method. Ultimately, three commercial products were successfully evaluated using developed
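
    A hedged sketch of the PLS calibration/validation metrics named above (RMSEC, RMSEP, RMSECV), using synthetic spectra and scikit-learn's PLSRegression as a stand-in for the TQ Analyst software; all data are invented:

      # 4-latent-variable PLS model with calibration, prediction, and
      # cross-validation root mean square errors, on synthetic NIR data.
      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import cross_val_predict, train_test_split

      rng = np.random.default_rng(0)
      X = rng.normal(size=(76, 200))            # 76 formulations x 200 wavelengths
      y = rng.uniform(60, 100, size=76)         # AMOX content, % of label claim
      X += y[:, None] * 0.01                    # inject a linear signal into X

      X_cal, X_val, y_cal, y_val = train_test_split(X, y, random_state=0)
      pls = PLSRegression(n_components=4).fit(X_cal, y_cal)

      rmsec = np.sqrt(np.mean((pls.predict(X_cal).ravel() - y_cal) ** 2))
      rmsep = np.sqrt(np.mean((pls.predict(X_val).ravel() - y_val) ** 2))
      y_cv = cross_val_predict(PLSRegression(n_components=4), X_cal, y_cal, cv=5)
      rmsecv = np.sqrt(np.mean((np.ravel(y_cv) - y_cal) ** 2))
      print(f"RMSEC={rmsec:.2f}  RMSEP={rmsep:.2f}  RMSECV={rmsecv:.2f}")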

  2. [Valid and reliable methods for describing pressure sores and leg ulcer--a systematic literature review].

    PubMed

    Panfil, Eva-Maria; Linde, Eva

    2007-08-01

    In the wound documentation of pressure sores and leg ulcers, the most important tasks are the presentation of the outcomes of the diagnostic inspection, the planning of therapy, and the evaluation of wound healing. The aim of this systematic literature review, covering the period between 2001 and 2006, was to identify valid, reliable, and feasible methods for assessing the size, appearance, edge, grade, and healing of wounds. Due to their heterogeneity, the studies found can hardly be compared, and some show methodological weaknesses. Among the linear methods, measurement of an elliptical area based on the perpendicular method using a ruler is the most reliable, although it only allows an estimation of the size. Tracings, combined with mechanical or digital planimetry, can measure wound size reliably. Photographs do not assess large or circular wounds reliably, nor do they adequately document wound colour. There are no valid and reliable standardized procedures for documenting wound colour, exudate, odour, margins, or maceration. Twenty different classification systems exist for describing the degree of severity of pressure sores; the data, however, confirm the difficulty of classifying pressure ulcers reliably. Wound healing can also be assessed by a number of standardized tools: PSST, PUSH, SWHT, SS, PUHP, CODED and DESIGN (pressure sores) and LUMT (leg ulcers). These tools have not been translated into German and have not been adequately researched. No data exist to allow generalization concerning the practicability of these methods. For all measurement methods, it can be concluded that training and experience in their use are required, and that validity and reliability are higher when measurements are conducted by an experienced person. PMID:18019553
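
    A minimal sketch of the elliptical (perpendicular) area estimate the review found most reliable among the linear methods; the measurements below are invented:

      # Elliptical wound-area estimate from two perpendicular ruler measurements:
      # area = pi * (length/2) * (width/2).
      import math

      length_cm = 4.2   # longest wound axis
      width_cm = 2.6    # widest extent perpendicular to that axis

      area_cm2 = math.pi * (length_cm / 2) * (width_cm / 2)
      print(f"estimated wound area = {area_cm2:.1f} cm^2")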

  3. Control rod heterogeneity effects in liquid-metal fast breeder reactors: Method developments and experimental validation

    SciTech Connect

    Carta, M.; Granget, G.; Palmiotti, G.; Salvatores, M.; Soule, R.

    1988-11-01

    The control rod worth assessment in a large liquid-metal fast breeder reactor is strongly dependent on the actual arrangement of the absorber pins inside the control rod subassemblies. The so-called heterogeneity effects (i.e., the effects on the rod reactivity of the actual rod internal geometry versus homogenization of the absorber atoms over all the subassembly volume) have been evaluated, using explicit and variational methods to derive appropriate cross sections. An experimental program performed at the MASURCA facility has been used to validate these methods.

  4. Estimating drug consumption in opioid users: reliability and validity of a 'recent use' episodes method.

    PubMed

    Darke, S; Heather, N; Hall, W; Ward, J; Wodak, A

    1991-10-01

    The efficient and accurate measurement of recent drug use is an essential component of treatment and research among opioid users. Urinalysis results alone will not give sufficient information to either the clinician or researcher, due to limitations in detection and an inability to distinguish extent of use. The present paper describes a 'recent use episodes method', adapted from the measurement of alcohol consumption, for obtaining self-reported drug use in eleven different drug categories. Reliability and validity data indicate that the method provides a quick means by which accurate information may be obtained on the overall recent drug use of opioid users.

  5. Evidence flow graph methods for validation and verification of expert systems

    NASA Technical Reports Server (NTRS)

    Becker, Lee A.; Green, Peter G.; Bhatnagar, Jayant

    1988-01-01

    This final report describes the results of an investigation into the use of evidence flow graph techniques for performing validation and verification of expert systems. This was approached by developing a translator to convert horn-clause rule bases into evidence flow graphs, a simulation program, and methods of analysis. These tools were then applied to a simple rule base which contained errors. It was found that the method was capable of identifying a variety of problems, for example that the order of presentation of input data or small changes in critical parameters could affect the output from a set of rules.

  6. Evidence flow graph methods for validation and verification of expert systems

    NASA Technical Reports Server (NTRS)

    Becker, Lee A.; Green, Peter G.; Bhatnagar, Jayant

    1989-01-01

    The results of an investigation into the use of evidence flow graph techniques for performing validation and verification of expert systems are given. A translator to convert horn-clause rule bases into evidence flow graphs, a simulation program, and methods of analysis were developed. These tools were then applied to a simple rule base which contained errors. It was found that the method was capable of identifying a variety of problems, for example that the order of presentation of input data or small changes in critical parameters could affect the output from a set of rules.

  7. T2 Preparation Method for Measuring Hyperemic Myocardial O2 Consumption: In Vivo Validation by Positron Emission Tomography

    PubMed Central

    McCommis, Kyle S.; O’Connor, Robert; Abendschein, Dana R.; Muccigrosso, David; Gropler, Robert J.; Zheng, Jie

    2013-01-01

    Purpose: To validate a new T2-prepared method for the quantification of regional myocardial O2 consumption during pharmacologic stress against positron emission tomography (PET). Materials and Methods: A T2-prepared gradient-echo sequence was modified to measure myocardial T2 within a single breath-hold. Six beagle dogs were randomly selected for the induction of coronary artery stenosis. Magnetic resonance imaging (MRI) experiments were performed with T2 imaging and first-pass perfusion imaging at rest and during either dobutamine- or dipyridamole-induced hyperemia. Myocardial blood flow (MBF) was quantified using a previously developed model-free algorithm. Hyperemic myocardial O2 extraction fraction (OEF) and consumption (MVO2) were calculated using a previously developed two-compartment model. PET imaging using 11C-acetate and 15O-water was performed on the same day to validate the OEF, MBF, and MVO2 measurements. Results: The T2-prepared mapping sequence measured regional myocardial T2 with a repeatability of 2.3%. On a per-segment basis, MBF measured by MRI correlated closely with that measured by PET (R² = 0.85, n = 22). Similar correlation coefficients were observed for hyperemic OEF (R² = 0.90, n = 9, mean difference PET − MRI = −2.4%) and MVO2 (R² = 0.83, n = 7, mean difference = 4.2%). Conclusion: The T2-prepared imaging method may allow quantitative estimation of regional myocardial oxygenation with relatively good accuracy. The precision of the method remains to be improved. PMID:21274973
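
    As background only (this is the generic Fick relation, not the paper's two-compartment model), myocardial O2 consumption follows from flow and extraction; the values below are illustrative:

      # Fick principle: MVO2 = MBF * OEF * CaO2, with CaO2 the arterial O2 content.
      mbf = 2.5    # mL blood / g tissue / min, hyperemic myocardial blood flow
      oef = 0.60   # myocardial O2 extraction fraction (dimensionless)
      cao2 = 0.20  # mL O2 / mL blood, arterial oxygen content

      mvo2 = mbf * oef * cao2  # mL O2 / g / min
      print(f"MVO2 = {mvo2:.3f} mL O2/g/min")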

  8. Bioremediation of toluene and naphthalene: development and validation of a GC-FID method for their monitoring.

    PubMed

    Bianchi, Federica; Careri, Maria; Mustat, Lara; Malcevschi, Alessio; Musci, Marilena

    2005-01-01

    Bioremediation of toluene and naphthalene in liquid cultures of bacteria grown in the presence of these aromatic compounds as sole carbon sources was investigated by gas chromatography (GC). For this purpose, a method based on GC with flame ionization detection was developed and validated. Validation was carried out in terms of limit of detection (LOD), limit of quantitation (LOQ), linearity, precision, and trueness. For naphthalene, LOD and LOQ values of 0.43 and 0.72 mg kg⁻¹ were achieved. Linearity was established over one order of magnitude in the range of interest, i.e. 10-100 mg kg⁻¹. Excellent precision was obtained both as intra-day repeatability and as between-day precision at two concentration levels (RSD% lower than 0.5%). A recovery of 97.9 +/- 0.2% (n = 3) was calculated by addition of 640 mg kg⁻¹ of naphthalene to the Bushnell & Haas mineral salts basal solution containing the micro-organisms. The findings clearly showed reductions of the naphthalene content of 50% and 75% after two and four weeks of contact with the micro-organisms, whereas lower degradation was observed for toluene. Bioremediation activity was ultimately ascribed to two different microbial populations, Bordetella petrii and Bacillus sphaericus, which survived in the polluted medium. PMID:16235785

  9. A Self-Validation Method for High-Temperature Thermocouples Under Oxidizing Atmospheres

    NASA Astrophysics Data System (ADS)

    Mokdad, S.; Failleau, G.; Deuzé, T.; Briaudeau, S.; Kozlova, O.; Sadli, M.

    2015-08-01

    Thermocouples are prone to significant drift in use, particularly when they are exposed to high temperatures. High-temperature exposure can progressively affect the response of a thermocouple by changing the structure of the thermoelements and inducing inhomogeneities. Moreover, an oxidizing atmosphere contributes to thermocouple drift by altering the chemical nature of the metallic wires through oxidation. In general, severe uncontrolled drift of thermocouples results from these combined influences. Periodic recalibration of the thermocouple can be performed, but it is not always possible to remove the sensor from the process. Self-validation methods for thermocouples avoid this drawback, but contact thermometers with self-validation capability have so far been unavailable at the highest operating temperatures. LNE-Cnam has developed machined, alumina-based fixed-point devices, integrated into the thermocouples, for operation under oxidizing atmospheres. These devices require only small amounts of pure metals (typically less than 2 g) and are suitable for the self-validation of high-temperature thermocouples. In this paper the construction and the characterization of these integrated fixed-point devices are described. The phase-transition plateaus of gold, nickel, and palladium, which together cover the temperature range from the gold point to the palladium point, are assessed with this self-validation technique. Results of measurements performed at LNE-Cnam with the integrated self-validation module at several temperature levels are presented. The performance of the devices is assessed and discussed in terms of robustness and metrological characteristics. Uncertainty budgets are also proposed and detailed.

  10. Quantitative Imaging Methods for the Development and Validation of Brain Biomechanics Models

    PubMed Central

    Bayly, Philip V.; Clayton, Erik H.; Genin, Guy M.

    2013-01-01

    Rapid deformation of brain tissue in response to head impact or acceleration can lead to numerous pathological changes, both immediate and delayed. Modeling and simulation hold promise for illuminating the mechanisms of traumatic brain injury (TBI) and for developing preventive devices and strategies. However, mathematical models have predictive value only if they satisfy two conditions. First, they must capture the biomechanics of the brain as both a material and a structure, including the mechanics of brain tissue and its interactions with the skull. Second, they must be validated by direct comparison with experimental data. Emerging imaging technologies and recent imaging studies provide important data for these purposes. This review describes these techniques and data, with an emphasis on magnetic resonance imaging approaches. In combination, these imaging tools promise to extend our understanding of brain biomechanics and improve our ability to study TBI in silico. PMID:22655600

  11. Validation of rapid assessment methods to determine streamflow duration classes in the Pacific Northwest, USA.

    PubMed

    Nadeau, Tracie-Lynn; Leibowitz, Scott G; Wigington, Parker J; Ebersole, Joseph L; Fritz, Ken M; Coulombe, Robert A; Comeleo, Randy L; Blocksom, Karen A

    2015-07-01

    United States Supreme Court rulings have created uncertainty regarding U.S. Clean Water Act (CWA) authority over certain waters, and established new data and analytical requirements for determining CWA jurisdiction. Thus, rapid assessment methods are needed that can differentiate between ephemeral, intermittent, and perennial streams. We report on the validation of several methods. The first (Interim Method) was developed through best professional judgment (BPJ); an alternative (Revised Method) resulted from statistical analysis. We tested the Interim Method on 178 study reaches in Oregon, and constructed the Revised Method based on statistical analysis of the Oregon data. Next, we evaluated the regional applicability of the methods on 86 study reaches across a variety of hydrologic landscapes in Washington and Idaho. During the second phase, we also compared the Revised Method with a similar approach (Combined Method) based on combined field data from Oregon, Washington, and Idaho. We further compared field-based methods with a GIS-based approach (GIS Method) that used the National Hydrography Dataset and a synthetic stream network. Evaluations of all methods compared results with actual streamflow duration classes. The Revised Method correctly determined known streamflow duration 83.9% of the time, versus 62.3% accuracy of the Interim Method and 43.6% accuracy for the GIS-based approach. The Combined Method did not significantly outperform the Revised Method. Analysis showed biological indicators most accurately discriminate streamflow duration classes. While BPJ established a testable hypothesis, this study illustrates the importance of quantitative field testing of rapid assessment methods. Results support a consistent method applicable across the Pacific Northwest. PMID:25931296

  12. Validation of Rapid Assessment Methods to Determine Streamflow Duration Classes in the Pacific Northwest, USA

    NASA Astrophysics Data System (ADS)

    Nadeau, Tracie-Lynn; Leibowitz, Scott G.; Wigington, Parker J.; Ebersole, Joseph L.; Fritz, Ken M.; Coulombe, Robert A.; Comeleo, Randy L.; Blocksom, Karen A.

    2015-07-01

    United States Supreme Court rulings have created uncertainty regarding U.S. Clean Water Act (CWA) authority over certain waters, and established new data and analytical requirements for determining CWA jurisdiction. Thus, rapid assessment methods are needed that can differentiate between ephemeral, intermittent, and perennial streams. We report on the validation of several methods. The first (Interim Method) was developed through best professional judgment (BPJ); an alternative (Revised Method) resulted from statistical analysis. We tested the Interim Method on 178 study reaches in Oregon, and constructed the Revised Method based on statistical analysis of the Oregon data. Next, we evaluated the regional applicability of the methods on 86 study reaches across a variety of hydrologic landscapes in Washington and Idaho. During the second phase, we also compared the Revised Method with a similar approach (Combined Method) based on combined field data from Oregon, Washington, and Idaho. We further compared field-based methods with a GIS-based approach (GIS Method) that used the National Hydrography Dataset and a synthetic stream network. Evaluations of all methods compared results with actual streamflow duration classes. The Revised Method correctly determined known streamflow duration 83.9 % of the time, versus 62.3 % accuracy of the Interim Method and 43.6 % accuracy for the GIS-based approach. The Combined Method did not significantly outperform the Revised Method. Analysis showed biological indicators most accurately discriminate streamflow duration classes. While BPJ established a testable hypothesis, this study illustrates the importance of quantitative field testing of rapid assessment methods. Results support a consistent method applicable across the Pacific Northwest.

  13. Validation of Modifications to the ANSR® Salmonella Method for Improved Ease of Use.

    PubMed

    Caballero, Oscar; Alles, Susan; Walton, Kayla; Gray, R Lucas; Mozola, Mark; Rice, Jennifer

    2015-01-01

    This paper describes the results of a study to validate minor reagent formulation and procedural changes to the ANSR® Salmonella method, AOAC Performance Tested Method™ 061203. In order to improve ease of use and diminish risk of amplicon contamination, the lyophilized reagent components were reformulated for increased solubility, thus eliminating the need to mix by pipetting. In the alternative procedure, an aliquot of the lysate is added to lyophilized ANSR reagents, immediately capped, and briefly mixed by vortex. Results of the validation study with ice cream, peanut butter, dry dog food, raw ground turkey, raw ground beef, and sponge samples from a stainless steel surface showed no statistically significant differences in performance between the ANSR method and the U.S. Food and Drug Administration Bacteriological Analytical Manual or U.S. Department of Agriculture-Food Safety and Inspection Services Microbiology Laboratory Guidebook reference culture procedures. Results of inclusivity and exclusivity testing were unchanged from those of the original validation study; exclusivity was 100% and inclusivity was 99.1% with only a single strain of Salmonella Weslaco testing negative. Robustness testing was also conducted, with variations to lysis buffer volume, lysis time, and sample volume having no demonstrable effect on assay results. PMID:26086257

  14. Validation of structural analysis methods using burner liner cyclic rig test data

    NASA Technical Reports Server (NTRS)

    Thompson, R.

    1983-01-01

    The objectives of the hot section technology (HOST) burner liner cyclic rig test program are basically threefold: (1) to assist in developing predictive tools needed to improve design analyses and procedures for the efficient and accurate prediction of burner liner structural response; (2) to calibrate, evaluate and validate these predictive tools by comparing the predicted results with the experimental data generated in the tests; and (3) to evaluate existing as well as advanced temperature and strain measurement instrumentation, both contact and noncontact, in a simulated engine cycle environment. The data generated will include measurements of the thermal environment (metal surface temperatures) as well as structural (strain) and life (fatigue) responses of simulated burner liners and specimens under controlled boundary and operating conditions. These data will be used to calibrate, compare and validate analytical theories, methodologies and design procedures, as well as improvements in them, for predicting liner temperatures, stress-strain responses and cycles to failure. Comparison of predicted results with experimental data will be used to show where the predictive theories, etc. need improvements. In addition, as the predictive tools, as well as the tests, test methods, and data acquisition and reduction techniques, are developed and validated, a proven, integrated analysis/experiment method will be developed to determine the cyclic life of a simulated burner liner.

  15. Validation Study of a Method for Assessing Complex Ill-Structured Problem Solving by Using Causal Representations

    ERIC Educational Resources Information Center

    Eseryel, Deniz; Ifenthaler, Dirk; Ge, Xun

    2013-01-01

    The important but little understood problem that motivated this study was the lack of research on valid assessment methods to determine progress in higher-order learning in situations involving complex and ill-structured problems. Without a valid assessment method, little progress can occur in instructional design research with regard to designing…

  16. Update from the Japanese Center for the Validation of Alternative Methods (JaCVAM).

    PubMed

    Kojima, Hajime

    2013-12-01

    The Japanese Center for the Validation of Alternative Methods (JaCVAM) was established in 2005 to promote the use of alternatives to animal testing in regulatory studies, thereby replacing, reducing, or refining the use of animals, according to the Three Rs principles. JaCVAM assesses the utility, limitations and suitability for use in regulatory studies, of test methods needed to determine the safety of chemicals and other materials. JaCVAM also organises and performs validation studies of new test methods, when necessary. In addition, JaCVAM co-operates and collaborates with similar organisations in related fields, both in Japan and internationally, which also enables JaCVAM to provide input during the establishment of guidelines for new alternative experimental methods. These activities help facilitate application and approval processes for the manufacture and sale of pharmaceuticals, chemicals, pesticides, and other products, as well as for revisions to standards for cosmetic products. In this manner, JaCVAM plays a leadership role in the introduction of new alternative experimental methods for regulatory acceptance in Japan. PMID:24512226

  17. Method development validation for corticoids in animal feed samples by liquid chromatography using a monolithic column.

    PubMed

    Muñiz-Valencia, Roberto; Gonzalo-Lumbreras, Raquel; Santos-Montes, Ana; Izquierdo-Hornillos, Roberto

    2007-11-01

    An LC method for the determination of corticosteroids (CC) in poultry feed using a Chromolith column and UV detection has been developed and validated. Method development involved the optimization of different hydro-organic mobile phases, using methanol or ACN as organic modifiers, as well as of the flow rate and the temperature. The optimum separation was achieved at 40 °C using ACN/water (21:79 v/v) as the mobile phase at a 3 mL/min flow rate, allowing baseline separation of four of the seven CC in about 10 min. Prior to LC, a sample preparation procedure previously assayed for anabolics was used; it includes a leaching step, saponification of the fatty acid esters, and SPE. Method validation was carried out according to the EU criteria established for quantitative screening methods. The extraction efficiencies, decision limits (CCα), and detection capabilities (CCβ) for these compounds were in the ranges 86-92%, 27-36 μg/kg, and 33-43 μg/kg, respectively. The repeatability and the within-laboratory reproducibility at the 1, 1.5, and 2 CCβ concentration levels were smaller than 9.0, 5.0, and 4.2% and 9.4, 6.4, and 4.9%, respectively. The CV values of the robustness test were less than 3.8%, and the accuracy was in the range 98-103%. The proposed method was applied to other feeds with satisfactory results.

  18. When is the mode-summation method of calculating van der Waals force valid?

    NASA Astrophysics Data System (ADS)

    Narayanaswamy, Arvind

    2015-03-01

    Most calculations of van der Waals forces and Casimir forces can be categorized as variations of two "proto-methods": (1) the Lifshitz theory, and (2) the mode-summation method. In the Lifshitz theory, by which I include the subsequent generalization by Dzyaloshinskii, Lifshitz, and Pitaevskii [Adv. Phys. 10, 165 (1961); see also Zheng and Narayanaswamy, Phys. Rev. A 83, 042504 (2011)], the dispersion force is expressed in terms of the (dyadic) Green's function of the vector Helmholtz equation. In the mode-summation method [see Casimir, Proc. Kon. Ned. Akad. Wetensch. 51, 793 (1948); Van Kampen, Nijboer, and Schram, Phys. Lett. A 26, 307 (1968)], the free energy of a configuration of objects is expressed as the sum of the free energies of each of the possible electromagnetic modes; the derivative of this free energy with respect to variations of the relative positions of the objects yields the force between them. However, we have raised questions about the validity of the mode-summation method when calculating van der Waals forces in dissipative media [see Narayanaswamy and Zheng, Phys. Rev. A 88, 012502 (2013) and Ninham, Parsegian, and Weiss, J. Stat. Phys. 2, 323 (1970)]. In this talk, I want to start a discussion about the validity of the mode-summation method.
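
    Schematically, in the zero-temperature idealization (notation mine, not from the abstract), the mode-summation prescription is to sum the zero-point energies of the allowed electromagnetic modes and differentiate with respect to the separation d between the objects:

      \[
        E_0(d) \;=\; \frac{\hbar}{2}\sum_n \omega_n(d),
        \qquad
        F(d) \;=\; -\,\frac{\partial E_0(d)}{\partial d}.
      \]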

  19. Validation of a method to directly and specifically measure nitrite in biological matrices.

    PubMed

    Almeida, Luis E F; Kamimura, Sayuri; Kenyon, Nicholas; Khaibullina, Alfia; Wang, Li; de Souza Batista, Celia M; Quezado, Zenaide M N

    2015-02-15

    The bioactivity of nitric oxide (NO) is influenced by chemical species generated through reactions with proteins, lipids, and metals, and by its conversion to nitrite and nitrate. A better understanding of the functions played by each of these species could be achieved by developing selective assays capable of distinguishing nitrite from other NO species. Nagababu and Rifkind developed a method using acetic and ascorbic acids to measure nitrite-derived NO in plasma. Here, we adapted, optimized, and validated this method to assay nitrite in tissues. The method yielded linear measurements over 1-300 pmol of nitrite and was validated for tissue preserved in a nitrite stabilization solution composed of potassium ferricyanide, N-ethylmaleimide, and NP-40. When samples were processed with chloroform, but not with methanol, ethanol, acetic acid, or acetonitrile, reliable and reproducible nitrite measurements in up to 20 sample replicates were obtained. The method's accuracy was ≈90% in tissue and 99.9% in plasma. In mice, under basal conditions, brain, heart, lung, liver, spleen, and kidney cortex had similar nitrite levels. In addition, nitrite tissue levels were similar regardless of when the organs were processed: immediately upon collection, kept in stabilization solution for later analysis, or frozen and processed later. After intraperitoneal nitrite injections, rapidly changing nitrite concentrations in tissue and plasma could be measured and were shown to change in significantly distinct patterns. This validated method could be valuable for investigations of nitrite biology in conditions such as sickle cell disease, cardiovascular disease, and diabetes, where nitrite is thought to play a role. PMID:25445633

  20. Glycol ethers--validation procedures for tube/pump and dosimeter monitoring methods.

    PubMed

    Langhorst, M L

    1984-06-01

    Methods were developed and validated for personal monitoring of exposure to airborne glycol ethers, for both short-term and long-term time-weighted averages. Either a 600 mg charcoal tube or a 780 mg silica gel tube is recommended for monitoring nine glycol ethers, depending upon the humidity and the other organic compounds to be monitored. The charcoal tube allows maximum sensitivity and is unaffected by high-humidity conditions. Two-phase solvent desorption with CS2 and water allows aqueous-phase recovery of DOWANOL EM, PM, EE, DM, DPM, and TM glycol ethers. DOWANOL EB, DB, and TPM glycol ethers partition between the two layers, necessitating chromatographic analysis of both. The silica gel tube method can be used to monitor all nine glycol ethers tested, but is affected by high humidity, which causes significant breakthrough of the more volatile glycol ethers. The 3M organic vapor monitor can accurately and conveniently determine exposure concentrations for DOWANOL EM, EE, and PM glycol ethers, but its sensitivity may be inadequate for sampling periods of less than one hour. These methods were validated at levels down to 0.1 times the Dow internal exposure guidelines, where such guidelines exist, and well above the current ACGIH and OSHA guidelines. This paper also illustrates validation procedures for tube/pump and dosimeter methods, allowing good definition of method accuracy and precision. Some screening experiments for diffusional dosimeters are described, to check the most important parameters in a minimum of time. This methodology will allow assessment of human airborne exposures in light of the new toxicology data available from animal studies. PMID:6331145

  1. The development and validation of a single SNaPshot multiplex for tiger species and subspecies identification--implications for forensic purposes.

    PubMed

    Kitpipit, Thitika; Tobe, Shanan S; Kitchener, Andrew C; Gill, Peter; Linacre, Adrian

    2012-03-01

    The tiger (Panthera tigris) is currently listed on Appendix I of the Convention on the International Trade in Endangered Species of Wild Fauna and Flora; this affords it the highest level of international protection. To aid in the investigation of alleged illegal trade in tiger body parts and derivatives, molecular approaches have been developed to identify biological material as being of tiger in origin. Some countries also require knowledge of the exact tiger subspecies present in order to prosecute anyone alleged to be trading in tiger products. In this study we aimed to develop and validate a reliable single assay to identify tiger species and subspecies simultaneously; this test is based on identification of single nucleotide polymorphisms (SNPs) within the tiger mitochondrial genome. The mitochondrial DNA sequences from four of the five extant putative tiger subspecies that currently exist in the wild were obtained and combined with DNA sequence data from 492 tiger and 349 other mammalian species available on GenBank. From the sequence data a total of 11 SNP loci were identified as suitable for further analyses. Five SNPs were species-specific for tiger and six amplify one of the tiger subspecies-specific SNPs, three of which were specific to P. t. sumatrae and the other three were specific to P. t. tigris. The multiplex assay was able to reliably identify 15 voucher tiger samples. The sensitivity of the test was 15,000 mitochondrial DNA copies (approximately 0.26 pg), indicating that it will work on trace amounts of tissue, bone or hair samples. This simple test will add to the DNA-based methods currently being used to identify the presence of tiger within mixed samples. PMID:21723800

  2. Improved computational neutronics methods and validation protocols for the advanced test reactor

    SciTech Connect

    Nigg, D. W.; Nielsen, J. W.; Chase, B. M.; Murray, R. K.; Steuhm, K. A.; Unruh, T.

    2012-07-01

    The Idaho National Laboratory (INL) is in the process of updating the various reactor physics modeling and simulation tools used to support operation and safety assurance of the Advanced Test Reactor (ATR). Key accomplishments so far have encompassed both computational and experimental work. A new suite of stochastic and deterministic transport-theory-based reactor physics codes and their supporting nuclear data libraries (HELIOS, KENO6/SCALE, NEWT/SCALE, ATTILA, and an extended implementation of MCNP5) has been installed at the INL. Corresponding models of the ATR and ATRC are now operational with all five codes, demonstrating the basic feasibility of the new code packages for their intended purposes. On the experimental side of the project, new hardware was fabricated, measurement protocols were finalized, and the first four of six planned physics code validation experiments based on neutron activation spectrometry have been conducted at the ATRC facility. Data analysis for the first three experiments, focused on characterization of the neutron spectrum in one of the ATR flux traps, has been completed. The six experiments will ultimately form the basis for flexible and repeatable ATR physics code validation protocols that are consistent with applicable national standards. (authors)

  4. Ultrasound measurements of segmental temperature distribution in solids: Method and its high-temperature validation.

    PubMed

    Jia, Yunlu; Chernyshev, Vasiliy; Skliar, Mikhail

    2016-03-01

    A novel approach that uses noninvasive ultrasound to measure the temperature distribution in solid materials is described and validated in high-temperature laboratory experiments. The approach utilizes an ultrasound propagation path with naturally occurring or purposefully introduced echogenic features that partially redirect the energy of an ultrasound excitation pulse back to the transducer, resulting in a train of echoes. Their time of flight depends on the velocity of ultrasound propagation, which changes with the temperature distribution in different segments of the propagation path. Segmental temperature distributions are reconstructed under several parameterizations, including piecewise-constant, piecewise-linear, and a parameterization that requires the estimated temperature profile to satisfy an appropriate heat conduction model. The experimental validation of the proposed approach with an alumina sample shows that even with simple parameterizations, the temperature profile is correctly captured with an accuracy that may be comparable to that of traditional pointwise sensors. The advantages of the approach are discussed, including its suitability for real-time and nondestructive temperature measurements in extreme environments and locations inaccessible to traditional insertion sensors. PMID:26678789
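
    Under a piecewise-constant parameterization, each segment's temperature follows directly from the round-trip time bracketed by its echo pair: for a segment of length L, the time spent in the segment is dt = 2L/v(T), and a velocity-temperature calibration v(T) can be inverted for T. A minimal sketch, assuming a linear calibration; segment lengths, echo times and calibration constants are made-up values, not the paper's data:

        # Piecewise-constant segmental temperatures from echo times of flight (TOF).
        # Calibration and geometry values are illustrative, not from the paper.
        v0, a, T0 = 10000.0, -0.6, 20.0   # v(T) = v0 + a*(T - T0): m/s, m/s per degC, degC

        segment_lengths = [0.02, 0.02, 0.03]         # m, between echogenic features
        echo_times = [4.02e-6, 8.10e-6, 14.35e-6]    # s, cumulative round-trip TOF per echo

        prev = 0.0
        for L, t in zip(segment_lengths, echo_times):
            dt = t - prev                # round-trip time spent inside this segment
            v = 2.0 * L / dt             # mean sound speed in the segment
            T = T0 + (v - v0) / a        # invert the linear calibration
            print("segment: v = %7.1f m/s -> T = %6.1f degC" % (v, T))
            prev = t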

  5. IMPROVED COMPUTATIONAL NEUTRONICS METHODS AND VALIDATION PROTOCOLS FOR THE ADVANCED TEST REACTOR

    SciTech Connect

    David W. Nigg; Joseph W. Nielsen; Benjamin M. Chase; Ronnie K. Murray; Kevin A. Steuhm

    2012-04-01

    The Idaho National Laboratory (INL) is in the process of modernizing the various reactor physics modeling and simulation tools used to support operation and safety assurance of the Advanced Test Reactor (ATR). Key accomplishments so far have encompassed both computational and experimental work. A new suite of stochastic and deterministic transport-theory-based reactor physics codes and their supporting nuclear data libraries (HELIOS, KENO6/SCALE, NEWT/SCALE, ATTILA, and an extended implementation of MCNP5) has been installed at the INL. Corresponding models of the ATR and ATRC are now operational with all five codes, demonstrating the basic feasibility of the new code packages for their intended purpose. Of particular importance, a set of as-run core depletion HELIOS calculations for all ATR cycles since August 2009 was successfully completed during 2011. This demonstration supported a decision late in the year to proceed with the phased incorporation of the HELIOS methodology into the ATR fuel cycle management process beginning in 2012. On the experimental side of the project, new hardware was fabricated, measurement protocols were finalized, and the first four of six planned physics code validation experiments based on neutron activation spectrometry were conducted at the ATRC facility. Data analysis for the first three experiments, focused on characterization of the neutron spectrum in one of the ATR flux traps, has been completed. The six experiments will ultimately form the basis for a flexible, easily repeatable ATR physics code validation protocol that is consistent with applicable ASTM standards.

  6. Validation and evaluation of the advanced aeronautical CFD system SAUNA: A method developer's view

    NASA Astrophysics Data System (ADS)

    Shaw, J. A.; Peace, A. J.; Georgala, J. M.; Childs, P. N.

    1993-09-01

    This paper is concerned with a detailed validation and evaluation of the SAUNA CFD system for complex aircraft configurations. The methodology of the complete system is described in brief, including its unique use of differing grid generation strategies (structured, unstructured or both) depending on the geometric complexity of the configuration. A wide range of configurations and flow conditions is chosen for the validation and evaluation exercise to demonstrate the scope of SAUNA. A detailed description of the results from the method is preceded by a discussion of the philosophy behind the strategy followed in the exercise, in terms of quality assessment and the differing roles of the code developer and the code user. It is considered that SAUNA has grown into a highly usable tool for the aircraft designer, combining flexibility and accuracy in an efficient manner.

  7. Modification and Validation of Conceptual Design Aerodynamic Prediction Method HASC95 With VTXCHN

    NASA Technical Reports Server (NTRS)

    Albright, Alan E.; Dixon, Charles J.; Hegedus, Martin C.

    1996-01-01

    A conceptual/preliminary design level subsonic aerodynamic prediction code HASC (High Angle of Attack Stability and Control) has been improved in several areas, validated, and documented. The improved code includes improved methodologies for increased accuracy and robustness, and simplified input/output files. An engineering method called VTXCHN (Vortex Chine) for predicting nose vortex shedding from circular and non-circular forebodies with sharp chine edges has been improved and integrated into the HASC code. This report contains a summary of modifications, a description of the code, a user's guide, and a validation of HASC. Appendices include a discussion of a new HASC utility code, listings of sample input and output files, and a discussion of the application of HASC to buffet analysis.

  8. Validation of the Delvotest SP NT. Performance Tested Methods(SM) 011102.

    PubMed

    Hennart, Stephen L A; Faragher, John

    2012-01-01

    Delvotest SP NT is designed to test milk for the presence of antibacterial substances such as antibiotics. The test consists of an agar gel containing bacterial spores and a pH indicator. The milk sample is added onto the agar gel, and the test is incubated at 64 °C. The principle of the test is based on the diffusion into the agar of any inhibitory substances present in the milk sample. This reduces growth and acid production by the test organism and delays or prevents the agar from changing color from purple to yellow. The present report includes all technical details of the Delvotest SP NT and the results of the validation study, which demonstrates that the test conforms to the product performance claims and confirms its robustness. The Delvotest SP NT is therefore granted Performance Tested Method(SM) certification. PMID:23451401

  9. Reliability and validity of an ultrasound-based imaging method for measuring interspinous process distance in the lumbar spine using two different index points.

    PubMed

    Tozawa, Ryosuke; Katoh, Munenori; Aramaki, Hidefumi; Kumamoto, Tsuneo; Fujinawa, Osamu

    2015-07-01

    [Purpose] This study assessed the reliability and validity of an ultrasound-based imaging method for measuring the interspinous process distance in the lumbar spine using two different index points. [Subjects and Methods] Ten healthy males were recruited. Five physical therapy students participated in this study as examiners. The L2-L3 interspinous distance was measured from the caudal end of the L2 spinous process to the cranial end of the L3 spinous process (E-E measurement) and from the top of the L2 spinous process to the top of the L3 spinous process (T-T measurement). Intraclass correlation coefficients were calculated to estimate the relative reliability. Validity was assessed using a model resembling the living human body. [Results] The reliability study showed no difference in intra-rater reliability between the two measurements. However, the E-E measurement showed higher inter-rater reliability than the T-T measurement (Intraclass correlation coefficients: 0.914 vs. 0.725). Moreover, the E-E measurement method had good validity (Intraclass correlation coefficients: 0.999 and 95% confidence interval for minimal detectable change: 0.29 mm). [Conclusion] These results demonstrate the high reliability and validity of ultrasound-based imaging in the quantitative assessment of lumbar interspinous process distance. Of the two methods, the E-E measurement method is recommended.

  10. Validation and Recommendation of Methods to Measure Biogas Production Potential of Animal Manure

    PubMed Central

    Pham, C. H.; Triolo, J. M.; Cu, T. T. T.; Pedersen, L.; Sommer, S. G.

    2013-01-01

    In developing countries, biogas energy production is seen as a technology that can provide clean energy in poor regions and reduce pollution caused by animal manure. Laboratories in these countries have little access to advanced gas measuring equipment, which may limit research aimed at improving locally adapted biogas production. They may also be unable to produce valid estimates to an international standard that can be used in articles published in international peer-reviewed science journals. This study tested and validated methods for measuring total biogas and methane (CH4) production using batch fermentation and for characterizing the biomass. The biochemical methane potential (BMP; NL CH4 kg−1 VS) of pig manure, cow manure and cellulose determined with the Møller and VDI methods was not significantly different in this test (p>0.05). The biodegradability, expressed as the ratio of BMP to theoretical BMP (TBMP), was slightly higher using the Hansen method, but differences were not significant. Degradation rate assessed by methane formation rate showed wide variation among the batch methods tested. The first-order kinetic constant k for the cumulative methane production curve was highest when the two animal manures were fermented using the VDI 4630 method, indicating that this method reached steady conditions in a shorter time, reducing fermentation duration. In precision tests, the relative standard deviation of repeatability (RSDr) for all batch methods was very low (4.8 to 8.1%), while the relative standard deviation of reproducibility (RSDR) varied widely, from 7.3 to 19.8%. In determination of biomethane concentration, the values obtained using the liquid replacement method (LRM) were comparable to those obtained using gas chromatography (GC). This indicates that the LRM could be used to determine biomethane concentration in biogas in laboratories with limited access to GC. PMID:25049861
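
    The first-order kinetic constant mentioned above comes from fitting the cumulative methane curve to B(t) = B0(1 - exp(-kt)). A minimal sketch with synthetic data points, not values from the study:

        # Fit the first-order model B(t) = B0 * (1 - exp(-k t)) to a cumulative
        # methane curve; the data points below are synthetic, for illustration only.
        import numpy as np
        from scipy.optimize import curve_fit

        def first_order(t, B0, k):
            return B0 * (1.0 - np.exp(-k * t))

        t_days = np.array([0, 2, 5, 10, 15, 20, 30, 40])          # incubation time, days
        b_meas = np.array([0, 60, 130, 210, 255, 280, 300, 308])  # NL CH4 per kg VS

        (B0, k), _ = curve_fit(first_order, t_days, b_meas, p0=(300.0, 0.1))
        print("BMP estimate B0 = %.0f NL/kg VS, rate constant k = %.3f 1/day" % (B0, k))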

  11. Validated spectrophotometric methods for determination of sodium valproate based on charge transfer complexation reactions

    NASA Astrophysics Data System (ADS)

    Belal, Tarek S.; El-Kafrawy, Dina S.; Mahrous, Mohamed S.; Abdel-Khalek, Magdi M.; Abo-Gharam, Amira H.

    2016-02-01

    This work presents the development, validation and application of four simple and direct spectrophotometric methods for determination of sodium valproate (VP) through charge transfer complexation reactions. The first method is based on the reaction of the drug with p-chloranilic acid (p-CA) in acetone to give a purple colored product with maximum absorbance at 524 nm. The second method depends on the reaction of VP with dichlone (DC) in dimethylformamide, forming a reddish orange product measured at 490 nm. The third method is based upon the interaction of VP and picric acid (PA) in chloroform, resulting in the formation of a yellow complex measured at 415 nm. The fourth method involves the formation of a yellow complex peaking at 361 nm upon the reaction of the drug with iodine in chloroform. Experimental conditions affecting the color development were studied and optimized. Stoichiometry of the reactions was determined. The proposed spectrophotometric procedures were effectively validated with respect to linearity, ranges, precision, accuracy, specificity, robustness, and detection and quantification limits. Calibration curves of the formed color products with p-CA, DC, PA and iodine showed good linear relationships over the concentration ranges 24-144, 40-200, 2-20 and 1-8 μg/mL, respectively. The proposed methods were successfully applied to the assay of sodium valproate in tablets and oral solution dosage forms with good accuracy and precision. Assay results were statistically compared to a reference pharmacopoeial HPLC method; no significant differences were observed between the proposed methods and the reference method.

  12. Validation of the Mass-Extraction-Window for Quantitative Methods Using Liquid Chromatography High Resolution Mass Spectrometry.

    PubMed

    Glauser, Gaétan; Grund, Baptiste; Gassner, Anne-Laure; Menin, Laure; Henry, Hugues; Bromirski, Maciej; Schütz, Frédéric; McMullen, Justin; Rochat, Bertrand

    2016-03-15

    A paradigm shift is underway in the field of quantitative liquid chromatography-mass spectrometry (LC-MS) analysis thanks to the arrival of recent high-resolution mass spectrometers (HRMS). The capability of HRMS to perform sensitive and reliable quantifications of a large variety of analytes in HR-full scan mode is showing that it is now realistic to perform quantitative and qualitative analysis with the same instrument. Moreover, HR-full scan acquisition offers a global view of sample extracts and allows retrospective investigations as virtually all ionized compounds are detected with a high sensitivity. In time, the versatility of HRMS together with the increasing need for relative quantification of hundreds of endogenous metabolites should promote a shift from triple-quadrupole MS to HRMS. However, a current "pitfall" in quantitative LC-HRMS analysis is the lack of HRMS-specific guidance for validated quantitative analyses. Indeed, false positive and false negative HRMS detections are rare, albeit possible, if inadequate parameters are used. Here, we investigated two key parameters for the validation of LC-HRMS quantitative analyses: the mass accuracy (MA) and the mass-extraction-window (MEW) that is used to construct the extracted-ion-chromatograms. We propose MA-parameters, graphs, and equations to calculate rational MEW width for the validation of quantitative LC-HRMS methods. MA measurements were performed on four different LC-HRMS platforms. Experimentally determined MEW values ranged between 5.6 and 16.5 ppm and depended on the HRMS platform, its working environment, the calibration procedure, and the analyte considered. The proposed procedure provides a fit-for-purpose MEW determination and prevents false detections.
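
    As a concrete reading of the MEW parameter: a window of w ppm centred on the theoretical m/z keeps ions within plus or minus w/2 ppm when building the extracted-ion chromatogram. A minimal sketch; the m/z values are hypothetical:

        # Build an extracted-ion-chromatogram mass window from a ppm-based MEW.
        def mew_window(mz_theoretical, mew_ppm):
            half = mz_theoretical * mew_ppm / 2e6
            return mz_theoretical - half, mz_theoretical + half

        lo, hi = mew_window(524.2648, 10.0)   # hypothetical analyte m/z, 10 ppm MEW
        print("XIC window: %.4f - %.4f" % (lo, hi))

        # Filtering centroided scan data (list of (mz, intensity)) to the window:
        scan = [(524.2611, 1.2e5), (524.2655, 9.8e5), (525.2680, 2.4e5)]
        xic_intensity = sum(i for mz, i in scan if lo <= mz <= hi)
        print("summed intensity in window: %.2e" % xic_intensity)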

  13. Chemometric approach to open validation protocols: Prediction of validation parameters in multi-residue ultra-high performance liquid chromatography-tandem mass spectrometry methods.

    PubMed

    Alladio, Eugenio; Pirro, Valentina; Salomone, Alberto; Vincenti, Marco; Leardi, Riccardo

    2015-06-01

    The recent technological advancements of liquid chromatography-tandem mass spectrometry allow the simultaneous determination of tens, or even hundreds, of target analytes. In such cases, the traditional approach to quantitative method validation presents three major drawbacks: (i) it is extremely laborious, repetitive and rigid; (ii) it does not allow the introduction of new target analytes without restarting the validation from the very beginning; and (iii) it is performed on spiked blank matrices, whose very nature is significantly modified by the addition of a large number of spiking substances, especially at high concentration. In the present study, several predictive chemometric models were developed from closed sets of analytes in order to estimate validation parameters for molecules of the same class not included in the original training set. Retention time, matrix effect, recovery, and detection and quantification limits were predicted with the partial least squares regression method. In particular, iterative stepwise elimination, iterative predictor weighting and genetic algorithm approaches were used and compared to achieve effective variable selection. These procedures were applied to data reported in our previously validated ultra-high performance liquid chromatography-tandem mass spectrometry multi-residue method for the determination of pharmaceutical and illicit drugs in oral fluid samples in accordance with national and international guidelines. The partial least squares model was then successfully tested on naloxone and lormetazepam, in order to introduce these new compounds into the validated oral fluid method, which adopts reverse-phase chromatography. Retention time, matrix effect, recovery, limit of detection and limit of quantification for naloxone and lormetazepam were predicted by the model and then positively compared with their corresponding experimental values. The whole study represents a proof-of-concept of chemometrics potential to
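
    The core computational step, regressing validation parameters on descriptors of already-validated analytes and then predicting them for a new compound, can be sketched as follows. The descriptors, targets and dimensions here are synthetic stand-ins, not the published model:

        # Sketch of the PLS idea: predict validation parameters for a new analyte
        # from molecular descriptors of already-validated analytes. All numbers
        # are synthetic; real use needs the descriptors and the variable
        # selection schemes described above.
        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(0)
        X_train = rng.normal(size=(20, 8))   # descriptors for 20 validated analytes
        # targets: retention time, matrix effect %, recovery %, LOD, LOQ
        Y_train = rng.normal(loc=[5, 90, 85, 0.5, 1.5],
                             scale=[1, 5, 5, 0.1, 0.3], size=(20, 5))

        pls = PLSRegression(n_components=3)
        pls.fit(X_train, Y_train)

        x_new = rng.normal(size=(1, 8))      # e.g., descriptors of a new compound
        rt, me, rec, lod, loq = pls.predict(x_new)[0]
        print("predicted RT=%.2f min, ME=%.0f%%, recovery=%.0f%%, LOD=%.2f, LOQ=%.2f"
              % (rt, me, rec, lod, loq))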

  14. Pressurised liquid extraction of flavonoids in onions. Method development and validation.

    PubMed

    Søltoft, Malene; Christensen, Jan H; Nielsen, John; Knuthsen, Pia

    2009-11-15

    A rapid and reliable analytical method for quantification of flavonoids in onions was developed and validated. Five extraction methods were tested on freeze-dried onions, and high performance liquid chromatography (HPLC) with UV detection was subsequently used for quantification of seven flavonoids. The extraction efficiency was lowest for the conventional water bath extraction compared to pressurized liquid extraction (PLE), ultrasonication, ultrasonic liquid processor, and microwave extraction, which yielded similar efficiencies. The reproducibility was in the same range (RSD: 1-11%) for all tested extraction methods. However, PLE was the preferred extraction method because it can be highly automated, uses only small amounts of solvent, provides the cleanest extracts, and allows the extraction of light- and oxygen-sensitive flavonoids to be carried out in an inert atmosphere protected from light. The method parameters (extraction temperature, sample weight, flush volume and solvent type) were optimised, and a clean-up step was integrated into the PLE procedure by in-cell addition of C18 material to the extraction cells, which slightly improved the recovery and reproducibility of the method. The one-step PLE method showed good selectivity, precision (RSDs = 3.1-11%) and recovery of the extractable flavonoids (98-99%). The method also appeared to be a multi-method, i.e. generally applicable to related analytes, e.g. phenolic acids in potatoes and carrots. PMID:19782226

  15. Validating a nondestructive optical method for apportioning colored particulate matter into black carbon and additional components

    PubMed Central

    Yan, Beizhan; Kennedy, Daniel; Miller, Rachel L.; Cowin, James P.; Jung, Kyung-hwa; Perzanowski, Matt; Balletta, Marco; Perera, Federica P.; Kinney, Patrick L.; Chillrud, Steven N.

    2011-01-01

    Exposure to black carbon (BC) is associated with a variety of adverse health outcomes. A number of optical methods for estimating BC on Teflon filters have been adopted, but most assume all light absorption is due to BC even though other sources of colored particulate matter exist. Recently, a four-wavelength optical reflectance measurement for distinguishing second-hand cigarette smoke (SHS) from soot-BC was developed (Brook et al., 2010; Lawless et al., 2004). However, the method has not been validated for either soot-BC or SHS, and little work has been done on the methodological issues of optical reflectance measurements for samples that could contain SHS, BC, and other colored particles. We refined this method using a lab-modified integrating sphere with absorption measured continuously from 350 nm to 1000 nm. Furthermore, we characterized the absorption spectra of additional components of particulate matter (PM) on PM2.5 filters, including ammonium sulfate, hematite, goethite, and magnetite. Finally, we validated this method for BC by comparison to other standard methods. Synthesized data indicate that it is important to optimize the choice of wavelengths to minimize computational errors as additional components (more than 2) are added to the apportionment model of colored components. We found that substantial errors are introduced when the 4 wavelengths suggested by Lawless et al. are used to quantify four substances, while an optimized choice of wavelengths can reduce model-derived error from over 10% to less than 2%. For environmental samples, the method was sensitive for estimating airborne levels of BC and SHS, but not mass loadings of iron oxides and sulfate. Duplicate samples collected in NYC show high reproducibility (points consistent with a 1:1 line, R2 = 0.95). BC data measured by this method were consistent with those measured by other optical methods, including the Aethalometer and the Smoke-stain Reflectometer (SSR); although the SSR loses sensitivity at
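
    The apportionment model above is linear: measured absorbance at each wavelength is a weighted sum of the components' unit-mass absorption spectra, so the loadings follow from least squares, and the conditioning of that linear system is what makes the wavelength choice matter. A minimal sketch with synthetic component spectra, not the measured ones:

        # Multi-wavelength apportionment by linear least squares.
        # Component spectra below are crude synthetic placeholders.
        import numpy as np

        wavelengths = np.arange(350, 1001, 10.0)             # nm
        bc  = np.exp(-wavelengths / 2000.0)                  # soot-BC: weak wavelength dependence
        shs = np.exp(-wavelengths / 350.0)                   # SHS: steeper "brown" spectrum
        fe  = np.where(wavelengths < 550, 0.8, 0.1)          # crude iron-oxide absorption edge
        E = np.column_stack([bc, shs, fe])                   # unit-mass absorption matrix

        true_loadings = np.array([2.0, 1.0, 0.3])            # ug per filter (synthetic)
        noise = np.random.default_rng(1).normal(0, 0.005, len(wavelengths))
        measured = E @ true_loadings + noise

        est, *_ = np.linalg.lstsq(E, measured, rcond=None)
        print("estimated loadings:", np.round(est, 3))       # ~ [2.0, 1.0, 0.3]

    Using the full 350-1000 nm range (or an optimized wavelength subset) keeps the matrix well-conditioned, which is the error effect the abstract attributes to the choice of wavelengths.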

  16. Validating a nondestructive optical method for apportioning colored particulate matter into black carbon and additional components

    NASA Astrophysics Data System (ADS)

    Yan, Beizhan; Kennedy, Daniel; Miller, Rachel L.; Cowin, James P.; Jung, Kyung-hwa; Perzanowski, Matt; Balletta, Marco; Perera, Federica P.; Kinney, Patrick L.; Chillrud, Steven N.

    2011-12-01

    Exposure to black carbon (BC) is associated with a variety of adverse health outcomes. A number of optical methods for estimating BC on Teflon filters have been adopted, but most assume all light absorption is due to BC even though other sources of colored particulate matter exist. Recently, a four-wavelength optical reflectance measurement for distinguishing second-hand cigarette smoke (SHS) from soot-BC was developed (Brook et al., 2010; Lawless et al., 2004). However, the method has not been validated for either soot-BC or SHS, and little work has been done on the methodological issues of optical reflectance measurements for samples that could contain SHS, BC, and other colored particles. We refined this method using a lab-modified integrating sphere with absorption measured continuously from 350 nm to 1000 nm. Furthermore, we characterized the absorption spectra of additional components of particulate matter (PM) on PM2.5 filters, including ammonium sulfate, hematite, goethite, and magnetite. Finally, we validated this method for BC by comparison to other standard methods. Synthesized data indicate that it is important to optimize the choice of wavelengths to minimize computational errors as additional components (more than 2) are added to the apportionment model of colored components. We found that substantial errors are introduced when the 4 wavelengths suggested by Lawless et al. are used to quantify four substances, while an optimized choice of wavelengths can reduce model-derived error from over 10% to less than 2%. For environmental samples, the method was sensitive for estimating airborne levels of BC and SHS, but not mass loadings of iron oxides and sulfate. Duplicate samples collected in NYC show high reproducibility (points consistent with a 1:1 line, R2 = 0.95). BC data measured by this method were consistent with those measured by other optical methods, including the Aethalometer and the Smoke-stain Reflectometer (SSR); although the SSR loses sensitivity at

  17. IEEE/NASA Workshop on Leveraging Applications of Formal Methods, Verification, and Validation

    NASA Technical Reports Server (NTRS)

    Margaria, Tiziana (Editor); Steffen, Bernhard (Editor); Hinchey, Michael G.

    2005-01-01

    This volume contains the Preliminary Proceedings of the 2005 IEEE ISoLA Workshop on Leveraging Applications of Formal Methods, Verification, and Validation, with a special track on the theme of Formal Methods in Human and Robotic Space Exploration. The workshop was held on 23-24 September 2005 at the Loyola College Graduate Center, Columbia, MD, USA. The idea behind the workshop arose from the experience and feedback of ISoLA 2004, the 1st International Symposium on Leveraging Applications of Formal Methods, held in Paphos (Cyprus) the previous October-November. ISoLA 2004 served the need for a forum where developers, users, and researchers could discuss issues related to the adoption and use of rigorous tools and methods for the specification, analysis, verification, certification, construction, test, and maintenance of systems from the point of view of their different application domains.

  18. Validated spectrophotometric methods for simultaneous determination of Omeprazole, Tinidazole and Doxycycline in their ternary mixture

    NASA Astrophysics Data System (ADS)

    Lotfy, Hayam M.; Hegazy, Maha A.; Mowaka, Shereen; Mohamed, Ekram Hany

    2016-01-01

    A comparative study of smart spectrophotometric techniques for the simultaneous determination of Omeprazole (OMP), Tinidazole (TIN) and Doxycycline (DOX) without prior separation steps is presented. These techniques consist of several consecutive steps utilizing zero-order, ratio and/or derivative spectra. The proposed techniques comprise nine different simple methods, namely direct spectrophotometry, dual wavelength, first derivative-zero crossing, amplitude factor, spectrum subtraction, ratio subtraction, derivative ratio-zero crossing, constant center, and successive derivative ratio. The calibration graphs are linear over the concentration ranges of 1-20 μg/mL, 5-40 μg/mL and 2-30 μg/mL for OMP, TIN and DOX, respectively. The methods were tested by analyzing synthetic mixtures of the above drugs and successfully applied to a commercial pharmaceutical preparation. The methods were validated according to the ICH guidelines; accuracy, precision, and repeatability were found to be within the acceptable limits.

  19. Validated spectrofluorometric method for determination of gemfibrozil in self nanoemulsifying drug delivery systems (SNEDDS)

    NASA Astrophysics Data System (ADS)

    Sierra Villar, Ana M.; Calpena Campmany, Ana C.; Bellowa, Lyda Halbaut; Trenchs, Monserrat Aróztegui; Naveros, Beatriz Clares

    2013-09-01

    A spectrofluorometric method has been developed and validated for the determination of gemfibrozil. The method is based on the native fluorescence of gemfibrozil, with excitation and emission wavelengths of 276 and 304 nm, respectively. The method allows the determination of the drug in a self-nanoemulsifying drug delivery system (SNEDDS) designed to improve its intestinal absorption. The results showed linear relationships with good correlation coefficients (r2 > 0.999) and low limits of detection and quantification (LOD of 0.075 μg mL-1 and LOQ of 0.226 μg mL-1) in the range of 0.2-5 μg mL-1; the method also showed good robustness and stability. The amounts of gemfibrozil released from SNEDDS contained in gastro-resistant hard gelatine capsules were thus analysed, and release studies could be performed satisfactorily.

  20. Validated spectrophotometric methods for simultaneous determination of Omeprazole, Tinidazole and Doxycycline in their ternary mixture.

    PubMed

    Lotfy, Hayam M; Hegazy, Maha A; Mowaka, Shereen; Mohamed, Ekram Hany

    2016-01-15

    A comparative study of smart spectrophotometric techniques for the simultaneous determination of Omeprazole (OMP), Tinidazole (TIN) and Doxycycline (DOX) without prior separation steps is presented. These techniques consist of several consecutive steps utilizing zero-order, ratio and/or derivative spectra. The proposed techniques comprise nine different simple methods, namely direct spectrophotometry, dual wavelength, first derivative-zero crossing, amplitude factor, spectrum subtraction, ratio subtraction, derivative ratio-zero crossing, constant center, and successive derivative ratio. The calibration graphs are linear over the concentration ranges of 1-20 μg/mL, 5-40 μg/mL and 2-30 μg/mL for OMP, TIN and DOX, respectively. The methods were tested by analyzing synthetic mixtures of the above drugs and successfully applied to a commercial pharmaceutical preparation. The methods were validated according to the ICH guidelines; accuracy, precision, and repeatability were found to be within the acceptable limits. PMID:26322842

  1. [Development and validation of method for the determination of cynarin, luteolin in plasma].

    PubMed

    Kulza, Maksymilian; Malinowska, Katarzyna; Woźniak, Anna; Seńczuk-Przybyłowska, Monika; Nowak, Gerard; Florek, Ewa

    2012-01-01

    The aim of this study was to develop and validate a method for the determination of cynarin and luteolin, the main constituents of artichoke (Cynara scolymus L.) leaf extract, in plasma. The compounds were separated using high-performance liquid chromatography with diode array detection (HPLC-DAD). The analysis was preceded by liquid-liquid extraction with ethyl acetate as the extracting agent. The HPLC separation was performed on a C18 column under gradient conditions using a mobile phase of 0.05% trifluoroacetic acid in water and methanol. The detector was set at λ = 330 nm. The validation covered linearity, sensitivity (LOD and LOQ), accuracy and repeatability. Linearity was achieved within the concentration range 1.5625-50.0 μg/cm3 for cynarin (R2 = 0.9989) and 1.5625-200.0 μg/cm3 for luteolin (R2 = 0.998). The limits of detection for cynarin and luteolin were 0.75 μg/cm3 and 0.1 μg/cm3, and the limits of quantification 2.25 μg/cm3 and 0.2 μg/cm3, respectively. Coefficients of variation for inter-day and intra-day analyses, the precision and accuracy parameters, did not exceed 10%. Recovery was 67% for cynarin and 96% for luteolin. The practical application of the method was demonstrated by analysis of plasma samples from rats. The animals were administered artichoke leaf extract, orally and intraperitoneally, at a dose of 3 g/kg body weight, or the pure substances intraperitoneally at doses of 1 mg/kg (luteolin) and 0.5 mg/kg (cynarin). The presence of the investigated compounds was demonstrated only in samples taken after intraperitoneal administration of the pure substances. The developed method can thus be used to determine cynarin and luteolin simultaneously after intraperitoneal administration of the pure compounds. PMID:23421076
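
    A generic way to reproduce this kind of linearity and sensitivity check is the ICH calibration-based route (not necessarily the exact procedure used here): fit the calibration line, report R2, and take LOD = 3.3·s/b and LOQ = 10·s/b from the residual standard deviation s and slope b. The data points below are synthetic:

        # Generic ICH-style calibration check; data points are synthetic.
        import numpy as np

        conc = np.array([1.5625, 3.125, 6.25, 12.5, 25.0, 50.0])   # ug/cm3
        area = np.array([4.1, 8.0, 16.3, 32.0, 64.9, 129.2])       # detector response

        b, a = np.polyfit(conc, area, 1)
        pred = b * conc + a
        ss_res = np.sum((area - pred) ** 2)
        ss_tot = np.sum((area - area.mean()) ** 2)
        r2 = 1 - ss_res / ss_tot
        s = np.sqrt(ss_res / (len(conc) - 2))    # residual standard deviation

        print("R2 = %.4f, LOD = %.2f, LOQ = %.2f ug/cm3"
              % (r2, 3.3 * s / b, 10 * s / b))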

  2. AAPS and US FDA Crystal City VI workshop on bioanalytical method validation for biomarkers.

    PubMed

    Lowes, Steve; Ackermann, Bradley L

    2016-02-01

    Crystal City VI Workshop on Bioanalytical Method Validation of Biomarkers, Renaissance Baltimore Harborplace Hotel, Baltimore, MD, USA, 28-29 September 2015. The Crystal City VI workshop was organized by the American Association of Pharmaceutical Scientists in association with the US FDA to continue discussion on the bioanalysis of biomarkers. An outcome of the Crystal City V workshop, convened following release of the draft FDA Guidance for Industry on Bioanalytical Method Validation in 2013, was the need for further discussion of biomarker methods. Biomarkers ultimately became the sole focal point of Crystal City VI, a meeting attended by approximately 200 people comprising industry scientists and regulators from around the world. The meeting format included several panel discussions to maximize the opportunity for dialogue among participants. Following an initial session on the general topic of biomarker assays and intended use, more focused sessions were held on chromatographic (LC-MS) and ligand-binding assays. In addition to participation by the drug development community, there was significant representation from clinical testing laboratories. The experience of this latter group, collectively identified as practitioners of CLIA (Clinical Laboratory Improvement Amendments), helped shape the discussion and takeaways from the meeting. While the need to operate within the framework of the current BMV guidance was clearly acknowledged, a general understanding that biomarker method validation cannot be adequately depicted by current PK-centric guidelines emerged as a consensus from the meeting. This report is not intended to constitute the official proceedings of Crystal City VI, which are expected to be published in early 2016. PMID:26795584

  3. Bridging the Gap Between Validation and Implementation of Non-Animal Veterinary Vaccine Potency Testing Methods

    PubMed Central

    Dozier, Samantha; Brown, Jeffrey; Currie, Alistair

    2011-01-01

    Simple Summary Many vaccines are tested for quality in experiments that require the use of large numbers of animals in procedures that often cause significant pain and distress. Newer technologies have fostered the development of vaccine quality control tests that reduce or eliminate the use of animals, but the availability of these newer methods has not guaranteed their acceptance by regulators or use by manufacturers. We discuss a strategic approach that has been used to assess and ultimately increase the use of non-animal vaccine quality tests in the U.S. and U.K. Abstract In recent years, technologically advanced high-throughput techniques have been developed that replace, reduce or refine animal use in vaccine quality control tests. Following validation, these tests are slowly being accepted for use by international regulatory authorities. Because regulatory acceptance itself has not guaranteed that approved humane methods are adopted by manufacturers, various organizations have sought to foster the preferential use of validated non-animal methods by interfacing with industry and regulatory authorities. After noticing this gap between regulation and uptake by industry, we began developing a paradigm that seeks to narrow the gap and quicken implementation of new replacement, refinement or reduction guidance. A systematic analysis of our experience in promoting the transparent implementation of validated non-animal vaccine potency assays has led to the refinement of our paradigmatic process, presented here, by which interested parties can assess the local regulatory acceptance of methods that reduce animal use and integrate them into quality control testing protocols, or ensure the elimination of peripheral barriers to their use, particularly for potency and other tests carried out on production batches. PMID:26486625

  4. Validation of experimental whole-body SAR assessment method in a complex indoor environment.

    PubMed

    Bamba, Aliou; Joseph, Wout; Vermeeren, Gunter; Tanghe, Emmeric; Gaillot, Davy Paul; Andersen, Jørgen B; Nielsen, Jesper Ødum; Lienard, Martine; Martens, Luc

    2013-02-01

    Experimentally assessing the whole-body specific absorption rate (SAR(wb)) in a complex indoor environment is very challenging. An experimental method based on room electromagnetics theory (accounting for only the line-of-sight component as a specular path) is validated using numerical simulations with the finite-difference time-domain method. Furthermore, the method accounts for diffuse multipath components (DMC) in the total absorption rate by considering the reverberation time of the investigated room, which describes all the losses in a complex indoor environment. The advantage of the proposed method is that it avoids a heavy computational burden because it does not require any discretization. Results show good agreement between measurement and computation at 2.8 GHz as long as the plane-wave assumption is valid, that is, at large distances from the transmitter. Relative deviations of 0.71% and 4% were obtained for the far-field scenarios, and 77.5% for the near-field scenario. The contribution of the DMC to the total absorption rate, which had never been investigated before, is also quantified here: the DMC may represent an important part of the total absorption rate, reaching up to 90% for certain indoor scenarios.

  5. Validation of an Association Rule Mining-Based Method to Infer Associations Between Medications and Problems

    PubMed Central

    Wright, A.; McCoy, A.; Henkin, S.; Flaherty, M.; Sittig, D.

    2013-01-01

    Background In a prior study, we developed methods for automatically identifying associations between medications and problems using association rule mining on a large clinical data warehouse and validated these methods at a single site which used a self-developed electronic health record. Objective To demonstrate the generalizability of these methods by validating them at an external site. Methods We received data on medications and problems for 263,597 patients from the University of Texas Health Science Center at Houston Faculty Practice, an ambulatory practice that uses the Allscripts Enterprise commercial electronic health record product. We then conducted association rule mining to identify associated pairs of medications and problems, characterized these associations with five measures of interestingness (support, confidence, chi-square, interest and conviction), and compared the top-ranked pairs to a gold standard. Results 25,088 medication-problem pairs were identified that exceeded our confidence and support thresholds. An analysis of the top 500 pairs according to each measure of interestingness showed a high degree of accuracy for highly ranked pairs. Conclusion The same technique was successfully employed at the University of Texas and accuracy was comparable to our previous results. Top associations included many medications that are highly specific for a particular problem, as well as a large number of common, accurate medication-problem pairs that reflect practice patterns. PMID:23650491
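
    The five interestingness measures reduce to simple functions of a 2x2 contingency table over patients. A minimal sketch with invented counts (interest is also known as lift):

        # Interestingness measures for a medication -> problem rule.
        # Counts below are made up for illustration.
        n_total = 263_597
        n_med = 12_000          # patients on the medication
        n_prob = 9_000          # patients with the problem
        n_both = 7_500          # patients with both

        support = n_both / n_total
        confidence = n_both / n_med
        p_prob = n_prob / n_total
        interest = confidence / p_prob                       # a.k.a. lift
        conviction = (1 - p_prob) / (1 - confidence) if confidence < 1 else float("inf")

        # chi-square on the 2x2 table (observed vs. expected under independence)
        obs = [[n_both, n_med - n_both],
               [n_prob - n_both, n_total - n_med - n_prob + n_both]]
        row = [sum(r) for r in obs]
        col = [obs[0][0] + obs[1][0], obs[0][1] + obs[1][1]]
        chi2 = sum((obs[i][j] - row[i] * col[j] / n_total) ** 2
                   / (row[i] * col[j] / n_total)
                   for i in range(2) for j in range(2))

        print("support=%.4f confidence=%.3f interest=%.1f conviction=%.2f chi2=%.0f"
              % (support, confidence, interest, conviction, chi2))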

  6. Validation of a UV Spectrometric Method for the Assay of Tolfenamic Acid in Organic Solvents

    PubMed Central

    Ahmed, Sofia; Mustaan, Nafeesa; Sheraz, Muhammad Ali; Nabi, Syeda Ayesha Ahmed un; Ahmad, Iqbal

    2015-01-01

    The present study was carried out to validate a UV spectrometric method for the assay of tolfenamic acid (TA) in organic solvents. TA is insoluble in water; therefore, a total of thirteen commonly used organic solvents in which the drug is soluble were selected. Fresh stock solutions of TA in each solvent at a concentration of 1 × 10−4 M (2.62 mg%) were prepared for the assay. The method was validated according to the International Conference on Harmonization guideline, and parameters including linearity, range, accuracy, precision, sensitivity, and robustness were studied. Although the method was found to be efficient for the determination of TA in all solvents, on the basis of the statistical data 1-octanol, followed by ethanol and methanol, performed better than the other solvents studied. No change in the stability of TA stock solutions was observed in any solvent over 24 hours, whether stored at room temperature (25 ± 1°C) or refrigerated (2-8°C). A shift in the absorption maxima of TA in the various solvents indicates drug-solvent interactions. The studied method is simple, rapid, economical, accurate, and precise for the assay of TA in different organic solvents. PMID:26783497

  7. Validation of a two-plate microbiological method for screening antibiotic residues in shrimp tissue.

    PubMed

    Dang, Pham Kim; Degand, Guy; Danyi, Sophie; Pierret, Gilles; Delahaut, Philippe; Ton, Vu Dinh; Maghuin-Rogister, Guy; Scippo, Marie-Louise

    2010-07-01

    Microbiological inhibition screening tests could play an important role in detecting residues of antibiotics in animal food products, but very few are available for aquaculture products in general, and for shrimp in particular. A two-plate microbiological method to screen shrimp for residues of the most commonly used antibiotics was developed and validated according to criteria derived from European Commission Decision 2002/657/EC. Bacillus subtilis was used as the sensitive strain to target antibiotics. Culture conditions on the Petri plates (pH of the medium) were selected to enhance the capacity of antibiotic detection. Antibiotic residues were extracted from shrimp using acetonitrile/acetone (70/30, v/v) before application on Petri plates seeded with B. subtilis. The method was validated using spiked blank tissues as well as shrimp treated with enrofloxacin and tetracycline, two antibiotics often used in shrimp production. For tetracyclines and (fluoro)quinolones, the detection capability was below the maximum residue limit (MRL), while it was around the MRL for sulfonamides. The specificity of the microbiological screening was 100% in all cases, while the sensitivity and accuracy were 100% in almost all cases. The capacity of the method to detect contaminated samples was confirmed on antibiotic-treated shrimp, analyzed in parallel with a confirmatory method (liquid chromatography coupled to mass spectrometry, LC-MS).

  8. Development and validation of sensitive spectrophotometric method for determination of two antiepileptics in pharmaceutical formulations

    NASA Astrophysics Data System (ADS)

    Gouda, Ayman A.; Malah, Zakia Al

    2013-03-01

    Rapid, sensitive and validated spectrophotometric methods for the determination of two antiepileptics, gabapentin (GAB) and pregabalin (PRG), in pure form and in pharmaceutical formulations were developed. The methods are based on the formation of charge transfer complexes between the drugs and the chromogenic reagents quinalizarin (Quinz) and alizarin red S (ARS) in methanolic medium, with absorption maxima at 571 and 528 nm for GAB and 572 and 538 nm for PRG using Quinz and ARS, respectively. Reaction conditions such as the type of solvent, reagent concentration and reaction time were investigated and optimized. Beer's law is obeyed in the concentration ranges 0.4-8.0 and 0.5-10 μg mL-1 for GAB and PRG using Quinz and ARS, respectively. The molar absorptivity, Sandell sensitivity, and detection and quantification limits were also calculated. The correlation coefficients were ⩾0.9992 with a relative standard deviation (RSD%) of ⩽1.76. The methods were successfully applied to the determination of GAB and PRG in pharmaceutical formulations; validity was assessed by applying the standard addition technique, and the results were compared with those obtained using the reported methods.

  9. Development and Validation of Stability-indicating HPLC Method for Simultaneous Estimation of Cefixime and Linezolid.

    PubMed

    Patel, Nidhi S; Tandel, Falguni B; Patel, Yogita D; Thakkar, Kartavya B

    2014-01-01

    A stability-indicating reverse-phase high performance liquid chromatography method was developed and validated for cefixime and linezolid. The wavelength selected for quantitation was 276 nm. The method was validated for linearity, accuracy, precision, robustness, limit of detection and limit of quantitation. Linearity was observed in the concentration range of 2-12 μg/ml for cefixime and 6-36 μg/ml for linezolid. For RP-HPLC, the separation was achieved on a Phenomenex Luna C18 (250×4.6 mm) 5 μm column using phosphate buffer (pH 7):methanol (60:40 v/v) as mobile phase with a flow rate of 1 ml/min. The retention times of cefixime and linezolid were found to be 3.127 min and 11.986 min, respectively. During forced degradation, the drug product was exposed to hydrolysis (acid and base), H2O2, thermal degradation and photodegradation. The degradation was found to be 10 to 20% for both cefixime and linezolid under the given conditions. The method specifically estimates both drugs in the presence of all the degradants generated during the forced degradation study. The developed method is simple, specific and economical, and can be used for simultaneous estimation of cefixime and linezolid in tablet dosage form.

  10. Determination of phosphine in plant materials: method optimization and validation in interlaboratory comparison tests.

    PubMed

    Amrein, Thomas M; Ringier, Lara; Amstein, Nathalie; Clerc, Laurence; Bernauer, Sabine; Baumgartner, Thomas; Roux, Bernard; Stebler, Thomas; Niederer, Markus

    2014-03-01

    The optimization and validation of a method for the determination of phosphine in plant materials are described. The method is based on headspace sampling over the sample heated in 5% sulfuric acid. Critical factors such as sample amount, equilibration conditions, method of quantitation, and matrix effects are discussed, and validation data are presented. Grinding of coarse samples does not lead to lower results and is a prerequisite for standard addition experiments, which are the most reliable approach to quantitation because of notable matrix effects. Two interlaboratory comparisons showed that results varied considerably and that a measurement uncertainty of about 50% has to be assumed. Flame photometric and mass spectrometric detection gave similar results. The proposed method is well reproducible within one laboratory, and results from the authors' laboratories using different injection and detection techniques are very close to each other. The considerable variation in the interlaboratory comparisons shows that this analysis is still challenging in practice and further proficiency testing is needed. PMID:24564743
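
    Standard-addition quantitation, singled out above because of the matrix effects, works by spiking increasing amounts of standard into sample aliquots, fitting response against added amount, and reading the original content off the x-intercept. A minimal sketch with invented numbers:

        # Standard-addition quantitation: the unspiked content is the magnitude
        # of the x-intercept of the response line. Numbers are illustrative.
        import numpy as np

        added = np.array([0.0, 0.5, 1.0, 2.0])        # ug PH3 added per aliquot
        signal = np.array([0.42, 0.63, 0.85, 1.27])   # detector response

        slope, intercept = np.polyfit(added, signal, 1)
        content = intercept / slope                   # magnitude of x-intercept
        print("phosphine in unspiked sample: %.2f ug" % content)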

  11. GC-MS method development and validation for anabolic steroids in feed samples.

    PubMed

    Muñiz-Valencia, Roberto; Ceballos-Magaña, Silvia G; Gonzalo-Lumbreras, Raquel; Santos-Montes, Ana; Izquierdo-Hornillos, Roberto

    2008-03-01

    A GC-MS method for the determination of anabolic-androgenic steroids (AAS) used as growth-promoting agents in piglet feed samples, using selected ion monitoring (SIM), has been developed and validated, with testosterone as internal standard. Volatile steroid derivatives were formed by derivatization with N-methyl-N-(trimethylsilyl)trifluoroacetamide. The optimum separation was achieved using a Zebron ZB-5 column under gradient temperature elution, allowing the separation of the steroids in 18 min. The required sample treatment process is discussed: leaching with ACN, saponification with a binary NaOH/MgCl2 solution, and LLE with ethyl acetate were finally selected. Method validation was carried out according to the Commission Decision 2002/657/EC criteria established for quantitative confirmatory methods. The extraction efficiencies, CCα and CCβ values for these compounds were in the ranges 78-98%, 10-21 μg/kg and 18-35 μg/kg, respectively. The repeatability and within-laboratory reproducibility at 1, 1.5, and 2 times the CCβ concentration level were smaller than 8.2, 7.5, and 5.8% and 12.2, 9.5, and 7.5%, respectively. Accuracy was in the 99-103% range. Robustness was evaluated using the Youden test. The proposed method was applied to the analysis of steroids spiked into different kinds of animal feed samples with satisfactory results.
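
    A rough sketch of how a decision limit (CCα) and detection capability (CCβ) can be derived for a banned substance in the spirit of Decision 2002/657/EC, using blank-based estimates; the blank signals and calibration slope below are invented, and real implementations follow the Decision's full calibration procedure:

        # Blank-based CCalpha / CCbeta estimates; all numbers are illustrative.
        import numpy as np

        blank_signals = np.array([1.8, 2.1, 1.6, 2.3, 1.9, 2.0])  # blank feed extracts
        slope = 0.15                          # signal per (ug/kg), from calibration

        s_blank = blank_signals.std(ddof=1)
        cc_alpha = 2.33 * s_blank / slope            # alpha = 1% false-noncompliant risk
        cc_beta = cc_alpha + 1.64 * s_blank / slope  # adds beta = 5% false-compliant risk

        print("CCalpha = %.1f ug/kg, CCbeta = %.1f ug/kg" % (cc_alpha, cc_beta))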

  12. Quantification of histone modifications by parallel-reaction monitoring: a method validation.

    PubMed

    Sowers, James L; Mirfattah, Barsam; Xu, Pei; Tang, Hui; Park, In Young; Walker, Cheryl; Wu, Ping; Laezza, Fernanda; Sowers, Lawrence C; Zhang, Kangling

    2015-10-01

    Abnormal epigenetic reprogramming is one of the major causes of irregular gene expression and regulatory pathway perturbations in cells, resulting in unhealthy cell development or disease. Accurate measurement of these changes in epigenetic modifications, especially the complex histone modifications, is very important, and the methods for these measurements are not trivial. Following our previous introduction of PRM for targeting histone modifications (Tang, H.; Fang, H.; Yin, E.; Brasier, A. R.; Sowers, L. C.; Zhang, K. Multiplexed parallel reaction monitoring targeting histone modifications on the QExactive mass spectrometer. Anal. Chem. 2014, 86 (11), 5526-34), herein we validated this method by varying the protein/trypsin ratios via serial dilutions. Our data demonstrated that PRM with SILAC histones as internal standards allowed reproducible measurements of histone H3/H4 acetylation and methylation in samples whose histone contents differ by at least one order of magnitude. The method was further validated using histones isolated from histone H3 K36 trimethyltransferase SETD2 knockout mouse embryonic fibroblast (MEF) cells. Furthermore, histone acetylation and methylation in human neural stem cells (hNSC) treated with ascorbic acid phosphate (AAP) were measured by this method, revealing that H3 K36 trimethylation was significantly down-regulated by 6 days of treatment with vitamin C.

  13. Development and method validation for the determination of nitroimidazole residues in salmon, tilapia and shrimp muscle.

    PubMed

    Watson, Lynn; Potter, Ross; MacNeil, James D; Murphy, Cory

    2014-01-01

    The use of nitroimidazoles in aquacultured fish has been banned in many countries due to the suspected mutagenic and carcinogenic effects of these compounds. In response to the need to conduct residue testing for these compounds in fish, a simple, rapid, and sensitive method was developed and validated that is suitable for regulatory monitoring of nitroimidazole residues and their hydroxy metabolites in fish muscle tissue. Following solvent extraction of homogenized tissue and clean-up using a C18 SPE cartridge, analyses were conducted by ultra-performance liquid chromatography-tandem mass spectrometry (UPLC-MS/MS). A precursor ion and two product ions were monitored for each of the parent compounds and metabolites included in the method. The validated method has an analytical range from 1 to 50 ng/g in the representative species (tilapia, salmon, and shrimp), with LODs and LOQs ranging from 0.07 to 1.0 ng/g and 0.21 to 3.0 ng/g, respectively, depending on the analyte. Recoveries ranged from 81 to 124% and repeatability was between 4 and 17%. HorRat values were within typical limits of acceptability for a single laboratory. Working standards were stable for 12 months, extracts for 5 days, and tissues for 2 months under appropriate storage conditions. This method was determined to be suitable for routine use in screening, quantification, and confirmation of nitroimidazole residues in a regulatory residue monitoring program for fish.

  14. An evaluation method based on absolute difference to validate the performance of SBNUC algorithms

    NASA Astrophysics Data System (ADS)

    Jin, Minglei; Jin, Weiqi; Li, Yiyang; Li, Shuo

    2016-09-01

    Scene-based non-uniformity correction (SBNUC) algorithms are an important part of infrared image processing; however, SBNUC algorithms usually cause two defects: (1) ghosting artifacts and (2) over-correction. In this paper, we use an absolute-difference method based on the guided image filter (AD-GF) to validate the performance of SBNUC algorithms. We obtain a self-separation source by processing the input image with the improved guided image filter, and use this self-separation source to obtain the space-high-frequency parts of the input image and the corrected image. Finally, we use the absolute difference between the two space-high-frequency parts as the evaluation result. Experimental results show that the AD-GF method has better robustness and can validate the performance of SBNUC algorithms even when ghosting artifacts or over-correction occur. The AD-GF method can also measure how SBNUC algorithms perform in the time domain, making it an effective evaluation method for SBNUC algorithms.
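
    The core of the metric is: split both images into low- and high-spatial-frequency parts, then score the correction by the absolute difference of the high-frequency parts. A simplified sketch, with a plain box filter standing in for the paper's improved guided image filter:

        # Simplified AD-style score; a box filter replaces the improved guided
        # filter used in the paper, so this is an approximation of the idea only.
        import numpy as np
        from scipy.ndimage import uniform_filter

        def high_freq(img, size=9):
            return img - uniform_filter(img, size=size)   # detail = image - smoothed base

        def ad_score(raw, corrected):
            return np.mean(np.abs(high_freq(raw) - high_freq(corrected)))

        raw = np.random.default_rng(2).normal(100, 10, (128, 160))
        corrected = raw + np.random.default_rng(3).normal(0, 0.5, raw.shape)  # mock output
        print("AD score: %.3f (lower = fewer correction artifacts)" % ad_score(raw, corrected))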

  15. Development and Validation of RP-HPLC Method for the Estimation of Ivabradine Hydrochloride in Tablets

    PubMed Central

    Seerapu, Sunitha; Srinivasan, B. P.

    2010-01-01

    A simple, sensitive, precise and robust reverse-phase high-performance liquid chromatographic method for the analysis of ivabradine hydrochloride in pharmaceutical formulations was developed and validated as per ICH guidelines. The separation was performed on an SS Wakosil C18AR, 250×4.6 mm, 5 μm column with methanol:25 mM phosphate buffer (60:40 v/v), adjusted to pH 6.5 with orthophosphoric acid added dropwise, as mobile phase. Ivabradine hydrochloride exhibited a well-defined chromatographic peak with a retention time of 6.55±0.05 min and a tailing factor of 1.14 at a flow rate of 0.8 ml/min and ambient temperature, when monitored at 285 nm. The linear regression analysis data for the calibration plots showed a good linear relationship with R=0.9998 in the concentration range of 30-210 μg/ml. The method was validated for precision, recovery and robustness. Intra- and inter-day precision (% relative standard deviation) were always less than 2%. The method showed mean recoveries of 99.00 and 98.55% for Ivabrad and Inapure tablets, respectively. The proposed method has been successfully applied to the commercial tablets without any interference from excipients. PMID:21695008

  16. Optimization and validation of a minicolumn method for determining aflatoxins in copra meal.

    PubMed

    Arim, R H; Aguinaldo, A R; Tanaka, T; Yoshizawa, T

    1999-01-01

    A minicolumn (MC) method for determining aflatoxins in copra meal was optimized and validated. The method uses methanol-4% KCl solution as extractant and CuSO4 solution as clarifying agent. The chloroform extract is applied to an MC that incorporates "lahar," an indigenous material, as a substitute for silica gel. The "lahar"-containing MC produces a more distinct and intense blue fluorescence on the Florisil layer than an earlier MC. The method has a detection limit of 15 μg total aflatoxins/kg sample. Confirmatory tests using 50% H2SO4 and trifluoroacetic acid in benzene with 25% HNO3 showed that the copra meal samples contained aflatoxins and no interfering agents. The MC responses of the copra meal samples were in good agreement with their behavior in thin-layer chromatography. This modified MC method is accurate, giving results valid within its linear range; rapid, being done in 15 min; economical, using low volumes of reagents; relatively safe, giving analysts low exposure to chemicals; and simple, making field application feasible. PMID:10444827

  17. Probability of identification: a statistical model for the validation of qualitative botanical identification methods.

    PubMed

    LaBudde, Robert A; Harnly, James M

    2012-01-01

    A qualitative botanical identification method (BIM) is an analytical procedure that returns a binary result (1 = Identified, 0 = Not Identified). A BIM may be used by a buyer, manufacturer, or regulator to determine whether a botanical material being tested is the same as the target (desired) material, or whether it contains excessive nontarget (undesirable) material. The report describes the development and validation of studies for a BIM based on the proportion of replicates identified, or probability of identification (POI), as the basic observed statistic. The statistical procedures proposed for data analysis follow closely those of the probability of detection, and harmonize the statistical concepts and parameters between quantitative and qualitative method validation. Use of POI statistics also harmonizes statistical concepts for botanical, microbiological, toxin, and other analyte identification methods that produce binary results. The POI statistical model provides a tool for graphical representation of response curves for qualitative methods, reporting of descriptive statistics, and application of performance requirements. Single collaborator and multicollaborative study examples are given.
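
    The basic observed statistic is easy to make concrete: POI is the proportion of replicates identified, and a binomial interval quantifies its uncertainty. A minimal sketch; the Wilson interval is one standard choice, not a requirement of the report:

        # POI as proportion identified, with a Wilson binomial confidence interval.
        import math

        def poi_with_wilson_ci(identified, n, z=1.96):
            p = identified / n
            denom = 1 + z**2 / n
            centre = (p + z**2 / (2 * n)) / denom
            half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
            return p, max(0.0, centre - half), min(1.0, centre + half)

        p, lo, hi = poi_with_wilson_ci(identified=28, n=30)
        print("POI = %.2f, 95%% CI [%.2f, %.2f]" % (p, lo, hi))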

  18. Determination of plate waste in primary school lunches by weighing and visual estimation methods: a validation study.

    PubMed

    Liz Martins, Margarida; Cunha, Luís M; Rodrigues, Sara S P; Rocha, Ada

    2014-08-01

    The aim of this study was to validate the visual estimation method for aggregated plate waste of the main dish at Portuguese primary school canteens. For this purpose, plate waste at school lunch was measured for 505 individual servings, both by weighing individual servings and plate waste and by visual estimation on the 6-point scale developed by Comstock et al. (1981). A high variability of initial serving weights was found, with serving sizes ranging from 88.9 to 283.3 g and coefficients of variation ranging from 5.5% to 24.7%. Mean plate waste was 27.5% according to the weighing method. There was a significant bias in the conversion of visual waste estimations to actual waste, which was overestimated by an average of 8.0 g (ranging from -12.9 g to 41.4 g). In the Bland and Altman plot, the mean difference between methods was 8.0 g and the amplitude of the agreement interval was 102.6 g. The study showed that the visual estimation method is not as accurate as the weighing method in assessing nonselective aggregated plate waste at primary school canteens. These findings are important for plate waste assessment, since the wide variation in initial servings introduces a relevant bias when standard portions or a random sample of initial servings is assumed. Although its convenience, time saving and suitability for monitoring large groups make visual estimation an important method for assessing plate waste at school canteens, these results highlight the need to standardize portions and control initial servings before it can be used. PMID:24841068
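
A minimal sketch of the Bland-Altman computation used to compare the two methods, assuming paired per-serving measurements in grams; the toy data are illustrative, not the study's.

```python
import numpy as np

def bland_altman(weighed, estimated):
    """Bland-Altman statistics for paired measurements (g): mean bias
    and 95% limits of agreement (bias +/- 1.96 SD of the differences)."""
    diff = np.asarray(estimated, float) - np.asarray(weighed, float)
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Toy data (g); the study reports a mean bias of 8.0 g and an
# agreement interval with an amplitude of 102.6 g
bias, lo, hi = bland_altman([40, 55, 70, 20], [50, 60, 75, 30])
print(f"bias {bias:.1f} g, limits of agreement [{lo:.1f}, {hi:.1f}] g")
```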

  19. Risk-based criteria to support validation of detection methods for drinking water and air.

    SciTech Connect

    MacDonell, M.; Bhattacharyya, M.; Finster, M.; Williams, M.; Picel, K.; Chang, Y.-S.; Peterson, J.; Adeshina, F.; Sonich-Mullin, C.; Environmental Science Division; EPA

    2009-02-18

    This report was prepared to support the validation of analytical methods for threat contaminants under the U.S. Environmental Protection Agency (EPA) National Homeland Security Research Center (NHSRC) program. It is designed to serve as a resource for certain applications of benchmark and fate information for homeland security threat contaminants. The report identifies risk-based criteria from existing health benchmarks for drinking water and air for potential use as validation targets. The focus is on benchmarks for chronic public exposures. The priority sources are standard EPA concentration limits for drinking water and air, along with oral and inhalation toxicity values. Many contaminants identified as homeland security threats to drinking water or air would convert to other chemicals within minutes to hours of being released. For this reason, a fate analysis has been performed to identify potential transformation products and removal half-lives in air and water so appropriate forms can be targeted for detection over time. The risk-based criteria presented in this report to frame method validation are expected to be lower than actual operational targets based on realistic exposures following a release. Note that many target criteria provided in this report are taken from available benchmarks without assessing the underlying toxicological details. That is, although the relevance of the chemical form and analogues are evaluated, the toxicological interpretations and extrapolations conducted by the authoring organizations are not. It is also important to emphasize that such targets in the current analysis are not health-based advisory levels to guide homeland security responses. This integrated evaluation of chronic public benchmarks and contaminant fate has identified more than 200 risk-based criteria as method validation targets across numerous contaminants and fate products in drinking water and air combined. The gap in directly applicable values is

  20. Experimental comparison and validation of hot-ball method with guarded hot plate method on polyurethane foams

    NASA Astrophysics Data System (ADS)

    Hudec, Ján; Glorieux, Christ; Dieška, Peter; Kubičár, Ľudovít

    2016-07-01

    The hot-ball method is an innovative transient method for measuring thermophysical properties. Its principle is based on heating a small ball, embedded in the measured medium, with constant heating power while simultaneously measuring the ball's temperature response from the moment heating is initiated. The shape of the temperature response depends on the thermophysical properties of the medium in which the sensor is placed. The method is patented by the Institute of Physics, SAS, where the method and sensors based on it are being developed. At the beginning of sensor development we focused on monitoring applications, where relative precision is much more important than accuracy. Meanwhile, the quality of the sensors has improved enough for a new application: absolute measurement of the thermophysical parameters of materials with low thermal conductivity. This paper describes experimental verification and validation of measurement by the hot-ball method. Thanks to cooperation with the Laboratory of Soft Matter and Biophysics of the Catholic University of Leuven in Belgium, the established guarded hot plate method was used as a reference. Details of the measuring setups, a description of the experiments and the results of the comparison are presented.
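
For intuition, a sketch of how a thermal conductivity could be recovered from a hot-ball record, assuming the idealized continuous-point-source model whose steady state is P/(4*pi*lambda*R); the institute's actual sensor equation, and the radius and power below, are assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import erfc

R = 1.0e-3   # ball radius (m), assumed value
P = 1.0e-3   # constant heating power (W), assumed value

def dT_model(t, lam, a):
    """Point-source approximation of the hot-ball response: temperature
    rise at distance R from a continuous point source of power P;
    lam = thermal conductivity (W/m/K), a = thermal diffusivity (m^2/s).
    As t -> inf this tends to the steady state P / (4*pi*lam*R)."""
    return P / (4 * np.pi * lam * R) * erfc(R / (2 * np.sqrt(a * t)))

# Synthetic response for a PU-foam-like medium, then a fit for lambda
t = np.linspace(1, 300, 60)
dT = dT_model(t, 0.03, 1e-7) + np.random.normal(0, 0.01, t.size)
(lam, a), _ = curve_fit(dT_model, t, dT, p0=(0.05, 1e-7))
print(f"estimated thermal conductivity: {lam:.3f} W/(m.K)")
```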

  1. Continental-scale Validation of MODIS-based and LEDAPS Landsat ETM+ Atmospheric Correction Methods

    NASA Technical Reports Server (NTRS)

    Ju, Junchang; Roy, David P.; Vermote, Eric; Masek, Jeffrey; Kovalskyy, Valeriy

    2012-01-01

    The potential of Landsat data processing to provide systematic continental-scale products has been demonstrated by several projects, including the NASA Web-enabled Landsat Data (WELD) project. The recent free availability of Landsat data increases the need for robust and efficient atmospheric correction algorithms applicable to large-volume Landsat data sets. This paper compares the accuracy of two Landsat atmospheric correction methods: a MODIS-based method and the Landsat Ecosystem Disturbance Adaptive Processing System (LEDAPS) method. Both methods are based on the 6SV radiative transfer code but differ in their atmospheric characterization approaches. The MODIS-based method uses the MODIS Terra derived dynamic aerosol type, aerosol optical thickness, and water vapor to atmospherically correct ETM+ acquisitions in each coincident orbit. The LEDAPS method uses aerosol characterizations derived independently from each Landsat acquisition, assumes a fixed continental aerosol type, and uses ancillary water vapor. Validation results are presented comparing ETM+ atmospherically corrected data generated using these two methods with AERONET-corrected ETM+ data for 95 subsets of 10 km × 10 km at 30 m resolution, a total of nearly 8 million 30 m pixels, located across the conterminous United States. The results indicate that the MODIS-based method has better accuracy than the LEDAPS method for the ETM+ red and longer-wavelength bands.

  2. Validation of different spectrophotometric methods for determination of vildagliptin and metformin in binary mixture

    NASA Astrophysics Data System (ADS)

    Abdel-Ghany, Maha F.; Abdel-Aziz, Omar; Ayad, Miriam F.; Tadros, Mariam M.

    New, simple, specific, accurate, precise and reproducible spectrophotometric methods have been developed and validated for the determination of vildagliptin (VLG) and metformin (MET) in a binary mixture. A zero-order spectrophotometric method was the first method, used for determination of MET in the range of 2-12 μg/mL by measuring the absorbance at 237.6 nm. The second method was a derivative spectrophotometric technique, utilized for determination of MET at 247.4 nm in the range of 1-12 μg/mL. A derivative-ratio spectrophotometric method was the third technique, used for determination of VLG in the range of 4-24 μg/mL at 265.8 nm. The fourth and fifth methods, adopted for determination of VLG in the range of 4-24 μg/mL, were ratio-subtraction and mean-centering spectrophotometric methods, respectively. All results were statistically compared with those of reported methods using one-way analysis of variance (ANOVA). The developed methods were satisfactorily applied to the analysis of the investigated drugs and proved to be specific and accurate for their quality control in pharmaceutical dosage forms.
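
A sketch of the numerical differentiation underlying the derivative methods, assuming a Savitzky-Golay filter applied to a synthetic absorbance spectrum; the authors' actual smoothing parameters and spectra are not given in the abstract.

```python
import numpy as np
from scipy.signal import savgol_filter

# Synthetic absorbance spectrum sampled every 0.2 nm (stand-in data)
wl = np.arange(200.0, 300.0, 0.2)
absorbance = np.exp(-((wl - 237.6) / 12.0) ** 2)   # mock MET band

# First-derivative spectrum (window 11 points, 3rd-order polynomial);
# the amplitude read at the working wavelength scales with concentration
d1 = savgol_filter(absorbance, 11, 3, deriv=1, delta=0.2)
idx = np.argmin(np.abs(wl - 247.4))
print(f"D1 amplitude at 247.4 nm: {d1[idx]:.4f}")
```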

  3. Validation of a standardised method for determining beryllium in human urine at nanogram level.

    PubMed

    Devoy, Jérôme; Melczer, Mathieu; Antoine, Guillaume; Remy, Aurélie; Heilier, Jean-François

    2013-10-01

    The potential toxicity of beryllium at low levels of exposure means that a biological and/or air monitoring strategy may be required to monitor the exposure of subjects. The main objective of the work presented here was to develop and validate a sensitive and reproducible method for determining levels of beryllium in human urine and to establish reference values in workers and in non-occupationally exposed people. A chelate of beryllium acetylacetonate formed from beryllium(II) in human urine was pre-concentrated on an SPE C18 cartridge and eluted with methanol. After drying the eluate, the residue was solubilised in nitric acid and analysed by atomic absorption spectrometry and/or inductively coupled plasma mass spectrometry. The proposed method is 4 to 100 times more sensitive than other methods currently in routine use. The new method was validated with the concordance correlation coefficient test for beryllium concentrations ranging from 10 to 100 ng/L. Creatinine concentration, urine pH, interfering compounds and freeze-thaw cycles were found to have only slight effects on the performance of the method (less than 6%). The effectiveness of the two analytical techniques was compared statistically with each other and with direct analysis techniques. Even with a detection limit of 0.6 ng/L (obtained with inductively coupled plasma mass spectrometry), the method is not sensitive enough to detect levels in non-occupationally exposed persons. The method's performance does, however, appear to be suitable for monitoring worker exposure in some industrial settings, and it could therefore be of use in biological monitoring strategies.

  4. Development and Validation of Liquid Chromatographic Method for Estimation of Naringin in Nanoformulation

    PubMed Central

    Musmade, Kranti P.; Trilok, M.; Dengale, Swapnil J.; Bhat, Krishnamurthy; Reddy, M. S.; Musmade, Prashant B.; Udupa, N.

    2014-01-01

    A simple, precise, accurate, rapid, and sensitive reverse-phase high performance liquid chromatography (RP-HPLC) method with UV detection has been developed and validated for quantification of naringin (NAR) in a novel pharmaceutical formulation. NAR is a polyphenolic flavonoid present in most citrus plants and has a variety of pharmacological activities. Method optimization considered parameters such as the effect of pH and of the column. The analyte was separated on a C18 column (250.0 × 4.6 mm, 5 μm) at ambient temperature under isocratic conditions using phosphate buffer (pH 3.5):acetonitrile (75:25, v/v) as the mobile phase pumped at a flow rate of 1.0 mL/min. UV detection was carried out at 282 nm. The developed method was validated according to ICH guideline Q2(R1). The method was found to be precise and accurate on statistical evaluation, with a linearity range of 0.1 to 20.0 μg/mL for NAR. The intra- and interday precision studies showed good reproducibility with coefficients of variation (CV) less than 1.0%. The mean recovery of NAR was found to be 99.33 ± 0.16%. The proposed method was found to be highly accurate, sensitive, and robust, and was successfully employed for the routine analysis of the compound in the developed novel nanopharmaceuticals. The presence of excipients did not interfere with the determination of NAR, indicating method specificity. PMID:26556205

  5. Method for counting motor units in mice and validation using a mathematical model.

    PubMed

    Major, Lora A; Hegedus, Janka; Weber, Douglas J; Gordon, Tessa; Jones, Kelvin E

    2007-02-01

    Weakness and atrophy are clinical signs that accompany muscle denervation resulting from motor neuron disease, peripheral neuropathies, and injury. Advances in our understanding of the genetics and molecular biology of these disorders have led to the development of therapeutic alternatives designed to slow denervation and promote reinnervation. Preclinical in vitro research gave rise to the need for a method to measure these effects in animal models. Our goal was to develop an efficient method to determine the number of motor neurons making functional connections to muscle in a transgenic mouse model of amyotrophic lateral sclerosis (ALS). We developed a novel protocol for motor unit number estimation (MUNE) using incremental stimulation. The method involves analysis of twitch waveforms using a new software program, ITS-MUNE, designed for interactive calculation of motor unit number. The method was validated by testing simulated twitch data from a mathematical model of the neuromuscular system. Computer simulations followed the same stimulus-response protocol and produced waveform data that were indistinguishable from experiments. We show that our MUNE protocol is valid, with high precision and small bias across a wide range of motor unit numbers. The method is especially useful for large muscle groups where MUNE could not be done using manual methods. The results are reproducible across naïve and expert analysts, making the method suitable for easy implementation. The ITS-MUNE analysis method has the potential to quantitatively measure the progression of motor neuron diseases and therefore the efficacy of treatments designed to alleviate pathologic processes of muscle denervation. PMID:17151224
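
A toy sketch of the classic incremental-stimulation MUNE arithmetic (maximal response divided by the mean increment between discrete twitch levels); the ITS-MUNE software analyses whole twitch waveforms, so this illustrates only the underlying idea, with invented numbers.

```python
import numpy as np

def incremental_mune(twitch_levels, max_response):
    """Classic incremental MUNE: the mean amplitude increment between
    successive discrete response levels approximates the mean single
    motor unit contribution, so
        MUNE = maximal response / mean increment."""
    levels = np.sort(np.asarray(twitch_levels, float))
    mean_increment = np.diff(levels).mean()
    return max_response / mean_increment

# Toy data: discrete twitch levels (mN) and a 400 mN maximal twitch
print(f"MUNE ~ {incremental_mune([20.0, 41.0, 59.5, 80.2], 400.0):.0f}")
```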

  6. Performance of the Tariff Method: validation of a simple additive algorithm for analysis of verbal autopsies

    PubMed Central

    2011-01-01

    Background Verbal autopsies provide valuable information for studying mortality patterns in populations that lack reliable vital registration data. Methods for transforming verbal autopsy results into meaningful information for health workers and policymakers, however, are often costly or complicated to use. We present a simple additive algorithm, the Tariff Method (termed Tariff), which can be used for assigning individual cause of death and for determining cause-specific mortality fractions (CSMFs) from verbal autopsy data. Methods Tariff calculates a score, or "tariff," for each cause, for each sign/symptom, across a pool of validated verbal autopsy data. The tariffs are summed for a given response pattern in a verbal autopsy, and this sum (score) provides the basis for predicting the cause of death in a dataset. We implemented this algorithm and evaluated the method's predictive ability, both in terms of chance-corrected concordance at the individual cause assignment level and in terms of CSMF accuracy at the population level. The analysis was conducted separately for adult, child, and neonatal verbal autopsies across 500 pairs of train-test validation verbal autopsy data. Results Tariff is capable of outperforming physician-certified verbal autopsy in most cases. In terms of chance-corrected concordance, the method achieves 44.5% in adults, 39% in children, and 23.9% in neonates. CSMF accuracy was 0.745 in adults, 0.709 in children, and 0.679 in neonates. Conclusions Verbal autopsies can be an efficient means of obtaining cause of death data, and Tariff provides an intuitive, reliable method for generating individual cause assignment and CSMFs. The method is transparent and flexible and can be readily implemented by users without training in statistics or computer science. PMID:21816107
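
A compact sketch of the Tariff scoring described above: tariffs score how unusual each symptom's endorsement rate is for each cause, and summed tariffs over endorsed symptoms give the cause score. The robust median/IQR scaling and all matrix values here are assumptions for illustration.

```python
import numpy as np

def tariffs(X):
    """Tariff for cause i, symptom j: the endorsement rate x_ij scaled
    robustly against all causes, here (x_ij - median_j) / IQR_j per
    symptom column (treat the exact scaling as an assumption)."""
    med = np.median(X, axis=0)
    iqr = np.subtract(*np.percentile(X, [75, 25], axis=0)) + 1e-9
    return (X - med) / iqr

causes = ["cause A", "cause B", "cause C"]
X = np.array([[0.8, 0.1, 0.3],    # fraction of cause-A deaths endorsing
              [0.2, 0.7, 0.4],    # each of three symptoms, etc.
              [0.1, 0.2, 0.9]])
T = tariffs(X)

# Summed tariffs over endorsed symptoms give the cause scores
response = np.array([1, 1, 0])    # 0/1 symptom endorsements for one VA
scores = T @ response
print(causes[int(np.argmax(scores))])   # -> cause B
```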

  7. Experimental methods to validate measures of emotional state and readiness for duty in critical operations.

    SciTech Connect

    Weston, Louise Marie

    2007-09-01

    A recent report on criticality accidents in nuclear facilities indicates that human error played a major role in a significant number of incidents with serious consequences and that some of these human errors may be related to the emotional state of the individual. A pre-shift test to detect a deleterious emotional state could reduce the occurrence of such errors in critical operations. The effectiveness of pre-shift testing is a challenge because of the need to gather predictive data in a relatively short test period and the potential occurrence of learning effects due to a requirement for frequent testing. This report reviews the different types of reliability and validity methods and testing and statistical analysis procedures to validate measures of emotional state. The ultimate value of a validation study depends upon the percentage of human errors in critical operations that are due to the emotional state of the individual. A review of the literature to identify the most promising predictors of emotional state for this application is highly recommended.

  8. Physical activity assessment in the general population; validated self-report methods.

    PubMed

    Ara, Ignacio; Aparicio-Ugarriza, Raquel; Morales-Barco, David; Nascimento de Souza, Wysllenny; Mata, Esmeralda; González-Gross, Marcela

    2015-02-26

    Self-reported questionnaires have been commonly used to assess physical activity levels in large cohort studies, and as a result, strong and convincing evidence that physical activity can protect health is widely recognized. However, validation studies using objective measures of physical activity or energy expenditure (doubly labelled water, accelerometers, pedometers, etc.) indicate that the accuracy and precision of survey techniques are limited. Physical activity questionnaires can fail particularly in estimating non-vigorous physical activity: they focus disproportionately on volitional exercise (e.g. biking, jogging and walking) while not capturing activities of daily living and low- to moderate-intensity movements. Energy expenditure estimates from these data are not recommended. On the other hand, although objective tools should be the measurement of choice for assessing physical activity level, self-reported questionnaires remain valid and have many advantages, e.g. low cost. These recalls are designed and validated for different age groups and provide valuable and important information, mainly about physical activity patterns. Future studies will require more precision and accuracy in physical activity measurement than traditional survey methods provide. We conclude that a mixed approach combining objective and subjective techniques, involving novel devices and electronic capture of physical activity questionnaires, will probably be more effective.

  9. Determination of benzimidazoles and levamisole residues in milk by liquid chromatography-mass spectrometry: screening method development and validation.

    PubMed

    Jedziniak, Piotr; Szprengier-Juszkiewicz, Teresa; Olejnik, Małgorzata

    2009-11-13

    A screening method for the determination of residues of 19 benzimidazoles (parent drugs and their metabolites) and levamisole in bovine milk has been developed and validated. Milk samples were extracted with ethyl acetate, and the sample extracts were cleaned up by liquid-liquid partitioning with hexane and acidic ethanol. Liquid chromatography-single-quadrupole mass spectrometry was used for the separation and determination of the analytes. The method was validated in bovine milk according to the CD 2002/657/EC criteria. An alternative approach to the validation of the method was applied ("sum MRL" substances). The method was successfully verified in a CRL proficiency test. PMID:19656518

  10. Development and validation of an ultra-performance liquid chromatography method for simultaneous analysis of 20 antihistaminics in dietary supplements.

    PubMed

    Kim, Jung Yeon; Do, Jung-Ah; Choi, Ji Yeon; Cho, Sooyeul; Kim, Woo-Seong; Yoon, Chang-Yong

    2015-03-01

    The purpose of this study was to develop and validate an ultra-performance liquid chromatography method for simultaneous analysis of 20 antihistamines (illegal additives) in dietary supplements. The limits of detection and quantitation of the method ranged from 1.5 to 2.5 µg/mL and from 20.0 to 50.0 µg/mL, respectively. The coefficient of determination was >0.999, precisions were 0.2-5.1% (intra-day) and 0.1-8.8% (inter-day), and accuracies were 84.5-111.2% (intra-day) and 91.9-112.0% (inter-day). The mean recoveries of the 20 targeted compounds from dietary supplements ranged from 75.4 to 119.3%. The relative standard deviations were <6.6% and complied with established international guidelines. The relative standard deviation of stability was <0.8%. Fifty-two commercially available dietary supplements were evaluated using this method; none of the 20 antihistamines was found in significant abundance.

  11. Validity and reliability of different kinematics methods used for bike fitting.

    PubMed

    Fonda, Borut; Sarabon, Nejc; Li, François-Xavier

    2014-01-01

    The most common bike fitting method to set the seat height is based on the knee angle when the pedal is in its lowest position, i.e. bottom dead centre (BDC). However, there is no consensus on what method should be used to measure the knee angle. Therefore, the first aim of this study was to compare three dynamic methods with each other and against a static method. The second aim was to test the intra-session reliability of the knee angle at BDC measured by the dynamic methods. Eleven cyclists performed five 3-min cycling trials: three at different seat heights (25°, 30° and 35° knee angle at BDC according to the static measure) and two at the preferred seat height. Thirteen infrared cameras (3D), a high-speed camera (2D), and an electrogoniometer were used to measure the knee angle during pedalling when the pedal was at BDC. Compared to 3D kinematics, all other methods statistically significantly underestimated the knee angle (P = 0.00; η² = 0.73). All three dynamic methods were found to be substantially different from the static measure (effect sizes between 0.4 and 0.6). All dynamic methods achieved good intra-session reliability. 2D kinematics is a valid tool for knee angle assessment during bike fitting; however, for higher precision, a correction factor of 2.2° should be added to the measured value.
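
A sketch of the basic 3-D knee-angle computation behind all of these systems, assuming hip, knee and ankle marker positions; marker placement and the flexion-versus-included-angle convention vary between bike-fitting protocols, so the coordinates below are toy values.

```python
import numpy as np

def knee_angle(hip, knee, ankle):
    """Included knee angle (degrees) from 3-D marker positions; many
    bike fitters instead quote flexion = 180 - included angle."""
    a = np.asarray(hip, float) - np.asarray(knee, float)
    b = np.asarray(ankle, float) - np.asarray(knee, float)
    cosang = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

# Pedal at bottom dead centre (coordinates in metres, toy values)
ang = knee_angle([0.0, 0.9, 0.0], [0.08, 0.5, 0.0], [0.02, 0.1, 0.0])
print(f"included angle {ang:.1f} deg, flexion {180 - ang:.1f} deg")
```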

  12. Validation of the intrinsic spatial efficiency method for non cylindrical homogeneous sources using MC simulation

    NASA Astrophysics Data System (ADS)

    Ortiz-Ramírez, Pablo; Ruiz, Andrés

    2016-07-01

    Monte Carlo simulation of gamma spectroscopy systems is common practice these days, the most popular codes being MCNP and Geant4. The intrinsic spatial efficiency method is a general and absolute method to determine the absolute efficiency of a spectroscopy system for any extended source, but it had been demonstrated experimentally only for cylindrical sources. Given the difficulty of preparing sources of arbitrary shape, the simplest way to extend this is by simulating the spectroscopy system and the source. In this work we present the validation of the intrinsic spatial efficiency method for sources with different geometries and for photons with an energy of 661.65 keV. The simulation does not consider matrix effects (self-attenuation), so these results are only preliminary. The MC simulation is carried out using the FLUKA code, and the absolute efficiency of the detector is determined using two methods: the statistical count of the full energy peak (FEP) area (the traditional method) and the intrinsic spatial efficiency method. The results show total agreement between the absolute efficiencies determined by the two methods, with a relative bias of less than 1% in all cases.
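
The traditional method referred to above reduces to a one-line calculation once the net full-energy-peak area is known. A sketch with invented counts; the 661.65 keV line is Cs-137's, whose gamma emission probability is about 0.851.

```python
def fep_efficiency(net_counts, live_time_s, activity_bq, gamma_yield):
    """Absolute full-energy-peak efficiency by the traditional method:
    epsilon = N_FEP / (A * t * I_gamma)."""
    return net_counts / (activity_bq * live_time_s * gamma_yield)

# Toy numbers: 1.2e5 net counts in 3600 s from a 1 kBq source
print(f"efficiency = {fep_efficiency(1.2e5, 3600, 1000, 0.851):.4f}")
```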

  13. Development and Validation of New Spectrophotometric Methods to Determine Enrofloxacin in Pharmaceuticals

    NASA Astrophysics Data System (ADS)

    Rajendraprasad, N.; Basavaiah, K.

    2015-07-01

    Four spectrophotometric methods, based on oxidation with cerium(IV), are investigated and developed to determine EFX in pure form and in dosage forms. The first and second methods (A and B) are direct: after the oxidation of EFX with cerium(IV) in acid medium, the absorbance of the reduced and unreacted oxidant is measured at 275 and 320 nm, respectively. In the third (C) and fourth (D) methods, after the reaction between EFX and the oxidant is complete, the surplus oxidant is treated with either N-phenylanthranilic acid (NPA) or Alizarin Red S (ARS) dye, and the absorbance of the oxidized NPA or ARS is measured at 440 or 420 nm. The methods showed good linearity over the concentration ranges of 0.5-5.0, 1.25-12.5, 10.0-100.0, and 6.0-60.0 μg/ml for methods A, B, C and D, respectively, with apparent molar absorptivity values of 4.42 × 10⁴, 8.7 × 10³, 9.31 × 10², and 2.28 × 10³ L/(mol·cm). The limits of detection (LOD) and quantification (LOQ), Sandell's sensitivity values and other validation results are also reported. The proposed methods are successfully applied to determine EFX in pure form and in dosage forms.

  14. Further validation of the hybrid particle-mesh method for vortex shedding flow simulations

    NASA Astrophysics Data System (ADS)

    Lee, Seung-Jae; Lee, Jun-Hyeok; Suh, Jung-Chun

    2015-11-01

    This is the continuation of a numerical study on vortex shedding from a blunt trailing-edge of a hydrofoil. In our previous work (Lee et al., 2015), numerical schemes for efficient computations were successfully implemented; i.e. multiple domains, the approximation of domain boundary conditions using cubic spline functions, and particle-based domain decomposition for better load balancing. In this study, numerical results through a hybrid particle-mesh method which adopts the Vortex-In-Cell (VIC) method and the Brinkman penalization model are further rigorously validated through comparison to experimental data at the Reynolds number of 2 × 10⁶. The effects of changes in numerical parameters are also explored herein. We find that the present numerical method enables us to reasonably simulate vortex shedding phenomenon, as well as turbulent wakes of a hydrofoil.

  15. Development and Validation of Stability-Indicating Derivative Spectrophotometric Methods for Determination of Dronedarone Hydrochloride

    NASA Astrophysics Data System (ADS)

    Chadha, R.; Bali, A.

    2016-05-01

    Rapid, sensitive, cost-effective and reproducible stability-indicating derivative spectrophotometric methods have been developed for the estimation of dronedarone HCl employing peak-zero (P-0) and peak-peak (P-P) techniques, and their stability-indicating potential assessed in forced-degraded solutions of the drug. The methods were validated with respect to linearity, accuracy, precision and robustness. Excellent linearity was observed at concentrations of 2-40 μg/ml (r² = 0.9986). LOD and LOQ values for the proposed methods ranged from 0.42-0.46 μg/ml and 1.21-1.27 μg/ml, respectively, and excellent recovery of the drug was obtained in the tablet samples (99.70 ± 0.84%).
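
LOD and LOQ figures like those above can in principle be reproduced with the standard ICH Q2(R1) estimates from a calibration line, LOD = 3.3σ/S and LOQ = 10σ/S; the calibration data below are synthetic.

```python
import numpy as np

def lod_loq_from_calibration(conc, signal):
    """ICH Q2(R1)-style estimates from a calibration line:
    LOD = 3.3*sigma/S and LOQ = 10*sigma/S, where S is the slope and
    sigma the residual standard deviation of the regression."""
    S, b = np.polyfit(conc, signal, 1)
    resid = np.asarray(signal) - (S * np.asarray(conc) + b)
    sigma = np.sqrt(np.sum(resid**2) / (len(conc) - 2))
    return 3.3 * sigma / S, 10 * sigma / S

conc = [2, 5, 10, 20, 30, 40]                 # ug/mL (synthetic)
sig = [0.11, 0.26, 0.53, 1.04, 1.58, 2.09]    # absorbance (synthetic)
lod, loq = lod_loq_from_calibration(conc, sig)
print(f"LOD {lod:.2f}, LOQ {loq:.2f} ug/mL")
```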

  16. Antioxidant Activity and Validation of Quantification Method for Lycopene Extracted from Tomato.

    PubMed

    Cefali, Letícia Caramori; Cazedey, Edith Cristina Laignier; Souza-Moreira, Tatiana Maria; Correa, Marcos Antônio; Salgado, Hérida Regina Nunes; Isaac, Vera Lucia Borges

    2015-01-01

    Lycopene is a carotenoid found in tomatoes with potent antioxidant activity. The aim of the study was to obtain an extract containing lycopene from four types of tomatoes, validate an HPLC quantification method for the extracts, and assess their antioxidant activity. Results revealed that all tomatoes analyzed contained lycopene and showed antioxidant activity; salad tomato presented the highest concentration of this carotenoid and the highest antioxidant activity. The quantification method exhibited linearity with a correlation coefficient of 0.9992. Tests for the assessment of precision, accuracy, and robustness achieved coefficients of variation of less than 5%. The LOD and LOQ were 0.0012 and 0.0039 μg/mL, respectively. Salad tomato can be used as a source of lycopene for the development of topical formulations, and based on the performed tests, the chosen method for the identification and quantification of lycopene was considered to be linear, precise, accurate, selective, and robust. PMID:26525253

  17. Development and validation of an analytical method for the determination of 4-hexylresorcinol in food.

    PubMed

    Kim, Young-Hyun; Kim, Jae-Min; Lee, Jong Seok; Gang, Seong-Ran; Lim, Ho-Soo; Kim, Meehye; Lee, Ok-Hwan

    2016-01-01

    This study presents a method validation for the extraction and quantitative analysis of 4-hexylresorcinol residues in shrimp and crab meat using HPLC-FLD. We focused on the collaborative analysis of shrimp and crab meat samples and developed an LC-MS/MS method for correct confirmation of the compound's identity. Validation parameters (selectivity, linearity, LOD, LOQ, accuracy, precision, and measurement uncertainty) were attained. The measurement uncertainty was based on the precision study, data related to the performance of the analytical process, and the quantification of 4-hexylresorcinol. For HPLC-FLD analysis, the recoveries of 4-hexylresorcinol from samples spiked at levels of 0.2-10.0 ppm ranged from 92.54% to 97.67%, with RSDs between 0.07% and 1.88%. According to these results, the method has been proven appropriate for the extraction and determination of 4-hexylresorcinol, and can be used to maintain the safety of shrimp and crab products containing 4-hexylresorcinol residues.

  18. Validated HPAEC-PAD Method for the Determination of Fully Deacetylated Chitooligosaccharides

    PubMed Central

    Cao, Lidong; Wu, Jinlong; Li, Xiuhuan; Zheng, Li; Wu, Miaomiao; Liu, Pingping; Huang, Qiliang

    2016-01-01

    An efficient and sensitive analytical method based on high-performance anion exchange chromatography with pulsed amperometric detection (HPAEC-PAD) was established for the simultaneous separation and determination of glucosamine (GlcN)1 and chitooligosaccharides (COS) ranging from (GlcN)2 to (GlcN)6 without prior derivatization. Detection limits were 0.003 to 0.016 mg/L (corresponding to 0.4–0.6 pmol), and the linear range was 0.2 to 10 mg/L. The optimized analysis was carried out on a CarboPac-PA100 analytical column (4 × 250 mm) using isocratic elution with 0.2 M aqueous sodium hydroxide-water mixture (10:90, v/v) as the mobile phase at a 0.4 mL/min flow rate. Regression equations revealed a good linear relationship (R² = 0.9979–0.9995, n = 7) within the test ranges. Quality parameters, including precision and accuracy, were fully validated and found to be satisfactory. The fully validated HPAEC-PAD method was readily applied for the quantification of (GlcN)1–6 in a commercial COS technical concentrate. The established method was also used to monitor the acid hydrolysis of a COS technical concentrate to ensure optimization of reaction conditions and minimization of (GlcN)1 degradation. PMID:27735860

  19. Validated method for the quantification of free and total carnitine, butyrobetaine, and acylcarnitines in biological samples.

    PubMed

    Minkler, Paul E; Stoll, Maria S K; Ingalls, Stephen T; Kerner, Janos; Hoppel, Charles L

    2015-09-01

    A validated quantitative method for the determination of free and total carnitine, butyrobetaine, and acylcarnitines is presented. The versatile method has four components: (1) isolation using strong cation-exchange solid-phase extraction, (2) derivatization with pentafluorophenacyl trifluoromethanesulfonate, (3) sequential ion-exchange/reversed-phase (ultra) high-performance liquid chromatography [(U)HPLC] using a strong cation-exchange trap in series with a fused-core HPLC column, and (4) detection with electrospray ionization multiple reaction monitoring (MRM) mass spectrometry (MS). Standardized carnitine along with 65 synthesized, standardized acylcarnitines (including short-chain, medium-chain, long-chain, dicarboxylic, hydroxylated, and unsaturated acyl moieties) were used to construct multiple-point calibration curves, resulting in accurate and precise quantification. Separation of the 65 acylcarnitines was accomplished in a single chromatogram in as little as 14 min. Validation studies were performed showing a high level of accuracy, precision, and reproducibility. The method provides capabilities unavailable by tandem MS procedures, making it an ideal approach for confirmation of newborn screening results and for clinical and basic research projects, including treatment protocol studies, acylcarnitine biomarker studies, and metabolite studies using plasma, urine, tissue, or other sample matrixes. PMID:26270397

  20. Development and Validation of GC-ECD Method for the Determination of Metamitron in Soil

    PubMed Central

    Tandon, Shishir; Kumar, Satyendra; Sand, N. K.

    2015-01-01

    This paper aims at developing and validating a convenient, rapid, and sensitive method for the estimation of metamitron in soil samples. Determination and quantification were carried out by gas chromatography on a microcapillary column with an electron capture detector. The compound was extracted from soil using methanol, with cleanup by C-18 SPE. After optimization, the method was validated by evaluating the analytical curves, linearity, limits of detection and quantification, precision (repeatability and intermediate precision), and accuracy (recovery). Recovery values ranged from 89 to 93.5% over 0.05-2.0 µg/L, with an average RSD of 1.80%. The precision (repeatability) ranged from 1.7034 to 1.9144% and the intermediate precision from 1.5685 to 2.1323%. The retention time was 6.3 minutes, and the minimum detectable and quantifiable limits were 0.02 ng/mL and 0.05 ng/g, respectively. Good linearity (R² = 0.998) of the calibration curves was obtained over the range 0.05 to 2.0 µg/L. The results indicate that the developed method is rapid and easy to perform, making it applicable for analysis in large pesticide monitoring programmes. PMID:25733978
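
A sketch of the recovery and RSD arithmetic reported in validation studies like this one; the spiking level and replicate values below are invented.

```python
import numpy as np

def recovery_and_rsd(measured, spiked_level):
    """Mean recovery (%) and relative standard deviation (%) for
    replicate analyses of a sample fortified at a known level."""
    measured = np.asarray(measured, float)
    mean_recovery = measured.mean() / spiked_level * 100.0
    rsd = measured.std(ddof=1) / measured.mean() * 100.0
    return mean_recovery, rsd

# Replicates of a soil sample spiked at 0.50 ug/L (toy values)
mean_rec, rsd = recovery_and_rsd([0.46, 0.47, 0.45, 0.47], 0.50)
print(f"recovery {mean_rec:.1f}%, RSD {rsd:.2f}%")
```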

  1. Measure profile surrogates: A method to validate the performance of epileptic seizure prediction algorithms

    NASA Astrophysics Data System (ADS)

    Kreuz, Thomas; Andrzejak, Ralph G.; Mormann, Florian; Kraskov, Alexander; Stögbauer, Harald; Elger, Christian E.; Lehnertz, Klaus; Grassberger, Peter

    2004-06-01

    In a growing number of publications it is claimed that epileptic seizures can be predicted by analyzing the electroencephalogram (EEG) with different characterizing measures. However, many of these studies suffer from a severe lack of statistical validation: only rarely are results passed to a statistical test and verified against some null hypothesis H0 in order to quantify their significance. In this paper we propose a method to statistically validate the performance of measures used to predict epileptic seizures. From measure profiles rendered by applying a moving-window technique to the electroencephalogram, we first generate an ensemble of surrogates by constrained randomization using simulated annealing. Subsequently the seizure prediction algorithm is applied to the original measure profile and to the surrogates. If detectable changes before seizure onset exist, the highest performance values should be obtained for the original measure profiles, and the null hypothesis "the measure is not suited for seizure prediction" can be rejected. We demonstrate our method by applying two measures of synchronization to a quasicontinuous EEG recording and by evaluating their predictive performance using straightforward seizure prediction statistics. We would like to stress that the proposed method is rather universal and can be applied to many other prediction and detection problems.
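
A sketch of the rank-based significance logic behind surrogate testing: with n surrogates, the smallest attainable one-sided p-value is 1/(n+1), so 19 surrogates permit rejection at p = 0.05. The performance values are synthetic and the function name is hypothetical.

```python
import numpy as np

def surrogate_p_value(perf_original, perf_surrogates):
    """Rank-based one-sided p-value: the fraction of profiles (original
    plus surrogates) performing at least as well as the original. If the
    original does not beat all surrogates, H0 ('the measure is not
    suited for seizure prediction') cannot be rejected."""
    surr = np.asarray(perf_surrogates, float)
    n_better = int(np.sum(surr >= perf_original))
    return (n_better + 1) / (surr.size + 1)

# Toy example with 19 surrogate measure profiles
p = surrogate_p_value(0.83, np.random.uniform(0.4, 0.7, size=19))
print(f"p = {p:.3f}")   # 0.050 when the original outperforms them all
```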

  2. Development and Validation of Reversed-Phase High Performance Liquid Chromatographic Method for Hydroxychloroquine Sulphate

    PubMed Central

    Singh, A.; Roopkishora; Singh, C. L.; Gupta, R.; Kumar, S.; Kumar, M.

    2015-01-01

    In the present work, a new, simple reversed-phase high performance liquid chromatographic method was developed and validated for the determination of hydroxychloroquine sulphate in blood plasma, with chloroquine sulphate as the internal standard. The chromatographic separation was achieved on an octadecyl silane Hypersil C18 column (250×6 mm, 5 μm) using a mobile phase of water and organic solvent (acetonitrile:methanol, 50:50, v/v) in a 75:25 v/v ratio, containing sodium 1-pentanesulfonate and maintained at pH 3.0 with orthophosphoric acid. A flow rate of 2.0 ml/min with detection at 343 nm was used in the analysis. The calibration curve of standard hydroxychloroquine sulphate was linear in the range 0.1-20.0 μg/ml. The method was validated with respect to linearity, range, precision, accuracy, specificity and robustness according to ICH guidelines, and was found to be accurate and robust for the analysis of hydroxychloroquine sulphate in plasma samples. PMID:26798174

  3. Development and Validation of Reversed-Phase High Performance Liquid Chromatographic Method for Hydroxychloroquine Sulphate.

    PubMed

    Singh, A; Roopkishora; Singh, C L; Gupta, R; Kumar, S; Kumar, M

    2015-01-01

    In the present work, a new, simple reversed-phase high performance liquid chromatographic method was developed and validated for the determination of hydroxychloroquine sulphate in blood plasma, with chloroquine sulphate as the internal standard. The chromatographic separation was achieved on an octadecyl silane Hypersil C18 column (250×6 mm, 5 μm) using a mobile phase of water and organic solvent (acetonitrile:methanol, 50:50, v/v) in a 75:25 v/v ratio, containing sodium 1-pentanesulfonate and maintained at pH 3.0 with orthophosphoric acid. A flow rate of 2.0 ml/min with detection at 343 nm was used in the analysis. The calibration curve of standard hydroxychloroquine sulphate was linear in the range 0.1-20.0 μg/ml. The method was validated with respect to linearity, range, precision, accuracy, specificity and robustness according to ICH guidelines, and was found to be accurate and robust for the analysis of hydroxychloroquine sulphate in plasma samples. PMID:26798174

  4. Validation of quantitative method for azoxystrobin residues in green beans and peas.

    PubMed

    Abdelraheem, Ehab M H; Hassan, Sayed M; Arief, Mohamed M H; Mohammad, Somaia G

    2015-09-01

    This study presents a method validation for the extraction and quantitative analysis of azoxystrobin residues in green beans and peas using HPLC-UV, with the results confirmed by GC-MS. The method involved initial extraction with acetonitrile after the addition of salts (magnesium sulfate and sodium chloride), followed by a cleanup step with activated neutral carbon. Validation parameters (linearity, matrix effect, LOQ, specificity, trueness and repeatability precision) were attained. The spiking levels for the trueness and precision experiments were 0.1, 0.5 and 3 mg/kg. For HPLC-UV analysis, mean recoveries ranged from 83.69% to 91.58% and from 81.99% to 107.85% for green beans and peas, respectively. For GC-MS analysis, mean recoveries ranged from 76.29% to 94.56% and from 80.77% to 100.91% for green beans and peas, respectively. According to these results, the method has been proven to be efficient for the extraction and determination of azoxystrobin residues in green beans and peas.

  5. Comparative Study in Laboratory Rats to Validate Sperm Quality Methods and Endpoints

    NASA Technical Reports Server (NTRS)

    Price, W. A.; Briggs, G. B.; Alexander, W. K.; Still, K. R.; Grasman, K. A.

    2000-01-01

    The Naval Health Research Center, Detachment (Toxicology) performs toxicity studies in laboratory animals to characterize the risk of exposure to chemicals of Navy interest. Research was conducted at the Toxicology Detachment at WPAFB, OH, in collaboration with Wright State University, Department of Biological Sciences, for the validation of new bioassay methods for evaluating reproductive toxicity. The Hamilton Thorne sperm analyzer was used to evaluate sperm damage produced by exposure to a known testicular toxicant, methoxyacetic acid, and by inhalation exposure to JP-8 and JP-5 in laboratory rats. Sperm quality parameters (sperm concentration, motility, and morphology) were evaluated to provide evidence of sperm damage. The Hamilton Thorne sperm analyzer utilizes a DNA-specific fluorescent stain (similar to flow cytometry) and digitized optical computer analysis to detect sperm cell damage. Computer-assisted sperm analysis (CASA) is a more rapid, robust, predictive and sensitive method for characterizing reproductive toxicity. The results presented in this poster show that exposure to methoxyacetic acid causes reproductive toxicity, whereas inhalation exposure to JP-8 and JP-5 had no significant effects. The CASA method detects early changes that result in reproductive deficits, and these data will be used in a continuing program to characterize the toxicity of chemicals, and combinations of chemicals, of military interest, to help formulate permissible exposure limits.

  6. A validated stability-indicating HPLC method for the determination of PEGylated puerarin in aqueous solutions.

    PubMed

    Liu, Xinyi; Yu, Boyang; Wang, Naijie; Zhang, Bei; Du, Feng; He, Cheng; Ye, Zuguang

    2010-08-01

    The aim of this study was to develop a validated, specific, stability-indicating HPLC method for the quantitative determination of PEGylated puerarin (PEG-PUE) in aqueous solutions. The method was validated by subjecting PEG-PUE to forced degradation under stress conditions of acid, alkali, water hydrolysis, and oxidation. PEG-PUE and puerarin (PUE) were determined simultaneously and separated on a CAPCELL PAK C18 column by gradient elution with 0.2% aqueous phosphoric acid and acetonitrile as the mobile phase. The flow rate was 1.0 mL/min and the detection wavelength was set at 250 nm. Both calibration curves showed good linear regression (r ≥ 0.9998) within the test ranges. The LOD and LOQ of PEG-PUE were determined to be 3 and 9 μg/mL, respectively. Degradation of PEG-PUE followed pseudo-first-order kinetics, with t1/2 of 59 min at pH 9.0 and 17.79 h at pH 7.4; at pH 5.0 and 2.0, there was no significant degradation of PEG-PUE over time. In conclusion, the method was observed to have the necessary specificity, precision, and accuracy, and to be suitable for monitoring the degradation of PEG-PUE during stability studies. The degradation studies may provide useful information for formulation development of PEG-PUE.
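
For reference, the pseudo-first-order relation behind the reported half-lives is C(t) = C0*exp(-kt) with t1/2 = ln 2/k. A toy fit on synthetic data (not the paper's) recovering a roughly 59 min half-life:

```python
import numpy as np

# Pseudo-first-order degradation: ln C is linear in t with slope -k.
t = np.array([0, 15, 30, 60, 90])       # min
C = np.array([100, 84, 70, 49, 35])     # % remaining (synthetic)
k = -np.polyfit(t, np.log(C), 1)[0]     # rate constant from the slope
print(f"k = {k:.4f} 1/min, t_half = {np.log(2) / k:.1f} min")
```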

  7. Self-validated Variance-based Methods for Sensitivity Analysis of Model Outputs

    SciTech Connect

    Tong, C

    2009-04-20

    Global sensitivity analysis (GSA) has the advantage over local sensitivity analysis in that GSA does not require strong model assumptions such as linearity or monotonicity. As a result, GSA methods such as those based on variance decomposition are well-suited to multi-physics models, which are often plagued by large nonlinearities. However, as with many other sampling-based methods, inadequate sample size can badly pollute the result accuracies. A natural remedy is to adaptively increase the sample size until sufficient accuracy is obtained. This paper proposes an iterative methodology comprising mechanisms for guiding sample size selection and self-assessing result accuracy. The elegant features of the proposed methodology are the adaptive refinement strategies for stratified designs. We first apply this iterative methodology to the design of a self-validated first-order sensitivity analysis algorithm. We also extend this methodology to design a self-validated second-order sensitivity analysis algorithm based on refining replicated orthogonal array designs. Several numerical experiments are given to demonstrate the effectiveness of these methods.
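
A minimal pick-freeze Monte Carlo estimator of first-order variance-based indices, to show the kind of quantity the paper's self-validated algorithm refines; the paper's adaptively refined stratified designs are more elaborate than this plain estimator, and the test model is invented.

```python
import numpy as np

def first_order_indices(model, n, dim, rng=np.random.default_rng(0)):
    """Pick-freeze estimate of first-order Sobol indices:
    S_i = Cov(Y, Y_i) / Var(Y), where Y_i re-uses input i but redraws
    all other inputs. Doubling n until the estimates stabilize mimics
    the adaptive sample-size idea."""
    A = rng.uniform(size=(n, dim))
    B = rng.uniform(size=(n, dim))
    y = model(A)
    S = []
    for i in range(dim):
        C = B.copy()
        C[:, i] = A[:, i]          # freeze input i, redraw the rest
        yc = model(C)
        S.append(np.mean(y * yc) - y.mean() * yc.mean())
    return np.array(S) / y.var()

model = lambda X: X[:, 0] + 2.0 * X[:, 1] ** 2 + 0.1 * X[:, 2]
print(first_order_indices(model, 20000, 3))   # roughly [0.19, 0.81, 0.00]
```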

  8. Determination of methylmercury in marine biota samples with advanced mercury analyzer: method validation.

    PubMed

    Azemard, Sabine; Vassileva, Emilia

    2015-06-01

    In this paper, we present a simple, fast and cost-effective method for determination of methylmercury (MeHg) in marine samples. All important parameters influencing the sample preparation process were investigated and optimized. Full validation of the method was performed in accordance with ISO/IEC 17025 (2005) and Eurachem guidelines. Blanks, selectivity, working range (0.09-3.0 ng), recovery (92-108%), intermediate precision (1.7-4.5%), traceability, limit of detection (0.009 ng), limit of quantification (0.045 ng) and expanded uncertainty (15%, k=2) were assessed. An estimation of the uncertainty contribution of each parameter and a demonstration of the traceability of the measurement results are provided as well. Furthermore, the selectivity of the method was studied by analyzing the same sample extracts by advanced mercury analyzer (AMA) and gas chromatography-atomic fluorescence spectrometry (GC-AFS). Additional validation of the proposed procedure was provided by participation in the IAEA-461 worldwide inter-laboratory comparison exercise. PMID:25624245

  9. Validation of Broadly Filtered Diagonalization Method for Extracting Frequencies and Modes from High-Performance Computations

    SciTech Connect

    Austin, T.M.; Cary, J.R.; Werner, G.R.; Bellantoni, L.; /Fermilab

    2009-06-01

    Recent developments have shown that one can get around the difficulties of finding the eigenvalues and eigenmodes of the large systems studied with high performance computation by using broadly filtered diagonalization [G. R. Werner and J. R. Cary, J. Comput. Phys. 227, 5200 (2008)]. This method can be used in conjunction with any time-domain computation, in particular those that scale very well up to 10,000s of processors and beyond. Here we present results showing that this method accurately obtains both modes and frequencies of electromagnetic cavities, even when frequencies are nearly degenerate. The application was to a well-characterized kaon separator cavity, the A15. The computations are shown to have a precision of a few parts in 10⁵. Because the computed frequency differed from the measured frequency by more than this amount, a careful validation study to determine all sources of difference was undertaken. Ultimately, more precise measurements of the cavity showed that the computations were correct, with the remaining differences accounted for by uncertainties in cavity dimensions and in atmospheric and thermal conditions. Thus, not only was the method validated, but it was shown to have the ability to predict differences in cavity dimensions from fabrication specifications.

  10. Recommendations and best practices for reference standards and reagents used in bioanalytical method validation.

    PubMed

    Bower, Joseph F; McClung, Jennifer B; Watson, Carl; Osumi, Takahiko; Pastre, Kátia

    2014-03-01

    The continued globalization of pharmaceutics has increased the demand for companies to know and understand the regulations that exist across the globe. One hurdle facing pharmaceutical and biotechnology companies developing new drug candidates is interpreting the current regulatory guidance documents and industry publications associated with bioanalytical method validation (BMV) from each of the different agencies throughout the world. The objective of this commentary is to provide our opinions on the best practices for reference standards and key reagents, such as metabolites and internal standards used in the support of regulated bioanalysis based on a review of current regulatory guidance documents and industry white papers for BMV.

  11. A fast and reliable method for GHB quantitation in whole blood by GC-MS/MS (TQD) for forensic purposes.

    PubMed

    Castro, André L; Tarelho, Sónia; Dias, Mário; Reis, Flávio; Teixeira, Helena M

    2016-02-01

    Gamma-hydroxybutyric acid (GHB) is an endogenous compound with a history of clinical use since the 1960s. However, due to its secondary effects, it has become a controlled substance, entering the illicit market. A fully validated, sensitive and reproducible method for the quantification of GHB by methanolic precipitation and GC-MS/MS (TQD) in whole blood is presented. Using 100 μL of whole blood, the results included a LOD and LLOQ of 0.1 mg/L and a recovery of 86% over a working range of 0.1-100 mg/L. This method is sensitive and specific enough to detect the presence of GHB in small amounts of whole blood (both ante-mortem and post-mortem) and is, to the authors' knowledge, the first GC-MS/MS (TQD) method that uses different precursor and product ions for the identification of GHB and GHB-D6 (internal standard). Hence, this method may be especially useful for the study of endogenous values in this biological sample.

  12. SWeRF—A Method for Estimating the Relevant Fine Particle Fraction in Bulk Materials for Classification and Labelling Purposes

    PubMed Central

    2014-01-01

    In accordance with the European regulation for classification, labelling and packaging of substances and mixtures (CLP), as well as the criteria set out in the Globally Harmonized System (GHS), the fine fraction of crystalline silica (CS) has been classified as causing specific target organ toxicity, the specific organ in this case being the lung. Generic cut-off values for products containing a fine fraction of CS trigger the need for a method for quantifying the fine fraction of CS in bulk materials. This article describes the so-called SWeRF method, the size-weighted relevant fine fraction. The SWeRF method combines the particle size distribution of a powder with the probability factors from the EN 481 standard, allowing the relevant fine fraction of a material to be calculated. The SWeRF method has been validated with a number of industrial minerals. This will enable manufacturers and blenders to apply the CLP and GHS criteria for the classification of mineral products containing a fine fraction of CS. PMID:24389081
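
A sketch of the SWeRF arithmetic, assuming the EN 481 respirable convention (the inhalable convention 0.5*(1+exp(-0.06 d)) multiplied by the complement of a lognormal CDF with median 4.25 um and GSD 1.5); the particle-size distribution below is invented.

```python
import numpy as np
from scipy.stats import lognorm

def respirable_probability(d_um):
    """EN 481 conventions: inhalable fraction times the complement of a
    cumulative lognormal (median 4.25 um, GSD 1.5)."""
    d = np.asarray(d_um, float)
    e_inh = 0.5 * (1.0 + np.exp(-0.06 * d))
    return e_inh * (1.0 - lognorm.cdf(d, s=np.log(1.5), scale=4.25))

def swerf(d_um, mass_fraction):
    """Size-weighted relevant fine fraction: PSD mass fractions weighted
    by the respirable probability of each size class."""
    return float(np.sum(np.asarray(mass_fraction) * respirable_probability(d_um)))

# Toy PSD: size-class midpoints (um) and mass fractions summing to 1
print(f"SWeRF = {swerf([1, 3, 6, 12, 25], [0.05, 0.10, 0.20, 0.30, 0.35]):.3f}")
```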

  13. Validation of the Symptom Pattern Method for Analyzing Verbal Autopsy Data

    PubMed Central

    Murray, Christopher J. L; Lopez, Alan D; Feehan, Dennis M; Peter, Shanon T; Yang, Gonghuan

    2007-01-01

    Background Cause of death data are a critical input to formulating good public health policy. In the absence of reliable vital registration data, information collected after death from household members, called verbal autopsy (VA), is commonly used to study causes of death. VA data are usually analyzed by physician-coded verbal autopsy (PCVA). PCVA is expensive and its comparability across regions is questionable. Nearly all validation studies of PCVA have allowed physicians access to information collected from the household members' recall of medical records or contact with health services, thus exaggerating accuracy of PCVA in communities where few deaths had any interaction with the health system. In this study we develop and validate a statistical strategy for analyzing VA data that overcomes the limitations of PCVA. Methods and Findings We propose and validate a method that combines the advantages of methods proposed by King and Lu, and Byass, which we term the symptom pattern (SP) method. The SP method uses two sources of VA data. First, it requires a dataset for which we know the true cause of death, but which need not be representative of the population of interest; this dataset might come from deaths that occur in a hospital. The SP method can then be applied to a second VA sample that is representative of the population of interest. From the hospital data we compute the properties of each symptom; that is, the probability of responding yes to each symptom, given the true cause of death. These symptom properties allow us first to estimate the population-level cause-specific mortality fractions (CSMFs), and to then use the CSMFs as an input in assigning a cause of death to each individual VA response. Finally, we use our individual cause-of-death assignments to refine our population-level CSMF estimates. The results from applying our method to data collected in China are promising. At the population level, SP estimates the CSMFs with 16% average relative
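
A sketch of the individual assignment step described above, cast as a naive-Bayes posterior combining hospital-derived symptom properties P(symptom | cause) with population CSMFs as priors; the actual SP method (a King-Lu/Byass hybrid with CSMF re-estimation) is more involved, and all numbers here are toy values.

```python
import numpy as np

def sp_posterior(prop, csmf, response):
    """Posterior over causes for one verbal autopsy: symptom
    'properties' prop[i, j] = P(symptom j endorsed | cause i) come from
    hospital data; the CSMFs act as priors."""
    prop = np.asarray(prop, float)        # (n_causes, n_symptoms)
    r = np.asarray(response, float)       # 0/1 endorsements
    loglik = (np.log(prop) * r + np.log1p(-prop) * (1 - r)).sum(axis=1)
    post = np.exp(loglik) * np.asarray(csmf, float)
    return post / post.sum()

prop = np.array([[0.9, 0.2, 0.4],         # toy symptom properties
                 [0.3, 0.8, 0.5]])
print(sp_posterior(prop, [0.6, 0.4], [1, 0, 1]))
```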

  14. Quantitative Tagless Copurification: A Method to Validate and Identify Protein-Protein Interactions

    DOE PAGESBeta

    Shatsky, Maxim; Dong, Ming; Liu, Haichuan; Yang, Lee Lisheng; Choi, Megan; Singer, Mary; Geller, Jil; Fisher, Susan; Hall, Steven; Hazen, Terry C.; et al

    2016-04-20

    Identifying protein-protein interactions (PPIs) at an acceptable false discovery rate (FDR) is challenging. Previously we identified several hundred PPIs from affinity purification - mass spectrometry (AP-MS) data for the bacteria Escherichia coli and Desulfovibrio vulgaris. These two interactomes have lower FDRs than any of the nine interactomes proposed previously for bacteria and are more enriched in PPIs validated by other data than the nine earlier interactomes. To more thoroughly determine the accuracy of ours or other interactomes and to discover further PPIs de novo, here we present a quantitative tagless method that employs iTRAQ MS to measure the copurification of endogenous proteins through orthogonal chromatography steps. 5273 fractions from a four-step fractionation of a D. vulgaris protein extract were assayed, resulting in the detection of 1242 proteins. Protein partners from our D. vulgaris and E. coli AP-MS interactomes copurify as frequently as pairs belonging to three benchmark data sets of well-characterized PPIs. In contrast, the protein pairs from the nine other bacterial interactomes copurify two- to 20-fold less often. We also identify 200 high confidence D. vulgaris PPIs based on tagless copurification and colocalization in the genome. These PPIs are as strongly validated by other data as our AP-MS interactomes and overlap with our AP-MS interactome for D. vulgaris within 3% of expectation, once FDRs and false negative rates are taken into account. Finally, we reanalyzed data from two quantitative tagless screens of human cell extracts. We estimate that the novel PPIs reported in these studies have an FDR of at least 85% and find that less than 7% of the novel PPIs identified in each screen overlap. Our results establish that a quantitative tagless method can be used to validate and identify PPIs, but that such data must be analyzed carefully to minimize the FDR.

  15. Quantitative Tagless Copurification: A Method to Validate and Identify Protein-Protein Interactions.

    PubMed

    Shatsky, Maxim; Dong, Ming; Liu, Haichuan; Yang, Lee Lisheng; Choi, Megan; Singer, Mary E; Geller, Jil T; Fisher, Susan J; Hall, Steven C; Hazen, Terry C; Brenner, Steven E; Butland, Gareth; Jin, Jian; Witkowska, H Ewa; Chandonia, John-Marc; Biggin, Mark D

    2016-06-01

    Identifying protein-protein interactions (PPIs) at an acceptable false discovery rate (FDR) is challenging. Previously we identified several hundred PPIs from affinity purification - mass spectrometry (AP-MS) data for the bacteria Escherichia coli and Desulfovibrio vulgaris. These two interactomes have lower FDRs than any of the nine interactomes proposed previously for bacteria and are more enriched in PPIs validated by other data than the nine earlier interactomes. To more thoroughly determine the accuracy of ours or other interactomes and to discover further PPIs de novo, here we present a quantitative tagless method that employs iTRAQ MS to measure the copurification of endogenous proteins through orthogonal chromatography steps. 5273 fractions from a four-step fractionation of a D. vulgaris protein extract were assayed, resulting in the detection of 1242 proteins. Protein partners from our D. vulgaris and E. coli AP-MS interactomes copurify as frequently as pairs belonging to three benchmark data sets of well-characterized PPIs. In contrast, the protein pairs from the nine other bacterial interactomes copurify two- to 20-fold less often. We also identify 200 high confidence D. vulgaris PPIs based on tagless copurification and colocalization in the genome. These PPIs are as strongly validated by other data as our AP-MS interactomes and overlap with our AP-MS interactome for D. vulgaris within 3% of expectation, once FDRs and false negative rates are taken into account. Finally, we reanalyzed data from two quantitative tagless screens of human cell extracts. We estimate that the novel PPIs reported in these studies have an FDR of at least 85% and find that less than 7% of the novel PPIs identified in each screen overlap. Our results establish that a quantitative tagless method can be used to validate and identify PPIs, but that such data must be analyzed carefully to minimize the FDR. PMID:27099342

  16. Quantitative Tagless Copurification: A Method to Validate and Identify Protein-Protein Interactions

    SciTech Connect

    Shatsky, Maxim; Dong, Ming; Liu, Haichuan; Yang, Lee Lisheng; Choi, Megan; Singer, Mary; Geller, Jil; Fisher, Susan; Hall, Steven; Hazen, Terry C; Brenner, Steven; Butland, Gareth; Jin, Jian; Witkowska, H. Ewa; Chandonia, John-Marc; Biggin, Mark D.

    2016-01-01

    Identifying protein-protein interactions (PPIs) at an acceptable false discovery rate (FDR) is challenging. Previously we identified several hundred PPIs from affinity purification-mass spectrometry (AP-MS) data for the bacteria Escherichia coli and Desulfovibrio vulgaris. These two interactomes have lower FDRs than any of the nine interactomes proposed previously for bacteria and are more enriched in PPIs validated by other data than the nine earlier interactomes. To more thoroughly determine the accuracy of ours or other interactomes and to discover further PPIs de novo, here we present a quantitative tagless method that employs iTRAQ MS to measure the copurification of endogenous proteins through orthogonal chromatography steps. 5273 fractions from a four-step fractionation of a D. vulgaris protein extract were assayed, resulting in the detection of 1242 proteins. Protein partners from our D. vulgaris and E. coli AP-MS interactomes copurify as frequently as pairs belonging to three benchmark data sets of well-characterized PPIs. In contrast, the protein pairs from the nine other bacterial interactomes copurify two- to 20-fold less often. We also identify 200 high confidence D. vulgaris PPIs based on tagless copurification and colocalization in the genome. These PPIs are as strongly validated by other data as our AP-MS interactomes and overlap with our AP-MS interactome for D. vulgaris within 3% of expectation, once FDRs and false negative rates are taken into account. Finally, we reanalyzed data from two quantitative tagless screens of human cell extracts. We estimate that the novel PPIs reported in these studies have an FDR of at least 85% and find that less than 7% of the novel PPIs identified in each screen overlap. Our results establish that a quantitative tagless method can be used to validate and identify PPIs, but that such data must be analyzed carefully to minimize the FDR.

  17. Quantitative Tagless Copurification: A Method to Validate and Identify Protein-Protein Interactions*

    PubMed Central

    Shatsky, Maxim; Dong, Ming; Liu, Haichuan; Yang, Lee Lisheng; Choi, Megan; Singer, Mary E.; Geller, Jil T.; Fisher, Susan J.; Hall, Steven C.; Hazen, Terry C.; Brenner, Steven E.; Butland, Gareth; Jin, Jian; Witkowska, H. Ewa; Chandonia, John-Marc; Biggin, Mark D.

    2016-01-01

    Identifying protein-protein interactions (PPIs) at an acceptable false discovery rate (FDR) is challenging. Previously we identified several hundred PPIs from affinity purification-mass spectrometry (AP-MS) data for the bacteria Escherichia coli and Desulfovibrio vulgaris. These two interactomes have lower FDRs than any of the nine interactomes proposed previously for bacteria and are more enriched in PPIs validated by other data than the nine earlier interactomes. To more thoroughly determine the accuracy of ours or other interactomes and to discover further PPIs de novo, here we present a quantitative tagless method that employs iTRAQ MS to measure the copurification of endogenous proteins through orthogonal chromatography steps. 5273 fractions from a four-step fractionation of a D. vulgaris protein extract were assayed, resulting in the detection of 1242 proteins. Protein partners from our D. vulgaris and E. coli AP-MS interactomes copurify as frequently as pairs belonging to three benchmark data sets of well-characterized PPIs. In contrast, the protein pairs from the nine other bacterial interactomes copurify two- to 20-fold less often. We also identify 200 high confidence D. vulgaris PPIs based on tagless copurification and colocalization in the genome. These PPIs are as strongly validated by other data as our AP-MS interactomes and overlap with our AP-MS interactome for D. vulgaris within 3% of expectation, once FDRs and false negative rates are taken into account. Finally, we reanalyzed data from two quantitative tagless screens of human cell extracts. We estimate that the novel PPIs reported in these studies have an FDR of at least 85% and find that less than 7% of the novel PPIs identified in each screen overlap. Our results establish that a quantitative tagless method can be used to validate and identify PPIs, but that such data must be analyzed carefully to minimize the FDR. PMID:27099342
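
    The overlap arithmetic invoked in this abstract can be made concrete. Below is a minimal, hypothetical Python sketch of how an expected overlap between two PPI sets might be computed once each set's FDR is known; the independence assumption, function name and numbers are ours, not the authors':

      # Hypothetical expected-overlap calculation between two PPI sets,
      # each with an estimated false discovery rate (FDR). Assumes the
      # true positives in each set are independent draws from a common
      # universe of n_true real interactions (our simplification).
      def expected_overlap(n_a, fdr_a, n_b, fdr_b, n_true):
          true_a = n_a * (1.0 - fdr_a)   # estimated real PPIs in set A
          true_b = n_b * (1.0 - fdr_b)   # estimated real PPIs in set B
          p_b = true_b / n_true          # chance a real PPI is also recovered by B
          return true_a * p_b            # expected number of shared true PPIs

      # Example with made-up numbers: two 200-pair sets, 20% FDR each,
      # drawn from a universe of 1000 real interactions.
      print(expected_overlap(200, 0.2, 200, 0.2, 1000))  # -> 25.6

    Folding in false negative rates would further scale each set's recovery probability, which is how a "within 3% of expectation" comparison of the kind reported above becomes possible.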

  18. Validated stability-indicating TLC method for the determination of noscapine.

    PubMed

    Ashour, Ahmed; Hegazy, Maha Abdel Monem; Moustafa, Azza Aziz; Kelani, Khadiga Omar; Fattah, Laila Elsayed Abdel

    2009-07-01

    A sensitive, selective, precise and stability-indicating thin-layer chromatographic (TLC) method was developed and validated for the analysis of noscapine, both as a bulk drug and in its formulation. The method employed TLC aluminium plates precoated with silica gel 60F-254 as the stationary phase. The solvent system consisted of chloroform-methanol (10:0.5 v/v). Densitometric analysis of noscapine and its degradation products was carried out in the absorbance mode at 254 nm. This system was found to give compact symmetrical spots for noscapine (R(f) value 0.85 +/- 0.04). Noscapine was subjected to acid and alkali hydrolysis, oxidation and photodegradation. The drug undergoes photodegradation and also degrades under acidic and basic conditions. The prepared degradation products were identified and verified through infrared (IR) and mass spectral analyses. The degraded products were also well resolved from the pure drug with significantly different R(f) values and they were quantitatively determined. The method was validated for linearity, precision, robustness, limit of detection (LOD), limit of quantitation (LOQ), specificity and accuracy. Linearity was found to be in the 1.0-10.0 microg, 0.4-3.2 microg, 1.0-9.0 microg and 0.5-5.0 microg/band ranges for noscapine, cotarnine, meconine and opionic acid, respectively. The polynomial regression analysis for the calibration plots showed a good polynomial relationship with r(2) of 0.9998, 0.9989, 0.9996 and 0.9997 for noscapine and its three degradation products, cotarnine, meconine and opionic acid, respectively. Statistical analysis showed that the method is repeatable and specific for the estimation of noscapine. As this approach could effectively separate the drug from its degradation products it can be employed as a stability-indicating method in quality control laboratories.
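
    The polynomial calibration underlying the r(2) figures above can be sketched in a few lines. This is an illustration of the fitting step, not the authors' exact data reduction; the (amount, peak area) pairs are invented:

      import numpy as np

      # Minimal sketch of a quadratic calibration fit for TLC densitometry.
      # The calibration points below are invented for illustration.
      amount = np.array([1.0, 2.5, 5.0, 7.5, 10.0])            # microg per band
      area = np.array([1020., 2610., 5290., 8010., 10850.])    # arbitrary units

      coeffs = np.polyfit(amount, area, deg=2)   # second-order calibration
      fit = np.polyval(coeffs, amount)

      # coefficient of determination r^2 for the calibration plot
      ss_res = np.sum((area - fit) ** 2)
      ss_tot = np.sum((area - area.mean()) ** 2)
      r2 = 1.0 - ss_res / ss_tot
      print(coeffs, r2)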

  19. A novel method for extraction of a proteinous coagulant from Plantago ovata seeds for water treatment purposes.

    PubMed

    Ramavandi, Bahman; Hashemi, Seyedenayat; Kafaei, Raheleh

    2015-01-01

    Several chemicals have been applied in the process of coagulant extraction from herbal seeds, and the best extraction has been obtained in the presence of KCl or NaNO3 [1-3], and NaCl [4]. However, the main challenge posed to these methods of coagulant extraction is their relatively low efficiency for water treatment purposes and the formation of dissolved organic matter during the treatment process. In these methods the salts, which contain a monovalent metal cation (Na(+) or K(+)), are deposited in the internal structure and the pores of the coagulant, and may be useful for the coagulation/flocculation process. In this research, we found that the modified method produced a denser protein. Therefore, the modified procedure was better than the older one for removal of turbidity and hardness from the contaminated water. Here we describe a method where: • According to the Hardy-Schulze rule, we applied Fe(3+) ions instead of Na(+) and K(+) for the extraction of protein from Plantago ovata seeds. • The method was narrowed to extracting the protein with ethanol (defatting) followed by ammonium acetate and CM-Sepharose (protein extraction). • Two consecutive elutriations of the crude extract were performed directly using 0.025-M FeCl3 and 0.05-M FeCl3 on the basis of ion-exchange processes. PMID:26150999

  20. A novel method for extraction of a proteinous coagulant from Plantago ovata seeds for water treatment purposes

    PubMed Central

    Ramavandi, Bahman; Hashemi, Seyedenayat; Kafaei, Raheleh

    2015-01-01

    Several chemicals have been applied in the process of coagulant extraction from herbal seeds, and the best extraction has been obtained in the presence of KCl or NaNO3 [1], [2], [3], and NaCl [4]. However, the main challenge posed to these methods of coagulant extraction is their relatively low efficiency for water treatment purposes and the formation of dissolved organic matter during the treatment process. In these methods the salts, which contain a monovalent metal cation (Na+ or K+), are deposited in the internal structure and the pores of the coagulant, and may be useful for the coagulation/flocculation process. In this research, we found that the modified method produced a denser protein. Therefore, the modified procedure was better than the older one for removal of turbidity and hardness from the contaminated water. Here we describe a method where: • According to the Hardy–Schulze rule, we applied Fe3+ ions instead of Na+ and K+ for the extraction of protein from Plantago ovata seeds. • The method was narrowed to extracting the protein with ethanol (defatting) followed by ammonium acetate and CM-Sepharose (protein extraction). • Two consecutive elutriations of the crude extract were performed directly using 0.025-M FeCl3 and 0.05-M FeCl3 on the basis of ion-exchange processes. PMID:26150999

  1. Design and validation of bending test method for characterization of miniature pediatric cortical bone specimens.

    PubMed

    Albert, Carolyne I; Jameson, John; Harris, Gerald

    2013-02-01

    Osteogenesis imperfecta is a genetic disorder of bone fragility; however, the effects of this disorder on bone material properties are not well understood. No study has yet measured bone material strength in humans with osteogenesis imperfecta. Small bone specimens are often extracted during routine fracture surgeries in children with osteogenesis imperfecta. These specimens could provide valuable insight into the effects of osteogenesis imperfecta on bone material strength; however, their small size poses a challenge to their mechanical characterization. In this study, a validated miniature three-point bending test is described that enables measurement of the flexural material properties of pediatric cortical osteotomy specimens as small as 5 mm in length. This method was validated extensively using bovine bone, and the effect of span/depth aspect ratio (5 vs 6) on the measured flexural properties was examined. The method provided reasonable results for both Young's modulus and flexural strength in bovine bone. With a span/depth ratio of 6, the median longitudinal modulus and flexural strength results were 16.1 (range: 14.4-19.3) GPa and 251 (range: 219-293) MPa, respectively. Finally, the pilot results from two osteotomy specimens from children with osteogenesis imperfecta are presented. These results provide the first measures of bone material strength in this patient population.
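
    The flexural quantities reported here follow from the standard three-point bending relations for a rectangular beam: strength from the peak load F, sigma_f = 3FL/(2bd^2), and modulus from the initial load-deflection slope m, E = L^3 m/(4bd^3). A minimal sketch of these textbook formulas with invented specimen numbers (not the study's data):

      # Textbook three-point bending relations for a rectangular specimen.
      # F_max: peak load (N), slope: initial load/deflection slope (N/mm),
      # L: support span (mm), b: width (mm), d: depth (mm). All values invented.
      def flexural_strength(F_max, L, b, d):
          return 3.0 * F_max * L / (2.0 * b * d ** 2)   # MPa for N and mm

      def flexural_modulus(slope, L, b, d):
          return L ** 3 * slope / (4.0 * b * d ** 3)    # MPa for N/mm and mm

      d, b = 1.0, 2.0            # mm, illustrative miniature specimen
      L = 6.0 * d                # span chosen to give span/depth ratio 6
      print(flexural_strength(F_max=56.0, L=L, b=b, d=d))   # -> 252.0 MPa
      print(flexural_modulus(slope=593.0, L=L, b=b, d=d))   # -> ~16011 MPa (~16 GPa)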

  2. Validation of a novel sequential cultivation method for the production of enzymatic cocktails from Trichoderma strains.

    PubMed

    Florencio, C; Cunha, F M; Badino, A C; Farinas, C S

    2015-02-01

    The development of new cost-effective bioprocesses for the production of cellulolytic enzymes is needed in order to ensure that the conversion of biomass becomes economically viable. The aim of this study was to determine whether a novel sequential solid-state and submerged fermentation method (SF) could be validated for different strains of the Trichoderma genus. Cultivation of the Trichoderma reesei Rut-C30 reference strain under SF using sugarcane bagasse as substrate was shown to be favorable for endoglucanase (EGase) production, resulting in up to 4.2-fold improvement compared with conventional submerged fermentation. Characterization of the enzymes in terms of the optimum pH and temperature for EGase activity and comparison of the hydrolysis profiles obtained using a synthetic substrate did not reveal any qualitative differences among the different cultivation conditions investigated. However, the thermostability of the EGase was influenced by the type of carbon source and cultivation system. All three strains of Trichoderma tested (T. reesei Rut-C30, Trichoderma harzianum, and Trichoderma sp. INPA 666) achieved higher enzymatic productivity when cultivated under SF, hence validating the proposed SF method for use with different Trichoderma strains. The results suggest that this bioprocess configuration is a very promising development for the cellulosic biofuels industry.

  3. An extended validation of the last generation of particle finite element method for free surface flows

    NASA Astrophysics Data System (ADS)

    Gimenez, Juan M.; González, Leo M.

    2015-03-01

    In this paper, a new generation of the particle method known as Particle Finite Element Method (PFEM), which combines convective particle movement and a fixed mesh resolution, is applied to free surface flows. This interesting variant, previously described in the literature as PFEM-2, is able to use larger time steps than other similar numerical tools, which implies shorter computational times while maintaining the accuracy of the computation. PFEM-2 has already been extended to free surface problems; the main topic of this paper is a deep validation of this methodology for a wider range of flows. To accomplish this task, different improved versions of discontinuous and continuous enriched basis functions for the pressure field have been developed to capture the free surface dynamics without artificial diffusion or undesired numerical effects when different density ratios are involved. A collection of problems has been carefully selected such that a wide variety of Froude numbers, density ratios and dominant dissipative cases are reported with the intention of presenting a general methodology, not restricted to a particular range of parameters, and capable of using large time-steps. The results of the different free-surface problems solved, which include: Rayleigh-Taylor instability, sloshing problems, viscous standing waves and the dam break problem, are compared to well-validated numerical alternatives or experimental measurements, obtaining accurate approximations for such complex flows.

  4. A validated new method for nevirapine quantitation in human plasma via high-performance liquid chromatography.

    PubMed

    Silverthorn, Courtney F; Parsons, Teresa L

    2006-01-01

    A fully validated and clinically relevant assay was developed for the assessment of nevirapine concentrations in neonate blood plasma samples. Solid-phase extraction with an acid-base wash series was used to prepare subject samples for analysis. Samples were separated by high performance liquid chromatography and detected at 280 nm on a C8 reverse-phase column in an isocratic mobile phase. The retention times of nevirapine and its internal standard were 5.0 and 6.9 min, respectively. The method was validated by assessment of accuracy and precision (statistical values <15%), specificity, and stability. The assay was linear in the range 25-10,000 ng/mL (r2 > 0.996) and the average recovery was 93% (n = 18). The lower limit of quantification (relative standard deviation <20%) was determined to be 25 ng/mL for 50 microL of plasma, allowing detection of as little as 1.25 ng of nevirapine in a sample. This value represents an increase in sensitivity of up to 30-fold over previously published methods.

  5. Unsteady immiscible multiphase flow validation of a multiple-relaxation-time lattice Boltzmann method

    NASA Astrophysics Data System (ADS)

    Leclaire, S.; Pellerin, N.; Reggio, M.; Trépanier, J.-Y.

    2014-03-01

    The lattice Boltzmann modeling of immiscible multiphase flows needs to be further validated, especially when density variation occurs between the different flow phases. From this perspective, the goal of this research is to introduce the multiple-relaxation-time operator into a lattice Boltzmann model in order to improve its numerical stability in the presence of large density and viscosity ratios. Essentially, this research shows that the introduction of this operator greatly improves the numerical stability of the approach compared to the original single-relaxation-time collision operator. In many lattice Boltzmann research studies, multiphase lattice Boltzmann methods are validated using a reduced number of test cases, and unsteady flow test cases are frequently omitted before much more complex flow configurations are simulated. In this context, several test cases are proposed to evaluate the behavior of a lattice Boltzmann method for simulating immiscible multiphase flows with high density and viscosity ratios. These are: (1) two-phase Couette flow; (2) three-phase Laplace law; (3) three-phase Zalesak disk; (4) two-phase flow between oscillating plates; (5) two-phase capillary wave; and (6) the two-phase oscillating cylindrical bubble. The first two involve a steady regime, and the remaining four an unsteady regime.
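
    Among the test cases listed, the Laplace law offers the most compact quantitative check: for a circular bubble in 2D the interface pressure jump should equal sigma/R. A hypothetical sketch of such a check on generic simulation output (all names and fields are invented, not this paper's code):

      import numpy as np

      # Hypothetical Laplace-law check for a 2D circular bubble:
      # the pressure jump should satisfy dp = sigma / R (2D form).
      def laplace_error(p, bubble_mask, radius, sigma):
          dp_measured = p[bubble_mask].mean() - p[~bubble_mask].mean()
          dp_theory = sigma / radius
          return abs(dp_measured - dp_theory) / dp_theory

      # Synthetic field built to obey the law exactly:
      y, x = np.mgrid[0:64, 0:64]
      mask = (x - 32) ** 2 + (y - 32) ** 2 < 10 ** 2
      sigma, radius = 0.01, 10.0
      p = np.where(mask, 1.0 + sigma / radius, 1.0)
      print(laplace_error(p, mask, radius, sigma))  # -> ~0 (floating-point epsilon)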

  6. Validation of a novel sequential cultivation method for the production of enzymatic cocktails from Trichoderma strains.

    PubMed

    Florencio, C; Cunha, F M; Badino, A C; Farinas, C S

    2015-02-01

    The development of new cost-effective bioprocesses for the production of cellulolytic enzymes is needed in order to ensure that the conversion of biomass becomes economically viable. The aim of this study was to determine whether a novel sequential solid-state and submerged fermentation method (SF) could be validated for different strains of the Trichoderma genus. Cultivation of the Trichoderma reesei Rut-C30 reference strain under SF using sugarcane bagasse as substrate was shown to be favorable for endoglucanase (EGase) production, resulting in up to 4.2-fold improvement compared with conventional submerged fermentation. Characterization of the enzymes in terms of the optimum pH and temperature for EGase activity and comparison of the hydrolysis profiles obtained using a synthetic substrate did not reveal any qualitative differences among the different cultivation conditions investigated. However, the thermostability of the EGase was influenced by the type of carbon source and cultivation system. All three strains of Trichoderma tested (T. reesei Rut-C30, Trichoderma harzianum, and Trichoderma sp. INPA 666) achieved higher enzymatic productivity when cultivated under SF, hence validating the proposed SF method for use with different Trichoderma strains. The results suggest that this bioprocess configuration is a very promising development for the cellulosic biofuels industry. PMID:25399068

  7. Comparison of sample preparation methods, validation of an UPLC-MS/MS procedure for the quantification of tetrodotoxin present in marine gastropods and analysis of pufferfish.

    PubMed

    Nzoughet, Judith Kouassi; Campbell, Katrina; Barnes, Paul; Cooper, Kevin M; Chevallier, Olivier P; Elliott, Christopher T

    2013-02-15

    Tetrodotoxin (TTX) is one of the most potent marine neurotoxins reported. The global distribution of this toxin is spreading with the European Atlantic coastline now being affected. Climate change and increasing pollution have been suggested as underlying causes for this. In the present study, two different sample preparation techniques were used to extract TTX from Trumpet shells and pufferfish samples. Both extraction procedures (accelerated solvent extraction (ASE) and a simple solvent extraction) were shown to provide good recoveries (80-92%). A UPLC-MS/MS method was developed for the analysis of TTX and validated following the guidelines contained in the Commission Decision 2002/657/EC for chemical contaminant analysis. The performance of this procedure was demonstrated to be fit for purpose. This study is the first report on the use of ASE as a means of TTX extraction, the use of UPLC-MS/MS for TTX analysis, and the validation of this method for TTX in gastropods.

  8. Further validation to the variational method to obtain flow relations for generalized Newtonian fluids

    NASA Astrophysics Data System (ADS)

    Sochi, Taha

    2015-05-01

    We continue our investigation into the use of the variational method to derive flow relations for generalized Newtonian fluids in confined geometries. While in the previous investigations we used the straight circular tube geometry with eight fluid rheological models to demonstrate and establish the variational method, the focus here is on the plane long thin slit geometry using those eight rheological models, namely: Newtonian, power law, Ree-Eyring, Carreau, Cross, Casson, Bingham and Herschel-Bulkley. We demonstrate how the variational principle based on minimizing the total stress in the flow conduit can be used to derive analytical expressions, which were previously derived by other methods, or used in conjunction with numerical procedures to obtain numerical solutions which are virtually identical to the solutions obtained previously from well established methods of fluid dynamics. In this regard, we use the method of Weissenberg-Rabinowitsch-Mooney-Schofield (WRMS), with our adaptation from the circular pipe geometry to the long thin slit geometry, to derive analytical formulae for the eight types of fluid; these derived formulae are used for comparison and validation of the variational formulae and numerical solutions. Although some of these examples may be of limited practical value, the optimization principle which the variational method is based upon has a significant theoretical value as it reveals the tendency of the flow system to assume a configuration that minimizes the total stress. Our proposal also offers a new methodology to tackle common problems in fluid dynamics and rheology.
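
    For the plane slit, the classical closed-form flow rate for a power-law fluid, one of the eight models listed, provides exactly the kind of benchmark described. The sketch below implements that textbook formula (not the paper's variational derivation) and checks its Newtonian limit:

      # Classical power-law volumetric flow rate in a plane slit (textbook
      # result, used here only as an illustrative benchmark). w: slit width,
      # H: half-gap, dp: pressure drop over length Lc, k: consistency index,
      # n: flow behaviour index. All numbers below are invented.
      def slit_flow_power_law(w, H, dp, Lc, k, n):
          return (2.0 * w * H ** 2 / (2.0 + 1.0 / n)) * (H * dp / (k * Lc)) ** (1.0 / n)

      # Newtonian limit (n = 1, k = mu) must reduce to Q = w h^3 dp / (12 mu Lc),
      # with h = 2H the full gap.
      w, H, dp, Lc, mu = 0.1, 0.001, 1000.0, 1.0, 1e-3
      q_power = slit_flow_power_law(w, H, dp, Lc, k=mu, n=1.0)
      q_newton = w * (2 * H) ** 3 * dp / (12 * mu * Lc)
      assert abs(q_power - q_newton) < 1e-9 * q_newton
      print(q_power)  # -> ~6.67e-05 m^3/s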

  9. Pyramid projection - validation of a new method of skin defect measurement.

    PubMed

    Růzicka, J; Nový, P; Vávra, F; Bolek, L; Benes, J

    2007-01-01

    This paper presents a new method for the determination of the volume, surface area and depth of skin defects. The method is based on the description of a spatial defect using a pyramid (made, for example, from injection needles), which is placed over the defect. The projection of the pyramid on to the defect is photographed using a digital camera and subsequently compared with the projection of the same pyramid on to a sheet of grid paper. The defect is mathematically reconstructed on a computer, and an optimal body shape describing the defect is found, using a number of simplifications and assumptions. The method was then validated using a plaster mold of a real foot with 19 defects simulating real wounds. These plaster wounds were molded using alginate hydrocolloid, and the volume, surface area and depth were measured and compared with the results of the pyramid projection by means of regression analysis. This method correlates in all variables with correlation coefficients higher than 0.9. It can be concluded that the pyramid projection method correlates well with the reference mold method and can be used with good results for a whole range of variables.

  10. Experimental validation of theoretical methods to estimate the energy radiated by elastic waves during an impact

    NASA Astrophysics Data System (ADS)

    Farin, Maxime; Mangeney, Anne; Rosny, Julien de; Toussaint, Renaud; Sainte-Marie, Jacques; Shapiro, Nikolaï M.

    2016-02-01

    Estimating the energy lost in elastic waves during an impact is an important problem in seismology and in industry. We propose three complementary methods to estimate the elastic energy radiated by bead impacts on thin plates and thick blocks from the generated vibration. The first two methods are based on the direct wave front and are shown to be equivalent. The third method makes use of the diffuse regime. These methods are tested for laboratory experiments of impacts and are shown to give the same results, with error bars of 40 percent and 300 percent for impacts on a smooth plate and on a rough block, respectively. We show that these methods are relevant to establish the energy budget of an impact. On plates of glass and PMMA, the radiated elastic energy increases from 2 percent to almost 100 percent of the total energy lost as the bead diameter approaches the plate thickness. The rest of the lost energy is dissipated by viscoelasticity. For beads larger than the plate thickness, plastic deformation occurs and reduces the amount of energy radiated in the form of elastic waves. On a concrete block, the energy dissipation during the impact is principally inelastic because only 0.2-2 percent of the energy lost by the bead is transported by elastic waves. The radiated elastic energy estimated with the presented methods is quantitatively validated by Hertz's model of elastic impact.

  11. The role of validated analytical methods in JECFA drug assessments and evaluation for recommending MRLs.

    PubMed

    Boison, Joe O

    2016-05-01

    The Joint Food and Agriculture Organization and World Health Organization (FAO/WHO) Expert Committee on Food Additives (JECFA) is one of three Codex committees tasked with applying risk analysis and relying on independent scientific advice provided by expert bodies organized by FAO/WHO when developing standards. While not officially part of the Codex Alimentarius Commission structure, JECFA provides independent scientific advice to the Commission and its specialist committees such as the Codex Committee on Residues of Veterinary Drugs in Foods (CCRVDF) in setting maximum residue limits (MRLs) for veterinary drugs. Codex methods of analysis (Types I, II, III, and IV) are defined in the Codex Procedural Manual as are criteria to be used for selecting methods of analysis. However, if a method is to be used under a single laboratory condition to support regulatory work, it must be validated according to an internationally recognized protocol and the use of the method must be embedded in a quality assurance system in compliance with ISO/IEC 17025:2005. This paper examines the attributes of the methods used to generate residue depletion data for drug registration and/or licensing and for supporting regulatory enforcement initiatives that experts consider to be useful and appropriate in their assessment of methods of analysis. PMID:27443214

  12. Spectrofluorimetric method for determination and validation of cefixime in pharmaceutical preparations through derivatization with 2-cyanoacetamide.

    PubMed

    Shah, Jasmin; Jan, M Rasul; Shah, Sultan; Inayatullah

    2011-03-01

    A simple, sensitive and accurate method has been developed for spectrofluorimetric determination of cefixime in pure form and pharmaceutical preparations. The method is based on the reaction of cefixime with 2-cyanoacetamide in the presence of 21% ammonia at 100 °C. The fluorescent reaction product showed maximum fluorescence intensity at λ 378 nm after excitation at λ 330 nm. The factors affecting the derivatization reaction were carefully studied and optimized. The fluorescence intensity versus concentration plot was rectilinear over the range of 0.02 to 4 μg mL(-1) with a correlation coefficient of 0.99036. The limit of detection (LOD) and limit of quantification (LOQ) were found to be 2.95 ng mL(-1) and 9.84 ng mL(-1), respectively. The proposed method was validated statistically and through recovery studies. The method was successfully applied for the determination of cefixime in pure and dosage form with percent recoveries from 98.117% to 100.38%. The results obtained from the proposed method have been compared with the official HPLC method and good agreement was found between them.
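
    LOD and LOQ figures of this kind are conventionally obtained, per the ICH convention, as 3.3*sigma/S and 10*sigma/S, where sigma is the standard deviation of the response and S the calibration slope. A minimal sketch with invented values (not this paper's raw data):

      # ICH-style LOD/LOQ from a calibration line: LOD = 3.3*sigma/slope,
      # LOQ = 10*sigma/slope. The sigma and slope below are illustrative.
      def lod_loq(sigma, slope):
          return 3.3 * sigma / slope, 10.0 * sigma / slope

      sigma = 0.8    # SD of the blank/intercept response (arbitrary units)
      slope = 0.9    # calibration slope (response per ng/mL)
      lod, loq = lod_loq(sigma, slope)
      print(f"LOD = {lod:.2f} ng/mL, LOQ = {loq:.2f} ng/mL")  # ~2.93 and ~8.89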

  13. Method validation and dissipation dynamics of chlorfenapyr in squash and okra.

    PubMed

    Abdel Ghani, Sherif B; Abdallah, Osama I

    2016-03-01

    A QuEChERS method combined with GC-IT-MS was developed and validated for the determination of chlorfenapyr residues in squash and okra matrices. Method accuracy, repeatability, linearity and specificity were investigated. The matrix effect was discussed. Determination coefficients (R(2)) were 0.9992 and 0.9987 in the two matrices. LODs were 2.4 and 2.2 μg/kg, while LOQs were 8.2 and 7.3 μg/kg. Method accuracy ranged from 92.76% to 106.49%. Method precision RSDs were ⩽12.59%. A field trial to assess chlorfenapyr dissipation behavior was carried out. The developed method was employed in analyzing field samples. Dissipation behavior followed first order kinetics in both crops. Half-life values (t1/2) ranged from 0.2 to 6.58 days with determination coefficients (R(2)) ranging from 0.78 to 0.96. The developed method was utilized for surveying chlorfenapyr residues in squash and okra samples collected from the market. Monitoring results are discussed.
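
    First-order dissipation implies C(t) = C0*exp(-kt) with half-life t1/2 = ln 2/k, so k and t1/2 follow from a linear fit of ln C versus time. A minimal sketch of this common reduction with invented residue data:

      import numpy as np

      # First-order dissipation: C(t) = C0 * exp(-k t); t_half = ln(2) / k.
      # Residue values below are invented for illustration.
      t = np.array([0., 1., 3., 5., 7., 10.])             # days after application
      c = np.array([1.20, 0.95, 0.60, 0.38, 0.24, 0.12])  # mg/kg

      slope, intercept = np.polyfit(t, np.log(c), 1)      # linearized fit
      k = -slope
      t_half = np.log(2) / k
      print(f"k = {k:.3f} 1/day, half-life = {t_half:.2f} days")  # ~0.23, ~3.0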

  14. Spectrophotometric method for the determination, validation, spectroscopic and thermal analysis of diphenhydramine in pharmaceutical preparation

    NASA Astrophysics Data System (ADS)

    Ulu, Sevgi Tatar; Elmali, Fikriye Tuncel

    2010-09-01

    A sensitive, simple and rapid spectrophotometric method was developed for the determination of diphenhydramine in pharmaceutical preparation. The method was based on the charge-transfer complex of the drug, as n-electron donor, with 2,3-dichloro-5,6-dicyano-p-benzoquinone (DDQ), as π-acceptor. The formation of this complex was also confirmed by UV-vis, FTIR and 1H NMR spectral techniques and thermal analysis. The proposed method was validated according to the ICH guidelines with respect to linearity, limit of detection, limit of quantification, accuracy, precision, recovery and robustness. The linearity range for concentrations of diphenhydramine was found to be 12.5-150 μg/mL with acceptable correlation coefficients. The detection and quantification limits were found to be 2.09 and 6.27 μg/mL, respectively. The proposed and reference methods were applied to the determination of the drug in syrup. The preparation was also analyzed by a reference method, and statistical comparison by t- and F-tests revealed that there was no significant difference between the results of the two methods with respect to mean values and standard deviations at the 95% confidence level.
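
    The closing statistical comparison (t-test on means, F-test on variances, 95% confidence level) is a standard two-sample procedure. A minimal sketch with invented recovery data, not this paper's results:

      import numpy as np
      from scipy import stats

      # Comparing a proposed and a reference method by t-test (means) and
      # F-test (variances) at the 95% level. Recoveries below are invented.
      proposed = np.array([99.1, 100.4, 98.7, 100.9, 99.6])
      reference = np.array([99.8, 100.1, 99.2, 100.5, 99.9])

      t_stat, t_p = stats.ttest_ind(proposed, reference)

      f_stat = np.var(proposed, ddof=1) / np.var(reference, ddof=1)
      dfn, dfd = len(proposed) - 1, len(reference) - 1
      f_p = 2 * min(stats.f.sf(f_stat, dfn, dfd), stats.f.cdf(f_stat, dfn, dfd))

      print(t_p > 0.05, f_p > 0.05)  # -> True, True: no significant difference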

  15. Method validation and dissipation dynamics of chlorfenapyr in squash and okra.

    PubMed

    Abdel Ghani, Sherif B; Abdallah, Osama I

    2016-03-01

    A QuEChERS method combined with GC-IT-MS was developed and validated for the determination of chlorfenapyr residues in squash and okra matrices. Method accuracy, repeatability, linearity and specificity were investigated. The matrix effect was discussed. Determination coefficients (R(2)) were 0.9992 and 0.9987 in the two matrices. LODs were 2.4 and 2.2 μg/kg, while LOQs were 8.2 and 7.3 μg/kg. Method accuracy ranged from 92.76% to 106.49%. Method precision RSDs were ⩽12.59%. A field trial to assess chlorfenapyr dissipation behavior was carried out. The developed method was employed in analyzing field samples. Dissipation behavior followed first order kinetics in both crops. Half-life values (t1/2) ranged from 0.2 to 6.58 days with determination coefficients (R(2)) ranging from 0.78 to 0.96. The developed method was utilized for surveying chlorfenapyr residues in squash and okra samples collected from the market. Monitoring results are discussed. PMID:26471587

  16. Validated spectrofluorimetric method for the determination of tamsulosin in spiked human urine, pure and pharmaceutical preparations.

    PubMed

    Karasakal, A; Ulu, S T

    2014-05-01

    A novel, sensitive and selective spectrofluorimetric method was developed for the determination of tamsulosin in spiked human urine and pharmaceutical preparations. The proposed method is based on the reaction of tamsulosin with 1-dimethylaminonaphthalene-5-sulfonyl chloride in carbonate buffer pH 10.5 to yield a highly fluorescent derivative. The described method was validated and the analytical parameters of linearity, limit of detection (LOD), limit of quantification (LOQ), accuracy, precision, recovery and robustness were evaluated. The proposed method showed a linear dependence of the fluorescence intensity on drug concentration over the range 1.22 × 10(-7) to 7.35 × 10(-6) M. LOD and LOQ were calculated as 1.07 × 10(-7) and 3.23 × 10(-7) M, respectively. The proposed method was successfully applied for the determination of tamsulosin in pharmaceutical preparations and the obtained results were in good agreement with those obtained using the reference method.

  17. Validity of the t-plot method to assess microporosity in hierarchical micro/mesoporous materials.

    PubMed

    Galarneau, Anne; Villemot, François; Rodriguez, Jeremy; Fajula, François; Coasne, Benoit

    2014-11-11

    The t-plot method is a well-known technique which allows determining the micro- and/or mesoporous volumes and the specific surface area of a sample by comparison with a reference adsorption isotherm of a nonporous material having the same surface chemistry. In this paper, the validity of the t-plot method is discussed in the case of hierarchical porous materials exhibiting both micro- and mesoporosities. Different hierarchical zeolites with MCM-41 type ordered mesoporosity are prepared using pseudomorphic transformation. For comparison, we also consider simple mechanical mixtures of microporous and mesoporous materials. We first show an intrinsic failure of the t-plot method; this method does not describe the fact that, for a given surface chemistry and pressure, the thickness of the film adsorbed in micropores or small mesopores (< 10σ, σ being the diameter of the adsorbate) increases with decreasing the pore size (curvature effect). We further show that such an effect, which arises from the fact that the surface area and, hence, the free energy of the curved gas/liquid interface decreases with increasing the film thickness, is captured using the simple thermodynamical model by Derjaguin. The effect of such a drawback on the ability of the t-plot method to estimate the micro- and mesoporous volumes of hierarchical samples is then discussed, and an abacus is given to correct the underestimated microporous volume by the t-plot method.
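
    For orientation, the standard t-plot reduction can be sketched compactly: relative pressures are converted to a statistical film thickness (here via the common Harkins-Jura equation, one of several reference choices), the linear multilayer region is fitted, and the intercept yields the micropore volume. The isotherm points below are invented:

      import numpy as np

      # Standard t-plot reduction. Harkins-Jura statistical thickness (angstrom)
      # is one common reference equation; the isotherm data are invented.
      def harkins_jura_t(p_rel):
          return np.sqrt(13.99 / (0.034 - np.log10(p_rel)))

      p_rel = np.array([0.20, 0.30, 0.40, 0.50])     # multilayer region of isotherm
      v_ads = np.array([152., 158., 164., 171.])     # cm3(STP)/g adsorbed N2

      t = harkins_jura_t(p_rel)
      slope, intercept = np.polyfit(t, v_ads, 1)     # linear multilayer fit

      # Intercept (gas volume at STP) converted to liquid micropore volume
      v_micro = intercept * 0.0015468                # cm3(liquid)/g for N2 at 77 K
      print(f"micropore volume ~ {v_micro:.3f} cm3/g")

    It is precisely this linear extrapolation that the curvature effect described above biases: in small pores the film is thicker than the flat-surface reference predicts, so the intercept, and hence the micropore volume, is underestimated.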

  18. Chemometric approach for development, optimization, and validation of different chromatographic methods for separation of opium alkaloids.

    PubMed

    Acevska, J; Stefkov, G; Petkovska, R; Kulevanova, S; Dimitrovska, A

    2012-05-01

    The excessive and continuously growing interest in the simultaneous determination of poppy alkaloids imposes the development and optimization of convenient high-throughput methods for the assessment of the qualitative and quantitative profile of alkaloids in poppy straw. Systematic optimization of two chromatographic methods (gas chromatography (GC)/flame ionization detector (FID)/mass spectrometry (MS) and reversed-phase (RP)-high-performance liquid chromatography (HPLC)/diode array detector (DAD)) for the separation of alkaloids from Papaver somniferum L. (Papaveraceae) was carried out. The effects of various conditions on the predefined chromatographic descriptors were investigated using chemometrics. A full factorial linear design of experiments for determining the relationship between chromatographic conditions and the retention behavior of the analytes was used. Central composite circumscribed design was utilized for the final method optimization. By conducting the optimization of the methods in a very rational manner, a great deal of excessive and unproductive laboratory research work was avoided. The developed chromatographic methods were validated and compared in terms of resolving power, sensitivity, accuracy, speed, cost, ecological aspects, and compatibility with the poppy straw extraction procedure. The separation of the opium alkaloids using the GC/FID/MS method was achieved within 10 min, avoiding any derivatization step. This method has stronger resolving power, a shorter analysis time, and a better cost-effectiveness factor than the RP-HPLC/DAD method and is in line with the "green trend" of analysis. The RP-HPLC/DAD method on the other hand displayed better sensitivity for all tested alkaloids. The proposed methods provide both fast screening and an accurate content assessment of the six alkaloids in the poppy samples obtained from the selection program of Papaver strains.
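
    Both designs mentioned (full factorial screening, then central composite optimization) enumerate structured sets of experimental runs. A minimal sketch of generating a two-level full factorial design; the factor names and levels are invented placeholders, not the study's variables:

      from itertools import product

      # Two-level full factorial design for screening chromatographic
      # conditions. Factors and levels are illustrative placeholders.
      factors = {
          "oven_ramp_C_per_min": (5, 15),
          "carrier_flow_mL_min": (1.0, 2.0),
          "injector_temp_C": (250, 280),
      }

      runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
      for i, run in enumerate(runs, 1):   # 2^3 = 8 runs
          print(i, run)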

  19. Analysis of flavonoids and iridoids in Vitex negundo by HPLC-PDA and method validation.

    PubMed

    Roy, Somendu K; Bairwa, Khemraj; Grover, Jagdeep; Srivastava, Amit; Jachak, Sanjay M

    2013-09-01

    The leaves of Vitex negundo have been reported to contain various bioactive constituents including iridoids and flavonoids. This is the first report on the simultaneous determination of iridoids and flavonoids by HPLC in three different samples of V. negundo leaves collected from three regions of India. Separation of iridoids and flavonoids was accomplished by HPLC and further elaborated for their quantification in V. negundo leaves using a C-18 column with detection at 254 and 330 nm, respectively. The developed HPLC method showed good linearity (r(2) ≥ 0.999), high precision (RSD < 5%) and a good recovery (99.3-103.0%) of the compounds. All the validation parameters of the developed HPLC method were found to be within the permissible limits according to the ICH guidelines. The developed method was robust, accurate and reliable for the quality control of V. negundo leaves.

  20. Validation of a mass spectrometry method to quantify oak ellagitannins in wine samples.

    PubMed

    García-Estévez, Ignacio; Escribano-Bailón, M Teresa; Rivas-Gonzalo, Julián C; Alcalde-Eon, Cristina

    2012-02-15

    Detection and individual quantification of oak wood ellagitannins in oak barrel aged red wine samples are difficult mainly due to their low levels and the similarity between their structures. In this work, a quantification method using mass spectrometry has been developed and validated to quantify wine ellagitannins after sample fractionation with a previously reported method. The use of an internal standard is a requirement to correct mass signal variability. (-)-Gallocatechin, among the different tested compounds, was the only one that proved to be a suitable internal standard making possible the accurate and individual quantification of the main oak wood ellagitannins. The developed methodology has been used to detect and quantify these ellagitannins in different Spanish commercial wines, proving its usefulness.
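
    Internal-standard quantification of the kind described reduces, per analyte, to scaling the analyte/IS signal ratio by a response factor determined from standards. A minimal sketch with invented numbers, using (-)-gallocatechin as the IS per the abstract; the function and values are illustrative only:

      # Internal-standard quantification sketch. response_factor is the
      # (analyte_signal/is_signal) per unit (analyte_conc/is_conc) ratio,
      # determined from calibration standards. All numbers are invented.
      def quantify(analyte_signal, is_signal, is_conc, response_factor):
          return (analyte_signal / is_signal) * is_conc / response_factor

      conc = quantify(analyte_signal=4.2e5, is_signal=2.1e5,
                      is_conc=10.0,          # mg/L of (-)-gallocatechin spiked
                      response_factor=0.8)
      print(f"{conc:.2f} mg/L")              # -> 25.00 mg/L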

  1. Validation and comparison of two computerized methods of obtaining a diet history.

    PubMed

    Landig, J; Erhardt, J G; Bode, J C; Bode, C

    1998-06-01

    The aim of this study was to validate two computerized methods of obtaining a diet history (DH and EBIS). The food consumption of 12 men and eight women was calculated by weighing each food item over a period of 8 days. Thereafter the diet history was taken over this period by using both programs alternately. The intake of energy, protein, fat and carbohydrates, and 10 further nutrients was evaluated and the percentage difference calculated. In general, the intake of nutrients calculated from the diet history tended to be underestimated by most of the people interviewed. The mean daily intake of the nutrients calculated from the DH program deviated by -34% to +20% (mean SD = 48.1) and by -35% to +15% for EBIS (mean SD = 28.1). In conclusion, both computerized methods proved useful for epidemiological studies, but not for the determination of deficiencies in individuals.

  2. Rayleigh-Bénard convection via Lattice Boltzmann method: code validation and grid resolution effects

    NASA Astrophysics Data System (ADS)

    Lavezzo, V.; Clercx, H. J. H.; Toschi, F.

    2011-12-01

    Thermal plumes, formed at the wall of turbulent natural convection cells, play an important role in the re-suspension and dispersion process of inertial particles. For this reason, a good resolution of the region close to the wall is necessary to correctly describe the plumes and, consequently, the particle dynamics. In this work, a Lattice Boltzmann Method (LBM) coupled with Lagrangian particle tracking is used to understand the effects of the filtering action exerted by the grid resolution on particle trajectories. A validation of the numerical method against the work of Kunnen (2009) and Schumacher (2009) is presented and, in this framework, mean and RMS statistics on fluid temperature are considered and analyzed in detail.

  3. Validation of Inlet and Exhaust Boundary Conditions for a Cartesian Method

    NASA Technical Reports Server (NTRS)

    Pandya, Shishir A.; Murman, Scott M.; Aftosmis, Michael J.

    2004-01-01

    Inlets and exhaust nozzles are often omitted in aerodynamic simulations of aircraft due to the complexities involved in the modeling of engine details and flow physics. However, the omission is often improper since inlet or plume flows may have a substantial effect on vehicle aerodynamics. A method for modeling the effect of inlets and exhaust plumes using boundary conditions within an inviscid Cartesian flow solver is presented. This approach couples with both CAD systems and legacy geometry to provide an automated tool suitable for parameter studies. The method is validated using two- and three-dimensional test problems, which are compared with both theoretical and experimental results. The numerical results demonstrate excellent agreement with theory and available data, even for extremely strong jets and very sensitive inlets.

  4. Development and Validation of a Spectrofluorimetric Method for the Estimation of Rivastigmine in Formulations

    PubMed Central

    Kapil, R.; Dhawan, S.; Singh, Bhupinder

    2009-01-01

    A rapid, sensitive, simple, and cost-effective spectrofluorimetric method was developed for the estimation of rivastigmine in bulk and pharmaceutical formulations. The relative fluorescence intensity of rivastigmine was measured in triple distilled water at an excitation wavelength of 220 nm and an emission wavelength of 289 nm. Linearity range was found to be 100 to 4000 ng/ml. The method was validated for various parameters as per the ICH guidelines and USP requirements. The detection and quantitation limits were found to be 20.5 and 62.1 ng/ml, respectively. The results demonstrate that the procedure is accurate, precise, and reproducible, while being simple and rapid too. The results were found to be in good agreement with the label claims. PMID:20502586

  5. A validated HPLC method for the assay of xanthone and 3-methoxyxanthone in PLGA nanocapsules.

    PubMed

    Teixeira, Maribel; Afonso, Carlos M M; Pinto, Madalena M M M; Barbosa, Carlos Maurício

    2003-08-01

    This work describes the development and validation of a simple reversed-phase high-performance liquid chromatographic (HPLC) method for the analysis of xanthone (XAN) and 3-methoxyxanthone (3-MeOXAN) in poly(D,L-lactide-co-glycolide) (PLGA) nanocapsule formulations. This method does not require any complex sample extraction procedure. Chromatographic separation is made with a reversed-phase C(18) column, using methanol-water (90:10, v/v) as a mobile phase at a flow rate of 1 mL/min. Identification is made by UV detection at 237 nm. The isocratic system operates at ambient temperature and requires 7 min of chromatographic time. The developed method is statistically validated according to United States Pharmacopoeia 25 and International Conference on Harmonization guidelines for its specificity, linearity, accuracy, and precision. The assay method proposed in this study is specific for XAN and 3-MeOXAN in the presence of nanocapsule excipients. Diode-array analyses confirm the homogeneity of XAN and 3-MeOXAN peaks in stressed conditions. Standard curves are linear (r > 0.999) over the concentration ranges of 0.4-2.5 and 1.0-5.8 microg/mL for XAN and 3-MeOXAN, respectively. Recovery from nanocapsules ranges from 99.6% to 102.8% for XAN and 98.8% to 102.4% for 3-MeOXAN. Repeatability (intra-assay precision) is acceptable with relative standard deviation values of 1.2% for XAN and 0.3% for 3-MeOXAN.

  6. [Comparative validation of manual and automated methods for mixing and volume control of total blood samples].

    PubMed

    Folléa, G; Bigey, F; Jacob, D; Cazenave, J P

    1997-07-01

    During blood collection, agitation and volume limitations are critical to ensure thorough mixing of the blood with the anticoagulant and collection of the predetermined volume. These 2 factors are essential to prevent blood activation and to obtain well standardized blood products. The objective of this study was to compare the quality of the blood collected using 2 types of collection method: tripping of a scale at a predetermined volume limit of 450 mL with manual agitation, and the 3 blood collection monitors currently available in France. A minimum of 100 collection procedures was performed for each of the 4 methods tested. Results were found to be equivalent using either the manual or the automated procedures with regard to both the accuracy and reproducibility of the blood volumes obtained and the collection times and flow rates. The characteristics of the red blood cell concentrates, platelet concentrates and plasma units prepared from the first 30 collections of each group were assessed and compared to regulatory requirements. The quality of all these products was found to be comparable to that currently observed at quality control and no product was rejected at the release control for reasons of poor collection. An assessment of the practicability of the different methods showed that the automated devices are subject to practical difficulties involving transport and battery charging. In addition, the cost of this equipment is approximately 5 times higher than that of the scales. In conclusion, the results of this study show that in our hands, no significant advantage could be expected from the use of automated blood collection monitors as compared to simple scales with manual mixing. These results further raise the question of the applicability to labile blood products of the comparative validations currently accepted in the pharmaceutical industry, in order to allow the use of correctly validated alternative methods.

  7. Overcoming barriers to validation of non-animal partial replacement methods/Integrated Testing Strategies: the report of an EPAA-ECVAM workshop.

    PubMed

    Kinsner-Ovaskainen, Agnieszka; Akkan, Zerrin; Casati, Silvia; Coecke, Sandra; Corvi, Raffaella; Dal Negro, Gianni; De Bruijn, Jack; De Silva, Odile; Gribaldo, Laura; Griesinger, Claudius; Jaworska, Joanna; Kreysa, Joachim; Maxwell, Gavin; McNamee, Pauline; Price, Anna; Prieto, Pilar; Schubert, Roland; Tosti, Luca; Worth, Andrew; Zuang, Valerie

    2009-09-01

    The use of Integrated Testing Strategies (ITS) in toxicological hazard identification and characterisation is becoming increasingly common as a method for enabling the integration of diverse types of toxicology data. At present, there are no existing procedures and guidelines for the construction and validation of ITS, so a joint EPAA WG5-ECVAM workshop was held with the following objectives: a) to investigate the role of ITS and the need for validation of ITS in the different industry sectors (pharmaceuticals, cosmetics, chemicals); b) to formulate a common definition of ITS applicable across different sectors; c) to explore how and when Three Rs methods are used within ITS; and d) to propose a validation rationale for ITS and for alternative methods that are foreseen to be used within ITS. The EPAA provided a platform for comparing experiences with ITS across different industry sectors. It became clear that every ITS has to be adapted to the product type, R&D stage, and regulatory context. However, common features of ITS were also identified, and this permitted the formulation of a general definition of ITS in a regulatory context. The definition served as a basis for discussing the needs, rationale and process of formal ITS validation. One of the main conclusions was that a formal validation should not be required, unless the strategy will serve as full replacement of an in vivo study used for regulatory purposes. Finally, several challenges and bottlenecks to the ITS validation were identified, and it was agreed that a roadmap on how to address these barriers would be established by the EPAA partners. PMID:19807215

  8. Using Self- and Peer-Assessments for Summative Purposes: Analysing the Relative Validity of the AASL (Authentic Assessment for Sustainable Learning) Model

    ERIC Educational Resources Information Center

    Kearney, Sean; Perkins, Timothy; Kennedy-Clark, Shannon

    2016-01-01

    The purpose of this paper is to provide a proof of concept of a collaborative peer-, self- and lecturer assessment processes. The research presented here is part of an ongoing study on self- and peer assessments in higher education. The authentic assessment for sustainable learning (AASL) model is evaluated in terms of the correlations between…

  9. Assessing the validity of prospective hazard analysis methods: a comparison of two techniques

    PubMed Central

    2014-01-01

    Background Prospective Hazard Analysis techniques such as Healthcare Failure Modes and Effects Analysis (HFMEA) and Structured What If Technique (SWIFT) have the potential to increase safety by identifying risks before an adverse event occurs. Published accounts of their application in healthcare have identified benefits, but the reliability of some methods has been found to be low. The aim of this study was to examine the validity of SWIFT and HFMEA by comparing their outputs in the process of risk assessment, and comparing the results with risks identified by retrospective methods. Methods The setting was a community-based anticoagulation clinic, in which risk assessment activities had been previously performed and were available. A SWIFT and an HFMEA workshop were conducted consecutively on the same day by experienced experts. Participants were a mixture of pharmacists, administrative staff and software developers. Both methods produced lists of risks scored according to the method’s procedure. Participants’ views about the value of the workshops were elicited with a questionnaire. Results SWIFT identified 61 risks and HFMEA identified 72 risks. For both methods less than half the hazards were identified by the other method. There was also little overlap between the results of the workshops and risks identified by prior root cause analysis, staff interviews or clinical governance board discussions. Participants’ feedback indicated that the workshops were viewed as useful. Conclusions Although there was limited overlap, both methods raised important hazards. Scoping the problem area had a considerable influence on the outputs. The opportunity for teams to discuss their work from a risk perspective is valuable, but these methods cannot be relied upon in isolation to provide a comprehensive description. Multiple methods for identifying hazards should be used and data from different sources should be integrated to give a comprehensive view of risk in a system.

  10. Validated spectrophotometric and chromatographic methods for simultaneous determination of ketorolac tromethamine and phenylephrine hydrochloride.

    PubMed

    Belal, T S; El-Kafrawy, D S; Mahrous, M S; Abdel-Khalek, M M; Abo-Gharam, A H

    2016-07-01

    This work describes five simple and reliable spectrophotometric and chromatographic methods for analysis of the binary mixture of ketorolac tromethamine (KTR) and phenylephrine hydrochloride (PHE). Method I is based on the use of conventional Amax and derivative spectrophotometry with the zero-crossing technique where KTR was determined using its Amax and (1)D amplitudes at 323 and 341 nm, respectively, while PHE was determined by measuring the (1)D amplitudes at 248.5 nm. Method II involves the application of ratio spectra derivative spectrophotometry. For KTR, 12 μg/mL PHE was used as a divisor and the (1)DD amplitudes at 265 nm were plotted against KTR concentrations; while, by using 4 μg/mL KTR as divisor, the (1)DD amplitudes at 243.5 nm were found proportional to PHE concentrations. Method III depends on ratio-difference measurement where the peak to trough amplitudes between 260 and 284 nm were measured and correlated to KTR concentration. Similarly, the peak to trough amplitudes between 235 and 260 nm in the PHE ratio spectra were recorded. For method IV, the two compounds were separated using Merck HPTLC sheets of silica gel 60 F254 and a mobile phase composed of chloroform/methanol/ammonia (70:30:2, by volume) followed by densitometric measurement of KTR and PHE spots at 320 and 278 nm, respectively. Method V depends on HPLC-DAD. Effective chromatographic separation was achieved using a Zorbax eclipse plus C8 column (4.6 × 250 mm, 5 μm) with a mobile phase consisting of 0.05 M o-phosphoric acid and acetonitrile (50:50, by volume) at a flow rate of 1 mL/min and detection at 313 and 274 nm for KTR and PHE, respectively. Analytical performance of the developed methods was statistically validated according to the ICH guidelines with respect to linearity, ranges, precision, accuracy, detection and quantification limits. The validated spectrophotometric and chromatographic methods were successfully applied to the simultaneous analysis of KTR and PHE in synthetic mixtures.
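
    The ratio spectra derivative step underlying Method II divides the mixture spectrum point-by-point by the spectrum of a chosen divisor concentration of the other component and differentiates the result with respect to wavelength. A minimal numpy sketch with synthetic spectra; the band shapes and the 265 nm read-out are illustrative, not this paper's data:

      import numpy as np

      # Ratio-spectra first-derivative ((1)DD) step: divide the mixture
      # spectrum by a divisor spectrum, then take d/d(lambda).
      # Spectra below are synthetic Gaussian placeholders.
      wavelengths = np.linspace(220, 350, 261)            # nm, 0.5 nm step
      mixture = np.exp(-((wavelengths - 280) / 25) ** 2)  # fake mixture spectrum
      divisor = np.exp(-((wavelengths - 300) / 30) ** 2)  # fake divisor spectrum

      ratio = mixture / divisor                 # ratio spectrum
      dd = np.gradient(ratio, wavelengths)      # first derivative

      # The amplitude at the analytical wavelength is read off and regressed
      # against concentration over the calibration set.
      amp_265 = dd[np.argmin(np.abs(wavelengths - 265.0))]
      print(amp_265)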

  11. VALIDATION OF ANALYTICAL METHODS AND INSTRUMENTATION FOR BERYLLIUM MEASUREMENT: REVIEW AND SUMMARY OF AVAILABLE GUIDES, PROCEDURES, AND PROTOCOLS

    SciTech Connect

    Ekechukwu, A.

    2008-12-17

    This document proposes to provide a listing of available sources which can be used to validate analytical methods and/or instrumentation for beryllium determination. A literature review was conducted of available standard methods and publications used for method validation and/or quality control. A comprehensive listing of the articles, papers, and books reviewed is given in Appendix 1. Available validation documents and guides are listed in the appendix; each has a brief description of application and use. In the referenced sources, there are varying approaches to validation and varying descriptions of validation at different stages in method development. This discussion focuses on validation and verification of fully developed methods and instrumentation that have been offered up for use or approval by other laboratories or official consensus bodies such as ASTM International, the International Organization for Standardization (ISO) and the Association of Official Analytical Chemists (AOAC). This review was conducted as part of a collaborative effort to investigate and improve the state of validation for measuring beryllium in the workplace and the environment. Documents and publications from the United States and Europe are included. Unless otherwise specified, all documents were published in English.

  12. The reliability and validity of the comfort level method of setting hearing aid gain.

    PubMed

    Walden, B E; Schuchman, G I; Sedge, R K

    1977-11-01

    The comfort level method (Carhart, 1946) probably is the most widely used procedure for setting the acoustic gain of hearing aids. A series of experiments were conducted to determine the test-retest reliability of the comfort level method and the relationship between the comfort settings established in a clinical test suite and the comfort settings utilized in more realistic daily listening situations. Adults with bilateral sensorineural hearing impairments were subjects. The results suggest that the comfort level method has good test-retest reliability for most clinical purposes. Further, clinically established comfort settings may accurately represent typical daily-use settings if the input level used to establish the comfort settings in the clinical environment is 70 dB SPL.

  13. Validation of two innovative methods to measure contaminant mass flux in groundwater.

    PubMed

    Goltz, Mark N; Close, Murray E; Yoon, Hyouk; Huang, Junqi; Flintoft, Mark J; Kim, Sehjong; Enfield, Carl

    2009-04-15

    The ability to quantify the mass flux of a groundwater contaminant leaching from a source area is critical to enable us to: (1) evaluate the risk posed by the contamination source and prioritize cleanup, (2) evaluate the effectiveness of source remediation technologies or natural attenuation processes, and (3) quantify a source term for use in models that may be applied to predict maximum contaminant concentrations in downstream wells. Recently, a number of new methods have been developed and subsequently applied to measure contaminant mass flux in groundwater in the field. However, none of these methods has been validated at larger than laboratory scale through a comparison of the measured mass flux with a known flux introduced into flowing groundwater. Two innovative flux measurement methods, the tandem circulation well (TCW) and modified integral pumping test (MIPT) methods, have recently been proposed. The TCW method can measure mass flux integrated over a large subsurface volume without extracting water, and may be implemented using two different techniques. One technique, the multi-dipole technique, is relatively simple and inexpensive, requiring only the measurement of heads, while the second technique requires conducting a tracer test. The MIPT method is an easily implemented method of obtaining volume-integrated flux measurements. In the current study, flux measurements obtained using these two methods are compared with known mass fluxes in a three-dimensional, artificial aquifer. Experiments in the artificial aquifer show that the TCW multi-dipole and tracer test techniques accurately estimated flux, within 2% and 16%, respectively, although the good results obtained using the multi-dipole technique may be fortuitous. The MIPT method was not as accurate as the TCW method, underestimating flux by as much as 70%. MIPT method inaccuracies may be due to the fact that the method assumptions (two-dimensional steady

  14. Compatible validated spectrofluorimetric and spectrophotometric methods for determination of vildagliptin and saxagliptin by factorial design experiments

    NASA Astrophysics Data System (ADS)

    Abdel-Aziz, Omar; Ayad, Miriam F.; Tadros, Mariam M.

    2015-04-01

    Simple, selective and reproducible spectrofluorimetric and spectrophotometric methods have been developed for the determination of vildagliptin and saxagliptin in bulk and in their pharmaceutical dosage forms. The first proposed spectrofluorimetric method is based on the dansylation reaction of the amino group of vildagliptin with dansyl chloride to form a highly fluorescent product. The formed product was measured spectrofluorimetrically at 455 nm after excitation at 345 nm. Beer's law was obeyed in a concentration range of 100-600 μg ml-1. The second proposed spectrophotometric method is based on the charge transfer complex of saxagliptin with tetrachloro-1,4-benzoquinone (p-chloranil). The formed charge transfer complex was measured spectrophotometrically at 530 nm. Beer's law was obeyed in a concentration range of 100-850 μg ml-1. The third proposed spectrophotometric method is based on the condensation reaction of the primary amino group of saxagliptin with formaldehyde and acetyl acetone (the Hantzsch reaction) to form a yellow-colored product, measured at 342.5 nm. Beer's law was obeyed in a concentration range of 50-300 μg ml-1. All variables were studied to optimize the reaction conditions using factorial design. The developed methods were validated and proved to be specific and accurate for the quality control of vildagliptin and saxagliptin in their pharmaceutical dosage forms.

  15. Compatible validated spectrofluorimetric and spectrophotometric methods for determination of vildagliptin and saxagliptin by factorial design experiments.

    PubMed

    Abdel-Aziz, Omar; Ayad, Miriam F; Tadros, Mariam M

    2015-04-01

    Simple, selective and reproducible spectrofluorimetric and spectrophotometric methods have been developed for the determination of vildagliptin and saxagliptin in bulk and in their pharmaceutical dosage forms. The first proposed spectrofluorimetric method is based on the dansylation reaction of the amino group of vildagliptin with dansyl chloride to form a highly fluorescent product. The formed product was measured spectrofluorimetrically at 455 nm after excitation at 345 nm. Beer's law was obeyed in a concentration range of 100-600 μg ml(-1). The second proposed spectrophotometric method is based on the charge transfer complex of saxagliptin with tetrachloro-1,4-benzoquinone (p-chloranil). The formed charge transfer complex was measured spectrophotometrically at 530 nm. Beer's law was obeyed in a concentration range of 100-850 μg ml(-1). The third proposed spectrophotometric method is based on the condensation reaction of the primary amino group of saxagliptin with formaldehyde and acetyl acetone (the Hantzsch reaction) to form a yellow-colored product, measured at 342.5 nm. Beer's law was obeyed in a concentration range of 50-300 μg ml(-1). All variables were studied to optimize the reaction conditions using factorial design. The developed methods were validated and proved to be specific and accurate for the quality control of vildagliptin and saxagliptin in their pharmaceutical dosage forms.

  16. Spectrofluorimetric Method for Estimation of Curcumin in Rat Blood Plasma: Development and Validation

    NASA Astrophysics Data System (ADS)

    Trivedi, J.; Variya, B.; Gandhi, H.; Rathod, S. P.

    2016-01-01

    Curcumin is a medicinally important phytoconstituent of curcuminoids. The present study describes the development of a simple method for the estimation of curcumin in rat plasma. The method uses spectrofluorimetry to evaluate curcumin at 257 nm (excitation) and 504 nm (emission). Sample preparation involves only two steps: extraction of curcumin and drying of the extract. Following this procedure, the samples are reconstituted with ethyl acetate, and the relative fluorescence intensity is measured using a spectrofluorimeter. The method was validated as per CDER guidelines. The linearity of the method was found to be in the range of 100-500 ng/mL, with accuracy and precision lying within 2% RSD. The LOD and LOQ were found to be 15.3 and 46.1 ng/mL, respectively. The method was applied for pharmacokinetic evaluation in rats, and the AUC, Cmax, and Tmax were found to be 5580 ± 1006 h × ng/mL, 1526 ± 209 ng/mL, and 2.97 ± 0.28 h, respectively, with a plasma half-life of 1.14 ± 0.27 h.
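
    The pharmacokinetic figures quoted above (Cmax, Tmax, AUC, half-life) follow from standard non-compartmental arithmetic. A brief sketch with invented sampling data, assuming a linear-trapezoidal AUC and a log-linear terminal fit:

```python
import numpy as np

def pk_summary(t, c):
    """Non-compartmental PK metrics from one concentration-time profile."""
    cmax, tmax = c.max(), t[np.argmax(c)]
    auc = float(np.sum(np.diff(t) * (c[1:] + c[:-1]) / 2.0))  # linear trapezoid
    # terminal half-life from a log-linear fit to the last three points
    slope, _ = np.polyfit(t[-3:], np.log(c[-3:]), 1)
    return cmax, tmax, auc, np.log(2.0) / -slope

# invented sampling times (h) and plasma concentrations (ng/mL)
t = np.array([0.5, 1.0, 2.0, 3.0, 4.0, 6.0, 8.0])
c = np.array([620.0, 1100.0, 1490.0, 1200.0, 800.0, 350.0, 140.0])
cmax, tmax, auc, t_half = pk_summary(t, c)
print(f"Cmax={cmax:.0f} ng/mL, Tmax={tmax:.1f} h, AUC={auc:.0f} h*ng/mL, t1/2={t_half:.2f} h")
```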

  17. Development and validation of an HPLC method for tetracycline-related USP monographs.

    PubMed

    Hussien, Emad M

    2014-09-01

    A novel reversed-phase HPLC method was developed and validated for the assay of tetracycline hydrochloride and the limit of 4-epianhydrotetracycline hydrochloride impurity in tetracycline hydrochloride commercial bulk and pharmaceutical products. The method employed L1 (3 µm, 150 × 4.6 mm) columns, a mobile phase of 0.1% phosphoric acid and acetonitrile at a flow rate of 1.0 mL/min, and detection at 280 nm. The separation was performed in HPLC gradient mode. Forced degradation studies showed that tetracycline eluted as a spectrally pure peak and was well resolved from its degradation products. The fast degradation of tetracycline hydrochloride and 4-epianhydrotetracycline hydrochloride in solution was retarded by controlling the autosampler temperature at 4 °C and using 0.1% H3PO4 as diluent. The robustness of the method was tested starting with the maximum variations allowed in the US Pharmacopeia (USP) general chapter Chromatography <621>. The method was linear over the range 80-120% of the assay concentration (0.1 mg/mL) for tetracycline hydrochloride and 50-150% of the acceptance criteria specified in the individual USP monographs for 4-epianhydrotetracycline hydrochloride. The limit of quantification for 4-epianhydrotetracycline hydrochloride was 0.1 µg/mL, 20 times lower than the acceptance criteria. The method was specific, precise, accurate and robust.

  18. [Development and validation of event-specific quantitative PCR method for genetically modified maize LY038].

    PubMed

    Mano, Junichi; Masubuchi, Tomoko; Hatano, Shuko; Futo, Satoshi; Koiwa, Tomohiro; Minegishi, Yasutaka; Noguchi, Akio; Kondo, Kazunari; Akiyama, Hiroshi; Teshima, Reiko; Kurashima, Takeyo; Takabatake, Reona; Kitta, Kazumi

    2013-01-01

    In this article, we report a novel real-time PCR-based analytical method for quantitation of the GM maize event LY038. We designed LY038-specific and maize endogenous reference DNA-specific PCR amplifications. After confirming the specificity and linearity of the LY038-specific PCR amplification, we determined the conversion factor required to calculate the weight-based content of GM organism (GMO) in a multilaboratory evaluation. Finally, in order to validate the developed method, an interlaboratory collaborative trial according to the internationally harmonized guidelines was performed with blind DNA samples containing LY038 at the mixing levels of 0, 0.5, 1.0, 5.0 and 10.0%. The precision of the method was evaluated as the RSD of reproducibility (RSDR), and the values obtained were all less than 25%. The limit of quantitation of the method was judged to be 0.5% based on the definition of ISO 24276 guideline. The results from the collaborative trial suggested that the developed quantitative method would be suitable for practical testing of LY038 maize.
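
    In event-specific quantitation of this kind, GMO content is typically computed as the ratio of event-specific to endogenous-gene copy numbers, scaled by the experimentally determined conversion factor (Cf). A hedged sketch of that arithmetic (the numbers are illustrative, not values from the trial):

```python
def gmo_percent(event_copies, endogenous_copies, cf):
    """Weight-based GMO content (%) from real-time PCR copy-number estimates.

    cf: conversion factor, i.e. the event/endogenous copy-number ratio
    measured in a 100% GM reference sample."""
    return (event_copies / endogenous_copies) / cf * 100.0

# illustrative copy numbers interpolated from standard curves
print(f"{gmo_percent(event_copies=1.2e3, endogenous_copies=2.5e5, cf=0.5):.2f} %")  # 0.96 %
```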

  19. A Novel Point-of-Care Biomarker Recognition Method: Validation by Detecting Marker for Diabetic Nephropathy

    PubMed Central

    Pentyala, Sahana; Muller, John; Tumillo, Thomas; Roy, Avijit; Mysore, Pooja; Pentyala, Srinivas

    2015-01-01

    Biological fluid collection to identify and analyze different disease markers is a routine procedure in health care settings. Body fluids are as varied as urine, blood, mucus, cerebrospinal fluid (CSF), tears, and semen. The volumes of the collected fluids range from microliters (e.g., tears, CSF) to tens and hundreds of milliliters (blood, urine, etc.). In some manifestations, a disease marker (particularly a protein marker) can occur in trace amounts, yet the fluids collected are in large volumes. To identify these trace markers, cumbersome methods, expensive instruments, and trained personnel are required. We developed an easy method to rapidly capture, concentrate, and identify protein markers in large volumes of test fluids. This method uses two antibodies recognizing two different epitopes of the protein biomarker: Antibody-1 captures and concentrates the biomarker, and Antibody-2, adsorbed or conjugated to nanogold beads, detects the biomarker. This method was validated by capturing and detecting lipocalin-type prostaglandin D2 synthase, a urine marker implicated in diabetic nephropathy. A one-step collection, concentration, and detection device was designed based on this method. This device can replace many of the normal body fluid collection devices such as tubes and containers. A one-step fluid collection and biomarker capture and concentration device for rapid diagnosis of diseases has tremendous advantages in terms of cost and timely results.

  1. Description and validation of a limb scatter retrieval method for Odin/OSIRIS

    NASA Astrophysics Data System (ADS)

    Tukiainen, S.; Hassinen, S.; SeppäLä, A.; Auvinen, H.; KyröLä, E.; Tamminen, J.; Haley, C. S.; Lloyd, N.; Verronen, P. T.

    2008-02-01

    In this paper we present the Modified Onion Peeling (MOP) inversion method, which is used here for the first time to retrieve vertical profiles of stratospheric trace gases from Odin/OSIRIS limb scatter measurements. Since its original publication in 2002, the method has undergone major modifications, which are discussed here. The MOP method now uses a spectral microwindow for the NO2 retrieval, instead of the wide UV-visible band used for the ozone, air, and aerosol retrievals. We give a brief description of the algorithm itself and show its performance with both simulated and real data. Retrieved ozone and NO2 profiles from the OSIRIS measurements were compared with data from the GOMOS and HALOE instruments. No more than 5% difference was found between OSIRIS daytime and GOMOS nighttime ozone profiles between 21 and 45 km. The difference between OSIRIS and HALOE sunset NO2 mixing ratio profiles was at most 25% between 20 and 40 km. The neutral air density was compared with the ECMWF analyzed data, and a difference of around 5% was found at altitudes from 20 to 55 km. However, OSIRIS observations yield as much as 80% greater aerosol number density than GOMOS observations between 15 and 35 km. These validation results indicate that the quality of the MOP ozone, NO2, and neutral air products is good. The new version of the method introduced here is also easily extended to retrieve additional species of interest.
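
    Onion-peeling schemes invert limb measurements shell by shell, starting from the topmost tangent height, where only the outermost layer contributes. A toy sketch of the generic back-substitution idea (the path-length matrix and densities are invented; the actual MOP algorithm is considerably more elaborate):

```python
import numpy as np

def onion_peel(L, y):
    """Back-substitute y = L @ x, where L[i, j] is the path length of the
    line of sight with tangent layer i through layer j (zero for j < i)."""
    n = len(y)
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):                 # start at the topmost layer
        x[i] = (y[i] - L[i, i + 1:] @ x[i + 1:]) / L[i, i]
    return x

# toy 4-layer geometry: made-up path lengths (km) and layer densities
L = np.array([[50.0, 30.0, 20.0, 10.0],
              [ 0.0, 52.0, 31.0, 21.0],
              [ 0.0,  0.0, 55.0, 33.0],
              [ 0.0,  0.0,  0.0, 60.0]])
x_true = np.array([4.0, 3.0, 2.0, 1.0])
print(onion_peel(L, L @ x_true))                   # recovers x_true
```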

  2. European validation of Real-Time PCR method for detection of Salmonella spp. in pork meat.

    PubMed

    Delibato, Elisabetta; Rodriguez-Lazaro, David; Gianfranceschi, Monica; De Cesare, Alessandra; Comin, Damiano; Gattuso, Antonietta; Hernandez, Marta; Sonnessa, Michele; Pasquali, Frédérique; Sreter-Lancz, Zuzsanna; Saiz-Abajo, María-José; Pérez-De-Juan, Javier; Butrón, Javier; Prukner-Radovcic, Estella; Horvatek Tomic, Danijela; Johannessen, Gro S; Jakočiūnė, Džiuginta; Olsen, John E; Chemaly, Marianne; Le Gall, Francoise; González-García, Patricia; Lettini, Antonia Anna; Lukac, Maja; Quesne, Segolénè; Zampieron, Claudia; De Santis, Paola; Lovari, Sarah; Bertasi, Barbara; Pavoni, Enrico; Proroga, Yolande T R; Capuano, Federico; Manfreda, Gerardo; De Medici, Dario

    2014-08-01

    The classical microbiological method for detection of Salmonella spp. requires more than five days for final confirmation, and consequently there is a need for an alternative methodology for detection of this pathogen, particularly in food categories with a short shelf-life. This study presents an international (European-level) ISO 16140-based validation study of a non-proprietary Real-Time PCR-based method that can generate final results the day after sample analysis. It is based on an ISO-compatible enrichment coupled to an easy and inexpensive DNA extraction and a consolidated Real-Time PCR assay. Thirteen laboratories from seven European countries participated in this trial, and pork meat was selected as the food model. The limit of detection observed was down to 10 CFU per 25 g of sample, with excellent concordance and accordance values between samples and laboratories (100%). In addition, excellent values were obtained for relative accuracy, specificity and sensitivity (100%) when the results of the Real-Time PCR-based method were compared to those of the ISO 6579:2002 standard method. The results of this international trial demonstrate that the evaluated Real-Time PCR-based method represents an excellent alternative to the ISO standard: it shows equally solid performance, dramatically shortens the analytical process, and can be easily implemented routinely by Competent Authorities and Food Industry laboratories.
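
    The performance figures reported here (relative accuracy, sensitivity, specificity) come from a 2x2 comparison of the alternative method against the reference method. A minimal sketch with illustrative counts, assuming the usual ISO 16140-style definitions:

```python
def performance(tp, fp, tn, fn):
    """Relative accuracy, sensitivity and specificity of an alternative
    method judged against a reference method (2x2 comparison)."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    return accuracy, sensitivity, specificity

# illustrative counts: alternative method vs. ISO 6579 on spiked pork samples
acc, sens, spec = performance(tp=60, fp=0, tn=36, fn=0)
print(f"accuracy={acc:.0%}, sensitivity={sens:.0%}, specificity={spec:.0%}")
```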

  3. Enhancement of light propagation depth in skin: cross-validation of mathematical modeling methods.

    PubMed

    Kwon, Kiwoon; Son, Taeyoon; Lee, Kyoung-Joung; Jung, Byungjo

    2009-07-01

    Various techniques to enhance light propagation in skin have been studied in low-level laser therapy. In this study, three mathematical modeling methods were applied to five selected techniques so that we could understand the mechanisms that enhance light propagation in skin. The five techniques included increasing the power and the diameter of the laser beam, applying a hyperosmotic chemical agent (HCA), and whole and partial compression of the skin surface. The photon density profile for each of the five techniques was solved with three mathematical modeling methods: the finite element method (FEM), the Monte Carlo method (MCM), and the analytic solution method (ASM). We cross-validated the three mathematical modeling results by comparing photon density profiles and analyzing modeling error. The mathematical modeling results verified that the penetration depth of light can be enhanced if the incident beam power and diameter, the amount of HCA, or whole and partial skin compression is increased. In this study, light with wavelengths of 377 nm, 577 nm, and 633 nm was used.

  4. Validation of an official control method for enumeration of authorised probiotic yeast in animal feed.

    PubMed

    Leuschner, Renata G K; Bew, Jan; Bertin, Gérard

    2003-03-01

    An official control method, in the framework of Council Directive 70/524/EEC, for probiotic yeast used as a feed additive was validated in a collaborative study by twenty laboratories in 12 European countries. A pour plate method following ISO 7954 using chloramphenicol glucose yeast extract (CGYE) agar and a plate count method using CHROMagar Candida were used. Precision data in terms of repeatability (r) and reproducibility (R) of the method were determined using different feeding stuffs and three inoculation levels. Yeast was present in the samples, in mixtures with other probiotic feed additives, at a lower concentration, at a higher concentration, or not at all. The enumeration of yeast on CGYE agar showed, for the lower and higher concentrations, an RSD(r) of 2.4-4.9% and an RSD(R) of 7.7-8%, respectively, and was preferred by the majority of laboratories. CHROMagar Candida had an RSD(r) of 1.9-2.8% and an RSD(R) of 1.9-5.9%. For routine analysis the use of the pour plate technique is recommended; CHROMagar Candida can be used for confirmation of the species Saccharomyces cerevisiae. The methods are not recommended for mineral feeds. The results from this study are intended for consideration for adoption as CEN and ISO standards.
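
    Repeatability and reproducibility RSDs in a collaborative trial are conventionally derived from the variance components of a one-way ANOVA across laboratories (ISO 5725-style). A compact sketch with invented duplicate counts:

```python
import numpy as np

def rsd_r_and_rsd_R(data):
    """data: rows = laboratories, columns = replicate results.
    Returns (RSD_r, RSD_R) in percent from one-way ANOVA variance components."""
    p, n = data.shape
    lab_means = data.mean(axis=1)
    s_r2 = ((data - lab_means[:, None]) ** 2).sum() / (p * (n - 1))  # within-lab
    s_L2 = max(lab_means.var(ddof=1) - s_r2 / n, 0.0)                # between-lab
    grand = data.mean()
    return 100 * np.sqrt(s_r2) / grand, 100 * np.sqrt(s_r2 + s_L2) / grand

# invented duplicate counts (log10 CFU/g) from five laboratories
data = np.array([[7.1, 7.2], [7.4, 7.3], [7.0, 7.1], [7.3, 7.3], [7.2, 7.0]])
rsd_r, rsd_R = rsd_r_and_rsd_R(data)
print(f"RSD(r)={rsd_r:.1f}%, RSD(R)={rsd_R:.1f}%")
```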

  5. Validation of a novel derivatization method for GC-ECD determination of acrylamide in food.

    PubMed

    Notardonato, Ivan; Avino, Pasquale; Centola, Angela; Cinelli, Giuseppe; Russo, Mario Vincenzo

    2013-07-01

    This paper proposes a new method for the quantitative analysis of acrylamide in cereal-based foods and potato chips. The method uses derivatization with trifluoroacetic anhydride and analyses the resulting derivative by gas chromatography with electron-capture detection (GC-ECD). The effects of the derivatization conditions, including temperature, reaction time, and catalyst, on the acylation reaction were evaluated. Chromatographic analysis was performed on an SE-54 capillary column. Under the optimum conditions, good retention and peak response were achieved for the acrylamide derivative. The analytical method was fully validated by assessment of the LOD and LOQ (1 ng g(-1) and 25 ng g(-1), with relative standard deviations (RSD) of 2.1 and 3.6, respectively), linearity (R = 0.9935 over the range 0.03-10 μg g(-1)), and extraction recovery (>96%, with RSD below 2.0, for acrylamide spiked at 1, 20, 50, and 100 ng g(-1); 99.8% for acrylamide content >1000 ng g(-1)). The method requires no clean-up of the acrylamide derivative before injection, and it has been successfully used to determine acrylamide levels in different commercial cereal-based foods, French fries, and potato chips.

  6. Validation of prescribing appropriateness criteria for older Australians using the RAND/UCLA appropriateness method

    PubMed Central

    Basger, Benjamin Joseph; Chen, Timothy Frank; Moles, Rebekah Jane

    2012-01-01

    Objective To further develop and validate previously published national prescribing appropriateness criteria to assist in identifying drug-related problems (DRPs) for commonly occurring medications and medical conditions in older (≥65 years old) Australians. Design RAND/UCLA appropriateness method. Participants A panel of medication management experts was identified, consisting of geriatricians/pharmacologists, clinical pharmacists and disease management advisors to organisations that produce Australian evidence-based therapeutic publications. This resulted in a round-one panel of 15 members and a round-two panel of 12 members. Main outcome measure Agreement on all criteria. Results Forty-eight prescribing criteria were rated. In the first rating round, conducted via email, there was disagreement regarding 17 of the criteria according to median panel ratings. During a face-to-face second-round meeting, discussion resulted in the retention of 25 criteria after amendments, agreement on 14 criteria with no changes required, and deletion of 9 criteria. Two new criteria were added, resulting in a final validated list of 41 prescribing appropriateness criteria. Agreement after round two was reached for all 41 criteria, as measured by median panel ratings and the dispersion of panel ratings, based on the interpercentile range. Conclusions A set of 41 Australian prescribing appropriateness criteria was validated by an expert panel. Use of these criteria, together with clinical judgement and other medication review processes such as patient interview, is intended to assist in improving patient care by efficiently detecting potential DRPs related to commonly occurring medicines and medical conditions in older Australians. These criteria may also contribute to the medication management education of healthcare professionals.
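
    In the RAND/UCLA method, agreement on a criterion is judged from the median panel rating and the dispersion of ratings, often summarized by an interpercentile range. A simplified sketch on the standard 1-9 scale (the disagreement rule shown is one common variant, not necessarily the exact rule used by this panel):

```python
import numpy as np

def rate_criterion(ratings):
    """Classify one criterion from panelists' 1-9 appropriateness ratings."""
    r = np.asarray(ratings)
    median = np.median(r)
    ipr = np.percentile(r, 70) - np.percentile(r, 30)  # interpercentile range
    if ipr > 2.0:                     # wide dispersion -> no agreement
        return median, "disagreement"
    if median >= 7:
        return median, "appropriate"
    if median <= 3:
        return median, "inappropriate"
    return median, "uncertain"

print(rate_criterion([8, 9, 7, 8, 9, 8, 7, 9, 8, 7, 8, 9]))  # -> appropriate
```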

  7. CE-C(4)D method development and validation for the assay of ciprofloxacin.

    PubMed

    Paul, Prasanta; Van Laeken, Christophe; Sänger-van de Griend, Cari; Adams, Erwin; Van Schepdael, Ann

    2016-09-10

    A capillary electrophoresis method with capacitively coupled contactless conductivity detection (CE-C(4)D) has been developed, optimized and validated for the determination of ciprofloxacin. Ciprofloxacin is a fluoroquinolone antibiotic with broad-spectrum bactericidal activity, recommended for complicated respiratory infections, sexually transmitted diseases, tuberculosis, bacterial diarrhea, etc. Method development was conducted with a major focus on the quality by design (QbD) approach. During method development, multiple buffers were tried at different ionic strengths. However, the optimized method finally involved a very simple background electrolyte, monosodium citrate at a concentration of 10 mM without pH adjustment. The optimized CE-C(4)D method used an uncoated fused silica capillary (59/39 cm, 50 μm i.d.) and hydrodynamic sample injection at a pressure of 0.5 p.s.i. for 5 s. The actual separation was conducted for 10 min at normal polarity with a voltage of 20 kV, corresponding to a current of 5.9 μA. LiCl (1 mg/mL) was used as an internal standard. The optimized method is robust and accurate (recovery >98%) and rendered the ciprofloxacin peak within five minutes with good linearity (R(2)>0.999) in the concentration range of 0.0126-0.8 mg/mL. Repeatability, expressed as the percentage relative standard deviation (%RSD) of the relative peak areas (RPA), was good both intra-day (<3%) and inter-day (3.1%). The method, proven to be free of matrix interference, showed that the estimated percent content of ciprofloxacin (102%) was within the official requirements. Moreover, due to its ease of use and robustness, the method should also be applicable in less well controlled laboratory environments.

  8. A validated spectrofluorimetric method for the determination of nifuroxazide through coumarin formation using experimental design

    PubMed Central

    2013-01-01

    Background Nifuroxazide (NF) is an oral nitrofuran antibiotic with a wide range of bactericidal activity against gram-positive and gram-negative enteropathogenic organisms. It is formulated either in single form, as an intestinal antiseptic, or in combination with drotaverine (DV) for the treatment of gastroenteritis accompanied by gastrointestinal spasm. Spectrofluorimetry is a convenient and sensitive technique for pharmaceutical quality control. The new proposed spectrofluorimetric method allows the determination of NF either in single form or in binary mixture with DV. Furthermore, the experimental conditions were optimized using experimental design, which has many advantages over the older one-variable-at-a-time (OVAT) approach. Results A novel and sensitive spectrofluorimetric method was designed and validated for the determination of NF in pharmaceutical formulations. The method was based upon the formation of a highly fluorescent coumarin compound by the reaction between NF and ethylacetoacetate (EAA) using sulfuric acid as catalyst. The fluorescence was measured at 390 nm upon excitation at 340 nm. Experimental design was used to optimize the experimental conditions: the volumes of EAA and sulfuric acid, temperature and heating time were considered the critical factors to be studied in order to establish optimum fluorescence, and factors were co-varied, two at a time, at three levels. Regression analysis revealed good correlation between fluorescence intensity and concentration over the range 20-400 ng ml-1. The suggested method was successfully applied to the determination of NF in pure and capsule forms. The procedure was validated in terms of linearity, accuracy, precision, limit of detection and limit of quantification. The selectivity of the method was investigated by analysis of NF in the presence of the co-mixed drug DV, where no interference was observed. The reaction pathway was suggested and the structure of the fluorescent product was proposed.
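
    A full factorial design simply evaluates the response at every combination of factor levels. A toy sketch for two factors at three levels each (the factor names, levels, and the stand-in response surface are invented for illustration):

```python
from itertools import product

# invented 3-level settings for two critical factors
eaa_volumes = [0.5, 1.0, 1.5]     # mL of ethylacetoacetate (hypothetical)
heat_times = [10, 20, 30]         # minutes of heating (hypothetical)

def fluorescence(v, t):
    """Stand-in response surface; in practice this is the measured intensity."""
    return 100.0 - 40.0 * (v - 1.0) ** 2 - 0.05 * (t - 20.0) ** 2

runs = [(v, t, fluorescence(v, t)) for v, t in product(eaa_volumes, heat_times)]
best = max(runs, key=lambda run: run[2])
print(f"optimum: EAA={best[0]} mL, heating={best[1]} min, response={best[2]:.1f}")
```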

  9. Fatal Intoxication Involving 3-MeO-PCP: A Case Report and Validated Method.

    PubMed

    Bakota, Erica; Arndt, Crystal; Romoser, Amelia A; Wilson, Stephen K

    2016-09-01

    We present in this case report a validated method for the accurate quantitative analysis of 3-methoxyphencyclidine (3-MeO-PCP) to determine postmortem blood concentrations of this PCP analog. A 29-year-old male with a history of illicit drug use was found unresponsive in his bed with a bag of white powder next to him. Resuscitation efforts were unsuccessful and the individual was pronounced dead 9 minutes after arrival at the hospital. Initial ELISA screening suggested the presence of PCP in the decedent's blood. However, confirmatory testing revealed no detectable PCP. Instead, a large peak corresponding to an m/z 274.218 species, with a retention time similar to PCP, was present on an LC-TOF-MS drug screen, suggesting a possible PCP analog. This mass corresponds specifically to a methoxy-PCP analog, several of which are available for purchase online. Standards for 3-MeO-PCP and 4-MeO-PCP were obtained and injected on the same instrument. Although the 3- and 4-MeO-PCP analogs have identical masses and retention times, they are still distinguishable through their mass spectra. The peak from the decedent's sample matched both the mass spectrum and the retention time of 3-MeO-PCP. A quantitative LC-MS-MS method was subsequently developed and validated for casework. Analysis using this method revealed a concentration of 139 ± 41 µg/L 3-MeO-PCP in the decedent's blood. Diphenhydramine (4.1 ± 0.7 mg/L), marijuana metabolite (presumptive positive, confirmation not performed) and a small amount of amphetamine (<0.10 mg/L) were also found in the decedent's blood. The cause of death was determined to be combined 3-MeO-PCP, diphenhydramine and amphetamine toxicity. The manner of death was certified as an accident.
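
    Distinguishing isobaric analogs with identical retention times rests on comparing fragmentation patterns, for which a cosine similarity between spectra is a common metric. A hedged sketch with invented fragment intensities (not the casework spectra):

```python
import numpy as np

def cosine_match(spec_a, spec_b):
    """Cosine similarity between two centroided mass spectra given as
    {m/z: intensity} dicts; 1.0 means an identical fragmentation pattern."""
    mz = sorted(set(spec_a) | set(spec_b))
    a = np.array([spec_a.get(m, 0.0) for m in mz])
    b = np.array([spec_b.get(m, 0.0) for m in mz])
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# invented fragment intensities for two isobaric analogs
case_spectrum = {274.2: 100, 231.2: 45, 121.1: 30, 86.1: 80}
std_3meo      = {274.2: 100, 231.2: 42, 121.1: 28, 86.1: 85}
std_4meo      = {274.2: 100, 245.2: 50, 189.1: 35, 86.1: 60}
print(cosine_match(case_spectrum, std_3meo))  # high -> consistent with 3-MeO-PCP
print(cosine_match(case_spectrum, std_4meo))  # lower
```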

  10. Development and validation of a LC-MS method for quantitation of ergot alkaloids in lateral saphenous vein tissue

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A liquid chromatography-mass spectrometry (LC/MS) method for simultaneous quantitation of seven ergot alkaloids (lysergic acid, ergonovine, ergovaline, ergocornine, ergotamine, ergocryptine and ergocrystine) in vascular tissue was developed and validated. Reverse-phase chromatography, coupled to an...

  11. In-house validation and quality control of real-time PCR methods for GMO detection: a practical approach.

    PubMed

    Ciabatti, I; Froiio, A; Gatto, F; Amaddeo, D; Marchesi, U

    2006-01-01

    GMO detection and quantification methods in the EU are mainly based on real-time PCR. The analytical methods in use must be validated, first on an intra-laboratory scale and through a collaborative trial thereafter. Since a consensual protocol for intra-laboratory validation of real-time PCR methods is lacking, we provide a practical approach for the in-house validation of quantitative real-time PCR methods, establishing acceptability criteria and quality controls for PCR runs. Parameters such as limit of detection, limit of quantification, precision, trueness, linear dynamic range, PCR efficiency, robustness and specificity are considered. The protocol is sufficiently detailed to be directly applicable, increases the reliability of results and their harmonization among different laboratories, and represents a necessary preliminary step before proceeding to a time-consuming and costly full validation study.
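
    One acceptance parameter singled out here, PCR efficiency, falls out of the slope of the standard curve (Cq versus log10 copy number); a slope of about -3.32 corresponds to 100% efficiency. A brief sketch with an illustrative dilution series:

```python
import numpy as np

def pcr_efficiency(log10_copies, cq):
    """Amplification efficiency and linearity from a qPCR standard curve."""
    slope, _ = np.polyfit(log10_copies, cq, 1)
    efficiency = 10.0 ** (-1.0 / slope) - 1.0
    r2 = np.corrcoef(log10_copies, cq)[0, 1] ** 2
    return slope, efficiency, r2

# illustrative 10-fold dilution series
logs = np.array([5.0, 4.0, 3.0, 2.0, 1.0])
cq = np.array([18.1, 21.5, 24.9, 28.2, 31.6])
slope, eff, r2 = pcr_efficiency(logs, cq)
print(f"slope={slope:.2f}, efficiency={eff:.1%}, R^2={r2:.4f}")
```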

  12. A new validated analytical method for the quality control of red ginseng products

    PubMed Central

    Kim, Il-Woung; Cha, Kyu-Min; Wee, Jae Joon; Ye, Michael B.; Kim, Si-Kwan

    2013-01-01

    The main active components of Panax ginseng are ginsenosides. Ginsenosides Rb1 and Rg1 are accepted worldwide as marker substances for quality control. The analytical methods currently used to detect these two compounds unfairly penalize steamed and dried (red) P. ginseng preparations, because red ginseng has a lower content of those ginsenosides than white ginseng. To manufacture red ginseng products from fresh ginseng, the ginseng roots are exposed to high temperatures for many hours. This heating process converts the naturally occurring ginsenosides Rb1 and Rg1 into artifact ginsenosides such as ginsenosides Rg3, Rg5, Rh1, and Rh2, among others. This study highlights the absurdity of the current analytical practice by investigating the time-dependent changes in the contents of crude saponin and the major natural and artifact ginsenosides during simmering. The results lead us to recommend (20S)- and (20R)-ginsenoside Rg3 as new reference materials to complement the current P. ginseng reference materials, ginsenosides Rb1 and Rg1. An attempt has also been made to establish validated qualitative and quantitative analytical procedures for these four compounds that meet International Conference on Harmonisation (ICH) guidelines for specificity, linearity, range, accuracy, precision, detection limit, quantitation limit, robustness and system suitability. Based on these results, we suggest a validated analytical procedure which conforms to ICH guidelines and values equally the contents of ginsenosides in white and red ginseng preparations.

  13. Method validation and analysis of nine dithiocarbamates in fruits and vegetables by LC-MS/MS.

    PubMed

    Schmidt, B; Christensen, H B; Petersen, A; Sloth, J J; Poulsen, M E

    2013-01-01

    An analytical method for the separation and quantitative determination of nine dithiocarbamates (DTCs) in fruits and vegetables by LC-MS/MS was developed, validated and applied to samples purchased in local supermarkets. The nine DTCs were ziram, ferbam, thiram, maneb, zineb, nabam, metiram, mancozeb and propineb. Validation parameters of mean recovery for two matrices at two concentration levels, relative repeatability (RSDr), relative within-laboratory reproducibility (RSDR) and LOD were obtained for the nine DTCs. The results from the analysis of fruits and vegetables served as the basis for an exposure assessment within the given commodities and a risk assessment comparing the calculated exposure to the acceptable daily intake and acute reference dose for various exposure groups. The analysis indicated positive findings of DTCs in apples, pears, plums, table grapes, papaya and broccoli at concentrations ranging from 0.03 mg/kg to 2.69 mg/kg, expressed as the equivalent amount of CS2. None of the values exceeded the maximum residue levels (MRLs) set by the European Union; furthermore, it was not possible to state whether illegal use had taken place, because a clear differentiation between the various DTCs was lacking in the LC-MS/MS analysis. The exposure and risk assessment showed that the acute reference dose was exceeded only for maneb: in apples for infants in the United Kingdom and in apple juice for children in Germany.
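
    The risk-assessment step compares estimated short-term intake with the acute reference dose (ARfD). A minimal sketch of that arithmetic; the portion size, body weight, and ARfD below are invented for illustration, not values from the study:

```python
def acute_exposure_pct_arfd(residue_mg_kg, portion_g, body_weight_kg, arfd_mg_kg_bw):
    """Short-term dietary exposure as a percentage of the acute reference dose."""
    intake_mg_kg_bw = residue_mg_kg * (portion_g / 1000.0) / body_weight_kg
    return 100.0 * intake_mg_kg_bw / arfd_mg_kg_bw

# invented worked example: 2.69 mg/kg residue, 200 g apple portion, 15 kg child,
# and an assumed ARfD of 0.02 mg/kg bw (illustrative only)
print(f"{acute_exposure_pct_arfd(2.69, 200.0, 15.0, 0.02):.0f}% of the ARfD")
```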

  14. Verification and Validation of a Chemical Reaction Solver Coupled to the Piecewise Parabolic Method

    NASA Astrophysics Data System (ADS)

    Attal, Nitesh; Ramaprabhu, Praveen; Hossain, Jahed; Karkhanis, Varad; Roy, Sukesh; Gord, James; Uddin, Mesbah

    2012-11-01

    We present a detailed chemical kinetics reaction solver coupled to the Piecewise Parabolic Method (PPM) embedded in the widely used astrophysical FLASH code. The FLASH code solves the compressible Euler equations with a directionally split PPM and Adaptive Mesh Refinement (AMR). The reaction network is solved using a library of coupled ODE solvers specialized for handling stiff systems of equations. Finally, the diffusion of heat, mass, and momentum is handled either through an update of the fluxes of each quantity or by directly solving a diffusion equation for each. The resulting product is capable of handling a variety of physics, such as gas-phase chemical kinetics; diffusive transport of mass, momentum, and heat; shocks; sharp interfaces; multi-species mixtures; and thermal radiation. We present results from the verification and validation of the above capabilities through comparison with analytical solutions and published numerical and experimental data. Our validation cases include advection of reacting fronts in 1D and 2D, laminar premixed flames in a Bunsen burner configuration, and shock-driven combustion. We acknowledge funding from Spectral Energies LLC.
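
    Stiff reaction networks of the kind described are normally integrated with implicit solvers. As a self-contained stand-in for the FLASH network (which is not reproduced here), the classic Robertson problem integrated with SciPy's BDF method illustrates the approach:

```python
import numpy as np
from scipy.integrate import solve_ivp

def robertson(t, y):
    """Classic stiff three-species reaction network (Robertson problem)."""
    y1, y2, y3 = y
    return [-0.04 * y1 + 1.0e4 * y2 * y3,
            0.04 * y1 - 1.0e4 * y2 * y3 - 3.0e7 * y2 ** 2,
            3.0e7 * y2 ** 2]

sol = solve_ivp(robertson, (0.0, 1.0e5), [1.0, 0.0, 0.0],
                method="BDF", rtol=1e-8, atol=1e-10)
print(sol.y[:, -1])   # species fractions at t = 1e5 s
```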

  15. A new validated analytical method for the quality control of red ginseng products.

    PubMed

    Kim, Il-Woung; Cha, Kyu-Min; Wee, Jae Joon; Ye, Michael B; Kim, Si-Kwan

    2013-10-01

    The main active components of Panax ginseng are ginsenosides. Ginsenosides Rb1 and Rg1 are accepted worldwide as marker substances for quality control. The analytical methods currently used to detect these two compounds unfairly penalize steamed and dried (red) P. ginseng preparations, because red ginseng has a lower content of those ginsenosides than white ginseng. To manufacture red ginseng products from fresh ginseng, the ginseng roots are exposed to high temperatures for many hours. This heating process converts the naturally occurring ginsenosides Rb1 and Rg1 into artifact ginsenosides such as ginsenosides Rg3, Rg5, Rh1, and Rh2, among others. This study highlights the absurdity of the current analytical practice by investigating the time-dependent changes in the contents of crude saponin and the major natural and artifact ginsenosides during simmering. The results lead us to recommend (20S)- and (20R)-ginsenoside Rg3 as new reference materials to complement the current P. ginseng reference materials, ginsenosides Rb1 and Rg1. An attempt has also been made to establish validated qualitative and quantitative analytical procedures for these four compounds that meet International Conference on Harmonisation (ICH) guidelines for specificity, linearity, range, accuracy, precision, detection limit, quantitation limit, robustness and system suitability. Based on these results, we suggest a validated analytical procedure which conforms to ICH guidelines and values equally the contents of ginsenosides in white and red ginseng preparations.

  16. Advanced validation of CFD-FDTD combined method using highly applicable solver for reentry blackout prediction

    NASA Astrophysics Data System (ADS)

    Takahashi, Yusuke

    2016-01-01

    An analysis model of the plasma flow and electromagnetic waves around a reentry vehicle, for predicting radio frequency blackout during aerodynamic heating, was developed in this study. The model was validated against experimental results from the radio attenuation measurement program. The plasma flow properties, such as electron number density, in the shock layer and wake region were obtained using a newly developed unstructured-grid solver that incorporated real gas effect models and could treat thermochemically non-equilibrium flow. To predict the electromagnetic waves in plasma, a frequency-dependent finite-difference time-domain method was used. Moreover, the complicated behaviour of electromagnetic waves in the plasma layer during atmospheric reentry was clarified at several altitudes. The prediction performance of the combined model was evaluated using profiles and peak values of the electron number density in the plasma layer. In addition, to validate the models, the signal losses measured during communication with the reentry vehicle were directly compared with the predicted results. Based on the study, it is suggested that the present analysis model accurately predicts radio frequency blackout and the plasma-induced attenuation of communication signals.
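
    Blackout occurs roughly where the local plasma frequency exceeds the link frequency. A small sketch of that cutoff criterion using the standard cold-plasma approximation (a simplification of the paper's frequency-dependent FDTD treatment):

```python
import math

E_CHARGE = 1.602176634e-19   # elementary charge, C
E_MASS = 9.1093837015e-31    # electron mass, kg
EPS0 = 8.8541878128e-12      # vacuum permittivity, F/m

def plasma_frequency_hz(n_e):
    """Cold-plasma frequency for electron number density n_e (m^-3)."""
    return math.sqrt(n_e * E_CHARGE**2 / (EPS0 * E_MASS)) / (2.0 * math.pi)

def critical_density(f_hz):
    """Electron density above which a wave of frequency f_hz is cut off."""
    return (2.0 * math.pi * f_hz)**2 * EPS0 * E_MASS / E_CHARGE**2

print(f"{plasma_frequency_hz(1e18):.3e} Hz")   # ~9 GHz for n_e = 1e18 m^-3
print(f"{critical_density(1.6e9):.3e} m^-3")   # illustrative L-band link
```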

  17. Validation of the Delvotest SP NT DA. Performance Tested Method 011101.

    PubMed

    Hennart, Stephen L A; Faragher, John

    2012-01-01

    Delvotest SP NT DA is designed to test milk for the presence of antibacterial substances, such as antibiotics. The test is made of an agar gel containing bacterial spores and a color indicator. The milk sample is added onto the agar gel, and the test is incubated at 64°C. The principle of the test is based on the diffusion of possible inhibitory substances that may be present in the milk sample into agar. This reduces growth and acid production by the test organism, and delays or prevents the agar from changing color from purple to yellow. The Delvotest Accelerator is an automated system in which the plates containing the milk to be analyzed are placed for incubation. The Accelerator automatically detects the end of the incubation and reads the results. A sample containing antibiotic will be noted as "positive." A sample without antibiotics or with antibiotics at concentrations below detection level will be noted as "negative." The present report includes all technical details about the Delvotest SP NT DA, and the results of the validation study. The validation study demonstrates that the Delvotest SP NT DA conforms to the product performance claims and confirms the robustness of the test. The Delvotest SP NT DA is, therefore, granted Performance Tested Method certification.

  18. Development and Content Validation of the Information Assessment Method for Patients and Consumers

    PubMed Central

    Bartlett, Gillian; Grad, Roland M; Tang, David L; Johnson-Lafleur, Janique; Shulha, Michael; Barbosa Galvão, Maria Cristiane; Ricarte, Ivan LM; Stephenson, Randolph; Shohet, Linda; Hutsul, Jo-Anne; Repchinsky, Carol A; Rosenberg, Ellen; Burnand, Bernard; Légaré, France; Dunikowski, Lynn; Murray, Susan; Boruff, Jill; Frati, Francesca; Kloda, Lorie; Macaulay, Ann; Lagarde, François; Doray, Geneviève

    2014-01-01

    Background Online consumer health information addresses health problems, self-care, disease prevention, and health care services and is intended for the general public. Using this information, people can improve their knowledge, participation in health decision-making, and health. However, there are no comprehensive instruments to evaluate the value of health information from a consumer perspective. Objective We collaborated with information providers to develop and validate the Information Assessment Method for all (IAM4all) that can be used to collect feedback from information consumers (including patients), and to enable a two-way knowledge translation between information providers and consumers. Methods Content validation steps were followed to develop the IAM4all questionnaire. The first version was based on a theoretical framework from information science, a critical literature review and prior work. Then, 16 laypersons were interviewed on their experience with online health information and specifically their impression of the IAM4all questionnaire. Based on the summaries and interpretations of interviews, questionnaire items were revised, added, and excluded, thus creating the second version of the questionnaire. Subsequently, a panel of 12 information specialists and 8 health researchers participated in an online survey to rate each questionnaire item for relevance, clarity, representativeness, and specificity. The result of this expert panel contributed to the third, current, version of the questionnaire. Results The current version of the IAM4all questionnaire is structured by four levels of outcomes of information seeking/receiving: situational relevance, cognitive impact, information use, and health benefits. Following the interviews and the expert panel survey, 9 questionnaire items were confirmed as relevant, clear, representative, and specific. To improve readability and accessibility for users with a lower level of literacy, 19 items were reworded

  1. Experimental validation of a method characterizing bow tie filters in CT scanners using a real-time dose probe

    SciTech Connect

    McKenney, Sarah E.; Nosratieh, Anita; Gelskey, Dale; Yang Kai; Huang Shinying; Chen Lin; Boone, John M.

    2011-03-15

    Purpose: Beam-shaping or "bow tie" (BT) filters are used to spatially modulate the x-ray beam in a CT scanner, but the conventional method of step-and-shoot measurement for characterizing a beam's profile is tedious and time-consuming. The characterization of bow tie relative attenuation (COBRA) method, which relies on a real-time dosimeter to address the issues of conventional measurement techniques, was previously demonstrated using computer simulations. In this study, the feasibility of the COBRA theory is further validated experimentally through the use of a prototype real-time radiation meter and a known BT filter. Methods: The COBRA method consisted of four basic steps: (1) the probe was placed at the edge of the scanner's field of view; (2) a real-time signal train was collected as the scanner's gantry rotated with the x-ray beam on; (3) the signal train without a BT filter was modeled using peak values measured in the signal train of step 2; and (4) the relative attenuation of the BT filter was estimated from the filtered and unfiltered data sets. The prototype probe was first verified to have an isotropic and linear response to incident x-rays. The COBRA method was then tested on a dedicated breast CT scanner with a custom-designed BT filter and compared to the conventional step-and-shoot characterization of the BT filter. Using basis decomposition of dual-energy signal data, the thickness of the filter was estimated and compared to the BT filter's manufacturing specifications. The COBRA method was also demonstrated on a clinical whole-body CT scanner using the body BT filter. The relative attenuation was calculated at four discrete x-ray tube potentials and used to estimate the thickness of the BT filter. Results: The prototype probe was found to have a linear and isotropic response to x-rays. The relative attenuation produced by the COBRA method fell within the error of the relative attenuation measured with the step-and-shoot method.
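
    The essence of step (4) is a log-ratio of the filtered and unfiltered signal trains: at each gantry angle, the relative attenuation follows from how much the bow tie filter suppresses the probe signal. A schematic sketch under that simplification (toy profiles, not the published COBRA implementation):

```python
import numpy as np

def relative_attenuation(filtered, unfiltered):
    """Relative attenuation profile of a bow tie filter versus gantry angle,
    from real-time probe signal trains acquired with the filter in place and
    with the unfiltered train modeled from the peak signal values."""
    return -np.log(np.asarray(filtered) / np.asarray(unfiltered))

angles = np.linspace(-np.pi, np.pi, 360)
unfiltered = np.full_like(angles, 100.0)                   # modeled flat air signal
filtered = 100.0 * np.exp(-0.8 * np.abs(np.sin(angles)))   # toy BT profile
mu_t = relative_attenuation(filtered, unfiltered)
print(f"max relative attenuation: {mu_t.max():.2f}")       # thickest part of the filter
```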

  2. Diagnostic Methods of Helicobacter pylori Infection for Epidemiological Studies: Critical Importance of Indirect Test Validation

    PubMed Central

    Miftahussurur, Muhammad; Yamaoka, Yoshio

    2016-01-01

    Among the methods developed to detect H. pylori infection, determining the gold standard remains debatable, especially for epidemiological studies. Due to the decreasing sensitivity of direct diagnostic tests (histopathology and/or immunohistochemistry [IHC], the rapid urease test [RUT], and culture), several indirect tests, including antibody-based tests (serology and urine test), the urea breath test (UBT), and the stool antigen test (SAT), have been developed to diagnose H. pylori infection. Among the indirect tests, UBT and SAT have become the best methods for determining active infection. While antibody-based tests, especially serology, are widely available and relatively sensitive, their specificity is low. Guidelines indicate that no single test can be considered the gold standard for the diagnosis of H. pylori infection and that one should consider each method's advantages and disadvantages. Based on four epidemiological studies, culture and RUT present sensitivities of 74.2–90.8% and 83.3–86.9% and specificities of 97.7–98.8% and 95.1–97.2%, respectively, when using IHC as the gold standard. The sensitivity of serology is quite high, but that of the urine test is lower than that of the other methods. Thus, indirect test validation is important, although some commercial kits propose universal cut-off values.

  3. Columbia River Stock Identification Study; Validation of Genetic Method, 1980-1981 Final Report.

    SciTech Connect

    Milner, George B.; Teel, David J.; Utter, Fred M.

    1981-06-01

    The reliability of a method for obtaining maximum likelihood estimates of component stocks in mixed populations of salmonids, based on the frequencies of genetic variants in the mixed population and in potentially contributing stocks, was tested in 1980. A data base of 10 polymorphic loci from 14 hatchery stocks of spring chinook salmon of the Columbia River was used to estimate the proportions of these stocks in four different "blind" mixtures whose true composition was revealed only after the estimates had been obtained. The accuracy and precision of these blind tests validated the genetic method as a valuable means of identifying components of stock mixtures. Properties of the genetic method were further examined by simulation studies using the pooled data of the four blind tests as a mixed fishery. Replicated tests with sample sizes between 100 and 1,000 indicated that actual standard deviations of estimated contributions were consistently lower than calculated standard deviations; this difference diminished as sample size increased. It is recommended that future applications of the method be preceded by simulation studies to identify the levels of sampling required for acceptable accuracy and precision. Variables in such studies include the stocks involved, the loci used, and the genetic differentiation among stocks. 8 refs., 6 figs., 4 tabs.
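
    Maximum likelihood estimates of stock proportions in a mixture are commonly obtained with an EM iteration over per-stock genotype likelihoods. A toy sketch (three stocks, fabricated likelihoods; real genetic stock identification works from per-locus allele frequencies):

```python
import numpy as np

def em_mixture(lik, n_iter=200):
    """lik[i, k]: likelihood of fish i's genotype under stock k.
    Returns maximum likelihood stock proportions via EM."""
    n, k = lik.shape
    theta = np.full(k, 1.0 / k)
    for _ in range(n_iter):
        resp = lik * theta                 # E-step: unnormalized responsibilities
        resp /= resp.sum(axis=1, keepdims=True)
        theta = resp.mean(axis=0)          # M-step: update mixture proportions
    return theta

rng = np.random.default_rng(0)
# fabricate genotype likelihoods for a mixture drawn ~60/30/10 from 3 stocks
true = np.array([0.6, 0.3, 0.1])
stocks = rng.choice(3, size=500, p=true)
lik = rng.uniform(0.05, 0.2, size=(500, 3))
lik[np.arange(500), stocks] += 0.8         # genotype most likely under true stock
print(em_mixture(lik))                     # close to the true proportions
```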

  4. Validation of a GC-MS screening method for anabolizing agents in aqueous nutritional supplements.

    PubMed

    Thuyne, W Van; Delbeke, F T

    2005-01-01

    A sensitive and selective method for the screening of anabolizing agents in aqueous nutritional supplements is described and validated. A total of 28 different anabolizing agents are screened for, including testosterone and its prohormones, nandrolone and its prohormones, stanozolol, and metandienone. The analytes are extracted from the aqueous nutritional supplements by liquid-liquid extraction with a mixture of pentane and freshly distilled diethyl ether (1:1) after the supplements have been made alkaline with a NaHCO3-K2CO3 (2:1) buffer. The anabolizing agents are derivatized with a mixture of MSTFA-NH4I-ethanethiol (320:1:2), as routinely used for the screening of anabolic steroids extracted from urine. The derivatives are analyzed by gas chromatography (GC)-mass spectrometry (MS) in selected ion monitoring mode. The limits of detection range from 1 to 10 ng/mL. One aqueous nutritional supplement (a creatine serum) was analyzed with this screening method and was found to contain dehydroepiandrosterone (DHEA) at very low concentrations. The presence of DHEA could be confirmed by GC-MS-MS. Results of the application of this method, and of a previously described similar method for solid nutritional supplements, are given.

  5. The validity of two methods of mandibular superimposition: a comparison with tantalum implants.

    PubMed

    Springate, S D; Jones, A G

    1998-03-01

    The aim of this investigation was to examine and compare the validity of Björk's and Ricketts' methods for the superimposition of serial cephalometric radiographs of the mandible, for the analysis of changes over the duration of routine orthodontic treatment in growing subjects (approximately 2 years). Pre- and post-treatment lateral cephalometric radiographs of 23 children with tantalum markers implanted in the mandible were studied. The differences in the positions of six dental and skeletal landmarks between superimposition on Björk's structures and on Ricketts' corpus axis were compared with those obtained on the basis of the implants. A rotational effect was found for the corpus axis, resulting from differential movement of Xi point with growth, whereas Björk's method yielded results essentially similar to those of the implant-based superimposition. This resulted in statistically significant median differences between the two methods for all landmarks except pogonion and menton. The magnitude of the differences increased with distance from the central core of the mandible and was generally greater horizontally than vertically. Although most differences were less than 2 mm, approximately 10% of the subjects showed differences greater than 4 mm for molar and incisor landmarks. These findings suggest that, for growing subjects, Björk's method should be preferred to Ricketts', which cannot be relied on to indicate the true (intramandibular) changes during orthodontic treatment.

  6. Collaborative validation of the quantification method for suspected allergens and test of an automated data treatment.

    PubMed

    Chaintreau, Alain; Cicchetti, Esmeralda; David, Nathalie; Earls, Andy; Gimeno, Pascal; Grimaud, Béatrice; Joulain, Daniel; Kupfermann, Nikolai; Kuropka, Gryta; Saltron, Frédéric; Schippa, Christine

    2011-10-28

    Previous publications investigated different data treatment strategies for quantification of volatile suspected allergens by GC/MS. This publication presents the validation results obtained on "ready to inject" samples under reproducibility conditions following inter-laboratory ring-testing. The approach is based on the monitoring of three selected ions per analyte using two different GC capillary columns. To aid the analysts a decisional tree is used for guidance during the interpretation of the analytical results. The method is evaluated using a fragrance oil concentrate spiked with all suspected allergens to mimic the difficulty of a real sample extract or perfume oil. At the concentrations of 10 and 100mg/kg, imposed by Directive 76/768/EEC for labeling of leave-on and rinse-off cosmetics, the mean bias is +14% and -4%, respectively. The method is linear for all analytes, and the prediction intervals for each analyte have been determined. To speed up the analyst's task, an automated data treatment is also proposed. The method mean bias is slightly shifted towards negative values, but the method prediction intervals are close to that resulting from the decisional tree.

  7. Validated HPLC-Diode Array Detector Method for Simultaneous Evaluation of Six Quality Markers in Coffee.

    PubMed

    Gant, Anastasia; Leyva, Vanessa E; Gonzalez, Ana E; Maruenda, Helena

    2015-01-01

    Nicotinic acid, N-methylpyridinium ion, and trigonelline are well studied nutritional biomarkers present in coffee, and they are indicators of thermal decomposition during roasting. However, no method is yet available for their simultaneous determination. This paper describes a rapid and validated HPLC-diode array detector method for the simultaneous quantitation of caffeine, trigonelline, nicotinic acid, N-methylpyridinium ion, 5-caffeoylquinic acid, and 5-hydroxymethyl furfural that is applicable to three coffee matrixes: green, roasted, and instant. Baseline separation among all compounds was achieved in 30 min using a phenyl-hexyl RP column (250×4.6 mm, 5 μm particle size), a 0.3% aqueous formic acid buffer (pH 2.4)-methanol mobile phase at a flow rate of 1 mL/min, and a column temperature of 30°C. The method showed good linear correlation (r2>0.9985), precision (less than 3.9%), sensitivity (LOD=0.023-0.237 μg/mL; LOQ=0.069-0.711 μg/mL), and recovery (84-102%) for all compounds. This simplified method is amenable to a more complete routine evaluation of coffee in industry. PMID:25857885
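
    The abstract reports LOD and LOQ ranges without stating how they were derived; a common convention (an assumption here, not confirmed by the paper) is LOD = 3.3·σ/S and LOQ = 10·σ/S, where σ is the residual standard deviation of the calibration line and S its slope:

```python
import numpy as np

def lod_loq(conc, signal):
    """LOD/LOQ from a linear calibration, using the common 3.3*sigma/S
    and 10*sigma/S convention (assumed here; the paper does not state
    its exact formula)."""
    slope, intercept = np.polyfit(conc, signal, 1)
    residuals = signal - (slope * conc + intercept)
    sigma = residuals.std(ddof=2)  # ddof=2: two fitted parameters
    return 3.3 * sigma / slope, 10 * sigma / slope

# Hypothetical calibration data (ug/mL vs peak area), not the paper's
conc = np.array([0.05, 0.1, 0.5, 1.0, 5.0, 10.0])
signal = np.array([1.9, 4.1, 20.3, 40.8, 202.5, 405.1])
lod, loq = lod_loq(conc, signal)
print(f"LOD = {lod:.3f} ug/mL, LOQ = {loq:.3f} ug/mL")
```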

  8. Validation of thermodesorption method for analysis of semi-volatile organic compounds adsorbed on wafer surface.

    PubMed

    Hayeck, Nathalie; Gligorovski, Sasho; Poulet, Irène; Wortham, Henri

    2014-05-01

    To prevent degradation of device characteristics, it is important to detect the organic contaminants adsorbed on wafers. In this respect, a reliable qualitative and quantitative analytical method for the analysis of semi-volatile organic compounds that can adsorb on wafer surfaces is of paramount importance. Here, we present a new analytical method based on a Wafer Outgassing System (WOS) coupled to an Automated Thermal Desorber-Gas Chromatography-Mass Spectrometry (ATD-GC-MS) setup to identify and quantify volatile and semi-volatile organic compounds from 6", 8" and 12" wafers. The WOS technique allows the desorption of organic compounds from one side of the wafer. The method was tested on three important airborne contaminants in cleanrooms, i.e., tris-(2-chloroethyl) phosphate (TCEP), tris-(2-chloroisopropyl) phosphate (TCPP) and diethyl phthalate (DEP). In addition, we validated the method for the analysis and quantification of DEP, TCEP and TCPP, and we estimated the backside organic contamination that may contribute to contamination of the front side of the wafers. We demonstrate that WOS/ATD-GC-MS is a suitable and highly efficient technique for the desorption and quantitative analysis of organophosphorus compounds and phthalate esters that may be found on wafer surfaces.

  9. Validation and application of an improved method for the rapid determination of proline in grape berries.

    PubMed

    Rienth, Markus; Romieu, Charles; Gregan, Rebecca; Walsh, Caroline; Torregrosa, Laurent; Kelly, Mary T

    2014-04-16

    A rapid and sensitive method is presented for the determination of proline in grape berries. Following acidification with formic acid, proline is derivatized by heating at 100 °C for 15 min with 3% ninhydrin in dimethyl sulfoxide, and the absorbance, which is stable for at least 60 min, is read at 520 nm. The method was statistically validated in the concentration range from 2.5 to 15 mg/L, giving a repeatability and intermediate precision of generally <3%; linearity was determined using the lack of fit test. Results obtained with this method concurred (r = 0.99) with those obtained for the same samples on an amino acid analyzer. In terms of sample preparation, a simple dilution (5-20-fold) is required, and sugars, primary amino acids, and anthocyanins were demonstrated not to interfere, as the latter are bleached by ninhydrin under the experimental conditions. The method was applied to the study of proline accumulation in the fruits of microvines grown in phytotrons, and it was established that proline accumulation and concentrations closely resemble those of field-grown macrovines. PMID:24617570
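
    The linearity assessment cited above uses the lack-of-fit test, which splits the calibration residuals into a pure-error component (from replicates) and a lack-of-fit component; the sketch below shows that test on assumed triplicate calibration data over the validated range, not the paper's own values.

```python
import numpy as np
from scipy import stats

def lack_of_fit_test(conc, signal):
    """Lack-of-fit F-test for a straight-line calibration; requires
    replicate measurements at each concentration level. A sketch of the
    test the authors cite, not their exact implementation."""
    conc, signal = np.asarray(conc, float), np.asarray(signal, float)
    slope, intercept = np.polyfit(conc, signal, 1)
    levels = np.unique(conc)
    n, m = len(conc), len(levels)
    ss_pe = sum(((signal[conc == c] - signal[conc == c].mean()) ** 2).sum()
                for c in levels)                       # pure error
    ss_res = ((signal - (slope * conc + intercept)) ** 2).sum()
    ss_lof = ss_res - ss_pe                            # lack of fit
    F = (ss_lof / (m - 2)) / (ss_pe / (n - m))
    p = stats.f.sf(F, m - 2, n - m)
    return F, p

# Hypothetical triplicate calibration over the validated 2.5-15 mg/L range
conc = [2.5] * 3 + [5.0] * 3 + [10.0] * 3 + [15.0] * 3
signal = [0.101, 0.104, 0.099, 0.205, 0.201, 0.207,
          0.410, 0.404, 0.408, 0.612, 0.605, 0.610]
F, p = lack_of_fit_test(conc, signal)
print(f"F = {F:.2f}, p = {p:.3f}  (p > 0.05 supports linearity)")
```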

  10. Quantitative analysis of eugenol in clove extract by a validated HPLC method.

    PubMed

    Yun, So-Mi; Lee, Myoung-Heon; Lee, Kwang-Jick; Ku, Hyun-Ok; Son, Seong-Wan; Joo, Yi-Seok

    2010-01-01

    Clove (Eugenia caryophyllata) is a well-known medicinal plant used in Korea for diarrhea and digestive disorders, and as an antiseptic. Eugenol is the main active ingredient of clove and has been chosen as a marker compound for the chemical evaluation or quality control of clove. This paper reports the development and validation of an HPLC-diode array detection (DAD) method for the determination of eugenol in clove. HPLC separation was accomplished on an XTerra RP18 column (250 x 4.6 mm id, 5 microm) with an isocratic mobile phase of 60% methanol and DAD at 280 nm. Calibration graphs were linear with very good correlation coefficients (r2 > 0.9999) from 12.5 to 1000 ng/mL. The LOD was 0.81 ng/mL and the LOQ was 2.47 ng/mL. The method showed good intraday precision (%RSD 0.08-0.27%) and interday precision (%RSD 0.32-1.19%). The method was applied to the analysis of eugenol in clove cultivated in various countries (Indonesia, Singapore, and China). Quantitative analysis of the 15 clove samples showed that the content of eugenol varied significantly, ranging from 163 to 1049 ppb. Based on the results of this study, the HPLC method is sufficiently accurate to evaluate the quality and safety of clove.

  11. Development and validation of a new method for measuring friction between skin and nonwoven materials.

    PubMed

    Cottenden, A M; Wong, W K; Cottenden, D J; Farbrot, A

    2008-07-01

    A new method for measuring the coefficient of friction between nonwoven materials and the curved surface of the volar forearm has been developed and validated. The method was used to measure the coefficient of static friction for three different nonwoven materials on the normal (dry) and over-hydrated volar forearms of five female volunteers (ages 18-44). The method proved simple to run and had good repeatability: the coefficient of variation (standard deviation expressed as a percentage of the mean) for triplets of repeat measurements was usually (80 per cent of the time) less than 10 per cent. Measurements involving the geometrically simpler configuration of pulling a weighted fabric sample horizontally across a quasi-planar area of volar forearm skin proved experimentally more difficult and had poorer repeatability. However, correlations between values of coefficient of static friction derived using the two methods were good (R = 0.81 for normal (dry) skin, and 0.91 for over-hydrated skin). Measurements of the coefficient of static friction for the three nonwovens for normal (dry) and for over-hydrated skin varied in the ranges of about 0.3-0.5 and 0.9-1.3, respectively. In agreement with Amontons' law, coefficients of friction were invariant with normal pressure over the entire experimental range (0.1-8.2 kPa).
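
    The repeatability criterion quoted above is the coefficient of variation of repeat measurements, as the abstract itself defines it; a minimal sketch of that calculation with an invented triplet of friction readings:

```python
import statistics

def coefficient_of_variation(measurements):
    """CV as defined in the abstract: standard deviation expressed as a
    percentage of the mean."""
    return statistics.stdev(measurements) / statistics.mean(measurements) * 100

# Hypothetical triplet of static-friction readings for one nonwoven on dry skin
triplet = [0.41, 0.38, 0.40]
print(f"CV = {coefficient_of_variation(triplet):.1f}%")  # repeatability check
```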

  12. Validation of MIMGO: a method to identify differentially expressed GO terms in a microarray dataset

    PubMed Central

    2012-01-01

    Background We previously proposed an algorithm for the identification of GO terms that commonly annotate genes whose expression is upregulated or downregulated in some microarray data compared with other microarray data. We call these “differentially expressed GO terms” and have named the algorithm “matrix-assisted identification method of differentially expressed GO terms” (MIMGO). MIMGO can also identify microarray data in which genes annotated with a differentially expressed GO term are upregulated or downregulated. However, MIMGO has not yet been validated on a real microarray dataset using all available GO terms. Findings We combined Gene Set Enrichment Analysis (GSEA) with MIMGO to identify differentially expressed GO terms in a yeast cell cycle microarray dataset. GSEA followed by MIMGO (GSEA + MIMGO) correctly identified (p < 0.05) microarray data in which genes annotated to differentially expressed GO terms are upregulated. We found that GSEA + MIMGO was slightly less effective than, or comparable to, GSEA (Pearson), a method that uses Pearson’s correlation as a metric, at detecting true differentially expressed GO terms. However, unlike other methods including GSEA (Pearson), GSEA + MIMGO can comprehensively identify the microarray data in which genes annotated with a differentially expressed GO term are upregulated or downregulated. Conclusions MIMGO is a reliable method to identify differentially expressed GO terms comprehensively. PMID:23232071

  13. Method validation of a survey of thevetia cardiac glycosides in serum samples.

    PubMed

    Kohls, Sarah; Scholz-Böttcher, Barbara; Rullkötter, Jürgen; Teske, Jörg

    2012-02-10

    A sensitive and specific liquid chromatography-tandem mass spectrometry (HPLC-ESI(+)-MS/MS) procedure was developed and validated for the identification and quantification of thevetin B and other cardiac glycosides in human serum. The seeds of yellow oleander (Thevetia peruviana) contain cardiac glycosides that can cause serious intoxication. A mixture of six thevetia glycosides was extracted from these seeds and characterized. Thevetin B, isolated and efficiently purified from that mixture, is the main component and can serve as evidence of ingestion. Solid phase extraction (SPE) proved to be an effective sample preparation method. Digoxin-d3 was used as the internal standard. Although ion suppression occurs, the limit of detection (LOD) is 0.27 ng/ml serum for thevetin B. Recovery is higher than 94%, and accuracy and precision were satisfactory. The method was refined with a view to developing a general screening method for cardiac glycosides. The assay is linear over the range of 0.5-8 ng/ml serum. Finally, the method was applied to a case of thevetia seed ingestion. PMID:21376490

  14. Development and validation of personal monitoring methods for low levels of acrylonitrile in workplace atmosphere. I. Test atmosphere generation and solvent desorption methods

    SciTech Connect

    Melcher, R.G.; Borders, R.A.; Coyne, L.B.

    1986-03-01

    The purpose of this study was to optimize monitoring methods and to investigate new technology for the determination of low levels of acrylonitrile (0.05 to 5 ppm) in workplace atmospheres. In the first phase of the study, a dynamic atmosphere generation system was developed to produce low levels of acrylonitrile in simulated workplace atmospheres. Various potential sorbents were investigated in the second phase, and the candidate methods were compared in a laboratory validation study over a concentration range from 0.05 to 5 ppm acrylonitrile in the presence of potential interferences and under relative humidity conditions from 30% to 95% RH. A collection tube containing 600 mg Pittsburgh coconut base charcoal was found to be the optimum tube for sampling over a full 8-hr shift. No breakthrough was observed over the concentrations and humidities tested. The recovery was 91.3%, with a total relative precision of +/-21% over the test range, and the recovery was not affected by storage for up to five weeks.

  15. Development and validation of spectroscopic methods for the determination of lomefloxacin in bulk and pharmaceutical formulations

    NASA Astrophysics Data System (ADS)

    El-Didamony, A. M.; Hafeez, S. M.

    2016-01-01

    Four simple, sensitive spectrophotometric and spectrofluorimetric methods (A-D) for the determination of the antibacterial drug lomefloxacin (LMFX) in pharmaceutical formulations have been developed. Method A is based on the formation of a ternary complex between Pd(II), eosin and LMFX in the presence of methyl cellulose as surfactant and an acetate-HCl buffer of pH 4.0. Under the optimum conditions, the ternary complex showed an absorption maximum at 530 nm. Methods B and C are based on a redox reaction between LMFX and KMnO4 in acid and alkaline media, respectively. In the indirect spectrophotometric method B, the drug solution is treated with a known excess of KMnO4 in H2SO4 medium, and the unreacted oxidant is subsequently determined by reacting it with safranine O in the same medium at λmax = 520 nm. The direct spectrophotometric method C involves treating an alkaline solution of LMFX with KMnO4 and measuring the bluish-green product at 604 nm. Method D is based on the chelation of LMFX with Zr(IV) to produce a fluorescent chelate. At the optimum reaction conditions, the drug-metal chelate showed an excitation maximum at 280 nm and an emission maximum at 443 nm. The optimum experimental parameters for the reactions have been studied and the validity of the described procedures assessed. Statistical analysis of the results revealed high accuracy and good precision. The proposed methods were successfully applied to the determination of the selected drug in pharmaceutical preparations, with good recoveries.

  16. Validation of PCR methods for quantitation of genetically modified plants in food.

    PubMed

    Hübner, P; Waiblinger, H U; Pietsch, K; Brodmann, P

    2001-01-01

    For enforcement of the recently introduced labeling threshold for genetically modified organisms (GMOs) in food ingredients, quantitative detection methods such as quantitative competitive PCR (QC-PCR) and real-time PCR are applied by official food control laboratories. The experiences of 3 European food control laboratories in validating such methods were compared to describe realistic performance characteristics of quantitative PCR detection methods. The limit of quantitation (LOQ) of GMO-specific, real-time PCR was experimentally determined to reach 30-50 target molecules, which is close to theoretical prediction. Starting PCR with 200 ng genomic plant DNA, the LOQ depends primarily on the genome size of the target plant and ranges from 0.02% for rice to 0.7% for wheat. The precision of quantitative PCR detection methods, expressed as relative standard deviation (RSD), varied from 10 to 30%. Using test samples containing Bt176 corn and applying Bt176-specific QC-PCR, mean values deviated from true values by -7 to 18%, with an average of 2 +/- 10%. Ruggedness of real-time PCR detection methods was assessed in an interlaboratory study analyzing commercial, homogeneous food samples. Roundup Ready soybean DNA contents were determined in the range of 0.3 to 36%, relative to soybean DNA, with RSDs of about 25%. Taking the precision of quantitative PCR detection methods into account, suitable sample plans and sample sizes for GMO analysis are suggested. Because quantitative GMO detection methods measure GMO contents of samples in relation to reference material (calibrants), high priority must be given to international agreements and standardization on certified reference materials.
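
    The genome-size dependence of the relative LOQ follows from simple copy-number arithmetic; the sketch below uses assumed approximate 1C genome masses (literature ballpark values, not figures from the paper) and reproduces the reported order of magnitude.

```python
# Relative LOQ from copy-number arithmetic: with 200 ng of plant DNA, the
# number of haploid genome copies is mass / (1C genome mass), and the
# relative LOQ is the 30-50 detectable target copies divided by that total.
# The 1C genome masses below are approximate literature values (assumptions).
genome_mass_pg = {"rice": 0.5, "wheat": 17.0}

dna_ng = 200.0
loq_copies = 40  # mid-point of the 30-50 copies reported as the LOQ

for plant, pg in genome_mass_pg.items():
    total_copies = dna_ng * 1000 / pg  # 1 ng = 1000 pg
    rel_loq = loq_copies / total_copies * 100
    print(f"{plant}: ~{total_copies:.0f} genome copies -> LOQ ~{rel_loq:.2f}%")
```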

  17. Development and validation of LC methods for the separation of misoprostol related substances and diastereoisomers.

    PubMed

    Kahsay, Getu; Song, Huiying; Eerdekens, Fran; Tie, Yaxin; Hendriks, Danny; Van Schepdael, Ann; Cabooter, Deirdre; Adams, Erwin

    2015-01-01

    Misoprostol is a synthetic prostaglandin E1 analogue which is mainly used for prevention and treatment of gastric ulcers, but also for abortion due to its labour-inducing effect. Misoprostol exists as a mixture of diastereoisomers (1:1) and has several related impurities owing to its instability at higher temperatures and in the presence of moisture. A simple and robust reversed-phase liquid chromatographic (RPLC) method is described for the separation of the related substances, and a normal-phase (NP) LC method for the separation of the misoprostol diastereoisomers. The RPLC method used an Ascentis Express C18 (150 mm × 4.6 mm, 5 μm) column kept at 35 °C. The mobile phase was a gradient mixture of mobile phase A (ACN-H2O-MeOH, 28:69:3 v/v/v) and mobile phase B (ACN-H2O-MeOH, 47:50:3 v/v/v) eluted at a flow rate of 1.5 mL/min, with UV detection at 200 nm. The NPLC method used an XBridge bare silica (150 mm × 2.1 mm, 3.5 μm) column at 35 °C. The mobile phase contained 1-propanol-heptane-TFA (4:96:0.1%, v/v/v), pumped at a flow rate of 0.5 mL/min, with UV detection at 205 nm. The NPLC method separates the two diastereoisomers well (Rs > 2) within an analysis time of less than 20 min. Both methods were validated according to the ICH guidelines. Furthermore, these new LC methods have been successfully applied for purity control and determination of the diastereoisomer ratio of misoprostol bulk drug, tablets and dispersions.

  18. Fit-for-purpose chromatographic method for the determination of amikacin in human plasma for the dosage control of patients.

    PubMed

    Ezquer-Garin, C; Escuder-Gilabert, L; Martín-Biosca, Y; Lisart, R Ferriols; Sagrado, S; Medina-Hernández, M J

    2016-04-01

    In this paper, a simple, rapid and sensitive method based on liquid chromatography with fluorimetric detection (HPLC-FLD) for the determination of amikacin (AMK) in human plasma is developed. Determination is performed by pre-column derivatization of AMK with ortho-phthalaldehyde (OPA) in the presence of N-acetyl-L-cysteine (NAC) at pH 9.5 for 5 min at 80 °C. To our knowledge, this is the first time that NAC has been used in AMK derivatization. Derivatization conditions (pH, AMK/OPA/NAC molar ratios, temperature and reaction time) are optimized to obtain a single derivative that is stable at room temperature. Separation of the derivative is achieved on a reversed-phase LC column (Kromasil C18, 5 μm, 150 × 4.6 i.d. mm) with a mobile phase of 0.05 M phosphate buffer:acetonitrile (80:20, v/v) pumped at a flow rate of 1.0 mL/min. Detection is performed using 337 and 439 nm as excitation and emission wavelengths, respectively. The method is fit for purpose as a competitive alternative to the method currently used in many hospitals for AMK dosage control: fluorescence polarization immunoassay (FPIA). The method exhibits linearity in the 0.17-10 µg mL(-1) concentration range with a squared correlation coefficient higher than 0.995. Trueness and intermediate precision are estimated using spiked drug-free plasma samples and fulfill current UNE-EN ISO 15189:2007 accreditation schemes. Finally, for the first time, statistical comparison against the FPIA method is demonstrated using plasma samples from 31 patients under treatment with AMK.

  19. AOAC Official MethodSM Matrix Extension Validation Study of Assurance GDSTM for the Detection of Salmonella in Selected Spices.

    PubMed

    Feldsine, Philip; Kaur, Mandeep; Shah, Khyati; Immerman, Amy; Jucker, Markus; Lienau, Andrew

    2015-01-01

    Assurance GDSTM for Salmonella Tq has been validated according to the AOAC INTERNATIONAL Methods Committee Guidelines for Validation of Microbiological Methods for Food and Environmental Surfaces for the detection of selected foods and environmental surfaces (Official Method of AnalysisSM 2009.03, Performance Tested MethodSM No. 050602). The method also completed AFNOR validation (following the ISO 16140 standard) compared to the reference method EN ISO 6579. For AFNOR, GDS was given a scope covering all human food, animal feed stuff, and environmental surfaces (Certificate No. TRA02/12-01/09). Results showed that Assurance GDS for Salmonella (GDS) has high sensitivity and is equivalent to the reference culture methods for the detection of motile and non-motile Salmonella. As part of the aforementioned validations, inclusivity and exclusivity studies, stability, and ruggedness studies were also conducted. Assurance GDS has 100% inclusivity and exclusivity among the 100 Salmonella serovars and 35 non-Salmonella organisms analyzed. To add to the scope of the Assurance GDS for Salmonella method, a matrix extension study was conducted, following the AOAC guidelines, to validate the application of the method for selected spices, specifically curry powder, cumin powder, and chili powder, for the detection of Salmonella. PMID:26268975

  1. Validating Test Score Meaning and Defending Test Score Use: Different Aims, Different Methods

    ERIC Educational Resources Information Center

    Cizek, Gregory J.

    2016-01-01

    Advances in validity theory and alacrity in validation practice have suffered because the term "validity" has been used to refer to two incompatible concerns: (1) the degree of support for specified interpretations of test scores (i.e. intended score meaning) and (2) the degree of support for specified applications (i.e. intended test…

  2. Validation of qPCR Methods for the Detection of Mycobacterium in New World Animal Reservoirs

    PubMed Central

    Housman, Genevieve; Malukiewicz, Joanna; Boere, Vanner; Grativol, Adriana D.; Pereira, Luiz Cezar M.; Silva, Ita de Oliveira e; Ruiz-Miranda, Carlos R.; Truman, Richard; Stone, Anne C.

    2015-01-01

    Zoonotic pathogens that cause leprosy (Mycobacterium leprae) and tuberculosis (Mycobacterium tuberculosis complex, MTBC) continue to impact modern human populations. Therefore, methods able to survey mycobacterial infection in potential animal hosts are necessary for proper evaluation of human exposure threats. Here we tested for mycobacterial-specific single- and multi-copy loci using qPCR. In a trial study in which armadillos were artificially infected with M. leprae, these techniques were specific and sensitive to pathogen detection, while more traditional ELISAs were only specific. These assays were then employed in a case study to detect M. leprae as well as MTBC in wild marmosets. All marmosets were negative for M. leprae DNA, but 14 were positive for the mycobacterial rpoB gene assay. Targeted capture and sequencing of rpoB and other MTBC genes validated the presence of mycobacterial DNA in these samples and revealed that qPCR is useful for identifying mycobacterial-infected animal hosts. PMID:26571269

  3. Flight test validation of a frequency-based system identification method on an F-15 aircraft

    NASA Technical Reports Server (NTRS)

    Schkolnik, Gerard S.; Orme, John S.; Hreha, Mark A.

    1995-01-01

    A frequency-based performance identification approach was evaluated using flight data from the NASA F-15 Highly Integrated Digital Electronic Control aircraft. The approach used frequency separation to identify the effectiveness of multiple controls simultaneously as an alternative to independent control identification methods. Fourier transformations converted measured control and response data into frequency domain representations. Performance gradients were formed using multiterm frequency matching of control and response frequency domain models. An objective function was generated using these performance gradients. This function was formally optimized to produce a coordinated control trim set. This algorithm was applied to longitudinal acceleration and evaluated using two control effectors: nozzle throat area and inlet first ramp. Three criteria were investigated to validate the approach: simultaneous gradient identification, gradient frequency dependency, and repeatability. This report describes the flight test results. These data demonstrate that the approach can accurately identify performance gradients during simultaneous control excitation independent of excitation frequency.
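
    A toy sketch of the frequency-separation idea behind this approach: two effectors are excited simultaneously at distinct frequencies, and each gradient is read off as the ratio of the response spectrum to that control's spectrum at its own excitation frequency. Signals, frequencies and gradient values are invented for illustration; this is not NASA's implementation.

```python
import numpy as np

# Frequency-separation identification sketch: two effectors excited at
# distinct frequencies; each gradient is recovered from the ratio of the
# response spectrum to that control's spectrum at its excitation frequency.
fs = 50.0                      # sampling rate, Hz (illustrative)
t = np.arange(0, 40, 1 / fs)
f1, f2 = 0.25, 0.40            # distinct excitation frequencies, Hz

u1 = 0.02 * np.sin(2 * np.pi * f1 * t)   # e.g. nozzle throat area excitation
u2 = 0.02 * np.sin(2 * np.pi * f2 * t)   # e.g. inlet first-ramp excitation
g1_true, g2_true = 0.8, -0.3             # "true" gradients (made up)
response = g1_true * u1 + g2_true * u2 + 0.001 * np.random.randn(t.size)

U1, U2, Y = np.fft.rfft(u1), np.fft.rfft(u2), np.fft.rfft(response)
freqs = np.fft.rfftfreq(t.size, 1 / fs)
i1, i2 = np.argmin(abs(freqs - f1)), np.argmin(abs(freqs - f2))

print("gradient 1:", (Y[i1] / U1[i1]).real)  # recovers ~0.8
print("gradient 2:", (Y[i2] / U2[i2]).real)  # recovers ~-0.3
```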

  5. Advances in the Development and Validation of Test Methods in the United States

    PubMed Central

    Casey, Warren M.

    2016-01-01

    The National Toxicology Program Interagency Center for the Evaluation of Alternative Toxicological Methods (NICEATM) provides validation support for US Federal agencies and the US Tox21 interagency consortium, an interagency collaboration that is using high-throughput screening (HTS) and other advanced approaches to better understand and predict chemical hazards to humans and the environment. HTS data from assays relevant to the estrogen receptor signaling pathway are used as an example of how HTS data can be combined with computational modeling to meet the needs of US agencies. A brief summary of US efforts in the areas of biologics testing, acute toxicity, and skin sensitization is also provided. PMID:26977254

  7. Validation of plasma shape reconstruction by Cauchy condition surface method in KSTAR

    SciTech Connect

    Miyata, Y.; Suzuki, T.; Ide, S.; Hahn, S. H.; Chung, J.; Bak, J. G.; Ko, W. H.

    2014-03-15

    The Cauchy Condition Surface (CCS) method is a numerical approach to reconstructing the plasma boundary and calculating quantities related to the plasma shape from the magnetic diagnostics in real time. It has been applied for the first time to the KSTAR plasma, which combines a highly elongated plasma shape with large eddy currents flowing in the tokamak structures, in order to establish the plasma shape reconstruction there. For applying the CCS calculation to the KSTAR plasma, the effects of the eddy currents and of ferromagnetic materials on the plasma shape reconstruction were studied. The CCS calculation includes the effect of eddy currents and excludes the magnetic diagnostics that are expected to be strongly influenced by ferromagnetic materials. Calculations were performed to validate the plasma shape reconstruction in the 2012 KSTAR experimental campaign. Comparison between the CCS calculation and non-magnetic measurements revealed that the CCS calculation can accurately reconstruct the plasma shape even at small plasma current I_P.

  8. A Method of Retrospective Computerized System Validation for Drug Manufacturing Software Considering Modifications

    NASA Astrophysics Data System (ADS)

    Takahashi, Masakazu; Fukue, Yoshinori

    This paper proposes a Retrospective Computerized System Validation (RCSV) method for Drug Manufacturing Software (DMSW) that takes software modification into account. Because DMSW used for quality management and facility control has a major impact on drug quality, regulatory agencies require proof of the adequacy of DMSW functions and performance, based on development documents and test results. The work of demonstrating the adequacy of previously developed DMSW on the basis of existing documents and operational records is called RCSV. When DMSW that has already undergone RCSV is modified, it is difficult to secure consistency between the development documents and test results for the modified parts and the existing documents and operational records for the unmodified parts; this makes conducting RCSV difficult. In this paper, we propose (a) a definition of the document architecture, (b) a definition of the descriptive items and levels in the documents, (c) management of design information using a database, (d) exhaustive testing, and (e) an integrated RCSV procedure. As a result, we could conduct adequate RCSV while securing consistency.

  9. A validated HPTLC method for determination of terbutaline sulfate in biological samples: Application to pharmacokinetic study

    PubMed Central

    Faiyazuddin, Md.; Rauf, Abdul; Ahmad, Niyaz; Ahmad, Sayeed; Iqbal, Zeenat; Talegaonkar, Sushma; Bhatnagar, Aseem; Khar, Roop K.; Ahmad, Farhan J.

    2011-01-01

    Terbutaline sulfate (TBS) was assayed in biological samples by a validated HPTLC method. Densitometric analysis of TBS was carried out at 366 nm on precoated TLC aluminum plates with silica gel 60F254 as the stationary phase and chloroform–methanol (9.0:1.0, v/v) as the mobile phase. TBS was well resolved at RF 0.34 ± 0.02. In all matrices, the calibration curve was linear (r2 ⩾ 0.9943) in the tested range of 100–1000 ng spot−1, with a limit of quantification of 18.35 ng spot−1. Drug recovery from biological fluids averaged ⩾95.92%. In both matrices the drug degraded rapidly, with a half-life (T0.5) ranging from 9.92 to 12.41 h at 4 °C and from 6.31 to 9.13 h at 20 °C. Frozen at −20 °C, the drug was stable for at least 2 months (without losses >10%). The maximum plasma concentration (Cpmax) was found to be 5875.03 ± 114 ng mL−1, significantly higher than the maximum saliva concentration (Csmax, 1501.69 ± 96 ng mL−1). The validated method could therefore be used to carry out pharmacokinetic studies of TBS from novel drug delivery systems. PMID:23960758

  10. Development, Quantification, Method Validation, and Stability Study of a Novel Fucoxanthin-Fortified Milk.

    PubMed

    Mok, Il-Kyoon; Yoon, Jung-Ro; Pan, Cheol-Ho; Kim, Sang Min

    2016-08-10

    To extend the scope of application of fucoxanthin, a marine carotenoid, whole milk (WM) and skimmed milk (SM) were fortified with fucoxanthin isolated from the microalga Phaeodactylum tricornutum to a final concentration of 8 μg/mL milk solution. Using these liquid systems, a fucoxanthin analysis method comprising extraction and HPLC-DAD was developed and validated by accuracy, precision, system suitability, and robustness tests. The method demonstrated good linearity over the range of 0.125-100 μg/mL fucoxanthin with R(2) = 1.0000, and all validation data supported its adequacy for fucoxanthin analysis in milk solution. To investigate fucoxanthin stability during milk production and distribution, fucoxanthin content was examined during storage, pasteurization, and drying processes under various conditions. Fucoxanthin in milk solutions remained comparatively stable over the 1-month storage period. The degradation rate constants (k) over this storage period suggested that fucoxanthin stability is related to storage temperature and to the content of milk proteins such as casein and whey protein in the milk matrix. In a comparison between SM and WM, fucoxanthin in SM always showed better stability than that in WM during storage and three kinds of drying processes; this effect was also deduced to relate to protein content. In the pasteurization step, >91% of fucoxanthin was retained after three pasteurization processes, even though the above trend was not found there. This study demonstrated for the first time that milk products can be used as a basic food matrix for fucoxanthin application and that the protein content of milk is an important factor for fucoxanthin stability. PMID:27455130
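
    As an illustration of how a degradation rate constant of the kind discussed above can be extracted, the sketch below fits a first-order model to retained-fucoxanthin data; both the kinetic model and the numbers are assumptions for illustration, not the paper's data or its stated model.

```python
import numpy as np

# First-order degradation sketch: k from a log-linear fit of retained
# fucoxanthin versus storage time. Values are hypothetical.
days = np.array([0, 7, 14, 21, 28])
conc = np.array([8.0, 7.4, 6.9, 6.4, 6.0])  # ug/mL remaining in milk

k = -np.polyfit(days, np.log(conc / conc[0]), 1)[0]
print(f"k = {k:.4f} per day; half-life = {np.log(2) / k:.0f} days")
```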

  11. Investigating the impact of scaling methods in soil moisture validation on error metrics

    NASA Astrophysics Data System (ADS)

    Xaver, Angelika; Steiner, Caroline; Paulik, Christoph; Wagner, Wolfgang

    2014-05-01

    Soil moisture is an important parameter for the climate system and environment. It is currently measured via remote sensing techniques as well as through in situ observations. The validation of remote sensing products plays a key role in quality control, improvement of retrieval algorithms and in setting benchmarks for future means of observation or new satellite missions. Retrieval of absolute soil moisture values is difficult and depends on ancillary information from soil porosity maps. Because of this, remotely sensed products often have a systematic bias relative to in situ soil moisture observations. To alleviate this issue, most published validation studies use one or several scaling techniques to remove these systematic biases. The aim of this study is to determine the impact of different scaling methods on often used error metrics like Pearson's (R) and Spearman's (rho) correlation coefficients, root mean square difference (RMSD) and bias. For this study, remotely sensed soil moisture from the ASCAT sensor onboard Metop-A and -B will be compared to in situ data from the International Soil Moisture Network (ISMN; http://ismn.geo.tuwien.ac.at/). The four most popular scaling methods found in the literature will be used to remove systematic biases. These are: minimum-maximum scaling, mean-standard deviation scaling, linear regression scaling and CDF matching. Preliminary results, calculated with two years of data, show that CDF matching, being a non-linear transformation, can have a significant impact on R while leaving rho unchanged. The mean R of the 248 examined time series changed by 0.096. The maximum difference occurred when CDF matching changed R from -0.12 to 0.79, resulting in a difference of 0.91. This indicates that it can be problematic to use Pearson's R in combination with non-linear scaling.
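
    A minimal sketch of the four scaling methods and their effect on the error metrics, with synthetic data standing in for the ASCAT and ISMN series (operational implementations, e.g. of CDF matching, differ in detail): the three linear rescalings leave both R and rho unchanged, while the monotone but non-linear CDF matching changes R and not rho.

```python
import numpy as np
from scipy import stats

def min_max(sat, insitu):
    return ((sat - sat.min()) / (sat.max() - sat.min())
            * (insitu.max() - insitu.min()) + insitu.min())

def mean_std(sat, insitu):
    return (sat - sat.mean()) / sat.std() * insitu.std() + insitu.mean()

def lin_reg(sat, insitu):
    slope, intercept, *_ = stats.linregress(sat, insitu)
    return slope * sat + intercept

def cdf_match(sat, insitu):
    # piecewise-linear CDF matching: map each satellite value onto the
    # in situ value of equal rank (a simple variant for illustration)
    ranks = stats.rankdata(sat) / (len(sat) + 1)
    return np.interp(ranks, np.linspace(0, 1, len(insitu)), np.sort(insitu))

# Hypothetical paired time series (ASCAT-like vs in situ)
rng = np.random.default_rng(0)
insitu = rng.uniform(0.1, 0.4, 200)
sat = 1.5 * insitu**2 + rng.normal(0, 0.01, 200)  # nonlinear bias + noise

for name, f in [("min-max", min_max), ("mean-std", mean_std),
                ("regression", lin_reg), ("CDF matching", cdf_match)]:
    scaled = f(sat, insitu)
    r = stats.pearsonr(scaled, insitu)[0]
    rho = stats.spearmanr(scaled, insitu)[0]
    print(f"{name:12s}  R = {r:.3f}  rho = {rho:.3f}")
```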

  12. A simple method for validation and verification of pipettes mounted on automated liquid handlers.

    PubMed

    Stangegaard, Michael; Hansen, Anders J; Frøslev, Tobias G; Morling, Niels

    2011-10-01

    We have implemented a simple, inexpensive, and fast procedure for validation and verification of the performance of pipettes mounted on automated liquid handlers (ALHs), as necessary for laboratories accredited under ISO 17025. A six- or seven-step serial dilution of OrangeG was prepared in quadruplicate in a flat-bottom 96-well microtiter plate, manually using calibrated pipettes. Each pipette of the liquid handler (1-8) dispensed a selected volume (1-200 μL) of OrangeG eight times into the wells of the microtiter plate. All wells contained a total of 200 μL liquid. The absorbance was read, and the dispensed volume of each pipette was calculated based on a plot of volume and absorbance of a known set of OrangeG dilutions. Finally, the percent inaccuracy (%d) and the imprecision (%CV) of each pipette were calculated. Using predefined acceptance criteria, each pipette was then either approved or failed. Failed pipettes were either repaired or the volume deviation was compensated for by applying a calibration curve in the liquid-handler software. We have implemented the procedure on a Sias Xantus, an MWGt TheONYX, four Tecan Freedom EVO, a Biomek NX Span-8, and four Biomek 3000 robots, and the methods are freely available. In conclusion, we have set up a simple, inexpensive, and fast solution for the continuous validation of ALHs used for accredited work according to the ISO 17025 standard. The method is easy to use for aqueous solutions but requires a spectrophotometer that can read microtiter plates. PMID:21906565
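
    A sketch of the described calculation from absorbance readings to %d and %CV; the calibration points, readings, and target volume below are hypothetical, and acceptance criteria are laboratory-specific.

```python
import numpy as np

def pipette_check(absorbances, calib_abs, calib_vol, target_ul):
    """Convert absorbance readings to dispensed volumes via the OrangeG
    calibration plot, then return percent inaccuracy (%d) and
    imprecision (%CV). A sketch of the described calculation."""
    slope, intercept = np.polyfit(calib_abs, calib_vol, 1)
    volumes = slope * np.asarray(absorbances) + intercept
    d = (volumes.mean() - target_ul) / target_ul * 100   # inaccuracy
    cv = volumes.std(ddof=1) / volumes.mean() * 100      # imprecision
    return d, cv

# Hypothetical calibration (absorbance vs uL OrangeG) and eight readings
calib_abs = [0.05, 0.11, 0.22, 0.44, 0.88]
calib_vol = [2.5, 5, 10, 20, 40]
readings = [0.217, 0.221, 0.219, 0.223, 0.218, 0.220, 0.222, 0.219]

d, cv = pipette_check(readings, calib_abs, calib_vol, target_ul=10.0)
print(f"%d = {d:+.2f}%, %CV = {cv:.2f}%")  # compare to acceptance criteria
```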

  13. Validation of selected analytical methods using accuracy profiles to assess the impact of a Tobacco Heating System on indoor air quality.

    PubMed

    Mottier, Nicolas; Tharin, Manuel; Cluse, Camille; Crudo, Jean-René; Lueso, María Gómez; Goujon-Ginglinger, Catherine G; Jaquier, Anne; Mitova, Maya I; Rouget, Emmanuel G R; Schaller, Mathieu; Solioz, Jennifer

    2016-09-01

    Studies in environmentally controlled rooms have been used over the years to assess the impact of environmental tobacco smoke on indoor air quality. As new tobacco products are developed, it is important to determine their impact on air quality when used indoors. Before such an assessment can take place it is essential that the analytical methods used to assess indoor air quality are validated and shown to be fit for their intended purpose. Consequently, for this assessment, an environmentally controlled room was built and seven analytical methods, representing eighteen analytes, were validated. The validations were carried out with smoking machines using a matrix-based approach applying the accuracy profile procedure. The performances of the methods were compared for all three matrices under investigation: background air samples, the environmental aerosol of Tobacco Heating System THS 2.2, a heat-not-burn tobacco product developed by Philip Morris International, and the environmental tobacco smoke of a cigarette. The environmental aerosol generated by the THS 2.2 device did not have any appreciable impact on the performances of the methods. The comparison between the background and THS 2.2 environmental aerosol samples generated by smoking machines showed that only five compounds were higher when THS 2.2 was used in the environmentally controlled room. Regarding environmental tobacco smoke from cigarettes, the yields of all analytes were clearly above those obtained with the other two air sample types. PMID:27343591

  15. A validated ultra high pressure liquid chromatographic method for qualification and quantification of folic acid in pharmaceutical preparations.

    PubMed

    Deconinck, E; Crevits, S; Baten, P; Courselle, P; De Beer, J

    2011-04-01

    A fully validated UHPLC method for the identification and quantification of folic acid in pharmaceutical preparations was developed. The starting conditions for the development were derived from the HPLC conditions of a validated method and were tested on four different UHPLC columns: Grace Vision HT™ C18-P, C18, C18-HL and C18-B (2 mm × 100 mm, 1.5 μm). After selection of the stationary phase, the method was further optimised by testing two aqueous and two organic phases and by adapting it to gradient elution. The resulting method was fully validated on the basis of its measurement uncertainty (accuracy profile) and robustness tests. A UHPLC method was thus obtained for the identification and quantification of folic acid in pharmaceutical preparations that cuts analysis time and solvent consumption.

  16. An assessment to evaluate the validity of different methods for the description of some corrosion inhibitors.

    PubMed

    Aboelnga, M M; Awad, M K; Gauld, J W; Mustafa, M R

    2014-09-01

    An assessment of the validity of different quantum chemical methods for describing the molecular and electronic structures of some corrosion inhibitors is presented. Calculations were performed on triazole derivatives and sulfur-containing compounds used as corrosion inhibitors with the standard, highly accurate CCSD method with the 6-311++G(d,p) basis set; ab initio calculations at the HF/6-31++G(d,p) level and at the MP2 level with the 6-311G(d,p), 6-31++G(d,p), and 6-311++G(2df,p) basis sets; and DFT with the B3LYP, BP86, B3LYP*, M06L, and M062x functionals at the 6-31++G(d,p) basis set level. Quantum chemical parameters were calculated, such as the energy of the highest occupied molecular orbital (E(HOMO)), the energy of the lowest unoccupied molecular orbital (E(LUMO)), the energy gap (ΔE), the dipole moment (μ), the sum of total negative charges (TNC), the chemical potential (Pi), electronegativity (χ), hardness (η), softness (σ), local softness (s), Fukui functions (f(+), f(-)), electrophilicity (ω), the total energy change (∆E(T)) and the solvation energy (S.E). Furthermore, the accuracy and applicability of these methods were estimated relative to the standard, highly accurate CCSD/6-311++G(d,p) reference. Good correlations between the quantum chemical parameters and the corresponding inhibition efficiencies (IE%) were found.
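
    Several of the listed descriptors follow from the frontier orbital energies via the standard conceptual-DFT working equations; the sketch below applies them to hypothetical orbital energies (the formulas are the usual Koopmans-type approximations, not equations or values quoted from the paper).

```python
# Global reactivity descriptors from frontier orbital energies, using the
# standard conceptual-DFT working equations (Koopmans-type approximations).
# Orbital energies below are hypothetical, not values from the paper.
E_HOMO, E_LUMO = -6.2, -1.1  # eV, e.g. from a B3LYP/6-31++G(d,p) run

gap = E_LUMO - E_HOMO                # energy gap (eV)
chi = -(E_HOMO + E_LUMO) / 2         # electronegativity
pi = -chi                            # chemical potential
eta = (E_LUMO - E_HOMO) / 2          # hardness
sigma = 1 / eta                      # softness
omega = chi**2 / (2 * eta)           # electrophilicity index

print(f"gap={gap:.2f} eV  chi={chi:.2f}  eta={eta:.2f}  "
      f"sigma={sigma:.2f}  omega={omega:.2f}")
```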

  17. Validation Methods Research for Fault-Tolerant Avionics and Control Systems: Working Group Meeting, 2

    NASA Technical Reports Server (NTRS)

    Gault, J. W. (Editor); Trivedi, K. S. (Editor); Clary, J. B. (Editor)

    1980-01-01

    The validation process comprises the activities required to ensure the agreement of system realization with system specification. A preliminary validation methodology for fault-tolerant systems is documented. A general framework for a validation methodology is presented, along with a set of specific tasks intended for the validation of two specimen systems, SIFT and FTMP. Two major areas of research are identified: first, the activities required to support the ongoing development of the validation process itself, and second, the activities required to support the design, development, and understanding of fault-tolerant systems.

  18. Validation and Clinical Evaluation of a Novel Method To Measure Miltefosine in Leishmaniasis Patients Using Dried Blood Spot Sample Collection

    PubMed Central

    Rosing, H.; Hillebrand, M. J. X.; Blesson, S.; Mengesha, B.; Diro, E.; Hailu, A.; Schellens, J. H. M.; Beijnen, J. H.

    2016-01-01

    To facilitate future pharmacokinetic studies of combination treatments against leishmaniasis in remote regions in which the disease is endemic, a simple cheap sampling method is required for miltefosine quantification. The aims of this study were to validate a liquid chromatography-tandem mass spectrometry method to quantify miltefosine in dried blood spot (DBS) samples and to validate its use with Ethiopian patients with visceral leishmaniasis (VL). Since hematocrit (Ht) levels are typically severely decreased in VL patients, returning to normal during treatment, the method was evaluated over a range of clinically relevant Ht values. Miltefosine was extracted from DBS samples using a simple method of pretreatment with methanol, resulting in >97% recovery. The method was validated over a calibration range of 10 to 2,000 ng/ml, and accuracy and precision were within ±11.2% and ≤7.0% (≤19.1% at the lower limit of quantification), respectively. The method was accurate and precise for blood spot volumes between 10 and 30 μl and for Ht levels of 20 to 35%, although a linear effect of Ht levels on miltefosine quantification was observed in the bioanalytical validation. DBS samples were stable for at least 162 days at 37°C. Clinical validation of the method using paired DBS and plasma samples from 16 VL patients showed a median observed DBS/plasma miltefosine concentration ratio of 0.99, with good correlation (Pearson's r = 0.946). Correcting for patient-specific Ht levels did not further improve the concordance between the sampling methods. This successfully validated method to quantify miltefosine in DBS samples was demonstrated to be a valid and practical alternative to venous blood sampling that can be applied in future miltefosine pharmacokinetic studies with leishmaniasis patients, without Ht correction. PMID:26787691

  19. Clinical validation of a type-specific real-time quantitative human papillomavirus PCR against the performance of hybrid capture 2 for the purpose of cervical cancer screening.

    PubMed

    Depuydt, C E; Benoy, I H; Beert, J F A; Criel, A M; Bogers, J J; Arbyn, M

    2012-12-01

    To be acceptable for use in cervical cancer screening, a new assay that detects DNA of high-risk human papillomavirus (hrHPV) types must demonstrate high reproducibility and performance not inferior to that of a clinically validated HPV test. In the present study, a real-time quantitative PCR (qPCR) assay targeting the E6 and E7 genes of hrHPV was compared with Hybrid Capture 2 (hc2) in a Belgian cervical cancer screening setting. In women >30 years old, the sensitivity and specificity for cervical intraepithelial neoplasia of grade 2 or worse (CIN2+; 93 cases, against 1,207 cases of no CIN or CIN1) were 93.6% and 95.6%, respectively, versus 83.9% and 94.5% for hc2 {relative sensitivity of qPCR/hc2 = 1.12 [95% confidence interval (CI), 1.01 to 1.23]; relative specificity = 1.01 [95% CI, 0.99 to 1.03]}. A score test showed that the sensitivity (P < 0.0001) and specificity (P < 0.0001) of the qPCR assay were not inferior to those of hc2 at the required thresholds of 90% and 98%, respectively. The overall agreement of hrHPV positivity between the two runs of the qPCR tests was 98.7% (95% CI, 97.5 to 99.4%), with a kappa value of 0.96 (95% CI, 0.83 to 1.00). The qPCR assay used in this study can be considered a reliable HPV assay that fulfills the clinical validation criteria defined for use in cervical cancer screening.
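
    A sketch of the comparison statistics used here: sensitivity and specificity for each assay, their ratios (relative sensitivity and specificity), and Cohen's kappa for test-retest agreement. The counts below are illustrative, chosen to mirror the reported point estimates, and are not the study's raw 2x2 tables.

```python
# Sensitivity/specificity, their ratios, and Cohen's kappa for agreement.
def sens_spec(tp, fn, tn, fp):
    return tp / (tp + fn), tn / (tn + fp)

qpcr = sens_spec(tp=87, fn=6, tn=1154, fp=53)   # 93 CIN2+, 1207 <=CIN1
hc2 = sens_spec(tp=78, fn=15, tn=1141, fp=66)

print(f"relative sensitivity qPCR/hc2: {qpcr[0] / hc2[0]:.2f}")  # ~1.12
print(f"relative specificity qPCR/hc2: {qpcr[1] / hc2[1]:.2f}")  # ~1.01

def cohens_kappa(a, b, c, d):
    """Kappa from a 2x2 agreement table: a=both+, b=run1+/run2-,
    c=run1-/run2+, d=both-."""
    n = a + b + c + d
    po = (a + d) / n
    pe = ((a + b) * (a + c) + (c + d) * (b + d)) / n**2
    return (po - pe) / (1 - pe)

print(f"kappa = {cohens_kappa(a=140, b=2, c=2, d=156):.2f}")
```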

  20. Validated HPLC method and temperature stabilities for oil-soluble organosulfur compounds in garlic macerated oil.

    PubMed

    Yoo, Miyoung; Kim, Sunyoung; Lee, Sanghee; Shin, Dongbin

    2014-01-01

    To enhance the utilization of garlic macerated oil in functional foods, oil-soluble organosulfur compounds were investigated using a normal-phase high-performance liquid chromatography method. For the analysis, the compounds were simply extracted with 98% n-hexane in 2-propanol, followed by sensitive and selective determination of all compounds. The method exhibited excellent linearity for the oil-soluble organosulfur compounds, with good correlation coefficients (r > 0.999). Average recoveries were in the range of 80.23-106.18%. The limits of quantitation of the oil-soluble organosulfur compounds ranged from 0.32 to 9.56 μg mL(-1), and the limits of detection from 0.11 to 3.16 μg mL(-1). Overall, the precision of the results, expressed as relative standard deviation, ranged from 0.55 to 11.67%. The proposed method was applied to determining the contents of oil-soluble organosulfur compounds in commercial garlic macerated oils. In addition, the stability of the oil-soluble organosulfur compounds in garlic macerated oil was evaluated during 3 months of storage at four different temperatures (4, 10, 25 and 35°C). The results showed that the studied oil-soluble compounds in garlic macerated oil were stable at 4°C and relatively unstable at 35°C, with varying extents of degradation. These validation and temperature-stability data may therefore be useful for the quality evaluation of garlic macerated oils.

  1. A validated method for quantifying hypoglycin A in whole blood by UHPLC-HRMS/MS.

    PubMed

    Carlier, Jérémy; Guitton, Jérôme; Moreau, Cécile; Boyer, Baptiste; Bévalot, Fabien; Fanton, Laurent; Habyarimana, Jean; Gault, Gilbert; Gaillard, Yvan

    2015-01-26

    Hypoglycin A (HGA) is the toxic principle in ackee (Blighia sapida Koenig), a nutritious and readily available fruit which is a staple of the Jamaican working-class and rural population. The aril of the unripe fruit has high concentrations of HGA, the cause of Jamaican vomiting sickness, which is very often fatal. HGA is also present in the samara of several species of maple (Acer spp.) which are suspected to cause seasonal pasture myopathy in North America and equine atypical myopathy in Europe, often fatal for horses. The aim of this study was to develop a method for quantifying HGA in blood that would be sensitive enough to provide toxicological evidence of ackee or maple poisoning. Analysis was carried out using solid-phase extraction (HILIC cartridges), dansyl derivatization and UHPLC-HRMS/MS detection. The method was validated in whole blood with a detection limit of 0.35 μg/L (range: 0.8-500 μg/L). This is the first method applicable in forensic toxicology for quantifying HGA in whole blood. HGA was quantified in two serum samples from horses suffering from atypical myopathy. The concentrations were 446.9 and 87.8 μg/L. HGA was also quantified in dried arils of unripe ackee fruit (Suriname) and seeds of sycamore maple (Acer pseudoplatanus L.) (France). The concentrations were 7.2 and 0.74 mg/g respectively.

  2. Development and validation of a LC-MS/MS method to determine sulforaphane in honey.

    PubMed

    Ares, Ana M; Valverde, Silvia; Bernal, José L; Nozal, María J; Bernal, José

    2015-08-15

    A new method was developed to determine sulforaphane (SFN) in honey using liquid chromatography tandem mass spectrometry (LC-MS/MS) with electrospray ionization (ESI). An efficient extraction procedure was proposed (average analyte recoveries were between 92% and 99%); this involved a solid phase extraction (SPE) with a polymeric sorbent. Chromatography was performed on a Synergi™ Hydro analytical column with a mobile phase of 0.02 M ammonium formate in water and acetonitrile, at a flow rate of 0.5 mL/min. The method was fully validated in terms of selectivity, limits of detection (LOD) and quantification (LOQ), linearity, carry-over effect, reinjection reproducibility, precision and accuracy. The LOD and LOQ values were below 0.8 μg/kg and 2.6 μg/kg, respectively. The proposed method was applied to analyze SFN in honey from different botanical origins (rosemary, multifloral, orange blossom and heather), and SFN was detected at trace levels in some of the honey samples examined.

  3. Conventional and micellar liquid chromatography method development for danazol and validation in capsules.

    PubMed

    Gonzalo-Lumbreras, R; Izquierdo-Hornillos, R

    2003-07-14

    Two isocratic liquid chromatographic methods (conventional and micellar) for the determination of danazol (DZ) in capsules, using canrenone (CAN) as internal standard, have been developed and validated. In conventional liquid chromatography (CLC), a mobile phase of water-acetonitrile (35:65, v/v), a flow rate of 1 ml min(-1) and a C18 Hypersil ODS (250 x 4.6 mm, 5 microm) column (25 degrees C) were used. In micellar liquid chromatography (MLC) the conditions were: a mobile phase of 40 mM sodium dodecyl sulfate with 2% pentanol, a flow rate of 0.5 ml min(-1) and a C18 Hypersil ODS (150 x 3.0 mm, 5 microm) column (60 degrees C). For both methods, UV absorbance detection at 280 nm was used, and baseline separation was achieved. Prior to HPLC analysis, only a simple sample preparation was required. The recoveries found in the accuracy test were 99 +/- 10 and 101 +/- 8% in CLC and MLC, respectively. Repeatability and intermediate precision, expressed as R.S.D., were lower than 5% for both methods. Detection limits were 2.4 and 3.0 ng g(-1) in CLC and MLC, respectively. PMID:14565547

  4. Validation of a mass spectrometry-based method for milk traces detection in baked food.

    PubMed

    Lamberti, Cristina; Acquadro, Elena; Corpillo, Davide; Giribaldi, Marzia; Decastelli, Lucia; Garino, Cristiano; Arlorio, Marco; Ricciardi, Carlo; Cavallarin, Laura; Giuffrida, Maria Gabriella

    2016-05-15

    A simple validated LC-MS/MS-based method was set up to detect milk contamination in bakery products, taking the effects of food processing into account for the evaluation of allergen recovery and quantification. Incurred cookies were prepared at eight levels of milk contamination and were cooked to expose all milk components, including allergenic proteins, to food processing conditions. Remarkable results were obtained in terms of sufficiently low LOD and LOQ (1.3 and 4 mg/kg cookies, respectively). Precision was calculated as intra-day repeatability (RSD in the 5-20% range) and inter-day repeatability (4 days; RSD never exceeded 12%). The extraction recovery values ranged from 20% to 26%. Method applicability was evaluated by analysing commercial cookies labelled either as "milk-free" or "may contain milk". Although the ELISA methodology is considered the gold standard for detecting allergens in foods, this robust LC-MS/MS approach should be a useful confirmatory method for assessing and certifying "milk-free" food products.

  5. Validated HPTLC methods for quantification of mexiletine hydrochloride in a pharmaceutical formulation.

    PubMed

    Pietraś, Rafal; Skibiński, Robert; Komsta, Łukasz; Kowalczuk, Dorota; Panecka, Ewa

    2010-01-01

    Two simple, accurate, and precise HPTLC methods have been established for the determination of mexiletine hydrochloride, an antiarrhythmic agent, in Mexicord capsules. Analyses were performed in horizontal chambers on RP C18F254s and normal-phase amino (NH2) HPTLC precoated plates with the mobile phases tetrahydrofuran-citrate buffer, pH 4.45 (3 + 7, v/v) and chloroform-tetrahydrofuran-hexane-ethylamine (3 + 2 + 5 + 0.1, v/v/v/v), respectively. The plates were developed over a distance of 40 mm in both cases. Densitometric measurements were performed in the UV mode at 217 nm, based on peak areas with semilinear calibration curves (R(2) >= 0.97) in the concentration range 0.5-8.0 microg/spot for both the NH2 and C18 HPTLC methods. The developed chromatographic methods were validated in accordance with International Conference on Harmonization guidelines in terms of linearity, accuracy (99.64% for NH2 and 99.53% for C18), precision (intraday RSD 1.16 and 2.71%, respectively), sensitivity (LOD 0.1 microg/spot for both systems), and specificity. PMID:20629382

  6. Numerical validation of a suprasystolic brachial cuff-based method for estimating aortic pressure.

    PubMed

    Liang, Fuyou

    2014-01-01

    Central aortic pressures are better predictors of cardiovascular events than peripheral pressures. However, central aortic blood pressures cannot be measured noninvasively; for this reason, estimating aortic pressures from noninvasive measurements of peripheral pressures has been the subject of numerous studies. In the present study, a novel method was proposed to noninvasively estimate aortic pressures from the oscillometric wave of a suprasystolic brachial cuff. The estimation errors were evaluated in relation to various cardiovascular properties using an integrated cardiovascular-cuff model. The results demonstrated that the estimation errors are affected mainly by aortic stiffness. The estimation errors for aortic systolic pressure, diastolic pressure, pulse pressure and wave shape under the assumed cardiovascular conditions were 5.84 ± 1.58 mmHg, -0.28 ± 0.41 mmHg, 6.12 ± 1.42 mmHg and 1.72 ± 0.57 mmHg, respectively, all of which fell within the error ranges established by existing devices. Since the method is easy to automate and bases its estimation entirely on patient-specific information, its clinical application is promising, although further clinical studies are needed to validate the method in vivo.
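    Errors of the form 5.84 ± 1.58 mmHg are the mean ± SD of the differences between estimated and reference pressures. A minimal sketch, with hypothetical pressure values rather than the model outputs from the study:

```python
import numpy as np

def estimation_errors(estimated, reference):
    """Mean +/- SD of (estimated - reference), the convention behind
    figures such as 5.84 +/- 1.58 mmHg."""
    diff = np.asarray(estimated) - np.asarray(reference)
    return diff.mean(), diff.std(ddof=1)

# Hypothetical aortic systolic pressures (mmHg): cuff-based estimate vs.
# the simulated "true" value from the cardiovascular model.
est = [128.3, 141.9, 117.2, 135.6, 150.4]
ref = [122.1, 136.5, 111.8, 129.3, 144.7]
bias, sd = estimation_errors(est, ref)
print(f"error = {bias:.2f} +/- {sd:.2f} mmHg")
```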

  7. Validation of a gas chromatographic method to quantify sesquiterpenes in copaiba oils.

    PubMed

    Sousa, João Paulo B; Brancalion, Ana P S; Souza, Ariana B; Turatti, Izabel C C; Ambrósio, Sérgio R; Furtado, Niege A J C; Lopes, Norberto P; Bastos, Jairo K

    2011-03-25

    Copaifera species (Leguminosae) are popularly known as "copaiba" or "copaíva". The oleoresins obtained from the trunks of these species have been extensively used in folk medicine and are commercialized in Brazil as crude oil and in several pharmaceutical and cosmetic products. This work reports a fully validated method for the quantification of β-caryophyllene, α-copaene, and α-humulene in distinct commercially available copaiba oleoresins. Essential oil samples (100 μL) were dissolved in 20 mL of hexanes containing internal standard (1,2,4,5-tetramethylbenzene, 3.0 mM) in a 25 mL glass flask, and a 1 μL aliquot was injected into the GC-FID system. A fused-silica capillary HP-5 column, coated with 5% phenyl-methylsiloxane, was used for this study. The developed method gave a good detection response, with linearity in the range of 0.10-18.74 mM. Limits of detection and quantitation ranged between 0.003 and 0.091 mM. β-Caryophyllene, α-copaene, and α-humulene were recovered in a range from 74.71% to 88.31%, with RSD lower than 10% and relative errors between -25.30% and -11.69%. Therefore, this method can be considered an analytical tool for the quality control of different Copaifera oil samples and their products in both cosmetic and pharmaceutical companies.
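    The method quantifies the sesquiterpenes against a 1,2,4,5-tetramethylbenzene internal standard. The sketch below shows the generic internal-standard calibration arithmetic (analyte/IS area ratio regressed on concentration, then back-calculation); the standards and ratios are hypothetical, not the published calibration data.

```python
import numpy as np

# Hypothetical beta-caryophyllene standards (mM) against the 3.0 mM
# 1,2,4,5-tetramethylbenzene internal standard (IS).
conc  = np.array([0.10, 0.50, 2.0, 6.0, 12.0, 18.74])
ratio = np.array([0.031, 0.158, 0.63, 1.91, 3.78, 5.92])  # analyte/IS peak area

slope, intercept = np.polyfit(conc, ratio, 1)

def quantify(sample_ratio):
    """Back-calculate concentration from a sample's analyte/IS area ratio."""
    return (sample_ratio - intercept) / slope

print(f"sample concentration ~ {quantify(2.45):.2f} mM")
```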

  8. Development and validation of RP-HPLC method for quantification of glipizide in biological macromolecules.

    PubMed

    Pani, Nihar Ranjan; Acharya, Sujata; Patra, Sradhanjali

    2014-04-01

    Glipizide (GPZ), an insulin secretagogue, has been widely used in the treatment of type-2 diabetes. Multiunit chitosan-based GPZ floating microspheres were prepared by the ionotropic gelation method for gastroretentive delivery, using sodium tripolyphosphate as the cross-linking agent. A pharmacokinetic study of the microspheres was performed in rabbits, and plasma samples were analyzed by a newly developed and validated high-performance liquid chromatographic method. The method was developed on a Hypersil ODS-18 column using a mobile phase of 10 mM phosphate buffer (pH 3.5) and methanol (25:75, v/v). The eluate was monitored at 230 nm at a flow rate of 1 mL/min. The calibration curve was linear over the concentration range of 25.38-2046.45 ng/mL. The retention times of GPZ and the internal standard (gliclazide) were 7.32 and 9.02 min, respectively. The maximum plasma drug concentration, area under the plasma drug concentration-time curve and elimination half-life for the GPZ floating microspheres were 2.88±0.29 μg mL(-1), 38.46±2.26 μg h mL(-1) and 13.55±1.36 h, respectively. When the fraction of drug dissolved from the microspheres at pH 7.4 was plotted against the fraction of drug absorbed, a linear correlation (R(2)=0.991) was obtained in the in vitro-in vivo correlation study.
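    The final sentence describes a level-A-style in vitro-in vivo correlation: fraction dissolved regressed against fraction absorbed. A minimal sketch with hypothetical fractions (the study's reported R(2) was 0.991):

```python
import numpy as np

# Hypothetical fractions: in vitro dissolution (pH 7.4) vs. in vivo absorption
f_dissolved = np.array([0.10, 0.25, 0.42, 0.61, 0.78, 0.90])
f_absorbed  = np.array([0.12, 0.27, 0.40, 0.63, 0.75, 0.92])

slope, intercept = np.polyfit(f_dissolved, f_absorbed, 1)
r2 = np.corrcoef(f_dissolved, f_absorbed)[0, 1] ** 2
print(f"f_abs ~ {slope:.2f} * f_diss + {intercept:.2f}, R2 = {r2:.3f}")
```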

  9. Development and validation of simultaneous estimation method for curcumin and piperine by RP-UFLC.

    PubMed

    Ramaswamy, Shanmugam; Kuppuswamy, Gowthamarajan; Dwarampudi, Priyanka; Kadiyala, Madhuri; Menta, Lokesh; Kannan, Elango

    2014-07-01

    Curcumin and piperine have proven medicinal benefits in the treatment of various diseases, and they are a combination commonly used in various Indian systems of medicine such as Ayurveda, Siddha and Unani. The objective of the present work was to develop a method for the simultaneous estimation of curcumin and piperine by reverse-phase ultra-fast liquid chromatography (RP-UFLC). The chromatographic separation was performed on a C8 column (250 x 4.6 mm i.d., 5 μm) using a mobile phase of 25 mM potassium dihydrogen orthophosphate buffer (pH 3.5) and acetonitrile (30:70, v/v) at a flow rate of 1 mL/min and a detection wavelength of 280 nm. The calibration curve was plotted in the concentration range of 0-2200 ng/mL and found to be linear for both curcumin (r(2)=0.996) and piperine (r(2)=0.999). The method was validated for accuracy, sensitivity, precision, linearity, specificity, ruggedness and robustness as per ICH guidelines. The developed simple, precise and specific method can be used as a quality control tool for the qualitative and quantitative estimation of curcumin and piperine in various food products, herbal medicines and nutraceuticals. PMID:25015458

  10. Comparison and validation of statistical methods for predicting power outage durations in the event of hurricanes.

    PubMed

    Nateghi, Roshanak; Guikema, Seth D; Quiring, Steven M

    2011-12-01

    This article compares statistical methods for modeling power outage durations during hurricanes and examines the predictive accuracy of these methods. Accurate predictions of power outage durations are valuable because utility companies can use the information to plan their restoration efforts more efficiently. This information can also help inform customers and public agencies of the expected outage times, enabling better collective response planning and coordination of restoration efforts for other critical infrastructures that depend on electricity. In the long run, outage duration estimates for future storm scenarios may help utilities and public agencies better allocate risk management resources to balance the disruption from hurricanes with the cost of hardening power systems. We compare the out-of-sample predictive accuracy of five distinct statistical models for estimating power outage duration times caused by Hurricane Ivan in 2004. The methods compared include both regression models (accelerated failure time (AFT) and Cox proportional hazard (Cox PH) models) and data mining techniques (regression trees, Bayesian additive regression trees (BART), and multivariate adaptive regression splines (MARS)). We then validate our models against two other hurricanes. Our results indicate that BART yields the best prediction accuracy and that it is possible to predict outage durations with reasonable accuracy.
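    The comparison hinges on out-of-sample predictive accuracy across model families. The sketch below mimics that workflow on synthetic data with scikit-learn stand-ins (linear regression, random forest, gradient-boosted trees); it does not reproduce the paper's AFT, Cox PH, BART, or MARS implementations, and all features and durations are hypothetical.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Hypothetical covariates (e.g., wind speed, soil moisture, customers served,
# tree density) and right-skewed outage durations in hours.
X = rng.normal(size=(500, 4))
y = np.exp(1.0 + 0.8 * X[:, 0] + 0.3 * X[:, 1] + rng.normal(0, 0.4, 500))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)
models = {
    "linear regression": LinearRegression(),
    "random forest": RandomForestRegressor(random_state=1),
    "boosted trees": GradientBoostingRegressor(random_state=1),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    mae = mean_absolute_error(y_te, model.predict(X_te))
    print(f"{name}: holdout MAE = {mae:.2f} h")
```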

  11. Development and validation of a method for allantoin determination in liposomes and pharmaceutical formulations

    NASA Astrophysics Data System (ADS)

    Braga, Raquel Rennó; Sales, Juliana; Marins, Rita de Cassia Elias Estrela; Ortiz, Gisela Maria Dellamora; Garcia, Sheila

    2012-06-01

    The aim of this work was to develop and validate an ultraviolet derivative spectrophotometric (UVDS) method for the quantitative determination of allantoin (ALL) in liposomes, gels and creams. Liposomes were prepared by thin-film hydration and mechanical agitation. Solutions of ALL in 0.1 mol/L NaOH with ethanol:water (70:30, v/v) were prepared in order to destroy the liposome vesicles. Spectral interference from components of the liposomes, cream, gel and ALL degradation products was eliminated using the second-order derivative of the zero-order spectrum. Characterization of ALL in 0.1 mol/L NaOH was carried out by direct-infusion mass spectrometry. Absorbances of ALL solutions were measured at 266.6 nm on the second-derivative spectrum, and linearity was observed in the ALL concentration range of 50-300 μg mL-1 (correlation coefficient (r) = 0.9961). The mean recovery was 100.68 ± 1.61%, repeatability expressed as relative standard deviation (RSD) was 1.07 and 2.12%, and intermediate precision (RSD) was 2.16%. The proposed UVDS method was found to be linear, precise, accurate, robust and selective, providing rapid and specific determination of ALL in raw materials and in topical formulations.
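    The key trick is that a second-order derivative of the zero-order spectrum suppresses broad matrix absorbance while preserving the sharp analyte band. A minimal numerical sketch using a Savitzky-Golay second derivative on a synthetic spectrum (the authors' instrument software may compute the derivative differently):

```python
import numpy as np
from scipy.signal import savgol_filter

# Synthetic zero-order spectrum: a Gaussian ALL band at 266.6 nm sitting
# on a broad, sloping matrix background (liposome/cream components).
wl = np.linspace(220, 320, 501)
spectrum = 1.2 * np.exp(-((wl - 266.6) / 12) ** 2) + 0.004 * (320 - wl)

# The second-order derivative flattens the slowly varying background
# while retaining the sharp analyte feature.
d2 = savgol_filter(spectrum, window_length=21, polyorder=3,
                   deriv=2, delta=wl[1] - wl[0])
peak = d2[np.abs(wl - 266.6).argmin()]
print(f"second-derivative amplitude at 266.6 nm: {peak:.4f}")
```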

  12. [Forensic risk calculation: basic methodological aspects for the evaluation of the applicability and validity of diverse methods].

    PubMed

    Urbaniok, F; Rinne, T; Held, L; Rossegger, A; Endrass, J

    2008-08-01

    Risk assessment instruments have been the subject of numerous validation studies, which have mainly examined the psychometric properties familiar from psychological test development (objectivity, reliability and validity). Little attention has been paid to the fact that the validation of forensic risk assessment instruments faces a whole series of methodological challenges. Risk assessments combine a quantitative and a qualitative component in that they state the probability (quantitative) that a particular offense (qualitative) will occur. Disregarding the probabilistic nature of risk calculations leads to methodologically faulty assumptions about the predictive validity of an instrument and about which statistical methods are suitable to test it. For example, ROC analyses are considered state of the art in the validation of risk assessment instruments. However, this method does not take the probabilistic nature of prognoses into account, and its results can be interpreted only to a limited degree. In particular, ROC analyses disregard aspects of an instrument's calibration, so a validation study may report high ROC values for an instrument that nevertheless shows only low validity. Further shortcomings of validation studies are that they ignore changes in risk disposition and that they do not differentiate between offense-specific risks (e.g., any recidivism vs. violent or sexual recidivism). The paper reviews and discusses different quality criteria for risk assessment instruments in view of methodological as well as practical issues. Many of these criteria have so far been ignored in the scientific discourse, even though they are essential to evaluating the validity and the scope of indication of an instrument.
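    The paper's central caution, that ROC analysis measures only discrimination and ignores calibration, can be illustrated numerically: shifting every predicted probability upward leaves the AUC unchanged while making the predictions badly miscalibrated. A sketch with simulated data (all numbers hypothetical):

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 2000
true_risk = rng.uniform(0.05, 0.45, size=n)   # latent reoffense risk
reoffended = rng.binomial(1, true_risk)       # simulated outcomes

# An instrument with correct rank ordering but inflated risk estimates:
predicted = np.clip(true_risk + 0.4, 0.0, 1.0)

# Discrimination (what ROC measures) is untouched by the monotone shift...
print("AUC:", round(roc_auc_score(reoffended, predicted), 3))
# ...but calibration is poor: predicted risk far exceeds the observed rate.
print("mean predicted risk:", round(predicted.mean(), 2),
      "| observed recidivism rate:", round(reoffended.mean(), 2))
```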

  13. Small-scale and household methods to remove arsenic from water for drinking purposes in Latin America.

    PubMed

    Litter, Marta I; Alarcón-Herrera, María Teresa; Arenas, María J; Armienta, María A; Avilés, Marta; Cáceres, Roberto E; Cipriani, Henrique Nery; Cornejo, Lorena; Dias, Luiz E; Cirelli, Alicia Fernández; Farfán, Elsa M; Garrido, Sofía; Lorenzo, Liliana; Morgada, María E; Olmos-Márquez, Mario A; Pérez-Carrera, Alejo

    2012-07-01

    Small-scale and household low-cost technologies to provide arsenic-free water for drinking purposes, suitable for isolated rural and periurban areas of Latin America not connected to water networks, are described. Some of them are merely adaptations of conventional technologies already used at large and medium scale, but others are environmentally friendly emerging procedures that use local materials and resources of the affected zone. The technologies require simple and low-cost equipment that can be easily handled and maintained by the local population. The methods are based on the following processes: combination of coagulation/flocculation with adsorption, adsorption on geological and other low-cost natural materials, electrochemical technologies, biological methods including phytoremediation, use of zerovalent iron, and photochemical processes. Examples of relevant research studies and developments in the region are given. In some cases, processes have been tested only at the laboratory level and there is not enough information about the costs. However, the presented technologies are considered potential alternatives for arsenic removal in isolated rural and periurban localities of Latin America. Generation, handling and adequate disposal of residues should be taken into account in all cases.

  14. Hypothesis-driven and field-validated method to prioritize fragmentation mitigation efforts in road projects.

    PubMed

    Vanthomme, Hadrien; Kolowski, Joseph; Nzamba, Brave S; Alonso, Alfonso

    2015-10-01

    The active field of connectivity conservation has provided numerous methods to identify wildlife corridors with the aim of reducing the ecological effect of fragmentation. Nevertheless, these methods often rely on untested hypotheses of animal movements, usually fail to generate fine-scale predictions of road crossing sites, and do not allow managers to prioritize crossing sites for implementing road fragmentation mitigation measures. We propose a new method that addresses these limitations. We illustrate this method with data from southwestern Gabon (central Africa). We used stratified random transect surveys conducted in two seasons to model the distribution of African forest elephant (Loxodonta cyclotis), forest buffalo (Syncerus caffer nanus), and sitatunga (Tragelaphus spekii) in a mosaic landscape along a 38.5 km unpaved road scheduled for paving. Using a validation data set of recorded crossing locations, we evaluated the performance of three types of models (local suitability, local least-cost movement, and regional least-cost movement) in predicting actual road crossings for each species, and developed a unique and flexible scoring method for prioritizing road sections for the implementation of road fragmentation mitigation measures. With a data set collected in <10 weeks of fieldwork, the method was able to identify seasonal changes in animal movements for buffalo and sitatunga that shift from a local exploitation of the site in the wet season to movements through the study site in the dry season, whereas elephants use the entire study area in both seasons. These three species highlighted the need to use species- and season-specific modeling of movement. From these movement models, the method ranked road sections for their suitability for implementing fragmentation mitigation efforts, allowing managers to adjust priority thresholds based on budgets and management goals. The method relies on data that can be obtained in a period compatible with
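    The abstract's ranking step can be pictured as combining per-section crossing scores across species into a weighted priority score with an adjustable threshold. The sketch below is a schematic reconstruction under that assumption; the species weights, scores, and threshold are hypothetical, not the authors' scoring rules.

```python
import numpy as np

rng = np.random.default_rng(2)
n_sections = 10                                # hypothetical road sections

# Hypothetical per-section crossing scores (0-1) per species, e.g. taken
# from the validated movement models for the dry season.
scores = {sp: rng.uniform(size=n_sections)
          for sp in ("elephant", "buffalo", "sitatunga")}
weights = {"elephant": 0.5, "buffalo": 0.3, "sitatunga": 0.2}  # manager-set

combined = sum(w * scores[sp] for sp, w in weights.items())
order = np.argsort(combined)[::-1]             # sections, best candidates first
threshold = 0.6                                # adjustable to fit the budget
priority = order[combined[order] > threshold]
print("ranked sections:", order, "| above threshold:", priority)
```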

  15. Validation of the ANSR Salmonella method for detection of Salmonella spp. in a variety of foods. Performance Tested Method 061203.

    PubMed

    Caballero, Oscar; Alles, Susan; Gray, R Lucas; Tolan, Jerry; Mozola, Mark; Rice, Jennifer

    2014-01-01

    This study represents a proposal to extend the matrix claims for the ANSR Salmonella test, Performance Tested Method 061203. The test is based on the nicking enzyme amplification reaction (NEAR) isothermal nucleic acid amplification technology. The assay platform features simple instrumentation, minimal labor and, following a single-step 16-24 h enrichment (depending on sample type), an extremely short assay time of 30 min, including sample preparation. Detection is performed in real time using fluorescent molecular beacon probes. ANSR Salmonella was originally validated for detection of Salmonella spp. in chicken carcass rinse, raw ground turkey, raw ground beef, hot dogs, and oat cereal, and on stainless steel, plastic, sealed concrete, ceramic tile, and rubber surfaces. The matrixes tested in this study include pet food, ice cream, soy flour, raw almonds, peanut butter, spinach, black pepper, raw frozen shrimp, cocoa powder, and pasteurized dried egg. In unpaired comparative testing there were no statistically significant differences in the number of positive results obtained with the ANSR and the reference culture methods. Enrichment for 16 h was effective for all commodities tested except ice cream, black pepper, dried pasteurized egg, and 375 g samples of dry pet food, for which enrichment for 24 h is indicated. PMID:24830155

  16. Method for monitoring the fertility of workers. 2. Validation of the method among workers exposed to dibromochloropropane

    SciTech Connect

    Levine, R.J.; Symons, M.J.; Balogh, S.A.; Milby, T.H.; Whorton, M.D.

    1981-03-01

    A method has been developed for monitoring industrial workers and others exposed to environmental agents that may impair fertility. National birth probabilities specific for maternal birth cohort, age, parity, and race are used to derive expected fertility; observed fertility is obtained by questionnaire. Standardized fertility ratios are computed for exposure and non-exposure periods and compared. The analytic techniques have been validated by applying the method to a group of 36 male factory employees working in an agricultural chemical division (ACD) where pesticides, including the nematocide dibromochloropropane, were formulated. In mid-1977, 12 of these employees had been found to have severely depressed sperm counts related to occupational exposure. The standardized fertility ratio (SFR) computed from data available in mid-1977 for the period at risk from employment in the ACD (SFR = 0.75) was significantly lower than those derived for the entire not-at-risk period (SFR = 1.88) and for the portion related to employment in other areas of the factory (SFR = 2.16). Similar differences were also evident from data available several years earlier, demonstrating that the surveillance technique would have been capable of detecting occupationally induced infertility among these workers in advance of the actual discovery date.
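    The standardized fertility ratio is observed births divided by the births expected from national probabilities matched on maternal birth cohort, age, parity, and race. A minimal sketch with hypothetical expectations (tuned so the exposed-period SFR lands near the reported 0.75):

```python
def standardized_fertility_ratio(observed_births, expected_probs):
    """SFR = observed births / sum of expected birth probabilities, each
    probability taken from national tables matched on maternal birth
    cohort, age, parity, and race (values below are hypothetical)."""
    return observed_births / sum(expected_probs)

# Hypothetical per-person-year expectations for the 36 ACD workers
expected = [0.31, 0.28, 0.25, 0.33, 0.29, 0.27] * 6
print(f"exposed-period SFR = {standardized_fertility_ratio(8, expected):.2f}")
```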

  17. Nanoliter microfluidic hybrid method for simultaneous screening and optimization validated with crystallization of membrane proteins

    PubMed Central

    Li, Liang; Mustafi, Debarshi; Fu, Qiang; Tereshko, Valentina; Chen, Delai L.; Tice, Joshua D.; Ismagilov, Rustem F.

    2006-01-01

    High-throughput screening and optimization experiments are critical to a number of fields, including chemistry and structural and molecular biology. The separation of these two steps may introduce false negatives and a time delay between initial screening and subsequent optimization. Although a hybrid method combining both steps may address these problems, miniaturization is required to minimize sample consumption. This article reports a “hybrid” droplet-based microfluidic approach that combines the steps of screening and optimization into one simple experiment and uses nanoliter-sized plugs to minimize sample consumption. Many distinct reagents were sequentially introduced as ≈140-nl plugs into a microfluidic device and combined with a substrate and a di