Sample records for target validation laboratory

  1. Development and Validation of Targeted Next-Generation Sequencing Panels for Detection of Germline Variants in Inherited Diseases.

    PubMed

    Santani, Avni; Murrell, Jill; Funke, Birgit; Yu, Zhenming; Hegde, Madhuri; Mao, Rong; Ferreira-Gonzalez, Andrea; Voelkerding, Karl V; Weck, Karen E

    2017-06-01

    The number of targeted next-generation sequencing (NGS) panels for genetic diseases offered by clinical laboratories is rapidly increasing. Before an NGS-based test is implemented in a clinical laboratory, appropriate validation studies are needed to determine the performance characteristics of the test. This report provides examples of assay design and validation of targeted NGS gene panels for the detection of germline variants associated with inherited disorders. The approaches used by two clinical laboratories for the development and validation of targeted NGS gene panels are described, and important design and validation considerations are examined. Clinical laboratories must validate the performance specifications of each test prior to implementation; test design specifications and validation data are provided, outlining important steps in the validation of targeted NGS panels by clinical diagnostic laboratories.
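
The performance characteristics mentioned above are typically derived by comparing the new assay's variant calls against a reference method on a shared case set. A minimal sketch, with invented placeholder variant lists rather than real validation data:

```python
# Sketch of performance-characteristic calculations used when validating a
# targeted NGS panel against a reference method (e.g. Sanger-confirmed calls).
# The variant identifiers and counts below are illustrative placeholders.

def concordance_metrics(test_calls: set, reference_calls: set, total_sites: int):
    """Compute analytical sensitivity (PPA), specificity (NPA), and PPV
    from variant calls made by the new assay vs. a reference method."""
    tp = len(test_calls & reference_calls)   # detected by both methods
    fp = len(test_calls - reference_calls)   # called only by the new assay
    fn = len(reference_calls - test_calls)   # missed by the new assay
    tn = total_sites - tp - fp - fn          # concordant negative positions
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
    }

# Hypothetical run: 1000 assessed positions, reference method found 40 variants;
# the new assay misses one of them and makes one extra call.
reference = {f"var{i}" for i in range(40)}
assay = (reference - {"var0"}) | {"novel1"}
metrics = concordance_metrics(assay, reference, total_sites=1000)
```
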

  2. 42 CFR 493.565 - Selection for validation inspection-laboratory responsibilities.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Selection for validation inspection-laboratory... Program § 493.565 Selection for validation inspection—laboratory responsibilities. A laboratory selected for a validation inspection must do the following: (a) Authorize its accreditation organization or...

  3. 42 CFR 493.565 - Selection for validation inspection-laboratory responsibilities.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 42 Public Health 5 2012-10-01 2012-10-01 false Selection for validation inspection-laboratory... Program § 493.565 Selection for validation inspection—laboratory responsibilities. A laboratory selected for a validation inspection must do the following: (a) Authorize its accreditation organization or...

  4. 42 CFR 493.565 - Selection for validation inspection-laboratory responsibilities.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 5 2011-10-01 2011-10-01 false Selection for validation inspection-laboratory... Program § 493.565 Selection for validation inspection—laboratory responsibilities. A laboratory selected for a validation inspection must do the following: (a) Authorize its accreditation organization or...

  5. 42 CFR 493.565 - Selection for validation inspection-laboratory responsibilities.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 42 Public Health 5 2013-10-01 2013-10-01 false Selection for validation inspection-laboratory... Program § 493.565 Selection for validation inspection—laboratory responsibilities. A laboratory selected for a validation inspection must do the following: (a) Authorize its accreditation organization or...

  6. 42 CFR 493.565 - Selection for validation inspection-laboratory responsibilities.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 5 2014-10-01 2014-10-01 false Selection for validation inspection-laboratory... Program § 493.565 Selection for validation inspection—laboratory responsibilities. A laboratory selected for a validation inspection must do the following: (a) Authorize its accreditation organization or...

  7. Literature evidence in open targets - a target validation platform.

    PubMed

    Kafkas, Şenay; Dunham, Ian; McEntyre, Johanna

    2017-06-06

    We present the Europe PMC literature component of Open Targets, a target validation platform that integrates various types of evidence to aid drug target identification and validation. The component identifies target-disease associations in documents from the Europe PMC literature database and ranks the documents by confidence, using rules that incorporate expert-provided heuristic information. The confidence score of a given document represents how valuable the document is for validating a given target-disease association, taking into account the credibility of the association based on properties of the text. The component has served the platform regularly with up-to-date data since December 2015. Currently, a total of 1,168,365 distinct target-disease associations have been text mined from >26 million PubMed abstracts and >1.2 million Open Access full-text articles. Our comparative analyses of the evidence currently available in the platform revealed that 850,179 of these associations are identified exclusively by literature mining. This component helps the platform's users by providing the most relevant literature hits for a given target and disease. The text mining evidence, along with the other types of evidence, can be explored visually through https://www.targetvalidation.org and all the evidence data is available for download in JSON format from https://www.targetvalidation.org/downloads/data .
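
The abstract describes rule-based confidence scoring of documents but does not give the rules themselves. The sketch below is an assumed toy version: the section weights and the same-sentence bonus are illustrative inventions, not the platform's actual heuristics.

```python
# Illustrative sketch of confidence-scoring a document for a target-disease
# association, in the spirit of the rule-based ranking described above.
# SECTION_WEIGHTS and the same-sentence doubling are assumptions for
# demonstration only.

SECTION_WEIGHTS = {"title": 3.0, "abstract": 2.0, "body": 1.0}  # assumed

def document_confidence(cooccurrences):
    """Score a document from (section, same_sentence) co-occurrence records
    of a target term and a disease term."""
    score = 0.0
    for section, same_sentence in cooccurrences:
        weight = SECTION_WEIGHTS.get(section, 1.0)
        score += weight * (2.0 if same_sentence else 1.0)  # sentence-level hits count double
    return score

# A document mentioning the pair in its title and in one body sentence
# outranks one with two separate body mentions.
doc_a = document_confidence([("title", True), ("body", True)])
doc_b = document_confidence([("body", False), ("body", False)])
```
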

  8. Teaching method validation in the clinical laboratory science curriculum.

    PubMed

    Moon, Tara C; Legrys, Vicky A

    2008-01-01

    With the Clinical Laboratory Improvement Amendments' (CLIA) final rule, the ability of the Clinical Laboratory Scientist (CLS) to perform method validation has become increasingly important. Knowledge of the statistical methods and procedures used in method validation is imperative for clinical laboratory scientists. However, incorporating these concepts in a CLS curriculum can be challenging, especially at a time of limited resources. This paper provides an outline of one approach to addressing these topics in lecture courses and integrating them in the student laboratory and the clinical practicum for direct application.

  9. Creation and Validation of Sintered PTFE BRDF Targets & Standards

    PubMed Central

    Durell, Christopher; Scharpf, Dan; McKee, Greg; L’Heureux, Michelle; Georgiev, Georgi; Obein, Gael; Cooksey, Catherine

    2016-01-01

    Sintered polytetrafluoroethylene (PTFE) is an extremely stable, near-perfect Lambertian reflecting diffuser and calibration standard material that has been used by national labs and the space, aerospace and commercial sectors for over two decades. New uncertainty targets of 2 % on-orbit absolute validation in the Earth Observing Systems community have challenged the industry to improve its characterization and knowledge of almost every aspect of radiometric performance (space and ground). Assuming “near perfect” reflectance for angular-dependent measurements is no longer going to suffice for many program needs. The total hemispherical spectral reflectance provides a good mark of general performance; but, without the angular characterization of bidirectional reflectance distribution function (BRDF) measurements, critical data is missing from many applications and uncertainty budgets. Therefore, traceable BRDF measurement capability is needed to characterize sintered PTFE’s angular response and provide a full uncertainty profile to users. This paper presents preliminary comparison measurements of the BRDF of sintered PTFE from several laboratories to better quantify the BRDF of sintered PTFE, assess the BRDF measurement comparability between laboratories, and improve estimates of measurement uncertainties under laboratory conditions. PMID:26900206

  10. Creation and Validation of Sintered PTFE BRDF Targets & Standards.

    PubMed

    Durell, Christopher; Scharpf, Dan; McKee, Greg; L'Heureux, Michelle; Georgiev, Georgi; Obein, Gael; Cooksey, Catherine

    2015-09-21

    Sintered polytetrafluoroethylene (PTFE) is an extremely stable, near-perfect Lambertian reflecting diffuser and calibration standard material that has been used by national labs and the space, aerospace and commercial sectors for over two decades. New uncertainty targets of 2 % on-orbit absolute validation in the Earth Observing Systems community have challenged the industry to improve its characterization and knowledge of almost every aspect of radiometric performance (space and ground). Assuming "near perfect" reflectance for angular-dependent measurements is no longer going to suffice for many program needs. The total hemispherical spectral reflectance provides a good mark of general performance; but, without the angular characterization of bidirectional reflectance distribution function (BRDF) measurements, critical data is missing from many applications and uncertainty budgets. Therefore, traceable BRDF measurement capability is needed to characterize sintered PTFE's angular response and provide a full uncertainty profile to users. This paper presents preliminary comparison measurements of the BRDF of sintered PTFE from several laboratories to better quantify the BRDF of sintered PTFE, assess the BRDF measurement comparability between laboratories, and improve estimates of measurement uncertainties under laboratory conditions.
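
The "near-perfect Lambertian" baseline these comparisons measure against has a simple closed form: an ideal Lambertian diffuser of hemispherical reflectance ρ has a constant BRDF of ρ/π per steradian, independent of viewing angle. A quick numeric illustration (the 99 % reflectance figure is a commonly quoted ballpark for sintered PTFE, not a value from this paper):

```python
# BRDF of an ideal Lambertian reflector: constant at rho / pi (sr^-1).
# Real sintered PTFE deviates from this ideal, which is exactly what the
# inter-laboratory BRDF comparison above is designed to quantify.

import math

def lambertian_brdf(rho: float) -> float:
    """BRDF (sr^-1) of an ideal Lambertian reflector with reflectance rho."""
    return rho / math.pi

# Sintered PTFE is often quoted near 99% hemispherical reflectance.
brdf = lambertian_brdf(0.99)
```
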

  11. Review of validation and reporting of non-targeted fingerprinting approaches for food authentication.

    PubMed

    Riedl, Janet; Esslinger, Susanne; Fauhl-Hassek, Carsten

    2015-07-23

    Food fingerprinting approaches are expected to become a very potent tool in authentication processes aiming at a comprehensive characterization of complex food matrices. By non-targeted spectrometric or spectroscopic chemical analysis with a subsequent (multivariate) statistical evaluation of acquired data, food matrices can be investigated in terms of their geographical origin, species variety or possible adulterations. Although many successful research projects have already demonstrated the feasibility of non-targeted fingerprinting approaches, their uptake and implementation into routine analysis and food surveillance is still limited. In many proof-of-principle studies, the prediction ability of only one data set was explored, measured within a limited period of time using one instrument within one laboratory. Thorough validation strategies that guarantee the reliability of the underlying data and allow conclusions on whether an approach is fit for purpose have not yet been proposed. Within this review, critical steps of the fingerprinting workflow were explored to develop a generic scheme for multivariate model validation. As a result, a proposed scheme for "good practice" shall guide users through validation and reporting of non-targeted fingerprinting results. Furthermore, food fingerprinting studies were selected by a systematic search approach and reviewed with regard to (a) transparency of data processing and (b) validity of study results. Subsequently, the studies were inspected for measures of statistical model validation, analytical method validation and quality assurance measures. In this context, issues and recommendations were found that might be considered as an actual starting point for developing validation standards of non-targeted metabolomics approaches for food authentication in the future. Hence, this review intends to contribute to the harmonization and standardization of food fingerprinting.

  12. Analytical difficulties facing today's regulatory laboratories: issues in method validation.

    PubMed

    MacNeil, James D

    2012-08-01

    The challenges facing analytical laboratories today are not unlike those faced in the past, although both the degree of complexity and the rate of change have increased. Challenges such as development and maintenance of expertise, maintenance and updating of equipment, and the introduction of new test methods have always been familiar themes for analytical laboratories, but international guidelines for laboratories involved in the import and export testing of food require management of such changes in a context which includes quality assurance, accreditation, and method validation considerations. Decisions as to when a change in a method requires re-validation of the method, or on the design of a validation scheme for a complex multi-residue method, require a well-considered strategy, based on a current knowledge of international guidance documents and regulatory requirements, as well as the laboratory's quality system requirements. Validation demonstrates that a method is 'fit for purpose', so the requirement for validation should be assessed in terms of the intended use of a method and, in the case of change or modification of a method, whether that change or modification may affect a previously validated performance characteristic. In general, method validation involves method scope, calibration-related parameters, method precision, and recovery. Any method change which may affect method scope or any performance parameters will require re-validation. Some typical situations involving change in methods are discussed and a decision process proposed for selection of appropriate validation measures. © 2012 John Wiley & Sons, Ltd.
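
The decision logic described above can be sketched as a conservative rule: a change is exempt from re-validation only when it is known not to touch method scope or any validated performance characteristic. The change categories below are illustrative inventions, not drawn from the paper or from a specific guideline.

```python
# Conservative re-validation trigger, in the spirit of the decision process
# described above. ADMINISTRATIVE_CHANGES lists hypothetical change types
# assumed not to affect scope, calibration, precision, or recovery.

ADMINISTRATIVE_CHANGES = {
    "editorial correction",
    "report format update",
    "equivalent reagent lot",
}

def needs_revalidation(change: str) -> bool:
    """Anything not known to be purely administrative is treated as
    potentially affecting a validated performance characteristic."""
    return change not in ADMINISTRATIVE_CHANGES
```
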

  13. Validating models of target acquisition performance in the dismounted soldier context

    NASA Astrophysics Data System (ADS)

    Glaholt, Mackenzie G.; Wong, Rachel K.; Hollands, Justin G.

    2018-04-01

    The problem of predicting real-world operator performance with digital imaging devices is of great interest within the military and commercial domains. There are several approaches to this problem, including: field trials with imaging devices, laboratory experiments using imagery captured from these devices, and models that predict human performance based on imaging device parameters. The modeling approach is desirable, as both field trials and laboratory experiments are costly and time-consuming. However, the data from these experiments are required for model validation. Here we considered this problem in the context of dismounted soldiering, for which detection and identification of human targets are essential tasks. Human performance data were obtained for two-alternative detection and identification decisions in a laboratory experiment in which photographs of human targets were presented on a computer monitor and the images were digitally magnified to simulate range-to-target. We then compared the predictions of different performance models within the NV-IPM software package: the Targeting Task Performance (TTP) metric model and the Johnson model. We also introduced a modification to the TTP metric computation that incorporates an additional correction for target angular size. We examined model predictions using NV-IPM default values for a critical model constant, V50, and we also considered predictions when this value was optimized to fit the behavioral data. When using default values, certain model versions produced a reasonably close fit to the human performance data in the detection task, while for the identification task all models substantially overestimated performance. When using fitted V50 values the models produced improved predictions, though the slopes of the performance functions were still shallow compared to the behavioral data. These findings are discussed in relation to the models' designs and parameters, and the characteristics of the behavioral data.
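
The V50 constant mentioned above enters through the Target Transfer Probability Function, which maps resolvable cycles on target, V, to a task-completion probability. The form below is the commonly published TTPF; the empirical exponent expression is quoted from the open TTP-metric literature and should be checked against the NV-IPM documentation before reuse.

```python
# Target Transfer Probability Function (TTPF), as commonly published for the
# TTP metric: P(V) = (V/V50)^E / (1 + (V/V50)^E), with an empirical exponent
# E that grows slowly with V/V50. V50 is the cycle criterion for 50%
# probability and is task- and model-dependent.

def ttp_probability(V: float, V50: float) -> float:
    """Probability of task completion given V resolvable cycles on target."""
    ratio = V / V50
    E = 1.51 + 0.24 * ratio   # empirical exponent from the TTPF literature
    return ratio**E / (1.0 + ratio**E)
```

By construction the function passes through 0.5 at V = V50 and rises monotonically with V, which is why fitting V50 to behavioral data shifts the whole performance curve along the range axis.
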

  14. Valid methods: the quality assurance of test method development, validation, approval, and transfer for veterinary testing laboratories.

    PubMed

    Wiegers, Ann L

    2003-07-01

    Third-party accreditation is a valuable tool to demonstrate a laboratory's competence to conduct testing. Accreditation, internationally and in the United States, has been discussed previously. However, accreditation is only one part of establishing data credibility. A validated test method is the first component of a valid measurement system. Validation is defined as confirmation by examination and the provision of objective evidence that the particular requirements for a specific intended use are fulfilled. The international and national standard ISO/IEC 17025 recognizes the importance of validated methods and requires that laboratory-developed methods or methods adopted by the laboratory be appropriate for the intended use. Validated methods are therefore required and their use agreed to by the client (i.e., end users of the test results such as veterinarians, animal health programs, and owners). ISO/IEC 17025 also requires that the introduction of methods developed by the laboratory for its own use be a planned activity conducted by qualified personnel with adequate resources. This article discusses considerations and recommendations for the conduct of veterinary diagnostic test method development, validation, evaluation, approval, and transfer to the user laboratory in the ISO/IEC 17025 environment. These recommendations are based on those of nationally and internationally accepted standards and guidelines, as well as those of reputable and experienced technical bodies. They are also based on the author's experience in the evaluation of method development and transfer projects, validation data, and the implementation of quality management systems in the area of method development.

  15. Validation of a laboratory and hospital information system in a medical laboratory accredited according to ISO 15189.

    PubMed

    Biljak, Vanja Radisic; Ozvald, Ivan; Radeljak, Andrea; Majdenic, Kresimir; Lasic, Branka; Siftar, Zoran; Lovrencic, Marijana Vucic; Flegar-Mestric, Zlata

    2012-01-01

    The aim of the study was to present a protocol for laboratory information system (LIS) and hospital information system (HIS) validation at the Institute of Clinical Chemistry and Laboratory Medicine of the Merkur University Hospital, Zagreb, Croatia. Validity of data traceability was checked by entering all test requests for a virtual patient into the HIS/LIS and printing the corresponding barcoded labels that provided laboratory analyzers with the information on requested tests. The original printouts of the test results from the laboratory analyzer(s) were compared with the data obtained from the LIS and entered into the provided template. Transfer of data from LIS to HIS was examined by requesting all tests in the HIS and creating real data in a finding generated in the LIS. Data obtained from the LIS and HIS were entered into a corresponding template. The main outcome measure was the accuracy of data transfer from the laboratory analyzers to the LIS and from the LIS to the HIS, expressed as a percentage (%). The accuracy of data transfer from laboratory analyzers to the LIS was 99.5% and that from the LIS to the HIS 100%. We presented our established validation protocol for a laboratory information system and demonstrated that the system meets its intended purpose.
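
The transfer-accuracy figure reported above comes down to a field-by-field comparison between what the analyzer printed and what the downstream system stored. A minimal sketch, with invented test codes and results:

```python
# Sketch of the transfer-accuracy check described above: analyzer results
# are compared field-by-field with the values the LIS stored, and accuracy
# is reported as a percentage. The records below are illustrative.

def transfer_accuracy(source_records: dict, target_records: dict) -> float:
    """Percentage of result fields transferred without alteration."""
    matched = sum(1 for key, value in source_records.items()
                  if target_records.get(key) == value)
    return 100.0 * matched / len(source_records)

analyzer = {"GLU": "5.4", "HBA1C": "6.1", "CREA": "77", "K": "4.2"}
lis = {"GLU": "5.4", "HBA1C": "6.1", "CREA": "7.7", "K": "4.2"}  # one corrupted field
accuracy = transfer_accuracy(analyzer, lis)
```
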

  16. Open Targets: a platform for therapeutic target identification and validation

    PubMed Central

    Koscielny, Gautier; An, Peter; Carvalho-Silva, Denise; Cham, Jennifer A.; Fumis, Luca; Gasparyan, Rippa; Hasan, Samiul; Karamanis, Nikiforos; Maguire, Michael; Papa, Eliseo; Pierleoni, Andrea; Pignatelli, Miguel; Platt, Theo; Rowland, Francis; Wankar, Priyanka; Bento, A. Patrícia; Burdett, Tony; Fabregat, Antonio; Forbes, Simon; Gaulton, Anna; Gonzalez, Cristina Yenyxe; Hermjakob, Henning; Hersey, Anne; Jupe, Steven; Kafkas, Şenay; Keays, Maria; Leroy, Catherine; Lopez, Francisco-Javier; Magarinos, Maria Paula; Malone, James; McEntyre, Johanna; Munoz-Pomer Fuentes, Alfonso; O'Donovan, Claire; Papatheodorou, Irene; Parkinson, Helen; Palka, Barbara; Paschall, Justin; Petryszak, Robert; Pratanwanich, Naruemon; Sarntivijal, Sirarat; Saunders, Gary; Sidiropoulos, Konstantinos; Smith, Thomas; Sondka, Zbyslaw; Stegle, Oliver; Tang, Y. Amy; Turner, Edward; Vaughan, Brendan; Vrousgou, Olga; Watkins, Xavier; Martin, Maria-Jesus; Sanseau, Philippe; Vamathevan, Jessica; Birney, Ewan; Barrett, Jeffrey; Dunham, Ian

    2017-01-01

    We have designed and developed a data integration and visualization platform that provides evidence about the association of known and potential drug targets with diseases. The platform is designed to support identification and prioritization of biological targets for follow-up. Each drug target is linked to a disease using integrated genome-wide data from a broad range of data sources. The platform provides either a target-centric workflow to identify diseases that may be associated with a specific target, or a disease-centric workflow to identify targets that may be associated with a specific disease. Users can easily transition between these target- and disease-centric workflows. The Open Targets Validation Platform is accessible at https://www.targetvalidation.org. PMID:27899665

  17. ANITA (Advanced Network for Isotope and TArget laboratories) - The urgent need for a European target preparation network

    NASA Astrophysics Data System (ADS)

    Schumann, Dorothea; Sibbens, Goedele; Stolarz, Anna; Eberhardt, Klaus; Lommel, Bettina; Stodel, Christelle

    2018-05-01

    Many research fields in the nuclear sector require high-quality and well-characterized samples and targets. Currently, only a few laboratories own, or have access to, the equipment needed to fulfil such demands. Coordination of activities and sharing of resources are therefore mandatory to meet the increasing needs. This very urgent issue has now been addressed by six European target laboratories with an initiative called ANITA (Advanced Network for Isotope and TArget laboratories). The global aim of ANITA is to establish an overarching research infrastructure service for isotope and target production and to develop a tight cooperation between the target laboratories in Europe, in order to transfer knowledge and improve the production techniques of well-characterized samples and targets. Moreover, the interaction of the target producers with the users shall be encouraged and intensified to deliver tailor-made targets best suited to the envisaged experiments. For the realization of this ambitious goal, efforts within the European Commission and strong support by the target-using communities will be necessary. In particular, an appropriate funding instrument has to be found and applied, enabling ANITA to develop from an initiative run by the interested parties into a real coordination platform.

  18. Validation of a laboratory and hospital information system in a medical laboratory accredited according to ISO 15189

    PubMed Central

    Biljak, Vanja Radisic; Ozvald, Ivan; Radeljak, Andrea; Majdenic, Kresimir; Lasic, Branka; Siftar, Zoran; Lovrencic, Marijana Vucic; Flegar-Mestric, Zlata

    2012-01-01

    Introduction: The aim of the study was to present a protocol for laboratory information system (LIS) and hospital information system (HIS) validation at the Institute of Clinical Chemistry and Laboratory Medicine of the Merkur University Hospital, Zagreb, Croatia. Materials and methods: Validity of data traceability was checked by entering all test requests for a virtual patient into the HIS/LIS and printing the corresponding barcoded labels that provided laboratory analyzers with the information on requested tests. The original printouts of the test results from the laboratory analyzer(s) were compared with the data obtained from the LIS and entered into the provided template. Transfer of data from LIS to HIS was examined by requesting all tests in the HIS and creating real data in a finding generated in the LIS. Data obtained from the LIS and HIS were entered into a corresponding template. The main outcome measure was the accuracy of data transfer from the laboratory analyzers to the LIS and from the LIS to the HIS, expressed as a percentage (%). Results: The accuracy of data transfer from laboratory analyzers to the LIS was 99.5% and that from the LIS to the HIS 100%. Conclusion: We presented our established validation protocol for a laboratory information system and demonstrated that the system meets its intended purpose. PMID:22384522

  19. Extrapolating non-target risk of Bt crops from laboratory to field.

    PubMed

    Duan, Jian J; Lundgren, Jonathan G; Naranjo, Steve; Marvier, Michelle

    2010-02-23

    The tiered approach to assessing ecological risk of insect-resistant transgenic crops assumes that lower tier laboratory studies, which expose surrogate non-target organisms to high doses of insecticidal proteins, can detect harmful effects that might be manifested in the field. To test this assumption, we performed meta-analyses comparing results for non-target invertebrates exposed to Bacillus thuringiensis (Bt) Cry proteins in laboratory studies with results derived from independent field studies examining effects on the abundance of non-target invertebrates. For Lepidopteran-active Cry proteins, laboratory studies correctly predicted the reduced field abundance of non-target Lepidoptera. However, laboratory studies incorporating tri-trophic interactions of Bt plants, herbivores and parasitoids were better correlated with the decreased field abundance of parasitoids than were direct-exposure assays. For predators, laboratory tri-trophic studies predicted reduced abundances that were not realized in field studies and thus overestimated ecological risk. Exposure to Coleopteran-active Cry proteins did not significantly reduce the laboratory survival or field abundance of any functional group examined. Our findings support the assumption that laboratory studies of transgenic insecticidal crops show effects that are either consistent with, or more conservative than, those found in field studies, with the important caveat that laboratory studies should explore all ecologically relevant routes of exposure.
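
Meta-analyses of this kind typically combine laboratory and field studies through a standardized mean difference between Bt treatment and non-Bt control groups. A minimal sketch using Hedges' d (the bias-corrected standardized mean difference); the abundance figures are hypothetical, and whether this exact estimator matches the one used in the paper is an assumption:

```python
# Hedges' d: bias-corrected standardized mean difference between a Bt
# treatment group and a non-Bt control group, a standard effect-size
# measure in ecological meta-analysis. Input values are illustrative.

import math

def hedges_d(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    """Standardized mean difference with small-sample bias correction."""
    df = n_t + n_c - 2
    s_pooled = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / df)
    j = 1.0 - 3.0 / (4.0 * df - 1.0)   # small-sample correction factor
    return j * (mean_t - mean_c) / s_pooled

# Hypothetical counts: parasitoids slightly less abundant on Bt plots.
d = hedges_d(mean_t=8.0, sd_t=2.0, n_t=10, mean_c=10.0, sd_c=2.0, n_c=10)
```

A negative d indicates lower abundance in the Bt treatment; comparing laboratory-derived and field-derived effect sizes of this form is what lets the authors test whether laboratory assays over- or under-estimate field risk.
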

  20. Validated MicroRNA Target Databases: An Evaluation.

    PubMed

    Lee, Yun Ji Diana; Kim, Veronica; Muth, Dillon C; Witwer, Kenneth W

    2015-11-01

    Positive findings from preclinical and clinical studies involving depletion or supplementation of microRNA (miRNA) engender optimism about miRNA-based therapeutics. However, off-target effects must be considered, and predicting them is complicated. Each miRNA may target many gene transcripts, and the rules governing imperfectly complementary miRNA:target interactions are incompletely understood. Several databases provide lists of the relatively small number of experimentally confirmed miRNA:target pairs. Although incomplete, this information might allow assessment of at least some of the off-target effects. We evaluated the performance of four databases of experimentally validated miRNA:target interactions (miRWalk 2.0, miRTarBase, miRecords, and TarBase 7.0) using a list of 50 alphabetically consecutive genes. We examined the provided citations to determine the degree to which each interaction was experimentally supported. To assess stability, we tested at the beginning and end of a five-month period. Results varied widely by database, and two of the databases changed significantly over the course of 5 months. Most reported evidence for miRNA:target interactions was indirect or otherwise weak, and relatively few interactions were supported by more than one publication. Some returned results appear to arise from simplistic text searches that offer no insight into the relationship of the search terms, may not even include the reported gene or miRNA, and may thus be invalid. We conclude that validation databases provide important information, but not all information in all extant databases is up-to-date or accurate. Nevertheless, the more comprehensive validation databases may provide useful starting points for investigation of off-target effects of proposed small RNA therapies. © 2015 Wiley Periodicals, Inc.
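
The stability test described above amounts to querying the same gene list at two time points and measuring how much of the reported interaction set changed. A minimal sketch; the miRNA:target pairs below are invented for illustration:

```python
# Sketch of a database-stability check: compare the sets of miRNA:target
# pairs returned for the same query at two time points. The pairs shown
# are illustrative, not actual database contents.

def stability(snapshot_early: set, snapshot_late: set) -> float:
    """Jaccard-style overlap between two snapshots of query results."""
    retained = snapshot_early & snapshot_late
    union = snapshot_early | snapshot_late
    return len(retained) / len(union)

march = {("miR-21", "PTEN"), ("miR-155", "SOCS1"), ("miR-34a", "BCL2")}
august = {("miR-21", "PTEN"), ("miR-155", "SOCS1"), ("let-7a", "HMGA2")}
overlap = stability(march, august)
```
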

  21. Analytic Validation of Immunohistochemistry Assays: New Benchmark Data From a Survey of 1085 Laboratories.

    PubMed

    Stuart, Lauren N; Volmar, Keith E; Nowak, Jan A; Fatheree, Lisa A; Souers, Rhona J; Fitzgibbons, Patrick L; Goldsmith, Jeffrey D; Astles, J Rex; Nakhleh, Raouf E

    2017-09-01

    A cooperative agreement between the College of American Pathologists (CAP) and the United States Centers for Disease Control and Prevention was undertaken to measure laboratories' awareness and implementation of an evidence-based laboratory practice guideline (LPG) on immunohistochemical (IHC) validation practices published in 2014. The aim was to establish new benchmark data on IHC laboratory practices. A 2015 survey on IHC assay validation practices was sent to laboratories subscribed to specific CAP proficiency testing programs and to additional nonsubscribing laboratories that perform IHC testing; specific questions were designed to capture laboratory practices not addressed in a 2010 survey. The analysis was based on responses from 1085 laboratories that perform IHC staining. Ninety-six percent (809 of 844) always documented validation of IHC assays. Sixty percent (648 of 1078) had separate procedures for predictive and nonpredictive markers, 42.7% (220 of 515) had procedures for laboratory-developed tests, 50% (349 of 697) had procedures for testing cytologic specimens, and 46.2% (363 of 785) had procedures for testing decalcified specimens. Minimum case numbers were specified by 85.9% (720 of 838) of laboratories for nonpredictive markers and 76% (584 of 768) for predictive markers. Median concordance requirements were 95% for both types. For initial validation, 75.4% (538 of 714) of laboratories adopted the 20-case minimum for nonpredictive markers and 45.9% (266 of 579) adopted the 40-case minimum for predictive markers as outlined in the 2014 LPG. The most common method for validation was correlation with morphology and expected results. Laboratories also reported which assay changes necessitated revalidation and their minimum case requirements. Benchmark data on current IHC validation practices and procedures may help laboratories understand the issues and influence further refinement of LPG recommendations.
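
The case minimums and concordance threshold quoted above combine into a simple pass/fail check for an initial validation run. A sketch using the 2014 LPG figures from the abstract (20 cases nonpredictive, 40 cases predictive, 95% concordance); the case counts in the example are invented:

```python
# Pass/fail check for an initial IHC assay validation, using the 2014 LPG
# figures quoted in the abstract: 20-case minimum for nonpredictive markers,
# 40-case minimum for predictive markers, 95% concordance for both.

def validation_passes(concordant: int, total: int, predictive_marker: bool) -> bool:
    """True if the case set meets the size minimum and concordance threshold."""
    min_cases = 40 if predictive_marker else 20
    if total < min_cases:
        return False
    return concordant / total >= 0.95

# Hypothetical runs: a nonpredictive marker squeaking past at 19/20,
# and a predictive marker failing on case count despite perfect concordance.
ok_nonpredictive = validation_passes(concordant=19, total=20, predictive_marker=False)
fail_predictive = validation_passes(concordant=30, total=30, predictive_marker=True)
```
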

  22. Certification & validation of biosafety level-2 & biosafety level-3 laboratories in Indian settings & common issues.

    PubMed

    Mourya, Devendra T; Yadav, Pragya D; Khare, Ajay; Khan, Anwar H

    2017-10-01

    With increasing awareness regarding biorisk management worldwide, many biosafety laboratories are being set up in India. It is important for the facility users, project managers and the executing agencies to understand the process of validation and certification of such biosafety laboratories. There are some international guidelines available, but there are no national guidelines or reference standards available in India on certification and validation of biosafety laboratories. There is no accredited government/private agency available in India to undertake validation and certification of biosafety laboratories. Therefore, the reliance is mostly on indigenous experience, talent and expertise, which is in short supply. This article elucidates the process of certification and validation of biosafety laboratories in a concise manner for the understanding of the concerned users and suggests the important parameters and criteria that should be considered and addressed during the laboratory certification and validation process.

  23. 49 CFR 40.89 - What is validity testing, and are laboratories required to conduct it?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... PROCEDURES FOR TRANSPORTATION WORKPLACE DRUG AND ALCOHOL TESTING PROGRAMS Drug Testing Laboratories § 40.89 What is validity testing, and are laboratories required to conduct it? (a) Specimen validity testing is... 49 Transportation 1 2013-10-01 2013-10-01 false What is validity testing, and are laboratories...

  24. 49 CFR 40.89 - What is validity testing, and are laboratories required to conduct it?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... PROCEDURES FOR TRANSPORTATION WORKPLACE DRUG AND ALCOHOL TESTING PROGRAMS Drug Testing Laboratories § 40.89 What is validity testing, and are laboratories required to conduct it? (a) Specimen validity testing is... 49 Transportation 1 2011-10-01 2011-10-01 false What is validity testing, and are laboratories...

  5. 49 CFR 40.89 - What is validity testing, and are laboratories required to conduct it?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... PROCEDURES FOR TRANSPORTATION WORKPLACE DRUG AND ALCOHOL TESTING PROGRAMS Drug Testing Laboratories § 40.89 What is validity testing, and are laboratories required to conduct it? (a) Specimen validity testing is... 49 Transportation 1 2010-10-01 2010-10-01 false What is validity testing, and are laboratories...

  6. 49 CFR 40.89 - What is validity testing, and are laboratories required to conduct it?

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... PROCEDURES FOR TRANSPORTATION WORKPLACE DRUG AND ALCOHOL TESTING PROGRAMS Drug Testing Laboratories § 40.89 What is validity testing, and are laboratories required to conduct it? (a) Specimen validity testing is... 49 Transportation 1 2012-10-01 2012-10-01 false What is validity testing, and are laboratories...

  7. 49 CFR 40.89 - What is validity testing, and are laboratories required to conduct it?

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... PROCEDURES FOR TRANSPORTATION WORKPLACE DRUG AND ALCOHOL TESTING PROGRAMS Drug Testing Laboratories § 40.89 What is validity testing, and are laboratories required to conduct it? (a) Specimen validity testing is... 49 Transportation 1 2014-10-01 2014-10-01 false What is validity testing, and are laboratories...

  8. Software validation applied to spreadsheets used in laboratories working under ISO/IEC 17025

    NASA Astrophysics Data System (ADS)

    Banegas, J. M.; Orué, M. W.

    2016-07-01

Several documents deal with software validation. Nevertheless, most are too complex to be applied to validating spreadsheets, surely the most widely used software in laboratories working under ISO/IEC 17025. The method proposed in this work is intended to be applied directly to spreadsheet validation. It includes a systematic way to document requirements, operational aspects of validation, and a simple method to keep records of validation results and modification history. This method is currently in use in an accredited calibration laboratory, where it has proved practical and efficient.
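The core of the approach described above is checking spreadsheet output against an independently computed reference value and recording the result. The following is a minimal, hypothetical sketch of such a check in Python (the paper concerns spreadsheets, not Python; the requirement ID, tolerance and the reference formula for the standard uncertainty of the mean are assumptions for illustration):

```python
import math

def reference_uncertainty(readings):
    # Independent re-implementation of the spreadsheet formula under test:
    # standard uncertainty of the mean, u = s / sqrt(n)
    n = len(readings)
    mean = sum(readings) / n
    s = math.sqrt(sum((x - mean) ** 2 for x in readings) / (n - 1))
    return s / math.sqrt(n)

def validate_case(case_id, readings, spreadsheet_result, tol=1e-9):
    # Compare the value the spreadsheet produced against the reference
    # implementation and return a record suitable for the validation log.
    expected = reference_uncertainty(readings)
    passed = abs(expected - spreadsheet_result) <= tol
    return {"case": case_id, "expected": expected,
            "observed": spreadsheet_result, "pass": passed}

# Hypothetical test case: raw readings typed into the spreadsheet and the
# result read back from its output cell, rounded to four decimals.
record = validate_case("REQ-07", [10.1, 10.3, 9.9, 10.2], 0.0854, tol=1e-3)
```

Keeping the returned records (one per documented requirement) directly provides the modification-history evidence that accreditation audits ask for.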

  9. Certification & validation of biosafety level-2 & biosafety level-3 laboratories in Indian settings & common issues

    PubMed Central

    Mourya, Devendra T.; Yadav, Pragya D.; Khare, Ajay; Khan, Anwar H.

    2017-01-01

With increasing awareness regarding biorisk management worldwide, many biosafety laboratories are being set up in India. It is important for the facility users, project managers and the executing agencies to understand the process of validation and certification of such biosafety laboratories. Some international guidelines are available, but there are no national guidelines or reference standards in India on the certification and validation of biosafety laboratories, nor any accredited government or private agency to undertake such validation and certification. The reliance is therefore mostly on the indigenous experience, talent and expertise available, which are in short supply. This article concisely elucidates the process of certification and validation of biosafety laboratories for the understanding of the concerned users and suggests the important parameters and criteria that should be considered and addressed during the laboratory certification and validation process. PMID:29434059

  10. Validation approach for a fast and simple targeted screening method for 75 antibiotics in meat and aquaculture products using LC-MS/MS.

    PubMed

    Dubreil, Estelle; Gautier, Sophie; Fourmond, Marie-Pierre; Bessiral, Mélaine; Gaugain, Murielle; Verdon, Eric; Pessel, Dominique

    2017-04-01

    An approach is described to validate a fast and simple targeted screening method for antibiotic analysis in meat and aquaculture products by LC-MS/MS. The strategy of validation was applied for a panel of 75 antibiotics belonging to different families, i.e., penicillins, cephalosporins, sulfonamides, macrolides, quinolones and phenicols. The samples were extracted once with acetonitrile, concentrated by evaporation and injected into the LC-MS/MS system. The approach chosen for the validation was based on the Community Reference Laboratory (CRL) guidelines for the validation of screening qualitative methods. The aim of the validation was to prove sufficient sensitivity of the method to detect all the targeted antibiotics at the level of interest, generally the maximum residue limit (MRL). A robustness study was also performed to test the influence of different factors. The validation showed that the method is valid to detect and identify 73 antibiotics of the 75 antibiotics studied in meat and aquaculture products at the validation levels.

  11. Drug Target Validation Methods in Malaria - Protein Interference Assay (PIA) as a Tool for Highly Specific Drug Target Validation.

    PubMed

    Meissner, Kamila A; Lunev, Sergey; Wang, Yuan-Ze; Linzke, Marleen; de Assis Batista, Fernando; Wrenger, Carsten; Groves, Matthew R

    2017-01-01

The validation of drug targets in malaria and other human diseases remains a highly difficult and laborious process. In the vast majority of cases, highly specific small-molecule tools to inhibit a protein's function in vivo are simply not available. Additionally, the use of genetic tools in the analysis of malarial pathways is challenging. These issues result in difficulties in specifically modulating a hypothetical drug target's function in vivo. The current "toolbox" of methods and techniques to identify a protein's function in vivo remains very limited, and there is a pressing need for expansion. New approaches are urgently required to support target validation in the drug discovery process. Oligomerisation is the natural assembly of multiple copies of a single protein into one object, and this self-assembly is present in more than half of all protein structures. Thus, oligomerisation plays a central role in the generation of functional biomolecules. A key feature of oligomerisation is that the oligomeric interfaces between the individual parts of the final assembly are highly specific. However, these interfaces have not yet been systematically explored or exploited to dissect biochemical pathways in vivo. This mini-review describes the current state of the antimalarial toolset as well as the potentially druggable malarial pathways, with a specific focus on initial efforts to exploit oligomerisation surfaces in drug target validation. As an alternative to the conventional methods, the Protein Interference Assay (PIA) can be used for specific distortion of the target protein's function and for pathway assessment in vivo. Copyright © Bentham Science Publishers; for any queries, please email epub@benthamscience.org.

  12. A Novel Target Synthesis Laboratory for Students

    NASA Astrophysics Data System (ADS)

    Smales, C. Mark; Harding, David R. K.

    1999-11-01

    A third-year specialist course in drug design and delivery focused on a single laboratory goal for all students. A tetrapeptide, destined as the signal component of a drug delivery system, was chosen for this target synthesis. The practical, real-life aspect of the course, and the target synthesis in particular, was a major component of the appeal to the students. Students were given a synthetic scheme based on standard peptide synthesis protocols, and several lectures provided background for the general approach. They were then encouraged to design each step of the synthesis themselves, with reference to the literature and course work. As long as due diligence was shown in attempts to achieve success at each step, no student was penalized for losses, low yields, or other lack of progress. Reports on all procedures used were prepared in a journal format chosen by the student and were collected at the end of the course. The target-synthesis approach was appreciated by the students and enjoyed by the staff. We believe the students left the course with a greater appreciation for laboratory research. It takes more work to set up and run this type of course than the traditional follow-the-recipe course, but in our experience it was worth the extra effort.

  13. Computer validation in toxicology: historical review for FDA and EPA good laboratory practice.

    PubMed

    Brodish, D L

    1998-01-01

    The application of computer validation principles to Good Laboratory Practice is a fairly recent phenomenon. As automated data collection systems have become more common in toxicology facilities, the U.S. Food and Drug Administration and the U.S. Environmental Protection Agency have begun to focus inspections in this area. This historical review documents the development of regulatory guidance on computer validation in toxicology over the past several decades. An overview of the components of a computer life cycle is presented, including the development of systems descriptions, validation plans, validation testing, system maintenance, SOPs, change control, security considerations, and system retirement. Examples are provided for implementation of computer validation principles on laboratory computer systems in a toxicology facility.

  14. Validation of Radiometric Standards for the Laboratory Calibration of Reflected-Solar Earth Observing Satellite Instruments

    NASA Technical Reports Server (NTRS)

    Butler, James J.; Johnson, B. Carol; Rice, Joseph P.; Brown, Steven W.; Barnes, Robert A.

    2007-01-01

    Historically, the traceability of the laboratory calibration of Earth-observing satellite instruments to a primary radiometric reference scale (SI units) is the responsibility of each instrument builder. For the NASA Earth Observing System (EOS), a program has been developed using laboratory transfer radiometers, each with its own traceability to the primary radiance scale of a national metrology laboratory, to independently validate the radiances assigned to the laboratory sources of the instrument builders. The EOS Project Science Office also developed a validation program for the measurement of onboard diffuse reflecting plaques, which are also used as radiometric standards for Earth-observing satellite instruments. Summarized results of these validation campaigns, with an emphasis on the current state-of-the-art uncertainties in laboratory radiometric standards, will be presented. Future mission uncertainty requirements, and possible enhancements to the EOS validation program to ensure that those uncertainties can be met, will be presented.

  15. Challenges in validating candidate therapeutic targets in cancer

    PubMed Central

    Sawyers, Charles L; Hunter, Tony

    2018-01-01

    More than 30 published articles have suggested that a protein kinase called MELK is an attractive therapeutic target in human cancer, but three recent reports describe compelling evidence that it is not. These reports highlight the caveats associated with some of the research tools that are commonly used to validate candidate therapeutic targets in cancer research. PMID:29417929

  16. Comprehensive GMO detection using real-time PCR array: single-laboratory validation.

    PubMed

    Mano, Junichi; Harada, Mioko; Takabatake, Reona; Furui, Satoshi; Kitta, Kazumi; Nakamura, Kosuke; Akiyama, Hiroshi; Teshima, Reiko; Noritake, Hiromichi; Hatano, Shuko; Futo, Satoshi; Minegishi, Yasutaka; Iizuka, Tayoshi

    2012-01-01

We have developed a real-time PCR array method to comprehensively detect genetically modified (GM) organisms. In the method, genomic DNA extracted from an agricultural product is analyzed using various qualitative real-time PCR assays on a 96-well PCR plate, targeting individual GM events, recombinant DNA (r-DNA) segments, taxon-specific DNAs, and the donor organisms of the respective r-DNAs. In this article, we report the single-laboratory validation of both the DNA extraction methods and the component PCR assays constituting the real-time PCR array. We selected DNA extraction methods for specified plant matrixes, i.e., maize flour, soybean flour, and ground canola seeds, then evaluated the DNA quantity, DNA fragmentation, and PCR inhibition of the resultant DNA extracts. For the component PCR assays, we evaluated the specificity and LOD. All DNA extraction methods and component PCR assays satisfied the criteria set on the basis of previous reports.
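For qualitative PCR assays such as those above, the LOD is commonly expressed as the lowest analyte level at which a required fraction of replicates (often 95%) is detected. A minimal sketch of that calculation, noting that the replicate data and the 95% criterion below are assumptions for illustration, not values from the study:

```python
def qualitative_lod(replicates_by_level, required_rate=0.95):
    # replicates_by_level: {level: list of boolean detection calls}.
    # Return the lowest level whose detection rate meets required_rate,
    # or None if no tested level qualifies.
    passing = []
    for level, calls in replicates_by_level.items():
        rate = sum(calls) / len(calls)
        if rate >= required_rate:
            passing.append(level)
    return min(passing) if passing else None

# Hypothetical dilution series (copies per reaction -> detection calls)
data = {
    100: [True] * 20,                  # 100% detected
    20:  [True] * 19 + [False],       # 95% detected
    5:   [True] * 12 + [False] * 8,   # 60% detected
}
lod = qualitative_lod(data)           # lowest level meeting 95%: 20
```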

  17. [Validation of a questionnaire to evaluate patient safety in clinical laboratories].

    PubMed

    Giménez Marín, Ángeles; Rivas-Ruiz, Francisco

    2012-01-01

The aim of this study was to prepare, pilot and validate a questionnaire to evaluate patient safety in the specific context of clinical laboratories. A specific questionnaire on patient safety in the laboratory, with 62 items grouped into six areas, was developed, taking into consideration the diverse human and laboratory contextual factors which may contribute to producing errors. A pilot study of 30 interviews was carried out, including validity and reliability analyses using principal components factor analysis and Cronbach's alpha. Subsequently, 240 questionnaires were sent to 21 hospitals, followed by a test-retest of 41 questionnaires with the definitive version. The sample analyzed was composed of 225 questionnaires (an overall response rate of 80%). Of the 62 items initially assessed, 17 were eliminated due to non-compliance with the criteria established before the principal components factor analysis was performed. For the 45 remaining items, 12 components were identified, with a cumulative variance of 69.5%. In seven of the 10 components with two or more items, Cronbach's alpha was higher than 0.7. The questionnaire items assessed in the test-retest were found to be stable. We present the first questionnaire with sufficiently proven validity and reliability for evaluating patient safety in the specific context of clinical laboratories. This questionnaire provides a useful instrument to perform a subsequent macrostudy of hospital clinical laboratories in Spain. The questionnaire can also be used to monitor and promote commitment to patient safety within the search for continuous quality improvement. Copyright © 2011 SESPAS. Published by Elsevier España. All rights reserved.
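The reliability analysis above relies on Cronbach's alpha, which for k items is alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). A minimal stdlib-only sketch (the example scores are invented for illustration):

```python
def cronbach_alpha(scores):
    # scores: list of respondent rows, each a list of k item scores.
    # Population variances are used throughout, as is conventional.
    n = len(scores)
    k = len(scores[0])

    def pvar(values):
        m = sum(values) / len(values)
        return sum((v - m) ** 2 for v in values) / len(values)

    item_vars = [pvar([row[i] for row in scores]) for i in range(k)]
    total_var = pvar([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Two perfectly proportional items give a high alpha (8/9 here)
alpha = cronbach_alpha([[1, 2], [2, 4], [3, 6]])
```

Values above roughly 0.7, as reported for most components in the study, are the usual threshold for acceptable internal consistency.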

  18. Validation and Implementation of Clinical Laboratory Improvements Act-Compliant Whole-Genome Sequencing in the Public Health Microbiology Laboratory

    PubMed Central

    Kozyreva, Varvara K.; Truong, Chau-Linda; Greninger, Alexander L.; Crandall, John; Mukhopadhyay, Rituparna

    2017-01-01

    ABSTRACT Public health microbiology laboratories (PHLs) are on the cusp of unprecedented improvements in pathogen identification, antibiotic resistance detection, and outbreak investigation by using whole-genome sequencing (WGS). However, considerable challenges remain due to the lack of common standards. Here, we describe the validation of WGS on the Illumina platform for routine use in PHLs according to Clinical Laboratory Improvements Act (CLIA) guidelines for laboratory-developed tests (LDTs). We developed a validation panel comprising 10 Enterobacteriaceae isolates, 5 Gram-positive cocci, 5 Gram-negative nonfermenting species, 9 Mycobacterium tuberculosis isolates, and 5 miscellaneous bacteria. The genome coverage range was 15.71× to 216.4× (average, 79.72×; median, 71.55×); the limit of detection (LOD) for single nucleotide polymorphisms (SNPs) was 60×. The accuracy, reproducibility, and repeatability of base calling were >99.9%. The accuracy of phylogenetic analysis was 100%. The specificity and sensitivity inferred from multilocus sequence typing (MLST) and genome-wide SNP-based phylogenetic assays were 100%. The following objectives were accomplished: (i) the establishment of the performance specifications for WGS applications in PHLs according to CLIA guidelines, (ii) the development of quality assurance and quality control measures, (iii) the development of a reporting format for end users with or without WGS expertise, (iv) the availability of a validation set of microorganisms, and (v) the creation of a modular template for the validation of WGS processes in PHLs. The validation panel, sequencing analytics, and raw sequences could facilitate multilaboratory comparisons of WGS data. Additionally, the WGS performance specifications and modular template are adaptable for the validation of other platforms and reagent kits. PMID:28592550

  19. Validation and Implementation of Clinical Laboratory Improvements Act-Compliant Whole-Genome Sequencing in the Public Health Microbiology Laboratory.

    PubMed

    Kozyreva, Varvara K; Truong, Chau-Linda; Greninger, Alexander L; Crandall, John; Mukhopadhyay, Rituparna; Chaturvedi, Vishnu

    2017-08-01

    Public health microbiology laboratories (PHLs) are on the cusp of unprecedented improvements in pathogen identification, antibiotic resistance detection, and outbreak investigation by using whole-genome sequencing (WGS). However, considerable challenges remain due to the lack of common standards. Here, we describe the validation of WGS on the Illumina platform for routine use in PHLs according to Clinical Laboratory Improvements Act (CLIA) guidelines for laboratory-developed tests (LDTs). We developed a validation panel comprising 10 Enterobacteriaceae isolates, 5 Gram-positive cocci, 5 Gram-negative nonfermenting species, 9 Mycobacterium tuberculosis isolates, and 5 miscellaneous bacteria. The genome coverage range was 15.71× to 216.4× (average, 79.72×; median, 71.55×); the limit of detection (LOD) for single nucleotide polymorphisms (SNPs) was 60×. The accuracy, reproducibility, and repeatability of base calling were >99.9%. The accuracy of phylogenetic analysis was 100%. The specificity and sensitivity inferred from multilocus sequence typing (MLST) and genome-wide SNP-based phylogenetic assays were 100%. The following objectives were accomplished: (i) the establishment of the performance specifications for WGS applications in PHLs according to CLIA guidelines, (ii) the development of quality assurance and quality control measures, (iii) the development of a reporting format for end users with or without WGS expertise, (iv) the availability of a validation set of microorganisms, and (v) the creation of a modular template for the validation of WGS processes in PHLs. The validation panel, sequencing analytics, and raw sequences could facilitate multilaboratory comparisons of WGS data. Additionally, the WGS performance specifications and modular template are adaptable for the validation of other platforms and reagent kits. Copyright © 2017 Kozyreva et al.
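The sensitivity and specificity figures for the SNP-based assays above come from comparing observed calls against a reference truth set. A minimal sketch of that comparison (the positions and calls are invented for illustration; the study's actual pipeline is not shown):

```python
def snp_concordance(truth, called, assayed_positions):
    # truth, called: sets of positions at which a SNP is present.
    # assayed_positions: all positions interrogated by the assay.
    # Returns (sensitivity, specificity).
    tp = len(truth & called)          # true positives
    fn = len(truth - called)          # missed variants
    fp = len(called - truth)          # spurious calls
    tn = len(assayed_positions - truth - called)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

positions = set(range(100))
truth = {3, 17, 42, 56, 88}
called = {3, 17, 42, 56, 88}          # perfect agreement
sens, spec = snp_concordance(truth, called, positions)
```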

  20. Performance Tested Method multiple laboratory validation study of ELISA-based assays for the detection of peanuts in food.

    PubMed

    Park, Douglas L; Coates, Scott; Brewer, Vickery A; Garber, Eric A E; Abouzied, Mohamed; Johnson, Kurt; Ritter, Bruce; McKenzie, Deborah

    2005-01-01

Performance Tested Method multiple laboratory validations for the detection of peanut protein in 4 different food matrixes were conducted under the auspices of the AOAC Research Institute. In this blind study, 3 commercially available ELISA test kits were validated: Neogen Veratox for Peanut, R-Biopharm RIDASCREEN FAST Peanut, and Tepnel BioKits for Peanut Assay. The food matrixes used were breakfast cereal, cookies, ice cream, and milk chocolate spiked at 0 and 5 ppm peanut. Analyses of the samples were conducted by laboratories representing industry and international and U.S. governmental agencies. All 3 commercial test kits successfully identified spiked and peanut-free samples. The validation study required 60 analyses on test samples at the target level of 5 µg peanut/g food and 60 analyses at a peanut-free level, which was designed to ensure that the lower 95% confidence limit for the sensitivity and specificity would not be <90%. The probability that a test sample contains an allergen given a prevalence rate of 5% and a positive test result using a single test kit analysis with 95% sensitivity and 95% specificity, which was demonstrated for these test kits, would be 50%. When 2 test kits are run simultaneously on all samples, the probability becomes 95%. It is therefore recommended that all field samples be analyzed with at least 2 of the validated kits.
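The probability figures quoted above follow directly from Bayes' theorem: with 5% prevalence and one kit at 95% sensitivity and 95% specificity the positive predictive value is 50%, and requiring two independent kits to both test positive raises it to 95%. A short check of that arithmetic:

```python
def positive_predictive_value(prevalence, sensitivity, specificity):
    # P(allergen present | positive result), by Bayes' theorem
    p_pos_given_present = sensitivity
    p_pos_given_absent = 1 - specificity
    num = p_pos_given_present * prevalence
    return num / (num + p_pos_given_absent * (1 - prevalence))

# Single kit: 95% sensitivity/specificity at 5% prevalence -> PPV = 0.50
ppv_one = positive_predictive_value(0.05, 0.95, 0.95)

# Two independent kits both positive: combined sensitivity 0.95**2 and
# combined false-positive rate (1 - 0.95)**2, i.e. specificity 0.9975
ppv_two = positive_predictive_value(0.05, 0.95 ** 2, 1 - 0.05 ** 2)
```

The assumption of independence between the two kits is what drives the improvement; correlated errors would yield a smaller gain.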

  1. Pre-trial inter-laboratory analytical validation of the FOCUS4 personalised therapy trial.

    PubMed

    Richman, Susan D; Adams, Richard; Quirke, Phil; Butler, Rachel; Hemmings, Gemma; Chambers, Phil; Roberts, Helen; James, Michelle D; Wozniak, Sue; Bathia, Riya; Pugh, Cheryl; Maughan, Timothy; Jasani, Bharat

    2016-01-01

    Molecular characterisation of tumours is increasing personalisation of cancer therapy, tailored to an individual and their cancer. FOCUS4 is a molecularly stratified clinical trial for patients with advanced colorectal cancer. During an initial 16-week period of standard first-line chemotherapy, tumour tissue will undergo several molecular assays, with the results used for cohort allocation, then randomisation. Laboratories in Leeds and Cardiff will perform the molecular testing. The results of a rigorous pre-trial inter-laboratory analytical validation are presented and discussed. Wales Cancer Bank supplied FFPE tumour blocks from 97 mCRC patients with consent for use in further research. Both laboratories processed each sample according to an agreed definitive FOCUS4 laboratory protocol, reporting results directly to the MRC Trial Management Group for independent cross-referencing. Pyrosequencing analysis of mutation status at KRAS codons12/13/61/146, NRAS codons12/13/61, BRAF codon600 and PIK3CA codons542/545/546/1047, generated highly concordant results. Two samples gave discrepant results; in one a PIK3CA mutation was detected only in Leeds, and in the other, a PIK3CA mutation was only detected in Cardiff. pTEN and mismatch repair (MMR) protein expression was assessed by immunohistochemistry (IHC) resulting in 6/97 discordant results for pTEN and 5/388 for MMR, resolved upon joint review. Tumour heterogeneity was likely responsible for pyrosequencing discrepancies. The presence of signet-ring cells, necrosis, mucin, edge-effects and over-counterstaining influenced IHC discrepancies. Pre-trial assay analytical validation is essential to ensure appropriate selection of patients for targeted therapies. This is feasible for both mutation testing and immunohistochemical assays and must be built into the workup of such trials. ISRCTN90061564. Published by the BMJ Publishing Group Limited. 

  2. Modeling and validation of spectral BRDF on material surface of space target

    NASA Astrophysics Data System (ADS)

    Hou, Qingyu; Zhi, Xiyang; Zhang, Huili; Zhang, Wei

    2014-11-01

Methods for modeling and validating the spectral BRDF of space-target surface materials are presented. First, the microscopic characteristics of the target surface materials were analyzed: a fiber-optic spectrometer was used to measure the directional reflectivity of typical surface materials, and atomic force microscopy was used to measure the surface microstructure, confirming that the surfaces are isotropic and yielding a Gaussian distribution model for the micro-facet heights. A spectral BRDF model was then constructed on the assumption of an isotropic surface composed of micro-facets with the measured Gaussian height distribution; the model characterizes both smooth and rough surfaces well and is therefore appropriate for describing space-target materials. Finally, a laboratory spectral BRDF measurement platform was set up, comprising a tungsten-halogen illumination system, a fiber-optic spectrometer detection system, and a mechanical measurement system, with the measurement sequence and data collection controlled automatically by computer. Yellow thermal-control material and a solar cell were measured, showing the relationship between reflection angle and BRDF at three wavelengths (380 nm, 550 nm and 780 nm), and the difference between the model predictions and the measured data was evaluated by the relative RMS error. Data analysis shows that the relative RMS error is less than 6%, which verifies the correctness of the spectral BRDF model.
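The model-to-measurement agreement above is quantified by a relative RMS error. A minimal sketch of that metric (the BRDF samples below are invented for illustration):

```python
import math

def relative_rms_error(model, measured):
    # Relative RMS error between model predictions and measured values:
    # sqrt(mean(((model - measured) / measured)^2))
    terms = [((m - d) / d) ** 2 for m, d in zip(model, measured)]
    return math.sqrt(sum(terms) / len(terms))

# Hypothetical BRDF values (1/sr) at a few reflection angles
measured = [0.20, 0.35, 0.60, 1.10]
model    = [0.21, 0.34, 0.62, 1.07]
err = relative_rms_error(model, measured)   # about 0.036, i.e. under 6%
```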

  3. Common pitfalls in preclinical cancer target validation.

    PubMed

    Kaelin, William G

    2017-07-01

    An alarming number of papers from laboratories nominating new cancer drug targets contain findings that cannot be reproduced by others or are simply not robust enough to justify drug discovery efforts. This problem probably has many causes, including an underappreciation of the danger of being misled by off-target effects when using pharmacological or genetic perturbants in complex biological assays. This danger is particularly acute when, as is often the case in cancer pharmacology, the biological phenotype being measured is a 'down' readout (such as decreased proliferation, decreased viability or decreased tumour growth) that could simply reflect a nonspecific loss of cellular fitness. These problems are compounded by multiple hypothesis testing, such as when candidate targets emerge from high-throughput screens that interrogate multiple targets in parallel, and by a publication and promotion system that preferentially rewards positive findings. In this Perspective, I outline some of the common pitfalls in preclinical cancer target identification and some potential approaches to mitigate them.

  4. Multi-laboratory validation study of multilocus variable-number tandem repeat analysis (MLVA) for Salmonella enterica serovar Enteritidis, 2015

    PubMed Central

    Peters, Tansy; Bertrand, Sophie; Björkman, Jonas T; Brandal, Lin T; Brown, Derek J; Erdõsi, Tímea; Heck, Max; Ibrahem, Salha; Johansson, Karin; Kornschober, Christian; Kotila, Saara M; Le Hello, Simon; Lienemann, Taru; Mattheus, Wesley; Nielsen, Eva Møller; Ragimbeau, Catherine; Rumore, Jillian; Sabol, Ashley; Torpdahl, Mia; Trees, Eija; Tuohy, Alma; de Pinna, Elizabeth

    2017-01-01

    Multilocus variable-number tandem repeat analysis (MLVA) is a rapid and reproducible typing method that is an important tool for investigation, as well as detection, of national and multinational outbreaks of a range of food-borne pathogens. Salmonella enterica serovar Enteritidis is the most common Salmonella serovar associated with human salmonellosis in the European Union/European Economic Area and North America. Fourteen laboratories from 13 countries in Europe and North America participated in a validation study for MLVA of S. Enteritidis targeting five loci. Following normalisation of fragment sizes using a set of reference strains, a blinded set of 24 strains with known allele sizes was analysed by each participant. The S. Enteritidis 5-loci MLVA protocol was shown to produce internationally comparable results as more than 90% of the participants reported less than 5% discrepant MLVA profiles. All 14 participating laboratories performed well, even those where experience with this typing method was limited. The raw fragment length data were consistent throughout, and the inter-laboratory validation helped to standardise the conversion of raw data to repeat numbers with at least two countries updating their internal procedures. However, differences in assigned MLVA profiles remain between well-established protocols and should be taken into account when exchanging data. PMID:28277220

  5. Multi-laboratory validation study of multilocus variable-number tandem repeat analysis (MLVA) for Salmonella enterica serovar Enteritidis, 2015.

    PubMed

    Peters, Tansy; Bertrand, Sophie; Björkman, Jonas T; Brandal, Lin T; Brown, Derek J; Erdõsi, Tímea; Heck, Max; Ibrahem, Salha; Johansson, Karin; Kornschober, Christian; Kotila, Saara M; Le Hello, Simon; Lienemann, Taru; Mattheus, Wesley; Nielsen, Eva Møller; Ragimbeau, Catherine; Rumore, Jillian; Sabol, Ashley; Torpdahl, Mia; Trees, Eija; Tuohy, Alma; de Pinna, Elizabeth

    2017-03-02

    Multilocus variable-number tandem repeat analysis (MLVA) is a rapid and reproducible typing method that is an important tool for investigation, as well as detection, of national and multinational outbreaks of a range of food-borne pathogens. Salmonella enterica serovar Enteritidis is the most common Salmonella serovar associated with human salmonellosis in the European Union/European Economic Area and North America. Fourteen laboratories from 13 countries in Europe and North America participated in a validation study for MLVA of S. Enteritidis targeting five loci. Following normalisation of fragment sizes using a set of reference strains, a blinded set of 24 strains with known allele sizes was analysed by each participant. The S. Enteritidis 5-loci MLVA protocol was shown to produce internationally comparable results as more than 90% of the participants reported less than 5% discrepant MLVA profiles. All 14 participating laboratories performed well, even those where experience with this typing method was limited. The raw fragment length data were consistent throughout, and the inter-laboratory validation helped to standardise the conversion of raw data to repeat numbers with at least two countries updating their internal procedures. However, differences in assigned MLVA profiles remain between well-established protocols and should be taken into account when exchanging data. This article is copyright of The Authors, 2017.
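The acceptance criterion above (more than 90% of laboratories reporting fewer than 5% discrepant MLVA profiles) can be evaluated mechanically. A minimal sketch with invented 5-locus profiles; the thresholds are taken from the abstract, everything else is illustrative:

```python
def lab_discrepancy_rate(reported, reference):
    # Fraction of strains whose reported 5-locus MLVA profile differs
    # from the reference profile.
    discrepant = sum(1 for strain, profile in reported.items()
                     if profile != reference[strain])
    return discrepant / len(reference)

def study_passes(all_reports, reference,
                 max_rate=0.05, min_lab_fraction=0.90):
    # True if more than min_lab_fraction of labs stay under max_rate.
    ok = sum(1 for rep in all_reports.values()
             if lab_discrepancy_rate(rep, reference) < max_rate)
    return ok / len(all_reports) > min_lab_fraction

# 24 strains, as in the blinded validation set
reference = {f"S{i}": (2, 10, 7, 3, 2) for i in range(24)}
lab_a = dict(reference)                                   # full concordance
lab_b = dict(reference)
lab_b["S5"] = (2, 11, 7, 3, 2)                            # 1/24 discrepant
result = study_passes({"A": lab_a, "B": lab_b}, reference)
```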

  6. Heart rate variability indicates emotional value during pro-social economic laboratory decisions with large external validity.

    PubMed

    Fooken, Jonas

    2017-03-10

The present study investigates the external validity of emotional value measured in economic laboratory experiments by using a physiological indicator of stress, heart rate variability (HRV). While there is ample evidence supporting the external validity of economic experiments, there is little evidence comparing the magnitude of internal levels of emotional stress during decision making with external stress. The current study addresses this gap by comparing the magnitude of decision stress experienced in the laboratory with stress from outside the laboratory. To quantify a large change in HRV, measures observed in the laboratory during decision making are compared to the difference between HRV during a university exam and during other mental activity for the same individuals, in and outside of the laboratory. The results from outside the laboratory indicate the relative magnitude, and hence the relevance, of the laboratory findings. Results show that psychologically induced HRV changes observed in the laboratory, particularly in connection with social preferences, correspond to large effects outside it. This underscores the external validity of laboratory findings and shows the magnitude of emotional value connected to pro-social economic decisions in the laboratory.
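HRV in studies like the one above is often summarised by time-domain statistics such as RMSSD, the root mean square of successive differences between inter-beat (RR) intervals. The abstract does not specify which HRV index was used, so the following is a generic sketch with an invented RR series:

```python
import math

def rmssd(rr_intervals_ms):
    # Root mean square of successive differences of RR intervals (ms),
    # a standard time-domain HRV statistic; lower values under load
    # indicate greater physiological stress.
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Hypothetical RR series (ms)
value = rmssd([800, 810, 790, 805])
```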

  7. Validation of the Electromagnetic Code FACETS for Numerical Simulation of Radar Target Images

    DTIC Science & Technology

    2009-12-01

    Validation of the electromagnetic code FACETS for numerical simulation of radar target images. S. Wong, DRDC Ottawa. ... for simulating radar images of a target is obtained, through direct simulation-to-measurement comparisons. A 3-dimensional computer-aided design

  8. Developmental validation of the MiSeq FGx Forensic Genomics System for Targeted Next Generation Sequencing in Forensic DNA Casework and Database Laboratories.

    PubMed

    Jäger, Anne C; Alvarez, Michelle L; Davis, Carey P; Guzmán, Ernesto; Han, Yonmee; Way, Lisa; Walichiewicz, Paulina; Silva, David; Pham, Nguyen; Caves, Glorianna; Bruand, Jocelyne; Schlesinger, Felix; Pond, Stephanie J K; Varlaro, Joe; Stephens, Kathryn M; Holt, Cydne L

    2017-05-01

Human DNA profiling using PCR at polymorphic short tandem repeat (STR) loci followed by capillary electrophoresis (CE) size separation and length-based allele typing has been the standard in the forensic community for over 20 years. Over the last decade, Next-Generation Sequencing (NGS) matured rapidly, bringing modern advantages to forensic DNA analysis. The MiSeq FGx™ Forensic Genomics System, comprising the ForenSeq™ DNA Signature Prep Kit, MiSeq FGx™ Reagent Kit, MiSeq FGx™ instrument and ForenSeq™ Universal Analysis Software, uses PCR to simultaneously amplify up to 231 forensic loci in a single multiplex reaction. Targeted loci include Amelogenin, 27 common forensic autosomal STRs, 24 Y-STRs, 7 X-STRs and three classes of single nucleotide polymorphisms (SNPs). The ForenSeq™ kit includes two primer sets: Amelogenin, 58 STRs and 94 identity-informative SNPs (iiSNPs) are amplified using DNA Primer Set A (DPMA; 153 loci); if a laboratory chooses to generate investigative leads using DNA Primer Set B, amplification is targeted to the 153 loci in DPMA plus 22 phenotype-informative SNPs (piSNPs) and 56 biogeographical-ancestry SNPs (aiSNPs). High-resolution genotypes, including detection of intra-STR sequence variants, are semi-automatically generated with the ForenSeq™ software. This system was subjected to developmental validation studies according to the 2012 Revised SWGDAM Validation Guidelines. A two-step PCR first amplifies the target forensic STR and SNP loci (PCR1); unique, sample-specific indexed adapters or "barcodes" are attached in PCR2. Approximately 1736 ForenSeq™ reactions were analyzed. Studies include DNA substrate testing (cotton swabs, FTA cards, filter paper), species studies from a range of nonhuman organisms, DNA input sensitivity studies from 1ng down to 7.8pg, two-person human DNA mixture testing with three genotype combinations, stability analysis of partially degraded DNA, and effects of five commonly encountered PCR

  9. Clinical Validation of Copy Number Variant Detection from Targeted Next-Generation Sequencing Panels.

    PubMed

    Kerkhof, Jennifer; Schenkel, Laila C; Reilly, Jack; McRobbie, Sheri; Aref-Eshghi, Erfan; Stuart, Alan; Rupar, C Anthony; Adams, Paul; Hegele, Robert A; Lin, Hanxin; Rodenhiser, David; Knoll, Joan; Ainsworth, Peter J; Sadikovic, Bekim

    2017-11-01

Next-generation sequencing (NGS) technology has rapidly replaced Sanger sequencing in the assessment of sequence variations in clinical genetics laboratories. One major limitation of current NGS approaches is their limited ability to detect copy number variations (CNVs) larger than approximately 50 bp. Because these represent a major mutational burden in many genetic disorders, parallel CNV assessment using alternate supplemental methods, along with the NGS analysis, is normally required, resulting in increased labor, costs, and turnaround times. The objective of this study was to clinically validate a novel CNV detection algorithm using targeted clinical NGS gene panel data. We have applied this approach in a retrospective cohort of 391 samples and a prospective cohort of 2375 samples and found a 100% sensitivity (95% CI, 89%-100%) for 37 unique events and a high degree of specificity to detect CNVs across nine distinct targeted NGS gene panels. This NGS CNV pipeline enables stand-alone first-tier assessment for CNV and sequence variants in a clinical laboratory setting, dispensing with the need for parallel CNV analysis using classic techniques, such as microarray, long-range PCR, or multiplex ligation-dependent probe amplification. This NGS CNV pipeline can also be applied to the assessment of complex genomic regions, including pseudogenic DNA sequences, such as the PMS2CL gene, and to mitochondrial genome heteroplasmy detection. Copyright © 2017 American Society for Investigative Pathology and the Association for Molecular Pathology. Published by Elsevier Inc. All rights reserved.
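The validated pipeline itself is not described in the abstract, but read-depth CNV calling from targeted panels generally compares a sample's coverage per region against a reference pool. A minimal sketch under that assumption (the `call_cnvs` function, its thresholds, and the coverage numbers are all hypothetical, not the authors' algorithm):

```python
import statistics

def call_cnvs(sample_depth, reference_depths, dup_thresh=1.4, del_thresh=0.6):
    """Flag regions whose depth ratio vs. a reference pool suggests a copy
    number change. sample_depth: {region: coverage}; reference_depths: list
    of such dicts from normal samples run on the same panel."""
    calls = {}
    for region, depth in sample_depth.items():
        ref = statistics.median(d[region] for d in reference_depths)
        ratio = depth / ref if ref else float("nan")
        if ratio >= dup_thresh:
            calls[region] = ("DUP", round(ratio, 2))
        elif ratio <= del_thresh:
            calls[region] = ("DEL", round(ratio, 2))
    return calls

refs = [{"ex1": 100, "ex2": 100}, {"ex1": 110, "ex2": 90}, {"ex1": 95, "ex2": 105}]
sample = {"ex1": 50, "ex2": 160}   # heterozygous deletion of ex1, duplication of ex2
print(call_cnvs(sample, refs))     # {'ex1': ('DEL', 0.5), 'ex2': ('DUP', 1.6)}
```

Production pipelines additionally normalize for GC content, batch effects, and overall sequencing depth before computing such ratios.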

  10. Laboratory Validation and Demonstrations of Non-Hexavalent Chromium Conversion Coatings for Steel Substrates (Briefing Charts)

    DTIC Science & Technology

    2011-02-01

UNCLASSIFIED: Approved for public release; distribution unlimited. Laboratory Validation and Demonstrations of Non-Hexavalent Chromium Conversion Coatings for Steel Substrates. Coatings for HHA: • SurTec 650 (ChromitAL TCP), a trivalent chrome pretreatment developed by NAVAIR for aluminum. • Chemetall Oxsilan 9810/2, a non-chrome

  11. Studying Sexual Aggression: A Review of the Evolution and Validity of Laboratory Paradigms

    PubMed Central

    Davis, Kelly Cue; George, William H.; Nagayama Hall, Gordon C.; Parrott, Dominic J.; Tharp, Andra Teten; Stappenbeck, Cynthia A.

    2018-01-01

    Objective Researchers have endeavored for decades to develop and implement experimental assessments of sexual aggression and its precursors to capitalize on the many scientific advantages offered by laboratory experiments, such as rigorous control of key variables and identification of causal relationships. The purpose of this review is to provide an overview of and commentary on the evolution of these laboratory-based methods. Conclusions To date, two primary types of sexual aggression laboratory studies have been developed: those that involve behavioral analogues of sexual aggression and those that assess postulated precursors to sexually aggressive behavior. Although the study of sexual aggression in the laboratory is fraught with methodological challenges, validity concerns, and ethical considerations, advances in the field have resulted in greater methodological rigor, more precise dependent measures, and improved experimental validity, reliability, and realism. Because highly effective sexual aggression prevention strategies remain elusive, continued laboratory-based investigation of sexual aggression coupled with translation of critical findings to the development and modification of sexual aggression prevention programs remains an important task for the field. PMID:29675289

  12. PRN 96-1: Tolerance Enforcement Methods - Independent Laboratory Validation by Petitioner

    EPA Pesticide Factsheets

This notice is intended to clarify the requirements for submission of an Independent Laboratory Validation to accompany new pesticide analytical methods and does not contain additional data requirements. This notice supersedes PR Notice 88-5.

  13. ETHOWATCHER: validation of a tool for behavioral and video-tracking analysis in laboratory animals.

    PubMed

    Crispim Junior, Carlos Fernando; Pederiva, Cesar Nonato; Bose, Ricardo Chessini; Garcia, Vitor Augusto; Lino-de-Oliveira, Cilene; Marino-Neto, José

    2012-02-01

We present software (ETHOWATCHER®) developed to support ethography, object tracking and extraction of kinematic variables from digital video files of laboratory animals. The tracking module allows controlled segmentation of the target from the background, extracting image attributes used to calculate the distance traveled, orientation, length, area and a path graph of the experimental animal. The ethography module allows recording of catalog-based behaviors from the environment or from video files, continuously or frame-by-frame. The output reports the duration, frequency and latency of each behavior and the sequence of events in a user-defined, time-segmented format. Validation tests were conducted on kinematic measurements and on the detection of known behavioral effects of drugs. This software is freely available at www.ethowatcher.ufsc.br. Copyright © 2011 Elsevier Ltd. All rights reserved.
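The distance-traveled output of such a tracking module is, at its core, a sum of Euclidean distances between successive centroid positions. A minimal sketch (ETHOWATCHER's actual implementation is not given in the abstract; `path_length` and the coordinates are illustrative):

```python
import math

def path_length(centroids):
    """Total distance traveled by the tracked animal, summed over successive
    centroid positions (x, y) in pixels or calibrated units."""
    return sum(math.dist(p, q) for p, q in zip(centroids, centroids[1:]))

track = [(0, 0), (3, 4), (3, 10)]   # 5 units + 6 units
print(path_length(track))           # 11.0
```

Real trackers also filter segmentation noise (e.g. by smoothing centroids), since jitter inflates the summed path length.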

  14. Analytic Validation of Immunohistochemical Assays: A Comparison of Laboratory Practices Before and After Introduction of an Evidence-Based Guideline.

    PubMed

    Fitzgibbons, Patrick L; Goldsmith, Jeffrey D; Souers, Rhona J; Fatheree, Lisa A; Volmar, Keith E; Stuart, Lauren N; Nowak, Jan A; Astles, J Rex; Nakhleh, Raouf E

    2017-09-01

    - Laboratories must demonstrate analytic validity before any test can be used clinically, but studies have shown inconsistent practices in immunohistochemical assay validation. - To assess changes in immunohistochemistry analytic validation practices after publication of an evidence-based laboratory practice guideline. - A survey on current immunohistochemistry assay validation practices and on the awareness and adoption of a recently published guideline was sent to subscribers enrolled in one of 3 relevant College of American Pathologists proficiency testing programs and to additional nonsubscribing laboratories that perform immunohistochemical testing. The results were compared with an earlier survey of validation practices. - Analysis was based on responses from 1085 laboratories that perform immunohistochemical staining. Of 1057 responses, 65.4% (691) were aware of the guideline recommendations before this survey was sent and 79.9% (550 of 688) of those have already adopted some or all of the recommendations. Compared with the 2010 survey, a significant number of laboratories now have written validation procedures for both predictive and nonpredictive marker assays and specifications for the minimum numbers of cases needed for validation. There was also significant improvement in compliance with validation requirements, with 99% (100 of 102) having validated their most recently introduced predictive marker assay, compared with 74.9% (326 of 435) in 2010. The difficulty in finding validation cases for rare antigens and resource limitations were cited as the biggest challenges in implementing the guideline. - Dissemination of the 2014 evidence-based guideline validation practices had a positive impact on laboratory performance; some or all of the recommendations have been adopted by nearly 80% of respondents.

  15. Validity of diagnoses, procedures, and laboratory data in Japanese administrative data.

    PubMed

    Yamana, Hayato; Moriwaki, Mutsuko; Horiguchi, Hiromasa; Kodan, Mariko; Fushimi, Kiyohide; Yasunaga, Hideo

    2017-10-01

Validation of recorded data is a prerequisite for studies that utilize administrative databases. The present study evaluated the validity of diagnoses and procedure records in the Japanese Diagnosis Procedure Combination (DPC) data, along with laboratory test results in the newly-introduced Standardized Structured Medical Record Information Exchange (SS-MIX) data. Between November 2015 and February 2016, we conducted chart reviews of 315 patients hospitalized between April 2014 and March 2015 in four middle-sized acute-care hospitals in Shizuoka, Kochi, Fukuoka, and Saga Prefectures and used them as reference standards. The sensitivity and specificity of DPC data in identifying 16 diseases and 10 common procedures were evaluated. The accuracy of SS-MIX data for 13 laboratory test results was also examined. The specificity of diagnoses in the DPC data exceeded 96%, while the sensitivity was below 50% for seven diseases and variable across diseases. When limited to primary diagnoses, the sensitivity and specificity were 78.9% and 93.2%, respectively. The sensitivity of procedure records exceeded 90% for six procedures, and the specificity exceeded 90% for nine procedures. Agreement between the SS-MIX data and the chart reviews was above 95% for all 13 items. The validity of diagnoses and procedure records in the DPC data and laboratory results in the SS-MIX data was high in general, supporting their use in future studies. Copyright © 2017 The Authors. Production and hosting by Elsevier B.V. All rights reserved.
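The sensitivity and specificity figures reported above follow directly from comparing coded records against the chart-review reference standard. A minimal sketch with invented toy data (not the study's counts):

```python
def sensitivity_specificity(recorded, reference):
    """recorded / reference: dicts mapping patient -> bool, i.e. whether the
    diagnosis was coded in the database / truly present per chart review."""
    tp = sum(recorded[p] and reference[p] for p in reference)
    fn = sum((not recorded[p]) and reference[p] for p in reference)
    tn = sum((not recorded[p]) and (not reference[p]) for p in reference)
    fp = sum(recorded[p] and (not reference[p]) for p in reference)
    return tp / (tp + fn), tn / (tn + fp)

# Toy cohort: 4 true cases, 6 non-cases per chart review
reference = {i: i < 4 for i in range(10)}
recorded = {i: i in (0, 1, 9) for i in range(10)}  # misses 2 cases, 1 false positive
sens, spec = sensitivity_specificity(recorded, reference)
print(sens, round(spec, 3))  # 0.5 0.833
```

The asymmetry in the toy output mirrors the study's pattern: administrative coding tends to be specific (few false positives) but not always sensitive.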

  16. Reliability and validity of job content questionnaire for university research laboratory staff in Malaysia.

    PubMed

    Nehzat, F; Huda, B Z; Tajuddin, S H Syed

    2014-03-01

The Job Content Questionnaire (JCQ) has proven to be a reliable and valid instrument to assess job stress in many countries and among various occupations. In Malaysia, both English and Malay versions of the JCQ have been administered to automotive workers, schoolteachers, and office workers. This study assessed the reliability and validity of the instrument with research laboratory staff in a university. A cross-sectional study was conducted among 258 research laboratory staff in Universiti Putra Malaysia (UPM). Malaysian laboratory staff who had worked for at least one year were randomly selected from the nine faculties and institutes in the university that have research laboratories. Self-administered English and Malay versions of the JCQ were used. Three major scales of the JCQ were assessed: decision latitude, psychological job demands, and social support. Cronbach's alpha coefficients for decision latitude and psychological job demands were acceptable (0.70 and 0.72, respectively), while the coefficient for social support (0.86) was good. Exploratory factor analysis showed five factors that correspond closely to the theoretical construct of the questionnaire. The results of this research suggest that the JCQ is reliable and valid for examining psychosocial work situations and job strain among research laboratory staff. Further studies should be done for confirmatory results, and further evaluation is needed on the decision authority subscale for this occupation.
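Cronbach's alpha, the reliability statistic reported for each JCQ scale, can be computed from item scores as k/(k-1) x (1 - sum of item variances / variance of the totals). A self-contained sketch with invented item scores (not the study's data):

```python
import statistics

def cronbach_alpha(items):
    """items: list of item-score lists, one per questionnaire item, aligned
    across respondents. Returns the classic internal-consistency estimate."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent totals
    item_var = sum(statistics.pvariance(s) for s in items)
    return k / (k - 1) * (1 - item_var / statistics.pvariance(totals))

# Three highly consistent items rated by five respondents (invented)
items = [[3, 4, 2, 5, 1],
         [3, 5, 2, 4, 1],
         [4, 4, 2, 5, 2]]
print(round(cronbach_alpha(items), 2))  # 0.96
```

Values of 0.70 or above are conventionally read as acceptable, which is the benchmark the study applies to its three scales.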

  17. The druggable genome and support for target identification and validation in drug development.

    PubMed

    Finan, Chris; Gaulton, Anna; Kruger, Felix A; Lumbers, R Thomas; Shah, Tina; Engmann, Jorgen; Galver, Luana; Kelley, Ryan; Karlsson, Anneli; Santos, Rita; Overington, John P; Hingorani, Aroon D; Casas, Juan P

    2017-03-29

    Target identification (determining the correct drug targets for a disease) and target validation (demonstrating an effect of target perturbation on disease biomarkers and disease end points) are important steps in drug development. Clinically relevant associations of variants in genes encoding drug targets model the effect of modifying the same targets pharmacologically. To delineate drug development (including repurposing) opportunities arising from this paradigm, we connected complex disease- and biomarker-associated loci from genome-wide association studies to an updated set of genes encoding druggable human proteins, to agents with bioactivity against these targets, and, where there were licensed drugs, to clinical indications. We used this set of genes to inform the design of a new genotyping array, which will enable association studies of druggable genes for drug target selection and validation in human disease. Copyright © 2017, American Association for the Advancement of Science.
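The study's core operation, overlapping GWAS-implicated genes with the druggable genome and any licensed agents, reduces to a set intersection. An illustrative sketch with toy gene lists (not the study's actual data; the two drug-target pairings shown are real licensed examples, the selection is hypothetical):

```python
# Genes at complex-disease-associated loci (toy list, not from the paper)
gwas_genes = {"IL6R", "PCSK9", "SORT1", "CELSR2"}

# Druggable-genome entries mapped to licensed agents where they exist
druggable = {"IL6R": "tocilizumab",    # licensed anti-IL6R antibody
             "PCSK9": "evolocumab",    # licensed anti-PCSK9 antibody
             "HMGCR": "statins"}       # druggable, but absent from this GWAS set

# Intersection surfaces development / repurposing candidates
opportunities = {g: druggable[g] for g in gwas_genes & druggable.keys()}
print(sorted(opportunities))  # ['IL6R', 'PCSK9']
```

In the paper this join is done genome-wide and then used to prioritize content for the genotyping array.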

  18. Overview of Target Fabrication in Support of Sandia National Laboratories

    NASA Astrophysics Data System (ADS)

    Schroen, Diana; Breden, Eric; Florio, Joseph; Grine-Jones, Suzi; Holt, Randy; Krych, Wojtek; Metzler, James; Russell, Chris; Stolp, Justin; Streit, Jonathan; Youngblood, Kelly

    2004-11-01

Sandia National Laboratories has succeeded in making its pulsed power driver, the Z machine, a valuable testbed for a great variety of experiments. These experiments include ICF, weapon physics, Equation of State and astrophysics. There are four main target types: Dynamic Hohlraum, Double Pinch, Fast Igniter and EOS. The target sizes are comparable to projected NIF sizes. For example, capsules up to 5 mm have been fielded. This talk will focus on the assembly challenges and the use of foams to create these targets. For many targets, diagnostics and capsules are embedded in the foams, and foam dopants have been added. It is the 14 mg/cc foam target with an embedded capsule (containing deuterium) that has reproducibly produced thermonuclear neutrons. For all target types, characterization and documentation have had to develop to ensure understanding of target performance. To achieve the required resolution we are using a Nikon automated microscope and a custom OMEGA/NIF target assembly system. Our drive for quality has led us to develop a management system that has been registered to ISO 9001.

  19. Cost-effective and business-beneficial computer validation for bioanalytical laboratories.

    PubMed

    McDowall, Rd

    2011-07-01

Computerized system validation is often viewed as a burden and a waste of time to meet regulatory requirements. This article presents a different approach by looking at validation in a bioanalytical laboratory from the business benefits that computer validation can bring. Ask yourself the question, have you ever bought a computerized system that did not meet your initial expectations? This article will look at understanding the process to be automated, the paper to be eliminated and the records to be signed to meet the requirements of the GLP or GCP and Part 11 regulations. This paper will only consider commercial nonconfigurable and configurable software such as plate readers and LC-MS/MS data systems rather than LIMS or custom applications. Two streamlined life cycle models are presented. The first one consists of a single document for validation of nonconfigurable software. The second is for configurable software and is a five-stage model that avoids the need to write functional and design specifications. Both models are aimed at managing the risk each type of software poses whilst reducing the amount of documented evidence required for validation.

  20. Evolution and validation of a personal form of an instrument for assessing science laboratory classroom environments

    NASA Astrophysics Data System (ADS)

    Fraser, Barry J.; Giddings, Geoffrey J.; McRobbie, Campbell J.

The research reported in this article makes two distinctive contributions to the field of classroom environment research. First, because existing instruments are unsuitable for science laboratory classes, the Science Laboratory Environment Inventory (SLEI) was developed and validated. Second, a new Personal form of the SLEI (involving a student's perceptions of his or her own role within the class) was developed and validated in conjunction with the conventional Class form (involving a student's perceptions of the class as a whole), and its usefulness was investigated. The instrument was cross-nationally field-tested with 5,447 students in 269 senior high school and university classes in six countries, and cross-validated with 1,594 senior high school students in 92 classes in Australia. Each SLEI scale exhibited satisfactory internal consistency reliability, discriminant validity, and factorial validity, and differentiated between the perceptions of students in different classes. A variety of applications with the new instrument furnished evidence about its usefulness and revealed that science laboratory classes are dominated by closed-ended activities; mean scores obtained on the Class form were consistently somewhat more favorable than on the corresponding Personal form; females generally held more favorable perceptions than males, but these differences were somewhat larger for the Personal form than the Class form; associations existed between attitudinal outcomes and laboratory environment dimensions; and the Class and Personal forms of the SLEI each accounted for unique variance in student outcomes which was independent of that accounted for by the other form.

  1. Automatic target validation based on neuroscientific literature mining for tractography

    PubMed Central

    Vasques, Xavier; Richardet, Renaud; Hill, Sean L.; Slater, David; Chappelier, Jean-Cedric; Pralong, Etienne; Bloch, Jocelyne; Draganski, Bogdan; Cif, Laura

    2015-01-01

Target identification for tractography studies requires solid anatomical knowledge validated by an extensive literature review across species for each seed structure to be studied. Manual literature review to identify targets for a given seed region is tedious and potentially subjective. Therefore, complementary approaches would be useful. We propose to use text-mining models to automatically suggest potential targets from the neuroscientific literature, full-text articles and abstracts, so that they can be used for anatomical connection studies and more specifically for tractography. We applied text-mining models to three structures: two well-studied structures that are validated deep brain stimulation targets, the internal globus pallidus and the subthalamic nucleus, and the nucleus accumbens, an exploratory target for treating psychiatric disorders. We performed a systematic review of the literature to document the projections of the three selected structures and compared it with the targets proposed by text-mining models, both in rat and primate (including human). We ran probabilistic tractography on the nucleus accumbens and compared the output with the results of the text-mining models and literature review. Overall, text-mining the literature could find three times as many targets as two man-weeks of curation could. The overall efficiency of the text-mining against literature review in our study was 98% recall (at 36% precision), meaning that over all the targets for the three selected seeds, only one target was missed by text-mining. We demonstrate that connectivity for a structure of interest can be extracted from a very large amount of publications and abstracts. We believe this tool will be useful in helping the neuroscience community to facilitate connectivity studies of particular brain regions. The text mining tools used for the study are part of the HBP Neuroinformatics Platform, publicly available at http://connectivity-brainer.rhcloud.com/.
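The recall and precision figures quoted above (98% recall at 36% precision) are standard set-overlap statistics between text-mined targets and the curated literature review. A sketch with invented toy target sets, not the study's actual lists:

```python
def precision_recall(predicted, gold):
    """predicted: set of targets proposed by text mining;
    gold: set of targets confirmed by the manual literature review."""
    tp = len(predicted & gold)
    return tp / len(predicted), tp / len(gold)

# Toy structure names for illustration only
gold = {"GPi", "STN", "NAcc", "thalamus"}
predicted = {"GPi", "STN", "NAcc", "amygdala", "insula"}
p, r = precision_recall(predicted, gold)
print(p, r)  # 0.6 0.75
```

High recall at modest precision, as in the paper, means the miner rarely misses a true target but proposes extra candidates that still need expert triage.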

  2. Extrapolating non-target risk of Bt crops from laboratory to field

    USDA-ARS?s Scientific Manuscript database

    The tiered approach to assessing the ecological risk of insect-resistant transgenic crops rests on the assumption that lower-tier laboratory studies, which expose surrogate non-target organisms to insecticidal proteins, accurately predict the ecological effects of these crops under field conditions....

  3. Development and Score Validation of a Chemistry Laboratory Anxiety Instrument (CLAI) for College Chemistry Students.

    ERIC Educational Resources Information Center

    Bowen, Craig W.

    1999-01-01

    Reports the development and score validation of an instrument for measuring anxieties students experience in college chemistry laboratories. Factor analysis of scores from 361 college students shows that the developed Chemistry Laboratory Anxiety Instrument measures five constructs. Results from a second sample of 598 students show that scores on…

  4. Applications of CRISPR genome editing technology in drug target identification and validation.

    PubMed

    Lu, Quinn; Livi, George P; Modha, Sundip; Yusa, Kosuke; Macarrón, Ricardo; Dow, David J

    2017-06-01

    The analysis of pharmaceutical industry data indicates that the major reason for drug candidates failing in late stage clinical development is lack of efficacy, with a high proportion of these due to erroneous hypotheses about target to disease linkage. More than ever, there is a requirement to better understand potential new drug targets and their role in disease biology in order to reduce attrition in drug development. Genome editing technology enables precise modification of individual protein coding genes, as well as noncoding regulatory sequences, enabling the elucidation of functional effects in human disease relevant cellular systems. Areas covered: This article outlines applications of CRISPR genome editing technology in target identification and target validation studies. Expert opinion: Applications of CRISPR technology in target validation studies are in evidence and gaining momentum. Whilst technical challenges remain, we are on the cusp of CRISPR being applied in complex cell systems such as iPS derived differentiated cells and stem cell derived organoids. In the meantime, our experience to date suggests that precise genome editing of putative targets in primary cell systems is possible, offering more human disease relevant systems than conventional cell lines.

  5. Target studies for the neutrino factory at the Rutherford Appleton laboratory

    NASA Astrophysics Data System (ADS)

    Drumm, Paul; Densham, Chris; Bennett, Roger

    2001-10-01

Target studies at the Rutherford Appleton Laboratory have concentrated on a solid heavy metal target. The suggestion to use a radiatively cooled target which rotates in the beam was made shortly after the first NuFact workshop as a means of dissipating large amounts of power at a high temperature, and as an alternative to the proposed water-cooled rotating band and liquid metal jet targets. This paper examines the proposed drive scheme for the target ring, which uses induced currents and magnetic forces to both levitate and drive the target. Estimates of the power required to levitate and drive the target ring and the forces exerted on the moving ring as it enters the target capture solenoid are given. One of the principal concerns in the operation of a solid target is the severe shock stress experienced due to the impact of an intense energetic proton beam in a time short compared to the transit time of sound in the material. Calculations of the stresses induced in the target ring and their evolution with time, as well as an initial estimate of the expected power densities and stresses in an existing high-power-density target, are presented.

  6. Approaches to Validate and Manipulate RNA Targets with Small Molecules in Cells.

    PubMed

    Childs-Disney, Jessica L; Disney, Matthew D

    2016-01-01

    RNA has become an increasingly important target for therapeutic interventions and for chemical probes that dissect and manipulate its cellular function. Emerging targets include human RNAs that have been shown to directly cause cancer, metabolic disorders, and genetic disease. In this review, we describe various routes to obtain bioactive compounds that target RNA, with a particular emphasis on the development of small molecules. We use these cases to describe approaches that are being developed for target validation, which include target-directed cleavage, classic pull-down experiments, and covalent cross-linking. Thus, tools are available to design small molecules to target RNA and to identify the cellular RNAs that are their targets.

  7. Validation of Laser-Induced Fluorescent Photogrammetric Targets on Membrane Structures

    NASA Technical Reports Server (NTRS)

    Jones, Thomas W.; Dorrington, Adrian A.; Shortis, Mark R.; Hendricks, Aron R.

    2004-01-01

    The need for static and dynamic characterization of a new generation of inflatable space structures requires the advancement of classical metrology techniques. A new photogrammetric-based method for non-contact ranging and surface profiling has been developed at NASA Langley Research Center (LaRC) to support modal analyses and structural validation of this class of space structures. This full field measurement method, known as Laser-Induced Fluorescence (LIF) photogrammetry, has previously yielded promising experimental results. However, data indicating the achievable measurement precision had not been published. This paper provides experimental results that indicate the LIF-photogrammetry measurement precision for three different target types used on a reflective membrane structure. The target types were: (1) non-contact targets generated using LIF, (2) surface attached retro-reflective targets, and (3) surface attached diffuse targets. Results from both static and dynamic investigations are included.

  8. miRTarBase update 2018: a resource for experimentally validated microRNA-target interactions.

    PubMed

    Chou, Chih-Hung; Shrestha, Sirjana; Yang, Chi-Dung; Chang, Nai-Wen; Lin, Yu-Ling; Liao, Kuang-Wen; Huang, Wei-Chi; Sun, Ting-Hsuan; Tu, Siang-Jyun; Lee, Wei-Hsiang; Chiew, Men-Yee; Tai, Chun-San; Wei, Ting-Yen; Tsai, Tzi-Ren; Huang, Hsin-Tzu; Wang, Chung-Yu; Wu, Hsin-Yi; Ho, Shu-Yi; Chen, Pin-Rong; Chuang, Cheng-Hsun; Hsieh, Pei-Jung; Wu, Yi-Shin; Chen, Wen-Liang; Li, Meng-Ju; Wu, Yu-Chun; Huang, Xin-Yi; Ng, Fung Ling; Buddhakosai, Waradee; Huang, Pei-Chun; Lan, Kuan-Chun; Huang, Chia-Yen; Weng, Shun-Long; Cheng, Yeong-Nan; Liang, Chao; Hsu, Wen-Lian; Huang, Hsien-Da

    2018-01-04

MicroRNAs (miRNAs) are small non-coding RNAs of ∼22 nucleotides that are involved in negative regulation of mRNA at the post-transcriptional level. Previously, we developed miRTarBase, which provides information about experimentally validated miRNA-target interactions (MTIs). Here, we describe an updated database containing 422 517 curated MTIs from 4076 miRNAs and 23 054 target genes collected from over 8500 articles. The number of MTIs curated with strong evidence has increased ∼1.4-fold since the last update in 2016. In this updated version, target sites validated by reporter assay that are available in the literature can be downloaded. Target site sequences can be used to extract new features for analysis via machine learning approaches, which can help to evaluate the performance of miRNA-target prediction tools. Furthermore, improved browsing options make it easier for users to locate specific MTIs. With these improvements, miRTarBase serves as a more comprehensively annotated database of experimentally validated miRNA-target interactions for miRNA-related research. miRTarBase is available at http://miRTarBase.mbc.nctu.edu.tw/. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  9. Target analyte quantification by isotope dilution LC-MS/MS directly referring to internal standard concentrations--validation for serum cortisol measurement.

    PubMed

    Maier, Barbara; Vogeser, Michael

    2013-04-01

    Isotope dilution LC-MS/MS methods used in the clinical laboratory typically involve multi-point external calibration in each analytical series. Our aim was to test the hypothesis that determination of target analyte concentrations directly derived from the relation of the target analyte peak area to the peak area of a corresponding stable isotope labelled internal standard compound [direct isotope dilution analysis (DIDA)] may be not inferior to conventional external calibration with respect to accuracy and reproducibility. Quality control samples and human serum pools were analysed in a comparative validation protocol for cortisol as an exemplary analyte by LC-MS/MS. Accuracy and reproducibility were compared between quantification either involving a six-point external calibration function, or a result calculation merely based on peak area ratios of unlabelled and labelled analyte. Both quantification approaches resulted in similar accuracy and reproducibility. For specified analytes, reliable analyte quantification directly derived from the ratio of peak areas of labelled and unlabelled analyte without the need for a time consuming multi-point calibration series is possible. This DIDA approach is of considerable practical importance for the application of LC-MS/MS in the clinical laboratory where short turnaround times often have high priority.
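The DIDA calculation itself is a one-liner: the analyte concentration is the peak-area ratio of unlabelled to labelled compound times the known internal-standard concentration. A sketch with invented peak areas (the `response_factor` argument is a hypothetical generalization for unequal MS response; the cortisol validation described above assumes equivalence, i.e. a factor of 1.0):

```python
def dida_concentration(area_analyte, area_is, conc_is, response_factor=1.0):
    """Direct isotope dilution analysis: analyte concentration from the
    peak-area ratio to a stable-isotope-labelled internal standard (IS) of
    known concentration. response_factor = 1.0 assumes the labelled and
    unlabelled forms ionize identically."""
    return (area_analyte / area_is) * conc_is * response_factor

# Invented peak areas; IS spiked into serum at 100 nmol/L
print(round(dida_concentration(84000, 60000, 100.0), 1))  # 140.0 (nmol/L)
```

The practical appeal is that this ratio needs no per-series calibration curve, which is why the authors compare it against six-point external calibration for accuracy and reproducibility.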

  10. Implementing the Science Assessment Standards: Developing and validating a set of laboratory assessment tasks in high school biology

    NASA Astrophysics Data System (ADS)

    Saha, Gouranga Chandra

    Very often a number of factors, especially time, space and money, deter many science educators from using inquiry-based, hands-on, laboratory practical tasks as alternative assessment instruments in science. A shortage of valid inquiry-based laboratory tasks for high school biology has been cited. Driven by this need, this study addressed the following three research questions: (1) How can laboratory-based performance tasks be designed and developed that are doable by students for whom they are designed/written? (2) Do student responses to the laboratory-based performance tasks validly represent at least some of the intended process skills that new biology learning goals want students to acquire? (3) Are the laboratory-based performance tasks psychometrically consistent as individual tasks and as a set? To answer these questions, three tasks were used from the six biology tasks initially designed and developed by an iterative process of trial testing. Analyses of data from 224 students showed that performance-based laboratory tasks that are doable by all students require careful and iterative process of development. Although the students demonstrated more skill in performing than planning and reasoning, their performances at the item level were very poor for some items. Possible reasons for the poor performances have been discussed and suggestions on how to remediate the deficiencies have been made. Empirical evidences for validity and reliability of the instrument have been presented both from the classical and the modern validity criteria point of view. Limitations of the study have been identified. Finally implications of the study and directions for further research have been discussed.

  11. Development and Cross-National Validation of a Laboratory Classroom Environment Instrument for Senior High School Science.

    ERIC Educational Resources Information Center

    Fraser, Barry J.; And Others

    1993-01-01

    Describes the development of the Science Laboratory Environment Inventory (SLEI) instrument for assessing perceptions of the psychosocial environment in science laboratory classrooms, and reports validation information for samples of senior high school students from six different countries. The SLEI assesses five dimensions of the actual and…

  12. Size dependence of the disruption threshold: laboratory examination of millimeter-centimeter porous targets

    NASA Astrophysics Data System (ADS)

    Nakamura, Akiko M.; Yamane, Fumiya; Okamoto, Takaya; Takasawa, Susumu

    2015-03-01

    The outcome of a collision between small solid bodies is characterized by the threshold energy density Q*s, the specific energy to shatter, defined as the ratio of projectile kinetic energy to target mass (or the sum of target and projectile masses) needed to produce a largest intact fragment containing one half the target mass. Theory and numerical simulations indicate that the disruption threshold Q*s decreases with target size in the strength-dominated regime. This tendency was confirmed by laboratory impact experiments using non-porous rock targets (Housen and Holsapple, 1999; Nagaoka et al., 2014). In this study, we performed low-velocity impact disruption experiments on porous gypsum targets with porosity of 65-69% and of three different sizes to examine the size dependence of the disruption threshold for porous material. The gypsum specimens were shown to have a weaker volume dependence of static tensile strength than the non-porous rocks. The disruption threshold also had a weaker dependence on size scale, Q*s ∝ D^−γ with γ ≤ 0.25-0.26, while previous laboratory studies showed γ = 0.40 for the non-porous rocks. The measurements at low velocity yielded a Q*s of about 100 J kg⁻¹, roughly one order of magnitude lower than the Q*s of gypsum targets of 65% porosity impacted by projectiles at higher velocities. Such a clear dependence on impact velocity was also shown by previous studies of gypsum targets with 50% porosity.
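    The definition of the specific energy and the power-law size dependence above can be sketched directly; the helper names and example masses are illustrative, and the scaling function simply assumes Q*s ∝ D^−γ as stated in the abstract:

```python
def specific_energy(projectile_mass_kg: float,
                    impact_velocity_m_s: float,
                    target_mass_kg: float) -> float:
    """Q = projectile kinetic energy / target mass, in J/kg."""
    return 0.5 * projectile_mass_kg * impact_velocity_m_s ** 2 / target_mass_kg

def scale_qs(qs_ref: float, d_ref: float, d: float, gamma: float = 0.25) -> float:
    """Scale the disruption threshold between target diameters,
    assuming the power law Q*s proportional to D**(-gamma)."""
    return qs_ref * (d / d_ref) ** (-gamma)

# A 0.2 g projectile at 100 m/s striking a 10 g target delivers 100 J/kg;
# doubling the target diameter at gamma = 0.25 lowers Q*s by 2**-0.25.
print(round(specific_energy(0.2e-3, 100.0, 10e-3), 6))  # 100.0
print(round(scale_qs(100.0, 0.01, 0.02), 2))            # 84.09
```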

  13. Validation of laboratory-scale recycling test method of paper PSA label products

    Treesearch

    Carl Houtman; Karen Scallon; Richard Oldack

    2008-01-01

    Starting with test methods and a specification developed by the U.S. Postal Service (USPS) Environmentally Benign Pressure Sensitive Adhesive Postage Stamp Program, a laboratory-scale test method and a specification were developed and validated for pressure-sensitive adhesive labels. By comparing results from this new test method and pilot-scale tests, which have been...

  14. Benchmark radar targets for the validation of computational electromagnetics programs

    NASA Technical Reports Server (NTRS)

    Woo, Alex C.; Wang, Helen T. G.; Schuh, Michael J.; Sanders, Michael L.

    1993-01-01

    Results are presented of a set of computational electromagnetics validation measurements referring to three-dimensional perfectly conducting smooth targets, performed for the Electromagnetic Code Consortium. Plots are presented for both the low- and high-frequency measurements of the NASA almond, an ogive, a double ogive, a cone-sphere, and a cone-sphere with a gap.

  15. Modeling and validation of photometric characteristics of space targets oriented to space-based observation.

    PubMed

    Wang, Hongyuan; Zhang, Wei; Dong, Aotuo

    2012-11-10

    A modeling and validation method for the photometric characteristics of space targets was presented in order to track and identify different satellites effectively. Background radiation models of the target were built on blackbody radiation theory. The geometry of the target was described by surface equations in its body coordinate system. The material characteristics of the target surface were described by a bidirectional reflectance distribution function (BRDF) model, which accounts for the Gaussian statistics of the surface and microscale self-shadowing and is obtained by measurement and modeling in advance. The surfaces of the target contributing to the observation system were determined by coordinate transformation according to the relative positions of the space target, the background radiation sources, and the observation platform. A mathematical model of the photometric characteristics of the space target was then built by summing the reflection components of all contributing surfaces. Photometric simulation of the space-based target was achieved from its given geometrical dimensions, physical parameters, and orbital parameters. Experimental validation was performed using a scale model of the satellite. The calculated results agree well with the measurements, indicating that the modeling method for photometric characteristics of space targets is correct.
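    The facet-summation step can be sketched as below. This is a deliberately simplified illustration: it substitutes an ideal Lambertian BRDF for the measured Gauss-statistics BRDF used in the paper, ignores facet self-shadowing, and all names and numbers are hypothetical:

```python
import math

def _unit(v):
    norm = math.sqrt(sum(c * c for c in v))
    return tuple(c / norm for c in v)

def _dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def reflected_intensity(facets, sun_dir, observer_dir, solar_irradiance=1361.0):
    """Sum Lambertian reflection over the facets that are both illuminated
    and visible. Each facet is (normal vector, area in m^2, albedo)."""
    sun, obs = _unit(sun_dir), _unit(observer_dir)
    total = 0.0
    for normal, area, albedo in facets:
        n = _unit(normal)
        cos_i, cos_r = _dot(n, sun), _dot(n, obs)
        if cos_i > 0 and cos_r > 0:  # lit by the Sun and facing the observer
            total += solar_irradiance * albedo / math.pi * cos_i * cos_r * area
    return total

# A single 1 m^2 facet with albedo 0.3, Sun and observer along the normal:
panel = [((0.0, 0.0, 1.0), 1.0, 0.3)]
print(round(reflected_intensity(panel, (0, 0, 1), (0, 0, 1)), 1))  # 130.0
```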

  16. TARGET's role in knowledge acquisition, engineering, validation, and documentation

    NASA Technical Reports Server (NTRS)

    Levi, Keith R.

    1994-01-01

    We investigate the use of the TARGET task analysis tool in the development of rule-based expert systems. We found TARGET to be very helpful in the knowledge acquisition process: it enabled us to perform knowledge acquisition with one knowledge engineer rather than two, and it improved communication between the domain expert and the knowledge engineer. We also found it useful for both the rule development and refinement phases of the knowledge engineering process. Using the network in these phases required us to develop guidelines for easily translating the network into production rules. A significant requirement for TARGET to remain useful throughout the knowledge engineering process was the need to carefully maintain consistency between the network and rule representations. Maintaining consistency not only benefited the knowledge engineering process but also has significant payoffs for validation of the expert system and documentation of the knowledge in the system.

  17. Systems biology-embedded target validation: improving efficacy in drug discovery.

    PubMed

    Vandamme, Drieke; Minke, Benedikt A; Fitzmaurice, William; Kholodenko, Boris N; Kolch, Walter

    2014-01-01

    The pharmaceutical industry is faced with a range of challenges with the ever-escalating costs of drug development and a drying out of drug pipelines. By harnessing advances in -omics technologies and moving away from the standard, reductionist model of drug discovery, there is significant potential to reduce costs and improve efficacy. Embedding systems biology approaches in drug discovery, which seek to investigate underlying molecular mechanisms of potential drug targets in a network context, will reduce attrition rates by earlier target validation and the introduction of novel targets into the currently stagnant market. Systems biology approaches also have the potential to assist in the design of multidrug treatments and repositioning of existing drugs, while stratifying patients to give a greater personalization of medical treatment. © 2013 Wiley Periodicals, Inc.

  18. Validation of "laboratory-supported" criteria for functional (psychogenic) tremor.

    PubMed

    Schwingenschuh, Petra; Saifee, Tabish A; Katschnig-Winter, Petra; Macerollo, Antonella; Koegl-Wallner, Mariella; Culea, Valeriu; Ghadery, Christine; Hofer, Edith; Pendl, Tamara; Seiler, Stephan; Werner, Ulrike; Franthal, Sebastian; Maurits, Natasha M; Tijssen, Marina A; Schmidt, Reinhold; Rothwell, John C; Bhatia, Kailash P; Edwards, Mark J

    2016-04-01

    In a small group of patients, we have previously shown that a combination of electrophysiological tests was able to distinguish functional (psychogenic) tremor from organic tremor with excellent sensitivity and specificity. This study aims to validate an electrophysiological test battery as a tool to diagnose patients with functional tremor with a "laboratory-supported" level of certainty. For this prospective data collection study, we recruited 38 new patients with functional tremor (mean age 37.9 ± 24.5 years; mean disease duration 5.9 ± 9.0 years) and 73 new patients with organic tremor (mean age 55.4 ± 25.4 years; mean disease duration 15.8 ± 17.7 years). Tremor was recorded at rest, during posture (with and without loading) and action, while performing tapping tasks (1, 3, and 5 Hz), and while performing ballistic movements with the less-affected hand. Electrophysiological tests were performed by raters blinded to the clinical diagnosis. We calculated a sum score for all performed tests (maximum of 10 points) and used a previously suggested cut-off score of 3 points for a diagnosis of laboratory-supported functional tremor. We demonstrated good interrater reliability and test-retest reliability. Patients with functional tremor had a higher average score on the test battery when compared with patients with organic tremor (3.6 ± 1.4 points vs 1.0 ± 0.8 points; P < .001), and the predefined cut-off score for laboratory-supported functional tremor yielded a test sensitivity of 89.5% and a specificity of 95.9%. We now propose this test battery as the basis of laboratory-supported criteria for the diagnosis of functional tremor, and we encourage its use in clinical and research practice. © 2016 International Parkinson and Movement Disorder Society.
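    The reported sensitivity and specificity follow from the standard confusion-matrix definitions. In this sketch the true/false positive counts are back-calculated assumptions consistent with the published cohort sizes and percentages, not figures quoted in the paper:

```python
def sensitivity_specificity(tp: int, fn: int, tn: int, fp: int):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Counts consistent with 38 functional and 73 organic tremor patients and
# the reported 89.5% sensitivity / 95.9% specificity at the >= 3-point
# cut-off: 34 of 38 functional patients above cut-off, 70 of 73 organic
# patients below it.
sens, spec = sensitivity_specificity(tp=34, fn=4, tn=70, fp=3)
print(f"sensitivity {sens:.1%}, specificity {spec:.1%}")
# sensitivity 89.5%, specificity 95.9%
```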

  19. Assessment of physical activity with the Computer Science and Applications, Inc., accelerometer: laboratory versus field validation.

    PubMed

    Nichols, J F; Morgan, C G; Chabot, L E; Sallis, J F; Calfas, K J

    2000-03-01

    Our purpose was to compare the validity of the Computer Science and Applications (CSA), Inc., accelerometer in laboratory and field settings and to establish CSA count ranges for light, moderate, and vigorous physical activity. Validity was determined in 60 adults during treadmill exercise, using oxygen consumption (VO2) as the criterion measure, while 30 adults walked and jogged outdoors on a 400-m track. The relationship between CSA counts and VO2 was linear (R² = .89, SEE = 3.72 ml·kg⁻¹·min⁻¹), as was the relationship between velocity and counts in the field (R² = .89, SEE = 0.89 mi·hr⁻¹). However, significant differences were found (p < .05) between laboratory and field measures of CSA counts for light and vigorous intensity. We conclude that the CSA can be used to quantify walking and jogging outdoors on level ground; however, laboratory equations may not be appropriate for use in field settings, particularly for light and vigorous activity.
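    The linear calibration between counts and VO2 can be illustrated with an ordinary least-squares fit and the corresponding R²; the data below are synthetic placeholders, not the study's measurements:

```python
# Synthetic counts-vs-VO2 data: a linear trend (intercept 5.0, slope 0.004)
# plus small fixed "noise" terms, illustrating the regression and R^2
# computation used to validate the CSA accelerometer calibration.
counts = [1000.0, 2500.0, 4000.0, 5500.0, 7000.0, 8500.0]
noise = [0.3, -0.2, 0.4, -0.5, 0.2, -0.1]
vo2 = [5.0 + 0.004 * c + e for c, e in zip(counts, noise)]  # ml/kg/min

n = len(counts)
mean_x = sum(counts) / n
mean_y = sum(vo2) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(counts, vo2))
         / sum((x - mean_x) ** 2 for x in counts))
intercept = mean_y - slope * mean_x
ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(counts, vo2))
ss_tot = sum((y - mean_y) ** 2 for y in vo2)
r_squared = 1 - ss_res / ss_tot
print(round(slope, 4), round(r_squared, 3))  # 0.004 0.999
```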

  20. Validation of a basic neurosonology laboratory for detecting cervical carotid artery stenosis.

    PubMed

    de la Cruz Cosme, C; Dawid Milner, M S; Ojeda Burgos, G; Gallardo Tur, A; Márquez Martínez, M; Segura, T

    2017-03-24

    Most of the cases of ischaemic stroke in our setting are of atherothrombotic origin. Detecting intracranial and cervical carotid artery stenosis in patients with ischaemic stroke is therefore essential. Ultrasonography has become the tool of choice for diagnosing carotid artery stenosis because it is both readily accessible and reliable. However, use of this technique must be validated in each laboratory. The purpose of this study is to validate Doppler ultrasound in our laboratory as a means of detecting severe carotid artery stenosis. We conducted an observational descriptive study to evaluate diagnostic tests. The results from transcranial and cervical carotid Doppler ultrasound scans conducted by neurologists were compared to those from carotid duplex scans performed by radiologists in patients diagnosed with stroke. Arteriography was considered the gold standard (MR angiography, CT angiography, or conventional arteriography). Our sample included 228 patients. Transcranial and cervical carotid Doppler ultrasound showed a sensitivity of 95% and specificity of 100% for detection of carotid artery stenosis > 70%, whereas carotid duplex displayed a sensitivity of 87% and a specificity of 94%. Transcranial carotid Doppler ultrasound achieved a sensitivity of 78% and a specificity of 98% for detection of intracranial stenosis. Doppler ultrasound in our neurosonology laboratory was found to be a useful diagnostic tool for detecting cervical carotid artery stenosis and demonstrated superiority to carotid duplex despite the lack of B-mode. Furthermore, this technique was found to be useful for detecting intracranial stenosis. Copyright © 2017 Sociedad Española de Neurología. Publicado por Elsevier España, S.L.U. All rights reserved.

  1. Internal validation of STRmix™ - A multi laboratory response to PCAST.

    PubMed

    Bright, Jo-Anne; Richards, Rebecca; Kruijver, Maarten; Kelly, Hannah; McGovern, Catherine; Magee, Alan; McWhorter, Andrew; Ciecko, Anne; Peck, Brian; Baumgartner, Chase; Buettner, Christina; McWilliams, Scott; McKenna, Claire; Gallacher, Colin; Mallinder, Ben; Wright, Darren; Johnson, Deven; Catella, Dorothy; Lien, Eugene; O'Connor, Craig; Duncan, George; Bundy, Jason; Echard, Jillian; Lowe, John; Stewart, Joshua; Corrado, Kathleen; Gentile, Sheila; Kaplan, Marla; Hassler, Michelle; McDonald, Naomi; Hulme, Paul; Oefelein, Rachel H; Montpetit, Shawn; Strong, Melissa; Noël, Sarah; Malsom, Simon; Myers, Steven; Welti, Susan; Moretti, Tamyra; McMahon, Teresa; Grill, Thomas; Kalafut, Tim; Greer-Ritzheimer, MaryMargaret; Beamer, Vickie; Taylor, Duncan A; Buckleton, John S

    2018-05-01

    We report a large compilation of the internal validations of the probabilistic genotyping software STRmix™. Thirty-one laboratories contributed data resulting in 2825 mixtures comprising three to six donors and a wide range of multiplexes, equipment, mixture proportions, and templates. Previously reported trends in the LR were confirmed, including less discriminatory LRs occurring for both donors and non-donors at low template (for the donor in question) and at high contributor number. We were unable to isolate an effect of allelic sharing; any apparent effect appears to be largely confounded with increased contributor number. Copyright © 2018. Published by Elsevier B.V.

  2. Improving validity of informed consent for biomedical research in Zambia using a laboratory exposure intervention.

    PubMed

    Zulu, Joseph Mumba; Lisulo, Mpala Mwanza; Besa, Ellen; Kaonga, Patrick; Chisenga, Caroline C; Chomba, Mumba; Simuyandi, Michelo; Banda, Rosemary; Kelly, Paul

    2014-01-01

    Complex biomedical research can lead to disquiet in communities with limited exposure to scientific discussions, leading to rumours or to high drop-out rates. We set out to test an intervention designed to address apprehensions commonly encountered in a community where literacy is uncommon, and where complex biomedical research has been conducted for over a decade. We aimed to determine if it could improve the validity of consent. Data were collected using focus group discussions, key informant interviews and observations. We designed an intervention that exposed participants to a detailed demonstration of laboratory processes. Each group was interviewed twice in a day, before and after exposure to the intervention, in order to assess changes in their views. Factors that motivated people to participate in invasive biomedical research included a desire to stay healthy because of the screening during the recruitment process, regular advice from doctors, free medical services, and trust in the researchers. Inhibiting factors were limited knowledge about samples taken from their bodies during endoscopic procedures, the impact of endoscopy on the function of internal organs, and concerns about the use of biomedical samples. The belief that blood can be used for Satanic practices also created insecurities about the drawing of blood samples. Further inhibiting factors included a fear of being labelled as HIV positive if known to consult health workers repeatedly, and gender inequality. Concerns about the use and storage of blood and tissue samples were overcome by a laboratory exposure intervention. Selecting a group of members from the target community and engaging them in a laboratory exposure intervention could be a useful tool for enhancing specific aspects of consent for biomedical research. Further work is needed to determine the extent to which improved understanding permeates beyond the immediate group participating in the intervention.

  3. Improving Validity of Informed Consent for Biomedical Research in Zambia Using a Laboratory Exposure Intervention

    PubMed Central

    Zulu, Joseph Mumba; Lisulo, Mpala Mwanza; Besa, Ellen; Kaonga, Patrick; Chisenga, Caroline C.; Chomba, Mumba; Simuyandi, Michelo; Banda, Rosemary; Kelly, Paul

    2014-01-01

    Background Complex biomedical research can lead to disquiet in communities with limited exposure to scientific discussions, leading to rumours or to high drop-out rates. We set out to test an intervention designed to address apprehensions commonly encountered in a community where literacy is uncommon, and where complex biomedical research has been conducted for over a decade. We aimed to determine if it could improve the validity of consent. Methods Data were collected using focus group discussions, key informant interviews and observations. We designed an intervention that exposed participants to a detailed demonstration of laboratory processes. Each group was interviewed twice in a day, before and after exposure to the intervention, in order to assess changes in their views. Results Factors that motivated people to participate in invasive biomedical research included a desire to stay healthy because of the screening during the recruitment process, regular advice from doctors, free medical services, and trust in the researchers. Inhibiting factors were limited knowledge about samples taken from their bodies during endoscopic procedures, the impact of endoscopy on the function of internal organs, and concerns about the use of biomedical samples. The belief that blood can be used for Satanic practices also created insecurities about the drawing of blood samples. Further inhibiting factors included a fear of being labelled as HIV positive if known to consult health workers repeatedly, and gender inequality. Concerns about the use and storage of blood and tissue samples were overcome by a laboratory exposure intervention. Conclusion Selecting a group of members from the target community and engaging them in a laboratory exposure intervention could be a useful tool for enhancing specific aspects of consent for biomedical research. 
Further work is needed to determine the extent to which improved understanding permeates beyond the immediate group participating in the intervention

  4. CRISPR-Cas9-based target validation for p53-reactivating model compounds

    PubMed Central

    Wanzel, Michael; Vischedyk, Jonas B; Gittler, Miriam P; Gremke, Niklas; Seiz, Julia R; Hefter, Mirjam; Noack, Magdalena; Savai, Rajkumar; Mernberger, Marco; Charles, Joël P; Schneikert, Jean; Bretz, Anne Catherine; Nist, Andrea; Stiewe, Thorsten

    2015-01-01

    Inactivation of the p53 tumor suppressor by Mdm2 is one of the most frequent events in cancer, so compounds targeting the p53-Mdm2 interaction are promising for cancer therapy. Mechanisms conferring resistance to p53-reactivating compounds are largely unknown. Here we show using CRISPR-Cas9–based target validation in lung and colorectal cancer that the activity of nutlin, which blocks the p53-binding pocket of Mdm2, strictly depends on functional p53. In contrast, sensitivity to the drug RITA, which binds the Mdm2-interacting N terminus of p53, correlates with induction of DNA damage. Cells with primary or acquired RITA resistance display cross-resistance to DNA crosslinking compounds such as cisplatin and show increased DNA cross-link repair. Inhibition of FancD2 by RNA interference or pharmacological mTOR inhibitors restores RITA sensitivity. The therapeutic response to p53-reactivating compounds is therefore limited by compound-specific resistance mechanisms that can be resolved by CRISPR-Cas9-based target validation and should be considered when allocating patients to p53-reactivating treatments. PMID:26595461

  5. Do targeted written comments and the rubric method of delivery affect performance on future human physiology laboratory reports?

    PubMed

    Clayton, Zachary S; Wilds, Gabriel P; Mangum, Joshua E; Hocker, Austin D; Dawson, Sierra M

    2016-09-01

    We investigated how students performed on weekly two-page laboratory reports based on whether the grading rubric was provided to the student electronically or in paper form and the inclusion of one- to two-sentence targeted comments. Subjects were registered for a 289-student, third-year human physiology class with laboratory and were randomized into four groups related to rubric delivery and targeted comments. All students received feedback via the same detailed grading rubric. At the end of the term, subjects provided consent and a self-assessment of their rubric viewing rate and preferences. There were no differences in laboratory report scores between groups (P = 0.86), although scores did improve over time (P < 0.01). Students receiving targeted comments self-reported viewing their rubric more often than students that received no comments (P = 0.02), but the viewing rate was independent of the rubric delivery method (P = 0.15). Subjects with high rubric viewing rates did not have higher laboratory report grades than subjects with low viewing rates (P = 0.64). When asked about their preference for the future, 43% of respondents preferred the same method again (electronic or paper rubric) and 25% had no preference. We conclude that although student laboratory report grades improved over time, the rate and degree of improvement were not related to rubric delivery method or to the inclusion of targeted comments. Copyright © 2016 The American Physiological Society.

  6. Determination of Ethanol in Kombucha Products: Single-Laboratory Validation, First Action 2016.12.

    PubMed

    Ebersole, Blake; Liu, Ying; Schmidt, Rich; Eckert, Matt; Brown, Paula N

    2017-05-01

    Kombucha is a fermented nonalcoholic beverage that has drawn government attention due to the possible presence of excess ethanol (≥0.5% alcohol by volume; ABV). A validated method that provides better precision and accuracy for measuring ethanol levels in kombucha is urgently needed by the kombucha industry. The current study validated a method for determining ethanol content in commercial kombucha products. The ethanol content in kombucha was measured using headspace GC with flame ionization detection. An ethanol standard curve ranging from 0.05 to 5.09% ABV was used, with correlation coefficients greater than 99.9%. The method detection limit was 0.003% ABV and the LOQ was 0.01% ABV. The RSDr ranged from 1.62 to 2.21% and the Horwitz ratio ranged from 0.4 to 0.6. The average accuracy of the method was 98.2%. This method was validated following the guidelines for single-laboratory validation by AOAC INTERNATIONAL and meets the requirements set by AOAC SMPR 2016.001, "Standard Method Performance Requirements for Determination of Ethanol in Kombucha."
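    The Horwitz ratio quoted above compares an observed RSD with the RSD predicted by the Horwitz equation at the analyte's concentration. A hedged sketch: the conversion of % ABV to mass fraction and the 0.66 scaling for repeatability (HorRat(r)) are common conventions assumed here, not details stated in the abstract:

```python
import math

def horwitz_prsd(mass_fraction: float) -> float:
    """Predicted reproducibility RSD (%) from the Horwitz equation,
    PRSD = 2**(1 - 0.5 * log10(C)), with C as a mass fraction."""
    return 2 ** (1 - 0.5 * math.log10(mass_fraction))

def horrat(observed_rsd_percent: float, mass_fraction: float,
           repeatability: bool = True) -> float:
    """Horwitz ratio = observed RSD / predicted RSD. For repeatability
    data, a common convention scales the predicted value by ~0.66."""
    predicted = horwitz_prsd(mass_fraction)
    if repeatability:
        predicted *= 0.66
    return observed_rsd_percent / predicted

# Illustrative: ~1% ABV ethanol is roughly a 0.008 mass fraction; the
# study's low-end RSDr of 1.62% then gives a HorRat(r) of about 0.59,
# within the reported 0.4-0.6 range.
print(round(horrat(1.62, 0.008), 2))  # 0.59
```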

  7. Genetically Validated Drug Targets in Leishmania: Current Knowledge and Future Prospects.

    PubMed

    Jones, Nathaniel G; Catta-Preta, Carolina M C; Lima, Ana Paula C A; Mottram, Jeremy C

    2018-04-13

    There has been a very limited number of high-throughput screening campaigns carried out with Leishmania drug targets. In part, this is due to the small number of suitable target genes that have been shown by genetic or chemical methods to be essential for the parasite. In this perspective, we discuss the state of genetic target validation in the field of Leishmania research and review the 200 Leishmania genes and 36 Trypanosoma cruzi genes for which gene deletion attempts have been made since the first published case in 1990. We define a quality score for the different genetic deletion techniques that can be used to identify potential drug targets. We also discuss how the advances in genome-scale gene disruption techniques have been used to assist target-based and phenotypic-based drug development in other parasitic protozoa and why Leishmania has lacked a similar approach so far. The prospects for this scale of work are considered in the context of the application of CRISPR/Cas9 gene editing as a useful tool in Leishmania.

  8. Comparative Study in Laboratory Rats to Validate Sperm Quality Methods and Endpoints

    NASA Technical Reports Server (NTRS)

    Price, W. A.; Briggs, G. B.; Alexander, W. K.; Still, K. R.; Grasman, K. A.

    2000-01-01

    The Naval Health Research Center, Detachment (Toxicology) performs toxicity studies in laboratory animals to characterize the risk of exposure to chemicals of Navy interest. Research was conducted at the Toxicology Detachment at WPAFB, OH, in collaboration with the Wright State University Department of Biological Sciences, for the validation of new bioassay methods for evaluating reproductive toxicity. The Hamilton Thorne sperm analyzer was used to evaluate sperm damage produced by exposure to a known testicular toxicant, methoxyacetic acid, and by inhalation exposure to JP-8 and JP-5 in laboratory rats. Sperm quality parameters (sperm concentration, motility, and morphology) were evaluated to provide evidence of sperm damage. The Hamilton Thorne sperm analyzer utilizes a DNA-specific fluorescent stain (similar to flow cytometry) and digitized optical computer analysis to detect sperm cell damage. Computer-assisted sperm analysis (CASA) is a more rapid, robust, predictive, and sensitive method for characterizing reproductive toxicity. The results presented in this poster provide validation information showing that exposure to methoxyacetic acid causes reproductive toxicity, whereas inhalation exposure to JP-8 and JP-5 had no significant effects. The CASA method detects early changes that result in reproductive deficits, and these data will be used in a continuing program to characterize the toxicity of chemicals, and combinations of chemicals, of military interest to formulate permissible exposure limits.

  9. Validity for an integrated laboratory analogue of sexual aggression and bystander intervention.

    PubMed

    Parrott, Dominic J; Tharp, Andra Teten; Swartout, Kevin M; Miller, Cameron A; Hall, Gordon C Nagayama; George, William H

    2012-01-01

    This study sought to develop and validate an integrated laboratory paradigm of sexual aggression and bystander intervention. Participants were a diverse community sample (54% African American) of heterosexual males (N = 156) between 21 and 35 years of age who were recruited to complete the study with a male friend and an ostensibly single, heterosexual female who reported a strong dislike of sexual content in the media. Participants viewed a sexually explicit or nonsexually explicit film clip as part of a contrived media rating task and made individual choices of which film clip to show the female confederate. Immediately thereafter, participants were required to reach consensus on a group decision of which film clip to show the female confederate. Subjecting a target to an unwanted experience with a sexual connotation was operationalized as selection of the sexually explicit video, whereas successful bystander intervention was operationalized as the event of one partner individually selecting the sexually explicit video but then selecting the nonsexually explicit video for the group choice. Results demonstrated that a 1-year history of sexual aggression and endorsement of pertinent misogynistic attitudes significantly predicted selection of the sexually explicit video. In addition, bystander efficacy significantly predicted men's successful prevention of their male peer's intent to show the female confederate a sexually explicit video. Discussion focused on how these data inform future research and bystander intervention programming for sexual aggression. © 2012 Wiley Periodicals, Inc.

  10. Multi-laboratory evaluations of the performance of Catellicoccus marimammalium PCR assays developed to target gull fecal sources

    USGS Publications Warehouse

    Sinigalliano, Christopher D.; Ervin, Jared S.; Van De Werfhorst, Laurie C.; Badgley, Brian D.; Ballestée, Elisenda; Bartkowiaka, Jakob; Boehm, Alexandria B.; Byappanahalli, Muruleedhara N.; Goodwin, Kelly D.; Gourmelon, Michèle; Griffith, John; Holden, Patricia A.; Jay, Jenny; Layton, Blythe; Lee, Cheonghoon; Lee, Jiyoung; Meijer, Wim G.; Noble, Rachel; Raith, Meredith; Ryu, Hodon; Sadowsky, Michael J.; Schriewer, Alexander; Wang, Dan; Wanless, David; Whitman, Richard; Wuertz, Stefan; Santo Domingo, Jorge W.

    2013-01-01

    Here we report results from a multi-laboratory (n = 11) evaluation of four different PCR methods targeting the 16S rRNA gene of Catellicoccus marimammalium originally developed to detect gull fecal contamination in coastal environments. The methods included a conventional end-point PCR method, a SYBR® Green qPCR method, and two TaqMan® qPCR methods. Different techniques for data normalization and analysis were tested. Data analysis methods had a pronounced impact on assay sensitivity and specificity calculations. Across-laboratory standardization of metrics including the lower limit of quantification (LLOQ), target detected but not quantifiable (DNQ), and target not detected (ND) significantly improved results compared to results submitted by individual laboratories prior to definition standardization. The unit of measure used for data normalization also had a pronounced effect on measured assay performance. Data normalization to DNA mass improved quantitative method performance as compared to enterococcus normalization. The MST methods tested here were originally designed for gulls but were found in this study to also detect feces from other birds, particularly feces composited from pigeons. Sequencing efforts showed that some pigeon feces from California contained sequences similar to C. marimammalium found in gull feces. These data suggest that the prevalence, geographic scope, and ecology of C. marimammalium in host birds other than gulls require further investigation. This study represents an important first step in the multi-laboratory assessment of these methods and highlights the need to broaden and standardize additional evaluations, including environmentally relevant target concentrations in ambient waters from diverse geographic regions.
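    The standardized result categories the laboratories agreed on (quantifiable, DNQ, ND) can be expressed as a simple classification rule over quantification-cycle values; the Cq cut-offs below are hypothetical placeholders for illustration, since the abstract does not publish specific thresholds:

```python
def classify_qpcr(cq, cq_detect_limit: float, cq_lloq: float) -> str:
    """Bin a qPCR result into the standardized categories used in the
    multi-laboratory study: quantifiable, DNQ (target detected but not
    quantifiable), or ND (target not detected). Lower Cq means more
    target, so the LLOQ cut-off lies below the detection cut-off."""
    if cq is None or cq > cq_detect_limit:
        return "ND"
    if cq > cq_lloq:
        return "DNQ"
    return "quantifiable"

# Hypothetical cut-offs: detection limit at Cq 40, LLOQ at Cq 35.
print([classify_qpcr(cq, 40, 35) for cq in (28.0, 37.5, None)])
# ['quantifiable', 'DNQ', 'ND']
```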

  11. EGFR, HER2 and VEGF pathways: validated targets for cancer treatment.

    PubMed

    Press, Michael F; Lenz, Heinz-Josef

    2007-01-01

    Targeted therapies are rationally designed to interfere with specific molecular events that are important in tumour growth, progression or survival. Several targeted therapies with anti-tumour activity in human cancer cell lines and xenograft models have now been shown to produce objective responses, delay disease progression and, in some cases, improve survival of patients with advanced malignancies. These targeted therapies include cetuximab, an anti-epidermal growth factor receptor (EGFR) monoclonal antibody; gefitinib and erlotinib, EGFR-specific tyrosine kinase inhibitors; trastuzumab, an anti-human EGFR type 2 (HER2)-related monoclonal antibody; lapatinib, a dual inhibitor of both EGFR- and HER2-associated tyrosine kinases; and bevacizumab, an anti-vascular endothelial growth factor (VEGF) monoclonal antibody. On the basis of preclinical and clinical evidence, EGFR, HER2 and VEGF represent validated targets for cancer therapy and remain the subject of intensive investigation. Both EGFR and HER2 are targets found on cancer cells, whereas VEGF is a target that acts in the tumour microenvironment. Clinical studies are focusing on how to best incorporate targeted therapy into current treatment regimens and other studies are exploring whether different strategies for inhibiting these targets will offer greater benefit. It is clear that optimal use of targeted therapy will depend on understanding how these drugs work mechanistically, and recognising that their activities may differ across patient populations, tumour types and disease stages, as well as when and how they are used in cancer treatment. The results achieved with targeted therapies to date are promising, although they illustrate the need for additional preclinical and clinical study.

  12. Intra- and inter-laboratory validation of a dipstick immunoassay for the detection of tropane alkaloids hyoscyamine and scopolamine in animal feed.

    PubMed

    Mulder, Patrick P J; von Holst, Christoph; Nivarlet, Noan; van Egmond, Hans P

    2014-01-01

Tropane alkaloids (TAs) are toxic secondary metabolites produced by plants of, inter alia, the genera Datura (thorn apple) and Atropa (deadly nightshade). The most relevant TAs are (-)-L-hyoscyamine and (-)-L-scopolamine, which act as antagonists of acetylcholine muscarinic receptors and can induce a variety of distinct toxic syndromes in mammals (anti-cholinergic poisoning). The European Union has regulated the presence of seeds of Datura sp. in animal feeds, specifying that the content should not exceed 1000 mg kg⁻¹ (Directive 2002/32/EC). For materials that have not been ground, visual screening methods are often used to comply with these regulations, but these cannot be used for ground materials and compound feeds. Immunological assays, preferably in dipstick format, can be a simple and cost-effective approach to monitor feedstuffs in an HACCP setting in control laboratories. So far no reports have been published on immunoassays that are capable of detecting both hyoscyamine and scopolamine with equal sensitivity and that can be used, preferably in dipstick format, for application as a fast screening tool in feed analysis. This study presents the results obtained for the in-house and inter-laboratory validation of a dipstick immunoassay for the detection of hyoscyamine and scopolamine in animal feed. The target level was set at 800 µg kg⁻¹ for the sum of both alkaloids. By using a representative set of compound feeds during validation and a robust study design, a reliable impression of the relevant characteristics of the assay could be obtained. The dipstick test displayed similar sensitivity towards the two alkaloids and it could be concluded that the test has a very low probability of producing a false-positive result at blank level or a false-negative result at target level. The assay can be used for monitoring of TAs in feedstuffs, but also has potential as a quick screening tool in food- or feed-related poisonings.

  13. Remote Sensing Product Verification and Validation at the NASA Stennis Space Center

    NASA Technical Reports Server (NTRS)

    Stanley, Thomas M.

    2005-01-01

    Remote sensing data product verification and validation (V&V) is critical to successful science research and applications development. People who use remote sensing products to make policy, economic, or scientific decisions require confidence in and an understanding of the products' characteristics to make informed decisions about the products' use. NASA data products of coarse to moderate spatial resolution are validated by NASA science teams. NASA's Stennis Space Center (SSC) serves as the science validation team lead for validating commercial data products of moderate to high spatial resolution. At SSC, the Applications Research Toolbox simulates sensors and targets, and the Instrument Validation Laboratory validates critical sensors. The SSC V&V Site consists of radiometric tarps, a network of ground control points, a water surface temperature sensor, an atmospheric measurement system, painted concrete radial target and edge targets, and other instrumentation. NASA's Applied Sciences Directorate participates in the Joint Agency Commercial Imagery Evaluation (JACIE) team formed by NASA, the U.S. Geological Survey, and the National Geospatial-Intelligence Agency to characterize commercial systems and imagery.

  14. Naval Research Laboratory Multiscale Targeting Guidance for T-PARC and TCS-08

    DTIC Science & Technology

    2010-01-01

Naval Research Laboratory Multiscale Targeting Guidance for T-PARC and TCS-08. Carolyn A. Reynolds and James D. Doyle, Marine Meteorology Division...of The Observing System Research and Predictability Experiment (THORPEX) Pacific Asian Regional Campaign (T-PARC) and the Office of Naval Research's...These products were produced with 24-, 36-, and 48-h lead times. The nonhydrostatic adjoint system used during T-PARC/TCS-08 contains an exact adjoint to

  15. VALIDATION OF THE CORONAL THICK TARGET SOURCE MODEL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fleishman, Gregory D.; Xu, Yan; Nita, Gelu N.

    2016-01-10

We present detailed 3D modeling of a dense, coronal thick-target X-ray flare using the GX Simulator tool, photospheric magnetic measurements, and microwave imaging and spectroscopy data. The developed model offers a remarkable agreement between the synthesized and observed spectra and images in both the X-ray and microwave domains, which validates the entire model. The flaring loop parameters are chosen to reproduce the emission measure, temperature, and the nonthermal electron distribution at low energies derived from the X-ray spectral fit, while the remaining parameters, unconstrained by the X-ray data, are selected so as to match the microwave images and total power spectra. The modeling suggests that the accelerated electrons are trapped in the coronal part of the flaring loop, but away from where the magnetic field is minimal, and thus demonstrates that the data are clearly inconsistent with electron magnetic trapping in the weak diffusion regime mediated by Coulomb collisions. Thus, the modeling supports the interpretation of the coronal thick-target sources as sites of electron acceleration in flares and supplies us with a realistic 3D model with physical parameters of the acceleration region and flaring loop.

  16. Setup, Validation and Quality Control of a Centralized WGS laboratory - Lessons Learned.

    PubMed

    Arnold, Cath; Edwards, Kirstin; Desai, Meeta; Platt, Steve; Green, Jonathan; Conway, David

    2018-04-25

Routine whole genome analysis of infectious diseases can inform various scenarios pertaining to public health, including identification of microbial pathogens; relating individual cases to an outbreak of infectious disease; establishing an association between an outbreak of food poisoning and a specific food vehicle; inferring drug susceptibility; source tracing of contaminants; and study of how variations in the genome affect pathogenicity/virulence. We describe the setup, validation and ongoing verification of a centralised WGS laboratory to carry out the sequencing for these public health functions for the National Infection Services, Public Health England in the UK. The performance characteristics and quality control metrics measured during validation and verification of the entire end-to-end process (accuracy, precision, reproducibility and repeatability) are described and include information regarding the automated pass and release of data to service users without intervention. © Crown copyright 2018.

  17. Laboratory-based validation of the baseline sensors of the ITER diagnostic residual gas analyzer

    NASA Astrophysics Data System (ADS)

    Klepper, C. C.; Biewer, T. M.; Marcus, C.; Andrew, P.; Gardner, W. L.; Graves, V. B.; Hughes, S.

    2017-10-01

The divertor-specific ITER Diagnostic Residual Gas Analyzer (DRGA) will provide essential information relating to DT fusion plasma performance. This includes pulse-resolving measurements of the fuel isotopic mix reaching the pumping ducts, as well as the concentration of the helium generated as the ash of the fusion reaction. In the present baseline design, the cluster of sensors attached to this diagnostic's differentially pumped analysis chamber assembly includes a radiation compatible version of a commercial quadrupole mass spectrometer, as well as an optical gas analyzer using a plasma-based light excitation source. This paper reports on a laboratory study intended to validate the performance of this sensor cluster, with emphasis on the detection limit of the isotopic measurement. This validation study was carried out in a laboratory set-up that closely prototyped the analysis chamber assembly configuration of the baseline design. This includes an ITER-specific placement of the optical gas measurement downstream from the first turbine of the chamber's turbo-molecular pump to provide sufficient light emission while preserving the gas dynamics conditions that allow for ~1 s response time from the sensor cluster [1].

  18. Comparison of Continuous-Wave CO2 Lidar Calibration by use of Earth-Surface Targets in Laboratory and Airborne Measurements

    NASA Technical Reports Server (NTRS)

    Jarzembski, Maurice A.; Srivastava, Vandana

    1998-01-01

Backscatter of several Earth surfaces was characterized in the laboratory as a function of incidence angle with a focused continuous-wave 9.1 µm CO2 Doppler lidar for use as possible calibration targets. Some targets showed negligible angular dependence, while others showed a slight increase with decreasing angle. The Earth-surface signal measured over complex Californian terrain during a 1995 NASA airborne mission compared well with laboratory data. Distributions of the Earth-surface signal show that the lidar efficiency can be estimated with a fair degree of accuracy, preferably with uniform Earth-surface targets during flight for airborne or space-based lidar.

  19. Ground Target Modeling and Validation Conference (10th) Held in Houghton, Michigan, on 17-19 August 1999

    DTIC Science & Technology

    1999-08-01

electrically small or only have a greater size in one dimension will not have a significant impact on the total RCS. At 1000 MHz, the components on the model ... Proceedings of the Tenth Annual Ground Target Modeling and Validation Conference (Unclassified), August 1999. Personal authors: William R. Reynolds and Tracy T. Maki

  20. 42 CFR 493.1780 - Standard: Inspection of CLIA-exempt laboratories or laboratories requesting or issued a...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... laboratories or laboratories requesting or issued a certificate of accreditation. (a) Validation inspection. CMS or a CMS agent may conduct a validation inspection of any accredited or CLIA-exempt laboratory at... requirements of this part. (c) Noncompliance determination. If a validation or complaint inspection results in...

  1. 42 CFR 493.1780 - Standard: Inspection of CLIA-exempt laboratories or laboratories requesting or issued a...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... laboratories or laboratories requesting or issued a certificate of accreditation. (a) Validation inspection. CMS or a CMS agent may conduct a validation inspection of any accredited or CLIA-exempt laboratory at... requirements of this part. (c) Noncompliance determination. If a validation or complaint inspection results in...

  2. 42 CFR 493.1780 - Standard: Inspection of CLIA-exempt laboratories or laboratories requesting or issued a...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... laboratories or laboratories requesting or issued a certificate of accreditation. (a) Validation inspection. CMS or a CMS agent may conduct a validation inspection of any accredited or CLIA-exempt laboratory at... requirements of this part. (c) Noncompliance determination. If a validation or complaint inspection results in...

  3. 42 CFR 493.1780 - Standard: Inspection of CLIA-exempt laboratories or laboratories requesting or issued a...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... laboratories or laboratories requesting or issued a certificate of accreditation. (a) Validation inspection. CMS or a CMS agent may conduct a validation inspection of any accredited or CLIA-exempt laboratory at... requirements of this part. (c) Noncompliance determination. If a validation or complaint inspection results in...

  4. 42 CFR 493.1780 - Standard: Inspection of CLIA-exempt laboratories or laboratories requesting or issued a...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... laboratories or laboratories requesting or issued a certificate of accreditation. (a) Validation inspection. CMS or a CMS agent may conduct a validation inspection of any accredited or CLIA-exempt laboratory at... requirements of this part. (c) Noncompliance determination. If a validation or complaint inspection results in...

  5. A statistical approach to selecting and confirming validation targets in -omics experiments

    PubMed Central

    2012-01-01

    Background Genomic technologies are, by their very nature, designed for hypothesis generation. In some cases, the hypotheses that are generated require that genome scientists confirm findings about specific genes or proteins. But one major advantage of high-throughput technology is that global genetic, genomic, transcriptomic, and proteomic behaviors can be observed. Manual confirmation of every statistically significant genomic result is prohibitively expensive. This has led researchers in genomics to adopt the strategy of confirming only a handful of the most statistically significant results, a small subset chosen for biological interest, or a small random subset. But there is no standard approach for selecting and quantitatively evaluating validation targets. Results Here we present a new statistical method and approach for statistically validating lists of significant results based on confirming only a small random sample. We apply our statistical method to show that the usual practice of confirming only the most statistically significant results does not statistically validate result lists. We analyze an extensively validated RNA-sequencing experiment to show that confirming a random subset can statistically validate entire lists of significant results. Finally, we analyze multiple publicly available microarray experiments to show that statistically validating random samples can both (i) provide evidence to confirm long gene lists and (ii) save thousands of dollars and hundreds of hours of labor over manual validation of each significant result. Conclusions For high-throughput -omics studies, statistical validation is a cost-effective and statistically valid approach to confirming lists of significant results. PMID:22738145
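The core idea above, confirming only a small random sample of significant results and making a statistical statement about the whole list, can be sketched with a simple binomial lower bound on the sample's confirmation rate. This Wald-style approximation is a hypothetical stand-in for the paper's actual method; the function name and numbers are illustrative.

```python
import math

def list_confirmation_bound(confirmed, sampled, z=1.645):
    """Lower one-sided ~95% bound on the true confirmation rate of a
    result list, given `confirmed` successes out of `sampled` randomly
    chosen significant results (normal approximation)."""
    p_hat = confirmed / sampled
    se = math.sqrt(p_hat * (1 - p_hat) / sampled)
    return max(0.0, p_hat - z * se)

# e.g. 18 of 20 randomly sampled hits confirmed in the wet lab:
bound = list_confirmation_bound(18, 20)   # roughly 0.79
```

Because the sample is drawn at random, the bound applies to the entire list; confirming only the top-ranked hits gives no such guarantee, which is the paper's central point.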

  6. Validating Laboratory Results in Electronic Health Records

    PubMed Central

    Perrotta, Peter L.; Karcher, Donald S.

    2017-01-01

    Context Laboratories must ensure that the test results and pathology reports they transmit to a patient’s electronic health record (EHR) are accurate, complete, and presented in a useable format. Objective To determine the accuracy, completeness, and formatting of laboratory test results and pathology reports transmitted from the laboratory to the EHR. Design Participants from 45 institutions retrospectively reviewed results from 16 different laboratory tests, including clinical and anatomic pathology results, within the EHR used by their providers to view laboratory results. Results were evaluated for accuracy, presence of required elements, and usability. Both normal and abnormal results were reviewed for tests, some of which were performed in-house and others at a reference laboratory. Results Overall accuracy for test results transmitted to the EHR was greater than 99.3% (1052 of 1059). There was lower compliance for completeness of test results, with 69.6% (732 of 1051) of the test results containing all essential reporting elements. Institutions that had fewer than half of their orders entered electronically had lower test result completeness rates. The rate of appropriate formatting of results was 90.9% (98 of 1010). Conclusions The great majority of test results are accurately transmitted from the laboratory to the EHR; however, lower percentages are transmitted completely and in a useable format. Laboratories should verify the accuracy, completeness, and format of test results at the time of test implementation, after test changes, and periodically. PMID:27575266

  7. Universal immunogenicity validation and assessment during early biotherapeutic development to support a green laboratory.

    PubMed

    Bautista, Ami C; Zhou, Lei; Jawa, Vibha

    2013-10-01

    Immunogenicity support during nonclinical biotherapeutic development can be resource intensive if supported by conventional methodologies. A universal indirect species-specific immunoassay can eliminate the need for biotherapeutic-specific anti-drug antibody immunoassays without compromising quality. By implementing the R's of sustainability (reduce, reuse, rethink), conservation of resources and greener laboratory practices were achieved in this study. Statistical analysis across four biotherapeutics supported identification of consistent product performance standards (cut points, sensitivity and reference limits) and a streamlined universal anti-drug antibody immunoassay method implementation strategy. We propose an efficient, fit-for-purpose, scientifically and statistically supported nonclinical immunogenicity assessment strategy. Utilization of a universal method and streamlined validation, while retaining comparability to conventional immunoassays and meeting the industry recommended standards, provides environmental credits in the scientific laboratory. Collectively, individual reductions in critical material consumption, energy usage, waste and non-environment friendly consumables, such as plastic and paper, support a greener laboratory environment.

  8. Implementation of a digital preparation validation tool in dental skills laboratory training.

    PubMed

    Kozarovska, A; Larsson, C

    2018-05-01

To describe the implementation of a digital tool for preparation validation and evaluate it as an aid in students' self-assessment. Students in the final semester of skills laboratory training were asked to use a digital preparation validation tool (PVT) when performing two different tasks: preparation of crowns for teeth 11 and 21. The students were divided into two groups. Group A self-assessed and scanned all three attempts at 21 ("prep-and-scan"). Group B self-assessed all attempts, chose the best one, and scanned it ("best-of-three"). The situation was reversed for 11. The students assessed five parameters of the preparation and marked them as approved (A) or failed (F). These marks were compared with the information from the PVT. The students also completed a questionnaire; each question was rated from 1 to 5. Teachers' opinions were collected at staff meetings throughout the project. Most students in the "prep-and-scan" groups showed an increase in agreement between their self-assessment and the information from the PVT, whereas students in the "best-of-three" groups showed lower levels of agreement. All students rated the PVT positively. Most strongly agreed that the tool was helpful in developing skills (mean 4.15), easy to use (mean 4.23) and that it added benefits in comparison to existing assessment tools (mean 4.05). They did not, however, fully agree that the tool is time efficient (mean 2.55), and they did not consider it a substitute for verbal teacher feedback. Teachers' feedback suggested advantages of the tool in the form of ease of use, visual aid, and increased interest and motivation during skills laboratory training; however, they did not notice a reduction in the need for verbal feedback. Within the limitations of the study, our conclusion is that a digital PVT may be a valuable adjunct to other assessment tools in skills laboratory training. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  9. Human genetics as a model for target validation: finding new therapies for diabetes.

    PubMed

    Thomsen, Soren K; Gloyn, Anna L

    2017-06-01

    Type 2 diabetes is a global epidemic with major effects on healthcare expenditure and quality of life. Currently available treatments are inadequate for the prevention of comorbidities, yet progress towards new therapies remains slow. A major barrier is the insufficiency of traditional preclinical models for predicting drug efficacy and safety. Human genetics offers a complementary model to assess causal mechanisms for target validation. Genetic perturbations are 'experiments of nature' that provide a uniquely relevant window into the long-term effects of modulating specific targets. Here, we show that genetic discoveries over the past decades have accurately predicted (now known) therapeutic mechanisms for type 2 diabetes. These findings highlight the potential for use of human genetic variation for prospective target validation, and establish a framework for future applications. Studies into rare, monogenic forms of diabetes have also provided proof-of-principle for precision medicine, and the applicability of this paradigm to complex disease is discussed. Finally, we highlight some of the limitations that are relevant to the use of genome-wide association studies (GWAS) in the search for new therapies for diabetes. A key outstanding challenge is the translation of GWAS signals into disease biology and we outline possible solutions for tackling this experimental bottleneck.

  10. Validity of Teacher Ratings in Selecting Influential Aggressive Adolescents for a Targeted Preventive Intervention

    PubMed Central

    Henry, David B.; Miller-Johnson, Shari; Simon, Thomas R.; Schoeny, Michael E.

    2009-01-01

This study describes a method for using teacher nominations and ratings to identify socially influential, aggressive middle school students for participation in a targeted violence prevention intervention. The teacher nomination method is compared with peer nominations of aggression and influence to obtain validity evidence. Participants were urban, predominantly African American and Latino sixth-grade students who were involved in a pilot study for a large multi-site violence prevention project. Convergent validity was suggested by the high correlation of teacher ratings of peer influence and peer nominations of social influence. The teacher ratings of influence demonstrated acceptable sensitivity and specificity when predicting peer nominations of influence among the most aggressive children. Results are discussed in terms of the application of teacher nominations and ratings in large trials and full implementation of targeted prevention programs. PMID:16378226

  11. Do Targeted Written Comments and the Rubric Method of Delivery Affect Performance on Future Human Physiology Laboratory Reports?

    ERIC Educational Resources Information Center

    Clayton, Zachary S.; Wilds, Gabriel P.; Mangum, Joshua E.; Hocker, Austin D.; Dawson, Sierra M.

    2016-01-01

    We investigated how students performed on weekly two-page laboratory reports based on whether the grading rubric was provided to the student electronically or in paper form and the inclusion of one- to two-sentence targeted comments. Subjects were registered for a 289-student, third-year human physiology class with laboratory and were randomized…

  12. Comparison of on-site field measured inorganic arsenic in rice with laboratory measurements using a field deployable method: Method validation.

    PubMed

    Mlangeni, Angstone Thembachako; Vecchi, Valeria; Norton, Gareth J; Raab, Andrea; Krupp, Eva M; Feldmann, Joerg

    2018-10-15

A commercial arsenic field kit designed to measure inorganic arsenic (iAs) in water was modified into a field deployable method (FDM) to measure iAs in rice. While the method has been validated to give precise and accurate results in the laboratory, its on-site field performance has not been evaluated. This study was designed to test the method on-site in Malawi in order to evaluate its accuracy and precision in determination of iAs on-site by comparing it with a validated reference method, giving original data on inorganic arsenic in Malawian rice and rice-based products. The method was validated against the established laboratory-based HPLC-ICP-MS. Statistical tests indicated there were no significant differences between on-site and laboratory iAs measurements determined using the FDM (p = 0.263, α = 0.05) or between on-site measurements and measurements determined using HPLC-ICP-MS (p = 0.299, α = 0.05). This method allows quick (within 1 h) and efficient screening of rice for iAs concentrations on-site. Copyright © 2018 Elsevier Ltd. All rights reserved.
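A paired comparison of the kind reported above (on-site versus laboratory measurements of the same rice samples) can be sketched as a paired t statistic on the per-sample differences. The measurement values below are invented for illustration; the study's actual data and exact statistical procedure are not reproduced here.

```python
import math

def paired_t_statistic(x, y):
    """Paired t statistic for two equal-length measurement series
    taken on the same samples (e.g. on-site vs. laboratory)."""
    diffs = [a - b for a, b in zip(x, y)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)
    return mean / math.sqrt(var / n)

# Hypothetical iAs concentrations (µg/kg) for five rice samples:
onsite = [52.0, 61.5, 48.2, 70.1, 55.3]
lab    = [50.8, 63.0, 47.5, 71.2, 54.0]
t = paired_t_statistic(onsite, lab)   # small |t| -> no evidence of bias
```

A small |t| relative to the t distribution's critical value (compared at the chosen α) corresponds to the "no significant difference" conclusion the abstract reports.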

  13. Design, development, testing and validation of a Photonics Virtual Laboratory for the study of LEDs

    NASA Astrophysics Data System (ADS)

    Naranjo, Francisco L.; Martínez, Guadalupe; Pérez, Ángel L.; Pardo, Pedro J.

    2014-07-01

This work presents the design, development, testing and validation of a Photonics Virtual Laboratory, highlighting the study of LEDs. The study was conducted from a conceptual, experimental and didactic standpoint, using e-learning and m-learning platforms. Specifically, teaching tools have been developed that help ensure that our students achieve meaningful learning. The work brings together the scientific aspect, such as the study of LEDs, with techniques for the generation and transfer of knowledge through the selection, hierarchization and structuring of information using concept maps. To validate the didactic materials developed, procedures with various assessment tools were used for the collection and processing of data, applied in the context of an experimental design. Additionally, a statistical analysis was performed to determine the validity of the materials developed. The assessment was designed to validate the contributions of the new materials over the traditional method of teaching, and to quantify the learning achieved by students, in order to draw conclusions that serve as a reference for their application in teaching and learning processes and to comprehensively validate the work carried out.

  14. Inter-laboratory validation of an inexpensive streamlined method to measure inorganic arsenic in rice grain.

    PubMed

    Chaney, Rufus L; Green, Carrie E; Lehotay, Steven J

    2018-05-04

With the establishment by CODEX of a 200 ng/g limit of inorganic arsenic (iAs) in polished rice grain, more analyses of iAs will be necessary to ensure compliance in regulatory and trade applications, to assess quality control in commercial rice production, and to conduct research involving iAs in rice crops. Although analytical methods using high-performance liquid chromatography-inductively coupled plasma-mass spectrometry (HPLC-ICP-MS) have been demonstrated for full speciation of As, this expensive and time-consuming approach is excessive when regulations are based only on iAs. We report a streamlined sample preparation and analysis of iAs in powdered rice based on heated extraction with 0.28 M HNO3 followed by hydride generation (HG) under control of acidity and other simple conditions. Analysis of iAs is then conducted using flow-injection HG and inexpensive ICP-atomic emission spectroscopy (AES) or other detection means. A key innovation compared with previous methods was to increase the acidity of the reagent solution with 4 M HCl (prior to reduction of As5+ to As3+), which minimized interferences from dimethylarsinic acid. An inter-laboratory method validation was conducted among 12 laboratories worldwide in the analysis of six shared blind duplicates and a NIST Standard Reference Material involving different types of rice and iAs levels. Also, four laboratories used the standard HPLC-ICP-MS method to analyze the samples. The results between the methods were not significantly different, and the Horwitz ratio averaged 0.52 for the new method, which meets official method validation criteria. Thus, the simpler, more versatile, and less expensive method may be used by laboratories for several purposes to accurately determine iAs in rice grain. Graphical abstract: Comparison of iAs results from new and FDA methods.
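The Horwitz ratio (HorRat) cited above is the observed among-laboratory relative standard deviation divided by the RSD predicted by the Horwitz equation, PRSD(%) = 2^(1 - 0.5·log10 C), where C is the analyte concentration as a mass fraction; values near or below 1 indicate acceptable reproducibility. A minimal sketch, with an illustrative observed RSD (the study's raw RSD values are not given in the abstract):

```python
import math

def horwitz_prsd(mass_fraction):
    """Predicted reproducibility RSD (%) from the Horwitz equation."""
    return 2 ** (1 - 0.5 * math.log10(mass_fraction))

def horrat(observed_rsd_percent, mass_fraction):
    """Horwitz ratio: observed RSD over Horwitz-predicted RSD."""
    return observed_rsd_percent / horwitz_prsd(mass_fraction)

# iAs at the CODEX limit of 200 ng/g = 2e-7 as a mass fraction:
predicted = horwitz_prsd(2e-7)      # about 20% predicted RSD
ratio = horrat(10.5, 2e-7)          # hypothetical 10.5% observed RSD
```

On this scale, the reported average HorRat of 0.52 means the inter-laboratory spread was roughly half of what the Horwitz equation predicts at these concentrations.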

  15. Validation of a Laboratory Method for Evaluating Dynamic Properties of Reconstructed Equine Racetrack Surfaces

    PubMed Central

    Setterbo, Jacob J.; Chau, Anh; Fyhrie, Patricia B.; Hubbard, Mont; Upadhyaya, Shrini K.; Symons, Jennifer E.; Stover, Susan M.

    2012-01-01

    Background Racetrack surface is a risk factor for racehorse injuries and fatalities. Current research indicates that race surface mechanical properties may be influenced by material composition, moisture content, temperature, and maintenance. Race surface mechanical testing in a controlled laboratory setting would allow for objective evaluation of dynamic properties of surface and factors that affect surface behavior. Objective To develop a method for reconstruction of race surfaces in the laboratory and validate the method by comparison with racetrack measurements of dynamic surface properties. Methods Track-testing device (TTD) impact tests were conducted to simulate equine hoof impact on dirt and synthetic race surfaces; tests were performed both in situ (racetrack) and using laboratory reconstructions of harvested surface materials. Clegg Hammer in situ measurements were used to guide surface reconstruction in the laboratory. Dynamic surface properties were compared between in situ and laboratory settings. Relationships between racetrack TTD and Clegg Hammer measurements were analyzed using stepwise multiple linear regression. Results Most dynamic surface property setting differences (racetrack-laboratory) were small relative to surface material type differences (dirt-synthetic). Clegg Hammer measurements were more strongly correlated with TTD measurements on the synthetic surface than the dirt surface. On the dirt surface, Clegg Hammer decelerations were negatively correlated with TTD forces. Conclusions Laboratory reconstruction of racetrack surfaces guided by Clegg Hammer measurements yielded TTD impact measurements similar to in situ values. The negative correlation between TTD and Clegg Hammer measurements confirms the importance of instrument mass when drawing conclusions from testing results. Lighter impact devices may be less appropriate for assessing dynamic surface properties compared to testing equipment designed to simulate hoof impact (TTD

  16. In silico target prediction for elucidating the mode of action of herbicides including prospective validation.

    PubMed

    Chiddarwar, Rucha K; Rohrer, Sebastian G; Wolf, Antje; Tresch, Stefan; Wollenhaupt, Sabrina; Bender, Andreas

    2017-01-01

The rapid emergence of pesticide resistance has given rise to a demand for herbicides with new modes of action (MoA). In the agrochemical sector, with the availability of experimental high-throughput screening (HTS) data, it is now possible to utilize in silico target prediction methods in the early discovery phase to suggest the MoA of a compound via data mining of bioactivity data. While this approach has been established in the pharmaceutical context, in the agrochemical area it poses rather different challenges, as we have found in this work, partially due to different chemistry, but even more so due to different (usually smaller) amounts of data and different ways of conducting HTS. With the aim of applying computational methods to facilitate herbicide target identification, 48,000 bioactivity data points against 16 herbicide targets were processed to train Laplacian-modified Naïve Bayesian (NB) classification models. The herbicide target prediction model ("HerbiMod") is an ensemble of 16 binary classification models which were evaluated on internal, external and prospective validation sets. In addition to the experimental inactives, 10,000 random agrochemical inactives were included in the training process, which was shown to improve the overall balanced accuracy of our models by up to 40%. For all the models, performance in terms of balanced accuracy of ≥80% was achieved in five-fold cross-validation. Ranking target predictions was addressed by means of z-scores, which improved predictivity over using raw scores alone. An external test set of 247 compounds from ChEMBL and a prospective test set of 394 compounds from BASF SE tested against five well-studied herbicide targets (ACC, ALS, HPPD, PDS and PROTOX) were used for further validation. Only 4% of the compounds in the external test set lay within the applicability domain, and extrapolation (and correct prediction) was hence impossible, which on one hand was surprising, and on the other hand illustrated the utilization of
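One simple way to picture the z-score ranking step mentioned above is to standardize a compound's raw per-target model scores so that scores from different models become comparable before ranking. This is an illustrative simplification (the paper standardizes against each model's own score distribution); the score values and dictionary layout are invented.

```python
import math

def z_scores(raw_scores):
    """Standardize a dict of target -> raw model score to z-scores."""
    values = list(raw_scores.values())
    mean = sum(values) / len(values)
    sd = math.sqrt(sum((v - mean) ** 2 for v in values) / len(values))
    return {t: (v - mean) / sd for t, v in raw_scores.items()}

# Hypothetical raw NB scores for one compound against five targets:
raw = {"ALS": 12.4, "ACC": 3.1, "HPPD": 5.0, "PDS": 2.2, "PROTOX": 4.8}
ranked = sorted(z_scores(raw).items(), key=lambda kv: kv[1], reverse=True)
# The top-ranked target is the leading MoA hypothesis for the compound.
```

Standardization matters because raw NB scores from independently trained models live on different scales; comparing them directly would systematically favor some targets.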

  17. Faster experimental validation of microRNA targets using cold fusion cloning and a dual firefly-Renilla luciferase reporter assay.

    PubMed

    Alvarez, M Lucrecia

    2014-01-01

    Different target prediction algorithms have been developed to provide a list of candidate target genes for a given animal microRNA (miRNA). However, these computational approaches produce both false-positive and false-negative predictions. Therefore, the target genes of a specific miRNA identified in silico should be experimentally validated. In this chapter, we describe a step-by-step protocol for the experimental validation of a direct miRNA target using a fast dual firefly-Renilla luciferase reporter assay. We describe how to construct reporter plasmids using the simple, fast, and highly efficient cold fusion cloning technology, which does not require ligase, phosphatase, or restriction enzymes. In addition, we provide a protocol for co-transfection of reporter plasmids with either miRNA mimics or miRNA inhibitors in human embryonic kidney 293 (HEK293) cells, as well as a description of how to measure firefly and Renilla luciferase activity using the Dual-Glo Luciferase Assay kit. As an example of the use of this technology, we validate glucose-6-phosphate dehydrogenase (G6PD) as a direct target of miR-1207-5p.

  18. Validating An Analytic Completeness Model for Kepler Target Stars Based on Flux-level Transit Injection Experiments

    NASA Astrophysics Data System (ADS)

    Catanzarite, Joseph; Burke, Christopher J.; Li, Jie; Seader, Shawn; Haas, Michael R.; Batalha, Natalie; Henze, Christopher; Christiansen, Jessie; Kepler Project, NASA Advanced Supercomputing Division

    2016-06-01

    The Kepler Mission is developing an Analytic Completeness Model (ACM) to estimate detection completeness contours as a function of exoplanet radius and period for each target star. Accurate completeness contours are necessary for robust estimation of exoplanet occurrence rates. The main components of the ACM for a target star are: detection efficiency as a function of SNR, the window function (WF), and the one-sigma depth function (OSDF) (Ref. Burke et al. 2015). The WF captures the falloff in transit detection probability at long periods that is determined by the observation window (the duration over which the target star has been observed). The OSDF is the transit depth (in parts per million) that yields an SNR of unity for the full transit train. It is a function of period, and accounts for the time-varying properties of the noise and for missing or deweighted data. We are performing flux-level transit injection (FLTI) experiments on selected Kepler target stars with the goal of refining and validating the ACM. “Flux-level” injection machinery inserts exoplanet transit signatures directly into the flux time series, as opposed to “pixel-level” injection, which inserts transit signatures into the individual pixels using the pixel response function. See Jie Li's poster: ID #2493668, "Flux-level transit injection experiments with the NASA Pleiades Supercomputer" for details, including performance statistics. Since FLTI is affordable for only a small subset of the Kepler targets, the ACM is designed to apply to most Kepler target stars. We validate this model using “deep” FLTI experiments, with ~500,000 injection realizations on each of a small number of targets, and “shallow” FLTI experiments, with ~2,000 injection realizations on each of many targets. From the results of these experiments, we identify anomalous targets, model their behavior, and refine the ACM accordingly. In this presentation, we discuss progress in validating and refining the ACM, and we

  19. A combined pre-clinical meta-analysis and randomized confirmatory trial approach to improve data validity for therapeutic target validation.

    PubMed

    Kleikers, Pamela W M; Hooijmans, Carlijn; Göb, Eva; Langhauser, Friederike; Rewell, Sarah S J; Radermacher, Kim; Ritskes-Hoitinga, Merel; Howells, David W; Kleinschnitz, Christoph; Schmidt, Harald H H W

    2015-08-27

    Biomedical research suffers from dramatically poor translational success. For example, in ischemic stroke, a condition with a high medical need, over a thousand experimental drug targets have been unsuccessful. Here, we adopt methods from clinical research for a late-stage pre-clinical meta-analysis (MA) and randomized confirmatory trial (pRCT) approach. A profound body of literature suggests NOX2 to be a major therapeutic target in stroke. Systematic review and MA of all available NOX2(-/y) studies revealed a positive publication bias and a lack of statistical power to detect a relevant reduction in infarct size. A fully powered multi-center pRCT rejects NOX2 as a target to improve neurofunctional outcomes or achieve a translationally relevant infarct size reduction. Thus, stringent statistical thresholds, reporting of negative data, and an MA-pRCT approach can ensure biomedical data validity and overcome risks of bias.

  20. Simultaneous Determination of 10 Ultratrace Elements in Infant Formula, Adult Nutritionals, and Milk Products by ICP/MS After Pressure Digestion: Single-Laboratory Validation.

    PubMed

    Dubascoux, Stephane; Nicolas, Marine; Rime, Celine Fragniere; Payot, Janique Richoz; Poitevin, Eric

    2015-01-01

    A single-laboratory validation (SLV) is presented for the simultaneous determination of 10 ultratrace elements (UTEs), including aluminum (Al), arsenic (As), cadmium (Cd), cobalt (Co), chromium (Cr), mercury (Hg), molybdenum (Mo), lead (Pb), selenium (Se), and tin (Sn), in infant formulas, adult nutritionals, and milk-based products by inductively coupled plasma (ICP)/MS after acidic pressure digestion. This robust, routine multielemental method is based on several official methods, with modifications to sample preparation (using either microwave digestion or high-pressure ashing) and to analytical conditions (using ICP/MS with collision cell technology). This SLV fulfills AOAC method performance criteria in terms of linearity, specificity, sensitivity, precision, and accuracy, and fully meets most international regulatory limits for trace contaminants and/or recommended nutrient levels established for the 10 UTEs in the targeted matrixes.

  1. The accomplishments of lithium target and test facility validation activities in the IFMIF/EVEDA phase

    NASA Astrophysics Data System (ADS)

    Arbeiter, Frederik; Baluc, Nadine; Favuzza, Paolo; Gröschel, Friedrich; Heidinger, Roland; Ibarra, Angel; Knaster, Juan; Kanemura, Takuji; Kondo, Hiroo; Massaut, Vincent; Saverio Nitti, Francesco; Miccichè, Gioacchino; O'hira, Shigeru; Rapisarda, David; Sugimoto, Masayoshi; Wakai, Eiichi; Yokomine, Takehiko

    2018-01-01

    As part of the engineering validation and engineering design activities (EVEDA) phase for the international fusion materials irradiation facility IFMIF, major elements of a lithium target facility and the test facility were designed, prototyped and validated. For the lithium target facility, the EVEDA lithium test loop was built at JAEA and used to test the stability (waves and long-term behavior) of the lithium flow in the target, work out the startup procedures, and test lithium purification and analysis. Experiments in the Lifus 6 plant at ENEA confirmed that lithium corrosion of ferritic-martensitic steels is acceptably low. Furthermore, complex remote handling procedures for the remote maintenance of the target in the test cell environment were successfully practiced. For the test facility, two variants of a high flux test module were prototyped and tested in helium loops, demonstrating their capability to maintain the material specimens at the desired temperature with a low temperature spread. Irradiation tests were performed for heated specimen capsules and irradiation instrumentation in the BR2 reactor at SCK-CEN. The small specimen test technique, essential for obtaining material test results from a limited irradiation volume, was advanced by evaluating the influence of specimen shape and test technique.

  2. PIG's Speed Estimated with Pressure Transducers and Hall Effect Sensor: An Industrial Application of Sensors to Validate a Testing Laboratory.

    PubMed

    Lima, Gustavo F; Freitas, Victor C G; Araújo, Renan P; Maitelli, André L; Salazar, Andrés O

    2017-09-15

    Pipeline inspection using a device called a Pipeline Inspection Gauge (PIG) is safe and reliable when the PIG travels at low speeds during inspection. We built a Testing Laboratory, containing a testing loop and supervisory system, to study speed control techniques for PIGs. The objective of this work is to present and validate the Testing Laboratory, which will allow development of a speed controller for PIGs and solve an existing problem in the oil industry. The experimental methodology used throughout the project is also presented. We installed pressure transducers on the pipeline's outer walls to detect the PIG's movement and, with data from the supervisory system, calculated an average speed of 0.43 m/s. At the same time, the electronic board inside the PIG received data from the odometer and calculated an average speed of 0.45 m/s. We found an error of 4.44%, which is experimentally acceptable. The results showed that it is possible to successfully build a Testing Laboratory to detect the PIG's passage and estimate its speed. The validation of the Testing Laboratory using data from the odometer and its auxiliary electronics was very successful. Lastly, we hope to develop further oil-industry research using this Testing Laboratory.
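    The cross-check reported above, comparing the transducer-derived average speed against the odometer-derived one, reduces to a relative-error calculation. A minimal sketch (the choice of the odometer reading as the reference is an assumption that reproduces the 4.44% figure):

```python
# Relative error between two independent estimates of the PIG's average speed.
def relative_error(reference, measured):
    """Relative error (%) of `measured` with respect to `reference`."""
    return abs(measured - reference) / reference * 100.0

v_odometer = 0.45     # m/s, from the PIG's onboard odometer (taken as reference)
v_transducers = 0.43  # m/s, from pressure-transducer passage times

err = relative_error(v_odometer, v_transducers)
print(f"relative error: {err:.2f}%")  # prints "relative error: 4.44%"
```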

  3. In silico Analysis of Toxins of Staphylococcus aureus for Validating Putative Drug Targets.

    PubMed

    Mohana, Ramadevi; Venugopal, Subhashree

    2017-01-01

    Toxins are among the numerous virulence factors produced by bacteria. These powerful poisonous substances enable the bacteria to counter the defense mechanisms of the human body. The pathogenic system of Staphylococcus aureus has evolved various exotoxins that cause detrimental effects on the human immune system. Four toxins, namely enterotoxin A, exfoliative toxin A, TSST-1 and γ-hemolysin, were downloaded from the Uniprot database and analyzed to understand the nature of the toxins and to validate them as drug targets. The results showed that the toxins interact with many protein partners and have no homologous sequences in the human proteome; based on a similarity search in DrugBank, the targets were identified as novel drug targets. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  4. Inter-laboratory validation of bioaccessibility testing for metals.

    PubMed

    Henderson, Rayetta G; Verougstraete, Violaine; Anderson, Kim; Arbildua, José J; Brock, Thomas O; Brouwers, Tony; Cappellini, Danielle; Delbeke, Katrien; Herting, Gunilla; Hixon, Greg; Odnevall Wallinder, Inger; Rodriguez, Patricio H; Van Assche, Frank; Wilrich, Peter; Oller, Adriana R

    2014-10-01

    Bioelution assays are fast, simple alternatives to in vivo testing. In this study, the intra- and inter-laboratory variability in bioaccessibility data generated by bioelution tests was evaluated in synthetic fluids relevant to oral, inhalation, and dermal exposure. Using one defined protocol, five laboratories measured metal release from cobalt oxide, cobalt powder, copper concentrate, Inconel alloy, leaded brass alloy, and nickel sulfate hexahydrate. Standard deviations of repeatability (sr) and reproducibility (sR) were used to evaluate the intra- and inter-laboratory variability, respectively. Examination of the sR:sr ratios demonstrated that, while gastric and lysosomal fluids had reasonably good reproducibility, other fluids did not show as good a concordance between laboratories. Relative standard deviation (RSD) analysis showed more favorable reproducibility outcomes for some data sets; overall, results varied more between than within laboratories. RSD analysis of sr showed good within-laboratory variability for all conditions except some metals in interstitial fluid. In general, these findings indicate that absolute bioaccessibility results in some biological fluids may vary between laboratories. However, for most applications, measures of relative bioaccessibility are needed, diminishing the requirement for high inter-laboratory reproducibility in absolute metal releases. The inter-laboratory exercise suggests that the degrees of freedom within the protocol need to be addressed. Copyright © 2014 Elsevier Inc. All rights reserved.
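    The repeatability and reproducibility statistics named above (sr, sR, and their ratio) can be sketched for a balanced design in the ISO 5725 style. This is a hedged illustration with invented measurements, not the study's actual data or analysis code: sr pools the within-laboratory variance, and sR adds the between-laboratory component.

```python
# Sketch of repeatability (s_r) and reproducibility (s_R) standard deviations
# for a balanced inter-laboratory design. Measurements are made up.
from math import sqrt
from statistics import mean, variance

# Hypothetical released-metal results: n replicates in each of p laboratories,
# same material and same synthetic fluid throughout.
labs = [
    [10.1, 10.4, 10.2],  # lab 1
    [11.0, 10.8, 11.1],  # lab 2
    [9.8, 10.0, 9.9],    # lab 3
]
n = len(labs[0])                     # replicates per laboratory
lab_means = [mean(x) for x in labs]

s_r2 = mean(variance(x) for x in labs)           # pooled within-lab variance
s_L2 = max(variance(lab_means) - s_r2 / n, 0.0)  # between-lab variance component
s_R2 = s_r2 + s_L2                               # reproducibility variance

s_r, s_R = sqrt(s_r2), sqrt(s_R2)
print(f"s_r = {s_r:.3f}, s_R = {s_R:.3f}, s_R:s_r = {s_R / s_r:.2f}")
```

A ratio close to 1 indicates that laboratories agree about as well with each other as each does with itself; large ratios flag fluids or materials with poor between-laboratory concordance.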

  5. Request of laboratory liver tests in primary care in Spain: potential savings if appropriateness indicator targets were achieved.

    PubMed

    Salinas, Maria; López-Garrigós, Maite; Flores, Emilio; Uris, Joaquín; Leiva-Salinas, Carlos

    2015-10-01

    Liver laboratory tests are used to screen for liver disease, suggest the underlying cause, estimate severity, assess prognosis, and monitor the efficacy of therapy. The aim of this study was to compare the liver test requesting patterns of GPs in Spain, according to geographic and hospital characteristics, to investigate the degree of requesting appropriateness. One hundred and forty-one clinical laboratories from diverse regions across Spain were invited to participate. They reported the number of laboratory liver tests requested by GPs for the year 2012. Two types of appropriateness indicators were calculated: each test's requests per 1000 inhabitants, and ratios of related test requests. The indicator results were compared between the different hospitals, according to their setting, location, and management. The savings that would have been generated if each area had achieved the indicator targets were calculated. We recruited 76 laboratories covering a population of 17,679,195 inhabitants. GPs requested 20,916,780 laboratory liver tests in 2012. No differences were obtained according to setting. Lactate dehydrogenase and direct bilirubin requests per 1000 inhabitants were significantly higher in institutions with private management. The largest differences were observed between communities. Nine, 31, 0, and 13 laboratories, respectively, achieved the aspartate aminotransferase, lactate dehydrogenase, γ-glutamyl transpeptidase, and total bilirubin to alanine aminotransferase ratio indicator targets. Reaching the ratio targets would have resulted in savings of €1,028,468. There was high variability in the requesting of liver tests. This emphasizes the need to implement interventions to improve the appropriate use of liver tests.
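    The two indicator types described above are simple arithmetic, sketched below with illustrative figures that are not from the study: a request rate per 1000 covered inhabitants, and a ratio of a secondary test's requests to those of a reference test (e.g. lactate dehydrogenase per alanine aminotransferase).

```python
# Sketch of the two appropriateness indicators; all figures are invented.
def per_1000(requests, population):
    """Test requests per 1000 covered inhabitants."""
    return requests * 1000.0 / population

def related_ratio(test_requests, reference_requests):
    """Requests of one test per request of a related reference test."""
    return test_requests / reference_requests

rate = per_1000(120_000, 400_000)       # e.g. ALT requests in a catchment area
ratio = related_ratio(30_000, 120_000)  # e.g. LDH requests per ALT request
print(rate, ratio)  # prints "300.0 0.25"
```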

  6. The laboratory demonstration and signal processing of the inverse synthetic aperture imaging ladar

    NASA Astrophysics Data System (ADS)

    Gao, Si; Zhang, ZengHui; Xu, XianWen; Yu, WenXian

    2017-10-01

    This paper presents a coherent inverse synthetic aperture imaging ladar (ISAL) system for obtaining high-resolution images. A balanced coherent optical system was built in the laboratory with a binary phase-coded modulation transmit waveform, which differs from the conventional chirp. A complete digital signal processing solution is proposed, including both the quality phase gradient autofocus (QPGA) algorithm and the cubic phase function (CPF) algorithm. High-resolution, well-focused ISAL images of retro-reflecting targets are shown to validate the concepts. It is shown that high-resolution images can be achieved and that the influence of platform vibrations, involving both targets and radar, can be automatically compensated by the distinctive laboratory system and digital signal processing.

  7. Panorama: A Targeted Proteomics Knowledge Base

    PubMed Central

    2015-01-01

    Panorama is a web application for storing, sharing, analyzing, and reusing targeted assays created and refined with Skyline, an increasingly popular Windows client software tool for targeted proteomics experiments. Panorama allows laboratories to store and organize curated results contained in Skyline documents with fine-grained permissions, which facilitates distributed collaboration and secure sharing of published and unpublished data via a web-browser interface. It is fully integrated with the Skyline workflow and supports publishing a document directly to a Panorama server from the Skyline user interface. Panorama captures the complete Skyline document information content in a relational database schema. Curated results published to Panorama can be aggregated and exported as chromatogram libraries. These libraries can be used in Skyline to pick optimal targets in new experiments and to validate peak identification of target peptides. Panorama is open-source and freely available. It is distributed as part of LabKey Server, an open source biomedical research data management system. Laboratories and organizations can set up Panorama locally by downloading and installing the software on their own servers. They can also request freely hosted projects on https://panoramaweb.org, a Panorama server maintained by the Department of Genome Sciences at the University of Washington. PMID:25102069

  8. Validation of the RAGE Hydrocode for Impacts into Volatile-Rich Targets

    NASA Astrophysics Data System (ADS)

    Plesko, C. S.; Asphaug, E.; Coker, R. F.; Wohletz, K. H.; Korycansky, D. G.; Gisler, G. R.

    2007-12-01

    In preparation for a detailed study of large-scale impacts into the Martian surface, we have validated the RAGE hydrocode (Gittings et al., in press, CSD) against a suite of experiments and statistical models. We present comparisons of hydrocode models to centimeter-scale gas gun impacts (Nakazawa et al. 2002), an underground nuclear test (Perret, 1971), and crater scaling laws (Holsapple 1993, O'Keefe and Ahrens 1993). We have also conducted model convergence and uncertainty analyses which will be presented. Results to date are encouraging for our current model goals, and indicate areas where the hydrocode may be extended in the future. This validation work is focused on questions related to the specific problem of large impacts into volatile-rich targets. The overall goal of this effort is to be able to realistically model large-scale Noachian, and possibly post- Noachian, impacts on Mars not so much to model the crater morphology as to understand the evolution of target volatiles in the post-impact regime, to explore how large craters might set the stage for post-impact hydro- geologic evolution both locally (in the crater subsurface) and globally, due to the redistribution of volatiles from the surface and subsurface into the atmosphere. This work is performed under the auspices of IGPP and the DOE at LANL under contracts W-7405-ENG-36 and DE-AC52-06NA25396. Effort by DK and EA is sponsored by NASA's Mars Fundamental Research Program.

  9. OCO-2 Observation and Validation Overview: Observations Data Modes and Target Observations, Taken During the First 15 Months of Operations

    NASA Astrophysics Data System (ADS)

    Osterman, G. B.; Fisher, B.; Wunch, D.; Eldering, A.; Wennberg, P. O.; Roehl, C. M.; Naylor, B. J.; Lee, R.; Pollock, R.; Gunson, M. R.

    2015-12-01

    The OCO-2 instrument was successfully launched on July 2, 2014 from Vandenberg Air Force Base in California. The instrument reached its observational orbit about three weeks later. The spacecraft is at the head of the A-train satellites and began collecting operational data on Sept 5, 2014. OCO-2 makes measurements in three modes: nadir, glint and target. Target observations are designed to provide large amounts of data in a small area near a ground validation site. The instruments of the Total Carbon Column Observing Network (TCCON) provide the ground validation data for the OCO-2 XCO2 observations, and comparisons to TCCON form the basis of the OCO-2 validation plan. There are now 27 locations at which OCO-2 can perform target observations, and TCCON sites make up 23 of those possible target locations. For its first year in orbit, OCO-2 operated in nadir mode for 16 days and then in glint mode for 16 days; each 16-day cycle spans 233 orbits. On July 1, 2015, OCO-2 changed to an observational mode of alternating nadir and glint measurements on an orbit-by-orbit basis. By December 2015, this operational mode may be modified such that orbits measuring only over ocean will always be observed in glint mode. In this presentation we will provide information on the observations made by OCO-2 during its first 15 months of operations. We will show maps of the OCO-2 ground tracks and XCO2 data, calendars illustrating the observational schedule, and statistics on the target observations taken. We will provide more information on what is involved in making target observations and how they affect the standard operational data acquisition patterns. Changes to the standard observational patterns of OCO-2 and to the list of locations for target observations will be discussed as well.
    We will also provide an overview of some of the validation-related analysis being done using nadir and glint mode OCO-2 data, in addition to an overview of validation analyses that do not directly utilize TCCON.

  10. Toddler physical activity study: laboratory and community studies to evaluate accelerometer validity and correlates.

    PubMed

    Hager, Erin R; Gormley, Candice E; Latta, Laura W; Treuth, Margarita S; Caulfield, Laura E; Black, Maureen M

    2016-09-06

    Toddlerhood is an important age for physical activity (PA) promotion to prevent obesity and support a physically active lifestyle throughout childhood. Accurate assessment of PA is needed to determine trends/correlates of PA, time spent in sedentary, light, or moderate-to-vigorous PA (MVPA), and the effectiveness of PA promotion programs. Due to the limited availability of objective measures that have been validated and evaluated for feasibility in community studies, it is unclear which subgroups of toddlers are at the highest risk for inactivity. Using Actical ankle accelerometry, the objectives of this study were to develop valid thresholds, examine feasibility, and examine demographic/anthropometric correlates of MVPA among toddlers from low-income families. Two studies were conducted with toddlers (12-36 months). Laboratory Study (n = 24): Two Actical accelerometers were placed on the ankle. PA was observed using the Child Activity Rating Scale (CARS, prescribed activities). Analyses included device equivalence reliability (correlation between the activity counts of the two Acticals), criterion-related validity (correlation between activity counts and CARS ratings), and sensitivity/specificity for thresholds. Community Study (n = 277 low-income mother-toddler dyads recruited): An Actical was worn on the ankle for >7 days (goal: >5 24-h days). Height/weight was measured. Mothers reported demographics. Analyses included frequencies (feasibility) and stepwise multiple linear regression (sMLR). Laboratory Study: Acticals demonstrated reliability (r = 0.980) and validity (r = 0.75). Thresholds demonstrated sensitivity (86%) and specificity (88%). Community Study: 86% wore the accelerometer; 69% had valid data (mean = 5.2 days). Primary reasons for missing/invalid data were refusal (14%) and wear-time ≤2 days (11%). The MVPA threshold (>2200 cpm) yielded 54 min/day. In sMLR, MVPA was associated with age (older > younger, β = 32.8, p < 0
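    The sensitivity/specificity evaluation of an activity-count threshold described above can be sketched as follows. This is a generic illustration with invented counts and criterion labels, not the study's data: each minute's accelerometer count is compared against the cut-point, and the classification is scored against the criterion measure (here, a made-up binary active/inactive label derived from direct observation).

```python
# Sensitivity and specificity of a counts-per-minute threshold against
# criterion labels. All data below are invented for illustration.
def sens_spec(counts, labels, threshold):
    """Sensitivity/specificity of the rule `count > threshold` vs. labels."""
    tp = sum(c > threshold and lab for c, lab in zip(counts, labels))
    fn = sum(c <= threshold and lab for c, lab in zip(counts, labels))
    tn = sum(c <= threshold and not lab for c, lab in zip(counts, labels))
    fp = sum(c > threshold and not lab for c, lab in zip(counts, labels))
    return tp / (tp + fn), tn / (tn + fp)

counts = [2500, 1800, 3100, 900, 2300, 1500]      # activity counts per minute
labels = [True, False, True, False, True, False]  # criterion: MVPA yes/no
sens, spec = sens_spec(counts, labels, 2200)      # 2200 cpm cut-point, as above
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}")
```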

  11. Validation of Autonomic and Endocrine Reactivity to a Laboratory Stressor in Young Children

    PubMed Central

    Roos, Leslie E.; Giuliano, Ryan J.; Beauchamp, Kathryn G.; Gunnar, Megan; Amidon, Brigette; Fisher, Philip A.

    2017-01-01

    The validation of laboratory paradigms that reliably induce a stress response [including hypothalamic-pituitary-adrenal (HPA) axis and autonomic nervous system (ANS) activation], is critical for understanding how children’s stress-response systems support emotional and cognitive function. Early childhood research to date is markedly limited, given the difficulty in establishing paradigms that reliably induce a cortisol response. Furthermore, research to date has not included a control condition or examined concurrent ANS reactivity. We addressed these limitations by characterizing the extent to which a modified matching task stressor paradigm induces HPA and ANS activation, beyond a closely matched control condition. Modifications include an unfamiliar and unfriendly assessor to increase the stressful nature of the task. Results validate the matching task as a laboratory stressor, with significant differences in HPA and ANS responsivity between conditions. The Stressor group exhibited a cortisol increase post-stressor, while the Control group was stable over time. Children in both conditions exhibited reduced parasympathetic activity to the first-half of the task, but in the second-half, only children in the Stressor condition, who were experiencing exaggerated signals of failure, exhibited further parasympathetic decline. The Stressor condition induced higher sympathetic activity (versus Control) throughout the task, with exaggerated second-half differences. Within the Stressor condition, responsivity was convergent across systems, with greater cortisol reactivity correlated with the magnitude of parasympathetic withdrawal and sympathetic engagement. Future research employing the matching task will facilitate understanding the role of HPA and ANS function in development. PMID:28024268

  12. Neuroinflammatory targets and treatments for epilepsy validated in experimental models.

    PubMed

    Aronica, Eleonora; Bauer, Sebastian; Bozzi, Yuri; Caleo, Matteo; Dingledine, Raymond; Gorter, Jan A; Henshall, David C; Kaufer, Daniela; Koh, Sookyong; Löscher, Wolfgang; Louboutin, Jean-Pierre; Mishto, Michele; Norwood, Braxton A; Palma, Eleonora; Poulter, Michael O; Terrone, Gaetano; Vezzani, Annamaria; Kaminski, Rafal M

    2017-07-01

    A large body of evidence that has accumulated over the past decade strongly supports the role of inflammation in the pathophysiology of human epilepsy. Specific inflammatory molecules and pathways have been identified that influence various pathologic outcomes in different experimental models of epilepsy. Most importantly, the same inflammatory pathways have also been found in surgically resected brain tissue from patients with treatment-resistant epilepsy. New antiseizure therapies may be derived from these novel potential targets. An essential and crucial question is whether targeting these molecules and pathways may result in anti-ictogenesis, antiepileptogenesis, and/or disease-modification effects. Therefore, preclinical testing in models mimicking relevant aspects of epileptogenesis is needed to guide integrated experimental and clinical trial designs. We discuss the most recent preclinical proof-of-concept studies validating a number of therapeutic approaches against inflammatory mechanisms in animal models that could represent novel avenues for drug development in epilepsy. Finally, we suggest future directions to accelerate preclinical to clinical translation of these recent discoveries. Wiley Periodicals, Inc. © 2017 International League Against Epilepsy.

  13. In Silico Prediction and Validation of Gfap as an miR-3099 Target in Mouse Brain.

    PubMed

    Abidin, Shahidee Zainal; Leong, Jia-Wen; Mahmoudi, Marzieh; Nordin, Norshariza; Abdullah, Syahril; Cheah, Pike-See; Ling, King-Hwa

    2017-08-01

    MicroRNAs are small non-coding RNAs that play crucial roles in the regulation of gene expression and protein synthesis during brain development. MiR-3099 is highly expressed throughout embryogenesis, especially in the developing central nervous system. Moreover, miR-3099 is also expressed at a higher level in differentiating neurons in vitro, suggesting that it is a potential regulator of neuronal cell development. This study aimed to predict the target genes of miR-3099 via in silico analysis using four independent prediction algorithms (miRDB, miRanda, TargetScan, and DIANA-microT-CDS), with emphasis on target genes related to brain development and function. Based on the analysis, a total of 3,174 miR-3099 target genes were predicted. Those predicted by at least three algorithms (324 genes) were subjected to DAVID bioinformatics analysis to understand their overall functional themes and representation. The analysis revealed that nearly 70% of the target genes are expressed in the nervous system and that a significant proportion are associated with transcriptional regulation and protein ubiquitination mechanisms. Comparison of in situ hybridization (ISH) expression patterns of miR-3099 in both published and in-house-generated ISH sections with the ISH sections of target genes from the Allen Brain Atlas identified seven target genes (Dnmt3a, Gabpa, Gfap, Itga4, Lxn, Smad7, and Tbx18) with expression patterns complementary to miR-3099 in developing and adult mouse brain samples. Of these, we validated Gfap as a direct downstream target of miR-3099 using the luciferase reporter gene system. In conclusion, we report the successful prediction and validation of Gfap as an miR-3099 target gene using a combination of bioinformatics resources with enrichment of annotations based on functional ontologies and a spatio-temporal expression dataset.
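    The consensus-filtering step described above, keeping only genes predicted by at least three of the four algorithms, is a simple multi-set intersection. A minimal sketch, with made-up toy gene lists standing in for the real algorithm outputs:

```python
# Consensus target prediction: retain genes predicted by >= 3 of 4 algorithms.
# The gene sets below are illustrative placeholders, not real predictions.
from collections import Counter

predictions = {
    "miRDB":            {"Gfap", "Dnmt3a", "Lxn", "Smad7"},
    "miRanda":          {"Gfap", "Dnmt3a", "Itga4", "Tbx18"},
    "TargetScan":       {"Gfap", "Gabpa", "Dnmt3a", "Smad7"},
    "DIANA-microT-CDS": {"Gfap", "Lxn", "Smad7", "Itga4"},
}

# Count, for each gene, how many algorithms predicted it.
counts = Counter(gene for genes in predictions.values() for gene in genes)
consensus = sorted(gene for gene, c in counts.items() if c >= 3)
print(consensus)  # prints "['Dnmt3a', 'Gfap', 'Smad7']"
```

Requiring agreement across independent algorithms trades recall for precision, which is the point made in the abstract: each individual predictor produces false positives, so only overlapping predictions proceed to experimental validation.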

  14. PIG’s Speed Estimated with Pressure Transducers and Hall Effect Sensor: An Industrial Application of Sensors to Validate a Testing Laboratory

    PubMed Central

    Freitas, Victor C. G.; Araújo, Renan P.; Maitelli, André L.; Salazar, Andrés O.

    2017-01-01

    Pipeline inspection using a device called a Pipeline Inspection Gauge (PIG) is safe and reliable when the PIG travels at low speeds during inspection. We built a Testing Laboratory, containing a testing loop and supervisory system, to study speed control techniques for PIGs. The objective of this work is to present and validate the Testing Laboratory, which will allow development of a speed controller for PIGs and solve an existing problem in the oil industry. The experimental methodology used throughout the project is also presented. We installed pressure transducers on the pipeline’s outer walls to detect the PIG’s movement and, with data from the supervisory system, calculated an average speed of 0.43 m/s. At the same time, the electronic board inside the PIG received data from the odometer and calculated an average speed of 0.45 m/s. We found an error of 4.44%, which is experimentally acceptable. The results showed that it is possible to successfully build a Testing Laboratory to detect the PIG’s passage and estimate its speed. The validation of the Testing Laboratory using data from the odometer and its auxiliary electronics was very successful. Lastly, we hope to develop further oil-industry research using this Testing Laboratory. PMID:28914757

  15. C-GRApH: A Validated Scoring System for Early Stratification of Neurologic Outcome After Out-of-Hospital Cardiac Arrest Treated With Targeted Temperature Management.

    PubMed

    Kiehl, Erich L; Parker, Alex M; Matar, Ralph M; Gottbrecht, Matthew F; Johansen, Michelle C; Adams, Mark P; Griffiths, Lori A; Dunn, Steven P; Bidwell, Katherine L; Menon, Venu; Enfield, Kyle B; Gimple, Lawrence W

    2017-05-20

    Out-of-hospital cardiac arrest (OHCA) results in significant morbidity and mortality, primarily from neurologic injury. Predicting neurologic outcome early post-OHCA remains difficult in patients receiving targeted temperature management. Retrospective analysis was performed on consecutive OHCA patients receiving targeted temperature management (32-34°C) for 24 hours at a tertiary-care center from 2008 to 2012 (development cohort, n=122). The primary outcome was favorable neurologic outcome at hospital discharge, defined as cerebral performance category 1 to 2 (poor, 3-5). Patient demographics, pre-OHCA diagnoses, and initial laboratory studies post-resuscitation were compared between favorable and poor neurologic outcomes, with multivariable logistic regression used to develop a simple scoring system (C-GRApH). The C-GRApH score ranges from 0 to 5 using equally weighted variables: (C) coronary artery disease, known pre-OHCA; (G) glucose ≥200 mg/dL; (R) rhythm of arrest not ventricular tachycardia/fibrillation; (A) age >45; (pH) arterial pH ≤7.0. A validation cohort (n=344) included subsequent patients from the initial site (n=72) and an external quaternary-care health system (n=272) from 2012 to 2014. The c-statistic for predicting neurologic outcome was 0.82 (0.74-0.90, P<0.001) in the development cohort and 0.81 (0.76-0.87, P<0.001) in the validation cohort. When subdivided by C-GRApH score, similar rates of favorable neurologic outcome were seen in both cohorts: 70% each for low (0-1, n=60), 22% versus 19% for medium (2-3, n=307), and 0% versus 2% for high (4-5, n=99) C-GRApH scores in the development and validation cohorts, respectively. C-GRApH stratifies neurologic outcomes following OHCA in patients receiving targeted temperature management (32-34°C) using objective data available at hospital presentation, identifying patient subsets with disproportionately favorable (C-GRApH ≤1) and poor (C-GRApH ≥4) prognoses. © 2017 The Authors
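    Because the C-GRApH score is five equally weighted binary criteria, it can be sketched directly from the definition above. The function name, parameter names, and the example patient are illustrative, not from the paper; the cut-offs are those stated in the abstract.

```python
# Sketch of the C-GRApH score (range 0-5): one point per criterion met.
def c_graph_score(cad_known_pre_ohca, glucose_mg_dl, rhythm_vt_vf,
                  age_years, arterial_ph):
    """Compute C-GRApH from data available at hospital presentation."""
    return sum([
        cad_known_pre_ohca,       # C: coronary artery disease known pre-OHCA
        glucose_mg_dl >= 200,     # G: glucose >= 200 mg/dL
        not rhythm_vt_vf,         # R: arrest rhythm not VT/VF
        age_years > 45,           # A: age > 45
        arterial_ph <= 7.0,       # pH: arterial pH <= 7.0
    ])

# Hypothetical patient: 60 years old, shockable rhythm (VT/VF),
# glucose 250 mg/dL, pH 7.1, no known coronary artery disease.
score = c_graph_score(False, 250, True, 60, 7.1)
print(score)  # prints "2" (points for G and A: medium-risk stratum, 2-3)
```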

  16. An Investigation into the Transportation of Irradiated Uranium/Aluminum Targets from a Foreign Nuclear Reactor to the Chalk River Laboratories Site in Ontario, Canada - 12249

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clough, Malcolm; Jackson, Austin

    2012-07-01

    This investigation required the selection of a suitable cask and development of a device to hold and transport irradiated targets from a foreign nuclear reactor to the Chalk River Laboratories in Ontario, Canada. The main challenge was to design and validate a target holder to protect the irradiated HEU-Al target pencils during transit. Each of the targets was estimated to have an initial decay heat of 118 W prior to transit. As the targets have little thermal mass, the potential for high-temperature damage and possibly melting was high. Thus, the primary design objective was to conceive a target holder to dissipate heat from the targets. Other design requirements included securing the targets during transportation and providing a simple means to load and unload the targets while submerged five metres under water. A unique target holder (patent pending) was designed and manufactured together with special-purpose experimental apparatus, including a representative cask. Aluminum dummy targets were fabricated to accept cartridge heaters, to simulate decay heat. Thermocouples were used to measure the temperature of the test targets and selected areas within the target holder and test cask. After obtaining test results, calculations were performed to compensate for differences between experimental and real-life conditions. Taking this compensation into consideration, the maximum target temperature reached was 231°C, below the designated maximum of 250°C. The design of the aluminum target holder also allowed generous clearance to insert and unload the targets. This clearance was designed to close up as the target holder is placed into the cavity of the transport cask. Springs served to retain and restrain the targets from movement during transportation as well as to facilitate conductive heat transfer. The target holder met the design requirements and as such provided data supporting the feasibility of transporting targets over a relatively long period.

  17. Validating the Goldstein-Wehner Law for the Stratified Positive Column of DC Discharge in an Undergraduate Laboratory

    ERIC Educational Resources Information Center

    Lisovskiy, V. A.; Koval, V. A.; Artushenko, E. P.; Yegorenkov, V. D.

    2012-01-01

    In this paper we suggest a simple technique for validating the Goldstein-Wehner law for a stratified positive column of dc glow discharge while studying the properties of gas discharges in an undergraduate laboratory. To accomplish this a simple device with a pre-vacuum mechanical pump, dc source and gas pressure gauge is required. Experiments may…

  18. Verification and Validation: High Charge and Energy (HZE) Transport Codes and Future Development

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Tripathi, Ram K.; Mertens, Christopher J.; Blattnig, Steve R.; Clowdsley, Martha S.; Cucinotta, Francis A.; Tweed, John; Heinbockel, John H.; Walker, Steven A.; Nealy, John E.

    2005-01-01

    In the present paper, we give the formalism for further developing a fully three-dimensional HZETRN code using marching procedures, and development of a new Green's function code is also discussed. The final Green's function code is capable of validation not only in the space environment but also in ground-based laboratories with directed beams of ions of specific energy, characterized with detailed diagnostic particle spectrometer devices. Special emphasis is given to verification of the computational procedures and validation of the resultant computational model using laboratory and spaceflight measurements. Due to historical requirements, two parallel development paths for computational model implementation, using marching procedures and Green's function techniques, are followed. A new version of the HZETRN code capable of simulating HZE ions with either laboratory or space boundary conditions is under development. Validation of computational models at this time is particularly important for President Bush's Initiative to develop infrastructure for human exploration, with the first target demonstration of the Crew Exploration Vehicle (CEV) in low Earth orbit in 2008.

  19. Validation of the proteasome as a therapeutic target in Plasmodium using an epoxyketone inhibitor with parasite-specific toxicity

    PubMed Central

    Li, Hao; Ponder, Elizabeth L.; Verdoes, Martijn; Asbjornsdottir, Kristijana H.; Deu, Edgar; Edgington, Laura E.; Lee, Jeong Tae; Kirk, Christopher J.; Demo, Susan D.; Williamson, Kim C.; Bogyo, Matthew

    2012-01-01

    The Plasmodium proteasome has been suggested as a potential anti-malarial drug target; however, toxicity of inhibitors has prevented validation of this enzyme in vivo. We report here a screen of a library of 670 analogs of the recently FDA-approved inhibitor carfilzomib to identify compounds that selectively kill parasites. We identified one compound, PR3, that has significant parasite-killing activity in vitro but dramatically reduced toxicity in host cells. We found that this parasite-specific toxicity is not due to selective targeting of the Plasmodium proteasome over the host proteasome, but instead is due to a lack of activity against one of the human proteasome subunits. Subsequently, we used PR3 to significantly reduce parasite load in P. berghei-infected mice without host toxicity, thus validating the proteasome as a viable anti-malarial drug target. PMID:23142757

  20. Validity of a heart rate monitor during work in the laboratory and on the Space Shuttle

    NASA Technical Reports Server (NTRS)

    Moore, A. D. Jr; Lee, S. M.; Greenisen, M. C.; Bishop, P.

    1997-01-01

    Accurate heart rate measurement during work is required for many industrial hygiene and ergonomics situations. The purpose of this investigation was to determine the validity of heart rate measurements obtained by a simple, lightweight, commercially available wrist-worn heart rate monitor (HRM) during work (cycle exercise) sessions conducted in the laboratory and also during the particularly challenging work environment of space flight. Three different comparisons were made. The first compared HRM data to simultaneous electrocardiogram (ECG) recordings of varying heart rates that were generated by an ECG simulator. The second compared HRM data to ECG recordings collected during work sessions of 14 subjects in the laboratory. Finally, ECG downlink and HRM data were compared in four astronauts who performed cycle exercise during space flight. The data were analyzed using regression techniques. The HRM recorded heart rates virtually identical to the ECG recordings for the data set generated by an ECG simulator. The regression equation for the relationship between ECG and HRM heart rate data during work in the laboratory was: ECG HR = 0.99 x (HRM) + 0.82 (r2 = 0.99). Finally, the agreement between ECG downlink data and HRM data during space flight was also very high, with the regression equation being: Downlink ECG HR = 1.05 x (HRM) - 5.71 (r2 = 0.99). The results of this study indicate that the HRM provides accurate data and may be used to reliably obtain valid data regarding heart rate responses during work.
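    The two reported regression equations can be applied directly to convert a wrist-monitor reading into an estimated ECG heart rate. A minimal sketch, with function names of our own invention:

```python
# Applying the regression equations reported in the abstract above
# to estimate ECG heart rate (bpm) from an HRM reading (bpm).
def ecg_hr_lab(hrm_bpm: float) -> float:
    """Laboratory relationship: ECG HR = 0.99 * HRM + 0.82 (r2 = 0.99)."""
    return 0.99 * hrm_bpm + 0.82

def ecg_hr_flight(hrm_bpm: float) -> float:
    """Spaceflight downlink relationship: ECG HR = 1.05 * HRM - 5.71."""
    return 1.05 * hrm_bpm - 5.71
```

    Both fits lie close to the identity line, which is why the authors conclude the HRM is a valid surrogate for ECG-derived heart rate.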

  1. Reverberation Chamber Uniformity Validation and Radiated Susceptibility Test Procedures for the NASA High Intensity Radiated Fields Laboratory

    NASA Technical Reports Server (NTRS)

    Koppen, Sandra V.; Nguyen, Truong X.; Mielnik, John J.

    2010-01-01

    The NASA Langley Research Center's High Intensity Radiated Fields Laboratory has developed a capability based on the RTCA/DO-160F Section 20 guidelines for radiated electromagnetic susceptibility testing in reverberation chambers. Phase 1 of the test procedure utilizes mode-tuned stirrer techniques and E-field probe measurements to validate chamber uniformity, determines chamber loading effects, and defines a radiated susceptibility test process. The test procedure is segmented into numbered operations that are largely software controlled. This document is intended as a laboratory test reference and includes diagrams of test setups, equipment lists, as well as test results and analysis. Phase 2 of development is discussed.

  2. Single laboratory validation of the determination of yohimbine in yohimbe bark and related dietary supplements using UHPLC/UV/MS

    USDA-ARS?s Scientific Manuscript database

    A single laboratory validation has been performed on a practical ultra high-performance liquid chromatography (UHPLC), diode array detection (DAD), and tandem mass spectrometry (MS) method for determination of yohimbine in yohimbe barks and related dietary supplements. Good separation was achieved u...

  3. Determination of total sulfur in fertilizers by high temperature combustion: single-laboratory validation.

    PubMed

    Bernius, Jean; Kraus, Sabine; Hughes, Sandra; Margraf, Dominik; Bartos, James; Newlon, Natalie; Sieper, Hans-Peter

    2014-01-01

    A single-laboratory validation study was conducted for the determination of total sulfur (S) in a variety of common inorganic fertilizers by combustion. The procedure involves conversion of S species into SO2 through combustion at 1150°C, absorption onto and then desorption from a purge-and-trap column, followed by measurement with a thermal conductivity detector. Eleven different validation materials were selected for study, including four commercial fertilizer products, five fertilizers from the Magruder Check Sample Program, one reagent-grade product, and one certified organic reference material. S content ranged between 1.47 and 91% as sulfate, thiosulfate, and elemental and organically bound S. Determinations of check samples were performed on 3 different days with four replicates per day; determinations for non-Magruder samples were performed on 2 different days. Recoveries ranged from 94.3 to 125.9%. The absolute SD among runs ranged from 0.038 to 0.487%. Based on the accuracy and precision demonstrated here, it is recommended that this method be collaboratively studied for the determination of total S in fertilizers.

  4. Multi-laboratory evaluations of the performance of Catellicoccus marimammalium PCR assays developed to target gull fecal sources

    EPA Science Inventory

    Here we report results from a multi-laboratory (n=11) evaluation of four different PCR methods targeting the 16S rRNA gene of Catellicoccus marimammalium used to detect fecal contamination from birds in coastal environments. The methods included conventional end-point PCR, a SYBR...

  5. Blood collection tubes as medical devices: The potential to affect assays and proposed verification and validation processes for the clinical laboratory.

    PubMed

    Bowen, Raffick A R; Adcock, Dorothy M

    2016-12-01

    Blood collection tubes (BCTs) are an often under-recognized variable in the preanalytical phase of clinical laboratory testing. Unfortunately, even the best-designed and manufactured BCTs may not work well in all clinical settings. Clinical laboratories, in collaboration with healthcare providers, should carefully evaluate BCTs prior to putting them into clinical use to determine their limitations and ensure that patients are not placed at risk because of inaccuracies due to poor tube performance. Selection of the best BCTs can be achieved through comparing advertising materials, reviewing the literature, observing the device at a scientific meeting, receiving a demonstration, evaluating the device under simulated conditions, or testing the device with patient samples. Although many publications have discussed method validations, few detail how to perform experiments for tube verification and validation. This article highlights the most common and impactful variables related to BCTs and discusses the validation studies that a typical clinical laboratory should perform when selecting BCTs. We also present a brief review of how in vitro diagnostic devices, particularly BCTs, are regulated in the United States, the European Union, and Canada. The verification and validation of BCTs will help to avoid the economic and human costs associated with incorrect test results, including poor patient care, unnecessary testing, and delays in test results. We urge laboratorians, tube manufacturers, diagnostic companies, and other researchers to take all the necessary steps to protect against the adverse effects of BCT components and their additives on clinical assays. Copyright © 2016 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  6. The role of diagnostic laboratories in support of animal disease surveillance systems.

    PubMed

    Zepeda, C

    2007-01-01

    Diagnostic laboratories are an essential component of animal disease surveillance systems. To understand the occurrence of disease in populations, surveillance systems rely on random or targeted surveys using three approaches: clinical, serological, and virological surveillance. Clinical surveillance is the basis for early detection of disease and is usually centered on the detection of syndromes and clinical findings requiring confirmation by diagnostic laboratories. Although most of the tests applied usually perform to an acceptable standard, several have not been properly validated in terms of their diagnostic sensitivity and specificity. Sensitivity and specificity estimates can vary according to local conditions and, ideally, should be determined by national laboratories where the tests are to be applied. The importance of sensitivity and specificity estimates in the design and interpretation of statistically based surveys and risk analysis is fundamental to establishing appropriate disease control and prevention strategies. The World Organisation for Animal Health's (OIE) network of reference laboratories acts as a center of expertise for the diagnosis of OIE-listed diseases and has a role in promoting the validation of OIE-prescribed tests for international trade. This paper discusses the importance of the epidemiological evaluation of diagnostic tests and the role of the OIE Reference Laboratories and Collaborating Centres in this process.

  7. Validating the Technology Acceptance Model in the Context of the Laboratory Information System-Electronic Health Record Interface System

    ERIC Educational Resources Information Center

    Aquino, Cesar A.

    2014-01-01

    This study represents a research validating the efficacy of Davis' Technology Acceptance Model (TAM) by pairing it with the Organizational Change Readiness Theory (OCRT) to develop another extension to the TAM, using the medical Laboratory Information Systems (LIS)--Electronic Health Records (EHR) interface as the medium. The TAM posits that it is…

  8. The future of targeted peptidomics.

    PubMed

    Findeisen, Peter

    2013-12-01

    Targeted MS is becoming increasingly important for sensitive and specific quantitative detection of proteins and respective PTMs. In this article, Ceglarek et al. [Proteomics Clin. Appl. 2013, 7, 794-801] present an LC-MS-based method for simultaneous quantitation of seven apolipoproteins in serum specimens. The assay fulfills many necessities of routine diagnostic applications, namely, low cost, high throughput, and good reproducibility. We anticipate that validation of new biomarkers will speed up with this technology and the palette of laboratory-based diagnostic tools will hopefully be augmented significantly in the near future. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  9. Laboratory compliance with the American Society of Clinical Oncology/College of American Pathologists human epidermal growth factor receptor 2 testing guidelines: a 3-year comparison of validation procedures.

    PubMed

    Dyhdalo, Kathryn S; Fitzgibbons, Patrick L; Goldsmith, Jeffery D; Souers, Rhona J; Nakhleh, Raouf E

    2014-07-01

    The American Society of Clinical Oncology/College of American Pathologists (ASCO/CAP) published guidelines in 2007 regarding testing accuracy, interpretation, and reporting of results for HER2 studies. A 2008 survey identified areas needing improved compliance. To reassess laboratory response to those guidelines following a full accreditation cycle for an updated snapshot of laboratory practices regarding ASCO/CAP guidelines. In 2011, a survey was distributed with the HER2 immunohistochemistry (IHC) proficiency testing program identical to the 2008 survey. Of the 1150 surveys sent, 977 (85.0%) were returned, comparable to the original survey response in 2008 (757 of 907; 83.5%). New participants submitted 124 of 977 (12.7%) surveys. The median laboratory accession rate was 14,788 cases with 211 HER2 tests performed annually. Testing was validated with fluorescence in situ hybridization in 49.1% (443 of 902) of the laboratories; 26.3% (224 of 853) of the laboratories used another IHC assay. The median number of cases to validate fluorescence in situ hybridization (n = 40) and IHC (n = 27) was similar to those in 2008. Ninety-five percent concordance with fluorescence in situ hybridization was achieved by 76.5% (254 of 332) of laboratories for IHC(-) findings and 70.4% (233 of 331) for IHC(+) cases. Ninety-five percent concordance with another IHC assay was achieved by 71.1% (118 of 168) of the laboratories for negative findings and 69.6% (112 of 161) of the laboratories for positive cases. The proportion of laboratories interpreting HER2 IHC using ASCO/CAP guidelines (86.6% [798 of 921] in 2011; 83.8% [605 of 722] in 2008) remains similar. Although fixation time improvements have been made, assay validation deficiencies still exist. The results of this survey were shared within the CAP, including the Laboratory Accreditation Program and the ASCO/CAP panel revising the HER2 guidelines published in October 2013. The Laboratory Accreditation Program checklist was

  10. miRTar2GO: a novel rule-based model learning method for cell line specific microRNA target prediction that integrates Ago2 CLIP-Seq and validated microRNA-target interaction data.

    PubMed

    Ahadi, Alireza; Sablok, Gaurav; Hutvagner, Gyorgy

    2017-04-07

    MicroRNAs (miRNAs) are ∼19-22 nucleotide (nt) long regulatory RNAs that regulate gene expression by recognizing and binding to complementary sequences on mRNAs. The key step in revealing the function of a miRNA is the identification of its target genes. Recent biochemical advances including PAR-CLIP and HITS-CLIP allow for improved miRNA target predictions and are widely used to validate miRNA targets. Here, we present miRTar2GO, a model trained on the common rules of miRNA-target interactions, Argonaute (Ago) CLIP-Seq data, and experimentally validated miRNA-target interactions. miRTar2GO is designed to predict miRNA target sites using more relaxed miRNA-target binding characteristics. More importantly, miRTar2GO allows for the prediction of cell-type-specific miRNA targets. We have evaluated miRTar2GO against other widely used miRNA target prediction algorithms and demonstrated that miRTar2GO produced significantly higher F1 and G scores. Target predictions, binding specifications, results of the pathway analysis, and gene ontology enrichment of miRNA targets are freely available at http://www.mirtar2go.org. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
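    The F1 and G scores used to benchmark miRTar2GO are standard evaluation metrics, the harmonic and geometric means of precision and recall respectively. As a reminder of the definitions (this is generic metric code, not taken from miRTar2GO):

```python
import math

def f1_score(precision: float, recall: float) -> float:
    """F1: harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

def g_score(precision: float, recall: float) -> float:
    """G: geometric mean of precision and recall."""
    return math.sqrt(precision * recall)
```

    Both metrics penalize predictors that trade one of precision or recall for the other, which is why they are preferred over raw accuracy for target-prediction benchmarks.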

  11. Test and Validation of the Mars Science Laboratory Robotic Arm

    NASA Technical Reports Server (NTRS)

    Robinson, M.; Collins, C.; Leger, P.; Kim, W.; Carsten, J.; Tompkins, V.; Trebi-Ollennu, A.; Florow, B.

    2013-01-01

    The Mars Science Laboratory Robotic Arm (RA) is a key component for achieving the primary scientific goals of the mission. The RA supports sample acquisition by precisely positioning a scoop above loose regolith or accurately preloading a percussive drill on Martian rocks or rover-mounted organic check materials. It assists sample processing by orienting a sample processing unit called CHIMRA through a series of gravity-relative orientations and sample delivery by positioning the sample portion door above an instrument inlet or the observation tray. In addition the RA facilitates contact science by accurately positioning the dust removal tool, Alpha Particle X-Ray Spectrometer (APXS) and the Mars Hand Lens Imager (MAHLI) relative to surface targets. In order to fulfill these seemingly disparate science objectives the RA must satisfy a variety of accuracy and performance requirements. This paper describes the necessary arm requirement specification and the test campaign to demonstrate these requirements were satisfied.

  12. Improving efficiency of a small forensic DNA laboratory: validation of robotic assays and evaluation of microcapillary array device.

    PubMed

    Crouse, Cecelia A; Yeung, Stephanie; Greenspoon, Susan; McGuckian, Amy; Sikorsky, Julie; Ban, Jeff; Mathies, Richard

    2005-08-01

    To present validation studies performed for the implementation of existing and new technologies to increase the efficiency of the forensic DNA Section of the Palm Beach County Sheriff's Office (PBSO) Crime Laboratory. Using federally funded grants, internal support, and an external Process Mapping Team, the PBSO collaborated with forensic vendors, universities, and other forensic laboratories to enhance DNA testing procedures, including validation of the DNA IQ magnetic bead extraction system, robotic DNA extraction using the BioMek2000, and the ABI7000 Sequence Detection System, and is currently evaluating a microcapillary array electrophoresis device. The PBSO successfully validated and implemented both manual and automated Promega DNA IQ magnetic bead extraction systems, which have increased DNA profile results from samples with low DNA template concentrations. The Beckman BioMek2000 DNA robotic workstation has been validated for blood, tissue, bone, hair, epithelial cells (touch evidence), and mixed stains such as semen. There has been a dramatic increase in the number of samples tested per case since implementation of the robotic extraction protocols. The validation of the ABI7000 real-time quantitative polymerase chain reaction (qPCR) technology and the single multiplex short tandem repeat (STR) PowerPlex16 BIO amplification system has provided both a time and a financial benefit. In addition, the qPCR system allows more accurate DNA concentration data, and the PowerPlex 16 BIO multiplex generates DNA profile data in half the time compared to the PowerPlex1.1 and PowerPlex2.1 STR systems. The PBSO's future efficiency requirements are being addressed through collaboration with the University of California at Berkeley and the Virginia Division of Forensic Science to validate microcapillary array electrophoresis instrumentation. Initial data demonstrated the electrophoresis of 96 samples in less than twenty minutes.
The PBSO demonstrated, through the validation of

  13. Intra-/inter-laboratory validation study on reactive oxygen species assay for chemical photosafety evaluation using two different solar simulators.

    PubMed

    Onoue, Satomi; Hosoi, Kazuhiro; Toda, Tsuguto; Takagi, Hironori; Osaki, Naoto; Matsumoto, Yasuhiro; Kawakami, Satoru; Wakuri, Shinobu; Iwase, Yumiko; Yamamoto, Toshinobu; Nakamura, Kazuichi; Ohno, Yasuo; Kojima, Hajime

    2014-06-01

    A previous multi-center validation study demonstrated high transferability and reliability of reactive oxygen species (ROS) assay for photosafety evaluation. The present validation study was undertaken to verify further the applicability of different solar simulators and assay performance. In 7 participating laboratories, 2 standards and 42 coded chemicals, including 23 phototoxins and 19 non-phototoxic drugs/chemicals, were assessed by the ROS assay using two different solar simulators (Atlas Suntest CPS series, 3 labs; and Seric SXL-2500V2, 4 labs). Irradiation conditions could be optimized using quinine and sulisobenzone as positive and negative standards to offer consistent assay outcomes. In both solar simulators, the intra- and inter-day precisions (coefficient of variation; CV) for quinine were found to be below 10%. The inter-laboratory CV for quinine averaged 15.4% (Atlas Suntest CPS) and 13.2% (Seric SXL-2500V2) for singlet oxygen and 17.0% (Atlas Suntest CPS) and 7.1% (Seric SXL-2500V2) for superoxide, suggesting high inter-laboratory reproducibility even though different solar simulators were employed for the ROS assay. In the ROS assay on 42 coded chemicals, some chemicals (ca. 19-29%) were unevaluable because of limited solubility and spectral interference. Although several false positives appeared with positive predictivity of ca. 76-92% (Atlas Suntest CPS) and ca. 75-84% (Seric SXL-2500V2), there were no false negative predictions in both solar simulators. A multi-center validation study on the ROS assay demonstrated satisfactory transferability, accuracy, precision, and predictivity, as well as the availability of other solar simulators. Copyright © 2013 Elsevier Ltd. All rights reserved.
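    The intra-day, inter-day, and inter-laboratory precision figures quoted above are coefficients of variation (CV), the standard deviation expressed as a percentage of the mean. A minimal sketch of how such a CV is computed (illustrative only, not the study's analysis code):

```python
import statistics

def cv_percent(values: list) -> float:
    """Coefficient of variation: sample SD as a percentage of the mean."""
    return 100 * statistics.stdev(values) / statistics.mean(values)
```

    For example, replicate singlet-oxygen readings of 9, 10, and 11 (arbitrary units) give a CV of 10%, which would just meet the <10% intra-day criterion reported for quinine.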

  14. The bogus taste test: Validity as a measure of laboratory food intake.

    PubMed

    Robinson, Eric; Haynes, Ashleigh; Hardman, Charlotte A; Kemps, Eva; Higgs, Suzanne; Jones, Andrew

    2017-09-01

    Because overconsumption of food contributes to ill health, understanding what affects how much people eat is of importance. The 'bogus' taste test is a measure widely used in eating behaviour research to identify factors that may have a causal effect on food intake. However, there has been no examination of the validity of the bogus taste test as a measure of food intake. We conducted a participant-level analysis of 31 published laboratory studies that used the taste test to measure food intake. We assessed whether the taste test was sensitive to experimental manipulations hypothesized to increase or decrease food intake. We examined construct validity by testing whether participant sex, hunger, and liking of taste test food were associated with the amount of food consumed in the taste test. In addition, we also examined whether BMI (body mass index), trait measures of dietary restraint, and over-eating in response to palatable food cues were associated with food consumption. Results indicated that the taste test was sensitive to experimental manipulations hypothesized to increase or decrease food intake. Factors reliably associated with increased consumption during the taste test were being male, having higher baseline hunger, liking of the taste test food, and a greater tendency to overeat in response to palatable food cues, whereas trait dietary restraint and BMI were not. These results indicate that the bogus taste test is likely to be a valid measure of food intake and can be used to identify factors that have a causal effect on food intake. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  15. Recommendations for the design of laboratory studies on non-target arthropods for risk assessment of genetically engineered plants

    PubMed Central

    Hellmich, Richard L.; Candolfi, Marco P.; Carstens, Keri; De Schrijver, Adinda; Gatehouse, Angharad M. R.; Herman, Rod A.; Huesing, Joseph E.; McLean, Morven A.; Raybould, Alan; Shelton, Anthony M.; Waggoner, Annabel

    2010-01-01

    This paper provides recommendations on experimental design for early-tier laboratory studies used in risk assessments to evaluate potential adverse impacts of arthropod-resistant genetically engineered (GE) plants on non-target arthropods (NTAs). While we rely heavily on the currently used proteins from Bacillus thuringiensis (Bt) in this discussion, the concepts apply to other arthropod-active proteins. A risk may exist if the newly acquired trait of the GE plant has adverse effects on NTAs when they are exposed to the arthropod-active protein. Typically, the risk assessment follows a tiered approach that starts with laboratory studies under worst-case exposure conditions; such studies have a high ability to detect adverse effects on non-target species. Clear guidance on how such data are produced in laboratory studies assists the product developers and risk assessors. The studies should be reproducible and test clearly defined risk hypotheses. These properties contribute to the robustness of, and confidence in, environmental risk assessments for GE plants. Data from NTA studies, collected during the analysis phase of an environmental risk assessment, are critical to the outcome of the assessment and ultimately the decision taken by regulatory authorities on the release of a GE plant. Confidence in the results of early-tier laboratory studies is a precondition for the acceptance of data across regulatory jurisdictions and should encourage agencies to share useful information and thus avoid redundant testing. PMID:20938806

  16. Clinical Validation and Implementation of a Targeted Next-Generation Sequencing Assay to Detect Somatic Variants in Non-Small Cell Lung, Melanoma, and Gastrointestinal Malignancies

    PubMed Central

    Fisher, Kevin E.; Zhang, Linsheng; Wang, Jason; Smith, Geoffrey H.; Newman, Scott; Schneider, Thomas M.; Pillai, Rathi N.; Kudchadkar, Ragini R.; Owonikoko, Taofeek K.; Ramalingam, Suresh S.; Lawson, David H.; Delman, Keith A.; El-Rayes, Bassel F.; Wilson, Malania M.; Sullivan, H. Clifford; Morrison, Annie S.; Balci, Serdar; Adsay, N. Volkan; Gal, Anthony A.; Sica, Gabriel L.; Saxe, Debra F.; Mann, Karen P.; Hill, Charles E.; Khuri, Fadlo R.; Rossi, Michael R.

    2017-01-01

    We tested and clinically validated a targeted next-generation sequencing (NGS) mutation panel using 80 formalin-fixed, paraffin-embedded (FFPE) tumor samples. Forty non-small cell lung carcinoma (NSCLC), 30 melanoma, and 30 gastrointestinal (12 colonic, 10 gastric, and 8 pancreatic adenocarcinoma) FFPE samples were selected from laboratory archives. After appropriate specimen and nucleic acid quality control, 80 NGS libraries were prepared using the Illumina TruSight tumor (TST) kit and sequenced on the Illumina MiSeq. Sequence alignment, variant calling, and sequencing quality control were performed using vendor software and laboratory-developed analysis workflows. TST generated ≥500× coverage for 98.4% of the 13,952 targeted bases. Reproducible and accurate variant calling was achieved at ≥5% variant allele frequency with 8 to 12 multiplexed samples per MiSeq flow cell. TST detected 112 variants overall, and confirmed all known single-nucleotide variants (n = 27), deletions (n = 5), insertions (n = 3), and multinucleotide variants (n = 3). TST detected at least one variant in 85.0% (68/80), and two or more variants in 36.2% (29/80), of samples. TP53 was the most frequently mutated gene in NSCLC (13 variants; 13/32 samples), gastrointestinal malignancies (15 variants; 13/25 samples), and overall (30 variants; 28/80 samples). BRAF mutations were most common in melanoma (nine variants; 9/23 samples). Clinically relevant NGS data can be obtained from routine clinical FFPE solid tumor specimens using TST, benchtop instruments, and vendor-supplied bioinformatics pipelines. PMID:26801070
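    The quantitative thresholds cited in this abstract (≥500× coverage, reproducible variant calling at ≥5% variant allele frequency) can be expressed as a simple reporting filter. This is a hypothetical sketch of the idea, not the laboratory's actual pipeline; function and parameter names are ours:

```python
# Illustrative filter using the coverage and variant allele frequency
# (VAF) thresholds quoted in the abstract above.
def passes_reporting_thresholds(alt_reads: int, total_reads: int,
                                min_vaf: float = 0.05,
                                min_depth: int = 500) -> bool:
    """True if a site meets both the depth and VAF thresholds."""
    if total_reads < min_depth:
        return False  # insufficient coverage for a confident call
    return alt_reads / total_reads >= min_vaf
```

    For instance, 30 variant-supporting reads out of 600 (VAF = 5%) would pass, while the same 30 reads at only 400× depth would not, since the coverage floor is checked first.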

  17. Validation of N-myristoyltransferase as an antimalarial drug target using an integrated chemical biology approach

    NASA Astrophysics Data System (ADS)

    Wright, Megan H.; Clough, Barbara; Rackham, Mark D.; Rangachari, Kaveri; Brannigan, James A.; Grainger, Munira; Moss, David K.; Bottrill, Andrew R.; Heal, William P.; Broncel, Malgorzata; Serwa, Remigiusz A.; Brady, Declan; Mann, David J.; Leatherbarrow, Robin J.; Tewari, Rita; Wilkinson, Anthony J.; Holder, Anthony A.; Tate, Edward W.

    2014-02-01

    Malaria is an infectious disease caused by parasites of the genus Plasmodium, which leads to approximately one million deaths per annum worldwide. Chemical validation of new antimalarial targets is urgently required in view of rising resistance to current drugs. One such putative target is the enzyme N-myristoyltransferase, which catalyses the attachment of the fatty acid myristate to protein substrates (N-myristoylation). Here, we report an integrated chemical biology approach to explore protein myristoylation in the major human parasite P. falciparum, combining chemical proteomic tools for identification of the myristoylated and glycosylphosphatidylinositol-anchored proteome with selective small-molecule N-myristoyltransferase inhibitors. We demonstrate that N-myristoyltransferase is an essential and chemically tractable target in malaria parasites both in vitro and in vivo, and show that selective inhibition of N-myristoylation leads to catastrophic and irreversible failure to assemble the inner membrane complex, a critical subcellular organelle in the parasite life cycle. Our studies provide the basis for the development of new antimalarials targeting N-myristoyltransferase.

  18. Experimental and statistical post-validation of positive example EST sequences carrying peroxisome targeting signals type 1 (PTS1).

    PubMed

    Lingner, Thomas; Kataya, Amr R A; Reumann, Sigrun

    2012-02-01

    We recently developed the first algorithms specifically for plants to predict proteins carrying peroxisome targeting signals type 1 (PTS1) from genome sequences. As validated experimentally, the prediction methods are able to correctly predict unknown peroxisomal Arabidopsis proteins and to infer novel PTS1 tripeptides. The high prediction performance is primarily determined by the large number and sequence diversity of the underlying positive example sequences, which were mainly derived from EST databases. However, a few constructs remained cytosolic in experimental validation studies, indicating sequencing errors in some ESTs. To identify erroneous sequences, we validated subcellular targeting of additional positive example sequences in the present study. Moreover, we analyzed the distribution of prediction scores separately for each orthologous group of PTS1 proteins, which generally resembled normal distributions with group-specific mean values. The cytosolic sequences commonly represented outliers of low prediction scores and were located at the very tail of a fitted normal distribution. Three statistical methods for identifying outliers were compared in terms of sensitivity and specificity. Their combined application allows elimination of erroneous ESTs from positive example data sets. This new post-validation method will further improve the prediction accuracy of both PTS1 and PTS2 protein prediction models for plants, fungi, and mammals.
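A fitted-normal outlier rule of the kind described can be sketched as follows; the z-score cutoff and the scores are hypothetical, and the three methods actually compared in the study may differ:

```python
from statistics import mean, stdev

def zscore_outliers(scores, threshold=2.5):
    """Flag prediction scores far in the low tail of a fitted normal distribution.

    Fits mean/SD to the scores and flags values whose z-score is below
    -threshold (the low-score tail, where erroneous ESTs are expected).
    """
    mu, sd = mean(scores), stdev(scores)
    return [s for s in scores if (s - mu) / sd < -threshold]

# Hypothetical per-ortholog-group prediction scores with one low outlier.
group_scores = [0.91, 0.88, 0.93, 0.90, 0.89, 0.92, 0.87, 0.90, 0.35]
print(zscore_outliers(group_scores))  # [0.35]
```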

  19. Experimental and statistical post-validation of positive example EST sequences carrying peroxisome targeting signals type 1 (PTS1)

    PubMed Central

    Lingner, Thomas; Kataya, Amr R. A.; Reumann, Sigrun

    2012-01-01

    We recently developed the first algorithms specifically for plants to predict proteins carrying peroxisome targeting signals type 1 (PTS1) from genome sequences.1 As validated experimentally, the prediction methods are able to correctly predict unknown peroxisomal Arabidopsis proteins and to infer novel PTS1 tripeptides. The high prediction performance is primarily determined by the large number and sequence diversity of the underlying positive example sequences, which were mainly derived from EST databases. However, a few constructs remained cytosolic in experimental validation studies, indicating sequencing errors in some ESTs. To identify erroneous sequences, we validated subcellular targeting of additional positive example sequences in the present study. Moreover, we analyzed the distribution of prediction scores separately for each orthologous group of PTS1 proteins, which generally resembled normal distributions with group-specific mean values. The cytosolic sequences commonly represented outliers of low prediction scores and were located at the very tail of a fitted normal distribution. Three statistical methods for identifying outliers were compared in terms of sensitivity and specificity. Their combined application allows elimination of erroneous ESTs from positive example data sets. This new post-validation method will further improve the prediction accuracy of both PTS1 and PTS2 protein prediction models for plants, fungi, and mammals. PMID:22415050

  20. Full-scale laboratory validation of a wireless MEMS-based technology for damage assessment of concrete structures

    NASA Astrophysics Data System (ADS)

    Trapani, Davide; Zonta, Daniele; Molinari, Marco; Amditis, Angelos; Bimpas, Matthaios; Bertsch, Nicolas; Spiering, Vincent; Santana, Juan; Sterken, Tom; Torfs, Tom; Bairaktaris, Dimitris; Bairaktaris, Manos; Camarinopulos, Stefanos; Frondistou-Yannas, Mata; Ulieru, Dumitru

    2012-04-01

    This paper illustrates an experimental campaign conducted under laboratory conditions on a full-scale reinforced concrete three-dimensional frame instrumented with wireless sensors developed within the MEMSCON project. In particular, it describes the assumptions on which the experimental campaign was based, the design of the structure, the laboratory setup and the results of the tests. The aim of the campaign was to validate the performance of MEMSCON sensing systems, consisting of wireless accelerometers and strain sensors, on a real concrete structure during construction and under an actual earthquake. Another aspect of interest was to assess the effectiveness of the full damage recognition procedure based on the data recorded by the sensors and the reliability of the Decision Support System (DSS) developed to provide stakeholders with recommendations for building rehabilitation and estimates of its cost. To these ends, a Eurocode 8 spectrum-compatible accelerogram with increasing amplitude was applied at the top of an instrumented concrete frame built in the laboratory. MEMSCON sensors were directly compared with wired instruments, based on devices available on the market and taken as references, during both construction and seismic simulation.

  1. Validation of the Six Sigma Z-score for the quality assessment of clinical laboratory timeliness.

    PubMed

    Ialongo, Cristiano; Bernardini, Sergio

    2018-03-28

    The International Federation of Clinical Chemistry and Laboratory Medicine has recently introduced the turnaround time (TAT) as a mandatory quality indicator for the postanalytical phase. Classic TAT indicators, namely the average, median, 90th percentile and proportion of acceptable tests (PAT), have been in use for almost 40 years and to date represent the mainstay for gauging laboratory timeliness. In this study, we investigated the performance of the Six Sigma Z-score, which was previously introduced as a device for the quantitative assessment of timeliness. A numerical simulation was obtained by modeling an actual TAT data set using the log-logistic probability density function (PDF). Five thousand replicates for each size of the artificial TAT random sample (n=20, 50, 250 and 1000) were generated, and different laboratory conditions were simulated by manipulating the PDF to generate more or less variable data. The Z-score and the classic TAT indicators were assessed for precision (%CV), robustness toward right-tailing (precision at different sample variability), sensitivity and specificity. The Z-score showed sensitivity and specificity comparable to PAT (≈80% with n≥250) but superior precision, which remained within 20% for moderately small samples (n≥50); furthermore, the Z-score was less affected by the value of the cutoff used for setting the acceptable TAT, as well as by the sample variability reflected in the magnitude of right-tailing. The Z-score was a valid indicator of laboratory timeliness and a suitable device for improving as well as maintaining the achieved quality level.
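One common way to express timeliness as a Six Sigma Z-score is to convert the proportion of tests completed within the acceptable TAT into a standard-normal quantile. The sketch below assumes that formulation (the paper's exact definition may differ) and uses hypothetical TAT values:

```python
from statistics import NormalDist

def tat_zscore(tats, acceptable_tat):
    """Six-Sigma-style timeliness Z: the standard-normal quantile of the
    proportion of tests completed within the acceptable turnaround time.
    (Proportions of exactly 0 or 1 would need separate handling.)"""
    p_ok = sum(1 for t in tats if t <= acceptable_tat) / len(tats)
    return NormalDist().inv_cdf(p_ok)

# Hypothetical right-tailed turnaround times in minutes; 9 of 10 are within 60 min.
tats = [35, 40, 42, 45, 48, 50, 52, 55, 60, 120]
print(round(tat_zscore(tats, acceptable_tat=60), 3))  # 1.282
```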

  2. Single Laboratory Validated Method for Determination of Cylindrospermopsin and Anatoxin-a in Ambient Freshwaters by Liquid Chromatography/Tandem Mass Spectrometry (LC/MS/MS)

    EPA Pesticide Factsheets

    This document is a standardized single laboratory validated liquid chromatography/tandem mass spectrometry (LC/MS/MS) method for the detection and quantification of cyanotoxins (combined intracellular and extracellular) in ambient freshwaters.

  3. Development and Validation of a Novel Vancomycin Dosing Nomogram for Achieving High-Target Trough Levels at 2 Canadian Teaching Hospitals

    PubMed Central

    Thalakada, Rosanne; Legal, Michael; Lau, Tim T Y; Luey, Tiffany; Batterink, Josh; Ensom, Mary H H

    2012-01-01

    Background: Recent guidelines recommend a vancomycin trough (predose) level between 15 and 20 mg/L in the treatment of invasive gram-positive infections, but most initial dosing nomograms are designed to achieve lower targets (5–15 mg/L). Clinicians need guidance about appropriate initial dosing to achieve the higher target. Objective: To develop and validate a high-target vancomycin dosing nomogram to achieve trough levels of 15–20 mg/L. Methods: A retrospective study was conducted at 2 teaching hospitals, St Paul’s Hospital and Vancouver General Hospital in Vancouver, British Columbia. Patients who were treated with vancomycin between January 2008 and June 2010 and who had achieved a trough level of 14.5–20.5 mg/L were identified. Demographic and clinical data were collected. Multiple linear regression was used to develop a vancomycin dosing nomogram for each hospital site. An integrated nomogram was constructed by merging the data from the 2 hospitals. A unique set of patients at each institution was used for validating their respective nomograms and a pooled group of patients for validating the integrated nomogram. Predictive success was evaluated, and a nomogram was deemed significantly different from another nomogram if p < 0.05 via χ2 testing. Results: Data from 78 patients at one hospital and 91 patients at the other were used in developing the respective institutional nomograms. For each hospital’s data set, both age and initial serum creatinine were significantly associated with the predicted dosing interval (p < 0.001). Validation in a total of 105 test patients showed that the integrated nomogram had a predictive success rate of 56%. Conclusions: A novel vancomycin dosing nomogram was developed and validated at 2 Canadian teaching hospitals. This integrated nomogram is a tool that clinicians can use in selecting appropriate initial vancomycin regimens on the basis of age and serum creatinine, to achieve high-target trough levels of 15–20 mg/L.
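The regression form underlying such a nomogram can be sketched as below; the coefficients and the rounding to standard dosing intervals are purely illustrative assumptions, not the published model:

```python
def predicted_dosing_interval(age_years, scr_umol_per_l, b0, b_age, b_scr):
    """Multiple-linear-regression form underlying such a nomogram:
    interval (h) = b0 + b_age * age + b_scr * serum creatinine."""
    return b0 + b_age * age_years + b_scr * scr_umol_per_l

def nearest_standard_interval(hours, choices=(8, 12, 24, 36, 48)):
    """Round the regression prediction to the nearest practical dosing interval."""
    return min(choices, key=lambda q: abs(q - hours))

# Hypothetical coefficients for illustration only (not the published model):
raw = predicted_dosing_interval(age_years=70, scr_umol_per_l=110,
                                b0=4.0, b_age=0.08, b_scr=0.03)
print(round(raw, 1), nearest_standard_interval(raw))  # 12.9 12
```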

  4. An Assessment of Database-Validated microRNA Target Genes in Normal Colonic Mucosa: Implications for Pathway Analysis.

    PubMed

    Slattery, Martha L; Herrick, Jennifer S; Stevens, John R; Wolff, Roger K; Mullany, Lila E

    2017-01-01

    Determination of functional pathways regulated by microRNAs (miRNAs), while an essential step in developing therapeutics, is challenging. Some miRNAs have been studied extensively; others have limited information. In this study, we focus on 254 miRNAs previously identified as being associated with colorectal cancer and their database-identified validated target genes. We use RNA-Seq data to evaluate messenger RNA (mRNA) expression for 157 subjects who also had miRNA expression data. In the replication phase of the study, we replicated associations between 254 miRNAs associated with colorectal cancer and mRNA expression of database-identified target genes in normal colonic mucosa. In the discovery phase of the study, we evaluated expression of 18 miRNAs (those with 20 or fewer database-identified target genes, along with miR-21-5p, miR-215-5p, and miR-124-3p, which have more than 500 database-identified target genes) with expression of 17 434 mRNAs to identify new targets in colon tissue. Seed region matches between miRNA and newly identified targeted mRNA were used to help determine direct miRNA-mRNA associations. Of the 121 miRNAs that had at least 1 database-identified target gene from mRNA expression methods, 97.9% were expressed in normal colonic mucosa. Of the 8622 target miRNA-mRNA associations identified in the database, 2658 (30.2%) were associated with gene expression in normal colonic mucosa after adjusting for multiple comparisons. Of the 133 miRNAs with database-identified target genes from non-mRNA expression methods, 97.2% were expressed in normal colonic mucosa. After adjustment for multiple comparisons, 2416 miRNA-mRNA associations remained significant (19.8%). Results from the discovery phase based on detailed examination of 18 miRNAs identified more than 80 000 miRNA-mRNA associations that had not previously been linked to the miRNA. Of these miRNA-mRNA associations, 15.6% and 14.8% had seed matches for GRCh38 and GRCh37
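The adjustment for multiple comparisons can be illustrated with a Benjamini-Hochberg step-up procedure; the procedure is assumed here for illustration, since the abstract does not state which correction was used:

```python
def benjamini_hochberg(pvals, alpha=0.05):
    """Return indices of hypotheses rejected at FDR level alpha (BH step-up).

    Sorts p-values, finds the largest rank k with p_(k) <= k/m * alpha,
    and rejects the k smallest p-values.
    """
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k_max = 0
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= rank / m * alpha:
            k_max = rank
    return sorted(order[:k_max])

# Hypothetical miRNA-mRNA association p-values:
p = [0.001, 0.008, 0.039, 0.041, 0.30, 0.74]
print(benjamini_hochberg(p))  # [0, 1]: only the two smallest survive FDR control
```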

  5. Evaluation of Calibration Laboratories Performance

    NASA Astrophysics Data System (ADS)

    Filipe, Eduarda

    2011-12-01

    One of the main goals of interlaboratory comparisons (ILCs) is the evaluation of laboratories' performance for the routine calibrations they perform for clients. In the frame of laboratory accreditation, the national accreditation boards (NABs), in collaboration with the national metrology institutes (NMIs), organize the ILCs needed to comply with the requirements of the international accreditation organizations. For an ILC to be a reliable tool with which a laboratory can validate its best measurement capability (BMC), the NMI (reference laboratory) must provide a traveling standard that is better, in terms of accuracy class or uncertainty, than the laboratories' BMCs. Although this is the general situation, there are cases where the NABs ask the NMIs to evaluate the performance of accredited laboratories when calibrating industrial measuring instruments. The aim of this article is to discuss the existing approaches for the evaluation of ILCs and to propose a basis for the validation of laboratories' measurement capabilities. An example is drafted with the evaluation of the results of a mercury-in-glass thermometer ILC with 12 participating laboratories.
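ILC performance is commonly scored with the normalized error E_n (as in proficiency-testing standards such as ISO 13528), where |E_n| <= 1 supports the laboratory's claimed capability. A sketch with hypothetical thermometer data follows; the article's own scoring approach may differ:

```python
import math

def normalized_error(x_lab, u_lab, x_ref, u_ref):
    """E_n = (x_lab - x_ref) / sqrt(U_lab**2 + U_ref**2), using the expanded
    uncertainties U of the laboratory and the reference; |E_n| <= 1 is
    conventionally a satisfactory result."""
    return (x_lab - x_ref) / math.hypot(u_lab, u_ref)

# Hypothetical mercury-in-glass thermometer comparison near 50 degrees C:
en = normalized_error(x_lab=50.08, u_lab=0.10, x_ref=50.02, u_ref=0.03)
print(abs(en) <= 1.0)  # True: the result is consistent with the claimed uncertainty
```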

  6. Development and inter-laboratory validation study of an improved new real-time PCR assay with internal control for detection and laboratory diagnosis of African swine fever virus.

    PubMed

    Tignon, Marylène; Gallardo, Carmina; Iscaro, Carmen; Hutet, Evelyne; Van der Stede, Yves; Kolbasov, Denis; De Mia, Gian Mario; Le Potier, Marie-Frédérique; Bishop, Richard P; Arias, Marisa; Koenen, Frank

    2011-12-01

    A real-time polymerase chain reaction (PCR) assay for the rapid detection of African swine fever virus (ASFV), multiplexed for simultaneous detection of swine beta-actin as an endogenous control, has been developed and validated by four National Reference Laboratories of the European Union for African swine fever (ASF) including the European Union Reference Laboratory. Primers and a TaqMan(®) probe specific for ASFV were selected from conserved regions of the p72 gene. The limit of detection of the new real-time PCR assay is 5.7-57 copies of the ASFV genome. High accuracy, reproducibility and robustness of the PCR assay (CV ranging from 0.7 to 5.4%) were demonstrated both within and between laboratories using different real-time PCR equipment. The specificity of virus detection was validated using a panel of 44 isolates collected over many years in various geographical locations in Europe, Africa and America, including recent isolates from the Caucasus region, Sardinia, East and West Africa. Compared to the OIE-prescribed conventional and real-time PCR assays, the sensitivity of the new assay with internal control was improved, as demonstrated by testing 281 field samples collected in recent outbreaks and surveillance areas in Europe and Africa (170 samples) together with samples obtained through experimental infections (111 samples). This is particularly evident in the early days following experimental infection and during the course of the disease in pigs sub-clinically infected with strains of low virulence (from 35 up to 70 dpi). The specificity of the assay was also confirmed on 150 samples from uninfected pigs and wild boar from ASF-free areas. Measured across the total of 431 tested samples, the positive deviation of the new assay reaches 21% or 26% compared to the PCR and real-time PCR methods recommended by the OIE. This improved and rigorously validated real-time PCR assay with internal control will provide a rapid, sensitive and reliable molecular tool for ASFV

  7. Procedures For Microbial-Ecology Laboratory

    NASA Technical Reports Server (NTRS)

    Huff, Timothy L.

    1993-01-01

    The Microbial Ecology Laboratory Procedures Manual provides concise and well-defined instructions on routine technical procedures to be followed in the microbiological laboratory to ensure safety, analytical control, and validity of results.

  8. Identification and validation of nucleolin as a target of curcumol in nasopharyngeal carcinoma cells.

    PubMed

    Wang, Juan; Wu, Jiacai; Li, Xumei; Liu, Haowei; Qin, Jianli; Bai, Zhun; Chi, Bixia; Chen, Xu

    2018-06-30

    Identification of the specific protein target(s) of a drug is a critical step in unraveling the mechanisms of action (MOA) of many natural products. Curcumol, isolated from the well-known Chinese medicinal plant Curcuma zedoaria, has been shown to possess multiple biological activities. It can inhibit nasopharyngeal carcinoma (NPC) proliferation and induce apoptosis, but its target protein(s) in NPC cells remains unclear. In this study, we employed a mass spectrometry-based chemical proteomics approach to reveal the possible protein targets of curcumol in NPC cells. Cellular thermal shift assay (CETSA), molecular docking and cell-based assays were used to validate the binding interactions. Chemical proteomics capture uncovered that NCL is a target of curcumol in NPC cells, and molecular docking showed that curcumol binds to NCL with a binding free energy of -7.8 kcal/mol. Cell function analysis found that curcumol treatment leads to degradation of NCL in NPC cells, while showing only slight effects on NP69 cells. In conclusion, our results provide evidence that NCL is a target protein of curcumol. We revealed that the anti-cancer effects of curcumol in NPC cells are mediated, at least in part, by NCL inhibition. Many natural products show high bioactivity, yet their mechanisms of action (MOA) are poorly understood or completely unknown. Understanding the MOA of natural drugs makes it possible to exploit their therapeutic potential fully and to minimize their adverse side effects. Identification of the specific protein target(s) of a drug is a critical step in unraveling its MOA. Compound-centric chemical proteomics is a classic chemical proteomics approach that integrates chemical synthesis with cell biology and mass spectrometry (MS) to identify the protein targets of natural products, determine a drug's mechanism of action, describe its toxicity, and identify possible causes of off-target effects. It is an affinity-based chemical proteomics method to identify small molecule-protein interactions

  9. Linking rare and common disease: mapping clinical disease-phenotypes to ontologies in therapeutic target validation.

    PubMed

    Sarntivijai, Sirarat; Vasant, Drashtti; Jupp, Simon; Saunders, Gary; Bento, A Patrícia; Gonzalez, Daniel; Betts, Joanna; Hasan, Samiul; Koscielny, Gautier; Dunham, Ian; Parkinson, Helen; Malone, James

    2016-01-01

    The Centre for Therapeutic Target Validation (CTTV - https://www.targetvalidation.org/) was established to generate therapeutic target evidence from genome-scale experiments and analyses. CTTV aims to support the validity of therapeutic targets by integrating existing and newly generated data. Data integration has been achieved in some resources by mapping metadata such as disease and phenotypes to the Experimental Factor Ontology (EFO). Additionally, the relationship between ontology descriptions of rare and common diseases and their phenotypes can offer insights into shared biological mechanisms and potential drug targets. Ontologies are not ideal for representing the sometimes-associated type of relationship required. This work addresses two challenges: annotation of diverse big data, and representation of complex, sometimes-associated relationships between concepts. Semantic mapping uses a combination of custom scripting, our annotation tool 'Zooma', and expert curation. Disease-phenotype associations were generated using literature mining on Europe PubMed Central abstracts, which were manually verified by experts for validity. Representation of the disease-phenotype association was achieved by the Ontology of Biomedical AssociatioN (OBAN), a generic association representation model. OBAN represents associations between a subject and an object, i.e., a disease and its associated phenotypes, together with the source of evidence for that association. Indirect disease-to-disease associations are exposed through shared phenotypes. This was applied to the use case of linking rare to common diseases at the CTTV. EFO yields an average of over 80% mapping coverage in all data sources. A 42% precision is obtained from the manual verification of the text-mined disease-phenotype associations. This results in 1452 and 2810 disease-phenotype pairs for IBD and autoimmune disease, respectively, and contributes towards 11,338 rare disease associations (merged with existing published work [Am J Hum Genet

  10. Validation and verification of the laser range safety tool (LRST)

    NASA Astrophysics Data System (ADS)

    Kennedy, Paul K.; Keppler, Kenneth S.; Thomas, Robert J.; Polhamus, Garrett D.; Smith, Peter A.; Trevino, Javier O.; Seaman, Daniel V.; Gallaway, Robert A.; Crockett, Gregg A.

    2003-06-01

    The U.S. Dept. of Defense (DOD) is currently developing and testing a number of High Energy Laser (HEL) weapons systems. DOD range safety officers now face the challenge of designing safe methods of testing HEL's on DOD ranges. In particular, safety officers need to ensure that diffuse and specular reflections from HEL system targets, as well as direct beam paths, are contained within DOD boundaries. If both the laser source and the target are moving, as they are for the Airborne Laser (ABL), a complex series of calculations is required and manual calculations are impractical. Over the past 5 years, the Optical Radiation Branch of the Air Force Research Laboratory (AFRL/HEDO), the ABL System Program Office, Logicon-RDA, and Northrop Grumman have worked together to develop a computer model called the Laser Range Safety Tool (LRST), specifically designed for HEL reflection hazard analyses. The code, which is still under development, is currently tailored to support the ABL program. AFRL/HEDO has led an LRST Validation and Verification (V&V) effort since 1998, in order to determine if code predictions are accurate. This paper summarizes LRST V&V efforts to date including: i) comparison of code results with laboratory measurements of reflected laser energy and with reflection measurements made during actual HEL field tests, and ii) validation of LRST's hazard zone computations.

  11. MRM validation of targeted nonglycosylated peptides from N-glycoprotein biomarkers using direct trypsin digestion of undepleted human plasma.

    PubMed

    Lee, Ju Yeon; Kim, Jin Young; Cheon, Mi Hee; Park, Gun Wook; Ahn, Yeong Hee; Moon, Myeong Hee; Yoo, Jong Shin

    2014-02-26

    A rapid, simple, and reproducible MRM-based validation method for serological glycoprotein biomarkers in clinical use was developed by targeting the nonglycosylated tryptic peptides adjacent to N-glycosylation sites. Since changes in protein glycosylation are known to be associated with a variety of diseases, glycoproteins have been major targets in biomarker discovery. We previously found that nonglycosylated tryptic peptides adjacent to N-glycosylation sites differed in concentration between normal and hepatocellular carcinoma (HCC) plasma due to differences in steric hindrance of the glycan moiety in N-glycoproteins to tryptic digestion (Lee et al., 2011). To increase the feasibility and applicability of clinical validation of biomarker candidates (nonglycosylated tryptic peptides), we developed a method to effectively monitor nonglycosylated tryptic peptides from a large number of plasma samples and to reduce the total analysis time while maximizing the effect of steric hindrance by the glycans during digestion of glycoproteins. The AUC values of the targeted nonglycosylated tryptic peptides were excellent (0.955 for GQYCYELDEK, 0.880 for FEDGVLDPDYPR and 0.907 for TEDTIFLR), indicating that these could be effective biomarkers for hepatocellular carcinoma. This method provides the throughput required to validate glycoprotein biomarkers, as well as quantitative accuracy for human plasma analysis, and should be amenable to clinical use. Difficulties in verifying and validating putative protein biomarkers are often caused by the complex sample preparation procedures required to determine their concentrations in a large number of plasma samples. To address these difficulties, we developed MRM-based protein biomarker assays that greatly reduce the complex, time-consuming, and poorly reproducible sample pretreatment steps in plasma for clinical implementation. First, we used undepleted human plasma samples without any enrichment procedures. Using nanoLC/MS/MS, we targeted

  12. A European inter-laboratory validation of alternative endpoints of the murine local lymph node assay: first round.

    PubMed

    Ehling, G; Hecht, M; Heusener, A; Huesler, J; Gamer, A O; van Loveren, H; Maurer, Th; Riecke, K; Ullmann, L; Ulrich, P; Vandebriel, R; Vohr, H-W

    2005-08-15

    The new OECD guideline 429 (skin sensitization: local lymph node assay) is based upon a protocol which utilises the incorporation of radioactivity into DNA as a measure of cell proliferation in vivo. The guideline also enables the use of alternative endpoints in order to assess draining lymph node (LN) cell proliferation. Here we describe the first round of an inter-laboratory validation of alternative endpoints in the LLNA conducted in seven laboratories. The validation study was managed and supervised by the Swiss Agency for Therapeutic Products, Swissmedic. Statistical analyses of all data were performed by an independent centre at the University of Bern, Department of Statistics. Ear-draining LN weight and cell count were used to assess proliferation instead of radioactive labeling of lymph node cells. In addition, the acute inflammatory skin reaction was measured by ear swelling and the weight of circular biopsies of the ears to identify skin-irritating properties of the test items. Hexylcinnamaldehyde (HCA) and three blinded test items were applied to female, 8–10-week-old NMRI and BALB/c mice. Results were sent via the independent study coordinator to the statistician. The results of this first round showed that the alternative endpoints of the LLNA are sensitive and robust parameters. The use of ear weights added an important parameter for assessing skin irritation potential, which supports the differentiation of purely irritant from contact-allergenic potential. There were absolutely no discrepancies in the categorisation of the three test substances A–C among the participating laboratories. The results also highlighted that many parameters have an impact on the strength of the responses. Therefore, such parameters have to be taken into consideration for the categorisation of compounds according to their relative sensitizing potencies.

  13. The ideal laboratory information system.

    PubMed

    Sepulveda, Jorge L; Young, Donald S

    2013-08-01

    Laboratory information systems (LIS) are critical components of the operation of clinical laboratories. However, the functionalities of LIS have lagged significantly behind the capacities of current hardware and software technologies, while the complexity of the information produced by clinical laboratories has been increasing over time and will soon undergo rapid expansion with the use of new, high-throughput and high-dimensionality laboratory tests. In the broadest sense, LIS are essential to manage the flow of information between health care providers, patients, and laboratories and should be designed to optimize not only laboratory operations but also personalized clinical care. To list suggestions for designing LIS with the goal of optimizing the operation of clinical laboratories while improving clinical care by intelligent management of laboratory information. Literature review, interviews with laboratory users, and personal experience and opinion. Laboratory information systems can improve laboratory operations and improve patient care. Specific suggestions for improving the function of LIS are listed under the following sections: (1) Information Security, (2) Test Ordering, (3) Specimen Collection, Accessioning, and Processing, (4) Analytic Phase, (5) Result Entry and Validation, (6) Result Reporting, (7) Notification Management, (8) Data Mining and Cross-sectional Reports, (9) Method Validation, (10) Quality Management, (11) Administrative and Financial Issues, and (12) Other Operational Issues.

  14. Validation of the use of synthetic imagery for camouflage effectiveness assessment

    NASA Astrophysics Data System (ADS)

    Newman, Sarah; Gilmore, Marilyn A.; Moorhead, Ian R.; Filbee, David R.

    2002-08-01

    CAMEO-SIM was developed as a laboratory method to assess the effectiveness of aircraft camouflage schemes. It is a physically accurate synthetic image generator, rendering in any waveband between 0.4 and 14 microns. Camouflage schemes are assessed by displaying imagery to observers under controlled laboratory conditions or by analyzing the digital image and calculating the contrast statistics between the target and background. Code verification has taken place during development. However, validation of CAMEO-SIM is essential to ensure that the imagery produced is suitable for camouflage effectiveness assessment. Real-world characteristics are inherently variable, so exact pixel-to-pixel correlation is unnecessary. For camouflage effectiveness assessment it is more important to be confident that the comparative effects of different schemes are correct, but prediction of detection ranges is also desirable. Several different tests have been undertaken to validate CAMEO-SIM for the purpose of assessing camouflage effectiveness. Simple scenes have been modeled and measured, and the thermal and visual properties of the synthetic and real scenes have been compared. This paper describes the validation tests and discusses the suitability of CAMEO-SIM for camouflage assessment.
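A minimal example of a target-background contrast statistic is the Weber contrast; the abstract does not specify which metrics CAMEO-SIM analyses use, so this is only an assumed illustration:

```python
def weber_contrast(target_mean, background_mean):
    """Weber contrast of mean target radiance against the local background."""
    return (target_mean - background_mean) / background_mean

# A target 10% brighter than its background:
print(round(weber_contrast(1.10, 1.00), 2))  # 0.1
```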

  15. Accuracy Analysis and Validation of the Mars Science Laboratory (MSL) Robotic Arm

    NASA Technical Reports Server (NTRS)

    Collins, Curtis L.; Robinson, Matthew L.

    2013-01-01

    The Mars Science Laboratory (MSL) Curiosity Rover is currently exploring the surface of Mars with a suite of tools and instruments mounted to the end of a five degree-of-freedom robotic arm. To verify and meet a set of end-to-end system-level accuracy requirements, a detailed positioning uncertainty model of the arm was developed and exercised over the arm's operational workspace. Error sources at each link in the arm's kinematic chain were estimated and their effects propagated to the tool frames. A rigorous test and measurement program was developed and implemented to collect data to characterize and calibrate the kinematic and stiffness parameters of the arm. Numerous absolute and relative accuracy and repeatability requirements were validated with a combination of analysis and test data extrapolated to the Mars gravity and thermal environment. Initial results of arm accuracy and repeatability on Mars demonstrate the effectiveness of the modeling and test program as the rover continues to explore the foothills of Mount Sharp.
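Propagating joint-level error sources to a tool frame can be sketched as a Monte Carlo over a simple planar chain; the geometry and error magnitudes below are hypothetical and far simpler than the MSL arm's actual uncertainty model:

```python
import math
import random

def tool_position(joint_angles, link_lengths):
    """Planar forward kinematics: accumulate joint angles, sum link vectors."""
    x = y = theta = 0.0
    for angle, length in zip(joint_angles, link_lengths):
        theta += angle
        x += length * math.cos(theta)
        y += length * math.sin(theta)
    return x, y

def mean_tool_deviation(nominal_angles, link_lengths, angle_sigma_rad, n=5000, seed=1):
    """Monte Carlo error propagation: perturb each joint angle with Gaussian
    noise and report the mean radial deviation of the tool position."""
    rng = random.Random(seed)
    x0, y0 = tool_position(nominal_angles, link_lengths)
    total = 0.0
    for _ in range(n):
        noisy = [a + rng.gauss(0.0, angle_sigma_rad) for a in nominal_angles]
        x, y = tool_position(noisy, link_lengths)
        total += math.hypot(x - x0, y - y0)
    return total / n

# Hypothetical 5-joint planar chain with 1 mrad angular error per joint:
dev = mean_tool_deviation([0.1, 0.3, -0.2, 0.4, 0.1], [1.0, 0.8, 0.6, 0.4, 0.2], 1e-3)
print(0.0 < dev < 0.01)  # True: millimeter-scale tool error from milliradian joint errors
```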

  16. Clinical Validation of Targeted Next Generation Sequencing for Colon and Lung Cancers

    PubMed Central

    D’Haene, Nicky; Le Mercier, Marie; De Nève, Nancy; Blanchard, Oriane; Delaunoy, Mélanie; El Housni, Hakim; Dessars, Barbara; Heimann, Pierre; Remmelink, Myriam; Demetter, Pieter; Tejpar, Sabine; Salmon, Isabelle

    2015-01-01

    Objective Recently, Next Generation Sequencing (NGS) has begun to supplant other technologies for the gene mutation testing that is now required for targeted therapies. However, transfer of NGS technology to daily clinical practice requires validation. Methods We validated the Ion Torrent AmpliSeq Colon and Lung Cancer panel, which interrogates 1850 hotspots in 22 genes, using the Ion Torrent Personal Genome Machine. First, we used commercial reference standards that carry mutations at defined allelic frequency (AF). Then, 51 colorectal adenocarcinomas (CRC) and 39 non-small cell lung carcinomas (NSCLC) were retrospectively analyzed. Results Sensitivity and accuracy for detecting variants at an AF >4% were 100% for the commercial reference standards. Among the 90 cases, 89 (98.9%) were successfully sequenced. Among the 86 samples for which NGS and the reference test were both informative, 83 showed concordant results between NGS and the reference test (i.e., KRAS and BRAF for CRC and EGFR for NSCLC), with the 3 discordant cases each characterized by an AF <10%. Conclusions Overall, the AmpliSeq colon/lung cancer panel was specific and sensitive for mutation analysis of gene panels and can be incorporated into daily clinical practice. PMID:26366557

  17. Post-launch validation of Multispectral Thermal Imager (MTI) data and algorithms

    NASA Astrophysics Data System (ADS)

    Garrett, Alfred J.; Kurzeja, Robert J.; O'Steen, B. L.; Parker, Matthew J.; Pendergast, Malcolm M.; Villa-Aleman, Eliel

    1999-10-01

    Sandia National Laboratories (SNL), Los Alamos National Laboratory (LANL) and the Savannah River Technology Center (SRTC) have developed a diverse group of algorithms for processing and analyzing the data that will be collected by the Multispectral Thermal Imager (MTI) after launch late in 1999. Each of these algorithms must be verified by comparison to independent surface and atmospheric measurements. SRTC has selected 13 sites in the continental U.S. for ground truth data collections. These sites include a high altitude cold water target (Crater Lake), cooling lakes and towers in the warm, humid southeastern U.S., Department of Energy (DOE) climate research sites, the NASA Stennis satellite Validation and Verification (V&V) target array, waste sites at the Savannah River Site, mining sites in the Four Corners area and dry lake beds in Nevada. SRTC has established mutually beneficial relationships with the organizations that manage these sites to make use of their operating and research data and to install additional instrumentation needed for MTI algorithm V&V.

  18. 42 CFR 493.563 - Validation inspections-Basis and focus.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Validation inspections-Basis and focus. 493.563... Validation inspections—Basis and focus. (a) Basis for validation inspection—(1) Laboratory with a certificate... of that State's licensed or approved laboratories from CLIA program requirements. (b) Validation...

  19. 42 CFR 493.563 - Validation inspections-Basis and focus.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 42 Public Health 5 2012-10-01 2012-10-01 false Validation inspections-Basis and focus. 493.563... Validation inspections—Basis and focus. (a) Basis for validation inspection—(1) Laboratory with a certificate... of that State's licensed or approved laboratories from CLIA program requirements. (b) Validation...

  20. 42 CFR 493.563 - Validation inspections-Basis and focus.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 42 Public Health 5 2013-10-01 2013-10-01 false Validation inspections-Basis and focus. 493.563... Validation inspections—Basis and focus. (a) Basis for validation inspection—(1) Laboratory with a certificate... of that State's licensed or approved laboratories from CLIA program requirements. (b) Validation...

  1. 42 CFR 493.563 - Validation inspections-Basis and focus.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 5 2014-10-01 2014-10-01 false Validation inspections-Basis and focus. 493.563... Validation inspections—Basis and focus. (a) Basis for validation inspection—(1) Laboratory with a certificate... of that State's licensed or approved laboratories from CLIA program requirements. (b) Validation...

  2. 42 CFR 493.563 - Validation inspections-Basis and focus.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 5 2011-10-01 2011-10-01 false Validation inspections-Basis and focus. 493.563... Validation inspections—Basis and focus. (a) Basis for validation inspection—(1) Laboratory with a certificate... of that State's licensed or approved laboratories from CLIA program requirements. (b) Validation...

  3. Site selection and directional models of deserts used for ERBE validation targets

    NASA Technical Reports Server (NTRS)

    Staylor, W. F.

    1986-01-01

    Broadband shortwave and longwave radiance measurements obtained from the Nimbus 7 Earth Radiation Budget scanner were used to develop reflectance and emittance models for the Sahara, Gibson, and Saudi Deserts. These deserts will serve as in-flight validation targets for the Earth Radiation Budget Experiment being flown on the Earth Radiation Budget Satellite and two National Oceanic and Atmospheric Administration polar satellites. The directional reflectance model derived for the deserts was a function of the sum and product of the cosines of the solar and viewing zenith angles, and thus reciprocity existed between these zenith angles. The emittance model was related by a power law of the cosine of the viewing zenith angle.
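    The functional forms described above — reflectance depending only on the sum and product of the solar and viewing zenith-angle cosines (hence reciprocal in the two angles), and emittance following a power law of the viewing-zenith cosine — can be sketched as follows. The coefficient values are illustrative assumptions, not the fitted ERBE model parameters.

```python
import math

def desert_reflectance(sza_deg, vza_deg, a0=0.3, a1=0.1, a2=0.05):
    """Hypothetical directional reflectance of the form
    r = a0 + a1*(mu_s + mu_v) + a2*(mu_s * mu_v), where mu_s and mu_v
    are the cosines of the solar and viewing zenith angles. Because r
    depends only on their sum and product, swapping the two angles
    leaves r unchanged (reciprocity)."""
    mu_s = math.cos(math.radians(sza_deg))
    mu_v = math.cos(math.radians(vza_deg))
    return a0 + a1 * (mu_s + mu_v) + a2 * (mu_s * mu_v)

def desert_emittance(vza_deg, e0=0.95, b=0.05):
    """Hypothetical power-law emittance in the viewing-zenith cosine."""
    return e0 * math.cos(math.radians(vza_deg)) ** b

# Reciprocity: solar and viewing zenith angles are interchangeable.
assert desert_reflectance(30, 60) == desert_reflectance(60, 30)
```

    The reciprocity noted in the record falls out of the algebra: sums and products are commutative, so no separate symmetry constraint is needed.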

  4. Creation and Validation of a Novel Mobile Simulation Laboratory for High Fidelity, Prehospital, Difficult Airway Simulation.

    PubMed

    Bischof, Jason J; Panchal, Ashish R; Finnegan, Geoffrey I; Terndrup, Thomas E

    2016-10-01

    Introduction Endotracheal intubation (ETI) is a complex clinical skill complicated by the inherent challenge of providing care in the prehospital setting. The literature reports a low success rate for prehospital ETI attempts, due partly to the care environment and partly to the lack of consistent, standardized training opportunities for prehospital providers in ETI. Hypothesis/Problem A mobile simulation laboratory (MSL) for studying clinically critical interventions is needed in the prehospital setting to enhance instruction and maintain proficiency. This report describes the development and validation of a prehospital airway simulator and an MSL that mimics in situ care provided in an ambulance. The MSL was a Type 3 ambulance with four cameras allowing audio-video recordings of observable behaviors. The prehospital airway simulator is a modified airway mannequin with increased static tongue pressure and a rigid cervical collar. Airway experts validated the model in a static setting through ETI at varying tongue pressures, with a goal of a Grade 3 Cormack-Lehane (CL) laryngeal view. Following completion of this development, the MSL was launched with the prehospital airway simulator to distant communities utilizing a single facilitator/driver. Paramedics were recruited to perform ETI in the MSL, and the detailed airway management observations were stored for further analysis. Nineteen airway experts performed 57 ETI attempts at varying tongue pressures, demonstrating increased CL grades at higher tongue pressures. A tongue pressure of 60 mm Hg generated a Grade 3/4 CL view in 31% of attempts and was chosen for the prehospital trials. The MSL was launched and tested by 18 paramedics. First-pass success was 33%, with another 33% failing to intubate within three attempts. The MSL was configured to deliver, record, and assess intubator behaviors within a difficult airway simulation. The MSL created a reproducible, high fidelity, mobile learning environment for assessment of

  5. An European inter-laboratory validation of alternative endpoints of the murine local lymph node assay: 2nd round.

    PubMed

    Ehling, G; Hecht, M; Heusener, A; Huesler, J; Gamer, A O; van Loveren, H; Maurer, Th; Riecke, K; Ullmann, L; Ulrich, P; Vandebriel, R; Vohr, H-W

    2005-08-15

    The original local lymph node assay (LLNA) is based on the use of radioactive labelling to measure cell proliferation. Other endpoints for the assessment of proliferation are also authorized by OECD Guideline 429, provided there is appropriate scientific support, including full citations and a description of the methodology (OECD, 2002. OECD Guideline for the Testing of Chemicals; Skin Sensitization: Local Lymph Node Assay, Guideline 429. Paris, adopted 24th April 2002.). Here, we describe the outcome of the second round of an inter-laboratory validation of alternative endpoints in the LLNA, conducted in nine laboratories in Europe. The validation study was managed and supervised by the Swiss Agency for Therapeutic Products (Swissmedic) in Bern. Ear-draining lymph node (LN) weight and cell counts were used to assess LN cell proliferation instead of [3H]TdR incorporation. In addition, the acute inflammatory skin reaction was measured by ear weight determination of circular biopsies of the ears to identify the skin irritation properties of the test items. The statistical analysis was performed in the Department of Statistics at the University of Bern. Similar to the EC(3) values defined for the radioactive method, threshold values were calculated for the endpoints measured in this modification of the LLNA. It was concluded that all measured parameters have to be taken into consideration for the categorisation of compounds according to their sensitising potencies. Therefore, an assessment scheme was developed, which proved important for consistently distinguishing sensitisation from irritancy based on the data for the different parameters. In contrast to the radioactive method, irritants were picked up by all the laboratories applying this assessment scheme.

  6. Computational-experimental approach to drug-target interaction mapping: A case study on kinase inhibitors

    PubMed Central

    Ravikumar, Balaguru; Parri, Elina; Timonen, Sanna; Airola, Antti; Wennerberg, Krister

    2017-01-01

    Due to the relatively high costs and labor required for experimental profiling of the full target space of chemical compounds, various machine learning models have been proposed as cost-effective means to advance this process by predicting the most potent compound-target interactions for subsequent verification. However, most model predictions lack direct experimental validation in the laboratory, making their practical benefits for drug discovery or repurposing applications largely unknown. Here, we therefore introduce and carefully test a systematic computational-experimental framework for the prediction and pre-clinical verification of drug-target interactions, using a well-established kernel-based regression algorithm as the prediction model. To evaluate its performance, we first predicted unmeasured binding affinities in a large-scale kinase inhibitor profiling study, and then experimentally tested 100 compound-kinase pairs. The relatively high correlation of 0.77 (p < 0.0001) between the predicted and measured bioactivities supports the potential of the model for filling the experimental gaps in existing compound-target interaction maps. Further, we subjected the model to the more challenging task of predicting target interactions for a new candidate drug compound that lacks prior binding-profile information. As a specific case study, we used tivozanib, an investigational VEGF receptor inhibitor with a currently unknown off-target profile. Among 7 kinases with high predicted affinity, we experimentally validated 4 new off-targets of tivozanib, namely the Src-family kinases FRK and FYN A, the non-receptor tyrosine kinase ABL1, and the serine/threonine kinase SLK. Our subsequent experimental validation protocol effectively avoids any possible information leakage between the training and validation data, and therefore enables rigorous model validation for practical applications. These results demonstrate that the kernel-based modeling approach
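    The record does not specify the algorithm beyond "a well-established kernel-based regression". As a hedged illustration of the general idea, here is a minimal Nadaraya-Watson kernel regression over hypothetical compound fingerprints: a query compound's affinity is predicted as a similarity-weighted average of measured affinities. All function names, fingerprints, and affinity values are assumptions, not the study's data or model.

```python
import math

def rbf_kernel(x, y, gamma=0.5):
    """Gaussian (RBF) similarity between two feature vectors."""
    d2 = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-gamma * d2)

def predict_affinity(query, train):
    """Nadaraya-Watson kernel regression: a kernel-weighted average of
    the measured affinities of the training compound-target pairs."""
    weights = [(rbf_kernel(query, x), y) for x, y in train]
    total = sum(w for w, _ in weights)
    return sum(w * y for w, y in weights) / total

# Hypothetical (fingerprint, measured binding affinity) training pairs:
train = [([1.0, 0.0], 7.2), ([0.9, 0.1], 7.0), ([0.0, 1.0], 4.5)]
pred = predict_affinity([0.95, 0.05], train)  # dominated by the two similar pairs
```

    Predictions like this can rank unmeasured pairs so that only the top candidates go on to laboratory verification, which is the cost-saving loop the framework formalizes.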

  7. Determination of Antimycin-A in water by liquid chromatographic/mass spectrometry: single-laboratory validation

    USGS Publications Warehouse

    Bernardy, Jeffry A.; Hubert, Terrance D.; Ogorek, Jacob M.; Schmidt, Larry J.

    2013-01-01

    An LC/MS method was developed and validated for the quantitative determination and confirmation of antimycin-A (ANT-A) in water from lakes or streams. Three different water sample volumes (25, 50, and 250 mL) were evaluated. ANT-A was stabilized in the field by immediately extracting it from water into anhydrous acetone using SPE. The stabilized concentrated samples were then transported to a laboratory and analyzed by LC/MS using negative electrospray ionization. The method was determined to have adequate accuracy (78 to 113% recovery), precision (0.77 to 7.5% RSD with samples ≥500 ng/L and 4.8 to 17% RSD with samples ≤100 ng/L), linearity, and robustness over an LOQ range from 8 to 51 600 ng/L.

  8. Determination of antimycin-A in water by liquid chromatographic/mass spectrometry: single-laboratory validation.

    PubMed

    Bernardy, Jeffry A; Hubert, Terrance D; Ogorek, Jacob M; Schmidt, Larry J

    2013-01-01

    An LC/MS method was developed and validated for the quantitative determination and confirmation of antimycin-A (ANT-A) in water from lakes or streams. Three different water sample volumes (25, 50, and 250 mL) were evaluated. ANT-A was stabilized in the field by immediately extracting it from water into anhydrous acetone using SPE. The stabilized concentrated samples were then transported to a laboratory and analyzed by LC/MS using negative electrospray ionization. The method was determined to have adequate accuracy (78 to 113% recovery), precision (0.77 to 7.5% RSD with samples > or = 500 ng/L and 4.8 to 17% RSD with samples < or = 100 ng/L), linearity, and robustness over an LOQ range from 8 to 51 600 ng/L.

  9. VIRmiRNA: a comprehensive resource for experimentally validated viral miRNAs and their targets.

    PubMed

    Qureshi, Abid; Thakur, Nishant; Monga, Isha; Thakur, Anamika; Kumar, Manoj

    2014-01-01

    Viral microRNAs (miRNAs) regulate gene expression of viral and/or host genes to benefit the virus. Hence, miRNAs play a key role in host-virus interactions and the pathogenesis of viral diseases. Lately, miRNAs have also shown potential as important targets for the development of novel antiviral therapeutics. Although several miRNA and miRNA-target repositories are available for human and other organisms in the literature, a dedicated resource on viral miRNAs and their targets has been lacking. Therefore, we have developed a comprehensive viral miRNA resource harboring information on 9133 entries in three subdatabases. This includes 1308 experimentally validated miRNA sequences, with their isomiRs, encoded by 44 viruses in 'VIRmiRNA', and 7283 of their target genes in 'VIRmiRTar'. Additionally, there is information on 542 antiviral miRNAs encoded by the host against 24 viruses in 'AVIRmir'. The web interface was developed using the Linux-Apache-MySQL-PHP (LAMP) software bundle. User-friendly browse, search, and advanced search functions and useful analysis tools are also provided on the web interface. VIRmiRNA is the first specialized resource of experimentally proven virus-encoded miRNAs and their associated targets. This database would enhance the understanding of viral/host gene regulation and may also prove beneficial in the development of antiviral therapeutics. Database URL: http://crdd.osdd.net/servers/virmirna. © The Author(s) 2014. Published by Oxford University Press.

  10. Validation of numerical codes for impact and explosion cratering: Impacts on strengthless and metal targets

    NASA Astrophysics Data System (ADS)

    Pierazzo, E.; Artemieva, N.; Asphaug, E.; Baldwin, E. C.; Cazamias, J.; Coker, R.; Collins, G. S.; Crawford, D. A.; Davison, T.; Elbeshausen, D.; Holsapple, K. A.; Housen, K. R.; Korycansky, D. G.; Wünnemann, K.

    2008-12-01

    Over the last few decades, rapid improvement of computer capabilities has allowed impact cratering to be modeled with increasing complexity and realism, and has paved the way for a new era of numerical modeling of the impact process, including full, three-dimensional (3D) simulations. When properly benchmarked and validated against observation, computer models offer a powerful tool for understanding the mechanics of impact crater formation. This work presents results from the first phase of a project to benchmark and validate shock codes. A variety of 2D and 3D codes were used in this study, from commercial products like AUTODYN, to codes developed within the scientific community like SOVA, SPH, ZEUS-MP, iSALE, and codes developed at U.S. National Laboratories like CTH, SAGE/RAGE, and ALE3D. Benchmark calculations of shock wave propagation in aluminum-on-aluminum impacts were performed to examine the agreement between codes for simple idealized problems. The benchmark simulations show that variability in code results is to be expected due to differences in the underlying solution algorithm of each code, artificial stability parameters, spatial and temporal resolution, and material models. Overall, the inter-code variability in peak shock pressure as a function of distance is around 10 to 20%. In general, if the impactor is resolved by at least 20 cells across its radius, the underestimation of peak shock pressure due to spatial resolution is less than 10%. In addition to the benchmark tests, three validation tests were performed to examine the ability of the codes to reproduce the time evolution of crater radius and depth observed in vertical laboratory impacts in water and two well-characterized aluminum alloys. Results from these calculations are in good agreement with experiments. There appears to be a general tendency of shock physics codes to underestimate the radius of the forming crater. Overall, the discrepancy between the model and experiment results is

  11. Laboratory evolution of artificially expanded DNA gives redesignable aptamers that target the toxic form of anthrax protective antigen

    PubMed Central

    Biondi, Elisa; Lane, Joshua D.; Das, Debasis; Dasgupta, Saurja; Piccirilli, Joseph A.; Hoshika, Shuichi; Bradley, Kevin M.; Krantz, Bryan A.; Benner, Steven A.

    2016-01-01

    Reported here is a laboratory in vitro evolution (LIVE) experiment based on an artificially expanded genetic information system (AEGIS). This experiment delivers the first example of an AEGIS aptamer that binds to an isolated protein target, the first whose structural contact with its target has been outlined and the first to inhibit biologically important activities of its target, the protective antigen from Bacillus anthracis. We show how rational design based on secondary structure predictions can also direct the use of AEGIS to improve the stability and binding of the aptamer to its target. The final aptamer has a dissociation constant of ∼35 nM. These results illustrate the value of AEGIS-LIVE for those seeking to obtain receptors and ligands without the complexities of medicinal chemistry, and also challenge the biophysical community to develop new tools to analyze the spectroscopic signatures of new DNA folds that will emerge in synthetic genetic systems replacing standard DNA and RNA as platforms for LIVE. PMID:27701076

  12. Utility of NIST Whole-Genome Reference Materials for the Technical Validation of a Multigene Next-Generation Sequencing Test.

    PubMed

    Shum, Bennett O V; Henner, Ilya; Belluoccio, Daniele; Hinchcliffe, Marcus J

    2017-07-01

    The sensitivity and specificity of next-generation sequencing laboratory-developed tests (LDTs) are typically determined by an analyte-specific approach. Analyte-specific validations use disease-specific controls to assess an LDT's ability to detect known pathogenic variants. Alternatively, a methods-based approach can be used for LDT technical validations. Methods-focused validations do not use disease-specific controls but use benchmark reference DNA that contains known variants (benign, variants of unknown significance, and pathogenic) to assess the variant calling accuracy of a next-generation sequencing workflow. Recently, four whole-genome reference materials (RMs) from the National Institute of Standards and Technology (NIST) were released to standardize methods-based validations of next-generation sequencing panels across laboratories. We provide a practical method for using NIST RMs to validate multigene panels. We analyzed the utility of RMs in validating a novel newborn screening test that targets 70 genes, called NEO1. Despite the NIST RM variant truth set originating from multiple sequencing platforms, replicates, and library types, we discovered a 5.2% false-negative variant detection rate in the RM truth set genes that were assessed in our validation. We developed a strategy using complementary non-RM controls to demonstrate 99.6% sensitivity of the NEO1 test in detecting variants. Our findings have implications for laboratories or proficiency testing organizations using whole-genome NIST RMs for testing. Copyright © 2017 American Society for Investigative Pathology and the Association for Molecular Pathology. Published by Elsevier Inc. All rights reserved.

  13. Validation of the Sysmex sp-1000i automated slide preparer-stainer in a clinical laboratory

    PubMed Central

    de Bitencourt, Eberson Damião dos Santos; Voegeli, Carlos Franco; Onzi, Gabriela dos Santos; Boscato, Sara Cardoso; Ghem, Carine; Munhoz, Terezinha

    2013-01-01

    Background The speed and quality of information have become essential in the release of laboratory reports. The Sysmex® SP-1000i device has been developed to prepare and stain smear slides. However, for a device to be cleared for use in the laboratory routine, it must pass through a validation process. Objective To evaluate the performance and reliability of the Sysmex® SP-1000i slide preparer-stainer incorporated into the routine of a hospital laboratory in Porto Alegre. Methods Peripheral blood samples of patients attending the laboratory for ambulatory exams with leukocyte counts between 7,000/µL and 12,000/µL were evaluated, independent of gender and age. Two slides were prepared for each sample using the Sysmex® SP-1000i equipment; one of the slides was used to perform quality control tests using the CellaVision® DM96 device, and the other slide was used to compare the pre-classification by the same device with the classification performed by a pharmacist-biochemist. Results The results of all the slides used as controls were acceptable according to the quality control test as established by the manufacturer of the device. In the comparison between the automated pre-classification and the classification made by the professional, there was an acceptable variation in the differential counts of leukocytes for 90% of the analyzed slides. The Pearson correlation coefficient showed a strong correlation for band neutrophils (r = 0.802; p-value < 0.001), segmented neutrophils (r = 0.963; p-value < 0.001), eosinophils (r = 0.958; p-value < 0.001), lymphocytes (r = 0.985; p-value < 0.001) and atypical lymphocytes (r = 0.866; p-value < 0.001) using both methods. The red blood cell analysis was adequate for all slides analyzed by the equipment and by the professional. Conclusion The new Sysmex® SP-1000i methodology was found to be reliable, fast and safe for the routines of medium and large laboratories, improving the quality of microscopic analysis in complete blood

  14. Validation of Physical Activity Tracking via Android Smartphones Compared to ActiGraph Accelerometer: Laboratory-Based and Free-Living Validation Studies.

    PubMed

    Hekler, Eric B; Buman, Matthew P; Grieco, Lauren; Rosenberger, Mary; Winter, Sandra J; Haskell, William; King, Abby C

    2015-04-15

    There is increasing interest in using smartphones as stand-alone physical activity monitors via their built-in accelerometers, but there is presently limited data on the validity of this approach. The purpose of this work was to determine the validity and reliability of 3 Android smartphones for measuring physical activity among midlife and older adults. A laboratory (study 1) and a free-living (study 2) protocol were conducted. In study 1, individuals engaged in prescribed activities including sedentary (eg, sitting), light (sweeping), moderate (eg, walking 3 mph on a treadmill), and vigorous (eg, jogging 5 mph on a treadmill) activity over a 2-hour period wearing both an ActiGraph and 3 Android smartphones (ie, HTC MyTouch, Google Nexus One, and Motorola Cliq). In the free-living study, individuals engaged in usual daily activities over 7 days while wearing an Android smartphone (Google Nexus One) and an ActiGraph. Study 1 included 15 participants (age: mean 55.5, SD 6.6 years; women: 56%, 8/15). Correlations between the ActiGraph and the 3 phones were strong to very strong (ρ=.77-.82). Further, after excluding bicycling and standing, cut-point derived classifications of activities yielded a high percentage of activities classified correctly according to intensity level (eg, 78%-91% by phone) that were similar to the ActiGraph's percent correctly classified (ie, 91%). Study 2 included 23 participants (age: mean 57.0, SD 6.4 years; women: 74%, 17/23). Within the free-living context, results suggested a moderate correlation (ie, ρ=.59, P<.001) between the raw ActiGraph counts/minute and the phone's raw counts/minute and a strong correlation on minutes of moderate-to-vigorous physical activity (MVPA; ie, ρ=.67, P<.001). Results from Bland-Altman plots suggested close mean absolute estimates of sedentary (mean difference=-26 min/day of sedentary behavior) and MVPA (mean difference=-1.3 min/day of MVPA) although there was large variation. Overall, results suggest
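    The Bland-Altman analysis reported above summarizes device agreement as a mean difference (bias) with 95% limits of agreement. A minimal sketch of the arithmetic, using made-up daily MVPA minutes rather than the study's data:

```python
from statistics import mean, pstdev

def bland_altman(device_a, device_b):
    """Bland-Altman agreement summary for paired measurements:
    returns the mean difference (bias) and the 95% limits of
    agreement, bias +/- 1.96 standard deviations of the differences."""
    diffs = [a - b for a, b in zip(device_a, device_b)]
    bias = mean(diffs)
    sd = pstdev(diffs)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical daily MVPA minutes from a phone and an ActiGraph:
phone = [32, 45, 28, 50, 40]
actigraph = [34, 44, 30, 52, 41]
bias, lower, upper = bland_altman(phone, actigraph)
```

    A small bias with wide limits of agreement would mirror the record's finding of close mean estimates alongside large per-day variation.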

  15. Cosmetics Europe multi-laboratory pre-validation of the SkinEthic™ reconstituted human corneal epithelium test method for the prediction of eye irritation.

    PubMed

    Alépée, N; Bessou-Touya, S; Cotovio, J; de Smedt, A; de Wever, B; Faller, C; Jones, P; Le Varlet, B; Marrec-Fairley, M; Pfannenbecker, U; Tailhardat, M; van Goethem, F; McNamee, P

    2013-08-01

    Cosmetics Europe, The Personal Care Association, known as Colipa before 2012, conducted a program of technology transfer and assessment of Within/Between Laboratory (WLV/BLV) reproducibility of the SkinEthic™ Reconstituted Human Corneal Epithelium (HCE) as one of two human reconstructed tissue eye irritation test methods. The SkinEthic™ HCE test method involves two exposure time treatment procedures - one for short time exposure (10 min - SE) and the other for long time exposure (60 min - LE) of tissues to test substance. This paper describes pre-validation studies of the SkinEthic™ HCE test method (SE and LE protocols) as well as the Eye Peptide Reactivity Assay (EPRA). In the SE WLV study, 30 substances were evaluated. A consistent outcome with respect to viability measurement across all runs was observed with all substances showing an SD of less than 18%. In the LE WLV study, 44 out of 45 substances were consistently classified. These data demonstrated a high level of reproducibility within laboratory for both the SE and LE treatment procedures. For the LE BLV, 19 out of 20 substances were consistently classified between the three laboratories, again demonstrating a high level of reproducibility between laboratories. The results for EPRA WLV and BLV studies demonstrated that all substances analysed were categorised similarly and that the method is reproducible. The SkinEthic™ HCE test method entered into the experimental phase of a formal ECVAM validation program in 2010. Copyright © 2013. Published by Elsevier Ltd.

  16. Manufacturing validation of biologically functional T cells targeted to CD19 antigen for autologous adoptive cell therapy.

    PubMed

    Hollyman, Daniel; Stefanski, Jolanta; Przybylowski, Mark; Bartido, Shirley; Borquez-Ojeda, Oriana; Taylor, Clare; Yeh, Raymond; Capacio, Vanessa; Olszewska, Malgorzata; Hosey, James; Sadelain, Michel; Brentjens, Renier J; Rivière, Isabelle

    2009-01-01

    On the basis of promising preclinical data demonstrating the eradication of systemic B-cell malignancies by CD19-targeted T lymphocytes in vivo in severe combined immunodeficient-beige mouse models, we are launching phase I clinical trials in patients with chronic lymphocytic leukemia (CLL) and acute lymphoblastic leukemia. We present here the validation of the bioprocess which we developed for the production and expansion of clinical grade autologous T cells derived from patients with CLL. We demonstrate that T cells genetically modified with a replication-defective gammaretroviral vector derived from the Moloney murine leukemia virus encoding a chimeric antigen receptor (CAR) targeted to CD19 (1928z) can be expanded with Dynabeads CD3/CD28. This bioprocess allows us to generate clinical doses of 1928z+ T cells in approximately 2 to 3 weeks in a large-scale semiclosed culture system using the Wave Bioreactor. These 1928z+ T cells remain biologically functional not only in vitro but also in severe combined immunodeficient-beige mice bearing disseminated tumors. The validation requirements in terms of T-cell expansion, T-cell transduction with the 1928z CAR, biologic activity, quality control testing, and release criteria were met for all 4 validation runs using apheresis products from patients with CLL. Additionally, after expansion of the T cells, the diversity of the skewed Vbeta T-cell receptor repertoire was significantly restored. This validated process will be used in phase I clinical trials in patients with chemorefractory CLL and in patients with relapsed acute lymphoblastic leukemia. It can also be adapted for other clinical trials involving the expansion and transduction of patient or donor T cells using any CAR or T-cell receptor.

  17. JaCVAM-organized international validation study of the in vivo rodent alkaline comet assay for the detection of genotoxic carcinogens: I. Summary of pre-validation study results.

    PubMed

    Uno, Yoshifumi; Kojima, Hajime; Omori, Takashi; Corvi, Raffaella; Honma, Masamitsu; Schechtman, Leonard M; Tice, Raymond R; Burlinson, Brian; Escobar, Patricia A; Kraynak, Andrew R; Nakagawa, Yuzuki; Nakajima, Madoka; Pant, Kamala; Asano, Norihide; Lovell, David; Morita, Takeshi; Ohno, Yasuo; Hayashi, Makoto

    2015-07-01

    The in vivo rodent alkaline comet assay (comet assay) is used internationally to investigate the in vivo genotoxic potential of test chemicals. This assay, however, has not previously been formally validated. The Japanese Center for the Validation of Alternative Methods (JaCVAM), with the cooperation of the U.S. NTP Interagency Center for the Evaluation of Alternative Toxicological Methods (NICEATM)/the Interagency Coordinating Committee on the Validation of Alternative Methods (ICCVAM), the European Centre for the Validation of Alternative Methods (ECVAM), and the Japanese Environmental Mutagen Society/Mammalian Mutagenesis Study Group (JEMS/MMS), organized an international validation study to evaluate the reliability and relevance of the assay for identifying genotoxic carcinogens, using liver and stomach as target organs. The ultimate goal of this validation effort was to establish an Organisation for Economic Co-operation and Development (OECD) test guideline. The purpose of the pre-validation studies (i.e., Phase 1 through 3), conducted in four or five laboratories with extensive comet assay experience, was to optimize the protocol to be used during the definitive validation study. Copyright © 2015 Elsevier B.V. All rights reserved.

  18. Validation of administrative and clinical case definitions for gestational diabetes mellitus against laboratory results.

    PubMed

    Bowker, S L; Savu, A; Donovan, L E; Johnson, J A; Kaul, P

    2017-06-01

    To examine the validity of International Classification of Disease, version 10 (ICD-10) codes for gestational diabetes mellitus in administrative databases (outpatient and inpatient), and in a clinical perinatal database (Alberta Perinatal Health Program), using laboratory data as the 'gold standard'. Women aged 12-54 years with in-hospital, singleton deliveries between 1 October 2008 and 31 March 2010 in Alberta, Canada were included in the study. A gestational diabetes diagnosis was defined in the laboratory data as ≥2 abnormal values on a 75-g oral glucose tolerance test or a 50-g glucose screen ≥10.3 mmol/l. Of 58 338 pregnancies, 2085 (3.6%) met gestational diabetes criteria based on laboratory data. The gestational diabetes rates in outpatient only, inpatient only, outpatient or inpatient combined, and Alberta Perinatal Health Program databases were 5.2% (3051), 4.8% (2791), 5.8% (3367) and 4.8% (2825), respectively. Although the outpatient or inpatient combined data achieved the highest sensitivity (92%) and specificity (97%), it was associated with a positive predictive value of only 57%. The majority of the false-positives (78%), however, had one abnormal value on oral glucose tolerance test, corresponding to a diagnosis of impaired glucose tolerance in pregnancy. The ICD-10 codes for gestational diabetes in administrative databases, especially when outpatient and inpatient databases are combined, can be used to reliably estimate the burden of the disease at the population level. Because impaired glucose tolerance in pregnancy and gestational diabetes may be managed similarly in clinical practice, impaired glucose tolerance in pregnancy is often coded as gestational diabetes. © 2016 Diabetes UK.
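The validation figures reported above (sensitivity, specificity, positive predictive value) all follow directly from a 2x2 confusion table comparing the database codes against the laboratory gold standard. As a minimal sketch of that calculation (the function name and the illustrative counts are hypothetical, not taken from the study):

```python
def screening_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, and positive predictive value
    from the counts of a 2x2 confusion table:
      tp = coded positive, lab positive    fp = coded positive, lab negative
      fn = coded negative, lab positive    tn = coded negative, lab negative
    """
    sensitivity = tp / (tp + fn)   # fraction of true cases the codes catch
    specificity = tn / (tn + fp)   # fraction of non-cases correctly excluded
    ppv = tp / (tp + fp)           # fraction of coded cases that are real
    return sensitivity, specificity, ppv

# Illustrative counts only (low-prevalence condition):
sens, spec, ppv = screening_metrics(tp=90, fp=10, fn=10, tn=890)
```

Note how, at low prevalence, even a high specificity leaves the PPV modest because the false positives are drawn from a much larger negative pool, which is the pattern the study reports for the combined databases.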

  19. In silico analysis and experimental validation of azelastine hydrochloride (N4) targeting sodium taurocholate co-transporting polypeptide (NTCP) in HBV therapy.

    PubMed

    Fu, L-L; Liu, J; Chen, Y; Wang, F-T; Wen, X; Liu, H-Q; Wang, M-Y; Ouyang, L; Huang, J; Bao, J-K; Wei, Y-Q

    2014-08-01

    The aim of this study was to explore how sodium taurocholate co-transporting polypeptide (NTCP) exerts its function with hepatitis B virus (HBV), and to identify candidate compounds targeting it for HBV therapy. NTCP was identified as a novel HBV target for screening candidate small molecules by phylogenetic analysis, network construction, molecular modelling, molecular docking and molecular dynamics (MD) simulation. In vitro virological examination, q-PCR, western blotting and cytotoxicity studies were used to validate the efficacy of the candidate compounds. We performed a phylogenetic analysis of NTCP and constructed its protein-protein interaction network. We also screened compounds from DrugBank and ZINC, five of which were validated in HepG2.2.15 cells. From these, we selected compound N4 (azelastine hydrochloride) as the most potent. It showed good inhibitory activity against HBsAg (IC50 = 7.5 μM) and HBeAg (IC50 = 3.7 μM), as well as a high selectivity index (SI = 4.68). Further MD simulation results supported a good interaction between compound N4 and NTCP. In silico analysis and experimental validation together demonstrated that compound N4 can target NTCP in HepG2.2.15 cells, which may shed light on exploring it as a potential anti-HBV drug. © 2014 John Wiley & Sons Ltd.

  20. Sandia National Laboratories: Fabrication, Testing and Validation

    Science.gov Websites


  1. Laboratory Diagnostics of Botulism

    PubMed Central

    Lindström, Miia; Korkeala, Hannu

    2006-01-01

    Botulism is a potentially lethal paralytic disease caused by botulinum neurotoxin. Human pathogenic neurotoxins of types A, B, E, and F are produced by a diverse group of anaerobic spore-forming bacteria, including Clostridium botulinum groups I and II, Clostridium butyricum, and Clostridium baratii. The routine laboratory diagnostics of botulism is based on the detection of botulinum neurotoxin in the patient. Detection of toxin-producing clostridia in the patient and/or the vehicle confirms the diagnosis. The neurotoxin detection is based on the mouse lethality assay. Sensitive and rapid in vitro assays have been developed, but they have not yet been appropriately validated on clinical and food matrices. Culture methods for C. botulinum are poorly developed, and efficient isolation and identification tools are lacking. Molecular techniques targeted to the neurotoxin genes are ideal for the detection and identification of C. botulinum, but they do not detect biologically active neurotoxin and should not be used alone. Apart from rapid diagnosis, the laboratory diagnostics of botulism should aim at increasing our understanding of the epidemiology and prevention of the disease. Therefore, the toxin-producing organisms should be routinely isolated from the patient and the vehicle. The physiological group and genetic traits of the isolates should be determined. PMID:16614251

  2. Validation of the MARS: a combined physiological and laboratory risk prediction tool for 5- to 7-day in-hospital mortality.

    PubMed

    Öhman, M C; Atkins, T E H; Cooksley, T; Brabrand, M

    2018-06-01

    The Medical Admission Risk System (MARS) uses 11 physiological and laboratory variables and showed promising results in its derivation study for predicting 5- and 7-day mortality. To perform an external, independent validation of the MARS score, an unplanned secondary cohort study was conducted. Patients admitted to the medical admission unit at the Hospital of South West Jutland from 2 October 2008 to 19 February 2009 and from 23 February 2010 to 26 May 2010 were analysed. Validation of the MARS score against 5- and 7-day mortality was the primary endpoint. A total of 5858 patients were included in the study; 2923 (49.9%) were women, and the median age was 65 years (range 15-107). The MARS score had an area under the receiver operating characteristic curve of 0.858 (95% CI: 0.831-0.884) for 5-day mortality and 0.844 (0.818-0.870) for 7-day mortality, with poor calibration for both outcomes. The MARS score thus had excellent discriminatory power but poor calibration in predicting both 5- and 7-day mortality. The development of accurate combined physiological/laboratory risk scores has the potential to improve the recognition of at-risk patients.
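The discrimination statistic reported here, the area under the receiver operating characteristic curve, can be computed without any model fitting: it equals the Mann-Whitney probability that a randomly chosen patient who died outranks a randomly chosen survivor on the risk score. A minimal sketch of that equivalence (function name and scores are illustrative, not from the study):

```python
def auc_mann_whitney(event_scores, nonevent_scores):
    """AUC as the probability that a randomly chosen event case
    scores higher than a randomly chosen non-event case, with
    ties counted as one half."""
    wins = 0.0
    for e in event_scores:
        for n in nonevent_scores:
            if e > n:
                wins += 1.0
            elif e == n:
                wins += 0.5
    return wins / (len(event_scores) * len(nonevent_scores))
```

A high AUC by this measure says nothing about calibration: a score can rank patients well (as MARS does) while its predicted probabilities are systematically off, which is why both properties are assessed separately.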

  3. Single Laboratory Validated Method for Determination of Cylindrospermopsin and Anatoxin-a in Ambient Water by Liquid Chromatography/ Tandem Mass Spectrometry (LC/MS/MS)

    EPA Science Inventory

    This product is an LC/MS/MS single laboratory validated method for the determination of cylindrospermopsin and anatoxin-a in ambient waters. The product contains step-by-step instructions for sample preparation, analyses, preservation, sample holding time and QC protocols to ensu...

  4. Target validation: linking target and chemical properties to desired product profile.

    PubMed

    Wyatt, Paul G; Gilbert, Ian H; Read, Kevin D; Fairlamb, Alan H

    2011-01-01

    The discovery of drugs is a lengthy, high-risk and expensive business, taking at least 12 years and estimated to cost upwards of US$800 million for each drug successfully approved for clinical use. Much of this cost is driven by the late-phase clinical trials, and therefore the ability to terminate early those projects destined to fail is paramount to prevent unwanted costs and wasted effort. Although neglected diseases drug discovery is driven more by unmet medical need than by financial considerations, the need to minimise wasted money and resources is even more vital in this under-funded area. To ensure any drug discovery project is addressing the requirements of patients and health care providers and delivering a benefit over existing therapies, the ideal attributes of a novel drug need to be pre-defined by a set of criteria called a target product profile. Using a target product profile, the drug discovery process, clinical study design, and compound characteristics can be defined all the way back through to the suitability or druggability of the intended biochemical target. Assessment and prioritisation of the most promising targets for entry into screening programmes is crucial for maximising chances of success.

  5. Emerging techniques for the discovery and validation of therapeutic targets for skeletal diseases.

    PubMed

    Cho, Christine H; Nuttall, Mark E

    2002-12-01

    Advances in genomics and proteomics have revolutionised the drug discovery process and target validation. Identification of novel therapeutic targets for chronic skeletal diseases is an extremely challenging process based on the difficulty of obtaining high-quality human diseased versus normal tissue samples. The quality of tissue and genomic information obtained from the sample is critical to identifying disease-related genes. Using a genomics-based approach, novel genes or genes with similar homology to existing genes can be identified from cDNA libraries generated from normal versus diseased tissue. High-quality cDNA libraries are prepared from uncontaminated homogeneous cell populations harvested from tissue sections of interest. Localised gene expression analysis and confirmation are obtained through in situ hybridisation or immunohistochemical studies. Cells overexpressing the recombinant protein are subsequently designed for primary cell-based high-throughput assays that are capable of screening large compound banks for potential hits. Afterwards, secondary functional assays are used to test promising compounds. The same overexpressing cells are used in the secondary assay to test protein activity and functionality as well as screen for small-molecule agonists or antagonists. Once a hit is generated, a structure-activity relationship of the compound is optimised for better oral bioavailability and pharmacokinetics allowing the compound to progress into development. Parallel efforts from proteomics, as well as genetics/transgenics, bioinformatics and combinatorial chemistry, and improvements in high-throughput automation technologies, allow the drug discovery process to meet the demands of the medicinal market. This review discusses and illustrates how different approaches are incorporated into the discovery and validation of novel targets and, consequently, the development of potentially therapeutic agents in the areas of osteoporosis and osteoarthritis.

  6. Grid Modernization Laboratory Consortium - Testing and Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kroposki, Benjamin; Skare, Paul; Pratt, Rob

    This paper highlights some of the unique testing capabilities and projects being performed at several national laboratories as part of the U. S. Department of Energy Grid Modernization Laboratory Consortium. As part of this effort, the Grid Modernization Laboratory Consortium Testing Network is being developed to accelerate grid modernization by enabling access to a comprehensive testing infrastructure and creating a repository of validated models and simulation tools that will be publicly available. This work is key to accelerating the development, validation, standardization, adoption, and deployment of new grid technologies to help meet U. S. energy goals.

  7. Manufacturing validation of biologically functional T cells targeted to CD19 antigen for autologous adoptive cell therapy

    PubMed Central

    Hollyman, Daniel; Stefanski, Jolanta; Przybylowski, Mark; Bartido, Shirley; Borquez-Ojeda, Oriana; Taylor, Clare; Yeh, Raymond; Capacio, Vanessa; Olszewska, Malgorzata; Hosey, James; Sadelain, Michel; Brentjens, Renier J.; Rivière, Isabelle

    2009-01-01

    Summary Based on promising pre-clinical data demonstrating the eradication of systemic B cell malignancies by CD19-targeted T lymphocytes in vivo in SCID beige mouse models, we are launching Phase 1 clinical trials in patients with chronic lymphocytic leukemia (CLL) and acute lymphoblastic leukemia (ALL). We present here the validation of the bioprocess we developed for the production and expansion of clinical grade autologous T cells derived from patients with CLL. We demonstrate that T cells genetically modified with a replication-defective gammaretroviral vector derived from the Moloney murine leukemia virus encoding a chimeric antigen receptor (CAR) targeted to CD19 (1928z) can be expanded with Dynabeads® CD3/CD28. This bioprocess allows us to generate clinical doses of 1928z+ T cells in approximately 2 to 3 weeks in a large-scale semi-closed culture system using the Wave bioreactor. These 1928z+ T cells remain biologically functional not only in vitro but also in SCID beige mice bearing disseminated tumors. The validation requirements in terms of T cell expansion, T cell transduction with the 1928z CAR, biological activity, quality control testing and release criteria were met for all four validation runs using apheresis products from patients with CLL. Additionally, following expansion of the T cells, the diversity of the skewed Vβ T cell receptor repertoire was significantly restored. This validated process will be used in phase I clinical trials in patients with chemo-refractory CLL and in patients with relapsed ALL. It can also be adapted for other clinical trials involving the expansion and transduction of patient or donor T cells using any chimeric antigen receptor or T cell receptor. PMID:19238016

  8. Developing Guided Inquiry-Based Student Lab Worksheet for Laboratory Knowledge Course

    NASA Astrophysics Data System (ADS)

    Rahmi, Y. L.; Novriyanti, E.; Ardi, A.; Rifandi, R.

    2018-04-01

    The course of laboratory knowledge is an introductory course that prepares biology students for practical work in the biology laboratory. At present, the laboratory knowledge course in the Biology Department, Universitas Negeri Padang has not been supported by learning media such as a student lab worksheet. Guided inquiry is one learning model that can be integrated into laboratory activity. The study aimed to produce a guided inquiry-based student lab worksheet for the laboratory knowledge course and to determine its validity. The research was conducted using the research and development (R&D) model. The instruments used for data collection were a questionnaire for student needs analysis and a questionnaire to measure the validity of the student lab worksheet. Quantitative data were obtained from three lecturers serving as validators. The validity score of the student lab worksheet was 94.18%, which is categorized as very good.

  9. Genetic Validation of Aminoacyl-tRNA Synthetases as Drug Targets in Trypanosoma brucei

    PubMed Central

    Kalidas, Savitha; Cestari, Igor; Monnerat, Severine; Li, Qiong; Regmi, Sandesh; Hasle, Nicholas; Labaied, Mehdi; Parsons, Marilyn; Stuart, Kenneth

    2014-01-01

    Human African trypanosomiasis (HAT) is an important public health threat in sub-Saharan Africa. Current drugs are unsatisfactory, and new drugs are being sought. Few validated enzyme targets are available to support drug discovery efforts, so our goal was to obtain essentiality data on genes with proven utility as drug targets. Aminoacyl-tRNA synthetases (aaRSs) are known drug targets for bacterial and fungal pathogens and are required for protein synthesis. Here we survey the essentiality of eight Trypanosoma brucei aaRSs by RNA interference (RNAi) gene expression knockdown, covering an enzyme from each major aaRS class: valyl-tRNA synthetase (ValRS) (class Ia), tryptophanyl-tRNA synthetase (TrpRS-1) (class Ib), arginyl-tRNA synthetase (ArgRS) (class Ic), glutamyl-tRNA synthetase (GluRS) (class Ic), threonyl-tRNA synthetase (ThrRS) (class IIa), asparaginyl-tRNA synthetase (AsnRS) (class IIb), and phenylalanyl-tRNA synthetase (α and β) (PheRS) (class IIc). Knockdown of mRNA encoding these enzymes in T. brucei mammalian stage parasites showed that all were essential for parasite growth and survival in vitro. The reduced expression resulted in growth, morphological, cell cycle, and DNA content abnormalities. ThrRS was characterized in greater detail, showing that the purified recombinant enzyme displayed ThrRS activity and that the protein localized to both the cytosol and mitochondrion. Borrelidin, a known inhibitor of ThrRS, was an inhibitor of T. brucei ThrRS and showed antitrypanosomal activity. The data show that aaRSs are essential for T. brucei survival and are likely to be excellent targets for drug discovery efforts. PMID:24562907

  10. Targeted Mass Spectrometric Approach for Biomarker Discovery and Validation with Nonglycosylated Tryptic Peptides from N-linked Glycoproteins in Human Plasma*

    PubMed Central

    Lee, Ju Yeon; Kim, Jin Young; Park, Gun Wook; Cheon, Mi Hee; Kwon, Kyung-Hoon; Ahn, Yeong Hee; Moon, Myeong Hee; Lee, Hyoung–Joo; Paik, Young Ki; Yoo, Jong Shin

    2011-01-01

    A simple mass spectrometric approach for the discovery and validation of biomarkers in human plasma was developed by targeting nonglycosylated tryptic peptides adjacent to glycosylation sites in an N-linked glycoprotein, one of the most important biomarkers for early detection, prognoses, and disease therapies. The discovery and validation of novel biomarkers requires complex sample pretreatment steps, such as depletion of highly abundant proteins, enrichment of desired proteins, or the development of new antibodies. The current study exploited the steric hindrance of glycan units in N-linked glycoproteins, which significantly affects the efficiency of proteolytic digestion if an enzymatically active amino acid is adjacent to the N-linked glycosylation site. Proteolytic digestion then results in quantitatively different peptide products in accordance with the degree of glycosylation. The effect of glycan steric hindrance on tryptic digestion was first demonstrated using alpha-1-acid glycoprotein (AGP) as a model compound versus deglycosylated alpha-1-acid glycoprotein. Second, nonglycosylated tryptic peptide biomarkers, which generally show much higher sensitivity in mass spectrometric analyses than their glycosylated counterparts, were quantified in human hepatocellular carcinoma plasma using a label-free method with no need for N-linked glycoprotein enrichment. Finally, the method was validated using a multiple reaction monitoring analysis, demonstrating that the newly discovered nonglycosylated tryptic peptide targets were present at different levels in normal and hepatocellular carcinoma plasmas. The area under the receiver operating characteristic curve generated through analyses of nonglycosylated tryptic peptide from vitronectin precursor protein was 0.978, the highest observed in a group of patients with hepatocellular carcinoma. 
This work provides a targeted means of discovering and validating nonglycosylated tryptic peptides as biomarkers in human plasma.

  11. Chlorantraniliprole resistance and its biochemical and new molecular target mechanisms in laboratory and field strains of Chilo suppressalis (Walker).

    PubMed

    Sun, Yang; Xu, Lu; Chen, Qiong; Qin, Wenjing; Huang, Shuijin; Jiang, Ying; Qin, Houguo

    2018-06-01

    The rice striped stem borer (SSB), Chilo suppressalis (Walker), is one of the most economically important and destructive rice pests in China. To date, the efficiency of conventional insecticides has decreased greatly because of the development of high resistance. Since the introduction of chlorantraniliprole in 2008, SSB has presented resistance issues. In this study, laboratory resistant strains R1 and R2 [resistance ratio (RR) of 38.8 and 110.4, respectively] were established and a field population HR (RR of 249.6) was collected. Synergist assessment and enzyme activity data suggested the potential involvement of P450s and esterases in the resistance mechanism. No target (ryanodine receptor, RyR) mutation was found in R1, but a novel mutation Y4667D was found in R2. At the same position of RyR in HR strain, Y4667D and Y4667C were observed at low frequencies. In addition, the conserved mutation I4758M was found with a frequency of 94.4%. RyR mRNA expression was significantly lower in R1, R2 and HR than in S. When treated with chlorantraniliprole, RyR mRNA expression in all four strains was downregulated to ∼ 50%. A comprehensive analysis, including biochemical, target mutations and target mRNA expression, was conducted in an attempt to interpret the chlorantraniliprole resistance mechanism in both laboratory and field SSB strains. © 2017 Society of Chemical Industry.

  12. Evolution of Gas Cell Targets for Magnetized Liner Inertial Fusion Experiments at the Sandia National Laboratories PECOS Test Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paguio, R. R.; Smith, G. E.; Taylor, J. L.

    Z-Beamlet (ZBL) experiments conducted at the PECOS test facility at Sandia National Laboratories (SNL) investigated the nonlinear processes in laser-plasma interaction (laser-plasma instabilities, LPI) that complicate the deposition of laser energy through enhanced absorption, backscatter, filamentation and beam spray in large-scale laser-heated gas cell targets. These targets and experiments were designed to provide better insight into the physics of the laser preheat stage of the Magnetized Liner Inertial Fusion (MagLIF) scheme being tested on the SNL Z-machine. The experiments aim to understand the trade-offs between laser spot size, laser pulse shape, laser entrance hole (LEH) window thickness, and fuel density for laser preheat. Gas cell target design evolution and fabrication adaptations to accommodate the evolving experimental and scientific requirements are also described in this paper.

  13. Evolution of Gas Cell Targets for Magnetized Liner Inertial Fusion Experiments at the Sandia National Laboratories PECOS Test Facility

    DOE PAGES

    Paguio, R. R.; Smith, G. E.; Taylor, J. L.; ...

    2017-12-04

    Z-Beamlet (ZBL) experiments conducted at the PECOS test facility at Sandia National Laboratories (SNL) investigated the nonlinear processes in laser-plasma interaction (laser-plasma instabilities, LPI) that complicate the deposition of laser energy through enhanced absorption, backscatter, filamentation and beam spray in large-scale laser-heated gas cell targets. These targets and experiments were designed to provide better insight into the physics of the laser preheat stage of the Magnetized Liner Inertial Fusion (MagLIF) scheme being tested on the SNL Z-machine. The experiments aim to understand the trade-offs between laser spot size, laser pulse shape, laser entrance hole (LEH) window thickness, and fuel density for laser preheat. Gas cell target design evolution and fabrication adaptations to accommodate the evolving experimental and scientific requirements are also described in this paper.

  14. Target selection for a hypervelocity asteroid intercept vehicle flight validation mission

    NASA Astrophysics Data System (ADS)

    Wagner, Sam; Wie, Bong; Barbee, Brent W.

    2015-02-01

    Asteroids and comets have collided with the Earth in the past and will do so again in the future. Throughout Earth's history these collisions have played a significant role in shaping Earth's biological and geological histories. The planetary defense community has been examining a variety of options for mitigating the impact threat of asteroids and comets that approach or cross Earth's orbit, known as near-Earth objects (NEOs). This paper discusses the preliminary study results of selecting small (100-m class) NEO targets and mission analysis and design trade-offs for validating the effectiveness of a Hypervelocity Asteroid Intercept Vehicle (HAIV) concept, currently being investigated for a NIAC (NASA Innovative Advanced Concepts) Phase 2 study. In particular, this paper focuses on the mission analysis and design for single-spacecraft direct impact trajectories, as well as several mission types that enable a secondary rendezvous spacecraft to observe the HAIV impact and evaluate its effectiveness.

  15. Directed energy deflection laboratory measurements of common space based targets

    NASA Astrophysics Data System (ADS)

    Brashears, Travis; Lubin, Philip; Hughes, Gary B.; Meinhold, Peter; Batliner, Payton; Motta, Caio; Madajian, Jonathan; Mercer, Whitaker; Knowles, Patrick

    2016-09-01

    We report on laboratory studies of the effectiveness of directed energy planetary defense as a part of the DE-STAR (Directed Energy System for Targeting of Asteroids and exploRation) program. DE-STAR and DE-STARLITE are directed energy "stand-off" and "stand-on" programs, respectively. These systems consist of a modular array of kilowatt-class lasers powered by photovoltaics, and are capable of heating a spot on the surface of an asteroid to the point of vaporization. Mass ejection, as a plume of evaporated material, creates a reactionary thrust capable of diverting the asteroid's orbit. In a series of papers, we have developed a theoretical basis and described numerical simulations for determining the thrust produced by material evaporating from the surface of an asteroid. In the DE-STAR concept, the asteroid itself is used as the deflection "propellant". This study presents results of experiments designed to measure the thrust created by evaporation from a laser directed energy spot. We constructed a vacuum chamber to simulate space conditions, and installed a torsion balance that holds a common space target sample. The sample is illuminated with a fiber-array laser with flux levels up to 60 MW/m², which allows us to simulate a mission-level flux but on a small scale. We use a separate laser as well as a position-sensitive centroid detector to read out the angular motion of the torsion balance and can thus determine the thrust. We compare the measured thrust to the models. Our theoretical models indicate a coupling coefficient well in excess of 100 μN/W (optical), though we assume a more conservative value of 80 μN/W (optical) and then degrade this with an optical "encircled energy" efficiency of 0.75 to 60 μN/W (optical) in our deflection modeling. Our measurements discussed here yield about 45 μN/W (absorbed) as a reasonable lower limit to the thrust per optical watt absorbed. Results vary depending on the material tested and are limited to measurements of 1 axis, so
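The coupling-coefficient chain quoted above (a conservative 80 μN per optical watt, degraded by a 0.75 encircled-energy efficiency to 60 μN/W) is a simple multiplicative scaling of thrust with delivered laser power. A minimal sketch of that arithmetic, with a hypothetical helper name and default values taken from the figures in the abstract:

```python
def effective_thrust_uN(optical_power_w,
                        coupling_uN_per_w=80.0,
                        encircled_energy=0.75):
    """Evaporation-driven recoil thrust in micronewtons for a given
    optical power, degrading the modeled coupling coefficient by an
    encircled-energy efficiency (illustrative scaling only)."""
    return optical_power_w * coupling_uN_per_w * encircled_energy
```

At 1 W of optical power this reproduces the 60 μN/W effective figure used in the deflection modeling; the measured lower limit of roughly 45 μN per absorbed watt would correspond to a smaller coupling coefficient in the same scaling.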

  16. Comparative Validation of Five Quantitative Rapid Test Kits for the Analysis of Salt Iodine Content: Laboratory Performance, User- and Field-Friendliness

    PubMed Central

    Rohner, Fabian; Kangambèga, Marcelline O.; Khan, Noor; Kargougou, Robert; Garnier, Denis; Sanou, Ibrahima; Ouaro, Bertine D.; Petry, Nicolai; Wirth, James P.; Jooste, Pieter

    2015-01-01

    Background Iodine deficiency has important health and development consequences and the introduction of iodized salt as national programs has been a great public health success in the past decades. To render national salt iodization programs sustainable and ensure adequate iodization levels, simple methods to quantitatively assess whether salt is adequately iodized are required. Several methods claim to be simple and reliable, and are available on the market or are in development. Objective This work has validated the currently available quantitative rapid test kits (quantRTK) in a comparative manner for both their laboratory performance and ease of use in field settings. Methods Laboratory performance parameters (linearity, detection and quantification limit, intra- and inter-assay imprecision) were conducted on 5 quantRTK. We assessed inter-operator imprecision using salt of different quality along with the comparison of 59 salt samples from across the globe; measurements were made both in a laboratory and a field setting by technicians and non-technicians. Results from the quantRTK were compared against iodometric titration for validity. An ‘ease-of-use’ rating system was developed to identify the most suitable quantRTK for a given task. Results Most of the devices showed acceptable laboratory performance, but for some of the devices, use by non-technicians revealed poorer performance when working in a routine manner. Of the quantRTK tested, the iCheck® and I-Reader® showed most consistent performance and ease of use, and a newly developed paper-based method (saltPAD) holds promise if further developed. Conclusions User- and field-friendly devices are now available and the most appropriate quantRTK can be selected depending on the number of samples and the budget available. PMID:26401655

  17. Molecular Validation of PACE4 as a Target in Prostate Cancer

    PubMed Central

    D'Anjou, François; Routhier, Sophie; Perreault, Jean-Pierre; Latil, Alain; Bonnel, David; Fournier, Isabelle; Salzet, Michel; Day, Robert

    2011-01-01

    Prostate cancer remains the single most prevalent cancer in men. Standard therapies are still limited and include androgen ablation that initially causes tumor regression. However, tumor cells eventually relapse and develop into a hormone-refractory prostate cancer. One of the current challenges in this disease is to define new therapeutic targets, which have been virtually unchanged in the past 30 years. Recent studies have suggested that the family of enzymes known as the proprotein convertases (PCs) is involved in various types of cancers and their progression. The present study examined PC expression in prostate cancer and validates one PC, namely PACE4, as a target. The evidence includes the observed high expression of PACE4 in all different clinical stages of human prostate tumor tissues. Gene silencing studies targeting PACE4 in the DU145 prostate cancer cell line produced cells (cell line 4-2) with slower proliferation rates, reduced clonogenic activity, and inability to grow as xenografts in nude mice. Gene expression and proteomic profiling of the 4-2 cell line reveals an increased expression of known cancer-related genes (e.g., GJA1, CD44, IGFBP6) that are downregulated in prostate cancer. Similarly, cancer genes whose expression is decreased in the 4-2 cell line were upregulated in prostate cancer (e.g., MUC1, IL6). The direct role of PACE4 in prostate cancer is most likely through the upregulated processing of growth factors or through the aberrant processing of growth factors leading to sustained cancer progression, suggesting that PACE4 holds a central role in prostate cancer. PMID:21633671

  18. Impact cratering in viscous targets - Laboratory experiments

    NASA Technical Reports Server (NTRS)

    Greeley, R.; Fink, J.; Snyder, D. B.; Gault, D. E.; Guest, J. E.; Schultz, P. H.

    1980-01-01

    To determine the effects of target yield strength and viscosity on the formation and morphology of Martian multilobed, slosh, and rampart-type impact craters, 75 experiments in which target properties and impact energies were varied were carried out and recorded with high-speed motion pictures. The observed sequence was: (1) projectile initial impact; (2) crater excavation and rise of the ejecta plume; (3) formation of a transient central mound, which generates a surge of material upon collapse that can partly override the plume deposit; and (4) oscillation of the central mound with progressively smaller surges of material leaving the crater. A dimensional analysis of the experimental results indicates that the dimensions of the central mound are proportional to (1) the energy of the impacting projectile and (2) the inverse of both the yield strength and viscosity of the target material; extrapolation of these results to large Martian craters requires an effective surface-layer viscosity of less than 10^10 poise. These results may also be applicable to impacts on outer planet satellites composed of ice-silicate mixtures.

  19. Validation of a multi-layer Green's function code for ion beam transport

    NASA Astrophysics Data System (ADS)

    Walker, Steven; Tweed, John; Tripathi, Ram; Badavi, Francis F.; Miller, Jack; Zeitlin, Cary; Heilbronn, Lawrence

    To meet the challenge of future deep space programs, an accurate and efficient engineering code for analyzing the shielding requirements against high-energy galactic heavy radiations is needed. In consequence, a new version of the HZETRN code capable of simulating high charge and energy (HZE) ions with either laboratory or space boundary conditions is currently under development. The new code, GRNTRN, is based on a Green's function approach to the solution of Boltzmann's transport equation and like its predecessor is deterministic in nature. The computational model consists of the lowest order asymptotic approximation followed by a Neumann series expansion with non-perturbative corrections. The physical description includes energy loss with straggling, nuclear attenuation, nuclear fragmentation with energy dispersion and down shift. Code validation in the laboratory environment is addressed by showing that GRNTRN accurately predicts energy loss spectra as measured by solid-state detectors in ion beam experiments with multi-layer targets. In order to validate the code with space boundary conditions, measured particle fluences are propagated through several thicknesses of shielding using both GRNTRN and the current version of HZETRN. The excellent agreement obtained indicates that GRNTRN accurately models the propagation of HZE ions in the space environment as well as in laboratory settings and also provides verification of the HZETRN propagator.

  20. Single Laboratory Validated Method for Determination of Microcystins and Nodularin in Ambient Freshwaters by Solid Phase Extraction and Liquid Chromatography/ Tandem Mass Spectrometry (LC/MS/MS)

    EPA Pesticide Factsheets

    This document is a standardized, single-laboratory-validated liquid chromatography/tandem mass spectrometry (LC/MS/MS) method for the detection of cyanotoxins (microcystins and nodularin, combined intracellular and extracellular) in ambient freshwaters.

  1. 42 CFR 493.567 - Refusal to cooperate with validation inspection.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 42 Public Health 5 2013-10-01 2013-10-01 false Refusal to cooperate with validation inspection... § 493.567 Refusal to cooperate with validation inspection. (a) Laboratory with a certificate of accreditation. (1) A laboratory with a certificate of accreditation that refuses to cooperate with a validation...

  2. 42 CFR 493.567 - Refusal to cooperate with validation inspection.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Refusal to cooperate with validation inspection... § 493.567 Refusal to cooperate with validation inspection. (a) Laboratory with a certificate of accreditation. (1) A laboratory with a certificate of accreditation that refuses to cooperate with a validation...

  3. 42 CFR 493.567 - Refusal to cooperate with validation inspection.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 5 2014-10-01 2014-10-01 false Refusal to cooperate with validation inspection... § 493.567 Refusal to cooperate with validation inspection. (a) Laboratory with a certificate of accreditation. (1) A laboratory with a certificate of accreditation that refuses to cooperate with a validation...

  4. 42 CFR 493.567 - Refusal to cooperate with validation inspection.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 5 2011-10-01 2011-10-01 false Refusal to cooperate with validation inspection... § 493.567 Refusal to cooperate with validation inspection. (a) Laboratory with a certificate of accreditation. (1) A laboratory with a certificate of accreditation that refuses to cooperate with a validation...

  5. 42 CFR 493.567 - Refusal to cooperate with validation inspection.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 42 Public Health 5 2012-10-01 2012-10-01 false Refusal to cooperate with validation inspection... § 493.567 Refusal to cooperate with validation inspection. (a) Laboratory with a certificate of accreditation. (1) A laboratory with a certificate of accreditation that refuses to cooperate with a validation...

  6. Analytic validation and real-time clinical application of an amplicon-based targeted gene panel for advanced cancer

    PubMed Central

    Wing, Michele R.; Reeser, Julie W.; Smith, Amy M.; Reeder, Matthew; Martin, Dorrelyn; Jewell, Benjamin M.; Datta, Jharna; Miya, Jharna; Monk, J. Paul; Mortazavi, Amir; Otterson, Gregory A.; Goldberg, Richard M.; VanDeusen, Jeffrey B.; Cole, Sharon; Dittmar, Kristin; Jaiswal, Sunny; Kinzie, Matthew; Waikhom, Suraj; Freud, Aharon G.; Zhou, Xiao-Ping; Chen, Wei; Bhatt, Darshna; Roychowdhury, Sameek

    2017-01-01

    Multiplex somatic testing has emerged as a strategy to test patients with advanced cancer. We demonstrate our analytic validation approach for a gene hotspot panel and real-time prospective clinical application for any cancer type. The TruSight Tumor 26 assay amplifies 85 somatic hotspot regions across 26 genes. Using cell line and tumor mixes, we observed that 100% of the 14,715 targeted bases had at least 1000x raw coverage. We determined the sensitivity (100%, 95% CI: 96-100%), positive predictive value (100%, 95% CI: 96-100%), reproducibility (100% concordance), and limit of detection (3% variant allele frequency at 1000x read depth) of this assay to detect single nucleotide variants and small insertions and deletions. Next, we applied the assay prospectively in a clinical tumor sequencing study to evaluate 174 patients with metastatic or advanced cancer, including frozen tumors, formalin-fixed tumors, and enriched peripheral blood mononuclear cells in hematologic cancers. We reported one or more somatic mutations in 89 (53%) of the sequenced tumors (167 passing quality filters). Forty-three of these patients (26%) had mutations that would enable eligibility for targeted therapies. This study demonstrates the validity and feasibility of applying TruSight Tumor 26 for pan-cancer testing using multiple specimen types. PMID:29100271
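    Confidence intervals of the form quoted above (100%, 95% CI: 96-100%) are characteristic of exact binomial (Clopper-Pearson) intervals; the abstract does not state the method, so this is an assumption. When all n known variants are detected, the lower bound has a closed form, (α/2)^(1/n), which a sketch can illustrate:

```python
import math

def clopper_pearson_perfect_lower(n, alpha=0.05):
    """Lower bound of the exact (Clopper-Pearson) two-sided CI when all
    n trials succeed (observed proportion = 100%). For x = n the bound
    reduces to the closed form (alpha/2)**(1/n)."""
    return (alpha / 2.0) ** (1.0 / n)

# With ~90 known positive variants, 100% detection gives a CI of about 96-100%.
lower = clopper_pearson_perfect_lower(90)
print(f"95% CI: {100 * lower:.1f}-100%")  # → 95% CI: 96.0-100%
```

This also shows why the sensitivity and positive-predictive-value CIs in the abstract share the same 96-100% range: both were presumably estimated from validation sets of similar size with no discordant calls.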

  7. Statistical analysis of target acquisition sensor modeling experiments

    NASA Astrophysics Data System (ADS)

    Deaver, Dawne M.; Moyer, Steve

    2015-05-01

    The U.S. Army RDECOM CERDEC NVESD Modeling and Simulation Division is charged with the development and advancement of military target acquisition models to estimate expected soldier performance when using all types of imaging sensors. Two elements of sensor modeling are (1) laboratory-based psychophysical experiments used to measure task performance and calibrate the various models, and (2) field-based experiments used to verify the model estimates for specific sensors. In both types of experiments, it is common practice to control or measure environmental, sensor, and target physical parameters in order to minimize the uncertainty of the physics-based modeling. Predicting the minimum number of test subjects required to calibrate or validate the model should be, but is not always, done during test planning. The objective of this analysis is to develop guidelines for test planners that recommend the number and types of test samples required to yield a statistically significant result.
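    The abstract does not give its statistical machinery, but a common first-pass estimate of the required number of test samples uses the normal-approximation sample size for estimating a task-performance proportion (e.g., probability of identification) to a target margin of error; a hedged sketch under that assumption:

```python
import math

def subjects_needed(margin, p=0.5, z=1.96):
    """Normal-approximation sample size to estimate a task-performance
    proportion p within +/- margin at ~95% confidence (z = 1.96).
    p = 0.5 is the conservative worst case (maximum variance)."""
    return math.ceil(z * z * p * (1.0 - p) / (margin * margin))

n = subjects_needed(0.10)  # ±10 percentage points → 97 observer trials
```

Tighter margins grow the requirement quadratically: halving the margin to ±5 points roughly quadruples the number of trials, which is why this calculation belongs in test planning rather than after the fact.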

  8. Human L-DOPA decarboxylase mRNA is a target of miR-145: A prediction to validation workflow.

    PubMed

    Papadopoulos, Emmanuel I; Fragoulis, Emmanuel G; Scorilas, Andreas

    2015-01-10

    L-DOPA decarboxylase (DDC) is a multiply-regulated gene which encodes the enzyme that catalyzes the biosynthesis of dopamine in humans. MicroRNAs comprise a novel class of endogenously transcribed small RNAs that can post-transcriptionally regulate the expression of various genes. Given that the mechanism of microRNA target recognition remains elusive, several genes, including DDC, have not yet been identified as microRNA targets. Nevertheless, a number of specifically designed bioinformatic algorithms provide candidate miRNAs for almost every gene, but their results still exhibit moderate accuracy and should be experimentally validated. Motivated by the above, we herein sought to discover a microRNA that regulates DDC expression. Using the current algorithms according to bibliographic recommendations, we found that miR-145 could be predicted with high specificity as a candidate regulatory microRNA for DDC. A validation experiment followed, in which an appropriate cell culture system was first transfected with a synthetic miR-145 sequence and the mRNA and protein levels of DDC were then assessed via real-time PCR and Western blotting, respectively. Our analysis revealed that miR-145 had no significant impact on DDC protein levels but dramatically downregulated its mRNA expression. Overall, the experimental and bioinformatic analyses conducted herein indicate that miR-145 regulates DDC mRNA expression, potentially by recognizing its mRNA as a target. Copyright © 2014. Published by Elsevier B.V.
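    The abstract does not specify the quantification model for its real-time PCR data, but relative mRNA levels are conventionally computed with the 2^(-ΔΔCt) method; a minimal sketch under that assumption (the Ct values below are illustrative, not from the study):

```python
def fold_change(ct_target_t, ct_ref_t, ct_target_c, ct_ref_c):
    """Relative expression by the 2^-ddCt method: normalize the target
    gene's cycle threshold (Ct) to a reference gene in each condition,
    then compare the treated sample against the control."""
    dct_treated = ct_target_t - ct_ref_t
    dct_control = ct_target_c - ct_ref_c
    return 2.0 ** -(dct_treated - dct_control)

# A miR-145-transfected sample whose DDC Ct rises by 2 cycles relative to
# control (same reference-gene Ct) corresponds to ~4-fold mRNA downregulation.
print(fold_change(26.0, 18.0, 24.0, 18.0))  # → 0.25
```

Note the asymmetry the study reports: such a fold change would appear in the mRNA readout while a Western blot of the protein could remain unchanged, since translation and protein turnover are measured separately.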

  9. Uncooperative target-in-the-loop performance with backscattered speckle-field effects

    NASA Astrophysics Data System (ADS)

    Kansky, Jan E.; Murphy, Daniel V.

    2007-09-01

    Systems utilizing target-in-the-loop (TIL) techniques for adaptive optics phase compensation rely on a metric sensor to perform a hill-climbing algorithm that maximizes the far-field Strehl ratio. In uncooperative TIL, the metric signal is derived from the light backscattered from a target. In cases where the target is illuminated with a laser with sufficiently long coherence length, the potential exists for the validity of the metric sensor to be compromised by speckle-field effects. We report experimental results from a scaled laboratory designed to evaluate TIL performance in atmospheric turbulence and thermal blooming conditions where the metric sensors are influenced by varying degrees of backscatter speckle. We compare the performance of several TIL configurations and metrics for cases with static speckle, and for cases with speckle fluctuations within the frequency range at which the TIL system operates. The roles of metric sensor filtering and system bandwidth are discussed.
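    Metric-based hill climbing of this kind is often implemented as stochastic parallel gradient descent (SPGD): perturb all control channels at once, measure the change in the scalar metric, and step each channel in proportion. The abstract does not describe the actual controller, so the metric, gains, and dimensionality below are purely illustrative:

```python
import random

def spgd_maximize(metric, u, gain=0.5, delta=0.1, iters=2000):
    """Stochastic parallel gradient descent: apply a random +/-delta
    perturbation to every control channel, measure the two-sided
    metric difference, and update each channel in proportion to it."""
    random.seed(1)  # fixed seed for a reproducible demonstration
    for _ in range(iters):
        d = [delta * random.choice((-1.0, 1.0)) for _ in u]
        dj = (metric([ui + di for ui, di in zip(u, d)])
              - metric([ui - di for ui, di in zip(u, d)]))
        u = [ui + gain * dj * di for ui, di in zip(u, d)]
    return u

# Toy Strehl-like metric: peaks at 1.0 when all phase errors are zero.
strehl = lambda u: 1.0 / (1.0 + sum(x * x for x in u))
u = spgd_maximize(strehl, [1.0, -0.8, 0.5])  # drives the errors toward zero
```

The speckle problem the abstract raises maps directly onto this loop: if backscatter speckle corrupts the measured metric, the sign of dj becomes unreliable and the ascent stalls or wanders, which is why metric-sensor filtering and loop bandwidth matter.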

  10. Pigs as laboratory animals

    USDA-ARS?s Scientific Manuscript database

    The pig is increasingly popular as a laboratory animal either as the target species in its own right or as a model for humans in biomedical science. As an intelligent, social animal it has a complex behavioral repertoire reminiscent of its ancestor, the wild boar. Within a laboratory setting, the pi...

  11. Validating presupposed versus focused text information.

    PubMed

    Singer, Murray; Solar, Kevin G; Spear, Jackie

    2017-04-01

    There is extensive evidence that readers continually validate discourse accuracy and congruence, but that they may also overlook conspicuous text contradictions. Validation may be thwarted when the inaccurate ideas are embedded sentence presuppositions. In four experiments, we examined readers' validation of presupposed ("given") versus new text information. Throughout, a critical concept, such as a truck versus a bus, was introduced early in a narrative. Later, a character stated or thought something about the truck, which therefore matched or mismatched its antecedent. Furthermore, truck was presented as either given or new information. Mismatch target reading times uniformly exceeded the matching ones by similar magnitudes for given and new concepts. We obtained this outcome using different grammatical constructions and with different antecedent-target distances. In Experiment 4, we examined only given critical ideas, but varied both their matching and the main verb's factivity (e.g., factive know vs. nonfactive think). The Match × Factivity interaction closely resembled that previously observed for new target information (Singer, 2006). Thus, readers can successfully validate given target information. Although contemporary theories tend to emphasize either deficient or successful validation, both types of theory can accommodate the discourse and reader variables that may regulate validation.

  12. National survey on intra-laboratory turnaround time for some most common routine and stat laboratory analyses in 479 laboratories in China.

    PubMed

    Fei, Yang; Zeng, Rong; Wang, Wei; He, Falin; Zhong, Kun; Wang, Zhiguo

    2015-01-01

    To investigate the state of the art of intra-laboratory turnaround time (intra-TAT), provide suggestions, and find out whether laboratories accredited by the International Organization for Standardization (ISO) 15189 or the College of American Pathologists (CAP) show better performance on intra-TAT than non-accredited ones. 479 Chinese clinical laboratories participating in the external quality assessment programs of chemistry, blood gas, and haematology tests organized by the National Centre for Clinical Laboratories in China were included in our study. General information and the medians of intra-TAT for routine and stat tests over the preceding week were requested in the questionnaires. The response rates for clinical biochemistry, blood gas, and haematology testing were 36% (479/1307), 38% (228/598), and 36% (449/1250), respectively. More than 50% of laboratories indicated that they had set intra-TAT median goals, and almost 60% declared that they monitored intra-TAT for every analyte they performed. Among all analytes investigated, the intra-TAT of haematology analytes was shorter than that of biochemistry, while the intra-TAT of blood gas analytes was the shortest. There were significant differences between median intra-TAT on different days of the week for routine tests. However, there were no significant differences in median intra-TAT reported by accredited and non-accredited laboratories. Many laboratories in China are aware of intra-TAT control and are making efforts to reach the target. There is still room for improvement. Accredited laboratories have better status on intra-TAT monitoring and target setting than non-accredited ones, but there are no significant differences in the median intra-TAT they report.

  13. National survey on intra-laboratory turnaround time for some most common routine and stat laboratory analyses in 479 laboratories in China

    PubMed Central

    Fei, Yang; Zeng, Rong; Wang, Wei; He, Falin; Zhong, Kun

    2015-01-01

    Introduction: To investigate the state of the art of intra-laboratory turnaround time (intra-TAT), provide suggestions, and find out whether laboratories accredited by the International Organization for Standardization (ISO) 15189 or the College of American Pathologists (CAP) show better performance on intra-TAT than non-accredited ones. Materials and methods: 479 Chinese clinical laboratories participating in the external quality assessment programs of chemistry, blood gas, and haematology tests organized by the National Centre for Clinical Laboratories in China were included in our study. General information and the medians of intra-TAT for routine and stat tests over the preceding week were requested in the questionnaires. Results: The response rates for clinical biochemistry, blood gas, and haematology testing were 36% (479/1307), 38% (228/598), and 36% (449/1250), respectively. More than 50% of laboratories indicated that they had set intra-TAT median goals, and almost 60% declared that they monitored intra-TAT for every analyte they performed. Among all analytes investigated, the intra-TAT of haematology analytes was shorter than that of biochemistry, while the intra-TAT of blood gas analytes was the shortest. There were significant differences between median intra-TAT on different days of the week for routine tests. However, there were no significant differences in median intra-TAT reported by accredited and non-accredited laboratories. Conclusions: Many laboratories in China are aware of intra-TAT control and are making efforts to reach the target. There is still room for improvement. Accredited laboratories have better status on intra-TAT monitoring and target setting than non-accredited ones, but there are no significant differences in the median intra-TAT they report. PMID:26110033

  14. [Validation Study for Analytical Method of Diarrhetic Shellfish Poisons in 9 Kinds of Shellfish].

    PubMed

    Yamaguchi, Mizuka; Yamaguchi, Takahiro; Kakimoto, Kensaku; Nagayoshi, Haruna; Okihashi, Masahiro; Kajimura, Keiji

    2016-01-01

    A method was developed for the simultaneous determination of okadaic acid, dinophysistoxin-1, and dinophysistoxin-2 in shellfish using ultra-performance liquid chromatography with tandem mass spectrometry. Shellfish poisons in spiked samples were extracted with methanol and 90% methanol, and were hydrolyzed with 2.5 mol/L sodium hydroxide solution. Purification was done on an HLB solid-phase extraction column. The method was validated in accordance with the notification of the Ministry of Health, Labour and Welfare of Japan. In the validation study of nine kinds of shellfish, the trueness, repeatability, and within-laboratory reproducibility were 79-101%, less than 12%, and less than 16%, respectively. The trueness and precision met the target values of the notification.
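    Figures of merit like those above are computed from replicate recoveries of spiked samples. A minimal sketch of trueness (mean percent recovery) and repeatability (relative standard deviation), with illustrative replicate values rather than the study's data:

```python
import statistics

def trueness_and_rsd(measured, spiked):
    """Trueness = mean recovery (%) across replicate measurements of a
    sample spiked at a known concentration; repeatability = relative
    standard deviation (%) of those recoveries."""
    recoveries = [100.0 * m / spiked for m in measured]
    mean = statistics.mean(recoveries)
    rsd = 100.0 * statistics.stdev(recoveries) / mean
    return mean, rsd

# Five hypothetical replicates of a sample spiked at 0.16 mg/kg okadaic acid.
mean, rsd = trueness_and_rsd([0.150, 0.155, 0.148, 0.152, 0.149], 0.16)
```

Within-laboratory reproducibility is the same RSD calculation applied to replicates spread across days and analysts rather than a single run, which is why its target value (16%) is looser than repeatability's (12%).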

  15. Structural Test Laboratory | Water Power | NREL

    Science.gov Websites

    Structural Test Laboratory: NREL engineers design and configure structural tests whose results can validate models, demonstrate system reliability, inform design margins, and assess physical properties, including mass and center of gravity, to ensure compliance with design goals, as well as perform dynamic characterization.

  16. Robust diagnosis of non-Hodgkin lymphoma phenotypes validated on gene expression data from different laboratories.

    PubMed

    Bhanot, Gyan; Alexe, Gabriela; Levine, Arnold J; Stolovitzky, Gustavo

    2005-01-01

    A major challenge in cancer diagnosis from microarray data is the need for robust, accurate classification models which are independent of the analysis techniques used and can combine data from different laboratories. We propose such a classification scheme originally developed for phenotype identification from mass spectrometry data. The method uses a robust multivariate gene selection procedure and combines the results of several machine learning tools trained on raw and pattern data to produce an accurate meta-classifier. We illustrate and validate our method by applying it to gene expression datasets: the oligonucleotide HuGeneFL microarray dataset of Shipp et al. (www.genome.wi.mit.du/MPR/lymphoma) and the Hu95Av2 Affymetrix dataset (DallaFavera's laboratory, Columbia University). Our pattern-based meta-classification technique achieves higher predictive accuracies than each of the individual classifiers, is robust against data perturbations, and provides subsets of related predictive genes. Our techniques predict that combinations of some genes in the p53 pathway are highly predictive of phenotype. In particular, we find that in 80% of DLBCL cases the mRNA level of at least one of the three genes p53, PLK1 and CDK2 is elevated, while in 80% of FL cases the mRNA level of at most one of them is elevated.

  17. A 3D Computational fluid dynamics model validation for candidate molybdenum-99 target geometry

    NASA Astrophysics Data System (ADS)

    Zheng, Lin; Dale, Greg; Vorobieff, Peter

    2014-11-01

    Molybdenum-99 (99Mo) is the parent product of technetium-99m (99mTc), a radioisotope used in approximately 50,000 medical diagnostic tests per day in the U.S. The primary uses of this product include detection of heart disease, cancer, study of organ structure and function, and other applications. The US Department of Energy seeks new methods for generating 99Mo without the use of highly enriched uranium, to eliminate proliferation issues and provide a domestic supply of 99mTc for medical imaging. For this project, electron accelerating technology is used by sending an electron beam through a series of 100Mo targets. During this process a large amount of heat is created, which directly affects the operating temperature dictated by the tensile stress limit of the wall material. To maintain the required temperature range, helium gas is used as a cooling agent that flows through narrow channels between the target disks. In our numerical study, we investigate the cooling performance on a series of new geometry designs of the cooling channel. This research is supported by Los Alamos National Laboratory.

  18. Practical Aspects of Designing and Conducting Validation Studies Involving Multi-study Trials.

    PubMed

    Coecke, Sandra; Bernasconi, Camilla; Bowe, Gerard; Bostroem, Ann-Charlotte; Burton, Julien; Cole, Thomas; Fortaner, Salvador; Gouliarmou, Varvara; Gray, Andrew; Griesinger, Claudius; Louhimies, Susanna; Gyves, Emilio Mendoza-de; Joossens, Elisabeth; Prinz, Maurits-Jan; Milcamps, Anne; Parissis, Nicholaos; Wilk-Zasadna, Iwona; Barroso, João; Desprez, Bertrand; Langezaal, Ingrid; Liska, Roman; Morath, Siegfried; Reina, Vittorio; Zorzoli, Chiara; Zuang, Valérie

    This chapter focuses on practical aspects of conducting prospective in vitro validation studies, and in particular, by laboratories that are members of the European Union Network of Laboratories for the Validation of Alternative Methods (EU-NETVAL) that is coordinated by the EU Reference Laboratory for Alternatives to Animal Testing (EURL ECVAM). Prospective validation studies involving EU-NETVAL, comprising a multi-study trial involving several laboratories or "test facilities", typically consist of two main steps: (1) the design of the validation study by EURL ECVAM and (2) the execution of the multi-study trial by a number of qualified laboratories within EU-NETVAL, coordinated and supported by EURL ECVAM. The approach adopted in the conduct of these validation studies adheres to the principles described in the OECD Guidance Document on the Validation and International Acceptance of new or updated test methods for Hazard Assessment No. 34 (OECD 2005). The context and scope of conducting prospective in vitro validation studies is dealt with in Chap. 4. Here we focus mainly on the processes followed to carry out a prospective validation of in vitro methods involving different laboratories with the ultimate aim of generating a dataset that can support a decision in relation to the possible development of an international test guideline (e.g. by the OECD) or the establishment of performance standards.

  19. Validation of Immunohistochemical Assays for Integral Biomarkers in the NCI-MATCH EAY131 Clinical Trial.

    PubMed

    Khoury, Joseph D; Wang, Wei-Lien; Prieto, Victor G; Medeiros, L Jeffrey; Kalhor, Neda; Hameed, Meera; Broaddus, Russell; Hamilton, Stanley R

    2018-02-01

    Biomarkers that guide therapy selection are gaining unprecedented importance as targeted therapy options increase in scope and complexity. In conjunction with high-throughput molecular techniques, therapy-guiding biomarker assays based upon immunohistochemistry (IHC) have a critical role in cancer care in that they report the expression status of a protein target. Here, we describe the validation procedures for four clinical IHC biomarker assays (PTEN, RB, MLH1, and MSH2) for use as integral biomarkers in the nationwide NCI-Molecular Analysis for Therapy Choice (NCI-MATCH) EAY131 clinical trial. Validation procedures were developed through an iterative process based on collective experience and adaptation of broad guidelines from the FDA. The steps included primary antibody selection; assay optimization; development of assay interpretation criteria incorporating biological considerations and expected staining patterns, including indeterminate results; orthogonal validation; and tissue validation. Following assay lockdown, patient samples and cell lines were used for analytic and clinical validation. The assays were then approved as laboratory-developed tests and used for clinical trial decisions on treatment selection. Calculations of sensitivity and specificity were undertaken using various definitions of gold-standard references, and external validation was required for the PTEN IHC assay. In conclusion, validation of IHC biomarker assays critical for guiding therapy in clinical trials is feasible using comprehensive preanalytic, analytic, and postanalytic steps. Implementation of standardized guidelines provides a useful framework for validating IHC biomarker assays that allows for reproducibility across institutions for routine clinical use. Clin Cancer Res; 24(3); 521-31. ©2017 American Association for Cancer Research.

  20. Interferometric analysis of laboratory photoionized plasmas utilizing supersonic gas jet targets.

    NASA Astrophysics Data System (ADS)

    Swanson, Kyle James; Ivanov, Vladimir; Mancini, Roberto; Mayes, Daniel C.

    2018-06-01

    Photoionized plasmas are an important component of active galactic nuclei, x-ray binary systems, and other astrophysical objects. Laboratory-produced photoionized plasmas have mainly been studied at large-scale facilities, due to the need for high-intensity broadband x-ray flux. Using supersonic gas jets as targets has allowed university-scale pulsed power generators to begin similar research. The two main advantages of supersonic gas jets are that they can be located closer to the x-ray source and that there is no attenuation from containment or tamping materials. Because of these factors, this experimental platform creates a laboratory environment that more closely resembles astrophysical environments. The system was developed at the Nevada Terawatt Facility using the 1 MA pulsed power generator Zebra. Neon, argon, and nitrogen supersonic gas jets are produced approximately 7-8 mm from the z-pinch axis. The high-intensity broadband x-ray flux produced by the collapse of the z-pinch wire-array implosion irradiates the gas jet. Cylindrical wire arrays are made with 4 or 8 gold wires, each 10 µm thick. The z-pinch radiates approximately 12-16 kJ of x-ray energy, with photon energies below 1 keV. The photoionized plasma is measured via x-ray absorption spectroscopy and interferometry. A Mach-Zehnder interferometer is used to measure the neutral density of the jet prior to the Zebra shot at a wavelength of 266 nm. A dual-channel air-wedge shearing interferometer is used to measure the electron density of the ionized gas jet during the shot, at wavelengths of 532 nm and 266 nm. Using a newly developed interferometric analysis tool, average ionization state maps of the plasma can be calculated. Interferometry for nitrogen and argon shows an average ionization state in the range of 3-8. Preliminary x-ray absorption spectroscopy data show neon absorption lines. This work was sponsored in part by DOE Office of Science Grant DE-SC0014451.

  1. Data Validation & Laboratory Quality Assurance for Region 9

    EPA Pesticide Factsheets

    In all hazardous site investigations it is essential to know the quality of the data used for decision-making purposes. Validation of data requires that appropriate quality assurance and quality control (QA/QC) procedures be followed.

  2. GIS prospectivity mapping and 3D modeling validation for potential uranium deposit targets in Shangnan district, China

    NASA Astrophysics Data System (ADS)

    Xie, Jiayu; Wang, Gongwen; Sha, Yazhou; Liu, Jiajun; Wen, Botao; Nie, Ming; Zhang, Shuai

    2017-04-01

    Integrating multi-source geoscience information (such as geology, geophysics, geochemistry, and remote sensing) using GIS mapping is one of the key topics and frontiers in quantitative geosciences for mineral exploration. GIS prospective mapping and three-dimensional (3D) modeling can be used not only to extract exploration criteria and delineate metallogenetic targets but also to provide important information for the quantitative assessment of mineral resources. This paper uses the Shangnan district of Shaanxi province (China) as a case study area. GIS mapping and potential granite-hydrothermal uranium targeting were conducted in the study area combining weights of evidence (WofE) and concentration-area (C-A) fractal methods with multi-source geoscience information. 3D deposit-scale modeling using GOCAD software was performed to validate the shapes and features of the potential targets at the subsurface. The research results show that: (1) the known deposits have potential zones at depth, and the 3D geological models can delineate surface or subsurface ore-forming features, which can be used to analyze the uncertainty of the shape and feature of prospectivity mapping at the subsurface; (2) single geochemistry anomalies or remote sensing anomalies at the surface require combining the depth exploration criteria of geophysics to identify potential targets; and (3) the single or sparse exploration criteria zone with few mineralization spots at the surface has high uncertainty in terms of the exploration target.
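    The weights-of-evidence (WofE) step named above combines each binary evidence layer with known deposit locations through log-likelihood weights. A compact sketch with hypothetical cell counts (the study's actual data are not given here):

```python
import math

def weights_of_evidence(n_cells, n_deposits, n_evidence, n_dep_on_evidence):
    """W+ and W- for one binary evidence layer: log ratios of the
    conditional probabilities of observing the evidence given a deposit
    versus given no deposit. The contrast C = W+ - W- summarizes the
    strength of the spatial association."""
    p_b_d = n_dep_on_evidence / n_deposits
    p_b_nd = (n_evidence - n_dep_on_evidence) / (n_cells - n_deposits)
    w_plus = math.log(p_b_d / p_b_nd)
    w_minus = math.log((1.0 - p_b_d) / (1.0 - p_b_nd))
    return w_plus, w_minus, w_plus - w_minus

# 1000 unit cells, 10 known deposits, an evidence layer covering 200 cells,
# 8 of the deposits falling on that layer (all values hypothetical).
wp, wm, contrast = weights_of_evidence(1000, 10, 200, 8)
```

A strongly positive contrast marks a layer worth keeping as an exploration criterion; summing the posterior log-odds over all layers (under the method's conditional-independence assumption) yields the prospectivity map that the C-A fractal thresholding then segments into targets.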

  3. Laboratory hemostasis: milestones in Clinical Chemistry and Laboratory Medicine.

    PubMed

    Lippi, Giuseppe; Favaloro, Emmanuel J

    2013-01-01

    Hemostasis is a delicate, dynamic, and intricate system in which pro- and anti-coagulant forces cooperate either to maintain blood fluidity under normal conditions, or to prompt blood clot generation to limit bleeding when the integrity of blood vessels is jeopardized. Excessive prevalence of anticoagulant forces leads to hemorrhage, whereas excessive activation of procoagulant forces triggers excessive coagulation and thrombosis. The hemostasis laboratory performs a variety of first-, second-, and third-line tests, and plays a pivotal role in the diagnosis and monitoring of most hemostasis disturbances. Since the leading aims of Clinical Chemistry and Laboratory Medicine include the promotion of progress in fundamental and applied research, along with the publication of guidelines and recommendations in laboratory diagnostics, this journal is an ideal source of information on current developments in the laboratory technology of hemostasis, and this article celebrates some of the most important and popular articles ever published by the journal in the field of laboratory hemostasis.

  4. Applications of Biophysics in High-Throughput Screening Hit Validation.

    PubMed

    Genick, Christine Clougherty; Barlier, Danielle; Monna, Dominique; Brunner, Reto; Bé, Céline; Scheufler, Clemens; Ottl, Johannes

    2014-06-01

    For approximately a decade, biophysical methods have been used to validate positive hits selected from high-throughput screening (HTS) campaigns, with the goal of verifying binding interactions using label-free assays. By applying label-free readouts, screen artifacts created by compound interference and fluorescence are discovered, enabling further characterization of the hits for their target specificity and selectivity. The use of several biophysical methods to extract this type of high-content information is required to prevent the promotion of false positives to the next level of hit validation and to select the best candidates for further chemical optimization. The typical technologies applied in this arena include dynamic light scattering, turbidimetry, resonance waveguide, surface plasmon resonance, differential scanning fluorimetry, mass spectrometry, and others. Each technology can provide different types of information to enable the characterization of the binding interaction. Thus, these technologies can be incorporated into a hit-validation strategy not only according to the profile of chemical matter that is desired by the medicinal chemists, but also in a manner that is in agreement with the target protein's amenability to the screening format. Here, we present the results of screening strategies using biophysics with the objective of evaluating the approaches, discussing the advantages and challenges, and summarizing the benefits in reference to lead discovery. In summary, the biophysics screens presented here demonstrated various hit rates from a list of ~2000 preselected, IC50-validated hits from HTS (an IC50 is the inhibitor concentration at which 50% inhibition of activity is observed). There are several lessons learned from these biophysical screens, which will be discussed in this article. © 2014 Society for Laboratory Automation and Screening.

  5. STR-validator: an open source platform for validation and process control.

    PubMed

    Hansson, Oskar; Gill, Peter; Egeland, Thore

    2014-11-01

    This paper addresses two problems faced when short tandem repeat (STR) systems are validated for forensic purposes: (1) validation is extremely time consuming and expensive, and (2) there is strong consensus about what to validate but not how. The first problem is solved by powerful data-processing functions that automate calculations. Utilising an easy-to-use graphical user interface, strvalidator (hereafter referred to as STR-validator) can greatly increase the speed of validation. The second problem is exemplified by a series of analyses, and subsequent comparison with published material, highlighting the need for a common validation platform. If adopted by the forensic community, STR-validator has the potential to standardise the analysis of validation data. This would not only facilitate information exchange but also increase the pace at which laboratories are able to switch to new technology. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  6. Short communication: Validation of 4 candidate causative trait variants in 2 cattle breeds using targeted sequence imputation.

    PubMed

    Pausch, Hubert; Wurmser, Christine; Reinhardt, Friedrich; Emmerling, Reiner; Fries, Ruedi

    2015-06-01

    Most association studies for pinpointing trait-associated variants are performed within breed. The availability of sequence data from key ancestors of several cattle breeds now enables immediate assessment of the frequency of trait-associated variants in populations different from the mapping population and their imputation into large validation populations. The objective of this study was to validate the effects of 4 putatively causative variants on milk production traits, male fertility, and stature in German Fleckvieh and Holstein-Friesian animals using targeted sequence imputation. We used whole-genome sequence data of 456 animals to impute 4 missense mutations in DGAT1, GHR, PRLR, and PROP1 into 10,363 Fleckvieh and 8,812 Holstein animals. The accuracy of the imputed genotypes exceeded 95% for all variants. Association testing with imputed variants revealed consistent antagonistic effects of the DGAT1 p.A232K and GHR p.F279Y variants on milk yield and protein and fat contents, respectively, in both breeds. The allele frequency of both polymorphisms has changed considerably in the past 20 yr, indicating that they were targets of recent selection for milk production traits. The PRLR p.S18N variant was associated with yield traits in Fleckvieh but not in Holstein, suggesting that it may be in linkage disequilibrium with a mutation affecting yield traits rather than being causal. The reported effects of the PROP1 p.H173R variant on milk production, male fertility, and stature could not be confirmed. Our results demonstrate that population-wide imputation of candidate causal variants from sequence data is feasible, enabling their rapid validation in large independent populations. Copyright © 2015 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
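    The reported imputation accuracy (>95%) is conventionally computed as genotype concordance between imputed calls and direct genotypes in a validation set. A minimal sketch with hypothetical calls coded as alt-allele counts (the study's exact accuracy metric may differ in detail):

```python
def genotype_concordance(imputed, truth):
    """Fraction of genotype calls (coded 0/1/2 alt-allele counts)
    where the imputed call matches the directly genotyped truth."""
    if len(imputed) != len(truth):
        raise ValueError("call sets differ in length")
    matches = sum(a == b for a, b in zip(imputed, truth))
    return matches / len(truth)

# hypothetical validation set: 20 animals, 19 concordant calls
imputed = [0, 1, 2, 1, 0, 2, 1, 1, 0, 2, 2, 1, 0, 1, 2, 0, 1, 2, 1, 0]
truth   = [0, 1, 2, 1, 0, 2, 1, 1, 0, 2, 2, 1, 0, 1, 2, 0, 1, 1, 1, 0]
print(genotype_concordance(imputed, truth))  # 0.95
```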

  7. Chemical probes targeting epigenetic proteins: Applications beyond oncology

    PubMed Central

    Ackloo, Suzanne; Brown, Peter J.; Müller, Susanne

    2017-01-01

    Epigenetic chemical probes are potent, cell-active, small-molecule inhibitors or antagonists of specific domains in a protein; they have been indispensable for studying bromodomains and protein methyltransferases. The Structural Genomics Consortium (SGC), comprising scientists from academic and pharmaceutical laboratories, has generated most of the current epigenetic chemical probes. Moreover, the SGC has shared about 4,000 aliquots of these probes, which have been used primarily for phenotypic profiling or to validate targets in cell lines or primary patient samples cultured in vitro. Epigenetic chemical probes have been critical tools in oncology research, where they have uncovered mechanistic insights into well-established targets and identified new therapeutic starting points. Indeed, the literature primarily links epigenetic proteins to oncology, but applications in inflammatory, viral, metabolic and neurodegenerative diseases are now being reported. We summarize the literature of these emerging applications and provide examples where existing probes might be used. PMID:28080202

  8. One-week 96-well soft agar growth assay for cancer target validation.

    PubMed

    Ke, Ning; Albers, Aaron; Claassen, Gisela; Yu, De-hua; Chatterton, Jon E; Hu, Xiuyuan; Meyhack, Bernd; Wong-Staal, Flossie; Li, Qi-Xiang

    2004-05-01

    Soft agar growth, used to measure cell anchorage-independent proliferation potential, is one of the most important and most commonly used assays to detect cell transformation. However, the traditional soft agar assay is time-consuming, labor-intensive, and plagued with inconsistencies due to individual subjectivity. It does not, therefore, meet the increasing demands of today's oncology drug target screening or validation processes. This report describes an alternative 96-well soft agar growth assay that can function as a replacement for the traditional method and overcomes the aforementioned limitations. It offers the following advantages: a shortened assay duration (1 week instead of 4 weeks) that makes transient transfection or treatment possible; plate reader quantification of soft agar growth (measuring cloning efficiency and colony size); and a significant reduction in required labor. Higher throughput also makes it possible to process large numbers of samples and treatments simultaneously and in a much more efficient manner, while saving precious workspace and overall cost.

  9. Visualizing Energy on Target: Molecular Dynamics Simulations

    DTIC Science & Technology

    2017-12-01

    ARL-TR-8234, December 2017, US Army Research Laboratory. Technical report (dates covered: 1 October 2015 to 30 September 2016): Visualizing Energy on Target: Molecular Dynamics Simulations, by DeCarlos E...

  10. Validation of a personalized dosimetric evaluation tool (Oedipe) for targeted radiotherapy based on the Monte Carlo MCNPX code

    NASA Astrophysics Data System (ADS)

    Chiavassa, S.; Aubineau-Lanièce, I.; Bitar, A.; Lisbona, A.; Barbet, J.; Franck, D.; Jourdain, J. R.; Bardiès, M.

    2006-02-01

    Dosimetric studies are necessary for all patients treated with targeted radiotherapy. In order to attain the precision required, we have developed Oedipe, a dosimetric tool based on the MCNPX Monte Carlo code. The anatomy of each patient is considered in the form of a voxel-based geometry created using computed tomography (CT) images or magnetic resonance imaging (MRI). Oedipe enables dosimetry studies to be carried out at the voxel scale. Validation of the results obtained by comparison with existing methods is complex because there are multiple sources of variation: calculation methods (different Monte Carlo codes, point kernel), patient representations (model or specific) and geometry definitions (mathematical or voxel-based). In this paper, we validate Oedipe by taking each of these parameters into account independently. Monte Carlo methodology requires long calculation times, particularly in the case of voxel-based geometries, and this is one of the limits of personalized dosimetric methods. However, our results show that the use of voxel-based geometry as opposed to a mathematically defined geometry decreases the calculation time two-fold, due to an optimization of the MCNPX2.5e code. It is therefore possible to envisage the use of Oedipe for personalized dosimetry in the clinical context of targeted radiotherapy.

  11. Single-Laboratory Validation for the Determination of Flavonoids in Hawthorn Leaves and Finished Products by LC-UV.

    PubMed

    Mudge, Elizabeth M; Liu, Ying; Lund, Jensen A; Brown, Paula N

    2016-11-01

    Suitably validated analytical methods that can be used to quantify medicinally active phytochemicals in natural health products are required by regulators, manufacturers, and consumers. Hawthorn (Crataegus) is a botanical ingredient in natural health products used for the treatment of cardiovascular disorders. A method for the quantitation of vitexin-2″-O-rhamnoside, vitexin, isovitexin, rutin, and hyperoside in hawthorn leaf and flower raw materials and finished products was optimized and validated according to AOAC International guidelines. A two-level partial factorial study was used to guide the optimization of the sample preparation. The optimal conditions were found to be a 60-minute extraction using 50:48:2 methanol:water:acetic acid followed by a 25-minute separation using a reversed-phase liquid chromatography column with ultraviolet absorbance detection. The single-laboratory validation study evaluated method selectivity, accuracy, repeatability, linearity, limit of quantitation, and limit of detection. Individual flavonoid content ranged from 0.05 mg/g to 17.5 mg/g in solid dosage forms and raw materials. Repeatability ranged from 0.7 to 11.7% relative standard deviation, corresponding to HorRat values from 0.2 to 1.6. Calibration curves for each flavonoid were linear within the analytical ranges, with correlation coefficients greater than 99.9%. Herein is the first report of a validated method that is fit for the purpose of quantifying five major phytochemical marker compounds in both raw materials and finished products made from North American (Crataegus douglasii) and European (Crataegus monogyna and Crataegus laevigata) hawthorn species. The method includes optimized extraction of samples without a prolonged drying process and reduced liquid chromatography separation time. Georg Thieme Verlag KG Stuttgart · New York.
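    The HorRat values quoted above compare the observed repeatability to the RSD predicted by the Horwitz function; a commonly used form is PRSD(R) = 2·C^(-0.1505), with C the concentration expressed as a mass fraction. Guidelines differ on the exact repeatability variant, so the sketch below, using a hypothetical 4% observed RSD at 17.5 mg/g, is illustrative only:

```python
def horwitz_prsd(c_mass_fraction):
    """Horwitz-predicted reproducibility RSD (%) for a concentration
    expressed as a mass fraction (g analyte per g sample)."""
    return 2.0 * c_mass_fraction ** -0.1505

def horrat(observed_rsd_percent, c_mass_fraction):
    """HorRat: observed RSD (%) relative to the Horwitz prediction."""
    return observed_rsd_percent / horwitz_prsd(c_mass_fraction)

# e.g. a flavonoid at 17.5 mg/g = 0.0175 g/g, hypothetical observed RSD 4%
print(round(horwitz_prsd(0.0175), 2))   # predicted RSD, ~3.68%
print(round(horrat(4.0, 0.0175), 2))    # HorRat, ~1.09
```

    HorRat values near 1 (roughly 0.5 to 2 for reproducibility) are generally taken as evidence that method precision is consistent with expectation at that concentration.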

  12. Ada Compiler Validation Summary Report: Harris Corporation, Harris Ada Compiler, Version 4.0, Harris HCX-9 (Host) and (Target), 880603W1.09059

    DTIC Science & Technology

    1988-06-06

    Ada Compiler Validation Summary Report: Harris Corporation, Harris Ada Compiler, Version 4.0, Harris HCX-9 (Host) and (Target), Certificate Number 880603W1.09059. Report period: 6 June 1988 to 6 June 1988. Wright-Patterson AFB.

  13. A feasibility study of returning clinically actionable somatic genomic alterations identified in a research laboratory.

    PubMed

    Arango, Natalia Paez; Brusco, Lauren; Mills Shaw, Kenna R; Chen, Ken; Eterovic, Agda Karina; Holla, Vijaykumar; Johnson, Amber; Litzenburger, Beate; Khotskaya, Yekaterina B; Sanchez, Nora; Bailey, Ann; Zheng, Xiaofeng; Horombe, Chacha; Kopetz, Scott; Farhangfar, Carol J; Routbort, Mark; Broaddus, Russell; Bernstam, Elmer V; Mendelsohn, John; Mills, Gordon B; Meric-Bernstam, Funda

    2017-06-27

    Molecular profiling performed in the research setting usually does not benefit the patients who donate their tissues. Through a prospective protocol, we sought to determine the feasibility and utility of performing broad genomic testing in the research laboratory for discovery, and the utility of giving treating physicians access to research data, with the option of validating actionable alterations in the CLIA environment. A total of 1200 patients with advanced cancer underwent characterization of their tumors with high-depth hybrid capture sequencing of 201 genes in the research setting. Tumors were also tested in the CLIA laboratory, with a standardized hotspot mutation analysis on an 11-, 46- or 50-gene platform. 527 patients (44%) had at least one likely somatic mutation detected in an actionable gene using hotspot testing. With the 201-gene panel, 945 patients (79%) had at least one alteration in a potentially actionable gene that was undetected with the more limited CLIA panel testing. Sixty-four genomic alterations identified on the research panel were subsequently tested using an orthogonal CLIA assay. Of 16 mutations tested in the CLIA environment, 12 (75%) were confirmed. Twenty-five (52%) of 48 copy number alterations were confirmed. Nine (26.5%) of 34 patients with confirmed results received genotype-matched therapy. Seven of these patients were enrolled onto genotype-matched targeted therapy trials. Expanded cancer gene sequencing identifies more actionable genomic alterations. The option of CLIA-validating research results can provide alternative targets for personalized cancer therapy.

  14. Complementary Approaches to Existing Target Based Drug Discovery for Identifying Novel Drug Targets.

    PubMed

    Vasaikar, Suhas; Bhatia, Pooja; Bhatia, Partap G; Chu Yaiw, Koon

    2016-11-21

    In the past decade, the number of new molecular entities emerging has not kept pace with the quantum of R&D investment. There may be numerous reasons, but a few studies point to the introduction of the target-based drug discovery approach as one of the factors. Although a number of drugs have been developed with an emphasis on a single protein target, identification of a valid target is complex. The approach focuses on a single in vitro target, which overlooks the complexity of the cell and makes the process of validating drug targets uncertain. Thus, it is imperative to search for alternatives rather than relying on the success stories of target-based drug discovery. It would be beneficial if drugs were developed to target multiple components. New approaches such as reverse engineering and translational research need to take into account both system-based and target-based approaches. This review evaluates the strengths and limitations of known drug discovery approaches and proposes alternative approaches for increasing the efficiency of drug discovery.

  15. External Validity, Internal Validity, and Organizational Reality: A Response to Robert L. Cardy (Commentary).

    ERIC Educational Resources Information Center

    Steinfatt, Thomas M.

    1991-01-01

    Responds to an article in the same issue of this journal which defends the applied value of laboratory studies to managers. Agrees that external validity is often irrelevant, and maintains that the problem of making inferences from any subject sample in management communication is one that demands internal, not external, validity. (SR)

  16. Optimization of detection conditions and single-laboratory validation of a multiresidue method for the determination of 135 pesticides and 25 organic pollutants in grapes and wine by gas chromatography time-of-flight mass spectrometry.

    PubMed

    Dasgupta, Soma; Banerjee, Kaushik; Dhumal, Kondiba N; Adsule, Pandurang G

    2011-01-01

    This paper describes single-laboratory validation of a multiresidue method for the determination of 135 pesticides, 12 dioxin-like polychlorinated biphenyls, 12 polyaromatic hydrocarbons, and bisphenol A in grapes and wine by GC/time-of-flight MS in a total run time of 48 min. The method is based on extraction with ethyl acetate in a sample-to-solvent ratio of 1:1, followed by selective dispersive SPE cleanup for grapes and wine. The GC/MS conditions were optimized for the chromatographic separation and to achieve the highest S/N for all 160 target analytes, including temperature-sensitive compounds, like captan and captafol, that are prone to degradation during analysis. An average recovery of 80-120% with RSD < 10% could be attained for all analytes except 17, for which the average recoveries were 70-80%. LOQs ranged from 10 to 50 ng/g, with expanded uncertainties < 25%, for 155 compounds in grapes and 151 in wine. In the incurred grape and wine samples, residues of buprofezin, chlorpyriphos, metalaxyl, and myclobutanil were detected, with an RSD of < 5% (n = 6); the results were statistically similar to those of previously reported validated methods.
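    The recovery and RSD figures above come from replicate analyses of fortified samples. A minimal sketch of both calculations, using hypothetical replicate results for one pesticide spiked at 50 ng/g (not data from the study):

```python
def percent_recovery(measured_fortified, measured_blank, spike_level):
    """Spike recovery: analyte found in the fortified sample, minus the
    native background, relative to the amount added (all in ng/g)."""
    return 100.0 * (measured_fortified - measured_blank) / spike_level

def rsd_percent(values):
    """Relative standard deviation: sample SD over the mean, as %."""
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / (len(values) - 1)
    return 100.0 * var ** 0.5 / mean

# hypothetical replicates: found amounts (ng/g) in fortified grape extract,
# with 1.0 ng/g native background and a 50 ng/g spike
found = [48.1, 46.9, 51.2, 49.3]
recoveries = [percent_recovery(f, 1.0, 50.0) for f in found]
print([round(r, 1) for r in recoveries])  # each should fall in 80-120%
print(round(rsd_percent(recoveries), 1))  # should be < 10%
```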

  17. Solid hydrogen target for laser driven proton acceleration

    NASA Astrophysics Data System (ADS)

    Perin, J. P.; Garcia, S.; Chatain, D.; Margarone, D.

    2015-05-01

    The development of very high power lasers opens up new horizons in various fields, such as laser plasma acceleration in physics and innovative approaches to proton therapy in medicine. Laser-driven proton acceleration is commonly based on the so-called Target Normal Sheath Acceleration (TNSA) mechanism: a high power laser is focused onto a solid target (a thin metallic or plastic foil) and interacts with matter at very high intensity, generating a plasma; as a consequence, "hot" electrons are produced and move in the forward direction through the target. As electrons try to escape from the target, an ultra-strong quasi-electrostatic field (~1 TV/m) is generated, and protons are accelerated at the target rear side. Such a field can accelerate protons with a wide energy spectrum (1-200 MeV) over a few tens of micrometers. The proton beam characteristics depend on the laser parameters and on the target geometry and composition. This technique has been validated experimentally in several high power laser facilities by accelerating protons coming from hydrogenated contaminants (mainly water) at the rear of metallic targets; however, several research groups are investigating the possibility of performing experiments using "pure" hydrogen targets. In this context, the low temperature laboratory at CEA-Grenoble has developed a cryostat able to continuously produce a thin hydrogen ribbon (from 40 to 100 micrometers thick). A new extrusion concept without any moving parts has been implemented, using only the thermodynamic properties of the fluid. First results and perspectives are presented in this paper.

  18. Design and Validation of CRISPR/Cas9 Systems for Targeted Gene Modification in Induced Pluripotent Stem Cells.

    PubMed

    Lee, Ciaran M; Zhu, Haibao; Davis, Timothy H; Deshmukh, Harshahardhan; Bao, Gang

    2017-01-01

    The CRISPR/Cas9 system is a powerful tool for precision genome editing. The ability to accurately modify genomic DNA in situ with single nucleotide precision opens up new possibilities for not only basic research but also biotechnology applications and clinical translation. In this chapter, we outline the procedures for design, screening, and validation of CRISPR/Cas9 systems for targeted modification of coding sequences in the human genome and how to perform genome editing in induced pluripotent stem cells with high efficiency and specificity.
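    Guide design for SpCas9 workflows like the one outlined in this chapter starts by locating protospacers adjacent to an NGG PAM. The authors' screening pipeline is not reproduced here; a minimal single-strand scan looks like this:

```python
import re

def find_spcas9_targets(seq, protospacer_len=20):
    """Return (start, protospacer, PAM) for every site on the given
    strand where a 20-nt protospacer is immediately followed by an
    SpCas9 NGG PAM. Scans one strand only, for brevity; a real design
    tool also scans the reverse complement and scores specificity."""
    seq = seq.upper()
    hits = []
    # lookahead so overlapping PAM sites are all reported
    for m in re.finditer(r"(?=([ACGT]GG))", seq):
        pam_start = m.start(1)
        if pam_start >= protospacer_len:
            proto = seq[pam_start - protospacer_len:pam_start]
            hits.append((pam_start - protospacer_len, proto, m.group(1)))
    return hits

# toy sequence with a single NGG PAM near the end
demo = "ACGT" * 6 + "TGGCCA"
for start, proto, pam in find_spcas9_targets(demo):
    print(start, proto, pam)
```

    Candidate guides found this way are then filtered for predicted off-target activity and validated experimentally, as the chapter describes.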

  19. Development and Single-Laboratory Validation of a Liquid Chromatography Tandem Mass Spectrometry Method for Quantitation of Tetrodotoxin in Mussels and Oysters.

    PubMed

    Turner, Andrew D; Boundy, Michael J; Rapkova, Monika Dhanji

    2017-09-01

    In recent years, evidence has grown for the presence of tetrodotoxin (TTX) in bivalve mollusks, leading to the potential for consumers of contaminated products to be affected by Tetrodotoxin Shellfish Poisoning (TSP). A single-laboratory validation was conducted for the hydrophilic interaction LC (HILIC) tandem MS (MS/MS) analysis of TTX in common mussels and Pacific oysters, the bivalve species that have been found to contain TTXs in the United Kingdom in recent years. The method consists of a single-step dispersive extraction in 1% acetic acid, followed by a carbon SPE cleanup step before dilution and instrumental analysis. The full method was developed as a rapid tool for the quantitation of TTX, as well as for the associated analogs 4-epi-TTX; 5,6,11-trideoxy TTX; 11-nor TTX-6-ol; 5-deoxy TTX; and 4,9-anhydro TTX. The method can also be run to acquire TTX together with paralytic shellfish toxins. Results demonstrated acceptable method performance characteristics for specificity, linearity, recovery, ruggedness, repeatability, matrix variability, and within-laboratory reproducibility for the analysis of TTX. The LOD and LOQ were fit-for-purpose in comparison to the current action limit for TTX enforced in the Netherlands. In addition, aspects of method performance (LOD, LOQ, and within-laboratory reproducibility) were found to be satisfactory for three other TTX analogs (11-nor TTX-6-ol, 5-deoxy TTX, and 4,9-anhydro TTX). The method was found to be practical and suitable for use in regulatory testing, providing rapid turnaround of sample analysis. Plans currently underway on a full collaborative study to validate a HILIC-MS/MS method for paralytic shellfish poisoning toxins will be extended to include TTX in order to generate international acceptance, ultimately for use as an alternative official control testing method should regulatory controls be adopted.
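    LOD and LOQ values such as those reported here are often derived from calibration data using the ICH convention LOD = 3.3·s/slope and LOQ = 10·s/slope, where s is the residual standard deviation of the regression. Whether the authors used this convention is not stated; the slope and residual SD below are hypothetical:

```python
def lod_loq_from_calibration(residual_sd, slope):
    """ICH-style detection and quantitation limits from a calibration
    curve: LOD = 3.3*s/slope, LOQ = 10*s/slope, where s is the
    residual standard deviation of the regression line."""
    lod = 3.3 * residual_sd / slope
    loq = 10.0 * residual_sd / slope
    return lod, loq

# hypothetical TTX calibration: slope 1.2e4 peak-area units per (ug/kg),
# residual SD 3.0e3 peak-area units
lod, loq = lod_loq_from_calibration(3.0e3, 1.2e4)
print(round(lod, 3), round(loq, 3))  # in ug/kg
```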

  20. Optimization and single-laboratory validation of a method for the determination of flavonolignans in milk thistle seeds by high-performance liquid chromatography with ultraviolet detection.

    PubMed

    Mudge, Elizabeth; Paley, Lori; Schieber, Andreas; Brown, Paula N

    2015-10-01

    Seeds of milk thistle, Silybum marianum (L.) Gaertn., are used for treatment and prevention of liver disorders and were identified as a high-priority ingredient requiring a validated analytical method. An AOAC International expert panel reviewed existing methods and made recommendations concerning method optimization prior to validation. A series of extraction and separation studies were undertaken on the selected method for determining flavonolignans from milk thistle seeds and finished products to address the review panel recommendations. Once optimized, a single-laboratory validation study was conducted. The method was assessed for repeatability, accuracy, selectivity, LOD, LOQ, analyte stability, and linearity. Flavonolignan content ranged from 1.40 to 52.86% in raw materials and dry finished products and from 36.16 to 1570.7 μg/mL in liquid tinctures. Repeatability for the individual flavonolignans in raw materials and finished products ranged from 1.03 to 9.88% RSDr, with HorRat values between 0.21 and 1.55. Calibration curves for all flavonolignan concentrations had correlation coefficients of >99.8%. The LODs for the flavonolignans ranged from 0.20 to 0.48 μg/mL at 288 nm. Based on the results of this single-laboratory validation, this method is suitable for the quantitation of the six major flavonolignans in milk thistle raw materials and finished products, as well as multicomponent products containing dandelion, schizandra berry, and artichoke extracts. It is recommended that this method be granted First Action Official Method status by AOAC International.

  1. Single-laboratory validation of a GC/MS method for the determination of 27 polycyclic aromatic hydrocarbons (PAHs) in oils and fats.

    PubMed

    Rose, M; White, S; Macarthur, R; Petch, R G; Holland, J; Damant, A P

    2007-06-01

    A protocol for the measurement of 27 polycyclic aromatic hydrocarbons (PAHs) in vegetable oils by GC/MS has undergone single-laboratory validation. PAHs were measured in three oils (olive pomace, sunflower and coconut oil). Five samples of each oil (one unfortified, and four fortified at concentrations between 2 and 50 µg/kg) were analysed in replicate (four times in separate runs). Two samples (one unfortified and one fortified at 2 µg/kg) of five oils (virgin olive oil, grapeseed oil, toasted sesame oil, olive margarine and palm oil) were also analysed. The validation included an assessment of measurement bias from the results of 120 measurements of a certified reference material (coconut oil BCR CRM458, certified for six PAHs). The method is capable of reliably detecting 26 out of 27 PAHs, at concentrations < 2 µg/kg, which is the European Union maximum limit for benzo[a]pyrene, in vegetable oils, olive pomace oil, sunflower oil and coconut oil. Quantitative results were obtained that are fit for purpose for concentrations from < 2 to 50 µg/kg for 24 out of 27 PAHs in olive pomace oil, sunflower oil and coconut oil. The reliable detection of 2 µg/kg of PAHs in five additional oils (virgin olive oil, grapeseed oil, toasted sesame oil, olive margarine and palm oil) has been demonstrated. The method failed to produce fit-for-purpose results for the measurement of dibenzo[a,h]pyrene, anthanthrene and cyclopenta[c,d]pyrene. The reason for the failure was the large variation in results. The likely cause was the lack of availability of ¹³C isotope internal standards for these PAHs at the time of the study. The protocol has been shown to be fit-for-purpose and is suitable for formal validation by inter-laboratory collaborative study.

  2. Optimization and in Vivo Validation of Peptide Vectors Targeting the LDL Receptor.

    PubMed

    Jacquot, Guillaume; Lécorché, Pascaline; Malcor, Jean-Daniel; Laurencin, Mathieu; Smirnova, Maria; Varini, Karine; Malicet, Cédric; Gassiot, Fanny; Abouzid, Karima; Faucon, Aude; David, Marion; Gaudin, Nicolas; Masse, Maxime; Ferracci, Géraldine; Dive, Vincent; Cisternino, Salvatore; Khrestchatisky, Michel

    2016-12-05

    -VH4127 in wild-type or ldlr -/- mice confirmed their active LDLR targeting in vivo. Overall, this study extends our previous work toward a diversified portfolio of LDLR-targeted peptide vectors with validated LDLR-targeting potential in vivo.

  3. Validation of Bioreactor and Human-on-a-Chip Devices for Chemical Safety Assessment.

    PubMed

    Rebelo, Sofia P; Dehne, Eva-Maria; Brito, Catarina; Horland, Reyk; Alves, Paula M; Marx, Uwe

    2016-01-01

    Equipment and device qualification and test assay validation in the field of tissue engineered human organs for substance assessment remain formidable tasks with only a few successful examples so far. The hurdles seem to increase with the growing complexity of the biological systems, emulated by the respective models. Controlled single tissue or organ culture in bioreactors improves the organ-specific functions and maintains their phenotypic stability for longer periods of time. The reproducibility attained with bioreactor operations is, per se, an advantage for the validation of safety assessment. Regulatory agencies have gradually altered the validation concept from exhaustive "product" to rigorous and detailed process characterization, valuing reproducibility as a standard for validation. "Human-on-a-chip" technologies applying micro-physiological systems to the in vitro combination of miniaturized human organ equivalents into functional human micro-organisms are nowadays thought to be the most elaborate solution created to date. They target the replacement of the current most complex models: laboratory animals. Therefore, we provide here a road map towards the validation of such "human-on-a-chip" models and qualification of their respective bioreactor and microchip equipment along a path currently used for the respective animal models.

  4. Target detection and localization in shallow water: an experimental demonstration of the acoustic barrier problem at the laboratory scale.

    PubMed

    Marandet, Christian; Roux, Philippe; Nicolas, Barbara; Mars, Jérôme

    2011-01-01

    This study demonstrates experimentally at the laboratory scale the detection and localization of a wavelength-sized target in a shallow ultrasonic waveguide between two source-receiver arrays at 3 MHz. In the framework of the acoustic barrier problem, at the 1/1000 scale, the waveguide represents a 1.1-km-long, 52-m-deep ocean acoustic channel in the kilohertz frequency range. The two coplanar arrays record in the time-domain the transfer matrix of the waveguide between each pair of source-receiver transducers. Invoking the reciprocity principle, a time-domain double-beamforming algorithm is simultaneously performed on the source and receiver arrays. This array processing projects the multireverberated acoustic echoes into an equivalent set of eigenrays, which are defined by their launch and arrival angles. Comparison is made between the intensity of each eigenray without and with a target for detection in the waveguide. Localization is performed through tomography inversion of the acoustic impedance of the target, using all of the eigenrays extracted from double beamforming. The use of the diffraction-based sensitivity kernel for each eigenray provides both the localization and the signature of the target. Experimental results are shown in the presence of surface waves, and methodological issues are discussed for detection and localization.

  5. Evaluation of Non-Laboratory and Laboratory Prediction Models for Current and Future Diabetes Mellitus: A Cross-Sectional and Retrospective Cohort Study

    PubMed Central

    Hahn, Seokyung; Moon, Min Kyong; Park, Kyong Soo; Cho, Young Min

    2016-01-01

    Background: Various diabetes risk scores composed of non-laboratory parameters have been developed, but only a few studies performed cross-validation of these scores and a comparison with laboratory parameters. We evaluated the performance of diabetes risk scores composed of non-laboratory parameters, including a recently published Korean risk score (KRS), and compared them with laboratory parameters. Methods: The data of 26,675 individuals who visited the Seoul National University Hospital Healthcare System Gangnam Center for a health screening program were reviewed for cross-sectional validation. The data of 3,029 individuals with a mean of 6.2 years of follow-up were reviewed for longitudinal validation. The KRS and 16 other risk scores were evaluated and compared with a laboratory prediction model developed by logistic regression analysis. Results: For the screening of undiagnosed diabetes, the KRS exhibited a sensitivity of 81%, a specificity of 58%, and an area under the receiver operating characteristic curve (AROC) of 0.754. Other scores showed AROCs that ranged from 0.697 to 0.782. For the prediction of future diabetes, the KRS exhibited a sensitivity of 74%, a specificity of 54%, and an AROC of 0.696. Other scores had AROCs ranging from 0.630 to 0.721. The laboratory prediction model composed of fasting plasma glucose and hemoglobin A1c levels showed a significantly higher AROC (0.838, P < 0.001) than the KRS. The addition of the KRS to the laboratory prediction model increased the AROC (0.849, P = 0.016) without a significant improvement in the risk classification (net reclassification index: 4.6%, P = 0.264). Conclusions: The non-laboratory risk scores, including KRS, are useful to estimate the risk of undiagnosed diabetes but are inferior to the laboratory parameters for predicting future diabetes. PMID:27214034
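    AROC values like those reported above can be computed without constructing the ROC curve explicitly: the area equals the probability that a randomly chosen positive case receives a higher score than a randomly chosen negative one (the Mann-Whitney formulation). A minimal sketch with hypothetical risk scores:

```python
def auroc(scores_pos, scores_neg):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the probability that a randomly chosen positive case scores
    higher than a randomly chosen negative one (ties count 1/2)."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# hypothetical risk scores for screened individuals
diabetic    = [9, 7, 8, 5]   # scores in the diabetic group
nondiabetic = [4, 6, 3, 7]   # scores in the non-diabetic group
print(auroc(diabetic, nondiabetic))  # 0.84375
```

    The O(n·m) double loop is fine for a sketch; production code would use a rank-based formulation for large cohorts.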

  6. Microbial ecology laboratory procedures manual NASA/MSFC

    NASA Technical Reports Server (NTRS)

    Huff, Timothy L.

    1990-01-01

    An essential part of the efficient operation of any microbiology laboratory involved in sample analysis is a standard procedures manual. The purpose of this manual is to provide concise and well defined instructions on routine technical procedures involving sample analysis and methods for monitoring and maintaining quality control within the laboratory. Of equal importance is the safe operation of the laboratory. This manual outlines detailed procedures to be followed in the microbial ecology laboratory to assure safety, analytical control, and validity of results.

  7. Targeting HER2 in the treatment of non-small cell lung cancer.

    PubMed

    Mar, Nataliya; Vredenburgh, James J; Wasser, Jeffrey S

    2015-03-01

    Oncogenic driver mutations have emerged as major treatment targets for molecular therapies in a variety of cancers. HER2 positivity has been well studied in breast cancer, but its importance is still being explored in non-small cell lung cancer (NSCLC). Laboratory methods for assessment of HER2 positivity in NSCLC include immunohistochemistry (IHC) for protein overexpression, fluorescent in situ hybridization (FISH) for gene amplification, and next generation sequencing (NGS) for gene mutations. The prognostic and predictive significance of these tests remains to be validated, with an emerging association between HER2 gene mutations and response to HER2-targeted therapies. Regardless of the assay used to determine the HER2 status of lung tumors, all patients with advanced HER2-positive lung adenocarcinoma should be evaluated for treatment with targeted agents. Several clinical approaches for inclusion of these drugs into patient treatment plans exist, but there is no defined algorithm specific to NSCLC. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  8. Evaluation of Targeted Next-Generation Sequencing for Detection of Bovine Pathogens in Clinical Samples.

    PubMed

    Anis, Eman; Hawkins, Ian K; Ilha, Marcia R S; Woldemeskel, Moges W; Saliki, Jeremiah T; Wilkes, Rebecca P

    2018-07-01

    The laboratory diagnosis of infectious diseases, especially those caused by mixed infections, is challenging. Routinely, it requires submission of multiple samples to separate laboratories. Advances in next-generation sequencing (NGS) have provided the opportunity for development of a comprehensive method to identify infectious agents. This study describes the use of target-specific primers for PCR-mediated amplification with the NGS technology in which pathogen genomic regions of interest are enriched and selectively sequenced from clinical samples. In the study, 198 primers were designed to target 43 common bovine and small-ruminant bacterial, fungal, viral, and parasitic pathogens, and a bioinformatics tool was specifically constructed for the detection of targeted pathogens. The primers were confirmed to detect the intended pathogens by testing reference strains and isolates. The method was then validated using 60 clinical samples (including tissues, feces, and milk) that were also tested with other routine diagnostic techniques. The detection limits of the targeted NGS method were evaluated using 10 representative pathogens that were also tested by quantitative PCR (qPCR), and the NGS method was able to detect the organisms from samples with qPCR threshold cycle (CT) values in the 30s. The method was successful for the detection of multiple pathogens in the clinical samples, including some additional pathogens missed by the routine techniques because the specific tests needed for the particular organisms were not performed. The results demonstrate the feasibility of the approach and indicate that it is possible to incorporate NGS as a diagnostic tool in a cost-effective manner into a veterinary diagnostic laboratory. Copyright © 2018 Anis et al.

  9. Measurement and validation of benchmark-quality thick-target tungsten X-ray spectra below 150 kVp.

    PubMed

    Mercier, J R; Kopp, D T; McDavid, W D; Dove, S B; Lancaster, J L; Tucker, D M

    2000-11-01

    Pulse-height distributions of two constant potential X-ray tubes with fixed anode tungsten targets were measured and unfolded. The measurements employed quantitative alignment of the beam, the use of two different semiconductor detectors (high-purity germanium and cadmium-zinc-telluride), two different ion chamber systems with beam-specific calibration factors, and various filter and tube potential combinations. Monte Carlo response matrices were generated for each detector for unfolding the pulse-height distributions into spectra incident on the detectors. These response matrices were validated for the low error bars assigned to the data. A significant aspect of the validation of spectra, and a detailed characterization of the X-ray tubes, involved measuring filtered and unfiltered beams at multiple tube potentials (30-150 kVp). Full corrections to ion chamber readings were employed to convert normalized fluence spectra into absolute fluence spectra. The characterization of fixed anode pitting and its dominance over exit window plating and/or detector dead layer was determined. An Appendix of tabulated benchmark spectra with assigned error ranges was developed for future reference.

  10. Validation of the kidney disease quality of life-short form: a cross-sectional study of a dialysis-targeted health measure in Singapore.

    PubMed

    Joshi, Veena D; Mooppil, Nandakumar; Lim, Jeremy Fy

    2010-12-20

    In Singapore, the prevalence of end-stage renal disease (ESRD) and the number of people on dialysis are increasing. The impact of ESRD on patient quality of life has been recognized as an important outcome measure. The Kidney Disease Quality Of Life-Short Form (KDQOL-SF™) has been validated and is widely used as a measure of quality of life in dialysis patients in many countries, but not in Singapore. We aimed to determine the reliability and validity of the KDQOL-SF™ for haemodialysis patients in Singapore. From December 2006 through January 2007, this cross-sectional study gathered data on patients ≥21 years old, who were undergoing haemodialysis at the National Kidney Foundation in Singapore. We used exploratory factor analysis to determine construct validity of the eight KDQOL-SF™ sub-scales, Cronbach's alpha coefficient to determine internal consistency reliability, correlation of the overall health rating with kidney disease-targeted scales to confirm validity, and correlation of the eight sub-scales with age, income and education to determine convergent and divergent validity. Of 1980 haemodialysis patients, 1180 (59%) completed the KDQOL-SF™. Full information was available for 980 participants, with a mean age of 56 years. The sample was representative of the total dialysis population in Singapore, except that Indian ethnicity was over-represented. The instrument designers' proposed eight sub-scales were confirmed, which together accounted for 68.4% of the variance. All sub-scales had a Cronbach's α above the recommended minimum value of 0.7 to indicate good reliability (range: 0.72 to 0.95), except for Social function (0.66). Correlation of items within subscales was higher than correlation of items outside subscales in 90% of the cases. The overall health rating positively correlated with kidney disease-targeted scales, confirming validity. General health subscales were found to have significant associations with age, income and education
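The internal-consistency statistic used here, Cronbach's alpha, is straightforward to compute from item-level responses. A minimal sketch with invented data (not the KDQOL-SF™ items):

```python
def variance(xs):
    """Sample variance (n - 1 denominator)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(items):
    """items: list of k lists, one per item, each holding that item's
    scores across all respondents. alpha = k/(k-1) * (1 - sum of item
    variances / variance of respondent totals)."""
    k = len(items)
    totals = [sum(resp) for resp in zip(*items)]  # per-respondent sum
    item_var = sum(variance(it) for it in items)
    return k / (k - 1) * (1 - item_var / variance(totals))

# Toy sub-scale: 3 items answered by 4 respondents (illustrative only)
items = [[4, 3, 5, 2],
         [4, 2, 5, 3],
         [5, 3, 4, 2]]
alpha = cronbach_alpha(items)  # exceeds the 0.7 threshold cited above
```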

  11. Youth Oriented Activity Trackers: Comprehensive Laboratory- and Field-Based Validation

    PubMed Central

    2017-01-01

    Background Commercial activity trackers are growing in popularity among adults and some are beginning to be marketed to children. There is, however, a paucity of independent research examining the validity of these devices to detect physical activity of different intensity levels. Objectives The purpose of this study was to determine the validity of the output from 3 commercial youth-oriented activity trackers in 3 phases: (1) orbital shaker, (2) structured indoor activities, and (3) 4 days of free-living activity. Methods Four units of each activity tracker (Movband [MB], Sqord [SQ], and Zamzee [ZZ]) were tested in an orbital shaker for 5 minutes at three frequencies (1.3, 1.9, and 2.5 Hz). Participants for Phase 2 (N=14) and Phase 3 (N=16) were 6- to 12-year-old children (50% male). For Phase 2, participants completed 9 structured activities while wearing each tracker, the ActiGraph GT3X+ (AG) research accelerometer, and a portable indirect calorimetry system to assess energy expenditure (EE). For Phase 3, participants wore all 4 devices for 4 consecutive days. Correlation coefficients, linear models, and non-parametric statistics evaluated the criterion and construct validity of the activity tracker output. Results Output from all devices was significantly associated with oscillation frequency (r=.92-.99). During Phase 2, MB and ZZ only differentiated sedentary from light intensity (P<.01), whereas the SQ significantly differentiated among all intensity categories (all comparisons P<.01), similar to AG and EE. During Phase 3, AG counts were significantly associated with activity tracker output (r=.76, .86, and .59 for the MB, SQ, and ZZ, respectively). Conclusions Across study phases, the SQ demonstrated stronger validity than the MB and ZZ. The validity of youth-oriented activity trackers may directly impact their effectiveness as behavior modification tools, demonstrating a need for more research on such devices. PMID:28724509
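The criterion-validity figures above are ordinary Pearson correlation coefficients between tracker output and a reference signal. A small illustration; the shaker frequencies mirror the protocol, but the device counts are invented, not study data:

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

# Two hypothetical readings per shaker frequency (Hz) -> device counts
freq = [1.3, 1.3, 1.9, 1.9, 2.5, 2.5]
counts = [410, 395, 620, 640, 850, 830]
r = pearson_r(freq, counts)  # strong positive association, as in Phase 1
```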

  12. Implementing a laboratory automation system: experience of a large clinical laboratory.

    PubMed

    Lam, Choong Weng; Jacob, Edward

    2012-02-01

    Laboratories today face increasing pressure to automate their operations as they are challenged by a continuing increase in workload, the need to reduce expenditure, and difficulties in recruiting experienced technical staff. Was the implementation of a laboratory automation system (LAS) in the Clinical Biochemistry Laboratory at Singapore General Hospital successful? There is no simple answer, so the following topics, comparing and contrasting pre- and post-LAS, were explored: turnaround time (TAT), laboratory errors, and staff satisfaction. The benefits and limitations of the LAS, as seen from the laboratory's experience, were also reviewed. The mean TAT for both stat and routine samples decreased post-LAS (30% and 13.4%, respectively). In the 90th percentile TAT chart, a 29% reduction was seen in the processing of stat samples on the LAS. However, no significant difference in the 90th percentile TAT was observed with routine samples. It was surprising to note that laboratory errors increased post-LAS. Considerable effort was needed to overcome the initial difficulties associated with adjusting to a new system, new software, and new working procedures. Although some of the known advantages and limitations of LAS have been validated, the claimed benefits such as improvements in TAT, laboratory errors, and staff morale were not evident in the initial months.

  13. Safety validation test equipment operation

    NASA Astrophysics Data System (ADS)

    Kurosaki, Tadaaki; Watanabe, Takashi

    1992-08-01

    An overview of the activities conducted on safety validation test equipment operation for materials used for NASA manned missions is presented. Safety validation tests, such as flammability, odor, and offgassing tests, were conducted in accordance with NASA-NHB-8060.1C using test subjects common with those used by NASA, and the equipment used was qualified for its functions and performance in accordance with NASDA-CR-99124 'Safety Validation Test Qualification Procedures.' Test procedure systems were established by preparing 'Common Procedures for Safety Validation Test' as well as test procedures for flammability, offgassing, and odor tests. The test operation organization, chaired by the General Manager of the Parts and Material Laboratory of NASDA (National Space Development Agency of Japan), was established, and the test leaders and operators in the organization were qualified in accordance with the specified procedures. One hundred one tests have been conducted so far by the Parts and Material Laboratory according to requests submitted by the manufacturers through the Space Station Group and the Safety and Product Assurance for Manned Systems Office.

  14. Double-shell target fabrication workshop-2016 report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Y. Morris; Oertel, John; Farrell, Michael

    On June 30, 2016, over 40 representatives from Lawrence Livermore National Laboratory (LLNL), Los Alamos National Laboratory (LANL), General Atomics (GA), the Laboratory for Laser Energetics (LLE), Schafer Corporation, and NNSA headquarters attended a double-shell (DS) target fabrication workshop at Livermore, California. Pushered-single-shell (PSS) and DS metal-gas platforms potentially have a large impact on programmatic applications. The goal of this focused workshop was to bring together target fabrication scientists, physicists, and designers to brainstorm future PSS and DS target fabrication needs and strategies. This one-day workshop was intended to give an overall view of historical information, recent approaches, and future research activities at each participating organization. Five topical areas vital to the success of future DS target fabrication were discussed, including inner metal shells, foam spheres, outer ablators, fill tube assembly, and metrology.

  15. A Profilometry-Based Dentifrice Abrasion Method for V8 Brushing Machines Part III: Multi-Laboratory Validation Testing of RDA-PE.

    PubMed

    Schneiderman, Eva; Colón, Ellen L; White, Donald J; Schemehorn, Bruce; Ganovsky, Tara; Haider, Amir; Garcia-Godoy, Franklin; Morrow, Brian R; Srimaneepong, Viritpon; Chumprasert, Sujin

    2017-09-01

    We have previously reported on progress toward the refinement of profilometry-based abrasivity testing of dentifrices using a V8 brushing machine and tactile or optical measurement of dentin wear. The general application of this technique may be advanced by demonstration of successful inter-laboratory confirmation of the method. The objective of this study was to explore the capability of different laboratories in the assessment of dentifrice abrasivity using a profilometry-based evaluation technique developed in our Mason laboratories. In addition, we wanted to assess the interchangeability of human and bovine specimens. Participating laboratories were instructed in methods associated with Radioactive Dentin Abrasivity-Profilometry Equivalent (RDA-PE) evaluation, including site visits to discuss critical elements of specimen preparation, masking, profilometry scanning, and procedures. Laboratories were likewise instructed on the requirement for demonstration of proportional linearity as a key condition for validation of the technique. Laboratories were provided with four test dentifrices, blinded for testing, with a broad range of abrasivity. In each laboratory, a calibration curve was developed for varying V8 brushing strokes (0, 4,000, and 10,000 strokes) with the ISO abrasive standard. Proportional linearity was determined as the ratio of standard abrasion mean depths created with 10,000 and 4,000 strokes (a 2.5-fold difference). The criterion for successful calibration within the method (established in our Mason laboratory) was set at proportional linearity = 2.5 ± 0.3. RDA-PE was compared to Radiotracer RDA for the four test dentifrices, with the latter obtained by averages from three independent Radiotracer RDA sites. Individual laboratories and their results were compared by 1) proportional linearity and 2) acquired RDA-PE values for test pastes. Five sites participated in the study. One site did not pass proportional linearity objectives. 
Data for this site are
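The proportional-linearity check described above reduces to a ratio test against the 2.5 ± 0.3 criterion. A minimal sketch; the replicate depth values are invented for illustration:

```python
def proportional_linearity(depths_10000, depths_4000, target=2.5, tol=0.3):
    """Ratio of mean abrasion depths from the ISO abrasive standard at
    10,000 vs. 4,000 brushing strokes; calibration passes when the
    ratio falls within target ± tol (here 2.5 ± 0.3)."""
    mean_hi = sum(depths_10000) / len(depths_10000)
    mean_lo = sum(depths_4000) / len(depths_4000)
    ratio = mean_hi / mean_lo
    return ratio, (target - tol) <= ratio <= (target + tol)

# Hypothetical mean-depth replicates (micrometres)
ratio, passed = proportional_linearity([25.1, 24.6, 25.9], [10.0, 9.8, 10.4])
```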

  16. Approach and Instrument Placement Validation

    NASA Technical Reports Server (NTRS)

    Ator, Danielle

    2005-01-01

    The Mars Exploration Rovers (MER) from the 2003 flight mission represent the state-of-the-art technology for target approach and instrument placement on Mars. It currently takes 3 sols (Martian days) for the rover to place an instrument on a designated rock target that is about 10 to 20 m away. The objective of this project is to provide an experimentally validated single-sol instrument placement capability to future Mars missions. After completing numerous test runs on the Rocky8 rover under various test conditions, it has been observed that lighting conditions, shadow effects, target features and the initial target distance have an effect on the performance and reliability of the tracking software. Additional software validation testing will be conducted in the months to come.

  17. Estimating body fat in NCAA Division I female athletes: a five-compartment model validation of laboratory methods.

    PubMed

    Moon, Jordan R; Eckerson, Joan M; Tobkin, Sarah E; Smith, Abbie E; Lockwood, Christopher M; Walter, Ashley A; Cramer, Joel T; Beck, Travis W; Stout, Jeffrey R

    2009-01-01

    The purpose of the present study was to determine the validity of various laboratory methods for estimating percent body fat (%fat) in NCAA Division I college female athletes (n = 29; 20 ± 1 years). Body composition was assessed via hydrostatic weighing (HW), air displacement plethysmography (ADP), and dual-energy X-ray absorptiometry (DXA), and estimates of %fat derived using 4-compartment (4C), 3C, and 2C models were compared to a criterion 5C model that included bone mineral content, body volume (BV), total body water, and soft tissue mineral. The Wang-4C and the Siri-3C models produced nearly identical values compared to the 5C model (r > 0.99, total error (TE) < 0.40%fat). For the remaining laboratory methods, constant error values (CE) ranged from -0.04%fat (HW-Siri) to -3.71%fat (DXA); r values ranged from 0.89 (ADP-Siri, ADP-Brozek) to 0.93 (DXA); standard error of estimate values ranged from 1.78%fat (DXA) to 2.19%fat (ADP-Siri, ADP-Brozek); and TE values ranged from 2.22%fat (HW-Brozek) to 4.90%fat (DXA). The limits of agreement for DXA (-10.10 to 2.68%fat) were the largest with a significant trend of -0.43 (P < 0.05). With the exception of DXA, all of the equations resulted in acceptable TE values (<3.08%fat). However, the results for individual estimates of %fat using the Brozek equation indicated that the 2C models that derived BV from ADP and HW overestimated (5.38, 3.65%) and underestimated (5.19, 4.88%) %fat, respectively. The acceptable TE values for both HW and ADP suggest that these methods are valid for estimating %fat in college female athletes; however, the Wang-4C and Siri-3C models should be used to identify individual estimates of %fat in this population.
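The agreement statistics quoted in this record (CE, SEE, TE) can be sketched as follows. The %fat values are invented, and SEE is taken here as the residual SD about the least-squares line of criterion on predicted, a common convention that may differ in detail from the authors' software:

```python
import math

def cross_validation_stats(predicted, criterion):
    """CE: mean signed difference (constant error).
    TE: root-mean-square difference (total error).
    SEE: SD of residuals about the least-squares regression line."""
    n = len(predicted)
    ce = sum(p - c for p, c in zip(predicted, criterion)) / n
    te = math.sqrt(sum((p - c) ** 2 for p, c in zip(predicted, criterion)) / n)
    mx = sum(predicted) / n
    my = sum(criterion) / n
    sxx = sum((x - mx) ** 2 for x in predicted)
    sxy = sum((x - mx) * (y - my) for x, y in zip(predicted, criterion))
    b = sxy / sxx                      # slope
    a = my - b * mx                    # intercept
    ss_res = sum((y - (a + b * x)) ** 2 for x, y in zip(predicted, criterion))
    see = math.sqrt(ss_res / (n - 2))
    return ce, te, see

# Hypothetical %fat: a 2C method vs. the 5C criterion
pred = [22.0, 25.5, 18.9, 30.1, 27.3]
crit = [21.0, 24.8, 19.5, 28.9, 26.7]
ce, te, see = cross_validation_stats(pred, crit)
```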

  18. Bio-Oil Analysis Laboratory Procedures | Bioenergy | NREL

    Science.gov Websites

    NREL develops standard laboratory procedures for bio-oil analysis. These procedures have been validated and allow for reliable analysis, including determination of the different hydroxyl groups (-OH) in pyrolysis bio-oil: aliphatic-OH, phenolic-OH, and carboxylic-OH.

  19. Clinical Laboratory Helper.

    ERIC Educational Resources Information Center

    Szucs, Susan C.; And Others

    This curriculum guide provides competencies and tasks for the position of clinical laboratory helper; it serves as both a career exploration experience and/or entry-level employment training. A list of 25 validated competencies and tasks covers careers from entry level to those that must be mastered to earn an associate degree in clinical…

  20. Nuclease Target Site Selection for Maximizing On-target Activity and Minimizing Off-target Effects in Genome Editing

    PubMed Central

    Lee, Ciaran M; Cradick, Thomas J; Fine, Eli J; Bao, Gang

    2016-01-01

    The rapid advancement in targeted genome editing using engineered nucleases such as ZFNs, TALENs, and CRISPR/Cas9 systems has resulted in a suite of powerful methods that allows researchers to target any genomic locus of interest. A complementary set of design tools has been developed to aid researchers with nuclease design, target site selection, and experimental validation. Here, we review the various tools available for target selection in designing engineered nucleases, and for quantifying nuclease activity and specificity, including web-based search tools and experimental methods. We also elucidate challenges in target selection, especially in predicting off-target effects, and discuss future directions in precision genome editing and its applications. PMID:26750397

  1. A network model of genomic hormone interactions underlying dementia and its translational validation through serendipitous off-target effect

    PubMed Central

    2013-01-01

    Background While the majority of studies have focused on the association between sex hormones and dementia, emerging evidence supports the role of other hormone signals in increasing dementia risk. However, due to the lack of an integrated view of the mechanistic interactions of hormone signaling pathways associated with dementia, the molecular mechanisms through which hormones contribute to the increased risk of dementia have remained unclear, and the capacity to translate hormone signals into potential therapeutic and diagnostic applications for dementia has been undervalued. Methods Using an integrative knowledge- and data-driven approach, a global hormone interaction network in the context of dementia was constructed, which was further filtered down to a model of convergent hormone signaling pathways. This model was evaluated for its biological and clinical relevance through a pathway recovery test, evidence-based analysis, and biomarker-guided analysis. Translational validation of the model was performed using the proposed novel mechanism discovery approach based on ‘serendipitous off-target effects’. Results Our results reveal the existence of a well-connected hormone interaction network underlying dementia. Seven hormone signaling pathways converge at the core of the hormone interaction network and are shown to be mechanistically linked to the risk of dementia. Amongst these pathways, the estrogen signaling pathway takes the major part in the model, and the insulin signaling pathway is analyzed for its association with learning and memory functions. Validation of the model through serendipitous off-target effects suggests that hormone signaling pathways substantially contribute to the pathogenesis of dementia. Conclusions The integrated network model of hormone interactions underlying dementia may serve as an initial translational platform for identifying potential therapeutic targets and candidate biomarkers for dementia-spectrum disorders such as Alzheimer

  2. Lab meets real life: A laboratory assessment of spontaneous thought and its ecological validity

    PubMed Central

    Welz, Annett; Reinhard, Iris; Alpers, Georg W.

    2017-01-01

    People’s minds frequently wander towards self-generated thoughts, which are unrelated to external stimuli or demands. These phenomena, referred to as “spontaneous thought” (ST) and “mind wandering” (MW), have previously been linked with both costs and benefits. Current assessments of ST and MW have predominantly been conducted in the laboratory, whereas studies on the ecological validity of such lab-related constructs and their interrelations are rare. The current study examined the stability of ST dimensions assessed in the lab and their predictive value with respect to MW, repetitive negative thought (uncontrollable rumination, RUM), and affect in daily life. Forty-three university students were assessed with the Amsterdam Resting State Questionnaire (2nd version) to assess ten ST dimensions during the resting state in two laboratory sessions, which were separated by five days of electronic ambulatory assessment (AA). During AA, individuals indicated the intensity of MW and RUM, as well as of positive and negative affect in daily life ten times a day. ST dimensions measured in the lab were moderately stable across one week. Five out of ten ST lab dimensions were predicted by mental health-related symptoms or by dispositional cognitive traits. Hierarchical linear models revealed that a number of ST lab dimensions predicted cognitive and affective states in daily life. Mediation analyses showed that RUM, but not MW per se, accounted for the relationship between specific ST lab dimensions and mood in daily life. By using a simple resting state task, we could demonstrate that a number of lab dimensions of spontaneous thought are moderately stable, are predicted by mental health symptoms and cognitive traits, and show plausible associations with categories of self-generated thought and mood in daily life. PMID:28910351

  3. Evaluation of Performance of Laboratories and Manufacturers Within the Framework of the IFCC model for Quality Targets of HbA1c.

    PubMed

    Weykamp, Cas; Siebelder, Carla

    2017-11-01

    HbA1c is a key parameter in diabetes management. For years the test was used exclusively for monitoring long-term diabetic control. However, owing to improved performance, HbA1c is increasingly considered for diagnosis and screening as well. With this new application, quality demands further increase. A task force of the International Federation of Clinical Chemistry and Laboratory Medicine developed a model to set and evaluate quality targets for HbA1c. The model is based on the concept of total error and takes into account the major sources of analytical errors in the medical laboratory: bias and imprecision. Performance criteria are derived from sigma-metrics and biological variation. This review shows 2 examples of the application of the model: at the level of single laboratories, and at the level of a group of laboratories. In the first example, data from 125 individual laboratories in a recent external quality assessment program in the Netherlands are evaluated. Differences between laboratories as well as their relation to method principles are shown. The second example uses recent and 3-year-old data from the proficiency test of the College of American Pathologists. The differences in performance between 26 manufacturer-related groups of laboratories are shown. Over time these differences are quite consistent, although some manufacturers improved substantially either by better standardization or by replacing a test. The IFCC model serves all who are involved in HbA1c testing in the ongoing process of better performance and better patient care.
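The sigma-metric at the heart of the total-error framework combines allowable total error, bias, and imprecision. A minimal sketch; the 6% allowable total error, 1% bias, and 2% CV below are illustrative numbers, not IFCC quality targets:

```python
def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Sigma-metric from the total-error framework:
    sigma = (allowable total error - |bias|) / imprecision (CV)."""
    return (tea_pct - abs(bias_pct)) / cv_pct

# Hypothetical HbA1c performance figures, all in percent
sigma = sigma_metric(tea_pct=6.0, bias_pct=1.0, cv_pct=2.0)
```

A higher sigma means more of the error budget remains after bias is spent, i.e. a more robust assay.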

  4. Validation of CoaBC as a Bactericidal Target in the Coenzyme A Pathway of Mycobacterium tuberculosis.

    PubMed

    Evans, Joanna C; Trujillo, Carolina; Wang, Zhe; Eoh, Hyungjin; Ehrt, Sabine; Schnappinger, Dirk; Boshoff, Helena I M; Rhee, Kyu Y; Barry, Clifton E; Mizrahi, Valerie

    2016-12-09

    Mycobacterium tuberculosis relies on its own ability to biosynthesize coenzyme A to meet the needs of the myriad enzymatic reactions that depend on this cofactor for activity. As such, the essential pantothenate and coenzyme A biosynthesis pathways have attracted attention as targets for tuberculosis drug development. To identify the optimal step for coenzyme A pathway disruption in M. tuberculosis, we constructed and characterized a panel of conditional knockdown mutants in coenzyme A pathway genes. Here, we report that silencing of coaBC was bactericidal in vitro, whereas silencing of panB, panC, or coaE was bacteriostatic over the same time course. Silencing of coaBC was likewise bactericidal in vivo, whether initiated at infection or during either the acute or chronic stages of infection, confirming that CoaBC is required for M. tuberculosis to grow and persist in mice and arguing against significant CoaBC bypass via transport and assimilation of host-derived pantetheine in this animal model. These results provide convincing genetic validation of CoaBC as a new bactericidal drug target.

  5. Practical methodological guide for hydrometric inter-laboratory organisation

    NASA Astrophysics Data System (ADS)

    Besson, David; Bertrand, Xavier

    2015-04-01

    Discharge measurements performed by the French governmental hydrometry teams feed a national database. These data are available for general knowledge of river flows, flood forecasting, low-water surveys, statistical flow calculations, regulatory flow control, and many other uses. Regularly checking measurement quality and better quantifying its accuracy is therefore an absolute need. The practice of inter-laboratory comparison in hydrometry has developed considerably during the last decade. Indeed, a discharge measurement cannot easily be linked to a standard, so controlling on-site measurement accuracy is very difficult. Inter-laboratory comparison is thus a practical solution to this issue. However, it needs some regulation in order to ease its practice and legitimize its results. To that end, the French governmental hydrometry teams produced a practical methodological guide for organizing hydrometric inter-laboratory comparisons, intended for the hydrometry community, with a view to harmonizing inter-laboratory comparison practices across instruments (ADCP, current meter on wading rod or gauging van, tracer dilution, surface velocity) and flow ranges (flood, low water), and to ensuring that results are formalized and banked. The guide is grounded in the experience of the governmental teams and their partners, following existing approaches (the Doppler group especially). It is designed to validate compliant measurements and identify outliers, whether hardware-related, methodological, environmental, or human. Inter-laboratory comparison provides the means to verify the compliance of the instruments (devices + methods + operators) and methods to determine an experimental uncertainty of the tested measurement method, which is valid only for the site and the measurement conditions but does not address calibration or periodic monitoring of the materials. After some conceptual definitions, the guide describes the different stages of an

  6. 76 FR 15945 - National Voluntary Laboratory Accreditation Program (NVLAP) Workshop for Laboratories Interested...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-22

    ... Accreditation Program (NVLAP) is considering establishing an accreditation program for laboratories that test... the general accreditation criteria referenced in Sections 4 and 5 of the NIST handbook 150 to the test... accreditation, test and measurement equipment, personnel requirements, validation of test methods, and reporting...

  7. ICP-MS Data Validation

    EPA Pesticide Factsheets

    Document designed to offer data reviewers guidance in determining the validity of analytical data generated through the USEPA Contract Laboratory Program Statement of Work (SOW) ISM01.X Inorganic Superfund Methods (Multi-Media, Multi-Concentration)

  8. An Application of Practical Strategies in Assessing the Criterion-Related Validity of Credentialing Examinations.

    ERIC Educational Resources Information Center

    Fidler, James R.

    1993-01-01

    Criterion-related validities of 2 laboratory practitioner certification examinations for medical technologists (MTs) and medical laboratory technicians (MLTs) were assessed for 81 MT and 70 MLT examinees. Validity coefficients are presented for both measures. Overall, summative ratings yielded stronger validity coefficients than ratings based on…

  9. Measuring preschool learning engagement in the laboratory.

    PubMed

    Halliday, Simone E; Calkins, Susan D; Leerkes, Esther M

    2018-03-01

    Learning engagement is a critical factor for academic achievement and successful school transitioning. However, current methods of assessing learning engagement in young children are limited to teacher report or classroom observation, which may limit the types of research questions one could assess about this construct. The current study investigated the validity of a novel assessment designed to measure behavioral learning engagement among young children in a standardized laboratory setting and examined how learning engagement in the laboratory relates to future classroom adjustment. Preschool-aged children (N = 278) participated in a learning-based Tangrams task and Story sequencing task and were observed based on seven behavioral indicators of engagement. Confirmatory factor analysis supported the construct validity for a behavioral engagement factor composed of six of the original behavioral indicators: attention to instructions, on-task behavior, enthusiasm/energy, persistence, monitoring progress/strategy use, and negative affect. Concurrent validity for this behavioral engagement factor was established through its associations with parent-reported mastery motivation and pre-academic skills in math and literacy measured in the laboratory, and predictive validity was demonstrated through its associations with teacher-reported classroom learning behaviors and performance in math and reading in kindergarten. These associations were found when behavioral engagement was observed during both the nonverbal task and the verbal story sequencing tasks and persisted even after controlling for child minority status, gender, and maternal education. Learning engagement in preschool appears to be successfully measurable in a laboratory setting. This finding has implications for future research on the mechanisms that support successful academic development. Copyright © 2017 Elsevier Inc. All rights reserved.

  10. Determination of Phosphorus and Potassium in Commercial Inorganic Fertilizers by Inductively Coupled Plasma-Optical Emission Spectrometry: Single-Laboratory Validation, First Action 2015.18.

    PubMed

    Thiex, Nancy J

    2016-07-01

    A previously validated method for the determination of both citrate-EDTA-soluble P and K and acid-soluble P and K in commercial inorganic fertilizers by inductively coupled plasma-optical emission spectrometry was submitted to the expert review panel (ERP) for fertilizers for consideration of First Action Official Method(SM) status. The ERP evaluated the single-laboratory validation results and recommended the method for First Action Official Method status and provided recommendations for achieving Final Action. Validation materials ranging from 4.4 to 52.4% P2O5 (1.7-22.7% P) and 3-62% K2O (2.5-51.1% K) were used for the validation. Recoveries from validation materials for citrate-soluble P and K ranged from 99.3 to 124.9% P and from 98.4 to 100.7% K. Recoveries from validation materials for acid-soluble "total" P and K ranged from 95.53 to 99.40% P and from 98.36 to 107.28% K. Values of r for citrate-soluble P and K, expressed as RSD, ranged from 0.28 to 1.30% for P and from 0.41 to 1.52% for K. Values of r for total P and K, expressed as RSD, ranged from 0.71 to 1.13% for P and from 0.39 to 1.18% for K. Based on the validation data, the ERP recommended the method (with alternatives for the citrate-soluble and the acid-soluble extractions) for First Action Official Method status and provided recommendations for achieving Final Action status.

  11. ICP-AES Data Validation

    EPA Pesticide Factsheets

    Document designed to offer data reviewers guidance in determining the validity of analytical data generated through the USEPA Contract Laboratory Program (CLP) Statement of Work (SOW) ISM01.X Inorganic Superfund Methods (Multi-Media, Multi-Concentration)

  12. Trace Volatile Data Validation

    EPA Pesticide Factsheets

    Document designed to offer data reviewers guidance in determining the validity of analytical data generated through the USEPA Contract Laboratory Program (CLP) Statement of Work (SOW) ISM01.X Inorganic Superfund Methods (Multi-Media, Multi-Concentration)

  13. Quantitative and Systems Pharmacology. 1. In Silico Prediction of Drug-Target Interactions of Natural Products Enables New Targeted Cancer Therapy.

    PubMed

    Fang, Jiansong; Wu, Zengrui; Cai, Chuipu; Wang, Qi; Tang, Yun; Cheng, Feixiong

    2017-11-27

    Natural products with diverse chemical scaffolds have been recognized as an invaluable source of compounds in drug discovery and development. However, systematic identification of drug targets for natural products at the human proteome level via various experimental assays is highly expensive and time-consuming. In this study, we proposed a systems pharmacology infrastructure to predict new drug targets and anticancer indications of natural products. Specifically, we reconstructed a global drug-target network with 7,314 interactions connecting 751 targets and 2,388 natural products and built predictive network models via a balanced substructure-drug-target network-based inference approach. A high area under the receiver operating characteristic curve (AUC) of 0.96 was achieved for predicting new targets of natural products during cross-validation. The newly predicted targets of natural products (e.g., resveratrol, genistein, and kaempferol) with high scores were validated by various literature studies. We further built statistical network models for identification of new anticancer indications of natural products through integration of both experimentally validated and computationally predicted drug-target interactions of natural products with known cancer proteins. We showed that the significantly predicted anticancer indications of multiple natural products (e.g., naringenin, disulfiram, and metformin) with new mechanisms of action were validated by various published experimental evidence. In summary, this study offers powerful computational systems pharmacology approaches and tools for the development of novel targeted cancer therapies by exploiting the polypharmacology of natural products.
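The network-based inference step described above can be sketched with a toy example. This is a minimal two-step resource-diffusion scheme on a bipartite drug-target graph (the general idea behind network-based inference), not the study's actual code; the tiny network below is invented for illustration, although the resveratrol and genistein target pairings echo associations named in the abstract.

```python
# Minimal sketch of bipartite network-based inference (NBI) for predicting
# new drug-target interactions. Toy data; not the study's implementation.
from collections import defaultdict

def nbi_scores(interactions, drug):
    """Two-step resource diffusion on the bipartite drug-target graph:
    unit resource starts on `drug`'s known targets, flows to drugs that
    share those targets, then back to targets. High-scoring targets not
    yet linked to `drug` are candidate new interactions."""
    targets_of = defaultdict(list)
    drugs_of = defaultdict(list)
    for d, t in interactions:
        targets_of[d].append(t)
        drugs_of[t].append(d)

    # step 1: targets -> drugs (each target splits its resource evenly)
    res_d = defaultdict(float)
    for t in targets_of[drug]:
        for d in drugs_of[t]:
            res_d[d] += 1.0 / len(drugs_of[t])

    # step 2: drugs -> targets (each drug splits its resource evenly)
    scores = defaultdict(float)
    for d, r in res_d.items():
        for t in targets_of[d]:
            scores[t] += r / len(targets_of[d])
    return dict(scores)

toy = [("resveratrol", "SIRT1"), ("resveratrol", "PTGS2"),
       ("genistein", "PTGS2"), ("genistein", "ESR1")]
scores = nbi_scores(toy, "resveratrol")
# "ESR1" is not a known target of resveratrol here, so its nonzero
# score flags it as a predicted new interaction.
```

In a real setting these scores would be ranked per drug and evaluated against held-out interactions via cross-validation, which is where the reported AUC of 0.96 comes from.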

  14. Simple non-laboratory- and laboratory-based risk assessment algorithms and nomogram for detecting undiagnosed diabetes mellitus.

    PubMed

    Wong, Carlos K H; Siu, Shing-Chung; Wan, Eric Y F; Jiao, Fang-Fang; Yu, Esther Y T; Fung, Colman S C; Wong, Ka-Wai; Leung, Angela Y M; Lam, Cindy L K

    2016-05-01

    The aim of the present study was to develop a simple nomogram that can be used to predict the risk of diabetes mellitus (DM) in asymptomatic non-diabetic subjects, based on non-laboratory- and laboratory-based risk algorithms. Anthropometric data, plasma fasting glucose, full lipid profile, exercise habits, and family history of DM were collected from Chinese non-diabetic subjects aged 18-70 years. Logistic regression analysis was performed on a random sample of 2518 subjects to construct non-laboratory- and laboratory-based risk assessment algorithms for detection of undiagnosed DM; both algorithms were validated on data from the remaining sample (n = 839). The Hosmer-Lemeshow test and area under the receiver operating characteristic (ROC) curve (AUC) were used to assess the calibration and discrimination of the DM risk algorithms. Of 3357 subjects recruited, 271 (8.1%) had undiagnosed DM, defined by fasting glucose ≥7.0 mmol/L or 2-h post-load plasma glucose ≥11.1 mmol/L after an oral glucose tolerance test. The non-laboratory-based risk algorithm, with scores ranging from 0 to 33, included age, body mass index, family history of DM, regular exercise, and uncontrolled blood pressure; the laboratory-based risk algorithm, with scores ranging from 0 to 37, added triglyceride level to the risk factors. Both algorithms demonstrated acceptable calibration (Hosmer-Lemeshow test: P = 0.229 and P = 0.483) and discrimination (AUC 0.709 and 0.711) for detection of undiagnosed DM. A simple-to-use nomogram for detecting undiagnosed DM has been developed using validated non-laboratory-based and laboratory-based risk algorithms. © 2015 Ruijin Hospital, Shanghai Jiaotong University School of Medicine and Wiley Publishing Asia Pty Ltd.
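Two ingredients of such a risk algorithm can be sketched briefly: converting regression coefficients into an integer point total (the basis of a 0-33 or 0-37 score), and measuring discrimination with the AUC. The coefficients, scaling factor, and data below are invented for illustration and are not the study's values.

```python
# Hedged sketch: integer risk points from logistic-regression coefficients,
# and AUC as a discrimination check. All numbers here are hypothetical.

def risk_points(betas, x, scale=5.0):
    """Integer point total: each predictor contributes round(beta * x * scale).
    `scale` is an arbitrary factor chosen so points land on a convenient range."""
    return sum(round(b * v * scale) for b, v in zip(betas, x))

def auc(scores, labels):
    """AUC via the Mann-Whitney identity: the probability that a randomly
    chosen case outscores a randomly chosen non-case (ties count half)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

A nomogram is then just a graphical lookup from the summed points to the predicted probability; calibration (e.g., the Hosmer-Lemeshow test) checks that those predicted probabilities match observed rates.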

  15. Improving quality management systems of laboratories in developing countries: an innovative training approach to accelerate laboratory accreditation.

    PubMed

    Yao, Katy; McKinney, Barbara; Murphy, Anna; Rotz, Phil; Wafula, Winnie; Sendagire, Hakim; Okui, Scolastica; Nkengasong, John N

    2010-09-01

    The Strengthening Laboratory Management Toward Accreditation (SLMTA) program was developed to promote immediate, measurable improvement in laboratories of developing countries. The laboratory management framework, a tool that prescribes managerial job tasks, forms the basis of the hands-on, activity-based curriculum. SLMTA is implemented through multiple workshops with intervening site visits to support improvement projects. To evaluate the effectiveness of SLMTA, the laboratory accreditation checklist was developed and subsequently adopted by the World Health Organization Regional Office for Africa (WHO AFRO). The SLMTA program and the implementation model were validated through a pilot in Uganda. SLMTA yielded observable, measurable results in the laboratories and improved patient flow and turnaround time in a laboratory simulation. The laboratory staff members were empowered to improve their own laboratories by using existing resources, communicate with clinicians and hospital administrators, and advocate for system strengthening. The SLMTA program supports laboratories by improving management and building preparedness for accreditation.

  16. Prediction of miRNA targets.

    PubMed

    Oulas, Anastasis; Karathanasis, Nestoras; Louloupi, Annita; Pavlopoulos, Georgios A; Poirazi, Panayiota; Kalantidis, Kriton; Iliopoulos, Ioannis

    2015-01-01

    Computational methods for miRNA target prediction are currently undergoing extensive review and evaluation. There is still a great need for improvement of these tools and bioinformatics approaches are looking towards high-throughput experiments in order to validate predictions. The combination of large-scale techniques with computational tools will not only provide greater credence to computational predictions but also lead to the better understanding of specific biological questions. Current miRNA target prediction tools utilize probabilistic learning algorithms, machine learning methods and even empirical biologically defined rules in order to build models based on experimentally verified miRNA targets. Large-scale protein downregulation assays and next-generation sequencing (NGS) are now being used to validate methodologies and compare the performance of existing tools. Tools that exhibit greater correlation between computational predictions and protein downregulation or RNA downregulation are considered the state of the art. Moreover, efficiency in prediction of miRNA targets that are concurrently verified experimentally provides additional validity to computational predictions and further highlights the competitive advantage of specific tools and their efficacy in extracting biologically significant results. In this review paper, we discuss the computational methods for miRNA target prediction and provide a detailed comparison of methodologies and features utilized by each specific tool. Moreover, we provide an overview of current state-of-the-art high-throughput methods used in miRNA target prediction.
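One of the "empirical biologically defined rules" mentioned above can be illustrated concretely: many predictors require perfect Watson-Crick complementarity between the miRNA seed (nucleotides 2-7) and a site in the target 3'UTR. The sketch below uses the real let-7a sequence, but the UTR fragment is invented; real tools layer many additional features (conservation, site context, free energy) on top of this.

```python
# Toy illustration of seed-match site finding for miRNA target prediction.
# The UTR sequence is invented; let-7a is a real miRNA sequence.

COMPLEMENT = str.maketrans("AUGC", "UACG")

def seed_sites(mirna, utr):
    """Return 0-based start positions in `utr` that base-pair with the
    miRNA seed (nucleotides 2-7, 1-based numbering)."""
    seed = mirna[1:7]                          # nucleotides 2-7
    site = seed.translate(COMPLEMENT)[::-1]    # reverse complement (RNA)
    return [i for i in range(len(utr) - len(site) + 1)
            if utr[i:i + len(site)] == site]

let7a = "UGAGGUAGUAGGUUGUAUAGUU"
hits = seed_sites(let7a, "AAUACCUCAA")  # one 6-mer seed site expected
```

High-throughput validation then asks whether transcripts carrying such predicted sites are actually downregulated in protein or RNA abundance assays.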

  17. Development and Validation of an Improved PCR Method Using the 23S-5S Intergenic Spacer for Detection of Rickettsiae in Dermacentor variabilis Ticks and Tissue Samples from Humans and Laboratory Animals

    PubMed Central

    Kakumanu, Madhavi L.; Ponnusamy, Loganathan; Sutton, Haley T.; Meshnick, Steven R.; Nicholson, William L.

    2016-01-01

    A novel nested PCR assay was developed to detect Rickettsia spp. in ticks and tissue samples from humans and laboratory animals. Primers were designed for the nested run to amplify a variable region of the 23S-5S intergenic spacer (IGS) of Rickettsia spp. The newly designed primers were evaluated using genomic DNA from 11 Rickettsia species belonging to the spotted fever, typhus, and ancestral groups and, in parallel, compared to other Rickettsia-specific PCR targets (ompA, gltA, and the 17-kDa protein gene). The new 23S-5S IGS nested PCR assay amplified all 11 Rickettsia spp., but the assays employing other PCR targets did not. The novel nested assay was sensitive enough to detect one copy of a cloned 23S-5S IGS fragment from “Candidatus Rickettsia amblyommii.” Subsequently, the detection efficiency of the 23S-5S IGS nested assay was compared to those of the other three assays using genomic DNA extracted from 40 adult Dermacentor variabilis ticks. The nested 23S-5S IGS assay detected Rickettsia DNA in 45% of the ticks, while the amplification rates of the other three assays ranged between 5 and 20%. The novel PCR assay was validated using clinical samples from humans and laboratory animals that were known to be infected with pathogenic species of Rickettsia. The nested 23S-5S IGS PCR assay was coupled with reverse line blot hybridization with species-specific probes for high-throughput detection and simultaneous identification of the species of Rickettsia in the ticks. “Candidatus Rickettsia amblyommii,” R. montanensis, R. felis, and R. bellii were frequently identified species, along with some potentially novel Rickettsia strains that were closely related to R. bellii and R. conorii. PMID:26818674

  18. Development and Validation of an Improved PCR Method Using the 23S-5S Intergenic Spacer for Detection of Rickettsiae in Dermacentor variabilis Ticks and Tissue Samples from Humans and Laboratory Animals.

    PubMed

    Kakumanu, Madhavi L; Ponnusamy, Loganathan; Sutton, Haley T; Meshnick, Steven R; Nicholson, William L; Apperson, Charles S

    2016-04-01

    A novel nested PCR assay was developed to detect Rickettsia spp. in ticks and tissue samples from humans and laboratory animals. Primers were designed for the nested run to amplify a variable region of the 23S-5S intergenic spacer (IGS) of Rickettsia spp. The newly designed primers were evaluated using genomic DNA from 11 Rickettsia species belonging to the spotted fever, typhus, and ancestral groups and, in parallel, compared to other Rickettsia-specific PCR targets (ompA, gltA, and the 17-kDa protein gene). The new 23S-5S IGS nested PCR assay amplified all 11 Rickettsia spp., but the assays employing other PCR targets did not. The novel nested assay was sensitive enough to detect one copy of a cloned 23S-5S IGS fragment from "Candidatus Rickettsia amblyommii." Subsequently, the detection efficiency of the 23S-5S IGS nested assay was compared to those of the other three assays using genomic DNA extracted from 40 adult Dermacentor variabilis ticks. The nested 23S-5S IGS assay detected Rickettsia DNA in 45% of the ticks, while the amplification rates of the other three assays ranged between 5 and 20%. The novel PCR assay was validated using clinical samples from humans and laboratory animals that were known to be infected with pathogenic species of Rickettsia. The nested 23S-5S IGS PCR assay was coupled with reverse line blot hybridization with species-specific probes for high-throughput detection and simultaneous identification of the species of Rickettsia in the ticks. "Candidatus Rickettsia amblyommii," R. montanensis, R. felis, and R. bellii were frequently identified species, along with some potentially novel Rickettsia strains that were closely related to R. bellii and R. conorii. Copyright © 2016 Kakumanu et al.

  19. Development and Validation of a Disease-Specific Instrument to Measure Diet-Targeted Quality of Life for Postoperative Patients with Esophagogastric Cancer.

    PubMed

    Honda, Michitaka; Wakita, Takafumi; Onishi, Yoshihiro; Nunobe, Souya; Miura, Akinori; Nishigori, Tatsuto; Kusanagi, Hiroshi; Yamamoto, Takatsugu; Boddy, Alexander; Fukuhara, Shunichi

    2015-12-01

    Patients who have undergone esophagectomy or gastrectomy have certain dietary limitations because of changes to the alimentary tract. This study attempted to develop a psychometric scale, named "Esophago-Gastric surgery and Quality of Dietary life (EGQ-D)," for assessment of impact of upper gastrointestinal surgery on diet-targeted quality of life. Using qualitative methods, the study team interviewed both patients and surgeons involved in esophagogastric cancer surgery, and we prepared an item pool and a draft scale. To evaluate the scale's psychometric reliability and validity, a survey involving a large number of patients was conducted. Items for the final scale were selected by factor analysis and item response theory. Cronbach's alpha was used for assessment of reliability, and correlations with the short form (SF)-12, esophagus and stomach surgery symptom scale (ES(4)), and nutritional indicators were analyzed to assess the criterion-related validity. Through multifaceted discussion and the pilot study, a draft questionnaire comprising 14 items was prepared, and a total of 316 patients were enrolled. On the basis of factor analysis and item response theory, six items were excluded, and the remaining eight items demonstrated strong unidimensionality for the final scale. Cronbach's alpha was 0.895. There were significant associations with all the subscale scores for SF-12, ES(4), and nutritional indicators. The EGQ-D scale has good contents and psychometric validity and can be used to evaluate disease-specific instrument to measure diet-targeted quality of life for postoperative patients with esophagogastric cancer.

  20. Laboratory validation of MEMS-based sensors for post-earthquake damage assessment

    NASA Astrophysics Data System (ADS)

    Pozzi, Matteo; Zonta, Daniele; Santana, Juan; Colin, Mikael; Saillen, Nicolas; Torfs, Tom; Amditis, Angelos; Bimpas, Matthaios; Stratakos, Yorgos; Ulieru, Dumitru; Bairaktaris, Dimitirs; Frondistou-Yannas, Stamatia; Kalidromitis, Vasilis

    2011-04-01

    The evaluation of seismic damage is today almost exclusively based on visual inspection, as building owners are generally reluctant to install permanent sensing systems, due to their high installation, management and maintenance costs. To overcome this limitation, the EU-funded MEMSCON project aims to produce small-size sensing nodes for measurement of strain and acceleration, integrating Micro-Electro-Mechanical Systems (MEMS) based sensors and Radio Frequency Identification (RFID) tags in a single package that will be attached to reinforced concrete buildings. To reduce the impact of installation and management, data will be transmitted to a remote base station using a wireless interface. During the project, sensor prototypes were produced by assembling pre-existing components and by developing ex-novo miniature devices with ultra-low power consumption and sensing performance beyond that offered by sensors available on the market. The paper outlines the devices' operating principles, production scheme and operation at both unit and network levels. It also reports on validation campaigns conducted in the laboratory to assess system performance. Accelerometer sensors were tested on a reduced-scale metal frame mounted on a shaking table, back to back with reference devices, while strain sensors were embedded in both reduced- and full-scale reinforced concrete specimens undergoing increasing deformation cycles up to extensive damage and collapse. The paper assesses the economic sustainability and performance of the sensors developed for the project and discusses their applicability to long-term seismic monitoring.

  1. Walk this way: validity evidence of iphone health application step count in laboratory and free-living conditions.

    PubMed

    Duncan, Markus J; Wunderlich, Kelly; Zhao, Yingying; Faulkner, Guy

    2018-08-01

    Several attempts have been made to demonstrate the accuracy of the iPhone pedometer function in laboratory test conditions. However, no studies have attempted to evaluate evidence of convergent validity of iPhone step counts as a surveillance tool in the field. This study takes a pragmatic approach to evaluating Health application-derived iPhone step counts by measuring the accuracy of a standardized criterion iPhone SE and a heterogeneous sample of participant-owned iPhones (6 or newer) in a laboratory condition, as well as comparing personal iPhones to accelerometer-derived steps in a free-living test. During lab tests, criterion and personal iPhones differed from manually counted steps by a mean bias of less than ±5% when walking at 5 km/h, 7.5 km/h and 10 km/h on a treadmill, which is generally considered acceptable for pedometers. In the free-living condition, steps differed by a mean bias of 21.5%, or 1340 steps/day, when averaged across observation days. Researchers should be cautious about using iPhone models as research-grade pedometers for physical activity surveillance or evaluation, likely because the iPhone is not carried continually by participants; if compliance can be maximized, the iPhone might be suitable.
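The accuracy figure quoted above (mean bias as a percentage of the criterion count) can be sketched in a few lines. The step counts below are invented, not data from the study.

```python
# Sketch of the accuracy metric used in device-validation studies like the
# one above: mean percentage bias of device-reported steps against a
# criterion (e.g., manually counted) total. Values here are hypothetical.

def mean_percent_bias(device, criterion):
    """Mean over trials of (device - criterion) / criterion * 100.
    Negative = undercounting, positive = overcounting."""
    return sum((d - c) / c * 100.0
               for d, c in zip(device, criterion)) / len(device)

# e.g. two treadmill trials: a 2% undercount, then a 5% overcount
bias = mean_percent_bias([980, 1050], [1000, 1000])
```

A mean bias within ±5% in the lab but over 20% in free living, as reported above, points to a wear-time problem rather than a sensor problem.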

  2. Novices in surgery are the target group of a virtual reality training laboratory.

    PubMed

    Hassan, Iyad; Maschuw, Katja; Rothmund, Matthias; Koller, Michael; Gerdes, Berthold

    2006-01-01

    This study aims to establish which physicians represent the suitable target group of a virtual training laboratory. Novices (48 physicians with fewer than 10 laparoscopic operations) and intermediate trainees (19 physicians who had performed 30-50 laparoscopic operations) participated in this study. Each participant performed the basic module 'clip application' at the beginning and after a 1-hour short training course on the LapSim. The course consisted of the tasks coordination, lift and grasp, clip application, cutting with diathermy and fine dissection at increasing difficulty levels. The time taken to complete the tasks, number of errors, and economy of motion parameters (path length and angular path) were analyzed. Following training with the simulator, novices completed the task significantly faster (p = 0.001) and demonstrated greater economy of motion [path length (p = 0.04) and angular path (p = 0.01)]. In contrast, the intermediate trainees showed a reduction in errors, but without reaching statistical significance. They showed no improvement in economy of motion and completed the task significantly slower (p = 0.03). Novices, in comparison to intermediate trainees, tend to benefit most during their first exposure to a laparoscopy simulator.

  3. National laboratory policies and plans in sub-Saharan African countries: gaps and opportunities

    PubMed Central

    van der Broek, Ankie; Jansen, Christel; de Bruijn, Hilde; Schultsz, Constance

    2017-01-01

    Background The 2008 Maputo Declaration calls for the development of dedicated national laboratory policies and strategic plans supporting the enhancement of laboratory services, in response to the long-lasting relegation of medical laboratory systems in sub-Saharan Africa. Objectives This study describes the extent to which laboratories are addressed in the national health policies and plans created directly following the 2008 momentum for laboratory strengthening. Method National health policies and plans from 39 sub-Saharan African countries, valid throughout and beyond 31 December 2010, were collected in March 2012 and analysed during 2013. Results Laboratories were addressed by all countries. Human resources were the most addressed topic (38/39) and finances and budget were the least addressed (< 5/39). Countries lagging behind in national laboratory strategic planning at the end of 2013 (17/39) were more likely to be francophone countries located in West-Central Africa (13/17) with historically low HIV prevalence. The most common gaps anticipated to compromise the implementation of the policies and plans were the disconnect between policies and plans, under-developed finance sections and monitoring and evaluation frameworks, the absence of points of reference to define gaps and shortages, and inappropriate governance structures. Conclusion The availability and implementation of laboratory policies and plans can be improved by strictly applying a more standardised methodology for policy development, using harmonised norms to set targets for improvement, and intensifying the establishment of directorates of laboratory services directly under the authority of Ministries of Health. Horizontal programmes such as the Global Health Security Agenda could provide the necessary impulse to take the least advanced countries on board. PMID:28879152

  4. Good Laboratory Practice. Part 1. An Introduction

    ERIC Educational Resources Information Center

    Wedlich, Richard C.; Libera, Agata E.; Pires, Amanda; Therrien, Matthew T.

    2013-01-01

    The Good Laboratory Practice (GLP) regulations were put into place in 1978. They establish a standard of practice to ensure that results from the nonclinical laboratory study reported to the U.S. Food and Drug Administration (FDA) are valid and that the study report accurately reflects the conduct of the study. While the GLP regulations promulgate…

  5. Preanalytical management: serum vacuum tubes validation for routine clinical chemistry.

    PubMed

    Lima-Oliveira, Gabriel; Lippi, Giuseppe; Salvagno, Gian Luca; Montagnana, Martina; Picheth, Geraldo; Guidi, Gian Cesare

    2012-01-01

    The validation process is essential in accredited clinical laboratories. The aim of this study was to validate five kinds of serum vacuum tubes for routine clinical chemistry laboratory testing. Blood specimens from 100 volunteers in five different serum vacuum tubes (Tube I: VACUETTE, Tube II: LABOR IMPORT, Tube III: S-Monovette, Tube IV: SST and Tube V: SST II) were collected by a single, expert phlebotomist. The routine clinical chemistry tests were analyzed on a cobas 6000 module. The significance of the differences between samples was assessed by paired Student's t-test after checking for normality. The level of statistical significance was set at P < 0.005. Finally, the biases from Tube I, Tube II, Tube III, Tube IV and Tube V were compared with the current desirable quality specifications for bias (B), derived from biological variation. Basically, our validation will permit laboratory or hospital managers to select among the validated brands of vacuum tubes, according to technical or economic considerations, in order to perform the following laboratory tests: glucose, total cholesterol, high density lipoprotein-cholesterol, triglycerides, total protein, albumin, blood urea nitrogen, uric acid, alkaline phosphatase, aspartate aminotransferase, gamma-glutamyltransferase, lactate dehydrogenase, creatine kinase, total bilirubin, direct bilirubin, calcium, iron, sodium and potassium. By contrast, special attention will be required if the laboratory already performs creatinine, amylase, phosphate and magnesium determinations and the quality laboratory manager intends to change the serum tubes. We suggest that laboratory management should both standardize the procedures and frequently evaluate the quality of in vitro diagnostic devices.

  6. Using Pulsed Power for Hydrodynamic Code Validation

    DTIC Science & Technology

    2001-06-01

    James Degnan, George Kiuttu; Air Force Research Laboratory, Albuquerque, NM 87117

    Abstract: As part of ongoing hydrodynamic code ... bank at the Air Force Research Laboratory (AFRL). A cylindrical aluminum liner that is magnetically imploded onto a central target by self-induced ...

  7. JaCVAM-organized international validation study of the in vivo rodent alkaline comet assay for detection of genotoxic carcinogens: II. Summary of definitive validation study results.

    PubMed

    Uno, Yoshifumi; Kojima, Hajime; Omori, Takashi; Corvi, Raffaella; Honma, Masamitsu; Schechtman, Leonard M; Tice, Raymond R; Beevers, Carol; De Boeck, Marlies; Burlinson, Brian; Hobbs, Cheryl A; Kitamoto, Sachiko; Kraynak, Andrew R; McNamee, James; Nakagawa, Yuzuki; Pant, Kamala; Plappert-Helbig, Ulla; Priestley, Catherine; Takasawa, Hironao; Wada, Kunio; Wirnitzer, Uta; Asano, Norihide; Escobar, Patricia A; Lovell, David; Morita, Takeshi; Nakajima, Madoka; Ohno, Yasuo; Hayashi, Makoto

    2015-07-01

    The in vivo rodent alkaline comet assay (comet assay) is used internationally to investigate the in vivo genotoxic potential of test chemicals. This assay, however, has not previously been formally validated. The Japanese Center for the Validation of Alternative Methods (JaCVAM), with the cooperation of the U.S. NTP Interagency Center for the Evaluation of Alternative Toxicological Methods (NICEATM)/the Interagency Coordinating Committee on the Validation of Alternative Methods (ICCVAM), the European Centre for the Validation of Alternative Methods (ECVAM), and the Japanese Environmental Mutagen Society/Mammalian Mutagenesis Study Group (JEMS/MMS), organized an international validation study to evaluate the reliability and relevance of the assay for identifying genotoxic carcinogens, using liver and stomach as target organs. The ultimate goal of this exercise was to establish an Organisation for Economic Co-operation and Development (OECD) test guideline. The study protocol was optimized in the pre-validation studies, and then the definitive (4th phase) validation study was conducted in two steps. In the 1st step, assay reproducibility was confirmed among laboratories using four coded reference chemicals and the positive control ethyl methanesulfonate. In the 2nd step, the predictive capability was investigated using 40 coded chemicals with known genotoxic and carcinogenic activity (i.e., genotoxic carcinogens, genotoxic non-carcinogens, non-genotoxic carcinogens, and non-genotoxic non-carcinogens). Based on the results obtained, the in vivo comet assay is concluded to be highly capable of identifying genotoxic chemicals and therefore can serve as a reliable predictor of rodent carcinogenicity. Copyright © 2015 Elsevier B.V. All rights reserved.

  8. OECD validation study to assess intra- and inter-laboratory reproducibility of the zebrafish embryo toxicity test for acute aquatic toxicity testing.

    PubMed

    Busquet, François; Strecker, Ruben; Rawlings, Jane M; Belanger, Scott E; Braunbeck, Thomas; Carr, Gregory J; Cenijn, Peter; Fochtman, Przemyslaw; Gourmelon, Anne; Hübler, Nicole; Kleensang, André; Knöbel, Melanie; Kussatz, Carola; Legler, Juliette; Lillicrap, Adam; Martínez-Jerónimo, Fernando; Polleichtner, Christian; Rzodeczko, Helena; Salinas, Edward; Schneider, Katharina E; Scholz, Stefan; van den Brandhof, Evert-Jan; van der Ven, Leo T M; Walter-Rohde, Susanne; Weigt, Stefan; Witters, Hilda; Halder, Marlies

    2014-08-01

    The OECD validation study of the zebrafish embryo acute toxicity test (ZFET) for acute aquatic toxicity testing evaluated the ZFET reproducibility by testing 20 chemicals at 5 different concentrations in 3 independent runs in at least 3 laboratories. Stock solutions and test concentrations were analytically confirmed for 11 chemicals. Newly fertilised zebrafish eggs (20/concentration and control) were exposed for 96h to chemicals. Four apical endpoints were recorded daily as indicators of acute lethality: coagulation of the embryo, lack of somite formation, non-detachment of the tail bud from the yolk sac and lack of heartbeat. Results (LC50 values for 48/96h exposure) show that the ZFET is a robust method with a good intra- and inter-laboratory reproducibility (CV<30%) for most chemicals and laboratories. The reproducibility was lower (CV>30%) for some very toxic or volatile chemicals, and chemicals tested close to their limit of solubility. The ZFET is now available as OECD Test Guideline 236. Considering the high predictive capacity of the ZFET demonstrated by Belanger et al. (2013) in their retrospective analysis of acute fish toxicity and fish embryo acute toxicity data, the ZFET is ready to be considered for acute fish toxicity for regulatory purposes. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
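The LC50 values referenced above summarize a concentration-mortality curve in a single number. One simple way to obtain such an estimate is log-linear interpolation between the two test concentrations bracketing 50% mortality; this is a hedged sketch of that idea only, with invented data, and the validation study's own statistical analysis was more involved.

```python
# Hedged sketch: LC50 by log-linear interpolation between the two tested
# concentrations that bracket 50% mortality. Data below are invented.
import math

def lc50(concs, mortality):
    """`concs` in ascending order; `mortality` as fractions in [0, 1]."""
    pairs = list(zip(concs, mortality))
    for (c1, m1), (c2, m2) in zip(pairs, pairs[1:]):
        if m1 <= 0.5 <= m2:
            f = (0.5 - m1) / (m2 - m1)   # linear fraction, applied on log scale
            return 10 ** (math.log10(c1) + f * (math.log10(c2) - math.log10(c1)))
    raise ValueError("50% mortality not bracketed by the tested concentrations")

# e.g. mortality 0%, 20%, 80% at 1, 10, 100 mg/L
estimate = lc50([1.0, 10.0, 100.0], [0.0, 0.2, 0.8])
```

Probit or logit regression over all concentrations is the more standard approach; interpolation is shown here only because it makes the relationship between the raw endpoint counts and the reported LC50 transparent.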

  9. Reference method for detection of Pgp mediated multidrug resistance in human hematological malignancies: a method validated by the laboratories of the French Drug Resistance Network.

    PubMed

    Huet, S; Marie, J P; Gualde, N; Robert, J

    1998-12-15

    Multidrug resistance (MDR) associated with overexpression of the MDR1 gene and of its product, P-glycoprotein (Pgp), plays an important role in limiting cancer treatment efficacy. Many studies have investigated Pgp expression in clinical samples of hematological malignancies but have failed to give a definitive conclusion on its usefulness. One convenient method for fluorescent detection of Pgp in malignant cells is flow cytometry, which, however, gives variable results from one laboratory to another, partly owing to the lack of a rigorously tested reference method. The purpose of this technical note is to describe each step of a reference flow cytometric method. The guidelines for sample handling, staining and analysis have been established both for Pgp detection with monoclonal antibodies directed against extracellular epitopes (MRK16, UIC2 and 4E3), and for Pgp functional activity measurement with Rhodamine 123 as a fluorescent probe. Both methods have been validated on cultured cell lines and clinical samples by 12 laboratories of the French Drug Resistance Network. This cross-validated multicentric study points out steps that are crucial for the accuracy and reproducibility of the results, such as cell viability, data analysis and expression.

  10. Virtual Reality for Enhanced Ecological Validity and Experimental Control in the Clinical, Affective and Social Neurosciences.

    PubMed

    Parsons, Thomas D

    2015-01-01

    An essential tension can be found between researchers interested in ecological validity and those concerned with maintaining experimental control. Research in the human neurosciences often involves the use of simple and static stimuli lacking many of the potentially important aspects of real world activities and interactions. While this research is valuable, there is a growing interest in the human neurosciences to use cues about target states in the real world via multimodal scenarios that involve visual, semantic, and prosodic information. These scenarios should include dynamic stimuli presented concurrently or serially in a manner that allows researchers to assess the integrative processes carried out by perceivers over time. Furthermore, there is growing interest in contextually embedded stimuli that can constrain participant interpretations of cues about a target's internal states. Virtual reality environments proffer assessment paradigms that combine the experimental control of laboratory measures with emotionally engaging background narratives to enhance affective experience and social interactions. The present review highlights the potential of virtual reality environments for enhanced ecological validity in the clinical, affective, and social neurosciences.

  11. Feeling validated yet? A scoping review of the use of consumer-targeted wearable and mobile technology to measure and improve sleep.

    PubMed

    Baron, Kelly Glazer; Duffecy, Jennifer; Berendsen, Mark A; Cheung Mason, Ivy; Lattie, Emily G; Manalo, Natalie C

    2017-12-20

    The objectives of this review were to evaluate the use of consumer-targeted wearable and mobile sleep monitoring technology, identify gaps in the literature and determine the potential for use in behavioral interventions. We undertook a scoping review of studies conducted in adult populations using consumer-targeted wearable technology or mobile devices designed to measure and/or improve sleep. After screening against the inclusion/exclusion criteria, data were extracted from the articles by two co-authors. Articles were included if they used wearable or mobile technology to estimate or evaluate sleep, were published in English and were conducted in adult populations. Our search returned 3897 articles, of which 43 met our inclusion criteria. The results indicated that the majority of studies focused on validating technology to measure sleep (n = 23) or were observational studies (n = 10). Few studies were used to identify sleep disorders (n = 2), evaluate response to interventions (n = 3) or deliver interventions (n = 5). In conclusion, the use of consumer-targeted wearable and mobile sleep monitoring technology has largely focused on validation of devices and applications against polysomnography (PSG), but opportunities exist for observational research and for the delivery of behavioral interventions. Multidisciplinary research is needed to determine the uses of these technologies in interventions, as well as their use in more diverse populations, including people with sleep disorders and other patient populations. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. MRPrimerW: a tool for rapid design of valid high-quality primers for multiple target qPCR experiments

    PubMed Central

    Kim, Hyerin; Kang, NaNa; An, KyuHyeon; Koo, JaeHyung; Kim, Min-Soo

    2016-01-01

    Design of high-quality primers for multiple target sequences is essential for qPCR experiments, but is challenging because both homology tests against off-target sequences and stringent filtering constraints on the primers themselves must be considered. Existing web servers for primer design have major drawbacks, including reliance on BLAST-like tools for homology tests and lack of support for ranking of primers, for TaqMan probes and for simultaneous design of primers against multiple targets. Because of the large-scale computational overhead, the few web servers supporting homology tests use heuristic approaches or perform the tests within a limited scope. Here, we describe MRPrimerW, which performs complete homology testing, supports batch design of primers for multi-target qPCR experiments, supports design of TaqMan probes and ranks the resulting primers to return the top-ranked primers to the user. To ensure high accuracy, we adopted the core algorithm of a previously reported MapReduce-based method, MRPrimer, but completely redesigned it so that users receive query results quickly in a web interface, without requiring a MapReduce cluster or a long computation. MRPrimerW provides primer design services and a complete set of 341 963 135 in silico validated primers covering 99% of human and mouse genes. Free access: http://MRPrimerW.com. PMID:27154272
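    As a rough illustration of the kind of single-primer filtering constraints such pipelines apply (length, GC content, melting temperature), here is a minimal sketch; the thresholds and the Wallace-rule Tm estimate are generic assumptions, not the actual MRPrimerW filters:

```python
# Generic single-primer filters; thresholds are illustrative only.
def gc_content(seq):
    """GC content as a percentage of primer length."""
    seq = seq.upper()
    return 100.0 * (seq.count("G") + seq.count("C")) / len(seq)

def wallace_tm(seq):
    """Wallace-rule Tm estimate (suitable only for short oligos): 2(A+T) + 4(G+C)."""
    seq = seq.upper()
    at = seq.count("A") + seq.count("T")
    gc = seq.count("G") + seq.count("C")
    return 2 * at + 4 * gc

def passes_filters(seq, length=(18, 24), gc=(40.0, 60.0), tm=(50.0, 65.0)):
    """True if the primer satisfies all three illustrative constraints."""
    return (length[0] <= len(seq) <= length[1]
            and gc[0] <= gc_content(seq) <= gc[1]
            and tm[0] <= wallace_tm(seq) <= tm[1])

primer = "ATGGCTAGCTAGGTCGATCC"  # hypothetical 20-mer
print(gc_content(primer), wallace_tm(primer), passes_filters(primer))  # 55.0 62 True
```

    A full design pipeline would apply such filters to every candidate, then run homology tests against all off-target sequences before ranking.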

  13. Validation of isoleucine utilization targets in Plasmodium falciparum

    PubMed Central

    Istvan, Eva S.; Dharia, Neekesh V.; Bopp, Selina E.; Gluzman, Ilya; Winzeler, Elizabeth A.; Goldberg, Daniel E.

    2011-01-01

    Intraerythrocytic malaria parasites can obtain nearly their entire amino acid requirement by degrading host cell hemoglobin. The sole exception is isoleucine, which is not present in adult human hemoglobin and must be obtained exogenously. We evaluated two compounds for their potential to interfere with isoleucine utilization. Mupirocin, a clinically used antibacterial, kills Plasmodium falciparum parasites at nanomolar concentrations. Thiaisoleucine, an isoleucine analog, also has antimalarial activity. To identify the targets of the two compounds, we selected parasites resistant to either mupirocin or thiaisoleucine. Mutants were analyzed by genome-wide high-density tiling microarrays, DNA sequencing, and copy number variation analysis. The genomes of three independent mupirocin-resistant parasite clones had all acquired either amplifications encompassing, or SNPs within, the chromosomally encoded organellar (apicoplast) isoleucyl-tRNA synthetase. Thiaisoleucine-resistant parasites had a mutation in the cytoplasmic isoleucyl-tRNA synthetase. The role of this mutation in thiaisoleucine resistance was confirmed by allelic replacement. This approach is generally useful for the elucidation of new targets in P. falciparum. Our study shows that isoleucine utilization is an essential pathway that can be targeted for antimalarial drug development. PMID:21205898

  14. Testing and validating environmental models

    USGS Publications Warehouse

    Kirchner, J.W.; Hooper, R.P.; Kendall, C.; Neal, C.; Leavesley, G.

    1996-01-01

    Generally accepted standards for testing and validating ecosystem models would benefit both modellers and model users. Universally applicable test procedures are difficult to prescribe, given the diversity of modelling approaches and the many uses for models. However, the generally accepted scientific principles of documentation and disclosure provide a useful framework for devising general standards for model evaluation. Adequately documenting model tests requires explicit performance criteria, and explicit benchmarks against which model performance is compared. A model's validity, reliability, and accuracy can be most meaningfully judged by explicit comparison against the available alternatives. In contrast, current practice is often characterized by vague, subjective claims that model predictions show 'acceptable' agreement with data; such claims provide little basis for choosing among alternative models. Strict model tests (those that invalid models are unlikely to pass) are the only ones capable of convincing rational skeptics that a model is probably valid. However, 'false positive' rates as low as 10% can substantially erode the power of validation tests, making them insufficiently strict to convince rational skeptics. Validation tests are often undermined by excessive parameter calibration and overuse of ad hoc model features. Tests are often also divorced from the conditions under which a model will be used, particularly when it is designed to forecast beyond the range of historical experience. In such situations, data from laboratory and field manipulation experiments can provide particularly effective tests, because one can create experimental conditions quite different from historical data, and because experimental data can provide a more precisely defined 'target' for the model to hit. We present a simple demonstration showing that the two most common methods for comparing model predictions to environmental time series (plotting model time series
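    The point that even a modest false-positive rate erodes the evidential value of passing a validation test can be made concrete with a small Bayes-rule sketch; the rates below are illustrative assumptions, not figures from the paper:

```python
# P(model valid | model passes the test) via Bayes' rule. If invalid
# models pass 10% of the time ("false positives"), passing the test is
# much weaker evidence than if they pass only 1% of the time.
def prob_valid_given_pass(p_valid, p_pass_if_valid, p_pass_if_invalid):
    p_pass = p_valid * p_pass_if_valid + (1 - p_valid) * p_pass_if_invalid
    return p_valid * p_pass_if_valid / p_pass

# Illustrative prior: half of candidate models are valid, and a valid
# model always passes.
print(prob_valid_given_pass(0.5, 1.0, 0.10))  # ~0.91: lenient test
print(prob_valid_given_pass(0.5, 1.0, 0.01))  # ~0.99: strict test
```

    The stricter the test (the lower the chance an invalid model slips through), the more a pass should move a rational skeptic.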

  15. Multi-analyte validation in heterogeneous solution by ELISA.

    PubMed

    Lakshmipriya, Thangavel; Gopinath, Subash C B; Hashim, Uda; Murugaiyah, Vikneswaran

    2017-12-01

    Enzyme-linked immunosorbent assay (ELISA) is a standard assay that has been widely used to validate the presence of an analyte in solution. As ELISA has advanced, different strategies have been demonstrated, making it a suitable immunoassay for a wide range of analytes. Herein, we attempted to provide additional evidence of its suitability for multi-analyte detection. To demonstrate this, three clinically relevant targets were chosen: the 16 kDa protein from Mycobacterium tuberculosis, human blood clotting Factor IXa and the tumour marker squamous cell carcinoma (SCC) antigen. We adapted the routine steps of conventional ELISA to validate the occurrence of the analytes in both homogeneous and heterogeneous solutions, attaining sensitivities of 2, 8 and 1 nM for the 16 kDa protein, FIXa and SCC antigen, respectively. Further, specific multi-analyte validation was evidenced with similar sensitivities in the presence of human serum. The ELISA assay in this study has proven its applicability for genuine multi-target validation in heterogeneous solution and can be followed for other target validations. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. Validation of the sperm class analyser CASA system for sperm counting in a busy diagnostic semen analysis laboratory.

    PubMed

    Dearing, Chey G; Kilburn, Sally; Lindsay, Kevin S

    2014-03-01

    Sperm counts have been linked to several fertility outcomes, making them an essential parameter of semen analysis. It has become increasingly recognised that Computer-Assisted Semen Analysis (CASA) provides improved precision over manual methods, but systems are seldom validated robustly for use. The objective of this study was to gather the evidence to validate or reject the Sperm Class Analyser (SCA) as a tool for routine sperm counting in a busy laboratory setting. The criteria examined were comparison with the Improved Neubauer and Leja 20-μm chambers, within- and between-field precision, sperm concentration linearity from a stock diluted in semen and media, accuracy against internal and external quality material, assessment of uneven flow effects, and a receiver operating characteristic (ROC) analysis to predict fertility in comparison with the Neubauer method. This work demonstrates that SCA CASA technology is not a standalone 'black box', but rather a tool for well-trained staff that allows rapid, high-number sperm counting, provided that errors are identified and corrected. The system produces accurate, linear, precise results, with less analytical variance than manual methods, that correlate well with the Improved Neubauer chamber. The system provides superior predictive potential for diagnosing fertility problems.

  17. Collaborative trial validation studies of real-time PCR-based GMO screening methods for detection of the bar gene and the ctp2-cp4epsps construct.

    PubMed

    Grohmann, Lutz; Brünen-Nieweler, Claudia; Nemeth, Anne; Waiblinger, Hans-Ulrich

    2009-10-14

    Polymerase Chain Reaction (PCR)-based screening methods targeting genetic elements commonly used in genetically modified (GM) plants are important tools for the detection of GM materials in food, feed, and seed samples. To expand and harmonize the screening capability of enforcement laboratories, the German Federal Office of Consumer Protection and Food Safety conducted collaborative trials for interlaboratory validation of real-time PCR methods for detection of the phosphinothricin acetyltransferase (bar) gene from Streptomyces hygroscopicus and a construct containing the 5-enolpyruvylshikimate-3-phosphate synthase gene from Agrobacterium tumefaciens sp. strain CP4 (ctp2-cp4epsps), respectively. To assess the limit of detection, precision, and accuracy of the methods, laboratories had to analyze two sets of 18 coded genomic DNA samples of events LLRice62 and MS8 with the bar method and NK603 and GT73 with the ctp2-cp4epsps method at analyte levels of 0, 0.02, and 0.1% GM content, respectively. In addition, standard DNAs were provided to the laboratories to generate calibration curves for copy number quantification of the bar and ctp2-cp4epsps target sequences present in the test samples. The study design and the results obtained are discussed with respect to the difficult issue of developing general guidelines and concepts for the collaborative trial validation of qualitative PCR screening methods.

  18. Laboratory compliance with the American Society of Clinical Oncology/college of American Pathologists guidelines for human epidermal growth factor receptor 2 testing: a College of American Pathologists survey of 757 laboratories.

    PubMed

    Nakhleh, Raouf E; Grimm, Erin E; Idowu, Michael O; Souers, Rhona J; Fitzgibbons, Patrick L

    2010-05-01

    To ensure quality human epidermal growth factor receptor 2 (HER2) testing in breast cancer, the American Society of Clinical Oncology/College of American Pathologists guidelines were introduced, with expected compliance by 2008. Our objective was to assess the effect these guidelines have had on pathology laboratories and their ability to address key components. In late 2008, a survey was distributed with the HER2 immunohistochemistry (IHC) proficiency testing program. It included questions regarding pathology practice characteristics and assay validation against fluorescence in situ hybridization or another IHC laboratory assay, and it assessed pathologist HER2 scoring competency. Of the 907 surveys sent, 757 (83.5%) were returned. The median laboratory accessioned 15 000 cases and performed 190 HER2 tests annually. Quantitative computer image analysis was used by 33% of laboratories. In-house fluorescence in situ hybridization was performed in 23% of laboratories, and 60% of laboratories addressed the 6- to 48-hour tissue fixation requirement by embedding tissue on the weekend. HER2 testing was performed on the initial biopsy in 40% of laboratories, on the resection specimen in 6%, and on either in 56%. Testing was validated with fluorescence in situ hybridization only in 47% of laboratories, whereas 10% used another IHC assay only; 13% used both assays, and 12% and 15% of laboratories, respectively, had not validated their assays or chose "not applicable" on the survey question. A 90% concordance rate with fluorescence in situ hybridization results was achieved by 88% of laboratories for IHC-negative findings and by 81% of laboratories for IHC-positive cases. For laboratories validating against another IHC assay, the 90% concordance rate was achieved by 80% for negative findings and 75% for positive cases. About 91% of laboratories had a pathologist competency assessment program. This survey demonstrates the extent and characteristics of HER2 testing. Although some American Society of

  19. Multi-MW accelerator target material properties under proton irradiation at Brookhaven National Laboratory linear isotope producer

    NASA Astrophysics Data System (ADS)

    Simos, N.; Ludewig, H.; Kirk, H.; Dooryhee, E.; Ghose, S.; Zhong, Z.; Zhong, H.; Makimura, S.; Yoshimura, K.; Bennett, J. R. J.; Kotsinas, G.; Kotsina, Z.; McDonald, K. T.

    2018-05-01

    The effects of proton beams irradiating materials considered for targets in high-power accelerator experiments have been studied using Brookhaven National Laboratory's (BNL) 200 MeV proton linac. A wide array of materials and alloys covering a wide range of atomic number (Z) is being scoped by the high-power accelerator community, prompting the BNL studies to focus on materials representing each distinct range, i.e. low-Z, mid-Z and high-Z. The low range includes materials such as beryllium and graphite; the mid range, alloys such as Ti-6Al-4V, gum metal and super-Invar; and the high-Z range, pure tungsten and tantalum. Of interest in assessing proton irradiation effects are (a) changes in physio-mechanical properties, which are important in maintaining high-power target functionality, (b) identification of possible limits of proton flux or fluence above which certain materials cease to maintain integrity, (c) the role of material operating temperature in inducing or maintaining radiation damage reversal, and (d) phase stability and microstructural changes. The paper presents excerpted results from macroscopic and microscopic post-irradiation evaluation (PIE) following several irradiation campaigns conducted at the BNL 200 MeV linac, specifically at the isotope producer beam-line/target station. The microscopic PIE relied on high-energy x-ray diffraction at the BNL NSLS X17B1 and NSLS II XPD beam lines. The studies reveal dramatic effects of irradiation on phase stability in several of the materials, changes in physical properties and ductility loss, as well as thermally induced radiation damage reversal in graphite and alloys such as super-Invar.

  20. Inter-laboratory validation of the modified murine local lymph node assay based on 5-bromo-2'-deoxyuridine incorporation.

    PubMed

    Kojima, Hajime; Takeyoshi, Masahiro; Sozu, Takashi; Awogi, Takumi; Arima, Kazunori; Idehara, Kenji; Ikarashi, Yoshiaki; Kanazawa, Yukiko; Maki, Eiji; Omori, Takashi; Yuasa, Atsuko; Yoshimura, Isao

    2011-01-01

    The murine local lymph node assay (LLNA) is a well-established alternative to the guinea pig maximization test (GPMT) or Buehler test (BT) for assessing the skin-sensitizing potential of a drug, cosmetic material, pesticide or industrial chemical. To avoid the radioisotope used in that method, Takeyoshi et al. (2001) developed a modified LLNA based on 5-bromo-2'-deoxyuridine (BrdU) incorporation (LLNA:BrdU-ELISA). The LLNA:BrdU-ELISA is practically identical to the LLNA methodology, except that a single intraperitoneal injection of BrdU is given on day 4 and cell turnover is detected colorimetrically. We conducted a validation study to evaluate the reliability and relevance of the LLNA:BrdU-ELISA. The experiment involved 7 laboratories, in which 10 chemicals were examined under blinded conditions: 3 chemicals were examined in all laboratories and the remaining 7 in 3 laboratories each. The data were expressed as BrdU incorporation measured by an ELISA method for each group, and the stimulation index (SI) for each chemical-treated group was determined as the increase in BrdU incorporation relative to the concurrent vehicle control group. An SI of 2 was set as the cut-off value for skin sensitization activity. The results obtained for all 10 chemicals were sufficiently consistent, with small variations in their SI values. The sensitivity, specificity, and accuracy of the LLNA:BrdU-ELISA against the GPMT/BT were 7/7 (100%), 3/3 (100%), and 10/10 (100%), respectively. Copyright © 2010 John Wiley & Sons, Ltd.
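    The stimulation-index decision rule and the agreement statistics quoted above can be sketched as follows; the absorbance readings are invented for illustration:

```python
# LLNA:BrdU-ELISA style scoring: SI = mean BrdU incorporation of the
# treated group / mean of the concurrent vehicle control group, with
# SI >= 2 scored as skin sensitization activity.
from statistics import mean

def stimulation_index(treated, vehicle_control):
    return mean(treated) / mean(vehicle_control)

def is_sensitizer(treated, vehicle_control, cutoff=2.0):
    return stimulation_index(treated, vehicle_control) >= cutoff

# Hypothetical absorbance readings for one chemical-treated group.
control = [0.20, 0.22, 0.18]
treated = [0.55, 0.60, 0.50]
print(f"SI = {stimulation_index(treated, control):.2f}")  # SI = 2.75
print(is_sensitizer(treated, control))                    # True

# Agreement with the GPMT/BT reference, as reported (7/7, 3/3, 10/10):
tp, fn, tn, fp = 7, 0, 3, 0
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
accuracy = (tp + tn) / (tp + fn + tn + fp)
print(sensitivity, specificity, accuracy)  # 1.0 1.0 1.0
```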

  1. Genomic approach to therapeutic target validation identifies a glucose-lowering GLP1R variant protective for coronary heart disease

    PubMed Central

    Scott, Robert A.; Freitag, Daniel F.; Li, Li; Chu, Audrey Y.; Surendran, Praveen; Young, Robin; Grarup, Niels; Stancáková, Alena; Chen, Yuning; V.Varga, Tibor; Yaghootkar, Hanieh; Luan, Jian'an; Zhao, Jing Hua; Willems, Sara M.; Wessel, Jennifer; Wang, Shuai; Maruthur, Nisa; Michailidou, Kyriaki; Pirie, Ailith; van der Lee, Sven J.; Gillson, Christopher; Olama, Ali Amin Al; Amouyel, Philippe; Arriola, Larraitz; Arveiler, Dominique; Aviles-Olmos, Iciar; Balkau, Beverley; Barricarte, Aurelio; Barroso, Inês; Garcia, Sara Benlloch; Bis, Joshua C.; Blankenberg, Stefan; Boehnke, Michael; Boeing, Heiner; Boerwinkle, Eric; Borecki, Ingrid B.; Bork-Jensen, Jette; Bowden, Sarah; Caldas, Carlos; Caslake, Muriel; Cupples, L. Adrienne; Cruchaga, Carlos; Czajkowski, Jacek; den Hoed, Marcel; Dunn, Janet A.; Earl, Helena M.; Ehret, Georg B.; Ferrannini, Ele; Ferrieres, Jean; Foltynie, Thomas; Ford, Ian; Forouhi, Nita G.; Gianfagna, Francesco; Gonzalez, Carlos; Grioni, Sara; Hiller, Louise; Jansson, Jan-Håkan; Jørgensen, Marit E.; Jukema, J. Wouter; Kaaks, Rudolf; Kee, Frank; Kerrison, Nicola D.; Key, Timothy J.; Kontto, Jukka; Kote-Jarai, Zsofia; Kraja, Aldi T.; Kuulasmaa, Kari; Kuusisto, Johanna; Linneberg, Allan; Liu, Chunyu; Marenne, Gaëlle; Mohlke, Karen L.; Morris, Andrew P.; Muir, Kenneth; Müller-Nurasyid, Martina; Munroe, Patricia B.; Navarro, Carmen; Nielsen, Sune F.; Nilsson, Peter M.; Nordestgaard, Børge G.; Packard, Chris J.; Palli, Domenico; Panico, Salvatore; Peloso, Gina M.; Perola, Markus; Peters, Annette; Poole, Christopher J.; Quirós, J. 
Ramón; Rolandsson, Olov; Sacerdote, Carlotta; Salomaa, Veikko; Sánchez, María-José; Sattar, Naveed; Sharp, Stephen J.; Sims, Rebecca; Slimani, Nadia; Smith, Jennifer A.; Thompson, Deborah J.; Trompet, Stella; Tumino, Rosario; van der A, Daphne L.; van der Schouw, Yvonne T.; Virtamo, Jarmo; Walker, Mark; Walter, Klaudia; Abraham, Jean E.; Amundadottir, Laufey T.; Aponte, Jennifer L.; Butterworth, Adam S.; Dupuis, Josée; Easton, Douglas F.; Eeles, Rosalind A.; Erdmann, Jeanette; Franks, Paul W.; Frayling, Timothy M.; Hansen, Torben; Howson, Joanna M. M.; Jørgensen, Torben; Kooner, Jaspal; Laakso, Markku; Langenberg, Claudia; McCarthy, Mark I.; Pankow, James S.; Pedersen, Oluf; Riboli, Elio; Rotter, Jerome I.; Saleheen, Danish; Samani, Nilesh J.; Schunkert, Heribert; Vollenweider, Peter; O'Rahilly, Stephen; Deloukas, Panos; Danesh, John; Goodarzi, Mark O.; Kathiresan, Sekar; Meigs, James B.; Ehm, Margaret G.; Wareham, Nicholas J.; Waterworth, Dawn M.

    2016-01-01

    Regulatory authorities have indicated that new drugs to treat type 2 diabetes (T2D) should not be associated with an unacceptable increase in cardiovascular risk. Human genetics may be able to inform the development of antidiabetic therapies by predicting cardiovascular and other health endpoints. We therefore investigated the association of variants in 6 genes that encode drug targets for obesity or T2D with a range of metabolic traits in up to 11,806 individuals by targeted exome sequencing, with follow-up in 39,979 individuals by targeted genotyping and additional in silico follow-up in consortia. We used these data first to compare the associations of variants in genes encoding drug targets with the effects of pharmacological manipulation of those targets in clinical trials. We then tested the association of those variants with disease outcomes, including coronary heart disease, to predict the cardiovascular safety of these agents. A low-frequency missense variant (Ala316Thr; rs10305492) in the gene encoding the glucagon-like peptide-1 receptor (GLP1R), the target of GLP1R agonists, was associated with lower fasting glucose and lower T2D risk, consistent with GLP1R agonist therapies. The minor allele was also associated with protection against heart disease, providing evidence that GLP1R agonists are not likely to be associated with an unacceptable increase in cardiovascular risk. Our results provide an encouraging signal that these agents may be associated with benefit, a question currently being addressed in randomised controlled trials. Genetic variants associated with metabolic traits and multiple disease outcomes can be used to validate therapeutic targets at an early stage in the drug development process. PMID:27252175

  2. Preanalytical management: serum vacuum tubes validation for routine clinical chemistry

    PubMed Central

    Lima-Oliveira, Gabriel; Lippi, Giuseppe; Salvagno, Gian Luca; Montagnana, Martina; Picheth, Geraldo; Guidi, Gian Cesare

    2012-01-01

    Introduction: The validation process is essential in accredited clinical laboratories. The aim of this study was to validate five kinds of serum vacuum tubes for routine clinical chemistry laboratory testing. Materials and methods: Blood specimens from 100 volunteers were collected into five different serum vacuum tubes (Tube I: VACUETTE®, Tube II: LABOR IMPORT®, Tube III: S-Monovette®, Tube IV: SST® and Tube V: SST II®) by a single expert phlebotomist. Routine clinical chemistry tests were analyzed on a cobas® 6000 module. The significance of the differences between samples was assessed by paired Student's t-test after checking for normality, with the level of statistical significance set at P < 0.005. Finally, the biases from Tubes I-V were compared with the current desirable quality specifications for bias (B), derived from biological variation. Results and conclusions: Our validation will permit laboratory or hospital managers to select, for technical or economic reasons, any of the validated brands of vacuum tubes for the following laboratory tests: glucose, total cholesterol, high density lipoprotein-cholesterol, triglycerides, total protein, albumin, blood urea nitrogen, uric acid, alkaline phosphatase, aspartate aminotransferase, gamma-glutamyltransferase, lactate dehydrogenase, creatine kinase, total bilirubin, direct bilirubin, calcium, iron, sodium and potassium. In contrast, special attention will be required if the laboratory already performs creatinine, amylase, phosphate and magnesium determinations and the quality laboratory manager intends to change the serum tubes. We suggest that laboratory management should both standardize procedures and frequently evaluate the quality of in vitro diagnostic devices. PMID:22838184
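    The final comparison step above, checking each candidate tube's bias against a desirable quality specification derived from biological variation, can be sketched as follows; the analyte means and the specification limit are invented for illustration:

```python
# Compare the mean analyte result from a candidate serum tube against a
# reference tube; accept the candidate if its relative bias stays within
# the desirable specification derived from biological variation.
def percent_bias(candidate_mean, reference_mean):
    return 100.0 * (candidate_mean - reference_mean) / reference_mean

def tube_acceptable(candidate_mean, reference_mean, desirable_bias_pct):
    return abs(percent_bias(candidate_mean, reference_mean)) <= desirable_bias_pct

# Hypothetical glucose means (mmol/L) and a hypothetical 2.2% bias limit.
print(round(percent_bias(5.10, 5.00), 2))  # 2.0
print(tube_acceptable(5.10, 5.00, 2.2))    # True
print(tube_acceptable(5.25, 5.00, 2.2))    # False (5.0% bias)
```

    In practice this check is run per analyte and per tube brand, alongside the paired t-test on the raw specimens.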

  3. The Development of Laboratory Safety Questionnaire for Middle School Science Teachers

    ERIC Educational Resources Information Center

    Akpullukcu, Simge; Cavas, Bulent

    2017-01-01

    The purpose of this paper is to develop a "valid and reliable laboratory safety questionnaire" which could be used to identify science teachers' understanding of laboratory safety issues during their science laboratory activities. The questionnaire was developed from a literature review and prior instruments developed on laboratory…

  4. VALIDATION GUIDELINES FOR LABORATORIES PERFORMING FORENSIC ANALYSIS OF CHEMICAL TERRORISM

    EPA Science Inventory

    The Scientific Working Group on Forensic Analysis of Chemical Terrorism (SWGFACT) has developed the following guidelines for laboratories engaged in the forensic analysis of chemical evidence associated with terrorism. This document provides a baseline framework and guidance for...

  5. 42 CFR 493.569 - Consequences of a finding of noncompliance as a result of a validation inspection.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... result of a validation inspection. 493.569 Section 493.569 Public Health CENTERS FOR MEDICARE & MEDICAID... validation inspection. (a) Laboratory with a certificate of accreditation. If a validation inspection results... validation inspection results in a finding that a CLIA-exempt laboratory is out of compliance with one or...

  6. 42 CFR 493.569 - Consequences of a finding of noncompliance as a result of a validation inspection.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... result of a validation inspection. 493.569 Section 493.569 Public Health CENTERS FOR MEDICARE & MEDICAID... validation inspection. (a) Laboratory with a certificate of accreditation. If a validation inspection results... validation inspection results in a finding that a CLIA-exempt laboratory is out of compliance with one or...

  7. 42 CFR 493.569 - Consequences of a finding of noncompliance as a result of a validation inspection.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... result of a validation inspection. 493.569 Section 493.569 Public Health CENTERS FOR MEDICARE & MEDICAID... validation inspection. (a) Laboratory with a certificate of accreditation. If a validation inspection results... validation inspection results in a finding that a CLIA-exempt laboratory is out of compliance with one or...

  8. 42 CFR 493.569 - Consequences of a finding of noncompliance as a result of a validation inspection.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... result of a validation inspection. 493.569 Section 493.569 Public Health CENTERS FOR MEDICARE & MEDICAID... validation inspection. (a) Laboratory with a certificate of accreditation. If a validation inspection results... validation inspection results in a finding that a CLIA-exempt laboratory is out of compliance with one or...

  9. 42 CFR 493.569 - Consequences of a finding of noncompliance as a result of a validation inspection.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... result of a validation inspection. 493.569 Section 493.569 Public Health CENTERS FOR MEDICARE & MEDICAID... validation inspection. (a) Laboratory with a certificate of accreditation. If a validation inspection results... validation inspection results in a finding that a CLIA-exempt laboratory is out of compliance with one or...

  10. Mechanism-Based Tumor-Targeting Drug Delivery System. Validation of Efficient Vitamin Receptor-Mediated Endocytosis and Drug Release

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, S.; Wong, S.; Zhao, X.

    An efficient mechanism-based tumor-targeting drug delivery system, based on tumor-specific vitamin-receptor mediated endocytosis, has been developed. The tumor-targeting drug delivery system is a conjugate of a tumor-targeting molecule (biotin: vitamin H or vitamin B-7), a mechanism-based self-immolative linker and a second-generation taxoid (SB-T-1214) as the cytotoxic agent. This conjugate (1) is designed to be (i) specific to the vitamin receptors overexpressed on the tumor cell surface and (ii) internalized efficiently through receptor-mediated endocytosis, followed by smooth drug release via glutathione-triggered self-immolation of the linker. In order to monitor and validate the hypothesized sequence of events, i.e., receptor-mediated endocytosis of the conjugate, drug release, and drug binding to the target protein (microtubules), three fluorescent/fluorogenic molecular probes (2, 3, and 4) were designed and synthesized. The actual occurrence of these processes was unambiguously confirmed by means of confocal fluorescence microscopy (CFM) and flow cytometry using L1210FR leukemia cells, which overexpress biotin receptors. The molecular probe 4, bearing the taxoid linked to fluorescein, was also used to examine cell specificity (i.e., the efficacy of receptor-based cell targeting) for three cell lines: L1210FR (biotin receptors overexpressed), L1210 (biotin receptors not overexpressed), and WI38 (normal human lung fibroblast, biotin receptor negative). As anticipated, the molecular probe 4 exhibited high specificity only to L1210FR. To confirm the direct correlation between cell-specific drug delivery and the anticancer activity of probe 4, its cytotoxicity against these three cell lines was also examined. The results clearly showed a good correlation between the two methods. In the same manner, excellent cell-specific cytotoxicity of the conjugate 1 (without fluorescein attachment to the taxoid) against the same three cell lines was confirmed. This

  11. Science Laboratory Learning Environments in Junior Secondary Schools

    ERIC Educational Resources Information Center

    Kwok, Ping Wai

    2015-01-01

    A Chinese version of the Science Laboratory Environment Inventory (SLEI) was used to study the students' perceptions of the actual and preferred laboratory learning environments in Hong Kong junior secondary science lessons. Valid responses of the SLEI from 1932 students of grade 7 to grade 9 indicated that an open-ended inquiry approach seldom…

  12. Selecting clinical quality indicators for laboratory medicine.

    PubMed

    Barth, Julian H

    2012-05-01

    Quality in laboratory medicine is often described as doing the right test at the right time for the right person. Laboratory processes currently operate under the oversight of an accreditation body, which gives confidence that the process is good. However, there are aspects of quality that are not measured by these processes. These are largely focused on ensuring that the most clinically appropriate test is performed and interpreted correctly. Clinical quality indicators were selected through a two-phase process. Firstly, a series of focus groups of clinical scientists was held with the aim of developing a list of quality indicators. These were subsequently ranked in order by an expert panel of primary and secondary care physicians. The 10 top indicators included the communication of critical results, comprehensive education to all users and adequate quality assurance for point-of-care testing. Laboratories should ensure their tests are used to national standards, that they have clinical utility, are calibrated to national standards and have long-term stability for chronic disease management. Laboratories should have error logs and demonstrate evidence of measures introduced to reduce chances of similar future errors. Laboratories should make a formal scientific evaluation of analytical quality. This paper describes the process of selection of quality indicators for laboratory medicine that have been validated sequentially by deliverers and users of the service. They now need to be converted into measurable variables related to outcome and validated in practice.

  13. Decision support for clinical laboratory capacity planning.

    PubMed

    van Merode, G G; Hasman, A; Derks, J; Goldschmidt, H M; Schoenmaker, B; Oosten, M

    1995-01-01

    The design of a decision support system for capacity planning in clinical laboratories is discussed. The DSS supports decisions concerning the following questions: how should the laboratory be divided into job shops (departments/sections), how should staff be assigned to workstations and how should samples be assigned to workstations for testing. The decision support system contains modules for supporting decisions at the overall laboratory level (concerning the division of the laboratory into job shops) and for supporting decisions at the job shop level (assignment of staff to workstations and sample scheduling). Experiments with these modules are described showing both the functionality and the validity.

  14. Effects of borehole design on complex electrical resistivity measurements: laboratory validation and numerical experiments

    NASA Astrophysics Data System (ADS)

    Treichel, A.; Huisman, J. A.; Zhao, Y.; Zimmermann, E.; Esser, O.; Kemna, A.; Vereecken, H.

    2012-12-01

    Geophysical measurements within a borehole are typically affected by the presence of the borehole. The focus of the current study is to quantify the effect of borehole design on broadband electrical impedance tomography (EIT) measurements within boreholes. Previous studies have shown that effects on the real part of the electrical resistivity are largest for boreholes with large diameters and for materials with a large formation factor. However, these studies have not considered the effect of the well casing and the filter gravel on the measurement of the real part of the electrical resistivity. In addition, the effect of borehole design on the imaginary part of the electrical resistivity has not been investigated yet. Therefore, the aim of this study is to investigate the effect of borehole design on the complex electrical resistivity using laboratory measurements and numerical simulations. In order to do so, we developed a high resolution two dimensional axisymmetric finite element model (FE) that enables us to simulate the effects of several key borehole design parameters (e.g. borehole diameter, thickness of PVC well casing) on the measurement process. For the material surrounding the borehole, realistic values for complex resistivity were obtained from a database of laboratory measurements of complex resistivity from the test site Krauthausen (Germany). The slotted PVC well casing is represented by an effective resistivity calculated from the water-filled slot volume and the PVC volume. Measurements with and without PVC well casing were made with a four-electrode EIT logging tool in a water-filled rain barrel. The initial comparison for the case that the logging tool was inserted in the PVC well casing showed a considerable mismatch between measured and modeled values. It was required to consider a complete electrode model instead of point electrodes to remove this mismatch. This validated model was used to investigate in detail how complex resistivity
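
    The slotted-casing treatment above replaces the slotted PVC by a single effective resistivity derived from the water-filled slot volume and the PVC volume. The abstract does not state the mixing law; a minimal sketch under the assumption of volume-weighted parallel conductances (water-filled slots conducting alongside nearly insulating PVC) would be:

```python
def effective_resistivity(f_water, rho_water, rho_pvc):
    """Effective resistivity of a slotted casing wall, assuming the
    water-filled slots and the PVC conduct in parallel.

    f_water   -- water-filled slot volume fraction of the casing wall
    rho_water -- resistivity of the borehole water (ohm*m)
    rho_pvc   -- resistivity of the PVC (ohm*m, effectively insulating)
    """
    # Parallel mixing: conductivities add with their volume fractions.
    sigma = f_water / rho_water + (1.0 - f_water) / rho_pvc
    return 1.0 / sigma
```

    With half the wall as water-filled slots (rho_water = 20 ohm*m) and effectively insulating PVC, this gives roughly twice the water resistivity; the actual study may use a different, geometry-resolved estimate.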

  15. MRPrimerW: a tool for rapid design of valid high-quality primers for multiple target qPCR experiments.

    PubMed

    Kim, Hyerin; Kang, NaNa; An, KyuHyeon; Koo, JaeHyung; Kim, Min-Soo

    2016-07-08

    Design of high-quality primers for multiple target sequences is essential for qPCR experiments, but is challenging due to the need to consider both homology tests on off-target sequences and the same stringent filtering constraints on the primers. Existing web servers for primer design have major drawbacks, including requiring the use of BLAST-like tools for homology tests, and lack of support for primer ranking, TaqMan probes, and simultaneous design of primers against multiple targets. Due to the large-scale computational overhead, the few web servers supporting homology tests use heuristic approaches or perform homology tests within a limited scope. Here, we describe MRPrimerW, which performs complete homology testing, supports batch design of primers for multi-target qPCR experiments, supports design of TaqMan probes and ranks the resulting primers to return the top-1 best primers to the user. To ensure high accuracy, we adopted the core algorithm of a previously reported MapReduce-based method, MRPrimer, but completely redesigned it to allow users to receive query results quickly in a web interface, without requiring a MapReduce cluster or a long computation. MRPrimerW provides primer design services and a complete set of 341 963 135 in silico validated primers covering 99% of human and mouse genes. Free access: http://MRPrimerW.com. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
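
    The "stringent filtering constraints" mentioned above typically combine length, GC content, melting temperature, and homopolymer-run checks before any homology test is attempted. A toy single-primer filter, with illustrative thresholds that are not MRPrimerW's actual parameters (and a rough Wallace-rule Tm rather than a nearest-neighbor model), might look like:

```python
def passes_filter(primer, min_len=19, max_len=23, gc_range=(40.0, 60.0),
                  tm_range=(58.0, 62.0), max_run=4):
    """Apply simple single-primer constraints; thresholds are illustrative."""
    p = primer.upper()
    if not min_len <= len(p) <= max_len:
        return False
    gc = 100.0 * sum(b in "GC" for b in p) / len(p)
    if not gc_range[0] <= gc <= gc_range[1]:
        return False
    # Wallace rule (rough estimate for short oligos): Tm = 2*(A+T) + 4*(G+C)
    tm = 2 * sum(b in "AT" for b in p) + 4 * sum(b in "GC" for b in p)
    if not tm_range[0] <= tm <= tm_range[1]:
        return False
    # Reject homopolymer runs longer than max_run
    return all(b * (max_run + 1) not in p for b in "ACGT")
```

    A real pipeline would then subject the survivors to pair constraints (amplicon size, Tm difference, cross-dimers) and exhaustive homology testing against off-target sequences.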

  16. Expansion of the Scope of AOAC First Action Method 2012.25--Single-Laboratory Validation of Triphenylmethane Dye and Leuco Metabolite Analysis in Shrimp, Tilapia, Catfish, and Salmon by LC-MS/MS.

    PubMed

    Andersen, Wendy C; Casey, Christine R; Schneider, Marilyn J; Turnipseed, Sherri B

    2015-01-01

    Prior to conducting a collaborative study of AOAC First Action 2012.25 LC-MS/MS analytical method for the determination of residues of three triphenylmethane dyes (malachite green, crystal violet, and brilliant green) and their metabolites (leucomalachite green and leucocrystal violet) in seafood, a single-laboratory validation of method 2012.25 was performed to expand the scope of the method to other seafood matrixes including salmon, catfish, tilapia, and shrimp. The validation included the analysis of fortified and incurred residues over multiple weeks to assess analyte stability in matrix at -80°C, a comparison of calibration methods over the range 0.25 to 4 μg/kg, study of matrix effects for analyte quantification, and qualitative identification of targeted analytes. Method accuracy ranged from 88 to 112% with 13% RSD or less for samples fortified at 0.5, 1.0, and 2.0 μg/kg. Analyte identification and determination limits were determined by procedures recommended both by the U. S. Food and Drug Administration and the European Commission. Method detection limits and decision limits ranged from 0.05 to 0.24 μg/kg and 0.08 to 0.54 μg/kg, respectively. AOAC First Action Method 2012.25 with an extracted matrix calibration curve and internal standard correction is suitable for the determination of triphenylmethane dyes and leuco metabolites in salmon, catfish, tilapia, and shrimp by LC-MS/MS at a residue determination level of 0.5 μg/kg or below.
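
    The accuracy figures quoted above (88 to 112% recovery, 13% RSD or less) come from replicate analyses of fortified samples. A minimal sketch of that calculation, with hypothetical replicate values and sample standard deviation:

```python
from statistics import mean, stdev

def recovery_stats(measured, fortified_level):
    """Mean recovery (%) and relative standard deviation (%) for
    replicate analyses of samples fortified at a known concentration."""
    recoveries = [100.0 * m / fortified_level for m in measured]
    avg = mean(recoveries)
    return avg, 100.0 * stdev(recoveries) / avg  # (mean recovery %, %RSD)
```

    Three hypothetical replicates of a 0.5 ug/kg fortification measuring 0.45, 0.50 and 0.55 ug/kg give 100% mean recovery with 10% RSD.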

  17. Development and validation of the Italian version of the Mobile Application Rating Scale and its generalisability to apps targeting primary prevention.

    PubMed

    Domnich, Alexander; Arata, Lucia; Amicizia, Daniela; Signori, Alessio; Patrick, Bernard; Stoyanov, Stoyan; Hides, Leanne; Gasparini, Roberto; Panatto, Donatella

    2016-07-07

    A growing body of literature affirms the usefulness of mobile technologies, including mobile applications (apps), in the primary prevention field. The quality of health apps, which today number in the thousands, is a crucial parameter, as it may affect health-related decision-making and outcomes among app end-users. The mobile application rating scale (MARS) has recently been developed to evaluate the quality of such apps, and has shown good psychometric properties. Since there is no standardised tool for assessing the apps available in Italian app stores, the present study developed and validated an Italian version of MARS in apps targeting primary prevention. The original 23-item version of the MARS assesses mobile app quality in four objective quality dimensions (engagement, functionality, aesthetics, information) and one subjective dimension. Validation of this tool involved several steps; the universalist approach to achieving equivalence was adopted. Following two backward translations, a reconciled Italian version of MARS was produced and compared with the original scale. On the basis of sample size estimation, 48 apps from three major app stores were downloaded; the first 5 were used for piloting, while the remaining 43 were used in the main study in order to assess the psychometric properties of the scale. The apps were assessed by two raters, each working independently. The psychometric properties of the final version of the scale were assessed, including inter-rater reliability, internal consistency, and convergent, divergent and concurrent validity. The intralingual equivalence of the Italian version of the MARS was confirmed by the authors of the original scale. A total of 43 apps targeting primary prevention were tested. The MARS displayed acceptable psychometric properties. The MARS total score showed an excellent level of both inter-rater agreement (intra-class correlation coefficient of .96) and internal consistency (Cronbach's α of .90 and .91…
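
    Internal consistency as reported above is Cronbach's α, computed from the per-item variances and the variance of the total score. A minimal sketch, using sample variances and toy data:

```python
from statistics import variance

def cronbach_alpha(items):
    """Cronbach's alpha. `items` is a list of per-item score lists,
    each ordered by the same respondents (sample variances throughout)."""
    k = len(items)
    # Total score per respondent across all items
    totals = [sum(scores) for scores in zip(*items)]
    return (k / (k - 1)) * (1.0 - sum(variance(it) for it in items) / variance(totals))
```

    Strongly covarying items drive α toward 1; the two perfectly correlated toy items below give α = 8/9 ≈ 0.89.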

  18. Mercury and Cyanide Data Validation

    EPA Pesticide Factsheets

    Document designed to offer data reviewers guidance in determining the validity of analytical data generated through the USEPA Contract Laboratory Program (CLP) Statement of Work (SOW) ISM01.X Inorganic Superfund Methods (Multi-Media, Multi-Concentration)

  19. Low/Medium Volatile Data Validation

    EPA Pesticide Factsheets

    Document designed to offer data reviewers guidance in determining the validity of analytical data generated through the US EPA Contract Laboratory Program Statement of Work ISM01.X Inorganic Superfund Methods (Multi-Media, Multi-Concentration)

  20. CTD2 Dashboard: a searchable web interface to connect validated results from the Cancer Target Discovery and Development Network

    PubMed Central

    Aksoy, Bülent Arman; Dančík, Vlado; Smith, Kenneth; Mazerik, Jessica N.; Ji, Zhou; Gross, Benjamin; Nikolova, Olga; Jaber, Nadia; Califano, Andrea; Schreiber, Stuart L.; Gerhard, Daniela S.; Hermida, Leandro C.; Jagu, Subhashini

    2017-01-01

    Abstract The Cancer Target Discovery and Development (CTD2) Network aims to use functional genomics to accelerate the translation of high-throughput and high-content genomic and small-molecule data towards use in precision oncology. As part of this goal, and to share its conclusions with the research community, the Network developed the ‘CTD2 Dashboard’ [https://ctd2-dashboard.nci.nih.gov/], which compiles CTD2 Network-generated conclusions, termed ‘observations’, associated with experimental entities, collected by its member groups (‘Centers’). Any researcher interested in learning about a given gene, protein, or compound (a ‘subject’) studied by the Network can come to the CTD2 Dashboard to quickly and easily find, review, and understand Network-generated experimental results. In particular, the Dashboard allows visitors to connect experiments about the same target, biomarker, etc., carried out by multiple Centers in the Network. The Dashboard’s unique knowledge representation allows information to be compiled around a subject, so as to become greater than the sum of the individual contributions. The CTD2 Network has broadly defined levels of validation for evidence (‘Tiers’) pertaining to a particular finding, and the CTD2 Dashboard uses these Tiers to indicate the extent to which results have been validated. Researchers can use the Network’s insights and tools to develop a new hypothesis or confirm existing hypotheses, in turn advancing the findings towards clinical applications. Database URL: https://ctd2-dashboard.nci.nih.gov/ PMID:29220450

  1. Design and validation of the eyesafe ladar testbed (ELT) using the LadarSIM system simulator

    NASA Astrophysics Data System (ADS)

    Neilsen, Kevin D.; Budge, Scott E.; Pack, Robert T.; Fullmer, R. Rees; Cook, T. Dean

    2009-05-01

    The development of an experimental full-waveform LADAR system has been enhanced with the assistance of the LadarSIM system simulation software. The Eyesafe LADAR Test-bed (ELT) was designed as a raster scanning, single-beam, energy-detection LADAR with the capability of digitizing and recording the return pulse waveform at up to 2 GHz for 3D off-line image formation research in the laboratory. To assist in the design phase, the full-waveform LADAR simulation in LadarSIM was used to simulate the expected return waveforms for various system design parameters, target characteristics, and target ranges. Once the design was finalized and the ELT constructed, the measured specifications of the system and experimental data captured from the operational sensor were used to validate the behavior of the system as predicted during the design phase. This paper presents the methodology used, and lessons learned from this "design, build, validate" process. Simulated results from the design phase are presented, and these are compared to simulated results using measured system parameters and operational sensor data. The advantages of this simulation-based process are also presented.
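
    A full-waveform simulation of the kind described above boils down to placing a return pulse at the round-trip delay 2R/c and scaling its amplitude by target reflectivity and range falloff. A toy single-return model (Gaussian pulse and 1/R² falloff are assumptions here, not LadarSIM's actual radiometric model):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def return_waveform(range_m, reflectivity, pulse_fwhm_ns, sample_rate_ghz, n_samples):
    """Sampled energy-detection return: a Gaussian pulse delayed by the
    round-trip time, amplitude scaled by reflectivity / R^2 (assumed falloff)."""
    t0_ns = 2.0 * range_m / C * 1e9          # round-trip delay in ns
    sigma = pulse_fwhm_ns / 2.355            # FWHM -> Gaussian sigma
    amp = reflectivity / range_m ** 2
    dt = 1.0 / sample_rate_ghz               # sample period in ns
    return [amp * math.exp(-0.5 * ((i * dt - t0_ns) / sigma) ** 2)
            for i in range(n_samples)]
```

    Sampling at 2 GHz (the ELT's quoted digitizer rate), a target at 150 m puts the pulse peak near the sample closest to its ~1000.7 ns round-trip delay; a fuller model would add noise, detector response, and multiple returns per beam.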

  2. TargetNet: a web service for predicting potential drug-target interaction profiling via multi-target SAR models.

    PubMed

    Yao, Zhi-Jiang; Dong, Jie; Che, Yu-Jing; Zhu, Min-Feng; Wen, Ming; Wang, Ning-Ning; Wang, Shan; Lu, Ai-Ping; Cao, Dong-Sheng

    2016-05-01

    Drug-target interactions (DTIs) are central to current drug discovery processes and public health fields. Analyzing the DTI profiling of the drugs helps to infer drug indications, adverse drug reactions, drug-drug interactions, and drug mode of actions. Therefore, it is of high importance to reliably and fast predict DTI profiling of the drugs on a genome-scale level. Here, we develop the TargetNet server, which can make real-time DTI predictions based only on molecular structures, following the spirit of multi-target SAR methodology. Naïve Bayes models together with various molecular fingerprints were employed to construct prediction models. Ensemble learning from these fingerprints was also provided to improve the prediction ability. When the user submits a molecule, the server will predict the activity of the user's molecule across 623 human proteins by the established high quality SAR model, thus generating a DTI profiling that can be used as a feature vector of chemicals for wide applications. The 623 SAR models related to 623 human proteins were strictly evaluated and validated by several model validation strategies, resulting in the AUC scores of 75-100 %. We applied the generated DTI profiling to successfully predict potential targets, toxicity classification, drug-drug interactions, and drug mode of action, which sufficiently demonstrated the wide application value of the potential DTI profiling. The TargetNet webserver is designed based on the Django framework in Python, and is freely accessible at http://targetnet.scbdd.com .
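
    The server's per-protein models pair binary molecular fingerprints with Naïve Bayes. A hand-rolled Bernoulli Naïve Bayes over bit fingerprints (Laplace smoothing; toy data, not TargetNet's actual descriptors or training sets) illustrates the core idea:

```python
import math

def train_bernoulli_nb(X, y, alpha=1.0):
    """X: list of 0/1 fingerprint lists; y: list of 0/1 activity labels.
    Returns per-class (log prior, Laplace-smoothed per-bit 'on' probabilities)."""
    n_bits = len(X[0])
    model = {}
    for c in (0, 1):
        rows = [x for x, label in zip(X, y) if label == c]
        prior = math.log(len(rows) / len(X))
        p_on = [(sum(r[j] for r in rows) + alpha) / (len(rows) + 2 * alpha)
                for j in range(n_bits)]
        model[c] = (prior, p_on)
    return model

def predict(model, x):
    """Return the class with the higher joint log-likelihood."""
    def loglik(c):
        prior, p_on = model[c]
        return prior + sum(math.log(p if b else 1 - p) for b, p in zip(x, p_on))
    return max((0, 1), key=loglik)
```

    In practice one model per target protein is trained on known actives/inactives, and the 623 per-molecule predictions are concatenated into the DTI profiling vector.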

  3. TargetNet: a web service for predicting potential drug-target interaction profiling via multi-target SAR models

    NASA Astrophysics Data System (ADS)

    Yao, Zhi-Jiang; Dong, Jie; Che, Yu-Jing; Zhu, Min-Feng; Wen, Ming; Wang, Ning-Ning; Wang, Shan; Lu, Ai-Ping; Cao, Dong-Sheng

    2016-05-01

    Drug-target interactions (DTIs) are central to current drug discovery processes and public health fields. Analyzing the DTI profiling of the drugs helps to infer drug indications, adverse drug reactions, drug-drug interactions, and drug mode of actions. Therefore, it is of high importance to reliably and fast predict DTI profiling of the drugs on a genome-scale level. Here, we develop the TargetNet server, which can make real-time DTI predictions based only on molecular structures, following the spirit of multi-target SAR methodology. Naïve Bayes models together with various molecular fingerprints were employed to construct prediction models. Ensemble learning from these fingerprints was also provided to improve the prediction ability. When the user submits a molecule, the server will predict the activity of the user's molecule across 623 human proteins by the established high quality SAR model, thus generating a DTI profiling that can be used as a feature vector of chemicals for wide applications. The 623 SAR models related to 623 human proteins were strictly evaluated and validated by several model validation strategies, resulting in the AUC scores of 75-100 %. We applied the generated DTI profiling to successfully predict potential targets, toxicity classification, drug-drug interactions, and drug mode of action, which sufficiently demonstrated the wide application value of the potential DTI profiling. The TargetNet webserver is designed based on the Django framework in Python, and is freely accessible at http://targetnet.scbdd.com.

  4. Target Acquisition: Human Observer Performance Studies and TARGAC Model Validation

    DTIC Science & Technology

    1994-06-01

    … complete Scenario I run were displayed on a monitor. A transparent sheet was attached to the face of the monitor, and the approach route was sketched on … track passed a mark on the monitor face, the target was at a stop-sign location. The IRIG-B time of that instance was noted, and, since the distance of … [remainder of the record is unrecoverable residue of a plot of detection, recognition, and identification performance versus range, 1000-4000 m]

  5. A validated methodology for determination of laboratory instrument computer interface efficacy

    NASA Astrophysics Data System (ADS)

    1984-12-01

    This report is intended to provide a methodology for determining when, and for which instruments, direct interfacing of laboratory instrument and laboratory computers is beneficial. This methodology has been developed to assist the Tri-Service Medical Information Systems Program Office in making future decisions regarding laboratory instrument interfaces. We have calculated the time savings required to reach a break-even point for a range of instrument interface prices and corresponding average annual costs. The break-even analyses used empirical data to estimate the number of data points run per day that are required to meet the break-even point. The results indicate, for example, that at a purchase price of $3,000, an instrument interface will be cost-effective if the instrument is utilized for at least 154 data points per day if operated in the continuous mode, or 216 points per day if operated in the discrete mode. Although this model can help to ensure that instrument interfaces are cost effective, additional information should be considered in making the interface decisions. A reduction in results transcription errors may be a major benefit of instrument interfacing.
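
    The break-even reasoning above can be sketched as a one-line model: an interface pays for itself when the annual labor cost of manual transcription it eliminates matches its amortized annual cost. All parameter values below are hypothetical and do not reproduce the report's $3,000 / 154-points-per-day figures, which rest on empirical timing data:

```python
def breakeven_points_per_day(interface_cost, service_years, workdays_per_year,
                             minutes_saved_per_point, labor_rate_per_hour):
    """Daily data-point volume at which the interface's amortized annual
    cost equals the annual labor cost saved on manual transcription."""
    annual_cost = interface_cost / service_years             # straight-line amortization
    saving_per_point = (minutes_saved_per_point / 60.0) * labor_rate_per_hour
    return annual_cost / (workdays_per_year * saving_per_point)
```

    For example, a $3,000 interface amortized over 5 years, with 250 working days per year, 0.5 minutes saved per point, and a $20/h technician rate breaks even near 14 points per day under these illustrative inputs.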

  6. Rover-based visual target tracking validation and mission infusion

    NASA Technical Reports Server (NTRS)

    Kim, Won S.; Steele, Robert D.; Ansar, Adnan I.; Ali, Khaled; Nesnas, Issa

    2005-01-01

    The Mars Exploration Rovers (MER'03), Spirit and Opportunity, represent the state of the art in rover operations on Mars. This paper presents validation experiments of different visual tracking algorithms using the rover's navigation camera.

  7. Differential Impact of Plasma Proteins on the Adhesion Efficiency of Vascular-Targeted Carriers (VTCs) in Blood of Common Laboratory Animals.

    PubMed

    Namdee, Katawut; Sobczynski, Daniel J; Onyskiw, Peter J; Eniola-Adefeso, Omolola

    2015-12-16

    Vascular-targeted carrier (VTC) interaction with human plasma is known to reduce targeted adhesion efficiency in vitro. However, the role of plasma proteins in the adhesion efficiency of VTCs in laboratory animals remains unknown. Here, in vitro blood flow assays are used to explore the effects of mouse, rabbit, and porcine plasma on VTC adhesion. Porcine blood exhibited a strong negative plasma effect on VTC adhesion, while no significant plasma effect was found with rabbit and mouse blood. A brush-density poly(ethylene glycol) (PEG) coating on VTCs was effective at improving adhesion of microsized, but not nanosized, VTCs in porcine blood. Overall, the results suggest that porcine models, as opposed to mouse, can serve as better models in preclinical research for predicting the in vivo functionality of VTCs for use in humans. These considerations hold great importance for the design of various pharmaceutical products and development of reliable drug delivery systems.

  8. Measuring laboratory-based influenza surveillance capacity: development of the 'International Influenza Laboratory Capacity Review' Tool.

    PubMed

    Muir-Paulik, S A; Johnson, L E A; Kennedy, P; Aden, T; Villanueva, J; Reisdorf, E; Humes, R; Moen, A C

    2016-01-01

    The 2005 International Health Regulations (IHR 2005) emphasized the importance of laboratory capacity to detect emerging diseases, including novel influenza viruses. To support IHR 2005 requirements and the need to enhance influenza laboratory surveillance capacity, the Association of Public Health Laboratories (APHL) and the Centers for Disease Control and Prevention (CDC) Influenza Division developed the International Influenza Laboratory Capacity Review (Tool). Data from 37 assessments were reviewed and analyzed to verify that the quantitative analysis results accurately depicted a laboratory's capacity and capabilities. Subject matter experts in influenza and laboratory practice used an iterative approach to develop the Tool, incorporating feedback and lessons learnt through piloting and implementation. To systematically analyze assessment data, a quantitative framework for analysis was added to the Tool. The review indicated that changes in scores consistently reflected enhanced or decreased capacity. The review process also validated the utility of adding a quantitative analysis component to the assessments and the benefit of establishing a baseline from which to compare future assessments in a standardized way. Use of the Tool has provided APHL, CDC and each assessed laboratory with a standardized analysis of the laboratory's capacity. The information generated is used to improve laboratory systems for laboratory testing and enhance influenza surveillance globally. We describe the development of the Tool and lessons learnt. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  9. Dosimetric accuracy of a treatment planning system for actively scanned proton beams and small target volumes: Monte Carlo and experimental validation

    NASA Astrophysics Data System (ADS)

    Magro, G.; Molinelli, S.; Mairani, A.; Mirandola, A.; Panizza, D.; Russo, S.; Ferrari, A.; Valvo, F.; Fossati, P.; Ciocca, M.

    2015-09-01

    This study was performed to evaluate the accuracy of a commercial treatment planning system (TPS) in optimising proton pencil beam dose distributions for small targets of different sizes (5-30 mm side) located at increasing depths in water. The TPS analytical algorithm was benchmarked against experimental data and the FLUKA Monte Carlo (MC) code, previously validated for the selected beam-line. We tested the Siemens syngo® TPS plan optimisation module for water cubes fixing the configurable parameters at clinical standards, with homogeneous target coverage to a 2 Gy (RBE) dose prescription as unique goal. Plans were delivered and the dose at each volume centre was measured in water with a calibrated PTW Advanced Markus® chamber. An EBT3® film was also positioned at the phantom entrance window for the acquisition of 2D dose maps. Discrepancies between TPS-calculated and MC-simulated values were mainly due to the different lateral spread modeling and were found to be related to the field-to-spot size ratio. The accuracy of the TPS was proved to be clinically acceptable in all cases except very small and shallow volumes. In this context, the use of MC to validate TPS results proved to be a reliable procedure for pre-treatment plan verification.

  10. Dosimetric accuracy of a treatment planning system for actively scanned proton beams and small target volumes: Monte Carlo and experimental validation.

    PubMed

    Magro, G; Molinelli, S; Mairani, A; Mirandola, A; Panizza, D; Russo, S; Ferrari, A; Valvo, F; Fossati, P; Ciocca, M

    2015-09-07

    This study was performed to evaluate the accuracy of a commercial treatment planning system (TPS) in optimising proton pencil beam dose distributions for small targets of different sizes (5-30 mm side) located at increasing depths in water. The TPS analytical algorithm was benchmarked against experimental data and the FLUKA Monte Carlo (MC) code, previously validated for the selected beam-line. We tested the Siemens syngo(®) TPS plan optimisation module for water cubes fixing the configurable parameters at clinical standards, with homogeneous target coverage to a 2 Gy (RBE) dose prescription as unique goal. Plans were delivered and the dose at each volume centre was measured in water with a calibrated PTW Advanced Markus(®) chamber. An EBT3(®) film was also positioned at the phantom entrance window for the acquisition of 2D dose maps. Discrepancies between TPS-calculated and MC-simulated values were mainly due to the different lateral spread modeling and were found to be related to the field-to-spot size ratio. The accuracy of the TPS was proved to be clinically acceptable in all cases except very small and shallow volumes. In this context, the use of MC to validate TPS results proved to be a reliable procedure for pre-treatment plan verification.

  11. Building a pipeline to discover and validate novel therapeutic targets and lead compounds for Alzheimer's disease

    PubMed Central

    Bennett, David A.; Yu, Lei; De Jager, Philip L.

    2014-01-01

    Cognitive decline, Alzheimer's disease (AD) and other causes are major public health problems worldwide. With changing demographics, the number of persons with dementia will increase rapidly. The treatment and prevention of AD and other dementias, therefore, is an urgent unmet need. There have been considerable advances in understanding the biology of many age-related disorders that cause dementia. Gains in understanding AD have led to the development of ante-mortem biomarkers of traditional neuropathology and the conduct of several phase III interventions in the amyloid-β cascade early in the disease process. Many other intervention strategies are in various stages of development. However, efforts to date have met with limited success. A recent National Institute on Aging Research Summit led to a number of requests for applications. One was to establish multi-disciplinary teams of investigators who use systems biology approaches and stem cell technology to identify a new generation of AD targets. We were recently awarded one of three such grants to build a pipeline that integrates epidemiology, systems biology, and stem cell technology to discover and validate novel therapeutic targets and lead compounds for AD treatment and prevention. Here we describe the two cohorts that provide the data and biospecimens being exploited for our pipeline and describe the available unique datasets. Second, we present evidence in support of a chronic disease model of AD that informs our choice of phenotypes as the target outcome. Third, we provide an overview of our approach. Finally, we present the details of our planned drug discovery pipeline. PMID:24508835

  12. Gene-environment interactions and construct validity in preclinical models of psychiatric disorders.

    PubMed

    Burrows, Emma L; McOmish, Caitlin E; Hannan, Anthony J

    2011-08-01

    … optimize 'environmental construct validity' in animal models, while maintaining comparability between laboratories, so as to ensure optimal scientific and medical outcomes. Utilizing more sophisticated models to elucidate the relative contributions of genetic and environmental factors will allow for improved construct, face and predictive validity, thus facilitating the identification of novel therapeutic targets. Copyright © 2010 Elsevier Inc. All rights reserved.

  13. Nucleic Acid Amplification Based Diagnostic of Lyme (Neuro-)borreliosis – Lost in the Jungle of Methods, Targets, and Assays?

    PubMed Central

    Nolte, Oliver

    2012-01-01

    Laboratory-based diagnosis of infectious diseases usually relies on culture of the disease-causing micro-organism, followed by identification and susceptibility testing. Since Borrelia burgdorferi sensu lato, the etiologic agent of Lyme disease or Lyme borreliosis, requires very specific culture conditions (e.g. specific liquid media, long-term culture), traditional bacteriology is often not done on a routine basis. Instead, confirmation of the clinical diagnosis needs either indirect techniques (like serology or measurement of cellular activity in the presence of antigens) or direct but culture-independent techniques, like microscopy or nucleic acid amplification techniques (NAT), with polymerase chain reaction (PCR) being the most frequently applied NAT method in routine laboratories. NAT uses nucleic acids of the disease-causing micro-organism as template for amplification, isolated from various sources of clinical specimens. Although the underlying principle, adoption of the enzymatic process running during DNA duplication prior to prokaryotic cell division, is comparatively easy, a couple of 'pitfalls' are associated with the technique itself as well as with interpretation of the results. At present, no commercial, CE-marked and sufficiently validated PCR assay is available. A number of homebrew assays have been published, which are different in terms of target (i.e. the gene targeted by the amplification primers), method (nested PCR, PCR followed by hybridization, real-time PCR) and validation criteria. Inhibitory compounds may lead to false-negative results if no appropriate internal control is included. Carry-over of amplicons, insufficient handling and workflow and/or insufficiently validated targets/primers may result in false-positive results. Different targets may yield different analytical sensitivity, depending, among other factors, on the redundancy of a target gene in the genome. Performance characteristics (e.g. analytical sensitivity and…
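
    The internal-control logic described above (inhibitory compounds causing false negatives) is usually encoded as a joint interpretation of the target signal and an internal amplification control. A minimal sketch:

```python
def interpret_pcr(target_detected, internal_control_detected):
    """Joint interpretation of a PCR target and its internal amplification
    control: a negative target is only reportable if the control amplified."""
    if target_detected:
        return "positive"   # target amplified; control state is irrelevant
    if internal_control_detected:
        return "negative"   # no target, but the reaction demonstrably worked
    return "invalid"        # neither amplified: inhibition suspected, repeat
```

    Real assays layer further checks (extraction controls, no-template controls, Ct cut-offs) on top of this target/control matrix.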

  14. A single-laboratory validated method for the generation of DNA barcodes for the identification of fish for regulatory compliance.

    PubMed

    Handy, Sara M; Deeds, Jonathan R; Ivanova, Natalia V; Hebert, Paul D N; Hanner, Robert H; Ormos, Andrea; Weigt, Lee A; Moore, Michelle M; Yancy, Haile F

    2011-01-01

    The U.S. Food and Drug Administration is responsible for ensuring that the nation's food supply is safe and accurately labeled. This task is particularly challenging in the case of seafood where a large variety of species are marketed, most of this commodity is imported, and processed product is difficult to identify using traditional morphological methods. Reliable species identification is critical for both foodborne illness investigations and for prevention of deceptive practices, such as those where species are intentionally mislabeled to circumvent import restrictions or for resale as species of higher value. New methods that allow accurate and rapid species identifications are needed, but any new methods to be used for regulatory compliance must be both standardized and adequately validated. "DNA barcoding" is a process by which species discriminations are achieved through the use of short, standardized gene fragments. For animals, a fragment (655 base pairs starting near the 5' end) of the cytochrome c oxidase subunit 1 mitochondrial gene has been shown to provide reliable species level discrimination in most cases. We provide here a protocol with single-laboratory validation for the generation of DNA barcodes suitable for the identification of seafood products, specifically fish, in a manner that is suitable for FDA regulatory use.
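
    As a purely illustrative sketch of the matching step behind barcode-based identification: a query COI fragment is compared against reference barcodes and assigned to the closest species. The sequences and species names below are invented placeholders, not real barcodes; regulatory workflows use curated, validated reference libraries and alignment-based distance measures.

```python
# Hypothetical sketch of barcode matching by pairwise sequence identity.
# All sequences and species names are invented toy data.

def identity(seq_a: str, seq_b: str) -> float:
    """Fraction of matching positions over the shared length (no gaps)."""
    n = min(len(seq_a), len(seq_b))
    matches = sum(1 for a, b in zip(seq_a[:n], seq_b[:n]) if a == b)
    return matches / n

# Invented reference "barcodes" for two hypothetical species.
references = {
    "Species A": "ATGGCACTATTTATCGGTG",
    "Species B": "ATGGCTCTGTTTATCGGAG",
}

def best_match(query):
    """Return the (species, identity) pair with the highest identity score."""
    return max(((sp, identity(query, ref)) for sp, ref in references.items()),
               key=lambda t: t[1])

species, score = best_match("ATGGCACTATTTATCGGAG")
print(species, round(score, 2))
```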

  15. Development of analytical methodologies to assess recalcitrant pesticide bioremediation in biobeds at laboratory scale.

    PubMed

    Rivero, Anisleidy; Niell, Silvina; Cerdeiras, M Pía; Heinzen, Horacio; Cesio, María Verónica

    2016-06-01

    To assess recalcitrant pesticide bioremediation it is necessary to gradually increase the complexity of the biological system used in order to design an effective biobed assembly. Each step towards this effective biobed design needs a suitable, validated analytical methodology that allows a correct evaluation of dissipation and bioconversion. Methods with low recovery yields could give a false impression of a successful biodegradation process. To address this situation, different methods were developed and validated for the simultaneous determination of endosulfan, its three main metabolites, and chlorpyrifos in increasingly complex matrices where the bioconvertor basidiomycete Abortiporus biennis could grow. The matrices were culture media, bran, and finally a laboratory biomix composed of bran, peat, and soil. The methodology for the analysis of the first evaluated matrix has already been reported. The methodologies developed for the other two systems are presented in this work. The targeted analytes were extracted from fungi growing over bran in semisolid YNB (Yeast Nitrogen Base) medium with acetonitrile using shaker-assisted extraction. The salting-out step was performed with MgSO4 and NaCl, and the extracts were analyzed by GC-ECD. The best methodology was fully validated for all the evaluated analytes at 1 and 25 mg kg(-1), yielding recoveries between 72% and 109% and RSDs <11% in all cases. The application of this methodology proved that A. biennis is able to dissipate 94% of endosulfan and 87% of chlorpyrifos after 90 days. Having assessed that A. biennis growing over bran can metabolize the studied pesticides, the next step was the development and validation of an analytical procedure to evaluate the analytes in a laboratory-scale biobed composed of 50% bran, 25% peat, and 25% soil together with fungal mycelium.
From the different procedures assayed, only ultrasound-assisted extraction with ethyl acetate allowed recoveries between 80% and 110% with RSDs

  16. Beamed Energy Propulsion by Means of Target Ablation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rosenberg, Benjamin A.

    2004-03-30

    This paper describes hundreds of pendulum tests examining the beamed energy conversion efficiency of different metal targets coated with multiple liquid enhancers. Preliminary testing used a local laser with photographic paper targets, with no liquid, water, canola oil, or methanol additives. Laboratory experimentation was completed at Wright-Patterson AFB using a high-powered laser, and ballistic pendulums of aluminum, titanium, or copper. Dry targets, and those coated with water, methanol and oil were repeatedly tested in laboratory conditions. Results were recorded on several high-speed digital video cameras, and the conversion efficiency was calculated. Paper airplanes successfully launched using BEP were likewise recorded.

  17. The validation index: a new metric for validation of segmentation algorithms using two or more expert outlines with application to radiotherapy planning.

    PubMed

    Juneja, Prabhjot; Evans, Philip M; Harris, Emma J

    2013-08-01

    Validation is required to ensure automated segmentation algorithms are suitable for radiotherapy target definition. In the absence of true segmentation, algorithmic segmentation is validated against expert outlining of the region of interest. Multiple experts are used to overcome inter-expert variability. Several approaches have been studied in the literature, but the most appropriate way to combine the information from multiple expert outlines into a single validation metric is unclear, and none of them offers a metric that can be tailored to case-specific requirements in radiotherapy planning. The validation index (VI), a new validation metric that uses the experts' level of agreement, was developed. A control parameter was introduced for the validation of segmentations required for different radiotherapy scenarios: for targets close to organs-at-risk and for difficult-to-discern targets, where large variation between experts is expected. VI was evaluated using two simulated idealized cases and data from two clinical studies. VI was compared with the commonly used Dice similarity coefficient (DSC(pair-wise)) and found to be more sensitive than DSC(pair-wise) to changes in agreement between experts. VI was shown to be adaptable to specific radiotherapy planning scenarios.
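
    The pairwise Dice similarity coefficient that the validation index is compared against can be sketched in a few lines: the overlap between an algorithmic segmentation and one expert outline, both encoded as binary masks. The masks below are toy data, not from the study.

```python
# Minimal sketch of the pairwise Dice similarity coefficient (DSC):
# DSC = 2|A ∩ B| / (|A| + |B|) for two binary masks of equal size.

def dice(mask_a, mask_b) -> float:
    """Dice overlap between two equally sized binary masks."""
    inter = sum(a and b for a, b in zip(mask_a, mask_b))
    return 2 * inter / (sum(mask_a) + sum(mask_b))

algo   = [1, 1, 1, 0, 0, 0]   # toy algorithmic segmentation
expert = [0, 1, 1, 1, 0, 0]   # toy expert outline
print(round(dice(algo, expert), 3))  # 2*2/(3+3) ≈ 0.667
```

    With several experts, one DSC is computed per expert outline; the abstract's point is that no single pairwise score captures the experts' level of agreement, which is what VI adds.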

  18. Inter-laboratory analysis of selected genetically modified plant reference materials with digital PCR.

    PubMed

    Dobnik, David; Demšar, Tina; Huber, Ingrid; Gerdes, Lars; Broeders, Sylvia; Roosens, Nancy; Debode, Frederic; Berben, Gilbert; Žel, Jana

    2018-01-01

    Digital PCR (dPCR), as a new technology in the field of genetically modified (GM) organism (GMO) testing, enables determination of absolute target copy numbers. The purpose of our study was to test the transferability of methods designed for quantitative PCR (qPCR) to dPCR and to carry out an inter-laboratory comparison of the performance of two different dPCR platforms when determining the absolute GM copy numbers and GM copy number ratio in reference materials certified for GM content as a mass fraction. Overall results in terms of measured GM% were within acceptable variation limits for both tested dPCR systems. However, the determined absolute copy numbers for individual genes or events showed higher variability between laboratories in one third of the cases, most probably due to variability in the technical work, droplet size variability, and analysis of the raw data. GMO quantification with dPCR and qPCR was comparable. As methods originally designed for qPCR performed well in dPCR systems, already validated qPCR assays can generally be used with dPCR technology for the purpose of GMO detection. Graphical abstract: The output of three different PCR-based platforms was assessed in an inter-laboratory comparison.
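
    The absolute quantification that dPCR provides rests on standard Poisson statistics: from the fraction of positive partitions, the mean number of copies per partition is estimated and scaled up to total copies. The partition counts below are invented for illustration, not taken from the inter-laboratory study.

```python
import math

# Sketch of the standard Poisson correction used in digital PCR.
# positive/partitions gives the observed positive fraction p; the mean
# copies per partition is λ = -ln(1 - p); total copies = λ * partitions.

def dpcr_copies(positive: int, partitions: int) -> float:
    """Poisson-corrected absolute copy number from a dPCR run."""
    p = positive / partitions
    lam = -math.log(1.0 - p)
    return lam * partitions

# Invented counts: a GM target assay and a reference-gene assay.
gm_copies  = dpcr_copies(positive=4000, partitions=20000)
ref_copies = dpcr_copies(positive=12000, partitions=20000)
print(f"GM copy number ratio: {100 * gm_copies / ref_copies:.1f}%")
```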

  19. Revealing the Effects of the Herbal Pair of Euphorbia kansui and Glycyrrhiza on Hepatocellular Carcinoma Ascites with Integrating Network Target Analysis and Experimental Validation

    PubMed Central

    Zhang, Yanqiong; Lin, Ya; Zhao, Haiyu; Guo, Qiuyan; Yan, Chen; Lin, Na

    2016-01-01

    Although the herbal pair of Euphorbia kansui (GS) and Glycyrrhiza (GC) is one of the so-called "eighteen antagonistic medicaments" in Chinese medicinal literature, it is prescribed in a classic Traditional Chinese Medicine (TCM) formula Gansui-Banxia-Tang for cancerous ascites, suggesting that GS and GC may exhibit synergistic or antagonistic effects in different combination designs. Here, we modeled the effects of GS/GC combination with a target interaction network and clarified the associations between the network topologies involving the drug targets and the drug combination effects. Moreover, the "edge-betweenness" values, defined as the frequency with which an edge is placed on the shortest paths between all pairs of modules in the network, were calculated, and the ADRB1-PIK3CG interaction exhibited the greatest edge-betweenness value, suggesting its crucial role in connecting the other edges in the network. Because ADRB1 and PIK3CG were putative targets of GS and GC, respectively, and both had functional interactions with AVPR2, approved as a known therapeutic target for ascites, we proposed that the ADRB1-PIK3CG-AVPR2 signal axis might be involved in the effects of the GS-GC combination on ascites. This proposal was further experimentally validated in a H22 hepatocellular carcinoma (HCC) ascites model. Collectively, this systems-level investigation integrated drug target prediction and network analysis to reveal the combination principles of the herbal pair of GS and GC. Experimental validation in an in vivo system provided convincing evidence that different combination designs of GS and GC might result in synergistic or antagonistic effects on HCC ascites that might be partially related to their regulation of the ADRB1-PIK3CG-AVPR2 signal axis. PMID:27143956
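
    The edge-betweenness idea used in the abstract can be sketched on a toy graph: for every pair of nodes, count how often each edge lies on a shortest path, splitting the credit when several shortest paths tie. The miniature graph below is invented and is not the GS/GC target network; production analyses typically use a graph library rather than this brute-force enumeration.

```python
from collections import deque
from itertools import combinations

# Toy edge-betweenness: fraction of shortest paths through each edge.

def shortest_paths(graph, src, dst):
    """All shortest paths from src to dst via breadth-first search."""
    queue, paths, best = deque([[src]]), [], None
    while queue:
        path = queue.popleft()
        if best is not None and len(path) > best:
            break  # BFS yields paths in non-decreasing length
        node = path[-1]
        if node == dst:
            best = len(path)
            paths.append(path)
            continue
        for nxt in graph[node]:
            if nxt not in path:
                queue.append(path + [nxt])
    return paths

def edge_betweenness(graph):
    """Sum, over node pairs, the share of shortest paths using each edge."""
    counts = {}
    for src, dst in combinations(graph, 2):
        paths = shortest_paths(graph, src, dst)
        for path in paths:
            for edge in zip(path, path[1:]):
                key = tuple(sorted(edge))
                counts[key] = counts.get(key, 0) + 1 / len(paths)
    return counts

# Invented miniature network: the middle edge ("B", "C") bridges the rest.
graph = {"A": ["B"], "B": ["A", "C"], "C": ["B", "D"], "D": ["C"]}
scores = edge_betweenness(graph)
print(max(scores, key=scores.get))  # the bridging edge scores highest
```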

  20. Open access chemical probes for epigenetic targets

    PubMed Central

    Brown, Peter J; Müller, Susanne

    2015-01-01

    Background: High attrition rates in drug discovery call for new approaches to improve target validation. Academia is filling gaps, but often lacks the experience and resources of the pharmaceutical industry, resulting in poorly characterized tool compounds. Discussion: The SGC has established an open access chemical probe consortium, currently encompassing ten pharmaceutical companies. One of its mandates is to create well-characterized inhibitors (chemical probes) for epigenetic targets to enable new biology and target validation for drug development. Conclusion: Epigenetic probe compounds have proven to be very valuable and have not only spurred a plethora of novel biological findings, but also provided starting points for clinical trials. These probes have proven to be a critical complement to traditional genetic targeting strategies and have sometimes provided surprising results. PMID:26397018

  1. RobOKoD: microbial strain design for (over)production of target compounds.

    PubMed

    Stanford, Natalie J; Millard, Pierre; Swainston, Neil

    2015-01-01

    Sustainable production of target compounds such as biofuels and high-value chemicals for the pharmaceutical, agrochemical, and chemical industries is becoming an increasing priority, given their current dependency upon diminishing petrochemical resources. Designing these strains is difficult, with current methods focusing primarily on knocking out genes, dismissing other vital steps of strain design including the overexpression and dampening of genes. The design predictions from current methods also do not translate well into successful strains in the laboratory. Here, we introduce RobOKoD (Robust, Overexpression, Knockout and Dampening), a method for predicting strain designs for overproduction of targets. The method uses flux variability analysis to profile each reaction within the system under differing production percentages of the target compound and biomass. Using these profiles, reactions are identified as potential knockout, overexpression, or dampening targets. The identified reactions are ranked according to their suitability, providing flexibility in strain design for users. The software was tested by designing a butanol-producing Escherichia coli strain and was compared against the popular OptKnock and RobustKnock methods. RobOKoD shows favorable design predictions when the predictions from these methods are compared against a successful, experimentally validated butanol-producing strain. Overall, RobOKoD provides users with rankings of predicted beneficial genetic interventions with which to support optimized strain design.
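
    The flux variability profiling that RobOKoD builds on can be caricatured on a one-branch-point toy network, where a fixed substrate uptake splits between a target pathway and biomass. Everything below, including the uptake value of 10, is an invented illustration; real flux variability analysis solves linear programs over a genome-scale stoichiometric model.

```python
# Toy flux variability sketch: uptake v_in = v_target + v_biomass at a
# single branch point, with a minimum required target-production fraction.
# Closed-form intervals stand in for the LP solutions of real FVA.

UPTAKE = 10.0  # fixed substrate uptake flux (invented units)

def flux_ranges(target_fraction: float):
    """Feasible flux intervals when v_target must be >= fraction * uptake."""
    t_min = target_fraction * UPTAKE
    return {
        "v_target":  (t_min, UPTAKE),
        "v_biomass": (0.0, UPTAKE - t_min),
    }

# Profile the branch fluxes as the demanded production fraction rises;
# the shrinking v_biomass interval is the kind of signal used for ranking.
for f in (0.0, 0.5, 0.9):
    print(f, flux_ranges(f))
```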

  2. Spectra Transfer Between a Fourier Transform Near-Infrared Laboratory and a Miniaturized Handheld Near-Infrared Spectrometer.

    PubMed

    Hoffmann, Uwe; Pfeifer, Frank; Hsuing, Chang; Siesler, Heinz W

    2016-05-01

    The aim of this contribution is to demonstrate the transfer of spectra that have been measured on two different laboratory Fourier transform near-infrared (FT-NIR) spectrometers to the format of a handheld instrument by measuring only a few samples with both spectrometer types. Thus, despite the extreme differences in spectral range and resolution, spectral data sets that have been collected over a long period on a laboratory instrument, and the quantitative and qualitative calibrations developed from them, can be conveniently transferred to the handheld system. As a result, the need to prepare completely new calibration samples and the effort required to develop calibration models when changing hardware platforms are minimized. The enabling procedure is based on piecewise direct standardization (PDS) and is described for the data sets of a quantitative and a qualitative application case study. For this purpose, the spectra measured on the FT-NIR laboratory spectrometers were used as "master" data and transferred to the "target" format of the handheld instrument. The quantitative test study refers to transmission spectra of three-component liquid solvent mixtures, whereas the qualitative application example encompasses diffuse reflection spectra of six different current polymers. To prove the performance of the transfer procedure for quantitative applications, partial least squares (PLS-1) calibrations were developed for the individual components of the solvent mixtures with spectra transferred from the master to the target instrument, and the cross-validation parameters were compared with the corresponding parameters obtained for spectra measured on the master and target instruments, respectively. To test the retention of the discrimination ability of the transferred polymer spectra sets, principal component analyses (PCAs) were applied, as an example, to three of the six investigated polymers and their identification was demonstrated by
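
    A rough sketch of the piecewise direct standardization idea named above: each channel of the "target" (handheld) spectrum is modeled from a small window of channels of the "master" (laboratory) spectrum by least squares over a few transfer samples. The spectra below are synthetic stand-ins related by a simple linear map, not FT-NIR data; window size and regression details vary between real PDS implementations.

```python
import numpy as np

# Minimal PDS sketch: per-channel window regressions mapping the master
# instrument's format to the target instrument's format.

rng = np.random.default_rng(0)
n_samples, n_channels, half_win = 20, 50, 2

master = rng.normal(size=(n_samples, n_channels))   # lab-instrument spectra
target = 1.5 * master + 0.1                         # pretend handheld response

def fit_pds(master, target, half_win):
    """Fit one windowed least-squares model per target channel."""
    models = []
    for i in range(target.shape[1]):
        lo, hi = max(0, i - half_win), min(master.shape[1], i + half_win + 1)
        X = np.hstack([master[:, lo:hi], np.ones((master.shape[0], 1))])
        coefs, *_ = np.linalg.lstsq(X, target[:, i], rcond=None)
        models.append(((lo, hi), coefs))
    return models

def apply_pds(models, spectra):
    """Transfer master-format spectra into the target instrument's format."""
    out = np.empty((spectra.shape[0], len(models)))
    for i, ((lo, hi), coefs) in enumerate(models):
        X = np.hstack([spectra[:, lo:hi], np.ones((spectra.shape[0], 1))])
        out[:, i] = X @ coefs
    return out

models = fit_pds(master, target, half_win)
transferred = apply_pds(models, master)
print(float(np.abs(transferred - target).max()))  # ≈ 0 for this linear toy
```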

  3. Maui Space Surveillance System Satellite Categorization Laboratory

    NASA Astrophysics Data System (ADS)

    Deiotte, R.; Guyote, M.; Kelecy, T.; Hall, D.; Africano, J.; Kervin, P.

    The MSSS satellite categorization laboratory is a fusion of robotics and digital imaging processes that aims to decompose satellite photometric characteristics and behavior in a controlled setting. By combining a robot, a light source, and a camera to acquire non-resolved images of a model satellite, detailed photometric analyses can be performed to extract relevant information about shape features, elemental makeup, and ultimately attitude and function. Using the laboratory setting, a detailed analysis can be done on any type of material or design, and the results cataloged in a database that will facilitate object identification by "curve-fitting" individual elements in the basis set to observational data that might otherwise be unidentifiable. Currently, an ST-Robotics five-degree-of-freedom robotic arm, a collimated light source, and a non-focused Apogee camera have been integrated into a MATLAB-based software package that facilitates automatic data acquisition and analysis. Efforts to date have been aimed at construction of the lab as well as validation and verification with simple geometric objects. Tests on spheres, cubes, and simple satellite models show promising results that could lead to a much better understanding of non-resolvable space object characteristics. This paper presents a description of the laboratory configuration and validation test results, with emphasis on the non-resolved photometric characteristics of a variety of object shapes, spin dynamics, and orientations. The future vision, utility, and benefits of the laboratory to the SSA community as a whole are also discussed.

  4. Numerical simulations of impacts involving porous bodies. II. Comparison with laboratory experiments

    NASA Astrophysics Data System (ADS)

    Jutzi, Martin; Michel, Patrick; Hiraoka, Kensuke; Nakamura, Akiko M.; Benz, Willy

    2009-06-01

    In this paper, we compare the outcome of high-velocity impact experiments on porous targets, composed of pumice, with the results of simulations by a 3D SPH hydrocode in which a porosity model has been implemented. The different populations of small bodies of our Solar System are believed to be composed, at least partially, of objects with a high degree of porosity. To describe the fragmentation of such porous objects, a different model is needed than that used for non-porous bodies. In the case of porous bodies, the impact process is not only driven by the presence of cracks which propagate when a stress threshold is reached, it is also influenced by the crushing of pores and compaction. Such processes can greatly affect the whole body's response to an impact. Therefore, another physical model is necessary to improve our understanding of the collisional process involving porous bodies. Such a model has been developed recently and introduced successfully in a 3D SPH hydrocode [Jutzi, M., Benz, W., Michel, P., 2008. Icarus 198, 242-255]. Basic tests have been performed which already showed that it is implemented in a consistent way and that theoretical solutions are well reproduced. However, its full validation requires that it is also capable of reproducing the results of real laboratory impact experiments. Here we present simulations of laboratory experiments on pumice targets for which several of the main material properties have been measured. We show that using the measured material properties and keeping the remaining free parameters fixed, our numerical model is able to reproduce the outcome of these experiments carried out under different impact conditions. 
This first complete validation of our model, which will be tested for other porous materials in the future, allows us to start addressing problems at larger scale related to small bodies of our Solar System, such as collisions in the Kuiper Belt or the formation of a family by the disruption of a porous

  5. DNA decontamination methods for internal quality management in clinical PCR laboratories.

    PubMed

    Wu, Yingping; Wu, Jianyong; Zhang, Zhihui; Cheng, Chen

    2018-03-01

    The polymerase chain reaction (PCR) technique, one of the most commonly applied methods in diagnostics and molecular biology, has a frustrating downside: the occurrence of false-positive signals due to contamination. In previous research, various DNA decontamination methods have been developed to overcome this limitation. Unfortunately, the use of random or poorly focused sampling methods for monitoring air and/or object surfaces leads to incomplete elimination of contaminating DNA during decontamination procedures. We herein attempted to develop a novel DNA decontamination method (environmental surveillance, including surface and air sampling) and quality management program for clinical molecular diagnostic laboratories (or clinical PCR laboratories). We performed a step-by-step evaluation of current DNA decontamination methods and developed an effective procedure for assessing the presence of contaminating DNA via PCR analysis. Targeted environmental surveillance by sampling reached optimal performance over 2 weeks, and the decontamination process was verified as reliable. Additionally, the process was validated, on the basis of a comparative study, not to affect PCR amplification efficiency. In this study, effective guidelines for DNA decontamination were developed. The method employed ensured that surface DNA contamination could be effectively identified and eliminated. Furthermore, our study highlighted the importance of overall quality assurance and good clinical laboratory practices for preventing contamination, which are key factors for compliance with regulatory or accreditation requirements. Taken together, we provide evidence that the presented scheme, ranging from troubleshooting to the elimination of surface contamination, could serve as a critical foundation for developing regular environmental surveillance guidelines for PCR laboratories. © 2017 Wiley Periodicals, Inc.

  6. Validation conform ISO-15189 of assays in the field of autoimmunity: Joint efforts in The Netherlands.

    PubMed

    Mulder, Leontine; van der Molen, Renate; Koelman, Carin; van Leeuwen, Ester; Roos, Anja; Damoiseaux, Jan

    2018-05-01

    ISO 15189:2012 requires validation of methods used in the medical laboratory, and lists a series of performance parameters for consideration to include. Although these performance parameters are feasible for clinical chemistry analytes, application in the validation of autoimmunity tests is a challenge. Lack of gold standards or reference methods in combination with the scarcity of well-defined diagnostic samples of patients with rare diseases make validation of new assays difficult. The present manuscript describes the initiative of Dutch medical immunology laboratory specialists to combine efforts and perform multi-center validation studies of new assays in the field of autoimmunity. Validation data and reports are made available to interested Dutch laboratory specialists. Copyright © 2018 Elsevier B.V. All rights reserved.

  7. Japanese Society for Laboratory Hematology flow cytometric reference method of determining the differential leukocyte count: external quality assurance using fresh blood samples.

    PubMed

    Kawai, Y; Nagai, Y; Ogawa, E; Kondo, H

    2017-04-01

    To provide target values for the manufacturers' survey of the Japanese Society for Laboratory Hematology (JSLH), accurate standard data from healthy volunteers were needed for the five-part differential leukocyte count. To obtain such data, JSLH required an antibody panel that achieved high specificity (particularly for mononuclear cells) using simple gating procedures. We developed a flow cytometric method for determining the differential leukocyte count (JSLH-Diff) and validated it by comparison with the flow cytometric differential leukocyte count of the International Council for Standardization in Haematology (ICSH-Diff) and the manual differential count obtained by microscopy (Manual-Diff). First, the reference laboratory performed an imprecision study of JSLH-Diff and ICSH-Diff, as well as a comparison among JSLH-Diff, Manual-Diff, and ICSH-Diff. Then two reference laboratories and seven participating laboratories performed imprecision and accuracy studies of JSLH-Diff, Manual-Diff, and ICSH-Diff. Simultaneously, six manufacturers' laboratories provided their own representative values using automated hematology analyzers. The precision of both the JSLH-Diff and ICSH-Diff methods was adequate. Comparison by the reference laboratory showed that all correlation coefficients, slopes, and intercepts obtained by the JSLH-Diff, ICSH-Diff, and Manual-Diff methods conformed to the criteria. When the imprecision and accuracy of JSLH-Diff were assessed at seven laboratories, the CV% for lymphocytes, neutrophils, monocytes, eosinophils, and basophils was 0.5~0.9%, 0.3~0.7%, 1.7~2.6%, 3.0~7.9%, and 3.8~10.4%, respectively. More than 99% of CD45-positive leukocytes were identified as normal leukocytes by JSLH-Diff. When the JSLH-Diff method was validated by comparison with Manual-Diff and ICSH-Diff, it showed good performance as a reference method. © 2016 John Wiley & Sons Ltd.
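
    The CV% figures quoted in the imprecision study are coefficients of variation: the standard deviation of replicate counts expressed as a percentage of their mean. A minimal sketch, with invented replicate monocyte percentages:

```python
import statistics

# Coefficient of variation (CV%) for a set of replicate measurements.

def cv_percent(values) -> float:
    """Sample standard deviation as a percentage of the mean."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

monocyte_pct = [8.1, 8.3, 7.9, 8.0, 8.2]  # invented replicate monocyte %
print(round(cv_percent(monocyte_pct), 2))
```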

  8. Determination of lipophilic toxins by LC/MS/MS: single-laboratory validation.

    PubMed

    Villar-González, Adriano; Rodríguez-Velasco, María Luisa; Gago-Martínez, Ana

    2011-01-01

    An LC/MS/MS method has been developed, assessed, and intralaboratory-validated for the analysis of the lipophilic toxins currently regulated by European Union legislation: okadaic acid (OA) and dinophysistoxins 1 and 2, including their ester forms; azaspiracids 1, 2, and 3; pectenotoxins 1 and 2; yessotoxin (YTX) and the analogs 45 OH-YTX, Homo YTX, and 45 OH-Homo YTX; as well as for the analysis of 13-desmethyl-spirolide C. The method consists of duplicate sample extraction with methanol and direct analysis of the crude extract without further cleanup or concentration. Ester forms of OA and dinophysistoxins are detected as the parent ions after alkaline hydrolysis of the extract. The validation process of this method was performed using both fortified and naturally contaminated samples, and experiments were designed according to International Organization for Standardization, International Union of Pure and Applied Chemistry, and AOAC guidelines. With the exception of YTX in fortified samples, RSDr values were below 15% and RSDR values below 25%. Recovery values were between 77 and 95%, and LOQs were below 60 microg/kg. These data, together with validation experiments for recovery, selectivity, robustness, traceability, and linearity, as well as uncertainty calculations, are presented in this paper.

  9. Comparison of Continuous Wave CO2 Doppler Lidar Calibration Using Earth Surface Targets in Laboratory and Airborne Measurements

    NASA Technical Reports Server (NTRS)

    Jarzembski, Maurice A.; Srivastava, Vandana

    1999-01-01

    Routine backscatter (beta) measurements by an airborne or space-based lidar from designated earth surfaces with known and fairly uniform beta properties can potentially offer lidar calibration opportunities. This can in turn be used to obtain accurate atmospheric aerosol and cloud beta measurements on large spatial scales. This is important because achieving a precise calibration factor for large pulsed lidars then need not rest solely on a standard hard-target procedure. Furthermore, calibration from designated earth surfaces would provide an in-flight performance evaluation of the lidar. Hence, for active remote sensing using lasers with high-resolution data, calibration of a space-based lidar using the earth's surfaces will be extremely useful. The calibration methodology initially requires measuring the beta of various earth surfaces simulated in the laboratory using a focused continuous-wave (CW) CO2 Doppler lidar, and then using these beta measurements as standards for the earth-surface signal from airborne or space-based lidars. Since beta from the earth's surface may be retrieved at different angles of incidence, beta would also need to be measured at various angles of incidence for the different surfaces. In general, earth-surface reflectance measurements have been made in the infrared, but lidars have not been used to characterize them, nor has the earth's surface in turn been used to calibrate lidars. The feasibility of this calibration methodology is demonstrated through a comparison of these laboratory measurements with actual earth-surface beta retrieved from the same lidar during the NASA/Multi-center Airborne Coherent Atmospheric Wind Sensor (MACAWS) mission on NASA's DC-8 aircraft from 13-26 September 1995. For the selected earth surface from the airborne lidar data, an average beta for the surface was established and the statistics of lidar efficiency were determined. This was compared with the actual lidar efficiency

  10. Science Laboratory Environment and Academic Performance

    NASA Astrophysics Data System (ADS)

    Aladejana, Francisca; Aderibigbe, Oluyemisi

    2007-12-01

    The study determined how students assess the various components of their science laboratory environment. It also identified how the laboratory environment affects students' learning outcomes. The modified ex-post-facto design was used. A sample of 328 randomly selected students was taken from a population of all Senior Secondary School chemistry students in a state in Nigeria. The research instrument, the Science Laboratory Environment Inventory (SLEI) designed and validated by Fraser et al. (Sci Educ 77:1-24, 1993), was administered to the selected students. Data analysis was done using descriptive statistics and the Product Moment Correlation. Findings revealed that students could assess the five components (Student Cohesiveness, Open-endedness, Integration, Rule Clarity, and Material Environment) of the laboratory environment. Student cohesiveness received the highest assessment, while the material environment received the least. The results also showed that the five components of the science laboratory environment are positively correlated with students' academic performance. The findings are discussed with a view to improving the quality of the laboratory environment, subsequent academic performance in science, and ultimately the enrolment and retention of learners in science.
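
    The Pearson product-moment correlation used to relate environment scores to performance can be sketched directly from its definition; the paired scores below are invented, not the study's data.

```python
import statistics

# Pearson product-moment correlation from its textbook definition:
# r = Σ(x - x̄)(y - ȳ) / sqrt(Σ(x - x̄)² · Σ(y - ȳ)²)

def pearson(xs, ys) -> float:
    mx, my = statistics.mean(xs), statistics.mean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs)
           * sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den

env_score   = [3.2, 4.1, 2.8, 4.5, 3.9]   # invented SLEI component scores
performance = [55, 68, 50, 74, 66]        # invented exam marks
print(round(pearson(env_score, performance), 2))
```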

  11. Preliminary Validation and Reliability Testing of the Montreal Instrument for Cat Arthritis Testing, for Use by Veterinarians, in a Colony of Laboratory Cats

    PubMed Central

    Klinck, Mary P.; Rialland, Pascale; Guillot, Martin; Moreau, Maxim; Frank, Diane; Troncy, Eric

    2015-01-01

    Simple Summary Feline osteoarthritis (OA) is challenging to diagnose. A pain scale was developed for use by veterinarians, in association with their physical examination, and tested for reliability and validity. The scale items were: Interaction with the examiner, Exploration of the room, Body Posture, Gait, Body Condition, condition of Coat and Claws, and abnormal Findings or Cat Reaction upon joint Palpation. Expert review supported the scale content. Two studies using laboratory-housed cats found the most promising results for Gait and Body Posture, in terms of distinguishing between OA and non-OA cats, repeatability of results, and correlations with objectively measured kinetics (weight-bearing). Abstract Subtle signs and conflicting physical and radiographic findings make feline osteoarthritis (OA) challenging to diagnose. A physical examination-based assessment was developed, consisting of eight items: Interaction, Exploration, Posture, Gait, Body Condition, Coat and Claws, (joint) Palpation–Findings, and Palpation–Cat Reaction. Content (experts) and face (veterinary students) validity were excellent. Construct validity, internal consistency, and intra- and inter-rater reliability were assessed via a pilot and main study, using laboratory-housed cats with and without OA. Gait distinguished OA status in the pilot (p = 0.05) study. In the main study, no scale item achieved statistically significant OA detection. Forelimb peak vertical ground reaction force (PVF) correlated inversely with Gait (Rhos = −0.38 (p = 0.03) to −0.41 (p = 0.02)). Body Posture correlated with Gait, and inversely with forelimb PVF at two of three time points (Rhos = −0.38 (p = 0.03) to −0.43 (p = 0.01)). Palpation (Findings, Cat Reaction) did not distinguish OA from non-OA cats. Palpation—Cat Reaction (Forelimbs) correlated inversely with forelimb PVF at two time points (Rhos = −0.41 (p = 0.02) to −0.41 (p = 0.01)), but scores were highly variable, and poorly reliable

  12. TargetMiner: microRNA target prediction with systematic identification of tissue-specific negative examples.

    PubMed

    Bandyopadhyay, Sanghamitra; Mitra, Ramkrishna

    2009-10-15

    Prediction of microRNA (miRNA) target mRNAs using machine learning approaches is an important area of research. However, most methods suffer from either high false positive or high false negative rates. One reason for this is the marked deficiency of negative examples, i.e., miRNA non-target pairs. Systematic identification of non-target mRNAs has not been properly addressed, and current machine learning approaches are therefore compelled to rely on artificially generated negative examples for training. In this article, we have identified approximately 300 tissue-specific negative examples using a novel approach that combines expression profiling of both miRNAs and mRNAs, miRNA-mRNA structural interactions, and seed-site conservation. The newly generated negative examples are validated with the pSILAC dataset, confirming that the identified non-targets are indeed non-targets. These high-throughput tissue-specific negative examples and a set of experimentally verified positive examples are then used to build TargetMiner, a support vector machine (SVM)-based classifier. In addition to assessing prediction accuracy in cross-validation experiments, TargetMiner has been validated with a completely independent experimental test dataset. Our method outperforms 10 existing target prediction algorithms and provides a good balance between sensitivity and specificity that is not reflected in the existing methods. On the completely independent test dataset, we achieve a significantly higher sensitivity and specificity of 69% and 67.8% using a pool of 90 features, and 76.5% and 66.1% using a set of 30 selected features. To establish the effectiveness of the systematically generated negative examples, the SVM was also trained using a different set of negative data generated with the method of Yousef et al.
    A significantly higher false positive rate (70.6%) is observed when tested on the independent set, while all other factors are kept the same.
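The sensitivity and specificity figures quoted above are plain confusion-matrix arithmetic. As an illustrative sketch only (the counts below are invented, not TargetMiner's actual test-set tallies), the two rates can be computed as:

```python
# Illustrative: sensitivity/specificity of a binary target predictor from a
# confusion matrix. Counts are made up for demonstration.
def sensitivity_specificity(tp, fn, tn, fp):
    """Return (sensitivity, specificity) as fractions."""
    sensitivity = tp / (tp + fn)   # fraction of verified targets recovered
    specificity = tn / (tn + fp)   # fraction of verified non-targets rejected
    return sensitivity, specificity

# Hypothetical counts on an independent test set
sens, spec = sensitivity_specificity(tp=69, fn=31, tn=68, fp=32)
print(round(sens, 2), round(spec, 2))  # 0.69 0.68
```

The point of the abstract's argument is visible here: with artificially generated negatives, `fp` inflates and specificity collapses, which is why curated non-target examples matter.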

  13. Poster - Thur Eve - 52: Clinical use of nanoDots: In-vivo dosimetry and treatment validation for stereotactic targets with VMAT techniques.

    PubMed

    Wierzbicki, W; Nicol, S; Furstoss, C; Brunet-Benkhoucha, M; Leduc, V

    2012-07-01

    A newly acquired nanoDot In-Light system was compared with TLD-100 dosimeters to confirm the treatment dose in multiple cases: an electron eye treatment, H&N IMRT, and VMAT validation for small targets. Eye tumour treatment with 9 MeV electrons: a dose of 1.8 Gy per fraction was prescribed to the 85% isodose. The average doses measured by three TLDs and three Dots were 1.90 and 1.97 Gy; both detectors overestimated the dose, by 2.9% and 6.7%, respectively. H&N IMRT treatment of skin cancer with 6 MV photons: the dose per fraction was 2.5 Gy. The average doses measured by two TLDs and two Dots were 2.48 and 2.56 Gy, representing errors of -0.8% and 2.2%, respectively. VMAT validation for small targets using an Agarose phantom (dose 15 Gy): a single-tumour brain treatment was delivered using two coplanar arcs to an Agarose phantom containing a large plastic insert holding 3 nanoDots and 4 TLDs. The difference between the average Pinnacle dose and the average dose of the corresponding detectors was -0.6% for Dots and -1.7% for TLDs. A two-tumour brain treatment was delivered using three non-coplanar arcs. Small and large plastic inserts separated by 5 cm were used to validate the dose. The differences between the average Pinnacle dose and the average dose of the corresponding detectors were as follows: small phantom, 0.7% for Dots and 0.3% for TLDs; large phantom, -1.9% for Dots and -0.6% for TLDs. In conclusion, nanoDot detectors are suitable for in-vivo dosimetry with photon and electron beams. © 2012 American Association of Physicists in Medicine.

  14. Validation of the Apnea Risk Evaluation System (ARES) Device Against Laboratory Polysomnography in Pregnant Women at Risk for Obstructive Sleep Apnea Syndrome

    PubMed Central

    Sharkey, Katherine M.; Waters, Kelly; Millman, Richard P.; Moore, Robin; Martin, Susan M.; Bourjeily, Ghada

    2014-01-01

    Study Objective: To assess the validity of using the Apnea Risk Evaluation System (ARES) Unicorder for detecting obstructive sleep apnea (OSA) in pregnant women. Methods: Sixteen pregnant women, mean age (SD) = 29.8 (5.4) years, average gestational age (SD) = 28.6 (6.3) weeks, mean body mass index (SD) = 44.7 (6.9) kg/m², with signs and symptoms of OSA wore the ARES Unicorder during one night of laboratory polysomnography (PSG). PSG was scored according to AASM 2007 criteria, and PSG AHI and RDI were compared to the ARES 1%, 3%, and 4% AHIs calculated with the ARES proprietary software. Results: Median PSG AHI and PSG RDI were 3.1 and 10.3 events/h of sleep, respectively. Six women had a PSG AHI ≥ 5 events/h of sleep and 11 had a PSG RDI ≥ 5 events/h of sleep. PSG AHI and RDI were strongly correlated with the ARES AHI measures. When compared with polysomnographic diagnosis of OSA, the ARES 3% algorithm provided the best balance between sensitivity (1.0 for PSG AHI, 0.91 for PSG RDI) and specificity (0.5 for PSG AHI, 0.8 for PSG RDI) for detecting sleep disordered breathing in our sample. Conclusions: The ARES Unicorder demonstrated reasonable consistency with PSG for diagnosing OSA in this small, heterogeneous sample of obese pregnant women. Citation: Sharkey KM, Waters K, Millman RP, Moore R, Martin SM, Bourjeily G. Validation of the Apnea Risk Evaluation System (ARES) device against laboratory polysomnography in pregnant women at risk for obstructive sleep apnea syndrome. J Clin Sleep Med 2014;10(5):497-502. PMID:24910550

  15. Effectiveness of influenza vaccine against laboratory-confirmed influenza, in the late 2011–2012 season in Spain, among population targeted for vaccination

    PubMed Central

    2013-01-01

    Background In Spain, the influenza vaccine effectiveness (VE) was estimated in the last three seasons using the observational study cycEVA conducted in the frame of the existing Spanish Influenza Sentinel Surveillance System. The objective of the study was to estimate influenza vaccine effectiveness (VE) against medically attended, laboratory-confirmed influenza-like illness (ILI) among the target groups for vaccination in Spain in the 2011–2012 season. We also studied influenza VE in the early (weeks 52/2011-7/2012) and late (weeks 8-14/2012) phases of the epidemic and according to time since vaccination. Methods Medically attended patients with ILI were systematically swabbed to collect information on exposure, laboratory outcome and confounding factors. Patients belonging to target groups for vaccination and who were swabbed <8 days after symptom onset were included. Cases were patients who tested positive for influenza; controls tested negative for any influenza virus. To examine the effect of a late season, analyses were performed according to the phase of the season and according to the time between vaccination and symptom onset. Results The overall adjusted influenza VE against A(H3N2) was 45% (95% CI, 0–69). The estimated influenza VE was 52% (95% CI, -3 to 78), 40% (95% CI, -40 to 74) and 22% (95% CI, -135 to 74) at <3.5 months, 3.5-4 months, and >4 months since vaccination, respectively. A decrease in VE with time since vaccination was only observed in individuals aged ≥ 65 years. Regarding the phase of the season, decreasing point estimates were only observed in the early phase, whereas very low or null estimates were obtained in the late phase for the shortest time interval. Conclusions The 2011–2012 influenza vaccine showed a low-to-moderate protective effect against medically attended, laboratory-confirmed influenza in the target groups for vaccination, in a late season and with a limited match between the vaccine and circulating strains. The
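In a test-negative case-control design like cycEVA, the crude VE is derived as (1 − odds ratio) × 100, where the odds ratio compares vaccination odds in influenza-positive cases versus test-negative controls (adjusted estimates come from logistic regression). A minimal sketch with invented counts, not the study's data:

```python
# Crude vaccine effectiveness in a test-negative design: VE = (1 - OR) * 100.
# The 2x2 counts below are hypothetical, chosen only for illustration.
def vaccine_effectiveness(vac_cases, unvac_cases, vac_controls, unvac_controls):
    odds_ratio = (vac_cases * unvac_controls) / (unvac_cases * vac_controls)
    return (1.0 - odds_ratio) * 100.0

ve = vaccine_effectiveness(vac_cases=22, unvac_cases=40,
                           vac_controls=60, unvac_controls=60)
print(round(ve, 1))  # 45.0 for these illustrative counts
```

The same arithmetic, applied within strata of time since vaccination, yields the declining point estimates the abstract reports.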

  16. Analysis of cocoa flavanols and procyanidins (DP 1-10) in cocoa-containing ingredients and products by rapid resolution liquid chromatography: single-laboratory validation.

    PubMed

    Machonis, Philip R; Jones, Matthew A; Kwik-Uribe, Catherine

    2014-01-01

    Recently, a multilaboratory validation (MLV) of AOAC Official Method 2012.24 for the determination of cocoa flavanols and procyanidins (CF-CP) in cocoa-based ingredients and products determined that the method was robust, reliable, and transferable. Due to the complexity of the CF-CP molecules, this method required a run time exceeding 1 h to achieve acceptable separations. To address this issue, a rapid resolution normal phase LC method was developed, and a single-laboratory validation (SLV) study was conducted. Flavanols and procyanidins with a degree of polymerization (DP) up to 10 were eluted in 15 min using a binary gradient applied to a diol stationary phase, detected using fluorescence detection, and reported as a total sum of DP 1-10. Quantification was achieved using (-)-epicatechin-based relative response factors for DP 2-10. Spike recovery samples and seven different types of cocoa-based samples were analyzed to evaluate the accuracy, precision, LOD, LOQ, and linearity of the method. The within-day precision of the reported content for the samples was 1.15-5.08%, and overall precision was 3.97-13.61%. Spike-recovery experiments demonstrated recoveries of over 98%. The results of this SLV were compared to those previously obtained in the MLV and found to be consistent. The translation to rapid resolution LC allowed for an 80% reduction in analysis time and solvent usage, while retaining the accuracy and reliability of the original method. The savings in both cost and time of this rapid method make it well-suited for routine laboratory use.

  17. K(3)EDTA Vacuum Tubes Validation for Routine Hematological Testing.

    PubMed

    Lima-Oliveira, Gabriel; Lippi, Giuseppe; Salvagno, Gian Luca; Montagnana, Martina; Poli, Giovanni; Solero, Giovanni Pietro; Picheth, Geraldo; Guidi, Gian Cesare

    2012-01-01

    Background and Objective. Some in vitro diagnostic devices (e.g., blood collection vacuum tubes and syringes for blood analyses) are not validated before quality laboratory managers decide to start using them or to change brands. Frequently, laboratory or hospital managers select vacuum tubes for blood collection based on cost considerations or on the reputation of a brand. The aim of this study was to validate two dry K(3)EDTA vacuum tubes of different brands for routine hematological testing. Methods. Blood specimens from 100 volunteers were collected into two different K(3)EDTA vacuum tubes by a single, expert phlebotomist. Routine hematological testing was done on the Advia 2120i hematology system. The significance of differences between samples was assessed by paired Student's t-test after checking for normality. The level of statistical significance was set at P < 0.05. Results and Conclusions. The different brands of tubes evaluated can represent a clinically relevant source of variation only for mean platelet volume (MPV) and platelet distribution width (PDW). This validation will permit laboratory or hospital managers to select, among the validated brands, vacuum tubes according to their own technical or economic criteria for routine hematological tests.
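The paired Student's t-test used in this kind of tube-comparison study has a simple closed form: the mean of the per-subject differences divided by its standard error. A self-contained sketch (the MPV readings below are invented, not the study's data):

```python
import math

# Paired Student's t statistic for matched samples x, y.
def paired_t(x, y):
    diffs = [a - b for a, b in zip(x, y)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    return mean / math.sqrt(var / n)                      # mean / standard error

# Hypothetical MPV readings (fL) from the same donors drawn into two tube brands
brand_a = [9.1, 8.7, 9.4, 9.0, 8.9, 9.2]
brand_b = [9.5, 9.0, 9.9, 9.3, 9.4, 9.6]
t = paired_t(brand_a, brand_b)
print(round(t, 2))  # -10.95: |t| far exceeds the two-tailed 5% critical value (2.571, df = 5)
```

Pairing each donor with themselves removes between-subject variability, which is why a systematic brand-to-brand shift this small is still detectable.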

  18. Effectiveness of Practices To Increase Timeliness of Providing Targeted Therapy for Inpatients with Bloodstream Infections: a Laboratory Medicine Best Practices Systematic Review and Meta-analysis

    PubMed Central

    Buehler, Stephanie S.; Madison, Bereneice; Snyder, Susan R.; Derzon, James H.; Saubolle, Michael A.; Weissfeld, Alice S.; Weinstein, Melvin P.; Liebow, Edward B.; Wolk, Donna M.

    2015-01-01

    SUMMARY Background. Bloodstream infection (BSI) is a major cause of morbidity and mortality throughout the world. Rapid identification of bloodstream pathogens is a laboratory practice that supports strategies for rapid transition to direct targeted therapy by providing for timely and effective patient care. In fact, the more rapidly that appropriate antimicrobials are prescribed, the lower the mortality for patients with sepsis. Rapid identification methods may have multiple positive impacts on patient outcomes, including reductions in mortality, morbidity, hospital lengths of stay, and antibiotic use. In addition, the strategy can reduce the cost of care for patients with BSIs. Objectives. The purpose of this review is to evaluate the evidence for the effectiveness of three rapid diagnostic practices in decreasing the time to targeted therapy for hospitalized patients with BSIs. The review was performed by applying the Centers for Disease Control and Prevention's (CDC's) Laboratory Medicine Best Practices Initiative (LMBP) systematic review methods for quality improvement (QI) practices and translating the results into evidence-based guidance (R. H. Christenson et al., Clin Chem 57:816–825, 2011, http://dx.doi.org/10.1373/clinchem.2010.157131). Search strategy. A comprehensive literature search was conducted to identify studies with measurable outcomes. A search of three electronic bibliographic databases (PubMed, Embase, and CINAHL), databases containing “gray” literature (unpublished academic, government, or industry evidence not governed by commercial publishing) (CIHI, NIHR, SIGN, and other databases), and the Cochrane database for English-language articles published between 1990 and 2011 was conducted in July 2011. Dates of search. The dates of our search were from 1990 to July 2011. Selection criteria. Animal studies and non-English publications were excluded. The search contained the following medical subject headings: bacteremia; bloodstream

  19. Onboard calibration igneous targets for the Mars Science Laboratory Curiosity rover and the Chemistry Camera laser induced breakdown spectroscopy instrument

    NASA Astrophysics Data System (ADS)

    Fabre, C.; Maurice, S.; Cousin, A.; Wiens, R. C.; Forni, O.; Sautter, V.; Guillaume, D.

    2011-03-01

    Accurate characterization of the Chemistry Camera (ChemCam) laser-induced breakdown spectroscopy (LIBS) on-board composition targets is of prime importance for the ChemCam instrument. The Mars Science Laboratory (MSL) science and operations teams expect ChemCam to provide the first compositional results at remote distances (1.5-7 m) during the in situ analyses of the Martian surface starting in 2012. Thus, establishing LIBS reference spectra from appropriate calibration standards must be undertaken diligently. Considering the global mineralogy of the Martian surface, and the possible landing sites, three specific compositions of igneous targets have been determined. Picritic, noritic, and shergottic glasses have been produced, along with a Macusanite natural glass. A sample of each target will fly on the MSL Curiosity rover deck, 1.56 m from the ChemCam instrument, and duplicates are available on the ground. Duplicates are considered to be identical, as the relative standard deviation (RSD) of the composition dispersion is around 8%. Electron microprobe and laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) analyses give evidence that the chemical composition of the four silicate targets is very homogeneous at microscopic scales larger than the instrument spot size, with RSD < 5% for concentration variations > 0.1 wt.% using the electron microprobe, and < 10% for concentration variations > 0.01 wt.% using LA-ICP-MS. The LIBS campaign on the igneous targets performed under flight-like Mars conditions establishes reference spectra for the entire mission. The LIBS spectra between 240 and 900 nm are extremely rich, with hundreds of high signal-to-noise lines and a dynamic range sufficient to identify unambiguously major, minor, and trace elements. For instance, a first LIBS calibration curve has been established for strontium from [Sr] = 284 ppm to [Sr] = 1480 ppm, showing the potential for future calibrations for other major or minor
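A univariate calibration curve of the kind described (emission-line intensity versus element concentration) reduces to an ordinary least-squares line fit. A sketch with synthetic numbers, not the actual ChemCam measurements; only the concentration endpoints (284 and 1480 ppm Sr) come from the abstract:

```python
# Ordinary least-squares fit of a calibration line y = a*x + b.
def fit_line(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

# Hypothetical Sr concentrations (ppm) and relative line intensities
conc = [284, 600, 1000, 1480]
intensity = [0.30, 0.62, 1.02, 1.50]
slope, intercept = fit_line(conc, intensity)

# An unknown sample's concentration is then read off the inverted line
unknown = (0.85 - intercept) / slope
```

In practice LIBS calibrations are evaluated per emission line and often need matrix-matched standards, which is exactly why the flight and ground duplicates must be homogeneous.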

  20. Laboratory longitudinal diffusion tests: 1. Dimensionless formulations and validity of simplified solutions

    NASA Astrophysics Data System (ADS)

    Takeda, M.; Nakajima, H.; Zhang, M.; Hiratsuka, T.

    2008-04-01

    To obtain reliable diffusion parameters from diffusion testing, multiple experiments should not only be cross-checked against each other; the internal consistency of each experiment should also be verified. In through- and in-diffusion tests with solution reservoirs, the interpretation of different test phases often makes use of simplified analytical solutions. This study explores the feasibility of steady, quasi-steady, equilibrium, and transient-state analyses using simplified analytical solutions with respect to (i) the conditions under which each analytical solution is valid, (ii) potential error, and (iii) experimental time. For increased generality, a series of numerical analyses is performed using unified dimensionless parameters, and the results are all related to the dimensionless reservoir volume (DRV), which includes only the sorptive parameter as an unknown. This means the above factors can be investigated on the basis of the sorption properties of the testing material and/or tracer. The main findings are that steady, quasi-steady, and equilibrium-state analyses are applicable when the tracer is not highly sorptive. However, quasi-steady and equilibrium-state analyses become inefficient or impractical compared to steady-state analysis when the tracer is non-sorbing and material porosity is significantly low. Systematic and comprehensive reformulation of the analytical models enables the comparison of experimental times between different test methods. The applicability and potential error of each test interpretation can also be studied. These results can be applied in designing, performing, and interpreting diffusion experiments by deducing the DRV from the available information for the target material and tracer, combined with the results of this study.

  1. Determination of Aflatoxins and Ochratoxin A in Traditional Turkish Concentrated Fruit Juice Products by Multi-Immunoaffinity Column Cleanup and LC Fluorescence Detection: Single-Laboratory Validation.

    PubMed

    Kaymak, Tugrul; Türker, Levent; Tulay, Hüseyin; Stroka, Joerg

    2018-04-27

    Background: Pekmez and pestil are traditional Turkish foods made from concentrated grape juice, which can be contaminated with mycotoxins such as aflatoxins and ochratoxin A (OTA). Objective: To carry out a single-laboratory validation of a method to simultaneously determine aflatoxins B₁, B₂, G₁, and G₂ and ochratoxin A in pekmez and pestil. Methods: The homogenized sample is extracted with methanol-water (80 + 20) using a high-speed blender. The sample extract is filtered, diluted with phosphate-buffered saline solution, and applied to a multi-immunoaffinity column (AFLAOCHRA PREP®). Aflatoxins and ochratoxin A are eluted with neat methanol and then directly analyzed by reversed-phase LC with fluorescence detection using post-column bromination (Kobra cell®). Results: Test portions of blank pekmez and pestil were spiked with a mixture of aflatoxins and ochratoxin A to give levels ranging from 2.6 to 10.4 μg/kg and 1.0 to 4.0 μg/kg, respectively. Recoveries for total aflatoxins and ochratoxin A ranged from 84 to 106% and 80 to 97%, respectively, for spiked samples. Based on results for spiked pekmez and pestil (30 replicates each at three levels), the repeatability RSD ranged from 1.6 to 12% and 2.7 to 11% for total aflatoxins and ochratoxin A, respectively. Conclusions: The method performance in terms of recovery, repeatability, and detection limits has been demonstrated to be suitable for use as an Official Method. Highlights: First immunoaffinity column method validated for simultaneous analysis of aflatoxins and ochratoxin A in pekmez and pestil. Suitability for official purposes in Turkey, demonstrated by single-laboratory validation. Co-occurrence of aflatoxins and OTA in mulberry and carob pekmez reported for the first time.
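Recovery and repeatability figures like those above follow from standard spike-recovery arithmetic. A hedged sketch with invented replicate values (not the study's data), for a sample spiked at 2.6 μg/kg:

```python
# Spike recovery (% of the spiked amount found) and repeatability RSD
# (sample SD / mean * 100) from replicate determinations. Values invented.
def recovery_percent(measured, spiked):
    return 100.0 * measured / spiked

def rsd_percent(values):
    n = len(values)
    mean = sum(values) / n
    sd = (sum((v - mean) ** 2 for v in values) / (n - 1)) ** 0.5
    return 100.0 * sd / mean

replicates = [2.45, 2.61, 2.52, 2.58, 2.49]  # ug/kg total aflatoxins, spiked at 2.6
print(round(recovery_percent(sum(replicates) / len(replicates), 2.6), 1))  # 97.3
print(round(rsd_percent(replicates), 1))                                   # 2.6
```

In a single-laboratory validation these two figures, computed per spike level, are what gets compared against the acceptance ranges for an Official Method.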

  2. Evaluation of copper toxicity using site specific algae and water chemistry: Field validation of laboratory bioassays.

    PubMed

    Fawaz, Elyssa G; Salam, Darine A; Kamareddine, Lina

    2018-07-15

    Studies of metal toxicity to microalgae have predominantly been conducted using single non-target algae species and without due regard for the chemistry of the treated waters, leading to ineffective or excessive algaecide treatments. In this study, indigenous multi-algal species (Scenedesmus quadricauda, Scenedesmus subspicatus, and Oscillatoria agardhii) were used in laboratory toxicity bioassays under simulated field water chemistry (pH = 7.2, hardness = 196 mg L⁻¹ as CaCO₃, and alkalinity = 222 mg L⁻¹ as CaCO₃) to determine the optimum copper sulfate treatment dose to control algae growth in an irrigation canal. Toxicity bioassays were conducted using copper sulfate in chelated (with EDTA) and non-chelated (without EDTA) forms to assess the influence of synthetic chelators in toxicity studies. Copper toxicity to the indigenous algae species was also measured in the non-modified EPA test medium (pH = 7.5, hardness = 92 mg L⁻¹ as CaCO₃, alkalinity = 10 mg L⁻¹ as CaCO₃, and EDTA = 300 µg L⁻¹) to assess the impact of water chemistry on inhibitory algal dosages. Under simulated water chemistry conditions, lower toxicity was measured in the test flasks with the chelated form of copper (96 h-EC50 = 386.67 µg L⁻¹ as Cu) than in those with the non-chelated metal (96 h-EC50 = 217.17 µg L⁻¹ as Cu). In addition, higher copper toxicity was measured in the test flasks prepared with the non-modified EPA medium using chelated copper (96 h-EC50 = 65.93 µg L⁻¹ as Cu) than in the analogous microcosms with modified water chemistry (96 h-EC50 = 386.67 µg L⁻¹ as Cu), the increased water hardness and alkalinity in the latter case contributing to the decrease in metal bioavailability. Results from laboratory experiments showed good correlation with copper dosages used in small-scale field testing to control algae growth, increasing confidence in

  3. TARGET Research Goals

    Cancer.gov

    TARGET researchers use various sequencing and array-based methods to examine the genomes, transcriptomes, and, for some diseases, epigenomes of select childhood cancers. This “multi-omic” approach generates a comprehensive profile of molecular alterations for each cancer type. Alterations are changes in DNA or RNA, such as rearrangements in chromosome structure or variations in gene expression, respectively. Through computational analyses and assays to validate biological function, TARGET researchers predict which alterations disrupt the function of a gene or pathway and promote cancer growth, progression, and/or survival. Researchers then identify candidate therapeutic targets and/or prognostic markers from the cancer-associated alterations.

  4. Determination of Nitrogen, Phosphorus, and Potassium Release Rates of Slow- and Controlled-Release Fertilizers: Single-Laboratory Validation, First Action 2015.15.

    PubMed

    Thiex, Nancy

    2016-01-01

    A previously validated method for the determination of nitrogen release patterns of slow- and controlled-release fertilizers (SRFs and CRFs, respectively) was submitted to the Expert Review Panel (ERP) for Fertilizers for consideration of First Action Official Method status. The ERP evaluated the single-laboratory validation results, recommended the method for First Action Official Method status, and provided recommendations for achieving Final Action. The 180 day soil incubation-column leaching technique was demonstrated to be a robust and reliable method for characterizing N release patterns from SRFs and CRFs. The method was reproducible, and the results were only slightly affected by variations in environmental factors such as microbial activity, soil moisture, temperature, and texture. The release of P and K was also studied, but at fewer replications than for N. Optimization experiments on the accelerated 74 h extraction method indicated that temperature was the only factor that substantially influenced nutrient-release rates from the materials studied, and an optimized extraction profile was established as follows: 2 h at 25°C, 2 h at 50°C, 20 h at 55°C, and 50 h at 60°C.

  5. Laboratory Education in New Zealand

    ERIC Educational Resources Information Center

    Borrmann, Thomas

    2008-01-01

    Laboratory work is one of the main forms of teaching used in chemistry, physics, biology and medicine. For many years researchers and teachers have argued in favor of or against this form of education. Student opinion could be a valuable tool for teachers to demonstrate the validity of such expensive and work intensive forms of education as…

  6. Effects of glyphosate on the non-target leaf beetle Cerotoma arcuata (Coleoptera: Chrysomelidae) in field and laboratory conditions.

    PubMed

    Pereira, Jardel L; Galdino, Tarcísio V S; Silva, Geverson A R; Picanço, Marcelo C; Silva, Antônio A; Corrêa, Alberto S; Martins, Júlio C

    2018-04-06

    This study aimed to assess the effects of glyphosate application on the Cerotoma arcuata Oliver (Coleoptera: Chrysomelidae) population in glyphosate-resistant soybean crops. Field studies were conducted with glyphosate and the insecticide endosulfan to observe the effects of these pesticides on C. arcuata, on its damage to the crop, and on populations of natural enemies in glyphosate-resistant soybean crops. In addition, the lethal and behavioral sublethal responses of C. arcuata to glyphosate and endosulfan were assessed in the laboratory. The results of the field and laboratory experiments showed that glyphosate caused moderate toxicity and high irritability in C. arcuata, while endosulfan caused high toxicity and irritability. Therefore, the direct effect of glyphosate on C. arcuata was negative and does not explain the population increases of this pest in glyphosate-resistant soybean. However, glyphosate also decreased the density of predators. Thus, the negative effect of glyphosate on predators may be related to population increases of C. arcuata in glyphosate-resistant soybean crops; however, more studies are needed to better characterize this relationship. This study suggests that glyphosate can impact non-target organisms such as herbivorous insects and natural enemies, and that the use of this herbicide will need to be carefully stewarded to prevent potential disturbances in beneficial insect communities in agricultural systems.

  7. Determination of quaternary ammonium compounds by potentiometric titration with an ionic surfactant electrode: single-laboratory validation.

    PubMed

    Price, Randi; Wan, Ping

    2010-01-01

    A potentiometric titration for determining the quaternary ammonium compounds (QAC) commonly found in antimicrobial products was validated by a single laboratory. Traditionally, QACs were determined by using a biphasic (chloroform and water) manual titration procedure. Because of safety considerations regarding chloroform, as well as the subjectivity of color indicator-based manual titration determinations, an automatic potentiometric titration procedure was tested with quaternary nitrogen product formulations. By using the Metrohm Titrando system coupled with an ionic surfactant electrode and an Ag/AgCl reference electrode, titrations were performed with various QAC-containing formulation products/matrixes; a standard sodium lauryl sulfate solution was used as the titrant. Results for the products tested are sufficiently reproducible and accurate for the purpose of regulatory product enforcement. The robustness of the method was measured by varying pH levels, as well as by comparing buffered versus unbuffered titration systems. A quantitation range of 1-1000 ppm quaternary nitrogen was established. Eight commercially available antimicrobial products covering a variety of matrixes were assayed; the results obtained were comparable to those obtained by the manual titration method. Recoveries of 94 to 104% were obtained for spiked samples.

  8. Analytical validation of a reference laboratory ELISA for the detection of feline leukemia virus p27 antigen.

    PubMed

    Buch, Jesse S; Clark, Genevieve H; Cahill, Roberta; Thatcher, Brendon; Smith, Peter; Chandrashekar, Ramaswamy; Leutenegger, Christian M; O'Connor, Thomas P; Beall, Melissa J

    2017-09-01

    Feline leukemia virus (FeLV) is an oncogenic retrovirus of cats. Immunoassays for the p27 core protein of FeLV aid in the detection of FeLV infections. Commercial microtiter-plate ELISAs have rapid protocols and visual result interpretation, limiting their usefulness in high-throughput situations. The purpose of our study was to validate the PetChek FeLV 15 ELISA, which is designed for the reference laboratory, and incorporates sequential, orthogonal screening and confirmatory protocols. A cutoff for the screening assay was established with 100% accuracy using 309 feline samples (244 negative, 65 positive) defined by the combined results of FeLV PCR and an independent reference p27 antigen ELISA. Precision of the screening assay was measured using a panel of 3 samples (negative, low-positive, and high-positive). The intra-assay coefficient of variation (CV) was 3.9-7.9%; the inter-assay CV was 6.0-8.6%. For the confirmatory assay, the intra-assay CV was 3.0-4.7%, and the inter-assay CV was 7.4-9.7%. The analytical sensitivity for p27 antigen was 3.7 ng/mL for inactivated whole FeLV and 1.2 ng/mL for purified recombinant FeLV p27. Analytical specificity was demonstrated based on the absence of cross-reactivity to related retroviruses. No interference was observed for samples containing added bilirubin, hemoglobin, or lipids. Based on these results, the new high-throughput design of the PetChek FeLV 15 ELISA makes it suitable for use in reference laboratory settings and maintains overall analytical performance.

  9. Knowledge, attitude, and practice (KAP) of 'teaching laboratory' technicians towards laboratory safety and waste management: a pilot interventional study.

    PubMed

    El-Gilany, A-H; El-Shaer, S; Khashaba, E; El-Dakroory, S A; Omar, N

    2017-06-01

    A quasi-experimental study was performed on 20 technicians working in the Faculty of Medicine, Mansoura University, Egypt. The knowledge, attitude, and practice (KAP) of laboratory technicians was measured before and two months after enrolling them in an intervention programme about laboratory best practice procedures. The programme addressed laboratory safety and medical waste management. The assessment was performed using a validated Arabic self-administered questionnaire. Pre- and post-intervention scores were compared using non-parametric tests. There are significant increases in the scores of KAP after implementation of the training programme. Copyright © 2017 The Healthcare Infection Society. Published by Elsevier Ltd. All rights reserved.

  10. The Pitfalls of Companion Diagnostics: Evaluation of Discordant EGFR Mutation Results from a Clinical Laboratory and a Central Laboratory.

    PubMed

    Turner, Scott A; Peterson, Jason D; Pettus, Jason R; de Abreu, Francine B; Amos, Christopher I; Dragnev, Konstantin H; Tsongalis, Gregory J

    2016-05-01

    Accurate identification of somatic mutations in formalin-fixed, paraffin-embedded tumor tissue is required for enrollment into clinical trials for many novel targeted therapeutics, including trials requiring EGFR mutation status in non-small-cell lung carcinomas. Central clinical trial laboratories contracted to perform this analysis typically rely on US Food and Drug Administration-approved targeted assays to identify these mutations. We present two cases in which central laboratories inaccurately reported EGFR mutation status because of improper identification and isolation of tumor material and failure to accurately report assay limitations, resulting in enrollment denial. Such cases highlight the need for increased awareness by clinical trials of the limitation of these US Food and Drug Administration-approved assays and the necessity for a mechanism to reevaluate discordant results by alternative laboratory-developed procedures, including clinical next-generation sequencing. Copyright © 2016 American Society for Investigative Pathology and the Association for Molecular Pathology. Published by Elsevier Inc. All rights reserved.

  11. Reliability and validity of ten consumer activity trackers.

    PubMed

    Kooiman, Thea J M; Dontje, Manon L; Sprenger, Siska R; Krijnen, Wim P; van der Schans, Cees P; de Groot, Martijn

    2015-01-01

    Activity trackers can potentially stimulate users to increase their physical activity behavior. The aim of this study was to examine the reliability and validity of ten consumer activity trackers for measuring step count in both laboratory and free-living conditions. Healthy adult volunteers (n = 33) walked twice on a treadmill (4.8 km/h) for 30 min while wearing ten different activity trackers (i.e. Lumoback, Fitbit Flex, Jawbone Up, Nike+ Fuelband SE, Misfit Shine, Withings Pulse, Fitbit Zip, Omron HJ-203, Yamax Digiwalker SW-200 and Moves mobile application). In free-living conditions, 56 volunteers wore the same activity trackers for one working day. Test-retest reliability was analyzed with the Intraclass Correlation Coefficient (ICC). Validity was evaluated by comparing each tracker with the gold standard (Optogait system for laboratory and ActivPAL for free-living conditions), using paired samples t-tests, mean absolute percentage errors, correlations and Bland-Altman plots. Test-retest analysis revealed high reliability for most trackers except for the Omron (ICC .14), Moves app (ICC .37) and Nike+ Fuelband (ICC .53). The mean absolute percentage errors of the trackers in laboratory and free-living conditions, respectively, were: Lumoback (-0.2, -0.4), Fitbit Flex (-5.7, 3.7), Jawbone Up (-1.0, 1.4), Nike+ Fuelband (-18, -24), Misfit Shine (0.2, 1.1), Withings Pulse (-0.5, -7.9), Fitbit Zip (-0.3, 1.2), Omron (2.5, -0.4), Digiwalker (-1.2, -5.9), and Moves app (9.6, -37.6). Bland-Altman plots demonstrated that the limits of agreement varied from 46 steps (Fitbit Zip) to 2422 steps (Nike+ Fuelband) in the laboratory condition, and 866 steps (Fitbit Zip) to 5150 steps (Moves app) in the free-living condition. The reliability and validity of most trackers for measuring step count are good. The Fitbit Zip is the most valid whereas the reliability and validity of the Nike+ Fuelband is low.
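The validity statistics used in this study, a mean percentage error against a gold standard and Bland-Altman limits of agreement, are straightforward to compute. A minimal Python sketch, using illustrative step counts rather than the study's data:

```python
import statistics

def mean_pct_error(reference, measured):
    """Signed mean percentage error of a tracker vs. gold-standard counts."""
    return statistics.mean((m - r) / r * 100 for r, m in zip(reference, measured))

def bland_altman_limits(reference, measured):
    """95% limits of agreement: mean difference +/- 1.96 * SD of differences."""
    diffs = [m - r for r, m in zip(reference, measured)]
    mean_d = statistics.mean(diffs)
    sd_d = statistics.stdev(diffs)
    return mean_d - 1.96 * sd_d, mean_d + 1.96 * sd_d

gold = [3010, 3180, 2840, 3095]      # illustrative gold-standard step counts
tracker = [2960, 3230, 2790, 3150]   # illustrative tracker step counts
err = mean_pct_error(gold, tracker)
low, high = bland_altman_limits(gold, tracker)
```

Note that the study reports signed values (e.g. -5.7 for the Fitbit Flex), so the sketch computes a signed mean percentage error rather than taking absolute values.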

  12. Determination of Yohimbine in Yohimbe Bark and Related Dietary Supplements Using UHPLC-UV/MS: Single-Laboratory Validation.

    PubMed

    Chen, Pei; Bryden, Noella

    2015-01-01

    A single-laboratory validation was performed on a practical ultra-HPLC (UHPLC)-diode array detector (DAD)/tandem MS method for determination of yohimbine in yohimbe barks and related dietary supplements. Good separation was achieved using a Waters Acquity ethylene bridged hybrid C18 column with gradient elution using 0.1% (v/v) aqueous ammonium hydroxide and 0.1% ammonium hydroxide in methanol as the mobile phases. The method can separate corynanthine from yohimbine in yohimbe bark extract, which is critical for accurate quantitation of yohimbine in yohimbe bark and related dietary supplements. Accuracy of the method was demonstrated using standard addition methods. Both intraday and interday precisions of the method were good. The method can be used without MS since yohimbine concentrations in yohimbe barks and related dietary supplements are usually high enough for DAD detection, which can make it an easy and economical method for routine analysis of yohimbe barks and related dietary supplements. On the other hand, the method can be used with MS if desired for more challenging work such as biological and/or clinical studies.

  13. Managing laboratory automation in a changing pharmaceutical industry

    PubMed Central

    Rutherford, Michael L.

    1995-01-01

    The health care reform movement in the USA and increased requirements by regulatory agencies continue to have a major impact on the pharmaceutical industry and the laboratory. Laboratory management is expected to improve efficiency by providing more analytical results at a lower cost, improving customer service, and reducing cycle time, while ensuring accurate results and more effective use of their staff. To achieve these expectations, many laboratories are using robotics and automated work stations. Establishing automated systems presents many challenges for laboratory management, including project and hardware selection, budget justification, implementation, validation, training, and support. To address these management challenges, the rationale for project selection and implementation, the obstacles encountered, project outcome, and learning points for several automated systems recently implemented in the Quality Control Laboratories at Eli Lilly are presented. PMID:18925014

  14. Cumberland Target for Drilling by Curiosity Mars Rover

    NASA Image and Video Library

    2013-05-09

    Cumberland has been selected as the second target for drilling by NASA's Mars rover Curiosity. The rover has the capability to collect powdered material from inside the target rock and analyze that powder with laboratory instruments.

  15. Ada Compiler Validation Summary Report: Harris Corporation, Harris Ada Compiler, Version 1.0, Harris H1200 Host, Tektronix 8540A-1750A Target.

    DTIC Science & Technology

    1987-06-03

    Ada Compiler Validation Summary Report: Harris Corporation, Harris Ada Compiler, Version 1.0, Harris H1200 Host, Tektronix 8540A-1750A Target. Report period: 3 June 1987 to 3 June 1988. Ada Joint Program Office, Arlington, VA.

  16. National survey on current situation of critical value reporting in 973 laboratories in China.

    PubMed

    Fei, Yang; Zhao, Haijian; Wang, Wei; He, Falin; Zhong, Kun; Yuan, Shuai; Wang, Zhiguo

    2017-10-15

    The aim of the study was to investigate the current state of critical value reporting performance and to provide recommendations for laboratories setting critical value reporting time frames. The National Centre for Clinical Laboratories in China initiated a critical value reporting investigation in 2015. A questionnaire related to critical value reporting policy was sent online to 1589 clinical laboratories in China. The questionnaire consisted of a set of questions related to critical value reporting policy and a set of questions related to the timeliness of critical value reporting. The survey data were collected between March and April 2015. The total survey response rate was 61.2%. For more than half of the participants, the critical value unreported rate, untimely reporting rate, and clinician unacknowledged rate were all 0.0%. More than 75.0% of participants could report half of critical values to clinicians within 20 minutes and could report 90.0% of critical values to clinicians within 25 minutes (from result validation to result communication to the clinician). The median target critical value reporting time was 15 minutes. "Reporting omission caused by laboratory staff", "communications equipment failure to connect", and "uncompleted application form without contact information of the clinician" were the three major reasons for unreported critical values. The majority of laboratories can report critical values to responsible clinical staff within 25 minutes. Thus, this value could be recommended as a suitable critical value reporting time frame for biochemistry laboratories in China. However, careful monitoring of the complete reporting process and improvement of information systems should ensure further improvement of critical value reporting timeliness.
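The reporting-time targets in this survey are percentiles of the validation-to-communication interval. A minimal sketch of how a laboratory might derive a median and 90th-percentile reporting time from its own data (the times below are illustrative, not survey data):

```python
def percentile(values, p):
    """Nearest-rank percentile (0 < p <= 100) of a list of numbers."""
    ordered = sorted(values)
    rank = max(1, round(p / 100 * len(ordered)))
    return ordered[rank - 1]

# Illustrative reporting times in minutes (result validation to
# communication with the clinician).
times_min = [5, 8, 10, 12, 15, 18, 20, 22, 24, 30]
median_time = percentile(times_min, 50)   # 15 min
p90_time = percentile(times_min, 90)      # 24 min
```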

  17. FastaValidator: an open-source Java library to parse and validate FASTA formatted sequences.

    PubMed

    Waldmann, Jost; Gerken, Jan; Hankeln, Wolfgang; Schweer, Timmy; Glöckner, Frank Oliver

    2014-06-14

    Advances in sequencing technologies challenge the efficient importing and validation of FASTA formatted sequence data which is still a prerequisite for most bioinformatic tools and pipelines. Comparative analysis of commonly used Bio*-frameworks (BioPerl, BioJava and Biopython) shows that their scalability and accuracy is hampered. FastaValidator represents a platform-independent, standardized, light-weight software library written in the Java programming language. It targets computer scientists and bioinformaticians writing software which needs to parse quickly and accurately large amounts of sequence data. For end-users FastaValidator includes an interactive out-of-the-box validation of FASTA formatted files, as well as a non-interactive mode designed for high-throughput validation in software pipelines. The accuracy and performance of the FastaValidator library qualifies it for large data sets such as those commonly produced by massive parallel (NGS) technologies. It offers scientists a fast, accurate and standardized method for parsing and validating FASTA formatted sequence data.
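A simplified illustration of the kind of well-formedness check a FASTA validator performs (FastaValidator itself is a Java library; this Python sketch is not its API and uses its own, deliberately loose rules):

```python
def validate_fasta(text):
    """Minimal FASTA check: input starts with a '>' header, every record
    has at least one sequence line, and sequence lines contain only
    letters, '-' or '*'."""
    allowed = set("ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz-*")
    lines = [ln.strip() for ln in text.splitlines() if ln.strip()]
    if not lines or not lines[0].startswith(">"):
        return False
    have_seq = None                      # None = no record started yet
    for ln in lines:
        if ln.startswith(">"):
            if have_seq is False:
                return False             # previous header had no sequence
            have_seq = False
        elif set(ln) <= allowed:
            have_seq = True
        else:
            return False                 # illegal character in sequence line
    return bool(have_seq)
```

A production validator would additionally enforce an alphabet (DNA, RNA, or protein) and report line numbers for errors, as the library's interactive mode does.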

  18. Targeted proteomic assays for quantitation of proteins identified by proteogenomic analysis of ovarian cancer

    DOE PAGES

    Song, Ehwang; Gao, Yuqian; Wu, Chaochao; ...

    2017-07-19

    Here, mass spectrometry (MS)-based targeted proteomic methods such as selected reaction monitoring (SRM) are becoming the method of choice for preclinical verification of candidate protein biomarkers. The Clinical Proteomic Tumor Analysis Consortium (CPTAC) of the National Cancer Institute has investigated the standardization and analytical validation of SRM assays and demonstrated robust analytical performance on different instruments across different laboratories. An Assay Portal has also been established by CPTAC to provide the research community with a resource consisting of a large set of targeted MS-based assays, and a repository for sharing assays publicly, provided that the assays meet the guidelines proposed by CPTAC. Herein, we report 98 SRM assays covering 70 candidate protein biomarkers previously reported as associated with ovarian cancer that have been thoroughly characterized according to the CPTAC Assay Characterization Guidance Document. The experiments, methods and results for characterizing these SRM assays for their MS response, repeatability, selectivity, stability, and reproducible detection of endogenous analytes are described in detail.

  19. Targeting legume loci: A comparison of three methods for target enrichment bait design in Leguminosae phylogenomics.

    PubMed

    Vatanparast, Mohammad; Powell, Adrian; Doyle, Jeff J; Egan, Ashley N

    2018-03-01

    The development of pipelines for locus discovery has spurred the use of target enrichment for plant phylogenomics. However, few studies have compared pipelines from locus discovery and bait design, through validation, to tree inference. We compared three methods within Leguminosae (Fabaceae) and present a workflow for future efforts. Using 30 transcriptomes, we compared Hyb-Seq, MarkerMiner, and the Yang and Smith (Y&S) pipelines for locus discovery, validated 7501 baits targeting 507 loci across 25 genera via Illumina sequencing, and inferred gene and species trees via concatenation- and coalescent-based methods. Hyb-Seq discovered loci with the longest mean length. MarkerMiner discovered the most conserved loci with the least flagged as paralogous. Y&S offered the most parsimony-informative sites and putative orthologs. Target recovery averaged 93% across taxa. We optimized our targeted locus set based on a workflow designed to minimize paralog/ortholog conflation and thus present 423 loci for legume phylogenomics. Methods differed across criteria important for phylogenetic marker development. We recommend Hyb-Seq as a method that may be useful for most phylogenomic projects. Our targeted locus set is a resource for future, community-driven efforts to reconstruct the legume tree of life.

  20. Clinical Laboratory Automation: A Case Study.

    PubMed

    Archetti, Claudia; Montanelli, Alessandro; Finazzi, Dario; Caimi, Luigi; Garrafa, Emirena

    2017-04-13

    This paper presents a case study of an automated clinical laboratory in a large urban academic teaching hospital in the North of Italy, the Spedali Civili in Brescia, where four laboratories were merged in a unique laboratory through the introduction of laboratory automation. The analysis compares the preautomation situation and the new setting from a cost perspective, by considering direct and indirect costs. It also presents an analysis of the turnaround time (TAT). The study considers equipment, staff and indirect costs. The introduction of automation led to a slight increase in equipment costs which is highly compensated by a remarkable decrease in staff costs. Consequently, total costs decreased by 12.55%. The analysis of the TAT shows an improvement of nonemergency exams while emergency exams are still validated within the maximum time imposed by the hospital. The strategy adopted by the management, which was based on re-using the available equipment and staff when merging the pre-existing laboratories, has reached its goal: introducing automation while minimizing the costs.

  1. Accidental fires in clinical laboratories.

    PubMed

    Hoeltge, G A; Miller, A; Klein, B R; Hamlin, W B

    1993-12-01

    The National Fire Protection Association, Quincy, Mass, estimates that 169 fires have occurred annually in health care, medical, and chemical laboratories. On the average, there are 13 civilian injuries and $1.5 million per year in direct property damage. Most fires in which the cause or ignition source can be identified originate in malfunctioning electrical equipment (41.6%) or in the facility's electrical distribution system (14.7%). The prevalence of fire safety deficiencies was measured in the College of American Pathologists Laboratory Accreditation Program. Of the 1732 inspected laboratories, 5.5% lacked records of electrical receptacle polarity and ground checks in the preceding year. Of these inspected laboratories, 4.7% had no or incomplete documentation of electrical safety checks on laboratory instruments. There was no evidence of quarterly fire exit drills in 9% of the laboratories. Deficiencies were also found in precautionary labeling (6.8%), in periodic review of safe work practices (4.2%), in the use of safety cans (3.7%), and in venting of flammable liquid storage areas (2.8%). Fire preparedness would be improved if all clinical laboratories had smoke detectors and automatic fire-extinguishing systems. In-service training courses in fire safety should be targeted to the needs of specific service areas.

  2. Region 9 Superfund Data Evaluation/Validation Guide

    EPA Pesticide Factsheets

    This guidance document is designed by EPA Region 9 Quality Assurance Office to provide assistance to project officers, Superfund contractors, and Superfund grantees in performing timely data evaluation and/or validation of laboratory data.

  3. TargetCompare: A web interface to compare simultaneous miRNAs targets.

    PubMed

    Moreira, Fabiano Cordeiro; Dustan, Bruno; Hamoy, Igor G; Ribeiro-Dos-Santos, André M; Dos Santos, Andrea Ribeiro

    2014-01-01

    MicroRNAs (miRNAs) are small non-coding nucleotide sequences between 17 and 25 nucleotides in length that primarily function in the regulation of gene expression. A single miRNA has thousands of predicted targets in a complex regulatory cell-signaling network. Therefore, it is of interest to study multiple target genes simultaneously. Hence, we describe a web tool (developed using the Java programming language and the MySQL database server) to analyse multiple targets of pre-selected miRNAs. We cross-validated the tool on the eight most highly expressed miRNAs in the antrum region of the stomach. This helped to identify 43 potential genes that are targets of at least six of the referred miRNAs. The developed tool aims to reduce the randomness and increase the chance of selecting strong candidate target genes and miRNAs responsible for playing important roles in the studied tissue. http://lghm.ufpa.br/targetcompare.
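The core operation described, finding genes targeted by several miRNAs simultaneously, amounts to counting how many target sets each gene appears in. A minimal sketch with invented miRNA names and gene sets:

```python
from collections import Counter

def shared_targets(target_sets, k):
    """Genes predicted as targets by at least k of the given miRNA sets."""
    counts = Counter(g for genes in target_sets.values() for g in set(genes))
    return {g for g, n in counts.items() if n >= k}

# Invented example: which genes are hit by at least 2 of 3 miRNAs?
predictions = {
    "miR-A": {"TP53", "KRAS", "CDH1"},
    "miR-B": {"TP53", "CDH1"},
    "miR-C": {"TP53", "MYC"},
}
common = shared_targets(predictions, 2)   # {"TP53", "CDH1"}
```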

  4. New HPV Serology Laboratory Aims to Standardize Assays and Contribute to Vaccine Implementation and Access | Frederick National Laboratory for Cancer Research

    Cancer.gov

    A new international initiative, led by scientists at the Frederick National Laboratory for Cancer Research and several other institutions, is being launched to provide expertise and leadership on the development, validation, and standardization of hu

  5. A Systematic Planning for Science Laboratory Instruction: Research-Based Evidence

    ERIC Educational Resources Information Center

    Balta, Nuri

    2015-01-01

    The aim of this study is to develop an instructional design model for science laboratory instruction. Well-known ID models were analysed and Dick and Carey model was imitated to produce a science laboratory instructional design (SLID) model. In order to validate the usability of the designed model, the views of 34 high school teachers related to…

  6. Targeting BRCAness in Gastric Cancer

    DTIC Science & Technology

    2017-10-01

    generated a modified CRISPR system using dCas9-KRAB-expressing variants of these cells, and validated them for CRISPRi screening. These reagents will... [Figure 2: Validation of CRISPR activity following transduction with sgRNAs targeting CD55 and FACS staining with the anti...]

  7. Self-Disclosure Between Friends: A Validity Study

    ERIC Educational Resources Information Center

    Panyard, Christine Marie

    1973-01-01

    Subjects reported that they had disclosed approximately the same amount of information as they had received. The consensual validation of the amount of personal information exchanged between friends suggested that the Self-Disclosure Questionnaire is a valid measure of self-disclosure to a specific target person. (Author)

  8. K3EDTA Vacuum Tubes Validation for Routine Hematological Testing

    PubMed Central

    Lima-Oliveira, Gabriel; Lippi, Giuseppe; Salvagno, Gian Luca; Montagnana, Martina; Poli, Giovanni; Solero, Giovanni Pietro; Picheth, Geraldo; Guidi, Gian Cesare

    2012-01-01

    Background and Objective. Some in vitro diagnostic devices (e.g., blood collection vacuum tubes and syringes for blood analyses) are not validated before the quality laboratory managers decide to start using or to change the brand. Frequently, the laboratory or hospital managers select the vacuum tubes for blood collection based on cost considerations or on the relevance of a brand. The aim of this study was to validate two dry K3EDTA vacuum tubes of different brands for routine hematological testing. Methods. Blood specimens from 100 volunteers were collected in two different K3EDTA vacuum tubes by a single, expert phlebotomist. The routine hematological testing was done on the Advia 2120i hematology system. The significance of the differences between samples was assessed by paired Student's t-test after checking for normality. The level of statistical significance was set at P < 0.05. Results and Conclusions. The tubes of the different brands evaluated can represent a clinically relevant source of variation only for mean platelet volume (MPV) and platelet distribution width (PDW). Basically, our validation will permit laboratory or hospital managers to select, among the validated brands, vacuum tubes according to their own technical or economic considerations for routine hematological tests. PMID:22888448
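The paired Student's t-test used to compare results from the two tube brands can be sketched as follows (values are illustrative, not study data; in practice scipy.stats.ttest_rel would also supply the P value):

```python
import math
import statistics

def paired_t(x, y):
    """Paired t statistic for equal-length samples x and y."""
    diffs = [a - b for a, b in zip(x, y)]
    n = len(diffs)
    return statistics.mean(diffs) / (statistics.stdev(diffs) / math.sqrt(n))

brand_a = [250, 246, 261, 255, 248]   # illustrative platelet counts, brand A
brand_b = [249, 247, 258, 256, 246]   # illustrative platelet counts, brand B
t_stat = paired_t(brand_a, brand_b)
```

The statistic would then be compared against the t distribution with n - 1 degrees of freedom at the study's significance level of P < 0.05.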

  9. Analysis of Ethanolamines: Validation of Semi-Volatile Analysis by HPLC-MS/MS by EPA Method MS888

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Owens, J; Vu, A; Koester, C

    The Environmental Protection Agency's (EPA) Region 5 Chicago Regional Laboratory (CRL) developed a method titled 'Analysis of Diethanolamine, Triethanolamine, n-Methyldiethanolamine, and n-Ethyldiethanolamine in Water by Single Reaction Monitoring Liquid Chromatography/Tandem Mass Spectrometry (LC/MS/MS): EPA Method MS888'. This draft standard operating procedure (SOP) was distributed to multiple EPA laboratories and to Lawrence Livermore National Laboratory, which was tasked to serve as a reference laboratory for EPA's Environmental Reference Laboratory Network (ERLN) and to develop and validate analytical procedures. The primary objective of this study was to validate and verify the analytical procedures described in 'EPA Method MS888' for analysis of the listed ethanolamines in aqueous samples. The gathered data from this validation study will be used to: (1) demonstrate analytical method performance; (2) generate quality control acceptance criteria; and (3) revise the SOP to provide a validated method that would be available for use during a homeland security event. The data contained in this report will be compiled, by EPA CRL, with data generated by other EPA Regional laboratories so that performance metrics of 'EPA Method MS888' can be determined.

  10. Validation of Metagenomic Next-Generation Sequencing Tests for Universal Pathogen Detection.

    PubMed

    Schlaberg, Robert; Chiu, Charles Y; Miller, Steve; Procop, Gary W; Weinstock, George

    2017-06-01

    - Metagenomic sequencing can be used for detection of any pathogens using unbiased, shotgun next-generation sequencing (NGS), without the need for sequence-specific amplification. Proof-of-concept has been demonstrated in infectious disease outbreaks of unknown causes and in patients with suspected infections but negative results for conventional tests. Metagenomic NGS tests hold great promise to improve infectious disease diagnostics, especially in immunocompromised and critically ill patients. - To discuss challenges and provide example solutions for validating metagenomic pathogen detection tests in clinical laboratories. A summary of current regulatory requirements, largely based on prior guidance for NGS testing in constitutional genetics and oncology, is provided. - Examples from 2 separate validation studies are provided for steps from assay design, and validation of wet bench and bioinformatics protocols, to quality control and assurance. - Although laboratory and data analysis workflows are still complex, metagenomic NGS tests for infectious diseases are increasingly being validated in clinical laboratories. Many parallels exist to NGS tests in other fields. Nevertheless, specimen preparation, rapidly evolving data analysis algorithms, and incomplete reference sequence databases are idiosyncratic to the field of microbiology and often overlooked.

  11. USGS aerial resolution targets.

    USGS Publications Warehouse

    Salamonowicz, P.H.

    1982-01-01

    It is necessary to measure the achievable resolution of any airborne sensor that is to be used for metric purposes. Laboratory calibration facilities may be inadequate or inappropriate for determining the resolution of non-photographic sensors such as optical-mechanical scanners, television imaging tubes, and linear arrays. However, large target arrays imaged in the field can be used in testing such systems. The USGS has constructed an array of resolution targets in order to permit field testing of a variety of airborne sensing systems. The target array permits any interested organization with an airborne sensing system to accurately determine the operational resolution of its system. -from Author

  12. Automation and validation of DNA-banking systems.

    PubMed

    Thornton, Melissa; Gladwin, Amanda; Payne, Robin; Moore, Rachael; Cresswell, Carl; McKechnie, Douglas; Kelly, Steve; March, Ruth

    2005-10-15

    DNA banking is one of the central capabilities on which modern genetic research rests. The DNA-banking system plays an essential role in the flow of genetic data from patients and genetics researchers to the application of genetic research in the clinic. Until relatively recently, large collections of DNA samples were not common in human genetics. Now, collections of hundreds of thousands of samples are common in academic institutions and private companies. Automation of DNA banking can dramatically increase throughput, eliminate manual errors and improve the productivity of genetics research. An increased emphasis on pharmacogenetics and personalized medicine has highlighted the need for genetics laboratories to operate within the principles of a recognized quality system such as good laboratory practice (GLP). Automated systems are suitable for such laboratories but require a level of validation that might be unfamiliar to many genetics researchers. In this article, we use the AstraZeneca automated DNA archive and reformatting system (DART) as a case study of how such a system can be successfully developed and validated within the principles of GLP.

  13. RobOKoD: microbial strain design for (over)production of target compounds

    PubMed Central

    Stanford, Natalie J.; Millard, Pierre; Swainston, Neil

    2015-01-01

    Sustainable production of target compounds such as biofuels and high-value chemicals for pharmaceutical, agrochemical, and chemical industries is becoming an increasing priority given their current dependency upon diminishing petrochemical resources. Designing these strains is difficult, with current methods focusing primarily on knocking out genes, dismissing other vital steps of strain design including the overexpression and dampening of genes. The design predictions from current methods also do not translate well into successful strains in the laboratory. Here, we introduce RobOKoD (Robust, Overexpression, Knockout and Dampening), a method for predicting strain designs for overproduction of targets. The method uses flux variability analysis to profile each reaction within the system under differing production percentages of target-compound and biomass. Using these profiles, reactions are identified as potential knockout, overexpression, or dampening targets. The identified reactions are ranked according to their suitability, providing flexibility in strain design for users. The software was tested by designing a butanol-producing Escherichia coli strain, and was compared against the popular OptKnock and RobustKnock methods. RobOKoD shows favorable design predictions when predictions from these methods are compared to a successful butanol-producing, experimentally validated strain. Overall RobOKoD provides users with rankings of predicted beneficial genetic interventions with which to support optimized strain design. PMID:25853130

  14. Laboratory development and testing of spacecraft diagnostics

    NASA Astrophysics Data System (ADS)

    Amatucci, William; Tejero, Erik; Blackwell, Dave; Walker, Dave; Gatling, George; Enloe, Lon; Gillman, Eric

    2017-10-01

    The Naval Research Laboratory's Space Chamber experiment is a large-scale laboratory device dedicated to the creation of large-volume plasmas with parameters scaled to realistic space plasmas. Such devices make valuable contributions to the investigation of space plasma phenomena under controlled, reproducible conditions, allowing for the validation of theoretical models being applied to space data. However, in addition to investigations such as plasma wave and instability studies, such devices can also make valuable contributions to the development and testing of space plasma diagnostics. One example is the plasma impedance probe developed at NRL. Originally developed as a laboratory diagnostic, the sensor has now been flown on a sounding rocket, is included on a CubeSat experiment, and will be included on the DoD Space Test Program's STP-H6 experiment on the International Space Station. In this talk, we will describe how the laboratory simulation of space plasmas made this development path possible. Work sponsored by the US Naval Research Laboratory Base Program.

  15. Development, validation and comparison of NIR and Raman methods for the identification and assay of poor-quality oral quinine drops.

    PubMed

    Mbinze, J K; Sacré, P-Y; Yemoa, A; Mavar Tayey Mbay, J; Habyalimana, V; Kalenda, N; Hubert, Ph; Marini, R D; Ziemons, E

    2015-01-01

    Poor-quality antimalarial drugs are one of the major public health problems in Africa. The depth of this problem may be explained in part by the lack of effective enforcement and the lack of efficient local drug analysis laboratories. To tackle part of this issue, two spectroscopic methods with the ability to detect and to quantify quinine dihydrochloride in children's oral drop formulations were developed and validated. Raman and near infrared (NIR) spectroscopy were selected for the drug analysis due to their low cost, non-destructive and rapid characteristics. Both of the methods developed were successfully validated using the total error approach in the range of 50-150% of the target concentration (20% W/V) within the 10% acceptance limits. Samples collected on the Congolese pharmaceutical market were analyzed by both techniques to detect potentially substandard drugs. After a comparison of the analytical performance of both methods, it was decided to implement the method based on NIR spectroscopy to perform the routine analysis of quinine oral drop samples in the Quality Control Laboratory of Drugs at the University of Kinshasa (DRC). Copyright © 2015 Elsevier B.V. All rights reserved.

  16. A meta-analysis of the validity of FFQ targeted to adolescents.

    PubMed

    Tabacchi, Garden; Filippi, Anna Rita; Amodio, Emanuele; Jemni, Monèm; Bianco, Antonino; Firenze, Alberto; Mammina, Caterina

    2016-05-01

    The present work is aimed at meta-analysing validity studies of FFQ for adolescents, to investigate their overall accuracy and variables that can affect it negatively. A meta-analysis of sixteen original articles was performed within the ASSO Project (Adolescents and Surveillance System in the Obesity prevention). The articles assessed the validity of FFQ for adolescents, compared with food records or 24 h recalls, with regard to energy and nutrient intakes. Pearson's or Spearman's correlation coefficients, means/standard deviations, kappa agreement, percentiles and mean differences/limits of agreement (Bland-Altman method) were extracted. Pooled estimates were calculated and heterogeneity tested for correlation coefficients and means/standard deviations. A subgroup analysis assessed variables influencing FFQ accuracy. An overall fair/high correlation between FFQ and reference method was found; a good agreement, measured through the intake mean comparison for all nutrients except sugar, carotene and K, was observed. Kappa values showed fair/moderate agreement; an overall good ability to rank adolescents according to energy and nutrient intakes was evidenced by data of percentiles; absolute validity was not confirmed by mean differences/limits of agreement. Interviewer administration mode, consumption interval of the previous year/6 months and high number of food items are major contributors to heterogeneity and thus can reduce FFQ accuracy. The meta-analysis shows that FFQ are accurate tools for collecting data and could be used for ranking adolescents in terms of energy and nutrient intakes. It suggests how the design and the validation of a new FFQ should be addressed.
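One common way such a meta-analysis pools correlation coefficients across validation studies is via Fisher's z transform with sample-size-based weights; the abstract does not state the exact weighting used, so the sketch below is illustrative only:

```python
import math

def pooled_r(rs, ns):
    """Pool correlations via Fisher z, weighting each study by n - 3."""
    zs = [math.atanh(r) for r in rs]
    weights = [n - 3 for n in ns]
    z_bar = sum(w * z for w, z in zip(weights, zs)) / sum(weights)
    return math.tanh(z_bar)

# Illustrative study correlations and sample sizes, not the paper's data.
r_pooled = pooled_r([0.45, 0.60, 0.52], [120, 80, 150])
```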

  17. A Unified Constitutive Model for Subglacial Till, Part II: Laboratory Tests, Disturbed State Modeling, and Validation for Two Subglacial Tills

    NASA Astrophysics Data System (ADS)

    Desai, C. S.; Sane, S. M.; Jenson, J. W.; Contractor, D. N.; Carlson, A. E.; Clark, P. U.

    2006-12-01

    This presentation, which is complementary to Part I (Jenson et al.), describes the application of the Disturbed State Concept (DSC) constitutive model to define the behavior of the deforming sediment (till) underlying glaciers and ice sheets. The DSC includes elastic, plastic, and creep strains, and microstructural changes leading to degradation, failure, and sometimes strengthening or healing. Here, we describe comprehensive laboratory experiments conducted on samples of two regionally significant tills deposited by the Laurentide Ice Sheet: the Tiskilwa Till and Sky Pilot Till. The tests are used to determine the parameters to calibrate the DSC model, which is validated with respect to the laboratory tests by comparing the predictions with test data used to find the parameters, and also comparing them with independent tests not used to find the parameters. Discussion of the results also includes comparison of the DSC model with the classical Mohr-Coulomb model, which has been commonly used for glacial tills. A numerical procedure based on finite element implementation of the DSC is used to simulate an idealized field problem, and its predictions are discussed. Based on these analyses, the unified DSC model is proposed to provide an improved model for subglacial tills compared to other models used commonly, and thus to provide the potential for improved predictions of ice sheet movements.

  18. Research Diagnostic Criteria for Temporomandibular Disorders: Validity of Axis I Diagnoses

    PubMed Central

    Truelove, Edmond; Pan, Wei; Look, John O.; Mancl, Lloyd A.; Ohrbach, Richard K.; Velly, Ana; Huggins, Kimberly; Lenton, Patricia; Schiffman, Eric L.

    2011-01-01

    AIMS To estimate the criterion validity of the Research Diagnostic Criteria for Temporomandibular Disorders (RDC/TMD) Axis I TMD diagnoses. METHODS A combined total of 614 TMD community and clinic cases and 91 controls were examined at 3 study sites. RDC/TMD Axis I diagnoses were algorithmically derived from an examination performed by calibrated dental hygienists. Reference standards (Gold Standards) were established by means of consensus diagnoses rendered by 2 TMD experts using all available clinical data, including imaging studies. Validity of the RDC/TMD Axis I TMD diagnoses was estimated relative to reference-standard diagnoses (gold standard diagnoses). Target sensitivity and specificity were set a priori at ≥ 0.70 and ≥ 0.95, respectively. RESULTS Target sensitivity and specificity were not observed for any of the 8 RDC/TMD diagnoses. The highest validity was achieved for Group Ia myofascial pain (sensitivity 0.65, specificity 0.92) and Group Ib myofascial pain with limited opening (sensitivity 0.79, specificity 0.92). Target sensitivity and specificity were observed only when both Group I diagnoses were combined (0.87 and 0.98, respectively). For Group II (disc displacements) and Group III (arthralgia, arthritis, arthrosis) diagnoses, all estimates for sensitivity were below target (0.03 to 0.53), and specificity ranged from below to on target (0.86 to 0.99). CONCLUSION The RDC/TMD Axis I TMD diagnoses did not reach the targets set at sensitivity of ≥ 0.70 and specificity of ≥ 0.95. Target validity was obtained only for myofascial pain without differentiation between normal and limited opening. Revision of the current Axis I TMD diagnostic algorithms is warranted to improve their validity. PMID:20213030
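
    Criterion validity of the kind reported above reduces to a 2 × 2 comparison of algorithm-derived diagnoses against reference-standard diagnoses. A minimal sketch with made-up 0/1 diagnosis vectors (function and variable names are illustrative):

```python
def sens_spec(index_dx, reference_dx):
    """Sensitivity and specificity of an index (algorithm) diagnosis
    against a reference-standard diagnosis; inputs are 0/1 lists."""
    pairs = list(zip(index_dx, reference_dx))
    tp = sum(1 for i, r in pairs if i == 1 and r == 1)
    fn = sum(1 for i, r in pairs if i == 0 and r == 1)
    tn = sum(1 for i, r in pairs if i == 0 and r == 0)
    fp = sum(1 for i, r in pairs if i == 1 and r == 0)
    return tp / (tp + fn), tn / (tn + fp)

# Made-up diagnoses for six subjects
sens, spec = sens_spec([1, 1, 0, 0, 1, 0], [1, 0, 1, 0, 1, 0])
```

Targets such as sensitivity ≥ 0.70 and specificity ≥ 0.95 are then simply checked against these two estimates, per diagnostic group.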

  19. Reliability and validity of a treatment fidelity assessment for motivational interviewing targeting sexual risk behaviors in people living with HIV/AIDS.

    PubMed

    Seng, Elizabeth K; Lovejoy, Travis I

    2013-12-01

    This study psychometrically evaluates the Motivational Interviewing Treatment Integrity Code (MITI) to assess fidelity to motivational interviewing to reduce sexual risk behaviors in people living with HIV/AIDS. Seventy-four sessions from a pilot randomized controlled trial of motivational interviewing to reduce sexual risk behaviors in people living with HIV were coded with the MITI. Participants reported sexual behavior at baseline, 3 months, and 6 months. Regarding reliability, excellent inter-rater reliability was achieved for measures of behavior frequency across the 12 sessions coded by both coders; global scales demonstrated poor intraclass correlations, but adequate percent agreement. Regarding validity, principal components analyses indicated that a two-factor model accounted for an adequate amount of variance in the data. These factors were associated with decreases in sexual risk behaviors after treatment. The MITI is a reliable and valid measure of treatment fidelity for motivational interviewing targeting sexual risk behaviors in people living with HIV/AIDS.

  20. Radioligand Recognition of Insecticide Targets.

    PubMed

    Casida, John E

    2018-04-04

    Insecticide radioligands allow the direct recognition and analysis of the targets and mechanisms of toxic action critical to effective and safe pest control. These radioligands are either the insecticides themselves or analogs that bind at the same or coupled sites. Preferred radioligands and their targets, often in both insects and mammals, are trioxabicyclooctanes for the γ-aminobutyric acid (GABA) receptor, avermectin for the glutamate receptor, imidacloprid for the nicotinic receptor, ryanodine and chlorantraniliprole for the ryanodine receptor, and rotenone or pyridaben for NADH:ubiquinone oxidoreductase. Pyrethroids and other Na+ channel modulator insecticides are generally poor radioligands due to lipophilicity and high nonspecific binding. For target site validation, the structure-activity relationship for competing with the radioligand in binding assays should be the same as that for insecticidal activity or toxicity, except for rapidly detoxified or proinsecticide analogs. Once the radioligand assay is validated for relevance, it will often help define target site modifications on selection of resistant pest strains, selectivity between insects and mammals, and interaction with antidotes and other chemicals at modulator sites. Binding assays also serve for receptor isolation and photoaffinity labeling to characterize the interactions involved.

  1. The suitability of matrix assisted laser desorption/ionization time of flight mass spectrometry in a laboratory developed test using cystic fibrosis carrier screening as a model.

    PubMed

    Farkas, Daniel H; Miltgen, Nicholas E; Stoerker, Jay; van den Boom, Dirk; Highsmith, W Edward; Cagasan, Lesley; McCullough, Ron; Mueller, Reinhold; Tang, Lin; Tynan, John; Tate, Courtney; Bombard, Allan

    2010-09-01

    We designed a laboratory developed test (LDT) by using an open platform for mutation/polymorphism detection. Using a 108-member (mutation plus variant) cystic fibrosis carrier screening panel as a model, we completed the last phase of LDT validation by using matrix-assisted laser desorption/ionization time of flight mass spectrometry. Panel customization was accomplished via specific amplification primer and extension probe design. Amplified genomic DNA was subjected to allele specific, single base extension endpoint analysis by mass spectrometry for inspection of the cystic fibrosis transmembrane conductance regulator (CFTR) gene (NM_000492.3). The panel of mutations and variants was tested against 386 blinded samples supplied by "authority" laboratories highly experienced in CFTR genotyping; >98% concordance was observed. All discrepant and discordant results were resolved satisfactorily. Taken together, these results describe the concluding portion of the LDT validation process and the use of mass spectrometry to detect a large number of complex reactions within a single run, as well as its suitability as a platform appropriate for interrogation of scores to hundreds of targets.

  2. Determination of Chondroitin Sulfate Content in Raw Materials and Dietary Supplements by High-Performance Liquid Chromatography with UV Detection After Enzymatic Hydrolysis: Single-Laboratory Validation First Action 2015.11.

    PubMed

    Brunelle, Sharon L

    2016-01-01

    A previously validated method for determination of chondroitin sulfate in raw materials and dietary supplements was submitted to the AOAC Expert Review Panel (ERP) for Stakeholder Panel on Dietary Supplements Set 1 Ingredients (Anthocyanins, Chondroitin, and PDE5 Inhibitors) for consideration of First Action Official Methods(SM) status. The ERP evaluated the single-laboratory validation results against AOAC Standard Method Performance Requirements 2014.009. With recoveries of 100.8-101.6% in raw materials and 105.4-105.8% in finished products and precision of 0.25-1.8% RSDr within-day and 1.6-4.72% RSDr overall, the ERP adopted the method for First Action Official Methods status and provided recommendations for achieving Final Action status.
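
    The performance figures quoted above (percent recovery and RSDr) are standard single-laboratory validation statistics computed from replicate determinations. A minimal sketch with invented values (not AOAC data; the function name is illustrative):

```python
def recovery_and_rsd(measured, true_value):
    """Percent recovery and percent relative standard deviation (RSD)
    from replicate measurements of a spiked or reference sample."""
    mean = sum(measured) / len(measured)
    # Sample variance (n - 1 denominator), then RSD as percent of the mean
    var = sum((x - mean) ** 2 for x in measured) / (len(measured) - 1)
    return 100 * mean / true_value, 100 * (var ** 0.5) / mean

# Hypothetical triplicate determinations of a 100 mg/g reference material
recovery, rsd = recovery_and_rsd([98.0, 100.0, 102.0], 100.0)
```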

  3. Validating the random search model for two targets of different difficulty.

    PubMed

    Chan, Alan H S; Yu, Ruifeng

    2010-02-01

    A random visual search model was fitted to 1,788 search times obtained from a nonidentical double-target search task. Thirty Hong Kong Chinese (13 men, 17 women), ages 18 to 33 years (M = 23, SD = 6.8), took part in the experiment voluntarily. The overall adequacy and prediction accuracy of the model for various search time parameters (mean and median search times and response times) for both individual and pooled data show that search strategy may reasonably be inferred from search time distributions. The results also suggested the general applicability of the random search model for describing the search behavior of a large number of participants performing the type of search used here, as well as the practical feasibility of its application for determination of stopping policy for optimization of an inspection system design. Although the data generally conformed to the model, the search for the more difficult target was faster than expected. The more difficult target was usually detected after the easier target, and it is suggested that some degree of memory-guided searching may have been used for the second target. Some abnormally long search times were observed, and it is possible that these might have been due to the characteristics of visual lobes, nonoptimum interfixation distances and inappropriate overlapping of lobes, as has been previously reported.
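
    The random search model assumes memoryless fixations with a constant per-glimpse detection probability, which makes the number of glimpses geometric and the search-time distribution approximately exponential. A small simulation sketch (parameter values and names are invented for illustration, not taken from the study):

```python
import random

def random_search_times(p_glimpse, t_glimpse, n_trials, seed=1):
    """Simulate the memoryless random search model: each fixation detects
    the target with constant probability, so glimpse counts are geometric."""
    rng = random.Random(seed)
    times = []
    for _ in range(n_trials):
        glimpses = 1
        while rng.random() > p_glimpse:
            glimpses += 1
        times.append(glimpses * t_glimpse)
    return times

# Invented parameters: 25% detection chance per 0.3 s glimpse
times = random_search_times(p_glimpse=0.25, t_glimpse=0.3, n_trials=2000)
mean_time = sum(times) / len(times)  # model predicts t_glimpse / p_glimpse
```

Comparing the empirical distribution of observed search times against this simulated (or the closed-form exponential) distribution is one way to check model adequacy.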

  4. Determination of catechins and caffeine in Camellia sinensis raw materials, extracts, and dietary supplements by HPLC-UV: single-laboratory validation.

    PubMed

    Roman, Mark C

    2013-01-01

    A rapid method has been developed to quantify seven catechins and caffeine in green tea (Camellia sinensis) raw material and powdered extract, and dietary supplements containing green tea extract. The method utilizes RP HPLC with a phenyl-based stationary phase and gradient elution. Detection is by UV absorbance. The total run time, including column re-equilibration, is 13 min. Single-laboratory validation (SLV) has been performed on the method to determine the repeatability, accuracy, selectivity, LOD, LOQ, ruggedness, and linearity for (+)-catechin, (-)-epicatechin, (-)-epicatechin gallate, (-)-epigallocatechin, (-)-gallocatechin gallate, (-)-epigallocatechin gallate, and (+)-gallocatechin, as well as caffeine. Repeatability precision and recovery results met AOAC guidelines for SLV studies for all catechins and caffeine down to a level of approximately 20 mg/g. Finished products containing high concentrations of minerals require the use of EDTA to prevent decomposition of the catechins.

  5. Refining the accuracy of validated target identification through coding variant fine-mapping in type 2 diabetes.

    PubMed

    Mahajan, Anubha; Wessel, Jennifer; Willems, Sara M; Zhao, Wei; Robertson, Neil R; Chu, Audrey Y; Gan, Wei; Kitajima, Hidetoshi; Taliun, Daniel; Rayner, N William; Guo, Xiuqing; Lu, Yingchang; Li, Man; Jensen, Richard A; Hu, Yao; Huo, Shaofeng; Lohman, Kurt K; Zhang, Weihua; Cook, James P; Prins, Bram Peter; Flannick, Jason; Grarup, Niels; Trubetskoy, Vassily Vladimirovich; Kravic, Jasmina; Kim, Young Jin; Rybin, Denis V; Yaghootkar, Hanieh; Müller-Nurasyid, Martina; Meidtner, Karina; Li-Gao, Ruifang; Varga, Tibor V; Marten, Jonathan; Li, Jin; Smith, Albert Vernon; An, Ping; Ligthart, Symen; Gustafsson, Stefan; Malerba, Giovanni; Demirkan, Ayse; Tajes, Juan Fernandez; Steinthorsdottir, Valgerdur; Wuttke, Matthias; Lecoeur, Cécile; Preuss, Michael; Bielak, Lawrence F; Graff, Marielisa; Highland, Heather M; Justice, Anne E; Liu, Dajiang J; Marouli, Eirini; Peloso, Gina Marie; Warren, Helen R; Afaq, Saima; Afzal, Shoaib; Ahlqvist, Emma; Almgren, Peter; Amin, Najaf; Bang, Lia B; Bertoni, Alain G; Bombieri, Cristina; Bork-Jensen, Jette; Brandslund, Ivan; Brody, Jennifer A; Burtt, Noël P; Canouil, Mickaël; Chen, Yii-Der Ida; Cho, Yoon Shin; Christensen, Cramer; Eastwood, Sophie V; Eckardt, Kai-Uwe; Fischer, Krista; Gambaro, Giovanni; Giedraitis, Vilmantas; Grove, Megan L; de Haan, Hugoline G; Hackinger, Sophie; Hai, Yang; Han, Sohee; Tybjærg-Hansen, Anne; Hivert, Marie-France; Isomaa, Bo; Jäger, Susanne; Jørgensen, Marit E; Jørgensen, Torben; Käräjämäki, Annemari; Kim, Bong-Jo; Kim, Sung Soo; Koistinen, Heikki A; Kovacs, Peter; Kriebel, Jennifer; Kronenberg, Florian; Läll, Kristi; Lange, Leslie A; Lee, Jung-Jin; Lehne, Benjamin; Li, Huaixing; Lin, Keng-Hung; Linneberg, Allan; Liu, Ching-Ti; Liu, Jun; Loh, Marie; Mägi, Reedik; Mamakou, Vasiliki; McKean-Cowdin, Roberta; Nadkarni, Girish; Neville, Matt; Nielsen, Sune F; Ntalla, Ioanna; Peyser, Patricia A; Rathmann, Wolfgang; Rice, Kenneth; Rich, Stephen S; Rode, Line; Rolandsson, Olov; Schönherr, Sebastian; Selvin, 
Elizabeth; Small, Kerrin S; Stančáková, Alena; Surendran, Praveen; Taylor, Kent D; Teslovich, Tanya M; Thorand, Barbara; Thorleifsson, Gudmar; Tin, Adrienne; Tönjes, Anke; Varbo, Anette; Witte, Daniel R; Wood, Andrew R; Yajnik, Pranav; Yao, Jie; Yengo, Loïc; Young, Robin; Amouyel, Philippe; Boeing, Heiner; Boerwinkle, Eric; Bottinger, Erwin P; Chowdhury, Rajiv; Collins, Francis S; Dedoussis, George; Dehghan, Abbas; Deloukas, Panos; Ferrario, Marco M; Ferrières, Jean; Florez, Jose C; Frossard, Philippe; Gudnason, Vilmundur; Harris, Tamara B; Heckbert, Susan R; Howson, Joanna M M; Ingelsson, Martin; Kathiresan, Sekar; Kee, Frank; Kuusisto, Johanna; Langenberg, Claudia; Launer, Lenore J; Lindgren, Cecilia M; Männistö, Satu; Meitinger, Thomas; Melander, Olle; Mohlke, Karen L; Moitry, Marie; Morris, Andrew D; Murray, Alison D; de Mutsert, Renée; Orho-Melander, Marju; Owen, Katharine R; Perola, Markus; Peters, Annette; Province, Michael A; Rasheed, Asif; Ridker, Paul M; Rivadineira, Fernando; Rosendaal, Frits R; Rosengren, Anders H; Salomaa, Veikko; Sheu, Wayne H-H; Sladek, Rob; Smith, Blair H; Strauch, Konstantin; Uitterlinden, André G; Varma, Rohit; Willer, Cristen J; Blüher, Matthias; Butterworth, Adam S; Chambers, John Campbell; Chasman, Daniel I; Danesh, John; van Duijn, Cornelia; Dupuis, Josée; Franco, Oscar H; Franks, Paul W; Froguel, Philippe; Grallert, Harald; Groop, Leif; Han, Bok-Ghee; Hansen, Torben; Hattersley, Andrew T; Hayward, Caroline; Ingelsson, Erik; Kardia, Sharon L R; Karpe, Fredrik; Kooner, Jaspal Singh; Köttgen, Anna; Kuulasmaa, Kari; Laakso, Markku; Lin, Xu; Lind, Lars; Liu, Yongmei; Loos, Ruth J F; Marchini, Jonathan; Metspalu, Andres; Mook-Kanamori, Dennis; Nordestgaard, Børge G; Palmer, Colin N A; Pankow, James S; Pedersen, Oluf; Psaty, Bruce M; Rauramaa, Rainer; Sattar, Naveed; Schulze, Matthias B; Soranzo, Nicole; Spector, Timothy D; Stefansson, Kari; Stumvoll, Michael; Thorsteinsdottir, Unnur; Tuomi, Tiinamaija; Tuomilehto, Jaakko; Wareham, 
Nicholas J; Wilson, James G; Zeggini, Eleftheria; Scott, Robert A; Barroso, Inês; Frayling, Timothy M; Goodarzi, Mark O; Meigs, James B; Boehnke, Michael; Saleheen, Danish; Morris, Andrew P; Rotter, Jerome I; McCarthy, Mark I

    2018-04-01

    We aggregated coding variant data for 81,412 type 2 diabetes cases and 370,832 controls of diverse ancestry, identifying 40 coding variant association signals (P < 2.2 × 10^-7); of these, 16 map outside known risk-associated loci. We make two important observations. First, only five of these signals are driven by low-frequency variants: even for these, effect sizes are modest (odds ratio ≤ 1.29). Second, when we used large-scale genome-wide association data to fine-map the associated variants in their regional context, accounting for the global enrichment of complex trait associations in coding sequence, compelling evidence for coding variant causality was obtained for only 16 signals. At 13 others, the associated coding variants clearly represent 'false leads' with potential to generate erroneous mechanistic inference. Coding variant associations offer a direct route to biological insight for complex diseases and identification of validated therapeutic targets; however, appropriate mechanistic inference requires careful specification of their causal contribution to disease predisposition.

  6. Targeting BRCAness in Gastric Cancer

    DTIC Science & Technology

    2017-10-01

    inhibitors. We also generated a modified CRISPR system using dCas9-KRAB expressing variants of these cells, and validated them for CRISPRi screening. [Figure 2: Validation of CRISPR activity following transduction with sgRNAs targeting CD55 and FACS staining with the anti-CD55 antibody.]

  7. TargetCompare: A web interface to compare simultaneous miRNAs targets

    PubMed Central

    Moreira, Fabiano Cordeiro; Dustan, Bruno; Hamoy, Igor G; Ribeiro-dos-Santos, André M; dos Santos, Ândrea Ribeiro

    2014-01-01

    MicroRNAs (miRNAs) are small non-coding nucleotide sequences between 17 and 25 nucleotides in length that primarily function in the regulation of gene expression. A single miRNA has thousands of predicted targets within a complex regulatory cell-signaling network; therefore, it is of interest to study multiple target genes simultaneously. Hence, we describe a web tool (developed using the Java programming language and the MySQL database server) to analyse multiple targets of pre-selected miRNAs. We cross-validated the tool on the eight most highly expressed miRNAs in the antrum region of the stomach. This helped to identify 43 potential genes that are targets of at least six of the referred miRNAs. The developed tool aims to reduce randomness and increase the chance of selecting strong candidate target genes and miRNAs responsible for playing important roles in the studied tissue. Availability: http://lghm.ufpa.br/targetcompare PMID:25352731

  8. Inadequate response to treat-to-target methotrexate therapy in patients with new-onset rheumatoid arthritis: development and validation of clinical predictors.

    PubMed

    Teitsma, Xavier M; Jacobs, Johannes W G; Welsing, Paco M J; de Jong, Pascal H P; Hazes, Johanna M W; Weel, Angelique E A M; Pethö-Schramm, Attila; Borm, Michelle E A; van Laar, Jacob M; Lafeber, Floris P J G; Bijlsma, Johannes W J

    2018-05-14

    To identify and validate clinical baseline predictors associated with inadequate response (IR) to methotrexate (MTX) therapy in newly diagnosed patients with rheumatoid arthritis (RA). In U-Act-Early, 108 disease-modifying antirheumatic drug (DMARD)-naive patients with RA were randomised to initiate MTX therapy and treated to target until sustained remission (disease activity score assessing 28 joints (DAS28) <2.6 with four or fewer swollen joints for ≥24 weeks) was achieved. If remission was not achieved, hydroxychloroquine was added to the treatment regimen (ie, 'MTX+') and was replaced by tocilizumab if the target still was not reached thereafter. Regression analyses were performed to identify clinical predictors for IR, defined as needing addition of a biological DMARD, to 'MTX+'. Data from the treatment in the Rotterdam Early Arthritis Cohort were used for external validation of the prediction model. Within 1 year, 56/108 (52%) patients in U-Act-Early showed IR to 'MTX+'. DAS28 (adjusted OR (ORadj) 2.1, 95% CI 1.4 to 3.2), current smoking (ORadj 3.02, 95% CI 1.1 to 8.0) and alcohol consumption (ORadj 0.4, 95% CI 0.1 to 0.9) were identified as baseline predictors. The area under the receiver operator characteristic curve (AUROC) of the prediction model was 0.75 (95% CI 0.66 to 0.84); the positive (PPV) and negative predictive value (NPV) were 65% and 80%, respectively. When applying the model to the validation cohort, the AUROC slightly decreased to 0.67 (95% CI 0.55 to 0.79) and the PPV and NPV to 54% and 80%, respectively. Higher DAS28, current smoking and no alcohol consumption are predictive factors for IR to step-up 'MTX+' in DMARD-naive patients with new-onset RA. NCT01034137; Post-results, ISRCTN26791028; Post-results. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
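
    The AUROC, PPV, and NPV reported for such prediction models can be computed directly from predicted risk scores and observed outcomes. A minimal sketch with toy data (function names and values are illustrative, not the study's):

```python
def auroc(scores, labels):
    """AUROC via the Mann-Whitney interpretation: the probability that a
    randomly chosen case outranks a randomly chosen non-case (ties = 0.5)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def ppv_npv(scores, labels, threshold):
    """Positive and negative predictive value at a decision threshold."""
    pred = [1 if s >= threshold else 0 for s in scores]
    tp = sum(1 for p, y in zip(pred, labels) if p == 1 and y == 1)
    fp = sum(1 for p, y in zip(pred, labels) if p == 1 and y == 0)
    tn = sum(1 for p, y in zip(pred, labels) if p == 0 and y == 0)
    fn = sum(1 for p, y in zip(pred, labels) if p == 0 and y == 1)
    return tp / (tp + fp), tn / (tn + fn)

# Toy predicted risks of inadequate response and observed outcomes
scores = [0.9, 0.8, 0.4, 0.3, 0.7, 0.2]
labels = [1, 1, 0, 0, 1, 0]
```

External validation then repeats these calculations on a cohort not used to fit the model, which is why the AUROC typically shrinks, as it did here.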

  9. Building and validating a prediction model for paediatric type 1 diabetes risk using next generation targeted sequencing of class II HLA genes.

    PubMed

    Zhao, Lue Ping; Carlsson, Annelie; Larsson, Helena Elding; Forsander, Gun; Ivarsson, Sten A; Kockum, Ingrid; Ludvigsson, Johnny; Marcus, Claude; Persson, Martina; Samuelsson, Ulf; Örtqvist, Eva; Pyo, Chul-Woo; Bolouri, Hamid; Zhao, Michael; Nelson, Wyatt C; Geraghty, Daniel E; Lernmark, Åke

    2017-11-01

    It is of interest to predict possible lifetime risk of type 1 diabetes (T1D) in young children for recruiting high-risk subjects into longitudinal studies of effective prevention strategies. Utilizing a case-control study in Sweden, we applied a recently developed next generation targeted sequencing technology to genotype class II genes and applied an object-oriented regression to build and validate a prediction model for T1D. In the training set, estimated risk scores were significantly different between patients and controls (P = 8.12 × 10^-92), and the area under the curve (AUC) from the receiver operating characteristic (ROC) analysis was 0.917. Using the validation data set, we validated the result with AUC of 0.886. Combining both training and validation data resulted in a predictive model with AUC of 0.903. Further, we performed a "biological validation" by correlating risk scores with 6 islet autoantibodies, and found that the risk score was significantly correlated with IA-2A (Z-score = 3.628, P < 0.001). When applying this prediction model to the Swedish population, where the lifetime T1D risk ranges from 0.5% to 2%, we anticipate identifying approximately 20 000 high-risk subjects after testing all newborns, and this calculation would identify approximately 80% of all patients expected to develop T1D in their lifetime. Through both empirical and biological validation, we have established a prediction model for estimating lifetime T1D risk, using class II HLA. This prediction model should prove useful for future investigations to identify high-risk subjects for prevention research in high-risk populations. Copyright © 2017 John Wiley & Sons, Ltd.

  10. Laboratory Validation and Field Assessment of Petroleum Laboratory Technicians' Dermal Exposure to Crude Oil Using a Wipe Sampling Method.

    PubMed

    Galea, Karen S; Mueller, Will; Arfaj, Ayman M; Llamas, Jose L; Buick, Jennifer; Todd, David; McGonagle, Carolyn

    2018-05-21

    Crude oil may cause adverse dermal effects; therefore, dermal exposure is an exposure route of concern. Galea et al. (2014b) reported on a study comparing recovery (wipe) and interception (cotton glove) dermal sampling methods. The authors concluded that both methods were suitable for assessing dermal exposure to oil-based drilling fluids and crude oil, but that glove samplers may overestimate the amount of fluid transferred to the skin. We describe a study which aimed to further evaluate the wipe sampling method to assess dermal exposure to crude oil, with this assessment including extended sample storage periods and sampling efficiency tests undertaken at environmental conditions to mimic those typical of outdoor conditions in Saudi Arabia. The wipe sampling method was then used to assess the laboratory technicians' actual exposure to crude oil during typical petroleum laboratory tasks. Overall, acceptable storage efficiencies up to 54 days were reported, with results suggesting storage stability over time. Sampling efficiencies were also reported to be satisfactory at both ambient and elevated temperature and relative humidity environmental conditions for surrogate skin spiked with known masses of crude oil and left up to 4 h prior to wiping, though there was an indication of reduced sampling efficiency over time. Nineteen petroleum laboratory technicians provided a total of 35 pre- and 35 post-activity paired hand wipe samples. Ninety-three percent of the pre-exposure paired hand wipes were less than the analytical limit of detection (LOD), whereas 46% of the post-activity paired hand wipes were less than the LOD. The geometric mean paired post-activity wipe sample measurement was 3.09 µg cm-2 (range 1.76-35.4 µg cm-2). It was considered that dermal exposure most frequently occurred through direct contact with the crude oil (emission) or via deposition. The findings of this study suggest that the wipe sampling method is satisfactory in quantifying
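
    Exposure summaries such as the geometric mean quoted above are the usual choice for right-skewed measurement data with values below the LOD. A minimal sketch (the values and the LOD/2 substitution convention are illustrative assumptions; the study's own handling of non-detects is not specified here):

```python
import math

def geometric_mean(values):
    """Geometric mean, the usual summary for right-skewed exposure data."""
    return math.exp(sum(math.log(v) for v in values) / len(values))

def substitute_lod(values, lod):
    """Impute non-detects as LOD/2 (one common convention)."""
    return [v if v >= lod else lod / 2 for v in values]

# Hypothetical wipe measurements (µg cm-2) with an assumed LOD of 4.0
gm = geometric_mean(substitute_lod([1.0, 5.0, 9.0], 4.0))
```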

  11. Assessing the generalizability of randomized trial results to target populations.

    PubMed

    Stuart, Elizabeth A; Bradshaw, Catherine P; Leaf, Philip J

    2015-04-01

    Recent years have seen increasing interest in and attention to evidence-based practices, where the "evidence" generally comes from well-conducted randomized trials. However, while those trials yield accurate estimates of the effect of the intervention for the participants in the trial (known as "internal validity"), they do not always yield relevant information about the effects in a particular target population (known as "external validity"). This may be due to a lack of specification of a target population when designing the trial, difficulties recruiting a sample that is representative of a prespecified target population, or interest in considering a target population somewhat different from the population directly targeted by the trial. This paper first provides an overview of existing design and analysis methods for assessing and enhancing the ability of a randomized trial to estimate treatment effects in a target population. It then provides a case study using one particular method, which weights the subjects in a randomized trial to match the population on a set of observed characteristics. The case study uses data from a randomized trial of school-wide positive behavioral interventions and supports (PBIS); our interest is in generalizing the results to the state of Maryland. In the case of PBIS, after weighting, estimated effects in the target population were similar to those observed in the randomized trial. The paper illustrates that statistical methods can be used to assess and enhance the external validity of randomized trials, making the results more applicable to policy and clinical questions. However, there are also many open research questions; future research should focus on questions of treatment effect heterogeneity and further developing these methods for enhancing external validity. Researchers should think carefully about the external validity of randomized trials and be cautious about extrapolating results to specific populations unless
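
    The weighting idea described above (reweighting trial subjects to match a target population on observed characteristics) can be illustrated with simple post-stratification on one categorical covariate; the stratum variable and counts below are invented, and real applications typically use propensity-type models over many covariates:

```python
from collections import Counter

def poststrat_weights(trial_strata, population_strata):
    """Weight each trial participant so that the weighted trial sample
    matches the target population's distribution over an observed stratum
    variable (post-stratification, a simple generalizability weighting)."""
    trial_n, pop_n = len(trial_strata), len(population_strata)
    trial_c, pop_c = Counter(trial_strata), Counter(population_strata)
    # Weight = population share of the stratum / trial share of the stratum
    return [(pop_c[s] / pop_n) / (trial_c[s] / trial_n)
            for s in trial_strata]

# Invented example: the trial over-represents urban schools vs. the state
weights = poststrat_weights(['urban'] * 8 + ['rural'] * 2,
                            ['urban'] * 5 + ['rural'] * 5)
```

A weighted average of subject-level outcomes (or effect estimates) using these weights then targets the population rather than the trial sample.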

  12. Prediction methodologies for target scene generation in the aerothermal targets analysis program (ATAP)

    NASA Astrophysics Data System (ADS)

    Hudson, Douglas J.; Torres, Manuel; Dougherty, Catherine; Rajendran, Natesan; Thompson, Rhoe A.

    2003-09-01

    The Air Force Research Laboratory (AFRL) Aerothermal Targets Analysis Program (ATAP) is a user-friendly, engineering-level computational tool that features integrated aerodynamics, six-degree-of-freedom (6-DoF) trajectory/motion, convective and radiative heat transfer, and thermal/material response to provide an optimal blend of accuracy and speed for design and analysis applications. ATAP is sponsored by the Kinetic Kill Vehicle Hardware-in-the-Loop Simulator (KHILS) facility at Eglin AFB, where it is used with the CHAMP (Composite Hardbody and Missile Plume) technique for rapid infrared (IR) signature and imagery predictions. ATAP capabilities include an integrated 1-D conduction model for up to 5 in-depth material layers (with options for gaps/voids with radiative heat transfer), fin modeling, several surface ablation modeling options, a materials library with over 250 materials, options for user-defined materials, selectable/definable atmosphere and earth models, multiple trajectory options, and an array of aerodynamic prediction methods. All major code modeling features have been validated with ground-test data from wind tunnels, shock tubes, and ballistics ranges, and flight-test data for both U.S. and foreign strategic and theater systems. Numerous applications include the design and analysis of interceptors, booster and shroud configurations, window environments, tactical missiles, and reentry vehicles.

  13. Prediction of down-gradient impacts of DNAPL source depletion using tracer techniques: Laboratory and modeling validation

    NASA Astrophysics Data System (ADS)

    Jawitz, J. W.; Basu, N.; Chen, X.

    2007-05-01

    Interwell application of coupled nonreactive and reactive tracers through aquifer contaminant source zones enables quantitative characterization of aquifer heterogeneity and contaminant architecture. Parameters obtained from tracer tests are presented here in a Lagrangian framework that can be used to predict the dissolution of nonaqueous phase liquid (NAPL) contaminants. Nonreactive tracers are commonly used to provide information about travel time distributions in hydrologic systems. Reactive tracers have more recently been introduced as a tool to quantify the amount of NAPL contaminant present within the tracer swept volume. Our group has extended reactive tracer techniques to also characterize NAPL spatial distribution heterogeneity. By conceptualizing the flow field through an aquifer as a collection of streamtubes, the aquifer hydrodynamic heterogeneities may be characterized by a nonreactive tracer travel time distribution, and NAPL spatial distribution heterogeneity may be similarly described using reactive travel time distributions. The combined statistics of these distributions are used to derive a simple analytical solution for contaminant dissolution. This analytical solution, and the tracer techniques used for its parameterization, were validated both numerically and experimentally. Illustrative applications are presented from numerical simulations using the multiphase flow and transport simulator UTCHEM, and laboratory experiments of surfactant-enhanced NAPL remediation in two-dimensional flow chambers.

  14. Analysis of Carbamate Pesticides: Validation of Semi-Volatile Analysis by HPLC-MS/MS by EPA Method MS666

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Owens, J; Koester, C

    The Environmental Protection Agency's (EPA) Region 5 Chicago Regional Laboratory (CRL) developed a method for analysis of aldicarb, bromadiolone, carbofuran, oxamyl, and methomyl in water by high performance liquid chromatography tandem mass spectrometry (HPLC-MS/MS), titled Method EPA MS666. This draft standard operating procedure (SOP) was distributed to multiple EPA laboratories and to Lawrence Livermore National Laboratory, which was tasked to serve as a reference laboratory for EPA's Environmental Reference Laboratory Network (ERLN) and to develop and validate analytical procedures. The primary objective of this study was to validate and verify the analytical procedures described in MS666 for analysis of carbamate pesticides in aqueous samples. The gathered data from this validation study will be used to: (1) demonstrate analytical method performance; (2) generate quality control acceptance criteria; and (3) revise the SOP to provide a validated method that would be available for use during a homeland security event. The data contained in this report will be compiled, by EPA CRL, with data generated by other EPA Regional laboratories so that performance metrics of Method EPA MS666 can be determined.

  15. Analysis of Phosphonic Acids: Validation of Semi-Volatile Analysis by HPLC-MS/MS by EPA Method MS999

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Owens, J; Vu, A; Koester, C

    The Environmental Protection Agency's (EPA) Region 5 Chicago Regional Laboratory (CRL) developed a method titled Analysis of Diisopropyl Methylphosphonate, Ethyl Hydrogen Dimethylamidophosphate, Isopropyl Methylphosphonic Acid, Methylphosphonic Acid, and Pinacolyl Methylphosphonic Acid in Water by Multiple Reaction Monitoring Liquid Chromatography/Tandem Mass Spectrometry: EPA Version MS999. This draft standard operating procedure (SOP) was distributed to multiple EPA laboratories and to Lawrence Livermore National Laboratory, which was tasked to serve as a reference laboratory for EPA's Environmental Reference Laboratory Network (ERLN) and to develop and validate analytical procedures. The primary objective of this study was to validate and verify the analytical procedures described in EPA Method MS999 for analysis of the listed phosphonic acids and surrogates in aqueous samples. The gathered data from this validation study will be used to: (1) demonstrate analytical method performance; (2) generate quality control acceptance criteria; and (3) revise the SOP to provide a validated method that would be available for use during a homeland security event. The data contained in this report will be compiled, by EPA CRL, with data generated by other EPA Regional laboratories so that performance metrics of EPA Method MS999 can be determined.

  16. Utilizing random Forest QSAR models with optimized parameters for target identification and its application to target-fishing server.

    PubMed

    Lee, Kyoungyeul; Lee, Minho; Kim, Dongsup

    2017-12-28

The identification of target molecules is important for understanding the mechanism of "target deconvolution" in phenotypic screening and the "polypharmacology" of drugs. Because conventional methods of identifying targets require time and cost, in-silico target identification has been considered an alternative solution. One of the well-known in-silico methods of identifying targets involves structure-activity relationships (SARs). SARs have advantages such as low computational cost and high feasibility; however, the data dependency of the SAR approach causes an imbalance of active data and ambiguity of inactive data across targets. We developed a ligand-based virtual screening model comprising 1121 target SAR models built using a random forest algorithm. The performance of each target model was tested using the ROC curve and the mean score from an internal five-fold cross-validation. Moreover, recall rates for top-k targets were calculated to assess the performance of target ranking. A benchmark model using an optimized sampling method and parameters was examined via an external validation set. The results show recall rates of 67.6% and 73.9% for top-11 (1% of the total targets) and top-33, respectively. We provide a website, publicly available at http://rfqsar.kaist.ac.kr, where users can search the top-k targets for query ligands. The target models we built can be used both for predicting the activity of ligands toward each target and for ranking candidate targets for a query ligand using a unified scoring scheme. The scores are additionally mapped to probabilities so that users can estimate how likely a ligand-target interaction is to be active. The user interface of our website is user-friendly and intuitive, offering useful information and cross references.
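The per-target random-forest SAR scheme described above can be sketched as: train one binary activity classifier per target, then rank targets for a query ligand by predicted activity probability. This is a toy reconstruction, with synthetic descriptor vectors standing in for molecular fingerprints; the target names, feature dimensions, and data are invented:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def train_target_models(data_per_target, n_estimators=100):
    """One binary random-forest activity model per target, trained on
    fingerprint-like feature vectors (label 1 = active, 0 = inactive)."""
    models = {}
    for target, (X, y) in data_per_target.items():
        clf = RandomForestClassifier(n_estimators=n_estimators, random_state=0)
        clf.fit(X, y)
        models[target] = clf
    return models

def rank_targets(models, x):
    """Rank all targets for a query ligand by predicted activity probability."""
    scores = {t: m.predict_proba(x.reshape(1, -1))[0, 1] for t, m in models.items()}
    return sorted(scores, key=scores.get, reverse=True)

# Synthetic demo: 3 invented targets whose actives cluster at distinct points
# in an 8-dimensional descriptor space; each target's inactives are drawn
# from the other targets' active regions (mimicking cross-target sampling).
rng = np.random.default_rng(0)
centers = {"T0": 0.0, "T1": 3.0, "T2": 6.0}
data = {}
for t, c in centers.items():
    X_act = rng.normal(c, 0.2, size=(40, 8))
    X_inact = np.vstack([rng.normal(o, 0.2, size=(20, 8))
                         for k, o in centers.items() if k != t])
    data[t] = (np.vstack([X_act, X_inact]), np.array([1] * 40 + [0] * 40))

models = train_target_models(data)
ranking = rank_targets(models, np.full(8, 3.0))  # query at T1's active centre
```

The unified scoring scheme mentioned in the abstract corresponds to comparing the per-target probabilities on a common scale, as `rank_targets` does here.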

  17. Simultaneous analysis of coccidiostats and sulfonamides in non-target feed by HPLC-MS/MS and validation following the Commission Decision 2002/657/EC.

    PubMed

    Gavilán, Rosa Elvira; Nebot, Carolina; Patyra, Ewelina; Miranda, Jose Manuel; Franco, Carlos Manuel; Cepeda, Alberto

    2018-05-02

Taking into consideration the maximum levels for coccidiostats included in European Regulation 574/2011 and the fact that residues of sulfonamides are forbidden in non-target feed, the aim of this article is to present an analytical method based on HPLC-MS/MS for the identification and quantification of sulfonamides and coccidiostats in non-target feeds. The method was validated following Decision 2002/657/EC, and recovery, repeatability, and reproducibility were within the limits established in the Decision. For coccidiostats, the decision limit and detection capability were calculated for the different species, taking into account the maximum levels allowed in Regulation 574/2011. The applicability of the method was investigated in 50 feed samples collected from dairy farms, 50 obtained from feed mills, and 10 interlaboratory feed samples.
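For substances with a permitted limit, Decision 2002/657/EC defines the decision limit (CCα) and detection capability (CCβ); a conventional estimate for a maximum level ML is CCα = ML + 1.64·SD and CCβ = CCα + 1.64·SD, with SD the within-laboratory reproducibility at the ML. A sketch with illustrative numbers (the ML and SD below are assumptions, not values from this paper):

```python
def decision_limits(ml, sd_reproducibility):
    """CCalpha/CCbeta per Decision 2002/657/EC for substances with a
    permitted limit ML: CCalpha = ML + 1.64*SD, CCbeta = CCalpha + 1.64*SD."""
    cc_alpha = ml + 1.64 * sd_reproducibility
    cc_beta = cc_alpha + 1.64 * sd_reproducibility
    return cc_alpha, cc_beta

# Illustrative coccidiostat maximum level of 1.25 mg/kg with SD = 0.1 mg/kg
cc_alpha, cc_beta = decision_limits(1.25, 0.1)
```

A result above CCα is declared non-compliant with a 5% false-positive risk; CCβ is the concentration at which a truly contaminated sample is detected with 95% probability.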

  18. Evaluation of validation of a fully instrumented Hüttlin HKC 05-TJ laboratory-scale fluidized bed granulator.

    PubMed

    Wöstheinrich, K; Schmidt, P C

    2000-06-01

    The instrumentation and validation of a laboratory-scale fluidized bed apparatus is described. For continuous control of the process, the apparatus is instrumented with sensors for temperature, relative humidity (RH), and air velocity. Conditions of inlet air, fluidizing air, product, and exhaust air were determined. The temperature sensors were calibrated at temperatures of 0.0 degree C and 99.9 degrees C. The calibration of the humidity sensors covered the range from 12% RH to 98% RH using saturated electrolyte solutions. The calibration of the anemometer took place in a wind tunnel at defined air velocities. The calibrations led to satisfying results concerning sensitivity and precision. To evaluate the reproducibility of the process, 15 granules were prepared under identical conditions. The influence of the type of pump used for delivering the granulating liquid was investigated. Particle size distribution, bulk density, and tapped density were determined. Granules were tableted on a rotary press at four different compression force levels, followed by determination of tablet properties such as weight, crushing strength, and disintegration time. The apparatus was found to produce granules with good reproducibility concerning the granule and tablet properties.

  19. A laboratory validation study of the time-lapse oscillatory pumping test for leakage detection in geological repositories

    NASA Astrophysics Data System (ADS)

    Sun, Alexander Y.; Lu, Jiemin; Islam, Akand

    2017-05-01

Geologic repositories are extensively used for disposing of byproducts in the mineral and energy industries. The safety and reliability of these repositories are a primary concern to environmental regulators and the public. The time-lapse oscillatory pumping test (OPT) has been introduced recently as a pressure-based technique for detecting potential leakage in geologic repositories. By routinely conducting OPT at a number of pulsing frequencies, an operator may identify potential repository anomalies in the frequency domain, alleviating the ambiguity caused by reservoir noise and improving the signal-to-noise ratio. Building on previous theoretical and field studies, this work performed a series of laboratory experiments to validate the concept of time-lapse OPT using a custom-made stainless steel tank under relatively high pressures. The experimental configuration simulates a miniature geologic storage repository consisting of three layers (i.e., injection zone, caprock, and above-zone aquifer). Results show that leakage in the injection zone led to deviations in the power spectrum of observed pressure data, the amplitude of which increases with decreasing pulsing frequency. The experimental results are further analyzed by developing a 3D flow model, through which the model parameters are estimated by frequency-domain inversion.
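The frequency-domain signature described here, leakage changing the pressure-response amplitude at the pulsing frequency, can be illustrated with a synthetic sketch. The signal amplitudes, noise level, and the attenuation attributed to the leak are invented for illustration, not taken from the experiments:

```python
import numpy as np

def amplitude_at(signal, fs, f0):
    """Single-sided FFT amplitude of `signal` at frequency f0 (Hz)."""
    n = len(signal)
    spec = np.abs(np.fft.rfft(signal)) * 2.0 / n
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    return spec[np.argmin(np.abs(freqs - f0))]

fs, f0 = 1.0, 0.05                      # 1 Hz sampling, 0.05 Hz pulsing
t = np.arange(2000) / fs                # 2000 s pressure record
rng = np.random.default_rng(1)
baseline = 5.0 * np.sin(2 * np.pi * f0 * t) + rng.normal(0, 0.3, t.size)
# Hypothetical leak: the pressure response at the pulsing frequency is damped
leaky = 3.0 * np.sin(2 * np.pi * f0 * t) + rng.normal(0, 0.3, t.size)

a_base = amplitude_at(baseline, fs, f0)
a_leak = amplitude_at(leaky, fs, f0)
```

Comparing `a_leak` against `a_base` at each pulsing frequency is the essence of the time-lapse comparison: the narrowband estimate is largely insensitive to the broadband reservoir noise, which is why the method improves the signal-to-noise ratio.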

  20. Development and single-laboratory validation of an HPLC method for the determination of cyclamate sweetener in foodstuffs.

    PubMed

    Scotter, M J; Castle, L; Roberts, D P T; Macarthur, R; Brereton, P A; Hasnip, S K; Katz, N

    2009-05-01

    A method for the determination of cyclamate has been developed and single-laboratory validated for a range of foodstuffs including carbonated and fruit-juice drinks, fruit preserves, spreads, and dairy desserts. The method uses the peroxide oxidation of cyclamate to cyclohexylamine followed by derivatization with trinitrobenzenesulfonic acid and analysis by a modified reversed-phase high-performance liquid chromatography-ultraviolet light (HPLC-UV). Cycloheptylamine is used as an internal standard. The limits of detection were in the range 1-20 mg kg(-1) and the analysis was linear up to 1300 mg kg(-1) cyclamic acid in foods and up to 67 mg l(-1) in beverages. Analytical recovery was between 82% and 123%, and results were recovery corrected. Precision was within experimentally predicted levels for all of the matrices tested and Horrat values for the combined standard uncertainty associated with the measurement of cyclamate between 0.4 (water-based drinks) and 1.7 (spreads). The method was used successfully to test three soft drink samples for homogeneity before analytical performance assessment. The method is recommended for use in monitoring compliance and for formal testing by collaborative trial.
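The HorRat values quoted here compare the observed precision against the Horwitz-predicted reproducibility RSD, PRSD(%) = 2^(1 − 0.5·log10 C), where C is the analyte mass fraction. A short sketch; the observed 4% RSD below is an assumed example, only the 1300 mg/kg level comes from the abstract:

```python
import math

def horwitz_prsd(c):
    """Horwitz-predicted reproducibility RSD (%), with c the analyte
    mass fraction (1 mg/kg -> 1e-6)."""
    return 2.0 ** (1.0 - 0.5 * math.log10(c))

def horrat(observed_rsd, c):
    """HorRat = observed RSD / predicted RSD; values of roughly 0.5-2
    are conventionally regarded as acceptable method precision."""
    return observed_rsd / horwitz_prsd(c)

# Illustrative example: cyclamate near the method's upper working range,
# 1300 mg/kg (c = 1.3e-3), with an assumed observed RSD of 4%.
hr = horrat(4.0, 1.3e-3)
```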

  1. Determination of Major Phenolic Compounds in Echinacea spp. Raw Materials and Finished Products by High-Performance Liquid Chromatography with Ultraviolet Detection: Single-Laboratory Validation Matrix Extension

    PubMed Central

    Brown, Paula N.; Chan, Michael; Paley, Lori; Betz, Joseph M.

    2013-01-01

    A method previously validated to determine caftaric acid, chlorogenic acid, cynarin, echinacoside, and cichoric acid in echinacea raw materials has been successfully applied to dry extract and liquid tincture products in response to North American consumer needs. Single-laboratory validation was used to assess the repeatability, accuracy, selectivity, LOD, LOQ, analyte stability (ruggedness), and linearity of the method, with emphasis on finished products. Repeatability precision for each phenolic compound was between 1.04 and 5.65% RSD, with HorRat values between 0.30 and 1.39 for raw and dry extract finished products. HorRat values for tinctures were between 0.09 and 1.10. Accuracy of the method was determined through spike recovery studies. Recovery of each compound from raw material negative control (ginseng) was between 90 and 114%, while recovery from the finished product negative control (maltodextrin and magnesium stearate) was between 97 and 103%. A study was conducted to determine if cichoric acid, a major phenolic component of Echinacea purpurea (L.) Moench and E. angustifolia DC, degrades during sample preparation (extraction) and HPLC analysis. No significant degradation was observed over an extended testing period using the validated method. PMID:22165004

  2. Determination of major phenolic compounds in Echinacea spp. raw materials and finished products by high-performance liquid chromatography with ultraviolet detection: single-laboratory validation matrix extension.

    PubMed

    Brown, Paula N; Chan, Michael; Paley, Lori; Betz, Joseph M

    2011-01-01

    A method previously validated to determine caftaric acid, chlorogenic acid, cynarin, echinacoside, and cichoric acid in echinacea raw materials has been successfully applied to dry extract and liquid tincture products in response to North American consumer needs. Single-laboratory validation was used to assess the repeatability, accuracy, selectivity, LOD, LOQ, analyte stability (ruggedness), and linearity of the method, with emphasis on finished products. Repeatability precision for each phenolic compound was between 1.04 and 5.65% RSD, with HorRat values between 0.30 and 1.39 for raw and dry extract finished products. HorRat values for tinctures were between 0.09 and 1.10. Accuracy of the method was determined through spike recovery studies. Recovery of each compound from raw material negative control (ginseng) was between 90 and 114%, while recovery from the finished product negative control (maltodextrin and magnesium stearate) was between 97 and 103%. A study was conducted to determine if cichoric acid, a major phenolic component of Echinacea purpurea (L.) Moench and E. angustifolia DC, degrades during sample preparation (extraction) and HPLC analysis. No significant degradation was observed over an extended testing period using the validated method.

  3. A laboratory validation study of the time-lapse oscillatory pumping test concept for leakage detection in geological repositories

    NASA Astrophysics Data System (ADS)

    Sun, A. Y.; Islam, A.; Lu, J.

    2017-12-01

    Time-lapse oscillatory pumping test (OPT) has been introduced recently as a pressure-based monitoring technique for detecting potential leakage in geologic repositories. By routinely conducting OPT at a number of pulsing frequencies, a site operator may identify the potential anomalies in the frequency domain, alleviating the ambiguity caused by reservoir noise and improving the signal-to-noise ratio. Building on previous theoretical and field studies, this work performed a series of laboratory experiments to validate the concept of time-lapse OPT using a custom made, stainless steel tank under relatively high pressures ( 120psi). The experimental configuration simulates a miniature geologic storage repository consisting of three layers (i.e., injection zone, caprock, and above-zone aquifer). Results show that leakage in the injection zone led to deviations in the power spectrum of observed pressure data, and the amplitude of which also increases with decreasing pulsing frequencies. The experimental results were further analyzed by developing a 3D flow model, using which the model parameters were estimated through frequency domain inversion.

4. [Development and validation of an educational booklet for childbirth companions].

    PubMed

    Teles, Liana Mara Rocha; Oliveira, Amanda Souza de; Campos, Fernanda Câmara; Lima, Thaís Marques; Costa, Camila Chaves da; Gomes, Linicarla Fabiole de Souza; Oriá, Mônica Oliveira Batista; Damasceno, Ana Kelve de Castro

    2014-12-01

    The article describes the steps in producing and validating an educational booklet for childbirth companions. Methodological study conducted in 2011 consisting of the following steps: situational assessment; establishing brochure content; content selection and referencing; drafting the text; design of illustrations; layout; consultation of specialists; consultation of target audience; amendments; proofreading; evaluation using the Flesch Reading Ease Formula. The topics portrayed the sequence of events involving support from gestation to the postpartum period. The concordance rate among companions was greater than or equal to 81.8% for the topics organisation, writing style, presentation and motives. The overall Content Validity Index of the booklet was 0.94. The booklet was classified as easy reading or very easy reading according to the results of the Flesch Reading Ease Formula. The presentation and content of the manual were validated for use with the target audience by the specialists and representatives of the target audience.
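The Flesch Reading Ease evaluation used here is a simple formula over sentence and word statistics. The English-language coefficients are shown below; the study itself worked with Portuguese text, and the word, sentence, and syllable counts in the example are invented:

```python
def flesch_reading_ease(total_words, total_sentences, total_syllables):
    """Flesch Reading Ease (English coefficients); higher = easier.
    Scores above ~70 are conventionally labelled easy reading."""
    asl = total_words / total_sentences      # average sentence length
    asw = total_syllables / total_words      # average syllables per word
    return 206.835 - 1.015 * asl - 84.6 * asw

# Invented counts for a short booklet section
score = flesch_reading_ease(total_words=300, total_sentences=30,
                            total_syllables=390)
```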

  5. Aptamers as tools for target prioritization and lead identification.

    PubMed

    Burgstaller, Petra; Girod, Anne; Blind, Michael

    2002-12-15

    The increasing number of potential drug target candidates has driven the development of novel technologies designed to identify functionally important targets and enhance the subsequent lead discovery process. Highly specific synthetic nucleic acid ligands--also known as aptamers--offer a new exciting route in the drug discovery process by linking target validation directly with HTS. Recently, aptamers have proven to be valuable tools for modulating the function of endogenous cellular proteins in their natural environment. A set of technologies has been developed to use these sophisticated ligands for the validation of potential drug targets in disease models. Moreover, aptamers that are specific antagonists of protein function can act as substitute interaction partners in HTS assays to facilitate the identification of small-molecule lead compounds.

  6. Bioinformatics prediction and experimental validation of microRNA-20a targeting Cyclin D1 in hepatocellular carcinoma.

    PubMed

    Karimkhanloo, Hamzeh; Mohammadi-Yeganeh, Samira; Ahsani, Zeinab; Paryan, Mahdi

    2017-04-01

    Hepatocellular carcinoma is the major form of primary liver cancer, which is the second and sixth leading cause of cancer-related death in men and women, respectively. Extensive research indicates that Wnt/β-catenin signaling pathway, which plays a pivotal role in growth, development, and differentiation of hepatocellular carcinoma, is one of the major signaling pathways that is dysregulated in hepatocellular carcinoma. Cyclin D1 is a proto-oncogene and is one of the major regulators of Wnt signaling pathway, and its overexpression has been detected in various types of cancers including hepatocellular carcinoma. Using several validated bioinformatic databases, we predicted that the microRNAs are capable of targeting 3'-untranslated region of Cyclin D1 messenger RNA. According to the results, miR-20a was selected as the highest ranking microRNA targeting Cyclin D1 messenger RNA. Luciferase assay was recruited to confirm bioinformatic prediction results. Cyclin D1 expression was first assessed by quantitative real-time polymerase chain reaction in HepG2 cell line. Afterward, HepG2 cells were transduced by lentiviruses containing miR-20a. Then, the expression of miR-20a and Cyclin D1 was evaluated. The results of luciferase assay demonstrated targeting of 3'-untranslated region of Cyclin D1 messenger RNA by miR-20a. Furthermore, 238-fold decline in Cyclin D1 expression was observed after lentiviral induction of miR-20a in HepG2 cells. The results highlighted a considerable effect of miRNA-20a induction on the down-regulation of Cyclin D1 gene. Our results suggest that miR-20a can be used as a novel candidate for therapeutic purposes and a biomarker for hepatocellular carcinoma diagnosis.
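Under the standard Livak 2^(−ΔΔCt) analysis of qPCR data, the reported ~238-fold decline in Cyclin D1 expression corresponds to a ΔΔCt of about log2(238) ≈ 7.9 cycles. A back-of-the-envelope sketch; the abstract does not state which quantification method was used, so this mapping is illustrative:

```python
import math

def relative_expression(ddct):
    """Livak 2^(-ddCt) relative expression (treated vs. control)."""
    return 2.0 ** (-ddct)

# Back-calculate the ddCt implied by the reported ~238-fold decline
ddct = math.log2(238)                 # ~7.9 cycles
ratio = relative_expression(ddct)     # ~1/238 of control expression
```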

  7. [Quality Management System in Pathological Laboratory].

    PubMed

    Koyatsu, Junichi; Ueda, Yoshihiko

    2015-07-01

Even compared to other clinical laboratories, the pathology laboratory performs complex work, and many of its processes are manual. Therefore, the introduction of systematic administrative management is necessary. Using existing standards such as ISO 15189 is a shortcut for this purpose. There is no standard specialized for the pathology laboratory, but the following points are considered particularly important for one. 1. Safety management of personnel and environmental conditions: comply with laws and regulations concerning the handling of hazardous materials. 2. Pre-examination processes: the laboratory shall have documented procedures for the proper collection and handling of primary samples, and developed, documented criteria for acceptance or rejection of samples are applied. 3. Examination processes: selection, verification, and validation of the examination procedures; devise a system that can constantly monitor the traceability of the sample. 4. Post-examination processes: storage, retention, and disposal of clinical samples. 5. Release of results: when examination results fall within established alert or critical intervals, immediately notify the physicians. The important point is to recognize the needs of the client and to be aware that pathological diagnoses are always "the final diagnoses".

  8. Validating the disruption of proliferating cell nuclear antigen interactions in the development of targeted cancer therapeutics.

    PubMed

    Smith, Shanna J; Hickey, Robert J; Malkas, Linda H

    2016-01-01

    Human DNA replication and repair is a highly coordinated process involving the specifically timed actions of numerous proteins and enzymes. Many of these proteins require interaction with proliferating cell nuclear antigen (PCNA) for activation within the process. The interdomain connector loop (IDCL) of PCNA provides a docking site for many of those proteins, suggesting that this region is critically important in the regulation of cellular function. Previous work in this laboratory has demonstrated that a peptide mimicking a specific region of the IDCL (caPeptide) has the ability to disrupt key protein-protein interactions between PCNA and its binding partners, thereby inhibiting DNA replication within the cells. In this study, we confirm the ability of the caPeptide to disrupt DNA replication function using both intact cell and in vitro DNA replication assays. Further, we were able to demonstrate that treatment with caPeptide results in a decrease of polymerase δ activity that correlates with the observed decrease in DNA replication. We have also successfully developed a surface plasmon resonance (SPR) assay to validate the disruption of the PCNA-pol δ interaction with caPeptide.

  9. Head repositioning accuracy in patients with neck pain and asymptomatic subjects: concurrent validity, influence of motion speed, motion direction and target distance.

    PubMed

    Dugailly, Pierre-Michel; De Santis, Roberta; Tits, Mathieu; Sobczak, Stéphane; Vigne, Anna; Feipel, Véronique

    2015-12-01

Cervicocephalic kinesthetic deficiencies have been demonstrated in patients with chronic neck pain (NP). On the other hand, authors have emphasized the use of different motion speeds for assessing functional impairment of the cervical spine. The objectives of this study were (1) to investigate head repositioning accuracy in NP patients and control subjects and (2) to assess the influence of target distance, motion speed, motion direction, and pain. Seventy-one subjects (36 healthy subjects and 35 NP patients; age 30-55 years) performed the head repositioning test (HRT) at two different speeds for horizontal and vertical movements and at two different distances. For each condition, six consecutive trials were sampled. The study showed the validity and reproducibility of the HRT, confirming a dysfunctional threshold of 4.5°. Normative values of head repositioning error up to 3.6° and 7.1° were identified for healthy and NP subjects, respectively. A distance of 180 cm from the target and a natural motion speed increased HRT accuracy. Repositioning after an extension movement showed a significantly larger error in both groups. Intensity and duration of pain as well as pain level did not significantly alter head repositioning error. The assessment of proprioceptive performance in healthy and NP subjects allowed the validation of the HRT. The HRT is a simple, inexpensive, and fast test, easily implemented in daily practice to assess and monitor treatment and the evolution of proprioceptive cervical deficits.
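Scoring the head repositioning test reduces to comparing the mean angular return-to-neutral error over the six trials against the 4.5° dysfunctional threshold reported above. A minimal sketch; the six trial values are invented:

```python
from statistics import mean

DYSFUNCTION_THRESHOLD_DEG = 4.5   # threshold reported in the study

def repositioning_error(trial_errors_deg):
    """Mean absolute head repositioning error over consecutive trials,
    flagged against the 4.5 deg dysfunctional threshold."""
    err = mean(abs(e) for e in trial_errors_deg)
    return err, err > DYSFUNCTION_THRESHOLD_DEG

# Invented six-trial error series (deg) for one movement condition
err, impaired = repositioning_error([2.1, -3.4, 2.8, -1.9, 3.0, 2.5])
```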

  10. Comparable Educational Benefits in Half the Time: An Alternating Organic Chemistry Laboratory Sequence Targeting Prehealth Students

    ERIC Educational Resources Information Center

    Young, Sherri C.; Colabroy, Keri L.; Baar, Marsha R.

    2016-01-01

    The laboratory is a mainstay in STEM education, promoting the development of critical thinking skills, dexterity, and scientific curiosity. The goals in the laboratory for nonchemistry, prehealth majors, though, could be distinguished from those for chemistry majors. In service courses such as organic chemistry, much laboratory time is often spent…

  11. The University of Kansas High-Throughput Screening Laboratory. Part II: enabling collaborative drug-discovery partnerships through cutting-edge screening technology

    PubMed Central

    McDonald, Peter R; Roy, Anuradha; Chaguturu, Rathnam

    2011-01-01

    The University of Kansas High-Throughput Screening (KU HTS) core is a state-of-the-art drug-discovery facility with an entrepreneurial open-service policy, which provides centralized resources supporting public- and private-sector research initiatives. The KU HTS core was established in 2002 at the University of Kansas with support from an NIH grant and the state of Kansas. It collaborates with investigators from national and international academic, nonprofit and pharmaceutical organizations in executing HTS-ready assay development and screening of chemical libraries for target validation, probe selection, hit identification and lead optimization. This is part two of a contribution from the KU HTS laboratory. PMID:21806374

  12. The University of Kansas High-Throughput Screening Laboratory. Part II: enabling collaborative drug-discovery partnerships through cutting-edge screening technology.

    PubMed

    McDonald, Peter R; Roy, Anuradha; Chaguturu, Rathnam

    2011-07-01

    The University of Kansas High-Throughput Screening (KU HTS) core is a state-of-the-art drug-discovery facility with an entrepreneurial open-service policy, which provides centralized resources supporting public- and private-sector research initiatives. The KU HTS core was established in 2002 at the University of Kansas with support from an NIH grant and the state of Kansas. It collaborates with investigators from national and international academic, nonprofit and pharmaceutical organizations in executing HTS-ready assay development and screening of chemical libraries for target validation, probe selection, hit identification and lead optimization. This is part two of a contribution from the KU HTS laboratory.

  13. Single-laboratory validation of a saponification method for the determination of four polycyclic aromatic hydrocarbons in edible oils by HPLC-fluorescence detection.

    PubMed

    Akdoğan, Abdullah; Buttinger, Gerhard; Wenzl, Thomas

    2016-01-01

    An analytical method is reported for the determination of four polycyclic aromatic hydrocarbons (benzo[a]pyrene (BaP), benz[a]anthracene (BaA), benzo[b]fluoranthene (BbF) and chrysene (CHR)) in edible oils (sesame, maize, sunflower and olive oil) by high-performance liquid chromatography. Sample preparation is based on three steps including saponification, liquid-liquid partitioning and, finally, clean-up by solid phase extraction on 2 g of silica. Guidance on single-laboratory validation of the proposed analysis method was taken from the second edition of the Eurachem guide on method validation. The lower level of the working range of the method was determined by the limits of quantification of the individual analytes, and the upper level was equal to 5.0 µg kg(-1). The limits of detection and quantification of the four PAHs ranged from 0.06 to 0.12 µg kg(-1) and from 0.13 to 0.24 µg kg(-1). Recoveries of more than 84.8% were achieved for all four PAHs at two concentration levels (2.5 and 5.0 µg kg(-1)), and expanded relative measurement uncertainties were below 20%. The performance of the validated method was in all aspects compliant with provisions set in European Union legislation for the performance of analytical methods employed in the official control of food. The applicability of the method to routine samples was evaluated based on a limited number of commercial edible oil samples.
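The Eurachem guide admits several estimators for detection and quantification limits; one widely used calibration-based convention (ICH-style, not necessarily the one applied in this study) is LOD = 3.3·s/slope and LOQ = 10·s/slope. A sketch with invented calibration numbers chosen to land near the reported µg/kg ranges:

```python
def lod_loq_from_calibration(residual_sd, slope):
    """Calibration-based estimates: LOD = 3.3*s/slope, LOQ = 10*s/slope,
    with s the residual SD of a low-level calibration line."""
    return 3.3 * residual_sd / slope, 10.0 * residual_sd / slope

# Invented calibration for one PAH (signal units per ug/kg)
lod, loq = lod_loq_from_calibration(residual_sd=0.9, slope=50.0)
```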

  14. External Heat Transfer Coefficient Measurements on a Surrogate Indirect Inertial Confinement Fusion Target

    DOE PAGES

    Miles, Robin; Havstad, Mark; LeBlanc, Mary; ...

    2015-09-15

External heat transfer coefficients were measured around a surrogate indirect inertial confinement fusion (ICF) target based on the Laser Inertial Fusion Energy (LIFE) design, to validate thermal models of the LIFE target during flight through a fusion chamber. Results indicate that the heat transfer coefficients for this target (25-50 W/m²·K) are consistent with theoretically derived heat transfer coefficients and are valid for use in calculations of target heating during flight through a fusion chamber.
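A measured external heat transfer coefficient h feeds directly into a lumped-capacitance estimate of target heating during chamber flight, T(t) = T_gas + (T0 − T_gas)·exp(−hAt/(m·cp)). The target size, mass, heat capacity, gas temperature, and flight time below are illustrative assumptions, not LIFE design values; only the 25-50 W/m²·K range comes from the record above:

```python
import math

def transit_temperature(h, area, mass, cp, t_gas, t0, t_flight):
    """Lumped-capacitance target temperature after flight time t_flight
    through hot chamber gas: T = T_gas + (T0 - T_gas)*exp(-h*A*t/(m*cp))."""
    tau = mass * cp / (h * area)          # thermal time constant, s
    return t_gas + (t0 - t_gas) * math.exp(-t_flight / tau)

# Illustrative numbers: h = 40 W/m^2.K (mid-range of the reported 25-50),
# a 1 cm^2, 1 g target with cp ~ 900 J/kg.K, 800 C gas, 0.1 s flight.
t_end = transit_temperature(h=40.0, area=1e-4, mass=1e-3, cp=900.0,
                            t_gas=800.0, t0=20.0, t_flight=0.1)
```

With these assumed numbers the time constant (~225 s) dwarfs the flight time, so the surface warms by well under a degree, which is the kind of conclusion such thermal models are used to verify.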

  15. Mitochondria as new therapeutic targets for eradicating cancer stem cells: Quantitative proteomics and functional validation via MCT1/2 inhibition.

    PubMed

    Lamb, Rebecca; Harrison, Hannah; Hulit, James; Smith, Duncan L; Lisanti, Michael P; Sotgia, Federica

    2014-11-30

Here, we used quantitative proteomics analysis to identify novel therapeutic targets in cancer stem cells and/or progenitor cells. For this purpose, mammospheres from two ER-positive breast cancer cell lines (MCF7 and T47D) were grown in suspension using low-attachment plates and directly compared to attached monolayer cells grown in parallel. This allowed us to identify a subset of proteins that were selectively over-expressed in mammospheres, relative to epithelial monolayers. We focused on mitochondrial proteins, as they appeared to be highly upregulated in both MCF7 and T47D mammospheres. Key mitochondrial-related enzymes involved in beta-oxidation and ketone metabolism were significantly upregulated in mammospheres, as were proteins involved in mitochondrial biogenesis and specific protein inhibitors of autophagy/mitophagy. Overall, we identified >40 "metabolic targets" that were commonly upregulated in both MCF7 and T47D mammospheres. Most of these "metabolic targets" were also transcriptionally upregulated in human breast cancer cells in vivo, validating their clinical relevance. Based on this analysis, we propose that increased mitochondrial biogenesis and decreased mitochondrial degradation could provide a novel mechanism for the accumulation of mitochondrial mass in cancer stem cells. To functionally validate our observations, we utilized a specific MCT1/2 inhibitor (AR-C155858), which blocks the cellular uptake of two types of mitochondrial fuels, namely ketone bodies and L-lactate. Our results indicate that inhibition of MCT1/2 function effectively reduces mammosphere formation, with an IC50 of ~1 µM, in both ER-positive and ER-negative breast cancer cell lines. Very similar results were obtained with oligomycin A, an inhibitor of the mitochondrial ATP synthase. Thus, the proliferative clonal expansion of cancer stem cells appears to require oxidative mitochondrial metabolism, related to the re-use of monocarboxylic acids such as ketones and L-lactate.
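An IC50 like the ~1 µM value reported for MCT1/2 inhibition is read off a dose-response curve. A minimal log-linear interpolation sketch; the dose-response points are invented, arranged so the 50% crossing falls near 1 µM:

```python
import numpy as np

def ic50_log_interp(concs_um, response_frac):
    """IC50 by log-linear interpolation between the doses bracketing
    50% of the untreated response (response decreases with dose)."""
    logc = np.log10(concs_um)
    # np.interp needs ascending x, so reverse the monotone-decreasing curve
    return 10.0 ** np.interp(0.5, response_frac[::-1], logc[::-1])

# Invented mammosphere-formation dose-response for an MCT1/2 inhibitor
concs = np.array([0.01, 0.1, 1.0, 10.0, 100.0])
resp = np.array([0.98, 0.85, 0.52, 0.15, 0.05])
ic50 = ic50_log_interp(concs, resp)
```

In practice a four-parameter logistic fit over all points is preferred to two-point interpolation; the interpolation version is shown only because it is compact.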

  16. Electronic laboratory notebook: the academic point of view.

    PubMed

    Rudolphi, Felix; Goossen, Lukas J

    2012-02-27

    Based on a requirement analysis and alternative design considerations, a platform-independent electronic laboratory notebook (ELN) has been developed that specifically targets academic users. Its intuitive design and numerous productivity features motivate chemical researchers and students to record their data electronically. The data are stored in a highly structured form that offers substantial benefits over laboratory notebooks written on paper with regard to data retrieval, data mining, and exchange of results.

  17. Validation of PCR methods for quantitation of genetically modified plants in food.

    PubMed

    Hübner, P; Waiblinger, H U; Pietsch, K; Brodmann, P

    2001-01-01

For enforcement of the recently introduced labeling threshold for genetically modified organisms (GMOs) in food ingredients, quantitative detection methods such as quantitative competitive PCR (QC-PCR) and real-time PCR are applied by official food control laboratories. The experiences of 3 European food control laboratories in validating such methods were compared to describe realistic performance characteristics of quantitative PCR detection methods. The limit of quantitation (LOQ) of GMO-specific real-time PCR was experimentally determined to be 30-50 target molecules, which is close to the theoretical prediction. Starting PCR with 200 ng of genomic plant DNA, the LOQ depends primarily on the genome size of the target plant and ranges from 0.02% for rice to 0.7% for wheat. The precision of quantitative PCR detection methods, expressed as relative standard deviation (RSD), varied from 10 to 30%. Using test samples containing Bt176 corn and applying Bt176-specific QC-PCR, mean values deviated from true values by -7% to 18%, with an average of 2 ± 10%. Ruggedness of real-time PCR detection methods was assessed in an interlaboratory study analyzing commercial, homogeneous food samples. Roundup Ready soybean DNA contents were determined in the range of 0.3 to 36%, relative to soybean DNA, with RSDs of about 25%. Taking the precision of quantitative PCR detection methods into account, suitable sampling plans and sample sizes for GMO analysis are suggested. Because quantitative GMO detection methods measure the GMO content of samples relative to reference material (calibrants), high priority must be given to international agreement and standardization on certified reference materials.
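The genome-size dependence of the relative LOQ described above follows from converting the fixed 200 ng DNA input into genome-copy numbers (1 pg of DNA corresponds to roughly 978 Mbp). A sketch of the arithmetic; the genome sizes are approximate assumptions, and the result reproduces the reported order of magnitude (~0.01-0.02% for rice, several tenths of a percent for wheat):

```python
def loq_percent(genome_size_mbp, dna_ng=200.0, loq_copies=50):
    """Relative LOQ (%) implied by an absolute LOQ of ~30-50 target
    copies: 1 pg of DNA is roughly 978 Mbp, so a fixed DNA input
    contains fewer genome copies for larger genomes."""
    genome_pg = genome_size_mbp / 978.0
    total_copies = dna_ng * 1000.0 / genome_pg
    return 100.0 * loq_copies / total_copies

rice = loq_percent(430.0)       # rice genome, ~430 Mbp (assumed size)
wheat = loq_percent(16000.0)    # hexaploid wheat, ~16,000 Mbp (assumed size)
```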

  18. Selection, calibration, and validation of models of tumor growth.

    PubMed

    Lima, E A B F; Oden, J T; Hormuth, D A; Yankeelov, T E; Almeida, R C

    2016-11-01

    This paper presents general approaches for addressing some of the most important issues in predictive computational oncology concerned with developing classes of predictive models of tumor growth. First, the process of developing mathematical models of vascular tumors evolving in the complex, heterogeneous, macroenvironment of living tissue; second, the selection of the most plausible models among these classes, given relevant observational data; third, the statistical calibration and validation of models in these classes, and finally, the prediction of key Quantities of Interest (QOIs) relevant to patient survival and the effect of various therapies. The most challenging aspects of this endeavor is that all of these issues often involve confounding uncertainties: in observational data, in model parameters, in model selection, and in the features targeted in the prediction. Our approach can be referred to as "model agnostic" in that no single model is advocated; rather, a general approach that explores powerful mixture-theory representations of tissue behavior while accounting for a range of relevant biological factors is presented, which leads to many potentially predictive models. Then representative classes are identified which provide a starting point for the implementation of OPAL, the Occam Plausibility Algorithm (OPAL) which enables the modeler to select the most plausible models (for given data) and to determine if the model is a valid tool for predicting tumor growth and morphology ( in vivo ). All of these approaches account for uncertainties in the model, the observational data, the model parameters, and the target QOI. We demonstrate these processes by comparing a list of models for tumor growth, including reaction-diffusion models, phase-fields models, and models with and without mechanical deformation effects, for glioma growth measured in murine experiments. 
Examples are provided that exhibit quite acceptable predictions of tumor growth in laboratory
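The model-selection step described above can be illustrated with a short sketch: given Bayesian evidences p(D|M_j) for a set of candidate models and a model prior, posterior model plausibilities follow directly from Bayes' rule. The evidence values and uniform prior below are invented for illustration and are not taken from the paper.

```python
import numpy as np

# Hypothetical log-evidences p(D | M_j) for three candidate tumor-growth
# models (e.g. from Bayesian calibration); the numbers are illustrative only.
log_evidence = np.array([-120.4, -118.9, -125.2])
prior = np.array([1 / 3, 1 / 3, 1 / 3])  # uniform model prior

# Posterior model plausibilities: pi_j proportional to p(D|M_j) p(M_j),
# computed in log space for numerical stability.
log_post = log_evidence + np.log(prior)
log_post -= log_post.max()
plausibility = np.exp(log_post) / np.exp(log_post).sum()

best = int(np.argmax(plausibility))  # index of the most plausible model
```

OPAL then subjects the most plausible model to a validation test against held-out data before it is used for prediction; the step above only ranks the candidates.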

  19. Development of performance assessment instrument based contextual learning for measuring students laboratory skills

    NASA Astrophysics Data System (ADS)

    Susilaningsih, E.; Khotimah, K.; Nurhayati, S.

    2018-04-01

    The assessment of laboratory skills generally lacks specific guidelines, and individual assessment of students' performance and skill in the laboratory is still not observed and measured properly. An alternative assessment that can be used to measure students' laboratory skills is performance assessment. The purpose of this study was to determine whether the performance assessment instrument resulting from this research can be used to assess students' basic laboratory skills. The research followed the Research and Development method. The procedure comprised a preliminary stage and a development stage. The preliminary stage was divided into two parts, namely field studies and literature studies. The development stage was divided into several parts, namely 1) development of the instrument type, 2) validation by experts, 3) a limited-scale trial, 4) a large-scale trial, and 5) implementation of the product. Analysis of the data showed that the developed performance assessment instrument was feasible to implement, with a validation result of 62.5 (very good category) for the laboratory skills observation sheets and all components in the very good category. The instrument was included in the effective category because 26 of 29 students showed very high or high laboratory skill. The resulting performance assessment instrument is standardized and can be used to assess students' basic laboratory skills.

  20. Development and Validation of a Qualitative Method for Target Screening of 448 Pesticide Residues in Fruits and Vegetables Using UHPLC/ESI Q-Orbitrap Based on Data-Independent Acquisition and Compound Database.

    PubMed

    Wang, Jian; Chow, Willis; Chang, James; Wong, Jon W

    2017-01-18

    A semiautomated qualitative method for target screening of 448 pesticide residues in fruits and vegetables was developed and validated using ultrahigh-performance liquid chromatography coupled with electrospray ionization quadrupole Orbitrap high-resolution mass spectrometry (UHPLC/ESI Q-Orbitrap). The Q-Orbitrap Full MS/dd-MS2 (data-dependent acquisition) mode was used to acquire product-ion spectra of individual pesticides to build a compound database or an MS library, while its Full MS/DIA (data-independent acquisition) mode was utilized for sample data acquisition from fruit and vegetable matrices fortified with pesticides at 10 and 100 μg/kg for target screening purposes. Accurate mass, retention time, and response threshold were the three key parameters in the compound database used to detect incurred pesticide residues in samples. The concepts and practical aspects of in-spectrum mass correction or solvent background lock-mass correction, retention time alignment, and response threshold adjustment are discussed while building a functional and working compound database for target screening. The validated target screening method is capable of screening at least 94% and 99% of 448 pesticides at 10 and 100 μg/kg, respectively, in fruits and vegetables without having to evaluate every compound manually during data processing, which significantly reduced the workload in routine practice.
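The three-parameter screening logic described above can be sketched as follows: a detected peak is assigned to a database compound when its accurate mass falls within a ppm tolerance of the theoretical m/z, its retention time falls within a window, and its response exceeds the compound's threshold. The compound entries, tolerances, and peak values below are hypothetical stand-ins, not the published database.

```python
def matches(peak, compound, ppm_tol=5.0, rt_tol=0.5):
    """Return True if a detected peak satisfies all three screening criteria."""
    ppm_error = abs(peak["mz"] - compound["mz"]) / compound["mz"] * 1e6
    return (ppm_error <= ppm_tol
            and abs(peak["rt"] - compound["rt"]) <= rt_tol
            and peak["response"] >= compound["threshold"])

# Illustrative compound database entries: theoretical [M+H]+ m/z,
# expected retention time (min), and response threshold (assumed values).
database = [
    {"name": "carbaryl", "mz": 202.0863, "rt": 6.10, "threshold": 1e4},
    {"name": "atrazine", "mz": 216.1010, "rt": 7.25, "threshold": 5e3},
]

peak = {"mz": 202.0860, "rt": 6.02, "response": 3.2e4}
hits = [c["name"] for c in database if matches(peak, c)]
```

In the published workflow this comparison is run automatically over the full 448-compound database for every DIA feature, which is what removes the need to inspect each compound manually.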

  1. Changing resident test ordering behavior: a multilevel intervention to decrease laboratory utilization at an academic medical center.

    PubMed

    Vidyarthi, Arpana R; Hamill, Timothy; Green, Adrienne L; Rosenbluth, Glenn; Baron, Robert B

    2015-01-01

    Hospital laboratory test volume is increasing, and overutilization contributes to errors and costs. Efforts to reduce laboratory utilization have targeted aspects of ordering behavior, but few have utilized a multilevel collaborative approach. The study team partnered with residents to reduce unnecessary laboratory tests and associated costs through multilevel interventions across the academic medical center. The study team selected laboratory tests for intervention based on cost, volume, and ordering frequency (complete blood count [CBC] and CBC with differential, common electrolytes, blood enzymes, and liver function tests). Interventions were designed collaboratively with residents and targeted components of ordering behavior, including system changes, teaching, social marketing, academic detailing, financial incentives, and audit/feedback. Laboratory ordering was reduced by 8% cumulatively over 3 years, saving $2,019,000. By involving residents at every stage of the intervention and targeting multiple levels simultaneously, laboratory utilization was reduced and cost savings were sustained over 3 years. © 2014 by the American College of Medical Quality.

  2. Predicting selective drug targets in cancer through metabolic networks

    PubMed Central

    Folger, Ori; Jerby, Livnat; Frezza, Christian; Gottlieb, Eyal; Ruppin, Eytan; Shlomi, Tomer

    2011-01-01

    The interest in studying metabolic alterations in cancer and their potential role as novel targets for therapy has been rejuvenated in recent years. Here, we report the development of the first genome-scale network model of cancer metabolism, validated by correctly identifying genes essential for cellular proliferation in cancer cell lines. The model predicts 52 cytostatic drug targets, of which 40% are targeted by known, approved or experimental anticancer drugs, and the rest are new. It further predicts combinations of synthetic lethal drug targets, whose synergy is validated using available drug efficacy and gene expression measurements across the NCI-60 cancer cell line collection. Finally, potential selective treatments for specific cancers that depend on cancer type-specific downregulation of gene expression and somatic mutations are compiled. PMID:21694718

  3. Validity and reliability of balance assessment software using the Nintendo Wii balance board: usability and validation.

    PubMed

    Park, Dae-Sung; Lee, GyuChang

    2014-06-10

    A balance test provides important information, such as a standard by which to judge an individual's functional recovery or predict falls. A balance-testing tool that is inexpensive and widely available is needed, especially in clinical settings. The Wii Balance Board (WBB) is designed to test balance, but little software is available for balance testing, and there are few studies of its reliability and validity. Thus, we developed balance assessment software using the Nintendo Wii Balance Board, investigated its reliability and validity, and compared it with a laboratory-grade force platform. Twenty healthy adults participated in our study, completing tests of inter-rater reliability, intra-rater reliability, and concurrent validity. The tests were performed with the balance assessment software using the Nintendo Wii Balance Board and with a laboratory-grade force platform. Data such as center of pressure (COP) path length and COP velocity were acquired from the assessment systems. Inter-rater reliability, intra-rater reliability, and concurrent validity were analyzed using intraclass correlation coefficient (ICC) values and standard errors of measurement (SEM). The inter-rater reliability (ICC: 0.89-0.79, SEM in path length: 7.14-1.90, SEM in velocity: 0.74-0.07), intra-rater reliability (ICC: 0.92-0.70, SEM in path length: 7.59-2.04, SEM in velocity: 0.80-0.07), and concurrent validity (ICC: 0.87-0.73, SEM in path length: 5.94-0.32, SEM in velocity: 0.62-0.08) were high in terms of COP path length and COP velocity. The balance assessment software incorporating the Nintendo Wii Balance Board was found to be a reliable assessment device. In clinical settings, the device is remarkably inexpensive, portable, and convenient for balance assessment.
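The two outcome measures compared above can be computed directly from sampled COP coordinates: path length is the total distance travelled by the centre of pressure, and mean velocity is that length divided by the trial duration. The trajectory below is synthetic; a real WBB or force-platform trial would supply measured (x, y) samples at a fixed sampling rate.

```python
import numpy as np

fs = 100.0                    # sampling rate, Hz (assumed)
t = np.arange(0, 30, 1 / fs)  # 30 s quiet-standing trial
# Synthetic sway trajectory in cm, standing in for measured COP data.
x = 0.5 * np.sin(2 * np.pi * 0.3 * t)
y = 0.8 * np.sin(2 * np.pi * 0.2 * t)

steps = np.hypot(np.diff(x), np.diff(y))      # distance between samples
path_length = steps.sum()                     # total COP excursion, cm
mean_velocity = path_length / (len(t) / fs)   # cm/s
```

Running the same reduction on simultaneous WBB and force-platform recordings yields the paired values from which the ICC and SEM statistics above are computed.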

  4. Validating soil denitrification models based on laboratory N2 and N2O fluxes and underlying processes derived by stable isotope approaches

    NASA Astrophysics Data System (ADS)

    Well, Reinhard; Böttcher, Jürgen; Butterbach-Bahl, Klaus; Dannenmann, Michael; Deppe, Marianna; Dittert, Klaus; Dörsch, Peter; Horn, Marcus; Ippisch, Olaf; Mikutta, Robert; Müller, Carsten; Müller, Christoph; Senbayram, Mehmet; Vogel, Hans-Jörg; Wrage-Mönnig, Nicole

    2016-04-01

    Robust denitrification data suitable to validate soil N2 fluxes in denitrification models are scarce due to methodical limitations and the extreme spatio-temporal heterogeneity of denitrification in soils. Numerical models have become essential tools to predict denitrification at different scales. Model performance could be tested for total gaseous flux (NO + N2O + N2), for individual denitrification products (e.g. N2O and/or NO), or for the effect of denitrification factors (e.g. C-availability, respiration, diffusivity, anaerobic volume, etc.). While there are numerous examples of validating N2O fluxes, there are neither robust field data of N2 fluxes nor sufficiently resolved measurements of control factors used as state variables in the models. To the best of our knowledge, there has been only one published validation of modelled soil N2 flux to date, using a laboratory data set to validate an ecosystem model. Hence there is a need for validation data at both the mesocosm and the field scale, including validation of individual denitrification controls. Here we present the concept for collecting model validation data, which will be part of the DFG research unit "Denitrification in Agricultural Soils: Integrated Control and Modelling at Various Scales (DASIM)" starting this year. We will use novel approaches including analysis of stable isotopes, microbial communities, pore structure, and organic matter fractions to provide denitrification data sets comprising as much detail on activity and regulation as possible, as a basis to validate existing and calibrate new denitrification models that are applied and/or developed by DASIM subprojects. The basic idea is to simulate "field-like" conditions as far as possible in an automated mesocosm system without plants, in order to mimic processes in the soil parts not significantly influenced by the rhizosphere (rhizosphere soils are studied by other DASIM projects). 
Hence, to allow model testing in a wide range of conditions

  5. Undergraduate Laboratory on a Turbulent Impinging Jet

    NASA Astrophysics Data System (ADS)

    Ivanosky, Arnaud; Brezzard, Etienne; van Poppel, Bret; Benson, Michael

    2017-11-01

    An undergraduate thermal sciences laboratory exercise that includes both experimental fluid mechanics and heat transfer measurements of an impinging jet is presented. The flow field is measured using magnetic resonance velocimetry (MRV) of a water flow, while IR thermography is used in the heat transfer testing. Flow Reynolds numbers for both the heat transfer and fluid mechanics tests range from 20,000 to 50,000 based on the jet diameter for a fully turbulent flow condition, with target surface temperatures in the heat transfer test reaching a maximum of approximately 50 Kelvin. The heat transfer target surface is subject to a measured uniform Joule heat flux, a well-defined boundary condition that allows comparison to existing correlations. The MRV generates a 3-component, 3-dimensional data set, while the IR thermography provides a 2-dimensional heat transfer coefficient (or Nusselt number) map. These data sets can be post-processed and compared to existing correlations to verify data quality, and the sets can be juxtaposed to understand how flow features drive heat transfer. The laboratory setup, data acquisition, and analysis procedures are described for the laboratory experience, which can be incorporated into fluid mechanics, experimental methods, and heat transfer courses.
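The data reduction implied above can be sketched in a few lines: the jet Reynolds number from the jet diameter and exit velocity, and the local heat transfer coefficient and Nusselt number from the measured Joule heat flux and IR surface temperatures. All numerical values below are illustrative, not the laboratory's actual conditions.

```python
# Assumed fluid properties and jet conditions for an air jet (illustrative).
rho, mu, k_air = 1.19, 1.85e-5, 0.026  # density kg/m^3, viscosity Pa*s, conductivity W/m*K
D = 0.02                               # jet diameter, m
V = 25.0                               # jet exit velocity, m/s
Re = rho * V * D / mu                  # jet Reynolds number

q_joule = 5000.0   # measured uniform Joule heat flux on the target, W/m^2
T_surf = 320.0     # local surface temperature from IR thermography, K
T_jet = 295.0      # jet temperature, K
h = q_joule / (T_surf - T_jet)         # local heat transfer coefficient, W/m^2*K
Nu = h * D / k_air                     # local Nusselt number
```

Repeating the last two lines for every IR pixel produces the 2-dimensional Nusselt number map described above, which can then be compared against published impinging-jet correlations.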

  6. Laboratory challenges in the scaling up of HIV, TB, and malaria programs: The interaction of health and laboratory systems, clinical research, and service delivery.

    PubMed

    Birx, Deborah; de Souza, Mark; Nkengasong, John N

    2009-06-01

    Strengthening national health laboratory systems in resource-poor countries is critical to meeting the United Nations Millennium Development Goals. Despite strong commitment from the international community to fight major infectious diseases, weak laboratory infrastructure remains a huge rate-limiting step. Some major challenges facing laboratory systems in resource-poor settings include dilapidated infrastructure; lack of human capacity, laboratory policies, and strategic plans; and limited synergies between clinical and research laboratories. Together, these factors compromise the quality of test results and impact patient management. With increased funding, the target of laboratory strengthening efforts in resource-poor countries should be the integrating of laboratory services across major diseases to leverage resources with respect to physical infrastructure; types of assays; supply chain management of reagents and equipment; and maintenance of equipment.

  7. Research with Radioactive Targets

    NASA Astrophysics Data System (ADS)

    Ahle, Larry

    2004-10-01

    Obtaining precise information about neutron capture cross-sections for s-process branch points is a key goal of nuclear astrophysics. Since these nuclei are unstable and neutron targets do not exist, performing these measurements requires a facility that can produce the nuclei of interest at a sufficient rate to allow formation of a meaningful target (at least 10^15 atoms). The Rare Isotope Accelerator (RIA) promises such rates, often enabling collection of greater than 10^16 atoms after only a few days of production running. By properly designing both the ISOL and fragmentation lines, it will often be possible to obtain these collections parasitically alongside other radioactive ion beam production. But given a target, performing the neutron capture cross-section measurement presents its own challenges. In many cases, activation measurements are feasible, provided one obtains a target of sufficient purity. But for many branch-point nuclei, the capture product is stable or so long-lived that no radiation signature is available for detection. Measurements for these nuclei will require a BaF2 array like DANCE at Los Alamos National Laboratory, which uses gamma calorimetry to detect neutron capture events. Plans and issues associated with isotope harvesting will be discussed, as well as challenges associated with performing these measurements. Current plans for doing DANCE-type measurements at RIA will also be discussed. This work was performed under the auspices of the U.S. Department of Energy by the University of California, Lawrence Livermore National Laboratory under contract No. W-7405-Eng-48.

  8. Adaptive Training and Education Research at the US Army Research Laboratory: Bibliography (2016-2017)

    DTIC Science & Technology

    2018-03-05

    ARL-SR-0393 ● MAR 2018 ● US Army Research Laboratory. Adaptive Training and Education Research at the US Army Research Laboratory: Bibliography (2016–2017), by Robert A Sottilare. Keywords: validation suite; synthetic training environments; service-oriented architecture. Sample citation: Robson, E., Ray, F., & Sinatra, A. M. (2017…

  9. Test Takers and the Validity of Score Interpretations

    ERIC Educational Resources Information Center

    Kopriva, Rebecca J.; Thurlow, Martha L.; Perie, Marianne; Lazarus, Sheryl S.; Clark, Amy

    2016-01-01

    This article argues that test takers are as integral to determining validity of test scores as defining target content and conditioning inferences on test use. A principled sustained attention to how students interact with assessment opportunities is essential, as is a principled sustained evaluation of evidence confirming the validity or calling…

  10. Conceptualization, Development and Validation of an Instrument for Investigating Elements of Undergraduate Physics Laboratory Learning Environments: The UPLLES (Undergraduate Physics Laboratory Learning Environment Survey)

    ERIC Educational Resources Information Center

    Thomas, Gregory P; Meldrum, Al; Beamish, John

    2013-01-01

    First-year undergraduate physics laboratories are important physics learning environments. However, there is a lack of empirically informed literature regarding how students perceive their overall laboratory learning experiences. Recipe formats persist as the dominant form of instructional design in these sites, and these formats do not adequately…

  11. A General Method for Targeted Quantitative Cross-Linking Mass Spectrometry.

    PubMed

    Chavez, Juan D; Eng, Jimmy K; Schweppe, Devin K; Cilia, Michelle; Rivera, Keith; Zhong, Xuefei; Wu, Xia; Allen, Terrence; Khurgel, Moshe; Kumar, Akhilesh; Lampropoulos, Athanasios; Larsson, Mårten; Maity, Shuvadeep; Morozov, Yaroslav; Pathmasiri, Wimal; Perez-Neut, Mathew; Pineyro-Ruiz, Coriness; Polina, Elizabeth; Post, Stephanie; Rider, Mark; Tokmina-Roszyk, Dorota; Tyson, Katherine; Vieira Parrine Sant'Ana, Debora; Bruce, James E

    2016-01-01

    Chemical cross-linking mass spectrometry (XL-MS) provides protein structural information by identifying covalently linked proximal amino acid residues on protein surfaces. The information gained by this technique is complementary to other structural biology methods such as X-ray crystallography, NMR, and cryo-electron microscopy [1]. The extension of traditional quantitative proteomics methods with chemical cross-linking can provide information on the structural dynamics of protein structures and protein complexes. The identification and quantitation of cross-linked peptides remains challenging for the general community, requiring specialized expertise that ultimately limits more widespread adoption of the technique. We describe a general method for targeted quantitative mass spectrometric analysis of cross-linked peptide pairs. We report the adaptation of the widely used, open-source software package Skyline for the analysis of quantitative XL-MS data as a means for data analysis and sharing of methods. We demonstrate the utility and robustness of the method with a cross-laboratory study and present data that is supported by, and validates, previously published data on quantified cross-linked peptide pairs. This advance provides an easy-to-use resource so that any lab with access to an LC-MS system capable of performing targeted quantitative analysis can quickly and accurately measure dynamic changes in protein structure and protein interactions.

  12. Targeted estimation of nuisance parameters to obtain valid statistical inference.

    PubMed

    van der Laan, Mark J

    2014-01-01

    In order to obtain concrete results, we focus on estimation of the treatment specific mean, controlling for all measured baseline covariates, based on observing independent and identically distributed copies of a random variable consisting of baseline covariates, a subsequently assigned binary treatment, and a final outcome. The statistical model only assumes possible restrictions on the conditional distribution of treatment, given the covariates, the so-called propensity score. Estimators of the treatment specific mean involve estimation of the propensity score and/or estimation of the conditional mean of the outcome, given the treatment and covariates. In order to make these estimators asymptotically unbiased at any data distribution in the statistical model, it is essential to use data-adaptive estimators of these nuisance parameters such as ensemble learning, and specifically super-learning. Because such estimators involve optimal trade-off of bias and variance w.r.t. the infinite dimensional nuisance parameter itself, they result in a sub-optimal bias/variance trade-off for the resulting real-valued estimator of the estimand. We demonstrate that additional targeting of the estimators of these nuisance parameters guarantees that this bias for the estimand is second order and thereby allows us to prove theorems that establish asymptotic linearity of the estimator of the treatment specific mean under regularity conditions. These insights result in novel targeted minimum loss-based estimators (TMLEs) that use ensemble learning with additional targeted bias reduction to construct estimators of the nuisance parameters. In particular, we construct collaborative TMLEs (C-TMLEs) with known influence curve allowing for statistical inference, even though these C-TMLEs involve variable selection for the propensity score based on a criterion that measures how effective the resulting fit of the propensity score is in removing bias for the estimand. 
As a particular special
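A toy numerical sketch of the targeting idea for the treatment-specific mean may help: with a single binary covariate, simple empirical estimates stand in for the super-learning fits discussed above, and the TMLE fluctuation step updates the initial outcome fit on the logit scale using the clever covariate H = A/g(W). This is an illustrative simplification, not the paper's C-TMLE procedure.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50_000
W = rng.binomial(1, 0.5, n)                    # baseline covariate
A = rng.binomial(1, 0.5 + 0.2 * W)             # treatment depends on W
Y = rng.binomial(1, 0.2 + 0.3 * A + 0.2 * W)   # true E[Y(1)] = 0.6

# Initial estimates: stratified outcome means Q(1, W) and propensity g(W).
Q1 = np.array([Y[(A == 1) & (W == w)].mean() for w in (0, 1)])[W]
g = np.array([A[W == w].mean() for w in (0, 1)])[W]

# Targeting step: solve sum_i H_i * (Y_i - expit(logit(Q1_i) + eps * H_i)) = 0
# for the fluctuation parameter eps by Newton's method, with H = A / g(W).
H = A / g
logit = lambda p: np.log(p / (1 - p))
expit = lambda x: 1 / (1 + np.exp(-x))
eps = 0.0
for _ in range(20):
    Qs = expit(logit(Q1) + eps * H)
    score = np.sum(H * (Y - Qs))            # efficient-score equation
    fisher = np.sum(H**2 * Qs * (1 - Qs))   # negative of its derivative
    eps += score / fisher                   # Newton update

# Plug-in (substitution) estimator: evaluate the fluctuated fit at A = 1,
# where the clever covariate becomes 1 / g(W), and average over the sample.
psi = expit(logit(Q1) + eps / g).mean()
```

Because the updated fit solves the efficient influence curve equation, psi inherits the second-order bias property discussed above; in a real analysis Q and g would come from data-adaptive (super-learning) fits rather than stratified means.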

  13. SECOND TARGET STATION MODERATOR PERFORMANCE WITH A ROTATING TARGET

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Remec, Igor; Gallmeier, Franz X; Rennich, Mark J

    2016-01-01

    Oak Ridge National Laboratory manages and operates the Spallation Neutron Source and the High Flux Isotope Reactor, two of the world's most advanced neutron scattering facilities. Both facilities are funded by the U.S. Department of Energy, Office of Science, Office of Basic Energy Science, and are available to researchers from all over the world. Delivering cutting-edge science requires continuous improvements and development of the facilities and instruments. The SNS was designed from the outset to accommodate an additional target station, or Second Target Station (STS), and an upgraded accelerator feeding proton beams to the STS and the existing First Target Station (FTS). Upgrade of the accelerator and the design and construction of the STS are being proposed. The presently considered STS configuration is driven with short (<1 s) proton pulses at a 10 Hz repetition rate and 467 kW proton beam power, and is optimized for high-intensity and high-resolution long-wavelength neutron applications. The STS will allow installation of 22 beamlines and will expand and complement the current national neutron scattering capabilities. In 2015 the STS studies were performed for a compact tungsten target; first a stationary tungsten plate target was analyzed in considerable detail and then dropped in favor of a rotating target. For both target options, a proton beam footprint as small as acceptable from mechanical and heat-removal aspects is required to arrive at a compact-volume neutron production zone in the target, which is essential for tight coupling of target and moderators and for achieving high-intensity peak neutron fluxes. This paper will present recent STS work with the emphasis on neutronics and moderator performance.

  14. Down-regulation of miR-146a-5p and its potential targets in hepatocellular carcinoma validated by a TCGA- and GEO-based study.

    PubMed

    Zhang, Xin; Ye, Zhi-Hua; Liang, Hai-Wei; Ren, Fang-Hui; Li, Ping; Dang, Yi-Wu; Chen, Gang

    2017-04-01

    Our previous research has demonstrated that miR-146a-5p is down-regulated in hepatocellular carcinoma (HCC) and might play a tumor-suppressive role. In this study, we sought to validate the decreased expression with a larger cohort and to explore potential molecular mechanisms. GEO and TCGA databases were used to gather miR-146a-5p expression data in HCC, which included 762 HCC and 454 noncancerous liver tissues. A meta-analysis of the GEO-based microarrays, TCGA-based RNA-seq data, and additional qRT-PCR data validated the down-regulation of miR-146a-5p in HCC and no publication bias was observed. Integrated genes were generated by overlapping miR-146a-5p-related genes from predicted and formerly reported HCC-related genes using natural language processing. The overlaps were comprehensively analyzed to discover the potential gene signatures, regulatory pathways, and networks of miR-146a-5p in HCC. A total of 251 miR-146a-5p potential target genes were predicted by bioinformatics platforms and 104 genes were considered as both HCC- and miR-146a-5p-related overlaps. RAC1 was the most connected hub gene for miR-146a-5p and four pathways with high enrichment (VEGF signaling pathway, adherens junction, toll-like receptor signaling pathway, and neurotrophin signaling pathway) were denoted for the overlapped genes. The down-regulation of miR-146a-5p in HCC has been validated with the most complete data possible. The potential gene signatures, regulatory pathways, and networks identified for miR-146a-5p in HCC could prove useful for molecular-targeted diagnostics and therapeutics.

  15. TARGETED DELIVERY OF INHALED PROTEINS

    EPA Science Inventory

    ETD-02-047 (Martonen) GPRA # 10108

    TARGETED DELIVERY OF INHALED PROTEINS
    T. B. Martonen1, J. Schroeter2, Z. Zhang3, D. Hwang4, and J. S. Fleming5
    1Experimental Toxicology Division, National Health and Environmental Effects Research Laboratory, Research Triangle Park...

  16. Network testbed creation and validation

    DOEpatents

    Thai, Tan Q.; Urias, Vincent; Van Leeuwen, Brian P.; Watts, Kristopher K.; Sweeney, Andrew John

    2017-03-21

    Embodiments of network testbed creation and validation processes are described herein. A "network testbed" is a replicated environment used to validate a target network or an aspect of its design. Embodiments describe a network testbed that comprises virtual testbed nodes executed via a plurality of physical infrastructure nodes. The virtual testbed nodes utilize these hardware resources as a network "fabric," thereby enabling rapid configuration and reconfiguration of the virtual testbed nodes without requiring reconfiguration of the physical infrastructure nodes. Thus, in contrast to prior-art solutions, which require that a tester manually build an emulated environment of physically connected network devices, embodiments receive or derive a target network description and build out a replica of this description using virtual testbed nodes executed via the physical infrastructure nodes. This process allows for the creation of very large (e.g., tens of thousands of network elements) and/or very topologically complex test networks.

  17. An Optimized Transient Dual Luciferase Assay for Quantifying MicroRNA Directed Repression of Targeted Sequences

    PubMed Central

    Moyle, Richard L.; Carvalhais, Lilia C.; Pretorius, Lara-Simone; Nowak, Ekaterina; Subramaniam, Gayathery; Dalton-Morgan, Jessica; Schenk, Peer M.

    2017-01-01

    Studies investigating the action of small RNAs on computationally predicted target genes require some form of experimental validation. Classical molecular methods of validating microRNA action on target genes are laborious, while approaches that tag predicted target sequences to qualitative reporter genes encounter technical limitations. The aim of this study was to address the challenge of experimentally validating large numbers of computationally predicted microRNA-target transcript interactions using an optimized, quantitative, cost-effective, and scalable approach. The presented method combines transient expression via agroinfiltration of Nicotiana benthamiana leaves with a quantitative dual luciferase reporter system, where firefly luciferase is used to report the microRNA-target sequence interaction and Renilla luciferase is used as an internal standard to normalize expression between replicates. We report the appropriate concentration of N. benthamiana leaf extracts and dilution factor to apply in order to avoid inhibition of firefly LUC activity. Furthermore, the optimal ratio of microRNA precursor expression construct to reporter construct and duration of the incubation period post-agroinfiltration were determined. The optimized dual luciferase assay provides an efficient, repeatable and scalable method to validate and quantify microRNA action on predicted target sequences. The optimized assay was used to validate five predicted targets of rice microRNA miR529b, with as few as six technical replicates. The assay can be extended to assess other small RNA-target sequence interactions, including assessing the functionality of an artificial miRNA or an RNAi construct on a targeted sequence. PMID:28979287
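The normalization scheme described above reduces to a simple ratio calculation: firefly luciferase (FLUC) reports the microRNA-target interaction, Renilla luciferase (RLUC) serves as the internal standard, so each replicate is expressed as a FLUC/RLUC ratio and repression is measured relative to a control construct. All readings below are invented for illustration.

```python
# Hypothetical luminometer readings, (FLUC, RLUC) per technical replicate.
control = [(9800, 510), (10250, 495), (9600, 530)]  # empty-target control
target = [(3100, 505), (2900, 520), (3250, 498)]    # predicted miRNA target

def normalized(reads):
    """Mean FLUC/RLUC ratio across replicates (RLUC corrects for
    transformation efficiency and extract amount between replicates)."""
    ratios = [f / r for f, r in reads]
    return sum(ratios) / len(ratios)

# Fractional repression of the target construct relative to the control.
repression = 1 - normalized(target) / normalized(control)
```

With quantitative ratios like this, as few as six technical replicates suffice for statistical comparison across predicted targets, which is what makes the assay scalable.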

  18. Laboratory Design for Microbiological Safety

    PubMed Central

    Phillips, G. Briggs; Runkle, Robert S.

    1967-01-01

    Of the large amount of funds spent each year in this country on construction and remodeling of biomedical research facilities, a significant portion is directed to laboratories handling infectious microorganisms. This paper is intended for the scientific administrators, architects, and engineers concerned with the design of new microbiological facilities. It develops and explains the concept of primary and secondary barriers for the containment of microorganisms. The basic objectives of a microbiological research laboratory, (i) protection of the experimenter and staff, (ii) protection of the surrounding community, and (iii) maintenance of experimental validity, are defined. In the design of a new infectious-disease research laboratory, early identification should be made of the five functional zones of the facility and their relation to each other. The following five zones and design criteria applicable to each are discussed: clean and transition, research area, animal holding and research area, laboratory support, engineering support. The magnitude of equipment and design criteria which are necessary to integrate these five zones into an efficient and safe facility are delineated. PMID:4961771

  19. Potential Alternatives Report for Validation of Alternatives to Aliphatic Isocyanate Polyurethanes

    NASA Technical Reports Server (NTRS)

    Lewis, Pattie

    2011-01-01

    Identifying and selecting alternative materials and technologies that have the potential to reduce the identified HazMats and hazardous air pollutants (HAPs), while incorporating sound corrosion prevention and control technologies, is a complicated task due to the fast pace at which new technologies emerge and rules change. The alternatives are identified through literature searches, electronic database and Internet searches, surveys, and/or personal and professional contacts. Available test data was then compiled on the proposed alternatives to determine if the materials meet the test objectives or if further laboratory or field testing will be required. After reviewing technical information documented in the PAR, government representatives, technical representatives from the affected facilities, and other stakeholders involved in the process will select the list of viable alternative coatings for consideration and testing under the project's Joint Test Protocol entitled Joint Test Protocol for Validation of Alternatives to Aliphatic Isocyanate Polyurethanes and Field Test Plan entitled Field Evaluations Test Plan for Validation of Alternatives to Aliphatic Isocyanate Polyurethanes, both prepared by ITB. Test results will be reported in a Joint Test Report upon completion of testing. The selection rationale and conclusions are documented in this PAR. A cost-benefit analysis will be prepared to quantify the estimated capital and process costs of coating alternatives and cost savings relative to the current coating processes; however, some initial cost data has been included in this PAR. For this coatings project, isocyanates, as found in aliphatic isocyanate polyurethanes, were identified as the target HazMat to be eliminated. Table 1-1 lists the target HazMats, the related process and application, current specifications, and affected programs.

  20. Intra-laboratory validation of chronic bee paralysis virus quantitation using an accredited standardised real-time quantitative RT-PCR method.

    PubMed

    Blanchard, Philippe; Regnault, Julie; Schurr, Frank; Dubois, Eric; Ribière, Magali

    2012-03-01

    Chronic bee paralysis virus (CBPV) is responsible for chronic bee paralysis, an infectious and contagious disease in adult honey bees (Apis mellifera L.). A real-time RT-PCR assay to quantitate the CBPV load is now available. To propose this assay as a reference method, it was characterised further in an intra-laboratory study during which the reliability and the repeatability of results and the performance of the assay were confirmed. The qPCR assay alone and the whole quantitation method (from sample RNA extraction to analysis) were both assessed following the ISO/IEC 17025 standard and the recent XP U47-600 standard issued by the French Standards Institute. The performance of the qPCR assay and of the overall CBPV quantitation method were validated over a 6 log range from 10^2 to 10^8 with a detection limit of 50 and 100 CBPV RNA copies, respectively, and the protocol of the real-time RT-qPCR assay for CBPV quantitation was approved by the French Accreditation Committee. Copyright © 2011 Elsevier B.V. All rights reserved.
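    The standard-curve arithmetic behind this kind of absolute qPCR quantitation can be sketched as follows. This is a generic illustration, not the accredited assay's actual calibration: the dilution-series Cq values below are invented, and only the 10^2-10^8 range mirrors the record.

    ```python
    def fit_standard_curve(points):
        """Least-squares fit of Cq versus log10(copy number) for a dilution
        series. Returns (slope, intercept); an ideal assay has slope near -3.32."""
        n = len(points)
        mx = sum(x for x, _ in points) / n
        my = sum(y for _, y in points) / n
        slope = (sum((x - mx) * (y - my) for x, y in points)
                 / sum((x - mx) ** 2 for x, _ in points))
        return slope, my - slope * mx

    def amplification_efficiency(slope):
        """Efficiency from the slope: E = 10**(-1/slope) - 1 (1.0 means 100%)."""
        return 10 ** (-1.0 / slope) - 1

    def copies_from_cq(cq, slope, intercept):
        """Invert the standard curve to estimate copies in an unknown sample."""
        return 10 ** ((cq - intercept) / slope)

    # Hypothetical dilution series spanning 10^2-10^8: (log10 copies, observed Cq)
    standards = [(2, 34.00), (4, 27.36), (6, 20.72), (8, 14.08)]
    slope, intercept = fit_standard_curve(standards)
    print(round(copies_from_cq(24.04, slope, intercept)))  # ~100000 copies
    ```

    In practice the curve would be refit per run and checked against acceptance criteria (slope, efficiency, linearity) before quantifying unknowns.
    
    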

  1. Assessing Clinical Laboratory Quality: A College of American Pathologists Q-Probes Study of Prothrombin Time INR Structures, Processes, and Outcomes in 98 Laboratories.

    PubMed

    Howanitz, Peter J; Darcy, Theresa P; Meier, Frederick A; Bashleben, Christine P

    2015-09-01

    The anticoagulant warfarin has been identified as the second most frequent drug responsible for serious, disabling, and fatal adverse drug events in the United States, and its effect on blood coagulation is monitored by the laboratory test called the international normalized ratio (INR). To determine the presence of INR policies and procedures, INR practices, and completeness and timeliness of reporting critical INR results in participants' clinical laboratories. Participants reviewed their INR policies and procedure requirements, identified their practices by using a questionnaire, and studied completeness of documentation and timeliness of reporting critical value INR results for outpatients and emergency department patients. In 98 participating institutions, the 5 required policies and procedures were in place in 93% to 99% of clinical laboratories. Fifteen options for the allowable variation among duplicate results from different analyzers, 12 different timeliness goals for reporting critical values, and 18 unique critical value limits were used by participants. All required documentation elements were present in 94.8% of 192 reviewed INR validation reports. Critical value INR results were reported within the time frame established by the laboratory for 93.4% of 2604 results, but 1.0% of results were not reported at all. Although the median laboratory successfully communicated all critical results within its established time frame and had all the required validation elements in its 2 most recent INR validations, participants at the lowest 10th percentile met only 80.0% and 85.7% of these requirements, respectively. Significant opportunities exist for improving adherence to INR procedural requirements and for standardizing practice patterns and timeliness goals for reporting critical INR results.
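    For context, the INR itself is derived from the measured prothrombin time (PT), the laboratory's mean normal PT (MNPT), and the thromboplastin reagent's International Sensitivity Index (ISI). A minimal sketch of the standard formula, with illustrative values:

    ```python
    def inr(pt_seconds, mnpt_seconds, isi):
        """INR = (patient PT / mean normal PT) ** ISI."""
        return (pt_seconds / mnpt_seconds) ** isi

    # A patient PT twice the mean normal PT with an ISI of 1.0 gives INR 2.0
    print(inr(24.0, 12.0, 1.0))  # 2.0
    ```

    Because the ISI exponent differs between reagent/instrument combinations, two analyzers can produce slightly different INRs from the same specimen, which is why the study's "allowable variation among duplicate results from different analyzers" matters.
    
    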

  2. Total laboratory automation: Do stat tests still matter?

    PubMed

    Dolci, Alberto; Giavarina, Davide; Pasqualetti, Sara; Szőke, Dominika; Panteghini, Mauro

    2017-07-01

    During the past decades healthcare systems have changed rapidly, and today hospital care is primarily devoted to critical patients and acute treatments, for which laboratory test results are crucial and always need to be reported within a predictably short turnaround time (TAT). Laboratories in the hospital setting can face this challenge by changing their organization from a compartmentalized laboratory department toward a decision making-based laboratory department. This requires the implementation of a core laboratory that exploits total laboratory automation (TLA), using technological innovation in analytical platforms, track systems, and information technology, including middleware, together with a number of satellite specialized laboratory sections cooperating with care teams for specific medical conditions. In this laboratory department model, the short TAT for all first-line tests performed by TLA in the core laboratory represents the key paradigm: no stat testing is required because all samples are handled in real time and (auto)validated results are dispatched within a time that fulfills clinical needs. To optimally reach this goal, laboratories should be actively involved in managing all the steps of the total examination process, also speeding up extra-laboratory phases such as sample delivery. Furthermore, to warrant effectiveness and not only efficiency, all the processes, e.g. specimen integrity checks, should be managed by middleware through a predefined set of rules defined in light of clinical governance. Crown Copyright © 2017. Published by Elsevier Inc. All rights reserved.
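    A middleware autovalidation rule set of the kind described might look like the sketch below. The rule thresholds, the result-dict schema, and the function name are all hypothetical, invented for illustration; real middleware rules are configured per analyte under the laboratory's clinical governance.

    ```python
    def autovalidate(result, previous=None):
        """Decide whether a result can be released without manual review.
        `result` is a dict with keys value, low, high, and optionally
        hemolysis_index (assumed schema for this sketch).
        Returns True to auto-release, False to hold for a technologist."""
        if result.get("hemolysis_index", 0) > 2:   # specimen integrity check
            return False
        if not (result["low"] <= result["value"] <= result["high"]):  # range check
            return False
        if previous is not None:                   # delta check against prior result
            if abs(result["value"] - previous) / previous > 0.5:
                return False
        return True

    # A sodium of 140 mmol/L within range and close to the prior value of 138
    print(autovalidate({"value": 140, "low": 135, "high": 145}, previous=138))  # True
    ```

    The point of such rules is that every first-line result passes the same checks in real time, so a separate "stat" pathway adds nothing.
    
    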

  3. The science of laboratory and project management in regulated bioanalysis.

    PubMed

    Unger, Steve; Lloyd, Thomas; Tan, Melvin; Hou, Jingguo; Wells, Edward

    2014-05-01

    Pharmaceutical drug development is a complex and lengthy process, requiring excellent project and laboratory management skills. Bioanalysis anchors drug safety and efficacy with systemic and site of action exposures. Development of scientific talent and a willingness to innovate or adopt new technology is essential. Taking unnecessary risks, however, should be avoided. Scientists must strategically assess all risks and find means to minimize or negate them. Laboratory Managers must keep abreast of ever-changing technology. Investments in instrumentation and laboratory design are critical catalysts to efficiency and safety. Matrix management requires regular communication between Project Managers and Laboratory Managers. When properly executed, it aligns the best resources at the right times for a successful outcome. Attention to detail is a critical aspect that separates excellent laboratories. Each assay is unique and requires attention in its development, validation and execution. Methods, training and facilities are the foundation of a bioanalytical laboratory.

  4. Laboratory Measurement Implications of Decreasing Childhood Blood Lead Levels

    PubMed Central

    Caldwell, Kathleen L.; Cheng, Po-Yung; Jarrett, Jeffery M.; Makhmudov, Amir; Vance, Kathryn; Ward, Cynthia D.; Jones, Robert L.; Mortensen, Mary E.

    2017-01-01

    In 2012, the Centers for Disease Control and Prevention (CDC) adopted its Advisory Committee on Childhood Lead Poisoning Prevention (ACCLPP) recommendation to use a population-based reference value to identify children and environments associated with lead hazards. The current reference value of 5 μg/dL is calculated as the 97.5th percentile of the distribution of blood lead levels (BLL) in children one to five years old from 2007–2010 National Health and Nutrition Examination Survey (NHANES) data. We calculated and updated selected percentiles, including the 97.5th percentile, using NHANES 2011–2014 blood lead data and examined demographic characteristics of children whose blood lead was ≥90th percentile value. The 97.5% percentile BLL of 3.48 μg/dL highlighted analytical laboratory and clinical interpretation challenges of blood lead measurements ≤ 5 μg/dL. Review of five years of results for target blood lead values < 11 μg/dL for U.S. clinical laboratories participating in CDC’s voluntary Lead and Multi-Element Proficiency (LAMP) quality assurance program showed 40% unable to quantify and reported a non-detectable result at a target blood lead value of 1.48 μg/dL compared 5.5 % at a target blood lead of 4.60 μg/dL. We describe actions taken at CDC’s Environmental Health Laboratory in the Division of Laboratory Sciences, which measures blood lead for NHANES, to improve analytical accuracy and precision and to reduce external lead contamination during blood collection and analysis. PMID:28771411
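    The reference-value arithmetic is a percentile computation over the surveyed BLL distribution. A simplified unweighted sketch is below; note that the actual NHANES estimate applies survey sampling weights, which this version omits, and the data values are invented.

    ```python
    def percentile(data, p):
        """Linear-interpolation percentile (0 <= p <= 100) of an unweighted sample."""
        xs = sorted(data)
        rank = (len(xs) - 1) * p / 100.0
        lo = int(rank)
        frac = rank - lo
        if lo + 1 < len(xs):
            return xs[lo] + frac * (xs[lo + 1] - xs[lo])
        return xs[lo]

    # Hypothetical BLLs (ug/dL) for a handful of children
    bll = [0.5, 0.7, 0.9, 1.1, 1.3, 1.6, 2.0, 2.6, 3.1, 4.2]
    print(round(percentile(bll, 97.5), 4))
    ```

    With most of the distribution now well below 5 μg/dL, the 97.5th percentile is driven by values near the analytical detection limit, which is exactly the measurement challenge the record describes.
    
    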

  5. Cold Agglutinin Disease; A Laboratory Challenge.

    PubMed

    Nikousefat, Zahra; Javdani, Moosa; Hashemnia, Mohammad; Haratyan, Abbas; Jalili, Ali

    2015-10-01

    Autoimmune haemolytic anemia (AIHA) is a complex process characterized by an immune reaction against red blood cell self-antigens. The analysis of specimens drawn from patients with cold autoimmune hemolytic anemia is a difficult problem for automated hematology analyzers. This paper was written to alert technologists and pathologists to the presence of cold agglutinins and their effect on laboratory tests. A 72-year-old female presented to the Shafa laboratory for hematology profile evaluation. CBC indices showed invalid findings with the Sysmex automated hematology analyzer. Checking the laboratory process revealed precipitation residue sticking to the sides of the tube. After the tubes were warmed, results became valid and the problem was attributed to cold agglutinin disease. In this situation, aggregation of RBCs, which occurs at temperatures below 30°C, causes invalid findings when samples are run on automated hematology analyzers. Knowledge of this phenomenon can prevent wasted time and enable an early and accurate diagnosis.

  6. Meta-audit of laboratory ISO accreditation inspections: measuring the old emperor's clothes.

    PubMed

    Wilson, Ian G; Smye, Michael; Wallace, Ian J C

    2016-02-01

    Accreditation to ISO/IEC 17025 is required for EC official food control and veterinary laboratories by Regulation (EC) No. 882/2004. Measurements in hospital laboratories and clinics are increasingly accredited to ISO/IEC 15189. Both of these management standards arose from command and control military standards for factory inspection during World War II. They rely on auditing of compliance and have not been validated internally as assessment bodies require of those they accredit. Neither have they been validated to criteria outside their own ideology such as the Cochrane principles of evidence-based medicine which might establish whether any benefit exceeds their cost. We undertook a retrospective meta-audit over 14 years of internal and external laboratory audits that checked compliance with ISO 17025 in a public health laboratory. Most noncompliances arose solely from clauses in the standard and would not affect users. No effect was likely from 91% of these. Fewer than 1% of noncompliances were likely to have consequences for the validity of results or quality of service. The ISO system of compliance auditing has the performance characteristics of a poor screening test. It adds substantially to costs and generates more noise (false positives) than informative signal. Ethical use of resources indicates that management standards should not be used unless proven to deliver the efficacy, effectiveness, and value required of modern healthcare interventions. © 2015 The Authors. MicrobiologyOpen published by John Wiley & Sons Ltd.

  7. Using Rutherford Backscattering Spectroscopy to Characterize Targets for MTW

    NASA Astrophysics Data System (ADS)

    Brown, Gunnar; Stockler, Barak; Ward, Ryan; Freeman, Charlie; Padalino, Stephen; Stillman, Collin; Ivancic, Steven; Reagan, S. P.; Sangster, T. C.

    2017-10-01

    A study is underway to determine the composition and thickness of targets used at the Multiterawatt (MTW) laser facility at the Laboratory for Laser Energetics (LLE) using Rutherford backscattering spectroscopy (RBS). In RBS, an ion beam is incident on a sample and the scattered ions are detected with a surface barrier detector. The resulting energy spectra of the scattered ions can be analyzed to determine important parameters of the target including elemental composition and thickness. Proton, helium and deuterium beams from the 1.7 MV Pelletron accelerator at SUNY Geneseo have been used to characterize several different targets for MTW, including CH and aluminum foils of varying thickness. RBS spectra were also obtained for a cylindrical iron buried-layer target with aluminum dopant which was mounted on a silicon carbide stalk. The computer program SIMNRA is used to analyze the spectra. This work was funded in part by a Grant from the DOE through the Laboratory for Laser Energetics.
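    The mass sensitivity that RBS exploits comes from the kinematic factor K, the ratio of scattered to incident ion energy for an elastic two-body collision. A sketch of the standard formula follows; the beam and target masses below are only examples, not parameters from the MTW study.

    ```python
    import math

    def kinematic_factor(m1, m2, theta_deg):
        """K = E_scattered / E_incident for a projectile of mass m1 elastically
        scattered from a target atom of mass m2 at laboratory angle theta."""
        t = math.radians(theta_deg)
        root = math.sqrt(m2 ** 2 - (m1 * math.sin(t)) ** 2)
        return ((m1 * math.cos(t) + root) / (m1 + m2)) ** 2

    # 4He backscattered at 180 degrees from 28Si: K reduces to ((m2-m1)/(m1+m2))**2
    print(kinematic_factor(4.0, 28.0, 180.0))  # 0.5625
    ```

    Because K grows with target mass, heavier elements return ions at higher energies, letting a fit program such as SIMNRA assign spectrum features to elements and layer thicknesses.
    
    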

  8. Systematic approach identifies RHOA as a potential biomarker therapeutic target for Asian gastric cancer.

    PubMed

    Chang, Hae Ryung; Nam, Seungyoon; Lee, Jinhyuk; Kim, Jin-Hee; Jung, Hae Rim; Park, Hee Seo; Park, Sungjin; Ahn, Young Zoo; Huh, Iksoo; Balch, Curt; Ku, Ja-Lok; Powis, Garth; Park, Taesung; Jeong, Jin-Hyun; Kim, Yon Hui

    2016-12-06

    Gastric cancer (GC) is a highly heterogeneous disease, in dire need of specific, biomarker-driven cancer therapies. While the accumulation of cancer "Big Data" has propelled the search for novel molecular targets for GC, the specific subpathways and cellular functions involved vary from patient to patient. In particular, mutations in the small GTPase gene RHOA have been identified in recent genome-wide sequencing of GC tumors. Moreover, protein overexpression of RHOA was reported in Chinese populations, while RHOA mutations were found in Caucasian GC tumors. To develop evidence-based precision medicine for heterogeneous cancers, we established a systematic approach to integrate transcriptomic and genomic data. Predicted signaling subpathways were then laboratory-validated both in vitro and in vivo, resulting in the identification of new candidate therapeutic targets. Here, we show: i) differences in RHOA expression patterns, and its pathway activity, between Asian and Caucasian GC tumors; ii) that in vitro and in vivo perturbation of RHOA expression inhibits GC cell growth in high RHOA-expressing cell lines; iii) an inverse correlation between RHOA and RHOB expression; and iv) an innovative small molecule design strategy for RHOA inhibitors. In summary, RHOA, and its oncogenic signaling pathway, represent a strong biomarker-driven therapeutic target for Asian GC. This comprehensive strategy represents a promising approach for the development of "hit" compounds.

  9. Gene Polymorphism Studies in a Teaching Laboratory

    ERIC Educational Resources Information Center

    Shultz, Jeffry

    2009-01-01

    I present a laboratory procedure for illustrating transcription, post-transcriptional modification, gene conservation, and comparative genetics for use in undergraduate biology education. Students are individually assigned genes in a targeted biochemical pathway, for which they design and test polymerase chain reaction (PCR) primers. In this…

  10. Isotope production and target preparation for nuclear astrophysics data

    NASA Astrophysics Data System (ADS)

    Schumann, Dorothea; Dressler, Rugard; Maugeri, Emilio Andrea; Heinitz, Stephan

    2017-09-01

    Targets are in many cases an indispensable ingredient for successful experiments aimed at producing nuclear data. With the recently observed shift toward studying nuclear reactions on radioactive targets, this task can become extremely challenging. Concerted actions by the laboratories able to produce isotopes and manufacture radioactive targets are urgently needed. We present here some examples of successful isotope and target production at PSI, in particular the production of 60Fe samples used for half-life measurements and neutron capture cross section experiments, the chemical processing and fabrication of lanthanide targets for capture cross section experiments at n_TOF (European Organization for Nuclear Research (CERN), Switzerland), as well as the recently performed manufacturing of highly radioactive 7Be targets for the measurement of the 7Be(n,α)4He cross section in the energy range of interest for Big-Bang nucleosynthesis, contributing to the solution of the cosmological lithium problem. Two future projects, "Determination of the half-life and experiments on neutron capture cross sections of 53Mn" and "32Si - a new chronometer for nuclear dating", are briefly described. Moreover, we propose to work on the establishment of a dedicated network of isotope and target producing laboratories.

  11. Reliability and Validity of Ambulatory Cognitive Assessments

    PubMed Central

    Sliwinski, Martin J.; Mogle, Jacqueline A.; Hyun, Jinshil; Munoz, Elizabeth; Smyth, Joshua M.; Lipton, Richard B.

    2017-01-01

    Mobile technologies are increasingly used to measure cognitive function outside of traditional clinic and laboratory settings. Although ambulatory assessments of cognitive function conducted in people’s natural environments offer potential advantages over traditional assessment approaches, the psychometrics of cognitive assessment procedures have been understudied. We evaluated the reliability and construct validity of ambulatory assessments of working memory and perceptual speed administered via smartphones as part of an ecological momentary assessment (EMA) protocol in a diverse adult sample (N=219). Results indicated excellent between-person reliability (≥.97) for average scores, and evidence of reliable within-person variability across measurement occasions (.41–.53). The ambulatory tasks also exhibited construct validity, as evidenced by their loadings on working memory and perceptual speed factors defined by the in-lab assessments. Our findings demonstrate that averaging across brief cognitive assessments made in uncontrolled naturalistic settings provides measurements that are comparable in reliability to assessments made in controlled laboratory environments. PMID:27084835
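    The gain in reliability from averaging over many brief assessments is described by the Spearman-Brown prophecy formula. The illustration below uses an assumed single-occasion reliability; it is not a value taken from the study.

    ```python
    def spearman_brown(r_single, k):
        """Reliability of the mean of k parallel measurements, each with
        single-occasion reliability r_single."""
        return k * r_single / (1 + (k - 1) * r_single)

    # Modest per-occasion reliability compounds quickly when averaged:
    # 40 occasions at r = 0.5 already exceed .97
    print(round(spearman_brown(0.5, 40), 3))  # 0.976
    ```

    This is why noisy one-minute smartphone tasks, averaged across an EMA protocol, can match the between-person reliability of a controlled lab session.
    
    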

  12. New virtual laboratories presenting advanced motion control concepts

    NASA Astrophysics Data System (ADS)

    Goubej, Martin; Krejčí, Alois; Reitinger, Jan

    2015-11-01

    The paper deals with the development of a software framework for rapid generation of remote virtual laboratories. A client-server architecture is chosen in order to employ a real-time simulation core running on a dedicated server. An ordinary web browser is used as the final renderer to achieve a hardware-independent solution that can run on different target platforms including laptops, tablets and mobile phones. The provided toolchain allows automatic generation of the virtual laboratory source code from a configuration file created in the open-source Inkscape graphic editor. Three virtual laboratories presenting advanced motion control algorithms have been developed, showing the applicability of the proposed approach.

  13. Method for the determination of catechin and epicatechin enantiomers in cocoa-based ingredients and products by high-performance liquid chromatography: single-laboratory validation.

    PubMed

    Machonis, Philip R; Jones, Matthew A; Schaneberg, Brian T; Kwik-Uribe, Catherine L

    2012-01-01

    A single-laboratory validation study was performed for an HPLC method to identify and quantify the flavanol enantiomers (+)- and (-)-epicatechin and (+)- and (-)-catechin in cocoa-based ingredients and products. These compounds were eluted isocratically with an ammonium acetate-methanol mobile phase applied to a modified beta-cyclodextrin chiral stationary phase and detected using fluorescence. Spike recovery experiments using appropriate matrix blanks, along with cocoa extract, cocoa powder, and dark chocolate, were used to evaluate accuracy, repeatability, specificity, LOD, LOQ, and linearity of the method as performed by a single analyst on multiple days. In all samples analyzed, (-)-epicatechin was the predominant flavanol, representing 68-91% of the total monomeric flavanols detected. For the cocoa-based products, within-day (intraday) precision was 1.46-3.22% for (-)-epicatechin, 3.66-6.90% for (+)-catechin, and 1.69-6.89% for (-)-catechin; (+)-epicatechin was not detected in these samples. Recoveries for the three sample types investigated ranged from 82.2 to 102.1% at the 50% spiking level, 83.7 to 102.0% at the 100% spiking level, and 80.4 to 101.1% at the 200% spiking level. Based on these performance results, the method may be suitable for routine laboratory use in the analysis of cocoa-based ingredients and products.
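    The spike-recovery figures reduce to a simple ratio at each spiking level. A sketch with invented numbers (not values from the study):

    ```python
    def percent_recovery(measured_spiked, measured_unspiked, amount_added):
        """Recovery (%) = (analyte found in spiked sample - found in the
        unspiked sample) / amount added * 100."""
        return (measured_spiked - measured_unspiked) / amount_added * 100.0

    # 100% spiking level: sample contains 2.00 mg/g (-)-epicatechin,
    # 2.00 mg/g is added, and 3.96 mg/g is measured in the spiked sample
    print(round(percent_recovery(3.96, 2.00, 2.00), 1))  # 98.0
    ```

    Validation guidelines typically require recoveries within a band around 100% (the study's 80-102% range is of that kind) at several spiking levels bracketing the expected concentration.
    
    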

  14. Transferability and inter-laboratory variability assessment of the in vitro bovine oocyte fertilization test.

    PubMed

    Tessaro, Irene; Modina, Silvia C; Crotti, Gabriella; Franciosi, Federica; Colleoni, Silvia; Lodde, Valentina; Galli, Cesare; Lazzari, Giovanna; Luciano, Alberto M

    2015-01-01

    The dramatic increase in the number of animals required for reproductive toxicity testing imposes the validation of alternative methods to reduce the use of laboratory animals. As we previously demonstrated for the in vitro maturation test of bovine oocytes, the present study describes the transferability assessment and the inter-laboratory variability of an in vitro test able to identify chemical effects during the process of bovine oocyte fertilization. Eight chemicals with well-known toxic properties (benzo[a]pyrene, busulfan, cadmium chloride, cycloheximide, diethylstilbestrol, ketoconazole, methylacetoacetate, mifepristone/RU-486) were tested in two well-trained laboratories. The statistical analysis demonstrated no differences in the EC50 values for each chemical in either the within-laboratory (inter-run) or the between-laboratory variability assessment of the proposed test. We therefore conclude that the bovine in vitro fertilization test could advance toward the validation process as an alternative in vitro method and become part of an integrated testing strategy to predict chemical hazards to mammalian fertility. Copyright © 2015 Elsevier Inc. All rights reserved.

  15. Validation of streamflow measurements made with acoustic doppler current profilers

    USGS Publications Warehouse

    Oberg, K.; Mueller, D.S.

    2007-01-01

    The U.S. Geological Survey and other international agencies have collaborated to conduct laboratory and field validations of acoustic Doppler current profiler (ADCP) measurements of streamflow. Laboratory validations made in a large towing basin show that the mean differences between tow cart velocity and ADCP bottom-track and water-track velocities were -0.51 and -1.10%, respectively. Field validations of commercially available ADCPs were conducted by comparing streamflow measurements made with ADCPs to reference streamflow measurements obtained from concurrent mechanical current-meter measurements, stable rating curves, salt-dilution measurements, or acoustic velocity meters. Data from 1,032 transects, comprising 100 discharge measurements, were analyzed from 22 sites in the United States, Canada, Sweden, and The Netherlands. Results of these analyses show that broadband ADCP streamflow measurements are unbiased when compared to the reference discharges regardless of the water mode used for making the measurement. Measurement duration is more important than the number of transects for reducing the uncertainty of the ADCP streamflow measurement. © 2007 ASCE.
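    The bias figures quoted in such validations are mean percent differences of the instrument against a reference. A minimal sketch with made-up paired discharge values (not data from the study):

    ```python
    def mean_percent_difference(measured, reference):
        """Mean of 100*(measured - reference)/reference over paired observations."""
        diffs = [100.0 * (m - r) / r for m, r in zip(measured, reference)]
        return sum(diffs) / len(diffs)

    adcp_q = [98.9, 101.2, 99.5, 100.4]   # hypothetical ADCP discharges, m^3/s
    ref_q = [100.0, 100.0, 100.0, 100.0]  # concurrent reference discharges
    print(round(mean_percent_difference(adcp_q, ref_q), 2))
    ```

    A mean near zero across many paired measurements is what supports the "unbiased" conclusion; the spread of the individual differences, and how it shrinks with longer measurement duration, addresses the uncertainty.
    
    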

  16. Development of RNAi Libraries for Target Validation and Therapeutics

    DTIC Science & Technology

    2006-03-01

    met using a hammerhead ribozyme transgene reduces in vitro invasion and migration in prostate cancer cells. Prostate, 60: 317-324, 2004. ... Watkins, G., Mason, M.D., Jiang, W.G. (2004) Targeting the HGF/SF receptor c-met using a hammerhead ribozyme transgene reduces in vitro invasion ... reduction of tumor growth (27). In addition, when c-met was knocked down using ribozymes, cell invasion and metastasis were inhibited both in vitro

  17. Hydraulic Hybrid and Conventional Parcel Delivery Vehicles' Measured Laboratory Fuel Economy on Targeted Drive Cycles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lammert, M. P.; Burton, J.; Sindler, P.

    2014-10-01

    This research project compares laboratory-measured fuel economy of a medium-duty diesel powered hydraulic hybrid vehicle drivetrain to both a conventional diesel drivetrain and a conventional gasoline drivetrain in a typical commercial parcel delivery application. Vehicles in this study included a model year 2012 Freightliner P100H hybrid compared to a 2012 conventional gasoline P100 and a 2012 conventional diesel parcel delivery van of similar specifications. Drive cycle analysis of 484 days of hybrid parcel delivery van commercial operation from multiple vehicles was used to select three standard laboratory drive cycles as well as to create a custom representative cycle. These four cycles encompass and bracket the range of real-world in-use data observed in Baltimore United Parcel Service operations. The NY Composite cycle, the City Suburban Heavy Vehicle cycle, and the California Air Resources Board Heavy Heavy-Duty Diesel Truck (HHDDT) cycle, as well as a custom Baltimore parcel delivery cycle, were tested at the National Renewable Energy Laboratory's Renewable Fuels and Lubricants Laboratory. Fuel consumption was measured and analyzed for all three vehicles. Vehicle laboratory results are compared on the basis of fuel economy. The hydraulic hybrid parcel delivery van demonstrated 19%-52% better fuel economy than the conventional diesel parcel delivery van and 30%-56% better fuel economy than the conventional gasoline parcel delivery van on cycles other than the highway-oriented HHDDT cycle.

  18. How reliable are ligand-centric methods for Target Fishing?

    NASA Astrophysics Data System (ADS)

    Peon, Antonio; Dang, Cuong; Ballester, Pedro

    2016-04-01

    Computational methods for Target Fishing (TF), also known as Target Prediction or Polypharmacology Prediction, can be used to discover new targets for small-molecule drugs. This may result in repositioning the drug in a new indication or improving our current understanding of its efficacy and side effects. While there is a substantial body of research on TF methods, there is still a need to improve their validation, which is often limited to a small part of the available targets and not easily interpretable by the user. Here we discuss how target-centric TF methods are inherently limited by the number of targets that they can possibly predict (this number is by construction much larger in ligand-centric techniques). We also propose a new benchmark to validate TF methods, which is particularly suited to analysing how predictive performance varies with the query molecule. On average over approved drugs, we estimate that only five predicted targets would have to be tested to find two true targets with submicromolar potency (a strong variability in performance is, however, observed). In addition, we find that an approved drug currently has an average of eight known targets, which reinforces the notion that polypharmacology is a common and strong phenomenon. Furthermore, with the assistance of a control group of randomly selected molecules, we show that the targets of approved drugs are generally harder to predict.
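    The "five predicted targets to find two true targets" figure corresponds to scanning a ranked prediction list until a given number of hits has accumulated. A sketch of that bookkeeping, with an invented ranked list:

    ```python
    def predictions_needed(ranked_hits, k):
        """Number of top-ranked predictions that must be tested to confirm k
        true targets. `ranked_hits` is a list of booleans in rank order
        (True = the prediction turned out to be a real target).
        Returns None if k hits are never reached."""
        found = 0
        for tested, is_hit in enumerate(ranked_hits, start=1):
            found += is_hit
            if found == k:
                return tested
        return None

    # Hypothetical outcome for one query drug: true targets at ranks 2 and 5
    print(predictions_needed([False, True, False, False, True, True], 2))  # 5
    ```

    Averaging this count over many query molecules yields the benchmark statistic the record reports, and its spread across queries captures the "strong variability in performance" the authors note.
    
    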

  19. [Reducing the use of laboratory animals].

    PubMed

    Claude, Nancy

    2009-11-01

    Since 1959, when Russell and Burch formulated the 3Rs principle (Replace, Reduce, Refine), the scientific community has been attempting to reduce the use of laboratory animals for research purposes. Current regulatory guidelines take this principle into account. Thanks to scientific and technical progress, and advances in bioinformatics, new tools are now available that reduce the need for laboratory animals, albeit without totally replacing them. Implementation of the International Conference on Harmonization recommendations in 1990 represented a major step forward, notably by helping to avoid duplication of studies using laboratory animals. The use of animals for cosmetics testing is now forbidden in the European Union. Although new in vitro and in silico models remain to be validated, they are proving particularly useful during the early stages of product development, by avoiding experimental studies of chemicals that are ineffective or excessively toxic. The success of these measures is reflected in the results of a European study showing a fall, between 1996 and 2005, in the number of laboratory animals used for research and development, despite a large increase in overall research activities. The challenge for the next decade is to amplify this trend.

  20. A pan-European ring trial to validate an International Standard for detection of Vibrio cholerae, Vibrio parahaemolyticus and Vibrio vulnificus in seafoods.

    PubMed

    Hartnell, R E; Stockley, L; Keay, W; Rosec, J-P; Hervio-Heath, D; Van den Berg, H; Leoni, F; Ottaviani, D; Henigman, U; Denayer, S; Serbruyns, B; Georgsson, F; Krumova-Valcheva, G; Gyurova, E; Blanco, C; Copin, S; Strauch, E; Wieczorek, K; Lopatek, M; Britova, A; Hardouin, G; Lombard, B; In't Veld, P; Leclercq, A; Baker-Austin, C

    2018-02-10

    Globally, vibrios represent an important and well-established group of bacterial foodborne pathogens. The European Commission (EC) mandated the Comité Européen de Normalisation (CEN) to undertake work to provide validation data for 15 methods in microbiology to support EC legislation. As part of this mandated work programme, merging of ISO/TS 21872-1:2007, which specifies a horizontal method for the detection of V. parahaemolyticus and V. cholerae, and ISO/TS 21872-2:2007, a similar horizontal method for the detection of potentially pathogenic vibrios other than V. cholerae and V. parahaemolyticus, was proposed. Both parts of ISO/TS 21872 utilized classical culture-based isolation techniques coupled with biochemical confirmation steps. The work also considered simplification of the biochemical confirmation steps. In addition, because of advances in molecular methods for identification of human pathogenic Vibrio spp., classical and real-time PCR options were also included within the scope of the validation. These considerations formed the basis of a multi-laboratory validation study with the aim of improving the precision of this ISO technical specification and providing a single ISO standard method to enable detection of these important foodborne Vibrio spp. To achieve this aim, an international validation study involving 13 laboratories from 9 countries in Europe was conducted in 2013. The results of this validation have enabled integration of the two existing technical specifications targeting the detection of the major foodborne Vibrio spp., simplification of the suite of recommended biochemical identification tests, and the introduction of molecular procedures that provide both species-level identification and discrimination of putatively pathogenic strains of V. parahaemolyticus through determination of the presence of the thermostable direct and thermostable direct-related haemolysins.
The method performance characteristics generated in this study have been included in revised