Sample records for qc-6plus quality control

  1. AfterQC: automatic filtering, trimming, error removing and quality control for fastq data.

    PubMed

    Chen, Shifu; Huang, Tanxiao; Zhou, Yanqing; Han, Yue; Xu, Mingyan; Gu, Jia

    2017-03-14

    Some applications, especially clinical applications requiring highly accurate sequencing data, have to contend with the problems caused by unavoidable sequencing errors. Several tools have been proposed to profile sequencing quality, but few of them can quantify or correct the sequencing errors. This unmet requirement motivated us to develop AfterQC, a tool with functions to profile sequencing errors and correct most of them, plus highly automated quality control and data-filtering features. Unlike most tools, AfterQC analyses the overlapping of paired sequences for paired-end sequencing data. Based on overlapping analysis, AfterQC can detect and cut adapters, and it further provides a novel function to correct erroneous bases in the overlapping regions. Another new feature is the detection and visualisation of sequencing bubbles, which are commonly found on flow-cell lanes and may cause sequencing errors. Besides normal per-cycle quality and base-content plotting, AfterQC also provides features such as polyX (a long sub-sequence of the same base X) filtering, automatic trimming and k-mer-based strand-bias profiling. For each single FastQ file or pair of FastQ files, AfterQC filters out bad reads, detects and eliminates the sequencer's bubble effects, trims reads at front and tail, detects sequencing errors and corrects part of them, and finally outputs clean data and generates HTML reports with interactive figures. AfterQC can run in batch mode with multiprocessing support; it can run on a single FastQ file, a single pair of FastQ files (for paired-end sequencing), or a folder of FastQ files to be processed automatically. Based on overlapping analysis, AfterQC can estimate the sequencing error rate and profile the error transform distribution. The results of our error-profiling tests show that the error distribution is highly platform dependent. Much more than just another new quality control (QC) tool, AfterQC is able to perform quality control, data…
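
    As a rough illustration of the overlap analysis described above, the sketch below slides the reverse-complemented mate along read 1 to find the longest near-exact tail/head overlap. Function names and thresholds are illustrative, not AfterQC's actual implementation.

```python
def revcomp(seq):
    """Reverse-complement a DNA sequence."""
    comp = {"A": "T", "T": "A", "C": "G", "G": "C", "N": "N"}
    return "".join(comp[b] for b in reversed(seq))

def find_overlap(read1, read2, min_len=6, max_mismatch=1):
    """Slide the reverse-complemented mate along read1 and return the
    longest tail/head overlap (length, mismatches) with at most
    max_mismatch mismatches; (0, 0) if no usable overlap is found."""
    rc2 = revcomp(read2)
    for length in range(min(len(read1), len(rc2)), min_len - 1, -1):
        tail, head = read1[-length:], rc2[:length]
        mismatches = sum(a != b for a, b in zip(tail, head))
        if mismatches <= max_mismatch:
            return length, mismatches
    return 0, 0
```

    Once such an overlap is located, mismatching positions inside it are candidates for correction, typically by keeping the base with the higher quality score.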

  2. ChronQC: a quality control monitoring system for clinical next generation sequencing.

    PubMed

    Tawari, Nilesh R; Seow, Justine Jia Wen; Perumal, Dharuman; Ow, Jack L; Ang, Shimin; Devasia, Arun George; Ng, Pauline C

    2018-05-15

    ChronQC is a quality control (QC) tracking system for clinical implementation of next-generation sequencing (NGS). ChronQC generates time series plots for various QC metrics to allow comparison of current runs to historical runs. ChronQC has multiple features for tracking QC data including Westgard rules for clinical validity, laboratory-defined thresholds and historical observations within a specified time period. Users can record their notes and corrective actions directly onto the plots for long-term recordkeeping. ChronQC facilitates regular monitoring of clinical NGS to enable adherence to high quality clinical standards. ChronQC is freely available on GitHub (https://github.com/nilesh-tawari/ChronQC), Docker (https://hub.docker.com/r/nileshtawari/chronqc/) and the Python Package Index. ChronQC is implemented in Python and runs on all common operating systems (Windows, Linux and Mac OS X). tawari.nilesh@gmail.com or pauline.c.ng@gmail.com. Supplementary data are available at Bioinformatics online.
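
    Two of the Westgard rules that systems like ChronQC track, 1-3s (one point beyond ±3 SD) and 2-2s (two consecutive points beyond the same ±2 SD limit), can be sketched in a few lines. This is a generic illustration against a historical mean and SD, not ChronQC's own code.

```python
def westgard_flags(values, mean, sd):
    """Flag Westgard 1-3s and 2-2s violations against a historical
    mean and standard deviation; returns (index, rule) pairs."""
    z = [(v - mean) / sd for v in values]
    flags = []
    for i, zi in enumerate(z):
        if abs(zi) > 3:                       # 1-3s: single gross outlier
            flags.append((i, "1-3s"))
        if i > 0 and z[i - 1] > 2 and zi > 2:  # 2-2s: systematic high shift
            flags.append((i, "2-2s"))
        if i > 0 and z[i - 1] < -2 and zi < -2:  # 2-2s: systematic low shift
            flags.append((i, "2-2s"))
    return flags
```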

  3. Quality Control (QC) System Development for the Pell Grant Program: A Conceptual Framework.

    ERIC Educational Resources Information Center

    Advanced Technology, Inc., Reston, VA.

    The objectives of the Pell Grant quality control (QC) system and the general definition of QC are considered. Attention is also directed to: the objectives of the Stage II Pell Grant QC system design and testing project, the approach used to develop the QC system, and the interface of the QC system and the Pell Grant delivery system. The…

  4. Application of clinical assay quality control (QC) to multivariate proteomics data: a workflow exemplified by 2-DE QC.

    PubMed

    Jackson, David; Bramwell, David

    2013-12-16

    Proteomics technologies can be effective for the discovery and assay of protein forms altered with disease. However, few examples of successful biomarker discovery yet exist. Critical to addressing this is the widespread implementation of appropriate QC (quality control) methodology. Such QC should combine the rigour of clinical laboratory assays with a suitable treatment of the complexity of the proteome by targeting separate assignable causes of variation. We demonstrate an approach, metric and example workflow for users to develop such targeted QC rules systematically and objectively, using a publicly available plasma DIGE data set. Hierarchical clustering analysis of standard channels is first used to discover correlated groups of features corresponding to specific assignable sources of technical variation. These effects are then quantified using a statistical distance metric, and followed on control charts. This allows measurement of process drift and the detection of runs that outlie for any given effect. A known technical issue on originally rejected gels was detected validating this approach, and relevant novel effects were also detected and classified effectively. Our approach was effective for 2-DE QC. Whilst we demonstrated this in a retrospective DIGE experiment, the principles would apply to ongoing QC and other proteomic technologies. This work asserts that properly carried out QC is essential to proteomics discovery experiments. Its significance is that it provides one possible novel framework for applying such methods, with a particular consideration of how to handle the complexity of the proteome. It not only focusses on 2DE-based methodology but also demonstrates general principles. A combination of results and discussion based upon a publicly available data set is used to illustrate the approach and allows a structured discussion of factors that experimenters may wish to bear in mind in other situations. 
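
    The core idea of the workflow, scoring each run by a statistical distance over a group of correlated features and flagging runs beyond a control limit, can be sketched as follows. A simple summed squared z-distance stands in for the paper's metric, and the data assume each feature varies across runs.

```python
from statistics import mean, stdev

def distance_chart(runs, limit_sd=3.0):
    """Score each run (a tuple of feature values) by its summed squared
    z-distance over a group of correlated features, then flag runs whose
    distance exceeds mean + limit_sd * SD of all run distances."""
    n_feat = len(runs[0])
    centers = [mean(r[j] for r in runs) for j in range(n_feat)]
    spreads = [stdev(r[j] for r in runs) for j in range(n_feat)]
    dists = [sum(((r[j] - centers[j]) / spreads[j]) ** 2 for j in range(n_feat))
             for r in runs]
    cutoff = mean(dists) + limit_sd * stdev(dists)
    return [i for i, d in enumerate(dists) if d > cutoff]
```

    On a small batch, a single extreme run inflates the chart's own SD, so a tighter limit may be needed to surface it; robust limits are one obvious refinement.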

  5. qcML: An Exchange Format for Quality Control Metrics from Mass Spectrometry Experiments

    PubMed Central

    Walzer, Mathias; Pernas, Lucia Espona; Nasso, Sara; Bittremieux, Wout; Nahnsen, Sven; Kelchtermans, Pieter; Pichler, Peter; van den Toorn, Henk W. P.; Staes, An; Vandenbussche, Jonathan; Mazanek, Michael; Taus, Thomas; Scheltema, Richard A.; Kelstrup, Christian D.; Gatto, Laurent; van Breukelen, Bas; Aiche, Stephan; Valkenborg, Dirk; Laukens, Kris; Lilley, Kathryn S.; Olsen, Jesper V.; Heck, Albert J. R.; Mechtler, Karl; Aebersold, Ruedi; Gevaert, Kris; Vizcaíno, Juan Antonio; Hermjakob, Henning; Kohlbacher, Oliver; Martens, Lennart

    2014-01-01

    Quality control is increasingly recognized as a crucial aspect of mass spectrometry based proteomics. Several recent papers discuss relevant parameters for quality control and present applications to extract these from the instrumental raw data. What has been missing, however, is a standard data exchange format for reporting these performance metrics. We therefore developed the qcML format, an XML-based standard that follows the design principles of the related mzML, mzIdentML, mzQuantML, and TraML standards from the HUPO-PSI (Proteomics Standards Initiative). In addition to the XML format, we also provide tools for the calculation of a wide range of quality metrics as well as a database format and interconversion tools, so that existing LIMS systems can easily add relational storage of the quality control data to their existing schema. We here describe the qcML specification, along with possible use cases and an illustrative example of the subsequent analysis possibilities. All information about qcML is available at http://code.google.com/p/qcml. PMID:24760958
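
    For illustration only, a qcML-style document can be assembled with a few lines of XML serialization. The element and attribute names below are schematic stand-ins, not the actual qcML schema.

```python
import xml.etree.ElementTree as ET

def build_qc_xml(run_id, metrics):
    """Serialize QC metrics for one run into a small qcML-style XML
    document (element names here are illustrative, not the real schema)."""
    root = ET.Element("qcML")
    run = ET.SubElement(root, "runQuality", ID=run_id)
    for name, value in metrics.items():
        ET.SubElement(run, "qualityParameter", name=name, value=str(value))
    return ET.tostring(root, encoding="unicode")
```

    The appeal of an XML exchange format is exactly this round-trip: any tool can emit metrics this way, and any LIMS can parse them back without knowing the producer.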

  6. qcML: an exchange format for quality control metrics from mass spectrometry experiments.

    PubMed

    Walzer, Mathias; Pernas, Lucia Espona; Nasso, Sara; Bittremieux, Wout; Nahnsen, Sven; Kelchtermans, Pieter; Pichler, Peter; van den Toorn, Henk W P; Staes, An; Vandenbussche, Jonathan; Mazanek, Michael; Taus, Thomas; Scheltema, Richard A; Kelstrup, Christian D; Gatto, Laurent; van Breukelen, Bas; Aiche, Stephan; Valkenborg, Dirk; Laukens, Kris; Lilley, Kathryn S; Olsen, Jesper V; Heck, Albert J R; Mechtler, Karl; Aebersold, Ruedi; Gevaert, Kris; Vizcaíno, Juan Antonio; Hermjakob, Henning; Kohlbacher, Oliver; Martens, Lennart

    2014-08-01

    Quality control is increasingly recognized as a crucial aspect of mass spectrometry based proteomics. Several recent papers discuss relevant parameters for quality control and present applications to extract these from the instrumental raw data. What has been missing, however, is a standard data exchange format for reporting these performance metrics. We therefore developed the qcML format, an XML-based standard that follows the design principles of the related mzML, mzIdentML, mzQuantML, and TraML standards from the HUPO-PSI (Proteomics Standards Initiative). In addition to the XML format, we also provide tools for the calculation of a wide range of quality metrics as well as a database format and interconversion tools, so that existing LIMS systems can easily add relational storage of the quality control data to their existing schema. We here describe the qcML specification, along with possible use cases and an illustrative example of the subsequent analysis possibilities. All information about qcML is available at http://code.google.com/p/qcml. © 2014 by The American Society for Biochemistry and Molecular Biology, Inc.

  7. Quality control in urodynamics and the role of software support in the QC procedure.

    PubMed

    Hogan, S; Jarvis, P; Gammie, A; Abrams, P

    2011-11-01

    This article aims to identify quality control (QC) best practice, to review published QC audits in order to identify how closely good practice is followed, and to carry out a market survey of the software features that support QC offered by urodynamics machines available in the UK. All UK distributors of urodynamic systems were contacted and asked to provide information on the software features relating to data quality of the products they supply. The results of the market survey show that the features offered by manufacturers differ greatly. Automated features, which can be turned off in most cases, include: cough recognition, detrusor contraction detection, and high pressure alerts. There are currently no systems that assess data quality based on published guidelines. A literature review of current QC guidelines for urodynamics was carried out; QC audits were included in the literature review to see how closely guidelines were being followed. This review highlights the fact that basic QC is not being carried out effectively by urodynamicists. Based on the software features currently available and the results of the literature review there is both the need and capacity for a greater degree of automation in relation to urodynamic data quality and accuracy assessment. Some progress has been made in this area and certain manufacturers have already developed automated cough detection. Copyright © 2011 Wiley Periodicals, Inc.

  8. jqcML: an open-source java API for mass spectrometry quality control data in the qcML format.

    PubMed

    Bittremieux, Wout; Kelchtermans, Pieter; Valkenborg, Dirk; Martens, Lennart; Laukens, Kris

    2014-07-03

    The awareness that systematic quality control is an essential factor to enable the growth of proteomics into a mature analytical discipline has increased over the past few years. To this aim, a controlled vocabulary and document structure have recently been proposed by Walzer et al. to store and disseminate quality-control metrics for mass-spectrometry-based proteomics experiments, called qcML. To facilitate the adoption of this standardized quality control routine, we introduce jqcML, a Java application programming interface (API) for the qcML data format. First, jqcML provides a complete object model to represent qcML data. Second, jqcML provides the ability to read, write, and work in a uniform manner with qcML data from different sources, including the XML-based qcML file format and the relational database qcDB. Interaction with the XML-based file format is obtained through the Java Architecture for XML Binding (JAXB), while generic database functionality is obtained by the Java Persistence API (JPA). jqcML is released as open-source software under the permissive Apache 2.0 license and can be downloaded from https://bitbucket.org/proteinspector/jqcml .

  9. RNA-SeQC: RNA-seq metrics for quality control and process optimization.

    PubMed

    DeLuca, David S; Levin, Joshua Z; Sivachenko, Andrey; Fennell, Timothy; Nazaire, Marc-Danie; Williams, Chris; Reich, Michael; Winckler, Wendy; Getz, Gad

    2012-06-01

    RNA-seq, the application of next-generation sequencing to RNA, provides transcriptome-wide characterization of cellular activity. Assessment of sequencing performance and library quality is critical to the interpretation of RNA-seq data, yet few tools exist to address this issue. We introduce RNA-SeQC, a program which provides key measures of data quality. These metrics include yield, alignment and duplication rates; GC bias, rRNA content, regions of alignment (exon, intron and intragenic), continuity of coverage, 3'/5' bias and count of detectable transcripts, among others. The software provides multi-sample evaluation of library construction protocols, input materials and other experimental parameters. The modularity of the software enables pipeline integration and the routine monitoring of key measures of data quality such as the number of alignable reads, duplication rates and rRNA contamination. RNA-SeQC allows investigators to make informed decisions about sample inclusion in downstream analysis. In summary, RNA-SeQC provides quality control measures critical to experiment design, process optimization and downstream computational analysis. See www.genepattern.org to run online, or www.broadinstitute.org/rna-seqc/ for a command line tool.
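
    Two of the metrics named above, duplication rate and rRNA content, can be sketched as simple counts over aligned reads. This is a toy stand-in for RNA-SeQC's BAM-based computation, with reads represented as (sequence, is_rrna) pairs.

```python
from collections import Counter

def basic_rnaseq_qc(alignments):
    """Compute two illustrative RNA-seq QC metrics from
    (read_sequence, is_rrna) pairs: duplication rate (fraction of reads
    that repeat an already-seen sequence) and rRNA fraction."""
    n = len(alignments)
    seq_counts = Counter(seq for seq, _ in alignments)
    duplicates = sum(c - 1 for c in seq_counts.values())
    rrna = sum(1 for _, is_rrna in alignments if is_rrna)
    return {"duplication_rate": duplicates / n, "rrna_rate": rrna / n}
```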

  10. FQC Dashboard: integrates FastQC results into a web-based, interactive, and extensible FASTQ quality control tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Joseph; Pirrung, Meg; McCue, Lee Ann

    FQC is software that facilitates quality control of FASTQ files by carrying out a QC protocol using FastQC, parsing results, and aggregating quality metrics into an interactive dashboard designed to richly summarize individual sequencing runs. The dashboard groups samples in dropdowns for navigation among the data sets, utilizes human-readable configuration files to manipulate the pages and tabs, and is extensible with CSV data.

  11. FQC Dashboard: integrates FastQC results into a web-based, interactive, and extensible FASTQ quality control tool

    DOE PAGES

    Brown, Joseph; Pirrung, Meg; McCue, Lee Ann

    2017-06-09

    FQC is software that facilitates quality control of FASTQ files by carrying out a QC protocol using FastQC, parsing results, and aggregating quality metrics into an interactive dashboard designed to richly summarize individual sequencing runs. The dashboard groups samples in dropdowns for navigation among the data sets, utilizes human-readable configuration files to manipulate the pages and tabs, and is extensible with CSV data.

  12. QC-ART: A tool for real-time quality control assessment of mass spectrometry-based proteomics data.

    PubMed

    Stanfill, Bryan A; Nakayasu, Ernesto S; Bramer, Lisa M; Thompson, Allison M; Ansong, Charles K; Clauss, Therese; Gritsenko, Marina A; Monroe, Matthew E; Moore, Ronald J; Orton, Daniel J; Piehowski, Paul D; Schepmoes, Athena A; Smith, Richard D; Webb-Robertson, Bobbie-Jo; Metz, Thomas O

    2018-04-17

    Liquid chromatography-mass spectrometry (LC-MS)-based proteomics studies of large sample cohorts can easily require from months to years to complete. Acquiring consistent, high-quality data in such large-scale studies is challenging because of normal variations in instrumentation performance over time, as well as artifacts introduced by the samples themselves, such as those due to collection, storage and processing. Existing quality control methods for proteomics data primarily focus on post-hoc analysis to remove low-quality data that would degrade downstream statistics; they are not designed to evaluate the data in near real-time, which would allow for interventions as soon as deviations in data quality are detected. In addition to flagging analyses that demonstrate outlier behavior, evaluating how the data structure changes over time can aid in understanding typical instrument performance or identify issues such as a degradation in data quality due to the need for instrument cleaning and/or re-calibration. To address this gap for proteomics, we developed Quality Control Analysis in Real-Time (QC-ART), a tool for evaluating data as they are acquired in order to dynamically flag potential issues with instrument performance or sample quality. QC-ART has similar accuracy to standard post-hoc analysis methods, with the additional benefit of real-time analysis. We demonstrate the utility and performance of QC-ART in identifying deviations in data quality due to both instrument and sample issues in near real-time for LC-MS-based plasma proteomics analyses of a sample subset of The Environmental Determinants of Diabetes in the Young cohort. We also present a case where QC-ART facilitated the identification of oxidative modifications, which are often underappreciated in proteomic experiments. Published under license by The American Society for Biochemistry and Molecular Biology, Inc.
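
    The real-time flavor of such a tool can be illustrated with a generic streaming outlier check: each new run's QC metric is compared against a robust (median/MAD) baseline built from the trailing runs. This is a sketch of the idea only, not QC-ART's statistical model.

```python
from collections import deque
from statistics import median

def stream_flags(metric_values, window=8, threshold=3.5):
    """Flag the index of each new run whose QC metric deviates from the
    median of the trailing window by more than `threshold` robust
    z-units (MAD-based), mimicking an as-acquired QC check."""
    history = deque(maxlen=window)
    flags = []
    for i, v in enumerate(metric_values):
        if len(history) >= 4:  # need a minimal baseline first
            med = median(history)
            mad = median(abs(h - med) for h in history) or 1e-9
            if abs(v - med) / (1.4826 * mad) > threshold:
                flags.append(i)
        history.append(v)
    return flags
```

    Because the baseline is a rolling window, the check adapts to slow instrument drift while still reacting immediately to a single bad injection.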

  13. FQC Dashboard: integrates FastQC results into a web-based, interactive, and extensible FASTQ quality control tool.

    PubMed

    Brown, Joseph; Pirrung, Meg; McCue, Lee Ann

    2017-06-09

    FQC is software that facilitates quality control of FASTQ files by carrying out a QC protocol using FastQC, parsing results, and aggregating quality metrics into an interactive dashboard designed to richly summarize individual sequencing runs. The dashboard groups samples in dropdowns for navigation among the data sets, utilizes human-readable configuration files to manipulate the pages and tabs, and is extensible with CSV data. FQC is implemented in Python 3 and Javascript, and is maintained under an MIT license. Documentation and source code is available at: https://github.com/pnnl/fqc . joseph.brown@pnnl.gov. © The Author(s) 2017. Published by Oxford University Press.
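
    A minimal version of the parse-and-aggregate step, reading FastQC `summary.txt` contents and flattening them into one CSV for a dashboard to consume, might look like the sketch below. The tab-separated status/module/filename layout is FastQC's summary format; the function name is illustrative.

```python
import csv
import io

def aggregate_summaries(summaries):
    """Aggregate FastQC summary.txt contents (one string per sample)
    into a single CSV of sample, module, status rows."""
    out = io.StringIO()
    writer = csv.writer(out)
    writer.writerow(["sample", "module", "status"])
    for text in summaries:
        for line in text.strip().splitlines():
            status, module, filename = line.split("\t")
            writer.writerow([filename, module, status])
    return out.getvalue()
```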

  14. QA/QC in the laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hood, F.C.

    1992-05-01

    Quality assurance and quality control (QA/QC) of analytical chemistry laboratory activities are essential to the validity and usefulness of resultant data. However, in themselves, conventional QA/QC measures will not always ensure that fraudulent data are not generated. Conventional QA/QC measures are based on the assumption that work will be done in good faith; to assure against fraudulent practices, QA/QC measures must be tailored to specific analysis protocols in anticipation of intentional misapplication of those protocols. Application of specific QA/QC measures to ensure against fraudulent practices results in an increased administrative burden being placed on the analytical process; accordingly, in keeping with graded QA philosophy, data quality objectives must be used to identify specific points of concern for special control to minimize the administrative impact.

  15. FQC Dashboard: integrates FastQC results into a web-based, interactive, and extensible FASTQ quality control tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Joseph; Pirrung, Meg; McCue, Lee Ann

    FQC is software that facilitates large-scale quality control of FASTQ files by carrying out a QC protocol, parsing results, and aggregating quality metrics within and across experiments into an interactive dashboard. The dashboard utilizes human-readable configuration files to manipulate the pages and tabs, and is extensible with CSV data.

  16. Degradation Signals for Ubiquitin-Proteasome Dependent Cytosolic Protein Quality Control (CytoQC) in Yeast

    PubMed Central

    Maurer, Matthew J.; Spear, Eric D.; Yu, Allen T.; Lee, Evan J.; Shahzad, Saba; Michaelis, Susan

    2016-01-01

    Cellular protein quality control (PQC) systems selectively target misfolded or otherwise aberrant proteins for degradation by the ubiquitin-proteasome system (UPS). How cells discern abnormal from normal proteins remains incompletely understood, but involves in part the recognition between ubiquitin E3 ligases and degradation signals (degrons) that are exposed in misfolded proteins. PQC is compartmentalized in the cell, and a great deal has been learned in recent years about ER-associated degradation (ERAD) and nuclear quality control. In contrast, a comprehensive view of cytosolic quality control (CytoQC) has yet to emerge, and will benefit from the development of a well-defined set of model substrates. In this study, we generated an isogenic “degron library” in Saccharomyces cerevisiae consisting of short sequences appended to the C-terminus of a reporter protein, Ura3. About half of these degron-containing proteins are substrates of the integral membrane E3 ligase Doa10, which also plays a pivotal role in ERAD and some nuclear protein degradation. Notably, some of our degron fusion proteins exhibit dependence on the E3 ligase Ltn1/Rkr1 for degradation, apparently by a mechanism distinct from its known role in ribosomal quality control of translationally paused proteins. Ubr1 and San1, E3 ligases involved in the recognition of some misfolded CytoQC substrates, are largely dispensable for the degradation of our degron-containing proteins. Interestingly, the Hsp70/Hsp40 chaperone/cochaperones Ssa1,2 and Ydj1, are required for the degradation of all constructs tested. Taken together, the comprehensive degron library presented here provides an important resource of isogenic substrates for testing candidate PQC components and identifying new ones. PMID:27172186

  17. QA/QC in the laboratory. Session F

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hood, F.C.

    1992-05-01

    Quality assurance and quality control (QA/QC) of analytical chemistry laboratory activities are essential to the validity and usefulness of resultant data. However, in themselves, conventional QA/QC measures will not always ensure that fraudulent data are not generated. Conventional QA/QC measures are based on the assumption that work will be done in good faith; to assure against fraudulent practices, QA/QC measures must be tailored to specific analysis protocols in anticipation of intentional misapplication of those protocols. Application of specific QA/QC measures to ensure against fraudulent practices results in an increased administrative burden being placed on the analytical process; accordingly, in keeping with graded QA philosophy, data quality objectives must be used to identify specific points of concern for special control to minimize the administrative impact.

  18. Recombinant human G6PD for quality control and quality assurance of novel point-of-care diagnostics for G6PD deficiency.

    PubMed

    Kahn, Maria; LaRue, Nicole; Zhu, Changcheng; Pal, Sampa; Mo, Jack S; Barrett, Lynn K; Hewitt, Steve N; Dumais, Mitchell; Hemmington, Sandra; Walker, Adrian; Joynson, Jeff; Leader, Brandon T; Van Voorhis, Wesley C; Domingo, Gonzalo J

    2017-01-01

    A large gap for the support of point-of-care testing is the availability of reagents to support quality control (QC) of diagnostic assays along the supply chain from the manufacturer to the end user. While reagents and systems exist to support QC of laboratory screening tests for glucose-6-phosphate dehydrogenase (G6PD) deficiency, they are not configured appropriately to support point-of-care testing. The feasibility of using lyophilized recombinant human G6PD as a QC reagent in novel point-of-care tests for G6PD deficiency is demonstrated. Human recombinant G6PD (r-G6PD) was expressed in Escherichia coli and purified. Aliquots were stored at -80°C. Prior to lyophilization, aliquots were thawed, and three concentrations of r-G6PD (representing normal, intermediate, and deficient clinical G6PD levels) were prepared and mixed with a protective formulation, which protects the enzyme activity against degradation from denaturation during the lyophilization process. Following lyophilization, individual single-use tubes of lyophilized r-G6PD were placed in individual packs with desiccants and stored at five temperatures for one year. An enzyme assay for G6PD activity was used to ascertain the stability of r-G6PD activity while stored at different temperatures. Lyophilized r-G6PD is stable and can be used as a control indicator. Results presented here show that G6PD activity is stable for at least 365 days when stored at -80°C, 4°C, 30°C, and 45°C. When stored at 55°C, enzyme activity was found to be stable only through day 28. Lyophilized r-G6PD enzyme is stable and can be used as a control for point-of-care tests for G6PD deficiency.

  19. A Total Quality-Control Plan with Right-Sized Statistical Quality-Control.

    PubMed

    Westgard, James O

    2017-03-01

    A new Clinical Laboratory Improvement Amendments option for risk-based quality-control (QC) plans became effective in January, 2016. Called an Individualized QC Plan, this option requires the laboratory to perform a risk assessment, develop a QC plan, and implement a QC program to monitor ongoing performance of the QC plan. Difficulties in performing a risk assessment may limit validity of an Individualized QC Plan. A better alternative is to develop a Total QC Plan including a right-sized statistical QC procedure to detect medically important errors. Westgard Sigma Rules provides a simple way to select the right control rules and the right number of control measurements. Copyright © 2016 Elsevier Inc. All rights reserved.
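
    The sigma-metric underlying the Westgard Sigma Rules approach is simple arithmetic: (allowable total error − |bias|) / CV, all in percent. The rule sets returned below are an illustrative sketch of how control rules tighten as sigma falls, not the published selection chart.

```python
def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Sigma-metric for an assay: (allowable total error - |bias|) / CV,
    with all quantities expressed in percent."""
    return (tea_pct - abs(bias_pct)) / cv_pct

def suggest_rules(sigma):
    """Illustrative rule selection by sigma level (a sketch of the
    idea that lower sigma demands more rules, not the official chart)."""
    if sigma >= 6:
        return ["1-3s"]
    if sigma >= 5:
        return ["1-3s", "2-2s", "R-4s"]
    if sigma >= 4:
        return ["1-3s", "2-2s", "R-4s", "4-1s"]
    return ["1-3s", "2-2s", "R-4s", "4-1s", "8-x"]
```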

  20. Instrument Quality Control.

    PubMed

    Jayakody, Chatura; Hull-Ryde, Emily A

    2016-01-01

    Well-defined quality control (QC) processes are used to determine whether a certain procedure or action conforms to a widely accepted standard and/or set of guidelines, and are important components of any laboratory quality assurance program (Popa-Burke et al., J Biomol Screen 14: 1017-1030, 2009). In this chapter, we describe QC procedures useful for monitoring the accuracy and precision of laboratory instrumentation, most notably automated liquid dispensers. Two techniques, gravimetric QC and photometric QC, are highlighted in this chapter. When used together, these simple techniques provide a robust process for evaluating liquid handler accuracy and precision, and critically underpin high-quality research programs.
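
    Gravimetric QC reduces to converting weighed masses to dispensed volumes and reporting accuracy (% deviation from the target volume) and precision (% CV). A minimal sketch, with hypothetical function and parameter names:

```python
from statistics import mean, stdev

def gravimetric_qc(masses_mg, target_ul, density_mg_per_ul=1.0):
    """Gravimetric check of a liquid dispenser: convert weighed masses
    to volumes via liquid density, then report mean volume, accuracy
    (% relative error vs. target) and precision (% CV)."""
    volumes = [m / density_mg_per_ul for m in masses_mg]
    avg = mean(volumes)
    accuracy = 100.0 * (avg - target_ul) / target_ul
    cv = 100.0 * stdev(volumes) / avg
    return {"mean_ul": round(avg, 3),
            "accuracy_pct": round(accuracy, 2),
            "cv_pct": round(cv, 2)}
```

    Photometric QC follows the same pattern with absorbance of a dye standing in for mass.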

  1. Improvement of the customer satisfaction through Quality Assurance Matrix and QC-Story methods: A case study from automotive industry

    NASA Astrophysics Data System (ADS)

    Sicoe, G. M.; Belu, N.; Rachieru, N.; Nicolae, E. V.

    2017-10-01

    Presently, in the automotive industry, the tendency is to adapt continuously to change and to incorporate market trends into new products, which leads to customer satisfaction. Many quality techniques have been adopted in this field for the continuous improvement of product and process quality, and advantages have been gained. The present paper focuses on the possibilities offered by the use of the Quality Assurance Matrix (QAM) and Quality Control Story (QC Story) to provide the greatest protection against nonconformities in the production process, through a case study in the automotive industry. There is a direct relationship from the QAM to a QC Story analysis: the failures identified using the QAM are treated with the QC Story methodology. Using these methods will help to decrease PPM values and will increase quality performance and customer satisfaction.

  2. Maintaining High Quality Data and Consistency Across a Diverse Flux Network: The Ameriflux QA/QC Technical Team

    NASA Astrophysics Data System (ADS)

    Chan, S.; Billesbach, D. P.; Hanson, C. V.; Biraud, S.

    2014-12-01

    The AmeriFlux quality assurance and quality control (QA/QC) technical team conducts short term (<2 weeks) intercomparisons using a portable eddy covariance system (PECS) to maintain high quality data observations and data consistency across the AmeriFlux network (http://ameriflux.lbl.gov/). Site intercomparisons identify discrepancies between the in situ and portable measurements and calculated fluxes. Findings are jointly discussed by the site staff and the QA/QC team to improve the in situ observations. Despite the relatively short duration of an individual site intercomparison, the accumulated record of all site visits (numbering over 100 since 2002) is a unique dataset. The ability to deploy redundant sensors provides a rare opportunity to identify, quantify, and understand uncertainties in eddy covariance and ancillary measurements. We present a few specific case studies from QA/QC site visits to highlight and share new and relevant findings related to eddy covariance instrumentation and operation.

  3. Proteomics Quality Control: Quality Control Software for MaxQuant Results.

    PubMed

    Bielow, Chris; Mastrobuoni, Guido; Kempa, Stefan

    2016-03-04

    Mass spectrometry-based proteomics coupled to liquid chromatography has matured into an automatized, high-throughput technology, producing data on the scale of multiple gigabytes per instrument per day. Consequently, an automated quality control (QC) and quality analysis (QA) capable of detecting measurement bias, verifying consistency, and avoiding propagation of error is paramount for instrument operators and scientists in charge of downstream analysis. We have developed an R-based QC pipeline called Proteomics Quality Control (PTXQC) for bottom-up LC-MS data generated by the MaxQuant software pipeline. PTXQC creates a QC report containing a comprehensive and powerful set of QC metrics, augmented with automated scoring functions. The automated scores are collated to create an overview heatmap at the beginning of the report, giving valuable guidance also to nonspecialists. Our software supports a wide range of experimental designs, including stable isotope labeling by amino acids in cell culture (SILAC), tandem mass tags (TMT), and label-free data. Furthermore, we introduce new metrics to score MaxQuant's Match-between-runs (MBR) functionality by which peptide identifications can be transferred across Raw files based on accurate retention time and m/z. Last but not least, PTXQC is easy to install and use and represents the first QC software capable of processing MaxQuant result tables. PTXQC is freely available at https://github.com/cbielow/PTXQC .

  4. Analytical approaches to quality assurance and quality control in rangeland monitoring data

    USDA-ARS?s Scientific Manuscript database

    Producing quality data to support land management decisions is the goal of every rangeland monitoring program. However, the results of quality assurance (QA) and quality control (QC) efforts to improve data quality are rarely reported. The purpose of QA and QC is to prevent and describe non-sampling...

  5. QUALITY CONTROLS FOR PCR

    EPA Science Inventory

    The purpose of this presentation is to present an overview of the quality control (QC) sections of a draft EPA document entitled, "Quality Assurance/Quality Control Guidance for Laboratories Performing PCR Analyses on Environmental Samples." This document has been prepared by th...

  6. Automated quality control in a file-based broadcasting workflow

    NASA Astrophysics Data System (ADS)

    Zhang, Lina

    2014-04-01

    Benefiting from the development of information and internet technologies, television broadcasting is transforming from inefficient tape-based production and distribution to integrated file-based workflows. However, no matter how many changes have taken place, successful broadcasting still depends on the ability to deliver a consistent, high-quality signal to the audience. After the transition from tape to file, traditional methods of manual quality control (QC) became inadequate, subjective, and inefficient. Based on China Central Television's fully file-based workflow at its new site, this paper introduces an automated quality control test system for accurate detection of hidden defects in media content. It discusses the system framework and workflow control once automated QC is added, puts forward a QC criterion, and presents QC software that follows this criterion. It also reports experiments on QC speed using parallel processing and distributed computing. The performance of the test system shows that adopting automated QC can make production effective and efficient, and help the station achieve a competitive advantage in the media market.
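    The speed experiments rest on a simple pattern: per-file QC is embarrassingly parallel, so a batch of media files can be checked concurrently. A minimal sketch, assuming a hypothetical per-file check (`qc_media_file` and its pass/fail rule are invented stand-ins; a real system would decode the essence and run defect detectors):

```python
from concurrent.futures import ThreadPoolExecutor

def qc_media_file(path):
    """Hypothetical per-file QC pass returning (path, passed).
    A real implementation would decode audio/video essence and
    scan for hidden defects; a toy rule stands in for that here."""
    return path, not path.endswith(".bad")

def run_batch(paths, workers=4):
    """Check many files concurrently, mirroring the batch QC speed tests."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return dict(pool.map(qc_media_file, paths))

results = run_batch(["a.mxf", "b.mxf", "clip.bad"])
print(results)  # {'a.mxf': True, 'b.mxf': True, 'clip.bad': False}
```

    For CPU-bound decoding a process pool (or the distributed computing mentioned above) would replace the thread pool, but the dispatch pattern is the same.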

  7. [Development of quality assurance/quality control web system in radiotherapy].

    PubMed

    Okamoto, Hiroyuki; Mochizuki, Toshihiko; Yokoyama, Kazutoshi; Wakita, Akihisa; Nakamura, Satoshi; Ueki, Heihachi; Shiozawa, Keiko; Sasaki, Koji; Fuse, Masashi; Abe, Yoshihisa; Itami, Jun

    2013-12-01

    Our purpose is to develop a QA/QC (quality assurance/quality control) web system using HTML (HyperText Markup Language) together with the server-side scripting language PHP (Hypertext Preprocessor), which can be useful as a tool to share information about QA/QC in radiotherapy. The system proposed in this study can easily be built in one's own institute, because HTML is easy to handle. There are two desired functions in a QA/QC web system: (i) to review, through the system, the results of QA/QC for a radiotherapy machine along with the manuals and reports necessary for routinely performing radiotherapy; by disclosing the results, transparency can be maintained; (ii) to present, for simplicity's sake, the institute's own QA/QC protocol using pictures and movies, which can also serve as an educational tool for junior radiation technologists and medical physicists. By using this system, not only administrators but all staff involved in radiotherapy can obtain information about the conditions and accuracy of treatment machines through the QA/QC web system.

  8. [Highly quality-controlled radiation therapy].

    PubMed

    Shirato, Hiroki

    2005-04-01

    Advanced radiation therapy for intracranial disease has focused on set-up accuracy for the past 15 years. However, quality control in the prescribed dose is actually as important as the tumor set-up in radiation therapy. Because of the complexity of the three-dimensional radiation treatment planning system in recent years, the highly quality-controlled prescription of the dose has now been reappraised as the mainstream to improve the treatment outcome of radiation therapy for intracranial disease. The Japanese Committee for Quality Control of Radiation Therapy has developed fundamental requirements such as a QC committee in each hospital, a medical physicist, dosimetrists (QC members), and an external audit.

  9. Quality Control of Meteorological Observations

    NASA Technical Reports Server (NTRS)

    Collins, William; Dee, Dick; Rukhovets, Leonid

    1999-01-01

    The problem of meteorological observation quality control (QC) was first formulated by L. S. Gandin at the Main Geophysical Observatory in the 1970s. Later, in 1988, Gandin began adapting his ideas on complex quality control (CQC) to the operational environment at the National Centers for Environmental Prediction. CQC was first applied by Gandin and his colleagues to the detection and correction of errors in rawinsonde heights and temperatures using a complex of hydrostatic residuals. Later, a full complex of residuals, vertical and horizontal optimal interpolations, and baseline checks was added for checking and correcting a wide range of meteorological variables. Some of Gandin's other ideas were applied and substantially developed at other meteorological centers. A new statistical QC was recently implemented in the Goddard Data Assimilation System. The central component of any quality control is the buddy check, a test of individual suspect observations against available nearby non-suspect observations. A novel feature of this test is that the error variances used for the QC decision are re-estimated on-line. As a result, the allowed tolerances for suspect observations can depend on local atmospheric conditions, and the system is better able to accept extreme values observed in deep cyclones, jet streams, and so on. The basic statements of this adaptive buddy check are described, and some results of the on-line QC, including moisture QC, are presented.
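    The adaptive buddy check can be illustrated with a short sketch. The tolerance multiple, the spread floor, and the sample pressure values are all invented for illustration; the operational system estimates error variances from far richer statistics.

```python
import statistics

def buddy_check(obs, neighbors, tol_mult=3.0, spread_floor=0.5):
    """Adaptive buddy check: test a suspect observation against nearby
    non-suspect neighbors. tol_mult and spread_floor are illustrative
    choices, not values from any operational system."""
    local_mean = statistics.fmean(neighbors)
    # Re-estimate the error spread on-line from the local sample, so the
    # tolerance widens in disturbed conditions (deep cyclones, jets, ...).
    local_spread = max(statistics.stdev(neighbors), spread_floor)
    return abs(obs - local_mean) > tol_mult * local_spread  # True => reject

# Neighbors in a deep cyclone are themselves extreme, so a legitimately
# extreme surface-pressure value (hPa) is accepted:
print(buddy_check(958.0, [960.0, 957.0, 962.0, 955.0]))      # False (accepted)
# Against quiet neighbors, the same kind of value is rejected:
print(buddy_check(930.0, [1012.0, 1013.0, 1011.0, 1012.5]))  # True (rejected)
```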

  10. CHALLENGES IN SETTING UP QUALITY CONTROL IN DIAGNOSTIC RADIOLOGY FACILITIES IN NIGERIA.

    PubMed

    Inyang, S O; Egbe, N O; Ekpo, E

    2015-01-01

    The Nigerian Nuclear Regulatory Authority (NNRA) was established to regulate and control the use of radioactive and radiation-emitting sources in Nigeria. Quality control (QC) of diagnostic radiology equipment forms part of the fundamental requirements for the authorization of diagnostic radiology facilities in the country. Some quality control tests (output, exposure linearity, and reproducibility) were performed on the x-ray machines in the facilities that took part in the study. A questionnaire was developed to evaluate the frequencies at which QC tests were conducted in the facilities and the challenges in setting up QC. Results show great variation in the values of the QC parameters measured. Inadequate cooperation from facility management, lack of QC equipment, and insufficient staff constitute the major challenges in setting up QC in the facilities under study. The responses on the frequencies at which QC tests should be conducted did not correspond to the recommended standards, indicating that personnel were not familiar with QC implementation and may require further training on QC.

  11. Interlaboratory quality control of total HIV-1 DNA load measurement for multicenter reservoir studies.

    PubMed

    Gantner, Pierre; Mélard, Adeline; Damond, Florence; Delaugerre, Constance; Dina, Julia; Gueudin, Marie; Maillard, Anne; Sauné, Karine; Rodallec, Audrey; Tuaillon, Edouard; Plantier, Jean-Christophe; Rouzioux, Christine; Avettand-Fenoel, Véronique

    2017-11-01

    Viral reservoirs represent an important barrier to HIV cure. Accurate markers of HIV reservoirs are needed to develop multicenter studies. The aim of this multicenter quality control (QC) was to evaluate the inter-laboratory reproducibility of total HIV-1 DNA quantification. Ten laboratories of the ANRS-AC11 working group participated by quantifying HIV-DNA with a real-time qPCR assay (Biocentric) in four samples (QCMD). Good reproducibility was found between laboratories (standard deviation ≤ 0.2 log10 copies/10^6 PBMC) for the three positive QC samples, which were correctly ranked by each laboratory (QC1 < QC2 < QC3). The results of this QC validate the feasibility of multicenter studies using this standardized assay. © 2017 Wiley Periodicals, Inc.
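    The reproducibility criterion quoted in this abstract is a standard deviation on the log10 scale. A minimal sketch of that computation; the per-laboratory load values below are invented for illustration:

```python
import math
import statistics

def interlab_sd_log10(copies_per_million):
    """Inter-laboratory standard deviation of HIV-DNA loads on the
    log10 scale (the <= 0.2 log10 copies/10^6 PBMC criterion).
    Input values are copies per 10^6 PBMC."""
    logs = [math.log10(v) for v in copies_per_million]
    return statistics.stdev(logs)

# Hypothetical results from ten laboratories on the same positive QC sample:
loads = [310, 295, 350, 280, 332, 305, 298, 340, 315, 288]
print(round(interlab_sd_log10(loads), 3))  # well under the 0.2 criterion
```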

  12. Quality control in urinalysis.

    PubMed

    Takubo, T; Tatsumi, N

    1999-01-01

    Quality control (QC) has been introduced in laboratories, and QC surveys in urinalysis have been performed by the College of American Pathologists, the Japanese Association of Medical Technologists, the Osaka Medical Association, and manufacturers. A QC survey of urinalysis of synthetic urine using a reagent strip and an instrument made by the same manufacturer, and using an automated urine cell analyser, provided satisfactory results among laboratories. A QC survey using reagent strips and instruments made by various manufacturers indicated differences in the determined values among manufacturers, and between manual and automated methods, because the reagent strips and instruments have different characteristics. A QC photo survey based on microscopic photos of urine sediment constituents indicated differences in the identification of cells among laboratories. From these results, it is necessary to standardize the reagent strip method, manual and automated methods, and synthetic urine.

  13. 40 CFR 136.7 - Quality assurance and quality control.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... quality control elements, where applicable, into the laboratory's documented standard operating procedure... quality control elements must be clearly documented in the written standard operating procedure for each... Methods contains QA/QC procedures in the Part 1000 section of the Standard Methods Compendium. The...

  14. 40 CFR 136.7 - Quality assurance and quality control.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... quality control elements, where applicable, into the laboratory's documented standard operating procedure... quality control elements must be clearly documented in the written standard operating procedure for each... Methods contains QA/QC procedures in the Part 1000 section of the Standard Methods Compendium. The...

  15. 40 CFR 136.7 - Quality assurance and quality control.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... quality control elements, where applicable, into the laboratory's documented standard operating procedure... quality control elements must be clearly documented in the written standard operating procedure for each... Methods contains QA/QC procedures in the Part 1000 section of the Standard Methods Compendium. The...

  16. Development and Testing of a Nuclear Quality Assurance/Quality Control Technician Curriculum. Final Report.

    ERIC Educational Resources Information Center

    Espy, John; And Others

    A project was conducted to field test selected first- and second-year courses in a postsecondary nuclear quality assurance/quality control (QA/QC) technician curriculum and to develop the teaching/learning modules for seven technical specialty courses remaining in the QA/QC technician curriculum. The field testing phase of the project involved the…

  17. WE-AB-206-00: Diagnostic QA/QC Hands-On Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    The involvement of medical physicists in diagnostic ultrasound imaging service is increasing due to QC and accreditation requirements. The goal of this ultrasound hands-on workshop is to demonstrate quality control (QC) testing in diagnostic ultrasound and to provide updates on ACR ultrasound accreditation requirements. The first half of this workshop will include two presentations reviewing diagnostic ultrasound QA/QC and ACR ultrasound accreditation requirements. The second half of the workshop will include live demonstrations of basic QC tests. An array of ultrasound testing phantoms and ultrasound scanners will be available for attendees to learn diagnostic ultrasound QC in a hands-on environment with live demonstrations and on-site instructors. The targeted attendees are medical physicists in diagnostic imaging. Learning Objectives: Gain familiarity with common elements of a QA/QC program for diagnostic ultrasound imaging; identify QC tools available for testing diagnostic ultrasound systems and learn how to use these tools; learn ACR ultrasound accreditation requirements. Jennifer Walter is an employee of the American College of Radiology on Ultrasound Accreditation.

  18. Embankment quality and assessment of moisture control implementation.

    DOT National Transportation Integrated Search

    2016-02-01

    A specification for contractor moisture quality control (QC) in roadway embankment construction has been in use for approximately 10 years in Iowa on about 190 projects. The use of this QC specification and the development of the soils certificatio...

  19. Quality Control in Primary Schools: Progress from 2001-2006

    ERIC Educational Resources Information Center

    Hofman, Roelande H.; de Boom, Jan; Hofman, W. H. Adriaan

    2010-01-01

    This article presents findings of research into the quality control (QC) of schools from 2001-2006. In 2001 several targets for QC were set and the progress of 939 primary schools is presented. Furthermore, using cluster analysis, schools are classified into four QC-types that differ in their focus on school (self) evaluation and school…

  20. Introducing Quality Control in the Chemistry Teaching Laboratory Using Control Charts

    ERIC Educational Resources Information Center

    Schazmann, Benjamin; Regan, Fiona; Ross, Mary; Diamond, Dermot; Paull, Brett

    2009-01-01

    Quality control (QC) measures are less prevalent in teaching laboratories than commercial settings possibly owing to a lack of commercial incentives or teaching resources. This article focuses on the use of QC assessment in the analytical techniques of high performance liquid chromatography (HPLC) and ultraviolet-visible spectroscopy (UV-vis) at…
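    A Shewhart-style control chart of the kind used in such teaching exercises reduces to a centre line and limits computed from replicate QC measurements. A minimal sketch; the peak-area values are invented, and a real HPLC QC exercise would use a validated check standard:

```python
import statistics

def control_limits(measurements, n_sigma=3):
    """Centre line and +/- n_sigma control limits from replicate
    QC measurements of a check standard (Shewhart chart)."""
    centre = statistics.fmean(measurements)
    sd = statistics.stdev(measurements)
    return centre - n_sigma * sd, centre, centre + n_sigma * sd

# Hypothetical HPLC peak areas for a check standard, one per session:
areas = [101.2, 99.8, 100.5, 100.1, 99.6, 100.4, 100.9, 99.9]
lcl, centre, ucl = control_limits(areas)
# Any point outside the limits signals an out-of-control run:
flagged = [a for a in areas if not lcl <= a <= ucl]
print(f"centre={centre:.2f}, limits=({lcl:.2f}, {ucl:.2f}), flagged={flagged}")
```

    Students plot each new QC result against these limits and investigate any flagged point before accepting the session's sample results.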

  1. Individualized Quality Control Plan (IQCP): Is It Value-Added for Clinical Microbiology?

    PubMed

    Sharp, Susan E; Miller, Melissa B; Hindler, Janet

    2015-12-01

    The Centers for Medicare & Medicaid Services (CMS) recently published their Individualized Quality Control Plan (IQCP [https://www.cms.gov/regulations-and-guidance/legislation/CLIA/Individualized_Quality_Control_Plan_IQCP.html]), which will be the only option for quality control (QC) starting in January 2016 if laboratories choose not to perform Clinical Laboratory Improvement Act (CLIA) [U.S. Statutes at Large 81(1967):533] default QC. Laboratories will no longer be able to use "equivalent QC" (EQC) or the Clinical and Laboratory Standards Institute (CLSI) standards alone for quality control of their microbiology systems. The implementation of IQCP in clinical microbiology laboratories will most certainly be an added burden, the benefits of which are currently unknown. Copyright © 2015, American Society for Microbiology. All Rights Reserved.

  2. Development of concrete QC/QA specifications for highway construction in Kentucky.

    DOT National Transportation Integrated Search

    2001-08-01

    There is a growing trend toward quality-based specifications in highway construction. A large number of quality control/quality assurance (QC/QA) specifications shift the responsibility of day-to-day testing from the state DOH to the contractor. This...

  3. Quality control management and communication between radiologists and technologists.

    PubMed

    Nagy, Paul G; Pierce, Benjamin; Otto, Misty; Safdar, Nabile M

    2008-06-01

    The greatest barrier to quality control (QC) in the digital imaging environment is the lack of communication and documentation between those who interpret images and those who acquire them. Paper-based QC methods are insufficient in a digital image management system. Problem work flow must be incorporated into reengineering efforts when migrating to a digital practice. The authors implemented a Web-based QC feedback tool to document and facilitate the communication of issues identified by radiologists. The goal was to promote a responsive and constructive tool that contributes to a culture of quality. The hypothesis was that by making it easier for radiologists to submit quality issues, the number of QC issues submitted would increase. The authors integrated their Web-based quality tracking system with a clinical picture archiving and communication system so that radiologists could report quality issues without disrupting clinical work flow. Graphical dashboarding techniques aid supervisors in using this database to identify the root causes of different types of issues. Over the initial 12-month rollout period, starting in the general section, the authors recorded 20 times more QC issues submitted by radiologists, accompanied by a rise in technologists' responsiveness to QC issues. For technologists with high numbers of QC issues, the incorporation of data from this tracking system proved useful in performance appraisals and in driving individual improvement. This tool is an example of the types of information technology innovations that can be leveraged to support QC in the digital imaging environment. Initial data suggest that the result is not only an improvement in quality but higher levels of satisfaction for both radiologists and technologists.

  4. Quality Assurance and Control Considerations in Environmental Measurements and Monitoring

    NASA Astrophysics Data System (ADS)

    Sedlet, Jacob

    1982-06-01

    Quality assurance and quality control have become accepted as essential parts of all environmental surveillance, measurements, and monitoring programs, both nuclear and non-nuclear. The same principles and details apply to each; it is primarily the final measurement technique that differs. As the desire and need to measure smaller amounts of pollutants with greater accuracy have increased, it has been recognized that quality assurance and control programs are cost-effective in achieving the expected results. Quality assurance (QA) consists of all the actions necessary to provide confidence in the results. Quality control (QC) is a part of QA and consists of those actions and activities that permit the control of the individual steps in the environmental program. The distinction between the two terms is not always clearly defined, but a sharp division is not necessary. The essential principle of QA and QC is a commitment to high-quality results. The essential components of a QA and QC program are a complete, written procedures manual for all parts of the environmental program, the use of standard or validated procedures, participation in applicable interlaboratory comparison or QA programs, replicate analysis and measurement, training of personnel, and a means of auditing or checking that the QA and QC programs are properly conducted. These components are discussed below in some detail.

  5. ATACseqQC: a Bioconductor package for post-alignment quality assessment of ATAC-seq data.

    PubMed

    Ou, Jianhong; Liu, Haibo; Yu, Jun; Kelliher, Michelle A; Castilla, Lucio H; Lawson, Nathan D; Zhu, Lihua Julie

    2018-03-01

    ATAC-seq (Assay for Transposase-Accessible Chromatin using sequencing) is a recently developed technique for genome-wide analysis of chromatin accessibility. Compared to earlier methods for assaying chromatin accessibility, ATAC-seq is faster and easier to perform, does not require cross-linking, has a higher signal-to-noise ratio, and can be performed on small cell numbers. However, to ensure a successful ATAC-seq experiment, step-by-step quality assurance processes, including both wet lab quality control and in silico quality assessment, are essential. While several tools have been developed or adopted for assessing read quality and identifying nucleosome occupancy and accessible regions from ATAC-seq data, none of them provides a comprehensive set of functionalities for preprocessing and quality assessment of aligned ATAC-seq datasets. We have developed a Bioconductor package, ATACseqQC, for easily generating various diagnostic plots to help researchers quickly assess the quality of their ATAC-seq data. In addition, this package contains functions to preprocess aligned ATAC-seq data for subsequent peak calling. Here we demonstrate the utilities of our package using 25 publicly available ATAC-seq datasets from four studies. We also provide guidelines on what the diagnostic plots should look like for an ideal ATAC-seq dataset. This software package has been used successfully for preprocessing and assessing several in-house and public ATAC-seq datasets. Diagnostic plots generated by this package will facilitate the quality assessment of ATAC-seq data and help researchers to evaluate their own ATAC-seq experiments, as well as to select high-quality ATAC-seq datasets from public repositories such as GEO, to avoid generating hypotheses or drawing conclusions from low-quality ATAC-seq experiments. The software, source code, and documentation are freely available as a Bioconductor package at https://bioconductor.org/packages/release/bioc/html/ATACseqQC.html.

  6. A Framework for a Quality Control System for Vendor/Processor Contracts.

    ERIC Educational Resources Information Center

    Advanced Technology, Inc., Reston, VA.

    A framework for monitoring quality control (QC) of processor contracts administered by the Department of Education's Office of Student Financial Assistance (OSFA) is presented and applied to the Pell Grant program. Guidelines for establishing QC measures and standards are included, and the uses of a sampling procedure in the QC system are…

  7. Evaluation of Various Radar Data Quality Control Algorithms Based on Accumulated Radar Rainfall Statistics

    NASA Technical Reports Server (NTRS)

    Robinson, Michael; Steiner, Matthias; Wolff, David B.; Ferrier, Brad S.; Kessinger, Cathy; Einaudi, Franco (Technical Monitor)

    2000-01-01

    The primary function of the TRMM Ground Validation (GV) Program is to create GV rainfall products that provide basic validation of satellite-derived precipitation measurements for select primary sites. A fundamental and extremely important step in creating high-quality GV products is radar data quality control. Quality control (QC) processing of TRMM GV radar data is based on some automated procedures, but the current QC algorithm is not fully operational and requires significant human interaction to assure satisfactory results. Moreover, the TRMM GV QC algorithm, even with continuous manual tuning, still cannot completely remove all types of spurious echoes. In an attempt to improve the current operational radar data QC procedures of the TRMM GV effort, an intercomparison of several QC algorithms has been conducted. This presentation will demonstrate how various radar data QC algorithms affect accumulated radar rainfall products. In all, six different QC algorithms will be applied to two months of WSR-88D radar data from Melbourne, Florida. Daily, five-day, and monthly accumulated radar rainfall maps will be produced for each quality-controlled data set. The QC algorithms will be evaluated and compared based on their ability to remove spurious echoes without removing significant precipitation. Strengths and weaknesses of each algorithm will be assessed based on their ability to mitigate both erroneous additions and reductions in rainfall accumulation, from spurious echo contamination and true precipitation removal, respectively. Contamination from individual spurious echo categories will be quantified to further diagnose the abilities of each radar QC algorithm. Finally, a cost-benefit analysis will be conducted to determine whether a more automated QC algorithm is a viable alternative to the current, labor-intensive QC algorithm employed by TRMM GV.

  8. Quality Control for Scoring Tests Administered in Continuous Mode: An NCME Instructional Module

    ERIC Educational Resources Information Center

    Allalouf, Avi; Gutentag, Tony; Baumer, Michal

    2017-01-01

    Quality control (QC) in testing is paramount. QC procedures for tests can be divided into two types. The first type, one that has been well researched, is QC for tests administered to large population groups on few administration dates using a small set of test forms (e.g., large-scale assessment). The second type is QC for tests, usually…

  9. The Quality Control Circle: Is It for Education?

    ERIC Educational Resources Information Center

    Land, Arthur J.

    From its start in Japan after World War II, the Quality Control Circle (Q.C.) approach to management and organizational operation evolved into what it is today: people doing similar work meeting regularly to identify, objectively analyze, and develop solutions to problems. The Q.C. approach meets Maslow's theory of motivation by inviting…

  10. An overview of quality control practices in Ontario with particular reference to cholesterol analysis.

    PubMed

    Krishnan, S; Webb, S; Henderson, A R; Cheung, C M; Nazir, D J; Richardson, H

    1999-03-01

    The Laboratory Proficiency Testing Program (LPTP) assesses the analytical performance of all licensed laboratories in Ontario. The LPTP Enzymes, Cardiac Markers, and Lipids Committee conducted a "Patterns of Practice" survey to assess the in-house quality control (QC) practices of laboratories in Ontario, using cholesterol as the QC paradigm. The survey was questionnaire-based, seeking information on statistical calculations, software rules, review process, data retention, and so on. Copies of the in-house cholesterol QC graphs were requested. A total of 120 of 210 laboratories were randomly chosen to receive the questionnaires during 1995 and 1996; 115 laboratories responded, although some did not answer all questions. The majority calculate means and standard deviations (SD) every month, using anywhere from 4 to >100 data points. 65% use a fixed mean and SD, while 17% use means calculated from the previous month. A few use a floating or cumulative mean. Some laboratories that do not use fixed means use a fixed SD. About 90% use some form of statistical quality control rules. The most common rules used to detect random error are 1(3s)/R(4s), while 2(2s)/4(1s)/10(x) are used for systematic errors. About 20% did not assay any QC at levels >5.5 mmol/L. Quality control data are reviewed daily (technologists), weekly and monthly (supervisors/directors). Most laboratories retain their QC records for up to 3 years on paper and magnetic media. On some QC graphs the mean and SD, QC product lot number, or reference to action logs are not apparent. Quality control practices in Ontario are, therefore, disappointing. Improvement is required in the use of clinically appropriate concentrations of QC material and in documentation on QC graphs.
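    The rule notation in this abstract refers to the Westgard multirules. A minimal sketch of screening a run of QC z-scores ((value - mean)/SD) against those rules; as a simplification, across-material rules are applied here to consecutive values in a single list:

```python
def westgard_flags(z):
    """Return the set of violated Westgard rules for a run of z-scores."""
    flags = set()
    if any(abs(v) > 3 for v in z):
        flags.add("1_3s")   # one control result beyond 3 SD (random error)
    if any(z[i] > 2 and z[i + 1] < -2 or z[i] < -2 and z[i + 1] > 2
           for i in range(len(z) - 1)):
        flags.add("R_4s")   # 4 SD range between consecutive controls (random error)
    if any(z[i] > 2 and z[i + 1] > 2 or z[i] < -2 and z[i + 1] < -2
           for i in range(len(z) - 1)):
        flags.add("2_2s")   # two consecutive beyond 2 SD, same side (systematic error)
    if any(all(v > 1 for v in z[i:i + 4]) or all(v < -1 for v in z[i:i + 4])
           for i in range(len(z) - 3)):
        flags.add("4_1s")   # four consecutive beyond 1 SD, same side (systematic error)
    if any(all(v > 0 for v in z[i:i + 10]) or all(v < 0 for v in z[i:i + 10])
           for i in range(len(z) - 9)):
        flags.add("10_x")   # ten consecutive on the same side of the mean
    return flags

print(westgard_flags([0.5, 2.3, 2.6, -0.1]))  # {'2_2s'}
```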

  11. Quality control and quality assurance in genotypic data for genome-wide association studies

    PubMed Central

    Laurie, Cathy C.; Doheny, Kimberly F.; Mirel, Daniel B.; Pugh, Elizabeth W.; Bierut, Laura J.; Bhangale, Tushar; Boehm, Frederick; Caporaso, Neil E.; Cornelis, Marilyn C.; Edenberg, Howard J.; Gabriel, Stacy B.; Harris, Emily L.; Hu, Frank B.; Jacobs, Kevin; Kraft, Peter; Landi, Maria Teresa; Lumley, Thomas; Manolio, Teri A.; McHugh, Caitlin; Painter, Ian; Paschall, Justin; Rice, John P.; Rice, Kenneth M.; Zheng, Xiuwen; Weir, Bruce S.

    2011-01-01

    Genome-wide scans of nucleotide variation in human subjects are providing an increasing number of replicated associations with complex disease traits. Most of the variants detected have small effects and, collectively, they account for a small fraction of the total genetic variance. Very large sample sizes are required to identify and validate findings. In this situation, even small sources of systematic or random error can cause spurious results or obscure real effects. The need for careful attention to data quality has been appreciated for some time in this field, and a number of strategies for quality control and quality assurance (QC/QA) have been developed. Here we extend these methods and describe a system of QC/QA for genotypic data in genome-wide association studies. This system includes some new approaches that (1) combine analysis of allelic probe intensities and called genotypes to distinguish gender misidentification from sex chromosome aberrations, (2) detect autosomal chromosome aberrations that may affect genotype calling accuracy, (3) infer DNA sample quality from relatedness and allelic intensities, (4) use duplicate concordance to infer SNP quality, (5) detect genotyping artifacts from dependence of Hardy-Weinberg equilibrium (HWE) test p-values on allelic frequency, and (6) demonstrate sensitivity of principal components analysis (PCA) to SNP selection. The methods are illustrated with examples from the ‘Gene Environment Association Studies’ (GENEVA) program. The results suggest several recommendations for QC/QA in the design and execution of genome-wide association studies. PMID:20718045
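    The HWE-based artifact check in item (5) relies on a per-SNP equilibrium test. A minimal sketch of the one-degree-of-freedom chi-square version (the genotype counts are invented; production pipelines often prefer an exact test when alleles are rare):

```python
import math

def hwe_chisq_p(n_aa, n_ab, n_bb):
    """Chi-square test (1 df) for Hardy-Weinberg equilibrium from
    genotype counts (AA, AB, BB). Returns the p-value."""
    n = n_aa + n_ab + n_bb
    p = (2 * n_aa + n_ab) / (2 * n)          # frequency of allele A
    q = 1 - p
    expected = (n * p * p, 2 * n * p * q, n * q * q)
    chisq = sum((o - e) ** 2 / e
                for o, e in zip((n_aa, n_ab, n_bb), expected))
    # Survival function of chi-square with 1 df: p = erfc(sqrt(x / 2))
    return math.erfc(math.sqrt(chisq / 2))

# A marker in near-perfect equilibrium vs. one with excess heterozygotes
# (a classic signature of a genotype-calling artifact):
print(hwe_chisq_p(360, 480, 160))  # ~1.0
print(hwe_chisq_p(250, 700, 50))   # vanishingly small p-value
```

    Plotting these p-values against allele frequency across all SNPs exposes the frequency-dependent artifacts described in item (5).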

  12. Quality control and quality assurance of hot mix asphalt construction in Delaware.

    DOT National Transportation Integrated Search

    2006-07-01

    Since the mid-1960s, the Federal Highway Administration has encouraged Departments of Transportation and contractors toward the use of quality control and quality assurance (QA/QC) specifications, which are statistically based. For example,...

  13. Portland cement concrete pavement review of QC/QA data 2000 through 2009.

    DOT National Transportation Integrated Search

    2011-04-01

    This report analyzes the Quality Control/Quality Assurance (QC/QA) data for Portland cement concrete pavement (PCCP) awarded in the years 2000 through 2009. Analysis of the overall performance of the projects is accomplished by reviewing the Calc...

  14. QUALITY ASSURANCE AND QUALITY CONTROL FOR WASTE CONTAINMENT FACILITIES. Project Summary

    EPA Science Inventory

    It is generally agreed that both quality assurance (QA) and quality control (QC) are essential to the proper installation and eventual performance of environmentally safe and secure waste containment systems. Even further, there are both manufacturing and construction aspects to...

  15. Hot mix asphalt voids acceptance review of QC/QA data 2000 through 2004.

    DOT National Transportation Integrated Search

    2006-07-01

    This report analyzes the Quality Control/Quality Assurance (QC/QA) data for hot mix asphalt using voids acceptance as the testing criteria for the years 2000 through 2004. Analysis of the overall quality of the HMA is accomplished by reviewing th...

  16. Quality control quantification (QCQ): a tool to measure the value of quality control checks in radiation oncology.

    PubMed

    Ford, Eric C; Terezakis, Stephanie; Souranis, Annette; Harris, Kendra; Gay, Hiram; Mutic, Sasa

    2012-11-01

    To quantify the error-detection effectiveness of commonly used quality control (QC) measures. We analyzed incidents from 2007-2010 logged into a voluntary in-house, electronic incident learning systems at 2 academic radiation oncology clinics. None of the incidents resulted in patient harm. Each incident was graded for potential severity using the French Nuclear Safety Authority scoring scale; high potential severity incidents (score >3) were considered, along with a subset of 30 randomly chosen low severity incidents. Each report was evaluated to identify which of 15 common QC checks could have detected it. The effectiveness was calculated, defined as the percentage of incidents that each QC measure could detect, both for individual QC checks and for combinations of checks. In total, 4407 incidents were reported, 292 of which had high-potential severity. High- and low-severity incidents were detectable by 4.0 ± 2.3 (mean ± SD) and 2.6 ± 1.4 QC checks, respectively (P<.001). All individual checks were less than 50% sensitive with the exception of pretreatment plan review by a physicist (63%). An effectiveness of 97% was achieved with 7 checks used in combination and was not further improved with more checks. The combination of checks with the highest effectiveness includes physics plan review, physician plan review, Electronic Portal Imaging Device-based in vivo portal dosimetry, radiation therapist timeout, weekly physics chart check, the use of checklists, port films, and source-to-skin distance checks. Some commonly used QC checks such as pretreatment intensity modulated radiation therapy QA do not substantially add to the ability to detect errors in these data. The effectiveness of QC measures in radiation oncology depends sensitively on which checks are used and in which combinations. A small percentage of errors cannot be detected by any of the standard formal QC checks currently in broad use, suggesting that further improvements are needed. These data
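    The combined effectiveness of a set of checks, as defined above, is the fraction of incidents caught by at least one check in the set, which makes choosing the best combination a set-cover problem. A sketch with invented incident data (the check names echo the paper, but the detection sets are hypothetical):

```python
def combined_effectiveness(detects, checks):
    """Fraction of all logged incidents caught by at least one of `checks`.
    detects maps each check name to the set of incident ids it could detect."""
    caught = set().union(*(detects[c] for c in checks))
    total = set().union(*detects.values())
    return len(caught) / len(total)

def greedy_checks(detects, k):
    """Greedily pick k checks, each maximizing incremental coverage."""
    chosen, caught = [], set()
    for _ in range(k):
        best = max(detects, key=lambda c: len(detects[c] - caught))
        chosen.append(best)
        caught |= detects[best]
    return chosen

# Hypothetical incident ids detectable by each check:
detects = {
    "physics plan review":   {1, 2, 3, 5, 7},
    "physician plan review": {2, 4, 6},
    "in vivo dosimetry":     {3, 5, 8},
    "therapist timeout":     {7, 9},
}
print(combined_effectiveness(detects, ["physics plan review"]))
print(greedy_checks(detects, 2))
```

    The paper's observation that effectiveness plateaus at 97% with 7 checks corresponds to the greedy coverage curve flattening once the remaining incidents are detectable by no available check.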

  17. Develop a Methodology to Evaluate the Effectiveness of QC/QA Specifications (Phase II)

    DOT National Transportation Integrated Search

    1998-08-01

    The Texas Department of Transportation (TxDOT) has been implementing statistically based quality control/quality assurance (QC/QA) specifications for hot mix asphalt concrete pavements since the early 1990s. These specifications have been continuousl...

  18. 30 CFR 74.6 - Quality control.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 30 Mineral Resources 1 2011-07-01 2011-07-01 false Quality control. 74.6 Section 74.6 Mineral... of the CMDPSU will be maintained in production through adequate quality control procedures, MSHA and... DUST SAMPLING DEVICES Approval Requirements for Coal Mine Dust Personal Sampler Unit § 74.6 Quality...

  19. From Field Notes to Data Portal - A Scalable Data QA/QC Framework for Tower Networks: Progress and Preliminary Results

    NASA Astrophysics Data System (ADS)

    Sturtevant, C.; Hackley, S.; Lee, R.; Holling, G.; Bonarrigo, S.

    2017-12-01

    Quality assurance and control (QA/QC) is one of the most important yet challenging aspects of producing research-quality data. Data quality issues are multi-faceted, including sensor malfunctions, unmet theoretical assumptions, and measurement interference from humans or the natural environment. Tower networks such as Ameriflux, ICOS, and NEON continue to grow in size and sophistication, yet tools for robust, efficient, scalable QA/QC have lagged. Quality control remains a largely manual process heavily relying on visual inspection of data. In addition, notes of measurement interference are often recorded on paper without an explicit pathway to data flagging. As such, an increase in network size requires a near-proportional increase in personnel devoted to QA/QC, quickly stressing the human resources available. We present a scalable QA/QC framework in development for NEON that combines the efficiency and standardization of automated checks with the power and flexibility of human review. This framework includes fast-response monitoring of sensor health, a mobile application for electronically recording maintenance activities, traditional point-based automated quality flagging, and continuous monitoring of quality outcomes and longer-term holistic evaluations. This framework maintains the traceability of quality information along the entirety of the data generation pipeline, and explicitly links field reports of measurement interference to quality flagging. Preliminary results show that data quality can be effectively monitored and managed for a multitude of sites with a small group of QA/QC staff. Several components of this framework are open-source, including an R-Shiny application for efficiently monitoring, synthesizing, and investigating data quality issues.
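    The "point-based automated quality flagging" mentioned above typically means per-observation plausibility tests. A minimal sketch, assuming range, step, and persistence tests with invented thresholds (NEON's actual tests and limits are defined in its own algorithm documents):

    ```python
    # Per-point plausibility flags for a sensor time series (thresholds
    # are illustrative assumptions, not NEON's published limits).
    def flag_series(values, vmin=-40.0, vmax=60.0, max_step=5.0, persist_n=4):
        """Return one flag dict per point.

        range: value outside physically plausible bounds;
        step: jump from the previous point exceeds max_step;
        persistence: value unchanged for persist_n consecutive points
        (a common symptom of a frozen or disconnected sensor).
        """
        flags = []
        for i, v in enumerate(values):
            f = {
                "range": not (vmin <= v <= vmax),
                "step": i > 0 and abs(v - values[i - 1]) > max_step,
                "persistence": (i >= persist_n - 1 and
                                len(set(values[i - persist_n + 1: i + 1])) == 1),
            }
            f["any"] = any(f.values())
            flags.append(f)
        return flags

    series = [10.1, 10.3, 22.7, 22.7, 22.7, 22.7, -999.0]
    for v, f in zip(series, flag_series(series)):
        print(v, f)
    ```

    In a framework like the one described, flags of this kind would be joined with electronically recorded maintenance reports so that human-observed interference and automated tests feed the same quality record.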

  20. Impact of dose calibrators quality control programme in Argentina

    NASA Astrophysics Data System (ADS)

    Furnari, J. C.; de Cabrejas, M. L.; del C. Rotta, M.; Iglicki, F. A.; Milá, M. I.; Magnavacca, C.; Dima, J. C.; Rodríguez Pasqués, R. H.

    1992-02-01

    The national Quality Control (QC) programme for radionuclide calibrators started 12 years ago. Accuracy and the implementation of a QC programme were evaluated over all these years at 95 nuclear medicine laboratories where dose calibrators were in use. During all that time, the Metrology Group of CNEA has distributed 137Cs sealed sources to check stability and has been performing periodic "checking rounds" and postal surveys using unknown samples (external quality control). An account of the results of both methods is presented. At present, more than 65% of the dose calibrators measure activities with an error of less than 10%.
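    The acceptance criterion reported above (relative error below 10% against a known activity) reduces to a simple percent-error check. The readings and reference value below are invented for illustration:

    ```python
    # Percent error of a dose calibrator reading against a certified
    # reference activity; pass criterion follows the <10% figure cited.
    def percent_error(measured_mbq, certified_mbq):
        return abs(measured_mbq - certified_mbq) / certified_mbq * 100

    certified = 200.0                              # MBq, certified activity (hypothetical)
    readings = {"lab_A": 186.0, "lab_B": 232.0}    # hypothetical survey readings
    for lab, measured in readings.items():
        err = percent_error(measured, certified)
        print(lab, f"{err:.1f}%", "pass" if err < 10 else "fail")
    ```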

  1. A comprehensive quality control workflow for paired tumor-normal NGS experiments.

    PubMed

    Schroeder, Christopher M; Hilke, Franz J; Löffler, Markus W; Bitzer, Michael; Lenz, Florian; Sturm, Marc

    2017-06-01

    Quality control (QC) is an important part of all NGS data analysis stages. Many available tools calculate QC metrics from different analysis steps of single sample experiments (raw reads, mapped reads and variant lists). Multi-sample experiments, such as the sequencing of tumor-normal pairs, require additional QC metrics to ensure validity of results. These multi-sample QC metrics still lack standardization. We therefore suggest a new workflow for QC of DNA sequencing of tumor-normal pairs. With this workflow, well-known single-sample QC metrics and additional metrics specific to tumor-normal pairs can be calculated. The segmentation into different tools offers high flexibility and allows reuse for other purposes. All tools produce qcML, a generic XML format for QC of -omics experiments. qcML uses quality metrics defined in an ontology, which was adapted for NGS. All QC tools are implemented in C++ and run both under Linux and Windows. Plotting requires python 2.7 and matplotlib. The software is available under the 'GNU General Public License version 2' as part of the ngs-bits project: https://github.com/imgag/ngs-bits. christopher.schroeder@med.uni-tuebingen.de. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com

  2. Hot mix asphalt voids acceptance review of QC/QA data 2000 through 2010.

    DOT National Transportation Integrated Search

    2011-10-01

    This report analyzes the quality control/quality assurance (QC/QA) data for hot mix asphalt (HMA) using : voids acceptance as the testing criteria awarded in the years 2000 through 2010. Analysis of the overall : performance of the projects is accomp...

  3. Individualized Quality Control Plan (IQCP): Is It Value-Added for Clinical Microbiology?

    PubMed Central

    Miller, Melissa B.; Hindler, Janet

    2015-01-01

    The Center for Medicaid and Medicare Services (CMS) recently published their Individualized Quality Control Plan (IQCP [https://www.cms.gov/regulations-and-guidance/legislation/CLIA/Individualized_Quality_Control_Plan_IQCP.html]), which will be the only option for quality control (QC) starting in January 2016 if laboratories choose not to perform Clinical Laboratory Improvement Act (CLIA) [U.S. Statutes at Large 81(1967):533] default QC. Laboratories will no longer be able to use “equivalent QC” (EQC) or the Clinical and Laboratory Standards Institute (CLSI) standards alone for quality control of their microbiology systems. The implementation of IQCP in clinical microbiology laboratories will most certainly be an added burden, the benefits of which are currently unknown. PMID:26447112

  4. New insight into the comparative power of quality-control rules that use control observations within a single analytical run.

    PubMed

    Parvin, C A

    1993-03-01

    The error detection characteristics of quality-control (QC) rules that use control observations within a single analytical run are investigated. Unlike the evaluation of QC rules that span multiple analytical runs, most of the fundamental results regarding the performance of QC rules applied within a single analytical run can be obtained from statistical theory, without the need for simulation studies. The case of two control observations per run is investigated for ease of graphical display, but the conclusions can be extended to more than two control observations per run. Results are summarized in a graphical format that offers many interesting insights into the relations among the various QC rules. The graphs provide heuristic support to the theoretical conclusions that no QC rule is best under all error conditions, but the multirule that combines the mean rule and a within-run standard deviation rule offers an attractive compromise.
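    The two-observations-per-run case lends itself to a compact sketch. Below, an illustrative within-run multirule combines a standardized run-mean test with a range test; the critical values (2.58 and 3.64) and the rule parameterization are assumptions for illustration, not the paper's exact rules:

    ```python
    import math

    def within_run_multirule(x1, x2, target, sd, k_mean=2.58, k_range=3.64):
        """Flag a run from two control observations (illustrative limits).

        Under stable operation z1, z2 ~ N(0, 1), so the standardized run
        mean sqrt(2) * (z1 + z2) / 2 is N(0, 1) and the range |z1 - z2|
        (SD = sqrt(2)) can each be tested against a critical value. The
        mean rule targets systematic shifts; the range rule targets
        within-run imprecision.
        """
        z1 = (x1 - target) / sd
        z2 = (x2 - target) / sd
        mean_stat = math.sqrt(2) * (z1 + z2) / 2
        range_stat = abs(z1 - z2)
        return {
            "mean_rule": abs(mean_stat) > k_mean,
            "range_rule": range_stat > k_range,
            "multirule": abs(mean_stat) > k_mean or range_stat > k_range,
        }

    # A systematic shift trips the mean rule; a precision loss trips the range rule.
    print(within_run_multirule(104.5, 105.0, target=100, sd=2))  # shifted run
    print(within_run_multirule(96.0, 105.0, target=100, sd=2))   # imprecise run
    ```

    This illustrates the paper's point: neither rule alone catches both error conditions, which is why their combination is the attractive compromise.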

  5. 30 CFR 74.6 - Quality control.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 30 Mineral Resources 1 2013-07-01 2013-07-01 false Quality control. 74.6 Section 74.6 Mineral... control. The applicant shall describe the way in which each lot of components will be sampled and tested... of the CMDPSU will be maintained in production through adequate quality control procedures, MSHA and...

  6. 30 CFR 74.6 - Quality control.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 30 Mineral Resources 1 2014-07-01 2014-07-01 false Quality control. 74.6 Section 74.6 Mineral... control. The applicant shall describe the way in which each lot of components will be sampled and tested... of the CMDPSU will be maintained in production through adequate quality control procedures, MSHA and...

  7. 30 CFR 74.6 - Quality control.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 30 Mineral Resources 1 2012-07-01 2012-07-01 false Quality control. 74.6 Section 74.6 Mineral... control. The applicant shall describe the way in which each lot of components will be sampled and tested... of the CMDPSU will be maintained in production through adequate quality control procedures, MSHA and...

  8. Quality Assurance and Quality Control Practices for Rehabilitation of Sewer and Water Mains

    EPA Science Inventory

    As part of the US Environmental Protection Agency (EPA)’s Aging Water Infrastructure Research Program, several areas of research are being pursued, including a review of quality assurance and quality control (QA/QC) practices and acceptance testing during the installation of reha...

  9. Quality Assurance and Quality Control Practices For Rehabilitation of Sewer and Water Mains

    EPA Science Inventory

    As part of the US Environmental Protection Agency (EPA)’s Aging Water Infrastructure Research Program, several areas of research are being pursued including a review of quality assurance and quality control (QA/QC) practices and acceptance testing during the installation of rehab...

  10. Preliminary Quality Control System Design for the Pell Grant Program.

    ERIC Educational Resources Information Center

    Advanced Technology, Inc., Reston, VA.

    A preliminary design for a quality control (QC) system for the Pell Grant Program is proposed, based on the needs of the Office of Student Financial Assistance (OSFA). The applicability of the general design for other student aid programs administered by OSFA is also considered. The following steps included in a strategic approach to QC system…

  11. Quality control and assurance for validation of DOS/I measurements

    NASA Astrophysics Data System (ADS)

    Cerussi, Albert; Durkin, Amanda; Kwong, Richard; Quang, Timothy; Hill, Brian; Tromberg, Bruce J.; MacKinnon, Nick; Mantulin, William W.

    2010-02-01

    Ongoing multi-center clinical trials are crucial for Biophotonics to gain acceptance in medical imaging. In these trials, quality control (QC) and assurance (QA) are key to success and provide "data insurance". Quality control and assurance deal with standardization, validation, and compliance of procedures, materials and instrumentation. Specifically, QC/QA involves systematic assessment of testing materials, instrumentation performance, standard operating procedures, data logging, analysis, and reporting. QC and QA are important for FDA accreditation and acceptance by the clinical community. Our Biophotonics research in the Network for Translational Research in Optical Imaging (NTROI) program for breast cancer characterization focuses on QA/QC issues primarily related to the broadband Diffuse Optical Spectroscopy and Imaging (DOS/I) instrumentation, because this is an emerging technology with limited standardized QC/QA in place. In the multi-center trial environment, we implement QA/QC procedures: 1. Standardize and validate calibration standards and procedures. (DOS/I technology requires both frequency domain and spectral calibration procedures using tissue simulating phantoms and reflectance standards, respectively.) 2. Standardize and validate data acquisition, processing and visualization (optimize instrument software-EZDOS; centralize data processing) 3. Monitor, catalog and maintain instrument performance (document performance; modularize maintenance; integrate new technology) 4. Standardize and coordinate trial data entry (from individual sites) into centralized database 5. Monitor, audit and communicate all research procedures (database, teleconferences, training sessions) between participants ensuring "calibration". This manuscript describes our ongoing efforts, successes and challenges implementing these strategies.

  12. General Quality Control (QC) Guidelines for SAM Methods

    EPA Pesticide Factsheets

    Learn more about quality control guidelines and recommendations for the analysis of samples using the methods listed in EPA's Selected Analytical Methods for Environmental Remediation and Recovery (SAM).

  13. Quality-Assurance/Quality-Control Manual for Collection and Analysis of Water-Quality Data in the Ohio District, US Geological Survey

    USGS Publications Warehouse

    Francy, D.S.; Jones, A.L.; Myers, Donna N.; Rowe, G.L.; Eberle, Michael; Sarver, K.M.

    1998-01-01

    The U.S. Geological Survey (USGS), Water Resources Division (WRD), requires that quality-assurance/quality-control (QA/QC) activities be included in any sampling and analysis program. Operational QA/QC procedures address local needs while incorporating national policies. Therefore, specific technical policies were established for all activities associated with water-quality projects being done by the Ohio District. The policies described in this report provide Ohio District personnel, cooperating agencies, and others with a reference manual on QA/QC procedures that are followed in collecting and analyzing water-quality samples and reporting water-quality information in the Ohio District. The project chief, project support staff, District Water-Quality Specialist, and District Laboratory Coordinator are all involved in planning and implementing QA/QC activities at the district level. The District Chief and other district-level managers provide oversight, and the Regional Water-Quality Specialist, Office of Water Quality (USGS headquarters), and the Branch of Quality Systems within the Office of Water Quality create national QA/QC policies and provide assistance to District personnel. In the literature, the quality of all measurement data is expressed in terms of precision, variability, bias, accuracy, completeness, representativeness, and comparability. In the Ohio District, bias and variability will be used to describe quality-control data generated from samples in the field and laboratory. Each project chief must plan for implementation and financing of QA/QC activities necessary to achieve data-quality objectives. At least 15 percent of the total project effort must be directed toward QA/QC activities. Of this total, 5-10 percent will be used for collection and analysis of quality-control samples. This is an absolute minimum, and more may be required based on project objectives. Proper techniques must be followed in the collection and processing of surface

  14. Eight years of quality control in Bulgaria: impact on mammography practice.

    PubMed

    Avramova-Cholakova, S; Lilkov, G; Kaneva, M; Terziev, K; Nakov, I; Mutkurov, N; Kovacheva, D; Ivanova, M; Vasilev, D

    2015-07-01

    The requirements for quality control (QC) in diagnostic radiology were introduced in Bulgarian legislation in 2005. Hospital medical physicists and several private medical physics groups provide QC services to radiology departments. The aim of this study was to analyse data from QC tests in mammography and to investigate the impact of QC introduction on mammography practice in the country. The study was coordinated by the National Centre of Radiobiology and Radiation Protection. All medical physics services were requested to fill in standardised forms with information about most important parameters routinely measured during QC. All QC service providers responded. Results demonstrated significant improvement of practice since the introduction of QC, with reduction of established deviations from 65 % during the first year to 7 % in the last year. The systems that do not meet the acceptability criteria were suspended from use. Performance of automatic exposure control and digital detectors are not regularly tested because of the absence of requirements in the legislation. The need of updated guidance and training of medical physicists to reflect the change in technology was demonstrated. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  15. The effect of genome-wide association scan quality control on imputation outcome for common variants.

    PubMed

    Southam, Lorraine; Panoutsopoulou, Kalliope; Rayner, N William; Chapman, Kay; Durrant, Caroline; Ferreira, Teresa; Arden, Nigel; Carr, Andrew; Deloukas, Panos; Doherty, Michael; Loughlin, John; McCaskie, Andrew; Ollier, William E R; Ralston, Stuart; Spector, Timothy D; Valdes, Ana M; Wallis, Gillian A; Wilkinson, J Mark; Marchini, Jonathan; Zeggini, Eleftheria

    2011-05-01

    Imputation is an extremely valuable tool in conducting and synthesising genome-wide association studies (GWASs). Directly typed SNP quality control (QC) is thought to affect imputation quality. It is, therefore, common practice to use quality-controlled (QCed) data as an input for imputing genotypes. This study aims to determine the effect of commonly applied QC steps on imputation outcomes. We performed several iterations of imputing SNPs across chromosome 22 in a dataset consisting of 3177 samples with Illumina 610k (Illumina, San Diego, CA, USA) GWAS data, applying different QC steps each time. The imputed genotypes were compared with the directly typed genotypes. In addition, we investigated the correlation between alternatively QCed data. We also applied a series of post-imputation QC steps balancing elimination of poorly imputed SNPs and information loss. We found that the difference between the unQCed data and the fully QCed data on imputation outcome was minimal. Our study shows that imputation of common variants is generally very accurate and robust to GWAS QC, which is not a major factor affecting imputation outcome. A minority of common-frequency SNPs with particular properties cannot be accurately imputed regardless of QC stringency. These findings may not generalise to the imputation of low frequency and rare variants.
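    The core evaluation step, comparing imputed best-guess genotypes against the directly typed genotypes, can be sketched per SNP. The genotype vectors below are invented for illustration; real pipelines usually work from dosages or genotype probabilities rather than hard calls:

    ```python
    # Per-SNP concordance between imputed and directly typed genotypes,
    # coded as allele dosages 0/1/2; None marks a missing typed call.
    def snp_concordance(typed, imputed):
        """Fraction of samples where the imputed genotype matches the
        typed one, over samples with a non-missing typed call."""
        pairs = [(t, i) for t, i in zip(typed, imputed) if t is not None]
        if not pairs:
            return float("nan")
        return sum(t == i for t, i in pairs) / len(pairs)

    typed   = [0, 1, 2, 1, None, 0, 2, 1]   # hypothetical array calls
    imputed = [0, 1, 2, 2, 1,    0, 2, 1]   # hypothetical imputed best guesses
    print(snp_concordance(typed, imputed))  # 6 of 7 non-missing calls agree
    ```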

  16. An Automated Treatment Plan Quality Control Tool for Intensity-Modulated Radiation Therapy Using a Voxel-Weighting Factor-Based Re-Optimization Algorithm.

    PubMed

    Song, Ting; Li, Nan; Zarepisheh, Masoud; Li, Yongbao; Gautier, Quentin; Zhou, Linghong; Mell, Loren; Jiang, Steve; Cerviño, Laura

    2016-01-01

    Intensity-modulated radiation therapy (IMRT) currently plays an important role in radiotherapy, but its treatment plan quality can vary significantly among institutions and planners. Treatment plan quality control (QC) is a necessary component for individual clinics to ensure that patients receive treatments with high therapeutic gain ratios. The voxel-weighting factor-based plan re-optimization mechanism has been proved able to explore a larger Pareto surface (solution domain) and therefore increase the possibility of finding an optimal treatment plan. In this study, we incorporated additional modules into an in-house developed voxel weighting factor-based re-optimization algorithm, which was enhanced as a highly automated and accurate IMRT plan QC tool (TPS-QC tool). After importing an under-assessment plan, the TPS-QC tool was able to generate a QC report within 2 minutes. This QC report contains the plan quality determination as well as information supporting the determination. Finally, the IMRT plan quality can be controlled by approving quality-passed plans and replacing quality-failed plans using the TPS-QC tool. The feasibility and accuracy of the proposed TPS-QC tool were evaluated using 25 clinically approved cervical cancer patient IMRT plans and 5 manually created poor-quality IMRT plans. The results showed high consistency between the QC report quality determinations and the actual plan quality. In the 25 clinically approved cases that the TPS-QC tool identified as passed, a greater difference could be observed for dosimetric endpoints for organs at risk (OAR) than for planning target volume (PTV), implying that better dose sparing could be achieved in OAR than in PTV. In addition, the dose-volume histogram (DVH) curves of the TPS-QC tool re-optimized plans satisfied the dosimetric criteria more frequently than did the under-assessment plans. 
In addition, the criteria for unsatisfied dosimetric endpoints in the 5 poor-quality plans could typically be

  17. An Automated Treatment Plan Quality Control Tool for Intensity-Modulated Radiation Therapy Using a Voxel-Weighting Factor-Based Re-Optimization Algorithm

    PubMed Central

    Song, Ting; Li, Nan; Zarepisheh, Masoud; Li, Yongbao; Gautier, Quentin; Zhou, Linghong; Mell, Loren; Jiang, Steve; Cerviño, Laura

    2016-01-01

    Intensity-modulated radiation therapy (IMRT) currently plays an important role in radiotherapy, but its treatment plan quality can vary significantly among institutions and planners. Treatment plan quality control (QC) is a necessary component for individual clinics to ensure that patients receive treatments with high therapeutic gain ratios. The voxel-weighting factor-based plan re-optimization mechanism has been proved able to explore a larger Pareto surface (solution domain) and therefore increase the possibility of finding an optimal treatment plan. In this study, we incorporated additional modules into an in-house developed voxel weighting factor-based re-optimization algorithm, which was enhanced as a highly automated and accurate IMRT plan QC tool (TPS-QC tool). After importing an under-assessment plan, the TPS-QC tool was able to generate a QC report within 2 minutes. This QC report contains the plan quality determination as well as information supporting the determination. Finally, the IMRT plan quality can be controlled by approving quality-passed plans and replacing quality-failed plans using the TPS-QC tool. The feasibility and accuracy of the proposed TPS-QC tool were evaluated using 25 clinically approved cervical cancer patient IMRT plans and 5 manually created poor-quality IMRT plans. The results showed high consistency between the QC report quality determinations and the actual plan quality. In the 25 clinically approved cases that the TPS-QC tool identified as passed, a greater difference could be observed for dosimetric endpoints for organs at risk (OAR) than for planning target volume (PTV), implying that better dose sparing could be achieved in OAR than in PTV. In addition, the dose-volume histogram (DVH) curves of the TPS-QC tool re-optimized plans satisfied the dosimetric criteria more frequently than did the under-assessment plans. 
In addition, the criteria for unsatisfied dosimetric endpoints in the 5 poor-quality plans could typically be

  18. SprayQc: a real-time LC-MS/MS quality monitoring system to maximize uptime using off the shelf components.

    PubMed

    Scheltema, Richard A; Mann, Matthias

    2012-06-01

    With the advent of high-throughput mass spectrometry (MS)-based proteomics, the magnitude and complexity of the performed experiments has increased dramatically. Likewise, investments in chromatographic and MS instrumentation are a large proportion of the budget of proteomics laboratories. Guarding measurement quality and maximizing uptime of the LC-MS/MS systems therefore requires constant care despite automated workflows. We describe a real-time surveillance system, called SprayQc, that continuously monitors the status of the peripheral equipment to ensure that operational parameters are within an acceptable range. SprayQc is composed of multiple plug-in software components that use computer vision to analyze electrospray conditions, monitor the chromatographic device for stable backpressure, interact with a column oven to control pressure by temperature, and ensure that the mass spectrometer is still acquiring data. Action is taken when a failure condition has been detected, such as stopping the column oven and the LC flow, as well as automatically notifying the appropriate operator. Additionally, all defined metrics can be recorded synchronized on retention time with the MS acquisition file, allowing for later inspection and providing valuable information for optimization. SprayQc has been extensively tested in our laboratory, supports third-party plug-in development, and is freely available for download from http://sourceforge.org/projects/sprayqc.

  19. 76 FR 67315 - Supplemental Nutrition Assistance Program: Quality Control Error Tolerance Threshold

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-01

    ...This direct final rule is amending the Quality Control (QC) review error threshold in our regulations from $25.00 to $50.00. The purpose for raising the QC error threshold is to make permanent the temporary threshold change that was required by the American Recovery and Reinvestment Act of 2009. This change does not have an impact on the public. The QC system measures the accuracy of the eligibility system for the Supplemental Nutrition Assistance Program (SNAP).

  20. Comparison of quality control software tools for diffusion tensor imaging.

    PubMed

    Liu, Bilan; Zhu, Tong; Zhong, Jianhui

    2015-04-01

    Image quality of diffusion tensor imaging (DTI) is critical for image interpretation, diagnostic accuracy and efficiency. However, DTI is susceptible to numerous detrimental artifacts that may impair the reliability and validity of the obtained data. Although many quality control (QC) software tools are being developed and are widely used and each has its different tradeoffs, there is still no general agreement on an image quality control routine for DTIs, and the practical impact of these tradeoffs is not well studied. An objective comparison that identifies the pros and cons of each of the QC tools will be helpful for the users to make the best choice among tools for specific DTI applications. This study aims to quantitatively compare the effectiveness of three popular QC tools including DTI studio (Johns Hopkins University), DTIprep (University of North Carolina at Chapel Hill, University of Iowa and University of Utah) and TORTOISE (National Institute of Health). Both synthetic and in vivo human brain data were used to quantify adverse effects of major DTI artifacts to tensor calculation as well as the effectiveness of different QC tools in identifying and correcting these artifacts. The technical basis of each tool was discussed, and the ways in which particular techniques affect the output of each of the tools were analyzed. The different functions and I/O formats that three QC tools provide for building a general DTI processing pipeline and integration with other popular image processing tools were also discussed. Copyright © 2015 Elsevier Inc. All rights reserved.

  1. Establishing daily quality control (QC) in screen-film mammography using leeds tor (max) phantom at the breast imaging unit of USTH-Benavides Cancer Institute

    NASA Astrophysics Data System (ADS)

    Acaba, K. J. C.; Cinco, L. D.; Melchor, J. N.

    2016-03-01

    Daily QC tests performed on screen film mammography (SFM) equipment are essential to ensure that both the SFM unit and the film processor are working in a consistent manner. The Breast Imaging Unit of USTH-Benavides Cancer Institute has been conducting QC following the test protocols in the IAEA Human Health Series No.2 manual. However, the availability of the Leeds breast phantom (CRP E13039) in the facility made the task easier. Instead of carrying out separate tests on AEC constancy and light sensitometry, only one exposure of the phantom is done to accomplish the two tests. It was observed that measurements made on mAs output and optical densities (ODs) using the Leeds TOR (MAX) phantom are comparable with those obtained from the usual conduct of tests, taking into account the attenuation characteristic of the phantom. Image quality parameters such as low contrast and high contrast details were also evaluated from the phantom image. The authors recognize the usefulness of the phantom in determining technical factors that will help improve detection of the smallest pathological details on breast images. The phantom is also convenient for daily QC monitoring and economical, since fewer films are expended.

  2. From field notes to data portal - An operational QA/QC framework for tower networks

    NASA Astrophysics Data System (ADS)

    Sturtevant, C.; Hackley, S.; Meehan, T.; Roberti, J. A.; Holling, G.; Bonarrigo, S.

    2016-12-01

    Quality assurance and control (QA/QC) is one of the most important yet challenging aspects of producing research-quality data. This is especially so for environmental sensor networks collecting numerous high-frequency measurement streams at distributed sites. Here, the quality issues are multi-faceted, including sensor malfunctions, unmet theoretical assumptions, and measurement interference from the natural environment. To complicate matters, there are often multiple personnel managing different sites or different steps in the data flow. For large, centrally managed sensor networks such as NEON, the separation of field and processing duties is in the extreme. Tower networks such as Ameriflux, ICOS, and NEON continue to grow in size and sophistication, yet tools for robust, efficient, scalable QA/QC have lagged. Quality control remains a largely manual process relying on visual inspection of the data. In addition, notes of observed measurement interference or visible problems are often recorded on paper without an explicit pathway to data flagging during processing. As such, an increase in network size requires a near-proportional increase in personnel devoted to QA/QC, quickly stressing the human resources available. There is a need for a scalable, operational QA/QC framework that combines the efficiency and standardization of automated tests with the power and flexibility of visual checks, and includes an efficient communication pathway from field personnel to data processors to end users. Here we propose such a framework and an accompanying set of tools in development, including a mobile application template for recording tower maintenance and an R/shiny application for efficiently monitoring and synthesizing data quality issues. This framework seeks to incorporate lessons learned from the Ameriflux community and provide tools to aid continued network advancements.

  3. Employing quality control and feedback to the EQ-5D-5L valuation protocol to improve the quality of data collection.

    PubMed

    Purba, Fredrick Dermawan; Hunfeld, Joke A M; Iskandarsyah, Aulia; Fitriana, Titi Sahidah; Sadarjoen, Sawitri S; Passchier, Jan; Busschbach, Jan J V

    2017-05-01

    In valuing health states using generic questionnaires such as EQ-5D, there are unrevealed issues with the quality of the data collection. The aims were to describe the problems encountered during valuation and to evaluate a quality control report and subsequent retraining of interviewers in improving this valuation. Data from the first 266 respondents in an EQ-5D-5L valuation study were used. Interviewers were trained and answered questions regarding problems during these initial interviews. Thematic analysis was used, and individual feedback was provided. After completion of 98 interviews, a first quantitative quality control (QC) report was generated, followed by a 1-day retraining program. Subsequently individual feedback was also given on the basis of follow-up QCs. The Wilcoxon signed-rank test was used to assess improvements based on 7 indicators of quality as identified in the first QC and the QC conducted after a further 168 interviews. Interviewers encountered problems in recruiting respondents. Solutions provided were: optimization of the time of interview, the use of broader networks and the use of different scripts to explain the project's goals to respondents. For problems in interviewing process, solutions applied were: developing the technical and personal skills of the interviewers and stimulating the respondents' thought processes. There were also technical problems related to hardware, software and internet connections. There was an improvement in all 7 indicators of quality after the second QC. Training before and during a study, and individual feedback on the basis of a quantitative QC, can increase the validity of values obtained from generic questionnaires.

  4. Challenges in Development of Sperm Repositories for Biomedical Fishes: Quality Control in Small-Bodied Species.

    PubMed

    Torres, Leticia; Liu, Yue; Guitreau, Amy; Yang, Huiping; Tiersch, Terrence R

    2017-12-01

Quality control (QC) is essential for reproducible and efficient functioning of germplasm repositories. However, many biomedical fish models present significant QC challenges due to small body sizes (<5 cm) and minuscule sperm volumes (<5 μL). Using minimal volumes of sperm, we used zebrafish to evaluate common QC endpoints as surrogates for fertilization success along sequential steps of cryopreservation. First, concentrations of calibration bead suspensions were evaluated with a Makler® counting chamber by using different sample volumes and mixing methods. For sperm analysis, samples were initially diluted at a 1:30 ratio with Hanks' balanced salt solution (HBSS). Motility was evaluated by using different ratios of sperm and activation medium, and membrane integrity was analyzed with flow cytometry at different concentrations. Concentration and sperm motility could be confidently estimated by using volumes as small as 1 μL, whereas membrane integrity required a minimum of 2 μL (at 1 × 10⁶ cells/mL). Thus, <5 μL of sperm suspension (after dilution to 30-150 μL with HBSS) was required to evaluate sperm quality by using three endpoints. Sperm quality assessment using a combination of complementary endpoints enhances QC efforts during cryopreservation, increasing reliability and reproducibility, and reducing waste of time and resources.

  5. The importance of quality control in validating concentrations ...

    EPA Pesticide Factsheets

A national-scale survey of 247 contaminants of emerging concern (CECs), including organic and inorganic chemical compounds, and microbial contaminants, was conducted in source and treated drinking water samples from 25 treatment plants across the United States. Multiple methods were used to determine these CECs, including six analytical methods to measure 174 pharmaceuticals, personal care products, and pesticides. A three-component quality assurance/quality control (QA/QC) program was designed for the subset of 174 CECs which allowed us to assess and compare performances of the methods used. The three components included: 1) a common field QA/QC protocol and sample design, 2) individual investigator-developed method-specific QA/QC protocols, and 3) a suite of 46 method comparison analytes that were determined in two or more analytical methods. Overall method performance for the 174 organic chemical CECs was assessed by comparing spiked recoveries in reagent, source, and treated water over a two-year period. In addition to the 247 CECs reported in the larger drinking water study, another 48 pharmaceutical compounds measured did not consistently meet predetermined quality standards. Methodologies that did not seem suitable for these analytes are discussed. The need to exclude analytes based on method performance demonstrates the importance of additional QA/QC protocols. This paper compares the method performance of six analytical methods used to measure 174 emer

  6. Operational quality control of daily precipitation using spatio-climatological consistency testing

    NASA Astrophysics Data System (ADS)

    Scherrer, S. C.; Croci-Maspoli, M.; van Geijtenbeek, D.; Naguel, C.; Appenzeller, C.

    2010-09-01

Quality control (QC) of meteorological data is of utmost importance for climate-related decisions. The search for an effective automated QC of precipitation data has proven difficult, and many weather services, including MeteoSwiss, still rely mainly on manual inspection of daily precipitation. However, manpower limitations force many weather services to move towards less labour-intensive, more automated QC, with the challenge of keeping data quality high. In the last decade, several approaches have been presented to objectify daily precipitation QC. Here we present a spatio-climatological approach that will be implemented operationally at MeteoSwiss. It combines information from the event-based spatial distribution of each day's precipitation field with historical information on the interpolation error across different precipitation intensity intervals. Expert judgement shows that the system detects potential outliers very well (hardly any missed errors) without creating too many false alarms that need human inspection: 50-80% of all flagged values have been classified as real errors by the data editor, much better than the roughly 15-20% achieved using standard spatial regression tests. Also very helpful in the QC process is the automatic redistribution of accumulated several-day sums. Manual inspection in operations can be reduced and the QC of precipitation substantially objectified.
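The core idea above, flag a daily value when its deviation from the spatially interpolated field exceeds what history says is plausible for that intensity class, can be sketched as follows. The class boundaries and the 99th-percentile threshold are illustrative assumptions, not MeteoSwiss's operational settings.

```python
import bisect

BOUNDS_MM = [0.1, 1.0, 10.0, 50.0]  # illustrative intensity-class boundaries (mm/day)

def intensity_class(value_mm):
    return bisect.bisect(BOUNDS_MM, value_mm)

def build_thresholds(hist_obs, hist_interp, q=0.99):
    """Per-class quantile of the historical |observed - interpolated| error."""
    errors = {}
    for obs, interp in zip(hist_obs, hist_interp):
        errors.setdefault(intensity_class(obs), []).append(abs(obs - interp))
    return {c: sorted(e)[int(q * (len(e) - 1))] for c, e in errors.items()}

def flag(obs_mm, interp_mm, thresholds):
    """True if today's deviation is climatologically implausible."""
    limit = thresholds.get(intensity_class(obs_mm), float("inf"))
    return abs(obs_mm - interp_mm) > limit

# Hypothetical history: ten days of ~5 mm observations and their interpolations
thr = build_thresholds([5.0] * 10, [5.0 - 0.1 * i for i in range(10)])
suspicious = flag(5.0, 3.9, thr)  # deviation 1.1 mm exceeds the historical threshold
```

Station values with unseen intensity classes fall back to an infinite limit here (never flagged); an operational system would need an explicit policy for such cases.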

  7. An Automatic Quality Control Pipeline for High-Throughput Screening Hit Identification.

    PubMed

    Zhai, Yufeng; Chen, Kaisheng; Zhong, Yang; Zhou, Bin; Ainscow, Edward; Wu, Ying-Ta; Zhou, Yingyao

    2016-09-01

The correction or removal of signal errors in high-throughput screening (HTS) data is critical to the identification of high-quality lead candidates. Although a number of strategies have been previously developed to correct systematic errors and to remove screening artifacts, they are not universally effective and still require a fair amount of human intervention. We introduce a fully automated quality control (QC) pipeline that can correct generic interplate systematic errors and remove intraplate random artifacts. The new pipeline was first applied to ~100 large-scale historical HTS assays; in silico analysis showed auto-QC led to a noticeably stronger structure-activity relationship. The method was further tested in several independent HTS runs, where QC results were sampled for experimental validation. Significantly increased hit confirmation rates were obtained after the QC steps, confirming that the proposed method was effective in enriching true-positive hits. An implementation of the algorithm is available to the screening community. © 2016 Society for Laboratory Automation and Screening.
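The abstract does not specify the pipeline's correction algorithm; a standard technique for removing row/column (positional) systematic errors from a screening plate, and the basis of the familiar B-score, is Tukey's median polish, sketched here under that assumption.

```python
from statistics import median

def median_polish(plate, n_iter=10):
    """Decompose plate[i][j] into overall + row_eff[i] + col_eff[j] + residual.

    The residuals are the plate readings with additive row/column
    (positional) systematic effects removed.
    """
    resid = [row[:] for row in plate]
    n_rows, n_cols = len(plate), len(plate[0])
    overall = 0.0
    row_eff = [0.0] * n_rows
    col_eff = [0.0] * n_cols
    for _ in range(n_iter):
        for i in range(n_rows):            # sweep rows: remove each row's median
            m = median(resid[i])
            row_eff[i] += m
            resid[i] = [v - m for v in resid[i]]
        shift = median(row_eff)
        overall += shift
        row_eff = [r - shift for r in row_eff]
        for j in range(n_cols):            # sweep columns: remove each column's median
            m = median(resid[i][j] for i in range(n_rows))
            col_eff[j] += m
            for i in range(n_rows):
                resid[i][j] -= m
        shift = median(col_eff)
        overall += shift
        col_eff = [c - shift for c in col_eff]
    return overall, row_eff, col_eff, resid

# Synthetic 3x3 plate with purely additive row and column effects
plate = [[100 + r + c for c in (-1, 0, 1)] for r in (-2, 0, 2)]
overall, row_eff, col_eff, resid = median_polish(plate)
```

On this synthetic additive plate the residuals go to zero after one sweep; on real plates the residuals (scaled by a robust spread) are what would feed hit selection.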

  8. Quality Control Practices for Chemistry and Immunochemistry in a Cohort of 21 Large Academic Medical Centers.

    PubMed

    Rosenbaum, Matthew W; Flood, James G; Melanson, Stacy E F; Baumann, Nikola A; Marzinke, Mark A; Rai, Alex J; Hayden, Joshua; Wu, Alan H B; Ladror, Megan; Lifshitz, Mark S; Scott, Mitchell G; Peck-Palmer, Octavia M; Bowen, Raffick; Babic, Nikolina; Sobhani, Kimia; Giacherio, Donald; Bocsi, Gregary T; Herman, Daniel S; Wang, Ping; Toffaletti, John; Handel, Elizabeth; Kelly, Kathleen A; Albeiroti, Sami; Wang, Sihe; Zimmer, Melissa; Driver, Brandon; Yi, Xin; Wilburn, Clayton; Lewandrowski, Kent B

    2018-05-29

    In the United States, minimum standards for quality control (QC) are specified in federal law under the Clinical Laboratory Improvement Amendment and its revisions. Beyond meeting this required standard, laboratories have flexibility to determine their overall QC program. We surveyed chemistry and immunochemistry QC procedures at 21 clinical laboratories within leading academic medical centers to assess if standardized QC practices exist for chemistry and immunochemistry testing. We observed significant variation and unexpected similarities in practice across laboratories, including QC frequency, cutoffs, number of levels analyzed, and other features. This variation in practice indicates an opportunity exists to establish an evidence-based approach to QC that can be generalized across institutions.

  9. HANDBOOK: QUALITY ASSURANCE/QUALITY CONTROL (QA/QC) PROCEDURES FOR HAZARDOUS WASTE INCINERATION

    EPA Science Inventory

Resource Conservation and Recovery Act regulations for hazardous waste incineration require trial burns by permit applicants. A Quality Assurance Project Plan (QAPjP) must accompany a trial burn plan with appropriate quality assurance/quality control procedures. Guidance on the prepa...

  10. Sci-Fri AM: Quality, Safety, and Professional Issues 01: CPQR Technical Quality Control Suite Development including Quality Control Workload Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Malkoske, Kyle; Nielsen, Michelle; Brown, Erika

A close partnership between the Canadian Partnership for Quality Radiotherapy (CPQR) and the Canadian Organization of Medical Physicists' (COMP) Quality Assurance and Radiation Safety Advisory Committee (QARSAC) has resulted in the development of a suite of Technical Quality Control (TQC) Guidelines for radiation treatment equipment that outline specific performance objectives and criteria equipment should meet in order to assure an acceptable level of radiation treatment quality. The framework includes consolidation of existing guidelines and/or literature by expert reviewers, structured stages of public review, external field-testing, and ratification by COMP. The adopted framework for the development and maintenance of the TQCs ensures the guidelines incorporate input from the medical physics community during development, measures the workload required to perform the QC tests outlined in each TQC, and remain relevant (i.e., "living documents") through subsequent planned reviews and updates. This presentation will use the Multi-Leaf Linear Accelerator document as an example of how feedback and cross-national collaboration achieved a robust guidance document. During field-testing, each technology was tested at multiple centres in a variety of clinic environments. As part of the defined feedback, workload data were captured, yielding average times for the testing defined in each TQC document. As a result, for a medium-sized centre comprising 6 linear accelerators and a comprehensive brachytherapy program, we estimate the physics workload at 1.5 full-time-equivalent physicists per year to complete all QC tests listed in this suite.

  11. The importance of quality assurance/quality control of diagnostics to increase the confidence in global foot-and-mouth disease control.

    PubMed

    De Clercq, K; Goris, N; Barnett, P V; MacKay, D K

    2008-01-01

Over the last decade, international trade in animals and animal products has been liberalized, and confidence in this global trade can increase only if appropriate control measures are applied. As foot-and-mouth disease (FMD) diagnostics will play an essential role in this respect, the Food and Agriculture Organization European Commission for the Control of Foot-and-Mouth Disease (EUFMD) co-ordinates, in collaboration with the European Commission, several programmes to increase the quality of FMD diagnostics. A quality assurance (QA) system is deemed essential for laboratories involved in certifying absence of FMDV or of antibodies against the virus. Therefore, laboratories are encouraged to validate their diagnostic tests fully and to install a continuous quality control (QC) monitoring system. Knowledge of the performance characteristics of diagnostics is essential to interpret results correctly and to calculate sample rates in regional surveillance campaigns. Different aspects of QA/QC of classical and new FMD virological and serological diagnostics are discussed with respect to the EU FMD directive (2003/85/EC). We recommend accepting trade certificates only from laboratories participating in international proficiency testing on a regular basis.

  12. Multileaf collimator performance monitoring and improvement using semiautomated quality control testing and statistical process control.

    PubMed

    Létourneau, Daniel; Wang, An; Amin, Md Nurul; Pearce, Jim; McNiven, Andrea; Keller, Harald; Norrlinger, Bernhard; Jaffray, David A

    2014-12-01

    High-quality radiation therapy using highly conformal dose distributions and image-guided techniques requires optimum machine delivery performance. In this work, a monitoring system for multileaf collimator (MLC) performance, integrating semiautomated MLC quality control (QC) tests and statistical process control tools, was developed. The MLC performance monitoring system was used for almost a year on two commercially available MLC models. Control charts were used to establish MLC performance and assess test frequency required to achieve a given level of performance. MLC-related interlocks and servicing events were recorded during the monitoring period and were investigated as indicators of MLC performance variations. The QC test developed as part of the MLC performance monitoring system uses 2D megavoltage images (acquired using an electronic portal imaging device) of 23 fields to determine the location of the leaves with respect to the radiation isocenter. The precision of the MLC performance monitoring QC test and the MLC itself was assessed by detecting the MLC leaf positions on 127 megavoltage images of a static field. After initial calibration, the MLC performance monitoring QC test was performed 3-4 times/week over a period of 10-11 months to monitor positional accuracy of individual leaves for two different MLC models. Analysis of test results was performed using individuals control charts per leaf with control limits computed based on the measurements as well as two sets of specifications of ± 0.5 and ± 1 mm. Out-of-specification and out-of-control leaves were automatically flagged by the monitoring system and reviewed monthly by physicists. MLC-related interlocks reported by the linear accelerator and servicing events were recorded to help identify potential causes of nonrandom MLC leaf positioning variations. The precision of the MLC performance monitoring QC test and the MLC itself was within ± 0.22 mm for most MLC leaves and the majority of the
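The "individuals control charts per leaf with control limits computed based on the measurements" described above follow the standard I-chart construction: center ± 2.66 × average moving range. The leaf-position errors below are made up for illustration; the ±0.5 mm specification is the tighter of the two quoted in the abstract.

```python
def i_chart_limits(xs):
    """Individuals (I) chart limits: mean +/- 2.66 * average moving range."""
    moving_ranges = [abs(b - a) for a, b in zip(xs, xs[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    center = sum(xs) / len(xs)
    return center - 2.66 * mr_bar, center + 2.66 * mr_bar

def flag_leaf(xs, spec_mm=0.5):
    """Indices that are out of control (I-chart) or out of specification."""
    lcl, ucl = i_chart_limits(xs)
    return [i for i, x in enumerate(xs)
            if not (lcl <= x <= ucl) or abs(x) > spec_mm]

# Hypothetical daily position errors (mm) for one leaf
errors = [0.02, 0.03, 0.01, 0.04, 0.02, 0.90, 0.03]
flagged = flag_leaf(errors)  # the 0.90 mm reading exceeds the 0.5 mm spec
```

Note the two flagging criteria are independent, matching the abstract's distinction between "out-of-specification" and "out-of-control" leaves: a reading can violate the fixed specification while still sitting inside statistically derived control limits, as the 0.90 mm value does here.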

  13. Multileaf collimator performance monitoring and improvement using semiautomated quality control testing and statistical process control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Létourneau, Daniel, E-mail: daniel.letourneau@rmp.uh.on.ca; McNiven, Andrea; Keller, Harald

    2014-12-15

Purpose: High-quality radiation therapy using highly conformal dose distributions and image-guided techniques requires optimum machine delivery performance. In this work, a monitoring system for multileaf collimator (MLC) performance, integrating semiautomated MLC quality control (QC) tests and statistical process control tools, was developed. The MLC performance monitoring system was used for almost a year on two commercially available MLC models. Control charts were used to establish MLC performance and assess test frequency required to achieve a given level of performance. MLC-related interlocks and servicing events were recorded during the monitoring period and were investigated as indicators of MLC performance variations. Methods: The QC test developed as part of the MLC performance monitoring system uses 2D megavoltage images (acquired using an electronic portal imaging device) of 23 fields to determine the location of the leaves with respect to the radiation isocenter. The precision of the MLC performance monitoring QC test and the MLC itself was assessed by detecting the MLC leaf positions on 127 megavoltage images of a static field. After initial calibration, the MLC performance monitoring QC test was performed 3–4 times/week over a period of 10–11 months to monitor positional accuracy of individual leaves for two different MLC models. Analysis of test results was performed using individuals control charts per leaf with control limits computed based on the measurements as well as two sets of specifications of ±0.5 and ±1 mm. Out-of-specification and out-of-control leaves were automatically flagged by the monitoring system and reviewed monthly by physicists. MLC-related interlocks reported by the linear accelerator and servicing events were recorded to help identify potential causes of nonrandom MLC leaf positioning variations. Results: The precision of the MLC performance monitoring QC test and the MLC itself was within ±0.22 mm for most MLC

  14. Droplet digital PCR-based EGFR mutation detection with an internal quality control index to determine the quality of DNA.

    PubMed

    Kim, Sung-Su; Choi, Hyun-Jeung; Kim, Jin Ju; Kim, M Sun; Lee, In-Seon; Byun, Bohyun; Jia, Lina; Oh, Myung Ryurl; Moon, Youngho; Park, Sarah; Choi, Joon-Seok; Chae, Seoung Wan; Nam, Byung-Ho; Kim, Jin-Soo; Kim, Jihun; Min, Byung Soh; Lee, Jae Seok; Won, Jae-Kyung; Cho, Soo Youn; Choi, Yoon-La; Shin, Young Kee

    2018-01-11

    In clinical translational research and molecular in vitro diagnostics, a major challenge in the detection of genetic mutations is overcoming artefactual results caused by the low-quality of formalin-fixed paraffin-embedded tissue (FFPET)-derived DNA (FFPET-DNA). Here, we propose the use of an 'internal quality control (iQC) index' as a criterion for judging the minimum quality of DNA for PCR-based analyses. In a pre-clinical study comparing the results from droplet digital PCR-based EGFR mutation test (ddEGFR test) and qPCR-based EGFR mutation test (cobas EGFR test), iQC index ≥ 0.5 (iQC copies ≥ 500, using 3.3 ng of FFPET-DNA [1,000 genome equivalents]) was established, indicating that more than half of the input DNA was amplifiable. Using this criterion, we conducted a retrospective comparative clinical study of the ddEGFR and cobas EGFR tests for the detection of EGFR mutations in non-small cell lung cancer (NSCLC) FFPET-DNA samples. Compared with the cobas EGFR test, the ddEGFR test exhibited superior analytical performance and equivalent or higher clinical performance. Furthermore, iQC index is a reliable indicator of the quality of FFPET-DNA and could be used to prevent incorrect diagnoses arising from low-quality samples.
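The iQC index arithmetic implied by the abstract, 3.3 ng of FFPET-DNA ≈ 1,000 genome equivalents and index = amplifiable iQC copies per genome equivalent, can be sketched as below. The 3.3 pg-per-haploid-genome constant is the usual human figure and an assumption here, not a value stated in the abstract.

```python
PG_PER_HAPLOID_GENOME = 3.3  # ~3.3 pg DNA per haploid human genome (assumption)

def genome_equivalents(ng_dna):
    # 1 ng = 1000 pg, so 3.3 ng of DNA corresponds to ~1,000 genome equivalents
    return ng_dna * 1000.0 / PG_PER_HAPLOID_GENOME

def iqc_index(iqc_copies, ng_dna):
    """Fraction of input genome equivalents that proved amplifiable."""
    return iqc_copies / genome_equivalents(ng_dna)

def dna_passes_qc(iqc_copies, ng_dna, threshold=0.5):
    """Apply the paper's iQC index >= 0.5 acceptance criterion."""
    return iqc_index(iqc_copies, ng_dna) >= threshold

# 500 amplifiable copies from 3.3 ng (1,000 genome equivalents) gives an index of ~0.5
```

The threshold encodes the abstract's criterion that more than half of the input DNA must be amplifiable before a mutation call is trusted.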

  15. Disk diffusion quality control guidelines for NVP-PDF 713: a novel peptide deformylase inhibitor.

    PubMed

    Anderegg, Tamara R; Jones, Ronald N

    2004-01-01

NVP-PDF 713 is a peptide deformylase inhibitor that has emerged as a candidate for treating Gram-positive infections and selected Gram-negative species that commonly cause community-acquired respiratory tract infections. This report summarizes the results of a multi-center (seven participants) disk diffusion quality control (QC) investigation for NVP-PDF 713 using guidelines of the National Committee for Clinical Laboratory Standards and the standardized disk diffusion method. A total of 420 NVP-PDF 713 zone diameter values were generated for each QC organism. The proposed zone diameter ranges contained 97.6-99.8% of the reported participant results and were: Staphylococcus aureus ATCC 25923 (25-35 mm), Streptococcus pneumoniae ATCC 49619 (30-37 mm), and Haemophilus influenzae ATCC 49247 (24-32 mm). These QC criteria for the disk diffusion method should be applied during the NVP-PDF 713 clinical trials to maximize test accuracy.

  16. The quality control theory of aging.

    PubMed

    Ladiges, Warren

    2014-01-01

    The quality control (QC) theory of aging is based on the concept that aging is the result of a reduction in QC of cellular systems designed to maintain lifelong homeostasis. Four QC systems associated with aging are 1) inadequate protein processing in a distressed endoplasmic reticulum (ER); 2) histone deacetylase (HDAC) processing of genomic histones and gene silencing; 3) suppressed AMPK nutrient sensing with inefficient energy utilization and excessive fat accumulation; and 4) beta-adrenergic receptor (BAR) signaling and environmental and emotional stress. Reprogramming these systems to maintain efficiency and prevent aging would be a rational strategy for increased lifespan and improved health. The QC theory can be tested with a pharmacological approach using three well-known and safe, FDA-approved drugs: 1) phenyl butyric acid, a chemical chaperone that enhances ER function and is also an HDAC inhibitor, 2) metformin, which activates AMPK and is used to treat type 2 diabetes, and 3) propranolol, a beta blocker which inhibits BAR signaling and is used to treat hypertension and anxiety. A critical aspect of the QC theory, then, is that aging is associated with multiple cellular systems that can be targeted with drug combinations more effectively than with single drugs. But more importantly, these drug combinations will effectively prevent, delay, or reverse chronic diseases of aging that impose such a tremendous health burden on our society.

  17. Quality Control of Wind Data from 50-MHz Doppler Radar Wind Profiler

    NASA Technical Reports Server (NTRS)

    Vacek, Austin

    2016-01-01

Upper-level wind profiles obtained from a 50-MHz Doppler Radar Wind Profiler (DRWP) instrument at Kennedy Space Center are incorporated in space launch vehicle design and day-of-launch operations to assess wind effects on the vehicle during ascent. Automated and manual quality control (QC) techniques are implemented to remove spurious data in the upper-level wind profiles caused by atmospheric and non-atmospheric artifacts over the 2010-2012 period of record (POR). By adding the new quality-controlled profiles to older profiles from 1997-2009, a robust database of upper-level wind characteristics will be constructed. Statistical analysis will determine the maximum, minimum, and 95th percentile of the wind components from the DRWP profiles over the recent POR and compare them against the older database. Additionally, this study identifies specific QC flags triggered during the QC process to understand how much data is retained and removed from the profiles.

  18. Quality Control of Wind Data from 50-MHz Doppler Radar Wind Profiler

    NASA Technical Reports Server (NTRS)

    Vacek, Austin

    2015-01-01

Upper-level wind profiles obtained from a 50-MHz Doppler Radar Wind Profiler (DRWP) instrument at Kennedy Space Center are incorporated in space launch vehicle design and day-of-launch operations to assess wind effects on the vehicle during ascent. Automated and manual quality control (QC) techniques are implemented to remove spurious data in the upper-level wind profiles caused by atmospheric and non-atmospheric artifacts over the 2010-2012 period of record (POR). By adding the new quality-controlled profiles to older profiles from 1997-2009, a robust database of upper-level wind characteristics will be constructed. Statistical analysis will determine the maximum, minimum, and 95th percentile of the wind components from the DRWP profiles over the recent POR and compare them against the older database. Additionally, this study identifies specific QC flags triggered during the QC process to understand how much data is retained and removed from the profiles.

  19. Analysis of quality control data of eight modern radiotherapy linear accelerators: the short- and long-term behaviours of the outputs and the reproducibility of quality control measurements

    NASA Astrophysics Data System (ADS)

    Kapanen, Mika; Tenhunen, Mikko; Hämäläinen, Tuomo; Sipilä, Petri; Parkkinen, Ritva; Järvinen, Hannu

    2006-07-01

Quality control (QC) data of radiotherapy linear accelerators, collected by Helsinki University Central Hospital between the years 2000 and 2004, were analysed. The goal was to provide information for the evaluation and elaboration of QC of accelerator outputs and to propose a method for QC data analysis. Short- and long-term drifts in outputs were quantified by fitting empirical mathematical models to the QC measurements. Normally, long-term drifts were well (≤1%) modelled by either a straight line or a single-exponential function. A drift of 2% occurred in 18 ± 12 months. The shortest drift times of only 2-3 months were observed for some new accelerators just after commissioning, but they stabilized during the first 2-3 years. The short-term reproducibility and the long-term stability of local constancy checks, carried out with a sealed plane-parallel ion chamber, were also estimated by fitting empirical models to the QC measurements. The reproducibility was 0.2-0.5% depending on the positioning practice of a device. Long-term instabilities of about 0.3%/month were observed for some checking devices. The reproducibility of local absorbed dose measurements was estimated to be about 0.5%. The proposed empirical model fitting of QC data facilitates the recognition of erroneous QC measurements and abnormal output behaviour, caused by malfunctions, offering a tool to improve dose control.
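The straight-line drift modelling described above, fit a line to the output QC measurements and read off the time to a 2% drift, can be sketched as follows. The monthly output ratios are synthetic, and the paper's single-exponential fits are omitted for brevity.

```python
def fit_line(ts, ys):
    """Ordinary least-squares fit of y = intercept + slope * t."""
    n = len(ts)
    t_mean, y_mean = sum(ts) / n, sum(ys) / n
    slope = (sum((t - t_mean) * (y - y_mean) for t, y in zip(ts, ys))
             / sum((t - t_mean) ** 2 for t in ts))
    return y_mean - slope * t_mean, slope

# Synthetic monthly output ratios drifting by ~0.1% per month
months = list(range(12))
outputs = [1.0 + 0.001 * t for t in months]
intercept, slope = fit_line(months, outputs)
months_to_2_percent = 0.02 / abs(slope)  # ~20 months, within the reported 18 +/- 12
```

Comparing such a fitted curve against each new QC point is what lets the method flag both erroneous measurements (points far off the fit) and genuine changes in output behaviour (a change in the fitted drift rate).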

  20. Use of Six Sigma Worksheets for assessment of internal and external failure costs associated with candidate quality control rules for an ADVIA 120 hematology analyzer.

    PubMed

    Cian, Francesco; Villiers, Elisabeth; Archer, Joy; Pitorri, Francesca; Freeman, Kathleen

    2014-06-01

    Quality control (QC) validation is an essential tool in total quality management of a veterinary clinical pathology laboratory. Cost-analysis can be a valuable technique to help identify an appropriate QC procedure for the laboratory, although this has never been reported in veterinary medicine. The aim of this study was to determine the applicability of the Six Sigma Quality Cost Worksheets in the evaluation of possible candidate QC rules identified by QC validation. Three months of internal QC records were analyzed. EZ Rules 3 software was used to evaluate candidate QC procedures, and the costs associated with the application of different QC rules were calculated using the Six Sigma Quality Cost Worksheets. The costs associated with the current and the candidate QC rules were compared, and the amount of cost savings was calculated. There was a significant saving when the candidate 1-2.5s, n = 3 rule was applied instead of the currently utilized 1-2s, n = 3 rule. The savings were 75% per year (£ 8232.5) based on re-evaluating all of the patient samples in addition to the controls, and 72% per year (£ 822.4) based on re-analyzing only the control materials. The savings were also shown to change accordingly with the number of samples analyzed and with the number of daily QC procedures performed. These calculations demonstrated the importance of the selection of an appropriate QC procedure, and the usefulness of the Six Sigma Costs Worksheet in determining the most cost-effective rule(s) when several candidate rules are identified by QC validation. © 2014 American Society for Veterinary Clinical Pathology and European Society for Veterinary Clinical Pathology.
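The QC rules compared above are Westgard-style single-value rules: 1-2s rejects a run when any one of the n controls falls outside ±2 SD of its target, while 1-2.5s widens the window to ±2.5 SD and therefore triggers costly rejections and re-runs less often. A minimal sketch, with hypothetical control values:

```python
def violates_1_ks(controls, target_mean, sd, k):
    """Westgard 1-ks rule: reject the run if any single control
    falls outside target_mean +/- k * sd."""
    return any(abs(x - target_mean) > k * sd for x in controls)

# Hypothetical run of n = 3 controls; the worst value sits 2.4 SD from target
controls = [5.2, 4.6, 5.6]
mean, sd = 5.0, 0.25
reject_1_2s = violates_1_ks(controls, mean, sd, 2.0)    # rejected under 1-2s
reject_1_25s = violates_1_ks(controls, mean, sd, 2.5)   # accepted under 1-2.5s
```

A run like this one is rejected by the current 1-2s rule but accepted by the candidate 1-2.5s rule, which is precisely the behaviour that drives the re-analysis cost difference quantified in the Six Sigma worksheets.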

  1. Quality assurance and quality control of geochemical data—A primer for the research scientist

    USGS Publications Warehouse

    Geboy, Nicholas J.; Engle, Mark A.

    2011-01-01

    Geochemistry is a constantly expanding science. More and more, scientists are employing geochemical tools to help answer questions about the Earth and earth system processes. Scientists may assume that the responsibility of examining and assessing the quality of the geochemical data they generate is not theirs but rather that of the analytical laboratories to which their samples have been submitted. This assumption may be partially based on knowledge about internal and external quality assurance and quality control (QA/QC) programs in which analytical laboratories typically participate. Or there may be a perceived lack of time or resources to adequately examine data quality. Regardless of the reason, the lack of QA/QC protocols can lead to the generation and publication of erroneous data. Because the interpretations drawn from the data are primary products to U.S. Geological Survey (USGS) stakeholders, the consequences of publishing erroneous results can be significant. The principal investigator of a scientific study ultimately is responsible for the quality and interpretation of the project's findings, and thus must also play a role in the understanding, implementation, and presentation of QA/QC information about the data. Although occasionally ignored, QA/QC protocols apply not only to procedures in the laboratory but also in the initial planning of a research study and throughout the life of the project. Many of the tenets of developing a sound QA/QC program or protocols also parallel the core concepts of developing a good study: What is the main objective of the study? Will the methods selected provide data of enough resolution to answer the hypothesis? How should samples be collected? Are there known or unknown artifacts or contamination sources in the sampling and analysis methods? Assessing data quality requires communication between the scientists responsible for designing the study and those collecting samples, analyzing samples, treating data, and

  2. The Individualized Quality Control Plan - Coming Soon to Clinical Microbiology Laboratories Everywhere!

    PubMed

    Anderson, Nancy

    2015-11-15

As of January 1, 2016, microbiology laboratories can choose to adopt a new quality control option, the Individualized Quality Control Plan (IQCP), under the Clinical Laboratory Improvement Amendments of 1988 (CLIA). This voluntary approach increases flexibility for meeting regulatory requirements and provides laboratories the opportunity to customize QC for their testing, in their unique environments, and by their testing personnel. IQCP is an all-inclusive approach to quality based on risk management to address potential errors in the total testing process. It includes three main steps: (1) performing a risk assessment, (2) developing a QC plan, and (3) monitoring the plan through quality assessment. Resources are available from the Centers for Medicare & Medicaid Services, Centers for Disease Control and Prevention, American Society for Microbiology, Clinical and Laboratory Standards Institute, and accrediting organizations, such as the College of American Pathologists and the Joint Commission, to assist microbiology laboratories implementing IQCP.

  3. ITC Guidelines on Quality Control in Scoring, Test Analysis, and Reporting of Test Scores

    ERIC Educational Resources Information Center

    Allalouf, Avi

    2014-01-01

    The Quality Control (QC) Guidelines are intended to increase the efficiency, precision, and accuracy of the scoring, analysis, and reporting process of testing. The QC Guidelines focus on large-scale testing operations where multiple forms of tests are created for use on set dates. However, they may also be used for a wide variety of other testing…

  4. Filtered Push: Annotating Distributed Data for Quality Control and Fitness for Use Analysis

    NASA Astrophysics Data System (ADS)

    Morris, P. J.; Kelly, M. A.; Lowery, D. B.; Macklin, J. A.; Morris, R. A.; Tremonte, D.; Wang, Z.

    2009-12-01

    sets, or practices of science. (2) Data quality problems often cannot be detected only from internal statistical correlations or logical analysis, but may need the application of defined workflows that signal illogical output. (3) Changes in scientific theory or practice over time can result in changes of what QC tests should be applied to legacy data. (4) The frequency of some classes of error in a data set may be identifiable without the ability to assert that a particular record is in error. To address these issues requires, as does science itself, framing QC hypotheses against data that may be anywhere and may arise at any time in the future. In short, QC for science data is a never ending process. It must provide for notice to an agent (human or software) that a given dataset supports a hypothesis of inconsistency with a current scientific resource or model, or with potential generalizations of the concepts in a metadata ontology. Like quality control in general, quality control of distributed data is a repeated cyclical process. In implementing a Filtered Push network for quality control, we have a model in which the cost of QC forever is not substantially greater than QC once.

  5. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR FORM QA/QC CHECKS (UA-C-2.0)

    EPA Science Inventory

    The purpose of this SOP is to outline the process of Field Quality Assurance and Quality Control checks. This procedure was followed to ensure consistent data retrieval during the Arizona NHEXAS project and the "Border" study. Keywords: custody; QA/QC; field checks.

    The Nation...

  6. Quality control and conduct of genome-wide association meta-analyses

    PubMed Central

    Winkler, Thomas W; Day, Felix R; Croteau-Chonka, Damien C; Wood, Andrew R; Locke, Adam E; Mägi, Reedik; Ferreira, Teresa; Fall, Tove; Graff, Mariaelisa; Justice, Anne E; Luan, Jian'an; Gustafsson, Stefan; Randall, Joshua C; Vedantam, Sailaja; Workalemahu, Tsegaselassie; Kilpeläinen, Tuomas O; Scherag, André; Esko, Tonu; Kutalik, Zoltán; Heid, Iris M; Loos, Ruth JF

    2014-01-01

    Rigorous organization and quality control (QC) are necessary to facilitate successful genome-wide association meta-analyses (GWAMAs) of statistics aggregated across multiple genome-wide association studies. This protocol provides guidelines for [1] organizational aspects of GWAMAs, and for [2] QC at the study file level, the meta-level across studies, and the meta-analysis output level. Real-world examples highlight issues experienced and solutions developed by the GIANT Consortium that has conducted meta-analyses including data from 125 studies comprising more than 330,000 individuals. We provide a general protocol for conducting GWAMAs and carrying out QC to minimize errors and to guarantee maximum use of the data. We also include details for use of a powerful and flexible software package called EasyQC. For consortia of comparable size to the GIANT consortium, the present protocol takes a minimum of about 10 months to complete. PMID:24762786

  7. Many roads may lead to Rome: Selected features of quality control within environmental assessment systems in the US, NL, CA, and UK

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Günther, Markus, E-mail: markus.guenther@tu-berlin.de; Geißler, Gesa; Köppel, Johann

    As there is no one-and-only concept on how to precisely define and establish quality control (QC) or quality assurance (QA) in the making of environmental assessments (EA), this paper presents selected features of international approaches that address quality in EA systems in the USA, the Netherlands, Canada, and the United Kingdom. Based on explanative case studies, we highlight the embedding of specific quality control features within the EA systems, the objectives and processes, and relevant transparency challenges. Such features of QC/QA approaches can be considered in cases where substantial quality control and assurance efforts are still missing. Yet further research needs to be conducted on the efficacy of these approaches, which remains beyond the scope of this study. - Highlights: • We present four tools for quality control and assurance from different EA systems. • Approaches vary in institutional setting, objectives, procedures, and transparency. • Highlighted features might provide guidance in cases where QC/QA is still lacking.

  8. AutoLock: a semiautomated system for radiotherapy treatment plan quality control.

    PubMed

    Dewhurst, Joseph M; Lowe, Matthew; Hardy, Mark J; Boylan, Christopher J; Whitehurst, Philip; Rowbottom, Carl G

    2015-05-08

    A semiautomated system for radiotherapy treatment plan quality control (QC), named AutoLock, is presented. AutoLock is designed to augment treatment plan QC by automatically checking aspects of treatment plans that are well suited to computational evaluation, whilst summarizing more subjective aspects in the form of a checklist. The treatment plan must pass all automated checks and all checklist items must be acknowledged by the planner as correct before the plan is finalized. Thus AutoLock uniquely integrates automated treatment plan QC, an electronic checklist, and plan finalization. In addition to reducing the potential for the propagation of errors, the integration of AutoLock into the plan finalization workflow has improved efficiency at our center. Detailed audit data are presented, demonstrating that the treatment plan QC rejection rate fell by around a third following the clinical introduction of AutoLock.

  9. External quality assessment program for detection of glucose-6-phosphate dehydrogenase deficiency in the Guangxi region.

    PubMed

    Tang, Juan; Zhou, Xiangyang; Liu, Xiaochun; Ning, Leping; Zhou, Weiya; He, Yi

    2017-09-01

    The aim of this study is to improve the quality of testing for glucose-6-phosphate dehydrogenase (G6PD) deficiency through evaluation and analysis of the laboratory tests for G6PD activity. External quality assessment (EQA) was carried out twice per year with five samples each from 2014 to 2016. Samples were used for quantitative and qualitative assays. Quantitative results, qualitative results judged against reference values, and information about the methods, reagents, and instruments used were collected from participating laboratories within the required time. Laboratory performance scores, coefficient of variation (CV), and the rates of false negative and positive results were calculated. As a result, a total of 2,834 cases of negative quality control (QC) samples and 2,451 cases of positive QC samples were assessed, where the rates of false negative and false positive results were 1.31% (37/2,834) and 1.34% (33/2,451), respectively. Quantitative results indicated an increasing trend in testing quality, which was consistent with conclusions based on the comparison of EQA full-score and acceptable ratios in six assessments. The 2nd assay in 2016 had the best full-score ratio of 68.9% (135/196) and best acceptable ratio of 84.2% (165/196). There was a decreasing trend in the average CV of six reagents produced in China, and the range of average CV increased to 14.6-23.6% in 2016. The average CV of low level and high level samples was 22.5% and 15.3%, respectively, demonstrating that samples with low G6PD activity have greater interlaboratory CV values. In conclusion, laboratories improved their testing quality and provided better diagnostic service for G6PD deficiency in areas with high incidence after participation in the EQA program in the Guangxi region.
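    The error rates and interlaboratory CVs reported above are simple ratios; a minimal sketch of how such EQA summary statistics are computed (function names are ours, the 37/2,834 figure is taken from the abstract):

```python
def rate_percent(errors, total):
    """Error rate as a percentage, e.g. false negatives among negative QC samples."""
    return 100.0 * errors / total

def cv_percent(values):
    """Interlaboratory coefficient of variation: 100 * sample SD / mean."""
    n = len(values)
    mean = sum(values) / n
    sd = (sum((v - mean) ** 2 for v in values) / (n - 1)) ** 0.5
    return 100.0 * sd / mean

# False-negative rate from the abstract: 37 of 2,834 negative QC samples.
fn_rate = rate_percent(37, 2834)
print(round(fn_rate, 2))  # 1.31
```

A higher `cv_percent` across laboratories reporting the same sample indicates poorer interlaboratory agreement, which is how the abstract's low-activity samples were identified as more variable.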

  10. Impacts of Intelligent Automated Quality Control on a Small Animal APD-Based Digital PET Scanner

    NASA Astrophysics Data System (ADS)

    Charest, Jonathan; Beaudoin, Jean-François; Bergeron, Mélanie; Cadorette, Jules; Arpin, Louis; Lecomte, Roger; Brunet, Charles-Antoine; Fontaine, Réjean

    2016-10-01

    Stable system performance is mandatory to warrant the accuracy and reliability of biological results relying on small animal positron emission tomography (PET) imaging studies. This simple requirement sets the ground for imposing routine quality control (QC) procedures to keep PET scanners at a reliable optimal performance level. However, such procedures can become burdensome to implement for scanner operators, especially taking into account the increasing number of data acquisition channels in newer generation PET scanners. In systems using pixel detectors to achieve enhanced spatial resolution and contrast-to-noise ratio (CNR), the QC workload rapidly increases to unmanageable levels due to the number of independent channels involved. An artificial intelligence based QC system, referred to as Scanner Intelligent Diagnosis for Optimal Performance (SIDOP), was proposed to help reduce the QC workload by performing automatic channel fault detection and diagnosis. SIDOP consists of four high-level modules that employ machine learning methods to perform their tasks: Parameter Extraction, Channel Fault Detection, Fault Prioritization, and Fault Diagnosis. Ultimately, SIDOP submits a prioritized faulty channel list to the operator and proposes actions to correct them. To validate that SIDOP can perform QC procedures adequately, it was deployed on a LabPET™ scanner and multiple performance metrics were extracted. After multiple corrections on sub-optimal scanner settings, an 8.5% (with a 95% confidence interval (CI) of [7.6, 9.3]) improvement in the CNR, a 17.0% (CI: [15.3, 18.7]) decrease of the uniformity percentage standard deviation, and a 6.8% gain in global sensitivity were observed. These results confirm that SIDOP can indeed be of assistance in performing QC procedures and restore performance to optimal figures.
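    The CNR metric tracked above has several definitions in the imaging literature; a minimal sketch of one common form, (mean signal − mean background) / background SD, which is an assumption on our part rather than the exact LabPET™ formula:

```python
import statistics

def cnr(signal_roi, background_roi):
    """Contrast-to-noise ratio, one common definition:
    (mean signal - mean background) / SD of background."""
    contrast = statistics.mean(signal_roi) - statistics.mean(background_roi)
    return contrast / statistics.stdev(background_roi)

def percent_change(before, after):
    """Relative improvement of a metric, as reported in the abstract (e.g. +8.5% CNR)."""
    return 100.0 * (after - before) / before
```

With illustrative ROI values, `cnr([10, 12, 14], [1, 2, 3])` evaluates to 10.0; tracking `percent_change` of such a metric before and after channel corrections is how an improvement like the reported 8.5% would be quantified.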

  11. Quality control in the year 2000.

    PubMed

    Schade, B

    1992-01-01

    'Just-in-time' production is a prerequisite for a company to meet the challenges of competition. Manufacturing cycles have been so successfully optimized that release time now has become a significant factor. A vision for a major quality-control (QC) contribution to profitability in this decade seems to be the just-in-time release. Benefits will go beyond cost savings for lower inventory. The earlier detection of problems will reduce rejections and scrap. In addition, problem analysis and problem-solving will be easier. To achieve just-in-time release, advanced automated systems like robots will become the workhorses in QC for high volume pharmaceutical production. The requirements for these systems are extremely high in terms of quality, reliability and ruggedness. Crucial to success might be advances in the use of microelectronics for error checks, system recording, troubleshooting, etc., as well as creative new approaches (for example, the use of redundant assay systems).

  12. Quality control in the year 2000

    PubMed Central

    Schade, Bernd

    1992-01-01

    ‘Just-in-time’ production is a prerequisite for a company to meet the challenges of competition. Manufacturing cycles have been so successfully optimized that release time now has become a significant factor. A vision for a major quality-control (QC) contribution to profitability in this decade seems to be the just-in-time release. Benefits will go beyond cost savings for lower inventory. The earlier detection of problems will reduce rejections and scrap. In addition, problem analysis and problem-solving will be easier. To achieve just-in-time release, advanced automated systems like robots will become the workhorses in QC for high volume pharmaceutical production. The requirements for these systems are extremely high in terms of quality, reliability and ruggedness. Crucial to success might be advances in the use of microelectronics for error checks, system recording, troubleshooting, etc., as well as creative new approaches (for example, the use of redundant assay systems). PMID:18924930

  13. A Strategy to Establish a Quality Assurance/Quality Control Plan for the Application of Biosensors for the Detection of E. coli in Water.

    PubMed

    Hesari, Nikou; Kıratlı Yılmazçoban, Nursel; Elzein, Mohamad; Alum, Absar; Abbaszadegan, Morteza

    2017-01-03

    Rapid bacterial detection using biosensors is a novel approach for microbiological testing applications. Validation of such methods is an obstacle in the adoption of new bio-sensing technologies for water testing. Therefore, establishing a quality assurance and quality control (QA/QC) plan is essential to demonstrate accuracy and reliability of the biosensor method for the detection of E. coli in drinking water samples. In this study, different reagents and assay conditions including temperatures, holding time, E. coli strains and concentrations, dissolving agents, salinity and pH effects, quality of substrates of various suppliers of 4-methylumbelliferyl glucuronide (MUG), and environmental water samples were included in the QA/QC plan and used in the assay optimization and documentation. Furthermore, the procedural QA/QC for the monitoring of drinking water samples was established to validate the performance of the biosensor platform for the detection of E. coli using a culture-based standard technique. Implementing the developed QA/QC plan, the same level of precision and accuracy was achieved using both the standard and the biosensor methods. The established procedural QA/QC for the biosensor will provide a reliable tool for near real-time monitoring of E. coli in drinking water samples to both industry and regulatory authorities.

  14. A Strategy to Establish a Quality Assurance/Quality Control Plan for the Application of Biosensors for the Detection of E. coli in Water

    PubMed Central

    Hesari, Nikou; Kıratlı Yılmazçoban, Nursel; Elzein, Mohamad; Alum, Absar; Abbaszadegan, Morteza

    2017-01-01

    Rapid bacterial detection using biosensors is a novel approach for microbiological testing applications. Validation of such methods is an obstacle in the adoption of new bio-sensing technologies for water testing. Therefore, establishing a quality assurance and quality control (QA/QC) plan is essential to demonstrate accuracy and reliability of the biosensor method for the detection of E. coli in drinking water samples. In this study, different reagents and assay conditions including temperatures, holding time, E. coli strains and concentrations, dissolving agents, salinity and pH effects, quality of substrates of various suppliers of 4-methylumbelliferyl glucuronide (MUG), and environmental water samples were included in the QA/QC plan and used in the assay optimization and documentation. Furthermore, the procedural QA/QC for the monitoring of drinking water samples was established to validate the performance of the biosensor platform for the detection of E. coli using a culture-based standard technique. Implementing the developed QA/QC plan, the same level of precision and accuracy was achieved using both the standard and the biosensor methods. The established procedural QA/QC for the biosensor will provide a reliable tool for near real-time monitoring of E. coli in drinking water samples to both industry and regulatory authorities. PMID:28054956

  15. Quality control and conduct of genome-wide association meta-analyses.

    PubMed

    Winkler, Thomas W; Day, Felix R; Croteau-Chonka, Damien C; Wood, Andrew R; Locke, Adam E; Mägi, Reedik; Ferreira, Teresa; Fall, Tove; Graff, Mariaelisa; Justice, Anne E; Luan, Jian'an; Gustafsson, Stefan; Randall, Joshua C; Vedantam, Sailaja; Workalemahu, Tsegaselassie; Kilpeläinen, Tuomas O; Scherag, André; Esko, Tonu; Kutalik, Zoltán; Heid, Iris M; Loos, Ruth J F

    2014-05-01

    Rigorous organization and quality control (QC) are necessary to facilitate successful genome-wide association meta-analyses (GWAMAs) of statistics aggregated across multiple genome-wide association studies. This protocol provides guidelines for (i) organizational aspects of GWAMAs, and for (ii) QC at the study file level, the meta-level across studies and the meta-analysis output level. Real-world examples highlight issues experienced and solutions developed by the GIANT Consortium that has conducted meta-analyses including data from 125 studies comprising more than 330,000 individuals. We provide a general protocol for conducting GWAMAs and carrying out QC to minimize errors and to guarantee maximum use of the data. We also include details for the use of a powerful and flexible software package called EasyQC. Precise timings will be greatly influenced by consortium size. For consortia of comparable size to the GIANT Consortium, this protocol takes a minimum of about 10 months to complete.
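    Study-file-level QC of the kind this protocol describes screens each contributed results file for impossible values before meta-analysis. A toy sketch of one such check (the column names `EAF`, `SE`, `P` and the thresholds are our illustrative choices, not EasyQC's actual interface):

```python
def file_level_qc(rows):
    """Toy file-level QC for GWAMA input: drop records with impossible
    values (allele frequency outside (0, 1), non-positive SE, missing P)."""
    clean, dropped = [], []
    for r in rows:
        ok = (r.get("P") is not None
              and 0.0 < r.get("EAF", -1.0) < 1.0
              and r.get("SE", 0.0) > 0.0)
        (clean if ok else dropped).append(r)
    return clean, dropped

rows = [
    {"SNP": "rs1", "EAF": 0.32, "SE": 0.01, "P": 0.5},
    {"SNP": "rs2", "EAF": 1.20, "SE": 0.01, "P": 0.5},  # impossible frequency
    {"SNP": "rs3", "EAF": 0.10, "SE": 0.00, "P": 0.5},  # zero standard error
]
clean, dropped = file_level_qc(rows)
print(len(clean), len(dropped))  # 1 2
```

The protocol's meta-level checks (e.g. comparing allele frequencies against a reference panel across studies) follow the same pattern at a coarser granularity.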

  16. Protecting the proteome: Eukaryotic cotranslational quality control pathways

    PubMed Central

    2014-01-01

    The correct decoding of messenger RNAs (mRNAs) into proteins is an essential cellular task. The translational process is monitored by several quality control (QC) mechanisms that recognize defective translation complexes in which ribosomes are stalled on substrate mRNAs. Stalled translation complexes occur when defects in the mRNA template, the translation machinery, or the nascent polypeptide arrest the ribosome during translation elongation or termination. These QC events promote the disassembly of the stalled translation complex and the recycling and/or degradation of the individual mRNA, ribosomal, and/or nascent polypeptide components, thereby clearing the cell of improper translation products and defective components of the translation machinery. PMID:24535822

  17. AutoLock: a semiautomated system for radiotherapy treatment plan quality control

    PubMed Central

    Lowe, Matthew; Hardy, Mark J.; Boylan, Christopher J.; Whitehurst, Philip; Rowbottom, Carl G.

    2015-01-01

    A semiautomated system for radiotherapy treatment plan quality control (QC), named AutoLock, is presented. AutoLock is designed to augment treatment plan QC by automatically checking aspects of treatment plans that are well suited to computational evaluation, whilst summarizing more subjective aspects in the form of a checklist. The treatment plan must pass all automated checks and all checklist items must be acknowledged by the planner as correct before the plan is finalized. Thus AutoLock uniquely integrates automated treatment plan QC, an electronic checklist, and plan finalization. In addition to reducing the potential for the propagation of errors, the integration of AutoLock into the plan finalization workflow has improved efficiency at our center. Detailed audit data are presented, demonstrating that the treatment plan QC rejection rate fell by around a third following the clinical introduction of AutoLock. PACS number: 87.55.Qr PMID:26103498

  18. Application of newly developed Fluoro-QC software for image quality evaluation in cardiac X-ray systems.

    PubMed

    Oliveira, M; Lopez, G; Geambastiani, P; Ubeda, C

    2018-05-01

    A quality assurance (QA) program is a valuable tool for the continuous production of optimal quality images. The aim of this paper is to assess a newly developed automatic computer software for image quality (IQ) evaluation in fluoroscopy X-ray systems. Test object images were acquired using one fluoroscopy system, Siemens Axiom Artis model (Siemens AG, Medical Solutions Erlangen, Germany). The software was developed as an ImageJ plugin. Two image quality parameters were assessed: high-contrast spatial resolution (HCSR) and signal-to-noise ratio (SNR). The times required for manual and automatic image quality assessment were compared. The paired t-test was used to assess the data. p values of less than 0.05 were considered significant. The Fluoro-QC software generated faster IQ evaluation results (mean = 0.31 ± 0.08 min) than the manual procedure (mean = 4.68 ± 0.09 min). The mean difference between techniques was 4.36 min. Discrepancies were identified in the region of interest (ROI) areas drawn manually, with evidence of user dependence. The new software presented the results of two tests (HCSR = 3.06, SNR = 5.17) and also collected information from the DICOM header. Significant differences were not identified between manual and automatic measures of SNR (p value = 0.22) and HCSR (p value = 0.46). The Fluoro-QC software is a feasible, fast and free to use method for evaluating imaging quality parameters on fluoroscopy systems. Copyright © 2017 The College of Radiographers. Published by Elsevier Ltd. All rights reserved.
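    The two statistics at the core of this comparison are easy to state. A minimal sketch, assuming SNR is computed as mean/SD over a uniform ROI (one common convention; the paper's exact ROI definition may differ) and using the standard paired t statistic:

```python
import statistics

def snr(roi):
    """Signal-to-noise ratio of a uniform ROI: mean pixel value / SD."""
    return statistics.mean(roi) / statistics.stdev(roi)

def paired_t(a, b):
    """Paired t statistic for matched measurements, e.g. manual vs
    automatic readings of the same images: mean(d) / (SD(d) / sqrt(n))."""
    d = [x - y for x, y in zip(a, b)]
    n = len(d)
    return statistics.mean(d) / (statistics.stdev(d) / n ** 0.5)
```

A t statistic near zero (as when manual and automatic readings agree on average) yields a large p value, matching the abstract's non-significant differences for SNR and HCSR.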

  19. SRT Evaluation of AIRS Version-6.02 and Version-6.02 AIRS Only (6.02 AO) Products

    NASA Technical Reports Server (NTRS)

    Susskind, Joel; Iredell, Lena; Molnar, Gyula; Blaisdell, John

    2012-01-01

    Version-6 contains a number of significant improvements over Version-5. This report compares Version-6 products resulting from the advances listed below to those from Version-5. 1. Improved methodology to determine skin temperature (T(sub s)) and spectral emissivity (Epsilon(sub v)). 2. Use of Neural-net start-up state. 3. Improvements which decrease the spurious negative Version-5 trend in tropospheric temperatures. 4. Improved QC methodology. Version-6 uses separate QC thresholds optimized for Data Assimilation (QC=0) and Climate applications (QC=0,1) respectively. 5. Channel-by-channel clear-column radiances R-hat(sub tau) QC flags. 6. Improved cloud parameter retrieval algorithm. 7. Improved OLR RTA. Our evaluation compared V6.02 and V6.02 AIRS Only (V6.02 AO) Quality Controlled products with those of Version-5.0. In particular we evaluated surface skin temperature T(sub s), surface spectral emissivity Epsilon(sub v), temperature profile T(p), water vapor profile q(p), OLR, OLR(sub CLR), effective cloud fraction alpha-Epsilon, and cloud cleared radiances R-hat(sub tau). We conducted two types of evaluations. The first compared results on 7 focus days to collocated ECMWF truth. The seven focus days are: September 6, 2002; January 25, 2003; September 29, 2004; August 5, 2005; February 24, 2007; August 10, 2007; and May 30, 2010. In these evaluations, we show results for T(sub s), Epsilon(sub v), T(p), and q(p) in terms of yields, and RMS differences and biases with regard to ECMWF. We also show yield trends as well as bias trends of these quantities relative to ECMWF truth. We also show yields and accuracy of channel-by-channel QC'd values of R-hat(sub tau) for V6.02 and V6.02 AO. Version-5 did not contain channel-by-channel QC'd values of R-hat(sub tau).
    In the second type of evaluation, we compared V6.03 monthly mean Level-3 products to those of Version-5.0 for four different months (January, April, July, and October) in three different years: 2003, 2007, and 2011.

  20. mFOLFOX6 Plus Panitumumab Versus 5-FU/LV Plus Panitumumab After Six Cycles of Frontline mFOLFOX6 Plus Panitumumab: A Randomized Phase II Study of Patients With Unresectable or Advanced/Recurrent, RAS Wild-type Colorectal Carcinoma (SAPPHIRE)-Study Design and Rationale.

    PubMed

    Nagata, Naoki; Mishima, Hideyuki; Kurosawa, Shuichi; Oba, Koji; Sakamoto, Junichi

    2017-06-01

    In Japan, oxaliplatin (OXA)/5-fluorouracil (5-FU)/leucovorin (LV)-the mFOLFOX6 regimen-is the most frequently used first-line chemotherapy backbone for metastatic colorectal cancer. However, peripheral nerve disorders caused by OXA during mFOLFOX6 therapy can decrease patients' quality of life. OXA can be safely discontinued from a FOLFOX regimen after 6 cycles during first-line therapy. Also, for patients who discontinue OXA without having experienced peripheral nerve disorders, reintroducing OXA in the later stages of treatment could remain an option. The study is a phase II, multicenter, open-label, parallel-group, randomized, controlled exploratory study comparing the efficacy and safety of mFOLFOX6 plus panitumumab and 5-FU/LV plus panitumumab in patients with chemotherapy-naïve, unresectable, advanced or recurrent colorectal carcinoma of RAS wild-type (SAPPHIRE; ClinicalTrials.gov identifier, NCT02337946). Eligible patients will receive 6 cycles of mFOLFOX6 plus panitumumab combination therapy, followed by 1:1 randomization to either further treatment with mFOLFOX6 plus panitumumab or discontinuation of OXA and treatment with 5-FU/LV plus panitumumab. Up to 100 randomized patients will receive treatment for approximately 12 months or until any of the criteria for treatment discontinuation have been met. The primary endpoint is progression-free survival rate at 9 months after the day of randomization. The secondary endpoints are progression-free survival, overall survival, response rate, and interval to treatment failure. Safety will be evaluated according to the incidence and severity of adverse events, including the incidence of peripheral nerve and skin disorders. Additional endpoints will include maintenance of performance status, continuation of OXA in the mFOLFOX6 plus panitumumab group, and continuation of panitumumab in both groups. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  1. Molecular mechanism of ER stress-induced pre-emptive quality control involving association of the translocon, Derlin-1, and HRD1.

    PubMed

    Kadowaki, Hisae; Satrimafitrah, Pasjan; Takami, Yasunari; Nishitoh, Hideki

    2018-05-09

    The maintenance of endoplasmic reticulum (ER) homeostasis is essential for cell function. ER stress-induced pre-emptive quality control (ERpQC) helps alleviate the burden to a stressed ER by limiting further protein loading. We have previously reported the mechanisms of ERpQC, which includes a rerouting step and a degradation step. Under ER stress conditions, Derlin family proteins (Derlins), which are components of ER-associated degradation, reroute specific ER-targeting proteins to the cytosol. Newly synthesized rerouted polypeptides are degraded via the cytosolic chaperone Bag6 and the AAA-ATPase p97 in the ubiquitin-proteasome system. However, the mechanisms by which ER-targeting proteins are rerouted from the ER translocation pathway to the cytosolic degradation pathway and how the E3 ligase ubiquitinates ERpQC substrates remain unclear. Here, we show that ERpQC substrates are captured by the carboxyl-terminus region of Derlin-1 and ubiquitinated by the HRD1 E3 ubiquitin ligase prior to degradation. Moreover, HRD1 forms a large ERpQC-related complex composed of Sec61α and Derlin-1 during ER stress. These findings indicate that the association of the degradation factor HRD1 with the translocon and the rerouting factor Derlin-1 may be necessary for the smooth and effective clearance of ERpQC substrates.

  2. Building a QC Database of Meteorological Data from NASA KSC and the United States Air Force's Eastern Range

    NASA Technical Reports Server (NTRS)

    Brenton, J. C.; Barbre, R. E.; Decker, R. K.; Orcutt, J. M.

    2018-01-01

    The National Aeronautics and Space Administration's (NASA) Marshall Space Flight Center (MSFC) Natural Environments Branch (EV44) provides atmospheric databases and analysis in support of space vehicle design and day-of-launch operations for NASA and commercial launch vehicle programs launching from the NASA Kennedy Space Center (KSC), co-located on the United States Air Force's Eastern Range (ER) at the Cape Canaveral Air Force Station. The ER complex is one of the most heavily instrumented sites in the United States, with over 31 towers measuring various atmospheric parameters on a continuous basis. An inherent challenge with large datasets is ensuring that erroneous data are removed from databases, and thus excluded from launch vehicle design analyses. EV44 has put forth great effort in developing quality control (QC) procedures for individual meteorological instruments; however, no standard QC procedures for all databases currently exist, resulting in QC databases that have inconsistencies in variables, development methodologies, and periods of record. The goal of this activity is to use the previous efforts to develop a standardized set of QC procedures from which to build meteorological databases from KSC and the ER, while maintaining open communication with end users from the launch community to develop ways to improve, adapt, and grow the QC database. Details of the QC procedures will be described. As the rate of launches increases with additional launch vehicle programs, it is becoming more important that weather databases are continually updated and checked for data quality before use in launch vehicle design and certification analyses.
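    Automated QC of tower time series typically standardizes checks like a limits (range) test and a step (rate-of-change) test. A minimal sketch of both; the thresholds are placeholders of our choosing, not EV44's actual values:

```python
def range_test(values, lo, hi):
    """Flag samples outside physically plausible limits."""
    return [not (lo <= v <= hi) for v in values]

def step_test(values, max_step):
    """Flag samples that jump more than max_step from the previous sample
    (the first sample has no predecessor and is never flagged)."""
    flags = [False]
    for prev, cur in zip(values, values[1:]):
        flags.append(abs(cur - prev) > max_step)
    return flags

temps_c = [21.1, 21.3, 35.0, 21.2]        # suspicious spike at index 2
print(range_test(temps_c, -10.0, 45.0))   # [False, False, False, False]
print(step_test(temps_c, 5.0))            # [False, False, True, True]
```

Note the spike passes the range test but fails the step test, which is why standardized QC procedures layer multiple checks rather than relying on any single one.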

  3. Quality assurance and quality control for thermal/optical analysis of aerosol samples for organic and elemental carbon.

    PubMed

    Chow, Judith C; Watson, John G; Robles, Jerome; Wang, Xiaoliang; Chen, L-W Antony; Trimble, Dana L; Kohl, Steven D; Tropp, Richard J; Fung, Kochy K

    2011-12-01

    Accurate, precise, and valid organic and elemental carbon (OC and EC, respectively) measurements require more effort than the routine analysis of ambient aerosol and source samples. This paper documents the quality assurance (QA) and quality control (QC) procedures that should be implemented to ensure consistency of OC and EC measurements. Prior to field sampling, the appropriate filter substrate must be selected and tested for sampling effectiveness. Unexposed filters are pre-fired to remove contaminants and acceptance tested. After sampling, filters must be stored in the laboratory in clean, labeled containers under refrigeration (<4 °C) to minimize loss of semi-volatile OC. QA activities include participation in laboratory accreditation programs, external system audits, and interlaboratory comparisons. For thermal/optical carbon analyses, periodic QC tests include calibration of the flame ionization detector with different types of carbon standards, thermogram inspection, replicate analyses, quantification of trace oxygen concentrations (<100 ppmv) in the helium atmosphere, and calibration of the sample temperature sensor. These established QA/QC procedures are applicable to aerosol sampling and analysis for carbon and other chemical components.

  4. 40 CFR 75.21 - Quality assurance and quality control requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ..., as defined in § 72.2, k=2) of plus or minus 1.0 percent (calculated combined standard uncertainty of... system according to the quality assurance and quality control procedures in appendix B of this part. (2... requirements of Method 2, 6C, 7E, or 3A in Appendices A-1, A-2 and A-4 to part 60 of this chapter (supplemented...

  5. Quality Control in Clinical Laboratory Samples

    DTIC Science & Technology

    2015-01-01

    is able to find and correct flaws in the analytical processes of a lab before potentially incorrect patient results are released. According to...verifies that the results produced are accurate and precise. Clinical labs use management of documentation as well as incorporation of a continuous...improvement process to streamline the overall quality control process. QC samples are expected to be identical and tested identically to patient
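    Clinical labs commonly evaluate such QC samples against statistical control rules; one widely used example (our illustrative choice, not necessarily the rule this report describes) is the Westgard 1-3s rule, which rejects a run when any control result falls more than 3 SD from the established mean:

```python
def rule_1_3s(qc_values, mean, sd):
    """Westgard 1-3s control rule: reject the analytical run if any QC
    result deviates more than 3 SD from the established control mean."""
    return any(abs(v - mean) > 3 * sd for v in qc_values)

# Control mean/SD established from historical QC data (illustrative numbers):
print(rule_1_3s([101.0, 99.5], mean=100.0, sd=1.0))   # False (in control)
print(rule_1_3s([103.5, 99.5], mean=100.0, sd=1.0))   # True (reject run)
```

Catching an out-of-control run this way is what allows flaws to be corrected before patient results are released.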

  6. QA/QC requirements for physical properties sampling and analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Innis, B.E.

    1993-07-21

    This report presents results of an assessment of the available information concerning US Environmental Protection Agency (EPA) quality assurance/quality control (QA/QC) requirements and guidance applicable to sampling, handling, and analyzing physical parameter samples at Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) investigation sites. Geotechnical testing laboratories measure the following physical properties of soil and sediment samples collected during CERCLA remedial investigations (RI) at the Hanford Site: moisture content, grain size by sieve, grain size by hydrometer, specific gravity, bulk density/porosity, saturated hydraulic conductivity, moisture retention, unsaturated hydraulic conductivity, and permeability of rocks by flowing air. Geotechnical testing laboratories also measure the following chemical parameters of soil and sediment samples collected during Hanford Site CERCLA RI: calcium carbonate and saturated column leach testing. Physical parameter data are used for (1) characterization of vadose and saturated zone geology and hydrogeology, (2) selection of monitoring well screen sizes, (3) support of modeling and analysis of the vadose and saturated zones, and (4) engineering design. The objectives of this report are to determine the QA/QC levels accepted in EPA Region 10 for the sampling, handling, and analysis of soil samples for physical parameters during CERCLA RI.

  7. 30 CFR 74.6 - Quality control.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... DUST SAMPLING DEVICES Approval Requirements for Coal Mine Dust Personal Sampler Unit § 74.6 Quality... equipment procedures and records and to interview the employees who conduct the control tests. Two copies of...

  8. Ongoing quality control in digital radiography: Report of AAPM Imaging Physics Committee Task Group 151.

    PubMed

    Jones, A Kyle; Heintz, Philip; Geiser, William; Goldman, Lee; Jerjian, Khachig; Martin, Melissa; Peck, Donald; Pfeiffer, Douglas; Ranger, Nicole; Yorkston, John

    2015-11-01

Quality control (QC) in medical imaging is an ongoing process and not just a series of infrequent evaluations of medical imaging equipment. The QC process involves designing and implementing a QC program, collecting and analyzing data, investigating results that are outside the acceptance levels for the QC program, and taking corrective action to bring these results back to an acceptable level. The QC process involves key personnel in the imaging department, including the radiologist, radiologic technologist, and the qualified medical physicist (QMP). The QMP performs detailed equipment evaluations and helps with oversight of the QC program, while the radiologic technologist is responsible for the day-to-day operation of the QC program. The continued need for ongoing QC in digital radiography has been highlighted in the scientific literature. The charge of this task group was to recommend consistency tests designed to be performed by a medical physicist or a radiologic technologist under the direction of a medical physicist to identify problems with an imaging system that need further evaluation by a medical physicist, including a fault tree to define actions that need to be taken when certain fault conditions are identified. The focus of this final report is the ongoing QC process, including rejected image analysis, exposure analysis, and artifact identification. These QC tasks are vital for the optimal operation of a department performing digital radiography.
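The rejected image analysis discussed in the report reduces to a simple tally of reject rate and reject reasons. A minimal sketch, assuming a hypothetical log format in which each accepted image is recorded as None and each reject by its reason string (not the task group's actual data model):

```python
from collections import Counter

def reject_analysis(log):
    """Summarize rejected-image analysis: overall reject rate and a
    count of rejects per reason. The log format is an illustrative
    assumption: None = accepted image, string = reject reason."""
    total = len(log)
    rejects = [reason for reason in log if reason is not None]
    rate = len(rejects) / total
    return rate, Counter(rejects)

log = [None, None, "positioning", None, "exposure", None, "positioning", None]
rate, reasons = reject_analysis(log)
print(rate, reasons["positioning"])  # 0.375 2
```

A rising overall rate, or a spike in one reason, is the kind of signal that would trigger the further evaluation the report describes.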

  9. Ongoing quality control in digital radiography: Report of AAPM Imaging Physics Committee Task Group 151

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, A. Kyle, E-mail: kyle.jones@mdanderson.org; Geiser, William; Heintz, Philip

Quality control (QC) in medical imaging is an ongoing process and not just a series of infrequent evaluations of medical imaging equipment. The QC process involves designing and implementing a QC program, collecting and analyzing data, investigating results that are outside the acceptance levels for the QC program, and taking corrective action to bring these results back to an acceptable level. The QC process involves key personnel in the imaging department, including the radiologist, radiologic technologist, and the qualified medical physicist (QMP). The QMP performs detailed equipment evaluations and helps with oversight of the QC program, while the radiologic technologist is responsible for the day-to-day operation of the QC program. The continued need for ongoing QC in digital radiography has been highlighted in the scientific literature. The charge of this task group was to recommend consistency tests designed to be performed by a medical physicist or a radiologic technologist under the direction of a medical physicist to identify problems with an imaging system that need further evaluation by a medical physicist, including a fault tree to define actions that need to be taken when certain fault conditions are identified. The focus of this final report is the ongoing QC process, including rejected image analysis, exposure analysis, and artifact identification. These QC tasks are vital for the optimal operation of a department performing digital radiography.

  10. User-friendly solutions for microarray quality control and pre-processing on ArrayAnalysis.org

    PubMed Central

    Eijssen, Lars M. T.; Jaillard, Magali; Adriaens, Michiel E.; Gaj, Stan; de Groot, Philip J.; Müller, Michael; Evelo, Chris T.

    2013-01-01

    Quality control (QC) is crucial for any scientific method producing data. Applying adequate QC introduces new challenges in the genomics field where large amounts of data are produced with complex technologies. For DNA microarrays, specific algorithms for QC and pre-processing including normalization have been developed by the scientific community, especially for expression chips of the Affymetrix platform. Many of these have been implemented in the statistical scripting language R and are available from the Bioconductor repository. However, application is hampered by lack of integrative tools that can be used by users of any experience level. To fill this gap, we developed a freely available tool for QC and pre-processing of Affymetrix gene expression results, extending, integrating and harmonizing functionality of Bioconductor packages. The tool can be easily accessed through a wizard-like web portal at http://www.arrayanalysis.org or downloaded for local use in R. The portal provides extensive documentation, including user guides, interpretation help with real output illustrations and detailed technical documentation. It assists newcomers to the field in performing state-of-the-art QC and pre-processing while offering data analysts an integral open-source package. Providing the scientific community with this easily accessible tool will allow improving data quality and reuse and adoption of standards. PMID:23620278

  11. Countably QC-Approximating Posets

    PubMed Central

    Mao, Xuxin; Xu, Luoshan

    2014-01-01

As a generalization of countably C-approximating posets, the concept of countably QC-approximating posets is introduced. With the countably QC-approximating property, some characterizations of generalized completely distributive lattices and generalized countably approximating posets are given. The main results are as follows: (1) a complete lattice is generalized completely distributive if and only if it is countably QC-approximating and weakly generalized countably approximating; (2) a poset L having countably directed joins is generalized countably approximating if and only if the lattice σ_c(L)^op of all σ-Scott-closed subsets of L is weakly generalized countably approximating. PMID:25165730

  12. Development of a quality control test procedure for characterizing fracture properties of asphalt mixtures.

    DOT National Transportation Integrated Search

    2011-06-01

The main objective of this study is to investigate the use of the semi-circular bend (SCB) test as a quality assurance/quality control (QA/QC) measure for field construction. Comparison of fracture properties from the SCB test and fatigue beam te...

  13. Evaluation of non-destructive technologies for construction quality control of HMA and PCC pavements in Louisiana.

    DOT National Transportation Integrated Search

    2013-11-01

Current roadway quality control and quality acceptance (QC/QA) procedures for the Louisiana Department of Transportation and Development (LADOTD) include coring for thickness, density, and air voids in hot mix asphalt (HMA) pavements and thickness ...

  14. Quality Control Algorithms for the Kennedy Space Center 50-Megahertz Doppler Radar Wind Profiler Winds Database

    NASA Technical Reports Server (NTRS)

    Barbre, Robert E., Jr.

    2012-01-01

This paper presents the process used by the Marshall Space Flight Center Natural Environments Branch (EV44) to quality control (QC) data from the Kennedy Space Center's 50-MHz Doppler Radar Wind Profiler (DRWP) for use in vehicle wind loads and steering commands. The database has been built to mitigate limitations of using the currently archived databases from weather balloons. The DRWP database contains wind measurements from approximately 2.7-18.6 km altitude at roughly five-minute intervals for the August 1997 to December 2009 period of record, and the extensive QC process was designed to remove spurious data from various forms of atmospheric and non-atmospheric artifacts. The QC process is largely based on DRWP literature, but two new algorithms have been developed to remove data contaminated by convection and excessive first guess propagations from the Median Filter First Guess Algorithm. In addition to describing the automated and manual QC process in detail, this paper describes the extent of the data retained. Roughly 58% of all possible wind observations exist in the database, with approximately 100 times as many complete profile sets existing relative to the EV44 balloon databases. This increased sample of near-continuous wind profile measurements may help increase launch availability by reducing the uncertainty of wind changes during launch countdown.
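The Median Filter First Guess Algorithm itself is specific to the DRWP pipeline, but the underlying idea of flagging observations that stray too far from a local median can be sketched in a few lines. The window size, threshold, and data layout below are illustrative assumptions, not the database's actual settings:

```python
import statistics

def median_filter_flags(series, window=5, max_dev=10.0):
    """Flag samples deviating from a centered running median by more
    than max_dev (same units as the series). Window and threshold are
    illustrative, not the DRWP QC settings."""
    half = window // 2
    flags = []
    for i, x in enumerate(series):
        lo = max(0, i - half)
        neighborhood = series[lo:i + half + 1]  # clipped at the edges
        flags.append(abs(x - statistics.median(neighborhood)) > max_dev)
    return flags

# A single spurious wind speed (40 m/s) in an otherwise calm profile:
print(median_filter_flags([5, 6, 5, 40, 6, 5, 7]))
# [False, False, False, True, False, False, False]
```

A median is preferred over a mean here because a single outlier barely shifts the median, so the outlier cannot mask itself.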

  15. Droplet Digital™ PCR Next-Generation Sequencing Library QC Assay.

    PubMed

    Heredia, Nicholas J

    2018-01-01

Digital PCR is a valuable tool to quantify next-generation sequencing (NGS) libraries precisely and accurately. Accurately quantifying NGS libraries enables accurate loading of the libraries onto the sequencer and thus improves sequencing performance by reducing underloading and overloading errors. Accurate quantification also benefits users by enabling uniform loading of indexed/barcoded libraries, which in turn greatly improves sequencing uniformity of the indexed/barcoded samples. The advantages gained by employing the Droplet Digital PCR (ddPCR™) library QC assay include precise and accurate quantification as well as size quality assessment, enabling users to QC their sequencing libraries with confidence.
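Digital PCR quantification rests on Poisson statistics: because each droplet may capture more than one template molecule, the mean copies per droplet is inferred from the fraction of droplets that stay negative. A minimal sketch of that standard calculation, with illustrative droplet counts rather than assay-specific values:

```python
import math

def copies_per_droplet(positive, total):
    """Poisson estimate of mean template copies per droplet in a
    digital PCR run: lambda = -ln(fraction of negative droplets).
    Droplet counts here are illustrative, not assay data."""
    negative_fraction = (total - positive) / total
    return -math.log(negative_fraction)

# 5,000 positive droplets out of 20,000 partitions:
print(round(copies_per_droplet(5000, 20000), 4))  # 0.2877
```

Multiplying this per-droplet estimate by the partition count and dividing by the loaded volume gives the library concentration used to balance indexed samples before pooling.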

  16. Effects of N-glycan precursor length diversity on quality control of protein folding and on protein glycosylation

    PubMed Central

    Samuelson, John; Robbins, Phillips W.

    2014-01-01

Asparagine-linked glycans (N-glycans) of medically important protists have much to tell us about the evolution of N-glycosylation and of N-glycan-dependent quality control (N-glycan QC) of protein folding in the endoplasmic reticulum. While host N-glycans are built upon a dolichol-pyrophosphate-linked precursor with 14 sugars (Glc3Man9GlcNAc2), protist N-glycan precursors vary from Glc3Man9GlcNAc2 (Acanthamoeba) to Man9GlcNAc2 (Trypanosoma) to Glc3Man5GlcNAc2 (Toxoplasma) to Man5GlcNAc2 (Entamoeba, Trichomonas, and Eimeria) to GlcNAc2 (Plasmodium and Giardia) to zero (Theileria). As related organisms have differing N-glycan lengths (e.g. Toxoplasma, Eimeria, Plasmodium, and Theileria), the present N-glycan variation is based upon secondary loss of Alg genes, which encode enzymes that add sugars to the N-glycan precursor. An N-glycan precursor with Man5GlcNAc2 is necessary but not sufficient for N-glycan QC, which is predicted by the presence of the UDP-glucose:glucosyltransferase (UGGT) plus calreticulin and/or calnexin. As many parasites lack glucose in their N-glycan precursor, UGGT product may be identified by inhibition of glucosidase II. The presence of an armless calnexin in Toxoplasma suggests secondary loss of N-glycan QC from coccidia. Positive selection for N-glycan sites occurs in secreted proteins of organisms with N-glycan QC and is based upon an increased likelihood of threonine but not serine in the second position versus asparagine. In contrast, there appears to be selection against N-glycan length in Plasmodium and N-glycan site density in Toxoplasma. Finally, there is suggestive evidence for N-glycan-dependent ERAD in Trichomonas, which glycosylates and degrades the exogenous reporter mutant carboxypeptidase Y (CPY*). PMID:25475176

  17. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--STANDARD OPERATING PROCEDURE FOR FORM QA AND QC CHECKS (UA-C-2.0)

    EPA Science Inventory

    The purpose of this SOP is to outline the process of field quality assurance and quality control checks. This procedure was followed to ensure consistent data retrieval during the Arizona NHEXAS project and the Border study. Keywords: custody; QA/QC; field checks.

    The U.S.-Mex...

  18. Genome measures used for quality control are dependent on gene function and ancestry.

    PubMed

    Wang, Jing; Raskin, Leon; Samuels, David C; Shyr, Yu; Guo, Yan

    2015-02-01

The transition/transversion (Ti/Tv) ratio and heterozygous/nonreference-homozygous (het/nonref-hom) ratio have been commonly computed in genetic studies as a quality control (QC) measurement. Additionally, these two ratios are helpful in our understanding of the patterns of DNA sequence evolution. To thoroughly understand these two genomic measures, we performed a study using 1000 Genomes Project (1000G) released genotype data (N=1092). An additional two datasets (N=581 and N=6) were used to validate our findings from the 1000G dataset. We compared the two ratios among continental ancestry, genome regions and gene functionality. We found that the Ti/Tv ratio can be used as a quality indicator for single nucleotide polymorphisms inferred from high-throughput sequencing data. The Ti/Tv ratio varies greatly by genome region and functionality, but not by ancestry. The het/nonref-hom ratio varies greatly by ancestry, but not by genome regions and functionality. Furthermore, extreme guanine + cytosine content (either high or low) is negatively associated with the Ti/Tv ratio magnitude. Thus, when performing QC assessment using these two measures, care must be taken to apply the correct thresholds based on ancestry and genome region. Failure to take these considerations into account at the QC stage will bias any following analysis. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved.
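The Ti/Tv ratio itself is straightforward to compute from called variants. A minimal sketch, assuming SNVs are supplied as (ref, alt) allele pairs (an illustrative input format, not the 1000G file layout):

```python
# Transitions swap purine<->purine (A<->G) or pyrimidine<->pyrimidine
# (C<->T); every other substitution is a transversion.
TRANSITIONS = {("A", "G"), ("G", "A"), ("C", "T"), ("T", "C")}

def titv_ratio(snvs):
    """Compute the transition/transversion ratio from (ref, alt) pairs."""
    ti = sum(1 for pair in snvs if pair in TRANSITIONS)
    tv = len(snvs) - ti
    return ti / tv if tv else float("inf")

# Two transitions, two transversions:
print(titv_ratio([("A", "G"), ("C", "T"), ("A", "C"), ("G", "T")]))  # 1.0
```

As the abstract cautions, the threshold applied to this ratio should depend on genome region (e.g. exonic vs. whole genome), since the expected value differs by region.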

  19. Building a QC Database of Meteorological Data From NASA KSC and the United States Air Force's Eastern Range

    NASA Technical Reports Server (NTRS)

    Brenton, James C.; Barbre, Robert E.; Orcutt, John M.; Decker, Ryan K.

    2018-01-01

    The National Aeronautics and Space Administration's (NASA) Marshall Space Flight Center (MSFC) Natural Environments Branch (EV44) has provided atmospheric databases and analysis in support of space vehicle design and day-of-launch operations for NASA and commercial launch vehicle programs launching from the NASA Kennedy Space Center (KSC), co-located on the United States Air Force's Eastern Range (ER) at the Cape Canaveral Air Force Station. The ER is one of the most heavily instrumented sites in the United States measuring various atmospheric parameters on a continuous basis. An inherent challenge with the large databases that EV44 receives from the ER consists of ensuring erroneous data are removed from the databases, and thus excluded from launch vehicle design analyses. EV44 has put forth great effort in developing quality control (QC) procedures for individual meteorological instruments; however, no standard QC procedures for all databases currently exist resulting in QC databases that have inconsistencies in variables, methodologies, and periods of record. The goal of this activity is to use the previous efforts by EV44 to develop a standardized set of QC procedures from which to build flags within the meteorological databases from KSC and the ER, while maintaining open communication with end users from the launch community to develop ways to improve, adapt and grow the QC database. Details of the QC checks are described. The flagged data points will be plotted in a graphical user interface (GUI) as part of a manual confirmation that the flagged data do indeed need to be removed from the archive. As the rate of launches increases with additional launch vehicle programs, more emphasis is being placed to continually update and check weather databases for data quality before use in launch vehicle design and certification analyses.

  20. Comparison of four methods of establishing control limits for monitoring quality controls in infectious disease serology testing.

    PubMed

    Dimech, Wayne; Karakaltsas, Marina; Vincini, Giuseppe A

    2018-05-25

A general trend towards conducting infectious disease serology testing in centralized laboratories means that quality control (QC) principles used for clinical chemistry testing are applied to infectious disease testing. However, no systematic assessment of methods used to establish QC limits has been applied to infectious disease serology testing. A total of 103 QC data sets, obtained from six different infectious disease serology analytes, were parsed through standard methods for establishing statistical control limits, including guidelines from Public Health England, the USA Clinical and Laboratory Standards Institute (CLSI), the German Richtlinien der Bundesärztekammer (RiliBÄK) and Australian QConnect. The percentage of QC results failing each method was compared. The percentage of data sets having more than 20% of QC results failing Westgard rules when the first 20 results were used to calculate the mean±2 standard deviations (SD) ranged from 3 (2.9%) for R4S to 66 (64.1%) for the 10X rule, whereas the percentage ranged from 0 (0%) for R4S to 32 (40.5%) for 10X when the first 100 results were used to calculate the mean±2 SD. By contrast, the percentage of data sets with >20% failing the RiliBÄK control limits was 25 (24.3%). Only two data sets (1.9%) had more than 20% of results outside the QConnect Limits. QConnect Limits were thus more suitable for monitoring infectious disease serology testing than the UK Public Health, CLSI and RiliBÄK approaches, which produced unacceptably high failure percentages across the 103 data sets.
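The mean ± 2 SD control limits that most of the compared methods start from can be sketched in a few lines. The baseline run and follow-up values below are illustrative, not drawn from the 103 data sets:

```python
import statistics

def control_limits(baseline):
    """Mean +/- 2 SD limits estimated from an initial run of QC results,
    as in the 20- and 100-result schemes compared in the study."""
    mean = statistics.mean(baseline)
    sd = statistics.stdev(baseline)  # sample standard deviation
    return mean - 2 * sd, mean + 2 * sd

def fraction_failing(results, lo, hi):
    """Fraction of subsequent QC results falling outside the limits."""
    out = sum(1 for x in results if not (lo <= x <= hi))
    return out / len(results)

# Illustrative baseline of 20 QC results for one analyte:
baseline = [1.00, 1.02, 0.98, 1.01, 0.99, 1.03, 0.97, 1.00, 1.02, 0.98,
            1.01, 0.99, 1.00, 1.02, 0.98, 1.01, 0.99, 1.03, 0.97, 1.00]
lo, hi = control_limits(baseline)
print(fraction_failing([1.00, 1.10, 0.95, 1.01], lo, hi))  # 0.5
```

The study's point is visible even in this toy: limits derived from a short, tight baseline reject later results aggressively, which is why limits derived from only the first 20 results flagged so many data sets.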

  1. Results-driven approach to improving quality and productivity

    Treesearch

    John Dramm

    2000-01-01

Quality control (QC) programs do not often realize their full potential. Elaborate and expensive QC programs can easily get sidetracked by the process of building a program with promises of “Someday, this will all pay off.” Training employees in QC methods is no guarantee that quality will improve. Several documented cases show that such activity-centered efforts...

  2. Assessing the influence of component processing and donor characteristics on quality of red cell concentrates using quality control data.

    PubMed

    Jordan, A; Chen, D; Yi, Q-L; Kanias, T; Gladwin, M T; Acker, J P

    2016-07-01

    Quality control (QC) data collected by blood services are used to monitor production and to ensure compliance with regulatory standards. We demonstrate how analysis of quality control data can be used to highlight the sources of variability within red cell concentrates (RCCs). We merged Canadian Blood Services QC data with manufacturing and donor records for 28 227 RCC between June 2011 and October 2014. Units were categorized based on processing method, bag manufacturer, donor age and donor sex, then assessed based on product characteristics: haemolysis and haemoglobin levels, unit volume, leucocyte count and haematocrit. Buffy-coat method (top/bottom)-processed units exhibited lower haemolysis than units processed using the whole-blood filtration method (top/top). Units from female donors exhibited lower haemolysis than male donations. Processing method influenced unit volume and the ratio of additive solution to residual plasma. Stored red blood cell characteristics are influenced by prestorage processing and donor factors. Understanding the relationship between processing, donors and RCC quality will help blood services to ensure the safety of transfused products. © 2016 International Society of Blood Transfusion.

  3. Data-quality measures for stakeholder-implemented watershed-monitoring programs

    USGS Publications Warehouse

    Greve, Adrienne I.

    2002-01-01

    Community-based watershed groups, many of which collect environmental data, have steadily increased in number over the last decade. The data generated by these programs are often underutilized due to uncertainty in the quality of data produced. The incorporation of data-quality measures into stakeholder monitoring programs lends statistical validity to data. Data-quality measures are divided into three steps: quality assurance, quality control, and quality assessment. The quality-assurance step attempts to control sources of error that cannot be directly quantified. This step is part of the design phase of a monitoring program and includes clearly defined, quantifiable objectives, sampling sites that meet the objectives, standardized protocols for sample collection, and standardized laboratory methods. Quality control (QC) is the collection of samples to assess the magnitude of error in a data set due to sampling, processing, transport, and analysis. In order to design a QC sampling program, a series of issues needs to be considered: (1) potential sources of error, (2) the type of QC samples, (3) inference space, (4) the number of QC samples, and (5) the distribution of the QC samples. Quality assessment is the process of evaluating quality-assurance measures and analyzing the QC data in order to interpret the environmental data. Quality assessment has two parts: one that is conducted on an ongoing basis as the monitoring program is running, and one that is conducted during the analysis of environmental data. The discussion of the data-quality measures is followed by an example of their application to a monitoring program in the Big Thompson River watershed of northern Colorado.

  4. Evaluation of non-destructive technologies for construction quality control of HMA and PCC pavements in Louisiana : [tech summary].

    DOT National Transportation Integrated Search

    2013-11-01

Current roadway quality control and quality acceptance (QC/QA) procedures for the Louisiana Department of Transportation and Development (LADOTD) include coring for thickness, density, and air voids in hot mix asphalt (HMA) pavements and thickness ...

  5. Improving GEOS-5 seven day forecast skill by assimilation of quality controlled AIRS temperature profiles

    NASA Astrophysics Data System (ADS)

    Susskind, J.; Rosenberg, R. I.

    2016-12-01

The GEOS-5 Data Assimilation System (DAS) generates a global analysis every six hours by combining the previous six-hour forecast for that time period with contemporaneous observations. These observations include in-situ observations as well as those taken by satellite borne instruments, such as AIRS/AMSU on EOS Aqua and CrIS/ATMS on S-NPP. Operational data assimilation methodology assimilates observed channel radiances Ri for IR sounding instruments such as AIRS and CrIS, but only for those channels i in a given scene whose radiances are thought to be unaffected by clouds. A limitation of this approach is that radiances in most tropospheric sounding channels are affected by clouds under partial cloud cover conditions, which occurs most of the time. The AIRS Science Team Version-6 retrieval algorithm generates cloud cleared radiances (CCRs) for each channel in a given scene, which represent the radiances AIRS would have observed if the scene were cloud free, and then uses them to determine quality controlled (QC'd) temperature profiles T(p) under all cloud conditions. There are potential advantages to assimilating either AIRS QC'd CCRs or QC'd T(p) instead of Ri, in that the spatial coverage of observations is greater under partial cloud cover. We tested these two alternate data assimilation approaches by running three parallel data assimilation experiments over different time periods using GEOS-5. Experiment 1 assimilated all observations as done operationally, Experiment 2 assimilated QC'd values of AIRS CCRs in place of AIRS radiances, and Experiment 3 assimilated QC'd values of T(p) in place of observed radiances. Assimilation of QC'd AIRS T(p) resulted in significant improvement in seven day forecast skill compared to assimilation of CCRs or assimilation of observed radiances, especially in the Southern Hemisphere Extra-tropics.

  6. Bulgarian experience in the establishment of reference dose levels and implementation of a quality control system in diagnostic radiology.

    PubMed

    Vassileva, J; Dimov, A; Slavchev, A; Karadjov, A

    2005-01-01

    Results from a Bulgarian patient dose survey in diagnostic radiology are presented. Reference levels for entrance surface dose (ESD) were 0.9 mGy for chest radiography (PA), 30 mGy for lumbar spine (Lat), 10 mGy for pelvis, 5 mGy for skull (AP), 3 mGy for skull (Lat) and 13 mGy for mammography. Quality control (QC) programmes were proposed for various areas of diagnostic radiology. Film processing QC warranted special attention. Proposed QC programmes included parameters to be tested, level of expertise needed and two action levels: remedial and suspension. Programmes were tested under clinical conditions to assess initial results and draw conclusions for further QC system development. On the basis of international experience, measurement protocols were developed for all parameters tested. QC equipment was provided as part of the PHARE project. A future problem for QC programme implementation may be the small number of medical physics experts in diagnostic radiology.

  7. Autonomous Quality Control of Joint Orientation Measured with Inertial Sensors.

    PubMed

    Lebel, Karina; Boissy, Patrick; Nguyen, Hung; Duval, Christian

    2016-07-05

Clinical mobility assessment is traditionally performed in laboratories using complex and expensive equipment. The low accessibility to such equipment, combined with the emerging trend to assess mobility in a free-living environment, creates a need for body-worn sensors (e.g., inertial measurement units, IMUs) that are capable of measuring the complexity in motor performance using meaningful measurements, such as joint orientation. However, accuracy of joint orientation estimates using IMUs may be affected by environment, the joint tracked, type of motion performed and velocity. This study investigates a quality control (QC) process to assess the quality of orientation data based on features extracted from the raw inertial sensors' signals. Joint orientation (trunk, hip, knee, ankle) of twenty participants was acquired by an optical motion capture system and IMUs during a variety of tasks (sit, sit-to-stand transition, walking, turning) performed under varying conditions (speed, environment). An artificial neural network was used to classify good and bad sequences of joint orientation with a sensitivity and a specificity above 83%. This study confirms the possibility to perform QC on IMU joint orientation data based on raw signal features. This innovative QC approach may be of particular interest in a big data context, such as for remote-monitoring of patients' mobility.

  8. 40 CFR 98.164 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Hydrogen Production § 98.164 Monitoring and QA/QC requirements. The GHG emissions data for hydrogen production process units must be quality-assured as specified in... Instrumental Determination of Carbon, Hydrogen, and Nitrogen in Petroleum Products and Lubricants (incorporated...

  9. Real Time Quality Control Methods for Cued EMI Data Collection

    DTIC Science & Technology

    2016-03-14

This project evaluated the effectiveness of in-field quality control (QC) procedures during cued electromagnetic induction (EMI) data collection.

  10. Quality controls for wind measurement of a 1290-MHz boundary layer profiler under strong wind conditions.

    PubMed

    Liu, Zhao; Zheng, Chaorong; Wu, Yue

    2017-09-01

Wind profilers have been widely adopted to observe wind field information in the atmosphere for different purposes. However, the accuracy of their observations is limited by various noises and disturbances and hence needs further improvement. In this paper, the data measured under strong wind conditions, using a 1290-MHz boundary layer profiler (BLP), are quality controlled via a composite quality control (QC) procedure proposed by the authors. Then, through comparison with the data measured by radiosonde flights (balloon observations), the critical thresholds in the composite QC procedure, including the consensus average threshold T1 and the vertical shear threshold T3, are systematically discussed, and the performance of the BLP operated under precipitation is also evaluated. It is found that to ensure high accuracy and a high data collectable rate, the optimal range of subsets is determined to be 4 m/s. Although the number of data rejected by the combined algorithm of vertical shear examination and small median test is quite limited, the algorithm proves useful for recognizing outliers with a large discrepancy, and the optimal wind shear threshold T3 can be recommended as 5 m/s per 100 m. During patchy precipitation, the quality of data measured by the four oblique beams (using the DBS measuring technique) can still be ensured. After the BLP data are quality controlled by the composite QC procedure, the output shows good agreement with the balloon observations.
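A vertical shear examination with threshold T3 amounts to a pairwise check over adjacent range gates. A minimal sketch using the recommended 5 m/s per 100 m threshold; the profile heights and speeds are illustrative, not BLP data:

```python
def flag_shear(heights_m, speeds_ms, t3=5.0):
    """Flag adjacent gate pairs whose vertical wind shear exceeds t3,
    expressed in m/s per 100 m. The threshold default follows the
    paper's recommendation; the data layout is an assumption."""
    flags = []
    for i in range(1, len(heights_m)):
        dz = heights_m[i] - heights_m[i - 1]
        shear = abs(speeds_ms[i] - speeds_ms[i - 1]) / dz * 100.0
        flags.append(shear > t3)
    return flags

# 2 m/s over 100 m passes; 8 m/s over 100 m is flagged:
print(flag_shear([100, 200, 300], [10.0, 12.0, 20.0]))  # [False, True]
```

In a full QC chain a flagged pair would then go to a secondary check (such as the small median test mentioned above) to decide which of the two gates is the outlier.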

  11. Evaluation of non-destructive technologies for construction quality control of HMA and PCC pavements in Louisiana : [research project capsule].

    DOT National Transportation Integrated Search

    2009-07-01

Current roadway quality control and quality acceptance (QC/QA) procedures for Louisiana include coring for thickness, density, and air void checks in hot mix asphalt (HMA) pavements and thickness and compressive strength for Portland cement con...

  12. 40 CFR 98.164 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Hydrogen Production § 98.164 Monitoring and QA/QC requirements. The GHG emissions data for hydrogen production process units must be quality-assured as specified in..., Hydrogen, and Nitrogen in Petroleum Products and Lubricants (incorporated by reference, see § 98.7). (xi...

  13. Microbiological water methods: quality control measures for Federal Clean Water Act and Safe Drinking Water Act regulatory compliance.

    PubMed

    Root, Patsy; Hunt, Margo; Fjeld, Karla; Kundrat, Laurie

    2014-01-01

    Quality assurance (QA) and quality control (QC) data are required in order to have confidence in the results from analytical tests and the equipment used to produce those results. Some AOAC water methods include specific QA/QC procedures, frequencies, and acceptance criteria, but these are considered to be the minimum controls needed to perform a microbiological method successfully. Some regulatory programs, such as those at Code of Federal Regulations (CFR), Title 40, Part 136.7 for chemistry methods, require additional QA/QC measures beyond those listed in the method, which can also apply to microbiological methods. Essential QA/QC measures include sterility checks, reagent specificity and sensitivity checks, assessment of each analyst's capabilities, analysis of blind check samples, and evaluation of the presence of laboratory contamination and instrument calibration and checks. The details of these procedures, their performance frequency, and expected results are set out in this report as they apply to microbiological methods. The specific regulatory requirements of CFR Title 40 Part 136.7 for the Clean Water Act, the laboratory certification requirements of CFR Title 40 Part 141 for the Safe Drinking Water Act, and the International Organization for Standardization 17025 accreditation requirements under The NELAC Institute are also discussed.

  14. Machine-Specific Magnetic Resonance Imaging Quality Control Procedures for Stereotactic Radiosurgery Treatment Planning

    PubMed Central

    Taghizadeh, Somayeh; Yang, Claus Chunli; R. Kanakamedala, Madhava; Morris, Bart; Vijayakumar, Srinivasan

    2017-01-01

    Purpose Magnetic resonance (MR) images are necessary for accurate contouring of intracranial targets, determination of gross target volume, and evaluation of organs at risk during stereotactic radiosurgery (SRS) treatment planning procedures. Many centers use magnetic resonance imaging (MRI) simulators or regular diagnostic MRI machines for SRS treatment planning; while both types of machine require two stages of quality control (QC), machine-specific and patient-specific, before use for SRS, no accepted guidelines for such QC currently exist. This article describes appropriate machine-specific QC procedures for SRS applications. Methods and materials We describe the adaptation of American College of Radiology (ACR)-recommended QC tests using an ACR MRI phantom for SRS treatment planning. In addition, commercial Quasar MRID3D and Quasar GRID3D phantoms were used to evaluate the effects of static magnetic field (B0) inhomogeneity, gradient nonlinearity, and a Leksell G frame (SRS frame) and its accessories on geometrical distortion in MR images. Results QC procedures found in-plane distortions in the X-direction (Maximum = 3.5 mm, Mean = 0.91 mm, Standard deviation = 0.67 mm, >2.5 mm: 2%), the Y-direction (Maximum = 2.51 mm, Mean = 0.52 mm, Standard deviation = 0.39 mm, >2.5 mm: 0%), and the Z-direction (Maximum = 13.1 mm, Mean = 2.38 mm, Standard deviation = 2.45 mm, >2.5 mm: 34%), and < 1 mm distortion within a head-sized region of interest. MR images acquired using a Leksell G frame and localization devices showed a mean absolute deviation of 2.3 mm from isocenter. The results of modified ACR tests were all within recommended limits, and baseline measurements have been defined for regular weekly QC tests. Conclusions With appropriate QC procedures in place, it is possible to routinely obtain clinically useful MR images suitable for SRS treatment planning purposes. MRI examination for SRS planning can benefit from the improved localization and planning possible

  15. Machine-Specific Magnetic Resonance Imaging Quality Control Procedures for Stereotactic Radiosurgery Treatment Planning.

    PubMed

    Fatemi, Ali; Taghizadeh, Somayeh; Yang, Claus Chunli; R Kanakamedala, Madhava; Morris, Bart; Vijayakumar, Srinivasan

    2017-12-18

    Purpose Magnetic resonance (MR) images are necessary for accurate contouring of intracranial targets, determination of gross target volume, and evaluation of organs at risk during stereotactic radiosurgery (SRS) treatment planning procedures. Many centers use magnetic resonance imaging (MRI) simulators or regular diagnostic MRI machines for SRS treatment planning; while both types of machine require two stages of quality control (QC), machine-specific and patient-specific, before use for SRS, no accepted guidelines for such QC currently exist. This article describes appropriate machine-specific QC procedures for SRS applications. Methods and materials We describe the adaptation of American College of Radiology (ACR)-recommended QC tests using an ACR MRI phantom for SRS treatment planning. In addition, commercial Quasar MRID3D and Quasar GRID3D phantoms were used to evaluate the effects of static magnetic field (B0) inhomogeneity, gradient nonlinearity, and a Leksell G frame (SRS frame) and its accessories on geometrical distortion in MR images. Results QC procedures found in-plane distortions in the X-direction (Maximum = 3.5 mm, Mean = 0.91 mm, Standard deviation = 0.67 mm, >2.5 mm: 2%), the Y-direction (Maximum = 2.51 mm, Mean = 0.52 mm, Standard deviation = 0.39 mm, >2.5 mm: 0%), and the Z-direction (Maximum = 13.1 mm, Mean = 2.38 mm, Standard deviation = 2.45 mm, >2.5 mm: 34%), and < 1 mm distortion within a head-sized region of interest. MR images acquired using a Leksell G frame and localization devices showed a mean absolute deviation of 2.3 mm from isocenter. The results of modified ACR tests were all within recommended limits, and baseline measurements have been defined for regular weekly QC tests. Conclusions With appropriate QC procedures in place, it is possible to routinely obtain clinically useful MR images suitable for SRS treatment planning purposes. MRI examination for SRS planning can benefit from the improved localization and planning

  16. A Rotatable Quality Control Phantom for Evaluating the Performance of Flat Panel Detectors in Imaging Moving Objects.

    PubMed

    Haga, Yoshihiro; Chida, Koichi; Inaba, Yohei; Kaga, Yuji; Meguro, Taiichiro; Zuguchi, Masayuki

    2016-02-01

    As the use of diagnostic X-ray equipment with flat panel detectors (FPDs) has increased, so has the importance of proper management of FPD systems. To ensure quality control (QC) of FPD systems, an easy method for evaluating FPD imaging performance for both stationary and moving objects is required. Until now, simple rotatable QC phantoms have not been available for easy evaluation of the performance (spatial resolution and dynamic range) of FPDs in imaging moving objects. We developed a QC phantom for this purpose. It consists of three thicknesses of copper and a rotatable test pattern of piano wires of various diameters. Initial tests confirmed its stable performance. Our moving phantom is very useful for QC of FPD images of moving objects because it enables easy visual evaluation of imaging performance (spatial resolution and dynamic range).

  17. Comparison of Different Matrices as Potential Quality Control Samples for Neurochemical Dementia Diagnostics.

    PubMed

    Lelental, Natalia; Brandner, Sebastian; Kofanova, Olga; Blennow, Kaj; Zetterberg, Henrik; Andreasson, Ulf; Engelborghs, Sebastiaan; Mroczko, Barbara; Gabryelewicz, Tomasz; Teunissen, Charlotte; Mollenhauer, Brit; Parnetti, Lucilla; Chiasserini, Davide; Molinuevo, Jose Luis; Perret-Liaudet, Armand; Verbeek, Marcel M; Andreasen, Niels; Brosseron, Frederic; Bahl, Justyna M C; Herukka, Sanna-Kaisa; Hausner, Lucrezia; Frölich, Lutz; Labonte, Anne; Poirier, Judes; Miller, Anne-Marie; Zilka, Norbert; Kovacech, Branislav; Urbani, Andrea; Suardi, Silvia; Oliveira, Catarina; Baldeiras, Ines; Dubois, Bruno; Rot, Uros; Lehmann, Sylvain; Skinningsrud, Anders; Betsou, Fay; Wiltfang, Jens; Gkatzima, Olymbia; Winblad, Bengt; Buchfelder, Michael; Kornhuber, Johannes; Lewczuk, Piotr

    2016-03-01

    Assay-vendor independent quality control (QC) samples for neurochemical dementia diagnostics (NDD) biomarkers are so far commercially unavailable. This requires that NDD laboratories prepare their own QC samples, for example by pooling leftover cerebrospinal fluid (CSF) samples. To prepare and test alternative matrices for QC samples that could facilitate intra- and inter-laboratory QC of the NDD biomarkers. Three matrices were validated in this study: (A) human pooled CSF, (B) Aβ peptides spiked into human prediluted plasma, and (C) Aβ peptides spiked into a solution of bovine serum albumin in phosphate-buffered saline. All matrices were also tested after supplementation with an antibacterial agent (sodium azide). We analyzed short- and long-term stability of the biomarkers with ELISA and chemiluminescence (Fujirebio Europe, MSD, IBL International), and performed an inter-laboratory variability study. NDD biomarkers turned out to be stable in almost all samples stored at the tested conditions for up to 14 days, as well as in samples stored deep-frozen (at -80°C) for up to one year. Sodium azide did not influence biomarker stability. Inter-center variability of the samples sent at room temperature (pooled CSF, freeze-dried CSF, and four artificial matrices) was comparable to the results obtained on deep-frozen samples in other large-scale projects. Our results suggest that it is possible to replace self-made, CSF-based QC samples with large-scale volumes of QC materials prepared with artificial peptides and matrices. This would greatly facilitate intra- and inter-laboratory QC schedules for NDD measurements.

  18. Diffusion imaging quality control via entropy of principal direction distribution.

    PubMed

    Farzinfar, Mahshid; Oguz, Ipek; Smith, Rachel G; Verde, Audrey R; Dietrich, Cheryl; Gupta, Aditya; Escolar, Maria L; Piven, Joseph; Pujol, Sonia; Vachet, Clement; Gouttard, Sylvain; Gerig, Guido; Dager, Stephen; McKinstry, Robert C; Paterson, Sarah; Evans, Alan C; Styner, Martin A

    2013-11-15

    Diffusion MR imaging has received increasing attention in the neuroimaging community, as it yields new insights into the microstructural organization of white matter that are not available with conventional MRI techniques. While the technology has enormous potential, diffusion MRI suffers from a unique and complex set of image quality problems, limiting the sensitivity of studies and reducing the accuracy of findings. Furthermore, the acquisition time for diffusion MRI is longer than conventional MRI due to the need for multiple acquisitions to obtain directionally encoded Diffusion Weighted Images (DWI). This leads to increased motion artifacts, reduced signal-to-noise ratio (SNR), and increased proneness to a wide variety of artifacts, including eddy-current and motion artifacts, "venetian blind" artifacts, as well as slice-wise and gradient-wise inconsistencies. Such artifacts mandate stringent Quality Control (QC) schemes in the processing of diffusion MRI data. Most existing QC procedures are conducted in the DWI domain and/or on a voxel level, but our own experiments show that these methods often do not fully detect and eliminate certain types of artifacts, which are often only visible when investigating groups of DWIs or a derived diffusion model, such as the most-employed diffusion tensor imaging (DTI). Here, we propose a novel regional QC measure in the DTI domain that employs the entropy of the regional distribution of the principal directions (PD). The PD entropy quantifies the scattering and spread of the principal diffusion directions and is invariant to the patient's position in the scanner. A high entropy value indicates that the PDs are distributed relatively uniformly, while a low entropy value indicates the presence of clusters in the PD distribution. The novel QC measure is intended to complement the existing set of QC procedures by detecting and correcting residual artifacts. Such residual artifacts cause directional bias in the measured PD and here called
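    The regional entropy measure described above lends itself to a compact sketch. The binning scheme below (an equal-angle grid on the half-sphere, chosen here for illustration; the paper's actual regional binning and normalization are not reproduced) bins unit principal-direction vectors and computes the Shannon entropy of the bin occupancy: uniformly spread directions score high, clustered directions score low.

    ```python
    import numpy as np

    def pd_entropy(directions: np.ndarray, n_bins: int = 16) -> float:
        """Shannon entropy of principal-direction vectors binned on the half-sphere.

        `directions` is an (N, 3) array of unit vectors; antipodal pairs are
        folded together, matching the sign ambiguity of principal directions.
        """
        x, y, z = directions[:, 0], directions[:, 1], directions[:, 2]
        theta = np.arccos(np.clip(np.abs(z), 0.0, 1.0))  # polar angle, folded
        phi = np.mod(np.arctan2(y, x), np.pi)            # azimuth, folded
        hist, _, _ = np.histogram2d(theta, phi, bins=n_bins)
        p = hist.ravel() / hist.sum()
        p = p[p > 0]                                     # ignore empty bins
        return float(-(p * np.log(p)).sum())

    rng = np.random.default_rng(0)
    spread = rng.normal(size=(5000, 3))
    spread /= np.linalg.norm(spread, axis=1, keepdims=True)  # ~uniform directions
    clustered = np.tile([0.0, 0.0, 1.0], (5000, 1))          # one dominant PD
    print(pd_entropy(spread) > pd_entropy(clustered))  # True: clustering lowers entropy
    ```

    In the QC setting sketched here, a region whose entropy falls well outside the range typical for comparable anatomy would be flagged for inspection.
    
    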

  19. Diffusion imaging quality control via entropy of principal direction distribution

    PubMed Central

    Oguz, Ipek; Smith, Rachel G.; Verde, Audrey R.; Dietrich, Cheryl; Gupta, Aditya; Escolar, Maria L.; Piven, Joseph; Pujol, Sonia; Vachet, Clement; Gouttard, Sylvain; Gerig, Guido; Dager, Stephen; McKinstry, Robert C.; Paterson, Sarah; Evans, Alan C.; Styner, Martin A.

    2013-01-01

    Diffusion MR imaging has received increasing attention in the neuroimaging community, as it yields new insights into the microstructural organization of white matter that are not available with conventional MRI techniques. While the technology has enormous potential, diffusion MRI suffers from a unique and complex set of image quality problems, limiting the sensitivity of studies and reducing the accuracy of findings. Furthermore, the acquisition time for diffusion MRI is longer than conventional MRI due to the need for multiple acquisitions to obtain directionally encoded Diffusion Weighted Images (DWI). This leads to increased motion artifacts, reduced signal-to-noise ratio (SNR), and increased proneness to a wide variety of artifacts, including eddy-current and motion artifacts, “venetian blind” artifacts, as well as slice-wise and gradient-wise inconsistencies. Such artifacts mandate stringent Quality Control (QC) schemes in the processing of diffusion MRI data. Most existing QC procedures are conducted in the DWI domain and/or on a voxel level, but our own experiments show that these methods often do not fully detect and eliminate certain types of artifacts, which are often only visible when investigating groups of DWIs or a derived diffusion model, such as the most-employed diffusion tensor imaging (DTI). Here, we propose a novel regional QC measure in the DTI domain that employs the entropy of the regional distribution of the principal directions (PD). The PD entropy quantifies the scattering and spread of the principal diffusion directions and is invariant to the patient's position in the scanner. A high entropy value indicates that the PDs are distributed relatively uniformly, while a low entropy value indicates the presence of clusters in the PD distribution. The novel QC measure is intended to complement the existing set of QC procedures by detecting and correcting residual artifacts. Such residual artifacts cause directional bias in the measured PD and here

  20. SIRFLOX: Randomized Phase III Trial Comparing First-Line mFOLFOX6 (Plus or Minus Bevacizumab) Versus mFOLFOX6 (Plus or Minus Bevacizumab) Plus Selective Internal Radiation Therapy in Patients With Metastatic Colorectal Cancer.

    PubMed

    van Hazel, Guy A; Heinemann, Volker; Sharma, Navesh K; Findlay, Michael P N; Ricke, Jens; Peeters, Marc; Perez, David; Robinson, Bridget A; Strickland, Andrew H; Ferguson, Tom; Rodríguez, Javier; Kröning, Hendrik; Wolf, Ido; Ganju, Vinod; Walpole, Euan; Boucher, Eveline; Tichler, Thomas; Shacham-Shmueli, Einat; Powell, Alex; Eliadis, Paul; Isaacs, Richard; Price, David; Moeslein, Fred; Taieb, Julien; Bower, Geoff; Gebski, Val; Van Buskirk, Mark; Cade, David N; Thurston, Kenneth; Gibbs, Peter

    2016-05-20

    SIRFLOX was a randomized, multicenter trial designed to assess the efficacy and safety of adding selective internal radiation therapy (SIRT) using yttrium-90 resin microspheres to standard fluorouracil, leucovorin, and oxaliplatin (FOLFOX)-based chemotherapy in patients with previously untreated metastatic colorectal cancer. Chemotherapy-naïve patients with liver metastases plus or minus limited extrahepatic metastases were randomly assigned to receive either modified FOLFOX (mFOLFOX6; control) or mFOLFOX6 plus SIRT (SIRT) plus or minus bevacizumab. The primary end point was progression-free survival (PFS) at any site as assessed by independent centralized radiology review blinded to study arm. Between October 2006 and April 2013, 530 patients were randomly assigned to treatment (control, 263; SIRT, 267). Median PFS at any site was 10.2 v 10.7 months in control versus SIRT (hazard ratio, 0.93; 95% CI, 0.77 to 1.12; P = .43). Median PFS in the liver by competing risk analysis was 12.6 v 20.5 months in control versus SIRT (hazard ratio, 0.69; 95% CI, 0.55 to 0.90; P = .002). Objective response rates (ORRs) at any site were similar (68.1% v 76.4% in control v SIRT; P = .113). ORR in the liver was improved with the addition of SIRT (68.8% v 78.7% in control v SIRT; P = .042). Grade ≥ 3 adverse events, including recognized SIRT-related effects, were reported in 73.4% and 85.4% of patients in control versus SIRT. The addition of SIRT to FOLFOX-based first-line chemotherapy in patients with liver-dominant or liver-only metastatic colorectal cancer did not improve PFS at any site but significantly delayed disease progression in the liver. The safety profile was as expected and was consistent with previous studies. © 2016 by American Society of Clinical Oncology.

  1. The Development of Quality Control Genotyping Approaches: A Case Study Using Elite Maize Lines.

    PubMed

    Chen, Jiafa; Zavala, Cristian; Ortega, Noemi; Petroli, Cesar; Franco, Jorge; Burgueño, Juan; Costich, Denise E; Hearne, Sarah J

    2016-01-01

    Quality control (QC) of germplasm identity and purity is a critical component of breeding and conservation activities. SNP genotyping technologies and increased availability of markers provide the opportunity to employ genotyping as a low-cost and robust component of this QC. In the public sector, available low-cost SNP QC genotyping methods have been developed from very limited panels of 1,000 to 1,500 markers, without broad selection of the most informative SNPs. Selection of optimal SNPs and definition of appropriate germplasm sampling, in addition to platform selection, impact logistical and resource-use considerations for breeding and conservation applications when mainstreaming QC. In order to address these issues, we evaluated the selection and use of SNPs for QC applications from large DArTSeq data sets generated from CIMMYT maize inbred lines (CMLs). Two QC genotyping strategies were developed: the first is a "rapid QC", employing a small number of SNPs to identify potential mislabeling of seed packages or plots; the second is a "broad QC", employing a larger number of SNPs, used to identify each germplasm entry and to measure heterogeneity. The optimal marker selection strategies combined the selection of markers with high minor allele frequency, sampling of clustered SNPs in proportion to marker cluster distance, and selection of markers that maintain a uniform genomic distribution. The rapid and broad QC SNP panels selected using this approach were further validated using blind test assessments of related re-generation samples. The influence of sampling within each line was evaluated. Sampling 192 individuals would result in close to 100% probability of detecting a 5% contamination in the entry, and approximately a 98% probability of detecting a 2% contamination of the line. These results provide a framework for the establishment of QC genotyping. A comparison of financial and time costs for use of these approaches across different platforms is
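    The quoted detection probabilities are consistent with a simple binomial sampling model in which each of n sampled individuals is independently a contaminant with probability p, so the chance of catching at least one is 1 - (1 - p)^n. A minimal sketch (the model is our reading of the abstract, not spelled out in it):

    ```python
    def detection_probability(n: int, p: float) -> float:
        """Probability that sampling n individuals catches at least one
        contaminant when a fraction p of the line is contaminated."""
        return 1.0 - (1.0 - p) ** n

    # Sampling 192 individuals, as in the study:
    print(round(detection_probability(192, 0.05), 4))  # 0.9999 (close to 100%)
    print(round(detection_probability(192, 0.02), 4))  # 0.9793 (approximately 98%)
    ```

    The same expression can be inverted to choose a sample size n for a target detection probability at a given contamination level.
    
    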

  2. Selecting Statistical Procedures for Quality Control Planning Based on Risk Management.

    PubMed

    Yago, Martín; Alcover, Silvia

    2016-07-01

    According to the traditional approach to statistical QC planning, the performance of QC procedures is assessed in terms of its probability of rejecting an analytical run that contains critical size errors (PEDC). Recently, the maximum expected increase in the number of unacceptable patient results reported during the presence of an undetected out-of-control error condition [Max E(NUF)], has been proposed as an alternative QC performance measure because it is more related to the current introduction of risk management concepts for QC planning in the clinical laboratory. We used a statistical model to investigate the relationship between PEDC and Max E(NUF) for simple QC procedures widely used in clinical laboratories and to construct charts relating Max E(NUF) with the capability of the analytical process that allow for QC planning based on the risk of harm to a patient due to the report of erroneous results. A QC procedure shows nearly the same Max E(NUF) value when used for controlling analytical processes with the same capability, and there is a close relationship between PEDC and Max E(NUF) for simple QC procedures; therefore, the value of PEDC can be estimated from the value of Max E(NUF) and vice versa. QC procedures selected by their high PEDC value are also characterized by a low value for Max E(NUF). The PEDC value can be used for estimating the probability of patient harm, allowing for the selection of appropriate QC procedures in QC planning based on risk management. © 2016 American Association for Clinical Chemistry.

  3. Double-quality control reveals high-level toxicity in gloves used for operator protection in assisted reproductive technology.

    PubMed

    Lierman, Sylvie; De Sutter, Petra; Dhont, Marc; Van der Elst, Josiane

    2007-10-01

    To submit different glove brands to double-quality control tests using the mouse embryo assay (MEA) and the human sperm motility assay (HuSMA). Operator protection against infectious body fluid contamination is a safety issue in assisted reproductive technology (ART). When using gloves in the ART laboratory, toxic substances can be transmitted to culture media, even during brief contact. Quality control study of gloves in ART. University hospital-based infertility center. Seven- to 8-week-old female B6D2F1 hybrid mice. We tested two surgical, two cleanroom, and six examination glove brands. Only glove brands that passed both HuSMA and MEA were submitted to further QC using zona-free and/or cryopreserved MEA. Sperm motility index, two-cell and blastocyst development, blastocyst total cell number. Quality control by MEA and HuSMA identified two glove brands to be nontoxic. Our study shows that gloves used in ART can be toxic and should be tested as part of an ongoing quality control program.

  4. Evaluating signal and noise spectral density of a qPlus sensor with an active feedback control

    NASA Astrophysics Data System (ADS)

    Lee, Manhee; An, Sangmin; Jhe, Wonho

    2018-05-01

    The Q-control technique enables active tuning of the quality factor of the probe oscillation in dynamic atomic force microscopy. Q-control is realized by adding a self-feedback loop to the original actuation-detection system, in which a damping force with a damping coefficient controllable in magnitude and sign is applied to the oscillating probe. While the applied force alters the total damping interaction and thus the overall 'signal' of the probe motion, the added feedback system changes the 'noise' of the motion as well. Here, we systematically investigate the signal, the noise, and the signal-to-noise ratio of the qPlus sensor under active Q-control. We quantify the noise of the qPlus motion by measuring the noise spectral density, which is reproduced by a harmonic oscillator model including the thermal and measurement noises. We show that the noise signal increases with the controlled quality factor, scaling as the square root of the quality factor. Because the overall signal is linearly proportional to the quality factor, the signal-to-noise ratio scales as the square root of the quality factor. The Q-controlled qPlus with a highly enhanced Q, up to 10,000 in air, leads to a minimum detectable force gradient of 0.001 N/m, which would enhance the capability of the qPlus sensor for atomic force microscopy and spectroscopy.
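    The reported scaling can be stated compactly: with signal proportional to Q and noise proportional to sqrt(Q), the signal-to-noise ratio grows as sqrt(Q). A small sketch (the proportionality constants s0 and n0 are hypothetical normalizations, not values from the paper):

    ```python
    import math

    def snr(q: float, s0: float = 1.0, n0: float = 1.0) -> float:
        signal = s0 * q            # overall signal scales linearly with Q
        noise = n0 * math.sqrt(q)  # noise scales as the square root of Q
        return signal / noise      # hence SNR scales as sqrt(Q)

    # Raising Q by a factor of 100 (e.g. toward the Q ~ 10,000 reached in air)
    # improves the signal-to-noise ratio by a factor of 10:
    print(snr(10_000) / snr(100))  # 10.0
    ```

    This is why the hundredfold Q enhancement reported for the Q-controlled qPlus translates into roughly an order-of-magnitude gain in force-gradient sensitivity rather than a hundredfold gain.
    
    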

  5. MO-AB-210-00: Diagnostic Ultrasound Imaging Quality Control and High Intensity Focused Ultrasound Therapy Hands-On Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    The goal of this ultrasound hands-on workshop is to demonstrate advancements in high intensity focused ultrasound (HIFU) and to demonstrate quality control (QC) testing in diagnostic ultrasound. HIFU is a therapeutic modality that uses ultrasound waves as carriers of energy. HIFU is used to focus a beam of ultrasound energy into a small volume at specific target locations within the body. The focused beam causes localized high temperatures and produces a well-defined region of necrosis. This completely non-invasive technology has great potential for tumor ablation and targeted drug delivery. At the workshop, attendees will see configurations, applications, and hands-on demonstrations with on-site instructors at separate stations. The involvement of medical physicists in diagnostic ultrasound imaging service is increasing due to QC and accreditation requirements. At the workshop, an array of ultrasound testing phantoms and ultrasound scanners will be provided for attendees to learn diagnostic ultrasound QC in a hands-on environment with live demonstrations of the techniques. Target audience: medical physicists and other medical professionals in diagnostic imaging and radiation oncology with an interest in high-intensity focused ultrasound and in diagnostic ultrasound QC. Learning objectives: learn ultrasound physics and safety for HIFU applications through live demonstrations; get an overview of the state-of-the-art in HIFU technologies and equipment; gain familiarity with common elements of a quality control program for diagnostic ultrasound imaging; identify QC tools available for testing diagnostic ultrasound systems and learn how to use these tools. Supporting vendors for the HIFU and diagnostic ultrasound QC hands-on workshop: Philips Healthcare; Alpinion Medical Systems; Verasonics, Inc; Zonare Medical Systems, Inc; Computerized Imaging Reference Systems (CIRS), Inc.; GAMMEX, Inc.; Cablon Medical BV. Steffen Sammet: NIH/NCI grant 5R25CA132822, NIH/NINDS grant

  6. Design, implementation, and quality control in the Pathways American-Indian multicenter trial

    PubMed Central

    Stone, Elaine J.; Norman, James E.; Davis, Sally M.; Stewart, Dawn; Clay, Theresa E.; Caballero, Ben; Lohman, Timothy G.; Murray, David M.

    2016-01-01

    Background Pathways was the first multicenter American-Indian school-based study to test the effectiveness of an obesity prevention program promoting healthy eating and physical activity. Methods Pathways employed a nested cohort design in which 41 schools were randomized to intervention or control conditions and students within these schools were followed as a cohort (1,704 third graders at baseline). The study’s primary endpoint was percent body fat. Secondary endpoints were levels of fat in school lunches; time spent in physical activity; and knowledge, attitudes, and behaviors regarding diet and exercise. Quality control (QC) included design of data management systems which provided standardization and quality assurance of data collection and processing. Data QC procedures at study centers included manuals of operation, training and certification, and monitoring of performance. Process evaluation was conducted to monitor dose and fidelity of the interventions. Registration and tracking systems were used for students and schools. Results No difference in mean percent body fat at fifth grade was found between the intervention and control schools. Percent of calories from fat and saturated fat in school lunches was significantly reduced in the intervention schools as was total energy intake from 24-hour recalls. Significant increases in self-reported physical activity levels and knowledge of healthy behaviors were found for the intervention school students. Conclusions The Pathways study results provide evidence demonstrating the role schools can play in public health promotion. Its study design and QC systems and procedures provide useful models for other similar school based multi- or single-site studies. PMID:14636805

  7. SU-E-T-103: Development and Implementation of Web Based Quality Control Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Studinski, R; Taylor, R; Angers, C

    Purpose: Historically, many radiation medicine programs have maintained their quality control (QC) test results in paper records or Microsoft Excel worksheets. Both approaches present significant logistical challenges and are not predisposed to data review and approval. Our group's aim has been to develop and implement web-based software designed not just to record and store QC data in a centralized database, but to provide scheduling and data review tools to help manage a radiation therapy clinic's equipment quality control program. Methods: The software was written in the Python programming language using the Django web framework. In order to promote collaboration and validation from other centres, the code was made open source and is freely available to the public via an online source code repository. The code was written to provide a common user interface for data entry, formalize the review and approval process, and offer automated data trending and process control analysis of test results. Results: As of February 2014, our installation of QATrack+ has 180 tests defined in its database and has collected ∼22,000 test results, all of which have been reviewed and approved by a physicist via QATrack+'s review tools. These results include records for quality control of Elekta accelerators, CT simulators, our brachytherapy programme, TomoTherapy and CyberKnife units. Currently at least 5 other centres are known to be running QATrack+ clinically, forming the start of an international user community. Conclusion: QATrack+ has proven to be an effective tool for collecting radiation therapy QC data, allowing for rapid review and trending of data for a wide variety of treatment units. As free and open source software, all source code, documentation and a bug tracker are available to the public at https://bitbucket.org/tohccmedphys/qatrackplus/.

  8. Automated locomotor activity monitoring as a quality control assay for mass-reared tephritid flies.

    PubMed

    Dominiak, Bernard C; Fanson, Benjamin G; Collins, Samuel R; Taylor, Phillip W

    2014-02-01

    The Sterile Insect Technique (SIT) requires vast numbers of consistently high-quality insects to be produced over long periods. Quality control (QC) procedures are critical to effective SIT, both providing quality assurance and warning of operational deficiencies. Here we present a potential new QC assay for mass rearing of Queensland fruit flies (Bactrocera tryoni Froggatt) for SIT: locomotor activity monitoring. We investigated whether automated locomotor activity monitors (LAMs), which simply detect how often a fly passes an infrared sensor in a glass tube, might provide similar insights with much greater economy. Activity levels were generally lower for females than for males, and declined over five days in the monitor for both sexes. Female activity levels were not affected by irradiation, but males irradiated at 60 or 70 Gy had reduced activity levels compared with unirradiated controls. We also found some evidence that mild heat shock of pupae results in adults with reduced activity. LAM offers a convenient, effective and economical assay to probe such changes. © 2013 Society of Chemical Industry.

  9. Cost-effectiveness analysis of panitumumab plus mFOLFOX6 compared with bevacizumab plus mFOLFOX6 for first-line treatment of patients with wild-type RAS metastatic colorectal cancer.

    PubMed

    Graham, Christopher N; Hechmati, Guy; Hjelmgren, Jonas; de Liège, Frédérique; Lanier, Julie; Knox, Hediyyih; Barber, Beth

    2014-11-01

    To investigate the cost-effectiveness of panitumumab plus mFOLFOX6 (oxaliplatin, 5-fluorouracil and leucovorin) compared with bevacizumab plus mFOLFOX6 in first-line treatment of patients with wild-type RAS metastatic colorectal cancer (mCRC). A semi-Markov model was constructed from a French health collective perspective, with health states related to first-line treatment (progression-free), disease progression with and without subsequent active treatment, resection of metastases, disease-free after successful resection and death. Parametric survival analyses of patient-level progression-free and overall survival data from the only head-to-head clinical trial of panitumumab and bevacizumab (PEAK) were performed to estimate transitions to disease progression and death. Additional data from PEAK informed the amount of each drug consumed, duration of therapy, subsequent therapy use, and toxicities related to mCRC treatment. Literature and French public data sources were used to estimate unit costs associated with treatment and duration of subsequent active therapies. Utility weights were calculated from patient-level data from panitumumab trials in the first-, second- and third-line settings. A life-time perspective was applied. Scenario, one-way, and probabilistic sensitivity analyses were performed. Based on a head-to-head clinical trial that demonstrates better efficacy outcomes for patients with wild-type RAS mCRC who receive panitumumab plus mFOLFOX6 versus bevacizumab plus mFOLFOX6, the incremental cost per life-year gained was estimated to be €26,918, and the incremental cost per quality-adjusted life year (QALY) gained was estimated to be €36,577. Sensitivity analyses indicate the model is robust to alternative parameters and assumptions. The incremental cost per QALY gained indicates that panitumumab plus mFOLFOX6 represents good value for money in comparison to bevacizumab plus mFOLFOX6 and, with a willingness-to-pay ranging from €40,000 to €60

  10. Internal quality control: planning and implementation strategies.

    PubMed

    Westgard, James O

    2003-11-01

    The first essential in setting up internal quality control (IQC) of a test procedure in the clinical laboratory is to select the proper IQC procedure to implement, i.e. choosing the statistical criteria or control rules, and the number of control measurements, according to the quality required for the test and the observed performance of the method. Then the right IQC procedure must be properly implemented. This review focuses on strategies for planning and implementing IQC procedures in order to improve the quality of the IQC. A quantitative planning process is described that can be implemented with graphical tools such as power function or critical-error graphs and charts of operating specifications. Finally, a total QC strategy is formulated to minimize cost and maximize quality. A general strategy for IQC implementation is recommended that employs a three-stage design in which the first stage provides high error detection, the second stage low false rejection and the third stage prescribes the length of the analytical run, making use of an algorithm involving the average of normal patients' data.
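The statistical control rules discussed above can be made concrete. A minimal sketch of two widely used Westgard rules applied to control results expressed as z-scores ((observed - target mean) / SD); in practice the rule combination would be chosen through the power-function planning the review describes:

```python
# Minimal sketch of two common Westgard control rules applied to QC results
# expressed as z-scores. Illustrative only; a real IQC design would select
# rules and numbers of control measurements from power-function analysis.

def rule_1_3s(z_scores):
    """Reject the run if any single control result exceeds +/-3 SD."""
    return any(abs(z) > 3 for z in z_scores)

def rule_2_2s(z_scores):
    """Reject if two consecutive controls exceed 2 SD on the same side."""
    return any((z1 > 2 and z2 > 2) or (z1 < -2 and z2 < -2)
               for z1, z2 in zip(z_scores, z_scores[1:]))

run = [0.4, -1.1, 2.3, 2.6, 0.2]           # hypothetical control run
print("1_3s violation:", rule_1_3s(run))   # no single |z| > 3
print("2_2s violation:", rule_2_2s(run))   # 2.3 and 2.6 both exceed +2 SD
```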

  11. Spatial Data Quality Control Procedure applied to the Okavango Basin Information System

    NASA Astrophysics Data System (ADS)

    Butchart-Kuhlmann, Daniel

    2014-05-01

    Spatial data is a powerful form of information, capable of providing information of great interest and tremendous use to a variety of users. However, much like other data representing the 'real world', precision and accuracy must be high for the results of data analysis to be deemed reliable and thus applicable to real-world projects and undertakings. The spatial data quality control (QC) procedure presented here was developed as the topic of a Master's thesis within, and using data from, the Okavango Basin Information System (OBIS), itself a part of The Future Okavango (TFO) project. The aim of the QC procedure was to form the basis of a method through which to determine the quality of spatial data relevant for application to hydrological, solute, and erosion transport modelling using the Jena Adaptable Modelling System (JAMS). As such, the quality of all data present in OBIS classified under the topics of elevation, geoscientific information, or inland waters was evaluated. Now that the initial data quality has been evaluated, efforts are underway to correct the errors found, thus improving the quality of the dataset.

  12. Moving beyond quality control in diagnostic radiology and the role of the clinically qualified medical physicist.

    PubMed

    Delis, H; Christaki, K; Healy, B; Loreti, G; Poli, G L; Toroi, P; Meghzifene, A

    2017-09-01

    Quality control (QC), according to ISO definitions, represents the most basic level of quality. It is considered to be a snapshot of the performance or the characteristics of a product or service, taken to verify that it complies with the requirements. Although it is usually believed that "the role of medical physicists in Diagnostic Radiology is QC", this view not only limits the contribution of medical physicists, but is also no longer adequate to meet the needs of Diagnostic Radiology in terms of quality. To assure quality practices, more organized activities and efforts are required in the modern era of diagnostic radiology. The complete system of QC is just one element of a comprehensive quality assurance (QA) program that aims at ensuring that the requirements of quality of a product or service will consistently be fulfilled. A comprehensive quality system starts even before the procurement of any equipment, as the needs analysis and the development of specifications are important components under the QA framework. Further expanding this framework of QA, a comprehensive Quality Management System can provide additional benefits to a Diagnostic Radiology service. Harmonized policies and procedures and elements such as a mission statement or job descriptions can provide clarity and consistency in the services provided, enhancing the outcome and representing a solid platform for quality improvement. The International Atomic Energy Agency (IAEA) promotes this comprehensive quality approach in diagnostic imaging and especially supports the field of comprehensive clinical audits as a tool for quality improvement. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  13. Analysis of CrIS/ATMS using AIRS Version-7 Retrieval and QC Methodology

    NASA Astrophysics Data System (ADS)

    Susskind, J.; Kouvaris, L. C.; Blaisdell, J. M.; Iredell, L. F.

    2017-12-01

    The objective of the proposed research is to develop, implement, test, and refine a CrIS/ATMS retrieval algorithm which will produce monthly mean data products that are compatible with those of the soon-to-be-operational AIRS V7 retrieval algorithm. This is a necessary condition for CrIS/ATMS on NPP and future missions to serve as adequate follow-ons to AIRS for the monitoring of climate variability and trends. Of particular importance toward this end is achieving agreement of monthly mean fields of CrIS and AIRS geophysical parameters on a 1 deg by 1 deg spatial scale, and, more significantly, agreement of their interannual differences. Indications are that the best way to achieve this is to use scientific retrieval and Quality Control (QC) methodology for CrIS/ATMS which is analogous to that which will be used in AIRS V7. We refer to the current scientific candidate for AIRS V7 as AIRS Sounder Research Team (SRT) V6.42, which currently runs at JPL on the AIRS Team Leader Scientific Facility (TLSCF). We ported CrIS SRT V6.42 Level 2 (L2) retrieval code and QC methodology to run at the Sounder SIPS at JPL. The months of January and July 2015 were both processed at JPL using AIRS and CrIS at the TLSCF and SIPS respectively. This paper shows excellent agreement of AIRS and CrIS single-day and monthly mean products on a 1 deg lat by 1 deg long spatial grid with each other and with other satellites' measurements of the same products.

  14. Quality control of CT systems by automated monitoring of key performance indicators: a two‐year study

    PubMed Central

    Bujila, Robert; Poludniowski, Gavin; Fransson, Annette

    2015-01-01

    The purpose of this study was to develop a method of performing routine periodical quality controls (QC) of CT systems by automatically analyzing key performance indicators (KPIs), obtainable from images of manufacturers' quality assurance (QA) phantoms. A KPI pertains to a measurable or determinable QC parameter that is influenced by other underlying fundamental QC parameters. The established KPIs are based on relationships between existing QC parameters used in the annual testing program of CT scanners at the Karolinska University Hospital in Stockholm, Sweden. The KPIs include positioning, image noise, uniformity, homogeneity, the CT number of water, and the CT number of air. An application (MonitorCT) was developed to automatically evaluate phantom images in terms of the established KPIs. The developed methodology has been used for two years in clinical routine, where CT technologists perform daily scans of the manufacturer's QA phantom and automatically send the images to MonitorCT for KPI evaluation. In the cases where results were out of tolerance, actions could be initiated in less than 10 min. 900 QC scans from two CT scanners have been collected and analyzed over the two‐year period that MonitorCT has been active. Two types of errors have been registered in this period: a ring artifact was discovered with the image noise test, and a calibration error was detected multiple times with the CT number test. In both cases, results were outside the tolerances defined for MonitorCT, as well as by the vendor. Automated monitoring of KPIs is a powerful tool that can be used to supplement established QC methodologies. Medical physicists and other professionals concerned with the performance of a CT system will, using such methods, have access to comprehensive data on the current and historical (trend) status of the system such that swift actions can be taken in order to ensure the quality of the CT examinations, patient safety, and minimal disruption of service

  15. Quality control of CT systems by automated monitoring of key performance indicators: a two-year study.

    PubMed

    Nowik, Patrik; Bujila, Robert; Poludniowski, Gavin; Fransson, Annette

    2015-07-08

    The purpose of this study was to develop a method of performing routine periodical quality controls (QC) of CT systems by automatically analyzing key performance indicators (KPIs), obtainable from images of manufacturers' quality assurance (QA) phantoms. A KPI pertains to a measurable or determinable QC parameter that is influenced by other underlying fundamental QC parameters. The established KPIs are based on relationships between existing QC parameters used in the annual testing program of CT scanners at the Karolinska University Hospital in Stockholm, Sweden. The KPIs include positioning, image noise, uniformity, homogeneity, the CT number of water, and the CT number of air. An application (MonitorCT) was developed to automatically evaluate phantom images in terms of the established KPIs. The developed methodology has been used for two years in clinical routine, where CT technologists perform daily scans of the manufacturer's QA phantom and automatically send the images to MonitorCT for KPI evaluation. In the cases where results were out of tolerance, actions could be initiated in less than 10 min. 900 QC scans from two CT scanners have been collected and analyzed over the two-year period that MonitorCT has been active. Two types of errors have been registered in this period: a ring artifact was discovered with the image noise test, and a calibration error was detected multiple times with the CT number test. In both cases, results were outside the tolerances defined for MonitorCT, as well as by the vendor. Automated monitoring of KPIs is a powerful tool that can be used to supplement established QC methodologies. Medical physicists and other professionals concerned with the performance of a CT system will, using such methods, have access to comprehensive data on the current and historical (trend) status of the system such that swift actions can be taken in order to ensure the quality of the CT examinations, patient safety, and minimal disruption of service.
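The KPI-monitoring idea behind MonitorCT can be sketched in a few lines: compare each daily phantom measurement with a tolerance window and report the KPIs that fall outside it. The KPI names follow the abstract, but the tolerance values and measurements below are hypothetical placeholders, not MonitorCT's or the vendor's:

```python
# Sketch of KPI tolerance checking in the spirit of the MonitorCT application.
# Tolerance windows are hypothetical placeholders, not vendor specifications.

TOLERANCES = {                       # KPI: (lower, upper), illustrative only
    "ct_number_water": (-4.0, 4.0),          # HU
    "ct_number_air": (-1005.0, -995.0),      # HU
    "image_noise": (0.0, 6.0),               # HU SD in water
    "uniformity": (-2.0, 2.0),               # HU centre-vs-periphery difference
}

def check_kpis(measurements: dict) -> list:
    """Return the list of KPIs that fall outside their tolerance window."""
    failures = []
    for kpi, value in measurements.items():
        lo, hi = TOLERANCES[kpi]
        if not lo <= value <= hi:
            failures.append(kpi)
    return failures

daily_scan = {"ct_number_water": 5.2, "ct_number_air": -999.0,
              "image_noise": 4.1, "uniformity": 0.8}
print(check_kpis(daily_scan))   # a calibration drift shows up via ct_number_water
```

In the same spirit as the study, an out-of-tolerance result would trigger an action within minutes rather than waiting for the annual QC programme.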

  16. The importance of quality control in validating concentrations of contaminants of emerging concern in source and treated drinking water samples.

    PubMed

    Batt, Angela L; Furlong, Edward T; Mash, Heath E; Glassmeyer, Susan T; Kolpin, Dana W

    2017-02-01

    A national-scale survey of 247 contaminants of emerging concern (CECs), including organic and inorganic chemical compounds, and microbial contaminants, was conducted in source and treated drinking water samples from 25 treatment plants across the United States. Multiple methods were used to determine these CECs, including six analytical methods to measure 174 pharmaceuticals, personal care products, and pesticides. A three-component quality assurance/quality control (QA/QC) program was designed for the subset of 174 CECs, which allowed us to assess and compare the performance of the methods used. The three components included: 1) a common field QA/QC protocol and sample design, 2) individual investigator-developed method-specific QA/QC protocols, and 3) a suite of 46 method comparison analytes that were determined in two or more analytical methods. Overall method performance for the 174 organic chemical CECs was assessed by comparing spiked recoveries in reagent, source, and treated water over a two-year period. In addition to the 247 CECs reported in the larger drinking water study, another 48 pharmaceutical compounds measured did not consistently meet predetermined quality standards. Methodologies that proved unsuitable for these analytes are reviewed. The need to exclude analytes based on method performance demonstrates the importance of additional QA/QC protocols. Published by Elsevier B.V.

  17. Quality control ranges for testing broth microdilution susceptibility of Flavobacterium columnare and F. psychrophilum to nine antimicrobials

    USDA-ARS?s Scientific Manuscript database

    A multi-laboratory broth microdilution method trial was performed to standardize the specialized test conditions required for the fish pathogens Flavobacterium columnare and F. psychrophilum. Nine laboratories tested the quality control (QC) strains Escherichia coli ATCC 25922 and Aeromonas salmonicid...

  18. PACS 2000: quality control using the task allocation chart

    NASA Astrophysics Data System (ADS)

    Norton, Gary S.; Romlein, John R.; Lyche, David K.; Richardson, Ronald R., Jr.

    2000-05-01

    Medical imaging's technological evolution in the next century will continue to include Picture Archive and Communication Systems (PACS) and teleradiology. It is difficult to predict radiology's future in the new millennium with both computed radiography and direct digital capture competing as the primary image acquisition methods for routine radiography. Changes in Computed Axial Tomography (CT) and Magnetic Resonance Imaging (MRI) continue to amaze the healthcare community. No matter how the acquisition, display, and archive functions change, Quality Control (QC) of the radiographic imaging chain will remain an important step in the imaging process. The Task Allocation Chart (TAC) is a tool that can be used in a medical facility's QC process to indicate the testing responsibilities of the image stakeholders and the medical informatics department. The TAC shows a grid of equipment to be serviced, tasks to be performed, and the organization assigned to perform each task. Additionally, skills, tasks, time, and references for each task can be provided. QC of the PACS must be stressed as a primary element of a PACS implementation. The TAC can be used to clarify responsibilities during warranty and paid maintenance periods. Establishing a TAC as part of a PACS implementation has a positive effect on patient care and clinical acceptance.

  19. Office of Student Financial Aid Quality Improvement Program: Design and Implementation Plan.

    ERIC Educational Resources Information Center

    Advanced Technology, Inc., Reston, VA.

    The purpose and direction of the Office of Student Financial Aid (OSFA) quality improvement program are described. The background and context for the Pell Grant quality control (QC) design study and the meaning of QC are reviewed. The general approach to quality improvement consists of the following elements: a strategic approach that enables OSFA…

  20. Direct infusion mass spectrometry metabolomics dataset: a benchmark for data processing and quality control

    PubMed Central

    Kirwan, Jennifer A; Weber, Ralf J M; Broadhurst, David I; Viant, Mark R

    2014-01-01

    Direct-infusion mass spectrometry (DIMS) metabolomics is an important approach for characterising molecular responses of organisms to disease, drugs and the environment. Increasingly large-scale metabolomics studies are being conducted, necessitating improvements in both bioanalytical and computational workflows to maintain data quality. This dataset represents a systematic evaluation of the reproducibility of a multi-batch DIMS metabolomics study of cardiac tissue extracts. It comprises twenty biological samples (cow vs. sheep) that were analysed repeatedly, in 8 batches across 7 days, together with a concurrent set of quality control (QC) samples. Data are presented from each step of the workflow and are available in MetaboLights. The strength of the dataset is that intra- and inter-batch variation can be corrected using QC spectra and the quality of this correction assessed independently using the repeatedly-measured biological samples. Originally designed to test the efficacy of a batch-correction algorithm, it will enable others to evaluate novel data processing algorithms. Furthermore, this dataset serves as a benchmark for DIMS metabolomics, derived using best-practice workflows and rigorous quality assessment. PMID:25977770
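The QC-based correction such a dataset is designed to test can be sketched in simplified form: fit a drift trend to the QC-sample intensities over injection order, then divide every sample by the fitted trend. The linear fit below is illustrative only; published workflows (e.g. QC-RLSC) use smoother fits such as LOWESS, and all intensities here are hypothetical:

```python
# Simplified sketch of QC-based intensity drift correction for one metabolite
# feature. A linear trend through the QC samples stands in for the smoother
# (e.g. LOWESS) fits used in real batch-correction workflows.
import statistics

def qc_drift_correct(order, intensity, is_qc):
    qc_x = [o for o, q in zip(order, is_qc) if q]
    qc_y = [y for y, q in zip(intensity, is_qc) if q]
    # least-squares slope/intercept through the QC points
    mx, my = statistics.mean(qc_x), statistics.mean(qc_y)
    num = sum((x - mx) * (y - my) for x, y in zip(qc_x, qc_y))
    den = sum((x - mx) ** 2 for x in qc_x)
    slope = num / den
    intercept = my - slope * mx
    # divide each sample by the fitted trend, rescale to the mean QC intensity
    return [y / (intercept + slope * o) * my for y, o in zip(intensity, order)]

order     = [1, 2, 3, 4, 5, 6]
intensity = [100, 202, 110, 225, 120, 247]   # hypothetical, drifting upward
is_qc     = [True, False, True, False, True, False]
print([round(v) for v in qc_drift_correct(order, intensity, is_qc)])
```

After correction the QC samples sit flat at their mean intensity, which is exactly the property that lets the repeatedly measured biological samples serve as an independent check of the correction.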

  1. A Quadrupole Dalton-based multi-attribute method for product characterization, process development, and quality control of therapeutic proteins.

    PubMed

    Xu, Weichen; Jimenez, Rod Brian; Mowery, Rachel; Luo, Haibin; Cao, Mingyan; Agarwal, Nitin; Ramos, Irina; Wang, Xiangyang; Wang, Jihong

    2017-10-01

    During the manufacturing and storage processes, therapeutic proteins are subject to various post-translational modifications (PTMs), such as isomerization, deamidation, oxidation, disulfide bond modifications and glycosylation. Certain PTMs may affect bioactivity, stability, or pharmacokinetic and pharmacodynamic profiles and are therefore classified as potential critical quality attributes (pCQAs). Identifying, monitoring and controlling these PTMs are usually key elements of the Quality by Design (QbD) approach. Traditionally, multiple analytical methods are utilized for these purposes, which is time consuming and costly. In recent years, multi-attribute monitoring methods have been developed in the biopharmaceutical industry. However, these methods combine high-end mass spectrometry with complicated data analysis software, which could pose difficulty when implemented in a quality control (QC) environment. Here we report a multi-attribute method (MAM) using a Quadrupole Dalton (QDa) mass detector to selectively monitor and quantitate PTMs in a therapeutic monoclonal antibody. The result output from the QDa-based MAM is straightforward and automatic. Evaluation results indicate this method provides comparable results to the traditional assays. To ensure future application in the QC environment, this method was qualified according to the International Conference on Harmonisation (ICH) guidelines and applied in the characterization of drug substance and stability samples. The QDa-based MAM is shown to be an extremely useful tool for product and process characterization studies that facilitates understanding of process impact on multiple quality attributes, while being QC friendly and cost-effective.

  2. Quality Circles: An Innovative Program to Improve Military Hospitals

    DTIC Science & Technology

    1982-08-01

    quality control. However, Dr. Kaoru Ishikawa is credited with starting the first "Quality Control Circles" and registering them with the Japanese Union of...McGregor and Abraham Maslow into a unique style of management. In 1962 Dr. Ishikawa , a professor at Tokyo University, developed the QC concept based on...RECOMMENDATIONS Conclusions The QC concept has come a long way since Dr. Ishikawa gave it birth in 1962. It has left an enviable record of success along its

  3. An effective and optimal quality control approach for green energy manufacturing using design of experiments framework and evolutionary algorithm

    NASA Astrophysics Data System (ADS)

    Saavedra, Juan Alejandro

    Quality Control (QC) and Quality Assurance (QA) strategies vary significantly across industries in the manufacturing sector depending on the product being built. Such strategies range from simple statistical analysis and process controls to decision-making processes for reworking, repairing, or scrapping defective products. This study proposes an optimal QC methodology for including rework stations in the manufacturing process by identifying the number and location of these workstations. The factors considered to optimize these stations are cost, cycle time, reworkability and rework benefit. The goal is to minimize the cost and cycle time of the process while increasing reworkability and rework benefit. The specific objectives of this study are: (1) to propose a cost estimation model that includes energy consumption, and (2) to propose an optimal QC methodology to identify the quantity and location of rework workstations. The cost estimation model includes energy consumption as part of the product direct cost. The cost estimation model developed allows the user to calculate product direct cost as the quality sigma level of the process changes. This provides a benefit because a complete cost estimation calculation does not need to be performed every time the process yield changes. This cost estimation model is then used for the QC strategy optimization process. In order to propose a methodology that provides an optimal QC strategy, the possible factors that affect QC were evaluated. A screening Design of Experiments (DOE) was performed on seven initial factors and identified three significant factors; it also showed that one response variable was not required for the optimization process. A full factorial DOE was then performed to verify the significant factors obtained previously. The QC strategy optimization is performed through a Genetic Algorithm (GA), which allows the evaluation of several solutions in order to obtain feasible optimal solutions. 
The GA
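A toy sketch of the GA-based optimization described: a binary chromosome selects which candidate workstations receive a rework station, and a fitness function trades off rework benefit against added cost. All station data, weights, and GA settings below are illustrative assumptions, not the study's actual model:

```python
# Toy genetic-algorithm sketch for choosing rework-station locations.
# Chromosome: one bit per candidate workstation (1 = add a rework station).
# Station costs/benefits and GA parameters are hypothetical placeholders.
import random

random.seed(0)
N_STATIONS = 8
COST    = [5, 3, 8, 2, 6, 4, 7, 3]   # added cost per rework station (hypothetical)
BENEFIT = [9, 2, 7, 1, 8, 3, 9, 2]   # rework benefit per station (hypothetical)

def fitness(bits):
    # maximize total benefit minus total cost of the selected rework stations
    benefit = sum(b for b, s in zip(BENEFIT, bits) if s)
    cost = sum(c for c, s in zip(COST, bits) if s)
    return benefit - cost

def evolve(pop_size=30, generations=50, mutation=0.1):
    pop = [[random.randint(0, 1) for _ in range(N_STATIONS)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]          # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            p1, p2 = random.sample(parents, 2)
            cut = random.randrange(1, N_STATIONS)   # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [b ^ 1 if random.random() < mutation else b for b in child]
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print("best layout:", best, "fitness:", fitness(best))
```

With these placeholder numbers the optimum simply selects every station whose benefit exceeds its cost; a realistic fitness function would also fold in cycle time, as the study's multi-objective formulation does.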

  4. Development and prospective evaluation of an automated software system for quality control of quantitative 99mTc-MAG3 renal studies.

    PubMed

    Folks, Russell D; Garcia, Ernest V; Taylor, Andrew T

    2007-03-01

    Quantitative nuclear renography has numerous potential sources of error. We previously reported the initial development of a computer software module for comprehensively addressing the issue of quality control (QC) in the analysis of radionuclide renal images. The objective of this study was to prospectively test the QC software. The QC software works in conjunction with standard quantitative renal image analysis using a renal quantification program. The software saves a text file that summarizes QC findings as possible errors in user-entered values, calculated values that may be unreliable because of the patient's clinical condition, and problems relating to acquisition or processing. To test the QC software, a technologist not involved in software development processed 83 consecutive nontransplant clinical studies. The QC findings of the software were then tabulated. QC events were defined as technical (study descriptors that were out of range or were entered and then changed, unusually sized or positioned regions of interest, or missing frames in the dynamic image set) or clinical (calculated functional values judged to be erroneous or unreliable). Technical QC events were identified in 36 (43%) of 83 studies. Clinical QC events were identified in 37 (45%) of 83 studies. Specific QC events included starting the camera after the bolus had reached the kidney, dose infiltration, oversubtraction of background activity, and missing frames in the dynamic image set. QC software has been developed to automatically verify user input, monitor calculation of renal functional parameters, summarize QC findings, and flag potentially unreliable values for the nuclear medicine physician. Incorporation of automated QC features into commercial or local renal software can reduce errors and improve technologist performance and should improve the efficiency and accuracy of image interpretation.
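The automated input verification described can be sketched as simple range validation plus a completeness check on the dynamic image set. The field names and plausible ranges below are hypothetical illustrations, not those of the actual renal QC module:

```python
# Illustrative sketch of automated QC checks on a renography study: flag
# user-entered descriptors outside plausible ranges and note missing frames.
# Field names and ranges are hypothetical placeholders.

PLAUSIBLE = {                      # field: (lower, upper), illustrative only
    "height_cm": (50, 230),
    "weight_kg": (2, 250),
    "injected_dose_MBq": (20, 400),
}

def qc_check(entries: dict, frame_times: list, expected_frames: int) -> list:
    """Return a list of human-readable QC findings for the study."""
    findings = []
    for field, value in entries.items():
        lo, hi = PLAUSIBLE[field]
        if not lo <= value <= hi:
            findings.append(f"{field}={value} outside plausible range [{lo}, {hi}]")
    if len(frame_times) < expected_frames:
        findings.append(f"missing frames: {expected_frames - len(frame_times)}")
    return findings

report = qc_check({"height_cm": 172, "weight_kg": 700, "injected_dose_MBq": 180},
                  frame_times=list(range(58)), expected_frames=60)
print(report)
```

As in the study, the findings would be written to a summary so the physician sees potentially unreliable values flagged before interpretation.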

  5. An International Coordinated Effort to Further the Documentation & Development of Quality Assurance, Quality Control, and Best Practices for Oceanographic Observations

    NASA Astrophysics Data System (ADS)

    Bushnell, M.; Waldmann, C.; Hermes, J.; Tamburri, M.

    2017-12-01

    Many oceanographic observation groups create and maintain QA, QC, and best practices (BP) to ensure efficient and accurate data collection and quantify quality. Several entities - IOOS® QARTOD, AtlantOS, ACT, WMO/IOC JCOMM OCG - have joined forces to document existing practices, identify gaps, and support development of emerging techniques. While each group has a slightly different focus, many underlying QA/QC/BP needs can be quite common. QARTOD focuses upon real-time data QC, and has produced manuals that address QC tests for eleven ocean variables. AtlantOS is a research and innovation project working towards the integration of ocean-observing activities across all disciplines in the Atlantic Basin. ACT brings together research institutions, resource managers, and private companies to foster the development and adoption of effective and reliable sensors for coastal, freshwater, and ocean environments. JCOMM promotes broad international coordination of oceanographic and marine meteorological observations and data management and services. Leveraging existing efforts of these organizations is an efficient way to consolidate available information, develop new practices, and evaluate the use of ISO standards to judge the quality of measurements. ISO standards may offer accepted support for a framework for an ocean data quality management system, similar to the meteorological standards defined by WMO (https://www.wmo.int/pages/prog/arep/gaw/qassurance.html). We will first cooperatively develop a plan to create a QA/QC/BP manual. The resulting plan will describe the need for such a manual, the extent of the manual, the process used to engage the community in creating it, the maintenance of the resultant document, and how these things will be done. It will also investigate standards for metadata. The plan will subsequently be used to develop the QA/QC/BP manual, providing guidance which advances the standards adopted by IOOS, AtlantOS, JCOMM, and others.

  6. An introduction to statistical process control in research proteomics.

    PubMed

    Bramwell, David

    2013-12-16

    Statistical process control is a well-established and respected method which provides a general-purpose, consistent framework for monitoring and improving the quality of a process. It is routinely used in many industries where the quality of final products is critical and is often required in clinical diagnostic laboratories [1,2]. To date, the methodology has been little utilised in research proteomics. It has been shown to be capable of delivering quantitative QC procedures for qualitative clinical assays [3], making it an ideal methodology to apply to this area of biological research. To introduce statistical process control as an objective strategy for quality control and show how it could be used to benefit proteomics researchers and enhance the quality of the results they generate. We demonstrate that rules which provide basic quality control are easy to derive and implement and could have a major impact on data quality for many studies. Statistical process control is a powerful tool for investigating and improving proteomics research workflows. The process of characterising measurement systems and defining control rules forces the exploration of key questions that can lead to significant improvements in performance. This work asserts that QC is essential to proteomics discovery experiments. Every experimenter must know the current capabilities of their measurement system and have an objective means for tracking and ensuring that performance. Proteomic analysis workflows are complicated and multivariate. QC is critical for clinical chemistry measurements and huge strides have been made in ensuring the quality and validity of results in clinical biochemistry labs. This work introduces some of these QC concepts and works to bridge their use from single-analyte QC to applications in multi-analyte systems. This article is part of a Special Issue entitled: Standardization and Quality Control in Proteomics. Copyright © 2013 The Author. 
Published by Elsevier
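A minimal Shewhart-style control chart, one of the basic SPC tools the article introduces: derive control limits from a reference ("phase I") set of QC measurements, then flag later runs outside mean ± 3 SD. All values are hypothetical; in proteomics the monitored quantity might be, for example, the peak area of a spiked QC standard:

```python
# Minimal Shewhart-style control chart sketch: estimate limits from reference
# QC runs, then flag later runs that fall outside mean +/- 3 SD.
# All measurement values are hypothetical.
import statistics

def control_limits(reference):
    mean = statistics.mean(reference)
    sd = statistics.stdev(reference)
    return mean - 3 * sd, mean + 3 * sd

def out_of_control(values, limits):
    lo, hi = limits
    return [i for i, v in enumerate(values) if not lo <= v <= hi]

reference = [98.2, 101.5, 99.8, 100.4, 99.1, 100.9, 98.8, 100.3]  # phase I runs
limits = control_limits(reference)
new_runs = [100.1, 99.4, 105.9, 100.7]       # phase II runs; one has drifted
print("limits:", tuple(round(x, 1) for x in limits))
print("out-of-control indices:", out_of_control(new_runs, limits))
```

Characterising the measurement system first (the reference set) and only then judging new runs is precisely the discipline the article argues proteomics workflows should adopt.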

  7. A Benchmark Study on Error Assessment and Quality Control of CCS Reads Derived from the PacBio RS

    PubMed Central

    Jiao, Xiaoli; Zheng, Xin; Ma, Liang; Kutty, Geetha; Gogineni, Emile; Sun, Qiang; Sherman, Brad T.; Hu, Xiaojun; Jones, Kristine; Raley, Castle; Tran, Bao; Munroe, David J.; Stephens, Robert; Liang, Dun; Imamichi, Tomozumi; Kovacs, Joseph A.; Lempicki, Richard A.; Huang, Da Wei

    2013-01-01

    PacBio RS, a newly emerging third-generation DNA sequencing platform, is based on a real-time, single-molecule, nano-nitch sequencing technology that can generate very long reads (up to 20 kb), in contrast to the shorter reads produced by first- and second-generation sequencing technologies. As a new platform, it is important to assess the sequencing error rate, as well as the quality control (QC) parameters associated with PacBio sequence data. In this study, a mixture of 10 previously known, closely related DNA amplicons was sequenced using the PacBio RS sequencing platform. After aligning Circular Consensus Sequence (CCS) reads derived from the above sequencing experiment to the known reference sequences, we found that the median error rate was 2.5% without read QC, improving to 1.3% with an SVM-based multi-parameter QC method. In addition, a de novo assembly was used as a downstream application to evaluate the effects of different QC approaches. This benchmark study indicates that even though CCS reads are post error-corrected, it is still necessary to perform appropriate QC on CCS reads in order to produce successful downstream bioinformatics analytical results. PMID:24179701

  8. A Benchmark Study on Error Assessment and Quality Control of CCS Reads Derived from the PacBio RS.

    PubMed

    Jiao, Xiaoli; Zheng, Xin; Ma, Liang; Kutty, Geetha; Gogineni, Emile; Sun, Qiang; Sherman, Brad T; Hu, Xiaojun; Jones, Kristine; Raley, Castle; Tran, Bao; Munroe, David J; Stephens, Robert; Liang, Dun; Imamichi, Tomozumi; Kovacs, Joseph A; Lempicki, Richard A; Huang, Da Wei

    2013-07-31

    PacBio RS, a newly emerging third-generation DNA sequencing platform, is based on a real-time, single-molecule, nano-nitch sequencing technology that can generate very long reads (up to 20 kb), in contrast to the shorter reads produced by first- and second-generation sequencing technologies. As a new platform, it is important to assess the sequencing error rate, as well as the quality control (QC) parameters associated with PacBio sequence data. In this study, a mixture of 10 previously known, closely related DNA amplicons was sequenced using the PacBio RS sequencing platform. After aligning Circular Consensus Sequence (CCS) reads derived from the above sequencing experiment to the known reference sequences, we found that the median error rate was 2.5% without read QC, improving to 1.3% with an SVM-based multi-parameter QC method. In addition, a de novo assembly was used as a downstream application to evaluate the effects of different QC approaches. This benchmark study indicates that even though CCS reads are post error-corrected, it is still necessary to perform appropriate QC on CCS reads in order to produce successful downstream bioinformatics analytical results.
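The per-read error-rate summary used in such benchmarks can be sketched as follows: error rate = (mismatches + insertions + deletions) / aligned length, summarized as a median before and after a QC filter. A simple mean-quality threshold stands in here for the paper's SVM-based multi-parameter QC, and all read statistics are hypothetical:

```python
# Sketch of a per-read error-rate summary: rate = (mismatches + ins + del) /
# aligned length, with the median reported before and after a QC filter.
# A mean-quality cutoff stands in for the SVM-based QC; reads are hypothetical.
import statistics

def error_rate(mismatches, insertions, deletions, aligned_len):
    return (mismatches + insertions + deletions) / aligned_len

# (mismatches, insertions, deletions, aligned length, mean CCS base quality)
reads = [
    (10, 3, 2, 1000, 88), (40, 10, 8, 1100, 62), (8, 2, 2, 950, 90),
    (30, 9, 6, 900, 65),  (12, 4, 3, 1200, 85), (7, 2, 1, 800, 92),
]
rates_all = [error_rate(m, i, d, n) for m, i, d, n, _ in reads]
rates_qc  = [error_rate(m, i, d, n) for m, i, d, n, q in reads if q >= 80]
print(f"median error rate, no QC: {statistics.median(rates_all):.4f}")
print(f"median error rate, QC'd:  {statistics.median(rates_qc):.4f}")
```

As in the study, the filtered median drops because the QC criterion preferentially removes the error-rich reads.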

  9. Assays for Qualification and Quality Stratification of Clinical Biospecimens Used in Research: A Technical Report from the ISBER Biospecimen Science Working Group.

    PubMed

    Betsou, Fay; Bulla, Alexandre; Cho, Sang Yun; Clements, Judith; Chuaqui, Rodrigo; Coppola, Domenico; De Souza, Yvonne; De Wilde, Annemieke; Grizzle, William; Guadagni, Fiorella; Gunter, Elaine; Heil, Stacey; Hodgkinson, Verity; Kessler, Joseph; Kiehntopf, Michael; Kim, Hee Sung; Koppandi, Iren; Shea, Katheryn; Singh, Rajeev; Sobel, Marc; Somiari, Stella; Spyropoulos, Demetri; Stone, Mars; Tybring, Gunnel; Valyi-Nagy, Klara; Van den Eynden, Gert; Wadhwa, Lalita

    2016-10-01

    This technical report presents quality control (QC) assays that can be performed in order to qualify clinical biospecimens that have been biobanked for use in research. Some QC assays are specific to a disease area. Some QC assays are specific to a particular downstream analytical platform. When such a qualification is not possible, QC assays are presented that can be performed to stratify clinical biospecimens according to their biomolecular quality.

  10. Construction method of QC-LDPC codes based on multiplicative group of finite field in optical communication

    NASA Astrophysics Data System (ADS)

    Huang, Sheng; Ao, Xiang; Li, Yuan-yuan; Zhang, Rui

    2016-09-01

    In order to meet the needs of high-speed optical communication systems, a construction method for quasi-cyclic low-density parity-check (QC-LDPC) codes based on the multiplicative group of a finite field is proposed. The Tanner graph of the parity-check matrix of a code constructed by this method contains no cycles of length 4, which ensures a good distance property. Simulation results show that, at a bit error rate (BER) of 10⁻⁶ and in the same simulation environment, the net coding gain (NCG) of the proposed QC-LDPC(3 780, 3 540) code with a code rate of 93.7% is improved by 2.18 dB and 1.6 dB compared with the RS(255, 239) code in ITU-T G.975 and the LDPC(32 640, 30 592) code in ITU-T G.975.1, respectively. In addition, the NCG of the proposed QC-LDPC(3 780, 3 540) code is 0.2 dB and 0.4 dB higher than those of the SG-QC-LDPC(3 780, 3 540) code based on two different subgroups of a finite field and the AS-QC-LDPC(3 780, 3 540) code based on two arbitrary sets of a finite field, respectively. Thus, the proposed QC-LDPC(3 780, 3 540) code is well suited to optical communication systems.
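The girth-4-free property claimed above can be illustrated on a toy scale: expand an exponent matrix into circulant permutation blocks and verify that no two rows of the resulting parity-check matrix share two columns (the condition for a 4-cycle in the Tanner graph). The exponent choice e[j][l] = j·l mod p is a classic 4-cycle-free example, not the paper's finite-field construction.

```python
# Toy QC-LDPC sketch: expand an exponent matrix into a binary
# parity-check matrix H of circulant permutation blocks, then check
# the Tanner graph for length-4 cycles.

def expand(exponents, p):
    """Expand a J x L exponent matrix into a (J*p) x (L*p) binary H."""
    J, L = len(exponents), len(exponents[0])
    H = [[0] * (L * p) for _ in range(J * p)]
    for j in range(J):
        for l in range(L):
            s = exponents[j][l]
            for r in range(p):
                H[j * p + r][l * p + (r + s) % p] = 1
    return H

def has_4_cycle(H):
    """A 4-cycle exists iff two rows of H share >= 2 common 1-columns."""
    n = len(H)
    for a in range(n):
        for b in range(a + 1, n):
            if sum(x & y for x, y in zip(H[a], H[b])) >= 2:
                return True
    return False

# Exponents e[j][l] = j*l mod p: 4-cycle free for prime p, since
# (j1-j2)(l1-l2) is never 0 mod p for distinct block indices.
p = 7
E = [[(j * l) % p for l in range(4)] for j in range(3)]
H = expand(E, p)
print(has_4_cycle(H))  # → False
```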

  11. A novel construction scheme of QC-LDPC codes based on the RU algorithm for optical transmission systems

    NASA Astrophysics Data System (ADS)

    Yuan, Jian-guo; Liang, Meng-qi; Wang, Yong; Lin, Jin-zhao; Pang, Yu

    2016-03-01

    A novel lower-complexity construction scheme of quasi-cyclic low-density parity-check (QC-LDPC) codes for optical transmission systems is proposed, based on the structure of the parity-check matrix for the Richardson-Urbanke (RU) algorithm. With this scheme, a novel irregular QC-LDPC(4 288, 4 020) code with a high code rate of 0.937 is constructed. Simulation analyses show that the net coding gain (NCG) of the novel irregular QC-LDPC(4 288, 4 020) code is 2.08 dB, 1.25 dB and 0.29 dB higher than those of the classic RS(255, 239) code, the LDPC(32 640, 30 592) code and the irregular QC-LDPC(3 843, 3 603) code, respectively, at a bit error rate (BER) of 10⁻⁶. The irregular QC-LDPC(4 288, 4 020) code also has lower encoding/decoding complexity than the LDPC(32 640, 30 592) code and the irregular QC-LDPC(3 843, 3 603) code, making it well suited to the requirements of high-speed optical transmission systems.

  12. Visualization and Quality Control Web Tools for CERES Products

    NASA Astrophysics Data System (ADS)

    Mitrescu, C.; Doelling, D. R.

    2017-12-01

    The NASA CERES project continues to provide the scientific community a wide variety of satellite-derived data products, such as observed TOA broadband shortwave and longwave fluxes, computed TOA and surface fluxes, as well as cloud, aerosol, and other atmospheric parameters. These products encompass a wide range of temporal and spatial resolutions suited to specific applications. CERES data are used mostly by climate modeling communities, but also by a wide variety of educational institutions. To better serve our users, a web-based Ordering and Visualization Tool (OVT) was developed using Open Source Software such as Eclipse, Java, JavaScript, OpenLayers, Flot, Google Maps, and Python. In response to demand from our own scientists, we also implemented a series of specialized functions for CERES Data Quality Control (QC), such as 1- and 2-D histograms, anomalies and differences, temporal and spatial averaging, and side-by-side parameter comparison, which made the QC process far easier and faster, and, more importantly, far more portable. With the integration of ground-site observed surface fluxes, we further enable the CERES project to QC the CERES computed surface fluxes. An overview of the CERES OVT basic functions using Open Source Software, as well as future steps in expanding its capabilities, will be presented at the meeting.

  13. Building a Quality Controlled Database of Meteorological Data from NASA Kennedy Space Center and the United States Air Force's Eastern Range

    NASA Technical Reports Server (NTRS)

    Brenton, James C.; Barbre, Robert E., Jr.; Decker, Ryan K.; Orcutt, John M.

    2018-01-01

    The National Aeronautics and Space Administration's (NASA) Marshall Space Flight Center (MSFC) Natural Environments Branch (EV44) has provided atmospheric databases and analyses in support of space vehicle design and day-of-launch operations for NASA and commercial launch vehicle programs launching from the NASA Kennedy Space Center (KSC), co-located with the United States Air Force's Eastern Range (ER) at Cape Canaveral Air Force Station. The ER complex is one of the most heavily instrumented sites in the United States, with over 31 towers measuring various atmospheric parameters on a continuous basis. An inherent challenge with large data sets is ensuring that erroneous data are removed from databases and thus excluded from launch vehicle design analyses. EV44 has put forth great effort in developing quality control (QC) procedures for individual meteorological instruments; however, no standard QC procedures currently exist across all databases, resulting in QC'ed databases with inconsistencies in variables, methodologies, and periods of record. The goal of this activity is to build on EV44's previous efforts to develop a standardized set of QC procedures from which to build meteorological databases from KSC and the ER, while maintaining open communication with end users from the launch community on ways to improve, adapt, and grow the QC database. Details of the QC procedures will be described. As the launch rate increases with additional launch vehicle programs, it is becoming more important that weather databases are continually updated and checked for data quality before use in launch vehicle design and certification analyses.
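Two checks that standardized meteorological QC pipelines of this kind commonly apply are a gross range check and a rate-of-change (spike) check. The sketch below is illustrative only; the variable, limits, and fill value are assumptions, not EV44's actual thresholds.

```python
# Hedged sketch of two standard meteorological QC checks: a gross
# range check and a spike (rate-of-change) check on a tower series.

def range_check(values, lo, hi):
    """Flag samples outside physically plausible bounds."""
    return [not (lo <= v <= hi) for v in values]

def spike_check(values, max_step):
    """Flag samples that jump more than max_step from the prior sample."""
    flags = [False]
    for prev, cur in zip(values, values[1:]):
        flags.append(abs(cur - prev) > max_step)
    return flags

# Example: 10-m wind speed (m/s); 999.0 is an assumed telemetry fill value.
wind = [3.2, 3.4, 999.0, 3.5, 12.8, 3.6]
bad = [r or s for r, s in zip(range_check(wind, 0.0, 75.0),
                              spike_check(wind, 5.0))]
print(bad)  # → [False, False, True, True, True, True]
```

Note that the spike check also flags the samples adjacent to the fill value, which is why operational schemes typically run the range check first and mask its failures before the temporal check.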

  14. Follow-Up of External Quality Controls for PCR-Based Diagnosis of Whooping Cough in a Hospital Laboratory Network (Renacoq) and in Other Hospital and Private Laboratories in France.

    PubMed

    Guillot, Sophie; Guiso, Nicole

    2016-08-01

    The French National Reference Centre (NRC) for Whooping Cough carried out an external quality control (QC) analysis in 2010 for the PCR diagnosis of whooping cough. The main objective of the study was to assess the impact of this QC in the participating laboratories through a repeat analysis in 2012. Copyright © 2016, American Society for Microbiology. All Rights Reserved.

  15. Assays for Qualification and Quality Stratification of Clinical Biospecimens Used in Research: A Technical Report from the ISBER Biospecimen Science Working Group

    PubMed Central

    Bulla, Alexandre; Cho, Sang Yun; Clements, Judith; Chuaqui, Rodrigo; Coppola, Domenico; De Souza, Yvonne; De Wilde, Annemieke; Grizzle, William; Guadagni, Fiorella; Gunter, Elaine; Heil, Stacey; Hodgkinson, Verity; Kessler, Joseph; Kiehntopf, Michael; Kim, Hee Sung; Koppandi, Iren; Shea, Katheryn; Singh, Rajeev; Sobel, Marc; Somiari, Stella; Spyropoulos, Demetri; Stone, Mars; Tybring, Gunnel; Valyi-Nagy, Klara; Van den Eynden, Gert; Wadhwa, Lalita

    2016-01-01

    This technical report presents quality control (QC) assays that can be performed in order to qualify clinical biospecimens that have been biobanked for use in research. Some QC assays are specific to a disease area. Some QC assays are specific to a particular downstream analytical platform. When such a qualification is not possible, QC assays are presented that can be performed to stratify clinical biospecimens according to their biomolecular quality. PMID:27046294

  16. Implementation and application of moving average as continuous analytical quality control instrument demonstrated for 24 routine chemistry assays.

    PubMed

    Rossum, Huub H van; Kemperman, Hans

    2017-07-26

    General application of a moving average (MA) as continuous analytical quality control (QC) for routine chemistry assays has been hampered by the lack of a simple method for optimizing MAs. A new method was applied to optimize MAs for routine chemistry and was evaluated in daily practice as a continuous analytical QC instrument. MA procedures were optimized using an MA bias detection simulation procedure, with optimization graphically supported by bias detection curves. Next, all optimal MA procedures that contributed to quality assurance were run for 100 consecutive days, and MA alarms generated during working hours were investigated. Optimized MA procedures were applied to 24 chemistry assays. During this evaluation, 303,871 MA values and 76 MA alarms were generated. Of all alarms, 54 (71%) were generated during office hours; of these, 41 were further investigated and were attributed to ion-selective electrode (ISE) failure (1), calibration failure not detected by QC due to improper QC settings (1), possible bias (significant difference from the other analyzer) (10), non-human materials analyzed (2), extreme result(s) of a single patient (2), pre-analytical error (1), no cause identified (20), and no conclusion possible (4). MA was implemented in daily practice as a continuous QC instrument for 24 routine chemistry assays. In our setup, a manageable number of MA alarms was generated, and those requiring follow-up proved valuable. For the management of MA alarms, several features in the MA management software would simplify the use of MA procedures.
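The core MA mechanism described above can be sketched as a sliding mean over the last N patient results with alarm limits. The window size, limits, and sodium values below are illustrative; in the study these are optimized per assay via bias detection simulation.

```python
# Hedged sketch of a moving-average (MA) continuous QC procedure:
# alarm when the mean of the last `window` patient results drifts
# outside assay-specific control limits.
from collections import deque

def ma_alarms(results, window, lower, upper):
    buf = deque(maxlen=window)
    alarms = []
    for i, x in enumerate(results):
        buf.append(x)
        if len(buf) == window:
            ma = sum(buf) / window
            if not (lower <= ma <= upper):
                alarms.append((i, round(ma, 2)))
    return alarms

# Sodium results (mmol/L): a calibration bias appears mid-series.
sodium = [140, 139, 141, 140, 138, 146, 147, 148, 147, 146]
print(ma_alarms(sodium, window=5, lower=137.0, upper=143.0))
# → [(7, 143.8), (8, 145.2), (9, 146.8)]
```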

  17. References on EPA Quality Assurance Project Plans

    EPA Pesticide Factsheets

    Provides requirements for the conduct of quality management practices, including quality assurance (QA) and quality control (QC) activities, for all environmental data collection and environmental technology programs performed by or for this Agency.

  18. Data exploration, quality control and statistical analysis of ChIP-exo/nexus experiments

    PubMed Central

    Welch, Rene; Chung, Dongjun; Grass, Jeffrey; Landick, Robert

    2017-01-01

    ChIP-exo/nexus experiments rely on innovative modifications of the commonly used ChIP-seq protocol for high resolution mapping of transcription factor binding sites. Although many aspects of the ChIP-exo data analysis are similar to those of ChIP-seq, these high throughput experiments pose a number of unique quality control and analysis challenges. We develop a novel statistical quality control pipeline and accompanying R/Bioconductor package, ChIPexoQual, to enable exploration and analysis of ChIP-exo and related experiments. ChIPexoQual evaluates a number of key issues including strand imbalance, library complexity, and signal enrichment of data. Assessment of these features is facilitated through diagnostic plots and summary statistics computed over regions of the genome with varying levels of coverage. We evaluated our QC pipeline with both large collections of public ChIP-exo/nexus data and multiple, new ChIP-exo datasets from Escherichia coli. ChIPexoQual analysis of these datasets resulted in guidelines for using these QC metrics across a wide range of sequencing depths and provided further insights for modelling ChIP-exo data. PMID:28911122
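One of the diagnostics named above, strand imbalance, reduces to the fraction of reads on the forward strand per region; regions far from the typical value can indicate problems. This is a minimal sketch of the idea only; the data layout is an assumption, not ChIPexoQual's actual API (which is R/Bioconductor).

```python
# Hedged sketch of a per-region strand-imbalance diagnostic of the
# kind ChIPexoQual computes: forward-strand read fraction per region.

def strand_imbalance(regions):
    """regions: list of (fwd_read_count, rev_read_count) per region."""
    out = []
    for fwd, rev in regions:
        total = fwd + rev
        out.append(fwd / total if total else float("nan"))
    return out

regions = [(40, 38), (55, 5), (12, 11), (0, 30)]
print([round(p, 2) for p in strand_imbalance(regions)])
# → [0.51, 0.92, 0.52, 0.0]  (regions 2 and 4 look suspect)
```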

  19. SNP Data Quality Control in a National Beef and Dairy Cattle System and Highly Accurate SNP Based Parentage Verification and Identification

    PubMed Central

    McClure, Matthew C.; McCarthy, John; Flynn, Paul; McClure, Jennifer C.; Dair, Emma; O'Connell, D. K.; Kearney, John F.

    2018-01-01

    A major use of genetic data is parentage verification and identification, as inaccurate pedigrees negatively affect genetic gain. Since 2012 the international standard for single nucleotide polymorphism (SNP)-based verification in Bos taurus cattle has been the ISAG SNP panels. While these ISAG panels provide increased parentage accuracy over microsatellite markers (MS), they can validate the wrong parent at misconcordance rates of ≤1%, indicating that more SNP are needed where a more accurate pedigree is required. With rapidly increasing numbers of cattle being genotyped in Ireland, representing 61 B. taurus breeds from a wide range of farm types (beef/dairy, AI/pedigree/commercial, purebred/crossbred, and large to small herd sizes), the Irish Cattle Breeding Federation (ICBF) analyzed different SNP densities and determined that a minimum of ≥500 SNP are needed to consistently predict only one set of parents at a ≤1% misconcordance rate. For parentage validation and prediction, ICBF uses 800 SNP (ICBF800) selected on the basis of SNP clustering quality, ISAG200 inclusion, call rate (CR), and minor allele frequency (MAF) in the Irish cattle population. Large datasets require both sample and SNP quality control (QC). Most publications deal only with SNP QC via CR, MAF, parent-progeny conflicts, and Hardy-Weinberg deviation, not with sample QC. We report here parentage, SNP QC, and genomic sample QC pipelines that address the unique challenges of >1 million genotypes from a national herd, such as SNP genotype errors arising from mis-tagging of animals, lab errors, farm errors, and multiple other issues. We divide the pipeline into two parts: a Genotype QC and an Animal QC pipeline. The Genotype QC identifies samples with low call rate, missing or mixed genotype classes (no BB genotype or ABTG alleles present), and low genotype frequencies. The Animal QC handles situations where the genotype might not belong to the listed individual by identifying: >1 non
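The two SNP-level filters named above, call rate (CR) and minor allele frequency (MAF), can be sketched directly. The genotype coding and thresholds below are common conventions and illustrative values, not ICBF's production pipeline.

```python
# Hedged sketch of basic SNP genotype QC: drop SNPs with low call
# rate or low minor allele frequency. Genotypes are coded as 0/1/2
# copies of the B allele; None = no call.

def snp_qc(genotypes, min_cr=0.9, min_maf=0.05):
    """genotypes: dict snp_id -> list of 0/1/2/None across animals."""
    keep = []
    for snp, calls in genotypes.items():
        called = [g for g in calls if g is not None]
        cr = len(called) / len(calls)
        if cr < min_cr or not called:
            continue
        freq_b = sum(called) / (2 * len(called))
        maf = min(freq_b, 1 - freq_b)
        if maf >= min_maf:
            keep.append(snp)
    return keep

geno = {
    "snp1": [0, 1, 2, 1, 0, 1, 2, 1, 0, 1],       # CR 1.0, MAF 0.45 -> keep
    "snp2": [0, 0, None, None, 0, 0, 0, 0, 0, 0],  # CR 0.8 -> drop
    "snp3": [2, 2, 2, 2, 2, 2, 2, 2, 2, 2],        # monomorphic -> drop
}
print(snp_qc(geno))  # → ['snp1']
```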

  20. Material quality assurance risk assessment : [summary].

    DOT National Transportation Integrated Search

    2013-01-01

    With the shift from quality control (QC) of materials and placement techniques : to quality assurance (QA) and acceptance over the years, the role of the Office : of Materials Technology (OMT) has been shifting towards assurance of : material quality...

  1. Design of the data quality control system for the ALICE O2

    NASA Astrophysics Data System (ADS)

    von Haller, Barthélémy; Lesiak, Patryk; Otwinowski, Jacek; ALICE Collaboration

    2017-10-01

    ALICE (A Large Ion Collider Experiment) is the heavy-ion detector designed to study the physics of strongly interacting matter and the quark-gluon plasma at the CERN Large Hadron Collider (LHC). A major upgrade of the experiment is planned for 2019-20. In order to cope with a 100 times higher data rate and with the continuous readout of the Time Projection Chamber (TPC), it is necessary to upgrade the online and offline computing to a new common system called O2. The online Data Quality Monitoring (DQM) and the offline Quality Assurance (QA) are critical aspects of the data acquisition and reconstruction software chains. The former provides shifters with precise and complete information to quickly identify and overcome problems, while the latter aims at providing good-quality data for physics analyses. DQM and QA typically involve the gathering of data, its distributed analysis by user-defined algorithms, the merging of the resulting objects, and their visualization. This paper discusses the architecture and design of the data Quality Control (QC) system that regroups the DQM and QA. In addition, it presents the main design requirements and early results of a working prototype. Special focus is placed on the merging of monitoring objects generated by the QC tasks. Merging is a crucial and challenging step of the O2 system, not only for QC but also for calibration. Various scenarios and implementations have been tried and large-scale tests carried out; this document presents the final results of this extensive work on merging. We conclude with the plan of work for the coming years that will bring the QC system to production by 2019.
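The merging step described above, in its simplest form, combines monitoring histograms produced in parallel by summing aligned bins. This sketch mirrors the idea only; the actual O2 QC framework is C++ and merges full ROOT objects, and the dict layout here is an assumption.

```python
# Hedged sketch of monitoring-object merging: histograms with
# identical binning are combined by bin-wise summation of counts.

def merge_histograms(histos):
    """Each histogram: dict with identical 'edges' and a 'counts' list."""
    edges = histos[0]["edges"]
    assert all(h["edges"] == edges for h in histos), "incompatible binning"
    counts = [sum(h["counts"][i] for h in histos)
              for i in range(len(histos[0]["counts"]))]
    return {"edges": edges, "counts": counts}

h1 = {"edges": [0, 1, 2, 3], "counts": [5, 2, 9]}
h2 = {"edges": [0, 1, 2, 3], "counts": [1, 4, 0]}
print(merge_histograms([h1, h2])["counts"])  # → [6, 6, 9]
```

Bin-wise summation is associative, which is what makes hierarchical (tree-shaped) merging across many parallel QC tasks possible.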

  2. Operational CryoSat Product Quality Assessment

    NASA Astrophysics Data System (ADS)

    Mannan, Rubinder; Webb, Erica; Hall, Amanda; Bouzinac, Catherine

    2013-12-01

    The performance and quality of the CryoSat data products are routinely assessed by the Instrument Data quality Evaluation and Analysis Service (IDEAS). This information is then conveyed to the scientific and user community in order to allow them to utilise CryoSat data with confidence. This paper presents details of the Quality Control (QC) activities performed for CryoSat products under the IDEAS contract. Details of the different QC procedures and tools deployed by IDEAS to assess the quality of operational data are presented. The latest updates to the Instrument Processing Facility (IPF) for the Fast Delivery Marine (FDM) products and the future update to Baseline-C are discussed.

  3. Technical Note: Independent component analysis for quality assurance in functional MRI.

    PubMed

    Astrakas, Loukas G; Kallistis, Nikolaos S; Kalef-Ezra, John A

    2016-02-01

    Independent component analysis (ICA) is an established method for analyzing human functional MRI (fMRI) data. Here, an ICA-based fMRI quality control (QC) tool for use with a commercial phantom was developed and used. To assess its performance relative to preexisting alternatives, it was used seven weeks before and eight weeks after repair of a faulty gradient amplifier of a non-state-of-the-art MRI unit. More specifically, its performance was compared with the AAPM Report 100 acceptance testing and quality assurance protocol and two fMRI QC protocols proposed by Friedman et al. ["Report on a multicenter fMRI quality assurance protocol," J. Magn. Reson. Imaging 23, 827-839 (2006)] and Stocker et al. ["Automated quality assurance routines for fMRI data applied to a multicenter study," Hum. Brain Mapp. 25, 237-246 (2005)], respectively. The easily developed and applied ICA-based QC protocol provided fMRI QC indices and maps as sensitive to fMRI instabilities as the indices and maps of the other established protocols. The ICA fMRI QC indices were highly correlated with indices of the other fMRI QC protocols and in some cases theoretically related to them. Three or four independent components with slowly varying time series are detected under normal conditions. ICA applied to phantom measurements is an easy and efficient tool for fMRI QC. Additionally, it can protect against misinterpretation of artifact components as human brain activations. Evaluating fMRI QC indices in the central region of a phantom is not always the optimal choice.

  4. LOVE CANAL MONITORING PROGRAM. GCA QA/QC (QUALITY ASSURANCE/QUALITY CONTROL) SUMMARY REPORT

    EPA Science Inventory

    One of the most important responsibilities of the Love Canal prime contractor was the institution and maintenance of a quality assurance program. An important objective of the quality assurance program was to alert the subcontractors to the importance of high quality work on thei...

  5. Lubiprostone plus PEG electrolytes versus placebo plus PEG electrolytes for outpatient colonoscopy preparation: a randomized, double-blind placebo-controlled trial.

    PubMed

    Sofi, Aijaz A; Nawras, Ali T; Pai, Chetan; Samuels, Qiana; Silverman, Ann L

    2015-01-01

    Bowel preparation using large volumes of polyethylene glycol (PEG) solution is often poorly tolerated. Therefore, there are ongoing efforts to develop an alternative bowel cleansing regimen that is equally effective and better tolerated. The aim of this study was to assess the efficacy of lubiprostone (versus placebo) plus PEG as a bowel cleansing preparation for colonoscopy. Our study used a randomized, double-blind, placebo-controlled design. Patients scheduled for screening colonoscopy were randomized 1:1 to lubiprostone (group 1) or placebo (group 2) plus 1 gallon of PEG. The primary endpoints were patients' tolerability and the endoscopist's evaluation of preparation quality. The secondary endpoint was to determine any reduction in the amount of PEG consumed in the lubiprostone group compared with the placebo group. One hundred twenty-three patients completed the study and were included in the analysis. There was no difference in overall cleanliness. The volume of PEG was similar in both groups. The volume of PEG approached significance as a predictor of improved score for both groups (P = 0.054). Lubiprostone plus PEG was similar to placebo plus PEG in colon cleansing and volume of PEG consumed. The volume of PEG consumed showed a trend toward improving the quality of the colon cleansing.

  6. A real-time automated quality control of rain gauge data based on multiple sensors

    NASA Astrophysics Data System (ADS)

    Qi, Y.; Zhang, J.

    2013-12-01

    Precipitation is one of the most important meteorological and hydrological variables. Automated rain gauge networks provide direct measurements of precipitation and have been used for numerous applications, such as generating regional and national precipitation maps, calibrating remote sensing data, and validating hydrological and meteorological model predictions. Automated gauge observations are prone to a variety of error sources (instrument malfunction, transmission errors, format changes) and require careful quality control (QC). Many previous gauge QC techniques were based on neighborhood checks within the gauge network itself, whose effectiveness depends on gauge density and precipitation regime. The current study takes advantage of the multi-sensor data sources in the National Mosaic and Multi-Sensor QPE (NMQ/Q2) system and develops an automated gauge QC scheme based on the consistency of radar hourly QPEs and gauge observations. Error characteristics of radar and gauge as a function of the radar sampling geometry, precipitation regime, and the freezing-level height are considered. The new scheme was evaluated by comparing an NMQ national gauge-based precipitation product with independent manual gauge observations. Twelve heavy rainfall events from different seasons and areas of the United States were selected for the evaluation. The results show that the new NMQ product with QC'ed gauges has a physically more realistic spatial distribution than the old product, and that the new product agrees much better statistically with the independent gauges.
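The radar-gauge consistency idea can be sketched as a simple cross-check: reject gauges whose hourly accumulation disagrees with the collocated radar QPE beyond a tolerance. The combined absolute/relative tolerance rule below is a stand-in for the scheme's regime-dependent error model, and all values are illustrative.

```python
# Hedged sketch of radar-based gauge QC: compare each gauge's hourly
# accumulation with collocated radar QPE and keep only consistent gauges.

def qc_gauges(obs, abs_tol=2.0, rel_tol=0.5):
    """obs: list of (gauge_id, gauge_mm, radar_mm); returns IDs that pass."""
    passed = []
    for gid, gauge, radar in obs:
        diff = abs(gauge - radar)
        if diff <= abs_tol or diff <= rel_tol * max(gauge, radar):
            passed.append(gid)
    return passed

hourly = [("g1", 10.0, 9.1),   # consistent -> keep
          ("g2", 0.0, 8.5),    # gauge stuck/clogged -> reject
          ("g3", 25.0, 3.0)]   # transmission error -> reject
print(qc_gauges(hourly))  # → ['g1']
```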

  7. Revision 2 of the Enbridge Quality Assurance Project Plan

    EPA Pesticide Factsheets

    This Quality Assurance Project Plan (QAPP) presents Revision 2 of the organization, objectives, planned activities, and specific quality assurance/quality control (QA/QC) procedures associated with the Enbridge Marshall Pipeline Release Project.

  8. Material quality assurance risk assessment.

    DOT National Transportation Integrated Search

    2013-01-01

    Over the past two decades the role of SHA has shifted from quality control (QC) of materials and : placement techniques to quality assurance (QA) and acceptance. The role of the Office of Materials : Technology (OMT) has been shifting towards assuran...

  9. Efficacy and safety of albendazole plus ivermectin, albendazole plus mebendazole, albendazole plus oxantel pamoate, and mebendazole alone against Trichuris trichiura and concomitant soil-transmitted helminth infections: a four-arm, randomised controlled trial.

    PubMed

    Speich, Benjamin; Ali, Said M; Ame, Shaali M; Bogoch, Isaac I; Alles, Rainer; Huwyler, Jörg; Albonico, Marco; Hattendorf, Jan; Utzinger, Jürg; Keiser, Jennifer

    2015-03-01

    Existing anthelmintic drugs (eg, albendazole and mebendazole) have low efficacy against the intestinal nematode species Trichuris trichiura, and the drug pipeline is exhausted. We aimed to investigate the strategy of combination chemotherapy with existing drugs to establish whether their efficacy could be enhanced and broadened. In this randomised controlled trial, we compared three drug combinations and one standard drug alone in children aged 6-14 years, infected with T trichiura and concomitant intestinal nematodes, in two schools on Pemba Island, Tanzania. We assigned children, via a randomisation list with block sizes of either four or eight, to orally receive albendazole (400 mg) plus ivermectin (200 μg/kg); albendazole (400 mg) plus mebendazole (500 mg); albendazole (400 mg) plus oxantel pamoate (20 mg/kg); or mebendazole (500 mg) alone. The primary endpoints were the proportion of children cured of T trichiura infection and the reduction of T trichiura eggs in stool based on geometric means, both analysed by available case. This study is registered with ISRCTN, number ISRCTN80245406. We randomly assigned 440 eligible children infected with T trichiura between Sept 2 and Oct 18, 2013, to one of the four treatment groups (110 children per group). Data for 431 children were included in the analysis for the primary endpoints. Albendazole plus oxantel pamoate (74 of 108 children cured [68·5%, 95% CI 59·6-77·4]; egg reduction 99·2%, 98·7-99·6) and albendazole plus ivermectin (30 of 109 cured [27·5%, 19·0-36·0]; egg reduction 94·5%, 91·7-96·3) were significantly more effective against T trichiura than mebendazole alone (nine of 107 cured [8·4%, 3·1-13·8]; egg reduction 58·5%, 45·2-70·9). Albendazole plus mebendazole had similarly low efficacy (nine of 107 cured [8·4%, 3·1-13·8]; egg reduction 51·6%, 35·0-65·3) to mebendazole alone. About a fifth of the children reported adverse events, which were mainly mild. Abdominal cramps and headache were

  10. Implementation of quality assurance in diagnostic radiology in Bosnia and Herzegovina (Republic of Srpska).

    PubMed

    Bosnjak, J; Ciraj-Bjelac, O; Strbac, B

    2008-01-01

    Application of a quality control (QC) programme is very important when optimisation of image quality and reduction of patient exposure are desired. QC surveys of diagnostic imaging equipment in the Republic of Srpska (an entity of Bosnia and Herzegovina) have been performed systematically since 2001. The presented results mostly relate to QC test results for X-ray tubes and generators of diagnostic radiology units in 92 radiology departments. In addition, the results include workplace monitoring and the use of personal protective devices for staff and patients. The results show improvements in the implementation of the QC programme over the period 2001-2005. More attention is also being given to appropriate maintenance of imaging equipment, which was one of the main problems in the past. Implementation of a QC programme is a continuous and complex process. To achieve good performance of imaging equipment, additional tests are to be introduced, along with image quality assessment and patient dosimetry. Training is very important in order to achieve these goals.

  11. Data Validation & Laboratory Quality Assurance for Region 9

    EPA Pesticide Factsheets

    In all hazardous site investigations it is essential to know the quality of the data used for decision-making purposes. Validation of data requires that appropriate quality assurance and quality control (QA/QC) procedures be followed.

  12. SU-E-T-216: TPS QC Supporting Program by a Third-Party Evaluation Agency in Japan.

    PubMed

    Fukata, K; Minemura, T; Kurokawa, C; Miyagishi, T; Itami, J

    2012-06-01

    To equalize the quality of radiation therapy in Japan, the quality control of radiation treatment planning systems is supported. The Center for Cancer Control and Information Services at the National Cancer Center supports the QA-QC of the cancer core hospitals in Japan as a third-party evaluation agency. Recently, a program for assessing the quality of treatment planning systems (TPSs) began as part of our QA-QC supporting activities. In this program, a questionnaire about TPSs was sent to 45 prefectural cancer core hospitals in Japan. The objective of the questionnaire is to assess the proper commissioning, implementation, and application of TPSs. The contents of the questionnaire are as follows: 1) calculate MUs that deliver 1000 cGy to the point at SSD = 100 cm and 10 cm depth, with field sizes ranging from 5 × 5 to 30 × 30 cm², and obtain doses at several depths for the calculated MUs; 2) calculate MUs that deliver 1000 cGy to the point at SSD = 100 cm and 10 cm depth for wedge fields with angles from 15 to 60 degrees, and obtain doses at several depths with the MUs; 3) calculate the MU that delivers 1000 cGy to the point at STD = 100 cm and 10 cm depth with a 10 × 10 cm² field size, and obtain doses at several depths with the MU. In this program, 179 beam data sets from 44 facilities were collected. Data were compared in terms of dose per MU, output factor, wedge factor and TMR, and 90% of the data agreed within 2%. The quality of the treatment planning systems was investigated through the questionnaire, including essential beam data; of the 179 beam data sets compared, 90% showed good agreement. © 2012 American Association of Physicists in Medicine.
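The kind of MU check the questionnaire exercises follows a standard hand calculation: MU equals prescribed dose divided by the product of the reference calibration and the correction factors for field size, depth, and wedge. The factor values below are illustrative (a typical 6 MV PDD at 10 cm depth for a 10 × 10 field is about 67%), not data from the study.

```python
# Hedged sketch of an SSD-setup monitor-unit calculation:
# MU = dose / (cGy_per_MU * output_factor * PDD/100 * wedge_factor).

def monitor_units(dose_cgy, output_factor, pdd, wedge_factor=1.0,
                  cgy_per_mu=1.0):
    """MUs needed to deliver dose_cgy at the depth where PDD applies."""
    return dose_cgy / (cgy_per_mu * output_factor * pdd / 100.0 * wedge_factor)

# 1000 cGy at 10 cm depth, 10 x 10 field (output factor 1.000),
# assumed 6 MV PDD(10) of 67%, open field:
mu = monitor_units(1000.0, output_factor=1.0, pdd=67.0)
print(round(mu, 1))  # → 1492.5
```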

  13. [Does implementation of benchmarking in quality circles improve the quality of care of patients with asthma and reduce drug interaction?].

    PubMed

    Kaufmann-Kolle, Petra; Szecsenyi, Joachim; Broge, Björn; Haefeli, Walter Emil; Schneider, Antonius

    2011-01-01

    The purpose of this cluster-randomised controlled trial was to evaluate the efficacy of quality circles (QCs) working either with general data-based feedback or with an open benchmark within the field of asthma care and drug-drug interactions. Twelve QCs, involving 96 general practitioners from 85 practices, were randomised. Six QCs worked with traditional anonymous feedback and six with an open benchmark. Two QC meetings supported with feedback reports were held covering the topics "drug-drug interactions" and "asthma"; in both cases discussions were guided by a trained moderator. Outcome measures included health-related quality of life and patient satisfaction with treatment, asthma severity and number of potentially inappropriate drug combinations as well as the general practitioners' satisfaction in relation to the performance of the QC. A significant improvement in the treatment of asthma was observed in both trial arms. However, there was only a slight improvement regarding inappropriate drug combinations. There were no relevant differences between the group with open benchmark (B-QC) and traditional quality circles (T-QC). The physicians' satisfaction with the QC performance was significantly higher in the T-QCs. General practitioners seem to take a critical perspective about open benchmarking in quality circles. Caution should be used when implementing benchmarking in a quality circle as it did not improve healthcare when compared to the traditional procedure with anonymised comparisons. Copyright © 2011. Published by Elsevier GmbH.

  14. A Clustered Randomized Controlled Trial of the Positive Prevention PLUS Adolescent Pregnancy Prevention Program.

    PubMed

    LaChausse, Robert G

    2016-09-01

    To determine the impact of Positive Prevention PLUS, a school-based adolescent pregnancy prevention program on delaying sexual intercourse, birth control use, and pregnancy. I randomly assigned a diverse sample of ninth grade students in 21 suburban public high schools in California into treatment (n = 2483) and control (n = 1784) groups that participated in a clustered randomized controlled trial. Between October 2013 and May 2014, participants completed baseline and 6-month follow-up surveys regarding sexual behavior and pregnancy. Participants in the treatment group were offered Positive Prevention PLUS, an 11-lesson adolescent pregnancy prevention program. The program had statistically significant impacts on delaying sexual intercourse and increasing the use of birth control. However, I detected no program effect on pregnancy rates at 6-month follow-up. The Positive Prevention PLUS program demonstrated positive impacts on adolescent sexual behavior. This suggests that programs that focus on having students practice risk reduction skills may delay sexual activity and increase birth control use.

  15. Data exploration, quality control and statistical analysis of ChIP-exo/nexus experiments.

    PubMed

    Welch, Rene; Chung, Dongjun; Grass, Jeffrey; Landick, Robert; Keles, Sündüz

    2017-09-06

    ChIP-exo/nexus experiments rely on innovative modifications of the commonly used ChIP-seq protocol for high resolution mapping of transcription factor binding sites. Although many aspects of ChIP-exo data analysis are similar to those of ChIP-seq, these high throughput experiments pose a number of unique quality control and analysis challenges. We developed a novel statistical quality control pipeline and accompanying R/Bioconductor package, ChIPexoQual, to enable exploration and analysis of ChIP-exo and related experiments. ChIPexoQual evaluates a number of key issues including strand imbalance, library complexity, and signal enrichment of data. Assessment of these features is facilitated through diagnostic plots and summary statistics computed over regions of the genome with varying levels of coverage. We evaluated our QC pipeline with both large collections of public ChIP-exo/nexus data and multiple new ChIP-exo datasets from Escherichia coli. ChIPexoQual analysis of these datasets resulted in guidelines for using these QC metrics across a wide range of sequencing depths and provided further insights for modelling ChIP-exo data. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
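
The kind of strand-imbalance metric the abstract describes can be illustrated with a minimal sketch. This is not ChIPexoQual's actual implementation (which is in R); the region counts and the 0.3 flagging threshold below are hypothetical.

```python
# Hypothetical sketch of a per-region strand-imbalance QC metric:
# the fraction of reads mapped to the forward strand in each region.

def strand_imbalance(fwd_reads, rev_reads):
    """Fraction of reads on the forward strand for one genomic region."""
    total = fwd_reads + rev_reads
    return fwd_reads / total if total else float("nan")

# Regions as (forward_count, reverse_count); balanced regions sit near 0.5.
regions = [(120, 110), (300, 15), (45, 50)]
scores = [strand_imbalance(f, r) for f, r in regions]
flagged = [i for i, s in enumerate(scores) if abs(s - 0.5) > 0.3]
print(flagged)  # region 1 (300 vs 15) is heavily forward-biased
```

In a real pipeline such scores would be computed over many coverage-stratified regions and summarised as diagnostic plots rather than a single threshold.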

  16. Testosterone Plus Low-Intensity Physical Training in Late Life Improves Functional Performance, Skeletal Muscle Mitochondrial Biogenesis, and Mitochondrial Quality Control in Male Mice

    PubMed Central

    Guo, Wen; Wong, Siu; Li, Michelle; Liang, Wentao; Liesa, Marc; Serra, Carlo; Jasuja, Ravi; Bartke, Andrzej; Kirkland, James L.; Shirihai, Orian; Bhasin, Shalender

    2012-01-01

    Testosterone supplementation increases muscle mass in older men but has not been shown to consistently improve physical function and activity. It has been hypothesized that physical exercise is required to induce the adaptations necessary for translation of testosterone-induced muscle mass gain into functional improvements. However, the effects of testosterone plus low intensity physical exercise training (T/PT) on functional performance and bioenergetics are unknown. In this pilot study, we tested the hypothesis that combined administration of T/PT would improve functional performance and bioenergetics in male mice late in life more than low-intensity physical training alone. 28-month old male mice were randomized to receive T/PT or vehicle plus physical training (V/PT) for 2 months. Compare to V/PT control, administration of T/PT was associated with improvements in muscle mass, grip strength, spontaneous physical movements, and respiratory activity. These changes were correlated with increased mitochondrial DNA copy number and expression of markers for mitochondrial biogenesis. Mice receiving T/PT also displayed increased expression of key elements for mitochondrial quality control, including markers for mitochondrial fission-and-fusion and mitophagy. Concurrently, mice receiving T/PT also displayed increased expression of markers for reduced tissue oxidative damage and improved muscle quality. Conclusion: Testosterone administered with low-intensity physical training improves grip strength, spontaneous movements, and respiratory activity. These functional improvements were associated with increased muscle mitochondrial biogenesis and improved mitochondrial quality control. PMID:23240002

  17. Analysis of negative historical control group data from the in vitro micronucleus assay using TK6 cells.

    PubMed

    Lovell, David P; Fellows, Mick; Marchetti, Francesco; Christiansen, Joan; Elhajouji, Azeddine; Hashimoto, Kiyohiro; Kasamoto, Sawako; Li, Yan; Masayasu, Ozaki; Moore, Martha M; Schuler, Maik; Smith, Robert; Stankowski, Leon F; Tanaka, Jin; Tanir, Jennifer Y; Thybaud, Veronique; Van Goethem, Freddy; Whitwell, James

    2018-01-01

    The recent revisions of the Organisation for Economic Co-operation and Development (OECD) genetic toxicology test guidelines emphasize the importance of historical negative controls both for data quality and interpretation. The goal of a HESI Genetic Toxicology Technical Committee (GTTC) workgroup was to collect data from participating laboratories and to conduct a statistical analysis to understand and publish the range of values that are normally seen in experienced laboratories using TK6 cells to conduct the in vitro micronucleus assay. Data from negative control samples from in vitro micronucleus assays using TK6 cells from 13 laboratories were collected using a standard collection form. Although in some cases statistically significant differences can be seen within laboratories for different test conditions, they were very small. The mean incidence of micronucleated cells/1000 cells ranged from 3.2/1000 to 13.8/1000. These almost four-fold differences in micronucleus levels cannot be explained by differences in scoring method, presence or absence of exogenous metabolic activation (S9), length of treatment, presence or absence of cytochalasin B or different solvents used as vehicles. The range of means from the four laboratories using flow cytometry methods (3.7-fold: 3.5-12.9 micronucleated cells/1000 cells) was similar to that from the nine laboratories using other scoring methods (4.3-fold: 3.2-13.8 micronucleated cells/1000 cells). No laboratory could be identified as an outlier or as showing unacceptably high variability. Quality Control (QC) methods applied to analyse the intra-laboratory variability showed that there was evidence of inter-experimental variability greater than would be expected by chance (i.e. over-dispersion). However, in general, this was low. This study demonstrates the value of QC methods in helping to analyse the reproducibility of results, building up a 'normal' range of values, and as an aid to identify variability within a
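
The over-dispersion check mentioned above can be sketched as follows. This is an illustrative dispersion statistic for replicate negative-control counts, not the workgroup's actual analysis; the counts are invented.

```python
# Illustrative over-dispersion check: treat micronucleated cells per 1000
# scored as binomial counts and compare observed variation to the binomial
# expectation. Counts here are hypothetical replicate negative controls.
counts = [4, 7, 3, 9, 5, 12, 6, 4]   # micronucleated cells per 1000 scored
n = 1000                             # cells scored per experiment

p_hat = sum(counts) / (n * len(counts))
expected = n * p_hat
var_binomial = n * p_hat * (1 - p_hat)

# Dispersion statistic: approximately chi-square with len(counts)-1 degrees
# of freedom under pure binomial sampling; values well above the degrees of
# freedom suggest extra inter-experimental variability (over-dispersion).
dispersion = sum((c - expected) ** 2 for c in counts) / var_binomial
print(round(dispersion, 2), "df =", len(counts) - 1)
```

With these invented counts the statistic exceeds its degrees of freedom, which is the pattern the study reports as mild over-dispersion.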

  18. Stability of Tetrahydrocannabinol and Cannabidiol in Prepared Quality Control Medible Brownies.

    PubMed

    Wolf, Carl E; Poklis, Justin L; Poklis, Alphonse

    2017-03-01

    The legalization of marijuana in the USA for both medicinal and recreational use has increased in the past few years. Currently, 24 states have legalized marijuana for medicinal use. The US Drug Enforcement Administration has classified marijuana as a Schedule I substance. The US Food and Drug Administration does not regulate the formulations or packaging of marijuana currently marketed in states that have legalized marijuana. Marijuana edibles, or "medibles", are typically packaged candies and baked goods consumed for medicinal as well as recreational marijuana use. They contain the major psychoactive drug in marijuana, delta-9-tetrahydrocannabinol (THC), and/or cannabidiol (CBD), which has reputed medicinal properties. Presented is a method for the preparation and application of THC- and CBD-containing brownies used as quality control (QC) material for the analysis of marijuana or cannabinoid baked medibles. The performance parameters of the assay, including possible matrix effects and cannabinoid stability in the brownie QC over time, are presented. It was determined that the process used to prepare and bake the brownie control material did not degrade the THC or CBD. The brownie matrix was found not to interfere with the analysis of THC or CBD. Ten commercially available brownie matrices were evaluated for potential interferences; none of them were found to interfere with the analysis of THC or CBD. The laboratory-baked medible QC material was found to be stable at room temperature for at least 3 months. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  19. Managing the Quality of Environmental Data in EPA Region 9

    EPA Pesticide Factsheets

    EPA Pacific Southwest, Region 9's Quality Assurance (QA) section's primary mission is to effectively oversee and carry out the Quality System and Quality Management Plan, as well as project-level quality assurance and quality control (QA/QC) activities.

  20. Evaluation of capillary zone electrophoresis for the quality control of complex biologic samples: Application to snake venoms.

    PubMed

    Kpaibe, André P S; Ben-Ameur, Randa; Coussot, Gaëlle; Ladner, Yoann; Montels, Jérôme; Ake, Michèle; Perrin, Catherine

    2017-08-01

    Snake venoms constitute a very promising resource for the development of new medicines. They are mainly composed of very complex peptide and protein mixtures, whose composition may vary significantly from batch to batch. This latter consideration is a challenge for routine quality control (QC) in the pharmaceutical industry. In this paper, we report the use of capillary zone electrophoresis (CZE) for the development of an analytical fingerprint methodology to assess the quality of snake venoms. The analytical fingerprint concept is widely used for the QC of herbal drugs but has rarely been applied to venom QC so far. CZE was chosen for its intrinsic efficiency in the separation of protein and peptide mixtures. The analytical fingerprint methodology was first developed and evaluated for a particular snake venom, Lachesis muta. Optimal analysis conditions required the use of a PDADMAC capillary coating to avoid protein and peptide adsorption. The same analytical conditions were then applied to other snake venom species. Different electrophoretic profiles were obtained for each venom. Excellent repeatability and intermediate precision were observed for each batch. Analysis of different batches of the same species revealed inherent qualitative and quantitative composition variations of the venoms between individuals. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. 40 CFR 98.44 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Monitoring and QA/QC requirements. 98.44 Section 98.44 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Electricity Generation § 98.44 Monitoring and QA/QC...

  2. 40 CFR 98.44 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 21 2011-07-01 2011-07-01 false Monitoring and QA/QC requirements. 98.44 Section 98.44 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Electricity Generation § 98.44 Monitoring and QA/QC...

  3. 40 CFR 98.44 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 22 2013-07-01 2013-07-01 false Monitoring and QA/QC requirements. 98.44 Section 98.44 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Electricity Generation § 98.44 Monitoring and QA/QC...

  4. 40 CFR 98.44 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 21 2014-07-01 2014-07-01 false Monitoring and QA/QC requirements. 98.44 Section 98.44 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Electricity Generation § 98.44 Monitoring and QA/QC...

  5. 40 CFR 98.44 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 22 2012-07-01 2012-07-01 false Monitoring and QA/QC requirements. 98.44 Section 98.44 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Electricity Generation § 98.44 Monitoring and QA/QC...

  6. US quality control in Italy: present and future

    NASA Astrophysics Data System (ADS)

    Balbis, S.; Musacchio, C.; Guiot, C.; Spagnolo, R.

    2011-02-01

    US diagnostic equipment is widely used in Italy but, in spite of recommendations (e.g. ISPESL-Ministry of Health (1999) and SIRM (Società Italiana di Radiologia Medica, 2004)), US quality controls are restricted to only a few public health facilities, and a national (or even regional) quality assurance program for testing the performance of US equipment is still missing. A joint research centre among the three Piedmontese universities and INRIM, partially funded by Regione Piemonte, was established in 2009 as the Reference Centre for Medical Ultrasounds (CRUM). In addition to research, development and training tasks, the Centre aims at the local diffusion of quality assurance for clinical US equipment. According to data from the Ministry of Health (2006), around 7% of Italian US diagnostic equipment (946 of 13,526 units) is located in Piedmont: mostly (75.6%) in public hospitals, 9.3% in accredited hospitals, 4.3% in public and 10.8% in private territorial structures. The goal is the provision of a regional database that progressively includes data related to acceptance tests, status and QC tests, and maintenance, in order to drive equipment turnover and carefully monitor overall equipment efficiency. Moreover, facilities are available at CRUM for monitoring both beam geometry and acoustic power and for performing quantitative assessment of the delivered energy intensity.

  7. Results from 15 years of quality surveillance for a National Indigenous Point-of-Care Testing Program for diabetes.

    PubMed

    Shephard, Mark; Shephard, Anne; McAteer, Bridgit; Regnier, Tamika; Barancek, Kristina

    2017-12-01

    Diabetes is a major health problem for Australia's Aboriginal and Torres Strait Islander peoples. Point-of-care testing for haemoglobin A1c (HbA1c) has been the cornerstone of a long-standing program (QAAMS) to manage glycaemic control in Indigenous people with diabetes and, recently, to diagnose diabetes. The QAAMS quality management framework includes monthly testing of quality control (QC) and external quality assurance (EQA) samples. Key performance indicators of quality include imprecision (coefficient of variation [CV%]) and percentage of acceptable results. This paper reports on the past 15 years of quality testing in QAAMS and examines the performance of HbA1c POC testing at the 6.5% cut-off recommended for diagnosis. The total number of HbA1c EQA results submitted from 2002 to 2016 was 29,093. The median imprecision for EQA testing by QAAMS device operators averaged 2.81% (SD 0.50; range 2.2 to 3.9%) from 2002 to 2016 and 2.44% (SD 0.22; range 2.2 to 2.9%) from 2009 to 2016. No significant difference was observed between the median imprecision achieved in QAAMS and by Australasian laboratories from 2002 to 2016 (p=0.05; two-tailed paired t-test) or from 2009 to 2016 (p=0.17; two-tailed paired t-test). For QC testing from 2009 to 2016, imprecision averaged 2.5% and 3.0% for the two levels of QC tested. Percentage of acceptable results averaged 90% for EQA testing from 2002 to 2016 and 96% for QC testing from 2009 to 2016. The DCA Vantage was able to measure a patient sample and an EQA sample with an HbA1c value close to 6.5% both accurately and precisely. HbA1c POC testing in QAAMS has remained analytically sound, matched the quality achieved by Australasian laboratories and met profession-derived analytical goals for 15 years. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
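
The two key performance indicators used above, imprecision as CV% and the percentage of acceptable results, are straightforward to compute. The replicate HbA1c values and the ±0.5% acceptance tolerance in this sketch are hypothetical, chosen only to illustrate the arithmetic.

```python
# Minimal sketch (hypothetical values) of the two QAAMS-style KPIs:
# imprecision as coefficient of variation (CV%) and percentage of results
# within an allowable deviation from the target value.
from statistics import mean, stdev

def cv_percent(results):
    """Sample standard deviation expressed as a percentage of the mean."""
    return 100 * stdev(results) / mean(results)

def pct_acceptable(results, target, tolerance):
    """Percentage of results within +/- tolerance of the target."""
    ok = sum(1 for r in results if abs(r - target) <= tolerance)
    return 100 * ok / len(results)

hba1c = [6.4, 6.5, 6.6, 6.5, 6.3, 6.6]   # % HbA1c, replicate EQA results
print(round(cv_percent(hba1c), 2))        # imprecision across replicates
print(pct_acceptable(hba1c, target=6.5, tolerance=0.5))
```

A CV% near 2-3%, as reported for QAAMS operators, indicates that replicate measurements scatter only a few percent around their mean.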

  8. A novel QC-LDPC code based on the finite field multiplicative group for optical communications

    NASA Astrophysics Data System (ADS)

    Yuan, Jian-guo; Xu, Liang; Tong, Qing-zhen

    2013-09-01

    A novel construction method for quasi-cyclic low-density parity-check (QC-LDPC) codes is proposed based on the finite field multiplicative group, offering easier construction, more flexible code-length and code-rate adjustment, and lower encoding/decoding complexity. Moreover, a regular QC-LDPC(5334,4962) code is constructed. The simulation results show that the constructed QC-LDPC(5334,4962) code achieves better error-correction performance over an additive white Gaussian noise (AWGN) channel with iterative sum-product algorithm (SPA) decoding. At a bit error rate (BER) of 10^-6, the net coding gain (NCG) of the constructed QC-LDPC(5334,4962) code is 1.8 dB, 0.9 dB and 0.2 dB higher than that of the classic RS(255,239) code in ITU-T G.975, the LDPC(32640,30592) code in ITU-T G.975.1, and the SCG-LDPC(3969,3720) code constructed by the random method, respectively. It is therefore more suitable for optical communication systems.
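
The quasi-cyclic structure referred to above means the parity-check matrix H is a grid of circulant permutation submatrices, with cyclic-shift values derived from the multiplicative group of a finite field. The sketch below is illustrative only: the small parameters and the shift rule g^(i*j) mod p are assumptions for demonstration, not the paper's exact construction.

```python
# Illustrative QC-LDPC parity-check matrix: a block_rows x block_cols grid of
# p x p circulant permutation matrices, shifts taken from powers of a
# generator g of the multiplicative group of GF(p).

def circulant_perm(size, shift):
    """size x size permutation matrix: row r has a 1 in column (r+shift) % size."""
    return [[1 if c == (r + shift) % size else 0 for c in range(size)]
            for r in range(size)]

def qc_ldpc_H(block_rows, block_cols, p, g):
    """Assemble H from circulants with shift g^(i*j) mod p for block (i, j)."""
    H = []
    for i in range(block_rows):
        row_blocks = [circulant_perm(p, pow(g, i * j, p)) for j in range(block_cols)]
        for r in range(p):
            H.append([bit for blk in row_blocks for bit in blk[r]])
    return H

H = qc_ldpc_H(3, 6, 7, 3)   # 3 generates the multiplicative group of GF(7)
print(len(H), len(H[0]))    # 21 x 42 binary matrix
col_weights = {sum(row[c] for row in H) for c in range(len(H[0]))}
print(col_weights)          # every column has weight 3 (one 1 per block row)
```

Because each submatrix is a circulant, the whole code is described by the small grid of shift values, which is what makes encoding/decoding hardware simpler than for randomly constructed LDPC codes.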

  9. 40 CFR 98.64 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Monitoring and QA/QC requirements. 98.64 Section 98.64 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Aluminum Production § 98.64 Monitoring and QA/QC requirements...

  10. 40 CFR 98.334 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 22 2012-07-01 2012-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Zinc Production § 98.334 Monitoring and QA/QC requirements. If..., belt weigh feeders, weighed purchased quantities in shipments or containers, combination of bulk...

  11. 40 CFR 98.334 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 21 2011-07-01 2011-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Zinc Production § 98.334 Monitoring and QA/QC requirements. If..., belt weigh feeders, weighed purchased quantities in shipments or containers, combination of bulk...

  12. 40 CFR 98.334 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 21 2014-07-01 2014-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Zinc Production § 98.334 Monitoring and QA/QC requirements. If..., belt weigh feeders, weighed purchased quantities in shipments or containers, combination of bulk...

  13. 40 CFR 98.334 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 22 2013-07-01 2013-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Zinc Production § 98.334 Monitoring and QA/QC requirements. If..., belt weigh feeders, weighed purchased quantities in shipments or containers, combination of bulk...

  14. 40 CFR 98.64 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 21 2011-07-01 2011-07-01 false Monitoring and QA/QC requirements. 98.64 Section 98.64 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Aluminum Production § 98.64 Monitoring and QA/QC requirements...

  15. 40 CFR 98.84 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 21 2011-07-01 2011-07-01 false Monitoring and QA/QC requirements. 98.84 Section 98.84 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Cement Production § 98.84 Monitoring and QA/QC requirements...

  16. Visualization and Quality Control Web Tools for CERES Products

    NASA Astrophysics Data System (ADS)

    Mitrescu, C.; Doelling, D. R.; Rutan, D. A.

    2016-12-01

    The CERES project continues to provide the scientific community a wide variety of satellite-derived data products such as observed TOA broadband shortwave and longwave fluxes, computed TOA and surface fluxes, as well as cloud, aerosol, and other atmospheric parameters. These encompass a wide range of temporal and spatial resolutions, suited to specific applications. Now in its 16th year, CERES products are mostly used by climate modeling communities that focus on global mean energetics, meridional heat transport, and climate trend studies. In order to serve all our users, we developed a web-based Ordering and Visualization Tool (OVT). Using open source software such as Eclipse, Java, JavaScript, OpenLayers, Flot, Google Maps, Python, and others, the OVT team developed a series of specialized functions to be used in the process of CERES data quality control (QC). These include 1- and 2-D histograms, anomaly computation, deseasonalization, temporal and spatial averaging, side-by-side parameter comparison, and others that make the process of QC far easier and faster, and more importantly far more portable. We are now in the process of integrating ground-site observed surface fluxes to further help the CERES project QC its computed surface fluxes. These features will give users the opportunity to perform their own comparisons of the CERES computed surface fluxes and observed ground-site fluxes. An overview of the basic functions of the CERES OVT, as well as future steps in expanding its capabilities, will be presented at the meeting.

  17. 40 CFR 98.94 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 21 2011-07-01 2011-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Electronics Manufacturing § 98.94 Monitoring and QA/QC requirements. (a) For calendar year 2011 monitoring, you may follow the provisions in paragraphs (a)(1) through...

  18. 40 CFR 98.94 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 22 2013-07-01 2013-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Electronics Manufacturing § 98.94 Monitoring and QA/QC requirements. (a) For calendar year 2011 monitoring, you may follow the provisions in paragraphs (a)(1) through...

  19. 40 CFR 98.94 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 22 2012-07-01 2012-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Electronics Manufacturing § 98.94 Monitoring and QA/QC requirements. (a) For calendar year 2011 monitoring, you may follow the provisions in paragraphs (a)(1) through...

  20. 40 CFR 98.94 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 21 2014-07-01 2014-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Electronics Manufacturing § 98.94 Monitoring and QA/QC...-specific heel factors for each container type for each gas used, according to the procedures in paragraphs...

  1. A Multilaboratory, Multicountry Study To Determine Bedaquiline MIC Quality Control Ranges for Phenotypic Drug Susceptibility Testing

    PubMed Central

    Cirillo, Daniela M.; Hoffner, Sven; Ismail, Nazir A.; Kaur, Devinder; Lounis, Nacer; Metchock, Beverly; Pfyffer, Gaby E.; Venter, Amour

    2016-01-01

    The aim of this study was to establish standardized drug susceptibility testing (DST) methodologies and reference MIC quality control (QC) ranges for bedaquiline, a diarylquinoline antimycobacterial, used in the treatment of adults with multidrug-resistant tuberculosis. Two tier-2 QC reproducibility studies of bedaquiline DST were conducted in eight laboratories using Clinical and Laboratory Standards Institute (CLSI) guidelines. Agar dilution and broth microdilution methods were evaluated. Mycobacterium tuberculosis H37Rv was used as the QC reference strain. Bedaquiline MIC frequency, mode, and geometric mean were calculated. When resulting data occurred outside predefined CLSI criteria, the entire laboratory data set was excluded. For the agar dilution MIC, a 4-dilution QC range (0.015 to 0.12 μg/ml) centered around the geometric mean included 95.8% (7H10 agar dilution; 204/213 observations with one data set excluded) or 95.9% (7H11 agar dilution; 232/242) of bedaquiline MICs. For the 7H9 broth microdilution MIC, a 3-dilution QC range (0.015 to 0.06 μg/ml) centered around the mode included 98.1% (207/211, with one data set excluded) of bedaquiline MICs. Microbiological equivalence was demonstrated for bedaquiline MICs determined using 7H10 agar and 7H11 agar but not for bedaquiline MICs determined using 7H9 broth and 7H10 agar or 7H9 broth and 7H11 agar. Bedaquiline DST methodologies and MIC QC ranges against the H37Rv M. tuberculosis reference strain have been established: 0.015 to 0.12 μg/ml for the 7H10 and 7H11 agar dilution MICs and 0.015 to 0.06 μg/ml for the 7H9 broth microdilution MIC. These methodologies and QC ranges will be submitted to CLSI and EUCAST to inform future research and provide guidance for routine clinical bedaquiline DST in laboratories worldwide. PMID:27654337
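
Centering a QC range on the geometric mean of two-fold dilution MIC data, as described above, can be sketched in a few lines. The MIC observations below are invented, and the "snap to nearest dilution, take one step either side" rule is a simplification of the CLSI procedure, shown only to make the arithmetic concrete.

```python
# Hedged sketch: geometric mean of two-fold dilution MIC data, snapped to the
# nearest dilution step, with a 3-dilution window as a candidate QC range.
import math

mics = [0.03, 0.03, 0.06, 0.03, 0.015, 0.06, 0.03, 0.03]  # ug/ml, hypothetical

# Geometric mean is the natural center for data on a two-fold dilution scale.
geo_mean = math.exp(sum(math.log(m) for m in mics) / len(mics))

dilutions = [0.015 * 2 ** k for k in range(5)]   # 0.015 ... 0.24 ug/ml
center = min(dilutions, key=lambda d: abs(math.log(d) - math.log(geo_mean)))
idx = dilutions.index(center)
qc_range = (dilutions[max(idx - 1, 0)], dilutions[min(idx + 1, len(dilutions) - 1)])
print(center, qc_range)   # mode dilution and the 3-dilution window around it
```

Working on the log scale matters: MICs vary multiplicatively, so the arithmetic mean would be biased toward the higher dilutions.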

  2. 222-S Laboratory Quality Assurance Plan. Revision 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meznarich, H.K.

    1995-07-31

    This Quality Assurance Plan provides quality assurance (QA) guidance, regulatory QA requirements (e.g., 10 CFR 830.120), and quality control (QC) specifications for analytical services. This document follows the U.S. Department of Energy (DOE) issued Hanford Analytical Services Quality Assurance Plan (HASQAP). In addition, this document meets the objectives of the Quality Assurance Program provided in WHC-CM-4-2, Section 2.1. Quality assurance elements required in the Guidelines and Specifications for Preparing Quality Assurance Program Plans (QAMS-004) and Interim Guidelines and Specifications for Preparing Quality Assurance Project Plans (QAMS-005) from the US Environmental Protection Agency (EPA) are covered throughout this document. A quality assurance index is provided in Appendix A. This document also provides and/or identifies the procedural information that governs laboratory operations. The personnel of the 222-S Laboratory and the Standards Laboratory, including managers, analysts, QA/QC staff, auditors, and support staff, shall use this document as guidance and instructions for their operational and quality assurance activities. Other organizations that conduct activities described in this document for the 222-S Laboratory shall follow this QA/QC document.

  3. ChiLin: a comprehensive ChIP-seq and DNase-seq quality control and analysis pipeline.

    PubMed

    Qin, Qian; Mei, Shenglin; Wu, Qiu; Sun, Hanfei; Li, Lewyn; Taing, Len; Chen, Sujun; Li, Fugen; Liu, Tao; Zang, Chongzhi; Xu, Han; Chen, Yiwen; Meyer, Clifford A; Zhang, Yong; Brown, Myles; Long, Henry W; Liu, X Shirley

    2016-10-03

    Transcription factor binding, histone modification, and chromatin accessibility studies are important approaches to understanding the biology of gene regulation. ChIP-seq and DNase-seq have become the standard techniques for studying protein-DNA interactions and chromatin accessibility respectively, and comprehensive quality control (QC) and analysis tools are critical to extracting the most value from these assay types. Although many analysis and QC tools have been reported, few combine ChIP-seq and DNase-seq data analysis and quality control in a unified framework with a comprehensive and unbiased reference of data quality metrics. ChiLin is a computational pipeline that automates the quality control and data analyses of ChIP-seq and DNase-seq data. It is developed using a flexible and modular software framework that can be easily extended and modified. ChiLin is ideal for batch processing of many datasets and is well suited for large collaborative projects involving ChIP-seq and DNase-seq from different designs. ChiLin generates comprehensive quality control reports that include comparisons with historical data derived from over 23,677 public ChIP-seq and DNase-seq samples (11,265 datasets) from eight literature-based classified categories. To the best of our knowledge, this atlas represents the most comprehensive ChIP-seq and DNase-seq related quality metric resource currently available. These historical metrics provide useful heuristic quality references for experiments across all commonly used assay types. Using representative datasets, we demonstrate the versatility of the pipeline by applying it to different assay types of ChIP-seq data. The pipeline software is available open source at https://github.com/cfce/chilin . ChiLin is a scalable and powerful tool to process large batches of ChIP-seq and DNase-seq datasets. The analysis output and quality metrics have been structured into user-friendly directories and reports. We have successfully compiled 23

  4. 40 CFR 98.424 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 22 2012-07-01 2012-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Suppliers of Carbon Dioxide § 98.424 Monitoring and QA/QC... determine quantity in accordance with this paragraph. (i) Reporters that supply CO2 in containers using...

  5. 40 CFR 98.424 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 22 2013-07-01 2013-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Suppliers of Carbon Dioxide § 98.424 Monitoring and QA/QC... determine quantity in accordance with this paragraph. (i) Reporters that supply CO2 in containers using...

  6. 40 CFR 98.424 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 21 2011-07-01 2011-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Suppliers of Carbon Dioxide § 98.424 Monitoring and QA/QC... determine quantity in accordance with this paragraph. (i) Reporters that supply CO2 in containers using...

  7. 40 CFR 98.424 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 21 2014-07-01 2014-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Suppliers of Carbon Dioxide § 98.424 Monitoring and QA/QC... determine quantity in accordance with this paragraph. (i) Reporters that supply CO2 in containers using...

  8. Quality control for quantitative multicenter whole-body PET/MR studies: A NEMA image quality phantom study with three current PET/MR systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boellaard, Ronald, E-mail: r.boellaard@vumc.nl; European Association of Nuclear Medicine Research Ltd., Vienna 1060; European Association of Nuclear Medicine Physics Committee, Vienna 1060

    2015-10-15

    Purpose: Integrated positron emission tomography/magnetic resonance (PET/MR) systems derive the PET attenuation correction (AC) from dedicated MR sequences. While MR-AC performs reasonably well in clinical patient imaging, it may fail for phantom-based quality control (QC). The authors assess the applicability of different protocols for PET QC in multicenter PET/MR imaging. Methods: The National Electrical Manufacturers Association NU 2-2007 image quality phantom was imaged on three combined PET/MR systems: a Philips Ingenuity TF PET/MR, a Siemens Biograph mMR, and a GE SIGNA PET/MR (prototype) system. The phantom was filled according to the EANM FDG-PET/CT guideline 1.0 and scanned for 5 min over 1 bed. Two MR-AC imaging protocols were tested: standard clinical procedures and a dedicated protocol for phantom tests. Depending on the system, the dedicated phantom protocol employs a two-class (water and air) segmentation of the MR data or a CT-based template. Differences in attenuation and SUV recovery coefficients (RC) are reported. PET/CT-based simulations were performed to simulate the various artifacts seen in the AC maps (μ-maps) and their impact on the accuracy of phantom-based QC. Results: Clinical MR-AC protocols caused substantial errors and artifacts in the AC maps, resulting in underestimations of the reconstructed PET activity of up to 27%, depending on the PET/MR system. Using dedicated phantom MR-AC protocols, PET bias was reduced to −8%. Mean and max SUV RC met EARL multicenter PET performance specifications for most contrast objects, but only when using the dedicated phantom protocol. Simulations confirmed the bias in experimental data to be caused by incorrect AC maps resulting from the use of clinical MR-AC protocols. Conclusions: Phantom-based quality control of PET/MR systems in a multicenter, multivendor setting may be performed with sufficient accuracy, but only when dedicated phantom acquisition and processing protocols are used for
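
The two quantities compared in this phantom study, reconstructed-activity bias and SUV recovery coefficient, are simple ratios. The activity values below are invented to mirror the reported magnitudes (about -27% bias with clinical MR-AC vs about -8% with the dedicated protocol); this is not the authors' analysis code.

```python
# Simplified sketch of phantom QC quantities: percent bias of reconstructed
# activity against the known filled activity, and the recovery coefficient
# (measured/true uptake) for a contrast object.

def percent_bias(measured, true):
    """Signed percent deviation of measured activity from the true value."""
    return 100 * (measured - true) / true

def recovery_coefficient(measured_uptake, true_uptake):
    """RC = measured / true; 1.0 means full recovery of the object's uptake."""
    return measured_uptake / true_uptake

true_activity = 10.0                        # kBq/ml, hypothetical fill
print(percent_bias(7.3, true_activity))     # clinical mu-map: ~-27% bias
print(percent_bias(9.2, true_activity))     # dedicated phantom protocol: ~-8%
print(recovery_coefficient(8.5, 10.0))      # RC for one hypothetical sphere
```

EARL-style specifications then amount to checking that such RC values fall within published min/max bands per sphere size.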

  9. Quality assurance and quality control in mammography: a review of available guidance worldwide.

    PubMed

    Reis, Cláudia; Pascoal, Ana; Sakellaris, Taxiarchis; Koutalonis, Manthos

    2013-10-01

    Review available guidance for quality assurance (QA) in mammography and discuss its contribution to harmonise practices worldwide. A literature search was performed across different sources to identify guidance documents for QA in mammography available worldwide from international bodies, healthcare providers, and professional/scientific associations. The guidance documents identified were reviewed and a selection was compared for type of guidance (clinical/technical), technology and proposed QA methodologies focusing on dose and image quality (IQ) performance assessment. Fourteen protocols (targeted at conventional and digital mammography) were reviewed. All included recommendations for testing acquisition, processing and display systems associated with mammographic equipment. All guidance reviewed highlighted the importance of dose assessment and testing the Automatic Exposure Control (AEC) system. Recommended tests for assessment of IQ showed variations in the proposed methodologies. Recommended testing focused on assessment of low-contrast detection, spatial resolution and noise. QC of image display is recommended following the American Association of Physicists in Medicine guidelines. The existing QA guidance for mammography is derived from key documents (American College of Radiology and European Union guidelines) and proposes similar tests despite variations in detail and methodologies. Studies reporting QA data should provide detail on the experimental technique to allow robust data comparison. Countries aiming to implement a mammography QA program may select/prioritise the tests depending on available technology and resources. •An effective QA program should be practical to implement in a clinical setting. •QA should address the various stages of the imaging chain: acquisition, processing and display. •AEC system QC testing is simple to implement and provides information on equipment performance.

  10. TU-F-CAMPUS-I-05: Semi-Automated, Open Source MRI Quality Assurance and Quality Control Program for Multi-Unit Institution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yung, J; Stefan, W; Reeve, D

    2015-06-15

    Purpose: Phantom measurements allow for the performance of magnetic resonance (MR) systems to be evaluated. American Association of Physicists in Medicine (AAPM) Report No. 100 Acceptance Testing and Quality Assurance Procedures for MR Imaging Facilities, American College of Radiology (ACR) MR Accreditation Program MR phantom testing, and ACR MRI quality control (QC) program documents help to outline specific tests for establishing system performance baselines as well as system stability over time. Analyzing and processing tests from multiple systems can be time-consuming for medical physicists. Besides determining whether tests are within predetermined limits or criteria, monitoring longitudinal trends can also help prevent costly downtime of systems during clinical operation. In this work, a semi-automated QC program was developed to analyze and record measurements in a database that allowed for easy access to historical data. Methods: Image analysis was performed on 27 different MR systems of 1.5T and 3.0T field strengths from GE and Siemens manufacturers. Recommended measurements involved the ACR MRI Accreditation Phantom, spherical homogenous phantoms, and a phantom with a uniform hole pattern. Measurements assessed geometric accuracy and linearity, position accuracy, image uniformity, signal, noise, ghosting, transmit gain, center frequency, and magnetic field drift. The program was designed with open source tools, employing Linux, Apache, the MySQL database and the Python programming language for the front and backend. Results: Processing time for each image is <2 seconds. Figures are produced to show regions of interest (ROIs) for analysis. Historical data can be reviewed to compare previous year data and to inspect for trends. Conclusion: A MRI quality assurance and QC program is necessary for maintaining high quality, ACR MRI Accredited MR programs. A reviewable database of phantom measurements assists medical physicists with processing and monitoring of large

  11. 40 CFR 98.474 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 22 2013-07-01 2013-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Injection of Carbon Dioxide § 98.474 Monitoring and QA/QC.... (2) You must determine the quarterly mass or volume of contents in all containers if you receive CO2...

  12. 40 CFR 98.474 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 21 2014-07-01 2014-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Injection of Carbon Dioxide § 98.474 Monitoring and QA/QC.... (2) You must determine the quarterly mass or volume of contents in all containers if you receive CO2...

  13. 40 CFR 98.474 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 22 2012-07-01 2012-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Injection of Carbon Dioxide § 98.474 Monitoring and QA/QC.... (2) You must determine the quarterly mass or volume of contents in all containers if you receive CO2...

  14. Implementation of basic quality control tests for malaria medicines in Amazon Basin countries: results for the 2005-2010 period.

    PubMed

    Pribluda, Victor S; Barojas, Adrian; Añez, Arletta; López, Cecilia G; Figueroa, Ruth; Herrera, Roxana; Nakao, Gladys; Nogueira, Fernando Ha; Pianetti, Gerson A; Povoa, Marinete M; Viana, Giselle Mr; Gomes, Margarete S Mendonça; Escobar, Jose P; Sierra, Olga L Muñoz; Norena, Susana P Rendon; Veloz, Raúl; Bravo, Marcy Silva; Aldás, Martha R; Hindssemple, Alison; Collins, Marilyn; Ceron, Nicolas; Krishnalall, Karanchand; Adhin, Malti; Bretas, Gustavo; Hernandez, Nelly; Mendoza, Marjorie; Smine, Abdelkrim; Chibwe, Kennedy; Lukulay, Patrick; Evans, Lawrence

    2012-06-15

    Most samples were collected from the public sector, 1,445/1,663 (86.9%). Results indicate that 193/1,663 (11.6%) were found not to meet quality specifications. Most failures were reported during visual and physical inspection, 142/1,663 (8.5%), and most of these were due to expired medicines, 118/142 (83.1%). Samples failing TLC accounted for 27/1,663 (1.6%) and those failing disintegration accounted for 24/1,663 (1.4%). Medicine quality failures decreased significantly during the last two years. Basic tests revealed that the quality of medicines in the public sector has improved over the years since the implementation of this type of quality monitoring programme in 2005. However, the lack of consistent confirmatory tests in the quality control (QC) laboratory, utilizing methods that can also evaluate additional quality attributes, could still mask quality issues. In the future, AMI countries should improve coordination between their health authorities and their QC laboratories to consistently provide a more complete picture of malaria medicine quality and support the implementation of corrective actions. Facilities in the private and informal sectors should also be included when these sectors constitute an important source of medicines used by malaria patients.

  15. WE-AB-206-01: Diagnostic Ultrasound Imaging Quality Assurance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zagzebski, J.

    The involvement of medical physicists in diagnostic ultrasound imaging service is increasing due to QC and accreditation requirements. The goal of this ultrasound hands-on workshop is to demonstrate quality control (QC) testing in diagnostic ultrasound and to provide updates on ACR ultrasound accreditation requirements. The first half of this workshop will include two presentations reviewing diagnostic ultrasound QA/QC and ACR ultrasound accreditation requirements. The second half of the workshop will include live demonstrations of basic QC tests. An array of ultrasound testing phantoms and ultrasound scanners will be available for attendees to learn diagnostic ultrasound QC in a hands-on environment with live demonstrations and on-site instructors. The targeted attendees are medical physicists in diagnostic imaging. Learning Objectives: Gain familiarity with common elements of a QA/QC program for diagnostic ultrasound imaging; identify QC tools available for testing diagnostic ultrasound systems and learn how to use these tools; learn ACR ultrasound accreditation requirements. Jennifer Walter is an employee of the American College of Radiology on Ultrasound Accreditation.

  16. Establishing quality control ranges for antimicrobial susceptibility testing of Escherichia coli, Pseudomonas aeruginosa, and Staphylococcus aureus: a cornerstone to develop reference strains for Korean clinical microbiology laboratories.

    PubMed

    Hong, Sung Kuk; Choi, Seung Jun; Shin, Saeam; Lee, Wonmok; Pinto, Naina; Shin, Nari; Lee, Kwangjun; Hong, Seong Geun; Kim, Young Ah; Lee, Hyukmin; Kim, Heejung; Song, Wonkeun; Lee, Sun Hwa; Yong, Dongeun; Lee, Kyungwon; Chong, Yunsop

    2015-11-01

    Quality control (QC) processes are being performed in the majority of clinical microbiology laboratories to ensure the performance of microbial identification and antimicrobial susceptibility testing by using ATCC strains. To obtain these ATCC strains, some inconveniences are encountered concerning the purchase cost of the strains and the shipping time required. This study was focused on constructing a database of reference strains for QC processes using domestic bacterial strains, concentrating primarily on antimicrobial susceptibility testing. Three strains (Escherichia coli, Pseudomonas aeruginosa, and Staphylococcus aureus) that showed legible results in preliminary testing were selected. The minimal inhibitory concentrations (MICs) and zone diameters (ZDs) of eight antimicrobials for each strain were determined according to the CLSI M23. All resulting MIC and ZD ranges included at least 95% of the data. The ZD QC ranges obtained by using the CLSI method were less than 12 mm, and the MIC QC ranges extended no more than five dilutions. This study is a preliminary attempt to construct a bank of Korean QC strains. With further studies, a positive outcome toward cost and time reduction can be anticipated.

  17. Combination Therapy With Exenatide Plus Pioglitazone Versus Basal/Bolus Insulin in Patients With Poorly Controlled Type 2 Diabetes on Sulfonylurea Plus Metformin: The Qatar Study

    PubMed Central

    Abdul-Ghani, Muhammad; Migahid, Osama; Megahed, Ayman; Adams, John; Triplitt, Curtis; DeFronzo, Ralph A.; Zirie, Mahmoud; Jayyousi, Amin

    2017-01-01

    OBJECTIVE The Qatar Study was designed to examine the efficacy of combination therapy with exenatide plus pioglitazone versus basal/bolus insulin in patients with long-standing poorly controlled type 2 diabetes mellitus (T2DM) on metformin plus a sulfonylurea. RESEARCH DESIGN AND METHODS The study randomized 231 patients with poorly controlled (HbA1c >7.5%, 58 mmol/mol) T2DM on a sulfonylurea plus metformin to receive 1) pioglitazone plus weekly exenatide (combination therapy) or 2) basal plus prandial insulin (insulin therapy) to maintain HbA1c <7.0% (53 mmol/mol). RESULTS After a mean follow-up of 12 months, combination therapy caused a robust decrease in HbA1c from 10.0 ± 0.6% (86 ± 5.2 mmol/mol) at baseline to 6.1 ± 0.1% (43 ± 0.7 mmol/mol) compared with 7.1 ± 0.1% (54 ± 0.8 mmol/mol) in subjects receiving insulin therapy. Combination therapy was effective in lowering the HbA1c independent of sex, ethnicity, BMI, or baseline HbA1c. Subjects in the insulin therapy group experienced significantly greater weight gain and a threefold higher rate of hypoglycemia than patients in the combination therapy group. CONCLUSIONS Combination exenatide/pioglitazone therapy is a very effective and safe therapeutic option in patients with long-standing poorly controlled T2DM on metformin plus a sulfonylurea. PMID:28096223

  18. Land 3D-seismic data: Preprocessing quality control utilizing survey design specifications, noise properties, normal moveout, first breaks, and offset

    USGS Publications Warehouse

    Raef, A.

    2009-01-01

    The recent proliferation of the 3D reflection seismic method into the near-surface area of geophysical applications, especially in response to the emergence of the need to comprehensively characterize and monitor near-surface carbon dioxide sequestration in shallow saline aquifers around the world, justifies the emphasis on cost-effective and robust quality control and assurance (QC/QA) workflow of 3D seismic data preprocessing that is suitable for near-surface applications. The main purpose of our seismic data preprocessing QC is to enable the use of appropriate header information, data that are free of noise-dominated traces, and/or flawed vertical stacking in subsequent processing steps. In this article, I provide an account of utilizing survey design specifications, noise properties, first breaks, and normal moveout for rapid and thorough graphical QC/QA diagnostics, which are easy to apply and efficient in the diagnosis of inconsistencies. A correlated vibroseis time-lapse 3D-seismic data set from a CO2-flood monitoring survey is used for demonstrating QC diagnostics. An important by-product of the QC workflow is establishing the number of layers for a refraction statics model in a data-driven graphical manner that capitalizes on the spatial coverage of the 3D seismic data. © China University of Geosciences (Wuhan) and Springer-Verlag GmbH 2009.

  19. 40 CFR 98.424 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Suppliers of Carbon Dioxide § 98.424 Monitoring and QA/QC... containers shall measure the mass in each CO2 container using weigh bills, scales, or load cells and sum the...

  20. E-Quality in the Workplace: Quality Circles or Quality of Working Life Programs in the US.

    ERIC Educational Resources Information Center

    Savage, Grant T.; Romano, Richard

    Quality Circle (QC) and Quality of Working Life (QWL) in the United States are similar in that both stress participative decision making, preserve management's prerogative to have the final say, and are voluntary. QC and QWL programs differ, however, in that labor unions are more involved in QWLs; QCs deal only with technical problems related to…

  1. Impact of Case Mix Severity on Quality Improvement in a Patient-centered Medical Home (PCMH) in the Maryland Multi-Payor Program.

    PubMed

    Khanna, Niharika; Shaya, Fadia T; Chirikov, Viktor V; Sharp, David; Steffen, Ben

    2016-01-01

    We present data on quality of care (QC) improvement in 35 of 45 National Quality Forum metrics reported annually by 52 primary care practices recognized as patient-centered medical homes (PCMHs) that participated in the Maryland Multi-Payor Program from 2011 to 2013. We assigned QC metrics to (1) chronic, (2) preventive, and (3) mental health care domains. The study used a panel data design with no control group. Using longitudinal fixed-effects regressions, we modeled QC and case mix severity in a PCMH. Overall, 35 of 45 quality metrics reported by 52 PCMHs demonstrated improvement over 3 years, and case mix severity did not affect the achievement of quality improvement. From 2011 to 2012, QC increased by 0.14 (P < .01) for chronic, 0.15 (P < .01) for preventive, and 0.34 (P < .01) for mental health care domains; from 2012 to 2013 these domains increased by 0.03 (P = .06), 0.04 (P = .05), and 0.07 (P = .12), respectively. In univariate analyses, lower National Commission on Quality Assurance PCMH level was associated with higher QC for the mental health care domain, whereas case mix severity did not correlate with QC. In multivariate analyses, higher QC correlated with larger practices, greater proportion of older patients, and readmission visits. Rural practices had higher proportions of Medicaid patients, lower QC, and higher QC improvement in interaction analyses with time. The gains in QC in the chronic disease domain, the preventive care domain, and, most significantly, the mental health care domain were observed over time regardless of patient case mix severity. QC improvement was generally not modified by practice characteristics, except for rurality. © Copyright 2016 by the American Board of Family Medicine.

  2. Autonomous exercise game use improves metabolic control and quality of life in type 2 diabetes patients - a randomized controlled trial.

    PubMed

    Kempf, Kerstin; Martin, Stephan

    2013-12-10

    Lifestyle intervention in type 2 diabetes mellitus (T2DM) is effective but needs a special local setting and is costly. Therefore, in a randomized-controlled trial we tested the hypothesis that the autonomous use of the interactive exercise game Wii Fit Plus over a period of 12 weeks improves metabolic control, with HbA1c reduction as the primary outcome, and weight loss, reduction of cardiometabolic risk factors, physical activity and quality of life (secondary outcomes) in T2DM patients. Participants (n = 220) were randomized into an intervention and a control group. The intervention group was provided with a Wii console, a balance board and the exercise game Wii Fit Plus for 12 weeks. The control group remained under routine care and received the items 12 weeks later. At baseline and after 12 weeks (and for the control group additionally after 12 weeks of intervention) the participants' health parameters, medication, physical activity and validated questionnaires for quality of life (PAID, SF12, WHO-5, CES-D) were requested and compared in a complete case analysis using the Mann-Whitney test and the Wilcoxon signed rank test. 80% of participants completed the 12-week study. Patients in the intervention group significantly improved HbA1c (from 7.1 ± 1.3% to 6.8 ± 0.9%; -0.3 ± 1.1%; p = 0.0002) in comparison to the control group (from 6.8 ± 0.9% to 6.7 ± 0.7%; -0.1 ± 0.5%) and also significantly reduced fasting blood glucose (from 135.8 ± 38.9 mg/dl to 126.6 ± 36.6 mg/dl; p = 0.04), weight (from 97.6 ± 19.2 kg to 96.3 ± 18.7 kg; p < 0.001) and body mass index (from 34.1 ± 6.5 kg/m2 to 33.5 ± 6.5 kg/m2; p < 0.001). Daily physical activity increased significantly (p < 0.001). Diabetes-dependent impairment, mental health, subjective wellbeing and quality of life also improved significantly, and the number of patients with depression decreased. Similar improvements were seen

  3. Joint design of QC-LDPC codes for coded cooperation system with joint iterative decoding

    NASA Astrophysics Data System (ADS)

    Zhang, Shunwai; Yang, Fengfan; Tang, Lei; Ejaz, Saqib; Luo, Lin; Maharaj, B. T.

    2016-03-01

    In this paper, we investigate joint design of quasi-cyclic low-density-parity-check (QC-LDPC) codes for a coded cooperation system with joint iterative decoding in the destination. First, QC-LDPC codes based on the base matrix and exponent matrix are introduced, and then we describe two types of girth-4 cycles in QC-LDPC codes employed by the source and relay. In the equivalent parity-check matrix corresponding to the jointly designed QC-LDPC codes employed by the source and relay, all girth-4 cycles, including both type I and type II, are cancelled. Theoretical analysis and numerical simulations show that the jointly designed QC-LDPC coded cooperation effectively combines cooperation gain and channel coding gain, and outperforms coded non-cooperation under the same conditions. Furthermore, the bit error rate performance of the coded cooperation employing jointly designed QC-LDPC codes is better than those of random LDPC codes and separately designed QC-LDPC codes over AWGN channels.
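The girth-4 condition discussed above can be checked on any binary parity-check matrix: a length-4 cycle exists exactly when two rows share ones in at least two common column positions. A minimal sketch, with illustrative matrices rather than the paper's actual code constructions:

```python
from itertools import combinations

def has_girth4_cycle(H):
    """Return True if the binary parity-check matrix H (list of 0/1 rows)
    contains a length-4 cycle, i.e. two rows whose supports overlap in
    at least two column positions."""
    supports = [set(j for j, v in enumerate(row) if v) for row in H]
    return any(len(a & b) >= 2 for a, b in combinations(supports, 2))

# Rows 0 and 1 overlap in columns 0 and 2, forming a 4-cycle:
H_bad = [[1, 0, 1, 0],
         [1, 1, 1, 0],
         [0, 1, 0, 1]]
# Every pair of rows overlaps in at most one column, so no 4-cycle:
H_good = [[1, 1, 0, 0],
          [0, 1, 1, 0],
          [1, 0, 0, 1]]
```

In a joint design of the kind described, such a check would be applied to the equivalent parity-check matrix combining the source and relay codes.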

  4. FASTQ quality control dashboard

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2016-07-25

    FQCDB builds upon the existing open source software FastQC, implementing a modern web interface across the parsed output of FastQC. In addition, FQCDB is extensible as a web service to include additional plots of type line, boxplot, or heatmap, across data formatted according to its guidelines. The interface is also configurable via a more readable JSON format, enabling customization by non-web programmers.

  5. Creating a comprehensive quality-controlled dataset of severe weather occurrence in Europe

    NASA Astrophysics Data System (ADS)

    Groenemeijer, P.; Kühne, T.; Liang, Z.; Holzer, A.; Feuerstein, B.; Dotzek, N.

    2010-09-01

    Ground-truth quality-controlled data on severe weather occurrence is required for meaningful research on severe weather hazards. Such data are collected by observation networks of several authorities in Europe, most prominently the National Hydrometeorological Institutes (NHMS). However, some events challenge the capabilities of such conventional networks by their isolated and short-lived nature. These rare and very localized but extreme events include thunderstorm wind gusts, large hail and tornadoes and are poorly resolved by synoptic observations. Moreover, their detection by remote-sensing techniques such as radar and satellites is in development and has proven to be difficult. The fact that all across Europe there are many people with a special personal or professional interest in such events, who are typically organized in associations, allows pursuing a different strategy. Data delivered to the European Severe Weather Database (ESWD) is recorded and quality controlled by ESSL and a large number of partners including the Hydrometeorological Institutes of Germany, Finland, Austria, Italy and Bulgaria. Additionally, nine associations of storm spotters and centres of expertise in these and other countries are involved. The two categories of organizations (NHMSes/other) each have different privileges in the quality control procedure, which involves assigning a quality level of QC0+ (plausibility checked), QC1 (confirmed by reliable sources) or QC2 (verified) to each of the reports. Within the EWENT project funded by the EU 7th framework programme, the RegioExakt project funded by the German Ministry of Education and Research, and with support from the German Weather Service (DWD), several enhancements of the ESWD have been and will be carried out. Completed enhancements include the creation of an interface that allows partner organizations to upload data automatically, in the case of our German partner "Skywarn Germany" in near-real time. Moreover, the
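The three-tier quality-level scheme described above can be modelled as a simple ordered classification. A hypothetical sketch (the function and boolean flags are illustrative, not ESSL's actual implementation):

```python
def eswd_quality_level(plausible, confirmed, verified):
    """Assign an ESWD-style quality level to a severe-weather report:
    QC0+ (plausibility checked), QC1 (confirmed by reliable sources),
    QC2 (fully verified). Flags are assumed to be cumulative."""
    if verified:
        return "QC2"
    if confirmed:
        return "QC1"
    if plausible:
        return "QC0+"
    return "unchecked"
```

A report confirmed by an NHMS partner, for example, would move from QC0+ to QC1, and a damage-survey-verified tornado report to QC2.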

  6. Quality Control Methodology Of A Surface Wind Observational Database In North Eastern North America

    NASA Astrophysics Data System (ADS)

    Lucio-Eceiza, Etor E.; Fidel González-Rouco, J.; Navarro, Jorge; Conte, Jorge; Beltrami, Hugo

    2016-04-01

    This work summarizes the design and application of a Quality Control (QC) procedure for an observational surface wind database located in North Eastern North America. The database consists of 526 sites (486 land stations and 40 buoys) with varying resolutions of hourly, 3 hourly and 6 hourly data, compiled from three different source institutions with uneven measurement units and changing measuring procedures, instrumentation and heights. The records span from 1953 to 2010. The QC process is composed of different phases focused either on problems related with the providing source institutions or measurement errors. The first phases deal with problems often related with data recording and management: (1) compilation stage dealing with the detection of typographical errors, decoding problems, site displacements and unification of institutional practices; (2) detection of erroneous data sequence duplications within a station or among different ones; (3) detection of errors related with physically unrealistic data measurements. The last phases are focused on instrumental errors: (4) problems related with low variability, placing particular emphasis on the detection of unrealistic low wind speed records with the help of regional references; (5) high variability related erroneous records; (6) standardization of wind speed record biases due to changing measurement heights, detection of wind speed biases on week to monthly timescales, and homogenization of wind direction records. As a result, around 1.7% of wind speed records and 0.4% of wind direction records have been deleted, making a combined total of 1.9% of removed records. Additionally, around 15.9% of wind speed records and 2.4% of wind direction data have also been corrected.
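Phase (3) above, the detection of physically unrealistic measurements, amounts to simple plausibility bounds on speed and direction. A hedged sketch (the speed cap and function name are assumptions for illustration, not the study's actual thresholds):

```python
def qc_flag_wind(records, vmax=75.0):
    """Flag physically unrealistic surface-wind records: negative
    speeds, speeds above a plausibility cap vmax (m/s, illustrative),
    and directions outside [0, 360] degrees."""
    flags = []
    for speed, direction in records:
        bad = (speed < 0.0 or speed > vmax
               or not (0.0 <= direction <= 360.0))
        flags.append(bad)
    return flags

# Illustrative hourly records as (speed m/s, direction deg):
records = [(5.2, 180.0), (-1.0, 90.0), (120.0, 45.0), (3.3, 400.0)]
flags = qc_flag_wind(records)   # [False, True, True, True]
```

Flagged records would then pass to the later phases (variability checks, height standardization) rather than being deleted outright.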

  7. A Randomized Controlled Trial of Cognitive-Behavior Therapy Plus Bright Light Therapy for Adolescent Delayed Sleep Phase Disorder

    PubMed Central

    Gradisar, Michael; Dohnt, Hayley; Gardner, Greg; Paine, Sarah; Starkey, Karina; Menne, Annemarie; Slater, Amy; Wright, Helen; Hudson, Jennifer L.; Weaver, Edward; Trenowden, Sophie

    2011-01-01

    Objective: To evaluate cognitive-behavior therapy plus bright light therapy (CBT plus BLT) for adolescents diagnosed with delayed sleep phase disorder (DSPD). Design: Randomized controlled trial of CBT plus BLT vs. waitlist (WL) control with comparisons at pre- and post-treatment. There was 6-month follow-up for the CBT plus BLT group only. Setting: Flinders University Child & Adolescent Sleep Clinic, Adelaide, South Australia. Patients: 49 adolescents (mean age 14.6 ± 1.0 y, 53% males) diagnosed with DSPD; mean chronicity 4 y 8 months; 16% not attending school. Eighteen percent of adolescents dropped out of the study (CBT plus BLT: N = 23 vs WL: N = 17). Interventions: CBT plus BLT consisted of 6 individual sessions, including morning bright light therapy to advance adolescents' circadian rhythms, and cognitive restructuring and sleep education to target associated insomnia and sleep hygiene. Measurements and Results: DSPD diagnosis was performed via a clinical interview and 7-day sleep diary. Measurements at each time-point included online sleep diaries and scales measuring sleepiness, fatigue, and depression symptoms. Compared to WL, moderate-to-large improvements (d = 0.65-1.24) were found at post-treatment for CBT plus BLT adolescents, including reduced sleep latency, earlier sleep onset and rise times, total sleep time (school nights), wake after sleep onset, sleepiness, and fatigue. At 6-month follow-up (N = 15), small-to-large improvements (d = 0.24-1.53) continued for CBT plus BLT adolescents, with effects found for all measures. Significantly fewer adolescents receiving CBT plus BLT met DSPD criteria at post-treatment (WL = 82% vs. CBT plus BLT = 13%, P < 0.0001), yet 13% still met DSPD criteria at the 6-month follow-up. Conclusions: CBT plus BLT for adolescent DSPD is effective for improving multiple sleep and daytime impairments in the immediate and long-term. Studies evaluating the treatment effectiveness of each treatment component are needed

  8. Development of QC Procedures for Ocean Data Obtained by National Research Projects of Korea

    NASA Astrophysics Data System (ADS)

    Kim, S. D.; Park, H. M.

    2017-12-01

    To establish a data management system for ocean data obtained by national research projects of the Ministry of Oceans and Fisheries of Korea, KIOST conducted standardization and development of QC procedures. After reviewing and analyzing the existing international and domestic ocean-data standards and QC procedures, draft versions of the standards and QC procedures were prepared. The proposed standards and QC procedures were reviewed and revised by experts in the field of oceanography and academic societies several times. A technical report was prepared covering the standards of 25 data items and 12 QC procedures for physical, chemical, biological and geological data items. The QC procedure for temperature and salinity data was set up by referring to the manuals published by GTSPP, ARGO and IOOS QARTOD. It consists of 16 QC tests applicable to vertical profile data and time series data obtained in real-time mode and delay mode. Three regional range tests to inspect annual, seasonal and monthly variations were included in the procedure. Three programs were developed to calculate and provide the upper and lower limits of temperature and salinity at depths from 0 to 1550 m. TS data of the World Ocean Database, ARGO, GTSPP and in-house data of KIOST were analysed statistically to calculate the regional limits of the Northwest Pacific area. Based on statistical analysis, the programs calculate regional ranges using the mean and standard deviation on three kinds of grid systems (3° grid, 1° grid and 0.5° grid) and provide recommendations. The QC procedures for 12 data items were set up during the 1st phase of the national program for data management (2012-2015) and are being applied to national research projects in practice during the 2nd phase (2016-2019). The QC procedures will be revised by reviewing the results of QC application when the 2nd phase of the data management programs is completed.
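A regional range test of the kind described, with limits derived from the mean and standard deviation of historical data in a grid cell, can be sketched as follows (the k = 3 multiplier and the sample values are illustrative assumptions, not the report's actual parameters):

```python
import statistics

def regional_limits(values, k=3):
    """Lower/upper limit for a GTSPP/ARGO-style regional range test:
    mean +/- k standard deviations of historical values for one grid
    cell and depth level. k = 3 is an assumed multiplier."""
    m = statistics.mean(values)
    s = statistics.stdev(values)
    return m - k * s, m + k * s

# Illustrative historical temperatures (deg C) for one 1-degree cell:
temps = [14.8, 15.1, 15.0, 14.9, 15.2]
lo, hi = regional_limits(temps)
passes = [lo <= t <= hi for t in temps]   # all True here
```

In practice such limits would be precomputed per grid cell, depth, and month, then applied to incoming real-time or delayed-mode profiles.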

  9. A Case-Control Study of Esomeprazole Plus Rebamipide vs. Omeprazole Plus Rebamipide on Post-ESD Gastric Ulcers.

    PubMed

    Bunno, Maki; Gouda, Kyosuke; Yamahara, Kunihiro; Kawaguchi, Masanori

    2013-01-01

    Endoscopic submucosal dissection (ESD) is useful for treating gastric tumors. Several trials have shown the efficacy of 4 or 8 weeks of proton pump inhibitor (PPI) administration for post-ESD ulcers. However, if the size of the post-ESD ulcer is larger than predicted, PPI administration alone might not be sufficient for the ulcer to heal within 4 weeks. There is no report on the efficacy of esomeprazole for post-ESD gastric ulcers. We retrospectively examined the efficacy of combination therapy with esomeprazole plus rebamipide, a mucosal-protective antiulcer drug, in accelerating post-ESD ulcer healing compared with omeprazole plus rebamipide. We reviewed the medical records of patients who underwent ESD for gastric neoplasia. We conducted a case-control study to compare the healing rates within 4 weeks effected by esomeprazole plus rebamipide (group E) and omeprazole plus rebamipide (group O). The sizes of the artificial ulcers were divided into normal-sized or large-sized. The baseline characteristics did not differ significantly between the two groups except for age and sex. Stage S1 disease was observed in 27.6% and 38.7% of patients after 4 weeks of treatment in groups E and O, respectively. In large-sized artificial ulcers, the healing rate to stage S1 within 4 weeks was significantly higher in group E than in group O (25% vs. 0%; P = 0.02). The safety and efficacy profiles of esomeprazole plus rebamipide and omeprazole plus rebamipide are similar for the treatment of ESD-induced ulcers. In large-sized ulcers, esomeprazole plus rebamipide promotes ulcer healing.

  10. Analysis of CrIS/ATMS Using AIRS Version-7 Retrieval and QC Methodology

    NASA Technical Reports Server (NTRS)

    Susskind, Joel; Kouvaris, Louis; Blaisdell, John M.; Iredell, Lena

    2017-01-01

    The objective of this research is to develop and implement an algorithm to analyze a long term data record of CrIS/ATMS observations so as to produce monthly mean gridded Level-3 products which are consistent with, and will serve as a seamless follow-on to, those of AIRS Version-7. We feel the best way to achieve this result is to analyze CrIS/ATMS data using retrieval and Quality Control (QC) methodologies which are scientifically equivalent to those used in AIRS Version-7. We developed and implemented a single retrieval program that uses as input either AIRS/AMSU or CrIS/ATMS radiance observations, and has appropriate switches that take into account the spectral and radiometric differences between CrIS and AIRS. Our methodology is called CHART (Climate Heritage AIRS Retrieval Technique).

  11. [Quality control of laser imagers].

    PubMed

    Winkelbauer, F; Ammann, M; Gerstner, N; Imhof, H

    1992-11-01

    Multiformat imagers based on laser systems are used for documentation in an increasing number of investigations. We explain the specific problems of quality control and investigate the constancy of film processing in laser-imager systems of different configurations, one with a directly connected film-processing unit (Machine 1: 3M Laser Imager Plus M952 with connected 3M film processor, 3M IRB film, 3M-XPM X-ray chemical mixer, and 3M developer and fixer) and one without (Machine 2: 3M Laser Imager Plus M952 with separate DuPont Cronex film processor, Kodak IR film, Kodak automixer, and Kodak developer and fixer). In our checks, based on DIN 6868 and ONORM S 5240, the system with the directly connected film-processing unit showed constancy of film processing in accordance with DIN and ONORM; for such equipment, the film-processing checks demanded by DIN 6868 could therefore be performed at longer intervals. Systems with conventional darkroom processing, by comparison, showed clearly increased fluctuation, so the demanded daily control is essential to guarantee appropriate reaction and constant documentation quality.

  12. The Quality Control Algorithms Used in the Creation of NASA Kennedy Space Center Lightning Protection System Towers Meteorological Database

    NASA Technical Reports Server (NTRS)

    Orcutt, John M.; Brenton, James C.

    2016-01-01

    An accurate database of meteorological data is essential for designing any aerospace vehicle and for preparing launch commit criteria. Meteorological instrumentation was recently placed on the three Lightning Protection System (LPS) towers at Kennedy Space Center (KSC) launch complex 39B (LC-39B), providing a unique meteorological dataset at the launch complex over an extensive altitude range. Data records of temperature, dew point, relative humidity, wind speed, and wind direction are produced at 40, 78, 116, and 139 m on each tower. The Marshall Space Flight Center Natural Environments Branch (EV44) received an archive consisting of one-minute averaged measurements for the period of record of January 2011 - April 2015. However, before the received database could be used, EV44 needed to remove any erroneous data through a comprehensive quality control (QC) process. The QC process applied to the LPS towers' meteorological data is similar to other QC processes developed by EV44 for the creation of meteorological databases for other towers at KSC, but it has been modified specifically for use with the LPS tower database. The QC process first checks each individual sensor, removing any unrealistic data and checking the temporal consistency of each variable. Next, data from all three sensors at each height are checked against each other, checked against climatology, and checked for sensors that erroneously report a constant value. Then, a vertical consistency check of each variable at each tower is completed. Last, the upwind sensor at each level is selected to minimize the influence of the towers and other structures at LC-39B on the measurements; the selection process for the upwind sensor implemented a study of tower-induced turbulence.
This paper describes in detail the QC process, QC results, and the attributes of the LPS towers meteorological
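
    As a rough illustration of the sensor-level checks described above (gross-limit screening, temporal consistency, and constant-value detection), a sketch in Python might look like the following. This is not EV44's code; the thresholds and variable names are hypothetical placeholders, not the values used for the LPS towers.

```python
import numpy as np

# Hypothetical limits for a 1-min temperature series (deg C); the real QC
# process would draw these from climatology and sensor specifications.
VALID_RANGE = (-20.0, 45.0)   # assumed physically realistic bounds
MAX_STEP = 2.0                # assumed max change between 1-min samples
FLATLINE_LEN = 30             # assumed: 30 identical samples -> suspect sensor

def qc_flags(series):
    """Return a boolean array: True where a sample fails any check."""
    x = np.asarray(series, dtype=float)
    bad = np.zeros(x.shape, dtype=bool)

    # 1) Gross-limit check: remove physically unrealistic values.
    bad |= (x < VALID_RANGE[0]) | (x > VALID_RANGE[1])

    # 2) Temporal-consistency check: flag implausibly large jumps.
    step = np.abs(np.diff(x, prepend=x[0]))
    bad |= step > MAX_STEP

    # 3) Constant-value check: flag long runs of identical readings.
    run = 1
    for i in range(1, len(x)):
        run = run + 1 if x[i] == x[i - 1] else 1
        if run >= FLATLINE_LEN:
            bad[i - FLATLINE_LEN + 1 : i + 1] = True
    return bad
```

    The cross-sensor, climatology, and vertical-consistency checks described in the abstract would then operate on the samples that survive these per-sensor flags.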

  13. Moderate exercise plus sleep education improves self-reported sleep quality, daytime mood, and vitality in adults with chronic sleep complaints: a waiting list-controlled trial.

    PubMed

    Gebhart, Carmen; Erlacher, Daniel; Schredl, Michael

    2011-01-01

    Research indicates that physical exercise can contribute to better sleep quality. This study investigates the influence of a six-week combined intervention on self-rated sleep quality, daytime mood, and quality of life. A nonclinical sample of 114 adults with chronic sleep-onset and sleep-maintenance complaints participated in the study. The intervention group of 70 adults underwent moderate physical exercise, conducted weekly, plus sleep education sessions. Improvements among participants assigned to the intervention group relative to the waiting-list control group (n = 44) were noted for subjective sleep quality, daytime mood, depressive symptoms, and vitality. Derived from PSQI subscores, the intervention group reported increased sleep duration, shortened sleep latency, fewer awakenings after sleep onset, and overall better sleep efficiency compared to controls. These gains were sustained, and partly enhanced, through the follow-up 18 weeks later. The findings have implications for treatment programs based on healthy-lifestyle approaches for adults with chronic sleep complaints.

  15. Selecting Statistical Quality Control Procedures for Limiting the Impact of Increases in Analytical Random Error on Patient Safety.

    PubMed

    Yago, Martín

    2017-05-01

    QC planning based on risk management concepts can reduce the probability of harming patients due to an undetected out-of-control error condition. It does this by selecting appropriate QC procedures to decrease the number of erroneous results reported. The selection can easily be made by using published nomograms for simple QC rules when the out-of-control condition results in increased systematic error. However, increases in random error also occur frequently and are difficult to detect, which can result in erroneously reported patient results. A statistical model was used to construct charts for the 1ks and X̄/χ2 rules. The charts relate the increase in the number of unacceptable patient results reported due to an increase in random error to the capability of the measurement procedure. They thus allow for QC planning based on the risk of patient harm due to the reporting of erroneous results. 1ks rules are simple, all-around rules. Their ability to deal with increases in within-run imprecision is minimally affected by the possible presence of significant, stable, between-run imprecision. X̄/χ2 rules perform better when the number of controls analyzed during each QC event is increased to improve QC performance. Using nomograms simplifies the selection of statistical QC procedures to limit the number of erroneous patient results reported due to an increase in analytical random error. The selection largely depends on the presence or absence of stable between-run imprecision. © 2017 American Association for Clinical Chemistry.
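
    As a concrete (and purely hypothetical) illustration of the single-rule family discussed above, a 1ks rule rejects an analytical run if any one of the k control results falls outside the target mean ± s standard deviations. The sketch below is not the paper's implementation, and the target mean, SD, and control values are made-up numbers.

```python
def one_ks_reject(controls, target_mean, target_sd, s=3.0):
    """Return True if the QC event rejects the run under a 1ks rule:
    any single control result outside target_mean +/- s * target_sd."""
    lo = target_mean - s * target_sd
    hi = target_mean + s * target_sd
    return any(not (lo <= c <= hi) for c in controls)
```

    Roughly speaking, an X̄/χ2 procedure would instead test the mean and the dispersion of the control results against separate limits, which is why it benefits from analyzing more controls per QC event.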

  16. A randomized controlled trial of cognitive-behavior therapy plus bright light therapy for adolescent delayed sleep phase disorder.

    PubMed

    Gradisar, Michael; Dohnt, Hayley; Gardner, Greg; Paine, Sarah; Starkey, Karina; Menne, Annemarie; Slater, Amy; Wright, Helen; Hudson, Jennifer L; Weaver, Edward; Trenowden, Sophie

    2011-12-01

    To evaluate cognitive-behavior therapy plus bright light therapy (CBT plus BLT) for adolescents diagnosed with delayed sleep phase disorder (DSPD). Randomized controlled trial of CBT plus BLT vs. waitlist (WL) control with comparisons at pre- and post-treatment. There was 6-month follow-up for the CBT plus BLT group only. Flinders University Child & Adolescent Sleep Clinic, Adelaide, South Australia. 49 adolescents (mean age 14.6 ± 1.0 y, 53% males) diagnosed with DSPD; mean chronicity 4 y 8 months; 16% not attending school. Eighteen percent of adolescents dropped out of the study (CBT plus BLT: N = 23 vs. WL: N = 17). CBT plus BLT consisted of 6 individual sessions, including morning bright light therapy to advance adolescents' circadian rhythms, and cognitive restructuring and sleep education to target associated insomnia and sleep hygiene. DSPD diagnosis was performed via a clinical interview and 7-day sleep diary. Measurements at each time-point included online sleep diaries and scales measuring sleepiness, fatigue, and depression symptoms. Compared to WL, moderate-to-large improvements (d = 0.65-1.24) were found at post-treatment for CBT plus BLT adolescents, including reduced sleep latency, earlier sleep onset and rise times, total sleep time (school nights), wake after sleep onset, sleepiness, and fatigue. At 6-month follow-up (N = 15), small-to-large improvements (d = 0.24-1.53) continued for CBT plus BLT adolescents, with effects found for all measures. Significantly fewer adolescents receiving CBT plus BLT met DSPD criteria at post-treatment (WL = 82% vs. CBT plus BLT = 13%, P < 0.0001), yet 13% still met DSPD criteria at the 6-month follow-up. CBT plus BLT for adolescent DSPD is effective for improving multiple sleep and daytime impairments in both the immediate and the long term. Studies evaluating the treatment effectiveness of each treatment component are needed. Australia-New Zealand Trials Registry Number: ACTRN12610001041044.

  17. MedlinePlus FAQ: What's the difference between MedlinePlus and MedlinePlus Connect?

    MedlinePlus

    ... MedlinePlus Connect is a free service that allows electronic health record (EHR) systems to easily link users to MedlinePlus, ...

  18. Implementation of basic quality control tests for malaria medicines in Amazon Basin countries: results for the 2005–2010 period

    PubMed Central

    2012-01-01

    was evaluated, mostly collected from the public sector, 1,445/1,663 (86.9%). Results indicate that 193/1,663 (11.6%) were found not to meet quality specifications. Most failures were reported during visual and physical inspection, 142/1,663 (8.5%), and most of these were due to expired medicines, 118/142 (83.1%). Samples failing TLC accounted for 27/1,663 (1.6%) and those failing disintegration accounted for 24/1,663 (1.4%). Medicine quality failures decreased significantly during the last two years. Conclusions Basic tests revealed that the quality of medicines in the public sector improved over the years since the implementation of this type of quality monitoring programme in 2005. However, the lack of consistent confirmatory tests in the quality control (QC) laboratory, utilizing methods that can also evaluate additional quality attributes, could still mask quality issues. In the future, AMI countries should coordinate more consistently with their health authorities and QC laboratories to provide a more complete picture of malaria medicine quality and to support the implementation of corrective actions. Facilities in the private and informal sectors should also be included when these sectors constitute an important source of medicines used by malaria patients. PMID:22704680

  19. S-1 and irinotecan plus bevacizumab versus mFOLFOX6 or CapeOX plus bevacizumab as first-line treatment in patients with metastatic colorectal cancer (TRICOLORE): a randomized, open-label, phase III, noninferiority trial.

    PubMed

    Yamada, Y; Denda, T; Gamoh, M; Iwanaga, I; Yuki, S; Shimodaira, H; Nakamura, M; Yamaguchi, T; Ohori, H; Kobayashi, K; Tsuda, M; Kobayashi, Y; Miyamoto, Y; Kotake, M; Shimada, K; Sato, A; Morita, S; Takahashi, S; Komatsu, Y; Ishioka, C

    2018-03-01

    Combination therapy with oral fluoropyrimidine and irinotecan has not yet been established as first-line treatment of metastatic colorectal cancer (mCRC). We carried out a randomized, open-label, phase III trial to determine whether S-1 and irinotecan plus bevacizumab is noninferior to mFOLFOX6 or CapeOX plus bevacizumab in terms of progression-free survival (PFS). Patients from 53 institutions who had previously untreated mCRC were randomly assigned (1 : 1) to receive either mFOLFOX6 or CapeOX plus bevacizumab (control group) or S-1 and irinotecan plus bevacizumab (experimental group; a 3-week regimen: intravenous infusions of irinotecan 150 mg/m2 and bevacizumab 7.5 mg/kg on day 1, oral S-1 80 mg/m2 twice daily for 2 weeks, followed by a 1-week rest; or a 4-week regimen: irinotecan 100 mg/m2 and bevacizumab 5 mg/kg on days 1 and 15, S-1 80 mg/m2 twice daily for 2 weeks, followed by a 2-week rest). The primary end point was PFS. The noninferiority margin was 1.25; noninferiority would be established if the upper limit of the 95% confidence interval (CI) for the hazard ratio (HR) of the control group versus the experimental group was less than this margin. Between June 2012 and September 2014, 487 patients underwent randomization. Two hundred and forty-three patients assigned to the control group and 241 assigned to the experimental group were included in the primary analysis. Median PFS was 10.8 months (95% CI 9.6-11.6) in the control group and 14.0 months (95% CI 12.4-15.5) in the experimental group (HR 0.84, 95% CI 0.70-1.02; P < 0.0001 for noninferiority, P = 0.0815 for superiority). One hundred and fifty-seven patients (64.9%) in the control group and 140 (58.6%) in the experimental group had adverse events of grade 3 or higher. S-1 and irinotecan plus bevacizumab is noninferior to mFOLFOX6 or CapeOX plus bevacizumab with respect to PFS as first-line treatment of mCRC and could be a new standard treatment. UMIN000007834.

  20. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, MIRATECH CORPORATION GECO 3001 AIR/FUEL RATIO CONTROLLER

    EPA Science Inventory

    Details on the verification test design, measurement test procedures, and Quality assurance/Quality Control (QA/QC) procedures can be found in the test plan titled Testing and Quality Assurance Plan, MIRATECH Corporation GECO 3100 Air/Fuel Ratio Controller (SRI 2001). It can be d...

  1. In-Situ Atmospheric Sounding Data Lifecycle from Data Collection, Analysis and Quality Control to Documentation, Archival and Tracking

    NASA Astrophysics Data System (ADS)

    Young, K.; Voemel, H.; Morris, D.

    2015-12-01

    In-situ measurement systems monitor the atmosphere with instruments that are located in the area of interest and in direct contact with what is being measured. Dropsondes and radiosondes are instruments used to collect high-vertical-resolution profiles of the atmosphere. Dropsondes are deployed from aircraft and, as they descend, collect pressure, temperature and humidity data at a half-second rate, and GPS wind data at a quarter-second rate. Radiosondes are used to collect high-resolution measurements of the atmosphere from the ground to approximately 30 kilometers. Carried by a large helium-filled balloon, they ascend through the atmosphere measuring pressure, temperature, relative humidity, and GPS winds at a one-second rate. Advancements in atmospheric research, technology and data assimilation techniques have driven the need for higher-quality, higher-resolution radiosonde and dropsonde data at an increasingly rapid rate. These data most notably represent a valuable resource for initializing numerical prediction models, calibrating and validating satellite retrieval techniques for atmospheric profiles, and for climatological research. The In-Situ Sensing Facility at NCAR has developed an extensive, multi-step process of quality control (QC). Traditionally, QC has been a time-intensive process that involves evaluating data products using a variety of visualization tools and statistical methods. With a greater need for real-time data in the field and a reduced turn-around time for final quality-controlled data, new and improved procedures for streamlining statistical analysis and QC are being implemented. Improvements have also been made on two fronts regarding implementation of a comprehensive data management plan.
The first was ensuring ease of data accessibility through an intuitive centralized data archive system that both keeps a record of data users and assigns digital object identifiers to each unique data

  2. 5-Fluorouracil, leucovorin, and oxaliplatin (mFOLFOX6) plus sunitinib or bevacizumab as first-line treatment for metastatic colorectal cancer: a randomized Phase IIb study

    PubMed Central

    Hecht, J Randolph; Mitchell, Edith P; Yoshino, Takayuki; Welslau, Manfred; Lin, Xun; Chow Maneval, Edna; Paolini, Jolanda; Lechuga, Maria Jose; Kretzschmar, Albrecht

    2015-01-01

    Background Sunitinib is an oral inhibitor of tyrosine kinase receptors implicated in tumor proliferation, angiogenesis, and metastasis. In this randomized, multicenter, open-label Phase IIb study, sunitinib plus mFOLFOX6 (oxaliplatin plus leucovorin plus 5-fluorouracil) was compared with bevacizumab plus mFOLFOX6 as first-line therapy in patients with metastatic colorectal cancer. Methods Patients were stratified by performance status, baseline lactate dehydrogenase level, and prior adjuvant treatment, and randomized 1:1 to receive sunitinib 37.5 mg/day for 4 weeks on and 2 weeks off plus mFOLFOX6 every 2 weeks or bevacizumab 5 mg/kg every 2 weeks plus mFOLFOX6 every 2 weeks. The primary endpoint was progression-free survival. Secondary endpoints included objective response rate, overall survival, safety, and quality of life. Results Enrollment was closed early following accrual of 191 patients, based on an interim analysis showing an inferior trend in the primary progression-free survival efficacy endpoint for sunitinib. Ninety-six patients were randomized to sunitinib plus mFOLFOX6 and 95 to bevacizumab plus mFOLFOX6. Median progression-free survival was 9.3 months and 15.4 months, respectively, but the objective response rate was similar between the study arms. Median overall survival was 23.7 months and 34.1 months, respectively. Dose reductions and interruptions were more common with sunitinib. Hematologic toxicity was more common in the sunitinib arm. Conclusion While the results of the sunitinib arm are comparable with those of previously reported FOLFOX combinations, the sunitinib-based combination was associated with more toxicity than that observed with bevacizumab and mFOLFOX6. The bevacizumab arm had an unexpectedly good outcome, and was much better than that seen in the Phase III trials. Combination therapy with sunitinib plus mFOLFOX6 is not recommended for patients with metastatic colorectal cancer. PMID:26109878

  3. Type testing the Model 6600 plus automatic TLD reader.

    PubMed

    Velbeck, K J; Luo, L Z; Streetz, K L

    2006-01-01

    The Harshaw Model 6600 Plus is a reader with a capacity for 200 TLD cards or 800 extremity cards. The new unit integrates more functionality and significantly automates the QC and calibration process compared to the Model 6600. The Model 6600 Plus was tested against the IEC 61066 (1991-2012) procedures using Harshaw TLD-700H and TLD-600H LiF:Mg,Cu,P-based TLD cards. An overview of the type-testing procedures is presented. These include batch homogeneity, detection threshold, reproducibility, linearity, self-irradiation, residue, light effects on the dosemeter, light leakage into the reader, voltage and frequency, dropping, and reader stability. The new TLD reader was found to meet all the IEC criteria by large margins and appears well suited for whole body, extremity and environmental dosimetry applications, with a high degree of dosimetric performance.

  4. Fast QC-LDPC code for free space optical communication

    NASA Astrophysics Data System (ADS)

    Wang, Jin; Zhang, Qi; Udeh, Chinonso Paschal; Wu, Rangzhong

    2017-02-01

    Free Space Optical (FSO) communication systems use the atmosphere as a propagation medium, so atmospheric turbulence introduces multiplicative noise that depends on signal intensity. In order to suppress the signal fading induced by this multiplicative noise, we propose a fast Quasi-Cyclic (QC) Low-Density Parity-Check (LDPC) code for FSO communication systems. As linear block codes based on sparse matrices, QC-LDPC codes perform extremely close to the Shannon limit. To date, studies of LDPC codes in FSO communications have mainly focused on Gaussian and Rayleigh channels; in this study, the LDPC code is designed for the atmospheric turbulence channel, which is neither Gaussian nor Rayleigh and is closer to the practical situation. Based on the characteristics of the atmospheric channel, modeled with log-normal and K distributions, we designed a special QC-LDPC code and derived the log-likelihood ratio (LLR). An irregular QC-LDPC code for fast coding, with variable rates, is proposed in this paper. The proposed code achieves the excellent performance of LDPC codes, with high efficiency at low rates, stability at high rates, and a reduced number of iterations. Belief propagation (BP) decoding shows that the bit error rate (BER) is clearly reduced as the Signal-to-Noise Ratio (SNR) increases. Therefore, LDPC channel coding can effectively improve the performance of FSO systems. Moreover, the post-decoding BER keeps decreasing as the SNR increases, with no error-floor phenomenon.
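
    To make the quasi-cyclic structure concrete, the sketch below shows a standard textbook-style QC-LDPC construction (an assumption for illustration, not the authors' code): a small base matrix of circulant shift values is expanded into a binary parity-check matrix H, where entry -1 denotes an all-zero Z x Z block and entry s >= 0 denotes the Z x Z identity matrix cyclically shifted by s columns.

```python
import numpy as np

def expand_qc_ldpc(base, Z):
    """Expand a base matrix of circulant shifts into a binary parity-check
    matrix: -1 -> zero block, s >= 0 -> identity shifted by s columns."""
    I = np.eye(Z, dtype=np.uint8)
    rows = []
    for brow in base:
        blocks = [np.zeros((Z, Z), np.uint8) if s < 0 else np.roll(I, s, axis=1)
                  for s in brow]
        rows.append(np.hstack(blocks))
    return np.vstack(rows)

# A tiny 2 x 4 base matrix with lifting factor Z = 4 yields an 8 x 16 H.
B = [[0, 1, -1, 2],
     [3, -1, 0, 1]]
H = expand_qc_ldpc(B, 4)
```

    Because every block is a circulant permutation (or zero), the code can be described by the small shift matrix alone, which is what makes QC-LDPC encoding and hardware decoding efficient.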

  5. Does quality control matter? Surface urban heat island intensity variations estimated by satellite-derived land surface temperature products

    NASA Astrophysics Data System (ADS)

    Lai, Jiameng; Zhan, Wenfeng; Huang, Fan; Quan, Jinling; Hu, Leiqiu; Gao, Lun; Ju, Weimin

    2018-05-01

    Temporally regular and spatially comprehensive monitoring of surface urban heat islands (SUHIs) was extremely difficult until the advent of satellite-based land surface temperature (LST) products. However, these LST products have relatively high errors compared to in situ measurements. This has resulted in comparatively inaccurate estimations of SUHI indicators and, consequently, may have distorted interpretations of SUHIs. Although reports have shown that LST quality is important for SUHI interpretations, systematic investigations of the response of SUHI indicators to LST quality across cities with dissimilar bioclimates are rare. To address this issue, we chose eighty-six major cities across mainland China and analyzed SUHI intensity (SUHII) derived from Moderate Resolution Imaging Spectroradiometer (MODIS) LST data. The LST-based SUHII differences due to inclusion or exclusion of MODIS quality control (QC) flags (i.e., ΔSUHII) were evaluated. Our major findings include, but are not limited to, the following four aspects: (1) SUHIIs can be significantly impacted by MODIS QC flags, and the associated QC-induced ΔSUHIIs generally accounted for 24.3% (29.9%) of the total SUHII value during the day (night); (2) the ΔSUHIIs differed between seasons, with considerable differences between transitional (spring and autumn) and extreme (summer and winter) seasons; (3) significant discrepancies also appeared between cities located in northern and southern regions, with northern cities often possessing higher annual mean ΔSUHIIs, and the internal variations of ΔSUHIIs within individual cities also showed high heterogeneity, with ΔSUHII variations that generally exceeded 5.0 K (3.0 K) in northern (southern) cities; (4) ΔSUHIIs were negatively related to SUHIIs and to cloud cover percentages (mostly in transitional seasons), with no significant relationship found in the extreme seasons. Our findings highlight the need to be extremely cautious when using LST
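
    The ΔSUHII comparison described above can be illustrated with a toy sketch: SUHII is computed as mean urban LST minus mean rural LST, once from all pixels and once keeping only pixels whose QC flag indicates good quality. The QC encoding here (0 = good, nonzero = degraded) and all numbers are simplifications for illustration, not actual MODIS QC bit fields or study values.

```python
import numpy as np

def suhii(lst, is_urban, qc=None):
    """Mean urban LST minus mean rural LST, optionally keeping only
    pixels whose (simplified) QC flag equals 0 (good quality)."""
    lst = np.asarray(lst, dtype=float)
    keep = np.ones(lst.shape, bool) if qc is None else (np.asarray(qc) == 0)
    urban = lst[keep & np.asarray(is_urban)]
    rural = lst[keep & ~np.asarray(is_urban)]
    return urban.mean() - rural.mean()

# Toy scene: three urban and three rural pixels (LST in kelvin); the third
# urban pixel carries a degraded-quality flag (e.g., cloud contamination).
lst = np.array([305.0, 306.0, 330.0, 300.0, 301.0, 299.0])
urban = np.array([True, True, True, False, False, False])
qc = np.array([0, 0, 2, 0, 0, 0])

# Delta-SUHII: the change caused by excluding the low-quality pixel.
delta = suhii(lst, urban) - suhii(lst, urban, qc)
```

    In this toy case a single contaminated urban pixel inflates the unfiltered SUHII, which is exactly the kind of QC-induced difference the study quantifies at city scale.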

  6. FPGA implementation of high-performance QC-LDPC decoder for optical communications

    NASA Astrophysics Data System (ADS)

    Zou, Ding; Djordjevic, Ivan B.

    2015-01-01

    Forward error correction is one of the key technologies enabling next-generation high-speed fiber-optic communications. Quasi-cyclic (QC) low-density parity-check (LDPC) codes have been considered one of the promising candidates due to their large coding gain and low implementation complexity. In this paper, we present our designed QC-LDPC code with girth 10 and 25% overhead based on pairwise balanced design. By FPGA-based emulation, we demonstrate that the 5-bit soft-decision LDPC decoder can achieve 11.8 dB net coding gain with no error floor at a BER of 10^-15, avoiding the use of any outer code or post-processing method. We believe that the proposed single QC-LDPC code is a promising solution for 400 Gb/s optical communication systems and beyond.

  7. Quality control of mRNP biogenesis: networking at the transcription site.

    PubMed

    Eberle, Andrea B; Visa, Neus

    2014-08-01

    Eukaryotic cells carry out quality control (QC) over the processes of RNA biogenesis to inactivate or eliminate defective transcripts, and to avoid their production. In the case of protein-coding transcripts, the quality controls can sense defects in the assembly of mRNA-protein complexes, in the processing of the precursor mRNAs, and in the sequence of open reading frames. Different types of defect are monitored by different specialized mechanisms. Some of them involve dedicated factors whose function is to identify faulty molecules and target them for degradation. Others are the result of a more subtle balance in the kinetics of opposing activities in the mRNA biogenesis pathway. One way or another, all such mechanisms hinder the expression of the defective mRNAs through processes as diverse as rapid degradation, nuclear retention and transcriptional silencing. Three major degradation systems are responsible for the destruction of the defective transcripts: the exosome, the 5'-3' exoribonucleases, and the nonsense-mediated mRNA decay (NMD) machinery. This review summarizes recent findings on the cotranscriptional quality control of mRNA biogenesis, and speculates that a protein-protein interaction network integrates multiple mRNA degradation systems with the transcription machinery. Copyright © 2014 Elsevier Ltd. All rights reserved.

  8. Raloxifene Plus Antipsychotics Versus Placebo Plus Antipsychotics in Severely Ill Decompensated Postmenopausal Women With Schizophrenia or Schizoaffective Disorder: A Randomized Controlled Trial.

    PubMed

    Weiser, Mark; Levi, Linda; Burshtein, Shimon; Hagin, Michal; Matei, Valentin P; Podea, Delia; Micluția, Ioana; Tiugan, Alexandru; Păcală, Bogdan; Grecu, Iosif Gabos; Noy, Adam; Zamora, Daisy; Davis, John M

    2017-07-01

    Several single-center studies have found raloxifene, an estrogen agonist, to be effective in ameliorating symptoms of schizophrenia in stable patients as augmentation of antipsychotics. This multicenter study assessed whether raloxifene plus antipsychotic treatment, in comparison to placebo plus antipsychotics, improves symptoms or cognition in severely ill decompensated schizophrenia patients. In this 16-week, double-blind, randomized, placebo-controlled study, 200 severely ill, decompensated postmenopausal women who met DSM-IV-TR criteria for schizophrenia or schizoaffective disorder were recruited from January 2011 to December 2012 and were randomized to receive either raloxifene 120 mg/d plus antipsychotics or placebo plus antipsychotics. The primary outcome measure was Positive and Negative Syndrome Scale (PANSS) total score at the end of the trial. The placebo plus antipsychotics group experienced statistically significant improvement in PANSS total score (P < .001) compared to the raloxifene plus antipsychotics group, using mixed models for repeated measures, with results favoring placebo by 4.5 points (95% CI, 2.3-6.7). These results were clearly outside the 95% confidence interval. This negative effect was more pronounced in patients who had more frequent relapses and in those with baseline PANSS scores of 100 or higher. There were no differences between groups in Clinical Global Impression Scale-Severity scores or Composite Brief Assessment of Cognition in Schizophrenia scores at 16 weeks (P > .3). Baseline follicle-stimulating hormone and estradiol levels did not alter the drug-placebo differences. Individuals in the active treatment arm showed worse outcome than those in the placebo arm, most likely as a result of chance variation, but the results unequivocally show no benefit of antipsychotics plus raloxifene versus antipsychotics plus placebo in this large randomized, double-blind, placebo-controlled trial in postmenopausal women. These data do not

  9. Aerobic treadmill plus Bobath walking training improves walking in subacute stroke: a randomized controlled trial.

    PubMed

    Eich, H-J; Mach, H; Werner, C; Hesse, S

    2004-09-01

    To evaluate the immediate and long-term effects of aerobic treadmill plus Bobath walking training in subacute stroke survivors compared with Bobath walking training alone. Randomized controlled trial. Rehabilitation unit. Fifty patients, first-time supratentorial stroke, stroke interval less than six weeks, Barthel Index (0-100) from 50 to 80, able to walk a minimum distance of 12 m with either intermittent help or stand-by while walking, cardiovascularly stable, minimum 50 W in bicycle ergometry, randomly allocated to two groups, A and B. Group A received 30 min of treadmill training, harness-secured and minimally supported according to patients' needs, and 30 min of physiotherapy, every workday for six weeks; speed and inclination of the treadmill were adjusted to achieve a heart rate of (HRmax - HRrest) * 0.6 + HRrest. Group B received 60 min of daily physiotherapy for six weeks. Primary outcome variables were the absolute improvement of walking velocity (m/s) and capacity (m); secondary outcomes were gross motor function including walking ability (score out of 13) and walking quality (score out of 41), blindly assessed before and after the intervention, and at follow-up three months later. Patients tolerated the aerobic training well with no side-effects, with significantly greater improvement of walking velocity and capacity both at study end (p = 0.001 versus p = 0.002) and at follow-up (p < 0.001 versus p < 0.001) in the experimental group. Between weeks 0 and 6, the experimental group improved walking speed and capacity by a mean of 0.31 m/s and 91 m, the control group by a mean of 0.16 m/s and 56 m. Between weeks 0 and 18, the experimental group improved walking speed and capacity by a mean of 0.36 m/s and 111 m, the control group by a mean of 0.15 m/s and 57 m. Gross motor function and walking quality did not differ at any time.
Aerobic treadmill plus Bobath walking training in moderately affected stroke patients was better than Bobath walking training alone with respect to the improvement

  10. Pilot studies for the North American Soil Geochemical Landscapes Project - Site selection, sampling protocols, analytical methods, and quality control protocols

    USGS Publications Warehouse

    Smith, D.B.; Woodruff, L.G.; O'Leary, R. M.; Cannon, W.F.; Garrett, R.G.; Kilburn, J.E.; Goldhaber, M.B.

    2009-01-01

    In 2004, the US Geological Survey (USGS) and the Geological Survey of Canada sampled and chemically analyzed soils along two transects across Canada and the USA in preparation for a planned soil geochemical survey of North America. This effort was a pilot study to test and refine sampling protocols, analytical methods, quality control protocols, and field logistics for the continental survey. A total of 220 sample sites were selected at approximately 40-km intervals along the two transects. The ideal sampling protocol at each site called for a sample from a depth of 0-5 cm and a composite of each of the O, A, and C horizons. The <2-mm fraction of each sample was analyzed for Al, Ca, Fe, K, Mg, Na, S, Ti, Ag, As, Ba, Be, Bi, Cd, Ce, Co, Cr, Cs, Cu, Ga, In, La, Li, Mn, Mo, Nb, Ni, P, Pb, Rb, Sb, Sc, Sn, Sr, Te, Th, Tl, U, V, W, Y, and Zn by inductively coupled plasma-mass spectrometry and inductively coupled plasma-atomic emission spectrometry following a near-total digestion in a mixture of HCl, HNO3, HClO4, and HF. Separate methods were used for Hg, Se, total C, and carbonate-C on this same size fraction. Only Ag, In, and Te had a large percentage of concentrations below the detection limit. Quality control (QC) of the analyses was monitored at three levels: the laboratory performing the analysis, the USGS QC officer, and the principal investigator for the study. This level of review resulted in an average of one QC sample for every 20 field samples, which proved to be minimally adequate for such a large-scale survey. Additional QC samples should be added to monitor within-batch quality so that no more than 10 field samples are analyzed between successive QC samples. Only Cr (77%), Y (82%), and Sb (80%) fell outside the acceptable limits of accuracy (% recovery between 85 and 115%), because of likely residence in mineral phases resistant to the acid digestion. A separate sample of 0-5-cm material was collected at each site for determination of organic compounds.
A subset

  11. Adjuvant chemotherapy with fluorouracil plus folinic acid vs gemcitabine following pancreatic cancer resection: a randomized controlled trial.

    PubMed

    Neoptolemos, John P; Stocken, Deborah D; Bassi, Claudio; Ghaneh, Paula; Cunningham, David; Goldstein, David; Padbury, Robert; Moore, Malcolm J; Gallinger, Steven; Mariette, Christophe; Wente, Moritz N; Izbicki, Jakob R; Friess, Helmut; Lerch, Markus M; Dervenis, Christos; Oláh, Attila; Butturini, Giovanni; Doi, Ryuichiro; Lind, Pehr A; Smith, David; Valle, Juan W; Palmer, Daniel H; Buckels, John A; Thompson, Joyce; McKay, Colin J; Rawcliffe, Charlotte L; Büchler, Markus W

    2010-09-08

    Adjuvant fluorouracil has been shown to be of benefit for patients with resected pancreatic cancer. Gemcitabine is known to be the most effective agent in advanced disease as well as an effective agent in patients with resected pancreatic cancer. To determine whether fluorouracil or gemcitabine is superior in terms of overall survival as adjuvant treatment following resection of pancreatic cancer. The European Study Group for Pancreatic Cancer (ESPAC)-3 trial, an open-label, phase 3, randomized controlled trial conducted in 159 pancreatic cancer centers in Europe, Australasia, Japan, and Canada. Included in ESPAC-3 version 2 were 1088 patients with pancreatic ductal adenocarcinoma who had undergone cancer resection; patients were randomized between July 2000 and January 2007 and underwent at least 2 years of follow-up. Patients received either fluorouracil plus folinic acid (folinic acid, 20 mg/m², intravenous bolus injection, followed by fluorouracil, 425 mg/m², intravenous bolus injection, given on days 1-5 every 28 days) (n = 551) or gemcitabine (1000 mg/m² intravenous infusion once a week for 3 of every 4 weeks) (n = 537) for 6 months. Primary outcome measure was overall survival; secondary measures were toxicity, progression-free survival, and quality of life. Final analysis was carried out on an intention-to-treat basis after a median of 34.2 (interquartile range, 27.1-43.4) months' follow-up after 753 deaths (69%). Median survival was 23.0 (95% confidence interval [CI], 21.1-25.0) months for patients treated with fluorouracil plus folinic acid and 23.6 (95% CI, 21.4-26.4) months for those treated with gemcitabine (χ²(1) = 0.7; P = .39; hazard ratio, 0.94 [95% CI, 0.81-1.08]). Seventy-seven patients (14%) receiving fluorouracil plus folinic acid had 97 treatment-related serious adverse events, compared with 40 patients (7.5%) receiving gemcitabine, who had 52 events (P < .001).
There were no significant differences in either progression-free survival or

  12. Haloperidol plus promethazine for psychosis-induced aggression.

    PubMed

    Huf, Gisele; Alexander, Jacob; Gandhi, Pinky; Allen, Michael H

    2016-11-25

    Health services often manage agitated or violent people, and such behaviour is particularly prevalent in emergency psychiatric services (10%). The drugs used in such situations should ensure that the person becomes calm swiftly and safely. To examine whether haloperidol plus promethazine is an effective treatment for psychosis-induced aggression. On 6 May 2015 we searched the Cochrane Schizophrenia Group's Register of Trials, which is compiled by systematic searches of major resources (including MEDLINE, EMBASE, AMED, BIOSIS, CINAHL, PsycINFO, PubMed, and registries of clinical trials) and their monthly updates, handsearches, grey literature, and conference proceedings. All randomised clinical trials with useable data focusing on haloperidol plus promethazine for psychosis-induced aggression. We independently extracted data. For binary outcomes, we calculated the risk ratio (RR) and its 95% confidence interval (CI), on an intention-to-treat basis. For continuous data, we estimated the mean difference (MD) between groups and its 95% CI. We employed a fixed-effect model for analyses. We assessed risk of bias for included studies and created 'Summary of findings' tables using GRADE. We found two new randomised controlled trials (RCTs) from the 2015 update search. The review now includes six studies, randomising 1367 participants and presenting data relevant to six comparisons. When haloperidol plus promethazine was compared with haloperidol alone for psychosis-induced aggression for the outcome not tranquil or asleep at 30 minutes, the combination treatment was clearly more effective (n = 316, 1 RCT, RR 0.65, 95% CI 0.49 to 0.87, high-quality evidence). There were 10 occurrences of acute dystonia in the haloperidol alone arm and none in the combination group. The trial was stopped early as haloperidol alone was considered to be too toxic. When haloperidol plus promethazine was compared with olanzapine, high-quality data showed both approaches to be tranquillising.
It was
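The binary-outcome effect measure used in this review (risk ratio with a 95% CI) follows the standard log-scale approximation. A sketch with hypothetical 2x2 counts, not the trial's raw data:

```python
import math

def risk_ratio(events_a, n_a, events_b, n_b, z=1.96):
    """Risk ratio of group A vs group B with a 95% CI
    using the usual log-scale standard error."""
    rr = (events_a / n_a) / (events_b / n_b)
    se = math.sqrt(1 / events_a - 1 / n_a + 1 / events_b - 1 / n_b)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical counts: 40/160 "not tranquil" in group A, 62/156 in group B
rr, lo, hi = risk_ratio(40, 160, 62, 156)
print(f"RR = {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```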

  13. Efficacy and Safety of Tazarotene 0.1% Plus Clindamycin 1% Gel Versus Adapalene 0.1% Plus Clindamycin 1% Gel in Facial Acne Vulgaris: A Randomized, Controlled Clinical Trial.

    PubMed

    Maiti, Rituparna; Sirka, Chandra Sekhar; Ashique Rahman, M A; Srinivasan, Anand; Parida, Sansita; Hota, Debasish

    2017-11-01

    Acne vulgaris is a multifactorial disorder which is ideally treated with combination therapy with topical retinoids and antibiotics. The present study was conducted to compare the efficacy and safety of tazarotene plus clindamycin against adapalene plus clindamycin in facial acne vulgaris. This study is a randomized, open-label, parallel-design clinical trial conducted on 60 patients with facial acne at the outpatient dermatology department in a tertiary healthcare center. The main outcome measures were change in the acne lesion count, Investigator's Static Global Assessment (ISGA) score, Global Acne Grading System (GAGS) score, and Acne-Specific Quality of Life Questionnaire (Acne-QoL) at the end of 4 weeks of therapy. After randomization one group (n = 30) received tazarotene 0.1% plus clindamycin 1% gel and another group (n = 30) received adapalene 0.1% plus clindamycin 1% gel for 1 month. At follow-up, all the parameters were reassessed. In both treatment regimens the total number of facial acne lesions decreased significantly. The difference in the change in the total count between the two combination regimens was also significant [6.51, 95% confidence interval (CI) 1.91-11.09, p = 0.007]. A ≥50% reduction in the total lesion count from the baseline levels was achieved by 71% of patients in the tazarotene plus clindamycin group and 22% of patients in the adapalene plus clindamycin group (p = 0.0012). The changes in inflammatory (p = 0.017) and non-inflammatory (p = 0.039) lesion counts in the tazarotene plus clindamycin group were significantly greater than in the adapalene plus clindamycin group. The change in the GAGS score was also significantly greater in the tazarotene plus clindamycin group (p = 0.003). The ISGA score improved in 17 patients in the tazarotene plus clindamycin group versus nine patients in the adapalene plus clindamycin group (p = 0.04). The change in the total quality-of-life score was found to be

  14. Evaluation of peak picking quality in LC-MS metabolomics data.

    PubMed

    Brodsky, Leonid; Moussaieff, Arieh; Shahaf, Nir; Aharoni, Asaph; Rogachev, Ilana

    2010-11-15

    The output of LC-MS metabolomics experiments consists of mass-peak intensities identified through a peak-picking/alignment procedure. Besides imperfections in biological samples and instrumentation, data accuracy is highly dependent on the applied algorithms and their parameters. Consequently, quality control (QC) is essential for further data analysis. Here, we present a QC approach that is based on discrepancies between replicate samples. First, quantile normalization of per-sample log-signal distributions is applied to each group of biologically homogeneous samples. Next, the overall quality of each replicate group is characterized by the Z-transformed correlation coefficients between samples. This general QC allows a tuning of the procedure's parameters that minimizes the inter-replicate discrepancies in the generated output. Subsequently, an in-depth QC measure detects local neighborhoods on a template of aligned chromatograms that are enriched by divergences between intensity profiles of replicate samples. These neighborhoods are determined through a segmentation algorithm. The retention time (RT)-m/z positions of the neighborhoods with local divergences are indicative of either incorrect alignment of chromatographic features, technical problems in the chromatograms, or a true biological discrepancy between replicates for particular metabolites. We expect this method to aid in the accurate analysis of metabolomics data and in the development of new peak-picking/alignment procedures.
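The first two QC steps described (quantile normalization of per-sample log-signal distributions, then Fisher Z-transformed inter-replicate correlations) can be sketched as follows. This is a simplified illustration with synthetic data, not the authors' implementation:

```python
import numpy as np

def quantile_normalize(mat):
    """Quantile-normalize columns (samples) of a peaks x samples matrix:
    each sample's sorted values are replaced by the mean sorted profile."""
    ranks = np.argsort(np.argsort(mat, axis=0), axis=0)
    mean_profile = np.sort(mat, axis=0).mean(axis=1)
    return mean_profile[ranks]

def replicate_quality(mat):
    """Mean Fisher Z-transformed Pearson correlation between all
    pairs of replicate samples (columns); higher means better QC."""
    r = np.corrcoef(mat.T)
    iu = np.triu_indices_from(r, k=1)
    return np.arctanh(r[iu]).mean()

# Synthetic replicate group: a shared log-intensity profile plus noise
rng = np.random.default_rng(0)
signal = rng.normal(8, 1, size=(200, 1))
reps = signal + rng.normal(0, 0.1, size=(200, 4))

print(round(replicate_quality(quantile_normalize(reps)), 2))
```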

  15. Quality control for federal clean water act and safe drinking water act regulatory compliance.

    PubMed

    Askew, Ed

    2013-01-01

    QC sample results are required in order to have confidence in the results from analytical tests. Some of the AOAC water methods include specific QC procedures, frequencies, and acceptance criteria. These are considered to be the minimum controls needed to perform the method successfully. Some regulatory programs, such as those in 40 CFR Part 136.7, require additional QC or have alternative acceptance limits. Essential QC measures include method calibration, reagent standardization, assessment of each analyst's capabilities, analysis of blind check samples, determination of the method's sensitivity (method detection level or quantification limit), and daily evaluation of bias, precision, and the presence of laboratory contamination or other analytical interference. The details of these procedures, their performance frequency, and expected ranges of results are set out in this manuscript. The specific regulatory requirements of 40 CFR Part 136.7 for the Clean Water Act, the laboratory certification requirements of 40 CFR Part 141 for the Safe Drinking Water Act, and the ISO 17025 accreditation requirements under The NELAC Institute are listed.
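One of the QC measures listed above, the method detection level, is defined in 40 CFR Part 136 Appendix B as the standard deviation of replicate low-level spikes scaled by a one-tailed 99% Student's t value. A minimal sketch with hypothetical replicate results:

```python
import statistics

# One-tailed Student's t at 99% confidence, keyed by degrees of freedom
# (values from standard t-tables; 7 replicates -> 6 df -> t = 3.143)
T_99 = {6: 3.143, 7: 2.998, 8: 2.896, 9: 2.821}

def method_detection_limit(replicates):
    """MDL per 40 CFR Part 136 Appendix B:
    MDL = t(n-1, 0.99) * s for >= 7 replicate low-level spikes."""
    s = statistics.stdev(replicates)
    return T_99[len(replicates) - 1] * s

# Hypothetical replicate results (ug/L) for a low-level spike
spikes = [1.9, 2.1, 2.0, 2.2, 1.8, 2.0, 2.1]
print(round(method_detection_limit(spikes), 3))
```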

  16. Evaluating Cell Processes, Quality, and Biomarkers in Pluripotent Stem Cells Using Video Bioinformatics

    PubMed Central

    Lin, Sabrina C.; Bays, Brett C.; Omaiye, Esther; Bhanu, Bir; Talbot, Prue

    2016-01-01

    There is a foundational need for quality control tools in stem cell laboratories engaged in basic research, regenerative therapies, and toxicological studies. These tools require automated methods for evaluating cell processes and quality during in vitro passaging, expansion, maintenance, and differentiation. In this paper, an unbiased, automated high-content profiling toolkit, StemCellQC, is presented that non-invasively extracts information on cell quality and cellular processes from time-lapse phase-contrast videos. Twenty four (24) morphological and dynamic features were analyzed in healthy, unhealthy, and dying human embryonic stem cell (hESC) colonies to identify those features that were affected in each group. Multiple features differed in the healthy versus unhealthy/dying groups, and these features were linked to growth, motility, and death. Biomarkers were discovered that predicted cell processes before they were detectable by manual observation. StemCellQC distinguished healthy and unhealthy/dying hESC colonies with 96% accuracy by non-invasively measuring and tracking dynamic and morphological features over 48 hours. Changes in cellular processes can be monitored by StemCellQC and predictions can be made about the quality of pluripotent stem cell colonies. This toolkit reduced the time and resources required to track multiple pluripotent stem cell colonies and eliminated handling errors and false classifications due to human bias. StemCellQC provided both user-specified and classifier-determined analysis in cases where the affected features are not intuitive or anticipated. Video analysis algorithms allowed assessment of biological phenomena using automatic detection analysis, which can aid facilities where maintaining stem cell quality and/or monitoring changes in cellular processes are essential. In the future StemCellQC can be expanded to include other features, cell types, treatments, and differentiating cells. PMID:26848582

  17. Evaluating Cell Processes, Quality, and Biomarkers in Pluripotent Stem Cells Using Video Bioinformatics.

    PubMed

    Zahedi, Atena; On, Vincent; Lin, Sabrina C; Bays, Brett C; Omaiye, Esther; Bhanu, Bir; Talbot, Prue

    2016-01-01

    There is a foundational need for quality control tools in stem cell laboratories engaged in basic research, regenerative therapies, and toxicological studies. These tools require automated methods for evaluating cell processes and quality during in vitro passaging, expansion, maintenance, and differentiation. In this paper, an unbiased, automated high-content profiling toolkit, StemCellQC, is presented that non-invasively extracts information on cell quality and cellular processes from time-lapse phase-contrast videos. Twenty four (24) morphological and dynamic features were analyzed in healthy, unhealthy, and dying human embryonic stem cell (hESC) colonies to identify those features that were affected in each group. Multiple features differed in the healthy versus unhealthy/dying groups, and these features were linked to growth, motility, and death. Biomarkers were discovered that predicted cell processes before they were detectable by manual observation. StemCellQC distinguished healthy and unhealthy/dying hESC colonies with 96% accuracy by non-invasively measuring and tracking dynamic and morphological features over 48 hours. Changes in cellular processes can be monitored by StemCellQC and predictions can be made about the quality of pluripotent stem cell colonies. This toolkit reduced the time and resources required to track multiple pluripotent stem cell colonies and eliminated handling errors and false classifications due to human bias. StemCellQC provided both user-specified and classifier-determined analysis in cases where the affected features are not intuitive or anticipated. Video analysis algorithms allowed assessment of biological phenomena using automatic detection analysis, which can aid facilities where maintaining stem cell quality and/or monitoring changes in cellular processes are essential. In the future StemCellQC can be expanded to include other features, cell types, treatments, and differentiating cells.

  18. Embankment quality and assessment of moisture control implementation : tech transfer summary.

    DOT National Transportation Integrated Search

    2016-02-01

    The motivation for this project was based on work by Iowa State University (ISU) researchers at a few recent grading projects that demonstrated embankments were being constructed outside moisture control limits, even though the contractor QC ...

  19. Multi-Site Quality Assurance Project Plan for Wisconsin Public Service Corporation, Peoples Gas Light and Coke Company, and North Shore Gas

    EPA Pesticide Factsheets

    This Multi-Site QAPP presents the organization, data quality objectives (DQOs), a set of anticipated activities, sample analysis, data handling, and specific Quality Assurance/Quality Control (QA/QC) procedures associated with studies done in EPA Region 5

  20. Effects of Data Quality on the Characterization of Aerosol Properties from Multiple Sensors

    NASA Technical Reports Server (NTRS)

    Petrenko, Maksym; Ichoku, Charles; Leptoukh, Gregory

    2011-01-01

    Cross-comparison of aerosol properties between ground-based and spaceborne measurements is an important validation technique that helps to investigate the uncertainties of aerosol products acquired using spaceborne sensors. However, it has been shown that even minor differences in the cross-characterization procedure may significantly impact the results of such validation. Of particular consideration is the quality assurance / quality control (QA/QC) information - auxiliary data indicating a "confidence" level (e.g., Bad, Fair, Good, Excellent, etc.) conferred by the retrieval algorithms on the produced data. Depending on the treatment of available QA/QC information, a cross-characterization procedure has the potential of filtering out invalid data points, such as uncertain or erroneous retrievals, which tend to reduce the credibility of such comparisons. However, under certain circumstances, even high QA/QC values may not fully guarantee the quality of the data. For example, retrievals in proximity of a cloud might be particularly perplexing for an aerosol retrieval algorithm, resulting in invalid data that, nonetheless, could be assigned a high QA/QC confidence. In this presentation, we will study the effects of several QA/QC parameters on cross-characterization of aerosol properties between the data acquired by multiple spaceborne sensors. We will utilize the Multi-sensor Aerosol Products Sampling System (MAPSS), which provides a consistent platform for multi-sensor comparison, including collocation with measurements acquired by the ground-based Aerosol Robotic Network (AERONET). The multi-sensor spaceborne data analyzed include those acquired by the Terra-MODIS, Aqua-MODIS, Terra-MISR, Aura-OMI, Parasol-POLDER, and Calipso-CALIOP satellite instruments.
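The QA/QC screening discussed above amounts to filtering collocated retrievals by their confidence flag before cross-comparison. A toy illustration (the record layout, field names, and flag values are hypothetical, not MAPSS conventions):

```python
# Hypothetical collocated retrievals; 'qa' is the retrieval confidence flag
retrievals = [
    {"aod": 0.21, "qa": 3},   # 3 = highest confidence
    {"aod": 0.85, "qa": 1},   # low confidence, e.g. a near-cloud retrieval
    {"aod": 0.19, "qa": 3},
    {"aod": 0.55, "qa": 2},
]

def screen_by_qa(records, min_qa):
    """Keep only retrievals whose QA flag meets the threshold."""
    return [r for r in records if r["qa"] >= min_qa]

best = screen_by_qa(retrievals, min_qa=3)
print([r["aod"] for r in best])  # [0.21, 0.19]
```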

  1. Quality control methods for linear accelerator radiation and mechanical axes alignment.

    PubMed

    Létourneau, Daniel; Keller, Harald; Becker, Nathan; Amin, Md Nurul; Norrlinger, Bernhard; Jaffray, David A

    2018-06-01

    The delivery accuracy of highly conformal dose distributions generated using intensity modulation and collimator, gantry, and couch degrees of freedom is directly affected by the quality of the alignment between the radiation beam and the mechanical axes of a linear accelerator. For this reason, quality control (QC) guidelines recommend a tolerance of ±1 mm for the coincidence of the radiation and mechanical isocenters. Traditional QC methods for assessment of radiation and mechanical axes alignment (based on pointer alignment) are time-consuming and complex tasks that provide limited accuracy. In this work, an automated test suite based on an analytical model of the linear accelerator motions was developed to streamline the QC of radiation and mechanical axes alignment. The proposed method used the automated analysis of megavoltage images of two simple task-specific phantoms acquired at different linear accelerator settings to determine the coincidence of the radiation and mechanical isocenters. The sensitivity and accuracy of the test suite were validated by introducing actual misalignments on a linear accelerator between the radiation axis and the mechanical axes using both beam steering and mechanical adjustments of the gantry and couch. The validation demonstrated that the new QC method can detect sub-millimeter misalignment between the radiation axis and the three mechanical axes of rotation. A displacement of the radiation source of 0.2 mm using beam steering parameters was easily detectable with the proposed collimator rotation axis test. Mechanical misalignments of the gantry and couch rotation axes of the same magnitude (0.2 mm) were also detectable using the new gantry and couch rotation axis tests. For the couch rotation axis, the phantom and test design allow detection of both translational and tilt misalignments with the radiation beam axis. For the collimator rotation axis, the test can isolate the misalignment between the beam radiation axis

  2. Potential of dynamically harmonized Fourier transform ion cyclotron resonance cell for high-throughput metabolomics fingerprinting: control of data quality.

    PubMed

    Habchi, Baninia; Alves, Sandra; Jouan-Rimbaud Bouveresse, Delphine; Appenzeller, Brice; Paris, Alain; Rutledge, Douglas N; Rathahao-Paris, Estelle

    2018-01-01

    Due to the presence of pollutants in the environment and food, the assessment of human exposure is required. This necessitates high-throughput approaches enabling large-scale analysis and, as a consequence, the use of high-performance analytical instruments to obtain highly informative metabolomic profiles. In this study, direct introduction mass spectrometry (DIMS) was performed using a Fourier transform ion cyclotron resonance (FT-ICR) instrument equipped with a dynamically harmonized cell. Data quality was evaluated based on mass resolving power (RP), mass measurement accuracy, and ion intensity drifts from the repeated injections of a quality control (QC) sample throughout the analytical process. The large DIMS data size entails the use of bioinformatic tools for the automatic selection of common ions found in all QC injections and for robustness assessment and correction of possible technical drifts. RP values greater than 10^6 and mass measurement accuracy better than 1 ppm were obtained using broadband mode, resulting in the detection of isotopic fine structure. Hence, a very accurate relative isotopic mass defect (RΔm) value was calculated. This significantly reduces the number of elemental composition (EC) candidates and greatly improves compound annotation. A very satisfactory estimate of repeatability of both peak intensity and mass measurement was demonstrated. Although a non-negligible ion intensity drift was observed for negative-ion-mode data, a normalization procedure was easily applied to correct this phenomenon. This study illustrates the performance and robustness of the dynamically harmonized FT-ICR cell to perform large-scale high-throughput metabolomic analyses in routine conditions. Graphical abstract: Analytical performance of an FT-ICR instrument equipped with a dynamically harmonized cell.
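Mass measurement accuracy of the kind quoted in this record (below 1 ppm) is conventionally expressed as a parts-per-million error against the theoretical mass. A minimal sketch (the measured value in the example is made up):

```python
def mass_error_ppm(measured, theoretical):
    """Mass measurement accuracy in parts per million (ppm)."""
    return (measured - theoretical) / theoretical * 1e6

# Example: protonated glucose [C6H12O6 + H]+, theoretical m/z 181.0707
print(round(mass_error_ppm(181.07055, 181.07070), 2))  # -0.83, i.e. sub-ppm
```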

  3. A novel construction method of QC-LDPC codes based on CRT for optical communications

    NASA Astrophysics Data System (ADS)

    Yuan, Jian-guo; Liang, Meng-qi; Wang, Yong; Lin, Jin-zhao; Pang, Yu

    2016-05-01

    A novel construction method of quasi-cyclic low-density parity-check (QC-LDPC) codes is proposed based on the Chinese remainder theorem (CRT). The method can not only increase the code length without reducing the girth, but also greatly enhance the code rate, so it is easy to construct a high-rate code. The simulation results show that at a bit error rate (BER) of 10^-7, the net coding gain (NCG) of the regular QC-LDPC(4 851, 4 546) code is 2.06 dB, 1.36 dB, 0.53 dB and 0.31 dB more than those of the classic RS(255, 239) code in ITU-T G.975, the LDPC(32 640, 30 592) code in ITU-T G.975.1, the QC-LDPC(3 664, 3 436) code constructed by the improved combining construction method based on CRT, and the irregular QC-LDPC(3 843, 3 603) code constructed by the construction method based on the Galois field (GF(q)) multiplicative group, respectively. Furthermore, all five codes have the same code rate of 0.937. Therefore, the regular QC-LDPC(4 851, 4 546) code constructed by the proposed construction method has excellent error-correction performance, and can be more suitable for optical transmission systems.
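The Chinese remainder theorem underlying the construction combines residues modulo pairwise-coprime integers into a unique residue modulo their product. A generic CRT sketch (not the paper's code-construction algorithm itself):

```python
def crt(residues, moduli):
    """Solve x = r_i (mod m_i) for pairwise-coprime moduli via the
    Chinese remainder theorem; returns x modulo the product of moduli."""
    x, m = 0, 1
    for r, mod in zip(residues, moduli):
        # Solve (x + m*t) = r (mod mod) for t via the modular inverse of m
        t = ((r - x) * pow(m, -1, mod)) % mod
        x += m * t
        m *= mod
    return x

# Classic example: x = 2 (mod 3), x = 3 (mod 5), x = 2 (mod 7)
print(crt([2, 3, 2], [3, 5, 7]))  # 23
```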

  4. [Perineal urethrostomy plus secondary urethroplasty for ultralong urethral stricture: clinical outcomes and influence on the patient's quality of life].

    PubMed

    Wang, Yong-Quan; Zhang, Heng; Shen, Wen-Hao; Li, Long-Kun; Li, Wei-Bing; Xiong, En-Qing

    2012-04-01

    To investigate the outcomes of perineal urethrostomy plus secondary urethroplasty for ultralong urethral stricture and assess its influence on the patient's quality of life. We retrospectively analyzed 54 cases of ultralong urethral stricture treated by perineal urethrostomy from 2000 to 2010. The mean age of the patients was 40 years, and the average length of stricture was 6.5 cm. We evaluated the patients' quality of life by questionnaire investigation and the clinical outcomes based on IPSS, Qmax, the necessity of urethral dilation and satisfaction of the patients. The mean Qmax of the 54 patients was (14.0 +/- 4.7) ml/min. Of the 34 cases that underwent secondary urethroplasty, 22 (64.7%) achieved a mean Qmax of (12.0 +/- 3.5) ml/min, 8 (23.5%) needed regular urethral dilatation, and 4 (11.8%) received internal urethrotomy because of restenosis. IPSS scores were 5.4 +/- 2.1 and 8.5 +/- 5.8 after perineal urethrostomy and secondary urethroplasty, respectively. Fifty of the 54 patients (92.6%) were satisfied with the results of perineal urethrostomy, and 22 of the 34 (64.7%) with the results of secondary urethroplasty. Perineal urethrostomy plus secondary urethroplasty is safe and effective for ultralong urethral stricture, and affects the patient's quality of life very little.

  5. Improving the quality of randomized controlled trials in Chinese herbal medicine, part II: control group design.

    PubMed

    Bian, Zhao-Xiang; Moher, David; Dagenais, Simon; Li, You-Ping; Liu, Liang; Wu, Tai-Xiang; Miao, Jiang-Xia

    2006-03-01

    To discuss the types of control groups in randomized controlled trials (RCTs) of Chinese herbal medicine (CHM), and to provide suggestions for improving the design of control groups in future clinical studies in this therapeutic area. A search of the Cochrane Library was conducted in July 2005 to identify RCTs of CHM, and 66 RCTs with CHM for type 2 diabetes mellitus were obtained as the basis for further analysis. Of the 66 RCTs with CHM for type 2 diabetes mellitus, 61 (92.4%) trials had both a treatment group and a control group. Twenty-seven (40.9%) RCTs compared CHM plus conventional drug vs conventional drug, 24 (36.4%) compared CHM vs conventional drug, 5 (7.6%) compared CHM vs placebo, 3 (4.5%) compared CHM plus conventional drug vs conventional drug plus placebo, 3 (4.5%) compared CHM plus conventional drug vs other CHM, 1 (1.5%) compared CHM vs no treatment, 1 (1.5%) compared CHM plus placebo vs conventional drug plus placebo, 1 (1.5%) compared CHM vs CHM plus conventional drug vs conventional drug vs placebo, and 1 (1.5%) compared CHM vs conventional drug vs CHM plus conventional drug. A variety of control groups were used in RCTs of CHM for type 2 diabetes mellitus, including placebo, active, and no-treatment control groups. Justification for selecting particular types of control groups was not provided in the trials reviewed in this study. Different control groups may be appropriate according to the study objectives, and several factors should be considered prior to selecting control groups in future RCTs of CHM. (1) Investigators of CHM who design clinical trials should understand the rationale for selecting different types of control groups; (2) Control groups for RCTs should be selected according to study objectives; (3) Active control groups should select interventions for comparisons that have the strongest evidence of efficacy and prescribe them as recommended; (4) Placebo control groups should select a placebo that mimics the physical

  6. Consumption of whole-grain cereals during weight loss: effects on dietary quality, dietary fiber, magnesium, vitamin B-6, and obesity.

    PubMed

    Melanson, Kathleen J; Angelopoulos, Theodore J; Nguyen, Von T; Martini, Margaret; Zukley, Linda; Lowndes, Joshua; Dube, Thomas J; Fiutem, Justin J; Yount, Byron W; Rippe, James M

    2006-09-01

    While various weight-management approaches produce weight loss, they may differ in dietary quality. We monitored changes in nutrient intakes in overweight and obese subjects on three different weight-management programs. Randomized clinical trial (pilot study) with two 12-week phases: phase 1, weekly counseling; phase 2, monitoring only. One hundred eighty nonsmoking, sedentary overweight and obese adults began this outpatient study; 134 (body mass index [calculated as kg/m²] = 30.9 +/- 2.4; age = 42.3 +/- 1.2 years) were used in analyses. Twenty-four weeks of exercise only (control group), hypocaloric diet plus exercise, or hypocaloric diet with fiber-rich whole-grain cereals plus exercise. At weeks 0, 12, and 24, diet quality was assessed by 3-day food records and body weight was measured. Three-way analysis of variance with repeated measures. The hypocaloric diet with fiber-rich whole-grain cereals plus exercise decreased energy intake more than exercise only (P=0.032). By week 12, the hypocaloric diet with fiber-rich whole-grain cereals plus exercise and the hypocaloric diet plus exercise decreased total fat more than exercise only, which was sustained in the hypocaloric diet with fiber-rich whole-grain cereals plus exercise at 24 weeks (P<0.001). At weeks 12 and 24, the hypocaloric diet with fiber-rich whole-grain cereals plus exercise reduced saturated fat intake more than exercise only. The hypocaloric diet with fiber-rich whole-grain cereals plus exercise increased total fiber, insoluble fiber (both P<0.001), magnesium (P=0.004), and vitamin B-6 (P=0.002) intakes more than the hypocaloric diet plus exercise and exercise only. Calcium and vitamin E intakes were inadequate in all groups. Weight loss was similar in the hypocaloric diet with fiber-rich whole-grain cereals plus exercise and the hypocaloric diet plus exercise. Weight-reduction strategies may be associated with reduced intake of micronutrients, such as calcium and vitamin E.
However, a hypocaloric diet

  7. Quality control of next-generation sequencing library through an integrative digital microfluidic platform.

    PubMed

    Thaitrong, Numrin; Kim, Hanyoup; Renzi, Ronald F; Bartsch, Michael S; Meagher, Robert J; Patel, Kamlesh D

    2012-12-01

    We have developed an automated quality control (QC) platform for next-generation sequencing (NGS) library characterization by integrating a droplet-based digital microfluidic (DMF) system with a capillary-based reagent delivery unit and a quantitative CE module. Using an in-plane capillary-DMF interface, a prepared sample droplet was actuated into position between the ground electrode and the inlet of the separation capillary to complete the circuit for an electrokinetic injection. Using a DNA ladder as an internal standard, the CE module with a compact LIF detector was capable of detecting dsDNA in the range of 5-100 pg/μL, suitable for the amount of DNA required by the Illumina Genome Analyzer sequencing platform. This DMF-CE platform consumes tenfold less sample volume than the current Agilent BioAnalyzer QC technique, preserving precious sample while providing necessary sensitivity and accuracy for optimal sequencing performance. The ability of this microfluidic system to validate NGS library preparation was demonstrated by examining the effects of limited-cycle PCR amplification on the size distribution and the yield of Illumina-compatible libraries, demonstrating that as few as ten cycles of PCR bias the size distribution of the library toward undesirable larger fragments. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
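    The quantitation step described above, using a DNA ladder as an internal standard to place dsDNA concentrations on the 5-100 pg/μL scale, can be sketched as a one-point internal-standard calibration. The peak areas and concentrations below are illustrative, not values from the paper:

```python
# Hypothetical one-point internal-standard calibration for CE-LIF dsDNA
# quantitation; all numbers here are illustrative assumptions.
ladder_conc = 25.0    # pg/uL, known concentration of the DNA-ladder standard
ladder_area = 1200.0  # detected LIF peak area of the ladder (arbitrary units)
sample_area = 2400.0  # detected LIF peak area of the library sample

response_factor = ladder_conc / ladder_area  # concentration per unit peak area
sample_conc = sample_area * response_factor  # estimated library concentration
print(sample_conc)  # -> 50.0 pg/uL, inside the 5-100 pg/uL working range
```

    Running the standard alongside each sample in this way cancels run-to-run variation in injection volume, which is the usual motivation for an internal standard in electrokinetic injection.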

  8. A short term quality control tool for biodegradable microspheres.

    PubMed

    D'Souza, Susan; Faraj, Jabar A; Dorati, Rossella; DeLuca, Patrick P

    2014-06-01

    Accelerated in vitro release testing methodology has been developed as an indicator of product performance to be used as a discriminatory quality control (QC) technique for the release of clinical and commercial batches of biodegradable microspheres. While product performance of biodegradable microspheres can be verified by in vivo and/or in vitro experiments, such evaluation can be particularly challenging because of slow polymer degradation, resulting in extended study times, labor, and expense. Three batches of Leuprolide poly(lactic-co-glycolic acid) (PLGA) microspheres having varying morphology (process variants having different particle size and specific surface area) were manufactured by the solvent extraction/evaporation technique. Tests involving in vitro release, polymer degradation and hydration of the microspheres were performed on the three batches at 55°C. In vitro peptide release at 55°C was analyzed using a previously derived modification of the Weibull function termed the modified Weibull equation (MWE). Experimental observations and data analysis confirm excellent reproducibility within and between batches of the microsphere formulations, demonstrating the predictability of the accelerated experiments at 55°C. The accelerated test method also successfully distinguished the in vitro product performance between the three batches having varying morphology (process variants), indicating that it is a suitable QC tool to discriminate product or process variants in clinical or commercial batches of microspheres. Additionally, data analysis utilized the MWE to further quantify the differences obtained from the accelerated in vitro product performance test between process variants, thereby enhancing the discriminatory power of the accelerated methodology at 55°C.
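    The abstract does not give the exact form of the modified Weibull equation, so the sketch below fits the standard three-parameter Weibull release profile F(t) = Fmax·(1 − exp(−(t/td)^β)) to synthetic accelerated-release data; the time points, noise level, and parameter values are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import curve_fit

def weibull_release(t, f_max, t_d, beta):
    """Cumulative % released at time t (standard Weibull release form)."""
    return f_max * (1.0 - np.exp(-np.power(t / t_d, beta)))

# Synthetic accelerated-release data (hours at elevated temperature);
# all values are made up for illustration.
t = np.array([2.0, 4.0, 8.0, 12.0, 24.0, 36.0, 48.0, 72.0])
rng = np.random.default_rng(0)
released = weibull_release(t, 95.0, 10.0, 1.2) + rng.normal(0.0, 0.5, t.size)

params, _ = curve_fit(weibull_release, t, released, p0=(90.0, 8.0, 1.0))
f_max, t_d, beta = params  # fitted plateau, time scale, and shape factor
```

    Comparing fitted parameters (rather than raw release curves) between batches is one way such a model can quantify differences between process variants.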

  9. A quality control system for digital elevation data

    NASA Astrophysics Data System (ADS)

    Knudsen, Thomas; Kokkendorf, Simon; Flatman, Andrew; Nielsen, Thorbjørn; Rosenkranz, Brigitte; Keller, Kristian

    2015-04-01

    In connection with the introduction of a new version of the Danish national coverage Digital Elevation Model (DK-DEM), the Danish Geodata Agency has developed a comprehensive quality control (QC) and metadata production (MP) system for LiDAR point cloud data. The architecture of the system reflects its origin in a national mapping organization where raw data deliveries are typically outsourced to external suppliers. It also reflects a design decision of aiming, whenever conceivable, at full spatial coverage tests rather than scattered sample checks. Hence, the QC procedure is split into two phases: a reception phase and an acceptance phase. The primary aim of the reception phase is to do a quick assessment of things that can typically go wrong, and which are relatively simple to check: data coverage, data density, and strip adjustment. If a data delivery passes the reception phase, the QC continues with the acceptance phase, which checks five different aspects of the point cloud data: vertical accuracy, vertical precision, horizontal accuracy, horizontal precision, and point classification correctness. The vertical descriptors are comparatively simple to measure: the vertical accuracy is checked by direct comparison with previously surveyed patches. The vertical precision is derived from the observed variance on well defined flat surface patches. These patches are automatically derived from the road centerlines registered in FOT, the official Danish map data base. The horizontal descriptors are less straightforward to measure, since potential reference material for direct comparison is typically expected to be less accurate than the LiDAR data. The solution selected is to compare photogrammetrically derived roof centerlines from FOT with LiDAR derived roof centerlines. These are constructed by taking the 3D Hough transform of a point cloud patch defined by the photogrammetrical roof polygon. 
The LiDAR derived roof centerline is then the intersection line of the two primary
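    The two vertical descriptors described above can be computed directly from point heights on a reference patch: accuracy as the mean offset against the surveyed height, precision as the spread on the flat surface. A minimal sketch with made-up heights and a hypothetical surveyed reference:

```python
import numpy as np

# Hypothetical LiDAR heights (metres) falling on one surveyed flat patch.
patch_z = np.array([42.031, 42.028, 42.035, 42.024,
                    42.030, 42.027, 42.033, 42.026])
reference_z = 42.025  # independently surveyed height of the patch

vertical_accuracy = patch_z.mean() - reference_z  # systematic offset vs. survey
vertical_precision = patch_z.std(ddof=1)          # spread on the flat surface
```

    In a full-coverage system these two statistics would be accumulated over every automatically extracted road patch rather than a single sample.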

  10. GTKDynamo: a PyMOL plug-in for QC/MM hybrid potential simulations

    PubMed Central

    Bachega, José Fernando R.; Timmers, Luís Fernando S.M.; Assirati, Lucas; Bachega, Leonardo R.; Field, Martin J.; Wymore, Troy

    2014-01-01

    Hybrid quantum chemical (QC)/molecular mechanical (MM) potentials are very powerful tools for molecular simulation. They are especially useful for studying processes in condensed phase systems, such as chemical reactions, that involve a relatively localized change in electronic structure and where the surrounding environment contributes to these changes but can be represented with more computationally efficient functional forms. Despite their utility, however, these potentials are not always straightforward to apply since the extent of significant electronic structure changes occurring in the condensed phase process may not be intuitively obvious. To facilitate their use we have developed an open-source graphical plug-in, GTKDynamo, that links the PyMOL visualization program and the pDynamo QC/MM simulation library. This article describes the implementation of GTKDynamo and its capabilities and illustrates its application to QC/MM simulations. PMID:24137667

  11. Development of the Quality Assurance/Quality Control Procedures for a Neutron Interrogation System

    NASA Astrophysics Data System (ADS)

    Obhođaš, Jasmina; Sudac, Davorin; Valković, Vladivoj

    2016-06-01

    In order to perform Quality Assurance/Quality Control (QA/QC) procedures for a system dedicated to the neutron interrogation of objects for the presence of threat materials, one needs to perform measurements of reference materials (RM), i.e. simulants having the same (or similar) atomic ratios as the real materials. It is well known that explosives, drugs, and various other benign materials contain chemical elements such as hydrogen, oxygen, carbon and nitrogen in distinctly different quantities. For example, a high carbon-to-oxygen ratio (C/O) is characteristic of drugs. Explosives can be differentiated by measurement of both C/O and nitrogen-to-oxygen (N/O) ratios. The C/N ratio of chemical warfare agents, coupled with the measurement of elements such as fluorine and phosphorus, clearly differentiates them from conventional explosives. Here we present the RM preparation, the calibration procedure, and the correlations attained between theoretical values and experimentally obtained results in laboratory conditions for the C/O and N/C ratios of prepared hexogen (RDX), TNT, DLM2, TATP, cocaine, heroin, yperite, tetranitromethane, peroxide methylethylketone, nitromethane and ethyleneglycol dinitrate simulants. We have shown that analysis of the gamma ray spectra using a simple unfolding model developed for this purpose showed good agreement with the chemical formulas of the created simulants; thus, the calibration quality was successfully tested.
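    The unfolding step, recovering elemental amounts (and from them the C/O and N/C ratios) from a measured gamma-ray spectrum, can be sketched as a linear least-squares problem. The response matrix and stoichiometry below are illustrative assumptions, not the paper's model or data:

```python
import numpy as np

# Hypothetical response matrix: each column is the normalized gamma signature
# of one element (C, N, O) over four spectral regions; values are made up.
R = np.array([
    [0.9, 0.1, 0.0],
    [0.1, 0.8, 0.1],
    [0.0, 0.1, 0.9],
    [0.2, 0.1, 0.3],
])
true_amounts = np.array([3.0, 6.0, 6.0])  # e.g. RDX (C3H6N6O6): C:N:O = 3:6:6
spectrum = R @ true_amounts               # noiseless simulated measurement

# Unfold: solve spectrum ≈ R @ amounts in the least-squares sense.
amounts, *_ = np.linalg.lstsq(R, spectrum, rcond=None)
c, n, o = amounts
c_over_o, n_over_c = c / o, n / c  # ratios compared against the formula
```

    For a noiseless consistent system the recovered ratios match the assumed stoichiometry exactly (C/O = 0.5, N/C = 2.0 here); real spectra would add noise and a goodness-of-fit check.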

  12. QUALITY SYSTEMS AND IMPLEMENTATION PLAN FOR A PILOT STUDY OF CHILDREN'S TOTAL EXPOSURE TO PERSISTENT PESTICIDES AND OTHER PERSISTENT ORGANIC POLLUTANTS (CTEPP)

    EPA Science Inventory

    The Quality System Implementation Plan (QSIP) describes the quality assurance and quality control procedures developed for the CTEPP study. It provides the QA/QC procedures used in recruitment of subjects, sample field collection, sample extraction and analysis, data storage, and...

  13. A randomized, controlled trial of oral sulfate solution plus polyethylene glycol as a bowel preparation for colonoscopy.

    PubMed

    Rex, Douglas K; McGowan, John; Cleveland, Mark vB; Di Palma, Jack A

    2014-09-01

    No bowel preparation for colonoscopy is optimal with regard to efficacy, safety, and tolerability. New options for bowel preparation are needed. To compare a new hybrid preparation consisting of a reduced dose of oral sulfate solution (OSS) plus 2 L of sulfate-free electrolyte lavage solution (SF-ELS) with 2 low-volume preparations based on polyethylene glycol electrolyte lavage solution (PEG-ELS). Two randomized, controlled trials. Twenty-four U.S. centers. A total of 737 outpatients undergoing colonoscopy. In study 1, OSS plus SF-ELS was given as a split dose, and in study 2, OSS plus SF-ELS was given in its entirety the evening before colonoscopy. In study 1, the active control was 2 L of PEG-ELS plus ascorbic acid (PEG-EA) given as a split dose. In study 2, the control was 10 mg of bisacodyl plus 2 L of SF-ELS taken the evening before colonoscopy. Rates of successful (good or excellent) bowel preparation. In study 1, the rates of successful (excellent or good) preparation with OSS plus SF-ELS and PEG-EA were identical at 93.5% for split-dose preparation. OSS plus SF-ELS was noninferior to PEG-EA (P < .001). In study 2, OSS plus SF-ELS resulted in successful preparation in 89.8% of patients compared with 83.5% with bisacodyl plus SF-ELS in a same-day preparation regimen. OSS plus SF-ELS was noninferior to bisacodyl plus SF-ELS (P < .001). In study 1, vomiting was more frequent with OSS plus SF-ELS (13.5% vs 6.7%; P = .042), and bloating was rated worse with PEG-EA (P = .025). In study 2, overall discomfort was rated worse with OSS plus SF-ELS (mean score 2.1 vs 1.8; P = .032). There were no deaths in either study and no serious adverse events considered related to the preparation. Bowel cleansing was not scored by colon segment. Adenoma detection was not compared between the regimens. OSS plus SF-ELS is a new, safe, and effective bowel preparation for colonoscopy. Copyright © 2014 American Society for Gastrointestinal Endoscopy. Published by Mosby, Inc. All rights reserved.

  14. THE MAQC PROJECT: ESTABLISHING QC METRICS AND THRESHOLDS FOR MICROARRAY QUALITY CONTROL

    EPA Science Inventory

    Microarrays represent a core technology in pharmacogenomics and toxicogenomics; however, before this technology can successfully and reliably be applied in clinical practice and regulatory decision-making, standards and quality measures need to be developed. The Microarray Qualit...

  15. Decision theory applied to image quality control in radiology.

    PubMed

    Lessa, Patrícia S; Caous, Cristofer A; Arantes, Paula R; Amaro, Edson; de Souza, Fernando M Campello

    2008-11-13

    The present work aims at the application of decision theory to radiological image quality control (QC) in the diagnostic routine. The main problem addressed in the framework of decision theory is whether to accept or reject a film lot of a radiology service. The probability of each decision for a given set of variables was obtained from the selected films. Based on a radiology service routine, a decision probability function was determined for each considered group of combined characteristics. These characteristics were related to film quality control. These parameters were framed in a set of 8 possibilities, resulting in 256 possible decision rules. In order to determine a general utility function to assess the decision risk, we used a single parameter called r. The payoffs chosen were: diagnostic result (correct/incorrect), cost (high/low), and patient satisfaction (yes/no), resulting in eight possible combinations. Depending on the value of r, more or less risk will be associated with the decision-making. The utility function was evaluated in order to determine the probability of a decision. Decisions were informed by the opinions of patients and administrators from a radiology service center. The model is a formal quantitative approach to decision-making related to medical imaging quality, providing an instrument to discriminate what is really necessary to accept or reject a film or a film lot. The method presented herein can help to assess the risk level of an incorrect radiological diagnosis decision.
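    The combinatorics above, three binary payoffs giving 8 outcome states and accept/reject rules over those states giving 2^8 = 256 decision rules, can be enumerated directly. The state probabilities and the exponential utility form u(x) = 1 − exp(−r·x) below are illustrative assumptions, since the abstract does not give the exact function of r:

```python
import math
from itertools import product

# Three binary payoffs: (diagnosis correct, cost low, patient satisfied).
states = list(product([True, False], repeat=3))        # 8 outcome states
rules = list(product(["accept", "reject"], repeat=8))  # 2^8 = 256 decision rules

# Illustrative expected utility under a hypothetical exponential utility
# u(x) = 1 - exp(-r*x), scoring each state by its count of favourable payoffs.
r = 0.5                                # assumed risk-attitude parameter
probs = [1.0 / 8.0] * 8                # made-up state probabilities
favourable = [sum(s) for s in states]  # 0..3 favourable payoffs per state
expected_utility = sum(p * (1.0 - math.exp(-r * x))
                       for p, x in zip(probs, favourable))
```

    Ranking all 256 rules by expected utility for a given r is then a direct loop over `rules`, which is the kind of exhaustive comparison a space this small permits.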

  16. Effects of switching from prandial premixed insulin therapy to basal plus two times bolus insulin therapy on glycemic control and quality of life in patients with type 2 diabetes mellitus

    PubMed Central

    Ito, Hiroyuki; Abe, Mariko; Antoku, Shinichi; Omoto, Takashi; Shinozaki, Masahiro; Nishio, Shinya; Mifune, Mizuo; Togane, Michiko

    2014-01-01

    Background: The effects of switching from prandial premixed insulin therapy (PPT) injected three times a day to basal plus two times bolus insulin therapy (B2B) on glycemic control and quality of life were investigated in patients with type 2 diabetes mellitus. Methods: The clinical course was prospectively observed during the first 16 weeks after switching to B2B (insulin glargine plus insulin glulisine before breakfast and dinner) in 27 subjects previously treated with PPT using 50/50 premixed insulin. The Diabetes Treatment Satisfaction Questionnaire (DTSQ) was administered at the start and end of the study. Results: The glycated hemoglobin (HbA1c) level (8.3%±1.8% to 8.2%±1.1%) and the DTSQ score did not change between the start and end of the study. An improvement in HbA1c level was found in nine (33%) subjects. The change in HbA1c showed a significant negative correlation with baseline HbA1c, and was significantly better in patients with a baseline HbA1c >8.0% than in those with an HbA1c ≤8.0% (−0.9±2.0 versus 0.3±0.6, respectively, P=0.02). The change in DTSQ score representing treatment satisfaction was significantly greater in patients whose HbA1c level was improved than in those in whom it was not (2.7±3.6 versus −0.8±3.5, P=0.04). Conclusion: B2B was noninferior to PPT with regard to HbA1c levels in patients with type 2 diabetes mellitus. B2B should be considered particularly for subjects whose glycemic control is poor despite PPT. PMID:24790413

  17. Effects of switching from prandial premixed insulin therapy to basal plus two times bolus insulin therapy on glycemic control and quality of life in patients with type 2 diabetes mellitus.

    PubMed

    Ito, Hiroyuki; Abe, Mariko; Antoku, Shinichi; Omoto, Takashi; Shinozaki, Masahiro; Nishio, Shinya; Mifune, Mizuo; Togane, Michiko

    2014-01-01

    The effects of switching from prandial premixed insulin therapy (PPT) injected three times a day to basal plus two times bolus insulin therapy (B2B) on glycemic control and quality of life were investigated in patients with type 2 diabetes mellitus. The clinical course was prospectively observed during the first 16 weeks after switching to B2B (insulin glargine plus insulin glulisine before breakfast and dinner) in 27 subjects previously treated with PPT using 50/50 premixed insulin. The Diabetes Treatment Satisfaction Questionnaire (DTSQ) was administered at the start and end of the study. The glycated hemoglobin (HbA1c) level (8.3% ± 1.8% to 8.2% ± 1.1%) and the DTSQ score did not change between the start and end of the study. An improvement in HbA1c level was found in nine (33%) subjects. The change in HbA1c showed a significant negative correlation with baseline HbA1c, and was significantly better in patients with a baseline HbA1c >8.0% than in those with an HbA1c ≤ 8.0% (-0.9 ± 2.0 versus 0.3 ± 0.6, respectively, P = 0.02). The change in DTSQ score representing treatment satisfaction was significantly greater in patients whose HbA1c level was improved than in those in whom it was not (2.7 ± 3.6 versus -0.8 ± 3.5, P = 0.04). B2B was noninferior to PPT with regard to HbA1c levels in patients with type 2 diabetes mellitus. B2B should be considered particularly for subjects whose glycemic control is poor despite PPT.

  18. STAT1 is Constitutively Activated in the T/C28a2 Immortalized Juvenile Human Chondrocyte Line and Stimulated by IL-6 Plus Soluble IL-6R.

    PubMed

    Meszaros, Evan C; Malemud, Charles J

    2015-04-01

    T/C28a2 immortalized juvenile human chondrocytes were employed to determine the extent to which activation of Signal Transducers and Activators of Transcription-1 (STAT1) occurred in response to recombinant human interleukin-6 (rhIL-6) or rhIL-6 in combination with the soluble IL-6 receptor (sIL-6R). Two forms of STAT1, STAT1α and STAT1β, were identified on SDS-PAGE and western blotting with anti-STAT1 antibody. Western blotting revealed that STAT1 was constitutively phosphorylated (p-STAT1). Although incubation of T/C28a2 chondrocytes with rhIL-6 (50 ng/ml) increased p-STAT1α by Δ=22.3% after 30 min, this percent difference failed to reach significance by Chi-square analysis. Similarly, no effect of rhIL-6 (Δ=+10.7%) on p-STAT1β was seen at 30 min. In contrast, although the combination of rhIL-6 plus sIL-6R had no effect on p-STAT1α, rhIL-6 plus sIL-6R increased p-STAT1β by Δ=73.3% (p<0.0001) after 30 min compared to the control group and by Δ=56.7% (p<0.0001) compared to rhIL-6 alone. Janex-1, a Janus kinase-3-specific inhibitor (100 μM), partially reduced the effect of rhIL-6 on p-STAT1β by Δ=27.7% (p<0.05). The results of this study showed that STAT1α/STAT1β was constitutively activated in T/C28a2 chondrocytes. Although rhIL-6 increased p-STAT1β to a small extent, the combination of rhIL-6 plus sIL-6R was far more effective in stimulating STAT1β phosphorylation compared to controls or rhIL-6 alone. These data indicate that although JAK3-mediated activation of STAT1 in T/C28a2 chondrocytes may involve the IL-6/IL-6R/gp130 pathway, STAT1 activation in response to IL-6 preferentially involves IL-6 trans-signaling via sIL-6R.

  19. Assessment of in-situ test technology for construction control of base courses and embankments.

    DOT National Transportation Integrated Search

    2004-05-01

    With the coming move from an empirical to mechanistic-empirical pavement design, it is essential to improve the quality control/quality assurance (QC/QA) procedures of compacted materials from a density-based criterion to a stiffness/strength-based c...

  20. Quality control of human tissues--experience from the Indiana University Cancer Center-Lilly Research Labs human tissue bank.

    PubMed

    Sandusky, George E; Teheny, Katie Heinz; Esterman, Mike; Hanson, Jeff; Williams, Stephen D

    2007-01-01

    The success of molecular research and its applications in both the clinical and basic research arenas is strongly dependent on the collection, handling, storage, and quality control of fresh human tissue samples. This tissue bank was set up to bank fresh surgically obtained human tissue using a Clinical Annotated Tissue Database (CATD) in order to capture the associated patient clinical data and demographics, using a one-way patient encryption scheme to protect patient identification. In this study, we determined that high quality of tissue samples is imperative for both genomic and proteomic molecular research. This paper also contains a brief compilation of the literature on patient ethics, patient informed consent, patient de-identification, tissue collection, processing, and storage, as well as basic molecular research generated from the tissue bank using good clinical practices. The current applicable rules, regulations, and guidelines for handling human tissues are briefly discussed. More than 6,610 cancer patients have been consented (97% of those that were contacted by the consenter) and 16,800 tissue specimens have been banked from these patients in 9 years. All samples collected in the bank were QC'd by a pathologist. Approximately 1,550 tissue samples have been requested for use in basic, clinical, and/or biomarker cancer research studies. Each tissue aliquot removed from the bank for a research study was evaluated by a second H&E; if the samples passed this QC, they were submitted for genomic and proteomic molecular analysis. Approximately 75% of samples evaluated were of high histologic quality and used for research studies. Since 2003, we changed the patient informed consent to allow the tissue bank to gather more patient clinical follow-up information. Ninety-two percent of the patients (1,865 patients) signed the new informed consent form and agreed to be re-contacted for follow-up information on their disease state. In addition

  1. 40 CFR 98.74 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Ammonia Manufacturing § 98.74 Monitoring and QA/QC... (c)(8) of this section. (f) [Reserved] (g) If CO2 from ammonia production is used to produce urea at...

  2. 40 CFR 98.74 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Ammonia Manufacturing § 98.74 Monitoring and QA/QC... (c)(8) of this section. (f)[Reserved] (g) If CO2 from ammonia production is used to produce urea at...

  3. 40 CFR 98.74 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Ammonia Manufacturing § 98.74 Monitoring and QA/QC... (c)(8) of this section. (f)[Reserved] (g) If CO2 from ammonia production is used to produce urea at...

  4. 40 CFR 98.74 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Ammonia Manufacturing § 98.74 Monitoring and QA/QC... (c)(8) of this section. (f) [Reserved] (g) If CO2 from ammonia production is used to produce urea at...

  5. DETERMINATION OF NATIONAL DIAGNOSTIC REFERENCE LEVELS IN COMPUTED TOMOGRAPHY EXAMINATIONS OF IRAN BY A NEW QUALITY CONTROL-BASED DOSE SURVEY METHOD.

    PubMed

    Sohrabi, Mehdi; Parsi, Masoumeh; Mianji, Fereidoun

    2018-05-01

    National diagnostic reference levels (NDRLs) of Iran were determined for the four most common CT examinations: head, sinus, chest and abdomen/pelvis. A new 'quality control (QC)-based dose survey method', developed by us, was applied to 157 CT scanners in Iran (2014-15) with different slice classes, models and geographic spread across the country. The NDRLs for head, sinus, chest and abdomen/pelvis examinations are 58, 29, 12 and 14 mGy for CTDIvol and 750, 300, 300 and 650 mGy.cm for DLP, respectively. The QC-based dose survey method was further shown to be a simple, accurate and practical method for time- and cost-effective NDRL determination. One effective approach for optimizing CT examination protocols at the national level is the provision of adequate standardized training of radiologists, technicians and medical physicists on patient radiation protection principles and implementation of the DRL concept in clinical practice.
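    The abstract does not spell out the statistic behind the survey, but national DRLs are conventionally set at the 75th percentile of the surveyed dose distribution. A sketch of that convention with hypothetical per-scanner values:

```python
import numpy as np

# Hypothetical median CTDIvol values (mGy) for head CT across ten scanners;
# these numbers are illustrative, not survey data from the paper.
ctdi_vol = np.array([48.0, 52.0, 55.0, 60.0, 44.0,
                     58.0, 63.0, 50.0, 57.0, 61.0])

# DRL convention: the 75th percentile of the distribution of typical doses.
ndrl = np.percentile(ctdi_vol, 75)
print(ndrl)  # -> 59.5 mGy for these made-up values
```

    Scanners whose typical dose exceeds the resulting level are then the natural targets for protocol optimization.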

  6. Planning Risk-Based SQC Schedules for Bracketed Operation of Continuous Production Analyzers.

    PubMed

    Westgard, James O; Bayat, Hassan; Westgard, Sten A

    2018-02-01

    To minimize patient risk, "bracketed" statistical quality control (SQC) is recommended in the new CLSI guidelines for SQC (C24-Ed4). Bracketed SQC requires that a QC event both precedes and follows (brackets) a group of patient samples. In optimizing a QC schedule, the frequency of QC or run size becomes an important planning consideration to maintain quality and also facilitate responsive reporting of results from continuous operation of high production analytic systems. Different plans for optimizing a bracketed SQC schedule were investigated on the basis of Parvin's model for patient risk and CLSI C24-Ed4's recommendations for establishing QC schedules. A Sigma-metric run size nomogram was used to evaluate different QC schedules for processes of different sigma performance. For high Sigma performance, an effective SQC approach is to employ a multistage QC procedure utilizing a "startup" design at the beginning of production and a "monitor" design periodically throughout production. Example QC schedules are illustrated for applications with measurement procedures having 6-σ, 5-σ, and 4-σ performance. Continuous production analyzers that demonstrate high σ performance can be effectively controlled with multistage SQC designs that employ a startup QC event followed by periodic monitoring or bracketing QC events. Such designs can be optimized to minimize the risk of harm to patients. © 2017 American Association for Clinical Chemistry.
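    The Sigma-metric that drives the run-size planning above is computed from the allowable total error, the bias, and the imprecision of the method (the standard Westgard formulation); a minimal sketch:

```python
def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Sigma-metric = (allowable total error - |bias|) / CV, all in percent."""
    return (tea_pct - abs(bias_pct)) / cv_pct

# A method with TEa = 10%, bias = 1%, CV = 1.5% runs at 6-sigma performance,
# so a sparse "startup + monitor" bracketed SQC schedule would suffice;
# at 4 sigma, QC events must bracket much smaller run sizes.
print(sigma_metric(10.0, 1.0, 1.5))  # -> 6.0
print(sigma_metric(10.0, 2.0, 2.0))  # -> 4.0
```

    The nomogram in the paper maps this metric, together with the chosen QC rules, onto the maximum patient-sample run size between bracketing QC events.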

  7. Results of the Excreta Bioassay Quality Control Program for April 1, 2009 through March 31, 2010

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Antonio, Cheryl L.

    2012-07-19

    A total of 58 urine samples and 10 fecal samples were submitted during the report period (April 1, 2009 through March 31, 2010) to General Engineering Laboratories, South Carolina by the Hanford Internal Dosimetry Program (IDP) to check the accuracy, precision, and detection levels of their analyses. Urine analyses for Sr, 238Pu, 239Pu, 241Am, 243Am, 235U, 238U, elemental uranium and fecal analyses for 241Am, 238Pu and 239Pu were tested this year as well as four tissue samples for 238Pu, 239Pu, 241Am and 241Pu. The number of QC urine samples submitted during the report period represented 1.3% of the total samples submitted. In addition to the samples provided by IDP, GEL was also required to conduct their own QC program, and submit the results of analyses to IDP. About 33% of the analyses processed by GEL during the third year of this contract were quality control samples. GEL tested the performance of 21 radioisotopes, all of which met or exceeded the specifications in the Statement of Work within statistical uncertainty (Table 4).

  8. Results of The Excreta Bioassay Quality Control Program For April 1, 2010 Through March 31, 2011

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Antonio, Cheryl L.

    2012-07-19

    A total of 76 urine samples and 10 spiked fecal samples were submitted during the report period (April 1, 2010 through March 31, 2011) to GEL Laboratories, LLC in South Carolina by the Hanford Internal Dosimetry Program (IDP) to check the accuracy, precision, and detection levels of their analyses. Urine analyses for 14C, Sr, 238Pu, 239Pu, 241Am, 243Am, 235U, 238U, 238U-mass and fecal analyses for 241Am, 238Pu and 239Pu were tested this year. The number of QC urine samples submitted during the report period represented 1.1% of the total samples submitted. In addition to the samples provided by IDP, GEL was also required to conduct their own QC program, and submit the results of analyses to IDP. About 31% of the analyses processed by GEL during the first year of contract 112512 were quality control samples. GEL tested the performance of 23 radioisotopes, all of which met or exceeded the specifications in the Statement of Work within statistical uncertainty, except for a slightly elevated relative bias for 243,244Cm (Table 4).

  9. Effects of motivational enhancement therapy plus cognitive behaviour therapy on depressive symptoms and health-related quality of life in adults with type II diabetes mellitus: a randomised controlled trial.

    PubMed

    Huang, Chiung-Yu; Lai, Hui-Ling; Chen, Chun-I; Lu, Yung-Chuan; Li, Su-Chen; Wang, Long-Whou; Su, Yi

    2016-05-01

    This paper evaluates the effectiveness of motivational enhancement therapy plus cognitive behavioural therapy on depressive symptoms, glycosylated haemoglobin, fasting glucose, body mass index (BMI), and health-related quality of life in type II diabetes patients. A controlled trial was conducted to compare patients who received the behavioural intervention with untreated controls on measures of health outcomes. A total of 31 intervention group participants and 30 controls were selected from patients who met the inclusion criteria at a hospital-based endocrinology outpatient department. The outcome measures, including depressive symptoms, glycosylated haemoglobin, fasting glucose, BMI, and both physical and mental quality of life, were collected before (T1), immediately after (T2), and 90 days after (T3) the intervention. The experimental group showed a significant reduction in glycosylated haemoglobin, fasting glucose, and depressive symptoms and a significant increase in physical and mental quality of life at T2 and T3, while patients in the control group with usual care showed no changes over time. The behavioural intervention facilitated a significant improvement in psychological adjustment and glycemic control, thus strengthening diabetes control skills and leading to healthy outcomes. It is feasible for nurses and psychiatrists to deliver the behavioural intervention to diabetes patients to decrease their depressive symptoms. Sharing discussion and problem-solving experiences is a particularly helpful method for self-control, and these findings will be beneficial for further research.

  10. 40 CFR 98.434 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Contained in Pre-Charged Equipment or Closed-Cell Foams § 98.434 Monitoring and QA/QC requirements. (a) For... equipment or closed-cell foam in the correct quantities (metric tons) and units (kg per piece of equipment...

  11. 40 CFR 98.434 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Contained in Pre-Charged Equipment or Closed-Cell Foams § 98.434 Monitoring and QA/QC requirements. (a) For... equipment or closed-cell foam in the correct quantities (metric tons) and units (kg per piece of equipment...

  12. Scater: pre-processing, quality control, normalization and visualization of single-cell RNA-seq data in R.

    PubMed

    McCarthy, Davis J; Campbell, Kieran R; Lun, Aaron T L; Wills, Quin F

    2017-04-15

    Single-cell RNA sequencing (scRNA-seq) is increasingly used to study gene expression at the level of individual cells. However, preparing raw sequence data for further analysis is not a straightforward process. Biases, artifacts and other sources of unwanted variation are present in the data, requiring substantial time and effort to be spent on pre-processing, quality control (QC) and normalization. We have developed the R/Bioconductor package scater to facilitate rigorous pre-processing, quality control, normalization and visualization of scRNA-seq data. The package provides a convenient, flexible workflow to process raw sequencing reads into a high-quality expression dataset ready for downstream analysis. scater provides a rich suite of plotting tools for single-cell data and a flexible data structure that is compatible with existing tools and can be used as infrastructure for future software development. The open-source code, along with installation instructions, vignettes and case studies, is available through Bioconductor at http://bioconductor.org/packages/scater . davis@ebi.ac.uk. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press.
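    scater itself is an R/Bioconductor package, but the per-cell QC metrics it computes (library size, number of features detected) are easy to illustrate language-neutrally. A sketch in Python with a toy genes-by-cells counts matrix; the thresholds are illustrative, as real cutoffs are dataset-dependent:

```python
import numpy as np

# Toy counts matrix: rows are genes, columns are cells.
counts = np.array([
    [10, 0, 3],
    [ 0, 5, 0],
    [ 2, 0, 0],
    [ 1, 4, 0],
])

total_counts = counts.sum(axis=0)          # library size per cell
genes_detected = (counts > 0).sum(axis=0)  # features with nonzero counts
# Illustrative filter: drop cells with tiny libraries or few detected genes.
keep = (total_counts >= 5) & (genes_detected >= 2)
```

    Here the third cell fails both thresholds and would be excluded before normalization, which is the kind of pre-processing decision the package's diagnostic plots are meant to support.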

  13. 40 CFR 98.434 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Contained in Pre-Charged Equipment or Closed-Cell Foams § 98.434 Monitoring and QA/QC requirements. (a) For... equipment or closed-cell foam in the correct quantities and units. [74 FR 56374, Oct. 30, 2009, as amended...

  14. Quality of life with palbociclib plus fulvestrant in previously treated hormone receptor-positive, HER2-negative metastatic breast cancer: patient-reported outcomes from the PALOMA-3 trial

    PubMed Central

    Harbeck, N.; Iyer, S.; Turner, N.; Cristofanilli, M.; Ro, J.; André, F.; Loi, S.; Verma, S.; Iwata, H.; Bhattacharyya, H.; Puyana Theall, K.; Bartlett, C. H.; Loibl, S.

    2016-01-01

    Background In the PALOMA-3 study, palbociclib plus fulvestrant demonstrated improved progression-free survival compared with fulvestrant plus placebo in hormone receptor-positive, HER2− endocrine-resistant metastatic breast cancer (MBC). This analysis compared patient-reported outcomes (PROs) between the two treatment groups. Patients and methods Patients were randomized 2 : 1 to receive palbociclib 125 mg/day orally for 3 weeks followed by 1 week off (n = 347) plus fulvestrant (500 mg i.m. per standard of care) or placebo plus fulvestrant (n = 174). PROs were assessed on day 1 of cycles 1–4 and of every other subsequent cycle starting with cycle 6 using the EORTC QLQ-C30 and its breast cancer module, QLQ-BR23. High scores (range 0–100) could indicate better functioning/quality of life (QoL) or worse symptom severity. Repeated-measures mixed-effect analyses were carried out to compare on-treatment overall scores and changes from baseline between treatment groups while controlling for baseline. Between-group comparisons of time to deterioration in global QoL and pain were made using an unstratified log-rank test and Cox proportional hazards model. Results Questionnaire completion rates were high at baseline and during treatment (from baseline to cycle 14, ≥95.8% in each group completed ≥1 question on the EORTC QLQ-C30). On treatment, estimated overall global QoL scores significantly favored the palbociclib plus fulvestrant group [66.1, 95% confidence interval (CI) 64.5–67.7 versus 63.0, 95% CI 60.6–65.3; P = 0.0313]. Significantly greater improvement from baseline in pain was also observed in this group (−3.3, 95% CI −5.1 to −1.5 versus 2.0, 95% CI −0.6 to 4.6; P = 0.0011). No significant differences were observed for other QLQ-BR23 functioning domains, breast or arm symptoms. Treatment with palbociclib plus fulvestrant significantly delayed deterioration in global QoL (P < 0.025) and pain (P < 0.001) compared with fulvestrant alone. Conclusion

  15. Quality of life with palbociclib plus fulvestrant in previously treated hormone receptor-positive, HER2-negative metastatic breast cancer: patient-reported outcomes from the PALOMA-3 trial.

    PubMed

    Harbeck, N; Iyer, S; Turner, N; Cristofanilli, M; Ro, J; André, F; Loi, S; Verma, S; Iwata, H; Bhattacharyya, H; Puyana Theall, K; Bartlett, C H; Loibl, S

    2016-06-01

    In the PALOMA-3 study, palbociclib plus fulvestrant demonstrated improved progression-free survival compared with fulvestrant plus placebo in hormone receptor-positive, HER2- endocrine-resistant metastatic breast cancer (MBC). This analysis compared patient-reported outcomes (PROs) between the two treatment groups. Patients were randomized 2 : 1 to receive palbociclib 125 mg/day orally for 3 weeks followed by 1 week off (n = 347) plus fulvestrant (500 mg i.m. per standard of care) or placebo plus fulvestrant (n = 174). PROs were assessed on day 1 of cycles 1-4 and of every other subsequent cycle starting with cycle 6 using the EORTC QLQ-C30 and its breast cancer module, QLQ-BR23. High scores (range 0-100) could indicate better functioning/quality of life (QoL) or worse symptom severity. Repeated-measures mixed-effect analyses were carried out to compare on-treatment overall scores and changes from baseline between treatment groups while controlling for baseline. Between-group comparisons of time to deterioration in global QoL and pain were made using an unstratified log-rank test and Cox proportional hazards model. Questionnaire completion rates were high at baseline and during treatment (from baseline to cycle 14, ≥95.8% in each group completed ≥1 question on the EORTC QLQ-C30). On treatment, estimated overall global QoL scores significantly favored the palbociclib plus fulvestrant group [66.1, 95% confidence interval (CI) 64.5-67.7 versus 63.0, 95% CI 60.6-65.3; P = 0.0313]. Significantly greater improvement from baseline in pain was also observed in this group (-3.3, 95% CI -5.1 to -1.5 versus 2.0, 95% CI -0.6 to 4.6; P = 0.0011). No significant differences were observed for other QLQ-BR23 functioning domains, breast or arm symptoms. Treatment with palbociclib plus fulvestrant significantly delayed deterioration in global QoL (P < 0.025) and pain (P < 0.001) compared with fulvestrant alone. Palbociclib plus fulvestrant allowed patients to maintain good Qo

  16. Linking to MedlinePlus

    MedlinePlus

... want to link patients or healthcare providers from electronic health record (EHR) systems to relevant MedlinePlus information, use MedlinePlus Connect for ...

  17. Improved GMP-compliant multi-dose production and quality control of 6-[18F]fluoro-L-DOPA.

    PubMed

    Luurtsema, G; Boersma, H H; Schepers, M; de Vries, A M T; Maas, B; Zijlma, R; de Vries, E F J; Elsinga, P H

    2017-01-01

6-[18F]Fluoro-L-3,4-dihydroxyphenylalanine (FDOPA) is a frequently used radiopharmaceutical for detecting neuroendocrine and brain tumors and for the differential diagnosis of Parkinson's disease. To meet the demand for FDOPA, a high-yield GMP-compliant production method is required. This study therefore aimed to improve the FDOPA production and quality control procedures to enable distribution of the radiopharmaceutical over distances. FDOPA was prepared by electrophilic fluorination of the trimethylstannyl precursor with [18F]F2 produced from [18O]O2 via the double-shoot approach, yielding FDOPA with a higher specific activity than FDOPA synthesized using [18F]F2 produced from 20Ne. The quality control of the product was performed using a validated UPLC system and compared with quality control on a conventional HPLC system. Impurities were identified using UPLC-MS. The [18O]O2 double-shoot radionuclide production method yielded significantly more [18F]F2 with less carrier F2 than the conventional method starting from 20Ne. After adjustment of the radiolabeling parameters, substantially higher amounts of FDOPA with higher specific activity could be obtained. Quality control by UPLC was much faster and detected more side-products than HPLC. UPLC-MS showed that the most important side-product was FDOPA-quinone, rather than 6-hydroxydopa as suggested by the European Pharmacopoeia. The production and quality control of FDOPA were significantly improved by introducing the [18O]O2 double-shoot radionuclide production method and product analysis by UPLC, respectively. As a result, FDOPA is now routinely available for clinical practice and for distribution over distances.

  18. Vosaroxin plus cytarabine versus placebo plus cytarabine in patients with first relapsed or refractory acute myeloid leukaemia (VALOR): a randomised, controlled, double-blind, multinational, phase 3 study.

    PubMed

    Ravandi, Farhad; Ritchie, Ellen K; Sayar, Hamid; Lancet, Jeffrey E; Craig, Michael D; Vey, Norbert; Strickland, Stephen A; Schiller, Gary J; Jabbour, Elias; Erba, Harry P; Pigneux, Arnaud; Horst, Heinz-August; Recher, Christian; Klimek, Virginia M; Cortes, Jorge; Roboz, Gail J; Odenike, Olatoyosi; Thomas, Xavier; Havelange, Violaine; Maertens, Johan; Derigs, Hans-Günter; Heuser, Michael; Damon, Lloyd; Powell, Bayard L; Gaidano, Gianluca; Carella, Angelo-Michele; Wei, Andrew; Hogge, Donna; Craig, Adam R; Fox, Judith A; Ward, Renee; Smith, Jennifer A; Acton, Gary; Mehta, Cyrus; Stuart, Robert K; Kantarjian, Hagop M

    2015-09-01

Safe and effective treatments are urgently needed for patients with relapsed or refractory acute myeloid leukaemia. We investigated the efficacy and safety of vosaroxin, a first-in-class anticancer quinolone derivative, plus cytarabine in patients with relapsed or refractory acute myeloid leukaemia. This phase 3, double-blind, placebo-controlled trial was undertaken at 101 international sites. Eligible patients with acute myeloid leukaemia were aged 18 years or older and had refractory disease or were in first relapse after one or two cycles of previous induction chemotherapy, including at least one cycle of anthracycline (or anthracenedione) plus cytarabine. Patients were randomly assigned 1:1 to vosaroxin (90 mg/m(2) intravenously on days 1 and 4 in a first cycle; 70 mg/m(2) in subsequent cycles) plus cytarabine (1 g/m(2) intravenously on days 1-5) or placebo plus cytarabine through a central interactive voice system with a permuted block procedure stratified by disease status, age, and geographical location. All participants were masked to treatment assignment. The primary efficacy endpoint was overall survival and the primary safety endpoint was 30-day and 60-day all-cause mortality. Efficacy analyses were done by intention to treat; safety analyses included all treated patients. This study is registered with ClinicalTrials.gov, number NCT01191801. Between Dec 17, 2010, and Sept 25, 2013, 711 patients were randomly assigned to vosaroxin plus cytarabine (n=356) or placebo plus cytarabine (n=355). At the final analysis, median overall survival was 7·5 months (95% CI 6·4-8·5) in the vosaroxin plus cytarabine group and 6·1 months (5·2-7·1) in the placebo plus cytarabine group (hazard ratio 0·87, 95% CI 0·73-1·02; unstratified log-rank p=0·061; stratified p=0·024). A higher proportion of patients achieved complete remission in the vosaroxin plus cytarabine group than in the placebo plus cytarabine group (107 [30%] of 356 patients vs 58 [16%] of 355

  19. Quality Assurance of Real-Time Oceanographic Data from the Cabled Array of the Ocean Observatories Initiative

    NASA Astrophysics Data System (ADS)

    Kawka, O. E.; Nelson, J. S.; Manalang, D.; Kelley, D. S.

    2016-02-01

    The Cabled Array component of the NSF-funded Ocean Observatories Initiative (OOI) provides access to real-time physical, chemical, geological, and biological data from water column and seafloor platforms/instruments at sites spanning the southern half of the Juan de Fuca Plate. The Quality Assurance (QA) program for OOI data is designed to ensure that data products meet OOI science requirements. This overall data QA plan establishes the guidelines for assuring OOI data quality and summarizes Quality Control (QC) protocols and procedures, based on best practices, which can be utilized to ensure the highest quality data across the OOI program. This presentation will highlight, specifically, the QA/QC approach being utilized for the OOI Cabled Array infrastructure and data and will include a summary of both shipboard and shore-based protocols currently in use. Aspects addressed will be pre-deployment instrument testing and calibration checks, post-deployment and pre-recovery field verification of data, and post-recovery "as-found" testing of instruments. Examples of QA/QC data will be presented and specific cases of cabled data will be discussed in the context of quality assessments and adjustment/correction of OOI datasets overall for inherent sensor drift and/or instrument fouling.

  20. Laboratory quality management system: road to accreditation and beyond.

    PubMed

    Wadhwa, V; Rai, S; Thukral, T; Chopra, M

    2012-01-01

    This review attempts to clarify the concepts of Laboratory Quality Management System (Lab QMS) for a medical testing and diagnostic laboratory in a holistic way and hopes to expand the horizon beyond quality control (QC) and quality assurance. It provides an insight on accreditation bodies and highlights a glimpse of existing laboratory practices but essentially it takes the reader through the journey of accreditation and during the course of reading and understanding this document, prepares the laboratory for the same. Some of the areas which have not been highlighted previously include: requirement for accreditation consultants, laboratory infrastructure and scope, applying for accreditation, document preparation. This section is well supported with practical illustrations and necessary tables and exhaustive details like preparation of a standard operating procedure and a quality manual. Concept of training and privileging of staff has been clarified and a few of the QC exercises have been dealt with in a novel way. Finally, a practical advice for facing an actual third party assessment and caution needed to prevent post-assessment pitfalls has been dealt with.

  1. The need for a formalised system of Quality Control for environmental policy-science.

    PubMed

    Larcombe, Piers; Ridd, Peter

    2018-01-01

    Research science used to inform public policy decisions, herein defined as "Policy-Science", is rarely subjected to rigorous checking, testing and replication. Studies of biomedical and other sciences indicate that a considerable fraction of published peer-reviewed scientific literature, perhaps half, has significant flaws. To demonstrate the potential failings of the present approaches to scientific Quality Control (QC), we describe examples of science associated with perceived threats to the Great Barrier Reef (GBR), Australia. There appears a serious risk of efforts to improve the health of the GBR being directed inefficiently and/or away from the more serious threats. We suggest the need for a new organisation to undertake quality reviews and audits of important scientific results that underpin government spending decisions on the environment. Logically, such a body could also examine policy science in other key areas where governments rely heavily upon scientific results, such as education, health and criminology. Copyright © 2017 Elsevier Ltd. All rights reserved.

  2. Effect of gefitinib plus Chinese herbal medicine (CHM) in patients with advanced non-small-cell lung cancer: a retrospective case-control study.

    PubMed

    Yang, Xiao-Bing; Wu, Wan-Yin; Long, Shun-Qin; Deng, Hong; Pan, Zong-Qi

    2014-12-01

Some patients with non-small-cell lung cancer (NSCLC) respond well to the EGFR tyrosine kinase inhibitor gefitinib. Chinese herbal medicine (CHM) was effective in improving quality of life and prolonging overall survival in patients with NSCLC. We aimed to determine whether gefitinib plus CHM could prolong progression-free survival (PFS) or median survival time (MST) in patients with NSCLC compared with gefitinib alone. We retrospectively analyzed 159 non-small-cell lung cancer patients using a retrospective case-control design; matching factors included gender, age category (30-39, 40-49, 50-59, 60-69, 70-79), pathological stage (IIIB or IV), smoking status (never: <100 lifetime cigarettes, or ever: ≥100 lifetime cigarettes), pathology, and performance status. Among the 159 patients, 100 were treated with gefitinib (250 mg/day orally) plus CHM ("Fuzheng Kang'ai" decoction, a Chinese herbal medicine, 250 ml bid orally) and 59 were treated with gefitinib (250 mg/day orally) alone. PFS and MST were analyzed for the whole population. 58 pairs were matched successfully; one patient (treated with gefitinib), aged 27 years, could not be matched. Progression-free survival was significantly longer with gefitinib plus CHM than with gefitinib alone: median PFS was 13.1 months (95% CI 6.50-19.70) with gefitinib plus CHM versus 11.43 months (95% CI 7.95-14.91) with gefitinib (log-rank P=0.013). Median overall survival was also longer: median MST was 22.83 months (95% CI 17.51-28.16) with gefitinib plus CHM versus 18.7 months (95% CI 16.83-20.57) with gefitinib (log-rank P=0.049). The most common adverse event was rash; its incidence was 41.38% in the gefitinib plus CHM group and 24.14% in the gefitinib group (P=0.048). This case-control analysis suggests that gefitinib plus CHM prolonged PFS and MST compared with gefitinib alone in patients with NSCLC, and it is worthy of further study.

  3. A Comparison of the Performance of Efficient Data Analysis Versus Fine Particle Dose as Metrics for the Quality Control of Aerodynamic Particle Size Distributions of Orally Inhaled Pharmaceuticals.

    PubMed

    Tougas, Terrence P; Goodey, Adrian P; Hardwell, Gareth; Mitchell, Jolyon; Lyapustina, Svetlana

    2017-02-01

    The performance of two quality control (QC) tests for aerodynamic particle size distributions (APSD) of orally inhaled drug products (OIPs) is compared. One of the tests is based on the fine particle dose (FPD) metric currently expected by the European regulators. The other test, called efficient data analysis (EDA), uses the ratio of large particle mass to small particle mass (LPM/SPM), along with impactor sized mass (ISM), to detect changes in APSD for QC purposes. The comparison is based on analysis of APSD data from four products (two different pressurized metered dose inhalers (MDIs) and two dry powder inhalers (DPIs)). It is demonstrated that in each case, EDA is able to detect shifts and abnormalities that FPD misses. The lack of sensitivity on the part of FPD is due to its "aggregate" nature, since FPD is a univariate measure of all particles less than about 5 μm aerodynamic diameter, and shifts or changes within the range encompassed by this metric may go undetected. EDA is thus shown to be superior to FPD for routine control of OIP quality. This finding augments previously reported superiority of EDA compared with impactor stage groupings (favored by US regulators) for incorrect rejections (type I errors) when incorrect acceptances (type II errors) were adjusted to the same probability for both approaches. EDA is therefore proposed as a method of choice for routine quality control of OIPs in both European and US regulatory environments.
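The EDA metrics described above reduce a full aerodynamic particle size distribution to two numbers: the impactor sized mass (ISM) and the ratio of large particle mass to small particle mass (LPM/SPM), split at a boundary stage. This is a minimal sketch of that reduction; the stage cut-off diameters, masses, and boundary below are invented for illustration, not taken from the study.

```python
def eda_metrics(stage_masses, boundary):
    """Return (ISM, LPM/SPM) from per-stage impactor masses.

    stage_masses: mapping of stage cut-off diameter (um) -> collected mass
    boundary: diameter separating "large" from "small" particles
    """
    ism = sum(stage_masses.values())  # impactor sized mass: all sized stages
    lpm = sum(m for d, m in stage_masses.items() if d >= boundary)
    spm = sum(m for d, m in stage_masses.items() if d < boundary)
    return ism, lpm / spm

# Hypothetical stage masses (micrograms) for one inhaler actuation.
stages = {4.7: 12.0, 3.3: 25.0, 2.1: 30.0, 1.1: 18.0, 0.7: 9.0}
ism, ratio = eda_metrics(stages, boundary=2.1)
```

A shift within the sub-5 μm range moves mass across the boundary and changes LPM/SPM even when the aggregate fine particle dose stays constant, which is the sensitivity advantage the abstract describes.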

  4. Image processing and Quality Control for the first 10,000 brain imaging datasets from UK Biobank.

    PubMed

    Alfaro-Almagro, Fidel; Jenkinson, Mark; Bangerter, Neal K; Andersson, Jesper L R; Griffanti, Ludovica; Douaud, Gwenaëlle; Sotiropoulos, Stamatios N; Jbabdi, Saad; Hernandez-Fernandez, Moises; Vallee, Emmanuel; Vidaurre, Diego; Webster, Matthew; McCarthy, Paul; Rorden, Christopher; Daducci, Alessandro; Alexander, Daniel C; Zhang, Hui; Dragonu, Iulius; Matthews, Paul M; Miller, Karla L; Smith, Stephen M

    2018-02-01

UK Biobank is a large-scale prospective epidemiological study with all data accessible to researchers worldwide. It is currently in the process of bringing back 100,000 of the original participants for brain, heart and body MRI, carotid ultrasound and low-dose bone/fat x-ray. The brain imaging component covers six modalities (T1, T2 FLAIR, susceptibility-weighted MRI, resting fMRI, task fMRI and diffusion MRI). Raw and processed data from the first 10,000 imaged subjects have recently been released for general research access. To help convert these data into useful summary information we have developed an automated processing and QC (Quality Control) pipeline that is available for use by other researchers. In this paper we describe the pipeline in detail, following a brief overview of UK Biobank brain imaging and the acquisition protocol. We also describe several quantitative investigations carried out as part of the development of both the imaging protocol and the processing pipeline. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  5. 40 CFR 98.434 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 22 2012-07-01 2012-07-01 false Monitoring and QA/QC requirements. 98.434 Section 98.434 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Importers and Exporters of Fluorinated Greenhouse Gases...

  6. Easy parallel screening of reagent stability, quality control, and metrology in solid phase peptide synthesis (SPPS) and peptide couplings for microarrays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Achyuthan, Komandoor E.; Wheeler, David R.

Evaluating the stability of coupling reagents, quality control (QC), and surface functionalization metrology are all critical to the production of high quality peptide microarrays. We describe a broadly applicable screening technique for evaluating the fidelity of solid phase peptide synthesis (SPPS), the stability of activation/coupling reagents, and a microarray surface metrology tool. This technique was used to assess the stability of the activation reagent 1-{[1-(Cyano-2-ethoxy-2-oxo-ethylidenaminooxy)dimethylamino-morpholinomethylene]}methaneaminium hexafluorophosphate (COMU) (Sigma-Aldrich, St. Louis, MO, USA) by SPPS of Leu-Enkephalin (YGGFL) or the coupling of commercially synthesized YGGFL peptides to (3-aminopropyl)triethoxysilane-modified glass surfaces. Coupling efficiency was quantitated by fluorescence signaling based on immunoreactivity of the YGGFL motif. It was concluded that COMU solutions should be prepared fresh and used within 5 h when stored at ~23 °C and not beyond 24 h if stored refrigerated, both in closed containers. Caveats to gauging COMU stability by absorption spectroscopy are discussed. Commercial YGGFL peptides needed independent QC, due to immunoreactivity variations for the same sequence synthesized by different vendors. This technique is useful in evaluating the stability of other activation/coupling reagents besides COMU and as a metrology tool for SPPS and peptide microarrays.

  7. Easy parallel screening of reagent stability, quality control, and metrology in solid phase peptide synthesis (SPPS) and peptide couplings for microarrays

    DOE PAGES

    Achyuthan, Komandoor E.; Wheeler, David R.

    2015-08-27

Evaluating the stability of coupling reagents, quality control (QC), and surface functionalization metrology are all critical to the production of high quality peptide microarrays. We describe a broadly applicable screening technique for evaluating the fidelity of solid phase peptide synthesis (SPPS), the stability of activation/coupling reagents, and a microarray surface metrology tool. This technique was used to assess the stability of the activation reagent 1-{[1-(Cyano-2-ethoxy-2-oxo-ethylidenaminooxy)dimethylamino-morpholinomethylene]}methaneaminium hexafluorophosphate (COMU) (Sigma-Aldrich, St. Louis, MO, USA) by SPPS of Leu-Enkephalin (YGGFL) or the coupling of commercially synthesized YGGFL peptides to (3-aminopropyl)triethoxysilane-modified glass surfaces. Coupling efficiency was quantitated by fluorescence signaling based on immunoreactivity of the YGGFL motif. It was concluded that COMU solutions should be prepared fresh and used within 5 h when stored at ~23 °C and not beyond 24 h if stored refrigerated, both in closed containers. Caveats to gauging COMU stability by absorption spectroscopy are discussed. Commercial YGGFL peptides needed independent QC, due to immunoreactivity variations for the same sequence synthesized by different vendors. This technique is useful in evaluating the stability of other activation/coupling reagents besides COMU and as a metrology tool for SPPS and peptide microarrays.

  8. Quality control for normal liquid-based cytology: Rescreening, high-risk HPV targeted reviewing and/or high-risk HPV detection?

    PubMed Central

    Depuydt, Christophe E; Arbyn, Marc; Benoy, Ina H; Vandepitte, Johan; Vereecken, Annie J; Bogers, Johannes J

    2009-01-01

The objective of this prospective study was to compare the number of CIN2+ cases detected in negative cytology by different quality control (QC) methods. Full rescreening, high-risk (HR) human papillomavirus (HPV)-targeted reviewing and HR HPV detection were compared. Randomly selected negative cytology detected by BD FocalPoint™ (NFR), by guided screening of the prescreened slides which needed further review (GS) and by manual screening (MS) was used. A 3-year follow-up period was available. Full rescreening of cytology detected only 23.5% of CIN2+ cases, whereas cytological rescreening of oncogenic-positive slides (high-risk HPV-targeted reviewing) detected 7 of 17 CIN2+ cases (41.2%). Quantitative real-time PCR for 15 oncogenic HPV types detected all CIN2+ cases. Relative sensitivity to detect histological CIN2+ was 0.24 for full rescreening, 0.41 for HR-targeted reviewing and 1.00 for HR HPV detection. In more than half of the reviewed negative cytological preparations associated with histological CIN2+ cases, no morphologically abnormal cells were detected despite a positive HPV test. The visual cut-off for the detection of abnormal cytology was established at 6.5 HR HPV copies/cell. High-risk HPV detection has a higher yield for detection of CIN2+ cases than manual screening followed by 5% full review, or than targeted reviewing of smears positive for oncogenic HPV types, and shows diagnostic properties that support its use as a QC procedure in cytologic laboratories. PMID:18544049
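The relative-sensitivity figures in this record can be reproduced directly: each QC method's CIN2+ detections are divided by those of the reference method (HR HPV detection, which found all 17 cases). A minimal sketch, assuming full rescreening found 4 of 17 cases (implied by the quoted 23.5%); the dictionary names are illustrative only.

```python
# Relative sensitivity = cases detected by a method / cases detected by
# the reference method (here HR HPV detection, 17/17).
total_cin2plus = 17  # CIN2+ cases found by the reference method

detections = {
    "full rescreening": 4,           # ~23.5% of cases (assumed from the text)
    "HR HPV-targeted reviewing": 7,  # 41.2% of cases
    "HR HPV detection": 17,          # all cases
}

relative_sensitivity = {
    method: round(n / total_cin2plus, 2) for method, n in detections.items()
}
# Matches the abstract: 0.24, 0.41 and 1.0 respectively.
```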

  9. Application of QC_DR software for acceptance testing and routine quality control of direct digital radiography systems: initial experiences using the Italian Association of Physicist in Medicine quality control protocol.

    PubMed

    Nitrosi, Andrea; Bertolini, Marco; Borasi, Giovanni; Botti, Andrea; Barani, Adriana; Rivetti, Stefano; Pierotti, Luisa

    2009-12-01

    Ideally, medical x-ray imaging systems should be designed to deliver maximum image quality at an acceptable radiation risk to the patient. Quality assurance procedures are employed to ensure that these standards are maintained. A quality control protocol for direct digital radiography (DDR) systems is described and discussed. Software to automatically process and analyze the required images was developed. In this paper, the initial results obtained on equipment of different DDR manufacturers were reported. The protocol was developed to highlight even small discrepancies in standard operating performance.

  10. 40 CFR 98.364 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... by Gas Chromatography (incorporated by reference see § 98.7). All gas composition monitors shall be...-90 (Reapproved 2006) Standard Practice for Analysis of Reformed Gas by Gas Chromatography... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Manure Management § 98.364 Monitoring and QA/QC requirements...

  11. 40 CFR 98.364 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... by Gas Chromatography (incorporated by reference see § 98.7). All gas composition monitors shall be...-90 (Reapproved 2006) Standard Practice for Analysis of Reformed Gas by Gas Chromatography... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Manure Management § 98.364 Monitoring and QA/QC requirements...

  12. Quantity is nothing without quality: automated QA/QC for streaming sensor networks

    Treesearch

    John L. Campbell; Lindsey E. Rustad; John H. Porter; Jeffrey R. Taylor; Ethan W. Dereszynski; James B. Shanley; Corinna Gries; Donald L. Henshaw; Mary E. Martin; Wade. M. Sheldon; Emery R. Boose

    2013-01-01

Sensor networks are revolutionizing environmental monitoring by producing massive quantities of data that are being made publicly available in near real time. These data streams pose a challenge for ecologists because traditional approaches to quality assurance and quality control are no longer practical when confronted with the size of these data sets and the...

  13. Field assessment of dried Plasmodium falciparum samples for malaria rapid diagnostic test quality control and proficiency testing in Ethiopia.

    PubMed

    Tamiru, Afework; Boulanger, Lucy; Chang, Michelle A; Malone, Joseph L; Aidoo, Michael

    2015-01-21

Rapid diagnostic tests (RDTs) are now widely used for laboratory confirmation of suspected malaria cases to comply with the World Health Organization recommendation for universal testing before treatment. However, many malaria programmes lack quality control (QC) processes to assess RDT use under field conditions. Prior research showed the feasibility of using the dried tube specimen (DTS) method for preserving Plasmodium falciparum parasites for use as QC samples for RDTs. This study focused on the use of DTS for RDT QC and proficiency testing under field conditions. DTS were prepared using cultured P. falciparum at densities of 500 and 1,000 parasites/μL; 50 μL aliquots of these along with parasite-negative human blood controls (0 parasites/μL) were air-dried in specimen tubes and reactivity verified after rehydration. The DTS were used in a field study in the Oromia Region of Ethiopia. Replicate DTS samples containing 0, 500 and 1,000 parasites/μL were stored at 4°C at a reference laboratory and at ambient temperatures at two nearby health facilities. At weeks 0, 4, 8, 12, 16, 20, and 24, the DTS were rehydrated and tested on RDTs stored under manufacturer-recommended temperatures at the reference laboratory and on RDTs stored under site-specific conditions at the two health facilities. Reactivity of DTS stored at 4°C at the reference laboratory on RDTs stored at the reference laboratory was considered the gold standard for assessing DTS stability. A proficiency-testing panel consisting of one negative and three positive samples, monitored with a checklist, was administered at weeks 12 and 24. At all seven time points, DTS stored at both the reference laboratory and health facility were reactive on RDTs stored under the recommended temperature and under field conditions, and the DTS without malaria parasites were negative. At the reference laboratory and one health facility, a 500 parasites/μL DTS from the proficiency panel was falsely reported as negative at week 24.

  14. Clopidogrel plus aspirin versus aspirin alone for preventing cardiovascular events.

    PubMed

    Squizzato, Alessandro; Bellesini, Marta; Takeda, Andrea; Middeldorp, Saskia; Donadini, Marco Paolo

    2017-12-14

Aspirin is the prophylactic antiplatelet drug of choice for people with cardiovascular disease. Adding a second antiplatelet drug to aspirin may produce additional benefit for people at high risk and people with established cardiovascular disease. This is an update to a previously published review from 2011. To review the benefit and harm of adding clopidogrel to aspirin therapy for preventing cardiovascular events in people who have coronary disease, ischaemic cerebrovascular disease, peripheral arterial disease, or were at high risk of atherothrombotic disease, but did not have a coronary stent. We updated the searches of CENTRAL (2017, Issue 6), MEDLINE (Ovid, 1946 to 4 July 2017) and Embase (Ovid, 1947 to 3 July 2017) on 4 July 2017. We also searched ClinicalTrials.gov and the WHO ICTRP portal, and handsearched reference lists. We applied no language restrictions. We included all randomised controlled trials comparing over 30 days use of aspirin plus clopidogrel with aspirin plus placebo or aspirin alone in people with coronary disease, ischaemic cerebrovascular disease, peripheral arterial disease, or at high risk of atherothrombotic disease. We excluded studies including only people with coronary drug-eluting stent (DES) or non-DES, or both. We collected data on mortality from cardiovascular causes, all-cause mortality, fatal and non-fatal myocardial infarction, fatal and non-fatal ischaemic stroke, major and minor bleeding. The overall treatment effect was estimated by the pooled risk ratio (RR) with 95% confidence interval (CI), using a fixed-effect model (Mantel-Haenszel); we used a random-effects model in cases of moderate or severe heterogeneity (I² ≥ 30%). We assessed the quality of the evidence using the GRADE approach. We used GRADE profiler (GRADE Pro) to import data from Review Manager to create a 'Summary of findings' table. The search identified 13 studies in addition to the two studies in the previous version of our systematic review. Overall
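The fixed-effect Mantel-Haenszel pooling mentioned in this record combines per-trial 2x2 tables into one risk ratio. The sketch below shows the standard MH estimator, not the review's actual software; the two "trials" and their counts are invented for illustration.

```python
# Mantel-Haenszel pooled risk ratio:
#   RR_MH = sum(a_i * n0_i / N_i) / sum(c_i * n1_i / N_i)
# where a = events in treatment arm (size n1), c = events in control arm
# (size n0), and N = n1 + n0 for each trial.

def mantel_haenszel_rr(trials):
    """Pool per-trial tuples (events_t, total_t, events_c, total_c)."""
    num = den = 0.0
    for a, n1, c, n0 in trials:
        big_n = n1 + n0
        num += a * n0 / big_n  # treatment events weighted by control arm size
        den += c * n1 / big_n  # control events weighted by treatment arm size
    return num / den

# Hypothetical trials: (events on aspirin+clopidogrel, N; events on aspirin, N)
trials = [(30, 1000, 45, 1000), (12, 500, 20, 500)]
rr = mantel_haenszel_rr(trials)  # < 1 here, i.e. fewer events on dual therapy
```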

  15. QC operator’s nonneutral posture against musculoskeletal disorder’s (MSDs) risks

    NASA Astrophysics Data System (ADS)

    Kautsar, F.; Gustopo, D.; Achmadi, F.

    2018-04-01

    Musculoskeletal disorders refer to a gamut of inflammatory and degenerative disorders aggravated largely by the performance of work. They are a major cause of pain, disability, absenteeism and reduced productivity among workers worldwide. Although not fatal, MSDs have the potential to develop into serious injuries in the musculoskeletal system if ignored. QC operators work in nonneutral body postures. This cross-sectional study was conducted to investigate the correlation between risk assessment results of QEC and body posture calculations from Mannequin Pro. Statistical analysis was conducted using SPSS version 16.0. A validity test, a reliability test and regression analysis were conducted to compare the risk assessment output of the applied method with the nonneutral body posture simulation. All of the QEC indicators were classified as valid and reliable. The results of the simple regression analysis are back (0.326 < 4.32), shoulder/arm (8.489 > 4.32), wrist/hand (4.86 > 4.32) and neck (1.298 < 4.32). The results show that the nonneutral body posture of QC operators during work influences the risk of musculoskeletal disorders. The potential risk of musculoskeletal disorders is in the shoulder/arm and wrist/hand of the QC operator, whereas the back and neck are not affected.
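    The decision rule in the abstract compares each regression F statistic against a critical value (4.32 there). A stdlib-only sketch of the F statistic for a simple linear regression, using made-up exposure scores and symptom ratings (the data and variable names are illustrative assumptions, not the study's):

```python
def simple_regression_f(x, y):
    """F statistic for the slope in simple linear regression:
    F = (SSR / 1) / (SSE / (n - 2))."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx                 # slope estimate
    a = my - b * mx               # intercept estimate
    sse = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    ssr = sum((a + b * xi - my) ** 2 for xi in x)
    return (ssr / 1) / (sse / (n - 2))

# Hypothetical posture-exposure scores vs. symptom ratings; the study's rule
# rejects "no influence" when F exceeds the critical value (4.32 there).
F_CRIT = 4.32
x = [1, 2, 3, 4, 5, 6]
y = [2.0, 2.9, 4.2, 4.8, 6.1, 7.0]
f_stat = simple_regression_f(x, y)
significant = f_stat > F_CRIT
```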

  16. Casscf/ci Calculations for First Row Transition Metal Hydrides - the TIH(4-PHI), VH(5-DELTA), CRH(6-SIGMA-PLUS), MNH(7-SIGMA-PLUS), FEH(4,6-DELTA) and NIH(2-DELTA) States

    NASA Astrophysics Data System (ADS)

    Walch, S. P.; Bauschlicher, C. W., Jr.

    1983-04-01

    Calculations are performed for the predicted ground states of TiH(4-phi), VH(5-delta), CrH(6-sigma-plus), MnH(7-sigma-plus), FeH(4,6-delta) and NiH(2-delta). For FeH both the 6-delta and 4-delta states are studied, since both are likely candidates for the ground state. The ground state symmetries are predicted based on a combination of atomic coupling arguments and coupling of 4s(2)3d(n) and 4s(1)3d(n+1) terms in the molecular system. Electron correlation is included by a CASSCF/CI (SD) treatment. The CASSCF includes near-degeneracy effects, while correlation of the 3d electrons is included at the CI level.

  17. Development of models to estimate the subgrade and subbase layers' resilient modulus from in situ devices test results for construction control.

    DOT National Transportation Integrated Search

    2008-04-01

    The objective of this study was to develop resilient modulus prediction models for possible application in the quality control/quality assurance (QC/QA) procedures during and after the construction of pavement layers. Field and laboratory testing pro...

  18. Evaluation of the potential use of hybrid LC-MS/MS for active drug quantification applying the 'free analyte QC concept'.

    PubMed

    Jordan, Gregor; Onami, Ichio; Heinrich, Julia; Staack, Roland F

    2017-11-01

    Assessment of active drug exposure of biologics may be crucial for drug development. Typically, ligand-binding assay methods are used to provide free/active drug concentrations. To what extent hybrid LC-MS/MS procedures enable correct 'active' drug quantification is currently under consideration. Experimental & results: The relevance of appropriate extraction conditions was evaluated by a hybrid target capture immuno-affinity LC-MS/MS method using total and free/active quality controls (QCs). The rapid extraction (10 min) provided correct results, whereas overnight incubation resulted in significant overestimation of the free/active drug (monoclonal antibody) concentration. Conventional total QCs were inappropriate to determine optimal method conditions, in contrast to free/active QCs. The 'free/active analyte QC concept' enables development of appropriate extraction conditions for correct active drug quantification by hybrid LC-MS/MS.

  19. Improvement of the quality of work in a biochemistry laboratory via measurement system analysis.

    PubMed

    Chen, Ming-Shu; Liao, Chen-Mao; Wu, Ming-Hsun; Lin, Chih-Ming

    2016-10-31

    An adequate and continuous monitoring of operational variations can effectively reduce the uncertainty and enhance the quality of laboratory reports. This study applied the evaluation rule of the measurement system analysis (MSA) method to estimate the quality of work conducted in a biochemistry laboratory. Using the gauge repeatability & reproducibility (GR&R) approach, variations in quality control (QC) data among medical technicians in conducting measurements of five biochemical items, namely, serum glucose (GLU), aspartate aminotransferase (AST), uric acid (UA), sodium (Na) and chloride (Cl), were evaluated. The measurements of the five biochemical items showed different levels of variance among the different technicians, with the variances in GLU measurements being higher than those for the other four items. The ratios of precision-to-tolerance (P/T) for Na, Cl and GLU were all above 0.5, implying inadequate gauge capability. The product variation contribution of Na was large (75.45% and 31.24% in normal and abnormal QC levels, respectively), which showed that the impact of insufficient usage of reagents could not be excluded. With regard to reproducibility, high contributions (of more than 30%) of variation for the selected items were found. These high operator variation levels implied that the possibility of inadequate gauge capacity could not be excluded. The analysis of variance (ANOVA) of GR&R showed that the operator variations in GLU measurements were significant (F=5.296, P=0.001 in the normal level and F=3.399, P=0.015 in the abnormal level, respectively). In addition to operator variations, product variations of Na were also significant for both QC levels. The heterogeneity of variance for the five technicians showed significant differences for the Na and Cl measurements in the normal QC level. The accuracy of QC for five technicians was identified for further operational improvement. This study revealed that MSA can be used to evaluate product and
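    The precision-to-tolerance (P/T) ratio used in the study compares measurement-system spread to the specification window; the abstract treats values above 0.5 as inadequate gauge capability. A minimal sketch with hypothetical numbers (the multiplier k and the spec limits are assumptions for illustration, not values from the paper):

```python
def precision_to_tolerance(sigma_grr, usl, lsl, k=6.0):
    """P/T ratio: fraction of the tolerance window consumed by the
    measurement system (k = 6 covers ~99.73% of a normal spread)."""
    return k * sigma_grr / (usl - lsl)

# Hypothetical glucose QC: spec window 90-110 mg/dL, gauge SD 2.0 mg/dL.
pt = precision_to_tolerance(2.0, usl=110.0, lsl=90.0)
adequate = pt <= 0.5   # the study flags P/T above 0.5 as inadequate
```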

  20. 40 CFR 98.84 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Cement Production § 98.84 Monitoring and QA/QC requirements..., shale, iron oxide, and alumina). Facilities that opt to use the default total organic carbon factor... quantity of each category of raw materials consumed by the facility (e.g., limestone, sand, shale, iron...

  1. 40 CFR 98.84 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Cement Production § 98.84 Monitoring and QA/QC requirements..., shale, iron oxide, and alumina). Facilities that opt to use the default total organic carbon factor... quantity of each category of raw materials consumed by the facility (e.g., limestone, sand, shale, iron...

  2. 40 CFR 98.84 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Cement Production § 98.84 Monitoring and QA/QC requirements..., shale, iron oxide, and alumina). Facilities that opt to use the default total organic carbon factor... quantity of each category of raw materials consumed by the facility (e.g., limestone, sand, shale, iron...

  3. 40 CFR 98.144 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Glass Production § 98.144 Monitoring and QA/QC requirements. (a) You must measure annual amounts of carbonate-based raw materials charged to each continuous glass... calibrated scales or weigh hoppers. Total annual mass charged to glass melting furnaces at the facility shall...

  4. 40 CFR 98.74 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Relative Molecular Mass of Petroleum Oils from Viscosity Measurements (incorporated by reference, see § 98... Weight) of Hydrocarbons by Thermoelectric Measurement of Vapor Pressure (incorporated by reference, see... measurements according to the monitoring and QA/QC requirements for the Tier 3 methodology in § 98.34(b). (e...

  5. 40 CFR 98.364 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... or operator shall document the procedures used to ensure the accuracy of gas flow rate, gas... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Manure Management § 98.364 Monitoring and QA/QC requirements... fraction of total manure managed in each system component. (c) The CH4 concentration of gas from digesters...

  6. 40 CFR 98.364 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... or operator shall document the procedures used to ensure the accuracy of gas flow rate, gas... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Manure Management § 98.364 Monitoring and QA/QC requirements... fraction of total manure managed in each system component. (c) The CH4 concentration of gas from digesters...

  7. 40 CFR 98.364 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... or operator shall document the procedures used to ensure the accuracy of gas flow rate, gas... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Manure Management § 98.364 Monitoring and QA/QC requirements... fraction of total manure managed in each system component. (c) The CH4 concentration of gas from digesters...

  8. 40 CFR 98.414 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Suppliers of Industrial Greenhouse Gases § 98.414 Monitoring... or better. If the mass in paragraph (a) of this section is measured by weighing containers that...

  9. 40 CFR 98.394 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Suppliers of Petroleum Products § 98.394 Monitoring and QA/QC requirements. (a) Determination of quantity. (1) The quantity of petroleum products, natural gas liquids... product or natural gas liquid on any day of each calendar month of the reporting year in which the...

  10. Predicting the Reasons of Customer Complaints: A First Step Toward Anticipating Quality Issues of In Vitro Diagnostics Assays with Machine Learning

    PubMed Central

    Kim, James; Li, Li; Liu, Hui

    2018-01-01

    Background Vendors in the health care industry produce diagnostic systems that, through a secured connection, allow them to monitor performance almost in real time. However, challenges exist in analyzing and interpreting large volumes of noisy quality control (QC) data. As a result, some QC shifts may not be detected early enough by the vendor and instead surface as customer complaints. Objective We hypothesized that a more proactive response could be designed by utilizing the collected QC data more efficiently. Our aim is therefore to help prevent customer complaints by predicting them based on the QC data collected by in vitro diagnostic systems. Methods QC data from five select in vitro diagnostic assays were combined with the corresponding database of customer complaints over a period of 90 days. A subset of these data over the last 45 days was also analyzed to assess how the length of the training period affects predictions. We defined a set of features used to train two classifiers, one based on decision trees and the other based on adaptive boosting, and assessed model performance by cross-validation. Results The cross-validations showed classification error rates close to zero for some assays with adaptive boosting when predicting the potential cause of customer complaints. Performance was improved by shortening the training period when the volume of complaints increased. Denoising filters that reduced the number of categories to predict further improved performance, as their application simplified the prediction problem. Conclusions This novel approach to predicting customer complaints based on QC data may allow the diagnostic industry, the expected end user of our approach, to proactively identify potential product quality issues and fix these before receiving customer complaints. This represents a new step in the direction of using big data toward product quality improvement. PMID:29764796
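    The study assesses its classifiers by cross-validation. A stdlib-only sketch of k-fold misclassification estimation, with a toy decision stump standing in for the paper's decision-tree and adaptive-boosting models (data and helper names are illustrative assumptions):

```python
import random

def kfold_error(xs, ys, train_fn, k=5, seed=0):
    """Mean misclassification rate over k folds (simplified sketch)."""
    idx = list(range(len(xs)))
    random.Random(seed).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    errs = []
    for fold in folds:
        train = [i for i in idx if i not in fold]
        model = train_fn([xs[i] for i in train], [ys[i] for i in train])
        wrong = sum(model(xs[i]) != ys[i] for i in fold)
        errs.append(wrong / len(fold))
    return sum(errs) / k

# Toy "classifier": threshold on a single QC feature (a decision stump).
def train_stump(xs, ys):
    best = min(set(xs), key=lambda t: sum((x > t) != y for x, y in zip(xs, ys)))
    return lambda x: x > best

xs = [0.1, 0.2, 0.3, 0.4, 0.6, 0.7, 0.8, 0.9] * 3
ys = [False, False, False, False, True, True, True, True] * 3
err = kfold_error(xs, ys, train_stump, k=4)
```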

  11. KAT: a K-mer analysis toolkit to quality control NGS datasets and genome assemblies.

    PubMed

    Mapleson, Daniel; Garcia Accinelli, Gonzalo; Kettleborough, George; Wright, Jonathan; Clavijo, Bernardo J

    2017-02-15

    De novo assembly of whole genome shotgun (WGS) next-generation sequencing (NGS) data benefits from high-quality input with high coverage. However, in practice, determining the quality and quantity of useful reads quickly and in a reference-free manner is not trivial. Gaining a better understanding of the WGS data, and how that data is utilized by assemblers, provides useful insights that can inform the assembly process and result in better assemblies. We present the K-mer Analysis Toolkit (KAT): a multi-purpose software toolkit for reference-free quality control (QC) of WGS reads and de novo genome assemblies, primarily via their k-mer frequencies and GC composition. KAT enables users to assess levels of errors, bias and contamination at various stages of the assembly process. In this paper we highlight KAT's ability to provide valuable insights into assembly composition and quality of genome assemblies through pairwise comparison of k-mers present in both input reads and the assemblies. KAT is available under the GPLv3 license at: https://github.com/TGAC/KAT . bernardo.clavijo@earlham.ac.uk. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
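    KAT's core statistics are k-mer frequencies and GC composition. A toy illustration of both (this is not KAT's implementation, which also canonicalises strands and uses large hash structures):

```python
from collections import Counter

def kmer_counts(reads, k):
    """K-mer frequency table across a set of reads (strand-specific)."""
    counts = Counter()
    for read in reads:
        for i in range(len(read) - k + 1):
            counts[read[i:i + k]] += 1
    return counts

def gc_fraction(read):
    """Fraction of G/C bases in a read."""
    return sum(b in "GC" for b in read) / len(read)

reads = ["ACGTACGT", "ACGTTTTT"]
counts = kmer_counts(reads, k=3)   # e.g. counts["ACG"] == 3
gc = [gc_fraction(r) for r in reads]  # [0.5, 0.25]
```

    Comparing such spectra between raw reads and an assembly (as KAT does pairwise) reveals missing, duplicated or contaminant content.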

  12. The effect of FT500 Plus(®) on ovarian stimulation in PCOS women.

    PubMed

    Alviggi, Carlo; Cariati, Federica; Conforti, Alessandro; De Rosa, Pasquale; Vallone, Roberta; Strina, Ida; Pivonello, Rosario; De Placido, Giuseppe

    2016-01-01

    Both oxidative stress and polycystic ovary syndrome have been implicated in several aspects of female reproduction. In this retrospective observational study, the outcome of controlled ovarian stimulation and the follicular microenvironment of twenty-five women affected by PCOS (Group A) were explored, evaluating the effects of myo-inositol in association with antioxidant activities (FT500 Plus(®)). Twenty-five untreated PCOS women (Group B) with similar characteristics served as the control group. Although there was no difference in ovarian volume at time zero, this parameter was significantly smaller at the 5-month follow-up in Group A (11.1±0.9 versus 13.5±1; P=0.0001). Group A showed a significant increase in the number of MII oocytes (6.3±2.5 versus 4.5±2; P=0.03) and glutathione peroxidase activity in follicular fluid (15.4±6.2 versus 11±2.2; P=0.04). FT500 Plus(®) may be considered in PCOS patients for improving oocyte quality. Copyright © 2015 Elsevier Inc. All rights reserved.

  13. 77 FR 75968 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-26

    ... information unless it displays a currently valid OMB control number. Food and Nutrition Service Title: Quality... required to perform Quality Control (QC) review for the Supplemental Nutrition Assistance Program (SNAP). The FNS-380-1, Quality Control Review Schedule is for State use to collect both QC data and case...

  14. 40 CFR 98.444 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 22 2012-07-01 2012-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Geologic Sequestration of Carbon Dioxide § 98.444 Monitoring... volume of contents in all containers if you receive CO2 in containers by following the most appropriate...

  15. 40 CFR 98.444 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 21 2014-07-01 2014-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Geologic Sequestration of Carbon Dioxide § 98.444 Monitoring... volume of contents in all containers if you receive CO2 in containers by following the most appropriate...

  16. 40 CFR 98.444 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 22 2013-07-01 2013-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Geologic Sequestration of Carbon Dioxide § 98.444 Monitoring... volume of contents in all containers if you receive CO2 in containers by following the most appropriate...

  17. 40 CFR 98.394 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Suppliers of Petroleum Products § 98.394 Monitoring and QA/QC requirements. (a) Determination of quantity. (1) The quantity of petroleum products, natural gas liquids, and... each petroleum product or natural gas liquid on any day of each calendar month of the reporting year in...

  18. 40 CFR 98.394 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Suppliers of Petroleum Products § 98.394 Monitoring and QA/QC requirements. (a) Determination of quantity. (1) The quantity of petroleum products, natural gas liquids, and... each petroleum product or natural gas liquid on any day of each calendar month of the reporting year in...

  19. 40 CFR 98.394 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Suppliers of Petroleum Products § 98.394 Monitoring and QA/QC requirements. (a) Determination of quantity. (1) The quantity of petroleum products, natural gas liquids, and... or natural gas liquid on any day of each calendar month of the reporting year in which the quantity...

  20. 40 CFR 98.394 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Suppliers of Petroleum Products § 98.394 Monitoring and QA/QC requirements. (a) Determination of quantity. (1) The quantity of petroleum products, natural gas liquids, and... each petroleum product or natural gas liquid on any day of each calendar month of the reporting year in...

  1. Method for Examination and Documentation of Basic Information and Metadata from Published Reports Relevant to the Study of Stormwater Runoff Quality

    USGS Publications Warehouse

    Dionne, Shannon G.; Granato, Gregory E.; Tana, Cameron K.

    1999-01-01

    A readily accessible archive of information that is valid, current, and technically defensible is needed to make informed highway-planning, design, and management decisions. The National Highway Runoff Water-Quality Data and Methodology Synthesis (NDAMS) is a cataloging and assessment of the documentation of information relevant to highway-runoff water quality available in published reports. The report review process is based on the NDAMS review sheet, which was designed by the USGS with input from the FHWA, State transportation agencies, and the regulatory community. The report-review process is designed to determine the technical merit of the existing literature in terms of current requirements for data documentation, data quality, quality assurance and quality control (QA/QC), and technical issues that may affect the use of historical data. To facilitate the review process, the NDAMS review sheet is divided into 12 sections: (1) administrative review information, (2) investigation and report information, (3) temporal information, (4) location information, (5) water-quality-monitoring information, (6) sample-handling methods, (7) constituent information, (8) sampling focus and matrix, (9) flow monitoring methods, (10) field QA/QC, (11) laboratory, and (12) uncertainty/error analysis. This report describes the NDAMS report reviews and metadata documentation methods and provides an overview of the approach and of the quality-assurance and quality-control program used to implement the review process. Detailed information, including a glossary of relevant terms, a copy of the report-review sheets, and report-review instructions, is completely documented in a series of three appendixes included with this report. Therefore, the reviews are repeatable and the methods can be used by transportation research organizations to catalog new reports as they are published.

  2. The Use of Signal Dimensionality for Automatic QC of Seismic Array Data

    NASA Astrophysics Data System (ADS)

    Rowe, C. A.; Stead, R. J.; Begnaud, M. L.; Draganov, D.; Maceira, M.; Gomez, M.

    2014-12-01

    A significant problem in seismic array analysis is the inclusion of bad sensor channels in the beam-forming process. We are testing an approach to automated, on-the-fly quality control (QC) to aid in the identification of poorly performing sensor channels prior to beam-forming in routine event detection or location processing. The idea stems from methods used for large computer servers, where monitoring traffic at enormous numbers of nodes is impractical on a node-by-node basis, so the dimensionality of the node traffic is instead monitored for anomalies that could represent malware, cyber-attacks or other problems. The technique relies upon the use of subspace dimensionality or principal components of the overall system traffic. The subspace technique is not new to seismology, but its most common application has been limited to comparing waveforms to an a priori collection of templates for detecting highly similar events in a swarm or seismic cluster. We examine the signal dimension in a similar way to the methods that address node-traffic anomalies in large computer systems. We explore the effects of malfunctioning channels on the dimension of the data and its derivatives, and how to leverage this effect for identifying bad array elements. We show preliminary results applied to arrays in Kazakhstan (Makanchi) and Argentina (Malargue).
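    As a crude stand-in for the subspace idea (not the authors' method): on a healthy array the channels are strongly inter-correlated, so a channel whose mean correlation with the others collapses is a candidate bad element. A stdlib-only sketch on synthetic data:

```python
import math, random

def pearson(a, b):
    """Pearson correlation between two equal-length channels."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

def flag_channels(channels, threshold=0.5):
    """Flag channels whose mean correlation with the others is low."""
    bad = []
    for i, ci in enumerate(channels):
        rs = [pearson(ci, cj) for j, cj in enumerate(channels) if j != i]
        if sum(rs) / len(rs) < threshold:
            bad.append(i)
    return bad

# Synthetic array: three channels share a common signal, one is pure noise.
rng = random.Random(1)
signal = [math.sin(0.1 * t) for t in range(200)]
channels = [[s + 0.05 * rng.gauss(0, 1) for s in signal] for _ in range(3)]
channels.append([rng.gauss(0, 1) for _ in range(200)])
bad = flag_channels(channels)   # the noise-only channel is flagged
```

    A full subspace treatment would instead track how many principal components are needed to explain the array covariance, but the flagging goal is the same.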

  3. Uniform Federal Policy for Quality Assurance Project Plans: Evaluating, Assessing, and Documenting Environmental Data Collection and Use Programs. Part 1. UFP-QAPP Manual

    DTIC Science & Technology

    2004-07-01

    sampler, project manager, data reviewer, statistician, risk assessor, assessment personnel, and laboratory QC manager. In addition, a complete copy of...sample • Corrective actions to be taken if the QC sample fails these criteria • A description of how the QC data and results are to be documented and...Intergovernmental Data Quality Task Force Uniform Federal Policy for Quality Assurance Project Plans Evaluating, Assessing, and Documenting

  4. Illumina GA IIx & HiSeq 2000 Production Sequencing and QC Analysis Pipelines at the DOE Joint Genome Institute

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Daum, Christopher; Zane, Matthew; Han, James

    2011-01-31

    The U.S. Department of Energy (DOE) Joint Genome Institute's (JGI) Production Sequencing group is committed to the generation of high-quality genomic DNA sequence to support the mission areas of renewable energy generation, global carbon management, and environmental characterization and clean-up. Within the JGI's Production Sequencing group, a robust Illumina Genome Analyzer and HiSeq pipeline has been established. Optimization of the sequencer pipelines has been ongoing with the aim of continual process improvement of the laboratory workflow, reducing operational costs and project cycle times, increasing sample throughput, and improving the overall quality of the sequence generated. A sequence QC analysis pipeline has been implemented to automatically generate read- and assembly-level quality metrics. The foremost of these optimization projects, along with sequencing and operational strategies, throughput numbers, and sequencing quality results, will be presented.

  5. Revisiting Cyberbullying in Schools Using the Quality Circle Approach

    ERIC Educational Resources Information Center

    Paul, Simone; Smith, Peter K.; Blumberg, Herbert H.

    2012-01-01

    An earlier study reported the use of Quality Circles (QC) in a UK school in the context of understanding and reducing bullying and cyberbullying. Here, we report further work in the same school setting. The QC approach allows explorative analysis of problems in school settings, whereby students embark on a problem-solving exercise over a period of…

  6. [Quality assurance in diagnostic radiology using an automated system. Experience and results].

    PubMed

    Princivalli, M; Stea, L; Ordóñez, P L; Bussoli, L; Marchetti, C

    1995-05-01

    The authors report their personal experience with the use of an integrated quality control system in the radiology department. The system we used was the RTI DIGI-X Plus, a Swedish-made product, allowing a wide range of parameters to be measured on diagnostic X-ray units for general radiography, mammography and fluoroscopy. Data can be retrieved with a minimum number of measurements. The "oRTIgo" software improves the quality assurance system and ensures document compliance with international recommendations. The equipment consists of a detector unit, a processor and a display unit. The detector consists of a rotating holder with 12 combinations of metal filters of various thicknesses mounted in front of two photo-diodes covered with identical X-ray intensifying gadolinium oxysulfide screens. This unit is connected to a data acquisition system controlled by a microcomputer. Peak tube voltage and total tube filtration are derived from the ratio of detector signals. The relationship between this ratio and the measured quantity is determined by a calibration procedure. Furthermore, exposure time, "mAs" value, "mAs" linearity and exposure (or kerma in air) can be measured. Digital storage can be performed and input signals displayed. A serial interface is used to communicate with a PC for QC management purposes. An error propagation model is used to determine the inaccuracy of peak tube voltage measurements. With the DIGI-X Plus system, measurements can be carried out in a shorter time and the stored data reprocessed later on. After QA testing on 20 X-ray units in the radiology department, in vivo doses were measured using a TLD Harshaw 100 on 46 randomly selected patients undergoing chest examinations. The results are reported and analyzed following the NRPB protocol and show high agreement with the recommended values.

  7. Health-related quality of life in women with newly diagnosed polycystic ovary syndrome randomized between clomifene citrate plus metformin or clomifene citrate plus placebo.

    PubMed

    Moll, E; van Wely, M; Lambalk, C B; Bossuyt, P M M; van der Veen, F

    2012-11-01

    What is the health-related quality of life (HRQoL) in women with polycystic ovary syndrome (PCOS) undergoing ovulation induction with clomifene citrate (CC) combined with metformin compared with those using CC combined with placebo? Overall quality of life in women with PCOS treated with CC plus metformin was significantly lower than in women treated with CC plus placebo. There are no data on HRQoL in adult women who receive ovulation induction with the purpose of conceiving. Women with PCOS have higher scores on depression and anxiety scales and lower QoL scores than women without PCOS. This study was a secondary analysis of a multi-centre RCT completed between June 2001 and May 2004. The randomization was stratified per centre, and the centres received blinded, numbered containers with medication. There were 172 women available for the HRQoL assessment: 85 were allocated to metformin and 87 were allocated to placebo. The Rotterdam Symptom Checklist (RSCL), a standard self-administered questionnaire, was used to assess physical symptoms, psychological distress, activity levels and overall HRQoL. In the intention-to-treat analysis, we found differences between the treatment groups with respect to physical symptoms and overall HRQoL. Physical well-being was significantly impaired in women allocated to metformin but not in women allocated to placebo. The increase in physical symptoms in the metformin group was caused by side-effects typical of metformin, and was most pronounced at Week 1 (mean difference 12 [95% confidence interval (CI): 8-16]) and still apparent at Week 16 (mean difference 7 [95% CI: 2-12]). Overall well-being was significantly impaired in the metformin group compared with the placebo group (mean difference 13 [95% CI: 6-20]). RSCL measurements were available only for three quarters of the participants. Although the number of missing questionnaires and the baseline measurements were comparable between the treatment groups, some form of selection bias

  8. Absence of growth hormone (GH) secretion after the administration of either GH-releasing hormone (GHRH), GH-releasing peptide (GHRP-6), or GHRH plus GHRP-6 in children with neonatal pituitary stalk transection.

    PubMed

    Pombo, M; Barreiro, J; Peñalva, A; Peino, R; Dieguez, C; Casanueva, F F

    1995-11-01

    GH-releasing peptide (GHRP-6; His-D-Trp-Ala-Trp-D-Phe-Lys-NH2) is a synthetic compound that releases GH in a specific and dose-related manner through mechanisms and a point of action that are mostly unknown, but different from those of GHRH. In man, GHRP-6 is more efficacious than GHRH, and a striking synergistic action occurs when both compounds are administered together. To explain such a synergistic effect, it has been postulated, but not proven, that GHRP-6 acts through a double mechanism, with actions exerted at the pituitary and the hypothalamic level. On the other hand, patients with the syndrome of GH deficiency due to perinatal pituitary stalk transection lack any operative hypothalamic input to the pituitary. The aim of the present study was 3-fold: 1) to further understand how relevant, if at all, the hypothalamic action of GHRP-6 is for GH regulation; 2) to evaluate whether GHRP-6 plus GHRH could be a suitable diagnostic tool in children with pituitary stalk transection; and 3) to compare these results with similar published studies performed in patients with hypothalamo-pituitary disconnection, who developed the disease as adults. Seven patients with GH deficiency and different degrees of panhypopituitarism due to perinatal pituitary stalk transection and 7 age- and sex-matched normal controls were studied. The subjects underwent 3 different tests on separate occasions, being challenged with GHRH (1 microgram/kg, iv), GHRP-6 (1 microgram/kg, iv), or GHRH plus GHRP-6. GH was analyzed as the area under the curve (mean +/- SE; micrograms per L/90 min). In normal subjects, GH secretion was 1029 +/- 202 after GHRH treatment, 1221 +/- 345 after GHRP-6, and 3542 +/- 650 after GHRH plus GHRP-6; the latter value was significantly (P < 0.05) higher than the secretion elicited by GHRH or GHRP-6 alone. In the group of patients with perinatal pituitary stalk transection, the level of GH after GHRH treatment was 116 +/- 22 and was even more reduced (P < 0.05) after GHRP-6

  9. Randomized controlled trial of laparoscopic Heller myotomy plus Dor fundoplication versus Nissen fundoplication for achalasia: long-term results.

    PubMed

    Rebecchi, Fabrizio; Giaccone, Claudio; Farinella, Eleonora; Campaci, Roberto; Morino, Mario

    2008-12-01

    To compare, in a prospective randomized trial, the long-term results of laparoscopic Heller myotomy plus Dor fundoplication versus laparoscopic Heller myotomy plus floppy-Nissen fundoplication for achalasia. Anterior fundoplication is usually performed after Heller myotomy to control GER; however, the incidence of postoperative GER ranges between 10% and 30%. Total fundoplication may aid in reducing GER rates. From December 1993 to September 2002, 153 patients with achalasia underwent laparoscopic Heller myotomy plus antireflux fundoplication. Of these, 9 were excluded from the study. The remaining 144 patients were randomly assigned to 2 treatment groups: laparoscopic Heller myotomy plus anterior fundoplication (Dor procedure) or laparoscopic Heller myotomy plus total fundoplication (floppy-Nissen procedure). The primary end point was the incidence of clinical and instrumental GER after a minimum of 60 months of follow-up. The secondary end point was recurrence of dysphagia. Follow-up clinical assessments were performed at 1, 3, 12, and 60 months using a modified DeMeester Symptom Scoring System (MDSS). Esophageal manometry and 24-hour pH monitoring were performed at 3, 12, and 60 months postoperatively. Of the 144 patients originally included in the study, 138 were available for long-term analysis: 71 (51%) underwent Heller myotomy plus the Dor procedure (H + D group) and 67 (49%) Heller myotomy plus the Nissen procedure (H + N group). No mortality was observed. The mean follow-up period was 125 months. No statistically significant differences in clinical (5.6% vs. 0%) or instrumental GER (2.8% vs. 0%) were found between the 2 groups; however, a statistically significant difference in dysphagia rates was noted (2.8% vs. 15%; P < 0.001). Although both techniques achieved long-term GER control, the recurrence rate of dysphagia was significantly higher among the patients who underwent Nissen fundoplication. This evidence supports the use of Dor fundoplication as the

  10. Quality Assurance in Stem Cell Banking: Emphasis on Embryonic and Induced Pluripotent Stem Cell Banking.

    PubMed

    Kallur, Therése; Blomberg, Pontus; Stenfelt, Sonya; Tryggvason, Kristian; Hovatta, Outi

    2017-01-01

    For quality assurance (QA) in stem cell banking, a planned system is needed to ensure that the banked products, stem cells, meet the standards required for research, clinical use, and commercial biotechnological applications. QA is process oriented, avoids or minimizes unacceptable product defects, and particularly encompasses the management and operational systems of the bank, as well as the ethical and legal frameworks. Quality control (QC) is product oriented and therefore ensures the stem cells of a bank are what they are expected to be. Testing is for controlling, not assuring, product quality, and is therefore a part of QC, not QA. Like QA, QC is essential for banking cells for quality research and translational application (Schwartz et al., Lancet 379:713-720, 2012). Human embryonic stem cells (hESCs), as cells derived from donated supernumerary embryos from in vitro fertilization (IVF) therapy, are different from other stem cell types in resulting from an embryo that has had two donors. This imposes important ethical and legal constraints on the utility of the cells, which, together with quite specific culture conditions, require special attention in the QA system. Importantly, although the origin and derivation of induced pluripotent stem cells (iPSCs) differ from those of hESCs, many of the principles of QA for hESC banking are applicable to iPSC banking (Stacey et al., Cell Stem Cell 13:385-388, 2013). Furthermore, despite differences between the legal and regulatory frameworks for hESC and iPSC banking in different countries, the requirements for QA are being harmonized (Stacey et al., Cell Stem Cell 13:385-388, 2013; International Stem Cell Banking Initiative, Stem Cell Rev 5:301-314, 2009).

  11. Increasing the quantity and quality of searching for current best evidence to answer clinical questions: protocol and intervention design of the MacPLUS FS Factorial Randomized Controlled Trials.

    PubMed

    Agoritsas, Thomas; Iserman, Emma; Hobson, Nicholas; Cohen, Natasha; Cohen, Adam; Roshanov, Pavel S; Perez, Miguel; Cotoi, Chris; Parrish, Rick; Pullenayegum, Eleanor; Wilczynski, Nancy L; Iorio, Alfonso; Haynes, R Brian

    2014-09-20

    Finding current best evidence for clinical decisions remains challenging. With 3,000 new studies published every day, no single evidence-based resource provides all answers or is sufficiently updated. McMaster Premium LiteratUre Service--Federated Search (MacPLUS FS) addresses this issue by looking in multiple high quality resources simultaneously and displaying results in a one-page pyramid with the most clinically useful at the top. Yet, additional logistical and educational barriers need to be addressed to enhance point-of-care evidence retrieval. This trial seeks to test three innovative interventions, among clinicians registered to MacPLUS FS, to increase the quantity and quality of searching for current best evidence to answer clinical questions. In a user-centered approach, we designed three interventions embedded in MacPLUS FS: (A) a web-based Clinical Question Recorder; (B) an Evidence Retrieval Coach composed of eight short educational videos; (C) an Audit, Feedback and Gamification approach to evidence retrieval, based on the allocation of 'badges' and 'reputation scores.' We will conduct a randomized factorial controlled trial among all the 904 eligible medical doctors currently registered to MacPLUS FS at the hospitals affiliated with McMaster University, Canada. Postgraduate trainees (n=429) and clinical faculty/staff (n=475) will be randomized to each of the three following interventions in a factorial design (AxBxC). Utilization will be continuously recorded through clinicians’ accounts that track logins and usage, down to the level of individual keystrokes. The primary outcome is the rate of searches per month per user during the six months of follow-up. Secondary outcomes, measured through the validated Impact Assessment Method questionnaire, include: utility of answers found (meeting clinicians’ information needs), use (application in practice), and perceived usefulness on patient outcomes. Built on effective models for the point

  12. Towards machine learned quality control: A benchmark for sharpness quantification in digital pathology.

    PubMed

    Campanella, Gabriele; Rajanna, Arjun R; Corsale, Lorraine; Schüffler, Peter J; Yagi, Yukako; Fuchs, Thomas J

    2018-04-01

    Pathology is on the verge of a profound change from an analog and qualitative to a digital and quantitative discipline. This change is mostly driven by the high-throughput scanning of microscope slides in modern pathology departments, reaching tens of thousands of digital slides per month. The resulting vast digital archives form the basis of clinical use in digital pathology and allow large-scale machine learning in computational pathology. One of the most crucial bottlenecks of high-throughput scanning is quality control (QC). Currently, digital slides are screened manually to detect out-of-focus regions, to compensate for the limitations of scanner software. We present a solution to this problem by introducing a benchmark dataset for blur detection and an in-depth comparison of state-of-the-art sharpness descriptors and their prediction performance within a random forest framework. Furthermore, we show that convolutional neural networks, such as residual networks, can be used to train blur detectors from scratch. We thoroughly evaluate the accuracy of feature-based and deep learning based approaches for sharpness classification (99.74% accuracy) and regression (MSE 0.004) and additionally compare them to domain experts in a comprehensive human perception study. Our pipeline outputs spatial heatmaps that quantify and localize blurred areas on a slide. Finally, we tested the proposed framework in the clinical setting and demonstrate superior performance over the state-of-the-art QC pipeline comprising commercial software and human expert inspection, reducing the error rate from 17% to 4.7%. Copyright © 2017. Published by Elsevier Ltd.
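
    As an illustration of the kind of feature-based sharpness descriptor such benchmarks compare, a minimal variance-of-Laplacian blur score can be sketched as follows. This is a generic baseline for intuition, not one of the specific descriptors evaluated in the paper:

```python
import numpy as np

def laplacian_variance(img: np.ndarray) -> float:
    """Variance of the Laplacian response: higher means sharper.

    A common baseline sharpness descriptor; blurred tiles score lower
    because blurring removes the high-frequency content the Laplacian
    responds to.
    """
    # 3x3 Laplacian kernel applied via explicit shifts (no SciPy needed)
    lap = (-4.0 * img[1:-1, 1:-1]
           + img[:-2, 1:-1] + img[2:, 1:-1]
           + img[1:-1, :-2] + img[1:-1, 2:])
    return float(lap.var())

rng = np.random.default_rng(0)
sharp = rng.random((64, 64))                      # high-frequency "tissue" tile
blurred = (sharp + np.roll(sharp, 1, 0) + np.roll(sharp, 1, 1)
           + np.roll(sharp, (1, 1), (0, 1))) / 4.0  # crude 2x2 box blur
assert laplacian_variance(sharp) > laplacian_variance(blurred)
```

    In a whole-slide pipeline a score like this would be computed per tile and thresholded (or fed to a classifier) to produce the blur heatmap described above.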

  13. High-resolution audiometry: an automated method for hearing threshold acquisition with quality control.

    PubMed

    Bian, Lin

    2012-01-01

    In clinical practice, hearing thresholds are measured at only five to six frequencies at octave intervals. Thus, the audiometric configuration cannot closely reflect the actual status of the auditory structures. In addition, differential diagnosis requires quantitative comparison of behavioral thresholds with physiological measures, such as otoacoustic emissions (OAEs), which are usually measured at higher resolution. The purpose of this research was to develop a method to improve the frequency resolution of the audiogram. A repeated-measures design was used to evaluate the reliability of the threshold measurements. A total of 16 participants with clinically normal hearing or mild hearing loss were recruited from a population of university students. No intervention was involved in the study. A custom-developed system and software were used for threshold acquisition with quality control (QC). With real-ear calibration and monitoring of test signals, the system provided accurate and individualized measures of hearing thresholds, which were determined by an analysis based on signal detection theory (SDT). The reliability of the threshold measure was assessed by correlation and differences between the repeated measures. The audiometric configurations were diverse and unique to each individual ear. The accuracy, within-subject reliability, and between-test repeatability were relatively high. With QC, high-resolution audiograms can be reliably and accurately measured. Hearing thresholds measured as ear canal sound pressures with higher frequency resolution can provide more customized hearing-aid fitting. The test system may be integrated with other physiological measures, such as OAEs, into a comprehensive evaluative tool. American Academy of Audiology.
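
    An SDT-based threshold analysis like the one mentioned above separates listener sensitivity from response bias. A minimal sketch of the standard d' sensitivity index (a generic SDT quantity, not the article's exact procedure) is:

```python
from statistics import NormalDist

def d_prime(hit_rate: float, fa_rate: float) -> float:
    """Detection-theory sensitivity index: z(hits) - z(false alarms).

    One way an SDT-based procedure can separate true detection of a tone
    from guessing; rates are clamped away from 0/1 to keep z finite.
    """
    z = NormalDist().inv_cdf
    clamp = lambda r: min(max(r, 0.01), 0.99)
    return z(clamp(hit_rate)) - z(clamp(fa_rate))

assert d_prime(0.5, 0.5) == 0.0                  # chance performance
assert d_prime(0.9, 0.1) > d_prime(0.7, 0.3)     # better detection, higher d'
```

    A threshold can then be defined as the lowest signal level at which d' exceeds a chosen criterion, rather than relying on raw percent-correct.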

  14. Assessment of fludarabine plus cyclophosphamide for patients with chronic lymphocytic leukaemia (the LRF CLL4 Trial): a randomised controlled trial.

    PubMed

    Catovsky, D; Richards, S; Matutes, E; Oscier, D; Dyer, M J S; Bezares, R F; Pettitt, A R; Hamblin, T; Milligan, D W; Child, J A; Hamilton, M S; Dearden, C E; Smith, A G; Bosanquet, A G; Davis, Z; Brito-Babapulle, V; Else, M; Wade, R; Hillmen, P

    2007-07-21

    Previous studies of patients with chronic lymphocytic leukaemia reported high response rates to fludarabine combined with cyclophosphamide. We aimed to establish whether this treatment combination provided greater survival benefit than did chlorambucil or fludarabine. 777 patients with chronic lymphocytic leukaemia requiring treatment were randomly assigned to fludarabine (n=194) or fludarabine plus cyclophosphamide (196) for six courses, or chlorambucil (387) for 12 courses. The primary endpoint was overall survival, with secondary endpoints of response rates, progression-free survival, toxic effects, and quality of life. Analysis was by intention to treat. This study is registered as an International Standard Randomised Controlled Trial, number NCT 58585610. There was no significant difference in overall survival between patients given fludarabine plus cyclophosphamide, fludarabine, or chlorambucil. Complete and overall response rates were better with fludarabine plus cyclophosphamide than with fludarabine (complete response rate 38% vs 15%, respectively; overall response rate 94% vs 80%, respectively; p<0.0001 for both comparisons), which were in turn better than with chlorambucil (complete response rate 7%, overall response rate 72%; p=0.006 and 0.04, respectively). Progression-free survival at 5 years was significantly better with fludarabine plus cyclophosphamide (36%) than with fludarabine (10%) or chlorambucil (10%; p<0.00005). Fludarabine plus cyclophosphamide was the best combination for all ages, including patients older than 70 years, and in prognostic groups defined by immunoglobulin heavy chain gene (V(H)) mutation status and cytogenetics, which were tested in 533 and 579 cases, respectively. Patients had more neutropenia and days in hospital with fludarabine plus cyclophosphamide, or fludarabine, than with chlorambucil. There was less haemolytic anaemia with fludarabine plus cyclophosphamide (5%) than with fludarabine (11%) or chlorambucil (12

  15. Construction of type-II QC-LDPC codes with fast encoding based on perfect cyclic difference sets

    NASA Astrophysics Data System (ADS)

    Li, Ling-xiang; Li, Hai-bing; Li, Ji-bi; Jiang, Hua

    2017-09-01

    To address the problems that the encoding complexity of quasi-cyclic low-density parity-check (QC-LDPC) codes is high and that an insufficient minimum distance degrades error-correction performance, new irregular type-II QC-LDPC codes based on perfect cyclic difference sets (CDSs) are constructed. The parity check matrices of these type-II QC-LDPC codes consist of zero matrices with weight 0, circulant permutation matrices (CPMs) with weight 1, and circulant matrices with weight 2 (W2CMs). The introduction of W2CMs into the parity check matrices makes it possible to achieve a larger minimum distance, which can improve the error-correction performance of the codes. The Tanner graphs of these codes contain no length-4 cycles, so they have excellent decoding convergence characteristics. In addition, because the parity check matrices have a quasi-dual-diagonal structure, the fast encoding algorithm can reduce the encoding complexity effectively. Simulation results show that the new type-II QC-LDPC codes achieve excellent error-correction performance and exhibit no error floor phenomenon over the additive white Gaussian noise (AWGN) channel with sum-product algorithm (SPA) iterative decoding.
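
    The block structure described above, with zero blocks, CPMs, and weight-2 circulants (W2CMs), can be sketched by expanding an exponent matrix into a binary parity-check matrix. The exponent values below are illustrative only, not taken from the paper's CDS construction:

```python
import numpy as np

def circulant_perm(p: int, shift: int) -> np.ndarray:
    """p x p circulant permutation matrix (CPM): identity cyclically shifted."""
    return np.roll(np.eye(p, dtype=int), shift, axis=1)

def qc_ldpc_H(exponents, p: int) -> np.ndarray:
    """Expand an exponent matrix into a binary QC-LDPC parity-check matrix.

    Entry -1 -> p x p zero block; a single shift s -> CPM with shift s;
    a pair (s1, s2) -> weight-2 circulant (W2CM), the mod-2 sum of two CPMs.
    """
    blocks = []
    for row in exponents:
        brow = []
        for e in row:
            if e == -1:
                brow.append(np.zeros((p, p), dtype=int))
            elif isinstance(e, tuple):
                brow.append((circulant_perm(p, e[0]) + circulant_perm(p, e[1])) % 2)
            else:
                brow.append(circulant_perm(p, e))
        blocks.append(np.hstack(brow))
    return np.vstack(blocks)

# Illustrative 2x3 exponent matrix with one zero block and one W2CM
H = qc_ldpc_H([[0, 1, (0, 2)],
               [-1, 2, 1]], p=5)
assert H.shape == (10, 15)
assert H[:5, 10:15].sum(axis=1).tolist() == [2, 2, 2, 2, 2]  # W2CM rows have weight 2
```

    Because every block is circulant, the whole matrix is determined by the small exponent matrix, which is what makes quasi-cyclic codes cheap to store and fast to encode.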

  16. The Quality Control Algorithms Used in the Process of Creating the NASA Kennedy Space Center Lightning Protection System Towers Meteorological Database

    NASA Technical Reports Server (NTRS)

    Orcutt, John M.; Brenton, James C.

    2016-01-01

    The methodology and results of the quality control (QC) process applied to the meteorological data from the Lightning Protection System (LPS) towers located at Kennedy Space Center (KSC) Launch Complex 39B (LC-39B) are documented in this paper. Meteorological data are used to design a launch vehicle, determine operational constraints, and apply defined constraints on day-of-launch (DOL). To properly accomplish these tasks, a climatological database of meteorological records that represents the climate the vehicle will encounter is needed. Numerous meteorological measurement towers exist at KSC; however, the engineering tasks need measurements at specific heights, some of which can only be provided by a few towers. Other than the LPS towers, Tower 313 is the only tower that provides observations up to 150 m, and it is located approximately 3.5 km from LC-39B. In addition, the data need to be QC'ed to remove erroneous reports that could pollute the results of an engineering analysis, mislead the development of operational constraints, or paint a false picture of the atmosphere at the towers' location.

  17. Croup: MedlinePlus Health Topic

    MedlinePlus

  18. Identifying and attributing common data quality problems: temperature and precipitation observations in Bolivia and Peru

    NASA Astrophysics Data System (ADS)

    Hunziker, Stefan; Gubler, Stefanie; Calle, Juan; Moreno, Isabel; Andrade, Marcos; Velarde, Fernando; Ticona, Laura; Carrasco, Gualberto; Castellón, Yaruska; Oria Rojas, Clara; Brönnimann, Stefan; Croci-Maspoli, Mischa; Konzelmann, Thomas; Rohrer, Mario

    2016-04-01

    Assessing climatological trends and extreme events requires high-quality data. However, for many regions of the world, observational data of the desired quality are not available. In order to eliminate errors in the data, quality control (QC) should be applied before data analysis. If the data still contain undetected errors and quality problems after QC, the consequence may be misleading and erroneous results. A region that is seriously affected by observational data quality problems is the Central Andes. At the same time, climatological information on ongoing climate change and climate risks is of utmost importance in this area because of its vulnerability to meteorological extreme events and climatic changes. Besides data quality issues, the lack of metadata and the low station network density complicate quality control and assessment, and hence, appropriate application of the data. Errors and data problems may occur at any point of the data generation chain, e.g. due to unsuitable station configuration or siting, poor station maintenance, erroneous instrument reading, or inaccurate data digitization and post-processing. Different measurement conditions in the predominantly conventional station networks in Bolivia and Peru, compared to the mostly automated networks in, e.g., Europe or North America, may cause different types of errors. Hence, applying QC methods used on state-of-the-art networks to Bolivian and Peruvian climate observations may not be suitable or sufficient. A comprehensive amount of Bolivian and Peruvian maximum and minimum temperature and precipitation in-situ measurements was analyzed to detect and describe common data quality problems. Furthermore, station visits and reviews of the original documents were carried out. Some of the errors could be attributed to a specific source. Such information is of great importance for data users, since it allows them to decide for what applications the data can still be used. In ideal cases, it may even allow to
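
    The kinds of errors described above (implausible values, spikes, stuck sensors) are typically caught by simple plausibility tests applied before analysis. A minimal sketch, with illustrative limits not tuned to Andean stations, might look like:

```python
def qc_flags(series, lo=-40.0, hi=50.0, max_step=10.0, max_flat=4):
    """Flag a temperature series with three basic plausibility tests.

    Returns one set of flags per observation:
    'range' - outside physical limits [lo, hi],
    'step'  - implausible jump relative to the previous value,
    'flat'  - value repeated max_flat or more times (stuck sensor).
    All thresholds here are illustrative defaults.
    """
    flags = [set() for _ in series]
    run = 1
    for i, v in enumerate(series):
        if not (lo <= v <= hi):
            flags[i].add('range')
        if i > 0:
            if abs(v - series[i - 1]) > max_step:
                flags[i].add('step')
            run = run + 1 if v == series[i - 1] else 1
            if run >= max_flat:
                for j in range(i - run + 1, i + 1):
                    flags[j].add('flat')
    return flags

obs = [12.0, 13.1, 13.1, 13.1, 13.1, 55.0, 14.2]
f = qc_flags(obs)
assert 'flat' in f[1] and 'flat' in f[4]       # stuck-sensor run flagged
assert 'range' in f[5] and 'step' in f[5]      # spike fails both tests
```

    Real network QC adds spatial consistency checks against neighbouring stations, which is exactly where low station density in the Central Andes makes the problem harder.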

  19. Insight into the Development of Dissolution Media for BCS Class II Drugs: A Review from Quality Control and Prediction of In Vivo Performance Perspectives.

    PubMed

    Wu, Chunnuan; Liu, Yan; He, Zhonggui; Sun, Jin

    2016-01-01

    To assess in vivo behavior through an in vitro method, the dissolution test is most widely used, both for quality control (QC) and for development purposes. Because a dissolution test can hardly achieve two goals at the same time, the design of dissolution testing generally varies with the development stage of the drug product, and the selection of dissolution media may therefore change with the goals of the test. To serve the QC purpose, a dissolution medium is designed to provide a sink condition; for development purposes, the medium is required to simulate physiological conditions in the gastrointestinal tract as closely as possible. In this review, we provide an initial introduction to the various dissolution media applied for QC and formulation development purposes for poorly water-soluble drugs. We focus on methods such as the addition of cosolvents or surfactants and the utilization of biphasic media, applied to provide sink conditions that are difficult to achieve with simple aqueous buffers for lipophilic drugs, and introduce the development of physiologically relevant media for humans and for animals such as dog and rat with respect to the choice of buffers, bile salts, lipids, and so on. In addition, we further discuss the influence of biorelevant dissolution media on the modification of the drug's Biopharmaceutical Classification System (BCS) classification, especially for BCS class II drugs with low solubility and high permeability, whose solubility is relatively sensitive to the presence of bile salts and lipids.
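
    The sink-condition requirement mentioned for QC media is commonly stated as keeping the concentration at full drug release below a fraction (often one third) of the saturation solubility in the medium. A minimal check under that common convention (the figures below are illustrative, not from the review):

```python
def is_sink(dose_mg: float, volume_ml: float,
            solubility_mg_ml: float, factor: float = 3.0) -> bool:
    """Sink-condition check for a dissolution test.

    Sink holds when the concentration at complete release (dose/volume)
    stays below saturation solubility divided by a safety factor
    (a factor of 3 is a common convention).
    """
    return dose_mg / volume_ml <= solubility_mg_ml / factor

# 100 mg dose in 900 mL buffer: a poorly soluble drug (0.05 mg/mL) fails,
# which is why cosolvents/surfactants are added to raise solubility.
assert not is_sink(100, 900, 0.05)
assert is_sink(100, 900, 0.50)
```

    The same arithmetic explains the appeal of surfactant media: raising apparent solubility, rather than enlarging the vessel, is usually the only practical way to restore sink conditions for lipophilic drugs.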

  20. Aircraft Measurements for Understanding Air-Sea Coupling and Improving Coupled Model Predictions Over the Indian Ocean

    DTIC Science & Technology

    2012-09-30

    Briefing for aircraft operations in Diego Garcia; reports posted on the EOL field catalog in realtime (http://catalog.eol.ucar.edu/cgi-bin/dynamo/report...index); dropsonde data processing on all P3 flights and realtime QC/reporting to GTS; science summary of aircraft missions posted on EOL; ...data analysis, worked with EOL on data quality control (QC), participated in the DYNAMO Sounding Workshop at EOL/NCAR from 6-7 February 2012

  1. An evaluation of the Parents Plus-Parenting When Separated programme.

    PubMed

    Keating, Adele; Sharry, John; Murphy, Michelle; Rooney, Brendan; Carr, Alan

    2016-04-01

    This study evaluated the Parents Plus-Parenting when Separated Programme, an intervention specifically designed to address the needs of separated parents in an Irish context. In a randomized controlled trial, 82 separated parents with young children were assigned to the Parents Plus-Parenting when Separated Programme treatment group and 79 to a waiting-list control group. They were assessed on measures of client goals, parenting satisfaction, child and parental adjustment, and interparental conflict at baseline (Time 1) and 6 weeks later (Time 2), after the treatment group had completed the Parents Plus-Parenting when Separated Programme. From Time 1 to Time 2, significant goal attainment, increases in parenting satisfaction, and decreases in child behaviour problems, parental adjustment problems, and interparental conflict occurred in the Parents Plus-Parenting when Separated Programme group, but not in the control group. These results support the effectiveness of the Parents Plus-Parenting when Separated Programme, which should be made more widely available to separated parents. © The Author(s) 2015.

  2. Development of a portable quality control application using a tablet-type electronic device.

    PubMed

    Ono, Tomohiro; Miyabe, Yuki; Akimoto, Mami; Mukumoto, Nobutaka; Ishihara, Yoshitomo; Nakamura, Mitsuhiro; Mizowaki, Takashi

    2018-03-01

    Our aim was to develop a portable quality control (QC) application using a thermometer, a barometer, an angle gauge, and a range finder implemented in a tablet-type consumer electronic device (CED), and to assess the accuracy of the measurements made. The QC application was programmed using Java and OpenCV libraries. First, temperature and atmospheric pressure were measured over 30 days using the temperature and pressure sensors of the CED and compared with those measured by a double-tube thermometer and a digital barometer. Second, the angle gauge was developed using the accelerometer of the CED. The roll and pitch angles of the CED were measured from 0 to 90° at intervals of 10° in the clockwise (CW) and counterclockwise (CCW) directions. The values were compared with those measured by a digital angle gauge. Third, a range finder was developed using the tablet's built-in camera and image-processing capabilities. Surrogate markers were detected by the camera and their positions converted to actual positions using a homographic transformation method. Fiducial markers were placed on a treatment couch and moved 100 mm in 10-mm steps in both the lateral and longitudinal directions. The values were compared with those measured by the digital output of the treatment couch. The differences between CED values and those of the other devices were compared by calculating means ± standard deviations (SDs). The mean ± SD differences in temperature and atmospheric pressure were -0.07 ± 0.25°C and 0.05 ± 0.10 hPa, respectively. The mean ± SD difference in angle was -0.17 ± 0.87° (0.15 ± 0.23° excluding the 90° angle). The mean ± SD differences in distance were 0.01 ± 0.07 mm in both the lateral and longitudinal directions. Our portable QC application was accurate and may be used instead of standard measuring devices. Our portable CED is efficient and simple to use in the field of medical physics. © 2018 American Association of
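
    The homographic transformation used by the range finder maps pixel coordinates of detected markers to physical couch coordinates. A minimal sketch with an illustrative 3x3 matrix (not a calibrated one, and not the article's OpenCV implementation):

```python
import numpy as np

def to_world(H: np.ndarray, pts: np.ndarray) -> np.ndarray:
    """Map pixel coordinates to couch coordinates with a 3x3 homography.

    pts: (N, 2) pixel positions. H would normally be estimated from
    calibration markers at known positions; the matrix below is a
    made-up example (~0.1 mm per pixel plus an offset).
    """
    p = np.hstack([pts, np.ones((len(pts), 1))])  # homogeneous coordinates
    w = p @ H.T
    return w[:, :2] / w[:, 2:3]                   # perspective divide

H = np.array([[0.1, 0.0, -5.0],
              [0.0, 0.1, -5.0],
              [0.0, 0.0, 1.0]])
out = to_world(H, np.array([[50.0, 50.0], [150.0, 50.0]]))
assert np.allclose(out, [[0.0, 0.0], [10.0, 0.0]])  # 100 px -> 10 mm
```

    The perspective divide is what lets the same formula absorb camera tilt: for a purely affine H (as above) the third coordinate stays 1, but a calibrated homography generally has a nontrivial bottom row.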

  3. Technical Note: Display window setting: An important factor for detecting subtle but clinically relevant artifacts in daily CT quality control.

    PubMed

    Long, Zaiyang; Bruesewitz, Michael R; Sheedy, Emily N; Powell, Michele A; Kramer, Jacqualynn C; Supalla, Randall R; Colvin, Chance M; Bechel, Jessica R; Favazza, Christopher P; Kofler, James M; Leng, Shuai; McCollough, Cynthia H; Yu, Lifeng

    2016-12-01

    This study aimed to investigate the influence of display window setting on technologists' performance in detecting subtle but clinically relevant artifacts in daily computed tomography (CT) quality control (dQC) images. Fifty-three sets of dQC images were retrospectively selected, including 30 sets without artifacts and 23 with subtle but clinically relevant artifacts. They were randomized and shown to six CT technologists (two new and four experienced). Each technologist reviewed all images in each of two sessions, one with a display window width (WW) of 100 HU, which is currently recommended by the American College of Radiology, and the other with a narrow WW of 40 HU, both at a window level of 0 HU. For each case, technologists rated the presence of image artifacts on a five-point scale. The area under the receiver operating characteristic curve (AUC) was used to evaluate artifact detection performance. At a WW of 100 HU, the AUC (95% confidence interval) was 0.658 (0.576, 0.740), 0.532 (0.429, 0.635), and 0.616 (0.543, 0.619) for the experienced, new, and all technologists, respectively. At a WW of 40 HU, the AUC was 0.768 (0.687, 0.850), 0.546 (0.433, 0.658), and 0.694 (0.619, 0.769), respectively. Performance significantly improved at a WW of 40 HU for experienced technologists (p = 0.009) and for all technologists (p = 0.040). Use of a narrow display WW significantly improved technologists' performance in dQC for detecting subtle but clinically relevant artifacts compared with a 100 HU display WW.
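
    Why a narrower window helps can be seen from the standard window/level mapping of CT numbers (HU) to display grey levels: a smaller width spreads fewer HU over the same grey scale, so a few-HU artifact spans more display levels. The code below is a generic sketch of that mapping, not the scanners' actual display pipeline:

```python
import numpy as np

def apply_window(hu, level: float = 0.0, width: float = 100.0) -> np.ndarray:
    """Map CT numbers (HU) to 8-bit display values for a given window.

    Values below level - width/2 clip to black, above level + width/2
    to white; everything in between maps linearly onto 0-255.
    """
    lo = level - width / 2.0
    out = (np.asarray(hu, dtype=float) - lo) / width
    return np.clip(np.round(out * 255.0), 0, 255).astype(np.uint8)

# A subtle 5 HU artifact on a water background (0 HU):
bg100, art100 = apply_window([0.0, 5.0], width=100.0)
bg40, art40 = apply_window([0.0, 5.0], width=40.0)
# Narrowing WW from 100 to 40 HU increases the displayed contrast step.
assert int(art40) - int(bg40) > int(art100) - int(bg100)
```

    The trade-off, of course, is that a narrow window clips structures outside its range, which is acceptable for uniform QC phantom images but not for general diagnostic viewing.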

  4. Predicting the Reasons of Customer Complaints: A First Step Toward Anticipating Quality Issues of In Vitro Diagnostics Assays with Machine Learning.

    PubMed

    Aris-Brosou, Stephane; Kim, James; Li, Li; Liu, Hui

    2018-05-15

    Vendors in the health care industry produce diagnostic systems that, through a secured connection, allow them to monitor performance almost in real time. However, challenges exist in analyzing and interpreting large volumes of noisy quality control (QC) data. As a result, some QC shifts may not be detected early enough by the vendor and instead lead a customer to complain. We hypothesized that a more proactive response could be designed by utilizing the collected QC data more efficiently. Our aim was therefore to help prevent customer complaints by predicting them based on the QC data collected by in vitro diagnostic systems. QC data from five selected in vitro diagnostic assays were combined with the corresponding database of customer complaints over a period of 90 days. A subset of these data covering the last 45 days was also analyzed to assess how the length of the training period affects predictions. We defined a set of features used to train two classifiers, one based on decision trees and the other on adaptive boosting, and assessed model performance by cross-validation. The cross-validations showed classification error rates close to zero for some assays with adaptive boosting when predicting the potential cause of customer complaints. Performance improved with a shorter training period when the volume of complaints increased. Denoising filters that reduced the number of categories to predict further improved performance, as their application simplified the prediction problem. This novel approach to predicting customer complaints based on QC data may allow the diagnostic industry, the expected end user of our approach, to proactively identify potential product quality issues and fix them before receiving customer complaints. This represents a new step in the direction of using big data toward product quality improvement. ©Stephane Aris-Brosou, James Kim, Li Li, Hui Liu. Originally published in JMIR Medical Informatics (http
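
    The cross-validated evaluation described above can be sketched in miniature with a one-feature threshold classifier (a "decision stump") standing in for the paper's tree and boosting models; the data and features below are synthetic, and the real pipeline is far richer:

```python
import numpy as np

def stump_cv_error(X: np.ndarray, y: np.ndarray, k: int = 5, seed: int = 0) -> float:
    """k-fold cross-validated error of a one-feature threshold classifier.

    In each fold, the (feature, threshold, sign) triple with the lowest
    training error is selected, then scored on the held-out fold.
    """
    rng = np.random.default_rng(seed)
    folds = np.array_split(rng.permutation(len(y)), k)
    errs = []
    for f in range(k):
        test = folds[f]
        train = np.concatenate([folds[g] for g in range(k) if g != f])
        best = None
        for j in range(X.shape[1]):
            for t in np.unique(X[train, j]):
                for sign in (1, -1):
                    pred = (sign * (X[train, j] - t) > 0).astype(int)
                    e = np.mean(pred != y[train])
                    if best is None or e < best[0]:
                        best = (e, j, t, sign)
        _, j, t, sign = best
        pred = (sign * (X[test, j] - t) > 0).astype(int)
        errs.append(np.mean(pred != y[test]))
    return float(np.mean(errs))

# Synthetic toy data: "complaints" (label 1) follow a QC shift in feature 0
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 2))
y = (X[:, 0] > 0.0).astype(int)
assert stump_cv_error(X, y) <= 0.2   # nearly separable, so CV error is low
```

    Adaptive boosting, as used in the paper, essentially combines many such stumps, reweighting the training points each round toward the cases the previous stumps got wrong.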

  5. Optimization of cw-QC lasers for Doppler and sub-Doppler molecular spectroscopy

    NASA Astrophysics Data System (ADS)

    Kelly, James F.; Disselkamp, Robert S.; Sams, Robert L.; Blake, Thomas A.; Sharpe, Steven W.; Richter, Dirk A.; Fried, Alan

    2002-09-01

    Inter-subband (Type I) quantum-cascade (QC) lasers have shown the potential to generate tunable mid-IR radiation with narrow intrinsic linewidths (< 160 kHz in 15 ms sweeps) and excellent amplitude stability (< 3 ppm averaged over minutes). Our bench-scale efforts to develop Type I distributed feedback (DFB)-QC lasers for fieldable atmospheric chemistry campaigns, where multipass (Herriott or White) cells are used to enhance path length, have not yet realized performance at the low intrinsic noise levels seen in these devices. By comparison, many operational systems using Pb-salt lasers can routinely achieve noise-equivalent-absorbance (NEA) levels at least one order of magnitude better in cw performance, and with much lower powers. We have found that instability effects from weak back-scattered laser light, primarily from the Herriott cell, result in feedback-implicated technical noise well above the thermal and shot noise of standard IR detectors. Of more fundamental concern is the fact that planar-stripe DFB-QC lasers undergo beam steering and transverse spatial-mode competition during current tuning. Our goal is the development of fully automated, sub-ppbV-sensitive IR chemical sensors. It is possible to reach low-ppm levels of absorptance change detection (ΔI/I0) over small wavelength regions with careful alignment to 100 m Herriott cells, but extreme care in spatial filtering is critical. However, in optical configurations that preclude significant optical feedback and the need for stringent mode-coupling alignments, cw-DFB-QC lasers show great promise for high-resolution sub-Doppler spectroscopy. Serendipitously, a variant of 'mode- or level-crossing' spectroscopy was probably rediscovered, which may allow very high resolution sub-Doppler features and/or hyperfine alignments to be probed with 'uni-directional' topologies. We will primarily discuss the basic features of the 'uni-directional' sub-Doppler spectroscopy concept in this report

  6. Applying Sigma Metrics to Reduce Outliers.

    PubMed

    Litten, Joseph

    2017-03-01

    Sigma metrics can be used to predict assay quality, allowing easy comparison of instrument quality and predicting which tests will require minimal quality control (QC) rules to monitor the performance of the method. A Six Sigma QC program can result in fewer controls and fewer QC failures for methods with a sigma metric of 5 or better. The higher the number of methods with a sigma metric of 5 or better, the lower the costs for reagents, supplies, and control material required to monitor the performance of the methods. Copyright © 2016 Elsevier Inc. All rights reserved.
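
    The sigma metric referred to above is conventionally computed from the allowable total error, the method bias, and the imprecision (CV), all expressed as percentages. A minimal sketch with illustrative figures, not values from the article:

```python
def sigma_metric(tea_pct: float, bias_pct: float, cv_pct: float) -> float:
    """Sigma metric for a laboratory method.

    sigma = (allowable total error - |bias|) / CV, with all three terms
    as percentages at the same decision level. Methods at 5-sigma or
    better can be monitored with minimal QC rules.
    """
    return (tea_pct - abs(bias_pct)) / cv_pct

# Illustrative: TEa 10%, bias 1%, CV 1.5% -> 6-sigma (minimal QC rules);
# the same TEa with bias 2% and CV 4% -> 2-sigma (needs stringent QC).
assert sigma_metric(10.0, 1.0, 1.5) == 6.0
assert sigma_metric(10.0, 2.0, 4.0) == 2.0
```

    Ranking a menu of assays by this single number is what lets a laboratory concentrate control material and QC rules on the low-sigma methods.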

  7. AmeriFlux CA-Qc2 Quebec - 1975 Harvested Black Spruce (HBS75)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Margolis, Hank

    This is the AmeriFlux version of the carbon flux data for the site CA-Qc2 Quebec - 1975 Harvested Black Spruce (HBS75). Site Description - Quebec - Eastern Boreal; Black Spruce forest harvested in 1975.

  8. [Treatment of transfusion-dependent nonsevere aplastic anemia with cyclosporine A plus ATG/ALG versus cyclosporine A plus androgens: a retrospective single center study].

    PubMed

    Song, L; Peng, G X; Wu, Z J; Zhang, L; Jing, L P; Zhou, K; Li, Y; Li, Y; Ye, L; Li, J P; Fan, H H; Zhao, X; Yang, W R; Yang, Y; Zhang, F K

    2016-11-14

    Objective: To determine whether cyclosporine A (CsA) plus androgens was as effective as the current standard immunosuppressive therapy (IST) for transfusion-dependent nonsevere aplastic anemia (TD-NSAA). Methods: The records of 125 consecutive TD-NSAA patients treated between Aug. 2007 and Sept. 2014 with either the CsA plus androgen or the ALG/ATG plus CsA regimen were reviewed. The 3-month and 6-month hematologic responses and survival were evaluated. Results: Of the 125 TD-NSAA patients, 70 were male and 55 female (1.25∶1); median age was 27 (6-66) years. There was no significant difference in early mortality between the 48 patients treated with ATG/ALG plus CsA and the 77 treated with CsA plus androgen (1/48 vs 0/77, P=0.384). Both the total hematologic response and the better hematologic response rates at 3 months (70.8% vs 45.5%, P=0.006, and 27.1% vs 10.4%, P=0.015, respectively) and 6 months (75.0% vs 55.8%, P=0.031, and 41.7% vs 22.1%, P=0.020, respectively) after treatment were much higher in the standard IST group than in the CsA plus androgen group. The median time to transfusion independence of 36.5 (0-149) days in the standard IST group was significantly shorter than the 98 (14-180) days in the CsA plus androgen group (P<0.001). Survival was comparable between the two groups (97.9% vs 100.0%, P=0.227). Event-free survival was higher in the standard IST group (71.2% vs 59.5%) but not significantly so (P=0.227). Conclusions: CsA plus androgen was inferior to the standard IST regimen of ATG/ALG plus CsA in treating TD-NSAA in terms of the hematologic response and the quality of response, despite comparable short-term survival.

  9. Effects of developer depletion on image quality of Kodak Insight and Ektaspeed Plus films.

    PubMed

    Casanova, M S; Casanova, M L S; Haiter-Neto, F

    2004-03-01

    To evaluate the effect of processing solution depletion on the image quality of F-speed dental X-ray film (Insight), compared with Ektaspeed Plus. The films were exposed with a phantom and developed in manual and automatic conditions, in fresh and progressively depleted solutions. The comparison was based on densitometric analysis and subjective appraisal. The processing solution depletion presented a different behaviour depending on whether manual or automatic technique was used. The films were distinctly affected by depleted processing solutions. The developer depletion was faster in automatic than manual conditions. Insight film was more resistant than Ektaspeed Plus to the effects of processing solution depletion. In the present study there was agreement between the objective and subjective appraisals.

  10. On Quality Control Procedures Being Adopted for TRMM LBA and KWAJEX Soundings Data Sets

    NASA Technical Reports Server (NTRS)

    Roy, B.; Halverson, Jeffrey B.; Starr, David OC. (Technical Monitor)

    2001-01-01

    During NASA's Tropical Rainfall Measuring Mission (TRMM) field campaigns, the Large Scale Biosphere Atmosphere (LBA) experiment held in Amazonia (Brazil) during January-February 1999 and the Kwajalein Experiment (KWAJEX) held in the Republic of the Marshall Islands during August-September 1999, extensive radiosonde observations (raobs) were collected using VIZ and Vaisala sondes, which have different response characteristics. In all, 320 raobs for LBA and 972 fixed raobs for KWAJEX have been obtained and are being processed. Most atmospheric sensible heat source (Q1) and apparent moisture sink (Q2) budget studies are based on sounding data, and the accuracy of the raobs is important, especially in regions of deep moist convection. A data quality control (QC) project has been initiated at GSFC by the principal investigator (JBH), and this paper addresses some of the quantitative findings for the level I and II QC procedures. Based on this quantitative assessment of sensor (or system) biases associated with each type of sonde, the initial data repair work will be started. Evidence of moisture biases between the two different sondes (VIZ and Vaisala) was shown earlier by Halverson et al. (2000). Vaisala humidity sensors are found to have a low-level dry bias in the boundary layer, whereas above 600 mb the VIZ sensor tends to register a drier atmosphere. All raob data were subjected to a limit check based on an algorithm already well tested on the raob data obtained during the Tropical Ocean Global Atmosphere Coupled Ocean-Atmosphere Response Experiment (TOGA COARE).
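    The abstract does not give the limit-check algorithm itself; the sketch below shows the general shape of such a gross-limit QC pass over sounding records. The variable names and limit values are hypothetical, not the TOGA COARE thresholds:

```python
def limit_check(records, limits):
    """Flag record fields whose values fall outside gross physical limits.

    records: list of dicts mapping variable name -> observed value.
    limits:  dict mapping variable name -> (low, high) allowed range.
    Returns a list of (record index, variable, value) for out-of-range data.
    """
    flagged = []
    for i, rec in enumerate(records):
        for var, value in rec.items():
            lo, hi = limits[var]
            if not (lo <= value <= hi):
                flagged.append((i, var, value))
    return flagged

# Hypothetical limits and two sounding levels, the second with an impossible RH
limits = {"temp_c": (-90.0, 50.0), "rh_pct": (0.0, 100.0)}
sondes = [{"temp_c": 25.0, "rh_pct": 80.0},
          {"temp_c": 25.0, "rh_pct": 120.0}]
print(limit_check(sondes, limits))  # [(1, 'rh_pct', 120.0)]
```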

  11. A laboratory-based evaluation of exercise plus contingency management for reducing cigarette smoking.

    PubMed

    Kurti, Allison N; Dallery, Jesse

    2014-11-01

    Both contingency management (CM) and exercise have shown promise as smoking cessation treatments, but their combined effects have not been evaluated. The present study evaluated whether CM (in which motivational incentives are provided for abstinence) plus exercise reduced smoking more than either component alone. In a within-subjects design, 20 smokers were exposed to exercise plus CM, exercise plus CM-control (non-contingent incentives), inactivity plus CM, and inactivity plus CM-control. CM increased latencies to smoke and decreased total puffs (Mdns = 39.6 min and 0.8 puffs, respectively) relative to CM-control (Mdns = 2.5 min and 12.8 puffs). Exercise decreased craving relative to baseline both for craving based on the pleasurable consequences of smoking (D = -10.7 on a 100-point visual analog scale) and for anticipated relief from withdrawal (D = -5.9), whereas inactivity increased both components of craving (Ds = 7.6 and 3.5). Exercise had no effect on smoking or on a measure of temporal discounting. Although exercise decreased craving, it did not affect smoking behavior. Exercise plus CM was not more effective than CM alone. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  12. Submaximal exercise VO2 and Qc during 30-day 6 degrees head-down bed rest with isotonic and isokinetic exercise training

    NASA Technical Reports Server (NTRS)

    Greenleaf, J. E.; Ertl, A. C.; Bernauer, E. M.

    1996-01-01

    BACKGROUND: Maintaining intermediary metabolism is necessary for the health and well-being of astronauts on long-duration spaceflights. While peak oxygen uptake (VO2) is consistently decreased during prolonged bed rest, submaximal VO2 is either unchanged or decreased. METHODS: Submaximal exercise metabolism (61 +/- 3% peak VO2) was measured during ambulation (AMB day-2) and on bed rest days 4, 11, and 25 in 19 healthy men (32-42 yr) allocated into no exercise (NOE, N = 5) control, and isotonic exercise (ITE, N = 7) and isokinetic exercise (IKE, N = 7) training groups. Exercise training was conducted supine for two 30-min periods per day for 6 d per week: ITE training was intermittent at 60-90% peak VO2; IKE training was 10 sets of 5 repetitions of peak knee flexion-extension force at a velocity of 100 degrees s-1. Cardiac output was measured with the indirect Fick CO2 method, and plasma volume with Evans blue dye dilution. RESULTS: Supine submaximal exercise VO2 decreased significantly (*p < 0.05) by 10.3%* with ITE and by 7.3%* with IKE; similar to the submaximal cardiac output decrease of 14.5%* (ITE) and 20.3%* (IKE), but different from change in peak VO2 (+1.4% with ITE and -10.2%* with IKE) and decrease in plasma volume of -3.7% (ITE) and -18.0%* (IKE). Reduction of submaximal VO2 during bed rest correlated 0.79 (p < 0.01) with submaximal Qc, but was not related to change in peak VO2 or plasma volume. CONCLUSION: Reduction in submaximal oxygen uptake during prolonged bed rest is related to decrease in exercise but not resting cardiac output; perturbations in active skeletal muscle metabolism may be involved.

  13. Educational intervention together with an on-line quality control program achieve recommended analytical goals for bedside blood glucose monitoring in a 1200-bed university hospital.

    PubMed

    Sánchez-Margalet, Víctor; Rodriguez-Oliva, Manuel; Sánchez-Pozo, Cristina; Fernández-Gallardo, María Francisca; Goberna, Raimundo

    2005-01-01

    Portable meters for blood glucose concentrations are used at the patient's bedside, as well as by patients for self-monitoring of blood glucose. Even though most devices have important technological advances that decrease operator error, the analytical goals proposed for the performance of glucose meters have recently been changed by the American Diabetes Association (ADA) to <5% analytical error and <7.9% total error. We studied 80 meters throughout the Virgen Macarena Hospital and found most devices with performance error higher than 10%. The aim of the present study was to establish a new system to control portable glucose meters, together with an educational program for nurses, in a 1200-bed University Hospital to achieve the recommended analytical goals and thereby improve the quality of diabetes care. We used portable glucose meters connected on-line to the laboratory after an educational program for nurses with responsibilities in point-of-care testing. We evaluated the system by assessing the total error of the glucometers using high- and low-level glucose control solutions. In a period of 6 months, we collected data from 5642 control samples obtained by 14 devices (Precision PCx) directly from the control program (QC manager). The average total error for the low-level glucose control (2.77 mmol/l) was 6.3% (range 5.5-7.6%), and even lower for the high-level glucose control (16.66 mmol/l), at 4.8% (range 4.1-6.5%). In conclusion, the performance of glucose meters used in our University Hospital with more than 1000 beds not only improved after the intervention, but the meters achieved the analytical goals of the suggested ADA/National Academy of Clinical Biochemistry criteria for total error (<7.9% in the range 2.77-16.66 mmol/l glucose) and the optimal total error for high glucose concentrations of <5%, which will improve the quality of care of our patients.
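    A total-error figure like the 6.3% above can be estimated as |bias%| + 1.96·CV%, one common formulation (the article does not state which it used). A minimal sketch with made-up control readings against the low-level target:

```python
import statistics

def total_error_pct(measured, target):
    """Total analytical error as |bias%| + 1.96 * CV%.

    This is one common formulation (some programs use 2 * CV instead).
    """
    mean = statistics.mean(measured)
    bias_pct = abs(mean - target) / target * 100.0
    cv_pct = statistics.stdev(measured) / mean * 100.0
    return bias_pct + 1.96 * cv_pct

# Hypothetical low-level control readings against the 2.77 mmol/l target
te = total_error_pct([2.80, 2.70, 2.90, 2.75, 2.85], 2.77)
print(te < 7.9)  # True -> meets the ADA total-error goal
```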

  14. CASSCF/CI calculations for first row transition metal hydrides - The TiH(4-phi), VH(5-delta), CrH(6-sigma-plus), MnH(7-sigma-plus), FeH(4,6-delta) and NiH(2-delta) states

    NASA Technical Reports Server (NTRS)

    Walch, S. P.; Bauschlicher, C. W., Jr.

    1983-01-01

    Calculations are performed for the predicted ground states of TiH(4-phi), VH(5-delta), CrH(6-sigma-plus), MnH(7-sigma-plus), FeH(4,6-delta) and NiH(2-delta). For FeH, both the 6-delta and 4-delta states are studied, since both are likely candidates for the ground state. The ground state symmetries are predicted based on a combination of atomic coupling arguments and coupling of 4s(2)3d(n) and 4s(1)3d(n+1) terms in the molecular system. Electron correlation is included by a CASSCF/CI(SD) treatment. The CASSCF includes near-degeneracy effects, while correlation of the 3d electrons is included at the CI level.

  15. Ambient quality of ground water in the vicinity of Naval Submarine Base Bangor, Kitsap County, Washington, 1995

    USGS Publications Warehouse

    Greene, Karen E.

    1997-01-01

    A study of the ambient ground-water quality in the vicinity of Naval Submarine Base (SUBASE) Bangor was conducted to provide the U.S. Navy with background levels of selected constituents. The Navy needs this information to plan and manage cleanup activities on the base. During March and April 1995, 136 water-supply wells were sampled for common ions, trace elements, and organic compounds; not all wells were sampled for all constituents. Man-made organic compounds were detected in only two of fifty wells, and the sources of these organic compounds were attributed to activities in the immediate vicinities of these off-base wells. Drinking water standards for trichloroethylene, iron, and manganese were exceeded in one of these wells, which was probably contaminated by an old local (off-base) dump. Ground water from wells open to the following hydrogeologic units (in order from shallow to deep) was investigated: the Vashon till confining unit (Qvt, three wells); the Vashon aquifer (Qva, 54 wells); the Upper confining unit (QC1, 16 wells); the Permeable interbeds within QC1 (QC1pi, 34 wells); and the Sea-level aquifer (QA1, 29 wells). The 50th and 90th percentile ambient background levels of 35 inorganic constituents were determined for each hydrogeologic unit. At least ten measurements were required for a constituent in each hydrogeologic unit for determination of ambient background levels, and data for three wells determined to be affected by localized activities were excluded from these analyses. The only drinking water standards exceeded by ambient background levels were secondary maximum contaminant levels for iron (300 micrograms per liter), in QC1 and QC1pi, and manganese (50 micrograms per liter), in all of the units. The 90th percentile values for arsenic in QC1pi, QA1, and for the entire study area are above 5 micrograms per liter, the Model Toxics Control Act Method A value for protecting drinking water, but well below the maximum contaminant level of 50
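    The 50th/90th-percentile screening with a minimum of ten measurements per unit can be sketched with the standard library alone; the interpolation method and the iron values below are assumptions, since the report does not specify them:

```python
import statistics

def ambient_background(values, min_n=10):
    """Median and 90th percentile, reported only if at least min_n values exist."""
    if len(values) < min_n:
        return None  # too few measurements to characterize the unit
    cuts = statistics.quantiles(values, n=10, method="inclusive")
    return statistics.median(values), cuts[8]  # cuts[8] is the 90th percentile

# Hypothetical iron concentrations (micrograms per liter) for one unit
iron = [10.0, 20.0, 30.0, 40.0, 50.0, 60.0, 70.0, 80.0, 90.0, 100.0]
med, p90 = ambient_background(iron)
print(med, round(p90, 1))             # 55.0 91.0
print(ambient_background([5.0, 8.0])) # None -> below the ten-measurement floor
```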

  16. Effect of controlling herbaceous and woody competing vegetation on wood quality of planted loblolly pine

    Treesearch

    Alexander Clark; Richard F. Daniels; James H. Miller

    2006-01-01

    Southern pine plantations are increasingly established using herbicides to control herbaceous and/or woody competing vegetation to enhance growth, but little is known about the effect on wood quality. A study was established at 13 southern locations in 1984 to examine the effects of complete control of woody, herbaceous, and woody plus herbaceous competition for the...

  17. Evaluation of the Abbott RealTime HCV genotype II plus RUO (PLUS) assay with reference to core and NS5B sequencing.

    PubMed

    Mallory, Melanie A; Lucic, Danijela; Ebbert, Mark T W; Cloherty, Gavin A; Toolsie, Dan; Hillyard, David R

    2017-05-01

    HCV genotyping remains a critical tool for guiding initiation of therapy and selecting the most appropriate treatment regimen. Current commercial genotyping assays may have difficulty identifying 1a, 1b and genotype 6. To evaluate the concordance for identifying 1a, 1b, and genotype 6 between two methods: the PLUS assay and core/NS5B sequencing. This study included 236 plasma and serum samples previously genotyped by core/NS5B sequencing. Of these, 25 samples were also previously tested by the Abbott RealTime HCV GT II Research Use Only (RUO) assay and yielded ambiguous results. The remaining 211 samples were routine genotype 1 (n=169) and genotype 6 (n=42). Genotypes obtained from sequence data were determined using a laboratory-developed HCV sequence analysis tool and the NCBI non-redundant database. Agreement between the PLUS assay and core/NS5B sequencing for genotype 1 samples was 95.8% (162/169), with 96% (127/132) and 95% (35/37) agreement for 1a and 1b samples respectively. PLUS results agreed with core/NS5B sequencing for 83% (35/42) of unselected genotype 6 samples, with the remaining seven "not detected" by the PLUS assay. Among the 25 samples with ambiguous GT II results, 15 were concordant by PLUS and core/NS5B sequencing, nine were not detected by PLUS, and one sample had an internal control failure. The PLUS assay is an automated method that identifies 1a, 1b and genotype 6 with good agreement with gold-standard core/NS5B sequencing and can aid in the resolution of certain genotype samples with ambiguous GT II results. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. New simple and low-cost methods for periodic checks of Cyclone® Plus Storage Phosphor System.

    PubMed

    Edalucci, Elisabetta; Maffione, Anna Margherita; Fornasier, Maria Rosa; De Denaro, Mario; Scian, Giovanni; Dore, Franca; Rubello, Domenico

    2017-01-01

    The recent large use of the Cyclone® Plus Storage Phosphor System, especially in European countries, as imaging system for quantification of radiochemical purity of radiopharmaceuticals raised the problem of setting the periodic controls as required by European Legislation. We described simple, low-cost methods for Cyclone® Plus quality controls, which can be useful to evaluate the performance measurement of this imaging system.

  19. Proteomics as a Quality Control Tool of Pharmaceutical Probiotic Bacterial Lysate Products

    PubMed Central

    Klein, Günter; Schanstra, Joost P.; Hoffmann, Janosch; Mischak, Harald; Siwy, Justyna; Zimmermann, Kurt

    2013-01-01

    Probiotic bacteria have a wide range of applications in veterinary and human therapeutics. Inactivated probiotics are complex samples and quality control (QC) should measure as many molecular features as possible. Capillary electrophoresis coupled to mass spectrometry (CE/MS) has been used as a multidimensional and high throughput method for the identification and validation of biomarkers of disease in complex biological samples such as biofluids. In this study we evaluate the suitability of CE/MS to measure the consistency of different lots of the probiotic formulation Pro-Symbioflor which is a bacterial lysate of heat-inactivated Escherichia coli and Enterococcus faecalis. Over 5000 peptides were detected by CE/MS in 5 different lots of the bacterial lysate and in a sample of culture medium. 71 to 75% of the total peptide content was identical in all lots. This percentage increased to 87–89% when allowing the absence of a peptide in one of the 5 samples. These results, based on over 2000 peptides, suggest high similarity of the 5 different lots. Sequence analysis identified peptides of both E. coli and E. faecalis and peptides originating from the culture medium, thus confirming the presence of the strains in the formulation. Ontology analysis suggested that the majority of the peptides identified for E. coli originated from the cell membrane or the fimbrium, while peptides identified for E. faecalis were enriched for peptides originating from the cytoplasm. The bacterial lysate peptides as a whole are recognised as highly conserved molecular patterns by the innate immune system as microbe associated molecular pattern (MAMP). Sequence analysis also identified the presence of soybean, yeast and casein protein fragments that are part of the formulation of the culture medium. In conclusion CE/MS seems an appropriate QC tool to analyze complex biological products such as inactivated probiotic formulations and allows determining the similarity between lots. PMID
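    A lot-to-lot similarity figure of the kind reported above (peptides detected in every lot as a share of all peptides seen) reduces to set arithmetic over the per-lot peptide identifications. A toy sketch with hypothetical peptide IDs (the study's exact similarity definition may differ):

```python
def lot_overlap(lots):
    """Fraction of the union of peptides that is present in every lot."""
    common = set.intersection(*lots)  # peptides shared by all lots
    union = set.union(*lots)          # every peptide seen in any lot
    return len(common) / len(union)

# Hypothetical peptide IDs for three production lots
lots = [{"PEP1", "PEP2", "PEP3", "PEP4"},
        {"PEP1", "PEP2", "PEP3"},
        {"PEP1", "PEP2", "PEP3", "PEP5"}]
print(lot_overlap(lots))  # 3 shared of 5 total -> 0.6
```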

  20. SU-C-9A-02: Structured Noise Index as An Automated Quality Control for Nuclear Medicine: A Two Year Experience

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, J; Christianson, O; Samei, E

    Purpose: Flood-field uniformity evaluation is an essential element in the assessment of nuclear medicine (NM) gamma cameras. It serves as the central element of the quality control (QC) program, acquired and analyzed on a daily basis prior to clinical imaging. Uniformity images are traditionally analyzed using pixel value-based metrics, which often fail to capture subtle structure and patterns caused by changes in gamma camera performance, requiring additional visual inspection that is subjective and time demanding. The goal of this project was to develop and implement a robust QC metrology for NM that is effective in identifying non-uniformity issues, reporting issues in a timely manner for efficient correction prior to clinical involvement, all incorporated into an automated, effortless workflow, and to characterize the program over a two-year period. Methods: A new quantitative uniformity analysis metric was developed based on 2D noise power spectrum metrology and confirmed based on expert observer visual analysis. The metric, termed Structured Noise Index (SNI), was then integrated into an automated program to analyze, archive, and report on daily NM QC uniformity images. The effectiveness of the program was evaluated over a period of 2 years. Results: The SNI metric successfully identified visually apparent non-uniformities overlooked by the pixel value-based analysis methods. Implementation of the program has resulted in non-uniformity identification in about 12% of daily flood images. In addition, due to the vigilance of staff response, the percentage of days exceeding the trigger value shows a decline over time. Conclusion: The SNI provides a robust quantification of the NM performance of gamma camera uniformity. It operates seamlessly across a fleet of multiple camera models. The automated process provides effective workflow within the NM spectra between physicist, technologist, and clinical engineer. The reliability of this process has made it the
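    The SNI itself is defined in the authors' work; its core ingredient, the 2D noise power spectrum of a mean-subtracted flood image, can be sketched as below. The normalization convention is an assumption, and real implementations add detrending and visual-response weighting:

```python
import numpy as np

def nps_2d(flood, pixel_mm=1.0):
    """2D noise power spectrum of a flood image.

    |FFT of mean-subtracted data|^2, scaled by pixel area over the number
    of pixels (one common convention). Structured non-uniformity shows up
    as power concentrated at low spatial frequencies.
    """
    resid = flood - flood.mean()
    f = np.fft.fftshift(np.fft.fft2(resid))
    ny, nx = flood.shape
    return (np.abs(f) ** 2) * (pixel_mm ** 2) / (nx * ny)

# A perfectly uniform flood has zero noise power everywhere
print(nps_2d(np.full((8, 8), 100.0)).max())  # 0.0
```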

  1. [A Case of Advanced Rectal Cancer Resected Successfully after Induction Chemotherapy with Modified FOLFOX6 plus Panitumumab].

    PubMed

    Yukawa, Yoshimi; Uchima, Yasutake; Kawamura, Minori; Takeda, Osami; Hanno, Hajime; Takayanagi, Shigenori; Hirooka, Tomoomi; Dozaiku, Toshio; Hirooka, Takashi; Aomatsu, Naoki; Hirakawa, Toshiki; Iwauchi, Takehiko; Nishii, Takafumi; Morimoto, Junya; Nakazawa, Kazunori; Takeuchi, Kazuhiro

    2016-05-01

    We report a case of advanced colon cancer that was effectively treated with mFOLFOX6 plus panitumumab combination chemotherapy. The patient was a 54-year-old man who had type 2 colon cancer of the rectum. An abdominal CT scan demonstrated rectal cancer with bulky lymph node metastasis and 1 hepatic node (rectal cancer SI [bladder retroperitoneum], N2M0H1P0, cStage IV). He was treated with mFOLFOX6 plus panitumumab as neoadjuvant chemotherapy. After 4 courses of chemotherapy, CT revealed that the primary lesion and regional metastatic lymph nodes had reduced in size (rectal cancer A, N1H1P0M0, cStage IV). Anterior rectal resection with D3 nodal dissection and left lateral segmentectomy of the liver was performed. The histological diagnosis was tubular adenocarcinoma (tub2-1), int, INF a, pMP, ly0, v0, pDM0, pPM0, R0. He was treated with 4 courses of mFOLFOX6 after surgery. The patient has been in good health without a recurrence for 2 years and 5 months after surgery. This case suggests that induction chemotherapy with mFOLFOX6 plus panitumumab is a potentially effective regimen for advanced colon cancer.

  2. 46 CFR 164.019-13 - Production quality control requirements.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 6 2011-10-01 2011-10-01 false Production quality control requirements. 164.019-13....019-13 Production quality control requirements. (a) General. Each component manufacturer shall establish procedures for maintaining quality control of the materials used in production, manufacturing...

  3. 46 CFR 164.019-13 - Production quality control requirements.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 6 2012-10-01 2012-10-01 false Production quality control requirements. 164.019-13....019-13 Production quality control requirements. (a) General. Each component manufacturer shall establish procedures for maintaining quality control of the materials used in production, manufacturing...

  4. 46 CFR 164.019-13 - Production quality control requirements.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 6 2010-10-01 2010-10-01 false Production quality control requirements. 164.019-13....019-13 Production quality control requirements. (a) General. Each component manufacturer shall establish procedures for maintaining quality control of the materials used in production, manufacturing...

  5. 46 CFR 164.019-13 - Production quality control requirements.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 6 2014-10-01 2014-10-01 false Production quality control requirements. 164.019-13....019-13 Production quality control requirements. (a) General. Each component manufacturer shall establish procedures for maintaining quality control of the materials used in production, manufacturing...

  6. 46 CFR 164.019-13 - Production quality control requirements.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 6 2013-10-01 2013-10-01 false Production quality control requirements. 164.019-13....019-13 Production quality control requirements. (a) General. Each component manufacturer shall establish procedures for maintaining quality control of the materials used in production, manufacturing...

  7. A randomized controlled trial comparing acetaminophen plus ibuprofen versus acetaminophen plus codeine plus caffeine after outpatient general surgery.

    PubMed

    Mitchell, Alex; van Zanten, Sander Veldhuyzen; Inglis, Karen; Porter, Geoffrey

    2008-03-01

    Narcotics are used extensively in outpatient general surgery but are often poorly tolerated with variable efficacy. Acetaminophen combined with NSAIDs is a possible alternative. The objective of this study was to compare the efficacy of acetaminophen, codeine, and caffeine (Tylenol No. 3) with acetaminophen and ibuprofen for management of pain after outpatient general surgery procedures. A double-blind randomized controlled trial was performed in patients undergoing outpatient inguinal/umbilical/ventral hernia repair or laparoscopic cholecystectomy. Patients were randomized to receive acetaminophen plus codeine plus caffeine (Tylenol No. 3) or acetaminophen plus ibuprofen (AcIBU) 4 times daily for 7 days or until pain-free. Pain intensity, measured four times daily by visual analogue scale, was the primary outcome. Secondary end points included incidence of side effects, patient satisfaction, number of days until patient was pain-free, and use of alternative analgesia. One hundred forty-six patients were randomized (74 Tylenol No. 3 and 72 AcIBU), and 139 (95%) patients completed the study. No significant differences in mean or maximum daily visual analogue scale scores were identified between the 2 groups, except on postoperative day 2, when pain was improved in AcIBU patients (p = 0.025). During the entire week, mean visual analogue scale score was modestly lower in AcIBU patients (p = 0.018). More patients in the AcIBU group, compared with Tylenol No. 3, were satisfied with their analgesia (83% versus 64%, respectively; p = 0.02). There were more side effects with Tylenol No. 3 (57% versus 41%, p = 0.045), and the discontinuation rate was also higher in Tylenol No. 3-treated patients (11% versus 3%, p = 0.044). When compared with Tylenol No. 3, AcIBU was not an inferior analgesic and was associated with fewer side effects and higher patient satisfaction. AcIBU is an effective, low-cost, and safe alternative to codeine-based narcotic analgesia for outpatient

  8. Data Quality Control: Challenges, Methods, and Solutions from an Eco-Hydrologic Instrumentation Network

    NASA Astrophysics Data System (ADS)

    Eiriksson, D.; Jones, A. S.; Horsburgh, J. S.; Cox, C.; Dastrup, D.

    2017-12-01

    Over the past few decades, advances in electronic dataloggers and in situ sensor technology have revolutionized our ability to monitor air, soil, and water to address questions in the environmental sciences. The increased spatial and temporal resolution of in situ data is alluring. However, an often overlooked aspect of these advances are the challenges data managers and technicians face in performing quality control on millions of data points collected every year. While there is general agreement that high quantities of data offer little value unless the data are of high quality, it is commonly understood that despite efforts toward quality assurance, environmental data collection occasionally goes wrong. After identifying erroneous data, data managers and technicians must determine whether to flag, delete, leave unaltered, or retroactively correct suspect data. While individual instrumentation networks often develop their own QA/QC procedures, there is a scarcity of consensus and literature regarding specific solutions and methods for correcting data. This may be because back correction efforts are time consuming, so suspect data are often simply abandoned. Correction techniques are also rarely reported in the literature, likely because corrections are often performed by technicians rather than the researchers who write the scientific papers. Details of correction procedures are often glossed over as a minor component of data collection and processing. To help address this disconnect, we present case studies of quality control challenges, solutions, and lessons learned from a large scale, multi-watershed environmental observatory in Northern Utah that monitors Gradients Along Mountain to Urban Transitions (GAMUT). The GAMUT network consists of over 40 individual climate, water quality, and storm drain monitoring stations that have collected more than 200 million unique data points in four years of operation. In all of our examples, we emphasize that scientists
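    A common pattern behind the flag-delete-or-correct decision described above is a per-point QC pass combining range and step (spike) checks. A minimal sketch follows; the thresholds and flag names are illustrative, not GAMUT's actual procedure:

```python
def qc_flags(series, lo, hi, max_step):
    """Assign a QC flag per observation.

    'bad' if the value is outside [lo, hi]; 'suspect' if it jumps more than
    max_step from the previous value; otherwise 'good'.
    """
    flags = []
    prev = None
    for v in series:
        if not (lo <= v <= hi):
            flags.append("bad")
        elif prev is not None and abs(v - prev) > max_step:
            flags.append("suspect")
        else:
            flags.append("good")
        prev = v
    return flags

# Water temperature trace with a spike and a sensor-error sentinel value
print(qc_flags([10.1, 10.2, 25.0, 10.3, -999.0], lo=-50, hi=60, max_step=5))
```

    Note the recovery point after a spike is also flagged 'suspect', since the step check fires on the way back down as well; a production system would typically reconcile such pairs in a second pass.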

  9. Evaluation of effective energy for QA and QC: measurement of half-value layer using radiochromic film density.

    PubMed

    Gotanda, T; Katsuda, T; Gotanda, R; Tabuchi, A; Yamamoto, K; Kuwano, T; Yatake, H; Takeda, Y

    2009-03-01

    The effective energy of diagnostic X-rays is important for quality assurance (QA) and quality control (QC). However, the half-value layer (HVL), which is necessary to evaluate the effective energy, is not ubiquitously monitored because ionization-chamber dosimetry is time-consuming and complicated. To verify the applicability of GAFCHROMIC XR type R (GAF-R) film for HVL measurement as an alternative to monitoring with an ionization chamber, a single-strip method for measuring the HVL has been evaluated. Calibration curves of absorbed dose versus film density were generated using this single-strip method with GAF-R film, and the coefficient of determination (r2) of the straight-line approximation was evaluated. The HVLs (effective energies) estimated using the GAF-R film and an ionization chamber were compared. The coefficient of determination (r2) of the straight-line approximation obtained with the GAF-R film was more than 0.99. The effective energies (HVLs) evaluated using the GAF-R film and the ionization chamber were 43.25 keV (5.10 mm) and 39.86 keV (4.45 mm), respectively. The difference in the effective energies determined by the two methods was thus 8.5%. These results suggest that GAF-R might be used to evaluate the effective energy from the film-density growth without the need for ionization-chamber measurements.
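    An HVL estimate from film-density-derived doses reduces to a log-linear fit of dose against absorber thickness. A sketch with a synthetic beam follows; the mono-energetic attenuation assumption is a simplification of real diagnostic spectra, and the numbers are illustrative:

```python
import math

def hvl_from_transmission(thicknesses_mm, doses):
    """Estimate half-value layer from dose vs. absorber thickness.

    Fits ln(dose) = ln(D0) - mu * x by least squares, then HVL = ln(2) / mu.
    """
    n = len(thicknesses_mm)
    xs, ys = thicknesses_mm, [math.log(d) for d in doses]
    xbar, ybar = sum(xs) / n, sum(ys) / n
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
             / sum((x - xbar) ** 2 for x in xs))
    return math.log(2) / -slope

# Synthetic beam with mu chosen so the true HVL is 5.0 mm
mu = math.log(2) / 5.0
x = [0.0, 2.0, 4.0, 6.0, 8.0]
d = [100.0 * math.exp(-mu * xi) for xi in x]
print(round(hvl_from_transmission(x, d), 3))  # 5.0
```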

  10. Making the Business Case for Energy Savings Plus Health: Indoor Air Quality Guidelines for School Building Upgrades

    EPA Pesticide Factsheets

    The Energy Savings Plus Health Guide equips school districts to integrate indoor air quality protections into school energy efficiency retrofits and other building upgrade projects. This page describes the business case for energy savings in schools.

  11. SU-E-J-14: A Comparison of a 2.5MV Imaging Beam to KV and 6MV Imaging Beams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nitsch, P; Robertson, D; Balter, P

    Purpose: To compare image quality metrics and dose of the TrueBeam V2.0’s 2.5MV imaging beam with those of kV and 6MV images. Methods: To evaluate the MV image quality, the Standard Imaging QC-3 and Varian Las Vegas (LV) phantoms were imaged using the ‘quality’ and ‘low dose’ modes and then processed using RIT113 V6.3. The LEEDS phantom was used to evaluate the kV image quality. The signal-to-noise ratio (SNR) was also evaluated in patient images using Matlab. In addition, dose per image was evaluated at a depth of 5 cm using solid water for a 28.6 cm × 28.6 cm field size, which is representative of the largest jaw settings at an SID of 150 cm. Results: The 2.5MV images had lower dose than the 6MV images and a contrast-to-noise ratio (CNR) about 1.4 times higher when evaluated using the QC-3. When energy was held constant but dose varied, the different modes, ‘low dose’ and ‘quality’, showed less than an 8% difference in CNR. The ‘quality’ modes demonstrated better spatial resolution than the ‘low dose’; however, even with the ‘low dose’ all line pairs were distinct except for the 0.75 lp/mm on the 2.5MV. The LV phantom was used to measure low-contrast detectability and showed similar results to the QC-3. Several patient images all confirmed that SNR was highest in kV images, followed by 2.5MV and then 6MV. Qualitatively, for anatomical areas with large variability in thickness, like lateral head and necks, 2.5MV images show more anatomy, such as shoulder position, than kV images. Conclusions: The kV images clearly provide the best image metrics per unit dose. The 2.5MV beam showed excellent contrast at a lower dose than 6MV and may be superior to kV for difficult-to-image areas that include large changes in anatomical thickness. P Balter: Varian, Sun Nuclear, Philips, CPRIT.
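    The CNR comparisons above follow from simple ROI statistics on the phantom images. A sketch on a synthetic image follows; the ROI positions, insert contrast, and noise level are arbitrary stand-ins for real phantom data:

```python
import numpy as np

def cnr(image, roi_signal, roi_background):
    """Contrast-to-noise ratio: |ROI mean difference| over background std."""
    sig = image[roi_signal]
    bkg = image[roi_background]
    return abs(sig.mean() - bkg.mean()) / bkg.std()

# Synthetic flat image with Gaussian noise and a bright square insert
rng = np.random.default_rng(0)
img = np.full((32, 32), 100.0) + rng.normal(0.0, 2.0, (32, 32))
img[4:12, 4:12] += 40.0  # insert: +40 signal over sigma = 2 noise

value = cnr(img, (slice(4, 12), slice(4, 12)), (slice(20, 28), slice(20, 28)))
print(value > 10)  # True -> contrast well above the noise floor
```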

  12. Dynamic 99mTc-MAG3 renography: images for quality control obtained by combining pharmacokinetic modelling, an anthropomorphic computer phantom and Monte Carlo simulated scintillation camera imaging

    NASA Astrophysics Data System (ADS)

    Brolin, Gustav; Sjögreen Gleisner, Katarina; Ljungberg, Michael

    2013-05-01

    In dynamic renal scintigraphy, the main interest is the radiopharmaceutical redistribution as a function of time. Quality control (QC) of renal procedures often relies on phantom experiments to compare image-based results with the measurement setup. A phantom with a realistic anatomy and time-varying activity distribution is therefore desirable. This work describes a pharmacokinetic (PK) compartment model for 99mTc-MAG3, used for defining a dynamic whole-body activity distribution within a digital phantom (XCAT) for accurate Monte Carlo (MC)-based images for QC. Each phantom structure is assigned a time-activity curve provided by the PK model, employing parameter values consistent with MAG3 pharmacokinetics. This approach ensures that the total amount of tracer in the phantom is preserved between time points, and it allows for modifications of the pharmacokinetics in a controlled fashion. By adjusting parameter values in the PK model, different clinically realistic scenarios can be mimicked, regarding, e.g., the relative renal uptake and renal transit time. Using the MC code SIMIND, a complete set of renography images including effects of photon attenuation, scattering, limited spatial resolution and noise, are simulated. The obtained image data can be used to evaluate quantitative techniques and computer software in clinical renography.
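
    The conservation property described above can be illustrated with a minimal compartment-model sketch (this is not the authors' published model; the compartments and rate constants below are invented for illustration):

```python
# Minimal sketch of a compartmental tracer model in the spirit of the one
# described: plasma -> kidney -> bladder, integrated with forward Euler.
# Rate constants (per minute) are illustrative, not fitted MAG3 values.

def simulate(k_pk=0.04, k_kb=0.02, dt=0.1, t_end=30.0):
    """Return time-activity curves (fractions of injected activity)."""
    plasma, kidney, bladder = 1.0, 0.0, 0.0
    curves = {"t": [], "plasma": [], "kidney": [], "bladder": []}
    t = 0.0
    while t <= t_end:
        curves["t"].append(t)
        curves["plasma"].append(plasma)
        curves["kidney"].append(kidney)
        curves["bladder"].append(bladder)
        d_pk = k_pk * plasma * dt   # plasma -> kidney uptake
        d_kb = k_kb * kidney * dt   # kidney -> bladder washout
        plasma -= d_pk
        kidney += d_pk - d_kb
        bladder += d_kb
        t += dt
    return curves

curves = simulate()
# Because every transfer is subtracted from one compartment and added to
# another, total tracer is conserved at every time point -- the property
# the PK model guarantees for the phantom's activity distribution.
total = curves["plasma"][-1] + curves["kidney"][-1] + curves["bladder"][-1]
```

    Varying the rate constants mimics the "different clinically realistic scenarios" the abstract mentions, e.g. a slower renal transit.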

  13. Comparison of sequential intravenous/oral ciprofloxacin plus metronidazole with intravenous ceftriaxone plus metronidazole for treatment of complicated intra-abdominal infections.

    PubMed

    Wacha, Hannes; Warren, Brian; Bassaris, Harry; Nikolaidis, Paul

    2006-08-01

    Intra-abdominal infections are a substantial clinical problem and an important cause of morbidity and death in the hospital. Optimal treatment requires both source control and antibiotic therapy. Sequential intravenous (IV) to oral therapy may improve patient convenience and reduce total health care costs. In this randomized, double-blind trial, the efficacy of sequential IV-to-oral ciprofloxacin plus metronidazole was compared with ceftriaxone plus metronidazole in adult patients with complicated intra-abdominal infections. The trial enrolled 531 patients, who began with IV therapy. Patients who improved clinically were switched to oral therapy on day three or later. The clinical and bacteriological responses four to six weeks after the end of therapy and the safety of the two regimens were assessed. To maintain blinding, the patients received placebo IV in the ciprofloxacin group or placebo orally in the ceftriaxone group. A total of 475 patients (235 ciprofloxacin plus metronidazole, 240 ceftriaxone plus metronidazole) were valid for evaluation of efficacy. All patients were included in the safety analysis. Of the patients valid for efficacy, 78% of the ciprofloxacin plus metronidazole group and 81% of the ceftriaxone plus metronidazole group were eligible for a switch to oral therapy. The clinical success rates were 98.9% and 96.9%, respectively, which were statistically equivalent. The clinical success rates for all patients, including those on continuous IV therapy, were 90.6% and 87.9%. Source control was achieved in more than 90% of the patients. The bacteriological eradication rates were similar in the two groups. Bacterial complications (e.g., surgical site infections, abscesses) were encountered more often in the ceftriaxone plus metronidazole group. Sequential ciprofloxacin plus metronidazole IV-to-oral therapy was statistically equivalent to ceftriaxone plus metronidazole. The switch to oral therapy with ciprofloxacin plus metronidazole was as

  14. Comprehensive Testing Guidelines to Increase Efficiency in INDOT Operations : [Technical Summary

    DOT National Transportation Integrated Search

    2012-01-01

    When the Indiana Department of Transportation designs a pavement project, a decision for QC/QA (Quality Control/Quality Assurance) or non-QC/QA is made solely based on the quantity of pavement materials to be used in the project. Once the ...

  15. Development and implementation of an automated quantitative film digitizer quality control program

    NASA Astrophysics Data System (ADS)

    Fetterly, Kenneth A.; Avula, Ramesh T. V.; Hangiandreou, Nicholas J.

    1999-05-01

    A semi-automated, quantitative film digitizer quality control program that is based on the computer analysis of the image data from a single digitized test film was developed. This program includes measurements of the geometric accuracy, optical density performance, signal to noise ratio, and presampled modulation transfer function. The variability of the measurements was less than plus or minus 5%. Measurements were made on a group of two clinical and two laboratory laser film digitizers during a trial period of approximately four months. Quality control limits were established based on clinical necessity, vendor specifications and digitizer performance. During the trial period, one of the digitizers failed the performance requirements and was corrected by calibration.

  16. Non-invasive quality evaluation of confluent cells by image-based orientation heterogeneity analysis.

    PubMed

    Sasaki, Kei; Sasaki, Hiroto; Takahashi, Atsuki; Kang, Siu; Yuasa, Tetsuya; Kato, Ryuji

    2016-02-01

    In recent years, cell and tissue therapy in regenerative medicine have advanced rapidly towards commercialization. However, conventional invasive cell quality assessment is incompatible with direct evaluation of the cells produced for such therapies, especially in the case of regenerative medicine products. Our group has demonstrated the potential of quantitative assessment of cell quality, using information obtained from cell images, for non-invasive real-time evaluation of regenerative medicine products. However, images of cells in the confluent state are often difficult to evaluate, because accurate recognition of cells is technically difficult and the morphological features of confluent cells are non-characteristic. To overcome these challenges, we developed a new image-processing algorithm, heterogeneity of orientation (H-Orient) processing, to describe the heterogeneous density of cells in the confluent state. In this algorithm, we introduced a Hessian calculation that converts pixel intensity data to orientation data and a statistical profiling calculation that evaluates the heterogeneity of orientations within an image, generating novel parameters that yield a quantitative profile of an image. Using such parameters, we tested the algorithm's performance in discriminating different qualities of cellular images with three types of clinically important cell quality check (QC) models: remaining lifespan check (QC1), manipulation error check (QC2), and differentiation potential check (QC3). Our results show that our orientation analysis algorithm could predict with high accuracy the outcomes of all types of cellular quality checks (>84% average accuracy with cross-validation). Copyright © 2015 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.
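
    The orientation-heterogeneity idea can be sketched loosely as follows. This is not the published H-Orient implementation: a simpler structure-tensor orientation estimate stands in for the authors' Hessian formulation, and the test images are synthetic.

```python
import numpy as np

# Loose sketch of the H-Orient idea: estimate a per-pixel orientation from
# image derivatives (here a structure tensor built from first derivatives,
# substituted for the authors' Hessian), then summarize how heterogeneous
# the orientation field is with a circular-variance statistic.

def orientation_map(img):
    """Per-pixel dominant orientation in radians (pi-periodic)."""
    img = img.astype(float)
    Iy, Ix = np.gradient(img)            # derivatives along rows (y) and columns (x)
    Jxx, Jyy, Jxy = Ix * Ix, Iy * Iy, Ix * Iy
    return 0.5 * np.arctan2(2 * Jxy, Jxx - Jyy)

def orientation_heterogeneity(img):
    """Circular variance in [0, 1]: 0 = perfectly aligned, ~1 = random."""
    theta = orientation_map(img)
    # orientations are pi-periodic, so double the angle before averaging
    r = np.hypot(np.cos(2 * theta).mean(), np.sin(2 * theta).mean())
    return 1.0 - r

# A striped (aligned, "healthy monolayer"-like) pattern vs. uniform noise
y, x = np.mgrid[0:64, 0:64]
stripes = np.sin(x / 3.0)
noise = np.random.default_rng(0).random((64, 64))
```

    On these synthetic images the aligned pattern scores near 0 and the noise near 1, which is the kind of separation such a parameter needs in order to discriminate image qualities.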

  17. 46 CFR 164.120-11 - Production quality control requirements.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    § 164.120-11 Production quality control requirements. The resin manufacturer must institute a quality control procedure to ensure that all Coast Guard-accepted resin is produced to the same...

  18. 46 CFR 164.120-11 - Production quality control requirements.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    § 164.120-11 Production quality control requirements. The resin manufacturer must institute a quality control procedure to ensure that all Coast Guard-accepted resin is produced to the same...

  19. 46 CFR 164.120-11 - Production quality control requirements.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    § 164.120-11 Production quality control requirements. The resin manufacturer must institute a quality control procedure to ensure that all Coast Guard-accepted resin is produced to the same...

  20. One fish, two fish, we QC fish: controlling data quality among more than 50 organizations over a four-year period.

    PubMed

    Riddick, L; Simbanin, C

    2001-01-01

    EPA is conducting a National Study of Chemical Residues in Lake Fish Tissue. The study involves five analytical laboratories, multiple sampling teams from each of the 47 participating states, several tribes, all 10 EPA Regions and several EPA program offices, with input from other federal agencies. To fulfill study objectives, state and tribal sampling teams are voluntarily collecting predator and bottom-dwelling fish from approximately 500 randomly selected lakes over a 4-year period. The fish will be analyzed for more than 300 pollutants. The long-term nature of the study, combined with the large number of participants, created several QA challenges: (1) controlling variability among sampling activities performed by different sampling teams from more than 50 organizations over a 4-year period; (2) controlling variability in lab processes over a 4-year period; (3) generating results that will meet the primary study objectives for use by OW statisticians; (4) generating results that will meet the undefined needs of more than 50 participating organizations; and (5) devising a system for evaluating and defining data quality and for reporting data quality assessments concurrently with the data to ensure that assessment efforts are streamlined and that assessments are consistent among organizations. This paper describes the QA program employed for the study and presents an interim assessment of the program's effectiveness.

  1. Cancer--Living with Cancer: MedlinePlus Health Topic

    MedlinePlus


  2. The Ocean Observatories Initiative Data Management and QA/QC: Lessons Learned and the Path Ahead

    NASA Astrophysics Data System (ADS)

    Vardaro, M.; Belabbassi, L.; Garzio, L. M.; Knuth, F.; Smith, M. J.; Kerfoot, J.; Crowley, M. F.

    2016-02-01

    The Ocean Observatories Initiative (OOI) is a multi-decadal, NSF-funded program that will provide long-term, near real-time cabled and telemetered measurements of climate variability, ocean circulation, ecosystem dynamics, air-sea exchange, seafloor processes, and plate-scale geodynamics. The OOI platforms consist of seafloor sensors, fixed moorings, and mobile assets containing over 700 operational instruments in the Atlantic and Pacific oceans. Rutgers University operates the Cyberinfrastructure (CI) component of the OOI, which acquires, processes and distributes data to scientists, researchers, educators and the public. It will also provide observatory mission command and control, data assessment and distribution, and long-term data management. The Rutgers Data Management Team consists of a data manager and four data evaluators, who are tasked with ensuring data completeness and quality, as well as interaction with OOI users to facilitate data delivery and utility. Here we will discuss the procedures developed to guide the data team workflow, the automated QC algorithms and human-in-the-loop (HITL) annotations that are used to flag suspect data (whether due to instrument failures, biofouling, or unanticipated events), system alerts and alarms, long-term data storage and CF (Climate and Forecast) standard compliance, and the lessons learned during construction and the first several months of OOI operations.
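
    Two of the simplest automated QC checks of the kind the data team runs can be sketched as follows (the thresholds are made up for the example, not OOI operational values):

```python
# Illustrative automated QC checks: a gross-range test (is the value inside
# the sensor's physically plausible range?) and a spike test (does a point
# jump far from the mean of its two neighbours?). Flag convention assumed
# here: 1 = pass, 4 = fail.

def gross_range_flags(values, lo, hi):
    """Flag each value against the sensor's valid range."""
    return [1 if lo <= v <= hi else 4 for v in values]

def spike_flags(values, threshold):
    """Flag points that deviate from the mean of their two neighbours.
    Note: a large spike can also pull its neighbours' reference values,
    so points adjacent to a spike may be flagged for human review too."""
    flags = [1] * len(values)
    for i in range(1, len(values) - 1):
        reference = (values[i - 1] + values[i + 1]) / 2.0
        if abs(values[i] - reference) > threshold:
            flags[i] = 4
    return flags

# Hypothetical seawater temperature series (deg C) with one spike
temps = [10.1, 10.2, 10.1, 25.0, 10.3, 10.2]
```

    Flagged points would then go to the human-in-the-loop annotation step rather than being silently dropped.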

  3. QC/QA differences between hot mix asphalt (HMA) and warm mix asphalt (WMA).

    DOT National Transportation Integrated Search

    2013-01-01

    WMA represents a group of technologies which allow a reduction in temperatures at which asphalt mixtures are produced and placed on the road. ODOT Materials Division has conducted preliminary inquiries into QC/QA testing for WMA. Some respondents ind...

  4. Comprehensive Testing Guidelines to Increase Efficiency in INDOT Operations : [Technical Summary

    DOT National Transportation Integrated Search

    2012-01-01

    When the Indiana Department of Transportation designs a pavement project, a decision for QC/QA (Quality Control/ Quality Assurance) or nonQC/QA is made solely based on the quantity of pavement materials to be used in the project. Once the pavement...

  5. A fully-automated one-pot synthesis of [18F]fluoromethylcholine with reduced dimethylaminoethanol contamination via [18F]fluoromethyl tosylate.

    PubMed

    Rodnick, Melissa E; Brooks, Allen F; Hockley, Brian G; Henderson, Bradford D; Scott, Peter J H

    2013-08-01

    A novel one-pot method for preparing [(18)F]fluoromethylcholine ([(18)F]FCH) via in situ generation of [(18)F]fluoromethyl tosylate ([(18)F]FCH2OTs), and subsequent [(18)F]fluoromethylation of dimethylaminoethanol (DMAE), has been developed. [(18)F]FCH was prepared using a GE TRACERlab FXFN, although the method should be readily adaptable to any other fluorine-(18) synthesis module. Initially ditosylmethane was fluorinated to generate [(18)F]FCH2OTs. DMAE was then added and the reaction was heated at 120 °C for 10 min to generate [(18)F]FCH. After this time, reaction solvent was evaporated, and the crude reaction mixture was purified by solid-phase extraction using C(18)-Plus and CM-Light Sep-Pak cartridges to provide [(18)F]FCH formulated in USP saline. The formulated product was passed through a 0.22 µm filter into a sterile dose vial, and submitted for quality control testing. Total synthesis time was 1.25 h from end-of-bombardment. Typical non-decay-corrected yields of [(18)F]FCH prepared using this method were 91 mCi (7% non-decay corrected based upon ~1.3 Ci [(18)F]fluoride), and doses passed all other quality control (QC) tests. A one-pot liquid-phase synthesis of [(18)F]FCH has been developed. Doses contain extremely low levels of residual DMAE (31.6 µg/10 mL dose or ~3 ppm) and passed all other requisite QC testing, confirming their suitability for use in clinical imaging studies. Copyright © 2013 Elsevier Ltd. All rights reserved.
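
    The yield arithmetic in the abstract can be checked in a few lines (the decay-corrected figure below is derived here for illustration; only the 91 mCi, ~1.3 Ci, and 1.25 h values come from the abstract):

```python
import math

# 91 mCi from ~1.3 Ci (1300 mCi) of [18F]fluoride is the ~7% non-decay-
# corrected yield quoted. Correcting for 18F decay (half-life ~109.8 min)
# over the 1.25 h (75 min) synthesis gives the decay-corrected yield.

T_HALF_MIN = 109.77  # 18F half-life in minutes (physical constant)

def decay_corrected_yield(activity_out_mci, activity_in_mci, elapsed_min):
    ndc = activity_out_mci / activity_in_mci
    return ndc * 2 ** (elapsed_min / T_HALF_MIN)

ndc = 91 / 1300                            # non-decay-corrected, ~0.07
dc = decay_corrected_yield(91, 1300, 75)   # decay-corrected, ~0.11
```
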

  6. A Fully-automated One-pot Synthesis of [18F]Fluoromethylcholine with Reduced Dimethylaminoethanol Contamination via [18F]Fluoromethyl Tosylate

    PubMed Central

    Rodnick, Melissa E.; Brooks, Allen F.; Hockley, Brian G.; Henderson, Bradford D.; Scott, Peter J. H.

    2013-01-01

    Introduction A novel one-pot method for preparing [18F]fluoromethylcholine ([18F]FCH) via in situ generation of [18F]fluoromethyl tosylate ([18F]FCH2OTs), and subsequent [18F]fluoromethylation of dimethylaminoethanol (DMAE), has been developed. Methods [18F]FCH was prepared using a GE TRACERlab FXFN, although the method should be readily adaptable to any other fluorine-18 synthesis module. Initially ditosylmethane was fluorinated to generate [18F]FCH2OTs. DMAE was then added and the reaction was heated at 120°C for 10 min to generate [18F]FCH. After this time, reaction solvent was evaporated, and the crude reaction mixture was purified by solid-phase extraction using C18-Plus and CM-Light Sep-Pak cartridges to provide [18F]FCH formulated in USP saline. The formulated product was passed through a 0.22 μm filter into a sterile dose vial, and submitted for quality control testing. Total synthesis time was 1.25 hours from end-of-bombardment. Results Typical non-decay-corrected yields of [18F]FCH prepared using this method were 91 mCi (7% non-decay corrected based upon ~1.3 Ci [18F]fluoride), and doses passed all other quality control (QC) tests. Conclusion A one-pot liquid-phase synthesis of [18F]FCH has been developed. Doses contain extremely low levels of residual DMAE (31.6 μg / 10 mL dose or ~3 ppm) and passed all other requisite QC testing, confirming their suitability for use in clinical imaging studies. PMID:23665261

  7. Validation of a two-dimensional liquid chromatography method for quality control testing of pharmaceutical materials.

    PubMed

    Yang, Samuel H; Wang, Jenny; Zhang, Kelly

    2017-04-07

    Despite the advantages of 2D-LC, there is currently little to no work demonstrating the suitability of 2D-LC methods for use in a quality control (QC) environment for good manufacturing practice (GMP) tests. This lack of information becomes more critical as the availability of commercial 2D-LC instrumentation has significantly increased and more testing facilities begin to acquire 2D-LC capabilities. It is increasingly important that the transferability of developed 2D-LC methods be assessed in terms of reproducibility, robustness, and performance across different laboratories worldwide. The work presented here focuses on the evaluation of a heart-cutting 2D-LC method used for the analysis of a pharmaceutical material, where a key co-eluting impurity in the first dimension (1D) is resolved from the main peak and analyzed in the second dimension (2D). A design-of-experiments (DOE) approach was taken in the collection of the data, and the results were then modeled to evaluate method robustness using statistical modeling software. This quality-by-design (QbD) approach gives a deeper understanding of the impact of the 2D-LC critical method attributes (CMAs) and how they affect overall method performance. Although multiple parameters may be critical from a method development point of view, a special focus of this work is the evaluation of unique 2D-LC critical method attributes from a method validation perspective that transcend conventional method development and validation. The 2D-LC method attributes are evaluated for recovery, peak shape, and resolution of the two co-eluting compounds in question in the second dimension. In the method, linearity, accuracy, precision, repeatability, and sensitivity are assessed along with day-to-day, analyst-to-analyst, and lab-to-lab (instrument-to-instrument) assessments. The results of this validation study demonstrate that the 2D-LC method is accurate, sensitive, and robust and is

  8. SSTL UK-DMC SLIM-6 data quality assessment

    USGS Publications Warehouse

    Chander, G.; Saunier, S.; Choate, M.J.; Scaramuzza, P.L.

    2009-01-01

    Satellite data from the Surrey Satellite Technology Limited (SSTL) United Kingdom (UK) Disaster Monitoring Constellation (DMC) were assessed for geometric and radiometric quality. The UK-DMC Surrey Linear Imager 6 (SLIM-6) sensor has a 32-m spatial resolution and a ground swath width of 640 km. The UK-DMC SLIM-6 design consists of a three-band imager with green, red, and near-infrared bands that are set to similar bandpasses as Landsat bands 2, 3, and 4. The UK-DMC data consisted of imagery registered to Landsat orthorectified imagery produced from the GeoCover program. Relief displacements within the UK-DMC SLIM-6 imagery were accounted for by using global 1-km digital elevation models available through the Global Land One-km Base Elevation (GLOBE) Project. Positional accuracy and relative band-to-band accuracy were measured. Positional accuracy of the UK-DMC SLIM-6 imagery was assessed by measuring the imagery against digital orthophoto quadrangles (DOQs), which are designed to meet national map accuracy standards at 1:24,000 scale; this corresponds to a horizontal root-mean-square accuracy of about 6 m. The UK-DMC SLIM-6 images were typically registered to within 1.0-1.5 pixels of the DOQ mosaic images. Several radiometric artifacts, such as striping, coherent noise, and flat-detector effects, were discovered and studied. Indications are that the SSTL UK-DMC SLIM-6 data have few artifacts and calibration challenges, and these can be adjusted or corrected via calibration and processing algorithms. The cross-calibration of the UK-DMC SLIM-6 and Landsat 7 Enhanced Thematic Mapper Plus was performed using image statistics derived from large common areas observed by the two sensors.
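
    The horizontal root-mean-square accuracy measure used above is a simple computation over control-point residuals; a minimal sketch (the sample coordinates are invented for illustration):

```python
import math

# Horizontal RMSE of image-measured control points against reference (DOQ)
# coordinates: root of the mean squared 2-D offset, in the same units as
# the coordinates (e.g. metres).

def horizontal_rmse(measured, reference):
    sq = [(mx - rx) ** 2 + (my - ry) ** 2
          for (mx, my), (rx, ry) in zip(measured, reference)]
    return math.sqrt(sum(sq) / len(sq))

# Three hypothetical control points, each offset by 5 m from its reference
measured = [(100.0, 200.0), (300.0, 405.0), (503.0, 604.0)]
reference = [(103.0, 204.0), (300.0, 400.0), (500.0, 600.0)]
rmse = horizontal_rmse(measured, reference)  # 5.0 here
```
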

  9. Internal Quality Control Practices in Coagulation Laboratories: recommendations based on a patterns-of-practice survey.

    PubMed

    McFarlane, A; Aslan, B; Raby, A; Moffat, K A; Selby, R; Padmore, R

    2015-12-01

    Internal quality control (IQC) procedures are crucial for ensuring accurate patient test results. The IQMH Centre for Proficiency Testing conducted a web-based survey to gather information on the current IQC practices in coagulation testing. A questionnaire was distributed to 174 Ontario laboratories licensed to perform prothrombin time (PT) and activated partial thromboplastin time (APTT). All laboratories reported using two levels of commercial QC (CQC); 12% incorporate pooled patient plasma into their IQC program; >68% run CQC at the beginning of each shift; 56% following maintenance, with reagent changes, during a shift, or with every repeat sample; 6% only run CQC at the beginning of the day and 25% when the instruments have been idle for a defined period of time. IQC run frequency was determined by manufacturer recommendations (71%) but also influenced by the stability of test (27%), clinical impact of an incorrect test result (25%), and sample's batch number (10%). IQC was monitored using preset limits based on standard deviation (66%), precision goals (46%), or allowable performance limits (36%). 95% use multirules. Failure actions include repeating the IQC (90%) and reporting patient results; if repeat passes, 42% perform repeat analysis of all patient samples from last acceptable IQC. Variability exists in coagulation IQC practices among Ontario clinical laboratories. The recommendations presented here would be useful in encouraging standardized IQC practices. © 2015 John Wiley & Sons Ltd.
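
    Two of the Westgard-style multirules most of the surveyed laboratories report using can be sketched as follows (the target and SD values are illustrative examples, not recommendations from the survey):

```python
# Sketch of two common multirules: 1-3s (any control result beyond +/-3 SD
# rejects the run) and 2-2s (two consecutive results beyond +/-2 SD on the
# same side of target reject the run, indicating systematic error).

def z_scores(results, target, sd):
    return [(r - target) / sd for r in results]

def rule_1_3s(results, target, sd):
    """Reject the run if any control result falls outside +/-3 SD."""
    return any(abs(z) > 3 for z in z_scores(results, target, sd))

def rule_2_2s(results, target, sd):
    """Reject if two consecutive results exceed +/-2 SD on the same side."""
    z = z_scores(results, target, sd)
    return any((z[i] > 2 and z[i + 1] > 2) or (z[i] < -2 and z[i + 1] < -2)
               for i in range(len(z) - 1))

# Hypothetical prothrombin-time control: target 12.0 s, SD 0.4 s
in_control = [12.1, 11.8, 12.3, 12.0]
shifted = [12.9, 13.0, 12.9, 13.1]  # systematic shift: trips 2-2s, not 1-3s
```

    On a 2-2s failure the recommended action in the survey results is to repeat the IQC and, if it passes, consider re-analyzing patient samples back to the last acceptable IQC.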

  10. Effects of an Oral Nutritional Supplementation Plus Physical Exercise Intervention on the Physical Function, Nutritional Status, and Quality of Life in Frail Institutionalized Older Adults: The ACTIVNES Study.

    PubMed

    Abizanda, Pedro; López, Mateo Díez; García, Victoria Pérez; Estrella, Juan de Dios; da Silva González, Álvaro; Vilardell, Núria Barcons; Torres, Krysmarú Araujo

    2015-05-01

    The objective of this study was to assess the effects of a hyperproteic, hypercaloric oral nutritional supplement with prebiotic fiber, vitamin D, and calcium, plus a standardized physical intervention, in the functional status, strength, nutritional status, and quality of life of frail institutionalized older adults. Multicentric prospective observational study under usual clinical practice conditions. Four nursing homes from Burgos (2), Albacete, and Madrid, Spain. Participants included 91 institutionalized older adults (age ≥70), able to walk 50 m, and meeting at least 3 of the Fried frailty phenotype criteria. Daily intake of two 200-mL bottles of an oral nutritional supplement, each bottle containing 300 kcal, 20 g protein, 3 g fiber, 500 IU vitamin D, and 480 mg calcium, plus a standardized physical exercise training consisting of flexibility, balance, and strengthening exercises for arms and legs, 5 days per week. Short Physical Performance Battery (SPPB), Short-Form-Late-Life Function and Disability Instrument (SF-LLFDI) function subscale, handgrip strength, EuroQoL-5 Dimensions visual analogic scale (EQ5DVAS), weight, body mass index (BMI), and Short-Form Mini Nutritional Assessment (MNA-SF) at baseline and 6 and 12 weeks. Forty-eight participants (52.7%) improved at least 1 point in the SPPB at week 6, and 44 (48.4%) did so at week 12; 39 participants (42.9%) improved at least 2 points in the SF-LLFDI at week 6, and 46 (50.5%) at week 12. Participants improved their quality of life measured with the EQ5DVAS by 6% (95% confidence interval [CI] 3%-10%) at week 6, and by 5% (95% CI 0%-10%) at week 12. They also improved their nutritional status (weight gain, BMI increase, and higher MNA-SF scores at 6- and 12-week follow-up). This improvement was higher in participants with more frailty criteria, lower functional level, lower vitamin D levels, and poorer nutritional status. A 12-week intervention with oral nutritional supplementation plus physical

  11. Randomized, Controlled Trial of Dexamethasone Versus Dexamethasone Plus Hydrocortisone as Prophylaxis for Hypersensitivity Reactions Due to Paclitaxel Treatment for Gynecologic Cancer.

    PubMed

    Jeerakornpassawat, Dhammapoj; Suprasert, Prapaporn

    2017-10-01

    The aim of this study was to assess intravenous hydrocortisone (HCT) added to standard dexamethasone (DXM) prophylaxis for paclitaxel-associated hypersensitivity reactions (HSRs). Paclitaxel-naive patients scheduled for 6 cycles of paclitaxel (plus platinum) were randomized to DXM alone (20 mg intravenously [IV]) versus DXM plus HCT (100 mg IV) as premedication, including chlorpheniramine (10 mg IV), diphenhydramine (25 mg orally), and ranitidine (50 mg IV) 30 minutes before infusion. Clinic nurses observed for HSRs. Groups were well balanced for cancer type, stage, drug allergy, chemotherapy naivete, mean age, body mass index, and paclitaxel dose. The 44 DXM controls underwent 213 cycles, and the 42 patients in the investigational DXM plus HCT group underwent 192 per-protocol cycles. HSRs were observed in 9 (4.2%) DXM-only cycles compared with 1 (0.5%) DXM plus HCT cycle (P = 0.022). HSRs occurred in 8 (18%) DXM-only patients and in 1 (2.4%) of those correctly receiving DXM plus HCT (P = 0.030). All HSRs occurred in cycles 1 to 3, within 10 to 40 minutes after infusion initiation, and peaked in cycle 2 (5/39) for DXM recipients and in cycle 3 (1/30) for DXM plus HCT. HSR severity was grade 1 in 3 DXM-only recipients and grade 2 in 6 DXM and 1 DXM plus HCT recipients. The sole grade 3 HSR was in an intention-to-treat DXM-HCT patient who erroneously received no HCT. HSR symptoms were facial flushing (8 episodes), dyspnea (7), palmar rash (1), and transient hypotension (1). Paclitaxel infusion was suspended for treatment of HSRs; in all cases, symptoms mitigated and infusion was successfully restarted for the remaining dose. Adding HCT to routine DXM prophylaxis significantly decreased paclitaxel HSR frequency.

  12. Quality assurance and quality control for autonomously collected geoscience data

    NASA Astrophysics Data System (ADS)

    Versteeg, R. J.; Richardson, A.; Labrecque, D.

    2006-12-01

    The growing interest in processes, coupled with the reduction in cost and complexity of sensors that allow continuous data collection and transmission, is giving rise to vast amounts of semi-autonomously collected data. Such data are typically collected from a range of physical and chemical sensors and transmitted, either at the time of collection or periodically as a batch of measurements, to a central server. Such setups can collect vast amounts of data: in cases where power is not an issue, one datapoint can be collected every minute, resulting in tens of thousands of data points per month per sensor. Especially when multiple sensors are deployed, it is infeasible to examine each individual datapoint for each individual sensor, and users typically look at aggregates of such data on a periodic basis (once a week to once every few months). Such aggregation, and the time lag between data collection and data evaluation, limits the ability to rapidly identify and resolve data issues. Thus, there is a need to integrate data QA/QC rules and procedures into the data collection process. These should be implemented such that data are analyzed for compliance the moment they arrive at the server, and such that any issues with the data result in notification of cognizant personnel. Typical issues encountered in the field range from complete system failure (no data arriving at all), to complete sensor failure (data are collected but meaningless), to partial sensor failure (the sensor gives erratic readings or starts to exhibit a bias), to partial power loss (the system collects and transmits data only intermittently). We have implemented a suite of such rules and tests as part of the INL-developed performance monitoring system. These rules are invoked as part of a data QA/QC workflow and result in quality indicators for each datapoint as well as user alerts in case of issues. Tests which are applied to the data include tests on individual
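
    Two server-side checks matching the failure modes listed above can be sketched as follows (the one-minute cadence and all thresholds are assumptions for the example, not the INL system's actual rules):

```python
# Sketch of two incoming-data QC rules: a data-gap test that catches system
# or telemetry failure (no data arriving), and a flatline test that catches
# a stuck sensor (data arriving but meaningless).

def gap_alerts(timestamps, expected_interval=60.0, tolerance=2.0):
    """Indices i where the gap from sample i to i+1 exceeds tolerance x cadence."""
    return [i for i in range(len(timestamps) - 1)
            if timestamps[i + 1] - timestamps[i] > tolerance * expected_interval]

def is_flatlined(values, window=5, epsilon=1e-6):
    """True if the last `window` readings are numerically identical."""
    tail = values[-window:]
    return len(tail) == window and max(tail) - min(tail) < epsilon

# One-minute cadence (seconds) with a 10-minute outage after the 3rd sample
ts = [0, 60, 120, 720, 780]
```

    In a workflow like the one described, a triggered rule would both attach a quality indicator to the affected datapoints and notify the responsible person.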

  13. Automated, Miniaturized and Integrated Quality Control-on-Chip (QC-on-a-Chip) for Advanced Cell Therapy Applications

    NASA Astrophysics Data System (ADS)

    Wartmann, David; Rothbauer, Mario; Kuten, Olga; Barresi, Caterina; Visus, Carmen; Felzmann, Thomas; Ertl, Peter

    2015-09-01

    The combination of microfabrication-based technologies with cell biology has laid the foundation for the development of advanced in vitro diagnostic systems capable of evaluating cell cultures under defined, reproducible and standardizable measurement conditions. In the present review we describe recent lab-on-a-chip developments for cell analysis and how these methodologies could improve standard quality control in the field of manufacturing cell-based vaccines for clinical purposes. We highlight in particular the regulatory requirements for advanced cell therapy applications using as an example dendritic cell-based cancer vaccines to describe the tangible advantages of microfluidic devices that overcome most of the challenges associated with automation, miniaturization and integration of cell-based assays. As its main advantage lab-on-a-chip technology allows for precise regulation of culturing conditions, while simultaneously monitoring cell relevant parameters using embedded sensory systems. State-of-the-art lab-on-a-chip platforms for in vitro assessment of cell cultures and their potential future applications for cell therapies and cancer immunotherapy are discussed in the present review.

  14. Revisiting the Procedures for the Vector Data Quality Assurance in Practice

    NASA Astrophysics Data System (ADS)

    Erdoğan, M.; Torun, A.; Boyacı, D.

    2012-07-01

    Integrating spatial data quality concepts into development and application requires the existence of a conceptual, logical and, most importantly, physical data model, together with rules and knowledge of their realization in the form of geo-spatial data. The applicable metrics and thresholds are determined on this concrete base. This study discusses the application of geo-spatial data quality issues and QA (quality assurance) and QC procedures in topographic data production. First, we introduce the MGCP (Multinational Geospatial Co-production Program) data profile of the NATO (North Atlantic Treaty Organization) DFDD (DGIWG Feature Data Dictionary), the requirements of the data owner, the view of data producers on both data capture and QC, and finally QA to fulfil user needs. Then our practical, new approach, which divides quality into three phases, is introduced. Finally, the implementation of our approach to establish metrics, measures and thresholds for the quality definitions is discussed. In this paper, geometric and semantic quality in particular, and the quality control procedures that can be performed by producers, are discussed. Some applicable best practices that we have experienced in quality control techniques, and regulations that define the objectives and data production procedures, are given in the final remarks. These quality control procedures should include visual checks of the source data, captured vector data and printouts, some automatic checks that can be performed by software, and some semi-automatic checks involving interaction with quality control personnel. Finally, these quality control procedures should ensure the geometric, semantic, attribution and metadata quality of vector data.

  15. Pilot Quality Control Program for Audit RT External Beams at Mexican Hospitals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alvarez R, J T; Tovar M, V M

    2008-08-11

    A pilot quality control program for auditing 18 radiotherapy (RT) external beams at 13 Mexican hospitals is described: eleven ⁶⁰Co beams and seven photon beams of 6, 10 and 15 MV from accelerators. The program contains five parts: a) preparation of the TLD-100 powder: washing, drying and annealing (one hour at 400 °C plus 24 h at 80 °C); b) sending two IAEA-type capsules to each hospital for irradiation there to a nominal D_W = 2 Gy; c) preparation at the SSDL of ten calibration curves (CC) in the range of 0.5 Gy to 6 Gy in terms of absorbed dose to water D_W for ⁶⁰Co, with traceability to the primary laboratory NRC (Canada), within an irradiation window of 26/10/2007-7/12/2007; d) reading all capsules whose hospital irradiation time falls within the SSDL irradiation window; e) evaluation of the D_W imparted by the hospitals.

  16. Effect of Seed Quality and Combination Fungicide-Trichoderma spp. Seed Treatments on Pre- and Postemergence Damping-Off in Cotton.

    PubMed

    Howell, Charles R

    2007-01-01

    Good quality seeds of cotton cultivars often escaped pre-emergence damping-off incited by Pythium spp. and Rhizopus oryzae, and they were resistant to postemergence damping-off incited by Rhizoctonia solani. Poor quality seeds, however, were highly susceptible to both phases of seedling disease and required seed treatment in order to survive. Pre-emergence damping-off incited by Pythium spp. and R. oryzae could be controlled by seed treatment with biocontrol preparations of a number of Trichoderma spp., but these treatments were much less effective in controlling postemergence disease incited by R. solani. Postemergence seedling disease could be controlled by fungicides, but these were much less effective against the pre-emergence phase of the disease. Combination seed treatments of poor quality cotton seeds with fungicides and Trichoderma spp. preparations, followed by planting in pathogen-infested soil, indicated that this technique will control both phases of seedling disease. Seed treatment with either the fungicides or the biocontrol agents alone did not achieve this goal. The optimum combination treatment for disease control was chloroneb plus Trichoderma spp., followed by chloroneb plus metalaxyl (Deltacoat AD) plus T. virens strain G-6.

  17. Efficacy of tranexamic acid plus drain-clamping to reduce blood loss in total knee arthroplasty

    PubMed Central

    Zhang, Yan; Zhang, Jun-Wei; Wang, Bao-Hua

    2017-01-01

    Background: Perioperative blood loss is still an unsolved problem in total knee arthroplasty (TKA). The efficacy of the preoperative use of tranexamic acid (TXA) plus drain-clamping to reduce blood loss in TKA has been debated. This meta-analysis aimed to illustrate the efficacy of TXA plus drain-clamping in reducing blood loss in patients who underwent TKA. Methods: In February 2017, a systematic computer-based search was conducted in PubMed, EMBASE, Web of Science, the Cochrane Database of Systematic Reviews, and Google Scholar. Data from patients prepared for TKA in studies that compared TXA plus drain-clamping versus TXA alone, drain-clamping alone, or controls were retrieved. The primary endpoint was the need for transfusion. The secondary outcomes were total blood loss, blood loss in drainage, the decrease in hemoglobin, and the occurrence of deep venous thrombosis. After testing for publication bias and heterogeneity between studies, data were aggregated using random-effects models when necessary. Results: Ultimately, 5 clinical studies with 618 patients (TXA plus drain-clamping group = 249, control group = 130, TXA-alone group = 60, and drain-clamping group = 179) were included. TXA plus drain-clamping decreased the need for transfusion, total blood loss, blood loss in drainage, and the decrease in hemoglobin compared with the control group, the TXA-alone group, and the drain-clamping group (P < .05). There was no significant difference in the occurrence of deep venous thrombosis among the included groups (P > .05). Conclusions: TXA plus drain-clamping can achieve the maximum effect of hemostasis in patients prepared for primary TKA. Because the number and quality of the included studies were limited, more high-quality randomized controlled trials are needed to identify the optimal dose of TXA and the clamping hours in patients prepared for TKA. PMID:28658157

  18. Technical Resources for Energy Savings Plus Health

    EPA Pesticide Factsheets

    The Energy Savings Plus Health Guide equips school districts to integrate indoor air quality protections into school energy efficiency retrofits and other building upgrade projects. This page lists additional resources related to Energy Savings Plus Health

  19. MedlinePlus FAQ: Listing Your Web Site

    MedlinePlus

    ... medlineplus.gov/faq/criteria.html Question: How do Web sites get listed in MedlinePlus? To use the ... authoritative resources. MedlinePlus uses quality guidelines to evaluate Web sites. We try to ensure that the information ...

  20. Relations between open-field, elevated plus-maze, and emergence tests as displayed by C57/BL6J and BALB/c mice.

    PubMed

    Lalonde, R; Strazielle, C

    2008-06-15

    The relations between open-field, elevated plus-maze, and emergence tests were examined in two strains of mice. In the open-field, C57BL/6J mice had more ambulatory movements and rears but not stereotyped movements relative to BALB/c. In addition, C57BL/6J mice entered more often than BALB/c into enclosed and open arms of the elevated plus-maze. When placed inside a large enclosure, C57BL/6J mice emerged more quickly than BALB/c from a small toy object. In the entire series of mice, ambulation and rears in the open-field were linearly correlated with open and enclosed arm visits in the elevated plus-maze. Ambulatory movements and rears were also correlated with emergence latencies. In contrast, stereotyped movements were correlated with emergence latencies, but not with any elevated plus-maze value. These results specify the extent and limits of association between the three tests.

  1. Comparison of Single Intra-Articular Injection of Novel Hyaluronan (HYA-JOINT Plus) with Synvisc-One for Knee Osteoarthritis: A Randomized, Controlled, Double-Blind Trial of Efficacy and Safety.

    PubMed

    Sun, Shu-Fen; Hsu, Chien-Wei; Lin, Huey-Shyan; Liou, I-Hsiu; Chen, Yin-Han; Hung, Chia-Ling

    2017-03-15

    Viscosupplementation has been widely used for the treatment of knee osteoarthritis. Because we found no well-controlled trial comparing single-injection regimens of hyaluronan for knee osteoarthritis, we compared the efficacy and safety of a single intra-articular injection of a novel cross-linked hyaluronan (HYA-JOINT Plus) with a single injection of Synvisc-One in patients with knee osteoarthritis. In a prospective, randomized, controlled, double-blind trial with a 6-month follow-up, 132 patients with knee osteoarthritis (Kellgren-Lawrence grade 2 or 3) were randomized to receive 1 intra-articular injection of 3 mL of HYA-JOINT Plus (20 mg/mL) (n = 66) or 6 mL of Synvisc-One (8 mg/mL) (n = 66). The primary outcome was the change from baseline in the visual analog scale (VAS) (0 to 100 mm) pain score at 6 months. Secondary outcome measures included the Western Ontario and McMaster Universities Osteoarthritis Index (WOMAC, Likert scale), Lequesne index, timed "Up & Go" (TUG) test, single-limb stance (SLS) test, use of rescue analgesics, and patient satisfaction. A total of 121 patients were available for the intention-to-treat analysis at 6 months. Both groups had a significant improvement in the VAS, WOMAC, and Lequesne index scores at each follow-up visit (p < 0.001). Patients who received HYA-JOINT Plus experienced a significantly greater improvement in the VAS pain score at 1, 3, and 6 months compared with those treated with Synvisc-One (adjusted mean difference: -12.0, -8.5, and -6.6; p = 0.001, 0.033, and 0.045, respectively). There were no significant between-group differences in any of the secondary outcomes except the WOMAC stiffness scores at 6 months, which favored HYA-JOINT Plus treatment (p = 0.043). The TUG time did not change significantly in either group during the study (p > 0.05), but the SLS time improved significantly in both the HYA-JOINT Plus and the Synvisc-One group (p = 0.004 and p = 0.022, respectively). No significant between

  2. A combined QC methodology in Ebro Delta HF radar system: real time web monitoring of diagnostic parameters and offline validation of current data

    NASA Astrophysics Data System (ADS)

    Lorente, Pablo; Piedracoba, Silvia; Soto-Navarro, Javier; Ruiz, Maria Isabel; Alvarez Fanjul, Enrique

    2015-04-01

    Over recent years, special attention has been focused on the development of protocols for near-real-time quality control (QC) of HF-radar-derived current measurements. However, no worldwide agreement on a standardized QC methodology has been achieved to date, although a number of valuable international initiatives have been launched. In this context, Puertos del Estado (PdE) aims to implement a fully operational HF radar network with four different Codar SeaSonde HF radar systems by means of: - the development of a robust best-practices protocol for data processing and QC procedures to routinely monitor site performance under a wide variety of ocean conditions; - the execution of validation work with in-situ observations to assess the accuracy of HF-radar-derived current measurements. The main goal of the present work is to show this combined methodology for the specific case of the Ebro HF radar (although it is easily extendable to the rest of the PdE radar systems), deployed to manage the Ebro River deltaic area and promote the conservation of an important aquatic ecosystem exposed to severe erosion and reshaping. To this end, a web interface has been developed to efficiently monitor in real time the evolution of several diagnostic parameters provided by the manufacturer (CODAR) and used as indicators of HF radar system health. This web interface, updated automatically every hour, examines site performance on different time bases in terms of: - hardware parameters: power and temperature; - radial parameters, among others: signal-to-noise ratio (SNR), number of radial vectors provided per time step, maximum radial range and bearing; - total uncertainty metrics provided by CODAR: zonal and meridional standard deviations and the covariance between both components. Additionally, a widget embedded in the web interface executes queries against the PdE database, providing the chance to compare current time series observed by the Tarragona buoy (located within the Ebro HF radar spatial domain) and
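Threshold checks on diagnostic parameters such as SNR, radial vector count and hardware temperature can be sketched as simple flagging rules; the parameter names and limits below are illustrative assumptions, not CODAR's or PdE's actual criteria:

```python
# Illustrative QC thresholds as (min, max) bounds; None means unbounded.
# These names and limits are assumptions for the sketch, not real criteria.
THRESHOLDS = {
    "snr_db": (6.0, None),         # minimum acceptable signal-to-noise ratio
    "radial_count": (150, None),   # minimum radial vectors per time step
    "temperature_c": (None, 40.0), # maximum chassis temperature
}

def qc_flags(sample):
    """Return the names of diagnostics that violate their (min, max) bounds."""
    flags = []
    for name, (lo, hi) in THRESHOLDS.items():
        value = sample[name]
        if (lo is not None and value < lo) or (hi is not None and value > hi):
            flags.append(name)
    return flags

sample = {"snr_db": 4.5, "radial_count": 200, "temperature_c": 35.0}
print(qc_flags(sample))  # ['snr_db']
```

In an hourly-updated monitoring page of the kind described, such flags would drive the per-site status indicators.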

  3. Aerobic exercise improves self-reported sleep and quality of life in older adults with insomnia.

    PubMed

    Reid, Kathryn J; Baron, Kelly Glazer; Lu, Brandon; Naylor, Erik; Wolfe, Lisa; Zee, Phyllis C

    2010-10-01

    To assess the efficacy of moderate aerobic physical activity with sleep hygiene education to improve sleep, mood and quality of life in older adults with chronic insomnia. Seventeen sedentary adults aged ≥55 years with insomnia (mean age 61.6 [SD ± 4.3] years; 16 female) participated in a randomized controlled trial comparing 16 weeks of aerobic physical activity plus sleep hygiene to non-physical activity plus sleep hygiene. Eligibility criteria included primary insomnia for at least 3 months, habitual sleep duration <6.5 h and a Pittsburgh Sleep Quality Index (PSQI) score >5. Outcomes included sleep quality, mood and quality of life questionnaires (PSQI, Epworth Sleepiness Scale [ESS], Short-Form 36 [SF-36], Center for Epidemiological Studies Depression Scale [CES-D]). The physical activity group improved in sleep quality on the global PSQI (p<.0001) and on the sleep latency (p=.049), sleep duration (p=.04), daytime dysfunction (p=.027), and sleep efficiency (p=.036) PSQI sub-scores compared with the control group. The physical activity group also had reductions in depressive symptoms (p=.044) and daytime sleepiness (p=.02) and improvements in vitality (p=.017) compared with baseline scores. Aerobic physical activity with sleep hygiene education is an effective treatment approach to improve sleep quality, mood and quality of life in older adults with chronic insomnia.

  4. Cost effectiveness of peginterferon alpha-2b plus ribavirin versus interferon alpha-2b plus ribavirin for initial treatment of chronic hepatitis C.

    PubMed

    Siebert, U; Sroczynski, G; Rossol, S; Wasem, J; Ravens-Sieberer, U; Kurth, B M; Manns, M P; McHutchison, J G; Wong, J B

    2003-03-01

    Peginterferon alpha-2b plus ribavirin therapy in previously untreated patients with chronic hepatitis C yields the highest sustained virological response rates of any treatment strategy but is expensive. To estimate the cost effectiveness of treatment with peginterferon alpha-2b plus ribavirin compared with interferon alpha-2b plus ribavirin for initial treatment of patients with chronic hepatitis C. Individual patient level data from a randomised clinical trial with peginterferon plus ribavirin were applied to a previously published and validated Markov model to project lifelong clinical outcomes. Quality of life and economic estimates were based on German patient data. We used a societal perspective and applied a 3% annual discount rate. Compared with no antiviral therapy, peginterferon plus fixed or weight based dosing of ribavirin increased life expectancy by 4.2 and 4.7 years, respectively. Compared with standard interferon alpha-2b plus ribavirin, peginterferon plus fixed or weight based dosing of ribavirin increased life expectancy by 0.5 and by 1.0 years with incremental cost effectiveness ratios of 11,800 euros and 6600 euros per quality adjusted life year (QALY), respectively. Subgroup analyses by genotype, viral load, sex, and histology showed that peginterferon plus weight based ribavirin remained cost effective compared with other well accepted medical treatments. Peginterferon alpha-2b plus ribavirin should reduce the incidence of liver complications, prolong life, improve quality of life, and be cost effective for the initial treatment of chronic hepatitis C.
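The incremental cost-effectiveness ratios quoted above follow the standard definition, ICER = (incremental cost) / (incremental QALYs). A minimal sketch of that arithmetic (the cost and QALY figures below are invented for illustration, not taken from the study):

```python
def icer(cost_new, cost_old, qaly_new, qaly_old):
    """Incremental cost-effectiveness ratio: extra cost per QALY gained."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Invented example: a regimen costing 9900 euros more that adds 1.5 QALYs
print(icer(30000.0, 20100.0, 10.5, 9.0))  # 6600.0 euros per QALY
```

A Markov model like the one cited would supply the lifetime discounted costs and QALYs that feed into this ratio.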

  5. Application of Sigma Metrics Analysis for the Assessment and Modification of Quality Control Program in the Clinical Chemistry Laboratory of a Tertiary Care Hospital.

    PubMed

    Iqbal, Sahar; Mustansar, Tazeen

    2017-03-01

    Sigma is a metric that quantifies the performance of a process as a rate of defects per million opportunities. In clinical laboratories, sigma metric analysis is used to assess the performance of the laboratory process system. The sigma metric is also used as a quality management strategy for a laboratory process, improving quality by addressing errors after their identification. The aim of this study is to evaluate the errors in quality control of the analytical phase of the laboratory system by sigma metric analysis. For this purpose, sigma metric analysis was performed for analytes using internal and external quality control as quality indicators. The results of the sigma metric analysis were used to identify gaps and the need for modification in the laboratory's quality control strategy. The sigma metric was calculated for the quality control program of ten clinical chemistry analytes, including glucose, chloride, cholesterol, triglyceride, HDL, albumin, direct bilirubin, total bilirubin, protein and creatinine, at two control levels. To calculate the sigma metric, imprecision and bias were calculated from internal and external quality control data, respectively. The minimum acceptable performance was considered to be 3 sigma. Westgard sigma rules were applied to customize the quality control procedure. The sigma level was found acceptable (≥3) for glucose (L2), cholesterol, triglyceride, HDL, direct bilirubin and creatinine at both control levels. For the rest of the analytes the sigma metric was <3. The lowest sigma value was found for chloride (1.1) at L2, and the highest for creatinine (10.1) at L3. HDL had the highest sigma values at both control levels (8.8 and 8.0 at L2 and L3, respectively). We conclude that analytes with a sigma value <3 require strict monitoring and modification of the quality control procedure. In this study, the application of sigma rules provided a practical solution for an improved and focused design of the QC procedure.
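The sigma metric described above is conventionally computed from the allowable total error (TEa), the bias and the imprecision (CV), all expressed in percent: sigma = (TEa − |bias|) / CV. A minimal sketch of this calculation (the TEa, bias and CV values below are illustrative assumptions, not the study's data):

```python
def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Sigma = (TEa - |bias|) / CV, with all inputs as percentages."""
    return (tea_pct - abs(bias_pct)) / cv_pct

# Illustrative values only (not the study's data):
# TEa 10%, bias 2%, CV 2% -> sigma 4.0, acceptable under the >=3 criterion
print(sigma_metric(10.0, 2.0, 2.0))  # 4.0
# TEa 5%, bias 2.8%, CV 2% -> sigma 1.1, flagged for strict QC monitoring
print(round(sigma_metric(5.0, 2.8, 2.0), 1))  # 1.1
```

In practice the CV comes from internal QC data and the bias from external QC (EQA) data, as in the study.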

  6. Codeine Plus Acetaminophen for Pain After Photorefractive Keratectomy: A Randomized, Double-Blind, Placebo-Controlled Add-On Trial.

    PubMed

    Pereira, Vinicius B P; Garcia, Renato; Torricelli, Andre A M; Mukai, Adriana; Bechara, Samir J

    2017-10-01

    Pain after photorefractive keratectomy (PRK) is significant, and the analgesic efficacy and safety of oral opioids in combination with acetaminophen have not been fully investigated in PRK trials. To assess the efficacy and safety of the combination of codeine plus acetaminophen (paracetamol) versus placebo as an add-on therapy for pain control after PRK. Randomized, double-blind, placebo-controlled trial. Single tertiary center. One eye was randomly allocated to the intervention, whereas the fellow eye was treated with placebo. Eyes were operated on 2 weeks apart. The participants were adults older than 20 years with refractive stability for ≥1 year who underwent PRK for correction of myopia or myopic astigmatism. Codeine (30 mg) plus acetaminophen (500 mg) was given orally 4 times per day for 4 days after PRK. The follow-up duration was 4 months. The study outcomes included pain scores at 1 to 72 hours, as measured by the visual analog scale, McGill Pain Questionnaire, and Brief Pain Inventory, as well as adverse events and corneal wound healing. Of the initial 82 eyes, 80 completed the trial (40 intervention, 40 placebo). Median (interquartile range) pain scores as measured by the visual analog scale were statistically and clinically lower during treatment with codeine/acetaminophen compared with placebo: 1 hour: 4 (2-4) versus 6 (3-6), P < 0.001; 24 hours: 4 (3-6) versus 7 (6-9), P < 0.001; 48 hours: 1 (0-2) versus 3 (2-5), P < 0.001; and 72 hours: 0 (0-0) versus 0 (0-2), P = 0.001. Virtually identical results were obtained with the McGill Pain Questionnaire and Brief Pain Inventory scales. The most common adverse events with codeine/acetaminophen were drowsiness (42%), nausea (18%), and constipation (5%). No case of delayed epithelial healing was observed in either treatment arm. When added to usual care, the oral combination of codeine/acetaminophen was safe and significantly superior to placebo for pain control after PRK. URL: http

  7. Artesunate plus pyronaridine for treating uncomplicated Plasmodium falciparum malaria.

    PubMed

    Bukirwa, Hasifa; Unnikrishnan, B; Kramer, Christine V; Sinclair, David; Nair, Suma; Tharyan, Prathap

    2014-03-04

    artemether-lumefantrine had fewer than 5% PCR-adjusted treatment failures during 42 days of follow-up, with no differences between groups (two trials, 1472 participants, low quality evidence). There were fewer new infections during the first 28 days in those given artesunate-pyronaridine (PCR-unadjusted treatment failure: RR 0.60, 95% CI 0.40 to 0.90; two trials, 1720 participants, moderate quality evidence), but no difference was detected over the whole 42-day follow-up (two trials, 1691 participants, moderate quality evidence). Artesunate-pyronaridine versus artesunate plus mefloquine: In one multicentre trial, enrolling mainly older children and adults from South East Asia, both artesunate-pyronaridine and artesunate plus mefloquine had fewer than 5% PCR-adjusted treatment failures during 28 days of follow-up (one trial, 1187 participants, moderate quality evidence). PCR-adjusted treatment failures were 6% by day 42 for those treated with artesunate-pyronaridine and 4% for those given artesunate-mefloquine (RR 1.64, 95% CI 0.89 to 3.00; one trial, 1116 participants, low quality evidence). Again, there were fewer new infections during the first 28 days in those given artesunate-pyronaridine (PCR-unadjusted treatment failure: RR 0.35, 95% CI 0.17 to 0.73; one trial, 1720 participants, moderate quality evidence), but no differences were detected over the whole 42 days (one trial, 1146 participants, low quality evidence). Adverse effects: Serious adverse events were uncommon in these trials, with no difference detected between artesunate-pyronaridine and the comparator ACTs. The analysis of liver function tests showed that biochemical elevations were four times more frequent with artesunate-pyronaridine than with the other antimalarials (RR 4.17, 95% CI 1.38 to 12.62; four trials, 3523 participants, moderate quality evidence). 
Artesunate-pyronaridine performed well in these trials compared with artemether-lumefantrine and artesunate plus mefloquine, with PCR-adjusted treatment failure at day 28
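The risk ratios and 95% confidence intervals quoted in this review are standard two-arm calculations; a hedged sketch using the usual log-RR normal approximation (the event counts below are invented for illustration and are not the trial data):

```python
import math

def risk_ratio(a, n1, b, n2, z=1.96):
    """Risk ratio of arm 1 vs arm 2, with a 95% CI on the log scale.
    a/n1 = events/total in arm 1; b/n2 = events/total in arm 2."""
    rr = (a / n1) / (b / n2)
    # Standard error of ln(RR) for two independent binomial arms
    se = math.sqrt(1/a - 1/n1 + 1/b - 1/n2)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Invented counts: 30/600 vs 50/600 new infections by day 28
rr, lo, hi = risk_ratio(30, 600, 50, 600)
print(f"RR {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```

Cochrane reviews pool such per-trial ratios with fixed- or random-effects meta-analysis; the single-comparison arithmetic above is only the building block.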

  8. The Photovoltaic Array Space Power plus Diagnostics (PASP Plus) Flight Experiment

    NASA Technical Reports Server (NTRS)

    Piszczor, Michael F.; Curtis, Henry B.; Guidice, Donald A.; Severance, Paul S.

    1992-01-01

    An overview of the Photovoltaic Array Space Power Plus Diagnostics (PASP Plus) flight experiment is presented in outline and graphic form. The goal of the experiment is to test a variety of photovoltaic cell and array technologies under various space environmental conditions. Experiment objectives, flight hardware, experiment control and diagnostic instrumentation, and illuminated thermal vacuum testing are addressed.

  9. High-power terahertz lasers with excellent beam quality for local oscillator sources

    NASA Astrophysics Data System (ADS)

    Williams, Benjamin

    Many molecular species that compose the interstellar medium have strong spectral features in the 2-5 THz range, and heterodyne spectroscopy is required to obtain ~km/s velocity resolution to resolve their complicated lineshapes and disentangle them from the background. Understanding the kinetics and energetics within the gas clouds of the interstellar medium is critical to understanding star formation processes and validating theories of galactic evolution. Herschel Observatory's heterodyne HIFI instrument provided several years of high-spectral-resolution measurements of the interstellar medium, although only up to 1.9 THz. The next frontier for heterodyne spectroscopy is the 2-6 THz region. However, development of heterodyne receivers above 2 THz has been severely hindered by a lack of convenient coherent sources of sufficient power to serve as local oscillators (LOs). The recently developed quantum-cascade (QC) lasers are emerging as candidates for LOs in the 1.5-5 THz range. The current generation of single-mode THz QC-lasers can provide a few milliwatts of power in a directive beam, and will be sufficient to pump single pixels and small-format heterodyne arrays (~10 elements). This proposal looks beyond the state-of-the-art, to the development of large format heterodyne arrays which contain on the order of 100-1000 elements. LO powers on the order of 10-100 mW delivered in a high-quality Gaussian beam will be needed to pump the mixer array - not only because of the microwatt mixer power requirement, but to account for large anticipated losses in LO coupling and distribution. Large format heterodyne array instruments are attractive for a dramatic speedup of mapping of the interstellar medium, particularly on airborne platforms such as the Stratospheric Observatory for Infrared Astronomy (SOFIA), and on long duration balloon platforms such as the Stratospheric Terahertz Observatory (STO), where observation time is limited. 
The research goal of this proposal is

  10. Overexpression of the Qc-SNARE gene OsSYP71 enhances tolerance to oxidative stress and resistance to rice blast in rice (Oryza sativa L.).

    PubMed

    Bao, Yong-Mei; Sun, Shu-Jing; Li, Meng; Li, Li; Cao, Wen-Lei; Luo, Jia; Tang, Hai-Juan; Huang, Ji; Wang, Zhou-Fei; Wang, Jian-Fei; Zhang, Hong-Sheng

    2012-08-10

    OsSYP71 is an oxidative stress and rice blast response gene that encodes a Qc-SNARE protein in rice. Qc-SNARE proteins belong to the superfamily of SNAREs (soluble N-ethylmaleimide-sensitive factor attachment protein receptors), which function as important components of the vesicle trafficking machinery in eukaryotic cells. In this paper, 12 Qc-SNARE genes were isolated from rice, and expression patterns of 9 genes were detected in various tissues and in seedlings challenged with oxidative stresses and inoculated with rice blast. The expression of OsSYP71 was clearly up-regulated under these stresses. Overexpression of OsSYP71 in rice showed more tolerance to oxidative stress and resistance to rice blast than wild-type plants. These results indicate that Qc-SNAREs play an important role in rice response to environmental stresses, and OsSYP71 is useful in engineering crop plants with enhanced tolerance to oxidative stress and resistance to rice blast. Copyright © 2012 Elsevier B.V. All rights reserved.

  11. Regulation of gene expression in mammalian nervous system through alternative pre-mRNA splicing coupled with RNA quality control mechanisms.

    PubMed

    Yap, Karen; Makeyev, Eugene V

    2013-09-01

    Eukaryotic gene expression is orchestrated on a genome-wide scale through several post-transcriptional mechanisms. Of these, alternative pre-mRNA splicing expands the proteome diversity and modulates mRNA stability through downstream RNA quality control (QC) pathways including nonsense-mediated decay (NMD) of mRNAs containing premature termination codons and nuclear retention and elimination (NRE) of intron-containing transcripts. Although originally identified as mechanisms for eliminating aberrant transcripts, a growing body of evidence suggests that NMD and NRE coupled with deliberate changes in pre-mRNA splicing patterns are also used in a number of biological contexts for deterministic control of gene expression. Here we review recent studies elucidating molecular mechanisms and biological significance of these gene regulation strategies with a specific focus on their roles in nervous system development and physiology. This article is part of a Special Issue entitled 'RNA and splicing regulation in neurodegeneration'. Copyright © 2013 Elsevier Inc. All rights reserved.

  12. To QC or not to QC: the key to a consistent laboratory?

    PubMed

    Lane, Michelle; Mitchell, Megan; Cashman, Kara S; Feil, Deanne; Wakefield, Sarah; Zander-Fox, Deirdre L

    2008-01-01

    A limiting factor in every embryology laboratory is its capacity to grow 'normal' embryos. In human in vitro fertilisation (IVF), there is considerable awareness that the environment of the laboratory itself can alter the quality of the embryos produced and the industry as a whole has moved towards the implementation of auditable quality management systems. Furthermore, in some countries, such as Australia, an established quality management system is mandatory for clinical IVF practice, but such systems are less frequently found in other embryology laboratories. Although the same challenges of supporting consistent and repeatable embryo development are paramount to success in all embryology laboratories, it could be argued that they are more important in a research setting where often the measured outcomes are at an intracellular or molecular level. In the present review, we have outlined the role and importance of quality control and quality assurance systems in any embryo laboratory and have highlighted examples of how simple monitoring can provide consistency and avoid the induction of artefacts, irrespective of the laboratory's purpose, function or species involved.

  13. Long term high resolution rainfall runoff observations for improved water balance uncertainty and database QA-QC in the Walnut Gulch Experimental Watershed.

    NASA Astrophysics Data System (ADS)

    Bitew, M. M.; Goodrich, D. C.; Demaria, E.; Heilman, P.; Kautz, M. A.

    2017-12-01

    Walnut Gulch is a semi-arid experimental watershed and Long Term Agroecosystem Research (LTAR) site managed by the USDA-ARS Southwest Watershed Research Center, for which high-resolution long-term hydro-climatic data are available across its 150 km2 drainage area. In this study, we present an analysis of 50 years of continuous hourly rainfall data to evaluate runoff control and generation processes, with the aim of improving the QA-QC plans of Walnut Gulch and creating a high-quality dataset that is critical for reducing water balance uncertainties. Multiple linear regression models were developed to relate rainfall properties, runoff characteristics and watershed properties. The rainfall properties were summarized as event-based total depth, maximum intensity, duration, the location of the storm center with respect to the outlet, and storm size normalized to watershed area. We evaluated the interaction between rainfall and runoff in terms of antecedent moisture condition (AMC), antecedent runoff condition (ARC), and runoff depth and duration for each rainfall event. We summarized watershed properties such as contributing area, slope, shape, channel length, stream density, channel flow area, and the percentage of area occupied by retention stock ponds for each of the nested catchments in Walnut Gulch. Evaluation of the model using basic and categorical statistics showed good predictive skill throughout the watersheds. The model produced correlation coefficients ranging from 0.4 to 0.94, Nash efficiency coefficients up to 0.77, and Kling-Gupta coefficients ranging from 0.4 to 0.98. The model correctly predicted 92% of all runoff-generating events and 98% of no-runoff events across all sub-watersheds in Walnut Gulch. The regression model also showed good potential to complement the QA-QC procedures in place for the Walnut Gulch dataset publications developed since the 1960s, through the identification of inconsistencies in rainfall-runoff relations.
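A multiple linear regression of event runoff on rainfall properties, together with a Nash-Sutcliffe efficiency skill score of the kind reported above, can be sketched with ordinary least squares; the predictor set and data below are placeholders, not the Walnut Gulch dataset:

```python
import numpy as np

# Placeholder event table: [total depth (mm), max intensity (mm/h), duration (h)]
X = np.array([[20.0, 40.0, 1.0],
              [35.0, 60.0, 1.5],
              [10.0, 25.0, 0.5],
              [50.0, 80.0, 2.0],
              [15.0, 30.0, 0.8]])
y = np.array([4.0, 9.0, 1.0, 16.0, 2.0])  # event runoff depth (mm), invented

# Ordinary least squares with an intercept column
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ coef

# Nash-Sutcliffe efficiency: 1 minus the ratio of residual to total variance
nse = 1.0 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
print(coef, nse)
```

In the study, such regressions are fitted per nested catchment and the resulting skill scores (correlation, Nash, Kling-Gupta) are used to flag rainfall-runoff inconsistencies in the archived records.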

  14. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan : ASC software quality engineering practices Version 3.0.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Turgeon, Jennifer L.; Minana, Molly A.; Hackney, Patricia

    2009-01-01

    The purpose of the Sandia National Laboratories (SNL) Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. Quality is defined in the US Department of Energy/National Nuclear Security Agency (DOE/NNSA) Quality Criteria, Revision 10 (QC-1) as 'conformance to customer requirements and expectations'. This quality plan defines the SNL ASC Program software quality engineering (SQE) practices and provides a mapping of these practices to SNL Corporate Process Requirement (CPR) 001.3.6, 'Corporate Software Engineering Excellence'. The plan also identifies ASC management's and the software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals. This SNL ASC Software Quality Plan establishes the signatories' commitments to improving software products by applying cost-effective SQE practices. The plan enumerates the SQE practices that comprise the development of SNL ASC's software products and explains the project teams' opportunities for tailoring and implementing the practices.

  15. Quality Controlling CMIP datasets at GFDL

    NASA Astrophysics Data System (ADS)

    Horowitz, L. W.; Radhakrishnan, A.; Balaji, V.; Adcroft, A.; Krasting, J. P.; Nikonov, S.; Mason, E. E.; Schweitzer, R.; Nadeau, D.

    2017-12-01

    As GFDL makes the switch from model development to production in light of the Coupled Model Intercomparison Project (CMIP), GFDL's efforts have shifted to testing and, more importantly, to establishing guidelines and protocols for quality controlling and semi-automated data publishing. Every CMIP cycle introduces key challenges, and the upcoming CMIP6 is no exception. The new CMIP experimental design comprises multiple MIPs facilitating research in different focus areas. This paradigm has implications not only for the groups that develop the models and conduct the runs, but also for the groups that monitor, analyze and quality control the datasets before publication, before their findings make their way into reports like the IPCC (Intergovernmental Panel on Climate Change) Assessment Reports. In this talk, we discuss some of the paths taken at GFDL to quality control the CMIP-ready datasets, including Jupyter notebooks, PrePARE, and a LAMP (Linux, Apache, MySQL, PHP/Python/Perl) technology-driven tracker system to monitor the status of experiments qualitatively and quantitatively, provide additional metadata and analysis services, and perform built-in controlled-vocabulary validations in the workflow. In addition, we discuss the integration of community-based model evaluation software (ESMValTool, PCMDI Metrics Package, and ILAMB) as part of our CMIP6 workflow.

  16. Oral contraceptive plus antiandrogen therapy and cardiometabolic risk in polycystic ovary syndrome.

    PubMed

    Harmanci, Ayla; Cinar, Nese; Bayraktar, Miyase; Yildiz, Bulent Okan

    2013-01-01

    Oral contraceptives alone or in combination with antiandrogens are commonly used in the treatment of polycystic ovary syndrome (PCOS). We aimed to determine the effects of ethinyl estradiol/drospirenone (EE-DRSP) plus spironolactone therapy on inflammation and cardiometabolic risk in PCOS. Prospective cohort study. Twenty-three lean, normal glucose-tolerant patients with PCOS and 23 age- and body mass index (BMI)-matched healthy control women. Androgens, high-sensitivity C-reactive protein (hsCRP), homocysteine, lipids, fasting insulin, and glucose levels during a standard 75-g, 2-h oral glucose tolerance test were measured. Patients with PCOS were evaluated before and after receiving EE-DRSP (3 mg/30 μg) plus spironolactone (100 mg/day) for 6 months. Healthy controls were evaluated at baseline only. hsCRP, homocysteine, lipid, insulin and glucose levels were similar between patient and control groups at baseline. EE-DRSP plus spironolactone increased hsCRP and homocysteine levels in patients with PCOS (0.50 ± 0.28 vs 1.5 ± 1.3 mg/l, P < 0.05 and 13.1 ± 5.2 vs 17.6 ± 5.3 μmol/l, P < 0.05, respectively). BMI, waist-to-hip ratio, LDL and HDL cholesterol, triglycerides, and glucose tolerance did not change. Modified Ferriman-Gallwey hirsutism scores, testosterone levels and free androgen index improved (9.1 ± 4.2 vs 6.2 ± 3.4, P = 0.001; 80.6 ± 31.1 vs 47.8 ± 20.3 ng/dl, P < 0.05; and 10.5 ± 7.4 vs 1.1 ± 0.8, P < 0.001, respectively). Six months of EE-DRSP plus spironolactone therapy improves androgen excess in lean women with PCOS without any adverse effects on adiposity, glucose tolerance status or lipid profile. However, this combination increases hsCRP and homocysteine levels.

  17. CryoSat Ice Processor: High-Level Overview of Baseline-C Data and Quality-Control

    NASA Astrophysics Data System (ADS)

    Mannan, R.; Webb, E.; Hall, A.; Bouffard, J.; Femenias, P.; Parrinello, T.; Bouffard, J.; Brockley, D.; Baker, S.; Scagliola, M.; Urien, S.

    2016-08-01

    Since April 2015, the CryoSat ice products have been generated with the new Baseline-C Instrument Processing Facilities (IPFs). This represents a major upgrade to the CryoSat ice IPFs and is the baseline for the second CryoSat Reprocessing Campaign. Baseline-C introduces major evolutions with respect to Baseline-B, most notably the release of freeboard data within the L2 SAR products, following optimisation of the SAR retracker. Additional L2 improvements include a new Arctic Mean Sea Surface (MSS) in SAR; a new tuneable land ice retracker in LRM; and a new Digital Elevation Model (DEM) in SARIn. At L1B, new attitude fields have been introduced and existing datation and range biases reduced. This paper provides a high-level overview of the changes and evolutions implemented at Baseline-C in order to improve CryoSat L1B and L2 data characteristics and exploitation over polar regions. An overview of the main Quality Control (QC) activities performed on operational Baseline-C products is also presented.

  18. SIMPATIQCO: a server-based software suite which facilitates monitoring the time course of LC-MS performance metrics on Orbitrap instruments.

    PubMed

    Pichler, Peter; Mazanek, Michael; Dusberger, Frederico; Weilnböck, Lisa; Huber, Christian G; Stingl, Christoph; Luider, Theo M; Straube, Werner L; Köcher, Thomas; Mechtler, Karl

    2012-11-02

    While the performance of liquid chromatography (LC) and mass spectrometry (MS) instrumentation continues to increase, applications such as analyses of complete or near-complete proteomes and quantitative studies require constant and optimal system performance. For this reason, research laboratories and core facilities alike are recommended to implement quality control (QC) measures as part of their routine workflows. Many laboratories perform sporadic quality control checks. However, successive and systematic longitudinal monitoring of system performance would be facilitated by dedicated automatic or semiautomatic software solutions that aid an effortless analysis and display of QC metrics over time. We present the software package SIMPATIQCO (SIMPle AuTomatIc Quality COntrol) designed for evaluation of data from LTQ Orbitrap, Q-Exactive, LTQ FT, and LTQ instruments. A centralized SIMPATIQCO server can process QC data from multiple instruments. The software calculates QC metrics supervising every step of data acquisition from LC and electrospray to MS. For each QC metric the software learns the range indicating adequate system performance from the uploaded data using robust statistics. Results are stored in a database and can be displayed in a comfortable manner from any computer in the laboratory via a web browser. QC data can be monitored for individual LC runs as well as plotted over time. SIMPATIQCO thus assists the longitudinal monitoring of important QC metrics such as peptide elution times, peak widths, intensities, total ion current (TIC) as well as sensitivity, and overall LC-MS system performance; in this way the software also helps identify potential problems. The SIMPATIQCO software package is available free of charge.
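The abstract's "learns the range indicating adequate system performance ... using robust statistics" can be illustrated with a median/MAD band, which a single bad run cannot skew the way a mean/standard-deviation band can. This is a hedged sketch of the idea only, not SIMPATIQCO's actual implementation; the helper names are hypothetical:

```python
import statistics

def robust_range(history, k=3.0):
    """Learn an acceptable [low, high] band for a QC metric from past runs,
    using median +/- k * scaled MAD so outlier runs do not distort the band."""
    med = statistics.median(history)
    mad = statistics.median(abs(v - med) for v in history)
    spread = 1.4826 * mad  # scaled MAD approximates the standard deviation
    return med - k * spread, med + k * spread

def flag_run(value, band):
    """True if the new run's metric falls outside the learned band."""
    low, high = band
    return not (low <= value <= high)
```

For example, with historical peak widths around 10 s, a new run at 100 s is flagged while 10.1 s is not.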

  19. SIMPATIQCO: A Server-Based Software Suite Which Facilitates Monitoring the Time Course of LC–MS Performance Metrics on Orbitrap Instruments

    PubMed Central

    2012-01-01

    While the performance of liquid chromatography (LC) and mass spectrometry (MS) instrumentation continues to increase, applications such as analyses of complete or near-complete proteomes and quantitative studies require constant and optimal system performance. For this reason, research laboratories and core facilities alike are recommended to implement quality control (QC) measures as part of their routine workflows. Many laboratories perform sporadic quality control checks. However, successive and systematic longitudinal monitoring of system performance would be facilitated by dedicated automatic or semiautomatic software solutions that aid an effortless analysis and display of QC metrics over time. We present the software package SIMPATIQCO (SIMPle AuTomatIc Quality COntrol) designed for evaluation of data from LTQ Orbitrap, Q-Exactive, LTQ FT, and LTQ instruments. A centralized SIMPATIQCO server can process QC data from multiple instruments. The software calculates QC metrics supervising every step of data acquisition from LC and electrospray to MS. For each QC metric the software learns the range indicating adequate system performance from the uploaded data using robust statistics. Results are stored in a database and can be displayed in a comfortable manner from any computer in the laboratory via a web browser. QC data can be monitored for individual LC runs as well as plotted over time. SIMPATIQCO thus assists the longitudinal monitoring of important QC metrics such as peptide elution times, peak widths, intensities, total ion current (TIC) as well as sensitivity, and overall LC–MS system performance; in this way the software also helps identify potential problems. The SIMPATIQCO software package is available free of charge. PMID:23088386

  20. Effect of clomifene citrate plus metformin and clomifene citrate plus placebo on induction of ovulation in women with newly diagnosed polycystic ovary syndrome: randomised double blind clinical trial.

    PubMed

    Moll, Etelka; Bossuyt, Patrick M M; Korevaar, Johanna C; Lambalk, Cornelis B; van der Veen, Fulco

    2006-06-24

    To compare the effectiveness of clomifene citrate plus metformin and clomifene citrate plus placebo in women with newly diagnosed polycystic ovary syndrome. Randomised clinical trial. Multicentre trial in 20 Dutch hospitals. 228 women with polycystic ovary syndrome. Clomifene citrate plus metformin or clomifene citrate plus placebo. The primary outcome measure was ovulation. Secondary outcome measures were ongoing pregnancy, spontaneous abortion, and clomifene resistance. 111 women were allocated to clomifene citrate plus metformin (metformin group) and 114 women were allocated to clomifene citrate plus placebo (placebo group). The ovulation rate in the metformin group was 64% compared with 72% in the placebo group, a non-significant difference (risk difference −8%, 95% confidence interval −20% to 4%). There were no significant differences in either rate of ongoing pregnancy (40% v 46%; −6%, −20% to 7%) or rate of spontaneous abortion (12% v 11%; 1%, −7% to 10%). A significantly larger proportion of women in the metformin group discontinued treatment because of side effects (16% v 5%; 11%, 5% to 16%). Metformin is not an effective addition to clomifene citrate as the primary method of inducing ovulation in women with polycystic ovary syndrome. Current Controlled Trials ISRCTN55906981 [controlled-trials.com].
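The risk difference and its confidence interval reported in this record follow the usual two-proportion Wald formula. The sketch below back-calculates approximate event counts from the reported percentages (64% of 111 ≈ 71 ovulations, 72% of 114 ≈ 82) and recovers figures close to the published −8% (−20% to 4%):

```python
import math

def risk_difference(events_a, n_a, events_b, n_b, z=1.96):
    """Risk difference (group A minus group B) with a Wald 95% CI."""
    p_a, p_b = events_a / n_a, events_b / n_b
    rd = p_a - p_b
    se = math.sqrt(p_a * (1 - p_a) / n_a + p_b * (1 - p_b) / n_b)
    return rd, (rd - z * se, rd + z * se)

rd, (lo, hi) = risk_difference(71, 111, 82, 114)  # metformin vs placebo
```

Here `rd` is about −0.08 with a CI of roughly (−0.20, 0.04), consistent with the abstract.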

  1. Vosaroxin plus cytarabine versus placebo plus cytarabine in patients with first relapsed or refractory acute myeloid leukaemia (VALOR): a randomised, controlled, double-blind, multinational, phase 3 study

    PubMed Central

    Ravandi, Farhad; Ritchie, Ellen K.; Sayar, Hamid; Lancet, Jeffrey E.; Craig, Michael D.; Vey, Norbert; Strickland, Stephen A.; Schiller, Gary J.; Jabbour, Elias; Erba, Harry P.; Pigneux, Arnaud; Horst, Heinz-August; Recher, Christian; Klimek, Virginia M.; Cortes, Jorge; Roboz, Gail J.; Odenike, Olatoyosi; Thomas, Xavier; Havelange, Violaine; Maertens, Johan; Derigs, Hans-Günter; Heuser, Michael; Damon, Lloyd; Powell, Bayard L.; Gaidano, Gianluca; Carella, Angelo-Michele; Wei, Andrew; Hogge, Donna; Craig, Adam R.; Fox, Judith A.; Ward, Renee; Smith, Jennifer A.; Acton, Gary; Mehta, Cyrus; Stuart, Robert K.; Kantarjian, Hagop M.

    2016-01-01

    Summary Background Safe and effective treatments are urgently needed for patients with relapsed/refractory acute myeloid leukaemia (AML). We investigated the efficacy and safety of vosaroxin, a first-in-class anticancer quinolone derivative, plus cytarabine in patients with relapsed/refractory AML. Methods VALOR was a phase 3, double-blind, placebo-controlled trial conducted at 101 international sites. Patients were randomised 1:1 to vosaroxin (90 mg/m² IV, days 1 and 4) plus cytarabine (1 g/m² IV, days 1–5) (vos/cyt) or placebo plus cytarabine (pla/cyt) using a permuted block procedure stratified by disease status, age, and geographic location. All participants were blind to treatment assignment. Primary endpoints were overall survival (OS) and 30- and 60-day mortality. Efficacy analyses were by intention-to-treat; safety analyses included all treated patients. This study is registered at clinicaltrials.gov (NCT01191801). Findings Between December 2010 and September 2013, 711 patients were randomised to vos/cyt (n=356) or pla/cyt (n=355). Median OS was 7·5 months with vos/cyt and 6·1 months with pla/cyt (hazard ratio 0·87; unstratified log-rank p=0·061; stratified p=0·0241) and was supported by a sensitivity analysis censoring for subsequent transplant (6·7 and 5·3 months; p=0·0243). Complete remission (CR) rate was higher with vos/cyt vs pla/cyt (30·1% vs 16·3%, p<0·0001). Early mortality rates were equivalent (vos/cyt vs pla/cyt: 30-day, 7·9% vs 6·6%; 60-day, 19·7% vs 19·4%). Treatment-related deaths occurred at any time in 18 patients (5·1%) with vos/cyt and 8 (2·3%) with pla/cyt. Grade ≥3 adverse events more frequent with vos/cyt included febrile neutropenia (167/355 [47%] vs 117/350 [33%]), stomatitis (54 [15%] vs 10 [3%]), hypokalaemia (52 [15%] vs 21 [6%]), sepsis (42 [12%] vs 18 [5%]), and pneumonia (39 [11%] vs 26 [7%]). Interpretation Addition of vosaroxin to cytarabine prolonged survival in patients with relapsed/refractory AML

  2. Automated evaluation of electronic discharge notes to assess quality of care for cardiovascular diseases using Medical Language Extraction and Encoding System (MedLEE)

    PubMed Central

    Lin, Jou-Wei; Yang, Chen-Wei

    2010-01-01

    The objective of this study was to develop and validate an automated acquisition system to assess quality of care (QC) measures for cardiovascular diseases. This system combining searching and retrieval algorithms was designed to extract QC measures from electronic discharge notes and to estimate the attainment rates to the current standards of care. It was developed on the patients with ST-segment elevation myocardial infarction and tested on the patients with unstable angina/non-ST-segment elevation myocardial infarction, both diseases sharing almost the same QC measures. The system was able to reach a reasonable agreement (κ value) with medical experts from 0.65 (early reperfusion rate) to 0.97 (β-blockers and lipid-lowering agents before discharge) for different QC measures in the test set, and then applied to evaluate QC in the patients who underwent coronary artery bypass grafting surgery. The result has validated a new tool to reliably extract QC measures for cardiovascular diseases. PMID:20442141
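The κ values quoted above are Cohen's kappa, the chance-corrected agreement between the automated system and the medical experts. A minimal sketch of the computation for two label sequences:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    # Chance agreement: product of each rater's marginal label frequencies.
    expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return (observed - expected) / (1 - expected)
```

Perfect agreement yields κ = 1, while agreement no better than chance yields κ = 0, so the reported 0.65–0.97 range indicates substantial to near-perfect agreement per measure.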

  3. Indoor airPLUS Videos, Podcasts, Webinars and Interviews

    EPA Pesticide Factsheets

    These webinar presentations will help you discover how Indoor airPLUS homes are designed to improve indoor air quality and increase energy efficiency, and will introduce the key design and construction features included in Indoor airPLUS homes.

  4. A randomized controlled trial of calcium plus vitamin D supplementation and risk of benign proliferative breast disease

    PubMed Central

    Rohan, Thomas E.; Negassa, Abdissa; Chlebowski, Rowan T.; Ceria-Ulep, Clementina D.; Cochrane, Barbara B.; Lane, Dorothy S.; Ginsberg, Mindy; Wassertheil-Smoller, Sylvia; Page, David L.

    2014-01-01

    Experimental evidence provides strong support for anti-carcinogenic effects of calcium and vitamin D with respect to breast cancer. Observational epidemiologic data also provide some support for inverse associations with risk. We tested the effect of calcium plus vitamin D supplementation on risk of benign proliferative breast disease, a condition which is associated with increased risk of breast cancer. We used the Women’s Health Initiative randomized controlled trial. The 36,282 participants were randomized either to 500 mg of elemental calcium as calcium carbonate plus 200 IU of vitamin D3 (GlaxoSmithKline) twice daily (n = 18,176) or to placebo (n = 18,106). Regular mammograms and clinical breast exams were performed. We identified women who had had a biopsy for benign breast disease and subjected histologic sections from the biopsies to standardized review. After an average follow-up period of 6.8 years, 915 incident cases of benign proliferative breast disease had been ascertained, with 450 in the intervention group and 465 in the placebo group. Calcium plus vitamin D supplementation was not associated with altered risk of benign proliferative breast disease overall (hazard ratio = 0.99, 95% confidence interval = 0.86–1.13), or by histologic subtype. Risk varied significantly by levels of age at baseline, but not by levels of other variables. Daily use of 1,000 mg of elemental calcium as calcium carbonate plus 400 IU of vitamin D3 for almost 7 years by postmenopausal women did not alter the overall risk of benign proliferative breast disease. PMID:18853250

  5. Randomized controlled trial of a cognitive-behavioral therapy plus hypnosis intervention to control fatigue in patients undergoing radiotherapy for breast cancer.

    PubMed

    Montgomery, Guy H; David, Daniel; Kangas, Maria; Green, Sheryl; Sucala, Madalina; Bovbjerg, Dana H; Hallquist, Michael N; Schnur, Julie B

    2014-02-20

    The objective of this study was to test the efficacy of cognitive-behavioral therapy plus hypnosis (CBTH) to control fatigue in patients with breast cancer undergoing radiotherapy. We hypothesized that patients in the CBTH group receiving radiotherapy would have lower levels of fatigue than patients in an attention control group. Patients (n = 200) were randomly assigned to either the CBTH (n = 100; mean age, 55.59 years) or attention control (n = 100; mean age, 55.97 years) group. Fatigue was measured at four time points (baseline, end of radiotherapy, 4 weeks, and 6 months after radiotherapy). Fatigue was measured using the Functional Assessment of Chronic Illness Therapy (FACIT) -Fatigue subscale and Visual Analog Scales (VASs; Fatigue and Muscle Weakness). The CBTH group had significantly lower levels of fatigue (FACIT) at the end of radiotherapy (z, 6.73; P < .001), 4-week follow-up (z, 6.98; P < .001), and 6-month follow-up (z, 7.99; P < .001) assessments. Fatigue VAS scores were significantly lower in the CBTH group at the end of treatment (z, 5.81; P < .001) and at the 6-month follow-up (z, 4.56; P < .001), but not at the 4-week follow-up (P < .07). Muscle Weakness VAS scores were significantly lower in the CBTH group at the end of treatment (z, 9.30; P < .001) and at the 6-month follow-up (z, 3.10; P < .02), but not at the 4-week follow-up (P < .13). The results support CBTH as an evidence-based intervention to control fatigue in patients undergoing radiotherapy for breast cancer. CBTH is noninvasive, has no adverse effects, and its beneficial effects persist long after the last intervention session. CBTH seems to be a candidate for future dissemination and implementation.

  6. Randomized Controlled Trial of a Cognitive-Behavioral Therapy Plus Hypnosis Intervention to Control Fatigue in Patients Undergoing Radiotherapy for Breast Cancer

    PubMed Central

    Montgomery, Guy H.; David, Daniel; Kangas, Maria; Green, Sheryl; Sucala, Madalina; Bovbjerg, Dana H.; Hallquist, Michael N.; Schnur, Julie B.

    2014-01-01

    Purpose The objective of this study was to test the efficacy of cognitive-behavioral therapy plus hypnosis (CBTH) to control fatigue in patients with breast cancer undergoing radiotherapy. We hypothesized that patients in the CBTH group receiving radiotherapy would have lower levels of fatigue than patients in an attention control group. Patients and Methods Patients (n = 200) were randomly assigned to either the CBTH (n = 100; mean age, 55.59 years) or attention control (n = 100; mean age, 55.97 years) group. Fatigue was measured at four time points (baseline, end of radiotherapy, 4 weeks, and 6 months after radiotherapy). Fatigue was measured using the Functional Assessment of Chronic Illness Therapy (FACIT) –Fatigue subscale and Visual Analog Scales (VASs; Fatigue and Muscle Weakness). Results The CBTH group had significantly lower levels of fatigue (FACIT) at the end of radiotherapy (z, 6.73; P < .001), 4-week follow-up (z, 6.98; P < .001), and 6-month follow-up (z, 7.99; P < .001) assessments. Fatigue VAS scores were significantly lower in the CBTH group at the end of treatment (z, 5.81; P < .001) and at the 6-month follow-up (z, 4.56; P < .001), but not at the 4-week follow-up (P < .07). Muscle Weakness VAS scores were significantly lower in the CBTH group at the end of treatment (z, 9.30; P < .001) and at the 6-month follow-up (z, 3.10; P < .02), but not at the 4-week follow-up (P < .13). Conclusion The results support CBTH as an evidence-based intervention to control fatigue in patients undergoing radiotherapy for breast cancer. CBTH is noninvasive, has no adverse effects, and its beneficial effects persist long after the last intervention session. CBTH seems to be a candidate for future dissemination and implementation. PMID:24419112

  7. Metformin plus sibutramine for olanzapine-associated weight gain and metabolic dysfunction in schizophrenia: a 12-week double-blind, placebo-controlled pilot study.

    PubMed

    Baptista, Trino; Uzcátegui, Euderruh; Rangel, Nairy; El Fakih, Yamily; Galeazzi, Tatiana; Beaulieu, Serge; de Baptista, Enma Araujo

    2008-05-30

    Metformin (850-1700 mg) plus sibutramine (10-20 mg, n=13) or placebo (n=15) was administered for 12 weeks in olanzapine-treated chronic schizophrenia patients. Weight loss was similar in both groups: -2.8+/-3.2 kg vs. -1.4+/-2.6 kg. Except for preventing a triglyceride increase, the drug combination lacked efficacy for metabolic control in this clinical population.

  8. SQC: secure quality control for meta-analysis of genome-wide association studies.

    PubMed

    Huang, Zhicong; Lin, Huang; Fellay, Jacques; Kutalik, Zoltán; Hubaux, Jean-Pierre

    2017-08-01

    Due to the limited power of small-scale genome-wide association studies (GWAS), researchers tend to collaborate and establish larger consortia in order to perform large-scale GWAS. Genome-wide association meta-analysis (GWAMA) is a statistical tool that aims to synthesize results from multiple independent studies to increase the statistical power and reduce false-positive findings of GWAS. However, it has been demonstrated that the aggregate data of individual studies are subject to inference attacks, hence privacy concerns arise when researchers share study data in GWAMA. In this article, we propose a secure quality control (SQC) protocol, which enables checking the quality of data in a privacy-preserving way without revealing sensitive information to a potential adversary. SQC employs state-of-the-art cryptographic and statistical techniques for privacy protection. We implement the solution in a meta-analysis pipeline with real data to demonstrate the efficiency and scalability on commodity machines. The distributed execution of SQC on a cluster of 128 cores for one million genetic variants takes less than one hour, which is a modest cost considering the roughly 10-month time span usually observed for completion of the QC procedure, including logistics. SQC is implemented in Java and is publicly available at https://github.com/acs6610987/secureqc. Contact: jean-pierre.hubaux@epfl.ch. Supplementary data are available at Bioinformatics online.
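SQC's actual protocol uses sophisticated cryptography not reproduced here. As a toy illustration of the underlying privacy-preserving aggregation idea only, additive secret sharing lets sites pool a statistic while no single party ever sees another site's raw value:

```python
import random

MOD = 2 ** 61 - 1  # a large prime modulus

def share(value, n_parties, modulus=MOD):
    """Split an integer into n additive shares that sum to it mod a prime;
    any n-1 shares together reveal nothing about the value."""
    shares = [random.randrange(modulus) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % modulus)
    return shares

def reconstruct(shares, modulus=MOD):
    """Recover the secret by summing all shares modulo the prime."""
    return sum(shares) % modulus

# Hypothetical example: three sites secret-share their per-variant sample
# counts; the aggregator only ever combines shares, never raw counts.
site_counts = [5000, 3200, 7100]
shared = [share(c, 3) for c in site_counts]
pooled = reconstruct([sum(col) % MOD for col in zip(*shared)])  # 15300
```

The pooled total equals the sum of the site counts, yet each individual count stays hidden from every other party.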

  9. Economic evaluation of pegylated interferon plus ribavirin for treatment of chronic hepatitis C in Thailand: genotype 1 and 6.

    PubMed

    Kapol, Nattiya; Lochid-Amnuay, Surasit; Teerawattananon, Yot

    2016-08-05

    Pegylated interferon alpha 2a, alpha 2b and ribavirin have been included in the National List of Essential Medicines (NLEM) for treatment of only chronic hepatitis C genotypes 2 and 3 in Thailand. This reimbursement policy has not covered other genotypes of hepatitis C virus (HCV) infection, especially genotypes 1 and 6, which account for 30-50 % of all HCV infection in Thailand. Therefore, this research determined whether pegylated interferon alpha 2a or alpha 2b plus ribavirin is more cost-effective than palliative care for treatment of HCV genotypes 1 and 6 in Thailand. A cost-utility analysis using a model-based economic evaluation was conducted from a societal perspective. A Markov model was developed to estimate costs and quality-adjusted life years (QALYs), comparing the combination of pegylated interferon alpha 2a or alpha 2b and ribavirin with usual palliative care for genotype 1 and 6 HCV patients. Health-state transition probabilities, virological responses, and utility values were obtained from published literature. Direct medical and direct non-medical costs were included and retrieved from published articles and the Thai Standard Cost List for Health Technology Assessment. The incremental cost-effectiveness ratio (ICER) was presented as costs in Thai baht per QALY gained. HCV treatment with pegylated interferon alpha 2a or alpha 2b plus ribavirin was dominant, i.e. cost-saving, in Thailand compared to palliative care. The ICER was negative, with lower total costs (peg 2a: 747,718 vs. peg 2b: 819,921 vs. palliative care: 1,169,121 Thai baht) and more QALYs (peg 2a: 13.44 vs. peg 2b: 13.14 vs. palliative care: 11.63 years), in both HCV genotypes 1 and 6. Given these cost-saving results, the Subcommittee for Development of the NLEM recently decided to include both pegylated interferon alpha 2a and alpha 2b in the NLEM for treatment of HCV genotypes 1 and 6.
Economic evaluation for these current drugs can be further applied to other novel
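The "dominant" conclusion in the record above follows directly from the incremental cost-effectiveness arithmetic; a sketch using the cost and QALY figures quoted in the abstract:

```python
def icer(cost_new, qaly_new, cost_old, qaly_old):
    """Incremental cost-effectiveness ratio; lower cost with more QALYs
    means the new option dominates (is cost-saving)."""
    d_cost = cost_new - cost_old
    d_qaly = qaly_new - qaly_old
    status = "dominant" if d_cost <= 0 and d_qaly > 0 else "ratio"
    return status, d_cost / d_qaly

# Peg-interferon alpha 2a vs palliative care, Thai baht and QALYs from the abstract.
status, value = icer(747718, 13.44, 1169121, 11.63)
```

Because the treatment costs less and yields more QALYs than palliative care, the ICER numerator is negative and the intervention is dominant, matching the abstract's conclusion.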

  10. Design of a randomized controlled trial on the effect on return to work with coaching plus light therapy and pulsed electromagnetic field therapy for workers with work-related chronic stress.

    PubMed

    Schoutens, Antonius M C; Frings-Dresen, Monique H W; Sluiter, Judith K

    2016-07-19

    Work-related chronic stress is a common problem among workers. The core complaint is that the employee feels exhausted, which has an effect on the well-being and functioning of the employee, and an impact on the employer and society. The employee's absence is costly due to lost productivity and medical expenses. The usual form of care for work-related chronic stress is coaching, using a cognitive-behavioural approach whose primary aim is to reduce symptoms and improve functioning. Light therapy and pulsed electromagnetic field therapy are used for the treatment of several mental and physical disorders. The objective of this study is to determine whether coaching combined with light therapy plus pulsed electromagnetic field therapy is an effective treatment for reducing absenteeism, fatigue and stress, and improving quality of life compared to coaching alone. The randomized placebo-controlled trial consists of three arms. The population consists of 90 participants with work-related chronic stress complaints. The research groups are: (i) intervention group; (ii) placebo group; and (iii) control group. Participants in the intervention group will be treated with light therapy/pulsed electromagnetic field therapy for 12 weeks, twice a week for 40 min, and coaching (once a fortnight for 50 min). The placebo group receives the same treatment but with the light and pulsed electromagnetic field switched to placebo settings. The control group receives only coaching for 12 weeks, a course of six sessions, once a fortnight for 50 min. The primary outcome is the level of return to work. Secondary outcomes are fatigue, stress and quality of life. Outcomes will be measured at baseline, 6 weeks, 12 and 24 weeks after start of treatment. This study will provide information about the effectiveness of coaching and light therapy plus pulsed electromagnetic field therapy on return to work, and secondly on fatigue, stress and quality of life in people with work-related chronic

  11. The sensitivity of patient specific IMRT QC to systematic MLC leaf bank offset errors.

    PubMed

    Rangel, Alejandra; Palte, Gesa; Dunscombe, Peter

    2010-07-01

    Patient-specific IMRT QC is performed routinely in many clinics as a safeguard against errors and inaccuracies which may be introduced during the complex planning, data transfer, and delivery phases of this type of treatment. The purpose of this work is to evaluate the feasibility of detecting systematic errors in MLC leaf bank position with patient-specific checks. Nine head and neck (H&N) and 14 prostate IMRT beams were delivered using MLC files containing systematic offsets (±1 mm in two banks, ±0.5 mm in two banks, and 1 mm in one bank of leaves). The beams were measured using both MAPCHECK (Sun Nuclear Corp., Melbourne, FL) and the aS1000 electronic portal imaging device (Varian Medical Systems, Palo Alto, CA). Comparisons with calculated fields, without offsets, were made using commonly adopted criteria including absolute dose (AD) difference, relative dose difference, distance to agreement (DTA), and the gamma index. The criteria most sensitive to systematic leaf bank offsets were 3% AD with 3 mm DTA for MAPCHECK and the gamma index with 2% AD and 2 mm DTA for the EPID. The criterion based on relative dose measurements was the least sensitive to MLC offsets. More highly modulated fields, i.e., H&N, showed greater changes in the percentage of passing points due to systematic MLC inaccuracy than prostate fields. None of the techniques or criteria tested is sufficiently sensitive, for this population of IMRT fields, to detect a systematic MLC offset at a clinically significant level on an individual field. Patient-specific QC cannot, therefore, substitute for routine QC of the MLC itself.
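The gamma index referenced above combines a dose-difference and a distance-to-agreement criterion into a single pass/fail number per point. As a simplified 1-D, global, no-interpolation sketch (the clinical tools in the record evaluate 2-D dose distributions), assuming absolute dose tolerances:

```python
import math

def gamma_point(ref_pos, ref_dose, eval_pos, eval_dose, dose_tol, dist_tol):
    """Gamma for one reference point: minimum over evaluated points of
    sqrt((dose diff / dose_tol)^2 + (distance / dist_tol)^2); gamma <= 1 passes."""
    best = math.inf
    for x, d in zip(eval_pos, eval_dose):
        dd = (d - ref_dose) / dose_tol
        dx = (x - ref_pos) / dist_tol
        best = min(best, math.hypot(dd, dx))
    return best

def pass_rate(ref_pts, eval_pts, dose_tol, dist_tol):
    """Fraction of reference (position, dose) points with gamma <= 1,
    e.g. under a 3%/3 mm criterion with dose_tol in absolute dose units."""
    ep, ed = zip(*eval_pts)
    gammas = [gamma_point(x, d, ep, ed, dose_tol, dist_tol) for x, d in ref_pts]
    return sum(g <= 1.0 for g in gammas) / len(gammas)
```

Identical measured and calculated distributions give a 100% pass rate; a point that fails both the dose and distance criteria drags the rate down.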

  12. CPR: MedlinePlus Health Topic

    MedlinePlus


  13. Trichomoniasis: MedlinePlus Health Topic

    MedlinePlus


  14. Colonoscopy: MedlinePlus Health Topic

    MedlinePlus


  15. SU-E-T-473: A Patient-Specific QC Paradigm Based On Trajectory Log Files and DICOM Plan Files

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DeMarco, J; McCloskey, S; Low, D

    Purpose: To evaluate a remote QC tool for monitoring treatment machine parameters and treatment workflow. Methods: The Varian TrueBeam™ linear accelerator is a digital machine that records machine axis parameters and MLC leaf positions as a function of delivered monitor units or control points. This information is saved to a binary trajectory log file for every treatment or imaging field in the patient treatment session. A MATLAB analysis routine was developed to parse the trajectory log files for a given patient, compare the expected versus actual machine and MLC positions, and perform a cross-comparison with the DICOM-RT plan file exported from the treatment planning system. The parsing routine sorts the trajectory log files based on the time and date stamp and generates a sequential report file listing treatment parameters and providing a match relative to the DICOM-RT plan file. Results: The trajectory log parsing routine was compared against a standard record-and-verify listing for patients undergoing initial IMRT dosimetry verification and weekly and final chart QC. The complete treatment course was independently verified for 10 patients of varying treatment site, and a total of 1267 treatment fields were evaluated, including pre-treatment imaging fields where applicable. In the context of IMRT plan verification, eight prostate SBRT plans with 4 arcs per plan were evaluated based on expected versus actual machine axis parameters. The average value of the maximum RMS MLC error was 0.067 ± 0.001 mm and 0.066 ± 0.002 mm for leaf banks A and B, respectively. Conclusion: A real-time QC analysis program was tested using trajectory log files and DICOM-RT plan files. The parsing routine is efficient and able to evaluate all relevant machine axis parameters during a patient treatment course, including MLC leaf positions and table positions at time of image acquisition and during treatment.
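The expected-versus-actual comparison at the heart of such a parser reduces to per-leaf position differences accumulated over control points. The abstract's routine was written in MATLAB; the sketch below is an illustrative Python equivalent with hypothetical names, not the authors' code:

```python
import math

def leaf_rms_error(expected, actual):
    """RMS and maximum absolute discrepancy between planned and delivered
    MLC leaf positions; each argument is a list of control points, each
    control point a list of leaf positions (same units, e.g. mm)."""
    diffs = [a - e
             for exp_cp, act_cp in zip(expected, actual)
             for e, a in zip(exp_cp, act_cp)]
    rms = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return rms, max(abs(d) for d in diffs)
```

Run over a whole field, the maximum RMS per leaf bank yields summary numbers of the kind reported above (e.g. ~0.07 mm), and any outlier flags a field for review.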

  16. Comparison of Microlife BP A200 Plus and Omron M6 blood pressure monitors to detect atrial fibrillation in hypertensive patients.

    PubMed

    Marazzi, Giuseppe; Iellamo, Ferdinando; Volterrani, Maurizio; Lombardo, Mauro; Pelliccia, Francesco; Righi, Daniela; Grieco, Fabrizia; Cacciotti, Luca; Iaia, Luigi; Caminiti, Giuseppe; Rosano, Giuseppe

    2012-01-01

    Self-monitoring home blood pressure (BP) devices are currently recommended for long-term follow-up of hypertension and its management. Some of these devices incorporate algorithms aimed at detecting atrial fibrillation (AF), which is common in essential hypertension. This study was designed to compare the diagnostic accuracy of two widely used home BP monitoring devices in detecting AF in an unselected population of outpatients referred to a hypertension clinic because of high BP. In 503 consecutive patients, the authors simultaneously compared the accuracy of the Microlife® BP A200 Plus (Microlife) and the OMRON® M6 (OMRON) home BP devices in detecting AF. Systolic and diastolic BP as well as heart rate (HR) values detected by the two devices were not significantly different. Pulse irregularity was detected in 124 and 112 patients with the OMRON M6 and Microlife BP A200 Plus devices, respectively. Simultaneous electrocardiogram (ECG) recording revealed that pulse irregularity was due to AF in 101 patients. Pulse irregularity detected by the OMRON M6 device corresponded to AF in 101 patients, to supraventricular premature beats in 18, and to frequent premature ventricular beats in five. Pulse irregularity detected by the Microlife BP A200 Plus device corresponded to AF in 93 patients, to supraventricular premature beats in 14, and to ventricular premature beats in five. The sensitivity for detecting AF was 100%, the specificity 92%, and the diagnostic accuracy 95% for the OMRON M6, and 100%, 92%, and 95% for the Microlife BP A200 Plus, respectively. AF was newly diagnosed by ECG recordings in 47 patients; it was detected in all of these patients by the OMRON device and in 42 patients by the Microlife device. These results indicate that the OMRON M6 is more accurate than the Microlife BP A200 Plus in detecting AF in patients with essential hypertension. Widespread use of these devices in hypertensive patients could be of clinical benefit for the early detection of AF.
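    The sensitivity, specificity, and diagnostic accuracy figures above follow from the standard confusion-matrix definitions. As a hedged sketch (the function name and the worked counts are hypothetical, not the study's data):

    ```python
    def diagnostic_metrics(tp, fp, tn, fn):
        """Standard diagnostic-test metrics from confusion-matrix counts.

        tp: device flagged irregular AND ECG confirmed AF (true positives)
        fp: device flagged irregular but ECG showed no AF (false positives)
        tn: device normal AND ECG showed no AF (true negatives)
        fn: device normal but ECG showed AF (false negatives)
        """
        sensitivity = tp / (tp + fn)
        specificity = tn / (tn + fp)
        accuracy = (tp + tn) / (tp + fp + tn + fn)
        return sensitivity, specificity, accuracy

    # Illustrative counts only (200 hypothetical patients)
    sens, spec, acc = diagnostic_metrics(tp=90, fp=8, tn=92, fn=10)
    ```

    With these illustrative counts the metrics come out to 90%, 92%, and 91%; the study's reported values were derived the same way from ECG-confirmed AF status.
    
    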

  17. Evaluation of quality-control data collected by the U.S. Geological Survey for routine water-quality activities at the Idaho National Laboratory and vicinity, southeastern Idaho, 2002-08

    USGS Publications Warehouse

    Rattray, Gordon W.

    2014-01-01

    Quality-control (QC) samples were collected from 2002 through 2008 by the U.S. Geological Survey, in cooperation with the U.S. Department of Energy, to ensure data robustness by documenting the variability and bias of water-quality data collected at surface-water and groundwater sites at and near the Idaho National Laboratory. QC samples consisted of 139 replicates and 22 blanks (approximately 11 percent of the number of environmental samples collected). Measurements from replicates were used to estimate variability (from field and laboratory procedures and sample heterogeneity), as reproducibility and reliability, of water-quality measurements of radiochemical, inorganic, and organic constituents. Measurements from blanks were used to estimate the potential contamination bias of selected radiochemical and inorganic constituents in water-quality samples, with an emphasis on identifying any cross contamination of samples collected with portable sampling equipment. The reproducibility of water-quality measurements was estimated with calculations of normalized absolute difference for radiochemical constituents and relative standard deviation (RSD) for inorganic and organic constituents. The reliability of water-quality measurements was estimated with pooled RSDs for all constituents. Reproducibility was acceptable for all constituents except dissolved aluminum and total organic carbon. Pooled RSDs were equal to or less than 14 percent for all constituents except for total organic carbon, which had pooled RSDs of 70 percent for the low concentration range and 4.4 percent for the high concentration range. Source-solution and equipment blanks were measured for concentrations of tritium, strontium-90, cesium-137, sodium, chloride, sulfate, and dissolved chromium. Field blanks were measured for the concentration of iodide. No detectable concentrations were measured from the blanks except for strontium-90 in one source solution and one equipment blank collected in September
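    The RSD and pooled-RSD statistics used above can be sketched briefly. This is a minimal illustration, assuming replicate samples come as duplicate pairs and that RSDs are pooled as a root mean square; the exact pooling scheme in the report may differ, and the function names are hypothetical.

    ```python
    import math

    def rsd(x1, x2):
        """Relative standard deviation (%) of a duplicate replicate pair."""
        mean = (x1 + x2) / 2.0
        # For a duplicate pair, SD is estimated as |difference| / sqrt(2)
        sd = abs(x1 - x2) / math.sqrt(2)
        return 100.0 * sd / mean

    def pooled_rsd(pairs):
        """Pool per-pair RSDs across many replicate pairs (root mean square)."""
        return math.sqrt(sum(rsd(a, b) ** 2 for a, b in pairs) / len(pairs))

    # Hypothetical concentrations: one perfect pair, one pair differing by 2 units
    example = pooled_rsd([(10.0, 10.0), (9.0, 11.0)])
    ```

    A pooled RSD near 14 percent, the acceptance threshold cited above for most constituents, would indicate that replicate variability is modest relative to measured concentrations.
    
    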

  18. Social problem-solving plus psychoeducation for adults with personality disorder: pragmatic randomised controlled trial.

    PubMed

    Huband, Nick; McMurran, Mary; Evans, Chris; Duggan, Conor

    2007-04-01

    Social problem-solving therapy may be relevant in the treatment of personality disorder, although assessments of its effectiveness are uncommon. To determine the effectiveness of a problem-solving intervention for adults with personality disorder in the community under conditions resembling routine clinical practice. Participants were randomly allocated to brief psychoeducation plus 16 problem-solving group sessions (n=87) or to waiting-list control (n=89). Primary outcome was comparison of scores on the Social Problem Solving Inventory and the Social Functioning Questionnaire between intervention and control arms at the conclusion of treatment, on average at 24 weeks after randomisation. In intention-to-treat analysis, those allocated to intervention showed significantly better problem-solving skills (P<0.001), higher overall social functioning (P=0.031) and lower anger expression (P=0.039) compared with controls. No significant differences were found on use of services during the intervention period. Problem-solving plus psychoeducation has potential as a preliminary intervention for adults with personality disorder.

  19. Efficacy of tranexamic acid plus drain-clamping to reduce blood loss in total knee arthroplasty: A meta-analysis.

    PubMed

    Zhang, Yan; Zhang, Jun-Wei; Wang, Bao-Hua

    2017-06-01

    Perioperative blood loss is still an unsolved problem in total knee arthroplasty (TKA). The efficacy of the preoperative use of tranexamic acid (TXA) plus drain-clamping to reduce blood loss in TKA has been debated. This meta-analysis aimed to illustrate the efficacy of TXA plus drain-clamping to reduce blood loss in patients who underwent a TKA. In February 2017, a systematic computer-based search was conducted in PubMed, EMBASE, Web of Science, the Cochrane Database of Systematic Reviews, and Google Scholar. Data from patients prepared for TKA in studies that compared TXA plus drain-clamping versus TXA alone, drain-clamping alone, or controls were retrieved. The primary endpoint was the need for transfusion. The secondary outcomes were total blood loss, blood loss in drainage, the decrease in hemoglobin, and the occurrence of deep venous thrombosis. After testing for publication bias and heterogeneity between studies, data were aggregated for random-effects models when necessary. Ultimately, 5 clinical studies with 618 patients (TXA plus drain-clamping group = 249, control group = 130, TXA-alone group = 60, and drain-clamping group = 179) were included. TXA plus drain-clamping reduced the need for transfusion, total blood loss, blood loss in drainage, and the decrease in hemoglobin compared with the control group, the TXA-alone group, and the drain-clamping group (P < .05). There was no significant difference in the occurrence of deep venous thrombosis among the included groups (P > .05). TXA plus drain-clamping can achieve the maximum effects of hemostasis in patients prepared for primary TKA. Because the number and the quality of the included studies were limited, more high-quality randomized controlled trials are needed to identify the optimal dose of TXA and the clamping hours in patients prepared for TKA.
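    The random-effects aggregation mentioned above is commonly done with the DerSimonian-Laird estimator. As a hedged sketch (the meta-analysis does not specify its estimator, and the function name and example inputs are hypothetical):

    ```python
    import math

    def random_effects_pool(effects, variances):
        """DerSimonian-Laird random-effects pooled estimate.

        effects:   per-study effect sizes (e.g. mean differences in blood loss)
        variances: per-study sampling variances of those effects
        Returns (pooled_effect, pooled_standard_error).
        """
        w = [1.0 / v for v in variances]                 # fixed-effect weights
        fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
        # Cochran's Q measures between-study heterogeneity
        q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
        df = len(effects) - 1
        c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
        tau2 = max(0.0, (q - df) / c)                    # between-study variance
        w_re = [1.0 / (v + tau2) for v in variances]     # random-effects weights
        pooled = sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)
        se = math.sqrt(1.0 / sum(w_re))
        return pooled, se

    # Hypothetical: three studies with identical effects, so tau^2 collapses to 0
    pooled, se = random_effects_pool([1.0, 1.0, 1.0], [1.0, 1.0, 1.0])
    ```

    When the studies agree exactly, as in the toy inputs, the estimator reduces to the fixed-effect (inverse-variance) mean; heterogeneous effects inflate tau² and widen the pooled standard error.
    
    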

  20. Training a Chest Compression of 6-7 cm Depth for High Quality Cardiopulmonary Resuscitation in Hospital Setting: A Randomised Controlled Trial.

    PubMed

    Oh, Jaehoon; Lim, Tae Ho; Cho, Youngsuk; Kang, Hyunggoo; Kim, Wonhee; Chee, Youngjoon; Song, Yeongtak; Kim, In Young; Lee, Juncheol

    2016-03-01

    During cardiopulmonary resuscitation (CPR), chest compression (CC) depth is influenced by the surface on which the patient is placed. We hypothesized that training healthcare providers to perform a CC depth of 6-7 cm (instead of 5-6 cm) on a manikin placed on a mattress during in-hospital CPR might improve their achievement of proper CC depth. This prospective randomised controlled study involved 66 premedical students without CPR training. The control group was trained to use a CC depth of 5-6 cm (G 5-6), while the experimental group was taught to use a CC depth of 6-7 cm (G 6-7), with a manikin on the floor. All participants performed CCs for 2 min on a manikin placed on a bed, 1 hour and again 4 weeks after the training, without feedback. The parameters of CC quality (depth, rate, % of accurate depth) were assessed and compared between the two groups. Four students were excluded due to loss to follow-up and recording errors, and data from 62 were analysed. CC depth and % of accurate depth were significantly higher among students in G 6-7 than in G 5-6 both 1 hour and 4 weeks after the training (p<0.001), whereas CC rate did not differ between the two groups (p>0.05). Training healthcare providers to perform a CC depth of 6-7 cm could improve the quality of CC depth when performing CCs on patients placed on a mattress during CPR in a hospital setting.