A comprehensive quality control workflow for paired tumor-normal NGS experiments.
Schroeder, Christopher M; Hilke, Franz J; Löffler, Markus W; Bitzer, Michael; Lenz, Florian; Sturm, Marc
2017-06-01
Quality control (QC) is an important part of all NGS data analysis stages. Many available tools calculate QC metrics from different analysis steps of single-sample experiments (raw reads, mapped reads and variant lists). Multi-sample experiments, such as sequencing of tumor-normal pairs, require additional QC metrics to ensure validity of results. These multi-sample QC metrics still lack standardization. We therefore suggest a new workflow for QC of DNA sequencing of tumor-normal pairs. With this workflow, well-known single-sample QC metrics and additional metrics specific to tumor-normal pairs can be calculated. The segmentation into separate tools offers high flexibility and allows reuse for other purposes. All tools produce qcML, a generic XML format for QC of -omics experiments. qcML uses quality metrics defined in an ontology, which was adapted for NGS. All QC tools are implemented in C++ and run both under Linux and Windows. Plotting requires Python 2.7 and matplotlib. The software is available under the 'GNU General Public License version 2' as part of the ngs-bits project: https://github.com/imgag/ngs-bits. christopher.schroeder@med.uni-tuebingen.de. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
Lelental, Natalia; Brandner, Sebastian; Kofanova, Olga; Blennow, Kaj; Zetterberg, Henrik; Andreasson, Ulf; Engelborghs, Sebastiaan; Mroczko, Barbara; Gabryelewicz, Tomasz; Teunissen, Charlotte; Mollenhauer, Brit; Parnetti, Lucilla; Chiasserini, Davide; Molinuevo, Jose Luis; Perret-Liaudet, Armand; Verbeek, Marcel M; Andreasen, Niels; Brosseron, Frederic; Bahl, Justyna M C; Herukka, Sanna-Kaisa; Hausner, Lucrezia; Frölich, Lutz; Labonte, Anne; Poirier, Judes; Miller, Anne-Marie; Zilka, Norbert; Kovacech, Branislav; Urbani, Andrea; Suardi, Silvia; Oliveira, Catarina; Baldeiras, Ines; Dubois, Bruno; Rot, Uros; Lehmann, Sylvain; Skinningsrud, Anders; Betsou, Fay; Wiltfang, Jens; Gkatzima, Olymbia; Winblad, Bengt; Buchfelder, Michael; Kornhuber, Johannes; Lewczuk, Piotr
2016-03-01
Assay-vendor-independent quality control (QC) samples for neurochemical dementia diagnostics (NDD) biomarkers are so far commercially unavailable. This requires that NDD laboratories prepare their own QC samples, for example by pooling leftover cerebrospinal fluid (CSF) samples. To prepare and test alternative matrices for QC samples that could facilitate intra- and inter-laboratory QC of the NDD biomarkers. Three matrices were validated in this study: (A) human pooled CSF, (B) Aβ peptides spiked into human prediluted plasma, and (C) Aβ peptides spiked into a solution of bovine serum albumin in phosphate-buffered saline. All matrices were also tested after supplementation with an antibacterial agent (sodium azide). We analyzed short- and long-term stability of the biomarkers with ELISA and chemiluminescence (Fujirebio Europe, MSD, IBL International), and performed an inter-laboratory variability study. NDD biomarkers turned out to be stable in almost all samples stored at the tested conditions for up to 14 days as well as in samples stored deep-frozen (at −80°C) for up to one year. Sodium azide did not influence biomarker stability. Inter-center variability of the samples sent at room temperature (pooled CSF, freeze-dried CSF, and four artificial matrices) was comparable to the results obtained on deep-frozen samples in other large-scale projects. Our results suggest that it is possible to replace self-made, CSF-based QC samples with large volumes of QC materials prepared with artificial peptides and matrices. This would greatly facilitate intra- and inter-laboratory QC schedules for NDD measurements.
A Framework for a Quality Control System for Vendor/Processor Contracts.
ERIC Educational Resources Information Center
Advanced Technology, Inc., Reston, VA.
A framework for monitoring quality control (QC) of processor contracts administered by the Department of Education's Office of Student Financial Assistance (OSFA) is presented and applied to the Pell Grant program. Guidelines for establishing QC measures and standards are included, and the uses of a sampling procedure in the QC system are…
Cian, Francesco; Villiers, Elisabeth; Archer, Joy; Pitorri, Francesca; Freeman, Kathleen
2014-06-01
Quality control (QC) validation is an essential tool in total quality management of a veterinary clinical pathology laboratory. Cost-analysis can be a valuable technique to help identify an appropriate QC procedure for the laboratory, although this has never been reported in veterinary medicine. The aim of this study was to determine the applicability of the Six Sigma Quality Cost Worksheets in the evaluation of possible candidate QC rules identified by QC validation. Three months of internal QC records were analyzed. EZ Rules 3 software was used to evaluate candidate QC procedures, and the costs associated with the application of different QC rules were calculated using the Six Sigma Quality Cost Worksheets. The costs associated with the current and the candidate QC rules were compared, and the amount of cost savings was calculated. There was a significant saving when the candidate 1-2.5s, n = 3 rule was applied instead of the currently utilized 1-2s, n = 3 rule. The savings were 75% per year (£8232.5) based on re-evaluating all of the patient samples in addition to the controls, and 72% per year (£822.4) based on re-analyzing only the control materials. The savings were also shown to change accordingly with the number of samples analyzed and with the number of daily QC procedures performed. These calculations demonstrated the importance of the selection of an appropriate QC procedure, and the usefulness of the Six Sigma Costs Worksheet in determining the most cost-effective rule(s) when several candidate rules are identified by QC validation. © 2014 American Society for Veterinary Clinical Pathology and European Society for Veterinary Clinical Pathology.
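To make the cost comparison concrete, here is a minimal Python sketch of the kind of calculation such a worksheet supports: expected false-rejection probabilities for the 1-2s and 1-2.5s rules (n = 3) under a normal in-control model, multiplied by re-run costs. The cost and frequency figures are hypothetical placeholders, not values from the study.

```python
from scipy.stats import norm

def p_false_reject(limit_sd: float, n: int) -> float:
    """P(at least one of n control results exceeds +/- limit_sd) for an in-control process."""
    p_single = 2 * norm.sf(limit_sd)          # two-sided tail probability for one control
    return 1 - (1 - p_single) ** n

COST_PER_REJECTION = 250.0   # hypothetical cost of re-running work after a false rejection
RUNS_PER_YEAR = 365          # hypothetical: one QC event per day

for limit in (2.0, 2.5):     # current 1-2s rule vs candidate 1-2.5s rule, both n = 3
    p = p_false_reject(limit, n=3)
    yearly = p * RUNS_PER_YEAR * COST_PER_REJECTION
    print(f"1-{limit}s, n=3: P(false reject) = {p:.3f}, expected yearly cost = {yearly:,.0f}")
```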
Data-quality measures for stakeholder-implemented watershed-monitoring programs
Greve, Adrienne I.
2002-01-01
Community-based watershed groups, many of which collect environmental data, have steadily increased in number over the last decade. The data generated by these programs are often underutilized due to uncertainty in the quality of data produced. The incorporation of data-quality measures into stakeholder monitoring programs lends statistical validity to data. Data-quality measures are divided into three steps: quality assurance, quality control, and quality assessment. The quality-assurance step attempts to control sources of error that cannot be directly quantified. This step is part of the design phase of a monitoring program and includes clearly defined, quantifiable objectives, sampling sites that meet the objectives, standardized protocols for sample collection, and standardized laboratory methods. Quality control (QC) is the collection of samples to assess the magnitude of error in a data set due to sampling, processing, transport, and analysis. In order to design a QC sampling program, a series of issues needs to be considered: (1) potential sources of error, (2) the type of QC samples, (3) inference space, (4) the number of QC samples, and (5) the distribution of the QC samples. Quality assessment is the process of evaluating quality-assurance measures and analyzing the QC data in order to interpret the environmental data. Quality assessment has two parts: one that is conducted on an ongoing basis as the monitoring program is running, and one that is conducted during the analysis of environmental data. The discussion of the data-quality measures is followed by an example of their application to a monitoring program in the Big Thompson River watershed of northern Colorado.
Cho, Min-Chul; Kim, So Young; Jeong, Tae-Dong; Lee, Woochang; Chun, Sail; Min, Won-Ki
2014-11-01
Verification of a new reagent lot's suitability is necessary to ensure that results for patients' samples are consistent before and after reagent lot changes. A typical procedure is to measure results of some patients' samples along with quality control (QC) materials. In this study, the results of patients' samples and QC materials across reagent lot changes were analysed. In addition, a recommendation regarding QC target range adjustment at reagent lot changes was proposed. Patients' sample and QC material results of 360 reagent lot change events involving 61 analytes and eight instrument platforms were analysed. The between-lot differences for the patients' samples (ΔP) and the QC materials (ΔQC) were tested by Mann-Whitney U tests. The size of the between-lot differences in the QC data was calculated as multiples of standard deviation (SD). The ΔP and ΔQC values differed significantly in only 7.8% of the reagent lot change events. This frequency was not affected by the assay principle or the QC material source. One SD was proposed as the cutoff for maintaining the pre-existing target range after a reagent lot change. While non-commutable QC material results were infrequent in the present study, our data confirmed that QC materials have limited usefulness when assessing new reagent lots. A 1 SD standard for establishing a new QC target range after a reagent lot change event was also proposed. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
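A minimal sketch of the proposed 1 SD criterion; the QC results below are hypothetical, since the abstract reports no raw data.

```python
import numpy as np

def lot_shift_in_sd(qc_old_lot, qc_new_lot):
    """Between-lot QC shift expressed as a multiple of the old lot's SD."""
    old = np.asarray(qc_old_lot, dtype=float)
    new = np.asarray(qc_new_lot, dtype=float)
    return abs(new.mean() - old.mean()) / old.std(ddof=1)

old_lot = [4.9, 5.1, 5.0, 5.2, 4.8, 5.0]   # hypothetical QC results on the old reagent lot
new_lot = [5.1, 5.2, 5.0, 5.3, 5.1, 5.2]   # hypothetical QC results on the new reagent lot

shift = lot_shift_in_sd(old_lot, new_lot)
# proposed criterion: keep the pre-existing target range only if the shift is <= 1 SD
print(f"shift = {shift:.2f} SD ->",
      "keep target range" if shift <= 1.0 else "establish new target range")
```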
The Development of Quality Control Genotyping Approaches: A Case Study Using Elite Maize Lines.
Chen, Jiafa; Zavala, Cristian; Ortega, Noemi; Petroli, Cesar; Franco, Jorge; Burgueño, Juan; Costich, Denise E; Hearne, Sarah J
2016-01-01
Quality control (QC) of germplasm identity and purity is a critical component of breeding and conservation activities. SNP genotyping technologies and increased availability of markers provide the opportunity to employ genotyping as a low-cost and robust component of this QC. In the public sector, available low-cost SNP QC genotyping methods have been developed from very limited panels of 1,000 to 1,500 markers, without broad selection of the most informative SNPs. Selection of optimal SNPs, definition of appropriate germplasm sampling, and platform selection all impact logistical and resource-use considerations for breeding and conservation applications when mainstreaming QC. In order to address these issues, we evaluated the selection and use of SNPs for QC applications from large DArTSeq data sets generated from CIMMYT maize inbred lines (CMLs). Two QC genotyping strategies were developed: the first is a "rapid QC", employing a small number of SNPs to identify potential mislabeling of seed packages or plots; the second is a "broad QC", employing a larger number of SNPs, used to identify each germplasm entry and to measure heterogeneity. The optimal marker selection strategies combined the selection of markers with high minor allele frequency, sampling of clustered SNPs in proportion to marker cluster distance, and selection of markers that maintain a uniform genomic distribution. The rapid and broad QC SNP panels selected using this approach were further validated using blind test assessments of related re-generation samples. The influence of sampling within each line was evaluated. Sampling 192 individuals would result in close to a 100% probability of detecting a 5% contamination in the entry, and approximately a 98% probability of detecting a 2% contamination of the line. These results provide a framework for the establishment of QC genotyping. A comparison of financial and time costs for use of these approaches across different platforms is discussed, providing a framework for institutions involved in maize conservation and breeding to assess the resource-use effectiveness of QC genotyping. Application of these research findings, in combination with existing QC approaches, will ensure the regeneration, distribution, and use in breeding of true-to-type inbred germplasm. These findings also provide an effective approach to optimize SNP selection for QC genotyping in other species.
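The reported detection probabilities follow from a simple binomial sampling model; the sketch below reproduces the ~100% and ~98% figures under an independence assumption that is ours, not the authors'.

```python
def p_detect(contamination_rate: float, n_sampled: int) -> float:
    """Probability that a sample of n individuals contains at least one contaminant,
    treating individuals as independent draws (binomial approximation)."""
    return 1.0 - (1.0 - contamination_rate) ** n_sampled

for rate in (0.05, 0.02):
    print(f"{rate:.0%} contamination, n = 192: P(detect) = {p_detect(rate, 192):.3f}")
# prints ~1.000 for 5% and ~0.979 for 2%, matching the ~100% and ~98% reported
```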
Gantner, Pierre; Mélard, Adeline; Damond, Florence; Delaugerre, Constance; Dina, Julia; Gueudin, Marie; Maillard, Anne; Sauné, Karine; Rodallec, Audrey; Tuaillon, Edouard; Plantier, Jean-Christophe; Rouzioux, Christine; Avettand-Fenoel, Véronique
2017-11-01
Viral reservoirs represent an important barrier to HIV cure. Accurate markers of HIV reservoirs are needed to develop multicenter studies. The aim of this multicenter quality control (QC) was to evaluate the inter-laboratory reproducibility of total HIV-1-DNA quantification. Ten laboratories of the ANRS-AC11 working group participated by quantifying HIV-DNA with a real-time qPCR assay (Biocentric) in four samples (QCMD). Good reproducibility was found between laboratories (standard deviation ≤ 0.2 log₁₀ copies/10⁶ PBMC) for the three positive QC samples that were correctly classified by each laboratory (QC1…
Hesari, Nikou; Kıratlı Yılmazçoban, Nursel; Elzein, Mohamad; Alum, Absar; Abbaszadegan, Morteza
2017-01-03
Rapid bacterial detection using biosensors is a novel approach for microbiological testing applications. Validation of such methods is an obstacle in the adoption of new bio-sensing technologies for water testing. Therefore, establishing a quality assurance and quality control (QA/QC) plan is essential to demonstrate accuracy and reliability of the biosensor method for the detection of E. coli in drinking water samples. In this study, different reagents and assay conditions including temperatures, holding time, E. coli strains and concentrations, dissolving agents, salinity and pH effects, quality of substrates of various suppliers of 4-methylumbelliferyl glucuronide (MUG), and environmental water samples were included in the QA/QC plan and used in the assay optimization and documentation. Furthermore, the procedural QA/QC for the monitoring of drinking water samples was established to validate the performance of the biosensor platform for the detection of E. coli using a culture-based standard technique. Implementing the developed QA/QC plan, the same level of precision and accuracy was achieved using both the standard and the biosensor methods. The established procedural QA/QC for the biosensor will provide a reliable tool for a near real-time monitoring of E. coli in drinking water samples to both industry and regulatory authorities.
Analytical approaches to quality assurance and quality control in rangeland monitoring data
USDA-ARS?s Scientific Manuscript database
Producing quality data to support land management decisions is the goal of every rangeland monitoring program. However, the results of quality assurance (QA) and quality control (QC) efforts to improve data quality are rarely reported. The purpose of QA and QC is to prevent and describe non-sampling...
McClure, Matthew C.; McCarthy, John; Flynn, Paul; McClure, Jennifer C.; Dair, Emma; O'Connell, D. K.; Kearney, John F.
2018-01-01
A major use of genetic data is parentage verification and identification, as inaccurate pedigrees negatively affect genetic gain. Since 2012, the international standard for single nucleotide polymorphism (SNP) verification in Bos taurus cattle has been the ISAG SNP panels. While these ISAG panels provide an increased level of parentage accuracy over microsatellite markers (MS), they can validate the wrong parent at ≤1% misconcordance rate levels, indicating that more SNP are needed if a more accurate pedigree is required. With rapidly increasing numbers of cattle being genotyped in Ireland, representing 61 B. taurus breeds from a wide range of farm types (beef/dairy, AI/pedigree/commercial, purebred/crossbred, and large to small herd sizes), the Irish Cattle Breeding Federation (ICBF) analyzed different SNP densities and determined that a minimum of ≥500 SNP are needed to consistently predict only one set of parents at a ≤1% misconcordance rate. For parentage validation and prediction, ICBF uses 800 SNP (ICBF800) selected based on SNP clustering quality, ISAG200 inclusion, call rate (CR), and minor allele frequency (MAF) in the Irish cattle population. Large datasets require sample and SNP quality control (QC). Most publications deal only with SNP QC via CR, MAF, parent-progeny conflicts, and Hardy-Weinberg deviation, but not with sample QC. We report here parentage, SNP QC, and genomic sample QC pipelines to deal with the unique challenges of >1 million genotypes from a national herd, such as SNP genotype errors from mis-tagging of animals, lab errors, farm errors, and multiple other issues that can arise. We divide the pipeline into two parts: a Genotype QC and an Animal QC pipeline. The Genotype QC identifies samples with low call rate, missing or mixed genotype classes (no BB genotype or ABTG alleles present), and low genotype frequencies. The Animal QC handles situations where the genotype might not belong to the listed individual by identifying: >1 non-matching genotypes per animal, SNP duplicates, sex and breed prediction mismatches, parentage and progeny validation results, and other situations. The Animal QC pipeline makes use of the ICBF800 SNP set where appropriate to identify errors in a computationally efficient yet still highly accurate manner.
Planning Risk-Based SQC Schedules for Bracketed Operation of Continuous Production Analyzers.
Westgard, James O; Bayat, Hassan; Westgard, Sten A
2018-02-01
To minimize patient risk, "bracketed" statistical quality control (SQC) is recommended in the new CLSI guidelines for SQC (C24-Ed4). Bracketed SQC requires that a QC event both precedes and follows (brackets) a group of patient samples. In optimizing a QC schedule, the frequency of QC or run size becomes an important planning consideration to maintain quality and also facilitate responsive reporting of results from continuous operation of high-production analytic systems. Different plans for optimizing a bracketed SQC schedule were investigated on the basis of Parvin's model for patient risk and CLSI C24-Ed4's recommendations for establishing QC schedules. A sigma-metric run size nomogram was used to evaluate different QC schedules for processes of different sigma performance. For high sigma performance, an effective SQC approach is to employ a multistage QC procedure utilizing a "startup" design at the beginning of production and a "monitor" design periodically throughout production. Example QC schedules are illustrated for applications with measurement procedures having 6-σ, 5-σ, and 4-σ performance. Continuous production analyzers that demonstrate high sigma performance can be effectively controlled with multistage SQC designs that employ a startup QC event followed by periodic monitoring or bracketing QC events. Such designs can be optimized to minimize the risk of harm to patients. © 2017 American Association for Clinical Chemistry.
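For reference, the sigma metric that underlies such nomograms is conventionally computed as (TEa − |bias|)/CV; a one-function sketch with hypothetical inputs:

```python
def sigma_metric(tea_pct: float, bias_pct: float, cv_pct: float) -> float:
    """Sigma metric of a measurement procedure: (TEa - |bias|) / CV, all in percent."""
    return (tea_pct - abs(bias_pct)) / cv_pct

# hypothetical inputs: allowable total error 10%, bias 1%, imprecision (CV) 1.5%
print(f"sigma = {sigma_metric(10.0, 1.0, 1.5):.1f}")   # 6.0 -> larger run sizes tolerable
```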
Chow, Judith C; Watson, John G; Robles, Jerome; Wang, Xiaoliang; Chen, L-W Antony; Trimble, Dana L; Kohl, Steven D; Tropp, Richard J; Fung, Kochy K
2011-12-01
Accurate, precise, and valid organic and elemental carbon (OC and EC, respectively) measurements require more effort than the routine analysis of ambient aerosol and source samples. This paper documents the quality assurance (QA) and quality control (QC) procedures that should be implemented to ensure consistency of OC and EC measurements. Prior to field sampling, the appropriate filter substrate must be selected and tested for sampling effectiveness. Unexposed filters are pre-fired to remove contaminants and acceptance tested. After sampling, filters must be stored in the laboratory in clean, labeled containers under refrigeration (<4 °C) to minimize loss of semi-volatile OC. QA activities include participation in laboratory accreditation programs, external system audits, and interlaboratory comparisons. For thermal/optical carbon analyses, periodic QC tests include calibration of the flame ionization detector with different types of carbon standards, thermogram inspection, replicate analyses, quantification of trace oxygen concentrations (<100 ppmv) in the helium atmosphere, and calibration of the sample temperature sensor. These established QA/QC procedures are applicable to aerosol sampling and analysis for carbon and other chemical components.
Southam, Lorraine; Panoutsopoulou, Kalliope; Rayner, N William; Chapman, Kay; Durrant, Caroline; Ferreira, Teresa; Arden, Nigel; Carr, Andrew; Deloukas, Panos; Doherty, Michael; Loughlin, John; McCaskie, Andrew; Ollier, William E R; Ralston, Stuart; Spector, Timothy D; Valdes, Ana M; Wallis, Gillian A; Wilkinson, J Mark; Marchini, Jonathan; Zeggini, Eleftheria
2011-05-01
Imputation is an extremely valuable tool in conducting and synthesising genome-wide association studies (GWASs). Directly typed SNP quality control (QC) is thought to affect imputation quality. It is, therefore, common practice to use quality-controlled (QCed) data as an input for imputing genotypes. This study aims to determine the effect of commonly applied QC steps on imputation outcomes. We performed several iterations of imputing SNPs across chromosome 22 in a dataset consisting of 3177 samples with Illumina 610k (Illumina, San Diego, CA, USA) GWAS data, applying different QC steps each time. The imputed genotypes were compared with the directly typed genotypes. In addition, we investigated the correlation between alternatively QCed data. We also applied a series of post-imputation QC steps balancing elimination of poorly imputed SNPs and information loss. We found that the difference between the unQCed data and the fully QCed data on imputation outcome was minimal. Our study shows that imputation of common variants is generally very accurate and robust to GWAS QC, which is not a major factor affecting imputation outcome. A minority of common-frequency SNPs with particular properties cannot be accurately imputed regardless of QC stringency. These findings may not generalise to the imputation of low frequency and rare variants.
RNA-SeQC: RNA-seq metrics for quality control and process optimization.
DeLuca, David S; Levin, Joshua Z; Sivachenko, Andrey; Fennell, Timothy; Nazaire, Marc-Danie; Williams, Chris; Reich, Michael; Winckler, Wendy; Getz, Gad
2012-06-01
RNA-seq, the application of next-generation sequencing to RNA, provides transcriptome-wide characterization of cellular activity. Assessment of sequencing performance and library quality is critical to the interpretation of RNA-seq data, yet few tools exist to address this issue. We introduce RNA-SeQC, a program which provides key measures of data quality. These metrics include yield, alignment and duplication rates; GC bias, rRNA content, regions of alignment (exon, intron and intragenic), continuity of coverage, 3'/5' bias and count of detectable transcripts, among others. The software provides multi-sample evaluation of library construction protocols, input materials and other experimental parameters. The modularity of the software enables pipeline integration and the routine monitoring of key measures of data quality such as the number of alignable reads, duplication rates and rRNA contamination. RNA-SeQC allows investigators to make informed decisions about sample inclusion in downstream analysis. In summary, RNA-SeQC provides quality control measures critical to experiment design, process optimization and downstream computational analysis. See www.genepattern.org to run online, or www.broadinstitute.org/rna-seqc/ for a command line tool.
Quality control for federal clean water act and safe drinking water act regulatory compliance.
Askew, Ed
2013-01-01
QC sample results are required in order to have confidence in the results from analytical tests. Some of the AOAC water methods include specific QC procedures, frequencies, and acceptance criteria. These are considered to be the minimum controls needed to perform the method successfully. Some regulatory programs, such as those in 40 CFR Part 136.7, require additional QC or have alternative acceptance limits. Essential QC measures include method calibration, reagent standardization, assessment of each analyst's capabilities, analysis of blind check samples, determination of the method's sensitivity (method detection level or quantification limit), and daily evaluation of bias, precision, and the presence of laboratory contamination or other analytical interference. The details of these procedures, their performance frequency, and expected ranges of results are set out in this manuscript. The specific regulatory requirements of 40 CFR Part 136.7 for the Clean Water Act, the laboratory certification requirements of 40 CFR Part 141 for the Safe Drinking Water Act, and the ISO 17025 accreditation requirements under The NELAC Institute are listed.
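One of the listed QC measures, the method detection level, is conventionally computed from replicate low-level spikes as in 40 CFR Part 136 Appendix B; a brief sketch with hypothetical replicate results:

```python
import numpy as np
from scipy.stats import t

def method_detection_limit(replicates):
    """MDL in the style of 40 CFR Part 136 Appendix B: the one-sided 99% Student's t
    quantile times the SD of at least seven low-level spiked replicates."""
    x = np.asarray(replicates, dtype=float)
    return x.std(ddof=1) * t.ppf(0.99, df=len(x) - 1)

spikes = [1.9, 2.1, 2.0, 2.2, 1.8, 2.1, 2.0]   # hypothetical replicate results, ug/L
print(f"MDL = {method_detection_limit(spikes):.2f} ug/L")
```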
QC-ART: A tool for real-time quality control assessment of mass spectrometry-based proteomics data.
Stanfill, Bryan A; Nakayasu, Ernesto S; Bramer, Lisa M; Thompson, Allison M; Ansong, Charles K; Clauss, Therese; Gritsenko, Marina A; Monroe, Matthew E; Moore, Ronald J; Orton, Daniel J; Piehowski, Paul D; Schepmoes, Athena A; Smith, Richard D; Webb-Robertson, Bobbie-Jo; Metz, Thomas O
2018-04-17
Liquid chromatography-mass spectrometry (LC-MS)-based proteomics studies of large sample cohorts can easily require from months to years to complete. Acquiring consistent, high-quality data in such large-scale studies is challenging because of normal variations in instrumentation performance over time, as well as artifacts introduced by the samples themselves, such as those due to collection, storage and processing. Existing quality control methods for proteomics data primarily focus on post-hoc analysis to remove low-quality data that would degrade downstream statistics; they are not designed to evaluate the data in near real-time, which would allow for interventions as soon as deviations in data quality are detected. In addition to flagging analyses that demonstrate outlier behavior, evaluating how the data structure changes over time can aid in understanding typical instrument performance or identify issues such as a degradation in data quality due to the need for instrument cleaning and/or re-calibration. To address this gap for proteomics, we developed Quality Control Analysis in Real-Time (QC-ART), a tool for evaluating data as they are acquired in order to dynamically flag potential issues with instrument performance or sample quality. QC-ART has accuracy similar to that of standard post-hoc analysis methods with the additional benefit of real-time analysis. We demonstrate the utility and performance of QC-ART in identifying deviations in data quality due to both instrument and sample issues in near real-time for LC-MS-based plasma proteomics analyses of a sample subset of The Environmental Determinants of Diabetes in the Young cohort. We also present a case where QC-ART facilitated the identification of oxidative modifications, which are often underappreciated in proteomic experiments. Published under license by The American Society for Biochemistry and Molecular Biology, Inc.
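QC-ART's actual statistics are not described in this abstract; as a generic illustration of the near real-time idea, the sketch below flags each new run against a rolling window of recent runs using a robust z-score. The metric values are hypothetical.

```python
import numpy as np

def flag_runs(metric_stream, window=20, z_cut=3.5, min_history=5):
    """Generic near real-time QC flagging (not QC-ART's actual algorithm):
    robust z-score of each new run against a rolling window of recent runs."""
    flags = []
    for i, x in enumerate(metric_stream):
        past = np.asarray(metric_stream[max(0, i - window):i], dtype=float)
        if past.size < min_history:
            flags.append(False)                      # not enough history yet
            continue
        med = np.median(past)
        mad = np.median(np.abs(past - med)) or 1e-9  # robust spread estimate
        flags.append(abs(x - med) / (1.4826 * mad) > z_cut)
    return flags

peptide_ids = [4200, 4150, 4310, 4280, 4190, 4250, 4220, 2900, 4230]  # hypothetical
print(flag_runs(peptide_ids, window=5))   # only the 2900 run is flagged
```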
Jiang, Jian; James, Christopher A; Wong, Philip
2016-09-05
An LC-MS/MS method has been developed and validated for the determination of glycine in human cerebrospinal fluid (CSF). The validated method used artificial cerebrospinal fluid as a surrogate matrix for calibration standards. The calibration curve range for the assay was 100–10,000 ng/mL, and ¹³C₂,¹⁵N-glycine was used as an internal standard (IS). Pre-validation experiments were performed to demonstrate parallelism with surrogate matrix and standard addition methods. The mean endogenous glycine concentration in a pooled human CSF determined on three days by using artificial CSF as a surrogate matrix and by the method of standard addition was found to be 748 ± 30.6 and 768 ± 18.1 ng/mL, respectively. A percentage difference of −2.6% indicated that artificial CSF could be used as a surrogate calibration matrix for the determination of glycine in human CSF. Quality control (QC) samples, except the lower limit of quantitation (LLOQ) QC and low QC samples, were prepared by spiking glycine into aliquots of a pooled human CSF sample. The low QC sample was prepared from a separate pooled human CSF sample containing low endogenous glycine concentrations, while the LLOQ QC sample was prepared in artificial CSF. Standard addition was used extensively to evaluate matrix effects during validation. The validated method was used to determine the endogenous glycine concentrations in human CSF samples. Incurred sample reanalysis demonstrated reproducibility of the method. Copyright © 2016 Elsevier B.V. All rights reserved.
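A sketch of the standard-addition calculation used for the surrogate-matrix comparison; the spiked responses below are hypothetical, chosen so the extrapolated x-intercept lands near the reported 748 ng/mL mean.

```python
import numpy as np

# Hypothetical standard-addition data: known amounts of glycine spiked into one CSF pool.
added = np.array([0.0, 250.0, 500.0, 1000.0, 2000.0])   # ng/mL added
response = np.array([0.75, 1.00, 1.26, 1.75, 2.76])     # instrument response (assumed)

slope, intercept = np.polyfit(added, response, 1)        # linear fit of response vs spike
endogenous = intercept / slope                           # magnitude of the x-intercept
print(f"endogenous glycine ~ {endogenous:.0f} ng/mL")    # ~748 ng/mL with these numbers
```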
A Total Quality-Control Plan with Right-Sized Statistical Quality-Control.
Westgard, James O
2017-03-01
A new Clinical Laboratory Improvement Amendments option for risk-based quality-control (QC) plans became effective in January 2016. Called an Individualized QC Plan, this option requires the laboratory to perform a risk assessment, develop a QC plan, and implement a QC program to monitor ongoing performance of the QC plan. Difficulties in performing a risk assessment may limit the validity of an Individualized QC Plan. A better alternative is to develop a Total QC Plan including a right-sized statistical QC procedure to detect medically important errors. Westgard Sigma Rules provides a simple way to select the right control rules and the right number of control measurements. Copyright © 2016 Elsevier Inc. All rights reserved.
Brown, Joseph; Pirrung, Meg; McCue, Lee Ann
2017-06-09
FQC is software that facilitates quality control of FASTQ files by carrying out a QC protocol using FastQC, parsing results, and aggregating quality metrics into an interactive dashboard designed to richly summarize individual sequencing runs. The dashboard groups samples in dropdowns for navigation among the data sets, utilizes human-readable configuration files to manipulate the pages and tabs, and is extensible with CSV data. FQC is implemented in Python 3 and JavaScript, and is maintained under an MIT license. Documentation and source code are available at: https://github.com/pnnl/fqc. joseph.brown@pnnl.gov. © The Author(s) 2017. Published by Oxford University Press.
Mapp, Latisha; Klonicki, Patricia; Takundwa, Prisca; Hill, Vincent R; Schneeberger, Chandra; Knee, Jackie; Raynor, Malik; Hwang, Nina; Chambers, Yildiz; Miller, Kenneth; Pope, Misty
2015-11-01
The U.S. Environmental Protection Agency's (EPA) Water Laboratory Alliance (WLA) currently uses ultrafiltration (UF) for concentration of biosafety level 3 (BSL-3) agents from large volumes (up to 100 L) of drinking water prior to analysis. Most UF procedures require comprehensive training and practice to achieve and maintain proficiency. As a result, there was a critical need to develop quality control (QC) criteria. Because select agents are difficult to work with and pose a significant safety hazard, QC criteria were developed using surrogates, including Enterococcus faecalis and Bacillus atrophaeus. This article presents the results from the QC criteria development study and results from a subsequent demonstration exercise in which E. faecalis was used to evaluate proficiency in using UF to concentrate large-volume drinking water samples. Based on preliminary testing, EPA Method 1600 and Standard Methods 9218 (for E. faecalis and B. atrophaeus, respectively) were selected for use during the QC criteria development study. The QC criteria established for Method 1600 were used to assess laboratory performance during the demonstration exercise. Based on the results of the QC criteria study, E. faecalis and B. atrophaeus can be used effectively to demonstrate and maintain proficiency in using ultrafiltration. Published by Elsevier B.V.
Evaluation of peak picking quality in LC-MS metabolomics data.
Brodsky, Leonid; Moussaieff, Arieh; Shahaf, Nir; Aharoni, Asaph; Rogachev, Ilana
2010-11-15
The output of LC-MS metabolomics experiments consists of mass-peak intensities identified through a peak-picking/alignment procedure. Besides imperfections in biological samples and instrumentation, data accuracy is highly dependent on the applied algorithms and their parameters. Consequently, quality control (QC) is essential for further data analysis. Here, we present a QC approach that is based on discrepancies between replicate samples. First, the quantile normalization of per-sample log-signal distributions is applied to each group of biologically homogeneous samples. Next, the overall quality of each replicate group is characterized by the Z-transformed correlation coefficients between samples. This general QC allows a tuning of the procedure's parameters which minimizes the inter-replicate discrepancies in the generated output. Subsequently, an in-depth QC measure detects local neighborhoods on a template of aligned chromatograms that are enriched by divergences between intensity profiles of replicate samples. These neighborhoods are determined through a segmentation algorithm. The retention time (RT)-m/z positions of the neighborhoods with local divergences are indicative of incorrect alignment of chromatographic features, technical problems in the chromatograms, or a true biological discrepancy between replicates for particular metabolites. We expect this method to aid in the accurate analysis of metabolomics data and in the development of new peak-picking/alignment procedures.
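The group-level QC statistic described, Z-transformed inter-replicate correlations, can be sketched as follows (Fisher z = atanh r; the intensity data are simulated, not from the paper):

```python
import numpy as np

def replicate_group_z(log_intensities):
    """Fisher z-transformed pairwise correlations between per-sample
    log-signal profiles within one biologically homogeneous group."""
    X = np.asarray(log_intensities, dtype=float)             # rows = replicate samples
    r = np.corrcoef(X)                                       # sample-by-sample correlations
    iu = np.triu_indices_from(r, k=1)
    return np.arctanh(np.clip(r[iu], -0.999999, 0.999999))   # z = atanh(r)

rng = np.random.default_rng(0)
base = rng.normal(10.0, 2.0, 500)                     # simulated peak log-intensities
replicates = base + rng.normal(0.0, 0.1, (3, 500))    # three concordant replicates
print(replicate_group_z(replicates))                  # low z values would flag a group
```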
Batt, Angela L; Furlong, Edward T; Mash, Heath E; Glassmeyer, Susan T; Kolpin, Dana W
2017-02-01
A national-scale survey of 247 contaminants of emerging concern (CECs), including organic and inorganic chemical compounds, and microbial contaminants, was conducted in source and treated drinking water samples from 25 treatment plants across the United States. Multiple methods were used to determine these CECs, including six analytical methods to measure 174 pharmaceuticals, personal care products, and pesticides. A three-component quality assurance/quality control (QA/QC) program was designed for the subset of 174 CECs which allowed us to assess and compare performances of the methods used. The three components included: 1) a common field QA/QC protocol and sample design, 2) individual investigator-developed method-specific QA/QC protocols, and 3) a suite of 46 method comparison analytes that were determined in two or more analytical methods. Overall method performance for the 174 organic chemical CECs was assessed by comparing spiked recoveries in reagent, source, and treated water over a two-year period. In addition to the 247 CECs reported in the larger drinking water study, another 48 pharmaceutical compounds measured did not consistently meet predetermined quality standards. Methodologies that did not seem suitable for these analytes are overviewed. The need to exclude analytes based on method performance demonstrates the importance of additional QA/QC protocols. Published by Elsevier B.V.
An Automatic Quality Control Pipeline for High-Throughput Screening Hit Identification.
Zhai, Yufeng; Chen, Kaisheng; Zhong, Yang; Zhou, Bin; Ainscow, Edward; Wu, Ying-Ta; Zhou, Yingyao
2016-09-01
The correction or removal of signal errors in high-throughput screening (HTS) data is critical to the identification of high-quality lead candidates. Although a number of strategies have been previously developed to correct systematic errors and to remove screening artifacts, they are not universally effective and still require a fair amount of human intervention. We introduce a fully automated quality control (QC) pipeline that can correct generic interplate systematic errors and remove intraplate random artifacts. The new pipeline was first applied to ~100 large-scale historical HTS assays; in silico analysis showed auto-QC led to a noticeably stronger structure-activity relationship. The method was further tested in several independent HTS runs, where QC results were sampled for experimental validation. Significantly increased hit confirmation rates were obtained after the QC steps, confirming that the proposed method was effective in enriching true-positive hits. An implementation of the algorithm is available to the screening community. © 2016 Society for Laboratory Automation and Screening.
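The paper's auto-QC algorithm is not specified in this abstract; a common generic approach to interplate systematic error in HTS is a B-score-style median polish, sketched here on a simulated plate with column drift:

```python
import numpy as np

def median_polish_residuals(plate, n_iter=10):
    """Row/column median polish: iteratively removes additive row and column
    effects from a plate of HTS readouts, leaving residual activities."""
    z = np.array(plate, dtype=float)
    for _ in range(n_iter):
        z -= np.median(z, axis=1, keepdims=True)   # strip row (e.g. dispenser) effects
        z -= np.median(z, axis=0, keepdims=True)   # strip column (e.g. edge) effects
    return z

rng = np.random.default_rng(1)
plate = rng.normal(100.0, 5.0, (16, 24))           # simulated 384-well plate signal
plate += np.linspace(0.0, 20.0, 24)                # simulated left-to-right drift
corrected = median_polish_residuals(plate)
print(np.median(corrected, axis=0).round(1))       # column effects now near zero
```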
Quality Control (QC) System Development for the Pell Grant Program: A Conceptual Framework.
ERIC Educational Resources Information Center
Advanced Technology, Inc., Reston, VA.
The objectives of the Pell Grant quality control (QC) system and the general definition of QC are considered. Attention is also directed to: the objectives of the Stage II Pell Grant QC system design and testing project, the approach used to develop the QC system, and the interface of the QC system and the Pell Grant delivery system. The…
General Quality Control (QC) Guidelines for SAM Methods
Learn more about quality control guidelines and recommendations for the analysis of samples using the methods listed in EPA's Selected Analytical Methods for Environmental Remediation and Recovery (SAM).
Proximate Composition Analysis.
2016-01-01
The proximate composition of foods includes moisture, ash, lipid, protein and carbohydrate contents. These food components may be of interest in the food industry for product development, quality control (QC) or regulatory purposes. Analyses used may be rapid methods for QC or more accurate but time-consuming official methods. Sample collection and preparation must be considered carefully to ensure analysis of a homogeneous and representative sample, and to obtain accurate results. Estimation methods of moisture content, ash value, crude lipid, total carbohydrates, starch, total free amino acids and total proteins are put together in a lucid manner.
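One rapid calculation in this family is total carbohydrate estimated by difference; a minimal sketch with hypothetical values:

```python
def carbohydrate_by_difference(moisture, ash, lipid, protein):
    """Total carbohydrate (%) by difference: 100 - (moisture + ash + lipid + protein)."""
    return 100.0 - (moisture + ash + lipid + protein)

# hypothetical proximate values for a cereal flour, % by mass
print(carbohydrate_by_difference(moisture=12.0, ash=1.5, lipid=2.0, protein=10.5))  # 74.0
```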
CHALLENGES IN SETTING UP QUALITY CONTROL IN DIAGNOSTIC RADIOLOGY FACILITIES IN NIGERIA.
Inyang, S O; Egbe, N O; Ekpo, E
2015-01-01
The Nigerian Nuclear Regulatory Authority (NNRA) was established to regulate and control the use of radioactive and radiation-emitting sources in Nigeria. Quality control (QC) of diagnostic radiology equipment forms part of the fundamental requirements for the authorization of diagnostic radiology facilities in the country. Some quality control tests (output, exposure linearity and reproducibility) were measured on the x-ray machines in the facilities that took part in the study. A questionnaire was developed to evaluate the frequencies at which QC tests were conducted in the facilities and the challenges in setting up QC. Results show great variation in the values of the QC parameters measured. Inadequate cooperation by facilities' management, lack of QC equipment and insufficient staff form the major challenges in setting up QC in the facilities under study. The responses on the frequencies at which QC tests should be conducted did not correspond to the recommended standards, indicating that personnel were not familiar with QC implementation and may require further training on QC.
Quality assurance and quality control of geochemical data—A primer for the research scientist
Geboy, Nicholas J.; Engle, Mark A.
2011-01-01
Geochemistry is a constantly expanding science. More and more, scientists are employing geochemical tools to help answer questions about the Earth and earth system processes. Scientists may assume that the responsibility of examining and assessing the quality of the geochemical data they generate is not theirs but rather that of the analytical laboratories to which their samples have been submitted. This assumption may be partially based on knowledge about internal and external quality assurance and quality control (QA/QC) programs in which analytical laboratories typically participate. Or there may be a perceived lack of time or resources to adequately examine data quality. Regardless of the reason, the lack of QA/QC protocols can lead to the generation and publication of erroneous data. Because the interpretations drawn from the data are primary products to U.S. Geological Survey (USGS) stakeholders, the consequences of publishing erroneous results can be significant. The principal investigator of a scientific study ultimately is responsible for the quality and interpretation of the project's findings, and thus must also play a role in the understanding, implementation, and presentation of QA/QC information about the data. Although occasionally ignored, QA/QC protocols apply not only to procedures in the laboratory but also in the initial planning of a research study and throughout the life of the project. Many of the tenets of developing a sound QA/QC program or protocols also parallel the core concepts of developing a good study: What is the main objective of the study? Will the methods selected provide data of enough resolution to answer the hypothesis? How should samples be collected? Are there known or unknown artifacts or contamination sources in the sampling and analysis methods? Assessing data quality requires communication between the scientists responsible for designing the study and those collecting samples, analyzing samples, treating data, and interpreting results. This primer has been developed to provide basic information and guidance about developing QA/QC protocols for geochemical studies. It is not intended to be a comprehensive guide but rather an introduction to key concepts tied to a list of relevant references for further reading. The guidelines are presented in stepwise order beginning with presampling considerations and continuing through final data interpretation. The goal of this primer is to outline basic QA/QC practices that scientists can use before, during, and after chemical analysis to ensure the validity of the data they collect with the goal of providing defendable results and conclusions.
Phase 2 Site Investigations Report. Volume 3 of 3: Appendices
1994-09-01
Phase II Site Investigations Report, Volume III of III: Appendices. Fort Devens Sudbury Training Annex, Massachusetts, September 1994. Contract No. … laboratory quality control (QC) samples collected during field investigations at the Sudbury Training Annex of Fort Devens, Massachusetts. The QC … returned to its original condition. E & E performed this procedure for each monitoring well tested during the 1993 slug testing activities at Fort Devens
Torres, Leticia; Liu, Yue; Guitreau, Amy; Yang, Huiping; Tiersch, Terrence R
2017-12-01
Quality control (QC) is essential for reproducible and efficient functioning of germplasm repositories. However, many biomedical fish models present significant QC challenges due to small body sizes (<5 cm) and minuscule sperm volumes (<5 μL). Using minimal volumes of sperm, we evaluated in zebrafish common QC endpoints as surrogates for fertilization success along sequential steps of cryopreservation. First, concentrations of calibration bead suspensions were evaluated with a Makler® counting chamber by using different sample volumes and mixing methods. For sperm analysis, samples were initially diluted at a 1:30 ratio with Hanks' balanced salt solution (HBSS). Motility was evaluated by using different ratios of sperm and activation medium, and membrane integrity was analyzed with flow cytometry at different concentrations. Concentration and sperm motility could be confidently estimated by using volumes as small as 1 μL, whereas membrane integrity required a minimum of 2 μL (at 1 × 10⁶ cells/mL). Thus, <5 μL of sperm suspension (after dilution to 30-150 μL with HBSS) was required to evaluate sperm quality by using three endpoints. Sperm quality assessment using a combination of complementary endpoints enhances QC efforts during cryopreservation, increasing reliability and reproducibility, and reducing waste of time and resources.
Kim, Sung-Su; Choi, Hyun-Jeung; Kim, Jin Ju; Kim, M Sun; Lee, In-Seon; Byun, Bohyun; Jia, Lina; Oh, Myung Ryurl; Moon, Youngho; Park, Sarah; Choi, Joon-Seok; Chae, Seoung Wan; Nam, Byung-Ho; Kim, Jin-Soo; Kim, Jihun; Min, Byung Soh; Lee, Jae Seok; Won, Jae-Kyung; Cho, Soo Youn; Choi, Yoon-La; Shin, Young Kee
2018-01-11
In clinical translational research and molecular in vitro diagnostics, a major challenge in the detection of genetic mutations is overcoming artefactual results caused by the low-quality of formalin-fixed paraffin-embedded tissue (FFPET)-derived DNA (FFPET-DNA). Here, we propose the use of an 'internal quality control (iQC) index' as a criterion for judging the minimum quality of DNA for PCR-based analyses. In a pre-clinical study comparing the results from droplet digital PCR-based EGFR mutation test (ddEGFR test) and qPCR-based EGFR mutation test (cobas EGFR test), iQC index ≥ 0.5 (iQC copies ≥ 500, using 3.3 ng of FFPET-DNA [1,000 genome equivalents]) was established, indicating that more than half of the input DNA was amplifiable. Using this criterion, we conducted a retrospective comparative clinical study of the ddEGFR and cobas EGFR tests for the detection of EGFR mutations in non-small cell lung cancer (NSCLC) FFPET-DNA samples. Compared with the cobas EGFR test, the ddEGFR test exhibited superior analytical performance and equivalent or higher clinical performance. Furthermore, iQC index is a reliable indicator of the quality of FFPET-DNA and could be used to prevent incorrect diagnoses arising from low-quality samples.
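A minimal sketch of the iQC index as defined above (3.3 ng of human DNA ≈ 1,000 genome equivalents, i.e., ~3.3 pg per haploid genome); the copy count is hypothetical.

```python
def iqc_index(amplifiable_copies: float, dna_input_ng: float,
              ng_per_genome_equivalent: float = 0.0033) -> float:
    """iQC index: amplifiable copies per genome equivalent of input FFPET-DNA
    (3.3 ng of human DNA corresponds to roughly 1,000 genome equivalents)."""
    genome_equivalents = dna_input_ng / ng_per_genome_equivalent
    return amplifiable_copies / genome_equivalents

idx = iqc_index(amplifiable_copies=620, dna_input_ng=3.3)   # hypothetical ddPCR count
print(f"iQC index = {idx:.2f} ->", "acceptable" if idx >= 0.5 else "reject sample")
```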
Impact of dose calibrators quality control programme in Argentina
NASA Astrophysics Data System (ADS)
Furnari, J. C.; de Cabrejas, M. L.; del C. Rotta, M.; Iglicki, F. A.; Milá, M. I.; Magnavacca, C.; Dima, J. C.; Rodríguez Pasqués, R. H.
1992-02-01
The national quality control (QC) programme for radionuclide calibrators started 12 years ago. Accuracy and the implementation of a QC programme were evaluated over all these years at 95 nuclear medicine laboratories where dose calibrators were in use. During all that time, the Metrology Group of CNEA has distributed ¹³⁷Cs sealed sources to check stability and has been performing periodic "checking rounds" and postal surveys using unknown samples (external quality control). An account of the results of both methods is presented. At present, more than 65% of the dose calibrators measure activities with an error of less than 10%.
Adjustment of pesticide concentrations for temporal changes in analytical recovery, 1992–2010
Martin, Jeffrey D.; Eberle, Michael
2011-01-01
Recovery is the proportion of a target analyte that is quantified by an analytical method and is a primary indicator of the analytical bias of a measurement. Recovery is measured by analysis of quality-control (QC) water samples that have known amounts of target analytes added ("spiked" QC samples). For pesticides, recovery is the measured amount of pesticide in the spiked QC sample expressed as a percentage of the amount spiked, ideally 100 percent. Temporal changes in recovery have the potential to adversely affect time-trend analysis of pesticide concentrations by introducing trends in apparent environmental concentrations that are caused by trends in performance of the analytical method rather than by trends in pesticide use or other environmental conditions. This report presents data and models related to the recovery of 44 pesticides and 8 pesticide degradates (hereafter referred to as "pesticides") that were selected for a national analysis of time trends in pesticide concentrations in streams. Water samples were analyzed for these pesticides from 1992 through 2010 by gas chromatography/mass spectrometry. Recovery was measured by analysis of pesticide-spiked QC water samples. Models of recovery, based on robust, locally weighted scatterplot smooths (lowess smooths) of matrix spikes, were developed separately for groundwater and stream-water samples. The models of recovery can be used to adjust concentrations of pesticides measured in groundwater or stream-water samples to 100 percent recovery to compensate for temporal changes in the performance (bias) of the analytical method.
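A sketch of the adjustment described: model recovery over time with a lowess smooth, then scale a measured concentration to 100% recovery. The recovery series and sample result are simulated, not the report's data.

```python
import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess

rng = np.random.default_rng(2)
# simulated matrix-spike recoveries (%) for one pesticide, 1992-2010
years = np.linspace(1992.0, 2010.0, 80)
recovery = 95.0 - 0.8 * (years - 1992.0) + rng.normal(0.0, 4.0, years.size)

model = lowess(recovery, years, frac=0.5)          # robust smooth of recovery vs time
modeled = np.interp(2005.0, model[:, 0], model[:, 1])

measured = 0.042                                   # ug/L, hypothetical stream sample
adjusted = measured / (modeled / 100.0)            # rescale to 100% recovery
print(f"modeled recovery {modeled:.1f}% -> adjusted concentration {adjusted:.4f} ug/L")
```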
Park, Sang Hyuk; Park, Chan-Jeoung; Kim, Mi-Jeong; Choi, Mi-Ok; Han, Min-Young; Cho, Young-Uk; Jang, Seongsoo
2014-12-01
We developed and validated an interinstrument comparison method for automatic hematology analyzers, based on a 99th percentile coefficient of variation (CV) cutoff for daily means, in both patient samples and quality control (QC) materials. A total of 120 patient samples were obtained over 6 months. Data from the first 3 months were used to determine 99th percentile CV cutoff values, and data obtained in the last 3 months were used to calculate acceptable ranges and rejection rates. Identical analyses were also performed using QC materials. Two-instrument comparisons were also performed, and the most appropriate allowable total error (ATE) values were determined. The rejection rates based on the 99th percentile cutoff values were within 10.00% and 9.30% for the patient samples and QC materials, respectively. The acceptable ranges of QC materials based on the currently used method were wider than those calculated from the 99th percentile CV cutoff values for most items. In two-instrument comparisons, 34.8% of all comparisons failed, and 87.0% of failed comparisons were successful when 4 SD was applied as the ATE value instead of 3 SD. The 99th percentile CV cutoff value-derived daily acceptable ranges can be used as a real-time interinstrument comparison method in both patient samples and QC materials. Applying 4 SD as the ATE value can significantly reduce unnecessary follow-up recalibration in leukocyte differential counts, reticulocytes, and mean corpuscular volume. Copyright© by the American Society for Clinical Pathology.
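A minimal sketch of the comparison logic, assuming the CV is taken across the daily means of the instruments being compared; all values are simulated, not the study's data.

```python
import numpy as np

def cv_percent(values):
    v = np.asarray(values, dtype=float)
    return v.std(ddof=1) / v.mean() * 100.0

rng = np.random.default_rng(3)
# three months of daily between-instrument CVs from two analyzers (simulated, e.g. MCV in fL)
history = [cv_percent(rng.normal(90.0, 1.2, 2)) for _ in range(90)]
cutoff = np.percentile(history, 99)                # 99th percentile CV cutoff

today = cv_percent([90.4, 91.9])                   # today's daily means, both instruments
print(f"cutoff = {cutoff:.2f}%, today = {today:.2f}% ->",
      "comparison acceptable" if today <= cutoff else "investigate / recalibrate")
```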
QA/QC requirements for physical properties sampling and analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Innis, B.E.
1993-07-21
This report presents results of an assessment of the available information concerning US Environmental Protection Agency (EPA) quality assurance/quality control (QA/QC) requirements and guidance applicable to sampling, handling, and analyzing physical parameter samples at Comprehensive Environmental Restoration, Compensation, and Liability Act (CERCLA) investigation sites. Geotechnical testing laboratories measure the following physical properties of soil and sediment samples collected during CERCLA remedial investigations (RI) at the Hanford Site: moisture content, grain size by sieve, grain size by hydrometer, specific gravity, bulk density/porosity, saturated hydraulic conductivity, moisture retention, unsaturated hydraulic conductivity, and permeability of rocks by flowing air. Geotechnical testing laboratories also measure the following chemical parameters of soil and sediment samples collected during Hanford Site CERCLA RI: calcium carbonate and saturated column leach testing. Physical parameter data are used for (1) characterization of vadose and saturated zone geology and hydrogeology, (2) selection of monitoring well screen sizes, (3) support of modeling and analysis of the vadose and saturated zones, and (4) engineering design. The objectives of this report are to determine the QA/QC levels accepted in EPA Region 10 for the sampling, handling, and analysis of soil samples for physical parameters during CERCLA RI.
The purpose of this presentation is to present an overview of the quality control (QC) sections of a draft EPA document entitled, "Quality Assurance/Quality Control Guidance for Laboratories Performing PCR Analyses on Environmental Samples." This document has been prepared by th...
7 CFR 275.10 - Scope and purpose.
Code of Federal Regulations, 2010 CFR
2010-01-01
... to enhanced funding. (b) The objectives of quality control reviews are to provide: (1) A systematic... FOOD STAMP AND FOOD DISTRIBUTION PROGRAM PERFORMANCE REPORTING SYSTEM Quality Control (QC) Reviews... responsible for conducting quality control reviews. For food stamp quality control reviews, a sample of...
ChronQC: a quality control monitoring system for clinical next generation sequencing.
Tawari, Nilesh R; Seow, Justine Jia Wen; Perumal, Dharuman; Ow, Jack L; Ang, Shimin; Devasia, Arun George; Ng, Pauline C
2018-05-15
ChronQC is a quality control (QC) tracking system for clinical implementation of next-generation sequencing (NGS). ChronQC generates time series plots for various QC metrics to allow comparison of current runs to historical runs. ChronQC has multiple features for tracking QC data including Westgard rules for clinical validity, laboratory-defined thresholds and historical observations within a specified time period. Users can record their notes and corrective actions directly onto the plots for long-term recordkeeping. ChronQC facilitates regular monitoring of clinical NGS to enable adherence to high quality clinical standards. ChronQC is freely available on GitHub (https://github.com/nilesh-tawari/ChronQC), Docker (https://hub.docker.com/r/nileshtawari/chronqc/) and the Python Package Index. ChronQC is implemented in Python and runs on all common operating systems (Windows, Linux and Mac OS X). tawari.nilesh@gmail.com or pauline.c.ng@gmail.com. Supplementary data are available at Bioinformatics online.
Francy, D.S.; Jones, A.L.; Myers, Donna N.; Rowe, G.L.; Eberle, Michael; Sarver, K.M.
1998-01-01
The U.S. Geological Survey (USGS), Water Resources Division (WRD), requires that quality-assurance/quality-control (QA/QC) activities be included in any sampling and analysis program. Operational QA/QC procedures address local needs while incorporating national policies. Therefore, specific technical policies were established for all activities associated with water-quality projects being done by the Ohio District. The policies described in this report provide Ohio District personnel, cooperating agencies, and others with a reference manual on QA/QC procedures that are followed in collecting and analyzing water-quality samples and reporting water-quality information in the Ohio District. The project chief, project support staff, District Water-Quality Specialist, and District Laboratory Coordinator are all involved in planning and implementing QA/QC activities at the district level. The District Chief and other district-level managers provide oversight, and the Regional Water-Quality Specialist, Office of Water Quality (USGS headquarters), and the Branch of Quality Systems within the Office of Water Quality create national QA/QC policies and provide assistance to District personnel. In the literature, the quality of all measurement data is expressed in terms of precision, variability, bias, accuracy, completeness, representativeness, and comparability. In the Ohio District, bias and variability will be used to describe quality-control data generated from samples in the field and laboratory. Each project chief must plan for implementation and financing of QA/QC activities necessary to achieve data-quality objectives. At least 15 percent of the total project effort must be directed toward QA/QC activities. Of this total, 5-10 percent will be used for collection and analysis of quality-control samples. This is an absolute minimum, and more may be required based on project objectives. Proper techniques must be followed in the collection and processing of surface-water, ground-water, biological, precipitation, bed-sediment, bedload, suspended-sediment, and solid-phase samples. These techniques are briefly described in this report and are extensively documented. The reference documents listed in this report will be kept by the District librarian and District Water-Quality Specialist and updated regularly so that they are available to all District staff. Proper handling and documentation before, during, and after field activities are essential to ensure the integrity of the sample and to prevent erroneous reporting of data results. Field sites are to be properly identified and entered into the data base before field data-collection activities begin. During field activities, field notes are to be completed and sample bottles appropriately labeled and stored. After field activities, all paperwork is to be completed promptly and samples transferred to the laboratory within allowable holding times. All equipment used by District personnel for the collection and processing of water-quality samples is to be properly operated, maintained, and calibrated by project personnel. This includes equipment for onsite measurement of water-quality characteristics (temperature, specific conductance, pH, dissolved oxygen, alkalinity, acidity, and turbidity) and equipment and instruments used for biological sampling. The District Water-Quality Specialist and District Laboratory Coordinator are responsible for preventive maintenance and calibration of equipment in the Ohio District laboratory.
The USGS National Water Quality Laboratory in Arvada, Colo., is the primary source of analytical services for most project work done by the Ohio District. Analyses done at the Ohio District laboratory are usually those that must be completed within a few hours of sample collection. Contract laboratories or other USGS laboratories are sometimes used instead of the NWQL or the Ohio District laboratory. When a contract laboratory is used, the projec
Smith, D.B.; Woodruff, L.G.; O'Leary, R. M.; Cannon, W.F.; Garrett, R.G.; Kilburn, J.E.; Goldhaber, M.B.
2009-01-01
In 2004, the US Geological Survey (USGS) and the Geological Survey of Canada sampled and chemically analyzed soils along two transects across Canada and the USA in preparation for a planned soil geochemical survey of North America. This effort was a pilot study to test and refine sampling protocols, analytical methods, quality control protocols, and field logistics for the continental survey. A total of 220 sample sites were selected at approximately 40-km intervals along the two transects. The ideal sampling protocol at each site called for a sample from a depth of 0-5 cm and a composite of each of the O, A, and C horizons. The <2-mm fraction of each sample was analyzed for Al, Ca, Fe, K, Mg, Na, S, Ti, Ag, As, Ba, Be, Bi, Cd, Ce, Co, Cr, Cs, Cu, Ga, In, La, Li, Mn, Mo, Nb, Ni, P, Pb, Rb, Sb, Sc, Sn, Sr, Te, Th, Tl, U, V, W, Y, and Zn by inductively coupled plasma-mass spectrometry and inductively coupled plasma-atomic emission spectrometry following a near-total digestion in a mixture of HCl, HNO3, HClO4, and HF. Separate methods were used for Hg, Se, total C, and carbonate-C on this same size fraction. Only Ag, In, and Te had a large percentage of concentrations below the detection limit. Quality control (QC) of the analyses was monitored at three levels: the laboratory performing the analysis, the USGS QC officer, and the principal investigator for the study. This level of review resulted in an average of one QC sample for every 20 field samples, which proved to be minimally adequate for such a large-scale survey. Additional QC samples should be added to monitor within-batch quality to the extent that no more than 10 samples are analyzed between a QC sample. Only Cr (77%), Y (82%), and Sb (80%) fell outside the acceptable limits of accuracy (% recovery between 85 and 115%) because of likely residence in mineral phases resistant to the acid digestion. A separate sample of 0-5-cm material was collected at each site for determination of organic compounds. A subset of 73 of these samples was analyzed for a suite of 19 organochlorine pesticides by gas chromatography. Only three of these samples had detectable pesticide concentrations. A separate sample of A-horizon soil was collected for microbial characterization by phospholipid fatty acid analysis (PLFA), soil enzyme assays, and determination of selected human and agricultural pathogens. Collection, preservation and analysis of samples for both organic compounds and microbial characterization add a great degree of complication to the sampling and preservation protocols and a significant increase to the cost for a continental-scale survey. Both these issues must be considered carefully prior to adopting these parameters as part of the soil geochemical survey of North America.
7 CFR 275.10 - Scope and purpose.
Code of Federal Regulations, 2011 CFR
2011-01-01
... FOOD STAMP AND FOOD DISTRIBUTION PROGRAM PERFORMANCE REPORTING SYSTEM Quality Control (QC) Reviews... responsible for conducting quality control reviews. For food stamp quality control reviews, a sample of... terminated (called negative cases). Reviews shall be conducted on active cases to determine if households are...
Levey-Jennings Analysis Uncovers Unsuspected Causes of Immunohistochemistry Stain Variability.
Vani, Kodela; Sompuram, Seshi R; Naber, Stephen P; Goldsmith, Jeffrey D; Fulton, Regan; Bogen, Steven A
Almost all clinical laboratory tests use objective, quantitative measures of quality control (QC), incorporating Levey-Jennings analysis and Westgard rules. Clinical immunohistochemistry (IHC) testing, in contrast, relies on subjective, qualitative QC review. The consequences of using Levey-Jennings analysis for QC assessment in clinical IHC testing are not known. To investigate this question, we conducted a 1- to 2-month pilot test wherein the QC for either human epidermal growth factor receptor 2 (HER-2) or progesterone receptor (PR) in 3 clinical IHC laboratories was quantified and analyzed with Levey-Jennings graphs. Moreover, conventional tissue controls were supplemented with a new QC composed of HER-2 or PR peptide antigens coupled onto 8 μm glass beads. At institution 1, this more stringent analysis identified a decrease in the HER-2 tissue control that had escaped notice by subjective evaluation. The decrement was due to heterogeneity in the tissue control itself. At institution 2, we identified a 1-day sudden drop in the PR tissue control, also undetected by subjective evaluation, due to counterstain variability. At institution 3, a QC shift was identified, but only with 1 of 2 controls mounted on each slide. The QC shift was due to use of the instrument's selective reagent drop zones dispense feature. None of these events affected patient diagnoses. These case examples illustrate that subjective QC evaluation of tissue controls can detect gross assay failure but not subtle changes. The fact that QC issues arose from each site, and in only a pilot study, suggests that immunohistochemical stain variability may be an underappreciated problem.
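For readers unfamiliar with Levey-Jennings analysis, the underlying computation is small: each QC result becomes a z-score against the established mean and SD, and the band it falls in drives the rule evaluation. A minimal sketch with invented target values follows.

```python
def lj_band(value, target_mean, target_sd):
    """Classify one QC result on a Levey-Jennings chart."""
    z = (value - target_mean) / target_sd
    if abs(z) > 3:
        return z, "beyond 3 SD (1_3s rejection)"
    if abs(z) > 2:
        return z, "2-3 SD (warning)"
    return z, "within 2 SD"

# Hypothetical daily stain-intensity readings against mean 100, SD 4.
for v in (101, 94, 109, 87):
    z, label = lj_band(v, 100, 4)
    print(f"{v}: z={z:+.2f} -> {label}")
```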
Root, Patsy; Hunt, Margo; Fjeld, Karla; Kundrat, Laurie
2014-01-01
Quality assurance (QA) and quality control (QC) data are required in order to have confidence in the results from analytical tests and the equipment used to produce those results. Some AOAC water methods include specific QA/QC procedures, frequencies, and acceptance criteria, but these are considered to be the minimum controls needed to perform a microbiological method successfully. Some regulatory programs, such as those at Code of Federal Regulations (CFR), Title 40, Part 136.7 for chemistry methods, require additional QA/QC measures beyond those listed in the method, which can also apply to microbiological methods. Essential QA/QC measures include sterility checks, reagent specificity and sensitivity checks, assessment of each analyst's capabilities, analysis of blind check samples, evaluation of the presence of laboratory contamination, and instrument calibration and checks. The details of these procedures, their performance frequency, and expected results are set out in this report as they apply to microbiological methods. The specific regulatory requirements of CFR Title 40 Part 136.7 for the Clean Water Act, the laboratory certification requirements of CFR Title 40 Part 141 for the Safe Drinking Water Act, and the International Organization for Standardization 17025 accreditation requirements under The NELAC Institute are also discussed.
Valid internal standard technique for arson detection based on gas chromatography-mass spectrometry.
Salgueiro, Pedro A S; Borges, Carlos M F; Bettencourt da Silva, Ricardo J N
2012-09-28
The most popular procedures for the detection of residues of accelerants in fire debris are the ones published by the American Society for Testing and Materials (ASTM E1412-07 and E1618-10). The most critical stages of these tests are the conservation of fire debris from the sampling to the laboratory, the extraction of residues of accelerants from the debris to the activated charcoal strips (ACS) and from those to the final solvent, as well as the analysis of the sample extract by gas chromatography-mass spectrometry (GC-MS) and the interpretation of the instrumental signal. This work proposes a strategy for checking the quality of sample conservation, the transfer of accelerant residues to the final solvent, and the GC-MS analysis, using internal standard additions. Internal standards are used ranging from a highly volatile compound, for checking debris conservation, to a compound of low volatility, for checking GC-MS repeatability. The developed quality control (QC) parameters are not affected by GC-MS sensitivity variation, and, specifically, the GC-MS performance control is not affected by ACS adsorption saturation that may mask test performance deviations. The proposed QC procedure proved adequate to check GC-MS repeatability, ACS extraction, and sample conservation since: (1) standard additions are affected by negligible uncertainty and (2) the observed dispersion of QC parameters is fit for its intended use. Copyright © 2012 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gillespie, B.M.; Stromatt, R.W.; Ross, G.A.
This data package contains the results obtained by Pacific Northwest Laboratory (PNL) staff in the characterization of samples for the 101-SY Hydrogen Safety Project. The samples were submitted for analysis by Westinghouse Hanford Company (WHC) under the Technical Project Plan (TPP) 17667 and the Quality Assurance Plan MCS-027. They came from a core taken during Window "C" after the May 1991 gas release event. The analytical procedures required for analysis were defined in the Test Instructions (TI) prepared by the PNL 101-SY Analytical Chemistry Laboratory (ACL) Project Management Office in accordance with the TPP and the QA Plan. The requested analysis for these samples was volatile organic analysis. The quality control (QC) requirements for each sample are defined in the Test Instructions for each sample. The QC requirements outlined in the procedures and requested in the WHC statement of work were followed.
Quality Control for Scoring Tests Administered in Continuous Mode: An NCME Instructional Module
ERIC Educational Resources Information Center
Allalouf, Avi; Gutentag, Tony; Baumer, Michal
2017-01-01
Quality control (QC) in testing is paramount. QC procedures for tests can be divided into two types. The first type, one that has been well researched, is QC for tests administered to large population groups on few administration dates using a small set of test forms (e.g., large-scale assessment). The second type is QC for tests, usually…
jqcML: an open-source java API for mass spectrometry quality control data in the qcML format.
Bittremieux, Wout; Kelchtermans, Pieter; Valkenborg, Dirk; Martens, Lennart; Laukens, Kris
2014-07-03
The awareness that systematic quality control is an essential factor to enable the growth of proteomics into a mature analytical discipline has increased over the past few years. To this aim, a controlled vocabulary and document structure have recently been proposed by Walzer et al. to store and disseminate quality-control metrics for mass-spectrometry-based proteomics experiments, called qcML. To facilitate the adoption of this standardized quality control routine, we introduce jqcML, a Java application programming interface (API) for the qcML data format. First, jqcML provides a complete object model to represent qcML data. Second, jqcML provides the ability to read, write, and work in a uniform manner with qcML data from different sources, including the XML-based qcML file format and the relational database qcDB. Interaction with the XML-based file format is obtained through the Java Architecture for XML Binding (JAXB), while generic database functionality is obtained by the Java Persistence API (JPA). jqcML is released as open-source software under the permissive Apache 2.0 license and can be downloaded from https://bitbucket.org/proteinspector/jqcml .
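jqcML is a Java API, but the qcML format itself is plain XML, so other stacks can read it with standard tooling. The sketch below uses Python's standard library; the element name "qualityParameter" and its attributes are assumptions for illustration, so consult the qcML schema for the actual names.

```python
import xml.etree.ElementTree as ET

def read_metrics(path):
    """Collect name -> value pairs from qualityParameter-like elements."""
    root = ET.parse(path).getroot()
    return {
        el.attrib.get("name"): el.attrib.get("value")
        for el in root.iter()
        if el.tag.split("}")[-1] == "qualityParameter"  # ignore XML namespaces
    }

# metrics = read_metrics("run001.qcML")
```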
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hollister, R
QC sample results (daily background check drum and 100-gram SGS check drum) were within the acceptance criteria established by WIPP's Quality Assurance Objectives for TRU Waste Characterization. Replicate runs were performed on drum LL85501243TRU. Replicate measurement results are identical at the 95% confidence level as established by WIPP criteria. HWM NCAR No. 02-1000168 was issued on 17-Oct-2002 regarding a partially dislodged Cd sheet filter on the HPGe coaxial detector. This physical geometry occurred on 01-Oct-2002 and was not corrected until 10-Oct-2002, a period that includes the present batch run of drums. Per discussions among the Independent Technical Reviewer, Expert Reviewer and the Technical QA Supervisor, as well as in consultation with John Fleissner, Technical Point of Contact from Canberra, the analytical results are technically reliable. All QC standard runs during this period were in control. The data packet for SGS Batch 2002-13, generated using passive gamma-ray spectroscopy with the Pu Facility SGS unit, is technically reasonable. All QC samples are in compliance with established control limits. The batch data packet has been reviewed for correctness, completeness, consistency and compliance with WIPP's Quality Assurance Objectives and determined to be acceptable.
NASA Technical Reports Server (NTRS)
Robinson, Michael; Steiner, Matthias; Wolff, David B.; Ferrier, Brad S.; Kessinger, Cathy; Einaudi, Franco (Technical Monitor)
2000-01-01
The primary function of the TRMM Ground Validation (GV) Program is to create GV rainfall products that provide basic validation of satellite-derived precipitation measurements for select primary sites. A fundamental and extremely important step in creating high-quality GV products is radar data quality control. Quality control (QC) processing of TRMM GV radar data is based on some automated procedures, but the current QC algorithm is not fully operational and requires significant human interaction to assure satisfactory results. Moreover, the TRMM GV QC algorithm, even with continuous manual tuning, still cannot completely remove all types of spurious echoes. In an attempt to improve the current operational radar data QC procedures of the TRMM GV effort, an intercomparison of several QC algorithms has been conducted. This presentation will demonstrate how various radar data QC algorithms affect accumulated radar rainfall products. In all, six different QC algorithms will be applied to two months of WSR-88D radar data from Melbourne, Florida. Daily, five-day, and monthly accumulated radar rainfall maps will be produced for each quality-controlled data set. The QC algorithms will be evaluated and compared based on their ability to remove spurious echoes without removing significant precipitation. Strengths and weaknesses of each algorithm will be assessed based on their ability to mitigate both erroneous additions to rainfall accumulation from spurious echo contamination and erroneous reductions from true precipitation removal. Contamination from individual spurious echo categories will be quantified to further diagnose the abilities of each radar QC algorithm. Finally, a cost-benefit analysis will be conducted to determine if a more automated QC algorithm is a viable alternative to the current, labor-intensive QC algorithm employed by TRMM GV.
Rosenbaum, Matthew W; Flood, James G; Melanson, Stacy E F; Baumann, Nikola A; Marzinke, Mark A; Rai, Alex J; Hayden, Joshua; Wu, Alan H B; Ladror, Megan; Lifshitz, Mark S; Scott, Mitchell G; Peck-Palmer, Octavia M; Bowen, Raffick; Babic, Nikolina; Sobhani, Kimia; Giacherio, Donald; Bocsi, Gregary T; Herman, Daniel S; Wang, Ping; Toffaletti, John; Handel, Elizabeth; Kelly, Kathleen A; Albeiroti, Sami; Wang, Sihe; Zimmer, Melissa; Driver, Brandon; Yi, Xin; Wilburn, Clayton; Lewandrowski, Kent B
2018-05-29
In the United States, minimum standards for quality control (QC) are specified in federal law under the Clinical Laboratory Improvement Amendment and its revisions. Beyond meeting this required standard, laboratories have flexibility to determine their overall QC program. We surveyed chemistry and immunochemistry QC procedures at 21 clinical laboratories within leading academic medical centers to assess if standardized QC practices exist for chemistry and immunochemistry testing. We observed significant variation and unexpected similarities in practice across laboratories, including QC frequency, cutoffs, number of levels analyzed, and other features. This variation in practice indicates an opportunity exists to establish an evidence-based approach to QC that can be generalized across institutions.
Parvin, C A
1993-03-01
The error detection characteristics of quality-control (QC) rules that use control observations within a single analytical run are investigated. Unlike the evaluation of QC rules that span multiple analytical runs, most of the fundamental results regarding the performance of QC rules applied within a single analytical run can be obtained from statistical theory, without the need for simulation studies. The case of two control observations per run is investigated for ease of graphical display, but the conclusions can be extended to more than two control observations per run. Results are summarized in a graphical format that offers many interesting insights into the relations among the various QC rules. The graphs provide heuristic support to the theoretical conclusions that no QC rule is best under all error conditions, but the multirule that combines the mean rule and a within-run standard deviation rule offers an attractive compromise.
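The point that within-run performance follows directly from statistical theory can be made concrete. For a mean-of-two rule under the usual assumption of independent Gaussian control observations, the rejection probability at a given systematic shift is a normal-tail calculation; the sketch below is a textbook illustration, not the paper's code.

```python
import math

def norm_cdf(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def mean_rule_power(shift_sd, limit_sd=3.0, n=2):
    """P(reject) for a mean-of-n rule whose control limit is limit_sd SDs
    of the run mean, when the process carries a shift of shift_sd SDs."""
    se = 1.0 / math.sqrt(n)                       # SD of the run mean
    upper = (limit_sd * se - shift_sd) / se
    lower = (-limit_sd * se - shift_sd) / se
    return 1.0 - (norm_cdf(upper) - norm_cdf(lower))

for shift in (0.0, 1.0, 2.0, 3.0):
    print(f"shift={shift:.1f} SD -> P(reject)={mean_rule_power(shift):.3f}")
```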
An accelerated solvent extraction (ASE) device was evaluated as a semi-automated means for extracting arsenicals from quality control (QC) samples and DORM-2 [standard reference material (SRM)]. Unlike conventional extraction procedures, the ASE requires that the sample be dispe...
Sho, Shonan; Court, Colin M; Winograd, Paul; Lee, Sangjun; Hou, Shuang; Graeber, Thomas G; Tseng, Hsian-Rong; Tomlinson, James S
2017-07-01
Sequencing analysis of circulating tumor cells (CTCs) enables "liquid biopsy" to guide precision oncology strategies. However, this requires low-template whole genome amplification (WGA), which is prone to errors and biases from uneven amplification. Currently, quality control (QC) methods for WGA products, as well as the number of CTCs needed for reliable downstream sequencing, remain poorly defined. We sought to define strategies for selecting and generating optimal WGA products from low-template input as they relate to potential applications in precision oncology. Single pancreatic cancer cells (HPAF-II) were isolated using laser microdissection. WGA was performed using multiple displacement amplification (MDA), multiple annealing and looping based amplification (MALBAC) and PicoPLEX. The quality of amplified DNA products was assessed using a multiplex/RT-qPCR based method that evaluates 8 cancer-related genes, and QC scores were assigned. We utilized this scoring system to assess the impact of de novo modifications to the WGA protocol. WGA products were subjected to Sanger sequencing, array comparative genomic hybridization (aCGH) and next generation sequencing (NGS) to evaluate their performance in the respective downstream analyses, providing validation of the QC score. Single-cell WGA products exhibited significant sample-to-sample variability in amplified DNA quality as assessed by our 8-gene QC assay. Single-cell WGA products that passed the pre-analysis QC had lower amplification bias and improved aCGH/NGS performance metrics compared to single-cell WGA products that failed the QC. Increasing the cellular input improved QC scores overall, but a WGA product that consistently passed the QC step required a starting cellular input of at least 20 cells. Our modified WGA protocol effectively reduced this number, achieving reproducible high-quality WGA products from ≥5 cells as a starting template. A starting cellular input of 5 to 10 cells amplified using the modified WGA achieved aCGH and NGS results that closely matched those of unamplified, batch genomic DNA. The modified WGA protocol coupled with the 8-gene QC serves as an effective strategy to enhance the quality of low-template WGA reactions. Furthermore, a threshold of 5-10 cells is likely needed for a reliable WGA reaction and a product with high fidelity to the original starting template.
Selecting Statistical Procedures for Quality Control Planning Based on Risk Management.
Yago, Martín; Alcover, Silvia
2016-07-01
According to the traditional approach to statistical QC planning, the performance of QC procedures is assessed in terms of its probability of rejecting an analytical run that contains critical size errors (PEDC). Recently, the maximum expected increase in the number of unacceptable patient results reported during the presence of an undetected out-of-control error condition [Max E(NUF)], has been proposed as an alternative QC performance measure because it is more related to the current introduction of risk management concepts for QC planning in the clinical laboratory. We used a statistical model to investigate the relationship between PEDC and Max E(NUF) for simple QC procedures widely used in clinical laboratories and to construct charts relating Max E(NUF) with the capability of the analytical process that allow for QC planning based on the risk of harm to a patient due to the report of erroneous results. A QC procedure shows nearly the same Max E(NUF) value when used for controlling analytical processes with the same capability, and there is a close relationship between PEDC and Max E(NUF) for simple QC procedures; therefore, the value of PEDC can be estimated from the value of Max E(NUF) and vice versa. QC procedures selected by their high PEDC value are also characterized by a low value for Max E(NUF). The PEDC value can be used for estimating the probability of patient harm, allowing for the selection of appropriate QC procedures in QC planning based on risk management. © 2016 American Association for Clinical Chemistry.
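The link between rejection probability and expected patient impact can be illustrated with a deliberately simplified model (not the authors' model; the TEa, QC rule, and batch size below are assumptions): rejected runs are assumed corrected, the number of accepted runs before detection is geometric in the rejection probability, and E(NUF) is scanned over shift sizes to locate its maximum.

```python
import math

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def p_unacceptable(shift_sd, tea_sd=4.0):
    """P(a patient result exceeds allowable total error, TEa in SD units)."""
    return (1.0 - norm_cdf(tea_sd - shift_sd)) + norm_cdf(-tea_sd - shift_sd)

def p_rejection(shift_sd, limit_sd=3.0, n=2):
    """P(the QC rule rejects a run) for a mean-of-n rule."""
    se = 1.0 / math.sqrt(n)
    return 1.0 - (norm_cdf((limit_sd * se - shift_sd) / se)
                  - norm_cdf((-limit_sd * se - shift_sd) / se))

def expected_nuf(shift_sd, patients_per_run=50):
    """Expected unacceptable results reported before detection: accepted
    runs before detection average (1/P(reject) - 1) under a geometric model."""
    ped = p_rejection(shift_sd)
    return patients_per_run * (1.0 / ped - 1.0) * p_unacceptable(shift_sd)

shifts = [0.25 * k for k in range(1, 25)]
worst = max(shifts, key=expected_nuf)
print(f"Max E(NUF) ~ {expected_nuf(worst):.2f} at a shift of {worst:.2f} SD")
```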
Krishnan, S; Webb, S; Henderson, A R; Cheung, C M; Nazir, D J; Richardson, H
1999-03-01
The Laboratory Proficiency Testing Program (LPTP) assesses the analytical performance of all licensed laboratories in Ontario. The LPTP Enzymes, Cardiac Markers, and Lipids Committee conducted a "Patterns of Practice" survey to assess the in-house quality control (QC) practices of laboratories in Ontario using cholesterol as the QC paradigm. The survey was questionnaire based, seeking information on statistical calculations, software rules, review processes, data retention, and so on. Copies of the in-house cholesterol QC graphs were requested. A total of 120 of 210 laboratories were randomly chosen to receive the questionnaires during 1995 and 1996; 115 laboratories responded, although some did not answer all questions. The majority calculate means and standard deviations (SD) every month, using anywhere from 4 to >100 data points. 65% use a fixed mean and SD, while 17% use means calculated from the previous month. A few use a floating or cumulative mean. Some laboratories that do not use fixed means use a fixed SD. About 90% use some form of statistical quality control rules. The most common rules used to detect random error are 1(3s)/R(4s), while 2(2s)/4(1s)/10(x) are used for systematic errors. About 20% did not assay any QC at levels >5.5 mmol/L. Quality control data are reviewed daily (technologists) and weekly and monthly (supervisors/directors). Most laboratories retain their QC records for up to 3 years on paper and magnetic media. On some QC graphs the mean and SD, QC product lot number, or reference to action logs are not apparent. Quality control practices in Ontario are, therefore, disappointing. Improvement is required in the use of clinically appropriate concentrations of QC material and in documentation on QC graphs.
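The rules named in this survey are mechanical to implement. The sketch below evaluates the standard Westgard formulations of 1(3s), 2(2s), R(4s), 4(1s), and 10(x) on QC results expressed as z-scores; it is a generic illustration rather than any vendor's implementation.

```python
def westgard(z):
    """Violated rules for a chronological list of z-scores (most recent last)."""
    v = []
    if abs(z[-1]) > 3:
        v.append("1_3s")        # one observation beyond 3 SD
    if len(z) >= 2:
        a, b = z[-2], z[-1]
        if abs(a) > 2 and abs(b) > 2 and a * b > 0:
            v.append("2_2s")    # two consecutive beyond 2 SD, same side
        if abs(a - b) > 4:
            v.append("R_4s")    # range of two observations exceeds 4 SD
    if len(z) >= 4 and (all(x > 1 for x in z[-4:]) or all(x < -1 for x in z[-4:])):
        v.append("4_1s")        # four consecutive beyond 1 SD, same side
    if len(z) >= 10 and (all(x > 0 for x in z[-10:]) or all(x < 0 for x in z[-10:])):
        v.append("10_x")        # ten consecutive on one side of the mean
    return v

print(westgard([0.4, -0.8, 2.2, 2.5]))   # ['2_2s']
```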
NASA Technical Reports Server (NTRS)
Barbre, Robert E., Jr.
2012-01-01
This paper presents the process used by the Marshall Space Flight Center Natural Environments Branch (EV44) to quality control (QC) data from the Kennedy Space Center's 50-MHz Doppler Radar Wind Profiler (DRWP) for use in vehicle wind loads and steering commands. The database has been built to mitigate limitations of the currently archived databases from weather balloons. The DRWP database contains wind measurements from approximately 2.7-18.6 km altitude at roughly five minute intervals for the August 1997 to December 2009 period of record, and the extensive QC process was designed to remove spurious data from various forms of atmospheric and non-atmospheric artifacts. The QC process is largely based on DRWP literature, but two new algorithms have been developed to remove data contaminated by convection and by excessive first-guess propagations from the Median Filter First Guess Algorithm. In addition to describing the automated and manual QC process in detail, this paper describes the extent of the data retained. Roughly 58% of all possible wind observations exist in the database, with approximately 100 times as many complete profile sets existing relative to the EV44 balloon databases. This increased sample of near-continuous wind profile measurements may help increase launch availability by reducing the uncertainty of wind changes during launch countdown.
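Most of the checks in such a pipeline are consistency tests. One common building block, a vertical median filter that flags gates deviating from their neighbors, is sketched below with invented window and threshold values; the operational DRWP algorithms are considerably more involved.

```python
def median(xs):
    s = sorted(xs)
    n = len(s)
    return s[n // 2] if n % 2 else 0.5 * (s[n // 2 - 1] + s[n // 2])

def flag_gates(u, window=5, max_dev=10.0):
    """Flag wind components (m/s) that deviate from the local vertical median."""
    half = window // 2
    return [
        abs(u[i] - median(u[max(0, i - half): i + half + 1])) > max_dev
        for i in range(len(u))
    ]

profile = [3.0, 3.5, 4.1, 25.0, 4.8, 5.2, 5.6]   # one spurious gate at 25 m/s
print(flag_gates(profile))                        # only the fourth gate is flagged
```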
QA/QC in the laboratory. Session F
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hood, F.C.
1992-05-01
Quality assurance and quality control (QA/QC) of analytical chemistry laboratory activities are essential to the validity and usefulness of the resultant data. However, in themselves, conventional QA/QC measures will not always ensure that fraudulent data are not generated. Conventional QA/QC measures are based on the assumption that work will be done in good faith; to guard against fraudulent practices, QA/QC measures must be tailored to specific analysis protocols in anticipation of intentional misapplication of those protocols. Applying specific QA/QC measures to guard against fraudulent practices places an increased administrative burden on the analytical process; accordingly, in keeping with a graded QA philosophy, data quality objectives must be used to identify specific points of concern for special control to minimize the administrative impact.
Li, Junming; He, Zhiyao; Yu, Shui; Li, Shuangzhi; Ma, Qing; Yu, Yiyi; Zhang, Jialin; Li, Rui; Zheng, Yu; He, Gu; Song, Xiangrong
2012-10-01
In this study, quercetin (QC), which has cancer chemoprevention effects and anticancer potential, was loaded into polymeric micelles of a methoxy poly(ethylene glycol)-cholesterol conjugate (mPEG-Chol) in order to increase its water solubility. mPEG-Chol, with a low critical micelle concentration (CMC) value (4.0 × 10^-7 M to 13 × 10^-7 M), was first synthesized through two steps of chemical modification of cholesterol by esterification, and QC was then incorporated into mPEG-Chol micelles by a self-assembly method. After the process parameters were optimized, the QC-loaded micelles showed high drug loading (3.66%), high entrapment efficiency (93.51%), and a nano-sized diameter (116 nm). DSC analysis demonstrated that QC had been incorporated non-covalently into the micelles and existed in an amorphous state or as a solid solution in the polymeric matrix. A freeze-dried formulation with the addition of 1% (w/v) mannitol as a cryoprotectant was successfully developed for the long-term storage of QC-loaded micelles. Compared to free QC, QC-loaded micelles released QC more slowly. Moreover, the release of QC from micelles was slightly faster in PBS at pH 5 than in PBS at pH 7.4, which implied that QC-loaded micelles might be pH-sensitive and could thereby deliver QC selectively to tumor tissue with fewer unwanted side effects. Therefore, mPEG-Chol is a promising micellar vector for the controlled and targeted delivery of QC to tumors, and QC-loaded micelles are worth further investigation as a potential formulation for cancer chemoprevention and treatment.
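For reference, drug loading and entrapment efficiency are conventionally computed as in the sketch below; the masses are invented and chosen only to land near the values reported above.

```python
def drug_loading_and_ee(encapsulated_mg, added_mg, polymer_mg):
    """DL% = drug / (drug + carrier); EE% = encapsulated / added drug."""
    dl = 100.0 * encapsulated_mg / (encapsulated_mg + polymer_mg)
    ee = 100.0 * encapsulated_mg / added_mg
    return dl, ee

dl, ee = drug_loading_and_ee(encapsulated_mg=3.8, added_mg=4.0, polymer_mg=100.0)
print(f"DL = {dl:.2f}%  EE = {ee:.2f}%")
```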
Quality Control in Clinical Laboratory Samples
2015-01-01
is able to find and correct flaws in the analytical processes of a lab before potentially incorrect patient results are released. According to...verifies that the results produced are accurate and precise. Clinical labs use management of documentation as well as incorporation of a continuous...improvement process to streamline the overall quality control process. QC samples are expected to be identical and tested identically to patient
Automated quality control in a file-based broadcasting workflow
NASA Astrophysics Data System (ADS)
Zhang, Lina
2014-04-01
Benefiting from the development of information and internet technologies, television broadcasting is transforming from inefficient tape-based production and distribution to integrated file-based workflows. However, no matter how many changes have taken place, successful broadcasting still depends on the ability to deliver a consistently high-quality signal to the audience. After the transition from tape to file, traditional methods of manual quality control (QC) became inadequate, subjective, and inefficient. Based on China Central Television's fully file-based workflow at its new site, this paper introduces an automated quality control test system for accurate detection of hidden defects in media content. It discusses the system framework and workflow control once automated QC is added. It puts forward a QC criterion and presents QC software that follows this criterion. It also reports experiments on QC speed using parallel processing and distributed computing. The performance of the test system shows that adopting automated QC can make production effective and efficient and can help the station achieve a competitive advantage in the media market.
2004-07-01
sampler, project manager, data reviewer, statistician, risk assessor, assessment personnel, and laboratory QC manager. In addition, a complete copy of...sample • Corrective actions to be taken if the QC sample fails these criteria • A description of how the QC data and results are to be documented and...Intergovernmental Data Quality Task Force Uniform Federal Policy for Quality Assurance Project Plans Evaluating, Assessing, and Documenting
Folks, Russell D; Garcia, Ernest V; Taylor, Andrew T
2007-03-01
Quantitative nuclear renography has numerous potential sources of error. We previously reported the initial development of a computer software module for comprehensively addressing the issue of quality control (QC) in the analysis of radionuclide renal images. The objective of this study was to prospectively test the QC software. The QC software works in conjunction with standard quantitative renal image analysis using a renal quantification program. The software saves a text file that summarizes QC findings as possible errors in user-entered values, calculated values that may be unreliable because of the patient's clinical condition, and problems relating to acquisition or processing. To test the QC software, a technologist not involved in software development processed 83 consecutive nontransplant clinical studies. The QC findings of the software were then tabulated. QC events were defined as technical (study descriptors that were out of range or were entered and then changed, unusually sized or positioned regions of interest, or missing frames in the dynamic image set) or clinical (calculated functional values judged to be erroneous or unreliable). Technical QC events were identified in 36 (43%) of 83 studies. Clinical QC events were identified in 37 (45%) of 83 studies. Specific QC events included starting the camera after the bolus had reached the kidney, dose infiltration, oversubtraction of background activity, and missing frames in the dynamic image set. QC software has been developed to automatically verify user input, monitor calculation of renal functional parameters, summarize QC findings, and flag potentially unreliable values for the nuclear medicine physician. Incorporation of automated QC features into commercial or local renal software can reduce errors and improve technologist performance and should improve the efficiency and accuracy of image interpretation.
The Quality System Implementation Plan (QSIP) describes the quality assurance and quality control procedures developed for the CTEPP study. It provides the QA/QC procedures used in recruitment of subjects, sample field collection, sample extraction and analysis, data storage, and...
Eight years of quality control in Bulgaria: impact on mammography practice.
Avramova-Cholakova, S; Lilkov, G; Kaneva, M; Terziev, K; Nakov, I; Mutkurov, N; Kovacheva, D; Ivanova, M; Vasilev, D
2015-07-01
The requirements for quality control (QC) in diagnostic radiology were introduced into Bulgarian legislation in 2005. Hospital medical physicists and several private medical physics groups provide QC services to radiology departments. The aim of this study was to analyse data from QC tests in mammography and to investigate the impact of the introduction of QC on mammography practice in the country. The study was coordinated by the National Centre of Radiobiology and Radiation Protection. All medical physics services were requested to fill in standardised forms with information about the most important parameters routinely measured during QC. All QC service providers responded. Results demonstrated significant improvement of practice since the introduction of QC, with a reduction of established deviations from 65% during the first year to 7% in the last year. Systems that did not meet the acceptability criteria were suspended from use. The performance of automatic exposure control and digital detectors is not regularly tested because of the absence of requirements in the legislation. The need for updated guidance and for training of medical physicists to reflect the change in technology was demonstrated. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
PCB Analysis Plan for Tank Archive Samples
DOE Office of Scientific and Technical Information (OSTI.GOV)
NGUYEN, D.M.
2001-03-22
This analysis plan specifies laboratory analysis, quality assurance/quality control (QA/QC), and data reporting requirements for analyzing polychlorinated biphenyls (PCB) concentrations in archive samples. Tank waste archive samples that are planned for PCB analysis are identified in Nguyen 2001. The tanks and samples are summarized in Table 1-1. The analytical data will be used to establish a PCB baseline inventory in Hanford tanks.
Quality Control in Primary Schools: Progress from 2001-2006
ERIC Educational Resources Information Center
Hofman, Roelande H.; de Boom, Jan; Hofman, W. H. Adriaan
2010-01-01
This article presents findings of research into the quality control (QC) of schools from 2001-2006. In 2001 several targets for QC were set and the progress of 939 primary schools is presented. Furthermore, using cluster analysis, schools are classified into four QC-types that differ in their focus on school (self) evaluation and school…
Jones, A Kyle; Heintz, Philip; Geiser, William; Goldman, Lee; Jerjian, Khachig; Martin, Melissa; Peck, Donald; Pfeiffer, Douglas; Ranger, Nicole; Yorkston, John
2015-11-01
Quality control (QC) in medical imaging is an ongoing process and not just a series of infrequent evaluations of medical imaging equipment. The QC process involves designing and implementing a QC program, collecting and analyzing data, investigating results that are outside the acceptance levels for the QC program, and taking corrective action to bring these results back to an acceptable level. The QC process involves key personnel in the imaging department, including the radiologist, radiologic technologist, and the qualified medical physicist (QMP). The QMP performs detailed equipment evaluations and helps with oversight of the QC program, while the radiologic technologist is responsible for the day-to-day operation of the QC program. The continued need for ongoing QC in digital radiography has been highlighted in the scientific literature. The charge of this task group was to recommend consistency tests designed to be performed by a medical physicist or by a radiologic technologist under the direction of a medical physicist to identify problems with an imaging system that need further evaluation by a medical physicist, including a fault tree to define actions that need to be taken when certain fault conditions are identified. The focus of this final report is the ongoing QC process, including rejected image analysis, exposure analysis, and artifact identification. These QC tasks are vital for the optimal operation of a department performing digital radiography.
Results of the Excreta Bioassay Quality Control Program for April 1, 2009 through March 31, 2010
DOE Office of Scientific and Technical Information (OSTI.GOV)
Antonio, Cheryl L.
2012-07-19
A total of 58 urine samples and 10 fecal samples were submitted during the report period (April 1, 2009 through March 31, 2010) to General Engineering Laboratories, South Carolina by the Hanford Internal Dosimetry Program (IDP) to check the accuracy, precision, and detection levels of their analyses. Urine analyses for Sr, 238Pu, 239Pu, 241Am, 243Am, 235U, 238U, and elemental uranium and fecal analyses for 241Am, 238Pu and 239Pu were tested this year, as well as four tissue samples for 238Pu, 239Pu, 241Am and 241Pu. The number of QC urine samples submitted during the report period represented 1.3% of the total samples submitted. In addition to the samples provided by IDP, GEL was also required to conduct their own QC program and submit the results of analyses to IDP. About 33% of the analyses processed by GEL during the third year of this contract were quality control samples. GEL tested the performance of 21 radioisotopes, all of which met or exceeded the specifications in the Statement of Work within statistical uncertainty (Table 4).
Results of The Excreta Bioassay Quality Control Program For April 1, 2010 Through March 31, 2011
DOE Office of Scientific and Technical Information (OSTI.GOV)
Antonio, Cheryl L.
2012-07-19
A total of 76 urine samples and 10 spiked fecal samples were submitted during the report period (April 1, 2010 through March 31, 2011) to GEL Laboratories, LLC in South Carolina by the Hanford Internal Dosimetry Program (IDP) to check the accuracy, precision, and detection levels of their analyses. Urine analyses for 14C, Sr, 238Pu, 239Pu, 241Am, 243Am, 235U, 238U, and 238U mass and fecal analyses for 241Am, 238Pu and 239Pu were tested this year. The number of QC urine samples submitted during the report period represented 1.1% of the total samples submitted. In addition to the samples provided by IDP, GEL was also required to conduct their own QC program and submit the results of analyses to IDP. About 31% of the analyses processed by GEL during the first year of contract 112512 were quality control samples. GEL tested the performance of 23 radioisotopes, all of which met or exceeded the specifications in the Statement of Work within statistical uncertainty, except for a slightly elevated relative bias for 243,244Cm (Table 4).
Effect of different solutions on color stability of acrylic resin-based dentures.
Goiato, Marcelo Coelho; Nóbrega, Adhara Smith; dos Santos, Daniela Micheline; Andreotti, Agda Marobo; Moreno, Amália
2014-01-01
The aim of this study was to evaluate the effect of thermocycling and immersion in mouthwash or beverage solutions on the color stability of four different acrylic resin-based dentures (Onda Cryl, OC; QC20, QC; Classico, CL; and Lucitone, LU). The factors evaluated were type of acrylic resin, immersion time, and solution (mouthwash or beverage). A total of 224 denture samples were fabricated. For each type of resin, eight samples were immersed in mouthwashes (Plax-Colgate, PC; Listerine, LI; and Oral-B, OB), beverages (coffee, CP; cola, C; and wine, W), and artificial saliva (AS; control). The color change (ΔE) was evaluated before (baseline) and after thermocycling (T1), and after immersion in solution for 1 h (T2), 3 h (T3), 24 h (T4), 48 h (T5), and 96 h (T6). The CIE Lab system was used to determine the color changes. The thermocycling test was performed for 5000 cycles. Data were submitted to three-way repeated-measures analysis of variance and Tukey's test (p<0.05). When the samples were immersed in each mouthwash, all assessed factors, associated or not, significantly influenced the color change values, except that there was no association between the mouthwash and the acrylic resin. Similarly, when the samples were immersed in each beverage, all studied factors influenced the color change values. In general, regardless of the solution, LU exhibited the greatest ΔE values in the period from T1 to T5, and QC presented the greatest ΔE values at T6. Thus, thermocycling and immersion in the various solutions influenced the color stability of the acrylic resins, and QC showed the greatest color alteration.
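The ΔE used here is the CIE76 color difference: the Euclidean distance between two CIE Lab coordinates. A minimal sketch with hypothetical L*, a*, b* readings follows; a ΔE around 3.3 is often cited as the threshold of clinical perceptibility, though cutoffs vary between studies.

```python
import math

def delta_e(lab1, lab2):
    """CIE76 color difference between two (L*, a*, b*) tuples."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(lab1, lab2)))

baseline = (68.2, 1.4, 12.9)   # hypothetical resin reading at baseline
after = (66.5, 1.9, 15.3)      # hypothetical reading after immersion
print(round(delta_e(baseline, after), 2))
```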
Song, Ting; Li, Nan; Zarepisheh, Masoud; Li, Yongbao; Gautier, Quentin; Zhou, Linghong; Mell, Loren; Jiang, Steve; Cerviño, Laura
2016-01-01
Intensity-modulated radiation therapy (IMRT) currently plays an important role in radiotherapy, but its treatment plan quality can vary significantly among institutions and planners. Treatment plan quality control (QC) is a necessary component for individual clinics to ensure that patients receive treatments with high therapeutic gain ratios. The voxel-weighting factor-based plan re-optimization mechanism has been proven able to explore a larger Pareto surface (solution domain) and therefore increase the possibility of finding an optimal treatment plan. In this study, we incorporated additional modules into an in-house developed voxel weighting factor-based re-optimization algorithm, which was enhanced as a highly automated and accurate IMRT plan QC tool (TPS-QC tool). After importing an under-assessment plan, the TPS-QC tool was able to generate a QC report within 2 minutes. This QC report contains the plan quality determination as well as information supporting the determination. Finally, the IMRT plan quality can be controlled by approving quality-passed plans and replacing quality-failed plans using the TPS-QC tool. The feasibility and accuracy of the proposed TPS-QC tool were evaluated using 25 clinically approved cervical cancer patient IMRT plans and 5 manually created poor-quality IMRT plans. The results showed high consistency between the QC report quality determinations and the actual plan quality. In the 25 clinically approved cases that the TPS-QC tool identified as passed, a greater difference could be observed for dosimetric endpoints for organs at risk (OAR) than for planning target volume (PTV), implying that better dose sparing could be achieved in OAR than in PTV. Moreover, the dose-volume histogram (DVH) curves of the TPS-QC tool re-optimized plans satisfied the dosimetric criteria more frequently than did the under-assessment plans, and the criteria for unsatisfied dosimetric endpoints in the 5 poor-quality plans could typically be satisfied when the TPS-QC tool generated re-optimized plans without sacrificing other dosimetric endpoints. Beyond its feasibility and accuracy, the proposed TPS-QC tool is also user-friendly and easy to operate, both of which are necessary characteristics for clinical use.
Sekhavati, Mohammad H; Mesgaran, Mohsen Danesh; Nassiri, Mohammad R; Mohammadabadi, Tahereh; Rezaii, Farkhondeh; Fani Maleki, Adham
2009-10-01
This paper describes the use of a quantitative competitive polymerase chain reaction (QC-PCR) assay using PCR primers targeting the rRNA locus of rumen fungi and a standard-control DNA, including its design and validation. To test the efficiency of this method for quantifying anaerobic rumen fungi, we evaluated it in vitro against an assay based on measuring cell wall chitin. Changes in fungal growth were studied when fungi were grown in vitro on either untreated (US) or sodium hydroxide-treated (TS) wheat straw. Results showed that rumen fungal growth was significantly higher on treated than on untreated samples during the 12-d incubation (P<0.05), and plotting the chitin assay results against the competitive PCR results showed a high positive correlation (R² ≥ 0.87). The low mean coefficients of variation for repeatability of the QC-PCR method relative to the chitin assay demonstrated the greater reliability of this new approach. Finally, the efficiency of this method was investigated in vivo. Samples of rumen fluid were collected from four fistulated Holstein steers fed four different diets (basal diet, high starch, high sucrose, and starch plus sucrose) in rotation. The QC-PCR results showed that addition of these non-structural carbohydrates to the basal diet caused a significant decrease in rumen anaerobic fungal biomass. The QC-PCR method appears to be reliable and can be used for rumen samples.
AfterQC: automatic filtering, trimming, error removing and quality control for fastq data.
Chen, Shifu; Huang, Tanxiao; Zhou, Yanqing; Han, Yue; Xu, Mingyan; Gu, Jia
2017-03-14
Some applications, especially clinical applications requiring highly accurate sequencing data, must cope with the problems caused by unavoidable sequencing errors. Several tools have been proposed to profile sequencing quality, but few of them can quantify or correct sequencing errors. This unmet requirement motivated us to develop AfterQC, a tool with functions to profile sequencing errors and correct most of them, plus highly automated quality control and data filtering features. Unlike most tools, AfterQC analyses the overlapping of paired sequences for pair-end sequencing data. Based on overlapping analysis, AfterQC can detect and cut adapters, and furthermore it provides a novel function to correct wrong bases in the overlapping regions. Another new feature is to detect and visualise sequencing bubbles, which can be commonly found on the flowcell lanes and may raise sequencing errors. Besides normal per-cycle quality and base content plotting, AfterQC also provides features like polyX (a long sub-sequence of the same base X) filtering, automatic trimming and k-mer-based strand bias profiling. For each single FastQ file or pair of FastQ files, AfterQC filters out bad reads, detects and eliminates the sequencer's bubble effects, trims reads at the front and tail, detects sequencing errors and corrects part of them, and finally outputs clean data and generates HTML reports with interactive figures. AfterQC can run in batch mode with multiprocess support; it can run with a single FastQ file, a single pair of FastQ files (for pair-end sequencing), or a folder in which all included FastQ files are processed automatically. Based on overlapping analysis, AfterQC can estimate the sequencing error rate and profile the error transform distribution. The results of our error profiling tests show that the error distribution is highly platform dependent. Much more than just another quality control (QC) tool, AfterQC is able to perform quality control, data filtering, error profiling and base correction automatically. Experimental results show that AfterQC can help to eliminate sequencing errors in pair-end sequencing data to provide much cleaner output, and consequently help to reduce false-positive variants, especially low-frequency somatic mutations. While providing rich configurable options, AfterQC can detect and set all the options automatically and requires no arguments in most cases.
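The overlap analysis at the heart of AfterQC can be caricatured in a toy sketch (this is not AfterQC's implementation): reverse-complement read 2, find the longest exact suffix-prefix overlap with read 1, and report its length. Real implementations tolerate mismatches and then resolve disagreements inside the overlap in favor of the higher-quality base.

```python
COMP = str.maketrans("ACGT", "TGCA")

def revcomp(seq):
    return seq.translate(COMP)[::-1]

def find_overlap(r1, r2rc, min_len=6):
    """Longest suffix of r1 equal to a prefix of r2rc (read 2 reverse-complemented)."""
    for ln in range(min(len(r1), len(r2rc)), min_len - 1, -1):
        if r1[-ln:] == r2rc[:ln]:
            return ln
    return 0

fragment = "AAACCCGGGTTTACGTACGT"    # hypothetical 20 bp insert
r1 = fragment[:14]                   # forward read
r2 = revcomp(fragment[-14:])         # reverse read as it comes off the sequencer
print(find_overlap(r1, revcomp(r2))) # 8: the pair overlaps by 8 bases
```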
A real-time automated quality control of rain gauge data based on multiple sensors
NASA Astrophysics Data System (ADS)
Qi, Y.; Zhang, J.
2013-12-01
Precipitation is one of the most important meteorological and hydrological variables. Automated rain gauge networks provide direct measurements of precipitation and have been used for numerous applications such as generating regional and national precipitation maps, calibrating remote sensing data, and validating hydrological and meteorological model predictions. Automated gauge observations are prone to a variety of error sources (instrument malfunction, transmission errors, format changes) and require careful quality control (QC). Many previous gauge QC techniques were based on neighborhood checks within the gauge network itself, and their effectiveness depends on gauge density and precipitation regime. The current study takes advantage of the multi-sensor data sources in the National Mosaic and Multi-Sensor QPE (NMQ/Q2) system and develops an automated gauge QC scheme based on the consistency of radar hourly QPEs and gauge observations. Error characteristics of radar and gauge as a function of the radar sampling geometry, precipitation regime, and the freezing level height are considered. The new scheme was evaluated by comparing an NMQ national gauge-based precipitation product with independent manual gauge observations. Twelve heavy rainfall events from different seasons and areas of the United States were selected for the evaluation, and the results show that the new NMQ product with QC'ed gauges has a more physically realistic spatial distribution than the old product. The new product also agrees much better statistically with the independent gauges.
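The radar-gauge consistency check reduces to a toy rule. The thresholds below are placeholders; the NMQ/Q2 scheme conditions its checks on sampling geometry, precipitation regime, and freezing level, none of which this sketch attempts.

```python
def qc_gauge(gauge_mm, radar_mm, min_ratio=0.2, max_ratio=5.0, abs_tol=2.0):
    """Pass/fail one gauge-hour against the collocated radar hourly QPE."""
    if abs(gauge_mm - radar_mm) <= abs_tol:
        return "pass"                      # small absolute differences are fine
    if radar_mm > 0 and min_ratio <= gauge_mm / radar_mm <= max_ratio:
        return "pass"                      # large totals, but mutually consistent
    return "fail"

obs = [(10.2, 9.1), (0.0, 12.4), (25.0, 1.0)]  # (gauge, radar) pairs in mm
print([qc_gauge(g, r) for g, r in obs])        # ['pass', 'fail', 'fail']
```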
[Development of quality assurance/quality control web system in radiotherapy].
Okamoto, Hiroyuki; Mochizuki, Toshihiko; Yokoyama, Kazutoshi; Wakita, Akihisa; Nakamura, Satoshi; Ueki, Heihachi; Shiozawa, Keiko; Sasaki, Koji; Fuse, Masashi; Abe, Yoshihisa; Itami, Jun
2013-12-01
Our purpose is to develop a QA/QC (quality assurance/quality control) web system using HTML (HyperText Markup Language) and the server-side scripting language PHP (Hypertext Preprocessor), which can be useful as a tool to share information about QA/QC in radiotherapy. The system proposed in this study can be easily built in one's own institute, because HTML can be easily handled. There are two desired functions in a QA/QC web system: (i) to review the results of QA/QC for a radiotherapy machine, together with the manuals and reports necessary for routinely performing radiotherapy; by disclosing the results, transparency can be maintained; and (ii) to present a protocol for QA/QC in one's own institute using pictures and movies relating to QA/QC for simplicity's sake, which can also be used as an educational tool for junior radiation technologists and medical physicists. By using this system, not only administrators but also all staff involved in radiotherapy can obtain information about the conditions and accuracy of treatment machines through the QA/QC web system.
Jayakody, Chatura; Hull-Ryde, Emily A
2016-01-01
Well-defined quality control (QC) processes are used to determine whether a certain procedure or action conforms to a widely accepted standard and/or set of guidelines, and are important components of any laboratory quality assurance program (Popa-Burke et al., J Biomol Screen 14: 1017-1030, 2009). In this chapter, we describe QC procedures useful for monitoring the accuracy and precision of laboratory instrumentation, most notably automated liquid dispensers. Two techniques, gravimetric QC and photometric QC, are highlighted in this chapter. When used together, these simple techniques provide a robust process for evaluating liquid handler accuracy and precision, and critically underpin high-quality research programs.
FDA's Critical Path Initiative identifies pharmacogenomics and toxicogenomics as key opportunities in advancing medical product development and personalized medicine, and the Guidance for Industry: Pharmacogenomic Data Submissions has been released. Microarrays represent a co...
Song, Ting; Li, Nan; Zarepisheh, Masoud; Li, Yongbao; Gautier, Quentin; Zhou, Linghong; Mell, Loren; Jiang, Steve; Cerviño, Laura
2016-01-01
Intensity-modulated radiation therapy (IMRT) currently plays an important role in radiotherapy, but its treatment plan quality can vary significantly among institutions and planners. Treatment plan quality control (QC) is a necessary component for individual clinics to ensure that patients receive treatments with high therapeutic gain ratios. The voxel-weighting-factor-based plan re-optimization mechanism has been proven able to explore a larger Pareto surface (solution domain) and therefore increase the possibility of finding an optimal treatment plan. In this study, we incorporated additional modules into an in-house developed voxel-weighting-factor-based re-optimization algorithm, which was enhanced into a highly automated and accurate IMRT plan QC tool (TPS-QC tool). After importing an under-assessment plan, the TPS-QC tool was able to generate a QC report within 2 minutes. This QC report contains the plan quality determination as well as information supporting the determination. Finally, the IMRT plan quality can be controlled by approving quality-passed plans and replacing quality-failed plans using the TPS-QC tool. The feasibility and accuracy of the proposed TPS-QC tool were evaluated using 25 clinically approved cervical cancer patient IMRT plans and 5 manually created poor-quality IMRT plans. The results showed high consistency between the QC report quality determinations and the actual plan quality. In the 25 clinically approved cases that the TPS-QC tool identified as passed, a greater difference could be observed in dosimetric endpoints for organs at risk (OAR) than for the planning target volume (PTV), implying that better dose sparing could be achieved in OAR than in PTV. The dose-volume histogram (DVH) curves of the TPS-QC tool re-optimized plans satisfied the dosimetric criteria more frequently than did the under-assessment plans, and the criteria for unsatisfied dosimetric endpoints in the 5 poor-quality plans could typically be satisfied when the TPS-QC tool generated re-optimized plans, without sacrificing other dosimetric endpoints. In addition to its feasibility and accuracy, the proposed TPS-QC tool is user-friendly and easy to operate, both of which are necessary characteristics for clinical use. PMID:26930204
NASA Astrophysics Data System (ADS)
Kapanen, Mika; Tenhunen, Mikko; Hämäläinen, Tuomo; Sipilä, Petri; Parkkinen, Ritva; Järvinen, Hannu
2006-07-01
Quality control (QC) data of radiotherapy linear accelerators, collected by Helsinki University Central Hospital between the years 2000 and 2004, were analysed. The goal was to provide information for the evaluation and elaboration of QC of accelerator outputs and to propose a method for QC data analysis. Short- and long-term drifts in outputs were quantified by fitting empirical mathematical models to the QC measurements. Normally, long-term drifts were well modelled (to within 1%) by either a straight line or a single-exponential function. A drift of 2% occurred in 18 ± 12 months. The shortest drift times of only 2-3 months were observed for some new accelerators just after commissioning, but these stabilized during the first 2-3 years. The short-term reproducibility and the long-term stability of local constancy checks, carried out with a sealed plane-parallel ion chamber, were also estimated by fitting empirical models to the QC measurements. The reproducibility was 0.2-0.5% depending on the positioning practice of a device. Long-term instabilities of about 0.3%/month were observed for some checking devices. The reproducibility of local absorbed dose measurements was estimated to be about 0.5%. The proposed empirical model fitting of QC data facilitates the recognition of erroneous QC measurements and abnormal output behaviour caused by malfunctions, offering a tool to improve dose control.
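The single-exponential drift model described above can be fitted to output QC data along these lines (a sketch with synthetic measurements, not the hospital's analysis code):

    import numpy as np
    from scipy.optimize import curve_fit

    rng = np.random.default_rng(0)
    t = np.arange(0, 36.0)                                  # months since commissioning
    output = 100 + 3 * (1 - np.exp(-t / 10)) + rng.normal(0, 0.15, t.size)

    def expo(t, a, b, tau):                                 # single-exponential drift model
        return a + b * (1 - np.exp(-t / tau))

    (a, b, tau), _ = curve_fit(expo, t, output, p0=(100, 2, 12))

    # Time for the fitted model to drift 2% from its initial value a:
    t2 = -tau * np.log(1 - 2.0 / b) if b > 2 else float("inf")
    print(f"drift amplitude {b:.2f}%, time constant {tau:.1f} months, 2% drift at {t2:.1f} months")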
Kirwan, Jennifer A; Weber, Ralf J M; Broadhurst, David I; Viant, Mark R
2014-01-01
Direct-infusion mass spectrometry (DIMS) metabolomics is an important approach for characterising molecular responses of organisms to disease, drugs and the environment. Increasingly large-scale metabolomics studies are being conducted, necessitating improvements in both bioanalytical and computational workflows to maintain data quality. This dataset represents a systematic evaluation of the reproducibility of a multi-batch DIMS metabolomics study of cardiac tissue extracts. It comprises twenty biological samples (cow vs. sheep) that were analysed repeatedly, in 8 batches across 7 days, together with a concurrent set of quality control (QC) samples. Data are presented from each step of the workflow and are available in MetaboLights. The strength of the dataset is that intra- and inter-batch variation can be corrected using QC spectra and the quality of this correction assessed independently using the repeatedly measured biological samples. Originally designed to test the efficacy of a batch-correction algorithm, it will enable others to evaluate novel data processing algorithms. Furthermore, this dataset serves as a benchmark for DIMS metabolomics, derived using best-practice workflows and rigorous quality assessment. PMID:25977770
You, Jun; Zhou, Jinping; Li, Qian; Zhang, Lina
2012-03-20
As a weak base, β-glycerophosphate (β-GP) was used to spontaneously initiate gelation of quaternized cellulose (QC) solutions at body temperature. The QC/β-GP solutions are flowable below or at room temperature but gel rapidly under physiological conditions. In order to clarify the sol-gel transition process of the QC/β-GP systems, the complexes were investigated by dynamic viscoelastic measurements. The shear storage modulus (G') and loss modulus (G″) as a function of (1) concentration of β-GP (c(β-GP)), (2) concentration of QC (c(QC)), (3) degree of substitution (DS; i.e., the average number of substituted hydroxyl groups in the anhydroglucose unit) of QC, (4) viscosity-average molecular weight (M(η)) of QC, and (5) solvent medium were studied by oscillatory rheology. The sol-gel transition temperature of QC/β-GP solutions decreased with increasing c(QC), c(β-GP) and M(η) of QC, and with decreasing DS of QC and pH of the solvent. The sol-gel transition temperature and time could therefore be easily controlled by adjusting the concentrations of QC and β-GP, the M(η) and DS of QC, and the solvent medium. Gels formed after heating were irreversible; i.e., after cooling to lower temperature they could not be dissolved to become liquid again. The aggregation and entanglement of QC chains, electrostatic interactions, and hydrogen bonding between QC and β-GP were the main factors responsible for the irreversible sol-gel transition behavior of QC/β-GP systems.
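For readers unfamiliar with the rheological criterion, the sol-gel transition temperature is commonly read from a temperature sweep as the G'/G″ crossover. The sketch below illustrates the crossover search with invented moduli curves, not the paper's data.

    import numpy as np

    T = np.linspace(20, 45, 26)                       # temperature sweep (deg C)
    G_prime = 2.0 * np.exp(0.25 * (T - 33))           # storage modulus (Pa), rises on gelation
    G_double = 10.0 + 0.8 * (T - 20)                  # loss modulus (Pa), rises slowly

    cross = np.argmax(G_prime > G_double)             # first index where G' exceeds G''
    print(f"estimated gel temperature ~{T[cross]:.1f} deg C")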
Tamiru, Afework; Boulanger, Lucy; Chang, Michelle A; Malone, Joseph L; Aidoo, Michael
2015-01-21
Rapid diagnostic tests (RDTs) are now widely used for laboratory confirmation of suspected malaria cases to comply with the World Health Organization recommendation for universal testing before treatment. However, many malaria programmes lack quality control (QC) processes to assess RDT use under field conditions. Prior research showed the feasibility of using the dried tube specimen (DTS) method for preserving Plasmodium falciparum parasites for use as QC samples for RDTs. This study focused on the use of DTS for RDT QC and proficiency testing under field conditions. DTS were prepared using cultured P. falciparum at densities of 500 and 1,000 parasites/μL; 50 μL aliquots of these, along with parasite-negative human blood controls (0 parasites/μL), were air-dried in specimen tubes and reactivity was verified after rehydration. The DTS were used in a field study in the Oromia Region of Ethiopia. Replicate DTS samples containing 0, 500 and 1,000 parasites/μL were stored at 4°C at a reference laboratory and at ambient temperatures at two nearby health facilities. At weeks 0, 4, 8, 12, 16, 20, and 24, the DTS were rehydrated and tested on RDTs stored under manufacturer-recommended temperatures at the reference laboratory and on RDTs stored under site-specific conditions at the two health facilities. Reactivity of DTS stored at 4°C at the reference laboratory, tested on RDTs stored at the reference laboratory, was considered the gold standard for assessing DTS stability. A proficiency-testing panel consisting of one negative and three positive samples, monitored with a checklist, was administered at weeks 12 and 24. At all seven time points, DTS stored at both the reference laboratory and the health facilities were reactive on RDTs stored under the recommended temperature and under field conditions, and the DTS without malaria parasites were negative. At the reference laboratory and one health facility, a 500 parasites/μL DTS from the proficiency panel was falsely reported as negative at week 24 due to errors in interpreting faint test lines. The DTS method can be used under field conditions to supplement other RDT QC methods and to assess health worker proficiency in Ethiopia and possibly other malaria-endemic countries.
Vassileva, J; Dimov, A; Slavchev, A; Karadjov, A
2005-01-01
Results from a Bulgarian patient dose survey in diagnostic radiology are presented. Reference levels for entrance surface dose (ESD) were 0.9 mGy for chest radiography (PA), 30 mGy for lumbar spine (Lat), 10 mGy for pelvis, 5 mGy for skull (AP), 3 mGy for skull (Lat) and 13 mGy for mammography. Quality control (QC) programmes were proposed for various areas of diagnostic radiology. Film processing QC warranted special attention. Proposed QC programmes included parameters to be tested, level of expertise needed and two action levels: remedial and suspension. Programmes were tested under clinical conditions to assess initial results and draw conclusions for further QC system development. On the basis of international experience, measurement protocols were developed for all parameters tested. QC equipment was provided as part of the PHARE project. A future problem for QC programme implementation may be the small number of medical physics experts in diagnostic radiology.
Quality control in urinalysis.
Takubo, T; Tatsumi, N
1999-01-01
Quality control (QC) has been introduced in laboratories, and QC surveys in urinalysis have been performed by the College of American Pathologists, the Japanese Association of Medical Technologists, the Osaka Medical Association and manufacturers. A QC survey of urinalysis of synthetic urine using a reagent strip and instrument made by the same manufacturer, and using an automated urine cell analyser, provided satisfactory results among laboratories. A QC survey of urinalysis of synthetic urine using reagent strips and instruments made by various manufacturers indicated differences in determined values among manufacturers, and between manual and automated methods, because the reagent strips and instruments have different characteristics. A QC photo survey based on microscopic photos of urine sediment constituents indicated differences in cell identification among laboratories. These results indicate the need to standardize the reagent strip method, manual and automated methods, and synthetic urine.
Ensuring the reliability of stable isotope ratio data--beyond the principle of identical treatment.
Carter, J F; Fry, B
2013-03-01
The need for inter-laboratory comparability is crucial to facilitate the globalisation of scientific networks and the development of international databases to support scientific and criminal investigations. This article considers what lessons can be learned from a series of inter-laboratory comparison exercises organised by the Forensic Isotope Ratio Mass Spectrometry (FIRMS) network in terms of reference materials (RMs), the management of data quality, and technical limitations. The results showed that within-laboratory precision (repeatability) was generally good but between-laboratory accuracy (reproducibility) called for improvements. This review considers how stable isotope laboratories can establish a system of quality control (QC) and quality assurance (QA), emphasising issues of repeatability and reproducibility. For results to be comparable between laboratories, measurements must be traceable to the international δ-scales and, because isotope ratio measurements are reported relative to standards, a key aspect is the correct selection, calibration, and use of international and in-house RMs. The authors identify four principles which promote good laboratory practice. The first is the principle of identical treatment, by which samples and RMs are processed in an identical manner; it incorporates three further principles: the principle of identical correction (by which necessary corrections are identified and evenly applied), the principle of identical scaling (by which data are shifted and stretched to the international δ-scales), and the principle of error detection (by which QC and QA results are monitored and acted upon). To achieve both good repeatability and good reproducibility it is essential to obtain RMs with internationally agreed δ-values. These RMs will act as the basis for QC and can be used to calibrate further in-house QC RMs tailored to the activities of specific laboratories. In-house QA standards must also be developed to ensure that QC-based calibrations and corrections lead to accurate results for samples. The δ-values assigned to RMs must be recorded and reported with all data. Reference materials must be used to determine what corrections are necessary for measured data. Each analytical sequence of samples must include both QC and QA materials which are subject to identical treatment during measurement and data processing. Results for these materials must be plotted, monitored, and acted upon. Periodically, international RMs should be analysed as an in-house proficiency test to demonstrate that results are accurate.
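The "identical scaling" principle amounts to a two-point shift-and-stretch calibration onto the international δ-scale. A minimal sketch, using the accepted values of the VPDB-scale anchors NBS 19 (+1.95‰) and LSVEC (−46.6‰) with invented measured values:

    def scale_to_international(delta_meas, rm1_meas, rm1_true, rm2_meas, rm2_true):
        # Two-point normalization: shift and stretch measured delta-values so the
        # two reference materials land exactly on their agreed values.
        stretch = (rm2_true - rm1_true) / (rm2_meas - rm1_meas)
        return rm1_true + (delta_meas - rm1_meas) * stretch

    print(scale_to_international(-25.30, rm1_meas=1.80, rm1_true=1.95,
                                 rm2_meas=-46.10, rm2_true=-46.6))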
Novelo-Casanova, D. A.; Lee, W.H.K.
1991-01-01
Using simulated coda waves, the resolution of the single-scattering model to extract coda Q (Qc) and its power-law frequency dependence was tested. The back-scattering model of Aki and Chouet (1975) and the single isotropic-scattering model of Sato (1977) were examined. The results indicate that: (1) the input Qc models are reasonably well approximated by the two methods; (2) almost equal Qc values are recovered when the techniques sample the same coda windows; (3) low-Qc models are well estimated in the frequency domain from the early and late parts of the coda; and (4) models with high Qc values are more accurately extracted from late coda measurements. © 1991 Birkhäuser Verlag.
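For orientation, coda Q at a given frequency is typically extracted from the single-scattering decay model A(t) = S·t⁻¹·exp(−πft/Qc) by linear regression; a sketch with synthetic data (not the study's code):

    import numpy as np

    f, Qc_true = 6.0, 200.0                       # centre frequency (Hz) and target Qc
    t = np.linspace(20, 60, 200)                  # lapse times in the coda window (s)
    A = 5e3 * t**-1 * np.exp(-np.pi * f * t / Qc_true)

    # Linearize: ln(A*t) = ln(S) - (pi*f/Qc) * t, then fit a straight line.
    slope, intercept = np.polyfit(t, np.log(A * t), 1)
    Qc_est = -np.pi * f / slope
    print(f"recovered Qc = {Qc_est:.1f}")         # ~200 for noise-free data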
Quantum key distribution using card, base station and trusted authority
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nordholt, Jane E.; Hughes, Richard John; Newell, Raymond Thorson
Techniques and tools for quantum key distribution ("QKD") between a quantum communication ("QC") card, base station and trusted authority are described herein. In example implementations, a QC card contains a miniaturized QC transmitter and couples with a base station. The base station provides a network connection with the trusted authority and can also provide electric power to the QC card. When coupled to the base station, after authentication by the trusted authority, the QC card acquires keys through QKD with the trusted authority. The keys can be used to set up secure communication, for authentication, for access control, or for other purposes. The QC card can be implemented as part of a smart phone or other mobile computing device, or the QC card can be used as a fillgun for distribution of the keys.
Quantum key distribution using card, base station and trusted authority
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nordholt, Jane Elizabeth; Hughes, Richard John; Newell, Raymond Thorson
Techniques and tools for quantum key distribution ("QKD") between a quantum communication ("QC") card, base station and trusted authority are described herein. In example implementations, a QC card contains a miniaturized QC transmitter and couples with a base station. The base station provides a network connection with the trusted authority and can also provide electric power to the QC card. When coupled to the base station, after authentication by the trusted authority, the QC card acquires keys through QKD with a trusted authority. The keys can be used to set up secure communication, for authentication, for access control, or for other purposes. The QC card can be implemented as part of a smart phone or other mobile computing device, or the QC card can be used as a fillgun for distribution of the keys.
Applying Sigma Metrics to Reduce Outliers.
Litten, Joseph
2017-03-01
Sigma metrics can be used to predict assay quality, allowing easy comparison of instrument quality and predicting which tests will require minimal quality control (QC) rules to monitor the performance of the method. A Six Sigma QC program can result in fewer controls and fewer QC failures for methods with a sigma metric of 5 or better. The higher the number of methods with a sigma metric of 5 or better, the lower the costs for reagents, supplies, and control material required to monitor the performance of the methods. Copyright © 2016 Elsevier Inc. All rights reserved.
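The sigma metric referred to above is conventionally computed as sigma = (TEa − |bias|) / CV, with all terms in percent; a one-line sketch with invented assay values:

    def sigma_metric(tea_pct, bias_pct, cv_pct):
        # TEa: allowable total error; bias and CV from method validation / QC data.
        return (tea_pct - abs(bias_pct)) / cv_pct

    print(sigma_metric(tea_pct=10.0, bias_pct=1.5, cv_pct=1.4))  # ~6.1: minimal QC rules suffice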
Yago, Martín
2017-05-01
QC planning based on risk management concepts can reduce the probability of harming patients due to an undetected out-of-control error condition. It does this by selecting appropriate QC procedures to decrease the number of erroneous results reported. The selection can be easily made by using published nomograms for simple QC rules when the out-of-control condition results in increased systematic error. However, increases in random error also occur frequently and are difficult to detect, which can result in erroneously reported patient results. A statistical model was used to construct charts for the 1ks and X̄/χ² rules. The charts relate the increase in the number of unacceptable patient results reported due to an increase in random error to the capability of the measurement procedure. They thus allow for QC planning based on the risk of patient harm due to the reporting of erroneous results. 1ks rules are simple, all-around rules. Their ability to deal with increases in within-run imprecision is minimally affected by the possible presence of significant, stable, between-run imprecision. X̄/χ² rules perform better when the number of controls analyzed during each QC event is increased to improve QC performance. Using nomograms simplifies the selection of statistical QC procedures to limit the number of erroneous patient results reported due to an increase in analytical random error. The selection largely depends on the presence or absence of stable between-run imprecision. © 2017 American Association for Clinical Chemistry.
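As a reminder of how a 1ks rule operates: a run is rejected when a single control result falls outside the target mean ± k standard deviations. A minimal sketch with invented control targets:

    def rule_1ks(control_value, target_mean, target_sd, k=3.0):
        # 1_3s corresponds to k = 3; smaller k gives a more sensitive, noisier rule.
        z = (control_value - target_mean) / target_sd
        return "reject run" if abs(z) > k else "accept run"

    print(rule_1ks(control_value=108.0, target_mean=100.0, target_sd=2.5))  # reject (z = 3.2)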
Srivastava, Praveen; Moorthy, Ganesh S; Gross, Robert; Barrett, Jeffrey S
2013-01-01
A selective and highly sensitive method for the determination of the non-nucleoside reverse transcriptase inhibitor (NNRTI) efavirenz in human plasma has been developed and fully validated, based on high performance liquid chromatography tandem mass spectrometry (LC-MS/MS). Sample preparation involved protein precipitation followed by one-to-one dilution with water. The analyte, efavirenz, was separated by high performance liquid chromatography and detected with tandem mass spectrometry in negative ionization mode with multiple reaction monitoring (MRM). Efavirenz and ¹³C₆-efavirenz (internal standard) were detected via the MRM transitions m/z 314.20→243.90 and m/z 320.20→249.90, respectively. A gradient program was used to elute the analytes, using 0.1% formic acid in water and 0.1% formic acid in acetonitrile as mobile phase solvents at a flow rate of 0.3 mL/min. The total run time was 5 min and the retention times for the internal standard (¹³C₆-efavirenz) and efavirenz were approximately 2.6 min. The calibration curves showed linearity (coefficient of regression, r>0.99) over the concentration range of 1.0-2,500 ng/mL. The intraday precision, based on the standard deviation of replicates, was 9.24% at the lower limit of quantification (LLOQ) and ranged from 2.41% to 6.42% for quality control (QC) samples, with accuracy of 112% for the LLOQ and 100-111% for QC samples. The interday precision was 12.3% for the LLOQ and 3.03-9.18% for QC samples, and the accuracy was 108% and 95.2-108%, respectively. Stability studies showed that efavirenz was stable under the expected conditions for sample preparation and storage. The lower limit of quantification for efavirenz was 1 ng/mL. The analytical method showed excellent sensitivity, precision, and accuracy. This method is robust and is being successfully applied for therapeutic drug monitoring and pharmacokinetic studies in HIV-infected patients.
UK audit of glomerular filtration rate measurement from plasma sampling in 2013.
Murray, Anthony W; Lawson, Richard S; Cade, Sarah C; Hall, David O; Kenny, Bob; O'Shaughnessy, Emma; Taylor, Jon; Towey, David; White, Duncan; Carson, Kathryn
2014-11-01
An audit was carried out into UK glomerular filtration rate (GFR) calculation. The results were compared with an identical 2001 audit. Participants used their routine method to calculate GFR for 20 data sets (four plasma samples) in millilitres per minute, and also the GFR normalized for body surface area. Some unsound data sets were included to analyse the applied quality control (QC) methods. Variability between centres was assessed for each data set, compared with the national median and a reference value calculated using the method recommended in the British Nuclear Medicine Society guidelines. The influence of the number of samples on variability was studied. Supplementary data were requested on workload and methodology. The 59 returns showed widespread standardization. The applied early exponential clearance correction was the main contributor to the observed variability. These corrections were applied by 97% of centres (50% in 2001), with 80% using the recommended averaged Brøchner-Mortensen correction. Approximately 75% applied the recommended Haycock body surface area formula for adults (78% for children). The effect of the number of samples used was not significant. There was wide variability in the applied QC techniques, especially in terms of the use of the volume of distribution. The widespread adoption of the guidelines has harmonized national GFR calculation compared with the previous audit. Additional standardization could further reduce variability. This audit has highlighted the need to address the national standardization of QC methods. Radionuclide techniques are confirmed as the preferred method for GFR measurement when an unequivocal result is required.
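For context, the calculation being audited is typically a slope-intercept clearance with the averaged Brøchner-Mortensen correction and Haycock surface-area normalization; the sketch below uses invented sample data and is not the audit's reference implementation.

    import numpy as np

    dose = 1.0e7                                     # injected activity (arbitrary units)
    t = np.array([120, 150, 180, 240.0])             # minutes post-injection
    p = np.array([520, 430, 356, 245.0])             # plasma activity per mL

    lam, lnA = np.polyfit(t, np.log(p), 1)           # ln(p) = ln(A) + lam*t (lam < 0)
    auc = np.exp(lnA) / -lam                         # area under the mono-exponential
    gfr = dose / auc                                 # uncorrected clearance in mL/min

    gfr_bm = 0.990778 * gfr - 0.001218 * gfr**2      # averaged Brochner-Mortensen correction

    bsa = 0.024265 * 70.0**0.5378 * 175.0**0.3964    # Haycock formula (70 kg, 175 cm)
    print(f"GFR {gfr_bm:.1f} mL/min, normalized {gfr_bm * 1.73 / bsa:.1f} mL/min/1.73 m^2")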
This Multi-Site QAPP presents the organization, data quality objectives (DQOs), a set of anticipated activities, sample analysis, data handling, and the specific Quality Assurance/Quality Control (QA/QC) procedures associated with studies done in EPA Region 5.
Thonusin, Chanisa; IglayReger, Heidi B; Soni, Tanu; Rothberg, Amy E; Burant, Charles F; Evans, Charles R
2017-11-10
In recent years, mass spectrometry-based metabolomics has increasingly been applied to large-scale epidemiological studies of human subjects. However, the successful use of metabolomics in this context is subject to the challenge of detecting biologically significant effects despite substantial intensity drift that often occurs when data are acquired over a long period or in multiple batches. Numerous computational strategies and software tools have been developed to aid in correcting for intensity drift in metabolomics data, but most of these techniques are implemented using command-line-driven software and custom scripts which are not accessible to all end users of metabolomics data. Further, it has not yet become routine practice to assess the quantitative accuracy of drift correction against techniques which enable true absolute quantitation, such as isotope dilution mass spectrometry. We developed an Excel-based tool, MetaboDrift, to visually evaluate and correct for intensity drift in a multi-batch liquid chromatography - mass spectrometry (LC-MS) metabolomics dataset. The tool enables drift correction based either on quality control (QC) samples analyzed throughout the batches or on QC-sample-independent methods. We applied MetaboDrift to an original set of clinical metabolomics data from a mixed-meal tolerance test (MMTT). The performance of the method was evaluated for multiple classes of metabolites by comparison with normalization using isotope-labeled internal standards (IS). QC sample-based intensity drift correction significantly improved correlation with IS-normalized data and resulted in the detection of additional metabolites with significant physiological responses to the MMTT. The relative merits of different QC-sample curve-fitting strategies are discussed in the context of batch size and drift pattern complexity. Our drift correction tool offers a practical, simplified approach to drift correction and batch combination in large metabolomics studies. Copyright © 2017 Elsevier B.V. All rights reserved.
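A minimal sketch of QC-sample-based drift correction of the kind MetaboDrift implements (this is not MetaboDrift's code; the data, QC schedule and polynomial fit order are invented):

    import numpy as np

    order = np.arange(60)                             # injection order
    signal = 1e5 * (1 - 0.004 * order) * (1 + np.random.default_rng(1).normal(0, 0.02, 60))
    is_qc = order % 10 == 0                           # every 10th injection is a pooled QC

    coef = np.polyfit(order[is_qc], signal[is_qc], 2) # low-order fit through the QCs
    drift = np.polyval(coef, order)
    corrected = signal / (drift / drift.mean())       # rescale to remove the fitted drift

    print(f"QC CV before {signal[is_qc].std()/signal[is_qc].mean():.1%}, "
          f"after {corrected[is_qc].std()/corrected[is_qc].mean():.1%}")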
Embankment quality and assessment of moisture control implementation.
DOT National Transportation Integrated Search
2016-02-01
A specification for contractor moisture quality control (QC) in roadway embankment construction has been in use for approximately 10 years in Iowa on about 190 projects. The use of this QC specification and the development of the soils certificatio...
Betsou, Fay; Bulla, Alexandre; Cho, Sang Yun; Clements, Judith; Chuaqui, Rodrigo; Coppola, Domenico; De Souza, Yvonne; De Wilde, Annemieke; Grizzle, William; Guadagni, Fiorella; Gunter, Elaine; Heil, Stacey; Hodgkinson, Verity; Kessler, Joseph; Kiehntopf, Michael; Kim, Hee Sung; Koppandi, Iren; Shea, Katheryn; Singh, Rajeev; Sobel, Marc; Somiari, Stella; Spyropoulos, Demetri; Stone, Mars; Tybring, Gunnel; Valyi-Nagy, Klara; Van den Eynden, Gert; Wadhwa, Lalita
2016-10-01
This technical report presents quality control (QC) assays that can be performed in order to qualify clinical biospecimens that have been biobanked for use in research. Some QC assays are specific to a disease area. Some QC assays are specific to a particular downstream analytical platform. When such a qualification is not possible, QC assays are presented that can be performed to stratify clinical biospecimens according to their biomolecular quality.
Jackson, David; Bramwell, David
2013-12-16
Proteomics technologies can be effective for the discovery and assay of protein forms altered with disease. However, few examples of successful biomarker discovery yet exist. Critical to addressing this is the widespread implementation of appropriate QC (quality control) methodology. Such QC should combine the rigour of clinical laboratory assays with a suitable treatment of the complexity of the proteome by targeting separate assignable causes of variation. We demonstrate an approach, metric and example workflow for users to develop such targeted QC rules systematically and objectively, using a publicly available plasma DIGE data set. Hierarchical clustering analysis of standard channels is first used to discover correlated groups of features corresponding to specific assignable sources of technical variation. These effects are then quantified using a statistical distance metric and followed on control charts. This allows measurement of process drift and the detection of runs that outlie for any given effect. A known technical issue on originally rejected gels was detected, validating this approach, and relevant novel effects were also detected and classified effectively. Our approach was effective for 2-DE QC. Whilst we demonstrated this in a retrospective DIGE experiment, the principles would apply to ongoing QC and other proteomic technologies. This work asserts that properly carried out QC is essential to proteomics discovery experiments. Its significance is that it provides one possible novel framework for applying such methods, with a particular consideration of how to handle the complexity of the proteome. It not only focusses on 2-DE-based methodology but also demonstrates general principles. A combination of results and discussion based upon a publicly available data set is used to illustrate the approach and allows a structured discussion of factors that experimenters may wish to bear in mind in other situations. The demonstration is on retrospective data only for reasons of scope, but the principles applied are also important for ongoing QC, and this work serves as a step towards a later demonstration of that application. This article is part of a Special Issue entitled: Standardization and Quality Control in Proteomics. © 2013.
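One simple way to realize the "statistical distance followed on control charts" idea is a standardized distance of each run's grouped features from a baseline; the sketch below is an invented illustration, not the paper's metric.

    import numpy as np

    rng = np.random.default_rng(2)
    baseline = rng.normal(0, 1, size=(20, 5))         # 20 reference runs x 5 grouped features
    mu, sd = baseline.mean(axis=0), baseline.std(axis=0, ddof=1)

    def run_distance(x):
        # Root-mean-square of per-feature z-scores: ~1 for in-control runs.
        z = (x - mu) / sd
        return float(np.sqrt(np.mean(z ** 2)))

    print(run_distance(rng.normal(0, 1, 5)))          # typical run: ~1
    print(run_distance(np.array([3, 3, 3, 3, 3.0])))  # outlying run: ~3, flag on the chart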
76 FR 67315 - Supplemental Nutrition Assistance Program: Quality Control Error Tolerance Threshold
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-01
This direct final rule amends the Quality Control (QC) review error threshold in our regulations from $25.00 to $50.00. The purpose of raising the QC error threshold is to make permanent the temporary threshold change that was required by the American Recovery and Reinvestment Act of 2009. This change does not have an impact on the public. The QC system measures the accuracy of the eligibility system for the Supplemental Nutrition Assistance Program (SNAP).
Quality control and quality assurance in genotypic data for genome-wide association studies
Laurie, Cathy C.; Doheny, Kimberly F.; Mirel, Daniel B.; Pugh, Elizabeth W.; Bierut, Laura J.; Bhangale, Tushar; Boehm, Frederick; Caporaso, Neil E.; Cornelis, Marilyn C.; Edenberg, Howard J.; Gabriel, Stacy B.; Harris, Emily L.; Hu, Frank B.; Jacobs, Kevin; Kraft, Peter; Landi, Maria Teresa; Lumley, Thomas; Manolio, Teri A.; McHugh, Caitlin; Painter, Ian; Paschall, Justin; Rice, John P.; Rice, Kenneth M.; Zheng, Xiuwen; Weir, Bruce S.
2011-01-01
Genome-wide scans of nucleotide variation in human subjects are providing an increasing number of replicated associations with complex disease traits. Most of the variants detected have small effects and, collectively, they account for a small fraction of the total genetic variance. Very large sample sizes are required to identify and validate findings. In this situation, even small sources of systematic or random error can cause spurious results or obscure real effects. The need for careful attention to data quality has been appreciated for some time in this field, and a number of strategies for quality control and quality assurance (QC/QA) have been developed. Here we extend these methods and describe a system of QC/QA for genotypic data in genome-wide association studies. This system includes some new approaches that (1) combine analysis of allelic probe intensities and called genotypes to distinguish gender misidentification from sex chromosome aberrations, (2) detect autosomal chromosome aberrations that may affect genotype calling accuracy, (3) infer DNA sample quality from relatedness and allelic intensities, (4) use duplicate concordance to infer SNP quality, (5) detect genotyping artifacts from dependence of Hardy-Weinberg equilibrium (HWE) test p-values on allelic frequency, and (6) demonstrate sensitivity of principal components analysis (PCA) to SNP selection. The methods are illustrated with examples from the ‘Gene Environment Association Studies’ (GENEVA) program. The results suggest several recommendations for QC/QA in the design and execution of genome-wide association studies. PMID:20718045
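As an example of one ingredient above, point (5) relies on the one-degree-of-freedom chi-square test of Hardy-Weinberg equilibrium; a minimal sketch with invented genotype counts:

    from scipy.stats import chi2

    def hwe_test(n_aa, n_ab, n_bb):
        # Expected genotype counts under HWE from the observed allele frequency.
        n = n_aa + n_ab + n_bb
        p = (2 * n_aa + n_ab) / (2 * n)
        exp = [n * p * p, 2 * n * p * (1 - p), n * (1 - p) * (1 - p)]
        stat = sum((o - e) ** 2 / e for o, e in zip((n_aa, n_ab, n_bb), exp))
        return chi2.sf(stat, df=1)                   # p-value; tiny values flag artifacts

    print(hwe_test(298, 489, 213))                   # near-HWE counts: large p expected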
77 FR 75968 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-26
... information unless it displays a currently valid OMB control number. Food and Nutrition Service Title: Quality... required to perform Quality Control (QC) review for the Supplemental Nutrition Assistance Program (SNAP). The FNS-380-1, Quality Control Review Schedule is for State use to collect both QC data and case...
quantGenius: implementation of a decision support system for qPCR-based gene quantification.
Baebler, Špela; Svalina, Miha; Petek, Marko; Stare, Katja; Rotter, Ana; Pompe-Novak, Maruša; Gruden, Kristina
2017-05-25
Quantitative molecular biology remains a challenge for researchers due to inconsistent approaches for the control of errors in the final results. Because several factors can influence the final result, quantitative analysis and interpretation of qPCR data are still not trivial. Together with the development of high-throughput qPCR platforms, there is a need for a tool allowing for robust, reliable and fast nucleic acid quantification. We have developed "quantGenius" ( http://quantgenius.nib.si ), an open-access web application for reliable qPCR-based quantification of nucleic acids. The quantGenius workflow interactively guides the user through data import, quality control (QC) and calculation steps. The input is machine- and chemistry-independent. Quantification is performed using the standard curve approach, with normalization to one or several reference genes. The special feature of the application is the implementation of a user-guided, QC-based decision support system, based on qPCR standards, that takes into account pipetting errors, assay amplification efficiencies, limits of detection and quantification of the assays, as well as the control of PCR inhibition in individual samples. The intermediate calculations and final results are exportable in a data matrix suitable for further statistical analysis or visualization. We additionally compare the most important features of quantGenius with similar advanced software tools and illustrate the importance of a proper QC system in the analysis of qPCR data in two use cases. To our knowledge, quantGenius is the only qPCR data analysis tool that integrates QC-based decision support and will help scientists to obtain reliable results which are the basis for biologically meaningful data interpretation.
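The standard-curve step that quantGenius automates can be sketched as follows (Cq values are invented; this is not quantGenius code): fit Cq against log10 input, derive the amplification efficiency from the slope, then quantify unknowns and normalize to a reference gene.

    import numpy as np

    log_qty = np.log10([1e1, 1e2, 1e3, 1e4, 1e5])
    cq = np.array([31.8, 28.5, 25.1, 21.8, 18.4])        # standard dilution series

    slope, intercept = np.polyfit(log_qty, cq, 1)
    efficiency = 10 ** (-1 / slope) - 1                  # ~1.0 means 100% per cycle
    quantify = lambda c: 10 ** ((c - intercept) / slope) # invert the standard curve

    target, reference = quantify(26.3), quantify(24.1)
    print(f"efficiency {efficiency:.1%}; normalized expression {target / reference:.3f}")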
Stapanian, Martin A.; Lewis, Timothy E; Palmer, Craig J.; Middlebrook Amos, Molly
2016-01-01
Unlike most laboratory studies, rigorous quality assurance/quality control (QA/QC) procedures may be lacking in ecosystem restoration ("ecorestoration") projects, despite legislative mandates in the United States. This is due, in part, to ecorestoration specialists making the false assumption that some types of data (e.g. discrete variables such as species identification and abundance classes) are not subject to evaluations of data quality. Moreover, the complex, adaptive, and nonlinear organizations responsible for monitoring the success of ecorestoration projects tend to unconsciously minimize disorder, QA/QC being an activity perceived as creating disorder. We discuss similarities and differences in assessing precision and accuracy for field and laboratory data. Although assessing the precision and accuracy of ecorestoration field data is conceptually the same as for laboratory data, the manner in which these data quality attributes are assessed is different. From a sample analysis perspective, a field crew is comparable to a laboratory instrument that requires regular "recalibration," with results obtained by experts at the same plot treated as laboratory calibration standards. Unlike laboratory standards and reference materials, the "true" value for many field variables is commonly unknown. In the laboratory, specific QA/QC samples assess error for each aspect of the measurement process, whereas field revisits assess precision and accuracy of the entire data collection process following initial calibration. Rigorous QA/QC data in an ecorestoration project are essential for evaluating the success of a project, and they provide the only objective "legacy" of the dataset for potential legal challenges and future uses.
Chaturvedi, Arvind K; Craft, Kristi J; Cardona, Patrick S; Rogers, Paul B; Canfield, Dennis V
2009-05-01
During toxicological evaluations of samples from fatally injured pilots involved in civil aviation accidents, a high degree of quality control/quality assurance (QC/QA) is maintained. Under this philosophy, the Federal Aviation Administration (FAA) started a forensic toxicology proficiency-testing (PT) program in July 1991. In continuation of the first seven years of the PT findings reported earlier, PT findings of the next seven years are summarized herein. Twenty-eight survey samples (12 urine, 9 blood, and 7 tissue homogenate) with/without alcohols/volatiles, drugs, and/or putrefactive amine(s) were submitted to an average of 31 laboratories, of which an average of 25 participants returned their results. Analytes in survey samples were correctly identified and quantitated by a large number of participants, but some false positives of concern were reported. It is anticipated that the FAA's PT program will continue to serve the forensic toxicology community through this important part of the QC/QA for laboratory accreditations.
INAA Application for Trace Element Determination in Biological Reference Material
NASA Astrophysics Data System (ADS)
Atmodjo, D. P. D.; Kurniawati, S.; Lestiani, D. D.; Adventini, N.
2017-06-01
Trace element determination in biological samples is often used in the study of health and toxicology. Because trace elements can be both essential and toxic, their determination requires an accurate method, which implies that a good quality control (QC) procedure should be performed. In this study, QC for trace element determination in biological samples was applied by analyzing the Standard Reference Material (SRM) Bovine Muscle 8414 (NIST) using Instrumental Neutron Activation Analysis (INAA). Three selected trace elements, Fe, Zn, and Se, were determined. Accuracy is reported as %recovery and precision as the %coefficient of variation (%CV). The results showed that the %recovery of Fe, Zn, and Se was in the range of 99.4-107%, 92.7-103%, and 91.9-112%, respectively, whereas the %CV values were 2.92, 3.70, and 5.37%, respectively. These results show that the INAA method is precise and accurate for trace element determination in biological matrices.
Gray, James L.; Kanagy, Leslie K.; Furlong, Edward T.; McCoy, Jeff W.; Kanagy, Chris J.
2011-01-01
On April 22, 2010, the explosion on and subsequent sinking of the Deepwater Horizon oil drilling platform resulted in the release of crude oil into the Gulf of Mexico. At least 4.4 million barrels had been released into the Gulf of Mexico through July 15, 2010, 10 to 29 percent of which was chemically dispersed, primarily using two dispersant formulations. Initially, the dispersant Corexit 9527 was used, and when existing stocks of that formulation were exhausted, Corexit 9500 was used. Over 1.8 million gallons of the two dispersants were applied in the first 3 months after the spill. This report presents the development of an analytical method to analyze one of the primary surfactant components of both Corexit formulations, di(ethylhexyl) sodium sulfosuccinate (DOSS), the preliminary results, and the associated quality assurance/quality control (QA/QC) from samples collected from various points on the Gulf Coast between Texas and Florida. Seventy water samples and 8 field QC samples were collected before the predicted landfall of oil (pre-landfall) on the Gulf Coast, and 51 water samples and 10 field QC samples after the oil made landfall (post-landfall). Samples were collected in Teflon® bottles and stored at −20°C until analysis. Extraction of whole-water samples used sorption onto a polytetrafluoroethylene (PTFE) filter to isolate DOSS, with subsequent 50 percent methanol/water elution of the combined dissolved and particulate DOSS fractions. High-performance liquid chromatography/tandem mass spectrometry (LC/MS/MS) was used to identify and quantify DOSS by the isotope dilution method, using a custom-synthesized ¹³C₄-DOSS labeled standard. Because of the ubiquitous presence of DOSS in laboratory reagent water, a chromatographic column was installed in the LC/MS/MS between the system pumps and the sample injector that separated this ambient background DOSS contamination from the sample DOSS, minimizing one source of blank contamination. Laboratory and field QA/QC for pre-landfall samples included laboratory reagent spike and blank samples, a total of 34 replicate analyses for the 78 environmental and field blank samples, and 11 randomly chosen laboratory matrix spike samples. Laboratory and field QA/QC for post-landfall samples included laboratory reagent spike and blank samples, a laboratory 'in-bottle' duplicate for each sample, and analysis of 24 randomly chosen laboratory matrix spike samples. Average DOSS recovery of 89 ± 9.5 percent in all native (non-¹³C₄-DOSS) spikes was observed, with a mean relative percent difference between sample duplicates of 36 percent. The reporting limit for this analysis was 0.25 micrograms per liter due to blank limitations; DOSS was not detected above that concentration in any samples collected in October (after oil landfall at certain study sites). It was detected above 0.25 micrograms per liter in 3 samples collected prior to oil landfall, but none exceeded the Environmental Protection Agency aquatic life criterion of 40 micrograms per liter.
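The isotope dilution quantification used here follows the usual pattern of ratioing the native analyte to the labeled spike; the sketch below uses invented peak areas, spike mass and response factor, not the study's calibration.

    def isotope_dilution_conc(area_native, area_labeled, labeled_ng, rrf, volume_l):
        # Concentration (ug/L) = (native/labeled area ratio) / RRF * spike mass / volume.
        return (area_native / area_labeled) / rrf * (labeled_ng / 1000.0) / volume_l

    # 13C4-DOSS spike of 50 ng into a 0.1 L sample, relative response factor near 1:
    print(isotope_dilution_conc(area_native=8.2e4, area_labeled=9.0e4,
                                labeled_ng=50.0, rrf=1.0, volume_l=0.1))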
Unanticipated error in HbA(1c) measurement on the HLC-723 G7 analyzer.
van den Ouweland, Johannes M W; de Keijzer, Marinus H; van Daal, Henny
2010-04-01
Investigation of falsely elevated HbA(1c) measurements on the HLC-723 G7 analyser. Comparison of HbA(1c) in blood samples that were diluted either in hemolysis reagent or in water. HbA(1c) results became falsely elevated when samples were diluted in hemolysis reagent, but not in water. QC procedures failed to detect this error because calibrator and QC samples were manually diluted in water, according to the manufacturer's instructions, whereas patient samples were automatically diluted using hemolysis reagent. After replacement of the instrument's sample loop and rotor seal, comparable HbA(1c) results were obtained irrespective of dilution with hemolysis reagent or water. This case illustrates the importance of treating calibrator and QC materials similarly to routine patient samples in order to prevent unnoticed drift in patient HbA(1c) results. Copyright 2010 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
WE-AB-206-00: Diagnostic QA/QC Hands-On Workshop
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
The involvement of medical physicists in diagnostic ultrasound imaging service is increasing due to QC and accreditation requirements. The goal of this ultrasound hands-on workshop is to demonstrate quality control (QC) testing in diagnostic ultrasound and to provide updates on ACR ultrasound accreditation requirements. The first half of this workshop will include two presentations reviewing diagnostic ultrasound QA/QC and ACR ultrasound accreditation requirements. The second half of the workshop will include live demonstrations of basic QC tests. An array of ultrasound testing phantoms and ultrasound scanners will be available for attendees to learn diagnostic ultrasound QC in a hands-on environment with live demonstrations and on-site instructors. The targeted attendees are medical physicists in diagnostic imaging. Learning Objectives: Gain familiarity with common elements of a QA/QC program for diagnostic ultrasound imaging; identify QC tools available for testing diagnostic ultrasound systems and learn how to use these tools; learn ACR ultrasound accreditation requirements. Jennifer Walter is an employee of the American College of Radiology on Ultrasound Accreditation.
Robust modular product family design
NASA Astrophysics Data System (ADS)
Jiang, Lan; Allada, Venkat
2001-10-01
This paper presents a modified Taguchi methodology to improve the robustness of modular product families against changes in customer requirements. The general research questions posed in this paper are: (1) how to effectively design a product family (PF) that is robust enough to accommodate future customer requirements; and (2) how far into the future designers should look to design a robust product family. An example of a simplified vacuum product family is used to illustrate our methodology. In the example, customer requirements are selected as signal factors; future changes in customer requirements are selected as noise factors; an index called the quality characteristic (QC) is defined to evaluate the vacuum product family; and the module instance matrix (M) is selected as the control factor. Initially, a relation between the objective function (QC) and the control factor (M) is established, and then the feasible M space is systematically explored using a simplex method to determine the optimum M and the corresponding QC values. Next, various noise levels at different time points are introduced into the system. For each noise level, the optimal values of M and QC are computed and plotted on a QC-chart. The tunable time period of the control factor (the module matrix, M) is computed using the QC-chart. The tunable time period represents the maximum time for which a given control factor can be used to satisfy current and future customer needs. Finally, a robustness index is used to break the tunable time period into suitable time periods that designers should consider while designing product families.
Introducing Quality Control in the Chemistry Teaching Laboratory Using Control Charts
ERIC Educational Resources Information Center
Schazmann, Benjamin; Regan, Fiona; Ross, Mary; Diamond, Dermot; Paull, Brett
2009-01-01
Quality control (QC) measures are less prevalent in teaching laboratories than in commercial settings, possibly owing to a lack of commercial incentives or teaching resources. This article focuses on the use of QC assessment in the analytical techniques of high performance liquid chromatography (HPLC) and ultraviolet-visible spectroscopy (UV-vis) at…
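A control-chart exercise of the kind suitable for a teaching laboratory reduces to computing warning and action limits from replicate results; a minimal sketch with invented HPLC peak areas:

    import numpy as np

    areas = np.array([102.1, 99.8, 101.5, 100.2, 98.9, 101.0, 100.6, 99.5])
    mean, sd = areas.mean(), areas.std(ddof=1)

    warning = (mean - 2 * sd, mean + 2 * sd)            # 2s warning limits
    action = (mean - 3 * sd, mean + 3 * sd)             # 3s action limits
    new_result = 103.9
    status = ("action" if not action[0] <= new_result <= action[1]
              else "warning" if not warning[0] <= new_result <= warning[1]
              else "in control")
    print(f"mean {mean:.1f}, SD {sd:.2f}, new result {new_result} -> {status}")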
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lu, Z.
The involvement of medical physicists in diagnostic ultrasound imaging service is increasing due to QC and accreditation requirements. The goal of this ultrasound hands-on workshop is to demonstrate quality control (QC) testing in diagnostic ultrasound and to provide updates on ACR ultrasound accreditation requirements. The first half of this workshop will include two presentations reviewing diagnostic ultrasound QA/QC and ACR ultrasound accreditation requirements. The second half of the workshop will include live demonstrations of basic QC tests. An array of ultrasound testing phantoms and ultrasound scanners will be available for attendees to learn diagnostic ultrasound QC in a hands-on environment with live demonstrations and on-site instructors. The targeted attendees are medical physicists in diagnostic imaging. Learning Objectives: Gain familiarity with common elements of a QA/QC program for diagnostic ultrasound imaging; identify QC tools available for testing diagnostic ultrasound systems and learn how to use these tools; learn ACR ultrasound accreditation requirements. Jennifer Walter is an employee of the American College of Radiology on Ultrasound Accreditation.
WE-AB-206-01: Diagnostic Ultrasound Imaging Quality Assurance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zagzebski, J.
The involvement of medical physicists in diagnostic ultrasound imaging service is increasing due to QC and accreditation requirements. The goal of this ultrasound hands-on workshop is to demonstrate quality control (QC) testing in diagnostic ultrasound and to provide updates on ACR ultrasound accreditation requirements. The first half of this workshop will include two presentations reviewing diagnostic ultrasound QA/QC and ACR ultrasound accreditation requirements. The second half of the workshop will include live demonstrations of basic QC tests. An array of ultrasound testing phantoms and ultrasound scanners will be available for attendees to learn diagnostic ultrasound QC in a hands-on environment with live demonstrations and on-site instructors. The targeted attendees are medical physicists in diagnostic imaging. Learning Objectives: Gain familiarity with common elements of a QA/QC program for diagnostic ultrasound imaging; identify QC tools available for testing diagnostic ultrasound systems and learn how to use these tools; learn ACR ultrasound accreditation requirements. Jennifer Walter is an employee of the American College of Radiology on Ultrasound Accreditation.
Greene, Karen E.
1997-01-01
A study of the ambient ground-water quality in the vicinity of Naval Submarine Base (SUBASE) Bangor was conducted to provide the U.S. Navy with background levels of selected constituents. The Navy needs this information to plan and manage cleanup activities on the base. During March and April 1995, 136 water-supply wells were sampled for common ions, trace elements, and organic compounds; not all wells were sampled for all constituents. Man-made organic compounds were detected in only two of fifty wells, and the sources of these organic compounds were attributed to activities in the immediate vicinities of these off-base wells. Drinking water standards for trichloroethylene, iron, and manganese were exceeded in one of these wells, which was probably contaminated by an old local (off-base) dump. Ground water from wells open to the following hydrogeologic units (in order from shallow to deep) was investigated: the Vashon till confining unit (Qvt, three wells); the Vashon aquifer (Qva, 54 wells); the Upper confining unit (QC1, 16 wells); the Permeable interbeds within QC1 (QC1pi, 34 wells); and the Sea-level aquifer (QA1, 29 wells). The 50th and 90th percentile ambient background levels of 35 inorganic constituents were determined for each hydrogeologic unit. At least ten measurements were required for a constituent in each hydrogeologic unit for determination of ambient background levels, and data for three wells determined to be affected by localized activities were excluded from these analyses. The only drinking water standards exceeded by ambient background levels were secondary maximum contaminant levels for iron (300 micrograms per liter), in QC1 and QC1pi, and manganese (50 micrograms per liter), in all of the units. The 90th percentile values for arsenic in QC1pi, QA1, and for the entire study area are above 5 micrograms per liter, the Model Toxics Control Act Method A value for protecting drinking water, but well below the maximum contaminant level of 50 micrograms per liter for arsenic. The manganese standard was exceeded in 38 wells and the standard for iron was exceeded in 12 wells. Most of these wells were in QC1 or QC1pi and had dissolved oxygen concentrations of less than 1 milligram per liter and dissolved organic carbon concentrations greater than 1 milligram per liter. The dissolved oxygen concentration is generally lower in the deeper units, while pH increases; pH fell outside the recommended range of 6.5-8.5 standard units in 9 wells. The common-ion chemistry was similar for all of the units.
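The ambient-background statistics described above are simple percentiles computed per hydrogeologic unit; a minimal sketch with invented iron concentrations (the real study used the measured well data):

    import numpy as np

    iron_ugL = {"Qva": [12, 30, 55, 80, 150, 210, 320, 400, 95, 60],
                "QC1": [150, 420, 800, 260, 1300, 90, 640, 380, 510, 700]}
    for unit, values in iron_ugL.items():
        p50, p90 = np.percentile(values, [50, 90])   # 50th and 90th percentile background
        print(f"{unit}: median {p50:.0f} ug/L, 90th percentile {p90:.0f} ug/L")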
7 CFR 275.21 - Quality control review reports.
Code of Federal Regulations, 2010 CFR
2010-01-01
... terminals, the State agency shall submit the results of each QC review in a format specified by FNS. Upon... in the individual case records, or legible copies of that material, as well as legible hard copies of... selection and completion on the Form FNS-248, Status of Sample Selection and Completion or other format...
7 CFR 275.13 - Review of negative cases.
Code of Federal Regulations, 2011 CFR
2011-01-01
7 CFR 275.13 (2011): AGRICULTURE, FOOD STAMP AND FOOD DISTRIBUTION PROGRAM, PERFORMANCE REPORTING SYSTEM, Quality Control (QC) Reviews. § 275.13 Review of negative cases. (a) General. A sample of actions to deny applications, or suspend or...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Angers, Crystal Plume; Bottema, Ryan; Buckley, Les
Purpose: Treatment unit uptime statistics are typically used to monitor radiation equipment performance. The Ottawa Hospital Cancer Centre has introduced the use of Quality Control (QC) test success as a quality indicator for equipment performance and the overall health of the equipment QC program. Methods: Implemented in 2012, QATrack+ is used to record and monitor over 1100 routine machine QC tests each month for 20 treatment and imaging units ( http://qatrackplus.com/ ). Using an SQL (structured query language) script, automated queries of the QATrack+ database are used to generate program metrics such as the number of QC tests executed and the percentage of tests passing, at tolerance, or at action. These metrics are compared against machine uptime statistics already reported within the program. Results: Program metrics for 2015 show good correlation between the pass rate of QC tests and uptime for a given machine. For the nine conventional linacs, the QC test success rate was consistently greater than 97%. The corresponding uptimes for these units are better than 98%. Machines that consistently show higher failure or tolerance rates in the QC tests have lower uptimes. This points either to poor machine performance requiring corrective action or to problems with the QC program. Conclusions: QATrack+ significantly improves the organization of QC data but can also aid in overall equipment management. Complementing machine uptime statistics with QC test metrics provides a more complete picture of overall machine performance and can be used to identify areas of improvement in the machine service and QC programs.
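A sketch of the kind of aggregate pass-rate query described above; note that the table and column names below are invented for illustration and do not reflect QATrack+'s actual database schema.

    import sqlite3

    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE test_instance (unit TEXT, status TEXT)")  # assumed schema
    con.executemany("INSERT INTO test_instance VALUES (?, ?)",
                    [("linac1", "pass")] * 97 + [("linac1", "tolerance")] * 2
                    + [("linac1", "action")])

    # Percentage of passing tests per unit, the metric compared against uptime.
    row = con.execute("""SELECT unit,
                                100.0 * SUM(status = 'pass') / COUNT(*) AS pct_pass
                         FROM test_instance GROUP BY unit""").fetchone()
    print(row)   # ('linac1', 97.0)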
Effects of Data Quality on the Characterization of Aerosol Properties from Multiple Sensors
NASA Technical Reports Server (NTRS)
Petrenko, Maksym; Ichoku, Charles; Leptoukh, Gregory
2011-01-01
Cross-comparison of aerosol properties between ground-based and spaceborne measurements is an important validation technique that helps to investigate the uncertainties of aerosol products acquired using spaceborne sensors. However, it has been shown that even minor differences in the cross-characterization procedure may significantly impact the results of such validation. Of particular consideration is the quality assurance / quality control (QA/QC) information - auxiliary data indicating a "confidence" level (e.g., Bad, Fair, Good, Excellent, etc.) conferred by the retrieval algorithms on the produced data. Depending on the treatment of available QA/QC information, a cross-characterization procedure has the potential of filtering out invalid data points, such as uncertain or erroneous retrievals, which tend to reduce the credibility of such comparisons. However, under certain circumstances, even high QA/QC values may not fully guarantee the quality of the data. For example, retrievals in proximity of a cloud might be particularly perplexing for an aerosol retrieval algorithm, resulting in invalid data that, nonetheless, could be assigned a high QA/QC confidence. In this presentation, we will study the effects of several QA/QC parameters on cross-characterization of aerosol properties between the data acquired by multiple spaceborne sensors. We will utilize the Multi-sensor Aerosol Products Sampling System (MAPSS), which provides a consistent platform for multi-sensor comparison, including collocation with measurements acquired by the ground-based Aerosol Robotic Network (AERONET). The multi-sensor spaceborne data analyzed include those acquired by the Terra-MODIS, Aqua-MODIS, Terra-MISR, Aura-OMI, Parasol-POLDER, and Calipso-CALIOP satellite instruments.
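A minimal sketch of QA/QC-aware screening before cross-comparison, assuming a hypothetical 0 (Bad) to 3 (Excellent) confidence encoding; real products each define their own QA flag conventions:

```python
# Each collocated pair holds a satellite retrieval, the AERONET reference,
# and the retrieval algorithm's QA flag (encoding assumed for illustration).
collocated = [
    {"aod_satellite": 0.21, "aod_aeronet": 0.19, "qa_flag": 3},
    {"aod_satellite": 0.55, "aod_aeronet": 0.20, "qa_flag": 1},  # dubious retrieval
    {"aod_satellite": 0.33, "aod_aeronet": 0.31, "qa_flag": 2},
]

MIN_QA = 2  # accept only 'Good' or better retrievals

accepted = [p for p in collocated if p["qa_flag"] >= MIN_QA]
rejected = len(collocated) - len(accepted)
print(f"kept {len(accepted)} pairs, screened out {rejected} low-confidence pairs")
```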
Quality control in urodynamics and the role of software support in the QC procedure.
Hogan, S; Jarvis, P; Gammie, A; Abrams, P
2011-11-01
This article aims to identify quality control (QC) best practice, to review published QC audits in order to identify how closely good practice is followed, and to carry out a market survey of the software features that support QC offered by urodynamics machines available in the UK. All UK distributors of urodynamic systems were contacted and asked to provide information on the software features relating to data quality of the products they supply. The results of the market survey show that the features offered by manufacturers differ greatly. Automated features, which can be turned off in most cases, include: cough recognition, detrusor contraction detection, and high pressure alerts. There are currently no systems that assess data quality based on published guidelines. A literature review of current QC guidelines for urodynamics was carried out; QC audits were included in the literature review to see how closely guidelines were being followed. This review highlights the fact that basic QC is not being carried out effectively by urodynamicists. Based on the software features currently available and the results of the literature review there is both the need and capacity for a greater degree of automation in relation to urodynamic data quality and accuracy assessment. Some progress has been made in this area and certain manufacturers have already developed automated cough detection. Copyright © 2011 Wiley Periodicals, Inc.
Quality control management and communication between radiologists and technologists.
Nagy, Paul G; Pierce, Benjamin; Otto, Misty; Safdar, Nabile M
2008-06-01
The greatest barrier to quality control (QC) in the digital imaging environment is the lack of communication and documentation between those who interpret images and those who acquire them. Paper-based QC methods are insufficient in a digital image management system. Problem work flow must be incorporated into reengineering efforts when migrating to a digital practice. The authors implemented a Web-based QC feedback tool to document and facilitate the communication of issues identified by radiologists. The goal was to promote a responsive and constructive tool that contributes to a culture of quality. The hypothesis was that by making it easier for radiologists to submit quality issues, the number of QC issues submitted would increase. The authors integrated their Web-based quality tracking system with a clinical picture archiving and communication system so that radiologists could report quality issues without disrupting clinical work flow. Graphical dashboarding techniques aid supervisors in using this database to identify the root causes of different types of issues. Over the initial 12-month rollout period, starting in the general section, the authors recorded 20 times more QC issues submitted by radiologists, accompanied by a rise in technologists' responsiveness to QC issues. For technologists with high numbers of QC issues, the incorporation of data from this tracking system proved useful in performance appraisals and in driving individual improvement. This tool is an example of the types of information technology innovations that can be leveraged to support QC in the digital imaging environment. Initial data suggest that the result is not only an improvement in quality but higher levels of satisfaction for both radiologists and technologists.
Kpaibe, André P S; Ben-Ameur, Randa; Coussot, Gaëlle; Ladner, Yoann; Montels, Jérôme; Ake, Michèle; Perrin, Catherine
2017-08-01
Snake venoms constitute a very promising resource for the development of new medicines. They are mainly composed of very complex peptide and protein mixtures, whose composition may vary significantly from batch to batch. This variability is a challenge for routine quality control (QC) in the pharmaceutical industry. In this paper, we report the use of capillary zone electrophoresis for the development of an analytical fingerprint methodology to assess the quality of snake venoms. The analytical fingerprint concept is widely used for the QC of herbal drugs but has rarely been applied to venom QC so far. CZE was chosen for its intrinsic efficiency in the separation of protein and peptide mixtures. The analytical fingerprint methodology was first developed and evaluated for a particular snake venom, Lachesis muta. Optimal analysis conditions required the use of a PDADMAC capillary coating to avoid protein and peptide adsorption. The same analytical conditions were then applied to other snake venom species. Different electrophoretic profiles were obtained for each venom. Excellent repeatability and intermediate precision were observed for each batch. Analysis of different batches of the same species revealed inherent qualitative and quantitative composition variations of the venoms between individuals. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Results-driven approach to improving quality and productivity
John Dramm
2000-01-01
Quality control (QC) programs do not often realize their full potential. Elaborate and expensive QC programs can easily get sidetracked by the process of building a program with promises of "Someday, this will all pay off." Training employees in QC methods is no guarantee that quality will improve. Several documented cases show that such activity-centered efforts...
Coda Q and its Frequency Dependence in the Eastern Himalayan and Indo-Burman Plate Boundary Systems
NASA Astrophysics Data System (ADS)
Mitra, S.; Kumar, A.
2015-12-01
We use broadband waveform data for 305 local earthquakes from the Eastern Himalayan and Indo-Burman plate boundary systems, to model the seismic attenuation in NE India. We measure the decay in amplitude of coda waves at discrete frequencies (between 1 and 12 Hz) to evaluate the quality factor (Qc) as a function of frequency. We combine these measurements to evaluate the frequency dependence of Qc of the form Qc(f) = Q0 f^η, where Q0 is the quality factor at 1 Hz and η is the frequency dependence. Computed Q0 values range from 80-360 and η ranges from 0.85-1.45. To study the lateral variation in Q0 and η, we regionalise the Qc by combining all source-receiver measurements using a back-projection algorithm. For a single back scatter model, the coda waves sample an elliptical area with the epicenter and receiver at the two foci. We parameterize the region using square grids. The algorithm calculates the overlap in area and distributes Qc in the sampled grids using the average Qc as the boundary value. This is done in an iterative manner, by minimising the misfit between the observed and computed Qc within each grid. This process is repeated for all frequencies, and η is computed for each grid by combining Qc for all frequencies. Our results reveal strong variation in Q0 and η across NE India. The highest Q0 are in the Bengal Basin (210-280) and the Indo-Burman subduction zone (300-360). The Shillong Plateau and Mikir Hills have intermediate Q0 (~160) and the lowest Q0 (~80) is observed in the Naga fold thrust belt. This variation in Q0 demarcates the boundary between the continental crust beneath the Shillong Plateau and Mikir Hills and the transitional to oceanic crust beneath the Bengal Basin and Indo-Burman subduction zone. The thick pile of sedimentary strata in the Naga fold thrust belt results in low Q0. Frequency dependence (η) of Qc across NE India is observed to be very high, with regions of high Q0 being associated with relatively higher η.
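The power-law fit is a one-liner once the relation is moved to log-log space, since Qc(f) = Q0 f^η implies log Qc = log Q0 + η log f. A short sketch with invented Qc measurements:

```python
import numpy as np

# Fit Qc(f) = Q0 * f**eta via linear least squares in log-log space.
freqs = np.array([1.0, 2.0, 4.0, 8.0, 12.0])           # Hz
qc = np.array([160.0, 330.0, 700.0, 1450.0, 2200.0])   # illustrative Qc values

eta, log_q0 = np.polyfit(np.log(freqs), np.log(qc), 1)  # slope, intercept
q0 = np.exp(log_q0)
print(f"Q0 = {q0:.0f}, eta = {eta:.2f}")  # Q0 is the quality factor at 1 Hz
```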
Pichler, Peter; Mazanek, Michael; Dusberger, Frederico; Weilnböck, Lisa; Huber, Christian G; Stingl, Christoph; Luider, Theo M; Straube, Werner L; Köcher, Thomas; Mechtler, Karl
2012-11-02
While the performance of liquid chromatography (LC) and mass spectrometry (MS) instrumentation continues to increase, applications such as analyses of complete or near-complete proteomes and quantitative studies require constant and optimal system performance. For this reason, research laboratories and core facilities alike are recommended to implement quality control (QC) measures as part of their routine workflows. Many laboratories perform sporadic quality control checks. However, successive and systematic longitudinal monitoring of system performance would be facilitated by dedicated automatic or semiautomatic software solutions that aid an effortless analysis and display of QC metrics over time. We present the software package SIMPATIQCO (SIMPle AuTomatIc Quality COntrol) designed for evaluation of data from LTQ Orbitrap, Q-Exactive, LTQ FT, and LTQ instruments. A centralized SIMPATIQCO server can process QC data from multiple instruments. The software calculates QC metrics supervising every step of data acquisition from LC and electrospray to MS. For each QC metric the software learns the range indicating adequate system performance from the uploaded data using robust statistics. Results are stored in a database and can be displayed in a comfortable manner from any computer in the laboratory via a web browser. QC data can be monitored for individual LC runs as well as plotted over time. SIMPATIQCO thus assists the longitudinal monitoring of important QC metrics such as peptide elution times, peak widths, intensities, total ion current (TIC) as well as sensitivity, and overall LC-MS system performance; in this way the software also helps identify potential problems. The SIMPATIQCO software package is available free of charge.
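The idea of learning an adequate-performance band from uploaded data with robust statistics can be sketched briefly; this mimics the concept described for SIMPATIQCO, not its actual code, and the values are invented:

```python
import numpy as np

def robust_range(history, k=3.0):
    """Learn an acceptance range from past QC values using robust statistics
    (median +/- k * scaled MAD), so single outlier runs barely move the limits."""
    x = np.asarray(history, dtype=float)
    med = np.median(x)
    mad = 1.4826 * np.median(np.abs(x - med))  # consistent with sigma for normal data
    return med - k * mad, med + k * mad

# e.g., monitored peptide elution time (min) over recent LC-MS runs
history = [42.1, 41.8, 42.4, 42.0, 55.0, 42.2, 41.9]   # one outlier run
lo, hi = robust_range(history)
print(f"adequate-performance band: {lo:.1f}-{hi:.1f} min")
print("latest run OK" if lo <= history[-1] <= hi else "latest run flagged")
```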
Individualized Quality Control Plan (IQCP): Is It Value-Added for Clinical Microbiology?
Sharp, Susan E; Miller, Melissa B; Hindler, Janet
2015-12-01
The Centers for Medicare and Medicaid Services (CMS) recently published their Individualized Quality Control Plan (IQCP [https://www.cms.gov/regulations-and-guidance/legislation/CLIA/Individualized_Quality_Control_Plan_IQCP.html]), which will be the only option for quality control (QC) starting in January 2016 if laboratories choose not to perform Clinical Laboratory Improvement Act (CLIA) [U.S. Statutes at Large 81(1967):533] default QC. Laboratories will no longer be able to use "equivalent QC" (EQC) or the Clinical and Laboratory Standards Institute (CLSI) standards alone for quality control of their microbiology systems. The implementation of IQCP in clinical microbiology laboratories will most certainly be an added burden, the benefits of which are currently unknown. Copyright © 2015, American Society for Microbiology. All Rights Reserved.
Quality control and assurance for validation of DOS/I measurements
NASA Astrophysics Data System (ADS)
Cerussi, Albert; Durkin, Amanda; Kwong, Richard; Quang, Timothy; Hill, Brian; Tromberg, Bruce J.; MacKinnon, Nick; Mantulin, William W.
2010-02-01
Ongoing multi-center clinical trials are crucial for Biophotonics to gain acceptance in medical imaging. In these trials, quality control (QC) and assurance (QA) are key to success and provide "data insurance". Quality control and assurance deal with standardization, validation, and compliance of procedures, materials and instrumentation. Specifically, QC/QA involves systematic assessment of testing materials, instrumentation performance, standard operating procedures, data logging, analysis, and reporting. QC and QA are important for FDA accreditation and acceptance by the clinical community. Our Biophotonics research in the Network for Translational Research in Optical Imaging (NTROI) program for breast cancer characterization focuses on QA/QC issues primarily related to the broadband Diffuse Optical Spectroscopy and Imaging (DOS/I) instrumentation, because this is an emerging technology with limited standardized QC/QA in place. In the multi-center trial environment, we implement the following QA/QC procedures: (1) standardize and validate calibration standards and procedures (DOS/I technology requires both frequency-domain and spectral calibration procedures, using tissue-simulating phantoms and reflectance standards, respectively); (2) standardize and validate data acquisition, processing and visualization (optimize instrument software-EZDOS; centralize data processing); (3) monitor, catalog and maintain instrument performance (document performance; modularize maintenance; integrate new technology); (4) standardize and coordinate trial data entry (from individual sites) into a centralized database; (5) monitor, audit and communicate all research procedures (database, teleconferences, training sessions) between participants, ensuring "calibration". This manuscript describes our ongoing efforts, successes and challenges implementing these strategies.
WE-AB-206-02: ACR Ultrasound Accreditation: Requirements and Pitfalls
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walter, J.
The involvement of medical physicists in diagnostic ultrasound imaging service is increasing due to QC and accreditation requirements. The goal of this ultrasound hands-on workshop is to demonstrate quality control (QC) testing in diagnostic ultrasound and to provide updates in ACR ultrasound accreditation requirements. The first half of this workshop will include two presentations reviewing diagnostic ultrasound QA/QC and ACR ultrasound accreditation requirements. The second half of the workshop will include live demonstrations of basic QC tests. An array of ultrasound testing phantoms and ultrasound scanners will be available for attendees to learn diagnostic ultrasound QC in a hands-on environment with live demonstrations and on-site instructors. The targeted attendees are medical physicists in diagnostic imaging. Learning Objectives: (1) Gain familiarity with common elements of a QA/QC program for diagnostic ultrasound imaging; (2) Identify QC tools available for testing diagnostic ultrasound systems and learn how to use these tools; (3) Learn ACR ultrasound accreditation requirements. Jennifer Walter is an employee of American College of Radiology on Ultrasound Accreditation.
Operational quality control of daily precipitation using spatio-climatological consistency testing
NASA Astrophysics Data System (ADS)
Scherrer, S. C.; Croci-Maspoli, M.; van Geijtenbeek, D.; Naguel, C.; Appenzeller, C.
2010-09-01
Quality control (QC) of meteorological data is of utmost importance for climate-related decisions. The search for an effective automated QC of precipitation data has proven difficult, and many weather services, including MeteoSwiss, still rely mainly on manual inspection of daily precipitation data. However, manpower limitations force many weather services to move towards less labour-intensive and more automated QC, with the challenge of keeping data quality high. In the last decade, several approaches have been presented to objectify daily precipitation QC. Here we present a spatio-climatological approach that will be implemented operationally at MeteoSwiss. It combines the information from the event-based spatial distribution of each day's precipitation field and the historical information of the interpolation error, using different precipitation intensity intervals. Expert judgement shows that the system is able to detect potential outliers very well (hardly any missed errors) without creating too many false alarms that need human inspection. 50-80% of all flagged values have been classified as real errors by the data editor. This is much better than the roughly 15-20% using standard spatial regression tests. Very helpful in the QC process is the automatic redistribution of accumulated multi-day sums. Manual inspection in operations can be reduced and the QC of precipitation substantially objectified.
Quantum cascade transmitters for ultrasensitive chemical agent and explosives detection
NASA Astrophysics Data System (ADS)
Schultz, John F.; Taubman, Matthew S.; Harper, Warren W.; Williams, Richard M.; Myers, Tanya L.; Cannon, Bret D.; Sheen, David M.; Anheier, Norman C., Jr.; Allen, Paul J.; Sundaram, S. K.; Johnson, Bradley R.; Aker, Pamela M.; Wu, Ming C.; Lau, Erwin K.
2003-07-01
The small size, high power, promise of access to any wavelength between 3.5 and 16 microns, substantial tuning range about a chosen center wavelength, and general robustness of quantum cascade (QC) lasers provide opportunities for new approaches to ultra-sensitive chemical detection and other applications in the mid-wave infrared. PNNL is developing novel remote and sampling chemical sensing systems based on QC lasers, using QC lasers loaned by Lucent Technologies. In recent months laboratory cavity-enhanced sensing experiments have achieved absorption sensitivities of 8.5 × 10^-11 cm^-1 Hz^-1/2, and the PNNL team has begun monostatic and bi-static frequency modulated, differential absorption lidar (FM DIAL) experiments at ranges of up to 2.5 kilometers. In related work, PNNL and UCLA are developing miniature QC laser transmitters with the multiplexed tunable wavelengths, frequency and amplitude stability, modulation characteristics, and power levels needed for chemical sensing and other applications. Current miniaturization concepts envision coupling QC oscillators, QC amplifiers, frequency references, and detectors with miniature waveguides and waveguide-based modulators, isolators, and other devices formed from chalcogenide or other types of glass. Significant progress has been made on QC laser stabilization and amplification, and on development and characterization of high-purity chalcogenide glasses, waveguide writing techniques, and waveguide metrology.
The Quality Control Circle: Is It for Education?
ERIC Educational Resources Information Center
Land, Arthur J.
From its start in Japan after World War II, the Quality Control Circle (Q.C.) approach to management and organizational operation evolved into what it is today: people doing similar work meeting regularly to identify, objectively analyze, and develop solutions to problems. The Q.C. approach meets Maslow's theory of motivation by inviting…
qcML: an exchange format for quality control metrics from mass spectrometry experiments.
Walzer, Mathias; Pernas, Lucia Espona; Nasso, Sara; Bittremieux, Wout; Nahnsen, Sven; Kelchtermans, Pieter; Pichler, Peter; van den Toorn, Henk W P; Staes, An; Vandenbussche, Jonathan; Mazanek, Michael; Taus, Thomas; Scheltema, Richard A; Kelstrup, Christian D; Gatto, Laurent; van Breukelen, Bas; Aiche, Stephan; Valkenborg, Dirk; Laukens, Kris; Lilley, Kathryn S; Olsen, Jesper V; Heck, Albert J R; Mechtler, Karl; Aebersold, Ruedi; Gevaert, Kris; Vizcaíno, Juan Antonio; Hermjakob, Henning; Kohlbacher, Oliver; Martens, Lennart
2014-08-01
Quality control is increasingly recognized as a crucial aspect of mass spectrometry based proteomics. Several recent papers discuss relevant parameters for quality control and present applications to extract these from the instrumental raw data. What has been missing, however, is a standard data exchange format for reporting these performance metrics. We therefore developed the qcML format, an XML-based standard that follows the design principles of the related mzML, mzIdentML, mzQuantML, and TraML standards from the HUPO-PSI (Proteomics Standards Initiative). In addition to the XML format, we also provide tools for the calculation of a wide range of quality metrics as well as a database format and interconversion tools, so that existing LIMS systems can easily add relational storage of the quality control data to their existing schema. We here describe the qcML specification, along with possible use cases and an illustrative example of the subsequent analysis possibilities. All information about qcML is available at http://code.google.com/p/qcml. © 2014 by The American Society for Biochemistry and Molecular Biology, Inc.
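A minimal illustration of emitting a qcML-like report with Python's ElementTree; the element names, attributes and version string below are simplified placeholders rather than the normative qcML schema, which also uses controlled-vocabulary accessions:

```python
import xml.etree.ElementTree as ET

# One run's QC metrics (invented values) rendered as qcML-like XML.
run = {"name": "run_01", "metrics": {"MS2 spectra": 48213, "Peptides identified": 9120}}

root = ET.Element("qcML", version="0.0.8")  # version string is illustrative
run_el = ET.SubElement(root, "runQuality", ID=run["name"])
for name, value in run["metrics"].items():
    ET.SubElement(run_el, "qualityParameter", name=name, value=str(value))

ET.indent(root)  # pretty-print; requires Python >= 3.9
print(ET.tostring(root, encoding="unicode"))
```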
Srivastava, Praveen; Moorthy, Ganesh S.; Gross, Robert; Barrett, Jeffrey S.
2013-01-01
A selective and highly sensitive method for the determination of the non-nucleoside reverse transcriptase inhibitor (NNRTI) efavirenz in human plasma has been developed and fully validated based on high performance liquid chromatography tandem mass spectrometry (LC–MS/MS). Sample preparation involved protein precipitation followed by one-to-one dilution with water. The analyte, efavirenz, was separated by high performance liquid chromatography and detected with tandem mass spectrometry in negative ionization mode with multiple reaction monitoring. Efavirenz and 13C6-efavirenz (internal standard) were detected via the MRM transitions m/z 314.20→243.90 and m/z 320.20→249.90, respectively. A gradient program was used to elute the analytes using 0.1% formic acid in water and 0.1% formic acid in acetonitrile as mobile phase solvents, at a flow rate of 0.3 mL/min. The total run time was 5 min and the retention times for the internal standard (13C6-efavirenz) and efavirenz were approximately 2.6 min. The calibration curves showed linearity (coefficient of regression, r>0.99) over the concentration range of 1.0–2,500 ng/mL. The intraday precision, based on the standard deviation of replicates, was 9.24% at the lower limit of quantification (LLOQ) and ranged from 2.41% to 6.42% for quality control (QC) samples, with accuracy of 112% for the LLOQ and 100–111% for QC samples. The interday precision was 12.3% for the LLOQ and 3.03–9.18% for QC samples, and the accuracy was 108% for the LLOQ and 95.2–108% for QC samples. Stability studies showed that efavirenz was stable under the expected conditions for sample preparation and storage. The lower limit of quantification for efavirenz was 1 ng/mL. The analytical method showed excellent sensitivity, precision, and accuracy. This method is robust and is being successfully applied for therapeutic drug monitoring and pharmacokinetic studies in HIV-infected patients.
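The reported precision and accuracy figures follow from standard formulas: precision as the %CV of replicates, accuracy as the replicate mean expressed as a percentage of the nominal concentration. A short sketch with invented replicate data at the LLOQ and three QC levels:

```python
import numpy as np

def precision_accuracy(replicates, nominal):
    """Within-run precision (%CV) and accuracy (% of nominal) for one QC level,
    as typically computed in bioanalytical method validation."""
    x = np.asarray(replicates, dtype=float)
    cv = 100.0 * x.std(ddof=1) / x.mean()
    accuracy = 100.0 * x.mean() / nominal
    return cv, accuracy

# Illustrative intraday replicates (ng/mL) at LLOQ and three QC levels
qc_levels = {1.0: [1.08, 1.15, 1.02, 1.21, 1.11],      # LLOQ
             3.0: [3.05, 2.96, 3.12, 3.01, 2.98],      # low QC
             250.0: [249, 255, 246, 252, 259],         # mid QC
             2000.0: [2050, 1990, 2080, 2010, 1975]}   # high QC

for nominal, reps in qc_levels.items():
    cv, acc = precision_accuracy(reps, nominal)
    print(f"nominal {nominal:>7} ng/mL: CV = {cv:4.1f}%, accuracy = {acc:5.1f}%")
```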
Quality Control of Meteorological Observations
NASA Technical Reports Server (NTRS)
Collins, William; Dee, Dick; Rukhovets, Leonid
1999-01-01
The problem of meteorological observation quality control (QC) was first formulated by L.S. Gandin at the Main Geophysical Observatory in the 1970s. Later, in 1988, L.S. Gandin began adapting his ideas in complex quality control (CQC) to the operational environment at the National Centers for Environmental Prediction. The CQC was first applied by L.S. Gandin and his colleagues to the detection and correction of errors in rawinsonde heights and temperatures using a complex of hydrostatic residuals. Later, a full complex of residuals, vertical and horizontal optimal interpolations, and baseline checks were added for the checking and correction of a wide range of meteorological variables. Some of Gandin's other ideas were applied and substantially developed at other meteorological centers. A new statistical QC was recently implemented in the Goddard Data Assimilation System. The central component of any quality control is a buddy check, which is a test of individual suspect observations against available nearby non-suspect observations. A novel feature of this test is that the error variances used for the QC decision are re-estimated on-line. As a result, the allowed tolerances for suspect observations can depend on local atmospheric conditions. The system is then better able to accept extreme values observed in deep cyclones, jet streams, and so on. The basic statements of this adaptive buddy check are described. Some results of the on-line QC, including moisture QC, are presented.
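The adaptive buddy check lends itself to a compact sketch: the tolerance applied to a suspect observation scales with an error variance re-estimated from nearby observations, so extremes in disturbed weather are not rejected outright. A toy illustration of the idea, not the operational implementation:

```python
import numpy as np

def buddy_check(suspect, neighbors, base_sigma, k=4.0):
    """Compare a suspect observation with the mean of nearby non-suspect
    observations; the decision variance is re-estimated from the local
    spread, so tolerances widen in disturbed conditions."""
    nb = np.asarray(neighbors, dtype=float)
    local_sigma = max(base_sigma, nb.std(ddof=1))  # on-line variance update
    return abs(suspect - nb.mean()) <= k * local_sigma

calm  = [1012.0, 1011.5, 1012.3, 1011.8]  # surface pressure buddies (hPa)
storm = [982.0, 975.0, 990.0, 968.0]

print(buddy_check(1013.0, calm, base_sigma=1.0))  # accepted
print(buddy_check(1030.0, calm, base_sigma=1.0))  # rejected: outside tolerance
print(buddy_check(955.0, storm, base_sigma=1.0))  # accepted: tolerance adapted
```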
Proteomics Quality Control: Quality Control Software for MaxQuant Results.
Bielow, Chris; Mastrobuoni, Guido; Kempa, Stefan
2016-03-04
Mass spectrometry-based proteomics coupled to liquid chromatography has matured into an automated, high-throughput technology, producing data on the scale of multiple gigabytes per instrument per day. Consequently, an automated quality control (QC) and quality analysis (QA) capable of detecting measurement bias, verifying consistency, and avoiding propagation of error is paramount for instrument operators and scientists in charge of downstream analysis. We have developed an R-based QC pipeline called Proteomics Quality Control (PTXQC) for bottom-up LC-MS data generated by the MaxQuant software pipeline. PTXQC creates a QC report containing a comprehensive and powerful set of QC metrics, augmented with automated scoring functions. The automated scores are collated to create an overview heatmap at the beginning of the report, giving valuable guidance also to nonspecialists. Our software supports a wide range of experimental designs, including stable isotope labeling by amino acids in cell culture (SILAC), tandem mass tags (TMT), and label-free data. Furthermore, we introduce new metrics to score MaxQuant's Match-between-runs (MBR) functionality by which peptide identifications can be transferred across Raw files based on accurate retention time and m/z. Last but not least, PTXQC is easy to install and use and represents the first QC software capable of processing MaxQuant result tables. PTXQC is freely available at https://github.com/cbielow/PTXQC .
Development and validation of an HPLC–MS/MS method to determine clopidogrel in human plasma
Liu, Gangyi; Dong, Chunxia; Shen, Weiwei; Lu, Xiaopei; Zhang, Mengqi; Gui, Yuzhou; Zhou, Qinyi; Yu, Chen
2015-01-01
A quantitative method for clopidogrel using online-SPE tandem LC–MS/MS was developed and fully validated according to the well-established FDA guidelines. The method achieves adequate sensitivity for pharmacokinetic studies, with lower limits of quantification (LLOQs) as low as 10 pg/mL. Chromatographic separations were performed on reversed-phase Kromasil Eternity-2.5-C18-UHPLC columns for both methods. Positive electrospray ionization in multiple reaction monitoring (MRM) mode was employed for signal detection, and a deuterated analogue (clopidogrel-d4) was used as internal standard (IS). Adjustments in sample preparation, including introduction of an online-SPE system, proved to be the most effective way to solve analyte back-conversion in clinical samples. Pooled clinical samples (two levels) were prepared and successfully used as real-sample quality control (QC) in the validation of back-conversion testing under different conditions. The results showed that the real samples were stable at room temperature for 24 h. Linearity, precision, extraction recovery, matrix effect on spiked QC samples, and stability tests on both spiked QCs and real-sample QCs stored under different conditions met the acceptance criteria. This online-SPE method was successfully applied to a bioequivalence study of 75 mg single-dose clopidogrel tablets in 48 healthy male subjects.
Jaccoulet, E; Schweitzer-Chaput, A; Toussaint, B; Prognon, P; Caudron, E
2018-09-01
Compounding of monoclonal antibodies (mAbs) is constantly increasing in hospitals. Quality control (QC) of the compounded mAbs, based on quantification and identification, is required to prevent potential errors, and a fast method is needed to manage outpatient chemotherapy administration. A simple and ultra-fast (less than 30 s) method using flow injection analysis associated with a least-squares matching method provided by the analyzer software was performed and evaluated for the routine hospital QC of three compounded mAbs: bevacizumab, infliximab and rituximab. The method was evaluated through qualitative and quantitative parameters. Preliminary analysis of the UV absorption and second-derivative spectra of the mAbs allowed us to adapt analytical conditions according to the therapeutic range of the mAbs. In terms of quantitative QC, linearity, accuracy and precision were assessed as specified in ICH guidelines. Very satisfactory recovery was achieved and the RSD (%) of the intermediate precision was less than 1.1%. Qualitative analytical parameters were also evaluated in terms of specificity, sensitivity and global precision through a confusion matrix. Results proved to be concentration- and mAb-dependent, and excellent (100%) specificity and sensitivity were reached within a specific concentration range. Finally, routine application to "real life" samples (n = 209) from different batches of the three mAbs complied with the specifications of the quality control, i.e. excellent identification (100%) and concentrations within ±15% of target across the calibration range. The successful use of the combination of second-derivative spectroscopy and partial least-squares matching demonstrated the interest of FIA for the ultra-fast QC of mAbs after compounding. Copyright © 2018 Elsevier B.V. All rights reserved.
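Identification by least-squares matching can be sketched compactly: the sample spectrum is assigned to the stored reference with the smallest residual sum of squares. The spectra below are synthetic stand-ins, not real mAb spectra:

```python
import numpy as np

wavelengths = np.linspace(240, 320, 81)  # nm

def gauss(center, width):  # toy spectral shape for illustration
    return np.exp(-((wavelengths - center) / width) ** 2)

references = {"bevacizumab": gauss(278, 12),
              "infliximab":  gauss(281, 10),
              "rituximab":   gauss(275, 14)}

# A noisy 'measured' spectrum that should match rituximab.
sample = gauss(275, 14) + np.random.default_rng(1).normal(0, 0.01, wavelengths.size)

scores = {name: float(np.sum((sample - ref) ** 2))  # residual sum of squares
          for name, ref in references.items()}
best = min(scores, key=scores.get)
print(f"identified as {best} (RSS = {scores[best]:.4f})")
```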
Swaminathan, Shanker; Huentelman, Matthew J; Corneveaux, Jason J; Myers, Amanda J; Faber, Kelley M; Foroud, Tatiana; Mayeux, Richard; Shen, Li; Kim, Sungeun; Turk, Mari; Hardy, John; Reiman, Eric M; Saykin, Andrew J
2012-01-01
Copy number variations (CNVs) are genomic regions that have added (duplications) or deleted (deletions) genetic material. They may overlap genes affecting their function and have been shown to be associated with disease. We previously investigated the role of CNVs in late-onset Alzheimer's disease (AD) and mild cognitive impairment using Alzheimer's Disease Neuroimaging Initiative (ADNI) and National Institute of Aging-Late Onset AD/National Cell Repository for AD (NIA-LOAD/NCRAD) Family Study participants, and identified a number of genes overlapped by CNV calls. To confirm the findings and identify other potential candidate regions, we analyzed array data from a unique cohort of 1617 Caucasian participants (1022 AD cases and 595 controls) who were clinically characterized and whose diagnosis was neuropathologically verified. All DNA samples were extracted from brain tissue. CNV calls were generated and subjected to quality control (QC). 728 cases and 438 controls who passed all QC measures were included in case/control association analyses including candidate gene and genome-wide approaches. Rates of deletions and duplications did not significantly differ between cases and controls. Case-control association identified a number of previously reported regions (CHRFAM7A, RELN and DOPEY2) as well as a new gene (HLA-DRA). Meta-analysis of CHRFAM7A indicated a significant association of the gene with AD and/or MCI risk (P = 0.006, odds ratio = 3.986 (95% confidence interval 1.490-10.667)). A novel APP gene duplication was observed in one case sample. Further investigation of the identified genes in independent and larger samples is warranted.
Thaitrong, Numrin; Kim, Hanyoup; Renzi, Ronald F; Bartsch, Michael S; Meagher, Robert J; Patel, Kamlesh D
2012-12-01
We have developed an automated quality control (QC) platform for next-generation sequencing (NGS) library characterization by integrating a droplet-based digital microfluidic (DMF) system with a capillary-based reagent delivery unit and a quantitative CE module. Using an in-plane capillary-DMF interface, a prepared sample droplet was actuated into position between the ground electrode and the inlet of the separation capillary to complete the circuit for an electrokinetic injection. Using a DNA ladder as an internal standard, the CE module with a compact LIF detector was capable of detecting dsDNA in the range of 5-100 pg/μL, suitable for the amount of DNA required by the Illumina Genome Analyzer sequencing platform. This DMF-CE platform consumes tenfold less sample volume than the current Agilent BioAnalyzer QC technique, preserving precious sample while providing necessary sensitivity and accuracy for optimal sequencing performance. The ability of this microfluidic system to validate NGS library preparation was demonstrated by examining the effects of limited-cycle PCR amplification on the size distribution and the yield of Illumina-compatible libraries, demonstrating that as few as ten cycles of PCR bias the size distribution of the library toward undesirable larger fragments. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Khanna, Niharika; Shaya, Fadia T; Chirikov, Viktor V; Sharp, David; Steffen, Ben
2016-01-01
We present data on quality of care (QC) improvement in 35 of 45 National Quality Forum metrics reported annually by 52 primary care practices recognized as patient-centered medical homes (PCMHs) that participated in the Maryland Multi-Payor Program from 2011 to 2013. We assigned QC metrics to (1) chronic, (2) preventive, and (3) mental health care domains. The study used a panel data design with no control group. Using longitudinal fixed-effects regressions, we modeled QC and case mix severity in a PCMH. Overall, 35 of 45 quality metrics reported by 52 PCMHs demonstrated improvement over 3 years, and case mix severity did not affect the achievement of quality improvement. From 2011 to 2012, QC increased by 0.14 (P < .01) for chronic, 0.15 (P < .01) for preventive, and 0.34 (P < .01) for mental health care domains; from 2012 to 2013 these domains increased by 0.03 (P = .06), 0.04 (P = .05), and 0.07 (P = .12), respectively. In univariate analyses, lower National Commission on Quality Assurance PCMH level was associated with higher QC for the mental health care domain, whereas case mix severity did not correlate with QC. In multivariate analyses, higher QC correlated with larger practices, greater proportion of older patients, and readmission visits. Rural practices had higher proportions of Medicaid patients, lower QC, and higher QC improvement in interaction analyses with time. The gains in QC in the chronic disease domain, the preventive care domain, and, most significantly, the mental health care domain were observed over time regardless of patient case mix severity. QC improvement was generally not modified by practice characteristics, except for rurality. © Copyright 2016 by the American Board of Family Medicine.
Massin, Frédéric; Huili, Cai; Decot, Véronique; Stoltz, Jean-François; Bensoussan, Danièle; Latger-Cannard, Véronique
2015-01-01
Stem cells for autologous and allogenic transplantation are obtained from several sources including bone marrow, peripheral blood or cord blood. Accurate enumeration of viable CD34+ hematopoietic stem cells (HSC) is routinely used in clinical settings, especially to monitor progenitor cell mobilization and apheresis. The number of viable CD34+ HSC has also been shown to be the most critical factor in haematopoietic engraftment. The International Society for Cellular Therapy currently recommends the use of a single-platform flow cytometry system using 7-AAD as a viability dye. To move routine analysis from a BD FACSCalibur™ instrument to a BD FACSCanto™ II, according to ISO 15189 standard guidelines, we defined laboratory performance data of the BD™ Stem Cell Enumeration (SCE) kit on a CE-IVD system including a BD FACSCanto II flow cytometer and the BD FACSCanto™ Clinical Software. InterQC™ software, a real-time internet laboratory QC management system developed by Vitro™ and distributed by Becton Dickinson™, was also tested to monitor daily QC data, to define the internal laboratory statistics and to compare them to external laboratories. Precision was evaluated with BD™ Stem Cell Control (high and low) results and the InterQC software, which draws Levey-Jennings curves and generates numerical statistical parameters allowing detection of potential changes in system performance as well as interlaboratory comparisons. Repeatability, linearity and lower limits of detection were obtained with routine samples from different origins. Agreement between the BD FACSCanto II system and the BD FACSCalibur system was tested on fresh peripheral blood, freeze-thawed apheresis, fresh bone marrow and fresh cord blood samples. Instrument measurement and staining repeatability clearly evidenced acceptable variability on the different samples tested. Intra- and inter-laboratory CVs in CD34+ cell absolute count are consistent and reproducible. Linearity analysis, established between 2 and 329 cells/μl, showed a linear relation between expected counts and measured counts (R2=0.97). Linear regression and Bland-Altman representations showed an excellent correlation on samples from different sources between the two systems and allowed the transfer of routine analysis from BD FACSCalibur to BD FACSCanto II. The BD SCE kit provides an accurate measure of CD34 HSC and can be used in daily routine to optimize the enumeration of hematopoietic CD34+ stem cells by flow cytometry. Moreover, the InterQC system seems to be a very useful tool for laboratory daily quality monitoring and thus for accreditation.
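Levey-Jennings monitoring of daily control results reduces to z-scores against target statistics. A simplified stand-in for what a QC-management tool such as InterQC reports; real systems add multi-rule (Westgard) logic on top of the basic 2 SD / 3 SD limits:

```python
def levey_jennings_flags(values, target_mean, target_sd):
    """Flag daily QC results on a Levey-Jennings basis: warn outside +/-2 SD,
    reject outside +/-3 SD of the established target statistics."""
    flags = []
    for v in values:
        z = (v - target_mean) / target_sd
        flags.append("reject" if abs(z) > 3 else "warn" if abs(z) > 2 else "ok")
    return flags

# invented daily CD34+ counts (cells/uL) for a 'low' stem-cell control
daily = [51.0, 49.5, 45.5, 50.8, 56.9, 50.2]
print(levey_jennings_flags(daily, target_mean=50.0, target_sd=2.0))
# ['ok', 'ok', 'warn', 'ok', 'reject', 'ok']
```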
Xu, Weichen; Jimenez, Rod Brian; Mowery, Rachel; Luo, Haibin; Cao, Mingyan; Agarwal, Nitin; Ramos, Irina; Wang, Xiangyang; Wang, Jihong
2017-10-01
During manufacturing and storage, therapeutic proteins are subject to various post-translational modifications (PTMs), such as isomerization, deamidation, oxidation, disulfide bond modifications and glycosylation. Certain PTMs may affect bioactivity, stability or the pharmacokinetic and pharmacodynamic profile and are therefore classified as potential critical quality attributes (pCQAs). Identifying, monitoring and controlling these PTMs are usually key elements of the Quality by Design (QbD) approach. Traditionally, multiple analytical methods are utilized for these purposes, which is time consuming and costly. In recent years, multi-attribute monitoring methods have been developed in the biopharmaceutical industry. However, these methods combine high-end mass spectrometry with complicated data analysis software, which could pose difficulty when implemented in a quality control (QC) environment. Here we report a multi-attribute method (MAM) using a Quadrupole Dalton (QDa) mass detector to selectively monitor and quantitate PTMs in a therapeutic monoclonal antibody. The result output from the QDa-based MAM is straightforward and automatic. Evaluation results indicate this method provides results comparable to the traditional assays. To ensure future application in the QC environment, this method was qualified according to the International Conference on Harmonization (ICH) guideline and applied in the characterization of drug substance and stability samples. The QDa-based MAM is shown to be an extremely useful tool for product and process characterization studies that facilitates facile understanding of process impact on multiple quality attributes, while being QC friendly and cost-effective.
76 FR 51274 - Supplemental Nutrition Assistance Program: Major System Failures
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-18
... data mining as necessary to determine if losses are occurring in the process of issuing benefits. It is... further by using data mining techniques on States' data or analyzing QC data for error patterns that may... conjunction with an additional sample of cases. Data mining techniques may be employed when QC data cannot...
Haga, Yoshihiro; Chida, Koichi; Inaba, Yohei; Kaga, Yuji; Meguro, Taiichiro; Zuguchi, Masayuki
2016-02-01
As the use of diagnostic X-ray equipment with flat panel detectors (FPDs) has increased, so has the importance of proper management of FPD systems. To ensure quality control (QC) of FPD systems, an easy method for evaluating FPD imaging performance for both stationary and moving objects is required. Until now, simple rotatable QC phantoms have not been available for easy evaluation of the performance (spatial resolution and dynamic range) of FPDs in imaging moving objects. We developed a QC phantom for this purpose. It consists of three thicknesses of copper and a rotatable test pattern of piano wires of various diameters. Initial tests confirmed its stable performance. Our moving phantom is very useful for QC of FPD images of moving objects because it enables easy visual evaluation of imaging performance (spatial resolution and dynamic range).
Quality control and conduct of genome-wide association meta-analyses.
Winkler, Thomas W; Day, Felix R; Croteau-Chonka, Damien C; Wood, Andrew R; Locke, Adam E; Mägi, Reedik; Ferreira, Teresa; Fall, Tove; Graff, Mariaelisa; Justice, Anne E; Luan, Jian'an; Gustafsson, Stefan; Randall, Joshua C; Vedantam, Sailaja; Workalemahu, Tsegaselassie; Kilpeläinen, Tuomas O; Scherag, André; Esko, Tonu; Kutalik, Zoltán; Heid, Iris M; Loos, Ruth J F
2014-05-01
Rigorous organization and quality control (QC) are necessary to facilitate successful genome-wide association meta-analyses (GWAMAs) of statistics aggregated across multiple genome-wide association studies. This protocol provides guidelines for (i) organizational aspects of GWAMAs, and for (ii) QC at the study file level, the meta-level across studies and the meta-analysis output level. Real-world examples highlight issues experienced and solutions developed by the GIANT Consortium that has conducted meta-analyses including data from 125 studies comprising more than 330,000 individuals. We provide a general protocol for conducting GWAMAs and carrying out QC to minimize errors and to guarantee maximum use of the data. We also include details for the use of a powerful and flexible software package called EasyQC. Precise timings will be greatly influenced by consortium size. For consortia of comparable size to the GIANT Consortium, this protocol takes a minimum of about 10 months to complete.
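One representative study-file-level check is recomputing each variant's p-value from its effect size and standard error and flagging discrepancies, which often reveal column mix-ups or transformed effect sizes. A sketch inspired by the kind of checks EasyQC performs, not its actual code, with invented summary statistics:

```python
import numpy as np
from scipy import stats

def p_z_check(beta, se, reported_p, tol=0.05):
    """Recompute p-values from beta/SE (two-sided z-test) and flag rows whose
    reported p-value disagrees beyond tol on the log10 scale."""
    z = np.asarray(beta) / np.asarray(se)
    expected_p = 2 * stats.norm.sf(np.abs(z))
    mismatch = np.abs(np.log10(expected_p) - np.log10(np.asarray(reported_p))) > tol
    return expected_p, mismatch

beta = [0.12, -0.05, 0.30]
se = [0.03, 0.02, 0.10]
reported = [6.3e-5, 0.0124, 0.5]  # last entry deliberately inconsistent
exp_p, bad = p_z_check(beta, se, reported)
print(np.round(exp_p, 5), bad)    # third row flagged
```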
AutoLock: a semiautomated system for radiotherapy treatment plan quality control.
Dewhurst, Joseph M; Lowe, Matthew; Hardy, Mark J; Boylan, Christopher J; Whitehurst, Philip; Rowbottom, Carl G
2015-05-08
A semiautomated system for radiotherapy treatment plan quality control (QC), named AutoLock, is presented. AutoLock is designed to augment treatment plan QC by automatically checking aspects of treatment plans that are well suited to computational evaluation, whilst summarizing more subjective aspects in the form of a checklist. The treatment plan must pass all automated checks and all checklist items must be acknowledged by the planner as correct before the plan is finalized. Thus AutoLock uniquely integrates automated treatment plan QC, an electronic checklist, and plan finalization. In addition to reducing the potential for the propagation of errors, the integration of AutoLock into the plan finalization workflow has improved efficiency at our center. Detailed audit data are presented, demonstrating that the treatment plan QC rejection rate fell by around a third following the clinical introduction of AutoLock.
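The finalization logic integrates naturally as a simple gate: every automated check must pass and every checklist item must be acknowledged before the plan is locked. A minimal sketch with invented check names and plan fields, not AutoLock's actual rules:

```python
# Automated checks map a human-readable name to a predicate over the plan.
AUTOMATED_CHECKS = {
    "prescription dose matches plan dose": lambda p: p["plan_dose"] == p["rx_dose"],
    "couch not in beam path":              lambda p: not p["couch_collision"],
}
# Subjective aspects are summarized as checklist items the planner must confirm.
CHECKLIST = ["Contours reviewed against physician intent",
             "Dose distribution visually inspected"]

def finalize(plan, acknowledged):
    """Return (locked, problems): lock only if all checks pass and all
    checklist items were acknowledged."""
    failures = [name for name, test in AUTOMATED_CHECKS.items() if not test(plan)]
    missing = [item for item in CHECKLIST if item not in acknowledged]
    if failures or missing:
        return False, failures + [f"unacknowledged: {m}" for m in missing]
    return True, []

plan = {"plan_dose": 60.0, "rx_dose": 60.0, "couch_collision": False}
ok, problems = finalize(plan, acknowledged=CHECKLIST[:1])
print(ok, problems)  # blocked: one checklist item not yet acknowledged
```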
ERIC Educational Resources Information Center
Espy, John; And Others
A project was conducted to field test selected first- and second-year courses in a postsecondary nuclear quality assurance/quality control (QA/QC) technician curriculum and to develop the teaching/learning modules for seven technical specialty courses remaining in the QA/QC technician curriculum. The field testing phase of the project involved the…
Preliminary Quality Control System Design for the Pell Grant Program.
ERIC Educational Resources Information Center
Advanced Technology, Inc., Reston, VA.
A preliminary design for a quality control (QC) system for the Pell Grant Program is proposed, based on the needs of the Office of Student Financial Assistance (OSFA). The applicability of the general design for other student aid programs administered by OSFA is also considered. The following steps included in a strategic approach to QC system…
Collier, J W; Shah, R B; Bryant, A R; Habib, M J; Khan, M A; Faustino, P J
2011-02-20
A rapid, selective, and sensitive gradient HPLC method was developed for the analysis of dissolution samples of levothyroxine sodium tablets. Current USP methodology for levothyroxine (L-T4) was not adequate to resolve co-elutants from a variety of levothyroxine drug product formulations. The USP method for analyzing dissolution samples of the drug product has shown significant intra- and inter-day variability. The sources of method variability include chromatographic interferences introduced by the dissolution media and the formulation excipients. In the present work, chromatographic separation of levothyroxine was achieved on an Agilent 1100 Series HPLC with a Waters Nova-pak column (250 mm × 3.9 mm) using a 0.01 M phosphate buffer (pH 3.0)-methanol (55:45, v/v) gradient elution mobile phase at a flow rate of 1.0 mL/min and a UV detection wavelength of 225 nm. The injection volume was 800 μL and the column temperature was maintained at 28 °C. The method was validated according to USP Category I requirements. The validation characteristics included accuracy, precision, specificity, linearity, and analytical range. The standard curve was found to have a linear relationship (r^2 > 0.99) over the analytical range of 0.08-0.8 μg/mL. Accuracy ranged from 90 to 110% for low quality control (QC) standards and 95 to 105% for medium and high QC standards. Precision was <2% at all QC levels. The method was found to be accurate, precise, selective, and linear for L-T4 over the analytical range. The HPLC method was successfully applied to the analysis of dissolution samples of marketed levothyroxine sodium tablets. Published by Elsevier B.V.
Ford, Eric C; Terezakis, Stephanie; Souranis, Annette; Harris, Kendra; Gay, Hiram; Mutic, Sasa
2012-11-01
To quantify the error-detection effectiveness of commonly used quality control (QC) measures. We analyzed incidents from 2007-2010 logged into a voluntary in-house electronic incident learning system at 2 academic radiation oncology clinics. None of the incidents resulted in patient harm. Each incident was graded for potential severity using the French Nuclear Safety Authority scoring scale; high potential severity incidents (score >3) were considered, along with a subset of 30 randomly chosen low severity incidents. Each report was evaluated to identify which of 15 common QC checks could have detected it. The effectiveness was calculated, defined as the percentage of incidents that each QC measure could detect, both for individual QC checks and for combinations of checks. In total, 4407 incidents were reported, 292 of which had high-potential severity. High- and low-severity incidents were detectable by 4.0 ± 2.3 (mean ± SD) and 2.6 ± 1.4 QC checks, respectively (P<.001). All individual checks were less than 50% sensitive with the exception of pretreatment plan review by a physicist (63%). An effectiveness of 97% was achieved with 7 checks used in combination and was not further improved with more checks. The combination of checks with the highest effectiveness includes physics plan review, physician plan review, Electronic Portal Imaging Device-based in vivo portal dosimetry, radiation therapist timeout, weekly physics chart check, the use of checklists, port films, and source-to-skin distance checks. Some commonly used QC checks such as pretreatment intensity modulated radiation therapy QA do not substantially add to the ability to detect errors in these data. The effectiveness of QC measures in radiation oncology depends sensitively on which checks are used and in which combinations. A small percentage of errors cannot be detected by any of the standard formal QC checks currently in broad use, suggesting that further improvements are needed. These data require confirmation with a broader incident-reporting database. Copyright © 2012 Elsevier Inc. All rights reserved.
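The combination analysis amounts to set coverage: an incident counts as detectable by a QC combination if at least one check in the set could have caught it. A toy sketch with invented incident data:

```python
# Each incident lists the checks able to catch it (invented for illustration).
incidents = [
    {"physics plan review", "physician plan review"},
    {"in vivo dosimetry"},
    {"therapist timeout", "physics plan review"},
    set(),                                   # undetectable by any formal check
    {"weekly chart check", "port films"},
]

def effectiveness(check_set):
    """Percentage of incidents detectable by at least one check in the set."""
    covered = sum(1 for inc in incidents if inc & check_set)
    return 100.0 * covered / len(incidents)

combo = {"physics plan review", "in vivo dosimetry", "weekly chart check"}
print(f"{effectiveness(combo):.0f}% of incidents detectable")  # 80% here
```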
Shephard, Mark; Shephard, Anne; McAteer, Bridgit; Regnier, Tamika; Barancek, Kristina
2017-12-01
Diabetes is a major health problem for Australia's Aboriginal and Torres Strait Islander peoples. Point-of-care testing for haemoglobin A1c (HbA1c) has been the cornerstone of a long-standing program (QAAMS) to manage glycaemic control in Indigenous people with diabetes and, recently, to diagnose diabetes. The QAAMS quality management framework includes monthly testing of quality control (QC) and external quality assurance (EQA) samples. Key performance indicators of quality include imprecision (coefficient of variation [CV%]) and percentage of acceptable results. This paper reports on the past 15 years of quality testing in QAAMS and examines the performance of HbA1c POC testing at the 6.5% cut-off recommended for diagnosis. The total number of HbA1c EQA results submitted from 2002 to 2016 was 29,093. The median imprecision for EQA testing by QAAMS device operators averaged 2.81% (SD 0.50; range 2.2 to 3.9%) from 2002 to 2016 and 2.44% (SD 0.22; range 2.2 to 2.9%) from 2009 to 2016. No significant difference was observed between the median imprecision achieved in QAAMS and by Australasian laboratories from 2002 to 2016 (p=0.05; two-tailed paired t-test) and from 2009 to 2016 (p=0.17; two-tailed paired t-test). For QC testing from 2009 to 2016, imprecision averaged 2.5% and 3.0% for the two levels of QC tested. Percentage of acceptable results averaged 90% for EQA testing from 2002 to 2016 and 96% for QC testing from 2009 to 2016. The DCA Vantage was able to measure a patient sample and an EQA sample with an HbA1c value close to 6.5% both accurately and precisely. HbA1c POC testing in QAAMS has remained analytically sound, matched the quality achieved by Australasian laboratories, and met profession-derived analytical goals for 15 years. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
Kadowaki, Hisae; Satrimafitrah, Pasjan; Takami, Yasunari; Nishitoh, Hideki
2018-05-09
The maintenance of endoplasmic reticulum (ER) homeostasis is essential for cell function. ER stress-induced pre-emptive quality control (ERpQC) helps alleviate the burden on a stressed ER by limiting further protein loading. We have previously reported the mechanisms of ERpQC, which include a rerouting step and a degradation step. Under ER stress conditions, Derlin family proteins (Derlins), which are components of ER-associated degradation, reroute specific ER-targeting proteins to the cytosol. Newly synthesized rerouted polypeptides are degraded via the cytosolic chaperone Bag6 and the AAA-ATPase p97 in the ubiquitin-proteasome system. However, the mechanisms by which ER-targeting proteins are rerouted from the ER translocation pathway to the cytosolic degradation pathway and how the E3 ligase ubiquitinates ERpQC substrates remain unclear. Here, we show that ERpQC substrates are captured by the carboxyl-terminal region of Derlin-1 and ubiquitinated by the HRD1 E3 ubiquitin ligase prior to degradation. Moreover, HRD1 forms a large ERpQC-related complex composed of Sec61α and Derlin-1 during ER stress. These findings indicate that the association of the degradation factor HRD1 with the translocon and the rerouting factor Derlin-1 may be necessary for the smooth and effective clearance of ERpQC substrates.
Droplet Digital™ PCR Next-Generation Sequencing Library QC Assay.
Heredia, Nicholas J
2018-01-01
Digital PCR is a valuable tool for quantifying next-generation sequencing (NGS) libraries precisely and accurately. Accurate quantification enables correct loading of libraries onto the sequencer and thus improves sequencing performance by reducing under- and overloading errors. It also enables uniform loading of indexed/barcoded libraries, which in turn greatly improves sequencing uniformity across the indexed/barcoded samples. The advantages gained by employing the Droplet Digital PCR (ddPCR™) library QC assay include precise and accurate quantification as well as size quality assessment, enabling users to QC their sequencing libraries with confidence.
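Since ddPCR reports an absolute concentration in copies per microliter, converting to molarity for sequencer loading is a one-line calculation. The sketch below shows the arithmetic; the concentration and loading target are hypothetical values, not assay specifications.

```python
# Sketch: using a ddPCR library concentration (copies/uL) to compute molarity
# and the dilution needed to reach a target loading concentration.
# All numbers are illustrative, not assay specifications.
AVOGADRO = 6.022e23

def library_molarity_nM(copies_per_uL: float) -> float:
    """Convert absolute copies/uL to nanomolar."""
    copies_per_L = copies_per_uL * 1e6          # 1 L = 1e6 uL
    mol_per_L = copies_per_L / AVOGADRO
    return mol_per_L * 1e9                      # mol/L -> nmol/L

conc = library_molarity_nM(2.4e9)               # e.g. 2.4e9 copies/uL (back-calculated stock)
target_nM = 2.0                                 # hypothetical loading target
print(f"library = {conc:.2f} nM; dilute {conc / target_nM:.1f}-fold to {target_nM} nM")
```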
Develop a Methodology to Evaluate the Effectiveness of QC/QA Specifications (Phase II)
DOT National Transportation Integrated Search
1998-08-01
The Texas Department of Transportation (TxDOT) has been implementing statistically based quality control/quality assurance (QC/QA) specifications for hot mix asphalt concrete pavements since the early 1990s. These specifications have been continuousl...
A Benchmark Study on Error Assessment and Quality Control of CCS Reads Derived from the PacBio RS
Jiao, Xiaoli; Zheng, Xin; Ma, Liang; Kutty, Geetha; Gogineni, Emile; Sun, Qiang; Sherman, Brad T.; Hu, Xiaojun; Jones, Kristine; Raley, Castle; Tran, Bao; Munroe, David J.; Stephens, Robert; Liang, Dun; Imamichi, Tomozumi; Kovacs, Joseph A.; Lempicki, Richard A.; Huang, Da Wei
2013-01-01
PacBio RS, a newly emerging third-generation DNA sequencing platform, is based on a real-time, single-molecule sequencing technology that can generate very long reads (up to 20 kb), in contrast to the shorter reads produced by first- and second-generation sequencing technologies. As a new platform, it is important to assess the sequencing error rate, as well as the quality control (QC) parameters associated with the PacBio sequence data. In this study, a mixture of 10 previously characterized, closely related DNA amplicons was sequenced using the PacBio RS sequencing platform. After aligning Circular Consensus Sequence (CCS) reads derived from the above sequencing experiment to the known reference sequences, we found that the median error rate was 2.5% without read QC, and improved to 1.3% with an SVM-based multi-parameter QC method. In addition, a de novo assembly was used as a downstream application to evaluate the effects of different QC approaches. This benchmark study indicates that even though CCS reads are already error-corrected, it is still necessary to perform appropriate QC on CCS reads in order to produce successful downstream bioinformatics analytical results. PMID:24179701
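The SVM-based QC step described above can be approximated with an off-the-shelf classifier over per-read parameters. The sketch below trains an RBF-kernel SVM on synthetic stand-in features (number of passes, mean quality value, read length); the features, labels, and thresholds are assumptions for illustration, not the study's actual training data or parameter set.

```python
# Sketch: SVM-based multi-parameter QC of CCS reads, in the spirit of the
# paper's approach. Features and labels are synthetic stand-ins.
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)
n = 2000
# hypothetical per-read features: number of passes, mean QV, read length
passes = rng.integers(1, 15, n)
mean_qv = rng.normal(30, 5, n)
length = rng.normal(450, 60, n)
X = np.column_stack([passes, mean_qv, length])
# synthetic label: "good" reads tend to have more passes and higher QV
y = (passes + mean_qv / 3 + rng.normal(0, 2, n)) > 14

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X[:1500], y[:1500])                     # train on the first 1500 reads
keep = clf.predict(X[1500:])                    # filter the held-out reads
print(f"QC filter keeps {keep.mean() * 100:.0f}% of held-out reads")
```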
Hong, Sung Kuk; Choi, Seung Jun; Shin, Saeam; Lee, Wonmok; Pinto, Naina; Shin, Nari; Lee, Kwangjun; Hong, Seong Geun; Kim, Young Ah; Lee, Hyukmin; Kim, Heejung; Song, Wonkeun; Lee, Sun Hwa; Yong, Dongeun; Lee, Kyungwon; Chong, Yunsop
2015-11-01
Quality control (QC) processes are performed in the majority of clinical microbiology laboratories to ensure the performance of microbial identification and antimicrobial susceptibility testing by using ATCC strains. Obtaining these ATCC strains involves practical inconveniences, including the purchase cost of the strains and the shipping time required. This study focused on constructing a database of reference strains for QC processes using domestic bacterial strains, concentrating primarily on antimicrobial susceptibility testing. Three strains (Escherichia coli, Pseudomonas aeruginosa, and Staphylococcus aureus) that showed suitable results in preliminary testing were selected. The minimal inhibitory concentrations (MICs) and zone diameters (ZDs) of eight antimicrobials for each strain were determined according to CLSI M23. All resulting MIC and ZD ranges included at least 95% of the data. The ZD QC ranges obtained by using the CLSI method were less than 12 mm, and the MIC QC ranges extended no more than five dilutions. This study is a preliminary attempt to construct a bank of Korean QC strains. With further studies, a positive outcome in terms of cost and time reduction can be anticipated.
De Clercq, K; Goris, N; Barnett, P V; MacKay, D K
2008-01-01
Over the last decade, international trade in animals and animal products has been liberalized, and confidence in this global trade can increase only if appropriate control measures are applied. As foot-and-mouth disease (FMD) diagnostics will play an essential role in this respect, the Food and Agriculture Organization European Commission for the Control of Foot-and-Mouth Disease (EUFMD) co-ordinates, in collaboration with the European Commission, several programmes to increase the quality of FMD diagnostics. A quality assurance (QA) system is deemed essential for laboratories involved in certifying absence of FMDV or antibodies against the virus. Therefore, laboratories are encouraged to validate their diagnostic tests fully and to install a continuous quality control (QC) monitoring system. Knowledge of the performance characteristics of diagnostics is essential to interpret results correctly and to calculate sampling rates in regional surveillance campaigns. Different aspects of QA/QC of classical and new FMD virological and serological diagnostics are discussed with respect to the EU FMD directive (2003/85/EC). We recommend accepting trade certificates only from laboratories participating in international proficiency testing on a regular basis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Joseph; Pirrung, Meg; McCue, Lee Ann
FQC is software that facilitates large-scale quality control of FASTQ files by carrying out a QC protocol, parsing results, and aggregating quality metrics within and across experiments into an interactive dashboard. The dashboard utilizes human-readable configuration files to manipulate the pages and tabs, and is extensible with CSV data.
ITC Guidelines on Quality Control in Scoring, Test Analysis, and Reporting of Test Scores
ERIC Educational Resources Information Center
Allalouf, Avi
2014-01-01
The Quality Control (QC) Guidelines are intended to increase the efficiency, precision, and accuracy of the scoring, analysis, and reporting process of testing. The QC Guidelines focus on large-scale testing operations where multiple forms of tests are created for use on set dates. However, they may also be used for a wide variety of other testing…
Létourneau, Daniel; Wang, An; Amin, Md Nurul; Pearce, Jim; McNiven, Andrea; Keller, Harald; Norrlinger, Bernhard; Jaffray, David A
2014-12-01
High-quality radiation therapy using highly conformal dose distributions and image-guided techniques requires optimum machine delivery performance. In this work, a monitoring system for multileaf collimator (MLC) performance, integrating semiautomated MLC quality control (QC) tests and statistical process control tools, was developed. The MLC performance monitoring system was used for almost a year on two commercially available MLC models. Control charts were used to establish MLC performance and assess test frequency required to achieve a given level of performance. MLC-related interlocks and servicing events were recorded during the monitoring period and were investigated as indicators of MLC performance variations. The QC test developed as part of the MLC performance monitoring system uses 2D megavoltage images (acquired using an electronic portal imaging device) of 23 fields to determine the location of the leaves with respect to the radiation isocenter. The precision of the MLC performance monitoring QC test and the MLC itself was assessed by detecting the MLC leaf positions on 127 megavoltage images of a static field. After initial calibration, the MLC performance monitoring QC test was performed 3-4 times/week over a period of 10-11 months to monitor positional accuracy of individual leaves for two different MLC models. Analysis of test results was performed using individuals control charts per leaf with control limits computed based on the measurements as well as two sets of specifications of ± 0.5 and ± 1 mm. Out-of-specification and out-of-control leaves were automatically flagged by the monitoring system and reviewed monthly by physicists. MLC-related interlocks reported by the linear accelerator and servicing events were recorded to help identify potential causes of nonrandom MLC leaf positioning variations. The precision of the MLC performance monitoring QC test and the MLC itself was within ± 0.22 mm for most MLC leaves and the majority of the apparent leaf motion was attributed to beam spot displacements between irradiations. The MLC QC test was performed 193 and 162 times over the monitoring period for the studied units and recalibration had to be repeated up to three times on one of these units. For both units, rate of MLC interlocks was moderately associated with MLC servicing events. The strongest association with the MLC performance was observed between the MLC servicing events and the total number of out-of-control leaves. The average elapsed time for which the number of out-of-specification or out-of-control leaves was within a given performance threshold was computed and used to assess adequacy of MLC test frequency. A MLC performance monitoring system has been developed and implemented to acquire high-quality QC data at high frequency. This is enabled by the relatively short acquisition time for the images and automatic image analysis. The monitoring system was also used to record and track the rate of MLC-related interlocks and servicing events. MLC performances for two commercially available MLC models have been assessed and the results support monthly test frequency for widely accepted ± 1 mm specifications. Higher QC test frequency is however required to maintain tighter specification and in-control behavior.
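The individuals control charts used here follow a standard statistical-process-control construction: estimate short-term variation from the mean moving range and set limits at mean ± 2.66 × mean moving range. The sketch below applies this to a simulated leaf-offset series with an artificial drift; the data and baseline window are assumptions, not the paper's measurements.

```python
# Sketch: individuals control chart for one MLC leaf position offset, using
# the standard moving-range estimate of sigma (limits = mean +/- 2.66 * mean MR).
# Measurements are simulated; the paper's data are not reproduced.
import numpy as np

rng = np.random.default_rng(2)
offsets = rng.normal(0.0, 0.08, 160)        # leaf offset per QC test, in mm (synthetic)
offsets[120:] += 0.3                        # hypothetical drift late in the series

mean = offsets[:50].mean()                  # baseline period
mr = np.abs(np.diff(offsets[:50])).mean()   # mean moving range of the baseline
ucl, lcl = mean + 2.66 * mr, mean - 2.66 * mr
spec = 1.0                                  # the widely accepted +/- 1 mm specification

out_of_control = (offsets > ucl) | (offsets < lcl)
out_of_spec = np.abs(offsets) > spec
print(f"control limits: [{lcl:.2f}, {ucl:.2f}] mm")
print(f"{out_of_control.sum()} out-of-control points, {out_of_spec.sum()} out-of-spec")
```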
Development of concrete QC/QA specifications for highway construction in Kentucky.
DOT National Transportation Integrated Search
2001-08-01
There is a growing trend toward quality-based specifications in highway construction. A large number of quality control/quality assurance (QC/QA) specifications shift the responsibility of day-to-day testing from the state DOH to the contractor. This...
Portland cement concrete pavement review of QC/QA data 2000 through 2009.
DOT National Transportation Integrated Search
2011-04-01
This report analyzes the Quality Control/Quality Assurance (QC/QA) data for Portland cement concrete pavement : (PCCP) awarded in the years 2000 through 2009. Analysis of the overall performance of the projects is accomplished by : reviewing the Calc...
Quality Assurance and Control Considerations in Environmental Measurements and Monitoring
NASA Astrophysics Data System (ADS)
Sedlet, Jacob
1982-06-01
Quality assurance and quality control have become accepted as essential parts of all environmental surveillance, measurements, and monitoring programs, both nuclear and non-nuclear. The same principles and details apply to each. It is primarily the final measurement technique that differs. As the desire and need to measure smaller amounts of pollutants with greater accuracy has increased, it has been recognized that quality assurance and control programs are cost-effective in achieving the expected results. Quality assurance (QA) consists of all the actions necessary to provide confidence in the results. Quality control (QC) is a part of QA, and consists of those actions and activities that permit the control of the individual steps in the environmental program. The distinction between the two terms is not always clearly defined, but a sharp division is not necessary. The essential principle of QA and QC is a commitment to high quality results. The essential components of a QA and QC program are a complete, written procedures manual for all parts of the environmental program, the use of standard or validated procedures, participation in applicable interlaboratory comparison or QA programs, replicate analysis and measurement, training of personnel, and a means of auditing or checking that the QA and QC programs are properly conducted. These components are discussed below in some detail.
Mederos, A; Fernández, S; VanLeeuwen, J; Peregrine, A S; Kelton, D; Menzies, P; LeBoeuf, A; Martin, R
2010-06-24
In order to characterize the epidemiology of sheep gastrointestinal nematodes in organic and conventional flocks in Canada, a longitudinal study was carried out from May 2006 to March 2008 on 32 purposively selected farms in Ontario (ON) and Quebec (QC): 8 certified organic (CO), 16 non-certified organic (NCO), and 8 conventional (C) farms. On each farm, 10 ewes and 10 female lambs were selected. Farm visits were undertaken monthly during the grazing season, and twice in the winter. At each visit, individual fecal samples were taken, and pasture samples were obtained during the grazing season. In addition, body condition score was recorded for all sheep. Fecal egg counts per gram of feces (EPGs) were determined for all fecal samples, and infective larvae (L3) were identified in fecal samples (lambs and ewes separately) and pasture samples from farms. Necropsies of 14 lambs from 7 of the 23 Ontario farms were performed at the end of the grazing season in 2006. The mean EPG for year 1 (May 2006 to March 2007) was 181 (range=0-9840) and 351 (range=0-18,940) for the ewes in ON and QC, respectively, and for the lambs was 509 (range=0-25,020) and 147 (range=0-3060) for ON and QC, respectively. During year 2 (April 2007 to March 2008), the mean EPG was 303 (range=0-21,160) and 512 (range=0-22,340) for the ewes in ON and QC, respectively, and for lambs was 460 (range=0-26,180) and 232 (range=0-8280) for ON and QC, respectively. Although the overall mean EPGs were not remarkably high, there were months of higher EPG such as May-June for ewes and July-August for lambs in both provinces. Pasture infectivity was highest in May-June and September. There was a general trend for the CO farms to have lower mean EPG than NCO and C farms. Fecal cultures demonstrated that the most predominant nematode genera were Teladorsagia sp., Haemonchus sp. and Trichostrongylus spp. Pasture infectivity was highest during June-July (984 L3/kg DM) in ON farms and September (mean=436 L3/kg DM) in QC farms during year 1. In year 2, the highest peak was during October in ON (mean=398 L3/kg DM) and July in QC (239 L3/kg DM). Trichostrongylus axei and Trichostrongylus colubriformis were the species most frequently identified from necropsies (36.44% and 38.26%, respectively) at the end of the grazing season in 2006, with Haemonchus contortus and Teladorsagia circumcincta being the next most commonly identified. © 2010 Elsevier B.V. All rights reserved.
Quantum Computing Using Superconducting Qubits
2006-04-01
[Fragmentary report abstract; only portions are recoverable.] The recoverable content describes (i) a trapping potential that can be dynamically modified (pulsated) by controlling the motion of particles; (ii) studies of superconductors with periodic pinning arrays, showing that sample heating by moving vortices produces negative differential resistivity (NDR) of both N- and S-type; and (iii) a scheme [1,3] for efficient QC circuits (i.e., using one two-bit operation to achieve conditional gates) built with modern microfabrication techniques.
Comprehensive Testing Guidelines to Increase Efficiency in INDOT Operations : [Technical Summary
DOT National Transportation Integrated Search
2012-01-01
When the Indiana Department of Transportation designs a pavement project, a decision for QC/QA (Quality Control/Quality Assurance) or non-QC/QA is made solely based on the quantity of pavement materials to be used in the project. Once the pavement...
Hot mix asphalt voids acceptance review of QC/QA data 2000 through 2010.
DOT National Transportation Integrated Search
2011-10-01
This report analyzes the quality control/quality assurance (QC/QA) data for hot mix asphalt (HMA) using : voids acceptance as the testing criteria awarded in the years 2000 through 2010. Analysis of the overall : performance of the projects is accomp...
Non-monotonicity and divergent time scale in Axelrod model dynamics
NASA Astrophysics Data System (ADS)
Vazquez, F.; Redner, S.
2007-04-01
We study the evolution of the Axelrod model for cultural diversity, a prototypical non-equilibrium process that exhibits rich dynamics and a dynamic phase transition between diversity and an inactive state. We consider a simple version of the model in which each individual possesses two features that can assume q possibilities. Within a mean-field description in which each individual has just a few interaction partners, we find a phase transition at a critical value q_c between an active, diverse state for q < q_c and a frozen state. For q ≲ q_c, the density of active links is non-monotonic in time and the asymptotic approach to the steady state is controlled by a time scale that diverges as (q - q_c)^{-1/2}.
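For readers who want to experiment, the sketch below simulates a plain two-feature Axelrod model on a ring and reports the final density of active links (neighbor pairs sharing exactly one feature). This is a minimal lattice version for illustration; it is not the paper's mean-field analysis, and the parameters are arbitrary.

```python
# Sketch: the two-feature Axelrod model on a ring. An agent interacts with a
# random neighbor with probability equal to their cultural overlap (1/2 when
# they share exactly one of two features) and then copies one differing feature.
import numpy as np

def axelrod(N=500, q=3, steps=200_000, seed=3):
    rng = np.random.default_rng(seed)
    s = rng.integers(0, q, (N, 2))                 # F = 2 features per agent
    for _ in range(steps):
        i = rng.integers(N)
        j = (i + 1) % N if rng.random() < 0.5 else (i - 1) % N
        shared = np.sum(s[i] == s[j])
        if shared == 1 and rng.random() < 0.5:     # active link: interact w.p. 1/2
            f = np.flatnonzero(s[i] != s[j])[0]
            s[i, f] = s[j, f]
    # density of active links (neighbors differing in exactly one feature)
    diff = np.sum(s != np.roll(s, -1, axis=0), axis=1)
    return np.mean(diff == 1)

for q in (2, 3, 10):
    print(f"q={q}: active-link density = {axelrod(q=q):.3f}")
```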
Dimech, Wayne; Karakaltsas, Marina; Vincini, Giuseppe A
2018-05-25
A general trend towards conducting infectious disease serology testing in centralized laboratories means that quality control (QC) principles used for clinical chemistry testing are applied to infectious disease testing. However, no systematic assessment of methods used to establish QC limits has been applied to infectious disease serology testing. A total of 103 QC data sets, obtained from six different infectious disease serology analytes, were parsed through standard methods for establishing statistical control limits, including guidelines from Public Health England, the USA Clinical and Laboratory Standards Institute (CLSI), the German Richtlinien der Bundesärztekammer (RiliBÄK) and Australian QConnect. The percentage of QC results failing each method was compared. The percentage of data sets having more than 20% of QC results failing Westgard rules when the first 20 results were used to calculate the mean ± 2 standard deviations (SD) ranged from 3 (2.9%) for R4S to 66 (64.1%) for the 10X rule, whereas the percentage ranged from 0 (0%) for R4S to 32 (40.5%) for 10X when the first 100 results were used to calculate the mean ± 2 SD. By contrast, the percentage of data sets with >20% failing the RiliBÄK control limits was 25 (24.3%). Only two data sets (1.9%) had more than 20% of results outside the QConnect Limits. The failure rate of QCs using QConnect Limits was more appropriate for monitoring infectious disease serology testing than the UK Public Health, CLSI and RiliBÄK limits, as the alternatives to QConnect Limits reported an unacceptably high percentage of failures across the 103 data sets.
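The limit-setting procedure compared above is easy to reproduce in outline: derive mean ± 2 SD from the first n QC results, convert subsequent results to z-scores, and count rule violations. The sketch below does this for simulated QC data using simple interpretations of the 10x rule (ten consecutive results on one side of the mean) and an R4s-style rule (consecutive results spanning more than 4 SD); the data and the exact rule definitions are illustrative assumptions.

```python
# Sketch: mean +/- 2SD control limits from the first n QC results, then simple
# Westgard-style rule checks (10x and an R4s proxy) on simulated QC values.
import numpy as np

rng = np.random.default_rng(4)
qc = rng.normal(1.00, 0.05, 300) * (1 + np.linspace(0, 0.04, 300))  # slight drift

for n in (20, 100):
    m, s = qc[:n].mean(), qc[:n].std(ddof=1)     # limits from the first n results
    z = (qc - m) / s
    fail_10x = sum(np.all(z[i - 9:i + 1] > 0) or np.all(z[i - 9:i + 1] < 0)
                   for i in range(9, len(z)))    # 10 consecutive on one side
    fail_r4s = np.sum(np.abs(np.diff(z)) > 4)    # consecutive pair spanning 4 SD
    print(f"n={n}: {fail_10x} 10x violations, {fail_r4s} R4s violations")
```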
An introduction to statistical process control in research proteomics.
Bramwell, David
2013-12-16
Statistical process control is a well-established and respected method which provides a general-purpose, consistent framework for monitoring and improving the quality of a process. It is routinely used in many industries where the quality of final products is critical and is often required in clinical diagnostic laboratories [1,2]. To date, the methodology has been little utilised in research proteomics. It has been shown to be capable of delivering quantitative QC procedures for qualitative clinical assays [3], making it an ideal methodology to apply to this area of biological research. To introduce statistical process control as an objective strategy for quality control and show how it could be used to benefit proteomics researchers and enhance the quality of the results they generate. We demonstrate that rules which provide basic quality control are easy to derive and implement and could have a major impact on data quality for many studies. Statistical process control is a powerful tool for investigating and improving proteomics research work-flows. The process of characterising measurement systems and defining control rules forces the exploration of key questions that can lead to significant improvements in performance. This work asserts that QC is essential to proteomics discovery experiments. Every experimenter must know the current capabilities of their measurement system and have an objective means for tracking and ensuring that performance. Proteomic analysis work-flows are complicated and multivariate. QC is critical for clinical chemistry measurements, and huge strides have been made in ensuring the quality and validity of results in clinical biochemistry labs. This work introduces some of these QC concepts and works to bridge their use from single-analyte QC to applications in multi-analyte systems. This article is part of a Special Issue entitled: Standardization and Quality Control in Proteomics. Copyright © 2013 The Author. Published by Elsevier B.V. All rights reserved.
Binz, Tina M; Braun, Ueli; Baumgartner, Markus R; Kraemer, Thomas
2016-10-15
Hair cortisol levels are increasingly applied as a measure for stress in humans and mammals. Cortisol is an endogenous compound and is always present within the hair matrix. Therefore, "cortisol-free hair matrix" is a critical point for any analytical method aiming to accurately quantify especially low cortisol levels. The aim of this project was to modify current methods used for hair cortisol analysis to more accurately determine low endogenous cortisol concentrations in hair. For that purpose, ¹³C₃-labeled cortisol, which is not naturally present in hair (above ¹³C natural abundance levels), was used for calibration and comparative validation applying cortisol versus ¹³C₃-labeled cortisol. Cortisol was extracted from 20 mg hair (standard sample amount) applying an optimized single-step extraction protocol. An LC-MS/MS method was developed for the quantitative analysis of cortisol using either cortisol or ¹³C₃-cortisol as calibrators and D7-cortisone as internal standard (IS). The two methods (cortisol/¹³C₃-labeled cortisol) were validated in a concentration range up to 500 pg/mg and showed good linearity for both analytes (cortisol: R² = 0.9995; ¹³C₃-cortisol: R² = 0.9992). Slight differences were observed for the limit of detection (LOD) (0.2 pg/mg vs. 0.1 pg/mg) and the limit of quantification (LOQ) (1 pg/mg vs. 0.5 pg/mg). Precision was good, with a maximum deviation of 8.8% and 10% for cortisol and ¹³C₃-cortisol, respectively. Accuracy and matrix effects were good for both analytes except for the low quality control (QC). The low QC (2.5 pg/mg) showed matrix effects (126.5%, RSD 35.5%) and a deviation in accuracy of 26% when using cortisol to spike. These effects are likely to be caused by the unknown amount of endogenous cortisol in the different hair samples used to determine validation parameters like matrix effect, LOQ and accuracy. No matrix effects were observed for the high QC (400 pg/mg) samples. Recovery was good, with 92.7%/87.3% (RSD 9.9%/6.2%) for the low QC and 102.3%/82.1% (RSD 5.8%/11.4%) for the high QC. After successful validation, the applicability of the method could be proven. The study shows that the method is especially useful for determining low endogenous cortisol concentrations, as they occur in cow hair for example. Copyright © 2016 Elsevier B.V. All rights reserved.
Harada, Sei; Hirayama, Akiyoshi; Chan, Queenie; Kurihara, Ayako; Fukai, Kota; Iida, Miho; Kato, Suzuka; Sugiyama, Daisuke; Kuwabara, Kazuyo; Takeuchi, Ayano; Akiyama, Miki; Okamura, Tomonori; Ebbels, Timothy M D; Elliott, Paul; Tomita, Masaru; Sato, Asako; Suzuki, Chizuru; Sugimoto, Masahiro; Soga, Tomoyoshi; Takebayashi, Toru
2018-01-01
Cohort studies with metabolomics data are becoming more widespread; however, large-scale studies involving tens of thousands of participants are still limited, especially in Asian populations. We therefore started the Tsuruoka Metabolomics Cohort Study, enrolling 11,002 community-dwelling adults in Japan and using capillary electrophoresis-mass spectrometry (CE-MS) and liquid chromatography-mass spectrometry. The CE-MS method is highly amenable to absolute quantification of polar metabolites; however, its reliability for large-scale measurement is unclear. The aim of this study is to examine the reproducibility and validity of large-scale CE-MS measurements. In addition, the study presents absolute concentrations of polar metabolites in human plasma, which can be used in future as reference ranges in a Japanese population. Metabolomic profiling of 8,413 fasting plasma samples was completed using CE-MS, and 94 polar metabolites were structurally identified and quantified. Quality control (QC) samples were injected every ten samples and assessed throughout the analysis. Inter- and intra-batch coefficients of variation of QC and participant samples, and technical intraclass correlation coefficients, were estimated. Passing-Bablok regression of plasma concentrations by CE-MS on serum concentrations by standard clinical chemistry assays was conducted for creatinine and uric acid. In QC samples, the coefficient of variation was less than 20% for 64 metabolites and less than 30% for 80 metabolites out of the 94 metabolites. The inter-batch coefficient of variation was less than 20% for 81 metabolites. The estimated technical intraclass correlation coefficient was above 0.75 for 67 metabolites. The slope of the Passing-Bablok regression was estimated as 0.97 (95% confidence interval: 0.95, 0.98) for creatinine and 0.95 (0.92, 0.96) for uric acid. Compared to published data from other large cohort measurement platforms, the reproducibility of metabolites common to the platforms was similar to or better than in the other studies. These results show that our CE-MS platform is suitable for conducting large-scale epidemiological studies.
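Passing-Bablok regression, used above to compare CE-MS plasma values against clinical chemistry serum values, is a nonparametric fit based on the shifted median of pairwise slopes. The sketch below is a compact implementation on toy data; the data are simulated and no confidence intervals are computed.

```python
# Sketch: Passing-Bablok regression (slope = shifted median of pairwise slopes,
# intercept = median residual). Toy data mimic a slope near 0.96.
import numpy as np

def passing_bablok(x, y):
    x, y = np.asarray(x, float), np.asarray(y, float)
    slopes = []
    for i in range(len(x) - 1):
        dx, dy = x[i + 1:] - x[i], y[i + 1:] - y[i]
        s = dy[dx != 0] / dx[dx != 0]
        slopes.extend(s[s != -1])              # slopes of exactly -1 are excluded
    slopes = np.sort(slopes)
    K = int(np.sum(slopes < -1))               # shift keeping the estimator unbiased
    N = len(slopes)
    if N % 2:
        b = slopes[(N - 1) // 2 + K]
    else:
        b = 0.5 * (slopes[N // 2 - 1 + K] + slopes[N // 2 + K])
    a = np.median(y - b * x)
    return b, a

rng = np.random.default_rng(5)
serum = rng.uniform(40, 120, 60)               # hypothetical clinical-assay values
plasma = 0.96 * serum + rng.normal(0, 2, 60)   # CE-MS-like values
slope, intercept = passing_bablok(serum, plasma)
print(f"slope = {slope:.3f}, intercept = {intercept:.2f}")
```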
Steuer, Andrea E; Forss, Anna-Maria; Dally, Annika M; Kraemer, Thomas
2014-11-01
In the context of driving under the influence of drugs (DUID), not only common drugs of abuse may have an influence, but also medications with similar mechanisms of action. Simultaneous quantification of a variety of drugs and medications relevant in this context allows faster and more effective analyses. Therefore, multi-analyte approaches have gained more and more popularity in recent years. Usually, calibration curves for such procedures contain a mixture of all analytes, which might lead to mutual interferences. In this study, we investigated whether the use of such mixtures leads to reliable results for authentic samples containing only one or two analytes. Five hundred microliters of whole blood were extracted by routine solid-phase extraction (SPE, HCX). Analysis was performed on an AB Sciex 3200 QTrap instrument with ESI+ in scheduled MRM mode. The method was fully validated according to international guidelines, including selectivity, recovery, matrix effects, accuracy and precision, stabilities, and limit of quantification. The selected SPE provided recoveries >60% for all analytes except 6-monoacetylmorphine (MAM), with coefficients of variation (CV) below 15% or 20% for quality controls (QC) LOW and HIGH, respectively. Ion suppression >30% was found for benzoylecgonine, hydrocodone, hydromorphone, MDA, oxycodone, and oxymorphone at QC LOW; however, CVs were always below 10% (n=6 different whole blood samples). Accuracy and precision criteria were fulfilled for all analytes except MAM. Systematic investigation of accuracy determined for QC MED in a multi-analyte mixture compared to samples containing only single analytes revealed no relevant differences for any analyte, indicating that a multi-analyte calibration is suitable for the presented method. Comparison of approximately 60 samples to a former GC-MS method showed good correlation. The newly validated method was successfully applied to more than 1600 routine samples and 3 proficiency tests. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Hot mix asphalt voids acceptance review of QC/QA data 2000 through 2004.
DOT National Transportation Integrated Search
2006-07-01
This report analyzes the Quality Control/Quality Assurance (QC/QA) data for hot mix asphalt using voids acceptance as : the testing criteria for the years 2000 through 2004. Analysis of the overall quality of the HMA is accomplished by : reviewing th...
Adjustment of Pesticide Concentrations for Temporal Changes in Analytical Recovery, 1992-2006
Martin, Jeffrey D.; Stone, Wesley W.; Wydoski, Duane S.; Sandstrom, Mark W.
2009-01-01
Recovery is the proportion of a target analyte that is quantified by an analytical method and is a primary indicator of the analytical bias of a measurement. Recovery is measured by analysis of quality-control (QC) water samples that have known amounts of target analytes added ('spiked' QC samples). For pesticides, recovery is the measured amount of pesticide in the spiked QC sample expressed as percentage of the amount spiked, ideally 100 percent. Temporal changes in recovery have the potential to adversely affect time-trend analysis of pesticide concentrations by introducing trends in environmental concentrations that are caused by trends in performance of the analytical method rather than by trends in pesticide use or other environmental conditions. This report examines temporal changes in the recovery of 44 pesticides and 8 pesticide degradates (hereafter referred to as 'pesticides') that were selected for a national analysis of time trends in pesticide concentrations in streams. Water samples were analyzed for these pesticides from 1992 to 2006 by gas chromatography/mass spectrometry. Recovery was measured by analysis of pesticide-spiked QC water samples. Temporal changes in pesticide recovery were investigated by calculating robust, locally weighted scatterplot smooths (lowess smooths) for the time series of pesticide recoveries in 5,132 laboratory reagent spikes; 1,234 stream-water matrix spikes; and 863 groundwater matrix spikes. A 10-percent smoothing window was selected to show broad, 6- to 12-month time scale changes in recovery for most of the 52 pesticides. Temporal patterns in recovery were similar (in phase) for laboratory reagent spikes and for matrix spikes for most pesticides. In-phase temporal changes among spike types support the hypothesis that temporal change in method performance is the primary cause of temporal change in recovery. Although temporal patterns of recovery were in phase for most pesticides, recovery in matrix spikes was greater than recovery in reagent spikes for nearly every pesticide. Models of recovery based on matrix spikes are deemed more appropriate for adjusting concentrations of pesticides measured in groundwater and stream-water samples than models based on laboratory reagent spikes because (1) matrix spikes are expected to more closely match the matrix of environmental water samples than are reagent spikes and (2) method performance is often matrix dependent, as was shown by higher recovery in matrix spikes for most of the pesticides. Models of recovery, based on lowess smooths of matrix spikes, were developed separately for groundwater and stream-water samples. The models of recovery can be used to adjust concentrations of pesticides measured in groundwater or stream-water samples to 100 percent recovery to compensate for temporal changes in the performance (bias) of the analytical method.
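The adjustment described above can be prototyped directly: smooth the recovery time series with a ~10% lowess window, interpolate the smooth at each environmental sample's date, and scale the measured concentration to 100% recovery. The sketch below uses statsmodels' lowess on simulated spike recoveries; the dates, recoveries, and concentrations are invented for illustration.

```python
# Sketch: smoothing matrix-spike recoveries with a 10% lowess window and
# adjusting measured concentrations to 100% recovery. Spike data are simulated.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
t = np.sort(rng.uniform(1992, 2006, 1234))                # spike dates (decimal years)
recovery = 95 + 8 * np.sin((t - 1992) / 2.2) + rng.normal(0, 10, t.size)

smooth = sm.nonparametric.lowess(recovery, t, frac=0.10)  # 10% smoothing window
# interpolate the smoothed recovery onto the dates of environmental samples
env_t = np.array([1995.5, 2001.2, 2005.8])
env_conc = np.array([0.12, 0.30, 0.08])                   # measured ug/L (synthetic)
model_rec = np.interp(env_t, smooth[:, 0], smooth[:, 1])
adjusted = env_conc * 100.0 / model_rec                   # scale to 100% recovery
print(np.round(adjusted, 3))
```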
User-friendly solutions for microarray quality control and pre-processing on ArrayAnalysis.org
Eijssen, Lars M. T.; Jaillard, Magali; Adriaens, Michiel E.; Gaj, Stan; de Groot, Philip J.; Müller, Michael; Evelo, Chris T.
2013-01-01
Quality control (QC) is crucial for any scientific method producing data. Applying adequate QC introduces new challenges in the genomics field where large amounts of data are produced with complex technologies. For DNA microarrays, specific algorithms for QC and pre-processing including normalization have been developed by the scientific community, especially for expression chips of the Affymetrix platform. Many of these have been implemented in the statistical scripting language R and are available from the Bioconductor repository. However, application is hampered by lack of integrative tools that can be used by users of any experience level. To fill this gap, we developed a freely available tool for QC and pre-processing of Affymetrix gene expression results, extending, integrating and harmonizing functionality of Bioconductor packages. The tool can be easily accessed through a wizard-like web portal at http://www.arrayanalysis.org or downloaded for local use in R. The portal provides extensive documentation, including user guides, interpretation help with real output illustrations and detailed technical documentation. It assists newcomers to the field in performing state-of-the-art QC and pre-processing while offering data analysts an integral open-source package. Providing the scientific community with this easily accessible tool will allow improving data quality and reuse and adoption of standards. PMID:23620278
Lin, Ping-Ping; Chen, Wei-Li; Yuan, Fei; Sheng, Lei; Wu, Yu-Jia; Zhang, Wei-Wei; Li, Guo-Qing; Xu, Hong-Rong; Li, Xue-Ning
2017-12-01
Amyloid beta (Aβ) peptides in cerebrospinal fluid (CSF) are widely measured as diagnostic biomarkers for the identification of Alzheimer's disease (AD). Unfortunately, their pervasive application is hampered by the propensity of Aβ to self-aggregate and to bind nonspecifically to surfaces and matrix proteins, and by a lack of quantitative standardization. Here we report an alternative ultra-high performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) method for the simultaneous measurement of the human amyloid beta peptides Aβ1-38, Aβ1-40 and Aβ1-42 in CSF using micro-elution solid phase extraction (SPE). Samples were pre-processed by mixed-mode micro-elution SPE, and quantification was performed in positive ion multiple reaction monitoring (MRM) mode using electrospray ionization. The stable-isotope-labeled peptides ¹⁵N₅₁-Aβ1-38, ¹⁵N₅₃-Aβ1-40 and ¹⁵N₅₅-Aβ1-42 were used as internal standards, and artificial cerebrospinal fluid (ACSF) containing 5% rat plasma was used as a surrogate matrix for the calibration curves. Quality control (QC) samples at 0.25, 2 and 15 ng/mL were prepared. A linear regression (1/x² weighting), y = ax + b, was used to fit the calibration curves over the concentration range of 0.1-20 ng/mL for all three peptides. Coefficients of variation (CV) of intra-batch and inter-batch assays were all less than 6.44% for Aβ1-38, 6.75% for Aβ1-40 and 10.74% for Aβ1-42. The precision values for all QC samples of the three analytes met the acceptance criteria. Extraction recoveries of Aβ1-38, Aβ1-40 and Aβ1-42 were all greater than 70.78%, in both low and high QC samples. The stability assessments showed that QC samples at both low and high levels were stable for at least 24 h at 4 °C, 4 h at room temperature and through three freeze-thaw cycles without sacrificing accuracy or precision, and no significant carryover effect was observed. This validated UHPLC-MS/MS method was successfully applied to the quantitation of Aβ peptides in real human CSF samples. Our work may provide a reference method for the simultaneous quantitation of human Aβ1-38, Aβ1-40 and Aβ1-42 in CSF. Copyright © 2017 Elsevier B.V. All rights reserved.
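A 1/x²-weighted linear calibration of the kind described above can be written out in a few lines of weighted least squares. The sketch below fits y = ax + b to simulated peak-area ratios over the paper's 0.1-20 ng/mL range and back-calculates the three QC levels; the response slope, noise model, and QC responses are assumptions.

```python
# Sketch: 1/x^2-weighted linear calibration y = a*x + b, then back-calculation
# of QC samples. Peak-area ratios (analyte/IS) are simulated.
import numpy as np

conc = np.array([0.1, 0.25, 0.5, 1, 2, 5, 10, 20])         # ng/mL calibrators
rng = np.random.default_rng(7)
ratio = 0.85 * conc + 0.02 + rng.normal(0, 0.02 * conc)    # simulated responses

w = 1.0 / conc**2                                           # 1/x^2 weighting
W = np.diag(w)
X = np.column_stack([conc, np.ones_like(conc)])
a, b = np.linalg.solve(X.T @ W @ X, X.T @ W @ ratio)        # weighted least squares

qc_nominal = np.array([0.25, 2.0, 15.0])                    # QC levels from the paper
qc_ratio = 0.85 * qc_nominal + 0.02                         # idealized QC responses
back_calc = (qc_ratio - b) / a
accuracy = back_calc / qc_nominal * 100
print(f"a={a:.3f}, b={b:.4f}, QC accuracy = {np.round(accuracy, 1)}%")
```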
Technical Note: Independent component analysis for quality assurance in functional MRI.
Astrakas, Loukas G; Kallistis, Nikolaos S; Kalef-Ezra, John A
2016-02-01
Independent component analysis (ICA) is an established method of analyzing human functional MRI (fMRI) data. Here, an ICA-based fMRI quality control (QC) tool, to be used with a commercial phantom, was developed and applied. In an attempt to assess the performance of the tool relative to preexisting alternative tools, it was used seven weeks before and eight weeks after repair of a faulty gradient amplifier of a non-state-of-the-art MRI unit. More specifically, its performance was compared with the AAPM Report 100 acceptance testing and quality assurance protocol and two fMRI QC protocols, proposed by Friedman et al. ["Report on a multicenter fMRI quality assurance protocol," J. Magn. Reson. Imaging 23, 827-839 (2006)] and Stocker et al. ["Automated quality assurance routines for fMRI data applied to a multicenter study," Hum. Brain Mapp. 25, 237-246 (2005)], respectively. The easily developed and applied ICA-based QC protocol provided fMRI QC indices and maps that were as sensitive to fMRI instabilities as the indices and maps of the other established protocols. The ICA fMRI QC indices were highly correlated with indices of other fMRI QC protocols and in some cases theoretically related to them. Three or four independent components with slowly varying time series are detected under normal conditions. ICA applied to phantom measurements is an easy and efficient tool for fMRI QC. Additionally, it can protect against misinterpretation of artifact components as human brain activations. Evaluating fMRI QC indices in the central region of a phantom is not always the optimal choice.
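The core of such a tool, decomposing phantom time series into independent components and inspecting their time courses, can be sketched with scikit-learn's FastICA. The example below runs ICA on simulated phantom voxel series containing a slow drift and a transient spike, then uses lag-1 autocorrelation to pick out slowly varying components; the data and the autocorrelation heuristic are illustrative assumptions, not the published tool.

```python
# Sketch: ICA on synthetic phantom fMRI time series, recovering a slow drift
# component of the kind the QC tool monitors. Data are simulated, not EPI scans.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(8)
t = np.arange(300)                                    # 300 volumes
drift = 0.002 * t                                     # slow scanner drift
spike = np.zeros_like(t, float)
spike[150] = 1.0                                      # transient instability
voxels = (1000
          + np.outer(drift, rng.uniform(0.5, 1.5, 50))
          + np.outer(spike, rng.uniform(0, 5, 50))
          + rng.normal(0, 1, (300, 50)))              # 50 phantom ROI voxels

ica = FastICA(n_components=4, random_state=0)
sources = ica.fit_transform(voxels - voxels.mean(0))  # component time courses
# a slowly varying component shows up as high lag-1 autocorrelation
ac = [np.corrcoef(s[:-1], s[1:])[0, 1] for s in sources.T]
print("lag-1 autocorrelations:", np.round(ac, 2))
```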
Molecular Characterization of Tick Salivary Gland Glutaminyl Cyclase
Adamson, Steven W.; Browning, Rebecca E.; Chao, Chien-Chung; Bateman, Robert C.; Ching, Wei-Mei; Karim, Shahid
2013-01-01
Glutaminyl cyclase (QC) catalyzes the cyclization of N-terminal glutamine residues into pyroglutamate. This post-translational modification extends the half-life of peptides and, in some cases, is essential in binding to their cognate receptor. Due to its potential role in the post-translational modification of tick neuropeptides, we report the molecular, biochemical and physiological characterization of salivary gland QC during the prolonged blood-feeding of the black-legged tick (Ixodes scapularis) and the gulf-coast tick (Amblyomma maculatum). QC sequences from I. scapularis and A. maculatum showed a high degree of amino acid identity to each other and other arthropods and residues critical for zinc-binding/catalysis (D159, E202, and H330) or intermediate stabilization (E201, W207, D248, D305, F325, and W329) are conserved. Analysis of QC transcriptional gene expression kinetics depicts an upregulation during the blood-meal of adult female ticks prior to fast feeding phases in both I. scapularis and A. maculatum suggesting a functional link with blood meal uptake. QC enzymatic activity was detected in saliva and extracts of tick salivary glands and midguts. Recombinant QC was shown to be catalytically active. Furthermore, knockdown of QC-transcript by RNA interference resulted in lower enzymatic activity, and small, unviable egg masses in both studied tick species as well as lower engorged tick weights for I. scapularis. These results suggest that the post-translational modification of neurotransmitters and other bioactive peptides by QC is critical to oviposition and potentially other physiological processes. Moreover, these data suggest that tick-specific QC-modified neurotransmitters/hormones or other relevant parts of this system could potentially be used as novel physiological targets for tick control. PMID:23770496
Comparison of quality control software tools for diffusion tensor imaging.
Liu, Bilan; Zhu, Tong; Zhong, Jianhui
2015-04-01
Image quality of diffusion tensor imaging (DTI) is critical for image interpretation, diagnostic accuracy and efficiency. However, DTI is susceptible to numerous detrimental artifacts that may impair the reliability and validity of the obtained data. Although many quality control (QC) software tools have been developed and are widely used, each with different tradeoffs, there is still no general agreement on an image quality control routine for DTI, and the practical impact of these tradeoffs is not well studied. An objective comparison that identifies the pros and cons of each of the QC tools will help users make the best choice among tools for specific DTI applications. This study aims to quantitatively compare the effectiveness of three popular QC tools: DTI studio (Johns Hopkins University), DTIprep (University of North Carolina at Chapel Hill, University of Iowa and University of Utah) and TORTOISE (National Institutes of Health). Both synthetic and in vivo human brain data were used to quantify the adverse effects of major DTI artifacts on tensor calculation, as well as the effectiveness of the different QC tools in identifying and correcting these artifacts. The technical basis of each tool was discussed, and the ways in which particular techniques affect the output of each of the tools were analyzed. The different functions and I/O formats that the three QC tools provide for building a general DTI processing pipeline and for integration with other popular image processing tools were also discussed. Copyright © 2015 Elsevier Inc. All rights reserved.
Practical Shipbuilding Standards for Surface Preparation and Coatings
1979-07-01
[Fragmentary excerpt from the standards document; only portions are recoverable.] The recoverable content covers coating requirements (applying over the last coat of epoxy within 48 hours using a strong solvent; a minimum dry film thickness), a section numbered 12.0 on safety and pollution control (12.5: safety solvents), and a list of inspection categories: owner inspection, QA/QC department inspectors, craft inspectors, craft supervision (inspection only), and QA/QC department (audit only).
Lee, Sejoon; Lee, Soohyun; Ouellette, Scott; Park, Woong-Yang; Lee, Eunjung A; Park, Peter J
2017-06-20
In many next-generation sequencing (NGS) studies, multiple samples or data types are profiled for each individual. An important quality control (QC) step in these studies is to ensure that datasets from the same subject are properly paired. Given the heterogeneity of data types, file types and sequencing depths in a multi-dimensional study, a robust program that provides a standardized metric for genotype comparisons would be useful. Here, we describe NGSCheckMate, a user-friendly software package for verifying sample identities from FASTQ, BAM or VCF files. This tool uses a model-based method to compare allele read fractions at known single-nucleotide polymorphisms, considering depth-dependent behavior of similarity metrics for identical and unrelated samples. Our evaluation shows that NGSCheckMate is effective for a variety of data types, including exome sequencing, whole-genome sequencing, RNA-seq, ChIP-seq, targeted sequencing and single-cell whole-genome sequencing, with a minimal requirement for sequencing depth (>0.5X). An alignment-free module can be run directly on FASTQ files for a quick initial check. We recommend using this software as a QC step in NGS studies. https://github.com/parklab/NGSCheckMate. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
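The underlying idea, that two datasets from the same individual show strongly correlated variant allele fractions at common SNPs, is easy to demonstrate. The sketch below simulates read counts at SNP sites for matched and unrelated samples and compares the correlations; the depths, site count, and use of plain Pearson correlation are simplifying assumptions, not NGSCheckMate's depth-dependent model.

```python
# Sketch: genotype-based sample pairing via correlation of variant allele
# fractions (VAFs) at shared SNP sites. Data are simulated.
import numpy as np

rng = np.random.default_rng(9)
n_snps = 2000
genotype = rng.choice([0.0, 0.5, 1.0], n_snps)        # one individual's genotypes

def observed_vaf(geno, depth=30):
    """Simulate per-site VAFs from binomial read sampling at a given depth."""
    alt = rng.binomial(depth, np.clip(geno, 0.01, 0.99))
    return alt / depth

same_a = observed_vaf(genotype)                        # two datasets, same subject
same_b = observed_vaf(genotype)
other = observed_vaf(rng.choice([0.0, 0.5, 1.0], n_snps))  # unrelated subject

r_match = np.corrcoef(same_a, same_b)[0, 1]
r_unrel = np.corrcoef(same_a, other)[0, 1]
print(f"matched pair r = {r_match:.2f}, unrelated pair r = {r_unrel:.2f}")
```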
NASA Astrophysics Data System (ADS)
Ellis, R.; Murphy, J. G.; van Haarlem, R.; Pattey, E.; O'Brien, J.
2009-05-01
A compact, fast-response Quantum Cascade Tunable Infrared Laser Differential Absorption Spectrometer (QC-TILDAS) for measurements of ammonia has been evaluated under both laboratory and field conditions. Absorption of radiation from a pulsed, thermoelectrically cooled QC laser occurs at reduced pressure in a 76 m path length, 0.5 L volume multiple-pass absorption cell. Detection is achieved using a thermoelectrically cooled HgCdTe infrared detector. A novel sampling technique was used, consisting of a short, heated, quartz inlet with a hydrophobic coating to minimize the adsorption of ammonia to surfaces. The inlet contains a critical orifice that reduces the pressure, a virtual impactor for separation of particles and additional ports for delivering ammonia-free background air and calibration gas standards. This instrument has been found to have a detection limit of 0.3 ppb with a time resolution of 1 s. The sampling technique has been compared to the results of a conventional lead salt Tunable Diode Laser (TDL) absorption spectrometer during a laboratory intercomparison. Various lengths and types of sample inlet tubing material, heated and unheated, under dry and ambient humidity conditions with ammonia concentrations ranging from 10-1000 ppb were investigated. Preliminary analysis suggests the time response improves with the use of short PFA tubing sampling lines. No significant improvement was observed when using a heated sampling line, and humidity was seen to play an important role in the bi-exponential decay of ammonia. A field intercomparison of the QC-TILDAS with a modified Thermo 42C chemiluminescence-based analyzer was also performed at Environment Canada's Centre for Atmospheric Research Experiments (CARE) in the rural town of Egbert, ON between May and July 2008. Background tests and calibrations using two different permeation tube sources and an ammonia gas cylinder were regularly carried out throughout the study. Results indicate a very good correlation (r² > 0.9) between the two instruments at the beginning of the study, when regular background subtraction was applied to the QC-TILDAS.
Bosnjak, J; Ciraj-Bjelac, O; Strbac, B
2008-01-01
Application of a quality control (QC) programme is very important when optimisation of image quality and reduction of patient exposure are desired. QC surveys of diagnostic imaging equipment in the Republic of Srpska (an entity of Bosnia and Herzegovina) have been performed systematically since 2001. The presented results relate mostly to QC test results for X-ray tubes and generators of diagnostic radiology units in 92 radiology departments. In addition, the results include workplace monitoring and the usage of personal protective devices for staff and patients. The presented results showed improvements in the implementation of the QC programme within the period 2001-2005. Also, more attention is now given to appropriate maintenance of imaging equipment, which was one of the main problems in the past. Implementation of a QC programme is a continuous and complex process. To achieve good performance of imaging equipment, additional tests are to be introduced, along with image quality assessment and patient dosimetry. Training is very important in order to achieve these goals.
Validation of PCR methods for quantitation of genetically modified plants in food.
Hübner, P; Waiblinger, H U; Pietsch, K; Brodmann, P
2001-01-01
For enforcement of the recently introduced labeling threshold for genetically modified organisms (GMOs) in food ingredients, quantitative detection methods such as quantitative competitive PCR (QC-PCR) and real-time PCR are applied by official food control laboratories. The experiences of 3 European food control laboratories in validating such methods were compared to describe realistic performance characteristics of quantitative PCR detection methods. The limit of quantitation (LOQ) of GMO-specific, real-time PCR was experimentally determined to reach 30-50 target molecules, which is close to theoretical prediction. Starting PCR with 200 ng genomic plant DNA, the LOQ depends primarily on the genome size of the target plant and ranges from 0.02% for rice to 0.7% for wheat. The precision of quantitative PCR detection methods, expressed as relative standard deviation (RSD), varied from 10 to 30%. Using test samples containing Bt176 corn and applying Bt176-specific QC-PCR, mean values deviated from true values by -7 to 18%, with an average of 2 ± 10%. Ruggedness of real-time PCR detection methods was assessed in an interlaboratory study analyzing commercial, homogeneous food samples. Roundup Ready soybean DNA contents were determined in the range of 0.3 to 36%, relative to soybean DNA, with RSDs of about 25%. Taking the precision of quantitative PCR detection methods into account, suitable sample plans and sample sizes for GMO analysis are suggested. Because quantitative GMO detection methods measure GMO contents of samples in relation to reference material (calibrants), high priority must be given to international agreements and standardization on certified reference materials.
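The genome-size dependence of the percentage LOQ follows from simple arithmetic: 200 ng of genomic DNA contains far fewer wheat genome copies than rice genome copies, so the same absolute LOQ in copies translates into a larger percentage. The sketch below reproduces the order of magnitude using approximate 2C genome weights; those weights are assumptions for illustration, not values from the paper.

```python
# Sketch: why the percentage LOQ depends on genome size. With 200 ng of
# genomic DNA and an LOQ of ~40 target copies, the percentage LOQ follows
# from the number of genome copies present. 2C genome weights are approximate.
dna_ng = 200.0
loq_copies = 40.0                              # mid-range of the 30-50 copies reported
genome_pg_2C = {"rice": 1.0, "wheat": 34.0}    # approximate 2C values, in pg

for plant, pg in genome_pg_2C.items():
    copies = dna_ng * 1000.0 / pg              # 1 ng = 1000 pg
    loq_pct = loq_copies / copies * 100
    print(f"{plant}: ~{copies:,.0f} genome copies -> LOQ ~ {loq_pct:.2f}%")
```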
DOE Office of Scientific and Technical Information (OSTI.GOV)
Riihimaki, L.; McFarlane, S.; Sivaraman, C.
The ndrop_mfrsr value-added product (VAP) provides an estimate of the cloud droplet number concentration of overcast water clouds, retrieved from the cloud optical depth measured by the multi-filter rotating shadowband radiometer (MFRSR) and the liquid water path (LWP) retrieved from the microwave radiometer (MWR). When cloud layer information is available from vertically pointing lidar and radars in the Active Remote Sensing of Clouds (ARSCL) product, the VAP also provides estimates of the adiabatic LWP and an adiabatic parameter (beta) that indicates how divergent the LWP is from the adiabatic case. Quality control (QC) flags (qc_drop_number_conc), an uncertainty estimate (drop_number_conc_toterr), and a cloud layer type flag (cloud_base_type) are useful indicators of the quality and accuracy of any given value of the retrieval. Examples of these major input and output variables are given in sample plots in section 6.0.
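A simplified sketch of the adiabatic parameter idea may help: if liquid water content increases roughly linearly with height in an adiabatic cloud, LWP_ad = 0.5 · Γ_ad · H² for cloud depth H, and beta = LWP / LWP_ad. Γ_ad actually varies with temperature and pressure; the constant below is an illustrative mid-range value and is not the VAP's internal parameterization.

```python
# Adiabatic LWP from cloud depth, assuming a constant condensation rate.
def adiabatic_lwp(cloud_depth_m, gamma_ad=2.0e-3):   # g m^-3 per m of depth
    return 0.5 * gamma_ad * cloud_depth_m**2          # g m^-2

def adiabatic_parameter(lwp_measured, cloud_depth_m):
    return lwp_measured / adiabatic_lwp(cloud_depth_m)

# Example: 300 m deep cloud with MWR-retrieved LWP of 60 g m^-2
print(adiabatic_parameter(60.0, 300.0))  # ~0.67, i.e., sub-adiabatic
```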
NASA Astrophysics Data System (ADS)
Sturtevant, C.; Hackley, S.; Lee, R.; Holling, G.; Bonarrigo, S.
2017-12-01
Quality assurance and control (QA/QC) is one of the most important yet challenging aspects of producing research-quality data. Data quality issues are multi-faceted, including sensor malfunctions, unmet theoretical assumptions, and measurement interference from humans or the natural environment. Tower networks such as Ameriflux, ICOS, and NEON continue to grow in size and sophistication, yet tools for robust, efficient, scalable QA/QC have lagged. Quality control remains a largely manual process relying heavily on visual inspection of data. In addition, notes of measurement interference are often recorded on paper without an explicit pathway to data flagging. As such, an increase in network size requires a near-proportional increase in personnel devoted to QA/QC, quickly stressing the human resources available. We present a scalable QA/QC framework in development for NEON that combines the efficiency and standardization of automated checks with the power and flexibility of human review. This framework includes fast-response monitoring of sensor health, a mobile application for electronically recording maintenance activities, traditional point-based automated quality flagging, and continuous monitoring of quality outcomes and longer-term holistic evaluations. This framework maintains the traceability of quality information along the entirety of the data generation pipeline, and explicitly links field reports of measurement interference to quality flagging. Preliminary results show that data quality can be effectively monitored and managed for a multitude of sites with a small group of QA/QC staff. Several components of this framework are open-source, including an R-Shiny application for efficiently monitoring, synthesizing, and investigating data quality issues.
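To make the "point-based automated quality flagging" concrete, here is a minimal sketch of two common checks of that kind, a plausible-range test and a persistence ("stuck sensor") test, each producing a per-point flag that downstream review tools can aggregate. The thresholds and data are illustrative, not NEON's.

```python
import numpy as np

def range_flag(x, lo, hi):
    """Flag values outside a physically plausible range."""
    x = np.asarray(x, dtype=float)
    return (x < lo) | (x > hi)

def persistence_flag(x, window=12, min_std=1e-3):
    """Flag windows where the signal is implausibly constant."""
    x = np.asarray(x, dtype=float)
    flags = np.zeros(x.size, dtype=bool)
    for i in range(x.size - window + 1):
        if np.nanstd(x[i:i + window]) < min_std:
            flags[i:i + window] = True
    return flags

temp_c = np.concatenate([np.random.default_rng(1).normal(20, 2, 50),
                         np.full(20, 21.37)])        # stuck sensor at the end
flags = range_flag(temp_c, -40, 60) | persistence_flag(temp_c)
print(int(flags.sum()), "of", temp_c.size, "points flagged")
```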
Bergallo, M; Costa, C; Tarallo, S; Daniele, R; Merlino, C; Segoloni, G P; Negro Ponzi, A; Cavallo, R
2006-06-01
The human cytomegalovirus (HCMV) is an important pathogen in immunocompromised patients, such as transplant recipients. The use of sensitive and rapid diagnostic assays can have a great impact on antiviral prophylaxis, therapy monitoring, and the diagnosis of active disease. Quantification of HCMV DNA may additionally have prognostic value and guide routine management. The aim of this study was to develop a reliable internally-controlled quantitative-competitive PCR (QC-PCR) assay for the detection and quantification of HCMV DNA viral load in peripheral blood and to compare it with other methods: the HCMV pp65 antigenaemia assay in the leukocyte fraction and the HCMV viraemia assay, both routinely employed in our laboratory, and nucleic acid sequence-based amplification (NASBA) for detection of HCMV pp67-mRNA. Quantitative-competitive PCR is a procedure for nucleic acid quantification based on co-amplification of competitive templates: the target DNA and a competitor functioning as an internal standard. In particular, a standard curve is generated by amplifying 10(2) to 10(5) copies of the target pCMV-435 plasmid with 10(4) copies of the competitor pCMV-C plasmid. Clinical samples derived from 40 kidney transplant patients were tested by spiking 10(4) copies of pCMV-C into the PCR mix as an internal control and comparing the results with the standard curve. Of the 40 patients studied, 39 (97.5%) were positive for HCMV DNA by QC-PCR. While the correlations between the number of pp65-positive cells and the number of HCMV DNA genome copies/mL, and between the former and pp67-mRNA positivity, were statistically significant, there was no significant correlation between the HCMV DNA viral load assayed by QC-PCR and HCMV viraemia. The QC-PCR assay could detect from 10(2) to over 10(7) copies of HCMV DNA, with a range of linearity between 10(2) and 10(5) genomes.
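The quantification logic described above lends itself to a short schematic: a standard curve of log(target/competitor signal ratio) versus log(input target copies) is fit from co-amplifications of 10² to 10⁵ target copies with 10⁴ competitor copies, then inverted for clinical samples spiked with the same competitor amount. The signal ratios below are made-up illustrative numbers, not data from this study.

```python
import numpy as np

std_copies = np.array([1e2, 1e3, 1e4, 1e5])
std_ratio = np.array([0.011, 0.098, 1.05, 9.8])     # target/competitor signal

# Linear fit in log-log space: log10(ratio) = slope*log10(copies) + intercept
slope, intercept = np.polyfit(np.log10(std_copies), np.log10(std_ratio), 1)

def copies_from_ratio(ratio):
    return 10 ** ((np.log10(ratio) - intercept) / slope)

print(round(copies_from_ratio(0.5)))  # sample with ratio 0.5 -> ~5e3 copies
```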
Poe, Amanda; Duong, Ngocvien Thi; Bedi, Kanwar; Kodani, Maja
2018-03-01
Diagnosis of hepatitis C virus (HCV) infection is based on testing for antibodies to HCV (anti-HCV), hepatitis C core antigen (HCV cAg), and HCV RNA. To ensure quality control (QC) and quality assurance (QA), proficiency panels are provided by reference laboratories and various international organizations, requiring costly dry ice shipments to maintain specimen integrity. Alternative methods of specimen preservation and transport can save on shipping and handling and help improve diagnostics by facilitating QA/QC of various laboratories, especially in resource-limited countries. Plasma samples positive for anti-HCV and HCV RNA were either dried using the dried tube specimen (DTS) method or lyophilized, for varying durations and temperatures. Preservation of samples using the DTS method resulted in loss of anti-HCV reactivity for low-positive samples and did not generate enough volume for HCV RNA testing. Lyophilized samples tested positive for anti-HCV even after storage at 4 °C and 25 °C for 12 weeks. Further, HCV RNA was detectable in 5 of 5 (100%) samples over the course of 12 weeks of storage at 4, 25, 37 and 45 °C. In conclusion, lyophilization maintains the integrity of plasma samples for testing for markers of HCV infection and can be a potent mode of sharing proficiency samples without incurring huge shipping costs, while avoiding the challenges of dry ice shipments between donor and recipient laboratories. Copyright © 2017. Published by Elsevier B.V.
Quality Control of Wind Data from 50-MHz Doppler Radar Wind Profiler
NASA Technical Reports Server (NTRS)
Vacek, Austin
2016-01-01
Upper-level wind profiles obtained from a 50-MHz Doppler Radar Wind Profiler (DRWP) instrument at Kennedy Space Center are incorporated in space launch vehicle design and day-of-launch operations to assess wind effects on the vehicle during ascent. Automated and manual quality control (QC) techniques are implemented to remove spurious data in the upper-level wind profiles caused by atmospheric and non-atmospheric artifacts over the 2010-2012 period of record (POR). By combining the new quality-controlled profiles with older profiles from 1997-2009, a robust database of upper-level wind characteristics will be constructed. Statistical analysis will determine the maximum, minimum, and 95th percentile of the wind components from the DRWP profiles over the recent POR and compare them against the older database. Additionally, this study identifies specific QC flags triggered during the QC process to understand how much data is retained and removed from the profiles.
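The planned statistics are straightforward to compute once the QC'ed archive exists; a sketch follows, where `profiles` is a hypothetical (n_profiles, n_levels) array for one wind component, filled here with simulated values.

```python
import numpy as np

rng = np.random.default_rng(2)
profiles = rng.normal(10, 15, size=(500, 40))   # fake u-component archive, m/s

# Per-altitude maximum, minimum, and 95th percentile across the archive.
stats = {
    "max": profiles.max(axis=0),
    "min": profiles.min(axis=0),
    "p95": np.percentile(profiles, 95, axis=0),
}
print({k: v[:3].round(1) for k, v in stats.items()})  # first three levels
```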
NASA Technical Reports Server (NTRS)
Barbre, Robert, Jr.
2015-01-01
Assessment of space vehicle loads and trajectories during design requires a large sample of wind profiles at the altitudes where winds affect the vehicle. Traditionally, this altitude region extends from near 8-14 km to address maximum dynamic pressure upon ascent into space, but some applications require knowledge of measured wind profiles at lower altitudes. Such applications include crew capsule pad abort and plume damage analyses. Two Doppler Radar Wind Profiler (DRWP) systems exist at the United States Air Force (USAF) Eastern Range and at the National Aeronautics and Space Administration's Kennedy Space Center. The 50-MHz DRWP provides wind profiles every 3-5 minutes from roughly 2.5-18.5 km, and five 915-MHz DRWPs provide wind profiles every 15 minutes from approximately 0.2-3.0 km. Archived wind profiles from all systems underwent rigorous quality control (QC) processes, and concurrent measurements from the QC'ed 50- and 915-MHz DRWP archives were spliced into individual profiles that extend from about 0.2-18.5 km. The archive contains combined profiles from April 2000 to December 2009, and thousands of profiles during each month are available for use by the launch vehicle community. This paper presents the details of the QC and splice methodology, as well as some attributes of the archive.
NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR FORM QA/QC CHECKS (UA-C-2.0)
The purpose of this SOP is to outline the process of Field Quality Assurance and Quality Control checks. This procedure was followed to ensure consistent data retrieval during the Arizona NHEXAS project and the "Border" study. Keywords: custody; QA/QC; field checks.
The Nation...
Summary Report for the Evaluation of Current QA Processes Within the FRMAC FAL and EPA MERL.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shanks, Sonoya T.; Redding, Ted; Jaussi, Lynn
The Federal Radiological Monitoring and Assessment Center (FRMAC) relies on accurate and defensible analytical laboratory data to support its mission. Therefore, FRMAC must ensure that the environmental analytical laboratories providing analytical services maintain an ongoing capability to provide accurate analytical results to DOE. The more Quality Assurance (QA) and Quality Control (QC) measures are required of the laboratory, the fewer resources are available for analysis of response samples. Because QA and QC measures are generally understood to comprise a major effort related to a laboratory's operations, requirements should only be considered if they are deemed "value-added" for the FRMAC mission. This report provides observations of areas for improvement and potential interoperability opportunities in the areas of Batch Quality Control Requirements, Written Communications, Data Review Processes, and Data Reporting Processes, along with lessons learned as they apply to items in the early phase of a response that will be critical for developing a more efficient, integrated response for future interactions between FRMAC and EPA assets.
Operational Processing of Ground Validation Data for the Tropical Rainfall Measuring Mission
NASA Technical Reports Server (NTRS)
Kulie, Mark S.; Robinson, Mike; Marks, David A.; Ferrier, Brad S.; Rosenfeld, Danny; Wolff, David B.
1999-01-01
The Tropical Rainfall Measuring Mission (TRMM) satellite was successfully launched in November 1997. A primary goal of TRMM is to sample tropical rainfall using the first active spaceborne precipitation radar. To validate TRMM satellite observations, a comprehensive Ground Validation (GV) Program has been implemented for this mission. A key component of GV is the analysis and quality control of meteorological ground-based radar data from four primary sites: Melbourne, FL; Houston, TX; Darwin, Australia; and Kwajalein Atoll, RMI. As part of the TRMM GV effort, the Joint Center for Earth Systems Technology (JCET) at the University of Maryland, Baltimore County, has been tasked with developing and implementing an operational system to quality control (QC), archive, and provide data for subsequent rainfall product generation from the four primary GV sites. This paper provides an overview of the JCET operational environment. A description of the QC algorithm and its performance, in addition to the data flow procedure between JCET and the TRMM Science Data and Information System (TSDIS), is presented. The impact of quality-controlled data on higher-level rainfall and reflectivity products will also be addressed. Finally, a brief description of JCET's expanded role in producing reference rainfall products will be discussed.
Cirillo, Daniela M.; Hoffner, Sven; Ismail, Nazir A.; Kaur, Devinder; Lounis, Nacer; Metchock, Beverly; Pfyffer, Gaby E.; Venter, Amour
2016-01-01
The aim of this study was to establish standardized drug susceptibility testing (DST) methodologies and reference MIC quality control (QC) ranges for bedaquiline, a diarylquinoline antimycobacterial, used in the treatment of adults with multidrug-resistant tuberculosis. Two tier-2 QC reproducibility studies of bedaquiline DST were conducted in eight laboratories using Clinical and Laboratory Standards Institute (CLSI) guidelines. Agar dilution and broth microdilution methods were evaluated. Mycobacterium tuberculosis H37Rv was used as the QC reference strain. Bedaquiline MIC frequency, mode, and geometric mean were calculated. When resulting data occurred outside predefined CLSI criteria, the entire laboratory data set was excluded. For the agar dilution MIC, a 4-dilution QC range (0.015 to 0.12 μg/ml) centered around the geometric mean included 95.8% (7H10 agar dilution; 204/213 observations, with one data set excluded) or 95.9% (7H11 agar dilution; 232/242) of bedaquiline MICs. For the 7H9 broth microdilution MIC, a 3-dilution QC range (0.015 to 0.06 μg/ml) centered around the mode included 98.1% (207/211, with one data set excluded) of bedaquiline MICs. Microbiological equivalence was demonstrated for bedaquiline MICs determined using 7H10 agar and 7H11 agar, but not for bedaquiline MICs determined using 7H9 broth and 7H10 agar or 7H9 broth and 7H11 agar. Bedaquiline DST methodologies and MIC QC ranges against the H37Rv M. tuberculosis reference strain have been established: 0.015 to 0.12 μg/ml for the 7H10 and 7H11 agar dilution MICs and 0.015 to 0.06 μg/ml for the 7H9 broth microdilution MIC. These methodologies and QC ranges will be submitted to CLSI and EUCAST to inform future research and provide guidance for routine clinical bedaquiline DST in laboratories worldwide. PMID:27654337
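A worked sketch of the CLSI-style computation behind such ranges may be useful: MICs live on a doubling-dilution grid, the geometric mean is computed on log2 values, and a 4-dilution QC range spans consecutive dilutions around the grid-snapped geometric mean. The MIC values below are illustrative, and whether this exact centering rule matches the study's procedure is an assumption.

```python
import numpy as np

mics = np.array([0.03, 0.03, 0.06, 0.06, 0.06, 0.12, 0.03, 0.06])  # ug/ml
log2_mics = np.log2(mics)
geo_mean = 2 ** log2_mics.mean()
center = 2 ** np.round(log2_mics.mean())          # snap to the dilution grid
qc_range = (center / 4, center * 2)               # 4 consecutive dilutions
print(round(geo_mean, 3), qc_range)               # e.g. ~(0.015, 0.12)
```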
NASA Technical Reports Server (NTRS)
Orcutt, John M.; Brenton, James C.
2016-01-01
An accurate database of meteorological data is essential for designing any aerospace vehicle and for preparing launch commit criteria. Meteorological instrumentation was recently placed on the three Lightning Protection System (LPS) towers at Kennedy Space Center (KSC) launch complex 39B (LC-39B), providing a unique meteorological dataset at the launch complex over an extensive altitude range. Data records of temperature, dew point, relative humidity, wind speed, and wind direction are produced at 40, 78, 116, and 139 m on each tower. The Marshall Space Flight Center Natural Environments Branch (EV44) received an archive consisting of one-minute averaged measurements for the period of record January 2011 - April 2015. Before the received database could be used, however, EV44 needed to remove any erroneous data through a comprehensive quality control (QC) process. The QC process applied to the LPS towers' meteorological data is similar to other QC processes developed by EV44, which were used in the creation of meteorological databases for other towers at KSC; it has been modified specifically for use with the LPS tower database. The QC process first includes a check of each individual sensor, removing any unrealistic data and checking the temporal consistency of each variable. Next, data from all three sensors at each height are checked against each other, checked against climatology, and checked for sensors that erroneously report a constant value. Then, a vertical consistency check of each variable at each tower is completed. Last, the upwind sensor at each level is selected to minimize the influence of the towers and other structures at LC-39B on the measurements; the selection process implemented a study of tower-induced turbulence. This paper describes in detail the QC process, the QC results, and the attributes of the LPS towers meteorological database.
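As a small illustration of one stage of the QC chain described above, the sketch below compares the three towers' simultaneous readings at a given height and flags any sensor that departs from the median of the trio by more than a tolerance. The tolerance and data are illustrative; EV44's actual thresholds are not reproduced here.

```python
import numpy as np

def intersensor_flags(readings, tol):
    """readings: (n_times, 3) array, one column per tower sensor."""
    readings = np.asarray(readings, dtype=float)
    median = np.nanmedian(readings, axis=1, keepdims=True)
    return np.abs(readings - median) > tol

# Three towers tracking the same warming trend; tower 3 spikes at time 7.
temps = np.column_stack([
    np.linspace(20, 22, 10),
    np.linspace(20.1, 22.1, 10),
    np.linspace(20, 22, 10) + np.where(np.arange(10) == 7, 5.0, 0.0),
])
print(np.argwhere(intersensor_flags(temps, tol=2.0)))  # -> [[7 2]]
```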
Zweigenbaum, J; Henion, J
2000-06-01
The high-throughput determination of small molecules in biological matrixes has become an important part of drug discovery. This work shows that increased-throughput LC/MS/MS techniques can be used for the analysis of selected estrogen receptor modulators in human plasma, where more than 2000 samples may be analyzed in a 24-h period. The compounds used to demonstrate the high-throughput methodology include tamoxifen, raloxifene, 4-hydroxytamoxifen, nafoxidine, and idoxifene. Tamoxifen and raloxifene are used in both breast cancer therapy and osteoporosis and have shown prophylactic potential for the reduction of the risk of breast cancer. The described strategy provides LC/MS/MS separation and quantitation for each of the five test articles in control human plasma. The method includes sample preparation employing liquid-liquid extraction in the 96-well format, an LC separation of the five compounds in less than 30 s, and selected reaction monitoring detection from low nanogram to microgram per milliliter levels. Precision and accuracy are determined where each 96-well plate is considered a typical "tray" having calibration standards and quality control (QC) samples dispersed throughout each plate. A concept is introduced where 24 96-well plates analyzed in 1 day is considered a "grand tray", and the method is cross-validated with standards placed only at the beginning of the first plate and the end of the last plate. Using idoxifene-d5 as an internal standard, the results obtained for idoxifene and tamoxifen satisfy current bioanalytical method validation criteria on two separate days where 2112 and 2304 samples were run, respectively. Method validation included 24-h autosampler stability and one freeze-thaw cycle stability for the extracts. Idoxifene showed acceptable results, with accuracy ranging from 0.3% for the high quality control (QC) to 15.4% for the low QC and precision of 3.6%-13.9% relative standard deviation. Tamoxifen showed accuracy ranging from 1.6% to 13.8% and precision from 7.8% to 15.2%. The linear dynamic range for these compounds was 3 orders of magnitude. The limit of quantification was 5 and 50 ng/mL for tamoxifen and idoxifene, respectively. The other compounds in this study generally satisfy the more relaxed bioanalytical acceptance criteria for modern drug discovery. It is suggested that the quantification levels reported in this high-throughput analysis example are adequate for many drug discovery and related early pharmaceutical studies.
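The acceptance bookkeeping implied by the accuracy and precision figures above can be sketched in a few lines: for each QC level on a plate, accuracy is the percent deviation of the mean from nominal and precision is the %RSD, each compared with the ±15% criteria the abstract mentions. The concentrations below are invented illustrative values.

```python
import numpy as np

def qc_level_passes(measured, nominal, limit_pct=15.0):
    """Return (pass/fail, accuracy %, precision %RSD) for one QC level."""
    measured = np.asarray(measured, dtype=float)
    accuracy = 100.0 * (measured.mean() - nominal) / nominal
    precision = 100.0 * measured.std(ddof=1) / measured.mean()
    ok = abs(accuracy) <= limit_pct and precision <= limit_pct
    return ok, accuracy, precision

ok, acc, prec = qc_level_passes([0.92, 1.08, 1.05, 0.97], nominal=1.0)
print(ok, round(acc, 1), round(prec, 1))   # True 0.5 7.3
```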
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
The goal of this ultrasound hands-on workshop is to demonstrate advancements in high intensity focused ultrasound (HIFU) and to demonstrate quality control (QC) testing in diagnostic ultrasound. HIFU is a therapeutic modality that uses ultrasound waves as carriers of energy. HIFU is used to focus a beam of ultrasound energy into a small volume at specific target locations within the body. The focused beam causes localized high temperatures and produces well-defined regions of necrosis. This completely non-invasive technology has great potential for tumor ablation and targeted drug delivery. At the workshop, attendees will see configurations, applications, and hands-on demonstrations with on-site instructors at separate stations. The involvement of medical physicists in diagnostic ultrasound imaging service is increasing due to QC and accreditation requirements. At the workshop, an array of ultrasound testing phantoms and ultrasound scanners will be provided for attendees to learn diagnostic ultrasound QC in a hands-on environment with live demonstrations of the techniques. Target audience: medical physicists and other medical professionals in diagnostic imaging and radiation oncology with interest in high-intensity focused ultrasound and in diagnostic ultrasound QC. Learning Objectives: Learn ultrasound physics and safety for HIFU applications through live demonstrations; get an overview of the state-of-the-art in HIFU technologies and equipment; gain familiarity with common elements of a quality control program for diagnostic ultrasound imaging; identify QC tools available for testing diagnostic ultrasound systems and learn how to use these tools. List of supporting vendors for the HIFU and diagnostic ultrasound QC hands-on workshop: Philips Healthcare; Alpinion Medical Systems; Verasonics, Inc; Zonare Medical Systems, Inc; Computerized Imaging Reference Systems (CIRS), Inc.; GAMMEX, Inc.; Cablon Medical BV. Steffen Sammet: NIH/NCI grant 5R25CA132822, NIH/NINDS grant 5R25NS080949 and Philips Healthcare research grant C32.
Raef, A.
2009-01-01
The recent proliferation of the 3D reflection seismic method into the near-surface area of geophysical applications, especially in response to the need to comprehensively characterize and monitor near-surface carbon dioxide sequestration in shallow saline aquifers around the world, justifies the emphasis on a cost-effective and robust quality control and assurance (QC/QA) workflow for 3D seismic data preprocessing that is suitable for near-surface applications. The main purpose of our seismic data preprocessing QC is to enable the use of appropriate header information and of data that are free of noise-dominated traces and flawed vertical stacking in subsequent processing steps. In this article, I provide an account of utilizing survey design specifications, noise properties, first breaks, and normal moveout for rapid and thorough graphical QC/QA diagnostics, which are easy to apply and efficient in the diagnosis of inconsistencies. A correlated vibroseis time-lapse 3D-seismic data set from a CO2-flood monitoring survey is used for demonstrating the QC diagnostics. An important by-product of the QC workflow is establishing the number of layers for a refraction statics model in a data-driven graphical manner that capitalizes on the spatial coverage of the 3D seismic data. © China University of Geosciences (Wuhan) and Springer-Verlag GmbH 2009.
The quality control theory of aging.
Ladiges, Warren
2014-01-01
The quality control (QC) theory of aging is based on the concept that aging is the result of a reduction in QC of cellular systems designed to maintain lifelong homeostasis. Four QC systems associated with aging are 1) inadequate protein processing in a distressed endoplasmic reticulum (ER); 2) histone deacetylase (HDAC) processing of genomic histones and gene silencing; 3) suppressed AMPK nutrient sensing with inefficient energy utilization and excessive fat accumulation; and 4) beta-adrenergic receptor (BAR) signaling and environmental and emotional stress. Reprogramming these systems to maintain efficiency and prevent aging would be a rational strategy for increased lifespan and improved health. The QC theory can be tested with a pharmacological approach using three well-known and safe, FDA-approved drugs: 1) phenyl butyric acid, a chemical chaperone that enhances ER function and is also an HDAC inhibitor, 2) metformin, which activates AMPK and is used to treat type 2 diabetes, and 3) propranolol, a beta blocker which inhibits BAR signaling and is used to treat hypertension and anxiety. A critical aspect of the QC theory, then, is that aging is associated with multiple cellular systems that can be targeted with drug combinations more effectively than with single drugs. But more importantly, these drug combinations will effectively prevent, delay, or reverse chronic diseases of aging that impose such a tremendous health burden on our society.
Cost-Effective, Ultra-Sensitive Groundwater Monitoring for Site Remediation and Management: Standard Operating Procedures (Guidance Document)
Halden, R. U.; Roll, I. B.
2015-05-01
[Only fragments of this record survived extraction. Section 4.0, Data Types and Quality Control, notes that a sampling plan must account for the collection and handling of data and be developed in consultation with the site management.]
Preservation of Fine-Needle Aspiration Specimens for Future Use in RNA-Based Molecular Testing
Ladd, Amy C.; O'Sullivan-Mejia, Emerald; Lea, Tasha; Perry, Jessica; Dumur, Catherine I.; Dragoescu, Ema; Garrett, Carleton T.; Powers, Celeste N.
2015-01-01
Background The application of ancillary molecular testing is becoming more important for the diagnosis and classification of disease. The use of fine-needle aspiration (FNA) biopsy as the means of sampling tumors in conjunction with molecular testing could be a powerful combination. FNA is minimally invasive, cost effective, and usually demonstrates accuracy comparable to diagnoses based on excisional biopsies. Quality control (QC) and test validation requirements for development of molecular tests impose a need for access to pre-existing clinical samples. Tissue banking of excisional biopsy specimens is frequently performed at large research institutions, but few have developed protocols for preservation of cytologic specimens. This study aimed to evaluate cryopreservation of FNA specimens as a method of maintaining cellular morphology and ribonucleic acid (RNA) integrity in banked tissues. Methods FNA specimens were obtained from fresh tumor resections, processed by using a cryopreservation protocol, and stored for up to 27 weeks. Upon retrieval, samples were made into slides for morphological evaluation, and RNA was extracted and assessed for integrity by using the Agilent Bioanalyzer (Agilent Technologies, Santa Clara, Calif). Results Cryopreserved specimens showed good cell morphology and, in many cases, yielded intact RNA. Cases showing moderate or severe RNA degradation could generally be associated with prolonged specimen handling or sampling of necrotic areas. Conclusions FNA specimens can be stored in a manner that maintains cellular morphology and RNA integrity necessary for studies of gene expression. In addition to addressing quality control (QC) and test validation needs, cytology banks will be an invaluable resource for future molecular morphologic and diagnostic research studies. PMID:21287691
2016-09-01
[Only fragments of this record survived extraction. The study used an in-house automated MLST pipeline that first searches for homologs of each gene before assigning a classification, and a novel bioinformatics pipeline to incorporate antimicrobial resistance and virulence mechanisms contributing to pathogen success in the wound environment. Data quality was monitored in two ways: read-based genome QC and assembly-based metrics; the JCVI Genome QC pipeline samples sequence reads and performs BLAST searches.]
NASA Technical Reports Server (NTRS)
Brenton, James C.; Barbre, Robert E., Jr.; Decker, Ryan K.; Orcutt, John M.
2018-01-01
The National Aeronautics and Space Administration's (NASA) Marshall Space Flight Center (MSFC) Natural Environments Branch (EV44) has provided atmospheric databases and analysis in support of space vehicle design and day-of-launch operations for NASA and commercial launch vehicle programs launching from the NASA Kennedy Space Center (KSC), co-located on the United States Air Force's Eastern Range (ER) at the Cape Canaveral Air Force Station. The ER complex is one of the most heavily instrumented sites in the United States, with over 31 towers measuring various atmospheric parameters on a continuous basis. An inherent challenge with large datasets is ensuring that erroneous data are removed from databases and thus excluded from launch vehicle design analyses. EV44 has put forth great effort in developing quality control (QC) procedures for individual meteorological instruments; however, no standard QC procedures for all databases currently exist, resulting in QC databases that have inconsistencies in variables, methodologies, and periods of record. The goal of this activity is to use the previous efforts by EV44 to develop a standardized set of QC procedures from which to build meteorological databases from KSC and the ER, while maintaining open communication with end users from the launch community to develop ways to improve, adapt and grow the QC database. Details of the QC procedures will be described. As the rate of launches increases with additional launch vehicle programs, it is becoming more important that weather databases are continually updated and checked for data quality before use in launch vehicle design and certification analyses.
Impacts of Intelligent Automated Quality Control on a Small Animal APD-Based Digital PET Scanner
NASA Astrophysics Data System (ADS)
Charest, Jonathan; Beaudoin, Jean-François; Bergeron, Mélanie; Cadorette, Jules; Arpin, Louis; Lecomte, Roger; Brunet, Charles-Antoine; Fontaine, Réjean
2016-10-01
Stable system performance is mandatory to warrant the accuracy and reliability of biological results relying on small animal positron emission tomography (PET) imaging studies. This simple requirement sets the ground for imposing routine quality control (QC) procedures to keep PET scanners at a reliably optimal performance level. However, such procedures can become burdensome for scanner operators to implement, especially given the increasing number of data acquisition channels in newer-generation PET scanners. In systems using pixel detectors to achieve enhanced spatial resolution and contrast-to-noise ratio (CNR), the QC workload rapidly increases to unmanageable levels due to the number of independent channels involved. An artificial-intelligence-based QC system, referred to as Scanner Intelligent Diagnosis for Optimal Performance (SIDOP), was proposed to help reduce the QC workload by performing automatic channel fault detection and diagnosis. SIDOP consists of four high-level modules that employ machine learning methods to perform their tasks: Parameter Extraction, Channel Fault Detection, Fault Prioritization, and Fault Diagnosis. Ultimately, SIDOP submits a prioritized faulty-channel list to the operator and proposes actions to correct them. To validate that SIDOP can perform QC procedures adequately, it was deployed on a LabPET™ scanner and multiple performance metrics were extracted. After multiple corrections of sub-optimal scanner settings, an 8.5% (95% confidence interval (CI): [7.6, 9.3]) improvement in the CNR, a 17.0% (CI: [15.3, 18.7]) decrease in the uniformity percentage standard deviation, and a 6.8% gain in global sensitivity were observed. These results confirm that SIDOP can indeed be of assistance in performing QC procedures and restoring performance to optimal levels.
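A toy sketch of the fault-detection idea follows: extract a per-channel health parameter (here, counts under a uniform flood) and flag channels whose values are extreme relative to the population via a robust z-score, then rank them for review. SIDOP's actual features and classifiers are more elaborate than this; everything below is illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)
counts = rng.normal(1000, 40, size=4608)     # per-channel singles counts
counts[[17, 905, 3002]] = [0, 310, 2400]     # dead, weak, and noisy channels

# Robust z-score: median and scaled median absolute deviation (MAD).
med = np.median(counts)
mad = 1.4826 * np.median(np.abs(counts - med))
z = (counts - med) / mad

suspects = np.argsort(-np.abs(z))[:5]        # prioritized review list
print(suspects, np.round(z[suspects], 1))
```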
NASA Astrophysics Data System (ADS)
Jiao, Xin; Liu, Yiqun; Yang, Wan; Zhou, Dingwu; Wang, Shuangshuang; Jin, Mengqi; Sun, Bin; Fan, Tingting
2018-01-01
The cycling of the various isomorphs of authigenic silica minerals is a complex and long-term process. A special type of composite quartz (Qc) grain in tuffaceous shale of the Permian Lucaogou Formation in the sediment-starved, volcanically and hydrothermally active, intracontinental lacustrine Santanghu rift basin (NW China) is studied in detail to demonstrate such processes. Samples from one well in the central basin were subjected to petrographic, elemental chemical, and fluid inclusion analyses. About 200 Qc-bearing laminae, 0.1-2 mm (mainly about 1 mm) thick, are intercalated within tuffaceous shale laminae. The Qc grains occur as framework grains and are dispersed in an igneous feldspar-dominated matrix, suggesting episodic accumulation. The Qc grains are bedding-parallel, uniform in size (hundreds of µm), elongate, and radial in crystal pattern, suggesting a biogenic origin. Qc grains are composed of a core of anhedral microcrystalline quartz and an outer part of subhedral mega-quartz grains, whose edges are composed of small euhedral quartz crystals, indicating multiple episodic processes of recrystallization and overgrowth. The abundance of Al and Ti in the quartz crystals and temperatures estimated from fluid inclusions in Qc grains indicate that these processes are related to hydrothermal fluids. Finally, the Qc grains are interpreted as original silica precipitates in microorganism (algae?) cysts, which were reworked by bottom currents and altered by hydrothermal fluids to recrystallize and overgrow during penecontemporaneous shallow burial. It is postulated that episodic volcanic and hydrothermal activity changed lake water chemistry, temperature, and nutrient supply, resulting in variations in microorganism productivity and silica cycling. The transformation of authigenic silica from amorphous to well crystallized occurred over a short time span during shallow burial.
Office of Student Financial Aid Quality Improvement Program: Design and Implementation Plan.
ERIC Educational Resources Information Center
Advanced Technology, Inc., Reston, VA.
The purpose and direction of the Office of Student Financial Aid (OSFA) quality improvement program are described. The background and context for the Pell Grant quality control (QC) design study and the meaning of QC are reviewed. The general approach to quality improvement consists of the following elements: a strategic approach that enables OSFA…
FASTQ quality control dashboard
DOE Office of Scientific and Technical Information (OSTI.GOV)
2016-07-25
FQCDB builds on the existing open source software FastQC, implementing a modern web interface over the parsed output of FastQC. In addition, FQCDB is extensible as a web service to include additional plots of type line, boxplot, or heatmap across data formatted according to its guidelines. The interface is also configurable via a more readable JSON format, enabling customization by non-web programmers.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Günther, Markus, E-mail: markus.guenther@tu-berlin.de; Geißler, Gesa; Köppel, Johann
As there is no single, definitive concept of how to precisely define and establish quality control (QC) or quality assurance (QA) in the making of environmental assessments (EA), this paper presents selected features of international approaches that address quality in EA systems in the USA, the Netherlands, Canada, and the United Kingdom. Based on explanative case studies, we highlight the embedding of specific quality control features within the EA systems, the objectives and processes, and relevant transparency challenges. Such features of QC/QA approaches can be considered in cases where substantial quality control and assurance efforts are still missing. Yet further research needs to be conducted on the efficacy of these approaches, which remains beyond the scope of this study. - Highlights: • We present four tools for quality control and assurance from different EA systems. • Approaches vary in institutional setting, objectives, procedures, and transparency. • Highlighted features might provide guidance in cases where QC/QA is still lacking.
NASA Technical Reports Server (NTRS)
Barbre, Robert E., Jr.
2015-01-01
This paper describes in detail the QC and splicing methodology for KSC's 50- and 915-MHz DRWP measurements that generates an extensive archive of vertically complete profiles from 0.20-18.45 km. The concurrent POR from each archive extends from April 2000 to December 2009. MSFC NE applies separate but similar QC processes to each of the 50- and 915-MHz DRWP archives. DRWP literature and data examination provide the basis for developing and applying the automated and manual QC processes on both archives. Depending on the month, the QC'ed 50- and 915-MHz DRWP archives retain 52-65% and 16-30% of the possible data, respectively. The 50- and 915-MHz DRWP QC archives retain 84-91% and 85-95%, respectively, of all the available data, provided that data exist in the non-QC'ed archives. Next, MSFC NE applies an algorithm to splice concurrent measurements from both DRWP sources. Last, MSFC NE generates a composite profile from the (up to) five available spliced profiles to effectively characterize boundary layer winds and to utilize all possible 915-MHz DRWP measurements at each timestamp. During a given month, roughly 23,000-32,000 complete profiles exist from 0.25-18.45 km in the composite profiles' archive, and approximately 5,000-27,000 complete profiles exist in an archive utilizing an individual 915-MHz DRWP. One can extract a variety of profile combinations (pairs, triplets, etc.) from this sample for a given application. The sample of vertically complete DRWP wind measurements not only gives launch vehicle customers greater confidence in loads and trajectory assessments versus using balloon output, but also provides flexibility to simulate different DOL situations across applicable altitudes. In addition to increasing sample size and providing more flexibility for DOL simulations in the vehicle design phase, the spliced DRWP database provides any upcoming launch vehicle program with the capability to utilize DRWP profiles on DOL to compute vehicle steering commands, provided the program applies the procedures that this report describes to new DRWP data on DOL. Decker et al. (2015) details how SLS is proposing to use DRWP data and splicing techniques on DOL. Although automation could enhance the current DOL 50-MHz DRWP QC process and could streamline any future DOL 915-MHz DRWP QC and splicing process, the DOL community would still require manual intervention to ensure that the vehicle only uses valid profiles. If a program desires to use high spatial resolution profiles, then the algorithm could randomly add high-frequency components to the DRWP profiles. The spliced DRWP database provides considerable flexibility in how one performs DOL simulations, and the algorithms that this report provides will assist the aerospace and atmospheric communities interested in utilizing the DRWP.
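A schematic of the splicing step may help fix ideas: take the 915-MHz profile below a chosen transition altitude and the 50-MHz profile above it, requiring both systems to report valid (non-NaN) winds near the seam. The altitudes, resolutions, and blending rule below are simplified stand-ins for the documented procedure, not a reproduction of it.

```python
import numpy as np

def splice(alt_km, wind_915, wind_50, transition_km=2.7):
    """Combine low- and high-altitude profiler winds at a transition altitude."""
    spliced = np.where(alt_km < transition_km, wind_915, wind_50)
    seam = np.argmin(np.abs(alt_km - transition_km))
    if np.isnan(wind_915[seam - 1]) or np.isnan(wind_50[seam]):
        raise ValueError("no valid overlap at the seam; reject this timestamp")
    return spliced

alt = np.arange(0.2, 18.5, 0.1)
low = np.where(alt <= 3.0, 5 + alt, np.nan)      # 915-MHz coverage
high = np.where(alt >= 2.5, 5 + alt, np.nan)     # 50-MHz coverage
print(splice(alt, low, high)[:5], splice(alt, low, high)[-3:])
```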
Quality Circles: An Innovative Program to Improve Military Hospitals
1982-08-01
[Only fragments of this record survived extraction. Dr. Kaoru Ishikawa is credited with starting the first "Quality Control Circles" and registering them with the Japanese Union of Scientists and Engineers. In 1962, Dr. Ishikawa, a professor at Tokyo University, developed the QC circle concept, combining the ideas of Douglas McGregor and Abraham Maslow into a unique style of management. The report concludes that the QC concept has come a long way since Dr. Ishikawa gave it birth in 1962 and has left an enviable record of success.]
Guillot, Sophie; Guiso, Nicole
2016-08-01
The French National Reference Centre (NRC) for Whooping Cough carried out an external quality control (QC) analysis in 2010 for the PCR diagnosis of whooping cough. The main objective of the study was to assess the impact of this QC in the participating laboratories through a repeat analysis in 2012. Copyright © 2016, American Society for Microbiology. All Rights Reserved.
NASA Astrophysics Data System (ADS)
Sicoe, G. M.; Belu, N.; Rachieru, N.; Nicolae, E. V.
2017-10-01
Presently, the tendency in the automotive industry is to adapt continuously to change and to translate market trends into new products, leading to customer satisfaction. Many quality techniques have been adopted in this field for the continuous improvement of product and process quality, and advantages have been gained. The present paper focuses on the possibilities offered by the combined use of the Quality Assurance Matrix (QAM) and the Quality Control Story (QC Story) to provide the greatest protection against nonconformities in the production process, through a case study in the automotive industry. There is a direct relationship from the QAM to a QC Story analysis: the failures identified using the QAM are treated with the QC Story methodology. Using these methods will help decrease PPM values and increase quality performance and customer satisfaction.
panelcn.MOPS: Copy-number detection in targeted NGS panel data for clinical diagnostics.
Povysil, Gundula; Tzika, Antigoni; Vogt, Julia; Haunschmid, Verena; Messiaen, Ludwine; Zschocke, Johannes; Klambauer, Günter; Hochreiter, Sepp; Wimmer, Katharina
2017-07-01
Targeted next-generation-sequencing (NGS) panels have largely replaced Sanger sequencing in clinical diagnostics. They allow for the detection of copy-number variations (CNVs) in addition to single-nucleotide variants and small insertions/deletions. However, existing computational CNV detection methods have shortcomings regarding accuracy, quality control (QC), incidental findings, and user-friendliness. We developed panelcn.MOPS, a novel pipeline for detecting CNVs in targeted NGS panel data. Using data from 180 samples, we compared panelcn.MOPS with five state-of-the-art methods. With panelcn.MOPS leading the field, most methods achieved comparably high accuracy. panelcn.MOPS reliably detected CNVs ranging in size from part of a region of interest (ROI), to whole genes, which may comprise all ROIs investigated in a given sample. The latter is enabled by analyzing reads from all ROIs of the panel, but presenting results exclusively for user-selected genes, thus avoiding incidental findings. Additionally, panelcn.MOPS offers QC criteria not only for samples, but also for individual ROIs within a sample, which increases the confidence in called CNVs. panelcn.MOPS is freely available both as R package and standalone software with graphical user interface that is easy to use for clinical geneticists without any programming experience. panelcn.MOPS combines high sensitivity and specificity with user-friendliness rendering it highly suitable for routine clinical diagnostics. © 2017 The Authors. Human Mutation published by Wiley Periodicals, Inc.
Rahal, Juliana Saab; Mesquita, Marcelo Ferraz; Henriques, Guilherme Elias Pessanha; Nóbilo, Mauro Antonio Arruda
2004-01-01
The influence of polishing methods on the water sorption and solubility of denture base acrylic resins was studied. Eighty samples were divided into groups: Classico (CL) and QC 20 (QC), cured in a hot water bath; and Acron MC (AC) and Onda Cryl (ON), microwave cured. Samples were submitted to mechanical polishing (MP) - pumice slurry, chalk powder, soft brush, and felt cone in a bench vise - or chemical polishing (CP) - heated monomer fluid in a chemical polisher. The first desiccation process was followed by storage in distilled water at 37 +/- 1 degrees C for 1 h, 1 day, and 1, 2, 3 and 4 weeks. At the end of each period, water sorption was measured. After the fourth week, a second desiccation process was performed to calculate solubility. Data were submitted to analysis of variance, followed by the Tukey test (p
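A hedged sketch of the usual mass-based computation behind such studies (ISO 1567-style) follows: with m1 the conditioned mass after the first desiccation, m2 the mass after water storage, and m3 the mass after the second desiccation, sorption = (m2 - m3)/V and solubility = (m1 - m3)/V for sample volume V. Whether this exact formula was used in this study is an assumption, and all numbers are illustrative.

```python
def sorption_solubility(m1_mg, m2_mg, m3_mg, volume_mm3):
    """Return (water sorption, solubility) in ug/mm^3, ISO 1567-style."""
    wsp = (m2_mg - m3_mg) * 1000.0 / volume_mm3   # mg -> ug
    wsl = (m1_mg - m3_mg) * 1000.0 / volume_mm3
    return wsp, wsl

# Disk 50 mm diameter x 0.5 mm thick => V ~ 982 mm^3 (illustrative numbers)
print(sorption_solubility(m1_mg=2000.0, m2_mg=2021.0, m3_mg=1999.2,
                          volume_mm3=982.0))
```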
Purba, Fredrick Dermawan; Hunfeld, Joke A M; Iskandarsyah, Aulia; Fitriana, Titi Sahidah; Sadarjoen, Sawitri S; Passchier, Jan; Busschbach, Jan J V
2017-05-01
In valuing health states using generic questionnaires such as EQ-5D, there are unrevealed issues with the quality of the data collection. The aims were to describe the problems encountered during valuation and to evaluate a quality control report and subsequent retraining of interviewers in improving this valuation. Data from the first 266 respondents in an EQ-5D-5L valuation study were used. Interviewers were trained and answered questions regarding problems during these initial interviews. Thematic analysis was used, and individual feedback was provided. After completion of 98 interviews, a first quantitative quality control (QC) report was generated, followed by a 1-day retraining program. Subsequently, individual feedback was also given on the basis of follow-up QCs. The Wilcoxon signed-rank test was used to assess improvements based on 7 indicators of quality identified in the first QC and the QC conducted after a further 168 interviews. Interviewers encountered problems in recruiting respondents. Solutions provided were: optimization of the time of interview, the use of broader networks, and the use of different scripts to explain the project's goals to respondents. For problems in the interviewing process, the solutions applied were: developing the technical and personal skills of the interviewers and stimulating the respondents' thought processes. There were also technical problems related to hardware, software, and internet connections. There was an improvement in all 7 indicators of quality after the second QC. Training before and during a study, and individual feedback on the basis of a quantitative QC, can increase the validity of values obtained from generic questionnaires.
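The before/after comparison described above is a standard paired test; a short sketch follows, with invented per-interviewer scores on one quality indicator at the first and second QC.

```python
from scipy.stats import wilcoxon

first_qc = [62, 55, 70, 58, 66, 61, 59, 64, 57, 63]   # indicator score, QC 1
second_qc = [71, 63, 74, 66, 70, 69, 65, 72, 64, 70]  # same interviewers, QC 2

# Paired, two-sided Wilcoxon signed-rank test across interviewers.
stat, p = wilcoxon(first_qc, second_qc)
print(stat, p)   # small p suggests a systematic improvement
```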
The purpose of this SOP is to outline the process of field quality assurance and quality control checks. This procedure was followed to ensure consistent data retrieval during the Arizona NHEXAS project and the Border study. Keywords: custody; QA/QC; field checks.
The U.S.-Mex...
[Highly quality-controlled radiation therapy].
Shirato, Hiroki
2005-04-01
Advanced radiation therapy for intracranial disease has focused on set-up accuracy for the past 15 years. However, quality control of the prescribed dose is actually as important as the tumor set-up in radiation therapy. Because of the complexity of modern three-dimensional radiation treatment planning systems, highly quality-controlled dose prescription has now been reappraised as central to improving the treatment outcome of radiation therapy for intracranial disease. The Japanese Committee for Quality Control of Radiation Therapy has developed fundamental requirements such as a QC committee in each hospital, a medical physicist, dosimetrists (QC members), and an external audit.
Network-Centric Quantum Communications
NASA Astrophysics Data System (ADS)
Hughes, Richard
2014-03-01
Single-photon quantum communications (QC) offers "future-proof" cryptographic security rooted in the laws of physics. Today's quantum-secured communications cannot be compromised by unanticipated future technological advances. But to date, QC has only existed in point-to-point instantiations that have limited ability to address the cyber security challenges of our increasingly networked world. In my talk I will describe a fundamentally new paradigm of network-centric quantum communications (NQC) that leverages the network to bring scalable, QC-based security to user groups that may have no direct user-to-user QC connectivity. With QC links only between each of N users and a trusted network node, NQC brings quantum security to N2 user pairs, and to multi-user groups. I will describe a novel integrated photonics quantum smartcard ("QKarD") and its operation in a multi-node NQC test bed. The QKarDs are used to implement the quantum cryptographic protocols of quantum identification, quantum key distribution and quantum secret splitting. I will explain how these cryptographic primitives are used to provide key management for encryption, authentication, and non-repudiation for user-to-user communications. My talk will conclude with a description of a recent demonstration that QC can meet both the security and quality-of-service (latency) requirements for electric grid control commands and data. These requirements cannot be met simultaneously with present-day cryptography.
40 CFR 98.284 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
..., Hydrogen, and Nitrogen in Laboratory Samples of Coal (incorporated by reference, see § 98.7). (d) For... Instrumental Determination of Carbon, Hydrogen, and Nitrogen in Laboratory Samples of Coal (incorporated by...
40 CFR 98.284 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
..., Hydrogen, and Nitrogen in Laboratory Samples of Coal (incorporated by reference, see § 98.7). (d) For... Instrumental Determination of Carbon, Hydrogen, and Nitrogen in Laboratory Samples of Coal (incorporated by...
40 CFR 98.284 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
..., Hydrogen, and Nitrogen in Laboratory Samples of Coal (incorporated by reference, see § 98.7). (d) For... Instrumental Determination of Carbon, Hydrogen, and Nitrogen in Laboratory Samples of Coal (incorporated by...
40 CFR 98.284 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
..., Hydrogen, and Nitrogen in Laboratory Samples of Coal (incorporated by reference, see § 98.7). (d) For... Instrumental Determination of Carbon, Hydrogen, and Nitrogen in Laboratory Samples of Coal (incorporated by...
Eichhold, Thomas H; McCauley-Myers, David L; Khambe, Deepa A; Thompson, Gary A; Hoke, Steven H
2007-01-17
A method for the simultaneous determination of dextromethorphan (DEX), dextrorphan (DET), and guaifenesin (GG) in human plasma was developed, validated, and applied to determine plasma concentrations of these compounds in samples from six clinical pharmacokinetic (PK) studies. Semi-automated liquid handling systems were used to perform the majority of the sample manipulation including liquid/liquid extraction (LLE) of the analytes from human plasma. Stable-isotope-labeled analogues were utilized as internal standards (ISTDs) for each analyte to facilitate accurate and precise quantification. Extracts were analyzed using gradient liquid chromatography coupled with tandem mass spectrometry (LC-MS/MS). Use of semi-automated LLE with LC-MS/MS proved to be a very rugged and reliable approach for analysis of more than 6200 clinical study samples. The lower limit of quantification was validated at 0.010, 0.010, and 1.0 ng/mL of plasma for DEX, DET, and GG, respectively. Accuracy and precision of quality control (QC) samples for all three analytes met FDA Guidance criteria of +/-15% for average QC accuracy with coefficients of variation less than 15%. Data from the thorough evaluation of the method during development, validation, and application are presented to characterize selectivity, linearity, over-range sample analysis, accuracy, precision, autosampler carry-over, ruggedness, extraction efficiency, ionization suppression, and stability. Pharmacokinetic data are also provided to illustrate improvements in systemic drug and metabolite concentration-time profiles that were achieved by formulation optimization.
Disk diffusion quality control guidelines for NVP-PDF 713: a novel peptide deformylase inhibitor.
Anderegg, Tamara R; Jones, Ronald N
2004-01-01
NVP-PDF713 is a peptide deformylase inhibitor that has emerged as a candidate for treating Gram-positive infections and selected Gram-negative species that commonly cause community-acquired respiratory tract infections. This report summarizes the results of a multi-center (seven participants) disk diffusion quality control (QC) investigation for NVP PDF-713 using guidelines of the National Committee for Clinical Laboratory Standards and the standardized disk diffusion method. A total of 420 NVP-PDF 713 zone diameter values were generated for each QC organism. The proposed zone diameter ranges contained 97.6-99.8% of the reported participant results and were: Staphylococcus aureus ATCC 25923 (25-35 mm), Streptococcus pneumoniae ATCC 49619 (30-37 mm), and Haemophilus influenzae ATCC 49247 (24-32 mm). These QC criteria for the disk diffusion method should be applied during the NVP-PDF 713 clinical trials to maximize test accuracy.
Lean Six Sigma in Health Care: Improving Utilization and Reducing Waste.
Almorsy, Lamia; Khalifa, Mohamed
2016-01-01
Healthcare costs have been increasing worldwide, mainly due to over-utilization of resources. The savings potentially achievable from systematic, comprehensive, and cooperative reduction in waste are far higher than those from more direct and blunter cuts in care and coverage. At King Faisal Specialist Hospital and Research Center, inappropriate use and over-utilization of the glucose test strips used for whole-blood glucose determination with glucometers was observed. The hospital implemented a project to improve their utilization. Using the Six Sigma DMAIC approach (Define, Measure, Analyze, Improve, and Control), an efficient practice was put in place, including updating the related internal policies and procedures and properly implementing an effective user training and competency check-off program. This resulted in decreasing the unnecessary Quality Control (QC) runs from 13% to 4%, decreasing the failed QC runs from 14% to 7%, and lowering the QC-to-patient-testing ratio from 24/76 to 19/81.
Fatemi, Ali; Taghizadeh, Somayeh; Yang, Claus Chunli; R Kanakamedala, Madhava; Morris, Bart; Vijayakumar, Srinivasan
2017-12-18
Purpose Magnetic resonance (MR) images are necessary for accurate contouring of intracranial targets, determination of gross target volume and evaluation of organs at risk during stereotactic radiosurgery (SRS) treatment planning procedures. Many centers use magnetic resonance imaging (MRI) simulators or regular diagnostic MRI machines for SRS treatment planning; while both types of machine require two stages of quality control (QC), both machine- and patient-specific, before use for SRS, no accepted guidelines for such QC currently exist. This article describes appropriate machine-specific QC procedures for SRS applications. Methods and materials We describe the adaptation of American College of Radiology (ACR)-recommended QC tests using an ACR MRI phantom for SRS treatment planning. In addition, commercial Quasar MRID3D and Quasar GRID3D phantoms were used to evaluate the effects of static magnetic field (B0) inhomogeneity, gradient nonlinearity, and a Leksell G frame (SRS frame) and its accessories on geometrical distortion in MR images. Results QC procedures found distortions in the X-direction (maximum = 3.5 mm, mean = 0.91 mm, standard deviation = 0.67 mm, >2.5 mm (%) = 2), in the Y-direction (maximum = 2.51 mm, mean = 0.52 mm, standard deviation = 0.39 mm, >2.5 mm (%) = 0), and in the Z-direction (maximum = 13.1 mm, mean = 2.38 mm, standard deviation = 2.45 mm, >2.5 mm (%) = 34), with <1 mm distortion at a head-sized region of interest. MR images acquired using a Leksell G frame and localization devices showed a mean absolute deviation of 2.3 mm from isocenter. The results of modified ACR tests were all within recommended limits, and baseline measurements have been defined for regular weekly QC tests. Conclusions With appropriate QC procedures in place, it is possible to routinely obtain clinically useful MR images suitable for SRS treatment planning purposes. MRI examination for SRS planning can benefit from the improved localization and planning possible with the superior image quality and soft tissue contrast achieved under optimal conditions.
NASA Astrophysics Data System (ADS)
Saavedra, Juan Alejandro
Quality Control (QC) and Quality Assurance (QA) strategies vary significantly across industries in the manufacturing sector depending on the product being built. Such strategies range from simple statistical analysis and process controls to decision-making processes for reworking, repairing, or scrapping defective product. This study proposes an optimal QC methodology that incorporates rework stations into the manufacturing process by identifying the number and location of these workstations. The factors considered to optimize these stations are cost, cycle time, reworkability, and rework benefit. The goal is to minimize the cost and cycle time of the process while increasing reworkability and rework benefit. The specific objectives of this study are: (1) to propose a cost estimation model that includes energy consumption, and (2) to propose an optimal QC methodology to identify the quantity and location of rework workstations. The cost estimation model includes energy consumption as part of the product direct cost and allows the user to recalculate product direct cost as the quality sigma level of the process changes. This is a benefit because a complete cost estimation does not need to be performed every time the process yield changes. This cost estimation model is then used in the QC strategy optimization. To propose a methodology that provides an optimal QC strategy, the possible factors that affect QC were evaluated. A screening Design of Experiments (DOE) was performed on seven initial factors and identified three significant factors; it also showed that one response variable was not required for the optimization process. A full factorial DOE was then performed to verify the significant factors obtained previously. The QC strategy optimization is performed with a Genetic Algorithm (GA), which evaluates many candidate solutions in order to obtain feasible optimal ones. The GA scores possible solutions based on cost, cycle time, reworkability, and rework benefit and, because this is a multi-objective optimization problem, returns several possible solutions. The solutions are presented as chromosomes that state the number and location of the rework stations. The user then selects one solution by deciding which of the four factors is most important for the product being manufactured or the company's objectives. The major contribution of this study is a methodology for identifying an effective and optimal QC strategy that incorporates the number and location of rework substations in order to minimize direct product cost and cycle time while maximizing reworkability and rework benefit.
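Because the abstract describes the optimization only at a high level, the following Python sketch shows the general shape of a genetic algorithm over rework-station chromosomes. The fitness model, weights, and station count are invented stand-ins for the study's cost, cycle time, reworkability, and rework benefit factors; the study's actual multi-objective GA is not reproduced here.

# Toy GA for choosing rework-station locations (illustrative only).
import random

N_STATIONS = 10                      # candidate positions along the line (assumed)
W = dict(cost=-1.0, time=-0.5, reworkability=2.0, benefit=1.5)  # assumed weights

def fitness(chrom):
    n = sum(chrom)                                   # number of rework stations
    cost, cycle_time = 5.0 * n, 2.0 * n              # toy linear cost/time models
    reworkability = sum(i + 1 for i, g in enumerate(chrom) if g) / N_STATIONS
    benefit = 3.0 * reworkability
    return (W['cost'] * cost + W['time'] * cycle_time +
            W['reworkability'] * reworkability + W['benefit'] * benefit)

def evolve(pop=30, gens=50):
    population = [[random.randint(0, 1) for _ in range(N_STATIONS)] for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=fitness, reverse=True)
        parents = population[:pop // 2]              # elitist selection
        children = []
        while len(children) < pop - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, N_STATIONS)    # one-point crossover
            child = a[:cut] + b[cut:]
            i = random.randrange(N_STATIONS)         # point mutation
            child[i] ^= 1
            children.append(child)
        population = parents + children
    return max(population, key=fitness)

print("best chromosome (1 = rework station):", evolve())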
Huang, Kai-Fa; Liaw, Su-Sen; Huang, Wei-Lin; Chia, Cho-Yun; Lo, Yan-Chung; Chen, Yi-Ling; Wang, Andrew H-J
2011-04-08
Aberrant pyroglutamate formation at the N terminus of certain peptides and proteins, catalyzed by glutaminyl cyclases (QCs), is linked to some pathological conditions, such as Alzheimer disease. Recently, a glutaminyl cyclase (QC) inhibitor, PBD150, was shown to be able to reduce the deposition of pyroglutamate-modified amyloid-β peptides in brain of transgenic mouse models of Alzheimer disease, leading to a significant improvement of learning and memory in those transgenic animals. Here, we report the 1.05-1.40 Å resolution structures, solved by the sulfur single-wavelength anomalous dispersion phasing method, of the Golgi-luminal catalytic domain of the recently identified Golgi-resident QC (gQC) and its complex with PBD150. We also describe the high-resolution structures of secretory QC (sQC)-PBD150 complex and two other gQC-inhibitor complexes. gQC structure has a scaffold similar to that of sQC but with a relatively wider and negatively charged active site, suggesting a distinct substrate specificity from sQC. Upon binding to PBD150, a large loop movement in gQC allows the inhibitor to be tightly held in its active site primarily by hydrophobic interactions. Further comparisons of the inhibitor-bound structures revealed distinct interactions of the inhibitors with gQC and sQC, which are consistent with the results from our inhibitor assays reported here. Because gQC and sQC may play different biological roles in vivo, the different inhibitor binding modes allow the design of specific inhibitors toward gQC and sQC.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ford, Eric C., E-mail: eford@uw.edu; Terezakis, Stephanie; Souranis, Annette
Purpose: To quantify the error-detection effectiveness of commonly used quality control (QC) measures. Methods: We analyzed incidents from 2007-2010 logged into voluntary in-house, electronic incident learning systems at 2 academic radiation oncology clinics. None of the incidents resulted in patient harm. Each incident was graded for potential severity using the French Nuclear Safety Authority scoring scale; high potential severity incidents (score >3) were considered, along with a subset of 30 randomly chosen low severity incidents. Each report was evaluated to identify which of 15 common QC checks could have detected it. The effectiveness was calculated, defined as the percentage of incidents that each QC measure could detect, both for individual QC checks and for combinations of checks. Results: In total, 4407 incidents were reported, 292 of which had high potential severity. High- and low-severity incidents were detectable by 4.0 ± 2.3 (mean ± SD) and 2.6 ± 1.4 QC checks, respectively (P<.001). All individual checks were less than 50% sensitive with the exception of pretreatment plan review by a physicist (63%). An effectiveness of 97% was achieved with 7 checks used in combination and was not further improved with more checks. The combination of checks with the highest effectiveness includes physics plan review, physician plan review, Electronic Portal Imaging Device-based in vivo portal dosimetry, radiation therapist timeout, weekly physics chart check, the use of checklists, port films, and source-to-skin distance checks. Some commonly used QC checks, such as pretreatment intensity modulated radiation therapy QA, do not substantially add to the ability to detect errors in these data. Conclusions: The effectiveness of QC measures in radiation oncology depends sensitively on which checks are used and in which combinations. A small percentage of errors cannot be detected by any of the standard formal QC checks currently in broad use, suggesting that further improvements are needed. These data require confirmation with a broader incident-reporting database.
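The effectiveness calculation described above (the percentage of incidents detectable by a check or a combination of checks) can be made concrete with a toy detection matrix. In the Python sketch below, the incidents, check names, and detection relationships are invented for illustration; they are not the study's data.

# Hypothetical sketch of QC-check effectiveness over an incident log.
from itertools import combinations

# incident -> set of checks that would have detected it (toy data)
detects = {
    "inc1": {"physics_review", "timeout"},
    "inc2": {"physician_review"},
    "inc3": {"in_vivo_dosimetry", "physics_review"},
    "inc4": set(),                      # undetectable by any formal check
    "inc5": {"weekly_chart_check"},
}
checks = sorted(set().union(*detects.values()))

def effectiveness(combo):
    """Percentage of incidents caught by at least one check in the combination."""
    caught = sum(1 for hits in detects.values() if hits & set(combo))
    return 100.0 * caught / len(detects)

# exhaustive search over all pairs of checks
best_pair = max(combinations(checks, 2), key=effectiveness)
print(best_pair, f"{effectiveness(best_pair):.0f}% of incidents detectable")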
Production of latex agglutination reagents for pneumococcal serotyping
2013-01-01
Background The current ‘gold standard’ for serotyping pneumococci is the Quellung test. This technique is laborious and requires a certain level of training to correctly perform. Commercial pneumococcal latex agglutination serotyping reagents are available, but these are expensive. In-house production of latex agglutination reagents can be a cost-effective alternative to using commercially available reagents. This paper describes a method for the production and quality control (QC) of latex reagents, including problem solving recommendations, for pneumococcal serotyping. Results Here we describe a method for the production of latex agglutination reagents based on the passive adsorption of antibodies to latex particles. Sixty-five latex agglutination reagents were made using the PneuCarriage Project (PCP) method, of which 35 passed QC. The other 30 reagents failed QC due to auto-agglutination (n=2), no reactivity with target serotypes (n=8) or cross-reactivity with non-target serotypes (n=20). Dilution of antisera resulted in a further 27 reagents passing QC. The remaining three reagents passed QC when prepared without centrifugation and wash steps. Protein estimates indicated that latex reagents that failed QC when prepared using the PCP method passed when made with antiserum containing ≤ 500 μg/ml of protein. Sixty-one nasopharyngeal isolates were serotyped with our in-house latex agglutination reagents, with the results showing complete concordance with the Quellung reaction. Conclusions The method described here to produce latex agglutination reagents allows simple and efficient serotyping of pneumococci and may be applicable to latex agglutination reagents for typing or identification of other microorganisms. We recommend diluting antisera or removing centrifugation and wash steps for any latex reagents that fail QC. Our latex reagents are cost-effective, technically undemanding to prepare and remain stable for long periods of time, making them ideal for use in low-income countries.
Rossum, Huub H van; Kemperman, Hans
2017-07-26
General application of a moving average (MA) as continuous analytical quality control (QC) for routine chemistry assays has failed due to the lack of a simple method that allows optimization of MAs. A new method was applied to optimize the MA for routine chemistry and was evaluated in daily practice as a continuous analytical QC instrument. MA procedures were optimized using an MA bias detection simulation procedure, with optimization graphically supported by bias detection curves. Next, all optimal MA procedures that contributed to quality assurance were run for 100 consecutive days, and MA alarms generated during working hours were investigated. Optimized MA procedures were applied for 24 chemistry assays. During this evaluation, 303,871 MA values and 76 MA alarms were generated. Of all alarms, 54 (71%) were generated during office hours. Of these, 41 were further investigated; they were caused by ion selective electrode (ISE) failure (1), calibration failure not detected by QC due to improper QC settings (1), possible bias (significant difference from the other analyzer) (10), non-human materials analyzed (2), extreme result(s) of a single patient (2), pre-analytical error (1), no cause identified (20), and no conclusion possible (4). MA was implemented in daily practice as a continuous QC instrument for 24 routine chemistry assays. In our setup, requiring follow-up of MA alarms generated a manageable number of alarms, many of which proved valuable. For the management of MA alarms, several applications/requirements in the MA management software would simplify the use of MA procedures.
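As a schematic of how such a moving-average procedure operates in software, the sketch below raises an alarm when the MA of recent patient results leaves a control interval. The window size, control limits, and sodium-like data are invented; the paper's actual bias-detection-simulation optimization is not implemented here.

# Illustrative moving-average (MA) QC monitor with assumed parameters.
from collections import deque

def ma_monitor(results, window=20, low=135.0, high=145.0):
    """Return (index, MA value) for every point at which the MA leaves [low, high]."""
    buf, alarms = deque(maxlen=window), []
    for i, x in enumerate(results):
        buf.append(x)
        if len(buf) == window:
            ma = sum(buf) / window
            if not (low <= ma <= high):
                alarms.append((i, round(ma, 2)))
    return alarms

# e.g. sodium-like results with a late positive bias (values invented)
data = [140 + (0.2 * (i - 60) if i > 60 else 0) for i in range(100)]
print(ma_monitor(data))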
MO-AB-210-03: Workshop [Advancements in high intensity focused ultrasound]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lu, Z.
The goal of this ultrasound hands-on workshop is to demonstrate advancements in high intensity focused ultrasound (HIFU) and to demonstrate quality control (QC) testing in diagnostic ultrasound. HIFU is a therapeutic modality that uses ultrasound waves as carriers of energy. HIFU is used to focus a beam of ultrasound energy into a small volume at specific target locations within the body. The focused beam causes localized high temperatures and produces a well-defined region of necrosis. This completely non-invasive technology has great potential for tumor ablation and targeted drug delivery. At the workshop, attendees will see configurations, applications, and hands-on demonstrations with on-site instructors at separate stations. The involvement of medical physicists in diagnostic ultrasound imaging service is increasing due to QC and accreditation requirements. At the workshop, an array of ultrasound testing phantoms and ultrasound scanners will be provided for attendees to learn diagnostic ultrasound QC in a hands-on environment with live demonstrations of the techniques. Target audience: Medical physicists and other medical professionals in diagnostic imaging and radiation oncology with interest in high-intensity focused ultrasound and in diagnostic ultrasound QC. Learning Objectives: Learn ultrasound physics and safety for HIFU applications through live demonstrations; get an overview of the state-of-the-art in HIFU technologies and equipment; gain familiarity with common elements of a quality control program for diagnostic ultrasound imaging; identify QC tools available for testing diagnostic ultrasound systems and learn how to use these tools. List of supporting vendors for the HIFU and diagnostic ultrasound QC hands-on workshop: Philips Healthcare; Alpinion Medical Systems; Verasonics, Inc; Zonare Medical Systems, Inc; Computerized Imaging Reference Systems (CIRS), Inc.; GAMMEX, Inc.; Cablon Medical BV. Steffen Sammet: NIH/NCI grant 5R25CA132822, NIH/NINDS grant 5R25NS080949 and Philips Healthcare research grant C32.
NASA Astrophysics Data System (ADS)
Susskind, J.; Rosenberg, R. I.
2016-12-01
The GEOS-5 Data Assimilation System (DAS) generates a global analysis every six hours by combining the previous six-hour forecast for that time period with contemporaneous observations. These observations include in-situ observations as well as those taken by satellite-borne instruments, such as AIRS/AMSU on EOS Aqua and CrIS/ATMS on S-NPP. Operational data assimilation methodology assimilates observed channel radiances Ri for IR sounding instruments such as AIRS and CrIS, but only for those channels i in a given scene whose radiances are thought to be unaffected by clouds. A limitation of this approach is that radiances in most tropospheric sounding channels are affected by clouds under partial cloud cover, which occurs most of the time. The AIRS Science Team Version-6 retrieval algorithm generates cloud-cleared radiances (CCRs) for each channel in a given scene, which represent the radiances AIRS would have observed if the scene were cloud free, and then uses them to determine quality controlled (QC'd) temperature profiles T(p) under all cloud conditions. There are potential advantages to assimilating either AIRS QC'd CCRs or QC'd T(p) instead of Ri, in that the spatial coverage of observations is greater under partial cloud cover. We tested these two alternate data assimilation approaches by running three parallel data assimilation experiments over different time periods using GEOS-5. Experiment 1 assimilated all observations as done operationally, Experiment 2 assimilated QC'd values of AIRS CCRs in place of AIRS radiances, and Experiment 3 assimilated QC'd values of T(p) in place of observed radiances. Assimilation of QC'd AIRS T(p) resulted in significantly improved seven-day forecast skill compared to assimilation of CCRs or of observed radiances, especially in the Southern Hemisphere extratropics.
Nowik, Patrik; Bujila, Robert; Poludniowski, Gavin; Fransson, Annette
2015-07-08
The purpose of this study was to develop a method of performing routine periodical quality controls (QC) of CT systems by automatically analyzing key performance indicators (KPIs) obtainable from images of manufacturers' quality assurance (QA) phantoms. A KPI pertains to a measurable or determinable QC parameter that is influenced by other underlying fundamental QC parameters. The established KPIs are based on relationships between existing QC parameters used in the annual testing program of CT scanners at the Karolinska University Hospital in Stockholm, Sweden. The KPIs include positioning, image noise, uniformity, homogeneity, the CT number of water, and the CT number of air. An application (MonitorCT) was developed to automatically evaluate phantom images in terms of the established KPIs. The developed methodology has been used for two years in clinical routine, where CT technologists perform daily scans of the manufacturer's QA phantom and automatically send the images to MonitorCT for KPI evaluation. In cases where results were out of tolerance, actions could be initiated in less than 10 minutes. In total, 900 QC scans from two CT scanners were collected and analyzed over the two-year period that MonitorCT has been active. Two types of errors were registered in this period: a ring artifact was discovered with the image noise test, and a calibration error was detected multiple times with the CT number test. In both cases, results were outside the tolerances defined for MonitorCT, as well as by the vendor. Automated monitoring of KPIs is a powerful tool that can be used to supplement established QC methodologies. Medical physicists and other professionals concerned with the performance of a CT system will, using such methods, have access to comprehensive data on the current and historical (trend) status of the system, such that swift actions can be taken to ensure the quality of CT examinations, patient safety, and minimal disruption of service.
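To illustrate the kind of automated KPI evaluation an application like MonitorCT performs, the sketch below checks one day's phantom-derived KPI values against tolerance bands. The KPI names, limits, and measurements are assumptions for illustration, not the hospital's configured values.

# Hypothetical daily KPI tolerance check for a CT QA phantom scan.
TOLERANCES = {                           # KPI: (low, high), assumed limits
    "ct_number_water": (-4.0, 4.0),      # HU
    "ct_number_air": (-1005.0, -995.0),  # HU
    "image_noise": (0.0, 6.0),           # HU (SD in a water ROI)
    "uniformity": (-2.0, 2.0),           # HU centre-periphery difference
}

def evaluate(measurements):
    """Return a list of (kpi, value, status) rows for one daily phantom scan."""
    report = []
    for kpi, (lo, hi) in TOLERANCES.items():
        v = measurements[kpi]
        report.append((kpi, v, "OK" if lo <= v <= hi else "OUT OF TOLERANCE"))
    return report

scan = {"ct_number_water": 1.2, "ct_number_air": -998.3,
        "image_noise": 7.1, "uniformity": 0.4}      # invented daily values
for row in evaluate(scan):
    print("%-16s %8.1f  %s" % row)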
NASA Astrophysics Data System (ADS)
Chan, S.; Billesbach, D. P.; Hanson, C. V.; Biraud, S.
2014-12-01
The AmeriFlux quality assurance and quality control (QA/QC) technical team conducts short-term (<2 weeks) intercomparisons using a portable eddy covariance system (PECS) to maintain high-quality data observations and data consistency across the AmeriFlux network (http://ameriflux.lbl.gov/). Site intercomparisons identify discrepancies between the in situ and portable measurements and calculated fluxes. Findings are jointly discussed by the site staff and the QA/QC team to improve the in situ observations. Despite the relatively short duration of an individual site intercomparison, the accumulated record of all site visits (numbering over 100 since 2002) is a unique dataset. The ability to deploy redundant sensors provides a rare opportunity to identify, quantify, and understand uncertainties in eddy covariance and ancillary measurements. We present a few specific case studies from QA/QC site visits to highlight and share new and relevant findings related to eddy covariance instrumentation and operation.
Brummel, Olaf; Waidhas, Fabian; Bauer, Udo; Wu, Yanlin; Bochmann, Sebastian; Steinrück, Hans-Peter; Papp, Christian; Bachmann, Julien; Libuda, Jörg
2017-07-06
The two valence isomers norbornadiene (NBD) and quadricyclane (QC) enable solar energy storage in a single molecule system. We present a new photoelectrochemical infrared reflection absorption spectroscopy (PEC-IRRAS) experiment, which allows monitoring of the complete energy storage and release cycle by in situ vibrational spectroscopy. Both processes were investigated, the photochemical conversion from NBD to QC using the photosensitizer 4,4'-bis(dimethylamino)benzophenone (Michler's ketone, MK) and the electrochemically triggered cycloreversion from QC to NBD. Photochemical conversion was obtained with characteristic conversion times on the order of 500 ms. All experiments were performed under full potential control in a thin-layer configuration with a Pt(111) working electrode. The vibrational spectra of NBD, QC, and MK were analyzed in the fingerprint region, permitting quantitative analysis of the spectroscopic data. We determined selectivities for both the photochemical conversion and the electrochemical cycloreversion and identified the critical steps that limit the reversibility of the storage cycle.
Individualized Quality Control Plan (IQCP): Is It Value-Added for Clinical Microbiology?
Miller, Melissa B.; Hindler, Janet
2015-01-01
The Center for Medicaid and Medicare Services (CMS) recently published their Individualized Quality Control Plan (IQCP [https://www.cms.gov/regulations-and-guidance/legislation/CLIA/Individualized_Quality_Control_Plan_IQCP.html]), which will be the only option for quality control (QC) starting in January 2016 if laboratories choose not to perform Clinical Laboratory Improvement Act (CLIA) [U.S. Statutes at Large 81(1967):533] default QC. Laboratories will no longer be able to use “equivalent QC” (EQC) or the Clinical and Laboratory Standards Institute (CLSI) standards alone for quality control of their microbiology systems. The implementation of IQCP in clinical microbiology laboratories will most certainly be an added burden, the benefits of which are currently unknown.
NASA Astrophysics Data System (ADS)
Hussmann, Stephan; Lau, Wing Y.; Chu, Terry; Grothof, Markus
2003-07-01
Traditionally, the measuring or monitoring systems of manufacturing industries use sensors, computers and screens for quality control (Q.C.). The acquired information is fed back to the control room by wires, which - for obvious reasons - are not suitable in many environments. This paper describes a method to solve this problem by employing the new Bluetooth technology to set up a completely new system, in which a total wireless solution is made feasible. This new Q.C. system allows several line scan cameras to be connected at once to a graphical user interface (GUI) that can monitor the production process. There are many Bluetooth devices available on the market, such as cell-phones, headsets, printers, PDAs, etc. However, the detailed application is a novel implementation in the industrial Q.C. area. This paper contains more details about the Bluetooth standard and why it is used (network topologies, host controller interface, data rates, etc.), the Bluetooth implementation in the microcontroller of the line scan camera, and the GUI and its features.
Analysis of glycoprotein processing in the endoplasmic reticulum using synthetic oligosaccharides.
Ito, Yukishige; Takeda, Yoichi
2012-01-01
Protein quality control (QC) in the endoplasmic reticulum (ER) comprises many steps, including folding and transport of nascent proteins as well as degradation of misfolded proteins. Recent studies have revealed that high-mannose-type glycans play a pivotal role in the QC process. To gain knowledge about the molecular basis of this process using well-defined homogeneous compounds, we achieved a convergent synthesis of high-mannose-type glycans and their functionalized derivatives. We focused on analyses of UDP-Glc:glycoprotein glucosyltransferase (UGGT) and ER glucosidase II, which play crucial roles in glycoprotein QC but whose specificities remain unclear. In addition, we established an in vitro assay system mimicking the in vivo condition, which is highly crowded because of the presence of various biomacromolecules.
Wei, Ling; Shi, Jianfeng; Afari, George; Bhattacharyya, Sibaprasad
2014-01-01
Panitumumab is a fully human monoclonal antibody approved for the treatment of epidermal growth factor receptor (EGFR) positive colorectal cancer. Recently, panitumumab has been radiolabeled with 89Zr and evaluated for its potential to be used as an immuno-positron emission tomography (PET) probe for EGFR-positive cancers. Interesting preclinical results published by several groups of researchers have prompted us to develop a robust procedure for producing clinical-grade 89Zr-panitumumab as an immuno-PET probe to evaluate EGFR-targeted therapy. In this process, clinical-grade panitumumab is bio-conjugated with desferrioxamine chelate and subsequently radiolabeled with 89Zr, resulting in high radiochemical yield (>70%, n=3) and purity (>98%, n=3). All quality control (QC) tests were performed according to United States Pharmacopeia specifications. QC tests showed that 89Zr-panitumumab met all specifications for human injection. Herein, we describe a step-by-step method for the facile synthesis and QC tests of 89Zr-panitumumab for medical use. The entire process of bioconjugation, radiolabeling, and all QC tests takes about 5 h. Because the synthesis is fully manual, two rapid, in-process QC tests have been introduced to make the procedure robust and error free.
Li, Chunhua; Lu, Ling; Wu, Xianghong; Wang, Chuanxi; Bennett, Phil; Lu, Teng; Murphy, Donald
2009-08-01
In this study, we characterized the full-length genomic sequences of 13 distinct hepatitis C virus (HCV) genotype 4 isolates/subtypes: QC264/4b, QC381/4c, QC382/4d, QC193/4g, QC383/4k, QC274/4l, QC249/4m, QC97/4n, QC93/4o, QC139/4p, QC262/4q, QC384/4r and QC155/4t. These were amplified, using RT-PCR, from the sera of patients now residing in Canada, 11 of whom were African immigrants. The resulting genomes varied between 9421 and 9475 nt in length, and each contains a single ORF of 9018-9069 nt. The sequences showed nucleotide similarities of 77.3-84.3 % in comparison with subtypes 4a (GenBank accession no. Y11604) and 4f (EF589160) and 70.6-72.8 % in comparison with genotype 1 (M62321/1a, M58335/1b, D14853/1c, and 1?/AJ851228) reference sequences. These similarities were often higher than those currently defined by HCV classification criteria for subtype (75.0-80.0 %) and genotype (67.0-70.0 %) division, respectively. Further analyses of the complete and partial E1 and partial NS5B sequences confirmed these 13 'provisionally assigned subtypes'.
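The subtype and genotype similarity thresholds quoted above imply a simple decision rule, sketched in toy form below (thresholds from the abstract; the comparison values are invented). Note that the abstract's point is precisely that some observed similarities exceed these thresholds across distinct subtypes, so real classification also requires phylogenetic analysis.

# Toy mapping of pairwise nucleotide similarity onto a provisional call.
def classify(similarity_pct):
    if similarity_pct >= 75.0:      # 75.0-80.0% region: subtype boundary
        return "same subtype (or near the subtype boundary)"
    if similarity_pct >= 67.0:      # 67.0-70.0% region: genotype boundary
        return "same genotype, different subtype (or near the boundary)"
    return "different genotype"

for s in (84.3, 77.3, 71.5, 65.0):  # invented example similarities
    print(f"{s:.1f}% -> {classify(s)}")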
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lawrence Livermore National Laboratory
2009-12-09
QC sample results (daily background checks, 20-gram and 100-gram SGS drum checks) were within acceptable criteria established by WIPP's Quality Assurance Objectives for TRU Waste Characterization. Replicate runs were performed on 5 drums with IDs LL85101099TRU, LL85801147TRU, LL85801109TRU, LL85300999TRU and LL85500979TRU. All replicate measurement results are identical at the 95% confidence level as established by WIPP criteria. Note that the batch covered 5 weeks of SGS measurements from 23-Jan-2002 through 22-Feb-2002. The data packet for SGS Batch 2002-02 generated using gamma spectroscopy with the Pu Facility SGS unit is technically reasonable. All QC samples are in compliance with established control limits. The batch data packet has been reviewed for correctness, completeness, consistency and compliance with WIPP's Quality Assurance Objectives and determined to be acceptable. An Expert Review was performed on the data packet between 28-Feb-02 and 09-Jul-02 to check for potential U-235, Np-237 and Am-241 interferences and address drum cases where specific scan segments showed Se gamma ray transmissions for the 136-keV gamma to be below 0.1%. Two drums in the batch showed Pu-238 at a relative mass ratio more than 2% of all the Pu isotopes.
Duo, Jia; Dong, Huijin; DeSilva, Binodh; Zhang, Yan J
2013-07-01
Sample dilution and reagent pipetting are time-consuming steps in ligand-binding assays (LBAs). Traditional automation-assisted LBAs use assay-specific scripts that require labor-intensive script writing and user training. Five major script modules were developed in Tecan Freedom EVO liquid handling software to facilitate automated sample preparation and LBA procedures: sample dilution, sample minimum required dilution, standard/QC minimum required dilution, standard/QC/sample addition, and reagent addition. The modular design of the automation scripts allows users to assemble an automated assay with minimal script modification. The application of the template was demonstrated in three LBAs supporting discovery biotherapeutic programs. The results demonstrated that the modular scripts provided flexibility in adapting to various LBA formats and significant time savings in script writing and scientist training. Data generated by the automated process were comparable to those from the manual process, while bioanalytical productivity was significantly improved using the modular robotic scripts.
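As a flavor of what one such script module computes, the sketch below splits a total dilution factor into liquid-handler-sized steps. The function, step limit, and values are hypothetical illustrations, not Tecan EVOware script code.

# Hypothetical helper: split a minimum-required-dilution factor into steps
# that a liquid handler can pipette in one transfer (max 1:20 assumed).
def dilution_steps(target_df, max_step=20):
    """Return a serial-dilution plan whose product equals target_df."""
    steps = []
    while target_df > max_step:
        steps.append(max_step)
        target_df /= max_step
    steps.append(round(target_df, 2))
    return steps

print(dilution_steps(400))   # e.g. a 1:400 MRD -> [20, 20.0]
print(dilution_steps(50))    # -> [20, 2.5]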
From field notes to data portal - An operational QA/QC framework for tower networks
NASA Astrophysics Data System (ADS)
Sturtevant, C.; Hackley, S.; Meehan, T.; Roberti, J. A.; Holling, G.; Bonarrigo, S.
2016-12-01
Quality assurance and control (QA/QC) is one of the most important yet challenging aspects of producing research-quality data. This is especially so for environmental sensor networks collecting numerous high-frequency measurement streams at distributed sites. Here, the quality issues are multi-faceted, including sensor malfunctions, unmet theoretical assumptions, and measurement interference from the natural environment. To complicate matters, there are often multiple personnel managing different sites or different steps in the data flow. For large, centrally managed sensor networks such as NEON, the separation of field and processing duties is in the extreme. Tower networks such as Ameriflux, ICOS, and NEON continue to grow in size and sophistication, yet tools for robust, efficient, scalable QA/QC have lagged. Quality control remains a largely manual process relying on visual inspection of the data. In addition, notes of observed measurement interference or visible problems are often recorded on paper without an explicit pathway to data flagging during processing. As such, an increase in network size requires a near-proportional increase in personnel devoted to QA/QC, quickly stressing the human resources available. There is a need for a scalable, operational QA/QC framework that combines the efficiency and standardization of automated tests with the power and flexibility of visual checks, and includes an efficient communication pathway from field personnel to data processors to end users. Here we propose such a framework and an accompanying set of tools in development, including a mobile application template for recording tower maintenance and an R/shiny application for efficiently monitoring and synthesizing data quality issues. This framework seeks to incorporate lessons learned from the Ameriflux community and provide tools to aid continued network advancements.
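To illustrate the automated side of such a framework, the sketch below runs two tests of the kind commonly applied to high-frequency tower data: a plausibility-range test and a step/spike test. The thresholds and variable names are assumptions for illustration, not NEON's or AmeriFlux's published algorithms.

# Illustrative automated QA/QC flagging for a sensor time series.
def flag_series(values, lo, hi, max_step):
    """Return a set of flags ('range', 'spike') for each observation."""
    flags = []
    for i, v in enumerate(values):
        f = set()
        if not (lo <= v <= hi):
            f.add("range")                           # outside plausible limits
        if i > 0 and abs(v - values[i - 1]) > max_step:
            f.add("spike")                           # implausible step change
        flags.append(f)
    return flags

air_temp = [21.1, 21.2, 35.9, 21.3, 21.2, -45.0]     # degC, invented
for t, f in zip(air_temp, flag_series(air_temp, -40, 50, 5)):
    print(t, sorted(f) or "ok")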
Integrative Blood Pressure Response to Upright Tilt Post Renal Denervation
Howden, Erin J.; East, Cara; Lawley, Justin S.; Stickford, Abigail S.L.; Verhees, Myrthe; Fu, Qi
2017-01-01
BACKGROUND Whether renal denervation (RDN) in patients with resistant hypertension normalizes blood pressure (BP) regulation in response to routine cardiovascular stimuli such as upright posture is unknown. We conducted an integrative study of BP regulation in patients with resistant hypertension who had received RDN to characterize autonomic circulatory control. METHODS Twelve patients (60 ± 9 [SD] years, n = 10 males) who participated in the Symplicity HTN-3 trial were studied and compared to 2 age-matched normotensive (Norm) and hypertensive (unmedicated, HTN) control groups. BP, heart rate (HR), cardiac output (Qc), muscle sympathetic nerve activity (MSNA), and neurohormonal variables were measured supine, and 30° (5 minutes) and 60° (20 minutes) head-up-tilt (HUT). Total peripheral resistance (TPR) was calculated from mean arterial pressure and Qc. RESULTS Despite treatment with RDN and 4.8 (range, 3–7) antihypertensive medications, the RDN had significantly higher supine systolic BP compared to Norm and HTN (149 ± 15 vs. 118 ± 6, 108 ± 8 mm Hg, P < 0.001). When supine, RDN had higher HR, TPR, MSNA, plasma norepinephrine, and effective arterial elastance compared to Norm. Plasma norepinephrine, Qc, and HR were also higher in the RDN vs. HTN. During HUT, BP remained higher in the RDN, due to increases in Qc, plasma norepinephrine, and aldosterone. CONCLUSION We provide evidence of a possible mechanism by which BP remains elevated post RDN, with the observation of increased Qc and arterial stiffness, as well as plasma norepinephrine and aldosterone levels at approximately 2 years post treatment. These findings may be the consequence of incomplete ablation of sympathetic renal nerves or be related to other factors.
NASA Astrophysics Data System (ADS)
Christianson, D. S.; Beekwilder, N.; Chan, S.; Cheah, Y. W.; Chu, H.; Dengel, S.; O'Brien, F.; Pastorello, G.; Sandesh, M.; Torn, M. S.; Agarwal, D.
2017-12-01
AmeriFlux is a network of scientists who independently collect eddy covariance and related environmental observations at over 250 locations across the Americas. As part of the AmeriFlux Management Project, the AmeriFlux Data Team manages standardization, collection, quality assurance / quality control (QA/QC), and distribution of data submitted by network members. To generate data products that are timely, QA/QC'd, and repeatable, and have traceable provenance, we developed a semi-automated data processing pipeline. The new pipeline consists of semi-automated format and data QA/QC checks. Results are communicated via on-line reports as well as an issue-tracking system. Data processing time has been reduced from 2-3 days to a few hours of manual review time, resulting in faster data availability from the time of data submission. The pipeline is scalable to the network level and has the following key features. (1) On-line results of the format QA/QC checks are available immediately for data provider review. This enables data providers to correct and resubmit data quickly. (2) The format QA/QC assessment includes an automated attempt to fix minor format errors. Data submissions that are formatted in the new AmeriFlux FP-In standard can be queued for the data QA/QC assessment, often with minimal delay. (3) Automated data QA/QC checks identify and communicate potentially erroneous data via online, graphical quick views that highlight observations with unexpected values, incorrect units, time drifts, invalid multivariate correlations, and/or radiation shadows. (4) Progress through the pipeline is integrated with an issue-tracking system that facilitates communications between data providers and the data processing team in an organized and searchable fashion. Through development of these and other features of the pipeline, we present solutions to challenges that include optimizing automated with manual processing, bridging legacy data management infrastructure with various software tools, and working across interdisciplinary and international science cultures. Additionally, we discuss results from community member feedback that helped refine QA/QC communications for efficient data submission and revision.
Inorganic chemical analysis of environmental materials—A lecture series
Crock, J.G.; Lamothe, P.J.
2011-01-01
At the request of the faculty of the Colorado School of Mines, Golden, Colorado, the authors prepared and presented a lecture series to the students of a graduate-level advanced instrumental analysis class. The slides and text presented in this report are a compilation and condensation of this series of lectures. The purpose of this report is to present the slides and notes and to emphasize the thought processes that should be used by a scientist submitting samples for analysis in order to procure analytical data to answer a research question. First and foremost, the analytical data generated can be no better than the samples submitted. The questions to be answered must first be well defined and the appropriate samples collected from the population that will answer the question. The proper methods of analysis, including proper sample preparation and digestion techniques, must then be applied. Care must be taken to achieve the required limits of detection of the critical analytes, to yield detectable analyte concentrations (above "action" levels) for the majority of the study's samples, and to address what portion of those analytes answers the research question: total or partial concentrations. To guarantee a robust analytical result that answers the research question(s), a well-defined quality assurance and quality control (QA/QC) plan must be employed. This QA/QC plan must include the collection and analysis of field and laboratory blanks, sample duplicates, and matrix-matched standard reference materials (SRMs). The proper SRMs may include in-house materials and/or a selection of widely available commercial materials. A discussion of the preparation and applicability of in-house reference materials is also presented. Only when all these analytical issues are sufficiently addressed can the research questions be answered with known certainty.
Remane, Daniela; Grunwald, Soeren; Hoeke, Henrike; Mueller, Andrea; Roeder, Stefan; von Bergen, Martin; Wissenbach, Dirk K
2015-08-15
During the last decades, exposure sciences and epidemiological studies have attracted increasing attention as a means to unravel the mechanisms underlying the development of chronic diseases. Accordingly, an existing HPLC-DAD method for determination of creatinine in urine samples was extended to seven analytes and validated. Creatinine, uric acid, homovanillic acid, niacinamide, hippuric acid, indole-3-acetic acid, and 2-methylhippuric acid were separated by gradient elution (formate buffer/methanol) using an Eclipse Plus C18 Rapid Resolution column (4.6 mm × 100 mm). No interfering signals were detected in mobile phase. After injection of blank urine samples, signals for the endogenous compounds but no interferences were detected. All analytes were linear in the selected calibration range, and a non-weighted calibration model was chosen. Bias, intra-day and inter-day precision for all analytes were below 20% for quality control (QC) low and below 10% for QC medium and high. The limits of quantification in mobile phase were in line with reported reference values but had to be adjusted in urine for homovanillic acid (45 mg/L), niacinamide (58.5 mg/L), and indole-3-acetic acid (63 mg/L). Comparison of creatinine data obtained by the existing method with those of the developed method showed differences from -120 mg/L to +110 mg/L, with a mean difference of 29.0 mg/L for 50 authentic urine samples. In the analysis of 50 authentic urine samples, uric acid, creatinine, hippuric acid, and 2-methylhippuric acid were detected in (nearly) all samples; homovanillic acid was detected in 40%, niacinamide in 4%, and indole-3-acetic acid in none of the selected samples. Copyright © 2015 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Brenton, James C.; Barbre, Robert E.; Orcutt, John M.; Decker, Ryan K.
2018-01-01
The National Aeronautics and Space Administration's (NASA) Marshall Space Flight Center (MSFC) Natural Environments Branch (EV44) has provided atmospheric databases and analysis in support of space vehicle design and day-of-launch operations for NASA and commercial launch vehicle programs launching from the NASA Kennedy Space Center (KSC), co-located on the United States Air Force's Eastern Range (ER) at the Cape Canaveral Air Force Station. The ER is one of the most heavily instrumented sites in the United States, measuring various atmospheric parameters on a continuous basis. An inherent challenge with the large databases that EV44 receives from the ER is ensuring that erroneous data are removed from the databases and thus excluded from launch vehicle design analyses. EV44 has put forth great effort in developing quality control (QC) procedures for individual meteorological instruments; however, no standard QC procedures for all databases currently exist, resulting in QC databases that have inconsistencies in variables, methodologies, and periods of record. The goal of this activity is to use the previous efforts by EV44 to develop a standardized set of QC procedures from which to build flags within the meteorological databases from KSC and the ER, while maintaining open communication with end users from the launch community to develop ways to improve, adapt and grow the QC database. Details of the QC checks are described. The flagged data points will be plotted in a graphical user interface (GUI) as part of a manual confirmation that the flagged data do indeed need to be removed from the archive. As the rate of launches increases with additional launch vehicle programs, more emphasis is being placed on continually updating and checking weather databases for data quality before use in launch vehicle design and certification analyses.
SU-D-201-04: Evaluation of Elekta Agility MLC Performance Using Statistical Process Control
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meyers, SM; Balderson, MJ; Letourneau, D
2016-06-15
Purpose: to evaluate the performance and stability of the Elekta Agility MLC model using an automated quality control (QC) test in combination with statistical process control tools. Methods: Leaf positions were collected daily for 11 Elekta units over 5-19 months using the automated QC test, which analyzes 23 MV images to determine the location of MLC leaves relative to the radiation isocenter. The leaf positions are measured at 5 nominal positions, and images are acquired at collimator 0° and 180° to capture all MLC leaves in the field-of-view. Leaf positioning accuracy was assessed using individual and moving range control charts. Control limits were recomputed following MLC recalibration (occurred 1-2 times for 4 units). Specification levels of ±0.5, ±1 and ±1.5 mm were tested. The mean and range of duration between out-of-control and out-of-specification events were determined. Results: Leaf position varied little over time, as confirmed by very tight individual control limits (mean ±0.19 mm, range 0.09-0.44). Mean leaf position error was -0.03 mm (range -0.89-0.83). Due to sporadic out-of-control events, the mean in-control duration was 3.3 days (range 1-23). Data stayed within the ±1 mm specification for 205 days on average (range 3-372) and within ±1.5 mm for the entire date range. Measurements stayed within ±0.5 mm for 1 day on average (range 0-17); however, our MLC leaves were not calibrated to this level of accuracy. Conclusion: The Elekta Agility MLC model was found to perform with high stability, as evidenced by the tight control limits. The in-specification durations support the current recommendation of monthly MLC QC tests with a ±1 mm tolerance. Future work is on-going to determine if Agility performance can be optimized further using high-frequency QC test results to drive recalibration frequency. Factors that can affect leaf positioning accuracy, including beam spot motion, leaf gain calibration, drifting leaves, and image artifacts, are under investigation.
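The individual and moving-range control charts referenced above use the standard I-MR chart constants (2.66 for the individuals limits, 3.267 for the moving-range upper limit with n=2 moving ranges). A brief sketch of how such limits are derived, with invented daily leaf-position errors:

# Sketch of individuals (I) and moving-range (MR) control limits.
def imr_limits(x):
    """Return I-chart and MR-chart control limits for a series of individuals."""
    mr = [abs(a - b) for a, b in zip(x[1:], x)]      # successive moving ranges
    mr_bar = sum(mr) / len(mr)
    centre = sum(x) / len(x)
    return {
        "I": (centre - 2.66 * mr_bar, centre + 2.66 * mr_bar),
        "MR": (0.0, 3.267 * mr_bar),
    }

daily_leaf_error_mm = [-0.05, 0.02, -0.01, 0.04, -0.03, 0.00, 0.03, -0.02]  # invented
print(imr_limits(daily_leaf_error_mm))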
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-06
... Establishing Test Procedures for the Analysis of Pollutants Under the Clean Water Act; Analysis and Sampling... for use as an alternative oil and grease method. Some comments were specific to the sampling...-side comparison using the specific procedures (e.g. sampling frequency, number of samples, QA/QC, and...
NASA Astrophysics Data System (ADS)
Amelang, Jeff
The quasicontinuum (QC) method was introduced to coarse-grain crystalline atomic ensembles in order to bridge the scales from individual atoms to the micro- and mesoscales. Though many QC formulations have been proposed with varying characteristics and capabilities, a crucial cornerstone of all QC techniques is the concept of summation rules, which attempt to efficiently approximate the total Hamiltonian of a crystalline atomic ensemble by a weighted sum over a small subset of atoms. In this work we propose a novel, fully-nonlocal, energy-based formulation of the QC method with support for legacy and new summation rules through a general energy-sampling scheme. Our formulation does not conceptually differentiate between atomistic and coarse-grained regions and thus allows for seamless bridging without domain-coupling interfaces. Within this structure, we introduce a new class of summation rules which leverage the affine kinematics of this QC formulation to most accurately integrate thermodynamic quantities of interest. By comparing this new class of summation rules to commonly-employed rules through analysis of energy and spurious force errors, we find that the new rules produce no residual or spurious force artifacts in the large-element limit under arbitrary affine deformation, while allowing us to seamlessly bridge to full atomistics. We verify that the new summation rules exhibit significantly smaller force artifacts and energy approximation errors than all comparable previous summation rules through a comprehensive suite of examples with spatially non-uniform QC discretizations in two and three dimensions. Due to the unique structure of these summation rules, we also use the new formulation to study scenarios with large regions of free surface, a class of problems previously out of reach of the QC method. Lastly, we present the key components of a high-performance, distributed-memory realization of the new method, including a novel algorithm for supporting unparalleled levels of deformation. Overall, this new formulation and implementation allows us to efficiently perform simulations containing an unprecedented number of degrees of freedom with low approximation error.
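For readers unfamiliar with summation rules, the central approximation can be stated compactly. The following generic form (notation assumed here, not quoted from the thesis) shows the total energy over all N atoms approximated by a weighted sum over a small sampling set S:

% Generic QC summation rule (LaTeX); the w_a are the sampling weights a
% summation rule assigns, and consistency requires the weights to sum to N.
E_{\mathrm{tot}}(\mathbf{q}) = \sum_{i=1}^{N} E_i(\mathbf{q})
  \approx \sum_{a \in S} w_a \, E_a(\mathbf{q}),
\qquad \sum_{a \in S} w_a = N.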
Protecting the proteome: Eukaryotic cotranslational quality control pathways
2014-01-01
The correct decoding of messenger RNAs (mRNAs) into proteins is an essential cellular task. The translational process is monitored by several quality control (QC) mechanisms that recognize defective translation complexes in which ribosomes are stalled on substrate mRNAs. Stalled translation complexes occur when defects in the mRNA template, the translation machinery, or the nascent polypeptide arrest the ribosome during translation elongation or termination. These QC events promote the disassembly of the stalled translation complex and the recycling and/or degradation of the individual mRNA, ribosomal, and/or nascent polypeptide components, thereby clearing the cell of improper translation products and defective components of the translation machinery.
The purpose of this SOP is to describe the procedures undertaken to calculate sampling weights. The sampling weights are needed to obtain weighted statistics of the NHEXAS data. This SOP uses data that have been properly coded and certified with appropriate QA/QC procedures by t...
Spectrally high performing quantum cascade lasers
NASA Astrophysics Data System (ADS)
Toor, Fatima
Quantum cascade (QC) lasers are versatile semiconductor light sources that can be engineered to emit light of almost any wavelength in the mid- to far-infrared (IR) and terahertz region from 3 to 300 μm [1-5]. Furthermore, QC laser technology in the mid-IR range has great potential for applications in environmental, medical and industrial trace gas sensing [6-10], since several chemical vapors have strong rovibrational frequencies in this range and are uniquely identifiable by their absorption spectra through optical probing of absorption and transmission. Therefore, having a wide range of mid-IR wavelengths in a single QC laser source would greatly increase the specificity of QC laser-based spectroscopic systems, and also make them more compact and field deployable. This thesis presents work on several different approaches to multi-wavelength QC laser sources that take advantage of band-structure engineering and the unipolar nature of QC lasers. Also, since lasers with narrow linewidth are needed for chemical sensing, work is presented on a single-mode distributed feedback (DFB) QC laser. First, a compact four-wavelength QC laser source is presented, based on a 2-by-2 module design with two waveguides, each containing QC laser stacks for two different emission wavelengths: one with 7.0 μm/11.2 μm and the other with 8.7 μm/12.0 μm. This is the first design of a four-wavelength QC laser source with widely different emission wavelengths that uses minimal optics and electronics. Second, since there are still several unknown factors that affect QC laser performance, results are presented on a first-ever study of the effects of waveguide side-wall roughness on QC laser performance using the two-wavelength waveguides. The results are consistent with Rayleigh scattering in the waveguides, with roughness affecting shorter wavelengths more than longer wavelengths. Third, a versatile time-multiplexed multi-wavelength QC laser system is presented that emits at λ = 10.8 μm for positive and λ = 8.6 μm for negative polarity current with microsecond time delay. Such a system is the first demonstration of a time- and wavelength-multiplexed system that uses a single QC laser. Fourth, work is presented on the design and fabrication of a single-mode distributed feedback (DFB) QC laser emitting at λ ≈ 7.7 μm for use in a QC laser-based photoacoustic sensor. The DFB QC laser had a temperature tuning coefficient of 0.45 nm/K over a temperature range of 80 K to 320 K, and a side-mode suppression ratio of greater than 30 dB. Finally, a study of the lateral mode patterns of wide-ridge QC lasers is presented. The results include the observation of degenerate and non-degenerate lateral modes in wide-ridge QC lasers emitting at λ ≈ 5.0 μm. This study was conducted with the end goal of using wide-ridge QC lasers in a novel technique to spatiospectrally combine multiple transverse modes to obtain an ultra-high-power single-spot QC laser beam.
2012-09-30
• briefing for aircraft operations in Diego Garcia, reports posted on EOL field catalog in realtime (http://catalog.eol.ucar.edu/cgi-bin/dynamo/report...index);
• Dropsonde data processing on all P3 flights and realtime QC/reporting to GTS; and
• Science summary of aircraft missions posted on EOL ... data analysis, worked with EOL on data quality control (QC), participated in the DYNAMO Sounding Workshop at EOL/NCAR from 6-7 February 2012
NASA Astrophysics Data System (ADS)
Misra, Sushil K.; Andronenko, S. I.; Srinivasa Rao, S.; Chess, Jordan; Punnoose, A.
2015-11-01
EPR investigations were carried out at X-band (9.5 GHz) at 5 K on two types of dilute magnetic semiconductor (DMS) ZnO nanoparticles doped with 0.5-10% Co2+ ions, prepared by two chemical hydrolysis methods using: (i) diethylene glycol ((CH2CH2OH)2O) (NC, rod-like samples), and (ii) denatured ethanol (CH3CH2OH) solutions (QC, spherical samples). The analysis of EPR data for NC samples revealed the presence of several types of EPR lines: (i) two types, intense and weak, of high-spin Co2+ ions in the samples with Co concentration >0.5%; (ii) surface oxygen vacancies; and (iii) a ferromagnetic resonance (FMR) line. QC samples exhibit an intense FMR line and an EPR line due to high-spin Co2+ ions; the FMR line is more intense than the corresponding line exhibited by NC samples. These EPR spectra varied for samples with different doping concentrations. The magnetic states of these samples as revealed by EPR spectra, as well as the origin of ferromagnetism in the DMS samples, are discussed.
Lin, Jie; Jing, Li; Zhu, Hao; Dong, Fu-Sheng
2017-01-01
The aim of the study was to determine the mechanism of action of the 800 nm semiconductor laser on skin blackheads and coarse pores. A total of 24 healthy purebred short-haired male guinea pigs, weighing 350-400 g, were selected and smeared with 0.5 ml coal tar suspension applied evenly by injector once daily. Treatment was continued for 14 days to form an experimental area of 8×3 cm on the back of the guinea pigs. The animals were divided into the following groups: normal control group (NC), low-dose laser treatment group (L-LS), high-dose laser treatment group (H-LS), and Q-switched Nd:YAG treatment group (QC). Samples were extracted 1, 7 and 14 days after surgery, and hematoxylin and eosin staining was used to assess the epidermis, dermis, sebaceous gland changes and hair follicle damage; the expression of proliferating cell nuclear antigen (PCNA) in sebaceous gland cells was examined using immunohistochemistry; sebaceous gland cell apoptosis was assessed using TUNEL; and the protein expression of caspase-3, Bax and Bcl-2 was measured using western blot analysis. With the extension of time, we observed inflammatory cell infiltration, an increase in hair follicle distortion and necrosis of the surrounding hair follicles. The expression levels of PCNA in the L-LS, H-LS and QC groups decreased with time. At the respective time points, the NC group was highest, the L-LS and QC groups were next highest, and the H-LS group was lowest. The difference was statistically significant (P<0.05). The apoptotic rate of the L-LS, H-LS and QC groups increased with time. At the respective time points, the NC group was lowest, the L-LS and QC groups were next lowest, and the H-LS group was highest. The difference was statistically significant (P<0.05). The protein expression of caspase-3, Bax and Bcl-2 in the L-LS, H-LS and QC groups increased with time. At the respective time points, caspase-3 and Bax protein expression of the NC group was lowest, the L-LS and QC groups were next lowest, and the H-LS group was highest; Bcl-2 protein expression of the NC group was highest, that of the L-LS and QC groups was next highest, and the H-LS group was lowest. The difference was statistically significant (P<0.05). In conclusion, the low-dose 800 nm semiconductor laser is an effective treatment for skin blackheads and coarse pores, and promotes hair follicle cell apoptosis without reducing the expression of PCNA.
40 CFR 98.144 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
... melting furnace from monthly measurements using plant instruments used for accounting purposes, such as... raw material; such measurements shall be based on sampling and chemical analysis conducted by a...
Sandusky, George E; Teheny, Katie Heinz; Esterman, Mike; Hanson, Jeff; Williams, Stephen D
2007-01-01
The success of molecular research and its applications in both the clinical and basic research arenas is strongly dependent on the collection, handling, storage, and quality control of fresh human tissue samples. This tissue bank was set up to bank fresh surgically obtained human tissue using a Clinical Annotated Tissue Database (CATD) in order to capture the associated patient clinical data and demographics, using a one-way patient encryption scheme to protect patient identification. In this study, we determined that high quality of tissue samples is imperative for both genomic and proteomic molecular research. This paper also contains a brief compilation of the literature on patient ethics, patient informed consent, patient de-identification, tissue collection, processing, and storage, as well as basic molecular research generated from the tissue bank using good clinical practices. The current applicable rules, regulations, and guidelines for handling human tissues are briefly discussed. More than 6,610 cancer patients have been consented (97% of those contacted by the consenter) and 16,800 tissue specimens have been banked from these patients in 9 years. All samples collected in the bank were QC'd by a pathologist. Approximately 1,550 tissue samples have been requested for use in basic, clinical, and/or biomarker cancer research studies. Each tissue aliquot removed from the bank for a research study was evaluated by a second H&E; if the sample passed this QC, it was submitted for genomic and proteomic molecular analysis. Approximately 75% of samples evaluated were of high histologic quality and used for research studies. Since 2003, we changed the patient informed consent to allow the tissue bank to gather more patient clinical follow-up information. Ninety-two percent of the patients (1,865 patients) signed the new informed consent form and agreed to be re-contacted for follow-up information on their disease state. In addition, eighty-five percent of patients (1,584) agreed to be re-contacted to provide a biological fluid sample to be used for biomarker research.
Glutaminyl Cyclase Knock-out Mice Exhibit Slight Hypothyroidism but No Hypogonadism
Schilling, Stephan; Kohlmann, Stephanie; Bäuscher, Christoph; Sedlmeier, Reinhard; Koch, Birgit; Eichentopf, Rico; Becker, Andreas; Cynis, Holger; Hoffmann, Torsten; Berg, Sabine; Freyse, Ernst-Joachim; von Hörsten, Stephan; Rossner, Steffen; Graubner, Sigrid; Demuth, Hans-Ulrich
2011-01-01
Glutaminyl cyclases (QCs) catalyze the formation of pyroglutamate (pGlu) residues at the N terminus of peptides and proteins. Hypothalamic pGlu hormones, such as thyrotropin-releasing hormone and gonadotropin-releasing hormone, are essential for regulation of metabolism and fertility in the hypothalamic-pituitary-thyroid and -gonadal axes, respectively. Here, we analyzed the consequences of constitutive genetic QC ablation on endocrine functions and on the behavior of adult mice. Adult homozygous QC knock-out mice are fertile and behave indistinguishably from wild type mice in tests of motor function, cognition, general activity, and ingestion behavior. The QC knock-out results in a dramatic drop of enzyme activity in the brain, especially in the hypothalamus, and in plasma. Other peripheral organs like liver and spleen still contain QC activity, which is most likely attributable to its homolog isoQC. The serum gonadotropin-releasing hormone, TSH, and testosterone concentrations were not changed by QC depletion. Serum thyroxine was decreased by 24% in homozygous QC knock-out animals, suggesting mild hypothyroidism. QC knock-out mice were indistinguishable from wild type with regard to blood glucose and glucose tolerance, thus differing significantly from reports of thyrotropin-releasing hormone knock-out mice. The results suggest a significant formation of the hypothalamic pGlu hormones by alternative mechanisms, such as spontaneous cyclization or conversion by isoQC. The different effects of QC depletion on the hypothalamic-pituitary-thyroid and -gonadal axes might indicate slightly different modes of substrate conversion by the two enzymes. The absence of significant abnormalities in QC knock-out mice suggests the presence of a therapeutic window for suppression of QC activity in current drug development. PMID:21330373
Satellite-Based Quantum Communications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hughes, Richard J; Nordholt, Jane E; McCabe, Kevin P
2010-09-20
Single-photon quantum communications (QC) offers the attractive feature of 'future proof', forward security rooted in the laws of quantum physics. Ground-based quantum key distribution (QKD) experiments in optical fiber have attained transmission ranges in excess of 200 km, but for larger distances we proposed a methodology for satellite-based QC. Over the past decade we have devised solutions to the technical challenges of satellite-to-ground QC, and we now have a clear concept for how space-based QC could be performed and potentially utilized within a trusted QKD network architecture. Functioning as a trusted QKD node, a QC satellite ('QC-sat') could deliver secret keys to the key stores of ground-based trusted QKD network nodes, to each of which multiple users are connected by optical fiber or free-space QC. A QC-sat could thereby extend quantum-secured connectivity to geographically disjoint domains, separated by continental or inter-continental distances. In this paper we describe our system concept that makes QC feasible with low-earth orbit (LEO) QC-sats (200 km to 2,000 km altitude orbits), and the results of link modeling of expected performance. Using the architecture that we have developed, LEO satellite-to-ground QKD will be feasible with secret bit yields of several hundred 256-bit AES keys per contact. With multiple ground sites separated by ~100 km, mitigation of cloudiness over any single ground site would be possible, potentially allowing multiple contact opportunities each day. The essential next step is an experimental QC-sat. A number of LEO platforms would be suitable, ranging from a dedicated, three-axis-stabilized small satellite, to a secondary experiment on an imaging satellite, to the ISS. With one or more QC-sats, low-latency quantum-secured communications could then be provided to ground-based users on a global scale. Air-to-ground QC would also be possible.
Joint design of QC-LDPC codes for coded cooperation system with joint iterative decoding
NASA Astrophysics Data System (ADS)
Zhang, Shunwai; Yang, Fengfan; Tang, Lei; Ejaz, Saqib; Luo, Lin; Maharaj, B. T.
2016-03-01
In this paper, we investigate the joint design of quasi-cyclic low-density parity-check (QC-LDPC) codes for a coded cooperation system with joint iterative decoding at the destination. First, QC-LDPC codes based on the base matrix and exponent matrix are introduced, and then we describe two types of girth-4 cycles in the QC-LDPC codes employed by the source and relay. In the equivalent parity-check matrix corresponding to the jointly designed QC-LDPC codes employed by the source and relay, all girth-4 cycles of both type I and type II are cancelled. Theoretical analysis and numerical simulations show that the jointly designed QC-LDPC coded cooperation effectively combines cooperation gain and channel coding gain, and outperforms coded non-cooperation under the same conditions. Furthermore, the bit error rate performance of the coded cooperation employing the jointly designed QC-LDPC codes is better than those of random LDPC codes and separately designed QC-LDPC codes over AWGN channels.
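For circulant-based QC-LDPC codes there is a standard algebraic test for length-4 cycles: a 4-cycle exists whenever the alternating sum of the four exponents at the corners of a rectangle in the exponent matrix vanishes modulo the circulant size. A small sketch of that generic girth-4 detector (not the paper's joint source-relay construction):

```python
from itertools import combinations

def has_girth4(exponents, lift):
    """Detect length-4 cycles in a QC-LDPC exponent matrix.
    exponents[i][j]: circulant shift of block (i, j), or None for a zero block.
    lift: circulant (lifting) size. A 4-cycle exists iff
    p[a][x] - p[a][y] + p[b][y] - p[b][x] == 0 (mod lift) for some
    row pair (a, b) and column pair (x, y) with all four blocks nonzero."""
    rows, cols = len(exponents), len(exponents[0])
    for a, b in combinations(range(rows), 2):
        for x, y in combinations(range(cols), 2):
            corners = (exponents[a][x], exponents[a][y],
                       exponents[b][y], exponents[b][x])
            if None in corners:
                continue
            if (corners[0] - corners[1] + corners[2] - corners[3]) % lift == 0:
                return True
    return False

print(has_girth4([[0, 1], [2, 3]], 7))   # True:  0 - 1 + 3 - 2 = 0 (mod 7)
print(has_girth4([[0, 1], [2, 4]], 7))   # False: no rectangle sums to 0 mod 7
```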
The purpose of this SOP is to describe the procedures undertaken to calculate sampling weights. The sampling weights are needed to obtain weighted statistics of the study data. This SOP uses data that have been properly coded and certified with appropriate QA/QC procedures by th...
Proteomics as a Quality Control Tool of Pharmaceutical Probiotic Bacterial Lysate Products
Klein, Günter; Schanstra, Joost P.; Hoffmann, Janosch; Mischak, Harald; Siwy, Justyna; Zimmermann, Kurt
2013-01-01
Probiotic bacteria have a wide range of applications in veterinary and human therapeutics. Inactivated probiotics are complex samples and quality control (QC) should measure as many molecular features as possible. Capillary electrophoresis coupled to mass spectrometry (CE/MS) has been used as a multidimensional and high throughput method for the identification and validation of biomarkers of disease in complex biological samples such as biofluids. In this study we evaluate the suitability of CE/MS to measure the consistency of different lots of the probiotic formulation Pro-Symbioflor which is a bacterial lysate of heat-inactivated Escherichia coli and Enterococcus faecalis. Over 5000 peptides were detected by CE/MS in 5 different lots of the bacterial lysate and in a sample of culture medium. 71 to 75% of the total peptide content was identical in all lots. This percentage increased to 87–89% when allowing the absence of a peptide in one of the 5 samples. These results, based on over 2000 peptides, suggest high similarity of the 5 different lots. Sequence analysis identified peptides of both E. coli and E. faecalis and peptides originating from the culture medium, thus confirming the presence of the strains in the formulation. Ontology analysis suggested that the majority of the peptides identified for E. coli originated from the cell membrane or the fimbrium, while peptides identified for E. faecalis were enriched for peptides originating from the cytoplasm. The bacterial lysate peptides as a whole are recognised as highly conserved molecular patterns by the innate immune system as microbe associated molecular pattern (MAMP). Sequence analysis also identified the presence of soybean, yeast and casein protein fragments that are part of the formulation of the culture medium. In conclusion CE/MS seems an appropriate QC tool to analyze complex biological products such as inactivated probiotic formulations and allows determining the similarity between lots. PMID:23840518
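The lot-consistency figures above (71-75% of peptides identical in all lots, rising to 87-89% when one absence is allowed) amount to counting peptides shared across lots. A minimal sketch of that computation on hypothetical peptide ID sets:

```python
from collections import Counter

def lot_consistency(lots, allow_missing=0):
    """Fraction of all detected peptides present in at least
    (number of lots - allow_missing) of the lots."""
    counts = Counter(p for peptides in lots for p in peptides)
    needed = len(lots) - allow_missing
    return sum(1 for c in counts.values() if c >= needed) / len(counts)

# Hypothetical peptide ID sets for five lots of a lysate product.
lots = [
    {"P1", "P2", "P3", "P4"},
    {"P1", "P2", "P3", "P5"},
    {"P1", "P2", "P3", "P4"},
    {"P1", "P2", "P4", "P5"},
    {"P1", "P2", "P3", "P4"},
]
print(f"identical in all lots: {lot_consistency(lots):.0%}")
print(f"absent from at most one lot: {lot_consistency(lots, allow_missing=1):.0%}")
```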
Breidinger, S A; Simpson, R C; Mangin, E; Woolf, E J
2015-10-01
A method using liquid chromatography with tandem mass spectrometric detection (LC-MS/MS) was developed for the determination of suvorexant (MK-4305, Belsomra(®)), a selective dual orexin receptor antagonist for the treatment of insomnia, in human plasma over the concentration range of 1-1000 ng/mL. Stable isotope labeled (13)C(2)H3-suvorexant was used as an internal standard. The sample preparation procedure utilized liquid-liquid extraction, in the 96-well format, of a 100 μL plasma sample with methyl t-butyl ether. The compounds were chromatographed under isocratic conditions on a Waters dC18 (50×2.1 mm, 3 μm) column with a mobile phase consisting of 30/70 (v/v %) 10 mM ammonium formate, pH 3/acetonitrile at a flow rate of 0.3 mL/min. Multiple reaction monitoring of the precursor-to-product ion pairs for suvorexant (m/z 451→186) and (13)C(2)H3-suvorexant (m/z 455→190) on an Applied Biosystems API 4000 tandem mass spectrometer was used for quantitation. Intraday assay precision, assessed in six different lots of control plasma, was within 10% CV at all concentrations, while assay accuracy ranged from 95.6 to 105.0% of nominal. Quality control (QC) samples in plasma were stored at -20°C. Initial within-day analysis of QCs after one freeze-thaw cycle showed accuracy within 9.5% of nominal with precision (CV) of 6.7% or less. The plasma QC samples were demonstrated to be stable for up to 25 months at -20°C. The method described has been used to support clinical studies during Phases I through III of clinical development. Copyright © 2015 Elsevier B.V. All rights reserved.
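Precision (%CV) and accuracy (% of nominal) of QC replicates are the core acceptance statistics quoted above. A minimal sketch of their computation on hypothetical replicate values:

```python
import statistics

def qc_stats(measured, nominal):
    """Intraday precision (%CV) and accuracy (% of nominal) for one QC level."""
    mean = statistics.mean(measured)
    cv = 100 * statistics.stdev(measured) / mean
    accuracy = 100 * mean / nominal
    return cv, accuracy

# Hypothetical replicate results (ng/mL) for a 3 ng/mL low-QC sample.
cv, acc = qc_stats([2.91, 3.05, 2.88, 3.10, 2.95, 3.02], nominal=3.0)
print(f"precision: {cv:.1f}% CV, accuracy: {acc:.1f}% of nominal")
```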
A quality control system for digital elevation data
NASA Astrophysics Data System (ADS)
Knudsen, Thomas; Kokkendorf, Simon; Flatman, Andrew; Nielsen, Thorbjørn; Rosenkranz, Brigitte; Keller, Kristian
2015-04-01
In connection with the introduction of a new version of the Danish national coverage Digital Elevation Model (DK-DEM), the Danish Geodata Agency has developed a comprehensive quality control (QC) and metadata production (MP) system for LiDAR point cloud data. The architecture of the system reflects its origin in a national mapping organization, where raw data deliveries are typically outsourced to external suppliers. It also reflects a design decision of aiming, whenever conceivable, at full spatial coverage tests rather than scattered sample checks. Hence, the QC procedure is split in two phases: a reception phase and an acceptance phase. The primary aim of the reception phase is to do a quick assessment of things that can typically go wrong and are relatively simple to check: data coverage, data density, and strip adjustment. If a data delivery passes the reception phase, the QC continues with the acceptance phase, which checks five different aspects of the point cloud data: vertical accuracy, vertical precision, horizontal accuracy, horizontal precision, and point classification correctness.
The vertical descriptors are comparatively simple to measure: the vertical accuracy is checked by direct comparison with previously surveyed patches, while the vertical precision is derived from the observed variance on well-defined flat surface patches. These patches are automatically derived from the road centerlines registered in FOT, the official Danish map data base. The horizontal descriptors are less straightforward to measure, since potential reference material for direct comparison is typically expected to be less accurate than the LiDAR data. The solution selected is to compare photogrammetrically derived roof centerlines from FOT with LiDAR-derived roof centerlines. These are constructed by taking the 3D Hough transform of a point cloud patch defined by the photogrammetrical roof polygon; the LiDAR-derived roof centerline is then the intersection line of the two primary planes of the transformed data. Since the photogrammetrical and the LiDAR-derived roof centerline sets are independently derived, a low RMS difference indicates that both data sets are of very high accuracy. The horizontal precision is derived by doing a similar comparison between LiDAR-derived roof centerlines in the overlap zone of neighbouring flight strips.
Contrary to the vertical and horizontal descriptors, the point classification correctness is neither geometric nor well defined. In this case we must resolve by introducing a human in the loop and presenting data in a form that is as useful as possible to this human. Hence, the QC system produces maps of suspicious patterns such as vegetation below buildings, points classified as buildings where no building is registered in the map data base, building polygons from the map data base without any building points, and buildings on roads.
All elements of the QC process are carried out in smaller tiles (typically 1 km × 1 km) and hence are trivially parallelizable. Results from the parallel executing processes are collected in a geospatial data base system (PostGIS), and the progress can be analyzed and visualized in a desktop GIS while the processes run. Implementation-wise, the system is based on open source components, primarily from the OSGeo stack (GDAL, PostGIS, QGIS, NumPy, SciPy, etc.). The system-specific code is also being open sourced. This open source distribution philosophy supports the parallel execution paradigm, since all available hardware can be utilized without any licensing problems. As yet, the system has only been used for QC of the first part of a new Danish elevation model. The experience has, however, been very positive. Especially notable is the utility of doing full spatial coverage tests (rather than scattered sample checks): error detection and error reports are exactly as spatial as the point cloud data they concern. This makes it very easy for both data receiver and data provider to discuss and reason about the nature and causes of irregularities.
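Two of the geometric checks described above are straightforward to sketch: vertical precision as the height spread on a flat patch, and a LiDAR roof centerline as the intersection of the two primary roof planes. The sketch below uses least-squares plane fits on pre-segmented synthetic point sets rather than the 3D Hough transform the system uses:

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane through a point patch: centroid and unit normal."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    return centroid, vt[-1]          # normal = smallest-variance direction

def vertical_precision(patch):
    """Height spread (std of z) on a well-defined flat surface patch."""
    return patch[:, 2].std(ddof=1)

def roof_centerline_direction(side_a, side_b):
    """Ridge direction = intersection of the two primary roof planes."""
    _, n1 = fit_plane(side_a)
    _, n2 = fit_plane(side_b)
    d = np.cross(n1, n2)
    return d / np.linalg.norm(d)

rng = np.random.default_rng(1)
xy = rng.uniform(0, 10, size=(200, 2))

# Synthetic gabled roof: two planes tilted about the y-axis meet in a ridge.
side_a = np.column_stack([xy[:, 0], xy[:, 1], 0.5 * xy[:, 0]])
side_b = np.column_stack([xy[:, 0] + 10, xy[:, 1], 5.0 - 0.5 * xy[:, 0]])
print(roof_centerline_direction(side_a, side_b))   # ~ (0, +/-1, 0): ridge along y

# Synthetic flat road patch with 3 cm vertical noise.
road = np.column_stack([xy, rng.normal(20.0, 0.03, size=200)])
print(f"vertical precision: {100 * vertical_precision(road):.1f} cm")
```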
40 CFR 98.264 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
...-process phosphoric acid process line. You can use existing plant procedures that are used for accounting... the process line. Conduct the representative bulk sampling using the applicable standard method in the...
Kaufmann-Kolle, Petra; Szecsenyi, Joachim; Broge, Björn; Haefeli, Walter Emil; Schneider, Antonius
2011-01-01
The purpose of this cluster-randomised controlled trial was to evaluate the efficacy of quality circles (QCs) working either with general data-based feedback or with an open benchmark within the field of asthma care and drug-drug interactions. Twelve QCs, involving 96 general practitioners from 85 practices, were randomised. Six QCs worked with traditional anonymous feedback and six with an open benchmark. Two QC meetings supported with feedback reports were held covering the topics "drug-drug interactions" and "asthma"; in both cases discussions were guided by a trained moderator. Outcome measures included health-related quality of life and patient satisfaction with treatment, asthma severity and number of potentially inappropriate drug combinations as well as the general practitioners' satisfaction in relation to the performance of the QC. A significant improvement in the treatment of asthma was observed in both trial arms. However, there was only a slight improvement regarding inappropriate drug combinations. There were no relevant differences between the group with open benchmark (B-QC) and traditional quality circles (T-QC). The physicians' satisfaction with the QC performance was significantly higher in the T-QCs. General practitioners seem to take a critical perspective about open benchmarking in quality circles. Caution should be used when implementing benchmarking in a quality circle as it did not improve healthcare when compared to the traditional procedure with anonymised comparisons. Copyright © 2011. Published by Elsevier GmbH.
Environment-induced quantum coherence spreading of a qubit
NASA Astrophysics Data System (ADS)
Pozzobom, Mauro B.; Maziero, Jonas
2017-02-01
We make a thorough study of the spreading of quantum coherence (QC), as quantified by the l1-norm QC, when a qubit (a two-level quantum system) is subjected to noisy quantum channels commonly appearing in quantum information science. We notice that QC is generally not conserved and that even incoherent initial states can lead to transitory system-environment QC. We show that for the amplitude damping channel the evolved total QC can be written as the sum of local and non-local parts, with the latter being equal to entanglement. On the other hand, for the phase damping channel (PDC) entanglement does not account for all non-local QC, with the gap between them depending on time and also on the qubit's initial state. Besides these issues, the possibility and conditions for time invariance of QC are examined in the case of bit, phase, and bit-phase flip channels. Here we reveal the qualitative dynamical inequivalence between these channels and the PDC and show that the creation of system-environment entanglement does not necessarily imply the destruction of the qubit's QC. We also investigate the resources needed for non-local QC creation, showing that while the PDC requires initial coherence of the qubit, for some other channels non-zero population of the excited state (i.e., energy) is sufficient. Related to that, considering the depolarizing channel we notice the qubit's ability to act as a catalyst for the creation of joint QC and entanglement, without need for nonzero initial QC or excited state population.
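For a qubit, the l1-norm coherence reduces to twice the modulus of the off-diagonal density matrix element, and the amplitude damping channel scales that element by sqrt(1 - gamma). A minimal sketch verifying this numerically with Kraus operators:

```python
import numpy as np

def l1_coherence(rho):
    """l1-norm coherence: sum of |off-diagonal| entries of the density matrix."""
    return np.abs(rho).sum() - np.abs(np.diag(rho)).sum()

def amplitude_damping(rho, gamma):
    """Apply an amplitude damping channel of strength gamma to a qubit state."""
    K0 = np.array([[1.0, 0.0], [0.0, np.sqrt(1.0 - gamma)]], dtype=complex)
    K1 = np.array([[0.0, np.sqrt(gamma)], [0.0, 0.0]], dtype=complex)
    return K0 @ rho @ K0.conj().T + K1 @ rho @ K1.conj().T

# |+><+| state: maximal single-qubit coherence, C_l1 = 1.
rho_plus = 0.5 * np.array([[1, 1], [1, 1]], dtype=complex)
for gamma in (0.0, 0.25, 0.5, 0.75):
    c = l1_coherence(amplitude_damping(rho_plus, gamma))
    print(f"gamma = {gamma:.2f}:  C_l1 = {c:.4f}  (= sqrt(1 - gamma))")
```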
Focant, Jean-François; Eppe, Gauthier; Massart, Anne-Cécile; Scholl, Georges; Pirard, Catherine; De Pauw, Edwin
2006-10-13
We report on the use of a state-of-the-art method for the measurement of selected polychlorinated dibenzo-p-dioxins, polychlorinated dibenzofurans and polychlorinated biphenyls in human serum specimens. The sample preparation procedure is based on manual small-size solid-phase extraction (SPE) followed by automated clean-up and fractionation using multi-sorbent liquid chromatography columns. SPE cartridges and all clean-up columns are disposable. Samples are processed in batches of 20 units, including one blank control (BC) sample and one quality control (QC) sample. The analytical measurement is performed using gas chromatography coupled to isotope dilution high-resolution mass spectrometry. The sample throughput corresponds to one series of 20 samples per day, from sample reception to data quality cross-check and reporting, once the procedure has been started and sample series are produced continuously. Four analysts are required to ensure proper performance of the procedure. The entire procedure has been validated under International Organization for Standardization (ISO) 17025 criteria and further tested over more than 1500 unknown samples during various epidemiological studies. The method is further discussed in terms of reproducibility, efficiency and long-term stability with regard to the 35 target analytes. Data related to quality control and limit of quantification (LOQ) calculations are also presented and discussed.
Announcement—guidance document for acquiring reliable data in ecological restoration projects
Stapanian, Martin A.; Rodriguez, Karen; Lewis, Timothy E.; Blume, Louis; Palmer, Craig J.; Walters, Lynn; Schofield, Judith; Amos, Molly M.; Bucher, Adam
2016-01-01
The Laurentian Great Lakes are undergoing intensive ecological restoration in Canada and the United States. In the United States, an interagency committee was formed to facilitate implementation of quality practices for federally funded restoration projects in the Great Lakes basin. The Committee's responsibilities include developing a guidance document that will provide a common approach to the application of quality assurance and quality control (QA/QC) practices for restoration projects. The document will serve as a “how-to” guide for ensuring data quality during each aspect of ecological restoration projects. In addition, the document will provide suggestions on linking QA/QC data with the routine project data and hints on creating detailed supporting documentation. Finally, the document will advocate integrating all components of the project, including QA/QC applications, into an overarching decision-support framework. The guidance document is expected to be released by the U.S. EPA Great Lakes National Program Office in 2017.
1994-03-04
[Garbled scanned tables of water QC method-blank results for volatile organic analytes (e.g., trichloroethylene, dichlorobutane) with surrogate percent recoveries; the underlying values are not recoverable.]
Cendejas, Richard A; Phillips, Mark C; Myers, Tanya L; Taubman, Matthew S
2010-12-06
An external-cavity (EC) quantum cascade (QC) laser using optical feedback from a partial-reflector is reported. With this configuration, the otherwise multi-mode emission of a Fabry-Perot QC laser was made single-mode with optical output powers exceeding 40 mW. A mode-hop free tuning range of 2.46 cm(-1) was achieved by synchronously tuning the EC length and QC laser current. The linewidth of the partial-reflector EC-QC laser was measured for integration times from 100 μs to 4 seconds, and compared to a distributed feedback QC laser. Linewidths as small as 480 kHz were recorded for the EC-QC laser.
Sobol, Wlad T
2002-01-01
A simple kinetic model that describes the time evolution of the chemical concentration of an arbitrary compound within the tank of an automatic film processor is presented. It provides insights into the kinetics of chemistry concentration inside the processor's tank; the results facilitate the tasks of processor tuning and quality control (QC). The model has successfully been used in several troubleshooting sessions of low-volume mammography processors for which maintaining consistent QC tracking was difficult due to fluctuations of bromide levels in the developer tank.
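Sobol's exact equations are not reproduced in the abstract; a commonly assumed form for a single well-stirred tank combines replenisher inflow, overflow, and first-order depletion from film development. A minimal sketch under that assumption, illustrating the drift toward a low steady-state concentration in a low-volume processor:

```python
import numpy as np

def tank_concentration(c0, c_repl, rate_repl, volume, use_rate, hours, dt=0.01):
    """Well-stirred single-tank model (an assumed form, not Sobol's equations):
    dC/dt = (rate_repl / volume) * (c_repl - C) - use_rate * C
    rate_repl : replenisher inflow (L/h), matched by tank overflow
    use_rate  : first-order depletion from film development (1/h)"""
    t = np.arange(0.0, hours, dt)
    c = np.empty_like(t)
    c[0] = c0
    for i in range(1, len(t)):  # explicit Euler integration
        dcdt = (rate_repl / volume) * (c_repl - c[i - 1]) - use_rate * c[i - 1]
        c[i] = c[i - 1] + dcdt * dt
    return t, c

# Low-volume processor: weak replenishment never restores the seasoned level,
# so developer concentration drifts, producing the QC-tracking problem above.
t, c = tank_concentration(c0=1.0, c_repl=1.2, rate_repl=0.5,
                          volume=10.0, use_rate=0.08, hours=72)
print(f"steady state ~ {c[-1]:.3f} (vs. fresh-tank level 1.000)")
```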
Duxbury, Geoffrey; Wilson, David; Hay, Kenneth; Langford, Nigel
2013-10-03
Intrapulse quantum cascade (QC) laser spectrometers are able to produce both saturation and molecular alignment of a gas sample owing to the rapid sweep of the radiation through the absorption features. In the QC lasers used to study the (14)N and (15)N isotopologues of the ν4 band of ammonia centered near 1625 cm(-1), the variation of the chirp rate during the scan is very large, from ca. 85 to ca. 15 MHz ns(-1). In the rapid chirp zone the collisional interaction time of the laser radiation with the gas molecules is short, and large rapid passage effects are seen, whereas at the slow chirp end the line shape resembles that of a Doppler broadened line. The total scan range of the QC laser of ca. 10 cm(-1) is sufficient to allow the spectra of both isotopologues to be recorded and the rapid and slow interactions with the laser radiation to be seen. The rapid passage effects are enhanced by the use of an off axis Herriott cell with an effective path length of 62 m, which allows a buildup of polarization to occur. The effective resolution of the chirped QC laser is ca. 0.012 cm(-1) full width at half-maximum in the 1625 cm(-1) region. The results of these experiments are compared with those of other studies of the ν4 band of ammonia carried out using Fourier transform and Laser Stark spectroscopy. They also demonstrate the versatility of the down chirped QC laser for investigating collisional effects in low pressure gases using long absorbing path lengths.
40 CFR 98.184 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... you determine process CO2 emissions using the carbon mass balance procedure in § 98.183(b)(2)(i) and... Instrumental Determination of Carbon, Hydrogen, and Nitrogen in Laboratory Samples of Coal (incorporated by...
40 CFR 98.184 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... you determine process CO2 emissions using the carbon mass balance procedure in § 98.183(b)(2)(i) and... Instrumental Determination of Carbon, Hydrogen, and Nitrogen in Laboratory Samples of Coal (incorporated by...
40 CFR 98.184 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... you determine process CO2 emissions using the carbon mass balance procedure in § 98.183(b)(2)(i) and... Instrumental Determination of Carbon, Hydrogen, and Nitrogen in Laboratory Samples of Coal (incorporated by...
40 CFR 98.184 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... you determine process CO2 emissions using the carbon mass balance procedure in § 98.183(b)(2)(i) and... Instrumental Determination of Carbon, Hydrogen, and Nitrogen in Laboratory Samples of Coal (incorporated by...
This data set contains the method performance results. This includes field blanks, method blanks, duplicate samples, analytical duplicates, matrix spikes, and surrogate recovery standards.
The Children’s Total Exposure to Persistent Pesticides and Other Persistent Pollutant (...
Coda Wave Attenuation Characteristics for North Anatolian Fault Zone, Turkey
NASA Astrophysics Data System (ADS)
Sertcelik, Fadime; Guleroglu, Mehmet
2017-10-01
The North Anatolian Fault Zone, on which large earthquakes have occurred in the past, migrates regularly from east to west, and it is one of the most active faults in the world. The purpose of this study is to estimate the coda wave quality factor (Qc) along the fault and for each of five sub-regions that were determined according to the fault ruptures of these large earthquakes. 978 records have been analyzed at frequencies of 1.5, 3, 6, 9, 12 and 18 Hz by the single backscattering method. Along the fault, the variations in Qc with lapse time are determined as Qc = (136±25)f^(0.96±0.027), Qc = (208±22)f^(0.85±0.02) and Qc = (307±28)f^(0.72±0.025) at 20, 30 and 40 s lapse times, respectively. The estimated average frequency-dependent quality factors for all lapse times are: Qc(f) = (189±26)f^(0.86±0.02) for the Karliova-Tokat region; Qc(f) = (216±19)f^(0.76±0.018) for the Tokat-Çorum region; Qc(f) = (232±18)f^(0.76±0.019) for the Çorum-Adapazari region; Qc(f) = (280±28)f^(0.79±0.021) for the Adapazari-Yalova region; and Qc(f) = (252±26)f^(0.81±0.022) for the Yalova-Gulf of Saros region. Over all lapse times and frequencies, the coda wave quality factor in the study area is Qc(f) = (206±15)f^(0.85±0.012). The largest change of Qc with lapse time is found in the Yalova-Saros region, which may be related to the degree of heterogeneity decreasing more rapidly toward the deep crust than in the other sub-regions. Moreover, the highest Qc is calculated between Adapazari and Yalova, interpreted as a consequence of the seismic energy released by the 1999 Kocaeli earthquake. However, no causal relationship could be established between the regional variation of Qc with frequency and lapse time and the migration of the large earthquakes. These results are interpreted as indicating that the attenuation mechanism is affected both by regional heterogeneity and by whether the fault structure consists of a single strand or multiple strands.
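In the single backscattering model of Aki and Chouet, the coda envelope obeys A(f,t) = S(f) t^(-1) exp(-π f t / Qc), so Qc follows from the slope of ln(A·t) versus lapse time t. A minimal sketch of that regression on a synthetic envelope:

```python
import numpy as np

def coda_qc(t, amp, freq):
    """Estimate Qc from a coda envelope via the single backscattering model:
    ln(A * t) = const - (pi * freq / Qc) * t, so Qc = -pi * freq / slope."""
    slope, _ = np.polyfit(t, np.log(amp * t), 1)
    return -np.pi * freq / slope

# Synthetic 6 Hz coda envelope with a known Qc of 900 and 5% noise.
rng = np.random.default_rng(2)
t = np.linspace(20.0, 50.0, 120)                 # lapse-time window (s)
amp = (1.0e4 / t) * np.exp(-np.pi * 6.0 * t / 900.0)
amp *= rng.lognormal(0.0, 0.05, t.size)
print(f"recovered Qc ~ {coda_qc(t, amp, 6.0):.0f}")
```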
Gubler, Hanspeter; Clare, Nicholas; Galafassi, Laurent; Geissler, Uwe; Girod, Michel; Herr, Guy
2018-06-01
We describe the main characteristics of the Novartis Helios data analysis software system (Novartis, Basel, Switzerland) for plate-based screening and profiling assays, which was designed and built about 11 years ago. It has been in productive use for more than 10 years and is one of the important standard software applications running for a large user community at all Novartis Institutes for BioMedical Research sites globally. A high degree of automation is reached by embedding the data analysis capabilities into a software ecosystem that deals with the management of samples, plates, and result data files, including automated data loading. The application provides a series of analytical procedures, ranging from very simple to advanced, which can easily be assembled by users in very flexible ways. This also includes the automatic derivation of a large set of quality control (QC) characteristics at every step. Any of the raw, intermediate, and final results and QC-relevant quantities can be easily explored through linked visualizations. Links to global assay metadata management, data warehouses, and an electronic lab notebook system are in place. Automated transfer of relevant data to data warehouses and electronic lab notebook systems are also implemented.
Anderson, Nancy
2015-11-15
As of January 1, 2016, microbiology laboratories can choose to adopt a new quality control option, the Individualized Quality Control Plan (IQCP), under the Clinical Laboratory Improvement Amendments of 1988 (CLIA). This voluntary approach increases flexibility for meeting regulatory requirements and provides laboratories the opportunity to customize QC for their testing in their unique environments and by their testing personnel. IQCP is an all-inclusive approach to quality based on risk management to address potential errors in the total testing process. It includes three main steps: (1) performing a risk assessment, (2) developing a QC plan, and (3) monitoring the plan through quality assessment. Resources are available from the Centers for Medicare & Medicaid Services, Centers for Disease Control and Prevention, American Society for Microbiology, Clinical and Laboratory Standards Institute, and accrediting organizations, such as the College of American Pathologists and the Joint Commission, to assist microbiology laboratories in implementing IQCP.
This data set contains the method performance results for CTEPP-OH. This includes field blanks, method blanks, duplicate samples, analytical duplicates, matrix spikes, and surrogate recovery standards.
The Children’s Total Exposure to Persistent Pesticides and Other Persisten...
NASA Astrophysics Data System (ADS)
Yuan, Jian-guo; Liang, Meng-qi; Wang, Yong; Lin, Jin-zhao; Pang, Yu
2016-03-01
A novel lower-complexity construction scheme of quasi-cyclic low-density parity-check (QC-LDPC) codes for optical transmission systems is proposed based on the structure of the parity-check matrix for the Richardson-Urbanke (RU) algorithm. Furthermore, a novel irregular QC-LDPC(4288, 4020) code with a high code rate of 0.937 is constructed by this scheme. The simulation analyses show that the net coding gain (NCG) of the novel irregular QC-LDPC(4288, 4020) code is respectively 2.08 dB, 1.25 dB and 0.29 dB more than those of the classic RS(255, 239) code, the LDPC(32640, 30592) code and the irregular QC-LDPC(3843, 3603) code at a bit error rate (BER) of 10^-6. The irregular QC-LDPC(4288, 4020) code has lower encoding/decoding complexity compared with the LDPC(32640, 30592) code and the irregular QC-LDPC(3843, 3603) code. The proposed QC-LDPC(4288, 4020) code is thus well suited to the increasing requirements of high-speed optical transmission systems.
Sorich, Michael J; McKinnon, Ross A; Miners, John O; Winkler, David A; Smith, Paul A
2004-10-07
This study aimed to evaluate in silico models based on quantum chemical (QC) descriptors derived using the electronegativity equalization method (EEM) and to assess the use of QC properties to predict chemical metabolism by human UDP-glucuronosyltransferase (UGT) isoforms. Various EEM-derived QC molecular descriptors were calculated for known UGT substrates and nonsubstrates. Classification models were developed using support vector machine and partial least squares discriminant analysis. In general, the most predictive models were generated with the support vector machine. Combining QC and 2D descriptors (from previous work) using a consensus approach resulted in a statistically significant improvement in predictivity (to 84%) over both the QC and 2D models and the other methods of combining the descriptors. EEM-derived QC descriptors were shown to be both highly predictive and computationally efficient. It is likely that EEM-derived QC properties will be generally useful for predicting ADMET and physicochemical properties during drug discovery.
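As a rough illustration of the modeling step, the sketch below cross-validates an RBF support vector classifier on a stand-in matrix of EEM-derived QC descriptors. The data here are synthetic placeholders; the actual descriptors and substrate labels come from the study.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Hypothetical stand-in data: rows are compounds, columns are EEM-derived
# QC descriptors (e.g., charge- and electronegativity-based indices);
# y = 1 for UGT substrates, 0 for nonsubstrates.
rng = np.random.default_rng(3)
X = rng.normal(size=(120, 8))
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(0, 0.5, 120) > 0).astype(int)

model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(model, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```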
Mora, Patricia; Faulkner, Keith; Mahmoud, Ahmed M; Gershan, Vesna; Kausik, Aruna; Zdesar, Urban; Brandan, María-Ester; Kurt, Serap; Davidović, Jasna; Salama, Dina H; Aribal, Erkin; Odio, Clara; Chaturvedi, Arvind K; Sabih, Zahida; Vujnović, Saša; Paez, Diana; Delis, Harry
2018-04-01
The International Atomic Energy Agency (IAEA), through a Coordinated Research Project (CRP) on "Enhancing Capacity for Early Detection and Diagnosis of Breast Cancer through Imaging", brought together a group of mammography radiologists, medical physicists and radiographers to investigate current practices and improve procedures for the early detection of breast cancer by strengthening both the clinical and medical physics components. This paper addresses the medical physics component. The countries that participated in the CRP were Bosnia and Herzegovina, Costa Rica, Egypt, India, Kenya, the Frmr. Yug. Rep. of Macedonia, Mexico, Nigeria, Pakistan, Philippines, Slovenia, Turkey, Uganda, United Kingdom and Zambia. Ten institutions participated, applying IAEA quality control protocols to 9 digital and 3 analogue mammography units. A spreadsheet for data collection was generated and distributed. Evaluation of image quality was done using TOR MAX and DMAM2 Gold phantoms. QC results for the analogue equipment were satisfactory. QC tests performed on the digital systems showed that improvements needed to be implemented, especially in thickness accuracy, signal difference to noise ratio (SDNR) values for achievable levels, uniformity and modulation transfer function (MTF). Mean glandular dose (MGD) was below internationally recommended levels for patient radiation protection. Evaluation of image quality by phantoms also indicated the need for improvement. Common activities facilitated improvement in mammography practice: medical physicists were trained in QC programs, infrastructure was improved and strengthened, and networking among medical physicists and radiologists took place and was maintained over time. The IAEA QC protocols provided a uniform approach to QC measurements. Copyright © 2018 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
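Of the digital QC metrics listed, SDNR is the simplest to make concrete: the signal-background mean difference divided by the background noise. A minimal sketch on a synthetic low-contrast insert (ROI geometry and contrast values are illustrative only):

```python
import numpy as np

def sdnr(image, signal_mask, background_mask):
    """Signal-difference-to-noise ratio from two ROI masks:
    |mean(signal) - mean(background)| / std(background)."""
    signal = image[signal_mask].mean()
    bg = image[background_mask]
    return abs(signal - bg.mean()) / bg.std(ddof=1)

# Synthetic flat-field image with a low-contrast disc insert.
rng = np.random.default_rng(4)
img = rng.normal(100.0, 2.0, size=(128, 128))
yy, xx = np.ogrid[:128, :128]
disc = (yy - 64) ** 2 + (xx - 64) ** 2 <= 15 ** 2
img[disc] += 3.0                               # 3% contrast object
ring = ((yy - 64) ** 2 + (xx - 64) ** 2 > 25 ** 2) & \
       ((yy - 64) ** 2 + (xx - 64) ** 2 <= 40 ** 2)
print(f"SDNR = {sdnr(img, disc, ring):.2f}")   # ~ 3.0 / 2.0 = 1.5
```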
Aris-Brosou, Stephane; Kim, James; Li, Li; Liu, Hui
2018-05-15
Vendors in the health care industry produce diagnostic systems that, through a secured connection, allow them to monitor performance almost in real time. However, challenges exist in analyzing and interpreting large volumes of noisy quality control (QC) data. As a result, some QC shifts may not be detected early enough by the vendor, but instead lead a customer to complain. We hypothesized that a more proactive response could be designed by utilizing the collected QC data more efficiently; our aim is therefore to help prevent customer complaints by predicting them based on the QC data collected by in vitro diagnostic systems. QC data from five select in vitro diagnostic assays were combined with the corresponding database of customer complaints over a period of 90 days. A subset of these data over the last 45 days was also analyzed to assess how the length of the training period affects predictions. We defined a set of features used to train two classifiers, one based on decision trees and the other based on adaptive boosting, and assessed model performance by cross-validation. The cross-validations showed classification error rates close to zero for some assays with adaptive boosting when predicting the potential cause of customer complaints. Performance was improved by shortening the training period when the volume of complaints increased. Denoising filters that reduced the number of categories to predict further improved performance, as their application simplified the prediction problem. This novel approach to predicting customer complaints based on QC data may allow the diagnostic industry, the expected end user of our approach, to proactively identify potential product quality issues and fix these before receiving customer complaints. This represents a new step in the direction of using big data toward product quality improvement. ©Stephane Aris-Brosou, James Kim, Li Li, Hui Liu. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 15.05.2018.
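A minimal sketch of the adaptive-boosting step on synthetic stand-in data (the feature definitions and labels here are hypothetical; scikit-learn's AdaBoost uses a depth-1 decision tree as its default base learner, matching the decision-tree flavor of the study):

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical features summarizing each instrument's recent QC stream
# (e.g., mean bias, drift slope, variance, out-of-range count); the label
# is whether a complaint followed within the observation window.
rng = np.random.default_rng(5)
X = rng.normal(size=(500, 4))
y = (X[:, 1] - 0.8 * X[:, 2] + rng.normal(0, 0.3, 500) > 1.0).astype(int)

clf = AdaBoostClassifier(n_estimators=200, random_state=0)  # stump base learners
print(f"CV error rate: {1 - cross_val_score(clf, X, y, cv=5).mean():.3f}")
```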
NASA Astrophysics Data System (ADS)
Bushnell, M.; Waldmann, C.; Hermes, J.; Tamburri, M.
2017-12-01
Many oceanographic observation groups create and maintain QA, QC, and best practices (BP) to ensure efficient and accurate data collection and quantify quality. Several entities - IOOS® QARTOD, AtlantOS, ACT, WMO/IOC JCOMM OCG - have joined forces to document existing practices, identify gaps, and support development of emerging techniques. While each group has a slightly different focus, many underlying QA/QC/BP needs can be quite common. QARTOD focuses upon real-time data QC, and has produced manuals that address QC tests for eleven ocean variables. AtlantOS is a research and innovation project working towards the integration of ocean-observing activities across all disciplines in the Atlantic Basin. ACT brings together research institutions, resource managers, and private companies to foster the development and adoption of effective and reliable sensors for coastal, freshwater, and ocean environments. JCOMM promotes broad international coordination of oceanographic and marine meteorological observations and data management and services. Leveraging existing efforts of these organizations is an efficient way to consolidate available information, develop new practices, and evaluate the use of ISO standards to judge the quality of measurements. ISO standards may offer accepted support for a framework for an ocean data quality management system, similar to the meteorological standards defined by WMO (https://www.wmo.int/pages/prog/arep/gaw/qassurance.html). We will first cooperatively develop a plan to create a QA/QC/BP manual. The resulting plan will describe the need for such a manual, the extent of the manual, the process used to engage the community in creating it, the maintenance of the resultant document, and how these things will be done. It will also investigate standards for metadata. The plan will subsequently be used to develop the QA/QC/BP manual, providing guidance which advances the standards adopted by IOOS, AtlantOS, JCOMM, and others.
Lourens, Chris; Lindegardh, Niklas; Barnes, Karen I.; Guerin, Philippe J.; Sibley, Carol H.; White, Nicholas J.
2014-01-01
Comprehensive assessment of antimalarial drug resistance should include measurements of antimalarial blood or plasma concentrations in clinical trials and in individual assessments of treatment failure so that true resistance can be differentiated from inadequate drug exposure. Pharmacometric modeling is necessary to assess pharmacokinetic-pharmacodynamic relationships in different populations to optimize dosing. To accomplish both effectively and to allow comparison of data from different laboratories, it is essential that drug concentration measurement is accurate. Proficiency testing (PT) of laboratory procedures is necessary for verification of assay results. Within the Worldwide Antimalarial Resistance Network (WWARN), the goal of the quality assurance/quality control (QA/QC) program is to facilitate and sustain high-quality antimalarial assays. The QA/QC program consists of an international PT program for pharmacology laboratories and a reference material (RM) program for the provision of antimalarial drug standards, metabolites, and internal standards for laboratory use. The RM program currently distributes accurately weighed quantities of antimalarial drug standards, metabolites, and internal standards to 44 pharmacology, in vitro, and drug quality testing laboratories. The pharmacology PT program has sent samples to eight laboratories in four rounds of testing. WWARN technical experts have provided advice for correcting identified problems to improve performance of subsequent analysis and ultimately improved the quality of data. Many participants have demonstrated substantial improvements over subsequent rounds of PT. The WWARN QA/QC program has improved the quality and value of antimalarial drug measurement in laboratories globally. It is a model that has potential to be applied to strengthening laboratories more widely and improving the therapeutics of other infectious diseases. PMID:24777099
40 CFR 136.7 - Quality assurance and quality control.
Code of Federal Regulations, 2014 CFR
2014-07-01
... quality control elements, where applicable, into the laboratory's documented standard operating procedure... quality control elements must be clearly documented in the written standard operating procedure for each... Methods contains QA/QC procedures in the Part 1000 section of the Standard Methods Compendium. The...
40 CFR 136.7 - Quality assurance and quality control.
Code of Federal Regulations, 2013 CFR
2013-07-01
... quality control elements, where applicable, into the laboratory's documented standard operating procedure... quality control elements must be clearly documented in the written standard operating procedure for each... Methods contains QA/QC procedures in the Part 1000 section of the Standard Methods Compendium. The...
40 CFR 136.7 - Quality assurance and quality control.
Code of Federal Regulations, 2012 CFR
2012-07-01
... quality control elements, where applicable, into the laboratory's documented standard operating procedure... quality control elements must be clearly documented in the written standard operating procedure for each... Methods contains QA/QC procedures in the Part 1000 section of the Standard Methods Compendium. The...
77 FR 73611 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-11
...: Negative Quality Control Review Schedule. OMB Control Number: 0584-0034. Summary of Collection: The legislative basis for the operation of the quality control system is provided by section 16 of the Food and Nutrition Act of 2008. State agencies are required to perform Quality Control (QC) reviews for the...
Dionne, Shannon G.; Granato, Gregory E.; Tana, Cameron K.
1999-01-01
A readily accessible archive of information that is valid, current, and technically defensible is needed to make informed highway-planning, design, and management decisions. The National Highway Runoff Water-Quality Data and Methodology Synthesis (NDAMS) is a cataloging and assessment of the documentation of information relevant to highway-runoff water quality available in published reports. The report review process is based on the NDAMS review sheet, which was designed by the USGS with input from the FHWA, State transportation agencies, and the regulatory community. The report-review process is designed to determine the technical merit of the existing literature in terms of current requirements for data documentation, data quality, quality assurance and quality control (QA/QC), and technical issues that may affect the use of historical data. To facilitate the review process, the NDAMS review sheet is divided into 12 sections: (1) administrative review information, (2) investigation and report information, (3) temporal information, (4) location information, (5) water-quality-monitoring information, (6) sample-handling methods, (7) constituent information, (8) sampling focus and matrix, (9) flow-monitoring methods, (10) field QA/QC, (11) laboratory, and (12) uncertainty/error analysis. This report describes the NDAMS report reviews and metadata documentation methods and provides an overview of the approach and of the quality-assurance and quality-control program used to implement the review process. Detailed information, including a glossary of relevant terms, a copy of the report-review sheets, and report-review instructions, is completely documented in a series of three appendixes included with this report. Therefore, the reviews are repeatable and the methods can be used by transportation research organizations to catalog new reports as they are published.
Sanghvi, M.; Ramamoorthy, A.; Strait, J.; Wainer, I. W.; Moaddel, R.
2013-01-01
Due to the lack of sensitivity of current methods for the determination of fenoterol (Fen), a rapid LC-MS/MS method was developed for the determination of (R,R′)-Fen and (R,R′;S,S′)-Fen in plasma and urine. The method was fully validated and was linear from 50 pg/ml to 2000 pg/ml for plasma and from 2.500 ng/ml to 160 ng/ml for urine, with a lower limit of quantitation of 52.8 pg/ml in plasma. The coefficient of variation was <15% for the high QC standards and <10% for the low QC standards in plasma, and was <15% for the high and low QC standards in urine. The relative concentrations of (R,R′)-Fen and (S,S′)-Fen were determined using a Chirobiotic T chiral stationary phase. The method was used to determine the concentration of (R,R′)-Fen in plasma and urine samples obtained in an oral cross-over study of (R,R′)-Fen and (R,R′;S,S′)-Fen formulations. The results demonstrated a potential pre-systemic enantioselective interaction in which (S,S′)-Fen reduces the sulfation of the active (R,R′)-Fen. The data suggest that a non-racemic mixture of the Fen enantiomers may provide better bioavailability of the active (R,R′)-Fen for use in the treatment of cardiovascular disease. PMID:23872161
NASA Astrophysics Data System (ADS)
Acaba, K. J. C.; Cinco, L. D.; Melchor, J. N.
2016-03-01
Daily QC tests performed on screen film mammography (SFM) equipment are essential to ensure that both the SFM unit and the film processor are working in a consistent manner. The Breast Imaging Unit of USTH-Benavides Cancer Institute has been conducting QC following the test protocols in the IAEA Human Health Series No. 2 manual. However, the availability of a Leeds breast phantom (CRP E13039) in the facility has made the task easier: instead of carrying out separate tests on AEC constancy and light sensitometry, a single exposure of the phantom accomplishes both tests. It was observed that measurements of mAs output and optical densities (ODs) made using the Leeds TOR (MAX) phantom are comparable with those obtained from the usual conduct of tests, taking into account the attenuation characteristic of the phantom. Image quality parameters such as low-contrast and high-contrast details were also evaluated from the phantom image. The authors recognize the usefulness of the phantom in determining technical factors that will help improve detection of the smallest pathological details on breast images. The phantom is also convenient for daily QC monitoring and economical, since fewer films are expended.
Testing and analysis of LWT and SCB properties of asphalt concrete mixtures.
DOT National Transportation Integrated Search
2016-04-01
Currently, Louisiana's Quality Control and Quality Assurance (QC/QA) practice for asphalt mixtures in pavement construction is mainly based on controlling properties of plant-produced mixtures that include gradation and asphalt content, voids f...
Embankment quality and assessment of moisture control implementation : tech transfer summary.
DOT National Transportation Integrated Search
2016-02-01
The motivation for this project was based on work by Iowa State University (ISU) researchers at a few recent grading projects that demonstrated embankments were being constructed outside moisture control limits, even though the contractor QC ...
Quality control and quality assurance of hot mix asphalt construction in Delaware.
DOT National Transportation Integrated Search
2006-07-01
Since the mid-1960s, the Federal Highway Administration has encouraged Departments of Transportation and contractors toward the use of quality control and quality assurance (QA/QC) specifications, which are statistically based. For example,...
NASA Astrophysics Data System (ADS)
Wu, Sheng; Deev, Andrei
2013-01-01
A field-deployable Compound Specific Isotope Analyzer (CSIA) coupled with capillary chromatography, based on Quantum Cascade (QC) lasers and a Hollow Waveguide (HWG), with precision and chemical resolution matching mature mass spectrometry, has been achieved in our laboratory. The system can realize 0.3 per mil accuracy for 12C/13C for a Gas Chromatography (GC) peak lasting as short as 5 seconds with carbon molar concentration in the GC peak less than 0.5%. Spectroscopic advantages of the HWG when working with QC lasers, i.e. single-mode transmission, noiseless measurement and small sample volume, are compared with traditional free-space and multipass spectroscopy methods.
Inspection error and its adverse effects - A model with implications for practitioners
NASA Technical Reports Server (NTRS)
Collins, R. D., Jr.; Case, K. E.; Bennett, G. K.
1978-01-01
Inspection error has clearly been shown to have adverse effects upon the results desired from a quality assurance sampling plan. These effects upon performance measures have been well documented from a statistical point of view. However, little work has been presented to convince the QC manager of the unfavorable cost consequences resulting from inspection error. This paper develops a very general, yet easily used, mathematical cost model. The basic format of the well-known Guthrie-Johns model is used; however, it is modified as required to assess the effects of attributes sampling errors of the first and second kind. The economic results, under different yet realistic conditions, will no doubt be of interest to QC practitioners who face similar problems daily. Sampling inspection plans are optimized to minimize economic losses due to inspection error. Unfortunately, any error at all results in some economic loss which cannot be compensated for by sampling plan design; however, improvements over plans which neglect the presence of inspection error are possible. Implications for human performance betterment programs are apparent, as are trade-offs between modifying the sampling plan and the economics of improving inspection and training.
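The core mechanism lends itself to a small worked example. The sketch below is a simplified illustration (not the Guthrie-Johns model itself, whose cost structure is richer): for an attributes plan (n, c) that accepts a lot when at most c of n sampled items are classified defective, errors of the first kind (e1, good item rejected) and second kind (e2, defective item accepted) shift the apparent defect rate and hence the probability of acceptance. All parameter values are illustrative.

```python
from math import comb

def apparent_defect_rate(p, e1, e2):
    """Probability an item is classified defective, given true defect rate p,
    type I error e1 (good item rejected) and type II error e2
    (defective item accepted)."""
    return p * (1.0 - e2) + (1.0 - p) * e1

def accept_prob(p, n, c, e1=0.0, e2=0.0):
    """P(lot accepted) = P(at most c of n sampled items classified defective)."""
    pe = apparent_defect_rate(p, e1, e2)
    return sum(comb(n, k) * pe**k * (1.0 - pe)**(n - k) for k in range(c + 1))

# A 5% type I error sharply lowers acceptance of good lots (p = 1%),
# while a 10% type II error props up acceptance of worse ones.
for p in (0.01, 0.02, 0.05):
    print(p, round(accept_prob(p, n=50, c=2), 3),
          round(accept_prob(p, n=50, c=2, e1=0.05, e2=0.10), 3))
```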
Development of QC Procedures for Ocean Data Obtained by National Research Projects of Korea
NASA Astrophysics Data System (ADS)
Kim, S. D.; Park, H. M.
2017-12-01
To establish a data management system for ocean data obtained by national research projects of the Ministry of Oceans and Fisheries of Korea, KIOST conducted standardization and development of QC procedures. After reviewing and analyzing the existing international and domestic ocean-data standards and QC procedures, draft standards and QC procedures were prepared. The proposed standards and QC procedures were reviewed and revised by experts in the field of oceanography and academic societies several times, and a technical report was published covering the standards for 25 data items and 12 QC procedures for physical, chemical, biological and geological data items. The QC procedure for temperature and salinity data was set up by reference to the manuals published by GTSPP, ARGO and IOOS QARTOD. It consists of 16 QC tests applicable to vertical profile data and time series data obtained in real-time mode and delayed mode. Three regional range tests to inspect annual, seasonal and monthly variations were included in the procedure. Three programs were developed to calculate and provide upper and lower limits of temperature and salinity at depths from 0 to 1550 m. TS data of the World Ocean Database, ARGO, GTSPP and in-house data of KIOST were analysed statistically to calculate regional limits for the Northwest Pacific area. Based on this statistical analysis, the programs calculate regional ranges using the mean and standard deviation on three grid systems (3° grid, 1° grid and 0.5° grid) and provide a recommendation. The QC procedures for the 12 data items were set up during the 1st phase of the national program for data management (2012-2015) and are being applied to national research projects in the 2nd phase (2016-2019). The QC procedures will be revised by reviewing the results of QC application when the 2nd phase of the data management program is completed.
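As an illustration of how such a regional range test might operate, the sketch below flags a temperature or salinity value that falls outside mean ± k standard deviations taken from a gridded climatology, in the spirit of the grid systems described above; the grid-cell lookup, flag labels, and climatology values are hypothetical stand-ins, not the KIOST implementation.

```python
import math

def grid_cell(lat, lon, res=1.0):
    """Map a position to a climatology grid cell of the given resolution (deg)."""
    return (math.floor(lat / res), math.floor(lon / res))

def range_test(value, lat, lon, climatology, k=3.0, res=1.0):
    """Return 'pass', 'fail', or 'no_reference' for one T or S observation.
    `climatology` maps grid cells to (mean, std) for the target depth/month."""
    stats = climatology.get(grid_cell(lat, lon, res))
    if stats is None:
        return "no_reference"
    mean, std = stats
    return "pass" if mean - k * std <= value <= mean + k * std else "fail"

# Hypothetical 1-degree climatology cell near 35N, 129E: temperature (degC).
clim = {(35, 129): (14.2, 1.8)}
print(range_test(13.5, 35.4, 129.2, clim))  # pass
print(range_test(25.0, 35.4, 129.2, clim))  # fail (25.0 > 14.2 + 3*1.8)
```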
Variation of coda wave attenuation in the Alborz region and central Iran
NASA Astrophysics Data System (ADS)
Rahimi, H.; Motaghi, K.; Mukhopadhyay, S.; Hamzehloo, H.
2010-06-01
More than 340 earthquakes recorded by the Institute of Geophysics, University of Tehran (IGUT) short period stations from 1996 to 2004 were analysed to estimate the S-coda attenuation in the Alborz region, the northern part of the Alpine-Himalayan orogen in western Asia, and in central Iran, which is the foreland of this orogen. The coda quality factor, Qc, was estimated using the single backscattering model in frequency bands of 1-25 Hz. In this research, the lateral and depth variation of Qc in the Alborz region and central Iran is studied. No significant lateral variation in Qc is observed in the Alborz region; the average frequency relation for this region is Qc = (79 ± 2)f^(1.07 ± 0.08). Two anomalous high-attenuation areas in central Iran are recognized around the stations LAS and RAZ. The average frequency relation for central Iran, excluding the values of these two stations, is Qc = (94 ± 2)f^(0.97 ± 0.12). To investigate the attenuation variation with depth, the Qc value was calculated for 14 lapse times (25, 30, 35, ..., 90 s) for two data sets having epicentral distance ranges R < 100 km (data set 1) and 100 km < R < 200 km (data set 2) in each area. It is observed that Qc increases with depth. However, the rate of increase of Qc with depth is not uniform in our study area. Beneath central Iran the rate of increase of Qc is greater at depths less than 100 km compared to that at larger depths, indicating the existence of a high-attenuation anomalous structure under the lithosphere of central Iran. In addition, below ~180 km, the Qc value does not vary much with depth under both study areas, indicating the presence of a transparent mantle under them.
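For readers unfamiliar with the single backscattering model, the Qc estimate reduces to a line fit: coda amplitude decays approximately as A(t) = S(f) t^-1 exp(-π f t / Qc), so ln(A·t) plotted against lapse time t has slope -πf/Qc. The sketch below recovers Qc from a synthetic coda envelope under that model; real use requires band-pass filtered envelopes and the lapse-time windows described above.

```python
import numpy as np

def estimate_qc(t, amp, f):
    """Least-squares slope of ln(A*t) versus t gives Qc at centre frequency f (Hz)."""
    slope, _ = np.polyfit(t, np.log(amp * t), 1)
    return -np.pi * f / slope

# Synthesize a coda envelope with Qc = 100 at f = 6 Hz and recover it.
f, qc_true = 6.0, 100.0
t = np.linspace(25.0, 90.0, 200)                 # lapse times (s)
amp = (1.0 / t) * np.exp(-np.pi * f * t / qc_true)
print(round(estimate_qc(t, amp, f), 1))          # -> 100.0
```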
Hybrid spin and valley quantum computing with singlet-triplet qubits.
Rohling, Niklas; Russ, Maximilian; Burkard, Guido
2014-10-24
The valley degree of freedom in the electronic band structure of silicon, graphene, and other materials is often considered to be an obstacle for quantum computing (QC) based on electron spins in quantum dots. Here we show that control over the valley state opens new possibilities for quantum information processing. Combining qubits encoded in the singlet-triplet subspace of spin and valley states allows for universal QC using a universal two-qubit gate directly provided by the exchange interaction. We show how spin and valley qubits can be separated in order to allow for single-qubit rotations.
USGS Blind Sample Project: monitoring and evaluating laboratory analytical quality
Ludtke, Amy S.; Woodworth, Mark T.
1997-01-01
The U.S. Geological Survey (USGS) collects and disseminates information about the Nation's water resources. Surface- and ground-water samples are collected and sent to USGS laboratories for chemical analyses. The laboratories identify and quantify the constituents in the water samples. Random and systematic errors occur during sample handling, chemical analysis, and data processing. Although all errors cannot be eliminated from measurements, the magnitude of their uncertainty can be estimated and tracked over time. Since 1981, the USGS has operated an independent, external, quality-assurance project called the Blind Sample Project (BSP). The purpose of the BSP is to monitor and evaluate the quality of laboratory analytical results through the use of double-blind quality-control (QC) samples. The information provided by the BSP assists the laboratories in detecting and correcting problems in the analytical procedures. The information also can aid laboratory users in estimating the extent to which laboratory errors contribute to the overall errors in their environmental data.
Tang, Juan; Zhou, Xiangyang; Liu, Xiaochun; Ning, Leping; Zhou, Weiya; He, Yi
2017-09-01
The aim of this study is to improve the quality of testing for glucose-6-phosphate dehydrogenase (G6PD) deficiency through evaluation and analysis of the laboratory tests for G6PD activity. External quality assessment (EQA) was carried out twice per year, with five samples each time, from 2014 to 2016. Samples were used for quantitative and qualitative assays. Quantitative results, qualitative results (determined against reference values), and information about the methods, reagents and instruments of participating laboratories were collected within the required time. Laboratory performance scores, coefficients of variation (CV), and the rates of false negative and false positive results were calculated. As a result, a total of 2,834 negative quality control (QC) samples and 2,451 positive QC samples were assessed, where the rates of false negative and false positive results were 1.31% (37/2,834) and 1.34% (33/2,451), respectively. Quantitative results indicated an increasing trend in testing quality, consistent with conclusions based on the comparison of EQA full-score and acceptable ratios across the six assessments. The 2nd assay in 2016 had the best full-score ratio, 68.9% (135/196), and the best acceptable ratio, 84.2% (165/196). There was a decreasing trend in the average CV of six reagents produced in China, and the range of average CV increased to 14.6-23.6% in 2016. The average CV of low-level and high-level samples was 22.5% and 15.3%, respectively, demonstrating that samples with low G6PD activity have greater interlaboratory CV values. In conclusion, laboratories improved their testing quality and provided better diagnostic service for G6PD deficiency in areas with high incidence after participation in the EQA program in the Guangxi region.
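Two of the summary statistics used in this kind of EQA analysis are easy to make concrete: the inter-laboratory coefficient of variation for a quantitative QC sample, and the false-negative rate for qualitative calls on a known-positive sample. The numbers below are illustrative, not study data.

```python
import statistics

def cv_percent(values):
    """Inter-laboratory coefficient of variation (%) for one QC sample."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

def false_negative_rate(calls, truth="positive"):
    """Share of laboratories calling a known-positive QC sample negative."""
    return sum(1 for c in calls if c != truth) / len(calls)

g6pd_activity = [4.1, 3.8, 4.5, 5.0, 3.2]        # U/g Hb, hypothetical labs
calls = ["positive"] * 98 + ["negative"] * 2     # 2 misses among 100 labs
print(f"CV = {cv_percent(g6pd_activity):.1f}%")
print(f"false-negative rate = {false_negative_rate(calls):.2%}")
```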
40 CFR 98.34 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... on a specified time period (e.g., week, month, quarter, or half-year), fuel sampling and analysis is required only for those time periods in which the fuel or blend is combusted. The owner or operator may.... When the sampling frequency is based on a specified time period (e.g., week, month, quarter, or half...
40 CFR 98.34 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... on a specified time period (e.g., week, month, quarter, or half-year), fuel sampling and analysis is required only for those time periods in which the fuel or blend is combusted. The owner or operator may.... When the sampling frequency is based on a specified time period (e.g., week, month, quarter, or half...
40 CFR 98.34 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... on a specified time period (e.g., week, month, quarter, or half-year), fuel sampling and analysis is required only for those time periods in which the fuel or blend is combusted. The owner or operator may.... When the sampling frequency is based on a specified time period (e.g., week, month, quarter, or half...
40 CFR 98.264 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... requirements. (a) You must obtain a monthly grab sample of phosphate rock directly from the rock being fed to... Methods Used and Adopted by the Association of Fertilizer and Phosphate Chemists (AFPC). If phosphate rock is obtained from more than one origin in a month, you must obtain a sample from each origin of rock...
40 CFR 98.264 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... requirements. (a) You must obtain a monthly grab sample of phosphate rock directly from the rock being fed to..., Bartow, Florida 33831, (863) 534-9755, http://afpc.net, [email protected]). If phosphate rock is obtained from more than one origin in a month, you must obtain a sample from each origin of rock or obtain...
40 CFR 98.264 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... requirements. (a) You must obtain a monthly grab sample of phosphate rock directly from the rock being fed to..., Bartow, Florida 33831, (863) 534-9755, http://afpc.net, [email protected]). If phosphate rock is obtained from more than one origin in a month, you must obtain a sample from each origin of rock or obtain...
40 CFR 98.264 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... requirements. (a) You must obtain a monthly grab sample of phosphate rock directly from the rock being fed to..., Bartow, Florida 33831, (863) 534-9755, http://afpc.net, [email protected]). If phosphate rock is obtained from more than one origin in a month, you must obtain a sample from each origin of rock or obtain...
This product is an LC/MS/MS single laboratory validated method for the determination of cylindrospermopsin and anatoxin-a in ambient waters. The product contains step-by-step instructions for sample preparation, analyses, preservation, sample holding time and QC protocols to ensu...
77 FR 3228 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-23
..., Office of Management and Budget (OMB), [email protected] or fax (202) 395-5806 and to... it displays a currently valid OMB control number. Food and Nutrition Service Title: Quality Control... perform Quality Control (QC) review for the Supplemental Nutrition Assistance Program (SNAP). The FNS-380...
Liu, Zhao; Zheng, Chaorong; Wu, Yue
2017-09-01
Wind profilers have been widely adopted to observe wind field information in the atmosphere for different purposes, but the accuracy of their observations is limited by various noises and disturbances and hence needs to be further improved. In this paper, data measured under strong wind conditions using a 1290-MHz boundary layer profiler (BLP) are quality controlled via a composite quality control (QC) procedure proposed by the authors. Then, through comparison with data measured by radiosonde flights (balloon observations), the critical thresholds in the composite QC procedure, including the consensus average threshold T1 and the vertical shear threshold T3, are systematically discussed, and the performance of the BLP operated under precipitation is also evaluated. It is found that, to ensure high accuracy and a high data collectable rate, the optimal range of subsets is 4 m/s. Although the number of data rejected by the combined algorithm of vertical shear examination and small median test is quite limited, the algorithm proves quite useful for recognizing outliers with large discrepancies, and the optimal wind shear threshold T3 can be recommended as 5 m s-1/100 m. During patchy precipitation, the quality of data measured by the four oblique beams (using the DBS measuring technique) can still be ensured. After the BLP data are quality controlled by the composite QC procedure, the output shows good agreement with the balloon observations.
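A minimal sketch of a vertical shear check of the kind described above: a range gate is flagged when the speed difference to the gate below, scaled to a 100 m interval, exceeds the threshold T3 (5 m s-1/100 m as recommended above). Gate heights and speeds are illustrative, and the real composite procedure combines this with the consensus average and small median tests.

```python
def shear_flags(heights_m, speeds_ms, t3=5.0):
    """Per-gate booleans; True marks a gate whose shear to the gate below
    exceeds t3 (m/s per 100 m)."""
    flags = [False] * len(speeds_ms)
    for i in range(1, len(speeds_ms)):
        dz = heights_m[i] - heights_m[i - 1]
        shear_per_100m = abs(speeds_ms[i] - speeds_ms[i - 1]) / dz * 100.0
        if shear_per_100m > t3:
            flags[i] = True
    return flags

heights = [100, 200, 300, 400]       # m AGL, illustrative gate spacing
speeds = [8.0, 9.5, 17.0, 18.0]      # m/s; the 300 m gate jumps by 7.5 m/s
print(shear_flags(heights, speeds))  # [False, False, True, False]
```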
23 CFR 650.313 - Inspection procedures.
Code of Federal Regulations, 2010 CFR
2010-04-01
...) Quality control and quality assurance. Assure systematic quality control (QC) and quality assurance (QA... periodic field review of inspection teams, periodic bridge inspection refresher training for program managers and team leaders, and independent review of inspection reports and computations. (h) Follow-up on...
"Gap hunting" to characterize clustered probe signals in Illumina methylation array data.
Andrews, Shan V; Ladd-Acosta, Christine; Feinberg, Andrew P; Hansen, Kasper D; Fallin, M Daniele
2016-01-01
The Illumina 450k array has been widely used in epigenetic association studies. Current quality-control (QC) pipelines typically remove certain sets of probes, such as those containing a SNP or with multiple mapping locations. An additional set of potentially problematic probes are those with DNA methylation distributions characterized by two or more distinct clusters separated by gaps. Data-driven identification of such probes may offer additional insights for downstream analyses. We developed a procedure, termed "gap hunting," to identify probes showing clustered distributions. Among 590 peripheral blood samples from the Study to Explore Early Development, we identified 11,007 "gap probes." The vast majority (9,199) are likely attributable to an underlying SNP(s) or other variant in the probe, although SNP-affected probes exist that do not produce a gap signal. Specific factors predict which SNPs lead to gap signals, including the type of nucleotide change, probe type, DNA strand, and overall methylation state. These expected effects are demonstrated in paired genotype and 450k data on the same samples. Gap probes can also serve as a surrogate for the local genetic sequence on a haplotype scale and can be used to adjust for population stratification. The characteristics of gap probes reflect potentially informative biology. QC pipelines may benefit from an efficient data-driven approach that "flags" gap probes, rather than filtering such probes, followed by careful interpretation of downstream association analyses. Our results should translate directly to the recently released Illumina EPIC array, given the similar chemistry and content design.
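A minimal sketch of the gap-hunting idea, assuming a probe is flagged when its sorted beta values contain a jump larger than a chosen gap width that splits the samples into sufficiently large groups; the published procedure applies additional criteria, so this only captures the core test.

```python
def is_gap_probe(betas, gap=0.2, min_size=3):
    """True if the sorted beta values split into two or more groups, each of
    at least `min_size` samples, separated by jumps larger than `gap`."""
    x = sorted(betas)
    clusters, start = [], 0
    for i in range(1, len(x)):
        if x[i] - x[i - 1] > gap:   # a gap closes the current cluster
            clusters.append(i - start)
            start = i
    clusters.append(len(x) - start)
    return len(clusters) >= 2 and all(c >= min_size for c in clusters)

# A SNP-like trimodal probe versus a unimodal one (illustrative values).
trimodal = [0.05] * 5 + [0.48, 0.50, 0.52] + [0.90] * 4
unimodal = [0.40 + 0.01 * i for i in range(12)]
print(is_gap_probe(trimodal), is_gap_probe(unimodal))  # True False
```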
Szaszkó, Mária; Hajdú, István; Flachner, Beáta; Dobi, Krisztina; Magyar, Csaba; Simon, István; Lőrincz, Zsolt; Kapui, Zoltán; Pázmány, Tamás; Cseh, Sándor; Dormán, György
2017-02-01
A glutaminyl cyclase (QC) fragment library was selected in silico by disconnection of the structures of known QC inhibitors and by lead-like 2D virtual screening of the same set. The resulting fragment library (204 compounds) was acquired from commercial suppliers and pre-screened by differential scanning fluorimetry followed by functional in vitro assays. In this way, 10 fragment hits were identified ([Formula: see text]5 % hit rate, best inhibitory activity: 16 [Formula: see text]). The in vitro hits were then docked to the active site of QC, and the best scoring compounds were analyzed for binding interactions. Two fragments bound to different regions in a complementary manner, and thus linking those fragments offered a rational strategy to generate novel QC inhibitors. Based on the structure of the virtual linked fragment, a 77-membered QC target-focused library was selected from vendor databases and docked to the active site of QC. A PubChem search confirmed that the best scoring analogues are novel, potential QC inhibitors.
SRT Evaluation of AIRS Version-6.02 and Version-6.02 AIRS Only (6.02 AO) Products
NASA Technical Reports Server (NTRS)
Susskind, Joel; Iredell, Lena; Molnar, Gyula; Blaisdell, John
2012-01-01
Version-6 contains a number of significant improvements over Version-5. This report compares Version-6 products resulting from the advances listed below to those from Version-5. 1. Improved methodology to determine skin temperature (T(sub s)) and spectral emissivity (Epsilon(sub v)). 2. Use of a Neural-net start-up state. 3. Improvements which decrease the spurious negative Version-5 trend in tropospheric temperatures. 4. Improved QC methodology: Version-6 uses separate QC thresholds optimized for Data Assimilation (QC=0) and Climate applications (QC=0,1), respectively. 5. Channel-by-channel QC flags for clear-column radiances R-hat(sub tau). 6. Improved cloud parameter retrieval algorithm. 7. Improved OLR RTA. Our evaluation compared V6.02 and V6.02 AIRS Only (V6.02 AO) Quality Controlled products with those of Version-5.0. In particular, we evaluated surface skin temperature T(sub s), surface spectral emissivity Epsilon(sub v), temperature profile T(p), water vapor profile q(p), OLR, OLR(sub CLR), effective cloud fraction alpha-Epsilon, and cloud-cleared radiances R-hat(sub tau). We conducted two types of evaluations. The first compared results on 7 focus days to collocated ECMWF truth. The seven focus days are: September 6, 2002; January 25, 2003; September 29, 2004; August 5, 2005; February 24, 2007; August 10, 2007; and May 30, 2010. In these evaluations, we show results for T(sub s), Epsilon(sub v), T(p), and q(p) in terms of yields, and RMS differences and biases with regard to ECMWF. We also show yield trends as well as bias trends of these quantities relative to ECMWF truth. We also show yields and accuracy of channel-by-channel QC'd values of R-hat(sub tau) for V6.02 and V6.02 AO; Version-5 did not contain channel-by-channel QC'd values of R-hat(sub tau). In the second type of evaluation, we compared V6.03 monthly mean Level-3 products to those of Version-5.0 for four different months (January, April, July, and October) in 3 different years (2003, 2007, and 2011). In particular, we compared V6.03 and V5.0 trends of T(p), q(p), alpha-Epsilon, OLR, and OLR(sub CLR) computed based on results for these 12 time periods.
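As a sketch of the QC-stratified comparison described above, the snippet below computes yield, bias, and RMS difference against a collocated truth field for a chosen QC acceptance set (QC=0 for data assimilation versus QC=0,1 for climate); arrays and flag proportions are synthetic stand-ins, not AIRS or ECMWF data.

```python
import numpy as np

def qc_stats(retrieved, truth, qc, accept=(0,)):
    """Yield (%), bias, and RMS of (retrieved - truth) over accepted cases."""
    keep = np.isin(qc, accept)
    d = retrieved[keep] - truth[keep]
    return 100.0 * keep.mean(), d.mean(), np.sqrt((d**2).mean())

rng = np.random.default_rng(2)
truth = rng.normal(280.0, 10.0, 1000)            # e.g. T(p) in K
retrieved = truth + rng.normal(0.1, 1.0, 1000)   # small warm bias, 1 K noise
qc = rng.choice([0, 1, 2], size=1000, p=[0.5, 0.3, 0.2])
print(qc_stats(retrieved, truth, qc, accept=(0,)))    # strict: lower yield
print(qc_stats(retrieved, truth, qc, accept=(0, 1)))  # climate: higher yield
```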
Terahertz metasurface quantum-cascade VECSELs: theory and performance
Xu, Luyao; Curwen, Christopher; Chen, Daguan; ...
2017-04-12
A longstanding challenge for terahertz quantum-cascade (QC) lasers is achieving both high power and a high-quality beam pattern, due in part to their use of sub-wavelength metallic waveguides. Recently, the vertical-external-cavity surface-emitting laser (VECSEL) concept was demonstrated for the first time in the terahertz range and for a QC-laser. This is enabled by the development of an amplifying metasurface reflector capable of coupling incident free-space THz radiation to the QC-laser material such that it is amplified and re-radiated. The THz metasurface QC-VECSEL initiates a new approach for making QC-lasers with high power and excellent beam pattern. Furthermore, the ability to engineer the electromagnetic phase, amplitude, and polarization response of the metasurface enables lasers with new functionality. Our article provides an overview of the fundamental theory, design considerations, and recent results for high-performance THz QC-VECSELs.
Details on the verification test design, measurement test procedures, and Quality assurance/Quality Control (QA/QC) procedures can be found in the test plan titled Testing and Quality Assurance Plan, MIRATECH Corporation GECO 3100 Air/Fuel Ratio Controller (SRI 2001). It can be d...
PHABULOSA Controls the Quiescent Center-Independent Root Meristem Activities in Arabidopsis thaliana
Sebastian, Jose; Ryu, Kook Hui; Zhou, Jing; Tarkowská, Danuše; Tarkowski, Petr; Cho, Young-Hee; Yoo, Sang-Dong; Kim, Eun-Sol; Lee, Ji-Young
2015-01-01
Plant growth depends on stem cell niches in meristems. In the root apical meristem, the quiescent center (QC) cells form a niche together with the surrounding stem cells. Stem cells produce daughter cells that are displaced into a transit-amplifying (TA) domain of the root meristem. TA cells divide several times to provide cells for growth. SHORTROOT (SHR) and SCARECROW (SCR) are key regulators of the stem cell niche. Cytokinin controls TA cell activities in a dose-dependent manner. Although the regulatory programs in each compartment of the root meristem have been identified, it is still unclear how they coordinate one another. Here, we investigate how PHABULOSA (PHB), under the posttranscriptional control of SHR and SCR, regulates TA cell activities. The root meristem and growth defects in shr or scr mutants were significantly recovered in the shr phb or scr phb double mutant, respectively. This rescue in root growth occurs in the absence of a QC. Conversely, when the modified PHB, which is highly resistant to microRNA, was expressed throughout the stele of the wild-type root meristem, root growth became very similar to that observed in the shr; however, the identity of the QC was unaffected. Interestingly, a moderate increase in PHB resulted in a root meristem phenotype similar to that observed following the application of high levels of cytokinin. Our protoplast assay and transgenic approach using ARR10 suggest that the depletion of TA cells by high PHB in the stele occurs via the repression of B-ARR activities. This regulatory mechanism seems to help to maintain the cytokinin homeostasis in the meristem. Taken together, our study suggests that PHB can dynamically regulate TA cell activities in a QC-independent manner, and that the SHR-PHB pathway enables a robust root growth system by coordinating the stem cell niche and TA domain. PMID:25730098
Quevauviller, P; Bennink, D; Bøwadt, S
2001-05-01
It is now well recognised that the quality control (QC) of all types of analyses, including environmental analyses, depends on the appropriate use of reference materials. One of the ways to check the accuracy of methods is based on the use of Certified Reference Materials (CRMs), whereas other types of (non-certified) Reference Materials (RMs) are used for routine quality control (establishment of control charts) and interlaboratory testing (e.g. proficiency testing). The perception of these materials, in particular with respect to their production and use, differs widely according to various perspectives (e.g. RM producers, routine laboratories, researchers). This review discusses some critical aspects of RM use and production for the QC of environmental analyses and describes the new approach followed by the Measurements & Testing Generic Activity (European Commission) to tackle new research and production needs.
Zamengo, Luca; Frison, Giampietro; Tedeschi, Gianpaola; Frasson, Samuela
2016-08-01
Blood alcohol concentration is the most frequent analytical determination carried out in forensic toxicology laboratories worldwide. It is usually required to assess whether an offence has been committed, by comparing blood alcohol levels with specified legal limits, which can vary widely among countries. Due to the possible serious legal consequences associated with non-compliant alcohol levels, measurement uncertainty should be carefully evaluated, along with other metrological aspects which can influence the final result. The whole procedure can be time-consuming and error-generating in routine practice, increasing the risk of unreliable assessments. A software application named Ethanol WorkBook (EtWB) was developed at the authors' laboratory using Visual Basic for Applications and MS Excel(®), with the aim of helping forensic analysts involved in blood alcohol determinations. The program can (i) calculate measurement uncertainties and decision limits with different methodologies; (ii) assess compliance with specification limits using a guard-band approach; (iii) manage quality control (QC) data and create control charts for QC samples; (iv) create control maps from real-case data archives; (v) provide laboratory reports with graphical outputs for elaborated data and (vi) create comprehensive searchable case archives. A typical drink-driving case is presented and discussed to illustrate the importance of a metrological approach for reliable compliance assessment and to demonstrate the software in routine practice. The tool is made freely available to the scientific community on request. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
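The guard-band approach mentioned in item (ii) can be shown in a few lines: a result is declared non-compliant only when it exceeds the legal limit by more than k times the standard measurement uncertainty, so the probability of falsely accusing a compliant subject is controlled. The limit, uncertainty, and coverage factor below are illustrative, not EtWB's defaults.

```python
def decision_limit(legal_limit, u_standard, k=1.64):
    """One-sided guard band: report non-compliance only above limit + k*u."""
    return legal_limit + k * u_standard

def assess(result, legal_limit, u_standard, k=1.64):
    dl = decision_limit(legal_limit, u_standard, k)
    return "non-compliant" if result > dl else "not proven non-compliant"

# 0.50 g/L legal limit with a 0.02 g/L standard uncertainty at that level:
# the decision limit is ~0.533 g/L, so 0.51 is not reported as an offence.
for result in (0.51, 0.55):
    print(result, assess(result, legal_limit=0.50, u_standard=0.02))
```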
Diffusion imaging quality control via entropy of principal direction distribution.
Farzinfar, Mahshid; Oguz, Ipek; Smith, Rachel G; Verde, Audrey R; Dietrich, Cheryl; Gupta, Aditya; Escolar, Maria L; Piven, Joseph; Pujol, Sonia; Vachet, Clement; Gouttard, Sylvain; Gerig, Guido; Dager, Stephen; McKinstry, Robert C; Paterson, Sarah; Evans, Alan C; Styner, Martin A
2013-11-15
Diffusion MR imaging has received increasing attention in the neuroimaging community, as it yields new insights into the microstructural organization of white matter that are not available with conventional MRI techniques. While the technology has enormous potential, diffusion MRI suffers from a unique and complex set of image quality problems, limiting the sensitivity of studies and reducing the accuracy of findings. Furthermore, the acquisition time for diffusion MRI is longer than conventional MRI due to the need for multiple acquisitions to obtain directionally encoded Diffusion Weighted Images (DWI). This leads to increased motion artifacts, reduced signal-to-noise ratio (SNR), and increased proneness to a wide variety of artifacts, including eddy-current and motion artifacts, "venetian blind" artifacts, as well as slice-wise and gradient-wise inconsistencies. Such artifacts mandate stringent Quality Control (QC) schemes in the processing of diffusion MRI data. Most existing QC procedures are conducted in the DWI domain and/or on a voxel level, but our own experiments show that these methods often do not fully detect and eliminate certain types of artifacts, often only visible when investigating groups of DWIs or a derived diffusion model, such as the most-employed diffusion tensor imaging (DTI). Here, we propose a novel regional QC measure in the DTI domain that employs the entropy of the regional distribution of the principal directions (PD). The PD entropy quantifies the scattering and spread of the principal diffusion directions and is invariant to the patient's position in the scanner. A high entropy value indicates that the PDs are distributed relatively uniformly, while a low entropy value indicates the presence of clusters in the PD distribution. The novel QC measure is intended to complement the existing set of QC procedures by detecting and correcting residual artifacts. Such residual artifacts cause directional bias in the measured PD and are here called dominant direction artifacts. Experiments show that our automatic method can reliably detect and potentially correct such artifacts, especially the ones caused by the vibrations of the scanner table during the scan. The results further indicate the usefulness of this method for general quality assessment in DTI studies. Copyright © 2013 Elsevier Inc. All rights reserved.
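A minimal sketch of a PD-entropy style measure, assuming directions are binned by azimuth and elevation and the Shannon entropy of the bin occupancy is computed; the paper's regional implementation and binning differ, and the antipodal symmetry of principal directions (v and -v are the same axis) is ignored here for brevity.

```python
import numpy as np

def pd_entropy(directions, n_az=12, n_el=6):
    """Shannon entropy (nats) of an Nx3 set of principal-direction vectors,
    binned by azimuth and elevation after normalisation to unit length."""
    d = np.asarray(directions, dtype=float)
    d = d / np.linalg.norm(d, axis=1, keepdims=True)
    az = np.arctan2(d[:, 1], d[:, 0])            # [-pi, pi]
    el = np.arcsin(np.clip(d[:, 2], -1.0, 1.0))  # [-pi/2, pi/2]
    hist, _, _ = np.histogram2d(az, el, bins=[n_az, n_el],
                                range=[[-np.pi, np.pi], [-np.pi / 2, np.pi / 2]])
    p = hist.ravel() / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

rng = np.random.default_rng(0)
uniform = rng.normal(size=(5000, 3))                             # ~uniform directions
clustered = rng.normal(size=(5000, 3)) * 0.05 + [1.0, 0.0, 0.0]  # one dominant axis
print(pd_entropy(uniform), ">", pd_entropy(clustered))           # high vs low entropy
```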
Zahedi, Atena; On, Vincent; Lin, Sabrina C; Bays, Brett C; Omaiye, Esther; Bhanu, Bir; Talbot, Prue
2016-01-01
There is a foundational need for quality control tools in stem cell laboratories engaged in basic research, regenerative therapies, and toxicological studies. These tools require automated methods for evaluating cell processes and quality during in vitro passaging, expansion, maintenance, and differentiation. In this paper, an unbiased, automated high-content profiling toolkit, StemCellQC, is presented that non-invasively extracts information on cell quality and cellular processes from time-lapse phase-contrast videos. Twenty four (24) morphological and dynamic features were analyzed in healthy, unhealthy, and dying human embryonic stem cell (hESC) colonies to identify those features that were affected in each group. Multiple features differed in the healthy versus unhealthy/dying groups, and these features were linked to growth, motility, and death. Biomarkers were discovered that predicted cell processes before they were detectable by manual observation. StemCellQC distinguished healthy and unhealthy/dying hESC colonies with 96% accuracy by non-invasively measuring and tracking dynamic and morphological features over 48 hours. Changes in cellular processes can be monitored by StemCellQC and predictions can be made about the quality of pluripotent stem cell colonies. This toolkit reduced the time and resources required to track multiple pluripotent stem cell colonies and eliminated handling errors and false classifications due to human bias. StemCellQC provided both user-specified and classifier-determined analysis in cases where the affected features are not intuitive or anticipated. Video analysis algorithms allowed assessment of biological phenomena using automatic detection analysis, which can aid facilities where maintaining stem cell quality and/or monitoring changes in cellular processes are essential. In the future StemCellQC can be expanded to include other features, cell types, treatments, and differentiating cells.
QUALITY ASSURANCE AND QUALITY CONTROL FOR WASTE CONTAINMENT FACILITIES. Project Summary
It is generally agreed that both quality assurance (QA) and quality control (QC) are essential to the proper installation and eventual performance of environmentally safe and secure waste containment systems. Even further, there are both manufacturing and construction aspects to...
DOT National Transportation Integrated Search
2013-11-01
Current roadway quality control and quality acceptance (QC/QA) procedures for the Louisiana Department of Transportation and Development (LADOTD) include coring for thickness, density, and air voids in hot mix asphalt (HMA) pavements and thickness ...
HANDBOOK: QUALITY ASSURANCE/QUALITY CONTROL (QA/QC) PROCEDURES FOR HAZARDOUS WASTE INCINERATION
Resource Conservation and Recovery Act regulations for hazardous waste incineration require trial burns by permit applicants. A Quality Assurance Project Plan (QAPjP) must accompany a trial burn plan with appropriate quality assurance/quality control procedures. Guidance on the prepa...
Propellant Residues Deposition from Firing of AT4 Rockets
2009-12-01
and 254 nm (cell path 1 cm), and a Finnigan SpectraSYSTEM AS300 autosampler. Samples were introduced with a 100-μL sample loop. Separations were... analytical laboratory. The remaining particle samples were left in sealed jars and stored on site in a refrigerator, and the snow sample was stored in a... Ranney. 1998. Characterization of antitank firing ranges at CFB Valcartier, WATC Wainwright, and CFAD Dundurn. DREV-R-9809. Val-Bélair, QC: DRDC
Quantifying viruses and bacteria in wastewater—Results, interpretation methods, and quality control
Francy, Donna S.; Stelzer, Erin A.; Bushon, Rebecca N.; Brady, Amie M.G.; Mailot, Brian E.; Spencer, Susan K.; Borchardt, Mark A.; Elber, Ashley G.; Riddell, Kimberly R.; Gellner, Terry M.
2011-01-01
Membrane bioreactors (MBR), used for wastewater treatment in Ohio and elsewhere in the United States, have pore sizes small enough to theoretically reduce concentrations of protozoa and bacteria, but not viruses. Sampling for viruses in wastewater is seldom done and not required. Instead, the bacterial indicators Escherichia coli (E. coli) and fecal coliforms are the required microbial measures of effluents for wastewater-discharge permits. Information is needed on the effectiveness of MBRs in removing human enteric viruses from wastewaters, particularly as compared to conventional wastewater treatment before and after disinfection. A total of 73 regular and 28 quality-control (QC) samples were collected at three MBR and two conventional wastewater plants in Ohio during 23 regular and 3 QC sampling trips in 2008-10. Samples were collected at various stages in the treatment processes and analyzed for bacterial indicators E. coli, fecal coliforms, and enterococci by membrane filtration; somatic and F-specific coliphage by the single agar layer (SAL) method; adenovirus, enterovirus, norovirus GI and GII, rotavirus, and hepatitis A virus by molecular methods; and viruses by cell culture. While addressing the main objective of the study-comparing removal of viruses and bacterial indicators in MBR and conventional plants-it was realized that work was needed to identify data analysis and quantification methods for interpreting enteric virus and QC data. Therefore, methods for quantifying viruses, qualifying results, and applying QC data to interpretations are described in this report. During each regular sampling trip, samples were collected (1) before conventional or MBR treatment (post-preliminary), (2) after secondary or MBR treatment (post-secondary or post-MBR), (3) after tertiary treatment (one conventional plant only), and (4) after disinfection (post-disinfection). Glass-wool fiber filtration was used to concentrate enteric viruses from large volumes, and small volume grab samples were collected for direct-plating analyses for bacterial indicators and coliphage. After filtration, the viruses were eluted from the filter and further concentrated. The final concentrated sample volume (FCSV) was used for enteric virus analysis by use of two methods-cell culture and a molecular method, polymerase chain reaction (PCR). Quantitative PCR (qPCR) for DNA viruses and quantitative reverse-transcriptase PCR (qRT-PCR) for RNA viruses were used in this study. To support data interpretations, the assay limit of detection (ALOD) was set for each virus assay and used to determine sample reporting limits (SRLs). For qPCR and qRT-PCR the ALOD was an estimated value because it was not established according to established method detection limit procedures. The SRLs were different for each sample because effective sample volumes (the volume of the original sample that was actually used in each analysis) were different for each sample. Effective sample volumes were much less than the original sample volumes because of reductions from processing steps and (or) from when dilutions were made to minimize the effects from PCR-inhibiting substances. Codes were used to further qualify the virus data and indicate the level of uncertainty associated with each measurement. Quality-control samples were used to support data interpretations. 
Field and laboratory blanks for bacteria, coliphage, and enteric viruses were all below detection, indicating that it was unlikely that samples were contaminated from equipment or processing procedures. The absolute value log differences (AVLDs) between concurrent replicate pairs were calculated to identify the variability associated with each measurement. For bacterial indicators and coliphage, the AVLD results indicated that concentrations <10 colony-forming units or plaque-forming units per 100 mL can differ between replicates by as much as 1 log, whereas higher concentrations can differ by as much as 0.3 log. The AVLD results for viruses indicated that differences between replicates can be as great as 1.2 log genomic copies per liter, regardless of the concentration of virus. Relatively large differences in molecular results for viruses between replicate pairs were likely due to lack of precision for samples with small effective volumes. Concentrations of E. coli, fecal coliforms, enterococci, and somatic and F-specific coliphage in post-secondary and post-tertiary samples in conventional plants were higher than those in post-MBR samples. In post-MBR and post-secondary samples, concentrations of somatic coliphage were higher than F-specific coliphage. In post-disinfection samples from two MBR plants (the third MBR plant had operational issues) and the ultraviolet conventional plant, concentrations for all bacterial indicators and coliphage were near or below detection; from the chlorine conventional plant, concentrations in post-disinfection samples were in the single or double digits. All of the plants met the National Pollutant Discharge Elimination System required effluent limits established for fecal coliforms. Norovirus GII and hepatitis A virus were not detected in any samples, and rotavirus was detected in one sample but could not be quantified. Adenovirus was found in 100 percent, enterovirus in over one-half, and norovirus GI in about one-half of post-preliminary wastewater samples. Adenovirus and enterovirus were detected throughout the treatment processes, and norovirus GI was detected less often than the other two enteric viruses. Culturable viruses were detected in post-preliminary samples and in only two post-treatment samples from the plant with operational issues.
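The AVLD metric itself is a one-liner, sketched below with illustrative counts: it is the absolute difference of the log10 concentrations of a replicate pair, which is why low counts show larger replicate differences on the log scale.

```python
import math

def avld(c1, c2):
    """Absolute value log difference of a concurrent replicate pair."""
    return abs(math.log10(c1) - math.log10(c2))

# E. coli replicates (CFU/100 mL): low counts differ more on the log scale.
print(round(avld(4, 9), 2))      # ~0.35 log at low concentration
print(round(avld(480, 510), 2))  # ~0.03 log at higher concentration
```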
A multi-site feasibility study for personalized medicine in canines with Osteosarcoma
2013-01-01
Background A successful therapeutic strategy, specifically tailored to the molecular constitution of an individual and their disease, is an ambitious objective of modern medicine. In this report, we highlight a feasibility study in canine osteosarcoma focused on refining the infrastructure and processes required for prospective clinical trials using a series of gene expression-based Personalized Medicine (PMed) algorithms to predict suitable therapies within 5 days of sample receipt. Methods Tumor tissue samples were collected immediately following limb amputation and shipped overnight from veterinary practices. Upon receipt (day 1), RNA was extracted from snap-frozen tissue, with an adjacent H&E section for pathological diagnosis. Samples passing RNA and pathology QC were shipped to a CLIA-certified laboratory for genomic profiling. After mapping of canine probe sets to human genes and normalization against a (normal) reference set, gene-level Z-scores were submitted to the PMed algorithms. The resulting PMed report was immediately forwarded to the veterinarians. Upon receipt and review of the PMed report, feedback from the practicing veterinarians was captured. Results 20 subjects were enrolled over a 5-month period. Tissue from 13 subjects passed both histological and RNA QC and was submitted for genomic analysis and subsequent PMed analysis and report generation. 11 of the 13 PMed reports produced were communicated to the veterinarian within the target 5 business days. Of the 7 samples that failed QC, 4 failed due to poor RNA quality, whereas 2 failed following pathological review. Comments from the practicing veterinarians were generally positive and constructive, highlighting a number of areas for improvement, including enhanced education regarding PMed report interpretation, drug availability, affordable pricing and suitable canine dosing. Conclusions This feasibility trial demonstrated that with the appropriate infrastructure and processes it is possible to perform an in-depth molecular analysis of a patient's tumor in support of real-time therapeutic decision making within 5 days of sample receipt. A number of areas for improvement have been identified that should reduce the level of sample attrition and support clinical decision making. PMID:23815880
Visualization and Quality Control Web Tools for CERES Products
NASA Astrophysics Data System (ADS)
Mitrescu, C.; Doelling, D. R.
2017-12-01
The NASA CERES project continues to provide the scientific community a wide variety of satellite-derived data products, such as observed TOA broadband shortwave and longwave fluxes, computed TOA and surface fluxes, as well as cloud, aerosol, and other atmospheric parameters. They encompass a wide range of temporal and spatial resolutions, suited to specific applications. CERES data are used mostly by climate modeling communities but also by a wide variety of educational institutions. To better serve our users, a web-based Ordering and Visualization Tool (OVT) was developed using Open Source Software such as Eclipse, Java, JavaScript, OpenLayers, Flot, Google Maps, Python, and others. Due to increased demand by our own scientists, we also implemented a series of specialized functions to be used in the process of CERES Data Quality Control (QC), such as 1- and 2-D histograms, anomalies and differences, temporal and spatial averaging, and side-by-side parameter comparison, which made the process of QC far easier, faster, and, more importantly, far more portable. With the integration of ground-site observed surface fluxes, the tool further helps the CERES project QC the CERES computed surface fluxes. An overview of the CERES OVT basic functions using Open Source Software, as well as future steps in expanding its capabilities, will be presented at the meeting.
PACS 2000: quality control using the task allocation chart
NASA Astrophysics Data System (ADS)
Norton, Gary S.; Romlein, John R.; Lyche, David K.; Richardson, Ronald R., Jr.
2000-05-01
Medical imaging's technological evolution in the next century will continue to include Picture Archive and Communication Systems (PACS) and teleradiology. It is difficult to predict radiology's future in the new millennium, with both computed radiography and direct digital capture competing as the primary image acquisition methods for routine radiography. Changes in Computed Axial Tomography (CT) and Magnetic Resonance Imaging (MRI) continue to amaze the healthcare community. No matter how the acquisition, display, and archive functions change, Quality Control (QC) of the radiographic imaging chain will remain an important step in the imaging process. The Task Allocation Chart (TAC) is a tool that can be used in a medical facility's QC process to indicate the testing responsibilities of the image stakeholders and the medical informatics department. The TAC shows a grid of equipment to be serviced, tasks to be performed, and the organization assigned to perform each task. Additionally, skills, tasks, time, and references for each task can be provided. QC of the PACS must be stressed as a primary element of a PACS implementation. The TAC can be used to clarify responsibilities during warranty and paid maintenance periods. Establishing a TAC as part of a PACS implementation has a positive effect on patient care and clinical acceptance.
Autonomous Quality Control of Joint Orientation Measured with Inertial Sensors.
Lebel, Karina; Boissy, Patrick; Nguyen, Hung; Duval, Christian
2016-07-05
Clinical mobility assessment is traditionally performed in laboratories using complex and expensive equipment. The low accessibility to such equipment, combined with the emerging trend to assess mobility in a free-living environment, creates a need for body-worn sensors (e.g., inertial measurement units, IMUs) that are capable of measuring the complexity in motor performance using meaningful measurements, such as joint orientation. However, the accuracy of joint-orientation estimates using IMUs may be affected by the environment, the joint tracked, the type of motion performed and its velocity. This study investigates a quality control (QC) process to assess the quality of orientation data based on features extracted from the raw inertial sensors' signals. Joint orientation (trunk, hip, knee, ankle) of twenty participants was acquired by an optical motion capture system and IMUs during a variety of tasks (sit, sit-to-stand transition, walking, turning) performed under varying conditions (speed, environment). An artificial neural network was used to classify good and bad sequences of joint orientation with a sensitivity and a specificity above 83%. This study confirms the possibility of performing QC on IMU joint-orientation data based on raw signal features. This innovative QC approach may be of particular interest in a big data context, such as for remote monitoring of patients' mobility.
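A minimal sketch of the pipeline described above: per-sequence features are extracted from raw gyroscope and accelerometer signals, and a classifier labels the corresponding joint-orientation estimate as good or bad. The study's features, data, and network differ; scikit-learn's MLPClassifier and the synthetic sequences below are illustrative stand-ins.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

def sequence_features(gyro, accel):
    """Per-sequence features from Nx3 gyroscope (rad/s) and accelerometer
    (m/s^2) streams: mean/peak angular rate and acceleration variability."""
    w = np.linalg.norm(gyro, axis=1)
    a = np.linalg.norm(accel, axis=1)
    return [w.mean(), w.max(), a.std(), np.abs(a - 9.81).mean()]

rng = np.random.default_rng(1)
def fake_sequence(fast):
    """Synthetic 200-sample sequence; faster motion degrades orientation."""
    s = 4.0 if fast else 0.5
    return rng.normal(0, s, (200, 3)), rng.normal(0, s, (200, 3)) + [0, 0, 9.81]

X, y = [], []
for fast in [False] * 40 + [True] * 40:
    g, a = fake_sequence(fast)
    X.append(sequence_features(g, a))
    y.append(int(fast))   # 1 = bad orientation sequence
clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                    random_state=0).fit(X, y)
print(clf.score(X, y))    # training accuracy on the toy data
```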
Origin of the concept of the quiescent centre of plant roots.
Barlow, Peter W
2016-09-01
Concepts in biology feed into general theories of growth, development and evolution of organisms and how they interact with the living and non-living components of their environment. A well-founded concept clarifies unsolved problems and serves as a focus for further research. One such example of a constructive concept in the plant sciences is that of the quiescent centre (QC). In anatomical terms, the QC is an inert group of cells maintained within the apex of plant roots. However, the evidence that established the presence of a QC accumulated only gradually, making use of strands of different types of observations, notably from geometrical-analytical anatomy, radioisotope labelling and autoradiography. In their turn, these strands contributed to other concepts: those of the mitotic cell cycle and of tissue-related cell kinetics. Another important concept to which the QC contributed was that of tissue homeostasis. The general principle of this last-mentioned concept is expressed by the QC in relation to the recovery of root growth following a disturbance to cell proliferation; the resulting activation of the QC provides new cells which not only repair the root meristem but also re-establish a new QC.
Scheltema, Richard A; Mann, Matthias
2012-06-01
With the advent of high-throughput mass spectrometry (MS)-based proteomics, the magnitude and complexity of the performed experiments has increased dramatically. Likewise, investments in chromatographic and MS instrumentation are a large proportion of the budget of proteomics laboratories. Guarding measurement quality and maximizing uptime of the LC-MS/MS systems therefore requires constant care despite automated workflows. We describe a real-time surveillance system, called SprayQc, that continuously monitors the status of the peripheral equipment to ensure that operational parameters are within an acceptable range. SprayQc is composed of multiple plug-in software components that use computer vision to analyze electrospray conditions, monitor the chromatographic device for stable backpressure, interact with a column oven to control pressure by temperature, and ensure that the mass spectrometer is still acquiring data. Action is taken when a failure condition has been detected, such as stopping the column oven and the LC flow, as well as automatically notifying the appropriate operator. Additionally, all defined metrics can be recorded synchronized on retention time with the MS acquisition file, allowing for later inspection and providing valuable information for optimization. SprayQc has been extensively tested in our laboratory, supports third-party plug-in development, and is freely available for download from http://sourceforge.org/projects/sprayqc .
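SprayQc's own plug-in API is not reproduced here, but the surveillance pattern it implements can be sketched: poll metric sources, check each value against an acceptable range, and take action on a failure condition. The metric names, ranges, and actions below are hypothetical stand-ins.

```python
import time

CHECKS = {
    # metric name: (read function, (low, high) acceptable range)
    "backpressure_bar": (lambda: 180.0, (120.0, 300.0)),
    "spray_current_uA": (lambda: 0.02, (0.05, 5.0)),   # out of range -> alert
}

def notify(message):
    """Stand-in for operator notification (e-mail/SMS in a real system)."""
    print("ALERT:", message)

def watchdog(poll_s=5.0, cycles=3):
    """Poll every metric source and raise an alert on a failure condition."""
    for _ in range(cycles):
        for name, (read, (lo, hi)) in CHECKS.items():
            value = read()
            if not lo <= value <= hi:
                notify(f"{name}={value} outside [{lo}, {hi}]")
        time.sleep(poll_s)

watchdog(poll_s=0.1)   # short poll interval for the example
```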
DOT National Transportation Integrated Search
2013-11-01
Current roadway quality control and quality acceptance (QC/QA) procedures for the Louisiana Department of Transportation and Development (LADOTD) include coring for thickness, density, and air voids in hot mix asphalt (HMA) pavements and thickness ...
Assessment of in-situ test technology for construction control of base courses and embankments.
DOT National Transportation Integrated Search
2004-05-01
With the coming move from an empirical to mechanistic-empirical pavement design, it is essential to improve the quality control/quality assurance (QC/QA) procedures of compacted materials from a density-based criterion to a stiffness/strength-based c...
Sanghvi, M; Ramamoorthy, A; Strait, J; Wainer, I W; Moaddel, R
2013-08-15
Due to the lack of sensitivity of current methods for the determination of fenoterol (Fen), a rapid LC-MS/MS method was developed for the determination of (R,R')-Fen and (R,R';S,S')-Fen in plasma and urine. The method was fully validated and was linear from 50 pg/ml to 2000 pg/ml for plasma and from 2.500 ng/ml to 160 ng/ml for urine, with a lower limit of quantitation of 52.8 pg/ml in plasma. The coefficient of variation was <15% for the high QC standards and <10% for the low QC standards in plasma, and <15% for the high and low QC standards in urine. The relative concentrations of (R,R')-Fen and (S,S')-Fen were determined using a Chirobiotic T chiral stationary phase. The method was used to determine the concentration of (R,R')-Fen in plasma and urine samples obtained in an oral cross-over study of (R,R')-Fen and (R,R';S,S')-Fen formulations. The results demonstrated a potential pre-systemic enantioselective interaction in which (S,S')-Fen reduces the sulfation of the active (R,R')-Fen. The data suggest that a non-racemic mixture of the Fen enantiomers may provide better bioavailability of the active (R,R')-Fen for use in the treatment of cardiovascular disease. Published by Elsevier B.V.
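The precision criterion quoted above (CV below 10-15% for QC standards) amounts to a one-line computation; a minimal sketch with invented replicate values:

```python
# Coefficient-of-variation acceptance check for QC standards (data invented).
import numpy as np

def cv_percent(values):
    v = np.asarray(values, dtype=float)
    return 100.0 * v.std(ddof=1) / v.mean()

low_qc_plasma = [148, 155, 151, 160, 146]        # pg/ml, hypothetical replicates
high_qc_plasma = [1890, 1975, 2020, 1940, 1910]  # pg/ml, hypothetical replicates

for name, vals, limit in [("low QC", low_qc_plasma, 10.0),
                          ("high QC", high_qc_plasma, 15.0)]:
    cv = cv_percent(vals)
    print(f"{name}: CV = {cv:.1f}% ({'pass' if cv < limit else 'fail'})")
```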
Operational CryoSat Product Quality Assessment
NASA Astrophysics Data System (ADS)
Mannan, Rubinder; Webb, Erica; Hall, Amanda; Bouzinac, Catherine
2013-12-01
The performance and quality of the CryoSat data products are routinely assessed by the Instrument Data quality Evaluation and Analysis Service (IDEAS). This information is then conveyed to the scientific and user community in order to allow them to utilise CryoSat data with confidence. This paper presents details of the Quality Control (QC) activities performed for CryoSat products under the IDEAS contract. Details of the different QC procedures and tools deployed by IDEAS to assess the quality of operational data are presented. The latest updates to the Instrument Processing Facility (IPF) for the Fast Delivery Marine (FDM) products and the future update to Baseline-C are discussed.
Energy transfer networks: Quasicontinuum photoluminescence linked to high densities of defects
Laurence, Ted A.; Ly, Sonny; Bude, Jeff D.; Baxamusa, Salmaan H.; Lepró, Xavier; Ehrmann, Paul
2017-11-06
In a series of studies related to laser-induced damage of optical materials and deposition of plastics, we discovered a broadly emitting photoluminescence with fast lifetimes that we termed quasicontinuum photoluminescence (QC-PL). Here, we suggest that a high density of optically active defects leads to QC-PL, where interactions between defects affect the temporal and spectral characteristics of both excitation and emission. We develop a model that predicts the temporal characteristics of QC-PL, based on energy transfer interactions between high densities of defects. Our model does not explain all spectral broadening and redshifts found in QC-PL, since we do not model spectral changes in defects due to proximity to other defects. However, we do provide an example of a well-defined system that exhibits the QC-PL characteristics of a distribution in shortened lifetimes and broadened, redshifted energy levels: an organic chromophore (fluorescein) that has been dried rapidly on a fused silica surface. Recently, we showed that regions of fused silica exposed to up to 1 billion high-fluence laser shots at 351 nm at subdamage fluences exhibit significant transmission losses at the surface. Here, we find that these laser-exposed regions also exhibit QC-PL. Increases in the density of induced defects on these laser-exposed surfaces, as measured by the local transmission loss, lead to decreases in the observed lifetime and redshifts in the spectrum of the QC-PL, consistent with our explanation for QC-PL. In conclusion, we have found QC-PL in an increasing variety of situations and materials, and we believe it is a phenomenon commonly found on surfaces and nanostructured materials.
Code of Federal Regulations, 2010 CFR
2010-04-01
...' Benefits EMPLOYMENT AND TRAINING ADMINISTRATION, DEPARTMENT OF LABOR QUALITY CONTROL IN THE FEDERAL-STATE... QC unit. The organizational location of this unit shall be positioned to maximize its objectivity, to... organizational conflict of interest. ...
78 FR 48766 - Petition for Waiver of Compliance
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-09
...'s Network Management Center in Montreal, QC, Canada. CP operates approximately six to eight trains a day over this segment. The trackage is operated under a Centralized Traffic Control system and...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-27
... Quality Control process for the Supplemental Nutrition Assistance Program and the FNS-248 will be removed... other forms of information technology. Comments may be sent to: Francis B. Heil, Chief, Quality Control... directed to Francis B. Heil, (703) 305-2442. SUPPLEMENTARY INFORMATION: Title: Negative Quality Control...
Hoang, Van-Hai; Tran, Phuong-Thao; Cui, Minghua; Ngo, Van T H; Ann, Jihyae; Park, Jongmi; Lee, Jiyoun; Choi, Kwanghyun; Cho, Hanyang; Kim, Hee; Ha, Hee-Jin; Hong, Hyun-Seok; Choi, Sun; Kim, Young-Ho; Lee, Jeewoo
2017-03-23
Glutaminyl cyclase (QC) has been implicated in the formation of toxic amyloid plaques by generating the N-terminal pyroglutamate of β-amyloid peptides (pGlu-Aβ) and thus may participate in the pathogenesis of Alzheimer's disease (AD). We designed a library of QC inhibitors based on the proposed binding mode of the preferred substrate, Aβ3E-42. An in vitro structure-activity relationship study identified several excellent QC inhibitors demonstrating 5- to 40-fold increases in potency compared to a known QC inhibitor. When tested in mouse models of AD, compound 212 significantly reduced the brain concentrations of pyroform Aβ and total Aβ and restored cognitive functions. This potent Aβ-lowering effect was achieved by incorporating an additional binding region into our previously established pharmacophoric model, resulting in strong interactions with the carboxylate group of Glu327 in the QC binding site. Our study offers useful insights for designing novel QC inhibitors as a potential treatment option for AD.
Lin, Jou-Wei; Yang, Chen-Wei
2010-01-01
The objective of this study was to develop and validate an automated acquisition system to assess quality of care (QC) measures for cardiovascular diseases. This system, combining searching and retrieval algorithms, was designed to extract QC measures from electronic discharge notes and to estimate attainment rates against the current standards of care. It was developed on patients with ST-segment elevation myocardial infarction and tested on patients with unstable angina/non-ST-segment elevation myocardial infarction, two diseases sharing almost the same QC measures. The system reached reasonable agreement (κ values) with medical experts, from 0.65 (early reperfusion rate) to 0.97 (β-blockers and lipid-lowering agents before discharge), for different QC measures in the test set, and was then applied to evaluate QC in patients who underwent coronary artery bypass grafting surgery. The results validate a new tool to reliably extract QC measures for cardiovascular diseases. PMID:20442141
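The agreement statistic used for this kind of validation is Cohen's κ; a minimal sketch with fabricated expert/automated labels:

```python
# Cohen's kappa between expert and automated QC-measure assignments
# (labels are invented for illustration).
from sklearn.metrics import cohen_kappa_score

expert    = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]   # 1 = QC measure attained
automated = [1, 1, 0, 1, 1, 1, 1, 0, 1, 0]
print(f"kappa = {cohen_kappa_score(expert, automated):.2f}")
```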
Hertrampf, A; Sousa, R M; Menezes, J C; Herdling, T
2016-05-30
Quality control (QC) in the pharmaceutical industry is a key activity in ensuring that medicines have the required quality, safety and efficacy for their intended use. QC departments at pharmaceutical companies are responsible for all release testing of final products as well as all incoming raw materials. Near-infrared spectroscopy (NIRS) and Raman spectroscopy are important techniques for fast and accurate identification and qualification of pharmaceutical samples. Tablets containing two different active pharmaceutical ingredients (APIs; bisoprolol and hydrochlorothiazide) in different commercially available dosages were analysed using Raman and NIR spectroscopy. The goal was to define multivariate models based on each vibrational spectroscopy to discriminate between different dosages (identity) and to predict the dosage (semi-quantitative). Furthermore, the combination of the spectroscopic techniques was investigated. To this end, two different multiblock techniques based on PLS were applied: multiblock PLS (MB-PLS) and sequential-orthogonalised PLS (SO-PLS). NIRS showed better results than Raman spectroscopy for both identification and quantitation. The multiblock techniques showed that each spectroscopy contains information not present or captured in the other, demonstrating a potential benefit in their combined use for both identification and quantitation purposes. Copyright © 2016 Elsevier B.V. All rights reserved.
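A rough sketch of the SO-PLS idea on synthetic two-block data: fit PLS on the first block, orthogonalise the second block against the first block's score space, then fit a second PLS on what remains. Component counts and data are arbitrary assumptions, not the paper's tuned models.

```python
# Sequential-orthogonalised PLS (SO-PLS) sketch on synthetic spectra.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
n, p_nir, p_raman = 60, 120, 90
X_nir = rng.normal(size=(n, p_nir))                                  # NIR block
X_raman = 0.3 * X_nir[:, :p_raman] + rng.normal(size=(n, p_raman))   # Raman block
y = X_nir[:, :3].sum(axis=1, keepdims=True) + 0.1 * rng.normal(size=(n, 1))

pls1 = PLSRegression(n_components=3).fit(X_nir, y)
T1 = pls1.transform(X_nir)                     # scores of the first block
proj = T1 @ np.linalg.pinv(T1.T @ T1) @ T1.T   # projector onto the score space
X_raman_orth = X_raman - proj @ X_raman        # Raman info orthogonal to NIR
pls2 = PLSRegression(n_components=2).fit(X_raman_orth, y - pls1.predict(X_nir))

y_hat = pls1.predict(X_nir) + pls2.predict(X_raman_orth)
r2 = 1 - ((y - y_hat) ** 2).sum() / ((y - y.mean()) ** 2).sum()
print(f"R^2 of the combined two-block model: {r2:.3f}")
```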
Statistical analysis of the Nb3Sn strand production for the ITER toroidal field coils
NASA Astrophysics Data System (ADS)
Vostner, A.; Jewell, M.; Pong, I.; Sullivan, N.; Devred, A.; Bessette, D.; Bevillard, G.; Mitchell, N.; Romano, G.; Zhou, C.
2017-04-01
The ITER toroidal field (TF) strand procurement initiated the largest Nb3Sn superconducting strand production hitherto. The industrial-scale production started in Japan in 2008 and finished in summer 2015. Six ITER partners (so-called Domestic Agencies, or DAs) are in charge of the procurement and involved eight different strand suppliers all over the world, of which four use the bronze route (BR) process and four the internal-tin (IT) process. In total more than 500 tons have been produced, including excess material covering losses during the conductor manufacturing process, in particular the cabling. The procurement is based on a functional specification in which the main strand requirements, such as critical current, hysteresis losses, Cu ratio and residual resistance ratio, are specified but not the strand production process or layout. This paper presents an analysis of the data acquired during the quality control (QC) process that was carried out to ensure the same conductor performance requirements are met by the different strand suppliers regardless of strand design. The strand QC is based on 100% billet testing and on applying statistical process control (SPC) limits. Throughout the production, samples adjacent to the strand pieces tested by the suppliers are cross-checked (‘verified’) by their respective DAs' reference labs. The level of verification was lowered from 100% at the beginning of the procurement progressively to approximately 25% during the final phase of production. Based on the complete dataset of the TF strand production, an analysis of the SPC limits of the critical strand parameters is made and the related process capability indices are calculated. In view of the large-scale production and costs, key manufacturing parameters such as billet yield, number of breakages and piece-length distribution are also discussed. The results are compared among all the strand suppliers, focusing on the difference between BR and IT processes. Following the completion of the largest Nb3Sn strand production, our experience gained from monitoring the execution of the QC activities and from auditing the results of the measurements is summarised for future superconducting strand material procurement activities.
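The process-capability part of such an analysis reduces to a standard formula, Cpk = min(USL − μ, μ − LSL)/(3σ); a worked sketch with invented critical-current data and specification limits (not ITER's actual values):

```python
# Process capability index Cpk from simulated billet test data.
import numpy as np

# Hypothetical critical-current measurements (A) from one supplier's billets
ic = np.random.default_rng(2).normal(loc=205.0, scale=5.0, size=500)
lsl, usl = 190.0, 220.0            # invented specification limits

mu, sigma = ic.mean(), ic.std(ddof=1)
cpk = min(usl - mu, mu - lsl) / (3 * sigma)
print(f"mean = {mu:.1f} A, sigma = {sigma:.2f} A, Cpk = {cpk:.2f}")
# A common rule of thumb: Cpk >= 1.33 indicates a capable process.
```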
Design, implementation, and quality control in the Pathways American-Indian multicenter trial
Stone, Elaine J.; Norman, James E.; Davis, Sally M.; Stewart, Dawn; Clay, Theresa E.; Caballero, Ben; Lohman, Timothy G.; Murray, David M.
2016-01-01
Background Pathways was the first multicenter American-Indian school-based study to test the effectiveness of an obesity prevention program promoting healthy eating and physical activity. Methods Pathways employed a nested cohort design in which 41 schools were randomized to intervention or control conditions and students within these schools were followed as a cohort (1,704 third graders at baseline). The study’s primary endpoint was percent body fat. Secondary endpoints were levels of fat in school lunches; time spent in physical activity; and knowledge, attitudes, and behaviors regarding diet and exercise. Quality control (QC) included the design of data management systems that provided standardization and quality assurance of data collection and processing. Data QC procedures at study centers included manuals of operation, training and certification, and monitoring of performance. Process evaluation was conducted to monitor dose and fidelity of the interventions. Registration and tracking systems were used for students and schools. Results No difference in mean percent body fat at fifth grade was found between the intervention and control schools. Percent of calories from fat and saturated fat in school lunches was significantly reduced in the intervention schools, as was total energy intake from 24-hour recalls. Significant increases in self-reported physical activity levels and knowledge of healthy behaviors were found for the intervention school students. Conclusions The Pathways study results provide evidence demonstrating the role schools can play in public health promotion. Its study design and QC systems and procedures provide useful models for other similar school-based multi- or single-site studies. PMID:14636805
Wu, Vincent W.; Dana, Craig M.; Iavarone, Anthony T.; ...
2017-01-17
The breakdown of plant biomass to simple sugars is essential for the production of second-generation biofuels and high-value bioproducts. Currently, enzymes produced from filamentous fungi are used for deconstructing plant cell wall polysaccharides into fermentable sugars for biorefinery applications. A post-translational N-terminal pyroglutamate modification observed in some of these enzymes occurs when N-terminal glutamine or glutamate is cyclized to form a five-membered ring. This modification has been shown to confer resistance to thermal denaturation for CBH-1 and EG-1 cellulases. In mammalian cells, the formation of pyroglutamate is catalyzed by glutaminyl cyclases. Using the model filamentous fungus Neurospora crassa, we identified two genes (qc-1 and qc-2) that encode proteins homologous to mammalian glutaminyl cyclases. We show that qc-1 and qc-2 are essential for catalyzing the formation of an N-terminal pyroglutamate on CBH-1 and GH5-1. CBH-1 and GH5-1 produced in a Δqc-1 Δqc-2 mutant, and thus lacking the N-terminal pyroglutamate modification, showed greater sensitivity to thermal denaturation and, for GH5-1, susceptibility to proteolytic cleavage. QC-1 and QC-2 are endoplasmic reticulum (ER)-localized proteins. The pyroglutamate modification is predicted to occur in a number of additional fungal proteins that have diverse functions. The identification of glutaminyl cyclases in fungi may have implications for production of lignocellulolytic enzymes, heterologous expression, and biotechnological applications revolving around protein stability.
40 CFR 98.44 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Monitoring and QA/QC requirements. 98.44 Section 98.44 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Electricity Generation § 98.44 Monitoring and QA/QC...
40 CFR 98.44 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 21 2011-07-01 2011-07-01 false Monitoring and QA/QC requirements. 98.44 Section 98.44 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Electricity Generation § 98.44 Monitoring and QA/QC...
40 CFR 98.44 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 22 2013-07-01 2013-07-01 false Monitoring and QA/QC requirements. 98.44 Section 98.44 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Electricity Generation § 98.44 Monitoring and QA/QC...
40 CFR 98.44 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 21 2014-07-01 2014-07-01 false Monitoring and QA/QC requirements. 98.44 Section 98.44 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Electricity Generation § 98.44 Monitoring and QA/QC...
40 CFR 98.44 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 22 2012-07-01 2012-07-01 false Monitoring and QA/QC requirements. 98.44 Section 98.44 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Electricity Generation § 98.44 Monitoring and QA/QC...
40 CFR 98.64 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Monitoring and QA/QC requirements. 98.64 Section 98.64 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Aluminum Production § 98.64 Monitoring and QA/QC requirements...
40 CFR 98.64 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 21 2011-07-01 2011-07-01 false Monitoring and QA/QC requirements. 98.64 Section 98.64 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Aluminum Production § 98.64 Monitoring and QA/QC requirements...
40 CFR 98.84 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 21 2011-07-01 2011-07-01 false Monitoring and QA/QC requirements. 98.84 Section 98.84 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Cement Production § 98.84 Monitoring and QA/QC requirements...
References on EPA Quality Assurance Project Plans
Provides requirements for the conduct of quality management practices, including quality assurance (QA) and quality control (QC) activities, for all environmental data collection and environmental technology programs performed by or for this Agency.
Revision 2 of the Enbridge Quality Assurance Project Plan
This Quality Assurance Project Plan (QAPP) presents Revision 2 of the organization, objectives, planned activities, and specific quality assurance/quality control (QA/QC) procedures associated with the Enbridge Marshall Pipeline Release Project.
DOT National Transportation Integrated Search
2009-07-01
Current roadway quality control and quality acceptance (QC/QA) procedures for Louisiana include coring for thickness, density, and air void checks in hot mix asphalt (HMA) pavements and thickness and compressive strength for Portland cement con...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sailer, S.J.
This Quality Assurance Project Plan (QAPjP) specifies the quality of data necessary and the characterization techniques employed at the Idaho National Engineering Laboratory (INEL) to meet the objectives of the Department of Energy (DOE) Waste Isolation Pilot Plant (WIPP) Transuranic Waste Characterization Quality Assurance Program Plan (QAPP) requirements. This QAPjP is written to conform with the requirements and guidelines specified in the QAPP and the associated documents referenced in the QAPP. This QAPjP is one of a set of five interrelated QAPjPs that describe the INEL Transuranic Waste Characterization Program (TWCP). Each of the five facilities participating in the TWCP has a QAPjP that describes the activities applicable to that particular facility. This QAPjP describes the roles and responsibilities of the Idaho Chemical Processing Plant (ICPP) Analytical Chemistry Laboratory (ACL) in the TWCP. Data quality objectives and quality assurance objectives are explained. Sample analysis procedures and associated quality assurance measures are also addressed; these include: sample chain of custody; data validation, usability and reporting; documentation and records; audits and assessments; laboratory QC samples; and instrument testing, inspection, maintenance and calibration. Finally, administrative quality control measures, such as document control, control of nonconformances, variances and QA status reporting, are described.
SU-E-T-103: Development and Implementation of Web Based Quality Control Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Studinski, R; Taylor, R; Angers, C
Purpose: Historically, many radiation medicine programs have maintained their Quality Control (QC) test results in paper records or Microsoft Excel worksheets. Both approaches present significant logistical challenges and are not predisposed to data review and approval. It has been our group's aim to develop and implement web-based software designed not just to record and store QC data in a centralized database, but to provide scheduling and data review tools to help manage a radiation therapy clinic's equipment QC program. Methods: The software was written in the Python programming language using the Django web framework. In order to promote collaboration and validation from other centres, the code was made open source and is freely available to the public via an online source code repository. The code was written to provide a common user interface for data entry, formalize the review and approval process, and offer automated data trending and process control analysis of test results. Results: As of February 2014, our installation of QATrack+ has 180 tests defined in its database and has collected ∼22 000 test results, all of which have been reviewed and approved by a physicist via QATrack+'s review tools. These results include records for quality control of Elekta accelerators, CT simulators, our brachytherapy programme, TomoTherapy and CyberKnife units. Currently at least 5 other centres are known to be running QATrack+ clinically, forming the start of an international user community. Conclusion: QATrack+ has proven to be an effective tool for collecting radiation therapy QC data, allowing for rapid review and trending of data for a wide variety of treatment units. As free and open source software, all source code, documentation and a bug tracker are available to the public at https://bitbucket.org/tohccmedphys/qatrackplus/.
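The review step such software formalizes can be pictured as a tolerance/action classification of each test result against its reference value; the field names and limits below are illustrative only, not QATrack+'s actual configuration:

```python
# Tolerance/action classification of a QC test result (limits invented).
def review(value, reference, tol=0.02, action=0.03):
    """Classify a result by its relative deviation from the reference."""
    dev = abs(value - reference) / reference
    if dev <= tol:
        return "OK"
    return "TOLERANCE" if dev <= action else "ACTION"

for output in (100.4, 102.5, 103.8):   # e.g., daily linac output readings
    print(output, review(output, reference=100.0))
```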
Maurer, Matthew J.; Spear, Eric D.; Yu, Allen T.; Lee, Evan J.; Shahzad, Saba; Michaelis, Susan
2016-01-01
Cellular protein quality control (PQC) systems selectively target misfolded or otherwise aberrant proteins for degradation by the ubiquitin-proteasome system (UPS). How cells discern abnormal from normal proteins remains incompletely understood, but involves in part the recognition between ubiquitin E3 ligases and degradation signals (degrons) that are exposed in misfolded proteins. PQC is compartmentalized in the cell, and a great deal has been learned in recent years about ER-associated degradation (ERAD) and nuclear quality control. In contrast, a comprehensive view of cytosolic quality control (CytoQC) has yet to emerge, and will benefit from the development of a well-defined set of model substrates. In this study, we generated an isogenic “degron library” in Saccharomyces cerevisiae consisting of short sequences appended to the C-terminus of a reporter protein, Ura3. About half of these degron-containing proteins are substrates of the integral membrane E3 ligase Doa10, which also plays a pivotal role in ERAD and some nuclear protein degradation. Notably, some of our degron fusion proteins exhibit dependence on the E3 ligase Ltn1/Rkr1 for degradation, apparently by a mechanism distinct from its known role in ribosomal quality control of translationally paused proteins. Ubr1 and San1, E3 ligases involved in the recognition of some misfolded CytoQC substrates, are largely dispensable for the degradation of our degron-containing proteins. Interestingly, the Hsp70/Hsp40 chaperone/cochaperones Ssa1,2 and Ydj1 are required for the degradation of all constructs tested. Taken together, the comprehensive degron library presented here provides an important resource of isogenic substrates for testing candidate PQC components and identifying new ones. PMID:27172186
Low-Cost, High-Throughput Sequencing of DNA Assemblies Using a Highly Multiplexed Nextera Process.
Shapland, Elaine B; Holmes, Victor; Reeves, Christopher D; Sorokin, Elena; Durot, Maxime; Platt, Darren; Allen, Christopher; Dean, Jed; Serber, Zach; Newman, Jack; Chandran, Sunil
2015-07-17
In recent years, next-generation sequencing (NGS) technology has greatly reduced the cost of sequencing whole genomes, whereas the cost of sequence verification of plasmids via Sanger sequencing has remained high. Consequently, industrial-scale strain engineers either limit the number of designs or take shortcuts in quality control. Here, we show that over 4000 plasmids can be completely sequenced in one Illumina MiSeq run for less than $3 each (15× coverage), a 20-fold reduction over using Sanger sequencing (2× coverage). We reduced the volume of the Nextera tagmentation reaction 100-fold and developed an automated workflow to prepare thousands of samples for sequencing. We also developed software to track the samples and associated sequence data and to rapidly identify correctly assembled constructs having the fewest defects. As DNA synthesis and assembly become a centralized commodity, this NGS quality control (QC) process will be essential to groups operating high-throughput pipelines for DNA construction.
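The throughput claim is easy to sanity-check with back-of-envelope arithmetic; the run yield and construct size below are assumed round numbers, not figures from the paper:

```python
# Upper bound on plasmids per run at a target coverage (all inputs assumed).
run_yield_bp = 7.5e9        # total bases from one MiSeq v3 run (assumed)
plasmid_bp = 10_000         # average construct size (assumed)
target_coverage = 15        # coverage quoted in the abstract

bases_needed = plasmid_bp * target_coverage
print("upper bound on plasmids per run:", int(run_yield_bp // bases_needed))
# Sequencing yield alone allows far more than 4000 constructs; library prep
# and barcoding, not raw bases, are the practical bottlenecks.
```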
2012-05-01
with HPLC and PCBs with GC-ECD. Details of the chemical analysis are not included in this description but standard methods are referenced. [Table-of-contents fragments: 4.4 Analysis of samples to get the accumulated uptake in the fiber; 4.5 Determination of pore water...; 5.5 QC samples for chemical analysis]
40 CFR 98.424 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 22 2012-07-01 2012-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Suppliers of Carbon Dioxide § 98.424 Monitoring and QA/QC... determine quantity in accordance with this paragraph. (i) Reporters that supply CO2 in containers using...
40 CFR 98.424 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 22 2013-07-01 2013-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Suppliers of Carbon Dioxide § 98.424 Monitoring and QA/QC... determine quantity in accordance with this paragraph. (i) Reporters that supply CO2 in containers using...
40 CFR 98.334 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 22 2012-07-01 2012-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Zinc Production § 98.334 Monitoring and QA/QC requirements. If..., belt weigh feeders, weighed purchased quantities in shipments or containers, combination of bulk...
40 CFR 98.94 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 21 2011-07-01 2011-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Electronics Manufacturing § 98.94 Monitoring and QA/QC requirements. (a) For calendar year 2011 monitoring, you may follow the provisions in paragraphs (a)(1) through...
40 CFR 98.424 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 21 2011-07-01 2011-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Suppliers of Carbon Dioxide § 98.424 Monitoring and QA/QC... determine quantity in accordance with this paragraph. (i) Reporters that supply CO2 in containers using...
40 CFR 98.334 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 21 2011-07-01 2011-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Zinc Production § 98.334 Monitoring and QA/QC requirements. If..., belt weigh feeders, weighed purchased quantities in shipments or containers, combination of bulk...
40 CFR 98.334 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 21 2014-07-01 2014-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Zinc Production § 98.334 Monitoring and QA/QC requirements. If..., belt weigh feeders, weighed purchased quantities in shipments or containers, combination of bulk...
40 CFR 98.94 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 22 2013-07-01 2013-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Electronics Manufacturing § 98.94 Monitoring and QA/QC requirements. (a) For calendar year 2011 monitoring, you may follow the provisions in paragraphs (a)(1) through...
40 CFR 98.424 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 21 2014-07-01 2014-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Suppliers of Carbon Dioxide § 98.424 Monitoring and QA/QC... determine quantity in accordance with this paragraph. (i) Reporters that supply CO2 in containers using...
40 CFR 98.94 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 22 2012-07-01 2012-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Electronics Manufacturing § 98.94 Monitoring and QA/QC requirements. (a) For calendar year 2011 monitoring, you may follow the provisions in paragraphs (a)(1) through...
40 CFR 98.334 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 22 2013-07-01 2013-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Zinc Production § 98.334 Monitoring and QA/QC requirements. If..., belt weigh feeders, weighed purchased quantities in shipments or containers, combination of bulk...
40 CFR 98.94 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 21 2014-07-01 2014-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Electronics Manufacturing § 98.94 Monitoring and QA/QC...-specific heel factors for each container type for each gas used, according to the procedures in paragraphs...
Measurement of pulmonary capillary blood flow in infants by plethysmography.
Stocks, J; Costeloe, K; Winlove, C P; Godfrey, S
1977-01-01
An accurate method for measuring effective pulmonary capillary blood flow (Qc eff) in infants has been developed with an adaptation of the plethysmographic technique. Measurements were made on 19 preterm, 14 small-for-dates, and 7 fullterm normal infants with a constant volume whole body plethysmograph in which the infant rebreathed nitrous oxide. There was a highly significant correlation between Qc eff and body weight, and this relationship was unaffected by premature delivery or intrauterine growth retardation. Mean Qc eff in preterm, small-for-dates, and fullterm infants was 203, 208 and 197 ml min⁻¹ kg⁻¹, respectively, with no significant differences between the groups. A significant negative correlation existed between Qc eff and haematocrit in the preterm infants. There was no relationship between weight-standardized Qc eff and postnatal age in any of the groups. With this technique, it was possible to readily recognise the presence of rapid recirculation (indicative of shunting) in several of the infants, suggesting that rebreathing methods for the assessment of Qc eff should not be applied indiscriminately during the neonatal period. By taking care to overcome the potential sources of technical error, it was possible to obtain highly reproducible results for Qc eff in infants over a wider age range than has previously been reported. PMID:838861
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1995-09-01
The Quality Assurance/Quality Control (QA/QC) Program for Phase 2 of the Clinch River Remedial Investigation (CRRI) was designed to comply with both Department of Energy (DOE) Order 5700.6C and Environmental Protection Agency (EPA) QAMS-005/80 (EPA 1980a) guidelines. QA requirements and the general QA objectives for Phase 2 data were defined in the Phase 2 Sampling and Analysis Plan (SAP)-Quality Assurance Project Plan, and scope changes noted in the Phase 2 Sampling and Analysis Plan Addendum. The QA objectives for Phase 2 data were the following: (1) Scientific data generated will withstand scientific and legal scrutiny. (2) Data will be gathered using appropriate procedures for sample collection, sample handling and security, chain of custody (COC), laboratory analyses, and data reporting. (3) Data will be of known precision and accuracy. (4) Data will meet data quality objectives (DQOs) defined in the Phase 2 SAP.
Quality Assurance and Quality Control Practices for Rehabilitation of Sewer and Water Mains
As part of the US Environmental Protection Agency (EPA)’s Aging Water Infrastructure Research Program, several areas of research are being pursued, including a review of quality assurance and quality control (QA/QC) practices and acceptance testing during the installation of reha...
DOT National Transportation Integrated Search
2008-04-01
The objective of this study was to develop resilient modulus prediction models for possible application in the quality control/quality assurance (QC/QA) procedures during and after the construction of pavement layers. Field and laboratory testing pro...
Quality Assurance and Quality Control Practices For Rehabilitation of Sewer and Water Mains
As part of the US Environmental Protection Agency (EPA)’s Aging Water Infrastructure Research Program, several areas of research are being pursued including a review of quality assurance and quality control (QA/QC) practices and acceptance testing during the installation of rehab...
USDA-ARS?s Scientific Manuscript database
A multi-laboratory broth microdilution method trial was performed to standardize the specialized test conditions required for the fish pathogens Flavobacterium columnare and F. psychrophilum. Nine laboratories tested the quality control (QC) strains Escherichia coli ATCC 25922 and Aeromonas salmonicid...
7 CFR 283.2 - Scope and applicability.
Code of Federal Regulations, 2010 CFR
2010-01-01
... agencies of Food and Nutrition Service quality control (QC) claims for Fiscal Year (“FY”) 1986 and... Agriculture Regulations of the Department of Agriculture (Continued) FOOD AND NUTRITION SERVICE, DEPARTMENT OF AGRICULTURE FOOD STAMP AND FOOD DISTRIBUTION PROGRAM APPEALS OF QUALITY CONTROL (“QC”) CLAIMS General § 283.2...
DOT National Transportation Integrated Search
2011-06-01
The main objective of this study is to investigate the use of the semi-circular bend (SCB) test as a quality assurance/quality control (QA/QC) measure for field construction. Comparison of fracture properties from the SCB test and fatigue beam te...
NASA Astrophysics Data System (ADS)
Le, Loan T.
Over the span of more than 20 years of development, the Quantum Cascade (QC) laser has positioned itself as the most viable mid-infrared (mid-IR) light source. Today's QC lasers emit watts of continuous wave power at room temperature. Despite significant progress, the mid-IR region remains vastly under-utilized. State-of-the-art QC lasers are found in high power defense applications and detection of trace gases with narrow absorption lines. A large number of applications, however, do not require so much power, but rather, a broadly tunable laser source to detect molecules with broad absorption features. As such, a QC laser that is broadly tunable over the entire biochemical fingerprinting region remains the missing link to markets such as non-invasive biomedical diagnostics, food safety, and stand-off detection in turbid media. In this thesis, we detail how we utilized the inherent flexibility of the QC design space to conceive a new type of laser with the potential to bridge that missing link of the QC laser to large commercial markets. Our design concept, the Super Cascade (SC) laser, works contrary to conventional laser design principle by supporting multiple independent optical transitions, each contributing to broadening the gain spectrum. We have demonstrated a room temperature laser gain medium with electroluminescence spanning 3.3-12.5 μm and laser emission from 6.2-12.5 μm, the record spectral width for any solid state laser gain medium. This gain bandwidth covers the entire biochemical fingerprinting region. The achievement of such a spectrally broad gain medium presents engineering challenges of how to optimally utilize the bandwidth. As of this work, a monolithically integrated array of Distributed Feedback QC (DFB-QC) lasers is one of the most promising ways to fully utilize the SC gain bandwidth. Therefore, in this thesis, we explore ways of improving the yield and ease of fabrication of DFB-QC lasers, including a re-examination of the role of current spreading in QC geometry.
DOT National Transportation Integrated Search
2010-06-01
This manual provides information and recommended procedures to be utilized by an agency's Weigh-in-Motion (WIM) Office Data Analyst to perform validation and quality control (QC) checks of WIM traffic data. This manual focuses on data generated by ...
Analysis of QA procedures at the Oregon Department of Transportation.
DOT National Transportation Integrated Search
2010-06-01
This research explored the Oregon Department of Transportation (ODOT) practice of Independent Assurance (IA), for validation of the contractor's test methods, and Verification, for validation of the contractor's Quality Control (QC) data. The...
40 CFR 98.474 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 22 2013-07-01 2013-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Injection of Carbon Dioxide § 98.474 Monitoring and QA/QC.... (2) You must determine the quarterly mass or volume of contents in all containers if you receive CO2...
40 CFR 98.474 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 21 2014-07-01 2014-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Injection of Carbon Dioxide § 98.474 Monitoring and QA/QC.... (2) You must determine the quarterly mass or volume of contents in all containers if you receive CO2...
40 CFR 98.474 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 22 2012-07-01 2012-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Injection of Carbon Dioxide § 98.474 Monitoring and QA/QC.... (2) You must determine the quarterly mass or volume of contents in all containers if you receive CO2...
40 CFR 98.424 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Suppliers of Carbon Dioxide § 98.424 Monitoring and QA/QC... containers shall measure the mass in each CO2 container using weigh bills, scales, or load cells and sum the...
Study of quantum correlation swapping with relative entropy methods
NASA Astrophysics Data System (ADS)
Xie, Chuanmei; Liu, Yimin; Chen, Jianlan; Zhang, Zhanjun
2016-02-01
To generate long-distance shared quantum correlations (QCs) for information processing in future quantum networks, we recently proposed the concept of the QC repeater and its kernel technique, QC swapping. We also extensively studied QC swapping between two simple QC resources (i.e., a pair of Werner states) with four different methods of quantifying QCs (Xie et al. in Quantum Inf Process 14:653-679, 2015). In this paper, we continue to treat the same issue by employing three other methods associated with relative entropies, i.e., the MPSVW method (Modi et al. in Phys Rev Lett 104:080501, 2010), the Zhang method (arXiv:1011.4333 [quant-ph]) and the RS method (Rulli and Sarandy in Phys Rev A 84:042109, 2011). We first derive analytic expressions for all QCs that occur during the swapping process and then reveal their properties concerning monotonicity and thresholds. Importantly, we find that a long-distance shared QC can indeed be generated from two short-distance ones via QC swapping. In addition, we briefly compare our present results with our previous ones.
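The quantifiers in question build on the quantum relative entropy S(ρ‖σ) = Tr[ρ(log ρ − log σ)]. A numerical sketch evaluating it for a Werner state against the maximally mixed state; the chosen reference σ is for illustration only, not the minimiser used in the cited methods:

```python
# Quantum relative entropy of a Werner state vs. the maximally mixed state.
import numpy as np
from scipy.linalg import logm

def rel_entropy(rho, sigma):
    """Quantum relative entropy S(rho||sigma), in nats."""
    return float(np.real(np.trace(rho @ (logm(rho) - logm(sigma)))))

psi = np.array([1.0, 0.0, 0.0, -1.0]) / np.sqrt(2)   # a two-qubit Bell state
bell = np.outer(psi, psi)
z = 0.8                                              # Werner mixing parameter
werner = z * bell + (1 - z) * np.eye(4) / 4          # full-rank Werner state
sigma = np.eye(4) / 4                                # maximally mixed reference
print(f"S(rho||sigma) = {rel_entropy(werner, sigma):.3f}")
```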
Quantum Testbeds Stakeholder Workshop (QTSW) Report meeting purpose and agenda.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hebner, Gregory A.
Quantum computing (QC) is a promising early-stage technology with the potential to provide scientific computing capabilities far beyond what is possible with even an Exascale computer in specific problems of relevance to the Office of Science. These include (but are not limited to) materials modeling, molecular dynamics, and quantum chromodynamics. However, commercial QC systems are not yet available and the technical maturity of current QC hardware, software, algorithms, and systems integration is woefully incomplete. Thus, there is a significant opportunity for DOE to define the technology building blocks, and solve the system integration issues to enable a revolutionary tool. Once realized, QC will have world-changing impact on economic competitiveness, the scientific enterprise, and citizen well-being. Prior to this workshop, DOE / Office of Advanced Scientific Computing Research (ASCR) hosted a workshop in 2015 to explore QC scientific applications. The goal of that workshop was to assess the viability of QC technologies to meet the computational requirements in support of DOE's science and energy mission and to identify the potential impact of these technologies.
Lapse time and frequency-dependent coda wave attenuation for Delhi and its surrounding regions
NASA Astrophysics Data System (ADS)
Das, Rabin; Mukhopadhyay, Sagarika; Singh, Ravi Kant; Baidya, Pushap R.
2018-07-01
Attenuation of seismic wave energy of Delhi and its surrounding regions has been estimated using coda of local earthquakes. Estimated quality factor (Qc) values are strongly dependent on frequency and lapse time. Frequency dependence of Qc has been estimated from the relationship Qc(f) = Q0 f^n for different lapse time window lengths. Q0 and n values vary from 73 to 453 and 0.97 to 0.63 for lapse time window lengths of 15 s to 90 s respectively. The average estimated frequency-dependent relation is Qc(f) = (135 ± 8) f^(0.96 ± 0.02) for the entire region for a window length of 30 s, where the average Qc value varies from 200 at 1.5 Hz to 1962 at 16 Hz. These values show that the region is seismically active and highly heterogeneous. The entire study region is divided into two sub-regions according to the geology of the area to investigate if there is a spatial variation in attenuation characteristics in this region. It is observed that at smaller lapse time both regions have similar Qc values. However, at larger lapse times the rate of increase of Qc with frequency is larger for Region 2 compared to Region 1. This is understandable, as it is closer to the tectonically more active Himalayan ranges and seismically more active compared to Region 1. The difference in variation of Qc with frequencies for the two regions is such that at larger lapse time and higher frequencies Region 2 shows higher Qc compared to Region 1. For lower frequencies the opposite situation is true. This indicates that there is a systematic variation in attenuation characteristics from the south (Region 1) to the north (Region 2) in the deeper part of the study area. This variation can be explained in terms of an increase in heat flow and a decrease in the age of the rocks from south to north.
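Recovering Q0 and n from measured Qc(f) is a linear fit in log-log space, since ln Qc = ln Q0 + n ln f. A sketch with invented values chosen to resemble the reported range:

```python
# Fit the power law Qc(f) = Q0 * f**n by regression in log-log space.
import numpy as np

f = np.array([1.5, 3.0, 6.0, 12.0, 16.0])        # centre frequencies, Hz
qc = np.array([200., 390., 760., 1500., 1960.])  # coda-Q values (invented)

n, log_q0 = np.polyfit(np.log(f), np.log(qc), 1)
print(f"Q0 = {np.exp(log_q0):.0f}, n = {n:.2f}")  # expect roughly Q0~135, n~0.96
```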
Summation rules for a fully nonlocal energy-based quasicontinuum method
NASA Astrophysics Data System (ADS)
Amelang, J. S.; Venturini, G. N.; Kochmann, D. M.
2015-09-01
The quasicontinuum (QC) method coarse-grains crystalline atomic ensembles in order to bridge the scales from individual atoms to the micro- and mesoscales. A crucial cornerstone of all QC techniques, summation or quadrature rules efficiently approximate the thermodynamic quantities of interest. Here, we investigate summation rules for a fully nonlocal, energy-based QC method to approximate the total Hamiltonian of a crystalline atomic ensemble by a weighted sum over a small subset of all atoms in the crystal lattice. Our formulation does not conceptually differentiate between atomistic and coarse-grained regions and thus allows for seamless bridging without domain-coupling interfaces. We review traditional summation rules and discuss their strengths and weaknesses with a focus on energy approximation errors and spurious force artifacts. Moreover, we introduce summation rules which produce no residual or spurious force artifacts in centrosymmetric crystals in the large-element limit under arbitrary affine deformations in two dimensions (and marginal force artifacts in three dimensions), while allowing us to seamlessly bridge to full atomistics. Through a comprehensive suite of examples with spatially non-uniform QC discretizations in two and three dimensions, we compare the accuracy of the new scheme to various previous ones. Our results confirm that the new summation rules exhibit significantly smaller force artifacts and energy approximation errors. Our numerical benchmark examples include the calculation of elastic constants from completely random QC meshes and the inhomogeneous deformation of aggressively coarse-grained crystals containing nano-voids. In the elastic regime, we directly compare QC results to those of full atomistics to assess global and local errors in complex QC simulations. Going beyond elasticity, we illustrate the performance of the energy-based QC method with the new second-order summation rule by the help of nanoindentation examples with automatic mesh adaptation. Overall, our findings provide guidelines for the selection of summation rules for the fully nonlocal energy-based QC method.
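The core idea of a summation rule, replacing the full lattice sum by a weighted sum over a small set of sampling atoms, can be sketched in a few lines on a 1-D toy energy field; the uniform per-segment weights below are far cruder than the rules developed in the paper:

```python
# Toy 1-D summation rule: weighted sum over sampling atoms vs. the exact sum.
import numpy as np

n_atoms = 1000
x = np.arange(n_atoms)
site_energy = 0.5 * (np.cos(2 * np.pi * x / n_atoms) + 1)  # smooth energy field

exact = site_energy.sum()                    # full lattice sum over every atom
step = 20
samples = x[::step]                          # one sampling atom per 20 sites
approx = (step * site_energy[samples]).sum() # weighted sum over the subset
print(f"exact = {exact:.1f}, summation-rule estimate = {approx:.1f}")
```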
Olijnyk, Nicholas V
2018-01-01
This study performed two phases of analysis to shed light on the performance and thematic evolution of China's quantum cryptography (QC) research. First, large-scale research publication metadata derived from QC research published from 2001-2017 was used to examine the research performance of China relative to that of global peers using established quantitative and qualitative measures. Second, this study identified the thematic evolution of China's QC research using co-word cluster network analysis, a computational science mapping technique. The results from the first phase indicate that over the past 17 years, China's performance has evolved dramatically, placing it in a leading position. Among the most significant findings is the exponential rate at which all of China's performance indicators (i.e., Publication Frequency, citation score, H-index) are growing. China's H-index (a normalized indicator) has surpassed all other countries' over the last several years. The second phase of analysis shows how China's main research focus has shifted among several QC themes, including quantum-key-distribution, photon-optical communication, network protocols, and quantum entanglement with an emphasis on applied research. Several themes were observed across time periods (e.g., photons, quantum-key-distribution, secret-messages, quantum-optics, quantum-signatures); some themes disappeared over time (e.g., computer-networks, attack-strategies, bell-state, polarization-state), while others emerged more recently (e.g., quantum-entanglement, decoy-state, unitary-operation). Findings from the first phase of analysis provide empirical evidence that China has emerged as the global driving force in QC. Considering China is the premier driving force in global QC research, findings from the second phase of analysis provide an understanding of China's QC research themes, which can provide clarity into how QC technologies might take shape. QC and science and technology policy researchers can also use these findings to trace previous research directions and plan future lines of research.
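One of the performance indicators tracked in the study, the h-index, is simple to compute (a record has index h if h of its papers have at least h citations each); a sketch with fabricated citation counts:

```python
# h-index of a citation record (citation counts are invented).
def h_index(citations):
    ranked = sorted(citations, reverse=True)
    return sum(c >= i + 1 for i, c in enumerate(ranked))

print(h_index([42, 17, 9, 6, 5, 3, 1, 0]))  # -> 5
```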
Zhao, Xueqing; Zeisel, Steven H; Zhang, Shucha
2015-06-17
There is a growing interest in analyzing choline, betaine, and their gut microbial metabolites including trimethylamine (TMA) and trimethylamine N-oxide (TMAO) in body fluids due to the high relevance of these compounds for human health and diseases. A stable isotope dilution (SID)-LC-MRM-MS assay was developed for the simultaneous determination of choline, betaine, TMA, TMAO, and creatinine in human plasma and urine. The assay was validated using quality control (QC) plasma samples, spiked at low, medium, and high levels. Freeze-thaw stability was also evaluated. The utility of this assay for urine was demonstrated using a nutritional clinical study on the effect of various egg doses on TMAO production in humans. This assay has a wide dynamic range (R² > 0.994) for all the analytes (choline: 0.122-250 μM; betaine: 0.488-1000 μM; TMA: 0.244-250 μM; TMAO: 0.061-62.5 μM; and creatinine: 0.977-2000 μM). High intra- and inter-day precision (CV < 6%) and high accuracy (< 15% error) were observed from the QC plasma samples. The assay is reliable for samples undergoing multiple freeze-thaw cycles (tested up to eight cycles). The assay also works for urine samples as demonstrated by a clinical study in which we observed a significant, positive linear response to various egg doses for urinary concentrations of all the analytes except creatinine. A rapid SID-LC-MRM-MS assay for simultaneous quantification of choline, betaine, TMA, TMAO, and creatinine has been developed and validated, and is expected to find wide application in nutrition and cardiovascular studies as well as diagnosis and management of trimethylaminuria. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
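[Editor's note] For readers unfamiliar with how precision (CV) and accuracy figures like those above are derived from spiked QC samples, a minimal sketch follows; the replicate values and nominal concentration are hypothetical, and a real validation may follow a specific regulatory formula.

```python
import statistics

def cv_percent(values):
    """Coefficient of variation (%) = 100 * SD / mean."""
    return 100 * statistics.stdev(values) / statistics.mean(values)

def accuracy_error_percent(values, nominal):
    """Relative error (%) of the mean versus the nominal spiked concentration."""
    return 100 * abs(statistics.mean(values) - nominal) / nominal

# Hypothetical intra-day replicates of a low-level choline QC (nominal 0.5 uM)
replicates = [0.48, 0.52, 0.50, 0.49, 0.51]
print(f"CV = {cv_percent(replicates):.1f}%")                      # precision
print(f"error = {accuracy_error_percent(replicates, 0.5):.1f}%")  # accuracy
```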
Countably QC-Approximating Posets
Mao, Xuxin; Xu, Luoshan
2014-01-01
As a generalization of countably C-approximating posets, the concept of countably QC-approximating posets is introduced. With the countably QC-approximating property, some characterizations of generalized completely distributive lattices and generalized countably approximating posets are given. The main results are as follows: (1) a complete lattice is generalized completely distributive if and only if it is countably QC-approximating and weakly generalized countably approximating; (2) a poset L having countably directed joins is generalized countably approximating if and only if the lattice σc(L)^op of all σ-Scott-closed subsets of L is weakly generalized countably approximating. PMID:25165730
40 CFR 98.244 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... procedures specified in § 98.34(c). (b) If you use the mass balance methodology in § 98.243(c), use the... Test Methods for Instrumental Determination of Carbon, Hydrogen, and Nitrogen in Petroleum Products and... Instrumental Determination of Carbon, Hydrogen, and Nitrogen in Laboratory Samples of Coal (incorporated by...
40 CFR 98.174 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... emissions using the carbon mass balance procedure in § 98.173(b)(1), you must: (1) Except as provided in... Methods for Instrumental Determination of Carbon, Hydrogen, and Nitrogen in Laboratory Samples of Coal..., Nitrogen, and Oxygen in Steel, Iron, Nickel, and Cobalt Alloys by Various Combustion and Fusion Techniques...
40 CFR 98.244 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
... procedures specified in § 98.34(c). (b) If you use the mass balance methodology in § 98.243(c), use the...) Standard Test Methods for Instrumental Determination of Carbon, Hydrogen, and Nitrogen in Petroleum... for Instrumental Determination of Carbon, Hydrogen, and Nitrogen in Laboratory Samples of Coal...
Desaules, André
2012-11-01
It is crucial for environmental monitoring to fully control temporal bias, which is the distortion of real data evolution by varying bias through time. Temporal bias cannot be fully controlled by statistics alone but requires appropriate and sufficient metadata, which should be under rigorous and continuous quality assurance and control (QA/QC) to reliably document the degree of consistency of the monitoring system. All presented strategies to detect and control temporal data bias (QA/QC, harmonisation/homogenisation/standardisation, the mass balance approach, the use of tracers and analogues, and the control of changing boundary conditions) rely on metadata. The Will Rogers phenomenon, caused by subsequent reclassification, is introduced here as a particular source of temporal data bias in environmental monitoring. Sources and effects of temporal data bias are illustrated by examples from the Swiss soil monitoring network. The attempt to make a comprehensive compilation and assessment of required metadata for soil contamination monitoring reveals that most metadata are still far from being reliable. This leads to the conclusion that progress in environmental monitoring means further development of the concept of environmental metadata for the sake of temporal data bias control as a prerequisite for reliable interpretations and decisions.
A method to establish seismic noise baselines for automated station assessment
McNamara, D.E.; Hutt, C.R.; Gee, L.S.; Benz, H.M.; Buland, R.P.
2009-01-01
We present a method for quantifying station noise baselines and characterizing the spectral shape of out-of-nominal noise sources. Our intent is to automate this method in order to ensure that only the highest-quality data are used in rapid earthquake products at NEIC. In addition, the station noise baselines provide a valuable tool to support the quality control of GSN and ANSS backbone data and metadata. The procedures addressed here are currently in development at the NEIC, and work is underway to understand how quickly changes from nominal can be observed and used within the NEIC processing framework. The spectral methods and software used to compute station baselines and described herein (PQLX) can be useful to both permanent and portable seismic station operators. Applications include: general seismic station and data quality control (QC), evaluation of instrument responses, assessment of near real-time communication system performance, characterization of site cultural noise conditions, and evaluation of sensor vault design, as well as assessment of gross network capabilities (McNamara et al. 2005). Future PQLX development plans include incorporating station baselines for automated QC methods and automating station status report generation and notification based on user-defined QC parameters. The PQLX software is available through the USGS (http://earthquake.usgs.gov/research/software/pqlx.php) and IRIS (http://www.iris.edu/software/pqlx/).
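[Editor's note] A minimal numpy sketch of the percentile-baseline idea described above, assuming a precomputed array of hourly power spectral density (PSD) estimates; this is an illustration of the concept, not PQLX's implementation.

```python
import numpy as np

def noise_baselines(psd_segments, low=5, high=95):
    """psd_segments: 2-D array (n_segments x n_periods) of PSD estimates in dB.
    Returns per-period percentile curves bounding the station's nominal noise."""
    psd = np.asarray(psd_segments)
    return (np.percentile(psd, low, axis=0),   # quiet baseline
            np.percentile(psd, 50, axis=0),    # median
            np.percentile(psd, high, axis=0))  # noisy baseline

# Hypothetical: 1000 hourly PSD segments over 50 period bins
rng = np.random.default_rng(0)
segments = -140 + 5 * rng.standard_normal((1000, 50))
lo, med, hi = noise_baselines(segments)
out_of_nominal = (segments[0] < lo) | (segments[0] > hi)  # flag one new segment
```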
NASA Astrophysics Data System (ADS)
Choi, Hyunwoo; Kim, Tae Geun; Shin, Changhwan
2017-06-01
A topological insulator (TI) is a new kind of material that exhibits unique electronic properties owing to its topological surface state (TSS). Previous studies focused on the transport properties of the TSS, since it can be used as the active channel layer in metal-oxide-semiconductor field-effect transistors (MOSFETs). However, a TI with a negative quantum capacitance (QC) effect can be used in the gate stack of MOSFETs, thereby facilitating the creation of ultra-low power electronics. Therefore, it is important to study the physics behind the QC in TIs in the absence of any external magnetic field, at room temperature. We fabricated a simple capacitor structure using a TI (TI-capacitor: Au-TI-SiO2-Si), which shows clear evidence of QC at room temperature. In the capacitance-voltage (C-V) measurement, the total capacitance of the TI-capacitor increases in the accumulation regime, since QC is the dominant capacitive component in the series capacitor model (i.e., 1/C_T = 1/C_Q + 1/C_SiO2). Based on the QC model of two-dimensional electron systems, we quantitatively calculated the QC, and observed that the simulated C-V curve theoretically supports the conclusion that the QC of the TI-capacitor originates from electron-electron interaction in the two-dimensional surface state of the TI.
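[Editor's note] The series-capacitor relation quoted above implies that a negative C_Q raises the total capacitance above the oxide capacitance alone; a small numerical sketch (arbitrary illustrative units, not the paper's fitted values) makes the arithmetic concrete.

```python
def total_capacitance(c_q, c_ox):
    """Series model: 1/C_T = 1/C_Q + 1/C_ox."""
    return 1.0 / (1.0 / c_q + 1.0 / c_ox)

c_ox = 1.0                             # oxide capacitance (arbitrary units)
print(total_capacitance(5.0, c_ox))    # positive C_Q: C_T < C_ox (~0.833)
print(total_capacitance(-5.0, c_ox))   # negative C_Q: C_T > C_ox (1.25)
```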
Material quality assurance risk assessment.
DOT National Transportation Integrated Search
2013-01-01
Over the past two decades the role of SHA has shifted from quality control (QC) of materials and placement techniques to quality assurance (QA) and acceptance. The role of the Office of Materials Technology (OMT) has been shifting towards assuran...
Data Validation & Laboratory Quality Assurance for Region 9
In all hazardous site investigations it is essential to know the quality of the data used for decision-making purposes. Validation of data requires that appropriate quality assurance and quality control (QA/QC) procedures be followed.
Long-term pavement performance indicators for failed materials.
DOT National Transportation Integrated Search
2016-04-01
State Transportation Agencies (STAs) use quality control/quality assurance (QC/QA) specifications to guide the testing and inspection of road pavement construction. Although failed materials of pavement rarely occur in practice, it is critical to h...
Material quality assurance risk assessment : [summary].
DOT National Transportation Integrated Search
2013-01-01
With the shift from quality control (QC) of materials and placement techniques to quality assurance (QA) and acceptance over the years, the role of the Office of Materials Technology (OMT) has been shifting towards assurance of material quality...
The April 1994 and October 1994 radon intercomparisons at EML
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fisenne, I.M.; George, A.C.; Perry, P.M.
1995-10-01
Quality assurance/quality control (QA/QC) are the backbone of many commercial and research processes and programs. QA/QC research tests the state of a functioning system, be it the production of manufactured goods or the ability to make accurate and precise measurements. The quality of radon measurements in the US has been tested under controlled conditions in semi-annual radon gas intercomparison exercises sponsored by the Environmental Measurements Laboratory (EML) since 1981. The two Calendar Year 1994 radon gas intercomparison exercises were conducted in the EML exposure chamber. Thirty-two groups including US Federal facilities, USDOE contractors, national and state laboratories, universities and foreign institutions participated in these exercises. The majority of the participants' results were within ±10% of the EML value at radon concentrations of 570 and 945 Bq m⁻³.
Quality control in the year 2000.
Schade, B
1992-01-01
'Just-in-time' production is a prerequisite for a company to meet the challenges of competition. Manufacturing cycles have been so successfully optimized that release time now has become a significant factor. A vision for a major quality-control (QC) contribution to profitability in this decade seems to be the just-in-time release. Benefits will go beyond cost savings for lower inventory. The earlier detection of problems will reduce rejections and scrap. In addition, problem analysis and problem-solving will be easier. To achieve just-in-time release, advanced automated systems like robots will become the workhorses in QC for high volume pharmaceutical production. The requirements for these systems are extremely high in terms of quality, reliability and ruggedness. Crucial for the success might be advances in use of microelectronics for error checks, system recording, trouble shooting, etc. as well as creative new approaches (for example the use of redundant assay systems).
El Amrani, Mohsin; Szanto, Celina L; Hack, C Erik; Huitema, Alwin D R; Nierkens, Stefan; van Maarseveen, Erik M
2018-06-25
Neuroblastoma is one of the most commonly found solid tumors in children. The monoclonal antibody dinutuximab (DNX) targets the sialic acid-containing glycosphingolipid GD2 expressed on almost all neuroblastoma tumor cells and induces cell lysis. However, the expression of GD2 is not limited to tumor cells only, but is also present on central nerve tissue and peripheral nerve cells, explaining dinutuximab toxicity. The most common adverse reactions are pain and discomfort, which may lead to discontinuation of the treatment. Furthermore, there is little to no data available on exposure-effect relationships of dinutuximab. We therefore developed a simple method to quantify dinutuximab levels in human plasma. Ammonium sulfate (AS) was used to precipitate all immunoglobulins (IgGs) in human plasma. After centrifugation, the supernatant containing albumin was decanted and the precipitated IgG fraction was re-dissolved in a buffer containing 0.5% sodium dodecyl sulfate (SDS). Samples were then reduced, alkylated, and digested with trypsin. Finally, a signature peptide in complementarity determining region 1 of the DNX heavy chain was quantified by LC-MS/MS using a stable isotopically labeled peptide as internal standard. AS purification efficiently removed 97.5% of the albumin fraction in the supernatant layer. The validation performed on DNX showed that within-run and between-run coefficients of variation (CV) at the lower limit of quantification (LLOQ) were 5.5 and 1.4%, respectively. The overall CVs for quality control (QC) low, QC medium, and QC high levels were < 5%. Linearity in the range 1-32 mg/L was excellent (r² > 0.999). Selectivity, stability, and matrix effect were in concordance with EMA guidelines. In conclusion, a method to quantify DNX in human plasma was successfully developed. In addition, the high and robust process efficiency enabled the utilization of a stable isotopically labeled (SIL) peptide instead of SIL DNX, which was commercially unavailable.
Yu, Kate; Little, David; Plumb, Rob; Smith, Brian
2006-01-01
A quantitative Ultra Performance liquid chromatography/tandem mass spectrometry (UPLC/MS/MS) protocol was developed for a five-compound mixture in rat plasma. A similar high-performance liquid chromatography/tandem mass spectrometry (HPLC/MS/MS) quantification protocol was developed for comparison purposes. Among the five test compounds, three preferred positive electrospray ionization (ESI) and two preferred negative ESI. As a result, both UPLC/MS/MS and HPLC/MS/MS analyses were performed by having the mass spectrometer collect ESI multiple reaction monitoring (MRM) data in both positive and negative ion modes during a single injection. Peak widths for most standards were 4.8 s for the HPLC analysis and 2.4 s for the UPLC analysis. There were 17 to 20 data points obtained for each of the LC peaks. Compared with the HPLC/MS/MS method, the UPLC/MS/MS method offered a 3-fold decrease in retention time and up to a 10-fold increase in detected peak height, with a 2-fold decrease in peak width. Limits of quantification (LOQs) for both HPLC and UPLC methods were evaluated. For UPLC/MS/MS analysis, a linear range of up to four orders of magnitude was obtained with r² values ranging from 0.991 to 0.998. The LOQs for the five analytes ranged from 0.08 to 9.85 ng/mL. Three levels of quality control (QC) samples were analyzed. For the UPLC/MS/MS protocol, the percent relative standard deviation (RSD%) for the low QC (2 ng/mL) ranged from 3.42 to 8.67% (N = 18). The carryover of the UPLC/MS/MS protocol was negligible and the robustness of the UPLC/MS/MS system was evaluated with up to 963 QC injections. Copyright 2006 John Wiley & Sons, Ltd.
Large-Scale Topographic Features on Venus: A Comparison by Geological Mapping in Four Quadrangles
NASA Astrophysics Data System (ADS)
Ivanov, M. A.; Head, J. W.
2002-05-01
We have conducted geological mapping in four quadrangles under the NASA program of geological mapping of Venus. Two quadrangles portray large equidimensional lowlands (Lavinia, V55, and Atalanta, V4, Planitiae), and two more areas are characterized by a large corona (Quetzalpetlatl corona, QC, V66) and Lakshmi Planum (LP, V7). Geological mapping of these large-scale features allows for their broad comparison in terms of both typical structures and sequences of events. The Planitiae share a number of similar characteristics. (1) Lavinia and Atalanta are broad quasi-circular lowlands 1-2 km deep. (2) The central portions of the basins lack both coronae and large volcanoes. (3) Belts of tectonic deformation characterize the central portions of the basins. (4) There is evidence in both lowlands that they subsided predominantly before the emplacement of regional plains. (5) Recent volcanism is shifted toward the periphery of the basins and occurred after, or during the late stages of, the formation of the lowlands. The above characteristics of the lowlands are better reconciled with a scenario in which their formation is due to broad-scale mantle downwelling that started relatively early in the visible geologic history of Venus. QC and LP are elevated structures roughly comparable in size. The formation of QC is commonly attributed to large-scale positive mantle diapirism, while the formation of LP remains controversial, with both mantle upwelling and downwelling models proposed. QC and LP have similar characteristics, such as a broadly circular shape in plan view, association with regional highlands, relatively young associated volcanism, and a topographic moat bordering both QC and LP from the north. Despite these similarities, the differences between QC and LP are striking. LP is crowned by the highest mountain ranges on Venus, whereas QC is bordered from the north by a common belt of ridges. LP itself makes up a regional highland within the upland of Ishtar Terra, while QC produces a much less significant topographic anomaly against the background of the highland of Lada Terra. Highly deformed, tessera-like terrain apparently makes up the basement of LP, whereas QC formed in a tessera-free area. Volcanic activity is concentrated in the central portion of LP, while QC is a regionally important center of young volcanism. These differences, which probably cannot be accounted for by the simple difference in size between LP and QC, suggest non-similar modes of formation for the two regional structures and do not favor the upwelling models for the formation of LP.
Chen, Haiming; Lu, Chuanjian; Liu, Huazhen; Wang, Maojie; Zhao, Hui; Yan, Yuhong; Han, Ling
2017-07-01
Quercetin (QC) is a dietary flavonoid abundant in many natural plants. A series of studies has shown that it exhibits several biological properties, including anti-inflammatory, anti-oxidant, cardio-protective, vasodilatory, liver-protective and anti-cancer activities. However, so far the possible therapeutic effect of QC on psoriasis has not been reported. The present study was undertaken to evaluate the potential beneficial effect of QC in psoriasis using an imiquimod (IMQ)-induced psoriasis-like mouse model, and to further elucidate its underlying mechanisms of action. Effects of QC on PASI scores, back temperature, histopathological changes, oxidative/anti-oxidative indexes, pro-inflammatory cytokines and the NF-κB pathway in IMQ-induced mice were investigated. Our results showed that QC could significantly reduce the PASI scores, decrease the temperature of the psoriasis-like lesions, and ameliorate the deteriorating histopathology in IMQ-induced mice. Moreover, QC effectively attenuated levels of TNF-α, IL-6 and IL-17 in serum, increased activities of GSH, CAT and SOD, and decreased the accumulation of MDA in skin tissue induced by IMQ in mice. The mechanism may be associated with the down-regulation of NF-κB, IKKα, NIK and RelB expression and up-regulation of TRAF3, which are critically involved in the non-canonical NF-κB pathway. In conclusion, our present study demonstrated that QC had appreciable anti-psoriasis effects in IMQ-induced mice, and the underlying mechanism may involve the improvement of antioxidant and anti-inflammatory status and inhibition of the activation of NF-κB signaling. Hence, QC, a naturally occurring flavone with potent anti-psoriatic effects, has the potential for further development as a candidate for psoriasis treatment. Copyright © 2017 Elsevier B.V. All rights reserved.
Improvement of the quality of work in a biochemistry laboratory via measurement system analysis.
Chen, Ming-Shu; Liao, Chen-Mao; Wu, Ming-Hsun; Lin, Chih-Ming
2016-10-31
An adequate and continuous monitoring of operational variations can effectively reduce the uncertainty and enhance the quality of laboratory reports. This study applied the evaluation rule of the measurement system analysis (MSA) method to estimate the quality of work conducted in a biochemistry laboratory. Using the gauge repeatability & reproducibility (GR&R) approach, variations in quality control (QC) data among medical technicians in conducting measurements of five biochemical items, namely, serum glucose (GLU), aspartate aminotransferase (AST), uric acid (UA), sodium (Na) and chloride (Cl), were evaluated. The measurements of the five biochemical items showed different levels of variance among the different technicians, with the variances in GLU measurements being higher than those for the other four items. The ratios of precision-to-tolerance (P/T) for Na, Cl and GLU were all above 0.5, implying inadequate gauge capability. The product variation contribution of Na was large (75.45% and 31.24% in normal and abnormal QC levels, respectively), which showed that the impact of insufficient usage of reagents could not be excluded. With regard to reproducibility, high contributions (of more than 30%) of variation for the selected items were found. These high operator variation levels implied that the possibility of inadequate gauge capacity could not be excluded. The analysis of variance (ANOVA) of GR&R showed that the operator variations in GLU measurements were significant (F=5.296, P=0.001 in the normal level and F=3.399, P=0.015 in the abnormal level, respectively). In addition to operator variations, product variations of Na were also significant for both QC levels. The heterogeneity of variance for the five technicians showed significant differences for the Na and Cl measurements in the normal QC level. The accuracy of QC for five technicians was identified for further operational improvement. This study revealed that MSA can be used to evaluate product and personnel errors and to improve the quality of work in a biochemical laboratory through proper corrective actions.
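[Editor's note] The precision-to-tolerance ratio reported above is conventionally computed from the GR&R variance components; a minimal sketch with hypothetical variance values follows (the common k = 6 convention is assumed; 5.15 is also used in practice).

```python
import math

def pt_ratio(var_repeatability, var_reproducibility, tolerance, k=6.0):
    """Precision-to-tolerance ratio: P/T = k * sigma_GRR / tolerance.
    sigma_GRR combines repeatability and reproducibility variances."""
    sigma_grr = math.sqrt(var_repeatability + var_reproducibility)
    return k * sigma_grr / tolerance

# Hypothetical variance components for a glucose QC level (mg/dL)
print(round(pt_ratio(var_repeatability=4.0, var_reproducibility=2.25,
                     tolerance=24.0), 2))  # 0.62 -> above 0.5, inadequate gauge
```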
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hoisak, J; Manger, R; Dragojevic, I
Purpose: To perform a failure mode and effects analysis (FMEA) of the process for treating superficial skin cancers with the Xoft Axxent electronic brachytherapy (eBx) system, given the recent introduction of expanded quality control (QC) initiatives at our institution. Methods: A process map was developed listing all steps in superficial treatments with Xoft eBx, from the initial patient consult to the completion of the treatment course. The process map guided the FMEA to identify the failure modes for each step in the treatment workflow and assign Risk Priority Numbers (RPN), calculated as the product of the failure mode's probability of occurrence (O), severity (S) and lack of detectability (D). FMEA was done with and without the inclusion of recent QC initiatives such as increased staffing, physics oversight, standardized source calibration, treatment planning and documentation. The failure modes with the highest RPNs were identified and contrasted before and after introduction of the QC initiatives. Results: Based on the FMEA, the failure modes with the highest RPN were related to source calibration, treatment planning, and patient setup/treatment delivery (Fig. 1). The introduction of additional physics oversight, standardized planning and safety initiatives such as checklists and time-outs reduced the RPNs of these failure modes. High-risk failure modes that could be mitigated with improved hardware and software interlocks were identified. Conclusion: The FMEA analysis identified the steps in the treatment process presenting the highest risk. The introduction of enhanced QC initiatives mitigated the risk of some of these failure modes by decreasing their probability of occurrence and increasing their detectability. This analysis demonstrates the importance of well-designed QC policies, procedures and oversight in a Xoft eBx programme for treatment of superficial skin cancers. Unresolved high-risk failure modes highlight the need for non-procedural quality initiatives such as improved planning software and more robust hardware interlock systems.
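[Editor's note] A minimal sketch of the RPN computation described above; the failure modes and 1-10 scores below are hypothetical, not the study's actual ratings.

```python
def rpn(occurrence, severity, detectability):
    """Risk Priority Number = O x S x D, each typically scored 1-10
    (higher detectability score = harder to detect)."""
    return occurrence * severity * detectability

# Hypothetical failure modes for an eBx workflow, ranked by descending RPN
failure_modes = [
    ("source calibration error", 3, 9, 6),
    ("wrong applicator in plan", 2, 8, 4),
    ("patient setup deviation",  4, 6, 3),
]
for name, o, s, d in sorted(failure_modes, key=lambda m: -rpn(*m[1:])):
    print(f"{name}: RPN = {rpn(o, s, d)}")
```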
ChiLin: a comprehensive ChIP-seq and DNase-seq quality control and analysis pipeline.
Qin, Qian; Mei, Shenglin; Wu, Qiu; Sun, Hanfei; Li, Lewyn; Taing, Len; Chen, Sujun; Li, Fugen; Liu, Tao; Zang, Chongzhi; Xu, Han; Chen, Yiwen; Meyer, Clifford A; Zhang, Yong; Brown, Myles; Long, Henry W; Liu, X Shirley
2016-10-03
Transcription factor binding, histone modification, and chromatin accessibility studies are important approaches to understanding the biology of gene regulation. ChIP-seq and DNase-seq have become the standard techniques for studying protein-DNA interactions and chromatin accessibility respectively, and comprehensive quality control (QC) and analysis tools are critical to extracting the most value from these assay types. Although many analysis and QC tools have been reported, few combine ChIP-seq and DNase-seq data analysis and quality control in a unified framework with a comprehensive and unbiased reference of data quality metrics. ChiLin is a computational pipeline that automates the quality control and data analyses of ChIP-seq and DNase-seq data. It is developed using a flexible and modular software framework that can be easily extended and modified. ChiLin is ideal for batch processing of many datasets and is well suited for large collaborative projects involving ChIP-seq and DNase-seq from different designs. ChiLin generates comprehensive quality control reports that include comparisons with historical data derived from over 23,677 public ChIP-seq and DNase-seq samples (11,265 datasets) from eight literature-based classified categories. To the best of our knowledge, this atlas represents the most comprehensive ChIP-seq and DNase-seq related quality metric resource currently available. These historical metrics provide useful heuristic quality references for experiments across all commonly used assay types. Using representative datasets, we demonstrate the versatility of the pipeline by applying it to different assay types of ChIP-seq data. The pipeline software is available open source at https://github.com/cfce/chilin. ChiLin is a scalable and powerful tool to process large batches of ChIP-seq and DNase-seq datasets. The analysis output and quality metrics have been structured into user-friendly directories and reports. We have successfully compiled 23,677 profiles into a comprehensive quality atlas with fine classification for users.
The Navy’s Quality Journey: Operational Implementation of TQL
1993-04-01
training. Dr. Kaoru Ishikawa, "Guide to Quality Control": "QC begins with education and ends with education. To implement TQC, we need to carry out..." [Reference fragments: New York: McGraw-Hill, 1986. 20. Ishikawa, Kaoru. What is Total Quality Control? Englewood Cliffs, NJ: Prentice-Hall, Inc., 1985. 21. Ishikawa, Kaoru.]
Stability of Tetrahydrocannabinol and Cannabidiol in Prepared Quality Control Medible Brownies.
Wolf, Carl E; Poklis, Justin L; Poklis, Alphonse
2017-03-01
The legalization of marijuana in the USA for both medicinal and recreational use has increased in the past few years. Currently, 24 states have legalized marijuana for medicinal use. The US Drug Enforcement Administration has classified marijuana as a Schedule I substance. The US Food and Drug Administration does not regulate formulations or packages of marijuana that are currently marketed in states that have legalized marijuana. Marijuana edibles or "medibles" are typically packaged candies and baked goods consumed for medicinal as well as recreational marijuana use. They contain the major psychoactive drug in marijuana, delta-9-tetrahydrocannabinol (THC), and/or cannabidiol (CBD), which has reputed medicinal properties. Presented is a method for the preparation and application of THC- and CBD-containing brownies used as quality control (QC) material for the analysis of marijuana or cannabinoid baked medibles. The performance parameters of the assay, including possible matrix effects and cannabinoid stability in the brownie QC over time, are presented. It was determined that the process used to prepare and bake the brownie control material did not degrade the THC or CBD. The brownie matrix was found not to interfere with the analysis of THC or CBD. Ten commercially available brownie matrixes were evaluated for potential interferences; none of them were found to interfere with the analysis of THC or CBD. The laboratory-baked medible QC material was found to be stable at room temperature for at least 3 months. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Samuelson, John; Robbins, Phillips W.
2014-01-01
Asparagine-linked glycans (N-glycans) of medically important protists have much to tell us about the evolution of N-glycosylation and of N-glycan-dependent quality control (N-glycan QC) of protein folding in the endoplasmic reticulum. While host N-glycans are built upon a dolichol-pyrophosphate-linked precursor with 14 sugars (Glc3Man9GlcNAc2), protist N-glycan precursors vary from Glc3Man9GlcNAc2 (Acanthamoeba) to Man9GlcNAc2 (Trypanosoma) to Glc3Man5GlcNAc2 (Toxoplasma) to Man5GlcNAc2 (Entamoeba, Trichomonas, and Eimeria) to GlcNAc2 (Plasmodium and Giardia) to zero (Theileria). As related organisms have differing N-glycan lengths (e.g. Toxoplasma, Eimeria, Plasmodium, and Theileria), the present N-glycan variation is based upon secondary loss of Alg genes, which encode enzymes that add sugars to the N-glycan precursor. An N-glycan precursor with Man5GlcNAc2 is necessary but not sufficient for N-glycan QC, which is predicted by the presence of the UDP-glucose:glucosyltransferase (UGGT) plus calreticulin and/or calnexin. As many parasites lack glucose in their N-glycan precursor, UGGT product may be identified by inhibition of glucosidase II. The presence of an armless calnexin in Toxoplasma suggests secondary loss of N-glycan QC from coccidia. Positive selection for N-glycan sites occurs in secreted proteins of organisms with NG-QC and is based upon an increased likelihood of threonine but not serine in the second position versus asparagine. In contrast, there appears to be selection against N-glycan length in Plasmodium and N-glycan site density in Toxoplasma. Finally, there is suggestive evidence for N-glycan-dependent ERAD in Trichomonas, which glycosylates and degrades the exogenous reporter mutant carboxypeptidase Y (CPY*). PMID:25475176
Comparative performance evaluation of a new a-Si EPID that exceeds quad high-definition resolution.
McConnell, Kristen A; Alexandrian, Ara; Papanikolaou, Niko; Stathakis, Sotiri
2018-01-01
Electronic portal imaging devices (EPIDs) are an integral part of the radiation oncology workflow for treatment setup verification. Several commercial EPID implementations are currently available, each with varying capabilities. To standardize performance evaluation, Task Group Report 58 (TG-58) and TG-142 outline specific image quality metrics to be measured. A LinaTech Image Viewing System (IVS), with the highest commercially available pixel matrix (2688x2688 pixels), was independently evaluated and compared to an Elekta iViewGT (1024x1024 pixels) and a Varian aSi-1000 (1024x768 pixels) using a PTW EPID QC Phantom. The IVS, iViewGT, and aSi-1000 were each used to acquire 20 images of the PTW QC Phantom. The QC phantom was placed on the couch and aligned at isocenter. The images were exported and analyzed using the epidSoft image quality assurance (QA) software. The reported metrics were signal linearity, isotropy of signal linearity, signal-to-noise ratio (SNR), low-contrast resolution, and high-contrast resolution. These values were compared between the three EPID solutions. Computed metrics demonstrated comparable results between the EPID solutions, with the IVS outperforming the aSi-1000 and iViewGT in the low- and high-contrast resolution analysis. The performance of three commercial EPID solutions has been quantified, evaluated, and compared using results from the PTW QC Phantom. The IVS outperformed the other panels in low- and high-contrast resolution, but to fully realize the benefits of the IVS, the selection of the monitor on which to view the high-resolution images is important to prevent downsampling and loss of visual resolution.
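[Editor's note] As an illustration of one common SNR definition for a uniform detector region (the epidSoft metrics may be computed differently), a minimal numpy sketch with a synthetic flat-field frame:

```python
import numpy as np

def roi_snr(image, rows, cols):
    """SNR of a uniform region of interest: mean signal / standard deviation."""
    roi = np.asarray(image)[rows[0]:rows[1], cols[0]:cols[1]].astype(float)
    return roi.mean() / roi.std()

# Hypothetical flat-field EPID frame, 2688 x 2688 pixels
rng = np.random.default_rng(1)
frame = 1000 + 12 * rng.standard_normal((2688, 2688))
print(round(roi_snr(frame, (1300, 1400), (1300, 1400)), 1))  # ~ 1000/12
```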
NASA Astrophysics Data System (ADS)
Maity, H.; Biswas, A.; Bhattacharjee, A. K.; Pal, A.
In this paper, we propose the design of a quantum cost (QC) optimized 4-bit reversible universal shift register (RUSR) using a reduced number of reversible logic gates. The proposed design is very useful in quantum computing due to its low QC, small number of reversible logic gates, and low delay. The QC, number of gates, and garbage outputs (GOs) are 64, 8, and 16, respectively, for the proposed work. The improvement over prior work is also presented: compared with the latest reported results, the QC is improved by 5.88% to 70.9% and the number of gates by 60% to 83.33%.
The Quasicontinuum Method: Overview, applications and current directions
NASA Astrophysics Data System (ADS)
Miller, Ronald E.; Tadmor, E. B.
2002-10-01
The Quasicontinuum (QC) Method, originally conceived and developed by Tadmor, Ortiz and Phillips [1] in 1996, has since seen a great deal of development and application by a number of researchers. The idea of the method is a relatively simple one. With the goal of modeling an atomistic system without explicitly treating every atom in the problem, the QC provides a framework whereby degrees of freedom are judiciously eliminated and force/energy calculations are expedited. This is combined with adaptive model refinement to ensure that full atomistic detail is retained in regions of the problem where it is required while continuum assumptions reduce the computational demand elsewhere. This article provides a review of the method, from its original motivations and formulation to recent improvements and developments. A summary of the important mechanics of materials results that have been obtained using the QC approach is presented. Finally, several related modeling techniques from the literature are briefly discussed. As an accompaniment to this paper, a website designed to serve as a clearinghouse for information on the QC method has been established at www.qcmethod.com. The site includes information on QC research, links to researchers, downloadable QC code and documentation.
NASA Astrophysics Data System (ADS)
Dirisu, Afusat Olayinka
Quantum Cascade (QC) lasers are intersubband light sources operating in the wavelength range of ~3 to 300 μm and are used in applications such as sensing (environmental, biological, and hazardous chemical), infrared countermeasures, and free-space infrared communications. The mid-infrared range (i.e., λ ~ 3-30 μm) is of particular importance in sensing because of the strong interaction of laser radiation with various chemical species, while in free-space communications the atmospheric windows of 3-5 μm and 8-12 μm are highly desirable for low-loss transmission. Some of the requirements of these applications include: (1) high output power for improved sensitivity; (2) high operating temperatures for compact and cost-effective systems; (3) wide tunability; (4) single-mode operation for high selectivity. In the past, available mid-infrared sources, such as lead-salt and solid-state lasers, were bulky, expensive, or emitted low output power. In recent years, QC lasers have been explored as cost-effective and compact sources because of their potential to satisfy and exceed all the above requirements. Also, the ultrafast carrier lifetimes of intersubband transitions in QC lasers are promising for high-bandwidth free-space infrared communication. This thesis was focused on the improvement of QC lasers through the design and optimization of the laser cavity and characterization of the laser gain medium. The optimization of the laser cavity included (1) the design and fabrication of high-reflection Bragg gratings and subwavelength antireflection gratings, by focused ion beam milling, to achieve tunable, single-mode, and high-power QC lasers, and (2) modeling of slab-coupled optical waveguide QC lasers for high-brightness output beams. The characterization of the QC laser gain medium was carried out using the single-pass transmission experiment, a sensitive measurement technique, for probing the intersubband transitions and the electron distribution of QC lasers under different temperatures and applied bias conditions, unlike typical infrared measurement techniques that are restricted to non-functional devices. With the single-pass technique, a basic understanding of the physics behind the workings of the QC laser gain medium can be achieved, which is invaluable in the design of QC lasers with high output power and high operating temperatures.
Managing the Quality of Environmental Data in EPA Region 9
EPA Pacific Southwest, Region 9's Quality Assurance (QA) section's primary mission is to effectively oversee and carry out the Quality System and Quality Management Plan, and project-level quality assurance and quality control (QA/QC) activities.
Implementation of GPS controlled highway construction equipment phase II.
DOT National Transportation Integrated Search
2008-01-01
"During 2006, WisDOT and the Construction Materials and Support Center at UW-Madison worked together to develop : a specification and QC/QA procedures for GPS machine guidance on highway construction grading operations. These : specifications and pro...
Implementation of GPS controlled highway construction equipment, phase III.
DOT National Transportation Integrated Search
2009-02-01
Beginning in 2006, WisDOT and the Construction Material and Support Center (CMSC) at UW-Madison worked together to develop the specifications and the QA/QC procedures for GPS machine guidance on highway grading projects. These specifications and ...
QC/QA : evaluation of effectiveness in Kentucky.
DOT National Transportation Integrated Search
2008-06-30
Quality control and quality assurance in the highway industry is going through a cultural shift. There is a growing trend toward using the contractor data for acceptance and payment purpose. This has led to serious concerns about conflicts of interes...
Pribluda, Victor S; Barojas, Adrian; Añez, Arletta; López, Cecilia G; Figueroa, Ruth; Herrera, Roxana; Nakao, Gladys; Nogueira, Fernando Ha; Pianetti, Gerson A; Povoa, Marinete M; Viana, Giselle Mr; Gomes, Margarete S Mendonça; Escobar, Jose P; Sierra, Olga L Muñoz; Norena, Susana P Rendon; Veloz, Raúl; Bravo, Marcy Silva; Aldás, Martha R; Hindssemple, Alison; Collins, Marilyn; Ceron, Nicolas; Krishnalall, Karanchand; Adhin, Malti; Bretas, Gustavo; Hernandez, Nelly; Mendoza, Marjorie; Smine, Abdelkrim; Chibwe, Kennedy; Lukulay, Patrick; Evans, Lawrence
2012-06-15
Ensuring the quality of malaria medicines is crucial in working toward malaria control and eventual elimination. Unlike other validated tests that can assess all critical quality attributes, which is the standard for determining the quality of medicines, basic tests are significantly less expensive, faster, and require less skilled labour; yet, these tests provide reproducible data and information on several critical quality attributes, such as identity, purity, content, and disintegration. Visual and physical inspection also provides valuable information about the manufacturing and the labelling of medicines, and in many cases this inspection is sufficient to detect counterfeit medicines. The Promoting the Quality of Medicines (PQM) programme has provided technical assistance to Amazon Malaria Initiative (AMI) countries to implement the use of basic tests as a key screening mechanism to assess the quality of malaria medicines available to patients in decentralized regions. Trained personnel from the National Malaria Control Programmes (NMCPs), often in collaboration with each country's Official Medicine Control Laboratory (OMCL), developed country-specific protocols that encompassed sampling methods, sample analysis, and data reporting. Sampling sites were selected based on malaria burden, accessibility, and geographical location. Convenience sampling was performed and countries were recommended to store the sampled medicines under conditions that did not compromise their quality. Basic analytical tests, such as disintegration and thin layer chromatography (TLC), were performed utilizing a portable mini-laboratory. Results were originally presented at regional meetings in a non-standardized format that lacked relevant medicines information. However, since 2008 information has been submitted utilizing a template specifically developed by PQM for that purpose. From 2005 to 2010, the quality of 1,663 malaria medicines from seven AMI countries was evaluated, mostly collected from the public sector, 1,445/1,663 (86.9%). Results indicate that 193/1,663 (11.6%) were found not to meet quality specifications. Most failures were reported during visual and physical inspection, 142/1,663 (8.5%), and most of these were due to expired medicines, 118/142 (83.1%). Samples failing TLC accounted for 27/1,663 (1.6%) and those failing disintegration accounted for 24/1,663 (1.4%). Medicines quality failures decreased significantly during the last two years. Basic tests revealed that the quality of medicines in the public sector improved over the years since the implementation of this type of quality monitoring programme in 2005. However, the lack of consistent confirmatory tests in the quality control (QC) laboratory, utilizing methods that can also evaluate additional quality attributes, could still mask quality issues. In the future, AMI countries should improve coordination with their health authorities and use their QC labs consistently, to provide a more complete picture of malaria medicines quality and support the implementation of corrective actions. Facilities in the private and informal sectors also should be included when these sectors constitute an important source of medicines used by malaria patients.
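[Editor's note] A small sketch reproducing the failure-rate arithmetic reported above, computed directly from the stated counts:

```python
def rate(n_failed, n_total):
    """Percentage of samples failing a given test."""
    return 100 * n_failed / n_total

print(f"overall: {rate(193, 1663):.1f}%")        # 11.6%
print(f"inspection: {rate(142, 1663):.1f}%")     # 8.5%
print(f"TLC: {rate(27, 1663):.1f}%")             # 1.6%
print(f"disintegration: {rate(24, 1663):.1f}%")  # 1.4%
```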
NASA Astrophysics Data System (ADS)
Huang, Sheng; Ao, Xiang; Li, Yuan-yuan; Zhang, Rui
2016-09-01
In order to meet the needs of the high-speed development of optical communication systems, a construction method for quasi-cyclic low-density parity-check (QC-LDPC) codes based on the multiplicative group of a finite field is proposed. The Tanner graph of the parity-check matrix of the code constructed by this method has no cycle of length 4, which ensures that the obtained code has a good distance property. Simulation results show that at a bit error rate (BER) of 10⁻⁶, in the same simulation environment, the net coding gain (NCG) of the proposed QC-LDPC(3780, 3540) code with a code rate of 93.7% is improved by 2.18 dB and 1.6 dB, respectively, compared with those of the RS(255, 239) code in ITU-T G.975 and the LDPC(32640, 30592) code in ITU-T G.975.1. In addition, the NCG of the proposed QC-LDPC(3780, 3540) code is respectively 0.2 dB and 0.4 dB higher than those of the SG-QC-LDPC(3780, 3540) code based on two different subgroups of a finite field and the AS-QC-LDPC(3780, 3540) code based on two arbitrary sets of a finite field. Thus, the proposed QC-LDPC(3780, 3540) code can be well applied in optical communication systems.
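[Editor's note] For background, QC-LDPC parity-check matrices are assembled by expanding an exponent (base) matrix into circulant permutation matrices; the sketch below shows the generic expansion step with a hypothetical base matrix and lifting size, not the paper's specific finite-field construction.

```python
import numpy as np

def circulant_permutation(z, shift):
    """z x z identity matrix cyclically shifted by `shift` columns."""
    return np.roll(np.eye(z, dtype=int), shift, axis=1)

def expand_qc_ldpc(base, z):
    """Expand an exponent base matrix into a binary parity-check matrix H.
    Entry -1 denotes an all-zero block; entry k >= 0 denotes I shifted by k."""
    blocks = [[np.zeros((z, z), dtype=int) if k < 0
               else circulant_permutation(z, k)
               for k in row] for row in base]
    return np.block(blocks)

# Hypothetical 2 x 4 exponent matrix with lifting size z = 5
base = [[0, 1, 2, -1],
        [3, -1, 0, 4]]
H = expand_qc_ldpc(base, 5)
print(H.shape)  # (10, 20)
```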
Contactless electroreflectance study of strained Zn0.79Cd0.21Se/ZnSe double quantum wells
NASA Astrophysics Data System (ADS)
Tu, R. C.; Su, Y. K.; Lin, D. Y.; Li, C. F.; Huang, Y. S.; Lan, W. H.; Tu, S. L.; Chang, S. J.; Chou, S. C.; Chou, W. C.
1998-01-01
We have studied various excitonic transitions of strained Zn0.79Cd0.21Se/ZnSe double quantum wells, grown by molecular beam epitaxy on (100) GaAs substrates, using contactless electroreflectance (CER) at 15 and 300 K. A number of intersubband transitions in the CER spectra from the sample have been observed. An analysis of the CER spectra has led to the identification of various excitonic transitions, mnH(L), between the mth conduction-band state and the nth heavy (light)-hole band state. The conduction-band offset Qc is used as an adjustable parameter to study the band offset in the strained Zn0.79Cd0.21Se/ZnSe system. The value of Qc is determined to be 0.67±0.03.
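[Editor's note] For reference, a conduction-band offset ratio such as the Qc fitted here is conventionally defined from the conduction- and valence-band discontinuities (this definition is assumed, not quoted from the paper):

```latex
Q_c = \frac{\Delta E_c}{\Delta E_c + \Delta E_v}, \qquad Q_c = 0.67 \pm 0.03
```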
40 CFR 98.244 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... procedures specified in § 98.34(c). (b) If you use the mass balance methodology in § 98.243(c), use the... of Carbon, Hydrogen, and Nitrogen in Petroleum Products and Lubricants (incorporated by reference..., Hydrogen, and Nitrogen in Laboratory Samples of Coal (incorporated by reference, see § 98.7). (viii) Method...
40 CFR 98.244 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... procedures specified in § 98.34(c). (b) If you use the mass balance methodology in § 98.243(c), use the... of Carbon, Hydrogen, and Nitrogen in Petroleum Products and Lubricants (incorporated by reference..., Hydrogen, and Nitrogen in Laboratory Samples of Coal (incorporated by reference, see § 98.7). (viii) Method...
40 CFR 98.244 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... procedures specified in § 98.34(c). (b) If you use the mass balance methodology in § 98.243(c), use the... of Carbon, Hydrogen, and Nitrogen in Petroleum Products and Lubricants (incorporated by reference..., Hydrogen, and Nitrogen in Laboratory Samples of Coal (incorporated by reference, see § 98.7). (viii) Method...
The purpose of this SOP is to describe the procedures undertaken to calculate the time activity pattern of the NHEXAS samples. This SOP uses data that have been properly coded and certified with appropriate QA/QC procedures by the University of Arizona NHEXAS and Battelle Laborat...
Dietel, Manfred; Bubendorf, Lukas; Dingemans, Anne-Marie C; Dooms, Christophe; Elmberger, Göran; García, Rosa Calero; Kerr, Keith M; Lim, Eric; López-Ríos, Fernando; Thunnissen, Erik; Van Schil, Paul E; von Laffert, Maximilian
2016-01-01
Background There is currently no Europe-wide consensus on the appropriate preanalytical measures and workflow to optimise procedures for tissue-based molecular testing of non-small-cell lung cancer (NSCLC). To address this, a group of lung cancer experts (see list of authors) convened to discuss and propose standard operating procedures (SOPs) for NSCLC. Methods Based on earlier meetings and scientific expertise on lung cancer, a multidisciplinary group meeting was aligned. The aim was to include all relevant aspects concerning NSCLC diagnosis. After careful consideration, the following topics were selected and each was reviewed by the experts: surgical resection and sampling; biopsy procedures for analysis; preanalytical and other variables affecting quality of tissue; tissue conservation; testing procedures for epidermal growth factor receptor, anaplastic lymphoma kinase and ROS proto-oncogene 1, receptor tyrosine kinase (ROS1) in lung tissue and cytological specimens; as well as standardised reporting and quality control (QC). Finally, an optimal workflow was described. Results Suggested optimal procedures and workflows are discussed in detail. The broad consensus was that the complex workflow presented can only be executed effectively by an interdisciplinary approach using a well-trained team. Conclusions To optimise diagnosis and treatment of patients with NSCLC, it is essential to establish SOPs that are adaptable to the local situation. In addition, a continuous QC system and a local multidisciplinary tumour-type-oriented board are essential. PMID:26530085
Mendoza-Parra, Marco-Antonio; Saravaki, Vincent; Cholley, Pierre-Etienne; Blum, Matthias; Billoré, Benjamin; Gronemeyer, Hinrich
2016-01-01
We have established a certification system for antibodies to be used in chromatin immunoprecipitation assays coupled to massive parallel sequencing (ChIP-seq). This certification comprises a standardized ChIP procedure and the attribution of a numerical quality control indicator (QCi) to biological replicate experiments. The QCi computation is based on a universally applicable quality assessment that quantitates the global deviation of randomly sampled subsets of a ChIP-seq dataset from the original genome-aligned sequence reads. Comparison with a QCi database of >28,000 ChIP-seq assays was used to attribute quality grades (ranging from 'AAA' to 'DDD') to a given dataset. In the present report we used the numerical QC system to assess the factors influencing the quality of ChIP-seq assays, including the nature of the target, the sequencing depth and the commercial source of the antibody. We have used this approach specifically to certify monoclonal and polyclonal antibodies obtained from Active Motif directed against the histone modification marks H3K4me3, H3K27ac and H3K9ac for ChIP-seq. The antibodies received grades from AAA to BBC (www.ngs-qc.org). We propose to attribute such quantitative grading to all antibodies marketed with the label "ChIP-seq grade".
Kinoshita, Kohnosuke; Jingu, Shigeji; Yamaguchi, Jun-ichi
2013-01-15
A bioanalytical method for determining endogenous D-serine levels in the mouse brain using a surrogate analyte and liquid chromatography-tandem mass spectrometry (LC-MS/MS) was developed. [2,3,3-²H]D-serine and [¹⁵N]D-serine were used as a surrogate analyte and an internal standard, respectively. The surrogate analyte was spiked into brain homogenate to yield calibration standards and quality control (QC) samples. Both endogenous and surrogate analytes were extracted using protein precipitation followed by solid phase extraction. Enantiomeric separation was achieved on a chiral crown ether column with an analysis time of only 6 min without any derivatization. The column eluent was introduced into an electrospray interface of a triple-quadrupole mass spectrometer. The calibration range was 1.00 to 300 nmol/g, and the method showed acceptable accuracy and precision at all QC concentration levels from a validation point of view. In addition, the brain D-serine levels of normal mice determined using this method were the same as those obtained by a standard addition method, which is time-consuming but is often used for the accurate measurement of endogenous substances. Thus, this surrogate analyte method should be applicable to the measurement of D-serine levels as a potential biomarker for monitoring certain effects of drug candidates on the central nervous system. Copyright © 2012 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Kurokawa, Ami; Doshida, Tomoki; Hagihara, Yukito; Suzuki, Hiroshi; Takai, Kenichi
2018-05-01
Although intergranular (IG) and quasi-cleavage (QC) fractures have been widely recognized as typical fracture modes of hydrogen-induced cracking in high-strength steels, the main causative factor remains unclear. In the present study, the dependence of this main factor on hydrogen content was examined through the fracture mode transition from QC to IG at the crack initiation site in tempered martensitic steels. Two tempered martensitic steels were prepared to vary the grain-boundary cohesive force through different precipitation states of Fe3C on the prior austenite (γ) grain boundaries. The high-Si (H-Si) steel has a small amount of Fe3C on the prior austenite grain boundaries, whereas the low-Si (L-Si) steel has a large amount of Fe3C sheets on the grain boundaries. The fracture modes and initiation sites were observed using FE-SEM (Field Emission Scanning Electron Microscopy). The crack initiation sites of the H-Si steel showed QC fracture at the notch tip under various hydrogen contents, while the crack initiation of the L-Si steel changed from QC fracture at the notch tip to QC and IG fractures originating approximately 10 µm ahead of the notch tip with increasing hydrogen content. For the L-Si steel, two possibilities are considered: either the QC or IG fracture occurred first, or the QC and IG fractures occurred simultaneously. Furthermore, the principal stress and equivalent plastic strain distributions near the notch tip were calculated with FEM (Finite Element Method) analysis. The plastic strain was maximal at the notch tip and the principal stress was maximal at approximately 10 µm from the notch tip. The positions of QC and IG fracture initiation observed using FE-SEM correspond to the positions of maximum strain and stress obtained with FEM, respectively. These findings indicate that the main factors causing hydrogen-induced cracking differ between QC and IG fractures.
Field correlation of PQI gauge with nuclear density gauge: phase 1.
DOT National Transportation Integrated Search
2006-12-01
Traditionally, the Oklahoma Department of Transportation (ODOT) uses a nuclear density gauge as a quality control (QC) and quality assurance (QA) tool for in-place density. The nuclear-based devices, however, tend to have problems associated with lic...
THE MAQC PROJECT: ESTABLISHING QC METRICS AND THRESHOLDS FOR MICROARRAY QUALITY CONTROL
Microarrays represent a core technology in pharmacogenomics and toxicogenomics; however, before this technology can successfully and reliably be applied in clinical practice and regulatory decision-making, standards and quality measures need to be developed. The Microarray Qualit...
222-S Laboratory Quality Assurance Plan. Revision 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meznarich, H.K.
1995-07-31
This Quality Assurance Plan provides quality assurance (QA) guidance, regulatory QA requirements (e.g., 10 CFR 830.120), and quality control (QC) specifications for analytical services. This document follows the U.S. Department of Energy (DOE)-issued Hanford Analytical Services Quality Assurance Plan (HASQAP). In addition, this document meets the objectives of the Quality Assurance Program provided in WHC-CM-4-2, Section 2.1. Quality assurance elements required in the Guidelines and Specifications for Preparing Quality Assurance Program Plans (QAMS-004) and Interim Guidelines and Specifications for Preparing Quality Assurance Project Plans (QAMS-005) from the US Environmental Protection Agency (EPA) are covered throughout this document. A quality assurance index is provided in Appendix A. This document also provides and/or identifies the procedural information that governs laboratory operations. The personnel of the 222-S Laboratory and the Standards Laboratory, including managers, analysts, QA/QC staff, auditors, and support staff, shall use this document as guidance and instructions for their operational and quality assurance activities. Other organizations that conduct activities described in this document for the 222-S Laboratory shall follow this QA/QC document.
1984-09-01
[Extraction residue from a 1984 report's contents and results tables; recoverable information: t-tests were run between full-term QC and control groups on pretest and posttest measures, the two groups differed at the pretest in self-rated job performance and job involvement, and one significant result emerged at the posttest; designs discussed include static group, pretest/posttest, and nonequivalent control group designs.]
A single fracture toughness parameter for fibrous composite laminates
NASA Technical Reports Server (NTRS)
Poe, C. C., Jr.
1981-01-01
A general fracture toughness parameter Qc was previously derived and verified to be a material constant, independent of layup, for centrally cracked boron/aluminum composite specimens. The specimens were made with various proportions of 0° and ±45° plies. A limited amount of data indicated that the ratio Qc/εtuf, where εtuf is the ultimate tensile strain of the fibers, might be a constant for all composite laminates, regardless of material and layup. In that case, a single value of Qc/εtuf could be used to predict the fracture toughness of all fibrous composite laminates from only the elastic constants and εtuf. Values of Qc/εtuf were calculated for centrally cracked specimens made from graphite/polyimide, graphite/epoxy, E-glass/epoxy, boron/epoxy, and S-glass/graphite/epoxy materials with numerous layups. Within ordinary scatter, the data indicate that Qc/εtuf is a constant for all laminates that did not split extensively at the crack tips or exhibit other deviant failure modes.
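If Qc/εtuf really were a single universal constant, predicting toughness would reduce to one multiplication, as the sketch below illustrates. The constant and the fiber failure strains are hypothetical placeholders, not values from the report.

```python
# Hypothetical sketch: if Qc/eps_tuf is a material-independent constant k,
# the toughness parameter of any laminate follows from its fiber failure
# strain alone. K_UNIVERSAL and the strains below are invented numbers.
K_UNIVERSAL = 2.0  # assumed value of Qc/eps_tuf, for illustration only

def predict_qc(eps_tuf):
    """Predicted general fracture toughness parameter Qc."""
    return K_UNIVERSAL * eps_tuf

for name, eps in [("graphite/epoxy", 0.010), ("E-glass/epoxy", 0.025)]:
    print(f"{name}: predicted Qc = {predict_qc(eps):.4f}")
```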
Bobaly, Balazs; D'Atri, Valentina; Goyon, Alexandre; Colas, Olivier; Beck, Alain; Fekete, Szabolcs; Guillarme, Davy
2017-08-15
The analytical characterization of therapeutic monoclonal antibodies and related proteins usually incorporates various sample preparation methodologies. Indeed, quantitative and qualitative information can be enhanced by simplifying the sample, thanks to the removal of sources of heterogeneity (e.g. N-glycans) and/or by decreasing the molecular size of the tested protein through enzymatic or chemical fragmentation. These approaches make the sample more suitable for chromatographic and mass spectrometric analysis. Structural elucidation and quality control (QC) analysis of biopharmaceuticals are usually performed at the intact, subunit and peptide levels. In this paper, general sample preparation approaches used to attain peptide-, subunit- and glycan-level analysis are reviewed. Protocols are described for tryptic proteolysis, IdeS and papain digestion, reduction, and deglycosylation by PNGase F and EndoS2 enzymes. Both historical and modern sample preparation methods were compared and evaluated using rituximab and trastuzumab, two reference therapeutic mAb products approved by the Food and Drug Administration (FDA) and the European Medicines Agency (EMA). The described protocols may help analysts develop sample preparation methods in the field of therapeutic protein analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
Cynis, Holger; Hoffmann, Torsten; Friedrich, Daniel; Kehlen, Astrid; Gans, Kathrin; Kleinschmidt, Martin; Rahfeld, Jens-Ulrich; Wolf, Raik; Wermann, Michael; Stephan, Anett; Haegele, Monique; Sedlmeier, Reinhard; Graubner, Sigrid; Jagla, Wolfgang; Müller, Anke; Eichentopf, Rico; Heiser, Ulrich; Seifert, Franziska; Quax, Paul H A; de Vries, Margreet R; Hesse, Isabel; Trautwein, Daniela; Wollert, Ulrich; Berg, Sabine; Freyse, Ernst-Joachim; Schilling, Stephan; Demuth, Hans-Ulrich
2011-01-01
Acute and chronic inflammatory disorders are characterized by detrimental cytokine and chemokine expression. Frequently, the chemotactic activity of cytokines depends on a modified N-terminus of the polypeptide. Among those, the N-terminus of monocyte chemoattractant protein 1 (CCL2, also known as MCP-1) is modified to a pyroglutamate (pE-) residue, protecting against degradation in vivo. Here, we show that N-terminal pE-formation depends on glutaminyl cyclase activity. The pE-residue increases stability against N-terminal degradation by aminopeptidases and improves receptor activation and signal transduction in vitro. Genetic ablation of the glutaminyl cyclase isoenzymes QC (QPCT) or isoQC (QPCTL) revealed a major role of isoQC in pE1-CCL2 formation and monocyte infiltration. Consistently, administration of QC inhibitors in inflammatory models, such as thioglycollate-induced peritonitis, reduced monocyte infiltration. The pharmacologic efficacy of QC/isoQC inhibition was assessed in accelerated atherosclerosis in ApoE3*Leiden mice, showing attenuated atherosclerotic pathology following chronic oral treatment. Current strategies targeting CCL2 are mainly based on antibodies or Spiegelmers. The application of small, orally available inhibitors of glutaminyl cyclases represents an alternative therapeutic strategy to treat CCL2-driven disorders such as atherosclerosis/restenosis and fibrosis. PMID:21774078
A novel construction method of QC-LDPC codes based on CRT for optical communications
NASA Astrophysics Data System (ADS)
Yuan, Jian-guo; Liang, Meng-qi; Wang, Yong; Lin, Jin-zhao; Pang, Yu
2016-05-01
A novel construction method for quasi-cyclic low-density parity-check (QC-LDPC) codes is proposed based on the Chinese remainder theorem (CRT). The method can not only increase the code length without reducing the girth, but also greatly enhance the code rate, so it is easy to construct a high-rate code. The simulation results show that, at a bit error rate (BER) of 10⁻⁷, the net coding gain (NCG) of the regular QC-LDPC(4851, 4546) code is 2.06 dB, 1.36 dB, 0.53 dB and 0.31 dB higher, respectively, than those of the classic RS(255, 239) code in ITU-T G.975, the LDPC(32640, 30592) code in ITU-T G.975.1, the QC-LDPC(3664, 3436) code constructed by the improved combining construction method based on CRT, and the irregular QC-LDPC(3843, 3603) code constructed by the method based on the Galois field (GF(q)) multiplicative group. All five codes have the same code rate of 0.937. Therefore, the regular QC-LDPC(4851, 4546) code constructed by the proposed method has excellent error-correction performance and is well suited to optical transmission systems.
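A minimal sketch of the CRT idea underlying such constructions, under the usual assumption that a QC-LDPC code is described by an exponent (cyclic-shift) matrix of circulant permutation blocks: shifts taken modulo coprime circulant sizes m1 and m2 are combined elementwise by the Chinese remainder theorem into shifts modulo m1·m2, which lengthens the code while preserving girth under standard conditions. The matrices below are small hypothetical examples, not the codes from the paper.

```python
# Hedged sketch of CRT combining of QC-LDPC exponent (shift) matrices.
from math import gcd

def crt_pair(r1, m1, r2, m2):
    """Solve x = r1 (mod m1), x = r2 (mod m2) for coprime m1, m2."""
    inv = pow(m1, -1, m2)  # modular inverse (Python 3.8+)
    return (r1 + m1 * ((r2 - r1) * inv % m2)) % (m1 * m2)

def combine_exponent_matrices(E1, m1, E2, m2):
    """Elementwise CRT of two shift matrices of equal shape."""
    assert gcd(m1, m2) == 1, "circulant sizes must be coprime"
    return [[crt_pair(a, m1, b, m2) for a, b in zip(r1, r2)]
            for r1, r2 in zip(E1, E2)]

E1 = [[0, 1, 2], [0, 2, 4]]  # shifts mod m1 = 5 (hypothetical)
E2 = [[0, 3, 1], [0, 5, 2]]  # shifts mod m2 = 7 (hypothetical)
print(combine_exponent_matrices(E1, 5, E2, 7))  # shifts mod 35
```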
Stable Isotopes, Quantum Computing and Consciousness
NASA Astrophysics Data System (ADS)
Berezin, Alexander A.
2000-10-01
Recent proposals of quantum computing/computers (QC) based on nuclear spins suggest that consciousness (CON) activity may be related to (assisted by) a subset of 13C atoms incorporated randomly, or quasi-randomly, in neural structures. Consider two DNA chains. Even if they are completely identical chemically (same sequence of codons), the patterns of 12C and 13C isotopes in them are different (a possible origin of personal individuality). Perhaps it is the subsystem of nuclear spins of the 13C "sublattice" which forms a dynamical system capable of QC and on which CON is "spanned". Some issues related to this hypothesis are: (1) the existence of CON-driven positional correlations among 13C atoms, (2) motion (hopping) of 13C via enhanced neutron tunneling, cf. the quantum "anti-Zeno effect", (3) possible optimization of the concentration of QC-active 13C atoms above their standard isotopic abundance, (4) characteristic time scales for the operation of 13C-based QC (perhaps a broad range of scales), (5) reflection of the QC dynamics of 13C on CON, (6) the possibility that 13C-based QC operates "above" the level of "regular" CON (perhaps Jungian sub/super-CON), (7) isotopicity as a connector to a universal Library of Patterns ("Platonic World"), (8) self-stabilization of coherence in the 13C (sub)system. Some of these questions are, in principle, experimentally addressable through shifting of isotopic abundances.
Riddick, L; Simbanin, C
2001-01-01
EPA is conducting a National Study of Chemical Residues in Lake Fish Tissue. The study involves five analytical laboratories, multiple sampling teams from each of the 47 participating states, several tribes, all 10 EPA Regions and several EPA program offices, with input from other federal agencies. To fulfill study objectives, state and tribal sampling teams are voluntarily collecting predator and bottom-dwelling fish from approximately 500 randomly selected lakes over a 4-year period. The fish will be analyzed for more than 300 pollutants. The long-term nature of the study, combined with the large number of participants, created several QA challenges: (1) controlling variability among sampling activities performed by different sampling teams from more than 50 organizations over a 4-year period; (2) controlling variability in lab processes over a 4-year period; (3) generating results that will meet the primary study objectives for use by OW statisticians; (4) generating results that will meet the undefined needs of more than 50 participating organizations; and (5) devising a system for evaluating and defining data quality and for reporting data quality assessments concurrently with the data to ensure that assessment efforts are streamlined and that assessments are consistent among organizations. This paper describes the QA program employed for the study and presents an interim assessment of the program's effectiveness.
Kirwan, J A; Broadhurst, D I; Davidson, R L; Viant, M R
2013-06-01
Direct infusion mass spectrometry (DIMS)-based untargeted metabolomics measures many hundreds of metabolites in a single experiment. While every effort is made to reduce within-experiment analytical variation in untargeted metabolomics, unavoidable sources of measurement error are introduced. This is particularly true for large-scale multi-batch experiments, necessitating the development of robust workflows that minimise batch-to-batch variation. Here, we conducted a purpose-designed, eight-batch DIMS metabolomics study using nanoelectrospray (nESI) Fourier transform ion cyclotron resonance mass spectrometric analyses of mammalian heart extracts. First, we characterised the intrinsic analytical variation of this approach to determine whether our existing workflows are fit for purpose when applied to a multi-batch investigation. Batch-to-batch variation was readily observed across the 7-day experiment, both in terms of its absolute measurement using quality control (QC) and biological replicate samples, as well as its adverse impact on our ability to discover significant metabolic information within the data. Subsequently, we developed and implemented a computational workflow that includes total-ion-current filtering, QC-robust spline batch correction and spectral cleaning, and provide conclusive evidence that this workflow reduces analytical variation and increases the proportion of significant peaks. We report an overall analytical precision of 15.9%, measured as the median relative standard deviation (RSD) for the technical replicates of the biological samples, across eight batches and 7 days of measurements. When compared against the FDA guidelines for biomarker studies, which specify an RSD of <20% as an acceptable level of precision, we conclude that our new workflows are fit for purpose for large-scale, high-throughput nESI DIMS metabolomics studies.
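A hedged sketch of two computations named in that workflow, using synthetic data: a QC-based spline correction of intensity drift along the injection order, and the median RSD across technical-replicate groups. This is not the authors' pipeline; the smoothing parameter, QC spacing and replicate layout are invented.

```python
# Minimal sketch of QC-based drift correction and median-RSD QC metric.
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(0)
order = np.arange(60)                      # injection order across batches
drift = 1 + 0.004 * order                  # hypothetical instrument drift
intensity = 1000 * drift * rng.normal(1, 0.05, 60)
is_qc = order % 5 == 0                     # every 5th injection is a QC

# Fit a smoothing spline through the QC injections and divide out the drift
spline = UnivariateSpline(order[is_qc], intensity[is_qc],
                          s=is_qc.sum())   # smoothing factor: assumed
corrected = intensity / (spline(order) / np.median(intensity[is_qc]))

def median_rsd(groups):
    """Median %RSD across replicate groups (one group per sample)."""
    rsds = [100 * np.std(g, ddof=1) / np.mean(g) for g in groups]
    return np.median(rsds)

reps = corrected[~is_qc].reshape(-1, 4)    # pretend: 4 replicates/sample
print(f"median RSD: {median_rsd(reps):.1f}%")
```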
Unique and Conserved Features of the Barley Root Meristem
Kirschner, Gwendolyn K.; Stahl, Yvonne; Von Korff, Maria; Simon, Rüdiger
2017-01-01
Plant root growth is enabled by root meristems that harbor the stem cell niches as a source of progenitors for the different root tissues. Understanding the root development of diverse plant species is important for controlling root growth and improving the performance of crop plants. In this study, we analyzed the root meristem of the fourth most abundant crop plant, barley (Hordeum vulgare). Cell division studies revealed that the barley stem cell niche comprises a Quiescent Center (QC) of around 30 cells with low mitotic activity. The surrounding stem cells contribute to root growth through the production of new cells that are displaced from the meristem, elongate and differentiate into specialized root tissues. The distal stem cells produce the root cap and lateral root cap cells, while cells lateral to the QC generate the epidermis, as is typical for monocots. Endodermis and inner cortex are derived from one common initial lateral to the QC, while the outer cortex cell layers are derived from a distinct stem cell. In rice and Arabidopsis, meristem homeostasis is achieved through feedback signaling from differentiated cells involving peptides of the CLE family. Application of a synthetic peptide of the barley CLE40 ortholog promotes meristem cell differentiation, as in rice and Arabidopsis. However, in contrast to Arabidopsis, the columella stem cells do not respond to the CLE40 peptide, indicating that distinct mechanisms control columella cell fate in monocot and dicot plants. PMID:28785269
Visualization and Quality Control Web Tools for CERES Products
NASA Astrophysics Data System (ADS)
Mitrescu, C.; Doelling, D. R.; Rutan, D. A.
2016-12-01
The CERES project continues to provide the scientific community a wide variety of satellite-derived data products, such as observed top-of-atmosphere (TOA) broadband shortwave and longwave fluxes, computed TOA and surface fluxes, as well as cloud, aerosol, and other atmospheric parameters. They encompass a wide range of temporal and spatial resolutions, suited to specific applications. Now in its 16th year, the CERES project mostly serves climate modeling communities that focus on global mean energetics, meridional heat transport, and climate trend studies. In order to serve all our users, we developed a web-based Ordering and Visualization Tool (OVT). Using open source software such as Eclipse, Java, JavaScript, OpenLayers, Flot, Google Maps, Python, and others, the OVT team developed a series of specialized functions for CERES data quality control (QC), among them 1- and 2-D histograms, anomaly and deseasonalization analysis, temporal and spatial averaging, and side-by-side parameter comparison, which made the QC process far easier and faster and, more importantly, far more portable. We are now integrating ground-site observed surface fluxes to further help the CERES project QC the CERES computed surface fluxes. These features will give users the opportunity to perform their own comparisons of the CERES computed surface fluxes and observed ground-site fluxes. An overview of the basic OVT functions and future steps in expanding its capabilities will be presented at the meeting.
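A minimal sketch of the deseasonalization step such a tool might apply to a monthly flux series: subtract the mean annual cycle (the monthly climatology) to expose anomalies and trends. All data below are synthetic, not CERES products.

```python
# Deseasonalize a monthly time series by removing its monthly climatology.
import numpy as np

rng = np.random.default_rng(1)
months = np.arange(192)                      # 16 years of monthly means
seasonal = 10 * np.sin(2 * np.pi * months / 12)
flux = 240 + seasonal + 0.01 * months + rng.normal(0, 1, 192)

# Mean of each calendar month across all years
climatology = np.array([flux[m::12].mean() for m in range(12)])
anomaly = flux - climatology[months % 12]    # deseasonalized series

print(f"std before: {flux.std():.2f}, after: {anomaly.std():.2f}")
```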
Genome measures used for quality control are dependent on gene function and ancestry.
Wang, Jing; Raskin, Leon; Samuels, David C; Shyr, Yu; Guo, Yan
2015-02-01
The transition/transversion (Ti/Tv) ratio and heterozygous/nonreference-homozygous (het/nonref-hom) ratio have been commonly computed in genetic studies as quality control (QC) measurements. Additionally, these two ratios are helpful in our understanding of the patterns of DNA sequence evolution. To thoroughly understand these two genomic measures, we performed a study using genotype data released by the 1000 Genomes Project (1000G) (N=1092). An additional two datasets (N=581 and N=6) were used to validate our findings from the 1000G dataset. We compared the two ratios across continental ancestry, genome region and gene functionality. We found that the Ti/Tv ratio can be used as a quality indicator for single nucleotide polymorphisms inferred from high-throughput sequencing data. The Ti/Tv ratio varies greatly by genome region and functionality, but not by ancestry. The het/nonref-hom ratio varies greatly by ancestry, but not by genome region and functionality. Furthermore, extreme guanine + cytosine content (either high or low) is negatively associated with the Ti/Tv ratio magnitude. Thus, when performing QC assessment using these two measures, care must be taken to apply the correct thresholds based on ancestry and genome region. Failure to take these considerations into account at the QC stage will bias any subsequent analysis. yan.guo@vanderbilt.edu Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
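The two ratios themselves are simple to compute from a variant list. The sketch below assumes a hypothetical input of (ref, alt, genotype) tuples with genotypes coded 1 for heterozygous and 2 for non-reference homozygous; it is illustrative, not the authors' code.

```python
# Compute the Ti/Tv and het/nonref-hom QC ratios from a variant list.
TRANSITIONS = {("A", "G"), ("G", "A"), ("C", "T"), ("T", "C")}

def qc_ratios(variants):
    ti = tv = het = nonref_hom = 0
    for ref, alt, gt in variants:
        if (ref, alt) in TRANSITIONS:
            ti += 1          # purine<->purine or pyrimidine<->pyrimidine
        else:
            tv += 1          # purine<->pyrimidine
        if gt == 1:
            het += 1
        elif gt == 2:
            nonref_hom += 1
    return ti / tv, het / nonref_hom

sample = [("A", "G", 1), ("C", "T", 2), ("G", "A", 1),
          ("A", "C", 1), ("T", "C", 2), ("G", "T", 2)]
titv, het_ratio = qc_ratios(sample)
print(f"Ti/Tv = {titv:.2f}, het/nonref-hom = {het_ratio:.2f}")
```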
CUSTOMER/SUPPLIER ACCOUNTABILITY AND PROGRAM IMPLEMENTATION
Quality assurance (QA) and quality control (QC) are the basic components of a QA program, which is a fundamental quality management tool. The quality of outputs and services strongly depends on the caliber of the communications between the "customer" and the "supplier." Clear under...
Investigation of the Asphalt Pavement Analyzer (APA) testing program in Nebraska.
DOT National Transportation Integrated Search
2008-03-01
The asphalt pavement analyzer (APA) has been widely used to evaluate hot-mix asphalt (HMA) rutting potential in mix : design and quality control-quality assurance (QC-QA) applications, because the APA testing and its data analyses are : relatively si...
Implementation of GPS Machine Controlled Grading - Phase III (2008) and Technical Training
DOT National Transportation Integrated Search
2009-02-01
Beginning in 2006, WisDOT and the Construction Material and Support Center (CMSC) at UW-Madison worked together to develop the specifications and the QA/QC procedures for GPS machine guidance on highway grading projects. These specifications and proc...
Jordan, A; Chen, D; Yi, Q-L; Kanias, T; Gladwin, M T; Acker, J P
2016-07-01
Quality control (QC) data collected by blood services are used to monitor production and to ensure compliance with regulatory standards. We demonstrate how analysis of quality control data can be used to highlight the sources of variability within red cell concentrates (RCCs). We merged Canadian Blood Services QC data with manufacturing and donor records for 28 227 RCC between June 2011 and October 2014. Units were categorized based on processing method, bag manufacturer, donor age and donor sex, then assessed based on product characteristics: haemolysis and haemoglobin levels, unit volume, leucocyte count and haematocrit. Buffy-coat method (top/bottom)-processed units exhibited lower haemolysis than units processed using the whole-blood filtration method (top/top). Units from female donors exhibited lower haemolysis than male donations. Processing method influenced unit volume and the ratio of additive solution to residual plasma. Stored red blood cell characteristics are influenced by prestorage processing and donor factors. Understanding the relationship between processing, donors and RCC quality will help blood services to ensure the safety of transfused products. © 2016 International Society of Blood Transfusion.
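The kind of stratified summary described here is a straightforward group-by over merged QC records. The pandas sketch below uses an invented schema; column names and values are hypothetical, not the Canadian Blood Services data model.

```python
# Sketch of a stratified QC summary for red cell concentrates (RCCs).
import pandas as pd

qc = pd.DataFrame({
    "method":        ["buffy_coat", "buffy_coat", "whole_blood", "whole_blood"],
    "donor_sex":     ["F", "M", "F", "M"],
    "hemolysis_pct": [0.18, 0.24, 0.27, 0.35],   # % at expiry, hypothetical
    "hb_g_per_unit": [52.1, 55.4, 51.0, 54.2],   # hypothetical
})

# Mean haemolysis stratified by processing method and donor sex
summary = qc.groupby(["method", "donor_sex"])["hemolysis_pct"].mean()
print(summary)
```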
Automated locomotor activity monitoring as a quality control assay for mass-reared tephritid flies.
Dominiak, Bernard C; Fanson, Benjamin G; Collins, Samuel R; Taylor, Phillip W
2014-02-01
The Sterile Insect Technique (SIT) requires vast numbers of consistently high-quality insects to be produced over long periods. Quality control (QC) procedures are critical to effective SIT, both providing quality assurance and warning of operational deficiencies. Here we present a potential new QC assay for the mass rearing of Queensland fruit flies (Bactrocera tryoni Froggatt) for SIT: locomotor activity monitoring. We investigated whether automated locomotor activity monitors (LAMs), which simply detect how often a fly passes an infrared sensor in a glass tube, might provide insights comparable to established QC assays but with much greater economy. Activity levels were generally lower for females than for males, and declined over five days in the monitor for both sexes. Female activity levels were not affected by irradiation, but males irradiated at 60 or 70 Gy had reduced activity levels compared with unirradiated controls. We also found some evidence that mild heat shock of pupae results in adults with reduced activity. LAM offers a convenient, effective and economical assay to probe such changes. © 2013 Society of Chemical Industry.
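Reducing raw LAM beam-crossing counts to a QC statistic could look like the sketch below: mean activity per treatment group relative to controls, flagged against a threshold. The data layout, counts and 80% threshold are all invented for illustration.

```python
# Hypothetical reduction of LAM beam-crossing counts to a QC flag.
import numpy as np

# rows = flies, columns = days in the monitor (crossings per day)
counts = {
    "control": np.array([[420, 395, 360, 322, 300],
                         [450, 410, 372, 340, 310]]),
    "60_Gy":   np.array([[350, 330, 300, 270, 250],
                         [360, 335, 305, 280, 255]]),
}

THRESHOLD = 0.80  # flag groups below 80% of control activity (assumed)
control_mean = counts["control"].mean()
for group, data in counts.items():
    rel = data.mean() / control_mean
    flag = "FLAG" if rel < THRESHOLD else "ok"
    print(f"{group}: relative activity {rel:.2f} [{flag}]")
```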
Microbial Groundwater Sampling Protocol for Fecal-Rich Environments
Harter, Thomas; Watanabe, Naoko; Li, Xunde; Atwill, Edward R; Samuels, William
2014-01-01
Inherently, confined animal farming operations (CAFOs) and other intense fecal-rich environments are potential sources of groundwater contamination by enteric pathogens. The ubiquity of microbial matter poses unique technical challenges in addition to economic constraints when sampling wells in such environments. In this paper, we evaluate a groundwater sampling protocol that relies on extended purging with a portable submersible stainless steel pump and Teflon® tubing as an alternative to equipment sterilization. The protocol allows for collecting a large number of samples quickly, relatively inexpensively, and under field conditions with limited access to capacity for sterilizing equipment. The protocol is tested on CAFO monitoring wells and considers three cross-contamination sources: equipment, wellbore, and ambient air. For the assessment, we use Enterococcus, a ubiquitous fecal indicator bacterium (FIB), in laboratory and field tests with spiked and blank samples, and in an extensive, multi-year field sampling campaign on 17 wells within 2 CAFOs. The assessment shows that extended purging can successfully control for equipment cross-contamination, but also controls for significant contamination of the well-head, within the well casing and within the immediate aquifer vicinity of the well-screen. Importantly, our tests further indicate that Enterococcus is frequently entrained in water samples when exposed to ambient air at a CAFO during sample collection. Wellbore and air contamination pose separate challenges in the design of groundwater monitoring strategies on CAFOs that are not addressed by equipment sterilization, but require adequate QA/QC procedures and can be addressed by the proposed sampling strategy. PMID:24903186
Lens Coupled Quantum Cascade Laser
NASA Technical Reports Server (NTRS)
Lee, Alan Wei Min (Inventor); Hu, Qing (Inventor)
2013-01-01
Terahertz quantum cascade (QC) devices are disclosed that can operate, e.g., in a range of about 1 THz to about 10 THz. In some embodiments, QC lasers are disclosed in which an optical element (e.g., a lens) is coupled to an output facet of the laser's active region to enhance coupling of the lasing radiation from the active region to an external environment. In other embodiments, terahertz amplifier and tunable terahertz QC lasers are disclosed.
NASA Astrophysics Data System (ADS)
Selima, Ehab S.; Seadawy, Aly R.; Yao, Xiaohua; Essa, F. A.
2018-02-01
This paper is devoted to the study of the (1+1)-dimensional coupled cubic-quintic complex Ginzburg-Landau equations (cc-qcGLEs) with complex coefficients. These equations can be used to describe the nonlinear evolution of slowly varying envelopes of periodic spatio-temporal patterns in a convective binary fluid. The dispersion relation and properties of the cc-qcGLEs are derived. Painlevé analysis is used to check the integrability of the cc-qcGLEs and to establish the form of the Bäcklund transformation. New traveling wave solutions and a general form of multiple-soliton solutions of the cc-qcGLEs are obtained via the Bäcklund transformation and the simplest equation method, with the Bernoulli, Riccati and Burgers equations as simplest equations.
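For orientation, a commonly quoted general form of coupled cubic-quintic complex Ginzburg-Landau equations in (1+1) dimensions is shown below; the complex coefficients and coupling terms of the specific cc-qcGLEs studied in the paper may differ.

```latex
% A generic coupled cubic-quintic CGLE form (all coefficients complex);
% the exact coupling studied in the paper may differ.
\begin{align}
  \partial_t A &= \delta_1 A + \beta_1\,\partial_x^2 A
    + \left(\gamma_1 |A|^2 + \epsilon_1 |B|^2\right) A
    + \left(\mu_1 |A|^4 + \nu_1 |B|^4\right) A,\\
  \partial_t B &= \delta_2 B + \beta_2\,\partial_x^2 B
    + \left(\gamma_2 |B|^2 + \epsilon_2 |A|^2\right) B
    + \left(\mu_2 |B|^4 + \nu_2 |A|^4\right) B.
\end{align}
```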